- How can we declaratively define behaviours in a standard fashion?
- Behaviours as Gherkin .feature files
- Complex Storytelling made Possible
- Metadata to match existing tests
Requirement: A MAP that enumerates the State Space - Definition of Done
Our current tests are not super easy to write, read, or review. BDD in Go was in its early days when Kubernetes started integration testing, so a tightly coupled testing approach took hold, and the e2e Ginkgo framework evolved on top of those tightly coupled assumptions. Unfortunately, this approach lacks any description of the desired behaviour other than comments in the code. It also lacks metadata and usable tagging for efficient organization into test areas and suites.
Documenting and discovering all of our behaviours will require a combination of automated introspection and some old-fashioned human storytelling.
We need to standardize the business language that the people who are currently our bottleneck can use to write these stories, in a way that can be power-tool assisted with some automation. This would reduce the complexity of articulating concrete requirements that humans, editors, and automation workflows can all act on.
I'm suggesting we fund a short, four-week exploration into Gherkin as the behaviour file format and godog as the BDD library, to address our current pain points and our lack of automation tooling and processes for behaviour testing.
We could go as far as replacing Feature with Behaviour, as the mapping is pretty much spot on.
Feature: Structured Metadata allowing Behaviour Driven tooling automation
In order to auto-generate testing scaffolding
As a sig-X member
I want to describe the behaviour of X
@sig-X
Scenario: Behaviour X
Given a well formed file describing the behaviour X
When I run the automation
Then I am provided with the basic structure for a corresponding test
And this is fine
@sig-Y
Scenario: Behaviour Y
Given a well formed file describing the behaviour Y
When I run the automation
Then I am provided with the basic structure for a corresponding test
And this is fine
@sig-Y @sig-X
Scenario: Behaviour X+Y
Given a well formed file describing the behaviour X
And a well formed file describing the behaviour Y
When I run the automation
Then I can reuse existing step definitions on multiple tests
And this is fine
~/go/bin/godog --no-colors
Feature: Structured Metadata allowing Behaviour Driven tooling automation
In order to auto-generate testing scaffolding
As a sig-X member
I want to describe the behaviour of X
Scenario: Behaviour X # features/behaviour.feature:7
Given a well formed file describing the behaviour X
When I run the automation
Then I am provided with the basic structure for a corresponding test
And this is fine
Scenario: Behaviour Y # features/behaviour.feature:13
Given a well formed file describing the behaviour Y
When I run the automation
Then I am provided with the basic structure for a corresponding test
And this is fine
Scenario: Behaviour X+Y # features/behaviour.feature:19
Given a well formed file describing the behaviour X
And a well formed file describing the behaviour Y
When I run the automation
Then I can reuse existing step definitions on multiple tests
And this is fine
3 scenarios (3 undefined)
13 steps (13 undefined)
1.253405ms
You can implement step definitions for undefined steps with these snippets:
func aWellFormedFileDescribingTheBehaviourX() error {
return godog.ErrPending
}
func iRunTheAutomation() error {
return godog.ErrPending
}
func iAmProvidedWithTheBasicStructureForACorrespondingTest() error {
return godog.ErrPending
}
func thisIsFine() error {
return godog.ErrPending
}
func aWellFormedFileDescribingTheBehaviourY() error {
return godog.ErrPending
}
func iCanReuseExistingStepDefinitionsOnMultipleTests() error {
return godog.ErrPending
}
func FeatureContext(s *godog.Suite) {
s.Step(`^a well formed file describing the behaviour X$`, aWellFormedFileDescribingTheBehaviourX)
s.Step(`^I run the automation$`, iRunTheAutomation)
s.Step(`^I am provided with the basic structure for a corresponding test$`, iAmProvidedWithTheBasicStructureForACorrespondingTest)
s.Step(`^this is fine$`, thisIsFine)
s.Step(`^a well formed file describing the behaviour Y$`, aWellFormedFileDescribingTheBehaviourY)
s.Step(`^I can reuse existing step definitions on multiple tests$`, iCanReuseExistingStepDefinitionsOnMultipleTests)
}
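
Filling in those pending snippets just means replacing godog.ErrPending with real logic. As a rough, hypothetical sketch of the first scenario (the feature-file path and the idea of deriving a Go test stub from it are invented for illustration, not the output of any existing tool):

package behaviour

import (
	"fmt"
	"io/ioutil"
	"path/filepath"
	"strings"
)

// Scenario state shared between steps; a fuller suite would reset this per scenario.
var (
	behaviourFile string // the .feature file under test (hypothetical path)
	generatedStub string // the scaffolding produced by "the automation"
)

// aWellFormedFileDescribingTheBehaviourX checks that the behaviour file exists
// and at least superficially looks like Gherkin.
func aWellFormedFileDescribingTheBehaviourX() error {
	behaviourFile = "features/behaviour.feature"
	data, err := ioutil.ReadFile(behaviourFile)
	if err != nil {
		return fmt.Errorf("behaviour file missing: %v", err)
	}
	if !strings.Contains(string(data), "Feature:") {
		return fmt.Errorf("%s does not describe a Feature", behaviourFile)
	}
	return nil
}

// iRunTheAutomation stands in for whatever scaffolding generator we build;
// here it only derives a Go test stub from the feature file name.
func iRunTheAutomation() error {
	base := strings.TrimSuffix(filepath.Base(behaviourFile), ".feature")
	generatedStub = fmt.Sprintf("func Test_%s(t *testing.T) { /* generated */ }", base)
	return nil
}

// iAmProvidedWithTheBasicStructureForACorrespondingTest asserts scaffolding was produced.
func iAmProvidedWithTheBasicStructureForACorrespondingTest() error {
	if generatedStub == "" {
		return fmt.Errorf("no test scaffolding was generated")
	}
	return nil
}

These bodies replace the pending ones and plug into the FeatureContext shown above unchanged.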
We could use inline JSON or YAML as CRUD input or verification, as well as reuse step definitions from previous scenarios as needed; a sketch of the step definitions that consume such docstrings follows the example below.
Feature: Inter-pod Communication
Pods need to be able to talk to each other, and the node needs to be able to talk to its Pods.
@sig-node @sig-pod
Scenario: Pods can communicate with each other
Given pods A and B
When pod A says hello to pod B
Then pod B says hello to pod A
@wip
Scenario: Pods can communicate with Nodes
Given a pod A on a node
When the node says hello to pod A
Then pod A says hello to the node
@tags-that-are-no-longer-part-of-the-test-name
Scenario: Pods created from inline specs can communicate with each other
Given I create pod A with this yaml spec
"""
yaml: [
values
]
"""
And I create pod B with this json spec
"""
{
"json": "values"
}
"""
When I request pod A and pod B talk to each other
Then I can observe a conversation between them
And this is fine
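
On the implementation side, godog passes an inline docstring to the step function as its final argument, so the inline-spec scenario above could be backed by step definitions roughly like the following. This is a sketch against the same older DATA-DOG/godog API as the generated snippets above (newer godog versions pass *godog.DocString instead of *gherkin.DocString), and it uses gopkg.in/yaml.v2 for the YAML case; storing the specs in a map stands in for actually creating the pods.

package behaviour

import (
	"encoding/json"
	"fmt"

	"github.com/DATA-DOG/godog"
	"github.com/DATA-DOG/godog/gherkin"
	yaml "gopkg.in/yaml.v2"
)

// podSpecs remembers the raw specs captured from docstrings, keyed by pod name,
// so later steps (pod A and pod B talking to each other) can reuse them.
var podSpecs = map[string]string{}

// iCreatePodWithThisYamlSpec validates the inline YAML and stores it; a real
// suite would hand it to the Kubernetes API instead.
func iCreatePodWithThisYamlSpec(name string, spec *gherkin.DocString) error {
	var parsed map[string]interface{}
	if err := yaml.Unmarshal([]byte(spec.Content), &parsed); err != nil {
		return fmt.Errorf("pod %s spec is not valid YAML: %v", name, err)
	}
	podSpecs[name] = spec.Content
	return nil
}

// iCreatePodWithThisJsonSpec does the same for an inline JSON spec.
func iCreatePodWithThisJsonSpec(name string, spec *gherkin.DocString) error {
	var parsed map[string]interface{}
	if err := json.Unmarshal([]byte(spec.Content), &parsed); err != nil {
		return fmt.Errorf("pod %s spec is not valid JSON: %v", name, err)
	}
	podSpecs[name] = spec.Content
	return nil
}

func FeatureContext(s *godog.Suite) {
	// The capture group binds the pod name; godog passes the trailing
	// docstring as the final argument automatically.
	s.Step(`^I create pod (\w+) with this yaml spec$`, iCreatePodWithThisYamlSpec)
	s.Step(`^I create pod (\w+) with this json spec$`, iCreatePodWithThisJsonSpec)
}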
Feature: Map behaviours to existing ginkgo tests
As a test contributor
I want to not throw away all our old tests
In order to retain the value generated in them
@sig-node @sig-pod @conformance @release-1.15
Scenario: Use existing ginkgo framework
Given existing test It('should do the right thing')
When I run the test
Then we utilize our existing test via our new .feature framework
And this is fine
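
That last step is where the reuse happens: one plausible approach is to treat the existing Ginkgo It as the implementation of the behaviour and have the step definitions focus the already-built e2e suite on it by name. The sketch below assumes the suite is compiled into an e2e.test binary that honours Ginkgo's --ginkgo.focus flag; the binary path and step wiring are illustrative, not an existing implementation.

package behaviour

import (
	"fmt"
	"os/exec"
	"regexp"

	"github.com/DATA-DOG/godog"
)

// existingTestName is captured by the Given step; runErr records the outcome of the run.
var (
	existingTestName string
	runErr           error
)

func existingTest(name string) error {
	existingTestName = name
	return nil
}

// iRunTheTest focuses the pre-built Ginkgo e2e binary on just the named spec.
func iRunTheTest() error {
	cmd := exec.Command("./e2e.test", "--ginkgo.focus="+regexp.QuoteMeta(existingTestName))
	if out, err := cmd.CombinedOutput(); err != nil {
		runErr = fmt.Errorf("%v\n%s", err, out)
	}
	return nil
}

// weUtilizeOurExistingTest passes or fails the scenario based on the Ginkgo run.
func weUtilizeOurExistingTest() error {
	if runErr != nil {
		return fmt.Errorf("existing test %q failed: %v", existingTestName, runErr)
	}
	return nil
}

func FeatureContext(s *godog.Suite) {
	s.Step(`^existing test It\('([^']+)'\)$`, existingTest)
	s.Step(`^I run the test$`, iRunTheTest)
	s.Step(`^we utilize our existing test via our new \.feature framework$`, weUtilizeOurExistingTest)
}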