- How can we declaratively define behaviours in a standard fashion?
- Behaviours as Gherkin .feature files
- Complex Storytelling made easy
- Defining matching results within the behaviour description
Requirement: a MAP that enumerates the state space, i.e. our Definition of Done.
Our current tests are not easy to write, read, or review. BDD in Go was in its early days when Kubernetes started integration testing with a tightly coupled approach, and the e2e Ginkgo framework evolved on top of those tightly coupled assumptions. Unfortunately, this approach lacks any description of the desired behaviour other than comments in the code, and it lacks the metadata and usable tagging needed to organize tests efficiently into areas and suites.
Documenting and discovering all of our behaviours will require a combination of automated introspection and some old-fashioned human storytelling.
We need to standardize the business language that our bottlenecked people use to write these stories, in a way that can be power-tool assisted with some automation. This would reduce the complexity of articulating concrete requirements for humans, editors, and automation workflows.
I'm suggesting we fund a short four-week exploration into Gherkin as the behaviour file format and godog as a BDD library, to address our current pain points and our lack of automation tooling and processes for behaviour testing.
We could go as far as replacing Feature with Behaviour, as the mapping is pretty much spot on.
```gherkin
Feature: Structured Metadata allowing Behaviour Driven tooling automation
  In order to auto-generate testing scaffolding
  As a sig-X member
  I want to describe the behaviour of X

  @sig-X
  Scenario: Behaviour X
    Given a well formed file describing the behaviour X
    When I run the automation
    Then I am provided with the basic structure for a corresponding test

  @sig-Y
  Scenario: Behaviour Y
    Given a well formed file describing the behaviour Y
    When I run the automation
    Then I am provided with the basic structure for a corresponding test

  @sig-Y @sig-X
  Scenario: Behaviour X+Y
    Given a well formed file describing the behaviour X
    And a well formed file describing the behaviour Y
    When I run the automation
    Then I can reuse existing step definitions on multiple tests
```
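To illustrate the kind of tooling these tags unlock, here is a minimal, hypothetical Go sketch (not godog itself; `groupScenariosByTag` is an invented name) that scans raw .feature text and groups scenarios into suites by the tags that precede them:

```go
package main

import (
	"fmt"
	"strings"
)

// groupScenariosByTag maps each tag (e.g. "@sig-X") to the names of the
// scenarios annotated with it. In Gherkin, tag lines directly precede
// the Scenario line they apply to.
func groupScenariosByTag(feature string) map[string][]string {
	suites := map[string][]string{}
	var pending []string // tags waiting for the next Scenario line
	for _, line := range strings.Split(feature, "\n") {
		line = strings.TrimSpace(line)
		switch {
		case strings.HasPrefix(line, "@"):
			pending = append(pending, strings.Fields(line)...)
		case strings.HasPrefix(line, "Scenario:"):
			name := strings.TrimSpace(strings.TrimPrefix(line, "Scenario:"))
			for _, tag := range pending {
				suites[tag] = append(suites[tag], name)
			}
			pending = nil
		}
	}
	return suites
}

func main() {
	feature := `
@sig-X
Scenario: Behaviour X
@sig-Y
Scenario: Behaviour Y
@sig-Y @sig-X
Scenario: Behaviour X+Y
`
	fmt.Println(groupScenariosByTag(feature))
}
```

A real tool would use a proper Gherkin parser rather than line prefixes, but even this toy shows how `@sig-X` membership can drive test-suite selection without any coupling to the test code itself.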
```
~/go/bin/godog --no-colors
Feature: Structured Metadata allowing Behaviour Driven tooling automation
  In order to auto-generate testing scaffolding
  As a sig-X member
  I want to describe the behaviour of X

  Scenario: Behaviour X                  # features/behaviour.feature:7
    Given a well formed file describing the behaviour X
    When I run the automation
    Then I am provided with the basic structure for a corresponding test

  Scenario: Behaviour Y                  # features/behaviour.feature:12
    Given a well formed file describing the behaviour Y
    When I run the automation
    Then I am provided with the basic structure for a corresponding test

  Scenario: Behaviour X+Y                # features/behaviour.feature:17
    Given a well formed file describing the behaviour X
    And a well formed file describing the behaviour Y
    When I run the automation
    Then I can reuse existing step definitions on multiple tests

3 scenarios (3 undefined)
10 steps (10 undefined)
1.154961ms
```
You can implement step definitions for undefined steps with these snippets:
```go
func aWellFormedFileDescribingTheBehaviourX() error {
	return godog.ErrPending
}

func iRunTheAutomation() error {
	return godog.ErrPending
}

func iAmProvidedWithTheBasicStructureForACorrespondingTest() error {
	return godog.ErrPending
}

func aWellFormedFileDescribingTheBehaviourY() error {
	return godog.ErrPending
}

func iCanReuseExistingStepDefinitionsOnMultipleTests() error {
	return godog.ErrPending
}

func FeatureContext(s *godog.Suite) {
	s.Step(`^a well formed file describing the behaviour X$`, aWellFormedFileDescribingTheBehaviourX)
	s.Step(`^I run the automation$`, iRunTheAutomation)
	s.Step(`^I am provided with the basic structure for a corresponding test$`, iAmProvidedWithTheBasicStructureForACorrespondingTest)
	s.Step(`^a well formed file describing the behaviour Y$`, aWellFormedFileDescribingTheBehaviourY)
	s.Step(`^I can reuse existing step definitions on multiple tests$`, iCanReuseExistingStepDefinitionsOnMultipleTests)
}
```
```gherkin
Feature: Intrapod Communication
  Pods need to be able to talk to each other, and nodes need to be able
  to talk to Pods.

  @sig-node @sig-pod
  Scenario: Pods can communicate with each other
    Given pods A and B
    When pod A says hello to pod B
    Then pod B says hello to pod A

  @wip
  Scenario: Pods can communicate with Nodes
    Given a pod A on a node
    When the node says hello to pod A
    Then pod A says hello to the node

  @tags-that-are-no-longer-part-of-the-test-name
  Scenario: Pods created from specs can communicate with each other
    Given I create pod A with this yaml spec
      """
      yaml: [
        values
      ]
      """
    And I create pod B with this json spec
      """
      {
        json: values
      }
      """
    When I request pod A and pod B talk to each other
    Then I can observe a conversation between them
```
See https://github.com/jfilipczyk/gomatch#gherkin-example
```gherkin
Feature: JSON response matching
  In order to validate API responses in our Then steps
  As a behaviour writer
  I need to specify some parts of the JSON to be matched

  Scenario: Looking for X in the haystack
    Given an apikind of X with the following values:
      """
      {
        "json": "values"
      }
      """
    When I send "GET" request to "/v1/X"
    Then the response code should be 200
    And the response body should match json:
      """
      {
        "items": [
          {
            "key1": "value1",
            "@...@": ""
          },
          {
            "key2": "value2",
            "@...@": ""
          }
        ],
        "@...@": ""
      }
      """
```
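The `@...@` entries are gomatch's "unbounded" pattern, which, as I read the gomatch README, means the actual document may contain additional object keys or array items beyond the ones listed. Here is a self-contained toy illustrating those semantics (explicitly not gomatch's actual implementation; `matchJSON` and `matches` are invented names):

```go
package main

import (
	"encoding/json"
	"fmt"
)

const unbounded = "@...@"

// matches reports whether actual satisfies expected, where expected may
// use "@...@" (as an object key, or as a trailing array element) to
// allow extra, unchecked keys or items.
func matches(expected, actual interface{}) bool {
	switch exp := expected.(type) {
	case map[string]interface{}:
		act, ok := actual.(map[string]interface{})
		if !ok {
			return false
		}
		_, open := exp[unbounded]
		for k, v := range exp {
			if k == unbounded {
				continue
			}
			av, present := act[k]
			if !present || !matches(v, av) {
				return false
			}
		}
		// Without "@...@", no extra keys are allowed.
		return open || len(act) == len(exp)
	case []interface{}:
		act, ok := actual.([]interface{})
		if !ok {
			return false
		}
		open := len(exp) > 0 && exp[len(exp)-1] == unbounded
		items := exp
		if open {
			items = exp[:len(exp)-1]
		}
		if open && len(act) < len(items) || !open && len(act) != len(items) {
			return false
		}
		for i, v := range items {
			if !matches(v, act[i]) {
				return false
			}
		}
		return true
	default:
		// JSON scalars decode to string, float64, bool, or nil,
		// all of which compare correctly with ==.
		return expected == actual
	}
}

func matchJSON(expected, actual string) bool {
	var e, a interface{}
	if json.Unmarshal([]byte(expected), &e) != nil ||
		json.Unmarshal([]byte(actual), &a) != nil {
		return false
	}
	return matches(e, a)
}

func main() {
	expected := `{"items": [{"key1": "value1", "@...@": ""}], "@...@": ""}`
	actual := `{"items": [{"key1": "value1", "extra": true}], "count": 1}`
	fmt.Println(matchJSON(expected, actual)) // true: extra keys are allowed
}
```

For real behaviour tests we would of course use gomatch itself (which also supports value patterns like `@string@` and `@number@`); the toy only exists to make the `@...@` semantics in the feature file above concrete.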