Automated testing with JBehave
What do developers like most? Building new software! What do developers hate most? Testing and writing documentation! Why would you write tests? It slows down the process of making new software. If you can write that new feature in an hour, why spend another hour writing unit and integration tests for it? You don't make mistakes, and if some user does something you didn't expect, blame him first, then think about fixing it. And the tests you wrote earlier suddenly break because of your new feature, so you have to modify them again: another condition here, an intentionally changed webpage there. Doesn't that slow you down?
Okay, maybe you did actually break functionality with your new feature. You had no tests, and the broken functionality is in a different part of the application, so you didn't notice it while clicking through your website. Now your customer discovers the bug and thinks you've done a bad job (which you did). So you start adding unit tests, and for new features you switch to test-driven development.
You develop new features, make all the tests pass and ship the new version. Will it be perfect? Unfortunately not. Your unit tests only cover isolated functionality. One test checks whether you can retrieve all the countries your webshop ships to. Another test checks what happens after submitting a registration form. But there is no test that opens the actual registration form and selects a country, which would reveal that the list of countries is empty because it was never uploaded to the server.
Recently my team wrote tests that go through the front end of the platform to check that it still works as expected. We chose JBehave to write human-readable stories, map them to Java code and execute them in a real browser, as if a real user were performing the tests. For Apache Rave we also needed to write integration tests, because the list of features is growing and, despite high unit test coverage, we unfortunately ran into a blocking issue.
As an experiment I set up a test project on GitHub. I started from the JBehave Spring archetype, deleted the sample test and added my own test case:
Scenario: User creates a new account and logs in to the portal
When I go to "http://localhost:8080/portal"
Then I see the login page
When I follow the new account link
Then I get the new account form
When I fill in the form with username "newuser" password "password" confirmpassword "password" email "[email protected]"
And I submit the new account form
Then I see the login page
And A message appears "Account successfully created"
When I fill in the login form with username "newuser" password "password"
Then I see my portal page with the add new widgets box
You cover these steps with annotated methods in a Java class, like this:
import org.jbehave.core.annotations.Then;
import org.jbehave.core.annotations.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.springframework.beans.factory.annotation.Autowired;

import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.equalTo;

public class NewUserSteps {

    @Autowired
    private Portal portal;

    @When("I go to \"$url\"")
    public void goTo(String url) {
        portal.go(url);
    }

    @Then("I see the login page")
    public void isLoginPage() {
        final WebElement title = portal.findElement(By.tagName("title"));
        assertThat(title.getText().trim(), equalTo("Login - Rave"));
    }

    /* Other methods that cover steps */
}
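Steps with several parameters are mapped the same way, inside the same NewUserSteps class. A sketch of how the form-filling step from the story could look (the fillNewAccountForm helper on Portal is an assumption of mine, not necessarily the project's actual method):

    @When("I fill in the form with username \"$username\" password \"$password\" confirmpassword \"$confirmPassword\" email \"$email\"")
    public void fillInNewAccountForm(String username, String password, String confirmPassword, String email) {
        // Delegates to a hypothetical Portal helper that types the values into the form fields
        portal.fillNewAccountForm(username, password, confirmPassword, email);
    }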
As you can see, the annotations can handle parameters from the story. For readability of the story I wrapped the parameters in quotes, but that is not necessary. Each step calls the Portal class, which extends WebDriverPage and triggers actions in the browser such as going to a URL, clicking links or filling in form fields:
import org.jbehave.web.selenium.WebDriverPage;
import org.jbehave.web.selenium.WebDriverProvider;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class Portal extends WebDriverPage {

    @Autowired
    public Portal(WebDriverProvider driverProvider) {
        super(driverProvider);
    }

    // Navigates the browser to the given url (get() is inherited from WebDriverPage)
    public void go(String url) {
        get(url);
    }

    public void pressNewAccountButton() {
        final WebElement newAccountButton = findElement(By.id("createNewAccountButton"));
        newAccountButton.click();
    }

    /* Other methods */
}
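The archetype also generates a runner that ties the stories to the step classes. A minimal sketch of such a runner using JBehave's standard Spring integration (the class name PortalStories and the steps.xml context file are my assumptions, not necessarily what the archetype produces):

    import static org.jbehave.core.io.CodeLocations.codeLocationFromClass;

    import java.util.List;

    import org.jbehave.core.configuration.Configuration;
    import org.jbehave.core.configuration.MostUsefulConfiguration;
    import org.jbehave.core.configuration.spring.SpringApplicationContextFactory;
    import org.jbehave.core.io.StoryFinder;
    import org.jbehave.core.junit.JUnitStories;
    import org.jbehave.core.steps.InjectableStepsFactory;
    import org.jbehave.core.steps.spring.SpringStepsFactory;
    import org.springframework.context.ApplicationContext;

    public class PortalStories extends JUnitStories {

        // Spring context that defines the WebDriverProvider, Portal and the step classes as beans
        private final ApplicationContext context =
                new SpringApplicationContextFactory("steps.xml").createApplicationContext();

        @Override
        public Configuration configuration() {
            return new MostUsefulConfiguration();
        }

        @Override
        public InjectableStepsFactory stepsFactory() {
            // Picks up all beans in the context that contain JBehave step annotations
            return new SpringStepsFactory(configuration(), context);
        }

        @Override
        public List<String> storyPaths() {
            // Finds all *.story files on the classpath next to this class
            return new StoryFinder().findPaths(codeLocationFromClass(this.getClass()), "**/*.story", "");
        }
    }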
Within an hour I had written the first automated test, which can be run over and over again. Really? Not yet; you need to manually delete the user before running the test a second time. This can also be automated in a method annotated with @AfterScenario or @AfterStory that reverts the state of the application.
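For example, you could add something like this to NewUserSteps (deleteUser is a hypothetical helper; how the account is actually removed, through an admin API or directly in the database, depends on your application):

    // Runs after each scenario and removes the account the test created,
    // so the story can be run again; deleteUser is an assumed helper method.
    @AfterScenario
    public void deleteCreatedAccount() {
        portal.deleteUser("newuser");
    }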
After running the test, JBehave generates a report of how many stories, scenarios and steps it has run and how many were successful. I have recorded the run of this scenario in this video:
Does it mean I suddenly like writing integration tests? No, but I hate doing all these tests manually even more.