Last month I attended the Selenium Conference in London. Tucked away on Puddle Dock just north of the River Thames were two days of presentations by Selenium and automation testing experts from around the world. This was my first Selenium conference, and in fact my first conference where the bulk of the presentations related to Quality Assurance in one form or another. There were around 30 presentations over the two days, and this blog post is about a few of my favourite talks.
Simon Stewart - Zen and the Art of Open Source Maintenance
The two days kicked off with a great presentation from Simon Stewart - the creator of WebDriver and a Selenium core team member. His talk was about the art of open source maintenance, and he mentioned these startling stats from the Standish Group's CHAOS Report:
16.2% of projects come in on time and on budget
Projects typically launch with 42% of their planned features
His conclusion was that projects are often late, expensive, and they don’t work - in short, software fails a lot. On the flip side, however, on an open source project these bugs are the gateway to collaboration and contribution that should, ideally, lead to better software.
Talk Link - https://www.youtube.com/watch?v=rOFqg27bqqw
Adam Carmi - Advanced Test Automation Techniques for Responsive Apps and Sites
Another great talk was given by Adam Carmi, the CTO of Applitools.
He did a live coding and execution demo showing how to use the open source Applitools SDK to perform visual tests on a site. His demo ran against the GitHub website with a colour modification to the GitHub logo, the idea being that his tests would detect the colour change and flag an error in the test report. The demo illustrated how easily the SDK can be incorporated into a C# script and how useful the Applitools report is. One of the great features of the report is the ability to group common errors: in the demo, the GitHub logo change affected every screenshot that was taken (some 30 or 40), and the report gathered all of those matching errors across the screenshots into a single entry. This is a great timesaver.
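At its core, a visual test like the one in the demo compares a baseline screenshot against a freshly captured one and reports where they differ. The sketch below is a hypothetical `VisualDiff` helper of my own, not the Applitools SDK - it simulates the demo scenario with two in-memory images where a small "logo" region has changed colour:

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

// Minimal sketch of what a visual test detects: count the pixels that
// differ between a baseline screenshot and the current one. This is a
// hypothetical helper for illustration, not the Applitools SDK.
public class VisualDiff {

    // Count pixels whose RGB values differ between two same-sized images.
    public static int diffCount(BufferedImage baseline, BufferedImage current) {
        int diffs = 0;
        for (int y = 0; y < baseline.getHeight(); y++) {
            for (int x = 0; x < baseline.getWidth(); x++) {
                if (baseline.getRGB(x, y) != current.getRGB(x, y)) {
                    diffs++;
                }
            }
        }
        return diffs;
    }

    public static void main(String[] args) {
        // Baseline: a 100x100 white "page" with a black 10x10 "logo".
        BufferedImage baseline = new BufferedImage(100, 100, BufferedImage.TYPE_INT_RGB);
        BufferedImage current = new BufferedImage(100, 100, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 100; y++) {
            for (int x = 0; x < 100; x++) {
                boolean logo = x < 10 && y < 10;
                baseline.setRGB(x, y, logo ? Color.BLACK.getRGB() : Color.WHITE.getRGB());
                // Current build: the "logo" has changed colour, as in the demo.
                current.setRGB(x, y, logo ? Color.RED.getRGB() : Color.WHITE.getRGB());
            }
        }
        System.out.println(diffCount(baseline, current)); // prints 100 (the 10x10 logo region)
    }
}
```

A real tool adds a lot on top of this - anti-aliasing tolerance, region ignoring, and the error grouping described above - but the underlying comparison is the same idea.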
Testium (by Groupon)
TestCafe (this was Umar's current favourite)
This was a great conference and I felt like I learned a lot from it. The best thing was that it gave me real actions to take away and try out myself. Currently, I have a series of visual tests for one client that are written in Java and use the Selenium 2 Visual Diff from kreyssel. This has been working really well up to now, but I am keen to try the same tests using the Applitools SDK. The Applitools report is certainly nicer to look at, but I am also keen to see whether the screenshot comparison process is any faster.
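Before switching tools, one way I could get a feel for raw comparison speed is to time a naive pixel-by-pixel check locally. This is a rough sketch of such a micro-benchmark (the 1920x1080 size is an assumed stand-in for a full-page screenshot), not a substitute for benchmarking the real SDKs end to end:

```java
import java.awt.image.BufferedImage;

// Rough local benchmark for the kind of pixel-by-pixel comparison a
// visual-diff tool performs. Times a single full-image comparison.
public class DiffBenchmark {

    // Return true if the two images are pixel-identical.
    public static boolean identical(BufferedImage a, BufferedImage b) {
        for (int y = 0; y < a.getHeight(); y++) {
            for (int x = 0; x < a.getWidth(); x++) {
                if (a.getRGB(x, y) != b.getRGB(x, y)) {
                    return false;
                }
            }
        }
        return true;
    }

    public static void main(String[] args) {
        int w = 1920, h = 1080; // assumed full-page screenshot size
        BufferedImage a = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        BufferedImage b = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);

        long start = System.nanoTime();
        boolean same = identical(a, b);
        long ms = (System.nanoTime() - start) / 1_000_000;
        System.out.println("identical=" + same + " in " + ms + "ms");
    }
}
```

Running something like this against real captured screenshots, and comparing the wall-clock time of a full test run before and after switching SDKs, would answer the speed question more honestly than any single number.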