Monday, 27 November 2017

Language is Still Hindering Testing and The Hiring of Testers

It's been a month now since I attended Test Bash Manchester. I heard two very powerful talks at that conference, which have been swishing around in my brain ever since. Both talks came from speakers who shared a desire to advance the craft of testing.

The first talk was by Martin Hynie (@vds4), currently Director of Test Engineering at Medidata. The second talk was by Michael Bolton (@michaelbolton), a tester, collaborator, coach, consultant, author and Twitter superstar.

Martin's talk "The Lost Art of the Journeyman" and Michael's talk "Where Do You Want to Go Today? No More Exploratory Testing" both invoked the same feeling in me: change is still very much needed when we are talking about testing. Martin said that only by identifying entrenched beliefs can we find opportunities for change. He explained that one of these entrenched beliefs is what "testing" means. So to effect change, we need to approach the conversation from the same side as someone who doesn't understand testing.

Both speakers talked about testing being a craft. Martin went a step further and said that testing is not a commodity.

I still frequently see testing treated as a commodity by people who do not work as testers. I get embarrassed when people believe smart, self-directed testing is of equal value to scripted testing. It's also very hard being the person trying to explain that someone's beliefs around testing are hindering a project and causing it damage. The belief that all testing is equal is one of those entrenched beliefs Martin told us to be mindful of.

Michael Bolton sums this up on his blog, where he says scripted testing is expensive, time-consuming, and leads to inattentional blindness. Separating the design of a test script from its execution in turn lengthens and weakens the feedback loop.

Michael told us that scripted testing makes testers incompetent as they are not empowered to think.

The Word 'Empowered' Matters.

As someone doing self-directed testing without a test script, it can be very easy to criticise testers who write and work from test scripts and test cases. I have worked with financial institutions which rely heavily on scripts. I have met and spoken face to face with testers who work in this scripted way. Seeing things from their point of view, I discovered some of the constraints they have to work within. They are not empowered to throw the scripts away. Management want them to work in this way because it is easy (yet foolish) to measure testing with numbers and stats.

When I worked in the UK games industry, I was lucky that I was able to do testing without scripts, but I was still not empowered. I was stuck behind a wall with many devs throwing any code they wanted over it at a small group of testers. If bugs got missed, that was the testers' fault - not the fault of a dysfunctional way of working.

Michael spoke about how the definition of testing had been stolen from testers. Now testing meant something completely different to people outside of testing. He said that the testing community needs to steal the definition of testing back.

What is Your Definition of Testing?

I have recently started asking some of my developer friends the following question: 'What is your definition of testing?' Some of the answers have shocked me!

The first dev I asked said 'testing is ensuring quality'. I had to try to explain that this wasn't entirely true. Testing is an activity that evaluates something (which could be anything) to find problems that matter. The discovery of those problems could have very little to do with ensuring quality if no action is taken once they are discovered!

My challenge to other testers would be to start asking the people you work with for their definition of testing. Start getting a feel for how closely your ideas of testing are aligned. Just because you are using the same language does not mean you are talking about the same things. Do not make the mistake of assuming everyone's idea of testing is the same.

Michael wanted us to return to using the word testing (not exploratory testing - which he said was like calling a cauliflower a vegetarian cauliflower). Martin wanted us to change the language we use for describing testing and testers.

At an open space event on Saturday 28th October 2017, a diverse group of testers sat around a table and openly discussed the testing role, specifically the language used to describe that role. One thing became very clear very quickly: the language and definition of testing are certainly not shared between testers and non-testers. Even some of the testers present had slightly conflicting ideas. We certainly have a lot more work to do in this area.

Patrick Prill (@TestPappy) said that he knows people with the title 'tester', but what they are doing does not match the job ads. Recruiters have a very hard time when it comes to describing job roles. Instead of hiring testers, maybe we should be hiring people with critical thinking skills. Maybe the best testers aren't actually testers yet?

At the open space gathering it became clear that recruiters can be blind to what testers do. Both Neil Younger (@norry_twitting) and Martin Hynie shared their experiences of pairing with a recruiter, essentially working together to identify good and bad candidates and the reasons why. Both had positive outcomes from a recruiter and a tester pairing up and working together.

From my own experiences, observations and conversations, I am aware that some skilled testers are still not getting hired. 'Manual tester' has become a dirty word used to devalue testers. I have heard some pretty crazy things this year. I was recently asked by a recruiter if I knew anyone suitable for an 'Automation Tester' position. I also met a manager who told me 'most of our testers are manual, but they have been with us a long time, so rather than replacing them we are going to train them to be automated testers.'

The first thought that went through my head was: what is an 'Automated Tester'? Automation is a development task. There is no such thing as automatic testing. Automation is dumb; it cannot direct itself, and it cannot explore or think. Further to that, automation in testing should be the responsibility of the whole team, not a single specialist. By putting the responsibility for an automation project on the shoulders of just one person, you are heading for disaster (see the term bus factor).

A Keyword CV Search is Simply Not Enough.

When hiring testers, a keyword search on a CV is simply not enough. This comes back to a need to realign the language we use to talk about testing in the context of 'that thing testers do'.

As well as starting conversations about the definition of testing with the people we work with, I believe testers also need to start sharing information with recruiters. This was one of the reasons I was very keen to write and share an article with a recruitment blog. By sharing understanding and knowledge around testing skills and testing work with the very people that are trying to hire us, we not only make things easier for the people trying to hire, but also for the people (like us testers) trying to get hired.

If my job suddenly switched from software tester to recruiter, these are some of the things from my experience of testing and testers that I would take with me when recruiting testers.

Stop filtering out testing candidates based on certifications.

ISEB/ISTQB really is not a good filter for testing candidates. When I surveyed 187 testers in 2016, only 48% had completed the ISEB/ISTQB foundation certificate. I do not hold this qualification. Some of the brightest, smartest testers I know also do not hold ISEB/ISTQB qualifications. There is a big difference between being able to learn the answers to some multiple choice questions and being able to test software. By demanding this qualification you will also probably alienate the kind of people you want to attract. Smart testers know these qualifications exist to make money.

Everyone puts agile on their CV these days; this does not mean they are agile.

Better things to ask a candidate than simply looking for the 'agile' keyword:

  • Ask them about a time they changed their mind or changed course.
  • Ask about some experiments they have done within a team.
  • Ask about a time they collaborated or paired with another team member.
  • Ask how they eliminate wasteful testing documentation.

Acknowledge that there is no automated testing.

There are no automated testers. Automation is its own development project and should be owned by the whole team. It is possible for someone who can automate (e.g. write code that checks things) not to understand what should be automated. Writing code is a different skill from being able to decide what that code should do.

Acknowledge that there is no manual testing.

There are no manual testers; there is only testing. Trying to divide testers into two groups, manual and automated, is a big mistake. Please stop calling testers manual - we don't like it, and it damages our craft. If, instead of labels, we focused on hiring candidates with the ability to think critically and solve problems, everyone would be in a much better place.

This post was also published on the blog of Ronald James Digital & Tech Recruitment Agency

Monday, 6 November 2017

Test Bash Manchester 2017 Tweet by Tweet

I was very fortunate that I was able to attend my second ever Test Bash in Manchester. This year was better than last year as two of my co-workers (Hannah & Jack) came along for the ride. I got so excited seeing them get excited!

I spent most of the conference day scribbling notes again. However, unlike last year, when I mostly wrote text in a pad, this year I had plain paper and used coloured pens. At the open space the following day it was really nice to have my notes from the conference day to hand. In the days following the conference, these visual reminders really helped important ideas stick in my head.

I sent all my visual notes up into the Twitter-verse as they were completed. The list of tweets is below.

  • Anne-Marie Charrett (@charrett) - Quality != Testing
  • Goran Kero (@ghkero) - What I, A Tester, Have Learnt From Studying Psychology
  • Gem Hill (@Gem_Hill) - AUT: Anxiety Under Test
  • Bas Dijkstra (@_basdijkstra) - Who Will Guard the Guards Themselves? How to Trust Your Automation and Avoid Deceit
  • James Sheasby Thomas (@RightSaidJames) - Accessibility Testing Crash Course
  • Vera Gehlen-Baum (@VeraGeBa) - Turning Good Testers Into Great Ones
  • Simon Dobson - Lessons Learnt Moving to Microservices
  • Martin Hynie (@vds4) - The Lost Art of the Journeyman
  • Claire Reckless (@clairereckless) - The Fraud Squad: Learning to Manage Impostor Syndrome as a Tester
  • Michael Bolton (@michaelbolton) - Where Do You Want to Go Today? No More Exploratory Testing

Twitter Mining

Last year, I did some Twitter mining and sentiment analysis after the event, and I wanted to re-use those scripts to tell this year's story. After I got home (and had a bath and a good rest), I sat down with my laptop and mined 2700 tweets out of Twitter on the hashtag #testbash. I worked through my code from last year, starting to piece together the story of this year's event. If you're interested in the code this article is based upon, it can be found (along with the raw data) here on GitHub.
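
For anyone who wants a feel for the mining step itself, here is a minimal sketch in Python using the tweepy library. It is an illustration only - my real scripts live in the repository above and may differ - and the credentials are placeholders.

    import tweepy

    # Placeholder credentials - substitute your own Twitter API keys.
    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
    api = tweepy.API(auth, wait_on_rate_limit=True)

    # Pull up to 2700 recent tweets tagged #testbash.
    # (In tweepy 4.x this method is named api.search_tweets.)
    tweets = [status for status in
              tweepy.Cursor(api.search, q="#testbash", count=100).items(2700)]
    tweet_texts = [status.text for status in tweets]
    print(len(tweet_texts), "tweets mined")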

Positive and negative word clouds

The word clouds above can be clicked for a larger image. The first thing I noticed after generating some positive and negative word clouds was that the positive cloud was bigger than the negative cloud. 173 unique positive words and 125 unique negative words were identified in the conference day tweets. The conference was a resoundingly positive event!
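
As an illustration of how clouds like these can be generated, here is a rough Python sketch using the wordcloud package. It assumes tweet_texts from the mining step and a positive_words set loaded from an opinion lexicon (the word lists I actually used are introduced in the next section).

    from collections import Counter
    from wordcloud import WordCloud

    def lexicon_counts(tweet_texts, lexicon):
        """Count how often each lexicon word appears across the tweets."""
        counts = Counter()
        for text in tweet_texts:
            for word in text.lower().split():
                word = word.strip(".,!?#@:;'\"()")
                if word in lexicon:
                    counts[word] += 1
        return counts

    # positive_words: a set of words loaded from the opinion lexicon.
    pos_counts = lexicon_counts(tweet_texts, positive_words)
    WordCloud(width=800, height=400) \
        .generate_from_frequencies(pos_counts) \
        .to_file("positive_cloud.png")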

It didn't surprise me that the word 'great' was at the center of the positive word cloud. Having done this kind of text crunching a few times now, I've learned that 'great' and 'talk' are generally two of the most common words tweeted at conference events. What did surprise me, though, was the negative word cloud. Right at the center was the most frequently used negative word, 'syndrome', closely followed by 'anxiety'. Claire Reckless and Gem Hill spoke about impostor syndrome and anxiety, and both talks had a huge impact on the Twitter discussions taking place on the day. Getting the testing community talking about impostor syndrome and anxiety, even though the words used carry negative sentiments, is a very positive outcome.

The top 5 most favourited tweets were:

[Embedded tweets #1 to #5]

Tweets by Time and Positivity

A positivity index was calculated for each tweet. For every word in the tweet present in a dictionary of positive words, the tweet scored +1. For every word in the tweet present in a dictionary of negative words, the tweet scored -1. The positive and negative word lists used to score tweets were created by Minqing Hu and Bing Liu at the University of Illinois and can be found here.
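
In code, that scoring rule is only a few lines. This is a plain-Python restatement of the idea, assuming the positive and negative word lists have been loaded into two sets:

    def positivity(text, positive_words, negative_words):
        """Score a tweet: +1 per positive-lexicon word, -1 per negative."""
        score = 0
        for word in text.lower().split():
            word = word.strip(".,!?#@:;'\"()")
            if word in positive_words:
                score += 1
            elif word in negative_words:
                score -= 1
        return score

    scores = [positivity(t, positive_words, negative_words) for t in tweet_texts]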

The tweet with the most positive sentiment on the day was this one from Richard Bradshaw.

The tweet with the most negative sentiment on the day was this one from Dan Billing.

I plotted all the tweets by time and positivity, then fitted a loess curve through the points on the scatter plot.
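
My actual plotting code is in the repository linked above; as a rough Python equivalent, a loess (locally weighted) curve can be fitted with statsmodels, assuming times holds each tweet's time of day in seconds and scores holds the positivity indices from above.

    import matplotlib.pyplot as plt
    from statsmodels.nonparametric.smoothers_lowess import lowess

    # lowess() returns the smoothed curve as (x, fitted-y) pairs sorted by x.
    curve = lowess(scores, times, frac=0.3)

    plt.scatter(times, scores, alpha=0.3)
    plt.plot(curve[:, 0], curve[:, 1], color="red")
    plt.xlabel("time of day (seconds since midnight)")
    plt.ylabel("positivity index")
    plt.savefig("tweets_by_time_and_positivity.png")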

The first thing that really stood out was that one tester was up, awake and tweeting a picture of the venue at 4:17am?!?

Once the event got started, there was a dip in positivity just after 10:00am, so I checked some of the tweets around that time.

The reason for the dip was related to tweets about bias.

There was another dip in positivity just after 16:00 so I checked those tweets too.

Again, nothing negative was actually happening; the dip in positivity was caused by discussion of a subject which carries a negative sentiment.

Really positive tweets came at the end of the day once the event had been absorbed, with the last part of the day carrying the most positive sentiment.

Tweets by Frequency and Platform

I plotted a frequency polygon broken down by platform to see which parts of the day people engaged most with Twitter. Again, the image below can be clicked for a larger version.
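
The chart itself is straightforward to reproduce. A rough Python sketch of the same idea, assuming a pandas DataFrame df with a created_at timestamp and a source (tweeting platform) column, would look something like this:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Bucket the tweets into 15-minute bins, one count series per platform.
    df["bucket"] = pd.to_datetime(df["created_at"]).dt.floor("15min")
    counts = df.groupby(["bucket", "source"]).size().unstack(fill_value=0)

    counts.plot()  # one line per platform forms the frequency polygon
    plt.xlabel("time of day")
    plt.ylabel("tweets per 15 minutes")
    plt.savefig("tweets_by_frequency_and_platform.png")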

It was very interesting to see how frequently people were tweeting throughout the day. The spikes in activity align very closely with the start of each talk. It was also nice to see people taking a break from using Twitter on mobile phones over lunch (hopefully this is because real face-to-face conversations were happening over a meal). The biggest spike of activity happened immediately after lunch, during Vera Gehlen-Baum's talk "Turning Good Testers Into Great Ones".

It was a pleasure connecting with so many wonderful people at this event. The mix of new faces and familiar faces was fantastic. The test community is the best community ♥ Hopefully I'll see you in Brighton next year!

Wednesday, 19 April 2017

Help Your Testers Succeed in 8 Minutes

2017 has been a stressful year for me so far. I bought a really ugly flat in February, then found myself with two months to make it habitable and move into it. While frantically arranging appointments with tradespeople and deliveries of essential things (like carpet and furniture), a call for speakers came up for the Agile North East Lightning Talk competition.

I was already so stressed out from trying to move house, the stress of giving a talk felt insignificant by comparison. So I decided to throw my hat into the ring and enter the competition.

I knew the audience would be a diverse group of people, with only one or two software testers in the room, so I wanted to come up with a talk that would be interesting to everyone. I came up with a working title of "Things you can do to help your software testers succeed" and wrote the following abstract; it was quite short and a little bit vague in places...

"Testing software is hard. Hiring good testers is hard. Some testing jobs are set up in such a way that testers can never succeed! If you have good testers in your organisation the last thing you want to do is drive them away. I'm going to tell you how you can help your testers succeed and enjoy the many benefits that happy testers can bring to a team."

I found out a few weeks later that my proposal had been accepted and I was in the competition!

It Will Be Alright On The Night

I now had an 8-minute slot in front of a captive audience of people who shared an interest in Agile development. I knew straight away that I was going to have to make each minute count. I wanted to use the opportunity to try to raise awareness of the problems software testers face.

I wrote my slides and practised a little with a timer to see how much information and advice I could actually jam into 8 minutes. It turns out an 8-minute talk is quite a tricky duration to handle: you don't have enough time to get into really detailed explanations, but it's long enough that you do have to start explaining concepts.

The day of the talk arrived and I got to the venue about an hour before the event was due to start. The building chosen for the event was a historic listed building, the Northern Institute of Mining and Mechanical Engineers. I was able to scope out the 1895 lecture theatre where the talk would be taking place, see where I would be standing, where the audience would be sitting, and so on. This really helped reduce some of the stress and nervousness I was feeling on the night.

I was very thankful that some of my friends and co-workers were able to come along to the event. Having a few people there who I knew genuinely wanted me to succeed made the task of speaking easier to cope with mentally. I checked with the event organiser that I would be able to make an audio recording with my smartphone and was told this would be fine. I have been trying to record myself every time I speak so I can listen back afterwards and find ways to improve.

My lightning talk, "How to Help Testers Succeed" is now up on YouTube.

I was voted into 3rd place by the audience, and I was absolutely shocked that the 1st and 2nd place winners didn't choose the Lego prize. This let me choose the Lego Millennium Falcon. I haven't built it yet - I need to find someone to help :)

This post was also published on my company's blog, the Scott Logic Blog

Monday, 16 January 2017

Foreign Currency Trading Heuristic Testing Cheat Sheet

Happy New Year everyone!

For the last 18 months I have been testing software designed to trade foreign currency, known as FX or Forex trading software.

I consider myself lucky, as I joined the project on day one, which enabled me to learn a lot about testing trading systems.

Challenges

Financial software, including trading applications, can be some of the most difficult and complex applications to test, because it presents many challenges such as:

  • Many concurrent users
  • High rates of transactions per second
  • Large numbers of systems, services and applications that all integrate with each other
  • A need to process transactions in real time
  • Time-sensitive data, e.g. the price to buy a euro can change multiple times every second
  • Catastrophic consequences of system failure: bugs can cause financial loss
  • Extremely high complexity level

At the start of my current project, I found very few resources available for testers covering complex financial systems. The few resources I was able to find were quite dated and generally advised writing detailed plans and documenting all tests before executing them. I simply couldn't find any information about approaching the testing of financial systems in a modern, agile, context-driven way.

I was very fortunate on my project that I was able to implement testing with agility and focus on risk. Long checks historically done manually by human testers were replaced with good automated integration test coverage. The team also chose to release to production as frequently as possible, usually once a week. Not having to constantly repeat manual checks of existing functionality gave me time to do a LOT of exploratory testing. Almost all the really bad bugs, the ones with financial consequences, were found during exploratory testing sessions.
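
To give a flavour of what those automated checks looked like, here is a hypothetical sketch in Python. Every name in it (get_quote, place_order, wait_for_fill) is illustrative rather than a real trading API, but it shows the kind of check I mean: trade, wait for the fill, then assert that the executed price is within tolerance of the quoted price.

    # Hypothetical integration check - get_quote, place_order and
    # wait_for_fill are illustrative names, not a real trading API.
    def test_buy_fills_close_to_quoted_price(trading_client):
        quote = trading_client.get_quote("EUR/USD")
        order = trading_client.place_order(pair="EUR/USD", side="buy",
                                           amount=10_000)
        fill = trading_client.wait_for_fill(order.id, timeout_seconds=5)

        # The price can legitimately move between quote and fill, so assert
        # a tolerance (half a pip here) rather than exact equality.
        assert abs(fill.price - quote.ask) <= 0.00005

The tolerance is the interesting design decision in a check like this: too tight and it flakes on normal price movement, too loose and it misses real pricing bugs.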

Heuristic Testing Cheat Sheet

Given the high level of exploratory testing I was able to do on my project, I generated a lot of ideas and identified some high-risk areas and common mistakes. I have decided to put together a heuristic testing cheat sheet for anyone carrying out exploratory testing of trading software.

The full-size version of my FX trading heuristic testing cheat sheet can be found here. I wanted to combine my knowledge of trading with some of the ideas I generated. On the sheet, my ideas are written around the knowledge in magenta-coloured boxes. I hope it may be useful to anyone working on trading software.

This post was also published on my company's blog, the Scott Logic Blog