New organization, new tester
I have the privilege of coaching a new tester in his new job at the company I'm currently working for. I know that teaching and coaching other people sharpens your own mind too, so that's a great benefit. The tester also expressed that he is glad he can ask me questions about his work, and he is very keen on learning the trade. With about a year of experience in scripted testing and re-testing defects in big organizations, he has now been plunged into a small IT department where we started working in an agile way with the Scrum method. No pre-written test scripts there. Exploratory testing and a critical mindset are very important in this environment. New for him, but also for me, as I am still learning about testing every day and don't consider myself "all-knowing". Coaching this tester does make me aware that I already know a lot about testing and can really help other testers get the best out of themselves. So, on to the actual blog; maybe you can use this in your daily practice.
Testing an overview screen
One change in the product under test was a new overview screen for customers. This screen was made almost on the last day of the sprint (iteration) and came as a bit of a surprise, because it was not planned; the developers built it in between other functionality. Easy job, no problems, so easy we don't have to test it. The screen was shown to the product owner, he looked at it and said it was OK. Ready.
Ready? Huh? Really?
The tester was relaxed after that and said he had nothing left to do for the rest of the day. The demo was given, the sprint was done, and the software was ready for release on Monday.
I asked the tester to show me the screen that had not been tested. In this screen you would see rows of software products, license numbers, and the start and end date of each license. It looked something like this (I made an example in Excel):
We looked at the screen and the tester mentioned that he had asked the developer what to test in it. The developers had said there was nothing to test in this screen. So when he showed it to me, he was just staring at it: "It looks OK to me. I don't know what to test here."
Questions to ask
So I spoke my thoughts out loud:
- I see a number of columns with data; where is this data coming from?
- Can we check if this data is correct by looking at the source?
- Why is the name of the software the same for every product in this screen?
- How is this screen sorted?
- There are 15 rows on the screen, but it says the list should show 20 items per page; are there more items?
- How would the screen present 21 items or even more?
- The column "License" shows telephone numbers (with an icon), so why is it called "License"?
- There is a search box at the top, does it work? How does it work?
- What if we add or change items in the source database or product? Is the change shown on this screen?
These were initial thoughts that were reasonably easy for me to come up with. For the tester it was a bit of a surprise that he could look at the screen this way. He discussed it with the developers, did some tests, changed the items, and got answers to these questions. The licenses were shown as phone numbers because the Skype software assumed they were phone numbers. Changes to the data and lists of more than 20 items worked OK, et cetera.
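Some of these questions can also be captured as quick, repeatable checks. Here is a minimal sketch in Python, assuming we can read the screen's rows into a list of dicts; the field names (`product`, `license`, `start`, `end`), the sample data, and the 20-items-per-page limit are all assumptions for illustration, not the real product:

```python
from datetime import date

# Hypothetical rows as they might appear on the overview screen.
rows = [
    {"product": "Skype", "license": "12345", "start": date(2013, 1, 1), "end": date(2014, 1, 1)},
    {"product": "Skype", "license": "12346", "start": date(2013, 2, 1), "end": date(2014, 2, 1)},
]

def page_size_ok(rows, limit=20):
    """The screen claims to show at most 20 items per page."""
    return len(rows) <= limit

def dates_consistent(rows):
    """A license should not end before it starts."""
    return all(r["start"] <= r["end"] for r in rows)

def sorted_by_license(rows):
    """One possible sort order to verify with the developers."""
    licenses = [r["license"] for r in rows]
    return licenses == sorted(licenses)

assert page_size_ok(rows)
assert dates_consistent(rows)
assert sorted_by_license(rows)
```

The point is not the code itself but that each question from the list above becomes an explicit, checkable expectation instead of "it looks OK to me".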
The only thing we could not really explain was the search box: it only searches the license numbers. Would that be enough for a customer searching for software he bought? As it was Friday afternoon, we applied the "most people are gone and it is time to get home for a nice weekend" stopping heuristic and went home.
The reason for writing this article is to emphasize that even if a product doesn't look like there is anything to test, there will certainly be something to test, just by asking questions about what you see and turning them into test ideas. Even if only to check that it really works as expected:
- Does it really work, and was it really that easy to program?
- Huh? Some things don't look normal to me. Write down those questions.
- Ask the questions to the developers or designers. So it works, but does it really work as expected for the user?
It is also a measure of risk: if we had found a number of issues in this screen, next time we would test harder and try to raise more questions. But for now it seemed to be working properly, so next time we will assign a relatively lower testing risk to a screen like this. It also builds trust in the team. But as I always say: trust, but verify.
Now the urge to quickly test this screen is satisfied and the weekend starts. The screen will be released on Monday, and we don't think it will raise a lot of questions for customers (except maybe the search box).