21 Philosophical Answers

In his blog, Colin Cherry posted 21 philosophical questions about software testing. While reading the questions, I immediately came up with some answers, so by the end of the list I thought: why not answer them on my blog? So here we are.

Software Testing: 21 Philosophical Questions

1) When you get an unexpected outcome, do you assume it’s a Bug?
That depends, of course, on the outcome. But my first answer would be NO. If the outcome is unexpected, I think about why I did not expect it. What is wrong, my expectation or the outcome? Are there more possible answers than just one?
I would say that even in the case of a check, not all unexpected outcomes are necessarily bugs.

2) % Production data (extracts), % manufactured data?
I really like production data. I like to have a big chunk of production data in my systems. It gives good examples of how the customers are using the system.
When it comes to new features or areas like reporting, I would also want good manufactured data, separated from the production data if the system allows it.
If you really want to know a percentage, I would say about 80/20. And it does not have to be a complete production dump.

3) Controlled analysis or independent thinking?
I like independent thinking. What is most important to me is that my co-workers can explain why they think the way they do. In the end I want to know what went through their minds when testing and how that fits my model of the system. What am I missing, and what are they missing? If they cannot come up with their own model, I try to assist by explaining my model, to help them build up their own.

4) Software Testing: Science or Art?
I would say both. It's like being an artisan. You need good or even excellent craft skills, and you need to be an artist to produce something that not everybody is able to reproduce.
There is room both for testers who are craftsmen and for artisans, if the environment allows such characters. For pure artists, like bug magnets without craft skills, there is not so much need.

5) When you find a Bug do you consider it a positive or a negative?
That depends on the where and when. In general, bugs are positive: I learn what works in the system and what does not. Often bugs tell me more about the dependencies and relations inside a system than the working processes do. In those cases: a-ha effect, awesome.
On the other hand, take one of my current projects. When the overall quality of a project is deep in the valley, and you think the developers have finally turned the corner and the system is getting better build by build, and then you find bugs again, bugs so easy to spot (like on log-in) that you realize they did not even properly test their implementation. When the context of a bug is that disappointing, then it's negative.

6) Is your Testing 50% done or 50% outstanding?
We are 50% done when we have seen big portions of the system and have not found many problems. If we're halfway through the planned activities and have found a lot of problems, then 50% is outstanding, with way more to come.

7) Do you look forward to discussions with Developers regarding their work?
Yes, always. Talking with the developers is always interesting. It gives insight into the system. You see the different characters of the developers. You learn what language they speak and can use their words for a better understanding. You can always ask them which areas of their code they would test more intensely. Ask them for test ideas. There is so much to talk about.

8) How much Testing is enough?
Until some stakeholder is able to make an informed decision and I'm sure that they have the biggest part of the decision-relevant information at hand. Too vague, of course.

9) Context-Driven or Factory-fed?
I was a context-driven tester in a factory environment for 10 years. I didn't even know about context-driven testing during all those years, and I was always disappointed with the factory approach to the project. So, definitely context-driven.

10) Automation speeds up or slows down your Testing initiatives?
It helps a lot, but not everywhere. Bring in automation where it makes sense, bring in a good approach to automation, something that is easy to adapt, and it helps.
Trying to automate everything, or using a bad approach, does not.

11) Vendor, Open-Source or Bespoke tools?

12) Certification or Accreditation?
Proof! Certificates are OK. Accreditation by whom? Most valuable to me is to see people test, and to ask questions along the way.

13) % Prevention, % Cure?
That varies from project to project.

14) Do you get concerned if you don’t find enough Bugs?
Usually, yes. Lately the software tends to break easily, so not finding many bugs makes me skeptical. But what is enough anyway? Looking into the crystal ball, rolling dice, reading tea leaves, asking the developers how many bugs they have hidden: each method is as exact as any other. So in the end it's only a gut feeling.

15) Should the Testing Team have a say in the release of software?
I like the way my company handles it. Test provides the information for a Go/No-Go meeting, where the test department suggests a go or no-go based on the open risks. This is not binding for the stakeholders, but they have to justify it to their bosses when they overrule a no-go.

16) What is an acceptable pass/fail ratio for System Test?
Depends on the fail. One epic fail may be a showstopper, whereas 20 small bugs might only be a reason to deliver an update a couple of days after UAT or go-live.

17) Triage: AM or PM?
Tried different times and approaches. No clear favorite.

18) UAT: Should non-professional Testers (i.e. Users) perform Test execution tasks?
Of course. Non-professional testers will be the users of the system.

19) Testing effort: Onshore or offshore?
That depends on the testing approach. I tried my first SBTM project, and I would not want to do it with an offshore team.
I like to have my team onsite communicating with each other, learning and improving.
I'm working with offshore now, and I have worked both successfully and unsuccessfully with offshore teams before. It always depends on the approach, the people involved, and the energy needed to keep it rolling.

20) Confidence or scepticism?
Always confident that I will learn more, always sceptical whether PM, BA, and Dev really did a good job.

21) Is your Testing completed or finished?
Only on hold for a certain time.

Thanks, Colin, for your philosophical questions. They gave me the chance to make up my mind a bit in some of those areas.
