Michael Bolton @michaelbolton, 30 tweets
1. Your periodic reminder: you don't need explicit expected outputs to test. In a real test, a genuine experiment, what counts is what actually happens, independent of any expectation that you might or might not have, explicit or tacit.
2. Moreover, when you're testing, you have tons of tacit expectations, some about outputs and some about other things. Interesting and unanticipated things happen when you're testing. A key job of the tester is to notice them and evaluate them, and to ask "are they *problems*?"
3. The common fixation on "expected result" and "actual result" leads to really lame testing that is practically guaranteed to miss a lot of problems that matter to people. Plus there's plenty of ambiguity about what "expected" means, and sometimes those meanings don't agree.
4. "Expectation" is ambiguous; it could mean "anticipated", "desired", "required", "not surprising", or combinations of those. But something you didn't anticipate could be quite desirable; something that you anticipated could be a bug if it is inconsistent with a requirement.
5. Expressing things in terms of "actual result" and "expected result" can lead to some obtuse reporting, too. ("Actual result: program crashes. Expected result: program should not crash.") As a developer, at best I'd have to assume that the tester was joking. At worst: an idiot.
6. A *condescending* idiot. So, what's a viable alternative to "expected result" and "actual result"? I greatly prefer /Problem, Example, and Oracle/.
7. A problem is something that represents a difference between what someone might perceive and what someone might desire; something that can probably be addressed, though probably not without some effort or some accommodation by someone.
8. A problem is something that represents the possibility of loss, harm, damage, bad feelings, or diminished value to some person who matters.
9. So, when you're testing, it's a really good idea to be especially vigilant for problems. Our clients hire us to find problems during development before they turn into bigger problems later in development or after the product has been released.
10. If you must preserve the "actual" business in your head: the observation of a problem is part of what you've been thinking of as "actual".
11. Another aspect of "actual" is an example: a description or account of how you saw this problem, and how someone else might be able to see it too. That often takes the form of a set of procedural steps to observe the problem, but not always.
12. Sometimes, when there's strong mutual understanding of the context, it might be less necessary to provide an explicit procedure; instead, you might note other things: specific data; a reference to a feature on a specific platform; a certain set of conditions.
13. The decision of what to provide as your example is partly technical, but it's also social. In some contexts, a detailed set of procedural instructions is necessary. In others, "Hey, look at this!" or "I see this problem when I choose a middle seat in business class" will do.
14. And sometimes, that social dimension is super-important. Some developers like explicit procedures in most contexts. Others hate form letters. There are group and individual dynamics that matter. Be aware of that; testing is a social enterprise, not just a technical one.
15. Now, *oracle*. An oracle is a means by which we recognize a problem when we encounter one in testing. At its base, an oracle pretty much always points to an inconsistency between the product's behaviour, state, or nature, and something that someone desires.
16. We have a set of heuristics for identifying (un)desirable (in)consistencies. It's not definitive; periodically we revise it and add new items to it.… Observation of desirable things is sometimes direct; sometimes it's mediated.
17. For instance: a product might show behaviour inconsistent with a claim written in a specification (that's a common source of "expected results") or with a statement that a person has uttered verbally. Either way, an inconsistency points to some kind of problem.
18. A product might show behaviour inconsistent with a comparable product. Note that the comparable product in question is not necessarily a *similar* product, but one that *affords a comparison*: a test tool, an algorithm, a similar feature in a dissimilar product.
19. When a product exhibits behaviour inconsistent with predicted and desired output from a tool set up for automated checking, that tool is an oracle. It affords a means by which we observe a problem. It's a medium, though, often reflecting a specification which is a medium too.
20. The automated check represents content from the spec; the spec represents some person's explicitly stated desire or requirement for the product. Inconsistency anywhere in this chain points to some kind of problem in the relationship between the product and some person.
21. The product might have a problem; the automated check might have a problem; the spec might have a problem; and there may be a problem in understanding or fulfilling someone's requirement or desire, irrespective of what someone "expects" (using quotes to emphasize ambiguity).
22. Moreover, any of these problems might be relative to something that has been explicitly stated; to something that has been implicitly or tacitly known all along; or to something that has just now popped into our awareness as we're testing. That is: something *unexpected*.
23. To me, the core of testing a product is to discover what the product *actually* does; to narrate the process of doing that; and (most important to our clients) to identify problems, and provide a credible justification for believing that they are problems that matter.
24. Testing is not simply comparing the product to "expected" results and observing an "actual result". That can be done algorithmically, mechanically. We would call that checking.…
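To make the distinction concrete, here's a minimal sketch of a check in Python. The product function, the spec value, and all the names here are hypothetical, invented for illustration; the point is that the comparison is purely mechanical, and passes or fails without noticing anything the expected result didn't anticipate.

```python
# A "check": a mechanical comparison of an output against an explicit
# expected result. Everything below is a toy example, not real product code.

def discount_price(price: float, percent: float) -> float:
    """Hypothetical 'product' code under check."""
    return round(price * (1 - percent / 100), 2)

def check_discount() -> bool:
    # The expected result encodes one explicitly stated claim
    # (imagine it came from a spec: "15% off $100.00 is $85.00").
    actual = discount_price(100.00, 15)
    expected = 85.00
    return actual == expected

result = check_discount()  # passes
```

The check passes, yet it says nothing about problems no one anticipated when the expected result was written: negative percentages, currency rounding rules, a confusing UI, a crash on huge inputs. Noticing those requires a person testing, not just a check running.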
25. Testing a product requires learning about it; exploring it; experimenting with it; *challenging* it in a vigorous search for problems that threaten its value, and that thereby pose risk to the business. This is investigation. This is applied critical thinking.
26. Anyone who has taken on a testing role, however temporarily—testers, developers, DevOps folk, business analysts, tech support people, documenters—can test a product in this way, and can learn to do it well. Plus, it's often less effort and tedium than expected/actual.
27. Testing in this way does require skills: modeling the product, the test space, risk, procedures, oracles, and coverage; seeking and obtaining useful information; note-taking, recording and reporting; and those skills require practice. That practice, and those skills, pay off.
28. Telling the story of the product and the story of your testing is key to something the airline business tends to do very well and that the software business does all too poorly: learning from every problem. Does actual/expected help with that? Not very much at all, I observe.
29. So, if you're still with us: a word from our sponsor. We help people learn how to think about testing and how to test in vastly more powerful ways than expected/actual affords. If you'd like to do that, join us in Seattle:…
30. And if you can't or won't join us in Seattle, some of these links might help you to get over the expected/actual fixation:…;…;…