The Fallacy of Omniscience by Proximity



I was reading about President Hollande's apparently complex love life, and a journalist who used to live in France was asked for his opinion on the story. He responded that, because he had once lived there, those asking assumed he knew all about the affair and could offer some groundbreaking insight.

Turns out, he hadn't lived in France for years, so he was about as wise as the rest of us. Those seeking his opinion had fallen for the wonderfully named:


'The Fallacy of Omniscience by Proximity'

It means (in a general sense) that you are, or used to be, in close proximity to a subject, so it stands to reason you must know all about it, right?

After pondering this lovely phrase for a while, I realised it has real relevance for us testers.

Firstly, testers are often treated as a respected oracle for a domain and/or an application. You know that person who 'has been here forever and knows everything about X.' That is never literally true, and it reinforces that all oracles for testing (even our fellow testers) are fallible. I still see (and participate in) this behaviour often.

Secondly, I have found that once you have tested an application, it is assumed you know every major and minor path through its functionality. As a tester, consider this statement, perhaps from another product stakeholder:


'Well, you tested it! Why didn't you know that when I do Y it triggers behaviour Z?!?'

We know complete testing is impossible (our stakeholders may not), but there is often an assumption that testing brings 'complete knowledge' of an application. This is a classic example of the fallacy in question, and it has its roots in a fundamental misunderstanding of the purpose of testing that persists today.

For now, I'm delighted to find a new way to describe a problem I ponder often.
