Introducing exploratory testing at Improbable Defence

Testing is important to us here at Improbable. It provides the framework for us to check, measure and explain how functional, performant and accessible our products are. Just as you would not put a baby into the bath without making sure the water was neither too hot nor too cold, we would be loath to release our software to partners or customers without making sure that it, too, was comfortably able to meet their needs.
One facet of that testing, and one of the pillars our test practice is built on, is exploratory testing. This is the process by which we use deep, technical system investigations to find and communicate information related to the overall quality narrative.
These system investigations span the whole stack: reviews of the technical architecture to ensure testability; testing of the components that live in the platform; testing of the data that drives our models; testing of the middle layers that glue it all together; and, at the top, the user interfaces and the information we extract from the systems that monitor all the traffic going across the stack.
A quality narrative is what we expose as a result of designing, implementing, changing, using, testing, monitoring and stressing our systems. This in turn leads to informed decision-making around their capability and reliability.
Improbable’s test practice is responsible for championing testing for our products. We do this through the creation of ways of working related to testing and providing continuous feedback on what we’re building.
Elizabeth Fiennes, Quality Engineering Manager, talks us through how we do our exploratory work and the team behind it.
Our test practice
The test practice within Improbable Defence is relatively new, having been formed in late 2020 at the request of management and engineers to bring in testers to help professionalise how we verify and validate our systems. Our testers come from a broad range of industries and disciplines, meaning that we’ve been able to leverage the latest ways of working, as well as creating some of our own, to build a testing and feedback approach unique within the defence industry.
Our testing approach rests on two pillars: automation, to check that known product behaviour persists, and exploratory testing, in which we run experiments and investigations to uncover new information about the function and performance of our systems. This means stepping outside what we know about the product and looking for the unknowns.
Once we have confirmed expected behaviours and uncovered some unexpected ones (!), these are considered for inclusion in our automation pack, reducing the amount of manual testing and increasing how much exploratory testing we can do with every release of a product. Future posts will cover the what and the how of our thinking and tools for automation.
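As a minimal sketch of what that promotion can look like (the endpoint, parameters and expected response here are invented for illustration, not taken from our actual products), a behaviour first spotted in an exploratory session gets pinned down as an automated check that runs with every release:

```python
# Illustrative only: a behaviour found while exploring, pinned down as an
# automated regression check. The endpoint, query parameter and expected
# status code are hypothetical stand-ins.
import requests

BASE_URL = "https://api.example.test"  # placeholder host


def test_unit_search_rejects_oversized_query():
    # Discovered in an exploratory session: very long search queries used to
    # crash the service. The fix should reject them gracefully instead.
    response = requests.get(f"{BASE_URL}/units", params={"q": "x" * 10_000})
    assert response.status_code == 400
```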
Trusted advisor roles
We are embedded into cross-disciplinary agile project teams to see what’s important to them and focus on providing the right information at the right time. We flag issues in small, tight feedback loops, but also share the good news to celebrate when our teams do well. By embedding into a team and focusing on what they need, we’re trusted to be the voice of testing rather than a bottleneck or a gate.
Testing world explorers
We’re happy working with rapidly changing requirements and use exploratory testing throughout the technical stack to uncover new information related to risks. Our exploratory testing is structured, traceable back to requirements and built on note-taking to allow for auditability, as in the sketch below. By exploring the functional and non-functional parts of our systems and components, we provide insight into how what we’re building will work in the field.
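To illustrate what "structured and traceable" can mean in practice (a hedged sketch, not our actual note format; all names and requirement IDs here are invented), each exploratory session can be recorded against a charter and the requirements it touches:

```python
# Illustrative only: one way to structure exploratory session notes so that
# every session is traceable back to requirements and auditable later.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class ExploratorySession:
    charter: str                # the mission for this session
    requirement_ids: list[str]  # traceability back to requirements
    tester: str
    started_at: datetime
    notes: list[str] = field(default_factory=list)
    issues_raised: list[str] = field(default_factory=list)


session = ExploratorySession(
    charter="Explore map rendering under degraded network conditions",
    requirement_ids=["REQ-214", "REQ-219"],  # invented IDs
    tester="efiennes",
    started_at=datetime.now(),
)
session.notes.append("Tiles reload fully on every pan once latency exceeds 300ms")
```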
Ways of working
We work in agile project teams that include testers, developers, designers and project managers, all working together to create products either alone or as part of a larger programme of work. When we join a team, we consult with them to create an approach that’s both holistic and pragmatic, identifying the testing needs that will make what’s being built a success. Rather than taking a one-size-fits-all view of testing, we work out what “fit for purpose” means for what we’re building; this includes how it’ll work in the real world (how performant, secure, usable and accessible it needs to be).
We shift our testing both left – building testability into architectural proposals and testing early in the software development lifecycle – and right – continuously monitoring, and testing later in the lifecycle after deployment to our customers. This means testing the ideas for what a product should be, testing how we’ve asked for what should be built, as well as testing for the impact our product has when it’s used in the real world.
Learning, learning, learning
We coach in the large and mentor in the small, both within our own practice and across the wider organisation. Whenever we learn something new, or have knowledge and skills that could be useful, we share it with everyone. As well as our weekly “Let’s get technical testers” sessions, where we train within the testing practice on testing tools and skills, we run outreach into the wider organisation through workshops, wiki pages, blogs and proofs of concept.
We look to learn and teach outside of Improbable too, with our test practice members writing blogs, attending meetups and conferences, and presenting talks. Each test practice member also has a learning budget that they can use to attend external courses or buy materials.
No favourite toys
We use a broad range of tools to support our exploratory testing – tools to manipulate the data sent to APIs, tools to manipulate browser behaviour (speed and version) and tools to create varied network conditions, so we can see how our applications behave when they’re not on a stable, standard London 4G connection. We always look to employ the best tool for the job, rather than trying to crowbar in the wrong thing or test for comfort.
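As a hedged sketch of the kind of thing those tools do (using Playwright’s Python API as one example; the endpoint, payload and URL below are invented for illustration), we can tamper with API traffic in flight and emulate a poor connection via the Chrome DevTools Protocol:

```python
# Illustrative only: intercepting API traffic and degrading network
# conditions with Playwright. Endpoint, payload and URL are invented.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()

    # Manipulate data sent to an API: rewrite a request body in flight.
    def tamper(route):
        route.continue_(post_data='{"rank": "not-a-valid-rank"}')

    page.route("**/api/units", tamper)  # hypothetical endpoint

    # Emulate a slow, high-latency link instead of stable London 4G.
    cdp = page.context.new_cdp_session(page)
    cdp.send("Network.enable")
    cdp.send("Network.emulateNetworkConditions", {
        "offline": False,
        "latency": 400,                   # ms of added round-trip delay
        "downloadThroughput": 50 * 1024,  # ~50 KB/s down
        "uploadThroughput": 20 * 1024,    # ~20 KB/s up
    })

    page.goto("https://app.example.test")  # placeholder URL
    # ...explore how the UI copes, taking session notes as we go...
    browser.close()
```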
We’ve researched and run proofs of concept across a range of non-functional test tools, use CI-driven UI/API automation built on industry-leading frameworks, and have created bespoke frameworks for verifying the modelled behaviour of our simulations. Then we create workshops and teach others how to use them too.
Shortening feedback loops
We live by the mantra “prevention is better than cure”, so we look to stop bugs before they even happen. We test early by testing ideas and helping to shape stories, looking for risks early to inform technical designs, using what we call Triforce (aka the 3 Amigos). Triforce is a meeting where three viewpoints (product, engineering and testing) come together to refine what we’re asking for; this collaboration gives us a holistic perspective on what needs to be built, which reduces the risk of issues, or of building the wrong thing, before we even start any coding. Then, by exploring what’s being built as we build it, we can talk about its purpose and prevent bugs from being committed to the product in the first place.
Innovating by using the best of exploratory testing techniques
Traditional development wisdom says that you can only test something once it’s been built; this works fine for a battleship, but for software we want to shorten the time to feedback. Our approach of exploring to uncover information means we can start testing as things are developed, test where there is no spec, and find information that the spec doesn’t even cover.
Planned exploratory testing allows us to find a wealth of information about what we’re building and provide a holistic, technical approach to testing complex integrated systems. Rather than saying “we’ve met the spec”, we can say “here’s how we’ve met the spec, these are the potential pitfalls we saw around it and here’s what we found that’d delight the end user”. This wealth of information related to our systems allows us to make more informed decisions on whether a product or feature is done and ready for the customer.
Join our test practice
If you’d like to learn more about test roles at Improbable Defence, you can see our open roles for Exploratory Testers and Software Development Engineers in Test (SDETs) on our career opportunities page.