Why test plans often aren't test plans
by Rainer Haupt
Many test plans aren’t. They cite the test pyramid, define unit and integration tests, and rehearse content from the ISTQB Foundation syllabus. Germany’s Federal Office of Administration puts it bluntly: “There is no such thing as the test plan.”
The boilerplate problem
In practice, test plans of 30 or 50 pages are common — and largely project-independent. Explanations of the test pyramid. Definitions of test levels. Generic statements about test automation. References to the V-Model. This material belongs in an onboarding wiki, not in the steering document of a specific project.
Martin Klonk writes in Informatik Aktuell: “What gets produced are tomes that don’t really help anyone — and that, apart from the author, almost no one ever picks up again later.”
What the standards actually require
ISTQB CTFL v4.0.1, Principle 6: “Testing is context dependent. There is no single universally applicable approach to testing.” IEEE 829 mandates 16 clauses, at least five of which must be project-specific — including Environmental Needs, Staffing and Training Needs, and Risks and Contingencies. ISO/IEC/IEEE 29119-3 prescribes mandatory sections for context, risk register, communication, staffing, and schedule.
None of these standards require a plan to explain the test pyramid. All three require it to describe the project.
What actually belongs in a test plan
Automation strategy. Which tests get automated, which don’t? Is automation worthwhile for a one-time migration? Are there tests that run exactly two or three times in their lifetime? These answers determine tooling and return on investment.
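The return-on-investment question can be made concrete with a crude break-even calculation. A minimal sketch; the helper name, parameters, and all numbers are invented for illustration:

```python
def should_automate(expected_runs: int,
                    minutes_manual: float,
                    minutes_to_automate: float,
                    minutes_per_automated_run: float = 0.5) -> bool:
    """Crude break-even check: automate only if the total effort over the
    test's lifetime is lower than running it manually every time."""
    manual_total = expected_runs * minutes_manual
    automated_total = minutes_to_automate + expected_runs * minutes_per_automated_run
    return automated_total < manual_total
```

A test that runs twice during a one-time migration (say 5 manual minutes per run, 120 minutes to automate) fails this check; a nightly regression test clears it easily. The point is not the exact numbers but that a plan should state the expected run count at all.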
Regression intent. Tests written for regression have different requirements than tests written for one-time acceptance. Stability, maintainability, and runtime are critical in the first case, almost irrelevant in the second.
Test environments. What is the purpose of each environment? What actions are permitted there? What data may reside there? Which production artifacts are available — and which are not? Hettwer Beratung puts it this way: “The organisation of the test environment must be defined individually for each project.”
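The environment questions above can be answered as structured data instead of prose. A sketch; the environment names, data classes, and policy check are all hypothetical:

```python
# Hypothetical environment matrix; all names and policies are invented.
ENVIRONMENTS = {
    "dev":     {"purpose": "developer testing",  "data": "synthetic",  "prod_artifacts": False},
    "staging": {"purpose": "release acceptance", "data": "anonymised", "prod_artifacts": True},
    "perf":    {"purpose": "load testing",       "data": "synthetic",  "prod_artifacts": False},
}

# Data classes ordered from least to most risky.
DATA_ORDER = ["synthetic", "anonymised", "pseudonymised", "real"]

def data_allowed(env: str, data_class: str) -> bool:
    """Permit a data class only if it is no riskier than the ceiling
    defined for that environment."""
    ceiling = ENVIRONMENTS[env]["data"]
    return DATA_ORDER.index(data_class) <= DATA_ORDER.index(ceiling)
```

A table like this forces the per-project answers the quote demands: every environment gets an explicit purpose, a data ceiling, and a statement about production artifacts.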
Test data. Production data is, in most cases, not legally permissible under GDPR. Germany’s Federal Commissioner for Data Protection defines a clear order of preference: synthetic first, then anonymised, then pseudonymised — real data only when all other options are ruled out. This order belongs in every test plan that touches personal data.
Team skills. Which roles are filled? Are there dedicated testers? Test automation engineers? Are developers available for test support? Which tech stacks does the team master — Python, JavaScript, Java? A plan that prescribes Selenium-Java fails in a team of pytest experts before it starts.
Tech stack. Which languages, frameworks, CI/CD pipelines, and reporting tools are in use? Which tooling decisions fit the existing stack? These answers can’t be generalised — they hang on the specific system under test.
Why it matters
Capgemini’s World Quality Report 2025/26 names a central pain point of the industry: 60 percent of organisations struggle with secure, scalable test data. Projects fail on this — not on the mix ratio of the test pyramid.
Even Mike Bland, who helped popularise the famous 70/20/10 rule at Google, today admits: “Yes, these numbers essentially were pulled out of a hat.” Spotify calls the pyramid “actively harmful” for microservices. Martin Fowler walks back his own 2012 contribution. The practitioner consensus is unambiguous: context beats schema.
Three questions for every test plan
- Does the document contain anything that could appear unchanged in any other project? Then it doesn’t belong here.
- Does the document answer what gets tested in which environment with which data? If not, the central content is missing.
- After reading, would an outside reader have a clear picture of risks and responsibilities? If not, the steering function is missing.
A test plan is a steering document for a specific project. Taking that seriously means writing different plans: shorter where the textbook suffices, longer where environments, data, skills, and risks are at stake. Plans built this way create clarity about the testing instead of filling pages.
Sources
- German Federal Office of Administration: QS-Baukasten — Testkonzept (in German)
- Martin Klonk, Informatik Aktuell: The perfect test plan in 6 steps (in German)
- ISTQB CTFL Syllabus v4.0.1, Section 1.3
- IEEE 829 Outline
- ISO/IEC/IEEE 29119-3 (2013)
- Hettwer Beratung: Testumgebung (in German)
- BfDI: Position paper on test data (in German)
- Capgemini World Quality Report 2025/26
- Mike Bland: Small/Medium/Large
- Spotify Engineering: Testing of Microservices
- Martin Fowler: On the Diverse And Fantastical Shapes of Testing (2021)
Writing a test plan for a specific project? In the UTAA workshop we work out the project-specific content that standards demand. More on the method or get in touch directly.