User stories are usually written from the perspective of a functional user of the product. However, some user stories focus on the non-functional requirements of a system.
A technical user story is typically focused on non-functional requirements or on support of a system. At times, it may be about removing functionality that is no longer necessary, or it may address classic non-functional requirements such as security, performance, or scalability.
Just because technical stories do not articulate functional features does not mean they cannot be estimated, prioritized, assigned, tested, and completed just like any other user story. It simply means they should be written and tested differently.
Sometimes, technical stories describe bugs or issues that need to be corrected.
We can test them in much the same way as usual functional stories. Technical user stories do not deliver business value to the customer directly, but they are very important for improving the overall quality of the software.
Testing technical stories gives the test team a broader spectrum of testing approaches beyond the traditional testing carried out on the user interface.
It gives us an opportunity to learn about, verify, and validate the product in less common ways, such as checking config and log files, adding parameters to logs for debugging purposes, inspecting API calls and HTTP requests, and querying the database, using various tools.
Items to Consider:
The following kinds of technical user stories can be tested by the test team:
- User stories for technical debt and refactoring
- User stories for database design
- User stories for software design changes
- User stories for code analysis/coverage fixes
- User stories for SQL performance tuning
- User stories for feature removal / technical bug fixes

Some notes to keep in mind:
Technical user stories require special attention because we depend more heavily on information from, and discussions with, the developers who will implement the story. The test team must be attentive during backlog refinement and sprint planning, where the product owner and developers explain the technical user stories and discuss the acceptance criteria. The team should also consider the discussions that take place during standups and in the discussion section of the technical user story.
In an agile process, we can drive technical user stories the same way as functional user stories.
Test Strategy:
Functional user stories have definitive acceptance criteria: after implementation, the application behaves in a certain way, and that behaviour is visible. In the case of technical user stories, we might not find any visual difference in the software product.
For technical stories, the user interface may not even be available. To decide the test strategy, the test team needs to review the impact area, the technical information given in the user story, the discussions on the story, and the pull request details. All of this information helps us decide the types of testing and derive the test scenarios and test cases.
Test strategy definition: A test strategy is an outline that describes the testing approach for the software development cycle. It includes the testing objective, methods of testing new functions, the total time and resources required for the project, and the testing environment.

Testing Types:
- Functional testing
- Regression testing
- Non-functional testing (security/performance testing)
What We Should Consider:
User interface testing: We can check all related user interfaces that are likely to be affected by the implementation of the technical user story. For example, for a technical user story that covers removal of a service, a third-party API, or a product feature, we can verify that the changes made to the system have no impact on existing features.
Database testing: If a technical story is about a new database view developed for a module, we can verify that the data retrieved from the view is correct and appears consistently in other modules.
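To make this concrete, here is a minimal sketch of such a view check in Python using the standard sqlite3 module. The app.db file, the order_summary view, and the orders base table are hypothetical names invented for this illustration:

```python
import sqlite3

conn = sqlite3.connect("app.db")  # hypothetical database file

# Totals as reported by the new view
view_rows = conn.execute(
    "SELECT customer_id, total_amount FROM order_summary ORDER BY customer_id"
).fetchall()

# The same totals computed directly from the base table
base_rows = conn.execute(
    "SELECT customer_id, SUM(amount) FROM orders"
    " GROUP BY customer_id ORDER BY customer_id"
).fetchall()

# The view should agree with the base data it summarizes
assert view_rows == base_rows, "view data diverges from base tables"
print(f"verified {len(view_rows)} rows")
```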
Checking logs: If the user story is about improving system logs, we can check and verify the logs after additional logger levels (such as info, warn, fatal, and debug) are added to improve readability, and confirm that the product's log file captures all expected detail, works as expected, and generates no unexpected errors.
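As a rough sketch, such a log check can be scripted. The example below assumes a plain-text log whose lines start with an upper-case level; the app.log path and the line format are assumptions to adapt to the product's actual logs:

```python
import re
from pathlib import Path

ALLOWED = {"DEBUG", "INFO", "WARN", "ERROR", "FATAL"}
LINE = re.compile(r"^(?P<level>[A-Z]+)\b")

malformed, errors = [], []
for line in Path("app.log").read_text().splitlines():
    match = LINE.match(line)
    if not match or match.group("level") not in ALLOWED:
        malformed.append(line)  # line has no recognizable level
    elif match.group("level") in {"ERROR", "FATAL"}:
        errors.append(line)     # unexpected error during the test run

assert not malformed, f"{len(malformed)} lines without a valid level"
assert not errors, f"{len(errors)} error entries, e.g. {errors[0]!r}"
print("log format and content look good")
```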
Verification of the config files: If a technical user story is about adding new configuration entries or triggers to a config file, or about specifying the location of log files, we can check and verify whether these values actually exist in the config files.
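Here is a minimal sketch of such a check, assuming an INI-style config file with a hypothetical [logging] section; the file name, section, keys, and expected values are all invented for illustration:

```python
import configparser

config = configparser.ConfigParser()
assert config.read("app.ini"), "config file not found"

# Key/value pairs the story says should now exist
expected = {"log_path": "/var/log/app", "log_level": "INFO"}

for key, value in expected.items():
    actual = config.get("logging", key, fallback=None)
    assert actual == value, f"{key}: expected {value!r}, found {actual!r}"
print("all expected config entries are present")
```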
Non-functional testing: Security or performance testing can be done using tools such as Fiddler or JMeter, and the results analyzed against the acceptance criteria.
For security testing, we can check and verify that any sensitive data in the browser cookies is stored in encrypted form. This can be done by capturing HTTP traffic in the browser's developer tools or in Fiddler.
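As an illustration, the same idea can also be scripted instead of checked manually. The sketch below assumes a hypothetical login endpoint and a known test credential; the URL, field names, and credential are inventions for the example:

```python
import requests

PASSWORD = "s3cret-example"
resp = requests.post(
    "https://example.test/login",  # hypothetical login endpoint
    data={"user": "tester", "password": PASSWORD},
)

# No cookie set by the server should carry the password in plaintext
for cookie in resp.cookies:
    assert PASSWORD not in cookie.value, (
        f"cookie {cookie.name!r} exposes the credential in plaintext"
    )
print("no plaintext credential found in the cookies")
```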
Performance testing is carried out by creating and executing new scripts, or by running existing scripts, and verifying that the system response has not degraded due to the changes implemented by the technical user story.
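For simple cases, a response-time check can also be scripted directly, as in this sketch that compares current latency against a previously recorded baseline; the URL and the baseline figure are assumptions for illustration only:

```python
import statistics
import time

import requests

URL = "https://example.test/api/orders"  # hypothetical endpoint
BASELINE_SECONDS = 0.30                  # median from a run before the change

samples = []
for _ in range(20):
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    samples.append(time.perf_counter() - start)

median = statistics.median(samples)
# Allow modest headroom over the baseline before flagging a regression
assert median <= BASELINE_SECONDS * 1.2, f"median {median:.3f}s regressed"
print(f"median latency {median:.3f}s is within the baseline")
```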
Out of Scope:
This article does not consider the following checks when testing a technical user story:
- Installation procedures
- Checks related to coding
Test Management and Execution:
Test management includes test planning, design, and writing the test cases. Test execution incorporates executing the test cases, recording the results, reporting any bugs, and retesting them.
Testing is accomplished as follows:
- Deriving and writing test scenarios and test cases from the acceptance criteria, from information and discussions in sprint backlog refinement meetings, from discussions with the development team, and from information available in the technical user story, such as specifications and design information
- Reviewing the test scenarios and test cases
- Test case execution
- Reporting a bug in case of deviation from the accepted behaviour
- Retesting the reported bugs
- Delivering the technical user story

Useful Testing Tools:
- Developer tools
- Fiddler
- Notepad++
Technical user stories are informally tested by developers. If testers also test them, we gain the following advantages:
- The test environment simulates the user environment
- Testing by a dedicated tester gives additional confidence
- The tester takes responsibility for verifying the quality of the product
We suggest a context-driven testing approach, for the following reasons:
- Our testing approach will change as we work on different technical user stories; the testing scope may vary with the story being tested.
- Test cases created for technical user stories are less likely to be reused. We can write the test scenarios/test cases and document them in the task, and record the task numbers on the organization's wiki pages for future reference.
- We can verify config files, monitor HTTP requests and responses, and check the contents of browser cookies and log files as part of the testing process.
Our testing approach depends entirely on the acceptance criteria, description, and discussion of the technical user story. We might follow a different approach for another story by selecting different types of testing, different tools, or a different test design technique.
Example at a Glance:
Consider the technical user story below as an example, and how we would test it.
Technical story: ‘To ensure that after removal of the SMS service from the system, the software works in the same way and no difference is observed in the functioning of the other software modules.’
Testing of the above story requires new test cases and regression testing.
New test cases check that the removed SMS service no longer exists and that no HTTP request/response communication with the removed SMS service is made from any other system module.
Regression testing includes a set of existing test cases to validate that the removed feature has no impact on other existing modules.
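As a rough sketch, both new checks could be automated along these lines, assuming a hypothetical /api/sms endpoint and a plain-text export of the captured traffic (for example, saved from Fiddler); both names are inventions for the example:

```python
import requests

# 1. The removed endpoint should no longer be served
resp = requests.get("https://example.test/api/sms", timeout=10)
assert resp.status_code == 404, f"SMS endpoint still answers: {resp.status_code}"

# 2. No captured request from other modules should still reference it
with open("captured_traffic.txt") as capture:
    offenders = [line for line in capture if "/api/sms" in line]
assert not offenders, f"{len(offenders)} requests still call the SMS service"
print("no trace of the SMS service in the captured traffic")
```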
Create new testing tasks against the story and log time on the tasks; the assumption is that test estimates were already given during the sprint planning session.
- Task for designing test scenarios/test cases. Test cases can be written in the task itself or in a test management tool.
- Test case title: Verify that the removed SMS service is no longer shown in the config files, the user interface, the help files, or the user manual
- Verify that HTTP requests do not call the removed SMS service or its associated services
Let us assume that the test case titles are written in the description area of the same task. We can also write instructions to be used during test execution.
For example, the locations of the config and log files, and the exact path in the web UI where the feature was visible before removal.
- Task for test execution:
- We can copy the test case titles into the test execution task and mark each test passed or failed in the task itself after execution
- Reporting bugs:
- We can log a bug/defect in case of deviation from the accepted behavior. An accepted bug/defect follows the bug life cycle: it is fixed, retested, and closed.
- Once all testing tasks are completed and closed, and the bugs are retested and closed as per the agreed definition of done, we can close the story.
Useful tools for the above story are Notepad++ to check the configuration files and Fiddler to capture the HTTP request and response communication between the different systems of the software. We can also use the browser developer tools to check HTTP requests and responses on web pages.
Conclusion
With a little extra effort, we can test technical user stories effectively and improve the overall quality of the product.
We can nail down the testing of a technical user story with a clear understanding of how the story is implemented, along with good communication within the team.
Credits
- From strategy to execution in a lean and effective way
- Context-driven testing, James Bach: https://www.satisfice.com/
- https://rgalen.com/agile-training-news/2013/11/10/technical-user-stories-what-when-and-how
- https://blog.arielvalentin.com/2007/09/technical-user-stories.html
About the Author:
Hi! I am Dhanalaxmi Otari. I am an ISTQB-certified tester with around 9 years of experience in software testing, currently working with Instem Inc. I am a passionate software tester and like to experiment with and apply my learnings in my daily work. I like to share and discuss thoughts and ideas. I have worked in different domains, such as clinical trials, insurance, eForms, and workflow.