This is Part 7 of the Klariti Primer on AI for Software Testing
Welcome back to Klariti’s Primer on AI for Software Testing. In our previous discussions, we’ve explored AI’s role in managing deviations and controlling data access. Now, we turn our attention to a cornerstone of the testing process: the Issue Log. This document is the central repository for tracking defects, errors, and other concerns identified during testing. As a software tester, you know that a well-maintained Issue Log is critical for traceability, communication, and ultimately, product quality.
Download Software Testing Templates: MS Word + Excel
Challenge: From Hasty Notes to Actionable Issue Log Entries
As QA professionals and testers, we’re adept at finding issues. However, translating those findings into consistently clear, detailed, and actionable entries in an Issue Log can be a significant challenge, especially under time pressure. Here are some of the common problems I’ve encountered:
- Inconsistent Descriptions: Vague or incomplete descriptions make it difficult for developers to understand and reproduce the issue.
- Subjective Prioritization: Assigning priority (Low, Medium, High) can be inconsistent without clear criteria or impact assessment.
- Missing Information: Key fields like “Issue Related to,” “Activity” (steps to reproduce), or “Outcome” (expected vs. actual) might be overlooked or poorly documented.
- Time Drain: Manually crafting each entry with the necessary detail for all fields (Ref #, Date Raised, Issue Related to, Description, Priority, Activity Date, Resource, Activity, Outcome, Resolved Date, Status) is time-consuming.
An Issue Log filled with poorly documented entries leads to wasted time, miscommunication, and a higher chance of critical issues being misunderstood or deprioritized. How can we ensure our Issue Logs are not just lists, but powerful tools for quality improvement?
Scenario/Context: The Cost of a Poorly Organized Issue Log
I recall a critical performance issue that was logged with a description like “System slow during month-end processing.” The “Activity” section was sparse, and the “Outcome” simply stated “Takes too long.” Because the entry lacked specific metrics, reproducible steps under defined conditions, and a clear articulation of the business impact, developers struggled to pinpoint the cause.
The priority was initially set to Medium. It took several back-and-forth cycles, consuming valuable time from both testing and development, to gather the necessary details.
By the time the full impact was understood and the issue re-prioritized to High, the release timeline was already at risk. This scenario highlights how deficiencies in the Issue Log directly impede efficient defect resolution and can mask the true severity of problems.
Using AI Tools: Crafting Better Issue Log Entries with ‘Structured’ Prompts
AI, particularly LLMs like Google Gemini, can be a helpful assistant in transforming raw observations into high-quality Issue Log entries. By crafting effective prompts, we can use AI to populate key fields with clarity and detail. Let’s look at how we can apply Simple, Advanced, and Complex prompts to enhance our Issue Log management.
Key Issue Log Fields AI Can Assist With:
- Issue Related to: Brainstorming connections to other system parts.
- Description: Drafting comprehensive and clear narratives.
- Priority: Suggesting priority based on impact and keywords.
- Activity: Structuring detailed, reproducible steps.
- Outcome: Clearly articulating expected vs. actual results.
- Status Updates/Summaries: Drafting concise updates or analyzing trends.
(Note: Fields like Ref #, Date Raised, Resource, Activity Date, and Resolved Date are typically factual or system-generated and less suited for AI generation, though AI can help format or check for completeness.)
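To make that division of labour concrete, here is a minimal sketch of what an Issue Log entry might look like if you also track it in code alongside your spreadsheet. The dataclass and completeness check are illustrative only and are not part of the Klariti templates; the field names simply mirror the columns listed above.

```python
from dataclasses import dataclass, fields

@dataclass
class IssueLogEntry:
    """Illustrative structure mirroring the Issue Log columns listed above."""
    ref: str                    # Ref # (factual, assigned by the tester or tool)
    date_raised: str            # Date Raised (factual)
    issue_related_to: str = ""  # Issue Related to (AI can help brainstorm)
    description: str = ""       # Description (AI can help draft)
    priority: str = ""          # Low / Medium / High (AI can suggest, tester decides)
    activity_date: str = ""     # Activity Date (factual)
    resource: str = ""          # Resource assigned (factual)
    activity: str = ""          # Activity: steps to reproduce (AI can help structure)
    outcome: str = ""           # Outcome: expected vs. actual (AI can help articulate)
    resolved_date: str = ""     # Resolved Date (factual)
    status: str = "Open"        # Status

def missing_fields(entry: IssueLogEntry) -> list[str]:
    """Return the names of any fields still left blank, as a simple completeness check."""
    return [f.name for f in fields(entry) if not getattr(entry, f.name)]
```

A check like missing_fields() is the kind of mechanical completeness test mentioned in the note above; the judgement calls on priority, severity, and business impact stay with you.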
1. Simple Prompts (Quick Enhancements for Issue Log Entries)
Use these for quick drafting, summarization, or initial categorization.
What they achieve: Quickly generate or refine basic elements of an issue entry.
Example Prompts:
"Given this tester's note: 'Login button not working after 3 tries.' Draft a concise Description for an Issue Log."
Rationale: Converts a brief note into a more formal description.
"Based on the description: 'User data is not saved when navigating away from the profile page without clicking save, leading to data loss.' Suggest a Priority (Low, Medium, High) with a brief justification."
Rationale: Helps in initial, quick prioritization.
"My observation: 'Clicked submit on the search form with no text, got a 404 error page.' Help me phrase the Activity (steps) and Outcome (expected vs. actual) for an Issue Log."
Rationale: Structures the core reproducibility information.
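If you prefer to script this pattern rather than paste prompts by hand, a short helper can wrap a tester’s note in a Simple prompt and send it to the model. The sketch below assumes the google-generativeai Python client, a GEMINI_API_KEY environment variable, and the "gemini-1.5-flash" model name; substitute whichever client and model your team actually uses.

```python
import os
import google.generativeai as genai  # assumes the google-generativeai package is installed

genai.configure(api_key=os.environ["GEMINI_API_KEY"])  # assumed environment variable name
model = genai.GenerativeModel("gemini-1.5-flash")       # assumed model name; swap in your own

def draft_description(testers_note: str) -> str:
    """Wrap a raw tester's note in a Simple prompt and return the drafted Description."""
    prompt = (
        f"Given this tester's note: '{testers_note}' "
        "Draft a concise Description for an Issue Log."
    )
    return model.generate_content(prompt).text

# Example:
# print(draft_description("Login button not working after 3 tries."))
```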
2. Advanced Prompts (Detailed and Structured Issue Reporting)
These prompts provide more context and ask AI to generate more complete and well-structured content for multiple fields.
What they achieve: Develop comprehensive issue details, ensuring all relevant information for reproducibility and understanding is captured.
Example Prompts:
"I found a bug where the shopping cart total doesn't update when an item quantity is changed from 2 to 0. The user has to refresh the page to see the correct total. This affects the main checkout flow. Draft a detailed Issue Log entry covering: Description, suggested Priority, Activity (steps to reproduce), and Outcome (Expected: Cart total updates dynamically. Actual: Cart total remains unchanged until page refresh). Also, suggest what this Issue might be Related to (e.g., front-end calculation module, API endpoint for cart updates)."
Rationale: Asks for multiple key fields to be drafted based on a scenario.
"Review this draft bug report: 'Description: Error on save. Steps: Click save. Outcome: Error message.' Enhance this report for an Issue Log by elaborating on the Description (what page? what data was being saved?), making the Activity more specific, and clarifying the Outcome (what was the exact error message? what was expected?). Assume this is for a 'User Settings' page."
Rationale: Uses AI to improve an existing, poorly written bug report.
"A user reported that they cannot upload a profile picture larger than 2MB, but there's no error message; the upload just silently fails. The requirement states a 5MB limit with a clear error message for oversized files. Generate the Description, Activity, and Outcome sections for the Issue Log. Suggest a Priority considering user experience and requirement deviation."
Rationale: Focuses on capturing deviations from requirements clearly.
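To keep Advanced prompts consistent from tester to tester, you could assemble them from a small template. The helper below is hypothetical; it simply packages an observation, its context, and the expected and actual behaviour into the same kind of multi-field request shown above.

```python
def build_advanced_prompt(observation: str, context: str, expected: str, actual: str) -> str:
    """Assemble an Advanced prompt that asks for several Issue Log fields at once."""
    return (
        f"I found a bug: {observation}\n"
        f"Context: {context}\n"
        f"Expected: {expected}\n"
        f"Actual: {actual}\n"
        "Draft a detailed Issue Log entry covering: Description, suggested Priority, "
        "Activity (steps to reproduce), and Outcome (expected vs. actual). "
        "Also suggest what this issue might be Related to."
    )

# Example (values taken from the shopping cart scenario above):
# prompt = build_advanced_prompt(
#     observation="Shopping cart total doesn't update when an item quantity is changed from 2 to 0",
#     context="Main checkout flow; user must refresh the page to see the correct total",
#     expected="Cart total updates dynamically",
#     actual="Cart total remains unchanged until page refresh",
# )
```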
3. Complex Prompts (Strategic Issue Analysis and Summarization)
These prompts leverage AI to analyze patterns in the Issue Log or generate summaries for stakeholders.
What they achieve: Identify trends, potential root causes, or generate high-level summaries from multiple issue entries.
Example Prompts:
"Analyze the following 10 issue summaries from our Issue Log [Paste summaries, each including a 'Component' field like UI, API, Database, and 'Priority']. Identify any potential defect clusters (e.g., multiple high-priority UI issues in the payment module) and provide a brief summary statement about the overall health of the current build based on these issues."
Rationale: Uses AI for preliminary trend analysis across multiple issues.
"Our Issue Log shows 5 critical issues related to API timeouts under load in the 'Order Processing' module. Draft a concise summary for a stakeholder update, highlighting the business impact (e.g., potential for failed orders during peak times) and the current Status (e.g., 'Under Investigation by Dev Team')."
Rationale: Helps in communicating the essence of critical issues to non-technical audiences.
"Given a set of resolved issues from the 'User Authentication' module in our Issue Log, suggest 2-3 potential underlying root causes that might have contributed to these defects (e.g., inadequate unit testing, complex legacy code, unclear requirements for password policies)."
Rationale: Pushes AI towards suggesting areas for deeper root cause analysis by the team.
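Because Complex prompts reason over many entries at once, it helps to pull the summaries straight from an export of your Issue Log rather than retyping them. The sketch below assumes the log has been exported to CSV with 'Ref', 'Component', 'Priority', and 'Description' columns; rename these to match your own template.

```python
import csv

def build_trend_analysis_prompt(csv_path: str, limit: int = 10) -> str:
    """Read exported Issue Log rows and build a Complex prompt for defect-cluster analysis."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))[:limit]
    summaries = "\n".join(
        f"- {row['Ref']} | Component: {row['Component']} | "
        f"Priority: {row['Priority']} | {row['Description']}"
        for row in rows
    )
    return (
        "Analyze the following issue summaries from our Issue Log:\n"
        f"{summaries}\n"
        "Identify any potential defect clusters (e.g., multiple high-priority issues "
        "in one component) and provide a brief summary statement about the overall "
        "health of the current build based on these issues."
    )
```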
Integrating AI into Your Issue Logging Workflow:
- Draft, Don’t Just Copy: Use AI to generate initial drafts for descriptions, steps, and outcomes.
- Verify and Refine: Always review AI-generated content. Ensure steps are accurate, descriptions are clear, and suggested priorities align with your project’s context. Your expertise is crucial.
- Context is King: The more relevant details you provide in your prompt (e.g., feature under test, user story, environment), the better the AI’s output.
- Iterate: If the first response isn’t perfect, refine your prompt and try again.
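The last two points can be partly mechanised: when a first draft comes back vague, append the project context and try again. The helper below is a hypothetical convenience for that second attempt; reviewing the refined output is still your job.

```python
def refine_with_context(base_prompt: str, extra_context: dict[str, str]) -> str:
    """Append project context (feature, user story, environment) to a prompt before retrying."""
    context_lines = "\n".join(f"{key}: {value}" for key, value in extra_context.items())
    return f"{base_prompt}\n\nAdditional context:\n{context_lines}"

# Example second attempt after a vague first draft:
# better_prompt = refine_with_context(
#     "Given this tester's note: 'Login button not working after 3 tries.' "
#     "Draft a concise Description for an Issue Log.",
#     {"Feature": "Account login", "Environment": "Staging, Chrome (example values)"},
# )
```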
Next Steps
AI is not going to take your job as a software tester. However, knowing when, where and how to use AI will help advance your career in software testing.
By strategically using AI prompts, you can significantly refine the quality and consistency of your Issue Logs. This not only aids developers in faster defect resolution but also provides clearer data for trend analysis and process improvement. Moving from manual, sometimes rushed, entries to AI-assisted, well-structured documentation transforms the Issue Log into a more potent tool for quality assurance.
Improve your documentation: For more insights and templates to streamline your testing processes, subscribe to the Klariti Newsletter.
Next up: Every formal document in our testing lifecycle needs proper management – from creation through reviews, approvals, and revisions. We’ll explore the Document Control Form next, examining how AI can assist in maintaining order and traceability for all our critical testing artifacts.
Templates (Free and Paid)
Enhance your testing processes with these Klariti resources:
- Acceptance Test Plan
- Installation Plan Template
- Software Testing Template Pack (MS Word+Excel)
- Test Plan Template
- Quality Assurance Plan