Debugging in Natural Language: From Problem to Root Cause in Minutes

6 min read
AI Debugging, Conversational AI, Code Debugging

How AI agents can transform complex debugging into a conversational process, even in poorly architected codebases.

The Problem

It started with a simple observation: a component that should display shipping information wasn't showing up on the product page. In a typical development workflow, this would require hours of investigation across multiple files, understanding complex state management, and tracing through API calls.

But what if we could debug this in natural language?

The Conversation: Human-in-the-Loop Debugging

Instead of diving into the codebase immediately, I decided to treat the AI agent as a debugging partner. This is how I've been working for the past six months: tackling nearly 80% of my work through AI pair programming rather than going straight into the code. For more complex cases, I even use the interactive-mcp server to create persistent chat sessions. In fact, I used this very approach to write this article.

Here's how the conversation unfolded:

Initial Prompt

Me: "We're having an issue where the shipping-days-wrapper is not being shown. I need you to help me debug it and find out where the problem is. Evaluate the implementation and add some console logs with the DEBUG:EDD prefix to make them easier to filter later on."

AI Response: The AI immediately understood the scope and added comprehensive debug logging throughout the component hierarchy, including:

  • Component props logging
  • CSS module inspection
  • DOM element verification
  • Helper function debugging
  • Parent component investigation
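The instrumentation the agent added looked roughly like this. This is a minimal sketch: `eddLog`, the `ShippingDays` name, and the props shape are illustrative, not taken from the real codebase — the point is the greppable `DEBUG:EDD` prefix.

```javascript
// Tagged debug logging: every message starts with "DEBUG:EDD" so it can be
// filtered in the browser console with a single search term.
const eddLog = (label, payload) =>
  console.log(`DEBUG:EDD ${label}`, JSON.stringify(payload));

// Hypothetical component under investigation, reduced to plain functions.
function ShippingDays(props) {
  eddLog("ShippingDays props", props); // what actually reaches the component
  if (!props.shippingDays) {
    eddLog("ShippingDays render", { rendered: false, reason: "missing shippingDays" });
    return null; // nothing to show without data
  }
  return `Arrives in ${props.shippingDays} business days`;
}
```

Logging both the props and the render decision is what later made it obvious the component itself was fine — it was simply never receiving data.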

First Iteration: Component Investigation

After the AI added the logs, I manually went to the browser, refreshed the page, opened the console, filtered the logs, and pasted them back to the agent.

Me: "Here's what I got: [pasted debug logs showing component returning null]"

AI Analysis: The AI identified that the issue wasn't with the component itself, but with the data flow. The logs showed:

  • Component was rendering but receiving undefined data
  • Parent component was returning null due to missing shipping data
  • The problem was upstream in the data pipeline

AI Discovery: "The shipping-days-wrapper is not being shown because the ShippingDays component is never being rendered at all. The parent component (ProductDeliveryDate) is returning null before it even gets to the ShippingDays component."
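In code terms, the short-circuit the logs revealed looks something like this — a hypothetical reconstruction, since the real ProductDeliveryDate is more involved:

```javascript
// Sketch of the parent-level guard: when shipping data is absent, the parent
// returns null before its child is ever reached, so the child's wrapper
// element never appears in the DOM at all.
function ProductDeliveryDate({ shipping }) {
  if (!shipping || shipping.days == null) {
    console.log("DEBUG:EDD ProductDeliveryDate", "no shipping data, returning null");
    return null; // ShippingDays (and its shipping-days-wrapper) never mounts
  }
  return { child: "ShippingDays", days: shipping.days };
}
```

This is why DOM inspection alone is misleading here: the missing element is a symptom of an upstream data gap, not a rendering bug.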

Note: I did this manually this time, but I often use AgentDesk Browser tools, while in my personal projects, where I have more control, I have been using Playwright MCP.

Second Iteration: API Investigation

Once the AI identified the data issue, I prompted for API-level investigation.

Me: "Let's add another console log just after we get those data from our backend apis"

AI Response: The AI added debug logging to track the API calls:

  • Shipping data fetching hooks
  • API call parameters and responses
  • Redux state updates
  • Error handling for failed requests
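A stripped-down sketch of that API-level logging, assuming a fetch-style call — `fetchShippingWithLogs`, the params shape, and the status strings are placeholders, not the project's real hooks:

```javascript
// Wrap the shipping API call with DEBUG:EDD logs on both the request and the
// response, and surface rejections explicitly instead of swallowing them.
async function fetchShippingWithLogs(fetchImpl, params) {
  console.log("DEBUG:EDD shipping request", JSON.stringify(params));
  const res = await fetchImpl(params);
  console.log("DEBUG:EDD shipping response status", res.status);
  if (!res.ok) {
    console.error("DEBUG:EDD shipping request rejected", res.status);
    return { status: "rejected", error: res.status };
  }
  const data = await res.json();
  console.log("DEBUG:EDD shipping data", JSON.stringify(data));
  return { status: "fulfilled", data };
}
```

The fulfilled/rejected shape mirrors what a Redux async thunk would dispatch, which is what made the rejected status visible in the next batch of logs.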

I then shared the new logs, following the same approach.

Me: "Here we go: [pasted logs from console]"

AI Analysis: The AI identified that the API call was being made correctly but failing with a rejected status.

Third Iteration: Enhanced Error Logging

The AI enhanced the error logging to capture more detailed failure information.

AI Response: The AI added comprehensive error logging including:

  • Detailed error status codes and response data
  • Redux error handling for rejected API calls
  • 500 error specific guidance and possible causes
  • Enhanced debugging for API failures
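The enriched error handling boiled down to something like this — a sketch where the status thresholds and hint text are illustrative, not the agent's exact output:

```javascript
// Build a structured failure description instead of a bare "request failed":
// status, body, and a triage hint that distinguishes server-side (5xx) from
// client-side (4xx) problems.
function describeApiFailure(status, body) {
  const report = { status, body };
  if (status >= 500) {
    report.hint = "likely a backend/server issue; check server logs for this endpoint";
  } else if (status >= 400) {
    report.hint = "likely a client-side issue; check request params and auth";
  }
  console.error("DEBUG:EDD api failure", JSON.stringify(report));
  return report;
}
```

Capturing the status and body at the failure site is what turned "the component doesn't render" into "the backend returns 500 for this endpoint" in a single iteration.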

I shared the updated logs again.

Me: "Here are the updated logs: [pasted logs showing API 500 errors]"

AI Analysis: The AI immediately identified the root cause:

  • API calls were being made correctly
  • Backend was returning 500 Internal Server Error
  • No shipping data was reaching the Redux store
  • Component was working fine - it just had no data

AI Discovery: "The API call is failing with a 500 Internal Server Error, which means there's a backend/server issue with the shipping information endpoint."

Fourth Iteration: Error Reporting

With the root cause identified, I asked the AI to generate a comprehensive report for the backend team.

Me: "I'll report it to the backend team. Just write a simple report showing which endpoint is failing and which params we are sending, to give them a good starting point for their investigation."

AI Response: The AI generated a comprehensive bug report with:

  • Exact API endpoint details
  • Request parameters and headers
  • Error response information
  • Curl command for reproduction
  • Clear description of the impact
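The report followed roughly this shape. Every endpoint path, parameter name, and host below is a placeholder — the real values came from the captured logs:

```
Subject:    Shipping estimate endpoint returning 500

Endpoint:   GET /api/shipping/estimate            (placeholder path)
Params:     sku=<product-sku>&zip=<customer-zip>  (placeholder params)
Observed:   HTTP 500 Internal Server Error, no usable response body
Impact:     shipping-days-wrapper never renders on the product page
Reproduce:  curl -i "https://<host>/api/shipping/estimate?sku=<sku>&zip=<zip>"
```

A reproducible curl command plus the exact params is usually the difference between a ticket that sits in triage and one the backend team can act on immediately.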

The Root Cause

Through this conversational debugging process, we discovered that:

  1. The component was working correctly - it just wasn't receiving data
  2. The Redux store was empty - because the API call was failing
  3. The API was returning 500 errors - indicating a backend issue
  4. The problem was server-side - not in the frontend code

The Resolution

Instead of spending hours investigating frontend code, the AI helped us:

  1. Identify the exact API endpoint that was failing
  2. Capture the specific error details (500 Internal Server Error)
  3. Generate a comprehensive bug report for the backend team

The Efficiency Gains

This debugging session took minutes instead of hours because:

  • No hand-written logging - the AI found the key call sites and added the logs for us
  • Natural language investigation - I described the problem in plain English
  • Systematic approach - the AI followed a logical debugging flow
  • Immediate feedback - each iteration produced new insights
  • Conversational debugging - back-and-forth dialogue led to rapid discovery
  • Manual verification - I checked the browser console and shared the results

The Bigger Picture

This demonstrates how AI agents can transform debugging in complex, poorly architected codebases. Our codebase has:

  • No modularization or clear patterns
  • Complex and badly designed Redux global context
  • Limited observability and server-side logging
  • Spaghetti code architecture

Yet, through natural language debugging, we identified in minutes an issue that would have taken hours of manual investigation.

Key Takeaways

  1. Natural language debugging is powerful - describe the problem, let AI guide the investigation
  2. AI agents excel at systematic debugging - they follow logical flows without getting lost in code complexity
  3. Conversational iteration works - back-and-forth dialogue accelerates problem solving
  4. Efficiency gains are significant - especially in poorly architected systems
  5. Human-in-the-loop works - combine human intuition with the AI's systematic approach

The Future

While this was mostly a manual process, the same approach can be automated, which is what I'm pursuing in my spare time. Imagine:

  • AI agents monitoring your application
  • Automatic debugging when issues arise
  • Natural language reports of problems and solutions
  • Continuous improvement through AI-assisted development

The future of debugging isn't just about better tools—it's about making debugging conversational and accessible to everyone.

This case study demonstrates how AI agents can transform complex debugging into a natural, efficient process. The key isn't replacing developers but augmenting their capabilities with intelligent, conversational tools that understand both code and context.


Frontend developer specializing in creating clean, user-friendly interfaces with the latest web technologies.


© 2025 Pedro Camara Junior. All rights reserved.