Remember how AI kept getting stuck in endless loops in Part 3? The secret ingredient we were missing wasn’t more prompting or better AI models - it was context. In this episode, we’ll transform your AI coding assistant from a confused helper into a documentation-powered expert.

The Context Problem

Towards the end of our last coding session with Kiro IDE, I hit a wall. The AI was stuck in an endless loop, trying to fix problems without really understanding the That Open Company framework. It was inventing solutions when simple built-in functions already existed.

The culprit? Lack of context.

Here’s the thing about Large Language Models (LLMs): they’re trained on data that’s often a year old or more. When a framework like That Open Engine launched just weeks ago, it’s not in the AI’s built-in memory yet. We need to give it access to current documentation.

Solution 1: The Basic Approach (Manual URL Fetching)

The simplest approach is to just feed the AI a URL:

Go fetch this URL: https://docs.thatopen.com/fragments

The AI will scrape the page and analyze it - essentially doing what you could do manually by viewing the page source or running a curl command. It works, but it’s:

  • Slow
  • Limited to single pages
  • Not much better than manual reading
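Under the hood, that URL fetch is little more than downloading the HTML and stripping the markup. Here’s a minimal offline sketch of that step in Python, using an embedded sample string as a stand-in for a fetched docs page:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of an HTML page, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

# Stand-in for the body of a fetched documentation page
sample = ("<html><body><h1>Fragments</h1>"
          "<p>Fragments are lightweight meshes.</p>"
          "<script>ignore()</script></body></html>")

parser = TextExtractor()
parser.feed(sample)
text = " ".join(parser.chunks)
print(text)  # "Fragments Fragments are lightweight meshes."
```

That’s roughly all the AI gets from a single page: flat text, one page at a time, with no view of the surrounding code or repository structure.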

We can do better.

Solution 2: GitMCP (GitHub Repository Access)

Enter GitMCP - an open-source tool that gives AI direct access to GitHub repositories. Here’s how it works:

  1. Visit the GitMCP website
  2. Paste your GitHub repository URL
  3. Get back an MCP (Model Context Protocol) URL
  4. Configure it in your AI coding tool
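The URL translation in steps 2–3 is mechanical: GitMCP serves each repository at the same owner/repo path under its own domain (a pattern assumed from GitMCP’s published examples). A one-liner captures it:

```python
def to_gitmcp_url(github_url: str) -> str:
    """Turn a GitHub repo URL into its GitMCP endpoint.

    Assumes GitMCP's documented pattern:
    github.com/{owner}/{repo} -> gitmcp.io/{owner}/{repo}
    """
    return github_url.rstrip("/").replace("github.com", "gitmcp.io", 1)

print(to_gitmcp_url("https://github.com/that-open-company/engine_fragments"))
# https://gitmcp.io/that-open-company/engine_fragments
```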

I’ve set this up ahead of time for all That Open Company repositories. To add it to Kiro IDE:

  1. Navigate to Kiro settings → MCP servers
  2. Click edit → switch to User Config (so it’s available across all projects)
  3. Add the MCP server definitions

The configuration looks something like this, with `mcp-remote` bridging the GitMCP URL for tools that launch MCP servers as local commands:

{
  "mcpServers": {
    "thatopen-fragments": {
      "command": "npx",
      "args": ["mcp-remote", "https://gitmcp.io/that-open-company/engine_fragments"]
    }
    // ... additional repositories
  }
}

Once connected, you can ask:

Use the MCP to get documentation about That Open Engine Fragments

The AI now has access to the entire repository - not just HTML pages, but the actual code, documentation, and structure. It can interpret and understand the framework, connecting the dots automatically.

Much better! But we’re not done yet.

Solution 3: Context7 (The Ultimate Documentation Database)

Context7 takes context provision to the next level. It’s a free service with no sign-in required that indexes thousands of codebases and makes them searchable.

Search for “That Open” and you’ll find:

  • That Open Engine
  • That Open Components
  • That Open Fragments
  • And more…

The killer feature? Convert any repository to an MCP server with one click.

Setting Up Context7 MCP

  1. Search for your repository on Context7
  2. Click “MCP server”
  3. Copy the configuration for Kiro IDE
  4. Paste it into your User Config

Example configuration:

{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
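Because MCP configuration is plain JSON, registering servers can also be scripted. Here’s a hedged sketch that merges a new server definition into a user-level config without clobbering existing entries (the `mcp.json` path is illustrative, not Kiro’s documented location; check where your install keeps its user config):

```python
import json
from pathlib import Path

def add_mcp_server(config_path: Path, name: str, definition: dict) -> dict:
    """Merge one server definition into a config's 'mcpServers' map."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})[name] = definition
    config_path.write_text(json.dumps(config, indent=2))
    return config

# Illustrative path and server entry, not Kiro's documented location
path = Path("mcp.json")
updated = add_mcp_server(path, "context7", {
    "command": "npx",
    "args": ["-y", "@upstash/context7-mcp"],
})
print(sorted(updated["mcpServers"]))
```

Re-running the script with a different name adds a second entry alongside the first, which is exactly the behavior you want when accumulating servers in a shared User Config.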

The difference is remarkable:

  • Speed: Near-instant responses (advanced indexing)
  • Coverage: Access to thousands of repositories
  • Freshness: Up-to-date documentation
  • Depth: Comprehensive understanding of the entire ecosystem

Practical Application: AI-Powered Code Review

Now comes the magic. With all this context, I started a new session and sent this prompt:

Use MCP to get info about That Open Engine Fragments and That Open
Company Components. Then compare the state of our codebase with their
recommended approach and their available APIs. Basically do an extensive
code review grounded in the documentation from That Open.

The results were shocking.

The AI came back with eight major topics for code review - not hallucinated suggestions, but actionable recommendations grounded in actual documentation. Even with Claude Sonnet 4 (a capable but not top-tier model), the quality of the response was a testament to how much context matters.

The Human Parallel

Think of it this way: if you ask another developer to “write me an app” without any context, they’re lost. What kind of app? What framework? What features? The same applies to AI.

Context is the difference between confusion and clarity.

With comprehensive documentation access:

  • AI understands framework patterns
  • It recommends actual APIs instead of inventing them
  • It spots deviations from best practices
  • It grounds suggestions in real documentation

Key Takeaways

  1. Basic URL fetching: Better than nothing, but limited
  2. GitMCP: Gives AI access to entire GitHub repositories
  3. Context7: Lightning-fast access to thousands of indexed codebases
  4. MCP servers persist: Set them up once in User Config, use everywhere
  5. Context quality matters more than model size: Great context with a decent model beats poor context with the best model

What’s Next?

In the next episode, we’ll review the AI’s recommendations from that comprehensive code review. We’ll evaluate whether its suggestions make sense and actually implement the changes to align our app with That Open Company’s recommended patterns.

The question is: will the AI’s context-powered review lead to better code? Stay tuned!

Let’s Build Together

Have you tried using MCP servers or Context7 with your AI coding tools? What frameworks would you like to see better documented for AI access? Drop your thoughts in the YouTube comments - I’d love to hear how you’re giving context to your AI assistants!

The GitHub repository for this project is available at: https://github.com/vinnividivicci/openingbim-cicd