How AI Helped Me Tame Our Documentation Chaos

Jeff Sheltren - Partner/CIO
October 15, 2025

Takeaway: At Tag1, we believe in proving AI within our own work before recommending it to clients. This post is part of our AI Applied content series, where team members share real stories of how they're using artificial intelligence and the insights and lessons they learn along the way. Here, Jeff Sheltren, Partner/CIO, shares how AI helped him tame Tag1's documentation chaos.

As Tag1 started training our employees on AI, we developed an internal workshop covering key LLM concepts, approved tools, ethical guidelines, and development best practices. That workshop content quickly found its way into our Notion workspace, where it grew organically alongside other AI-related documentation. Even though the content was high quality, the fact that it grew organically, without any real structure, made it difficult for employees to find what they needed. As someone responsible for documenting some of our internal tools, I didn't know where to create those pages -- and I quickly became discouraged by the likelihood that users would struggle to find the content once created.

At that point, I knew that I wanted to reorganize the entirety of our internal AI documentation, but looking at the scope of work involved made me want to look the other way. How could I possibly find and review all the content we had created, document it all, come up with a plan for a more user-friendly navigation experience, and move content into that new organization? It felt like an impossible task, but I decided it might be something that I could accomplish much more easily by leveraging AI.

AI as a Collaborator and Content Detective

Instead of diving in blindly, I wanted to develop a plan. What would the final structure be? How could I fix our current issues without creating a new mess that would still leave employees struggling to find approachable, usable content for onboarding and daily work? I knew I couldn't even start on a plan without first doing an inventory of our existing documentation, so I started out by setting up Claude Code with the Notion MCP. Once that integration was working, I had Claude create a comprehensive inventory and analysis of our existing AI documentation. What would have taken me multiple days of work to systematically click through our documentation and create a manual inventory was done in minutes. Claude was able to pull in all of the relevant content, review for duplicate information, and get a big picture of everything that existed in our AI documentation.
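If you're curious what an inventory pass like this looks like under the hood, the same first step can be scripted directly against the Notion REST API. Below is a minimal, illustrative sketch -- not our actual tooling or the MCP integration itself. It assumes you have a Notion internal-integration token, and the function names (`inventory_pages`, `extract_title`) are my own:

```python
import json
import urllib.request

NOTION_SEARCH_URL = "https://api.notion.com/v1/search"
NOTION_VERSION = "2022-06-28"  # a stable Notion API version

def extract_title(page):
    """Pull the plain-text title out of a Notion page object.

    The name of the title property varies by parent database, so we scan
    for the property whose type is "title" rather than assuming a name.
    """
    for prop in page.get("properties", {}).values():
        if prop.get("type") == "title":
            text = "".join(t.get("plain_text", "") for t in prop["title"])
            return text or "(untitled)"
    return "(untitled)"

def inventory_pages(token):
    """Yield (title, url, last_edited_time) for every page the integration
    token can see, following pagination cursors until exhausted."""
    cursor = None
    while True:
        body = {"filter": {"property": "object", "value": "page"}}
        if cursor:
            body["start_cursor"] = cursor
        req = urllib.request.Request(
            NOTION_SEARCH_URL,
            data=json.dumps(body).encode(),
            headers={
                "Authorization": f"Bearer {token}",
                "Notion-Version": NOTION_VERSION,
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        for page in data["results"]:
            yield extract_title(page), page["url"], page["last_edited_time"]
        if not data.get("has_more"):
            return
        cursor = data["next_cursor"]
```

The value of the MCP approach is that Claude drives calls like these itself, then reasons about the results -- spotting duplicates and summarizing structure -- rather than just dumping a page list.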

With an inventory in hand, I worked with Claude to review the documentation we had. We pretty quickly made some discoveries based on that inventory and my description of navigational issues I was facing with the documentation:

  • Most of the content was of very high quality, but was difficult to find
  • We had multiple Notion databases across the teamspace, but no clear organization of content
  • The AI Workshop content was mixed in with general documentation and setup guides
  • There was no clear entry point to the documentation, and no logical flow for users looking to read content in a sequence

This analysis proved invaluable - within an hour, I had a comprehensive understanding of our documentation landscape and could identify clear paths forward.

Talking Through the Plan

The insights from the content discovery allowed me to start drafting a plan. I knew I wanted to reorganize the content, but at this point I had more questions than answers. I didn't want to design a documentation structure that worked for me but wasn't approachable to others within the company. I wasn't even sure whether it made sense to keep leveraging Notion databases (as we were currently doing), or whether a flat page structure with better organization might be a cleaner solution. I began iterating on the plan with Claude, exploring key questions: "What are the tradeoffs of using a Notion database for some or all of this content?", "As a [developer/infrastructure engineer/project manager/sales lead], what information do I need to access for onboarding and day-to-day work?", "These content blocks are present across most documentation pages -- do they contain different information and links, or are they generally the same across all content?". These conversations helped me think through the process and surfaced many things I wouldn't otherwise have considered. AI guided me through user journeys for various employee types, and through multiple rounds of refinement, we created a full plan for organizing the content to make it more accessible.

CAUTION

I needed to understand and guide the plan deeply. Early on, AI wanted to create "missing" content and fill gaps. I had to course correct to focus on reorganizing existing content, not expanding it. Without active oversight, this would have failed.

I ended up iterating on the plan multiple times: reviewing it myself, having AI review its own work, and ultimately incorporating two AI agents -- a content editor and a Notion application expert -- each of which reviewed the plan from a different perspective and gave feedback that was critical in refining and improving it. Without this review, I feel certain the project would have derailed quickly during implementation.

Once I had a solid plan drafted, I wrote it out to a file and tracked it in git so that the full plan and its history would always be quickly accessible. This proved essential not only as an implementation reference, but also as a way to split the project into smaller tasks. That split helped with my time constraints (I worked on this in between my typical day-to-day tasks and meetings) and let me start independent sessions in Claude Code, which helped avoid hitting the context window limit -- especially important since I was dealing with large amounts of content in many of the sessions. This approach allowed me to track progress in the plan itself, marking off completed tasks along the way and updating the plan if I saw issues with the approach as it was being implemented.

The Migration: Where AI Really Shines

Plan in place, I was ready to dive into the content reorganization. I put myself in an oversight, team-lead sort of role, with Claude executing all the tedious work of copying content into the sandbox we created to stage the reorganization without touching the live documentation. Claude performed the heavy lifting under my guidance:

  • Systematically copied content from the old structure to new
  • Cleaned up repetitive template blocks while preserving actual content
  • Merged related content and eliminated duplicates
  • Ran multiple rounds of verification to ensure nothing was lost or changed in the process

The Notion MCP integration was crucial - it allowed Claude to directly read page content, understand database structures, copy text while preserving formatting, and update internal links automatically. Without this direct API access, I would have been stuck with manual copy-pasting and error-prone link updates across 50+ pages. Unlike typical AI integrations that require you to paste content back and forth, the MCP connection meant Claude could work directly within our Notion workspace - reading from one area, writing to another, and maintaining all the formatting and metadata that makes documentation usable.
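To give a sense of what "updating internal links automatically" involves, here's a hypothetical sketch of the mapping step: given an old-to-new page-ID map, rewrite any link that points at a migrated page. This illustrates the mechanics only -- in practice Claude handled this through the MCP connection, and the helper names below are my own:

```python
def rewrite_url(url, id_map):
    """If url ends with the 32-character hex ID of a migrated page,
    swap in the new page ID; otherwise return the url unchanged."""
    trimmed = url.rstrip("/")
    for old_id, new_id in id_map.items():
        if trimmed.endswith(old_id):
            return trimmed[: -len(old_id)] + new_id
    return url

def rewrite_rich_text(rich_text, id_map):
    """Apply rewrite_url to every link in a Notion rich_text array
    (each text run may carry a link object with a url)."""
    for run in rich_text:
        link = run.get("text", {}).get("link")
        if link and link.get("url"):
            link["url"] = rewrite_url(link["url"], id_map)
    return rich_text
```

Multiply that by every link on 50+ pages, plus formatting and metadata to preserve on each write, and it's clear why direct workspace access beats copy-pasting content in and out of a chat window.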

Building Confidence Through Review

With the new content in place in a sandbox for staging, I ran the reorganized content through multiple layers of review. First, I did my own review of the structure and a handful of key pages. After addressing some small issues that arose, I had an AI editor agent review the sandbox content for consistency and clarity, also ensuring that content was copied over without losing anything. Finally, I had a Notion-expert agent review the sandbox for any technical issues to ensure that links were working and nothing was pointing to the old documentation pages. These review cycles gave me confidence in the sandbox content before sharing it with the editorial team and Tag1 management.

Quick Results: Sandbox to Production

Having AI handle the heavy lifting of the content migration meant I could create a complete, reorganized sandbox within days. This was huge for getting buy-in: instead of describing my vision to key stakeholders, I could show them a working, polished version. With a few small changes to the content based on their feedback, I was able to archive our old documentation and copy the new and improved version into place.

The response was immediate and positive. Employees started actually using the documentation! People can now easily access content that was previously hidden within multiple Notion databases. As a content editor, I have a clear structure to build new content into, and I no longer worry that new documentation will be lost to the mess of our previous structure. I'm confident the new organization will serve us well as we build out even more internal documentation.

What Made This Work

While this implementation was successful, that doesn't mean that I -- or anyone -- can simply throw AI at any problem and have it deliver perfect results. Several key factors made this reorganization successful where similar projects often fail:

  • AI as a collaborator, but I stayed in control: I collaborated with AI, giving feedback on plans, correcting course when it made wrong assumptions, and reviewing all changes iteratively. Without my deep understanding of the plan and my review of AI's work along the way, I'm certain the project would have gone off track. Early on, AI even tried to rewrite the documentation rather than organize it -- a good reminder of why human guidance is critical. While AI might have landed on a reasonable solution on its own, it very likely would have deviated from what I envisioned, and we may have ended up with untrustworthy documentation. It's very important to have a solid plan and to ensure that AI doesn't deviate from it, as it sometimes tends to do.
  • Written plan that evolved: Having a plan provides a foundation for all the work done on the project. Committing it to git gave me a place to store and revise it as we tried out different solutions and settled on the final approach. I treated it as a living document, updating it with progress as milestones were achieved and making notes to document any change in direction from the original plan. While you don't have to store the plan in revision control, it's invaluable should you ever need to revert to a previous version.
  • Multiple review rounds: From initial planning, through implementation in the sandbox, to the production rollout of the new documentation structure, I reviewed all changes and work along the way. Not only did I personally review everything, but I also leveraged AI agents for both technical and editorial review. These proved invaluable for catching issues early in the process that I might otherwise have missed.
  • Sandbox approach: By doing work in a sandbox, I had the opportunity to test the reorganization and documentation updates in a staging environment without directly editing live content. This not only gave me the ability to test different approaches to the organization and migration, but it also gave me the peace of mind knowing that I wouldn't unintentionally be destroying "production" content that someone worked hard to create. Testing is very important in software development, and the same can be said for any work done with AI: always test and validate any outputs it gives you.

What This Changes About AI in Business

This project shifted my perspective on where AI adds the most value. It's not the flashy stuff - generating code or creating content from scratch. It's tackling the messy, time-consuming work that sits in everyone's "someday" pile.

Every team has those projects: the content audit that's been postponed for months, the process documentation that needs overhauling, the knowledge base cleanup that everyone acknowledges is important but nobody wants to tackle. These aren't glamorous tasks, but they're exactly where AI can provide immediate, measurable value. AI doesn't get overwhelmed by scope or bored by systematic work. It just methodically works through whatever you point it at.

The key insight: approach AI as a capable intern who needs clear direction but can handle the heavy lifting. I stayed in control of strategy and decisions while AI managed the execution details. Now our documentation actually gets used, new hires don't get lost, and I'm building new content with confidence.

If you're facing a similar documentation challenge, start small: pick one messy content area, set up an AI tool with direct access to your platform (whether that's Notion, Confluence, or another system), and have it create an inventory first. You'll be surprised how quickly that systematic view reveals solutions that seemed impossible when you were drowning in the details.

Sometimes the most practical AI wins aren't the most impressive ones - they're just getting important work done that was previously stuck in the "too overwhelming" category.

This post is part of Tag1’s AI Applied series, where we share how we're using AI inside our own work before bringing it to clients. Our goal is to be transparent about what works, what doesn’t, and what we are still figuring out, so that together, we can build a more practical, responsible path for AI adoption.

Want to bring practical, proven AI adoption strategies to your organization? Let's start a conversation! We'd love to hear from you.


Image by Yan Krukau from Pexels
