image.png

<aside> 📌

In progress writing this up…

Some more visualizations and (hopefully) knowledge graphs to come

</aside>

← This is not the final product, just a placeholder image until I have the time/inspiration to work on this more

I like to journal, and I have journaled almost every day for the past 5+ years. See Journal Analytics for another journal analysis project with more exposition on this.

This AI Journal Analysis project builds off of Notion Wrapped. When building Notion Wrapped I finally had a clean way to programmatically get large amounts of cleanly formatted data from Notion. I wanted to pass this to AI to see what insights I could get. The logical place to start was with my journal.

To start, I could have an AI do a handful of things

Seeing the data from hundreds of parallel LLM calls flow by, each returning info about a separate journal entry, makes me SO incredibly happy
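Roughly, that fan-out looks something like the sketch below. The client, model name, and prompt here are illustrative placeholders, not my exact setup:

```python
# Hypothetical sketch: fan out one LLM call per journal entry and gather the results.
import asyncio
from anthropic import AsyncAnthropic

client = AsyncAnthropic()  # reads ANTHROPIC_API_KEY from the environment

async def analyze_entry(entry: dict) -> dict:
    """Ask the model for a short structured summary of a single entry."""
    response = await client.messages.create(
        model="claude-3-5-haiku-latest",
        max_tokens=500,
        messages=[{
            "role": "user",
            "content": f"Summarize the mood and notable events in this journal entry:\n\n{entry['text']}",
        }],
    )
    return {"date": entry["date"], "analysis": response.content[0].text}

async def analyze_all(entries: list[dict]) -> list[dict]:
    # Cap concurrency so hundreds of entries don't all hit the API at once.
    semaphore = asyncio.Semaphore(20)

    async def bounded(entry: dict) -> dict:
        async with semaphore:
            return await analyze_entry(entry)

    return await asyncio.gather(*(bounded(e) for e in entries))

# results = asyncio.run(analyze_all(journal_entries))
```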

You might be asking, can’t Notion AI do this? Well, some of it.

Notion AI is not going to pass every single token from a collection of journal entries that runs over 1 million words, and honestly neither am I, but only due to context window limits. After removing stop words, though, I am sending all 800 thousand tokens.
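The stop-word trim itself is simple. A minimal sketch, assuming NLTK’s English stop-word list and tiktoken for counting (the actual word list and tokenizer I used may differ):

```python
# Sketch of trimming stop words so the full journal fits in one context window.
import nltk
import tiktoken
from nltk.corpus import stopwords

nltk.download("stopwords", quiet=True)
STOP_WORDS = set(stopwords.words("english"))

def strip_stop_words(text: str) -> str:
    """Drop common stop words, keeping only the content-bearing words."""
    return " ".join(w for w in text.split() if w.lower() not in STOP_WORDS)

def count_tokens(text: str) -> int:
    """Rough token count using a GPT-style tokenizer."""
    return len(tiktoken.get_encoding("cl100k_base").encode(text))

# full_journal = "\n\n".join(entry_texts)
# trimmed = strip_stop_words(full_journal)
# print(count_tokens(full_journal), "->", count_tokens(trimmed))
```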

An Agent

Next I turned to building tools that an AI agent could call so I could ask more detailed questions about my journal.

I turned the following functions into MCP tools for Claude Code to use
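As a sketch, exposing functions like these through the Python MCP SDK’s FastMCP helper looks roughly like this; the tool names, signatures, and the JSON cache are illustrative placeholders, not my actual functions:

```python
# Illustrative MCP server exposing journal lookups as tools for Claude Code.
import json
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("journal")

def load_entries() -> list[dict]:
    """Load pre-exported journal entries (assumed to be cached as JSON)."""
    return json.loads(Path("journal_entries.json").read_text())

@mcp.tool()
def search_entries(query: str, limit: int = 10) -> list[dict]:
    """Return journal entries whose text matches the query."""
    return [e for e in load_entries() if query.lower() in e["text"].lower()][:limit]

@mcp.tool()
def get_entry_by_date(date: str) -> dict:
    """Return the entry written on a given YYYY-MM-DD date, if any."""
    return next((e for e in load_entries() if e["date"] == date), {})

if __name__ == "__main__":
    mcp.run()  # Claude Code then connects to this server over stdio
```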

Soon after I made this, Notion came out with a refreshed hosted MCP server (https://www.notion.com/blog/notions-hosted-mcp-server-an-inside-look). Its “Notion-flavored Markdown” is similar to the format I was already using:

```xml
<journal-entry number="{entry_number}" date="{entry_date}" Rating="{entry_rating}" notable_event="{entry_notable_event}">
{entry_text}
</journal-entry>
```
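Stitching entries into that format is just string templating. A small sketch, assuming each entry is a dict with fields matching the template:

```python
# Wrap each entry in the tag format shown above; the dict shape is an assumption.
def format_entry(entry: dict) -> str:
    return (
        f'<journal-entry number="{entry["number"]}" date="{entry["date"]}" '
        f'Rating="{entry["rating"]}" notable_event="{entry["notable_event"]}">\n'
        f'{entry["text"]}\n'
        f"</journal-entry>"
    )

# prompt_body = "\n\n".join(format_entry(e) for e in journal_entries)
```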

Questions I asked about my whole journal