Why I Built a Local-First AI Chatbot

April 24, 2025

AI is changing how we work. The productivity boost is real. This shouldn't be a surprise—the latest models are truly impressive. For software engineers, tools like Cursor have transformed coding workflows. For many others, ChatGPT has become a go-to resource.

But something about these tools kept bothering me—the data privacy implications. Most AI services store your conversations, have those conversations reviewed by humans, and eventually use the data to train future models. My queries, code snippets, and sometimes even personal information were being collected.

Yes, you can often disable this in settings, but the privacy policies of the large AI providers carve out exceptions under which your data can still be used for training.

To resolve this concern, I built a simple terminal-based chatbot for my own general questions throughout the day. It interfaced with Amazon Bedrock, which provides access to a range of language models (including my favorite, Claude) and makes a strong data-protection commitment not to store or log conversations.
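For the curious, here is a minimal sketch of what that kind of terminal chat loop can look like using the AWS SDK's Converse API. The model ID, region, and loop structure are illustrative assumptions, not the actual code behind my chatbot.

```typescript
// Minimal terminal chat loop against Amazon Bedrock's Converse API.
// Model ID and region are placeholders; swap in whatever you use.
import { BedrockRuntimeClient, ConverseCommand, type Message } from "@aws-sdk/client-bedrock-runtime";
import { createInterface } from "node:readline/promises";

const client = new BedrockRuntimeClient({ region: "us-east-1" });
const modelId = "anthropic.claude-3-5-sonnet-20240620-v1:0"; // example Claude model ID
const history: Message[] = [];

const rl = createInterface({ input: process.stdin, output: process.stdout });

while (true) {
  const question = await rl.question("> ");
  if (!question.trim()) continue;

  // Keep the running conversation so the model has context for follow-ups.
  history.push({ role: "user", content: [{ text: question }] });

  const response = await client.send(new ConverseCommand({ modelId, messages: history }));
  const reply = response.output?.message?.content?.[0]?.text ?? "";

  history.push({ role: "assistant", content: [{ text: reply }] });
  console.log(reply);
}
```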

This worked well for me. It sped up my workflow and kept me focused. Soon, my wife expressed interest in using it too, but a terminal interface wasn't what she had in mind. I searched online for a product that would meet her needs, but unfortunately couldn't find an option that promised zero data retention coupled with a powerful model. While several great local and open source options existed, local models lag in performance, and self-hosting wasn't an appealing option for her.

This is when LastChat was born. The idea was simple—build a chatbot application with two key goals:

  1. Serve cutting-edge AI models

  2. Provide best-in-class privacy for conversation data

There are several privacy-oriented inference providers¹ that serve leading models well suited to coding, writing, math, and more. So goal #1 was fairly straightforward.

For goal #2, I wanted the highest degree of privacy possible—by not storing any conversation data on servers at all. So I set out to build a React-based web app using IndexedDB to store all conversation data locally on the user's device.

After some experimentation, I landed on React Router for client-side routing and data loading, paired with Dexie.js for local data storage. Beyond the privacy benefits, this approach unlocked near-instant navigation speeds throughout the app. Most navigations work offline too!
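To give a flavor of how this fits together, here is a rough sketch of a Dexie.js schema wired to a React Router loader. The database, table, and field names here are illustrative, not LastChat's actual schema.

```typescript
// A minimal sketch of local-only conversation storage with Dexie.js,
// read by a React Router loader. Table and field names are illustrative.
import Dexie, { type Table } from "dexie";
import type { LoaderFunctionArgs } from "react-router-dom";

interface Conversation {
  id?: number;        // auto-incremented primary key
  title: string;
  updatedAt: number;  // epoch millis, used for sorting the conversation list
}

interface ChatMessage {
  id?: number;
  conversationId: number;
  role: "user" | "assistant";
  text: string;
  createdAt: number;
}

class ChatDB extends Dexie {
  conversations!: Table<Conversation, number>;
  messages!: Table<ChatMessage, number>;

  constructor() {
    super("lastchat"); // all data lives in the browser's IndexedDB
    this.version(1).stores({
      conversations: "++id, updatedAt",
      messages: "++id, conversationId, createdAt",
    });
  }
}

export const db = new ChatDB();

// Route loader: everything is read from IndexedDB, so navigation never
// waits on the network and keeps working offline.
export async function conversationLoader({ params }: LoaderFunctionArgs) {
  const conversationId = Number(params.conversationId);
  const [conversation, messages] = await Promise.all([
    db.conversations.get(conversationId),
    db.messages.where("conversationId").equals(conversationId).sortBy("createdAt"),
  ]);
  if (!conversation) throw new Response("Not found", { status: 404 });
  return { conversation, messages };
}
```

Because the loader only touches IndexedDB, route transitions resolve in milliseconds and the server never sees a single message.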

The end result is an extremely fast, private AI chatbot. No collection of your conversations. No human review. No training on your data. And no tracking for targeted ads.

LastChat is a self-funded project with no pressure to grow or monetize. It will remain a straightforward, privacy-first experience—the kind my wife and I were looking for. I hope you find it useful too!

¹ Amazon Bedrock, Groq, and Fireworks AI, to name a few.

About Tim Larden

Hey! I'm Tim, the maker of LastChat and founder of Reforged. I'm a software builder, optimist, and coffee enthusiast. Subscribe below (or grab the RSS feed) to follow my thoughts on product development and running my own small business. Thanks for visiting, and thanks for reading.