Why I Built a Local-First AI Chatbot

April 24, 2025

AI is changing how we work. The productivity boost is real. This shouldn't be a surprise - the latest models are truly impressive. For software engineers, AI coding assistants have transformed workflows. For many others, AI chatbots have become a daily resource.

But something about these tools kept bothering me: the data privacy implications. Most AI services store your conversations, have them reviewed by humans, and use the data to train future models.

While you can often disable this in settings, privacy policies generally include exceptions under which your data can still be used.

To address this concern, I built a simple terminal-based chatbot for my own general questions throughout the day. The chatbot interfaced with Amazon Bedrock, which provides access to a range of powerful language models and comes with a strong data protection commitment not to store or log conversations.
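
For the curious, here's a rough sketch of what such a terminal loop can look like using the AWS SDK's Bedrock Runtime Converse API. This isn't the actual code - the model ID, region, and exit handling are illustrative - but it shows the basic shape: conversation history lives in memory for the session and nothing gets written to disk.

```typescript
// Illustrative sketch of a minimal Bedrock-backed terminal chat loop.
// Assumes AWS credentials are configured locally; model ID and region are examples.
import * as readline from "node:readline/promises";
import { stdin, stdout } from "node:process";
import {
  BedrockRuntimeClient,
  ConverseCommand,
  type Message,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });
const modelId = "anthropic.claude-3-5-sonnet-20240620-v1:0"; // example model ID

async function main() {
  const rl = readline.createInterface({ input: stdin, output: stdout });
  const history: Message[] = []; // kept in memory only for this session

  while (true) {
    const question = await rl.question("> ");
    if (question.trim() === "exit") break;
    if (!question.trim()) continue;

    history.push({ role: "user", content: [{ text: question }] });
    const response = await client.send(
      new ConverseCommand({ modelId, messages: history })
    );

    const reply = response.output?.message;
    console.log(`\n${reply?.content?.[0]?.text ?? "(no response)"}\n`);
    if (reply) history.push(reply);
  }

  rl.close();
}

main();
```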

This worked well for me - faster workflow, better focus. Soon, my wife wanted to use it too, but a terminal interface wasn't what she had in mind. I searched for a product that would meet her needs but couldn't find one promising zero data retention with a powerful model. Local and open source options existed, but local models lag in performance, and self-hosting wasn't appealing to her.

This is when LastChat was born. The idea was simple - build a chatbot application with two key goals:

  1. Serve cutting-edge AI models

  2. Provide best-in-class privacy for conversation data

There are several privacy-oriented inference providers¹ that serve leading models that are great for coding, writing, math, and more. So goal #1 was fairly straightforward.

For goal #2, I wanted the highest degree of privacy possible by not storing any conversation data on servers at all. So I set out to build a React-based web app using IndexedDB to store all conversation data locally on the user's device.

After some experimentation, I landed on React Router for client-side routing and data loading, paired with Dexie.js for local data storage. Beyond the privacy benefits, this approach unlocked near-instant navigation speeds throughout the app. Most navigations work offline too!
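
To make that concrete, here's a sketch of how the two pieces can fit together. The table names, fields, and route path below are hypothetical examples rather than LastChat's actual schema: Dexie.js wraps IndexedDB in a typed, promise-based API, and a React Router data route can load straight from it.

```typescript
// Illustrative sketch: conversation data lives entirely in the browser via Dexie.js (IndexedDB).
// Table names, fields, and the route path are hypothetical examples.
import Dexie, { type Table } from "dexie";
import { createBrowserRouter } from "react-router-dom";

interface Conversation {
  id: string;
  title: string;
  updatedAt: number;
}

interface ChatMessage {
  id: string;
  conversationId: string;
  role: "user" | "assistant";
  text: string;
  createdAt: number;
}

class LocalDb extends Dexie {
  conversations!: Table<Conversation, string>;
  messages!: Table<ChatMessage, string>;

  constructor() {
    super("chat-local"); // all data stays in this origin's IndexedDB
    this.version(1).stores({
      conversations: "id, updatedAt",
      messages: "id, conversationId, createdAt",
    });
  }
}

export const db = new LocalDb();

// A data route whose loader reads from IndexedDB instead of a server.
export const router = createBrowserRouter([
  {
    path: "/chat/:conversationId",
    loader: async ({ params }) => {
      const messages = await db.messages
        .where("conversationId")
        .equals(params.conversationId!)
        .sortBy("createdAt");
      return { messages };
    },
    // element: <ChatView />  // rendering component omitted in this sketch
  },
]);
```

Because the loader resolves against local IndexedDB rather than a network request, route transitions don't have to wait on a server at all - which is where the near-instant navigation and offline behavior come from.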

The end result is a fast, private AI chatbot. No collection of your conversations. No human review. No training on your data. No user profiling. No targeted ads.

LastChat is self-funded with no pressure to grow or monetize - a straightforward, privacy-first experience. I hope you find it useful too!

¹ Amazon Bedrock, Groq, and Fireworks AI, to name a few.
