Andrej Karpathy dropped reader3 on November 18—a lightweight EPUB reader that makes it dead simple to read books alongside LLMs. The project exploded to 1,576 GitHub stars in 48 hours, proving massive appetite for practical AI-augmented reading. This isn’t another bloated AI app trying to do everything. It’s infrastructure: display one chapter at a time, copy to your favorite LLM (ChatGPT, Claude, whatever), discuss and understand. That’s it.
The timing matters. As AI tools proliferate, Karpathy shows how LLMs can augment traditional knowledge consumption rather than replace it. And the source? One of the most influential figures in AI—former Tesla AI director, OpenAI co-founder, 120K+ GitHub followers.
“Code Is Ephemeral” – The Philosophy That Matters
Here’s what makes reader3 interesting: Karpathy explicitly states it’s “90% vibe coded” and he won’t support or improve it. Instead, he encourages users to “ask your LLM to change it in whatever way you like” because “code is ephemeral now and libraries are over.” This is a provocative statement about software development—tools should inspire, not constrain.
His track record backs this philosophy. nanoGPT (37K stars), llm.c (28K stars), LLM101n course (35K stars)—all educational, minimalist implementations that prioritize learning over production polish. reader3 continues this pattern: simple enough to understand the entire codebase, complex enough to be genuinely useful, deliberately designed to encourage modification rather than adoption as-is.
The meta aspect is delicious: a tool for reading books with LLMs, which you’re supposed to improve by reading its code with LLMs. Karpathy isn’t just building tools; he’s demonstrating how to build tools in the LLM era.
How Reader3 Actually Works
reader3 is deliberately NOT a chatbot or an integrated AI reader. It’s infrastructure for human-AI collaborative reading. The setup takes four commands:
```bash
# Download a book from Project Gutenberg
curl -o dracula.epub https://www.gutenberg.org/ebooks/345.epub.images

# Register it to the library
uv run reader3.py dracula.epub

# Start the server, then visit http://localhost:8123
uv run server.py
```
The architecture is intentionally minimal: a Python script run with the uv package manager, a local *_data directory per book, and a simple web UI. No database, no cloud service, no telemetry. Chapter-by-chapter display suits both LLM context windows and human comprehension: you read, copy the text (Cmd+A, Cmd+C), paste it into any LLM, and discuss.
Self-hosted means privacy. LLM-agnostic means freedom: it works with ChatGPT, Claude, Gemini, or local models via Ollama. The simplicity is the feature.
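The chapter-extraction step is easy to reason about because an EPUB is just a ZIP archive of XHTML files plus a manifest: META-INF/container.xml points to an OPF package file whose spine lists chapters in reading order. Here is a minimal stdlib-only sketch of that idea — not reader3’s actual code, and the toy two-chapter EPUB it builds is purely for demonstration:

```python
import io
import zipfile
import xml.etree.ElementTree as ET

def spine_chapters(epub_bytes: bytes) -> list[str]:
    """Return chapter hrefs in reading order from an EPUB's spine."""
    with zipfile.ZipFile(io.BytesIO(epub_bytes)) as z:
        # META-INF/container.xml points at the OPF package file
        container = ET.fromstring(z.read("META-INF/container.xml"))
        ns = {"c": "urn:oasis:names:tc:opendocument:xmlns:container"}
        opf_path = container.find(".//c:rootfile", ns).attrib["full-path"]

        opf = ET.fromstring(z.read(opf_path))
        ns = {"o": "http://www.idpf.org/2007/opf"}
        # manifest maps item ids to file hrefs; spine lists ids in reading order
        items = {i.attrib["id"]: i.attrib["href"]
                 for i in opf.findall(".//o:manifest/o:item", ns)}
        return [items[ref.attrib["idref"]]
                for ref in opf.findall(".//o:spine/o:itemref", ns)]

# Build a toy two-chapter EPUB in memory to demonstrate
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("META-INF/container.xml",
        '<container xmlns="urn:oasis:names:tc:opendocument:xmlns:container">'
        '<rootfiles><rootfile full-path="content.opf"/></rootfiles></container>')
    z.writestr("content.opf",
        '<package xmlns="http://www.idpf.org/2007/opf">'
        '<manifest><item id="c1" href="ch1.xhtml"/>'
        '<item id="c2" href="ch2.xhtml"/></manifest>'
        '<spine><itemref idref="c1"/><itemref idref="c2"/></spine></package>')

print(spine_chapters(buf.getvalue()))  # ['ch1.xhtml', 'ch2.xhtml']
```

Once you have the hrefs in spine order, serving “one chapter at a time” is just reading each file out of the archive.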
Practical Use Cases Beyond Reading
reader3 shines for technical books (algorithms, systems design, language specs) where you need instant clarification of dense concepts. Language learners can read in their target language with the LLM providing translations and cultural context on demand. Literature enthusiasts can discuss themes and symbolism in real time, and researchers can quickly extract key points and generate summaries.
The workflow is straightforward: copy chapter text, paste into LLM, ask “Summarize this chapter in 3 bullet points” or “Explain this technical concept in simpler terms” or “Compare this approach to modern practices.” The LLM becomes your reading companion, not your replacement.
Project Gutenberg provides 70,000+ free EPUBs—perfect content source. And because reader3 is LLM-agnostic, privacy-conscious developers can use local models instead of cloud services.
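For the local-model route, the copy-paste step can even be scripted against Ollama’s standard /api/generate endpoint. A hedged stdlib-only sketch — the `llama3` model name and chapter file are assumptions, and this is one possible workflow, not part of reader3:

```python
import json
import urllib.request

def build_prompt(chapter_text: str, question: str) -> str:
    """Combine the question and the pasted chapter into one prompt."""
    return f"{question}\n\n---\n\n{chapter_text}"

def ask_local_llm(chapter_text: str, question: str,
                  model: str = "llama3",  # assumes `ollama pull llama3` was run
                  host: str = "http://localhost:11434") -> str:
    """POST the prompt to a local Ollama server and return its reply."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(chapter_text, question),
        "stream": False,  # ask for one JSON object instead of a stream
    }).encode()
    req = urllib.request.Request(f"{host}/api/generate", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server):
#   chapter = ...  # text copied from reader3's chapter view
#   print(ask_local_llm(chapter, "Summarize this chapter in 3 bullet points."))
```

Swap in any prompt from the workflow above; nothing leaves your machine.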
The Alternatives: Minimal vs Full-Featured
reader3 competes with NotebookLM (Google’s full-featured AI research assistant), ChatGPT file uploads, and commercial tools like Readwise Reader. The tradeoff is clear: minimal, self-hosted, and free versus feature-rich, cloud-based, and often paid.
NotebookLM offers automatic summaries, citations, and audio overviews, but requires a Google account and cloud dependence. ChatGPT’s file upload lets you discuss entire documents but lacks chapter-by-chapter structure for long books. Readwise delivers polish, mobile apps, and syncing for $7.99/month but locks you into its ecosystem.
Choose reader3 if you value privacy (self-hosted), simplicity (minimal code), and customization (fork and modify). In contrast, choose alternatives if you want polish, integration, or convenience over control. Neither choice is wrong—they optimize for different priorities.
Key Takeaways
- reader3 is infrastructure for LLM-assisted reading, not an integrated AI app—chapter-by-chapter display makes it easy to copy text to any LLM
- Karpathy’s “code is ephemeral, libraries are over” philosophy challenges traditional software development—tools should inspire modification, not constrain adoption
- 1,576 stars in 48 hours validates appetite for simpler AI tools that augment rather than replace human activities
- Self-hosted, free, open-source design contrasts sharply with bloated commercial alternatives—privacy and control over features and polish
- Practical use today: technical books (instant concept clarification), language learning (real-time translations), literature analysis (theme discussion), research note-taking (quick summaries)
The project lives at github.com/karpathy/reader3. Grab a book from Project Gutenberg and try it. Or better yet, fork it and ask your LLM how to add the features you want. That’s exactly what Karpathy intended.