saga-reader
Description: Saga Reader is an AI-driven, think-tank-style reader that automatically retrieves and summarizes internet content based on user-specified topics and preferences. It supports both cloud and local large language models (LLMs) for summarization and interactive discussions with an AI reading companion.
Purpose: To provide a secure, privacy-focused, and ad-free reading experience where users can subscribe to topics of interest, get AI-summarized content, and engage in real-time discussions with AI about their readings.
Target Audience: Users who want a lightweight, privacy-conscious, and AI-enhanced reading experience across multiple platforms (Windows, macOS, Linux).
Key Features:
- AI Content Subscription: Automatically fetches and summarizes content based on user-defined keywords.
- Immersive AI Companion: Enables real-time discussions with AI about articles.
- Multilingual Translation: Translates foreign-language articles into the user's preferred language.
- Privacy-Focused: All data is stored locally, ensuring no third-party tracking.
- Multi-Model AI Support: Works with both cloud-based and local LLMs (e.g., Ollama); a sketch of a local-model call appears after this list.
- Lightweight Performance: Built with Rust and Svelte, consuming minimal system resources (<10MB RAM).
- Clean UI: Ad-free, simple, and intuitive interface.
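To make the multi-model support above concrete, here is a minimal sketch of how a summarization request might be sent to a locally running Ollama instance over its HTTP API. The function name, the model name, and the prompt are illustrative rather than taken from Saga Reader's codebase, and the sketch assumes the `reqwest` (with the `json` feature), `serde_json`, and `tokio` crates.

```rust
// Hedged sketch: summarize an article with a local model served by Ollama.
// `summarize_locally` and the "llama3" model name are placeholders, not
// Saga Reader's actual code.
use serde_json::{json, Value};

async fn summarize_locally(article_text: &str) -> Result<String, reqwest::Error> {
    let client = reqwest::Client::new();
    let body = json!({
        "model": "llama3", // any model already pulled into the local Ollama install
        "prompt": format!("Summarize the following article in three sentences:\n\n{article_text}"),
        "stream": false
    });
    let resp: Value = client
        .post("http://localhost:11434/api/generate") // Ollama's default local endpoint
        .json(&body)
        .send()
        .await?
        .json()
        .await?;
    // In non-streaming mode the generated text arrives in the "response" field.
    Ok(resp["response"].as_str().unwrap_or_default().to_string())
}
```

A cloud provider such as GLM Flash or OpenAI would follow the same request/response shape, just with a hosted endpoint and an API key instead of a local URL.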
Technology Stack:
- Frontend: Svelte/SvelteKit (with TailwindCSS for styling).
- Backend: Rust (Tauri for desktop app framework, SeaORM for database interactions).
- AI Integration: Supports various LLM providers (e.g., GLM Flash, OpenAI) and local models via Ollama.
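As a rough illustration of how these layers connect, the sketch below shows a hypothetical Tauri command that the Svelte frontend could call; the command name, the `Summary` struct, and the placeholder body are assumptions, not the project's actual interface.

```rust
// Hedged sketch of the Rust <-> Svelte boundary in a Tauri app.
use serde::Serialize;

#[derive(Serialize)]
struct Summary {
    title: String,
    text: String,
}

// A hypothetical command; a real implementation would fetch the page,
// extract the article, and route the text to the configured LLM provider.
#[tauri::command]
async fn summarize_article(url: String) -> Result<Summary, String> {
    let text = format!("(summary of {url} would go here)");
    Ok(Summary { title: url, text })
}

fn main() {
    tauri::Builder::default()
        .invoke_handler(tauri::generate_handler![summarize_article])
        .run(tauri::generate_context!())
        .expect("error while running the Tauri application");
}
```

On the Svelte side, such a command would be reached with `invoke("summarize_article", { url })` from the `@tauri-apps/api` package.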
Cross-Platform: Available for Windows, macOS, and Linux. Downloadable from the official website.
Open Source: Fully free and open source (MIT licensed), encouraging community contributions such as additional search providers (e.g., Google), LLM integrations, and translations.
Developer Notes: Built as a monorepo with modular Rust crates for scraping, storage, LLM abstractions, and more. The frontend is a Svelte-based Tauri app.
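Given that crate-per-concern layout, one plausible shape for the LLM-abstraction crate is a provider trait that the rest of the workspace codes against, so cloud and local models stay interchangeable. The trait, struct, and function names below are hypothetical, and the sketch assumes the `async-trait` and `anyhow` crates.

```rust
// Hedged sketch of a provider-agnostic LLM abstraction; names are illustrative.
use async_trait::async_trait;

#[async_trait]
pub trait LlmProvider: Send + Sync {
    /// Generate a completion for the given prompt.
    async fn complete(&self, prompt: &str) -> anyhow::Result<String>;
}

/// Backed by a locally running Ollama instance (would implement `LlmProvider`
/// with an HTTP client against its local endpoint).
pub struct OllamaProvider {
    pub endpoint: String, // e.g. "http://localhost:11434"
    pub model: String,
}

/// Backed by a hosted API such as GLM Flash or OpenAI (would implement
/// `LlmProvider` with the vendor's REST API and an API key).
pub struct CloudProvider {
    pub api_base: String,
    pub api_key: String,
    pub model: String,
}

// Higher-level crates depend only on `dyn LlmProvider`, so switching between
// cloud and local models becomes a configuration choice, not a code change.
pub async fn summarize(llm: &dyn LlmProvider, article: &str) -> anyhow::Result<String> {
    llm.complete(&format!("Summarize concisely:\n\n{article}")).await
}
```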