SEC-Insights Web-based Chatbot
A SEC-Insights Web-based Chatbot is a web-based, document-focused chatbot that answers questions about SEC 10-K and 10-Q filings using the Retrieval Augmented Generation (RAG) capabilities of LlamaIndex.
- Counter-Example(s):
- See: PDF Processor, PDF File.
References
2023
- https://github.com/run-llama/sec-insights
- QUOTE:
- A real-world full-stack application using LlamaIndex.
- SEC-Insights uses the Retrieval Augmented Generation (RAG) capabilities of LlamaIndex to answer questions about SEC 10-K & 10-Q documents.
- You can start using the application now at secinsights.ai.
- You can also check out our End-to-End tutorial guide on YouTube for this project! This video covers product features, system architecture, development environment setup, and how to use this application with your own custom documents (beyond just SEC filings!). The video has chapters so you can skip to the section most relevant to you.
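The RAG approach mentioned above retrieves the document chunks most relevant to a question and feeds them to the LLM as context. A minimal, library-free sketch of that pattern (the real project uses LlamaIndex with vector embeddings; the word-overlap scoring here is a deliberately simplified stand-in, and all names are illustrative):

```python
# Toy sketch of the Retrieval Augmented Generation (RAG) pattern.
# SEC-Insights itself uses LlamaIndex with vector embeddings; the
# word-overlap score below is only a stand-in for semantic similarity.

def score(query: str, chunk: str) -> int:
    """Toy relevance score: number of lowercase words shared with the query."""
    return len(set(query.lower().split()) & set(chunk.lower().split()))

def retrieve(query: str, chunks: list[str], top_k: int = 2) -> list[str]:
    """Return the top_k chunks ranked most relevant to the query."""
    return sorted(chunks, key=lambda c: score(query, c), reverse=True)[:top_k]

def build_prompt(query: str, chunks: list[str]) -> str:
    """Stuff the retrieved context into a prompt for the LLM (not called here)."""
    context = "\n---\n".join(retrieve(query, chunks))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

chunks = [
    "Revenue for fiscal 2023 grew 12% year over year.",
    "The board of directors met four times during the quarter.",
    "Risk factors include supply chain disruption and currency exposure.",
]
prompt = build_prompt("How much did revenue grow?", chunks)
```

In a real deployment the retriever queries a vector store built from the parsed filings, but the retrieve-then-prompt shape stays the same.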
- Why did we make this? 🤔
- As RAG applications look to move increasingly from prototype to production, we thought our developer community would find value in having a complete example of a working real-world RAG application.
- SEC Insights works as well locally as it does in the cloud. It also comes with many product features that will be immediately applicable to most RAG applications.
- Use this repository as a reference when building out your own RAG application or fork it entirely to start your project off with a solid foundation.
- Product Features 😎
- Chat-based Document Q&A against a pool of documents
- Citation of source data that LLM response was based on
- PDF Viewer with highlighting of citations
- Use of API-based tools (polygon.io) for answering quantitative questions
- Token-level streaming of LLM responses via Server-Sent Events
- Streaming of Reasoning Steps (Sub-Questions) within Chat
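Token-level streaming over Server-Sent Events, as listed above, reduces to writing `data:`-framed messages that a browser `EventSource` can consume, one event per generated token. A hedged sketch of that framing (the `token`/`end` event names and JSON payload shape are illustrative, not the app's actual wire format):

```python
import json

def sse_event(data: dict, event: str = "") -> str:
    """Frame one Server-Sent Event: an optional event name, a JSON payload
    on a data: line, terminated by the blank line the SSE format requires."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    lines.append(f"data: {json.dumps(data)}")
    return "\n".join(lines) + "\n\n"

def stream_tokens(tokens):
    """Yield each LLM token as its own SSE event (token-level streaming),
    then a final event signalling the end of the response."""
    for tok in tokens:
        yield sse_event({"token": tok}, event="token")
    yield sse_event({"done": True}, event="end")

body = "".join(stream_tokens(["SEC", " filings", " answered."]))
```

A web framework would send these strings over a long-lived `text/event-stream` response instead of joining them into one body as the demo line does.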
- Development Features 🤓
- Infrastructure-as-code for deploying directly to Vercel & Render
- Continuous deployments provided by Vercel & Render.com. Shipping changes is as easy as merging into your main branch.
- Production & Preview environments for both Frontend & Backend deployments! Easily try your changes before release.
- Robust local environment setup making use of LocalStack & Docker Compose.
- Monitoring & Profiling provided by Sentry.
- Load Testing provided by Loader.io.
- Variety of Python scripts for REPL-based chat & data management.
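A REPL-based chat script of the kind mentioned above is essentially a read-eval-print loop wrapped around a chat engine. A minimal sketch, assuming a `respond` callable stands in for the project's actual chat engine (the function and parameter names here are hypothetical):

```python
def chat_repl(respond, read=input, write=print):
    """Loop: read a question, print the engine's answer, stop on 'exit'.
    `read` and `write` are injectable so the loop can run without a terminal."""
    while True:
        try:
            question = read("you> ")
        except EOFError:
            break
        if question.strip().lower() in {"exit", "quit"}:
            break
        write(respond(question))

# Drive the loop without a terminal: a canned reader and recorded writes.
inputs = iter(["What is a 10-K?", "exit"])
outputs = []
chat_repl(
    respond=lambda q: f"(stub answer to: {q})",
    read=lambda prompt: next(inputs),
    write=outputs.append,
)
```

Swapping `respond` for a real query engine turns the same loop into a working chat client against the document index.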