LLM-Supported AI System
An LLM-Supported AI System is a GenAI-supported system that implements an LLM-based algorithm to solve an LLM-based task (a task that requires the use of a large language model).
- Context:
- Task Input: LLM Prompt, System Configuration
- Task Output: Model Response, System Action
- Task Performance Measure: Response Quality, Processing Time, Resource Usage, Safety Measures (a minimal end-to-end sketch appears after this Context list).
- ...
- It can (typically) be developed through an LLM-Supported System Development Project (often by an LLM engineering team that follows LLM system development models).
- It can (typically) be composed of LLM Pipelines (such as an LLM training pipeline).
- It can (typically) make use of an LLM Technology (such as an LLM platform or LLM libraries).
- It can (often) aim for an LLM-Centric System Architecture.
- It can (typically) implement various LLM System Automation Patterns:
- LLM Pipeline Automation for automated data and model flow
- LLM Feedback Automation for self-improving systems
- LLM Decision Automation for autonomous choices
- LLM Interaction Automation for user engagement
- It can range from having Simple LLM Automation to Complex LLM Automation.
- It can range from having Manual LLM Operation to Autonomous LLM Operation.
- It can support LLM Integration Types, such as:
- LLM API Integrations, such as:
- LLM Cloud API Integration for hosted services
- LLM Local API Integration for on-premise deployment
- LLM Framework Integrations, such as:
- LLM Pipeline Integration for workflow systems
- LLM Service Integration for microservices
- LLM Data Integrations, such as:
- LLM Database Integration for structured data
- LLM Knowledge Base Integration for domain knowledge
- It can progress through LLM System Development Stages, such as:
- LLM Prototype Systems, such as:
- LLM Proof of Concept for concept validation
- LLM Technical Demo for capability demonstration
- LLM Production Systems, such as:
- LLM Enterprise Deployment for business use
- LLM Public Service for consumer applications
- It can range from being an LLM Development Environment to an LLM Production Environment.
- It can range from being an LLM Testing System to an LLM Live System.
- ...
- It can range from being a Partially-Automated LLM-based System to being a Fully-Automated LLM-based System.
- It can range from being a Passive LLM-based System to being an Active LLM-based System.
- It can range from being a Batch LLM-based System to being an Online LLM-based System.
- It can range from being a Personalized LLM-based System to being a Non-Personalized LLM-based System.
- It can range from being a Zero-Shot LLM-Supported AI System to being a Few-Shot LLM-Supported AI System.
- It can range from being an LLM-Supported Application to being an LLM-Supported Component.
- It can range from being an LLM-based Demo-Ware System to being an LLM-based Application.
- ...
- It can be implemented using an ML Framework (which can include an ML library or ML tools).
- …
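The Context bullets above can be grounded with a small, hedged sketch. The snippet below is illustrative only: the class names (SystemConfig, LLMSupportedSystem), the endpoint URL, and the response schema are assumptions rather than any specific LLM platform's API. It shows an LLM Prompt plus a System Configuration flowing in, a Model Response plus a System Action flowing out, Processing Time recorded as one Task Performance Measure, and an LLM Cloud API Integration as the model call.

```python
# Minimal, illustrative sketch of an LLM-Supported AI System.
# All names (SystemConfig, LLMSupportedSystem, the endpoint URL, the
# response schema) are hypothetical assumptions, not a specific product's API.
import time
from dataclasses import dataclass, field
from typing import List

import requests  # generic HTTP client standing in for an LLM Cloud API Integration


@dataclass
class SystemConfig:
    """System Configuration: model choice, endpoint, and optional few-shot examples."""
    model: str = "example-llm-model"                       # hypothetical model id
    endpoint: str = "https://llm.example.com/v1/generate"  # hypothetical hosted endpoint
    few_shot_examples: List[str] = field(default_factory=list)


class LLMSupportedSystem:
    """Task Input (LLM Prompt + SystemConfig) -> Task Output (Model Response + System Action)."""

    def __init__(self, config: SystemConfig):
        self.config = config

    def build_prompt(self, user_prompt: str) -> str:
        # Zero-shot if no examples are configured, few-shot otherwise.
        examples = "\n".join(self.config.few_shot_examples)
        return f"{examples}\n{user_prompt}" if examples else user_prompt

    def run(self, user_prompt: str) -> dict:
        start = time.time()
        prompt = self.build_prompt(user_prompt)
        # LLM Cloud API Integration: a single call to a (hypothetical) hosted model.
        reply = requests.post(
            self.config.endpoint,
            json={"model": self.config.model, "prompt": prompt},
            timeout=30,
        ).json()
        model_response = reply.get("text", "")
        # System Action: a simple routing decision derived from the Model Response.
        system_action = "escalate" if "cannot answer" in model_response else "respond"
        return {
            "model_response": model_response,           # Task Output
            "system_action": system_action,             # Task Output
            "processing_time_s": time.time() - start,   # Task Performance Measure
        }
```

Pointing the (assumed) endpoint at an on-premise server would correspond to the LLM Local API Integration case, and populating few_shot_examples moves the system along the Zero-Shot to Few-Shot range described above.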
- Example(s):
- General-Purpose LLM Systems, such as:
- LLM-based Conversational Systems, such as:
- LLM Open-Domain Chatbot for LLM conversation, such as:
- LLM Personal Assistant for LLM task automation, such as:
- LLM Information Systems, such as:
- LLM Information Extraction System for LLM data extraction (see the extraction sketch after this Example(s) list), such as:
- LLM Text Classification System for LLM content analysis, such as:
- LLM Content Generation System for LLM text creation, such as:
- LLM Recommender System for LLM content recommendation.
- ...
- Domain-Specific LLM Systems, such as:
- LLM Business Systems, such as:
- LLM Customer Support System for LLM customer service, such as:
- LLM Business Assistant for LLM business operations, such as:
- LLM Professional Systems, such as:
- LLM Healthcare Systems, such as:
- LLM Legal Systems, such as:
- LLM Education Systems, such as:
- LLM Development Systems, such as:
- LLM Code Systems, such as:
- LLM Code Generation System for LLM software development, such as:
- LLM Code Analysis System for LLM code review, such as:
- LLM Documentation Systems, such as:
- ...
- LLM Infrastructure Systems, such as:
- LLM Cloud Platform for LLM cloud deployment, such as:
- LLM Edge System for LLM local processing, such as:
- ...
- LLM Development Support Systems, such as:
- LLM Testing Platform for LLM quality assurance, such as:
- LLM Development Environment for LLM system creation, such as:
- ...
- LLM Data Processing Systems, such as:
- LLM Data Pipeline for LLM data flow, such as:
- LLM Vector Database for LLM data storage, such as:
- ...
- LLM Integration Systems, such as:
- LLM API Gateway for LLM service management, such as:
- LLM Workflow System for LLM process automation, such as:
- ...
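As one concrete reading of the Example(s) above, the following sketch shows how an LLM Information Extraction System might wrap a model call behind a task-specific interface. The prompt template, the extract_contacts helper, and the injected call_llm callable are illustrative assumptions, not part of any named product.

```python
# Illustrative LLM Information Extraction System (one of the Example(s) above).
# The prompt template and function names are assumptions made for this sketch;
# `call_llm` stands in for whichever LLM API the host system integrates.
import json
from typing import Callable, Dict, List

EXTRACTION_PROMPT = (
    "Extract every person name and email address from the text below. "
    "Return only a JSON list of objects with 'name' and 'email' keys.\n\nText:\n{text}"
)


def extract_contacts(text: str, call_llm: Callable[[str], str]) -> List[Dict[str, str]]:
    """LLM data extraction: unstructured text in, structured records out."""
    raw = call_llm(EXTRACTION_PROMPT.format(text=text))
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        records = []  # a production system would log, retry, or repair here
    return [r for r in records if isinstance(r, dict) and "name" in r and "email" in r]
```

A caller would supply its own client, e.g. extract_contacts(email_body, call_llm=my_llm_client); the same wrapping pattern applies to the classification, generation, and code-analysis examples listed above.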
- Counter-Example(s):
- Traditional ML System, which uses statistical models rather than language models.
- Rule-Based AI System, which relies on predefined rules rather than learned patterns.
- Expert System, which uses knowledge bases rather than language understanding.
- Neural Network System, which focuses on pattern recognition rather than language processing.
- Information Retrieval System, which performs data lookup rather than text generation.
- See: ML-based System.
References
2023
- https://www.fiddler.ai/blog/four-ways-that-enterprises-deploy-llms
- QUOTE: With the rapid pace of LLM innovations, enterprises are actively exploring use cases and deploying their first generative AI applications into production. As the deployment of LLMs or LLMOps began in earnest this year, enterprises have incorporated four types of LLM deployment methods, contingent on a mix of their own talent, tools and capital investment. Bear in mind these deployment approaches will keep evolving as new LLM optimizations and tooling are launched regularly.
The goal of this post is to walk through these approaches and talk about the decisions behind these design choices.
[Figure: Four approaches to LLMs in production (AWS Generative AI Summit)]
There are four different approaches that enterprises are taking to jumpstart their LLM journey. These four approaches range from easy and cheap to difficult and expensive to deploy, and enterprises should assess their AI maturity, model selection (open vs. closed), data available, use cases, and investment resources when choosing the approach that works for their company's AI strategy. Let's dive in.
2023
- chat
- Q: An LLM-based production system refers to a system that utilizes large language models (LLMs) like OpenAI's GPT series to enhance, automate, or support various aspects of production or operations. These systems leverage the natural language understanding, generation, and reasoning capabilities of LLMs to perform tasks that typically require human intervention or expertise. While I'm not aware of any specific production systems called "LLM-based production systems," I can give you some examples of systems that utilize LLMs in their functioning:
- Content generation: LLMs like GPT-4 can automatically generate content for blogs, social media, and marketing campaigns, reducing the time and effort required by human content creators.
- Customer support: LLMs can be integrated into chatbots and support systems to handle customer queries, complaints, and concerns more effectively and efficiently.
- Document summarization: LLMs can automatically summarize long documents, reports, or articles, making it easier for professionals to extract relevant information quickly.
- Legal analysis: LLMs can be employed to review contracts, analyze legal documents, and provide insights into complex legal matters, saving time and resources for legal professionals.
- Language translation: LLMs can be used to develop translation systems that accurately convert text from one language to another, making communication easier across different countries and cultures.
- Sentiment analysis: LLMs can analyze customer reviews and feedback to identify positive or negative sentiments, helping businesses improve their products and services.
- Medical diagnosis: LLMs can assist medical professionals by analyzing patient symptoms and medical history to suggest potential diagnoses and treatment plans.
- These are just a few examples of how LLMs can be integrated into various production systems to enhance efficiency, reduce costs, and improve overall output. As AI research and development continues, LLMs will likely find even more applications across a wide range of industries.