LLM DevOps Practice

From GM-RKB

An LLM DevOps Practice (also known as an LLM Ops or LLMOps practice) is a DevOps practice for productionizing LLM-based workflows, covering activities such as deploying, testing, monitoring, and iterating on applications built around large language models.
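One concrete LLMOps activity is gating prompt changes behind automated regression checks in CI, the way conventional DevOps gates code changes behind unit tests. The sketch below illustrates this idea only; all names (`FakeLLMClient`, `PROMPT_TEMPLATE`, the specific checks) are illustrative assumptions rather than any particular tool's API, and a stub client stands in for a hosted model so the example runs offline.

```python
# Minimal sketch of one LLM DevOps (LLMOps) activity: a prompt
# regression check that a CI job could run before deploying a prompt
# change. FakeLLMClient is a stand-in for a real LLM API client.

PROMPT_TEMPLATE = "Summarize the following ticket in one sentence:\n{ticket}"


class FakeLLMClient:
    """Offline stand-in for a hosted LLM so the check is deterministic."""

    def complete(self, prompt: str) -> str:
        # A real client would call a model endpoint here.
        ticket = prompt.split(":\n", 1)[1]
        return f"Summary: {ticket[:60]}"


def run_prompt_regression(client, cases):
    """Render the template for each case and apply simple output checks;
    returns the tickets whose outputs failed any check."""
    failures = []
    for case in cases:
        prompt = PROMPT_TEMPLATE.format(ticket=case["ticket"])
        output = client.complete(prompt)
        if case["must_contain"] not in output:
            failures.append(case["ticket"])
        elif len(output) > case.get("max_len", 200):
            failures.append(case["ticket"])
    return failures


cases = [
    {"ticket": "Login page returns 500 after deploy", "must_contain": "Summary"},
    {"ticket": "Export job times out on large files", "must_contain": "Summary"},
]
print(run_prompt_regression(FakeLLMClient(), cases))  # empty list => all checks pass
```

In a real pipeline, the fake client would be replaced by the production model (or a pinned snapshot of it), and the checks would typically include richer evaluations such as semantic similarity to reference answers or LLM-as-judge scoring.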



References

2024

[1] https://quix.io/blog/llmops-running-large-language-models-in-production
[2] https://community.aws/posts/we-built-an-llm-powered-devops-guru-heres-what-we-learned
[3] https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-end-to-end-azure-devops-with-prompt-flow?view=azureml-api-2
[4] https://www.vellum.ai/blog/the-four-pillars-of-building-a-production-grade-ai-application
[5] https://www.ml-architects.ch/blog_posts/reliable_llm.html
[6] https://huyenchip.com/2023/04/11/llm-engineering.html
[7] https://hackernoon.com/embracing-llm-ops-the-next-stage-of-devops-for-large-language-models
[8] https://blogs.starcio.com/2023/08/llm-generative-ai-devops.html
[9] https://arxiv.org/abs/2405.11581
[10] https://github.com/flavienbwk/awesome-llm-devops
[11] https://www.pluralsight.com/resources/blog/software-development/testing-llm-applications-devops
[12] https://learn.microsoft.com/en-us/azure/machine-learning/prompt-flow/how-to-integrate-with-llm-app-devops?view=azureml-api-2
[13] https://www.linkedin.com/pulse/operationalizing-large-language-models-production-madhav-kashyap-dmsjf