LLM-based System Safety Measure

From GM-RKB

An LLM-based System Safety Measure is a system safety measure for LLM-based systems (intended to mitigate safety risks within AI systems).
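One common form such a measure can take is an output guardrail that screens model responses before they reach the user. The sketch below is a minimal, hypothetical illustration (the denylist, function names, and refusal message are assumptions, not from any specific framework):

```python
# Minimal sketch of one kind of LLM-based system safety measure:
# an output guardrail that blocks responses containing denylisted
# phrases. The denylist and names here are illustrative only.
DENYLIST = {"make a bomb", "credit card number"}

def guardrail_check(response: str) -> bool:
    """Return True if the response passes the safety check."""
    text = response.lower()
    return not any(term in text for term in DENYLIST)

def safe_respond(response: str) -> str:
    """Wrap an LLM response, replacing unsafe output with a refusal."""
    if guardrail_check(response):
        return response
    return "[Response withheld by safety filter]"
```

Production systems typically layer several such measures (input filtering, output classification, rate limiting) rather than relying on a single check.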