Artificial Intelligence (AI) Safety Task
An Artificial Intelligence (AI) Safety Task is a system safety task that ...
- See: AI Alignment, AI Risk, Machine Ethics.
References
2023
- (Wikipedia, 2023) ⇒ https://en.wikipedia.org/wiki/AI_safety Retrieved:2023-7-31.
- AI safety is an interdisciplinary field concerned with preventing accidents, misuse, or other harmful consequences that could result from artificial intelligence (AI) systems. It encompasses machine ethics and AI alignment, which aim to make AI systems moral and beneficial, as well as technical problems such as monitoring systems for risks and making them highly reliable. Beyond AI research, it involves developing norms and policies that promote safety.
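The "monitoring systems for risks" aspect mentioned above is often realized as a runtime check wrapped around a model's predictions. The sketch below is a minimal, illustrative example only, assuming a hypothetical `classify` model call and an arbitrary confidence threshold; it is not taken from any specific AI safety system.

```python
# Minimal sketch of a runtime safety monitor: predictions whose confidence
# falls below a threshold are escalated for human review instead of acted on.
# The `classify` stub and the 0.8 threshold are hypothetical placeholders.
from dataclasses import dataclass


@dataclass
class Decision:
    label: str
    confidence: float
    escalated: bool  # True when the prediction is withheld for human review


def classify(text: str) -> tuple[str, float]:
    """Hypothetical stand-in for a deployed model's prediction call."""
    # A real system would invoke a trained model here.
    return ("benign", 0.92) if "summarize" in text.lower() else ("uncertain", 0.31)


def monitored_classify(text: str, min_confidence: float = 0.8) -> Decision:
    """Wrap the model call with a simple reliability check.

    Low-confidence outputs are flagged rather than trusted, one common
    pattern for monitoring deployed AI systems for risky behavior.
    """
    label, confidence = classify(text)
    if confidence < min_confidence:
        return Decision(label="needs_review", confidence=confidence, escalated=True)
    return Decision(label=label, confidence=confidence, escalated=False)


if __name__ == "__main__":
    for prompt in ["Please summarize this report.", "Do something ambiguous."]:
        print(prompt, "->", monitored_classify(prompt))
```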