Eliezer Yudkowsky
Eliezer Yudkowsky is a pioneer in artificial intelligence alignment and rationality. As co-founder and research fellow at the Machine Intelligence Research Institute (MIRI), he leads efforts to ensure that artificial general intelligence (AGI) benefits humanity. Yudkowsky developed the concept of "Friendly AI" and has made significant contributions to decision theory and metaethics. His influential writings, including "Rationality: From AI to Zombies," have shaped discussions of rationality, cognitive biases, and AI risk, and have inspired researchers to take on the challenge of building beneficial AI systems. He continues to shape AI research and ethics, emphasizing careful consideration of the long-term implications of AI development, and Time named him one of the 100 most influential people in AI.