Future of Life Institute cofounder Max Tegmark on regulating AI, Elon Musk's potential to be a good influence on the administration, understanding how LLMs think, and more.
More than 33,000 people, including a hall of fame of AI experts, signed a March 2023 open letter calling on the tech industry to pause development of AI models more powerful than OpenAI's GPT-4 for six months rather than continue the rush toward Artificial General Intelligence, or AGI. "Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable," stated the letter, which was spearheaded by an organization called the Future of Life Institute.
“In my opinion, this issue is the most important issue of all for the Trump administration, because I think AGI is likely to actually be built during the Trump administration. So during this administration, this is all going to get decided: whether we drive off that cliff or whether AI turns out to be the best thing that ever happened.”
Spoiler: The industry didn't heed the letter's call. But it did generate tremendous publicity for the case that AGI could spiral out of human control unless safeguards were in place before they were actually needed. And it was only one of many initiatives from the decade-old institute designed to cultivate a conversation around AI's risks and the best ways to steer the technology in a responsible direction.
Read the complete Fast Company article by Harry McCracken: https://www.fastcompany.com/91228731/max-tegmark-future-of-life-interview