šŸ¤– Elon's AI plan

Plus open-source AI is getting good


Muskā€™s mission to save us all

I justĀ told youĀ how Elon Muskā€™sĀ XĀ is opening the door to using tweets and public information to train AI, and how I thought that data could even make its way into the ultra-billionaireā€™sĀ xAI.

Well, it seems that hunch was right. But of course Musk being Musk, he isnā€™t stopping there.

According to Walter Isaacson, the famed biographer who is working on a book about Musk, the man has ambitions to give xAI access to footage captured by cameras onĀ TeslaĀ vehicles.

As Isaacson explains it inĀ a piece for Time, the data from Tesla and X could teach ā€œmachines to navigate in physical space and to answer questions in natural language.ā€

If youā€™re thinking that sounds like the basis for a robot, well, yeah: Tesla hasĀ been working on those for some time. And while this all reads like an origin story forĀ SkynetĀ fromĀ The Terminator, Isaacson suggests Muskā€™s intentions are good.

The writer recollects a conversation with Musk:Ā ā€œWhat can be done to make AI safe?ā€ Musk asked. ā€œI keep wrestling with that. What actions can we take to minimize AI danger and assure that human consciousness survives?ā€

Isaacson goes on to frame Muskā€™s AI efforts through that lens; this is his way of saving humanity from other peopleā€™s AI.

But Isaacson also notes Musk wants to eventually tie his myriad companies together, which, presumably, would make data-sharing across those organizations even simpler. Considering Muskā€™s companies includeĀ StarlinkĀ internet satellites andĀ NeuralinkĀ brain implants, it doesnā€™t take a polymath to see the moral complications that could arise.

Why it matters:

Right now, large language models (LLMs) are trained on websites and books. In the not-too-distant future, will we need to pay extra attention to terms and conditions in case LLMs are learning from the thoughts in our heads?

Falcon flying high

While companies such asĀ Google,Ā Meta, andĀ OpenAIĀ are making a lot of noise in the LLM space, the open-source community has not been resting on its laurels.

TheĀ Technology Innovation InstituteĀ (TII)Ā has rolled out Falcon 180B onĀ Hugging Face. Professed to be the ā€œlargest openly availableā€ LLM, it features 180 billion parameters (hence the name) and was trained on 3.5 trillion tokens provided by TII.

Hugging Face claims Falcon 180B ā€œtops the leaderboard for (pre-trained) open-access models,ā€ and holds its own against Googleā€™s PaLM-2.

You can check out Falcon 180Bā€™s base modelĀ here, and its chat implementationĀ here.

Why it matters:

Falcon 180B appears extremely capable, outpacing OpenAIā€™s GPT-3.5, though not quite matching GPT-4. And considering this is an open-source project, itā€™s an encouraging sign for the future of the technology.


Written By: Tom Wilton

Lead Newsletter Writer

Published Date: Sep 08, 2023

