🤖 Elon's AI plan

Plus open-source AI is getting good


Musk’s mission to save us all

I just told you how Elon Musk’s X is opening the door to using tweets and public information to train AI, and how I thought that data could even make its way into the ultra-billionaire’s xAI.

Well, it seems that hunch was right. But of course Musk being Musk, he isn’t stopping there.

According to Walter Isaacson, the famed biographer who is working on a book about Musk, the man has ambitions to give xAI access to footage captured by cameras on Tesla vehicles.

As Isaacson explains it in a piece for Time, the data from Tesla and X could teach “machines to navigate in physical space and to answer questions in natural language.”

If you’re thinking that sounds like the basis for a robot, well, yeah - Tesla has been working on those for some time. And while this all reads like an origin story for Skynet from The Terminator, Isaacson suggests Musk’s intentions are good.

The writer recollects a conversation with Musk: “What can be done to make AI safe?” Musk asked. “I keep wrestling with that. What actions can we take to minimize AI danger and assure that human consciousness survives?”

Isaacson goes on to frame Musk’s AI efforts through that lens; this is his way of saving humanity from other people’s AI.

But Isaacson also notes Musk wants to eventually tie his myriad companies together, which, presumably, would make data-sharing across those organizations even simpler. Considering Musk’s companies include Starlink internet satellites and Neuralink brain implants, it doesn’t take a polymath to see the moral complications that could arise.

Why it matters:

Right now, large language models (LLMs) are trained on websites and books. In the not-too-distant future, will we need to pay extra attention to terms and conditions in case LLMs are learning from the thoughts in our heads?

Falcon flying high

While companies such as Google, Meta, and OpenAI are making a lot of noise in the LLM space, the open-source community has not been resting on its laurels.

The Technology Innovation Institute (TII) has rolled out Falcon 180B on HuggingFace. Professed to be the “largest openly available” LLM, it features 180 billion parameters (hence the name), and was trained on 3.5 trillion tokens provided by TII.

HuggingFace claims Falcon 180B “tops the leaderboard for (pre-trained) open-access models,” and holds its own against Google’s PaLM-2.

You can check out Falcon 180B’s base model and its chat implementation on HuggingFace.

Why it matters:

Falcon 180B appears to be extremely capable, outpacing OpenAI’s GPT-3.5 - though not quite GPT-4. And considering this is an open-source project, it’s an encouraging sign for the future of the technology.


Written By: Tom Wilton

Lead Newsletter Writer

Published Date: Sep 08, 2023

Elon Musk's plans for AI extend to Tesla data, aiming to ensure AI safety. Meanwhile, Falcon 180B emerges as a leading LLM.
