I just told you how Elon Musk's X is opening the door to using tweets and public information to train AI, and how I thought that data could even make its way into the ultra-billionaire's xAI.
Well, it seems that hunch was right. But of course, Musk being Musk, he isn't stopping there.
According to Walter Isaacson, the famed biographer who is working on a book about Musk, the man has ambitions to give xAI access to footage captured by cameras on Tesla vehicles.
As Isaacson explains it in a piece for Time, the data from Tesla and X could teach "machines to navigate in physical space and to answer questions in natural language."
If you're thinking that sounds like the basis for a robot, well, yeah: Tesla has been working on those for some time. And while this all reads like an origin story for Skynet from The Terminator, Isaacson suggests Musk's intentions are good.
The writer recalls a conversation with Musk: "What can be done to make AI safe?" Musk asked. "I keep wrestling with that. What actions can we take to minimize AI danger and assure that human consciousness survives?"
Isaacson goes on to frame Musk's AI efforts through that lens: this is his way of saving humanity from other people's AI.
But Isaacson also notes Musk wants to eventually tie his myriad companies together, which, presumably, would make data-sharing across those organizations even simpler. Considering Musk's companies include Starlink internet satellites and Neuralink brain implants, it doesn't take a polymath to see the moral complications that could arise.
Right now, large language models (LLMs) are trained on websites and books. In the not-too-distant future, will we need to pay extra attention to terms and conditions in case LLMs are learning from the thoughts in our heads?
While companies such as Google, Meta, and OpenAI are making a lot of noise in the LLM space, the open-source community has not been resting on its laurels.
The Technology Innovation Institute (TII) has rolled out Falcon 180B on Hugging Face. Professed to be the "largest openly available" LLM, it features 180 billion parameters (hence the name), and was trained on 3.5 trillion tokens provided by TII.
Hugging Face claims Falcon 180B "tops the leaderboard for (pre-trained) open-access models," and holds its own against Google's PaLM-2.
You can check out Falcon 180B's base model here, and its chat implementation here.
Falcon 180B appears to be extremely capable, outpacing OpenAI's GPT-3.5, though not quite GPT-4. And considering this is an open-source project, it's an encouraging sign for the future of the technology.