
Are large language models (LLMs) our new operating systems? If so, they are changing how we define software itself.
Also: 8 ways to write better ChatGPT prompts - and get the results you want fast
Many analogies have been used to describe the impact of rapidly evolving AI technologies: utilities, time-sharing systems, and operating systems. Andrej Karpathy, OpenAI co-founder and former senior director of AI at Tesla, believes an operating system is the most fitting analogy. In this view, LLMs are essentially a new kind of computer that orchestrates memory and compute for problem-solving tasks.
LLMs are complex software ecosystems, Karpathy explained in a talk at a recent startup conference in San Francisco. These ecosystems bear many similarities to the operating systems of yesteryear. The LLM landscape resembles proprietary platforms such as Windows or Mac OS, with open-source alternatives alongside them, according to Karpathy. "The LLaMA ecosystem is a close approximation to something like Linux," he said.
Also: How to turn AI into your own research assistant with this free Google tool
Utility and time-sharing computing are analogies that apply to LLMs because they are ubiquitous and require heavy capital to build. "We're in this 1960s-ish era, where LLM compute is still very expensive for this new kind of computer," Karpathy explained. "That forces LLMs to be centralized in the cloud, and we're all just thin clients that interact with it over the network."
This cost and complexity mean that, in a sense, a "personal computing revolution" has not yet happened for LLMs, because running them locally "is just not economical and doesn't make sense" yet, he said.
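The thin-client arrangement described above can be sketched in a few lines. Everything here is a hypothetical placeholder rather than any specific provider's API: the client holds no model weights and only packages text to ship over the network, while all model compute happens server-side.

```python
import json

def build_completion_request(prompt: str, model: str = "example-llm") -> bytes:
    """Package a prompt as the JSON body of a hypothetical HTTP POST.

    The field names ("model", "prompt") are illustrative, not a real API.
    """
    return json.dumps({"model": model, "prompt": prompt}).encode("utf-8")

# The "thin client" does no inference itself; it only sends text to the cloud.
body = build_completion_request("Explain time-sharing in one sentence.")
```

The network call itself is omitted; the point is only that the local machine's job reduces to serializing a request, much like a 1960s terminal relaying keystrokes to a mainframe.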
Unlike today's operating systems, LLMs do not yet have a general-purpose graphical user interface. "Whenever I talk to ChatGPT or some LLM directly in text, I feel like I'm talking to an operating system through the terminal," Karpathy said. "A GUI hasn't really been invented yet in a general way. Should ChatGPT have a GUI beyond the text bubbles? Sure, some apps have GUIs, but there is no GUI across all tasks."
Despite these challenges, however, progress is rapid. Karpathy suggested that we are entering the era of "Software 3.0." Whereas Software 1.0 meant explicitly coding a program's logic and Software 2.0 is based on neural networks, Software 3.0 "programs" systems with prompts in our native language.
Also: AI agents win over professionals - but only to do their grunt work, Stanford study finds
Karpathy suggested that his previous experience working on Tesla's Autopilot technology illustrates the shift from 1.0 to 2.0 and 3.0. "It was taking in sensor information through a software stack to produce steering and acceleration," he explained.
"At one point, Autopilot had a ton of C++ code, the Software 1.0 code, and there were some neural nets doing image recognition. As we improved Autopilot, the neural network grew in capability and size, while the C++ code was deleted."
This approach, he said, fed the images captured by several cameras through a single neural network: "We were able to delete a lot of code. The Software 2.0 stack quite literally ate through Autopilot's Software 1.0 stack."
Also: Don't be fooled into thinking AI is coming for your job - here's the truth
Karpathy said the same transition is now visible at a much larger scale across software development as we move to Software 3.0. "We have a new type of software, and it's eating through the stack," he said.
"Now we have three completely different programming paradigms. If you're entering the industry, it's a very good idea to be fluent in all of them, because they all have pros and cons. Do you want to program some functionality in 1.0 or 2.0 or 3.0? Do you want to train a neural net? Do you want to just prompt an LLM?"
Software development and programming are changing rapidly, shifting from heads-down coding to interactive dialogues with machines. In the process, this new era is opening up a wide range of application possibilities.

