At its inaugural LlamaCon AI developer conference on Tuesday, Meta announced an API for its Llama series of AI models: the Llama API.
Available in limited preview, the Llama API lets developers explore and experiment with products powered by different Llama models, per Meta. Combined with Meta's SDKs, it allows developers to build Llama-powered services, tools, and applications. Meta did not immediately share the API's pricing with TechCrunch.
The rollout of the API comes as Meta looks to maintain its lead in the fiercely competitive open model space. While Llama models have racked up more than a billion downloads, according to Meta, rivals such as DeepSeek and Alibaba's Qwen threaten to upend Meta's efforts to establish a far-reaching ecosystem with Llama.
The Llama API offers tools to fine-tune and evaluate the performance of Llama models, starting with Llama 3.3 8B. Customers can generate data, train on it, and then use Meta's evaluation suite in the Llama API to test the quality of their custom models.
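Meta has not published the Llama API's schema, so the workflow above can only be sketched hypothetically. In the snippet below, the field names, the payload shape, and the `llama-3.3-8b` model identifier are all assumptions for illustration, not the actual API contract.

```python
import json

# Hypothetical sketch only: Meta has not released the Llama API's request format.
# Field names and the model identifier below are assumptions.

def build_finetune_request(training_file: str, eval_suite: str) -> dict:
    """Assemble a hypothetical fine-tuning job request for the Llama API."""
    return {
        "model": "llama-3.3-8b",              # assumed name for Llama 3.3 8B
        "training_data": training_file,       # data the customer generated
        "evaluation": {"suite": eval_suite},  # Meta's evaluation tooling
    }

payload = build_finetune_request("my_dataset.jsonl", "default")
print(json.dumps(payload, indent=2))
```

The point of the sketch is the described loop: generate data, submit it for training, then run the resulting custom model through Meta's evaluation suite.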

Meta said it won't use Llama API customer data to train the company's own models, and models built using the Llama API can be transferred to another host.
For developers building on top of Meta's recently released Llama 4 models specifically, the Llama API offers model-serving options via partnerships with Cerebras and Groq. Meta said these "early experimental" options are "available by request" to help developers prototype their AI apps.
"By simply selecting the Cerebras or Groq model names in the API, developers can enjoy a streamlined experience with all usage tracked in one location," Meta wrote in a blog post provided to TechCrunch. "[W]e look forward to expanding partnerships with additional providers to bring even more options to build on top of Llama."
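Selecting a serving provider by model name, as Meta describes, might look something like the sketch below. The model-name strings, the payload shape, and the provider suffixes are all hypothetical, since Meta has not published the actual identifiers.

```python
# Hypothetical sketch: the Llama API's real request format and model names
# are not public. The names and the chat-style payload below are assumptions,
# illustrating how a provider-specific model name could route inference.

SERVING_OPTIONS = {
    "meta": "llama-4-maverick",               # assumed Meta-hosted default
    "cerebras": "llama-4-maverick-cerebras",  # assumed Cerebras-served variant
    "groq": "llama-4-maverick-groq",          # assumed Groq-served variant
}

def build_chat_request(provider: str, prompt: str) -> dict:
    """Build a chat-style request whose model name selects the serving provider."""
    return {
        "model": SERVING_OPTIONS[provider],
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_chat_request("groq", "Hello!")
print(req["model"])
```

The design Meta describes keeps the developer-facing request identical across providers; only the model name changes, while usage is tracked in one place.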
Meta said that it'll expand access to the Llama API "over the coming weeks and months."