    Startups

    My go-to LLM tool just dropped a super simple Mac and PC app for local AI – why you should try it

    By PineapplesUpdate · August 5, 2025 · 4 Mins Read

    Jack Wallen / Elyse Betters Picaro / ZDNET

    ZDNET's key takeaways

    • Ollama's developers have released a native GUI for macOS and Windows.
    • The new GUI makes using AI locally very simple.
    • The app is easy to install and lets you pull various LLMs.

    If you use AI, there are many reasons you might want to work with it locally instead of in the cloud.

    First, local use provides far more privacy. When using a large language model (LLM) in the cloud, you never know if your queries or results are being tracked or even saved by a third party. In addition, using an LLM locally saves energy. The amount of energy required to run cloud-based LLMs is growing and could become a problem in the future.

    Ergo, locally hosted LLMs.

    Also: How to run DeepSeek AI locally to protect your privacy – 2 easy ways

    Ollama is a tool that allows you to run different LLMs locally. I have been using it for some time and have found it simplifies the process of downloading and using various models. Although it does require serious system resources (you won't want to use it on an aging machine), it runs fast and lets you switch between models.

    But Ollama itself has been a command-line-only affair. There are third-party GUIs (e.g., Msty, which has been my go-to). Until now, though, the developers behind Ollama had not produced their own GUI.

    That all changed recently, and there is now a simple, user-friendly GUI, aptly named Ollama.

    It works with common LLMs – but you can pull others

    The GUI is quite basic, but it's designed so that anyone can jump in immediately and start using it. There's a short list of LLMs that can be easily pulled from the LLM drop-down list. Those models are fairly common (such as the Gemma, DeepSeek, and Qwen models). Select one of those models, and the Ollama GUI will pull it down for you.

    If you want to use a model that isn't listed, you have to pull it from the command line, like so:

    ollama pull MODEL

    Where MODEL is the name of the model you want.
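    The same pull can be scripted rather than typed by hand. Here is a minimal sketch, assuming the ollama CLI from the app install is on your PATH; the model name "llama3.2" is just an example, substitute whichever model you want:

    ```python
    import shutil
    import subprocess

    def build_pull_command(model: str) -> list[str]:
        """Build the argument list for `ollama pull <model>`."""
        return ["ollama", "pull", model]

    def pull_model(model: str) -> None:
        """Pull a model if the ollama CLI is installed; otherwise say so."""
        if shutil.which("ollama") is None:
            print("ollama CLI not found -- install the Ollama app first")
            return
        # Streams download progress to the terminal, just like typing it yourself.
        subprocess.run(build_pull_command(model), check=True)

    if __name__ == "__main__":
        pull_model("llama3.2")  # hypothetical example model name
    ```

    The shutil.which check keeps the script from crashing on machines where Ollama isn't installed yet.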

    Also: How I feed my files to a local AI for better, more relevant responses

    You can find a complete list of available models in the Ollama library.

    After pulling a model, it appears in the drop-down to the right of the query bar.

    The Ollama app is as easy to use as any cloud-based AI interface on the market, and it's free for macOS and Windows (sadly, there is no Linux version of the GUI).

    I kicked the tires of the Ollama app and found that, although it doesn't have Msty's feature set, it looks better and fits better with the macOS aesthetic. The Ollama app also feels slightly faster than Msty (both in opening and in responding to queries), which is a good thing, because local AI can often be a bit slow (due to limited system resources).

    How to install the Ollama app on Mac or Windows

    You're in luck, because installing the Ollama app is as easy as installing any app on macOS or Windows. You just point your browser to the Ollama download page, download the app for your OS, double-click the downloaded file, and follow the instructions. For example, on macOS, you drag the Ollama app icon into the Applications folder, and you're done.

    Using Ollama is equally easy: select the model you want, download it, and then send your query.

    The Ollama app's model-selection drop-down.

    Pulling an LLM is as easy as choosing it from the list and letting the app do the work.

    Jack Wallen/ZDNET
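    For readers who want to go a step beyond the GUI: once a model has been pulled, it can also be queried from a script through Ollama's local HTTP API, which by default listens on port 11434. A minimal sketch, assuming the Ollama server is running locally and a model named "llama3.2" (an example name) is already downloaded:

    ```python
    import json
    import urllib.request

    # Default endpoint of the local Ollama server.
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def build_request(model: str, prompt: str) -> urllib.request.Request:
        """Build a non-streaming generate request for the local Ollama server."""
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
        return urllib.request.Request(
            OLLAMA_URL,
            data=payload.encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )

    def ask(model: str, prompt: str) -> str:
        """Send the prompt and return the model's reply text."""
        with urllib.request.urlopen(build_request(model, prompt)) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask("llama3.2", "In one sentence, what is a local LLM?"))
    ```

    Because everything stays on localhost, the privacy benefit described above applies to scripted queries just as it does to the app itself.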

    Should you try the Ollama app?

    If you are looking for a reason to try local AI, now is the right time.

    Also: I tried Sanctum's local AI app, and it's exactly what I needed to keep my data private.

    The Ollama app makes migrating away from cloud-based AI about as easy as it can get. The app is free to install and use, as are the LLMs in the Ollama library. Give it a chance, and see if it doesn't become your go-to AI tool.

    Want more stories about AI? Check out AI Leaderboard, our weekly newsletter.
