Private LLM User Reviews

Top reviews

  • Great introduction to the use of open LLMs, but the Shortcuts integration is glitchy

    This is a great, easy starting point to explore open LLMs locally on Apple devices.

    Unfortunately the Shortcuts integration, likely the most helpful and distinguishing feature, suffers from glitches:

    - The “ask Private LLM” action errors out every second time, complaining of missing “q” and “s” parameters, even though the input is a single text field
    - The “Start a new chat with Private LLM” action fails to pass the prompt and query values to the app more often than not. I originally thought it was due to some unexpected special characters or the text being too long, but copy/pasting the same prompt and input directly into the app always works without issues

    Also, one missing feature would make a world of difference, since each model is tailored to specific use cases: the ability to pre-select a model for the Shortcuts actions.

    If the issues with Shortcuts were addressed, this would easily be a 5-star app that any LLM enthusiast short on time should have bought yesterday!

    Developer Response

    Thanks for the review! Regarding the Shortcuts issue, could you please email us at support@numen.ie with your device type and, if possible, your shortcut? We've been trying to reproduce it locally to no avail; perhaps we're missing some important detail.
  • Works well

    Works out of the box. I like that it’s getting updated with LLMs that will work on your phone. I would recommend following the developer’s recommendation for which LLM to use. P.S. The phone does warm up and use battery, but that’s expected.
  • Good start!

    This app is pretty fantastic! It runs almost smoothly on my phone and I can use it anywhere without needing the cloud. Some additional features that would be cool: being able to browse the internet, interact with documents, and connect a keyboard for rephrasing or correcting grammar on my phone instead of just on a computer. Also, there could be some improvement in user experience: smoother animations would be awesome. Keep up the good work!
  • A Powerful and Versatile App: Simple to Use, But Be Mindful of System Requirements

    If you invest in this really awesome app, you get two versions: one you can run on your Mac and one on your iPad.

    Very simple to use, with a variety of models to try out that you wouldn’t have known existed. I like the fact that the developer adds a star alongside the models they recommend.

    Keep in mind you need to have the right amount of memory. For example, running Smaug Llama 70B requires a laptop with at least 64 GB. The smaller 8B models call for 32 GB, but you can get by with 16 GB.

    Message to the developer: it would be awesome to see larger models available that will run on Apple silicon Macs with more RAM, for example 128 GB of memory.

    Note to users: You must quit the app after use to free up memory!

    Suggestion for a future update: if you could get this app to talk to your documents, it certainly would be a game-changer, potentially justifying a price increase.

    Would also be nice to be able to print out the chat content.

    It would be nice if there were some YouTube tutorials. I couldn’t find the option to chat with my screenshots; I don’t know if that’s even possible. But if you did make some tutorials, it would benefit the users, and likely benefit your endeavours too.

    Keep up the excellent work!
  • Plenty of models

    Wish it would allow users to use models outside of what’s included in the app. Requesting them one by one sounds tedious and inefficient.
  • If Apple allowed refunds for apps I would return this!

    I literally downloaded a 4-bit OmniQuant LLM and asked it point blank what information it was sending back to the developer. It basically confirmed that the entire conversation list and everything we discussed is transmitted back to the LLM developer to be used for training and configuring the models, told me not to expect any privacy due to the nature of the software, and said not to divulge any confidential or personal information to it, since all of it is used for training purposes. So much for this being a local language model and having any privacy…

    Developer Response

    What you're stating is simply not true. Our app does not send any information out. You can verify this by running the app with your phone in airplane mode, or by using network monitoring tools. If you'd like a refund for any reason, go to reportaproblem.apple.com. Also, please don't put too much faith in text generated by LLMs, regardless of whether they're local or server-based. You're likely using a very small LLM, and what you're observing is called a hallucination (please look it up). Larger LLMs are less susceptible to hallucinations.
  • Best app for LLMs

    Awesome app allowing you to choose which LLMs you use for specific tasks. I think it’s pretty flawless the way it is; great job on developing it.
    I think it’s the best thing available today.

    Ultimate Wish list:
    functionality to simultaneously query multiple chosen large language models and aggregate their responses into a single answer via a chosen master LLM.
  • Good for the end of Shortcuts

    Really interesting app! Thank you!
    Just note that, as in the various sample shortcuts, it’s best used at the end of your shortcut, dropping the result to the clipboard. I could never convince it to open another app afterwards and put the result in there directly.
  • EXACTLY what I needed

    This LLM app is what I was looking for. It is the reason I purchased an M4 iPad. Definitely worth the $10. Even the developer is active and available to chat with online, so reach out if you run into any issues. Overall super great, and I cannot wait to see more updates!
  • App crashes when downloading llama3

    I tried to download all 5 listed Llama 3 models in this app, but it keeps crashing before 7% of the download has completed!

    Developer Response

    Thanks for the review! It seems like you're trying to download Llama 3 8B models on an iPhone with 6 GB of RAM. Llama 3 8B models just barely fit on 6 GB devices and will not run if other apps are running in the background and taking up significant memory. To check, please go to the help screen and scroll all the way down to the debug information, which shows the amount of free memory available on your device.
