Apple MLX: Build Your Own Private AI Server

Mervin Praison

👋 Hi everyone! In today’s tutorial, I’ll show you how to set up and run a private AI server locally using MLX server, completely independent of the internet. 🖥️💻📱 We’ll dive into how to create a user-friendly chat interface and test it directly from your phone. Perfect for enthusiasts and developers looking to harness the power of AI within their private network!

๐Ÿ” What You’ll Learn:
How to install and configure MLX server
Setting up ChainLit UI for a seamless chat experience
Running your private AI on both your computer and mobile device
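The install-and-serve steps above can be sketched as shell commands. The package name `mlx-lm` comes from the comments below; the `--model` value and `--port` are assumptions (any MLX-converted model from the mlx-community hub should work), so treat this as a sketch rather than the exact commands from the video:

```shell
# Install Apple's MLX language-model package (Apple Silicon only)
pip install mlx-lm

# Start the OpenAI-compatible server; model name and port are assumptions
mlx_lm.server --model mlx-community/Mistral-7B-Instruct-v0.2-4bit --port 8080
```

The server then accepts OpenAI-style chat-completion requests from any client on the machine or the local network.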

๐ŸŒ mlx_server running on Mac
๐Ÿค– @chainlit_io Chat on my iPhone
๐Ÿ  100% Local AI
๐Ÿ“ Local Wifi Network
๐Ÿ”’ Data remains private
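Once the server is up, any device on the same Wi-Fi network can talk to its OpenAI-style endpoint. A minimal standard-library sketch, assuming the server listens on port 8080; the LAN IP is a placeholder you would replace with your Mac's address:

```python
import json
import urllib.request

# Assumed LAN address of the Mac running mlx_lm.server; replace with your
# machine's IP (e.g. from `ipconfig getifaddr en0` on macOS).
MLX_URL = "http://192.168.1.10:8080/v1/chat/completions"

def build_chat_request(prompt: str, temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completion payload for the local server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def ask(prompt: str) -> str:
    """POST the prompt to the local MLX server and return the reply text."""
    data = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        MLX_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Build (but don't send) a sample request body to show its shape
payload = build_chat_request("Hello from my private network!")
print(json.dumps(payload, indent=2))
```

Because the request never leaves the local network, the data stays private; from a phone's browser or a small client app, the same URL works as long as both devices share the Wi-Fi network.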

🔗 Resources:
Patreon:
Ko-fi:
Discord:
Twitter / X:
Code:

📌 Timestamps:
0:02 – Introduction to MLX server setup
0:29 – Step-by-step MLX server configuration
1:07 – Installing MLX and starting the server
1:49 – Creating a UI with ChainLit
2:42 – Testing the chat interface
3:32 – Running the setup on your mobile device

💡 Make…


26 Comments

  1. When I run pip install mlx-lm I get an error.

    To fix this you could try to:

    1. loosen the range of package versions you've specified

    2. remove package versions to allow pip attempt to solve the dependency conflict

    ERROR: ResolutionImpossible

  2. Hi Mervin. A lot of companies have been connecting their AI chatbots to their social media channels like WhatsApp, and in multiple languages. Could you show us how we can do that? Basically a RAG connected to our socials in several different languages.

  3. What kind of Mac does this work well on? It feels like it doesn't work at all on my M1 MacBook Air. It seems like the server and the UI are not communicating, or my machine is too slow.

  4. Can you and Matthew Berman please take one day off per week…ideally on the same day, so that I get one day off a week to play with my dogs, cut the grass, drink a cocktail, go for a swim, and read a fictional book – without having to learn anything new for the day.

  5. This is misleading. You can access any port on the local network from any browser on a device connected to the network. It has nothing to do with your phone other than that your phone has a browser. If you left your house, you'd no longer even have access to the server; you'd need to set up tunneling for that, which you can do most easily with a service like ngrok. But this is not running on your phone or anything of the sort.

  6. Brilliant work. Can you create a video on how to train Llama 3 and then use Groq with that model? Maybe even how to deploy it on places like DigitalOcean or AWS. Thanks, keep it up.

  7. One thing I would add for novices like myself trying to follow along is the need for "pip3 install openai"; probably obvious for most, but I missed it initially.
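On the `ResolutionImpossible` error reported in the first comment: a common way out of that class of pip failure is a fresh virtual environment, so previously pinned packages cannot conflict with `mlx-lm`'s requirements. A sketch, assuming a recent Python 3 on Apple Silicon:

```shell
# Create and activate a clean virtual environment
python3 -m venv mlx-env
source mlx-env/bin/activate

# Upgrade pip (newer resolvers handle conflicts better), then install
pip install --upgrade pip
pip install mlx-lm
```

If the error persists even in a clean environment, the pip suggestions quoted in the comment (loosening or removing version pins) apply to whatever constraints file or requirements list is being installed alongside.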
