Apple MLX: Build Your Own Private AI Server
#Apple #MLX #Build #Private #Server
Mervin Praison
Hi everyone! In today's tutorial, I'll show you how to set up and run a private AI server locally using MLX server, completely independent of the internet. We'll dive into how to create a user-friendly chat interface and test it directly from your phone. Perfect for enthusiasts and developers looking to harness the power of AI within their private network!
What You'll Learn:
How to install and configure MLX server
Setting up ChainLit UI for a seamless chat experience
Running your private AI on both your computer and mobile device
mlx_server running on Mac
@chainlit_io chat on my iPhone
100% local AI
Local Wi-Fi network
Data remains private
Resources:
Patreon:
Ko-fi:
Discord:
Twitter / X :
Code:
Timestamps:
0:02 – Introduction to MLX server setup
0:29 – Step-by-step MLX server configuration
1:07 – Installing MLX and starting the server
1:49 – Creating a UI with ChainLit
2:42 – Testing the chat interface
3:32 – Running the setup on your mobile device
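For anyone following along, here is a minimal sketch (not from the video) of how a client on your Wi-Fi network could talk to the local server. It assumes mlx-lm exposes an OpenAI-compatible chat-completions endpoint on port 8080; the host, port, endpoint path, and model name are all assumptions you should adjust to match how you started your own server.

```python
import json
import urllib.request

def build_chat_request(prompt, host="localhost", port=8080,
                       model="mlx-community/Mistral-7B-Instruct-v0.2-4bit"):
    """Build an OpenAI-style chat-completions request for a local server.

    The endpoint path, port, and model name here are assumptions;
    change them to match your own setup.
    """
    url = f"http://{host}:{port}/v1/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    data = json.dumps(payload).encode("utf-8")
    # Send with urllib.request.urlopen(req) once your server is running.
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    return req

req = build_chat_request("Hello from my private network!")
print(req.full_url)  # -> http://localhost:8080/v1/chat/completions
```

To reach the server from a phone on the same Wi-Fi network, replace `localhost` with your Mac's local IP address.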
When I run pip install mlx-lm I get an error.
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible
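One possible fix, sketched here as a suggestion rather than a confirmed solution: ResolutionImpossible often comes from conflicts with packages already installed globally, so installing into a fresh virtual environment with an up-to-date pip frequently resolves it.

```shell
# Sketch of a clean-environment fix; assumes Python 3 with the venv module.
python3 -m venv mlx-env        # create an isolated environment
. mlx-env/bin/activate         # activate it
python -m pip --version        # confirm pip now comes from the venv
# With the venv active, upgrade pip and retry the install:
#   python -m pip install --upgrade pip
#   python -m pip install mlx-lm
```

Note that mlx-lm itself requires an Apple-silicon Mac, so the install will still fail on other hardware.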
Hi Mervin. A lot of companies have been connecting their AI chatbots to their social media channels like WhatsApp, and in multiple languages too. Could you show us how we can do that? Basically a RAG system connected to our social channels in several different languages.
What kind of Mac does this work well on? It feels like it doesn't work at all on my M1 MacBook Air. It seems like the server and the UI are not communicating, or my machine is too slow.
At least explain first why Apple MLX: what are its features and benefits compared to the alternatives?
Great tutorial! Two requests: please provide system requirements for the setups in these videos, and for this one, maybe show how to access it remotely.
Can you and Matthew Berman please take one day off per week…ideally on the same day, so that I get one day off a week to play with my dogs, cut the grass, drink a cocktail, go for a swim, and read a fictional book – without having to learn anything new for the day.
This is misleading. You can access any port on the local network from any browser on a device connected to the network. It has nothing to do with your phone other than that your phone has a browser. If you left your house, you'd no longer even have access to the server; you'd need to set up tunneling for that, which you can do most easily with a service like ngrok. But this is not running on your phone or anything of the sort.
great video Mervin, thank you!
Brilliant work. Can you create a video on how to train Llama 3 and then use Groq with that model? Maybe even how to deploy it on places like DigitalOcean or AWS. Thanks, keep it up!
One thing I would add for novices like myself trying to follow along is the need for "pip3 install openai"; probably obvious for most, but I missed it initially.
Very nice – great video, great topic… thank you Mervin
You are NOT running "this on your phone". You simply open the web UI from it, which is a completely different thing.
Interesting. I could do the same with an Ollama server as well, correct?
Thanks for your tutorial! Is it only for Apple machines?
Thanks!
Thanks Mervin, great concise video. I have one suggestion for improvement. Can you put the terminal commands in the description?
This is outstanding! Thank you Mervin!
Another outstanding video. Short but informative, practical, link to code that works on the first go! Awesome!
Thanks!
I just purchased an m3 max macbook pro, so I will probably actually implement what you are suggesting in this video
I am working on a project, and the user talks to the model via text
Love this!
Genuine question: why use MLX server when Streamlit provides this out of the box? I can access my Streamlit app through my phone.
Yah, ChainLit! That was a cracker! Thx Mervin
Hello mate, is this possible on Windows?
Also, is it production ready, so it can be used by a local business via the internet?
Why so many cuts in your video?