Learn how to run the DeepSeek R1 model locally to address privacy concerns and to work without an internet connection. This tutorial covers two methods: LM Studio and Ollama.
Download LM Studio:
Visit the LM Studio website.
Download LM Studio for your operating system (Mac, Linux, Windows).
Determine whether you need the ARM or x64 Windows build by checking the system type for 'ARM':
Start > MSINFO32 > System Type
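As a cross-platform alternative to the MSINFO32 dialog, a short Python sketch can report the machine architecture. This is an illustrative addition, not part of the original tutorial, and the set of ARM identifiers below is an assumption covering common values.

```python
import platform

def needs_arm_build() -> bool:
    # platform.machine() returns strings such as 'AMD64', 'x86_64',
    # 'arm64', or 'aarch64' depending on OS and CPU.
    return platform.machine().lower() in {"arm64", "aarch64"}

print(f"Machine: {platform.machine()}, ARM build needed: {needs_arm_build()}")
```

If this prints `ARM build needed: True`, download the ARM installer; otherwise use the standard x64 build.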
Install LM Studio:
Selecting and Downloading a Model:
Running Locally:
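Beyond the built-in chat window, LM Studio can expose a local OpenAI-compatible server (by default on port 1234), which lets you query the model from your own scripts. The sketch below assumes that server is running; the model name is a placeholder for whichever DeepSeek R1 build you downloaded.

```python
import json
from urllib import request

# Default endpoint for LM Studio's local OpenAI-compatible server.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_payload(prompt: str, model: str = "deepseek-r1-distill-qwen-7b") -> dict:
    # Minimal OpenAI-style chat-completions request body.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def ask(prompt: str) -> str:
    # Requires LM Studio's local server to be running.
    body = json.dumps(build_payload(prompt)).encode()
    req = request.Request(BASE_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (only works while the server is up):
# print(ask("Why run models locally?"))
```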
Download Ollama:
Use Command Line for Model Interaction:
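A typical terminal session looks like the following. The model tag `deepseek-r1:7b` is a placeholder for whichever size you choose, and the commands assume Ollama is already installed.

```shell
# Pull a DeepSeek R1 variant (pick a size your machine can handle)
ollama pull deepseek-r1:7b

# Start an interactive chat session in the terminal
ollama run deepseek-r1:7b

# Or send a one-off prompt without entering interactive mode
ollama run deepseek-r1:7b "Summarize why local inference helps privacy."
```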
Experiment with different models and parameter counts to find the best fit. For more complex tasks, consider larger models with more parameters if your machine can support them.
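One way to experiment with parameters is through Ollama's local REST API (default port 11434), where generation options such as temperature can be passed per request. The model tag and option values below are illustrative, and the call itself requires the Ollama server to be running.

```python
import json
from urllib import request

# Default endpoint for Ollama's local generate API.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str,
                  model: str = "deepseek-r1:7b",
                  temperature: float = 0.7) -> dict:
    # 'options' carries sampling parameters; 'stream': False
    # asks for a single JSON response instead of a stream.
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": temperature},
    }

def generate(prompt: str) -> str:
    # Requires `ollama serve` (or the Ollama app) to be running.
    body = json.dumps(build_request(prompt)).encode()
    req = request.Request(OLLAMA_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Varying `temperature` (lower for focused answers, higher for more varied ones) is a simple first parameter to experiment with.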
DeepSeek is best run locally, keeping your data off the cloud (no 'cloudy days').
Presented by David. Stay tuned for the next tutorial.