
How to Install DeepSeek on Windows

BY b7hxm
May 25, 2025

Tutorial: Running DeepSeek R1 Locally

Introduction

Learn how to run the DeepSeek R1 model locally, so your prompts stay on your machine and no internet connection is required. This tutorial covers two methods: LM Studio and Ollama.

Method 1: Using LM Studio

  1. Download LM Studio:

    • Visit the LM Studio website.

    • Download LM Studio for your operating system (Mac, Linux, Windows).

    • Pick the right Windows build by checking whether 'ARM' appears in your system type:

      Start > MSINFO32 > System Type
      
  2. Install LM Studio:

    • Open the downloaded file and install LM Studio.
    • Ensure 'Run LM Studio' is checked before exiting the installer.
  3. Selecting and Downloading a Model:

    • Open LM Studio and click the search icon.
    • Search for "DeepSeek"; LM Studio lists matching models from Hugging Face.
    • Choose between the distilled models (Qwen 7B or Llama 8B).
  4. Running Locally:

    • Disconnect from the internet to ensure local operation.
    • Use LM Studio for AI queries similar to online services.
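The MSINFO32 check from step 1 can also be done from the command line. A minimal sketch for Mac/Linux is below; on Windows, the CMD equivalent is `echo %PROCESSOR_ARCHITECTURE%`, which prints AMD64 on standard PCs and ARM64 on ARM machines.

```shell
# Print the machine architecture (Mac/Linux). Use the result to pick the
# matching LM Studio download: x86_64/AMD64 -> standard build, arm64/ARM64 -> ARM build.
uname -m
```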

Method 2: Using Ollama

  1. Download Ollama:

    • Visit the Ollama website and download for your OS.
    • Run the installer once downloaded.
  2. Use Command Line for Model Interaction:

    • Open Command Prompt (CMD on Windows) or a terminal.
    • On Ollama's site, search for "DeepSeek R1" models.
    • Select the 8-billion-parameter Llama distill.
    • Copy the model's run command from Ollama's site and execute it in the terminal to chat with the model.
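The Ollama steps above boil down to two commands. A minimal sketch, assuming the 8B Llama distill is published under the tag `deepseek-r1:8b` (confirm the exact tag on Ollama's site before running):

```shell
# Guard: only proceed if the ollama CLI is actually installed.
if command -v ollama >/dev/null 2>&1; then
  ollama pull deepseek-r1:8b   # download the 8B distilled model (several GB)
  ollama run deepseek-r1:8b    # start an interactive chat in the terminal
else
  echo "ollama not found - install it from ollama.com first"
fi
```

The `command -v` guard keeps the script from failing with a confusing error when Ollama is not yet on the PATH.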

Comparison: Local vs Web-Based DeepSeek R1

  • Privacy & Connectivity: Local models run entirely offline; no Wi-Fi or internet connection is required.
  • Distillation Explanation: The local models are distilled versions: smaller models trained to imitate the full DeepSeek R1, trading some capability for far lower resource requirements.
  • Model Parameter Considerations: More parameters generally mean stronger output but higher memory and compute requirements.
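The parameter/resource trade-off can be sanity-checked with simple arithmetic: the weights dominate memory use, so parameters × bits-per-weight ÷ 8 gives a floor on the RAM/VRAM needed. A sketch assuming 4-bit quantization, which is common for local runs:

```shell
# Rough memory floor for quantized weights: params (billions) * bits / 8 = GB
for params in 7 8 14 32 70; do
  awk -v p="$params" 'BEGIN { printf "%2dB model at 4-bit: ~%.1f GB\n", p, p * 4 / 8 }'
done
```

Actual usage is higher once the context cache and runtime overhead are added, so treat these numbers as lower bounds.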

Practical Evaluations

Email Response

  • Firmness: Both models can rewrite emails but have varying levels of authority.

Math Problem

  • Accuracy: Larger models tend to provide more complete answers.

Advice Scenarios

  • Detailed Advice: Larger models offer more step-by-step guidance.

Logic Problems

  • Challenge Levels: AI models struggle with complex riddles.

Music Recommendations

  • Accuracy: Larger models give better recommendations with fewer 'hallucinations' (made-up songs or artists).

Conclusion

Experiment with different models and parameters to find the best fit. For more complex tasks, consider larger models with more parameters if your machine supports them.


Final Thought

DeepSeek is best run locally due to its preference for avoiding 'cloudy days'.

Presented by David. Stay tuned for the next tutorial.
