Sebastix on Nostr
2025-06-24 18:09:56 GMT

Running Ollama with DeepSeek-R1

It thinks / writes just as fast as I can read 😜
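
For context, this is roughly the setup, as a minimal sketch assuming ollama serve is running and a deepseek-r1 tag has already been pulled (the model name and prompt are placeholders):

    # Stream a chat completion from a local Ollama instance.
    import ollama

    stream = ollama.chat(
        model="deepseek-r1",  # placeholder: use whichever tag you pulled
        messages=[{"role": "user", "content": "Explain Nostr in one paragraph."}],
        stream=True,
    )

    # Tokens are printed as they arrive, which is why it reads in real time.
    for chunk in stream:
        print(chunk["message"]["content"], end="", flush=True)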

Next: try to set this up as a provider with a proxy 👀 but is it correct that I need an OpenAI API key? /cc
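
My current understanding: if the proxy just speaks the OpenAI API format, it can point at Ollama's OpenAI-compatible endpoint and any placeholder string works as the key, since nothing is validated locally; a real OpenAI key should only matter if the proxy actually calls OpenAI's hosted API. A rough sketch with the openai Python client (the base_url and model tag are assumptions about my setup):

    # Point an OpenAI-style client at a local Ollama instance.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
        api_key="ollama",  # the client requires a key, but any placeholder works locally
    )

    resp = client.chat.completions.create(
        model="deepseek-r1",  # the local Ollama tag, not an OpenAI model name
        messages=[{"role": "user", "content": "Hello from Nostr"}],
    )
    print(resp.choices[0].message.content)
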
Should I go with https://github.com/ollama/ollama and try to import https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-32B?
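
Or maybe I can skip the manual Hugging Face import entirely if the distilled Qwen-32B weights are already published in the Ollama library under a deepseek-r1 tag (the 32b tag name is an assumption, worth checking on ollama.com/library/deepseek-r1):

    # Pull a ready-made distill tag instead of importing raw HF weights.
    # Assumes deepseek-r1:32b in the Ollama library is the Qwen-32B distill.
    import ollama

    ollama.pull("deepseek-r1:32b")  # downloads the quantized model

    reply = ollama.chat(
        model="deepseek-r1:32b",
        messages=[{"role": "user", "content": "ping"}],
    )
    print(reply["message"]["content"])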

There are too many models to pick from at first; I guess I have to start simple.

Where are the local LLM experts here? #askNostr