Tag: AMD GPU

All articles tagged with #amd gpu

"Run an LLM Locally on Your PC in Under 10 Minutes with These Simple Steps"

Originally published 1 year ago by The Register


It's possible to run large language models (LLMs) such as Mistral or Codellama on your own PC using tools like Ollama, LM Suite, and Llama.cpp, with support for Nvidia and Apple M-series GPUs as well as AVX2-compatible CPUs. Ollama installs on Windows, Linux, and macOS, and offers a catalog of models in multiple quantization levels, letting you trade output quality for memory and speed to match your hardware. The article walks through installing Ollama, running models, and managing those already downloaded, and points to other frameworks for running local LLMs if Ollama doesn't fit your needs.
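The workflow the article describes can be sketched with Ollama's CLI. This is a minimal outline, not the article's exact steps: the install script URL and the model/quantization tags shown here are assumptions you should verify against ollama.com before running.

```shell
# Install Ollama on Linux via its install script
# (Windows and macOS use the installer from ollama.com):
curl -fsSL https://ollama.com/install.sh | sh

# Download a model and start an interactive chat session
# (weights are fetched on first run):
ollama run mistral

# Pick a quantized variant to reduce memory use; the exact tag
# is an assumption -- check the model's page for available tags:
ollama run mistral:7b-instruct-q4_0

# List models you have installed, and remove one you no longer need:
ollama list
ollama rm mistral
```

Smaller quantizations (e.g. 4-bit) shrink the download and RAM/VRAM footprint at some cost in output quality, which is the trade-off the article's "optimize based on system resources" advice refers to.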