One can run Claude Code with local models through the LM Studio endpoint, so that you don't have to use Claude in the cloud.
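A minimal sketch of what this setup might look like, assuming LM Studio is serving on its default port 1234 and that Claude Code honours the `ANTHROPIC_BASE_URL` / `ANTHROPIC_AUTH_TOKEN` environment variables; the URL, token, and whether LM Studio's endpoint speaks the Anthropic messages API are all assumptions to verify against your own install:

```python
import os
import subprocess

# Hypothetical setup: point Claude Code at a local LM Studio server instead of
# the Anthropic cloud. The base URL is LM Studio's default (assumed); local
# servers typically ignore the auth token, so any placeholder works.
env = dict(
    os.environ,
    ANTHROPIC_BASE_URL="http://localhost:1234",  # local LM Studio endpoint (assumed)
    ANTHROPIC_AUTH_TOKEN="lm-studio",            # placeholder, not a real key
)

# subprocess.run(["claude"], env=env)  # launch Claude Code with the overridden endpoint
```

The `subprocess.run` line is commented out because it only makes sense with LM Studio actually running; equivalently, you could export the two variables in your shell before starting `claude`.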
4 Matching Annotations
- Last 7 days
lmstudio.ai
- Jan 2026
lmstudio.ai
LM Studio model catalog (for local models). Useful to see which models are most used at the moment.
- [ ] return
- Dec 2025
www.howtogeek.com
Calibre has added AI 'support': mostly suggestions for new stuff to read, plus an option to discuss a book. It has an LM Studio back-end, so I can tie it to my local models.
- Dec 2024
lmstudio.ai
LM Studio can run LLMs locally (I have Llama and Phi installed). It also exposes an API over a localhost web server; I use that API to make Llama available in Obsidian through the Copilot plugin.
This is the API documentation. #openvraag what other scripts / [[Persoonlijke tools 20200619203600]] could I use this in?
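For tying other scripts to the same back-end, a small sketch of calling LM Studio's local server from Python. LM Studio exposes an OpenAI-compatible API; the address `http://localhost:1234/v1` is its usual default, and the model name below is a placeholder, so both are assumptions to check against your own setup:

```python
import json
import urllib.request

# Default LM Studio server address (assumed; check the Server tab in LM Studio).
BASE_URL = "http://localhost:1234/v1"

def build_payload(prompt: str, model: str = "llama-3.2-3b-instruct") -> dict:
    """Build an OpenAI-style chat-completions request body.

    The model name is a placeholder; use whatever model you have loaded.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local_llm(prompt: str) -> str:
    """POST the prompt to the local LM Studio server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the server speaks the OpenAI wire format, any tool or plugin that accepts a custom OpenAI base URL (as the Obsidian Copilot plugin does) can be pointed at it the same way.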