LM Studio model catalog (for local models): useful to see which models are most used at the moment.
Calibre has added AI 'support', mostly to suggest new things to read, plus an option to discuss a book. It has an LM Studio back-end, so I can tie it to my local models.
LM Studio can run LLMs locally (I have Llama and Phi installed). It also exposes an API over a localhost web server. I use that API to make Llama available in Obsidian through the Copilot plugin.
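As a minimal sketch of what that localhost API looks like: LM Studio serves an OpenAI-compatible endpoint, by default on `http://localhost:1234/v1` (the exact address and the model name below are assumptions; check LM Studio's server/developer view for the real values).

```python
# Minimal sketch: query a local model through LM Studio's
# OpenAI-compatible API. The base URL and model name are
# assumptions -- check LM Studio's server view for yours.
import json
import urllib.request


def build_chat_request(prompt, model="llama-3.1-8b-instruct"):
    """Build the JSON payload for a chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def ask(prompt, base_url="http://localhost:1234/v1"):
    """Send the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Usage (needs the LM Studio server running):
# print(ask("Summarize this note in one sentence."))
```

This is the same endpoint shape the Obsidian Copilot plugin talks to, so any small script can reuse it.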
This is the API documentation. #openvraag which other scripts / [[Persoonlijke tools 20200619203600]] could I use this in?