Ollama is a Go utility for running LLMs locally, either in the terminal as a command-line tool or through a REST API.
Ollama : https://github.com/jmorganca/ollama
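A minimal sketch of the REST side, assuming the Ollama server is running locally on its default port (11434) and that a model such as "mistral" has already been pulled with `ollama pull mistral`:

```python
# Query a locally running Ollama server over its REST API.
# Assumes `ollama serve` is listening on localhost:11434 and "mistral" is pulled.
import json
import urllib.request

payload = {
    "model": "mistral",
    "prompt": "Explain what a vector embedding is in one sentence.",
    "stream": False,  # request a single JSON response instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

The same model can be used interactively from the terminal with `ollama run mistral`.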
To use Mistral with Ollama:
Ollama + mistral : https://www.reddit.com/r/LangChain/comments/16vwilm/mistralai_llm_with_langchin/
LangChain is a Python library for interfacing with various LLMs:
https://www.langchain.com/
To use Ollama with LangChain:
LangChain + Ollama : https://python.langchain.com/docs/integrations/llms/ollama
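A short sketch of that integration, assuming Ollama is serving locally and "mistral" is pulled; depending on your LangChain version the import may instead be `from langchain.llms import Ollama`:

```python
# Drive a local Ollama model through LangChain's Ollama LLM wrapper.
from langchain_community.llms import Ollama

llm = Ollama(model="mistral")
print(llm.invoke("Summarize what LangChain is in two sentences."))
```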
llm is a CLI utility and Python library for interacting with Large Language Models, both via remote APIs and via models that can be installed and run on your own machine:
https://llm.datasette.io/en/stable/
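A minimal sketch of its Python API, assuming an OpenAI key has been configured (`llm keys set openai`) or a local-model plugin is installed; the model id below is only an example:

```python
# Prompt a model through the `llm` Python library (https://llm.datasette.io/).
import llm

model = llm.get_model("gpt-3.5-turbo")  # example model id; local plugin models also work
response = model.prompt("Three short names for a pet otter")
print(response.text())
```

The command-line equivalent is simply `llm "Three short names for a pet otter"`.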
Embeddings: What they are and why they matter: https://simonwillison.net/2023/Oct/23/embeddings/
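As a small illustration of the idea, here is a sketch that computes embeddings through Ollama's embeddings endpoint and compares two texts with cosine similarity; it assumes a local Ollama server with a model such as "mistral" pulled:

```python
# Embed two texts with a local Ollama model and compare them with cosine similarity.
# Assumes Ollama is running on localhost:11434 and "mistral" has been pulled.
import json
import math
import urllib.request

def embed(text: str, model: str = "mistral") -> list[float]:
    req = urllib.request.Request(
        "http://localhost:11434/api/embeddings",
        data=json.dumps({"model": model, "prompt": text}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Semantically similar texts should score close to 1.0.
print(cosine(embed("a small cat"), embed("a tiny kitten")))
```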
HN post about the embeddings blog entry above: https://news.ycombinator.com/item?id=37985489
HN post about Mistral; the comments contain useful info on LLM tooling: https://news.ycombinator.com/item?id=37675496
Using LLaMA 2.0, FAISS and LangChain for Question-Answering on Your Own Data: https://medium.com/@murtuza753/using-llama-2-0-faiss-and-langchain-for-question-answering-on-your-own-data-682241488476
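A rough sketch of the same pattern, substituting a local Ollama-served model for the article's LLaMA 2.0 setup; assumes `pip install langchain langchain-community faiss-cpu`, a running Ollama server with "mistral" pulled, and note that import paths vary across LangChain versions:

```python
# Question-answering over your own text with FAISS + LangChain,
# using a local Ollama model in place of LLaMA 2.0.
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.llms import Ollama
from langchain_community.vectorstores import FAISS
from langchain.chains import RetrievalQA

docs = [
    "Ollama runs large language models locally behind a REST API.",
    "FAISS is a library for efficient similarity search over dense vectors.",
]

# Index the documents, then answer questions against the retrieved chunks.
vectorstore = FAISS.from_texts(docs, OllamaEmbeddings(model="mistral"))
qa = RetrievalQA.from_chain_type(
    llm=Ollama(model="mistral"),
    retriever=vectorstore.as_retriever(),
)
print(qa.invoke({"query": "What does FAISS do?"})["result"])
```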
Use Whisper to transcribe podcasts: https://gist.github.com/jdmichaud/28503f16c52e7907931c1b3bd6c1817f
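For reference, a minimal transcription sketch with the openai-whisper package (the linked gist may take a different approach); requires `pip install openai-whisper` and ffmpeg on the PATH, and "episode.mp3" is a placeholder path:

```python
# Transcribe an audio file with OpenAI's Whisper model running locally.
import whisper

model = whisper.load_model("base")        # small, fast model; "medium"/"large" are more accurate
result = model.transcribe("episode.mp3")  # decodes the audio via ffmpeg and runs the model
print(result["text"])
```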
How to do Question/Answering using LangChain: langchain-ai/langchain#9121