/r/LocalLLaMA
Dear Microsoft: Can we get a Phi 3.1 Small with GQA? Or even the same context implementation as 3.1 Mini? Please and thank you.
I Created Promptimizer: An Automated AI-Powered Prompt Optimization Framework (now works with Ollama!)
How To Build Your Retrieval Augmented Generation (RAG) Using Open-source Tools: LangChain, LLAMA 3, and Chroma. A visual guide + Jupyter Notebook. 🧠
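For anyone who wants the gist of that RAG pipeline without opening the notebook, here is a minimal sketch (not the linked guide's code) assuming the langchain-community and langchain-text-splitters packages, a local Ollama server running a Llama 3 model, and a placeholder docs/example.txt file:

```python
# Minimal RAG sketch: load docs, chunk, embed into Chroma, retrieve, ask Llama 3.
# Assumes `ollama pull llama3` has been run and the Ollama server is up.
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import Chroma
from langchain_community.llms import Ollama

# 1. Load and chunk the source documents (path is a placeholder).
docs = TextLoader("docs/example.txt").load()
chunks = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50).split_documents(docs)

# 2. Embed the chunks and index them in a local Chroma vector store.
vectorstore = Chroma.from_documents(chunks, OllamaEmbeddings(model="llama3"))

# 3. Retrieve the top-k chunks for a question and build the grounding context.
question = "What does the document say about X?"
context = "\n\n".join(d.page_content for d in vectorstore.similarity_search(question, k=3))

# 4. Ask the local Llama 3 model, constrained to the retrieved context.
llm = Ollama(model="llama3")
print(llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}"))
```

The linked post walks through the same retrieve-then-generate flow in more detail, with Chroma persistence and prompt templates.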