StoryNote
/r/ollama
Prototype procedural chat interface (works with 6+ LLM chat APIs). • 4 upvotes • kleer001
Integrate with Office? • 4 upvotes • hwlim
can llama-guard be bypassed for llama3.1 • 3 upvotes • Expensive-Award1965
Any way to keep “Ollama ps” (process show) running as a live resource monitor? • 3 upvotes • Porespellar
Is it possible to keep the model in memory for custom time in cli? • 3 upvotes • Necessary-Donkey5574
Open WebUI and Continue.dev • 3 upvotes • AbleSugar
Self hosted Function-calling API service for multiple LLM providers including Ollama • 3 upvotes • leshiy-urban
Dataset is >30gb. Llama3 model is not giving proper result • 3 upvotes • BookkeeperDistinct59
Howto: Ollama Setup with distrobox Container OpenSUSE Tumbleweed / MicroOS / AeonDesktop with AMD-GPU over ROCm Support • 3 upvotes • torsten_online
RPG simulation for NPCs • 3 upvotes • TimMurrayKM
7 pages of results