StoryNote
How feasible is it to run your own local LLM as an assistant on a basic Mac Mini M2 with 8 GB of memory?
by /u/onturenio in /r/ollama
Upvotes: 12