
How can I queue requests to the LLM? Currently, if llama is processing a prompt and I give it another, it starts on the newer one and discards the previous one. How can I overcome this issue?

by /u/GAMION64 in /r/LocalLLaMA

Upvotes: 1
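A minimal sketch of the kind of queuing being asked about, assuming Python and a llama.cpp-style HTTP server exposing a /completion endpoint at localhost:8080 (the URL, field names, and single-worker design here are illustrative assumptions, not the original poster's setup):

```python
# Hypothetical sketch: serialize prompts to an assumed llama.cpp-style server
# at http://localhost:8080/completion using a queue and one worker thread,
# so a new prompt waits its turn instead of replacing the one in flight.
import queue
import threading
import requests

SERVER_URL = "http://localhost:8080/completion"  # assumed endpoint
prompt_queue: "queue.Queue[str]" = queue.Queue()

def worker() -> None:
    while True:
        prompt = prompt_queue.get()  # blocks until a prompt is available
        try:
            resp = requests.post(
                SERVER_URL,
                json={"prompt": prompt, "n_predict": 128},
                timeout=300,
            )
            resp.raise_for_status()
            print(resp.json().get("content", ""))
        except requests.RequestException as exc:
            print(f"request failed: {exc}")
        finally:
            prompt_queue.task_done()  # mark this prompt as handled

# A single worker means prompts are processed strictly one at a time, in order.
threading.Thread(target=worker, daemon=True).start()

# Enqueue prompts; the second waits until the first has finished.
prompt_queue.put("Explain what a context window is.")
prompt_queue.put("Summarize the previous answer in one sentence.")
prompt_queue.join()  # block until every queued prompt has been processed
```

The key idea is that the client never sends a second request while one is outstanding; the queue holds later prompts until the worker is free, so nothing gets discarded.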

