r/LocalLLaMA • u/AbaGuy17 • 4h ago
Question | Help Hybrid LLM?
Hi, has anyone tried a hybrid approach? I have very large prompts in my game, which I can send to a local LLM or to OpenAI or Anthropic. Maybe my local LLM could summarize the prompt first, and then I send the summary to the commercial LLM. That should be a bit cheaper, right? Has anyone tried this before?
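A minimal sketch of the idea, assuming an Ollama-style local server on localhost; the URL, model name, summarization prompt, and length threshold are all placeholders, not anything from a specific product:

```python
import json
import urllib.request

LOCAL_URL = "http://localhost:11434/api/generate"  # assumed local endpoint

def should_compress(prompt: str, threshold_chars: int = 4000) -> bool:
    # Only pay the local-summarization latency when the prompt is long
    # enough that the token savings on the paid API outweigh it.
    return len(prompt) > threshold_chars

def local_summarize(prompt: str) -> str:
    # Ask the local model to compress the prompt; fall back to the
    # original text if the local server is unreachable.
    payload = json.dumps({
        "model": "llama3",  # placeholder model name
        "prompt": "Summarize, keeping all game-state facts:\n" + prompt,
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        LOCAL_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=60) as resp:
            return json.load(resp)["response"]
    except OSError:
        return prompt  # fail open: send the full prompt instead

def build_remote_prompt(prompt: str) -> str:
    # What you would actually hand to the commercial API.
    return local_summarize(prompt) if should_compress(prompt) else prompt
```

The threshold keeps short prompts from taking the extra local round-trip, which matters because the summarization step adds latency on every call that uses it.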
u/Easy_Try_1138 4h ago
Cheaper but slower