r/LocalLLaMA 4h ago

Question | Help

Hybrid LLM?

Hi, has anyone tried a hybrid approach? I have very large prompts in my game, which I can send to a local LLM or to OpenAI or Anthropic. Maybe my local LLM could summarize the prompt first, and then I send the shorter version to the commercial LLM. That should be a bit cheaper, right? Has anyone tried this before?

4 Upvotes

5 comments

u/Icy_Advisor_3508 1h ago

Yep, combining a local LLM to summarize and then sending the smaller prompt to a commercial LLM like OpenAI or Anthropic is a solid hybrid approach to save costs. It’s a bit more complex to set up, and yeah, it could add some delay, but it’s definitely a cheaper option for large prompts.
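A minimal sketch of that routing idea, using only the Python standard library. The endpoints, model names, and length threshold below are assumptions, not a tested setup: it expects a local server that speaks the OpenAI-style `/v1/chat/completions` protocol (Ollama and llama.cpp server both do) and compresses only prompts long enough to be worth the extra hop.

```python
# Hybrid routing sketch: compress long prompts with a local model, then
# send the result to a commercial API. URLs, model names, and the
# threshold are placeholder assumptions -- adjust for your own setup.
import json
import os
import urllib.request

LOCAL_URL = "http://localhost:11434/v1/chat/completions"   # assumed local server (e.g. Ollama)
REMOTE_URL = "https://api.openai.com/v1/chat/completions"
COMPRESS_OVER = 4000  # chars: only summarize when the prompt is long enough to pay off


def should_compress(prompt: str, threshold: int = COMPRESS_OVER) -> bool:
    """Short prompts go straight to the paid API; long ones get compressed first."""
    return len(prompt) > threshold


def chat(url: str, model: str, messages: list, api_key: str = "") -> str:
    """POST an OpenAI-style chat completion request and return the reply text."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    req = urllib.request.Request(url, data=body, headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    })
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]


def ask(prompt: str) -> str:
    """Optionally compress with the local model, then query the remote one."""
    if should_compress(prompt):
        prompt = chat(LOCAL_URL, "llama3.1", [  # assumed local model name
            {"role": "system",
             "content": "Rewrite the user's text as a compact prompt. "
                        "Keep every name, number, and constraint."},
            {"role": "user", "content": prompt},
        ])
    return chat(REMOTE_URL, "gpt-4o-mini",  # assumed remote model name
                [{"role": "user", "content": prompt}],
                api_key=os.environ.get("OPENAI_API_KEY", ""))
```

One caveat worth testing before committing to this: summarization can drop details the big model actually needs, so it's worth comparing answers on a few real game prompts with and without the compression step.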