feat(ai): hybrid backend mode for agentic insights
Adds a `backend` column to photo_insights (default 'local', migration 2026-04-20-000000) and a corresponding optional `backend` field on the agentic request.

When a request sets backend=hybrid:

- The local Ollama vision model is called once via describe_image to produce a text description.
- The description is inlined into the first user message as text; no base64 image is ever sent to the chat model.
- The agentic tool-calling loop and title generation route through an OpenRouterClient (dispatched via &dyn LlmClient), letting the user pick any tool-capable model from OpenRouter per request.
- describe_photo is removed from the offered tools since the description is already present.

Embeddings and vision stay on local Ollama regardless of backend.

Hybrid mode requires OPENROUTER_API_KEY; handlers return a clear error when hybrid is requested without it, and also when the selected OpenRouter model lacks tool-calling support.

AppState gains an optional openrouter client built from OPENROUTER_API_KEY / OPENROUTER_BASE_URL / OPENROUTER_DEFAULT_MODEL / OPENROUTER_EMBEDDING_MODEL / attribution headers. Default model is anthropic/claude-sonnet-4.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
@@ -134,6 +134,7 @@ async fn main() -> anyhow::Result<()> {
     let generator = InsightGenerator::new(
         ollama,
         None,
         sms_client,
         insight_dao.clone(),
         exif_dao,
@@ -249,6 +250,7 @@ async fn main() -> anyhow::Result<()> {
         args.top_k,
         args.min_p,
         args.max_iterations,
         None,
     )
     .await
     {