refactor: introduce LlmClient trait (no-op)
Preparation for a second LLM backend (OpenRouter) and a hybrid vision-local / chat-remote mode.

Shared wire types (ChatMessage, Tool, ToolCall, etc.) move into a new src/ai/llm_client.rs and are re-exported from ollama.rs so existing imports keep working. OllamaClient now implements LlmClient.

No behavior change; callers still hold the concrete OllamaClient. Caller migration to Arc<dyn LlmClient> is deferred to the PR that wires hybrid backend routing.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
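The shape of the refactor can be sketched as below. This is a hypothetical reconstruction, not the actual code: the commit only names the trait and the wire types, so the `chat` method, its signature, the struct fields, and the synchronous return type are all assumptions made for illustration (the real client is likely async).

```rust
/// One message in a chat transcript -- a shared wire type that, per the
/// commit, now lives in src/ai/llm_client.rs. Fields here are assumed.
#[derive(Debug, Clone)]
pub struct ChatMessage {
    pub role: String,
    pub content: String,
}

/// Backend-agnostic LLM interface. The `chat` method and its signature
/// are illustrative guesses; the commit does not show the trait body.
pub trait LlmClient {
    fn chat(&self, messages: &[ChatMessage]) -> Result<ChatMessage, String>;
}

/// Existing concrete client; it now also implements LlmClient.
pub struct OllamaClient {
    pub base_url: String,
}

impl LlmClient for OllamaClient {
    fn chat(&self, messages: &[ChatMessage]) -> Result<ChatMessage, String> {
        // A real implementation would POST the messages to the Ollama
        // server; this stub only demonstrates the trait boundary.
        let _ = messages;
        Ok(ChatMessage {
            role: "assistant".to_string(),
            content: String::new(),
        })
    }
}

fn main() {
    // Callers keep the concrete type today; a later PR can switch them
    // to Arc<dyn LlmClient> to route between Ollama and OpenRouter.
    let client = OllamaClient {
        base_url: "http://localhost:11434".to_string(),
    };
    let reply = client
        .chat(&[ChatMessage {
            role: "user".to_string(),
            content: "hi".to_string(),
        }])
        .expect("chat failed");
    println!("{}", reply.role);
}
```

Because both OllamaClient and a future OpenRouter client would implement the same trait, hybrid routing (vision requests local, chat requests remote) reduces to choosing which `Arc<dyn LlmClient>` handles a given call.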
@@ -1,6 +1,7 @@
 pub mod daily_summary_job;
 pub mod handlers;
 pub mod insight_generator;
+pub mod llm_client;
 pub mod ollama;
 pub mod sms_client;
 
@@ -13,5 +14,9 @@ pub use handlers::{
     get_insight_handler, rate_insight_handler,
 };
 pub use insight_generator::InsightGenerator;
-pub use ollama::{ModelCapabilities, OllamaClient};
+#[allow(unused_imports)]
+pub use llm_client::{
+    ChatMessage, LlmClient, ModelCapabilities, Tool, ToolCall, ToolCallFunction, ToolFunction,
+};
+pub use ollama::OllamaClient;
 pub use sms_client::{SmsApiClient, SmsMessage};