feat(ai): USER_NAME env + shared summary prompt + test-bin knobs

Introduces USER_NAME (default "Me") as the single source for the message
sender label and the first-person persona across daily summaries, SMS
context, insight generation, and chat. Eliminates the "Me:" transcript /
"what I did" ambiguity that confused smaller models, and unhardcodes
"Cameron" from prompt text + the knowledge-graph owner entity. Set
USER_NAME=Cameron in .env to preserve the existing owner entity row
(keyed on UNIQUE(name, entity_type)) — otherwise the next run creates
a fresh owner entity and orphans the existing facts/photo-links.
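The orphaning risk follows from the uniqueness key alone. A toy in-memory sketch of the owner lookup (the `EntityStore` type and `get_or_create` helper are illustrative, not this repo's actual API; only the `UNIQUE(name, entity_type)` key comes from the text above):

```rust
use std::collections::HashMap;

type EntityId = u64;

/// Toy stand-in for the knowledge-graph entity table. The real table
/// enforces UNIQUE(name, entity_type); here that pair is simply the map key.
struct EntityStore {
    next_id: EntityId,
    by_key: HashMap<(String, String), EntityId>, // (name, entity_type) -> id
}

impl EntityStore {
    fn new() -> Self {
        Self { next_id: 1, by_key: HashMap::new() }
    }

    /// The same (name, entity_type) pair always resolves to the same row;
    /// any new name mints a fresh id, leaving rows that reference the old
    /// id (facts, photo links) attached to an entity nothing looks up.
    fn get_or_create(&mut self, name: &str, entity_type: &str) -> EntityId {
        let key = (name.to_string(), entity_type.to_string());
        if let Some(&id) = self.by_key.get(&key) {
            return id;
        }
        let id = self.next_id;
        self.next_id += 1;
        self.by_key.insert(key, id);
        id
    }
}
```

With `USER_NAME` left at the default, `get_or_create("Me", "person")` mints a new id even though a `("Cameron", "person")` row already holds the accumulated facts, which is exactly the orphaning described above.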

Also:
- search_messages redirect: when the model calls it with date/contact
  but no query, return a hint pointing at get_sms_messages instead of
  a bare missing-parameter error (prevents same-turn retry loops)
- sharpen search_messages vs get_sms_messages tool descriptions so
  content-vs-time-based intent is unambiguous
- extract build_daily_summary_prompt (+ DAILY_SUMMARY_MESSAGE_LIMIT,
  DAILY_SUMMARY_SYSTEM_PROMPT) shared by daily_summary_job and
  test_daily_summary binary — prompt tweaks now land in both
- EMBEDDING_MODEL const; fixes both insert sites that stored
  "mxbai-embed-large:335m" while generate_embeddings actually runs
  "nomic-embed-text:v1.5"
- test_daily_summary: add --num-ctx / --temperature / --top-p /
  --top-k / --min-p flags wired into OllamaClient setters, and print
  the configured knobs at the top of each run
- OllamaClient::generate now logs prompt/gen token counts and tok/s
  via log_chat_metrics (symmetric with chat_with_tools)
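The search_messages redirect in the first bullet amounts to a guard ahead of the usual missing-parameter path. A minimal sketch (function name, argument shape, and wording are assumptions, not the repo's actual tool handler):

```rust
/// Validate a search_messages call before dispatch. A date/contact call
/// with no content query gets a hint toward get_sms_messages rather than
/// a bare error the model would retry against in the same turn.
fn search_messages_guard(
    query: Option<&str>,
    date: Option<&str>,
    contact: Option<&str>,
) -> Result<(), String> {
    match query {
        Some(q) if !q.trim().is_empty() => Ok(()),
        _ if date.is_some() || contact.is_some() => Err(
            "search_messages needs a content query; to fetch messages by \
             date or contact, call get_sms_messages instead."
                .to_string(),
        ),
        _ => Err("missing required parameter: query".to_string()),
    }
}
```

Returning the hint as the tool result gives the model an actionable next step, so it switches tools instead of re-issuing the same malformed call.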

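For the tok/s figure in the last bullet: Ollama's generate/chat responses report `eval_count` (generated tokens) and `eval_duration` (nanoseconds), so the throughput math reduces to a one-liner. What `log_chat_metrics` does with the number beyond this is this repo's detail and is not shown:

```rust
/// Tokens per second from Ollama's eval_count / eval_duration (ns) pair.
/// Guards against a zero duration, which Ollama can report for empty runs.
fn tokens_per_second(eval_count: u64, eval_duration_ns: u64) -> f64 {
    if eval_duration_ns == 0 {
        return 0.0;
    }
    eval_count as f64 * 1e9 / eval_duration_ns as f64
}
```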
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Author: Cameron
Date: 2026-04-22 23:39:37 -04:00
Commit: 6831f50993 (parent: e4a3536f87)
6 changed files with 226 additions and 156 deletions

@@ -9,7 +9,10 @@ pub mod sms_client;
 // strip_summary_boilerplate is used by binaries (test_daily_summary), not the library
 #[allow(unused_imports)]
-pub use daily_summary_job::{generate_daily_summaries, strip_summary_boilerplate};
+pub use daily_summary_job::{
+    DAILY_SUMMARY_MESSAGE_LIMIT, DAILY_SUMMARY_SYSTEM_PROMPT, build_daily_summary_prompt,
+    generate_daily_summaries, strip_summary_boilerplate,
+};
 pub use handlers::{
     chat_history_handler, chat_rewind_handler, chat_stream_handler, chat_turn_handler,
     delete_insight_handler, export_training_data_handler, generate_agentic_insight_handler,
@@ -21,5 +24,14 @@ pub use insight_generator::InsightGenerator;
 pub use llm_client::{
     ChatMessage, LlmClient, ModelCapabilities, Tool, ToolCall, ToolCallFunction, ToolFunction,
 };
-pub use ollama::OllamaClient;
+pub use ollama::{EMBEDDING_MODEL, OllamaClient};
 pub use sms_client::{SmsApiClient, SmsMessage};
+
+/// Display name used for the user in message transcripts and first-person
+/// prompt text. Reads the `USER_NAME` env var; defaults to `"Me"`. Models
+/// often confuse `"Me:"` in a transcript with their own role — setting
+/// `USER_NAME=Cameron` (or similar) in the environment eliminates that
+/// ambiguity across daily summaries, insight generation, and chat.
+pub fn user_display_name() -> String {
+    std::env::var("USER_NAME").unwrap_or_else(|_| "Me".to_string())
+}