Turn noisy search/trend data into clean entity intelligence that other agents can reason about.
This is not a raw data API: it should answer higher-level questions about entities and attention (how much interest an entity is getting, how that is changing) rather than just proxying provider responses.
The project now keeps a SQLite snapshot history so we can inspect how related-query/topic surfaces change over time, before wiring anything into news-mcp.
Current setup:

- pytrends is the first provider.
- The server listens on 0.0.0.0:8507 at /mcp.
- run.sh starts the server; killserver.sh stops the PID stored in server.pid; restart.sh performs kill + run only.

Exposed tools:

- get_interest_over_time(keyword, timeframe)
- compare_interest(keywords, timeframe)
- get_attention_score(entity, timeframe)
- resolve_entity(keyword)
- get_related_queries(keyword)
- get_related_topics(keyword)
- get_ledger_recent(limit)
- get_ledger_summary(limit)
- get_entity_history(entity, limit)
- prune_history(retention_days)

History is stored in a single SQLite table (snapshots) under data/trends_history.db.
Each row stores the full normalized tool payload so later analysis can diff change over time,
instead of only counting that a lookup happened.
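A minimal sketch of that storage pattern, assuming a simple schema (the column names here are illustrative, not necessarily what data/trends_history.db actually uses): the full normalized payload goes in as JSON, so the two most recent rows for an entity/tool pair can be diffed later.

```python
import json
import sqlite3

# Assumed schema for the snapshots table; the real store may differ.
SCHEMA = """
CREATE TABLE IF NOT EXISTS snapshots (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    entity      TEXT NOT NULL,
    tool        TEXT NOT NULL,
    fetched_at  TEXT NOT NULL,   -- ISO-8601 UTC timestamp
    payload     TEXT NOT NULL    -- full normalized tool payload as JSON
)
"""

def save_snapshot(conn, entity, tool, fetched_at, payload):
    # Store the whole payload, not just a hit counter, so later
    # analysis can diff what actually changed between lookups.
    conn.execute(
        "INSERT INTO snapshots (entity, tool, fetched_at, payload) "
        "VALUES (?, ?, ?, ?)",
        (entity, tool, fetched_at, json.dumps(payload, sort_keys=True)),
    )
    conn.commit()

def last_two_payloads(conn, entity, tool):
    # Newest first; returns up to two payloads for diffing.
    rows = conn.execute(
        "SELECT payload FROM snapshots WHERE entity = ? AND tool = ? "
        "ORDER BY fetched_at DESC LIMIT 2",
        (entity, tool),
    ).fetchall()
    return [json.loads(r[0]) for r in rows]
```

Sorting keys on serialization keeps the stored JSON stable, so a textual diff of two payloads only shows real changes.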
Pruning is automatic: the history store checks once per day whether retention cleanup is due, and removes rows older than the configured retention window.
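The once-per-day gate can be sketched like this (names and the "last pruned" marker are assumptions for illustration; the real store may track this differently):

```python
import sqlite3
from datetime import datetime, timedelta, timezone

class HistoryPruner:
    """Sketch of retention cleanup that runs at most once per day."""

    def __init__(self, conn, retention_days=30):
        self.conn = conn
        self.retention_days = retention_days
        self._last_prune_date = None  # calendar date of the last cleanup

    def maybe_prune(self, now=None):
        # Skip if a cleanup already ran today.
        now = now or datetime.now(timezone.utc)
        if self._last_prune_date == now.date():
            return 0
        # Delete rows older than the retention window; ISO-8601
        # timestamps compare correctly as strings.
        cutoff = (now - timedelta(days=self.retention_days)).isoformat()
        cur = self.conn.execute(
            "DELETE FROM snapshots WHERE fetched_at < ?", (cutoff,)
        )
        self.conn.commit()
        self._last_prune_date = now.date()
        return cur.rowcount
```

Gating on the calendar date keeps the check cheap enough to run on every write without issuing a DELETE each time.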
The right implementation style here is boring on purpose: