Provide free, normalized entity intelligence.
👉 This is not a paid trend terminal. 👉 It’s an entity resolution + attention-context engine.
The most reliable value in this project is no longer historical trend curves; it is entity normalization, topic disambiguation, and related-query discovery.
If Google refuses time-series requests, the MCP should still remain useful.
Markets often move after attention increases, so this MCP helps answer: “Where is attention moving?”
Providers:
- Primary: pytrends
- Optional later:
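pytrends is an unofficial client, so rate limits and breakage are expected. A minimal sketch of how the primary provider could be wired up; the `to_pytrends_timeframe` helper and its timeframe codes are hypothetical conventions for this MCP, while the `TrendReq` / `build_payload` / `interest_over_time` calls are the real pytrends API:

```python
def to_pytrends_timeframe(code):
    """Map this MCP's (assumed) short timeframe codes to the strings
    Google Trends / pytrends expects. The code set is an assumption."""
    mapping = {
        "1d": "now 1-d",
        "7d": "now 7-d",
        "30d": "today 1-m",
        "90d": "today 3-m",
        "12m": "today 12-m",
    }
    return mapping.get(code, "today 3-m")  # conservative default


def fetch_interest_over_time(keyword, code):
    """Fetch a relative-interest series via pytrends (pip install pytrends).
    Defined but not called here, to avoid live network traffic."""
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload([keyword], timeframe=to_pytrends_timeframe(code))
    df = pytrends.interest_over_time()  # DataFrame indexed by timestamp
    return df[keyword].tolist()
```

Keeping the timeframe mapping separate means a second provider can be swapped in later without changing tool inputs.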
Google Trends data is relative (0–100 within each request), sampled, and rescaled per query batch, not absolute search volume.
Normalize across:
Techniques:
Map tickers and aliases to canonical search terms (e.g. “BTC” → “bitcoin”).
👉 This is essential for consistency with your other MCPs
Cache trend series:
- Key: trends:{keyword}:{timeframe}
- TTL:
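A minimal in-memory sketch of this cache using the key scheme above; the one-hour default TTL is an assumption (the actual TTL is unspecified), and a real deployment might use Redis instead:

```python
import time


class TrendCache:
    """Minimal in-memory TTL cache keyed as trends:{keyword}:{timeframe}."""

    def __init__(self, ttl_seconds=3600):  # default TTL is an assumption
        self.ttl = ttl_seconds
        self._store = {}

    def key(self, keyword, timeframe):
        return f"trends:{keyword}:{timeframe}"

    def set(self, keyword, timeframe, value):
        expires_at = time.monotonic() + self.ttl
        self._store[self.key(keyword, timeframe)] = (value, expires_at)

    def get(self, keyword, timeframe):
        """Return the cached series, or None if missing or expired."""
        k = self.key(keyword, timeframe)
        entry = self._store.get(k)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:
            del self._store[k]  # lazily evict on read
            return None
        return value
```

Caching on the {keyword, timeframe} pair keeps repeat tool calls from hammering Google and tripping rate limits.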
Keep them interpretive, not raw
get_interest_over_time: “Show attention trend”

Input:
{
  "keyword": "bitcoin",
  "timeframe": "7d"
}

Output:
{
  "series": [12, 18, 25, 40, 65, 80],
  "trend": "rising"
}
👉 Include simple interpretation (“rising”, “falling”, “flat”)
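One way to derive that interpretation is to compare the start of the series against the end. This `classify_trend` helper is an illustrative sketch; the one-third comparison windows and the 10% flat threshold are assumptions, not project requirements:

```python
def classify_trend(series, flat_threshold=0.1):
    """Label a series 'rising', 'falling', or 'flat' by comparing the
    mean of the last third against the mean of the first third."""
    if len(series) < 2:
        return "flat"
    third = max(1, len(series) // 3)
    start = sum(series[:third]) / third
    end = sum(series[-third:]) / third
    change = (end - start) / max(start, 1e-9)  # relative change vs start
    if change > flat_threshold:
        return "rising"
    if change < -flat_threshold:
        return "falling"
    return "flat"
```

Using window means rather than single endpoints makes the label less sensitive to one-day spikes in noisy data.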
compare_interest: “Which topic is hotter?”

Input:
{
  "keywords": ["bitcoin", "ethereum"],
  "timeframe": "7d"
}

Output:
{
  "winner": "bitcoin",
  "ratios": {
    "bitcoin": 1.0,
    "ethereum": 0.72
  }
}
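The ratios could be computed from per-keyword mean interest within the same request batch (so the relative scale is shared). This sketch assumes that approach; the rounding and tie handling are illustrative:

```python
def compare_interest(series_by_keyword):
    """Pick the keyword with the highest mean interest and express every
    keyword's mean as a ratio of the winner's."""
    means = {kw: sum(s) / len(s) for kw, s in series_by_keyword.items()}
    winner = max(means, key=means.get)
    top = means[winner] or 1e-9  # guard against an all-zero winner
    ratios = {kw: round(m / top, 2) for kw, m in means.items()}
    return {"winner": winner, "ratios": ratios}
```

Because Google Trends rescales values per batch, this comparison is only meaningful when all keywords came from a single request.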
detect_trending_entities: “What is gaining attention?”

Input:
{
  "category": "crypto"
}

Output:
[
  {
    "entity": "Solana",
    "trend_score": 0.91,
    "velocity": "high"
  }
]
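One plausible way to produce this shape is to score each entity by recent attention relative to its own baseline. Everything below is a sketch: the half-split windows, the 3x saturation point, and the velocity thresholds (0.8 / 0.5) are assumptions:

```python
def detect_trending_entities(series_by_entity, high=0.8, medium=0.5):
    """Rank entities by how much their recent attention exceeds their
    earlier baseline, mapping growth into a 0-1 trend_score."""
    results = []
    for entity, series in series_by_entity.items():
        half = max(1, len(series) // 2)
        baseline = sum(series[:half]) / half or 1e-9
        recent = sum(series[half:]) / max(1, len(series) - half)
        growth = recent / baseline                       # >1 means gaining
        trend_score = round(min(growth / 3.0, 1.0), 2)   # saturate at 3x baseline
        if trend_score >= high:
            velocity = "high"
        elif trend_score >= medium:
            velocity = "medium"
        else:
            velocity = "low"
        results.append(
            {"entity": entity, "trend_score": trend_score, "velocity": velocity}
        )
    return sorted(results, key=lambda r: -r["trend_score"])
```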
get_related_queries: “What are people searching around this?”

Input:
{
  "keyword": "bitcoin"
}

Output:
[
  "bitcoin etf",
  "bitcoin price prediction",
  "btc news"
]
get_attention_score (VERY useful): “How much attention does X have right now?”

Input:
{
  "entity": "BTC"
}

Output:
{
  "score": 0.78,
  "relative_to_baseline": 1.4
}
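A minimal way to produce this output shape, assuming a 0–100 input series (the Google Trends scale) and arbitrary window sizes; both window defaults are assumptions:

```python
def attention_score(series, baseline_window=30, recent_window=7):
    """Score recent attention (0-1) and express it relative to a
    trailing baseline window, matching the tool's output shape."""
    recent = series[-recent_window:]
    baseline = series[:-recent_window][-baseline_window:] or recent
    recent_mean = sum(recent) / len(recent)
    baseline_mean = sum(baseline) / len(baseline) or 1e-9
    return {
        "score": round(min(recent_mean / 100.0, 1.0), 2),
        "relative_to_baseline": round(recent_mean / baseline_mean, 2),
    }
```

The `relative_to_baseline` field is the more actionable number: 1.4 means attention is 40% above its recent norm, regardless of the entity's absolute popularity.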
Avoid raw pass-through tools.
❌ Bad: get_raw_trends(keyword)
Trend values are relative, not absolute.
👉 If you skip normalization, your MCP becomes misleading.
Example:
👉 You need:
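Because Google Trends rescales values within each request batch, numbers from two separate batches are not directly comparable. A common workaround is to include a shared anchor keyword in both batches and rescale one batch against the other; the sketch below assumes that approach (the helper name and rounding are illustrative):

```python
def rescale_batches(batch_a, batch_b, anchor):
    """Rescale batch_b onto batch_a's scale, assuming both batches
    (keyword -> series dicts) were fetched with a shared anchor keyword."""
    # Scale factor: how the anchor's peak differs between the two batches.
    factor = max(batch_a[anchor]) / (max(batch_b[anchor]) or 1)
    return {kw: [round(v * factor, 1) for v in s] for kw, s in batch_b.items()}
```

After rescaling, keywords from both batches share one relative scale, so ratios and winners remain meaningful.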
Some related queries are noisy or irrelevant.
👉 Filter aggressively.
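A hedged sketch of what aggressive filtering might look like, assuming simple heuristics (case-folded deduplication, dropping echoes of the keyword itself, dropping very short strings); the rules and the `min_len` default are assumptions:

```python
def filter_related_queries(queries, keyword, min_len=3):
    """Clean a raw related-query list: dedupe, drop trivial echoes of
    the keyword, and drop very short noise strings."""
    seen, out = set(), []
    for q in queries:
        q = q.strip().lower()
        if len(q) < min_len or q == keyword.lower() or q in seen:
            continue
        seen.add(q)
        out.append(q)
    return out
```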
Raw trend data is weak on its own; value comes from interpretation: trend direction, relative comparisons, and attention scores.
This is your early-warning system.
Combined with your other MCPs:
Trends MCP → “Ethereum ETF” searches spiking
News MCP → few articles yet
Crypto MCP → price still flat
👉 This is a pre-news signal.
Each tool should answer: “Where is attention moving?”
👉 Normalization improvements can come later.