TrendRadar (sansan0/TrendRadar) is an open-source trend intelligence tool for developers and researchers who need to track public opinion and emerging topics across Chinese and global platforms simultaneously. It monitors 11+ sources including Zhihu, Weibo, Bilibili, Hacker News, and RSS feeds; filters content by user-defined keywords; runs LLM-powered analysis via LiteLLM on matched items; and pushes summaries to 9+ notification channels: Telegram, Slack, WeChat (WeCom), Feishu, DingTalk, Email, and more. A built-in MCP server exposes the aggregated trend data to any AI assistant for natural-language queries, so you can ask Claude or ChatGPT directly about what's trending in a topic. It supports Docker, GitHub Actions (with cloud storage), and direct Python execution, and sits at 45k+ GitHub stars, with a peak velocity of 584 stars/day in April 2026.
TrendRadar addresses the signal-to-noise problem in trend monitoring: most tools cover either global English sources or Chinese platforms, and none run LLM analysis on filtered results before delivery. TrendRadar does all three (multi-platform aggregation, keyword-based filtering, and AI-generated briefings), then pushes the output to wherever your team already receives alerts.
The MCP integration is what makes it stand out for AI practitioners. Once TrendRadar’s MCP server is connected to Claude Code or another MCP-compatible client, you can query the aggregated trend data conversationally: “What’s trending about AI agents on Zhihu this week?” and get structured answers from live data.
Multi-platform aggregation: Monitors Zhihu hot questions, Weibo trending, Bilibili trending videos, Hacker News front page, Reddit rising posts, and arbitrary RSS feeds from a single configuration file. New platform adapters can be added without modifying core logic.
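As a sketch of what such a pluggable adapter layer might look like (the class and field names here are hypothetical illustrations, not TrendRadar's actual API):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class TrendItem:
    """One trending entry normalized across platforms."""
    platform: str
    title: str
    url: str
    rank: int


class PlatformAdapter(ABC):
    """Each source implements fetch(); core logic never changes."""
    name: str

    @abstractmethod
    def fetch(self) -> list[TrendItem]:
        ...


class HackerNewsAdapter(PlatformAdapter):
    name = "hackernews"

    def fetch(self) -> list[TrendItem]:
        # A real adapter would call the HN API; stubbed for illustration.
        return [TrendItem("hackernews", "Example story", "https://example.com", 1)]


def collect(adapters: list[PlatformAdapter]) -> list[TrendItem]:
    """Aggregate items from every configured adapter."""
    items: list[TrendItem] = []
    for adapter in adapters:
        items.extend(adapter.fetch())
    return items
```

Adding a new platform then means writing one adapter class and listing it in configuration, which matches the "without modifying core logic" claim.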
Keyword filtering: Define topic watchlists (keywords, phrases, or regex patterns). TrendRadar filters the raw firehose before LLM analysis runs — keeping API costs proportional to relevance, not volume.
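A minimal sketch of this kind of pre-LLM watchlist filter, using regex patterns (the watchlist entries below are made up for illustration):

```python
import re


def compile_watchlist(patterns: list[str]) -> list[re.Pattern]:
    """Turn keyword/phrase/regex strings into case-insensitive patterns."""
    return [re.compile(p, re.IGNORECASE) for p in patterns]


def matches(title: str, watchlist: list[re.Pattern]) -> bool:
    """True if any watchlist pattern occurs in the title."""
    return any(p.search(title) for p in watchlist)


watchlist = compile_watchlist([r"AI agents?", r"大模型", r"\bLLM\b"])
titles = ["New AI agent framework released", "Weather today", "开源大模型进展"]

# Only matching items proceed to the (paid) LLM stage.
hits = [t for t in titles if matches(t, watchlist)]
```

Because filtering runs before any model call, API spend scales with the number of matches rather than the size of the raw firehose.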
LLM analysis pipeline: Matched items pass through LiteLLM to any connected model (OpenAI, Anthropic, local Ollama). The model generates a structured briefing: topic summary, sentiment analysis, trend trajectory. Results include AI translation for cross-language coverage.
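LiteLLM's provider-agnostic `completion()` call is the real interface here; the prompt packaging below is a hypothetical sketch of how matched items could be assembled into a structured-briefing request:

```python
import json


def build_briefing_messages(items: list[dict]) -> list[dict]:
    """Package matched items into chat messages asking for a structured
    briefing. The prompt wording is illustrative, not TrendRadar's actual prompt."""
    payload = json.dumps(
        [{"platform": i["platform"], "title": i["title"]} for i in items],
        ensure_ascii=False,
    )
    return [
        {
            "role": "system",
            "content": (
                "You are a trend analyst. Return JSON with keys: "
                "summary, sentiment, trajectory. Translate non-English titles."
            ),
        },
        {"role": "user", "content": payload},
    ]


messages = build_briefing_messages(
    [{"platform": "hackernews", "title": "AI agent frameworks surge"}]
)

# With LiteLLM, any configured provider works behind one call (needs an API key):
#   import litellm
#   resp = litellm.completion(model="gpt-4o-mini", messages=messages)
#   briefing = resp.choices[0].message.content
```

Swapping `model` to an Anthropic or local Ollama identifier is all it takes to change providers, which is the point of routing through LiteLLM.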
Push delivery: Configure notification channels per alert rule. High-priority signals go to Telegram immediately; daily digests go to Email; team alerts go to Feishu or WeCom. The 9+ channel integrations cover both global and China-based collaboration tools.
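Per-rule routing of this kind can be sketched as a small priority-to-channel map (the rule names and channel lists are illustrative, not TrendRadar's config schema):

```python
from dataclasses import dataclass


@dataclass
class AlertRule:
    priority: str
    channels: list[str]


# Hypothetical routing table: high-priority to Telegram, digests to email,
# team alerts to China-based collaboration tools.
ROUTES = {
    "high": AlertRule("high", ["telegram"]),
    "digest": AlertRule("digest", ["email"]),
    "team": AlertRule("team", ["feishu", "wecom"]),
}


def route(priority: str) -> list[str]:
    """Pick delivery channels for an alert; unknown priorities fall back to digest."""
    return ROUTES.get(priority, ROUTES["digest"]).channels
```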
MCP conversational interface: The built-in MCP server exposes TrendRadar’s database to any MCP-compatible AI assistant. Analysts can query trend data in natural language without building a separate reporting layer.
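A rough sketch of the kind of query tool such an MCP server might expose (the function name, parameters, and sample data are assumptions; the commented-out wiring uses the FastMCP helper from the official MCP Python SDK):

```python
from typing import Optional

# Stand-in for TrendRadar's stored trend data.
DB = [
    {"platform": "zhihu", "title": "AI agents 落地案例讨论"},
    {"platform": "hackernews", "title": "Show HN: agent framework"},
    {"platform": "weibo", "title": "明星热搜"},
]


def search_trends(keyword: str, platform: Optional[str] = None) -> list[dict]:
    """Return stored trend items whose title contains the keyword,
    optionally restricted to one platform."""
    return [
        item
        for item in DB
        if keyword.lower() in item["title"].lower()
        and (platform is None or item["platform"] == platform)
    ]


# With the official MCP Python SDK, a function like this would be
# registered roughly as:
#   from mcp.server.fastmcp import FastMCP
#   mcp = FastMCP("trendradar")
#   mcp.tool()(search_trends)
#   mcp.run()
```

Once registered, an MCP client such as Claude Code can invoke the tool from a natural-language question, which is what removes the need for a separate reporting layer.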
docker compose up -d # recommended — includes scheduler and MCP server
GitHub Actions mode runs on a schedule without local infrastructure — trend data is stored in cloud storage (S3-compatible). Python mode gives maximum flexibility for custom integrations.
Researchers, analysts, and developers who need cross-platform trend intelligence with AI-generated summaries — especially for monitoring both Chinese and global AI/tech discussions simultaneously. Also useful for teams building agent workflows that need live trend data as a context source via MCP.