Show HN: I built a personal AI news curator to filter RSS feeds (n8n and OpenAI)

Hi HN,

I found myself wasting too much time doomscrolling tech news and RSS feeds, scanning hundreds of headlines just to find the three or four items that actually mattered to my work. To fix this, I built a self-hosted automation workflow in n8n that acts as a personal editor.

The Architecture (rough code sketches of the main steps are at the end of the post):

Ingest: Pulls RSS feeds (TechCrunch, Hacker News, etc.) every morning.

Filter (the agent): Passes the headlines to GPT-4o-mini with a system prompt to "act as a senior editor." It scores each article 0-10 against my stated interests (e.g., "high interest in local LLMs," "low interest in crypto gossip").

Logic: Discards anything with a score below 7.

Research: Uses the Tavily API to scrape and summarize the full content of the high-scoring articles.

Delivery: Sends a single, clean email digest via SMTP.

The Hardest Part (SSE and Timeouts):

The biggest technical hurdle was timeouts. The AI research step runs long enough that the HTTP requests would often drop mid-execution. I had to configure Server-Sent Events (SSE) and raise the execution-timeout environment variables on the Node.js process to keep the connection alive during the deep-dive research phase.

Resources:

Workflow/Source (JSON): https://github.com/sojojp-hue/NewsSummarizer/tree/main

Video Walkthrough & Demo: https://youtu.be/mOnbK6DuFhc

I'd love to hear how others are handling information overload, or whether there are better ways to handle long-polling for AI agents.
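For anyone who wants to replicate the filter step outside n8n, here's a minimal TypeScript sketch of the scoring call. The prompt wording, the scoreHeadlines helper, and the JSON-array reply format are my own illustration of the approach, not the exact nodes from the workflow; it assumes OPENAI_API_KEY is set.

    // Sketch of the "senior editor" scoring step (names and prompt are illustrative).
    interface Article { title: string; link: string; }
    interface Scored extends Article { score: number; }

    const INTERESTS = 'High interest in local LLMs. Low interest in crypto gossip.';

    async function scoreHeadlines(articles: Article[]): Promise<Scored[]> {
      const res = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'gpt-4o-mini',
          messages: [
            {
              role: 'system',
              content:
                `Act as a senior editor. Score each headline 0-10 for relevance to these interests: ${INTERESTS} ` +
                'Reply with a JSON array of numbers, one per headline, in order.',
            },
            { role: 'user', content: articles.map((a, i) => `${i + 1}. ${a.title}`).join('\n') },
          ],
        }),
      });
      const data = await res.json();
      // Trusting the model to follow the format here; production code should validate.
      const scores: number[] = JSON.parse(data.choices[0].message.content);
      return articles.map((a, i) => ({ ...a, score: scores[i] ?? 0 }));
    }

    // Keep only high-signal items (the workflow discards anything scoring below 7).
    const keep = (scored: Scored[]) => scored.filter((a) => a.score >= 7);

In n8n this lives in an OpenAI node plus an IF node on the score; the sketch is just the same logic flattened into code.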
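The research step is basically one HTTP call per surviving article. This sketch follows Tavily's public REST search endpoint; the exact fields (search_depth, include_answer, etc.) are from their docs as I remember them, so double-check before copying:

    // Sketch of the research step: fetch content/summary for one high-scoring article.
    async function research(article: Scored): Promise<string> {
      const res = await fetch('https://api.tavily.com/search', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          api_key: process.env.TAVILY_API_KEY,
          query: article.title,
          search_depth: 'advanced', // deeper crawl of the source pages
          include_answer: true,     // ask Tavily for a synthesized summary
          max_results: 3,
        }),
      });
      const data = await res.json();
      // Fall back to concatenated result snippets if no synthesized answer came back.
      return data.answer ?? data.results.map((r: { content: string }) => r.content).join('\n\n');
    }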
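Delivery is plain SMTP. A sketch with nodemailer, where the host, addresses, and credentials are placeholders:

    // Delivery sketch: send the assembled digest as one email over SMTP.
    import nodemailer from 'nodemailer';

    async function sendDigest(html: string): Promise<void> {
      const transport = nodemailer.createTransport({
        host: 'smtp.example.com', // placeholder SMTP server
        port: 587,
        auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
      });
      await transport.sendMail({
        from: 'digest@example.com',
        to: 'me@example.com',
        subject: 'Morning news digest',
        html, // the single, clean digest built from the summaries
      });
    }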
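And the timeout fix, for anyone hitting the same wall: besides raising n8n's execution-timeout env vars (EXECUTIONS_TIMEOUT / EXECUTIONS_TIMEOUT_MAX, in seconds), streaming the model output over SSE keeps bytes moving so nothing in between kills the connection as idle. A generic sketch of consuming an OpenAI stream (naive line parsing; a real client buffers partial chunks):

    // Streaming keeps traffic flowing over the connection during the long research phase.
    async function streamCompletion(prompt: string): Promise<string> {
      const res = await fetch('https://api.openai.com/v1/chat/completions', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        },
        body: JSON.stringify({
          model: 'gpt-4o-mini',
          stream: true, // the API then replies with SSE: "data: {...}\n\n" chunks
          messages: [{ role: 'user', content: prompt }],
        }),
      });

      let text = '';
      const reader = res.body!.getReader();
      const decoder = new TextDecoder();
      for (;;) {
        const { done, value } = await reader.read();
        if (done) break;
        // Each SSE event line looks like `data: {json}` until the final `data: [DONE]`.
        for (const line of decoder.decode(value, { stream: true }).split('\n')) {
          if (!line.startsWith('data: ') || line.includes('[DONE]')) continue;
          const delta = JSON.parse(line.slice(6)).choices?.[0]?.delta?.content;
          if (delta) text += delta;
        }
      }
      return text;
    }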