Pattern 3: Newsletter Aggregation
Module 4 · Section 4 of 6
What it does: Pulls items from several RSS feeds on a schedule, filters for relevance, and compiles a list of links for review.
Why it exists: I read a lot of sources. Rather than checking each manually, I want a daily digest of items that match a set of topics.
The nodes:
- Schedule trigger — Daily at 7:00.
- RSS Read node (multiple) — One per feed. n8n's RSS Read node fetches feed items and outputs them as individual objects. I merge the outputs from multiple RSS nodes using a Merge node.
- Limit node — RSS feeds can return many items. I limit to the 20 most recent across all feeds to keep processing time reasonable.
- Basic LLM Chain — For each item, passes the title and description to a model with a prompt: "The following is an RSS item title and description. Topics I care about: AI tools, automation, self-hosting, developer tools, security. Is this relevant? Reply with only: yes or no."
- IF node — Routes items where the classification returned "yes" to the output branch.
- Aggregate node — Collects all the relevant items into a single list.
- Code node — Formats the list as a readable block of text with titles and links.
- Telegram node — Sends the digest.
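The classification step above can be fragile if the model replies with anything beyond a bare "yes". As a sketch, a small Code node placed before the IF node (or the IF node's own expression) could normalize the reply first — the `classification` field name here is an assumption, not the actual output field of your LLM Chain:

```javascript
// Sketch: normalize the LLM's reply before routing on it.
// The prompt asks for "yes" or "no", but model replies sometimes carry
// whitespace, capitalization, or trailing punctuation — strip those first.
function isRelevant(reply) {
  return reply.toLowerCase().replace(/[^a-z]/g, '') === 'yes';
}

// Filter a batch of items by their attached classification field
// (the field name `classification` is an assumption — match it to
// whatever your LLM Chain node actually outputs).
function filterRelevant(items) {
  return items.filter(item => isRelevant(item.classification));
}
```

With this in place, the IF node only ever sees a clean boolean, so a reply like "Yes." or " yes\n" doesn't silently drop a relevant item.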
What to adapt: Your feeds, your topics of interest, your classification prompt. The code node formatting is arbitrary — you could output Markdown, plain text, or HTML depending on your destination.
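As one example of that adaptation, a Code node emitting a Markdown digest might look like this sketch — the `title` and `link` field names are assumptions, so match them to your feed items:

```javascript
// Sketch: format aggregated RSS items as a Markdown link list.
// Assumes each item has `title` and `link` fields — adjust to your feeds.
function formatMarkdownDigest(items) {
  return items.map(item => `- [${item.title}](${item.link})`).join('\n');
}
```

If you send this through Telegram, remember to set the message's parse mode to Markdown, or the brackets will arrive as literal text.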
Note on cost: Running an LLM call per RSS item on a daily schedule adds up. For this workflow I use a local Ollama model. Haiku is also cheap enough that the cost is negligible for a personal workflow.
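A quick back-of-envelope check makes the "adds up" claim concrete. The per-token price below is a placeholder, not a real quote — look up current pricing for whichever model you use:

```javascript
// Back-of-envelope monthly cost for per-item classification calls.
// pricePerMillionTokens is a hypothetical placeholder value — check
// your provider's actual pricing before relying on this.
function monthlyCost(itemsPerDay, tokensPerItem, pricePerMillionTokens) {
  const tokensPerMonth = itemsPerDay * 30 * tokensPerItem;
  return (tokensPerMonth / 1_000_000) * pricePerMillionTokens;
}

// e.g. 20 items/day at ~200 tokens each, at a hypothetical $0.25 per
// million input tokens: monthlyCost(20, 200, 0.25) → 0.03 (three cents/month)
```

At that scale the cost is trivially small either way; the arithmetic only starts to matter if you raise the item limit or run the schedule more often than daily.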