Every morning at 7 AM, while you’re still reaching for your coffee, a fully automated system is scanning the internet for the latest punk, hardcore, and emo news. By the time you open Instagram, it has written a script, generated a voiceover, created a video with waveform animations, posted it to Instagram (feed AND stories), Facebook, and WordPress, and logged everything to a spreadsheet. No human touched a button.
That’s the engine behind XcoreNEWS — and it’s built entirely in n8n.
What It Actually Does
Think about what it takes to run a music news account. You have to find breaking stories, write something about them, create visuals, record audio, edit a video, post across multiple platforms with proper hashtags, and keep track of what you’ve already covered. That’s a full-time job.
This workflow does all of it. Every single step. Autonomously.
It pulls from real music journalism — Punknews, BrooklynVegan, No Echo, Punk Rock Theory — and turns raw news articles into polished, multi-platform content. Each piece gets its own AI-narrated video with custom waveform visualizations layered over the article’s imagery, complete with text overlays showing the source, date, and headline.
The result? A steady stream of professional-looking content that keeps the XcoreNEWS audience informed without burning out a human content creator.
The Smart Parts
It Knows What’s Already Been Covered
Nobody wants to see the same story twice. The workflow checks every article against a Google Sheets log before doing any heavy lifting. If we’ve already covered it, it skips immediately — no wasted API calls, no duplicate posts.
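The check itself is simple. A minimal sketch of the kind of logic an n8n Code node could run, assuming the sheet rows expose a `url` column and each article carries a `link` field (names here are illustrative, not the workflow's actual schema):

```javascript
// Hypothetical sketch: skip articles whose URL already appears in the
// Google Sheets log, fetched earlier in the workflow as `loggedRows`.
function filterNewArticles(articles, loggedRows) {
  // Normalize URLs so protocol and trailing-slash differences
  // don't let duplicates slip past the check.
  const normalize = (url) =>
    url.trim().toLowerCase().replace(/^https?:\/\//, '').replace(/\/+$/, '');
  const seen = new Set(loggedRows.map((row) => normalize(row.url)));
  return articles.filter((a) => !seen.has(normalize(a.link)));
}
```

Because this runs before any AI or rendering step, a duplicate costs one sheet lookup and nothing more.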
It Understands Genre
Not every article from BrooklynVegan is about punk or hardcore. An AI classifier reads each article’s title and snippet and decides if it actually fits the XcoreNEWS brand. Indie folk? Skip. Hardcore supergroup? Let’s go.
It Handles Missing Data Gracefully
Real-world data is messy. Some articles don’t have thumbnail images. Some have weird characters in their titles. The workflow handles all of it — falling back to alternative image sources, decoding HTML entities, and continuing through errors without crashing the entire pipeline.
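The entity-decoding step, for instance, can be done in a few lines. This is an illustrative helper, not the workflow's actual code:

```javascript
// Decode the HTML entities that commonly show up in feed titles.
function decodeEntities(text) {
  return text
    // Numeric entities like &#8217; (curly apostrophe) and &#8211; (en dash)
    .replace(/&#(\d+);/g, (_, code) => String.fromCharCode(Number(code)))
    .replace(/&lt;/g, '<')
    .replace(/&gt;/g, '>')
    .replace(/&quot;/g, '"')
    // Decode &amp; last, so "&amp;lt;" isn't double-decoded into "<"
    .replace(/&amp;/g, '&');
}
```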
It Creates Real Video Content
This isn’t a static image with text slapped on it. Each video features:
- The article’s thumbnail as a full-bleed background
- A darkened overlay for readability
- Animated text showing the source, date, and headline
- A live audio waveform visualization synced to the AI-generated voiceover
- Professional vertical format (1080×1920) optimized for mobile
All rendered server-side with ffmpeg. No templates. No Canva. Pure code.
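To make that concrete, here's a hedged sketch of how such an ffmpeg invocation might be assembled. The paths, font sizes, and filter parameters are placeholders, not the workflow's actual values:

```javascript
// Sketch of an ffmpeg args list matching the layers described above.
// All inputs and styling values are illustrative placeholders.
function buildFfmpegArgs({ image, audio, headlineFile, outPath }) {
  const filter = [
    // Scale/crop the thumbnail to a 1080x1920 vertical canvas
    '[0:v]scale=1080:1920:force_original_aspect_ratio=increase,crop=1080:1920[bg]',
    // Darken the background for text readability
    '[bg]drawbox=x=0:y=0:w=1080:h=1920:color=black@0.35:t=fill[dim]',
    // Headline rendered from a file to sidestep shell escaping
    `[dim]drawtext=textfile=${headlineFile}:fontcolor=white:fontsize=56:x=(w-text_w)/2:y=1500[txt]`,
    // Waveform from the voiceover, square-root scaled
    '[1:a]showwaves=s=1080x200:mode=line:scale=sqrt[wave]',
    // Composite the waveform over the frame
    '[txt][wave]overlay=0:1700[v]',
  ].join(';');
  return [
    '-loop', '1', '-i', image,     // still image as looping video input
    '-i', audio,                   // AI voiceover
    '-filter_complex', filter,
    '-map', '[v]', '-map', '1:a',
    '-c:v', 'libx264', '-t', '35', // cap duration at 35 seconds
    '-pix_fmt', 'yuv420p', outPath,
  ];
}
```

The args would then be handed to something like `child_process.spawn('ffmpeg', args)` from an Execute Command step.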
The Platform Distribution
One article becomes five pieces of content:
- Instagram Feed Post — Square-cropped thumbnail with caption and hashtags
- Instagram Story — Full video with waveform animation
- Instagram Comment — AI-generated hashtags posted as a first comment for algorithm reach
- Facebook Video Post — Native video upload to the page
- WordPress Article — Full post on xcorenews.com
Each platform gets content formatted specifically for its requirements. Instagram gets a 1:1 crop. Stories get the vertical video. Facebook gets native video for better reach. WordPress gets the full write-up.
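That per-platform fan-out boils down to one article object becoming several payloads. An illustrative mapping (field names are assumptions, not the workflow's exact settings):

```javascript
// Hypothetical sketch: turn one processed article into five payloads.
function buildPlatformPayloads(article) {
  return {
    instagramFeed:    { media: article.squareImageUrl,              // 1:1 crop
                        caption: `${article.headline}\n\n${article.summary}` },
    instagramStory:   { media: article.videoUrl },                  // 1080x1920 video
    instagramComment: { text: article.hashtags.join(' ') },         // first comment
    facebook:         { video: article.videoUrl,                    // native upload
                        description: article.summary },
    wordpress:        { title: article.headline,
                        content: article.body, status: 'publish' },
  };
}
```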
Under The Hood
For the technically curious, here’s what powers this thing.
Architecture
The workflow runs 40+ nodes in n8n, orchestrated in a loop-first architecture. Every article is processed individually through a SplitInBatches loop, which means:
- Duplicate checking happens before any expensive operations
- If one article fails, the others still process
- Memory stays clean (no binary data conflicts between items)
The Pipeline
Schedule Trigger (7 AM daily)
-> Configuration
-> SerpAPI Google News search
-> Parse & filter results
-> AI genre classification (Claude Haiku)
-> Loop over each article:
   -> Duplicate check (Google Sheets)
   -> Fetch full article HTML
   -> Extract content (title, body, og:image)
   -> Generate narration script (Claude Haiku)
   -> Generate hashtags (Claude Haiku)
   -> Text-to-speech (ElevenLabs)
   -> Download & process thumbnail
   -> Prepare text overlays
   -> Render video with ffmpeg
   -> Upload assets to S3
   -> Post to Instagram, Facebook, WordPress
   -> Log to tracking sheet
AI Stack
- Anthropic Claude Haiku — Script generation, hashtag creation, and genre classification. Fast, cheap, and surprisingly good at writing punchy 50-word news summaries.
- ElevenLabs — Text-to-speech for the voiceover. Turns the AI-written script into natural-sounding audio.
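For the Claude calls, the request body an HTTP Request node would send to the Anthropic Messages API looks roughly like this. The model string and prompt wording are assumptions, not the workflow's actual values:

```javascript
// Sketch of a script-generation request body for the Anthropic
// Messages API (POST /v1/messages). Prompt text is illustrative.
function buildScriptRequest(article) {
  return {
    model: 'claude-3-haiku-20240307',
    max_tokens: 200,
    messages: [
      {
        role: 'user',
        content:
          'Write a punchy ~50-word news summary for a punk/hardcore audience.\n' +
          `Title: ${article.title}\n` +
          `Body: ${article.body.slice(0, 2000)}`, // keep the prompt small and cheap
      },
    ],
  };
}
```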
Video Rendering
The video is built entirely with a single ffmpeg command that chains together:
- Background image scaling to 1080×1920
- Dark overlay at 35% opacity
- Three text layers (date, source, headline) rendered from files to avoid shell escaping nightmares
- Audio waveform visualization (showwaves filter) with square-root scaling for a clean look
- H.264 encoding at 35 seconds max duration
Text files are used instead of inline text in the ffmpeg command because article titles with quotes, apostrophes, and special characters would break shell escaping. Base64 encoding the text, writing it to temp files, and referencing them with textfile= is the bulletproof approach.
URL Filtering
Google News results from SerpAPI don’t always return clean article links. The parser filters out:
- Apex domain links (just the homepage, no article)
- Date archive pages (/2026/02/12/)
- Section and landing pages (/news, /about)
- Tag and category indexes (/tag/punk, /category/hardcore)
- Pagination (/page/3)
This happens through string-based URL parsing rather than the URL constructor, because n8n’s Code node sandbox doesn’t expose it.
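A sketch of what that string-based filtering can look like. The specific regexes are assumptions modeled on the categories above, not the workflow's exact rules:

```javascript
// Classify a link as article vs. index page using plain string/regex
// work, since the URL constructor isn't available in the sandbox.
function looksLikeArticle(link) {
  // Strip scheme+host and any query/fragment to get the path alone
  const pathPart = link.replace(/^https?:\/\/[^/]+/, '').replace(/[?#].*$/, '');
  if (pathPart === '' || pathPart === '/') return false;            // apex domain
  if (/^\/\d{4}\/\d{2}(\/\d{2})?\/?$/.test(pathPart)) return false; // date archive
  if (/^\/(news|about)\/?$/.test(pathPart)) return false;           // section/landing page
  if (/^\/(tag|category)\//.test(pathPart)) return false;           // tag/category index
  if (/\/page\/\d+\/?$/.test(pathPart)) return false;               // pagination
  return true;
}
```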
Thumbnail Fallbacks
Articles don’t always have og:image meta tags. The workflow uses a two-tier fallback:
- First choice: og:image extracted from the article’s HTML
- Fallback: thumbnail URL from the SerpAPI search result
This prevents the video render from failing when an article page is missing its featured image.
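The fallback chain fits in one small function. This sketch assumes the common `property="og:image" content="…"` attribute order; a production parser would handle more variants:

```javascript
// Two-tier thumbnail fallback: prefer og:image from the article HTML,
// otherwise fall back to the SerpAPI result thumbnail.
function pickThumbnail(html, serpThumbnail) {
  const m = html.match(
    /<meta[^>]+property=["']og:image["'][^>]+content=["']([^"']+)["']/i
  );
  return (m && m[1]) || serpThumbnail || null;
}
```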
Error Resilience
Key nodes like thumbnail download, image resize, and Instagram posting are configured with continueRegularOutput error handling. If one step fails for a single article, the loop continues to the next one instead of killing the entire workflow.
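In n8n's exported workflow JSON, that per-node setting looks roughly like this (the node shown is illustrative; the field name follows n8n's node schema):

```json
{
  "name": "Download Thumbnail",
  "type": "n8n-nodes-base.httpRequest",
  "onError": "continueRegularOutput"
}
```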
Storage & Delivery
- AWS S3 — Videos and images are uploaded to a public bucket for Instagram and Facebook to fetch during their media processing
- Google Sheets — Serves as both a duplicate detection database and a publish log
- WordPress REST API — Creates full blog posts on xcorenews.com
This workflow was built with n8n, a fair-code workflow automation platform. The entire system runs on a self-hosted instance, keeping costs low and control high.