How I Restored YouTube Comment Email Alerts with an AI Assistant and the YouTube Data API
TL;DR: When YouTube stopped sending comment email notifications, I rebuilt inbox alerts in about an hour using an AI assistant (Gemini Pro), the YouTube Data API v3, and a small Dockerized Python script. The result: inbox‑centric notifications that keep creator engagement fast—no new dashboard required.
The problem (short)
That small email that used to arrive whenever someone commented was a workflow hinge: it made timely replies natural and kept conversations alive. YouTube quietly turned off comment email notifications at the end of June. RSS for comments has been gone since 2015, so the official way to get comments programmatically is the YouTube Data API v3 (YouTube’s developer API). Without platform emails, creators either accept third‑party dashboards or rebuild the alert flow themselves.
What I built and why it matters
I used an AI assistant as a pair‑programmer to generate a Python monitoring script that polls (periodically checks) the YouTube Data API v3 once an hour, detects new comments, and sends an email with a link to the comment. I wrapped the script in Docker (a container runtime for running apps consistently) and managed it with Portainer (a lightweight Docker UI). The final code is available on GitHub: davidgewirtz/yt-monitor.
YouTube’s comment emails were a reliable trigger that prompted timely creator responses and helped drive engagement.
Call it “vibe‑coding”: prompt an AI with your intent and let it produce a single‑purpose automation that fits your existing workflow. For me, that meant preserving inbox‑based attention instead of moving it into another web dashboard.
How it works (high level)
- Authenticate: Script uses a YouTube Data API key or OAuth2 credentials to read comments for a channel.
- Poll: Every hour the script queries the API for recent comments (polling = periodic checks).
- Detect: New comment IDs or timestamps are compared against the last seen record.
- Notify: When something is new, the script sends an email with a permalink to the comment.
- Health check: Consecutive API/network failures are counted; 48 consecutive failures (~2 days at hourly checks) trigger an alert that the API may be unreachable.
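The detect-and-count logic above can be sketched in a few lines. This is a minimal illustration, not the exact code in yt-monitor: the function names, the dictionary-shaped comment records, and the injected `fetch_comments` callable are all assumptions.

```python
# Minimal sketch of one polling cycle: fetch, diff against last-seen IDs,
# and count consecutive failures. Names and record shapes are illustrative.
FAILURE_THRESHOLD = 48  # ~2 days of hourly checks before raising the alarm

def find_new_comments(fetched, last_seen_ids):
    """Return comments whose IDs we have not processed before."""
    return [c for c in fetched if c["id"] not in last_seen_ids]

def run_cycle(fetch_comments, last_seen_ids, failures):
    """One cycle: fetch, detect new comments, track consecutive failures."""
    try:
        fetched = fetch_comments()
    except Exception:
        # API or network error: increment the consecutive-failure counter
        return [], last_seen_ids, failures + 1
    new = find_new_comments(fetched, last_seen_ids)
    last_seen_ids = last_seen_ids | {c["id"] for c in fetched}
    return new, last_seen_ids, 0  # a success resets the failure counter
```

In the real script, `fetch_comments` would wrap a `commentThreads.list` call to the YouTube Data API, and the caller would send an alert email once `failures` reaches `FAILURE_THRESHOLD`.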
What I actually ran
- AI assistant: Gemini Pro (Google’s paid AI for code and guidance)
- Code: Python monitoring script (published on GitHub)
- Container: Dockerfile + image, run under Portainer
- Mail delivery: Gmail SMTP with an App Password (for small scale) — alternative: SendGrid/Mailgun for production
- Hosting: Raspberry Pi or a cheap VM (AWS EC2, DigitalOcean) — often $5–10/month
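The Gmail SMTP step looks roughly like this with the standard library. The environment variable names (`SMTP_USER`, `SMTP_APP_PASSWORD`, `ALERT_TO`) are illustrative choices, not the repo's exact names:

```python
# Sketch of building and sending the alert email via Gmail SMTP with an
# App Password. Variable names and message wording are assumptions.
import os
import smtplib
from email.message import EmailMessage

def build_alert(author, snippet, permalink):
    """Compose a short alert email linking back to the comment."""
    msg = EmailMessage()
    msg["Subject"] = f"New YouTube comment from {author}"
    msg["From"] = os.environ.get("SMTP_USER", "me@example.com")
    msg["To"] = os.environ.get("ALERT_TO", "me@example.com")
    msg.set_content(f"{snippet}\n\nReply: {permalink}")
    return msg

def send_alert(msg):
    # Gmail SMTP over STARTTLS on port 587; an App Password is required
    # when 2-step verification is enabled on the account.
    with smtplib.SMTP("smtp.gmail.com", 587) as s:
        s.starttls()
        s.login(os.environ["SMTP_USER"], os.environ["SMTP_APP_PASSWORD"])
        s.send_message(msg)
```

Swapping in SendGrid or Mailgun later only means replacing `send_alert`; the message-building step stays the same.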
What I did in one hour (timeline)
- 0–10 min: Prompt Gemini with the goal and required APIs (YouTube Data API v3, email delivery), review suggested design.
- 10–30 min: Generate and iterate on the Python script (comment fetch, detection logic, email send); sanity‑check API calls and results.
- 30–45 min: Add Dockerfile, create a simple health check, and package the app.
- 45–60 min: Deploy container with Portainer on a small VM or Pi, test end‑to‑end, and push repo to GitHub.
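For the packaging step, a container image for a script like this can be as small as the sketch below. File names (`monitor.py`, `requirements.txt`) are assumptions, not necessarily what the repo uses:

```dockerfile
# Illustrative Dockerfile for a single-script poller
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY monitor.py .
# Secrets arrive as environment variables at run time, never baked in
CMD ["python", "monitor.py"]
```

In Portainer, the secrets (API key, SMTP credentials) go into the container's environment settings rather than the image.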
Practical deployment options & costs
- Local/Edge: Raspberry Pi — low upfront cost, excellent for single creators.
- Cloud VM: AWS Lightsail, DigitalOcean, or a comparable $5–10/month VM offers stable uptime.
- Email: Gmail + App Password (convenient for low volume) or transactional providers (SendGrid, Mailgun) for reliability and deliverability at scale.
- AI cost: Gemini Pro is around $20/month; other assistants (ChatGPT, Claude, Copilot) can substitute.
Security, reliability and governance (what to watch)
DIY automation is powerful, but it introduces operational and security responsibilities. Practical mitigations:
- Credentials: Don’t commit API keys or app passwords to the repo. Use environment variables or a secrets manager; rotate keys regularly.
- Authentication choices: API key vs OAuth2 vs service accounts — OAuth2 is more secure for user‑scoped access; API keys are simpler but more brittle. Gmail App Passwords require 2‑step verification and may not be available to all account types; transactional email providers use API keys designed for programmatic sending.
- Rate limits & quotas: YouTube Data API has quotas. Use efficient queries, cache results, and implement exponential backoff on 429/5xx responses.
- Monitoring: Add logs, basic metrics, and an external heartbeat (healthchecks.io, UptimeRobot) so you know when the job stops running.
- Compliance & TOS: Respect YouTube API Terms of Service and privacy expectations. Don’t aggregate or expose commenter PII unnecessarily.
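The backoff advice above reduces to a small retry helper. This jitter-free delay schedule is one reasonable policy, not the repo's exact implementation:

```python
# Sketch of exponential backoff for 429/5xx responses: delay doubles per
# attempt, capped at one hour. The schedule and cap are illustrative.
import time

def backoff_delay(attempt, base=2.0, cap=3600.0):
    """Seconds to wait before retry N: base**N, capped."""
    return min(base ** attempt, cap)

def call_with_retries(fn, retries=5, sleep=time.sleep):
    """Call fn, retrying with exponential backoff on any exception."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception:
            if attempt == retries - 1:
                raise  # out of retries: surface the error
            sleep(backoff_delay(attempt))
```

Production code would catch only retryable errors (HTTP 429 and 5xx) and add random jitter so multiple clients don't retry in lockstep.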
When not to DIY
- If you need multi‑user access, role controls, or audit trails at enterprise scale, a commercial social management platform is likely a better fit.
- If legal/compliance or data governance is strict (regulated industries), prefer professionally managed tools with SOC/ISO compliance and centralized security controls.
- If you expect high comment volume or complex moderation workflows, plan for a more robust architecture (queueing, workers, centralized logging).
Troubleshooting FAQ
- Why am I not getting alerts? Check that the container is running, verify API credentials and quota, confirm the “last seen” logic isn’t inadvertently marking items as already processed, and ensure SMTP/transactional provider credentials are correct.
- What if email deliverability is poor? Switch from Gmail SMTP to a transactional email provider with proper DNS records (SPF, DKIM) to improve deliverability and monitoring.
- How do I handle API rate limits? Implement retries with exponential backoff and minimize repeated or unnecessary requests; batch where possible and monitor quota usage in Google Cloud Console.
- How will I know the script died? Use an external heartbeat monitor or a simple health endpoint checked by a monitoring service that alerts you when the heartbeat stops.
Security checklist (quick)
- Never store secrets in plaintext in the repo.
- Use env vars or a secrets manager (AWS Secrets Manager, HashiCorp Vault, etc.).
- Prefer transactional email provider keys for production sending.
- Enable 2‑step verification for any accounts that can access email delivery.
- Log with structured output and forward to a centralized log store if scaling.
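The first two checklist items amount to reading secrets from the environment and failing fast if any are missing. The variable names below (`YT_API_KEY`, `SMTP_USER`, `SMTP_APP_PASSWORD`) are illustrative:

```python
# Sketch of startup-time secret loading: env vars only, fail fast if
# anything is missing. Variable names are assumptions, not the repo's.
import os

REQUIRED = ("YT_API_KEY", "SMTP_USER", "SMTP_APP_PASSWORD")

def load_config(env=os.environ):
    """Return required secrets, or raise listing whatever is missing."""
    missing = [k for k in REQUIRED if not env.get(k)]
    if missing:
        raise RuntimeError(f"missing required env vars: {', '.join(missing)}")
    return {k: env[k] for k in REQUIRED}
```

Failing at startup with a clear message beats a cryptic auth error on the first API call two hours later.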
Sample prompt template (reuse with your chosen AI)
Use this template to ask an AI assistant to generate the project, substituting your own channel ID and email approach.
“Write a Python script that uses the YouTube Data API v3 to poll a specified channel once per hour for new comments. The script should store the last‑seen comment ID, detect new comments, and send an email with a permalink to the comment. Include a Dockerfile and a simple health check that counts consecutive API failures and sends an alert after 48 failures. Provide instructions for obtaining YouTube API credentials and for using Gmail SMTP (App Password) or SendGrid. Keep the code minimal but production‑aware (use environment variables for secrets).”
Key takeaways
- You can recreate YouTube comment email alerts: The YouTube Data API v3 makes comment data accessible programmatically.
- AI assistants compress effort: Gemini Pro (and other LLMs) can generate working code, Dockerfiles, and step‑by‑step guidance fast—turning weekend projects into hour‑long builds.
- Inbox‑centric workflows matter: For many creators, email alerts produce faster engagement than switching to a dashboard.
- Plan for operations: Credentials, quotas, monitoring, and compliance are the ongoing costs of DIY automation.
Want to try it? Clone, fork, or inspect the code on GitHub: davidgewirtz/yt-monitor. The repo includes the Python script, Dockerfile, and notes for credential setup. If you plan to run this in production, follow the security checklist above and consider a transactional email provider and OAuth2 for authentication.