Nav Toor
Helping you master AI daily with step-by-step AI guides, latest news, & practical tools • DM for Collabs

Feb 26, 14 tweets

BREAKING: AI can now write Python scripts like a $250K/year senior developer at Google (for free).

Here are 12 insane Claude prompts that automate any task in minutes (Save for later)

1. The Google Staff Engineer Script Builder

"You are a staff software engineer at Google who writes clean, production-grade Python scripts that automate complex workflows for teams processing millions of data points daily.

I need a complete Python script that automates a specific task I currently do manually.

Build:

- The full working script ready to copy, paste, and run immediately
- Clear comments explaining every block of code in plain English
- Error handling: try-except blocks that catch failures gracefully instead of crashing
- Input validation: check that the data or files fed in are correct before processing
- Progress indicators: print statements showing what the script is doing at each step
- Logging: save a record of what happened each time the script runs
- Configuration section at the top: all settings I might want to change in one easy place
- Requirements list: every library needed with exact pip install commands
- How to run it: step-by-step instructions for someone who has never run Python before
- How to schedule it: make this script run automatically every day, week, or month

Format as a complete, tested Python script with a README explaining setup and usage in under 5 minutes.

My task to automate: [DESCRIBE THE MANUAL TASK YOU DO REPEATEDLY, WHAT DATA IS INVOLVED, AND WHAT THE OUTPUT SHOULD LOOK LIKE]"
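A miniature of the skeleton this prompt asks for — config block up top, logging, input validation, and graceful error handling. The task itself (doubling numbers in `process_record`) and the sample data are placeholders:

```python
import logging

# Configuration section: all settings in one easy place
CONFIG = {
    "input_values": [1, 2, "3", None, 5],  # sample data to process
    "log_level": logging.INFO,
}

logging.basicConfig(level=CONFIG["log_level"], format="%(levelname)s %(message)s")
log = logging.getLogger("automation")

def validate(value):
    """Input validation: accept only values coercible to int."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return None

def process_record(n):
    """Placeholder 'task': double the number."""
    return n * 2

def run():
    results, skipped = [], 0
    for raw in CONFIG["input_values"]:
        n = validate(raw)
        if n is None:
            log.warning("skipping invalid input: %r", raw)  # progress indicator
            skipped += 1
            continue
        try:
            results.append(process_record(n))
        except Exception as exc:  # catch failures gracefully instead of crashing
            log.error("failed on %r: %s", n, exc)
    log.info("done: %d processed, %d skipped", len(results), skipped)
    return results

if __name__ == "__main__":
    run()
```

The real answer Claude returns will be longer, but every bullet in the prompt maps to one of these pieces.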

2. The Amazon File Processing Automator

"You are a senior automation engineer at Amazon who builds Python scripts that process thousands of files per hour across AWS infrastructure, transforming raw data into organized, analysis-ready formats.

I need a Python script that automatically processes, renames, moves, or transforms my files.

Build:

- File discovery: scan folders and subfolders to find all files matching my criteria
- Filtering logic: process only specific file types, date ranges, or name patterns
- Batch renaming: rename hundreds of files using consistent naming conventions automatically
- Format conversion: convert between CSV, JSON, Excel, PDF, or text formats
- Data extraction: pull specific information from inside files (dates, numbers, names, tables)
- File organization: sort files into folders by date, type, project, or any custom rule
- Duplicate detection: find and flag identical or near-identical files wasting storage
- Compression: zip processed files and archive originals for backup
- Summary report: generate a log showing every file processed, skipped, or failed
- Undo capability: a reverse script that restores original file names and locations if needed

Format as a complete Python script with a configuration section for customizing file paths, patterns, and rules without editing code.

My file problem: [DESCRIBE YOUR FILE TYPES, FOLDER STRUCTURE, WHAT PROCESSING YOU NEED, AND WHERE OUTPUT FILES SHOULD GO]"
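A sketch of the file-discovery + batch-rename pieces using only the standard library. The folder contents, the `report_NNN` naming convention, and the returned rename log (which feeds the "undo capability" bullet) are all illustrative:

```python
import tempfile
from pathlib import Path

def batch_rename(folder, pattern="*.csv", prefix="report"):
    """Rename every file matching `pattern` to prefix_001.ext, prefix_002.ext, ..."""
    renamed = []
    # sorted() materializes the glob first, so renames don't disturb the scan
    for i, path in enumerate(sorted(folder.glob(pattern)), start=1):
        new_name = f"{prefix}_{i:03d}{path.suffix}"
        path.rename(path.with_name(new_name))
        renamed.append((path.name, new_name))  # keep a log for the undo script
    return renamed

# demo in a throwaway directory
root = Path(tempfile.mkdtemp())
for name in ["b.csv", "a.csv", "notes.txt"]:
    (root / name).write_text("x")
rename_log = batch_rename(root, "*.csv")
```

Reversing `rename_log` and renaming back gives you the undo script for free.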

3. The Netflix Web Scraping Engineer

"You are a senior data engineer at Netflix who builds Python web scrapers that collect competitive intelligence from thousands of web pages, transforming unstructured websites into clean structured datasets.

I need a Python web scraper that automatically collects data from websites.

Build:

- URL handling: scrape a single page, multiple pages, or automatically follow pagination links
- HTML parsing: extract exactly the data I need using BeautifulSoup or Selenium selectors
- Data cleaning: strip HTML tags, remove whitespace, and normalize extracted text
- Rate limiting: add delays between requests so I don't get blocked or overwhelm the server
- User-agent rotation: mimic a real browser to avoid basic bot detection
- Retry logic: automatically retry failed requests instead of crashing on one error
- Data storage: save results to CSV, JSON, or Excel with proper column headers
- Incremental scraping: skip pages already scraped so I can resume without starting over
- Proxy support: option to route requests through proxies for large-scale collection
- Scheduling setup: instructions to run this scraper automatically on a recurring schedule

Format as a complete Python scraping script with requests/BeautifulSoup and a Selenium fallback for JavaScript-heavy sites.

My scraping target: [DESCRIBE THE WEBSITE, WHAT DATA YOU WANT TO EXTRACT, HOW MANY PAGES, AND HOW OFTEN YOU NEED UPDATED DATA]"
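An offline sketch of the parsing + rate-limiting pieces, using the standard library's `html.parser` in place of BeautifulSoup so it runs with no installs. The sample HTML and the `title` class name are made up:

```python
import time
from html.parser import HTMLParser

class TitleScraper(HTMLParser):
    """Collect the text of every <h2 class="title"> element."""
    def __init__(self):
        super().__init__()
        self.titles, self._grab = [], False

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "title") in attrs:
            self._grab = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._grab = False

    def handle_data(self, data):
        if self._grab and data.strip():
            self.titles.append(data.strip())  # data cleaning: trim whitespace

def scrape_pages(pages, delay=0.01):
    """Rate limiting: sleep between 'requests' so you don't hammer the server."""
    results = []
    for html in pages:  # in real use: html = requests.get(url).text
        parser = TitleScraper()
        parser.feed(html)
        results.extend(parser.titles)
        time.sleep(delay)
    return results

sample = '<div><h2 class="title"> First Post </h2><h2>skip</h2><h2 class="title">Second</h2></div>'
titles = scrape_pages([sample])
```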

4. The Stripe API Integration Specialist

"You are a senior backend engineer at Stripe who builds Python scripts that connect APIs together, creating automated workflows between business tools that would otherwise require expensive middleware or manual data transfer.

I need a Python script that connects to APIs and automates data flow between my tools.

Build:

- API authentication: handle API keys, OAuth tokens, and bearer tokens securely
- GET requests: pull data from any API and parse the JSON response into usable Python objects
- POST requests: send data to APIs to create records, trigger actions, or update information
- Pagination handling: automatically loop through all pages of API results, not just the first page
- Rate limit compliance: respect API rate limits with smart throttling and backoff logic
- Data transformation: reshape data from one API's format to match another API's expected input
- Webhook listener: a simple server that receives real-time notifications from API events
- Error handling: catch API failures, log the error, and retry or alert me
- Caching: store API responses locally to avoid redundant calls and speed up repeat runs
- Complete workflow: chain multiple API calls together into one automated end-to-end pipeline

Format as a production-ready Python API integration script with environment variable setup for secrets and a testing mode for safe development.

My integration: [DESCRIBE WHICH APIS YOU WANT TO CONNECT, WHAT DATA FLOWS BETWEEN THEM, AND WHAT TRIGGERS THE AUTOMATION]"
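A sketch of the pagination + retry/backoff bullets against a stubbed API — `fetch_page` here stands in for a real `requests.get(...).json()` call, and the fake records are invented:

```python
import time

FAKE_DB = list(range(7))  # pretend server-side records

def fetch_page(page, per_page=3):
    """Stub API: returns one page of results plus a 'has_more' flag."""
    start = page * per_page
    return {
        "items": FAKE_DB[start:start + per_page],
        "has_more": start + per_page < len(FAKE_DB),
    }

def fetch_all(max_retries=3):
    """Loop through every page of results, retrying each request with backoff."""
    items, page = [], 0
    while True:
        for attempt in range(max_retries):
            try:
                data = fetch_page(page)
                break
            except Exception:
                time.sleep(0.01 * (2 ** attempt))  # exponential backoff
        else:
            raise RuntimeError(f"page {page} failed after {max_retries} tries")
        items.extend(data["items"])
        if not data["has_more"]:
            return items
        page += 1

records = fetch_all()
```

The same loop shape works for cursor-based pagination; you just swap the page counter for the cursor the API returns.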

5. The Goldman Sachs Excel Killer

"You are a VP-level quantitative analyst at Goldman Sachs who replaces slow, error-prone Excel workflows with clean Python scripts that process data 100x faster with zero manual formula errors.

I need a Python script that replaces my Excel workflow with something faster and more reliable.

Build:

- Excel file reading: load single or multiple Excel files including specific sheets and cell ranges
- Data cleaning: handle missing values, fix data types, remove duplicates, and standardize formats
- Calculations: replicate every Excel formula (VLOOKUP, SUMIF, pivot tables) in pandas
- Multi-file merging: combine data from 10, 50, or 100 spreadsheets into one master dataset
- Pivot table replacement: group, aggregate, and summarize data faster than Excel ever could
- Chart generation: create professional visualizations (bar, line, scatter, heatmap) saved as images
- Conditional formatting logic: flag outliers, highlight thresholds, and color-code results in output
- Output to Excel: write results back to a formatted Excel file with headers, colors, and auto-sized columns
- Performance comparison: handle 1 million+ rows that would crash Excel without breaking a sweat
- Before and after: show exactly which Excel formulas map to which Python commands

Format as a complete pandas-based Python script with inline comments mapping every step to its Excel equivalent.

My Excel workflow: [DESCRIBE YOUR CURRENT EXCEL PROCESS, WHAT FILES YOU WORK WITH, WHAT CALCULATIONS YOU DO, AND WHAT THE FINAL OUTPUT LOOKS LIKE]"
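A dependency-free sketch of two of the formula mappings this prompt asks for: VLOOKUP becomes a dict lookup, SUMIF becomes a grouped sum. (A real answer would use pandas `merge` and `groupby`; the sales rows here are made up.)

```python
from collections import defaultdict

sales = [  # rows as you'd read them from a CSV export
    {"region": "east", "rep": "ann", "amount": 100},
    {"region": "west", "rep": "bob", "amount": 250},
    {"region": "east", "rep": "cat", "amount": 50},
]
rep_emails = {"ann": "ann@x.com", "bob": "bob@x.com", "cat": "cat@x.com"}

# VLOOKUP(rep, table, email_column) -> dict lookup
for row in sales:
    row["email"] = rep_emails[row["rep"]]

# SUMIF(region_range, "east", amount_range) -> grouped sum
totals = defaultdict(int)
for row in sales:
    totals[row["region"]] += row["amount"]
```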

6. The Meta Email and Notification Automator

"You are a senior productivity engineer at Meta who builds Python automation scripts that handle email processing, report distribution, and notification systems for teams of 1,000+ employees.

I need a Python script that automates my email and notification workflows.

Build:

- Email sending: send formatted emails with subject lines, HTML body, and attachments programmatically
- Bulk email: send personalized emails to a list of recipients with customized fields (name, company, data)
- Email reading: scan my inbox for specific emails and extract data from them automatically
- Attachment handling: download, save, and process email attachments without opening them manually
- Template system: reusable email templates with placeholder variables that fill dynamically
- Scheduling: send emails at specific times or trigger them based on conditions
- Slack notification: post automated messages to Slack channels when tasks complete or fail
- SMS alerts: send text message notifications for critical events using Twilio
- Report distribution: generate a report and email it to the right people on a schedule
- Digest builder: compile multiple data points into one daily or weekly summary email

Format as a complete email automation script with Gmail and Outlook support, template examples, and a secure credential setup guide.

My email task: [DESCRIBE WHAT EMAILS YOU SEND REPEATEDLY, WHO RECEIVES THEM, WHAT DATA GOES IN THEM, AND HOW OFTEN]"
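A sketch of the template + bulk-personalization bullets. Building the messages is shown; actually sending them (smtplib, credentials, Gmail vs Outlook setup) is left out, and the recipient list is invented:

```python
from email.message import EmailMessage
from string import Template

# reusable template with placeholder variables that fill dynamically
TEMPLATE = Template("Hi $name, your $report report for $month is attached.")

recipients = [  # illustrative contact list
    {"email": "a@x.com", "name": "Ana", "report": "sales", "month": "May"},
    {"email": "b@x.com", "name": "Ben", "report": "ops", "month": "May"},
]

def build_messages(rows):
    """One personalized EmailMessage per recipient."""
    msgs = []
    for row in rows:
        msg = EmailMessage()
        msg["To"] = row["email"]
        msg["Subject"] = f"Your {row['report']} report"
        msg.set_content(TEMPLATE.substitute(row))
        msgs.append(msg)
    return msgs

messages = build_messages(recipients)
# sending would then be: smtplib.SMTP_SSL(...).send_message(msg) per message
```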

7. The Palantir Data Pipeline Builder

"You are a senior data engineer at Palantir who builds Python ETL (Extract, Transform, Load) pipelines that pull data from messy sources, clean and transform it, and deliver analysis-ready datasets to decision-makers.

I need a Python data pipeline that automates my entire data workflow from raw to ready.

Build:

- Data extraction: pull data from CSV files, databases, APIs, Google Sheets, or web scraping
- Data validation: check every row for missing values, wrong types, and impossible numbers
- Cleaning rules: standardize dates, fix text encoding, trim whitespace, and normalize categories
- Transformation logic: merge datasets, calculate new columns, aggregate groups, and reshape tables
- Deduplication: identify and handle duplicate records using smart matching rules
- Enrichment: add derived fields like age from birthdate, fiscal quarter from date, or category from rules
- Quality report: generate a data quality summary showing what was cleaned, fixed, or flagged
- Output delivery: save the final clean dataset to CSV, database, Google Sheets, or cloud storage
- Pipeline logging: record every step with timestamps so I can audit what happened to my data
- Scheduling: run the entire pipeline automatically on a daily, weekly, or monthly schedule

Format as a modular Python ETL pipeline with separate functions for each stage and a configuration file for customizing sources and rules.

My data workflow: [DESCRIBE YOUR DATA SOURCES, WHAT CLEANING IS NEEDED, WHAT TRANSFORMATIONS YOU APPLY, AND WHERE THE FINAL DATA GOES]"
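A minimal sketch of the modular extract → validate → transform → load shape the prompt specifies, with each stage as a separate function. The sample rows and the derived `month` field are made up:

```python
def extract():
    """Extract: stand-in for reading CSV/API/database rows."""
    return [
        {"date": "2024-01-05", "amount": "100"},
        {"date": "", "amount": "50"},              # missing date -> flagged
        {"date": "2024-02-10", "amount": "oops"},  # bad type -> flagged
    ]

def validate(rows):
    """Validation: split rows into clean vs flagged for the quality report."""
    good, flagged = [], []
    for row in rows:
        if row["date"] and row["amount"].isdigit():
            good.append(row)
        else:
            flagged.append(row)
    return good, flagged

def transform(rows):
    """Enrichment: derive a month column, cast amount to int."""
    return [{"month": r["date"][:7], "amount": int(r["amount"])} for r in rows]

def load(rows):
    """Load: stand-in for writing to CSV/database/Sheets."""
    return rows

def run_pipeline():
    good, flagged = validate(extract())
    return load(transform(good)), flagged

clean, flagged = run_pipeline()
```

Because each stage is its own function, you can swap the extract source or add cleaning rules without touching the rest of the pipeline.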

8. The Shopify PDF Report Generator

"You are a senior software engineer at Shopify who builds Python scripts that automatically generate professional PDF reports from raw data, replacing hours of manual formatting in Word and PowerPoint.

I need a Python script that turns my data into polished, professional PDF reports automatically.

Build:

- Data loading: pull numbers from CSV, Excel, database, or API and prepare them for the report
- Report template: professional layout with header, logo placeholder, sections, and page numbers
- Dynamic text: paragraphs that change based on the data (e.g., 'Revenue increased 15% this quarter')
- Tables: clean formatted tables with alternating row colors, headers, and totals
- Charts embedded: generate bar, line, and pie charts from the data and embed them directly in the PDF
- KPI summary section: big numbers with labels and comparison arrows at the top of the report
- Conditional commentary: auto-generated insights that highlight what's good, bad, and needs attention
- Multi-page handling: automatic page breaks and section organization for longer reports
- Batch generation: create 50+ customized reports (one per client, region, or product) in one run
- Email delivery: automatically email each report to the right recipient after generation

Format as a complete Python PDF report generator using ReportLab or FPDF with a sample template and customization guide.

My report: [DESCRIBE YOUR REPORT TYPE, DATA SOURCE, SECTIONS NEEDED, WHO READS IT, AND HOW OFTEN YOU GENERATE IT]"
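A sketch of just the "dynamic text" and "conditional commentary" bullets; the PDF layout layer (ReportLab/FPDF) is omitted so this stays dependency-free. The KPI names and numbers are invented:

```python
def kpi_sentence(metric, current, previous):
    """Dynamic text: a paragraph that changes based on the data."""
    change = (current - previous) / previous * 100
    direction = "increased" if change >= 0 else "decreased"
    return f"{metric} {direction} {abs(change):.0f}% this quarter."

def commentary(kpis):
    """Conditional commentary: auto-flag anything that dropped."""
    notes = []
    for name, (cur, prev) in kpis.items():
        if cur < prev:
            notes.append(f"ATTENTION: {name} is down ({prev} -> {cur}).")
    return notes or ["All KPIs on track."]

kpis = {"Revenue": (115, 100), "Active users": (90, 100)}
lines = [kpi_sentence(k, *v) for k, v in kpis.items()]
notes = commentary(kpis)
```

In the full script, these strings become ReportLab paragraphs placed into the template with the tables and embedded charts.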

9. The Uber Database Automation Manager

"You are a senior database engineer at Uber who writes Python scripts that automate database operations, scheduled queries, migrations, and monitoring for systems processing millions of transactions daily.

I need Python scripts that automate my database management tasks.

Build:

- Database connection: connect to PostgreSQL, MySQL, SQLite, or MongoDB with secure credentials
- Automated queries: run SQL queries on a schedule and save results to CSV or Excel
- Data insertion: bulk insert thousands of rows from CSV or API data into database tables
- Schema management: create tables, add columns, and modify structure through Python scripts
- Backup automation: scheduled database dumps saved with timestamps to local or cloud storage
- Migration scripts: safely move data between databases or transform schema with rollback safety
- Monitoring queries: automated health checks that alert me if row counts, nulls, or values look wrong
- Duplicate cleanup: find and resolve duplicate records based on custom matching rules
- Performance logging: track query execution times and flag slow queries automatically
- Connection pooling: handle multiple simultaneous database connections efficiently without crashes

Format as a complete database automation toolkit with scripts for each task, a secure credential management setup, and scheduling instructions.

My database: [DESCRIBE YOUR DATABASE TYPE, TABLE STRUCTURE, COMMON QUERIES, AND WHICH MANUAL DATABASE TASKS YOU WANT AUTOMATED]"
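An sqlite3 sketch of the bulk-insert and monitoring-query bullets — it ships with Python, so this runs as-is. The `rides` table and the null-fare check are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rides (id INTEGER PRIMARY KEY, fare REAL)")

rows = [(1, 12.5), (2, 8.0), (3, None)]  # one NULL fare to catch
conn.executemany("INSERT INTO rides VALUES (?, ?)", rows)  # bulk insert
conn.commit()

def health_check(conn):
    """Monitoring query: alert if row counts or nulls look wrong."""
    total = conn.execute("SELECT COUNT(*) FROM rides").fetchone()[0]
    nulls = conn.execute(
        "SELECT COUNT(*) FROM rides WHERE fare IS NULL"
    ).fetchone()[0]
    alerts = []
    if total == 0:
        alerts.append("table is empty")
    if nulls:
        alerts.append(f"{nulls} rides have NULL fares")
    return alerts

alerts = health_check(conn)
```

Swapping the connection line for psycopg2 or mysql-connector (with credentials from environment variables) gives you the PostgreSQL/MySQL versions.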

10. The Tesla Image Processing Automator

"You are a senior computer vision engineer at Tesla who writes Python scripts that process, resize, convert, analyze, and organize thousands of images automatically for engineering and data teams.

I need a Python script that automates my image processing workflow.

Build:

- Batch resizing: resize hundreds of images to specific dimensions while keeping aspect ratio
- Format conversion: convert between PNG, JPG, WEBP, and TIFF in bulk
- Image compression: reduce file sizes by 60-80% without visible quality loss for web use
- Watermarking: add text or logo watermarks to every image in a folder automatically
- Metadata extraction: pull EXIF data (date, location, camera settings) from photos into a spreadsheet
- Thumbnail generation: create consistent thumbnail versions for websites or catalogs
- Background removal: remove or replace backgrounds using Python image processing libraries
- Image organization: sort photos into folders by date, size, resolution, or dominant color
- PDF to image: convert PDF pages into high-quality images for presentation or social media use
- Collage and montage: automatically combine multiple images into grid layouts or contact sheets

Format as a complete Python image processing toolkit using Pillow and OpenCV with batch processing and a simple command-line interface.

My images: [DESCRIBE YOUR IMAGE TYPES, VOLUME, WHAT PROCESSING YOU NEED, AND WHERE PROCESSED IMAGES SHOULD BE SAVED]"
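A dependency-free sketch of the aspect-ratio math behind the batch-resizing bullet — Pillow's `Image.thumbnail` does this for you, but the arithmetic is worth seeing. The image dimensions in the batch are made up:

```python
def fit_within(width, height, max_w, max_h):
    """Scale (width, height) to fit inside (max_w, max_h), keeping aspect ratio."""
    scale = min(max_w / width, max_h / height, 1.0)  # never upscale
    return round(width * scale), round(height * scale)

# a batch of hypothetical source-image sizes
batch = [(4000, 3000), (800, 600), (1200, 2400)]
thumbs = [fit_within(w, h, 1024, 1024) for w, h in batch]
```

With Pillow installed, the batch loop becomes `Image.open(path)` + `img.thumbnail((1024, 1024))` + `img.save(out_path)` per file.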

11. The HubSpot CRM Data Automation Engineer

"You are a senior marketing automation engineer at HubSpot who builds Python scripts that clean, enrich, sync, and analyze CRM data across platforms, eliminating hours of manual contact management.

I need Python scripts that automate my CRM and contact data management.

Build:

- Contact deduplication: find and merge duplicate contacts using fuzzy name and email matching
- Data enrichment: add missing fields (company, title, LinkedIn URL) using free API lookups
- List segmentation: automatically sort contacts into lists based on rules (industry, location, engagement)
- CSV import cleaner: process messy contact exports into CRM-ready format with standardized fields
- Email validation: check every email address for proper format and flag likely bounces
- Phone number formatting: standardize phone numbers to a consistent international format
- Lead scoring: assign point values based on contact attributes and engagement history
- Activity tracking: pull and consolidate touchpoints across email, calls, and meetings per contact
- Sync automation: keep contacts updated between Google Sheets, CRM, and email tools automatically
- Monthly health report: generate a CRM data quality report showing completeness and accuracy scores

Format as a complete CRM data automation toolkit with scripts for each function and a scheduling guide for ongoing maintenance.

My CRM data: [DESCRIBE YOUR CRM PLATFORM, CONTACT VOLUME, BIGGEST DATA QUALITY ISSUES, AND WHAT MANUAL CLEANUP YOU DO REGULARLY]"
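A sketch of the fuzzy-dedup bullet using `difflib` from the standard library (a production build might use rapidfuzz instead); the contacts and the 0.8 threshold are illustrative:

```python
from difflib import SequenceMatcher

contacts = [
    {"name": "John Smith", "email": "john.smith@acme.com"},
    {"name": "Jon Smith",  "email": "JOHN.SMITH@ACME.COM"},
    {"name": "Mary Jones", "email": "mary@other.io"},
]

def normalize_email(e):
    """Standardize before comparing: trim and lowercase."""
    return e.strip().lower()

def is_duplicate(a, b, threshold=0.8):
    """Same normalized email, or names that are close enough."""
    if normalize_email(a["email"]) == normalize_email(b["email"]):
        return True
    ratio = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return ratio >= threshold

def find_duplicates(rows):
    """Compare every pair; returns index pairs to review or merge."""
    pairs = []
    for i in range(len(rows)):
        for j in range(i + 1, len(rows)):
            if is_duplicate(rows[i], rows[j]):
                pairs.append((i, j))
    return pairs

dupes = find_duplicates(contacts)
```

The pairwise loop is fine for thousands of contacts; at larger volumes you'd block on email domain or name initial first to cut comparisons.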

12. The Google DevOps Scheduling and Monitoring System

"You are a senior site reliability engineer at Google who builds Python monitoring and scheduling systems that keep critical business processes running 24/7 without human babysitting.

I need a Python system that runs my scripts automatically and alerts me when something goes wrong.

Build:

- Task scheduler: run any Python script at specific times using cron, APScheduler, or Windows Task Scheduler
- Health checks: ping websites, APIs, and databases every 5 minutes and alert me if they're down
- Log monitoring: watch log files for error keywords and send an alert the instant something fails
- Resource monitoring: track CPU, memory, and disk usage and warn before systems run out
- Retry automation: if a script fails, automatically retry 3 times with increasing delays
- Alert routing: send critical alerts via email, Slack, SMS, or all three based on severity
- Dashboard generation: create a simple HTML status page showing what's running and what's broken
- Dead man's switch: alert me if a scheduled script doesn't run when it should
- Performance tracking: log how long each script takes and flag when execution time increases suspiciously
- One-command setup: install and configure everything with a single setup script

Format as a complete Python monitoring and scheduling system with setup instructions for Mac, Windows, and Linux.

My scripts: [DESCRIBE WHAT SCRIPTS YOU NEED TO RUN AUTOMATICALLY, HOW OFTEN, WHAT FAILURES LOOK LIKE, AND HOW YOU WANT TO BE ALERTED]"
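A sketch of the retry-with-backoff and dead-man's-switch bullets; the flaky demo task and the alerting hookup are placeholders:

```python
import time

def run_with_retries(task, retries=3, base_delay=0.01):
    """If a script fails, retry with increasing delays before giving up."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(base_delay * (2 ** (attempt - 1)))  # 0.01s, 0.02s, ...

def dead_mans_switch(last_run_ts, now_ts, max_gap_seconds):
    """Alert if a scheduled script hasn't checked in recently enough."""
    return (now_ts - last_run_ts) > max_gap_seconds

# demo: a flaky task that fails twice, then succeeds on the third try
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retries(flaky)
```

In the full system, `dead_mans_switch` runs on a schedule against each script's last-heartbeat timestamp, and a `True` result routes to your email/Slack/SMS alerter.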

These 12 prompts replace an entire automation engineering team:

→ Script building ($200/hour freelance developer)
→ File processing ($5,000 automation project)
→ Web scraping ($8,000 data collection build)
→ API integration ($12,000 middleware project)
→ Excel replacement ($6,000 data pipeline build)
→ Email automation ($4,000 notification system)
→ ETL pipelines ($15,000 Palantir-level data engineering)
→ PDF report generation ($7,000 reporting system)
→ Database automation ($10,000 DBA consulting)
→ Image processing ($5,000 media automation)
→ CRM data automation ($8,000 HubSpot consulting)
→ Monitoring and scheduling ($12,000 DevOps setup)

Total automation value: $92,200+
Your cost with Claude: $0.

Python is the most valuable skill of 2025.

But writing it yourself is optional now.

Copy. Paste. Automate.

Follow me @heynavtoor for more AI prompts that automate your entire workflow.

♻️ Repost to help your network stop doing things manually.
