Features • Installation • Usage • Running waybacksteroids • Examples
waybacksteroids is an enumeration tool that automates the retrieval of archived URLs from the Wayback Machine. It supports processing multiple domains simultaneously, making it useful for quickly discovering historical endpoints and uncovering hidden paths across different targets.
## Features

- 📤 Flexible input: single domain, wordlist, or stdin pipe (fits any recon workflow)
- 🎯 Clean, de-duplicated output per domain (`domain_steroids.txt`)
- 🔁 Auto-retry on transient failures (configurable)
## Installation

```sh
go install github.com/LucasKatashi/waybacksteroids/cmd/waybacksteroids@latest
```

Or build from source:

```sh
git clone https://github.com/LucasKatashi/waybacksteroids.git
cd waybacksteroids
go build -o waybacksteroids ./cmd/waybacksteroids
```

## Usage

```sh
waybacksteroids -h
```

This will display help for the tool. Here are all the switches it supports.
```
Usage:
  waybacksteroids [flags]

INPUT METHODS (choose one):
  -t, --target    single target domain (e.g., example.com)
  -w, --wordlist  file containing list of domains (one per line)
      --stdin     read domains from stdin (pipe from other tools)

CONFIGURATION:
  -o, --output    output directory (required) - saves results in domain_steroids.txt files
      --threads   number of concurrent threads (default: 3, max recommended: 5)
  -r, --retries   number of retry attempts for failed requests (default: 3)

OUTPUT OPTIONS:
  -p, --print     print URLs to stdout (no files created)
  -v, --verbose   enable verbose mode
  -s, --silent    suppress banner and info messages
```

## Running waybacksteroids

Single domain:
```sh
waybacksteroids -t example.com -o output/
```

Multiple domains from file:

```sh
waybacksteroids -w domains.txt -o output/
```

Pipe from a subdomain enumeration tool:

```sh
seekly -domain example.com | waybacksteroids --stdin -o output/
```

Print to stdout only:

```sh
waybacksteroids -t example.com -p
```
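Any of the input methods above can be combined with the configuration flags from the help text; for instance, raising concurrency and retries while suppressing the banner (the flag values here are illustrative, not recommendations from the tool's authors):

```sh
# 5 threads, 5 retries, no banner — flags as listed under CONFIGURATION above
waybacksteroids -w domains.txt -o output/ --threads 5 -r 5 -s
```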
## Examples

Running

```sh
waybacksteroids -t example.com -o output/
```

creates `output/example.com_steroids.txt` containing:

```
http://example.com/robots.txt
http://example.com/.git/config
http://example.com/api/v1/users
http://example.com/admin/login.jsp
[...]
```
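Because each result file is a plain, de-duplicated URL list, it pipes cleanly into standard Unix tooling. A minimal post-processing sketch — the sample file below simulates the tool's output, and the `grep` patterns are illustrative assumptions, not part of waybacksteroids:

```sh
# Simulate a results file as produced by `waybacksteroids -t example.com -o output/`
mkdir -p output
printf '%s\n' \
  'http://example.com/robots.txt' \
  'http://example.com/.git/config' \
  'http://example.com/api/v1/users' \
  'http://example.com/admin/login.jsp' \
  > output/example.com_steroids.txt

# Keep only URLs that often deserve a closer look; extend the pattern to taste
grep -E '\.git|/admin|/api/' output/example.com_steroids.txt | sort -u
# → http://example.com/.git/config
#   http://example.com/admin/login.jsp
#   http://example.com/api/v1/users
```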