If you’re running scraping tasks or AI agents that need to browse the web, you don’t want to spawn a browser from scratch every time. You want an army—pre-spawned, ready, and manageable. Here’s the setup I keep coming back to.
RAM is the bottleneck. I usually go for a dedicated machine with as much memory as possible—at least 64GB. No GPU needed, just solid cores and space to breathe.
Datacenter IPs don’t cut it anymore. I use residential proxies behind a simple load balancer. Nothing fancy—just enough rotation to avoid blocks.
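For reference, here's roughly what that wiring looks like on the Puppeteer side. The proxy address and credentials are placeholders; the real pieces are the `--proxy-server` launch flag and `page.authenticate()` for balancers that want basic auth:

```js
const puppeteer = require('puppeteer');

async function launchWithProxy() {
  // Every request from this browser goes through the load balancer,
  // which handles rotation across the residential exits behind it.
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--proxy-server=http://proxy.internal:8000'], // placeholder address
  });

  const page = await browser.newPage();
  // Only needed if the balancer asks for credentials.
  await page.authenticate({ username: 'scraper', password: 'secret' }); // placeholder creds
  return { browser, page };
}
```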
I spin up as many Puppeteer instances as the RAM can handle. These sit idle, headless, and waiting. Cold starts are expensive—better to keep the pool warm.
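A minimal sketch of warming the pool. The count is whatever your RAM allows; 50 below is a stand-in number, and the `pool` array is plain bookkeeping, not a library feature:

```js
const puppeteer = require('puppeteer');

const POOL_SIZE = 50; // tune to available RAM
const pool = [];

async function warmPool() {
  for (let i = 0; i < POOL_SIZE; i++) {
    const browser = await puppeteer.launch({ headless: true });
    // Keep the browser and its debugging endpoint so workers can attach later.
    pool.push({ browser, wsEndpoint: browser.wsEndpoint(), busy: false, alive: true });
  }
  console.log(`${pool.length} browsers warm and waiting`);
}

warmPool();
```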
Each browser exposes a WebSocket endpoint using remote debugging. I run a small script that keeps them subscribed to a central “heartbeat,” so I know which ones are alive and which ones died silently.
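One way to run that check, assuming the `pool` array from the sketch above: on an interval, make a cheap round-trip to each browser over its existing CDP connection (`browser.version()` works) and flag anything that doesn't answer.

```js
// pool is the array from the warm-up sketch above.
async function heartbeat() {
  for (const entry of pool) {
    try {
      // A lightweight round-trip over the existing CDP connection.
      await entry.browser.version();
      entry.alive = true;
    } catch (err) {
      entry.alive = false; // died silently; flag it for replacement
    }
  }
}

setInterval(heartbeat, 10_000); // check every 10 seconds
```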
On request: a worker grabs an idle browser from the pool, attaches over its WebSocket endpoint, runs the job in a fresh page, then detaches and hands the browser back. Nothing gets launched in the hot path.
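In code, the request path stays small. `handleRequest` and the `job` shape are my names, not a library API, and `pool` is the same array as above:

```js
const puppeteer = require('puppeteer');

// pool is the same array as in the warm-up sketch.
async function handleRequest(job) {
  const entry = pool.find((b) => !b.busy && b.alive);
  if (!entry) throw new Error('pool exhausted');
  entry.busy = true;

  // Attach over the WebSocket endpoint instead of launching a new browser.
  const browser = await puppeteer.connect({ browserWSEndpoint: entry.wsEndpoint });
  try {
    const page = await browser.newPage();
    await page.goto(job.url, { waitUntil: 'networkidle2' });
    return await page.content();
  } finally {
    await browser.disconnect(); // detach without killing the browser
    entry.busy = false;
  }
}
```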
Memory leaks? Don’t debug. Just destroy and replace. Works better long-term.
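The recycling helper in that spirit, again assuming the pool entries from earlier (`recycle` is my name for it):

```js
const puppeteer = require('puppeteer');

// Kill a misbehaving browser and slot a fresh one into the same pool entry.
async function recycle(entry) {
  try {
    await entry.browser.close(); // don't diagnose the leak, just end the process
  } catch (err) {
    // Process already gone; nothing to clean up.
  }
  const browser = await puppeteer.launch({ headless: true });
  entry.browser = browser;
  entry.wsEndpoint = browser.wsEndpoint();
  entry.busy = false;
  entry.alive = true;
}
```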
This setup has held up across scraping workloads, agent testing, and even some UI test runs. Clean. Reusable. Disposable.