Piloterr Website Scraping Guide: Crawler, Rendering, WebUnlocker


Piloterr offers three complementary website products. This guide explains how they differ and when to use each.

Quick introduction to Web Scraping

Web scraping is the programmatic retrieval of web content (HTML/JSON) to extract structured information. There are two practical approaches:

  • Request-mode: issue HTTP requests with realistic headers, TLS, and network fingerprints to fetch server responses directly.
  • Browser-mode: drive headless browsers that execute JavaScript, load resources, and render the final DOM.
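
To make the distinction concrete, here is a minimal sketch of both approaches using generic Python tooling (requests for request-mode, Playwright for browser-mode). It is illustrative only; the target URL and header values are placeholders, and none of this is Piloterr-specific.

```python
# Request-mode vs browser-mode, sketched with generic Python libraries.
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com"  # placeholder target

# Request-mode: fetch the server response directly; no JavaScript runs.
resp = requests.get(URL, headers={"User-Agent": "Mozilla/5.0"}, timeout=15)
static_html = resp.text

# Browser-mode: render the page in a headless browser so client-side
# JavaScript executes and the final DOM can be read back.
with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()
```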

Common challenges include dynamic client-side rendering, redirects, pagination, rate limits, geo/locale variance, and enterprise anti-bot systems. Use scraping responsibly and in compliance with applicable laws and target site terms.

Piloterr Products

  • Website Crawler: HTTP request mode with advanced fingerprinting. Fastest and lowest cost (1 credit), ideal for static HTML and API/JSON endpoints. No JavaScript execution.
  • Website Rendering: Realistic headless browsers that fully execute JavaScript. Supports waits and selectors for reliable DOM readiness. Higher cost (2 credits). May occasionally fail on heavy/slow pages or strict anti-bot setups.
  • Website WebUnlocker: Hybrid of Rendering + Crawler with robust anti-bot evasion (Cloudflare, DataDome, PerimeterX, Akamai, etc.). Allowlist required. 2 credits. 100% success rate on approved domains.

How do they work?

Crawler (Request-mode)

  • Performs direct HTTP(S) requests with smart header and TLS fingerprinting.
  • Does not run JavaScript; returns raw HTML or body payload quickly.
  • Optional flags like allow_redirects and return_page_source control behavior.
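
As a rough illustration, a Crawler call could look like the sketch below. The endpoint path, the x-api-key header, and the query parameter name are assumptions for this example; allow_redirects and return_page_source are the flags mentioned above. Check the Piloterr documentation for the exact API.

```python
# Hypothetical Crawler (request-mode) call; endpoint path, auth header,
# and "query" parameter are assumptions, not the documented API.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://piloterr.com/api/v2/website/crawler"  # assumed path

params = {
    "query": "https://example.com",  # target URL (assumed parameter name)
    "allow_redirects": "true",       # follow HTTP redirects
    "return_page_source": "true",    # return the raw HTML body
}
resp = requests.get(ENDPOINT, headers={"x-api-key": API_KEY}, params=params, timeout=30)
html = resp.text
```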

Rendering (Browser-mode)

  • Launches realistic browsers to fetch and render pages client-side.
  • Executes JavaScript, loads resources, and can wait for DOM stability with wait_in_seconds or wait_for selectors; supports timeout, block_ads, and browser instructions.
  • More resource intensive but essential for JS-heavy apps.
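
A Rendering call might look like the sketch below. Again, the endpoint path, auth header, and query parameter name are assumptions; wait_for, wait_in_seconds, timeout, and block_ads are the options named above.

```python
# Hypothetical Rendering (browser-mode) call; endpoint path, auth header,
# and "query" parameter are assumptions, not the documented API.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://piloterr.com/api/v2/website/rendering"  # assumed path

params = {
    "query": "https://example.com",  # target URL (assumed parameter name)
    "wait_for": "#app .results",     # CSS selector that signals DOM readiness
    "wait_in_seconds": 3,            # extra settle time after load
    "timeout": 30,                   # give up on heavy/slow pages
    "block_ads": "true",             # skip ad resources to speed up rendering
}
resp = requests.get(ENDPOINT, headers={"x-api-key": API_KEY}, params=params, timeout=60)
rendered_html = resp.text
```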

WebUnlocker (Anti-bot)

  • Combines Rendering and Crawler techniques with purpose-built evasion.
  • Tuned to pass advanced anti-bot challenges on allowlisted domains.
  • Targets near-instant success and stability across hardened properties.
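
A WebUnlocker call follows the same shape; the main difference is that the target domain must already be on your approved allowlist. The endpoint path, auth header, and parameter name are again assumptions for this sketch.

```python
# Hypothetical WebUnlocker call against an allowlisted domain; endpoint path,
# auth header, and "query" parameter are assumptions, not the documented API.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = "https://piloterr.com/api/v2/website/unlocker"  # assumed path

resp = requests.get(
    ENDPOINT,
    headers={"x-api-key": API_KEY},
    params={"query": "https://protected.example.com"},  # must be an approved/allowlisted domain
    timeout=60,
)
resp.raise_for_status()
html = resp.text
```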

When to use which?

  • Choose Crawler: when pages are mostly static, you’re hitting API endpoints, you need maximum throughput/lowest latency, or you want the most economical option.
  • Choose Rendering: when content is rendered client-side, you need precise DOM readiness, or you require interaction-like behavior (JS execution).
  • Choose WebUnlocker: when you face enterprise-grade bot defenses (e.g., Cloudflare, DataDome, PerimeterX, Akamai) and require a 100% success rate on approved domains.
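
If you route requests programmatically, the decision logic above can be captured in a small helper. The product names come from this guide; the function and its flags are made up for the sketch.

```python
# Illustrative routing helper; the boolean flags and return strings are
# made up for this sketch and do not correspond to any Piloterr API field.
def choose_product(needs_js: bool, enterprise_anti_bot: bool, domain_allowlisted: bool) -> str:
    if enterprise_anti_bot and domain_allowlisted:
        return "website/unlocker"    # hardened targets on approved domains
    if needs_js:
        return "website/rendering"   # client-side rendering, DOM-aware waits
    return "website/crawler"         # static HTML / JSON, fastest and cheapest


print(choose_product(needs_js=True, enterprise_anti_bot=False, domain_allowlisted=False))
# -> website/rendering
```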

Key differences at a glance

  • JavaScript execution: Crawler no, Rendering yes, WebUnlocker yes
  • Anti-bot resilience: Crawler basic (fingerprinting), Rendering medium, WebUnlocker very high (enterprise anti-bot bypass)
  • Typical latency: Crawler lowest, Rendering medium/high, WebUnlocker very low and engineered for a 100% success rate on approved domains
  • Cost per request: Crawler 1 credit, Rendering 2 credits, WebUnlocker 2 credits


Conclusion

Use the Crawler when you need speed, scale, and the lowest cost for static or server-rendered pages and APIs. Choose Rendering when the site relies on client-side JavaScript and you need DOM-aware waits. Pick WebUnlocker for properties protected by enterprise anti-bot systems; on approved domains it delivers very low latency with a 100% success rate.