How to bypass PerimeterX (HUMAN)
Tech builder focused on infrastructure, automation, backend systems, and scalable SaaS development
Companies often have a legitimate need to bypass PerimeterX's defense mechanisms: for testing, QA, business-process automation, and probing the resilience of their own infrastructure. If you're interested in bypassing it, reach out via the contact form on the website, and we'll develop the optimal solution for your needs.
Scraping via raw HTTP requests or default Selenium is a dead end. PerimeterX (now HUMAN Security) fundamentally changed the rules of engagement. Legacy WAFs operate synchronously: the server checks an IP or User-Agent and forces the request to wait for a verdict. PerimeterX flips this model, operating out-of-band and asynchronously.
The architecture relies on three pillars: the Enforcer (a lightweight middleware SDK at the edge), the Sensor (a heavily obfuscated init.js payload injected into the browser), and the Detector (a cloud-based ML engine).
The Enforcer only checks for valid cryptographic tokens (like the _px3 cookie). If the token is valid, the request hits the backend with zero added latency, specifically targeting <2ms for 95% of human requests. If it's missing or expired, the server serves the HTML but injects the JS sensor. Legitimate users experience minimal lag, while bots are sidelined into heavy background verification.
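This split between a cheap synchronous check and heavy asynchronous scoring can be sketched as edge middleware. The cookie name _px3 comes from the text; the function names, the sensor path, and the token-validation logic here are illustrative stand-ins, since the real signature scheme is not public.

```python
import time

SENSOR_SNIPPET = '<script src="/init.js" async></script>'  # illustrative sensor injection

def enforcer_decision(cookies: dict, validate_token) -> str:
    """Sketch of the Enforcer's edge logic: forward the request, or serve the
    page with the sensor injected. `validate_token` stands in for the real
    cryptographic signature check."""
    token = cookies.get("_px3")
    if token and validate_token(token):
        return "forward"          # valid token: request hits the backend untouched
    return "serve_with_sensor"    # missing/expired: inject init.js, score asynchronously

def fake_validator(token: str) -> bool:
    """Toy validator: treat a token as valid if it carries a future expiry suffix."""
    try:
        _, exp = token.rsplit(":", 1)
        return float(exp) > time.time()
    except ValueError:
        return False

print(enforcer_decision({"_px3": f"abc:{time.time() + 300}"}, fake_validator))  # forward
print(enforcer_decision({}, fake_validator))                                    # serve_with_sensor
```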
Layer 1: The Network and Instant Drops (TLS, HTTP/2, and JA4)
Scripts often hit a 403 Forbidden wall before the HTML even loads. This happens because of Deep Packet Inspection (DPI) analyzing network footprints during the initial TLS handshake.
Libraries like Python's requests or Go's standard HTTP client format their Client Hello packets (cipher suites, extension order) completely differently from a real Chrome browser. If your script spoofs a Chrome User-Agent but negotiates TLS using OpenSSL defaults, the session is instantly trashed. Modern bot mitigation has moved beyond basic JA3 hashes, leveraging the JA4/JA4H suite for deep HTTP client profiling.
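To see why cipher suites and extension order matter, here is a deliberately simplified, JA4-style fingerprint: a header describing the handshake's shape plus truncated hashes of the sorted cipher and extension lists. This is an illustration of the idea, not the exact FoxIO JA4 specification, and the cipher/extension values are placeholders.

```python
import hashlib

def ja4_like(tls_version: str, sni: bool, ciphers: list[str],
             extensions: list[str], alpn: str) -> str:
    """Simplified JA4-style fingerprint (illustrative, not the official spec)."""
    head = (
        "t"                                        # TCP (vs 'q' for QUIC)
        + tls_version                              # e.g. "13" for TLS 1.3
        + ("d" if sni else "i")                    # SNI present (domain) or not (IP)
        + f"{len(ciphers):02d}{len(extensions):02d}"
        + (alpn[0] + alpn[-1] if alpn else "00")   # first/last char of ALPN
    )
    c_hash = hashlib.sha256(",".join(sorted(ciphers)).encode()).hexdigest()[:12]
    e_hash = hashlib.sha256(",".join(sorted(extensions)).encode()).hexdigest()[:12]
    return f"{head}_{c_hash}_{e_hash}"

# A Chrome-shaped Client Hello vs an OpenSSL-default one: different cipher and
# extension sets produce visibly different fingerprints, no matter the User-Agent.
chrome_like = ja4_like("13", True, ["1301", "1302", "1303"], ["0000", "0010", "002b"], "h2")
openssl_like = ja4_like("13", True, ["1301", "1302"], ["0000", "002b"], "h2")
print(chrome_like)
print(chrome_like == openssl_like)  # False
```

In practice, the fix is to use a client that replays a real browser's handshake (e.g. curl_cffi's `impersonate="chrome"` mode) rather than trying to hand-assemble one.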
Another massive red flag is datacenter IP addresses. Routing traffic through AWS or DigitalOcean instantly tanks your trust score; surviving this layer requires high-quality rotating residential proxies.
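Residential proxies also need to be managed consistently: a _px3 token minted on one exit IP looks suspicious when replayed from another. A minimal sticky-session sketch, assuming a hypothetical provider pool (all endpoints and credentials are placeholders):

```python
# Hypothetical residential proxy pool; endpoints and credentials are placeholders.
PROXY_POOL = [
    "http://user:pass@res-proxy-1.example.com:8000",
    "http://user:pass@res-proxy-2.example.com:8000",
    "http://user:pass@res-proxy-3.example.com:8000",
]

def proxies_for_session(session_id: int) -> dict:
    """Sticky rotation: the same logical session always gets the same exit IP,
    so the PerimeterX cookie and the IP it was minted on stay consistent."""
    proxy = PROXY_POOL[session_id % len(PROXY_POOL)]
    return {"http": proxy, "https": proxy}   # shape expected by requests/httpx

print(proxies_for_session(0))
print(proxies_for_session(0) == proxies_for_session(0))  # True: sticky per session
```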
Layer 2: Environment Probing (What px.js Looks For)
Once past the network layer, the JS sensor wakes up. Checking for navigator.webdriver is basic. The sensor vacuums up hardware specs: CPU concurrency, device memory constraints, screen resolution, and system font catalogs. It specifically hunts for server-side environments, probing for Node.js global objects like process or sandbox emulators like JSDOM.
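Fortified browsers counter these probes by patching the environment before any page script runs. A minimal example of that kind of init script, defined here as a string (the Playwright usage at the bottom is commented out and is just one way to install it; the specific property values are illustrative):

```python
# Minimal stealth init-script of the kind fortified browsers inject before the
# sensor's probes execute. Property names are standard DOM APIs; the values are
# illustrative "plausible hardware" answers, not a guaranteed bypass.
STEALTH_JS = """
Object.defineProperty(navigator, 'webdriver', { get: () => undefined });
Object.defineProperty(navigator, 'hardwareConcurrency', { get: () => 8 });
Object.defineProperty(navigator, 'deviceMemory', { get: () => 8 });
"""

# Usage with Playwright's sync API, if installed:
# from playwright.sync_api import sync_playwright
# with sync_playwright() as p:
#     page = p.chromium.launch().new_page()
#     page.add_init_script(STEALTH_JS)   # runs before any page JS, including the sensor
#     page.goto("https://example.com")

print("webdriver" in STEALTH_JS)
```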
A rookie mistake is disabling Canvas rendering entirely to "prevent tracking." For PerimeterX, a blocked Canvas API is an automatic ban. The system expects a stable WebGL fingerprint, generated by the unique hardware quirks of the host's GPU and drivers. Moreover, in 2025–2026, PerimeterX heavily adopted WebAssembly (Wasm) fingerprinting. By measuring the execution speed of specific mathematical operations, it can identify the true underlying browser engine, completely ignoring a spoofed User-Agent string.
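The idea behind timing-based engine identification can be illustrated without Wasm at all: run a fixed math workload several times and take the median. Different engines (V8, JavaScriptCore, SpiderMonkey) produce characteristically different timings on such loops; the Python below only demonstrates the measurement technique, not PerimeterX's actual probe.

```python
import time
import statistics

def timing_profile(runs: int = 5, n: int = 200_000) -> float:
    """Median wall-clock time of a fixed arithmetic workload. Repeating and
    taking the median suppresses scheduler noise, the same trick a Wasm
    fingerprinting probe uses to get a stable per-engine signal."""
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        acc = 0.0
        for i in range(1, n):
            acc += (i * i) % 97 / 3.0
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples)

print(f"median workload time: {timing_profile():.4f}s")
```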
Layer 3: Cursor Physics and the HUMAN Challenge
The sensor's crown jewel is continuous behavioral biometrics. The script attaches event listeners to the DOM and passively records micro-dynamics.
Human behavior is inherently chaotic, defined by variable mouse acceleration, micro-tremors, and nonlinear trajectories. Bots built on Puppeteer or Playwright often draw perfectly straight lines or skip intermediate mousemove events entirely. PerimeterX tracks specific coordinate deltas (like movementX and movementY), compresses this telemetry into a JSON payload, and ships it asynchronously to the cloud.
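Synthesizing telemetry that survives this analysis means generating curved, jittered, non-uniformly paced motion rather than straight interpolation. A sketch, using a quadratic Bezier curve with a random control point for the arc, cosine easing for acceleration, and Gaussian noise for micro-tremor (the curve parameters are arbitrary choices, not reverse-engineered thresholds):

```python
import math
import random

def human_mouse_path(start, end, steps=40):
    """Synthesize movementX/movementY deltas along a curved, jittered path.
    A quadratic Bezier with a random control point gives the nonlinear arc;
    per-step Gaussian noise adds the micro-tremor a straight-line bot lacks."""
    (x0, y0), (x1, y1) = start, end
    cx = (x0 + x1) / 2 + random.uniform(-80, 80)   # random bow in the curve
    cy = (y0 + y1) / 2 + random.uniform(-80, 80)
    points = []
    for i in range(steps + 1):
        t = (1 - math.cos(math.pi * i / steps)) / 2   # ease-in/ease-out pacing
        x = (1 - t) ** 2 * x0 + 2 * (1 - t) * t * cx + t ** 2 * x1
        y = (1 - t) ** 2 * y0 + 2 * (1 - t) * t * cy + t ** 2 * y1
        points.append((x + random.gauss(0, 0.6), y + random.gauss(0, 0.6)))
    # Emit per-event deltas, the same shape movementX/movementY report
    return [(round(b[0] - a[0], 2), round(b[1] - a[1], 2))
            for a, b in zip(points, points[1:])]

deltas = human_mouse_path((100, 100), (640, 420))
print(len(deltas), deltas[:3])
```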
If the ML engine is uncertain, it triggers the HUMAN Challenge—the infamous "Press & Hold" CAPTCHA. While the button is held, the sensor harvests ultra-high-resolution telemetry, analyzing the capacitance profile of the touch event or the micro-tremors of the cursor to verify biological authenticity.
Layer 4: State Management and Cookies
The state-management infrastructure relies on a hierarchy of cryptographically signed cookies:

- _px / _px2 / _px3: the core risk token (variable lifespan, e.g., 5.5 minutes). Contains the session state, visitor ID, and trust score.
- _pxhd: a server-side tracking identifier valid for 1 year, pinning the hardware across sessions.
- _pxff_*: feature flags (valid for 1 day) instructing the client-side code on which detection modules to run.
- _pxac: an access-token payload used by the Enforcer to whitelist programmatic requests.
- X-PX-AUTHORIZATION: a validation header generated exclusively by mobile SDKs.
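For a scraper, the practical consequence of these lifespans is knowing when a token goes stale and the expensive browser step must be re-run. A small bookkeeping sketch (the _px3 TTL is the example value cited above; real TTLs are set server-side, and the other names/values are placeholders):

```python
import time

# Lifespans from the cookie hierarchy above; the _px3 figure is the example
# value cited in the text, the others are the stated 1-year / 1-day windows.
COOKIE_TTL = {"_px3": 5.5 * 60, "_pxhd": 365 * 86400, "_pxff_feat": 86400}

class CookieJar:
    """Track when each PerimeterX cookie was minted and whether it is still
    live, so the client knows when to refresh _px3 via a browser pass."""
    def __init__(self):
        self.minted = {}

    def set(self, name, now=None):
        self.minted[name] = time.time() if now is None else now

    def is_live(self, name, now=None):
        now = time.time() if now is None else now
        return name in self.minted and now - self.minted[name] < COOKIE_TTL.get(name, 0)

jar = CookieJar()
jar.set("_px3", now=0.0)
print(jar.is_live("_px3", now=60.0))    # True: within the ~5.5 minute window
print(jar.is_live("_px3", now=600.0))   # False: token expired, re-solve needed
```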
To maintain GDPR compliance, the Enforcer can strip PII by zeroing out the last IP octet in IPv4 (e.g., 1.2.3.4 becomes 1.2.3.0) before routing telemetry to the cloud. It also features Credential Intelligence (ATO protection), hashing submitted usernames and passwords to cross-reference against breach databases, effectively killing credential stuffing attacks.
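The IP-truncation step is simple enough to show directly; this is a sketch of the transformation described above, not HUMAN's actual code:

```python
def anonymize_ipv4(ip: str) -> str:
    """Zero the last octet before telemetry leaves the edge, so the stored
    address no longer identifies a single host."""
    a, b, c, _ = ip.split(".")
    return f"{a}.{b}.{c}.0"

print(anonymize_ipv4("1.2.3.4"))  # 1.2.3.0
```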
Layer 5: Mobile SDKs (Shifting Logic to Native Code)
When scraping mobile applications (iOS/Android), the battlefield changes. The verification logic moves from readable JavaScript in the DOM to compiled, heavily obfuscated ARM assembly.
When the app triggers a protected API, the native SDK intercepts the call, generates a device fingerprint, and requests a mathematical challenge from the backend. This challenge is solved locally using embedded C/C++ libraries. The SDK then injects the resulting cryptographic token into the X-PX-AUTHORIZATION HTTP header. Without it, the server drops the request. Furthermore, the SDK enforces strict SSL certificate pinning to perimeterx.net, neutralizing standard Man-in-the-Middle (MitM) interception attempts via Charles Proxy.
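The request shape this produces can be sketched as follows. The header name X-PX-AUTHORIZATION comes from the text; the HMAC "solver", the version prefix, and every parameter are stand-ins, since the real algorithm lives in compiled, obfuscated native code.

```python
import hashlib
import hmac
import time

def solve_challenge(device_fp: str, challenge: str, secret: bytes) -> str:
    """Stand-in for the SDK's native solver: derive a token binding the device
    fingerprint to the server's challenge. HMAC here only illustrates the
    flow; the production algorithm is not public."""
    msg = f"{device_fp}:{challenge}:{int(time.time())}".encode()
    return hmac.new(secret, msg, hashlib.sha256).hexdigest()

def authorized_headers(device_fp: str, challenge: str, secret: bytes) -> dict:
    token = solve_challenge(device_fp, challenge, secret)
    return {"X-PX-AUTHORIZATION": f"3:{token}"}   # "3:" version prefix is illustrative

headers = authorized_headers("device-abc", "nonce-123", b"embedded-key")
print(headers["X-PX-AUTHORIZATION"][:10])
```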
Evasion Strategies
Legitimate QA testing and authorized data collection require structured bypass methods:

- QA and Internal Testing (Whitelisting): Engineers don't spin up headless browsers for internal tests. Passing a valid token via the x-px-access-token header or setting bypass_monitor_header: 1 bypasses active blocking entirely at the Enforcer level.
- Hybrid Scraping: Fortified browsers (Puppeteer Stealth, Undetected ChromeDriver, SeleniumBase UC Mode) are executed exactly once to solve the JS challenges and harvest valid cookies (like _px3). The session is then handed off to a fast HTTP client for high-throughput scraping until the token expires.
- Reverse Engineering: Hardcore specialists deobfuscate the init.js sensor, decipher the cookie-generation algorithms, and synthesize behavioral telemetry (movementX/Y coordinates, touch durations) programmatically. This solves the cryptographic challenges purely via HTTP requests, bypassing the browser entirely.
- Web Unlockers: At enterprise scale, maintaining custom bypass infrastructure is financially unviable. Teams outsource the headache to scraping APIs (like ZenRows, Scrapfly, or ScraperAPI), which manage residential proxies, rotate JA4 fingerprints, and automatically solve JS challenges under the hood.
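The hybrid-scraping handoff above can be sketched as follows. Cookie names are the real ones discussed in this article; the browser and HTTP-client calls are shown as comments, and the harvested values are placeholders.

```python
def cookies_to_header(cookies: dict) -> str:
    """Serialize harvested cookies into a Cookie header for a raw HTTP client."""
    return "; ".join(f"{k}={v}" for k, v in cookies.items())

# 1) One-time browser pass (e.g. SeleniumBase UC Mode / Puppeteer Stealth):
#    driver.get(target_url)   -> sensor runs, challenges get solved
#    harvested = {c["name"]: c["value"] for c in driver.get_cookies()}
harvested = {"_px3": "token...", "_pxhd": "hwid...", "_pxff_x": "1"}  # placeholder values

# 2) High-throughput phase with a fast client (requests/httpx/curl_cffi),
#    sending the same Cookie header on every call until _px3 goes stale:
#    session.get(api_url, headers={"Cookie": header})
header = cookies_to_header(harvested)
print(header)
```

The key design point is that the expensive browser only runs when the token expires; everything in between is cheap raw HTTP.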
A perfect browser fingerprint is garbage if it contradicts the TLS signature (JA4) of the underlying network request. Absolute consistency across the entire protocol stack is the only way to survive systems like PerimeterX.