v2.4 — Now with batch crawling

Web scraping
infrastructure
for everyone

A powerful dashboard for non-technical users and a complete API and SDK for developers. Turn any URL into structured data: no code required, or fully programmable.

scrape.ts (TypeScript)
import { Jetscrape } from "@jetscrape/sdk";

const jet = new Jetscrape({ apiKey: process.env.JETSCRAPE_KEY });

const result = await jet.scrape("https://example.com", {
  format: "markdown",
  waitFor: "networkidle",
  extract: {
    title: "string",
    price: "number",
    inStock: "boolean"
  }
});

console.log(result.data);
// { title: "Product X", price: 29.99, inStock: true }
47ms
Median response time
99.97%
Uptime SLA
100M+
Daily requests
100+
Templates
Powering data pipelines at
Microsoft · Yandex · Mindsite · Easypoint · Civar
Capabilities

Everything you need to
extract data at scale

One API to replace your scraping stack. No proxy management, no browser orchestration, no HTML parsing.

Structured Extraction

Define a schema and get type-safe JSON back. Zod-compatible validation built in.
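Conceptually, the schema maps field names to primitive types and the response is checked against it. As a rough client-side sketch of that check (the `validate` helper below is illustrative, not part of the SDK):

```typescript
// Hypothetical validator mirroring the schema shapes accepted by
// `extract`: each field declares "string" | "number" | "boolean".
type FieldType = "string" | "number" | "boolean";

function validate(
  data: Record<string, unknown>,
  schema: Record<string, FieldType>
): boolean {
  // Every schema field must exist and match its declared primitive type.
  return Object.entries(schema).every(
    ([key, type]) => typeof data[key] === type
  );
}

const schema: Record<string, FieldType> = {
  title: "string",
  price: "number",
  inStock: "boolean",
};

validate({ title: "Product X", price: 29.99, inStock: true }, schema);
// → true
```

In practice the built-in Zod-compatible validation does this for you; the sketch only shows the shape of the contract.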

Headless Browsers

Auto-managed Chromium fleet. JavaScript rendering, infinite scroll, stealth mode.

Clean Markdown

LLM-ready output. Preserves tables, code blocks, headings, and semantic structure.

AI Crawling

No selectors needed. AI automatically identifies and extracts the right elements from any page layout.

UI-to-API Parity

Every operation in the dashboard has an API and SDK equivalent. Build visually, deploy programmatically.

AI Operator

An intelligent agent that guides you through scraping workflows, suggests configurations, and troubleshoots issues.

Crawling Workflows

Chain multi-step crawls: extract parameters from one page, build dynamic URLs with custom headers, and feed them into the next request automatically.

Unblocking & Anti-Bot

Bypass Turnstile, reCAPTCHA, and anti-bot systems automatically. Full browser fingerprint control with stealth mode.

Batch Crawling

Crawl entire sitemaps or domains. Parallel workers with deduplication and rate limiting at scale.
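The dedup-plus-worker-pool part of a batch crawl can be sketched in a few lines of plain TypeScript (illustrative only, not SDK internals; rate limiting is omitted):

```typescript
// Illustrative sketch: dedupe a URL list, then fan it out across a
// fixed-size pool of async workers, as a batch crawl would.
async function crawlAll(
  urls: string[],
  worker: (url: string) => Promise<void>,
  concurrency = 4
): Promise<number> {
  const queue = [...new Set(urls)]; // deduplication
  let crawled = 0;
  const runners = Array.from({ length: concurrency }, async () => {
    // Each runner pulls from the shared queue until it drains.
    while (queue.length > 0) {
      const url = queue.shift()!;
      await worker(url);
      crawled += 1;
    }
  });
  await Promise.all(runners);
  return crawled;
}
```

Feeding it four URLs with one duplicate crawls only the three unique pages, never more than `concurrency` at a time.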

Workflow

Three lines of code.
Any website, structured.

01

Initialize the client

Install the SDK and authenticate with your API key. Supports Node.js, Python, Go, and cURL.

$ npm install @jetscrape/sdk
02

Send a scrape request

Pass a URL and specify the output format. We handle proxies, rendering, and anti-bot bypass.

jet.scrape("https://...", {
  format: "markdown"
})
03

Get structured data

Receive clean, validated JSON or Markdown. Ready for LLMs, databases, or downstream processing.

{
  "title": "...",
  "content": "...",
  "links": [...]
}
Deep dive

Built for production workloads

Every component is designed for reliability at scale. No breaking changes, no surprise rate limits.

01 / AI CRAWLING

AI finds selectors for you

Describe what you need in plain language. Our AI analyzes the page structure, identifies the right elements, and builds extraction rules automatically — no CSS selectors or XPath required.

View AI crawling docs
ai-crawl.ts
const result = await jet.scrape("https://store.example.com", {
  aiExtract: true,
  prompt: "Find all products with name, price, stock",
});

// AI auto-detects selectors — no CSS/XPath needed
console.log(result.data);
// [{ name: "Widget", price: 29.99, stock: true }, ...]
02 / AI OPERATOR

Your scraping co-pilot

An intelligent agent that helps you design workflows and troubleshoot failures, suggests optimal configurations, and automates repetitive scraping tasks end-to-end.

Meet the operator
operator.ts
const agent = jet.operator({
  goal: "Monitor competitor prices daily",
  urls: ["https://competitor-a.com", "https://competitor-b.com"],
});

const plan = await agent.plan();
// Agent suggests: schedule, selectors, alerts
await agent.execute(plan);
03 / WORKFLOWS

Multi-step crawling pipelines

Chain requests into complex workflows: crawl a page, extract parameters, build dynamic URLs with custom headers, and feed them into the next step. Handle login flows, pagination, and deep crawls.

Build a workflow
workflow-chain.ts
// Multi-step crawl: login → list → detail
const flow = jet.workflow([
  { url: "https://app.example.com/login",
    actions: [{ fill: "#email", value: env.EMAIL }] },
  { url: "https://app.example.com/items",
    extract: { links: "a.item-link[href]" } },
  { url: "{{each links}}",  // dynamic URLs
    headers: { "X-Token": "{{cookies.session}}" },
    extract: { title: "string", price: "number" } },
]);
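The `{{each links}}` placeholder fans each link extracted in the previous step out into its own request. Conceptually the expansion works like this (an illustrative sketch, not the workflow engine's actual implementation; nested paths like `cookies.session` are left out for brevity):

```typescript
// Expand a URL template against variables collected by earlier steps.
// "{{each name}}" fans out to one URL per array element; plain
// "{{name}}" placeholders are substituted inline.
function expand(
  template: string,
  vars: Record<string, string | string[]>
): string[] {
  const each = template.match(/^\{\{each (\w+)\}\}$/);
  if (each) {
    // One concrete URL (and thus one request) per array element.
    return vars[each[1]] as string[];
  }
  return [template.replace(/\{\{(\w+)\}\}/g, (_, k) => String(vars[k]))];
}

expand("{{each links}}", { links: ["https://a/1", "https://a/2"] });
// → ["https://a/1", "https://a/2"]
```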
04 / UNBLOCKING

Bypass any anti-bot system

Automatically solve Turnstile, reCAPTCHA, and WAF challenges. Full browser fingerprint control, stealth mode, residential proxies, and device emulation — all transparent to your code.

View unblocking docs
unblock.ts
const result = await jet.scrape(url, {
  unblock: true,
  browser: {
    stealth: true,
    solveTurnstile: true,
    fingerprint: "desktop-chrome",
    viewport: { w: 1920, h: 1080 },
  },
  proxy: "residential-us",
});
05 / UI → API

UI-first, API-ready

Build and test scraping workflows visually in the dashboard. When ready, export to API calls or SDK code with one click. Every UI action has a programmatic equivalent.

Explore the SDK
export.ts
// Export any dashboard workflow as code
const workflow = await jet.workflows.export(
  "wf_price_monitor"
);

// Run via SDK — identical to clicking "Run" in UI
await jet.workflows.run(workflow.id, {
  schedule: "0 9 * * *",
  webhook: "https://api.you.com/hook",
});
API Reference

A single endpoint.
Infinite flexibility.

RESTful by default. SDKs for TypeScript, Python, and Go included.

POST /v1/scrape

Send a URL and receive structured data back. Customize output format, rendering behavior, extraction schema, and proxy region.

POST https://api.jetscrape.com/v1/scrape
GET  https://api.jetscrape.com/v1/batch/{id}
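A request body assembled from the options shown throughout this page might look like the following (a sketch drawn from the examples above, not an exhaustive parameter reference):

```json
{
  "url": "https://example.com/product/142",
  "format": "markdown",
  "waitFor": "networkidle",
  "extract": {
    "price": "number",
    "currency": "string",
    "inStock": "boolean",
    "rating": "number"
  },
  "proxy": "residential-us"
}
```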
Response — 200 OK · 47ms
{
  "success": true,
  "data": {
    "url": "https://example.com/product/142",
    "title": "Wireless Keyboard Pro",
    "markdown": "# Wireless Keyboard Pro\n\n...",
    "extract": {
      "price": 79.99,
      "currency": "USD",
      "inStock": true,
      "rating": 4.8
    },
    "metadata": {
      "statusCode": 200,
      "latency": 47,
      "proxy": "us-east-1"
    }
  }
}
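Client-side, the response body above could be modeled with a type like this (an illustrative shape inferred from the sample response, not the SDK's published type):

```typescript
// Types matching the sample response body shown above.
interface ScrapeResponse {
  success: boolean;
  data: {
    url: string;
    title: string;
    markdown: string;
    // Extracted fields follow the primitive types declared in the schema.
    extract: Record<string, string | number | boolean>;
    metadata: {
      statusCode: number;
      latency: number; // milliseconds
      proxy: string;   // region the request was routed through
    };
  };
}

const sample: ScrapeResponse = {
  success: true,
  data: {
    url: "https://example.com/product/142",
    title: "Wireless Keyboard Pro",
    markdown: "# Wireless Keyboard Pro\n\n...",
    extract: { price: 79.99, currency: "USD", inStock: true, rating: 4.8 },
    metadata: { statusCode: 200, latency: 47, proxy: "us-east-1" },
  },
};
```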
Pricing

Predictable, usage-based pricing

Start free. Scale with transparent pricing. No hidden fees, no per-seat charges.

Free
For testing and prototyping
$0
1,000 pages / month
  • HTML, JSON & Markdown output
  • Playground access
  • 5 concurrent requests
  • Community support
Get started
Growth
For production workloads
$50
50,000 pages / month
  • Everything in Free
  • Full API access
  • 20 concurrent requests
  • Up to 15 crawls per minute
  • Email support
Start free trial
Pro
For teams at scale
$250
300,000 pages / month
  • Everything in Growth
  • 50 concurrent requests
  • Up to 30 crawls per minute
  • Priority support
  • Webhooks & streaming
Get started
Enterprise
Custom infrastructure
Custom
Unlimited pages
  • Everything in Pro
  • Dedicated proxies
  • Custom SLA (99.99%)
  • Custom rate limits
  • Dedicated support engineer
Contact sales

Start scraping in 30 seconds

Free tier included. No credit card required. Get your API key and start extracting structured data immediately.

$ npx @jetscrape/cli scrape https://example.com --format markdown