Agentic workflows have different needs from one-off scripts. They run in loops, hit the same URLs repeatedly, need predictable output schemas, and sometimes need to see what a page looks like — not just read it. denkbot.dog is built for this: a 15-minute cache, a consistent JSON schema, raw PNG screenshots, and no SSRF surprises.
Typical users: autonomous research agents; multi-agent pipelines where one agent fetches and another analyzes; continuous monitoring agents that poll URLs for changes; and vision agents that need to see rendered pages.
# Agent loop — fetch + analyze pattern:
import os

import httpx

DENKBOT_API_KEY = os.environ["DENKBOT_API_KEY"]

def agent_fetch(url: str) -> dict:
    """Cache-aware fetch — safe to call repeatedly."""
    r = httpx.post(
        "https://api.denkbot.dog/scrape",
        headers={"Authorization": f"Bearer {DENKBOT_API_KEY}"},
        json={"url": url, "format": "json"},
    )
    r.raise_for_status()
    data = r.json()
    # data["cached"] is True if this URL was fetched within the last 15 minutes
    return {
        "title": data["title"],
        "text": data["text"][:4000],  # Trim for context window
        "links": data["links"][:20],
        "cached": data["cached"],
    }

Agents often re-fetch the same URLs in loops. With 15-minute caching, the second request for a URL is instant and free. Pass no_cache: true to bypass the cache.
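The no_cache flag fits naturally into a small request builder. This is a sketch: scrape_payload and its fresh parameter are illustrative names, not part of the API; only the url, format, and no_cache fields come from the docs above.

```python
def scrape_payload(url: str, fresh: bool = False) -> dict:
    """Build a /scrape request body; fresh=True bypasses the 15-minute cache."""
    body = {"url": url, "format": "json"}
    if fresh:
        body["no_cache"] = True  # force a live fetch instead of a cached copy
    return body
```

Use it as the json= argument to httpx.post, flipping fresh only in the rare iteration where the agent must see the live page.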
Yes. Every response contains url, title, html, text, metadata, links, statusCode, cached, and durationMs. Fields are always present and never null; a missing value comes back as an empty string.
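Because those nine keys are guaranteed, an agent can fail fast on anything unexpected before handing data to the next stage. A minimal sketch, assuming only the field list above (assert_schema is an illustrative helper, not part of any client library):

```python
REQUIRED_FIELDS = (
    "url", "title", "html", "text", "metadata",
    "links", "statusCode", "cached", "durationMs",
)

def assert_schema(data: dict) -> dict:
    """Raise if a /scrape response is missing any guaranteed field."""
    missing = [k for k in REQUIRED_FIELDS if k not in data]
    if missing:
        raise ValueError(f"response missing guaranteed fields: {missing}")
    return data
```

Running every response through a check like this keeps a multi-agent pipeline from silently passing malformed data downstream.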
Yes. GET /screenshot?url=... returns a raw PNG and is publicly accessible, so you can embed the URL directly as an image_url in your vision model's message.
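One way to wire that into a vision agent: build the screenshot URL and drop it into an OpenAI-style multimodal message. The content-part layout is an assumption about your model provider's chat format, and screenshot_message is a hypothetical helper; only the /screenshot endpoint itself comes from the docs above.

```python
from urllib.parse import urlencode

def screenshot_message(page_url: str, question: str) -> dict:
    """Build a chat message that shows the model a rendered page."""
    # The endpoint is public, so the model provider can fetch the PNG directly
    shot_url = "https://api.denkbot.dog/screenshot?" + urlencode({"url": page_url})
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": question},
            {"type": "image_url", "image_url": {"url": shot_url}},
        ],
    }
```

Note the urlencode call: the target URL must be percent-encoded so its own query string does not leak into the screenshot request.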

€19/year. Unlimited requests. API key ready in 30 seconds.