OpenAI function calling lets your model decide when to fetch a URL. Define a scrape function, hand it to your model, and GPT-4 will call it whenever it needs live web data. denkbot.dog returns structured JSON with title, text, links, and metadata — exactly what a language model needs to reason about a page.
Typical uses: GPT-4 research assistants, o1 agents that browse docs to answer questions, custom ChatGPT-like tools that need live web access, and Assistants API agents with browsing capabilities.
import os

import httpx
import openai

DENKBOT_API_KEY = os.environ["DENKBOT_API_KEY"]

def scrape(url: str, render_js: bool = False) -> dict:
    """Fetch a page through denkbot.dog and return its structured JSON."""
    r = httpx.post(
        "https://api.denkbot.dog/scrape",
        headers={"Authorization": f"Bearer {DENKBOT_API_KEY}"},
        json={"url": url, "renderJs": render_js, "format": "json"},
        timeout=30,
    )
    r.raise_for_status()
    return r.json()

tools = [{
    "type": "function",
    "function": {
        "name": "scrape",
        "description": "Fetch content from a URL. Returns title, text, and links.",
        "parameters": {
            "type": "object",
            "properties": {
                "url": {"type": "string", "description": "URL to fetch"},
                "render_js": {"type": "boolean", "description": "Render JavaScript first"},
            },
            "required": ["url"],
        },
    },
}]

response = openai.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize https://example.com"}],
    tools=tools,
)

Does this work with the Assistants API? Yes — define scrape as a function tool in your Assistant, and it will call it automatically whenever browsing is needed.
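Completing the loop: when the model decides to browse, the response carries tool_calls rather than text, and it is up to your code to run scrape locally and send the result back as a tool message. The helper below is a minimal sketch of that dispatch step (the function name and wiring comments are illustrative, not part of the denkbot.dog API):

```python
import json

def execute_tool_call(tool_call: dict, functions: dict) -> dict:
    """Run one model-requested tool call against a local Python function
    and package the result as a `tool` message for the follow-up request."""
    fn = functions[tool_call["function"]["name"]]
    args = json.loads(tool_call["function"]["arguments"])  # model sends JSON text
    return {
        "role": "tool",
        "tool_call_id": tool_call["id"],
        "content": json.dumps(fn(**args)),
    }

# Hypothetical wiring against the response from above:
# msg = response.choices[0].message
# if msg.tool_calls:
#     messages.append(msg)
#     for tc in msg.tool_calls:
#         messages.append(execute_tool_call(tc.model_dump(), {"scrape": scrape}))
#     final = openai.chat.completions.create(model="gpt-4o",
#                                            messages=messages, tools=tools)
```

Appending both the assistant message and one tool message per tool_call_id is required; the follow-up request fails if any requested call is left unanswered.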
Can I feed screenshots to the model? Yes — for vision-enabled models. Use GET /screenshot to get a public image URL and pass it as an image_url message.
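Assuming GET /screenshot hands back a public image URL as described above (its exact parameters are not shown here), a small helper can wrap that URL into the multimodal message shape that vision-capable chat models accept:

```python
def vision_message(prompt: str, image_url: str) -> dict:
    """Build a user message combining text and an image URL,
    e.g. a screenshot URL returned by GET /screenshot."""
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url", "image_url": {"url": image_url}},
        ],
    }

# messages=[vision_message("What does this page sell?", shot_url)]
# then pass to chat.completions.create with a vision-capable model like gpt-4o
```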
Why use denkbot.dog instead of a model's built-in browsing? Full control over what's fetched, cached responses (15 min), raw HTML access, and no content filtering on the scraping side.

€19/year. Unlimited requests. API key ready in 30 seconds.