ChatGPT's knowledge cutoff is last year. The web is today. Bridge the gap with denkbot.dog — fetch any URL, get clean text, pass it to your LLM as context. The dog fetches the page. ChatGPT reads it. You get answers.
Real-time web grounding for LLMs, URL-based Q&A systems, article summarization bots, research assistants with web access, and AI tools that need to "read" websites.
import openai
import requests

def ask_about_url(url, question):
    # Fetch the content
    r = requests.post(
        'https://api.denkbot.dog/scrape',
        headers={'Authorization': 'Bearer YOUR_API_KEY'},
        json={'url': url},
    )
    text = r.json()['text'][:8000]  # Token budget

    # Ask ChatGPT
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model='gpt-4o',
        messages=[
            {'role': 'system', 'content': f'Here is the content of {url}:\n\n{text}'},
            {'role': 'user', 'content': question},
        ],
    )
    return response.choices[0].message.content

We're an HTTP API, not an LLM. Use the output as context in your own LLM calls.
It depends on the page. Long articles can be 50k+ characters. Slice appropriately for your token budget.
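A common rule of thumb is roughly four characters per token for English text, so an 8,000-character slice is on the order of 2,000 tokens. A minimal chunking helper, sketched here as one way to stay under a per-call budget (the 8,000-character default mirrors the example above; the chars-per-token ratio is an approximation, not part of the API):

```python
def chunk_text(text, max_chars=8000):
    """Split scraped text into fixed-size slices.

    max_chars is an illustrative budget (~2,000 tokens at ~4 chars/token);
    tune it for your model's context window.
    """
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

For long articles you can summarize each chunk separately and then ask your question against the combined summaries, rather than truncating to the first slice.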
Yes. Latency is 300-2000ms for non-JS pages — acceptable for most chatbot use cases.
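In an interactive chatbot it is worth bounding that latency with a client-side timeout and a simple retry. A sketch, assuming the same endpoint and auth header as the example above (the timeout and retry counts are illustrative defaults, not API requirements):

```python
import requests

def fetch_text(url, api_key, timeout=10, retries=2):
    """Call the scrape endpoint with a timeout and basic retries.

    Endpoint and header follow the example above; timeout (seconds)
    and retry count are illustrative choices for chatbot use.
    """
    for attempt in range(retries + 1):
        try:
            r = requests.post(
                'https://api.denkbot.dog/scrape',
                headers={'Authorization': f'Bearer {api_key}'},
                json={'url': url},
                timeout=timeout,  # bound connect + read time
            )
            r.raise_for_status()
            return r.json()['text']
        except requests.RequestException:
            if attempt == retries:
                raise  # out of retries; surface the error to the caller
```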

€19/year. Unlimited requests. API key ready in 30 seconds.