LangChain agents can use any Python function as a tool. Wrap a denkbot.dog call in a @tool decorator and your agent can fetch any URL, render JavaScript SPAs, and get back clean text, all with a single API call. No custom scraping logic, no browser lifecycle to manage.
Typical use cases: LangChain ReAct agents that browse URLs as part of their reasoning, research chains that collect data from multiple sources, and document loaders that pull live web content into vector stores.
import os

import httpx
from langchain.tools import tool

DENKBOT_API_KEY = os.environ["DENKBOT_API_KEY"]

@tool
def scrape_url(url: str) -> str:
    """Fetch and extract text content from any URL."""
    resp = httpx.post(
        "https://api.denkbot.dog/scrape",
        headers={"Authorization": f"Bearer {DENKBOT_API_KEY}"},
        json={"url": url, "renderJs": True, "format": "json"},
        timeout=30,
    )
    resp.raise_for_status()
    data = resp.json()
    return f"Title: {data['title']}\n\n{data['text']}"
# Use in your agent:
from langchain.agents import AgentExecutor, create_react_agent

agent = create_react_agent(llm, tools=[scrape_url], prompt=prompt)
executor = AgentExecutor(agent=agent, tools=[scrape_url])

The tool also works in LangGraph: it's a plain Python function, so you can drop it into any LangGraph node as a tool call.
To build a document loader, wrap the response in a LangChain Document with page_content=data["text"] and metadata=data["metadata"]. Done.
For JavaScript-heavy SPAs, set renderJs: true. Playwright Chromium runs under the hood and returns the fully rendered DOM.

€19/year. Unlimited requests. API key ready in 30 seconds.