Broken links are an embarrassment and an SEO problem. Finding them means crawling the site, extracting the links, and checking each one. denkbot.dog handles the crawling and link extraction; you wire up the status-code checking yourself. The dog finds the links. You decide which ones are broken.
Typical use cases: website maintenance, SEO health checks, automated 404 detection, pre-launch link audits, and ongoing link health monitoring.
// Step 1: Crawl to get all pages
const crawl = await fetch('https://api.denkbot.dog/crawl', {
  method: 'POST',
  headers: { 'Authorization': 'Bearer YOUR_API_KEY', 'Content-Type': 'application/json' },
  body: JSON.stringify({ url: 'https://example.com', maxPages: 100 }),
}).then(r => r.json())

// Step 2: Scrape each page and extract links
const allLinks = new Set()
for (const page of getAllUrls(crawl.tree)) {
  const { links } = await scrape(page)
  links.forEach(l => allLinks.add(l.href))
}
// Step 3: Check each link (HEAD first; some servers reject HEAD, so retry with GET)
const checkLink = async (href) => {
  try {
    let r = await fetch(href, { method: 'HEAD' })
    if (r.status === 405) r = await fetch(href, { method: 'GET' })
    return { href, status: r.status, ok: r.ok }
  } catch {
    return { href, status: 0, ok: false } // network error, DNS failure, bad URL
  }
}
const results = await Promise.all([...allLinks].map(checkLink))

denkbot.dog doesn't offer broken-link checking as a dedicated feature. You combine /crawl and /scrape and do the status checking yourself.
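The walkthrough leans on two helpers, scrape and getAllUrls, that it never defines. Here is one way they might look; the exact /scrape request shape and the structure of the crawl tree are assumptions (the sketch assumes each tree node is { url, children: [...] }), so check the API reference before copying:

```javascript
const API_KEY = 'YOUR_API_KEY'

// scrape(url): fetch one page via /scrape and return the parsed response,
// which includes the links array of { href, ... } objects.
// The request body shape here is an assumption.
const scrape = (url) =>
  fetch('https://api.denkbot.dog/scrape', {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${API_KEY}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ url }),
  }).then(r => r.json())

// getAllUrls(tree): flatten a crawl tree (assumed shape: { url, children })
// into a flat array of page URLs, depth-first.
const getAllUrls = (node, out = []) => {
  out.push(node.url)
  for (const child of node.children ?? []) getAllUrls(child, out)
  return out
}
```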
The links array returned by /scrape includes every href on the page, external links included, so you can check those too.
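Since external hrefs come back alongside internal ones, a report usually wants them separated. A small sketch of that split, using your site's origin as the dividing line (splitBroken is an illustrative name, not part of the API):

```javascript
// Split checked links into internal vs. external breakage by hostname.
// `results` is an array of { href, status, ok } objects from the checking step.
const splitBroken = (results, siteOrigin) => {
  const siteHost = new URL(siteOrigin).hostname
  const broken = results.filter(r => !r.ok)
  return {
    internal: broken.filter(r => new URL(r.href, siteOrigin).hostname === siteHost),
    external: broken.filter(r => new URL(r.href, siteOrigin).hostname !== siteHost),
  }
}
```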
Total runtime depends on your parallelism. At 10 concurrent requests, a 100-page crawl typically takes around 2-3 minutes.
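Note that Promise.all in the checking step fires every request at once. To actually cap concurrency at 10 (or be polite to the sites you're checking), a minimal worker-pool sketch; checkAll is an illustrative helper, and checkLink is whatever single-link checker you use:

```javascript
// Run checkLink over hrefs with at most `concurrency` requests in flight:
// N workers pull URLs from a shared queue until it's empty.
const checkAll = async (hrefs, checkLink, concurrency = 10) => {
  const queue = [...hrefs]
  const results = []
  const worker = async () => {
    while (queue.length > 0) {
      const href = queue.shift() // shift is synchronous, so no two workers take the same URL
      results.push(await checkLink(href))
    }
  }
  await Promise.all(Array.from({ length: concurrency }, worker))
  return results
}
```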

€19/year. Unlimited requests. API key ready in 30 seconds.