cruge (matt)

why peeringdb matters more than ever

Jan 15, 2026 — 8 min read

sometime in 2012, i started getting serious about peering. i was running a small network, doing the usual thing — emailing NOCs, asking for peering, getting ghosted half the time. the problem wasn't convincing people to peer. the problem was finding them.

back then, if you wanted to know who was at a particular internet exchange or colocation facility, you were doing detective work. whois records, mailing lists, word of mouth at NANOG. maybe someone had a spreadsheet somewhere. it was chaos.

that's when i found peeringdb. this scrappy, community-maintained database where networks could register their presence at facilities and exchanges. it was rough around the edges, but the idea was right: a single place where anyone could find out who's where, what ASNs they operate, and how to reach their peering coordinators.

i didn't just use it. i started contributing. fixing data, adding facilities, reaching out to networks that hadn't updated their records. then contributing code. then joining the board.

here's what most people don't realize about peeringdb: it's entirely volunteer-run. no massive engineering team, no VC funding, no enterprise sales motion. just a group of network operators who believe the internet works better when interconnection data is freely available.

we rebuilt the platform from scratch around 2015-2016. the old codebase was showing its age — it was a single PHP application that had been patched and extended for years. we migrated everything to django, built a proper REST API, added authentication and organizational accounts. the migration was terrifying. thousands of networks depended on this data daily.

the API changed everything. suddenly, network automation tools could query peeringdb programmatically. peering managers could script their workflows. route server operators could auto-generate configurations. what used to take hours of manual work became a single API call.
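to make that concrete, here's a minimal sketch of what "a single API call" looks like. the endpoint and the `data` envelope match the public peeringdb API, but the sample record below is illustrative, not a real network:

```python
import json
from urllib.parse import urlencode

API_BASE = "https://www.peeringdb.com/api"

def net_query_url(asn: int) -> str:
    # look up a network record by its ASN
    return f"{API_BASE}/net?" + urlencode({"asn": asn})

# the live API wraps results in {"data": [...]}; this payload is a
# made-up example with documentation ASN 64496
sample = json.loads("""{"data": [{"name": "Example Net",
  "asn": 64496, "info_prefixes4": 50, "policy_general": "Open"}]}""")

def summarize(payload: dict) -> str:
    # pull the fields a peering manager actually cares about
    net = payload["data"][0]
    return f'AS{net["asn"]} {net["name"]} ({net["policy_general"]} peering)'

print(net_query_url(64496))
print(summarize(sample))
```

swap the sample for the body of a live GET and the same `summarize` works unchanged — that's the whole point: one well-shaped JSON response instead of an email thread.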

today peeringdb has over 30,000 network records, 12,000 facility listings, and 1,500 internet exchange points catalogued. it processes millions of API requests per month. every major network on the planet — from hyperscalers to regional ISPs — uses it as their source of truth for interconnection data.

but the thing i'm most proud of isn't the scale. it's the model. peeringdb proved that critical internet infrastructure can be maintained by the community it serves. no vendor lock-in, no paywall, no corporate agenda. just operators helping operators.

we've had our challenges. data quality is an ongoing battle — networks merge, facilities close, contacts change roles. we built validation systems, automated staleness detection, and abuse prevention. but ultimately, the data is only as good as the community's willingness to maintain it.
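the core of staleness detection is simple enough to sketch. this isn't peeringdb's actual implementation — the field name and the one-year threshold are assumptions for illustration — but it shows the shape of the check:

```python
from datetime import datetime, timedelta, timezone

STALE_AFTER = timedelta(days=365)  # illustrative threshold, not peeringdb's

def stale_records(records, now=None):
    """Return records whose 'updated' timestamp is older than the threshold."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records
            if now - datetime.fromisoformat(r["updated"]) > STALE_AFTER]

nets = [
    {"name": "fresh-net", "updated": "2026-01-02T00:00:00+00:00"},
    {"name": "dusty-net", "updated": "2023-06-01T00:00:00+00:00"},
]
now = datetime(2026, 1, 15, tzinfo=timezone.utc)
print([r["name"] for r in stale_records(nets, now)])  # ['dusty-net']
```

the hard part isn't the code, it's what happens next: a flagged record still needs a human at the network to confirm or fix it.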

incorporating as a nonprofit helped formalize things, giving us a legal and organizational framework. but the culture hasn't changed. show up at a NANOG or NLNOG meeting and you'll find peeringdb people at the social events, talking to operators, collecting feedback, fixing bugs on their laptops between talks.

around the same time, i started working on bgpfu and the 20c tooling ecosystem. the idea was simple: if peeringdb gives you the data, you should have tools to act on it. bgpfu generates BGP filter configurations from IRR data. the 20c libraries handle RPKI validation, prefix list generation, and network automation workflows.
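the "data in, config out" idea is easy to show in miniature. this is a toy, not bgpfu's actual interface: route objects reduced to (prefix, origin ASN) pairs, rendered into IOS-style prefix-list lines:

```python
# toy IRR-style route objects: (prefix, origin ASN), using documentation space
routes = [
    ("192.0.2.0/24", 64496),
    ("198.51.100.0/24", 64496),
    ("203.0.113.0/24", 64497),
]

def prefix_list(name: str, asn: int, routes) -> list[str]:
    """Render IOS-style prefix-list lines for routes originated by one ASN."""
    lines = []
    seq = 5
    for prefix, origin in routes:
        if origin == asn:
            lines.append(f"ip prefix-list {name} seq {seq} permit {prefix}")
            seq += 5  # leave gaps so operators can insert entries later
    return lines

for line in prefix_list("AS64496-IN", 64496, routes):
    print(line)
```

the real tools do the same thing with actual IRR queries, as-set expansion, and RPKI validation layered on top — but the pipeline is recognizably this: structured data in, router config out.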

these tools aren't glamorous. they don't have landing pages or product hunt launches. but they're running in production at networks that collectively carry a significant chunk of internet traffic. when your youtube video loads without buffering, there's a decent chance the peering session making that possible was set up using peeringdb data and automated with tools we built.

lately i've been thinking about what's next. the internet is getting more complex — cloud interconnection, edge computing, private peering at scale. peeringdb needs to evolve with it. we're working on better cloud exchange support, enhanced facility data models, and improved API capabilities.

the arctic code vault thing still makes me smile. knowing that peeringdb's codebase is preserved in a vault in svalbard, alongside linux and python, feels like validation: this little volunteer project matters enough to preserve for future generations.

if you're running a network and haven't updated your peeringdb record lately, go do it. if you're building network automation and aren't using the API, you're working too hard. and if you want to contribute — code, data, or just showing up at a meeting — we're always looking for people who care about keeping the internet open.

the internet is a community project. always has been. ∞