Checking name server records for lots of domains across multiple top-level domains (TLDs) can be surprisingly challenging.
To find the records for a single domain you can easily use online tools like Domain Tools, or command line tools like dig (short for "domain information groper"):
dig ns $domainname
If you need to check millions of domains across old and new gTLDs (but not ccTLDs), there's also an easy option. I recommend reaching out to Verisign, who maintain the zone records for many of these top-level domains. Access is provided for free via an application to the TLD Zone File Access Program.
Verisign provide zone records via daily updates, and the files can be downloaded over FTP. Each TLD comes as its own set of files, which you can download if you have enough space available (.com is roughly 2.5 GB compressed, 10 GB uncompressed).
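Once you have a zone file locally, pulling out the NS records for the domains you care about is a straightforward scan. Below is a minimal sketch, assuming the standard master-file layout ("name [ttl] [class] NS rdata") and a gzipped file; the file name and field handling are illustrative and may need tweaking against the real downloads.

```python
import gzip

def scan_zone(path, wanted):
    """Collect NS records for the given second-level labels from a
    gzipped TLD zone file. Assumes lines in the standard master-file
    form "name [ttl] [class] NS rdata"; adjust for the real files."""
    found = {}
    with gzip.open(path, "rt") as zone:
        for line in zone:
            parts = line.split()
            if "NS" not in parts:          # skip non-NS records
                continue
            # First field is the owner name; keep the second-level label.
            name = parts[0].rstrip(".").lower().split(".")[0]
            if name in wanted:
                found.setdefault(name, set()).add(
                    parts[-1].rstrip(".").lower())
    return found

# e.g. scan_zone("com.zone.gz", {"example", "example2"})  # hypothetical path
```

Scanning the file once with a set of wanted labels keeps each lookup O(1), so even the multi-gigabyte .com zone only costs a single pass.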
For those of you who need to check lots of domains across lots of TLDs, you might struggle to find a ready-made solution. I know I did!
I would love to hear how you solved the problem. My approach was to write a simple Python script that takes in a list of domains (via a CSV file) and checks each one with 'dig ns $domain'. It's a very similar approach to many other scripts you might find on sites like stackoverflow.com, with one fundamental problem: it's slooooooww.
To speed things up I added multi-threading using Python's threading module. This lets 25 concurrent dig ns queries be pulled from a queue and executed, giving a throughput of around 2 million names inside 24 hours, which was acceptable for my criteria.
Feel free to use the script below. Or, if you have money to throw at the problem, you can achieve similar results using online tools like domainiq.com or domaintools.com.