"Doesn't appear to work super well - I tried ckers.org and got nothing, and there definitely is more than one website on this IP."

I only do 2nd level domains, as that is what is publicly available ATM without spidering or brute forcing - neither of which I plan to do. In the future, I may add support for common subdomains such as www, ns, mail, etc, but there is no good way for me to find subdomains such as ha., sla., fu.
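The common-subdomain idea could be as simple as expanding each 2nd-level domain against a small wordlist. A minimal sketch (the wordlist here is my own stand-in, not the tool's actual list):

```python
# Stand-in wordlist of common subdomains; arbitrary labels like
# "ha." or "sla." would never be found this way.
COMMON_SUBDOMAINS = ["www", "mail", "ns", "ns1", "ftp", "smtp"]

def candidate_hosts(domain):
    """Yield the bare domain plus common-subdomain guesses."""
    yield domain
    for label in COMMON_SUBDOMAINS:
        yield f"{label}.{domain}"

# list(candidate_hosts("example.com"))[:3]
# -> ['example.com', 'www.example.com', 'mail.example.com']
```

Each candidate would still need an A-record lookup to confirm it exists, which is exactly the brute-forcing step being avoided.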

"BiDiBlah or Maltego are still the best. one day i'll have enough energy to create an open source version of those neat little killers (BiDiBlah starts to be a little out of date though)"

I am not sure how these compare, as I believe they simply aggregate public information?

Regardless, this is a tool to find vhosts on a system. Like I said, I only do 2nd level domains because those are public - I don't need site owners to submit anything, I don't need to spider, and I don't need to brute force.

The primary purpose is to find all vhosts on a system, for "cross site hacking" or simply to determine whether your shared server is overburdened. It's not for mapping networks or disclosing any "hidden" subdomains or anything. It is simply a "reverse IP lookup".

Actually, I'm not sure why you said I wasn't fond of Yahoo. If Yahoo would just bend over and let the mating begin, I think MSN and Yahoo have a serious shot against Google. With Microsoft's desktop presence, and Yahoo's portal+search, it would be nearly impossible for Google to compete, realistically. That's even more true since Google's only major publicly facing innovations that didn't come through acquisitions have been around advertising anyway.

rsnake Wrote:
-------------------------------------------------------
> Okay, but really and not meaning to be mean here
> or anything, but I think it's almost pointless
> when you have things out there like MSN's IP
> search:
>
> http://search.msn.com/results.aspx?q=ip%3A216.136.
> 25.184&go=&form=QBNO

Cool, didn't know there was such a feature. Do any other SEs have similar options?
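For what it's worth, that `ip:` query is easy to generate programmatically. A small sketch building the URL from the example above (the endpoint and parameter names are copied from that URL and could change on Microsoft's side at any time):

```python
from urllib.parse import urlencode

def msn_ip_query(ip):
    """Build an MSN Search URL using the ip: operator, as in the
    example quoted above. Endpoint/parameters are assumptions taken
    from that one URL."""
    return "http://search.msn.com/results.aspx?" + urlencode({"q": f"ip:{ip}"})

# msn_ip_query("216.136.25.184")
# -> 'http://search.msn.com/results.aspx?q=ip%3A216.136.25.184'
```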

From the FAQ:
"Is this your own database, or a front end for someone else's services?

The database is our own and was generated from scratch."
How do you build a database like this in the first place? None of the *.gtld-servers.net (authoritative .com DNS) servers allow AXFR zone requests.

The people at OpenDNS could probably build their own database by caching domain / IP address pairs as they resolve queries, but I don't imagine CRUSH is operating DNS servers that are that widely used.
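The resolver-caching idea boils down to a passive index: record every (domain, IP) pair you observe, then answer reverse lookups from it. A toy sketch of that data structure (names and sample data are mine):

```python
from collections import defaultdict

class VhostIndex:
    """Toy passive-DNS index: record every (domain, IP) pair a
    resolver observes, then answer reverse IP lookups from it."""
    def __init__(self):
        self._by_ip = defaultdict(set)

    def observe(self, domain, ip):
        self._by_ip[ip].add(domain)

    def domains_on(self, ip):
        """The 'reverse IP lookup': all domains seen resolving to ip."""
        return sorted(self._by_ip[ip])

idx = VhostIndex()
idx.observe("example.com", "93.184.216.34")
idx.observe("example.net", "93.184.216.34")
# idx.domains_on("93.184.216.34") -> ['example.com', 'example.net']
```

The catch, as noted, is coverage: the index only knows about domains someone actually resolved through you.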

You could just crawl the web, I suppose, caching every single IP address / domain you come across, but that'd require fairly extensive resources, I'd imagine.
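The crawling approach would look roughly like this: pull hostnames out of each fetched page, resolve them, and feed the pairs into the IP index. A crude sketch of just the extraction step (the regex is deliberately simple, and the resolve-and-store step is a network operation omitted here):

```python
import re

# Crude hostname extraction from crawled HTML. A real crawler would
# parse links properly, then resolve each host (network step omitted)
# before adding the (domain, IP) pair to its index.
HOST_RE = re.compile(r"https?://([a-z0-9.-]+)", re.IGNORECASE)

def hosts_in(html):
    return sorted({m.lower() for m in HOST_RE.findall(html)})

# hosts_in('<a href="http://example.com/x">a</a> <a href="https://Example.NET/">b</a>')
# -> ['example.com', 'example.net']
```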

The only other thing I can think of is just brute forcing. You look up the A record for a.com, b.com, etc, until you get to zzzzzzzzzz.com or something.
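That enumeration is easy to write but combinatorially hopeless. A sketch (letters only, so real-world names with digits and hyphens are ignored here):

```python
from itertools import product
from string import ascii_lowercase

def com_domains(max_len):
    """Enumerate a.com, b.com, ..., zz.com, ... up to max_len-letter
    labels, as described above. Letters only; digits and hyphens are
    ignored in this sketch."""
    for length in range(1, max_len + 1):
        for letters in product(ascii_lowercase, repeat=length):
            yield "".join(letters) + ".com"

# 26 one-letter names, 676 two-letter names, and 26**10 (over
# 141 trillion) lookups before you ever reach zzzzzzzzzz.com.
```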