I participated in a conversation along these same lines several years
ago, which originated from a similar query to alt.cyberpunk.tech. I'll
append the four technically relevant responses I saved at the time; they
appeared under the subject line "Coordinate system for the Internet?"
At the time, there was also a chorus of responses along the lines of
"...but using geographical latitudes/longitudes is counter to the whole
space-independent notion of the Internet." Nonetheless, the point remained
that a whole set of interesting interactions happen at the interfaces
between the virtual and physical worlds, and there are quite a number of
interesting applications which can be derived from visualizing elements of
this interplay (like geographical localizations of net datasources,
bandwidth/latency maps (anyone seen such beasts?), etc.).
Hope the aggregate response isn't too long for the list. Note also that
while I've written several more net-geo-mapping programs of the sort
described below in the interim, I haven't become aware of more systematic
service development of the kind folks reference in the responses (beyond
things in the style of the Gopher++ tags that Paul Lindner mentions),
though there are more Web
sites employing the martini database (check out
http://www.yahoo.com/Science/Geography/Geographic_Information_Systems/
and thereabouts).
--Brygg
From: Thomas B. Libert <[email protected]>
To: [email protected], [email protected]
Cc: [email protected], [email protected]
Organization: Computing Solutions Inc., Ann Arbor, Michigan
Subject: Re: Coordinate system for the Internet?
Date: Thu, 23 Sep 93 10:48:20 EDT
Sender: [email protected]
[In alt.cyberpunk.tech, someone (perhaps [email protected]) asked...]
> Are there any "Domain Spatial Position Servers" around? Online maps
> of the Internet? The original SNMP MIB physical location info was
> just a comment, has this been changed in the new version?
[Brygg then said...]
I've used a composite approach in the past to generate
geographical mappings from mailing list logs, traceroute
reports, etc. The procedure I've used is roughly as follows.
Begin with an e-mail address; if you have only a set of
names, see if you can derive application-specific heuristics
for automated finger or PH queries, etc. Strip the e-mail
address down to its final domain components; for instance,
given [email protected], keep uiuc.edu. Then do a
whois query against the InterNIC whois database, searching
for the reduced domain name string. (For instance, "whois -h
rs.internic.net uiuc.edu". Also note that this may
break down for many international addresses.)
From this report, automatically extract the City/State/Country
fields (they're positioned in a fashion which makes this relatively
simple for scripted processing). Take the resulting symbolic
geographical information, and run a query through the
martini Geography server, reachable via "telnet
martini.eecs.umich.edu 3000". Finally, strip out the
latitude/longitude information, plot this in combination
with lat./long. based physical/political maps, and... there
you are.
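Roughly, in perl, the whole pipeline looks something like the sketch
below. The whois output pattern and the query format for the martini
server are only guesses here, so treat the regexes and the socket
exchange as placeholders to be adjusted against what the servers
actually return:

    #!/usr/bin/perl
    # Sketch only: map an e-mail address to a lat/long via whois plus the
    # martini geography server.  The whois parsing and the martini query
    # format below are assumptions, not the servers' documented interfaces.
    use strict;
    use warnings;
    use IO::Socket::INET;

    my $addr = shift or die "usage: $0 user\@host.domain\n";
    (my $domain = $addr) =~ s/^.*\@//;       # [email protected] -> students.uiuc.edu
    $domain =~ s/^.*?\.(?=[^.]+\.[^.]+$)//;  # reduce to the final two labels
                                             # (this is what breaks for many
                                             # international addresses)

    # 1. whois lookup; heuristically grab a "City, ST zipcode" style line.
    my ($city, $state);
    for (split /\n/, `whois -h rs.internic.net $domain`) {
        if (/([A-Za-z .]+),\s+([A-Z]{2})\s+\d{5}/) {
            ($city, $state) = ($1, $2);
            last;
        }
    }
    die "no city/state found for $domain\n" unless defined $city;

    # 2. Ask the geography server for the lat/long of that city/state.
    my $geo = IO::Socket::INET->new(
        PeerAddr => 'martini.eecs.umich.edu',
        PeerPort => 3000,
        Proto    => 'tcp',
    ) or die "can't reach geography server: $!\n";
    print $geo "$city, $state\n";            # query syntax is a guess
    while (my $line = <$geo>) {
        # print everything the server sends back, tagged with the domain;
        # the lat/long fields get picked out of this downstream
        print "$domain\t$line";
    }
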
I've automated this in the past using mixed perl and AWK
scripts, and have successfully generated maps bringing
together hundreds of geographical locations with a minimum
of human post-processing. I actually considered creating
a server such as you reference which might accept domain-name
queries and return latitude/longitude, with the particular
application in mind of a nice graphical connection display
via traceroute reports for Gopher/WAIS/etc. info queries
and the like. I made several inquiries of the DDN NIC and
Martini folks (considering the potentially significant load
that widespread application of these approaches would place
on their databases), but never really received a meaningful
response from either party. I'd be very interested to hear
if you do find anything close to a domain name->geography
service significantly simpler than the above process!
I'm the bloke who runs the martini geography server. I implemented
it while I was at NSFNET specifically to facilitate the creation
of network maps. I also wrote a client which takes the
contents of networks.txt, sucks addresses from the whois server
on rs.internic.net, pulls lat/long from the GNS, and spits
out files for Brian Reid's netmap program.
A future version of the GNS will provide locations of all
nets automatically, so users won't have to go through all
that nonsense. The process is not quite 100% automatic
yet, because some of the whois data are either impossible
to parse or contain typographical errors and/or abbreviations.
I talked to the NIC folks about providing a lat/long
field in the whois database, but got a lukewarm reception.
That was several years ago, though--maybe I should try
again.
Rest assured I am interested in this problem, and will
eventually implement a reliable, authoritative solution.
(In my spare time, which has been all but nonexistent lately...)
Tom Libert
Research Investigator
Center for Information Technology Integration
University of Michigan
-----------------------------
To: Brygg Ullmer <[email protected]>
Cc: [email protected], [email protected], [email protected],
[email protected]
Subject: Re: Coordinate system for the Internet?
In-Reply-To: Your message of Thu, 23 Sep 93 11:36:16.
<[email protected]>
Date: Thu, 23 Sep 93 11:40:38 EDT
From: [email protected]
two suggestions
if there's a set of perl scripts already running to do some or all of
this, it would make some sense to make them into a module for the
"plexus" world wide web server, so that you could do the searches via
the net, i.e.:
query -> intermediate server -> geo database
                                whois server
                                routing database
If the intermediate server does caching, it can remove load from the
original databases, and even incorporate fixes and changes that are
really localized (like noting that 148.59.6.* migrates from Ann Arbor
to Houghton every summer :).
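A bare perl sketch of that caching layer might look like the following;
lookup_upstream() here is just a stand-in for whatever scripts actually
query the whois/geo/routing databases, and the one override entry is
only the joke example from above:

    #!/usr/bin/perl
    # Sketch of the caching/override idea only; lookup_upstream() stands in
    # for whatever actually talks to the whois/geo/routing databases.
    use strict;
    use warnings;

    # local fixes that override anything the upstream databases say
    my %override = (
        '148.59.6' => '47.12 -88.57',   # Houghton (roughly), for the summer
    );
    my %cache;                          # domain or net number -> "lat long"

    sub locate {
        my ($key) = @_;
        return $override{$key} if exists $override{$key};
        $cache{$key} = lookup_upstream($key) unless exists $cache{$key};
        return $cache{$key};            # may be undef if upstream failed
    }

    sub lookup_upstream {
        my ($key) = @_;
        # ... whois + geography-server queries would go here ...
        return undef;
    }

    print locate('148.59.6'), "\n";     # served from the local override
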
there are some other heuristics for non-US nets that could be useful.
One way to do things is to go name -> IP address and then look up the
network number; e.g.
dig csiro.au
note a name server for csiro.au in network 144.110.0.0
whois -h rs.internic.net 144.110.0.0
note a contact address in Carlton VIC 3053
do the Australian postal-code to geo location mapping
This assumes something about how wide networks are, and it fails for
big nets like net 35 that are geographically dispersed, but in many
cases it gets better data than other approaches.
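As a perl sketch, the lookup might go roughly as below; note that this
resolves the name directly rather than chasing the NS records as in the
dig example, and the classful net-number guess, the whois parsing, and
the one-entry postcode table are all stand-ins for real data:

    #!/usr/bin/perl
    # Sketch of the name -> net number -> whois -> postal code heuristic.
    use strict;
    use warnings;

    my $name = shift or die "usage: $0 hostname\n";

    # name -> IP address (chasing a name server for the domain, as with
    # dig, would do as well)
    my $packed = gethostbyname($name) or die "can't resolve $name\n";
    my @octet  = unpack 'C4', $packed;

    # IP -> classful network number (class A/B/C by first octet); this is
    # exactly the "how wide is a net" assumption mentioned above
    my $net = $octet[0] < 128 ? "$octet[0].0.0.0"
            : $octet[0] < 192 ? "$octet[0].$octet[1].0.0"
            :                   "$octet[0].$octet[1].$octet[2].0";

    # whois the network number; look for an Australian "STATE 9999" style
    # postal code (e.g. "Carlton VIC 3053") -- heuristic only
    my $postcode;
    for (split /\n/, `whois -h rs.internic.net $net`) {
        $postcode = $1 if /\b(?:ACT|NSW|NT|QLD|SA|TAS|VIC|WA)\s+(\d{4})\b/;
    }
    die "no postal code found for $net\n" unless defined $postcode;

    # postal code -> lat/long would come from a real postcode table;
    # the single entry here is just the example from above
    my %postcode_geo = ( 3053 => '-37.80 144.97' );   # Carlton VIC, approx.
    print "$name ($net): ", $postcode_geo{$postcode} || 'unknown', "\n";
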
--------------------------------------------------
Date: Thu, 23 Sep 93 14:25 EDT
From: [email protected] (Tom Libert)
To: [email protected], [email protected]
Subject: Re: Coordinate system for the Internet?
Cc: [email protected], [email protected], [email protected]
The fundamental problem with relying on rs.internic.net
data is that (as Ed points out) entries don't necessarily
provide the location of the net. Indeed, that's an intrinsically
ill-defined notion (consider net 35, which I generally position
in Ann Arbor. But what about Xerox or DEC? I usually use
Palo Alto, but that's not very satisfying...)
I think the correct answer (and one I used successfully for a
mapping project at NSFNET) is to use the semi-automatic
approach as a starting point, but then fine-tune the locations
the old-fashioned way (contact net admins for ambiguous
cases.) This implies a long-term commitment to maintaining
the data.
I think someone should definitely do it. I definitely don't
want to be that someone. If we can't talk the internic
people into committing to that, then some good samaritan
will have to volunteer.
I'll probably do it myself as a last resort (grumble, complain)
but can't guarantee that I'll always have the cycles to keep
it completely up-to-date (which may be even worse than nothing
at all.)
In the meantime, I can provide a copy of the client I wrote
which automatically computes coordinates for all nets.
---------------------------------------------
To: [email protected] (Tom Libert)
Cc: [email protected], [email protected], [email protected]
Subject: Re: Coordinate system for the Internet?
In-Reply-To: Your message of Thu, 23 Sep 93 14:25:00.
<[email protected]>
Date: Thu, 23 Sep 93 14:49:41 EDT
From: [email protected]
> I think someone should definitely do it. I definitely don't
> want to be that someone. If we can't talk the internic
> people into committing to that, then some good samaritan
> will have to volunteer.
Internic has money - this should be phrased as a proposal to them for
a project, some staffing, equipment, what have you. Not that it would
be *hard* to do - I think there's enough understanding of the problem
and enough interest in it to get useful feedback. It's just that if
you want to provide a useful long-term service on today's internet,
an internet where if you build something cool there's a chance that
100,000 people on America Online will try it out tomorrow, it had
better be properly engineered right from the start.
--Ed
(noting that he's already done several "someone should do it"
projects, and that a few have fallen by the wayside because of lack of
long term funding)