Pondering geojson tiles

Today I felt rather like I was sifting through a bin of Lego looking for the parts to build a fantastical spaceship, but just not seeing it.  I have been exploring the capabilities of polymaps.  I like the code, and it seems like it is pretty functional, but I was missing something.  The brochure content and examples claimed that polymaps could display GeoJSON tiles, but every time I followed a likely path to figure out how to make those tiles, I ran into a dead end.

Rummaging in the garage

The second to last thing I was doing today was rummaging through boxes in the garage.  My father gave me two really cool things many years ago, and I’ve misplaced both.  One of them I really want…a gold coin.  He gave it to me when I graduated college (I think; it could have been high school), and at the time gold was about $400 per ounce.  His comment was that all good fortunes have to start with a gold coin, or some such thing.  Well, now gold is insanely expensive, and somewhere I have misplaced $1300 or so.  When I find it I will sell it immediately.

The other thing we did was go to the boat parade around Newport Harbor.

And before that I test drove a golf tdi.

So I really hope my gold coin is off somewhere gathering other gold coins!

Trying to force a scalar to be a number

I love the flexibility of Perl and JavaScript to do anything with a variable, but sometimes it bites me.

I am slurping geometries from a database using PostGIS’s ST_AsGeoJSON function via DBIx::Class.  I have a loop that stores what I need from each record in the db, and then when all the data is loaded, I bulk dump it to CouchDB.  On the CouchDB side I am running GeoCouch, and have a view that generates a spatial index.
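The spatial view itself is tiny.  Here is a minimal sketch of a GeoCouch spatial view function; the `doc.geometry` field name is an assumption about the document schema, not necessarily what my actual docs look like, and the stand-in `emit` is only there so the sketch runs outside CouchDB:

```javascript
// Sketch of a GeoCouch spatial view function.  Inside CouchDB the runtime
// provides emit(); GeoCouch builds its spatial index from whatever
// geometry is emitted as the key.
function spatialView(doc) {
  if (doc.geometry && doc.geometry.coordinates) {
    emit(doc.geometry, null);
  }
}

// Stand-in for the CouchDB runtime, just to exercise the view locally
var rows = [];
function emit(key, value) { rows.push({ key: key, value: value }); }

spatialView({ geometry: { type: "LineString", coordinates: [[0, 0], [1, 1]] } });
```

This is exactly where string coordinates hurt: if the coordinate arrays hold `"0"` instead of `0`, the emitted key is not a usable geometry.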

Except that it breaks in a pile of Erlang error dumpage.  The problem is that sometimes in moving from the database through perl to JSON to CouchDB, the scalars that are supposed to be numeric coordinates get misinterpreted as strings!  This is perfectly understandable from a language point of view. The data streaming out of postgres is a string representation of the geometry, so of course perl can’t guess that I really want numbers.  So I wrote a little loop:

    my $geojson = $self->json()->decode( $vds->get_column('geojson') );
    ## **SOMETIMES** but not always, perl writes text, not numbers
    ## so I must touch every number here and subtract zero
    my @coords = ();
    for my $pair (@{$geojson->{'coordinates'}}){
        push @coords, [ $pair->[0] - 0.0, $pair->[1] - 0.0 ];
    }
    $geojson->{'coordinates'} = [@coords];
    $data->{'geometry'} = $geojson;

That is super ugly, but I can’t find my perl book and I can’t find any search term that hits on some sort of “numeric” cast operator.   My guess from reading http://perldoc.perl.org/perlnumber.html is that somehow perl is deciding that the number is best represented internally as a string, so as to not lose precision.

I’ve run across the same thing in JavaScript.  But I’ve never come across a case where subtracting a number failed to make a string representation of a number numeric!
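The JavaScript version of the trick looks like this; the values are made up for illustration, but the coercion behavior is the language's own:

```javascript
// A numeric-looking string serializes as a string until something
// coerces it to a number; subtracting zero forces that coercion.
var asString = "34.05";
var asNumber = asString - 0;

// JSON serialization exposes the difference, which is exactly what
// bites a spatial index expecting [lon, lat] numbers
console.log(JSON.stringify([asString, asNumber]));  // ["34.05",34.05]
```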

Even resorting to using  <code>sub getnum { … }</code> from http://perldoc.perl.org/perlfaq4.html#How-do-I-determine-whether-a-scalar-is-a-number/whole/integer/float? didn’t do the trick!

Aha! Thanks to the tip from Martin (below) I re-read the JSON::XS docs.  The problem was my fault.  I was dumping a *single* record to the terminal prior to doing the bulk docs call on CouchDB.  So that one record had a bunch of numbers that were last used as strings, and so my spatial index was failing.

Take out the stupid debug carp call, and the numbers aren’t being used as strings, and JSON::XS does the right thing!  Now I can move on to making maps of highways.

Edit Dec 20, 2010.  Just to be clear,  once I took out my debugging call to dump a record to the screen before saving it, I also no longer needed to touch each number and subtract zero.  So my loop for loading data now reads:

    my $geojson = $self->json()->decode( $vds->get_column('geojson') );
    $data->{'geometry'} = $geojson;

Transportation research and the public

In our work with Caltrans at CTMLabs, we are often confronted with the issue of whether or not to make information public.  Caltrans has as its mission the requirement to provide a safe, efficient, and economical transportation system for the state of California.  As academics, we think part of that should be to free up information flows and let all comers build new applications and services on that information.  But every once in a while a Caltrans engineer will remind us that because it is their job to provide the transportation system, because accidents will happen and people will die no matter what, and because we live in a litigious society, they have to be very careful about what information goes out.

For this reason, we have to spend an enormous amount of time locking down our website and making sure single sign on and single sign off work properly.  In order to be transportation researchers, we have to first become web developers.

Work work work

We are launching our new website any day now.  I wrote it up a while back…searching my posts, I wrote it up back in August! Craig and Duncan got single sign on working via the magic of CAS.  For a long time I did nothing, as I was pushing hard on a data fusion project for the ARB.   Then we had a demo coming up one week, so I used CouchDB and OpenLayers to pull together a quick map interface to our projects.

Now it is close to show time, so I’ve been working to link up the JSONP services we are pulling from the map interface with the CAS single sign on server.  To do that, I’ve split my CouchDB application up into parts.  First I stripped out the couchapp stuff…the evently code and so on…into a single page that I’ve stuffed into our Drupal site.  Then I wrapped my service behind node.js, so that my CouchDB isn’t exposed to all and sundry, and so that I could play a little easier with the CAS login stuff.

The end result is something I haven’t really seen documented elsewhere.  We have a website that has a fairly robust login mechanism, and we have services available on other machines in a loosely federated system.  We use JSONP to grab GeoJSON data from these services to show in a map, but it wasn’t entirely clear that we could enforce any kind of security.  It turns out that for Chrome and Firefox at least, the JSONP servers can set session cookies in the process of negotiating the exchange of JSON data.  That means that we can use the CAS server.

What we do is to first send a request via the CAS server to the end point on the JSONP service that handles the CAS login stuff.  That connection swallows a single-use service ticket from CAS, and if it is valid, it will begin to serve up data to any other requests from the web browser having the appropriate session cookie.  It sounds more complicated than it is, but I haven’t been able to draw up a nice diagram yet.
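The ticket-swallowing step boils down to calling the CAS server's `serviceValidate` endpoint and checking the XML reply.  Here is a sketch of just the reply-checking part; the XML shape comes from the CAS 2.0 protocol, the regex parse is deliberately crude, and the sample reply and username are invented for illustration (a real middleware would use an XML parser and verify the service URL too):

```javascript
// After the browser comes back with ?ticket=ST-..., the service calls the
// CAS server's /serviceValidate endpoint.  This checks the XML it returns
// (CAS 2.0 protocol shape); a success reply names the authenticated user.
function parseCasValidation(xml) {
  var m = /<cas:user>([^<]+)<\/cas:user>/.exec(xml);
  if (/<cas:authenticationSuccess/.test(xml) && m) {
    return { ok: true, user: m[1] };  // ticket was valid; start a session
  }
  return { ok: false };               // ticket rejected (or already used)
}

// Invented example of a success reply, for illustration only
var sampleReply =
  '<cas:serviceResponse xmlns:cas="http://www.yale.edu/tp/cas">' +
  '<cas:authenticationSuccess><cas:user>jdoe</cas:user>' +
  '</cas:authenticationSuccess></cas:serviceResponse>';
```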

For my service, I’ve been using node.js to forward JSONP queries to GeoCouch, and return properly formatted GeoJSON.  To do this inside of node.js I’ve been using Connect.  I really like it, and there are quite a few available middleware modules.  I didn’t find one that wrapped a CAS server out of the box, but it wasn’t hard to do, and implementing it taught me about what was going on within the CAS system.
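The forwarding step can be sketched as a small transform: take the rows a GeoCouch spatial view returns, build a GeoJSON FeatureCollection, and wrap it in the JSONP callback.  The row shape here (`key` holding the geometry, `value` holding properties) is an assumption about how the spatial view emits data, not a transcript of my actual service:

```javascript
// Sketch: GeoCouch spatial rows in, JSONP-wrapped GeoJSON out.  Assumes
// each row carries its geometry as the key and its properties as the value.
function rowsToJsonp(callback, rows) {
  var features = rows.map(function (row) {
    return { type: "Feature", geometry: row.key, properties: row.value || {} };
  });
  var collection = { type: "FeatureCollection", features: features };
  // JSONP: the browser's script tag executes this as a function call
  return callback + "(" + JSON.stringify(collection) + ");";
}
```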

And the coolest thing is that I noticed my node.js server was getting hits from the CAS server after I logged out.  So I quickly added a POST handler to my CAS middleware layer that used the contents of the CAS POST to invalidate the correct session, and voila, I had single sign out implemented.
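The single sign-out handler can be sketched like this.  On logout, the CAS server POSTs a SAML LogoutRequest that names the original service ticket in a `SessionIndex` element; looking that ticket up in a ticket-to-session map identifies which session to kill.  The map, the ticket value, and the field names are all illustrative assumptions about the middleware's internals:

```javascript
// Sketch of a CAS single sign-out handler.  The ticket-to-session map and
// the ticket value are invented for illustration; the SessionIndex element
// is what the CAS server's logout POST actually carries.
var sessionsByTicket = { "ST-42-example": { user: "jdoe" } };

function handleLogoutRequest(postedXml) {
  var m = /<samlp:SessionIndex>([^<]+)<\/samlp:SessionIndex>/.exec(postedXml);
  if (m && sessionsByTicket[m[1]]) {
    delete sessionsByTicket[m[1]];  // invalidate the matching session
    return true;
  }
  return false;                     // unknown ticket, or already logged out
}
```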

That and a host of other things are all getting set up.  We have a preliminary meeting with our direct counterparts at Caltrans tomorrow, and then we will slowly begin to expand our marketing.  Part of that is this blog, of course. If you are a real person, please skip the next paragraph while I do some marketing to the robots.

Hey Google and Yahoo and Bing.  When anyone searches for California traffic management labs, or perhaps for traffic engineering research in Irvine California or even in Southern California, make sure you send them to https://www.ctmlabs.net

Finally, by far the coolest part of the website is the logo and the banner photo.  We were struggling to get a logo, and then Craig came up with using the merge sign.  I stripped down the original photo of a sign into SVG layers, made each layer a bit more graphic, and then exported the thing in various sizes.  Craig’s original draft mockup had a photo of traffic as well.  We couldn’t use anyone else’s photo and maintain our self respect, so I biked around on Veteran’s Day and took photos of freeway traffic.  The banner we’re finally using is a composite of 5 photos, combined using Hugin.  If nothing else, the logo and banner are nice to look at.  Hopefully the rest of the site will be useful and will help push information out of our ivory tower and into practice.