Thursday, January 25, 2007

Google Earth and Standards

I can't comment on the All Points Blog... they must have me marked as a spammer.

Anyway, because of that I won't link to Adena Schutzberg's post, but here's the content, jacked from a Federal Computer Week article.

"I have a question on this:

Interviewed in the Google booth, which resembles the bridge of the Starship Enterprise, Painter [director of Google Earth Federal] said that although the public Google Earth uses commercial satellite and geospatial imagery, Google Earth Fusion allows federal agencies to manipulate and integrate their own geospatial imagery with the company’s software tools.

Imagery or software? Isn't Google a member of OGC? Is it moving forward on implementing those standards such that it can do both with ease? Is that not the point of OGC? Is DoD pushing Google to implement such standards? See for example: NGA Announces Requirement for OGC and Complementary Standards"

I don't think there is a high-performance streaming 3D imagery XML standard, is there? In any case, NGA is presumably rational: they are not going to sacrifice user experience for the sake of standards compliance, particularly when the standards body is as commercially fractious as OGC has been.

While there may be a requirement to support those standards, Google Earth already meets it easily in the form of reflectors for OGC image services, served as overlays or super-overlays on top of the base streamed data. All of the major data types can already be imported into Google Earth Server to form the base layers.
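To make the overlay idea concrete, here's a minimal sketch (in Ruby, since Ruby comes up later in these posts) that builds a KML GroundOverlay whose image comes from an OGC Web Map Service GetMap request, so the WMS layer drapes over Google Earth's streamed base imagery. The service URL and layer name are hypothetical placeholders, and I'm using the modern KML namespace; adjust for whatever your server actually exposes.

```ruby
require "cgi"

# Build a KML document containing a GroundOverlay that points at a WMS
# GetMap URL. wms_base and layer are placeholders for a real service.
def wms_overlay_kml(wms_base, layer, west:, south:, east:, north:)
  params = {
    "SERVICE" => "WMS", "VERSION" => "1.1.1", "REQUEST" => "GetMap",
    "LAYERS" => layer, "STYLES" => "", "SRS" => "EPSG:4326",
    "BBOX" => [west, south, east, north].join(","),
    "WIDTH" => "512", "HEIGHT" => "512",
    "FORMAT" => "image/png", "TRANSPARENT" => "TRUE"
  }
  # Join with &amp; so the URL stays well-formed XML inside the href element.
  query = params.map { |k, v| "#{k}=#{CGI.escape(v)}" }.join("&amp;")
  <<~KML
    <?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <GroundOverlay>
        <name>#{layer} via WMS</name>
        <Icon><href>#{wms_base}?#{query}</href></Icon>
        <LatLonBox>
          <north>#{north}</north><south>#{south}</south>
          <east>#{east}</east><west>#{west}</west>
        </LatLonBox>
      </GroundOverlay>
    </kml>
  KML
end

puts wms_overlay_kml("http://example.com/wms", "imagery",
                     west: -122.5, south: 37.6, east: -122.3, north: 37.9)
```

A super-overlay is the same idea applied recursively, with NetworkLinks and Regions swapping in higher-resolution tiles as you zoom, but the point stands either way: standards support as an overlay layer, not as the primary streaming format.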

To suggest that Google should use an OGC format as their primary streaming format is a really bad idea for everyone, especially the data owners, who would end up giving away their data. I remember the same insinuations being made against ESRI in years past, and then people deciding they didn't want to pay the performance penalty for standards compliance. Why don't standards committees ever look at what works best and then choose that as the standard, instead of trying to prognosticate the market, technology, and user needs years in advance? While the shapefile was never an OGC standard, it was easily the lingua franca of the GIS community for a long time. The same goes for the good old .e00.

My wife worked with the poor fellow ESRI designated as its rep to the OGC standards body. He was not a person who loved his job. Basically he was a punching bag, watching competitors try to push the standards away from technology compatible with proven success.

Thursday, January 18, 2007

Manual Reverse Geocoding

So this guy's letter was delivered without an address, just a map. Reverse geocoding at work. Humans rock; it's dead hard to get a computer to do this. Actually, maybe I should just say that the UK post rocks. I can't see our US civil servants doing anything with this besides sending it to the DLO or trying to arrest the sender for subversive activities.

One of the worst experiences in my life was spending three days geocoding a dataset of points for Wien (aka Vienna, Austria) with ArcView 2.1. (ArcView would try to suggest a point, and I would try to move it to the right place.) House numbers there don't follow a very organized pattern, and I hope the errors I inevitably must have made didn't get anyone killed. Or didn't get the wrong person killed, or anything like that.
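For the curious, the machine version of what that postman did can be sketched in a few lines of Ruby: given a coordinate, find the nearest known address point. The coordinates below are approximate and purely illustrative; a real reverse geocoder uses spatial indexes and street-segment interpolation rather than brute force, which is part of why it's hard to do well.

```ruby
# Toy reverse geocoder: map a coordinate back to the nearest known address.
AddressPoint = Struct.new(:label, :lat, :lon)

def reverse_geocode(lat, lon, points)
  # Brute-force nearest neighbour on squared degree distance; fine for a
  # handful of points, hopeless at city scale.
  points.min_by { |p| (p.lat - lat)**2 + (p.lon - lon)**2 }
end

points = [
  AddressPoint.new("Stephansplatz 1, Wien", 48.2086, 16.3731),
  AddressPoint.new("Graben 21, Wien",       48.2090, 16.3690)
]

puts reverse_geocode(48.2085, 16.3730, points).label
```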

link via: /usr/bin/girl, which incidentally is one of the first blogs I ever read.

Sunday, January 14, 2007

Beautiful Brochure...ugly product

I have been doing a light evaluation of some software products in anticipation of some upcoming requirements for a project I am working on. The product that my predecessor in this role had been ready to buy is a real piece of work. It commits three violations of trust that no company selling software should commit:

1. Where's the product?
I don't trust that there even is a product when you can't go to the website and get any clue about how to buy it, or even make a wild guess as to its cost. You can download a gloriously illustrated brochure of bullet-point software features, with teeny tiny pictures of the UI. You can read about solutions and services, but you can't buy anything? It smells like a services trap: call us, we'll figure out how much money you have to set our price, then buy/rent our software and we'll finish it for you at an hourly rate, but we'll own the code we write for you.

I guarantee the install isn't a "there is no step three" process.

2. UI design = sorry, we spent all of the design money on our brochure.
What does it say when a company has an obviously professionally designed and frankly beautiful brochure, but its actual product looks completely undesigned and is just plain hideous? That they care about people's experience only until they have their money?

3. No mention of an API anywhere.
Monolith sensors on full alert. This thing has to integrate with my enterprise; I don't need to buy an ERP system where everything is a module of one vendor's product rather than part of a distributed architecture. (No one ever needs to buy a "complete" ERP system.) Currently wrestling with the seemingly vast, but still incomplete, APIs of Microsoft Dynamics Great "Pains" has me watching out for software that makes itself hard to fit into a distributed world. This software isn't even a web app, so there's no chance of URL hacking.

Sorry, no dice. I am specifically not naming the guilty party here so that they don't begin any legal actions against me for defamation of character, or in case I decide to apply for a job there someday to rewrite the thing. I'll post an offensive screenshot at some point in the future on an unrelated topic.

Saturday, January 13, 2007

Refactoring Ruby

Jay Fields and friends are rewriting/porting Martin Fowler and friends' Refactoring: Improving the Design of Existing Code to Ruby. I guess since the IDEs aren't there yet, we might as well get going on the manual process! Good show to Jay et al. They've made a dent in the first section; I hope they keep it up.
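To give a flavor of what the port looks like, here's my own quick Ruby rendition (not taken from their site) of the most famous entry in Fowler's catalog, Extract Method: pull a buried computation out into a method with an intention-revealing name.

```ruby
# Before: the amount calculation is tangled up inside the printing method.
class Invoice
  def initialize(orders)
    @orders = orders
  end

  def print_owing
    outstanding = @orders.inject(0) { |sum, o| sum + o.amount }
    puts "amount owing: #{outstanding}"
  end
end

# After: Extract Method gives the calculation its own name, so it can be
# read, tested, and reused independently of the printing.
class Invoice
  def outstanding
    @orders.inject(0) { |sum, o| sum + o.amount }
  end

  def print_owing
    puts "amount owing: #{outstanding}"
  end
end
```

In Ruby you do this by hand and lean on your tests; in the Java IDEs it's a keystroke, which is exactly the gap the manual catalog fills for now.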

I still keep a copy of Refactoring on my desk. It's one of about 20 books that still earn the space. The content is somewhat timeless, and (was) not readily available on the internet. Still, the Jav-oid nature of the text makes it a little less relevant to my current world of Ruby. It also serves as a badge of good programming knowledge.

I've been selling off a lot of my other books on Amazon. Someone bought the Sun Certified J2EE Architect guide from 2002 for $16. Someone else bought Rod Johnson's old Expert J2EE development book, the early edition from before he finished Spring, for $15. No takers on Tapestry yet at $10. Many books have a used value below $1, and I am debating what to do with them, as it's not worth my time to sell them. The chances I'll need a Turbo Assembler reference have dropped about as low as they can go, and no library would want it. Still, it was a good book in its day, and it's sad that it's no longer relevant to anyone's life. I also think Professional Java Web Services is probably not worth the shelf space. Well, maybe it will be recycled into a better book someday.