Saturday, June 30, 2007

Review of Google Maps on the iPhone

Okay, so here it is, the review of Google Maps on the iPhone. As I said in my previous post giving general impressions of the iPhone, the maps look great and panning and zooming are very interactive and intuitive, but a number of little niggly details counterbalance the cool parts, leaving me somewhat neutral - impressed by some things, but overall feeling that it could have been better (and this is aside from the obvious drawback of not having a GPS). So maybe "good but not as great as it could have been" is the overall verdict.


I did most of my testing using my WiFi connection, and then turned this off and used EDGE for a little while. Performance was excellent using WiFi - it typically took around a second to download data for an uncached area, for either a street map or imagery, but occasionally up to two seconds or so, more often with imagery than street maps (which you would expect, as the compressed images for street maps are typically smaller). The iPhone keeps a pretty good-sized cache, and when data is cached, display is pretty much instant, just as it is online. So overall the experience of panning and zooming around is very smooth and fast. Using EDGE instead of WiFi, I found that downloading data for an uncached area generally took around 3-4 seconds for street maps, and 5-7 seconds for imagery. But when data was cached, it was just as fast as with WiFi, as you would expect. So it was slower, but still fast enough to give a pretty good user experience, in my opinion. When zooming in or out, it animates the zoom using the data layer that is already there, so you see some sort of data while waiting for the data at the new scale to download.

The user interface for panning and zooming is intuitive once you learn the basics. Double tap to zoom in on a point (it doesn't center the point, but does keep that point on the screen - so if you double tap on something in the northeast corner, after you zoom in it will still be in the northeast corner). Tapping with two fingers (at any distance apart and any orientation) will zoom out. You can also zoom in or out by "pinching" with two fingers and moving them closer together or further apart, which is a really intuitive interface that leverages the multi-touch capabilities.

One thing I really didn't like about the basic map display is that you can't rotate it. With other applications like the browser and photos, you just change the orientation of the iPhone from portrait to landscape, and the application automatically rotates the display (with a nice bit of animation). This is such a natural thing to want to do with a map display that I'm pretty disappointed they didn't implement it - hopefully it will be there in a future release. This is one of several places where Google didn't take advantage of some of the good features in other iPhone applications, which makes for a less seamless and intuitive user experience - the application doesn't do some things that you hope, and expect, it to do.

For the next set of observations, I took a pretty detailed set of photos which you can see here on flickr. Follow this through as a slide show to look at some search scenarios.

By and large, search worked pretty much as expected - you can type in a specific business name like "Tattered Cover" (my excellent local bookstore), or a generic term like "coffee", and get appropriate results back. But I did find a few things I hadn't anticipated, as follows:
  • My search for the Tattered Cover yielded three incorrect locations, in addition to the (only) three correct ones. One was the location of an old store which closed a year ago, and two were the result of incomplete addresses for current stores (in addition to the complete ones) - one which just included the street with no number, and one which included the town but no street. I may have just been especially unfortunate with my choice of example, but this illustrates the importance of good and up-to-date data in any LBS application. An out-of-town visitor on the south side of Denver looking to visit the famous Tattered Cover bookstore could easily have driven to three locations, none of which had a Tattered Cover, by which point they would feel about Google Maps like I do about AT&T right now (see previous post)!
  • Google Maps does not include the autocorrection feature for typing which is in all (or at least most) of the other iPhone applications. This is really bad. Typing is somewhat fiddly with the on-screen keyboard, but you can go pretty fast with the autocorrect feature, and you get used to it in all the other applications. You have to type much more slowly and deliberately in Google Maps than anywhere else. This really needs fixing!
  • When you search for items, it doesn't order them by distance from your location, nor does it show the distance to a search result in the list, which is very common in these types of applications - I think this is a serious omission too. On further investigation, I have come to the conclusion that the order in which results are returned is almost certainly determined by payment for higher placement - I found that the "Market" coffee store was consistently returned at the top of the list, and highlighted on the map, for a wide variety of different spatial queries in downtown Denver. This is not necessarily surprising - at some point Google needs to make some money back for all its investment in Google Maps and Earth, or it won't keep on investing - but it is good to know about. And if they remove the very useful function of showing how far away different search results are in order to hide the fact that they are (apparently) returning establishments that pay for placement higher up the list, I'm not too happy about that. I will do a separate post about this as I think it's sufficiently interesting to highlight. You can see the detailed examples relating to this in the photo gallery.
One nice aspect of the search is that it searches your contacts (by name), so if you have someone's address in your contacts, it is very easy to find that and display that on a map or make it a start or end point for a route.

The real time traffic works well (I have the same thing on the BlackBerry also). Here are pictures showing central Denver, with and without traffic information, and you can see that the southbound carriageway of I-25 is currently congested.

The routing also seems to work well - in the picture gallery I show the creation of a route from Denver to Vancouver, which was pretty much instantaneous. You can list turn-by-turn directions either as text or on the map, and easily skip backwards and forwards in the turn list.

So general conclusion: I like the application, it's fast and intuitive in most regards, but does have a few things that need fixing. My order of priority on these would be:
  • Needs to support autocorrection when typing
  • Should be able to display search results in order of distance from the center of the screen (and show distances in the list) - even if this is an option, and the initial order is determined "at Google's discretion"
  • The map display should rotate when you rotate the iPhone, as with other applications where this makes sense
And then of course there's that missing GPS, but that will have to wait a while longer, whereas hopefully they can make some of these software fixes sooner!

iPhone initial overview review

Well I'm sure that readers of this blog all want to know about the maps on the iPhone, but I thought I would start with a general overview of my initial reaction to the iPhone, and then I'll do a separate post shortly with more on the maps - though in one sentence: the maps look great and panning and zooming are very interactive and intuitive, but a number of little niggly details counterbalance the cool parts, leaving me somewhat neutral - impressed by some things, but overall feeling that it could have been better (and this is aside from the obvious drawback of not having a GPS). But more on that shortly.

So first impressions - the physical design is great, right down to all the small details like the packaging, as is typical with Apple products. Compared to my BlackBerry 8800, it is about the same height but not as wide and not as deep - but it's quite a bit heavier, with a very nice, solid, quality feel in your hand. Small details like the way you unlock the phone by sliding a virtual button across the screen feel surprisingly satisfying somehow. The general user experience is excellent on the whole, with a lot of nice innovations which you can see in the Apple videos, like the way that you can flip through photos just by dragging them with your finger, and rotating the device will rotate a photo accordingly. Typing feels a little clumsy at first, but there is a pretty good autocorrect feature which does a good job of figuring out if you hit the button next to the one you meant to press, and once you get used to that you can go quite a bit faster. But in general my BlackBerry wins on ability to type, as I would have expected. I'll do a test a little later and time some typing on both to quantify this.

Now this may seem like nitpicking, but one thing I was quite disappointed by is that some of the "new and cool" user interface features are not implemented consistently across all applications - surprising given Apple's perfectionism about the user experience, and the fact that they have kept the iPhone a "closed system", one of the main justifications for which was to ensure a seamless user experience. Maps is particularly guilty of not leveraging some user interface features it should have, and I'll come back to that in my more detailed review shortly.

Initial setup was easy (apart from an incredibly bad experience with the AT&T non-help desk, which I won't go into details on now - the short version is that my phone is on a family account, but my number was once on a business account, and you can't use an iPhone with a business account, and their system couldn't cope with this and wouldn't let them add the iPhone to my existing account, so I had to create a new one ... they will be receiving some irate emails from me shortly!). As for synchronization with my existing contacts, calendar, mail accounts and bookmarks on my Mac, that was all seamless via iTunes (with one slight glitch on my gmail which I fixed - I won't go into the details, but if anyone reading this is having a problem receiving mail from a gmail account, email me and I'll send you the info). Copying over some photos, music and movies was a piece of cake too.

The screen quality is excellent, and especially nice for photos, videos and maps. The ability to zoom in and out on images or maps by "pinching" in or out with two fingers, and then pan around by dragging, is very intuitive and definitely one of the features I like. The web browsing experience is pretty good - rather than try to reformat web pages as most other mobile systems with small screens have done, Apple maintains the full formatting of the original web page and lets you zoom in and out easily to see details. With most web pages, the text is legible in landscape mode without having to zoom in (assuming you have decent eyesight!), though clicking on closely spaced links typically requires you to zoom in (which you can do by pinching or double tapping). So web browsing can be a bit fiddly at times, but in general it works very well once you get used to the user interface.

So far I have mainly been using the WiFi connection for my testing, with my fast Internet connection at home, and this has worked very well. I did a brief test with a YouTube video over EDGE rather than WiFi and the video size and quality were automatically reduced, but still acceptable - though not nearly as good as the YouTube video quality over WiFi, which is really excellent (it looks better than on my laptop). I have been very impressed with the YouTube support - though you only have access to a subset of the videos at YouTube, and there have been a few favorites of mine that I couldn't access (though this one, a particular favorite of mine at the moment, is available). I'm not sure how they determine what is available, or whether this will grow over time. I'll do more testing with EDGE on other applications in due course.

Oh yes, I believe it's a phone too :) !! So far no problems on that front - though in a quick test of the speaker phone, the speaker didn't seem very loud.

So in overall summary, a thumbs up from me - lots of cool features and interesting user interface innovations. But there are a number of niggly details which could have been better on the usability front, especially on the maps - and that's aside from the obvious omissions like GPS and a 3G network, which have been covered extensively elsewhere. Maps review coming soon, I promise!

Friday, June 29, 2007

iPhone first impressions

So here's a quick post on first impressions which I'm writing in the browser on the iPhone, just because it seems like I should :)! Will do a more detailed review tomorrow, but first impressions are generally good with a few caveats - definitely a very cool feel to it overall. Video and photos are two of the most impressive things. The mapping capability is nice but not as functional as on my BlackBerry 8800, as expected.

Waiting in the iPhone line

Well, Glenn is pretty down on the iPhone, while Ed says he would be in the line if he were in the US. I'm in the line in Denver ... I decided to go to the Cingular store within walking distance of my home rather than the Apple Store in Cherry Creek, which might be more fun but I'm sure will be much more crowded. There are about 35 people in the line in front of me (I got here around 4pm for the 6pm opening), and the rumor is that they have 120 to sell here, and you can only buy 1 per person (2 per person at Apple stores) - so I should be in good shape to be posting a review later.
Waiting for my iPhone
Here I am writing this post on my MacBook (which I only bought a few weeks ago, but I have become quite an Apple fan boy since then). As I posted previously, I'm disappointed that the iPhone doesn't have a GPS, but I think it does have a lot of cool features and some very interesting user interface innovations, which is enough justification for me to get one (though I will keep my BlackBerry 8800 also).

Thursday, June 28, 2007

The "dark side" poll - neck and neck!

Well, after my previous post where I reported that the rebel forces had pulled ahead, the Empire did indeed regroup and pull ahead slightly today, but at the time of writing the two sides of the Force are locked in a deadly embrace - 119 votes for the dark side and 119 votes against, with 28 don't knows.
Thanks also to the various people who have emailed me with advice! I should point out by the way, to allay the concerns of some correspondents, that I am exploring a lot of possibilities right now. I am definitely interested in the "neogeography" space and what Google, Microsoft, Yahoo et al are doing there, and there could be opportunities either with the big guys themselves, existing startups leveraging their technology, or in doing my own startup. There are also interesting opportunities in the open source arena. And last and perhaps least (but not necessarily!), there are opportunities with the more established geospatial companies.

So the topic of this poll is really more along the lines of: should "going to the dark side" be one of the options that makes my shortlist for serious consideration? But thanks for all the votes - the poll is on my main blog page if you haven't voted yet and would like to!

Thoughts on GE's next generation system based on Oracle - part 2

This is a continuation of my previous post about GE's (General Electric's) next generation system based on Oracle - please read that before reading this if you haven't already done so. At the end of part 1, I said that in part 2 I would talk about the real reason why this product could be a significant jump forward for the utility industry, which really hasn't been highlighted in the GE announcements or in any of the commentary I've seen - and also why this same factor could be the reason that the product fails. And last but not least, I'll talk about some of the challenges GE faces in positioning the new product with regard to the existing Smallworld products. So here we go ...

I think that the factor that could make this product a significant jump forward is that, as I understand it from contacts at GE, they are really trying to produce an off-the-shelf product which can be configured rather than customized (the distinction being that configuration just involves setting parameters that control how the application behaves, whereas customization involves writing code). Now GE really didn't talk about this in their announcement, and this information comes from informal conversations, so it's possible this emphasis may not be as strong as I have inferred - but either way it is interesting to talk about the pros and cons of this approach.

Historically in the utility industry, the implementation of geospatial systems has involved buying a software product as a starting point (with GE, Intergraph and ESRI being the three primary vendors in the space), and then doing some customization (data modeling, further software development, etc.) on top of that core product to meet each individual utility's specific requirements. This approach enables each utility to get a system which closely meets its specific requirements, but it has drawbacks: apart from the additional cost of the initial customization, having many different custom systems makes support and upgrade procedures harder for vendor and user alike. All the major vendors have generally had a "starter system" or "template" for the customization, which reduces the cost of the initial implementation, but in general (in my experience) this has not really helped in terms of simplifying ongoing support and upgrade issues, and thereby reducing ongoing cost of ownership. In the early 2000s, when Intergraph launched its G/Technology product, the initial intent was that it would be a completely off-the-shelf system, allowing some configuration but no customization. While many utilities liked the principle, nearly all wanted additional functionality in their systems (and of course different utilities typically wanted different additional functionality). So Intergraph ended up having to rearchitect their initial approach to allow the system to be customized in a more flexible way, which was not a trivial undertaking and probably cost them a few years in terms of getting a competitive version of G/Technology to market.

So it will be interesting to see if GE really does pursue the angle that this will be an "off the shelf" solution. It is hard to be wishy-washy about this - if you say that you think it will meet many customer needs off the shelf but you can still customize it if you want, then you really don't address the ongoing cost of ownership issues associated with the complexity of supporting and upgrading all these different systems in the field (unless you put very strict constraints on the customization, and really constrain yourself not to modify APIs from one release to the next). If GE does take a truly off-the-shelf approach then this would differentiate the product in the market, IF it is functionally rich enough. But if the product is not functionally rich enough, the risk is that you won't be able to win business, and your entry to the market will be set back until either you allow customization (which makes you less differentiated) or you add sufficient functionality to be competitive - either of which may take a significant amount of time.

This leads on to the challenges that GE faces in positioning the new product versus the current Smallworld product. Now GE is specifically trying not to position this as a replacement for Smallworld, and is saying that it will continue to develop new functionality on both platforms. It's fine to say that, but obviously the challenge with this approach is that if you develop all functionality on two different platforms which don't support common code, you can only develop half as much new functionality with the same number of developers (well, maybe a bit more than half, since some design work could be common across both platforms, but certainly a lot less than you could do if you just focused on a single platform). So that really doesn't seem like a sustainable approach over a long period of time, unless GE is prepared to substantially increase the size of its development team, which I imagine would be hard to justify. GE is initially focusing the new product specifically on North American mid-size electric utilities, so migrating to the new product will not be an option yet for customers who do not fall into that category - a fairly large majority.

For those customers who are in a position to consider migration (the North American mid-size electric utilities - and presumably the segments addressed will expand over time), GE will face the classic challenges of any company going through a major technology upgrade (which Intergraph went through - and is still going through - with the migration from its older FRAMME product to G/Technology, and ESRI with the move from Arc/Info 7 to ArcGIS). One challenge is that with a product like Smallworld that is 16 years old and exceptionally customizable, most customers have very rich functionality which is hard to replace with a product that has only been under development for a year or two. Either custom development will be necessary to replace custom functionality in the old system, or the customer will need to be persuaded to give up some existing functionality in exchange for other benefits of the new architecture. This generally means that the migration from the old system to the new system is a large enough project that most organizations will take the opportunity to evaluate other systems on the market and decide whether to stay with the same vendor or switch to a new one. There has been little turnover in the utility market in recent years, so on the rare occasions that utilities do choose a new system, all the vendors are very hungry for those opportunities and pricing tends to be very competitive, especially since the three major systems are not highly differentiated these days. As I said, none of this is unique to GE - it is the same situation that ESRI and Intergraph have gone through as they have migrated their customers to newer technologies.

One other challenge GE may have is with the customers who are not yet addressed by the new product. It will presumably take several years before the new product is an option for all of them (I believe GE is saying that they will have a beta out sometime this year, then a first release in the first half of next year; and unless it's different from all other software products it will probably need a second release before it's really ready for serious use, so suppose that arrives late 2008 or early 2009 - then the product is probably not going to expand to substantial additional market segments until 2009 or 2010). Now, by and large I think that the Smallworld customer base is still pretty happy, so maybe customers will be willing to continue with Smallworld, assuming that GE continues to invest in it as it says it will. But there is also the risk that some customers will decide that all this means the writing is on the wall for the Smallworld products, even if they keep going for a few more years, so maybe they should just go out and look at Intergraph and ESRI, whose more mainstream architectures are in production today.

One other challenge for GE, as they try at some point in the future to move their larger customers to the new platform (assuming they do), will be how to provide the same level of scalability that Smallworld VMDS does - perhaps that's a topic for a future discussion.

So anyway, it will be interesting to see how all this pans out over the next few years. I wish the GE guys good luck with it - as I said before, the industry can use additional innovation and competition!

Tuesday, June 26, 2007

Quick update on my "going to the dark side" poll

Wow, since I posted last night that the voting in my pollmappr poll was significantly in favor of me going to the dark side (42 for, 25 against, 11 don't know), the rebel forces have mobilized and have now moved into the lead - the current count is 70 for, 77 against, 17 don't know. So that's a resounding 52 votes against versus 28 for the dark side since last night. Will the Empire strike back? We'll see :) !!

Thoughts on GE's next generation system based on Oracle - part 1

A number of people have asked about my thoughts on GE's announcement back in March of a next generation system built on top of Oracle technology (note that in this post, GE stands for General Electric, not Google Earth!). This was covered by Joe Francica at All Points Blog under the title "Cutting out the GIS Middleman", by Susan Smith at GISCafe in an interview with Robert Laudati, and in an article by GE. Here are my thoughts ... (warning - this is a rather long post; in fact it was getting so long that I have decided to split it into two parts, and the first part is still very long!).

I'll give a quick bit of history on Smallworld for all you "neo" people and those who haven't had any involvement with GIS in the utility industry. Smallworld is where I worked from 1992 to 2002, and during that time we grew from being a small startup in the UK to the global market leader in GIS for utilities and communications (according to Daratech), with revenues of around $100m. Smallworld was bought by General Electric in 2000. Smallworld introduced some radical new ideas in the early 1990s, many of which have now become common practice across the industry - I will talk more about some of these in the future, but Charlie Savage gives a good summary of his perspective here. In the 1990s, Smallworld had a clear technical lead in the utility industry, but in the early 2000s both ESRI and Intergraph introduced new systems (ArcInfo 8.0, now ArcGIS, and G/Technology), and the playing field is now much more even, with no clear leader, in my opinion. The Smallworld product remains very robust and scalable and has very rich functionality, but GE has increasingly been suffering in new sales situations because Smallworld is based on its own proprietary language (Magik) and database (VMDS, the Version Managed Datastore), while its two primary competitors have more modern and mainstream software architectures. Actually both Magik and VMDS still have some great technical strengths, especially the latter, but as geospatial technology has moved more into the mainstream, it has become increasingly hard to convince the market of the benefits of buying a "proprietary" solution, and in my opinion this has been the primary driver for GE to develop this new product set on top of Oracle.

The most obvious interesting thing about this announcement, which others have commented on also, is that it really reinforces the notion that geospatial technology is becoming absorbed into mainstream IT - many people have talked about this for a while, myself included. The way it was explained to me by someone from GE is that when they first sat down with Oracle to discuss collaboration on this project, they thought that there would be three layers of software: Oracle technology at the back end, a new "GIS" layer in the middle, and specific utility applications built on top of that. But when they looked at what Oracle now offers, including a map viewer, network model, version management (workspace management in Oracle terminology), etc, they concluded that there were really just two layers: Oracle, and the specific utility applications. I don't think this point is especially interesting to customers, actually: whether you regard the solution as 2 tiers or 3 tiers is a bit of an arbitrary distinction - in either case you need software from two vendors, Oracle and GE, and whether specific bits of functionality come from one or the other is not really significant. If GE reduces the cost of their software because they don't have to develop as much functionality themselves, then customers will be interested, but I haven't heard any discussion about that!
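
To make the "just two layers" idea a bit more concrete, here is a minimal, purely illustrative sketch of what a utility application querying its data directly in Oracle might look like - the table, columns and connection details are all hypothetical, and it simply assumes the cx_Oracle driver and an Oracle Spatial schema; it is not GE's actual code or data model.

    # Hypothetical example - table and column names are invented for illustration
    import cx_Oracle

    conn = cx_Oracle.connect("gisuser/secret@utilitydb")   # assumed credentials
    cur = conn.cursor()

    # Find transformers within 500 meters of a point, straight from Oracle Spatial -
    # no separate "GIS" middle tier involved
    cur.execute("""
        SELECT t.asset_id, t.status
          FROM transformers t
         WHERE SDO_WITHIN_DISTANCE(
                   t.geometry,
                   SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(:lon, :lat, NULL), NULL, NULL),
                   'distance=500 unit=METER') = 'TRUE'
    """, {"lon": -104.99, "lat": 39.74})

    for asset_id, status in cur:
        print(asset_id, status)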

The reason this is interesting is more from a general industry perspective: it suggests that it will be harder to have a successful business which focuses purely on the middle "GIS" layer, without delivering applications on top of that which solve specific business problems. There are now many free or cheap options for drawing a map on a screen; you no longer need a specialized and expensive piece of software to do that. Companies like GE and Intergraph sell both geospatial platform software and vertical industry applications, and both are now putting more emphasis on the vertical applications where (in the right areas) they can still show high business value and therefore justify relatively high prices, and less emphasis on the basic geospatial capabilities, many of which are becoming commoditized. ESRI has of course focused heavily on delivering a horizontal GIS product, and has a large partner ecosystem which provides vertical applications on top of this. They have a very dominant position in this horizontal GIS space, and the fact that many of their traditional competitors are focusing more on vertical applications may mean that they can increase their hold there - but nevertheless they are seeing significant pressure on various parts of this space from Oracle, Google, Microsoft, Yahoo and open source solutions. So it will be interesting to see whether ESRI keeps its horizontal focus or also starts to move into more vertical solutions over time. There would be some challenges in this strategy, in particular the issue of potentially competing with its partners - though this is a common problem for platform software companies and is in many ways a natural evolution that many companies go through. We went through this at Smallworld, starting as a platform company with partners and then moving more into vertical applications, and Oracle is experiencing this too, especially with several of its recent acquisitions (more on that in a future post).

OK, so does this new architecture give GE a competitive advantage? Not really, in my opinion. The first major argument in favor of it is that all your data is stored in a standard relational database, which helps with integration, administration, security, etc (see my 1990 article which outlines these benefits - these concepts are not new!). The second major argument is that you can use standard development environments (Java-based, in this case). So GE addresses the concerns that the market today has about its existing "proprietary" solution - but both ESRI and Intergraph have provided solutions based on mainstream databases and development environments for a number of years, so these things will not be differentiators for GE - they are playing catch up in this regard. Oracle likes the fact that the solution is based purely on the Oracle stack - and so customers who have committed to the whole Oracle stack will also see that as an advantage. But on the other hand, customers focused on a Microsoft architecture either on the client or middle tier will see this approach as a disadvantage - and my general feeling is that more utilities probably fall into this category. Those who are not too religious about their IT strategy (probably the majority) will focus on the functionality provided by the main three vendors more than the system architecture.

So, in summary on part 1: I am pleased to see GE make this announcement, as I had personally pretty much written off the possibility of them investing in a "next generation" system. I am happy for the friends I still have at GE that there is investment in the future, and I think it will be good for the industry if GE can make this new product into a strong competitor to ESRI and Intergraph in the utility market - and if they can bring forward the strengths of the current Smallworld system then it will be. The announcement is interesting because it shows that you can now develop complex geospatial applications directly on top of Oracle without needing a traditional "GIS". While I think this is of somewhat academic interest to most customers, it is more significant in terms of what vendors in the geospatial industry will look like in future. I personally don't think that the market will see the new system architecture as a competitive advantage, except in situations where organizations have very strong Oracle religion - I think it will be seen as more of a catch up exercise by most people, which negates the perceived weakness of Smallworld's current "proprietary" architecture.

In part 2 I will talk about the real reason why this product could be a significant jump forward for the utility industry, which really hasn't been highlighted in the GE announcements or in any of the commentary I've seen. And I'll talk about why this same factor could be the reason that the product fails. And last but not least, I'll talk about some of the challenges which GE faces in positioning the new product with regard to the existing Smallworld products.

Monday, June 25, 2007

Marc Andreessen blog

I recently came across Marc Andreessen's new blog and think it is really excellent, especially if you have any interest in startup businesses (some good stuff on turning around large companies too). Marc is best known as a cofounder of Netscape and co-author of Mosaic, the first widely-used web browser. He is currently working on Ning, a social networking platform, and has an interesting post on the Facebook application platform, and lots of good stuff on doing startups, including why not to! Social networking (which is what Facebook is all about) is currently a pretty hot area, and there is plenty of scope for geospatial technology to be applied in this space. I signed up to Facebook recently and have found it to be fun in general and interesting from a potential new business point of view.

My pollmappr poll so far

Just for a little fun, and to try out pollmappr from FortiusOne, I asked the question "Should Peter Batty consider 'going to the dark side' and working somewhere in the ESRI universe?". You can see the latest results (and/or vote) in the box below (or click here if you can't see a box).

At the time of writing, the scores are:
  • 42 votes (54%) for "Yes, it is your destiny"
  • 25 votes (32%) for "No, don't do it!"
  • 11 votes (14%) for "Don't know, don't care, who is Peter Batty?"
Where are all you Intergraph and Smallworld voters, I ask?!

Anyway, when I went to look at the spatial distribution of the votes, I was a little disappointed to find that only a small proportion appeared on the map, and all of these were in the US, so I surmised that pollmappr currently only maps votes from the US. I checked in with Sean Gorman at FortiusOne and he confirmed that this was the case, but he said that the next release will show the results as point locations with a heat map and will not need Google Earth for display, and that they will also become data sets in GeoCommons and will be mashable. So actually on re-reading his email he didn't quite answer my question :), but I will put in my vote for support for global data in the future!

Anyway, here's a screen shot of the "yes" votes at the time of writing:

Should Peter Batty go to the dark side?
Not too surprisingly I guess, California leads the yes votes, with seven, followed by Colorado with three and seven other states with one each. But the more interesting statistic is that only 17 out of 42 "yes" votes (about 40%) were from the US, while only 4 out of 25 "no" votes (16%) and 4 out of 11 "don't know" votes (36%) were. So if you look at US votes only, the score is 17-4-4, or 68% yes, 16% no, 16% don't know. Outside the US the score is 25-21-7, or 47% yes, 40% no, 13% don't know. So albeit on a small sample size, there is quite a difference between the US and non-US scores, but we don't know anything about the non-US distribution at this point.

I look forward to the next release Sean, will keep gathering data until then!

Sunday, June 24, 2007

Geospatial articles in Wired magazine in July

I subscribe to Wired magazine, so receive the (paper) magazine shortly before it goes on general sale each month. In the July issue, geospatial technology gets heavy coverage. There is a four page article on Google Maps and Earth - "The Whole Earth, Cataloged: How Google Maps is changing the way we see the world". John Hanke, director of Google Earth and Google Maps, is quoted a number of times. One snippet I hadn't heard before is that Google Earth was "inspired in part by the Neal Stephenson novel Snow Crash - the protagonist uses a software program called Earth, created by the 'Central Intelligence Corporation' and containing a 'perfectly detailed rendition of Planet Earth'". Mike Liebhold from the Institute for the Future and Michael Goodchild from UCSB are also quoted.

This is followed by a futuristic article by science fiction writer Bruce Sterling called "Dispatches From the Hyperlocal Future - that's hyper as in linked and local as in location". The author writes about his imagined life in 2017, and describes himself as a "geoblogger" - hey, some of us are just ahead of our time :) !!

So check out Wired magazine in July!

Friday, June 22, 2007

Latest on Google Maps on the iPhone

Apple today released a twenty minute video showing more details of the user interface and capabilities of the iPhone, so we can all "get ready" for the launch next Friday. They talk about the maps application about three quarters of the way through. They say "one of the most useful tools available on the Internet today is maps (sic), and Google Maps on the iPhone is amazing". It does have a number of flashy user interface features that aren't available in other versions of Google Maps. A few quick observations from the demo: you can zoom in by double tapping or "pinching" the map (dragging two fingers apart), dragging with one finger will pan, and tapping with two fingers will zoom out. Of course the multi-touch interface is one of the most hyped features of the iPhone, and has been popular on other recently announced devices too (such as Microsoft's touch table). You can search for businesses by location using a single search field - in the demo they just type in "san francisco sushi". Pins fly in from the top of the screen to show each location. You can dial a restaurant that you've found, or get a route and show live traffic information. Of course it supports both imagery and street map views.

I had been deliberating about whether to get an iPhone, but think I've been seduced and will have to get one, despite its lack of GPS. Fortunately I have a group plan on Cingular (now AT&T) so I think I should be able to just add one more line to that group relatively cheaply, and keep my BlackBerry 8800 too. So I guess I'll be in line at the Apple Store in Denver on Friday next week - they go on sale at 6pm and there is no pre-booking or buying online (yet).

Adapx digital pen for paper input

One company that got a lot of attention at the ESRI User Conference was Adapx, who launched an application which allows you to use a paper map as an intelligent digital input device. John Calkins of ESRI demonstrated this in the plenary session along with various other "cool things". Several others have blogged about this, including Joe Francica at All Points Blog, who includes a short video of a demo of the system. The underlying technology comes from a company called Anoto, who have been around for a while - I talked about this technology in a presentation at the GITA conference in 2004. The trick is that the paper is "watermarked" with a pattern of dots which allows the pen to determine its location in a very large coordinate space. One of the clever things about the solution is that the pen and paper are used not just for entering the location of geospatial features, but for other "user interface" actions such as specifying the type of feature to be captured. So you can click on a legend at the bottom of the map to say that you are about to add manholes (or whatever), and subsequent clicks will be treated accordingly. You can also capture attribute data using printed lists, or handwriting recognition. All of the intelligence to interpret these actions is applied when you upload the data, so the pen just needs to store the sequence of events. This also means that if the system doesn't register a click on the legend, or registers the wrong item, you could get a significant amount of data captured incorrectly without the user realizing, as they have no real-time feedback on what the system currently thinks they are doing. I guess time will tell whether this is a significant issue in practice or not. But certainly many field users will like the concept of being able to use pen and paper for input.
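
To illustrate how a single missed legend click could silently mis-type a whole batch of features, here is a rough, hypothetical sketch of the kind of upload-time interpretation described above - the event format and legend layout are invented for illustration and are not Adapx's actual design:

    # Hypothetical sketch - legend regions and event format are invented
    LEGEND_BOXES = {                 # regions printed at the bottom of the map sheet
        "manhole": (0, 0, 50, 20),   # (x1, y1, x2, y2) in sheet coordinates
        "valve":   (60, 0, 110, 20),
    }

    def legend_hit(x, y):
        """Return the feature type whose legend box contains the click, if any."""
        for ftype, (x1, y1, x2, y2) in LEGEND_BOXES.items():
            if x1 <= x <= x2 and y1 <= y <= y2:
                return ftype
        return None

    def interpret(events):
        """events: the ordered list of (x, y) pen clicks stored by the pen."""
        current_type = None
        features = []
        for x, y in events:
            hit = legend_hit(x, y)
            if hit:
                current_type = hit            # subsequent clicks capture this type
            elif current_type:
                features.append((current_type, x, y))
            # if the legend click was missed or misread, everything that follows
            # is captured with the wrong type - and there is no real-time feedback
        return features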

So it will be interesting to see whether this really takes off for widespread use. I think that one of the most promising uses is probably in areas where people currently use large paper maps or drawings in the field, such as on construction sites. It has always been a problem to provide on a laptop or tablet screen the same capabilities you get from a large paper construction drawing with a group of people standing round looking at it - you just don't have the real estate on a screen to show all that detail at one time and share it with a group. The ability to mark up a map digitally in this situation could be very useful. Still, for many work scenarios - for example where people visit lots of different locations which are not predefined - a digital display is likely to be more useful than having to carry very large numbers of paper maps.

But it's definitely an interesting alternative input mechanism in a number of scenarios, and I look forward to seeing how it does in the market!

Thursday, June 21, 2007

Microsoft SQL Server Spatial update

I attended the Microsoft SQL Server SIG at the ESRI User Conference to hear a short update from Ed Katibah, who is leading the development of the SQL Server spatial capabilities. He said a few things of note which I hadn't heard elsewhere. One was that they are doing nightly performance tests against all their major competitors, including Oracle, Postgres, Informix and DB2, and, while he wasn't allowed to be specific, he was "very pleased" with the results. He showed an example of a polygon with 600,000 vertices and 7,000 holes, and said that it could be intersected with an offset version of itself in 15 seconds on a 1GHz machine. From the way Ed described the spatial indexing approach, I suspect that it will probably do a pretty good job on this problem of Paul Ramsey's, though I'm sure it would take more than that for Paul to consider a Microsoft solution :) ! He mentioned that Microsoft was rejoining OGC - they had left because of legal concerns relating to how OGC handled certain IP issues, but these had now been resolved. He said that the target for release was the middle of next year, and that there was a very strong focus within Microsoft on meeting that deadline. Both Ed and various ESRI people mentioned that Microsoft and ESRI had been working together to ensure that ESRI would support the SQL Server Spatial capabilities.

ESRI integration with Google and Microsoft

I thought that one of the more significant announcements in the plenary session at the ESRI User Conference was the functionality in 9.3 relating to integration with Virtual Earth and Google Maps / Google Earth. Up to this point, as I've commented before, ESRI has seemed a little reluctant to integrate with these systems, and third party software like Arc2Earth has filled that hole. The Microsoft Virtual Earth blog talks in more detail about the integration capabilities with Virtual Earth. In the plenary, there was a brief demo which showed a nice looking analysis from ArcGIS Server overlaid in a Virtual Earth environment. Given the results of the ESRI customer poll in this area, I guess this type of integration was inevitable, but I still think it's a significant step. Jack consistently tried to position Google and Microsoft as "consumer" products in his talk, but it is clear that they are already being used in many business-oriented applications. Once these easier integration capabilities are available, it will be interesting to see whether that accelerates the move of these "consumer" systems into the application spaces traditionally occupied by the established geospatial vendors. This is scheduled to be available from ESRI next year, while Intergraph plans to provide similar capabilities this year.

Wednesday, June 20, 2007

ESRI User Conference - general impressions

Apologies for not blogging more during the ESRI User Conference, I required all my mental energy to avoid being completely consumed by the Dark Side of the Force :) !! I held out (for the moment at least), but this picture of me towards the end of my three days there shows some cause for concern! After my earlier post on "The Force", Ed Parsons emailed me with the following quote: "Don't be too proud of this technological terror you've constructed. The ability to destroy a planet is insignificant next to the power of the Force". And I have to say that the Force was indeed very powerful in San Diego.
So anyway, this was my first ESRI User Conference after twenty years in the industry, as I have always worked for ESRI competitors, and therefore been on the uninvited list. Others who are more into the specifics of the ESRI products than me have shared lots of details, so I thought I would just share some more general impressions here (mainly for others who are uninitiated), and I will talk about a few more specific things that interested me in subsequent posts.

As many people had told me beforehand, the plenary sessions which fill the whole first day really are a very impressive show, with 15,000 (ish) attendees in a huge hall with 3 giant screens. Jack presided, talking a lot on his usual theme of all the good things that GIS can do for the world, and his talks were interspersed with many presentations and demos from other ESRI staff and customers, all of which were very well rehearsed and choreographed. There was an excellent talk by Nobel Peace Prize winner Wangari Maathai. The message that ESRI customers are doing great and important things was repeated a lot. It is easy to see how people get caught up in all this and catch the "ESRI religion" which I have seen throughout my career, from the "other side". The exhibit floor is also huge and very impressive (and a great place for networking if you're trying to work out your next move in the industry!). And there is also a vast "map gallery" exhibition, where customers show off what they have been doing.

So all that was great, but posts like this one from Sebastian Good help remind you that everything that is presented in the choreographed sessions may not correspond to the real world. There was an admission in the general session that there had been significant support issues with ArcGIS 9.2, and a discussion on what they plan to do about it. And based on my conversations with both customers and implementers who have worked with multiple systems, I think that my former companies, Intergraph and Smallworld (GE), continue to have technical advantages over ESRI in some specific areas including scalability (many concurrent users), workflow, network modeling, and robustness. And of course ESRI faces growing competition, as do all the established geospatial vendors, from the rise of Google, Microsoft and open source solutions (not in all aspects of what it does, but in some significant respects).

But having said all this, when you look at the scale and scope of what ESRI is doing, the religious devotion of its customer base, and the huge effort that it is putting into product development, it remains a daunting task for its competitors to make significant inroads into its Microsoft-like dominance of the industry, especially in its core "professional GIS" space. Though in the area of distributing and sharing geospatial data and certain categories of applications, there is an interesting battle shaping up with the "neogeography" systems, and in some specific markets there is greater competition - for example in the utility market, where it remains a close fought battle between the three major contenders (ESRI, Intergraph and GE Smallworld - General Electric not Google Earth!).

So now the big question for me is ... should I consider going over to "the dark side" with my next career move? For a little bit of fun, I have put together a geospatial poll at pollmappr so you can give me your input. Let me know what you think :) !!

Monday, June 18, 2007

ESRI user conference under way

Here's a picture of Jack doing his stuff

Jack Dangermond at ESRI User Conference

Sunday, June 17, 2007

Small world

I flew from Denver to San Diego this morning for the big conference. I guess the exit row was populated by all the frequent flyers. I was in 10C. In 10A was Jeff Meyers, President of what was Miner and Miner, now part of Telvent, who are ESRI's primary partner for utilities, so the most direct competitor to Intergraph's utility division and Smallworld. In 10B was Jeff's wife Erika Murphy, also with Miner and Miner. And across the aisle in 10D was Paul Yarka from Accenture, another person who's been around the utility geospatial scene for a long time and has been doing a lot of work with ESRI recently. So that made for lots of interesting conversation on the flight down. I wonder if "The Empire" even controls the seating plans on flights into San Diego :) ?!

Sent via BlackBerry from Cingular Wireless

Saturday, June 16, 2007

Quick report on FRUGOS "unconference"

As mentioned previously, FRUGOS (Front Range Users of Geospatial Open Source) held an "unconference" in Boulder, CO, today. I went along with over twenty others, and it was a really excellent event. Many thanks to Sean Gillies and Brian Timoney for organizing things (insofar as an unconference can admit to having organizers!), and to Tom Churchill of Churchill Navigation for hosting us in a beautiful location right at the foot of the Flatirons in Boulder.

Having never been to an unconference before (as was the case for most people there, I think), I wasn't sure what to expect, but an intentionally fairly random process of having people volunteer to speak, demo or lead discussions, with very minimal organizing of the agenda, produced a set of sessions of higher quality than those at most highly structured conferences I have been to. We had some sessions with the whole group and some where we broke into two groups. As well as strictly open source topics, there were several sessions on Google Earth and Maps, and I talked about general geospatial future trends in a shortened version of my recent GITA presentation.

I don't have time to review everything now, but will just mention a few highlights. Scott Davis, author of various books, gave an interesting talk on "rolling your own Google Maps", with a sequence of 12 simple web pages which gradually built up functionality until he had implemented a "slippy map" page allowing dynamic panning and zooming over multiple layers of image tiles. You can check out these examples here - you will just see a directory listing; start by looking at the readme and then work your way through each of the sample pages. It seems like a great little tutorial in JavaScript, and a good way of understanding some of the principles that have been used to make Google Maps so performant and easy to use. Chris Helm from the University of Colorado talked about how they have used various products including MapServer, PostGIS and Google Earth to view glacier data and related imagery. The system links together over 10,000 KML files which are loaded as appropriate, to avoid the overhead of having to download very large KML files. There was another interesting talk on NASA WorldWind.
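
For anyone curious about the arithmetic that underpins this kind of tiled "slippy map", here is a minimal illustrative sketch of the standard Web Mercator tile numbering scheme - the tile server URL at the end is a made-up placeholder, not Google's actual scheme:

    # Standard Web Mercator "slippy map" tile arithmetic - the URL below is hypothetical
    import math

    def lat_lon_to_tile(lat, lon, zoom):
        """Return the (x, y) indices of the tile containing lat/lon at a zoom level."""
        n = 2 ** zoom                                    # tiles per side at this zoom
        x = int((lon + 180.0) / 360.0 * n)
        lat_rad = math.radians(lat)
        y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
        return x, y

    # e.g. the tile covering downtown Denver at zoom level 12
    x, y = lat_lon_to_tile(39.74, -104.99, 12)
    print("http://tiles.example.com/12/%d/%d.png" % (x, y))   # placeholder tile URL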

Gregor Allensworth-Mosheh of HostGIS talked about his HostGIS Linux distribution, which comes with all sorts of open source geospatial goodies installed and ready to run right out of the box, with a nice set of examples. I have a copy of the CD and plan to give it a try when I have the time. Secondly he talked about "how to display 10,000 points in Google Maps", which I thought was great. He had two approaches: one downloaded all the points to the client in JSON format (which is much more compact than other options like KML), and did all the processing on the client to combine multiple points which were close together into a single marker. The other used a WMS service which combined points on the server and rendered a raster image, but still allowed selection of points via a mechanism which went back to the server. Both these approaches overcome one of my pet peeves, which is displaying large result sets in multiple "pages" - a lazy solution which is not at all useful in most circumstances. For example, if I look at the photos I have geocoded in flickr, the result is split into 12 pages, so I just get a random one-twelfth of the 1200 or so photos I have geocoded, with no idea what the real geographic distribution of the whole set of photos is. Come on flickr guys, you can do better than this!
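
As a rough illustration of the first approach - combining nearby points on the client so you only draw one marker per cluster - here is a simple grid-binning sketch; the details are my own assumptions, not HostGIS's actual code:

    # Hypothetical grid-binning sketch: one marker per occupied grid cell
    from collections import defaultdict

    def cluster(points, cell_size_deg=0.01):
        """points: list of (lon, lat). Returns one marker per occupied grid cell."""
        bins = defaultdict(list)
        for lon, lat in points:
            key = (int(lon // cell_size_deg), int(lat // cell_size_deg))
            bins[key].append((lon, lat))
        markers = []
        for members in bins.values():
            avg_lon = sum(p[0] for p in members) / len(members)
            avg_lat = sum(p[1] for p in members) / len(members)
            markers.append({"lon": avg_lon, "lat": avg_lat, "count": len(members)})
        return markers

    # the cell size would shrink as the user zooms in, so clustered markers split apart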

Tom Churchill ended the day by showing us his very cool touch table, with an application which overlaid live video coming from a helicopter on top of a map with both imagery and vector data - it was very dynamic and very cool! This is just a side project for them; their main efforts are concentrated on producing a new generation of in-car navigation system, which aims to do for that category what Google Earth did for online map display - make it much more dynamic and fun. You can get a flavor of what they are up to from these videos, but they really don't do full justice to what Tom showed us. I look forward to seeing how their system develops!

There is certainly a great energy about all that is going on with this new generation of geospatial systems right now, and it was good to meet a number of the people in the Front Range area who are making things happen in this space. I look forward to future FRUGOS events.

Thursday, June 14, 2007

New blog on spatial law

I heard from Kevin Pomfret that he has started a new blog on spatial law. I met Kevin at an event I spoke at in Washington DC a little while back, and he and I have had a number of interesting follow-up conversations on various topics, so I think his blog should be worth checking out. Those of us who are techies at heart might wish we didn't have to be concerned about these things, but increasingly we do need to be aware of legal issues - the recent flurry of privacy concerns over Google Street View is just one example (which Kevin talks about).

As location tracking becomes increasingly pervasive, in particular through location aware phones, defining appropriate policies and law in the area of privacy will be very important. For example, if you are in a car accident, should the police and/or your insurance company be able to access information from your phone (or the GPS system in your car) which could show them whether you were speeding or not? There are a lot of complex issues in this area, with no easy answers.

Wednesday, June 13, 2007

FRUGOS "unconference" in Boulder this Saturday

FRUGOS (Front Range Users of Geospatial Open Source) is holding an "unconference" in Boulder, CO, this Saturday. It is billed as being an "intense and fun discussion at the intersection of geography, location, and technology". There are currently 28 people signed up to attend. I'm planning to go and will volunteer to lead a session on where the industry will be going in the next few years - which may or may not be accepted, as the schedule will just be jointly agreed on by attendees at the beginning of the day. So far a tentative list of ideas for sessions is as follows:
  • OGC -- what is it, who cares? (or, Standards are Your Friends)
  • OpenLayers -- Google Maps WITHOUT the Google Maps (or, Tessellation Is Your Friend)
  • Rolling Your Own Google Maps (I've Got Two Turntables and 150 Lines of JavaScript)
  • Web services, W*S, REST, SOAP, RSS, Atom Publishing Protocol
  • Mobility/Location-Based Services
  • HostGIS Linux: a Linux distro for lazy mapmakers
  • Publicly Available Data <> Publicly Accessible Information: How can we encourage our public sector to embrace new models of (spatial) data distribution
  • Modeling the ancient world
  • KML applications
You can get more information here, and sign up if you are planning to attend.

Tuesday, June 12, 2007

Interesting points from ESRI customer survey

ESRI has published a lengthy pre-conference Q&A document on the user conference blog, which several people have commented on. One answer talked about results from their customer survey, and I thought this highlighted some interesting industry trends.

They said that 45% of customers have asked for tight integration with Google Earth and nearly 47% have asked for support for interoperability with it - so overall, 92% of ESRI customers are looking for integration with Google Earth (assuming that these two response categories were mutually exclusive, which seems to be the case from the context). For Virtual Earth the numbers were a little lower, 26% and 43%, so 69% in total. This just reconfirms the trend we are all aware of, that "serious" GIS users are interested in using Google and Microsoft as a means to distribute their data - but it's interesting to see hard numbers, and 92% is a resounding endorsement for Google. It's also interesting that the vote for Google is quite a bit higher than for Microsoft. I think that in the consumer world and the blogosphere, Google has pretty clearly had a higher geospatial profile, but among the "corporate" GIS users I have talked to, many have leaned a bit more towards Microsoft, if only because their organizations tend to be doing business with Microsoft already. This survey goes against the subjective impression I had formed on that particular point (admittedly from a small sample size).

The other interesting point was that 80% of customers want ESRI to support or tightly integrate its technology with the upcoming Microsoft SQL Server spatial extension - a very high number, especially given that Oracle probably still has around 50% of the database market (48.6% in 2005, according to Gartner). These two numbers don't directly correspond: the ESRI figure is based on the number of customers, so is likely to more strongly reflect the interests of smaller organizations (assuming a large number of small organizations responded), whereas the Gartner figure is based on revenue, so is probably more influenced by large organizations. But nevertheless, it is a very strong statement about the level of interest in Microsoft SQL Server Spatial.

There is a separate statement that less than 19% of customers have asked for tight integration with Oracle Spatial - but unfortunately no comment on what percentage want "support" for Oracle Spatial (which is currently provided via what was ArcSDE, now part of ArcGIS Server), so no direct information on relative levels of interest in Oracle versus SQL Server. I have been thinking for a little while that Oracle Spatial is at an interesting juncture in terms of its position in the market, but I'll save my thoughts on that for a future post :) !

Monday, June 11, 2007

I sense a disturbance in the Force

As I have spent the past 20 years working with various rebel organizations who all regarded ESRI as "The Dark Side", it feels a little strange for me to say that I will be attending my first ESRI user conference next week (I will be there Sunday through Tuesday). Now that I am in a "neutral" role (i.e. unemployed!), the folks at ESRI kindly agreed that I could attend - thanks to David Maguire in particular for helping out with this (despite being a Manchester United fan, David is not such a bad bloke really!). I have heard many stories about the user conference over the years, and am looking forward to the experience and to finding out more about what's going on in that part of the universe.

I was looking around for a picture to illustrate this post, and I thought this one was amusing (from Presentation Zen). I won't make any further comments for now, I've probably got myself into enough trouble with all factions already :) !!

Saturday, June 9, 2007

Travel maps

There are a number of sites out there which let you produce maps showing where you've traveled, that you can embed on web sites, blogs, online profiles, etc. I just came across travbuddy, who have a nice little Flash-based widget which you can use to show which countries (and US states) you have been to. Of course, on their discussion board the Canadians are upset that you can show US states but not Canadian provinces, the Welsh are upset that they get lumped in with the rest of the UK, and Grand Cayman, which I checked in my list, is too small to appear - welcome to the politics of map-making! But I like its simplicity, its interactivity and its friendly not-quite-cartographic look.
Click here for a larger version.

Many sites will only let you put a restricted subset of HTML in profiles and the like - so, for example, you can't use the map above in a flickr profile. Douwe Osinga has a clever approach which can be used in this situation - he manipulates the palette of a GIF image on the fly, so you just need to embed an image, which most sites will allow.
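I don't know exactly how Douwe's script works internally, but the general trick is easy to sketch: start from a template GIF in which each country is filled with its own palette index, then rewrite just the palette entries for the visited countries before serving the image. Here is a hypothetical sketch using the Python Imaging Library - the template file, the country-to-index mapping and the colors are all my own assumptions, not his actual code:

    from PIL import Image

    # Hypothetical mapping from country code to the palette index used to fill
    # that country in the template GIF (the real mapping depends on how the
    # template image was drawn).
    COUNTRY_INDEX = {"US": 10, "GB": 11, "NL": 12, "DE": 13}

    VISITED_RED = (204, 0, 0)         # fill color for visited countries
    UNVISITED_GREY = (230, 230, 230)  # fill color for everything else

    def render_visited_map(template_path, visited, out_path):
        img = Image.open(template_path)   # palette-mode ("P") GIF template
        palette = img.getpalette()        # flat list [r0, g0, b0, r1, g1, b1, ...]
        for country, index in COUNTRY_INDEX.items():
            color = VISITED_RED if country in visited else UNVISITED_GREY
            palette[index * 3:index * 3 + 3] = list(color)
        img.putpalette(palette)
        img.save(out_path)                # serve this GIF from the web server

    # e.g. render_visited_map("world_template.gif", {"US", "GB", "NL"}, "visited.gif")

The appeal of this approach is that the work happens entirely on the server, and the profile page only needs to reference an ordinary image URL.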


create your own visited countries map

He has a visited states map for the US too:


create your own visited states map

Thursday, June 7, 2007

Some ancient history - GIS database article from 1990!

In both my presentations at the last GITA conference, I included a few slides talking about my personal perspective on the history of geospatial technology moving into the mainstream, and Geoff Zeiss was kind enough to comment on this and say that he found it interesting. One of the main points I made was what a long time it has taken, and how we have been talking about moving to the mainstream for 20 years or so.

This prompted me to think that it might be interesting to elaborate on a few of these historical themes, in addition to looking at new developments. I managed to dig out of the (paper) archives (that makes me feel old!) the first significant article I had published, "Exploiting Relational Database Technology in GIS", which first appeared in Mapping Awareness magazine in the UK in 1990; a couple of slightly edited versions came out elsewhere over the next year or so. It reflected the work we were doing at IBM with our GFIS product at the time, using IBM's SQL/DS and DB2, while the Canadian company GeoVision was taking a similar approach using Oracle. Doug Seaborn of GeoVision presented a paper with some of the same themes at the 1992 AM/FM (now GITA) conference with the bold title "1995: the year that GIS disappeared" (note: when making visionary predictions, be wary about attaching dates to them :) !!). He had the right idea, that GIS would become absorbed into mainstream IT - he was just a decade or so ahead in terms of timing. As I remarked at GITA, we techies always tend to think that change will happen faster than it actually does. (As an aside, Doug has been out of the geospatial industry for a long time, but I got an email from him the other day saying that he is now back, and working for ESRI.)

Anyway, back in those days these were fairly radical ideas - at the time, most systems used file-based (and tiled) data stores for (primarily vector) graphics, and a separate database for alphanumeric data. It's interesting how things often go in cycles with many aspects of technology - we spent lots of effort getting to continuous databases and eliminating tiling, which has big advantages for editing linear and areal data, and now the big emphasis is on file-based tiled systems again, for easy, fast and scalable distribution of data (but still with continuous database-oriented systems in the background to create and maintain the data).

While this mainstream database approach was a good philosophy, Smallworld came out with its proprietary database in 1991, which had huge advantages over anything else available at the time - you could dynamically pan and zoom around very large continuous databases without having to extract data into small working datasets first, while more or less all the other approaches of that era (certainly those using the type of database approach described in my article) required data extracts which typically took minutes rather than seconds. This delayed the uptake of the standard relational approach, since it just couldn't match the performance you could get elsewhere. Now, 15+ years later, we have gone through 10 iterations of Moore's Law (performance doubling every 18 months), so computers are roughly 1000 times faster, and that goes a long way towards overcoming those issues!

Quite by accident, I happened to be at the GIS 95 conference in Vancouver, when Oracle announced its new "Multidimension" product, which would later become Oracle Spatial. I met Edric Keighan there (now with CubeWerx), who led that development, and he told me that the development team had a copy of my article posted on their noticeboard as it articulated well what they were trying to achieve.

So anyway, after that rather meandering introduction, here is the article.

Tuesday, June 5, 2007

Another update on BlackBerry 8800 GPS / Mapping software

A few more updates on mapping and GPS software for the BlackBerry 8800, following on from my previous posts on this topic. I plan to pull all my experiences in this area together into a more detailed review shortly.

I have become more impressed with BlackBerry Maps over time. While it lacks the step-by-step directions capabilities of TeleNav and WisePilot, and its maps aren't as detailed or nice looking as either of those or Google Maps Mobile, it does have some advantages. One that I became acutely aware of while I was in Europe, on a data plan where I had to pay by the kB (as opposed to getting unlimited data in the US), is that it requires much less data to be downloaded. I have used it for probably several hours altogether, and it tells me that it has only downloaded around 300kB of data in that time. In contrast, a single map screen is often around 100kB or more in Google Maps Mobile.

If you just want to track on a map where you are going in a moving vehicle (like a car, taxi or train), as opposed to following a route, then BlackBerry Maps is probably the most effective option. Aside from the cost issue, I found that Google Maps would struggle to keep up with tracking in a reasonably fast-moving car, just because it needed to download another 100kB-plus of data every few seconds, whereas the more compact data stream needed by BlackBerry Maps meant that it had no problem keeping up.

Another interesting little technical snippet: I realized that the annotation in BlackBerry Maps is vector based and is dynamically rendered on the client, which means they can do things like a "heading up" orientation, where the map dynamically rotates so that the direction you are heading in is always at the top, while the annotation always stays the right way up. In contrast, Google Maps only offers a "North up" orientation. BlackBerry Maps also tells you your current speed and travel direction, which Google doesn't.
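To illustrate why client-side vector rendering makes "heading up" easy: since the client has the raw coordinates rather than a pre-rendered bitmap, it can simply rotate every point about the screen centre by the current GPS heading before drawing, and then draw the labels unrotated so they stay the right way up. A rough sketch of the coordinate transform, purely my own illustration and not RIM's code:

    import math

    def rotate_for_heading_up(points, center, heading_deg):
        """Rotate projected map coordinates so the direction of travel points 'up'.

        points: list of (x, y) coordinates already projected to screen units
        center: (x, y) of the screen centre (the current GPS position)
        heading_deg: current heading in degrees, 0 = north, 90 = east
        Note: the sign of the rotation depends on the screen axis convention.
        """
        theta = math.radians(heading_deg)
        cos_t, sin_t = math.cos(theta), math.sin(theta)
        cx, cy = center
        rotated = []
        for x, y in points:
            dx, dy = x - cx, y - cy
            # standard 2D rotation about the centre point
            rx = dx * cos_t - dy * sin_t
            ry = dx * sin_t + dy * cos_t
            rotated.append((cx + rx, cy + ry))
        return rotated

    # Text labels would be drawn after this step without rotation,
    # so they always read the right way up.

A raster-based client like Google Maps Mobile can't easily do this, because rotating the downloaded tiles would rotate the street names along with them.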

I continue to like Google Mobile Maps for the quality of both the street maps and imagery, and the addition of support for the internal GPS is a great feature.

I recently downloaded WisePilot, which is more focused on navigation, with turn-by-turn routing and voice directions, so it's similar to TeleNav. As I mentioned previously, TeleNav only works in the US and Canada, whereas WisePilot supports almost all European countries in addition. I like both applications - they are both well designed and do a good job. WisePilot has a nice clean screen layout and a few features which TeleNav doesn't, like showing you your current heading and speed in addition to giving you directions. One drawback with WisePilot is that its geocoding often seemed to work just at the street level - it couldn't determine the correct location on a street - and this happened to me both in Europe and the US. If I give it my home address of 1792 Wynkoop St, zip code 80202, it comes back with a section of Wynkoop St in zip code 80216, a couple of miles away. If you search for points of interest it seems to do a better job of finding the correct location. For this reason, TeleNav still ranks as my first choice of navigation software for the BlackBerry 8800 in the US, but WisePilot is definitely worth a look. One other nice feature is that WisePilot has a web site where you can log in and create locations, save favorites, etc, which automatically download to your BlackBerry.

I bought MobileTracker for BlackBerry from Skylab Mobilesystems (who also develop Spot). It records tracklogs in the background, which is something I was looking for to let me geocode photos. Unfortunately it is no use for this purpose, as it just saves a KML file with a list of coordinates, but does not include any timestamps. It would be much more useful if it exported GPX format, the standard for GPS tracklogs, which includes both coordinates and timestamps. I also found that running in the background wasn't very reliable - twice I just set it running at the beginning of the day, and in one case it collected a couple of minutes' worth of data, in the other case none at all, even though it appeared to be running at the end of the day. You can open up a simple user interface screen while it is running, but there is no indication as to whether it is running correctly or how many points it has captured, just a button you can press to stop recording and save a file. So I'm rather disappointed with this one so far I'm afraid - it would need GPX support and improved reliability and usability for me to recommend it.
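To spell out why the timestamps matter: with a GPX tracklog you can match each photo's EXIF timestamp against the track points and work out where the photo was taken, which is impossible with a bare list of coordinates. A minimal sketch of what a timestamped GPX track looks like when written out - the point format here is my own assumption for illustration, not Skylab's code:

    from datetime import datetime

    def write_gpx(trackpoints, path):
        """Write (lat, lon, datetime) trackpoints as a minimal GPX 1.1 file.

        With a timestamp on each point, photos can later be placed on the
        track by matching their EXIF time to the nearest trackpoint.
        """
        lines = ['<?xml version="1.0" encoding="UTF-8"?>',
                 '<gpx version="1.1" creator="example" '
                 'xmlns="http://www.topografix.com/GPX/1/1">',
                 '  <trk><trkseg>']
        for lat, lon, t in trackpoints:
            lines.append('    <trkpt lat="%.6f" lon="%.6f">'
                         '<time>%s</time></trkpt>'
                         % (lat, lon, t.strftime("%Y-%m-%dT%H:%M:%SZ")))
        lines.append('  </trkseg></trk>')
        lines.append('</gpx>')
        with open(path, "w") as f:
            f.write("\n".join(lines))

    # e.g. write_gpx([(39.7533, -105.0005, datetime(2007, 6, 5, 14, 30, 0))],
    #                "tracklog.gpx")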

There's been quite a bit of hype over the mapping software on the iPhone recently, and I certainly like the look of a lot of aspects of the iPhone, but I think that for mapping applications, the lack of a GPS is going to be a big negative versus other devices which do have one, like the BlackBerry 8800.

Plazes updates

Plazes has rolled out its promised updates, which address a number of issues I had with the previous version, though many of the improvements still feel a little half-baked. You can now edit old plazes (locations you've visited), which is helpful for cleaning up junk - when I was in Europe recently, no matter what I did I couldn't persuade Plazes that I was no longer in the States, and it added all sorts of bogus locations in the US with similar place names to the places I was trying to locate myself in Europe, so I've been able to tidy those up to some extent. But although it lets you modify the addresses of old locations, it doesn't update the map display to reflect the new address (you can do this by manually panning and zooming the map, but that is rather laborious if you want to move a point from the US to Europe!). You can add locations that you've visited in the past, but you can't add the time you were there, so they don't appear in your history, which makes this feature of limited usefulness. You can enter future locations, which is potentially very useful, but at the moment you can't enter a time when you plan to be there, only a day, which again is fairly limiting. Their Plazer software, a downloadable application, works better for me on Windows than the previous version did, but the Mac version doesn't work for me at all, and several others seem to have had the same problem (I bought a MacBook last week).

So in summary: definitely some good steps forward, but still quite a lot of items on the high-priority wish list.

Friday, June 1, 2007

Thoughts on this week's news

It's been a busy week for news, with many announcements timed to coincide with this week's Where 2.0 conference and the Google developer day. For the past couple of weeks I have been on vacation in the UK with limited Internet access, but I got home to Denver last night so am trying to catch up a little.

Both these events were covered extensively online, for example at Google Earth Blog and AnyGeo, as well as the official Where 2.0 site. I thought I would just add a few comments on some of the announcements that I thought were interesting.

Google had a busy week, with announcements including the new Street View in Google Maps, Mapplets which they describe as an easy way of doing "mashups of mashups", and the acquisition of Panoramio. Microsoft announced availability of 3D building data for New York City in Virtual Earth (for the second time - and unlike the last time it's really there now!), as well as a lot of additional data in other places.

Street View is very nicely done, I think, providing a good balance between rich data and simplicity of use. Exactly where you can go and which directions you can look in are more constrained than in a full 3D environment like Google Earth, Virtual Earth or SketchUp, but it's a lot easier to navigate along a street and get an impression of how a neighborhood looks with this more constrained approach. The way that one view smoothly transitions to another maintains context very well. The fact that the data comes from Immersive Media is interesting - I have seen their stuff a few times before and been impressed with it. They capture continuous data from a spherical assembly of video cameras, which suggests that there is a richer set of data behind Street View than is actually exposed at the moment. For example, when looking around Denver I found myself wanting to look upwards at times when I was in front of a tall building and could only see its lower portion. But of course this would add some complexity to the user interface, so there are trade-offs in adding this type of functionality.

This raises the interesting question of the relative value of a "true 3D" environment like Virtual Earth or Google Earth versus a "pseudo 3D" environment like that provided by Street View - and how the value relates to the cost of capturing the data. Obviously there are things you can only do with a true 3D environment, but for many applications something like Street View may provide a simpler user experience, while the data is much cheaper to capture.

The Panoramio acquisition raises a few questions for me. It's presumably not a technology play, as there's nothing difficult about displaying geocoded photos in Google Earth or Maps. So I suppose it's primarily a content play - they have around a million geocoded photos, but this is small compared to the 18 million or so geocoded photos currently in flickr, which can easily be displayed in Google Earth or Maps in many different ways. Google also has its own online photo sharing service, integrated with the Picasa photo software it acquired. They really need to integrate these two offerings quickly - you don't want to have to do something different for photos you want to assign a location to. I want to be able to upload all my photos to one site (I use flickr these days) and assign locations to some of them as appropriate (soon that should be automatic via the GPS in my BlackBerry!) - I don't want to have to load the ones I want to display on a map to a different site. So anyway, it will be interesting to see what Google does in this area.

Microsoft announced its touch table for what it calls surface computing (short video here). This incorporates the idea of multi-touch interaction, which has been around for a bit and will be included in the iPhone - it is a powerful way of interacting with maps. But it also adds in the notion of interaction with multiple different devices by placing them on the table. This is an example of what has been called sentient computing or ubiquitous computing, which I was involved with at Ubisense. An important notion here is computers reacting based on what is happening in the physical world. In particular, if you can precisely determine the location of objects (probably within inches for this type of application), you can do some very innovative things in terms of user interaction with computers. The Microsoft technology is based on video recognition, using cameras located in the table. Unlike a lot of other research systems in this area, which are pretty far from commercialization, the table will be commercially available this year, and not too outrageously priced for something like this - they say in the range of $5-10K, and they expect the price to come down significantly over time.

Finally for now, FortiusOne announced the availability of GeoCommons, which is interesting. They are trying to bring more sophisticated geospatial analysis to a broader market, and have done a lot of work to assemble publicly available datasets into the GeoCommons database. I plan to look at that in more detail in a future post.