Wednesday, November 21, 2012

Apple owns the interface of today, Google owns the interface of tomorrow


I've been thinking a bit lately about computer interfaces, and I feel like we're going to see a big change in the next 5-10 years, a shift probably as big as the advent of the GUI.  And I think that Google is the company best prepared to deliver that future.

Let's imagine what the ideal interface would look like.  Ideally, I would just have a computer respond immediately to my every thought.  I would think "remind me to get milk on the way home" and the computer would just do it.  I would think "make this figure with a bunch of equally sized circles here, here and here" and the computer would just do it.  This is obviously still a dream (although perhaps one that is less far off than we think).  But the idea is that the computer just does what you want without you having to do a lot of work.

Contrast this with the interfaces of yesterday and today.  In the '80s and '90s, we had software that came with thick instruction manuals and very much made us do all the hard work of translating what we were trying to accomplish into the computer's terms, remembering all those weird key codes just to get Word (or WordPerfect, hah!) to change some stupid font or something like that.  Over time, interfaces have taken a huge step forward, probably because of some combination of better design and more powerful hardware.  Nowadays, it's much less common to read the instructions: interfaces are much more "discoverable", and the usage of a well-designed program (or app) will usually be fairly obvious.  Apple is quite clearly the best at this model.  Their apps (and apps in their ecosystem) really do require little to no instruction to use and typically do exactly what you think they will.  They're better in that regard than Google, and definitely better than Microsoft.  And don't even get me started on Adobe Illustrator.

But this is very much the interface of today.  As computers get more powerful, I think there's a change underway toward interfaces that are even closer to that ideal of "just think about it and it happens".  To me, the best example is Google search.  Google search has a seemingly magical ability to know what you're thinking about almost before you think it.  It suggests things you want before you finish typing, and even things you didn't know you wanted, and it does this on a personal basis and does it super fast.  It doesn't care if you misspell or mistype or whatever.  It just does what you want, at least for some set of things.  It also responds to a variety of different types of input.  I can type "weather" and my local weather pops up.  If I type "weather boulder CO", it gives me the weather in Boulder.  Same if I type "weather 80302".  It doesn't care, it just knows.  It's another step closer to the computer conforming to you rather than you conforming to the computer.
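
To make that concrete, here's a minimal sketch in Python of the kind of query understanding I'm describing: several very different input strings all resolve to the same "weather" action.  Everything here is made up for illustration; what Google actually does is vastly more sophisticated (personalization, spelling correction, ranking across billions of documents).

```python
import re

# Toy intent parser: maps free-form queries like "weather",
# "weather boulder CO", or "weather 80302" to one structured action.

def parse_weather_query(query):
    q = query.strip().lower()
    if not q.startswith("weather"):
        return {"intent": "unknown", "query": query}
    location = q[len("weather"):].strip()
    if not location:
        # Bare "weather": fall back to the user's current location.
        return {"intent": "weather", "where": "current location"}
    if re.match(r"\d{5}$", location):
        # A 5-digit ZIP code like "80302".
        return {"intent": "weather", "zip": location}
    # Anything else is treated as a place name, e.g. "boulder co".
    return {"intent": "weather", "place": location}

for q in ["weather", "weather boulder CO", "weather 80302"]:
    print(q, "->", parse_weather_query(q))
```

The real thing is a machine-learning problem, not a pile of string checks, but the shape is the same: the program absorbs the variation in how you ask so that you don't have to.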

Apple is trying to make headway in this regard with Siri, and it's true that Siri came out before a similar option from Google.  But the internet abounds with examples of Google's new voice tool kicking Siri's butt:
[Embedded video: Google voice search and Siri, side by side]

One of the most telling moments in the video is when the narrator searches for "How tall is Michael Jordan": Google's answer shows up instantly, while Siri takes 5-6 seconds.  It's not really about the raw timing, though; the narrator says something like "Those seconds count, because if it takes that long, you might as well just Google it."  To me, that's the difference.  Google has a HUGE lead in this sort of search query, probably an insurmountable one, and Apple is just nowhere close.

Searching for stuff about celebrities, etc., is one thing, but this has real practical consequences as well.  Consider the Apple Maps fiasco.  Many have pointed out that the maps are inaccurate, and perhaps they are.  I haven't really noticed anything like that, honestly, and I actually like the new app's design and interface a lot.  To me, the far bigger problem is that it just doesn't have that magic "I know what you mean" intelligence Google has.  If I search for "Skirkanich Hall" in Google Maps, it knows exactly what I mean.  The same search yields a bunch of random crap in Apple Maps.  This sort of thing pervades the new Maps app, where you often have to type in the exact address instead of just saying what you mean.  To me, that's a huge step back in usability and interface.  It's making you conform to the program rather than having the program work for you.
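
To illustrate the two models, here's a toy sketch in Python (the place index and coordinates are invented) contrasting exact-string lookup, where you have to conform to the program, with a forgiving fuzzy lookup, where close enough works:

```python
import difflib

# A tiny invented index of place names; a real maps backend has
# millions of entries plus signals like your location and history.
PLACES = {
    "skirkanich hall, university of pennsylvania": (39.952, -75.190),
    "van pelt library, university of pennsylvania": (39.953, -75.193),
    "philadelphia museum of art": (39.966, -75.181),
}

def exact_lookup(query):
    # "Conform to the computer": only the exact full name works.
    return PLACES.get(query.lower())

def fuzzy_lookup(query):
    # "I know what you mean": return the closest-matching entry.
    matches = difflib.get_close_matches(query.lower(), list(PLACES),
                                        n=1, cutoff=0.3)
    return PLACES[matches[0]] if matches else None

print(exact_lookup("Skirkanich Hall"))  # None -- not the exact key
print(fuzzy_lookup("Skirkanich Hall"))  # the Skirkanich Hall coordinates
```

Even this crude similarity cutoff "knows what you mean" in a way exact matching never will, and doing it well at scale is an AI problem, not just a design problem.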

The problem for Apple is that this Google magic is not just about good design (which Apple is rightly famous for).  It's about making some real R&D progress in artificial intelligence.  Apple certainly has the money to do it, and I think I read something about how they're increasing their R&D budget.  But they're comically far behind Google in this regard.  So I think the interface of tomorrow will belong to Google.
