One of my main hopes for 2014 is that things will start working better. In many ways, I feel that both iOS and Android took a step backward from this ideal this year.
One notable example I ran into was in trying to get photos off of my wife’s Samsung Galaxy (Android) phone and into iPhoto on the Mac. Intuitively, it should just work. You should be able to plug a Samsung phone into the Mac via a USB cable and iPhoto should just start importing the photos. This is how it would work if you plugged in a Samsung (or any other brand) camera. When I took to Twitter with this issue, I was told by some Googlers: “Oh – it’s easy! Just download Android File Transfer for Mac!” Riiight. That’s kind of missing the point. There already is a perfectly good way of doing this – a way that “just works.” The introduction of a new step, or series of steps, or new pieces of software, moves us away from that ideal.

How exactly are regular people supposed to use this stuff? How would I, as a non-technical user, know that I am supposed to download this special application, which is not mentioned anywhere on the phone’s interface? If the answer is “you just have to download XYZ application and install it, then go through an extra set of procedures whenever you want to get photos off of your phone and into iPhoto (which is the Mac application most people use to manage photos, since it ships with every Mac),” then I think we’ve lost sight of something. And what is this in service of, anyway? Possibly locking the user into a specific ecosystem? In this example, both Samsung and Google have provided different pieces of software for getting my photos off the phone, so the user is caught between two competing ecosystems.
Again, this is only one example of what I feel is a trend away from plug-and-play and intuitive usability.
Another example of this move away from intuitive UI has been the numerous issues with iOS 7. I tweeted about my email accounts disappearing, which seemed to be happening to others as well. Some of the other issues I’ve encountered include:
- the “false screens” that greet you when you launch an app. I’m sure someone at Apple thought this would be a good idea: show the user the last screen they were looking at while the app initializes. Except it’s not a good idea. It’s a terrible idea. Because that screen is invariably not the screen that will present itself when the app finally does initialize. This is especially frustrating when the “false” screen is the one you want (e.g. the album you want to listen to in the Music app) but once the app finally starts you are presented with some other screen. On top of that, the false screen is often not even what you were previously looking at, but some other random screen from the app, causing more confusion (and potentially raising privacy issues). Disturbingly, this approach seems to be migrating to OS X Mavericks as well;
- magical gestures that “you just need to know” (e.g. swipe left) in order to accomplish anything (like filing a message away). These existed on iOS 6, but they’ve gotten worse on iOS 7 because it’s much easier to trigger some behavior accidentally (e.g. going back a page in the browser);
- in the Mail app, you can no longer search by sender or subject (was I the only one using this? You could argue it’s a power-user function, but honestly I think most email users occasionally need to search by sender – and these can be very time-critical situations, like “what is the record locator code for the flight I am trying to check in to, with a line of people behind me?”);
- the “now playing” icon in the music player is a cute little “EQ” animation – cute except that (1) it doesn’t match the music, (2) it is not a universally understood indicator of “now playing” the way an arrow is, and (3) it’s a return to the skeuomorphism I thought we were trying to get away from;
- the Wi-Fi connection manager has become borderline unusable – it seems not to attempt to connect unless you are using the browser (connected apps don’t trigger it?), it provides no visual indication of the connection, and so on. These issues are especially crippling when traveling on London’s Tube, where you can connect in stations but not between them (and you have no cellular signal);
- when you are off-line, you still get multiple modal dialog boxes in the Mail app informing you of this. If you have multiple email accounts, it seems determined to inform you separately for each account instead of simply saying “hey – you’re off-line.” Better yet, why not use a non-modal indication of some kind, since these dialog boxes serve zero purpose;
- another browser UI issue: on iPad, Safari can’t go full screen and never gets rid of its tab bar, even when there is only one tab open, so a ton of (precious) screen real estate is wasted on browser chrome (also pushing me to use Chrome – capital C – more and more);
- AirPrint has stopped working without explanation – it worked fine with my HP printer pre-iOS 7, but now it’s a no-go.
Intuitive UI is Apple’s differentiator, so this trend is especially disturbing for Apple stockholders such as myself. Is it possible that Steve Jobs was single-handedly holding back this tidal wave of bad user interface design?
What about the Web? Popular Web sites and applications have not been immune from the move away from intuitive UI in the name of ecosystem lock-in. Examples include YouTube’s replacement of its comment system with Google+ (part of Google’s general move toward making everything revolve around Google+).

More generally, the Web has had a big issue with off-line use that has come into sharp focus in 2013 as more and more Web usage happens from mobile devices. This has led to very poor user experience for Web sites and applications on mobile – with some notable exceptions, such as the Financial Times. This is not an apps-vs.-Web post, but the Web needs better underlying plumbing for off-line use in order to fix these issues and enable a good mobile Web experience. Efforts such as Service Worker are showing a possible way forward – hopefully we’ll see this migrate into mainstream browsers and then into high-use Web applications soon. Performance and access to device capabilities (APIs) are two other areas where the Web has come up short this year. While these are not things that have “gotten worse” in 2013, the rise in the use of the Web from mobile devices by mainstream users has underscored these issues.

The other disturbing trend I see on the Web is technologies and “standards” being unevenly supported across browsers and platforms – fragmentation – and the tendency for browser makers to tout new or experimental features to developers even when these features are not available in other browsers. Examples include WebRTC (in Chrome and Firefox but not in Safari or IE) and Apple’s push API (only in Safari on OS X Mavericks, but presumably rolling out to iOS Safari soon). This leads to bad user experience in the form of “best viewed in XYZ browser,” which harkens back (not in a good way) to the early days of the Web.
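To make the Service Worker idea concrete, here is a minimal cache-first sketch. The API is still an early draft as I write this, so treat the names as provisional; the cache name and the list of precached URLs are made up for illustration.

```javascript
// sw.js – sketch of a cache-first Service Worker (draft API; names may change).
// The cache name and precached URLs below are illustrative, not real.

var CACHE = 'offline-v1';

function handleInstall(event) {
  // On install, pre-populate a cache with the app shell.
  event.waitUntil(
    caches.open(CACHE).then(function (cache) {
      return cache.addAll(['/', '/index.html', '/styles.css']);
    })
  );
}

function handleFetch(event) {
  // Serve from the cache when possible; fall back to the network.
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
}

// Attach the handlers only when the Service Worker globals actually exist,
// so this file is inert outside a Service Worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', handleInstall);
  self.addEventListener('fetch', handleFetch);
}
```

A page would opt in with something like `navigator.serviceWorker.register('/sw.js')`, after which the fetch handler can answer requests even with no connection – exactly the plumbing the Tube scenario above is crying out for.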
Of course, experimentation with new features and technology is how the Web moves forward, but I’m hoping to see a reversal of this trend in 2014 and a return to the promise that the Web should work across platforms and across browsers.
I am not a UX professional and I don’t pretend to be one. But, much like pornography, I know bad UI when I see it. And I hope I see less of it in 2014. Thanks to my good friend Scott Hughes for his helpful advice on this post.