Nielsen Norman Group slams gestural interface usability, ironically points finger at iPad and iPhone

Nielsen Norman Group has slammed gestural interfaces in an article entitled A Step Backwards In Usability:

The usability crisis is upon us, once again. We suspect most of you thought it was over.

Given that two-year-olds and centenarians are using iPads, I did, yes.

Well you are wrong.

Oh.

In a recent column for Interactions (reference 2) Norman pointed out that the rush to develop gestural interfaces – “natural” they are sometimes called – well-tested and understood standards of interaction design were being overthrown, ignored, and violated.

Violated? Sounds serious. SOMEONE CALL THE USER INTERACTION POLICE.


=====

INT: Nielsen Norman Group. Donald A. Norman and Jakob Nielsen get into their superhero outfits and zoom towards the scene.

Super Norman: OH MY GOD, it’s worse than we thought, Jakob. It’s horrific.

Super Nielsen: Yes, new technologies require new methods, but the refusal to follow well-tested, well-established principles leads to usability disaster. I will KILL THE VIOLATORS WITH MY LASER VISION.

Super Norman: You don’t have laser vision, Jakob.

Super Nielsen: Bugger. How about moaning about the iPad in my bi-monthly column for ACM CHI magazine, then?

Super Norman: Sounds great!

END CREDITS

=====


OK, *serious face*, these guys do have some good points regarding visibility, consistency, scalability and reliability—all standard tenets of strong usability. Gestures aren’t necessarily easily discoverable in iOS and other touch-based systems, but that’s also largely because many of them are new; guidelines are, through popularity, slowly being formed. Nielsen Norman Group also don’t seem to note that the intuitive nature of gestural interfaces (rather than the abstraction seen in other forms of computing) means that things are more easily learned and less likely to be forgotten. My dad can happily do stuff on my iPhone, despite not owning any iOS device, yet his Mac still regularly flummoxes him.

Anyway, back to the article:

The first crop of iPad apps revived memories of Web designs from 1993, when Mosaic first introduced the image map that made it possible for any part of any picture to become a UI element. As a result, graphic designers went wild: anything they could draw could be a UI, whether it made sense or not. It’s the same with iPad apps: anything you can show and touch can be a UI on this device. There are no standards and no expectations.

No standards? Really? I’m pretty sure Apple has extensive guidelines on user interaction. But there are apparently other reasons people are having trouble.

The misguided insistence by companies (e.g., Apple and Google) to ignore established conventions and establish ill-conceived new ones.

Yes. Let’s stop innovating.

The developer community’s apparent ignorance of the long history and many findings of HCI research which results in their feeling of empowerment to unleash untested and unproven creative efforts upon the unwitting public.

JUST STOP TRYING NEW THINGS, IGNORANT DEVELOPERS!

In comments to Nielsen’s article about our iPad usability studies, some critics claimed that it is reasonable to experiment with radically new interaction techniques when given a new platform. We agree. But the place for such experimentation is in the lab.

ALTHOUGH IF YOU’RE RICH DEVELOPERS, WE PERMIT YOU TO EXPERIMENT IN YOUR ‘LAB’!

Most progress is made through sustained, small incremental steps. Bold explorations should remain inside the company and university research laboratories and not be inflicted on any customers until those recruited to participate in user research have validated the approach.

Bold explorations like the top-selling iPad and iPhone, you mean, rather than the sustained, small incremental steps we’d previously seen in smartphones and tablets? OK, sounds great. I’ll see you back before the turn of the century and we can party like it’s 1999 until we die of RSI through using our mice until our arms explode. I look forward to it.

Hat tip: Chris Brennan.

May 26, 2011. Read more in: Design, News, Opinions, Technology


To WebP or not to WebP—should Safari, Firefox and IE embrace another image format for web design?

Ryan Carson:

Google has used their insanely smart engineers to create an image compression algorithm that’s just as good as JPEG but 39.8% smaller. It’s called WebP and it’s pronounced “weppy”. You can create WebP images in Acorn, Pixelmator, ImageMagick, Leptonica and XnConvert. If you use Photoshop, you can also install the WebP plugin.

The problem is it’s currently only supported by Chrome and Opera, but if all of us in the web community make enough noise, we might succeed in getting it to be adopted by all major browsers.

Ryan’s a smart guy, but I’m curious why he’s fighting so hard for WebP: bar some slightly superior compression to JPEG—and the quality of said compression is often very subjective—it offers no real benefits and lots of drawbacks.

Jeff Muizelaar:

WebP also comes across as half-baked. Currently, it only supports a subset of the features that JPEG has. It lacks support for any color representation other than 4:2:0 YCrCb. JPEG supports 4:4:4 as well as other color representations like CMYK. WebP also seems to lack support for EXIF data and ICC color profiles, both of which have become quite important for photography. Further, it has yet to include any features missing from JPEG like alpha channel support. These features can still be added, but the longer they remain unspecified, the more difficult it will be to adopt.

Where does that leave us? WebP gives a subset of JPEG’s functionality with more modern compression techniques and no additional IP risk to those already shipping WebM. I’m really not sure it’s worth adding a new image format for that.

I agree. I’d love to know why people think we should care about WebP. I was very happy when PNG was broadly supported, due to the clear benefits in web design, such as alpha channels. But slightly better compression in a format that actually offers less flexibility? That’s an odd thing to fight for.
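If you fancy testing the compression claim on your own images, a few lines of Python will do it. This is just a sketch: it assumes Google’s cwebp command-line encoder (part of the libwebp tools) is installed, and photo.jpg is a placeholder for whatever test image you have to hand.

```python
import os
import subprocess

def webp_size(source, quality=80):
    """Encode an image to WebP via Google's cwebp tool; return the output size in bytes."""
    out = os.path.splitext(source)[0] + ".webp"
    # cwebp accepts JPEG/PNG input: cwebp -q <quality> <input> -o <output>
    subprocess.check_call(["cwebp", "-q", str(quality), source, "-o", out])
    return os.path.getsize(out)

jpeg = "photo.jpg"  # placeholder: any test image you have lying around
original = os.path.getsize(jpeg)
converted = webp_size(jpeg)
print("JPEG: %d bytes, WebP: %d bytes (%.1f%% smaller)"
      % (original, converted, 100.0 * (original - converted) / original))
```

Run that over a folder of photos and judge for yourself whether the savings justify adopting a whole new format.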

There are signs things might change, with Google making promises at I/O about new features, but even then WebP still feels like a solution looking for a problem, and one that would be better solved by PNG gaining another compression stream.


May 26, 2011. Read more in: Design, Opinions, Technology


Why you shouldn’t mimic real-world interfaces in software

One of the things that bugs me about iOS is Apple’s real-world design. It makes some of its apps akin to real-world items, and so you get a leather-bound calendar for iCal or a virtual book for iBooks. The idea, presumably, is to help people figure out how to use something by providing something they recognise.

The problem with this approach is that it doesn’t really work. Virtual items with virtual controls don’t behave like their real-world counterparts: you don’t have a scrolling panel in a real calendar, for example. The design often doesn’t follow through to the details either, which is particularly apparent in iBooks. That app sits its ebooks on top of an image of an open book, but the image never changes as you read, never updating the number of pages beneath the one(s) you’re viewing. What could have been a useful indicator of where you are in any particular volume therefore becomes detrimental (because the eye assumes you’re not making progress, and iBooks then has to provide an alternative, software-oriented progress bar), and iBooks manages to feel less book-like than Kindle. Amazon, of course, does away with such design garbage, instead just giving you the content and a few ways to adjust how it looks (in terms of font styles).

Ben Brooks addresses a further problem at The Brooks Review. In Don’t Mimic Real-World Interfaces, he argues that instead of striving for realism, software designers should take full advantage of the power of computers, providing new solutions to problems rather than aping ones built in the real world decades ago.

Ask any person who has used Soulver for Mac or iOS if they think Soulver was difficult to figure out—it is leaps and bounds better than any other calculator app, yet it doesn’t look like any other calculator app. It took me all of two minutes to figure out how to work the app and to realize just how much better it is. What Soulver did was not try to replicate the beloved HP 12c, instead they rethought what a calculator app was to be—and how it should be designed if it is only made for use on a computer, from day one.

It is what calculators would have been if they were invented at the same time computers were, instead of what we have with most calculator apps.

I totally agree with this. Soulver is a fantastic app, like a ‘back of an envelope’ that does the sums for you. Instead of being a virtual calculator, it’s a little bit spreadsheet, a little bit text editor, with quite a bit of power under the hood that you can choose to use or not. If you’re a beginner, you can simply paste lists of values (such as a shopping list of items and prices) from emails and other documents into the main pane and it’ll work out a total (without you having to laboriously remove the related text, which would also remove the context from your calculations). If you’re happy going deeper, you can work with operators, currency conversion, and mathematical functions. You might argue that Soulver lacks that initial point of recognition (“This is a calculator?”), but it enables you to do commonplace calculations a lot more quickly than you can in typical calculator apps for Windows, Mac and iOS.
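To show what I mean by the ‘back of an envelope’ trick, here’s a toy sketch in Python. To be clear, this is nothing like how Soulver actually works under the hood; it merely demonstrates the idea of totalling the numbers in pasted free-form text without stripping out the context around them (the shopping variable is a made-up example).

```python
import re

def back_of_envelope(text):
    """Total every number found in free-form text,
    leaving the surrounding context intact."""
    total = 0.0
    for line in text.splitlines():
        for number in re.findall(r"-?\d+(?:\.\d+)?", line):
            total += float(number)
    return total

# A shopping list pasted straight from an email, context and all:
shopping = """milk 1.20
bread 0.89
coffee 3.50"""
print(back_of_envelope(shopping))  # ~5.59
```

The real app obviously goes much further (operators, currency conversion, functions), but even this much shows why not having to strip out the words is such a win.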

In a related article, Brooks also looks into iCal and its resolute desire to stick with real-world conventions and simulated paper, rather than rolling in more dynamic design ideas from GTD apps that would benefit everyone. Given the absolutely hideous iCal UI (Ars Technica) in one of the latest Lion builds, I suspect no one at Apple shares Brooks’s opinion, nor his taste.

(One possibility, of course, is perhaps the design is intentionally hideous. There’s a full-screen button on the new iCal, to make it a proper full-screen app, in the same manner as iPhoto ’11. If people are so offended and distracted by the torn paper and horrible fake-leather toolbar, perhaps they’ll be more likely to explore the new mode.)

April 15, 2011. Read more in: Apple, Design, Opinions, Technology


On creating a new save icon for the world

On his blog, David Friedman gets excited about a new save icon. His thinking: the floppy disk is archaic, and many current computer users have likely never used or seen one. Therefore, he creates something new.

I’m not really sure the floppy disk save icon is a problem anyway, for two reasons. First, as iOS has shown, the very concept of manually saving files will soon be obsolete. Mac OS X Lion will soon enable devs to make apps regularly autosave (and provide versioning) on the Mac desktop; other systems will rapidly follow suit. Second, popular icons and icon concepts transcend technology and time. As journo chum Chris Brennan wryly pointed out on Twitter:

In the UK the sign for a level crossing is a steam train. I’m not so sure a floppy disk as a save icon is the end of the world.

The difficulty in replacing such icons is twofold. First, you have to essentially override what’s in people’s heads: icons that are recognised in an instant. Second, you have to create something that’s at least as recognisable as what you originally had. This is where Friedman failed, in using a baseball home plate.

The “safe” icon is pointy on one end like an arrow. This can be used to indicate where your file is saved. If the latest version of your file is saved locally, it points down. If the latest version of your file is saved on a server somewhere, it points up.

I’m sure if you know baseball, that all makes sense. But I don’t really know much about baseball—it’s not really a worldwide sport. Similarly, replace the save icon with some kind of football (as in soccer) icon and you’d have Americans scratching their heads. And anyone else who doesn’t know or care for football.

To be fair, it’s very clear that Friedman was only experimenting and playing around, but his article shows how tough it can be to replace existing and popular icons with something that can and will be recognised almost universally. In the meantime, he jokes:

But I still like my idea and urge it to be adopted by anyone writing software for Americans who are baseball fans without internet access or a modern operating system.

April 6, 2011. Read more in: Design, Opinions, Technology


Ideas are worth nothing—you need to make things

Sage advice from Wil Shipley in his blog post Success, and Farming vs. Mining. Although primarily about software (not least the difference between those who create to sell out and those who simply want to produce great software), the conclusion is something people in all creative disciplines should be mindful of:

All ideas suck, because they are just ideas. They’re worth nothing.

My success is because I worked to make the idea real. A lot. All my life. Starting when I was 12, I learned to program, and I’ve programmed every spare moment since. I didn’t become a millionaire until I’d worked at it for eighteen years. There was no genius idea I had. I just kept working, hating what I did before, and working some more to make it better.

And when you’re done with Shipley’s piece, read Austin Kleon’s How to Steal Like an Artist (and 9 Other Things Nobody Told Me), an excellent essay that advocates just getting on and creating stuff, rather than mulling things over and doing nothing.

April 4, 2011. Read more in: Design, Opinions, Technology

