Mystery meat UI design in Windows 8, iOS and OS X could point to a confusing computing future

There’s an interesting video online from Chris Pirillo, with his father battling Windows 8. The problem is that he can’t get back to the initial screen, because Microsoft scrapped the UI conventions he understands and has instead hidden the equivalents as corner-based hot-spots. On Daring Fireball, John Gruber comments:

Could be this has no predictive value regarding how regular people will think about Windows 8, but it’s an eye-opener regarding the risk Microsoft is taking by making essential UI navigation elements hidden until you hover the mouse in the right spots. People navigate with their eyes, not by scrubbing the screen with the mouse.

I don’t think Microsoft’s alone here, but the video highlights a possibly worrying trend in UI design. My father recently used an iPad for the first time, and he had no problem with some aspects of the interface, such as launching apps, zooming and so on. But at one point he came across some text that was cut off. “How do I get to the rest of it?”, he asked me. I responded that you just swipe it.

iOS is full of this kind of thing, and its conventions are increasingly coming to the Mac. Scroll bars are hidden, so you’ve no indication (beyond some apps ‘flashing’ the bars as you access new content) whether content is hidden or not. When the visible text happens to end with a complete sentence rather than cutting off halfway through some letters, it’s not obvious you need to scroll, even if you know the required gesture. And then there are the countless apps that now ‘hide’ controls, requiring you to learn new conventions, but for individual apps rather than the system as a whole. Coherence is being eroded as individual apps rather than devices become the tools; it’s almost like a regression, with you having to learn new things every time you buy an app.

I’m not saying these things are necessarily bad. In most cases, modern computing is far more user-friendly than it used to be, and gestures are typically pretty memorable. Additionally, we’re in some cases moving towards more controls in context, which can be helpful. Also, one might argue that many ‘hidden’ aspects of UI are easily learned, and so people really only need to be shown once and they’ll subsequently be fine. But we are definitely seeing a massive shift in how software interfaces work, and I think it’s disingenuous to suggest this is a Microsoft issue or risk—it’s really far more widespread within the industry.

Update: Lukas Mathis explores the new iPhoto for iOS app, in iPhoto’s Mystery Meat Gestures, showcasing problems behind hidden UI.

March 14, 2012. Read more in: Design, Technology

Journo who’s rubbish with technology like iPads in ‘tech like iPads will doom children’ shocker!

Beverley Turner writes for the Telegraph, possibly from some time in the 1970s, with the article The younger generation doesn’t do boredom—it must have an iPad or iPhone to hand. We only get as far as the strap-line before we’re hit in the face with:

Will the boom in apps for children, which has been capitalised on by Disney, stifle their imagination?

I think if we ignore, say, all of the educational interactive apps and games, the answer is a clear yes. Kids clearly cannot be imaginative using a device with myriad apps that can enable them to be creative and have fun.

The technology behemoth Apple is rejoicing after the sale of its 25-billionth app. The Disney game, Where’s My Water?, was, it says, downloaded by a Chinese child who can now swipe one finger across a screen to release water onto a subterranean alligator.

This Chinese child is now, presumably, both unimaginative and receiving counselling, in the mind of Turner, rather than, say, occasionally happily playing an interactive cartoon instead of merely watching cartoons. BAD APPLE!

It is a particularly apt app as its protagonist, Swampy, is also the first original character developed by Disney for a “mobile platform”. Whereas children’s movies such as Cars started on the big screen and morphed into games, Swampy hatched in our palms and will eventually appear in cinemas.

A terrible thing. It’s depressing that Disney has fallen so low as to embrace new forms of technology, rather than making a huge movie and more cynically creating app-oriented marketing-led tie-ins as it has in the past. BAD DISNEY!

Addiction to new characters has suddenly become easier: they are lurking just inside our handbags. Apparently, this is cause for celebration (surely there’s an app to help us look happy). Speaking on the Today programme about the importance of targeting children, Disney’s senior vice-president, Bart Decrem said: “A whole generation of kids is growing up with… [iPads and iPhones] as their ‘first screen’.”

It’s going to be the downfall of civilisation. Kids were much better off when they were growing up with the TV as their first screen. A TV that they couldn’t interact with in anything more than the most basic of ways. A TV that wouldn’t so often encourage “planning, problem solving, and creative self-expression” (GamesBeat).

Fewer phrases could be more chilling

At least to anyone who inexplicably thinks iPads are evil.

– but mainly because he is right. Disney has always known that chocolate-and-snot-covered fingers lead the way to riches. Apps for children will prove extremely lucrative for the company, and may be welcomed by parents up and down the country as convenient new babysitters. I must confess to being recently bowled over by a Times Tables app that kept two energetic eight-year-olds entertained on a long train journey.

So, your kids were entertained and educated on a long train journey, in part through an app? Sounds terrible.

Furiously, I flail against a tide of technology.

“I am rubbish with technology.”

My son, Croyde, was oddly underwhelmed when I presented him with my old CD collection

I can’t imagine why physical media would underwhelm anyone who’s used to whatever music they want, whenever they want it.

I can’t do email on my mobile phone and – brace yourself – I still use an A-Z.

“I am rubbish with technology.”

But, of course, I’m terrified of leaving my children ill-equipped for the future, so they have limited access to our random assortment of gadgets. I’ve suffered countless five-hour car journeys in which iPod, iPad and even DVD player (how retro) screens are dribbled over. Every passing cow and sheep is given a disdainful reception until I eventually yell “Enough!”, confiscate the devices and insist my children gaze out of the window.

Gazing out of the window being a better use of their time than the Times Tables app you mentioned earlier. Right.

The problem is this: our children do not know how to be bored.

How terrible. It must be hell for a child when it’s always got something to do that it enjoys.

[Disney admits its aim] is to “create engagements within the span of a minute”. In other words, you can complete Where’s My Water? in 60 seconds.

No. You’ll complete a level of Where’s My Water? in 60 seconds. Anyone who can complete the entire game in 60 seconds is clearly a liar, or, perhaps, has pilfered Doctor Who’s TARDIS. Also, this is all about relevance for specific scenarios. Why is your article only a few hundred words long and not a book? Because it’s designed to be consumed, like most online newspaper content, in bite-sized chunks. See also: mobile gaming.

We don’t need psychological research to tell us that nascent Michelangelos will struggle to commit to future Sistine Chapels if they expect reward to come in thrilling 60-second bites.

Whereas nascent Michelangelos will commit to future Sistine Chapels by being told to stare at livestock out of car windows? Perhaps art apps, which boost confidence through undos and, according to art teachers at Fraser Speirs’ school, subsequently lead to more exploration in real-life tools, could help children become nascent Michelangelos? Staring at cows is obviously the better route.

God knows how they’ll cope with the monotony of long-term relationships.

By staying engaged and trying new things, rather than being told that boredom is inevitable?

No matter how lovable Disney makes its app characters, looking silently at a handheld screen teaches our children nothing about language, empathy or relationships.

Hogwash. It all depends what’s on the screen. It depends on whether a parent is leaving their kids alone with devices, or playing along with them. It depends on balance—on things like iPads and other technology only being a part of a child’s life. But one thing I would say without doubt is that technology has the potential to enrich lives, from toddlers to centenarians.

Many of us live in a world filled with technology. It’s perhaps only natural and healthy as we age to be a little cautious, but immersion is better than blockage. Recognition of technology’s benefits beats being scared that a cartoon alligator is going to numb the minds of children the world over, when he’s merely offering a brief pastime that can provide enjoyment, planning and puzzle-solving to people of any age.

March 14, 2012. Read more in: Apple, Technology

On the iPad Retina display: Throw away your laser printer and get a 100 dpi dot matrix from the 1980s

Marc Palmer on the iPad Retina display:

If you don’t agree with the statement “the retina display on the new iPad is a game changer” you need to consider this:

When you cannot see the individual pixels, on a screen of this size, it will no longer seem like you are looking at a screen. This has a massive effect on the way the user feels and perceives the product and the software that runs on it. If you don’t believe this, throw away your laser printer and get a 100 dpi dot matrix from the 1980s. While you don’t normally think about it your brain and perception is aware of the tiny black grid separating the pixels and the “unnatural” jagged edges on things.

The massive jump in screen technology since my early Apple computing days has been astonishing. I used my first Mac with what was then a hugely expensive 17-inch Trinitron monitor. It was impossible to entirely lose yourself in the display, because the thing flickered like crazy, with the refresh rate on the optimum resolution being 75 Hz. This dropped to an eye-kicking 67 Hz on one of the alternative resolutions.

The first big jump for me was when flat-screens became the norm. No headaches. No flickering. Just lovely, solid imagery. But then came the iPhone, which crammed about twice as many pixels into every inch as my Mac’s monitor did. It made the Mac display, when viewed a bit too close up, look a bit rubbish. The thing is, none of these things prepared me for my first encounter with an iPhone 4.

I held the thing in my hands and my instant reaction was to peel off the sticker, only it didn’t have a sticker. My brain could not comprehend how sharp this display was. As someone who’d sat in front of several CRT disasters, this new iPhone was quietly laughing at my display history. It had also, in one fell swoop, made everything before it look like crap. I’d thought the iPhone 3GS display was pretty good, but now it looked awful. My iPad’s display, too, felt sub-optimal for reading tasks, due to having a clear case of the jaggies.

Palmer’s right. The new iPad screen is a game changer, because it’s about immersion. In having clear, print-like text, you’re not constantly reminded that you’re looking at a computer display. The device will, when apps are fully optimised, feel more than ever like it turns into the app you’re running. Apple’s rivals that cannot compete will claim otherwise, yelling that, sure, the new iPad has a Retina display, but they have a stylus, or an SD card slot, or can run a discontinued version of Flash. Most people don’t care about those things—the display is what you watch and interact with, and it’s, bar perhaps the software ecosystem, the most important thing about this rapidly evolving field of computing. Anyone who doesn’t believe the new iPad is a game-changer in that regard is just kidding themselves.

Hat tip: Keith Martin

March 13, 2012. Read more in: Apple, Technology

Open and closed is not just black and white, as evidenced by iOS gaming

Michael French for Develop writes about his GDC experience in GDC and the death of the gods. He notes that the gaming gods of the industry—Sony, Microsoft, Nintendo—once offered keynotes that defined the future of the industry, but now the gods are dying, largely due to competition from newcomers. He makes one point that I find particularly interesting:

As journalists like me say, ‘the [blank] happened’. The Internet happened. Facebook happened. iPhone happened. The power shifted. And Microsoft, Nintendo, Sony—they all lost some relevance. They had to share power with platforms that were built, at a macro level at least, to not be so draconian. For better or worse, platforms like the App Store are free markets instead of walled gardens.

In case you didn’t catch that:

platforms like the App Store are free markets instead of walled gardens

This isn’t the first time I’ve heard this. At an EA event last year, I spoke to a few developers who’d created games for a number of platforms. They glumly told horror stories of their experiences on the ‘god’ platforms, before brightly saying what a breath of fresh air the relatively open iOS ecosystem is for gaming. Yet we most often only hear about the times when someone at Apple comes down with a bad case of the stupids, rejecting a game or app for spurious reasons, and not the many thousands of games that have ended up on the App Store that simply wouldn’t exist for any other mobile platform.

I’m not suggesting iOS is the most open of platforms, because it clearly isn’t, and it would be great to see the likes of OS X’s Gatekeeper arrive on iOS, providing a little extra freedom regarding apps that can be installed. But open and not-open isn’t black and white—instead there’s a diverse range as you move from one extreme to the other, and this is especially true when it comes to mobile gaming.

March 10, 2012. Read more in: Gaming, Technology

Consistency across platforms is about more than direct interaction—it’s about concepts

For The TechBlock, Abdel Ibrahim and Jon Dick write Microsoft poised for tablet resurgence, attempting to compare experiences offered by Microsoft’s upcoming Windows 8 and Apple’s OS X and iOS:

Windows 8 […] will roll out across desktops and tablets [and] although Apple’s forthcoming Mountain Lion, due out in late June, will look to blur the line that’s so far separated desktops from mobile devices, it won’t do it to the degree that Microsoft intends. That’s because the software company isn’t planning to simply share features between distinct operating systems, as will Apple. Rather, Microsoft hopes to introduce nearly identical experiences (or as close as the hardware will allow) to each.

If Microsoft pulls that off, and we have no reason to suspect it won’t, it’ll make a very powerful argument to embrace whatever tablets it simultaneously debuts. And it’ll do that for the same reason consumers have gone gaga for all things iOS: people like intuitiveness and familiarity; they like unwrapping a new product and not having to learn the ropes. And that’s precisely the sort of seamlessness Microsoft’s next tablets have in store for the hundreds of millions of consumers who are bound to line up for Windows 8 for desktop (if Windows 7’s reception is any indication).

This opinion is one I’m increasingly hearing, but there are two problems, which are intertwined. First, as Andy Ihnatko and Christian Cantrell (among others) have pointed out, Windows 8 effectively has split-personality disorder. Everyone seems to like Metro but hate the jolt of switching to the more typical Windows Desktop. And the gist is that Metro’s great for mobile but not suitable for desktops, while Desktop mode is, naturally, still a good fit for desktops but not so much for mobile devices.

Secondly, people misunderstand what Apple’s doing with its operating systems. They either think Apple’s turning OS X into iOS, or that not enough of OS X has been sent in the other direction. (Never mind that iOS includes apps for email, music playback, dealing with calendars, and so on, all taken from the desktop…). But what Apple’s really doing is creating a consistency of experience in terms of concepts; conversely, Microsoft’s attempting to provide literally the same experience on the desktop and mobile, regardless of suitability.

Apple’s stance is most obvious in Mountain Lion, which freaks out long-time Mac users with its ‘inspired by iPad’ headline. But what’s really happening here is the unbundling of workflows, making each app focussed. Instead of going to iCal for to-dos and your calendar, you’ll instead go to Calendar for your events and appointments, but use Reminders for your to-dos. And you do the same on iOS. The methods of interaction will not be identical, because touchscreens and desktop machines/laptops do not provide identical interaction experiences. But enough aspects of the operating systems will be similar that someone should be able to switch with reasonable ease between iOS and OS X, because the fundamental concepts will be familiar in both.

Microsoft’s gamble is that Apple hasn’t gone far enough, and that the user should instead have the exact same interaction and conceptual model across all devices. But, as noted in the aforelinked articles, this is coming at the expense of a strong user experience, which is heavily compromised on every device the user interacts with. Years back, Microsoft might have gotten away with this, but the reason people have flocked towards iOS and are increasingly buying Macs is because they offer strong user experiences and seek to make things less complex. In seeking to solve one problem for the user—relearning interaction with an OS—Microsoft’s merely placed massive barriers throughout the entire experience, ending up with something that could be fantastic if logically separated into two operating systems, but that appears fundamentally flawed as one.

March 9, 2012. Read more in: Apple, Technology
