Dear tech journalists: your experience is not ubiquitous

I recently read with interest Thrillist’s tech article Why you should ditch Google Maps for Apple Maps. Mostly, I read with interest because the New York-based writer’s experience — and his reasons for ditching Google Maps — didn’t remotely tally with my own.

Among other things, he argued Apple’s public transport directions are “infinitely better than Google Maps”, search is faster and more specific than Google’s, you get a 3D satellite view of your city, and you can access up-to-the-minute train arrival times.

All of this might be true in New York. Elsewhere, it’s often a different story. Here in the UK, I’ve found Apple Maps has fairly poor intelligence when it comes to points of interest (or, indeed, often even when searching for cities and towns), and little knowledge of public transport outside London. Also, 3D satellite views of capital cities are a fun toy, but Street View (which Apple Maps currently lacks a direct equivalent of) has for me proven practical when checking out an upcoming journey and looking for landmarks.

The point isn’t that Thrillist got it wrong. For some people, I’m sure Apple Maps is an excellent product, and one that enables you to avoid Google’s app if having anything to do with Google on your iPhone irks you. But the article showcases a problem that’s especially prevalent in tech: forgetting that the rest of the world won’t necessarily have the same experience as you.

I’m sure I’ve been guilty of doing the same at times. I’ve certainly written enough tech articles over the past 15 years to practically guarantee that at some point I’ve written more from my own standpoint than empathising with a wider audience. But when these stories arrive, they showcase a need for writers (and their writing) to be better, and to recognise that their experience isn’t ubiquitous.

March 8, 2016. Read more in: Opinions, Technology


Nook cooked as DRM continues to punch paying customers in the face

Late last week, I received this cheery email:

[Image: Nook email]

Nook is dead in the UK, and customers who bought books thinking they might actually own them are now being told they might still be able to access some of them once the Nook store implodes, due to a partnership with “award-winning Sainsbury’s Entertainment on Demand”.

First, which awards? The Sainsbury’s On-Demand Digital Entertainment Brands Run by Sainsbury’s Awards? I was only vaguely aware Sainsbury’s did this sort of thing at all, let alone had won awards for it.

Secondly, this again goes to show that when you’re buying an awful lot of digital content, you should consider it nothing more than a temporary rental, even if (and this is the bad bit) you’re not flinging money at streaming. That is, frankly, not good enough.

Thirdly, this again showcases how DRM merrily punches in the face consumers who try to do the right thing. If you spent money on Nook books, chances are you’ll lose at least some of them now. Had you torrented those books, you’d still have copies. And in the UK, you can’t just legally strip the DRM and make your own copies; it’s illegal to breach DRM, with only (minor) exceptions being made for people with disabilities who have no other way to access the content in question.

You should not get a worse user experience for paying for something, but that’s increasingly the case. Music, at least, has been freed up somewhat, with purchases now typically being DRM-free across the industry. Some comics companies (such as 2000 AD) make a point of being DRM-free across platforms. But this is still rare. More often than not, any digital movie, TV show, comic or book you buy is wrapped in DRM, blocking portability and permanence.

Purchasing digital shouldn’t be a glorified extended rental. It’s no wonder many people now opt out of paying for media at all.

March 7, 2016. Read more in: Opinions, Technology


Dear music and telly industries: stop punishing those who buy your stuff

The BBC reported on Friday that it’s once again illegal in the UK to rip CDs to your computer. This might come as a surprise to you. First, you might not have been aware this was illegal in the first place. Secondly, you might be nonplussed that the pathetic changes to the UK’s fair-use laws have in part already been dialled back, but there you go.

About a year ago, I wrote for Stuff.tv about government changes to personal copying exceptions, and how they didn’t go far enough. My argument was (and is) that while companies should be allowed to weld DRM to released media, individuals should be able to circumvent it for personal use, as long as there’s an expectation of ownership with the purchased media. (In other words, you shouldn’t be able to ‘back-up’ music from Spotify or video from Netflix, but you should be able to make personal copies of CDs, digital books and comics, DVDs and games.)

The key sticking point is plainly noted in the BBC piece:

A judge ruled that the government was wrong legally when it decided not to introduce a compensation scheme for songwriters, musicians and other rights holders who face losses as a result of their copyright being infringed.

UK Music estimated the new regulations, without a compensation scheme, would result in loss of revenues for rights owners in the creative sector of £58m a year.

In other words, because you’re not rebuying again and again, rights owners potentially lose money, and so they want something for nothing. They should somehow be ‘compensated’ for you making personal copies of items, for your own use. I imagine they’re pretty angry about the portable nature of digital files, too, since they can be used across devices and platforms, without you having to rebuy for each new machine. Naturally, everyone ignores the fact that people have finite money, and people still very much into music are still buying it, often on physical formats; they’re now just once again being punished for having the audacity to want to back up this content.

At the time of the Stuff piece, given the craven and half-arsed nature of the changes in law, it never occurred to me that we’d go backwards and end up again at the status quo. The BBC adds in its story that it’s “unclear how the change will be enforced”, but then it’s almost never been enforced. What is clear is that once again we have industry representatives effectively punishing those who pay for things. All this does is piss people off. By making it illegal to rip your own CDs to your own computer and legally listen to the music you paid for, these organisations are hastening the decline of income from said purchases, not protecting their artists.


July 20, 2015. Read more in: Music, Technology


Apple and balance/motion accessibility — yelling into the wind

As a writer, even in an age of social media, it’s hard to tell whether anything you pen affects people in any serious way. In truth, much of what I write is opinion-based: thought pieces and reviews that might briefly help and/or entertain a certain section of a site’s or magazine’s readership, but that relationship between words and results is typically fleeting.

One major exception in my writing career centres around accessibility. When Apple’s iOS 7 for iPad and iPhone arrived, it made a lot of people sick. Aggressive animations became motion-sickness triggers for a surprisingly large range of people. I was fortunate enough to write about the subject for Stuff and twice for The Guardian. Apple rumbled into gear. Changes were eventually made to iOS via the introduction of Reduce Motion, which swapped slides and zooms for cross-fades. I have it on good authority that what I and others wrote did have an impact on Apple’s decision-making.

Although motion/balance accessibility remains poorly understood, and third-party developers remain largely ignorant of these issues, merrily peppering apps with animated interface components, I and others are now broadly safe when using iOS. The same is not true for OS X. It’s been three years since I first wrote about the subject on this blog, and I’ve penned articles elsewhere, including for major tech publications. It’s hard to believe that Apple’s listening. The company, despite making great strides in vision/hearing/motor accessibility, appears either ignorant of or uncaring about motion/balance problems.

That might seem like an extreme statement, but I think it’s entirely fair. Major triggers, such as full-screen slides/morphing transitions, and also slide transitions within Preview and Safari, arrived in OS X Lion, and we’ve since seen three major updates to OS X without a single setting for overriding these animations. There’s no Reduce Motion in OS X, despite Mac screens being larger than iOS ones, which means the transitions displayed are more — not less — likely to cause problems.

Today, I fired up the new OS X Photos app. Within five minutes, I felt ill. I shouldn’t have been surprised that a motion/balance trigger is built right into the interface, with the main pane zooming while it crossfades. Presumably, someone at Apple thought this looked pretty. There’s no way to turn it off. For anyone who finds this animation problematic, their choices are to avoid Photos entirely or remember to close their eyes every single time they click a tab.

This is just not good enough. Apple is a company that prides itself on making its technology accessible. Given that a somewhat throwaway setting in a third-party utility can override or entirely disable the majority of full-screen animations, it’s hard to believe Apple couldn’t fit a Reduce Motion system into OS X if it wanted to. If developers could hook into that, most motion/balance issues would disappear in an instant, without affecting the majority of users, who could happily continue watching interface components zoom about before their eyes.

As I wrote today in an email to accessibility@apple.com, I’m sick of the current situation, figuratively and — in a fortunately fairly mild way — literally. Highly animated interfaces may be the ‘in thing’ right now, and sometimes have potential benefits in providing a sense of place; but that doesn’t mean Apple should overlook people for whom these often-aesthetic additions cause major usability, accessibility and health problems. I’ve no confidence anything will change. Every email sent feels like yelling into the wind, but I’ll be delighted to see and experience a change in direction should that happen in OS X Yosemite’s successor.

April 9, 2015. Read more in: Apple, Opinions, Technology


Apple Watch is the worst thing ever, and here’s why

Yeah, sorry about that link-bait title, but I figured I’d best get in on the current wave of tech stupid before my tech journo credentials are snatched away from me. Mind you, perhaps escaping would be a smart move while the majority of the industry loses its collective mind.

I mentioned tech writers tending towards bile last week, but the latest stick to smack Apple with appears to be the accusation that the company has lost focus and no longer understands the value of simplicity.

Jason Hiner’s piece for ZDNet is fairly typical of this latest raft of Apple Watch moanery, calling it “too ambitious” and “a bit of a mess”. He argues:

the fact that Apple released the product in its current form says something. In fact, it says a lot about Apple under the new leadership regime because it’s the first new product category of the Cook-Ive era. And as far as innovation and discipline goes, this is a wobbly start.

His core complaint seemingly revolves around a belief that all Apple products start out simple and then layer on greater functionality as they evolve. He’s right that Apple builds on products over time (notably software, which gains richer features), but what is simple?

For Apple Watch, Hiner complains that the device tries to do too much and that there are a load of new functions for a user to figure out, which are

unlike any other Apple or tech product so they aren’t naturally intuitive.

But what is intuitive? What is fully natural? My dad recently admitted to me he’d never used copy and paste, and he’s been using Macs for well over a decade. He’d just been dragging selections around, muddling through. With Watch, you imagine quite a few people will do something similar, perhaps chancing across functionality. Others will dig deeper. But the point is that many pieces of functionality that tech pundits consider simple and natural are only so to them because they use these things every day.

Consider the mouse and the original Mac. Back then, the windows/icons/mouse/pointer interface wasn’t unique, but it certainly wasn’t commonplace. Then there’s the iPhone, with its gestural interface that had a fair number of elements that felt natural, but also elements users had to learn, in order to access all of the device’s functionality.

Of course, people slammed those things too, saying they’d fail, because that’s what you do with Apple. And perhaps Apple Watch will be a faceplant, but I think the tech industry would be a better place if writers actually started to spend a bit of time with kit before deciding that it was a waste of time, a mess, or too ambitious. (And you can bet that had Apple released a much more locked-down Watch, with a razor-sharp focus and far fewer functions, ZDNet would have been whining about Apple’s closed nature, and how the device was a rip-off for the few things it enabled you to do.)

March 16, 2015. Read more in: Apple, Opinions, Technology

