One of the often-used memes concerning Apple’s approach to iOS is that it’s for “passive consumers”, people who aren’t creative. In an interesting post on Google App Inventor, O’Reilly’s Mike Loukides dredges this one up again – and I think Mike is committing a classic geek error.
Mike contrasts the approach of App Inventor, which is designed to encourage simple programs for Android, to the higher barrier of entry for development on iPhone, and concludes that it’s a cultural difference:
“But Google has taken another direction altogether: the user’s experience isn’t going to be perfect, but the user’s experience will be the experience he or she wants. If you want to do something, you can build it yourself; you can put it on your own phone without going through a long approval process; you don’t have to learn an arcane programming language. This is computing for the masses. It’s computing that enables people to be creative, not just passive consumers.” [My emphasis]
Here’s Mike’s first error: Conflating “creativity” with programming, and “passivity” with, well, everything else. Mike isn’t the first to do this – I think my friend Cory Doctorow is responsible for the meme, as I pointed out in an earlier post. I’d argue, in fact, that the history of computing teaches us the exact opposite: the less people are required to learn programming in order to be creative with computers, the more creative work you get.
The history of computing over the past 30 years is a move away from requiring people to engage with computers “on their own terms” via programming, and towards enabling users to do creative things through applications. The flowering of creativity this has enabled has been the main creative triumph of the computer. Think of how much less creativity there would be without Photoshop, QuarkXPress, iMovie, or Final Cut, to pick just a few.
These tools have democratised creativity for millions of people. To claim that a platform is "only for consumers" simply because it lacks simple programming tools is as absurd as claiming that a platform without an easy-to-use DTP package is "only for consumers". It's the arrogance of assuming that your chosen mode of creativity is the only mode of creativity.
Look at the iPhone, which is the most pertinent example. Despite having crappy camera hardware, the iPhone made it very simple to take and upload an image, something that wasn’t true for most previous camera phones. What’s more, and this is important, it was a playful experience, one which was enjoyable.
Are all those images "not creative" because the iPhone doesn't have easier-to-use coding tools?
Get thee to the engine room, slave!
Mike also brings in an analogy which I think fails:
“It’s sort of like travel: you can go to Club Med or take a cruise ship if you want a crafted experience. But you won’t find out anything about the local culture, you’ll only eat the local food in controlled settings, you’ll never hear the native language spoken. You’ll just do the limited set of things the organizers want you to do.”
I’d argue that the approach he’s advocating – encouraging users to get deeper into the hardware and software, to (as he puts it) “find out about the local culture” – is actually more like requiring the passengers to do a stint maintaining the ship’s engines, whether they want to or not. The price they “have” to pay for getting on the ship in the first place is to become engineers.
But in the end, the reason why Mike’s argument falls down comes down to choice. The problem is that App Inventor isn’t “programming for non-programmers” – it’s “programming for people who want/need to learn programming”.
And most people simply don’t want to learn (or to have to learn) programming. If there’s anything that the history of the personal computer should make clear, it’s this. HyperCard died, not because it wasn’t brilliant (it was, and WAY easier than App Inventor) but because 99% of people who got it free with their Mac never found a use for it. It was capable of wondrous things, but most users didn’t even notice when it died.
Most people don’t program not because of lack of simple-enough tools, but because of lack of desire. They want to get on with creative things like taking and editing pictures, writing novels and blog posts, editing home movies, and compiling their family history. And they want to do all that without having to learn to program to do it.
The geek era is over
The geeks – the people who have, so far, been the dominant part of culture in technology and the Internet – are like priests of a religion who find themselves no longer at the centre of their culture’s world. They are displaying all the standard behaviours of a dying religion: flocking to new prophets, who aggressively promote their message; lashing out bitterly at the heretics who are “betraying” them; and even trying desperately to preserve their way of life by saying “look how easy it is to become a priest!”
What they don’t understand is that their place in the universe has changed. They’re still an important part of the culture, but they no longer run the world. They’re just a part of it, and their creativity is no more – or less – important than anyone else’s.