I've been waiting for us to break out of a lot of the computing metaphors we've held dear.
For decades we've lived with computer screens that tried to add depth by simulating... well... something. Some magical desktop where folders took up the same apparent physical space as a trash can. (One of those must be really tiny!) And then there were windows on top of all of this. Layers upon layers upon layers of boxes. Everything has always seemed off, just a bit. Really, there was no real-world equivalent.
This ran counter to early home computer hardware which, initially, resembled a bloated typewriter due to the dominance of its keyboard.
But we saw early attempts to bridge the gap through software. Magic Desk was one of the early tools that employed a desktop metaphor as literally as possible. Note the typewriter on the desk which, when clicked (with a joystick), would bring up a screen that looked like a typewriter. (It even dinged at the end of a line!) But also note the crazy perspective, lack of depth, and so forth. It doesn't look real but it looks real enough.
This was an effort, as much as was possible at the time, to make people comfortable with computing. What's kind of crazy is that it's taken us decades to acknowledge that people are now comfortable with digital devices, conceptually. Still, we saw the metaphor through as far as it would go. The technology eventually got to a point where we could render a whiz-bang 3-D model of a desktop if we wanted to. And that approach, surprisingly, bled through to brand new things like tablets and phones. Again, crazy to think that a tablet had green felt in software.
With iOS 7, Apple has said: "you get this". You're looking at a piece of glass, tapping on a piece of glass, so the damn thing might as well work like a piece of glass.
In lieu of bevels and drop shadows, frosted panes. Photos and videos. Layers, but seemingly borne from a place of reason rather than gee-whiz-ness. Depth through use of the Z-axis. And simple text with minimal ornamentation. Because, really, when was the last time you saw a piece of glass in the real world with buttons sticking out in front of it - tall enough to cause bevels and shadows to appear?
Yeah, me either.
So, it makes sense conceptually. It is also risky to do this because people rely, in part, on the way things appear to clue them in on what those things can do - these are affordances.
I'm not sure what to make of buttons, which are still called buttons although they no longer resemble any button in the physical world I can think of. Buttons are colored text. That's it. No bounding box, no underline, no dancing ants! It really raises the question: how will people know what they can and cannot tap?
Two more things to consider: there's a digital analogy here to the web. Initially, links were blue and underlined. Over time, underlines started to go away. Today links may be underlined, a different color, or both. The web has seemed to do fine. Not quite mystery meat navigation.
The other thing is that people using phones and tablets may not be using a computer, and may not have any of that computer knowledge (baggage) with them. Thus, those metaphors could make less sense contextually. Might as well redefine them now, lest we pretend that these things are just mini-desktop computers.
Something I admire about iOS 7: the muscle memory I have is, in a lot of cases, still intact. Controls haven't moved much yet, and that means the whole thing still feels comfortable. It's a bridge. But this too will change over time, likely with iOS 8 and beyond. Apple laid the groundwork for new metaphors with this version, and now they'll exploit them.
I trust that in another 30 years, as new interfaces rise to prominence, we'll look back at the early days of touch computing with wonder. What happens when we're pointing and swiping at things that do not exist? Yeah, that's going to be fun stuff to figure out!
Thanks to Alberta Soranzo for inspiring this piece.