smarterbits


It’s Not a Web App. It’s an App You Install From the Web.

Odd blog post by the team behind Forecast, the excellent new weather app that’s attracting attention both for its quality and the medium used to create it.

So why does it feel as if the average native app is so much better than the average web app?

The line of questioning is what nags me. The interesting part of the web vs. native debate isn’t which technology feels better (a question they convincingly demonstrate is immaterial anyway) but why app developers overwhelmingly choose to work on native platforms. What I would’ve liked to see is an argument for why developers should prefer web solutions over native ones. Some reasons are obvious: cross-platform compatibility, over-the-air updates, and a dynamic and adaptable programming foundation in HTML. What’s tricky is convincing developers those things are worth giving up the advantages native apps provide, especially when it involves justifying a web app’s absence from the one place the majority of people shop for and discover new software. Technologies aside, native apps get a head start from the visibility, added security, ease of use, and built-in marketing app stores provide. The reality web advocates have to overcome is one where we’ve built an economy and marketplace around native solutions. The funny historical twist is that if web apps had been as capable 6 years ago as they are today1, I can’t begin to imagine what kind of conversation we’d be having instead.


  1. In comparison, the iPhone really was a tough space for web developers in 2007. App-wise, where did one start? There was little precedent to take inspiration from. Nor could they benefit from all the legwork Apple would save native app developers with the iOS SDK in 2008. Other factors were altogether out of their control. Until the 3GS, iPhones didn’t have the computing or networking power to render web apps with performance comparable to native apps. 

The Verge's HTC First Review

Two points I’m glad Dieter Bohn emphasizes in his review: Besides the Facebook integrations, the First runs stock Android (4.1, unfortunately). It also bucks the bigger-is-better mantra we’re used to seeing when it comes to screen size.

Looks to be the most user-focused Android design to date, even compared to Google’s own Nexus line of products.

Facebook Home and the Islands of Internet

A friend of mine believes that all big tech companies treat the web as a service worth competing over. Running with his perspective, I feel less crazy suggesting that Facebook’s ultimate goal is to become its own version of the internet. (He probably put that thought in my head too.) This idea lands somewhere between theory and practice. You can — right now — open a browser tab, log into Facebook, and find in that one place a majority of the information you could otherwise go through a variety of sites to find: restaurant menus, concert listings, what your friends are up to, photos, news, blog posts, games, ad infinitum. Still, the other sites endure. Some because they provide better info. Some are more popular. Many survive because Facebook hasn’t found a way to convince us all to log into it every time we launch our browsers. Which is why whatever Facebook Home turns out to be, it’s at least a somewhat pretty sizeable deal.

Despite the whole thing not even being official, the Facebook-centric launcher/homescreen/app-OS (Is it all these at once?) is already a strange entity. I’m shocked that Google is endorsing this at all, even if only by doing nothing. Letting Samsung or Amazon use Android as a base layer for their own operating systems is one thing; the information plumbing still runs through Google. Facebook Home is trying to circumvent Google altogether. There may be a Google search bar at the top of the screen, but I have a hard time believing that’s all it takes to convince your biggest competitor that your intentions are noble and just and “look pal, I think this is the beginning of a fortuitous partnership”. The Google-branded search bar — heck, the fact that Android is even prominently mentioned on the event invitation — must be a good bit of Jedi mind tricks. At first I was going to criticize Facebook Home for being too tentative. Why only a launcher/homescreen/app-OS instead of an entire OS or customized Android UI? Why not try and block Google’s access in every way possible? But it’s obvious, thinking it over twice, why Facebook Home makes more sense as is, rather than as an OS blitzkrieg that would almost certainly fail. A proprietary OS or Android fork would involve selling millions upon millions of phones, something I don’t think OEMs (Facebook is probably the one doing HTC a favour by giving them this much primetime US media attention), carriers, or customers are eager for, nor something I think Facebook has the skills to execute at the scale needed to pull this off. And you can be sure it would be drawing a line in the sand for the whole Android-as-commodity-OS-for-other-web-companies arrangement. Hence why a launcher/homescreen/app is so brilliant. Facebook gets to simulate an OS without the overhead of engineering its own, one it can potentially propagate like a virus across millions of its competitors’ existing and forthcoming devices. And they get away with it unchallenged because of a search bar? No, really, it’s astonishing that Larry Page doesn’t realize he’s handing over his lunch to Mark Zuckerberg.

The splitting of the mobile ad pie is going to matter, but on a bigger scale than most of us realize. Supposing Facebook’s new project is in any way successful, it’s going to solve the company’s recurrent login issue and let it be the internet’s portal on the fastest growing, most used class of computers. It means getting our first glimpse of the internet’s segregation.

Thanks to mobile computing, we access the internet through an array of apps, widgets, and OS-level services instead of a single browser window. Controlling those portals is big business. Once there’s no longer a single way to reach the World Wide Web (the browser plus a search engine being the original), the ad value of our attention skyrockets, meaning the richness and exclusivity of a platform’s information becomes an asset to be stockpiled rather than shared. Facebook wants to be the only internet its users need, and to do so it needs a combination of portals (Facebook Home/Facebook Messenger) and content (anything that’ll fit on a profile page).1 Adding reasons for us to stay, e.g., Instagram, Zynga, Bing search and Maps, becomes crucial to the business. When Facebook begins taking sizeable chunks of Google’s revenue away, it’ll force Google to limit access to its own content in an effort to lure and hold onto people using its services. If you were looking for perspective on why Apple would create its own mapping service, simply play this scenario out a little further.

The somewhat pretty sizeable deal in all this: Facebook Home stands to mark the beginning of the absolute ecosystems: silos not just of hardware and software, but also of knowledge. The islands of internet. And proof of my friend’s supernatural prescience.


  1. Google has been doing the same thing for a long time now, only in a less overt manner. Its strategy was to provide the information backend for everyone else’s software and hardware and monetize the data flowing back and forth. Facebook is the first service with enough scale and reach to actually challenge them, not because it has more services or better information, but because its own data is more valuable to advertisers.

     

Influences

The textile industry is squandering an opportunity. Despite accounting for 8% of manufactured goods sales around the world, it has managed to stay on the sidelines of our mind share ever since ire over sweatshops boiled over in the 1990s. Nowadays it’s software designers undertaking the bulk of the PR work for textiles, as skeuomorphism finally impresses upon an otherwise fabric-oblivious generation the nuances of linen, felt, faux leather, and whatever other basic textiles make up your shirt’s blend. Blame my cynicism, but I’m shocked Cotton or DuPont hasn’t seized the moment and begun demanding their logos mar every wallpaper or user interface element on which digitized versions of their products appear. Unfortunately for them, it looks as though the public’s honeymoon with skeuomorphism is already coming to a brandless end.

“The Trend Against Skeuomorphic Textures and Effects in User Interface Design”, the latest in a long list of attempts at explaining this particular eventide, stands out thanks to John Gruber’s uncanny ability to summon a history of events wholly disconnected from reality. His essay, like most magic, begins with a benign observation: there’s a trend forming among top tier1 iOS developers steering away from the skeuomorphic design language of the platform. Trying to figure out why, Gruber cites Letterpress, Instapaper, and Twitterrific 5 as case studies (other good examples: Realmac Software’s Clear, Flipboard, and Simplebots’ Rise), endorsing Dave Wiskus’s false rationale that the examples supra cement iOS’s legacy as the birthplace of leading-edge, non-skeuomorphic design. Things immediately start to fall apart.

*Proper usage of the word skeuomorphism is contentious enough to warrant its own article, so I’ll address it here to avoid issues later on. Most of the ire is concentrated around its misapplication to designs which aren’t by definition skeuomorphic at all. I prefer deferring to the experts: Christopher Downer provides a good introductory overview that delineates the apples and oranges. In contrast, Chris Baraniuk’s position is polemic, calling into question the entire use of the word in relation to UI design and—not content to stop there—wondering whether or not the Wikipedia definition is more or less entirely rubbish. Louie Mantia also provides some needed Mythbusting on the issue. While I tend to agree with each of their arguments, I still can’t get on board with their prescriptivist position. Doing so would mean ignoring how the word has transcended the boundaries of its old meaning and become a catchall term for a larger body of people using and adapting a definition that’s more popular in everyday use. Much the same way minimalism is flung around with little regard for definitions, we can use skeuomorphism as a genre word that, though perhaps frequently misapplied, is apt enough in practice for everyone to distinguish between a skeuomorphic-ish design and one that isn’t. And it’ll be used as such here.

From the start, both men’s design myopia refuses to acknowledge that non-skeuomorphic design existed elsewhere prior to 2012, whether as the preeminent aesthetic of Windows Phone 7, Microsoft’s mobile operating system2, or through the clean lines and sci-fi sterility of Android’s not-completely-flat-yet-not-stuffed-with-chrome UI. The sidestepping of any outside influence is meant as misdirection, a reshaping of events that encourages the idea that iOS designers live in a vacuum controlled by the whims of Apple. My guess is that Gruber thinks he can get away with this fallacy because Windows Phone sales have been tepid at best and because the stock Android UI is almost always redecorated by whoever’s supplying the hardware. Except popularity isn’t a necessary condition of influence. Any competent accounting of flat UI design shouldn’t, and wouldn’t, ignore the contributions of Microsoft, Google, or even Palm, no matter how disappointing their sales records.3 Having declared iOS the epicenter of this new trend, an iota of sleight is all that’s needed for Gruber to switch Apple’s position from beneficiary to benefactor.

Gruber’s chosen Apple’s Retina display to be the hero of his story, declaring it a singular breakthrough absolving designers from employing “the textures, the shadows, the subtle (and sometimes unsubtle) 3D effects” of skeuomorphs that were “accommodat[ing] for [the] crude pixels” of non-Retina-quality displays. His thought process involves comparing the influence of high-resolution displays on UI design to the influence—in this case real and documented—they’ve had on digital type design. Quick recap: Retina-caliber displays are behind the viability of print-hinted fonts rendered digitally, which had hitherto looked insulting at the sub-par resolution of non-Retina displays. They’ve also had the reverse effect on screen-optimized fonts, suddenly making them appear vulgar and ridding them of their purpose. Gruber equates the trimmings of skeuomorphic design to stopgap fonts like Georgia and Verdana4: poor solutions used for lack of better options, given that the “hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”. Therefore, we ought to be thanking Apple for granting designers the opportunity to produce “graphic design that truly looks good” on our devices.

There’s no evidence I can find—and suspect I will ever find—to defend the claim that skeuomorphic textures and effects are stopgaps for the deficiencies of lower-quality displays. Gruber leans so heavily on his comparison to screen fonts that he starts to redefine the term, implicitly suggesting that skeuomorphism is equivalent to poor design taste. If you’ve made it this far then you know how spurious the whole idea is. Even Dave Wiskus’s 100-level explanation is enough for anyone to articulate the relationship between a skeuomorph’s purpose and a heavily textured material surface. Neither is there any reason to believe that skeuomorphic design is now defunct thanks to Retina displays, given that (a) we know a skeuomorph’s primary function isn’t to cover for crude pixels; (b) contrary to Gruber’s subjective analysis that all drop shadows and glassy surfaces look worse on them, Retina-caliber displays allow for even more detailed and striking effects, making already beautiful apps using skeuomorphic elements all the more stunning; and (c) even if we cede the last two points, questions abound as to why, since the release of the Retina-bearing iPhone 4 in June 2010, Apple has all but ignored the apparent Retina-resolution design era and pushed towards heavier and heavier use of so-called parlor tricks on both iOS and Mac OS, or why so few third-party developers have moved away from the skeuomorphic model. His entire essay is being driven in a car without a rear-view mirror, aces rushing out of its driver’s sleeves.

Most of the sensible explanations put forth in “The Trend Against Skeuomorphic Textures and Effects in User Interface Design”—that skeuomorphic elements are overused, that Retina-caliber displays can influence UI design—are perverted by the misconception that print design and UI design are one and the same.5 They’re not. Where print design is concerned with aesthetic cues and organization of information conveyed subconsciously to the reader (e.g., the way the eye moves between two paragraphs and understands new ideas are being introduced, or how text size imparts hierarchy), UI design’s cues are dynamic and explicit. They must convey function, respond to input, morph, adapt, and tangibly interact with the user. The set of skills required for one doesn’t come close to the set needed for the other. When Gruber tells us that “[the] hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”, he’s right for all the wrong reasons. The differences don’t even matter. What he’s trying to demonstrate is how UI design is undergoing the same crippling transitional phase print design—specifically as it concerned fonts—had to endure with the introduction of digital displays. His account of digital type’s hobbled history, right down to its rescue by high-resolution displays, is spot on. Yet the paths between the two arts don’t run parallel; software’s only ever been digital. Where’s the analog6 (or digital) counterpart we compare it to and say “We could do so much more if only we weren’t stuck designing this software on a screen”? As displays march on towards human-grade optics, of course designers’ options have improved, but there isn’t some past UI standard they’re trying to return to. Progress here is strictly forward. Nothing forced skeuomorphism on us.

The upshot to this mess is that Gruber’s initial question is actually worth considering. It never once occurs to him, however, that the answer needn’t be as convoluted as he makes it.

In his own words: “There is a shift going on, fashion-wise”.

Designers. Users. No one is immune to the fatigue brought on by overexposure. The numbers themselves are staggering: 700,000 apps downloaded 35,000,000,000 times. Even accounting for the large number of games making up that total, the prominence of skeuomorphic design is inescapable. We’ve refined, developed, added to, twisted, and debased the style down to a chintzy polish.7 Why doesn’t Gruber wonder whether we’ve simply tired of seeing yet another faux-textile background mimic a pair of pants no one would dare buy in the real world?

The analogies to fashion are easy to latch onto because they help make the distinction between aesthetics and function, something Gruber understands and has leaned on previously when describing user interfaces as “clothing for the mind”8. The premise is simple: no matter the amount of “stylistic tweaking”, UIs—or clothes—remain true to their form. So long as it remains able to divide the bill at the end of lunch (form), your calculator app can resemble whatever model Braun calculator it wants (stylistic tweak). The couture comparisons might be heavy handed, but they’re a good starting point from which to find better reasons why we’re moving towards flat user interfaces. For example, it could be that designers are realizing there’s a whole new generation of people for whom the cues of skeuomorphic design aren’t referential, but merely aesthetic.9 What’s the point of mimicking a Braun TG 60 reel-to-reel deck for millions of kids and young adults who will never lay eyes on—never mind use—an actual physical tape recorder in their lives?10 Why stick by a design that’s losing its raison d’être? (Ed. note: an update to the Podcasts app on March 21, 2013 got rid of the tape-deck simulacrum.) We might also consider whether skeuomorphic design is even fit for the UIs of modern computing anymore. As we increasingly interface by way of gestures, voice commands, and inputs disconnected from physical analogs, are digital knobs and textures the most efficient or practical solution? Asking these sorts of questions—not wondering what’s changed since Apple released a new iPhone—is how we begin noticing the influence of an entire mobile industry on itself: we can trace the career of Matias Duarte from Palm to Google and see WebOS’s legacy of physicality continuing on Android. It’s why designers at Microsoft can find solace in the fact that others are apparently taking inspiration from Windows Phone 8’s text-centric, chrome-less aesthetic and adapting it to their software. Point being, it’s pure fantasy to imagine third-party iOS developers leading the charge against embossed text on the basis of a single and insularly engineered cataclysm.11

Skeuomorphism isn’t bad design. Nor is it a fad. A pragmatist might complain it’s no longer ideal in 2013. A pessimist would say we’ve made it kitsch. I suspect John Gruber knows and believes these things. Otherwise his essay is a change of opinion that throws away years of Daring Fireball posts. Then why go to such lengths to find a solution so stretched and un-obvious? My suspicion is that any scenario wherein we acknowledge that fashion-wise something has fallen out of favour inevitably leads to questions about exactly what’s causing the falling out. Fingers want to be pointed and the inconvenient truth here is that skeuomorphism has no bigger an evangelist than Apple.

What goes unmentioned in Gruber’s essay is that most of the gaudy elements he’s reproaching were introduced, if not heavily endorsed and popularized, by Apple.12 iOS’s contribution was to dial the exposure knob to 11 by attracting thousands of eager developers to its ready-made developer tools favouring conformity and uniformity across the entire platform. The formula’s proved so successful that the entire UI language of specific classes of apps has been codified, standardized, and left customizable only at the level of “Which texture or drop shadow angle should we use here?”. Hence the excess.

There’s little satisfaction in getting this far only to have me pin this on one writer blindly toeing his party line. While there’s no doubt Gruber’s overthought the situation so Apple can walk away unscathed, what I want to try and coax into sight are the actual consequences at play in this debate. Blaming Apple for abrading our tolerance of skeuomorphism isn’t as worrisome as the idea that it might have no intention of stopping. Hardware aside, there’s enough evidence to suggest that Apple’s institutionalized its taste for the playful, safe, non-threatening, and innocent genre of software espoused by iOS. You’ll notice small doses of it in places like the App Store, where categories and catalogs are given their own tacky icons filled with garish fonts and unimaginative emblems: a golden plaque background for its hall of fame category, an assortment of balls to decorate its sports section. Where it’s most apparent is in their now celebrity-laden, heartstring-tugging commercials, the charms of which have less to do with Apple’s clever wit and genuine passion than with applying its fastidious work ethic to clichés we’ve seen elsewhere in advertising. There’s a shift occurring at Apple about who it considers its core audience to be, a shift that consequently reverberates across its product design, i.e., why it continues to be attached to skeuomorphism.

* Marketing is often the simplest way to see who a company cares about, how it perceives its audience, and how it cares to be portrayed. The best way to illustrate this particular shift—without rewinding too far—is by drawing a line somewhere around the launch of the iPhone 4 and comparing Apple’s advertising efforts before and after. The biggest visible change is the introduction of the decidedly cinematic and ostentatious suburban lifestyle vignettes exemplified by the Sam Mendes-directed FaceTime videos, as well as almost the entire run of Siri spots and the short-lived Apple Genius series. They’re evidence of a company shedding its aura of pretentious coolness in favour of innocuous inclusiveness. Even going as far back as the Jeff Goldblum-narrated iMac G3 commercials, Apple’s marketing pre-iPhone 4 was often about differentiating its values: Apple’s, and everyone else’s. The Manchurian-like effect on consumers meant—besides exemplifying TBWA\Chiat\Day’s own genius—that owning something California-designed was a token of membership. If nothing prevented anyone from enjoying those iPod Silhouette dance videos, nor the charms of the Get a Mac series, those ads nonetheless introduced dividing lines. If you didn’t own an iPod, didn’t recognize the catchy music (remember when Apple abandoned the opaque dancers and up-and-coming hipster bands in favour of unmistakable U2 and Coldplay mini-music videos?), owned a PC because you honestly couldn’t tell the difference, or weren’t savvy enough to make out all the references in the classic “Hello” iPhone Oscars spot, you couldn’t help but notice how different you were from those people who did own Apple products, a realization laced with all the consumerist impulses we like to pretend we’re immune to. Today, with so many iPhones and iPads in the hands of people who decidedly don’t care to fit that particular brand image, the old approach becomes alienating. Thus the current marketing—because Apple’s demographics run such a broad spectrum—goes out of its way to avoid any delineation, aiming to associate the brand with a wholesome, family-values, American-Dream lifestyle that almost anyone can relate or aspire to in some way.

Apple’s cutting-edge innovations are both blessing and curse. As responsible as they are for the massive success and ubiquity of Apple within the pockets of a large portion of the developed world, they’re also responsible for populating its base with customers for whom cutting-edge technologies have little appeal, traction, or even desirability. Today’s average Apple enthusiast is less likely to care about trends in UI design than about whether their current iPhone’s case will fit the next one. The kicker is that it’s proof of Apple’s shrewd business acumen: the skeuomorphic designs introduced in iOS back in 2007 were central to overthrowing the crude and unapproachable UIs powering devices preceding the iPhone and transforming the smartphone into something desirable to people outside office buildings. In hindsight it’s easy to explain why Apple had a hit on its hands. Today however, the huge heterogeneous market Apple managed to attract to iOS is also a huge, heterogeneous, change-averse market which expects its median to be catered to. Dealing with expectations of this magnitude is a new world for the company, one it may not be comfortable operating in.13 Even assuming it remains a best-of-breed consumer electronics company well into the future, the attrition caused by the demands of a ubiquitous user base means it’ll be increasingly harder for Apple to remain at the leading edge of the industry, at least UI-wise, without running the risk of estranging that base. While it won’t prevent them from innovating on hardware and technologies, it could force them into tempering their software breakthroughs in ways they otherwise wouldn’t have if the target market still resembled what it was in 2007. Multi-touch gestures are a good example. Despite possessing the most advanced touch display technology in the industry, gestures remain woefully underplayed in the core iOS interface. Four- and five-fingered iOS navigation only became available to the public with iOS 5, its use limited to the iPad and turned off by default. There’s also no reason why some of those same gestures couldn’t work on smaller, iPhone-sized devices with one- or two-fingered substitutes. Yet their absence is conspicuous. Six years in, the gist of working one’s way through iOS remains tapping buttons over and over again. Even prominent third-party innovations like “pull to refresh”, which thanks to their popularity in third-party apps could routinely be mistaken for core elements of iOS’s interface, have only been timidly adopted by Apple, if at all. This underlines why the charge away from skeuomorphism is being led by third-party developers, and not Apple as Gruber suggests. Third-party developers aren’t beholden to the median of iOS users. They can find success in narrow audiences. They can take more risks UI-wise, acting as outliers with aspirations of becoming the trendsetters for next year’s UI fashion trends. It’s a can’t-lose scenario for Apple: at a minimum there are enough apps to please anyone’s tastes, and if any of these flat UI projects happen to take off at scale, e.g., Google Maps, certain elements of the native Facebook app, or pull to refresh, Apple benefits by osmosis.

There’s a hitch of course. Nothing explained, debated, or corrected supra applies to any industrial-design-related activity Apple’s been involved with over the last 13 years. No one would contest that every desktop, notebook, or mobile device bearing its logo has at one time represented the absolute bleeding edge of its field, achievements superseded only by their successors. There’s no denying how relentless Jony Ive14 and his team have been at pushing the boundaries of what a computer device ought to be, how it ought to look, and what it ought to be made of. Theirs is a unique focus that, mixed with a healthy disregard for whatever customers might want or expect (floppy disks, DVD drives, removable batteries, whatever I/O ports the iPad doesn’t have, and bigger or smaller iPhones depending on the rumours circulating the day you’re reading this), is almost enough to vindicate Apple’s overabundant affection for superlatives when describing its products. But hardware designers enjoy some privileges the software guys don’t. The big one concerns how being at the leading edge of electronic industrial design—as it seems only Apple has realized—actually aligns with the goals of the art. However striking its design, hardware’s ultimate goal is to disappear into the user’s unconscious: lighter so as not to fatigue the hand, smaller so it can fit into any bag. Faster, longer lasting, higher resolution-ed. Whatever means necessary to prevent it from impeding the user’s experience.15 So long as the result doesn’t wildly diverge from the norm (say, twenty-seven-inch convertible desktop tablets or buttonless iPods), there’s otherwise little consumer attrition constraining the imaginations of industrial designers. Once in use, most of the physical aspects of our computers fade into the unconscious, outshone by the attention their software commands. The burden for the software guys lives in that differing proportion of attention. Our relationship with software is so immediate that any atomic change to our literacy of a given UI elicits a larger and longer sustained reaction than any material changes made to our favourite products.16 We’re prone to blame, justly or not, the successes and failures of our computers on software. The feel of brushed aluminum matters more on our screens than in our hands.

Whether tangible or pixelated, fashion remains capitalism’s favourite child. Being able to tap into—or manufacture—the desires of an enormous aggregation of people is SOP for any company hoping to reach the rarefied company of the Apples, Coca-Colas, and McDonald’ses of the world, even if the usefulness of their brand images doesn’t make significant contributions past enlarging the guts of the many and the wallets of the few. Yet for UI design, fashion is more than an agent for consumerism: it can solve crucial problems that define how meaningful technologies can be. It’s especially important in mobile computing, where rejection of a long history of desktop UI paradigms has renewed exploration of the ways in which we use computers and what we can accomplish with them.17 What worries me is the possibility that stagnation is penetrating a field that’s still trying to define itself. Even scarier is the possibility that this stagnation germinates from iOS, for the simple reason (personal allegiances aside) that Apple has up to now been the only major tech company with any proven track record of saving us from stagnant trends, e.g., command-line UIs, floppy drives, physical music, and desktop computing. The dilemma with skeuomorphism is that, as a major driving force behind iOS’s success, it’s a design strategy that’s hard to argue against, let alone abandon. Therefore, whatever new possibilities leading-edge UI design is pointing towards, Apple’s role risks becoming reactive instead of proactive. My question then is whether or not—no matter how best-of-breed their products remain—having Apple so consummately dominate the mobile computing space is what’s best for the industry. I know the question seems rhetorical given the idiom that competition breeds innovation, but try and name any leading-edge mobile platforms that have enjoyed success in any way similar to Apple’s: WebOS not just ruined but killed Palm. Windows Phone 8 is eroding what’s left of Nokia. Windows 8 in general has Microsoft and its OEM partners in a frenzy that proves not all ideas are created equal (again, like twenty-seven-inch convertible tablet desktops marketed to moms and kids). Android as a commodity OS for hardware manufacturers has been a bestseller, but it has left the platform disjointed and lacking cohesiveness from one device to another. Android the stock, presented-by-Google operating system is almost a misnomer given its relative obscurity to the public. The only thing standing between us and the troves of innovations the aforementioned have created is the painful truth that only Apple has a proven track record of being able to popularize them.

If John Gruber can be fooled into thinking Apple remains at the leading edge of UI design, it’s thanks to its third-party developers, who’ve inadvertently earned the majority stake in maintaining iOS’s innovative and dazzling pedigree, making them iOS’s greatest asset in the process. While Apple is happy to oblige with statistics about the ever enlarging successes of the App Store, little is mentioned about how the ever enlarging clout of the store is shifting the power dynamics of the developer/platform provider relationship. You might describe the equilibrium like this: Apple provides a product and platform customers want to buy into, e.g., the iPhone, thereby attracting developers with the promise of an untapped audience. In return developers provide the platform with (sometimes) exclusive software that distinguishes Apple’s platform from others, keeping current customers in the fold and also attracting outsiders who want a seat at the table, e.g., anyone who wanted to use Instagram prior to April 2012.18 This feedback loop is self renewing as long as each player maintains their stride: a new desirable iPhone every year, followed by new apps that take advantage of its new features. Things challenging this balance: On one front, the other platforms are rapidly catching up to, and in some cases surpassing, iOS both software- and hardware-wise, strengthening their own feedback loops. On another, there’s the aforementioned trend away from skeuomorphism that, at least UI-wise, is dulling the appeal of a sticking-to-its-guns iOS and denying developers19 the guidance needed to meet the needs of this new vogue. The latter puts in play a few consequences. If Apple isn’t at least mildly proactive about updating its UI and campaigning for it through its Human Interface Guidelines, then developers are left to act upon their own whims. This lack of uniformity and convention means that a Retina-resolution era of UI becomes defined as one thing by The Iconfactory and as another by Path, by Simplebots, Marco Arment, Realmac Software, Flipboard, and every other designer attempting to navigate iOS’s future without Apple’s guidance. I’m already frustrated by the number of Twitter clients disagreeing on what actions left-to-right and right-to-left swipes are supposed to invoke. But here’s the bigger worry: Apple’s hardware edge notwithstanding, what if the only incentive to develop for iOS—or to own an iOS device—is the promise of an ecosystem controlled, determined, and made enticing primarily by developers outside Cupertino? How does Apple prevent a mass migration if (when) another platform comes around proving it can foster developers the same way iOS did back in 2008?20 It’s no small feat for the challengers, but we’re fast approaching this reality.21 Developers aren’t just Apple’s biggest asset then, they’re also its biggest liability. For almost 6 years to pass with Apple demonstrating little interest in updating its UI beyond restrained refinement, beyond what’s necessary to show up with at a yearly keynote event, is either brazen confidence bordering on negligence or a lack of tactical manoeuvrability.

This for me is the real intrigue—the delicate balance between reassuring users and guiding developers—that’s simmering beneath the Skeuo v. Flat debate. Because in 2013 it’s winning the software battles that matters. The challenge for Apple then is whether they can settle on a UI design that’s simple and familiar enough to assuage the large swath of its users who seek nothing else, yet also avant-garde enough to secure its role as the pace-setter of an industry fuelled by innovation. Such a balancing act requires a flummoxing understanding of the power of design and UI’s undisputed role as the nexus of computing today. A particular design decision can not only solve a particular user experience problem, it can also make or break entire corporations while spontaneously introducing new user experience problems we’re not even sure exist yet, begetting new design decisions, which themselves may or may not solve other unknown user experience problems, introducing who knows what kinds of make-or-break challenges that will be the death of some companies and the birth of others. On most minds—to say nothing of mine—the entanglement of implications is like boiling water to oatmeal. Imagine if we were talking about anything more than a trend.






1: I’m tempted to substitute “top-tier” for a one-time, non-pejorative use of highbrow. The distinction is important because we aren’t dealing with a “this is what all the cool kids are doing” type of trend but a “we’re the kids that were doing this before all the cool kids were” kind of trend, one that isn’t responsible for making something mainstream but rather for influencing other designers whose apps will eventually take it into the mainstream. See: The Devil Wears Prada


2: That Gruber relegates any mention of the Metro aesthetic to 10pt footnotes is pre-emptive of reader riposte at best and negligent at worst.


3: And I’d argue that outside influences of flat design on iOS are too obvious to ignore. Not only thanks to the prevalence of Google’s own apps on iOS, but through the growing popularity of horizontally swiped views that owe a lot to Android and WebOS.


4: No word yet on when Daring Fireball plans to join the retina-resolution era.


5: A mistake on the scale of “print magazines are just as easy on tablets!”


6: Although in a primitive sense we can work our way backwards from our digital user interfaces to the very analog control panels, knobs, levers, keypads, and switches we use to interface with a variety of tools and appliances, which we’ve ⌘c ⌘v into our software. Ergo.


7: Explaining why Gruber’s complaints are often directed at the misapplication—whether by design or laziness—of skeuomorphic elements to UI designs which aren’t skeuomorphic at all, e.g., Find My Friends.


8: Quoted from this Webstock’11 talk. Given Gruber’s apparent knowledge of the subject, it’s all the more suspect that as basic an argument as “style changes” doesn’t warrant the briefest mention in his essay.


9: See: The brooch, overalls, fanny packs. The monocle.


10: Nostalgia perhaps, the kind that lets me defend my love of You’ve Got Mail on its historical merits and memories of my own childhood ePenpals. But let’s be honest about the Apple Podcast app, and about You’ve Got Mail.


11: And the emergence of flat UI design on iOS proper is still so negligible that it’s hard to go along with a premise that casts Retina displays as the catalyst for designer agency in all this. When Gruber—unblinkingly I imagine—informs us that the Windows 8 interface is “meant to look best on retina-caliber displays”, you have to ask yourself whether you believe in the sort of conspiracies that say either Microsoft is so forward-thinking it’s willing to push out a suboptimal product for 2 years waiting for Apple to rescue them, or is just carving another notch in the bedpost of their own folly by being cravenly inept.


12: The representation of physical elements through digital form has been around since the release of the original Macintosh, but it’s really in the last 12 years, since the release of OS X, that Apple has pushed this design philosophy into every corner, background, and window pane of its operating systems. The greater the technology, the greater the amount of physical mimicry Apple has added to its software.


13: Apple’s motto until now has been It Isn’t the Consumer’s Job to Know What They Want. Even when the iPod was at its peak, Apple showed a surprising disregard for maintaining continuity in the line, often radically redesigning a product within a single generation, and sometimes backtracking the following year when those new designs failed to catch on. Underscored here is the relative insignificance of the iPod software in relation to the physicality of the device. This proportion is reversed with iOS.


14: Hence the excitement over Ive’s recent promotion to director of human interface at Apple, given the decidedly leading-edge and un-skeuomorphic style of the industrial design team Ive leads (manifested in their distaste for the philistine, superficial, and heavy-handed traps the accoutrements of skeuomorphic design often fall prey to). I liken the situation to MJ’s decision to try baseball. Here’s a guy who possessed unique natural talents that would make him gifted in any sport he decided to get up for in the morning, yet which weren’t sufficient to find immediate success against the experience of his competitors. At the highest levels, all else being equal, experience trumps skill.


15: A topic Ive has broached in the iPhone 5’s introductory video, demonstrating the power of familiarity in user experience.


16: The Microsoft Surface is a perfect case study: incredible, innovative industrial design buried and ignored in the face of the radical changes introduced by Windows 8.


17: A small list of things we either wouldn’t have at all, would have on a smaller scale, or probably would have waited longer to see introduced were it not for smartphones: Siri & Google Now, social networking on a global scale, the explosive ubiquity of digital photography, a gaming industry divorced from its tenured oligopoly, wearable computing, ubiquitous connectivity, geo-location based services, and Angry Birds.


18: An exact description of the video game market from the mid-80s up until 2005/2006, when the economics of making a first-rate video game on the current generation of consoles made it virtually impossible to succeed unless it was sold on every available platform, putting the kibosh on decades of schoolyard turf wars over which console systems were best. But it’s only made exclusivity that much more valuable. Nintendo’s IP is the only reason the company has any relevance today, if you need just one example.


19: You need only make your own list of restricted, convoluted, clamoured for but denied, or impoverished APIs that could otherwise enable developers to create apps even greater than they already are.


20: Continuing with the video game theme from 18, we’re now describing what Steam could do with the Steam Box, its bid for our living rooms. Valve not only has a Nintendo-like following around its game titles, it’s also got the best disc-less distribution system out there in Steam. There’s likely no better candidate to endorse in the “most likely to replicate for gaming what the iPhone did for mobile computing” race.


21: Observable (a Google search will emphasize my point better than a link) from the variety of essays and switcher articles on Android finally reaching parity with iOS. From a developer/platform feedback loop perspective, we’re not quite there yet. While most of the major players (Facebook, Flipboard, Twitter, Instagram, and Angry Birds) have Android versions of their apps, what’s still lacking are desirable exclusives that attract large swaths of users and make those on competing platforms jealous. Yet this kind of slow leakage threatens to turn into a flood; the greater the number of major developers on the platform, the greater the level of confidence developers have in it, the greater the odds of Android getting those exclusives. Combined with its superior web services and ever improving hardware, Android is slowly changing the conversation from “Why wouldn’t I get an iPhone?” to “Why should I get an Android device over an iPhone?” to “Why should I get an iPhone over an Android device?”.

Movember Momembership Month

There’s no beating around the bush: cancer sucks. Chances are you know about that first, second, or even third hand. So why not do something about it? Why not grow a mustache? “Movember” is the name of the global campaign for raising awareness about prostate cancer and mental illness in men. From their official about page:

During November each year, Movember is responsible for the sprouting of moustaches on thousands of men’s faces, in Canada and around the world. With their “Mo’s”, these men raise vital funds and awareness for men’s health, specifically prostate cancer and male mental health initiatives.

Just last year, Movember raised 127 million dollars worldwide1. Grooming for a good cause? Count me in. And I need your support. This month is going to be Smarterbits’s Momembership month. Here’s how you can participate:

  1. Make a donation through my Movember profile and I’ll hook you up with a lifetime membership. You can donate any amount. Just click the ‘donate to me’ button under my mugshot.

  2. If you sign up for a renewing membership through my membership page, your first payment will go towards my Movember donations, AND I’ll match your initial donation myself, up to a maximum total of $300.

  3. If you are already a member, this month’s payment is going to the cause.

And in the meantime I will try to grow a decent moustache for you all to enjoy. In fact, I encourage you to grow your own too.

Fuck cancer.


  1. CAD, ok? It’s still a lot. 

The Galaxy Note II, as seen in LeBron James’s hands

Let’s get the snark out of the way. From the looks of it, we’ve discovered who the Galaxy Note is designed for: the enormous hands of NBA superstars. In LeBron James’s palm, the Galaxy Note II seems like… a Galaxy S3. (Honestly, I couldn’t even tell until they showed the name at the end.) That aside, I wanted to talk about this new commercial from Samsung because it’s probably the best smartphone commercial I’ve seen from anyone in quite some time. The spot is fun, captivating, informative, but most of all it comes off as effortless. You might even say cool. And yes, I’m talking about Samsung.

The first and most obvious reason this ad works is LeBron James himself. His performance never looks contrived or forced. It’s a small miracle that someone hasn’t talked LeBron James into a role in some major motion picture.1 James has amassed a fantastic resume of commercials over the years, which I’d attribute to both his otherworldly charisma and his comfort in front of the camera. In particular, I think the fact that James considers himself just another regular dude with regular friends like the rest of us is what makes him so endearing and relatable.2

In this instance we watch as he goes around town on the day of the NBA season opener: eating breakfast with his family, getting chased around by fans, grabbing a taco, visiting the barbershop, and getting dressed in a locker room with the faint but looming roar of thousands. There’s nothing special about it, but consider how much more natural it comes off than Apple’s series of Siri commercials, which employed a similar concept and cast actual bona fide movie stars. The premises of those Zooey Deschanel/Samuel L. Jackson/Martin Scorsese spots are ostensibly the same, but they feel like characters in an overly polished Williams-Sonoma showroom, not famous people letting us catch a glimpse into their lives. Those commercials are an instance of Apple’s attention to polish and detail working against them. In abstract spots, such as all those utilitarian ‘hand on a white background’ iPhone spots, the production polish helps make the otherwise uneventful visuals shine. It even works to create an aura of magic in those early iPad spots, where we yearn to be the one under the sheets being bewildered by dinosaurs again for the first time. In the case of the Siri campaign however, the too-even lighting, the way the phones are held all too perfectly to frame the device, and the obviously scripted narrative create a kind of uncanny valley where we know right away what’s in front of us is fake. The magic is lost. We know that it’s Zooey Deschanel acting in a fake version of Zooey Deschanel’s life that’s supposed to feel like a real version of Zooey Deschanel’s life.

It’s precisely that lack of polish — the absence of perfection — in the world of Samsung and James’s spot that makes it believable and allows us to lose ourselves in it. And because the ad doesn’t try to build a narrative around any one specific feature of the phone, events seem to unfold organically. Who doesn’t let their kids play with their phones over breakfast? I too would be watching a video if I was lying on the floor silently getting stretched by a physiotherapist. And I actually do tweet pictures of my shoes (only sometimes, ok?). Again, the ad generates fun and interest by feeling effortless. That’s what allows the scripted glimpse into James’s life to feel plausible. And feigning effortlessness has to this point been Samsung’s Achilles’ heel.

Samsung is able to feign effortlessness this time around because they’ve finally produced a campaign that doesn’t acknowledge the existence of the iPhone or try to comment on the meta-commentary surrounding Samsung and Apple. Simply put, it doesn’t try to pander to the tech crowd. Even if there’s some element of humour in some of those Galaxy S3 spots, building a script around turning the success of your main competitor into some laughable flaw comes off as the ploy of someone who’s decidedly in second place and is sour about it. Many have made the comparison to Pepsi’s own marketing, which often hinges on talking down or mocking Coca-Cola. While that might be true in spirit, I think we can conclude that Samsung’s ad agency isn’t as skilled at pulling off mockery that doesn’t come off as insecurity.3 The result is even worse when Samsung tries its hand at replicating the magical world-building of Apple’s commercials and crawls through its own series of contrived scenarios designed to stimulate emotions, scenarios that immediately come off as fake. I’m getting a little reflux just thinking about slow-motion shots of a couple gazing into each other’s eyes as their phones touch to share god knows what over NFC.

The secret to effective movie making, which turns out to also be the secret to effective advertising, is that emotions have to be dangled as threads for the audience to unravel. We have to be left alone to come to our own conclusions. That this new Galaxy Note II ad contains none of its previous propaganda is what allows it to succeed. Without some agenda or message it’s determined to beat us over the head with, Samsung finally gives the Galaxy line its own identity as a phone, one that doesn’t have to live in the shadow of another.


  1. Try to recall all your favourite movies with athletes in the leading role to understand why it’s a miracle. I’m sure many of you will point out Space Jam, but I’d argue that one is the exception that proves the rule. And having seen it in adulthood, I can assure you, it’s not as good as you remember. 

  2. I don’t think it’s just a public persona either. From everything I’ve read or seen about the man, I really think the predominant trait that defines Lebron James is his desire to surround himself with friends and family. Parse through all the significant events and details of his life, and the public’s reaction to them, and I think what rises to the surface is an enduring attempt on his part not to become isolated from the everyday world the more famous and prominent he becomes. (Otherwise known as the standard paradox of fame and success.) Contrast this with Michael Jordan, who always seemed to embrace his position as the centre of attention. He chased and enjoyed the notoriety of being the king of the mountain. Even in commercials, MJ is always utterly alone and isolated in his world. That’s not to say he wasn’t as charismatic or magnetic as James, but my feeling is that Michael Jordan was attractive because we wanted to be him, while Lebron James is attractive because we want to know him. 

  3. Consider how much of a gamble it is for Samsung to include so many inside jokes based around specific anti-Apple sentiment stemming from a very specific segment of the tech-community. I understand where the digs about the iPhone 5 being perceived as a jewel come from, but I’m not convinced that joke has a very broad reach. And who should be the target of a primetime national ad campaign: the blasé nerd keeping a spot in an iPhone line for his parents, or the parents themselves? 

The Magazine

Marco Arment lifted the veil off his much anticipated new app, surprising many with an app-published magazine bearing a form-eponymous name that raises — and itself asks — lots of questions. The content of this inaugural issue is for the most part1 anodyne; subscribers to Read and Trust will feel right at home, as will anyone who’s browsed a decent blog in the last decade. My impression is that the magazine was inspired by the blog, written by blog writers, and meant for blog readers. And so the whole thing’s raison d’être is a big question mark for me. Marco believes The Magazine sits in a “category between individuals and major publishers” but I can’t, at least so far, distinguish how its content does anything to occupy a space that isn’t already littered with minimalist WordPress themes.

Reading through Marco’s foreword fails to provide any examples of the new and/or experimental. The content is positioned as being for geeks but not about technology, in other words the stock-in-trade of blogs. In this first issue you’ll find the same meta-commentary, GTD personality analyses, love letters to sports, and personal introspection you’d find filling your RSS or Twitter stream any other day. The intentionally bare-bones layout is great for speed and readability but does nothing to provide an interesting foundation for the text to build on, something magazines were designed to do. Even the pitch, with its call to arms for content ownership and disdain for traditional media, is familiar. The only perceivable differences (or similarities if you’re on the magazine side of things) are a preference for longer-form writing, a publishing platform that’s universally loathed, and a payment scheme that’s been collecting dust in the internet’s closet for a long time. I am curious, however, about how the latter has the potential to fill a gaping hole in web publishing. The Magazine actually addresses the topic itself with an article from Guy English:

A business model where the author only occasionally writes longer pieces can’t be sustained — there’s too much time between pieces for sponsorships to work, and daily site traffic will be so low that ads won’t work well either. A Linkblog format offers the author a way to keep consistent traffic, be a constant voice in the greater conversation, and buy time between more in-depth pieces without losing audience interest.

The optimist in me sees The Magazine as an attempt to solve this problem while the pessimist sees ancillary income for the semi-independent link-blogger whose long-form thoughts aren’t as profitable on his homepage. There’s opportunity here, but for now Marco’s vision of — and for — the magazine (the common noun) seems better espoused by sites like Thought Catalog, the New Inquiry, or what the Atlantic and the Verge could be if they weren’t beholden to precisely the issues The Magazine is attempting to defeat. And if it doesn’t pan out, then the worst that could happen is that this project turns into a blog-centric, shortlist version of The Feature. Albeit one that can actually pay its writers.

Going back to forms for a second, perhaps my biggest question is why Arment felt he had to create another platform to foster long-form, non-traditional, financially viable writing. Wasn’t that Instapaper’s destiny? Isn’t it already poised to accomplish what he’s setting out to do with The Magazine? You might even argue Instapaper is in a better position given its popularity among a new generation of readers who want to be the gravitational center of the content they consume. From the reader’s perspective The Magazine is a throwback to the old traditions. Yet this is true only if you believe that the magazine serves only at the reader’s leisure, or that it’s only meant as a complement to Instapaper. For one, it’s probably better suited for the economic stuff. One Wayback Machine trip to Readability.com shows how difficult it is to turn the Instapaper model into a viable living for the independent writer. Could the app-publication’s master end up being the writer? As long as interests and intentions align, there’s no reason why The Magazine couldn’t succeed where Readability failed.2 Maybe that’s enough to justify The Magazine’s existence (blog-centric Readlist or otherwise). Playing along with that possibility would mean Arment is henceforth offering two solutions to the nagging plights of reading in the 21st century: one that empowers the reader, and another for the writer to wield. Whether both succeed or fail, at least there’s someone out there bored by the idea of reinventing publishing only once.


  1. The expression “plant” a basketball is so revealing of one’s lack of knowledge of the sport that I half wonder whether Jason Snell is actually using it to drive his point home. That not being the case, misfit sport-jocks should be aware that one can nail a three-pointer, sink 100% of his free-throws, be nothing but net from downtown, or possess a smooth stroke that never hits anything but twine. But one never plants a jump shot. 

  2. Note for the confidence-weary: the 4-week deadline to profitability before the plug is pulled doesn’t seem like an idea hatched by someone in it for the long haul. 

Gizmodo’s Review of the iPad and iPhone 4

As we brace for this week’s deluge, I thought I’d provide two samples—from Gizmodo no less—in contrast to my complaints from last week. Despite sharing a similar format and tone, both are among my favorite gadget reviews. What I like in particular is the way both Chen and Lam use the experience of living with these devices as the method by which we might extract value and meaning from them (most reviews tend to do this in reverse): Is there space in our lives for the iPad? How do you—why should you—redefine the already ubiquitous experience of owning an iPhone?1

Neither article is perfect (at times too coy and proud of it), but they do point towards an alternative discussion of consumer technology I can’t recall seeing elsewhere since. The only writer exploring in a similar manner whose name comes to mind is Shawn Blanc. The difference is that where Blanc’s voice is technical and definitive, Lam and Chen’s are ambiguous but honest. It’s too bad these reviews remain a distant anomaly in Gizmodo’s mired rear-view mirror.2


  1. In hindsight, the reviews complete each other. I’d wager the way Chen talks about the iPhone 4 is the way we could approach this year’s reviews of the fourth generation of the iPad. The numerology isn’t the point here, but rather the transition from the merely ubiquitous to the utterly ubiquitous. 

  2. Knowing full well the futility, I remain curious about what shape a Gizmodo sans Gawker would have taken. The double-edged sword of Gizmodo’s identity has always been its willingness to risk being contrarian—if not outright rebellious—in a space that discourages it. While that risk sometimes materialized into the aforementioned reviews, it’s also been responsible for theft, glamorizing theft, contrarianism for contrarianism’s sake, and hiring Jesus Diaz. My pet theory is that, with Gawker’s appetite for exploiting sensationalism removed, Gizmodo’s risk-taking would amount to more than repeated embarrassment. 

The Turn

You’ll have to excuse the forthcoming confusion, but I think Siegler is using the wrong analogy to make his point. In any magic trick the purpose of the turn is to fool the audience into believing what’s happening on stage, to convince them that what’s unfolding before their eyes isn’t a magician’s simulacrum but in fact reality. The prestige, where magic is concerned, is the byproduct of an effective deception. Siegler’s turn—Apple’s meticulous penchant for innovating through repeated iteration—isn’t deception: all those hardware refinements actually come together to create a phone that’s lighter, faster, larger, and more beautiful than anything before it. The difference this year is that the resulting prestige isn’t as effective. If anything, it’s hard to see in the iPhone 5 any difference between turn and prestige.

If I can empathize1 with any part of all these articles lamenting their writers’ disappointment in the iPhone 5, it would be the absence of any exclusive feature or experience that can’t be had on an iPhone that already exists. You can explain some of this simply as smartphones arriving at computing maturity. The issues that weighed down the original iPhone have long been addressed. The turns of each subsequent generation provided useful and desirable prestiges: competent networking in the iPhone 3G, the gaming and processing advances of the 3GS, the photographic prowess of the 4, and a next-generation2 user interface in the 4S. The advancements introduced by each successive version of the iPhone were not only leaps in technology, but often unprecedented. All thanks, as Siegler argues, to Apple’s relentless attention to the turn.

The difference this year is that rather than selling us on the prestige created by its advancements in the turn, Apple took the stage and sold us the turn as prestige. Although every Apple keynote is filled with long and detailed accounts of its design and engineering efforts, this year’s keynote offered little evidence of the exclusive leaps in experience3 these advancements were supposed to provide. The iPhone 5 may be superior technically, but little about using it will feel unprecedented. Perhaps this explains why the presentation emphasized the design and engineering processes above all else. It’s uncharacteristic of Apple to tout how hard it works as an argument for why we should buy its products. Worse, it’s highly uncharacteristic of Apple to offer so few and such uninspiring4 reasons why all that hard work, most of which goes unnoticed, matters once the device is in our hands.

Where I differ from the aforementioned lamenters is in the belief that this year’s lack of surprises, this year’s dissatisfying prestige, is somehow foreboding. The iPhone remains the best smartphone experience one can purchase, and this latest version keeps taking steps forward. Nor does this mark the end of Apple as the beacon of innovation. At best the iPhone 5 presents a difficult upgrade decision for 4S owners. At worst it’s a signal that priorities in hardware5 are, if not reaching a plateau, arriving at a golden age where performance gaps between successive generations of smartphones are narrow. If this is the case then Siegler’s argument is on the mark, despite the misplaced metaphor: the endgame is indeed in the turn.


  1. As an iPhone 4S owner. If you’re coming from any prior device, the upgrade will feel significant.

  2. Beta.

  3. Compare the unveiling of the taller iPhone 5 screen to the Retina Display reveal, or the introduction of the iPad. In both cases, it was easy for Apple to be unequivocal about how game-changing those turns were going to be for your enjoyment of iOS devices or how they advanced the industry. Besides widescreen video (which I’d argue is more relevant to the iPod touch) and a fifth row of apps, Apple was surprisingly short on reasons why we should care about a taller iPhone. By contrast, it’s obvious why a taller iPhone matters for Apple to achieve its engineering and strategic (read: responding to market trends) goals.

  4. Faster, lighter, taller!

  5. Alternatively, the “worst” part might be that Apple presented such a disappointing homecoming for iOS 6. If Apple has cornered the market for hardware turns, the market for software turns is a lot more competitive. And it’s over the latter that the battle will be fought. 