smarterbits

The Real Problem With Hooking Up

Freitas convincingly demonstrates how Sex and the City, despite its flaws, depicted sex as fun, exciting, and pleasurable, while Girls equates it with misery and boredom.

Only eight years separate Sex and the City’s series finale (let’s discount the movies) from Girls’s pilot. Staggering.

A Superman for All Seasons

Latter-day attempts at “relevance”—which have seen Superman tackling issues like world hunger and racism—backfired because Superman functions on a higher symbolic level. It is a hard-won lesson of comics: Showing a guy in blue tights and red cape weeping over the body of an abused child doesn’t bridge the distance between his world and ours, it brings the yawning gulf between them into sharp relief.

Interesting overview by Glen Weldon of Superman’s lockstep with American culture over the decades, especially for a guy like me who’s never been able to muster any interest in the character. I don’t get the appeal. If he’s supposed to represent some ideal for us to strive for—“truth, justice, and the American way!”—then we’ve been set up for failure. Superman gets to be who he is precisely because he lacks the humanity that makes the rest of us so fallible. We’re unable, as a species, to be so stoically selfless. That’s why I prefer his portrayal in Smallville and Superman Returns as someone who struggles, despite his incredible powers, with the same issues and doubts we all face and whose morals and convictions are actually challenged from time to time. Singer’s Kal-El might be out-of-character by Weldon’s standards, but I think it’s the relatable Superman that’s inspirational beyond mere symbolism.

The Visual Effects of Oblivion

Visual effects caught on camera will always be more captivating than digital ones processed after the fact, so it’s reassuring to see young filmmakers who appreciate that difference. And compared to most productions, working on the set of Oblivion seems to live up to the promise of Hollywood movie magic. It also just looks damn fun.

The editing for this clip, however, is not as impressive. More than once, a talking head/voiceover sequence praising XYZ is followed by another talking head/voiceover extolling the same subject nearly verbatim, revealing how rehearsed and insincere these behind-the-scenes looks can be.

It’s Not a Web App. It’s an App You Install From the Web.

Odd blog post by the team behind Forecast, the excellent new weather app that’s attracting attention both for its quality and the medium used to create it.

So why does it feel as if the average native app is so much better than the average web app?

The line of questioning is what nags me. The interesting part of the web vs. native debate isn’t which technology feels better (a question they demonstrate convincingly enough is immaterial anyway) but why app developers overwhelmingly choose to work on native platforms. What I would’ve liked to see is an argument for why developers should prefer web solutions over native ones. Some reasons are obvious: cross-platform compatibility, over-the-air updates, and a dynamic and adaptable programming foundation in HTML. What’s tricky is convincing developers those things are worth leaving behind the advantages native apps provide, especially where it involves justifying a web app’s absence from the one place the majority of people shop for and discover new software. Technologies aside, native apps get a head start from the visibility, added security, ease of use, and built-in marketing app stores provide. The reality web advocates have to overcome is one where we’ve built an economy and marketplace around native solutions. The funny historical twist is that if web apps had been as capable 6 years ago as they are today1, I can’t begin to imagine what kind of conversation we’d be having instead.
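For what it’s worth, the “install from the web” part is less exotic than it sounds: iOS Safari has long supported a handful of meta tags that let a site saved to the home screen launch full-screen with its own icon. A minimal sketch of what that can look like (generic markup, not Forecast’s actual code; the file paths are made up):

    <!-- Hypothetical home-screen web app markup, circa 2013 -->
    <meta name="apple-mobile-web-app-capable" content="yes">
    <meta name="apple-mobile-web-app-status-bar-style" content="black-translucent">
    <link rel="apple-touch-icon" href="/icon-114.png">
    <link rel="apple-touch-startup-image" href="/launch.png">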


  1. In comparison, the iPhone really was a tough space for web developers in 2007. App-wise, where did one start? There was little precedent to take inspiration from. Nor could they benefit from the legwork Apple would spare native app developers with the iOS SDK in 2008. Other factors were altogether out of their control: even up until the 3GS, iPhones didn’t have the computing or networking power to deliver web app performance comparable to native apps. 

Quotes and Accents

Great cheat sheet by Jessica Hische on all those tiny typographic details that make a big difference in quality. Includes all the Mac keystrokes as well, saving you an additional Google search.

Concerning the em dash, make sure that what’s included between them—what you’re going to put here—is a strong enough break from the current thought to warrant its flair. In Hische’s example, a simple period would be just as effective in conveying the emphasis of the narrator’s reaction: I once had to use the bus station bathroom. Horrifying. My own yardstick is to use an em dash for asides too brief to warrant a footnote1 but not suitable for parentheses (use parentheses to add details directly related to the ongoing thought).


  1. Use footnotes for tangents so long they would be distracting placed in the main text. They’re DVD extras. 

The Verge's HTC First Review

Two points I’m glad Dieter Bohn emphasizes in his review: besides the Facebook integrations, the First runs stock Android (4.1, unfortunately). It also bucks the bigger-is-better mantra we’re used to seeing when it comes to screen size.

Looks to be the most user-focused Android design to date, even compared to Google’s own Nexus line of products.

Facebook Home and the Islands of Internet

A friend of mine believes that all big tech companies treat the web as a service worth competing over. Running with his perspective, I feel less crazy suggesting that Facebook’s ultimate goal is to become its own version of the internet. (He probably put that thought in my head too.) This idea lands somewhere between theory and practice. You can — right now — open a browser tab, log into Facebook, and find in that one place a majority of the information you would otherwise visit a variety of sites to find: restaurant menus, concert listings, what your friends are up to, photos, news, blog posts, games, ad infinitum. Still, the other sites endure. Some because they provide better info. Some are more popular. Many survive because Facebook hasn’t found a way to convince us all to log into it every time we launch our browsers. Which is why whatever Facebook Home turns out to be, it’s at least a somewhat pretty sizeable deal.

Despite the whole thing not even being official, the Facebook-centric launcher/homescreen/app-OS (Is it all these at once?) is already a strange entity. I’m shocked that Google is endorsing this at all, even if only by doing nothing. Letting Samsung or Amazon use Android as a base layer for their own operating systems is one thing; the information plumbing still runs through Google. Facebook Home is trying to circumvent Google altogether. There may be a Google search bar at the top of the screen, but I have a hard time believing that’s all it takes to convince your biggest competitor that your intentions are noble and just and “look pal, I think this is the beginning of a fortuitous partnership”. The Google-branded search bar — heck, the fact that Android is even prominently mentioned on the event invitation — must be a good bit of Jedi mind trickery. At first I was going to criticize Facebook Home for being too tentative. Why only a launcher/homescreen/app-OS instead of an entire OS or customized Android UI? Why not try and block Google’s access in every way possible? But it’s obvious, thinking it over twice, why Facebook Home makes more sense as is, rather than as an OS blitzkrieg that would almost certainly fail. A proprietary OS or Android fork would involve selling millions upon millions of phones, something I don’t think OEMs (Facebook is probably the one doing HTC a favour by giving them this much primetime US media attention), carriers, or customers are eager for, nor something I think Facebook has the skills to execute at the scale needed to pull it off. And you can be sure Google would be drawing a line in the sand over the whole Android-as-commodity-OS-for-other-web-companies arrangement. Which is why a launcher/homescreen/app is so brilliant: Facebook gets to simulate an OS without the overhead of engineering its own, one it can potentially propagate like a virus across millions of its competitors’ existing and forthcoming devices. And they get away with it unchallenged because of a search bar? No really, it’s astonishing that Larry Page doesn’t realize he’s handing over his lunch to Mark Zuckerberg.

The splitting of the mobile ad pie is going to matter, but on a bigger scale than most of us are realizing. Supposing Facebook’s new project is in any way successful, it’s going to solve their recurrent login issue and let them be the internet’s portal on the fastest growing, most used class of computers. It means getting our first glimpse of the internet’s segregation.

Thanks to mobile computing, we access the internet through an array of apps, widgets, and OS-level services instead of a single browser window. Controlling those portals is big business. Once there’s no longer a single way to reach the World Wide Web (the browser plus a search engine being the original), the ad value of our attention skyrockets, meaning the richness and exclusivity of a platform’s information becomes an asset to be stockpiled rather than shared. Facebook wants to be the only internet its users need, and to do so it needs a combination of portals (Facebook Home/Facebook Messenger) and content (anything that’ll fit on a profile page).1 Adding reasons for us to stay, e.g., Instagram, Zynga, Bing search and Maps, becomes crucial to the business. When Facebook begins taking sizeable chunks of Google’s revenue away, Google will be forced to limit access to its own content in an effort to lure and hold onto the people using its services. If you were looking for perspective on why Apple would create its own mapping service, simply play this scenario out a little further.

The somewhat pretty sizeable deal in all this: Facebook Home stands to mark the beginning of the absolute ecosystems: silos not just of hardware and software, but also of knowledge. The islands of internet. And proof of my friend’s supernatural prescience.


  1. Google has been doing the same thing for a long time now, only in a less overt manner. Its strategy was to provide the information backend for everyone else’s software and hardware and monetize the data flowing back and forth. Facebook is the first service with enough scale and reach to actually challenge them, not because it has more services or better information, but because its own data is more valuable to advertisers.


Influences

The textile industry is squandering an opportunity. Despite accounting for 8% of manufactured goods sales around the world, it has managed to stay on the sidelines of our mind share ever since ire over sweatshops boiled over in the 1990s. Nowadays it’s software designers undertaking the bulk of the PR work for textiles, as skeuomorphism finally impresses upon an otherwise fabric-oblivious generation the nuances of linen, felt, faux leather, and whatever other basic textiles make up your shirt’s blend. Blame my cynicism, but I’m shocked Cotton or DuPont hasn’t seized the moment and begun demanding their logos mar every wallpaper or user interface element on which digitized versions of their products appear. Unfortunately for them, it looks as though the public’s honeymoon with skeuomorphism is already coming to a brandless end.

“The Trend Against Skeuomorphic Textures and Effects in User Interface Design”, the latest in a long list of attempts at explaining this particular eventide, stands out thanks to John Gruber’s uncanny ability to summon a history of events wholly disconnected from reality. His essay, like most magic, begins with a benign observation: there’s a trend forming among top-tier1 iOS developers steering away from the skeuomorphic design language of the platform. Trying to figure out why, Gruber cites Letterpress, Instapaper, and Twitterrific 5 as case studies (other good examples: Realmac Software’s Clear, Flipboard, and Simplebot’s Rise), endorsing Dave Wiskus’s false rationale that the examples supra cement iOS’s legacy as the birthplace of leading-edge, non-skeuomorphic design. Things immediately start to fall apart.

*Proper usage of the word skeuomorphism is contentious enough to warrant its own article, so I’ll address it here to avoid issues later on. Most of the ire is concentrated around its misapplication to designs which aren’t by definition skeuomorphic at all. I prefer deferring to the experts: Christopher Downer provides a good introductory overview that delineates the apples and oranges. In contrast, Chris Baraniuk’s position is polemic, calling into question the entire use of the word in relation to UI design and—not content to stop there—wondering whether the Wikipedia definition is more or less entirely rubbish. Louie Mantia also provides some needed myth-busting on the issue. While I tend to agree with each of their arguments, I still can’t get on board with their prescriptivist position. Doing so would be ignoring how the word has transcended the boundaries of its old meaning and become a catchall term for a larger body of people using and adapting a definition that’s more popular in everyday use. Much the same way minimalism is flung around with little regard for definitions, we can use skeuomorphism as a genre word that, though perhaps frequently misapplied, is apt enough in practice for everyone to distinguish between a skeuomorphic-ish design and one that isn’t. And it’ll be used as such here.

From the start, both men’s design myopia refuses to acknowledge that non-skeuomorphic design existed elsewhere prior to 2012, whether as the preeminent aesthetic of Windows Phone 7, Microsoft’s mobile operating system2, or through the clean lines and sci-fi sterility of Android’s not-completely-flat-yet-not-stuffed-with-chrome UI. The sidestepping of any outside influence is meant as misdirection, a reshaping of events that encourages the idea that iOS designers live in a vacuum controlled by the whims of Apple. My guess is that Gruber thinks he can get away with this fallacy because Windows Phone sales have been tepid at best and the stock Android UI is almost always redecorated by whoever’s supplying the hardware. Except popularity isn’t a necessary condition of influence. Any competent accounting of flat UI design shouldn’t, and wouldn’t, ignore the contributions of Microsoft, Google, or even Palm, no matter how disappointing their sales records.3 Having declared iOS the epicenter of this new trend, an iota of sleight is all that’s needed for Gruber to switch Apple’s position from beneficiary to benefactor.

Gruber’s chosen Apple’s Retina display to be the hero of his story, declaring it a singular breakthrough absolving designers from employing “the textures, the shadows, the subtle (and sometimes unsubtle) 3D effects” of skeuomorphs that were “accommodat[ing] for [the] crude pixels” of non-Retina-quality displays. His thought process involves comparing the influence of high-resolution displays on UI design to the influence—in this case real and documented—they’ve had on digital type design. Quick recap: Retina-caliber displays are behind the viability of fonts designed for print being rendered digitally, which had hitherto looked insulting at the sub-par resolution of non-Retina displays. They’ve also had the reverse effect on screen-optimized fonts, suddenly making them appear vulgar and ridding them of their purpose. Gruber equates the trimmings of skeuomorphic design to stopgap fonts like Georgia and Verdana4: poor solutions used for lack of better options, given that the “hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”. Therefore, we ought to be thanking Apple for granting designers the opportunity to produce “graphic design that truly looks good” on our devices.

There’s no evidence I can find—and suspect will ever find—to defend the claim that skeuomorphic textures and effects are scapegoats for the shortcomings of lower-quality displays. Gruber leans so heavily on his comparison to screen fonts that he starts to redefine the term, implicitly suggesting that skeuomorphism is equivalent to poor design taste. If you’ve made it this far then you know how spurious the whole idea is. Even Dave Wiskus’s 100-level explanation is enough for anyone to articulate the relationship between a skeuomorph’s purpose and a heavily textured material surface. Neither is there any reason to believe that skeuomorphic design is now defunct thanks to Retina displays, given that (a) we know a skeuomorph’s primary function isn’t to cover for crude pixels; (b) contrary to Gruber’s subjective analysis that all drop shadows and glassy surfaces look worse on them, Retina-caliber displays allow for even more detailed and striking effects, making already beautiful apps using skeuomorphic elements all the more stunning; and (c) even if we cede the last two points, questions abound as to why, since the release of the Retina-bearing iPhone 4 in June 2010, Apple has all but ignored the apparent Retina-resolution design era and pushed towards heavier and heavier use of so-called parlor tricks on both iOS and OS X, or why so few third-party developers have moved away from the skeuomorphic model. His entire essay is being driven in a car without a rear-view mirror, aces rushing out of its driver’s sleeves.

Most of the sensible explanations put forth in “The Trend Against Skeuomorphic Textures and Effects in User Interface Design”—that skeuomorphic elements are overused, that Retina-caliber displays can influence UI design—are perverted by the misconception that print design and UI design are one and the same.5 They’re not. Where print design is concerned with aesthetic cues and organization of information conveyed subconsciously to the reader (e.g., the way the eye moves between two paragraphs and understands new ideas are being introduced, or how text size imparts hierarchy), UI design’s cues are dynamic and explicit. They must convey function, respond to input, morph, adapt, and tangibly interact with the user. The set of skills required for one doesn’t come close to the set needed for the other. When Gruber tells us that “[the] hallmarks of modern UI graphic design style are (almost) never used in good print graphic design”, he’s right for all the wrong reasons. The differences don’t even matter. What he’s trying to demonstrate is how UI design is undergoing the same crippling transitional phase print design—specifically where fonts were concerned—had to endure with the introduction of digital displays. His account of digital type’s hobbled history, right down to its rescue by high-resolution displays, is spot on. Yet the paths between the two arts don’t run parallel; software’s only ever been digital. Where’s the analog6 (or digital) counterpart we compare it to and say “We could do so much more if only we weren’t stuck designing this software on a screen”? As displays march on towards human-grade optics, designers’ options have of course improved, but there isn’t some past UI standard they’re trying to return to. Progress here is strictly forward. Nothing forced skeuomorphism on us.

The upshot to this mess is that Gruber’s initial question is actually worth considering. It never once occurs to him, however, that the answer needn’t be as convoluted as he makes it.

In his own words: “There is a shift going on, fashion-wise”.

Designers. Users. No one is immune to the fatigue brought on by overexposure. The numbers themselves are staggering: 700,000 apps downloaded 35,000,000,000 times. Even accounting for the large number of games making up that total, the prominence of skeuomorphic design is inescapable. We’ve refined, developed, added to, twisted, and debased the style down to a chintzy polish.7 Why doesn’t Gruber wonder whether we’ve simply tired of seeing yet another faux-textile background mimic a pair of pants no one would dare buy in the real world?

The analogies to fashion are easy to latch onto because they help make the distinction between aesthetics and function, something Gruber understands and has leaned on previously when describing user interfaces as “clothing for the mind”8. The premise is simple: no matter the amount of “stylistic tweaking”, UIs—or clothes—remain true to their form. So long as it remains able to divide the bill at the end of lunch (form), your calculator app can resemble whatever model Braun calculator it wants (stylistic tweak). The couture comparisons might be heavy handed, but they’re a good starting point from which to find better reasons why we’re moving towards flat user interfaces. For example, it could be that designers are realizing there’s a whole new generation of people for whom the cues of skeuomorphic design aren’t referential, but merely aesthetic.9 What’s the point of mimicking a Braun TG 60 reel-to-reel deck for millions of kids and young adults who will never lay eyes on—never mind use—an actual physical tape recorder in their lives?10 Why stick by a design that’s losing its raison d’être? (Ed. note: an update to the Podcasts app on 21-03-2013 got rid of the tape-deck simulacrum.) We might also consider whether skeuomorphic design is even fit for the UIs of modern computing anymore. As we increasingly interface by way of gestures, voice commands, and inputs disconnected from physical analogs, are digital knobs and textures the most efficient or practical solution? Asking these sorts of questions—not wondering what’s changed since Apple released a new iPhone—is how we begin noticing the influence of an entire mobile industry on itself: we can trace the career of Matias Duarte from Palm to Google and see WebOS’s legacy of physicality continuing on Android. It’s why designers at Microsoft can find solace in the fact that others are apparently taking inspiration from Windows Phone 8’s text-centric, chrome-less aesthetic and adapting it to their software. Point being, it’s pure fantasy to imagine third-party iOS developers leading the charge against embossed text on the basis of a single and insularly engineered cataclysm.11

Skeuomorphism isn’t bad design. Nor is it a fad. A pragmatist might complain it’s no longer ideal in 2013. A pessimist would say we’ve made it kitsch. I suspect John Gruber knows and believes these things. Otherwise his essay is a change of opinion that throws away years of Daring Fireball posts. Then why go to such lengths to find a solution so stretched and un-obvious? My suspicion is that any scenario wherein we acknowledge that fashion-wise something has fallen out of favour inevitably leads to questions about exactly what’s causing the falling out. Fingers want to be pointed and the inconvenient truth here is that skeuomorphism has no bigger an evangelist than Apple.

What goes unmentioned in Gruber’s essay is that most of the gaudy elements he’s reproaching were introduced, if not heavily endorsed and popularized, by Apple.12 iOS’s contribution was to dial the exposure knob to 11 by attracting thousands of eager developers to its ready-made developer tools favouring conformity and uniformity across the entire platform. The formula’s proved so successful that the entire UI language of specific classes of apps has been codified, standardized, and left customizable only at the level of “Which texture or drop shadow angle should we use here?”. Hence the excess.

There’s little satisfaction in getting this far only to have me pin this on one writer blindly toeing his party line. While there’s no doubt Gruber’s overthought the situation so Apple can walk away unscathed, what I want to try and coax into sight are the actual consequences at play in this debate. Blaming Apple for abrading our tolerance of skeuomorphism isn’t as worrisome as the idea that it might have no intention of stopping. Hardware aside, there’s enough evidence to suggest that Apple’s institutionalized its taste for the playful, safe, non-threatening, and innocent genre of software espoused by iOS. You’ll notice small doses of it in places like the App Store, where categories and catalogs are given their own tacky icons filled with garish fonts and unimaginative emblems: a golden plaque background for its hall of fame category, an assortment of balls to decorate its sports section. Where it’s most apparent is in the now celebrity-laden, heartstring-tugging commercials, the charms of which have less to do with Apple’s clever wit and genuine passion than with applying its fastidious work ethic to clichés we’ve seen elsewhere in advertising. There’s a shift occurring at Apple about who it considers its core audience to be, a shift that consequently reverberates across its product design, i.e., why it continues to be attached to skeuomorphism.

*Marketing is often the simplest way to see who a company cares about, how it perceives its audience, and how it cares to be portrayed. The best way to illustrate this particular shift—without rewinding too far—is by drawing a line somewhere around the launch of the iPhone 4 and comparing Apple’s advertising efforts before and after. The biggest visible change is the introduction of the decidedly cinematic and ostentatious suburban lifestyle vignettes exemplified by the Sam Mendes-directed FaceTime videos, as well as almost the entire run of Siri spots and the short-lived Apple Genius series. They’re evidence of a company shedding its aura of pretentious coolness in favour of innocuous inclusiveness. Even going as far back as the Jeff Goldblum-narrated iMac G3 commercials, Apple’s marketing pre-iPhone 4 was often about differentiating two sets of values: Apple’s, and everyone else’s. The Manchurian-like effect on consumers meant—besides exemplifying TBWA\Chiat\Day’s own genius—that owning something California-designed was a token of membership. If nothing prevented anyone from enjoying those iPod Silhouette dance videos, nor the charms of the Get a Mac series, those ads nonetheless introduced dividing lines. If you didn’t own an iPod, didn’t recognize the catchy music (remember when Apple abandoned the opaque dancers and up-and-coming hipster bands in favour of unmistakable U2 and Coldplay mini-music videos?), owned a PC because you honestly couldn’t tell the difference, or weren’t savvy enough to make out all the references in the classic “Hello” iPhone Oscars spot, you couldn’t help but notice how different you were from those people who did own Apple products, a realization laced with all the consumerist impulses we like to pretend we’re immune to. Today, with so many iPhones and iPads in the hands of people who decidedly don’t care to fit that particular brand image, the old approach becomes alienating. Thus the current marketing—because Apple’s demographics run such a broad spectrum—goes out of its way to avoid any delineation, aiming to associate the brand with a wholesome, family-values, American Dream lifestyle that almost anyone can relate or aspire to in some way.

Apple’s cutting edge innovations are both blessing and curse. As responsible as they are for the massive success and ubiquity of Apple within the pockets of a large portion of the developed world, they’re also responsible for populating its base with customers for whom cutting edge technologies have little appeal, traction, or even desirability. Today’s average Apple enthusiast is less likely to care about trends in UI design than about whether their current iPhone’s case will fit the next one. The kicker is that it’s proof of Apple’s shrewd business acumen: the skeuomorphic designs introduced in iOS back in 2007 were central to overthrowing the crude and unapproachable UIs powering devices preceding the iPhone and transforming the smartphone into something desirable to people outside office buildings. In hindsight it’s easy to explain why Apple had a hit on its hands. Today, however, the huge heterogeneous market Apple managed to attract to iOS is also a huge, heterogeneous, sensitive-to-change market that expects its median to be catered to. Dealing with expectations of this magnitude is a new world for the company, one which it may not be comfortable operating in.13 Even assuming it remains a best-of-breed consumer electronics company well into the future, the attrition caused by the demands of a ubiquitous user base means it’ll be increasingly harder for Apple to remain at the leading edge of the industry, at least UI-wise, without running the risk of estranging that base. While it won’t prevent them from innovating on hardware and technologies, it could force them into tempering their software breakthroughs in aspects they otherwise wouldn’t have if the target market still resembled what it was in 2007. Multi-touch gestures are a good example. Despite possessing the most advanced touch display technology in the industry, gestures remain woefully underplayed in the core iOS interface. Four- and five-fingered iOS navigation only became available to the public with iOS 5, and its use—turned off by default—is limited to the iPad. There’s also no reason why some of those same gestures couldn’t work on smaller, iPhone-sized devices with one- or two-fingered substitutes. Yet their absence is conspicuous. Six years in, the gist of working one’s way through iOS remains tapping buttons over and over again. Even prominent third-party innovations like “pull to refresh”, which thanks to their popularity in third-party apps could routinely be mistaken for core elements of iOS’s interface, have only been timidly adopted by Apple, if at all. This underlines why the charge away from skeuomorphism is being led by third-party developers, and not Apple as Gruber suggests. Third-party developers aren’t beholden to the median of iOS users. They can find success in narrow audiences. They can take more risks UI-wise, acting as outliers with aspirations of becoming the trendsetters for next year’s UI fashions. It’s a can’t-lose scenario for Apple: at a minimum there are enough apps to please anyone’s tastes, and if any of these flat UI projects happen to take off at scale, e.g., Google Maps, certain elements of the native Facebook app, or pull to refresh, Apple benefits by osmosis.

There’s a hitch of course. Nothing explained, debated, or corrected supra applies to any industrial design related activity Apple’s been involved with over the last 13 years. No one would contest that every desktop, notebook, or mobile device bearing its logo has at one time represented the absolute bleeding edge of its field, achievements superseded only by their successors. There’s no denying how relentless Jony Ive14 and his team have been at pushing the boundaries of what a computer device ought to be, what it ought to look like, and what it ought to be made of. Theirs is a unique focus that, mixed with a healthy disregard for whatever customers might want or expect (floppy disks, DVD drives, removable batteries, whatever I/O ports the iPad doesn’t have, and bigger or smaller iPhones depending on the rumours circulating the day you’re reading this), is almost enough to vindicate Apple’s overabundant affection for superlatives when describing its products. But hardware designers enjoy some privileges the software guys don’t. The big one concerns how being at the leading edge of electronic industrial design—as it seems only Apple has realized—actually aligns with the goals of the art. However striking its design, hardware’s ultimate goal is to disappear into the user’s unconscious: lighter so as to not fatigue the hand, smaller so it can fit into any bag. Faster, longer lasting, higher resolution. Whatever means necessary to prevent it from impeding the user’s experience.15 So long as the result doesn’t wildly diverge from the norm (say, twenty-seven-inch convertible desktop tablets or buttonless iPods), there’s otherwise little consumer attrition constraining the imaginations of industrial designers. Once in use, most of the physical aspects of our computers fade into the unconscious, outshone by the attention their software commands. The burden for the software guys lives in that differing proportion of attention. Our relationship with software is so immediate that any atomic change to our literacy of a given UI elicits a larger and longer-sustained reaction than any material changes made to our favourite products.16 We’re prone to blame, justly or not, the successes and failures of our computers on software. The feel of brushed aluminum matters more on our screens than in our hands.

Whether tangible or pixelated, fashion remains capitalism’s favourite child. Being able to tap into—or manufacture—the desires of an enormous aggregation of people is SOP for any company hoping to join the rarefied company of the Apples, Coca-Colas, and McDonald’ses of the world, even if the usefulness of their brand images doesn’t make significant contributions past enlarging the guts of the many and the wallets of the few. Yet for UI design, fashion is more than an agent for consumerism: it can solve crucial problems that define how meaningful technologies can be. It’s especially important in mobile computing, where the rejection of a long history of desktop UI paradigms has renewed exploration of the ways in which we use computers and what we can accomplish with them.17 What worries me is the possibility that stagnation is penetrating a field that’s still trying to define itself. Even scarier is the possibility that this stagnation germinates from iOS, for the simple reason (personal allegiances aside) that Apple has up to now been the only major tech company with any proven track record of saving us from stagnant trends, e.g., command-line UIs, floppy drives, physical music, and desktop computing. The dilemma with skeuomorphism is that as a major driving force for iOS’s success, it’s a design strategy that’s hard to argue against, let alone abandon. Therefore, whatever new possibilities leading-edge UI design is pointing towards, Apple’s role risks becoming reactive instead of proactive. My question then is whether or not—no matter how best-of-breed their products remain—having Apple so consummately dominate the mobile computing space is what’s best for the industry. I know the question seems rhetorical given the idiom that competition breeds innovation, but try and name any leading-edge mobile platforms that have enjoyed success in any way similar to Apple’s: WebOS not just ruined but killed Palm. Windows Phone 8 is eroding what’s left of Nokia. Windows 8 in general has Microsoft and its OEM partners in a frenzy that proves not all ideas are created equal (again, like twenty-seven-inch convertible tablet desktops marketed to moms and kids). Android as a commodity OS for hardware manufacturers has been a bestseller, but it has left the platform disjointed and lacking cohesiveness from one device to another. Android the stock, presented-by-Google operating system is almost a misnomer given its relative obscurity to the public. The only thing standing between us and the troves of innovations the aforementioned have created is the painful truth that only Apple has a proven track record of being able to popularize them.

If John Gruber can be fooled into thinking Apple remains at the leading edge of UI design, it’s thanks to its third-party developers, who’ve earned the majority stake in maintaining iOS’s innovative and dazzling pedigree, inadvertently making themselves iOS’s greatest asset in the process. While Apple is happy to oblige with statistics about the ever enlarging successes of the App Store, little is mentioned about how the ever enlarging clout of the store is shifting the power dynamics of the developer/platform provider relationship. You might describe the equilibrium like this: Apple provides a product and platform customers want to buy into, e.g., the iPhone, thereby attracting developers with the promise of an untapped audience. In return, developers provide the platform with (sometimes) exclusive software that distinguishes Apple’s platform from others, keeping current customers in the fold and also attracting outsiders who want a seat at the table, e.g., anyone who wanted to use Instagram prior to April 2012.18 This feedback loop is self-renewing as long as each player maintains their stride: a new desirable iPhone every year, followed by new apps that take advantage of its new features. Things challenging this balance: on one front, the other platforms are rapidly catching up to, and in some cases surpassing, iOS both software- and hardware-wise, strengthening their own feedback loops. On another, there’s the aforementioned trend away from skeuomorphism that, at least UI-wise, is dulling the appeal of a sticking-to-its-guns iOS and denying developers19 the guidance needed to meet the needs of this new vogue. The latter puts in play a few consequences. If Apple isn’t at least mildly proactive about updating its UI and championing those updates through its Human Interface Guidelines, then developers are left to act upon their own whims. This lack of uniformity and convention means that a Retina-resolution era of UI becomes defined as one thing by The Iconfactory and as another by Path, by Simple Bots, Marco Arment, Realmac Software, Flipboard, and every other designer attempting to navigate iOS’s future without Apple’s guidance. I’m already frustrated by the number of Twitter clients disagreeing on what actions left-to-right and right-to-left swipes are supposed to invoke. But here’s the bigger worry: Apple’s hardware edge notwithstanding, what if the only incentive to develop for iOS—or to own an iOS device—is the promise of an ecosystem controlled, determined, and made enticing primarily by developers outside Cupertino? How does Apple prevent a mass migration if (when) another platform comes around proving it can foster developers the same way iOS did back in 2008?20 It’s no small feat for the challengers, but we’re fast approaching this reality.21 Developers aren’t just Apple’s biggest asset then, they’re also its biggest liability. For almost 6 years to pass with Apple demonstrating little interest in updating its UI beyond restrained refinement, beyond what’s necessary to show up with at a yearly keynote event, is either brazen confidence bordering on negligence or a lack of tactical manoeuvrability.

This for me is the real intrigue—the delicate balance between reassuring users and guiding developers—that’s simmering beneath the Skeuo v. Flat debate. Because in 2013 it’s winning the software battles that matters. The challenge for Apple then is whether it can settle on a UI design that’s simple and familiar enough to assuage the large swath of its users who seek nothing else, yet also avant-garde enough to secure its role as the pace-setter of an industry fuelled by innovation. Such a balancing act requires a flummoxing understanding of the power of design and of UI’s undisputed role as the nexus of computing today. A particular design decision can not only solve a particular user experience problem, it can also make or break entire corporations while spontaneously introducing new user experience problems we’re not even sure exist yet, begetting new design decisions, which themselves may or may not solve other unknown user experience problems, introducing who knows what kinds of make-or-break challenges that will be the death of some companies and the birth of others. On most minds—to say nothing of mine—the entanglement of implications works like boiling water on oatmeal. Imagine if we were talking about anything more than a trend.


1: I’m tempted to swap “top-tier” for a one-time, non-pejorative use of highbrow. The distinction is important because we aren’t dealing with a “this is what all the cool kids are doing” type of trend but a “we’re the kids that were doing this before all the cool kids were” kind of trend, one that isn’t responsible for making something mainstream but rather for influencing other designers whose apps will eventually take it into the mainstream. See: The Devil Wears Prada


2: That Gruber relegates any mention of the Metro aesthetic to 10pt footnotes is pre-emptive of reader riposte at best and negligent at worst.


3: And I’d argue that outside influences of flat design on iOS are too obvious to ignore. Not only thanks to the prevalence of Google’s own apps on iOS, but through the growing popularity of horizontally swiped views that owe a lot to Android and WebOS.


4: No word yet on when Daring Fireball plans to join the retina-resolution era.


5: A mistake on the scale of “print magazines are just as easy on tablets!”


6: Although in a primitive sense we can work our way backwards from our digital user interfaces to the very analog control panels, knobs, levers, keypads, and switches we use to interface with a variety of tools and appliances, which we’ve ⌘c ⌘v into our software. Ergo.


7: Explaining why Gruber’s complaints are often directed at the misapplication—whether by design or laziness—of skeuomorphic elements to UI designs which aren’t skeuomorphic at all, e.g., Find My Friends.


8: Quoted from this Webstock’11 talk. Given Gruber’s apparent knowledge of the subject, it’s all the more suspect that as basic an argument as “style changes” doesn’t warrant the briefest mention in his essay.


9: See: The brooch, overalls, fanny packs. The monocle.


10: Nostalgia perhaps, the kind that lets me defend my love of You’ve Got Mail on its historical merits and memories of my own childhood ePenpals. But let’s be honest about the Apple Podcasts app, and about You’ve Got Mail.


11: And the emergence of flat UI design on iOS proper is still so negligible that it’s hard to go along with a premise that casts Retina displays as the catalyst for designer agency in all this. When Gruber—unblinkingly I imagine—informs us that the Windows 8 interface is “meant to look best on retina-caliber displays”, you have to ask yourself whether you believe in the sort of conspiracies that say either Microsoft is so forward-thinking it’s willing to push out a suboptimal product for 2 years waiting for Apple to rescue them, or is just carving another notch in the bedpost of its own folly by being cravenly inept.


12: The representation of physical elements through digital form has been around since the release of the original Macintosh, but it’s really in the last 12 years, since the release of OS X, that Apple has pushed this design philosophy into every corner, background, and window pane of its operating systems. The greater the technology, the greater the amount of physical mimicry Apple has added to its software.


13: Apple’s motto until now has been It Isn’t the Consumer’s Job to Know What They Want. Even when the iPod was at its peak, Apple showed a surprising disregard for maintaining continuity in the line, often radically redesigning a product within a single generation, and sometimes backtracking the following year when those new designs failed to catch on. Underscored here is the relative insignificance of the iPod’s software in relation to the physicality of the device. This proportion is reversed with iOS.


14: Hence the excitement over Ive’s recent promotion to director of human interface at Apple, given the decidedly leading-edge and un-skeuomorphic style of the industrial design team Ive leads (manifested in their distaste for the philistine, superficial, and heavy-handed traps the accoutrements of skeuomorphic design often fall prey to). I liken the situation to MJ’s decision to try baseball. Here’s a guy who possessed unique natural talents that would make him gifted in any sport he decided to get up for in the morning, yet which weren’t sufficient to find immediate success against the experience of his competitors. At the highest levels, all else being equal, experience trumps skill.


15: A topic Ive has broached in the iPhone 5’s introductory video, demonstrating the power of familiarity in user experience.


16: The Microsoft Surface is a perfect case study: incredible, innovative industrial design buried and ignored in the face of the radical changes introduced by Windows 8.


17: A small list of things we either don’t have at all, would have on a smaller scale, or probably would have waited longer to see introduced were it not for smartphones: Siri & Google Now, social networking on a global scale, the explosive ubiquity of digital photography, a gaming industry divorced from its tenured oligopoly, wearable computing, ubiquitous connectivity, geo-location based services, and Angry Birds.


18: An exact description of the video game market from the mid ’80s up until 2005/2006, when the economics of making a first-rate video game on the current generation of consoles made it virtually impossible to succeed unless a title was sold on every available platform, putting the kibosh on decades of schoolyard turf wars over which console systems were best. But it’s only made exclusivity that much more valuable. Nintendo’s IP is the only reason the company has any relevance today, if you need just one example.


19: You need only make your own list of restricted, convoluted, clamoured for but denied, or impoverished APIs that could otherwise enable developers to create apps even greater than they already are.


20: Continuing with the video game theme from 18, we’re now describing what Steam could do with the Steam Box, its bid for our living rooms. Valve not only has a Nintendo-like following around its game titles, it’s also got the best disc-less distribution system out there in Steam. There’s likely no better candidate to endorse in the “most likely to replicate for gaming what the iPhone did for mobile computing” race.


21: Observable (a Google search will emphasize my point better than a link) from the variety of essays and switcher articles on Android finally reaching parity with iOS. From a developer/platform feedback loop perspective, we’re not quite there yet. While most of the major players (Facebook, Flipboard, Twitter, Instagram, and Angry Birds) have Android versions of their apps, what’s still lacking are desirable exclusives that attract large swaths of users and make those on competing platforms jealous. Yet this kind of slow leakage threatens to turn into a flood; the greater the number of major developers on the platform, the greater the level of confidence developers have in it, the greater the odds of Android getting those exclusives. Combined with its superior web services and ever improving hardware, Android is slowly changing the conversation from “Why wouldn’t I get an iPhone?” to “Why should I get an Android device over an iPhone?” to “Why should I get an iPhone over an Android device?”.

Re: Path 3.0 and Teens

The rituals surrounding updates to Path are verging on tradition: An initial surge of excitement at whatever new beautifully crafted features the social network reveals, propelling everyone to hang around a couple days until its prompt abandonment the following week. At least, that’s what it feels like on my feed.

Path’s current incarnation is established, isn’t showing signs of struggle, and by most accounts is a shining example of taking existing products — in this case Facebook, Twitter, Foursquare, and Instagram — and remixing them into something fresh. Yet 2 years later I still can’t figure out what Path is aspiring to be beyond a showroom for great ideas and novel iOS design. Judging from this release, I’m not quite sure they’ve figured it out either.*

*You can decide for yourself. Path’s homepage copy in 2011 declared that we should "capture life’s moments to enjoy them with close friends and family". 2012’s proclaimed — less energetically — that we could now "stay connected with family and close friends". 2013’s is almost depressing by comparison, un-evocatively announcing Path’s utility for "private messaging and sharing with friends and family". You get the idea.

The new features introduced in 3.0 steer Path away from the social journaling ethos espoused in its beginnings towards an unsubtle version of a less permeable Facebook. Private messaging seems like the last item left to pick from the hat of social network standards not yet addressed. It gets weird with the art stickers: while they would otherwise go unnoticed by my Pavlovian apathy for in-app purchases, they’re so reminiscent of the Susan Kare-designed gifts Facebook launched in 2007 that it’s difficult to resist the twinge of nostalgia for a younger version of me who was excited by a now extinct version of Facebook, one concerned with staying connected to close friends and family. Maybe Path is onto something.

On a recent episode of NPR’s Fresh Air discussing online bullying, a brief mention of how kids choose which social media to engage with stuck out, perhaps because the topic also came up in the first part of This American Life’s engrossing series on Harper High School. While you wouldn’t even have to listen to either show to figure out that most of the decision is weighted by cool hunting (Myspace: old, Instagram: new), part of it also rests on the way teens conceive of the internet as a particular symbiosis of private and public space. What they seek out is whatever service has the best balance of exposure and privacy, a balance that to us seems like a delusional have-your-cake-and-eat-it fantasy: they want their content to be public enough to reach an id-flattering audience, and enough isolation to sequester themselves from the authority figures that ruin the party.

Here’s a theory. Path could be that utopia. All the cool essentials you could want in one place are accounted for. There are photo filters, stickers, something that lets you signal you like something else, robust messaging, media integration, a youthful design. But here’s the important part: Path is a goddamned master when it comes to balancing private and public. Profiles are private as a feature instead of a setting, giving users granular control over the size of their networks. Have as many friends as you want participate in your timeline while turning everyone else away. Posts aren’t available to the public unless willingly shared outside the app. For those who get into your network, there’s plenty to do. The genius is that Path’s curated set of actions is big enough to be an effective social network hybrid, yet not big enough for timelines to be bogged down by extraneous activities (games, for example) that turn other social networks (Facebook, for example) into impersonal balagans of information.*

*I also have another theory that Path might be better suited to discourage bullying precisely because public exposure is limited on the service. There’s also no ability to, say, create a Shadoe-Sucks fan page that’s available for everyone — not just Path users — to see. Granted, it doesn’t prevent bullies from acting out on their own profiles with their friends, nor does it have any means for responsible adults to monitor their kids’ activity (which seems counter-intuitive to complain about now, but is actually important because we do want mom and dad and teachers and friends intervening when things get out of hand). My point is that it grants victims of bullying the ability to participate in a social network that doesn’t also contribute to an interminable and inescapable cycle of online harassment, the kind that doesn’t end once last period is dismissed.

So then why hasn’t Path taken off in high schools across America yet? You might say it’s difficult to run against an enormous incumbent like Facebook. You’d be right, except Instagram ran such an effective and popular campaign it was bought out by an enormous incumbent. Maybe it’s a wave of luck and timing that Path can’t seem to find. My guess is it’s just hard to sell to anyone, not just teens. And stickers and private messages won’t change that. Being a hybrid of existing services, it’s challenging for Path to convince us to switch over without some obvious hook — Twitter’s limited set of characters, Instagram’s filters, old Facebook’s network exclusivity — that gets people insatiably curious and signing up. Path’s problem is that the hook it does have, i.e., being able to find that footing between what’s personal and what’s public, thereby letting us feel like our lives are shared rather than exposed, isn’t easy to describe in a way that’s concise and easily differentiated from the competition. Figuring that out should be higher on the list than stickers.