2010s Smartphone Screens are the New 1950s Tail Fins

I don’t know who this Dustin Curtis guy is (I’m sure he returns the favor), but he’s absolutely right. The ever-larger screens on (non-Apple) smartphones are ever-less ergonomic and ever-more ridiculous.

image from Dustin Curtis

The bigger the screen, the more impossible one-handed operation becomes.

It’s another case of companies getting into an arms race over “who can make the biggest x” without any regard to why. Where have we seen this before?

If they’re going to make phones with ever-bigger screens, they should take care to design the UI to improve one-handed use, as suggested by Itai Vonshak’s awesome Emblaze UI design:

Thumb-centric design

Itai works at Palm now, btw 🙂

H/T to my friend Jon Tzou for sending this my way

–UPDATE–

And the best tweet ever on the subject comes from Sebastiaan de With:

Dénouement

I am brought to tears not from sadness at the loss of Steve Jobs, but from joy at seeing something wondrous.

In any singular moment, have you ever seen such an outpouring of so much love, from so many people, in so many corners of the world, focused together?

Such a moment had never happened in my lifetime. Arguably, it had never happened in human history; we could never have felt together until technology knitted us together. And it may never happen again.

Experiencing this…moment, this magic, this love, together…it was Steve’s final gift to us.

Thank you.

Hello Marco’s Readers

Thank you to everyone who spread the word about my blog; it somehow made its way to Marco Arment (of Instapaper fame), who linked to it.

I will now take advantage of my 15 minutes of fame to squeeze every bit of self-promotion out of it that I can.

My name is John Kneeland. I am originally from Philadelphia and graduated from the University of Pennsylvania in 2008 with a degree in nothing useful. Now I live in San Francisco.

I currently work with the awesome webOS team (née Palm) at HP.

Building a Better Tablet Browser Part II: My Solution

Much to my surprise, the visitors to my blog providing thoughtful commentary far outnumbered the spambots. It was great to see your ideas on how to solve the problem of interacting with tablet web browsers.

Now I’ll take you through mine.

In my humble opinion, the best implementation of basic browser functions we have seen on mobile devices to date is the “gesture area” as implemented on all webOS phones since the debut of the Palm Pre in 2009. For those of you who are unfamiliar with the Pre (you’re missing out, btw), the gesture area is a capacitive touch-sensitive strip below the display where you can simply swipe your finger in a certain direction to trigger an action. In the browser, swiping left was the equivalent of hitting the “back” button, while swiping right was the equivalent of hitting the “forward” button.

This is like hitting the 'back' button, except way easier.

I am particularly enamored with gestures as opposed to onscreen buttons in mobile devices, for reasons I will expound upon in a future post. For now I will just say this is my preferred way to interact with the browser. webOS nailed it with the gesture area in their phones.

And so of course HP/Palm went and ditched the gesture area in their first tablet. *facepalm* *faceHP*

Can't touch this (bezel)

In their defense, there are many good reasons the gesture area that works so brilliantly on phones simply doesn’t work well on a tablet, but that’s a topic for another post altogether.

But back to the point at hand: If we do not have a hardware gesture area, what can we do?

Make the whole screen your gesture area

How not to do a screen-wide gesture area

That oughta do it

Heh.

Anyway, since that’s obviously silly, let’s refine the idea some more. We don’t want the whole screen looking or acting like a gesture area all the time, or we can’t actually use the device. Rather, we need a way for the touchscreen to tell when you want to be using it as a gesture area.

I have two distinct ideas on how to do this:

1. Multitouch gestures

I like this, but the main problem that I see is that Apple has probably patented it. That wouldn’t stop the Android team of course, but it’s enough to give me pause.

2. Touch-and-gesture

Right now lots of touchscreen interfaces have a means of bringing up a contextual menu: hold your finger down in a single spot for a second and the contextual menu pops up next to your finger.

an example of a contextual menu that pops up on holding a finger down

I think this is an area of opportunity for solving the browser nav issue. And so I propose the Magic Gesture Area (note to self: check whether Apple has trademarked the word “magic”).

First, I am going to reduce the number of steps required. Right now the contextual menu requires you to tap, hold, release, move your finger to the desired menu item, and tap again. Why not position the popup options so that you can swipe straight to them? I will change it to tap, hold, swipe. Once you trigger the menu, just swipe your finger to the left to go back, swipe it to the right to go forward, or swipe it up to reload. Whichever way you swipe will, of course, have a visual cue to confirm it, just as the current webOS gesture area’s light pulses in the direction of the swipe.

But wait, John, you say. What about the functions that are currently triggered by tap-and-hold, like “open in new card” or the text-editing features?

Well, I’m glad you asked. The contextual menu’s original features still have a home here: at the bottom. Only now, instead of tapping on them, you can drag your finger on top of them until the one you want is highlighted (kind of like how menus worked in classic Mac OS, System 1 through 7.x).
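To make the flow concrete, here’s a rough sketch of the tap-hold-swipe logic in TypeScript, using ordinary browser-style touch events. Everything in it is illustrative: the function and callback names (attachMagicGestureArea, onShowMenu, and so on) and the timing/distance thresholds are mine, not part of any actual webOS API.

```typescript
// A sketch of the "Magic Gesture Area": hold to enter gesture mode, then
// swipe left = back, right = forward, up = reload, or drag down onto the
// contextual menu items and release to pick one.

type GestureAction = "back" | "forward" | "reload";

const HOLD_MS = 1000;   // hold for about a second to enter gesture mode
const SLOP_PX = 10;     // how far the finger may drift and still count as holding
const SWIPE_PX = 60;    // minimum travel to register a directional swipe

interface GestureCallbacks {
  onAction(action: GestureAction): void;            // trigger back / forward / reload
  onShowMenu(x: number, y: number): void;           // show the contextual items at the bottom
  onHighlightMenuItem(x: number, y: number): void;  // highlight the item under the finger
  onCommit(x: number, y: number): void;             // activate the highlighted item on release
  onCancel(): void;                                 // hide the menu / visual cue
}

function attachMagicGestureArea(el: HTMLElement, cb: GestureCallbacks): void {
  let startX = 0;
  let startY = 0;
  let holdTimer: number | undefined;
  let gestureMode = false;

  el.addEventListener("touchstart", (e) => {
    const t = e.touches[0];
    startX = t.clientX;
    startY = t.clientY;
    gestureMode = false;
    // If the finger stays put for HOLD_MS, switch the screen into gesture mode.
    holdTimer = window.setTimeout(() => {
      gestureMode = true;
      cb.onShowMenu(startX, startY);
    }, HOLD_MS);
  });

  el.addEventListener("touchmove", (e) => {
    const t = e.touches[0];
    const dx = t.clientX - startX;
    const dy = t.clientY - startY;

    if (!gestureMode) {
      // Moving before the hold fires means an ordinary scroll or tap: give up on the hold.
      if (Math.abs(dx) > SLOP_PX || Math.abs(dy) > SLOP_PX) clearTimeout(holdTimer);
      return;
    }

    e.preventDefault(); // don't scroll the page while in gesture mode

    if (dx < -SWIPE_PX) {            // swipe left -> back
      cb.onAction("back"); gestureMode = false;
    } else if (dx > SWIPE_PX) {      // swipe right -> forward
      cb.onAction("forward"); gestureMode = false;
    } else if (dy < -SWIPE_PX) {     // swipe up -> reload
      cb.onAction("reload"); gestureMode = false;
    } else if (dy > SLOP_PX) {       // drag down toward the menu -> highlight as you go
      cb.onHighlightMenuItem(t.clientX, t.clientY);
    }
  }, { passive: false });

  el.addEventListener("touchend", (e) => {
    clearTimeout(holdTimer);
    if (gestureMode) {
      const t = e.changedTouches[0];
      cb.onCommit(t.clientX, t.clientY); // releasing over a menu item activates it
    } else {
      cb.onCancel();
    }
    gestureMode = false;
  });
}
```

The two knobs that matter are the hold delay and the swipe distance: too short and the gesture area fights with normal scrolling and text selection, too long and it’s slower than just reaching for a button.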

I’ll post some cobbled-together animations of how it works in practice later, but I wanted to get this out there now and hear your first thoughts on it.

Building a Better Tablet Browser Part I: The problem

I have an iPad and a TouchPad. I also had a Samsung Galaxy Tab for a week but didn’t find Honeycomb good enough to warrant a permanent spot in my collection.

One thing that strikes me is how awkward tablet browser interfaces seem to be.

Stretch Armstrong can go back without moving his hands. You and me, not so much.

What the tablet makers have done is take the exact same browser paradigm popular on desktops (with Honeycomb even going so far as to bring over desktop browser tabs) and shoehorn it into a tablet screen. Since the buttons are not within reach of where the fingers naturally rest, the user has to move their hand from its natural position and stretch up to hit them. Instinctively, this just doesn’t feel right. I’ve tried to break this down into more concrete reasons:

  1. Economy of movement is good, and the tablet browser as it stands does not minimize movement well. The less a user has to move to do something, the better. All the more so on a touchscreen, which requires more movement to get from point A to point B: movement in real life maps 1:1 to movement on the screen, whereas a mouse or trackpad amplifies the movements you make in a few inches of space to cover a much larger screen. While it may take an inch or so of movement to flick a PC’s cursor 6 inches up to the back button, it takes the full 6 inches of movement for your finger on a tablet. Bad.
  2. Unlike desktops (or even laptops), the hands are not just used for interacting with a tablet; they’re also used for holding it and supporting its weight. If the user has to move their hand to do something, they have to shift how they are holding the tablet every time they need to use one of the browser’s buttons. Bad.
  3. Accuracy suffers. Pick a key on your keyboard and try hitting it with your wrists resting on the palm rest. Now lift your entire arm and try hitting that same key from above with your finger. It’s not difficult (unless you’ve been hanging out with Jack Daniels), but it does take somewhat more effort than the former. I think this is because when your wrist is stable, you only have to move your fingers, whereas for the latter you have to use your entire upper arm, which involves more “moving parts” and takes more effort to get the same level of accuracy. Bad.
How can we fix this? I have my ideas, which I’ll discuss later. What about you?