Ok, try picking up your phone and trying out the movement on the back of the phone that this would require. It doesn't work. If the majority of the phone is touch sensitive, how are you supposed to hold it? It makes for really awkward hand movement, and even if you figured out how to get the scrolling down, you still have A HUGE issue with the UI: neither the iOS experience nor its apps were made for any kind of cursor or mouse interface.
If you need to reach across the screen without taking up screen real estate, use your other hand. Or use a stylus. Problem solved.
Edit: Also not to mention that for the most part iOS does a pretty good job of letting me be pretty accurate with my bulky fingers. It's not perfect, but it's certainly not terrible.
I actually find the scroll motion to be very easy to perform using my index finger. However, I do agree there are issues to consider when it comes to usability. You definitely want to prevent unintended gestures and clicks/taps. So for one thing, the back touch doesn't necessarily have to be "on" all the time. It may be a mode that you enter under certain circumstances. And perhaps more advanced gesturing algorithms are needed to filter intended from unintended gestures - maybe giving more weight to the size of the touch area, so that your palm resting on the back doesn't register as an event.
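Just to make that concrete, here's a minimal sketch of that kind of filtering, assuming an Android-style MotionEvent API for the rear panel; the size threshold is entirely made up and would need per-device tuning:

    // Sketch: reject rear-panel touches that look like a resting palm rather
    // than a deliberate fingertip, using the reported contact size.
    // PALM_SIZE_THRESHOLD is an assumed constant, not a real-device value.
    import android.view.MotionEvent

    const val PALM_SIZE_THRESHOLD = 0.35f  // normalized contact size, hypothetical

    fun shouldAcceptRearTouch(event: MotionEvent): Boolean {
        // Check every pointer in the event; a large contact area suggests a palm.
        for (i in 0 until event.pointerCount) {
            if (event.getSize(i) > PALM_SIZE_THRESHOLD) {
                return false  // likely a palm resting on the back, ignore it
            }
        }
        return true
    }

A real implementation would probably also weigh how long the contact stays put, but the basic idea is the same: treat size and stillness as evidence of a grip rather than a gesture.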
I don't pretend to have the answers to all these issues, but something tells me that smartphones can do much better than the current state of the art when it comes to game controls, pointing at precise locations, etc. And it goes without saying that this won't work with the existing mobile OSs as-is.
Same, although it causes me problems frequently because I also use a mouse. In fact, sometimes when typing I'll ever so gently brush the pad, jump somewhere else, and start overwriting things.
I had to turn off touch to click on my MacBook Air because I often brush the pad with the heel of my hand while typing, causing the OS to refocus on wherever the cursor happened to be.
I'm not sure what you're getting at. Your laptop is several times bigger than your phone, so it's not as big an issue. We are talking about a device that's smaller than your hand.
He's getting at the fact that laptops are trained to ignore input from your palm and only respond to your fingers... a smartphone could also ignore certain inputs that are obviously not intentional.
The only game that properly makes use of it, as far as I'm aware, is Tearaway, but even then it only appears in a few places and was kind of awkward to control properly.
The Little Big Planet games use it too, in a more limited manner.
It is in fact awkward to use, IMO, and not just because of the software or Sony's implementation.
There are at least two big problems with it:
One is that it is really difficult to do Wacom-pen-style hovering of a "cursor" on a capacitive surface with finger input in a way that works well universally for everyone without a lengthy and awkward calibration (a rough sketch of what even a crude calibration involves follows below). And because of the indirect interaction that using your fingers on the back gives you, you really need some sort of non-action hover indicator for this setup to work well.
The other is this: Put your hands in the positions shown in the original article. Now try moving your index and middle fingers around as if touching the back surface of a device and try not to move your thumbs (and wrists) all over the place involuntarily. For most people this is difficult. When you are tightly gripping a device this becomes less of a problem, but it still contributes to the whole thing feeling very uncomfortable and unstable.
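On the first point, here's roughly what even a crude two-point, per-user calibration might look like; the class, the point pairs, and the scale-and-offset scheme are all illustrative assumptions, not anything a shipping device actually exposes:

    // Sketch of a per-user calibration step: the user touches two known
    // on-screen targets from the back, and we fit a simple scale + offset
    // that maps rear-panel coordinates to front-screen hover-cursor positions.
    data class Point(val x: Float, val y: Float)

    class RearToScreenCalibration(
        rearA: Point, screenA: Point,   // first calibration pair
        rearB: Point, screenB: Point    // second calibration pair
    ) {
        private val scaleX = (screenB.x - screenA.x) / (rearB.x - rearA.x)
        private val scaleY = (screenB.y - screenA.y) / (rearB.y - rearA.y)
        private val offsetX = screenA.x - rearA.x * scaleX
        private val offsetY = screenA.y - rearA.y * scaleY

        // Where the hover cursor should be drawn for a given rear touch.
        fun toScreen(rear: Point) = Point(rear.x * scaleX + offsetX,
                                          rear.y * scaleY + offsetY)
    }

And that's before accounting for different grips, hand sizes, or the fact that people re-grip the device constantly, which is exactly why a fixed calibration like this tends to feel awkward.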
I'm nearly positive that various companies like Apple must have tested something like this out (either before/after the PS Vita) for a phone and just found it to be a poor solution when implemented in a real-world prototype.
I have a Vita, but the only game I've played that uses it is Tearaway. This game doesn't use it for precision navigation or selection, so it's hard to say how well a rear touchpad would work for that. Tearaway employs the rear touchpad for tapping to jump and pushing objects around. For that, it's surprisingly effective and adds a fun--if gimmicky--element to the gameplay. The Vita is quite a bit larger than most smartphones, and I could see problems with differing hand sizes requiring people to adjust their grips to use its entire surface--probably not as much of a problem on a phone. The Vita is played two-handed. I think a one-handed grip while using the rear touchpad on a smartphone would be more difficult as your finger's range of motion is limited. The OP article seems to anticipate this by focusing on small gestures rather than ones that would take your finger across a large surface.
On a tangent [Tearaway Spoiler Alert]...when beginning the game, Tearaway asks you to select your skin tone from a few presets. Knowing nothing about the game at that point, I thought that was really strange. I recall thinking, "I don't really care whether my character has my skin tone...and it's really odd that they'd presume that I would." But it turns out, that's not what the skin tone selection was for. In Tearaway, you use the rear touchpad to punch your fingers through the paper backdrops of the game to manipulate things. The first time I did this and they showed "my" fingers in the game, I was startled for a split second and then laughed out loud. Screenshot of the effect:
http://media.officialplaystationmagazine.co.uk/files/2013/11...
It's more magical live, as your virtual in-game fingers track the position and angle of your actual fingers surprisingly well (the angle presumably extrapolated from the current finger position and average hand size/grip).
The Notion Ink tablet originally had this back in 2010. It was disabled and eventually removed entirely. There were rumors that it was squashed by an Apple patent, but I cannot find anything more substantial.
The whole NI tablet debacle made me quite jaded towards miracle tech. I'm lucky it predated Kickstarter, because I would most likely have backed it at up to 50% over retail. It's probably why I don't own a 1st-gen Pebble.
It is one of those things that seems perfect on paper, and may even be good for the power user. But everyone knows that power users and consumer hardware are not a good match.
You are right about the Apple patent. Apple bought FingerWorks in 2005 and killed ALL the well-loved product lines (of note, one keyboard whose entire surface was a touchpad). Apple sat on top of the company's IP assets and used them for nothing but suing people. ...and people still wonder why I avoid any Apple product.
Anyway, eventually the touchscreen tech made it to the iPhone. Or so they say. But if you compare the FingerWorks tech on the few shipped keyboards with the iPhone touchscreen, they have little in common. Apple was just trolling everyone with the patents and killing innovation all around.
But since FingerWorks has been dead since 2005 thanks to Apple, several other companies, with employees who probably never even heard of FingerWorks, developed this idea... Nokia, as you mention. Sony with the PS Vita. Motorola with the Backflip (which, being one of the first AT&T-exclusive Android phones, suffered from having the worst custom Android ROM that ever saw the light of day). And more recently the Oppo N1 already has the very same implementation mentioned in the article, and is in production. But you don't see anyone rushing to the stores.
As others have mentioned, it's not a new idea. I think the reason you don't see it in Apple products, at least, is that it is a step backwards in terms of metaphor. The iPhone's core metaphor is direct manipulation. This requires that your brain visually connects the motion of your finger with the objects on screen. If you manipulated things via the back touchpad, this illusion would be broken and it would feel more like interfacing with a traditional computer and less like direct manipulation.
That sounds like the reason Apple refused to use a hardware button to trigger the camera shutter, but thankfully they caved on that one and I can take a photo without having to guess where I'm tapping the screen.
I'd rather see companies trying out ideas that might work instead of just sticking to their metaphorical guns.
Well, every design change is a trade-off. Breaking metaphors, all other things being equal, is bad. The fact that the author here identifies an on-screen cursor as a use case is probably enough to strike fear into any Apple designer that this would lead to a regression in UX.
How about, instead of a 1:1 touchpad on the back (which could have problems), putting a trackpad on the back of the phone, but having it just be a small square? So it wouldn't be a "full-fledged" touchscreen, but a small touch-sensitive section. You could also have a discrete capacitance-detecting sliver on the side of the phone so that it's only active if you're physically holding the phone. Clicks would happen using a button.
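Something along these lines, sketched with hypothetical names (gripSensorActive and the listener interface aren't any real API), is all the gating logic would take:

    // Sketch of the gating idea: rear-trackpad motion is only forwarded when
    // the capacitive sliver on the side reports that the phone is being held.
    interface RearTrackpadListener {
        fun onRearMove(dx: Float, dy: Float)
    }

    class GatedRearTrackpad(
        private val gripSensorActive: () -> Boolean,  // side sliver, assumed API
        private val listener: RearTrackpadListener
    ) {
        fun onRawRearMove(dx: Float, dy: Float) {
            // Ignore everything unless the user is actually gripping the phone.
            if (gripSensorActive()) {
                listener.onRearMove(dx, dy)
            }
        }
    }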
Why does it have to be on the back? Some old Android phones had a trackball (or an optical sensor like on a mouse) to be used as a sort of cursor. The Palm Pre had a small touch sensitive area below the screen that could be used for OS specific gestures (much better integrated than what Android used the trackball for IMHO).
All of these ideas point to the issues with using a touchscreen as input (imprecise, blocks the screen during use, etc.), yet for some reason touchscreens keep getting used (and even taking over from regular buttons).
I hope that this decision is being tested by the companies' HCI departments, but I worry that marketing is deciding that changing the input would be too much of a risk (or cause fragmentation).
I suspect that Apple, Samsung, etc. have thought of this, so why aren't they doing it? (After all, the PS Vita has had a back touchpad for a while.) The first thing that comes to mind is that people hold their phones with the back. Accidental clicks would be very difficult to avoid. In contrast, the PS Vita is larger, held with both hands and has non-touchpad areas of the back where you can grip it, as I understand.
Sony put this into the Playstation Vita and it's awful. Few games use it, and it's clumsy in the games that do. It's possible that a company like Apple could make it good somehow, through sheer will and hardware/software expertise, but I suspect it's just not a good idea in practice even if it's good on paper.
Based on my limited experience, some potential issues:
Not having the user's fingers obscuring the view of the screen may in fact increase the perceived latency, since they're focused on the screen and don't have the motion of their fingers to distract them.
The latency issue would be twice as bad if you want to render a 'ghost' of your hands on the screen as described in this design concept.
Interacting with onscreen elements is more difficult when using your hands on the rear of a device, even with a 1-1 mapping. I don't really know why this is, but even with a cursor onscreen, I have found it to be true.
Accidental interaction is 5 times worse with a rear touch panel. Apps on the Vita that use it extensively are a huge pain in terms of accidental swipes and touches, especially if you try to lay the device down on a surface for a moment, or set it on your knee to use the front touch panel.
The core problem with virtual buttons/joysticks/gamepads is that you have no physical feedback about where your fingers are, and as a result you lose your 'centering' and your inputs end up being misinterpreted or not landing. Moving your fingers to the rear of the device makes this worse, because you can no longer look at your fingers to figure out where they are.
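For what it's worth, one common mitigation on front touchscreens, which may or may not translate to a rear panel, is a "floating" virtual joystick that re-centers wherever the finger first lands, so there's no fixed center you have to find blind. A rough sketch, all names illustrative:

    // Sketch of a "floating" virtual joystick: the stick's origin is wherever
    // the finger first touches down, so the player never has to locate a fixed
    // center they can't see.
    import kotlin.math.hypot

    class FloatingJoystick(private val maxRadius: Float) {
        private var originX = 0f
        private var originY = 0f

        fun onTouchDown(x: Float, y: Float) {
            originX = x
            originY = y
        }

        // Returns the stick deflection in the range [-1, 1] on each axis.
        fun onTouchMove(x: Float, y: Float): Pair<Float, Float> {
            val dx = x - originX
            val dy = y - originY
            val len = hypot(dx, dy)
            if (len <= maxRadius || len == 0f) {
                return Pair(dx / maxRadius, dy / maxRadius)
            }
            // Clamp to the joystick's rim when the finger drifts too far.
            return Pair(dx / len, dy / len)
        }
    }

Even with that trick, you still drift over time without looking, which is the part that gets worse when your fingers are on the back.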
Yeah, my biggest issue with touchscreens by far is the lack of precision. The centering point seems a bit silly, though, since your fingers typically move fast enough for your memory to fill in where they are.
Maybe in games this is more of an issue, especially where you need to leave your fingers in the same position and the design of the game does a poor job of taking this into account. But those are mostly edge cases.
The problem with precision is one of visibility. In my experience, the problem is that I can't see the exact spot where the device thinks my finger is, because my finger is on top of it. This is at least the case with text selection.
The Motorola Backflip (http://www.gsmarena.com/motorola_backflip-3079.php) actually had a touchpad on the back that could be used the same way trackballs were used on BlackBerries. It was pretty awful to use, though that might have been because it ran Android 1.6.
I think the future of smartphones is foldable: either folding in the middle like a sheet of paper, or expandable flexible OLED displays shaped like Chinese hand fans or origami.
Furthermore, the cell phone of the future will be able to borrow any big screen in its vicinity: something like NFC pairing with the large display/computer monitor, plus built-in AppleTV, Google Chromecast, or Miracast functionality. You will also be able to borrow local keyboards for better input, but without the Bluetooth hassles of setting them up. So the cafes and workplaces of the future will have wireless chargers and screens that you can borrow for your mobile device.
So you will carry your device around but borrow larger displays and keyboards. The device will be powerful enough to do your everyday computing. No need to drag a big laptop around if you do not want to.
These devices will also be user-serviceable, like Google/Motorola's Project Ara. It is simply not good for the environment to throw away a whole phone just because the display or battery is bad, or because you want to upgrade the radio components. So in the future, devices will be made to be recyclable; this trend will be driven by the scarcity of rare-earth metals. It will simply not be good enough to buy and throw away devices without thinking about the recycling of rare metals and the environment.
Exactly that. You cannot invent (patent) something that has already been invented. Now you could go ahead and invent (patent) something that builds on his ideas, as long as those ideas are a novel addition (not an obvious next step).
... "The samples will be missing lots of implicit information such as how to install the necessary libraries and how to deal with missing dependencies and version conflicts." ..
Even people who grasp code pretty well can get tripped up when loading a stable library needed by a tutorial, thanks to bad advice - for example, a library that was patched for some security vulnerability in a way that broke a number of things the current examples need in order to work. The failing code then just misbehaves in a number of ways; one memorable one is complaining that the general approach in the examples violates security standards, without offering any alternative. That is, unless you find a reference somewhere on some obscure blog saying you have to load a specific pre-alpha version or HEAD^3, because HEAD is broken in so many subtle ways that it will cause you even more pain. This has happened more than once, and not only to me, I bet. I know I could've patched my code, but with the deadline and the fact that I don't know much about cryptography, that would not have been a sane option.
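Purely as an illustration of the kind of thing tutorials rarely spell out (the coordinates and version below are made up), pinning the exact library version the examples were written against, e.g. in Gradle's Kotlin DSL, at least avoids fighting whatever "latest" happens to resolve to today:

    // build.gradle.kts sketch: pin the exact library version the examples were
    // written against instead of whatever "latest" resolves to. The coordinates
    // and version below are hypothetical.
    dependencies {
        implementation("com.example:crypto-lib:1.2.3")
    }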
That's a good idea and all, but you know what would be better than two touch panels? A button to turn them off!
With screens bigger than your hands and accidental touch detection being as bad as ever, a simple button to turn off the digitizer would make a lot of people happy...