Tuesday 13 July 2010

Touch Screens and Gestures for the Web

Here's a technology that's already common on mobile phones but hasn't yet made it to the internet: touch screens and gestures. The technology has been demoed for ages, e.g. Finger Scrolling Pages, and multi-touch was demoed some four years ago. The pace of innovation on the internet seems slow by comparison now. Yes, HP are already selling touch-screen PCs, but a lot of this is proprietary; there aren't yet common standards for connecting a touchable screen to a computer. Hint to the industry: send an x, y, pressure signal down a USB cable for each place on the screen touched or released.
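
A minimal sketch of what one such per-touch report might carry, written as a JavaScript object purely for illustration (the field names and the protocol itself are my own assumptions, not any real standard):

    // Hypothetical per-touch report, one sent down the cable for each
    // finger that touches, moves on, or is released from the screen.
    var touchReport = {
        id: 3,           // which contact point (finger) this report is for
        x: 512,          // horizontal position in screen units
        y: 384,          // vertical position in screen units
        pressure: 0.7,   // 0.0 (released) up to 1.0 (a full press)
        state: "down"    // "down", "move" or "up"
    };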


Windows and Linux are of course built for one mouse, with single and double click; it's been like that for years. Since I'm over forty and slower to learn, I'll probably be happier with the mouse plus keyboard for the rest of my obsolete little life, but that doesn't mean it's a superior interface. The new generation are already happy with their touchable iPhones, and would no doubt like a computer with the interface of an iPad and the huge amount of software of the existing PC.


Why is there no common framework for touch screens on the PC yet? Is it all patent and litigation threats, or just laziness?


The Web also needs an update to work with touch screens, especially with multi-touch and gestures. A touch could be mapped to a button click, but can JavaScript events match more than one finger on the screen at a time? The interface is an x and y coordinate for a single mouse event. Event libraries need to be written for multi-touch and standardised, first for the OS and second for the web browser. Even after that, it's not obvious at what stage we decide that combined movements become a gesture. The OS layer, the application layer (e.g. the browser) and the web layer might, in the worst case, all define and recognise two fingers spinning as different gestures, triggering three events at three different layers. So plenty of work for user interface designers for the next ten years or so.
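
For what it's worth, WebKit-based browsers have been experimenting with touch events (touchstart, touchmove, touchend), where event.touches lists every finger currently on the screen rather than a single x, y. A rough sketch of reading two fingers and naively calling their combined movement a "spin" gesture could look like this; the threshold and the decision to treat it as a spin at this layer are my own assumptions, exactly the kind of choice the OS, browser and web page could each make differently:

    // Sketch: multi-touch via WebKit-style touch events. Each entry in
    // event.touches has its own coordinates, unlike a mouse event.
    var lastAngle = null;

    document.addEventListener('touchmove', function (event) {
        event.preventDefault(); // stop the browser's default scrolling
        if (event.touches.length === 2) {
            var a = event.touches[0];
            var b = event.touches[1];
            // Angle of the line joining the two fingers.
            var angle = Math.atan2(b.pageY - a.pageY, b.pageX - a.pageX);
            if (lastAngle !== null) {
                var delta = angle - lastAngle;
                // Assumed threshold: past ~0.05 radians, call it a "spin".
                // (A real library would also handle the wrap-around at pi.)
                if (Math.abs(delta) > 0.05) {
                    console.log('two-finger spin of ' + delta + ' radians');
                }
            }
            lastAngle = angle;
        }
    }, false);

    document.addEventListener('touchend', function () {
        lastAngle = null; // fingers lifted, reset the gesture state
    }, false);

Even this toy example shows the layering problem: if the OS had already consumed the two-finger movement as its own rotate gesture, this page-level code would never see it.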