Synaptics' collaborative Fuse concept handset sports a new UI approach, and now TAT (The Astonishing Tribe, the folks behind the original Android UI) has posted a brief clip that gives a better idea of the full UI. It's pretty wild, with a rendering engine that really emphasizes depth, lighting and motion.
TAT, together with its partners (Synaptics, Texas Instruments, Immersion and Alloy), shows for the first time an integrated range of interface technologies: multi-touch capacitive sensing, haptic feedback, force, grip, accelerometer and proximity sensing, all brought together in a fully OpenGL ES 2.0 hardware-accelerated 3D user interface.
Using TAT Cascades we have implemented a user interface that brings all these modalities together to tackle some of the challenges that current-generation touchscreen phones pose for on-the-go users, namely the difficulty of single-handed use and the need to look at the screen. The Fuse's sensing technologies surround the entire device: grip sensing, for example, is achieved via force and capacitive touch sensors on the sides of the phone, and the device also introduces, for the first time, 2D navigation from the back of the phone, enabling single-handed control without obstructing the display.
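To make the interplay between grip sensing and back-of-device navigation concrete, here is a minimal sketch, not TAT's actual code, of how rear touch-pad motion might be gated on the side grip sensors so that incidental contact on the back never moves an on-screen cursor. All names and the normalized 0..1 force scale are assumptions for illustration.

```python
# Hypothetical sketch: combine side grip sensing with rear 2D navigation.
# The device is treated as "held" only when both side sensors report force.

GRIP_FORCE_THRESHOLD = 0.3  # assumed normalized force reading, scale 0..1


def is_gripped(side_forces):
    """True when every side sensor reports force above the threshold."""
    return all(f >= GRIP_FORCE_THRESHOLD for f in side_forces)


def back_nav_delta(prev_xy, cur_xy, side_forces, gain=1.5):
    """Map rear touch-pad motion to a screen-space cursor delta.

    Returns (0, 0) while the phone is not gripped, so stray touches
    on the back of an un-held device produce no cursor movement.
    """
    if not is_gripped(side_forces):
        return (0.0, 0.0)
    dx = (cur_xy[0] - prev_xy[0]) * gain
    dy = (cur_xy[1] - prev_xy[1]) * gain
    return (dx, dy)
```

With both side sensors engaged, `back_nav_delta((10, 10), (12, 14), (0.5, 0.6))` yields `(3.0, 6.0)`; with one side released it yields `(0.0, 0.0)`.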
The output feedback technologies include next-generation haptic effects and groundbreaking 3D user interface effects. The TI OMAP3630 platform has been put to the test with the implementation of a dynamic UI design, highlighting the sensor control mechanisms using real-time OpenGL ES 2.0 shader effects for things like light sources, dynamic colors, reflections, shadows, animated 3D meshes and much more.
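For readers unfamiliar with what a light-source shader effect actually computes, the core of per-pixel diffuse lighting is a simple dot product between the surface normal and the light direction. The sketch below shows that Lambertian term in plain Python; in an actual OpenGL ES 2.0 fragment shader the same math would be written in GLSL (roughly `max(dot(N, L), 0.0)`), and nothing here is TAT's own shader code.

```python
import math


def normalize(v):
    """Scale a 3D vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)


def diffuse(normal, light_dir, light_color, surface_color):
    """Lambertian diffuse shading: intensity = max(0, N . L).

    Each color channel is the product of light color, surface color,
    and the clamped cosine of the angle between normal and light.
    """
    n = normalize(normal)
    l = normalize(light_dir)
    n_dot_l = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(lc * sc * n_dot_l for lc, sc in zip(light_color, surface_color))
```

A surface facing the light head-on gets full color, e.g. `diffuse((0, 0, 1), (0, 0, 1), (1, 1, 1), (1, 0.5, 0.25))` returns `(1.0, 0.5, 0.25)`, while a light behind the surface contributes nothing; a moving light source animates the UI simply by re-evaluating this per pixel every frame.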
The movie below shows a couple of the UI mechanisms and effects, and the fully functional device with all sensors and haptic actuators will be available to try out for the first time at CES 2010.