In January 2011, Microsoft came out with a revolutionary device named Kinect. While Microsoft built it for the Xbox, many people used the Kinect to give computers eyes, ears, and the capacity to interact naturally with people simply by gesturing and speaking.
A lot of developers were really enthusiastic about delivering new tools despite the device's limitations in speed and accuracy. Seeing this unexpected success and adoption by both developers and consumers, Microsoft decided to make a second version that works better on computers and can be used in a professional environment.
Shortly after, Leap Motion announced the availability of a compelling device dedicated to gesture tracking, an area where the Kinect could not directly deliver good results. Both devices opened up a whole new world of possibilities for gaming, working, and more.
And nowadays, with tablets, we also have gestures on screen and can work anywhere we want without losing our connection to our friends, our community, and our data.
I can hardly believe we ever worked any other way.
And now, with Google Glass, Vuzix and their hands-free smartphone device, and a few others announcing augmented reality, sometimes in 3D, I thought we were heading toward an "unlimited" interactive world where the real and the virtual all mix together.
The USA was definitely driving the innovation in this field until today!
A Russian company has just announced a new device based on secret Russian research from the Cold War that they had access to.
They introduced a revolutionary personal interface for human-computer interaction that reads your brain, but also interacts with your brain to send imagery. No more need for a motion-capture device: it reads what you want straight from your brain. No more keyboard or mouse: it detects what you want. No more screen: you don't look at an image anymore; it is simply there, in 3D.
All this to tell you that I lost a couple of months of my life doing research on using and programming these legacy devices, as in my Kinect articles (Part 1, Part 2), because they are all legacy devices now :(
Anyway, I was really excited by all this until my boss came to tell me I had to rewrite the complete Maya UI again. You can pre-order the device here.
you should check out the emotiv epoc eeg, which looks like your picture and has been around for a long time... so maybe the timeline you have is a bit off?
Posted by: m | April 08, 2013 at 03:01 AM