Some of the best tech we see at CES feels pulled straight from sci-fi. Yesterday at CES 2025, I tested out Neural Lab's AirTouch technology, which lets you interact with a display using hand gestures alone, exactly what movies like Minority Report and Iron Man promised. Of course, plenty of companies have delivered on varying forms of gesture control: Microsoft's Kinect is an early example, while the Apple Watch's double tap feature and the Vision Pro's pinch gestures are just two of many current iterations. But I was impressed with how well AirTouch delivered and, unlike most gesture technology out there, it requires no special equipment — just a standard webcam — and works with a wide range of devices.
Neural Lab's software is compatible with tablets, computers and really any device running Android 11 or later, Windows 10 or later, or Linux. The technology was developed with accessibility in mind after one of the founders had trouble keeping in touch with their parents overseas because navigating video conferencing programs was just too difficult for the older generation. The Neural Lab representative I spoke with added that his parents preferred using an iPad to a computer/mouse/keyboard combo because touch controls are so much more intuitive. With AirTouch, they can use their TV much like they do a tablet.
In addition to accessibility, there are plenty of commercial applications — such as letting surgeons manipulate MRI scans without touching anything, or a more commonplace scenario like moving through slides in a presentation.
AirTouch tracks 3D hand movements and uses eye gaze to recognize intent, allowing it to ignore extraneous gestures. It currently supports nine gestures, and customization lets users program up to 15.
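Neural Lab hasn't published how AirTouch works internally, but the basic recipe (webcam hand tracking plus a gaze check that gates whether a gesture counts) can be sketched with off-the-shelf tools. Below is a minimal illustration in Python using MediaPipe's hand tracker; the finger-counting heuristic, the confidence threshold and the stubbed-out gaze check are my assumptions, not Neural Lab's actual pipeline.

```python
# Sketch: webcam gesture counting gated by an "intent" check (illustrative, not Neural Lab's code).
# Requires: pip install mediapipe opencv-python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def user_is_looking(frame):
    # Stub for the gaze check. A real system would run gaze estimation here
    # and only accept gestures while the user is looking at the screen.
    return True

def fingers_up(hand):
    # Rough heuristic: a fingertip higher in the image (smaller y) than the
    # joint two landmarks below it counts as extended. Thumb is ignored.
    tips = [mp_hands.HandLandmark.INDEX_FINGER_TIP,
            mp_hands.HandLandmark.MIDDLE_FINGER_TIP,
            mp_hands.HandLandmark.RING_FINGER_TIP,
            mp_hands.HandLandmark.PINKY_TIP]
    return sum(hand.landmark[t].y < hand.landmark[t - 2].y for t in tips)

cap = cv2.VideoCapture(0)  # a standard webcam, no special hardware
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks and user_is_looking(frame):
            count = fingers_up(results.multi_hand_landmarks[0])
            print(f"{count} finger(s) up")  # map counts to actions here
cap.release()
```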
I tried out two demonstrations: a 3D screen with an animated image of a tree frog and a monitor displaying a webpage on a browser. On the 3D screen, holding up one finger dropped a pinecone on the frog's head, two fingers dropped an acorn, a thumbs up spun the frog around on its leaf perch and a quiet coyote gesture turned it back. It took me all of 15 seconds to learn and use the four gestures and soon I was raining down acorns on the poor frog like some ill-tempered squirrel.
It was nearly as easy (though not quite as fun) to control the screen displaying the web browser. Moving my hand around dragged the cursor across the screen and pinching took the place of clicking. I was able to scroll around on a streaming site, pick something to play, pause it and start it back up again within seconds of learning the hand movements. There were a few instances where my movements didn't do the thing I'd hoped, but after a few tries, I started to get the hang of the controls.
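To make the browser demo concrete: mapping a fingertip to the cursor and treating a thumb-to-index pinch as a click takes only a few more lines. This sketch (again an assumption about the approach, with a made-up pinch threshold) uses pyautogui to drive the real mouse pointer:

```python
# Sketch: fingertip drives the cursor, a pinch clicks (illustrative, not AirTouch's code).
# Requires: pip install mediapipe opencv-python pyautogui
import cv2
import mediapipe as mp
import pyautogui

mp_hands = mp.solutions.hands
SCREEN_W, SCREEN_H = pyautogui.size()
PINCH_THRESHOLD = 0.05  # normalized landmark distance; tune per camera (assumed value)

def pinch_distance(hand):
    # Distance between thumb tip and index fingertip in normalized image coords.
    t = hand.landmark[mp_hands.HandLandmark.THUMB_TIP]
    i = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    return ((t.x - i.x) ** 2 + (t.y - i.y) ** 2) ** 0.5

cap = cv2.VideoCapture(0)
was_pinched = False
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            # Landmarks are normalized to [0, 1]; mirror x so moving right feels right.
            pyautogui.moveTo((1 - tip.x) * SCREEN_W, tip.y * SCREEN_H)
            pinched = pinch_distance(hand) < PINCH_THRESHOLD
            if pinched and not was_pinched:
                pyautogui.click()  # fire once on pinch onset, not every frame
            was_pinched = pinched
cap.release()
```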
AirTouch is available now as a $30-per-month subscription for individuals (and $300 monthly for companies). Neural Lab says it takes just five minutes to install the software on any compatible device.
This article originally appeared on Engadget at https://www.engadget.com/computing/...ces-with-just-a-webcam-180031750.html?src=rss