Interactive experiences: Rock-paper-scissors goes high tech with gesture recognition

i.MX gesture recognition

Think you could beat a computer?

In my previous post, I shared a video of Computer Assisted Billiards featured at NXP FTF 2016. It was one of many showcases of i.MX applications processors that we filmed in 360 degrees. The showcases take interactive, digital-world experiences and make them real. Enjoy the immersive 360-degree experience with a headset or viewer, or straight from your phone or laptop by moving side to side and up and down. Here is Part 2 of the series…

Rock-paper-scissors with human gesture control

What it is:

This showcase demonstrates the use of human gestures to control a machine, experienced through a classic game of rock-paper-scissors. You can play against the computer or another person.

How it works:

The vision detection system leverages vision acceleration and OpenVX programming to determine when a human wants to interact with the machine. Communication occurs through hand signs and visual images. The machine tracks the player's hand to determine the result of each round and then communicates the outcome.
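
For readers curious what OpenVX programming looks like in practice, here is a minimal, illustrative sketch in C. It builds and runs a one-node graph, with a 3x3 Gaussian blur standing in for the preprocessing stage; the showcase's actual hand-segmentation and classification pipeline is not published, so the node choice and the 640x480 frame size are assumptions.

```c
/* Illustrative OpenVX sketch only; the showcase's real pipeline is not
 * published. Frame size and node choice are assumptions. */
#include <stdio.h>
#include <VX/vx.h>

int main(void)
{
    vx_context context = vxCreateContext();
    if (vxGetStatus((vx_reference)context) != VX_SUCCESS) {
        fprintf(stderr, "Failed to create OpenVX context\n");
        return 1;
    }

    /* Assumed 640x480 grayscale camera frame. */
    vx_image input  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);
    vx_image smooth = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);

    vx_graph graph = vxCreateGraph(context);

    /* A real pipeline would chain segmentation and feature-extraction
     * nodes here; a single blur stands in for the preprocessing stage. */
    vxGaussian3x3Node(graph, input, smooth);

    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        /* In the showcase this would run once per camera frame. */
        vxProcessGraph(graph);
    }

    vxReleaseGraph(&graph);
    vxReleaseImage(&smooth);
    vxReleaseImage(&input);
    vxReleaseContext(&context);
    return 0;
}
```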

Technology demonstrated:

An i.MX applications processor drives real-time gesture recognition, compensation for complex movement, and real-time decision making.
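
To make "real-time decision making" concrete, here is a hedged sketch in C of how a round's winner can be decided once the recognizer has classified both hands; the scoring reduces to modular arithmetic, so this step costs essentially nothing per frame. The enum and function names are illustrative, not taken from the showcase code.

```c
/* Hypothetical decision step: the gesture labels come from the
 * recognizer; picking the winner is a constant-time computation. */
#include <stdio.h>

typedef enum { ROCK = 0, PAPER = 1, SCISSORS = 2 } gesture_t;

/* Returns 0 for a draw, 1 if a wins, 2 if b wins.
 * Each gesture beats the one that precedes it modulo 3:
 * paper beats rock, scissors beats paper, rock beats scissors. */
static int decide(gesture_t a, gesture_t b)
{
    if (a == b)
        return 0;
    return ((a - b + 3) % 3 == 1) ? 1 : 2;
}

int main(void)
{
    /* Example round: player A shows paper, player B shows rock. */
    int result = decide(PAPER, ROCK);
    printf("winner: %s\n",
           result == 0 ? "draw" : result == 1 ? "player A" : "player B");
    return 0;
}
```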

Relevant applications:

ADAS driver awareness, gesture-based HMI, industrial automation, robotics, and digital signage

Stay tuned for Part 3!

Kyle Fox
Kyle Fox is the Product Manager for the i.MX 8 series applications processor. He has more than 20 years of experience in the microprocessor industry, where he has held product and technical roles in the application, mobile, desktop, and server processor areas. He focuses on the definition and production of advanced applications processor products for the consumer, industrial, and automotive markets. A graduate of the University of Texas at Austin with a degree in Computer Science, he lives in Austin, Texas, and holds three patents.

2 Comments

  1. xiaocong says:

    Hello,
    I’d like to ask two questions regarding the video:
    1. In this video, both the user's hand and arm are included. What happens when he/she wears long-sleeved clothes? Have you thought about segmenting the arm, using only the palm to control the machine?
    2. What kind of algorithm did you use for the recognition? Model matching, a neural network, or something else?
    Thanks for your reply,
    Xiaocong

  2. JacobJones says:

    Excellent blog! Thanks for sharing such beautiful information on the blog. I look forward to reading more.
