Researchers use Wi-Fi to see gestures, identify individuals through walls
30 October 2015


MIT has created a device that can discern where you are, who you are, and which hand you’re moving, from the opposite side of a building and through a wall, even when you’re completely hidden from view.

Researchers at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) have long thought it could be possible to use wireless signals like Wi-Fi to see through walls and identify people.

The team is now working on a system called RF-Capture that picks up wireless reflections from the human body to see the silhouette of a human standing behind a wall.

It’s the first system capable of capturing the human figure when the person is fully occluded, MIT said in an announcement on Wednesday.

CSAIL researchers have been working to track human movement since 2013. They have already unveiled wireless technologies that can detect gestures and body movements “as subtle as the rise and fall of a person’s chest from the other side of a house,” which, MIT says, could enable a mother to monitor a baby’s breathing or a firefighter to determine if there are survivors inside a burning building.

RF-Capture’s motion-capture technology could also be used to call emergency services if it detects that a family member has fallen, according to Dina Katabi, an MIT professor, paper co-author and director of the Wireless@MIT center:

We’re working to turn this technology into an in-home device that can call 911 if it detects that a family member has fallen unconscious.

The RF-Capture device works by transmitting wireless signals and then reconstructing a human figure by analyzing the signals’ reflections.


Unlike the emergency-alert wristbands and pendants often worn by the elderly – including the meme-generating “I’ve fallen and I can’t get up” LifeCall devices – RF-Capture doesn’t require people to wear any sensor at all in order to be detected.

The device’s transmitting power is 10,000 times lower than that of a standard mobile phone.

In a paper accepted to the SIGGRAPH Asia conference taking place next month, the team reports that by tracking a human silhouette, RF-Capture can trace a person’s hand as he writes in the air, determine how a person behind a wall is moving, and even distinguish between 15 different people through a wall, with nearly 90% accuracy.

That’s just one of many possible uses in a networked, “smart” home, Katabi said:

You could also imagine it being used to operate your lights and TVs, or to adjust your heating by monitoring where you are in the house.

Beyond tracking the elderly or saving people from burning buildings, MIT also has its eye on Hollywood.

PhD student Fadel Adib, lead author on the team’s paper, suggested that RF-Capture could be a less clunky way to capture motion than what’s now being used:

Today actors have to wear markers on their bodies and move in a specific room full of cameras.

RF-Capture would enable motion capture without body sensors and could track actors’ movements even if they are behind furniture or walls.

RF-Capture analyzes the human form in two stages: First, it scans a given space in three dimensions to capture wireless reflections off objects in the environment, including furniture or humans.

Because of the curvature of the human body, some of those signals bounce back toward the device, while others are reflected away from it.

RF-Capture then monitors how these reflections vary as the person moves through the environment, stitching the reflections together over time to reconstruct a single silhouette image.
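For readers curious what that stitching step might look like in code, here is a minimal Python sketch – not MIT’s actual implementation; the array layout, the alignment step and the threshold are illustrative assumptions – of combining per-snapshot reflection heatmaps into one silhouette:

```python
import numpy as np

def stitch_silhouette(frames, offsets, threshold=0.5):
    """Combine per-snapshot reflection heatmaps into one silhouette mask.

    frames    : list of 2D arrays of reflection intensity, one per time snapshot;
                each snapshot captures only the body parts angled toward the device.
    offsets   : list of (row, col) shifts estimated from the person's coarse motion,
                used to align snapshots to a common body-centred reference frame.
    threshold : fraction of peak intensity kept in the final mask (assumed value).
    """
    aligned = []
    for frame, (dr, dc) in zip(frames, offsets):
        # np.roll stands in for whatever alignment the real system performs.
        aligned.append(np.roll(np.roll(frame, -dr, axis=0), -dc, axis=1))

    # Accumulate across time: body parts visible in any snapshot contribute to the figure.
    combined = np.maximum.reduce(aligned)

    # Normalise and keep only the strong reflections as the silhouette.
    combined = combined / (combined.max() + 1e-9)
    return combined > threshold
```

The key idea the sketch tries to capture is that no single snapshot sees the whole body, so the figure only emerges once many partial reflections are aligned and accumulated.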


To differentiate individuals, the team then repeatedly tested and trained the device on different subjects, incorporating metrics such as height and body shape to create “silhouette fingerprints” for each person.

MIT says the key challenge is that the same signal is reflected from different individuals as well as from different body parts.

How do you tell the difference between various limbs, never mind entire humans?

Katabi says it boils down to number crunching:

The data you get back from these reflections are very minimal. However, we can extract meaningful signals through a series of algorithms we developed that minimize the random noise produced by the reflections.
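As a rough illustration of that idea – not the paper’s actual classifier; the feature choices and the nearest-neighbour matching are assumptions – one could reduce each reconstructed silhouette to a few body metrics and match new captures against stored “fingerprints”:

```python
import numpy as np

def silhouette_features(mask):
    """Reduce a boolean silhouette mask to simple body metrics (illustrative choices)."""
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    height = rows[-1] - rows[0] + 1   # vertical extent of the figure
    width = cols[-1] - cols[0] + 1    # horizontal extent (a rough shoulder-span proxy)
    area = mask.sum()                 # overall bulk of the reflected figure
    return np.array([height, width, area], dtype=float)

def identify(mask, fingerprints):
    """Return the name of the stored fingerprint closest to this silhouette.

    fingerprints : dict mapping a person's name to a feature vector
                   built from earlier training captures.
    """
    query = silhouette_features(mask)
    return min(fingerprints, key=lambda name: np.linalg.norm(fingerprints[name] - query))
```

In practice the team reports training on repeated captures of each subject, so the stored fingerprints would be averaged over many reconstructions rather than taken from a single frame.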

Source: MIT uses wireless signals to identify people through walls – http://arstechnica.co.uk/information-technology/2015/10/mit-uses-wireless-signals-to-identify-people-through-walls/

Image of Massachusetts Institute of Technology building courtesy of Shutterstock.com

