Monday, August 30, 2010

Use Microsoft Surface to Control a Swarm of Robots With Your Fingertips




A sharp-looking tabletop touchscreen can be used to command robots and combine data from various sources, potentially improving military planning, disaster response and search-and-rescue operations.

Mark Micire, a graduate student at the University of Massachusetts-Lowell, proposes using Surface, Microsoft's interactive tabletop, to unite various types of data, robots and other smart technologies around a common goal. It seems so obvious and so simple that you have to wonder why this type of technology is not already widespread.

In defending his graduate thesis earlier this week, Micire showed off a demo of his swarm-control interface, which you can watch below.

You can tap, touch and drag little icons to command individual robots or robot swarms. You can leave a trail of crumbs for them to follow, and you can draw paths for them in a way that looks quite like Flight Control, one of our favorite iPod/iPad games. To test his system, Micire steered a four-wheeled vehicle through a plywood maze.
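
Micire's own code isn't shown in the demo, but the control metaphor (trace a path on the table and the robot treats it as a queue of waypoints to drive through) is easy to picture in software. Here is a minimal Python sketch of that idea; the class, method names and numbers are purely illustrative, not taken from his system:

import math

class WaypointFollower:
    """Drive a simple planar robot through an ordered list of (x, y) waypoints,
    the way a path traced on the tabletop might be handed to a single robot."""

    def __init__(self, waypoints, reach_radius=0.1):
        self.waypoints = list(waypoints)   # path drawn on the touchscreen
        self.reach_radius = reach_radius   # how close counts as "arrived"

    def step(self, x, y, speed=0.5, dt=0.1):
        """Return the robot's next (x, y) position after one control tick."""
        if not self.waypoints:
            return x, y                    # path complete, hold position
        tx, ty = self.waypoints[0]
        dist = math.hypot(tx - x, ty - y)
        if dist < self.reach_radius:
            self.waypoints.pop(0)          # waypoint reached, advance to the next
            return x, y
        step = min(speed * dt, dist)       # move a short step toward the target
        return x + step * (tx - x) / dist, y + step * (ty - y) / dist

# A path "drawn" as three points, followed from the origin.
follower = WaypointFollower([(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)])
pos = (0.0, 0.0)
for _ in range(200):
    pos = follower.step(*pos)

Commanding a swarm would then just mean handing the same traced path, or a breadcrumb trail, to several of these followers at once.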

The system can integrate a variety of data sets, like city maps, building blueprints and more. You can pan and zoom in on any map point, and you can even integrate video feeds from individual robots so you can see things from their perspective.

As Micire describes it, current disaster-response methods can’t automatically compile and combine information to search for patterns. A smart system would integrate data from all kinds of sources, including commanders, individuals and robots in the field, computer-generated risk models, and more.

Emergency responders might not have the time or opportunity to get in-depth training on new technologies, so a simple touchscreen control system like this could be more useful than an interface that demands extensive instruction. At the very least, it seems like a much more intuitive way to control future robot armies.


Monday, April 13, 2009

Our ears may have built-in passwords





You are the victim of identity theft and the fraudster calls your bank to transfer money into their own account. But instead of asking them for your personal details, the bank assistant simply presses a button that causes the phone to produce a brief series of clicks in the fraudster's ear. A message immediately alerts the bank that the person is not who they are claiming to be, and the call is ended.

Such a safeguard could one day be commonplace, if a new biometric technique designed to identify the person on the other end of a phone line proves successful. The concept relies on the fact that the ear not only senses sound but also makes noises of its own, albeit at a level only detectable by supersensitive microphones.

If those noises prove unique to each individual, it could boost the security of call-centre and telephone-banking transactions and reduce the need for people to remember numerous identification codes. Stolen cellphones could also be rendered useless by programming them to disable themselves if they detect that the user of the phone is not the legitimate owner.

Called otoacoustic emissions (OAEs), the ear-generated sounds emanate from within the spiral-shaped cochlea in the inner ear. They are thought to be produced by the motion of hair cells within the outer part of the cochlea. Typically, sounds entering the ear cause these outer hair cells to vibrate, and these vibrations are converted to electrical signals which are transmitted along the auditory nerve, allowing the sound to be sensed. Crucially, these cells also create their own sounds as they expand and contract.

That's because "hearing is an active process - the ear actually puts energy into the incoming sound waves to replace energy lost as sound is absorbed by the ear's structure", says Stephen Beeby, an engineer at the University of Southampton, UK, who is leading the research. "This process helps us hear things we otherwise would not, but as a result some of the energy added by the hair cells escapes as OAEs."

Predicted in the 1940s but not detected until ultralow-noise microphones were developed in the 1970s, OAEs can be provoked when a series of clicks is played into the ear. The returning sound emissions comprise signals of between 0 and 5 kilohertz, and vary in amplitude. Click tests are already used to check newborn babies' ears for signs of hearing difficulties, since the OAEs are weaker if the inner ear is defective.

What sparked the interest of Beeby and his colleagues is the fact that the power and frequency distribution in the OAEs provoked by specific series of clicks seem to be highly distinctive, driven by the internal shape of the person's ear. "Anecdotally, audiologists say they can tell different people apart - men, women, even people of different ethnic origins - by the profile of the widely varying types of emissions the clicks evoke," he says.

So with funding from the UK's Engineering and Physical Sciences Research Council, Beeby's team is attempting to work out if OAE patterns can be used in biometry, like iris scans or fingerprints. "In the controlled conditions of a lab, everybody's emissions are indeed different, but whether this is a practical way of telling people apart as a real-world biometric still needs a lot of work," he admits.
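
In signal-processing terms, the recipe Beeby describes boils down to: play a click train, record the faint response through a sensitive microphone, look at its power spectrum below about 5 kilohertz, and compare that spectral profile against one recorded when the user enrolled. A rough Python sketch of the comparison step, assuming the responses have already been captured as audio arrays (the sample rate, similarity measure and threshold are our assumptions, not details from the project):

import numpy as np

SAMPLE_RATE = 44_100  # Hz; assumed capture rate of the earpiece microphone

def oae_features(recording, fmax=5_000.0):
    """Reduce a click-evoked recording to its normalised power spectrum below fmax Hz."""
    spectrum = np.abs(np.fft.rfft(recording)) ** 2
    freqs = np.fft.rfftfreq(len(recording), d=1.0 / SAMPLE_RATE)
    band = spectrum[freqs <= fmax]
    return band / (np.linalg.norm(band) + 1e-12)

def matches(template, probe, threshold=0.9):
    """Cosine similarity between the enrolled and live spectra; crude accept/reject."""
    score = float(np.dot(template, probe))
    return score >= threshold, score

# Synthetic noise stands in for real recordings here.
rng = np.random.default_rng(0)
enrolled = oae_features(rng.normal(size=4096))
live = oae_features(rng.normal(size=4096))
print(matches(enrolled, live))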

There are a number of problems that must be dealt with, he says. In subjects who have been drinking alcohol, for example, the emissions are deadened. Different drugs also alter the amplitude of OAEs, as do ear infections or wax build-up.

If the technique proves itself by the project's deadline in mid-2010, the team hopes to interest electronics firms in making headsets or cellphones with a supersensitive microphone in the earpiece. The rest is done in software, says Beeby.

Establishing a new biometric, however, is a huge task. Tony Mansfield, head of biometrics assessment at the UK's National Physical Laboratory in Teddington, says the team will have to prove not only that their technique has a low false-match rate, but also that a person's recorded OAE will match their OAEs over the long term. "It has to be able to reliably recognise people over long time periods," he says. "For example, a fingerprint taken from a 20-year-old is still valid when they are 60."
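
In evaluation terms, those two requirements map onto two error rates measured against whatever decision threshold the system uses: how often someone else's ear scores above it (a false match), and how often the genuine owner's ear, re-measured months or years later, scores below it (a false non-match). A tiny illustration, with made-up scores:

def error_rates(genuine_scores, impostor_scores, threshold):
    """False non-match and false match rates at a given decision threshold.
    genuine_scores: comparisons of a stored template against the same ear later on;
    impostor_scores: comparisons against other people's ears."""
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    return fnmr, fmr

# Two genuine re-tests and two impostor attempts, scored 0-1.
print(error_rates([0.95, 0.88], [0.40, 0.91], threshold=0.9))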
