Micromouse, a Robotic Maze-Solving Competition

Micromouse is an event where small robotic “mice” race to the center of a maze as quickly as possible, after having explored it beforehand. This year’s winner finished in less than 4 seconds; you have to watch it to appreciate just how incredible that is:

Below is a video of the exploratory phase, which to me is even more remarkable. The robot figures out its route pretty quickly, in less than two minutes. If it had good cameras on its sides so it could check dead-ends without running up to them, it would probably blaze right through. 
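For the curious: the classic way micromouse solvers pick a route is flood fill, a breadth-first pass that labels every cell with its distance to the goal and then lets the mouse follow the decreasing numbers. Here’s a minimal sketch of that idea in Python (the wall-map format is my own invention for illustration; real mice rebuild the map continuously as they sense new walls):

```python
from collections import deque

def flood_fill(walls, goal):
    """Label every reachable cell with its step distance to the goal.

    walls[(x, y)] is the set of open directions out of cell (x, y),
    e.g. {'N', 'E'} -- an assumed format, just for this sketch.
    """
    moves = {'N': (0, 1), 'S': (0, -1), 'E': (1, 0), 'W': (-1, 0)}
    dist = {goal: 0}
    queue = deque([goal])
    while queue:
        x, y = queue.popleft()
        for d in walls.get((x, y), set()):
            dx, dy = moves[d]
            nxt = (x + dx, y + dy)
            if nxt not in dist:
                dist[nxt] = dist[(x, y)] + 1
                queue.append(nxt)
    return dist

# To drive, the mouse just steps to any open neighbor with a lower
# distance value; during exploration it re-floods whenever it
# discovers a new wall, and the speed run follows the final gradient.
```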

I imagine the technology and techniques used here are relevant for robots that will navigate other environments – the Roomba comes to mind, but I’m sure there are more important industrial applications.


Robot Can Control a Human Arm

Using electrodes taped to a human test subject’s arm, a robot was able to manipulate the person’s arm alongside its own arms, coordinating an action between them. This is relevant to the pursuit of robots that can assist paralyzed individuals: the robot can lend its own body while also helping the paralyzed person move their own limbs. Below is a video showing this robot in action:


From Automaton:

The robot controls the human limb by sending small electrical currents to electrodes taped to the person’s forearm and biceps, which allows it to command the elbow and hand to move. In the experiment, the person holds a ball, and the robot a hoop; the robot, a small humanoid, has to coordinate the movement of both arms to successfully drop the ball through the hoop…

“Imagine a robot that brings a glass of water to a person with limited movements,” says Bruno Vilhena Adorno, the study’s lead researcher. “From a medical point of view, you might want to encourage the person to move more, and that’s when the robot can help, by moving the person’s arm to reach and hold the glass.”

Another advantage, he adds, is that capable robotic arms are still big, heavy, and expensive. By relying on a person’s physical abilities, robotic arms designed to assist people can have their complexity and cost reduced. Many research teams are teaching robots how to perform bimanual manipulations, and Adorno says it seemed like a natural step to bring human arms into the mix…

The researchers emphasize that the control of the human arm doesn’t have to be precise, just “good enough” to place it inside the robot’s workspace. They claim that having a robot able to control a person’s arm is better than having a very dexterous robot and a person’s weak, unsteady limb…

He plans to continue the project and adds that they’re improving the electrical stimulation; they’re now able to move the elbow in both directions, for example. Eventually they hope to be able to move the arm to any point in space.

The basic idea, then, is that it’s difficult to provide assistance to people if they can’t effectively use their own limbs, so why not have their helper robot move their limbs for them? 
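The article doesn’t spell out the control law, but the “good enough” framing suggests something as simple as closed-loop proportional control over the stimulation current. Here’s a toy sketch of that idea; the gain, current limit, and single-joint model are all my own illustrative assumptions, not the researchers’ actual method:

```python
def fes_step(target_angle, measured_angle, gain=0.5, max_mA=10.0):
    """One step of a toy proportional controller for elbow stimulation.

    Flexion error drives the biceps electrode, extension error the
    forearm electrode. Every number here is a placeholder; a real
    FES controller must be tuned and safety-limited per patient.
    """
    error = target_angle - measured_angle  # degrees
    current = max(-max_mA, min(max_mA, gain * error))
    if current >= 0:
        return {'biceps_mA': current, 'forearm_mA': 0.0}
    return {'biceps_mA': 0.0, 'forearm_mA': -current}
```

Since the arm only has to land inside the robot’s workspace, even a loose controller like this could plausibly be adequate; the robot’s own precise arm then handles the fine alignment.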

I know you’re thinking what I’m thinking: terrifying. Beyond that, it should be noted that neurons that go unstimulated for long enough can die off, so some paralyzed individuals may not have the option of simply receiving outside stimulation for their nerves, since those nerves won’t be intact any more. I imagine this solution, activating neurons from the outside, might head that degeneration off if it’s applied soon after the paralyzing event.

The Telesar V Robot Avatar

Wired has an article about a robot that can be operated like an avatar – it mimics its user’s movements and transmits visual, auditory, and even tactile information back to the user. Here’s a video of this robot in action:

From Wired:

The Telesar V can deliver a remote experience straight to its operator, transmitting sight, sound and touch data using a series of sensors and a 3D head-mounted display. The robot’s operator wears a 3D display helmet, which relays the robot’s entire field of view. A set of headphones transmits what the robot can hear…

With the Telesar V robot, for instance, you can actually feel the shape and temperature of objects, as well as surface unevenness like that of the bumps on the tops of LEGO blocks…

Some nifty telepresence robots — similar to telexistence, but less immersive — are already available in the U.S. Anybots’ QB robot has a webcam in its “head,” relaying visual information to its operator while displaying an image of the person at the helm on a small display underneath the camera. Almost Segway-esque in appearance, the QB is a two-wheeled apparatus controlled remotely via desktop. At $15,000 a pop, though, it’s designed more for corporations that need to check in on remote offices than for the average consumer.

As far as movement goes, the Telesar V has 17 degrees of freedom in the body, 8 in the head, and 7 in the arm joints (the same as a human arm). The hands have 15 degrees of freedom, considerably fewer than the roughly 30 a normal human hand has (and that some other robotic hands emulate), but enough to allow the robot to easily manipulate objects.
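Squeezing a human hand’s roughly 30 degrees of freedom into the robot’s 15 means throwing some motion away somewhere. A common trick on reduced-DOF hands is to couple each finger’s two distal joints into a single actuator; here’s a hypothetical sketch of that kind of retargeting (the joint names and the averaging rule are invented for illustration, not Telesar V’s actual mapping):

```python
def retarget_hand(human_joints):
    """Map human hand joint angles onto a 15-DOF robot hand.

    human_joints: dict of angles in degrees, keyed like 'index_mcp',
    'index_pip', 'index_dip', 'index_abd' (an assumed naming scheme).
    Returns 3 values per finger: base flex, coupled curl, and spread.
    """
    robot = {}
    for finger in ('thumb', 'index', 'middle', 'ring', 'pinky'):
        robot[f'{finger}_base'] = human_joints.get(f'{finger}_mcp', 0.0)
        # Collapse the two distal joints into one coupled actuator.
        pip = human_joints.get(f'{finger}_pip', 0.0)
        dip = human_joints.get(f'{finger}_dip', 0.0)
        robot[f'{finger}_curl'] = (pip + dip) / 2.0
        robot[f'{finger}_spread'] = human_joints.get(f'{finger}_abd', 0.0)
    return robot  # 5 fingers x 3 values = 15 commanded DOF
```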

How useful will these robots be in the future? It’s hard to say. I’m sure there are people who work in dangerous conditions who’d much rather be in a control room – we already see this with Predator and Reaper drones replacing piloted aircraft. Maybe once the technology is affordable, we’ll see robots replacing humans for things like bomb disposal and hazardous chemical jobs as well?

Another possibility mentioned in the article is space exploration, although I question how much a humanoid robot could get done in space. Maybe at some point it’ll be possible to have remote labs on Mars or the moon operated by robot avatars? I think that level of sophistication would probably be overkill, but who knows? Trying to accurately predict the future, especially the march of technology, is a great way to feel really dumb.

Paralyzed Patients Thought-Control a Robotic Arm

A few days ago I wrote about thought-controlled prosthetic limbs, and how a new experiment showed for the first time that monkeys could both control a virtual limb and receive sensory feedback from it. The Associated Press, via Medical Xpress, brings us news of a new DARPA-funded robotic arm capable of relatively complex movement and of sensing touch through sensors on its fingertips.

The article tells the story of a paralyzed patient using this arm – through chips implanted in his brain – that won’t quite be the same in summary, so you should check it out yourself if you can. The short version is that some paralyzed people are taking part in a project known as BrainGate, among other presumably similar projects, where they’re learning to use electrodes implanted in their brains to control robotic third arms. They haven’t yet tried transmitting the arm’s sensory information back to the user, but that’s coming up soon, and it will be awesome.
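The article doesn’t get into the decoding math, but implanted BCI systems like this are commonly described as fitting a roughly linear map from neural firing rates to intended movement velocity. Here’s a minimal sketch of that idea with made-up data shapes (real systems typically use Kalman filters and careful calibration, but the linear-readout core is similar):

```python
import numpy as np

# Calibration data: firing rates recorded while the user imagines
# reaching toward known targets. Shapes are illustrative; 96 channels
# matches a typical implanted electrode array.
rates = np.random.rand(500, 96)      # trials x channels
intended = np.random.rand(500, 3)    # trials x (vx, vy, vz)

# Fit a linear decoder with least squares: rates @ W ~ intended.
W, *_ = np.linalg.lstsq(rates, intended, rcond=None)

def decode_velocity(rate_vector):
    """Map one instant's firing rates to a commanded arm velocity."""
    return rate_vector @ W
```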

All of this also means I have to take back an assumption I made in the monkey post; I didn’t think implanting electrodes into people’s brains (as opposed to recording electrical activity from their scalps) would be a desirable course of action, but it looks like it’s feasible after all. I’m not sure whether the electrodes/chips used in this case had wires coming out or transmitted data wirelessly; I would hope the latter, but even if not, that’s probably a minor hurdle compared to the rest of the problem. I also didn’t realize that a touch-sensitive robotic arm was already in use, although it looks like it only has sensors in its fingertips.

This is fantastic stuff, but it’s also going to raise its own issues in the future, especially if prosthetic limbs eventually function better than regular limbs and become desirable. That’s not a major concern compared to ending paralysis and helping amputees though, so again, this will be awesome.

Robotic Insects: Spy Drones of the Future

The Pentagon has been developing and presumably using small spy drones for a few years now, including a mechanical hummingbird. The next step: outfitting tiny drones with hair-covered membrane wings and compound eyes. From Wired:

MAVs, or Micro Air Vehicles, are tiny, hovering bots that have been deployed for battlefield reconnaissance. But they’re still as limited as they are small. MAVs can’t really navigate urban environments or maintain a stable hover when the wind suddenly shifts…

So the Pentagon just handed out research awards to make bug eye-structures that look out for obstacles and “micro-feather/hair covered membrane wings for a flapping wing MAV” that sense gusts of wind. The goal is to allow these robo-spies to interact with the environment on their own…

Nature will be the engineers’ muse. A project to equip MAVs with hair-like sensors hopes to produce “the flight efficiency and agility of the hawkmoth,” the insect known for its hovering flight patterns. To figure out how MAVs could keep flying smoothly even when the wind pipes up, another group is looking at how hair cells on bees’ bodies sense changes in air flow.

Paduano’s engineering team, with the help of University of Maryland researchers, also wants to give MAVs the kind of crazy compound eyes that insects have. They’re sticking tiny cameras and processors onto a maneuverable aircraft called the “Skate” to replicate what a bug eye does. The cameras will transmit visual cues to an image-processing minicomputer — the kind used in cellphones — which will direct the aircraft to navigate the environment, swerving around corners, if necessary.
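A classic insect-inspired navigation trick (a plausible guess at the kind of processing such a minicomputer would run, though the article doesn’t specify) is balancing optic flow: bees center themselves in corridors by turning away from the side where the visual world appears to move faster. A toy version:

```python
import numpy as np

def steer_from_flow(left_flow, right_flow, gain=0.5):
    """Yaw away from the side with more apparent image motion.

    left_flow, right_flow: arrays of optic-flow magnitudes from the
    left and right halves of the visual field. Returns a yaw-rate
    command (positive = turn right). The gain is a placeholder.
    """
    imbalance = float(np.mean(left_flow) - np.mean(right_flow))
    # More flow on the left means the left wall is closer, so turn
    # right, and vice versa; equal flow means fly straight.
    return gain * imbalance
```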

This is all pretty incredible stuff. If you’re interested you should check out Wired’s video of the hummingbird in flight.
