Robotics

Quadruped robots may one day give seeing-eye dogs a run for their money

This obstacle-avoiding Unitree A1 robot can be steered via directional tugs on its leash

While seeing-eye dogs can be very helpful to the blind, raising and training them is a long and expensive process. Scientists have therefore recently started investigating the possibility of outsourcing the job to dog-like quadruped robots.

According to Guide Dogs of America, it costs approximately US$48,000 to breed, raise and train a single seeing-eye dog.

Although the end user doesn't pay that price, they do have to cover the usual expenses that come with dog ownership. Additionally, the dog typically must be trained by the provider for at least five to eight months, after which further training with the user is required.

By contrast, quadruped robots can be preprogrammed, they don't require food, water or exercise, and they can be purchased for as little as US$1,500. With these and other selling points in mind, Asst. Prof. Shiqi Zhang and colleagues at New York's Binghamton University set about seeing what sort of guide dog a Unitree A1 robot would make.

The commercially available bot comes standard with sensors that allow it to perform functions such as obstacle detection/avoidance, trajectory planning, and navigation – all of which would come in very handy for guiding blind users.

Working with computer science students David DeFazio and Eisuke Hirota, Zhang designed and added an interface to which an ordinary dog leash was attached. Via a reinforcement learning process, custom locomotion control software was then "trained" to turn the robot left or right when the user tugged the leash in the corresponding direction.
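To make the control idea concrete, here is a minimal Python sketch of how a lateral leash-force reading might be turned into a steering command. This is not the Binghamton team's code: the force units, threshold, gain and function names are all hypothetical, and the policy learned through reinforcement learning is replaced by a hand-written placeholder so the flow is easy to follow.

```python
import numpy as np

# Hypothetical values, chosen only for illustration.
TUG_THRESHOLD_N = 2.0   # ignore leash forces weaker than this (newtons)
TURN_GAIN = 0.5         # scales lateral tug force into a yaw-rate command

def leash_to_yaw_command(lateral_force_n: float) -> float:
    """Map a lateral leash force (+ right, - left) to a yaw-rate command.

    In the real system this mapping is learned via reinforcement learning;
    here it is a simple hand-tuned stand-in.
    """
    if abs(lateral_force_n) < TUG_THRESHOLD_N:
        return 0.0  # no meaningful tug: keep the current heading
    return float(np.clip(TURN_GAIN * lateral_force_n, -1.0, 1.0))

# Example: a 4 N tug to the left produces a saturated left (negative) turn.
print(leash_to_yaw_command(-4.0))  # -> -1.0
print(leash_to_yaw_command(1.0))   # -> 0.0 (too weak to count as a tug)
```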

In a recent demonstration, the adapted Unitree A1 was able to guide a person along a hallway, and it readily responded to their directional leash-tugs at an intersection within that hall. It's a good start, but more work still needs to be done.

"Our next step is to add a natural language interface," says Zhang. "So ideally, I could have a conversation with the robot based on the situation to get some help. Also, intelligent disobedience is an important capability. For example, if I’m visually impaired and I tell the robot dog to walk into traffic, we would want the robot to understand that. We should disregard what the human wants in that situation."

And while blind people may conceivably be able to buy such robots one day, it is also envisioned that the bots could be made available for temporary use at difficult-to-navigate locations such as shopping malls and airports.

Source: Binghamton University

2 comments
Eggbones
Why would it use such an inconvenient design? I'd have thought something wearable would be just as capable of providing guidance, but with far less stigma.
Daishi
Boston Dynamics just did a demo with Spot + GPT + text-to-speech that was pretty cool. They prompted it to give it a personality and it was able to describe what it was seeing.

I think seeing-eye dogs are reliable in a way that this is years away from, though.