Mobile Coffee Table Uses Legs To Get Around

For getting around on most surfaces, it’s hard to beat the utility of the wheel. Its versatility, low cost, and the wide range of materials it can be made from have kept it a cornerstone technology for the past ten thousand years or so. But with that much history it can seem a little bit played out. To change up the locomotion game, you might want to consider robotic legs instead. That’s what [Giliam] designed into this mobile coffee table, which uses custom linkages to move its legs and get itself from place to place around the living room.

Continue reading “Mobile Coffee Table Uses Legs To Get Around”

Why Walking Tanks Never Became A Thing

The walking tank concept has always captured imaginations. Whether you’re talking about the AT-AT walkers of Star Wars or the Dreadnoughts from Warhammer 40,000, they are often portrayed in fiction as mighty and capable foes on the battlefield. These legged behemoths ideally combine the firepower and defense of traditional tanks with the versatility of a walking frame.

Despite their futuristic allure, walking tanks never found a practical military application. Let’s take a look at why tracks still rule, and why walking combat machines are going to remain firmly in the realm of fiction for the foreseeable future.

Continue reading “Why Walking Tanks Never Became A Thing”

AI Learns To Walk In 3D Training Grounds

AI agents are learning to do all kinds of interesting jobs, even the creative ones we’d rather handle ourselves. Nevertheless, technology marches on. Working in this area is YouTuber [AI Warehouse], who has been teaching an AI to walk in a simulated environment.

Albert needed some specific guidance to learn how to walk upright, something that humans tend to figure out innately.

The AI controls a vaguely humanoid creature, albeit with a heavily simplified body and limbs. It “lives” in a 3D environment created in the Unity engine, which provides the necessary physics simulation for the work. Meanwhile, the ML-Agents package provides the brain for Albert, the AI charged with learning to walk.

The video steps through a variety of “deep reinforcement learning” tasks. In these, the AI is rewarded for completing goals which are designed to teach it how to walk. Albert is given control of his limbs, and simply charged with reaching a button some distance away on the floor. After many trials, he learns to do the worm, and achieves his goal.

Getting Albert to walk upright took altogether more training. Lumpy ground and walls between him and his goal were used to up the challenge, as were rewards for alternating the use of each foot and maintaining an upright posture. Over time, he progressed from skipping to something approximating a proper walk cycle.
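To give a feel for the kind of reward shaping described above, here’s a minimal Python sketch of a per-step reward that credits progress toward the goal, staying upright, and alternating feet. The structure and every name in it are our own illustration, not the actual ML-Agents C# code used in the video.

```python
from dataclasses import dataclass

@dataclass
class GaitState:
    """Snapshot of the simulated walker for one physics step (all fields hypothetical)."""
    dist_to_button: float       # metres from the torso to the goal button
    prev_dist_to_button: float  # same distance on the previous step
    uprightness: float          # dot product of torso "up" with world up, 1.0 = perfectly upright
    planted_foot: str           # "left", "right", or "none"
    last_planted_foot: str      # foot that made the previous ground contact

def step_reward(s: GaitState) -> float:
    """Shaped per-step reward: progress toward the goal, staying upright,
    and alternating feet all earn a little credit."""
    reward = 0.0
    # Reward closing the distance to the button (and penalize moving away).
    reward += 2.0 * (s.prev_dist_to_button - s.dist_to_button)
    # Small bonus for keeping the torso vertical, scaled by how upright it is.
    reward += 0.05 * max(0.0, s.uprightness)
    # Encourage alternating footfalls rather than shuffling on one foot.
    if s.planted_foot != "none" and s.planted_foot != s.last_planted_foot:
        reward += 0.1
    return reward

# Example: one step where the walker got 3 cm closer, stayed mostly upright,
# and switched from its right foot to its left.
print(step_reward(GaitState(4.97, 5.00, 0.9, "left", "right")))
```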

One may argue that the teaching method required a lot of specific guidance, but it’s a neat feat to achieve nonetheless. It’s altogether more complex than learning to play Trackmania, we’d say, and that was impressive enough in itself. Video after the break.

Continue reading “AI Learns To Walk In 3D Training Grounds”

Virtual Reality Experiment Tricks Your Feet Into Walking While Sitting Down

The whole idea behind virtual reality is that you don’t really know what’s going on in the world around you. You only know what your senses tell you is there. If you can fake out your vision, for example, then your brain won’t realize you are floating in a tank providing power for the robot hordes. However, scientists in Japan think that you can even fool your feet into thinking they are walking when they aren’t. In a recent paper, they describe an experiment that combined audio cues with buzzing on different parts of the feet to simulate the feel of walking.

The trick only requires four transducers, two on each foot. The researchers tested several different visual presentations in the participants’ virtual reality headgear. Trials shown from a third-person perspective didn’t cause subjects to associate the foot vibrations with walking, but the first-person perspective did produce the sensation of walking, with a full-body avatar working best compared to showing just hands and feet or no avatar at all.
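As a rough illustration of the approach (and not the authors’ actual hardware or timing), a heel-then-toe vibration schedule synchronized to a walking cadence might be generated something like this Python sketch; every value and name here is made up:

```python
def footstep_haptic_schedule(steps: int = 6, cadence_s: float = 0.6,
                             heel_toe_gap_s: float = 0.25, pulse_s: float = 0.12):
    """Build a list of (time_s, foot, transducer, duration_s) vibration events
    that mimics a walking rhythm: heel strike, then toe-off, alternating feet.
    Purely illustrative -- real timings would be tuned to the footstep audio."""
    events = []
    for i in range(steps):
        foot = "left" if i % 2 == 0 else "right"
        t = i * cadence_s
        events.append((round(t, 2), foot, "heel", pulse_s))                  # heel-strike buzz
        events.append((round(t + heel_toe_gap_s, 2), foot, "toe", pulse_s))  # toe-off buzz
    return events

for event in footstep_haptic_schedule():
    print(event)
```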

Making people think they are walking in VR can be tricky, but it does explain how they fit all that stuff in a little holodeck. Of course, it would be nice if you could also sense walking and use it to move your avatar, but that’s another problem.

Open Exosuit Project Helps Physically Challenged Put One Foot In Front Of Another

Humans make walking look simple, but of course that’s an illusion easily shattered by even small injuries. Losing the ability to walk has an enormous impact on every part of your day, so rehabilitative advances are nothing short of life-changing. The Open Exosuit for Differently Abled project is working feverishly on their Hackaday Prize entry to provide a few different layers of help in getting people back on their feet.

We’ve seen a number of exosuit projects in the past, and all of them struggle in a few common places. It’s difficult to incorporate intuitive user control into these builds, and quite important that they stay out of the way of the user’s own balance. This one approaches those issues with a walker that both provides a means of steadying oneself and facilitates sending commands to the exosuit. Using the OLED screen and buttons incorporated on the walker, the user can select and control the walking, sitting, and standing modes.
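As a sketch of the kind of mode selection that walker interface implies (hypothetical names throughout, and not the project’s actual firmware), a tiny state machine could gate which transitions the buttons are allowed to trigger:

```python
from enum import Enum, auto

class Mode(Enum):
    STANDING = auto()
    WALKING = auto()
    SITTING = auto()

# Transitions the walker's buttons are allowed to trigger; e.g. the suit should
# not jump straight from sitting to walking without standing up first.
ALLOWED = {
    Mode.SITTING:  {Mode.STANDING},
    Mode.STANDING: {Mode.WALKING, Mode.SITTING},
    Mode.WALKING:  {Mode.STANDING},
}

class ExosuitController:
    """Tiny mode state machine, as it might sit behind the walker's OLED menu."""
    def __init__(self):
        self.mode = Mode.SITTING

    def request(self, new_mode: Mode) -> bool:
        """Called when the user confirms a menu selection; returns True if accepted."""
        if new_mode in ALLOWED[self.mode]:
            self.mode = new_mode
            return True
        return False  # e.g. sitting -> walking is rejected; stand up first

ctrl = ExosuitController()
print(ctrl.request(Mode.WALKING))   # False: must stand first
print(ctrl.request(Mode.STANDING))  # True
print(ctrl.request(Mode.WALKING))   # True
```

Rejecting a sitting-to-walking jump, for instance, forces the user through a standing transition first, which is the sort of guard rail you’d want before the suit starts moving legs.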

The exoskeleton is meant to provide assistance to people with weakness or a lack of motor control. They still walk and balance for themselves, but the hope is that these devices will be an aid at times when human caregivers are not available and the alternative would be unsteady mobility or no mobility at all. Working with the assistive device also helps the user keep building strength on the road to recovery.

The team is hard at work on the design, and with less than two weeks left before the entry deadline of the 2020 Hackaday Prize, we’re excited to see where the final push will bring this project!

Continue reading “Open Exosuit Project Helps Physically Challenged Put One Foot In Front Of Another”

Hybrid Robot Walks, Transforms, And Takes Flight

[Project Malaikat] is a 3D printed hybrid bipedal walker and quadcopter robot, but there’s much more to it than just sticking some props and a flight controller to a biped and calling it a day. Not only is it a custom design capable of a careful, deliberate two-legged gait, but the props are tucked away and deployed on command via some impressive-looking linkages that allow it to transform from walking mode to flying mode.

Creator [tang woonthai] has the 3D models available for download (.rar file) and the video descriptions on YouTube contain a bill of materials, but beyond that there doesn’t seem to be much other information available about [Malaikat]. The creator does urge care to be taken should anyone use the design, because while the robot may be small, it does essentially have spinning blades for hands.

Embedded below are videos that show off the robot’s moves, as well as a short flight test demonstrating that while control was somewhat lacking during the test, the robot is definitely more than capable of actual flight.

Continue reading “Hybrid Robot Walks, Transforms, And Takes Flight”

Learning Software In A Soft Exosuit

Wearables and robots don’t often intersect, because most robots rely on rigid bodies and programming while we don’t. Exoskeletons are an instance where robots interact with our bodies, and a soft exosuit is even closer to our physiology. Machine learning is closer to our minds than a simple state machine. The combination of machine learning software and a soft exosuit is a match made in heaven for the Harvard Biodesign Lab and Agile Robotics Lab.

The machine learning system studies a walker’s steady gait for twenty periods while vitals are monitored to assess how much energy is being expended. After watching, the trained system assists instead of assessing. This type of personalization has been done in the past, but the addition of machine learning shows that the necessary customization can be built into each machine without a team of humans doing the tuning.
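In spirit, that personalization is an optimizer searching over assistance parameters while treating the wearer’s measured effort as the cost to minimize. Here’s a toy random-search sketch of that idea in Python; the parameter names, the stand-in cost function, and the optimizer choice are all our assumptions, not the labs’ published method.

```python
import random

def measured_metabolic_cost(onset_pct: float, peak_force_n: float) -> float:
    """Stand-in for real measurements (a made-up bowl-shaped function);
    in the lab this would come from monitoring the wearer's vitals over many gait cycles."""
    return (onset_pct - 42.0) ** 2 / 50.0 + (peak_force_n - 300.0) ** 2 / 2000.0 + 2.8

def personalize(iterations: int = 30, seed: int = 0):
    """Simple random-search loop: propose nearby assistance settings,
    keep the ones that lower the wearer's measured effort."""
    rng = random.Random(seed)
    onset, force = 30.0, 200.0  # starting assistance timing (% gait cycle) and peak force (N)
    best = measured_metabolic_cost(onset, force)
    for _ in range(iterations):
        cand_onset = onset + rng.uniform(-5, 5)
        cand_force = force + rng.uniform(-30, 30)
        cost = measured_metabolic_cost(cand_onset, cand_force)
        if cost < best:
            onset, force, best = cand_onset, cand_force, cost
    return onset, force, best

print(personalize())
```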

Exoskeletons are no strangers to these pages: our 2017 Hackaday Prize gave $1000 to an open-source set of robotic legs, and we’ve reported on an exoskeleton to keep seniors safe.

Continue reading “Learning Software In A Soft Exosuit”