This robo-bug can improvise its walk like a real insect

There are plenty of projects out there attempting to replicate the locomotion of insects, but one thing that computers and logic aren’t so good at is improvising and adapting the way even the smallest, simplest bugs do. This project from Tokyo Tech is a step in that direction, producing gaits on the fly that the researchers never programmed in.

“Perhaps the most exciting moment in the research was when we observed the robot exhibit phenomena and gaits which we neither designed nor expected, and later found out also exist in biological insects,” enthused the lead researcher, Ludovico Minati, in a news release.

One could program an immensely complicated AI or pattern generator to respond instantly to any of a thousand situations. But if a bug with a brain the size of a grain of sand can adapt to new situations quickly and smoothly, there must be a simpler, more analog way.

Different gaits produced by different patterns — okay, they don’t look that different, but they definitely are.

That’s what Minati was looking into, and his hexapod robot is certainly a simpler approach. A central pattern generator produces a master signal, which is interpreted by analog arrays and sent to the oscillators that move the legs. All it takes is tweaking one of five basic parameters and the arrays reconfigure their circuits and produce a working gait.
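The team's controller is analog hardware, but the core idea, a handful of high-level parameters selecting an entire gait, can be sketched digitally. Below is a toy stand-in (my own illustration; the discrete-phase scheme and all names are invented, not from the paper) in which a single phase-lag parameter switches a six-legged stepping pattern between an alternating tripod gait and a metachronal wave:

```python
def cpg_gait(lag, n_legs=6, period=12, steps=24):
    """Toy central pattern generator: every leg follows the same cycle,
    but leg i runs `lag` ticks behind leg i-1. Changing that one
    high-level parameter changes the whole gait."""
    gait = []
    for t in range(steps):
        # A leg is in stance (foot down) for the first half of its cycle.
        gait.append([((t - i * lag) % period) < period // 2
                     for i in range(n_legs)])
    return gait

# Half-period offset: legs 0, 2, 4 step in antiphase with 1, 3, 5 (tripod).
tripod = cpg_gait(lag=6)
# Small offset: a stepping wave ripples down the body, leg by leg.
wave = cpg_gait(lag=2)
```

The point of the sketch is the economy Koike describes: one number, not a lookup table of leg trajectories, determines the whole coordination pattern.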

“An important aspect of the controller is that it condenses so much complexity into only a small number of parameters. These can be considered high-level parameters, in that they explicitly set the gait, speed, posture, etc.,” said one of Minati’s colleagues, Yasuharu Koike.

Simplifying the hardware and software needed for adaptable, reliable locomotion could ease the creation of small robots and their deployment in unfamiliar terrain. The paper describing the project is published in IEEE Access.

‘Post-reality’ video of CG imagery projected on a dancing man at high framerates

Not sure what there is to add to the headline, really. Well, I guess I should probably explain a bit.

Back in 2016 (on my birthday in fact) researchers from the University of Tokyo posted an interesting video showing a projector and motion tracking system working together to project an image onto moving, deforming surfaces like a flapping piece of paper or dancing person’s shirt.

Panasonic one-upped this with a more impressive display the next year, but the original lab has clapped back with a new video (spotted by New Atlas) that combines the awkwardness of academia with the awkwardness of dancing alone in the dark. And a quote from “The Matrix.”

Really though, it’s quite cool. Check out the hardware:

This dynamic projection mapping system, which they call DynaFlash v2, operates at 947 frames per second, using a depth-detection system running at the same rate to determine exactly where the image needs to be.

Not only does this let an image follow a person’s movement and orientation, it also tracks deformations in the material, such as stretching or the natural contortions of the body in motion.
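In essence the system runs one tight loop nearly a thousand times a second: sense where the surface is, re-render the image for that position, project. Here is a deliberately flat 2D sketch of the "follow the target" step (an invented illustration; the real system estimates full 3D deformation, not a simple translation):

```python
def project_onto_target(texture, target_origin, screen_size):
    """Toy dynamic-mapping step: place the texture so its top-left
    corner lands on the tracked target origin, clipping anything
    that falls off the projector's screen."""
    w, h = screen_size
    frame = [[0] * w for _ in range(h)]  # blank projector frame
    ox, oy = target_origin               # tracked position this frame
    for y, row in enumerate(texture):
        for x, value in enumerate(row):
            sx, sy = ox + x, oy + y
            if 0 <= sx < w and 0 <= sy < h:
                frame[sy][sx] = value
    return frame
```

Repositioning the texture every frame like this, fast enough that the eye never catches the lag, is what keeps the image pinned to a moving shirt instead of smearing behind it.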

The extreme accuracy of this process makes for strange possibilities. As Ishikawa Watanabe, the leader of the lab, puts it:

The capacity of the dynamic projection mapping linking these components is not limited to fusing colorful unrealistic texture to reality. It can freely reproduce gloss and unevenness of non-existing materials by adaptively controlling the projected image based on the three-dimensional structure and motion of the applicable surface.

Perhaps it’s easier to show you:

Creepy, right? It’s using rendering techniques most often seen in games to produce the illusion that there’s light shining on non-existent tubes on the dancer’s body. The illusion is remarkably convincing.
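The gloss trick is standard real-time shading: compute, per pixel, how strongly a virtual light would glint off a surface with a given orientation, then project that brightness onto the real body. Here is a minimal sketch of the Blinn-Phong specular term games typically use for this (my own example; the lab hasn't said which shading model it uses):

```python
import math

def blinn_phong_intensity(normal, light_dir, view_dir, shininess=32.0):
    """Blinn-Phong specular term: brightness peaks when the surface
    normal bisects the light and view directions, producing a tight
    glossy highlight as the surface (or light) moves."""
    def norm(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)
    n, l, v = norm(normal), norm(light_dir), norm(view_dir)
    h = norm(tuple(a + b for a, b in zip(l, v)))  # half vector
    return max(0.0, sum(a * b for a, b in zip(n, h))) ** shininess

# Light and viewer straight ahead of the surface: full highlight.
head_on = blinn_phong_intensity((0, 0, 1), (0, 0, 1), (0, 0, 1))
# Light grazing in from the side: the highlight falls off sharply.
grazing = blinn_phong_intensity((0, 0, 1), (1, 0, 0.1), (0, 0, 1))
```

Because the depth system supplies a fresh surface normal every frame, recomputing this term per frame makes the fake highlight slide across the dancer's body exactly as a real reflection would.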

It’s quite a different approach to augmented reality, and while I can’t see it in many living rooms, it’s clearly too cool to go unused — expect this to show up in a few cool demos from tech companies and performance artists or musicians. I can’t wait to see what Watanabe comes up with next.

SpaceX’s spacesuited Starman mannequin serves a real purpose

SpaceX put a “Starman” into space today on a path toward a wide, looping orbit of Mars and Earth. The “Starman” was actually a mannequin wearing an official SpaceX crew flight suit, and it turns out it was more than just a fun payload for a rocket that stood every chance of exploding mid-flight.

Elon Musk revealed on a press call following the Falcon Heavy launch on Tuesday that the mannequin was wearing an actual production SpaceX crew spacesuit, rather than a non-functional prototype or mock-up. The suit, which the SpaceX CEO revealed last year via Instagram, will eventually clothe SpaceX astronauts flying on board Crew Dragon, the capsule it’s developing to bring real people to space, with a target initial launch date of later this year if all goes to plan.

The suit, developed in-house by SpaceX, features a sleeker design than most spacefaring flight suits you’ll find. It’s a design that came with a price, however: Musk said that combining style and function was a particular challenge in a spacesuit.

“I mean, it’s a dangerous trip, you want to look good,” he said. “It’s easy to make a spacesuit that looks good but doesn’t work, it’s really hard to make a spacesuit that works, and looks good.”

And the suit does look good: It’s a stylish black and white, with clean lines and a helmet that looks like it’s been pulled from a sci-fi film with excellent costume design.

The suit, as mentioned, has more than good looks, however. NASA’s qualification requirements for crewed launches dictate that the suit be tested in the correct conditions, so Starman is serving SpaceX’s larger goal of providing crewed flight capabilities, too.

“It definitely works though,” Musk added. “You can just put it on and jump in a vacuum chamber.”

Behind the scenes of SpaceX’s Falcon Heavy launch day prep

SpaceX is launching its Falcon Heavy rocket tomorrow, and if it’s successful, it’ll have roughly twice the cargo capacity of its next closest active rival. That will help give SpaceX an edge in the growing private space race, open up new opportunities with potential clients, and set the stage for traveling to Mars.

The launch itself is happening on Tuesday, February 6 at 1:30 PM EST, weather permitting. The window lasts until 4 PM EST, however, so if conditions are good within that time the launch should go off as planned. There’s a backup window on February 7, which also starts at 1:30 PM EST, and we’ll be there live to watch it happen and report back all the news right here on TechCrunch.

This humanoid robot works out (and sweats) like we do (or should)

There are plenty of humanoid-looking robots out there, but very few actually have bodies that are particularly analogous to our own when it comes to moving and interacting with the environment. Japanese researchers are working to remedy that with a robot designed specifically to mimic not just human movements but the way humans actually accomplish those movements. Oh, and it sweats.

Kengoro is a new-ish robot (an earlier version made the rounds last year) that emphasizes flexibility and true humanoid structure rather than putting power or efficiency above all else.

As the researchers explain in their paper, published today in Science Robotics:

A limitation of conventional humanoids is that they have been designed on the basis of the theories of conventional engineering, mechanics, electronics, and informatics.

By contrast, our intent is to design a humanoid based on human systems — including the musculoskeletal structure, sensory nervous system, and methods of information processing in the brain — to support science-oriented goals, such as gaining a deeper understanding of the internal mechanisms of humans.

The paper uses Kengoro and a similar robot, Kenshiro, as examples of how to accomplish that intent; indeed, the whole issue of Science Robotics was dedicated to the concept of improving anthropomorphic robotics.

It’s important, they explain, to imitate human biology wherever possible, not just where it’s convenient. If your robot has powerful arms but a stiff, straight spine and no neck, that may be better for lifting heavy items — but it just isn’t how humans do it, and if human-like motion is actually desired, you essentially have to put in our weaknesses as well as our strengths.

And truly human-like motion should be desired, if a robot is supposed to exist in human-centric environments and interact with people.

After putting together Kengoro with muscle-, joint- and bone-like arrangements of motors and struts, the researchers put these similarities to the test by having the robot attempt a number of ordinary exercises, from push-ups to calf raises.

All the way down! Are you a robot or a soft, weak human?

Use that anger!

As you can see, he’s a little jittery (“he” because the robot is modeled after an average 13-year-old Japanese boy). He probably should have stretched first. Still, he probably did more crunches for this article than I did this year.

The sweating thing probably deserves a little explanation. Essentially the motors have water running through them to help cool them off as they work, and they can expel that water through artificial pores in order to more quickly release heat. It’s not exactly a critical feature, but if you’re going to mimic humanity, you might as well go all the way.

It’s an interesting and unsurprisingly complex endeavor that Yuki Asano et al. are pursuing, but the results already seem worthwhile, and the applications they envision are promising. The “human mimetic humanoid” project is ongoing, so expect more from Kengoro in the near future.