This robo-bug can improvise its walk like a real insect


There are plenty of projects out there attempting to replicate the locomotion of insects, but one thing that computers and logic aren’t so good at is improvising and adapting the way even the smallest, simplest bugs do. This project from Tokyo Tech is a step in that direction, producing gaits on the fly that the researchers never programmed in.

“Perhaps the most exciting moment in the research was when we observed the robot exhibit phenomena and gaits which we neither designed nor expected, and later found out also exist in biological insects,” enthused the lead researcher, Ludovico Minati, in a news release.

One could program an immensely complicated AI or pattern generator to respond instantly to any of a thousand situations. But if a bug with a brain the size of a grain of sand can adapt to new situations quickly and smoothly, there must be a simpler, more analog way.

[Image: Different gaits produced by different patterns — okay, they don’t look that different, but they definitely are.]

That’s what Minati was looking into, and his hexapod robot is certainly a simpler approach. A central pattern generator produces a master signal, which is interpreted by analog arrays and sent to the oscillators that move the legs. All it takes is tweaking one of five basic parameters and the arrays reconfigure their circuits and produce a working gait.
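
Minati’s controller is analog hardware, so code can only gesture at the idea, but the core concept of a shared rhythm plus a few high-level knobs that pin the phase relationships between legs is easy to sketch. Here’s a minimal, purely illustrative Python model; none of the names or constants come from the paper:

```python
import numpy as np

# Toy illustration of the central-pattern-generator idea: six coupled
# phase oscillators, one per leg, whose relative phases are pinned by a
# single high-level "gait" parameter. The actual controller reconfigures
# analog circuits; every name and constant here is illustrative only.

LEGS = 6

# Per-leg phase offsets for two classic hexapod gaits (fractions of a cycle).
GAITS = {
    "tripod": np.array([0.0, 0.5, 0.0, 0.5, 0.0, 0.5]),  # alternating tripods
    "wave": np.arange(LEGS) / LEGS,                       # one leg at a time
}

def step_cpg(phases, gait, freq_hz=2.0, coupling=4.0, dt=0.005):
    """Advance each leg's phase, pulling it toward its gait-defined offset."""
    target = GAITS[gait]
    # Phase error relative to leg 0, wrapped into [-0.5, 0.5) cycles.
    err = (target + phases[0] - phases + 0.5) % 1.0 - 0.5
    return (phases + dt * (freq_hz + coupling * err)) % 1.0

phases = np.random.rand(LEGS)        # start from a random, uncoordinated state
for _ in range(2000):                # ~10 seconds of simulated time
    phases = step_cpg(phases, "tripod")

# Each leg's drive signal (e.g. a hip servo command) is a function of its phase.
print(np.round(np.sin(2 * np.pi * phases), 2))
```

Switching the single gait argument from "tripod" to "wave" reorganizes all six legs at once, which is the flavor of tweak-one-parameter, get-a-new-gait behavior the researchers describe.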

“An important aspect of the controller is that it condenses so much complexity into only a small number of parameters. These can be considered high-level parameters, in that they explicitly set the gait, speed, posture, etc.,” said one of Minati’s colleagues, Yasuharu Koike.

Simplifying the hardware and software needed for adaptable, reliable locomotion could ease the creation of small robots and their deployment in unfamiliar terrain. The paper describing the project is published in IEEE Access.

‘Post-reality’ video of CG imagery projected on a dancing man at high framerates


Not sure what there is to add to the headline, really. Well, I guess I should probably explain a bit.

Back in 2016 (on my birthday, in fact) researchers from the University of Tokyo posted an interesting video showing a projector and motion tracking system working together to project an image onto moving, deforming surfaces like a flapping piece of paper or a dancing person’s shirt.

Panasonic one-upped this with a more impressive display the next year, but the original lab has clapped back with a new video (spotted by New Atlas) that combines the awkwardness of academia with the awkwardness of dancing alone in the dark. And a quote from “The Matrix.”

Really though, it’s quite cool. Check out the hardware:

This dynamic projection mapping system, which they call DynaFlash v2, operates at 947 frames per second, using a depth-detection system running at the same rate to determine exactly where the image needs to be.

Not only does this let an image follow a person’s movement and orientation, it also tracks deformations in the material, such as stretching or the natural contortions of the body in motion.
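
To get a feel for why the frame rate matters, here’s a toy version of the per-frame loop in Python. None of this is DynaFlash code; the sensor, warp and projector below are all stand-ins. The interesting part is the time budget: at 947 fps the whole cycle gets roughly a millisecond per frame.

```python
import numpy as np

# Conceptual sketch only; the depth sensor, warp and projector are stand-ins.

H, W = 120, 160                        # toy resolution

def capture_depth():
    """Stand-in for the high-speed depth sensor: a synthetic bulging surface."""
    y, x = np.mgrid[0:H, 0:W]
    return 1.0 + 0.1 * np.sin(x / 20.0) * np.cos(y / 15.0)   # meters

def warp_texture(texture, depth):
    """Shift pixels in proportion to depth, a crude proxy for re-fitting
    the image onto the measured 3D surface each frame."""
    shift = ((depth - depth.min()) * 50).astype(int)
    cols = (np.arange(W)[None, :] + shift) % W
    return texture[np.arange(H)[:, None], cols]

def project(frame):
    pass                               # stand-in for driving the projector

texture = np.random.rand(H, W)         # the image we want glued to the surface
print(f"per-frame budget: {1000 / 947:.2f} ms")

for _ in range(947):                   # one simulated second
    depth = capture_depth()            # 1. sense the surface
    frame = warp_texture(texture, depth)   # 2. re-fit the image to it
    project(frame)                     # 3. display before the surface moves
```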

The extreme accuracy of this process makes for strange possibilities. As Ishikawa Watanabe, the leader of the lab, puts it:

The capacity of the dynamic projection mapping linking these components is not limited to fusing colorful unrealistic texture to reality. It can freely reproduce gloss and unevenness of non-existing materials by adaptively controlling the projected image based on the three-dimensional structure and motion of the applicable surface.

Perhaps it’s easier to show you:

Creepy, right? It’s using rendering techniques most often seen in games to create the illusion that light is shining on non-existent tubes on the dancer’s body, and the effect is remarkably convincing.
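
The game-style trick in question is, at heart, per-pixel lighting: take a surface normal, a virtual light and a viewer, and compute how bright the highlight should be. A tiny Blinn-Phong sketch, illustrative only and not the lab’s code, shows the kind of math involved:

```python
import numpy as np

# Illustrative only: given a surface normal (here it would come from the
# measured body geometry), a virtual light and a viewer, compute the
# specular highlight the projector should paint at that point.

def blinn_phong(normal, light_dir, view_dir, shininess=32.0):
    """Specular intensity for unit-length normal, light and view vectors."""
    half = light_dir + view_dir            # halfway vector
    half /= np.linalg.norm(half)
    return max(np.dot(normal, half), 0.0) ** shininess

n = np.array([0.0, 0.0, 1.0])              # surface facing the viewer
l = np.array([0.3, 0.3, 0.9]); l /= np.linalg.norm(l)
v = np.array([0.0, 0.0, 1.0])
print(blinn_phong(n, l, v))                # a bright spot where vectors align
```

Computed per pixel and projected onto the tracked body fast enough, the result reads as real illumination on geometry that isn’t there.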

It’s quite a different approach to augmented reality, and while I can’t see it in many living rooms, it’s clearly too cool to go unused — expect this to show up in a few cool demos from tech companies and performance artists or musicians. I can’t wait to see what Watanabe comes up with next.

Intel ships update for newest Spectre-affected chips


Intel has announced that the fix is out for its latest chips affected by Spectre, the speculative execution flaw that can leak protected memory on practically all modern computing hardware. The patch covers the Skylake generation (late 2015) and newer, though most users will still have to wait for the code to be implemented by whoever manufactured their computer (specifically, their motherboard).

The various problems presented in January by security researchers have to be addressed by a mix of fixes at the application, OS, kernel and microarchitecture level. This patch is the latter, and it replaces an earlier one that was found to be unstable.

These aren’t superficial tweaks and they’re being made under pressure, so some hiccups are to be expected — but Intel is also a huge company that has had months of warning to get this right, so people may be frustrated by the less-than-optimal way the flaws have been addressed.

As before, there isn’t much you as a user can do except check frequently that your PC and applications are up to date — in addition, of course, to not running any strange code.
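
If you want one concrete check, here’s a hedged, Linux-only example: the kernel exposes the microcode revision each CPU core booted with in /proc/cpuinfo, and that number should change once your motherboard vendor or OS actually delivers the new code.

```python
# Linux-only and only a partial signal, but the kernel does report the
# microcode revision each CPU core booted with; the value should change
# after a successful firmware or OS-delivered microcode update.

def microcode_revisions(path="/proc/cpuinfo"):
    """Collect the distinct microcode revisions listed in /proc/cpuinfo."""
    revs = set()
    with open(path) as f:
        for line in f:
            if line.startswith("microcode"):
                revs.add(line.split(":", 1)[1].strip())
    return revs

print(microcode_revisions())               # e.g. {'0xc6'}
```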

If you’re on an older architecture, like Sandy Bridge, you’ll have to wait a bit longer — your fix is still in beta. You don’t want to be their test machine.

Featured Image: Alice Bevan–McGregor/Flickr under a CC BY 2.0 license

Update for iOS and Macs negates text bomb that crashed devices


Last week we reported a major bug in Apple operating systems that would cause them to crash from mere exposure to either of two specific Unicode symbols. Today Apple fixed this text-handling issue with iOS 11.2.6 and a supplemental update to macOS 10.13.3, both now available for download.

The issue, discovered by Aloha Browser in the course of normal development, has to do with poor handling of certain non-English characters. We replicated the behavior, basically an immediate hard crash, in a variety of apps on both iOS and macOS. The vulnerability is listed on MITRE under CVE-2018-4124. If you were curious.
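
To be clear, neither Apple nor this article reproduces the offending string, and the snippet below is not it. It just illustrates the class of problem: in many scripts, one drawn “character” is really several Unicode code points that the renderer must fuse into a single glyph, and that combining step is where text engines tend to break.

```python
import unicodedata

# NOT the crashing string, just an example of a multi-code-point glyph:
# three Devanagari code points that render as a single conjunct character.

cluster = "\u0915\u094D\u0937"      # Devanagari ka + virama + ssa: one glyph
print(len(cluster))                 # 3 code points behind one drawn shape
for ch in cluster:
    print(hex(ord(ch)), unicodedata.name(ch))
```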

Apple was informed of the bug and told TechCrunch last week that a fix was forthcoming — in fact, it was already fixed in a beta. The production patches for both iOS and macOS just dropped in the last few minutes. Apple calls the magical characters a “maliciously crafted string” that led to “heap corruption.” It seems that macOS versions before 10.13.3 aren’t affected, so if you’re running an older OS, no worries.

The iOS patch also fixes “an issue where some third-party apps could fail to connect to external accessories,” which is welcome but unrelated to the text bomb.

You should be able to download both updates right now, and you should, or you’ll probably get pranked in the near future.

This autonomous 3D scanner figures out where it needs to look


If you need to make a 3D model of an object, there are plenty of ways to do so, but most of them are only automated to the extent that they know how to spin in circles around that object and put together a mesh. This new system from Fraunhofer does it more intelligently, getting a basic idea of the object to be scanned and planning out what motions will let it do so efficiently and comprehensively.

It takes a time-consuming step out of the process: normally, once a scan is complete, the user has to inspect it, find where it falls short (an overhanging part occluding another, for instance, or an area of greater complexity that requires closer scrutiny) and set up a new scan to fill in those gaps. Alternatively, the scanner might need a 3D model loaded in advance in order to recognize what it’s looking at and know where to focus.

Fraunhofer’s project, led by Pedro Santos at the Institute for Computer Graphics Research, aims to get it right the first time by having the system evaluate its own imagery as it goes and plan its next move.
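
Fraunhofer hasn’t published pseudocode, but the general pattern, often called next-best-view planning, is simple to sketch: track which parts of the surface you’ve covered, score candidate viewpoints by how much unseen surface they’d add, and greedily pick the winner. A toy Python version, with every number a stand-in:

```python
import numpy as np

# Toy next-best-view loop, not Fraunhofer's algorithm: keep a coverage map
# of the object's surface, score candidate viewpoints by how much unseen
# surface they would add, and greedily scan from the best one until done.

rng = np.random.default_rng(0)
N_PATCHES, N_VIEWS = 200, 30

# visibility[v, p] = True if viewpoint v can see surface patch p.
visibility = rng.random((N_VIEWS, N_PATCHES)) < 0.15
seen = np.zeros(N_PATCHES, dtype=bool)
plan = []

while seen.sum() < 0.95 * N_PATCHES:          # target: 95% coverage
    gain = (visibility & ~seen).sum(axis=1)   # new patches per candidate view
    if gain.max() == 0:                       # nothing left reachable
        break
    best = int(gain.argmax())
    plan.append(best)
    seen |= visibility[best]                  # "scan" from that viewpoint

print(f"{len(plan)} views for {seen.mean():.0%} coverage:", plan)
```

The real system does this with live imagery rather than a precomputed visibility table, which is exactly the “evaluate as it goes” behavior described above.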

“The special thing about our system is that it scans components autonomously and in real time,” he said in a news release. It’s able to “measure any component, irrespective of its design — and you don’t have to teach it.”

This could help in creating one-off duplicates of parts the system has never seen before, like a custom-made lamp or container, or a replacement for a vintage car’s door or engine.

If you happen to be in Hanover in April, drop by Hannover Messe and try it out for yourself.

Featured Image: Fraunhofer