Innovations: A Drone Called WANDA
The U.S. military is studying the movement of fish to design the next generation of autonomous underwater vehicles.
The U.S. military is adding to its weapons arsenal; its name is WANDA. This friendly moniker does little to signify its importance in the future of warfare, but the Navy is betting that the Wrasse-inspired Agile Near-shore Deformable-fin Automaton and its fellow autonomous underwater vehicles (AUVs) will transform sea surveillance—and at a fraction of the cost and risk of current manned systems, like the submarine.
The Navy already uses torpedo-shaped, long-range AUVs to explore the deep sea and report to submariners on water conditions. But at 100-plus pounds and approximately 6 feet in length, these gliders are too cumbersome to navigate through dense obstacles or complete complex maneuvers. Building highly agile, slow-moving AUVs—for deployment, say, around sensitive coastal installations or in shallow harbors—is a much more challenging endeavor.
Enter WANDA, whose build was modeled on the bird wrasse, a pointed, medium-sized fish. By studying this creature’s pectoral fins and its swimming patterns through the cluttered coral habitats of its native Indian and Pacific oceans, the U.S. Naval Research Laboratory developed a drone prototype nimble enough to swim in shallow waters. WANDA, which is designed to travel at a speed of 2 knots, can turn in place, move laterally, and hold its position in light currents.
The prototype, which looks a little like a clear plastic football with four fins, is designed to carry a variety of interchangeable sensors that can be installed and replaced in the robot’s body, depending on its mission. WANDA doesn’t have a propeller; rather, its maneuverability comes from its side-mounted fins, which continuously respond to the environment.
This is not the Defense Department’s first foray into biomimicry. It is also funding research on robotic swimmers modeled on tuna and jellyfish, and it is testing another underwater drone—but this one resembles a shark, dorsal fin and all.
Developing AUVs like these takes years. Navy researchers have been studying the wrasse for over a decade, and the WANDA prototype is still undergoing tests. This year, though, Pentagon officials indicated that the 2016 budget would include “significant” investments in underwater robotics. If all goes as planned, WANDA will one day patrol murky, obstacle-filled waters to inspect ship hulls or hunt for trace chemicals. It would join a global AUV fleet expected to grow from some 600 vehicles in 2014 to an estimated 825 by 2018, according to the research firm Douglas-Westwood. That is to say, the drone called WANDA will likely have a sequel.
Build Here Now
For years architects have been able to develop and explore their designs with digital models. But what if people could actually see a design, in 3-D, at the physical building site? An augmented-reality system developed by researchers at Germany’s Fraunhofer Institute for Applied Information Technology (FIT) promises to let people do just this. Users of Auto AR, as FIT’s system is called, can drive up to a construction site, put on a GPS-equipped virtual-reality headset, and explore the three-dimensional architectural rendering that appears superimposed on the landscape before them. While Auto AR can be installed in any car, the system requires a headset that will display plans exported from the popular design software Autodesk Revit. FIT claims that Auto AR’s 3-D models are accurate to the centimeter.
The system, which was demonstrated at a January architecture and construction conference in Munich, could certainly help architects and their clients, who could scrutinize plans on a real-world scale, as well as average citizens, who could better evaluate the effects of construction on their communities.
Millions of pieces of trash are in constant orbit around the Earth: Dead satellites, old rocket parts, and tiny bits of paint are all spinning through the void at thousands of miles per hour, threatening active satellites and space missions. The International Space Station had to dodge debris five times in 2014 alone. And in 2009, a defunct Russian satellite and a privately owned American satellite collided, shattering both and creating even more trash in the process. The problem will only worsen as countries and corporations race to put more satellites into orbit—around 115 per year through 2023, according to the space and communications consulting firm Euroconsult.
In response, NASA’s Jet Propulsion Laboratory is developing technology that can finally collect all this space junk. By exploiting the same property that binds the tiny hairs on gecko feet to slick surfaces, gripping pads will use mushroom-shaped bristles to grasp pieces of debris, even at the extreme speeds and temperatures of space. Tests haven’t yet been conducted in orbit, but in an August 2014 trial (the results of which were publicized in December), the agency’s gripper managed to wrangle a human volunteer floating weightlessly through the cockpit of an airplane in parabolic flight.
Reach Out and Touch Me
Despite great leaps in the past 15 years, prosthetic limbs still can’t register the full range of human touch—sensing, for instance, when something is burning hot, freezing cold, or about to slip from grasp. In a December article in the journal Nature Communications, a group of American and South Korean researchers from five institutions announced that they had come one step closer to full sensation with a new artificial skin, designed for prostheses, that can sense moisture, temperature, and pressure. It’s thin, and thanks to a special configuration of silicon and gold sensors, it’s extremely flexible and can stretch without tearing. To make it even more realistic, it’s fitted with a heating system, making it warm to the touch.
Before the skin becomes patient-ready, which is still years away, scientists need to develop advanced technology that could trick the brain into thinking prosthetic devices are the body’s own.