Video Friday is your weekly selection of awesome robotics videos, collected by your stigmergic Automaton bloggers. We’re also posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):
HRI 2016 – March 7-10, 2016 – Christchurch, New Zealand
RobArch 2016 – March 14-19, 2016 – Sydney, Australia
RoboCup European Open – March 30-April 4, 2016 – Eindhoven, Netherlands
WeRobot 2016 – April 1-2, 2016 – Miami, Fla., USA
National Robotics Week – April 2-10, 2016 – United States
AISB HRI Symposium – April 5-6, 2016 – Sheffield, United Kingdom
Robotics in Education 2016 – April 14-15, 2016 – Vienna, Austria
NASA Swarmathon – April 18-22, 2016 – NASA KSC, Fla., USA
LEO Robotics Congress – April 21, 2016 – Eindhoven, Netherlands
International Collaborative Robots Workshop – May 3-4, 2016 – Boston, Mass., USA
ICARSC 2016 – May 4-6, 2016 – Bragança, Portugal
Robotica 2016 – May 4-8, 2016 – Bragança, Portugal
ARMS 2016 – May 9-13, 2016 – Singapore
ICRA 2016 – May 16-21, 2016 – Stockholm, Sweden
NASA Robotic Mining Competition – May 18-20, 2016 – NASA KSC, Fla., USA
Skolkovo Robotics Conference – May 20, 2016 – Skolkovo, Russia
Innorobo 2016 – May 24-26, 2016 – Paris, France
RoboCity16 – May 26-27, 2016 – Madrid, Spain
Let us know if you have suggestions for next week, and enjoy today’s videos.
“Academy Award®-nominated director Orlando von Einsiedel, Executive Producer J.J. Abrams, Bad Robot and Epic Digital have joined forces with Google and XPRIZE to create a documentary web series about the people competing for the Google Lunar XPRIZE. The Google Lunar XPRIZE is the largest prize competition of all time with a reward of $30 million and aims to incentivize entrepreneurs to create a new era of affordable access to the Moon and beyond, while inspiring the next generation of scientists, engineers, and explorers.”
Look for it on YouTube on March 7.
[ Google Lunar XPRIZE ]
Thanks Rachel!
“DARPA’s Vertical Takeoff and Landing Experimental Plane (VTOL X-Plane) program seeks to provide innovative cross-pollination between fixed-wing and rotary-wing technologies by developing and integrating novel subsystems to enable radical improvements in vertical and cruising flight capabilities. In an important step toward that goal, DARPA has awarded the Phase 2 contract for VTOL X-Plane to Aurora Flight Sciences.”
Aurora Flight Sciences has lots of experience building weird VTOL drones, so there’s little doubt that they’ll be able to make this concept work. A few things worth noting: the VTOL X-Plane has only one engine, a turbine in the fuselage, which provides electrical power to the 24 ducted fans. DARPA expects the vehicle to reach sustained flight speeds of 300 to 400 knots, with a hovering efficiency of at least 75 percent and a useful load of at least 40 percent of the vehicle’s projected gross weight of 10,000 to 12,000 pounds.
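To put those program targets in more familiar units, here’s a bit of back-of-the-envelope arithmetic. It’s nothing beyond unit conversions on the numbers quoted above:

```python
# Back-of-the-envelope conversions of the VTOL X-Plane targets quoted above.
# Purely illustrative arithmetic; no additional program data is assumed.
KT_TO_KMH = 1.852   # 1 knot = 1.852 km/h exactly
LB_TO_KG = 0.4536

for kt in (300, 400):
    print(f"{kt} kt is about {kt * KT_TO_KMH:.0f} km/h")

for gross_lb in (10_000, 12_000):
    useful_lb = 0.40 * gross_lb   # the 40 percent useful-load target
    print(f"gross weight {gross_lb} lb -> useful load of at least "
          f"{useful_lb:.0f} lb (about {useful_lb * LB_TO_KG:.0f} kg)")
```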
[ DARPA ]
“Strictly using digital production techniques (3D printing, laser cutting and thermoforming on CNC – milled molds), a modular and reproducible, scorpion-based hexapod robot was created. Problems such as gravitational separation and high energy consumption of the motors were solved by the use of elastics, based on the principle of antagonistic muscles. The tail actuation and main robot structure were strongly based on nature’s solutions as well.
The scorpion is programmed with a range of moves and interactions so it can function more autonomously at expos, while a GUI allows easier calibration and live control of the robot. A marker integrated in the tail allows the scorpion to “stab” expo visitors, creating a trace of persons ‘carrying the information shared at the IDC-stand’ through the expo and representing the idea of stigmergy.”
“Stigmergy.” There’s a word I don’t think I’ve ever heard used in real life before.
[ Scorpion Hexapod/TIII Project ] via [ Gizmag ]
In emergencies, should you trust a robot? The short answer: sure, but be sensible about it.
“People seem to believe that these robotic systems know more about the world than they really do, and that they would never make mistakes or have any kind of fault,” said Alan Wagner, a senior research engineer in the Georgia Tech Research Institute (GTRI). “In our studies, test subjects followed the robot’s directions even to the point where it might have put them in danger had this been a real emergency.”
In the study, sponsored in part by the Air Force Office of Scientific Research (AFOSR), the researchers recruited a group of 42 volunteers, most of them college students, and asked them to follow a brightly colored robot that had the words “Emergency Guide Robot” on its side. The robot led the study subjects to a conference room, where they were asked to complete a survey about robots and read an unrelated magazine article. The subjects were not told the true nature of the research project.
In some cases, the robot – which was controlled by a hidden researcher – led the volunteers into the wrong room and traveled around in a circle twice before entering the conference room. For several test subjects, the robot stopped moving, and an experimenter told the subjects that the robot had broken down. Once the subjects were in the conference room with the door closed, the hallway through which the participants had entered the building was filled with artificial smoke, which set off a smoke alarm. When the test subjects opened the conference room door, they saw the smoke – and the robot, which was then brightly lit with red LEDs and white “arms” that served as pointers. The robot directed the subjects to an exit in the back of the building instead of toward the doorway – marked with exit signs – that had been used to enter the building.
“We expected that if the robot had proven itself untrustworthy in guiding them to the conference room, that people wouldn’t follow it during the simulated emergency,” said Paul Robinette, a GTRI research engineer who conducted the study as part of his doctoral dissertation. “Instead, all of the volunteers followed the robot’s instructions, no matter how well it had performed previously. We absolutely didn’t expect this.”
[ Georgia Tech ]
For $230 on Kickstarter, you can pledge for a ROS-friendly LIDAR with a 40-meter range that works outdoors:
Other specs you might care about: a 2-10 Hz scan rate, a minimum range of 10 cm, angular resolution as fine as 1.4 degrees, and accuracy of about 2 percent of the measured distance.
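If you’re wondering what “ROS-friendly” buys you in practice, the sketch below is roughly all the code it takes to start consuming scans on the ROS side. It assumes the sensor’s driver publishes standard sensor_msgs/LaserScan messages on a /scan topic, which is the usual convention but not something I’ve confirmed for this particular device:

```python
# Minimal ROS node that listens to a lidar publishing sensor_msgs/LaserScan.
# The /scan topic name and the existence of a LaserScan driver are assumptions.
import rospy
from sensor_msgs.msg import LaserScan

def on_scan(msg):
    # Drop readings outside the sensor's valid range before using them.
    valid = [r for r in msg.ranges if msg.range_min <= r <= msg.range_max]
    if valid:
        rospy.loginfo("nearest return: %.2f m across %d valid points",
                      min(valid), len(valid))

rospy.init_node("lidar_listener")
rospy.Subscriber("/scan", LaserScan, on_scan, queue_size=1)
rospy.spin()
```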
[ Kickstarter ]
Thanks Tyson!
Getting industrial robots to work safely alongside humans is still a challenge, and CMU’s Intelligent Workcell Project is experimenting with different ways of keeping humans from getting crushinated:
[ CMU ]
Integrating display systems into soft robotics has an enormous amount of potential for both color-changing robots and shape-changing displays:
This hyper-elastic light-emitting capacitor (HLEC), which can endure more than twice the strain of previously tested stretchable displays, consists of layers of transparent hydrogel electrodes sandwiching a dielectric (insulating) elastomer sheet. The elastomer changes luminance and capacitance (the ability to store an electrical charge) when stretched, rolled and otherwise deformed.
“We can take these pixels that change color and put them on these robots, and now we have the ability to change their color,” [Cornell Professor Rob] Shepherd said. “Why is that important? For one thing, when robots become more and more a part of our lives, the ability for them to have emotional connection with us will be important. So to be able to change their color in response to mood or the tone of the room we believe is going to be important for human-robot interactions.”
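The capacitance part of that is, at heart, just parallel-plate physics: stretch a (nearly incompressible) elastomer sheet and its area grows while its thickness shrinks, so C = εA/d goes up on both counts. Here’s a toy calculation with made-up numbers, not anything from the Cornell paper:

```python
# Toy parallel-plate model of why stretching raises capacitance (C = eps*A/d).
# All values are illustrative guesses, not measurements from the HLEC work.
EPS_0 = 8.854e-12   # vacuum permittivity, F/m

def capacitance(eps_r, area_m2, thickness_m):
    return eps_r * EPS_0 * area_m2 / thickness_m

eps_r, area, thickness = 3.0, 1e-4, 1e-3   # relaxed sheet (assumed values)
c_relaxed = capacitance(eps_r, area, thickness)

stretch = 2.0   # double the area...
# ...and, since elastomers are nearly incompressible, halve the thickness.
c_stretched = capacitance(eps_r, area * stretch, thickness / stretch)

print(f"relaxed: {c_relaxed * 1e12:.2f} pF, stretched: {c_stretched * 1e12:.2f} pF")
```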
[ Cornell ] via [ New Scientist ]
The coolest thing about DJI’s new Phantom 4 camera drone is that it’s got a couple of forward-looking cameras that allow it to do basic stereo vision-based autonomous obstacle avoidance:
The DJI Phantom 4 is available now for $1400.
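For a rough sense of the geometry behind that obstacle avoidance: a calibrated stereo pair gives you depth from disparity via Z = f·B/d. The focal length and baseline below are illustrative guesses, not the Phantom 4’s actual camera parameters:

```python
# Depth-from-disparity sketch for a forward-facing stereo pair: Z = f * B / d.
# Focal length and baseline here are illustrative, not Phantom 4 specs.
def depth_m(focal_px, baseline_m, disparity_px):
    return focal_px * baseline_m / disparity_px

FOCAL_PX = 700.0    # focal length in pixels (assumed)
BASELINE_M = 0.10   # distance between the two cameras in meters (assumed)

for disparity in (5, 10, 20, 40):
    print(f"disparity {disparity:>2} px -> obstacle roughly "
          f"{depth_m(FOCAL_PX, BASELINE_M, disparity):.1f} m away")
```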
[ DJI ]
Why sing happy birthday with just one Jibo when you could do it with four Jibos at the same time?
Because it would be super expensive, that’s why.
[ Jibo ]
HJM47: a 2-meter tall, 280-kg boxing robot that can move at 40 mph and throws TORNADO PUNCHES!
[ Hajime Research Institute ]
“Mobile inspection robots crawl over pipelines on magnetic wheels and identify critical points using special sensors. What sounds like science fiction has long since become reality, but until now, whenever these robots have needed to be serviced or inspected, human intervention has been unavoidable. Now, for the first time, researchers at the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt; DLR) have successfully integrated an industrial robotic gripper arm with seven degrees of freedom into an autonomous helicopter system as part of the European Union ‘Aerial Robotics Cooperative Assembly System’ (ARCAS) project. This makes it possible to inspect and service robots on the pipelines without risk. Similar systems could also be used for the maintenance of satellites or even for building habitats on other planets.”
[ DLR ]
Having a Pepper take care of you in the hospital actually seems kind of useful:
As long as it also responds to “Go get a human, I feel like I’m dying,” that is.
[ Pepper ]
A pneumatic anti-drone mortar launcher with a range of 100 meters? Sure, why not?
Eh, I like the anti-drone eagles better.
[ Business Insider ]
To promote the World Drone Prix, Dubai staged a race between a very fast moving drone and a very slow moving McLaren:
[ World Drone Prix ]
From Vijay Kumar’s lab at the University of Pennsylvania:
“We combine strategies for passive particle assembly in soft matter with robotics to develop new means of controlled interaction. In capillary assembly, particles distort fluid interfaces and move in directions that minimize the surface area. In particular, they move along principal axes on curved interfaces to sites of high curvature via capillary migration.
We propose a robot that serves as a programmable source of fluid curvature and allows the collection of passive particles. When settled on a fluid interface, the magnetic robot distorts the interface, which strongly influences curvature capillary migration. The shape of the robot dictates the interface shape, for example, by imposing high interface curvature near corners, creating sites of preferred assembly.
This freedom to manipulate interface curvature dynamically and to migrate laterally on the interface creates new possibilities for directed bottom-up particle assemblies and precise manipulation of these complex assembled structures. Since the passive particles can be functionalized to sense, report and interact with their surroundings, this work paves the way to new schemes for creation and control of functionalized micro robots.”
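If you want the one-equation version of what’s driving all of that, it’s interfacial-energy minimization. As a rough sketch (the proportionality constants depend on particle shape and wetting, which the abstract above doesn’t spell out):

```latex
% Interfacial energy, with surface tension \gamma and total interface area A:
E(\mathbf{x}) = \gamma \, A(\mathbf{x})
% A particle centered at x distorts the interface, and how much area its
% distortion removes depends on the local, robot-imposed curvature, so it feels
\mathbf{F}(\mathbf{x}) = -\nabla E(\mathbf{x}) = -\gamma \, \nabla A(\mathbf{x})
% a force pulling it toward high-curvature sites (e.g., the robot's corners),
% where settling eliminates the most interfacial area.
```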
[ UPenn ]
This week’s CMU Robotics Institute seminar:
Thomas Howard is an assistant professor in the Department of Electrical and Computer Engineering and the Department of Computer Science at the University of Rochester. He is also a member of the Institute for Data Science and holds a secondary appointment in the Department of Biomedical Engineering.
The efficiency and optimality of robot decision making is often dictated by the fidelity and complexity of models for how a robot can interact with its environment. It is common for researchers to engineer these models a priori to achieve particular levels of performance for specific tasks in a restricted set of environments and initial conditions. As we progress towards more intelligent systems that perform a wider range of objectives in a greater variety of domains, the models for how robots make decisions must adapt to achieve, if not exceed, engineered levels of performance. In this talk I will discuss progress towards model adaptation for robot intelligence, including recent efforts in natural language understanding for human-robot interaction.
[ CMU RI Seminar ]
If you’re the sort of person who was interested in the ROS-Industrial Community Meeting that happened last week, you probably didn’t miss the online stream. But if you did, it’s now on YouTube:
[ ROS Industrial ]