OSRF joins Dronecode

We’re pleased to announce that OSRF has joined the Dronecode Project, which promotes open source platforms for Unmanned Aerial Vehicles (UAVs). That mission, plus the burgeoning use of ROS and Gazebo in UAV development, makes Dronecode and OSRF natural partners.

We’ll work with Dronecode to make our tools even more useful for UAV projects. We’ll also bring together the general robotics community and the aerial robotics community. Both groups have valuable tools and capabilities which can be shared, to everyone’s benefit.

We look forward to getting more involved with the UAV community and seeing some amazing open source flying robots.

Dr. Baxter goes to Oregon

At the end of January, Baxter left OSRF for a stint in Corvallis, Oregon, where he will take part in a project investigating the use of teleoperated robots in the treatment of highly contagious diseases such as Ebola. He will join the Personal Robotics Group, part of Oregon State University’s growing Robotics Program, as part of their NSF-funded work to bring robots to the front lines of the current Ebola outbreak.

Health care workers are at the highest risk of exposure when working in close proximity to infected patients. Even with personal protective equipment, they still face considerable risk of infection. These risks stem from both faulty practice and extreme conditions, especially in harsh locations such as West Africa, where high temperatures and humidity present real operational challenges. Intuitive teleoperation interfaces will enable health care workers to remotely operate robots (such as Baxter) to perform significant portions of their jobs from a safe distance. Examples of potential tasks include patient monitoring, equipment moving, and contaminated material disposal. This will allow health care workers to provide needed care and maintain the important interaction between patient and caregiver, while making their jobs safer and more tolerable.

Here is an example of a task that Baxter will help investigate, as performed by the PR2:

Gazebo Released for DARPA HAPTIX Project

The Gazebo team has been hard at work setting up a simulation environment for the Defense Advanced Research Projects Agency (DARPA) Hand Proprioception and Touch Interfaces (HAPTIX) program. The goal of the HAPTIX program is to provide amputees with prosthetic limb systems that feel and function like natural limbs, and to develop next-generation sensorimotor interfaces to drive these limbs and receive rich sensory content from them. Managed by Dr. Doug Weber, HAPTIX is being run out of DARPA’s Biological Technologies Office (BTO).

As the organization maintaining Gazebo, OSRF has been tasked with extending Gazebo to simulate prosthetic hands and test environments, and with developing both graphical and programmatic interfaces to the hands. OSRF is officially releasing a new version of Gazebo for use by HAPTIX participants. Highlights of the new release include support for the OptiTrack motion capture system; the NVIDIA 3D Vision system; numerous teleoperation options, including the Razer Hydra, SpaceNavigator, mouse, mixer board, and keyboard; a high-dexterity prosthetic arm; and programmatic control of the simulated arm from Linux, Windows, and MATLAB. More information and tutorials are available at the Gazebo website. Here’s an overview video:

“Our track record of success in simulation as part of the DARPA Robotics Challenge makes OSRF a natural partner for the HAPTIX program,” said John Hsu, Chief Scientist at the Open Source Robotics Foundation. “Simulation of prosthetic hands and the accompanying GUI will significantly enhance the HAPTIX program’s ability to help restore more natural functionality to wounded service members.”

Gazebo is an open source simulator that makes it possible to rapidly test algorithms, design robots, and perform regression testing using realistic scenarios. Gazebo provides users with a robust physics engine, high-quality graphics, and convenient programmatic and graphical interfaces. Gazebo was the simulation environment for the VRC, the Virtual Robotics Challenge stage of the DARPA Robotics Challenge.

This project also marks the first time that Windows and MATLAB users can interact with Gazebo, thanks to our new cross-platform transport library. The scope is currently limited to the HAPTIX project; however, plans are in motion to bring the entire Gazebo package to Windows.
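
For a sense of what that programmatic control looks like, here is a minimal sketch of a control loop written against the haptix-comm C API described in the Gazebo HAPTIX tutorials. The function and struct names (hx_connect, hx_robot_info, hx_update, hx_close, hxRobotInfo, hxCommand, hxSensor) and the motor_count and ref_pos fields are taken from those tutorials rather than from this announcement, so treat them as assumptions and check them against the released headers:

    // Minimal HAPTIX control-loop sketch (haptix-comm C API, used from C++).
    // Names follow the Gazebo HAPTIX tutorials; verify against the installed
    // haptix/comm/haptix.h before relying on them.
    #include <cstdio>
    #include <haptix/comm/haptix.h>

    int main()
    {
      hxRobotInfo info;
      hxCommand cmd = {};
      hxSensor sensor;

      // Connect to the simulated (or physical) limb; NULL/0 uses the defaults.
      if (hx_connect(NULL, 0) != hxOK)
      {
        std::fprintf(stderr, "hx_connect() failed\n");
        return 1;
      }

      // Ask how many motors the hand model exposes.
      if (hx_robot_info(&info) != hxOK)
      {
        std::fprintf(stderr, "hx_robot_info() failed\n");
        return 1;
      }

      // Command every motor to a small fixed position and read the sensors back.
      for (int i = 0; i < info.motor_count; ++i)
        cmd.ref_pos[i] = 0.5f;

      for (int step = 0; step < 100; ++step)
      {
        if (hx_update(&cmd, &sensor) != hxOK)
          break;
      }

      hx_close();
      return 0;
    }

The intent is that the same loop runs on Linux and Windows, with the MATLAB interface wrapping equivalent calls.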

Teams participating in HAPTIX will have access to a customized version of Gazebo that includes the Johns Hopkins University Applied Physics Laboratory Modular Prosthetic Limb (MPL), developed under the DARPA Revolutionizing Prosthetics program, as well as representative physical therapy objects used in clinical research environments.

More details on HAPTIX can be found in the DARPA announcement.

OSRF welcomes Ying Lu

OSRF is pleased to welcome Ying Lu! Ying is currently a Ph.D. student in the Robotics Lab of the Department of Computer Science at Rensselaer Polytechnic Institute, under the direction of Prof. Jeff Trinkle. Before that, she received a BS degree from the University of Science and Technology of China. Her research focuses on contact and constraint models and solvers in multibody dynamics, with an emphasis on a benchmarking framework that provides unified interfaces to different models and solvers. She was a member of the RPI Rockie team at the 2014 Sample Return Challenge, helping with the vision system using ROS and OpenCV. Ying attended the 2013 and 2014 Grace Hopper Conference, a celebration of women in computing, and was a 2014 Grace Hopper Scholar.

Ying is enthusiastic and excited to see how robotics is going to change the world, just as the computer revolution did!

OSRF welcomes Louise Poubel

OSRF is pleased to welcome Louise Poubel! Louise grew up in Brazil and thought, why not cross the world and go to college in Japan? As if two continents weren’t enough, she later decided to get a master’s in robotics in Europe, where she studied in Poland and France. There, she did research on making humanoid robots imitate human whole-body movements in real time. At the end of 2013, Louise joined OSRF as an intern, and has since been working on GUI tools for Gazebo. Now she’s coming to conquer one more continent as she joins our team full-time.

She is excited about open source technology and user experience because she believes that machines are here to make life easier for everyone around the world, not the opposite! Along this line of thought, she hopes one day to make robotic Rubik’s cubes which solve themselves while humans just sit back and relax.

ROS 2 gets embedded

We are happy to introduce a prototype for running ROS 2 on deeply embedded systems, using the STM3240G-EVAL board, which contains an STM32F4 microcontroller, an Ethernet interface, and some extra SRAM. The prototype combines a real-time operating system (NuttX) with a pseudo-POSIX interface, a DDS implementation (Tinq), and an example that uses ROS message types to communicate with other ROS 2 machines.

Here’s Victor to tell you about it:

This prototype has several caveats, most importantly that system performance is currently limited to ~3 Hz due to the UDP implementation of the underlying RTOS, but we expect that to improve drastically over time. We hope that this work is a starting point for the next generation of ROS-compatible sensors and actuators: robot parts that, out of the box, plug into an Ethernet network and interoperate in the ROS environment as first-class participants.
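
To make the interoperability part concrete, here is a minimal sketch of a desktop-side listener written against today’s rclcpp API, not the prototype’s own interface, which predates the current client libraries. The “chatter” topic and std_msgs/String message type are illustrative choices, not details taken from the prototype:

    // Minimal ROS 2 listener sketch (current rclcpp API): a desktop node that
    // would receive messages published over DDS by an embedded board on the
    // same Ethernet network.
    #include "rclcpp/rclcpp.hpp"
    #include "std_msgs/msg/string.hpp"

    int main(int argc, char ** argv)
    {
      rclcpp::init(argc, argv);
      auto node = rclcpp::Node::make_shared("listener");
      auto logger = node->get_logger();

      // Topic name and queue depth are illustrative; the embedded demo's
      // actual topic may differ.
      auto sub = node->create_subscription<std_msgs::msg::String>(
        "chatter", 10,
        [logger](const std_msgs::msg::String & msg) {
          RCLCPP_INFO(logger, "I heard: '%s'", msg.data.c_str());
        });

      rclcpp::spin(node);
      rclcpp::shutdown();
      return 0;
    }

A publisher on the embedded side produces the same wire format through its DDS implementation, which is what lets the microcontroller appear as just another ROS 2 participant on the network.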

Thank you to OSRF supporters!

Toward the end of last year, we asked you for financial support. And you responded. We received donations ranging from $2 to $100, from individuals in 26 countries. It’s fantastic to see this breadth of support. Thank you to everybody who donated!

You can continue to donate to OSRF at any time, and of course we’re always interested in talking with new corporate or government sponsors.

Another way to support OSRF, if you’re an Amazon customer, is to log in to AmazonSmile and select us (Open Source Robotics Foundation) as your charity (learn more).

Ubuntu ROS apps on the way

Ubuntu announced today that its new Snappy Core operating system, already being adopted for cloud computing, will be specifically supported on embedded, connected, mobile devices (a.k.a. the Internet of Things).

And what’s the coolest kind of embedded, connected, mobile device? A robot, of course. Here at OSRF, we’ve been working with Ubuntu to ensure that ROS will be ready to use on Snappy, and we’re making plans for a ROS/Snappy store. You’ll be able to write, share, and run ROS-based Snappy apps for your favorite robots (check out an early prototype).

We’ve supported and relied on Ubuntu Linux since the beginning of the ROS project, and we’re excited to be part of this transition to a new Ubuntu-based app ecosystem.

New project: Eyes of Things

EoT logo

We’re happy to announce that OSRF will be an advisor to the Eyes of Things (EoT) project, which was recently selected by the European Commission in one of the first batches of the ICT-H2020 Framework Programme. The EoT project brings together eight European partners: VISILAB (Spain, Project Coordinator), Movidius (Ireland), Awaiba (Portugal), DFKI (Germany), Thales (France), Fluxguide (Austria), nViso (Switzerland) and Evercam (Ireland).

The 3.7M€ project envisages a computer vision platform that can be used both standalone and embedded into more complex systems, particularly for wearable applications, robotics, home products, and surveillance. The core hardware will be based on a system-on-chip (SoC) designed to deliver maximum performance for demanding vision applications while keeping energy consumption as low as possible, enabling always-on and truly mobile vision processing. Software will be developed in parallel to this design, at both the low and middleware levels, as well as for a number of demonstrators, which span applications in surveillance, a wearable configuration, and embedding into a household item. Eyes of Things will run for three years, starting in January 2015, with the kick-off meeting on January 27th, 2015.

OSRF welcomes Esteve Fernandez

OSRF is pleased to welcome Esteve Fernandez! After working for many years on distributed systems, Esteve switched gears and pursued a career in robotics. He also enjoys writing software and sharing it with others, so combining open source and robotics is an exciting opportunity for him.

Esteve holds an MSc in Artificial Intelligence and Robotics, as well as M.S.E. and B.S.E. degrees in Computer Engineering, and has been a professional developer for over 10 years. He is a member of the Apache Software Foundation and a frequent speaker at open source conferences such as PyCon US and EuroPython.

Overall, Esteve found in robotics the perfect excuse for playing with Lego in his 30s without getting weird looks.