We'd like to get an idea of what versions of ROS people are using and how long they want those versions supported. If you could fill out this short survey, that would help us to make decisions on where we should focus our efforts for ROS1. We'll leave the survey open for 2 weeks, until April 4, 2018, and post the results shortly after that.

Evan Ackerman interviewed eight of the people involved in the early days of ROS for an amazing oral history of how ROS came to be.

"Could we actually create something that would enable people to build on each other's results rather than continuing this cycle of 90 percent duplicating what someone else has already done, with a little bit at the end of something new, if you're lucky?" -- Keenan Wyrobek

"That's one of the things that helped the design of ROS, I think: From Day 1, more than one robot was running on the code." -- Morgan Quigley

"We were going to build both state-of-the-art hardware and software, with the goal of being a LAMP stack for robotics: You'd be able to take its open-source software, put your business model on top, and you'd have a startup" -- Tully Foote

"Across the board, the early adopters were adopting it without us even telling them that they should--they just saw it out there, thought it was cool, and picked it up." -- Brian Gerkey

"It's crazy that Willow Garage agreed to let us just put all our code out in the open. Not only did we do open source, we did it for real--a lot of companies will develop their stuff internally, and occasionally put out releases. But one of the things that this anniversary celebrates is that we committed to every commit that we did being in the open, on a publicly available server. I think it's an amazing thing for a company to decide to do. We basically gave away millions worth of work product, and that's a big deal." -- Ken Conley

We are excited to show off a simulation of a Prius in Mcity using ROS Kinetic and Gazebo 8. ROS enabled the simulation to be developed faster by using existing software and libraries. The vehicle's throttle, brake, steering, and transmission are controlled by publishing to a ROS topic. All sensor data is published using ROS, and can be visualized with RViz.

We leveraged Gazebo's capabilities to incorporate existing models and sensors.
The world contains a new model of Mcity and a freeway interchange. There are also models from the Gazebo models repository, including dumpsters, traffic cones, and a gas station. The vehicle itself carries a 16-beam lidar on the roof, 8 ultrasonic sensors, 4 cameras, and 2 planar lidars.

The simulation is open source and available on GitHub at osrf/car_demo. Try it out by installing nvidia-docker and pulling "osrf/car_demo" from Docker Hub. More information about building and running is available in the README in the source repository.

With each release of ROS we have a tradition of having a logo and making t-shirts. ROS Lunar Loggerhead is coming out in May. To let you show your ROS colors and support this tradition, we have set up a Teespring campaign in both the US and the EU. Note that both of these campaigns can ship worldwide.

This year we're also excited to provide an opportunity to order stickers. We're providing stickers for Lunar Loggerhead as well as all the other active ROS distros (Indigo, Jade, and Kinetic), plus a generic ROS sticker.

Note that there are pretty good price breaks if you order in quantity, so if you have a few friends nearby it's probably worth doing a group order. The stickers will continue to be available, but the t-shirt campaign ends in 20 days, so don't delay ordering.

Since this is the first public announcement, here's the full graphic for Lunar Loggerhead.

Thanks to everyone who's been helping prepare the Lunar release. We're looking forward to the release in May. We expect the t-shirts ordered in this campaign will arrive just before ROS Lunar Loggerhead is released!

Southwest Research Institute is providing video editing services to create two separate video montages celebrating the anniversary of two respective ROS-based open source projects. Please visit the video submission page to upload your video for one of the following montages:

In recent years, ROS has become the standard in service and research robotics, and it's making great advances in industry.

Most of the robots and components on the market support ROS, though finding out which are really supported, which ROS version they use, and how to get them is sometimes a difficult task. One of our main purposes is to make this easier and simpler for the customer: linking products with their ROS controllers, explaining how to install and configure them, and showing where to find useful information about them.

All the products in the site are supported by ROS, either available directly in the ROS distribution or through their source code. The ROS community has a new meeting point in ROS Components!

ROS as the Standard

At ROS Components we strongly believe that ROS is, and will remain, the standard in robotics for many years to come. Therefore we want to encourage roboticists to use it (if you are not already doing so) as well as manufacturers to support it.

Supporting ROS and its Community

As you know, the ROS core is currently maintained by the Open Source Robotics Foundation (OSRF), an independent non-profit R&D company that leads the development and maintenance of ROS releases and hosts all the necessary infrastructure.

At ROS Components we try to encourage the use of ROS as well as its maintenance and growth. Therefore we are going to donate a portion of the proceeds of every sale to the OSRF. So, every time you buy from ROS Components, you'll be contributing to ROS maintenance and development.

The basic reason for forming this project was the question: "How do we make autonomous machines work together on a common task?" E.g., when an autonomous wheel loader is loading gravel onto a truck, who decides on their relative positions: the truck, the loader, or a supervising system?

As a first approach we decided to do this within the frame of the Volvo Group Academic Preferred Partner (APP) network, involving students and researchers from Chalmers and Mälardalen universities in Sweden and Penn State University in Pennsylvania, US, as well as the Swedish waste management company Renova. We all agreed that using ROS was a must, on one hand to coordinate the three universities, and on the other to reuse development done within the ROS ecosystem (e.g. Gazebo, RViz, MoveIt, drivers, etc.). Thanks to great engagement from the researchers and students, and of course the ROS components, we managed to make this (and a lot more, not shown in the video).

Congratulations to the answers.ros.org community for making the site the thriving resource that it is today. Keep up the fantastic work, and keep the questions--and answers--coming.

While you're on the site: if you've asked a question and it hasn't been marked as answered, please consider revising it with more details or added clarity.
And likewise, consider trying to answer one question each time you visit.

Gaitech EDU provides a comprehensive educational framework on Robot
Operating System (ROS) through a series of tutorials and online videos.

Gaitech EDU is an educational website on robotics, and in particular on
Robot Operating System (ROS). The objective is to provide easy-to-follow
educational content that helps learners better master the concepts of ROS and
promotes its use for developing robotics software. Gaitech strives
to contribute to the development of ROS and provides its customers and ROS
users with technical support and an open education framework to learn ROS.

The Gaitech Education website is NOT meant to be a substitute for the ROS wiki
documentation, but a complementary website that is more oriented toward
providing education and teaching material.

As the primary objective of Gaitech EDU is to promote ROS education,
the tutorials were designed with teaching objectives in mind. Each tutorial
starts with learning outcomes that the learner is expected to achieve
by the end of the tutorial. Then, the tutorial content is provided in
textual format and/or as video illustrations. Finally, a series of review
questions is proposed so that students can self-evaluate their
understanding of the concepts presented in the tutorial.
The tutorials can be used as additional teaching resources in robotics courses using
ROS.

The ROS Qt Creator Plug-in is developed specifically for ROS to increase developers' efficiency by simplifying tasks and creating a centralized location for ROS tools. Since it is built on top of the Qt Creator platform, users have access to all of its existing features, like syntax highlighting, editors (C++, Python, etc.), code completion, version control (Git, Subversion, etc.), debuggers (GDB, CDB, LLDB, etc.), and much more.

Check out the two videos. The first is a short overview of Qt Creator and its default capabilities. The second is an overview of the ROS Qt Creator Plug-in developed by Levi Armstrong from Southwest Research Institute. It concludes with an invitation for others to begin using the plug-in for ROS development.

Robotic startup Tend.ai, which just came out of stealth mode today, has successfully built the world's first fully automated 3D printing system controlled by cloud robots.

In the video, one robot fully automates ten 3D printers. The prints are boxed and pushed down a conveyor belt. Any 3D printer can be used, and Tend.ai's artificial intelligence "reads" the printers' displays (via OCR) and pushes their buttons just like a human would.

Tend.ai allows you to train, control, and monitor most collaborative robots from any device (e.g. your mobile phone) without any technical expertise. It automatically monitors the state of all machines and runs them optimally.

Tend.ai utilizes ROS in the cloud to control, train and monitor suites of robots from any device. Thanks to cloud computing, standard webcams (< $100) can be used for the vision system.

Tend.ai can tend most machines without any modification or networking.

We are interested in knowing which hardware platforms are people's favorites for running ROS, so we'd like to ask for a few minutes of your time to fill out the following survey. We'll share the results once it's closed. Thanks for your collaboration!

We look forward to receiving your contributions to make this book successful and useful for the ROS community.

In Volume 1, we accepted 27 chapters ranging from beginner to advanced level, including tutorials, case studies, and research papers. Volume 1 is expected to be released by February 2016.

After negotiation with Springer, the authors have benefited from a discount of around 80% on hardcopies as an incentive for their contribution, in addition to having their work published.

The call for chapters website (see above) presents in detail the scope of the book, the different categories of chapters, topics of interest, and submission procedure. There are also Book Chapter Editing Guidelines that authors need to comply with.

In this volume, we intend to place a special focus on unmanned aerial vehicles using ROS. Papers that present the design of a new drone and its integration with ROS, simulation environments for unmanned aerial vehicles with ROS and SITL, ground-station-to-drone communication protocols (e.g. MAVLink, MAVROS, etc.), control of unmanned aerial vehicles, best practices for working with drones, etc. are particularly sought.

In a nutshell, abstracts must be submitted by February 15, 2016 to register the chapters and to identify in advance any possible overlap of chapter contents. Full chapter submissions are due on April 20, 2016.

Submissions and the review process will be handled through EasyChair. A link will be provided soon.

Each chapter will be reviewed by at least three expert reviewers, at least one of whom should be a ROS user and/or developer.

Want to be a reviewer for some chapters?

We are looking for collaboration from ROS community users to provide reviews and feedback on the proposals and chapters submitted for the book. If you are interested in participating in the review process, please consider filling in the following reviewer interest form.

We look forward to receiving your contribution for a successful ROS reference!

Individuals and teams from industry and academia are invited to submit an application for the upcoming euRobotics Technology Transfer Award, which will be part of the European Robotics Forum, to be held in Ljubljana, 21-23 March 2016 (http://www.erf2016.eu/).

Please help us make another great ROS Montage for the upcoming 8th anniversary of ROS. To show off the great variety of things people are doing with ROS we need your videos to share with the community.

Please submit your videos to be considered for inclusion in the 8 Years of ROS montage before November 1st.

The global aim is to facilitate discussion with our user base. For example, we're in the process of refactoring our repositories right now to make for an easier release process and a more modular approach. During that process we'd love to have more feedback from our end users, but it is quite hard for us to reach them. We also think that it'd be a good place for discussing different uses of our hardware/software, sharing exciting demos or tutorials, discussing new features, etc.

If you are using our software (the Shadow hand simulation or real hardware, the Cyberglove package, etc.), I hope you'll be joining that list!

The CITEC Month of Open Research (MORe) is an international program
that offers students stipends to contribute to open source or open data
projects related to the research areas of cognitive interaction technology.
MORe is organized by the Excellence Cluster Cognitive Interaction Technology
(CITEC) of Bielefeld University, Germany. We fund students from abroad who
would like to participate in exciting projects that are proposed by experienced
mentors of CITEC research groups. Participants can receive a stipend
of up to 1,500 EUR. We accept applications from English-speaking students
from all over the world!

Greg Brill and I have been working on a script that can more or less automatically set up a ROS desktop_full install on Mavericks and Yosemite; we'd be glad for a few brave souls to give it a try:

The desktop_full build itself takes about 30 minutes; the remainder of the time is spent fetching and building dependencies (especially VTK and Gazebo 5). Total time on most systems should be < 1 h.

Our intent is to set up some form of CI which can periodically re-run this setup on a vanilla machine, and thus keep it from regressing; however, such a thing is still to come, as this overall procedure is long enough that it's not a good fit for Travis CI.

The ROS-Industrial Consortium is tackling a topic that is of interest to the whole ROS community: conversion of CAD data to ROS-interpretable file types (e.g. URDF, SRDF). This work will be conducted over the next three years by the TU Delft Robotics Institute. To help us make ROS even more convenient to use:

We are developing a lightweight implementation of the ROS middleware on the STM32F4Discovery for interfacing embedded and general-purpose software. Currently, we can run multiple ROS nodes concurrently on the STM32, and we can send ROS messages between a PC and the STM32 over Ethernet (UDPROS only).

Programming is fun. Robots are fun. Programming robots is awesome! In this episode, Michael speaks with Dirk Thomas from the ROS (Robot Operating System) project. You will learn how to use ROS and rospy to program robots.

We discuss how ROS is used on some of the largest and most complex robots ever built (including one on the International Space Station!) all the way down to basic robots controlled via microcontrollers such as Arduinos.

Enable rapid, cost-effective development of new robotics
capabilities designed to respond to, and even anticipate, quickly
evolving needs in space, maritime, ground, and air operations. RFT will
focus on the development of groundbreaking robotic hardware and software
by funding novel approaches as well as creative adaptations of existing
technologies.

Achieve breakthrough capabilities in less time and at a fraction of
the cost typical of government-supported robotic development processes
by engaging highly agile organizations and individuals who traditionally
have not worked with the U.S. government.

The good old ROS CheatSheet has just been released for Indigo. If you know anyone just starting out in ROS please send this on to them.

I recently performed some much-needed cleanup, reformatting, and content addition for the CheatSheet. Most notably, the GUI tools section has been greatly improved and includes information on the RQT toolset.

Further, it now comes in two flavors: New and Improved Catkin Flavor and Original Extra Crispy Rosbuild. Many thanks to Kei Okada of the JSK lab for adding this dual-build functionality and for his edits for Hydro!

This service is something that we've heard requested many times,
especially from our industry users, and we're excited that Clearpath is
going to offer it. If you're looking for help or advice in using ROS on
a current or upcoming project, get in touch with Clearpath.

Clearpath Robotics, a leader in unmanned vehicle robotics, has combined
resources with Christie®, one of the most innovative visual
technologies companies in the world, to create a three-dimensional video game
using robots. The pairing of Clearpath and Christie bridges two technologies,
from unrelated fields, to create an interactive experience in a way that has
never been done before.

The project was produced during
Clearpath's "hack week," where team members experiment and innovate with new
technology and ideas. Computer graphics were displayed on the floor using Christie's
3D projection mapping equipment to create a digital arena, while robots dueled
with laser beams. Clearpath was inspired by a
project from MIT; however, they wanted to create a version
using open source software and run as a completely interactive program.

"Teaming up with Christie allowed us to experiment
with the latest 3D projection mapping technology in combination with our Jackal
robots and open source software. This was our recipe for an augmented reality video
game," said Ryan Gariepy, Co-Founder & Chief Technology Officer at
Clearpath Robotics. "Combining both of our technologies resulted in a one-of-a-kind
experience that was fun to work on and even more fun to play with."

Augmented reality is a term used to describe the superimposing of computer imagery on the real world.

Utilizing Christie's overhead 3D projectors, the Clearpath team created an overlay under their Jackal unmanned ground vehicles to display weapons, recharging shields, hitpoints, and sound effects for a two-player (or human vs. A.I.) game.

For this project, Christie provided four Christie
HD14K-M 14,000 lumens 3DLP® projectors and two cameras. The
projectors use Christie AutoCal™ software and have Christie Twist™ software
embedded right in. Christie rigged the four projectors in a 2 x 2 configuration
on the ceiling of our warehouse. The cameras captured what was happening on the floor and sent that information to the Christie AutoCal™ software, which then automatically aligned and blended the four projectors into one giant, seamless 30-foot projection-mapped digital canvas. The Christie hardware and software, in conjunction with two of Clearpath's Jackal robots and a computer system, allowed the augmented reality experience to take place.

The shadow-fixed repository, for those who do not know, is the staging repository which we use for testing before making new versions public. You can think of it as a place for release candidates. See http://wiki.ros.org/ShadowRepository for information about the process and how you can try out packages from it.

If you use rviz on Indigo or Hydro and have some spare cycles, I would appreciate you testing out the new versions, either from shadow-fixed or by building from source. Any issues you might find, please file them on the rviz issue tracker.

I'll keep the "release candidates" in the shadow-fixed repository for about a week unless we run into problems.

We'd like to invite you to participate in an effort to develop a standard set of messages for communication between robotics components on Micro Air Vehicles (MAVs). At the IROS workshop on MAVs (proceedings) this fall, it was identified that the MAV community has many different implementations of the same capabilities. They are often closely related and almost compatible, but it is rarely easy to switch between different implementations or to use different implementations together. From that discussion it was proposed to work toward building a common way to communicate and enable the MAV community to collaborate most effectively.

To make this happen we have set up a mailing list and wiki pages as a place to coordinate this effort (MAV SIG, mailing list). If you are interested in this topic, we ask that you join, listen, and participate so that we can get as broad a spectrum of voices as possible.

We have chosen the ROS SIG format as it has proven effective at developing standard messages which are used by many people every day. ROS SIGs are relatively unstructured and allow adaptation for differences in each community and process.

We plan to use the ROS .msg format as a way to formalize the messages, since it is a relatively compact way to express messages and has representations in many languages. The most important part of the process will not be the actual .msg files that come out, but the datatypes, which people can rely on being isomorphic when transitioning between systems.
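
As an illustrative sketch, a MAV attitude setpoint in .msg form might look like the following (the message name, fields, and units here are hypothetical assumptions for illustration, not an agreed standard):

```
# AttitudeSetpoint.msg -- hypothetical example, not an agreed standard
Header header     # timestamp and coordinate frame
float32 roll      # rad
float32 pitch     # rad
float32 yaw_rate  # rad/s
float32 thrust    # normalized, 0.0 .. 1.0
```

A definition like this is what the SIG would need to converge on: once the fields and units are fixed, any autopilot or ground station that speaks the type can interoperate.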

Having common datatypes will allow us to have better modularity and interoperability. As an example from the ROS ecosystem, there are 10+ different laser scanner drivers and 18+ different camera drivers (ROS sensors). Because these drivers all use a standard set of messages, a user of those sensors can switch which sensor they are using on their system, or deploy systems with different sensors, and the rest of the system will continue to operate without modifications. There are more complicated examples, such as the navigation stack, which has a standard set of messages for sending commands and providing feedback. This same interface has been used for differential-drive, holonomic, free-flying, and even walking robots.
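
The driver-swap idea can be sketched in plain Python (no ROS required; the driver names, fields, and values below are hypothetical): because both "drivers" emit the same message shape, the downstream consumer works unchanged when the sensor is swapped.

```python
# Minimal sketch of why standard message types enable driver interchangeability.
from dataclasses import dataclass
from typing import List

@dataclass
class LaserScan:
    """Simplified stand-in for a standard scan message (a la sensor_msgs/LaserScan)."""
    angle_min: float      # rad
    angle_max: float      # rad
    ranges: List[float]   # meters

def driver_a() -> LaserScan:
    # Hypothetical driver for one vendor's scanner.
    return LaserScan(-1.57, 1.57, [1.0, 1.2, 0.9])

def driver_b() -> LaserScan:
    # Hypothetical driver for a different vendor's scanner.
    return LaserScan(-2.0, 2.0, [3.0, 2.5])

def closest_obstacle(scan: LaserScan) -> float:
    # Downstream code depends only on the message type, never on the driver.
    return min(scan.ranges)

for driver in (driver_a, driver_b):
    print(closest_obstacle(driver()))
```

Swapping the physical sensor only swaps which driver publishes the message; `closest_obstacle` (and everything downstream of it) is untouched.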

There are already dozens of MAV related ROS packages released and we hope that developing these standard messages can help coordinate the efforts of the many contributors already working on aerial vehicles in ROS.

If you would like to know more, please check out the SIG (LINK). If you're at all interested, please join the process. We've started a thread here to kick off the process.

As we have now released Indigo and are looking forward to Jade, it is time to retire Groovy.

Groovy was first officially released at the end of 2012, but work toward the release had started in early 2012.[1] During its life cycle, Groovy almost doubled its number of released packages, reaching a maximum of 900.

Reviewing the history of the rosdistro repository, which contains the release metadata, reveals that there were 2,912 commits from 127 contributors over the history of the Groovy release. This represents the maintainers making the releases and does not count the many more contributors to the source code of the individual packages. There were commits on 612 different days over the 794 days tracked in this repository, meaning that on average there were releases of Groovy packages more than 5 days per week. For a quick visualization of the activity on the repository, we've put together a rendering of commits to the groovy subdirectory. (These statistics only count catkin-based releases, not the 178 rosbuild packages indexed separately.)

As you may have already noticed, last week we disabled all the Groovy jobs on the farm. We have kept them there for reference but do not intend to re-enable them. Along those same lines, we can accept pull requests to keep source builds working on Groovy (such as when a repository is relocated to a new host), but cannot accept pull requests for new Groovy releases.

As always, we'd like to pay tribute to the hundreds of people who put in the time to make Groovy happen. It would not have happened without your efforts.

Although it's still a long way off, we need to look forward to when and where to hold the next instance. To help facilitate that process, we'd like the community's feedback on what times and locations would best fit their schedules. Please take a minute to let us know where you would be able to join us for our next event.

Clearpath Robotics, an early adopter of ROS, is working with the Open Source Robotics Foundation (OSRF) to determine how the worldwide ROS development community can best be supported. This may be via support services, resources, or tools offered by the OSRF or community members. Now is your opportunity to let us know what you need and how Clearpath and OSRF can work together to best support you. Please take a moment to complete this short survey:

Hello, my name is Ben Arvey and I've been developing a set of analysis tools for ROS under the direction of Dr. Bill Smart. Our lab is giving a talk at ROSCon concerning our research, of which this is one aspect.

I'm looking for some preliminary feedback from developers. Any information about what you need in an analysis tool would be very helpful!

In December 2013, TMC and YASKAWA Benelux set out to make a technology demonstrator.

YASKAWA sees a shift in robotics, from welding, handling, and painting to new application areas. Some recent developments of YASKAWA are 'milking robots' and 'slaughter robots'. And as we are all aware, universities and research institutions are working hard on the introduction of robots in (health) care and other areas where human interaction is present and essential.

A new era of smart robotics is becoming a reality. The development of these new robots poses new technological problems and needs different solutions. It raises questions: How can a robot be programmed to deal with changing environments? How can an intuitive and user-friendly interface be created? How can flexible mechanics be designed to handle different objects just like humans do? And, first and foremost, how do we create smarter safety systems so that robots can safely operate among people?

The demonstrator sets out to accomplish a number of things:

Use a 'higher' software environment like ROS;

Integration of vision to make the robot aware of its surroundings;

Implementation of a flexible gripper to make it possible to perform different tasks;

Use a modern user interface like a tablet or smartphone to control the robot;

Make the robot perform some tasks that are expected from service robotics.

After a brainstorm we came up with the idea that the robot should pick, slice, and squeeze an orange to make fresh orange juice; in addition, it should serve the glass to someone in the audience.

For 3½ months, our team of 12 people shared our Monday evenings and a lot of enthusiasm to build this orange crusher, or as we call her nowadays: (Juicy) Lucy. The team consisted of people with different backgrounds: mechatronics, robotics, embedded software, mechanical design, and electronics. Our deadline was the High Tech Systems Fair on the 7th or 8th of May.

ROS enabled us to quickly prototype and realize our demonstrator. Using ROS we combined several pieces of hardware:

An Android tablet for which we designed two different apps: a user app to choose the amount of orange juice, and an engineering app that allowed us to control the robot gripper, view ROS INFO messages, and control the state machine.

A laptop and a mini PC, where the mini PC was used to perform the image processing.

A webcam, whose images were used to dynamically extract the X, Y coordinates of the oranges.

An Arduino board, used to control the gripper, which was equipped with 4 stepper motors, 4 end switches, and a sonar sensor (the sonar was used to measure the height of the orange).

An example of the system overview can be seen here.

In the end we managed to deliver the first version of our demonstrator, which serves as a platform for future enhancements and added complexity. The fruits of our labor can be viewed in the following video.

We're closing in quickly on the Indigo release. Today our last package was released to fill out ROS desktop and desktop-full. There are a few packages which need to be fixed to make the release ready for testing. In preparation for the upcoming release, we already have 430 packages released and building on the build farm. We expect many more to be released by the final release, planned for later this month.

Also, as a reminder, ICRA and ROSKong are coming up in June. If you are presenting a paper in which you used ROS and think other ROS users would be interested, we would like to feature it on the ROS News blog; please email us at ros-news@googlegroups.com.

And one last reminder that the Indigo Igloo t-shirt is available for only 19 more hours. We've had a great response at the end of the campaign and want to make sure that you don't miss your opportunity.

The results are in from the January 2014 ROS user survey. Thanks to everyone who participated!

We had a total of 336 responses. We'll walk through the questions, one at a time:

In general, for what do you use ROS?

Not surprisingly, the lion's share of ROS users consider themselves to be doing research. That's where we started, and we expect to continue to see high participation in the research community. But we also see about 1/3 of respondents classifying themselves in education and 1/3 in product development, with a smaller share of self-identified hobbyists. Those are all areas for future growth in ROS usage.

What about ROS convinced you to use it?

Interestingly, the top response here is the communications system. When we set out to build ROS, we started with the communications system, because we believe that robotics problems are most naturally solved by developing distributed systems, and further that developing those systems is hard, requiring solid, easy to use tools. It looks like our users appreciate the effort that's been put into ROS middleware.

Also near the top are what we can call the "healthy open source project" benefits: friendly licensing, helpful community, and playing nicely with related open source projects.

How do you primarily use ROS?

Most users are working with a single robot, but a substantial number of people are working with multiple robots, which was outside the initial design of ROS. Multi-robot support definitely needs improvement, but clearly people are already getting something out of ROS in multi-robot environments.

With what type(s) of hardware do you use ROS?

At least in part because most robots in the world (or at least in research labs) are basically cameras and/or lasers on wheels, we see most of our users working on those platforms. But we also see a fair number of people working with arms and hands, and we expect that the number of legged systems will grow in the future.

Have you shared and/or released your own ROS packages?

Here we see a familiar pattern in open source development: most users don't share their code with the community. That's OK with us, because we know that not everybody is in a position to share their code (for example, commercial users who are building ROS-based products). But if you can share code, please do!

Which ROS packages are most important to you?

Here, we have some clear winners. Visualization is important: rviz is a critical piece of infrastructure in our community, and the rqt library of visualization components is also heavily used. Also highly ranked are planning libraries (navigation and MoveIt!), perception libraries (PCL and OpenCV), coordinate transform management (tf), and simulation (Gazebo). Interestingly, we see the OpenNI driver in the top ten, perhaps reflecting the long-standing connection between ROS and Kinect-like devices, dating back to the ROS 3D Contest.

Where should future ROS development focus?

Less clarity here; basically we should do more of everything.

What is your top priority for future ROS development?

The free-form answers we received in response to this question are challenging to quantify. At a high level, here's a qualitative distillation of common themes, in no particular order:

more / better documentation

more / better / more up-to-date tutorials

improved usability

greater stability, less frequent releases

better multi-master / multi-robot support

consolidation of related parts into coherent wholes

better / more mature middleware

better / more attentive maintenance of core libraries and tools

add features and fix bugs in rqt

get to "production quality"

IDE support

real time support

Would you be willing to anonymously report usage statistics?

About half of respondents are willing to install a plugin to roscore that would track and anonymously report usage statistics, which would let us automatically collect data on which packages, nodes, launch files, etc. are most heavily used. Any volunteers to write that plugin?

For more information see the submission on the Matlab Central File Exchange: http://www.mathworks.com/matlabcentral/fileexchange/44853-use-matlab-ros-io-package-to-interact-with-the-turtlebot-simulator-in-gazebo

Thalmic Labs and Clearpath Robotics have joined forces to prove gesture controlled robots are possible. Thalmic Labs, developers of Myo Gesture Control, released the Myo alpha developer unit to Clearpath Robotics for testing. Clearpath has successfully integrated the Myo armband with their Husky Unmanned Ground Vehicle to start, stop and drive the vehicle using simple arm movements.

"There are a lot of interesting applications for using the Myo for robot control and our team is very excited to have the opportunity to work with the Alpha dev unit," said Ryan Gariepy, Chief Technical Officer at Clearpath Robotics. "We've been eagerly following Thalmic's progress and we've got a dozen different robots here we could do some more tests with."

Clearpath Robotics used the Robot Operating System (ROS) for most of the integration work. The Husky software package exposes a standard Twist interface, so the team needed to convert the Myo data into that format. They did so using their experimental cross-platform serialization server in socket mode.

For Myo integration and development, Clearpath Robotics added standard Windows Socket code into the provided Thalmic example code, and then determined the proper mapping from the Myo data to the desired robot velocity using timeouts and velocity limits. Further details on Myo integration cannot be released at this time.
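Clearpath hasn't released their integration code, but the general shape of such a converter, mapping arm orientation to a Twist-like command while enforcing timeouts and velocity limits, might look something like this sketch (every name and limit here is hypothetical, not Clearpath's actual values):

```python
import time

# Hypothetical limits; Clearpath's actual values are not published.
MAX_LINEAR = 1.0    # m/s
MAX_ANGULAR = 0.5   # rad/s
TIMEOUT = 0.5       # seconds without fresh Myo data before we stop

def clamp(value, limit):
    return max(-limit, min(limit, value))

def myo_to_twist(pitch, roll, last_update, now=None):
    """Map Myo arm orientation (radians) to a Twist-like (linear, angular)
    velocity pair. Stale data yields a safety stop."""
    now = time.time() if now is None else now
    if now - last_update > TIMEOUT:
        return 0.0, 0.0  # timeout: command the robot to stop
    linear = clamp(-pitch * 2.0, MAX_LINEAR)   # tilt arm forward to drive forward
    angular = clamp(-roll * 1.0, MAX_ANGULAR)  # roll arm to turn
    return linear, angular

print(myo_to_twist(-0.4, 0.1, last_update=100.0, now=100.1))  # (0.8, -0.1)
print(myo_to_twist(-0.4, 0.1, last_update=100.0, now=101.0))  # (0.0, 0.0)
```

In a real ROS node, the returned pair would be packed into a geometry_msgs/Twist message and published on the Husky's command topic.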

I wanted to highlight some recent changes that I hope people will find useful. Following on the recent new website for ROS, http://www.ros.org/ (which I highly recommend you check out if you haven't already), we've been doing some more housekeeping to make things easier to use across our various websites.

We have updated the CSS styles for the wiki, improving the look and feel of things like buttons and font spacing. We have also added some new information to package pages, for example: http://wiki.ros.org/roscpp_tutorials. On this page you will notice the new badges for "Released", "Continuous Integration", and "Documented". These should give users more information at a glance for packages which are documented on the wiki. Additionally, there is now a "Jenkins Jobs" link in the "Package Links" box on the right-hand side. If you click it, it will expand to list all of the build farm jobs related to the package and their status. We will add more information to the package pages as we can; suggestions and pull requests are welcome.

We have also just launched the http://status.ros.org site. This site gives you an overview of the status of our services as well as some real-time metrics. It is hosted externally, so we can communicate outages and progress on repairs even when our other infrastructure is down. We encourage you to follow @rosorg, or to add the RSS feed or sign up for email notifications on the http://status.ros.org site directly.

While I'm on the topic, I'd like to encourage all maintainers to join the ros-release mailing list to stay up to date on release-specific information and discussions.

I'd also like to encourage everyone to send announcements and updates on their projects to the ros-users mailing list, or to submit them to ros-news@googlegroups.com for posting on the ROS Blog. And if you're blogging about ROS-related content, consider submitting your blog to http://planet.ros.org/, where you can get an RSS feed of ROS-related activities. One of the strengths of ROS is its large user community; sharing project updates and announcements is a great way to contribute to it.

2013 was ROS's strongest year yet, with more and more people releasing packages against both Groovy and Hydro. The packages available for these distros have grown to more than 750 and 850, respectively.

Thank you to everyone who has contributed already. For everyone else, I encourage you to start by making a small contribution, such as answering a question on http://answers.ros.org or updating or extending a wiki page.

Thank you for your responses to the MoveIt! survey. We had a fantastic response, with 105 total respondents by the deadline. According to the survey, MoveIt! is now being used on 65 different robots (listed below), with multiple instances of the most popular ones.

Compiled list of robots running MoveIt! based on survey responses (please point out any duplicates). The list is in alphabetical order and figures in brackets indicate the number of respondents who reported using MoveIt! with that particular robot.

When we started work on ROS, like most young open source projects, our
greatest need was to recruit early adopters and fellow developers. So
we targeted that audience: we built a wiki, filled it with
documentation, tutorials, and code examples, and made the wiki the
landing page at www.ros.org.

Well, times have changed. Now, six years into the project, we have a
broader audience to consider. We want to reach teachers who are
considering using ROS in their classrooms, managers who want to use
ROS in a new product, journalists who are writing stories about ROS,
and many, many others.

So, in celebration (just a bit late) of ROS's sixth birthday, we're
pleased to present a new www.ros.org.

After all, a grown-up ROS deserves a grown-up website. Don't worry:
the wiki is still there, as are all the other
ROS sites on which we depend.

Btw, like most things we do, the website itself is at
GitHub. If you run into a
problem or have an idea for improving the site, open an
issue and we'll have a
look.

We have passed the 150 threshold for our Teespring campaign, and the Hydromedusa t-shirts will be ordered. If you haven't ordered yours yet, you can still do so for 13 more days before the campaign ends.

The success of answers.ros.org is thanks to its many contributors. Answers.ros.org has been running for a little bit over 2 years now and in that time, the community has answered 7283 questions, 73% of the questions asked. That's an average of 10 questions per day for the last two years (including weekends and holidays). Traffic has steadily grown, and recently, users have posted closer to 30 questions per day.

There are now 4399 registered users, 388 of whom have earned over 100 Karma, and 60 of whom have amassed 1000 Karma!

@lorenz, @tfoote, @dornhege, and @joq deserve special recognition, as each of them has earned over 10,000 Karma. Accumulating a Karma stash of this size requires, for example, having your answers upvoted one thousand times.

Congratulations to the answers.ros.org community for making the site the thriving resource that it is today. Keep up the fantastic work, and keep the questions--and answers--coming.

The Bosch Research and Technology Center in Palo Alto, CA is looking for highly motivated robotics researchers and developers interested in contributing to ROS and being part of the PR2 Beta Program as part of our internship program.

A year ago I released my first video of Kinect robotics when I loosely controlled a KHR (mini humanoid).

Now I have "completed" the robot avatar project. A treadmill, HMD, Wii remotes, Kinect, and NAO have all been integrated together using ROS to create a fully immersive experience. I really feel like my "self" is in the place of the robot while using this.

Here is a video demonstrating it, I use the interface to brush my cat remotely.

Actually it looks like the project is not really complete after all... Something I realized when filming this is that I need to add 2-way audio...

Our NAO humanoid plays Jingle Bells for Christmas on a glockenspiel / xylophone. The robot can read a single-track song derived from MIDI and plays it on the instrument. Implementation by Stefan Band and Jonas Delleske.

Robotic Open Platform (ROP) aims to make hardware designs of robots available to the entire robotics community under an open hardware license. It provides the CAD drawings, electrical schematics, and documentation required to build each robot. In the near future, standard electromechanical interfaces between the various robot components will be presented, making it possible to combine hardware components from various groups into one robot. By making the robots modular, users are encouraged to develop their own components that can be shared with the community.

In software, the Robot Operating System (ROS) is nowadays acknowledged as a standard platform and is used by numerous (research) institutions. This open-source software is available to everyone, and by sharing knowledge with the community there is no need to 'reinvent the wheel', which drastically speeds up development. Similarly, Robotic Open Platform (ROP) functions as a platform to make hardware designs available to all research groups within the community.

Thecorpora has made a great commitment to open source with Qbo, a robot that brings great robotics technology to everyday consumers. To work on integration ideas, Qbo and his friend, Francisco Paz, stopped by Willow Garage to meet the team and the Willow Garage robots, such as the PR2. You can check out photos of Qbo hanging out with the PR2 and other robots on Thecorpora's blog.

Also on the way from Thecorpora is a new Android phone application for Qbo that provides telepresence: hear, see, and communicate through Qbo as if you were in the same room. Imagine getting Qbo to go where you want, or directing it using Google's speech recognition software. This new app makes Qbo a telepresence device in any room. For details, see their blog post and video.

Version 0.5.17 of rosinstall has been released. You can update using the commands below. This update contains the new experimental rosws tool, updated --distro and --dev options for the roslocate tool, and numerous bug fixes. Please try it out and provide feedback on the new rosws tool and new roslocate with distro specific options.

Update commands

The good folks at turtlebot.eu have released EU-compatible designs for the TurtleBot power board, as well as metric versions of the TurtleBot trays. They've also adapted the design for consumer Roombas, for those who cannot purchase a Create in Europe.

Thecorpora's Qbo showed off some cloud skills at the Campus Party in Valencia: the Qbo in Valencia was able to learn to recognize Tux, the Linux penguin, using a cloud-based object recognition system. Cloud-based recognition systems enable us to seamlessly access and collaboratively update knowledge about the world. During their live demo in Valencia, an engineer in Madrid was able to teach the image of Tux to the system, which was then accessed by the Qbo in Valencia. For more information on this demo and Qbo, you can check out the Qbo blog.

I Heart Robotics/Engineering has been cranking out TurtleBot accessories, as well as some DIY instructions, so that you can get the most out of your TurtleBot hardware -- whether that means new capabilities or a little bit of flair.

TurtleBot.com has launched! This new site provides access to TurtleBot information and also gives you new ways to access TurtleBot hardware. You can now order parts or assembled kits from several licensed vendors or take advantage of the open-source hardware designs to build your own robot from scratch.

Congrats to the GRASP Lab's PhillieBot for throwing out the first pitch at a Phillies game! PhillieBot is the creation of Professor Vijay Kumar, Jordan Brindza, Jamie Gewirtz, and Christian Moore. It features a Barrett arm on a Segway base, and it runs ROS. They made several modifications to the Barrett arm to get it up to pitching speeds, though the Phillies requested that they limit the pitch to a mild 30-40 mph.

Congrats to the NimbRo@Home team (University of Bonn) on their victory at the RoboCup German Open. During the competition, their Cosero and Dynamaid robots worked together to prepare breakfast. They demonstrated many difficult mobile manipulation tasks, like opening a refrigerator and retrieving orange juice from it, pouring milk into a cereal bowl, fetching a spoon, and recognizing a pointing gesture. They were also able to deal with unknown environments.

The competition was a great demonstration of ROS software being used to solve difficult challenges. ROS, PCL, and OpenRAVE were popular components in the competition -- five out of the eight robots used ROS-related software. The NimbRo@Home robots use ROS for communication as part of their four-layer modular control architecture, which is described in their 2011 paper.

The CCNY Robotics Lab was the first to bring us Kinect drivers for ROS, so it's not surprising that they have some awesome Kinect demos they have been working on.

In the above video, they show some of the latest results of their 6D pose estimation. Simply by moving the Kinect around an office, they are able to register multiple scans together and create a 3D model of the scene. Their code works with no extra sensors: they simply move the Kinect around freehand.

The work was done by Ivan Dryanovski, Bill Morris, Ravi Kaushik, and Dr. Jizhong Xiao. They are using custom RGB-D feature descriptors for the scan registration, with OpenCV, PCL, and ROS under the hood. They are working on releasing and documenting their code. In the meantime, you can check out the rest of the cool software available in ccny-ros-pkg.

MIT's Robust Robotics Group, University of Washington, and Intel Labs Seattle teamed up to produce this demonstration of 3D map construction with a Kinect on a Quadrotor. Their demonstration combines onboard visual odometry for local control and offboard SLAM for map reconstruction. The visual odometry enables the quadrotor to navigate indoors where GPS is not available. SLAM is implemented using RGBD-SLAM.

A set of enterprising University of Waterloo undergrads have combined mobile robotics and 3D visual SLAM to produce 3D color maps. They mounted a Kinect 3D sensor on a Clearpath Husky A200 and used it to map cluttered industrial and office environments. The video shows off the impressive progress and capabilities of their "iC2020" module.

The iC2020 module was created by Sean Anderson, Kirk Mactavish, Daryl Tiong, and Aditya Sharma as part of their fourth-year design project at the University of Waterloo. They formed their group with the goal of using PrimeSense technology to create globally consistent, dense 3D color maps.

Under the hood, they use ROS, OpenCV, GPUSURF, and TORO to tackle the various challenges of motion estimation, mapping, and loop closure in noisy environments. Their software allows real-time views of the 3D environment as it is created. ROS is supported out of the box on the Clearpath Husky, and Sean Anderson noted that "ROS was crucial to the project's success" due to its ease of use and flexibility.

OTL has been a frequent contributor of great Roomba hacks, and this one is no exception. This time he's used a Kinect and a Roomba bluetooth connector to take back control of the vacuum. You can find out more in his blog post (Japanese). His blog is a great Japanese-language resource for getting into ROS.

Their demo is built on an AscTec Pelican with a stripped-down Kinect. To handle the rest of the autonomous flight needs, they use an ADNS 3080 optical flow sensor for position and velocity control, and an SRF10 sonar sensor for altitude control. Sample-consensus algorithms from PCL are used to convert the 3D point cloud data into estimated positions of these surfaces. Remarkably, they managed to make all of this run on an Atom processor.

Their demo uses the Kinect at both the skeleton tracking and 3D point cloud level. The OpenNI skeleton tracker is used to identify the position of the person in the room, and then the 3D point cloud data is used to start building the full 3D scan. Once all of the point clouds are collected, they use PCL to create a unified 3D model.

The UU robot is a custom MetraLabs Scitos G5 mobile robot with a Kinect mounted at the end of a Schunk 7 DOF manipulator, but their code should be adaptable to other robot platforms.

Last December I gave a talk about using ROS on FreeBSD. The slides are available at ftp://rene-ladan.nl/pub/ros-freebsd.pdf . Note that the USB problem mentioned on slide 12 is fixed for FreeBSD 8 and newer :)

The "Teleop Kinect Cleanup" entry into the ROS 3D Contest, by Zoltan-Csaba Marton and Dejan Pangercic of TUM, is a couple of demos rolled into one. Using their entry, you can point at an object on a table and then, in the virtual rviz display, move that object somewhere else like a Jedi. You start with a world that looks like your own, but by the time you're done, you've rearranged a new virtual world to your liking.

That's not all. They've also figured out how to make this useful for giving commands to a robot. After you move around a cup in your virtual world to your liking, a command to move the cup can be passed to a robot. Thus, once you've re-arranged your virtual world, it becomes the job of the robot to make the real world look like your virtual world.

Michael Ferguson is a prolific contributor to ROS. His entry into the ROS 3D Contest is "Improved AR Markers for Topological Navigation". AR markers are a cheap and effective way to find the position of objects in an image using inexpensive cameras. Michael recognized the opportunity to combine these markers with the Kinect, which has both camera and depth data, to turn them into markers in three dimensions. You can even use this to find the position of the robot itself, by attaching markers to known locations in your map.
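The core trick of lifting a marker into 3D is simple: take the marker's pixel location, read the depth at that pixel, and back-project through the pinhole camera model. A minimal sketch of that back-projection (the intrinsics below are illustrative constants, not an actual Kinect calibration):

```python
# Illustrative Kinect-like intrinsics (focal lengths and principal point);
# real values come from the camera's calibration, not these constants.
FX, FY = 525.0, 525.0
CX, CY = 319.5, 239.5

def pixel_to_3d(u, v, depth):
    """Back-project pixel (u, v) with a depth reading (meters) into a
    3D point in the camera frame, using the pinhole model."""
    x = (u - CX) * depth / FX
    y = (v - CY) * depth / FY
    return x, y, depth

# A marker detected at the image center, 2 m away, lies on the optical axis:
print(pixel_to_3d(319.5, 239.5, 2.0))  # (0.0, 0.0, 2.0)
```

With the marker's 3D position in hand (rather than just its image position), it can serve directly as a landmark for localization and topological navigation.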

We encourage you to check out the many different robots that Michael is building, from the iRobot Create and Dynamixel AX-12-based Nelson to the up-and-coming Create + Kinect + tripod Trike. The software for the contest entry along with these robots can be found in albany-ros-pkg, which also contains a Neato XV-11 driver for ROS.

Colin Lea's Anaglyph Viewer entry into the ROS 3D Contest brings a bit of retro 3D to our entries. Colored glasses are an inexpensive way of seeing 3D content on a 2D screen. If you can see the data in 3D, you can become more immersed in the data coming from the Kinect. For example, you can build more effective teleoperation cockpits that take advantage of your ability to see depth. Add more Kinect cameras and you can start becoming fully immersed in a 3D world.

The Kinemmings entry by Alberto Jose Ramirez Valadez, Jonathan Rafael Patino Lopez and Marcel Stockli Contreras, is a take on the classic Lemmings game. Now, it's up to you and your body to guide the Kinemmings safely to their exit.

Kinemmings has the distinction of being the only game entry into the ROS 3D contest. In fact, as far as we know, it may be the first game package in all of ROS. We appreciate it as it means we can now tell our boss that we're "working on ROS".

You have your Kinect and want to mount it on your robot, but now you're faced with a challenge: you need to precisely determine the mounting point of the Kinect so that the data from it can be interpreted correctly; for example, if you want to use it for autonomous navigation.
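Why the mounting point matters: every point the Kinect sees is reported in the sensor's own frame, and must be transformed into the robot's base frame before navigation can use it. In ROS this is tf's job; as a simplified 2D-yaw sketch of the underlying math (function and parameter names here are made up for illustration):

```python
import math

def camera_to_base(point_cam, mount_xyz, mount_yaw):
    """Transform a point from the Kinect's frame into the robot base frame,
    given the sensor's mounting offset (meters) and yaw (radians).
    This is a 2D-yaw simplification of what tf does in full 3D."""
    x, y, z = point_cam
    c, s = math.cos(mount_yaw), math.sin(mount_yaw)
    bx = mount_xyz[0] + c * x - s * y
    by = mount_xyz[1] + s * x + c * y
    bz = mount_xyz[2] + z
    return bx, by, bz

# Kinect mounted 0.2 m forward and 0.3 m up, facing straight ahead:
# an obstacle 1 m in front of the camera is 1.2 m in front of the base.
print(camera_to_base((1.0, 0.0, 0.0), (0.2, 0.0, 0.3), 0.0))  # (1.2, 0.0, 0.3)
```

An error of a few centimeters or degrees in the mount parameters shifts every perceived obstacle by the same amount, which is exactly why careful calibration of the mounting point is worth the effort.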

They also released several lower-level libraries to help build other applications on top: libnabo for fast k-nearest-neighbor search, and libpointmatcher, a modular ICP library. These are important for building tracking applications, as shown in the video, as well as for building SLAM and other systems.

His entry for the ROS 3D contest builds on Taylor Veltrop's teleop control to adapt it for the Pi Robot, as well as add in a base controller and the ability to define new gestures for control. Patrick has also contributed a serializer package for those wishing to use the Robotis Serializer microcontroller in ROS. Pi Robot may be one of a kind, but, thanks to Patrick's contributions, you have the software you need to build your own.

Taylor Veltrop had the first ROS 3D Contest entry with his teleoperation control of a humanoid KHR/Roboard robot. He wasn't content to leave it at that: he beefed up his teleoperation system with Wiimote and leg-based control. He is also running it on an Aldebaran Nao.

One of the difficulties in using the skeleton tracking libraries with the Kinect is that you do not get much information about the hands of the operator. For those trying to use skeleton tracking to control a robot's arms, this creates a pickup problem: you can get the arm to the location where you wish to grab an item, but you don't have the control you need over the angle of the hand and the opening and closing of the gripper to complete the task.

Taylor solves this by letting you hold a Wiimote in each hand. With these additional controls, the operator can seamlessly use the Wiimotes to transmit the additional information about the correct hand position, and use the buttons on the Wiimote to perform additional operations, like opening and closing grippers.

Taylor also collaborated with Patrick Goebel to add leg controls for moving a robot. Placing one leg forwards or backwards moves the robot in that direction. Placing a leg to the side makes the robot turn.
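The leg-gesture scheme described above amounts to mapping foot displacement to a discrete drive/turn command, with a deadzone so a neutral stance keeps the robot still. A hedged sketch of that mapping (the threshold and magnitudes are hypothetical, not Taylor's actual values):

```python
# Hypothetical threshold; the actual value in Taylor's code is not published.
DEADZONE = 0.10  # meters of foot displacement before a command registers

def legs_to_command(forward_offset, side_offset):
    """Map foot displacement (meters, relative to a neutral stance)
    to a (drive, turn) command for the robot base."""
    drive = turn = 0.0
    if abs(forward_offset) > DEADZONE:
        drive = 1.0 if forward_offset > 0 else -1.0  # step forward/back to drive
    if abs(side_offset) > DEADZONE:
        turn = 1.0 if side_offset > 0 else -1.0      # step sideways to turn
    return drive, turn

print(legs_to_command(0.25, 0.0))  # (1.0, 0.0)  step forward -> drive forward
print(legs_to_command(0.0, -0.2))  # (0.0, -1.0) step to the side -> turn
```

The deadzone is the key design choice: skeleton-tracking data is noisy, so small jitters around the neutral stance must not translate into motion.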

You can watch Taylor's new video above, where he puts the Nao teleop through its paces. If you have ever wanted to see a Nao wield a knife, play chess, or grab a tissue out of a box, check it out.

Halit Bener SUAY entered the ROS 3D Contest with this entry demonstrating teleoperation of an Aldebaran Nao using a Kinect. This is not the only entry to tackle teleoperation, but it adds its own unique twists. Most notably, there are pre-defined gestures that let the operator switch between different modes of control. One leg controls starting and stopping the robot. Another lets the operator switch between controlling the body and the head. Your arms can either directly control the robot's arms or issue other commands, like directing the robot's gaze. All in all, it's a great demo of how we can go completely remoteless and still control a complex, walking robot like the Nao.

The University of Freiburg team has put together an impressive 6D-SLAM library as their entry into the ROS 3D Contest. By taking advantage of the additional 3D data that a Kinect provides, they've set a new benchmark for the state of the art in the field. It's also a great demo that we can all try ourselves: pick up your Kinect, move it around, and build 3D models of your world.

You can go ahead and check out the entries yourself. In most cases, you should even be able to download and try them out on your own Kinect or PrimeSense device.

While we tally the results, we'll spotlight the entries here.

First off are Garratt Gallagher's entries. Garratt was our most prolific entrant, producing a total of five separate entries. Each is worth its own blog post, and many of them have already been featured here:

We're grateful that Garratt has taken the time not only to enter the contest, but to go the extra mile to make sure that others can try out his libraries and build on his creative ideas. If you like what you see, you should consider helping out with his Bilibot project, a low-cost Kinect + Create platform.

Garratt's newest entry is "Customizable Buttons". Using the Kinect, you can draw on a piece of paper to create your own music board. It's a lot of fun, as you'll see in the video:

Bill Mania gave an introductory presentation on ROS at ChiPy, the Chicago Python users group. He also gave a demo of his RoboMagellan robot that he's bringing up on ROS. This is a good overview for those of you just getting into ROS, especially from a Python perspective.

We think that ROS and the PR2 are great tools for educators. Both platforms allow students to focus on building the relevant parts of a system while incorporating less topical components from the open source community. Students get started faster and complete more impressive projects. Even more importantly, students can take components built in ROS to their next course, research project or job without worrying about licensing.

We've started a wiki page to list courses using ROS or the PR2, and to discuss teaching-related issues. Here are some course examples that you can use for inspiration:

If you're teaching a course using ROS or the PR2, please post a link at ros.org/wiki/Courses. If you have advice on setting up labs, course computers, or any other teaching-related topic, post those too. By sharing material, we'll all create effective courses more quickly.

Taylor Veltrop has made the first entry to our ROS 3D Contest. He uses the Kinect and NITE to put a Kondo-style humanoid through pushups, waves, and other arm-control gestures. Great work! We look forward to seeing more entries.

Hi everyone!

Please take a look at my entry in the Kinect/RGB-D contest! I'm
really happy with how it's turned out so far.

It's a small humanoid hobby robot by Kondo with a Roboard running
ROS. The arms are controlled master/slave style over the network by a
Kinect.

For Kinect/OpenNI users and VSLAM researchers, we're working on integrating Hauke Strasdat's ScaViSLAM framework into ROS. ScaViSLAM is a general and scalable framework for visual SLAM and should enable exciting applications like constructing 3D models of environments, creating 3D models of objects, augmented reality, and autonomous navigation.