Thanks for your help! I already read through this tutorial and managed to implement my own State Machine; all of my objects now move randomly within my environment. The next part I have to solve is the collision detection between the Husky robot and the dynamic obstacles, which I haven't found any tutorials for. Do you have some literature suggestions for me to read through as well?

Another problem I am facing is the following (maybe also related to collisions): I'd like to add a LIDAR sensor to the Husky robot, in order to get distances to every obstacle sensed by the robot. I found a Gazebo tutorial (https://www.clearpathrobotics.com/2013/11/husky-simulation-in-gazebo/) which explains how to add such a LIDAR sensor to the Husky robot. However, the tutorial requires a .urdf.xacro file of the Husky robot, into which I am supposed to add the lines it prints. The problem is, I couldn't find such a file. The only Husky-specific files I found were model.config and model.sdf (under $HBP/Models/husky_model/).

Do you have any suggestions on how I could implement the suggested LIDAR sensor in the model.sdf file, in order to add it to our Husky robot? Unfortunately, I am not very familiar with ROS and Gazebo, which is why I'm asking for help here.

I managed to add the LIDAR sensor to the model.sdf file of my Husky robot and attached it via a fixed joint. However, I could use some help on how to implement the Transfer Function, in order to actually use the sensor, constantly turn it around and get the measured sensor values. It would be great if you could help me out here!
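For reference, the relevant part of my model.sdf looks roughly like this (the link/joint names, pose and beam parameters are just my own choices, not from the tutorial):

    <link name="lidar_link">
      <pose>0 0 0.3 0 0 0</pose>
      <sensor name="lidar" type="ray">
        <update_rate>10</update_rate>
        <visualize>true</visualize>
        <ray>
          <scan>
            <horizontal>
              <samples>32</samples>
              <min_angle>-1.57</min_angle>
              <max_angle>1.57</max_angle>
            </horizontal>
          </scan>
          <range>
            <min>0.1</min>
            <max>10.0</max>
          </range>
        </ray>
      </sensor>
    </link>
    <!-- fixed joint so the sensor link moves with the chassis -->
    <joint name="lidar_joint" type="fixed">
      <parent>base_link</parent>
      <child>lidar_link</child>
    </joint>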

Great news on the sensor, but I am afraid that what you just did only added the geometric model on top of the Husky and does not yet receive data from the sensor. If that is the case, I would recommend looking for another tutorial which shows how you can read the data of the LIDAR sensor from ROS. Once you have the data in ROS, it is trivial to get it from a TF. Let me know when you have made some progress or if you are stuck.
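To give you an idea, a minimal TF for that would look roughly like this (written in the NRP transfer function editor, which already provides nrp, Topic, sensor_msgs and clientLogger; the topic name /husky/laser/scan is just a placeholder until you know what Gazebo actually publishes):

    @nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
    @nrp.Robot2Neuron()
    def read_lidar(t, lidar):
        # lidar.value holds the latest LaserScan message,
        # or None before the first message has arrived
        if lidar.value is not None:
            clientLogger.info("closest obstacle: %f" % min(lidar.value.ranges))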

Unfortunately, I couldn't find any further information on how to receive the sensor data. My question is: can I implement the Gazebo plugin as explained in the last two chapters of the tutorial you gave me (http://gazebosim.org/tutorials?tut=guided_i1), connect it to ROS and then somehow use it in the NRP transfer function?

I don't have any experience with ROS or Gazebo, so I'm quite stuck here.

Maybe you are lucky and ROS already knows that the sensor data is there. That would be the most fortunate case, so I would recommend checking that first. A simple test can be done either from the ROS console in the NRP or from a plain Linux terminal. You just type

rostopic list

and this shows you a list of the available topics from which you can read robot sensory data. If you are lucky, something like /gazebo/robot/laser/scan, or anything at all relevant to the LIDAR, should appear in the list. If you cannot find anything, paste the output of the command here and I can take a look.
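If a laser topic does show up, you can also print a single message from it to verify that data is actually flowing (replace the topic name with whatever appeared in your list):

    rostopic echo -n 1 /gazebo/robot/laser/scan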

I ran rostopic echo on my laser scan topic, which prints me a vector of 32 range values. I guess that's because there are 32 beams in my sensor. So far so good. However, I don't see the laser beams visualized inside the NRP. Whenever I load the sensor model into an empty Gazebo world, all 32 beams are shown, which I would like to have for better visualization. Do you have any further ideas on how to accomplish that?

I am very happy that you managed to get the sensor data. For the visualization part, I am afraid there is no easy way to do it in the NRP. I would ask @lguyot to comment as well in case he has another idea. If you want to go down the long road of implementing the visualization yourself, we could guide you through it and then you could contribute to our code!

An easy solution would be to have a Gazebo client running simultaneously with the platform; then you could see the rays there, of course with the overhead of running two separate, computationally intensive processes.

Thanks again! I would be interested in implementing the visualization myself. However, I am currently working on a project due on Sunday, 24th September, and am studying for exams as well. As soon as I am done with my exams (start of October) I would like to try the implementation, if you could guide me through it.

For now, I would like to try the easy solution: is it possible to run a Gazebo client simultaneously with the platform and see in Gazebo everything that happens in the NRP simulation (together with the functionality implemented in State Machines and Transfer Functions)? Or would it just be a visualization of the robot with its beams?

We could discuss the technical details when you have some time. In fact, we are actively looking for master students to work with us on their theses. If you are interested I would be more than happy to discuss.

Back to our topic: running a standalone Gazebo client is done from the command line. Just type gzclient and a Gazebo GUI should open. Whatever you do to the simulation from the NRP (State Machines, TFs, etc.) is immediately reflected in the Gazebo client.
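For example (the export is only needed if the NRP backend runs on a different machine; <backend-host> is a placeholder, and 11345 is Gazebo's default master port):

    # point the client at the running gzserver if it is not on localhost
    export GAZEBO_MASTER_URI=http://<backend-host>:11345
    gzclient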

The problem is, sometimes I receive an array with all the measured ranges, and sometimes I get an error: "'NoneType' object has no attribute 'ranges'". It did work the other day, without me changing anything. It feels like I only get the range values inside the Transfer Function via lidar.value.ranges if I simultaneously listen to the topic in a ROS terminal, which seems kind of strange. Do you see my mistake here?
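For reference, this is roughly how I access the sensor in my TF (topic name shortened):

    @nrp.MapRobotSubscriber("lidar", Topic('/husky/laser/scan', sensor_msgs.msg.LaserScan))
    @nrp.Robot2Neuron()
    def get_ranges(t, lidar):
        # this line raises the NoneType error whenever lidar.value is None,
        # i.e. when no LaserScan message has been received yet
        clientLogger.info(lidar.value.ranges)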