Hopefully I can help with this, since this is currently my area of work. It turns out your question is actually a huge area of research in engineering (there are entire conferences/journals, like the IEEE IPIN conference, dedicated to this topic). There is no easy answer, and it depends on your specific application/environment.

You can use the IMU to do dead reckoning, but you'll almost certainly need some sort of external sensor (lidar, optical flow, camera, etc.) to continuously correct the measurements and prevent drift. People have even used various RF or Wi-Fi tags to track position.

Depending on how much work you want to put into this, your level of experience, and the accuracy you need, there are several different ways to do this. If you want to research on your own, put "UAV indoor navigation" into Google Scholar. Feel free to message me if you want to discuss this more; I'd love to help.

@mickael I'd be happy to talk. I have some experience using ultrasonic sensors for indoor navigation; send me a message with more specifics about your project: the type of vehicle, the type of sensor, the physical environment you want to navigate in, your end goal, etc.

Or better yet, share it on the forum so everyone can share/contribute.

So I am using the Erle-Copter with the Erle Brain 2 Escarlata image, and I don't have the GPS module. My goal is to fly indoors automatically, like in a house, and I want to move step by step:

1st step: Get a stable copter using the ultrasonic sensors against two walls and the floor or the ceiling.
2nd step: Start to move by going along the walls.
3rd step: Pass doors and stairs.

So I currently have 6 sensors (maybe too many, but better too many than too few). They are this type of sensor: https://www.robot-electronics.co.uk/htm/srf02tech.htm, connected on an I2C bus, one facing each direction. I am currently wondering how I will collect the data with ROS (firstly) fast enough to stabilize the copter (secondly).
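For reading the SRF02 over I2C, the datasheet linked above defines the protocol: write command 0x51 ("real ranging mode, result in centimetres") to the command register, wait out the ranging cycle (up to ~66 ms per the datasheet), then read the two result bytes. Here's a minimal Python sketch; the smbus usage and the 0x70 default 7-bit address follow that datasheet, but the function names are mine:

```python
import time

# Registers/commands from the SRF02 datasheet
CMD_REG = 0x00    # command register
RANGE_CM = 0x51   # real ranging mode, result in centimetres
RESULT_HI = 0x02  # range result, high byte
RESULT_LO = 0x03  # range result, low byte

def srf02_range_cm(high_byte, low_byte):
    """Combine the two result registers into a distance in cm."""
    return (high_byte << 8) | low_byte

def read_srf02(bus, addr=0x70):
    """Trigger one ranging cycle and return the distance in cm.
    `bus` is an smbus.SMBus instance; 0x70 is the SRF02 factory
    default 7-bit address."""
    bus.write_byte_data(addr, CMD_REG, RANGE_CM)
    time.sleep(0.07)  # ranging takes up to ~66 ms
    hi = bus.read_byte_data(addr, RESULT_HI)
    lo = bus.read_byte_data(addr, RESULT_LO)
    return srf02_range_cm(hi, lo)
```

With six sensors on one bus you'd likely want to fire them one at a time rather than simultaneously, since overlapping ultrasonic pings can cross-talk; publishing each result as a ROS `sensor_msgs/Range` message would be the natural next step.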

I want to make a copter with no add-ons on the walls, floor, or ceiling.

I need the best parameters for this project; to begin, I will move slowly and get faster later. I know this platform is not the best for indoor flight, but this is the first step for further development.

There are plenty of tutorials online about integrating I2C sensors with ROS; take a look at those. Also, you wouldn't be using the ultrasonic sensors to stabilize flight; that is done with the onboard IMU. The 750,000+ lines of ArduPilot code take care of this. What you want is to track your position using the ultrasonic sensors, not necessarily to use them to stabilize and fly the drone.

The way I see it, you have two choices for the localization. First, you can use the ultrasonic sensors alone (which would be okay for simple flights in straight lines with no tilt). The second, more sophisticated option is to implement a dead-reckoning algorithm on the IMU data and use the ultrasonic data to correct it and prevent drift, as well as to do some mapping.
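The second option can be sketched in one dimension: double-integrate acceleration to dead-reckon, then blend in the ultrasonic-derived position with a simple complementary filter to cancel drift. This is a minimal sketch, not a flight-ready implementation; the function names and the blend factor are mine:

```python
def dead_reckon_step(pos, vel, accel, dt):
    """One Euler step of dead reckoning:
    integrate acceleration into velocity, velocity into position."""
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

def fuse_ultrasonic(pos, range_pos, alpha=0.1):
    """Complementary correction: nudge the integrated position
    toward the ultrasonic-derived one to cancel accumulated drift.
    alpha trades trust in the IMU (low) vs. the ranger (high)."""
    return (1 - alpha) * pos + alpha * range_pos

# Tiny simulation: constant 1 m/s^2 acceleration for 1 s
pos, vel = 0.0, 0.0
for _ in range(10):
    pos, vel = dead_reckon_step(pos, vel, accel=1.0, dt=0.1)
```

In a real loop you'd call `dead_reckon_step` at the IMU rate and `fuse_ultrasonic` whenever a fresh (orientation-corrected) range reading arrives.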

The problem with using the ultrasonic sensors alone is that once your drone starts doing more complex flights involving tilts or rotations, you're no longer looking straight at the walls/barriers, and the ultrasonic data is not very useful on its own. You have to extract the orientation of the drone in order to make sense of that data. This involves using the accelerometer, gyroscope, and even the magnetometer to estimate the orientation, and then filtering the raw data to get useful information. Since you're already doing the work of constantly tracking the drone's orientation, you might as well start dead reckoning to add accuracy to your localization. (Dead reckoning is simply tracking location by constantly double-integrating acceleration and updating position.) You can use the newly filtered ultrasonic data to track your velocity relative to the walls, and then use that to correct the drift in your dead-reckoning algorithm.
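To illustrate why orientation matters: under the simplifying assumption that the beam still hits the same flat wall, a tilted sensor reads the slant range, and the true perpendicular distance is recovered by projecting with the tilt angle from your IMU orientation estimate. A hypothetical helper (the name is mine):

```python
import math

def wall_distance(slant_range_cm, tilt_rad):
    """Project a slant ultrasonic reading onto the wall normal.
    Assumes the beam still hits the same flat wall and tilt_rad
    comes from the IMU orientation estimate."""
    return slant_range_cm * math.cos(tilt_rad)
```

For example, a 100 cm reading taken while tilted 60 degrees off the wall normal corresponds to only ~50 cm of actual clearance, which is exactly the kind of error that makes uncorrected ultrasonic data misleading.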

Obviously that's a lot to take in at once. My advice would be to first get the drone flying with an RC. (Always have an RC in case you need to take control unexpectedly during an autonomous flight.) Once you have that down, start doing simple flights (back and forth in a straight line) and use one ultrasonic sensor to track your position in one dimension. Then extract your one-dimensional speed. Then start looking into orientation tracking, quaternions, and dead-reckoning algorithms. It'll come slowly, but it is definitely worth it.
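The "track position in one dimension, then extract speed" step above can be sketched as a finite difference over successive range readings, with a bit of exponential smoothing since ultrasonic readings are noisy. Names and the smoothing factor are mine, a sketch rather than a tuned filter:

```python
def range_velocity(prev_range_cm, curr_range_cm, dt):
    """Finite-difference speed from two range readings (cm/s).
    Positive means moving away from the wall."""
    return (curr_range_cm - prev_range_cm) / dt

def ema(prev_estimate, sample, alpha=0.3):
    """Exponential moving average to tame sensor noise.
    Higher alpha = faster response, noisier estimate."""
    return (1 - alpha) * prev_estimate + alpha * sample
```

In practice you'd feed each new `range_velocity` sample through `ema` before using it for anything downstream.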

Well, I understand what you are saying, but if I override the RC as in teleoperation, the ArduPilot code will still be working, right? So it will be as if I were using the RC remote: I will send my commands based on my sensors, and the ArduPilot code will take care of the stabilization. Is this correct?

Glad to see this discussion taking off. I'm quite interested in the topic as well, particularly in navigation, so I'd be happy to contribute. For those interested, there's a fantastic ROS talk about navigation from Clearpath Robotics here.

I find navigation based on reinforcement learning techniques particularly fascinating (in other words, having the robot learn to navigate by itself). It's pretty amazing to see how, with a few inexpensive sensors (e.g., a few range finders or a camera), one can get decent results. I've put together, in my free time, a series about RL, and my intention is to connect it with robots at some point, tackling the navigation problem.

The ArduPilot code is always running, no matter what. It doesn't matter whether you're using an RC or flying autonomously; ArduPilot is what keeps the drone in the air. So, to answer your question: yes, you don't need to worry about code to stabilize the drone; you simply send commands to it.
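To make "you simply send commands" concrete, here's a hypothetical sketch of the kind of value you'd compute: a proportional controller mapping a position error to an RC-style pulse width. The 1100/1500/1900 µs values are typical RC endpoints and stick-center, but the gain and names are mine, and actually delivering the value would go through whatever RC-override interface your setup exposes:

```python
def p_controller_to_rc(error_cm, kp=2.0, center_us=1500,
                       min_us=1100, max_us=1900):
    """Map a position error (cm) to an RC-style pulse width (us).
    1500 us is stick-center; 1100/1900 are typical RC endpoints.
    ArduPilot handles stabilization; we only bias the stick."""
    cmd = center_us + kp * error_cm
    return max(min_us, min(max_us, int(cmd)))
```

Zero error yields stick-center, and large errors saturate at the endpoints, so the override can never command anything outside the range a physical RC stick could produce.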