5

I've made a robot, controlled by Arduino and Processing, that moves around a room by rolling itself like a sphere.

What I need is to get its new location after it moves across the floor (say, within a 3 m x 3 m room). I'm using a 9DOF sensor (3 axes of accelerometer data, 3 axes of gyroscope data, and 3 axes of magnetometer data) to determine its roll, pitch, and yaw, and also its heading.

How can I accurately identify the robot's location in Cartesian (x, y, z) coordinates relative to its starting position? I cannot use GPS, since the movement is less than 20 cm per rotation and the robot will be used indoors.

I found some indoor ranging and 3D positioning solutions, such as Pozyx or using a fixed camera. However, I need something cost-efficient.

Is there any way to convert the 9DOF data into the new location, or another sensor that can do that? Or any other solution, such as an algorithm?

Apollon1954
  • Integrating acceleration gives velocity, and integrating velocity gives position. It won't be very accurate, though. – Don Reba Aug 19 '15 at 22:02
  • @DonReba Do you know any way of doing that? Any tutorial or article about this? – Apollon1954 Aug 19 '15 at 22:03
  • @DonReba Do you know any other more accurate way? – Apollon1954 Aug 19 '15 at 22:29
  • Integration is simply multiplying each measurement by the time since the last and adding up. Of course, if you have angular acceleration, you have to convert it to linear, but that's also easy. There are probably ways of reducing error using the other sensors. This is just the simplest method. – Don Reba Aug 19 '15 at 23:55
  • If your robot rolls without sliding, using the motor rotation may be more accurate. With 9DOF you can compute a position from the starting point, but all the error will accumulate and quickly become greater than the value itself. Another issue will be positioning the sensor correctly at the center of the sphere. – Ôrel Aug 20 '15 at 02:31
  • @Ôrel The sensor is positioned at the center of my robot. The rotation depends on 4 motors with their lead screws; a combination of those is used to rotate the robot. – Apollon1954 Aug 22 '15 at 17:24
  • this reminds me of bublcam http://www.bublcam.com/ – DXM Aug 28 '15 at 23:44

2 Answers

6

As pointed out in the comments, integrating acceleration gives velocity, and integrating that again gives position. This is, however, not very accurate, since errors accumulate in no time.
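
To make that concrete, here is a minimal C++ sketch of the naive dead-reckoning approach. The `readAcceleration()` helper is hypothetical (a stand-in for your IMU driver, already rotated into the world frame with gravity removed); the point is only to show the two integrations and why a small constant bias grows quadratically into a large position error.

```cpp
#include <array>
#include <cstdio>

// Hypothetical sensor read: world-frame acceleration in m/s^2.
// In practice you would rotate the body-frame IMU reading into the
// world frame using the estimated orientation and subtract gravity.
std::array<double, 3> readAcceleration() {
    return {0.0, 0.0, 0.0};  // placeholder
}

int main() {
    const double dt = 0.01;                 // 100 Hz sample period
    std::array<double, 3> vel = {0, 0, 0};  // m/s
    std::array<double, 3> pos = {0, 0, 0};  // m, relative to start

    for (int step = 0; step < 1000; ++step) {  // 10 s of data
        std::array<double, 3> acc = readAcceleration();
        for (int i = 0; i < 3; ++i) {
            vel[i] += acc[i] * dt;  // integrate acceleration -> velocity
            pos[i] += vel[i] * dt;  // integrate velocity -> position
        }
    }
    std::printf("position: %.3f %.3f %.3f m\n", pos[0], pos[1], pos[2]);
}
```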

Instead, what people use is "sensor fusion", which combines the data from several sensors into a better estimate of, for example, the position. It will still accumulate error over time if you rely on the accelerometer and gyro alone. The magnetic vector will help you, but the result will probably still be inaccurate.
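
One of the simplest forms of sensor fusion is a complementary filter. The sketch below is my own illustration (not from your setup): it fuses a gyro yaw rate with a magnetometer heading, so the gyro gives short-term smoothness while the magnetometer removes long-term drift. `gyroRateZ()` and `magHeading()` are assumed placeholder functions for your sensor drivers, and angle wrap-around handling is omitted for brevity.

```cpp
// Assumed helpers: yaw rate from the gyro (rad/s) and absolute heading
// from the magnetometer (rad); both are placeholders for real driver calls.
double gyroRateZ()  { return 0.0; }
double magHeading() { return 0.0; }

// Complementary filter: trust the integrated gyro in the short term,
// pull slowly towards the magnetometer heading to cancel drift.
double fuseYaw(double yaw, double dt, double alpha = 0.98) {
    double gyroYaw = yaw + gyroRateZ() * dt;  // smooth, but drifts
    double magYaw  = magHeading();            // noisy, but drift-free
    return alpha * gyroYaw + (1.0 - alpha) * magYaw;
}

int main() {
    const double dt = 0.01;  // 100 Hz update rate
    double yaw = 0.0;
    for (int i = 0; i < 1000; ++i) {
        yaw = fuseYaw(yaw, dt);
    }
}
```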

I found the following guide online, which gives an introduction to sensor fusion with Kalman filters on an Arduino:

http://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1114&context=aerosp

Warning: you need to know some math to get this up and running.
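
To give a feel for that math, here is a minimal one-dimensional Kalman filter (my own toy example, not the full IMU filter from the linked guide). It estimates a single value from noisy measurements; the 9DOF case uses the same predict/update structure, just with matrices instead of scalars.

```cpp
#include <cstdio>

// Minimal scalar Kalman filter: estimates a slowly varying state x
// from noisy measurements z. q = process noise, r = measurement noise.
struct Kalman1D {
    double x = 0.0;   // state estimate
    double p = 1.0;   // estimate variance
    double q = 0.01;  // process noise variance
    double r = 0.5;   // measurement noise variance

    double update(double z) {
        p += q;                  // predict: uncertainty grows
        double k = p / (p + r);  // Kalman gain
        x += k * (z - x);        // correct with the measurement
        p *= (1.0 - k);          // uncertainty shrinks
        return x;
    }
};

int main() {
    Kalman1D kf;
    const double noisy[] = {1.2, 0.8, 1.1, 0.9, 1.05};  // fake measurements
    for (double z : noisy) {
        std::printf("estimate: %.3f\n", kf.update(z));
    }
}
```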

Sigurd V
  • Thank you for your answer. I checked the attached paper; however, as I understand it, it does not calculate the XYZ coordinates. – Apollon1954 Aug 20 '15 at 21:25
2

My answer does not include a specific implementation, and my expertise is not in robotics (I'm a researcher in the machine learning / NLP / AI field). However, I believe my high-level suggestion may still be useful, because your problem is still stated at a general level.

SLAM (Simultaneous Localization and Mapping) is the best-known field that studies how to estimate a robot's sequence of locations from its sensory-motor data, and there is a large body of work on exactly this problem.

Researchers have studied SLAM methods for various specific situations, such as slippery floors, rooms with complex shapes, or noisy sensors. I think your current setting is a little less specific than the ones in that research.

So, if I were you, I would start by trying some standard SLAM methods: pick a few popular, general methods from a SLAM textbook and look for open-source software that implements them.

As far as I know, the particle filter (PF) is one of the most popular and successful methods in the SLAM field. The PF can be seen as a more flexible relative of the Kalman filter (KF), and it is very easy to implement; the math is much simpler than the KF's. I think the PF is worth trying in your situation.
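
To show how small the core loop really is, here is a toy one-dimensional particle filter sketch (my own illustration, not tied to any SLAM library): particles are moved by the commanded motion plus noise, weighted by how well they explain a noisy position measurement, and then resampled. The measurement here is a fake simulated reading standing in for whatever external reference you have (e.g. a beacon range).

```cpp
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937 rng(42);
    std::normal_distribution<double> motionNoise(0.0, 0.05);  // m
    const double measSigma = 0.1;                             // m

    const int N = 500;
    std::vector<double> particles(N, 0.0);  // 1-D positions, all at start
    std::vector<double> weights(N, 1.0 / N);

    double truePos = 0.0;
    for (int step = 0; step < 20; ++step) {
        const double u = 0.2;  // commanded motion: 20 cm per step
        truePos += u;
        // Fake noisy measurement of the true position (simulated here).
        std::normal_distribution<double> meas(truePos, measSigma);
        double z = meas(rng);

        // 1) Predict: move each particle with the command plus noise.
        for (double& p : particles) p += u + motionNoise(rng);

        // 2) Update: weight each particle by the measurement likelihood.
        double sum = 0.0;
        for (int i = 0; i < N; ++i) {
            double d = particles[i] - z;
            weights[i] = std::exp(-0.5 * d * d / (measSigma * measSigma));
            sum += weights[i];
        }
        for (double& w : weights) w /= sum;

        // 3) Resample: draw a new particle set proportional to the weights.
        std::discrete_distribution<int> pick(weights.begin(), weights.end());
        std::vector<double> resampled(N);
        for (int i = 0; i < N; ++i) resampled[i] = particles[pick(rng)];
        particles.swap(resampled);

        // Estimate = mean of the particles.
        double est = 0.0;
        for (double p : particles) est += p;
        est /= N;
        std::printf("step %2d  true %.2f  estimate %.2f\n", step, truePos, est);
    }
}
```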

rkjt50r983