robots.net blog for motters
http://robots.net/person/motters/
mod_virgule feed (en-us), last built Fri, 27 Oct 2017 20:33:58 GMT

Mon, 2 May 2016 21:25:49 GMT
http://robots.net/person/motters/diary.html?start=85
So, it's a very long time since I last posted here, and it looks as if the site will be closing. What happened to my projects?

Well, I stopped doing robotics around 2013. It was a deliberate choice to re-focus my effort on other things which I thought were of higher priority, namely more general self-hosting of internet services along the lines of FreedomBox.

Between 2011 and 2013 I did a lot of development on the Turtlebot and also on GROK2, a very large robot about as tall as myself. Those projects were outrageously successful compared to anything I'd done previously, mainly thanks to ROS and the Kinect sensor. I devised a simple home navigation system which used a combination of button presses and text-to-speech, so that you could direct the robots to particular locations in the house. Navigating through doorways was sometimes hazardous, but most of the time it worked.

I didn't completely lose interest in robotics or AI in 2013, and I think it's still very likely that in the coming years I'll return to some new robotics project. There are many challenges still to be overcome.

Mon, 20 Jun 2011 17:37:49 GMT
http://robots.net/person/motters/diary.html?start=84
Fitted a D525MW mini-ITX motherboard to the robot, and installed Linux Mint 11 and ROS onto a 16GB USB flash drive.

https://sluggish.homelinux.net/wiki/File:Mini-itx1.jpg

This makes a good minimalist onboard computer, and was considerably cheaper than buying an equivalent netbook. To set everything up I connected a keyboard, mouse and monitor as usual, but once the motherboard was installed on the robot it only requires the wifi adaptor and USB drive to be connected. I deliberately avoided using a hard drive (although I have a couple of old ones available), based upon bad experiences with mobile robots and hard drives in the past. I also reused some old PC speakers which haven't seen the light of day for probably more than a decade. You never know when such things may come in useful.
https://sluggish.homelinux.net/images/f/f2/Grok2_electrical2.jpg

One trick with running the OS from a flash drive is to delete the existing casper-rw file, then create a partition labelled casper-rw. This enabled me to make use of the full USB drive, rather than being limited to 4GB of persistent storage.
The user interface of the robot currently consists of buttons and audio. When you press buttons the robot says something appropriate, so it's not so much a graphical user interface as an audio user interface. For the sorts of tasks I envisage the robot doing this is quite adequate, although if more elaborate instructions were needed I could add a small screen of some sort (finances permitting).
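The idea can be sketched as a simple button-to-phrase table. This is a hypothetical minimal version, not the robot's actual code: the button codes and phrases are invented, and speech output on the robot would go through some TTS engine (for example espeak) rather than print.

```python
# Hypothetical button-to-phrase mapping for an audio user interface.
# Button codes and phrases are illustrative, not the robot's actual ones.
BUTTON_PHRASES = {
    1: "Going to the kitchen sink.",
    2: "Going to the kettle.",
    3: "Going to the table.",
    4: "Stopping.",
}

def phrase_for_button(code):
    """Return the sentence to speak for a button press."""
    return BUTTON_PHRASES.get(code, "Sorry, I don't know that button.")

def speak(text):
    # On the robot this might shell out to a TTS engine, e.g.
    # subprocess.call(["espeak", text]); here we just print the text.
    print(text)

speak(phrase_for_button(2))
```

The appeal of this design is that the whole interface is one lookup table, so adding a new task is just adding a row.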
With the robot running I can then use either VNC or ssh to debug code or run different programs.

There's some tidying up remaining to be done on the head of the robot, since the Kinect sensor's circuit boards are exposed and vulnerable to collisions. I'll devise some sort of covering to go over them.

An initial localisation test run done earlier today indicates that everything seems to be working as expected, and the new computer can handle the processing demand. Even for a relatively simple differential drive robot like this there are a considerable number of electrical connections, and there's always some degree of trepidation over whether I've connected them back in the right order. Labelling everything helps a lot.

Mon, 13 Jun 2011 11:57:46 GMT
http://robots.net/person/motters/diary.html?start=83
After a day of hacking, bashing and drilling I've slimmed down the GROK2 robot, reducing its width by 40mm on either side. This robot has an AL-101 chassis (Zagros Robotics) and fortunately it's made from 3mm aluminium, which is just about sawable with some exertion.

This should give the robot more clearance when passing through doorways. It's still wide enough for a netbook, a mini-ITX or even a full sized motherboard, but it's no longer wide enough to carry the laptop - at least not in the usual orientation.

It's always been the plan to eventually have some permanent onboard PC, and at present it looks as if netbooks are just not quite up to the job unless they're the latest and most powerful devices (which are expensive). So I might have a go at installing a mini-ITX board, which is much cheaper than a high end netbook. I could then use a laptop or netbook to ssh into the robot. I have a couple of spare SATA hard drives which could be used, and also a couple of USB wireless adaptors. Another advantage of mini-ITX boards is that they can run off a 12 volt supply, which avoids the wasteful DC->AC->DC conversion.

All in all the future for robotics is looking very good, particularly for low hanging fruit applications such as fetch-and-carry or just hauling stuff around. I think it would be quite feasible to build a prototype shop/supermarket shelf stacking robot, and also to add an autopilot feature to mobility scooters or wheelchairs.

Thu, 9 Jun 2011 23:05:54 GMT
http://robots.net/person/motters/diary.html?start=82
It has been a while since my last blog entry here. As far as ambient events are concerned, I continue to be an unemployed software engineer, with the prospects of re-employment looking increasingly remote, but in terms of robotics projects things are going very well indeed. In the last six months, using ROS and the Kinect sensor, I've made more progress than I'd made over the previous five years of SLAM and stereo vision development.

The GROK2 robot is now navigating well from one room to another. Tuning the localisation parameters took a while, but now the movement looks quite smooth and decisive. I've been able to have the robot navigate reliably to various locations in the kitchen, such as the sink, kettle and table. It doesn't have any arms presently, but if I can get some object recognition going then adding an arm would be the next logical step. It's easy to become complacent, but the current level of navigation performance was, until only a few months ago, merely a vague ambition somewhere in the future.
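Navigating to named places boils down to a table of goal poses in the map frame. The sketch below is hypothetical - the coordinates and names are invented, and on the robot the pose would be packed into a goal for the ROS navigation stack rather than simply returned:

```python
import math

# Hypothetical map-frame poses (x metres, y metres, heading radians)
# for named kitchen locations; the real robot's coordinates differ.
LOCATIONS = {
    "sink":   (1.2, 0.4, math.pi / 2),
    "kettle": (2.0, 0.4, math.pi / 2),
    "table":  (1.5, 2.1, -math.pi / 2),
}

def goal_for(name):
    """Look up the goal pose for a named location.

    On the robot this pose would be sent through the navigation stack;
    here we just return it so the lookup can be exercised directly.
    """
    if name not in LOCATIONS:
        raise KeyError("Unknown location: %s" % name)
    return LOCATIONS[name]

x, y, theta = goal_for("kettle")
```

Keeping the pose table separate from the navigation code means new destinations can be taught without touching the planner.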
One problem is that it looks as if the robot in its current form is just too wide to get through one particular doorway. This might mean that I need to do some mechanical hacking to thin it down a little and provide more clearance. The small amount of clearance currently available is just too narrow to realistically expect the localisation to handle it reliably. As part of the redesign I may also add a dedicated onboard PC, rather than using a laptop.

Using this sort of system, with a PC of some description and an RGBD sensor, the prospects for robotics over the next decade look far better than at any previous time. 2011 is probably going to be a watershed year, in which both the software and the sensor technology become good enough for break-even navigation at a reasonable cost.

Thu, 27 Jan 2011 23:52:44 GMT
http://robots.net/person/motters/diary.html?start=81
I've now added a Kinect sensor to the GROK2 robot, which is described here:

http://streebgreebling.blogspot.com/2011/01/grok2-kinect.html

I think that 2011 could be quite an exciting year for robotics, with some real progress being made on age-old problems.

Sun, 26 Dec 2010 23:27:59 GMT
http://robots.net/person/motters/diary.html?start=80
I haven't done very much by way of a write-up on the GROK2 robot so far, so here is some explanation of the story to date.

http://sluggish.homelinux.net/wiki/GROK2

The robot is still quite static, although it can be driven by joystick. The next stage is to create a URDF model and try out some of the ROS mapping/localisation to see whether it's suitable for use with stereo vision. From the navigation stack's point of view it shouldn't care what sensors are being used, since all it will be seeing is point cloud data.

Mon, 13 Dec 2010 10:48:30 GMT
http://robots.net/person/motters/diary.html?start=79
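The sensor-agnostic point is that any depth source reduces to points. As a rough sketch, using the standard pinhole stereo relations with invented focal length, baseline and principal point values, a disparity map can be turned into the kind of point cloud a navigation stack consumes:

```python
import numpy as np

# Invented camera parameters for illustration; a real rig would use
# calibrated values.
FOCAL_PX = 500.0       # focal length in pixels
BASELINE_M = 0.06      # distance between the two cameras in metres
CX, CY = 160.0, 120.0  # principal point (image centre)

def disparity_to_points(disparity):
    """Convert a disparity map (in pixels) to an Nx3 array of 3D points.

    Uses the standard relations Z = f*B/d, X = (u-cx)*Z/f, Y = (v-cy)*Z/f.
    Pixels with zero or negative disparity (no match) are skipped.
    """
    v, u = np.nonzero(disparity > 0)
    d = disparity[v, u]
    z = FOCAL_PX * BASELINE_M / d
    x = (u - CX) * z / FOCAL_PX
    y = (v - CY) * z / FOCAL_PX
    return np.column_stack((x, y, z))

disp = np.zeros((240, 320))
disp[120, 160] = 30.0  # a single matched pixel at the image centre
points = disparity_to_points(disp)
# with these numbers, Z = 500 * 0.06 / 30 = 1 metre straight ahead
```

Whether the points came from a laser, a Kinect or a stereo pair, the downstream mapping code only ever sees this Nx3 array.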
Another point cloud model, with better registration than some of the previous ones.

http://streebgreebling.blogspot.com/2010/12/point-cloud-model-of-author.html

I may revert to the initial design of the GROK2 head, with forward and reverse facing stereo cameras. That way I can grab twice the amount of range data in a similar amount of time.

Mon, 6 Dec 2010 23:27:01 GMT
http://robots.net/person/motters/diary.html?start=78
The first dense composite point cloud model has been generated from the GROK2 robot. Whilst the depth resolution might not be as good as a Kinect's, and the registration of glimpses is not perfect, I think this proves - at least to my own satisfaction, if nobody else's - that stereo vision can be used as a practical depth sensing method.

http://sluggish.homelinux.net/wiki/3D_Models

There's still a fair amount of work to be done to improve on these results, but it's certainly looking feasible that recognition of sizable objects such as chairs or desk surfaces may be achievable. An obvious quick heuristic would simply be to run an elevation histogram and search for peaks, which could indicate horizontally oriented surfaces.
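A minimal sketch of that heuristic, with illustrative (untuned) bin size and count threshold and an invented synthetic cloud: bin the point heights and report bins that stand out from their neighbours.

```python
import numpy as np

def elevation_peaks(points, bin_size=0.05, min_count=50):
    """Histogram the heights (z, metres) of a point cloud and return the
    heights of well-populated local peaks, which are candidates for
    horizontally oriented surfaces such as floors, desks or chair seats.

    bin_size and min_count are illustrative values, not tuned ones.
    """
    z = points[:, 2]
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    peaks = []
    for i in range(1, len(counts) - 1):
        if (counts[i] >= min_count and
                counts[i] >= counts[i - 1] and
                counts[i] >= counts[i + 1]):
            peaks.append(0.5 * (edges[i] + edges[i + 1]))
    return peaks

# Synthetic cloud: 200 points on a "desk" plane at 0.7m plus scattered noise
rng = np.random.default_rng(0)
desk = np.column_stack((rng.random(200), rng.random(200),
                        rng.normal(0.7, 0.005, 200)))
noise = np.column_stack((rng.random(100), rng.random(100),
                         rng.uniform(0.0, 1.5, 100)))
cloud = np.vstack((desk, noise))
print(elevation_peaks(cloud))  # should report a height near 0.7
```

On real scans the floor would dominate the histogram, so in practice the floor peak might be found first and masked out before looking for furniture-height surfaces.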
No doubt the points could also be condensed into voxels to increase the efficiency of subsequent higher level processing.

Wed, 24 Nov 2010 21:03:07 GMT
http://robots.net/person/motters/diary.html?start=77
Whilst there is a big song and dance about the Kinect, progress on Sentience continues. Since they're not prohibitively expensive and I have plenty of time on my hands, I'll try to acquire a Kinect and evaluate how suitable it is for robotics uses. Willow Garage already seem to be doing something Kinect related.

A new dense stereo algorithm called ELAS, developed by Andreas Geiger, has been added to the v4l2stereo utility. This works well, and at a reasonable frame rate, on the Minoru. It's probably the best dense stereo method that I've tried to date.

http://code.google.com/p/sentience/wiki/MinoruWebcam

It may turn out that the structured light method which the Kinect uses isn't very useful outdoors, or suffers from interference when multiple units are used in close proximity, so there may still be a place for stereo vision as a depth sensing method.

Thu, 8 Apr 2010 20:53:36 GMT
http://robots.net/person/motters/diary.html?start=76
Whilst testing out omnidirectional stereo vision I thought it would be a good idea to try to apply a dense stereo method to images like the ones used to produce this anaglyph.

http://www.youtube.com/watch?v=3L-gJhATQOg

Here the disparity is vertical, rather than horizontal as is usually the case for stereo cameras.
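One hedged way to reuse a horizontal-only matcher on such a pair - a generic trick, not necessarily what v4l2stereo does - is simply to transpose both images so the vertical displacement becomes horizontal, then transpose the resulting disparity map back:

```python
import numpy as np

def match_vertical(top, bottom, horizontal_matcher):
    """Run a horizontal-only stereo matcher on a vertically displaced
    image pair by transposing the images first, then transposing the
    resulting disparity map back to the original orientation.
    """
    disp_t = horizontal_matcher(top.T, bottom.T)
    return disp_t.T

# A trivial stand-in matcher that just returns the absolute image
# difference, purely to exercise the shape bookkeeping.
fake_matcher = lambda l, r: np.abs(l.astype(int) - r.astype(int))

top = np.zeros((240, 320), dtype=np.uint8)
bottom = np.ones((240, 320), dtype=np.uint8)
disp = match_vertical(top, bottom, fake_matcher)
# disp comes back in the original (240, 320) shape
```

The same wrapper would work for any matcher that scans along rows, which is what makes the omnidirectional case reducible to the conventional one.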
However, it became apparent that I didn't yet have a dense stereo algorithm for v4l2stereo, so I decided to take some time out to develop one, the hope being that whatever is developed in a conventional stereo vision setup can be similarly applied to the omnidirectional case.

The stereo correspondence method which I've used for dense stereo is a fairly conventional one, and I've made extensive use of OpenMP to make it as multi-core scalable as possible. This uses the simple "patch matching" approach which is commonly described in the literature, but it works reasonably well on the Minoru provided that some initial correction is done to make the colour mean and variance in the left and right images as similar as possible, so that comparing pixels becomes a less haphazard affair.
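That correction step can be sketched as a generic per-channel mean/variance normalisation in the style described - not v4l2stereo's actual code - shifting and scaling each channel of the right image so its statistics match the left image's:

```python
import numpy as np

def match_colour_stats(left, right, eps=1e-6):
    """Return a copy of `right` whose per-channel mean and variance
    match those of `left`, so that pixel comparisons between the two
    images are less affected by exposure or white-balance differences
    between the cameras.
    """
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    out = np.empty_like(right)
    for c in range(right.shape[2]):
        lm, ls = left[..., c].mean(), left[..., c].std()
        rm, rs = right[..., c].mean(), right[..., c].std()
        # centre, rescale to the left image's spread, re-centre
        out[..., c] = (right[..., c] - rm) * (ls / (rs + eps)) + lm
    return out

# Example: a right image that is uniformly darker than the left
rng = np.random.default_rng(1)
left = rng.uniform(100, 200, (120, 160, 3))
right = left * 0.5  # same scene, half the brightness
corrected = match_colour_stats(left, right)
# corrected now has (approximately) the same channel means as left
```

After this normalisation, a plain sum-of-absolute-differences patch score behaves much more consistently across the two cameras.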
An example of the end result, a "big blob detection" reminiscent of what I had running on the Rodney humanoid over five years ago, appears in the following video.

http://www.youtube.com/watch?v=ZKnWJTOzyk4

The depth resolution isn't fantastic, but it's functional, and may be of use for obstacle detection or just detecting people nearby.

I also wanted to experiment with integrating this code with the Willow Garage ROS system. This would potentially enable the very expensive stereo cameras traditionally used in academic research to be replaced by something like a Minoru webcam, or a pair of webcams, which would be affordable to the hobbyist. The current source code release for v4l2stereo includes an example ROS publisher and subscriber, which should make integration with ROS based robots a fairly straightforward process.

http://code.google.com/p/sentience/wiki/MinoruWebcam
http://code.google.com/p/libv4l2cam/