'Move the Picker, Move the Tote'

I was tasked by a conveyor company out of Tampa, Fla., to investigate whether a parts-delivery system's throughput could be improved beyond the specified four totes per minute. Information about the system was sketchy at best, but with some home-brewed adapters to the I/O, I was able to determine what the position indicators, laser scanners, pickers (small elevators), and RS232 command station were doing.

I created my own I/O using some devices provided by a US manufacturer and adapted them to an Everex Step 32 computer system. I used QuickBASIC's compiled BASIC to generate the bulk of the structure, with some assembly to handle the laser scanners' interrupts.

The delivery system consisted of 432 totes (two-foot by two-foot plastic buckets) divided into six layers of 72 totes each, plus two small elevators, one on each end of a loop approximately 60 feet long. The picker's job was to receive a tote extracted by a pneumatic piston, or to insert a tote via a set of belts aided by a pawl that helped push the tote onto the chain-linked tray. Each pair of 72-tote layers was controlled by a separate embedded computer. Another computer handled the pickers, and yet another handled the RS232 communications between the delivery system and the host... five computers in all.

The problem was the brute-force approach used to drive the now-disconnected I/O system. Each command from the host had to be parsed, and the pickers had to be directed either to go to a certain level, pick up a tote, and deliver it to a conveyor level, or vice versa. The first cut was to divide and conquer, so all of the extraction commands were separated from the insertion commands. That gave a clear picture of what the movement requirements actually were.

From there, I was able to create a single movement routine and use it twice, regardless of the type of command received -- it was "move the picker, move the tote." Now, simply by providing a target, the routine knew where to move the picker. From that point, if the target was a conveyor, the software would move the picker to the conveyor level and generate the signals needed to drop the "K" stop, pick up the tote, and center it on the picker. The second pass, parsed from the host command, was the delivery of the tote. In this particular example, the software would direct the picker to a storage layer and insert the tote.
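To make the idea concrete, here is a minimal sketch of the two-pass structure in Python (the original was compiled BASIC and assembly, so class and method names here are illustrative assumptions, not the author's code). The point is that extraction and insertion are the same primitive called twice, with the source and destination roles swapped.

```python
class Picker:
    """Toy simulated picker; names are illustrative, not from the original system."""

    def __init__(self):
        self.level = 0
        self.holding = None

    def move_to(self, level):
        self.level = level

    def transfer(self, location):
        # If the picker is empty, acquire a tote from the location;
        # otherwise deliver the tote it is holding onto the location.
        if self.holding is None:
            self.holding = location.pop()
        else:
            location.append(self.holding)
            self.holding = None


def move_picker_move_tote(picker, src_level, src, dst_level, dst):
    """The single primitive, used twice: move the picker, move the tote."""
    picker.move_to(src_level)
    picker.transfer(src)   # first pass: acquire the tote at the source
    picker.move_to(dst_level)
    picker.transfer(dst)   # second pass: deliver the tote at the destination


# An extraction (storage layer -> conveyor) and an insertion
# (conveyor -> storage layer) are the same call with roles swapped.
storage, conveyor = ["tote-17"], []
p = Picker()
move_picker_move_tote(p, 3, storage, 0, conveyor)  # extraction
print(conveyor)  # ['tote-17']
```

The payoff of this structure is that the command parser only has to produce two targets; it no longer needs separate code paths for extraction and insertion.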

This is, of course, very simplified; there was also a layer movement (using a shortest-route algorithm) to bring the proper tote number to the correct end for the picker, followed by locking the layer and executing either an extraction or an insertion.
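On a circular layer of 72 positions, "shortest route" just means comparing the forward and reverse rotation distances and taking the smaller. A hedged sketch, assuming sequential position numbering around the loop (the actual numbering and direction conventions of the system are not documented in the story):

```python
LAYER_SIZE = 72  # totes per layer, from the system description


def shortest_rotation(current, target, size=LAYER_SIZE):
    """Return (steps, direction) for the shorter way around the loop.

    direction is +1 for forward rotation, -1 for reverse.
    """
    forward = (target - current) % size   # steps rotating forward
    reverse = (current - target) % size   # steps rotating backward
    return (forward, +1) if forward <= reverse else (reverse, -1)


print(shortest_rotation(5, 70))   # (7, -1): 7 steps in reverse beats 65 forward
print(shortest_rotation(10, 20))  # (10, 1): forward is shorter
```

In the worst case this halves the rotation distance versus always driving the chain one way, which is part of where the throughput gain comes from.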

All in all, the system was now capable of delivering nine-plus totes per minute, instead of the manufacturer's four. The story doesn't end there, however! Because of the increased delivery speed, the host system had to pace the requests, because the totes would jam the exit conveyors, causing them to buckle and leap off the conveyor!
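The pacing the host had to add amounts to simple backpressure: hold new requests whenever the exit conveyor is full, and release them as totes are cleared. A minimal sketch of that idea, with the capacity limit, drain rate, and all names being illustrative assumptions rather than details from the original host software:

```python
from collections import deque


class ExitConveyor:
    """Toy exit belt with a hard capacity; beyond it, real totes jammed."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.totes = deque()

    def has_room(self):
        return len(self.totes) < self.capacity

    def receive(self, tote):
        self.totes.append(tote)

    def take_off(self):
        return self.totes.popleft()


def pace_requests(requests, conveyor, drain_every=2):
    """Issue delivery requests only when the belt has room.

    Every `drain_every` ticks, one tote is cleared off the belt
    (standing in for the downstream process catching up).
    """
    issued = []
    pending = deque(requests)
    tick = 0
    while pending:
        if conveyor.has_room():
            tote = pending.popleft()
            conveyor.receive(tote)
            issued.append(tote)
        tick += 1
        if tick % drain_every == 0 and conveyor.totes:
            conveyor.take_off()
    return issued


belt = ExitConveyor(capacity=3)
done = pace_requests(range(9), belt)
print(len(done))  # 9: every request is served, but the belt never overfills
```

Without the `has_room` check, the deliveries arrive faster than the belt drains, which is exactly the jam-and-buckle failure the author describes.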

This entry was submitted by Rick MacLean and edited by Rob Spiegel.

Rick MacLean is a design consultant now working with a motion product company located in Longwood, Fla.

This is a good Sherlock Ohms story about using ingenuity to double the production on the conveyor. A lot of this type of optimization is done with intelligent tools these days. Not long ago, it all had to be done with brain tissue.

I guess IF you were standing next to this fellow during his investigation, you'd probably have a far better appreciation for the engineering/programming dilemma. But, reading about it from the vantage point of several decades later, it seems that one would do just as well reading Homer or Cicero in their native text.

In general, I think most of these blogs are so "lingo-specific" that they lose much of their impact to readers NOT familiar with industry-specific terminology.

Brain tissue is great, Nancy. But now we're seeing embedded intelligence in a wide variety of devices and systems in automation and control. The embedded intelligence allows controllers to run the system like a video game with simulation and optimization at the fingertips.

I would have assumed there would have been a command to tell the system "Do not send more boxes until these have come off the belt" or something like that. I would hate to think there were nuclear devices spread all over the floor because the output could not keep up with the production line!

Oh, I agree Rob - I was just referencing companies with older technology that do not have a budget for the cool new stuff. We can still design stuff that is not as smart but can still do the job. And we still need brain tissue to invent the smarter products ;)

The problem experienced at the discharge ends indicates that the next step was not done, which would be to speed the rest of the process. Sort of like adding a huge boost to the horsepower of a car but not doing anything about keeping the car under control or stopping it.

Perhaps I did not understand completely all about how the changes were able to bring about such an improvement in process speed.

We'll still need the brain tissue, Nancy, no matter how smart our tools get. Yet it is nice to see some of the intelligence getting embedded in our devices. I'm glad my computer has enough embedded intelligence that I don't have to use C prompts any longer. And the price on the intelligent systems usually comes down.
