When setting up an unpaced production line (one that does not employ any form of mechanical pacing), the design of the line has a considerable impact on its efficiency. For instance, where should you place operators who work at different average speeds, or whose working speed varies over time? Where should you keep unfinished items along the line? These are just a few of the problems facing the line manager.

One factor that needs serious consideration is the size of the storage space between workstations, where partly finished products await the next step in the process. This storage space is referred to as a "buffer," and numerous studies have examined how much buffer capacity should be allocated on a line, and where it should be placed.

The average working time of an operator influences line design. Not all operators will be able to complete their tasks in the same interval of time. People work at different average speeds for several reasons: some are personal, such as physical capacity and motivation, and some are inherent to the task. An operator may be performing a complex task, for example, or it may simply be that the work along the line cannot be distributed evenly in terms of time. Whatever process you imagine, it is clear that some tasks cannot be completed until the preceding steps have taken place; a very straightforward example is that we cannot pack a product until it has been made.

Workers vary in many ways. Not only do different operators work at different speeds, but the same person can also vary in the rate at which he or she works over the day. Research has shown that a person's working speed can vary by as much as 60% from his or her average. This behavior can occur for several reasons: fatigue, boredom, and tasks that are complex or changing.

Given these realities, why not solve the problem by assigning workers with different mean working times and variabilities to particular workstations, and allocating buffer space unevenly between them? The difficulty such placement addresses lies in two scenarios. In one, an operator temporarily works faster than his predecessor, so the buffer stock between them runs down; the succeeding station then suffers from "starving" delays. In the other, an operator temporarily works faster than his successor; the buffer soon fills up completely, and the preceding station suffers from "blocking" delays.

We could imagine from this that a "balanced" line, where workers complete their tasks in given and equal times and don't vary from their average, and where buffer space is distributed evenly between the workstations, is the ideal situation for getting the best performance out of the line. However, this is not the case.

How can we measure the performance of a production line? One method is to calculate the average buffer level (ABL) for the whole line. Obviously, we want to keep the number of unfinished pieces in storage as low as possible, so ABL needs to be kept down. Another is to measure the time that the line is not functioning (idle time, or IT) as a percentage of total working time. This parameter also needs to be kept as low as possible, in order to keep labor costs down.
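As a rough illustration of these two measures, here is a minimal time-stepped simulation sketch in Python. The article does not specify its simulation model, so the truncated-normal task times, step size, and parameter values below are assumptions; the sketch simply accumulates starving and blocking delays to estimate IT, and integrates buffer contents over time to estimate ABL.

```python
import random

def simulate_line(n_stations=5, buf_caps=(2, 2, 2, 2), n_items=2000,
                  mean=1.0, cv=0.3, seed=1, dt=0.02):
    """Toy time-stepped simulation of an unpaced serial line.

    Each station works on one item at a time; finished items go into a
    finite buffer before the next station.  A station with no input item
    is starved; one that cannot push a finished item forward is blocked.
    Returns (idle time as % of total station-time, average buffer level).
    """
    rng = random.Random(seed)

    def task_time():
        # truncated-normal task time: a stand-in for real operator data
        return max(0.05, rng.gauss(mean, cv * mean))

    remaining = [None] * n_stations      # work left on each station's item
    buffers = [0] * (n_stations - 1)     # items waiting between stations
    idle = 0.0                           # summed starving + blocking time
    buf_area = 0.0                       # time-integral of buffer contents
    t, done = 0.0, 0

    while done < n_items:
        for j in reversed(range(n_stations)):   # rear first, so space frees up
            if remaining[j] is None:
                if j == 0:                      # raw material never runs out
                    remaining[j] = task_time()
                elif buffers[j - 1] > 0:
                    buffers[j - 1] -= 1
                    remaining[j] = task_time()
                else:
                    idle += dt                  # starving delay
            else:
                remaining[j] -= dt
                if remaining[j] <= 0:
                    if j == n_stations - 1:
                        remaining[j] = None     # item leaves the line
                        done += 1
                    elif buffers[j] < buf_caps[j]:
                        buffers[j] += 1         # push into next buffer
                        remaining[j] = None
                    else:
                        remaining[j] = 0.0
                        idle += dt              # blocking delay
        buf_area += sum(buffers) * dt
        t += dt

    return 100.0 * idle / (n_stations * t), buf_area / t
```

Even this crude model reproduces the qualitative behavior discussed here: with variable operators the line accumulates starving and blocking delays, while a deterministic, perfectly balanced line idles only during its initial warm-up.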

In the case of buffers, there are often technical considerations that mean we simply cannot put buffers of the same size between stations. This being so, in some cases the total buffer capacity needed for efficient working of the line has to be spread unevenly. This arrangement is called an "unbalanced-buffer" line. What we have found through simulating the allocation of buffers of different sizes in several patterns is that, compared to the "ideal" balanced line, unbalancing the buffer distribution can actually improve the performance of the production line.

Here are some examples of decisions in buffer placement. We ran computer simulations on lines with five and eight workstations, with total buffer capacities (TB) of eight and 24 units for the shorter line, and 14 and 42 units for the longer, eight-station line, giving us average buffer capacities of two and six units respectively for both line lengths. The buffer capacities were then assigned unevenly along the lines.

The patterns can be described in five general policies:

Ascending order: buffer capacity is concentrated at the end of the line.

Descending order: buffer capacity is concentrated at the beginning of the line.

Inverted bowl shape: buffer capacity is concentrated in the middle of the line.

Bowl shape: buffer capacity is lowest in the middle of the line.

General: buffer capacity is not concentrated in any one area of the line; it can follow:

No particular pattern.

Zigzag: alternating buffer capacity between high and low.
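The five policies can be expressed as simple allocation rules. The weight profiles below are an illustrative assumption, not the article's exact allocation method: each policy is encoded as a per-buffer weight, and a total capacity TB is split across the buffers in proportion to those weights, with rounding corrected so the units sum exactly.

```python
def allocate(weights, total):
    """Split `total` integer units across buffers in proportion to
    `weights`, handing leftover units to the largest fractional parts."""
    raw = [w * total / sum(weights) for w in weights]
    alloc = [int(x) for x in raw]
    leftover = total - sum(alloc)
    order = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i],
                   reverse=True)
    for i in order[:leftover]:
        alloc[i] += 1
    return alloc

def pattern(name, n_buffers, total):
    """Buffer sizes for one of the five general policies (weights are
    illustrative choices, not the article's definitions)."""
    mid = (n_buffers - 1) / 2
    weights = {
        "ascending":     [i + 1 for i in range(n_buffers)],
        "descending":    [n_buffers - i for i in range(n_buffers)],
        "inverted_bowl": [mid + 1 - abs(i - mid) for i in range(n_buffers)],
        "bowl":          [1 + abs(i - mid) for i in range(n_buffers)],
        "zigzag":        [2 if i % 2 == 0 else 1 for i in range(n_buffers)],
    }[name]
    return allocate(weights, total)
```

For example, a five-station line has four buffers, so `pattern("ascending", 4, 8)` spreads a total capacity of eight units in non-decreasing sizes towards the end of the line.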

Following the simulations, we saw that none of these five general policies was noticeably better or worse than the others in broad terms, but particular patterns within them showed substantial improvements in performance, in either IT or ABL, when compared to the balanced-buffer line.

In each policy, we found that the less extreme the buffer capacity allocation (i.e., the more evenly it was spread), the better the results as far as IT was concerned. The two best patterns obtained from the simulations fell into the "general" policy and are illustrated in Figure 3.

As Table 1 shows, pattern 2 in Figure 3 gives a reduction in idle time (-16.14%) compared to the balanced-buffer line. The other patterns are all worse than a line with equal buffer sizes, showing an increase in IT. In terms of ABL, the two best patterns had their buffer capacities concentrated towards the end of the line (the ascending-order policy).

Results shown in Table 2 indicate that the savings obtained are considerable:

All four of these patterns consistently show substantial improvements in average buffer levels over the balanced-buffer line, so it seems well worth unbalancing the buffers to improve stockholding performance.

Average operator working time varies considerably. Placing operators with dissimilar working speeds at different workstations has often been done with a view to minimizing the differences between the average operating times of the stations. Again, what we have found is that placing operators in carefully chosen patterns of speeds can bring unexpected advantages in terms of performance.

Simulations of five and eight-station lines were run, with the difference in average working time between successive stations going from slight (2%) to medium (5%), high (12%), and very high (18%). The operators were distributed so as to reflect four basic approaches to line organization.
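The article does not spell out its four configurations at this point, so the following sketch is an assumption: it builds a set of per-station mean times whose successive values differ by a given percentage (a geometric step, for illustration), then arranges them in ascending, descending, bowl-shaped, or inverted-bowl order.

```python
def mean_times(n_stations, step_pct, order, base=1.0):
    """Per-station mean task times, successive values differing by
    step_pct percent, arranged in one of four configurations.
    (The names and the geometric step are illustrative assumptions.)"""
    times = [base * (1 + step_pct / 100.0) ** i for i in range(n_stations)]
    if order == "ascending":
        return times                      # slowest operator at the end
    if order == "descending":
        return times[::-1]                # fastest operator at the end
    # bowl: shortest times in the middle; inverted_bowl: longest in middle
    key = sorted(times, reverse=(order == "bowl"))
    left, right = [], []
    for i, v in enumerate(key):
        (left if i % 2 == 0 else right).append(v)
    return left + right[::-1]
```

For instance, `mean_times(5, 12, "bowl")` places the fastest operator (the shortest mean time) at the center station, with progressively slower operators towards both ends.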

Buffer sizes (B) were set at one, two, and six units for all the buffers in the line. The most efficient policy for reducing idle time was configuration four (the bowl-shaped pattern), while the worst was configuration two (descending order). The biggest improvement over the balanced line was nearly a 3.5% reduction in idle time, obtained for the eight-station line with buffer B=1 and a slight 2% difference in mean times.

When performance was assessed for the average buffer levels, the results were very significant. Pattern 2, with the fastest worker at the end of the line, outperformed the balanced line for all line lengths, buffer capacities, and degrees of difference in mean times. The best pattern resulted in a decrease in ABL of about 87% (for a five-station line having buffer B=6 and a high 12% difference in mean times), as compared to a balanced line. Unfortunately, none of the configurations considered led simultaneously to lower levels of both IT and ABL.

As discussed in the introduction, production performance is also affected by where you place workers who vary in the speed at which they work. Some people work quite steadily at the same or a similar pace over a period of time, while others slow down or speed up quite significantly over the same time. The relative variability of an operator's working speed is ordinarily measured by the coefficient of variation (CV): the ratio of the standard deviation of his or her task times to the mean task time.
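A quick sketch of the CV calculation, using Python's standard statistics module; the two sets of task times are made-up examples of a steady and an erratic operator.

```python
from statistics import mean, pstdev

def coefficient_of_variation(times):
    """CV = standard deviation of the task times divided by their mean.
    A steady worker has a low CV; an erratic worker a high CV."""
    return pstdev(times) / mean(times)

# Hypothetical task times (in minutes) for two operators
steady  = [1.00, 1.02, 0.98, 1.01, 0.99]
erratic = [0.60, 1.40, 0.80, 1.30, 0.90]
```

Both operators here have the same mean time of 1.0, but very different CVs, which is exactly why mean working time alone is not enough to characterize a station.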

We simulated lines with five and eight stations, with operator relative variabilities ranging from low (S) through medium (M) to high (V), and with buffer capacities (B) of one, two, and six units.

For one set of patterns, illustrated in Figure 6a, the workers with lower and higher variability were interspersed. For the bowl-shaped pattern (6b), the workers who had low variability were placed next to each other in the middle of the line with the more variable operators at both ends of the line. The inverted bowl shape (6d) took the opposite form, with station operators of higher variability in the middle. The "chaise longue" pattern (6c) had operators going progressively from high to low (and from low to high) variability along the line.

The best overall results for reducing both idle times and average buffer levels came from pattern 4, one of the two bowl-shaped patterns considered, with the steadier workers in the middle. As Table 3 outlines, this bowl arrangement gave the most effective performance for both the shorter and the longer lines.

So using a bowl-shaped arrangement, i.e., putting operators with steadier times in the middle of the line, can lead to substantial reductions in idle time for shorter lines, and gives considerably lower average buffer levels across the board.

One of the main conclusions of this research is that the decision of how to allocate different sized buffers between workstations, and where to place operators with different average working times and variability, will depend on the particular conditions of your production facilities.

It may be a priority to keep unfinished goods in storage as low as possible, for example with fresh produce, where hygiene and safety issues are important. In this case, a manager would opt for reductions in ABL. To do so, one might place more buffer capacity towards the end of the line. If worker average times are known to differ, it could be advantageous to put the fastest workers towards the end, and when workers vary greatly in their speeds, one might consider placing the steadiest workers in the middle. This is especially the case where just-in-time and lean-buffering strategies are in place.

In contrast, if we are looking at a sector where labor costs are high, for example the automobile industry, then it could be advantageous to move towards reducing IT, and either distributing buffer capacity as evenly as possible along the line, or again considering placing faster or steadier workers towards the middle. Remember, however, that these patterns are specific patterns among numerous possibilities, and that imbalance directed in the wrong way could lead to the opposite effect, i.e., increases in average buffer levels and/or idle times.

In spite of this risk, the potential savings in IT (as much as 43% for correctly placed variable workers) and ABL (up to 87% for the best patterns of mean times), taken over the lifespan of a production line, mean that unbalancing your line in the right way could be a very worthwhile strategy.

This article was first published in the September 2008 edition of Manufacturing Engineering magazine.