odd labor request... time spent per dish/piece

First time poster here. Baker and pastry chef with 10+ years in the industry, and I've just taken a job with a large hotel group that's starting its first in-house F&B program. I lead the bakery team and 'service' all outlets of the hotel-- several restaurants, a cafe, and events.

As it's a fledgling program, many systems have developed organically (at least as far as I can tell), and I've just gotten a request from Corporate that has completely befuddled me.

In order to distribute my labor budget over the various outlets we service, they're asking me to come up with a "time per piece" so that they can assign a $/cent amount to each item I effectively sell to the outlets. (As a personal aside, I find this request absurd and a flawed approach, but had no say in the decision.)

I've started to collect data in a few ways-- from bare-minimum online research into 'active times' for recipes, to simply setting a timer during focused tasks. My results are... eh, all over the place (it takes me a quarter of the time it takes an entry-level baker, and many of my bakers are entry-level; there's rarely a time we're working on only one thing; do you factor in mistakes; etc. etc.).

I guess my question is: how can I better approach this task? Is there any logical way to do this, or should I go with my gut and just pad my labor all over and use best guesses from my experience?

See how many items you can create in a day or even a week, then divide that number by the total working hours of all your employees. You can subdivide the results further by the worker's section, or the time of the day, etc. etc.
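If it helps to see the arithmetic, here's a minimal sketch of that throughput approach. The item names and counts are made-up examples, not real data:

```python
def minutes_per_piece(pieces_made, labor_hours):
    """Average labor minutes per piece over a tracking window:
    total hours worked, divided by pieces finished, converted to minutes."""
    return labor_hours / pieces_made * 60

# Hypothetical week: 1200 croissants against 60 booked labor-hours.
print(minutes_per_piece(1200, 60.0))  # 3.0 minutes per croissant
```

You can run the same calculation per section or per daypart by just changing what goes into the two inputs.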

I've had to do this for a gov't contract. In our case, there was no overhead, so all of the labor had to be included in 'making the piece' i.e. gathering, prep, and dishes - not just the actual construction time.

We did it like Pat Pat suggested - we looked at how many we could make in a day for several days, averaged it per person, and then padded for unforeseen circumstances (e.g. training and new bakers).

This may be a bit overkill, but there is an established methodology for effort estimation promoted by PMI, the institute that issues certifications for professional project managers.

The method is called a PERT estimate. You come up with three estimates for how long it takes to do a task - Optimistic, Pessimistic, and Most Likely. You then run them through the formula:

Effort = (Optimistic + 4 * Most Likely + Pessimistic) / 6

If Optimistic is 4 hours, Most Likely is 6 hours, and Pessimistic is 10 hours, the estimate would be (4 + 4*6 + 10) / 6 ≈ 6.33 hours.

This averages them in a manner that gives more weight to Most Likely. The weighting only matters when the three estimates are asymmetric; if you space them evenly with Most Likely exactly in the middle, the formula gives the same result as a plain average.
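As a quick sanity check, the formula above is a one-liner. The inputs here are the same hypothetical 4/6/10-hour task from the worked example:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Three-point (PERT) estimate: weighted average with
    4x weight on the Most Likely value."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

print(pert_estimate(4, 6, 10))  # ~6.33 hours, matching the example above
```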

The result can be further refined by factoring in a resource's productivity. As you mentioned, entry-level bakers may not be as efficient/productive as experienced ones. The formula for Productivity is:

Productivity = (standard hours / resource's hours) * 100

For example, if resource A is more efficient and takes 6 hours to do an 8 hour job, his productivity would be (8/6)*100=133%
While if resource B is less efficient and takes 10 hours to do an 8 hour job, his productivity would be (8/10)*100=80%

You can then use those results to calculate the Estimated Task Cost. Formula is:

ETC = (Effort / Productivity) * Rate

So, if you have a task with a PERT estimate of 6.33 hrs being done by resource B who is 80% productive and gets paid $25 an hour, it would be:

(6.33 / 0.8) * 25 = $197.81
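Putting the productivity and cost formulas together, here's a small sketch of that last calculation (same assumed numbers as above: an 8-hour standard job, resource B's 10 actual hours, and a $25/hr rate):

```python
def productivity(standard_hours, actual_hours):
    """Productivity as a fraction (1.0 = works exactly at the standard rate).
    Multiply by 100 to express it as a percentage."""
    return standard_hours / actual_hours

def estimated_task_cost(effort_hours, productivity_frac, hourly_rate):
    """Scale the effort up (or down) for the worker's pace, then price it."""
    return effort_hours / productivity_frac * hourly_rate

p = productivity(8, 10)                  # resource B: 0.8, i.e. 80%
cost = estimated_task_cost(6.33, p, 25.0)
print(f"${cost:.2f}")                    # $197.81, matching the example
```

Note that the productivity value must go in as a fraction (0.8), not the raw percentage number (80), or the cost comes out a hundred times too small.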

Again, this may be overkill for what you need but it is how effort estimations are done professionally, FWIW. Hope you find some value from it.
