Assume there is a hardware board with a RISC CPU (e.g. an ARM Cortex processor), associated peripherals, data and instruction caches, DRAM, and an OS running on it (say, embedded Linux). Roughly what percentage of the CPU's clock cycles (at its maximum MHz) can be allotted to a video decoder?

Asked differently: if a Cortex-A8 processor runs at 666 MHz, how much of that would be available to a video decoder in a mobile phone system under real-time constraints? This would help decide the maximum MCPS the video decoder is allowed to consume.

(I know the exact answer depends on which code/components run in the real system, such as the audio decoder, video decoder, modem, and so on, but are there any estimates for a typical mobile phone system?)
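To make the question concrete, here is the back-of-envelope arithmetic I have in mind. All the utilization percentages below are placeholder assumptions I invented for illustration, not measured figures; the answer I'm looking for is what realistic values for these shares would be:

```python
# Back-of-envelope MCPS budget for a video decoder.
# Every percentage below is an illustrative assumption, not a measurement.

CPU_MHZ = 666  # Cortex-A8 clock from the question

# Hypothetical shares of total CPU cycles consumed by other components:
os_and_drivers = 0.15  # kernel, interrupts, drivers (assumed)
audio_decoder  = 0.05  # audio codec (assumed)
ui_and_apps    = 0.15  # UI, apps, other middleware (assumed)
headroom       = 0.15  # safety margin for worst-case frames (assumed)

available_fraction = 1.0 - (os_and_drivers + audio_decoder
                            + ui_and_apps + headroom)
video_budget_mcps = CPU_MHZ * available_fraction

print(f"Available fraction for video: {available_fraction:.0%}")
print(f"Video decoder budget: {video_budget_mcps:.0f} MCPS")
# With these assumed shares: 50% available, i.e. a 333 MCPS budget.
```

So the question reduces to: for a typical phone, what are sensible values for those shares, and hence what fraction of the 666 MHz ends up as the video decoder's MCPS budget?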