I know this cannot be done, but (ashamed to say) I couldn't convincingly explain to a friend why it can't work using realistic values.

Can anyone explain why I cannot use the power produced by a battery to loop back and recharge that same battery for continual power? Please explain it with numbers/math and common units of measure (e.g. volts, amps, ohms, watts...).

Here's an example. Why wouldn't this work to have continual power?

You have a 12V car battery connected to a motor. You then use that motor to turn an alternator, which in turn recharges the original 12V battery (similar to how a car works).

Now, I know the common reply will be that you lose power along the way to heat, friction, etc., so there isn't enough left to loop back and keep the battery charged, and eventually you run out of power.

How can you prove this with my basic model above, using math and realistic electrical quantities like volts, amps, ohms, watts, etc.?

Here's what I tried, but it needs real measurements:

The battery has 100 units of energy, which it transfers to the motor. The motor takes those 100 units but only delivers 90 to the alternator (10 units are lost to heat, friction, etc.). The alternator then loses another 10 units, like the motor, so it returns only 80 units to the original battery. The process starts again with 80. As you can see, if you keep repeating this, eventually you run out of energy.
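Here is a rough sketch of that same loop in Python, using made-up but plausible efficiencies (90% for the motor and 90% for the alternator are assumptions on my part, just for illustration); it only shows how quickly the stored energy drains:

```python
# Rough sketch of the battery -> motor -> alternator -> battery loop.
# The 90% efficiencies are assumed illustrative values, not measurements;
# real motors and alternators are typically somewhere in the 70-90% range.
battery_energy = 100.0   # starting energy in the battery, arbitrary units
motor_eff = 0.90         # assumed: motor delivers 90% of its input as mechanical work
alternator_eff = 0.90    # assumed: alternator converts 90% of that back to electricity

cycle = 0
while battery_energy > 1.0:                         # stop when the battery is basically drained
    battery_energy *= motor_eff * alternator_eff    # each full loop keeps only ~81% of the energy
    cycle += 1
    print(f"after cycle {cycle}: {battery_energy:.1f} units left")
```

Even with these optimistic numbers, each pass through the loop keeps only about 81% of the energy, so after a couple of dozen cycles the battery is effectively dead, and no realistic efficiency below 100% changes that outcome.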