Friday, August 2, 2013

The mathematics of war and dodgeball

Lanchester's Laws are a handful of mathematical formulae describing armed combat between two forces. I'll talk about their derivation for a little bit, then move on to dodgeball (duh).

Ranged combat

(Age of Empires III, Ensemble Studios, 2005)

Two opposing armies face one another. One is armed with rifles; the other with crossbows. Which army prevails?

Let's call the size of the respective armies \(x\) and \(y\), and call their soldiers' lethality \(\alpha\) and \(\beta\) respectively. (The lethality of a soldier is the number of enemy soldiers they can strike down per unit time.)
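If every soldier can pick out a living target, each army loses soldiers at a rate proportional to the size of the enemy force. In standard Lanchester notation (a sketch of the usual aimed-fire model):

\[\frac{dx}{dt} = -\beta y, \qquad \frac{dy}{dt} = -\alpha x\]

Dividing one equation by the other eliminates \(t\), and integrating \(\alpha x \, dx = \beta y \, dy\) yields the conserved quantity

\[\alpha (x_0^2 - x^2) = \beta (y_0^2 - y^2)\]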

(Here \(x_0\) and \(y_0\) denote the initial sizes of the two armies.)

In general, the winner of the battle depends on which is higher: \(\beta y_0^2\), or \(\alpha x_0^2\). Thus we can define an army's strength in a way that is completely independent of
its opponents (and so if we wanted to we could rank a collection of armies by strength). An army's strength is the lethality of its soldiers times the square of its size.

Two example battle trajectories. In the lower (red) curve, \(x_0 = 100\), \(y_0 = 100\), \(\alpha = 2\), and \(\beta = 1\).
In the upper (green) curve, the values are the same except that \(y_0 = 200\). In this case, double the numbers beats double the lethality.

The intuition: give your soldiers guns that fire twice as fast, and they're twice as deadly.
Double your soldiers' numbers, and they're four times as deadly. (That's times two because
of the twice-as-big total lethality, and times another two 'cause the average soldier will survive [and keep shooting] for twice as long.)
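To make that concrete, here's a small Euler-step simulation of the two trajectories (my own sketch; the function name and step size are arbitrary):

```python
# Euler-step simulation of Lanchester's square law:
# dx/dt = -beta * y, dy/dt = -alpha * x.
def battle(x0, y0, alpha, beta, dt=0.001):
    """Run until one side is wiped out; return the survivor counts (x, y)."""
    x, y = float(x0), float(y0)
    while x > 0 and y > 0:
        # Simultaneous update: each army's losses depend on the other's old size.
        x, y = x - beta * y * dt, y - alpha * x * dt
    return max(x, 0.0), max(y, 0.0)

# Red curve: equal numbers, x twice as lethal -> x wins.
print(battle(100, 100, alpha=2, beta=1))
# Green curve: y doubles its numbers instead -> y wins, as predicted by
# comparing alpha * x0**2 = 20000 against beta * y0**2 = 40000.
print(battle(100, 200, alpha=2, beta=1))
```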

The applications of this are questionable for even WWI combat, where strategy and tactical manoeuvring often accounted for a lot more than sheer numbers did — let alone modern combat, which is almost never two giant armies shooting at each other in an open field.

On the other hand, this is surprisingly useful for game design [Ad04], specifically for balancing games.
Equations like these allow the designers of real-time strategy games like Starcraft or Age of Empires (sequel pictured above) to estimate whether the games' different factions
are competitive with one another.

I'd wager that laws like this apply to any game where the worse off you are, the harder it is to be effective.
Civilisation V.
Vanilla Magic: the Gathering. Maybe even Monopoly.
(On the other hand, probably not Tekken or Call of Duty, since in those games your raw lethality isn't affected by how much damage you've suffered.)

Ranged combat with bad aim

Same setup, slightly different assumption. Let's assume that the soldiers are all very bad at aiming. (Maybe they're fighting in the dark.) They fire randomly in the enemy army's
direction. This means that it's a hundred times harder to hit a lone gunwoman than an army of a hundred. This is effectively how action movies work.
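Since a blind shot hits in proportion to how many targets are packed into the enemy's general direction, each loss rate picks up an extra factor of the target army's size (a sketch, in the same notation as before):

\[\frac{dx}{dt} = -\beta x y, \qquad \frac{dy}{dt} = -\alpha x y\]

Eliminating \(t\) now gives \(\frac{dx}{dy} = \frac{\beta}{\alpha}\), hence \(\alpha (x_0 - x) = \beta (y_0 - y)\): an army's strength is its lethality times its size, with no square. This is Lanchester's linear law, and it's why a lone hero in the dark is so stubbornly hard to kill.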

Dodgeball with bad aim

Let's talk about a variation of dodgeball I used to play, where whenever you get hit by a ball, you switch to the opposite team. (Yes, nobody's ever out 'til the game's over. There are no losers: how very 21st century primary school.)

There are obvious similarities to the military battles that Lanchester's Laws describe, but instead of casualties eliminating people from the field, we merely have people swapping between sides.
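Concretely, with bad aim team \(x\) tags (and thereby recruits) opponents at rate \(\alpha x y\), while losing its own players to team \(y\) at rate \(\beta x y\), so (a sketch of the swap model):

\[\frac{dx}{dt} = \alpha x y - \beta x y, \qquad \frac{dy}{dt} = \beta x y - \alpha x y\]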

Solving for this by eliminating \(t\) as before gives:
\[\frac{dx}{dy} = -1\]

...which is completely useless. It tells us that as one team grows, the other shrinks, which we already knew.
Instead, let's look at how the team sizes change over time. Let \(N > 0\) be the total number of players, a constant, and let \(\gamma = \alpha - \beta\), i.e. the strength advantage of team \(x\). Then:
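Substituting \(y = N - x\) collapses the system to a single logistic equation (my reconstruction of the step described here):

\[\frac{dx}{dt} = \gamma x (N - x)\]

whose solution is

\[x(t) = N \, \frac{A e^{\gamma N t}}{1 + A e^{\gamma N t}}, \qquad A = \frac{x_0}{y_0}\]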

The equation makes sense: if \(\gamma\) is positive, i.e. team \(x\) is stronger, then \(x \rightarrow N\) as \(t \rightarrow \infty\): eventually everyone ends up on team \(x\). Similarly, if team \(y\) is stronger, then \(x\) approaches zero over time.

Interpreting this: \(\frac{A}{A+1}\) is the fraction of players who are on team \(x\) at time 0; hence \(A\) is the ratio of \(x\) players to \(y\) players at time 0. That said, the approach is only asymptotic: the game drags on more and more slowly, and then suddenly recess is over.

But it's barely worth getting any deeper into this, because our use of \(\alpha\) and \(\beta\) implies that the two teams have different skill levels, which makes no sense if kids are swapping between teams all the time.
(The only conceivable exception is if being on one team confers an extra advantage; e.g. if everyone on Team Emu stands in the parking lot while everyone on Team Cockatoo drops balls down on them from the second storey.)

So for all practical purposes, \(\gamma = 0\): the game sits in complete equilibrium, and kids move between the teams at equal rates, regardless of which team is bigger.
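What if the kids can aim? Then each thrower tags one opponent per unit time no matter how big the other team is, and with the same skill \(\alpha\) on both sides we get (a sketch, mirroring the aimed-fire combat model):

\[\frac{dx}{dt} = \alpha x - \alpha y = \alpha (2x - N)\]

with solution

\[x(t) = \frac{N}{2} + A e^{2 \alpha t}\]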

Here \(A\) can be interpreted as the initial value of \(\frac{x-y}{2}\), i.e. the (scaled) difference between initial team sizes.

There are two possible cases in which this is constant:

when \(A = 0\), i.e. when the game starts with exactly equal numbers between the teams, or,

when \(\alpha = 0\), i.e. when none of the players are able to hit each other because they are all toddlers and they cannot throw for shit.

Notice that this is a highly unstable equilibrium: if even one extra player joins team \(y\), then team \(x\) is sunk, and their size exponentially snowballs downwards until it smashes to zero with a painful thunk.
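A quick numerical check of that snowball (again my own sketch; names and parameters are arbitrary):

```python
# Aimed-throw dodgeball: dx/dt = alpha * (x - y) with x + y = N fixed,
# i.e. dx/dt = alpha * (2x - N). Euler-step it and watch the equilibrium.
def dodgeball(x0, y0, alpha=0.1, dt=0.01, t_max=200.0):
    """Return (team x's size, elapsed time) when x hits 0 or N, or at t_max."""
    x, n, t = float(x0), float(x0 + y0), 0.0
    while 0 < x < n and t < t_max:
        x += alpha * (2 * x - n) * dt
        t += dt
    return max(min(x, n), 0.0), t

print(dodgeball(50, 50))  # perfectly balanced start: x just sits at 50
print(dodgeball(50, 51))  # one extra kid on team y: x crashes to zero
```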

About the Author

Chris Chen is a self-styled human being who enjoys philosophy, music composition, rants about kyriarchy, Tumblrs full of shiny pictures, abstract maths, programming, surreptitious narcissism, and other things besides.

Born in Melbourne and based in Sydney, she's not, whatever her display picture suggests, a Japanese creation goddess reincarnated as a wolf.