Utility maximization of potential disasters

I've been doing some thinking on what resources should be allocated to stymie anthropogenic global warming (AGW). Some figures: anywhere between 88% and 98% of publishing climate scientists accept some form of AGW, based on more than six large-scale surveys. In one survey, 41% of the meteorologists and geophysicists in agreement with AGW thought the consequences would be "disastrous", 40% thought they would be bad but not "disastrous", and 19% answered "not necessarily bad". This reflects growing consensus on the slippery-slope hypothesis (CO2 --> warming --> glaciers melt --> methane hydrates released --> more warming --> more methane hydrates --> ...). It has also been empirically demonstrated that the percentage of those who accept AGW theory increases with "expertise" (number of published papers in climatology). All these surveys are in the primary-sources section of the Wikipedia page "Scientific opinion on climate change".

So I'm faced with the dilemma of who to vote for in my country. (A) is really good with the economy but doesn't believe in AGW. (B) is bad with the economy but wants to create an emissions trading scheme.

Totally setting aside my actual predicament of who to vote for: how are policymakers supposed to act in a fashion that maximizes utility? Let us define utility (diverging from the economic definition for now) for the AGW scenario as "the net amount of non-negative human conscious time experienced".

In my preliminary thoughts, I've found an approximate solution difficult to come by because of the possibility of some type of extropianism in the future - the extension of human life to other planets. In that scenario, humanity would live on for a long, long time afterwards, since expansion to other planets guarantees non-destruction in the next ice age. It is also highly likely that by then humanity will have outgrown its warlike tendencies (otherwise we would all be gone by that time) or will have built numerous purposeful structural impediments to self-destruction. The possibility, however remote, of humanity surviving long enough to reach extropianism - which in turn guarantees an even longer survival time - monstrously skews any attempted approximation of maximum utility, and should lead anyone with the objective function specified above to invest a lot of resources into curbing CO2 emissions.

As you say, our gut feeling is that we want to maximise complexity. But in a world with limits, that becomes a trade-off. We can either produce a lot of complexity rapidly and die off (live fast, die young, leave a beautiful body), or live longer, but more slowly.

We have chosen option A, but kind of hope that we will be able to invent our way out of the limits - such as via Kurzweil's Singularity, Tipler's Omega point and other extropian dreams.

Someone has been reading too much science fiction and not enough neurology textbooks if they believe in this particular loophole.

So it comes back to enjoying the ride. Survival to eternity does not seem an option.

And what can be said there is that consumerism is not actually what makes people happy. So we can argue that we are indeed burning up the planet faster than it makes sense to do.

As you say, our gut feeling is that we want to maximise complexity. But in a world with limits, that becomes a trade-off. We can either produce a lot of complexity rapidly and die off (live fast, die young, leave a beautiful body), or live longer, but more slowly.

Are you saying that if we work to minimize emissions that this will halt progress, and perhaps we'll live longer but it'll be "stretched"?

apeiron said:

So it comes back to enjoying the ride. Survival to eternity does not seem an option.

And what can be said there is that consumerism is not actually what makes people happy. So we can argue that we are indeed burning up the planet faster than it makes sense to do.

Forgetting the usual notions of utility in economic discourse, what do you think of how I've defined it in this particular optimization problem (it's not really an optimization problem, but close enough)? It does beg the question, but I cannot think of another objective we should be aiming to maximize that would change the solution - the solution being the allocation of quite a lot of resources, or the opportunity cost of economic gain, to combating AGW.

Forgetting the usual notions of utility in economic discourse, what do you think of how I've defined it in this particular optimization problem (it's not really an optimization problem, but close enough)? It does beg the question, but I cannot think of another objective we should be aiming to maximize that would change the solution - the solution being the allocation of quite a lot of resources, or the opportunity cost of economic gain, to combating AGW.

The question you are posing is not sufficiently clear to me. What are you talking about trading off - the possibility of discovering technical solutions as a result of continuing business as normal?

So I'm faced with the dilemma of who to vote for in my country. (A) is really good with the economy but doesn't believe in AGW. (B) is bad with the economy but wants to create an emissions trading scheme.

Voting is just one way to have an impact in a democracy.

Figure out which is the more short-term concern and vote for the person who best addresses it. Then join an advocacy group to keep the person you voted for from messing up the more long-term concern too badly. Politics is about compromise. This is why politicians have a hard time keeping election promises.

I gave the definition of "utility" that we're trying to make better (i.e. "maximize"): 'the expected net amount of non-negative (i.e. free of starvation, pain, etc.) conscious time experienced by humans now and into the future'.

So, that is the objective that we're trying to "maximize". This is very vague and open to interpretation, but I wanted a way to consider A) the people in the now and B) all the potential people in the future. The definition provided does this.

My considerations in this problem include:
1. 88-98% of climatologists agree with AGW.
2. About 40% of those think the consequences will be disastrous.
3. There's a potential for humanity to live on for a long, long time - perhaps some kind of extropian future wherein humanity has expanded to other planets. Such a scenario guarantees survival through the next earth ice age and gives us reason to suspect that humanity has somehow overcome the problem of self-destruction.
4. There's a potential for humanity to not have an extropian future, but survive till the next ice age and even through it because of technology.
5. Potential people of the future matter just as much as people today.

Since 3. and 4. exist as potentials, and considering 1. and 2., I'm proposing that, given the objective function (the definition of utility provided), it only makes sense to allocate a huge amount of resources to the issue now - at the cost of economic benefit to the ~7 billion people alive today - in order to allow for the potential of trillions of future humans.

Given this reasoning, I think the economy should take a big blow. If we multiply 88-98% by 40% (given today's knowledge, this is our approximate probability of disastrous consequences if we don't act) and then multiply that by our expectation of the number of people that could exist under all the different potential futures without global warming, we get an approximation of the number of people we would be killing by not acting. It will be an extremely large number. We therefore have to act in a way that reduces the economic utility of the ~7 billion people in existence today, to afford future people the opportunity to exist and to do justice to the mathematics under the objective/utility function supplied.
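The back-of-the-envelope arithmetic above can be sketched in a few lines of Python. Every number is a placeholder: the 95% is just the midpoint of the 88-98% consensus range, the 40% is the "disastrous" share from the survey, and the 10-trillion figure for expected future people is purely invented for illustration.

```python
# Rough expected-value sketch of the argument above.
# All numbers are illustrative placeholders, not real estimates.

p_agw = 0.95           # midpoint of the 88-98% consensus range
p_disaster = 0.40      # share of those expecting "disastrous" consequences
future_people = 10e12  # hypothetical expected number of future people

expected_loss = p_agw * p_disaster * future_people
print(f"{expected_loss:.2e}")  # 3.80e+12
```

Whatever plausible figure is plugged in for the future population, the product stays in the trillions, which is the whole thrust of the argument.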

----

I'll give you an example of what I'm talking about. Let's say that out of the 40% of geophysicists and meteorologists who thought it would be "disastrous", 30% think it will mean the end of the human race, and of those, some think it will happen in 400 years, some in 1000 years, some in 200 years, etc. Let's approximate the average as 400 years (it's probably not this, but just for purposes of exposition...)

Let's say that IF global warming isn't a danger at all (or if AGW theory is incorrect), then from 2410 A.D. (i.e. the year 2010 + 400 years) there's a 10% chance of 15 trillion people existing in the future before annihilation, a 10% chance of 10 trillion people existing before annihilation, etc. (wild speculation, but it conveys the point).

Multiplying these together gives:

95% × 30% × (10% × 15 trillion + 10% × 10 trillion + ...)

Where
- 95% = the average percentage of climatologists who agree with AGW
- 30% = the percentage of those who think AGW will lead to the destruction of the human race, from 2410 A.D. on average
- the last bracket is the expected number of future people, taken over the probability distribution of all the futures that could exist.

THIS is the approximate number of people we are "killing", if we don't act, given today's knowledge.

It will be a big number.

There are a few other variables that I'm not considering because if I put them in it'll just be confusing.

This big number forces one to allocate a large amount of today's resources to combat AGW under the objective function of maximizing "the amount of non-negative conscious time experienced by all humans, now and in the future".

I think the reasoning is solid; what about the objective function/definition of utility?