Friday, December 6, 2013

I was watching my youngest son play Minecraft on Xbox the other
day. He was playing around with Redstone repeaters and comparators. For those
of you who are not geeks or gamers, or who have been living under a rock:
Redstone is the Minecraft equivalent of electricity. Repeaters behave like diodes, and
comparators are roughly analogous to op-amps. Combined, they let you move
pistons, light lamps, open doors, and manipulate the virtual world around you. Being
an armchair engineer I found the concept of making rudimentary machines in a
simulated world pretty fascinating. I did a little research online and found
some very clever computers made by hardcore crafters. Large transistor arrays
set up to perform simple arithmetic and display it on large Redstone lamp
screens. That’s when it hit me. How do I know I am not in a computer
simulation, a virtual world? I know, this has been the theme of many Sci-Fi
books and movies. The Matrix immediately comes to mind. Think about it though,
how do I know? I am not proposing that I am unknowingly enslaved by a robot race; I
am proposing that I, everyone else, and everything around me is software. Like in
Minecraft, we are Steves living in a virtual world we can control and
manipulate.

Sounds crazy, doesn’t it? If you take it seriously for a
second, the idea is rather demoralizing. It really strips you of your humanity
at first. Then again, if humanity has only ever been a simulation, maybe it
doesn’t. Being software doesn’t change the fact that I have lived my life.
Loved my family, my friends. Felt excitement, joy, pain, and loss. In the end, it changes nothing about me.

After musing on the idea of being “not real” in a sense, I
began to ask myself, “What are the signs of being in a simulation?” I of course
immediately turned to physics and came up with a slew of postulates, the
following two being the most prominent.

Speed of Light

I postulate that the light barrier exists only in our
virtual universe: it is an artifact of the processing speed of the physics
engine powering our simulation. I also postulate that the time dilation experienced
by an object with mass approaching the speed of light occurs because the simulation
locally preempts processing cycles it cannot service under load.

Relativistic Reference Frames

Special Relativity, or SR, is Einstein’s theory that
explains how to interpret motion between different inertial frames. In the
simplest terms, it talks about simultaneity between two bodies in motion: that
two people moving at different velocities experience different rates of time
locally. You and I, driving in different cars at different speeds, experience
different rates of time. If we are sitting in the same car, driving together,
we experience the same rate of time. A crude explanation, I know, but bear with
me.

I postulate that inertial frames are a partitioning strategy
for improving, or perhaps maintaining, performance. In a massively parallel,
real-time environment it stands to reason that to maintain the human sense (or
that of any detector built in the simulation) of simultaneity, you would exchange software
messages (events) between entities in the same frame of reference first, then
emit those messages to neighboring frames, and so on. This
propagation across partitions, combined with the maximum processing speed of
the physics engine postulated above could further explain the time dilation in
SR, also mentioned above.
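To make the partitioning idea concrete, here is a toy model of it in Python. Everything here (the `Partition` class, the one-tick cost per frame hop) is my own invention for illustration, not a claim about how such a system would actually be built: entities in the same partition observe an event on the tick it is emitted, while each neighboring partition sees it one processing cycle later.

```python
# Toy model: events propagate instantly within a reference frame
# (partition) but cost one processing cycle per hop between frames.
class Partition:
    def __init__(self, name):
        self.name = name
        self.neighbors = []  # adjacent reference frames
        self.log = []        # (tick, event) pairs this frame observed

def emit(origin, event, tick):
    """Deliver an event locally now, then flood it outward tick by tick."""
    seen = {origin}
    frontier = [origin]
    delay = 0
    while frontier:
        for p in frontier:
            p.log.append((tick + delay, event))
        nxt = []
        for p in frontier:
            for n in p.neighbors:
                if n not in seen:
                    seen.add(n)
                    nxt.append(n)
        frontier = nxt
        delay += 1  # each partition hop costs one cycle

# Three frames in a line: A <-> B <-> C
a, b, c = Partition("A"), Partition("B"), Partition("C")
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]

emit(a, "flash", tick=0)
print(a.log)  # [(0, 'flash')] -- local observers agree on "now"
print(b.log)  # [(1, 'flash')] -- one hop away, one cycle later
print(c.log)  # [(2, 'flash')] -- "simultaneous" is frame-relative
```

The point of the sketch is just that a scheme like this naturally produces observers who disagree about when an event happened, depending on how many partition boundaries sit between them and the source.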

The Hypothesis

Both of my postulates focus on Einstein’s relativity because
it describes hard limits in our known universe. There are many other quantum
effects, like superposition and wave-particle duality, where one can quickly start drawing
connections between the effect and the complex nature of a massive
simulation. This is a good segue into my first hypothesis:

I can start to describe the nature of the systems running
this simulation by understanding the known physical limitations of this
universe. Looking at the micro (subatomic) and macro laws of this universe, one can
start to understand its specifications. I will attempt, with help from my peers,
to describe the system running our simulation. Processing speed, memory, cache,
storage, networking, partitioning, and clustering. I will first test my theory
by writing software that can determine the limitations of its environment and
infer the system design: clock speed, memory, etc. It stands to reason that the
simulation software (or games) we humans write effectively reflects the
systems we created to run it. My goal in this exercise is to distill the
patterns and principles for applying this test to our “real world”. Funny,
while many engineers spend their careers trying to create artificial
intelligence, I will be trying to write software that can say to itself, “Hey,
I’m not real. I’m just software.”
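As a first, trivial step in that direction, a program can already measure a few things about its own host. The sketch below is a minimal example, assuming nothing beyond the Python standard library; the probes and their interpretation are mine, not a finished methodology:

```python
# Minimal self-probing sketch: how fast is a unit of work here,
# and what word size does this environment expose?
import sys
import time

def ops_per_second(n=1_000_000):
    """Estimate the effective instruction rate seen by this process."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i  # a deliberately simple, repeatable unit of work
    elapsed = time.perf_counter() - start
    return n / elapsed

def word_size_bits():
    """Infer the platform word size from the largest native int hint."""
    return sys.maxsize.bit_length() + 1

rate = ops_per_second()
print(f"~{rate:,.0f} additions/sec")       # varies with system load
print(f"{word_size_bits()}-bit platform")  # e.g. 64 on most modern hosts
```

A program running inside a simulation we wrote could build up a profile like this and, over time, infer quite a lot about the machine hosting it; the open question is what the analogous probes look like from inside our universe.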

My next hypothesis is that load in our “real-world”
simulation causes entropy, or error. Once I understand the simulation’s runtime
environment I can start to look for things that cause load. One could postulate
that an extremely large number of people doing stuff in a small, condensed or
confined space could tax the system enough to cause the proverbial “lag” on the
system and induce entropy. If I can find the correct conditions and a good way
to determine what’s “normal” then find entropy under those conditions it could
expose the underlying workings of the system. In the postulates above I speculated
that mass approaching the speed of light causes entropy. This is hard to test
on the macro level, as it’s really hard to get large objects moving near the speed of
light. It’s also expensive to test on the micro level, and it may not tax the system
enough, so I will not focus on that. I am going to focus on math, more
specifically probability. One can speculate that math is prevalent in the
universe that created this simulation, and just like our computers are
influenced by our math, this simulation is influenced by their math. I will
turn to the probability around random number generation. I will build a pseudo
random number generator that runs Monte Carlo simulations. I will stage these
devices in numerous areas likely to contain huge, dense masses of
people and compare their accuracy against isolated control devices. My theory is
that entropy in the simulation will cause entropy in its arithmetic engine, causing
the random number generators to slip in accuracy. Hopefully data from this type
of experiment will lead to more diagnostic tests. Your thoughts and opinions
are welcome.
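For the curious, here is roughly what one trial of that test could look like in Python. This is a sketch under heavy simplification: the real experiment would use physical devices in crowded versus isolated locations, whereas here both "devices" are just seeded software generators, and the pass/fail bound is the standard statistical error of the estimator, not anything specific to the hypothesis:

```python
# One trial of the proposed test: estimate pi by Monte Carlo and check
# whether the error stays inside the statistically expected bound.
import math
import random

def estimate_pi(rng, samples=100_000):
    """Dartboard estimate: fraction of random points inside the unit circle."""
    inside = sum(1 for _ in range(samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / samples

def within_expected_error(estimate, samples, sigmas=4):
    """Standard error of the estimator is sqrt(pi * (4 - pi) / n)."""
    se = math.sqrt(math.pi * (4.0 - math.pi) / samples)
    return abs(estimate - math.pi) <= sigmas * se

field_device = random.Random(42)      # stand-in for the crowded-area unit
control_device = random.Random(1337)  # stand-in for the isolated control

for name, rng in [("field", field_device), ("control", control_device)]:
    est = estimate_pi(rng)
    ok = within_expected_error(est, 100_000)
    print(f"{name}: pi ~ {est:.4f}, within expected error: {ok}")
```

The hypothesis predicts that, under heavy simulation load, the "field" devices would start failing the `within_expected_error` check more often than chance allows while the controls stay clean.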

I wanted to quickly come back around to quantum mechanics.
As we peer deeper into the subatomic world it all starts to unravel; the
known universe starts to break down. The phenomena we see, like superposition
as illustrated by Schrödinger’s cat, or wave-particle duality in the double-slit experiment, could be
artifacts of an old model. As little as a couple hundred years ago humans had
no ability to observe at that level so there was no need for a simulation to
model it. Perhaps humans are learning to manipulate this simulation faster than
the operators can update the code to model it.

Now before you send the white coats out to have
me committed, just really stop and think about it. It’s no crazier than
believing in god. A lot can be learned by undertaking an endeavor like this. One
can imagine the consequences, assuming it is true, of understanding the
environment and manipulating it in an unintended manner. Hacking our universe.
Could be hugely beneficial, or worst case, cause a system crash and a hard
reboot! If anyone is interested in joining me, just leave a comment. I will be
starting an open source project on GitHub for the software and hardware soon. If
anything, it will be fun!