A big evolution
Version 0.1k; November
10, 2016. This work is in its utter infancy, more metaphysics and
speculation than proper science. It is largely a collection of
thoughts on this topic until I can organise it more coherently.
I'm working on software to model universes (or parts thereof) as
described below at multiple scales, some of which has been published
as peer-reviewed scientific research.
We present a comprehensive and consistent meta theory of universes
(or theory of meta universes aka "the Universe"). The Universe is an
infinitely expanding and dynamic information theoretic bitstring
encoding its own grammar and parser, progressing through evolutionary
operations applied by a recursive step function. The Big Evolution
Theory (TBET) posits that principles identified in biological
evolution, such as mutation and natural selection, apply universally,
extending beyond biological objects and entities to abiotic objects,
abstractions, and,
ultimately, information. Universes or universal bitstrings (and
portions thereof) may thus be evaluated in terms of homology and
analogy. These universes exist as a dynamic equilibrium of
interacting particles (bits of information) contained in a universe as
specified by chaos theory [1]. The
laws of complexity thus follow from the laws of physics. Everything
therefore is evolution all happening at once.
A universe is a quantum relative universal computing device that
evolved to encode its own grammar and parser described by the basic
recursive equation:
U(s) = U(s) + U(s++)
U is the recursive step function that operates on itself to
generate its corresponding component universe bitstrings and s
represents a step in its evolution (initialised to 0). Recursion is thus postulated to be
the fundamental process by which things happen in nature. As U
recurses on itself, it causes an evolutionary expansion like a balloon
inflating, except that at each step there is also an evolutionary
process of variation and selection happening at the same time. The
goal here is to
model the organisation of information that occurs by this process,
rather than deal with its material aspects.
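To make the recursion concrete, here is a minimal sketch in Python of
one possible reading of U(s) = U(s) + U(s++), assuming (my
interpretation, not a canonical implementation) that "+" is bitstring
concatenation and that each step also applies variation and selection
to the new bits:

    import random

    def step(u):
        # One recursion of U: the universe at the next step is U(s)
        # concatenated with a varied copy of itself (my reading of
        # U(s) = U(s) + U(s++); the variation operator is illustrative).
        variant = list(u)
        i = random.randrange(len(variant))
        variant[i] = '1' if variant[i] == '0' else '0'  # mutate one bit
        return u + ''.join(variant)  # "selection" is trivial in this toy

    u = '0'  # s initialised to 0; the primordial universe is one bit
    for s in range(8):  # eight evolutionary steps
        u = step(u)
    print(len(u), u)  # the bitstring inflates (doubles) at each step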
In more formal terms, a universe is a digital computing device
encoded in the form of a bitstring that includes both a grammar used
to generate it and an interpreter to compile and execute it. Our
Universe is a Universal Turing Machine encoding a grammar that
describes quantum and relativistic physics interpreted (compiled and
executed) by a corresponding automaton (in the case of context-free
grammars, it is a push down automaton). Summing and concatenation are
interchangeable operations in and between bitstring universes, where a
given universe variable (opcode) consisting of i bits (the bitindex)
is represented by the number 2^i and its value is equal to the
bitstring encoded by i-1 bits.
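As a small illustration of this interchangeability, consider two
bitstrings occupying disjoint bit positions: concatenating them and
summing their values then coincide (the opcode layout shown is my
reading of the 2^i convention, not a definitive spec):

    # Concatenating bitstring a onto an n-bit string b is the same as
    # the sum a * 2**n + b: summing and concatenation are
    # interchangeable when the bit positions do not overlap.
    a, b, n = 0b101, 0b0011, 4
    concatenated = (a << n) | b
    summed = a * 2**n + b
    assert concatenated == summed == 0b1010011

    # An opcode of i bits is represented by the number 2**i; its
    # operand value fits in the bits below it (my reading of the text).
    i = 5
    opcode = 2**i                   # 0b100000
    operand = 0b1011                # fits in the lower i-1 bits
    instruction = opcode + operand  # summing == concatenating here
    assert instruction == opcode | operand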
This model is a superset of all string theories (including M-theory)
as well as loop quantum gravity. (It is left as an exercise to the reader to
figure out why and how this is the case. :)
It follows logically that a theory of everything would not only
explain and unify quantum mechanics and relativity but also would be
completely self contained, consistent, and explain itself. As with
string theories, including M-theory, this model does not make new
predictions that lend themselves to
falsification (in general, any theory of everything has this
problem).
In terms of mathematical logic, all the string theories and any
other theory that unifies quantum mechanics and relativity are
equivalent. The bitstring theory or model says that a digital
computing device encoding a push down automaton is provably equivalent
to a universal Turing machine. The model implies that our Universe can
be simulated in a digital computing device with the von Neumann
architecture, which is consistent with the development of modern
computers from quantum mechanics. The model also indicates that the
mechanics of quantum objects illustrates the limits of Gödel's
Incompleteness Theorems.
Most importantly, the bitstring model indicates that all objects
and problems in nature are recursively enumerable substrings that can
be interpreted (compiled and executed) by a push down automaton
encoded by the universal Turing machine or digital computing
device. This statement is made accepting the fractal-based nature of
the universe and by understanding that if our universe could be
simulated using computing devices of the type used currently, then it
(the statement) becomes a tautology.
If we think of the universe as akin to computing devices, then the
issue of hardware vs. software becomes a matter of state. When a
modern computing device is constructed and an operating system and
associated software run on it (which is essentially one big program)
to achieve a specific outcome, we cannot say the outcome is achieved
by one or the other alone. When trying to model universes, the
distinction becomes meaningless as it does in formal computing science
(i.e., the universal Turing machine is an abstract machine that can
run any type of a program, including one that specifies a modern
computing device's operating system). The hardware and software are
viewed as being the same in the bitstring model: the hardware
represents the abstract bit object and the software specifies the
information content (or arrangement and interactions) of the
bits.
- A bit is an abbreviation for binary digit with the values of 0
and 1. A period (.) refers to one bit; b refers to a string of bits of
arbitrary length; and B refers to the bitstring representing our
Universe.
- A bitstring b may be thought of as a quantum relative (QR)
object (QRO) denoted by o and O.
- Our Universe is an uppercase U and, correspondingly, when
  referring to the evolution of our Universe specifically, we will refer
  to it as the/our Big Evolution; the bitstring representing U is
  referred to as B. Any given universe is referred to with a lowercase
  u.
- <=> is used to denote equivalence of nonintegers. Thus b <=> o
<=> u and B <=> O <=> U.
- length(b) is abbreviated as l(b) and corresponds to the number
of steps taken by/in a QRO/universe. We then note by convention that
l(b) = l(o) = l(u) = l. l(B) = l(O) = l(U) = L.
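These conventions are easy to mirror in code; a trivial sketch (names
mine) of the notation above:

    def l(b):
        # l(b): the length of a bitstring, i.e., the number of steps
        # taken by/in the corresponding QRO/universe.
        return len(b)

    b = "1011"   # an arbitrary bitstring...
    o = b        # ...viewed as a quantum relative object (b <=> o)
    u = b        # ...viewed as a universe (b <=> u)
    assert l(b) == l(o) == l(u) == 4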
A conceptual framework for creating and manipulating bitstring
universes is now presented. If the description of this framework is
not clear, then reading how it is
implemented may help with clarity.
Information theoretic component
The physical component
- Even though infinity is generally conceived as being something
that is long, large, big, etc. the humble circle drawn in two
dimensions also represents infinity: you can go around it forever in
one of the dimensions (its circumference). I don't believe it's mere
coincidence that the symbol for zero is represented by a
circle. Likewise, the less humble Moebius strip in three dimensions
illustrates how one can keep going for an infinitely long time on a
two dimensional surface (the edge of a Moebius strip is homeomorphic
to a circle). A Klein bottle similarly allows us to visualise the
same concept in four dimensions if one is travelling in the three
dimensional space within the bottle. An important point I want to
make here is that even though these are finite objects when considered
holistically, there is an infinite component to them depending on
the axes being traversed. I like to think of our physical universe
as a balloon that is constantly inflated (a balloon that is
naturally specifiable by a set of bits) that creates these sorts of
infinite universal substructures on its surface and that the meta
Universe is a collection of universes upon universes upon
universes, ad infinitum.
- The creation of our universe involved the creation of time
first (which is automatic as the universe evolves---the time bit is
just tracking the number of steps taken), followed by energy
(light), then mass (corresponding to energy/2*x), and then
dimension. There's a reason for stating this. The fact that this
Universe fits the equation E=m*c^2 is how we came up with this
concept, but I am not using relativity to justify it; right now it
fits the relation E=m*x^2. The creation of all universes also
proceeds in a similar manner. In a general sense, x is a constant
that is only a function of the age of a given universe, which in the
bitstring model depends on how easy it is to shift an entire set of
bits. In our Universe, the value of x is fixed to the speed of light
in vacuum (c).
- The next series of events (time steps) will represent what we
know about our current Universe and represents the first (and
perhaps only) real assumption in the model. The assumption is that
the creation of the next opcode, which we'll say is frequency,
occurs *after* the equivalent number of steps in time that can store
the value of c (299,792,458), which is the speed of light. That is,
the bitstring representing a universe where its Gödel number is
equal to c will be the first QR object created. In information
theoretic terms, since the speed of light being a constant is the
only information needed for a minimal quantum object, a bit in this
bitstring model can be thought of as the container of that
information (again, this emphasises that we are referring to
information theoretic bits).
- This Gödel number should also correspond to the number for
the "god particle" or so called Higgs boson and should contain the
bitstring of 1011. In other words, for the creation of our universe, we
see that the first quantum object had properties that are
proportional to c when its field became applicable. The creation of
a quantum object in our current universe must also have the same
Gödel number.
The minimum number of bits required to store this number alone in a
simple fashion is 29 (bit indices 0 through 28). With 29 bits, the
maximum value, 2^28 + 2^27 + ... + 2^1 + 2^0, is 536,870,911.
This is using a primitive method to store a value that can be stored
more efficiently using a much smaller number of bits. But what
matters here is the concept: we step through the number of bits
required to store the speed of light.
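This bit count is easy to verify (a quick sketch checking the
arithmetic above):

    c = 299_792_458                  # the speed of light in m/s
    assert c.bit_length() == 29      # minimum bits to store c directly
    # The maximum value storable in bits 0 through 28:
    assert sum(2**k for k in range(29)) == 2**29 - 1 == 536_870_911
    assert c <= 536_870_911          # so 29 bits suffice (28 do not)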
- At this point it should be obvious that the choice of
  t, e, m, and d initially is rather arbitrary. What matters now is
  that relativistic effects will start to become apparent and apply,
  and will always end up being proportional to the number of bits left
  in relation to the time + speed of light. Thus what the model is
  proposing is that each quantum object, including atoms and so on,
  is a universe in its own right. What we have when we have an atom is
  a small universe created by conversion. Our current Universe is one
  that has grown into a collection of universes.
- Including the four bits above, our baby universe consisting of
just a photon is now at least 32 bits long. This universe
contains/encodes the information (number of bits) that will satisfy
the E=mc^2 equation as well as the Planck-Einstein equation E = hf
where h is Planck's constant (6.62606957(29) * 10^-34 J·s or
4.135667516(91) * 10^−15 eV·s). In other words, if a bitstring
corresponding to a quantum object can store the value of c, as well
as contain the first opcodes, then it can also store the value of
c^2 or hf.
- Reserving 32 bits automagically results in the creation of the
first constant c, created after going through the number of steps
which corresponds to the bitstring that can hold the speed of
light. There's a reason this is chosen thus: we know the speed of
light is the limit in this Universe. It is the limit in the
quantum world as well as in the relativistic world for a reason
which is given by the model above. Creation of this constant should
automatically enable the creation of the values corresponding to
Planck's constant or the Boltzmann constant. All these constants
should be containable in the bitstring used to contain c (even
though that is the only information used). Readers familiar with
quantum mechanics should realise that this universe is now alleged
to bring in Planck's law, which reduces to the Rayleigh–Jeans law at
low frequencies (large wavelengths) and tends to Wien's law in the
limit of high frequencies (small wavelengths).
- The final opcode is thus set to be a number and we'll call it
frequency, which is a count of how things happen relative to the time
advanced, which causes a fraction. (Note that from frequency and speed
you can get wavelength and vice versa.) At this point, spontaneity
gets very hard since we just introduced a bunch of bits
into our universe (a primitive universe at zero time will then be
about 32 bits long at least). The probability of a spontaneous event
occurring then will be 1/2^l(b) where l(b) is the length of the
current universe's bitstring. However, different universes can merge
to create bigger universes and universes can also break (which,
experimentally speaking, is how the quantum revolution began),
provided the large Universe can absorb the change in the bitstring.
- The model says that a given position (or other quantity) can
interchange with respect to time (or any quantity; it just means
adding bits to the local universe bitstring OR shifting bits between
other quantities). Thus in the local universe, an object of mass 1
or greater will have an energy that corresponds to the number of
steps in time times the speed of light. If the number of time steps
is 0, and the object has an energy of 1, then its mass will be
1/c^2. It is also possible to have an object that has an
energy of m * c^2 = h (or Planck's constant) with the frequency
opcode. Stepping through the bitstring is what corresponds
to the energy (or any other quantity) quanta observed as Planck's
constant.
- We've now thus explained the photon (see the implementation), the
  most basic quantum object with a resting mass of zero and a resting energy
corresponding to the speed of light which is simply the number of
steps the quantum mechanical relativistic computer has recursed
through. We can also start to have the creation of quantum objects
all the way through atoms and beyond with whatever properties we
desire but they should match what we observe in the quantum
world. These objects can have a number of properties which depends
on the bitstring that is encoding them at this point. An object can
have a mass of 0 and energy of 0, and its time will correspond to
the number of steps taken in the universe thus far. An object can
have a mass of 1, in which case it can have a maximum energy that
corresponds to c^2. If it has a lower energy, then the number of
remaining bits can be used to encode other information such as
position/dimension and so on. Thus the smallest particles are likely
to fly closer to the speed of light; the smaller they get, i.e., the
closer the mass is to zero, the closer they will get to the speed
of light.
- Again, the maximum or minimum value possible for any arbitrary
quantity for an arbitrary universe corresponds to the remaining
bits left in the opcode times the maximum value to the right of
the bitstring. The maximum or minimum value possible for any
arbitrary quantity for our Universe thus far is likewise the same
except that we can already say that energy and mass are related by
power of two and that the maximum number of spatial dimensions is 3
(if you include time it is 4).
- Quantum physics is discrete math and calculus with the delta
set to quanta measured in Planck units. Relativistic physics is
discrete math and calculus with the delta set to c, the speed of
light.
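A sketch of the spontaneity rule from the bullets above, assuming the
probability of a spontaneous event in a universe with bitstring b is
1/2^l(b) (a direct transcription of the formula in the text), and
checking the energy relations for a photon-like object (the frequency
chosen is arbitrary and purely illustrative):

    def p_spontaneous(b):
        # Probability of a spontaneous event: 1 / 2**l(b).
        return 1.0 / 2 ** len(b)

    baby_universe = "1" * 32             # the ~32-bit photon universe
    print(p_spontaneous(baby_universe))  # ~2.3e-10: spontaneity is hard

    # The same bitstring is claimed to satisfy both E = m*c**2 and
    # E = h*f:
    h = 6.62606957e-34                   # Planck's constant, J*s
    c = 299_792_458.0                    # speed of light, m/s
    f = 5.0e14                           # an arbitrary frequency, Hz
    E = h * f                            # Planck-Einstein relation
    m = E / c**2                         # equivalent mass from E = m*c**2
    print(E, m)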
It is one thing to specify a conceptual model, but it is another
to say how it will be implemented. We don't know how it is implemented
in nature, though we can guess. Since we're working in a digital
computer, and since I'm writing this out in English on the Web, for
simplicity's sake there are some implementation details that I'll
specify. This just means that these issues are separate from the model
itself and should not be confused.
- I've already referred to the universe as a computer. I've also
specified a recursive function. This results in the universe being a
giant stack that operates very much like a modern computer would.
- A modern computer is called a push down automaton for a good
  reason. Basically there is a stack, a stack pointer, and operations
to push and pop stuff off the stack. Instructions are given as
opcode-operand pairs where opcode specifies the command (in this
case, a variable or quantity type) and the operand specifies the
value. Simply put, the bitstring for a given universe is
represented by a stack, a stack pointer, and push and pop operations.
- The probability of bitstring conversion happening spontaneously to
another bitstring b is 1/2^l(b).
- All information is in the stack + stack pointer; push
operations don't cost anything (corresponds to time) and pops don't
occur within a universe in an absolute sense; thus the substacks of
our Universe can just be thought of as shifting between states. The
environment this happens in can be thought of as a field. This is
why we have the second law of thermodynamics. Entropy is time.
- We'll assume that like in a machine, opcodes and operands
follow each other in pairs. As mentioned above, opcodes can be
specified by themselves. In terms of actual implementation and
writing it out, if you just have a string of bits, it gets confusing
(to us) as to what really is an opcode-operand pair. We'll thus
assume that these pairs when they are specified are
separated by a bit that says whether we have only the
opcode (0) or a full pair (1). If we have only the opcode specified
then the bit determines whether the operand value of the opcode is 0 (which
is the same as not specifying the opcode) or whether the operand
value of the opcode is the maximum. In nature, within the bitstring
of our Universe, there's no need for this sort of separation since
we're dealing with physical objects.
- An unused opcode-operand pair is how a constant is specified: the
  number of bits in the opcode specifies the exponent (as a power of 2)
  and the operand specifies the mantissa.
- Thus the expanded specification for a photon at rest would be:
4.b-3.3-2.0-1.b-0.299792458 - This means that the object at rest has
a frequency which is represented by the value that follows 4
(separated by a .) stored in the bits up to the next opcode/operand
pair (separated by a -), a dimension of 3, a mass of 0, an energy
that corresponds to a number stored in B and the amount of time
elapsed is equal to
299792458, the speed of light (c). This is one of the most basic
quantum objects that can be created though an object with just time
or just time and energy is possible. The two b's together just need
to equal Planck's constant (h) divided by the wavelength, which can
be stored as an exponent/mantissa pair.
- This is the full representation given for clarification. If we
are parsimonious, then we can drop the mass opcode/operand pair
(2.0) entirely (which would mean the object
has a mass of 0), we could drop the time opcode, and we could just
specify the dimension opcode. Thus the reduced representation will
have: 4.b-3-1-299792458 If we drop the frequency and dimension, we
can just get 1-299792458 which can also be used to specify c^2 (this
would mean that the frequency * wavelength = c). Thus shifting the
bitstrings around allows us to go between Planck's constant times
the frequency (gives E), and c^2 (gives E), and frequency *
wavelength (equal to c). Once we fill the position opcode/operand
pair in 4D spacetime at the number of steps where the light starts to
curve away from the source, all the constants start to get defined.
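Here is a hedged sketch of how the expanded textual representation
above could be parsed; the handling of the placeholder value b and of
bare opcodes is my guess at the intended semantics, not a definitive
spec:

    # Opcode numbers as used in the photon example above.
    OPCODES = {0: "time", 1: "energy", 2: "mass", 3: "dimension",
               4: "frequency"}

    def parse(spec):
        # Fields are opcode/operand pairs separated by '-'; within a
        # pair, the opcode and operand are separated by '.'.
        obj = {}
        for field in spec.split("-"):
            op, dot, val = field.partition(".")
            if dot:                  # full opcode.operand pair
                obj[OPCODES[int(op)]] = val if val == "b" else int(val)
            else:                    # bare opcode: operand is 0 or max
                obj[OPCODES[int(op)]] = None
        return obj

    photon = parse("4.b-3.3-2.0-1.b-0.299792458")
    # -> {'frequency': 'b', 'dimension': 3, 'mass': 0, 'energy': 'b',
    #     'time': 299792458}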
As an example, these could be the first opcodes and their
corresponding bit strings representing primordial baby universes.
t = 000 = 0 or time
e = 001 = 1 or energy (or <= 0 time)
m = 010 = 2 or mass (or <= 0,1 time or E/2)
d = 011 = 3 or dimension/position (or <= 0,1,2 time, energy, mass)
x = 100 = 4 or position x (fill at number of steps (address) where speed of light is a constant in our universe)
y = 101 = 5 or position y
z = 110 = 6 or position z
f = 111 = 7 or count of steps (i.e., gives frequency/wavelength) relative to position (or <= 0,1,2,3 time, energy, mass, dimension)
The number of steps in a given universe is the universe's
Gödel number.
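The opcode table above, transcribed as a small Python sketch (the
names and enum layout are mine):

    from enum import IntEnum

    class Opcode(IntEnum):
        T = 0b000  # time
        E = 0b001  # energy
        M = 0b010  # mass
        D = 0b011  # dimension/position
        X = 0b100  # position x
        Y = 0b101  # position y
        Z = 0b110  # position z
        F = 0b111  # count of steps (frequency/wavelength)

    # Per the text, a universe's Goedel number is simply the number of
    # steps it has taken.
    assert Opcode.F == 7 and bin(Opcode.Y) == '0b101'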
If it is hard to imagine how a one dimensional (1D) string can
give rise to two dimensional (2D) and three dimensional (3D)
structures: Imagine a flat (2D) piece of paper that has nothing
written on it on both of its surfaces. By definition it is empty or
contains nothing. Then imagine a small dent made on the surface on one
side which then shows up as a bump on the other. When this dent/bump
occurs, the overall shape of the paper changes ever so slightly
especially when you consider that this paper is expanding/inflating at
the speed of light. Now imagine this process occurring over billions
of years and in parallel, with the odds of any point on the paper
becoming a bump or a dent constantly changing as more of them appear.
Neighbouring bumps and dents will start to interact and form higher
dimensional structures on the two surfaces. Where once the 2D surface
was flat and empty, containing nothing, it now has 3D structure that
is constantly in motion due to the inflation/expansion. The
dents/bumps are the fundamental particles that comprise the universe,
i.e., the bits, and TBET posits that they are interacting according to
principles we observe in biological evolution. The dualism which is
not present when the paper is flat and empty arises the moment the
first bump is created. Regardless of the dimensional space one starts
with, if this kind of big evolutionary process takes hold, then you
will ultimately end up in a position where one not only has matter,
but also life and sentience. Interestingly, the inherent property of
any individual bit or bitstring to seek the lowest energy state,
thereby enabling evolutionary action on a system of such
bits/bitstrings, could be thought of as a form of fundamental
intelligence or sentience.
A 1D string in this model is a collection of bits, as is a 2D
sheet of paper, or 3D (or even higher dimensional) construct. That is,
the first string arose by the joining of a number of bits, and a 2D
paper is a collection of strings. In the above imaginary scenario
involving the 2D surfaces, we assume there exists some form of medium
(like the aether of yore) where bumps/dents can occur. This may be
true but it may also be that the surface is being created on the fly
and what we are calling expansion/inflation is simply the addition of
bits around the edges to make the universe bigger. In other words, the
complex system dynamics of a given universe is what gives rise to the
phenomenon of inflation (just like with economics for
instance). Either way, the outcome of a growing and evolving universe
remains the same.
- Perhaps the most aesthetic aspect of this model is that it
proposes how to create something from nothing. Rather than assuming
a Universe where there was a "big bang" (there may well have been
one, corresponding to an energy that is twice the speed of light
squared), we assume a Big Evolution. It is by stepping through time
equal to the age of our current Universe that we have evolved to this
point, able to ask the question of how our Universe was created.
- Dualities are more simply and readily explained by invoking the
process of recursion to explain all fundamental objects, events, and
forces in nature. If the universe was built from the ground up by
adding to existing universes or objects (similar to how natural
numbers arise from set theory), then duality is a natural
consequence that signifies at an information theoretic level whether
or not a universe with a particular configuration exists.
- time, energy, mass, and dimension are likely to be the first
four basic quantities (types) or opcodes. These correspond to what
we call physical properties. This is a consequence (feature) of the
bitstring representation.
- time is the most basic variable quantity corresponding to an
opcode of 0 and is the only variable capable of being truly infinite
with arbitrary probability. This is because it is the step function
of this Universe and the probability of a time step occurring is
infinity (the inverse of 0). Thus time always keeps advancing. "Time"
here doesn't refer to Einsteinian time but rather simply to a step
function. It is possible an entire universe exists "in the now" by
having its bits come on and off in a particular fashion but it
contains what we would call past, present, and future.
- Time (0) and energy (1) are related. Energy is defined as the
"capacity to do work". There's inherent energy in everything because
of the bitstring flipping seemingly paradoxically, 0 is {0} is 1
which then wants to go back to 0. It could be that 0 itself is
potential energy and 1 is kinetic energy. This is also leads to the
notion that all bits are simply energy or time+energy bits that have
just changed form depending on the length and composition of the
bitstring, further leading to dualities of energy that are possible
such as positive (matter) and negative (gravity) energy.
- In our Universe, energy and mass can interconvert, but are
  related by a power of 2. Three dimensions (3D) can reduce to two
dimensions (2D) only by changes in mass and/or energy. The
probability of a given event occurring spontaneously is equal to
1/2^length(U) where U is the length of the universal bitstring to
the right. Also like time (or the first bit specifier), energy (the
second bit specifier) is also infinite as long as the universe keeps
growing. At the same time two bits could cancel each other out
resulting in both a zero and infinite energy universe.
- Within our Universe which has the Higgs Field, we can
have spontaneous conversion of bitstrings that represent quantum
objects and I'm proposing this is happening all the time. Keep in
mind that all bitstrings within our universe have to be created and
destroyed within our Universe's bitstring. The Higgs Field is
why we have to have the small energy quanta that is proportional to
the speed of light. It requires changes or shifts in the local
bitstring that is our Universe. This doesn't mean that we have
spontaneous *creation* in an absolute sense though it may appear
that way due to conversion. We can only move the bitstrings around
locally so much before we bump into the limits of our Universe,
which has that huge time bitstring attached to it, and even locally
things get complex pretty darn fast. Thus at any level other than
the quantum level, our Universe is effectively fixed.
- In order for quantum effects to occur, it makes sense for
  the bitstring representing quantum objects to be as small as
  possible, though this isn't a necessary rule as the rules for the
interconversions will not change; it just makes the likelihood of
interconversions slightly better.
- Our Universe is stepping through time, so anything
with respect to time is a freebie. By this I mean that if a quantum
object is at rest, time in the large Universe is still stepping
through, so the object will always change. It may require shifting
energy or mass around a bit, but it can and will happen spontaneously
and will not stop until some other subbitstring gets in the way. This
thus gives rise to the laws of motion.
- The change in a given opcode/quantity to another of
  equal value (i.e., occupying an equal number of bits, all other
  things being equal) is possible spontaneously, with a probability
  equal to that of changing one opcode to another. For example,
  changing energy to mass is possible with a probability related to a
  power of 2 (a sketch of this rule follows this list).
- Motion is just incrementing (moving along) this bitstring
(i.e., motion in our 3D space is modifying a four bit string; i.e.,
motion in space is time and vice versa). Motion of an object with
mass is a function of the neighbouring bitstring in the local
universe (within the light cone). This is related to the laws of
motion.
- I've already gotten everything I need to get
for the equation E = mc^2. Again, I am not using any of the
information in the above model explicitly. It just falls out. To
make it very clear, right now in a given universe, all you can have
are interconversions with an appropriate constant which is 0, 1, or
2. Time cannot interconvert (since it cannot interconvert to
anything other than itself; this is why time is relative). Energy
and mass can interconvert spontaneously with a probability
corresponding to the bitstring on the right. Space, energy and mass
can interconvert. If mass = 1, then the only interconversion
possible is to energy. If energy = 1, then the only interconversion
possible is to time.
- Thus there is a universal law of conservation at play
here already. I'm not using that in the above model but I could
make it explicit at this point to avoid shifting bits
constantly. Shifting the entire bitwise string corresponding to our
Universe for anything other than time is close to impossible.
- The uncertainty principle also falls out of the above
model. It basically says that a shift in the bitstring is not
measurable by the amount of the shift since the shift has already
occurred. It is a tautology since it just corresponds
to the length of the bitstring.
- Constants in the Universe occur because the maximum
value of an opcode has been reached. This is when an opcode is equal
to its operand, which can correspond to anything in theory but aside
from time requires an interconversion to something, at least
time. This means that the Universe once created can't be changed
beyond the maximum value short of shifting the entire Universe (what
is within its "light cone").
- The values or operands for the time, energy, and mass
quantities are capable of being infinite but not zero. This is
because they can be added to the bitstring at the very end. No other
quantity is capable of being infinite in our Universe, all other
things being equal, since it would require an entire shift of the
universal bitstring.
- Energy and mass are capable of being zero, but not infinite (that
  requires a shift of the bitstring).
- Dimension is an opcode with a fixed value (operand) of 3 in 3D
space but not when the quantum object is created. At that time, if
dimension is 1, then you have an opcode-operand pair that is free
for the universe to use.
- Speed is the next quantity and is the first quantity to
be defined in terms of time and position and falls out of the
bitstring automagically. Speed, defined as change in position
relative to time, is theoretically capable of achieving any
value. The speed of anything we observe in our universe, however,
cannot exceed the speed of light. You could argue this information is
contained in my model, but it is not explicit, at least at this
point. All this means is that everything we know that happens in our
Universe needs to happen (and occurs) by the time corresponding to
the bitstring that can hold the speed of light.
- Energy, mass, and speed are interconvertible using the equation
E=mc^2 (or E=mcc) where c is the speed of light.
- The first four "quantities" that came about are time, energy,
mass, and speed; yet using the simple bitwise string model above and
assuming only the speed of light, I'm able to reproduce Einstein's
famous equation. Now you may think I thought of Einstein's equation
first and then came up with the model, which I did, but all I am
assuming is the value of c in the model itself. I'm going to go ahead
and show that it retrodictively predicts what we know about QM also.
- Assuming only like to like conversions (aside from time to
time, which means time is just incrementing) in the rest of a
universe, then the likelihood of a given interconversion occurring
within the universe from one opcode to another is proportional to a
power of two (shift of bits required to change one opcode to
another). This is again the universal law of conservation.
- You can create an object with tiny, even zero, mass (QM) or
  you can create an object with tiny, even zero, energy, but not
both simultaneously (in the local universe).
- Overall, regardless of the shape a given universe adopts
(circle, sphere, Moebius strip, Klein bottles, etc.), it is a
coherent system that exhibits nonlinear dynamics that also loops
back on itself. It is expanding (i.e., adding bits) and this
expansion is what we call time which also gives it a direction
(time's arrow). Objects exist on the surface of this structure and
because of the expansion we have the concepts of a past, present
(surface), and future. The surface configuration of objects in the
universe is transient, so the past no longer exists. One way to
imagine this is as though we are dwelling on the surface of an ever
expanding balloon and even though we can go back to the same point
in space, due to the expansion it is different (i.e., at a different
time). Black holes and white holes are tears on this balloon
possibly resulting in other (positive and negative) universes.
- The bits and bitstrings (substrings) are interconnected and
influence each other's states/behaviors simply by the virtue of
being in an interacting system. An analogy is ripples in a pond
caused by dropping two stones. The ripples will intersect when the
two stones are dropped at the same time, but if the second stone is
dropped just a bit later than the first, then the ripples from the
first will be influenced by the ripples from the second, even though
it came later (future affects the past). This is the result of
spatial and temporal separation in 4D spacetime but a similar thing
happens at the bitstring level (bitseparation, which could simply be
the positional differences but could also involve the composition).
- Similarly, a metabitstring universe (metauniverse) could
contain substrings that are universes (subuniverses), all of which
have undergone big evolutions at different rates (big metaevolution)
and possessing different physical characteristics/laws. Closed
subuniverses may present themselves as dark matter and energy to
observers in a metauniverse that contains them, which itself may be
a subuniverse contained within some other metauniverse. This is
analogous to the multiscale organisation of biological ecosystems.
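As flagged in the interconversion bullet above, here is a sketch of
that rule, assuming the probability of spontaneously changing one
quantity into another is one over two to the number of bits that must
change (the shift-counting is my illustrative assumption):

    def p_interconversion(opcode_a, opcode_b, operand_bits_shifted=0):
        # Probability of spontaneously converting quantity a to b,
        # taken as 1 / 2**(number of bits that must change).
        bits_to_flip = (bin(opcode_a ^ opcode_b).count("1")
                        + operand_bits_shifted)
        return 1.0 / 2 ** bits_to_flip

    # Energy (001) to mass (010), all other things being equal:
    print(p_interconversion(0b001, 0b010))  # 1/4, a power of two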
The model agrees with what we know about quantum mechanics,
relativity, and indeed, all of physics that can rightly be termed
"laws" (i.e., laws that are unlikely to be wrong, in our universe at
least). That is, it makes retrodictive predictions.
- If a universe is created within a Universe, the laws applying
to the top level Universe must hold. This is known as the correspondence
principle and is well known. But this is just because changing
the laws for anything other than something that is very tiny is not
possible without shifting the larger Universe in its entirety.
- When mass is small, energy is small. When energy = 0 and mass =
  0, time just increments. When energy = 1, mass = 1/c^2. When
energy = 1/c^2, mass = 1.
- The energy to create and dislodge a photon in a pure
universe with a mass of 0 is 0. This not only occurred to create
this Universe but the model predicts that it is happening
spontaneously all the time. Spontaneous conversions between
opcodes/quantities/states are also possible provided the
universal/Universal bitcode is preserved or there are only shifts
with regards to the time axis after the first four opcodes are
defined.
- This is why quantum particles seem to have entangled
  phases. Entangled phases are just correlations from different
  particles (bitstrings), or arise due to shifting bitstrings, or
  particles that are created with the same opcode arrangement but with
  different values for particular opcodes. In other words, entangled
  states share one bit in common (a toy illustration follows this
  list).
- It is easy to imagine new universes being created many ways
  under this paradigm. But let's say initially all there was was a
  (the?) void, nothingness, and bitstring universes started to crop
  up; two bitstring universes could collide/intersect somehow and
  either blow themselves up or smoothly integrate with each other.
- The Big Bang, if it really happened, may have been the
  result of baby(?) universes colliding with each other (or perhaps even
representing a chain reaction of such universe collisions to create
the universe as we know it).
- Or it could all be a Big Evolution: everything that has
  happened has come out of a growth-like process and we're just getting
  started. (I understand that this prediction and the one above it
contradict each other, but not if you consider that the bitstring
model described above is for multiverses, not just for a single
universe. In any event, hypotheses may be contradictory and may be
disproved or supported by observational evidence using well
controlled studies.)
- If both hypotheses are correct in the multiverse, but not
necessarily for any given universe, then perhaps there is a limit at
which baby universes can smoothly merge to form a bigger
universe. Nonetheless, if a universe is created by merging, it is a
Big Evolution and if it is created by colliding, then it is a Big
Bang.
- In a computing simulation of this model, if you blow up a
balloon (made up of the bit objects that comprise the universe) and
drop an object with mass on it, will it spin (as predicted by
Relativity)? And if you drop two objects, will they interact as two
objects do in space? This leads to the concept of angular momentum,
i.e., where does it come from? Like with the current explanation of
why angular momentum exists in material objects, the interactions of
bits within an object will result in that object possessing a
certain angular momentum which gives rise to the gravitational
force. In the Big Bang scenario the bits were all compressed and
expanded out at about the same time giving rise to the universe we
have now, including the four fundamental forces. In the Big
Evolution scenario the bits were already there and started to
coalesce in an evolutionary context. In other words, the fundamental
forces are really one force, which depends on the mass and energy of
the objects involved, which are interchangeable bitstring operations.
- The bits themselves represent the structure of the universe and
  the relationships between bits (and sets thereof) are where the
  emergent information is present (i.e., the network is where synergy
  occurs). The evolutionary trajectories actually taken and possibly
  taken also represent a set of relationships from which emergent
  information may be derived. These trajectories are also a form of
  memory.
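A toy illustration (my construction) of the entanglement bullet
above, where entangled states are bitstrings with the same opcode
arrangement that share one bit in common:

    def entangled(a, b):
        # Two equal-length bitstrings are "entangled" here if they
        # agree in at least one position (share one bit in common).
        return any(x == y for x, y in zip(a, b))

    print(entangled("1011", "0001"))  # True: they share the final bit
    print(entangled("1010", "0101"))  # False: perfectly anti-correlated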
Anything that involves a duality can be thought of as a related
metaphor. In general, all dualities can be mathematically shown to be
equal to one another by means of the Gödel bitstring
numbering. This applies to various theoretical, natural, life, and
applied sciences such as mathematics, chemistry, biology, and
medicine, as well as fiction in the form of movies and music.
Indeed, in terms of philosophy,
metaphysics, and epistemology, with the Gödel
equivalences set up the way described above, the distinction between
concepts like "real" and "imaginary" becomes as irrelevant as the
distinction between space and time. Thus the concept of choice
itself is an illusion.
Delving into mythology just a bit,
the concepts of good and evil, god and the devil, the fundamental
elements of fire (radiation), earth (matter), wind (currents), water,
are related metaphors. Dualities occur as a result of recursion which
represents the 0 (off) or 1 (on) states. Recursing on 0 gives you 1 ==
{0}. Recursing on 1 gives you 2 == {10}. Recursing on 2 gives you 4 ==
{100}. Reading this right to left gives you the strings "42" and "420".
In fact, there's a self consistent feedback loop/pattern being
established here and one can see a Gödel mapping everything. This makes sense since a
theory of everything should not only be able to predict itself but it
should also predict why it took us the time it did to get there. This
is why "transparency and openness is illuminating". Even an extremely
loaded metaphysical/epistemological statement like the one within the
quotes relates/maps to statements about quantum physics and optics,
and can be written down as a bitstring. Heck, it actually already is if you're
reading this online. The Gödel number for it is
9223372036854775807. This is the same way Gödel's Incompleteness
Theorem was proven, by encoding a self-referential statement in number
theory/mathematics.
The encoding process lets you give meaning to a statement, in this
case understanding its paradoxical nature (only within the
system). This is what meaning is, after all: providing a context or a
frame of reference. As an exercise, think about the plaque that was
engraved on the Voyager spacecraft destined for outer space---how
would you make aliens understand who we are?
With this model, Platonic space could be thought of as metaphors
that have a Gödel mapping between them. Quantum physics and
relativistic physics are essentially isomorphic groups. This
doesn't mean humans can't be stupid. Stupidity and ignorance are
states in a system.
1 | 0 (state bit)
on | off (state bit)
positive | negative (state/direction bit)
up | down (direction bit)
clockwise | counterclockwise (direction bit)
opcode | operand (pair bit)
variable | value (pair bit)
integer | real (pair bit)
exponent | mantissa (pair bit)
h | c (pair bit)
energy | mass (state bit)
start | end (state bit)
time | space (direction bit)
frequency | amplitude (direction bit)
wave | particle (energy/mass pair bit)
motion | rest (time/space pair bit)
discrete | continuous (h/c pair bit)
quantum | relative (h/c pair bit)
small | large (h/c pair bit)
mathematics | physics
information | entropy
life | death
new | old
good | bad
light | dark
genius | stupidity
happiness | sadness
freedom | slavery
bound | unbound
real | illusion
causal | coincidental
choice | no choice
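One way to make the claim that all such dualities are equal under a
common mapping concrete is to collapse each pair to a single state
bit (a sketch; the selection and ordering of pairs is mine):

    # Each duality from the table above collapses to one state bit:
    # the left member maps to 1, the right member to 0.
    DUALITIES = [
        ("1", "0"), ("on", "off"), ("positive", "negative"),
        ("up", "down"), ("wave", "particle"), ("quantum", "relative"),
        ("real", "illusion"), ("choice", "no choice"),
    ]

    def as_bit(term):
        for left, right in DUALITIES:
            if term == left:
                return 1
            if term == right:
                return 0
        raise KeyError(term)

    # Under this mapping, members of different dualities become equal:
    assert as_bit("wave") == as_bit("quantum") == as_bit("choice") == 1
    assert as_bit("particle") == as_bit("illusion") == 0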
Number
theory is a branch of pure
mathematics dealing with the study of integers. All integers are
products of prime numbers and, in a similar fashion, integers give
rise to rational numbers.
In 1931, Kurt
Gödel proved that consistent axiomatic formulations of number
theory include undecidable propositions. As a corollary, such
formulations cannot prove their own consistency.
In other words, sufficiently powerful axiomatic formulations of
number theory cannot be both consistent and
complete. (This gives us yet another dualism.) How could he do this?
The beauty of Gödel's
Incompleteness Theorems, as they are known, lies not just in the
theorems, but rather the method used in the proof.
Gödel's
proof consists of encoding a Gödel statement (G) of the form
"G cannot be proved in formulation F" using the mapping of symbols
(which have a semantic meaning associated with them) to integers (the Gödel
mapping or numbering) in a number theoretic formulation F. By
performing arithmetic operations on these integers, a given statement
can be proved. Thus the Incompleteness Theorem is proven by
contradiction. What the proof indicates is that if you have a powerful
enough symbolic representation of a system, an undecidable
self-referential statement within that system can be arrived at by an
appropriate mapping of the symbols. This mapping can be thought of as
existing in a higher (or
meta) level.
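A minimal sketch of the Gödel numbering technique itself: symbols are
mapped to integers and a statement is encoded as a product of prime
powers, so that arithmetic on the resulting number mirrors syntactic
operations on the statement (the symbol table here is illustrative
only):

    def primes(n):
        # First n primes by trial division (fine for short statements).
        out, k = [], 2
        while len(out) < n:
            if all(k % p for p in out):
                out.append(k)
            k += 1
        return out

    def goedel(symbol_codes):
        # Encode codes s1..sn as 2**s1 * 3**s2 * 5**s3 * ...; the
        # encoding is reversible by factorisation, which is what lets
        # statements about numbers talk about statements.
        g = 1
        for p, s in zip(primes(len(symbol_codes)), symbol_codes):
            g *= p ** s
        return g

    # The statement "0 = 0" with the toy symbol table {'0': 1, '=': 2}:
    assert goedel([1, 2, 1]) == 2**1 * 3**2 * 5**1 == 90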
Research into theories of quantum
gravity considering spacetime
as a fluid and an emergent property of fundamental particles is
more along the lines I'm thinking of. Thus Quantum
information theory is very relevant to what is talked about here.
In general the bits I'm talking about may be thought of as qubits, but
these qubits themselves represent an information theoretic bitstring,
and no-cloning
and no-teleportation
(no complete measurement) would apply; perhaps interactions with
the universal bitstring, which are inescapable, are what cause this
phenomenon.
Perhaps chaos/complexity theory offers the best visual
representations of the kind of evolution I'm talking about, that links
what Hofstadter talks about, to chaos theory, to quantum chaos, to
quantum consciousness, to the poised realm. The image of a strange
attractor is what I see a bitstring initially evolving to, before and
after the Big Bang
(or as part of the Big Evolution).
This also indicates a higher dimensional space that we are embedded in
to create this strange attractor that is our universe. The description
of the strange attractor feedback loops is related to the higher
dimensional mappings used to prove Gödel's Incompleteness
Theorems.
If all of nature (physics) is scale free then it follows that the
results of nature (evolution) are also scale free. Sentience is one subnetwork or subsystem within a
larger system that is capable of creating a model of itself. In
computational neuroscience, metastable
collections of neurons are believed to interact together to
perform certain tasks in a dynamic fashion: the mind perhaps works in
this manner. More generally, bitstring universes tend to map to other
subuniverses within them. See the mapping
to medicine for actual research we're
doing in this area.
Our CANDO drug discovery
project, as well as a lot of the research we do, exploits this
bitstring universe model.
The songs Particle Man and My Evil Twin
by They Might Be Giants, and the albums Dark Side of the Moon and
The Wall by Pink
Floyd are related metaphors.
A lot of the music by my solo project, TWISTED HELICES, including songs
like Renaissance:
Art, Philosophy, and Science, which has extensive mappings
and a recursive structure, and Proteomusic, which converts
protein structure to music, are artistic examples of the same
recursive bitstring themes being proposed here.
The movies
The Matrix,
The One, and
Dark City
to name just a few, are related metaphors.
I have been using the bitstring model to recursively enumerate problem spaces in
biology and then using knowledge based scoring functions to pick the
most promising solutions. The idea of generalising this completely
to quantum mechanics and relativity, using nothing but binary digits
and progression of strings via a step function that represents time,
came to me while I was driving down to the local grocery store with
my daughters and my neighbour's daughter. It just
struck me that I could just recurse on the most basic quantities using
a binary representation and obtain everything we know about quantum
mechanics and general relativity. Indeed, I realised that this model
covers everything we know about physics and mathematics.
I spent the next few days discussing this with my group who provide a great intellectual
foil and working out minor kinks. Every time I ended up seemingly
stuck, staying true to the model ultimately resolved it. And why
wouldn't it? The universal recursive function is the definition of a
digital computing device.
The model presented above is just that: a speculative theory that
makes no new testable predictions at this time. I assert that I've
followed this bitstring model throughout my entire life and scientific
career and it is the reason why the tools developed by us work to the
extent that they do. So the only shred of evidence that these ideas
may eventually end up being correct is the value our research has for
this world. If the research we do improves human health and quality of
life, then I'd say following the model has paid off.
Also, the above model is a bit of a cop out, since anything we
talk about can be thought of using concepts from information theory,
so there could be something further underlying the bits (and this
might go on infinitely or even recurse back to everything, i.e., 0 and
1 are intimately connected). Again, I refer to quantum
information theory and potential
mappings to physics.
If I had to write a story about how everything came about (and
this is probably the best way to describe what I am thinking of), it
would go like this:
"Nothing created the universe. Nothing was bored and wanted something
to happen, and then lo and behold, to nothing's surprise, there was
something..."
We could rephrase it in whatever way that makes one comfortable
(for example, use "no one" instead of "nothing") but read what I'm
writing carefully. Am I or am I not attributing conscious action to
nothing? What does it mean when I do the above?
If nothing is 0, then immediately there is 1 or {0} or
something. Nothing and something are yin and yang and give rise to
energy which is defined as the capacity to do work.
The answer to what happened with the universe lies with what
happens with any conscious object or entity (primarily humans for us,
since we can't really evaluate what dolphins and elephants think and
feel). When you are conceived, are you conscious? How about when you
are born? When you are two years old? Three? Is it the same for all
humans? Do YOU remember when you gained consciousness? How far back
does your earliest memory go?
When/if we achieve the singularity, we'll know whether the above
is possible or not. Even then, we can also speculate as to what
actually happened short of going back in time to when our universe was
created.
Pseudointellectual ramblings
|| Ram Samudrala
|| me@ram.org