God of the Gaps ("well then, how did...")

Let's review.

Scientifically, a "creator" requires information that exists outside of space and time.

So far there is no proof of such a thing. In fact, everything we see suggests the opposite: information is conserved; it neither enters nor leaves the universe.

What we have, here in Flatland, are local concentrations of information. They go by names like matter and life. We can trace the increasing complexity of these local information structures to within seconds of the Big Bang.

Any time information is created, there must be an increase in entropy somewhere else. This shows up at the quantum level as the degradation of entangled states: they interact with "something" (possibly the microwave background radiation, or possibly even the vacuum itself), and information is transferred away from them.
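As a toy illustration of that transfer (a sketch only; the dephasing rate below is an arbitrary assumption, not a physical value):

```python
import numpy as np

# Equal superposition |+> = (|0> + |1>)/sqrt(2); the off-diagonal
# terms of its density matrix carry the phase information.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)

gamma = 0.3  # assumed dephasing rate from interaction with the bath
for step in range(5):
    # Pure dephasing: each interaction with the environment damps the
    # off-diagonal coherences while leaving the populations untouched.
    rho[0, 1] *= np.exp(-gamma)
    rho[1, 0] *= np.exp(-gamma)
    purity = np.trace(rho @ rho).real
    print(f"step {step}: coherence={abs(rho[0, 1]):.3f} purity={purity:.3f}")

# Purity falls from 1.0 toward 0.5: the phase information has been
# transferred to the environment, exactly the "degradation" above.
```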

To prove even the POSSIBILITY of a creator, one must show that information can exist outside of spacetime.
 
The answer comes from the science called "quantum thermodynamics", which views the information in terms of an OPEN system rather than a closed system.

There, the formula for entropy is

H = H(s) + H(b) + H(sb)

where s is the system, b is the "bath" (which could be the universe), and sb is the interaction between the system and the bath.
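A numerical sketch of that decomposition, reading H(sb) as whatever is left over after the marginal entropies, i.e. H(sb) = H(total) − H(s) − H(b) (that sign convention is my assumption):

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy: -sum p log2 p over the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Two qubits, "system" s and "bath" b, in the entangled Bell state
# (|00> + |11>)/sqrt(2).
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_sb = np.outer(psi, psi.conj())

# Partial traces: indices are [s, b, s', b'] after the reshape.
rho4 = rho_sb.reshape(2, 2, 2, 2)
rho_s = np.einsum('ikjk->ij', rho4)  # trace out the bath
rho_b = np.einsum('ikil->kl', rho4)  # trace out the system

H_s, H_b = entropy_bits(rho_s), entropy_bits(rho_b)   # 1 bit each
H_total = entropy_bits(rho_sb)                        # 0: pure joint state
H_sb = H_total - H_s - H_b                            # -2 bits
print(H_s, H_b, H_sb)  # the correlation term carries all the information
```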
Interesting. Under the First Law, Britannica's recently updated page on all four laws of thermodynamics states:
If the system is not isolated, the change in a system's internal energy ΔU is equal to the difference between the heat Q added to the system from its surroundings and the work W done by the system on its surroundings; that is, ΔU = Q − W.
There's still no "sb"-like term to handle production of heat from, say, a chemical reaction between the "system" and its "surroundings." In G.B. anyway, they've sadly only just begun thinking this stuff through. Presuming "an isolated system," as all four laws have, was always dumb, since there's no such thing in everyday reality.
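For concreteness, that bookkeeping with invented numbers: 100 J of heat in, 40 J of work done by the system on its surroundings, gives

```latex
\Delta U = Q - W = 100\,\mathrm{J} - 40\,\mathrm{J} = 60\,\mathrm{J}.
```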
 
Interesting. Under the First Law, Britannica's recently updated page on all four laws of thermodynamics states:

There's still no "sb"-like term to handle production of heat from, say, a chemical reaction between the "system" and its "surroundings." In G.B. anyway, they've sadly only just begun thinking this stuff through. Presuming "an isolated system," as all four laws have, was always dumb, since there's no such thing in everyday reality.
What really happens is:

1. Information is exchanged with the bath.
2. This alters the number of configurations in the bath.
3. Which changes the probability distributions both in the bath and in the system.
4. The new probabilities then act back on the system.

This is an example of a change in the density matrix that specifies the probabilities for each possible outcome. The change in probability is called a probability current or probability flux.
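Here's a toy of steps 1 through 4 for a classical two-state system, with a probability current driving the distribution toward whatever the bath imposes (the rates are invented):

```python
import numpy as np

# The bath fixes the transition rates; those rates then reshape the
# system's probability distribution (steps 1-4 above).
w_up, w_down = 0.8, 0.2       # assumed bath-determined rates
p = np.array([1.0, 0.0])      # start fully in state 0
dt = 0.01

for _ in range(1000):
    # Probability current (flux) from state 0 into state 1:
    J = w_up * p[0] - w_down * p[1]
    p += dt * np.array([-J, J])

print(p)  # relaxes toward the bath-imposed distribution [0.2, 0.8]
```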


Mathematically, the fluxes are given by the Kraus operators, a special case of which is the Lüders rule.
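A minimal numerical sketch of both, using the standard amplitude-damping channel as the Kraus example (the damping strength g is an arbitrary choice):

```python
import numpy as np

g = 0.3  # assumed damping strength
K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)

# Completeness: sum_k K_k^dagger K_k = I, so the trace is preserved.
assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))

rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)  # the |+> state

# General channel: rho -> sum_k K_k rho K_k^dagger.
rho_out = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T
print(rho_out)

# The Luders rule is the special case where the Kraus operators are
# projectors: rho -> P rho P / Tr(P rho P) after outcome P is observed.
P = np.diag([1, 0]).astype(complex)  # projector onto |0>
rho_post = P @ rho @ P / np.trace(P @ rho @ P)
print(rho_post)
```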


Density matrix: [image]
More information: Google "stochastic equation density matrix quantum thermodynamics"



 
Thanks, but I'm too old and busted to follow you far down such a rabbit hole. Love the Fock space though. There, had to teach my dictionary a new word, WTF?, ha ha.
 
Hawking's idea:

Instead, the way the universe started out at the Big Bang would be determined by the state of the universe in imaginary time. Thus, the universe would be a completely self-contained system. It would not be determined by anything outside the physical universe that we observe.

That's not very interesting; it amounts to the statement "the universe exists," which even we agree on.
 
Thanks, but I'm too old and busted to follow you far down such a rabbit hole. Love the Fock space though. There, had to teach my dictionary a new word, WTF?, ha ha.
Be warned: Fort Fun Indiana and scruffy are avoiding the core issue, namely to what we can attribute the universe's presence. He's fond of speculating about how an already existing system might behave, but that's not the subject here. The subject is not how an existing system behaves but what led to that system existing in the first place: why should something with a behavior even exist?

The problem for him is that we can't use the behavior of a system to explain how the system came to exist. Science cannot answer the question, because until the system exists we have no fields, laws, probabilities, and so on.

He and a few others here are insisting that the explanation for the universe's existence can be a mechanistic process, but until things exist there can be no mechanistic processes.

Look at every post he makes on this question: they all begin with assertions and assumptions that something exists that we can describe. But that's easy; how it came to exist is not.
 
Few things amuse me more than watching people attempt to provide material, scientific answers to non-scientific questions! They insist on using a tool that cannot work, yet they'll stubbornly insist that, given time, we can get it to work. No amount of trying to unlock a lock with the wrong key ever works; all you do is try over and over again to perform a fruitless task.
 
For something to exist where it once did not exist is, by definition, a change: a state change. However, until something does exist there can be no change and no state, and if change cannot take place then nothing will ever change; hence the universe would continue to not exist forever.

This is the materialist's problem: how to describe the change from nothing to something when every description involves already existing things.

Like the square peg in the round hole, though, they'll press on, forging ahead to this brave new world where science has unlimited explanatory powers. The only problem is that it is an intellectual dead end, as even the Greeks knew.
 
It's a bit more than that, as shown in the quote.
It's pop-science. I was reading (but not fully understanding) Hawking and others when I was 14, which is over 50 years ago. You post this stuff like it's revolutionary; it isn't, it's interesting speculation, and cosmologists love to do this. People don't seem to grasp that speculation about the universe is not new. In the 70s it was just as common as it is today; the players might have changed, but the game's the same.

Here are some of the books from the physics section of my shelves:

The Universe: Its Beginning and End (1975)
White Holes (1977)
From Quarks to Quasars (1977)
The Structure of the Universe (1977)

I have several more too, all bought back in the 1970s when I was studying relativity and lots of other stuff.
 
The answer comes from the science called "quantum thermodynamics", which views the information in terms of an OPEN system rather than a closed system.

There, the formula for entropy is

H = H(s) + H(b) + H(sb)

where s is the system, b is the "bath" (which could be the universe), and sb is the interaction between the system and the bath.


To understand this, we have to look at the Lindblad master equation, which once again describes an OPEN system.
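For the curious, a bare-bones Euler integration of a Lindblad equation for one qubit with pure dephasing (the Hamiltonian, rate, and step size are all arbitrary choices):

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

H = 0.5 * sx   # arbitrary system Hamiltonian
L = sz         # dephasing jump operator
gamma = 0.1    # assumed coupling rate to the bath
rho = np.array([[1, 0], [0, 0]], dtype=complex)  # start in |0><0|

dt = 0.001
for _ in range(5000):
    # Lindblad form: d(rho)/dt = -i[H,rho] + gamma(L rho L+ - {L+L, rho}/2)
    comm = H @ rho - rho @ H
    anti = L.conj().T @ L @ rho + rho @ L.conj().T @ L
    rho = rho + dt * (-1j * comm + gamma * (L @ rho @ L.conj().T - 0.5 * anti))

print(np.trace(rho).real)        # ~1: the trace survives the open dynamics
print(np.trace(rho @ rho).real)  # < 1: purity leaks into the bath
```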


The further rub is that any change to the bath entropy alters the dimensionality of the phase space, a situation which has since been addressed.
I shall alert the media.
 
you post this stuff like it's revolutionary
False. I posted it directly in response to the idea that no such plausible theory exists or even can exist.

You just can't help yourself. This is not the section for dishonest posters. As I think you are finding out.
 
I wonder what would make the initial state (pre big bang, for instance) so perturbed, assuming that it was fully in balance to begin with, that something of any kind could then happen or be created.

The supposition that something outside the previously closed system must have snuck in suggests that the closed system on its own (without some outside influence) would never have resulted in any matter/energy/time/space.

Well, what created the original closed system, and what created the outside force that then entered the previously closed system?
 
Thanks, but I'm too old and busted to follow you far down such a rabbit hole. Love the Fock space though. There, had to teach my dictionary a new word, WTF?, ha ha.
Here's what I'm trying to do:

Quantify the information in a neural network as it learns.

My starting point is a Hopfield network, which is about as simple as it gets. Everything is binary, the probability densities are known and fixed. That's where I am right now.
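For reference, here's about the smallest runnable version of that starting point (one stored pattern, Hebbian weights, asynchronous updates; the pattern and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16
pattern = rng.choice([-1, 1], size=N)   # one stored +1/-1 pattern

# Hebbian weights with no self-connections.
W = np.outer(pattern, pattern) / N
np.fill_diagonal(W, 0)

# Corrupt a few bits, then let the network settle.
state = pattern.copy()
state[:4] *= -1

for _ in range(5):                       # asynchronous update sweeps
    for i in rng.permutation(N):
        state[i] = 1 if W[i] @ state >= 0 else -1

print(np.array_equal(state, pattern))    # True: the pattern is recalled
```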

Next, I want to understand what happens in a Boltzmann machine, which is the same thing except it has an external gradient that breaks symmetry.

Finally, I want to understand what happens when we add or subtract neurons "as" the device is learning.

These behaviors should be analogous to thermodynamics. The study necessarily takes us into "non-equilibrium" thermodynamics, which is analogous in some ways to open quantum systems.

It is already obvious (from simulations) that the entropy picture changes radically when the neurons have three states instead of two (on, off, and "refractory", the latter being like a gap or constraint in the time evolution).

When you add a neuron, you're not adding information, you're actually losing some, because the number of available paths (the "dimensionality") increases. It is only after learning that the information content can reach and then surpass its original level.
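One crude way to see that bookkeeping, using "structure per neuron" (1 − H/H_max) as a stand-in for information content (that formalization is mine, not necessarily the right one):

```python
import numpy as np

def entropy_bits(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A trained 3-neuron net whose dynamics visit only 2 of the 2**3
# configurations: low entropy, lots of structure.
p_old = np.array([0.5, 0.5, 0, 0, 0, 0, 0, 0], dtype=float)
print(1 - entropy_bits(p_old) / 3)   # ~0.67 structure per neuron

# Add one untrained neuron firing at random: the number of available
# configurations (the "dimensionality") doubles.
p_new = np.kron(p_old, [0.5, 0.5])
print(1 - entropy_bits(p_new) / 4)   # drops to 0.5 until learning recovers it
```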

So formally, adding or subtracting neurons is very much like having an "open" thermodynamic system: the dimensionality changes at the same time the probability densities change. The math of probability currents applies, but it's more than that. In quantum land you have fixed and predictable relationships between the different types of information; in neural network land you don't. In a simulation, "I as the experimenter" decide when to add a neuron, but in real life it's somewhat random.

What we learn from open thermodynamic systems is that the relationship between events inside the system and the character of the bath is dynamic; it's best described by stochastic differential equations that act on both the values and the dimensionality of the probability densities. Such a system should be formulable algebraically (using matrix mechanics). The math for changes in dimensionality already exists; it can be found in Lie algebras, for example. To make it work, though, we need a rock-solid understanding of the various kinds of information and how they play into the system's behavior.
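A toy of what that might look like numerically: Euler-Maruyama steps on the probability values, with a rare random event that appends a new dimension, standing in for adding a neuron (all rates here are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

p = np.array([0.7, 0.3])   # probability density over 2 configurations
dt, sigma = 0.01, 0.05     # step size and assumed noise strength

for step in range(2000):
    # Euler-Maruyama step on the values: drift toward uniform + noise.
    drift = 1.0 / p.size - p
    noise = sigma * np.sqrt(dt) * p * rng.standard_normal(p.size)
    p = np.clip(p + dt * drift + noise, 1e-9, None)
    p /= p.sum()                     # stay on the probability simplex

    # Rarely, the dimensionality itself changes: a new configuration
    # appears with small weight (a "neuron" was added).
    if rng.random() < 0.002:
        p = np.append(p, 0.01)
        p /= p.sum()

print(p.size, np.round(p, 3))
```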

In a real neural network you don't just have binary elements, you have analog elements as well, with continuous (but bounded) information transfer. So approximating firing behavior by a "rate" isn't going to work; the two behaviors are fundamentally different, and their information spaces are different too. If we want an expression for entropy, we have to take all this into account.
 
I wonder what would make the initial state (pre big bang, for instance) so perturbed, assuming that it was fully in balance to begin with, that something of any kind could then happen or be created.
That's the big one. You nailed it.
 
Here's what I'm trying to do:

Quantify the information in a neural network as it learns. ...
Keep at it. Your enthusiasm is infectious and bound to produce results. Inspiring others if nothing else. My mom did information science. Lots of interest and support out there, particularly from the military.
 
I wonder what would make the initial state (pre big bang, for instance) so perturbed, assuming that it was fully in balance to begin with, that something of any kind could then happen or be created.
I don't believe nature likes "fully in balance" any more than it likes either extreme void or pure matter. I picture a long dying universe with black holes quickly vacuuming up all the remaining matter and space while locally creating an abundance of dielectric potential. Then bam! The cycle restarts.
 
I don't believe nature likes "fully in balance" any more than it likes either extreme void or pure matter. I picture a long dying universe with black holes quickly vacuuming up all the remaining matter and space while locally creating an abundance of dielectric potential. Then bam! The cycle restarts.
It could be a cycle. A long-ass cycle. But still, a cycle.

Doesn’t address how it got started.
 
