quantum computer generates the first truly random numbers ever

scruffy

As is well known, digital computers are incapable of generating truly random numbers.

The best they can do is "pseudo-random": sequences that look random but are produced by a deterministic algorithm and will repeat after a long enough period.
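Here's a toy example in Python (my own sketch, not from the article): a tiny linear congruential generator with a deliberately small modulus, so you can watch the sequence cycle back on itself, which is exactly the repetition every pseudo-random generator eventually exhibits.

```python
# Toy linear congruential generator (LCG): x_{n+1} = (a*x_n + c) mod m.
# The tiny modulus is deliberate so the period is short enough to observe.
def lcg(seed, a=5, c=3, m=16):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=7)
seen = []
for value in gen:
    if value in seen:
        print(f"sequence repeats after {len(seen)} steps: {seen}")
        break
    seen.append(value)
```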

Turns out, quantum land is the one and only place in the universe where we get truly random behavior.

And now for the first time, quantum computers are proven to be able to harness this behavior.


From the standpoint of physics, randomness is a primary property of the quantum universe. No one knows how it happens. One of the very interesting things about it is that its distribution is flat, not Gaussian. In other words, it doesn't look like the sum of many small independent contributions, which by the central limit theorem would come out Gaussian.
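To make "flat" concrete (standard textbook quantum mechanics, my own sketch, not tied to any particular machine): the Born rule says each outcome's probability is the squared magnitude of its amplitude, so a qubit prepared in an equal superposition gives a 50/50 distribution over 0 and 1.

```python
import math

# Born rule: P(outcome k) = |amplitude_k|^2.
# A qubit in the equal superposition (|0> + |1>)/sqrt(2) has amplitudes:
amplitudes = [1 / math.sqrt(2), 1 / math.sqrt(2)]

probabilities = [abs(a) ** 2 for a in amplitudes]
print(probabilities)  # [0.5, 0.5] -- a flat distribution over the two outcomes
```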

Which in turn suggests that there's a single, unified process underlying it (as distinct from a bunch of little processes).

Nothing in this universe would be possible without the randomness. Everything from our consciousness to light itself depends on it.

Mathematically, it is entirely unclear whether there is any connection between randomness and quantization. No one knows if these are different processes, or part of the same process. Quantization is associated with counting, whereas randomness falls into the "uncountable" category. One can think of this in terms of the difference between the integers and the reals. The integers are "countably infinite" whereas the reals are "uncountable".

The relationship between the two was explored by the mathematician Georg Cantor, who discovered the famous Cantor Dust. It works like this:

Quantum processes generate random real numbers between 0 and 1. So take the interval (0,1) and chop out the middle third. (Which means you now have two intervals remaining, the left third and the right third, each of which has length 1/3.) For every remaining interval, chop out the middle third, and keep doing this recursively an infinite number of times. You end up with a "dust", and what Cantor proved is that this dust has the same number of points as the interval it started from: its cardinality is the same as that of the entire interval (0,1).
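Here's a small sketch of the construction (my own illustration): each pass removes the open middle third of every remaining interval, and the total remaining length shrinks toward zero even though uncountably many points survive.

```python
# Build the first few stages of the Cantor set construction on (0, 1).
def remove_middle_thirds(intervals):
    """Given a list of (left, right) intervals, remove each one's open middle third."""
    result = []
    for left, right in intervals:
        third = (right - left) / 3
        result.append((left, left + third))    # keep the left third
        result.append((right - third, right))  # keep the right third
    return result

intervals = [(0.0, 1.0)]
for stage in range(1, 6):
    intervals = remove_middle_thirds(intervals)
    remaining = sum(r - l for l, r in intervals)
    print(f"stage {stage}: {len(intervals)} intervals, total length {remaining:.4f}")
# The remaining length is (2/3)^stage, which tends to 0, yet the limiting
# "dust" has the same cardinality as the whole interval (0, 1).
```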

This remarkable and counterintuitive result arises because we're trying to count the reals, which, apparently, doesn't work. In math, 'points' are the things we count one at a time, and topology changes this into the concept of "neighborhoods", which overlap in uncountable ways.

So when we generate a random number, we're not really generating a 'point', we're generating a neighborhood. This concept is driven home in probability theory. If you have a continuous distribution with probabilities on the interval (0,1), the probability of getting "exactly" 0.5 is ZERO. However, the probability of getting a result in an epsilon-neighborhood around 0.5 is positive. In other words, you have to integrate the density over the neighborhood to get a non-zero probability.
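A quick worked example (uniform density on (0,1), my own choice for simplicity): the probability of landing within epsilon of 0.5 is the integral of the density over that neighborhood, which is 2*epsilon, and it shrinks to zero as the neighborhood shrinks to the single point.

```python
import random

# Uniform density on (0,1): P(|X - 0.5| < eps) equals the integral of the
# density over the neighborhood, i.e. 2*eps.  A Monte Carlo check agrees.
random.seed(0)
samples = [random.random() for _ in range(100_000)]

for eps in (0.1, 0.01, 0.001):
    exact = 2 * eps
    estimate = sum(abs(x - 0.5) < eps for x in samples) / len(samples)
    print(f"eps={eps}: exact={exact:.4f}, Monte Carlo={estimate:.4f}")
# As eps -> 0 both tend to 0: a single exact point carries zero probability.
```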
 
As is well known, digital computers are incapable of generating truly random numbers.
I suppose it must be true if you think about it.

Mathematically, it is entirely unclear whether there is any connection between randomness and quantization. No one knows if these are different processes, or part of the same process.
I think it seems pretty certain the two are connected. Because one is the act or event being observed, and the other is the observer. It is the interaction between these two which either partly or wholly creates/defines randomness.
 
Dragon dice are both random and quantized ... rolling an infinite number of times yields a random irrational number ... which requires randomness to be smooth ... think one-time pad ...

The OP is correct ... computers can't do this, not without infinite memory ... but you know, 256 raised to the 1,000th power is random enough ... what we call in the trade "pink noise" ...
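Quick back-of-the-envelope check (my own arithmetic, in Python, not from the post): 256 to the 1,000th power is the same as 2 to the 8,000th power, i.e. 8,000 bits of state ...

```python
# 256**1000 == (2**8)**1000 == 2**8000, i.e. 8,000 bits of state.
n = 256 ** 1000
print(n == 2 ** 8000)   # True
print(len(str(n)))      # 2409 -- the number has 2,409 decimal digits
```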
 
As is well known, digital computers are incapable of generating truly random numbers. [...]

The last time you pontificated about randomness you made a fool of yourself. You claimed chaos was random, but it is not; here, this might jog your memory:

 
I suppose it must be true if you think about it.


I think it seems pretty certain the two are connected. Because one is the act or event being observed, and the other is the observer. It is the interaction between these two which either partly or wholly creates/defines randomness.

The two may be connected at a deep level.

Quantum entanglement, for instance, resolves to a quantized observable.
 
The two may be connected at a deep level.
Quantum entanglement, for instance, resolves to a quantized observable.

Think about it: no computer can truly, directly, independently say it is "random"; the very act of being "random" also requires an observer to observe it being random.
 
so quantum mechanics is broken? listening to this right now wondering why I'm doing this again, talking with the crowd. Could be that the computer hallucinates the crowd along with me? is that random? does hallucination cause randomness, scruffy?

 
True random chance must include any number sequence I could imagine or generate. IOW, a true random number generator must be allowed to select a series of numbers that appear anything but random to a human observer, 1, 2, 3, 4, 5 for example, or 1, 1, 1, 1, 1, 1, 1, etc.

That being established, who can say whether any series of numbers was randomly selected or selected in a predetermined order?
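One way to put a number on that (my own aside, not from the thread's source): every specific length-n sequence from a uniform generator has exactly the same probability, whether it "looks" random or not.

```python
# For a uniform generator over the digits 0-9, every specific 5-digit
# sequence has probability (1/10)^5 = 1e-05, no matter how patterned it looks.
p_single = 10 ** -5
for seq in ([1, 2, 3, 4, 5], [1, 1, 1, 1, 1], [7, 2, 9, 0, 4]):
    print(seq, "probability:", p_single)
```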
 
True random chance must include any number sequence I could imagine or generate. IOW, a true random number generator must be allowed to select a series of numbers that appear anything but random to a human observer, 1, 2, 3, 4, 5 for example, or 1, 1, 1, 1, 1, 1, 1, etc.

That being established, who can say whether any series of numbers was randomly selected or selected in a predetermined order?
Be warned, scruffy is out of his depth discussing this aspect of science.
 
Without memory, ghosts can't even put definitions to the words they say. That'll take you through some philosophical moments, but it boils down to them not having any existence and the listener interpreting even the most direct commentary. But randomness, does that require infinite memory? scruffy?
 
If a ghost were part of the mind, it would have access to the memory? As far as I can tell, I get silence when they're asked to repeat themselves, as well as a slew of other memory tricks that prove they can't learn from experience. They are neither outside nor inside, but an interpretation of the mind and its memory capabilities, as well as stories and other things.
 
What I'm saying, scruff, excuse me, scruffy, is that without a memory or temporal context, could you consider that random?
You raise an interesting point, Trevor.

Some of the smartest mathematicians ever have tried to answer that question, and can't.

Two things are worth noting. First, the American mathematician Norbert Wiener, who worked with radar at MIT during WW2, said it is important to distinguish between the process and the outcomes.

He studied the seemingly random reflections of radar signals from planes and ships, and it led him to investigate Brownian motion. He formulated an abstract "process" (which is named after him, the Wiener process) that he called a "generator", in the sense that it generates the outcomes. The process is deemed to be random irrespective of the observations; it is called "stochastic" to distinguish it from the outcomes, which are observed to be "random".
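A minimal sketch of the idea (my own illustration): a discrete approximation of a Wiener process is just a running sum of independent Gaussian steps. The update rule is the "generator"; any particular wiggly path is one set of observed outcomes.

```python
import random

# Discrete approximation of a Wiener process (standard Brownian motion):
# W(t + dt) = W(t) + sqrt(dt) * N(0, 1).  The update rule is the "generator";
# a particular path is just one set of observed outcomes.
def wiener_path(n_steps=1000, dt=0.001, seed=42):
    rng = random.Random(seed)
    w, path = 0.0, [0.0]
    for _ in range(n_steps):
        w += rng.gauss(0.0, 1.0) * dt ** 0.5
        path.append(w)
    return path

path = wiener_path()
print("final value:", path[-1], "over", len(path) - 1, "steps")
```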

As an observer, you need many successive observations to characterize the randomness of a generator. The observer's point of view is described by the Wiener "kernel", which is one way to characterize the stochasticity of the underlying generator.

The other thing is, the Russian mathematicians were big into this. You've surely heard of Andrei Markov, and another big player was Kolmogorov. Turns out, the generators have shapes; they're not all the same. You can have memory in the generator itself, and it can still be random. The idea is, if the thing the memory depends on is random, then its influence will be random too. A Markov process is specifically memory-free: the next outcome depends only on the current state, not on the path that led there. The Italian mathematician Volterra characterized processes with memory. He began by working on linear systems and defined the "Volterra kernel", and turned his attention to random behaviors just before he died.
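A small sketch of the memory-free property (my own toy example, not from any of the references above): in a two-state Markov chain the next state is drawn using only the current state, so the simulation never needs to look at the history.

```python
import random

# Two-state Markov chain: the transition probabilities depend only on the
# current state, never on the earlier history -- that is the "memory-free" part.
transitions = {
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def step(state, rng):
    return "A" if rng.random() < transitions[state]["A"] else "B"

rng = random.Random(1)
state, counts = "A", {"A": 0, "B": 0}
for _ in range(10_000):
    state = step(state, rng)
    counts[state] += 1
print(counts)  # long-run occupancy approaches the chain's stationary distribution
```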

To answer your question though, an observer tests the outcomes relative to a standard, namely the probability distribution (or density, for a continuous system). You are right that in theory an infinite number of observations would be needed to establish "exact" conformance. This is where concepts like variance (and higher moments) come in. For most practical engineering applications, about 1 part in 1,000 is sufficient (so, for example, you commonly see polls with about 1,000 data points). You can plot variance against the number of observations to see if you get an asymptote. A big part of modern data science is distinguishing the variance due to the observation from the variance of the underlying process. Machines are typically much better and faster at this than humans are.
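A quick sketch of that variance-versus-sample-size picture (my own illustration): for independent samples the variance of the sample mean falls off like 1/N, so the estimate settles toward an asymptote as the observation count grows.

```python
import random

# The variance of the mean of N independent draws scales like sigma^2 / N,
# so more observations pin down the underlying distribution more tightly.
rng = random.Random(0)

def sample_mean_variance(n, trials=2000):
    """Empirical variance of the mean of n uniform(0,1) draws."""
    means = [sum(rng.random() for _ in range(n)) / n for _ in range(trials)]
    grand = sum(means) / trials
    return sum((m - grand) ** 2 for m in means) / trials

for n in (10, 100, 1000):
    print(n, sample_mean_variance(n))   # roughly (1/12) / n for uniform(0,1)
```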
 
This paper describes the certification protocol in more detail.

Certification of random numbers is kind of like the atomic clock, whereas ordinary engineering applications are kind of like a wristwatch.

Certification of real-valued numbers from quantum computers uses an information-theoretic approach: the protocol measures "entropy", which then translates into a number of bits.
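Here's a toy sketch of the entropy-to-bits bookkeeping (my own simplification, not the paper's actual protocol): min-entropy per sample is -log2 of the most probable outcome's frequency, and multiplying by the number of samples gives a bit count.

```python
import math
from collections import Counter

# Toy entropy accounting: estimate min-entropy per sample from observed
# frequencies, then translate a run of samples into a number of bits.
# (This is a simplification, not the certification protocol itself.)
def min_entropy_bits(samples):
    counts = Counter(samples)
    p_max = max(counts.values()) / len(samples)
    return -math.log2(p_max)          # bits of min-entropy per sample

samples = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0]   # stand-in data
per_sample = min_entropy_bits(samples)
print(f"{per_sample:.3f} bits/sample, {per_sample * len(samples):.1f} bits total")
```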


In this paper they claim over 70,000 bits, and 2 to the 70,000th power is an astronomical number. That is far more randomness than any supercomputer could generate within a human lifetime.
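To get a feel for the scale (my own quick check in Python): 2 to the 70,000th power written out in decimal runs to over 21,000 digits.

```python
# 2**70000 has about 70000 * log10(2) ≈ 21073 decimal digits.
n = 2 ** 70000
print(len(str(n)))   # 21073
```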

There are certification standards for digital computers, like NIST SP 800-90B and FIPS 140-2.


These mainly deal with "sufficient" entropy for cryptographic purposes.
 
The last time you pontificated about randomness you made a fool of yourself. You claimed chaos was random, but it is not; here, this might jog your memory:

Is randomness at odds with a denial of deterministic outcomes?

If so, is it always so?

Also, quick question: can any outcome of chaos be determined?
 