• Truly Random Numbers On A Quantum Computer??

    From Lawrence D'Oliveiro@21:1/5 to All on Fri Mar 28 21:16:29 2025
    These researchers claim to have a technique, based on quantum
    computing, that can generate provably random numbers <https://www.csoonline.com/article/3855710/researchers-claim-their-protocol-can-create-truly-random-numbers-on-a-current-quantum-computer.html>.

    Trouble is, there ain’t no such thing. This part doesn’t make any
    sense:

    Then, to verify that true random numbers had been generated, the
    randomness of the results was mathematically certified to be
    genuine using classical supercomputers at the US Department of
    Energy.

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are
    various statistical tests for randomness, but remember that a suitably encrypted message can pass every one of them, and a person who knows
    the message knows that the bitstream is not truly random.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to All on Fri Mar 28 23:10:36 2025
    On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:

    The definition of “randomness” is “you don’t know what’s coming next”.
    How do you prove you don’t know something? You can’t. There are various statistical tests for randomness, but remember that a suitably encrypted message can pass every one of them, and a person who knows the message
    knows that the bitstream is not truly random.

    Here’s an even simpler proof, by reductio ad absurdum.

    Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party
    doesn’t know what’s coming next, but you do. Therefore they are not random to you.

    Which contradicts the original assumption of provable randomness. QED.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richmond@21:1/5 to Lawrence D'Oliveiro on Sat Mar 29 11:50:06 2025
    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Fri, 28 Mar 2025 21:16:29 -0000 (UTC), I wrote:

    The definition of “randomness” is “you don’t know what’s coming next”.
How do you prove you don’t know something? You can’t. There are various statistical tests for randomness, but remember that a suitably encrypted
    message can pass every one of them, and a person who knows the message
    knows that the bitstream is not truly random.

    Here’s an even simpler proof, by reductio ad absurdum.

    Suppose you have a sequence of numbers which is provably random. Simply pregenerate a large bunch of numbers according to that sequence, and store them. Then supply them one by one to another party. The other party
    doesn’t know what’s coming next, but you do. Therefore they are not random
    to you.

    Which contradicts the original assumption of provable randomness. QED.

    I think your definition of randomness is wrong. If the sequence can be
    repeated by anyone, then it is pseudo random, not random.

    Random is without a predictable pattern or plan.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richard Kettlewell@21:1/5 to Richmond on Sat Mar 29 15:05:58 2025
    Richmond <dnomhcir@gmx.com> writes:
    [...]
    Random is without a predictable pattern or plan.

    I can think of worse definitions.

    From the original article:

    As deterministic systems, classical computers cannot create true
    randomness on demand. As a result, to offer true randomness in
    classical computing, we often resort to specialized hardware that
    harvests entropy from unpredictable physical sources, for instance,
    by looking at mouse movements, observing fluctuations in
    temperature, monitoring the movement of lava lamps or, in extreme
    cases, detecting cosmic radiation. These measures are unwieldy,
    difficult to scale and lack rigorous guarantees, limiting our
    ability to verify whether their outputs are truly random.

    Physical sources can be found in pretty much every commodity CPU for the
last decade. So not that “difficult to scale” apparently.
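Those built-in physical sources are what the OS entropy interfaces sit on top of, so in practice one reads them through the kernel rather than via RDRAND directly. A sketch (Python; nothing here is specific to any particular CPU):

```python
# The kernel CSPRNG mixes hardware entropy sources (RDRAND, interrupt
# timings, etc.) and exposes the result through simple interfaces.
import os
import secrets

key = os.urandom(32)            # 32 bytes from the kernel CSPRNG
token = secrets.token_hex(16)   # same pool, via the secrets module
print(len(key), len(token))     # → 32 32
```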

    A lot of people are pushing QRNGs of various kinds right now. I’ve yet
    to be convinced, personally.

    --
    https://www.greenend.org.uk/rjk/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mike Spencer@21:1/5 to Richard Kettlewell on Sat Mar 29 18:38:08 2025
    Richard Kettlewell <invalid@invalid.invalid> writes:

    A lot of people are pushing QRNGs of various kinds right now. I've yet
    to be convinced, personally.

As a tech and math amateur, I made a setup to try to extract random
numbers from serial images of a plasma ball taken by a consumer-grade
web cam. Really random stuff happening in there, right? Despite
experiments with various datum-selection strategies, image formats,
etc., I never got any results that were anywhere near acceptable.

    The concept still seems to me to be potentially usable, but
    whaddoiknow?

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay
    grade.

    --
    Mike Spencer Nova Scotia, Canada

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Mike Spencer on Sat Mar 29 22:08:11 2025
    On 29 Mar 2025 18:38:08 -0300, Mike Spencer wrote:

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay grade.

    That is in production use today. I believe it’s a standard part of the entropy-gathering process in the Linux kernel.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Computer Nerd Kev@21:1/5 to Richard Kettlewell on Sun Mar 30 09:31:01 2025
    Richard Kettlewell <invalid@invalid.invalid> wrote:
    From the original article:

    As deterministic systems, classical computers cannot create true
    randomness on demand. As a result, to offer true randomness in
    classical computing, we often resort to specialized hardware that
    harvests entropy from unpredictable physical sources, for instance,
    by looking at mouse movements, observing fluctuations in
    temperature, monitoring the movement of lava lamps or, in extreme
    cases, detecting cosmic radiation. These measures are unwieldy,
    difficult to scale and lack rigorous guarantees, limiting our
    ability to verify whether their outputs are truly random.

    Physical sources can be found in pretty much every commodity CPU for the
last decade. So not that "difficult to scale" apparently.

    Simple circuits using the (ancient) 2N3904 transistor abound on the
    internet, and pre-date it as well.

    Here's a newer circuit design specifically for battery-powered
    cryptographic use and with lots of analysis and comparison with
    another circuit:
    https://betrusted.io/avalanche-noise

    None of it requires cutting-edge technology. The main issue in the
    past has simply been that it wasn't part of the original PC
    architecture, so things like "looking at mouse movements" needed to
    be done at first until it was added to modern hardware.
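Raw output from a circuit like that is typically biased, so some whitening step usually follows it. A classic one is the von Neumann extractor; a minimal sketch, assuming the raw bits are independent (it removes bias, not correlation):

```python
# Von Neumann debiasing: read bits in pairs; 01 -> 0, 10 -> 1,
# and discard 00/11. Balanced output from a biased (but independent)
# source, at the cost of throwing most bits away.

def von_neumann(bits):
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):
        if b1 != b2:
            out.append(b1)
    return out

# A heavily biased source still yields balanced output.
biased = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1, 0, 1]
print(von_neumann(biased))  # → [0, 1, 0]
```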

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Ethan Carter on Sun Mar 30 04:58:15 2025
    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    There's also an interesting paper by Anna Johnston on entropy, in which
    she makes the (correct, in my opinion) remark that entropy really is a relative notion.

    That makes sense. I’ve long thought that one’s estimates of the probabilities of various events depends very much on one’s point of view.

    I think Bayes’ Theorem says as much.

    I get the feeling here that, by the same token, you could never have a provably secure cryptosystem because someone knows the private key?

None of our cryptosystems are provably secure. For example, RSA depends
on the assumed difficulty of factorizing large integers, and
Diffie-Hellman on computing discrete logarithms; each would break if its
problem were solved. There is no proof that either of these problems is
actually hard: we simply don’t know of any good algorithms for them,
after decades, even centuries, of looking.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Mike Spencer@21:1/5 to Lawrence D'Oliveiro on Sun Mar 30 04:37:59 2025
    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 29 Mar 2025 18:38:08 -0300, Mike Spencer wrote:

    Talked to a guy at MIT in the 90s who was trying to extract random
    numbers from the turbulence of gas surrounding a hard drive. Never
    learned the tech or theoretical details -- above my amateur pay grade.

    That is in production use today. I believe it's a standard part of the entropy-gathering process in the Linux kernel.

Cool. I hope my friend, with whom I've lost contact, has been able to
cash in on the development, either academically or financially.

    --
    Mike Spencer Nova Scotia, Canada

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Stefan Ram@21:1/5 to Mike Spencer on Sun Mar 30 09:14:24 2025
    Mike Spencer <mds@bogus.nodomain.nowhere> wrote or quoted:
    As a tech and math amateur, I made a setup to try to extract random
    numbers from serial images of a plasma ball taken by a consumer-grade
    web cam.

    Even stuff like the current CPU load or the exact time right
    now adds a bit of "entropy." Plus, my computer here has a
    microphone input that probably picks up some noise too.
    I'm guessing you could get roughly evenly distributed values
    in a certain range by using modulo or XOR operations on that.
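That suggestion can be sketched in a few lines: XOR-folding several weak samples gives a uniform byte as long as at least one input is uniform and independent of the rest, whereas naive modulo reduction is biased whenever the divisor doesn't evenly divide the input range. All names here are illustrative:

```python
# XOR-folding weak entropy samples, plus a look at modulo bias.
import time
from collections import Counter

def xor_fold(samples):
    acc = 0
    for s in samples:
        acc ^= s & 0xFF     # fold each sample into one byte
    return acc

weak = [time.time_ns() & 0xFF, 0x5A, 17]   # low-grade "entropy" sources
byte = xor_fold(weak)

# Why plain modulo is not quite enough: mapping a byte onto a die roll
# with % 6 favours residues 0-3 (43 inputs each) over 4-5 (42 each).
counts = Counter(v % 6 for v in range(256))
print(byte in range(256), counts[0], counts[5])
```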

    The stats for quantum random numbers can differ from those of
    classical random numbers - but honestly, asking whether "quantum
    randomness" or "classical randomness" is the "real randomness"
    seems kind of pointless to me. Random values for observables are
    definitely central to quantum physics, though whether the world
    is fundamentally deterministic or random is still not fully
    understood! ("Measurement problem in quantum physics").

    Here's something I've posted before in comp.lang.javascript,
    in <randomness-20170601030554@ram.dialup.fu-berlin.de>:

    |I'd say, a bit source is truly random when the probability
    |of any party to correctly guess the next bit is 0.5.
    |
    |(Possibly interesting in this context:
    |
    |"In contrast with software-generated randomness (called
    |pseudo-randomness), quantum randomness is provable
    |incomputable, i.e., it is not exactly reproducible by any
    |algorithm. We provide experimental evidence of incomputability
    |--- an asymptotic property --- of quantum randomness by
    |performing finite tests of randomness inspired by algorithmic
    |information theory."
    |
    |arXiv.org > quant-ph > arXiv:1004.1521
    |
    |and also
    |
    |arXiv:quant-ph/0611029v2
    |
    |.)

    .

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Scott Dorsey@21:1/5 to invalid@invalid.invalid on Sat Mar 29 12:58:05 2025
    Richard Kettlewell <invalid@invalid.invalid> wrote:
    A lot of people are pushing QRNGs of various kinds right now. I’ve yet
    to be convinced, personally.

    The QRNG may not in fact be random, but if they turn out not to be random
    this indicates some sort of currently-unknown determinism in the
    universe and that in itself is really interesting... far more interesting
    than the mere quality of a random number generator.

    One of the traditional high-entropy RNGs has been related to the decay
    of a radioactive source since you can never tell when an atom in a sample
    is going to decay. If you COULD tell, it would be extremely useful and
    worth a Nobel at the absolutely minimum.
    --scott

    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Stefan Ram@21:1/5 to Ethan Carter on Sun Mar 30 14:32:46 2025
    Ethan Carter <ec1828@gmail.com> wrote or quoted:
    The definition of ``probability'' (in the sense of how to interpret it)
    is sort of an open problem.

    |The probability P(A|C) is interpreted as a measure of the
    |tendency, or propensity, of the physical conditions describe
    |by C to produce the result A. It differs logically from the
    |older limit-frequency theory in that probability is
    |interpreted, but not redefined or derived from anything more
    |fundamental. It remains, mathematically, a fundamental
    |undefined term.
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

Thus far we have interpreted the probability of an event of a given
experiment as being a measure of how frequently the event will occur
when the experiment is continually repeated.

    |One of the oldest interpretations is the /limit frequency/
    |interpretation. If the conditioning event /C/ can lead
    |to either A or "not A", and if in /n/ repetitions of such
    |a situation the event A occurs /m/ times, then it is asserted
    |that P(A|C) = lim n-->oo (m/n). This provides not only
    |an interpretation of probability, but also a definition
    |of probability in terms of a numerical frequency ratio.
    |Hence the axioms of abstract probability theory can
    |be derived as theorems of the frequency theory.
    |
    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because
    |there is no assurance that the above limit really exists for
    |the actual sequences of events to which one wishes to apply
    |probability theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine
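The limit-frequency idea is easy to exhibit (though never to complete) in simulation: the relative frequency m/n of a six from a simulated fair die drifts toward 1/6, but any run only ever shows finite n, which is precisely the objection quoted above. A sketch:

```python
# Relative frequency of a six approaching 1/6 for a simulated fair die.
# A PRNG with a fixed seed is used, so the run is reproducible -- and,
# fittingly for this thread, entirely predictable.
import random

random.seed(0)
n, m = 100_000, 0
for _ in range(n):
    if random.randint(1, 6) == 6:
        m += 1
print(abs(m / n - 1 / 6) < 0.01)  # close to 1/6, but n is still finite
```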

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Richmond on Sat Mar 29 22:09:46 2025
    On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:

    Random is without a predictable pattern or plan.

    Let’s say I collect and store a sequence that meets your definition. Then
    I play it back when you ask me for a random number sequence. Does it still
    meet your definition? If not, what has changed?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Computer Nerd Kev on Sun Mar 30 21:18:59 2025
    On 30 Mar 2025 09:31:01 +1000, Computer Nerd Kev wrote:

    The main issue in the past has simply been that it wasn't part of
    the original PC architecture, so things like "looking at mouse
    movements" needed to be done at first until it was added to modern
    hardware.

    The trouble with building in a purported random-number source is: how can
    you be sure you can trust it?

Intel added random-number generation instructions to the x86 architecture;
but how can we be sure they work as they’re advertised?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Computer Nerd Kev@21:1/5 to Lawrence D'Oliveiro on Mon Mar 31 08:15:54 2025
    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
    On 30 Mar 2025 09:31:01 +1000, Computer Nerd Kev wrote:

    The main issue in the past has simply been that it wasn't part of
    the original PC architecture, so things like "looking at mouse
    movements" needed to be done at first until it was added to modern
    hardware.

    The trouble with building in a purported random-number source is: how can
    you be sure you can trust it?

    That's the justification the designer of the circuit I linked to
    stated for why they decided to use a separate circuit made from
    discrete components. USB devices using similar circuits can also be
    purchased for the same reason. Anyway, you don't need a quantum
    computer to do it.

Intel added random-number generation instructions to the x86
architecture; but how can we be sure they work as they're advertised?

    How can you be sure anything works as advertised? There's always
    the risk of backdoors in the Intel Management Engine enabling all
    sorts of possible attacks. That designer likes FPGA-based CPUs for
    this reason, although there's still a small risk that the FPGAs
    could be maliciously designed to specifically sabotage that
    approach too.

    --
    __ __
    #_ < |\| |< _#

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Computer Nerd Kev on Mon Mar 31 01:30:09 2025
    On 31 Mar 2025 08:15:54 +1000, Computer Nerd Kev wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> wrote:

    Intel added random-number generation instructions to the x86
architecture; but how can we be sure they work as they're advertised?

    How can you be sure anything works as advertised?

    There are ways to test things. But not (easily) with randomness.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Richmond on Mon Mar 31 01:29:30 2025
    On Sat, 29 Mar 2025 22:39:26 +0000, Richmond wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 11:50:06 +0000, Richmond wrote:

    Random is without a predictable pattern or plan.

    Let’s say I collect and store a sequence that meets your definition.
    Then I play it back when you ask me for a random number sequence. Does
    it still meet your definition? If not, what has changed?

    Because you have stored it, it is predictable by you and you have a
    plan.

    But you don’t.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Stefan Ram on Mon Mar 31 01:34:54 2025
    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because there
    |is no assurance that the above limit really exists for the actual
    |sequences of events to which one wishes to apply probability
    |theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words, people literally bet money on it.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Ethan Carter on Mon Mar 31 01:32:52 2025
    On Sun, 30 Mar 2025 11:19:00 -0300, Ethan Carter wrote:

    The definition of ``probability'' (in the sense of how to interpret
    it) is sort of an open problem.

    It’s a term which can be defined in more than one way. One obvious one is
    as the relative frequency of different possible outcomes. I think there
    are others.

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

    I get the feeling here that, by the same token, you could never have a
    provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure.

    One example of provably secure system is the one-time pad.

    But it’s not. Where do you get the pad from? Proof of security of the
    system relies on proof of the randomness of the pad. Which takes us back
    to square one.
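For reference, the construction under discussion is tiny; the contested part is the source of the pad, not the cipher. A sketch, with os.urandom merely standing in for "a provably random pad":

```python
# One-time pad: XOR with the pad to encrypt, XOR again to decrypt.
# Shannon's perfect-secrecy proof rests entirely on the pad being
# uniform, secret, and never reused -- the disputed premise here.
import os

def otp(data, pad):
    assert len(pad) >= len(data)
    return bytes(d ^ p for d, p in zip(data, pad))

msg = b"attack at dawn"
pad = os.urandom(len(msg))   # security stands or falls with this line
ct = otp(msg, pad)
print(otp(ct, pad) == msg)   # → True: XOR is its own inverse
```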

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ethan Carter@21:1/5 to Lawrence D'Oliveiro on Tue Apr 1 10:25:30 2025
    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sun, 30 Mar 2025 11:19:00 -0300, Ethan Carter wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On Sat, 29 Mar 2025 20:25:23 -0300, Ethan Carter wrote:

I get the feeling here that, by the same token, you could never have a provably secure cryptosystem because someone knows the private key?

    None of our cryptosystems are provably secure.

    One example of provably secure system is the one-time pad.

    But it’s not. Where do you get the pad from? Proof of security of the system relies on proof of the randomness of the pad. Which takes us back
    to square one.

    I think your ``square one'' is that no system is provably secure.
    This denies the work of various thinkers who have written definitions
    and proofs. A proof is usually work of mathematical nature, not of
    engineering nature. Randomness is assumed in all of these proofs, so
    there is not a single step in them that's flawed in any way.

    So I think your position is that the assumption of randomness is not a
    good idea. You seem to rather prefer to assume that randomness
    doesn't exist. But that's just another assumption. And it's not an interesting one. It destroys a lot of good work.

    Why is randomness assumed? We can't calculate without it. For
    instance, what's the probability of getting a 6 in a fair die? It's
    1/6. But that's not true in your choice of assumptions because you
    reject the assumption of randomness. What do you get as a result? I
    think none---you wouldn't have a model to work with.

--8<-------------------------------------------------------->8---

    What about the practical world? We have enough randomness to run the
    entire world as it is currently done despite the accidents we've had
    and could still have. So I don't think it's a good idea to say that
    we don't have provably secure systems because someone may have
    criticisms with respect to the quality of random number generators: we
    have various systems that satisfy the definition of provably secure.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Ethan Carter@21:1/5 to Lawrence D'Oliveiro on Tue Apr 1 10:31:59 2025
    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

    |In spite of its superficial appeal, the limit frequency
    |interpretation has been widely discarded, primarily because there
    |is no assurance that the above limit really exists for the actual
    |sequences of events to which one wishes to apply probability
    |theory.
    |
    "Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words, people literally bet money on it.

    Discarded in its theoretical use, which is where the discussion is. I
    think nearly nobody disputes how useful the limit-frequency
    interpretation is.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Ethan Carter on Fri Apr 4 19:05:05 2025
    On Tue, 01 Apr 2025 10:31:59 -0300, Ethan Carter wrote:

    Lawrence D'Oliveiro <ldo@nz.invalid> writes:

    On 30 Mar 2025 14:32:46 GMT, Stefan Ram wrote:

|In spite of its superficial appeal, the limit frequency
|interpretation has been widely discarded, primarily because there
|is no assurance that the above limit really exists for the actual
|sequences of events to which one wishes to apply probability
|theory.
|
"Quantum Mechanics" (1998) - Leslie E. Ballentine

    Discarded or not, it’s the definition used in gambling. In other words,
    people literally bet money on it.

    Discarded in its theoretical use, which is where the discussion is. I
    think nearly nobody disputes how useful the limit-frequency
    interpretation is.

    “The difference between theory and practice is, in theory there is no difference, but in practice there is.”

    I wonder who said that ... ?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Toaster@21:1/5 to Scott Dorsey on Fri Apr 4 20:16:55 2025
    On Sun, 30 Mar 2025 09:11:47 -0400 (EDT)
    kludge@panix.com (Scott Dorsey) wrote:

    Richard Kettlewell <invalid@invalid.invalid> wrote:
    Richard Kettlewell <invalid@invalid.invalid> writes:
Exactly! All the stuff about lava lamps, helium motion inside hard
disks, etc is just gimmicks. Real random number [generators] are tiny
electronic components built into CPUs, HSMs, etc.

Strictly I should probably say “entropy sources”, since there’s
generally a DRBG between the electronics and the application, as
well.

The problem with those genuine random number generators is that they
are usually comparatively slow. They take milliseconds to spit out a
number, sometimes tens or even hundreds of them. So we use the
genuine RNG to seed a PRNG in situations where we don't need complete
randomness but need pretty good randomness and need a lot of it fast.
Knuth has a discussion of this.
    --scott

I'm no expert, but can't you just amplify thermal (white) noise and
sample it? It's completely random.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Scott Dorsey@21:1/5 to toaster@dne3.net on Fri Apr 4 20:56:58 2025
    Toaster <toaster@dne3.net> wrote:

I'm no expert, but can't you just amplify thermal (white) noise and
sample it? It's completely random.

Yes, but first of all you need to make sure you are only getting thermal
noise and not anything else leaking in that might be repetitive. Secondly,
the rate at which you can generate random numbers is directly tied to the
bandwidth of the noise source. But this is in fact how hardware RNGs
often work.
    --scott

    --
    "C'est un Nagra. C'est suisse, et tres, tres precis."

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Lawrence D'Oliveiro@21:1/5 to Toaster on Sat Apr 5 02:13:13 2025
    On Fri, 4 Apr 2025 20:16:55 -0400, Toaster wrote:

I'm no expert, but can't you just amplify thermal (white) noise and
sample it? It's completely random.

    In theory, there are lots of sources in nature of “completely random” numbers.

    The problem is, how do you construct a mechanism to sample those numbers,
    and prove that there are no bugs introduced (whether accidentally or deliberately) somewhere along the way that subvert the randomness of the output?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Richard Kettlewell@21:1/5 to Toaster on Sat Apr 5 09:08:35 2025
    Toaster <toaster@dne3.net> writes:
    kludge@panix.com (Scott Dorsey) wrote:
    Richard Kettlewell <invalid@invalid.invalid> wrote:
    Richard Kettlewell <invalid@invalid.invalid> writes:
    Exactly! All the stuff about lava lamps, helium motion inside hard
    disks, etc is just gimmicks. Real random number [generators] are tiny
    electronic components built into CPUs, HSMs, etc.

    Strictly I should probably say “entropy source”, since there’s
    generally a DRBG between the electronics and the application, as
    well.

    The problem with those genuine random number generators is that they
    are usually comparatively slow. They take milliseconds to spit out a
    number, sometimes tens or even hundreds of them. So we use the
genuine RNG to seed a PRNG in situations where we don't need complete
    randomness but need pretty good randomness and need a lot of it fast.
    Knuth has a discussion of this.

I'm no expert, but can't you just amplify thermal (white) noise and
sample it? It's completely random.

The physics isn’t my department, but I think you’re on the right track.
The point is that what you get out of the hardware component needs some
additional processing before it’s usable in practice, e.g. to generate
cryptographic keys of a chosen strength. (Scott is for some reason
repeating my remark about using a DRBG.)

    --
    https://www.greenend.org.uk/rjk/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)