Forum Index : Microcontroller and PC projects : Truly random?
Page 1 of 3
|||||
Author | Message |
aargee Senior Member Joined: 21/08/2008 Location: Australia Posts: 255 |
How valid is this as a simple test of randomness? (Written for Armite.) The closer the result is to 0.5, the more even the spread of values? Although if half the values were 0.25 and the other half 0.75 you would still get 0.5.

n = 10000
Do
  k = Rnd        ' k = random value between 0 and 1
  j = j + 1
  l = l + k      ' sum of all random values so far
  m = l / j      ' average of the n random values (once the loop finishes)
Loop Until j = n
Print m

The more I think about it, the less convinced I am.

For crying out loud, all I wanted to do was flash this blasted LED. |
||||
JohnS Guru Joined: 18/11/2011 Location: United Kingdom Posts: 4038 |
Sorry but that's not going to do what you seem to want. There are vast volumes about statistical tests for randomness but I suspect they're best read if you cannot get to sleep. Maybe instead trust the uC maker to have something that's random enough. Otherwise get reading but be prepared for that sleepy feeling..... John |
||||
G8JCF Guru Joined: 15/05/2014 Location: United Kingdom Posts: 676 |
All STM32F405xx and STM32F407xx products embed an RNG that delivers 32-bit random numbers generated by an integrated analog circuit. The RNG on the STM32F4 is based on analog circuitry: it generates analog noise that is fed into a linear feedback shift register (LFSR). The analog part is built from ring oscillators whose outputs are XORed together. The LFSR also has its own dedicated clock, separate from the system clock, which keeps the RNG's behaviour independent of whatever system clock a particular STM32F4 device happens to be running. ERNIE works on a similar principle, I believe. ZZZZZZZZZZZ The only Konstant is Change |
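Purely to illustrate what the shift-register part does, here is a rough software Fibonacci LFSR in the same flavour of BASIC as the code earlier in the thread. This is a sketch only: the taps 16, 14, 13 and 11 are the commonly quoted maximal-length choice for a 16-bit register, not necessarily what ST uses, and it is not the actual STM32 circuit. On its own it is completely deterministic - it is the analog noise feeding the real hardware that supplies the randomness.

' Software model of a 16-bit Fibonacci LFSR (illustrative only)
Dim reg(16)                        ' reg(1)..reg(16) hold the register bits
For i = 1 To 16
  reg(i) = 1                       ' any non-zero starting state will do
Next i
For j = 1 To 32                    ' emit 32 output bits
  fb = (reg(16) + reg(14) + reg(13) + reg(11)) Mod 2   ' XOR of the tap bits
  Print reg(16);                   ' the output bit
  For i = 16 To 2 Step -1
    reg(i) = reg(i-1)              ' shift the register along
  Next i
  reg(1) = fb                      ' feed the XOR result back in
Next j
Print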
||||
matherp Guru Joined: 11/12/2012 Location: United Kingdom Posts: 10246 |
There are indeed but I think the summary is "you can't prove randomness, you can only prove non-randomness". ERNIE is indeed based on a noisy diode. I was part of a team way back when quoting for the replacement of the original ERNIE. We didn't get the job but I seem to remember we were intending to use radioactive decay. The noisy diode works well but is such a boring solution. |
||||
G8JCF Guru Joined: 15/05/2014 Location: United Kingdom Posts: 676 |
On the other hand we could all drink copious amounts of grog and stagger about creating the "random walk theory" - which would probably be a lot less boring than a noisy diode and even radioactive decay. The only Konstant is Change |
||||
matherp Guru Joined: 11/12/2012 Location: United Kingdom Posts: 10246 |
That is what our bid team did the night before the client presentation, which is perhaps why we didn't win the bid. |
||||
G8JCF Guru Joined: 15/05/2014 Location: United Kingdom Posts: 676 |
I really did Laugh Out Loud when I read your post, Peter! The only Konstant is Change |
||||
aargee Senior Member Joined: 21/08/2008 Location: Australia Posts: 255 |
ERNIE? Ah, government sponsored gambling. Yes, all my program will do is indicate a sway away from the expected average, and hence a departure from random number generation, but that is far from an indication of true randomness. For crying out loud, all I wanted to do was flash this blasted LED. |
||||
G8JCF Guru Joined: 15/05/2014 Location: United Kingdom Posts: 676 |
One of the interesting things about Premium Bonds (which is what ERNIE picks at random) is that the overall rate of payout of "prizes" is in line with the interest rate the UK Govt pays to borrow at. Of course not everybody holding a Premium Bond gets that interest rate: most get nothing and a few get way over that notional rate. I say interesting because finding ways of borrowing money at the cheapest cost is the goal of just about every borrower on the planet, and this only looks like gambling - with gambling the house always wins at the expense of the gamblers, whereas here the "gamblers" as a collective always win the notional interest paid, and they all get to keep the capital (the equivalent of the stake in real gambling) they put in.

We've moved a long way from noisy diodes!

Peter

The only Konstant is Change |
||||
Grogster Admin Group Joined: 31/12/2012 Location: New Zealand Posts: 9593 |
Ahhhhhhh, sweet memories..... Smoke makes things work. When the smoke gets out, it stops! |
||||
JohnS Guru Joined: 18/11/2011 Location: United Kingdom Posts: 4038 |
No, it doesn't do that. You don't seem to understand random numbers very well. Try this John |
||||
aargee Senior Member Joined: 21/08/2008 Location: Australia Posts: 255 |
Read the Wikipedia article. So if my program repeatedly produced a value in the 0.7 to 0.8 range, wouldn't that indicate a non-random number generator? That would be a pattern of regularity; 0.5 is the only mean value that wouldn't be. Essentially, checking that the values fall equally above and below 0.5 is a monobit test. Interesting subject though. For crying out loud, all I wanted to do was flash this blasted LED. |
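For what it's worth, a monobit-style check needs only a small change to the loop in the first post - count how many values land above 0.5 rather than averaging them. A rough sketch in the same BASIC (this is just the idea of the test, not the formal NIST version):

n = 10000
above = 0
For i = 1 To n
  If Rnd >= 0.5 Then above = above + 1
Next i
Print "Above 0.5: "; above; "  expected about "; n/2
Print "Difference: "; Abs(above - n/2)
' Rule of thumb: for a fair source the difference should usually be within
' roughly Sqr(n) - about 100 here.  Much more than that is suspicious.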
||||
JohnS Guru Joined: 18/11/2011 Location: United Kingdom Posts: 4038 |
Can't quite grasp your meaning, but testing for randomness is way more complex than a simple loop (especially one over not many values). If you think about it, to test a source with an expected mean of 0.5 it's not enough to check that the average comes out at 0.5, since many non-random sequences would also produce 0.5. Indeed, a generator that always returned 0.5 would pass such a test! John |
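To make that concrete, here is a sketch (same style of BASIC, and only a rough figure of merit, not a formal test) of a simple frequency test that the always-0.5 generator would fail even though it passes the average check:

n = 10000
Dim bucket(9)                    ' ten buckets covering 0.0-0.1, 0.1-0.2, ... 0.9-1.0
For i = 1 To n
  k = Rnd                        ' the generator under test
  b = Int(k * 10)
  If b > 9 Then b = 9            ' guard in case k ever equals 1.0
  bucket(b) = bucket(b) + 1
Next i
chi = 0                          ' chi-squared style statistic: each bucket should hold about n/10
For b = 0 To 9
  chi = chi + (bucket(b) - n/10) * (bucket(b) - n/10) / (n/10)
Next b
Print "Chi-squared statistic: "; chi
' Very roughly: a good generator should usually come in below about 22
' (the 1% point for 9 degrees of freedom).  A generator that always
' returns 0.5 scores about 90000 here, yet its average is exactly 0.5.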
||||
Grogster Admin Group Joined: 31/12/2012 Location: New Zealand Posts: 9593 |
aargee - The issue is that a computer system, being a logical machine, cannot in essence be random, or be capable of creating a truly random number. A computer has to perform mathematics in order to arrive at the number - ANY number - so no matter what you do there will always be some form of predictability and mathematical certainty to the "random" numbers, because the computer has to compute the number it hands you as your "random" value. Computing a number is not truly random!

Systems like the one mentioned in the link are quite crafty though, and using analog noise as a reference to generate a 32-bit random number is about as random as you can get these days, but even that has certain predictable patterns if you know how the "random" number is being created.

Smoke makes things work. When the smoke gets out, it stops! |
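As a small illustration of "the computer has to compute the number", here is a sketch of a tiny linear congruential generator in the same BASIC. The constants 75 and 65537 are borrowed from the old Sinclair ZX81 generator purely as an example - this is not claimed to be how MMBasic's own RND works. Run it twice with the same seed and you get exactly the same "random" numbers, which is the whole point:

seed = 12345                     ' same seed in, same sequence out - every time
For i = 1 To 5
  seed = (seed * 75) Mod 65537   ' the entire "randomness" is this one formula
  Print seed / 65537             ' scaled to look like a value between 0 and 1
Next i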
||||
JohnS Guru Joined: 18/11/2011 Location: United Kingdom Posts: 4038 |
Various kernels capture any randomness they can - typically from external events - such as when a network packet arrives (or what it contains), or when and what a user types at a console. But for a stream of random numbers there just aren't enough of these external events, so some pseudo-random number generator is used. A hardware source looks attractive, if it works. John |
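Along the same lines, you can scrape a little entropy out of event timing yourself. A sketch, assuming MMBasic-style TIMER (milliseconds) and INKEY$ are available - this shows the idea only and is nowhere near good enough for anything cryptographic:

Print "Press 16 keys at irregular intervals..."
seed = 0
For i = 1 To 16
  Do
    a$ = Inkey$                    ' wait for the next key press
  Loop Until a$ <> ""
  seed = seed * 2 + (Timer Mod 2)  ' keep only the lowest bit of the ms timer
Next i
Print "Harvested 16 bits of seed: "; seed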
||||
robert.rozee Guru Joined: 31/12/2012 Location: New Zealand Posts: 2437 |
if you are wanting to test the output from an RNG that outputs a bit stream, one rough-and-ready method is to capture a (very) long data stream into a file and then feed it through a good file compressor. if it compresses, your RNG is not performing so well. if it does not compress, then it is worthy of doing further testing. i used this approach years ago when developing a hardware PRNG that was initialized with a couple of keys for cryptography. a truly random bit stream can not be compressed. rob :-) |
||||
redrok Senior Member Joined: 15/09/2014 Location: United States Posts: 209 |
This is strictly not true. A truly random bit stream MUST contain PATTERNS, or sequences of bits that do repeat. People often assume the bit stream will have a smooth, even distribution of numbers, but that is by definition not random. A compression algorithm will always do some compression - actually it's required - on a random file, because there are always bits of pattern that can be compressed. A truly random bit stream is always "noisy" and "lumpy". Just listen to the noise between radio stations and you can hear it. Or watch an old analog TV between channels and you can see the "roughness" of the screen display.

redrok |
||||
twofingers Guru Joined: 02/06/2014 Location: Germany Posts: 1576 |
Redrok is right, I think! A truly random bit stream can contain an uninterrupted series of millions of zero (or one) values. Not very likely, but possible.

Michael

causality ≠ correlation ≠ coincidence |
||||
robert.rozee Guru Joined: 31/12/2012 Location: New Zealand Posts: 2437 |
um, now we are talking about LOSSLESS compression here, where the original content can be recovered through decompression...

if a (lossless) compression algorithm were to always be able to compress any data stream, then we could just keep on feeding the output file back through the compressor again and again until we had compressed any initial file down to 1 bit. worse still, that 1-bit file, because we are using a lossless compressor, could then be decompressed (with multiple passes through the decompression algorithm) back into the original file. but there are many more possible input files that we could start off with than can be uniquely represented by a file of length 1 bit.

the above presents a paradox. one must either accept the paradox or reject the initial assumption.

now a lossless compression algorithm can be viewed in a number of different ways, but each is exactly equivalent to the others:

1. the algorithm detects regularities in the data stream and encodes them into a shorter stream. but after doing this no regularities will remain. or,

2. the algorithm attempts to predict what the next bit (or byte, or block) will be. when it gets this prediction right, compression occurs. but when it gets the prediction wrong, the output stream gets longer. the goal is to predict correctly more often than not. or,

3. the algorithm removes non-informational content. in essence, the algorithm 'condenses' the informational content of an input file by removing redundant syntactic structure.

this tells us that a file which can not be further compressed contains BOTH: (a) 100% informational content with no extraneous redundancy, and (b) is completely indistinguishable from a random data stream.

for most folks, 2. is the simplest explanation to understand - it is the one that most computer science courses teach. from the perspective of the philosopher, 3. is far more interesting, as (a) and (b) present a wonderfully contrasting dichotomy.

cheers,
rob :-) |
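For anyone who wants the paradox pinned down, the standard counting argument (not from rob's post, just the usual textbook step) goes like this: there are 2^n distinct files of exactly n bits, but only 2^0 + 2^1 + ... + 2^(n-1) = 2^n - 1 files shorter than n bits. So no lossless compressor can map every n-bit file to a shorter one; at least one input must stay the same size or grow, and in fact the vast majority of random n-bit files cannot be shrunk at all.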
||||
JohnS Guru Joined: 18/11/2011 Location: United Kingdom Posts: 4038 |
I don't think he said, or meant to say, that ANY stream can ALWAYS be made smaller (yes, we're all meaning lossless). But a long random stream may well compress a little. It won't compress well (in percentage terms). Different compression methods work better on different data; sometimes one is better, sometimes another. That will apply to "truly" random data as well. By way of example, an immensely long random string will eventually have a million zero bytes next to each other, so a crude compression would just reduce those and hey, shorter by roughly a million bytes. Is there any actual point here.... I go back to randomness being worth reading about if anyone cares to understand it. It's trickier than one tends to guess. Testing for it is much trickier still. John |
||||