Hardware random for the masses
I have made the result of the ring oscillator random generator available as a GPL project called Whirlygig. It's a 2.75cm x 4cm PCB with a mini USB connector, and it provides a sustained 5.5Mbps (~620KBytes/sec) of apparently very high quality random bits using the Linux hw_random API. The large amount of randomness should make it useful for statistical tests as well as hard crypto.
I prototyped it using a couple of boards I had lying around, so I know it works fine, but I am waiting for the PCBs to come back from fabrication to actually build a final one. I placed the CPLD VHDL, the board hardware design, the driver software and the firmware for the USB controller into http://git.warmcat.com
I spent some time worrying about how to test the quality of the result -- I found that "diehard", mentioned in an earlier post, has been superseded by "dieharder". This has a much tougher general testing regime, even though many of its tests are reproductions of the diehard ones -- it runs each test many times, forms a histogram of the p-values from the many runs, and gives an assessment of fail, poor, possibly weak or pass based on the spread of results rather than a single result.
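The reason histogramming p-values works as a tougher test is that, for a truly random input, the p-values themselves should be uniformly distributed on [0,1]. Here is a small sketch (my own illustration, not dieharder's code) using a Kolmogorov-Smirnov statistic against the uniform CDF to judge the whole spread rather than any single run:

```python
def ks_uniform(pvalues):
    """Max distance between the empirical CDF of pvalues and Uniform(0,1).

    Small values mean the spread of p-values is consistent with uniformity
    (i.e. the generator under test looks random across many runs).
    """
    xs = sorted(pvalues)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        # uniform CDF at x is just x; compare against the empirical
        # CDF immediately before and after the i-th sorted sample
        d = max(d, abs((i + 1) / n - x), abs(x - i / n))
    return d

# a suspiciously clustered set of "p-values" scores near 1 (bad),
# an evenly spread set scores low (good)
print(ks_uniform([0.99] * 10))
print(ks_uniform([0.25, 0.5, 0.75]))
```

A generator can easily pass one run by luck; it is much harder for a biased generator to produce a whole uniform-looking histogram of p-values.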
At first the RNG failed three of the 18 tests, but on looking closer one of the tests (#2) currently fails for all RNG input and is marked as unsuitable for assessing RNG quality, and the other two by default required more than the 400MBytes of randomness I had prepared. Unfortunately in that case dieharder simply rewinds the randomness file and re-uses the same data to make up the balance! Of course this is no longer quite "random". When I adjusted those two tests to use a smaller sample that fitted into the 400MBytes without repetition, the output of the RNG got a "pass" on all 17 of the relevant dieharder suite tests.
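For anyone sizing their own sample file, the back-of-envelope arithmetic looks roughly like this (a sketch based on my assumption that dieharder consumes file input as 32-bit words; "tsamples" and "psamples" are dieharder's own terms for draws-per-run and number-of-runs):

```python
def bytes_needed(tsamples, psamples, word_bytes=4):
    """Approximate bytes a dieharder test consumes without rewinding,
    assuming each draw is one 32-bit word (my assumption, not gospel)."""
    return tsamples * psamples * word_bytes

file_bytes = 400 * 1024 * 1024  # the 400MBytes sample mentioned above

# e.g. a test run 100 times with 1,000,000 draws per run:
need = bytes_needed(1_000_000, 100)
print(need <= file_bytes)  # fits -> every byte is used at most once
```

If the check fails, reduce tsamples or psamples until it passes rather than letting dieharder silently recycle the file.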
During the validation phase I changed the RNG algorithm in the CPLD significantly. The scheme is described on the project page, but basically I moved away from a bit-centric to a byte-centric design with 8 identical sets of 3 oscillators. To stop any characteristic of a particular oscillator's routing from being associated with a particular bit of the result byte and creating a bias, I introduced a "mixer" that first generates 8 random bits by combining six oscillator outputs each with XOR, then rotates these oscillator sets between the result bits sequentially at 24MHz. I also removed the toggling action and used the random bit directly.
I also found the Linux rng-tools suite, which repeatedly runs the FIPS-140-2 tests on the bits. This fails 1 in 1200 or so blocks over 20 billion bits of testing; I believe this is normal, since a real random generator will occasionally produce low-probability sequences that don't look very random in the short term.
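A rough sanity check on that failure rate (my arithmetic, not rng-tools code): even a perfect random source fails each FIPS-140-2 block test with small but nonzero probability. For example, the monobit test passes a 20000-bit block only if the count of ones lies strictly between 9725 and 10275 (the bounds from the FIPS 140-2 spec), and a normal approximation gives the chance a fair source lands outside that window:

```python
import math

N = 20000                             # bits per FIPS-140-2 block
sigma = math.sqrt(N * 0.25)           # stddev of #ones for fair bits
z = (10275 - N / 2) / sigma           # bound distance in standard deviations
p_fail = math.erfc(z / math.sqrt(2))  # two-tailed normal tail probability
print(p_fail)  # on the order of 1e-4 for monobit alone
```

The other tests in the battery (poker, runs, long-run) each contribute their own false-positive probability, so an occasional failed block in a long run is exactly what a genuine random source should produce.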
Aside from passing dieharder and FIPS-140-2, the changes also got me a reported 8.000000 bits of entropy per byte from the ENT test, so there are reasons to imagine the quality of the output is very good.
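For reference, ENT's "bits of entropy per byte" figure is the Shannon entropy of the byte-value histogram; this sketch computes the same quantity (my reimplementation of the formula, not ENT's source):

```python
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """Shannon entropy H = -sum(p_b * log2(p_b)) over byte values b,
    where p_b is the observed frequency of byte value b."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# a perfectly flat histogram of all 256 byte values scores exactly 8
flat = bytes(range(256)) * 100
print(entropy_bits_per_byte(flat))  # 8.0
```

A reported 8.000000 therefore means the byte-value histogram of the sample is flat to within ENT's printed precision -- a necessary (though not sufficient) property of good random output.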