It is possible to generate truly random numbers by digitizing "white noise"; however, this has not generally been common practice. Tables of truly random bit patterns are available, but they are large and require a random starting point if every user is not to see the same "random" digits. Most computer applications instead use pseudo-random numbers generated by any of a variety of algorithms. As the term pseudo-random implies, the numbers generated are not really random, but they are good enough for most purposes.
A truly random number sequence is unpredictable, does not start over and repeat itself, meets statistical tests of randomness, and does not favor certain subsequences over others any more often than chance predicts. It is surprisingly hard to generate pseudo-random numbers that meet stringent tests of randomness; however, a number of algorithms produce numbers that are good enough for most purposes. The generation of pseudo-random numbers for computer applications has been studied extensively and has no single definitive answer. A web search for "Random Number Generation" will turn up thousands of papers (I got over half a million hits using the Google search engine). The discussion in Donald Knuth's "The Art of Computer Programming", Volume 2, is probably a good starting point for anyone who is curious about the algorithms and their nuances. Some feeling for the problems involved can be gained from Knuth's account of setting out to write a superior random number generator and ending up with an elaborate algorithm that generated the sequence 6065038420 over and over.
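As a concrete illustration of the kind of algorithm discussed above, here is a minimal sketch of a linear congruential generator, one of the classic pseudo-random algorithms Knuth analyzes. The constants are the widely published "Numerical Recipes" parameters; they are my choice for illustration, not anything specific to this article:

```python
# Minimal linear congruential generator (LCG) sketch.
# x[n+1] = (A * x[n] + C) mod M, with widely published constants.
A = 1664525
C = 1013904223
M = 2 ** 32

def lcg(seed, count):
    """Yield `count` pseudo-random integers in the range [0, M)."""
    x = seed % M
    for _ in range(count):
        x = (A * x + C) % M
        yield x

# Pseudo-random means deterministic: the same seed always
# reproduces the same "random" sequence.
assert list(lcg(42, 5)) == list(lcg(42, 5))
```

Note that this determinism is exactly why such numbers are "pseudo": anyone who knows the seed and the constants can predict the entire sequence.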
Specialized algorithms are known that meet constraints such as a random-looking distribution of values while guaranteeing that every value in the number set is generated once before any value repeats. An algorithm like that might be needed for something like testing of memory performance, where every address must be exercised.
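One way to get such a "visit everything once" generator is an LCG whose parameters satisfy the Hull-Dobell full-period conditions. The tiny modulus below (M = 16) is my own illustrative choice; a memory test would use the size of the address space instead:

```python
# Full-cycle generator: visits every value in [0, M) exactly once
# before the sequence repeats.
M = 16        # size of the value set (illustrative; a memory test
              # would use the address-space size here)
A, C = 5, 3   # Hull-Dobell conditions: gcd(C, M) = 1; A - 1 is
              # divisible by every prime factor of M (here, 2);
              # and A - 1 is divisible by 4 because M is.

def full_cycle(seed=0):
    """Yield each value in [0, M) exactly once, in scrambled order."""
    x = seed % M
    for _ in range(M):
        x = (A * x + C) % M
        yield x

values = list(full_cycle())
assert sorted(values) == list(range(M))  # every value, exactly once
```

The ordinary LCG already repeats only after a full period when its parameters meet these conditions; the point of choosing them deliberately is the guarantee, not the appearance of randomness.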
Copyright 1994-2002 by Donald Kenney.