of which has been deduced from the observation that unused programs or features will often stop working after sufficient time has passed, even if `nothing has changed'. The theory explains that bits decay as if they were radioactive. As time passes, the contents of a file or the code in a program will become increasingly garbled.

There actually are physical processes that produce such effects (alpha particles generated by trace radionuclides in ceramic chip packages, for example, can change the contents of a computer memory unpredictably, and various kinds of subtle media failures can corrupt files in mass storage), but they are quite rare (and computers are built with error-detecting circuitry to compensate for them). The notion long favored among hackers that cosmic rays are among the causes of such events turns out to be a myth; see the cosmic rays entry for details.
The term software rot is almost synonymous. Software rot is the effect, bit rot the notional cause.
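As a minimal illustration of the error-detecting idea mentioned above, here is a hypothetical C sketch (the names and the scenario are invented for illustration; real memory hardware uses stronger codes such as SECDED Hamming, not a lone parity bit): store a parity bit alongside a byte, recompute it on read, and flag a mismatch when a single bit has `rotted'.

    /* A minimal sketch of single-error detection via a parity bit.
       The principle matches the circuitry described above: keep a
       redundant check bit, recompute it on read, flag a mismatch. */
    #include <stdio.h>
    #include <stdint.h>

    /* Even-parity check: returns 1 if the byte has an odd number
       of set bits, 0 otherwise. */
    static int parity(uint8_t byte) {
        int p = 0;
        while (byte) {
            p ^= byte & 1;
            byte >>= 1;
        }
        return p;
    }

    int main(void) {
        uint8_t stored = 0x5A;        /* data written to "memory" */
        int check = parity(stored);   /* parity bit stored alongside */

        stored ^= (1 << 3);           /* an alpha particle flips bit 3 */

        if (parity(stored) != check)
            puts("parity error: bit rot detected");
        else
            puts("data reads back clean");
        return 0;
    }

A single parity bit can only detect an odd number of flipped bits, which is why real error-detecting memory pairs detection with correction codes; the sketch shows the detection half of that bargain.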