Entropy in /dev/random vs used RAM

This time I didn’t really know what to let my computer do, so I opted to have it calculate the correlation between used RAM and the entropy pool level of /dev/random. At first I expected no correlation whatsoever, since the two variables seemed almost totally unrelated. It turns out I was wrong, though the result makes sense in hindsight.

To do this, I recorded both values every 5 seconds over a few hours during which I was using the computer, plus some minutes during which I wasn’t.
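My actual bash script is linked below; as a rough sketch of what such a sampling loop looks like on Linux, one can read the kernel’s entropy estimate from /proc/sys/kernel/random/entropy_avail and used RAM from /proc/meminfo (the file name samples.csv and the function name are my choices here, not taken from the original script):

```shell
#!/bin/bash
# One sample line: epoch seconds, entropy pool estimate, used RAM in kB.
sample() {
    entropy=$(cat /proc/sys/kernel/random/entropy_avail)
    # used RAM approximated as MemTotal - MemAvailable (kB)
    used_kb=$(awk '/^MemTotal/ {t=$2} /^MemAvailable/ {a=$2} END {print t-a}' /proc/meminfo)
    echo "$(date +%s),$entropy,$used_kb"
}

# Main loop (run manually; appends one sample every 5 seconds):
#   echo "time,entropy,used_kb" > samples.csv
#   while true; do sample >> samples.csv; sleep 5; done
```

Note that MemAvailable requires a reasonably recent kernel; on older systems one would have to combine MemFree, Buffers, and Cached instead.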

The bash and R scripts that I used can be found in my GitHub repository.

The results can be found there as well.

Here’s a plot of both the entropy and the used RAM as a function of time.

[Plot: entropy and used RAM vs. time]

As one could guess from the plot, the correlation is negative. Over the whole data set (which extends beyond the range shown in the graph), the correlation comes out to about -0.20.
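I did the analysis in R, but the same Pearson correlation can be reproduced in one pass with awk, given a CSV of samples with a header row and entropy and used RAM in columns 2 and 3 (this layout is my assumption, not the format of my original logs):

```shell
#!/bin/bash
# Pearson correlation between columns 2 and 3 of a CSV with a header row.
pearson() {
    awk -F, 'NR > 1 {
        n++; sx += $2; sy += $3
        sxx += $2*$2; syy += $3*$3; sxy += $2*$3
    } END {
        # population covariance and standard deviations
        cov = sxy/n - (sx/n)*(sy/n)
        sdx = sqrt(sxx/n - (sx/n)^2)
        sdy = sqrt(syy/n - (sy/n)^2)
        printf "%.2f\n", cov / (sdx*sdy)
    }' "$1"
}
```

For example, a file whose two columns rise in lockstep should print 1.00, and one where they move in opposite directions should print -1.00.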

One explanation for this (to me surprising) result is that higher RAM usage implies more programs running, and running programs seem to read from /dev/random, which drains the entropy pool (though I wonder why they don’t use /dev/urandom instead, since the latter is fine for well over 99% of purposes, including random password generation).

One day later I decided to rerun the experiment, this time well after having rebooted my machine, so that I wouldn’t start out with low used RAM (and a high entropy pool level). To my surprise, the results were quite different: the correlation is only about -0.02. A graph of the results can be found below:

[Plot: entropy vs. used RAM, second run]

It looks like a long period of inactivity left both my used RAM and my entropy pool level oscillating around fixed values, and as soon as activity picked up on my computer, both became more chaotic.