I’ve made a Python script that (for now at least) plots the number of times each package on my system was installed + upgraded. That is, if the y-axis reads “2”, it means the package was installed and upgraded once. If the y-axis reads “1” it means the package was installed once and never upgraded.
As my system is rather new (about two months old), most packages have never been upgraded. The most upgraded package was Linux (10 times), followed by youtube-dl and python-setuptools. I decided to show only the names of these 3 packages, since they were the most upgraded and the x-axis would contain 531 package names if I showed them all.
I plan to post the code on GitHub soon so you can use and modify it as you wish.
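In the meantime, here's a minimal sketch of the counting step, assuming an Arch-style /var/log/pacman.log whose lines contain "installed <pkg>" or "upgraded <pkg>" (this is an illustration, not my actual script):

```python
import re
from collections import Counter

# Matches Arch-style pacman.log entries such as:
#   [2016-01-07 10:08] [ALPM] installed linux (4.3.3-2)
#   [2016-01-08 09:00] [ALPM] upgraded youtube-dl (2016.01.01-1 -> 2016.01.09-1)
EVENT_RE = re.compile(r"\] (installed|upgraded) (\S+) ")

def count_events(log_lines):
    """Count installs + upgrades per package."""
    counts = Counter()
    for line in log_lines:
        m = EVENT_RE.search(line)
        if m:
            counts[m.group(2)] += 1
    return counts

# Usage on an Arch system:
#   with open("/var/log/pacman.log") as f:
#       counts = count_events(f)
#   print(counts.most_common(3))
```

From there, plotting the counts (e.g. with matplotlib) is straightforward.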
The idea of a God as an entity that can think is the following.
If a God exists, then it must follow the laws of physics, i.e. it cannot break the rules that, as far as we know, have held until today. This severely limits an almighty God who can do whatever he wants at will. It also implies that if a thinking entity exists throughout most of the universe, then we could consider the universe a sort of brain, where the equivalent of neurons would be galaxies. Unfortunately, I believe such a large brain makes no sense whatsoever, despite my having almost no knowledge of neuroscience and of how consciousness emerges from a stack of neurons. I have read that we're not even sure neurons are necessary or sufficient for consciousness to appear.
Therefore, if God exists, it has very limited powers and cannot be much more than an earthly creature, just like us.
This reasoning has at least the following flaws:
- The laws of physics as we know them could be broken while we're not watching, so that we would have no evidence that they were broken.
- The universe mostly contains dark matter, something we don't really understand well. I don't think galaxies are its main constituent.
Nevertheless, I still believe that an intelligent God cannot exist.
Here is a high-school-level study that could be performed on pigeons to answer the several questions listed below.
Assume that a student at a distance r from a group of N pigeons claps his hands or stamps one of his shoes on the ground. The pigeons can either fly away (we'll consider that they fly away when they move at least, say, 10 meters from where they were before the clapping sound) or stay in the same place.
- On average (i.e. you have to perform the experiment multiple times, with a different set of pigeons each time), how many dB are required for the pigeon(s) to fly away?
- Is this number dependent on the number N of pigeons in the group?
- If so, what form does this dependence take? Elaborate a theory to explain this result.
- Is there any difference if there's a bright flash simultaneous with the sound?
- Is the answer to the first question also dependent on the distance r between the student and the pigeons? If so, what form does this dependence take?
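If one were analyzing the resulting data, the first two questions could be tackled with something like the following sketch, where each (entirely hypothetical) trial records the group size N and the minimum sound level, in dB, that made the pigeons fly away:

```python
from collections import defaultdict

def mean_threshold_by_group_size(trials):
    """trials: iterable of (N, threshold_dB) pairs.
    Returns {N: mean threshold over all trials with that group size}."""
    sums = defaultdict(lambda: [0.0, 0])
    for n, db in trials:
        sums[n][0] += db
        sums[n][1] += 1
    return {n: total / count for n, (total, count) in sums.items()}

# Hypothetical data: (group size, threshold in dB), one pair per trial.
trials = [(1, 70), (1, 74), (5, 66), (5, 62), (20, 58)]
print(mean_threshold_by_group_size(trials))
```

Plotting the mean threshold against N (and against r) would then reveal whether, and how, the threshold depends on them.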
This time I didn’t really know what to let my computer do. So I opted to let it calculate the correlation between used RAM and entropy pool level in /dev/random. At first I thought that there would be no correlation whatsoever since I thought that the two variables were almost totally unrelated. It turns out I was wrong but that makes sense now.
In order to fulfil this task, I decided to measure both values every 5 seconds, over a few hours during which I used the computer and some minutes during which I didn't.
The bash and R scripts that I used, along with the results, can be found in my GitHub repository.
Here’s a plot of both the entropy and used RAM in function of time.
As one could guess from the plot, the correlation is negative. For the whole data set (which extends beyond the range of the graph), the correlation turns out to be about -0.20.
One explanation for this (to me surprising) result is that more used RAM implies more programs running, and programs seem to read from /dev/random (though I wonder why they don't use /dev/urandom instead, since the latter is fine for more than 99% of purposes, including random password generators).
One day later I decided to rerun the experiment, this time well after having rebooted my machine, so that I didn't start with low used RAM (and a high entropy pool level). To my surprise the results were quite different: the correlation is only about -0.02. A graph of the results can be found below:
It looks like a long period of inactivity left both my used RAM and entropy pool level oscillating around certain values, and as soon as activity on my computer went up, both values became more chaotic.
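The actual bash and R scripts are in the repository; as a rough Python sketch of the same idea, one can sample /proc/sys/kernel/random/entropy_avail and /proc/meminfo (the standard Linux interfaces for these values) and compute the Pearson correlation afterwards. This is an illustrative rewrite, not my original scripts:

```python
import time
from math import sqrt

def read_entropy():
    # Current size of the kernel entropy pool, in bits.
    with open("/proc/sys/kernel/random/entropy_avail") as f:
        return int(f.read())

def read_used_ram_kb():
    # Used RAM approximated as MemTotal - MemFree, in kB.
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0])
    return fields["MemTotal"] - fields["MemFree"]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

def sample(n_samples=720, interval=5):
    """Sample entropy and used RAM every `interval` seconds."""
    entropy, ram = [], []
    for _ in range(n_samples):
        entropy.append(read_entropy())
        ram.append(read_used_ram_kb())
        time.sleep(interval)
    return entropy, ram

# Usage (one hour of data at 5-second intervals):
#   entropy, ram = sample(720, 5)
#   print("correlation:", pearson(ram, entropy))
```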
With a simple bash script, I've monitored the entropy available in /dev/random, i.e. the level of the Linux kernel entropy pool. Note, however, that the way I've done it lowers the entropy pool level by a few bits at every check. I've therefore limited the checks to 1 measurement every 2 seconds, for 6000 seconds (1 h 40 min).
Here’s a quick summary of the data obtained:
Min. : 728.0
1st Qu.: 929.0
Median : 986.0
Mean : 982.3
Here’s a histogram and a plot of the entropy pool level in function of time:
Below is Python code for a very simple bot that logs in to FICS, asks for a username and password that must be entered manually, and then chats in channel 53, a channel for general discussion where guests can participate.
The code assumes that you have a file called "random-words.txt" in the /path-to/ directory. You can edit its contents according to your needs. The bot will randomly pick lines from the text file and send them to channel 53, at a rate of 1 line every 30 seconds by default (this is adjustable).
Have fun editing and using this bot!
import random
import telnetlib
import time

HOST = "freechess.org"
SEND_INTERVAL = 30  # seconds between messages

user = input("Enter your user name account: ")
password = input("Enter your password: ")

tn = telnetlib.Telnet(HOST)
tn.read_until(b"login: ")
tn.write(user.encode('utf-8') + b"\n")
tn.read_until(b"password: ")
tn.write(password.encode('utf-8') + b"\n")

tn.write(b"set tell 1\n")
tn.write(b"set pin 1\n")
tn.write(b"tell 53 Hello, I am a robot!\n")

word_file = "/path-to/random-words.txt"
WORDS = open(word_file).read().splitlines()

# Send a random line to channel 53 every SEND_INTERVAL seconds.
while True:
    word = random.choice(WORDS)
    tn.write(b"t 53 " + word.encode('utf-8') + b"\n")
    time.sleep(SEND_INTERVAL)
I've completed the curriculum of a "licenciatura en física", which corresponds roughly to a master's degree in physics (5 years of study at university). While I've had to study what I'd call unusual topics for an undergraduate physics degree, such as the firing of neurons, there are several "holes": topics I would have loved to learn.
Thus far my list of such topics includes:
- Scattering, in both CM and QM. Indeed, I haven't dealt at all with scattering problems and theory in either classical or quantum mechanics. I feel this is a huge gap in my knowledge that needs to be filled someday.
- Solving problems numerically. I've had one numerical analysis course that introduced some methods for solving ODEs, numerical integration, etc., but unfortunately it wasn't geared toward solving physics problems. It was done in Fortran 90, which is probably not the best choice of first programming language for a physicist nowadays. In the end, I feel they should have taught us Python and helped us solve physical problems, like the ODEs and PDEs we see hundreds of times during our degree.
- General relativity. We've seen special relativity using tensors and the Minkowski metric, but I feel an introduction to general relativity would have been quite interesting. Black holes included.
- Feynman path integrals. I've seen them mentioned a lot on the internet, in forums and on Physics Stack Exchange, yet I know nothing about them.
- Feynman diagrams. Same as for Feynman path integrals.
- Decoherence in quantum mechanics. I would have loved to learn more about decoherence and the collapse of the wave function, which was never mentioned during my studies.
- Solid-state physics. The course I took wasn't formal enough for my taste, and the exercises weren't hard or numerous enough, so in the end I feel the introduction to this topic was too light.
- Group theory.
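As an illustration of the numerical problem-solving mentioned above, here's a minimal pure-Python sketch of a fourth-order Runge-Kutta integrator applied to the harmonic oscillator x'' = -x, the kind of exercise I'd have liked to see in that course:

```python
import math

def rk4_step(f, t, y, h):
    """One RK4 step for y' = f(t, y), with y a tuple of state variables."""
    k1 = f(t, y)
    k2 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k1)))
    k3 = f(t + h / 2, tuple(yi + h / 2 * ki for yi, ki in zip(y, k2)))
    k4 = f(t + h, tuple(yi + h * ki for yi, ki in zip(y, k3)))
    return tuple(yi + h / 6 * (a + 2 * b + 2 * c + d)
                 for yi, a, b, c, d in zip(y, k1, k2, k3, k4))

def oscillator(t, y):
    # x'' = -x written as a first-order system: (x, v)' = (v, -x)
    x, v = y
    return (v, -x)

def integrate(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 in n RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y

# After one full period the oscillator should return to (x, v) = (1, 0).
x, v = integrate(oscillator, (1.0, 0.0), 0.0, 2 * math.pi, 1000)
print(x, v)
```

The same scheme generalizes to any first-order system, which covers most of the ODEs one meets in a physics degree.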