It is always good to know when other people will want to cut off your head. Let us look at when the most heads were cut off: that was probably during the French Revolution. OK, what made people want to cut off other people’s heads? Extreme inequality. OK, what was the Gini coefficient then? Well, one estimate is 59 [1]. OK, so if we look at the US Gini coefficient, it seems to have been rising approximately linearly since 1980. In particular, the US Census Bureau (via Wikipedia [2]) has the following data:

1980: 40.3
1990: 42.8
2000: 46.2
2009: 46.8

Let us do linear regression to obtain:

2020: 50.0
2030: 52.4
2040: 54.8
2050: 57.1
2058: 59.0
2060: 59.5
2070: 61.9
2080: 64.3

So, I guess people will start trying to cut other people’s heads off around 2058. We’re all safe till then.

So, it was -15 Celsius (barely positive Fahrenheit) and snowing this morning; actually it’s still -15 and still snowing. Surely only a complete moron would bike to work today. Actually, the ride was pretty good. Yesterday it was -18, and I think I can tell the difference (though it was not snowing). The downside of biking in this weather is that the gears are refusing to shift. The levers just sort of stick and nothing happens. Fortunately, I’m in a reasonable gear right now.

Another downside is that I have these cool hybrid tires on the mountain bike. They are sort of like road tires with spikes only on the sides, which is reasonable in terrain and much smoother than a regular mountain bike tire on the road. They were really good at the UCSD campus (San Diego), where I’d go off road often and it never snowed. Which is the point: they kind of suck on snow.

With the Wikipedia blackout, what will journalists do? There will be a lot less in Wednesday’s edition, though on the plus side it will have a better chance of being correct, or at least original. Also, all school essays that were due Wednesday will be turned in a day late.

I finally posted a new version of the real analysis book with the new metric space chapter; it weighs in at 192 pages now (with 12pt font though). It also fixes a now-record number of errata, though my standard for what counts as an erratum rather than just a simple obvious typo has dropped slightly. One thing that makes me feel better about errata is that it seems Rudin’s 3rd edition of Principles still has some errata that are essentially the same as ones I made (independently; not that I really worry about taking credit for errata:)

The differential equations book nowadays seems to be hitting fewer and fewer problems: 6 errata in the past year, 2 of which were in new material added before the summer. This is despite several people using the book and one ongoing partial translation. So I guess it is rather “correct” by now.

My goal right now is to get them to be correct rather than perfect. So I haven’t really been reordering things or rewriting things that could be improved. I’ve been at most doing small improvements in exposition. The main thing is that I want to spend time doing other things too of course:)

Pissed off about the CPU overheating, I wrote a simple daemon. It monitors the core temperatures and sets the cpufreq governor to powersave when the temperature in either core is above 90 degrees Celsius, and sets it back to ondemand when it gets below 75 in both cores (I pulled those numbers out of my ass; they might need tuning). It simply polls the temperature every 2 seconds. There is no configuration or anything; to change the behavior, simply edit the code and recompile. That’s all the neurons I’m willing to commit to this problem.

Yes, I know performance might suffer, since powersave throttles further than strictly necessary, but I don’t care about performance; I care about the machine not turning off randomly. I guess ondemand is actually better power- (and heat-) wise when everything is normal, but when the heat is high, powersave does come to the rescue.

Here is the code for those that want to do something similar. You might need to modify it heavily. I called it noverheatd.c; I simply compile the thing with something like gcc -O2 -Wall noverheatd.c -o noverheatd, place the resulting binary in /root, and then add /root/noverheatd & to /etc/rc.local. The parts that need modification are set_policy, where you need to set the number of cpus your kernel thinks it has, and then in the main loop you need to set the right coretemp paths for all the coretemps you have. I had to run “sensors-detect” as root from the lm_sensors package to obtain those guys.

About a week ago I finally added those new extra exercises I’ve been promising on the webpage (almost 150 new exercises) to my differential equations book. These come with solutions. I did not want to add solutions to existing exercises; I still feel that it’s better not to have solutions. But I guess having some exercises with solutions does make students feel better. Plus, it seems this was an argument against using my book in at least one department (not enough exercises and no exercises with solutions). So there. Some of the new exercises are interesting; many are just simple plug-and-play exercises to get students going. I’ve added all of them as exercises numbered 101 and above, so that I would not change existing numbers. I suppose the “even/odd” split is the common way to do this, but my way has the added advantage that I can have fewer exercises without solutions.

As for the real analysis book: after having taught from Rosenlicht at UCSD (because my book doesn’t have metric spaces), I decided to use my book at Madison this coming fall. This requires that I write up some metric space material, so I will be adding a Chapter 7 to the book. It will probably not be completely polished by the fall, so I might keep it separate even for the fall and only merge it in once all the bugs are caught after teaching with it. The plan is to first do everything on the real line and then do metric spaces. I found that the metric space material was a bit too abstract for the students when I jumped right in; it might be better to first do sequences and continuity with real numbers only. I will skip some other material, such as series, and cover other material more lightly, due to time constraints. As for other news on the book: it is now (in slightly modified form) the standard book at the University of Pittsburgh.

This is funny in a very dark sort of way. Let’s do some numbers. Suppose that after some time, about 100 people a year die from the extra radiation exposure due to all these scanners. That’s a fairly low number; it would be impossible to even notice it in the cancer statistics. Given the millions of people that get scanned every year, and that spend sufficient time near those scanners, that’s a tiny percentage. Now, let’s take the last 50 years of plane travel and note that the number of people killed in terrorist attacks, including 2001, is a little more than 3000. That’s about 60 a year. Let’s suppose that no terrorist attack ever happens again (yeah, right). Still, we’d be killing almost twice as many people with our security measures.

I wish my granddad were alive; this is what he did (nuclear hygiene). It would be nice to ask him whether 100 is a reasonable estimate.

Doing math is like being in the military, it seems, at least if you want to be in academia. You don’t really get much of a choice of where you go to work; you sort of get an assignment. Especially in this job market. The idea is the following: you send about 100 applications (if looking for research jobs), and about 50 of those won’t consider you anyway, though you don’t know which when applying. Then, out of the other 50, you might get some interviews, and some of those you decide you don’t want to go to. Then you get an offer from a place you didn’t interview at, for a postdoc for which you didn’t really apply (actually, I got two such offers this year). The whole process takes about half a year, though then you get 2 weeks to decide once you get an offer.

My first job application 4 years ago, when I went to Illinois, was a bit simpler. I applied to 100 places. 50 of those were tenure-track positions where I had no chance straight out of the PhD. The other 50 were postdocs, out of which I was a reasonable candidate for maybe 5, since no place will hire a postdoc unless they have a group in your area. Out of that, I got two offers around the same time, and one sort of informal offer. Then you pick, and go live in the midwest for 3 years.

So we spent 3 years in Illinois, 1 year in San Diego, next up to 3 years in Wisconsin, then …? Wisconsin will be the 3rd state that Maia will have lived in, and she’s only going to be 5. So when someone asks her “Where’d you grow up?”, instead of saying “We moved around, my dad was in the military,” she’ll say “We moved around, my dad was a mathematician.”

Well, at least it will make for a more interesting story. I wish the job market would get better so that I can find a permanent job I like. I know it sounds like whining, since I have a guaranteed job for the next 3 years (though it’s also guaranteed that I will not have that job in 3 years’ time).

Academia is one of those careers where you’re well into your thirties before you can really settle down, and before your salary starts reflecting the level of education you got. Professors at top schools do get paid well, but it takes a long time. As a programmer, I could have been making, 10 years ago, the amount of money I hope to be making as a mathematician 10 years from now.

Then you hear someone say “Oh this or that person couldn’t get a job in the industry so he went to academia” … yeah right … that’s the easy way out.

Hopefully I’ve solved my overheating problems with the Lenovo. First, using the nvidia blob seems to have lowered the GPU temperature, but it wasn’t enough. Turning off the “discrete graphics” and trying to run the thing on the Intel GPU led to scary kernel crashes. I’ve realized that cpufreq does not take CPU temperature into account (that’s kind of dumb, isn’t it). The few posts I found had solutions of the form “CPUs should never overheat” and “reapply thermal paste” … yeah, that’s very useful. My ACPI does not output temperature for some reason, though lm_sensors seems to be working, so I guess the cpuspeed daemon won’t work. So it’s either hacking cpuspeed or the simpler solution of just lowering the maximum speed of the CPU. That seems to be working beautifully. I tried very hard to overheat it and it’s still good. I can’t really tell that it’s slower, so I don’t really care.

Still I hoped this would have been solved long ago. I sort of assumed it was actually.

Before I managed to “fix it,” I came up against the “run fsck manually” message, which I filed as a bug against Fedora, only to get a “what did that look like; that shouldn’t happen nowadays” response. Well, I am not about to replicate this, as I actually need to … you know … do work. And I don’t want to end up spending the day reinstalling the computer in case the filesystem really does get hosed.

Anyway, I’m not too happy with the Lenovo anymore. There are plenty of problems with this hardware. Given how much everyone was raving about Lenovo, I expected a lot better. Next time (which, given how this hardware seems to be working, is going to be soon) I will buy another one of those Linux-preinstalled laptops. The hardware will suck, but at least I won’t have to buy Windows.

I wish I could buy a laptop and have it for years. That doesn’t seem to be a possibility. First you buy a laptop and must install a bleeding-edge distro to get all the hardware to work. Then, by the time the version of the distro you use is reasonably mature and bug-free (or you can switch to a long-term-supported kind of distro), your hardware breaks down, forcing you to buy a new laptop. The cycle of life!

I wish people built things that were meant to last for more than 1-2 years.