Here's a link to the best tips for a Google interview I've seen so far. One point raised is that you need to practice writing code with paper and pencil, because you'll be asked to do exactly that (or to use a whiteboard). Most programmers nowadays code in IDEs, which provide editors, debuggers, and other tools to aid development; they make people more productive, so why write something out in longhand that you have to type into the computer anyway? There was a time when people did not have IDEs, so they used flowcharts, desk checking, and the like before entering their programs on punched cards or tape. IDEs were developed to simplify the programming process. But now, it seems, we have reason to doubt the productivity of IDEs, because people might be relying on them as crutches. (Sort of like using calculators instead of doing arithmetic with pen and paper.) Ironic, isn't it? We are regressing because we do not know how to ascertain whether someone knows how to do something.

I was reminded of how, when I was at Stuy, I could focus much more on doing well in school because I didn't have many conflicting priorities. I had some things I needed to do, such as chores, but for the most part I could spend a lot of time on schoolwork without worrying that I needed to be doing something else. And I didn't have conflicting priorities with regard to what was assigned – it wasn't as if I had to prepare for possible tests from teachers I didn't know, or on topics that weren't covered in class and that I wasn't very familiar with.

I think the culture of Google is sort of like high school in that regard – because so many other things are taken care of (meals, laundry, etc.), the engineers have more time to dwell on matters that most people can't afford to spend time on. When I was at AV, if I'd spent time thinking through all the different algorithms I might use to solve a given problem, my management would have been upset with me for not solving their problems. I had to pick solutions that were likely to work over a broad class of situations, particularly because there often wasn't time to rewrite anything; I had to move on to the next thing.

I think there was more opportunity for that type of brainstorming in the AV engineering groups. During the brief time I was part of the index build team, I attended some meetings at which design issues were discussed. It was a refreshing change from the sorts of things that used to come up at my own group's meetings, like debates over whether cookies should be used to count unique users. I missed having that type of interaction, which I think is important to the continued professional development of a computer scientist or software engineer. But as time went on, even the AV engineering groups were forced to just implement something and move on to the next thing, because so many people had been laid off that there was no time to ponder whether there might be a better way of doing something.

I'm not saying that Google should change its hiring criteria, but it should be honest and upfront about what types of people it's looking for, and why they're difficult to find. The US Congress is trying to address a (presumed) lack of CS or SE expertise, not a lack of geeks. Google (and every other company complaining about a lack of qualified applicants) should say plainly what it's looking for, so that there can be a meaningful discussion in Congress about rectifying the situation. Put it to the voters – should Congress go to great pains to accommodate the specific staffing needs of companies?

Comments

From the article at the link: "A bad hire is worst [sic] then [sic] screening out a good candidate."

I'm not sure I could bring myself to recommend hiring someone who wrote a sentence like that. :-)

There are a couple of issues being conflated here. First, I don't believe there is such a great shortage of technical expertise. Sure, a lot of folks could do with some brushing up on material they learned decades earlier, and I would never denigrate the importance of understanding algorithms and data structures; programming without considering them is like cooking without considering the taste of the ingredients. However, most companies making the shortage complaint seem more concerned with the fact that many of the unemployed forty-somethings (I'm a forty-something, though not unemployed – at least not yet) command a lot more than new hires, cheap and easily controlled H-1B hires, etc.
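To make the cooking analogy concrete, here's a toy Python sketch (the function names are my own, purely for illustration) of how a data-structure choice changes a program's cost even when the algorithm's logic is identical: detecting duplicates with a list of seen items takes quadratic time, while a hash set makes it roughly linear.

```python
def has_duplicates_list(items):
    """O(n^2): 'seen' is a list, so each membership test scans it."""
    seen = []
    for x in items:
        if x in seen:          # linear scan of everything seen so far
            return True
        seen.append(x)
    return False

def has_duplicates_set(items):
    """O(n) expected: 'seen' is a hash set, so each test is ~O(1)."""
    seen = set()
    for x in items:
        if x in seen:          # constant-time hash lookup
            return True
        seen.add(x)
    return False

# Both give the same answer; only the running time differs.
print(has_duplicates_list([1, 2, 3, 2]))  # True
print(has_duplicates_set([1, 2, 3]))      # False
```

On a few items the difference is invisible, which is exactly why it only bites the people who never thought about it in the first place.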

Second, the last thing I want Congress to do is "rectify" anything with regard to education. The federal Constitution gives Congress no legislative authority in this area (okay, so following the Constitution is a quaint notion, but please bear with me), and there is no reason why wealthy companies need academic subsidies (i.e., corporate welfare) to meet their hiring needs.

I'm no fan of Google the company (as opposed to Google the search engine), and I wouldn't work there; I'm at least a decade past my tech-sweatshop days, if I ever had any. However, they're entitled to whatever hiring standards or weeding-out processes they wish. If their standards are sensible, they'll probably do well; if not, they'll lose talent to competitors.

Finally, I agree with your assertion that there is often a conflict between the more pragmatic push to roll out software and the (only ostensibly) less pragmatic encouragement of brainstorming, independent thinking, and blue-sky design sessions. I think most companies err on the side of the former, to their detriment. Again, though, striking a profitable balance must be left to each individual employer, with the market providing the usual rewards for success and penalties for failure.