Musings from some guy who knows stuff...and thinks he knows other stuff, and has opinions on just about everything, and is more than happy to tell you what he thinks and why...when he has time and the inclination to sit down and write in this thing.

Tuesday, December 19, 2006

I'm not sure when it happened or why (ok, I know in general why), but, as a nation, we are absolutely petrified about being "safe." Too much so. No, this isn't about the war, or terrorism, though those are certainly related. This is about auto safety and crash tests. Fine, the government and insurance companies test cars for their safety in various crashes and certain safety devices are required. From the perspective of those two groups it makes sense, as safer cars mean less severe injuries, which results in smaller hospital bills. Further, it makes sense for automakers to capitalize on the paranoia by touting safety devices and encouraging said paranoia. So I get it. All of it. That doesn't make it okay, though. In particular, this regards the "small car vs. big car" aspect touted in the article.

Reading the article, one gets the distinct impression that one will suffer and/or die if they choose a small car over, say, a tank. In particular, this leads parents to think "well, I don't want to drive an Envoy, but it's the only way for my children to be safe." Of course, one of the largest reasons that small cars are less safe is that large cars are on the road (most accidents do not involve running into buildings). If the circular reasoning remains unbroken, this leads to a nifty little downward spiral of more ridiculous vehicles on the roads. There is no compelling reason for 99.9% of the population to own a vehicle larger than a Scion or Yaris or Civic. None. Not one. Our roads would be safer if people drove smaller cars because response times and visibility would be notably improved. And this is never minding the financial side of things: smaller cars have less impact. Yes, impact means environmental, in terms of burning less fuel, resulting in lower gas costs and prices; but it also means infrastructural, in terms of less road wear, resulting in reduced road work costs; and it also means physical, as in when an accident does occur there is less [energy contained in the] impact, which could mean less cost in terms of repair and hospitalization--for the other driver.

All told, if everyone were driving smaller cars, everyone would be safer, there would be less of an impact on the planet, and we would spend less on infrastructure upkeep. Small cars are not dangerous, they're responsible.

Friday, December 15, 2006

It's amazing how disrespectful Hollywood can be of Christmas. Everyone knows that the best way to exploit the holiday to make a bazillion dollars is to release happy Christmas movies. I'd probably be less upset if these people would show similar outrage towards the Iraq war, where real people are dying, and where over one hundred thousand American troops will have to spend the holidays, many of them under fire. I also demonstrate some level of misplaced outrage, I know, but I am not drafting press releases receiving national attention at CNN.

The idea that raw data is good is rather pervasive. Obviously "raw" data has not been tampered with, massaged, excluded, modified, or diminished. Raw data contains all possible information and none of the potential prejudices of those who recorded the data. This is true to a certain extent, but often times the raw data is raw because it requires expert analysis, review and understanding to have the meaningful bits extracted. In science the amount of raw data generated is phenomenal. In the absence of careful analysis there is no point to it. It is not publishable, it is not relevant, it is, by itself, nothing. When someone with expertise and understanding analyzes the data, however, and extracts the meaningful aspects, then shows and explains those to others, there is a crystallization of the data. A form appears from nothing and others can build more off of it.

This is what scientists do (as opposed to technicians, who may or may not be scientists): they construct form from raw data. Journalists are supposed to do something similar. If reporting with no analysis or insight is the same as journalism, there is no expertise associated with being a journalist. Parroting and repeating things reduces journalists to base reporters, automated mimeographs. This is not always bad, mind you. I would love for there to be real reporting on Iraq, for example. But when it comes to some things, like politics, and (ostensibly) scientific work, journalists are a must.

Tuesday, December 12, 2006

It's funny that a good point coming from Lou Dobbs suddenly sounds bad. PC thuggery is bad, but so are the (equally thuggish) groups complaining about a "War on Christmas." A very noisy group of the population got up in arms about WalMart wishing visitors "Happy Holidays" at its stores instead of "Merry Christmas." Even if no one is offended by "Merry Christmas," the phrase "Happy Holidays" is definitely more welcoming to more people and, really, isn't that part of the spirit of Christmas? Plus, even if one is Christian, there are at least two holidays coming up, exactly one week apart, and some Christians also place value on celebrating the Epiphany. So, while there are some pretty fair reasons why someone could find the assumption behind "Merry Christmas" offensive, there is no reason that anyone should be offended by "Happy Holidays."

That said, the Christmas season is supposed to be about peace (but not in wreath form), love (of Augusto Pinochet and capitalism) and togetherness (only applies to heterosexual, married couples and their totally unplanned children, and like friends and neighbors), so maybe the rest of us should not be offended by anyone who finds offense in attempts at being inoffensive and inclusive. I mean, in reality, Christmas has become a secular holiday (which should really tick off those who are offended by "Happy Holidays"). Just smile back and ask if they have their solstice tree up yet.

Okay, I was yelled at for a previous post. I was condescending and, further, unclear as to exactly why the article made me mad. The data is fine, and there may be more to it than that, but the article is bad for this very simple reason: raw data has very little meaning. This is true of any raw data. Statistics, in particular, are rather devoid of meaning, as even their generation can be influenced by how those who pose the questions differ from those who interpret them. An example from the article/study would be related to the point about "imagining life without instant messaging." One person may see that as "would my life be noticeably different if IMs didn't exist?" Another may read that as "do I think IMing will ever die out?" and another as "do I think people could get along in a meaningful way if there was no instant messaging?" Those are three very different questions. Exactly how the question was posed, and how the respondents interpreted it, is very relevant to the meaning of the results. I deliberately assumed mocking interpretations of the possible questions from the data given to demonstrate the meaninglessness of the report as a whole. People will read the CNN article and believe that there is value of some sort contained within. That is a problem.

There may be value in the data, but it is open to the ideas of the reader, not the reality of the report. When reporting leaves the reader/listener to draw their own conclusions from nothing but raw data, with no explanations, then not only is the journalism bad, it is in fact non-existent. No relevant information beyond raw data was given. Further, there is no extrapolation within the report regarding what significance the data may contain. The additional reporting discusses the addictiveness of messaging. This is tangentially relevant, at best, but even then is not related to the data at all. This is not journalism. This does not belong on CNN. It may be amusing, but in the absence of analysis, reporting data is not a story.

One of the bits in the Time article stated that American children should get more foreign language. As much as I don't really want to say it, and as arrogant as it sounds: in strict terms of relevance, anyone who speaks English has no practical need to learn another language. English is the de facto (Latin, anyone?) language of the world in terms of economy, science, and politics. This is, largely, because of the dominant role of the United States in all of those fields, but, at this point, even if our role is diminished, the role of English will not be. (Grammar, on the other hand...) This does not mean, however, that I disagree with the sentiment.

I think that learning and speaking another language is a lot of fun. I enjoy doing it (and not because it annoys certain others who do not understand me). I frequently translate things in my head and play out imaginary conversations. I like singing songs in other languages as well...as a kid I loved Christmas-time masses because we would sometimes sing "Adeste Fideles" (the Latin version of "Oh Come All Ye Faithful"). Aside from the fun factor, though, which may not appeal to everyone, foreign language is one of the few areas of education in which Americans get a real look at a different culture. American culture is unique, but it has permeated the world such that it is often ambiguous, to the point that (foreigners) may accuse us of not having one (any). That is untrue and unfair--yes, McDonald's is part of it, but so too is our constitutional democracy and our deep-seated belief in certain, "inalienable" rights. Perhaps because of the notion of the US as a "melting pot" or combination of many cultures, we often overlook cultural aspects of other societies (Iraq and Iran, anyone?); we assume that if something works here it will work anywhere (I have heard that Euro-Disney is doing better). Language classes are among the few places where we may see how that is not true.

Various aspects of culture were involved in all the language classes I took. Art, history, and literature all got involved: videos showing traditional dance, and classes where we would prepare and eat relevant food. Spanish class also opened the door for me to spend a summer in Paraguay, in a small village with no paved roads. There could be more or less in various language classes, but seldom do those classes devolve into rote memorization of vocabulary words and verb conjugations, though those definitely play a substantial role. The understanding of the culture that develops is not one that is expressly taught, but by showing the culture and, in a sense, becoming part of it for a few hours a week, an empathy will develop on its own.

I'll admit, there is something to be said about how poorly we educate our children in terms of the world, but even doing more in terms of world history and modern civilizations will not provide the near immersion aspect that learning a language does. Even if the language itself is poorly retained, the cultural empathy will remain. I often wonder if more people in this country spoke Spanish, would there still be so much hatred toward Hispanic immigrants? Even Dubbya, who seems to have little in the way of empathy toward his fellow human, doesn't hate the brown man so...he also speaks some Spanish...poorly (must. not. sound. like. I. like. that. man.).

I'm not linking it, but Time's cover article is about education in the 21st century and its need to adapt. Lots of interesting points, and some things that I, and many educators, already knew. I've got a personal feeling to shed on the language issue, but will do that in another post. My big point here is simple: No Child Left Behind (NCLB) is crap.

Testing is the centerpiece of NCLB and it is causing more problems than it is fixing. Testing sounds like a good idea to people who are not educators, because they tend to think, "Gosh, when I was in school we took tests to 'prove' what/that we had learned, so testing kids will 'prove' that they are learning and that schools are succeeding." Feel free to think this way all you want, no matter how stupid it is, and it is stupid. Let's take a look at a simple benchmark of learning and determine how to test it: children should know how to multiply and divide. Well, we've all taken math tests before; most of us had to memorize multiplication tables up through 12. If I ask you what 12 x 11 is, does your answer represent understanding? A real ability to multiply? Memorization? Okay, so let's make it 324 x 17... This is more challenging, so it will better gauge a student's abilities. But there are different approaches to doing this problem. One is to write the numbers on top of each other and multiply row by column, then sum. This is how most people learned. Another approach may be to make use of the distributive property (324x17 = 10x324 + 7x300 + 7x25 - 7). The latter demonstrates a significantly better understanding of mathematics and how multiplication relates to addition and subtraction, but it hasn't been tested. The test may have another question related to the distributive property and another dealing with the order of operations, but someone who just uses them automatically has a better understanding of how they all work. It isn't tested, though. You can have children show their work, but then you have to make a qualitative decision as to the better answer even if two kids get the answer correct. Moreover, anyone who does the work in their head may not show any, and how is that to be marked?
When these kids get out into the real world, would it be better if they know how to multiply 324x17, or would it be better if they can figure it as being ~20x325 = 20x300 + 5x4x25 = 6500? What if they have $70 and need to pick up 17 (items) that cost $3.24 (and they live in OR, where there is no sales tax)? Of course, real world problems suck and there really is no legit comparison. The real answer is that the student who can apply different aspects of their learning in one area will be better able to do so in others, will be better able to extrapolate what they have learned to real problems and, therefore, be better able to function in a dynamic society.
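Since blogs can't show scratch work, here is a minimal sketch of the arithmetic above in Python (the function names are mine, purely for illustration): the rote column method, the distributive decomposition, and the real-world estimate all laid out side by side.

```python
def column_multiply(a, b):
    """The rote approach: multiply a by each digit of b (shifted by its
    place value) and sum the partial products, just like on paper."""
    total = 0
    for place, digit in enumerate(reversed(str(b))):
        total += int(digit) * (10 ** place) * a
    return total

def distributive_multiply():
    """The decomposition from the post: 324x17 = 10x324 + 7x(300 + 25 - 1)."""
    return 10 * 324 + 7 * 300 + 7 * 25 - 7

def estimate():
    """The mental estimate: ~20x325 = 20x300 + 5x4x25."""
    return 20 * 300 + 5 * 4 * 25

print(column_multiply(324, 17))  # 5508
print(distributive_multiply())   # 5508 -- same answer, different understanding
print(estimate())                # 6500 -- a safe over-estimate of the 5508
```

Both exact methods agree at 5508 (so 17 items at $3.24 is $55.08, comfortably under $70), and the estimate deliberately overshoots, which is exactly what you want when checking whether you can afford something.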

Testing does not (and will never) demonstrate this. The fundamental flaw of (standardized) testing is that it is goal oriented, with the wrong goal. This is not to say that all testing (or even all standardized testing) is bad. Nonstandardized testing, as practiced in many classes, is very valuable, if not necessary, to gauge student progress, and can, in fact, examine their understanding at a higher level. So long as the test is not a benchmark of passing or failure for a student or school, standardized test results can be useful to model simple aspects of education, and disparity within it. The SECOND they determine a student's and/or a school's fate they become meaningless measures of mediocrity...methinks. Our education system already does a poor job of nurturing creativity and intellectual curiosity; NCLB testing makes it worse.

Monday, December 11, 2006

There was a poll comparing teens and adults on IM practices and views. I'll start by saying I do IM, though not much. Now, let's look at some of the patently idiotic things reported as news:

-"Almost three-fourths of adults who do use instant messages still communicate with e-mail more often. Almost three-fourths of teens send instant messages more than e-mail."

Adults have jobs, and freedom, and money. IMing requires both parties be at their (connected) computers (this is not talking about texting)...this is way more likely to happen if you are a kid and can't just head out at 9:00 if you want to. Dumb-ass, non-point.

-"More than half of the teens who use instant messages send more than 25 a day, and one in five send more than 100. Three-fourths of adult users send fewer than 25 instant messages a day."

See above: adults have more control over their lives and IMs are more impersonal and less fun than irl meeting or even phone calls.

-"Teen users (30 percent) are almost twice as likely as adults (17 percent) to say they can't imagine life without instant messaging."

My favorite idiot point. First off, who the hell are those adults?!? Think for a second: "can't imagine life?" Okay, teens likely have no point of memory where IMing did not exist, so that makes sense. Adults, on the other hand, likely lived through the internet transition (though that becomes less likely below ~25) at an age where they have distinct before and after memories. Hell, most adults remember when using a cell phone made you look like a jackass (mostly because you were holding a brick to your ear and nuking your brain).

-"When keeping up with a friend who is far away, teens are most likely to use instant messaging, while adults turn first to e-mail."

Back to point one. Adults are less likely to have success w/friends through IM...unless they set it up in advance with, say, an e-mail.

-"About a fifth of teen IM users have used IM to ask for or accept a date. Almost that many, 16 percent, have used it to break up with someone."

They also have their friend ask her friend if she maybe likes him. Yes, teen dating is obviously something to compare with adults.

Saturday, December 09, 2006

I have watched and read too much about Mt. Everest and the various expeditions. I'll admit a bizarre fascination with the people who choose to ascend that particular mound of earth. I don't really get it, though. One of the last specials (Disc channel?) I saw had one group with a mess of variously disabled folks. There have been lots of stories about people summiting Everest who were blind/asthmatic/limbless/whatever, and there is always accompaniment to the tune of "(one) can accomplish anything if they put their mind to it, as I have proven by making it to the top of Everest despite (some disability/health issue)." It really is bullshit. Fact is, in most of those cases it is far more likely to be technology and other people who have made the feat possible. In the end, the ability to climb Everest is dictated by free time and money. That's it. If you have the cash behind you to support training and cover the cost of the expedition, then you, too, can climb to the top of the world. That's it. People who fail were insufficiently prepared. Knowing that it is money (and free time, but really, they're related) and not "the will of the human spirit" that gets people up and (more importantly) down the icy rock really takes something away from the folks that do it. Everest specials are kind of like NASCAR on Discovery: you wait to see if someone vomits blood or dies or some other horrible thing. Plenty of people have succeeded in summiting. It's only interesting now if they fail.

Although, I would love to see a quadriplegic ascend as some Sherpa's backpack, and cheer their triumph of being able to breathe at great elevations...with supplemental oxygen...but that is probably insensitive of me.

Friday, December 08, 2006

Google too smart? There is such a thing as being "too smart for your own good," and it can cause problems in several areas. One of those areas pointed out in the Google article is that smart (talented) people don't like playing second fiddle. It can lead to resentment, poor work ethic, and in extreme cases, active sabotaging of others' work. There is also an aspect of "my idea, my credit" which can poison relations. Bosses get credit for their underlings' work, even if those underlings get credit themselves, it is, at best, shared. Also there is a sense of not liking another's idea and not respecting it...especially if it is the boss's idea. As much as this may be an issue for Google, it is definitely an issue in academics.

From before, academic groups have graduate students, post-docs and a principal investigator. That is a whole lot of education and brain power in a small group. When there is functional flow of information and sharing of ideas and mutual respect, things are copacetic. When there is a breakdown it can be a disaster. Intellectual property theft, accusations of dishonesty, and career-debilitating working conditions can all result. Most of the time, differences of opinion are just that and can be worked out. Sometimes, however, those differences become ingrained as "right" and "wrong" and very little can be done to break them. For the most part, professors understand that there is little absolute in life, save that absolutes are wrong. Most post-docs and grad students realize that they have far less experience and expertise in the field, and that deferring to those with more is wise.

The reason that academic research groups, by and large, function as well as they do is the transient nature of all group members save one. So long as the students and post-docs are (or feel like they are) learning, there is value in the group and there is meaning to working. This is further coupled with the knowledge of an exit. Knowing that there is an end, that it is not really too far away, and having specific "things to do" that get you there, allow for any disgruntlement to be suppressed. Again, this doesn't always work out, but by and large it does. When it does not, it is most likely because one of these things has been compromised. Not seeing the end can cause other stresses to break and those previously mentioned problems to arise.

In the "real world" there may not be a fix. The fact is that any group must have leaders and followers; there can only be one "smartest person in the room," and sometimes there are absolutes, especially when the question is posed as: "Which of these is better (best)?"