from the say-what-now? dept

So we were just writing about how the White House appeared to be relying on security through obscurity in denying an Associated Press FOIA request concerning the security behind Healthcare.gov. Specifically, the request was denied because the White House claimed that revealing such info might help hackers. As we noted, if revealing the basic security plan you're using will help hackers, then you're not secure, and chances are you've already been hacked.

Of course, perhaps the reason the cybersecurity is so awful is that the White House's "cybersecurity coordinator," Michael Daniel, not only isn't a cybersecurity expert, but thinks that's a good thing. I wish I were joking. After spending a few minutes talking about how all his training at Princeton and the Kennedy School at Harvard taught him to communicate well and "break down problems," he dismisses the need for actual technical knowledge:

You don't have to be a coder to really do well in this position. In fact, actually, I think being too down in the weeds at the technical level could actually be a little bit of a distraction..... You can get taken up and sort of enamored with the very detailed aspects of some of the technical solutions. And, particularly here at the White House... the real issue is to look at the broad, strategic picture and the impact that technology will have.

Now, there is some truth to the idea that it's important to be able to look at the bigger picture, but when you're talking about cybersecurity, part of the way you look at the bigger picture is to actually understand the technology. That's not "a distraction"; it's part of the core knowledge necessary to do the job of a cybersecurity coordinator. People who don't spend much time with these things view cybersecurity and technology as a kind of "magic." But it's not. Nor is the economics of technology, but Daniel seems to think it is:

But the other issue in my mind is that at a very fundamental level, cybersecurity isn't just about the technology but it's also about the economics of cybersecurity. Why companies choose to invest the way they invest. It's about the psychology of cybersecurity. You know, one of my sayings is that 'expediency trumps cybersecurity every time' meaning that people will prioritize convenience over being secure many times. So you need to have the understanding of those kinds of factors: the psychology, the economics, the broad policy, the politics with a little p, in addition to the technology. So you need to be more of a generalist than having a lot of expertise particularly in the technological side.

Yes, in addition to the technology. All of those things are important, but they're mostly useless if you don't understand the underlying technology. He's then asked what the biggest challenges are and... after talking about how important it is to understand the psychology and economics (more important, in his view, than the technology), he admits that he doesn't actually understand the psychology and economics. Apparently, he wants to make sure he has none of the qualifications for the job.

There are a few [challenges] that I can identify. One is that we don't actually truly understand the economics and psychology behind cybersecurity. We know that a huge number of intrusions rely on known fixable vulnerabilities... We know that intruders get in through those holes that we know about that we could fix. The question is, 'Why don't we do that?' That clearly leads me to the conclusion that we really don't understand all of those economics and psychology well enough.

So there you have it folks. The White House's cybersecurity expert doesn't have the technological expertise, but insists it's okay because he's focused on the economics and psychology of the fact that people don't patch their computers -- and then admits he has no idea why that happens.

from the urls-we-dig-up dept

Year after year, news reports state that the US has horrible test scores in math compared to other countries. This leads commenters to speculate on the dismal future of the US economy and to complain about the weaknesses of the entire US educational system. However, international test scores have never correlated that well with countries' relative economic performance, so it's hard to see how bad test scores would accurately predict future economic rankings. There are plenty of things to try to fix in the US education system, but perhaps we should be planning a longer-term strategy (instead of trying to turn the ship every election year) and focus on the evidence of what produces good educational results (if we can even agree on what results we want).

from the and-again-and-again-and-again dept

Every few months or so, we read about some freaked-out reporter/columnist/pundit/politician complaining about how the internet and texting are destroying kids' ability to write. Yet, pretty much every study on the subject has found the opposite to be true. Study after study after study after study after study has found that kids today are better writers than kids in the past.

"I think we're in the midst of a literacy revolution the likes of which we haven't seen since Greek civilization."

That's because people are constantly writing. Almost all of their online communication -- texting, chatting, posting -- involves writing. In the past, outside of school or certain job functions, many people barely wrote at all. And, yes, kids use txt spk at times, but every generation changes and morphs the language. More importantly, kids are smart enough to know what's appropriate when, in most cases:

Lunsford's team found that the students were remarkably adept at what rhetoricians call kairos—assessing their audience and adapting their tone and technique to best get their point across. The modern world of online writing, particularly in chat and on discussion threads, is conversational and public, which makes it closer to the Greek tradition of argument than the asynchronous letter and essay writing of 50 years ago.

But there's also an interesting philosophical shift that he highlights. Since the type of writing and the audience are different than in the past, many younger people today approach writing in a different manner, and have even rethought what they consider to be good writing:

The fact that students today almost always write for an audience (something virtually no one in my generation did) gives them a different sense of what constitutes good writing. In interviews, they defined good prose as something that had an effect on the world. For them, writing is about persuading and organizing and debating, even if it's over something as quotidian as what movie to go see. The Stanford students were almost always less enthusiastic about their in-class writing because it had no audience but the professor: It didn't serve any purpose other than to get them a grade.

This is really fascinating when you think about it. Historically, many people haven't been that concerned about their writing, because it didn't matter. But the more it matters, the more seriously they take it. This certainly doesn't mean that everyone has become a good writer -- far from it (just view any open comment forum). But when people really care about what they're saying, they tend to get better at saying it, and the internet gives more people more reasons to care. As for all the bad writing out there? It's not a sign of the destruction of written English. Those people probably wouldn't be writing much at all without the internet. So it's actually a step up, relatively, from what they would have been doing in an alternate internetless universe.

from the @WWII-thanks,-but-we'll-ttyl dept

It's quite common for schools to struggle with how and what to teach kids when it comes to technology, often trying to balance newfangled topics like computer skills with tried-and-true classics like history. But a new version of England's primary-school curriculum would make the teaching of certain historical topics, like the Victorian period and World War II, non-compulsory, while dictating that kids should "leave primary school familiar with blogging, podcasts, Wikipedia and Twitter as sources of information and forms of communication." It's easy to see this story leading to knee-jerk reactions from people decrying how kids aren't learning what's important, are spending their time playing computer games, and so on. But the reactions in The Guardian's article seem, for the most part, pretty measured. While mentioning Twitter makes for a tasty headline, the real thrust of the new curriculum seems not to be to teach kids particular platforms like Twitter or blogs, but rather to build their technological understanding, while allowing schools some flexibility in how they do so. That would follow some earlier UK government reports, which found that the schools doing the best job of teaching IT skills were those that spread computer skills across multiple subjects, rather than segregating them into specific IT courses. Integrating technology into the entire curriculum, just as technology is integrated across multiple aspects of modern life, seems like the best way to prepare young students for future success.