
I'm no fan of Java-based curricula, for the same reason I'd be no fan of Fortran-based curricula. Computing isn't about one language. Each language and system shows you one hyperplane of a vast multidimensional space. The best programmers know lots of languages, and choose wisely among them — or even create new ones when appropriate.

In the production world, there are times when some C++ or Java code is appropriate... and there are times when what you want is a couple of lines of shellscript and some pipes... and there are times when the most sensible algorithm for something can't be neatly expressed in a language like C++ or Java, and really requires something like Common Lisp or Haskell. If you need to exploit multiple processors without getting bogged down in locking bullshit and race conditions, you're much better off using Go than Java.

(Just last night, at a meetup, I was talking with two bright young physicists who reported that their universities don't do a good enough job of teaching Fortran, which is the language they actually need to do their job. Scientific computing still relies heavily on Fortran, Matlab, and other languages well removed from what's trendy in the CS department — no matter if that CS department is in the Java, Haskell, or Python camp. But if you want to learn to write good Fortran, you basically need a mentor in the physics department with time to teach you.)

And there are times when the right thing to do is to create a new language, whether a domain-specific language or a new approach to general-purpose computing. There's a good reason Rob Pike came up with Sawzall, a logs-analysis DSL that compiles to arbitrarily parallel mapreduces; and then Go, a C-like systems language with a rocket engine of concurrency built in.

(And there's a good reason a lot of people adopting Go have been coming not from the C++/Java camps that the Go developers expected, but from Python and Ruby: because Go gives you the raw speed of a concurrent and native-compiled language, plus libraries designed by actual engineers, without a lot of the verbose bullshit of C++ or Java. Would I recommend Go as a first language? I'm not so sure about that....)

What would an optimal computing curriculum look like? I have no freakin' clue. It would have to cover particular basics — variable binding, iteration, recursion, sequencing, data structures, libraries and APIs, concurrency — no matter what the language. But it can't leave its students thinking that one language is Intuitive and the other ones are Just Gratuitously Weird... and that's too much of what I see from young programmers in industry today.

Sure, I know and like DNSBLs including Spamhaus's, but this is a distinct application from XBL. Specifically, removal needs to be rapid in order for it to be useful for rejecting customer Web traffic. That's an engineering requirement that email anti-spam systems don't have, since SMTP is designed to retry for days if necessary to get a message through. Moreover, hosts that send any legitimate email are very few compared to hosts that send Web requests; and even though email admins are frequently dense, unresponsive, or victim-blaming, they're still a level above typical users in knowing what the fuck is going on with their computer.

One approach would be to have each DDoS victim continually (e.g. every hour) assert which addresses were attacking it, and only list those addresses which are currently attacking. This way, as soon as a host stops attacking, it will drop off the list. This has weaknesses — for instance, an attacker can use your host all night while you're not using it, without you noticing — but it's still an improvement over what we have today. And it still depends on each subscribing site having a good enough backchannel to the listing service to stay open during the DDoS. Back in the day we'd do it with a dedicated modem line — the bandwidth requirements are really quite minimal — but nobody knows what that is any more.
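A minimal sketch of that expiring-list idea, with hypothetical names and a one-hour TTL matching the hourly assertions (not any real listing service's API):

```python
import time

class ExpiringBlocklist:
    """Tracks attacking addresses; entries lapse unless re-asserted."""

    def __init__(self, ttl_seconds=3600):
        # One hour: a victim must keep asserting an address to keep it listed.
        self.ttl = ttl_seconds
        self.entries = {}  # address -> last time a victim asserted it

    def assert_attacking(self, address, now=None):
        """A victim reports that this address is currently attacking it."""
        self.entries[address] = time.time() if now is None else now

    def current_list(self, now=None):
        """Addresses asserted within the TTL window; stale ones drop off."""
        now = time.time() if now is None else now
        self.entries = {a: t for a, t in self.entries.items()
                        if now - t <= self.ttl}
        return set(self.entries)
```

The point of the TTL is exactly the property described above: as soon as a host stops attacking and victims stop asserting it, it falls off the list automatically, with no manual delisting step.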

Sites under DoS attack should publish (through a channel not congested by the attack) a list of the IP addresses attacking them, through some trustworthy third party. Then, other sites should subscribe to that list and refuse service to those addresses until they clean up and stop attacking.

For instance, consider your uncle who uses AOL. His computer is infected with botnet garbage and is participating in a DoS attack against (say) Slashdot. Slashdot sends a list of attacking IPs, including your uncle's, to Team Cymru (the third party). Cymru aggregates these and publishes a list, updated every three hours. AOL subscribes to that list. When your uncle goes to check his AOL email, he gets an error: "We regret to inform you, your computer has been hacked, and is being used by criminals to break the Internet. You can't get to your AOL email until you kick the criminals off by installing an antivirus program and running a full scan. Click here to install Kaspersky Antivirus for free. Thank you for helping keep criminals from breaking everyone's Internet. Sincerely, Tim Armstrong, CEO, AOL."

Then your uncle gets mad and calls up AOL and complains. They try walking him through using the antivirus program, but he just curses them out and says he'll go to Hotmail instead. He tries... but Hotmail also subscribes to the same list and tells him the same thing: "Your computer is infected with malware and is being used to attack other sites on the Internet. You cannot obtain a Hotmail account until your computer is clean. Click here to install Microsoft Antivirus." He gives up and calls AOL back, and they help him get his computer cleaned up. Within half an hour, it's off the botnet; and within three hours, it's off the list of attacking hosts, and your uncle can get his AOL email again.

I have IPv6 through my ISP, Sonic.net. Whenever I use BitTorrent, I see plenty of IPv6 hosts. The reason is pretty obvious to me: if you're passing IPv6 through your home router, you have an externally-reachable IPv6 address... but you may not have an externally-reachable IPv4 address thanks to your home router's NAT.

Presumably, this means that one incentive for home users to get IPv6 is a better-connected BitTorrent network. BitTorrent is pretty popular, but ISPs are never going to tell you "Get IPv6 so you can download movies... er, I mean, Ubuntu Live CDs!... faster."

The easiest place to reach with the mouse is the current position. The second easiest is the four corners of the screen. The third easiest are the four sides of the screen. The hardest place is a square in the middle of the screen. Ancient UI guidelines are still relevant today.

Yep. This is a corollary of Fitts's Law, and while it's often associated with the design of the Macintosh menu bar, the underlying research dates to 1954, thirty years before the Mac.
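For reference, Fitts's original 1954 model estimates the time to move to a target as

```latex
MT = a + b \log_2\!\left(\frac{2D}{W}\right)
```

where MT is movement time, D is the distance to the target, W is the target's width along the axis of motion, and a, b are empirically fitted constants. A screen edge or corner behaves like a target of effectively infinite width, since the pointer stops there no matter how hard you throw it, which drives the log term (and thus the acquisition time) way down.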

Sadly, it hasn't been well learned on a lot of systems. Although Windows and Ubuntu both put a useful menu in a corner, few systems but the Mac make really effective use of the screen edge. Windows and many Linux desktops occupy much of one whole screen edge with a rarely used application switcher; but most users switch applications by pointing and clicking, or using keyboard shortcuts like Alt-Tab.

One big win that a lot of systems have benefited from, though, is contextual menus, which take advantage of the current position.

One problem is that as the size of a corporation increases, the influences on its behavior may become dominated by principal-agent problems and specific motivations of individuals within the corporation. It's easy for the members of a ten-person startup to keep "increasing shareholder value" in mind, but in a ten-thousand-person company, a middle manager or mid-ranking engineer may be much more interested in his or her next quarterly review or promotion.

Furthermore, the internal economy of a large corporation is a command economy, not a free market. In a free market, decision makers can count on prices to show them which goods are the most efficient choices, or which products may be the most lucrative. But within a large corporation, management is expected to know how best to apportion budgets, wages, investments, etc., all without the benefit of a pricing mechanism that accurately reflects (internal) needs. And as the corporation gets larger and more heavily capitalized, it becomes more and more different from the outside world, so external signals (such as the prevailing wages in the industry) become less relevant to internal decision-making.

I saw it about a week ago. Overall, my biggest impression was one of missed potential.

(Note, here I'm talking primarily about the story and the world-building, not about the cinematography.)

The overall structure was a weakness from the start. Sam Flynn turns out to be yet another Prince Harry character: the heir to the throne who goofs around and avoids his inherited position until he's handed a confrontation that forces him to prove himself, at which point he rises to the occasion as a True Prince. We've seen this before; it's the usual aristocratic nonsense: worth is not achieved, but inherited and then revealed.
Contrast the original: Kevin Flynn was an honest working hacker who was forced to go rogue when he was screwed over by a yuppie coworker. Kevin's triumph was to prove himself as a creator. He set out with the aim of showing that he and not Ed Dillinger was the author of Space Paranoids; and in the end, he accomplished that goal, but in a way that -- through his creative "User power" -- changed the Programs' world for the better.
Sam isn't a creator. He sets out with no particular goals of his own; he is handed all his goals by his inheritance. Kevin Flynn was a creative adult seeking justice; Sam Flynn is an irresponsible rich boy growing up. And that's a story that's been played out far too many times.

One of Legacy's few big world-building ideas is the emergence of the Isos: Programs evolved from the System itself, rather than being created in the image of a User. This could have been huge. But instead it is presented merely to give Sam's love interest a tragic backstory. The war is over; the Isos lost, here's the last surviving princess of a dead race. Give her a hug.
The political vision of the System in Tron is more complex. There are old powers in the System that defy the MCP's regime at personal risk to themselves: Dumont at the I/O Tower. The MCP's assimilation of the whole System into itself is not complete; it can be resisted. In Legacy, CLU's genocide of the Isos is over and done with... and nobody even bothers to say, "Sam, you dickhead, if you'd logged in yesterday, you could have stopped the fucking Holocaust."

Another new world-building idea is the possibility that a Program could use the laser terminal to escape into the real world: that the laser wasn't limited to objects that originated in the real world (oranges or Kevin Flynns), but could also play back a Program into human form. Thus Quorra's escape; thus CLU's threat to invade our world with armies of Programs.
Well, Tron's MCP didn't need armies to take over the world. The MCP could just hack the Pentagon. In Tron, the deep entanglement of the real world and the System is made clear: the MCP can threaten Dillinger not with armies materializing in ENCOM's laser bay, but with the legal and political forces native to our world.
Ironically enough, the 1982 vision has more in common with today's Internet-enabled reality than the 2010 version. As far as we know, the System in Legacy isn't even on the Net: it's a dusty minicomputer sitting in the basement of Flynn's Arcade with barely enough connectivity to reach Alan Bradley's pager.
Ultimately, CLU is much less of a real-world threat than the MCP. The MCP had taken over the System that ENCOM used to do its business, and was extending tentacles into banks, major governments, and who knows what else. CLU's domain is that one minicomputer; the big threat would be shut off if Alan or Sam had just unplugged the laser terminal.

Both of the above two problems point at a bigger problem with Legacy: it ultimately doesn't take Programs and the System seriously as an independent sort of intelligent existence rather than a mere imitation of our world.
Quorra longs to see the sun; CLU wants to get out into our world to "perfect" it; the Programs have nightclubs and sports arenas imitating human ones. The way it's presented in Legacy, the best thing that could happen to a Program is to get out of the confining, artificial System into the authentic, sun-blessed, material world.
That notion is alien to the original. Tron, Yori, and Dumont may revere the Users but they don't want to become Users. They want to free their own world and live in it pursuing their own purposes -- not escape into the human world. They aren't imitation humans who want to grow up to be Real Boys like Pinocchio -- they're Programs, and they know what their purpose is in life: it's to fulfill the goals their Users set up for them.
(Extra bonus for real sf nerds: Tron's Programs may have something in common with C. J. Cherryh's azi: confidence of purpose. As Grant would put it, self-doubt is for born-men. Azi do not wish they were born-men; azi take refuge in the certainty that born-men lack.)

And speaking of lost story potential, how about Rinzler? Anyone who's seen the original knows that Rinzler is a hacked-up copy of Tron from his very first appearance, thanks to the "T" insignia on his chest. Kevin Flynn mentions it once in passing, and at the end it's clear that Rinzler is "rebooting" back into Tron. But Rinzler hasn't had enough character development for us to care: he's a literally faceless killing machine. And as killing machines go, he's got less character than Darth Maul, and that's saying something.

All in all, Legacy came across to me as too circumscribed of a world, and Sam Flynn as too much of a True Prince cardboard character. Movie-wise, I wanted to see more of the Isos and a lot less of Dr. Frank-N-Furter.

Your digital camera may embed metadata into photographs with the camera's serial number or your location.

Record your location? Sure, if it's a smartphone with GPS. For standalone cameras, GPS is not exactly a common feature. There are about two models of pocket digital camera on the market that have GPS, and not very many SLRs with it either... go look. Those that have it make no secret of it; it's actually a big marketing point for people who want to record where they've been taking pictures.

As for smartphone models, I don't know about the Apple or Windows offerings, but Android's camera app exposes geotagging as an option right on the main screen, next to the flash and focus settings... and I'm pretty sure it defaults to off. People turn this on because they actively want it.

Rather than scaring people about what their devices might be recording, it would be a lot more useful to tell people how to find out what tags are on their photos. For instance, the Linux command-line program "exiftags" will tell you this kind of stuff. (I ran it on a random image file I had lying around on my laptop.)

The problem isn't that math isn't important. The problem is that the math being taught isn't important.

Yes. Exactly.

Fuck calculus. You don't need it unless you're going into one of a few specific fields. But there are whole swaths of math that most folks completely miss, that are directly applicable to everyday life:

Probability and statistics. No, not for understanding the census, nor for gambling -- rather, for understanding what's meant by words like "evidence". Bayesian probability can be taught to anyone who can understand percentages and division, and it can be straightforwardly applied to reasoning about the everyday world.
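Here's the kind of thing I mean, with made-up numbers for illustration: a Bayes update on everyday "evidence," using nothing fancier than multiplication and division.

```python
# Hypothetical numbers: a burglar alarm rings during 95% of break-ins,
# but also rings falsely on 1% of ordinary days; break-ins happen on
# 0.1% of days. Given that the alarm is ringing, how likely is a break-in?

p_breakin = 0.001
p_ring_given_breakin = 0.95
p_ring_given_no_breakin = 0.01

# Total probability that the alarm rings on any given day.
p_ring = (p_ring_given_breakin * p_breakin
          + p_ring_given_no_breakin * (1 - p_breakin))

# Bayes' theorem: P(break-in | ring) = P(ring | break-in) P(break-in) / P(ring)
p_breakin_given_ring = p_ring_given_breakin * p_breakin / p_ring
```

With these numbers the answer comes out under 9% -- the alarm is ringing and there's still probably no burglar, because false alarms on ordinary days vastly outnumber real break-ins. That's the "base rate" intuition most people never get taught.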

Proof and logic. The notion of logical proof has been around since Aristotle, but symbolic logic is much newer. Nonetheless, the notion of logical validity of an argument, of conclusions following from premises, is directly applicable to all sorts of real-world decision-making. Logic is also an obvious point to dovetail math into the humanities, via the analysis of written arguments.
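Logical validity is even mechanically checkable at this level. A toy truth-table checker (my own illustrative sketch, not any standard library): an argument is valid exactly when no assignment of truth values makes all the premises true and the conclusion false.

```python
from itertools import product

def valid(premises, conclusion, variables):
    """True iff no truth assignment makes every premise true
    while making the conclusion false."""
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # found a counterexample
    return True

# Modus ponens: from "P implies Q" and "P", conclude "Q". Valid.
modus_ponens = valid(
    premises=[lambda e: (not e["P"]) or e["Q"], lambda e: e["P"]],
    conclusion=lambda e: e["Q"],
    variables=["P", "Q"],
)

# Affirming the consequent: from "P implies Q" and "Q", conclude "P". Invalid.
affirming = valid(
    premises=[lambda e: (not e["P"]) or e["Q"], lambda e: e["Q"]],
    conclusion=lambda e: e["P"],
    variables=["P", "Q"],
)
```

The second argument is the structure of half the bad op-eds ever written, which is exactly why this belongs next to the humanities curriculum.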

Abstract algebra. Not the proofs, nor the deep abstractions, but rather the notions of properties such as commutativity, associativity, etc. and the idea that these can be applied to any sorts of operations, not just "mathematical" ones. Does it matter if you mix the eggs in before the butter? Do you need to do X separately to A, B, and C, or can you put A+B+C together and then do X all at once? The notion that some situations or problems have the same structure as others is itself pretty powerful. (And lends itself to comparison with the literary idea of analogy.)
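Those properties can literally be tested on sampled inputs, which makes them concrete rather than abstract. A small sketch (illustrative helper names of my own):

```python
def commutes(op, samples):
    """Does op(a, b) == op(b, a) for every sampled pair?"""
    return all(op(a, b) == op(b, a) for a in samples for b in samples)

def associates(op, samples):
    """Does op(op(a, b), c) == op(a, op(b, c)) for every sampled triple?"""
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a in samples for b in samples for c in samples)

nums = [0, 1, 2, 3, 5]
add_commutes = commutes(lambda a, b: a + b, nums)               # addition: order doesn't matter
sub_commutes = commutes(lambda a, b: a - b, nums)               # subtraction: order matters
concat_associates = associates(lambda a, b: a + b, ["x", "y", "z"])  # grouping doesn't matter
concat_commutes = commutes(lambda a, b: a + b, ["x", "y"])      # but order does
```

String concatenation is the everyday payoff: it's associative but not commutative, exactly like "mix the dry ingredients in any grouping you like, but don't bake before you mix."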

Whether they annoy you or fulfill your nerdy collection habit, achievements have spread across the gaming landscape and are here to stay. The Xbox Engineering blog recently posted a glimpse into the creation of the Xbox 360 achievement system, discussing how achievements work at a software level, and even showing a brief snippet of code. They also mention some of the decisions they struggled with while creating them:
"We are proud of the consistency you find across all games. You have one friends list, every game supports voice chat, etc. But we also like to give game designers room to come up with new and interesting ways to entertain. That trade-off was at the heart of the original decision we made to not give any indication that a new achievement had been awarded. Some people argued that gamers wouldn't want toast popping up in the heat of battle and that game designers would want to use their own visual style to present achievements. Others argued for consistency and for reducing the work required of game developers. In the end we added the notification popup and its happy beep, which turned out to be the right decision, but for a long time it was anything but obvious."

... a small one. Here's what our policy to prevent piracy would have been:

Please don't pirate stuff too much. If we get notices saying that you're pirating stuff and asking you to quit, we'll call you in to the office and give them to you. If we get court orders telling us to give them your name, we'll probably have to do that, since we can't afford lawyers much.

If you really have to pirate stuff, please at least try to leech it off of your friends on the LAN rather than flooding our dinky little Internet uplink. Because if you do that, we'll probably end up blocking your IP address for a while so that email and our Debian updates can get in.

And while you're at it, here's the address of the porn server that some freshman set up. Get your porn over there, please don't mirror all of abbywinters.com over our connection.

I currently work as a Linux sysadmin for a large Web company. Previously, I've been the chief information security technician for a well-known research institution, and the only sysadmin for a small liberal arts college.