I’m getting email about Comcast migrating MSNBC and CNN out of its expanded tier to a higher-priced tier, while keeping Fox News on the expanded tier, in a number of markets. If this is actually going on, I’m mightily curious.

Such shifts do not happen casually. They are generally the product of fairly intense negotiations between cable operators and programmers, and they require advance notice to viewers. This makes me extremely reluctant to impute a political motive here. If NBC and Time Warner (the owners of MSNBC and CNN, respectively) were being screwed against their will over a political agenda, I would have expected to hear about it in DC. What mainstream coverage there is suggests this is part of Comcast’s general digital upgrade, so we should expect to see all remaining channels migrated to the higher-priced tier eventually. While that will constitute a significant rate increase, it will put everyone back on equal footing. Besides, as the DC Circuit instructed us all last month, cable operators have no market power and cannot influence the programming market, whatever your personal experience to the contrary may be.

So if anyone has more info on this and would like to either comment below or talk to me, I’d love to hear about it.

We’ve created ordinary http URLs that teleport you to places in-world in Qwaq Forums. Being programmers, we could not resist the pun of calling them QRLs. The most common uses today are:

I was here – recording a history of where you were in a bookmark or some sort of audit trail

go there – even if working asynchronously, you can tell people where to go to explore more from a Web page, blog, or wiki

Most programs will recognize http://… and turn it into something clickable that starts your Web browser if it isn’t already running. Our QRLs produce a page that displays instructions, which is helpful if you don’t yet have the Forums client installed. But if it is installed, the page can automatically launch the client and place you directly at the designated location.
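To make the mechanics concrete, here is a minimal sketch of how a QRL might be decomposed into a launch target. The /org/forum/place path layout and the qwaqclient: custom scheme are my own illustrative assumptions, not Qwaq’s actual design.

```python
from urllib.parse import urlparse

def qrl_to_launch_target(qrl):
    """Turn an http QRL into the pieces a client launcher would need.

    The path layout (/org/forum/place) and the 'qwaqclient:' scheme are
    hypothetical, used only to illustrate the general technique.
    """
    path = urlparse(qrl).path
    org, forum, place = [s for s in path.split("/") if s][:3]
    return {
        "org": org,
        "forum": forum,
        "place": place,
        # A landing page could embed a link like this; if a client is
        # installed and registered for the scheme, following the link
        # launches it directly at the designated location.
        "launch_uri": f"qwaqclient://{org}/{forum}/{place}",
    }
```

The point of the two-step design is graceful degradation: the ordinary http URL always works in any browser, and the custom-scheme link only matters once the client is installed.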

How do we improve the breed of collaborative programming tools? Should we have spectator programming competitions on the Internet? (The people who like those things only watch for the crashes!)

I don’t think there’s a good commercial driver for improving programmer productivity(*), but spectator sports, and particularly racing, have been good drivers in other fields.

I think there’s a lot of relevance in the game-theory outcomes of nicely sized sprint programming problems: whether, say, Tit-for-Tat or Pavlov is the better algorithm for Free-Rider scenarios, or whether that changes for a mix of Free-Rider and Volunteer’s-Dilemma situations.
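For readers who haven’t met these strategies, here is a minimal iterated Prisoner’s Dilemma sketch comparing Tit-for-Tat with Pavlov (win-stay, lose-shift). The payoff matrix and round count are standard illustrative values, not taken from any particular tournament.

```python
C, D = "C", "D"
# Classic payoffs: mutual cooperation 3, mutual defection 1,
# sucker 0, temptation 5. (my_payoff, their_payoff) per outcome.
PAYOFF = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

def tit_for_tat(my_moves, their_moves):
    # Cooperate first, then copy the opponent's previous move.
    return their_moves[-1] if their_moves else C

def pavlov(my_moves, their_moves):
    # Win-stay, lose-shift: repeat the last move after a good payoff
    # (3 or 5), otherwise switch moves.
    if not my_moves:
        return C
    last_payoff = PAYOFF[(my_moves[-1], their_moves[-1])][0]
    if last_payoff >= 3:
        return my_moves[-1]
    return D if my_moves[-1] == C else C

def play(strategy_a, strategy_b, rounds=20):
    """Run an iterated game and return the two total scores."""
    a_moves, b_moves, a_score, b_score = [], [], 0, 0
    for _ in range(rounds):
        a = strategy_a(a_moves, b_moves)
        b = strategy_b(b_moves, a_moves)
        a_moves.append(a)
        b_moves.append(b)
        pa, pb = PAYOFF[(a, b)]
        a_score += pa
        b_score += pb
    return a_score, b_score
```

Against each other both strategies settle into mutual cooperation; the interesting divergence appears once noise or free-riding third parties enter, which is exactly the kind of mix a sprint competition could surface.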

I think most programmers and programming managers still have never really seen very dynamic languages and live debugging environments, and such competition would be a great way to show them off.

I think it would put nice stress on the collaborative environment. How many people can watch? Can they see everything such as keystrokes and mouse movement? Is that important? Can they easily see who is doing what? Can they see multiple players’ activity at once? Multiple teams? Can they record and have instant replay?

I was taught that science is all about managing complexity by creating abstractions over different domains. A common layman’s mistake is to anecdotally observe or hear that something is true at some level, somewhere, and assume that this fact or definition applies throughout every discussion. For example:
One hears that computers are “programmed in binary,” or that they “understand binary,” but in fact, programmers don’t write in binary. Programmers work at a higher level of abstraction than binary encoding.
One hears that computers use “digital circuits” that are simply “on” or “off,” but in fact the physics of each electronic component is continuously variable. Device physics is at a lower level of abstraction than digital electronics.
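The first point is easy to see in any high-level language. In Python, for instance, the programmer writes arithmetic as source text, and the interpreter, not the programmer, translates it down through bytecode toward machine-level binary:

```python
import dis

def add(a, b):
    return a + b

# The function above is what the programmer actually writes.
# Disassembling it exposes the bytecode the interpreter produced:
# one rung down the ladder of abstraction, and still well above
# the raw binary the hardware executes.
ops = [instr.opname for instr in dis.Bytecode(add)]
print(ops)
```

Each layer (source, bytecode, machine code, device physics) is a distinct abstraction, and statements true at one layer do not automatically carry over to the others.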

So, what’s a server and what’s peer-to-peer? It depends on what’s being discussed.

On the other hand, the decision potentially provides a substantial boost both to the FCC’s ancillary authority and to its leased access reform order, currently pending before the Sixth Circuit. While I find this rather cold and uncertain comfort at the moment, it’s the best I can do in the face of what has become an utter rout for LFAs and PEG programmers. God willing, a future FCC will conduct the inquiry into strengthening PEG programming that Commissioners Adelstein and Copps have repeatedly urged.

I had hoped to be able to tell all my friends at the National Conference on Media Reform in the beginning of June about the fantastic opportunity to put independent progressive programming, minority-oriented programming, and local programming on cable when the new rates and improved rules for cable leased access became effective June 1. Unfortunately, due to a decision by the Federal Court of Appeals for the Sixth Circuit granting the cable request for a stay pending resolution of the challenges to the rules, that won’t happen. While not a total loss (the Sixth Circuit rejected the NCTA’s motion to transfer the case to the D.C. Circuit) and not preventing programmers from trying to take advantage of leased access now, this is a serious bummer for a lot of reasons — not the least of which is the anticipated crowing by the cable guys (ah well, we all endure our share of professional hazards).

But mostly, I am disappointed that the cable operators will continue to withhold the real rates under the new formula. As part of the stay request to the FCC (and subsequently to the 6th Cir.), the cable operators had submitted affidavits claiming that under the leased access rate formula adopted by the Commission, the new rate would be FREE!!! and they would have to drop C-Span and any other programming you like as a result. Since the cable operators always claim that the impact of any regulation is that they will need to charge higher rates, drop C-Span, stop deploying broadband, etc., etc., I am not terribly inclined to believe them this time and had looked forward to either their releasing real rates or putting programmers on for free. But since cable operators uniformly refuse to make the new rates available before the new rules go into effect (another reason I disbelieve the “the rate will be zero” claim), and because they control all the information relevant to the rate calculation, I can’t actually prove they are blowing smoke. Now it looks like we will have to win the court case (which will likely take a year or more) before we find out the real leased access rates.

Mind you, leased access had already hit a few roadblocks, owing to the inexplicable delay in sending the rules to the Office of Management and Budget (OMB). Although the rules were approved in November ’07, released on February 1, 2008, and published in the Fed Reg on February 28, the order was not sent to OMB for the mandatory review under the Paperwork Reduction Act until April 28. I might almost think the cable folks in the Bureau were less than enthusiastic about supporting leased access reform. OTOH, since it also took the broadcast enhanced disclosure rules a few months to get to OMB, it may just be the natural slowness of the process. After all, by federal law, the carrier pigeons used to take the text in little scraps from FCC across town to OMB can fly no more than two flights a day.

But to return to the critical point, what does the court ruling mean for leased access reform and the hope that local programmers, progressive programmers, minority programmers and others could have an effective means of routing around the cable stranglehold on programming?

It is not practical to expect users to develop applications in Squeak. There is too much to learn. But neither is it practical to expect users to develop applications in Java or ANY OTHER COMPUTER LANGUAGE. There is no way that any community of professional developers could possibly keep up with the demand that we hope for unique applications, no matter what language they used, nor even how many developers were available. There are simply many more users — and user needs — than there are developers. As with scalability of load, we need another approach. The answer is the same: push the load to the edge of the network.

I had hoped to have a usable version of the components framework by now. Instead, I have a reasonably self-consistent set of scaffolding that illustrates a lot of the concepts. It isn’t at a critical mass of functionality, and it has a lot of bugs and missteps. I was sure that copy semantics, multiple views, and event handling were going to be hard, as would getting enough corners tacked down so that I could start to cut the cloth. But they turned out to be much harder than I imagined. Nonetheless, I’ve now got a stake in the ground as a starting point. Maybe now there’s enough ‘it’ there that I can next report, “made ‘it’ do such-and-such” or “added X to ‘it’.”

Below the fold is a diary/log of how I got to this point. (I originally called this a “bootstrapping” architecture, because components allow people to build their Croquet models from within Croquet itself.)