April 30, 2005

I woke up at 8:30am, which is a rarity for me on a Saturday. Normally, Saturday is reserved for decompression and recuperation, which means that seeing my ass out of bed prior to noon is a special occasion.

However, I think the excitement of going to see "The Hitchhiker's Guide to the Galaxy" shorted out my sleep circuits...either that, or the ear infection I've been unsuccessfully fighting for the last week. One of the two...

So once my wife finally finished getting ready (a process which takes the better part of three hours), we walked to the train station and arrived just as our train left. We decided to walk to the next station, and caught a bus going the same way about halfway there.

We got to Cityplace Station eventually, I ran in to the Loew's Cineplex and picked up our two tickets, then we walked over to Whataburger for lunch. I was...less than impressed. I felt like I was eating a Whopper with half the taste.

New Personal Rule #1: Any place that my co-workers rave about shall now be viewed with abject skepticism.

We walk back over to the theater and end up as the first ones in our auditorium. This was notable, as this is the first movie we've gone to see since I arrived in Texas. It was also notable because I realized exactly how much Loew's hated us. While the auditorium was large enough for an opening-weekend show, the seats were literally the worst seats I've sat in since the last time I went to the discount theater in Kaysville, Utah.

New Personal Rule #2: Loew's sucks...go to AMC instead.

So I watch the movie with only a minor allergic reaction to the seeing eye dogs that the blind couple brought in with them, and I must say that while it doesn't hold a candle to the books, it is a well-written, well-acted piece in its own right. My wife, who hasn't read the books, loved it. I give it a 3 of 5 stars, she gives it a 4 of 5. If you go, stay through the end credits for a modified in-joke for people who read the books in the guise of a Guide entry.

We try to figure out what to do next, and I get a wild hair up my ass and decide to go to Mockingbird Station and explore a bit. We get to Mockingbird in a matter of moments, and I decide to treat my wife to a place that neither of us had been to...Cold Stone Creamery.

New Personal Rule #3: Cold Stone Creamery doesn't hold a candle to Nielsen's Frozen Custard back in Utah, but it's a good enough substitute for me.

We enjoy our desserts, wonder why a dessert restaurant wouldn't have any seating at all, and walk over to the Virgin Megastore. I finally see Darth Tater, and my wife nearly buys a babydoll tank that says, "I Slept With The Guitar Player." At that point, I think that we should drop by Lone Star Comics and see if they got my art back.

We hop on the bus, and when we get off at Mockingbird and Abrams, we split up. My wife goes to try to find a solution for her hair problem, and I go to Lone Star. They have some extremely cute yarn plushies of the Hitchhiker characters in, including a lovable Marvin plush. Nobody remembers me from 24-Hour Comics Day, but the moment my wife walks in, they say, "Hey, I remember you! Lego Spaghetti!" They give me my art back, and say that someone called me to tell me to come pick it up. News to me...

I pick up a copy of "The Settlers of Catan" from them so I can keep my current record of never leaving the store empty-handed. On my way out of the store, I see this lady fawning over the Marvin plush with her son there looking embarrassed. I make an off-hand remark about how much it looks like the Marvin yarn figure in the movie, and she goes off about how she saw it last night, how faithful it was to "Doug Adams'" vision, how it didn't "cheat" the books, and how she never envisioned Zaphod having a Texas accent. She thought it was a perfect dig at the current president.

So not only do I get to see the strangest movie in recent history and eat the lamest meat there is to be had in Texas, I also find that the fact that I'm balding from the chin back is not as memorable as my wife's choice of cooking containers and the first Texas Democrat I meet is a "Doug Adams" fan.

Oh, did I miss that? I have a growing section on my face that will not grow hair. It started in the center of my chin, and has now grown to an oval that's about 2" by 1 1/2". Evidently, I can't even go bald normally.

Finally, we head out to catch the bus back, and the bus arrives early, thereby ensuring we'll miss it.

New Personal Rule #4: Buses in Dallas are as early as my wife is late on average...

So anyway, I'm home now. Tomorrow, I'll start scanning in the comic. I'm going to put it on my domain in a dedicated section. I'm also going to transcribe the dialog in a screen-reader-friendly format, and put in "director's commentary" describing what went through my mind during the 12-hour, 33-minute experience. Hopefully, I'll have all 26 pages up by the time I go to bed on Sunday.

April 29, 2005

Of course, when you're $80 million off of your projections, I can see why people would be asking questions. The hype over upcoming products over the last two years, plus the complete and utter lack of shipping any of said products probably contributed to it.

He claims that the industry is in danger because "[t]here are four or five simple game categories and nothing really new or different."

He continues:

The categories are shooters, puzzles and mazes, adventure games, sports games, and simulations. That's it. Most of today's hottest games are combinations of two or three of these categories, with a storyline added to keep the players from being bored stiff. When my kids show me a game, I usually say that it's nothing but the same old running-jumping-kicking-shooting with a new background. They leave in a huff.

His other complaint is that some games today are too hard:

If [the never-ending quest for more realistic graphics] doesn't flatten the market, the never-ending need to satisfy the demanding full-time game-player should do it. Some of today's games are ridiculously hard to play—unless gaming is your so-called life—and so daunting to casual players that they will quickly reject them. Who needs to devote themselves to a game just to play it once in a while? I'll take Spider Solitaire instead.

By the same arguments, the movie industry should have tanked by now as well. Go into any video store and you'll see about 6-10 categories of movies (comedies, drama, horror, anime, etc.) and the art films that are so esoteric that only a hardcore movie fan can possibly fathom the inner meaning of the film, let alone stand to watch more than the first 20 minutes of it.

The video game industry is currently going through a transitional phase similar to the film industry's multiple transitions from silent films to "talkies" to color. Admittedly, we're on a much faster pace than the film industry was, but that doesn't mean it isn't happening.

The early days (1970-1984) were essentially the "silent era" for video games. Most games were developed either solo, or with small teams. A game development cycle could be measured in days or weeks, and innovation was driven by creating focused experiences. Developers spent their time trying to find a small, fun core and implement it in as small a space as possible to save on costs. Most original Atari 2600 cartridges were only able to store four kilobytes. Similarly, silent films were smaller, focused experiences. Most were less than 30 minutes in length, and focused on very specific experiences that could be expressed sans sound.

The next ten years (1984-1994) were the "talkie era" for video games. Games became longer and more in-depth, graphics improved from four color to 16-bit color (65,536 colors), audio went from blips and bloops to speech and symphonic scores. Many of the musical scores from this era are still remembered fondly today, be they from Super Mario Brothers or Final Fantasy. While there was still innovation during this era, most successful games refined and expanded on previous play mechanics. Similarly, "talkies" were longer, had better film quality, could tell deeper and more intricate stories and provide people with more memorable experiences. While there were still some innovative films, most films simply told new takes on older stories with more flair.

The last ten years (1994-2004) were the "color film era" for video games. With the launch of Sony's PlayStation in Japan in December 1994, the third dimension was open to us. There was a lot of effort put in place to bring the mechanics of old to the new world. A lot of companies could not make the transition, and faded into obscurity, bankruptcy or both. Some companies made games that challenged our perceptions of what could be done with this newfound power, and lots of them were either acquired (ION Storm/"Deus Ex", Bungie/"Halo," "Marathon," "Myth") or faded into the ether (Looking Glass Studios/"Thief," "System Shock"). Some innovators were coddled and supported by their new corporate masters, like Will Wright ("The Sims"), and as a result, created beaucoup dollars. Other innovators chafed under corporate control, and went off to do their own things with mixed results. Still others said, "Hey, you know what, we don't really have any gameplay, so we're just going to make something that looks beautiful." Again, "color films" did the same things.

Now we're entering the digital age. The old business models are being challenged by electronic distribution, teams that used to exist in a single room are now spread out over the earth, outsourcing of assets is becoming commonplace, and just like the movie industry, only a small percentage of titles are making back what it costs to make them. Things are going to be a little weird for a bit during the transition. In the last few years, we've graphically moved from being able to do "Tron"-style graphics in real-time to being able to do graphics that rival "Final Fantasy: The Spirits Within" in real-time. However, it has come at a cost. Up until recently, most of the work for graphics has been done on the CPU. Now, thanks to shaders, we can move that work to the graphics card and use those freed cycles for the innovation that this industry lives on.

I'm going to offer you an example: "Halo 2" vs. "Katamari Damacy." "Halo 2" is an evolution of the genre, a refinement if you will. Graphically polished, visually stunning, smart enemies, but the gameplay itself was a relatively minor step up from "Halo." "Katamari Damacy" is anything but visually stunning. To be honest, while the game does have a quirky surrealistic vibe going for it, the game is still damn ugly. However, because of the minimalistic approach to graphics, Namco was able to create one of the most innovative games in recent history.

Because the CPU is doing so much of our graphics work right now, we have to make that tradeoff (simple gameplay/beautiful graphics, or ugly as sin/amazing gameplay), similar to how model builders and special effects technicians had to make tradeoffs in films. With shaders becoming more and more complex and our graphics chipsets able to handle these shaders, we will soon be limited solely by our imaginations.

Do I know what the future holds for this industry? No. I wish I did, but I don't. But I can look back at history and try to apply it to the modern era. At every transition in the film industry, there were nay-sayers that thought that the current shift would doom the industry to failure. "Talkies won't make money." "Color is too expensive." "Digital effects look horrible." I can feel happy knowing that our industry now has its own short-sighted nay-sayer that will quickly be proven wrong.

April 27, 2005

Managed DirectX is fun and relatively easy for the basics...advanced functionality can be a nightmare when you have to bounce back and forth between the unmanaged and managed documentation in order to figure out parameter lists and supported flags...

This message goes out to Tom Miller at Microsoft. I'm a Visual Basic .NET developer, and while I don't like the fact that the documentation is C# only, I can handle it. What I can't handle is how incomplete the managed documentation is. If you get the documentation up to the same level as the unmanaged documentation, the complaints will decrease across the board.

I'm going to put this as gently as I can...Gator can suck my salty white ballsack.

My wife installed a "Butterflies" screen saver "sponsored by GAIN." (No, I'm not linking to it.) While it looked nice, she kept getting these "Install Completed" prompts when her screen saver would dismiss. The screen saver was downloading installers and advancing through the installers on my wife's behalf. The screen saver installed IE toolbars, ad redirectors, and spyware up the ying-yang. It took me the better part of two hours to clean her system.

Then there are the "wallpaper" places that demand that you use an ActiveX control to install the wallpaper...these guys are all crooks, every last one of them.

So my wife is now paranoid to install anything she finds on the web, even legitimate software, because of this experience...but that's not the scary part. We were walking by our apartment complex's office last night, and my wife noticed that the office manager's desktop and the office secretary's desktop were both running the Spyware Screensaver from Hell.

This worries me greatly. Should these guys get a wild hair up their asses, they'll have full access to rental records, credit card numbers, credit checks on prospective tenants, everything that our apartment complex has on us...and our office manager consented to it when she installed a screen saver.

So to the wonderful developers at Gator who market and create this joyous combination of software, I hope that large burly prison escapees with lots of sexually transmitted diseases find you, sodomize you violently, poke out an eye, and repeatedly violate the socket...all the while broadcasting it on a 1-900 phone sex line paid for using your credit cards and ordering fifteen of every product made available on the Home Shopping Network.

So I am going to start by agreeing with some parts of what Rob says. True, the MSDN subscription model has been completely different than other Microsoft subscription models. With MSDN, you buy Subscription X to get Product Y. With other Microsoft products, you buy Product X and you then buy a subscription to Product Y. The exception to that is MSN Explorer, which isn't a product as much as a front-end to your subscription. And true, most people have been undergoing price shock, and as such, the fact that lots of the Enterprise Architect functionality has fallen down into Professional has been missed by several people who have been in the debate.

I'm also going to admit that the public is currently working with limited information. Rob makes several references to an expanded MSDN subscription, but doesn't state what is expanded in it. His entire spiel seems to be that the MSDN subscriptions will no longer contain the development tools, which is fine given his previous logic. Let Software Assurance cover the development tools for those who qualify, let the rest of us pay for our upgrades at reduced prices like we always did, and just give the servers in the subscriptions.

Now for my list of issues, and believe me, this is a long one.

The base value of an MSDN Premium Subscription seems to be $2,299/year plus a one-time $500 setup charge. I get that from looking at your charts. Therefore, following your previous logic, each of the Team System SKU's should be able to retail for approximately $2,670, or $2,170 if you drop the Team Foundation CAL from the bundle. I got that figure by subtracting the price of the MSDN subscription from the first-year price. That price is very competitive with the current price of Enterprise Architect. It makes sense...you roll older functionality into the lower SKU's, and add new functionality to your high-end SKU. It's the Microsoft way.

So why can't I buy the tool separately? If you want to be treated like other Microsoft products, give us the same set of options. For the type of development that I do, the MSDN Premium subscription is a waste of money. The only subscription that's needed is either MSDN Operating Systems or MSDN Professional.

The other message that I keep getting is that while the new add-ons individually seem like a minor addition to the toolset, it's the Team Foundation integration that makes it all worthwhile. Well, Microsoft Office says the same thing. However, they have a different strategy. Their server integration ships in the box for an affordable price. If people want the collaboration and integration the Office Server System provides, then they pay for the servers. Sharepoint Portal Server retails for $5,619 with 5 CAL's, and the CAL's are $71 each.

You decided to low-ball the server and high-ball the CAL's, which is more than acceptable...but your message has been that if you aren't willing to pay Microsoft yearly, the Team System SKU Functionality is out of your league. Plus, you are also the only Microsoft server product to ship with less than 5 CAL's. Rick LaPlante said in the video that Team Foundation Server only comes with 2 CAL's.

If I buy Office, I get all of the functionality within Office except for the collaboration features which rely on the servers. The Team System SKU's functionality doesn't rely on the servers. Unit tests don't rely on the server. Code coverage doesn't rely on the server. While they can utilize the server, the server isn't necessary.

My other issue with your "messaging" is the Express SKU's. I hear tons of whining saying that "Oh, you can't compare Visual Studio to the Express products because the Express products are new!" Bullshit. I can compare the Express products to the single-language Visual Studio SKU's, and while you say that the Microsoft way is to drop functionality from higher-priced SKU's into the lower-priced SKU's, in the same breath, you killed lots of stuff in the Express SKU's. The Express SKU's are now getting a limited amount of documentation, a limited subset of the debugging abilities available, fewer code templates, web editing as a separate SKU, limited data support compared to the Standard SKU's, etc.

So, we look at your new pricing structure, and you are at least consistent in one aspect of all of this. Debugging is considered an extra SKU. Visual Studio 2005 Standard is the sum total of all the Express SKU's, with an extra $50 tossed in for the ability to actually debug your code in a reasonable amount of time and a full documentation library.

Finally, the last bit of your message, at least according to Rick LaPlante's video, is that this is an effort to recoup revenues. I fully understand your effort to monetize your market, but now we're seeing the Hyde to the Dr. Jekyll decision to make Microsoft's earnings more transparent. It used to be that Windows and Office subsidized everything. Because Windows and Office used Visual Studio and Visual Studio enriched their product lines, those products didn't mind subsidizing you. So fine. If you want to monetize your group, do what PSS does and charge the other groups to use your products/services. I can guarantee you'll be funded within minutes.

And Rick LaPlante is right. It is the retail market that's screaming. I should know...I buy retail. So where is my upgrade pricing from Enterprise Architect (Retail) to Team Developer? You pride yourself so much in saying that "nobody pays retail for Microsoft products" that you forget the many of us who do.

You said in the video you've got more messaging coming out in about a month, so you've got a month to think about it. Make retail happy. You know as well as I do that we're the only ones who will be vocal about pricing because if you're under these other miscellaneous pricing programs, you "technically" aren't allowed to talk about your pricing.

Here is a sneak preview of my "art," which I will post once I get it back from Lone Star Comics.

Lone Star is going to send in my submission, then send me back the originals. Now, for bed. I'm already breaking my "Don't Post At Midnight" rule. Oh, and thanks to everyone who was there drawing with me. You guys were fun.

April 20, 2005

First, the good. The art is simplistic, but recognizable. You can tell by looking at an object for a fraction of a second what it is. Variation is added through inconsequential means, such as changes in the color of balloons. The controls are extremely easy. The game remembers your highest score, and the game has a very stable Internet high score function. The sounds are extremely satisfying. Finally, the kitten physics, while simplistic, are solid and consistent.

Now, the bad. This game is a game in the same sense that a slot machine is a game. You feed in your input, and the kitten does follow a very solid physics model, but the world is never the same twice. You never know what will be coming at your kitten, so it doesn't matter if you enter the same input or not. Every time, something different will happen. When you're keeping score, that's a horrible design, because the player's input is incidental to the end result.

How can this one deficiency be corrected, but still allow for "infinite" worlds? Fairly easily, amazingly enough. At launch, generate a random seed and store that seed. Then at the beginning of each launch, reseed a predictable random number generator with that seed. When you post the high score, pass the seed, angle and power along with the score.

This does several things. First, it lets your players make informed decisions as to how to adjust their aim. Second, it gives your players a new game every time they launch it. Finally, it gives you a means of verifying their high score. If the physics are predictable and the random numbers are predictable based on the seed, you can easily feed those numbers into your own application and see what the end results are. If they match, the score is valid. If they don't, well, you know who to ban from your scoreboards.
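To make the scheme concrete, here's a sketch in Python. The game isn't mine, so the scoring model below is entirely invented for illustration; the point is just that a stored seed plus the player's inputs makes the whole run reproducible:

```python
import random

def play(seed, angle, power):
    """Replay one launch deterministically. The same seed always produces
    the same world, so the same (angle, power) always produces the same
    score. (The scoring model here is made up purely for illustration.)"""
    rng = random.Random(seed)          # predictable generator, reseeded per game
    score = 0
    for _ in range(100):               # world generation driven only by the seed
        obstacle = rng.random()
        score += int(obstacle * angle * power) % 50
    return score

# Client: generate a seed at launch, store it, play, then submit
# (seed, angle, power, score) along with the high score.
seed = random.randrange(2**32)
claimed_score = play(seed, angle=45, power=80)

# Server: re-run the same deterministic simulation with the submitted values.
# A match means the score is valid; a mismatch means someone is cheating.
assert play(seed, angle=45, power=80) == claimed_score
```

The one requirement is that every source of randomness in the game goes through the seeded generator; a single stray call to an unseeded RNG breaks the replay.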

When your output log from your release build is over 32,000 lines and there are no warnings at all in there, and the executable itself takes over an hour to build on a dual 3.6GHz P4, chances are your codebase is just a wee bit out of control...

Of course, once the asset build process gets included in that, look out!

From looking at the filename, it's fairly obvious that the accompanying KB article is #894553. However, that KB article is nowhere to be found, at least as of this writing. Even trying a direct link to where it should be leads nowhere.

I write this because right now, my Media Center works fine. Aside from occasionally recording shows in higher quality than it should, it works okay. I don't want to install an update on a functioning machine without knowing what side effects are going to be there, if other fixes are rolled in with it, or exactly what the problem is that the update is supposed to cure.

The current description is vague to say the least. "The Windows XP Media Center Edition 2005 Create DVD Update enables increased reliability when creating a video DVD using recorded TV files (.dvr-ms)." So what was the problem that was causing the unreliability? Will the update affect any work I'm doing on Media Center plug-ins? Were API's changed?

So please, Microsoft, tell us. Exactly what is affected by this update?

April 19, 2005

I've been getting mixed feedback about my blogging about the VSTS pricing, and I think that I should outline my motivations.

For the last several years, Microsoft has been making a shift from being the "easy to use" company to being the "security" company. They've been doing everything they can to ensure that code that is released is of high quality.

Because of their efforts, we now have canaries on the stack, we have the .NET Framework which does its best to eliminate the need for the majority of developers to write native code, we have DEP and support for the NX bit...we've got a lot. However, the biggest security issue is that code is still running on people's machines. That will never change. The next focus should be to ensure that people's code is as bug-free and stable as possible.

So what does Microsoft do? The same thing that got them into this mess in the first place. They release powerful development tools priced to the masses with absolutely abysmal debugging support. If you want your code bug-free, you are expected to either struggle with the Express toolset for hours on end to get it done, or spend the $800 for Professional and get the tools you need. Want code coverage? That'll cost you an arm and a leg more.

What kind of message is that? "Sure, you can write software for our platform...but if you want to write *good* software, well, you're gonna have to pay for it."

In previous versions of Visual Studio, it was the code generation that was held back. Lesser versions of Visual Studio didn't get the optimizing C++ compiler or all of the code wizards, but the debugging tools were there for people to use. If someone had a lesser version, they'd just have to write more code because the IDE wouldn't make the code for them. If you wanted to use some advanced platform features, you'd have to buy the higher versions because the lower versions didn't come with redist rights for the DLL's.

Now, the lesser versions are being stripped of enough debugging tools that instead of motivating new developers, they're going to get frustrated. They won't have the tools they need to find their bugs quickly and easily. As a developer, how many bugs have you found just by stepping through your code with the Watch window open and seeing the values appear in red to indicate the change?

I'm all for bringing in more developers with simpler tools, but by limiting their ability to debug their code, you're fostering a community where bugs are acceptable simply because the pain threshold to find and fix the bugs is too high.

And fostering a community of developers who won't fix bugs is too high a risk for me to ignore.

April 18, 2005

Well, I just installed Visual Basic 2005 Express Edition Beta 2. I figured most people would be focusing on VSTS, so I thought I'd take a look at the carrot that Microsoft is dangling in front of our VB brethren...and I'm not happy.

First, I'm happy about more warnings being added to the language. However, it would help if the warnings would actually be correct...

More specifically, it's in the following block:

Throw New Exception
Dim MyList As New List(Of Integer)
MyList.Add(i)

True, MyList will never be instantiated because the code will never be reached, but at least make the error descriptive enough that your target market (novices/hobbyists) could find it...

A modal QuickWatch dialog? What happened to dragging variable names to the Watch or QuickWatch window? I can only watch one variable at a time? While it is true that the new "hover tips" are nice and they let you change variable values in place, you still have to move your cursor all over the place while stepping through in order to see what's happening.

A single project per solution unless you dig down into the options and specifically say "Always Show Solution"? Stripped-down documentation, but you still dedicate almost 40 help topics to obscure flags that might or might not be needed by the command-line compiler, plus C# code and style-sheet topics that should have been stripped out?

And what is a Handle clause? Did you mean "Handles clause"? I mean, when the topic is essentially copy/paste from the previous Visual Studio, I'd at least expect the name of the keyword to be right...

It seems that the motto of the Express products has been not how can we make things easier for new developers, but how can we inconvenience experienced developers and blame it on simplifying things? "We don't want the experienced guys only spending $50...no, we want them popping the extra $750!"

For example, easy would be Control-I for Insert Snippet, not Control-R, X. What novice wants to learn a double-chord for a common task for beginners? At least Edit & Continue works in VB Express...

I know that this is beta software, but you want people to enthusiastically pitch the Express SKU's to people who want to learn development. Right now, I'm not seeing anything that would make me be enthusiastic about your Express SKU. True, I am an experienced developer spoiled by Enterprise Architect, but this just doesn't seem well thought out.

10. Dates turned off by answer of "If I told you, I'd have to kill you."
9. Constant doubt wondering whether answering "What's up?" would have you fired.
8. Telling Regis you can't give him your final answer without violating a contract.
7. Two words: Punitive Damages.
6. When you start mentioning a word beginning with X and Guido pre-emptively kneecaps you to prevent you from squealing.
5. "What did you do at your last job?" "Sorry, I can't tell you that."
4. Embarrassing when fellow convict makes you sign one in front of witnesses before violating you.
3. Constant barrage of "I can't tell you" really destroys mood at dinner party.
2. You know the answer to the question on everyone's lips, and you are forced to just look dumb.

Heaven forbid that we allow our college athletes to learn a skill that they can use to make some money after they blow out their knees after one game. Lord knows that we don't want our athletes to be able to do anything except play their game, burn out quickly, and end up selling Ford trucks to make ends meet...

As is expected, i leaves scope as soon as the loop ends. j also leaves scope as soon as the loop ends. So what is the argument behind this behavior from the Visual Basic 2005 Language Enhancements chat? "This behaves the way that Visual Basic has always behaved."

I thought that there were two purposes behind the move from Visual Basic 6 to Visual Basic .NET. The first was fewer bugs through things like guaranteed zeroing of variables, improved object management, and guaranteed object disposal due to the garbage collector. The second was that there were going to be breaking changes in the language, so it was best to make all of them all at once.

I love Visual Basic because the language is essentially a common-sense language. You can look at a piece of code and know what it does. The code above looks like it makes a new j variable on each loop, but it doesn't.

As far as creating temporary variables inside of a loop, there are times when that makes sense. While performance may be increased by moving the allocation outside of the loop, there are also times where it is desirable to create variables at the beginning of the scope that they will be used in. That's why we got the For i As [Type] construct.

Fact is, I don't care what you do under the hood. If you allocate the memory ahead of time, that's fine...but the emitted code should function in a common-sense way. If a developer looks at the code and says that it appears like a new j is being created each loop, the end result should act like a new j is being created. That could mean you create and destroy the j, or it could mean you create j ahead of time, and just emit an MSIL instruction to zero it out at the point where the declaration is made.

You're already recognizing when it leaves scope...you may as well do the same for when it enters scope.
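I can't demonstrate the contrast inline in VB itself, so here it is mocked up in Python; the function names are mine, and the "hoisted" version mirrors what the VB compiler actually emits for a Dim inside a For loop:

```python
def vb_like():
    """What VB.NET emits today: j is allocated once for the whole method,
    so its value carries over from one iteration to the next."""
    results = []
    j = 0                      # declared once, never re-zeroed
    for i in range(3):
        j += i
        results.append(j)
    return results             # j accumulates: [0, 1, 3]

def common_sense():
    """What the code *looks* like it does: a fresh j at the declaration
    point on every pass through the loop."""
    results = []
    for i in range(3):
        j = 0                  # re-initialized each iteration
        j += i
        results.append(j)
    return results             # each pass starts clean: [0, 1, 2]
```

Either implementation strategy is fine under the hood; the complaint is only that the observable behavior should match the second function, not the first.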

April 12, 2005

Well, I think I finally figured out where the disconnect is on the Visual Studio Team System pricing fiasco. Microsoft has three compelling products, each at a compelling price, that they're trying to bundle into a package that is less than compelling.

Visual Studio 2005 Professional is compelling. We get a better IDE, faster compilers, more language features, and a better price. $799 is a wonderful price point for what you are getting.

The Team Foundation Server is compelling by itself. For about $2,700, you get the server product, which has bug tracking, source control, project management tools, work item tracking, the works. That's a one-time cost. Even assuming that this is the only server in Microsoft's server line that doesn't come with 5 CAL's to start, that's a good deal. Pay $500 for the CAL, and those features plug into Visual Studio 2005 Professional, according to the gentleman I spoke with last night.

So, my client cost is $1,299. Team Suite's public price is $10,939. Sure, you can try to dicker down to a lower price should your company buy in bulk, but this is your baseline public message, and as such, it is what I must work with.

Microsoft has to show that the extra that you get for your $9,640 is worth it for people who buy their tools.

Now, for MSDN subscribers, the debate is even more cut and dried. An MSDN Premium subscription is $2,499 for one year, and is an excellent price for what you get. At that price, you get Visual Studio 2005 Professional, Microsoft's other language products (FoxPro, MASM) and test/development licenses for almost every single product that Microsoft makes. You also get Visio, Project and Office licensed for production use.

So, you've got the $2,499 for MSDN Premium and $500 for the Team Foundation CAL. That's $2,999. Microsoft now has to make the case that these relatively simple features integrated into Visual Studio are worth a difference of nearly $8,000 per seat. Even looking at the Team System for Developer SKU, which is $5,469, that's a difference of $2,470. So let's treat $2,470 as the difference.

Architect is the easiest one to make a case for, because that ends up being a savings of $25 off the listed price for Rational Rose.

Developer isn't as easy. Compuware's DevPartner Studio is a known quantity that fulfills 99% of the difference, but DevPartner Studio is still nearly $200 cheaper than Team Developer. Plus, you're "tossing in" PREfast, a tool that's been in use at Microsoft since before I started there and has paid for itself many times over. When people accused Microsoft of having secret APIs, and that's why your programs were so much faster/better/able to mash potatoes, they should have been shown this. You could have avoided the court rulings.

So to sum up, Team Foundation is compelling, MSDN is compelling, and Visual Studio 2005 Professional is compelling. Your combo SKUs, however, are atrocious. There's still time to fix the message and fix the problem, but it's going to require a concerted effort on your part. Remember, I'm in test. I'm used to delivering bad news, and I'm used to continually delivering the bad news until it's fixed or it's shipped.

Remember how we were all told that the .NET Framework guaranteed that our declared variables would be set to their default values on declaration? Well, there is at least one instance where that won't happen.

Let's look at the following code:

Sub Main()
    For i As Integer = 1 To 10
        Dim j As Integer
        j += 1
        Console.WriteLine(j)
    Next
End Sub

The variable j leaves scope at the end of each iteration, but this code does not print 10 lines containing "1". It counts from 1 to 10.

Why? To save the time of reallocating and reinitializing the variable each loop, the Visual Basic compiler switches the code around a bit so you get this:

Sub Main()
    Dim j As Integer

    For i As Integer = 1 To 10
        j += 1
        Console.WriteLine(j)
    Next
End Sub

The solution is that any variable that you declare inside of your loop needs to be explicitly initialized, like so:

Sub Main()
    For i As Integer = 1 To 10
        Dim j As Integer = 0
        j += 1
        Console.WriteLine(j)
    Next
End Sub

The above snippet will work as expected: 10 lines, each containing a "1".

Now that being said, where is my disagreement? Simple. I don't mind that it's the lifecycle management tools that are being priced out of my reach. What I mind is the features that they chose to tie to the lifecycle management tools to try to "encourage" me to reach beyond my means.

For example, there is no way for me as a Visual Studio 2005 Professional developer to get the Code Profiler, the Code Analyzers, the Unit Testing or the Code Coverage modules without upgrading to Team System. Those items have either no ties or superficial ties to Team Foundation Server, and should be used by nearly all developers.

We're at a point right now where we as developers need every tool that we can get our hands on to ensure that our products are shipping as bug-free as possible. Steve McConnell said in Code Complete (p.615) that "Robert Grady of Hewlett-Packard reports that testing done without measuring code coverage typically exercises only 55 percent of the code." People constantly complain that Windows runs too slowly, when in fact it's the program that they're running which runs like a dog.

For the last two years, Microsoft has been beating down our doors with reports of how Visual Studio 2005 was going to help us improve the quality of our software...but all of the tools that are going to improve that quality are outside the reach of the very people who could use them the most. Not every company has 100-man test teams per product. Most are lucky to have test teams in the single digits, if they have a test team at all.

I understand that Microsoft wants to make money off of the lifecycle-management stuff, but I refuse to believe that performant, bug-free, well-tested code is a luxury that can only be afforded by those who have the thousands to spend.

I'm a firm believer that in order to grow, you have to stretch not only beyond your abilities in your chosen field, but to try to expand beyond your chosen field. You never know what you might learn through these little experiments in other fields, but you always come away with a greater appreciation for what other people do.

With that in mind, I have decided to try to participate in 24 Hour Comics Day 2005. My plan is to head down to Lone Star Comics on Mockingbird this Saturday and register. Then I plan to show up at 11:00am on Saturday, April 23 with my art pad (sized so that my scanner at home can scan the result) and my Sharpies and do my best.

I'll be posting the results here to my blog, regardless of whether I succeed or fail. Now, I'm going to set expectations. I'm going to do at least 24 pages (more if the story requires it), black and white only. No shading, no color. I'll be hand-lettering the comic, but if my handwriting is illegible and time permits, I will wipe out the text and computer-letter it when I get home. For my randomizer, I will take a small dictionary, and when the event starts, I shall flip to three random words in said dictionary, write them down, and build my story around those words.

Will the story be good? I don't know...but I've found that some of my best work occurs when there is a random element introduced. Of course, usually the random element that is introduced is me, but that's beside the point.

So wish me luck!

By the way, I ordered from Lone Star Comics regularly when I was in Utah. I finally went into their Mockingbird store yesterday, and found a clean store, a helpful staff, and a massive drain on my pocketbook for years to come.

_alloca() allocates space on the stack, and is one of an optimizing developer's most powerful tools, but can be a nightmare if not handled with care.

"Why?" I hear you ask. "It's allocated on the stack, so that memory is freed once the function is over." Well, let's look at it this way.

You have a function that calls _alloca(), gets a pointer to that memory, and calls another function. That function incorrectly writes past the end of the allocated memory. Guess what? Your stack just got trashed.

You think it's not going to happen? Evidently, you haven't worked with C++ before. This kind of stuff does happen. That's one reason why I love managed code...you have to tell the compiler you're going to blow off your own foot and you have to tell your system that it has permissions to let you blow your foot off before you can remove your lower limbs.

Essentially, any function that will be called using a pointer to your _alloca()-allocated space as an argument requires in-depth testing. Make damn sure that there is no chance in Hell that those functions could ever do a buffer-overrun...because if they can, well, your code is owned.

[Updated to fix grammatical error and missing words. Note to self: Never post at midnight.]

Yesterday, I attended the MSDN Webcast on Visual Studio 2005 Team Test, and I got a bit clearer message about what the different team editions were. I left halfway through because you could tell that the presenter had absolutely no respect for testers, and that's not a message that you want to convey to your target audience.

First things first, Microsoft could have avoided a metric shit-ton of headaches had they released this chart when they released the pricing. When we saw the breakdowns, we were thinking that each SKU was only getting the limited amount of things that were mentioned and nothing more.

So, with Team Architect, the unique stuff you get is the Application, Infrastructure and Deployment modeling stuff. For Developer, it's the dynamic and static code analyzers and the profiler. For Test, it's the Manual and Load Testing modules and the Test Case Management.

Unit Testing and Code Coverage are shared between Test and Developer, and Class Modeling and Visio/UML Modeling are shared between Architect and Developer. So if you have to get a Team Edition, get Team Developer. You get the most bang for your buck.

Now, looking at the Team Edition breakdown and comparing it to competitive pricing, here's what I see. Team Architect is competitively priced. Team Architect is competing with the Rational product suite with the unique feature set that it has, so I will not debate that pricing. However, for v1 products, they had better be extremely good in order for this pricing to be valid.

Team Developer and Team Test, however, are still overpriced. The Load Testing system is essentially an enhanced version of ACT, the Unit Testing stuff competes with the *Test products from ParaSoft and is essentially NUnit with a GUI/Code Creation front end, and most of the Test Case Management features that a Test Manager or Test Lead would need are only available if you use Team Foundation.

On the Team Developer front, Microsoft is going to be competing head-on with VTune from Intel, and I don't think that a v1 product is going to be able to do it. As for the code analyzers, well, Microsoft has had PREfix and PREfast internally for years and they have paid for themselves repeatedly. Admittedly, they were a pain in the ass to get working, but once you got them working on your product, you would get marked size and speed improvements on your executables.

What's scary is that looking at Team Developer, aside from the Team Foundation Client, everything else that is there is what we would expect from a v.next software package from Microsoft. They'd say, "Hey, guess what, here's everything from last year's version, and a few extra features...all for the same price!" Now it's, "Hey, guess what, here's everything from last year's version, and a few extra features...and we're going to take you for over twice as much!"

April 4, 2005

Well, right now it looks like there were fewer "gotchas" moving from Visual Basic 6 to Visual Basic .NET than there are moving to REALbasic.

This is a "Top 14" list of things to keep in mind when converting your code from Visual Basic 6 to REALbasic.

1. Replace any & used for concatenation (bad programming practice anyway!); use + instead. Left("Honk", 1) & "ank" becomes Left("Honk", 1) + "ank"
2. Dim all variables on a single line to ensure YOU understand what they are supposed to do.
3. Use As Integer, not %, etc., in your code.
4. Be sure to remove DefXXX in each Form and Module to detect your data types.
5. Visual Basic Integers (16-bit) will become REALbasic Integers (32-bit long values) in REALbasic, so check the logic of your code.
6. Do not use Visual Basic currency or byte data types unless you really need them.
7. Use Visual Basic Chr/Asc as Byte replacement.
8. Use Double as Currency replacement.
9. Visual Basic Error Handling will get Rem'd out, but you can locate it and fix it.
10. Remove line numbers as they are not supported in REALbasic.
11. Goto supported, but only with string label (no numbers).
12. Move as much code as possible from forms into modules. (In general, VB Project Converter does a better job with modules, and you can actually open a module directly using REALbasic without using VB Project Converter at all, if you wish.)
13. GoSub not supported, so replace with functions/methods as required.
14. Database will require some re-work, but SQL logic is largely compatible.

Well, let's hit this list, shall we? Numbers 2, 3, 4 and 13 are the same in .NET. Number 1 is actually backwards for .NET: the ampersand (&) is more efficient than the plus (+) sign. If you use & on your strings, it simply concatenates them. If you use plus and one side is numeric, Visual Basic tries to convert the string side to a number, adds them, and converts the result back to a string. Sounds like a fairly major consequence to not point out, eh?

In Visual Basic .NET, you can use Bytes; in REALbasic, you need to use a MemoryBlock. You get Shorts in VB.NET for 16-bit integers; in REALbasic, you need to use MemoryBlocks or upgrade to a 32-bit integer.

However, I'm willing to let that slide given that REALbasic is designed as a cross-platform BASIC, but we'll see after a prolonged test. I plan to spend this entire weekend evaluating this language and comparing it to Visual Basic .NET 2003. Who will come out on top? Tune in next week and find out, or try it yourself at:

So I was reading this article in InformationWeek about Microsoft ending support for Visual Basic 6, and the article mentioned that Real Software Inc. was offering RealBasic 5.5 for free through April 15.

So, I went to their site, and I'll be damned if I can find out how to get the free version. I see "demo" everywhere, but no full version. Let's see if the power of the blogosphere points me to a better BASIC or if Real isn't all that "real."

I agree with damn near everything being said on the thread, but I also understand the necessity for copy protection. Partly due to piracy, PC game sales have been plummeting. The only games that haven't had their sales cannibalized are the ones that either have a) effective copy protection or b) online authentication.

PC games aren't the profit center that they once were. They used to be an attractive alternative to consoles because you didn't have the $8-10/unit platform fee and the $10k development kit costs, but now those per-unit costs seem mild given how low the piracy rate is on consoles in comparison.

Now several PC games end up being ports of their console brethren, usually handled by a one- or two-man team, in an effort to mildly expand the play base. Less than a man-year is invested in the port, and the port usually sells about 60,000 units at $40 a pop. Sure, that's $2.4 million coming in at retail, but the developer doesn't get that. The game gets sold to the wholesalers at about half of that, so the publishers are getting $1.2 million. The developer gets maybe 8% of that, so the developer is only getting $96,000. Take away the average developer's salary, and you might actually break into five-figure land on the ROI.

So believe me, I do understand the frustration that leads to the copy protection being installed. However, I draw the line where the copy protection stops me from doing legal things with my machines. I deal with a lot of ISO images. I use Microsoft's Virtual CD-ROM Control Panel to mount the ISO images. I don't like being told that I have to unload that driver before I can play a game. I write a lot of code. I don't like being told that my minimized copy of Visual Studio that's compiling in the background has to be closed for me to play a frickin' card game.

We're getting to the point where the copy protection side effects are so severe that the copy protection itself is the primary cause of customer dissatisfaction. Steam, while an effective copy protection system due to the authentication system, has been the subject of international lawsuits and concern about what would happen if Valve went under. Starforce, well, the less said, the better.

So we're to the point where some copy protection is needed, but drivers are considered too much. Let's assume that this is a negotiation table. The publishers are willing to give up driver-based copy protection, but they want something from you in return. What would you be willing to give up for that?

April 3, 2005

I get asked a lot about what I think is the most important ability for a tester to have. Some people are going to argue technical knowledge; others are going to argue logic; others still are going to argue negotiation skills. While all of those are important, there is one skill above all that is vital for survival in the software testing industry...the ability to control your emotions.

Note that I said "control," not "suppress." If you suppress your emotions, you're just going to make yourself sick. But if you can control your emotions and keep them from taking control, you will survive.

Why is controlling your emotional state so important for testers? Well, our entire job is essentially to find fault in the work of others. As a result, you will be called a liar, a hack, a worthless piece of human debris, a developer wannabe, unskilled, unwanted, a waste of money, a waste of flesh, a waste of your father's sperm, dead weight, and a pathetic excuse for a human being...and that's by the people who respect your opinion. There are others who will do much worse. It's not uncommon for a software tester to get in at least one fistfight during their career, unfortunately.

So you have to be able to control your emotions while you are dealing with non-testers, but you also need to be able to vent your emotions afterwards. I've left meetings and burst into tears. I've known testers who have left meetings and put holes in walls. Emotions run high in this industry, and the only people who will say otherwise are selling something.

Most testers try to suppress these emotional urges, but all that does is shorten their careers. You can only keep something bottled up for so long before it leaks out, either as an ulcer or a violent career-limiting move. The hardest job for a test manager is to identify when a tester is suppressing their emotions, pull them aside, and let them get it all out.

As an industry, we like to think that the process of software creation is clinical. We posit to outsiders that our architects design our software to exacting specifications, our developers build using the highest quality tools, and our testers are there to ensure that our software works. In fact, the "design" of most products amounts to scrawls on the back of a napkin, most developers cobble their work together with a hodge-podge of mismatched tools that barely work together, and testers are thrown at a project like hamsters being catapulted against a brick wall in hopes of breaking it down.

Our industry is as much chaos as science, and as such, emotions are a factor. If you ignore the emotional impact on your employees, you do so at your employees' and your customers' peril.

I always hate the first week after Daylight Saving Time strikes...it's hard enough sleeping normally thanks to my wife, little Miss All-Over-The-Bed, but going to sleep essentially an hour earlier so you can get up an hour earlier messes with your mind.

Okay, Robert Scoble restarts his link blog, and four of the posts that he puts up are mine. Nifty.

Of course, I have a lot to thank Scoble for. Thanks to his links on my main page last month, my traffic more than doubled from 2038 visits in February to 4260 visits in March. Of course, that doesn't include RSS feeds...because of the way that Blog*Spot works, I can't track that.

It's weird that Microsoft still listens to their ex-employees. Some of my top visitors are tide*.microsoft.com, the Microsoft proxy servers. Those accounted for about 1600 visits last month.

Fact of the matter is that I still support Microsoft even though I quit back on September 30, 2003. I quit in part because of my family, in part because of stress, and in large part due to a schmuck who is now working on an upcoming hardware add-on for the next Xbox from what I understand. But there is one other underlying reason behind my leaving.

Microsoft is full of individuals who want nothing more than to make their customers happy. They want to make products that will be enjoyed by people. They want as many people as possible to use their products. They go to great pains to ensure that their products work across the board on as large a hardware and software base as possible in as many ways as possible. It's the opposite of Linux, where most development is digital masturbation, solely for the pleasure of the developer.

For a long time, Microsoft the company had a vision in line with Microsoft the employees. Those were heady days. Microsoft took good care of its employees, slathering them with options that were worth something and provided them with anything they needed so that they could be more productive...kind of like Google is doing today.

Then, Judge Thomas Penfield Jackson struck, and the stock began its long slide. Fortunately for those who own Microsoft stock, the slide has ended and the price has evened out, but it's because Microsoft the company has changed its focus.

Microsoft is now focused on ensuring shareholder value, not on customer and employee care. The change started with the nickel-and-dime things. Office supplies were cut back. Employees don't get towel service anymore. Employees had to subsidize some of their prescription benefits. Investors saw the nickel-and-dime shit and were happy, so the stock price went up.

Microsoft made the product units more transparent in the earnings statements, showing that everything was losing money except for Windows and Office. Investors liked the transparency, so the stock price went up.

Now the management of the various business units is all about the money. They're all trying to get their units back in the black. Unfortunately, it's showing. The VSTS pricing debate is bad, but it's also symptomatic of the issues Microsoft faces as the company transitions to a shareholder focus instead of a customer focus.

I don't know how to close this, so I'm going to end by saying, "Be careful, Microsoft." Don't focus on the shareholders too much, or you'll end up going Enron on us. Don't whittle away at your employees too much...Google is in the neighborhood now. And finally, don't whittle away at your customers too much. We keep you in business.

About Me

Currently Sr. Software Engineer in Test at Netflix. Formerly Sr. Quality Engineer on Firefly at Amazon, QA Manager at Ritual Entertainment, Software Test Lead at Microsoft Game Studios, Director of IT for Meeting Professionals International.

Opinions expressed are my own and do not necessarily represent those of any current, former, or future employer.