
Monday, May 31, 2004

Bob Woods, chairman of the Industry Advisory Council and president of Topside Consulting Group, will speak on enterprise content management, "Are We Ready to Break the Code?", at the NCC AIIM June social at the International Spy Museum.

The idea behind rolling wave planning is that you can't know everything about the project in advance, so don't bother trying to plan it all in detail. Plan the next few weeks in detail, always staying about 3-4 weeks ahead of the project. Of course, if you know you have hard dates like end of quarter or a trade show, put those events in the schedule. But rolling wave planning is much more likely than a detailed up-front plan to help you achieve those hard dates.

I incorporate adaptive planning into my rolling waves, by using the knowledge I've gained about the project to (re)organize the work as necessary.

This Focused Performance Weblog is a "business management blog" containing links and commentary related primarily to organizational effectiveness with a "Theory of Constraints" perspective. TOC is noted for its applications in Project and Multi-Project Management (Critical Chain) and Operations Management (Drum-Buffer-Rope), as well as in Marketing, Strategic Planning and Change Management (TOC Thinking Processes).

NEW YORK -- After a strong Q1, accompanied by growth in April, it seems that the newspaper industry is finally on the receiving end of a recovery. Increased gains in advertising revenues coupled with bright employment numbers -- the last true holdout -- buoyed many companies this year. So far, so good. And Miles Groves, president of MG Strategic Research, expects that Q2 will keep humming along as well.

Sunday, May 30, 2004

This post by Zef Hemel explains the difference between open source and free software. Technoflak is still learning about this, and hesitates to weigh in with an opinion.

However, it is worth remembering that the GPL came out of the free software movement and its philosophy that living in freedom means using free software. Technoflak believes that when you agree to a license, you should adhere to its terms. If that means paying money and restricting your use, so be it. If that means not charging for any product you build using products covered by the GPL, then that is what you have agreed to. The whole structure of the free enterprise system will come to pieces if we all decide to make up the rules as we go along.

Saturday, May 29, 2004

Web services technology is also effective in extending the value of legacy data that may have originally been built for a specific function, not necessarily to share data.

“We never start with a clean piece of paper,” says Wayne Beekman, co-founder of Information Concepts, which builds large-scale custom database applications. “Although we build our applications from scratch, we get most of the data from legacy databases.” Organizations may also build new Web services applications on top of existing applications. For example, Information Concepts worked with an association that wanted to create a Web services framework on top of its legacy applications.

“The original applications can remain untouched,” Beekman continues, “and the Web services framework can be changed as needed.” On security, Beekman points out that any application can be built in a secure or insecure way, and that the developers have options to make sure it’s done correctly. He emphasizes that the same holds true for Web services applications.

Friday, May 28, 2004

I am consistently amazed at how often some flacks make the most obvious mistakes. From the article:

2. Again, no attachments: What I find absolutely amazing is that people send out emails with gigantic file attachments to unsuspecting recipients. I have broadband at home, but I sometimes find it difficult to get a broadband connection on the road, and I can't tell you how many times a stupid email attachment has caused me delays. There is no reason why something in the form of a Word document can't be cut and pasted into the body of an email. If it's something that is graphically intensive, put it on a website and provide a link. ...

9. Contact, please: I found at least one hundred emails where people did not put their contact information. I also found some strange occurrences where people did not put an area code for their phone number. ...

16. Format in plain text: Inserting your company logo or using graphics in the body of an email can make the email cumbersome. These images may also not appear properly in some email programs, leaving the email completely unreadable.

It is disheartening to read columns like that. Our clients pay us to know about things like that so they don't have to.
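The plain-text advice above is easy to honor with any mail library. Here is a minimal sketch using Python's standard email package; the addresses, URL, and contact details are placeholders, not real ones:

```python
from email.message import EmailMessage

# Build a plain-text pitch: no attachments, a link instead of an
# embedded graphic, and full contact details in the body.
msg = EmailMessage()
msg["Subject"] = "Product launch: press kit available online"
msg["From"] = "flack@example.com"
msg["To"] = "reporter@example.com"
msg.set_content(
    "Hi,\n\n"
    "Our press kit, including all images, is at:\n"
    "https://example.com/presskit\n\n"
    "Jane Doe, Example PR\n"
    "+1 202-555-0100\n"  # contact info with area code, per point 9
)

print(msg.get_content_type())  # plain text, no multipart payload
```

Graphics live behind the link; the message itself stays small and readable in any client.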

Blogosphere has given new meaning to the old joke about freedom of the press belonging to those who own the press. Now every yahoo with a modem is a publisher, and that opens new opportunities in damage control and countering negative press.

Thursday, May 27, 2004

When a hurricane makes landfall, the area hit by the eye wall suffers the most damage.

Jim Horton is currently chronicling his client's crisis. I have heard presentations on crisis communications, but they have all been case studies. Watching a crisis unfold and reading about its effects on those on the receiving end gives a different, and sobering, view.

Tuesday, May 25, 2004

Last Tuesday, I went to the monthly meeting of the Federal XML Work Group. This is a small part of the effort to establish a single federal enterprise IT architecture standard. From the website:

Extensible Markup Language (XML) embodies the potential to alleviate many of the interoperability problems associated with the sharing of documents and data. Realizing the potential requires cooperation not only within but also across organizations. Our purpose is to facilitate the efficient and effective use of XML through cooperative efforts among government agencies, including partnerships with commercial and industrial organizations. Contributions are welcome and encouraged!

The standard that evolves out of this process will enable cooperation not only among the assorted agencies of the federal government, but with state and municipal governments as well. Given the collective buying power of government agencies, it will become the de facto industry standard. Technoflak has been deeply impressed by the level of commitment of all the participants to developing a standard that will have lasting and practical value for everyone.

The May 18 meeting was a combined meeting of the IRS XML Stakeholders group and the XML Working Group, hosted by the IRS. These groups have many of the same objectives and are working collaboratively. The XML Working Group is co-chaired by Owen Ambur (Interior); the IRS XML Stakeholders group is chaired by Sol Safran (IRS).

Arriving late, Technoflak missed most of Judy Newton’s presentation on developing a metadata registry with a focus on ISO 11179. One thing I did catch was her concern that the software tools that automatically generate XML schemas, say from SQL, do not document the source of the data. One way to ameliorate this problem is with a link to an ISO 11179-based metadata registry. Newton encouraged participants to visit the Joint Technical Committee’s website.

Steve Wasko, of IRS public portals, spoke about their public portal strategy for metadata and taxonomy. The IRS and Mitre are working to develop tax administration terms and a companion thesaurus. The goal is to deliver better search results for taxpayers using the IRS website. Mitre consultants have spoken with fifteen IRS business units and sixty IRS staff members to develop a list of 800 terms. Their Office of Chief Counsel had used a uniform issue list of 16,000 terms, but agreed to use the new list of 800 terms. Technoflak could only marvel at the triumph of bureaucratic cooperation.

Wasko said there were problems integrating the metadata and new taxonomy with the existing content management system.

Claude Matthews, of IRS web services, spoke about the use of Metasoft to organize eLearning. He described the convergence between the IRS employee support systems, the learning & education systems and the media & publications systems.

He described the work of the Enterprise Data Management Office (EDMO), which involves procedures (not correspondence). It is important to keep in mind the different needs of the taxpayer, tax examiner and security. Matthews encouraged participants to write technical procedures in XML, making it possible to display information in different ways to different users.
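Matthews's point about single-source XML procedures can be sketched in a few lines. The element names and content below are purely illustrative, not the IRS's actual schema:

```python
import xml.etree.ElementTree as ET

# One procedure document, authored once, tagged by audience.
# (Hypothetical fragment -- not a real IRS schema.)
PROCEDURE = """
<procedure id="refund-check">
  <title>Checking Refund Status</title>
  <step audience="taxpayer">Visit the refund status page online.</step>
  <step audience="examiner">Verify the return posted to the master file.</step>
  <step audience="taxpayer">Have your SSN and exact refund amount ready.</step>
</procedure>
"""

def render(xml_text, audience):
    """Return the title and only the steps tagged for one audience."""
    root = ET.fromstring(xml_text)
    title = root.findtext("title")
    steps = [s.text for s in root.findall("step")
             if s.get("audience") == audience]
    return title, steps

title, taxpayer_steps = render(PROCEDURE, "taxpayer")
print(title)
for s in taxpayer_steps:
    print("-", s)
```

The same source document yields a taxpayer view and an examiner view without duplicating content, which is exactly the updating problem single-source tagging is meant to solve.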

He characterized Metasoft as a developer’s tool and explained that InfoPath is used to develop templates, because it facilitates XML tagging. Metasoft is used by InfoPath as a trusted source.

Matthews said that the Internal Revenue Manual should drive all IRS content. The current topic map is in simple English; the internal system is more technical. Successful system integration may require adoption of a new XML structure (OASIS? LegalXML?). Namespaces will consolidate many “stove-piped” areas.

Currently, content is driven by Business Operating Divisions and process owners, leading to duplication of effort and an updating nightmare. In the future, content will be driven by the content data model, MITS via BSMO/EDMO, an easily updatable trusted source. Matthews said a collision is imminent. Technoflak agrees.

Manny Tayas of Systinet spoke about UDDI best practices. From his first slide:

“The Universal Description, Discovery and Integration (UDDI) protocol is one of the major building blocks required for successful Service Oriented Architectures.”

“UDDI creates a standard, interoperable platform that enables organizations and applications to quickly, easily, and dynamically find and use shared services over standard internet protocols such as HTTP.”

“UDDI is a cross-industry effort driven by major platform and software providers, as well as marketplace operators and e-business leaders within the OASIS standards consortium.”

UDDI is standards-based and flexible, with a wide variety of publish/discover tools available. Tayas stressed that UDDI is not a repository but a description of services: basic contact information and identifiers for a company or service provider, categorization of web services using taxonomies, and technical information describing a web service. He encouraged participants to visit the UDDI website to get a clear understanding of its capabilities.

Modeling UDDI entities should start with thinking about what constitutes a businessEntity and a businessService within the organization. Once these are established, this information should be documented for others to consume, and governance policies should be put in place to enforce it. Tayas suggested the organization chart as an example of defining businessEntities. A lively discussion ensued on whether project level or security aspects might serve as a better basis for defining businessEntities.
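One way to picture the modeling exercise is as a tiny data model. This is an illustrative sketch only: the class names echo UDDI's businessEntity/businessService vocabulary, but this is ordinary Python, not a real UDDI client API, and the department and service names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class BusinessService:
    name: str
    description: str
    categories: list = field(default_factory=list)  # taxonomy keywords

@dataclass
class BusinessEntity:
    name: str      # e.g., a department lifted from the org chart
    contact: str
    services: list = field(default_factory=list)

# Deriving an entity from the organization chart, as Tayas suggested:
payroll = BusinessEntity(
    name="Payroll Department",
    contact="payroll@example.gov",
    services=[BusinessService(
        name="EmployeeLookup",
        description="Returns employee pay records",
        categories=["hr", "payroll"])],
)

def find_services(entities, category):
    """Discover services by taxonomy category -- UDDI's core use case."""
    return [s for e in entities for s in e.services
            if category in s.categories]

print([s.name for s in find_services([payroll], "payroll")])
```

Once every group publishes entries in this shape, discovery reduces to a category query, which is why Tayas put so much weight on agreeing on the taxonomy first.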

UDDI was developed as a general-purpose service registry and was not specifically designed for web services, though web services are the most logical type of service to publish in the registry and a good place to start.

Tayas explained the process of publishing from Web Services Description Language (WSDL) to UDDI. Using UDDI ensures interoperability with application vendors for discovering WSDL-based services and enables better query capabilities. He said taxonomies are the key to promoting reuse, and recommended establishing categorization schemes before deploying.

Tayas concluded with a discussion of process and procedural considerations. It is critical to standardize publications and establish publication guidelines.

Monday, May 24, 2004

Via Steve Rubel, the AD:TECH conference is being covered live by a group of bloggers. I think this will become part of event organizing in the near future, and not just technology events. I have already told my clients to encourage bloggers to attend their events with free passes and links to the official event web site.

Friday, May 21, 2004

Gary started the meeting off with an amusing anecdote about how people seem more likely to listen to you if you write a book about bad things (like breaking code) instead of positive ways of securing software (like his previous books). I was heartened to hear him talk about many of the same things I deal with on a daily basis: that security is an emergent property of the system and not a feature that is added by tossing in a username and password; that the developers of software (the builders) can do more to protect software than the operators; and that network security is only the first of dozens of steps to securing software and data.

He went on to state that commercial security is a reactive process when good software security needs to be proactive. Everybody chuckled at his remark that "crypto fairy dust" will not solve security problems: he was pointing out that most developers believe, wrongly, that just using SSL and cryptography will secure their software. It's so common to think that putting in a SecureID card instead of a normal password challenge/response is somehow "software security" (sure, it's part of the solution, but only a start).

He also elaborated on the fact that systems are, of course, only growing more complex, and that there is a correlation between the complexity of systems (measured in lines of code, function points, etc.) and the risk associated with them. If you have 1,000 lines of code, you will have fewer security problems than if you have 1 million. It's one of those "duh" moments, and Gary pointed out that if you can convince your customer to reduce the complexity of requirements or cut features altogether, you can create less complex systems with fewer security holes. He said something I hadn't thought about earlier: to help build more secure software, reduce features and simplify architecture, which will also help reduce cost. This means building more secure software can actually pay for itself. I'm not sure how many managers would buy the argument, but it's worth making for sure.

In the decades-long discussion of how to manage data, one branch of the conversation has always confused me. Since the late 1960s, the text community has grappled with the inherent contradiction that lies in all textual material: it must retain its rich, flowing nature in order to retain its meaning and coherence, but it must also be viewed as a collection of overlapping logical structures in order to be properly automated. This isn't much of a challenge for a reasonably well-educated human--the brain just works that way--but it has proven a real challenge for the computer industry.

I spent Tuesday at the monthly meeting of the federal XML work group and am working on an account of the meeting.

Wednesday, May 19, 2004

A WikiWikiWeb enables documents to be authored collectively in a simple markup language using a web browser. Because most wikis are web-based, the term "wiki" is usually sufficient. A single page in a wiki is referred to as a "wiki page", while the entire body of pages, which are usually highly interconnected, is called "the wiki".

"Wiki wiki" means "fast" in the Hawaiian language, and it is the speed of creating and updating pages that is one of the defining aspects of wiki technology. Generally, there is no prior review before modifications are accepted, and most wikis are open to the general public, or at least to everyone who has access to the wiki server. In fact, registration of a user account is often not required.
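Part of that speed comes from the classic wiki convention of linking pages automatically: any CamelCase "WikiWord" in the text becomes a link, and unknown words become an invitation to create the page. A minimal sketch of that convention (the URL scheme here is hypothetical):

```python
import re

# A WikiWord is two or more capitalized runs jammed together,
# e.g. FrontPage or RecentChanges.
WIKIWORD = re.compile(r"\b([A-Z][a-z]+(?:[A-Z][a-z]+)+)\b")

def linkify(text, existing_pages):
    """Turn WikiWords into links; unknown pages get a create link."""
    def repl(m):
        word = m.group(1)
        if word in existing_pages:
            return f'<a href="/wiki/{word}">{word}</a>'
        # Page doesn't exist yet: append a "?" edit link, wiki-style.
        return f'{word}<a href="/wiki/{word}?action=edit">?</a>'
    return WIKIWORD.sub(repl, text)

pages = {"FrontPage"}
html = linkify("See FrontPage and RecentChanges.", pages)
print(html)
```

No markup is needed to create a link, which is why new pages accumulate so quickly on a wiki.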

A writer's greatest obligation is to her readers. Think twice before you accept this offer from AiYo:

AiYo AdFavorites is for web publishers. With AiYo AdFavorites you can generate revenue by recommending products on your website or blog. With AdFavorites you can deliver very high quality content to your readers which are your recommendations and are NOT impersonal ads.

It is one thing to run ads, or have an affinity marketing link to a site such as Amazon, but I don't think it is right to recommend specific products to readers unless you make it clear you are being paid for your recommendations.

I went to the Sake Club last evening for a DC Bloggers meeting and met Tom Biro of The Media Drop. We had an entertaining conversation about technology, media, marketing, public relations and blogging.

The Sake Club is a nice restaurant, in the elegant minimalist style of the Japanese. I am more the Brickskeller type myself.

Biro is an interesting fellow, destined to make a big contribution to blogosphere.

What kind of company actually allows an unauthorized, unedited blog about one of its most sensitive projects? While transparency and openness behind the firewall may be a wonderful thing, I don't know how thrilled I'd be about ERP blogs if I were CIO.

Answer: the kind of company that understands that the risks of poor communication on a critical project are higher than the risks of information leaks.

Internal IT project web sites have been around for a while; I remember hearing about them at least a year ago at a DC SPIN presentation.

Russert: Thank you very much, sir. In February of 2003, you put your enormous personal reputation on the line before the United Nations and said that you had solid sources for the case against Saddam Hussein. It now appears that an agent called Curveball had misled the CIA by suggesting that Saddam had trucks and trains that were delivering biological and chemical weapons. How concerned are you that some of the information you shared with the world is now inaccurate and discredited?

Powell: I'm very concerned. When I made that presentation in February 2003, it was based on the best information that the Central Intelligence Agency made available to me. We studied it carefully; we looked at the sourcing in the case of the mobile trucks and trains. There was multiple sourcing for that. Unfortunately, that multiple sourcing over time has turned out to be not accurate. And so I'm deeply disappointed. But I'm also comfortable that at the time that I made the presentation, it reflected the collective judgment, the sound judgment of the intelligence community. But it turned out that the sourcing was inaccurate and wrong and in some cases, deliberately misleading. And for that, I am disappointed and I regret it.

Russert: Mr. Secretary, we thank you very much for joining us again and sharing your views with us today.

Powell: Thanks, Tim.

Russert: And that was an unedited interview with the secretary of state taped earlier this morning from Jordan. We appreciate Secretary Powell's willingness to overrule his press aide's attempt to abruptly cut off our discussion as I began to ask my final question.

Coming next, the view from the Senate with Republican John McCain and Democrat Joe Biden. Then, our Meet the Press Minute, with wartime Secretary of Defense Robert McNamara from 36 years ago. All coming up right here on Meet the Press.

Dan Gillmor alerts us to Hoder's post on the security hole in Blogrolling's application software.

I could write a self-serving post about the necessity of hiring a flack to track blogosphere for your company. But I think there is a larger lesson about the importance of security. If your software is not secure, it is not ready to ship. Public Relations should never be considered a substitute for quality assurance.

Tuesday, May 04, 2004

The last meeting of DC Spin featured two presentations. After Noopur Davis’s presentation, and a delicious buffet hosted by Booz Allen Hamilton, we heard from the grand guru of software process improvement, Watts Humphrey. There were almost 300 attendees and clearly Humphrey was the star attraction.

In the early days of computing, we could assume that users were friendly, interested in results, and willing to help. The Internet used to be like that, but no more. (Technoflak recalls a client saying “The Internet was a great idea, until people got a hold of it.” ) Humphrey said the situation is serious and getting worse every year. Many successful crimes have not been publicized because of corporate embarrassment, and as Humphrey put it, “We haven’t even got to the terrorists yet.”

The way we work now is to hold our nose and ship product and fix it after defects become known. Sometimes we even charge for fixing defects (hearty laughter from the audience). But obviously the current strategy is failing. It accepts the cost of the initial attack, is impractical for system administrators, costly to suppliers, and has unknown litigation exposures.

Humphrey repeated Davis’s lesson that defective software is not secure. Over 90% of software security incidents are due to attackers exploiting known security defect types. The top ten defect types account for 75% of all vulnerabilities. Unfortunately, most software professionals do not understand that vulnerabilities are defects, do not know how to recognize and fix security defects, and do not view quality as their responsibility.

Humphrey said that security is an emergent property of the system, that security fails from the components up.

Humphrey joked that “if it didn’t have to work, we could build it pretty quick,” but went on to say that it is surprising how many defects software can have and still work.

Humphrey discussed the limitations of testing at length, joking that as long as people use software the way you tested it, it will work. He described writing a simple program with fifty-seven lines of code and coming up with one hundred and sixty-eight test cases. Obviously it is not economical to exhaustively test large and complex software programs.
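The arithmetic behind Humphrey's point is easy to reproduce: test cases multiply combinatorially with input conditions. The conditions below are hypothetical, not taken from his fifty-seven-line program:

```python
from itertools import product

# A handful of independent input conditions for a small function.
# (Illustrative values only.)
conditions = {
    "account_type": ["checking", "savings", "credit"],
    "balance": ["negative", "zero", "positive"],
    "currency": ["USD", "EUR"],
    "user_role": ["guest", "member", "admin", "auditor"],
}

# Exhaustive testing means one case per combination.
cases = list(product(*conditions.values()))
print(len(cases))  # 3 * 3 * 2 * 4 = 72 combinations
```

Four small parameters already demand 72 cases; add a few more dimensions and exhaustive testing of even a modest program becomes hopeless, which is Humphrey's argument for preventing defects rather than testing them out.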

Humphrey compared software development and testing to road repair. (Insert Microsoft joke here.) Suppose you were to build a road system but only repair potholes on the most obvious routes. The roads would soon be impassable. Currently, Microsoft tests 1% of all test cases. (No, there was no derisive laughter here, possibly because the audience consisted of programmers and project managers.) Humphrey described the “universe of stress” and compared it to the much smaller “footprint of testing”. He pointed out that malicious hackers will always aim their attacks at the areas beyond the test footprint. Ideally, code should have no defects inside the test footprint.

Quality products are not built by mistake. We need development teams where all members know how to do quality work, recognize security defects, strive to find and fix every defect before testing, and consistently produce defect-free software. This requires properly trained and highly motivated development teams with supportive management.

Humphrey observed that a compiler will take whatever you give it and try to make code. In that sense, a compiler is “anti-quality.” Compilers will only find 50% of errors.

Humphrey described the best projects as an artful balance of conflicting forces. Project teams must consider business needs, technical capability, and customer desires. To build superior products, teams must understand the complete context for their projects. (One of Technoflak’s clients says that failure to understand business needs is the most common failing of developers.)

The best teams have skilled and motivated members who set aggressive but achievable (here Humphrey added deliverable) goals. Humphrey said that in his experience, management never overrules a team’s schedule once the team has drawn up a plan. It was not clear to Technoflak whether he was speaking in general or only of companies that have undergone Team Software Process training.

Humphrey described how altering the process of software development can improve quality. If A consistently produces better code than B, then have A teach B what A is doing. Not only will B’s code improve, so will A’s, because teaching his method will cause A to examine what he is doing and improve upon it. Humphrey pointed out that, “this is simply scientific method.”

Humphrey compared directing software professionals to herding cats. He said, “You must convince developers that these methods work.” As developers go through the Personal Software Process training, they gather data, and it is their own data on what they did that convinces them of the worth of the method. Only three measures are used: time, size, and defects. After PSP training, developers see a dramatic reduction in defects.

Humphrey described the quality challenge as “pretty severe.” We do not have data on what proportion of software defects are security vulnerabilities. Presumably, if vulnerabilities were sufficiently reduced, the response strategy of fixing defects after shipment would work. How many defects are too many? It depends on how much time a hacker has and the availability of the code: how exposed the code is, who is looking, how long they have to look, and the potential for serious damage.

We cannot achieve high quality by looking for defects, by “crawling through the code.” Simply “trying harder” will not achieve the needed quality or security levels. We need a quality management strategy that is as formal as is practical. Development teams must review and inspect all requirements, designs, and code. We need a strategy to record, track, and analyze every defect in order to fix the problem, fix likely similar problems, find similar problems, and upgrade the process to prevent future problems.

Humphrey said, “People think I am against testing; I am absolutely not against testing. Testing is the only way to know what works.” He went on to discuss the component quality profile: standard design time, code review time, compile quality, unit test quality, and standard design review time. Humphrey said, “Saying the problem is requirements is tantamount to saying it is the customer’s fault.” (Technoflak was struck by this observation because one of her clients is firmly of the view that failure to get proper requirements is the most common cause of project failure. My client puts the blame squarely on developers who do not understand business needs, fail to listen to clients, and thus are unable to develop proper requirements.)

Humphrey reminded the audience that our entire industry suffers from missed commitments, poor-quality, insecure products, and unhappy users. He described security as a quality problem and said that to fix the security problem, our top priority must be quality. To build quality software consistently, we must change the software-development process. We must also have skilled and motivated development teams who are dedicated to producing secure products.

A question period followed and the first question was about the importance of management support in the Team Software process. Humphrey said the Software Engineering Institute will not offer TSP without management training. Technoflak was deeply impressed by this requirement.

Humphrey went on to say that in the twenty-four teams they have studied, the return on investment was 27% in the first year and the internal rate of return averaged 78%.

Another questioner wanted to know why hardware is so much more reliable than software. Watts said we could learn from hardware quality assurance, and talked about the history of semiconductor manufacturing: when one plant reduced its rate of defective chips, all the others had to meet the same standard of quality to stay in business.

Technoflak asked about employee turnover and its impact on quality and security. Humphrey agreed that high turnover contributes to quality problems and reducing turnover is desirable.