
Applied Research. Technical Communication, Volume 57, Number 4, November 2010.

Practitioner's Takeaway

- Data from May 2008 show that single sourcing and content management were slowly and steadily being adopted by technical communication work groups; however, these methods and tools were diverse, and no single kind of SS/CM method or tool seemed destined to become dominant.
- Single sourcing, both with and without content management, apparently had reached a critical mass of adopters, but content management without single sourcing had not.
- Microsoft Word and FrameMaker were respondents' most-used primary authoring tools, and more than three times as many respondents produced PDF files as produced content using the Extensible Markup Language or its predecessor, Standard Generalized Markup Language.

Abstract

Purpose: To gather reliable empirical data on (1) STC members' use of and attitudes toward single sourcing and/or content management (SS/CM) methods and tools; (2) factors perceived to be driving or impeding adoption of this technology; (3) transition experiences of adopting work groups; and (4) perceived impacts of SS/CM methods and tools on efficiency, usability, customer focus, and job stress.

Method: Cross-sectional sample survey of 1,000 STC members conducted in May 2008; multiple survey contacts by e-mail with a link to an online survey instrument.

Results: Of 276 respondents, half reported using SS/CM methods and tools. About 1 in 10 respondents reported experience with a failed implementation of SS/CM; half the SS/CM users reported significant downsides or tradeoffs. Perceived top drivers of SS/CM adoption were faster development, lower costs, regulatory and compliance pressures, and translation needs. About 1 in 9 respondents used the Darwin Information Typing Architecture (DITA). Large company size made use of SS/CM significantly more likely, and work groups using single sourcing with content management were significantly larger than work groups of other SS/CM subgroups and non-users of SS/CM.
Single sourcing without content management seems destined to achieve a larger proportion of adopters than single sourcing with content management, barring a technology breakthrough. Among all respondents, Microsoft Word and FrameMaker were the most-used primary authoring tools.

Conclusions: With regard to these methods and tools, STC members appear to be in the Early Majority phase of Everett M. Rogers's innovation adoption curve. Diffusion of these methods and tools appeared to have been steady in the five years prior to the survey, with no dramatic increase in the more recent pace of adoption.

Keywords: single sourcing, content management, methods and tools, technology transfer, survey methods

Single Sourcing and Content Management: A Survey of STC Members

David Dayton and Keith Hopper

Introduction

During the past decade, scores of authors from both academic and practitioner ranks of technical communication have written and talked about methods and tools associated with the terms single sourcing and content management. Despite the steady flow of information and opinions on these topics (see Appendix A for a brief annotated bibliography), we have not had hard data on how many practitioners use such methods and tools and what they think about them. To fill that gap, we conducted a probability sample survey of STC members in May 2008.

We begin our report by defining key terms. In Objectives and Methodology, we state what we set out to learn, explain how we designed, tested, and deployed the survey, and describe how we analyzed the data. We organize the Summary of Results with statements summing up the most noteworthy findings that we took from the data, which we report in abbreviated form.
In the Conclusions section, we recap and briefly discuss what the survey results tell us about STC members' use of single sourcing and content management.

Definitions Used in the Survey

Any discussion about single sourcing and content management should begin by defining those terms carefully. The terms are not synonymous, though often conflated, as anyone who researches these topics quickly discovers. Searching bibliographic databases or the Web using the term single sourcing, you may find case stories about single sourcing carried out using a content management system (e.g., Happonen & Purho, 2003; Petrie, 2007), but you may also find that a case is about an application or method that does not include a content management system (Welch & Beard, 2002). Likewise, results produced by the search term content management will list articles about a system that enables single sourcing (Hall, 2001; Pierce & Martin, 2004) as well as articles about a Web content management system lacking the functionality that would enable single-source publishing (McCarthy & Hart-Davidson, 2009; Pettit Jones, Mitchko, & Overcash, 2004). Indeed, many Web content management systems are designed in ways that make single sourcing impossible.

In our survey, we defined single sourcing by quoting a widely recognized authority on the topic (Ament, 2003). Kurt Ament defines single sourcing as

    a method for systematically re-using information [in which] you develop modular content in one source document or database, then assemble the content into different document formats for different audiences and purposes (p. 3).

We want to emphasize that true single sourcing does not include cutting and pasting content from the source to different outputs; single sourcing uses software so that different outputs can easily be published from a single source document or database.

If we were to repeat the survey, we would revise Ament's definition to "you develop modular content in one source document, Help project, or database." Widely used Help authoring tools such as Adobe RoboHelp and MadCap Flare enable single sourcing as Ament defines it, but their primary content repository is a project, which is neither a document nor, strictly speaking, a database. A Help project collects and stores all the files needed to publish content, which can be customized for different audiences and products and/or different outputs, such as Web help and manuals in PDF (Portable Document Format). Those who insist on absolute semantic precision with regard to this topic can expect to be frustrated for some time to come. The evolution of Help authoring applications like those mentioned (and others, no doubt) will even more thoroughly blur the distinction between single sourcing and content management.

We wanted our survey respondents to think of single sourcing as a method of information development distinct from content management systems, which we defined as a method-neutral technology:

    For the purposes of this survey, content management systems are applications that usually work over a computer network and have one or more databases at their core; they store content, as whole documents and/or as textual and graphical components; they mediate the workflow to collect, manage, and publish content with such functions as maintaining links among content sources and providing for revision control.
    They may be used in conjunction with single sourcing, but some types of content management systems are not compatible with single sourcing.

Before composing this definition, we reviewed the extended definitions of content management systems offered by Rockley (2001); Rockley, Kostur, and Manning (2002); and Doyle (2007). Our goal was to provide respondents with a distilled description leading them to focus on a networked information system and not on a general information management process. (See Clark [2008] for a discussion of process versus technology in defining content management, as well as descriptions of general types of content management systems.)

We use the following terms and abbreviations to refer to the three possible situations that apply to technical communication work groups with regard to the use of single sourcing and content management systems:

- Single sourcing without a content management system (SS)
- Single sourcing with a content management system (SSwCM)
- No single sourcing but use of a content management system (CM)

Note that we use SS/CM as shorthand for "SS and/or CM," that is, for the whole group of respondents who reported using SS only, CM only, or SSwCM. In reporting our results, we often compare the group composed of all SS/CM respondents with the group composed of all whose work groups did not use any SS/CM method or tool. Within the main group of interest, the users of SS/CM, we often break down the results for the three subgroups: SS only, CM only, and SSwCM.

A few additional definitions are needed because it is impractical to discuss this topic without them. Extensible Markup Language (XML) is an open, application-independent markup language frequently used in (though not required by) tools across the spectrum of SS/CM applications and systems. XML is becoming a universal markup language for information development and exchange.
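Ament's definition of single sourcing quoted earlier, modular content in one source assembled into different outputs for different audiences, can be made concrete with a minimal sketch. Everything in it (the topic names, audience labels, and output formats) is invented for illustration; it is not drawn from any tool discussed in the survey.

```python
# Minimal single-sourcing sketch: one XML source, two assembled outputs.
# Topic IDs, audience labels, and formats are hypothetical examples.
import xml.etree.ElementTree as ET

SOURCE = """
<topics>
  <topic id="install" audience="all">Install the app from the download page.</topic>
  <topic id="api" audience="developer">Call init() before any other API function.</topic>
  <topic id="gui" audience="end-user">Click Settings to change preferences.</topic>
</topics>
"""

def assemble(audience, fmt):
    """Pull the matching modular topics from the single source and render one output."""
    root = ET.fromstring(SOURCE)
    picked = [t.text for t in root.findall("topic")
              if t.get("audience") in ("all", audience)]
    if fmt == "html":
        return "\n".join(f"<p>{p}</p>" for p in picked)
    return "\n".join(picked)          # plain text, e.g. feeding a PDF pipeline

dev_help = assemble("developer", "html")      # online help for developers
user_manual = assemble("end-user", "text")    # manual for end users
```

Both outputs share the "install" topic; editing it once in the source updates every deliverable, which is exactly the systematic re-use (as opposed to cut-and-paste) that Ament's definition requires.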
Many times, people using XML-based tools are unaware of XML's role, as when one saves a document in Word 2007's default ".docx" format, which is a zip file containing XML components. Our survey included questions about the use of XML and its precursor, SGML (Standard Generalized Markup Language), as well as a question about three standards for implementing XML to develop and manage documentation: DocBook, Darwin Information Typing Architecture (DITA), and S1000D (a standard used in the aerospace and defense industries).

Objectives and Methodology

Our study had the following four objectives:

1. Produce a cross-sectional statistical profile of SS, CM, and SSwCM use by STC members.
2. Identify important factors perceived by STC members to be driving or impeding the adoption of SS, CM, and SSwCM methods and tools.
3. Gather data on the transition experiences of work groups after they adopted these methods and tools.
4. Learn whether and how these methods and tools are perceived by STC members using them to have impacts on efficiency, documentation usability, customer focus, and job stress.

Development of the Survey

The survey was the central element of a multimodal research proposal that Dayton submitted to the STC Research Grants Committee, a group of prominent academics and practitioners with many years of experience conducting and evaluating applied research projects. Dayton revised the first formal draft of the survey in response to suggestions from the committee, which recommended to the STC Board that the revised proposal receive funding.
The Board approved the funding in June 2007, and Dayton obtained approval for the study from the Institutional Review Board (IRB) for the Protection of Human Participants at Towson University in Maryland.

Based on several formal interviews and some informal conversations with technical communicators about single sourcing and content management methods and tools, Dayton revised the survey and solicited reviews of the new draft from three practitioners with expertise in the subject matter and from an academic with expertise in survey research. Dayton again revised the survey in response to those reviewers' suggestions. Hopper then converted the survey into an interactive Web-delivered questionnaire using Zoomerang (a copy of the survey that does not collect data may be explored freely at http://www.zoomerang.com/Survey/WEB22B38UWBJKZ).

Moving the survey from a page-based format to multi-screen Web forms proved challenging. Multiple branching points in the sequence of questions created five primary paths through the survey: no SS/CM, SS only, CM only, SSwCM, and academics. Respondents not using SS or CM were presented with 20 or 21 questions, depending on whether their work group had considered switching to SS/CM methods and tools. Respondents in the three subgroups of SS/CM were presented with 30 to 33 questions, depending on their answers to certain ones. The version of the survey for academics contained 24 questions, but we ultimately decided to leave academics out of the sampling frame for reasons explained later.

For all paths through the survey, question types included choose one, choose all that apply, and open-ended. All fixed-choice questions included a final answer choice of "Other, please specify" followed by a space for typing an open-ended answer.
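The five-path branching described above amounts to a simple routing rule on two screening answers plus respondent type. The function below is our own sketch of that structure, not code from the actual Zoomerang instrument; the path labels come from the text, the function itself is illustrative.

```python
# Sketch of the survey's five primary paths. Question counts per path
# (20-21, 30-33, or 24) are from the article; the routing logic is ours.
def survey_path(uses_ss, uses_cm, is_academic):
    """Map a respondent's situation to one of the five survey paths."""
    if is_academic:
        return "academics"        # separate 24-question version
    if uses_ss and uses_cm:
        return "SSwCM"            # 30-33 questions
    if uses_ss:
        return "SS only"          # 30-33 questions
    if uses_cm:
        return "CM only"          # 30-33 questions
    return "no SS/CM"             # shortest path: 20 or 21 questions
```

For example, `survey_path(True, True, False)` routes a practitioner using single sourcing with content management to the "SSwCM" path.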
The first complete draft of the Web-based survey was pilot tested by about 30 participants, including practitioners, graduate students, and academics. The reported times for completing the survey ranged from less than 8 minutes to 25 minutes. Testers who went through the path for academics and the path for those not using SS or CM reported the fastest completion times and offered the fewest suggestions. Testers answering the questions for those using SS/CM suggested some improvements in wording, formatting, and answer options; we agreed with most of these suggestions and made changes to address them.

Deployment of the Survey

The version of the survey for academics was entirely different from the four variations for practitioners. Following the pilot test, we reassessed the pros and cons of fielding two surveys at the same time. We were particularly concerned that the number of academic respondents would be quite small unless we drew a separate sample of only academic members. After the STC Marketing Manager assured us that academics could be filtered from the membership database before drawing a sample, we decided to limit the sampling frame to practitioners. (The sampling frame is the total population of people from whom the random sample is drawn.)

The sampling frame consisted of about 13,500 STC members, about 3,000 fewer than the total membership at that time (May 2008). In addition to excluding academics, students, and retirees, the STC Marketing Manager also excluded STC members who had opted not to receive messages from third-party vendors. From the sampling frame of about 13,500 members, the STC Marketing Manager drew a random sample of 1,000 using an automated function for that purpose available in the STC office's membership database application.

Over 11 days, the Marketing Manager e-mailed four messages that we composed to the members in the sample. The first e-mail went out on a Thursday: a brief message from STC President Linda Oestreich describing the survey and encouraging participation.
The second e-mail was sent the following Tuesday, signed by us, inviting recipients to take the survey and providing a link to the consent form. (Researchers working for federally funded institutions are required by law to obtain the informed consent of anyone asked to participate in a research study.) Respondents accessed the survey by clicking the link at the bottom of the consent form. (Appendix C contains copies of the two e-mails mentioned above and the consent form.)

The Internet server housing the survey was configured to prohibit multiple submissions from the same computer. When a respondent completed the survey by clicking the Submit button on the final screen, a confirmation page displayed our thank-you message and offered respondents the option of e-mailing the STC Marketing Manager to be taken off the list of those receiving reminder e-mails. In addition, respondents could check an option to receive an e-mail from STC after the survey had closed, giving them an early look at the results.

We received data from 117 respondents within 24 hours of sending out the first e-mail with a link to the survey. Based on the response-time data that we had obtained in previous online surveys, this level of initial response suggested that we were headed for a lower than anticipated response rate. Two days after our first e-mail with a link to the survey went out, the first reminder e-mail was sent, with a revised subject line and sender address. The initial two e-mails had been configured to have stc@stc.org as the sender, which we feared might be leading some recipients to delete the message reflexively or to filter it to a folder where they would not see it until it was too late to take the survey. We arranged with STC staff to have the reminder e-mails show the sender as david_dayton@stc.org, an alias account.
A second and final reminder was e-mailed the following Monday, 11 days after the advance-notice e-mail went out. The sequencing, timing, and wording of the four messages e-mailed to the 1,000 STC members in the sample were based on best practices for conducting Internet surveys (cf. especially Dillman, 2007). Because we did not have direct control over the sampling frame and the mass e-mails used to distribute the survey invitations, some aspects of the survey deployment did not meet best-practices standards; specifically, our e-mailed invitations lacked a personalized greeting and, for the first two e-mails, also contained impersonal sender-identifying information.

Response Rate

Two weeks after the first e-mail went out to STC members in the sample, the survey closed. We had received data from 276 practitioners who completed the survey. We will not report data from four other respondents who answered the version of the survey for academics, and we discarded partial data from 46 participants who abandoned the survey after starting to fill it out. Using the standard assumption that the 1,000 e-mailed survey invitations were all received by those in the sample, the response rate was 28%, slightly better than other recent STC surveys. (The last salary survey, which STC invited 10,000 members to take in 2005, had a response rate of 23%. A sample survey conducted by a consulting firm hired in 2007 to collect members' opinions about STC publications had a response rate of 22%.) Our survey's response rate of 28% may represent a source of bias in the survey results. We comment on this briefly toward the end of the Summary of Results and discuss it in some depth in Appendix B, where we review recent research and thinking about low response rates from the social science literature.

Data Analysis Methods

Data from submitted surveys were collected in a text file on the Zoomerang.com server and downloaded after the survey closed.
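A downloaded text export of the kind just described, one row of answers per respondent, can be tallied into frequency tables with a few lines of code. The sketch below is purely illustrative: the delimiter, the column name `ss_cm_use`, and the answer labels are our assumptions, not the survey's actual export format.

```python
# Sketch: turning a downloaded text export into a frequency table.
# Delimiter, column name, and answer labels are assumed for illustration.
import csv
import io
from collections import Counter

raw = io.StringIO(
    "respondent,ss_cm_use\n"
    "1,SS only\n"
    "2,No SS/CM\n"
    "3,SSwCM\n"
    "4,SS only\n"
)

# Count how often each answer occurs, then convert to whole-number percentages.
counts = Counter(row["ss_cm_use"] for row in csv.DictReader(raw))
total = sum(counts.values())
freq_table = {answer: round(100 * n / total) for answer, n in counts.items()}
```

With the four toy rows above, `freq_table` maps "SS only" to 50 and each of the other two answers to 25, the same kind of per-question frequency table the authors built in Excel.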
Microsoft Excel 2007 was used to create frequency tables and bar graphs to examine descriptive statistics for each survey question. Data from key variables were sorted by technology type (SS only, CM only, or SSwCM) and tested for significant differences or associations using statistical software to run the most appropriate procedures for the level of the data. Standard measures were used to calculate the strength of any statistically significant differences or associations (p ≤ .05). Please note that data are rounded to whole numbers using the round-half-to-even rule (round up when the preceding digit is odd, down when it is even) whenever the exact proportion produces a 5 after the decimal point: 18.5% is reported as 18%, while 19.5% is reported as 20%. As a result, the whole-number percentages for the same item will occasionally not add up to 100%. This standard rounding protocol avoids the slight upward bias that always rounding halves up would introduce across many reported percentages.

Summary of Results

In this section, we present a summary of the survey data organized under headings that highlight the most noteworthy results. Readers wishing to explore the survey data in more depth may visit the STC Live Learning Center (www.softconference.com/stc/), which has an audio recording and PowerPoint slide deck of our presentation at the STC 2009 Summit.

Four of Five Were Regular Employees; Half Worked in High Tech

The group profile of our 276 respondents in terms of employment status and industry seems typical of the STC membership before the current economic recession: 81% were regular employees; 18% were contractors, consultants, freelancers, or business owners; and 2% were unemployed. (These percentages exceed 100% because of rounding.)
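The round-half-to-even rule described under Data Analysis Methods is a standard rounding mode ("banker's rounding"), available in Python's `decimal` module as `ROUND_HALF_EVEN`, so the report's two worked examples can be checked directly:

```python
# Round-half-to-even ("banker's rounding"): on an exact .5, round toward
# the even neighbor. This is the rule the report applies to percentages.
from decimal import Decimal, ROUND_HALF_EVEN

def report_pct(p):
    """Round a percentage, given as a string, the way the survey report does."""
    return int(Decimal(p).quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))

examples = {p: report_pct(p) for p in ("18.5", "19.5", "20.5")}
# 18.5 -> 18, 19.5 -> 20, 20.5 -> 20
```

Passing the percentages as strings to `Decimal` avoids binary floating-point representation surprises; over many reported figures, halves round down about as often as up, which is why a set of rounded percentages can sum to slightly more or less than 100%.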
Respondents worked for a wide range of industries, though a little more than half worked in industries commonly clustered under the rubric "high technology": companies making or providing software, computer and networking hardware, software and IT services, and telecommunications products and services.

Slightly More Than Half Worked in Large Companies

We asked respondents to categorize the size of the company they worked at. Table 1 shows that the range of company sizes was weighted slightly (55%) toward companies with more than 500 employees, and the largest single category, at 25%, was 10,000 or more employees. (The Small Business Administration most often uses 500 employees as the maximum size for a company to qualify for its programs.) Table 1 includes U.S. Census Bureau data for the entire U.S. economy in 2004 as comparative data.

Table 1. Company Size Reported by Respondents Compared with 2004 U.S. Census Data

    Company size       % of 276 STC respondents    % of U.S. Census data, 2004*
    1 to 4                        7%                           5%
    5 to 9                        2%                           6%
    10 to 19                      2%                           7%
    20 to 99                     14%                          18%
    100 to 499                   21%                          15%
    500 to 999                    7%                           5%
    1,000 to 9,999               22%                          18%
    10,000 or more               25%                          26%

    * Source: Statistics about Business Size (including Small Business) from the U.S. Census Bureau, Table 2a, Employment Size of Employer and Nonemployer Firms, 2004. Accessed August 16, 2009, at http://www.census.gov/epcd/www/smallbus.html

Half Used SS Only, CM Only, or SS with CM; Half Used No SS/CM

Of the 276 respondents, 139 (50%) reported that they did not use SS/CM methods and tools, and 137 (50%) reported that they did (see Figure 1). In the SS/CM group, SSwCM users were the most numerous (55, or 20% of all respondents), followed by SS only (47, or 17%) and CM only (35, or 13%).

Figure 1. Use of SS/CM by 276 Survey Respondents

As Figure 2 shows, about two-thirds of SS/CM users reported that their work groups produced more than half their output using SS/CM methods and tools. One in five, however, reported that their work group used SS/CM to produce 25% or less of their output, a finding consistent with the data collected on the recentness of SS/CM adoption and the average time reported for reaching certain benchmarks for the proportion of total output produced with SS/CM methods and tools. (Those results are reported in subsequent tables and figures.)

Figure 2. Proportion of Total Information Product Output Using SS/CM

About 1 in 4 Used XML and/or SGML; About 1 in 9 Used DITA

All 276 respondents answered a question asking them to identify the types of information products their work groups produced. Seventy-six (28%) checked the answer "content developed using XML or SGML." Respondents using SS/CM (n = 137) were presented with another question asking them to indicate whether their work group used XML and/or SGML. Figure 3 graphs the results from that question, showing that about half the SS/CM respondents produced content using XML and/or SGML. Three out of four in that group of SS/CM users indicated that their work group's system used XML alone, while most of the others indicated a system using both XML and SGML.

Another question presented to SS/CM respondents asked them to indicate which, if any, documentation standard their work group used. About 2 of 3 SS/CM respondents (64%) reported that their work group used no standard. About 1 in 5 (21%) indicated that they used DITA, and one person used both DITA and DocBook. The 30 DITA-using respondents, then, were 11% of all survey respondents, or about 1 in 9.

About 1 in 10 Reported a Failed SS/CM Implementation

Twenty-four respondents (9% of N = 276) reported that they had been part of a work group whose attempt to implement an SS/CM system had failed. Seven indicated that a CM system was involved, and six wrote that it was the wrong tool for their work group, citing one or more reasons.
Three respondents indicated that an SS tool had failed, two saying that the SS tool had not performed to expectations and the third saying that lack of management support led to the failure of the project. Fourteen respondents did not specify which type of tool was involved in the failed project, and for this subgroup no single reason for the failure predominated. Poor fit, difficulty, and cost were the main reasons cited for the failed implementations.

Figure 3. Use of XML and SGML by 137 SS/CM Respondents

Almost Half the SS/CM Work Groups Had Used Their System for Two Years or Less

The survey asked those using SS/CM how long ago their work group had started using their current SS/CM system. Figure 4 shows that 45% of the SS/CM users' work groups had been using their SS/CM system for less than two years, and 24% had been using their system for less than a year. When asked how long the work group had researched options before deciding on its SS/CM system, 103 respondents provided an estimate in months. Setting aside an outlier (40 months), the range of answers was 0 to 24 months, with a median of 4, a mean of 6.04, and a standard deviation of 6.03 (see Table 2).

The survey also asked SS/CM users to estimate how long (in months) it took their work group to reach the point of producing 25% of their information products using their SS/CM system. Estimates (n = 97 valid) ranged from 0 to 28 months, with a median of 4 months, a mean of 6.4, and a standard deviation of 6.25. Of the 137 respondents using SS/CM, 55% reported that their work group had completed their SS/CM implementation; 45% reported that their group was still working to complete their SS/CM implementation (however they defined that milestone, which is not usually defined as 100% of information production output, as shown in Figure 2).
Table 2 reveals that the average time it takes a work group to implement an SS/CM system seems reasonable: most work groups adopting SS/CM systems complete their implementation in well under a year. However, some work groups experience very long implementation times.

Figure 4. How Long Ago Did Work Group Begin Using SS/CM System?

Table 2. Estimated Months to Research Options, to Reach 25% Production with SS/CM, and to Complete the Implementation Process

    Measure    Months researching    Months to 25% of     Months to complete     Months projected to
               SS/CM options         output with SS/CM    implementation         complete implementation
                                                          (historical)           (projection)
               n = 103 valid         n = 97 valid         n = 56 valid           n = 49 valid
    Median     4                     4                    6                      10.5
    Mean       6.1                   6.4                  7.9                    10.7
    SD         5.96                  6.25                 7.07                   7.95
    Range      0 to 24               0 to 28              0 to 28                0 to 24

Caution must be exercised in comparing estimates by those working toward completion of SS/CM implementation with the historical estimates by those looking back at that completed milestone. For those in the "not done" group, we do not know how long SS/CM projects had been under way when they estimated how long it would be before their work group completed its implementation. With that caveat in mind, we observe that the data in Table 2 are consistent with what we know about human nature: those looking ahead to completion of SS/CM implementation tended to see the process taking somewhat longer than those looking back in time.

SS/CM Respondents Reported Many Activities to Prepare for Transition

The survey asked SS/CM users what activities their work group engaged in to help them make the transition to SS/CM, and 83% in the SS/CM group provided answers. Figure 5 shows that SS/CM work groups engaged in a wide range of research and professional development activities to pave the way for adoption and implementation of SS/CM systems. As we would expect, about half of the work sites gathered information from vendor Web sites. The next most mentioned activity was trying out the product, which 37% said their work group did. Only slightly fewer (31%) indicated that members of their work group attended conferences and workshops to learn more about SS/CM systems. About 1 in 4 (23%) indicated that their work group hired a consultant to help them make the transition.

Top Drivers: Faster Development, Lower Costs, Regulatory and Compliance Pressures, Translation Needs

On one question, the 137 SS/CM users indicated which listed business goals influenced the decision to adopt the SS/CM system their work group used. The next question asked them to select the business goal that was the most important driver of the decision to adopt the SS/CM system. Figure 6 charts the results from these related questions. On the "choose all that apply" question, the business goal most often selected was providing standardization and consistency (73%). Three other business goals were indicated as influential by more than half of the SS/CM group: speeding up development (57%), lowering costs (56%), and providing more usable and useful information products (52%).

In identifying the single most important business goal driving the decision to adopt the SS/CM system, about 1 in 5 respondents picked one of the first three factors listed above, with lowering costs edging out standardization and development speed as the most-picked factor. About 1 in 8 picked either lowering translation costs specifically or providing more usable and useful information products as the most important factor; only 6% chose responding to regulatory or compliance pressures as the single most important driver of adoption.

SSwCM Respondents Reported Significantly Larger Work Groups

Table 3 shows that respondent work group sizes were similar for three groups: no SS or CM use, use of SS only, and use of CM only. However,
However,the work group size reported bySSwCM users was signiﬁ cantlydifferent.SS/CM and Non-use Groups VariedSigniﬁ cantly by Company SizeKnowing that larger work groupsizes predict a signiﬁ cantly greaterlikelihood of using SS/CM methodsand tools, we would expect thesame to hold true, generally, forthe association between companysize and likelihood of using SS/CM. That is the case, though theassociation is not as strong as workgroup size. Chi square analysisrevealed that the proportions shownin Table 4 are signiﬁ cantly different,Figure 5. Transition to SS/CM Activities Reported by SS/CM Respondents* n = 114 due to item nonresponse, but percentages shown are based on n = 137, which is totalof SS/CM respondentsFigure 6. Business Goals Driving Decision to Implement SS/CM SystemSingle Sourcing and Content ManagementApplied Research384Technical Communication●Volume 57, Number 4, November 2010χ2(9, N = 275) = 25.283, p = .003. Somers’ d, used totest the strength of signiﬁ cant chi square associationsfor ordinal by ordinal data, had a value of .17, whichis noteworthy, though weak. (In other words, knowingthe size of a respondent’s company reduces predictionerrors about which SS/CM subgroup the respondent isin by 17%.)SS/CM Was Signiﬁ cantly Associated with GreaterTranslation NeedsA question presented to all respondents asked,“Regarding your work group’s information products:Into how many languages are some or all of thoseproducts translated?” Table 5 sorts the answers intothe four categories formed by the ﬁ xed choices, whichranged from 0 languages to 10 or more languages. Chisquare analysis revealed that the proportions shown inTable 5 are signiﬁ cantly different, χ2(9, N = 276) =34.563, p = .000. 
Goodman and Kruskal’s tau, aproportional reduction in error directional measureof association for nominal by nominal data, was 0.51with the SS/CM category as the dependent variable.(Knowing the number of languages for translationreduces errors in predicting the SS/CM category byhalf.) These results strongly support the perceptionamong many technical communicators that translationneeds are often a critically important factor injustifying the costs of moving to SS and/or SSwCMsystems.SS/CM Groups Differed Signiﬁ cantly on SomeLikert-type Items About ImpactsThe survey presented the SS/CM users with a seriesof 10 Likert-type items about perceived impacts ofTable 4. SS/CM Use Categories Cross-Tabulated with Company Size CategoriesCategory of SS/CM Use 1–99 100–999 1,000–9,999 10,000 or more TotalsNo SS / CMCount 42 42 25 30 139% within category 30% 30% 18% 22% 100%SS onlyCount 13 18 9 7 47% within category 28% 38% 19% 15% 100%CM onlyCount 8 7 7 13 35% within category 23% 20% 20% 37% 100%SSwCMCount 5 10 20 19 54% within category 9% 19% 37% 35% 100%TotalCount 68 77 61 69 275% within category 25% 28% 22% 25% 100%* Null hypothesis that differences in proportions across columns are due to chance was rejected: χ2(9, N = 275) = 25.283, p = .003; Somers’ d = .172Table 3. SSwCM Users Reported Signiﬁ cantly Larger WorkGroup Sizes*Measuresof centraltendencyNo SS/CMn = 137SSonlyn = 46CMonlyn = 33SS with CM(SSwCM)n = 53Median 4.00 5.00 5.00 12.00Mean 6.91 8.24 9.45 18.00*SD 9.919 11.478 11.869 17.747Range 1 to 75 1 to 70 1 to 50 1 to 65* The null hypothesis that differences in work group size are due to chancewas rejected: A one-way Welch’s variance-weighted ANOVA was used totest for differences among the group sizes reported by respondents in thefour categories, and these were found to differ signiﬁ cantly F (3, 86.8) = 6.19,p = .001. 
Tamhane post hoc comparisons of the four groups show that work group sizes reported by those in the SSwCM category (M = 18.0) differ significantly from those of the No SS or CM category (M = 6.91, p = .000); those of the SS only category (M = 8.24, p = .009); and those of the CM only category (M = 9.45, p = .053).

using SS/CM. These 137 respondents picked an answer on a five-point scale ranging from strongly disagree (value of 1) to strongly agree (value of 5). The mean ratings elicited by the 10 statements are shown in Figure 7. Pairwise comparison using the Kruskal-Wallis non-parametric test of independent groups showed significant differences between groups, which are footnoted in Figure 7. These statistically significant differences can be summed up as follows:

- Respondents whose work groups used single sourcing without content management (SS) agreed more strongly that their system “has helped speed up development of information products” than respondents from the other two groups: content management without single sourcing (CM) and single sourcing with content management (SSwCM).
- CM respondents agreed less strongly than respondents from the other two groups that their system “has helped speed up development of information products.”
- SS respondents more strongly agreed than SSwCM respondents that their system “has made our routine work less stressful overall.”
- SSwCM respondents more strongly agreed than respondents using SS only or CM only that their system “has improved the usability of our information products.”

Half the SS/CM Users Reported Significant Downsides or Tradeoffs

The survey asked those using SS/CM systems, “Has your work group and/or company experienced any significant downsides or tradeoffs resulting from switching to its SS/CM system?” Seventy-two of the 137 respondents (53%) answered “Yes” and also typed comments into a text-entry space.
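The Kruskal-Wallis H statistic behind the Figure 7 group comparisons can be sketched in pure Python. The ratings below are synthetic (the raw per-respondent responses are not published), and the sketch omits the tie correction most statistical packages apply:

```python
# Kruskal-Wallis H for k independent groups of ordinal ratings,
# with midranks assigned to tied values (no tie correction).

def kruskal_wallis_h(groups):
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(pooled)
    # Assign midranks to runs of tied values
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1
        midrank = (i + 1 + j) / 2.0     # average of ranks i+1 .. j
        for t in range(i, j):
            ranks[t] = midrank
        i = j
    # Sum the ranks within each group
    rank_sums = [0.0] * len(groups)
    for (value, gi), r in zip(pooled, ranks):
        rank_sums[gi] += r
    return (12.0 / (n * (n + 1))
            * sum(rs**2 / len(g) for rs, g in zip(rank_sums, groups))
            - 3 * (n + 1))

# Synthetic 1-5 ratings for three hypothetical groups of respondents
h = kruskal_wallis_h([[4, 5, 4, 3, 5], [3, 3, 2, 4, 3], [2, 1, 3, 2, 2]])
print(round(h, 2))
```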
We did an initial coding of the comments and then further reduced the categories with a second round of coding, which produced the results shown in Table 6. Table 7 contains a representative sample of the comments in each of the top six categories.

One in Four SS/CM Users Said That Their Work Group Was Considering a Change in Tools

Thirty-nine (28%) of the SS/CM users indicated that their work group was considering a change to a different SS/CM system. In an open-ended follow-up question, 26 respondents mentioned specific tools under consideration. DITA was mentioned in nine responses; other tools mentioned more than once were Structured FrameMaker (5 times), MadCap Flare (4), SharePoint (3), RoboHelp (2), and XMetal (2).

Table 5. SS/CM Category Cross-Tabulated with Number of Languages Translated*

                                  Number of Languages for Translations
Category of SS/CM Use            0      1–4     5–9    10 or more   Total
No SS or CM  Count                72      50       8        9        139
             % within category   52%     36%      6%       7%       100%
SS only      Count                22      10       6        9         47
             % within category   47%     21%     13%      19%       100%
CM only      Count                16      14       3        2         35
             % within category   46%     40%      9%       6%       100%
SSwCM        Count                14      15      10       16         55
             % within category   26%     27%     18%      29%       100%
Total        Count               124      89      27       36        276
             % within category   45%     32%     10%      13%       100%

* Null hypothesis that differences in proportions across columns are due to chance was rejected: χ2(9, N = 276) = 34.563, p = .000.

Half of the No-SS/CM Work Groups Had Considered SS/CM, but Few Planned to Adopt

In addition, about 1 in 3 reported that their work group had never considered switching to SS/CM, and about 1 in 10 were not sure or gave ambiguous explanations after checking “Other.” For 66 respondents (47%) in the no-SS/CM group who answered a follow-up question about factors driving their work group to consider using SS/CM, the most important factors were speeding up development (71% of n = 66), providing standardization and consistency (68%), and cutting costs (61%).
These results are similar to those from SS/CM respondents (see Figure 6).

The 66 non-SS/CM respondents reporting that their work groups had considered SS/CM were asked to explain what their group had concluded about the feasibility of adopting SS/CM. About half these respondents mentioned as obstacles the money, time, and/or resources required to move forward with a transition to SS/CM. About 1 in 5 indicated that their work group or management concluded that SS/CM was not practical for them or not needed. Another 1 in 5 indicated that no decision had yet been made about the feasibility of switching to SS/CM.

Respondents Reported Producing a Diverse Array of Information Products

All respondents were presented with a long list of information products and checked all the ones their work group produced (see Table 8). Not surprisingly, PDF files and technical content documents were the top categories, selected by 9 out of 10 respondents. About 3 out of 4 said their work groups produced Microsoft Word files and/or content with screen shots. Two other types of products were selected by over half the respondents: HTML-based help content and instructional content. Far fewer respondents indicated their work groups produced multimedia

Table 6. Comments on Downsides of SS/CM Implementation: Count by Category

Category into which comment was sorted                           n = 72   % of 137
Awkward production / slower production / more work for writers     23        17
Difficult or slow transition / learning curve /
  team member resistance                                           22        16
Bugs and technical glitches                                        13        10
Lack of ability to customize                                        5         4
Expense                                                             3         2
Garbage in, garbage out                                             3         2
Technical skills demands; loss of process control;
  too early to tell                                             1 each        1

Figure 7.
Mean Rating of SS/CM Users on 10 Likert-Type Statements About SS/CM Impacts (n = 137; scale from Strongly Disagree (1) to Strongly Agree (5))

Will continue to be used by work group         4.07
Worth the effort                               3.65
Speeded up development***                      3.59
Improved information product usefulness        3.46
Improved cost-effectiveness                    3.45
Improved information product ease of use       3.41
Made me feel more positive about my work       3.35
Made work group more customer-centered         3.14
Facilitated focus on information usability**   3.08
Reduced overall work stress*                   3.04

* SS only users more strongly agreed that their system reduced stress than SSwCM users (H = 7.73, DF = 3, p = .052).
** SSwCM users more strongly agreed about better usability focus than users of SS only or CM only (H = 16.01, DF = 3, p = .001).
*** SS only users more strongly agreed about gains in speed than other users; CM only users agreed less strongly about gains in speed than other users (H = 18.12, DF = 3, p = .000).

and/or interactive content, such as videos, animation, simulations, and interactive Flash or Shockwave content. We intended for the product categories to overlap and to represent as broad a spectrum of information products as possible, but respondents could add other types in an open-ended “other” follow-up question. We examined the 29 responses to the “other” question and identified 12 responses representing types of information products not already checked by the respondent, such as reports, proposals, forms, posters, and so forth.

Microsoft Word Was the Most-used Primary Authoring Tool

All 276 respondents answered this question by typing into a text-entry box: “What is your work group’s primary tool for textual content authoring/editing?”

Table 7. Sample Comments on Downsides of SS/CM Implementation, for Top Six Categories from Table 6

Awkward production / slower production / more work for writers:

What was promised was not delivered.
Gained little, but cause huge hits on our resources to get it implemented and clean up the database. Network connectivity was a big problem for our global company. The CM turned out to be a very expensive storage system with none of the benefits of single sourcing.

We did a rapid implementation of the CMS and it remains incomplete. Workflows, content delivery, and providing access to the content for groups outside our department remain huge challenges.

Extensive overhead involved in creating topics, conrefs, maps, etc. More churn, fewer people able to produce an entire doc product without large external bottlenecks and dependencies.

My understanding is that SS/CM was for translation. It has burdened the writers, because we do more of the upfront translation work. It has benefitted translation and not the writers.

Difficult or slow transition / learning curve / team member resistance:

The time it is taking to switch to DITA and re-train and re-tool the entire department is significant. Political battles over product selection, disagreement over content submission form and workflow design, overall cost, cost recovery issues, maintaining stability of production environment as implementation requirements escalate over time, and technical implementation nightmares have severely hampered implementation.

Big learning curve, tool knowledge all in one part of the team that is physically far from the rest, almost complete change in team members in past two years, so newbies with no buy-in for the tool, unable to implement the tool in the way team members wanted.

There’s also been a social cost—an “us-against-them” mentality has developed between the “not getting it” writers and the staff who understand the tools and techniques. The “not getting it” crowd feels that the SS/CM implementers are imposing on them, and the implementers are losing patience with the “not getting it” bunch.
I shudder to think what will happen when we migrate to structure!

Bugs and technical glitches:

A few software bugs and gremlins. Not very significant, but present.

[Product] is buggy especially when having to reinstall after a system crash. Ugh!

The software is overly customized and the CM is somewhat unstable. We’re upgrading/switching soon.

Lack of ability to customize:

Customers tend to want to edit and use source files, but they cannot do that without the same licenses and style sheets our group uses, and most of them are not willing to invest the time.

Need to use existing templates, which don’t always fit our needs.

Expense:

The product was expensive.

Garbage in, garbage out:

Initial data entry was a problem, as we just converted our old stuff into the new, even when it was bad. Ended up with a big database with bad information.

Naturally, we had to categorize the answers, shown in Table 9. About 1 in 2 respondents (46%) identified Microsoft Word as their work group’s primary authoring/editing tool. Approximately 1 in 3 (30%) named Adobe FrameMaker. The remaining quarter of the respondents listed a variety of tools, including Arbortext Editor (4%), RoboHelp (3%), Author-it (3%), XMetal (2%), and InDesign (2%).

Conclusions

The survey results summarized above provide a snapshot, taken in May 2008, depicting STC members’ use of single sourcing and content management methods and tools. These results are the first publicly available data from a random sample survey on this topic. In this section, we discuss the most important conclusions to be drawn from the data.

Has Single Sourcing and/or Content Management Reached a Critical Mass?

Everett M.
Rogers (1995) depicted the rate of adoption for any given innovation as a normal, bell-shaped curve and designated categories of adopters, from early adopters to laggards, based on their postulated time-to-adopt relative to the average time for all potential adopters (see Figure 8). Rogers further postulated that a “critical mass” had to adopt an innovation before it could “take off,” reaching what popular author Malcolm Gladwell (2000) famously termed “the tipping point.”

Table 9. Primary Authoring/Editing Tool

%    Tool                               N = 276
46   Microsoft Word                     127
30   Adobe FrameMaker                    83
 4   Arbortext Editor (Epic Editor)      12
 3   Adobe RoboHelp                       9
 3   Author-it                            8
 2   XMetaL                               7
 2   Adobe InDesign                       5
 1   XML                                  3
 1   MadCap Flare                         2
 7   Misc. other                         19

Table 8. Types of Information Products Reported by All Respondents

%    Information Products                            N = 276
91   PDF files                                       252
91   Technical content documents                     252
79   Content with screen shots                       217
72   Microsoft Word files                            200
57   Help content (HTML Help, Web Help, etc.)        157
56   Instructional content                           156
46   Content with technical diagrams or
     illustrations                                   126
42   Other Web page-delivered content                117
28   XML or SGML content                              76
27   e-learning content                               75
24   Knowledgebase topics                             65
24   Video or animation content                       65
18   Software demonstrations or simulations           50
17   Flash Player interactive content                 47
14   Content with 3D models                           39
 5   Content for mobile devices                       15
 5   Shockwave Player interactive content             15
 4   Miscellaneous other not counted in above         12

Figure 8. Rogers’s Innovation Adopter Categories Depicted as Normal Distribution of Time-to-Adoption (Rogers, 1995, p. 262)

If all varieties of SS/CM are considered together as the innovation, the answer about critical mass is a confident yes: half of our respondents reported using SS, CM, or SSwCM.
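Rogers’s adopter categories in Figure 8 are standard-deviation slices of that normal time-to-adoption curve, so the familiar category percentages can be derived directly from the standard normal CDF (a sketch; the cutoffs are Rogers’s 1995 definitions):

```python
from math import erf, sqrt

def norm_cdf(z):
    """CDF of the standard normal distribution."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Category boundaries in standard deviations from the mean time to adopt
categories = [("Innovators", None, -2.0),
              ("Early Adopters", -2.0, -1.0),
              ("Early Majority", -1.0, 0.0),
              ("Late Majority", 0.0, 1.0),
              ("Laggards", 1.0, None)]

for name, lo, hi in categories:
    upper = norm_cdf(hi) if hi is not None else 1.0
    lower = norm_cdf(lo) if lo is not None else 0.0
    print(f"{name:14s} {100 * (upper - lower):4.1f}%")
# Rogers rounds these (2.3, 13.6, 34.1, 34.1, 15.9) to the familiar
# 2.5%, 13.5%, 34%, 34%, and 16%.
```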
In addition, as shown by the data on how long groups had been using their SS/CM system (see Figure 4), the pace of adoption of all three categories of SS/CM had picked up somewhat during the 2 years prior to the survey, from about mid-2006 to mid-2008. The current recession began in December 2007 (National Bureau of Economic Research, 2008). Undoubtedly, the recession has put a damper on the spread of SS/CM among technical communication work groups over the past 2 years. We think it is likely, however, that the recession may have had less impact on the adoption of SS systems, which generally have a lower price tag, than on the more expensive SSwCM systems.

If we regard each set of SS/CM methods and tools as a distinct innovation competing with the others, then our answer about critical mass, based on the Figure 1 data, becomes maybe for SS only and for SSwCM: Those methods and tools appear to have reached a critical mass of adopters. However, the results suggest that CM without single sourcing did not seem destined for widespread adoption in technical communication. In sum, our survey shows that as of mid-2008 STC members had moved into the Early Majority phase (Figure 8) for SS only and SSwCM, but CM by itself was still in the Early Adopter phase. Likewise, with regard to XML adoption, STC members were in the Early Majority phase, but for DITA they were in the Early Adopter phase (see Figure 3 and related explanatory text).

Are Larger Companies More Likely to Use SS/CM?

Yes (see Table 4), but the strength of the statistically significant association is weaker than some would predict. We found a stronger association between work-group size and likelihood of using SS/CM. And, of course, we come back to the problem of conflating all types of SS/CM methods and tools: The cost of adoption in time and money will vary widely depending on the specific solution adopted, adapted, and/or developed. Some SSwCM systems are expensive, and only companies with deep pockets can afford them.
On the other hand, a small work group with one or two technically savvy and resourceful members could develop an SS-only or even an SSwCM system with relatively low-cost and/or open-source tools.

Are Translation Requirements a Big Driver of SS/CM Adoption?

Absolutely, yes: See Table 5. Our data support what anyone who has followed this topic at STC conferences would have assumed. However, translation is not the top driver of SS/CM adoption, as demonstrated in Figure 6, which shows that three business goals were picked about evenly as the most important driver of the decision to adopt an SS/CM system: lowering costs generally, speeding up development, and providing standardization or improving consistency.

What Are the Biggest Surprises in the Survey Results?

For us, the biggest surprise was that only 1 in 10 respondents reported that they had been involved in a work group whose attempt to implement an SS/CM system had failed. On more than one occasion, one of us (Dayton) has heard prominent consultants at STC conferences estimate failure rates for SS/CM projects at 50% and higher. We think the data from our survey probably underestimate the actual failure rate for such projects, but we also suspect that these results mean that failure rates are commonly overestimated. This may be explained by different notions of what constitutes a failed project. Half of our survey’s respondents who reported no SS/CM use also reported that their work group had considered a switch to SS/CM but had no plans to move in that direction. This suggests that many work groups investigate SS/CM options, including contacting consultants, but end up deciding to stay with the methods and tools they have, often without trying to implement an SS/CM system.
To a consultant, that may count as a failure to implement, but to insiders it may simply be a successful conclusion to a deliberative process focused on feasibility.

Another surprise was that 1 in 4 respondents in work groups using SS/CM was considering a change in methods and tools and that 1 in 2 reported significant downsides to their current SS/CM methods and tools. We did not expect that high a level of dissatisfaction with SS/CM methods and tools; on the other hand, we did not ask non-users of SS/CM a similar question about perceived downsides of their methods and tools.

What Else in the Results Deserves to Be Highlighted?

Microsoft Word and FrameMaker were by far the most-used primary authoring tools of the survey respondents, and more than three times as many respondents produced PDF files as produced content using XML or SGML.

We also think that the data on the Likert-type agreement-disagreement items are intriguing: SS-only respondents were significantly more in agreement that their system had speeded up their work while reducing their work-related stress. SSwCM respondents, however, were significantly more in agreement that their system had made work groups more focused on information usability issues.
These results tempt us to speculate that the added complexity of implementing single sourcing through a content management system adversely impacts perceptions of overall efficiency and stressfulness while bolstering perceptions that the work group is giving more attention to the usability of its information products. Perhaps implementing SSwCM is more likely to compel work groups to re-invent their information development processes, leading to more user-centered analysis and testing of their information products.

Is It Likely That This Survey Underestimates Use of SS/CM by STC Members?

For surveys of the general public, textbooks about social science research instruct that a low response rate, commonly specified as below 50% (Babbie, 2007, p. 262), warrants caution in assuming that data from the survey accurately represent the results that would be produced if data could be gathered from all members of the represented group. Our survey’s response rate of 28% must be viewed as a limitation of the study: Because we lack information about the nonrespondents to the survey, we cannot know whether they, as a group, differ significantly from respondents in regard to the topics covered by the survey. The discussion about how likely it is that the survey’s results accurately represent the experiences and attitudes of STC members in 2008 must therefore be grounded in logical imputation.

We do not think the results underestimate STC members’ use of single sourcing and content management in the spring of 2008. Indeed, it seems just as likely that the survey overestimates SS and CM use by STC members. We make that argument in Appendix B, for those who may be interested in a review and discussion of research supporting the proposition that low survey response rates do not automatically mean questionable data quality.
Our examination of the literature on that topic has bolstered our confidence that our survey presents a reasonably accurate snapshot of STC members’ experiences and opinions related to single sourcing and content management.

From the Survey Results, What Dare We Predict About the Future of SS/CM?

The survey results make for a rather cloudy crystal ball. Nevertheless, adding them to what we know from half a decade of following the information about SS/CM disseminated in the publications and at the conferences of technical communication practitioners and academics, we feel confident in making these general predictions:

- Single sourcing will slowly but steadily gain wider acceptance among technical communication work groups. Single sourcing seems destined to reach a significantly larger proportion of adopters than single sourcing with content management, barring a technological breakthrough that makes SSwCM systems significantly cheaper and easier to install, use, and maintain. Perhaps, though, one or more popular SS tools such as Adobe FrameMaker and MadCap Flare will evolve into true SSwCM solutions, altering the SS/CM marketplace quite dramatically.
- Pushing XML-enabled single sourcing to the tipping point may take the arrival, or the more effective marketing, of user-friendly and affordable plug-in tools for Microsoft Word, which was by far the most-used authoring tool of STC members in May 2008.
- The number of eventual SS/CM adopters in technical communication may be somewhat lower than SS/CM vendors and consultants anticipate. Already, Web 2.0 and social media/networking methods and tools are stealing the spotlight from SS/CM topics at the leading conferences attended by technical communicators.

That last conjecture seems a suitably provocative note to end on.
Standardized structure and control are at the heart of the SS/CM paradigm, but those qualities are anathema to the Web 2.0/social networking paradigm. What’s going on here? Could it be that many companies find today that they need technical communicators to produce a continuous stream of just-in-time, variously structured, often transient, multimedia content—as much or more than they need them to produce highly regulated and uniform topics in a database whose information, as well as its meta-information, is composed almost entirely of words?

This question, in simpler forms, will become the focus of much discussion among technical communicators. It represents only one of several obvious directions for further research related to the incessant search for better, cheaper, and faster ways of creating useful and usable technical information products.

References

Ament, K. (2003). Single sourcing: Building modular documentation. Norwich, NY: William Andrew Publishing.

Babbie, E. R. (2007). The practice of social research (11th ed.). Belmont, CA: Thomson Wadsworth.

Clark, D. (2008). Content management and the separation of presentation and content. Technical Communication Quarterly, 17, 35–60.

Dillman, D. A. (2007). Mail and Internet surveys: The tailored design method (2nd ed.). Hoboken, NJ: Wiley.

Doyle, B. (2007). Selecting a content management system. Intercom, 54(3), 9–13.

Gladwell, M. (2000). The tipping point: How little things can make a big difference. Boston: Little, Brown.

Hall, W. P. (2001). Maintenance procedures for a class of warships: Structured authoring and content management. Technical Communication, 48, 235–247.

Happonen, T., & Purho, V. (2003). A single sourcing case study. Presentation (slides) at STC 50th annual conference (Dallas, TX, May 18–21). Retrieved from http://www.stc.org/edu/50thConf/dataShow.asp?ID=110

McCarthy, J. E., & Hart-Davidson, W. (2009). Finding usability in workplace culture.
Intercom, 56(6), 10–12.

National Bureau of Economic Research. (2008, December 11). Determination of the December 2007 peak in economic activity. Retrieved from http://wwwdev.nber.org/dec2008.pdf

Petrie, G. (2007). Industrial-strength single-sourcing: Using topics to slay the monster project. Presentation (slides) at 54th annual conference of the Society for Technical Communication (Minneapolis, MN, May 13–16). Retrieved from http://www.stc.org/edu/54thConf/dataShow.asp?ID=27

Pettit Jones, C., Mitchko, J., & Overcash, M. (2004). Case study: Implementing a content management system. In G. Hayhoe (Ed.), Proceedings of the 51st annual conference of the Society for Technical Communication (Baltimore, MD, May 9–12). Arlington, VA: STC. Retrieved from http://www.stc.org/ConfProceed/2004/PDFs/0048.pdf

Pierce, K., & Martin, E. (2004). Content management from the trenches. In G. Hayhoe (Ed.), Proceedings of the 51st annual conference of the Society for Technical Communication (Baltimore, MD, May 9–12). Arlington, VA: STC. Retrieved from http://www.stc.org/ConfProceed/2004/PDFs/0049.pdf

Rockley, A. (2001). Content management for single sourcing. In Proceedings of the 48th annual conference of the Society for Technical Communication (Chicago, IL, May 13–16). Arlington, VA: STC. Retrieved from http://www.stc.org/ConfProceed/2001/PDFs/STC48-000171.pdf

Rockley, A., Kostur, P., & Manning, S. (2002). Managing enterprise content: A unified content strategy. Indianapolis, IN: New Riders.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.

Welch, E. B., & Beard, I. (2002). Single sourcing: Our first year. In G. Hayhoe (Ed.), Proceedings of the 49th annual conference of the Society for Technical Communication (Nashville, TN, May 5–8). Arlington, VA: STC.
Retrieved from http://www.stc.org/ConfProceed/2002/PDFs/STC49-00070.pdf

About the Authors

David Dayton is an Associate Fellow of STC. He has worked in technical communication since 1989 as a technical writer and editor, Web content designer and usability professional, and university teacher and researcher. He conducted this research while he was a faculty member of the English Department at Towson University. He recently left academe to join the International Affairs and Trade team of the U.S. Government Accountability Office, where he works as a Communications Analyst. E-mail address: dr.david.dayton@gmail.com

Keith B. Hopper has taught in the master’s program in Information Design and Communication at Southern Polytechnic State University since 2001. An associate professor there, he also teaches in the Technical Communication undergraduate program. Recently, he launched an innovative master’s degree program in Information and Instructional Design: http://iid.spsu.edu. He holds a PhD in Instructional Technology from Georgia State University. E-mail address: khopper@spsu.edu

Manuscript received 26 February 2010, revised 28 August 2010, accepted 8 September 2010.

Appendix A: An Annotated Bibliography

Because our survey was about methods and tools that have been much discussed in conferences and the literature of the field for over a decade, we did not begin our report with an introductory literature review—the conventional way of justifying a new study and showing its relation to prior research and theory. Instead, we provide this brief annotated bibliography. We selected these sources as recent and useful starting points for delving into the abundant literature by technical communicators discussing single sourcing and content management.

Dayton, D. (2006). A hybrid analytical framework to guide studies of innovative IT adoption by work groups.
Technical Communication Quarterly, 15, 355–382.

This article reports a case study of a medium-sized company that carried out a user-centered design process, complete with empirical audience research and usability tests, to put all its technical reference, troubleshooting, training, and user assistance information into a single-source, database-driven content management system. The case study is interpreted through the lens of a hybrid analytical framework that combines and aligns three distinct theoretical traditions that have been used to guide technology adoption and diffusion studies.

Dayton, D. (2007). Prospectus for a multimodal study of single sourcing and content management. In IPCC 2007: Engineering the future of human communication. Proceedings of the 2007 IEEE International Professional Communication Conference (IPCC) held in Seattle, Washington, Oct. 1–3, 2007. Piscataway, NJ: IEEE.

This proceedings paper describes the research project funded by STC in 2007, of which the survey reported in our article is the major part. It contains a justification for the focus of the study based in a traditional review of the literature.

Kastman Breuch, L. (2008). A work in process: A study of single-source documentation and document review processes of cardiac devices. Technical Communication, 55, 343–356.

This article from the STC journal documents a case study with details on implementation and impacts that offer a healthy practical counterpoint to the more abstract and theoretical perspectives that dominate the chapters in the Pullman and Gu collection. Kastman Breuch is particularly interested to explore the impacts of single sourcing (implemented through a content management system) on the document review process: “Both of these practices influence the roles and identities of technical writers as individual authors.
What happens when we examine the impact of both practices—document review processes and single sourcing—together?” (p. 345).

Pullman, G., & Gu, B. (Eds.). (2008). Content management: Bridging the gap between theory and practice. Amityville, NY: Baywood Pub. Co.

A collection of 11 articles originally published in a special issue of Technical Communication Quarterly, this book will appeal primarily to those seeking an in-depth, critical exploration of content management systems. The book’s editors define CM broadly, and none of the chapters specifically focus on single sourcing. An online copy of the book’s introduction is available at the publisher’s Web site: http://www.baywood.com/intro/378-9.pdf

Rockley, A. (2001). The impact of single sourcing and technology. Technical Communication, 48, 189–193.

This article in the STC’s journal was the first to propose a comprehensive scheme for defining types of single sourcing. Rockley described four distinct levels of single sourcing, with level 2 corresponding to what we have defined as single sourcing without content management. Level 3 corresponds to what we have defined as content management: “Information is drawn from a database, not from static, pre-built files of information” (p. 191). Rockley equates level 4 with advanced electronic performance support systems that are not practical to implement in most user-assistance scenarios.

Williams, J. D. (2003). The implications of single sourcing for technical communicators. Technical Communication, 50, 321–327.

This article by a practicing technical communicator provides an excellent starting point for readers new to the topic of single sourcing.
Williams provides concise but comprehensive summaries of key articles and books from 2000 to 2003 and provides a well-selected further reading list that includes articles from 1995 to 2002.

Appendix B: New Thinking About Survey Response Rates

Researchers have recently called into question whether a survey response rate of 60% to 70% should be considered, by itself, to ensure that the results are more trustworthy than those from a survey with a much lower response rate (Curtin, Presser, & Singer, 2000;

Keeter et al., 2000; Merkle & Edelman, 2002). Groves, Presser, and Dipko (2004) sum up the challenge to the conventional wisdom on response rates: “While a low survey response rate may indicate that the risk of nonresponse error is high, we know little about when nonresponse causes such error and when nonresponse is ignorable” (p. 2).

“Emerging research,” Radwin wrote (2009), “shows that despite all the hand-wringing about survey nonresponse, the actual effect of response rate on survey accuracy is generally small and inconsistent, and in any case it is less consequential than many other serious but often ignored sources of bias” (para. 4). Radwin cites a study by Visser, Krosnick, Marquette, and Curtin (1996) that compared the pre-election results of mail surveys conducted from 1980 through 1994 with the results of parallel telephone surveys conducted in the same years. The average response rate of the mail surveys was 25% while the telephone surveys reported estimated response rates of 60% to 70%. Based on response rate alone, conventional wisdom would predict that the telephone surveys were significantly more accurate than the mail surveys, but the opposite was the case. The mail surveys consistently outperformed the telephone surveys on accuracy. Visser et al. concluded that “to view a high response rate as a necessary condition for accuracy is not necessarily sensible, nor is the notion that a low response rate necessarily means low accuracy” (p. 216).

We believe that what Visser et al.
(1996) found to be true of surveys of the electorate is even more likely to hold true for surveys such as ours, whose sampling frame is confined to the members of a professional organization. Almost four decades ago, Leslie (1972) noted that "when surveys are made of homogeneous populations (persons having some strong group identity) concerning their attitudes, opinions, perspectives, etc., toward issues concerning the group, significant response-rate bias is probably unlikely" (p. 323). In their recent meta-analysis of studies on nonresponse error in surveys, Groves and Peytcheva (2008) concluded that "the impression that membership surveys tend to suffer from unusually large nonresponse biases may be fallacious" (p. 179), even though relatively low response rates for such surveys have become a well-known problem.

Rogelberg et al. (2003) stress the self-evident point, often forgotten in discussions of this topic, that survey nonresponse is not the same as survey noncompliance, the purposeful refusal to take a survey. If a sizable number of our e-mailed survey invitations never reached the intended recipients, because of spam blockers, for example, or filters created by recipients to delete e-mails from certain senders, then the actual response rate would be higher, though by how much is impossible to say. Similarly, it is impossible to know how many times the e-mails about the survey may have been deleted automatically by recipients who did not make a conscious decision to refuse the invitation to take the survey. During May 2008, along with our survey invitation, STC sent out multiple e-mails to members about the upcoming annual conference. Many members in the sample may have paid scant attention to our initial e-mails about the survey because the first identified stc@stc.org as the sender.
(We had the STC staff member change the sender to ddayton@stc.org for the two reminder e-mails.)

We believe that most of our survey's nonrespondents were passive, not active, nonrespondents. Based on their in-depth field study, Rogelberg et al. (2003) concluded that only about 15% of nonrespondents to organizational surveys were active nonrespondents, and that passive nonrespondents were identical to respondents when the survey variables had to do with attitudes toward the organization. While our survey was directed at members of an organization, the questions were not about the organization, and the organization itself belongs to a special class, professional membership organizations. Thus, we cannot assume that the findings and reasoning reported by Rogelberg et al. (2003) apply to our nonrespondents; on the other hand, we think the question raised is worth considering in regard to our survey: Were most nonrespondents passively passing up the chance to take our survey, or were most of them actively rejecting the invitation because of some attitude related to the topic of the survey, or some other cause, that might mean their answers would be significantly different from the answers of those who responded?

If failing to achieve a certain response rate is not automatically an indicator of nonresponse bias in a sample survey, how then can we estimate the likelihood that the survey results are biased because of missing data from the random sample? Rogelberg (2006) summed up the answer: "Bias exists when nonrespondent differences are related to standing on the survey topic of interest such that respondents and nonrespondents differ on the actual survey variables of interest" (p. 318).
Translating that into plain English for the case in question: if a significant proportion of our survey's nonrespondents differed significantly from respondents in their experience with or attitudes toward single sourcing and content management, then their missing data represent a source of bias in our survey results. Thinking about why recipients of our e-mails about the survey would purposely ignore or actively reject the invitation, we surmise that most such active nonrespondents, as opposed to the likely majority of passive nonrespondents, would have found the survey topic of little interest because they had no experience with single sourcing and/or content management systems. Even though we worded our survey invitations to stress our desire to collect information from all STC members, regardless of whether they used SS/CM methods and tools, it seems likely that many recipients of our messages who had no experience with such methods and tools would have felt disinclined to take the time to fill out the survey. To the extent that our conjecture is accurate, the survey results would overestimate the proportion of STC members whose work groups used SS/CM methods and tools in May 2008.

References for Appendix B

Curtin, R., Presser, S., & Singer, E. (2000). The effects of response rate changes on the index of consumer sentiment. Public Opinion Quarterly, 64, 413–428. doi:10.1086/318638.

Groves, R. M., Presser, S., & Dipko, S. (2004). The role of topic interest in survey participation decisions. Public Opinion Quarterly, 68, 2–31. doi:10.1093/poq/nfh002.

Groves, R. M., & Peytcheva, E. (2008). The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly, 72, 167–189. doi:10.1093/poq/nfn011.

Keeter, S., Miller, C., Kohut, A., Groves, R., & Presser, S. (2000). Consequences of reducing nonresponse in a national telephone survey.
Public Opinion Quarterly, 64, 125–148.

Leslie, L. L. (1972). Are high response rates essential to valid surveys? Social Science Research, 1, 323–334. doi:10.1016/0049-089X(72)90080-4.

Merkle, D., & Edelman, M. (2002). Nonresponse in exit polls: A comprehensive analysis. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey nonresponse (pp. 243–258). New York: Wiley.

Radwin, D. (2009, October 5). High response rates don't ensure survey accuracy. The Chronicle of Higher Education. Retrieved from http://chronicle.com/article/High-Response-Rates-Dont/48642/.

Rogelberg, S. G. (2006). Understanding nonresponse and facilitating response to organizational surveys. In A. I. Kraut (Ed.), Getting action from organizational surveys: New concepts, methods, and applications (pp. 312–325). San Francisco, CA: Jossey-Bass.

Rogelberg, S. G., Conway, J. M., Sederburg, M. E., Spitzmüller, C., Aziz, S., & Knight, W. E. (2003). Profiling active and passive nonrespondents to an organizational survey. Journal of Applied Psychology, 88(6), 1104–1114. doi:10.1037/0021-9010.88.6.1104.

Visser, P. S., Krosnick, J. A., Marquette, J., & Curtin, M. (1996). Mail surveys for election forecasting? An evaluation of the Columbus Dispatch poll. Public Opinion Quarterly, 60, 181–227.

Appendix C: Survey Documents

Link to a Non-Working Archival Copy of the Survey

http://www.zoomerang.com/Survey/WEB22B38UWBJKZ

Copy of Survey Notification Message from STC President Linda Oestreich

Subject: Please participate in a research study of STC members

The STC is sponsoring research to discover the range of information development methods and tools being used by STC members. We especially want to know how many members are using single sourcing and content management methods and tools.

Whether or not you use single sourcing and/or content management, we need your input.
You are included in the small random sample of members who will receive an e-mail containing the link to an online questionnaire.

The survey can be done anonymously, or you can provide an e-mail address for follow-up contact or to receive an early view of the results. Most testers reported that they completed the survey in 10 to 15 minutes.

I am excited that Dr. David Dayton (PhD, Technical Communication) and Dr. Keith Hopper (PhD, Instructional Technology) have designed and tested the survey instrument and are ready to collect and analyze the data that you provide.

Look for an e-mail with a link to the survey on Tuesday, May 13.

Dr. Dayton will give a report on the survey results at a session of the 2008 Technical Communication Summit, which will be held in Philadelphia June 1–4.

Copy of First E-mail Message Containing a Link to the Survey

Subject: Please participate in a research study of STC members

We professional technical communicators lack reliable data on the range of information development tools and technologies being used by practitioners. The STC is sponsoring research to collect that information, with a focus on finding out what single sourcing and/or content management methods and tools are being used.

Your name was among the small random sample of members receiving this invitation to participate in an online survey accessed at this page: [typed here was a link to the informed consent Web page reproduced after this message]

The survey can be done anonymously, or you can provide an e-mail address for possible follow-up contact or to receive an early view of results. The exact set of questions presented will depend on your answers to key questions, so the time required to fill out the survey will vary. Most testers reported that they completed the survey in 10 to 15 minutes.

Whether or not you use single sourcing and/or content management, we need your input.
By participating, you will help us construct a reliable profile of information development methods and tools used by STC members.

Because the random sample is a small fraction of the total STC membership, it is critical that we have your data in the survey results. It is equally critical that members of the sample not forward the survey link to others.

If you have any problems with the link to the survey or with the survey itself, please contact David Dayton at ddayton@rcn.com.

David Dayton: research project lead
Towson University (Maryland)

Keith Hopper: survey deployment and statistical analysis
Southern Polytechnic State University (Georgia)

Copy of Informed Consent Web Page Giving Access to the Survey

Single Sourcing and Content Management in Technical Communication: A Survey of STC Members

Consent Form

Because you were included in a small random sample of STC members, your information is vital to achieving the purpose of the survey even if you do not use single sourcing or content management.

This consent form is required by federal regulations. By clicking the agreement link at the bottom of this form, you acknowledge that your participation is voluntary, that you may abandon the survey at any point, and that your information is anonymous unless you provide contact information, in which case we promise to handle your information with the strictest confidentiality.

Time Required

Most testers of the survey reported that it took them 10–15 minutes to fill out the questionnaire that will appear after you click on the "I agree" link at the bottom of this form.

Purpose of the Study

This survey will collect information from a sample of STC members about their use or non-use of single sourcing and content management tools and methods, and their opinions about them.
(In the survey, we define precisely what we mean by "single sourcing" and "content management.")

What You Will Do in the Study

Your only task is to fill in the Web survey itself.

Benefits

Respondents who complete the survey will be offered an early look at the preliminary data, which we will continue to analyze and will later report in conference presentations and published articles. As a technical communicator, you may benefit in that the survey data will provide a statistical snapshot of the information development methods and tools that STC members are using today and their opinions about some of those methods and tools.

Confidentiality

The information you provide will be handled confidentially. If you choose not to identify yourself to us, we will not try to find out who you are. You will have the option of identifying yourself for follow-up contact by e-mail or to view the preliminary survey results.

We will present the survey findings in terms of group percentages, look for common themes in the open-ended questions, and cite remarks where they are interesting and appropriate.
No individual respondents will be identified.

Risks

We do not believe there are any risks associated with participating in this survey.

Voluntary Participation and Right to Withdraw

Your participation in this study is completely voluntary, and you have the right to withdraw from the study at any time without penalty.

How to Withdraw from the Study

If you want to withdraw from the study, you may do so at any time simply by closing the browser in which this form or the questionnaire appears.

Whom to Contact About This Study or Your Rights in the Study

Principal Investigators

David Dayton, ddayton@rcn.com, Towson University (Maryland)
Keith Hopper, khopper@spsu.edu, Southern Polytechnic State University (Georgia)

Chairperson, Institutional Review Board for the Protection of Human Participants, Towson University (Maryland): Patricia Alt, palt@towson.edu

Agreement

If you agree, click here to start the survey. If you experience a problem with the link above, please copy and paste the following URL into your browser: [full Web address to the survey was typed here]

If you do not agree to participate in the survey, please close the browser now or go to the STC home page.

THIS PROJECT HAS BEEN REVIEWED BY THE INSTITUTIONAL REVIEW BOARD FOR THE PROTECTION OF HUMAN PARTICIPANTS AT TOWSON UNIVERSITY.