Analyst Views Weekly
https://avweekly.wordpress.com
Commentary and analysis from Northern Light
Hyperconnectivity Impact
https://avweekly.wordpress.com/2008/06/19/hyperconnectivity-impact/
Thu, 19 Jun 2008 16:05:04 +0000

The title of an IDC white paper released in May and sponsored by Nortel, A Global Look at the Exploding ‘Culture of Connectivity’ and Its Impact on the Enterprise, makes the content of the study apparent.

For the study IDC surveyed “2,367 men and women across 17 countries in various industries, company size classes, and age segments.” All respondents were fully employed, over 17 years old, used a PC at work, owned or used a PDA or mobile phone for either business or personal activities, and had access to the Internet.

From the survey results IDC distilled four distinct clusters of users: Barebones Users, Passive Online, Increasingly Connected, and Hyperconnected. Barebones Users, at the low end of the connectivity scale, are defined as “Those who are online but pretty much stick to email, desktop access to the Internet, and cell phone use for voice calls.” The other end of the connectivity spectrum is reserved for the Hyperconnected: “Those who have fully embraced the brave new world, with more devices per capita than the other clusters and more intense use of new communications applications. They liberally use technology devices and applications for both personal and business use.”

The largest of the clusters, comprising 36 percent of respondents, is the Increasingly Connected, while the smallest is the Hyperconnected (just 16 percent of respondents). The trend, however, is toward more connectivity: the number of Hyperconnecteds is set to grow. As older workers retire, the share of Hyperconnected users will rise to 25 percent within a few years, and assuming that some of those in the Increasingly Connected cluster move up the ladder (which is the trend), the Hyperconnected could reach 40 percent in that same period.

This scenario is not exclusive to North America; “the quest for personal connectivity has no national boundaries.” Looking globally, “the country with the highest percentage of hyper-connected respondents in the study was China”; “the country with the highest percentage of increased Hyperconnectivity was Russia.” Of the 17 countries included in the study, Canada and the United Arab Emirates had the lowest percentages of hyperconnected respondents.

A number of factors contribute to a country’s ‘connectivity landscape’ and its ‘culture of connectivity,’ according to IDC. These factors include telecommunications infrastructure quality and coverage, workforce demographics, and the mix of small, medium, and large companies. Social or work environment, urban density, and the acceptance of working on personal time are also factors. However, though each country may have a unique connectivity landscape, two characteristics are common to all: the inexorable growth of hyperconnected individuals, and the need for enterprises to stay on top of this growth if they are to compete in the global marketplace.

The report sums up enterprises’ need to prepare: “Enterprises will either manage this migration or get trampled.”

IDC notes that “One of the most striking – although perhaps not surprising – results of the survey is the degree to which personal connectivity is blending with work connectivity.” Generally speaking, this blurring of the personal and business boundary is at the heart of the challenges which enterprises will contend with. With the lack of distinction between business and personal use of devices, applications, and connectivity, “enterprises may find themselves supporting personal use.” With this comes headaches over security, privacy, and management. Balancing the two, work and personal, is also an issue: “Companies will not only need to develop policies and strategies to help employees find that balance, but may face legal or union issues with merged personal and business activities.”

Furthermore, as the anytime/anyplace workforce grows more connected and younger, it will treat hyperconnectivity and its devices as a condition of employment; enterprises will deploy applications to support these workers, and those applications will themselves need to be maintained. Finally, “The dependency of the hyperconnected on the devices and applications that make them hyperconnected raises the importance of network and application reliability, security, and availability even higher than it is today.”

IDC’s report goes into much more detail as to how inevitable the transition to hyperconnectedness is and what its impact on the enterprise will be. However, if there is any question as to the importance of preparation, the report concludes with this: “What we have learned about the state of Hyperconnectivity today, its pace of adoption, and its observable impact on organizations suggests a clear call to action.”

Twitter
https://avweekly.wordpress.com/2008/06/05/twitter/
Thu, 05 Jun 2008 16:26:52 +0000

Even before it won its first SXSW award in March of 2007, Twitter was making waves. Since then the messaging service has achieved a valuation “pegged at way north of $70 million,” according to leading tech blogger Om Malik, who also notes that its most recent round of funding raised the company another $15 million.

Conceived as a browser-based microblogging tool allowing users to post updates of up to 140 characters to groups of followers (rather than just to individuals), Twitter has spawned a plethora of tools piggybacking on its success, from desktop clients that free users from Twitter’s browser-based origins to tools that search the stream of Twitter messages known as ‘tweets.’

Once the playground of ‘tech elite’ early adopters, Twitter is now being used by businesses to monitor their status and brand not only among those early adopters, but among the rapidly growing number of more mainstream users now on the service.

Twitter is now coming into its own, and in doing so has become the center of one of Web 2.0’s greatest love-hate relationships.

Though its popularity is evident from mentions in the press and conversations in the blogosphere, actual usage statistics for Twitter are hard to come by. At the end of April TechCrunch reported: “Hitwise says web visits have increased 8x in the last year . . . Compete shows about 900,000 U.S. monthly website visitors. Comscore puts the worldwide number at 1.3 million unique monthly visitors in March.” Furthermore, because much of the action on Twitter occurs via mobile phones, instant messaging, and desktop clients, statistics such as these are not necessarily of much use. What is agreed upon, however, is that Twitter usage is increasing rapidly.

Commentary of late has revolved largely around issues such as Twitter’s downtime, which seems to be increasing in direct relationship to its popularity, and its business viability (it is still a free service without income). And many are still asking one of the original questions of Twitter: “What is the point?”

There is no avoiding the fact that Twitter’s increased traffic has taxed its ability to perform. Twitter’s Ruby-on-Rails architecture has been criticized for its lack of scalability, and the company’s lead architect, Blaine Cook, recently departed. Dennis Howlett of the blog AccMan comments on the scenario: “Right now I see way too many instances of instant forgiveness in the use of tools that are not delivering on the promise they first espoused and yet which seem to march on regardless. Digg is one example. Twitter, with its continuing failures is another.” Howlett is quite correct: Twitter may be failing to keep its end of the deal, but what might sink other services has not yet pushed Twitter to the tipping point.

As to Twitter’s viability as a business, the company is in the same boat as other Web 2.0 companies which launched free sites or services and are now struggling to find that elusive working business model. Even if it can overcome its repeated crashes and deliver a more stable architecture, will Twitter merely be swallowed by a larger company and lose its luster, lose out to another upstart (such as its competitor Pownce, co-founded by Digg’s Kevin Rose), or be found to be just another fad of limited or no value? Though ultimately only time will tell Twitter’s fate, it is making some headway in demonstrating its value.

As noted earlier, Twitter is now being used by businesses to monitor comments, participate in conversations, and communicate with customers in near real time. In his piece for BusinessWeek entitled Why Twitter Matters, Stephen Baker points out that “Businesses such as H&R Block and Zappos are now using Twitter to respond to customer queries. Market researchers look to it to scope out minute-by-minute trends.” As an example of Twitter’s value in this arena, Baker refers to Dell, which, using the data mining company Visible Technologies, “scouts out the tweets and dispatches its Twittering workers to jump into the conversations. At a conference last week, the company claimed to have boosted sales through these efforts by $500,000 in recent months.” Companies such as IBM, which has an enterprise tool based on Twitter, are exploring its use behind corporate firewalls, and media companies including the New York Times, Reuters, and National Public Radio maintain a Twitter presence.

However, Twitter needs to work hard not only at proving its stability and value, but at maintaining its spot and gaining ground among potential users: Right or wrong, some still just don’t get it. Some out-and-out disregard Twitter as noise and some hate to love it, stating its noise is addictive. (Though there are those that love it all the more for its noise and cite the volume of ‘noise’ as its benefit.)

Abbie Lundberg, editor in chief of CIO and now among the Twitter converts, notes, “When I first came across it last year, I thought it was a joke. An online spewing of inconsequential details by self-absorbed people with too much time on their hands.” Though the title of Baker’s aforementioned BusinessWeek article, Why Twitter Matters, suggests he is also a convert, he does state that “It’s easy to laugh at nonsense on Twitter, the microblogging rage. ‘My nose is leaking,’ writes someone called Zapples, ‘so imma go to sleep now.’” However, Baker adds, “But I’ve heard lots of similar drivel (and even produced some myself) on the phone—an important technology if there ever was one.” While some will doubtless say the step from Twitter to the phone is a stretch, it may be less of one if email (of which similar comments were made early on) is placed as a stepping stone. Of course it is also the type of analogy that Twitter likes to see. Twitter co-founder Biz Stone is quoted by Baker: “It can become a communication utility,” Stone says, “something people use every day.”
TechCrunch’s Michael Arrington says, “It’s a huge marketing tool, and information tool. But it is also a social habit that’s hard to kick.” Lundberg is forced to agree: “I have to admit, sometimes I’m a slave. Depending on who’s posting (or ‘tweeting’) on any given day, I can find it hard to stay away.” The distraction of visiting blogs to keep up on the latest gave way to reading RSS feeds; users now face maintaining a constant watch on the stream of Twitter messages.

As for dismissing Twitter for hosting too much noise, uber-blogger (and Twitterer) Robert Scoble makes an interesting case against doing so: “The news is in the noise. Which is why Twitter is crack for newsmakers. There’s no better place to find noise, er news, than on Twitter.”

“Love it or hate it, Twitter, a service that embodies our narcissism, is one that we can’t stop talking about,” says Malik.

You can also follow Northern Light’s Content Analyst and Analyst Views’ author, David Martel, on Twitter.

Software-as-a-Service: Moving Up
https://avweekly.wordpress.com/2008/05/22/software-as-a-service-moving-up/
Thu, 22 May 2008 15:53:28 +0000

According to reports from McKinsey & Company and Forrester, adoption of the Software-as-a-Service model is on the rise. SaaS is moving beyond its place in the small and medium sized business (SMB) space and is increasingly being implemented in larger enterprises. “For platform vendors, the only falloff in interest comes at the largest enterprises, those employing more than 25,000 people. In short, nearly every company – or division of a larger enterprise – is a customer or a prospect for SaaS platforms,” states McKinsey’s 2008 Enterprise Software Customer Survey. This thought is mirrored by Forrester research: “SaaS use is growing across types of applications, companies, and user groups.” The momentum created by this movement is pulling in established software vendors and fueling quick-moving startups. As both vie for position, it is acknowledged that some will fall, but the model is predicted to survive.

Forrester cites “shorter deployment times, faster return on investment (ROI), and pay-as-you-go pricing for new software needs” as factors contributing to increased interest in SaaS. There is some overlap between this assessment and that of McKinsey: “The momentum behind adoption of subscription and on-demand purchasing models is clearly being driven by SMB customers, for whom the pricing models have the greatest initial appeal.” That SMB customers are hoping to save money via SaaS deployments is not a surprise; the new development is wider-scale adoption in the enterprise.

McKinsey notes that more and more enterprises are “converting to the models that underpin SaaS offerings.” Forrester also notes the rise in SaaS adoption in organizations of all sizes. “SaaS usage is significant both at the large enterprise and the SMB level, with multiple SaaS solutions increasingly deployed.” In fact, according to Forrester, a (slightly) higher percentage of large enterprises are using SaaS than their smaller counterparts in the SMB space: 16 percent of large enterprises and 15 percent of SMBs implement SaaS solutions. (That represents an increase of 33 percent over last year’s reported numbers for implementation by enterprises.)

Of course everyone wants in, and, McKinsey points out, the uptake in interest in SaaS is setting the stage for a “tremendous battle between the largest software vendors and the newer SaaS providers.” McKinsey predicts this battle will converge on the middle of the market: “While each of these players has an advantage at one end of the spectrum (large vendors such as IBM, Oracle, SAP and Microsoft do best in large enterprises, while SaaS ‘incumbents’ such as Salesforce, NetSuite and RightNow are more in favor with small businesses), the real battle is in the mid-market space.”

Rivals are not the only thing individual vendors will need to overcome. Hurdles present early on in the SaaS model’s life still exist, and even as some of them fall, new challenges are emerging.

In its April report, SaaS Clients Face Growing Complexity, Forrester states that among those who do not consider SaaS solutions, integration is the top concern. This is not an unfounded fear. Even among those who implement SaaS, “integration can be a challenge, since many SaaS solutions evolved as standalone offerings that their creators sold to business users and therefore did not focus on building out strong integration tools, creating thorough documentation, or writing prebuilt connectors.” McKinsey reports the same: “Whether a SaaS platform vendor is finding a niche or trying to maintain a market advantage, the survey underscores the vital importance of offering customers speedy deployment and a smooth path to integration with existing applications and IT infrastructure. In fact, those factors were selected as the most important twice as often as costs.”

Forrester also reports that concern about data security, an issue with the SaaS model from the beginning, still presents a barrier to adoption. (Half the North American firms and nearly forty percent of European firms which don’t consider SaaS solutions cite data security as the reason.) As with integration, this may not be an irrational concern. Forrester notes that, “Providers entering the SaaS market usually don’t invest heavily in multiple data center locations that would provide the kind of redundancy and disaster recovery that large enterprise clients typically look for.” Furthermore, “the cost of data replication poses challenges to the whole financial model of SaaS providers.”

Finally, the growth of the market is its own challenge. New providers and solutions are frequently entering the market, which makes it “difficult for firms to feel secure about the long-term stability of their application purchases.”

“The expansion of the software-as-a-service (SaaS) business model appears to be inevitable. It is both Web 2.0 and business technology (BT) and reflects the wishes of business users to have more control over their IT,” states Forrester in another report. This is probably a good assessment; the model has gained traction despite the continued presence of early challenges and appears set to overcome new ones as they arise. The model is a flexible one.

For Further Reading:

Is the Recession Good for SaaS?
The InformationWeek Blog, May 2, 2008
I heard opposing voices at Interop about whether the bad economy will drive companies to software as a service. IT budgets can be precarious in the best of times, and the current economic recession will affect IT’s ability to spend.

McKinsey Surveys the New Software Landscape
Rough Type, April 29, 2008
A new study, to be released today by McKinsey & Company, reveals in some of the clearest terms yet the sea change that is under way in business software. The consulting firm surveyed more than 850 corporate software buyers, from firms of all sizes, and found that software-as-a-service is rapidly “becoming mainstream,” with three-quarters of software buyers saying they are “favorably disposed to adopting SaaS platforms” for software development and deployment.

Are You Ready for SaaS?
Destination CRM, February 1, 2008
Software-as-a-service (SaaS) seems to be everywhere lately, especially in the CRM space. For small and midsize businesses (SMBs), SaaS has been a popular option for a number of years; now larger enterprises are beginning to adopt it in meaningful numbers. SaaS, it can be said, is ready for you, no matter who you are. But are you ready for SaaS?

Web 2.0: Going, Going, Strong
https://avweekly.wordpress.com/2008/05/08/web-20-going-going-strong/
Thu, 08 May 2008 15:10:24 +0000

Forrester Research defines Web 2.0 as “A set of technologies and applications that enable efficient interaction among people, content, and data in support of collectively fostering new businesses, technology offerings, and social structures.” According to Forrester’s research and blogs, everything is rosy in the Web 2.0 world. Recession or not, the space should continue to heat up for at least the next five years.

Forrester’s Josh Bernoff, writing on the Groundswell blog, differentiates the current economic stage from the one seen at the end of the ‘tech boom’: “This time, the precipitating event is a housing bubble, and technology spending is not irrational.” The distinction may not even be necessary, though: advertising expenditures in the Web 2.0 environment would likely increase regardless of the economic situation. Bernoff and Forrester reported last month that 72 percent of interactive marketers interviewed expect to “keep their interactive spend on plan or increase it in a recession.”

The increase in spending will be distributed through a number of channels, says Forrester: “Interactive marketers are most likely to increase investment in social networks, with 48 percent planning an increase and another 34 percent keeping investment steady. Marketers are also bullish on user-generated content and blogs. Podcasts, RSS, and widgets, while less popular, will still generate increased investment from at least 20 percent of interactive marketers.” That three-quarters of interactive marketers have expressed these plans is a testament to the staying power of Web 2.0, all the more so because they are doing it despite pushback from within their organizations. In closing, Forrester notes that “60 percent of marketers say that they struggle to build the case for interactive marketing in their organization”; however, “it is finally clear that interactive is not an experiment that will go away in a few years.”

Three primary factors stand in the way of enterprise Web 2.0 adoption. First, enterprises rely on IT departments for implementation, and since “70 percent of the average IT budget goes to maintaining past investments,” the challenge is obvious. Second, the assumption, based on consumer-side products, is that cost will be minimal. (Though not necessarily at the high end of spending, enterprise Web 2.0 products cost substantially more than the free versions which generated interest in the first place.) Finally, “Web 2.0 tools enter a crowded landscape of legacy software investment,” a “space full of legacy software and processes that are difficult to displace and with which Web 2.0 software must integrate to be fully effective.” These factors will all be overcome.

Forrester predicts that spending on Web 2.0 in the enterprise will skyrocket: “the collected expenditure on social networking, RSS, wikis, blogs, mashups, podcasting, and widgets will grow at a compound annual rate of 43 percent over the next five years.” By 2013 annual spending on enterprise Web 2.0 will reach $4.6 billion. As with the increased spending in interactive marketing, spending within the enterprise “will not accrue evenly across all product categories, geographies, or implementation types.”
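Forrester's arithmetic is easy to check: a 43 percent compound annual growth rate sustained for five years multiplies spending by 1.43 to the fifth power, roughly a factor of six. The sketch below back-derives the implied base-year spend from the $4.6 billion 2013 figure; that base figure is an inference here, not a number taken from the Forrester report.

```python
# Sanity check on Forrester's forecast: 43% compound annual growth
# over the five years to 2013 multiplies spending by 1.43 ** 5.
cagr = 0.43
years = 5
growth_factor = (1 + cagr) ** years         # ~5.98x over five years

target_2013 = 4.6e9                         # $4.6 billion by 2013
implied_base = target_2013 / growth_factor  # implied starting spend

print(f"five-year growth factor: {growth_factor:.2f}x")
print(f"implied base-year spend: ${implied_base / 1e6:.0f} million")
```

A base of roughly three-quarters of a billion dollars growing sixfold to $4.6 billion is consistent with the "skyrocket" characterization.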

Social networking remains at the top of the list of technologies receiving an influx of capital. Forrester projects companies will spend $258 million in this space in the current year and that this growth will continue. Of Web 2.0 technologies in the enterprise, social networking will experience the strongest annual growth rate over the next five years; by 2013 spending will reach almost $2 billion. Mashup technologies will make a strong second-place showing, growing from only $39 million in 2007 to $682 million in 2013.

Enterprise spending on Web 2.0 is now focused on internal audiences and projects; over the next five years this will change. “Forrester forecasts external enterprise Web 2.0 expenditure to pass internal expenditure in 2009 and ultimately dwarf internal expenditure by nearly a billion dollars in 2013.” There will also be a shift in the geography of spending. Though North America currently accounts for 62 percent of enterprise Web 2.0 spending, by 2009 it will account for only 40 percent.

Though enterprise Web 2.0 is not going away, it may become invisible. Forrester notes: “it will eventually disappear into the fabric of the enterprise, despite the major impacts the technology will have on how businesses market their products and optimize their workforces.” Perhaps it will then be time for the next generation.

Further Reading:

Social Network Spending to Increase
Jeremiah Owyang/Web Strategy by Jeremiah
Social Networks continue to show a strong future of growth. Two recent Forrester reports published by my colleagues, Josh Bernoff and Oliver Young, both showing the future of social computing for the interactive marketer and for enterprise 2.0 purchasing. A very obvious trend for both of these reports is the growth of budgets by marketers and companies for social networks.

Social Media’s Future Looks Bright, Apply Sunscreen
CNET, May 1, 2008
Social media is in the spotlight because from a consumer perspective, it’s causing a shift in how people spend their time online and how they relate to media. All those involved, from advertisers to entrepreneurs to major media companies, are trying to figure out what it means to their business and how they should take advantage of it. What’s more, many social-media companies are still figuring out how to turn a profit.

Web 2.0 Means Business
Signal, May 2008
Social networking and other Web 2.0 capabilities are creating new avenues for commerce by facilitating communication inside the corporate structure and extending collaboration beyond company walls. Key to making the most out of new technology, however, is determining corporate goals before throwing a new tool into the mix. When chosen and applied judiciously, nearly every Web 2.0 weapon—from del.icio.us to wikis—can play meaningful and profitable roles within any company.

The Way of the Widget in the Age of the Social Web
Java Developers Journal, May 1, 2008
As the Internet’s newest way to connect brands with consumers, widgets have officially arrived. These portable applets appear on blogs, websites, and social networking sites like MySpace and Facebook. Offered by third-party developers as embedded Flash (.swf) objects, the self-contained badges allow page owners to personalize their sites with photo slide shows, music playlists, games, and other content. Widgets also allow companies to engage their audience with compelling content while also branding a company and/or product.

Social Technology Marketers Bullish in Face of Recession
Josh Bernoff/Groundswell, April 30, 2008
In February we published research based on our expectation that interactive marketers should continue their investments in social applications with a recession potentially coming. Today we published the results of new research that shows that many interactive marketers actually plan increases in the face of recession.

Why Social Applications Will Thrive in a Recession
Josh Bernoff/Groundswell, February 6, 2008
Is a recession coming? Don’t ask me — I’m not an economist, and even the economists don’t really know. But if it’s anything like the last recession, advertising will plummet and experimental media will crater. (In the 2001 recession, US advertising dropped 9% and Internet advertising plummeted 27%, according to Veronis Suhler Stevenson.) But do not panic. Things are different this time.

IC3’s Internet Crime Report
https://avweekly.wordpress.com/2008/04/25/ic3s-internet-crime-report/

Between January 1st and December 31st, 2007, the IC3 processed 219,553 complaints: 206,884 complaints were received via its website (a drop of 0.3 percent from the previous year); the remaining 12,669 were referrals from other agencies. Though the IC3 tracks numerous types of complaints, those which come via the website typically “do not represent dollar loss but provide a picture of the types of scams that are emerging via the Internet. These complaints in large part are comprised of fraud involving reshipping, counterfeit checks, phishing, etc.” 2007 was the second year in a row that the number of complaints to the IC3 website dropped; 231,493 complaints were received in 2005 and 207,492 in 2006.

On the other hand, the “vast majority” of referred complaints “alleged fraud and involved a financial loss on the part of the complainant.” Last year the total dollar loss due to fraud from referred complaints reached $239.09 million; $198.44 million was reported in 2006 and $183.12 million in 2005. The number of referred complaints was up again last year, to 90,008, but was still well below the spike of 2004, when 103,959 cases were referred. It is interesting to note that the “Yearly Dollar Loss” has risen steadily over the same period in which complaints to the website shrank. In fact, if 2004 is overlooked, the dollar loss has increased every year since the IC3 began its tracking. Total dollar losses are relatively easy to report, but the IC3 is cautious, providing both mean and median numbers when reporting the average monetary loss incurred by individual complainants: “Of those complaints with a reported monetary loss, the mean dollar loss was $2,529.90 and the median was $680.00.” More than half of reported losses were under $1,000 and almost a third were between $1,000 and $5,000. Small percentages reported losses between $5,000 and $10,000 and between $10,000 and $100,000, 6.5% and 5.3% respectively.
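The wide gap between the IC3's mean ($2,529.90) and median ($680.00) is exactly what a heavily skewed loss distribution produces: a handful of very large losses pull the mean far above what the typical complainant experienced, which is why the median is the more representative figure. A small illustration with invented loss amounts (not IC3 data):

```python
import statistics

# Hypothetical complaint losses in dollars: many small losses
# plus one large outlier (illustrative values only, not IC3 data).
losses = [150, 300, 450, 600, 680, 700, 900, 1200, 4500, 25000]

mean_loss = statistics.mean(losses)      # dragged up by the $25,000 case
median_loss = statistics.median(losses)  # the typical complainant's loss

print(f"mean:   ${mean_loss:,.2f}")    # $3,448.00
print(f"median: ${median_loss:,.2f}")  # $690.00
```

Here one outlier inflates the mean to five times the median, mirroring the shape of the IC3's reported figures.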

Though the IC3 does categorize complaints and offer statistics, it also urges caution here, suggesting that the numbers may be skewed. The “perception of consumers and how they characterize their particular victimization within a broad range of complaint categories,” as well as the fact that many key Internet stakeholders have provided their customers with links to the IC3 website, may produce a misleading picture. That said, as far as what was complained about in 2007, via both the IC3 website and referrals, over 60 percent falls into two categories: Auction Fraud (35.7 percent) and Non-Delivery (24.9 percent). (Interestingly, Auction Fraud fell by 20.5 percent from 2006 while Non-Delivery Fraud rose by 31.1 percent.) After those two there is a significant drop: Confidence Fraud, the number three category, garnered a mere 6.7 percent of total complaints, and pulling up the rear was Nigerian Letter Fraud with 1.1 percent. A slightly different picture emerges if the data is viewed another way: according to how much financial loss each category generated.

Auction Fraud and Non-Delivery Fraud were again at the top of the list, with 22.4 percent and 17.8 percent respectively; they ranked first and second in percent of total loss reported. However, they ranked only fifth and sixth in median loss per complaint: $483.95 and $466. On that list both fell behind Nigerian Letter Fraud, which, at number three, claimed a median loss of $1,922.99. The top culprits were Investment Fraud, at $3,547.94, and Check Fraud, at $3,000.

In addition to reporting the types of complaints and their financial impact, the IC3 also provides insights into the demographics of fraud perpetrators. The United States was home to the most perpetrators, and a few states stood out. Half of the perpetrators lived in one of seven states, which are among the most populous; controlling for population generates another list, and only Florida and New York made both. Outside the U.S., the next five homes for perpetrators were the United Kingdom, Nigeria, Canada, Romania, and Italy. Over 75 percent of perpetrators were male.

Some IT Numbers
https://avweekly.wordpress.com/2008/04/10/some-it-numbers-2/
Thu, 10 Apr 2008 18:21:27 +0000

The Robert Half Technology IT Hiring Index and Skills Report, released last month, surveyed 1,400 CIOs from U.S. companies with 100 or more employees. Of those, 82 percent expect to maintain current information technology staff levels in the second quarter of 2008, 14 percent expect to add to staffing levels, and 2 percent foresee reductions. Similar numbers are easily found; that the IT sector will grow is likely to seem obvious. Not so obvious is the difficulty IT departments may have in fueling that growth.

Deloitte’s Competing for Talent, also released last month, notes that most of the 150 technology and telecommunications companies surveyed expect workforce growth in the coming years: two thirds expect growth of six percent in the next 12 months, while only 6 percent expect a workforce decline. At least 67 percent of respondents believe the required talent is “readily available in the marketplace.” It would seem, then, that there should be little concern about meeting staffing needs. However, 71 percent of respondents identified attracting, developing, and retaining talent as the “most critical people/talent issue” facing their company. A paradox is apparent.

According to Deloitte, the paradox arises because “the majority of technology and telecommunications companies continue to rely on financial incentives and other traditional approaches for luring and retaining talent. These outmoded techniques might work for awhile, but they don’t address the long-term problem.” What workers want is to work “on their terms.” Deloitte is careful to note that the workers referred to include Gen Yers and Gen Xers as well as retirees re-entering the workforce.

While many companies are moving in the right direction, “most still have a long way to go to meet the needs of today’s workers.” And figuring out the right approach is vital to companies’ success: “According to the survey, respondents that fail to address their talent management challenges over the next three years will feel the pain where it really hurts: in limited growth, increased time to market, reduced innovation, damage to customer relationships, and more.”

IT staffing challenges are not exclusive to North America; Harvey Nash and PA Consulting Group report data from the UK. According to the results of the 2007/08 Strategic Insights Survey, “In the next 12 months, more than half of all the technology leaders surveyed in this report will have moved jobs. With 23 percent already in their jobs for less than 12 months and a further 34 percent planning to move within the next 12 months, we could be about to witness one of the most dramatic migrations of technology leaders in recent times.” The report puts a somewhat positive spin on this, stating that “the good news is that the most talented and ambitious individuals are unsettled and looking for the next exciting challenge.” However, as in North America, companies must find ways to keep employees or face the costs. The task might be somewhat easier for North American companies.

Unlike their North American counterparts, who merely want to work on their own terms, UK technology leaders seem broadly content: 78 percent of respondents said they felt at least ‘fulfilled’ in their current position. Yet of the 30 percent of respondents claiming to be ‘very fulfilled’, 35 percent still plan on leaving their position within the next 24 months. Harvey Nash states that this “reaffirms that technology leaders see the process of moving jobs regularly simply as a way to advance their career.” This type of regular migration could be more difficult to stem.

Interesting to read alongside the Nash and Half reports is the Computing Research Association’s (CRA) March bulletin. The bulletin summarizes data from the forthcoming Taulbee Survey, “the principal source of information on the enrollment, production, and employment of Ph.D.s in computer science and computer engineering and in providing salary and demographic data for faculty in CS & CE in North America.” The data indicates, to an extent, what might happen to the future size of the IT applicant pool: will there be enough talent available to fuel the needed growth in IT?

Though information on graduate degrees will not be released until May, undergraduate data has been released early, and the results are mixed. According to HERI/UCLA data cited in the bulletin, the percentage of incoming undergraduates among all degree-granting institutions who indicated they would major in CS declined by 70 percent between fall 2000 and 2005. The number of students declaring a CS major among the Ph.D.-granting departments surveyed by CRA also fell: after seven years of declines, the number of new CS majors in fall 2007 was half of what it was in fall 2000 (15,958 versus 7,915). There are, however, signs that interest is stabilizing: the number of new majors was flat in 2006 and increased slightly in 2007.

IT departments should hope the number of applicants grows. A larger pool means no shortage of supply and, as workers compete for positions, favors employers in setting the terms of employment. With numbers stable at best, however, and graduation years away, IT departments still face the immediate challenge of meeting the needs of their current workers. And the terms and methods used to meet that challenge could easily become the standard for the next generation of workers, regardless of its size.

Tags: Harvey Nash, Robert Half, CRA

February Search Engine Rankings
By dmartel
https://avweekly.wordpress.com/2008/04/02/february-search-engine-rankings/
Wed, 02 Apr 2008 17:15:01 +0000

comScore released its search engine rankings for February. Google tops the list with a 59.2 percent share, followed by Yahoo! with 21.6 percent, Microsoft with 9.6 percent, AOL with 4.9 percent, and Ask.com with 4.6 percent. Also noted is a drop in search activity: February’s 9.9 billion searches at the core search engines represent a 6 percent decline from January.
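
The January baseline is not stated, but it can be backed out of the 6 percent drop (a rough estimate; comScore’s actual January figure may differ because of rounding):

```python
# comScore figures for core search engines, February 2008.
feb_searches = 9.9e9
drop = 0.06  # 6 percent decline from January

# Back out the implied January volume.
jan_searches = feb_searches / (1 - drop)
print(f"{jan_searches / 1e9:.1f} billion")  # roughly 10.5 billion

# The reported shares nearly sum to 100; rounding accounts for the remainder.
shares = {"Google": 59.2, "Yahoo!": 21.6, "Microsoft": 9.6, "AOL": 4.9, "Ask.com": 4.6}
print(round(sum(shares.values()), 1))
```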
The Digital Universe
By dmartel
https://avweekly.wordpress.com/2008/03/27/the-digital-universe/
Thu, 27 Mar 2008 17:38:03 +0000

A year ago this month, IDC produced a white paper for EMC entitled “The Expanding Digital Universe: A Forecast of Worldwide Information Growth through 2010”; this month the forecast was revised. The digital universe that a year ago was merely expanding is now “diverse and exploding.” IDC puts the size of the digital universe last year at 281 exabytes, or 281 billion gigabytes, and reports that “by 2011, the digital universe will be 10 times the size it was in 2006.” From another perspective: “The number of digital ‘atoms’ in the digital universe is already bigger than the number of stars in the universe.”
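
A quick sanity check on what “10 times the size by 2011” implies annually; the 10x figure is IDC’s, and the annualization below is just back-of-the-envelope arithmetic:

```python
# IDC: by 2011 the digital universe will be 10x its 2006 size.
growth_factor = 10
years = 2011 - 2006  # five-year span

# Implied compound annual growth rate.
cagr = growth_factor ** (1 / years) - 1
print(f"{cagr:.1%}")  # about 58.5% per year
```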

Largely accounting for the increase is the growing amount of visual data: still and video images, and their increasing resolution. Though the conversion from film to digital is almost complete for still images, that may serve only to drive the creation of more images (whose resolution will continue to rise). In the surveillance world, now responsible for a growing share of bytes, the conversion from analog is progressing rapidly from its infancy: “Most cameras are still analog. But shipments of networked digital cameras are doubling every year.” Generation of visual data is likely to continue.

Of course, where there is data there is a desire to store it, and IDC points out the symbiotic relationship between storage and content: “Cheaper storage allows us to take high-resolution photos on our cell phones, which in turn drives demand for more storage. Higher-capacity drives allow us to replicate more information, which drives growth of content.” With storage capacity rising and prices dropping, the math is pretty simple. But dropping prices and rising capacity cannot keep up with data creation: 2007 marked the first time that “the amount of information created, captured, or replicated exceeded available storage.” The divergence is predicted to continue: “by 2011, almost half of the digital universe will not have a permanent home.”

As the size of the digital universe increases, its composition is becoming more varied. The digital universe is now comprised of “images, video clips, TV shows, songs, voice packets, financial records, documents, sensor signals, emails, text messages, RFID tag transmissions, barcode scans, X-rays, satellite images, toll booth transponder pings, and the notes of ‘Happy Birthday’ coming from singing greeting cards.” The size of the constituent parts varies as well: “An archived digital movie master kept at the National Academy of Arts and Sciences might be a terabyte. A DVD might be 5 gigabytes. An email a few kilobytes. An RFID signal only 128 bits.” Given that variation, it is interesting to note how the pieces add up to make the whole. “The tiny signals from sensors and RFID tags and the voice packets that make up less than 6 percent of the digital universe by gigabyte,” states IDC, “account for more than 99 percent of the ‘units,’ information ‘containers,’ or ‘files’ in it.”
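
The observation that items negligible by volume can dominate by count follows directly from these unit sizes. A sketch using the example sizes quoted above, with a purely invented mix of item counts to show the effect:

```python
# Example item sizes drawn from the report (in bytes).
MOVIE_MASTER = 10**12      # ~1 terabyte archived movie master
DVD = 5 * 10**9            # ~5 gigabyte DVD
EMAIL = 4 * 10**3          # "a few kilobytes"
RFID_SIGNAL = 128 // 8     # 128 bits = 16 bytes

# Hypothetical population: a handful of large items, a flood of tiny ones.
counts = {
    MOVIE_MASTER: 10,
    DVD: 1_000,
    EMAIL: 5_000_000,
    RFID_SIGNAL: 1_000_000_000,
}

total_bytes = sum(size * n for size, n in counts.items())
total_count = sum(counts.values())

# The tiny items (emails and RFID signals) barely register by volume
# yet account for essentially all of the "files."
tiny_bytes = EMAIL * counts[EMAIL] + RFID_SIGNAL * counts[RFID_SIGNAL]
tiny_count = counts[EMAIL] + counts[RFID_SIGNAL]

print(f"tiny items: {tiny_bytes / total_bytes:.2%} of bytes, "
      f"{tiny_count / total_count:.4%} of items")
```

With this mix the tiny items are well under one percent of the bytes but over 99.99 percent of the items, the same shape as IDC’s 6-percent-of-gigabytes versus 99-percent-of-units finding.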

IDC reiterates a critical dilemma from last year’s report: “While 70 percent or more of the digital universe is created, captured, or replicated by individuals — consumers and desk and information workers toiling far away from the datacenter — enterprises, at some point in time, have responsibility or liability for 85 percent.” This has implications for those in enterprise IT positions. Policies will need to address every phase of the data lifecycle: How is data created? Is it worth archiving? How is it to be stored, searched, and retrieved? How is it protected? These questions are on the table, and they will demand more attention than previously necessary.

Toward the end, the report makes an interesting aside: “Of that portion of the digital universe created by individuals, less than half can be accounted for by user activities — pictures taken, phone calls made, emails sent — while the rest constitutes a digital ‘shadow’ — surveillance photos, Web search histories, financial transaction journals, mailing lists, and so on.”

It appears that Big Brother is watching while the universe explodes.

Adobe’s AIR
By dmartel
https://avweekly.wordpress.com/2008/03/13/adobes-air/
Thu, 13 Mar 2008 17:32:17 +0000

After a beta that began last June, Adobe released its Adobe Integrated Runtime (AIR) on February 25. AIR is one of a fresh batch of technologies promising to bridge the gap between desktop and web, and it is already in use by eBay, SalesForce.com, the New York Times, and a host of developers. At the moment AIR is available for Windows and Mac; a Linux release is expected later this year, and hopefuls anticipate a mobile version as well. Like Adobe’s nearly ubiquitous Flash, AIR is free.

Adobe’s AIR did not come out of the void. Its roots lie in a trend set in motion years ago with the first strides toward cloud computing. Kevin Lynch, Adobe’s chief software architect, says the idea for AIR was spawned by the benefits of cloud computing: continual access, platform independence, version currency. However, as the cloud computing movement moves ahead, with Microsoft, Google, SalesForce.com, and others continuing development, AIR has already gone beyond that paradigm.

InformationWeek describes AIR this way: “AIR, a cross-operating system platform that was code-named Apollo, attempts to bridge the gap between the Web and the desktop by allowing developers to create Internet-connected applications that aren’t restricted by the form and functionality of Web browsers.” Traditional cloud computing technologies, such as Google Docs, are interested in moving offline applications online; they focus on the online/offline dichotomy. While AIR fully allows for this, it is more interested in dissolving the desktop/web dichotomy. Using AIR, developers can create Rich Internet Applications (RIAs) that connect to the Web but are not restricted to it, and that can furthermore interact directly with the local machine.

Apart from being a catalyst for AIR and creating, according to Lynch, a “tidal shift in how people are actually creating software,” cloud computing has also been noted as a challenge to the reign of Microsoft Office. If free alternatives such as Google Docs exist, the need to pay Redmond for the ability to write is removed. Cloud computing may be the tip of the iceberg, but Microsoft has yet to feel a real hit. The next wave could be an AIR-based attack, and it could be substantial, as it strikes deeper than the applications themselves. The impact on Microsoft and its Office suite could be merely the byproduct of a paradigm shift.

AIR could pose a serious threat for a number of reasons. For one, it is platform independent, which means developers can write applications for Windows, Mac, and Linux clients at the same time; users will not be bound to an OS to gain access to certain tools. Because writing for AIR is far less complex than writing desktop applications, developers can devote more time to the product; development time and the barrier to entry both drop significantly. With AIR, the operating system itself becomes of little relevance. And there is the source, Adobe, to keep in mind.

PC World notes, “Indeed, Adobe, particularly with the acquisition of Macromedia in 2005, has been successful at building a comprehensive set of tools that developers use primarily to deliver multimedia and high-impact, customer-facing Web sites and Web-based applications. Barring Microsoft, the company really has no major rival in this space.” Al Hilwa, program director at IDC, concurs: “Adobe has been focused on improving the Web experience and delivering the underlying technologies to produce more interactive and expressive Web sites and applications, and the Adobe technology platform for RIAs hits right at a key need companies have today.”

Software developer Hank Williams is even a bit more assertive. He writes, “Adobe’s strategy is a death stroke to Windows as a strategic monopolistic platform. And Adobe as a software company with revenues north of three billion dollars has the muscle, the development community, and the momentum to fight this battle. They will not be ‘Netscaped.'”