A Rational Discussion of Cell Phones and Towers

The International Agency for Research on Cancer (IARC), a division of the World Health Organization (WHO), has classified cell phones as a possible carcinogen. The new classification has produced a number of alarming news headlines, but it needs to be put into context. The IARC has not done any new studies; it reviewed a number of existing scientific studies on cell phones and found inconclusive evidence that warrants some caution. The IARC did not classify cell phones as probable or known carcinogens.

The IARC defines "Group 1" as known carcinogens, "Group 2A" as probable carcinogens, and "Group 2B" as possible carcinogens. A full list of substances and their group classifications can be found here, and yesterday's announcement put cell phones in Group 2B. But to put this classification into proper context, we need to look at some other common items that are equally or more carcinogenic.

IARC Group 1 – Known carcinogens

Tobacco. After decades of education, everyone knows that tobacco is carcinogenic.

Sunlight (especially 10 AM to 2 PM). Ordinary sunlight includes ultraviolet radiation that is high enough in energy to alter human DNA. In 2010, this resulted in 1 million new cases of skin cancer in the United States. But as with so many things in human health, dangerous substances can be healthy or even essential in small doses: ultraviolet radiation is essential to vitamin D synthesis in the human body.

Wood dust. Woodworkers and others who often work with wood may not be aware that wood dust is a known carcinogen.

Mineral oils. This one is probably more surprising, as mineral oils are used in baby lotions, cold creams, ointments, cosmetics, and food packaging.

IARC Group 2A – Probable carcinogens

Diesel engine exhaust

Art glass

Emissions from high-temperature frying (cooking)

IARC Group 2B – Possible carcinogens

Cell phones. This is the newest addition to the classification, but let's see what else falls under it.

Pickled vegetables.

Coffee. It’s starting to sound like the only way to avoid all possible and known carcinogens is to live in a sealed plastic bag. But critics of cell phone safety would prefer to compare cell phones to the next scary sounding substance.

Conflicting studies on cell phone safety

There are a number of conflicting studies: some show no correlation between cell phones and cancer, while others show some correlation. The wireless association (CTIA), the Federal Communications Commission, and the National Cancer Society favor the studies that found no correlation between cell phones and cancer. Groups that are critical of wireless technology favor the studies that found a correlation. The conflicting studies make it very difficult for the public to draw conclusions while the two sides battle it out.

Some, like Joel M. Moskowitz, director of the UC Berkeley Center for Family and Community Health, have argued that the higher-quality studies on cell phones found a correlation with cancer and that the studies which found no risk were of lower quality. But that characterization is unreasonable given that the "high quality" studies which found a cancer correlation used relatively small population samples, while the no-correlation studies used extremely large ones. For example, the study from Lennart Hardell that found a cancer correlation surveyed 209 cases and 425 controls. By comparison, a large study in Denmark in 2006 tracked 420,000 people (including 52,000 who had used cell phones for 10 to 21 years) and found no correlation between cell phones and cancer.

Another large study in 2010, conducted by Interphone (a group drawn from 13 health agencies in the European Union), found no increased risk of cancer with cell phone use. The Interphone study did find one potentially alarming result: tumor patients tended to develop tumors on the same side of the head where they used their cell phones. But this does not contradict Interphone's conclusion of no increased cancer risk. In other words, there was a correlation in which side of the head developed tumors, but no increase in the number of people who developed tumors. From a risk point of view, it doesn't matter which side of the brain develops a tumor. What is significant is that the rate of tumors and cancer isn't higher among cell phone users.

Critics charge that the Interphone study is invalid because it showed some positive health effects, with cell phone users having fewer incidences of cancer (the Danish study also showed this to some extent). They charge that the results are absurd because cell phones can't possibly be beneficial to human health. But this attitude is highly unscientific, because scientists never insist on any particular outcome one way or the other. They may make predictions, but their minds are open to, and even excited by, the possibility that actual data will prove their predictions wrong. Ultraviolet radiation, a known deadly carcinogen, is an example of a dangerous substance that also plays the beneficial and crucial role of enabling vitamin D synthesis. There is also scientific data suggesting beneficial effects of cell phone radio waves in mice: a study in 2010 showed that cell phone exposure can prevent or reverse Alzheimer's disease in mice. While that may not translate to humans, it is an exciting possibility to study and may shed light on mechanisms that could treat or cure Alzheimer's in humans.

How precautions can be counterproductive

Most people take the position that it is better to be safe than sorry, but this attitude often produces the opposite of the desired effect. For example, the same people who worry most about cell phones and possible cancer risk also oppose cell tower construction. Yet the radio signal your body receives from a cell tower is thousands to millions of times weaker than the signal from a cell phone, simply because of proximity. Radio propagation physics dictates that a source 1,000 times closer will be 1,000,000 times stronger, since received power falls off with the square of distance. Even if the cell tower transmits at 100 times the power of a cell phone, its signal at your body will still be 10,000 times weaker if the tower is 1,000 times farther away than the phone.
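The inverse-square relationship this argument rests on is easy to check numerically. Here is a minimal sketch; the transmit powers and distances are illustrative round numbers, not measurements of any real phone or tower:

```python
def received_power(transmit_watts, distance):
    """Free-space inverse-square law: received intensity falls off with
    the square of distance (ignoring antenna gain and obstructions)."""
    return transmit_watts / (distance ** 2)

# A phone at 1 unit of distance from your head versus a tower with
# 100x the transmit power but 1,000x farther away.
phone = received_power(1, 1)
tower = received_power(100, 1000)

# Being 1,000x closer outweighs the tower's 100x transmit power:
# the phone's signal at your body is 10,000x stronger than the tower's.
print(phone / tower)  # -> 10000.0
```

The units cancel out of the ratio, which is why the conclusion holds regardless of the absolute numbers chosen.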

The significance of this is that cell phones operating close to cell towers can transmit at significantly lower power than cell phones operating at a distance or in a location that obstructs radio signals. And because the power level of a nearby cell phone dwarfs that of the cell tower, the priority for anyone who wants to minimize radio frequency exposure should be to reduce cell phone transmit power. The only way to achieve this is to have more cell towers, or better yet, a personal cell tower called a "Femtocell" inside one's own home.

More cell towers mean lower powered cell towers

The lower-cell-phone-power argument hasn't convinced all the cell tower detractors, who argue that some people (mostly children) don't use cell phones and are better off living far away from cell towers. The problem with this logic is that someone who doesn't use a cell phone will invariably live, learn, or work near a cell tower anyway. Having four times as many cell towers means each tower can operate at one quarter the power, not to mention that there will be four times as much call and data capacity. Yet parents often picket to demand that no cell tower be put up anywhere near their school. The result is that some cities, like San Francisco, can boast of not approving a single cell tower for a decade.
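The quarter-power claim follows from the same inverse-square law: quadrupling tower density halves each cell's coverage radius, and reaching a cell edge half as far away takes a quarter of the power. A rough sketch of that reasoning, with an illustrative 100-watt baseline:

```python
import math

def power_per_tower(base_power, tower_multiple):
    """With tower_multiple times as many towers covering the same area,
    each cell's radius shrinks by sqrt(tower_multiple); by the
    inverse-square law, reaching the cell edge then needs only
    1/tower_multiple of the original power."""
    radius_scale = 1 / math.sqrt(tower_multiple)
    return base_power * radius_scale ** 2

# 4x the towers -> each operates at 1/4 the power
# (while total call and data capacity quadruples).
print(power_per_tower(100, 4))  # -> 25.0
```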

The level of irrationality is highlighted by the fact that many of the same parents who oppose cell towers often have Wi-Fi base stations at home, work, and school, which expose them to hundreds of times more radio frequency energy than cell towers do because of their closer proximity. Wi-Fi uses radio frequencies similar to those of cell towers, but there is an even more notable example. TV broadcast towers are vacating the very frequencies (50 to 700 MHz) that will be used by newer LTE 4G cell towers. Those TV towers have been blasting out millions of watts since the 1950s, which is longer than most people have been alive. The LTE 4G towers will typically operate in the sub-100-watt range, yet cell tower obstructionists continue to argue that they have not been proven safe.

The problem is that the radio frequency alarmists are not using science to craft sensible public policy. They demand laws that set minimum distance requirements for cell towers, when the sensible policy is to set maximum permissible radio frequency exposure, which would allow for more distributed, closer-proximity, low-power cell towers. The FCC already sets maximum radio frequency exposure guidelines, but the alarmists have ignored the FCC in favor of irrational policies that force cell phones to operate at their highest transmit power levels.

The FCC has already shown leadership by implementing a "shot clock" that sets a time limit for local governments to approve or deny new cell tower applications; many local governments had set applications on hold for years or even decades to prevent wireless carriers from taking them to court. But the FCC could do more to promote wireless broadband deployment if it issued mandates to local governments. Some local government officials have even asked the FCC to override them so they don't have to take the blame from vocal anti-cell-tower voters. The FCC already prohibits landlords and homeowner associations from banning outdoor TV antennas, and it would be extremely helpful if it put an end to cell tower obstruction as well. Not only would that improve nationwide wireless deployment, it would reduce radio frequency exposure for everyone. Unless we are prepared to revert to a pre-technological era, this is the only policy that makes sense.

Some sensible precautions

Even though early results from science as a whole show that cell phones aren't dangerous, the science isn't done. That means sensible precautions, as opposed to blind obstruction of cell towers, can be wise. The easiest way to eliminate over 99% of cell phone radio wave exposure is to keep the phone a few feet from the head and body by using a wired headset or a wireless Bluetooth headset for any prolonged conversation. Bluetooth headsets use the same kinds of radio waves that cell phones do, but they operate at roughly 1,000 times lower power because they only need to reach about 10 feet, while a cell phone has to transmit out to a few miles to reach the cell tower.

Headsets also let the phone operate away from the human body, where it gets much better reception. Human hands can easily block 20 to 25 dB of signal on the iPhone 4, which means 99% or more of the signal is lost. That blockage also forces the phone to transmit at much higher power to compensate. Headsets have further benefits, such as helping prevent neck problems and car accidents.
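The decibel figures convert to linear percentages as follows. This short sketch simply applies the standard dB-to-power-ratio formula to the 20 to 25 dB attenuation range quoted above:

```python
def fraction_lost(attenuation_db):
    """Convert power attenuation in dB to the fraction of signal lost.

    A dB value maps to a power ratio of 10^(dB/10), so the surviving
    fraction is 10^(-dB/10) and the lost fraction is one minus that.
    """
    return 1 - 10 ** (-attenuation_db / 10)

print(round(fraction_lost(20), 3))  # -> 0.99   (99% of signal lost)
print(round(fraction_lost(25), 4))  # -> 0.9968 (about 99.7% lost)
```

This is why a seemingly modest 3 dB change already means half the power: every 10 dB is a full factor of ten.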

Be aware of the snake oil

Consumers should also beware of shady snake-oil salesmen trying to take advantage of the WHO classification of cell phones as possible carcinogens. I had the misfortune of debating someone selling a special necklace, and a small button to attach to phones, that supposedly eliminate the dangers of cell phones. These devices are also claimed to have magical properties like making food taste better or eliminating "negative thoughts," but it sounds more like they are designed to eliminate some hundred-dollar bills from your wallet. These devices have no basis in science, and if they actually did block radio waves, they would render the cell phone inoperable.

George Ou was a network engineer who built and designed wired networks, wireless networks, Internet, storage, security, and server infrastructure for various Fortune 100 companies. He is also a Certified Information Systems Security Professional (CISSP #109250). He was Technical Director and Editor at Large at ZDNet.com, where he wrote one of its most popular blogs, "Real World IT." In 2008 he became a Senior Analyst at ITIF.org, and he currently writes for High Tech Forum.
