There are few phenomena as ubiquitous or as vital to human existence as electromagnetic radiation (EMR). It permeates everything we experience, from the visible light illuminating all we see to the broadcast media carried across the globe by radio waves. In medicine, X-rays and gamma rays have revolutionized both anatomical imaging and the treatment of cancer. In the era of wireless communication, our phones and routers exploit microwave radiation to convey virtually the entire repository of human knowledge to our fingertips in an instant.

But while EMR is an inescapable part of our universe, many worry about potential detrimental effects. In particular, the proliferation of personal communication devices has been a source of concern. A vocal cohort claim to suffer from a condition called electromagnetic hypersensitivity (EHS or ES), with symptoms ranging from fatigue and sleep disturbance to generic pains and skin conditions. Others fixate on the idea that our increasingly wireless offices and homes might increase our cancer risk. Such narratives are common and understandably disturbing. But should we be concerned?

To answer that question, it’s important to clarify a few potential sources of confusion. Radiation itself is a deeply misunderstood term, frequently conjuring worrying associations with radioactivity in the public consciousness. But radiation simply refers to the transmission of energy through space or a medium. In the context of EMR, this means radiant energy released by an electromagnetic process. This energy moves at the speed of light and is characterised by its wavelength and frequency. The electromagnetic spectrum is the range of all possible frequencies of EMR, where energy is proportional to frequency. While we only see a tiny portion of the spectrum in the form of visible light, we can think of the whole spectrum as a range of light particles (photons) with different energies. Some of these have sufficient energy to eject electrons from an atom or smash apart chemical bonds, which renders them capable of causing DNA damage. This is known as ionizing radiation, and this ionizing potential is exploited when X-rays are harnessed to kill tumour cells in radiotherapy.

This fact can make people uneasy – if light can be used to destroy cells, could our heavy usage of wireless communications perhaps induce this kind of DNA damage and ultimately lead to cancer? This is a reasonable question, but we have to keep in mind how unbelievably vast the electromagnetic spectrum truly is. Modern communications, from our Wi-Fi networks to phones, sit firmly at the microwave end of the scale, with frequencies between 300 MHz and 300 GHz. On the scale of the EM spectrum, these photons are of relatively low frequency and low energy. To put this in perspective, even the lowest energy visible light (wavelength ~700nm) still carries roughly 1430 times the energy of the most energetic microwave photon (wavelength 0.1cm). Microwave radiation is undisputedly non-ionizing, and completely incapable of direct DNA damage.
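That 1430-fold gap follows directly from the relation between photon energy and wavelength, E = hc/λ. A quick back-of-envelope check (using standard values for Planck’s constant and the speed of light):

```python
# Photon energy E = h*c/wavelength, comparing the weakest visible photon
# with the strongest microwave photon mentioned above.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy of a single photon in joules."""
    return h * c / wavelength_m

red_light = photon_energy(700e-9)  # lowest-energy visible light, ~700 nm
microwave = photon_energy(1e-3)    # most energetic microwave, 0.1 cm

print(f"Red light photon: {red_light:.3e} J")
print(f"Microwave photon: {microwave:.3e} J")
print(f"Ratio: {red_light / microwave:.0f}")  # ~1429
```

Because energy is inversely proportional to wavelength, the ratio is just 0.1 cm divided by 700 nm – and even the “winner” here, red light, is itself far below the ionization threshold.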

In spite of their low energy, microwaves are remarkably effective at heating certain substances through a process known as dielectric heating. Polar molecules, like water, have regions of partial positive and negative charge, and in the presence of an electric field they rotate to align themselves with the field. Domestic microwave ovens emit photons with a frequency of approximately 2.45 GHz, meaning the electric field oscillates 2.45 billion times a second, causing these polar molecules to rapidly bump off each other as they try to keep up with the ever-changing field. The energy of these rapid collisions is converted to heat, which is precisely why microwaves are so efficient at cooking our predominantly water-based food. This is unfortunately ripe for confusion; a plethora of blogs and dubious websites assert that microwave-cooked food is harmful by dint of being exposed to radiation. But this is wrong-headed: microwaves are not radioactive and do not “irradiate” food – they merely agitate its water molecules to heat it.
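To get a feel for the numbers involved, the wavelength and oscillation period of a 2.45 GHz field can be computed directly from the frequency (a simple sketch, no assumptions beyond the oven frequency quoted above):

```python
# Wavelength and period of the oscillating field in a domestic microwave oven.
c = 2.998e8  # speed of light, m/s
f = 2.45e9   # typical domestic oven frequency, Hz

wavelength = c / f  # ~0.12 m, i.e. roughly 12 cm
period = 1 / f      # duration of one full field oscillation

print(f"Wavelength: {wavelength * 100:.1f} cm")
print(f"Oscillation period: {period * 1e9:.2f} ns")
```

One oscillation every ~0.4 nanoseconds is what keeps the water molecules jostling too quickly for them to ever settle into alignment.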

Other lines of dubious reasoning rely on misguided extrapolation: if microwave ovens can cook meat, then our Wi-Fi routers and cell phones must be cooking us too. But while thermal effects are certainly possible with microwave radiation, the power output of our communication technology is many orders of magnitude below that of ovens, with typical home routers outputting less than 100mW. On top of this, ovens are designed to concentrate high-power microwave radiation using specially designed waveguides, magnetrons and reflective chambers, a situation neither encountered nor desirable in our conventional communication technology. It’s important too to note that the intensity of an approximately spherical source of electromagnetic radiation falls off with the square of the distance. For example, the field intensity a metre from an EM source will be 4 times greater than the intensity 2 metres away, and 9 times greater than a measurement taken 3 metres from the source. In practice, this means the strength of an EM source diminishes enormously even over modest distances.

Of course, our cell phones by definition come into very close contact with our heads, and so avoiding thermal ill-effects is a major consideration. The heat energy absorbed by tissue exposed to an EM field is given by the specific absorption rate (SAR). In the European Union, exposure to EM fields is tightly regulated to a maximum SAR of 2W per kilogram, averaged over the 10g of tissue receiving the most direct heating, to circumvent thermal effects. Importantly, dielectric heating only raises tissue temperature and will not by itself cause any damage to DNA, so SAR should not be taken as a proxy for cancer risk. To date, there is no evidence that mobile phone usage increases cancer risk – the World Health Organisation states that “no adverse health effects have been established as being caused by mobile phone use”. Even long-term studies of radar workers, despite their exceptionally high levels of exposure to microwave radiation, show no signs of increased lifetime cancer incidence.
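To see how modest the regulated limit is, we can do a rough worst-case calculation. This sketch assumes a specific heat for soft tissue of about 3500 J/(kg·K) – a typical textbook figure, not a value from the regulations – and deliberately ignores blood flow, which in reality carries heat away:

```python
# Back-of-envelope heating at the EU SAR limit, ignoring all cooling.
sar_limit = 2.0            # EU limit, W per kg of tissue
mass_kg = 0.010            # the 10 g averaging volume
specific_heat = 3500.0     # assumed soft-tissue value, J/(kg*K)

max_power = sar_limit * mass_kg            # power absorbed in the hottest 10 g
rise_per_sec = sar_limit / specific_heat   # temperature rise rate, K/s
rise_per_hour = rise_per_sec * 3600

print(f"Max absorbed power in the hottest 10 g: {max_power * 1000:.0f} mW")
print(f"Heating rate with zero cooling: ~{rise_per_hour:.1f} K per hour")
```

Even in this unphysical no-cooling scenario, a phone running flat-out at the legal limit warms tissue by only a couple of degrees per hour; with blood perfusion, the real steady-state rise is a small fraction of that.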

Even so, with the huge uptake in phone usage over the last two decades, it’s pragmatic to keep a cautious eye on emergent trends. The 13-country INTERPHONE study concluded there was no apparent causal relationship between phone use and the rates of common brain tumours such as glioblastoma and meningioma. The dose-response curve betrayed no obvious signs of correlation – in some instances a decrease in risk was even seen – with the possible exception of the heaviest users, where biases in the data made it impossible to ascertain any solid relationship.

Similarly, a Danish cohort study did not reveal any obvious link between phone usage and tumour rates. American cell phone use increased from almost nothing in 1992 to practically 100% by 2008, yet studies thus far have indicated that glioma rates showed no apparent increase. This result has been replicated by numerous other studies, and while constant monitoring is laudable, the evidence to date certainly doesn’t support the hypothesis that cell phone usage results in increased cancer risk.

And what of those who report electromagnetic hypersensitivity? Despite the sincerity of their beliefs and the discomfort they experience, the inescapable reality is that there is zero evidence supporting their position. In provocation trials, sufferers have been completely unable to identify when sources of EMR are present, and have reported negative effects even when exposed to sham EM sources. These results have been replicated in a number of trials, strongly suggesting that the illness sufferers feel is psychological rather than physical, and that for some the mere belief that one is allergic to EM radiation is enough to trigger an unpleasant psychosomatic reaction.

Those struggling with EHS appear to be victims not of electromagnetic malaise but of a psychological quirk known as the nocebo response. The more familiar placebo effect is the observation that people given an inactive treatment tend to rate themselves as improving, provided they are unaware the treatment is inert. Less well known is its converse, the nocebo effect: if subjects truly believe something to be harmful, they tend to report an adverse reaction when confronted with it – even when the source is a sham. The WHO summation, while sympathetic, is unequivocally clear: “The symptoms are certainly real and can vary widely in their severity. Whatever its cause, EHS can be a disabling problem for the affected individual. EHS has no clear diagnostic criteria and there is no scientific basis to link EHS symptoms to EMR exposure.”

While it might be tempting to dismiss EHS as a fake illness, it is vital to recognise that sufferers experience a very real discomfort. The fact that their illness appears to be psychosomatic rather than physiological in origin does not make it any less real to the afflicted, even if they are mistaken about the cause of their woes. The harrowing complexity of this has recently been portrayed sensitively in Better Call Saul, where the protagonist’s brother is severely afflicted by EHS yet remains convinced his illness is physical rather than psychosomatic, even when confronted with evidence to the contrary. That EHS sufferers might benefit more from psychological intervention than physical approaches does not detract from their evident pain.

As always, we have to be wary, and be guided by the best evidence rather than by panic. Most EMR is invisible and inescapable, and apprehension over what we cannot see is completely understandable. But if we are to make informed decisions about health and technology, misplaced fear of the unknown and dogmatic conviction are simply no substitute for evidence and understanding.

The headline on this piece was changed on 18 February to more accurately reflect the article.