Trivia night!

Lost in Science Trivia - Birmingham Hotel, 20 August, 6.30 pm for a 7.30 start

Are you sick of trivia nights that don’t have enough science questions? Then come to the inaugural LOST IN SCIENCE TRIVIA NIGHT!

Help us raise money to keep 3CR Community Radio (and us) on the air and celebrate National Science Week.

Doors open at 6.30 for pub meals and chatting before a 7.30 start. Come along and join a team, or bring your own of 4–8 people and guarantee a victory! (Victory not guaranteed)

Find out more on Facebook or book tickets now on Eventbrite!

Sports doping cheats are not winners

Drug cheats risk damaging their health and their sport, but they may not even be gaining any advantage when it comes to winning.

For instance, Essendon got busted by the Australian Sports Anti-Doping Authority (ASADA) for a big long list of drugs (See pages 24–25 [PDF 167 KB]). Some of these were bordering on the homeopathic and some were just vitamin and mineral supplements—which are of dubious benefit as well.

But the one that’s caused the most controversy is AOD 9604, a peptide developed in the hope it would help with weight loss. Peptides are essentially small proteins: like proteins, they’re made out of chains of amino acids, but they’re generally too short to be considered a proper protein.

However, they can sometimes mimic parts of other proteins, and in some cases they’re what are known as secretagogues, in that they stimulate the secretion of growth hormones. These are then meant to do things like increase muscle mass, which is where the idea of weight loss came from—and the theory that they improve athletic performance.

Bottles of the peptide AOD 9604, also known as Lipotropin
The banned peptide AOD 9604, aka Lipotropin (Image: ABC)

The trouble is that even if you took pure growth hormone itself it probably wouldn’t enhance performance—as shown in a systematic review of research involving 303 participants (Liu H, Bravata DM, Olkin I, Friedlander A, Liu V, Roberts B, Bendavid E, Saynina O, Salpeter SR, Garber AM & Hoffman AR, “Systematic review: the effects of growth hormone on athletic performance”, Annals of Internal Medicine 2008;148(10):747–758, doi:10.7326/0003-4819-148-10-200805200-00215).

It may seem surprising that people would risk their careers over something that doesn’t benefit them, but from a scientific point of view it shouldn’t be. After all, many drugs designed to treat a specific disease don’t work as well as theory says they should, so why should those used to enhance performance be any different?

But some of these drugs do work, and for evidence let’s turn to a proven “winner”: cyclist Lance Armstrong, who admitted to doping  and was subsequently stripped of his seven victories in the Tour de France.

The main drug Armstrong used was erythropoietin, or EPO, which is a protein that controls red blood cell production. The idea is that taking it increases the number of red blood cells, which increases the amount of oxygen your system can carry.

And the research stacks up: a study published in February last year found that it improved athletes’ running performance by 6%. Even 4 weeks afterwards, their performance was still 3% better than normal (Durussel J, Daskalaki E, Anderson M, Chatterji T, Wondimu DH, Padmanabhan N, Patel RK, McClure JD & Pitsiladis YP, “Haemoglobin mass and running time trial performance after recombinant human erythropoietin administration in trained men”, PLOS One, published 13 February 2013, DOI: 10.1371/journal.pone.0056151).

Graph of Tour de France winner times from 1980 to 2013
Further evidence of the benefits of doping? Winning times for the Tour de France increased slightly after the doping scandals emerged (Lance Armstrong’s “victories” are shown in green) (Data: Wikipedia)

After testing tightened up, Armstrong and his cronies avoided getting caught by instead using transfusions of their own concentrated red blood cells, to basically get the same benefit as taking EPO.

Naturally, that’s harder to detect, but now there’s more monitoring of riders’ physiology—what they call a “biological passport”—to see whether there are suspicious changes.

But this gets into a grey area, because it’s also possible to increase the concentration of red blood cells—and even EPO—through “natural” means, like altitude training (McLean BD, Buttifant D, Gore CJ, White K, Liess C & Kemp J, “Physiological and performance responses to a pre-season altitude training camp in elite team sport athletes”, International Journal of Sports Physiology and Performance, 2013, 8, 391–399 [PDF 257 KB]).

Even though that method wears off after a few days, and it’s not always practical to go back up and down a mountain to get it back, does it make a difference how you do it if it’s essentially the same physiological effect?

This puts the whole issue of doping into an even greyer area, but the question can also be turned around: why take possibly harmful drugs when there are other, more reliable ways to get the same result?

In more ways than one, the risk to a sporting career isn’t worth the benefit.

(This story went to air on 17 July 2014. You can listen to the podcast.)

Time travel model tests quantum theory

Even though the laws of physics seem to permit time travel, many physicists and non-physicists still worry about the paradoxes that arise when you try to change the past (for an example see… well, pretty much any movie involving time travel). But quantum mechanics can give a way out of the mess, and an experiment performed at the University of Queensland has tested what happens when you try this.

Although not a dinkum time machine, the experiment simulated the effect of sending a photon (a particle of light) back in time by using two photons, one acting as the past version and one as the future version, and having them interact (Martin Ringbauer, Matthew A. Broome, Casey R. Myers, Andrew G. White & Timothy C. Ralph 2014, “Experimental simulation of closed timelike curves”, Nature Communications 5, Article number: 4145, doi:10.1038/ncomms5145).

Diagram of the time travel simulation experiment
A diagram of the experimental set-up, in which two single photons, generated in a nonlinear β-barium-borate crystal, are sent by optical fibres into the top and bottom paths. The bottom photon represents the time traveller, which is polarised according to a theoretical quantum consistency condition. The two photons interact in the middle in a polarising beam splitter and then are detected by photo-diodes at the outputs (Image Ringbauer et al., via Nature)

To make it act like a time machine, they imposed theoretical conditions established in 1991 by David Deutsch (Deutsch 1991, “Quantum mechanics near closed timelike lines”, Physical Review D 44, 3197, DOI:10.1103/PhysRevD.44.3197 [PDF 4.7 MB]).

In Deutsch’s theory, any attempt to meddle with the past gives mixed results, i.e. a quantum mixture of meddled and non-meddled. These separate possibilities exist simultaneously, and are often interpreted as alternative timelines created by the act of travelling back in time.
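
If you like to see what that consistency condition looks like in practice, here is a minimal numerical sketch (not the actual optical set-up, just an illustration assuming the two versions of the particle interact via a CNOT gate): Deutsch requires that the state of the time-travelling qubit come out of the interaction exactly as it went in, and you can find such a state by simple fixed-point iteration.

```python
# Minimal sketch of Deutsch's consistency condition. The CNOT interaction and
# the chosen input state are illustrative assumptions, not the experiment's set-up.
import numpy as np

def partial_trace_cr(rho_joint):
    """Trace out the first (chronology-respecting) qubit of a 2-qubit state."""
    rho = rho_joint.reshape(2, 2, 2, 2)   # indices: CR row, CTC row, CR col, CTC col
    return np.einsum('ijik->jk', rho)

def deutsch_fixed_point(psi_cr, U, iterations=200):
    """Iterate rho -> Tr_CR[ U (|psi><psi| (x) rho) U^dagger ] until self-consistent."""
    rho_cr = np.outer(psi_cr, psi_cr.conj())
    rho = np.eye(2) / 2                   # start from the maximally mixed state
    for _ in range(iterations):
        joint = np.kron(rho_cr, rho)
        rho = partial_trace_cr(U @ joint @ U.conj().T)
    return rho

# CNOT with the chronology-respecting qubit as control, the time traveller as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # a superposition input state
print(np.round(deutsch_fixed_point(psi, CNOT), 3))
# The iteration settles on the maximally mixed state: an equal quantum mixture of
# the two basis states, in the spirit of the "meddled and non-meddled" mixtures
# described above.
```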

There are other theories of time travel,  such as those that forbid any changes to the past that haven’t already happened, but Deutsch’s model is a particularly popular one.

The study of even a simulated time machine gives clues to how our understanding of physics may have to change to accommodate such bizarre circumstances.

To find out more about it, we spoke to one of the experimenters, PhD student Martin Ringbauer.

Newt troublemakers settle in

An entirely new order of invasive amphibians has set up residence in Australia, with populations of newts thriving on the outskirts of Melbourne.

Two smooth newts
Smooth newts, Lissotriton vulgaris, look a bit like skinks, and the worry is that native predators may make the same mistake (Photo: Museum Victoria)

The smooth newt, Lissotriton vulgaris, is native to Europe, but the similar climate in south-eastern Australia appears to have helped it establish breeding populations at at least four sites around Melbourne (Tingley R, Weeks AR, Smart AS, van Rooyen AR, Woolnough AP & McCarthy MA 2014, “European newts establish in Australia, marking the arrival of a new amphibian order”, Biological Invasions, DOI: 10.1007/s10530-014-0716-z).

This is a big deal, because there are no newts native to Australia. In fact, the only indigenous amphibians are frogs, many of which—like the Baw Baw Frog—are endangered due to an infectious disease caused by chytrid fungus.

However, newts were imported as pets up until 1997, when the Victorian government declared it a “controlled pest animal.” It’s likely that the current populations originally came from escaped or released pets.

As they’re relative newt-comers, it’s not yet known how bad an impact they will have. The newts eat invertebrates, crustaceans, and the eggs and hatchlings of frogs and fish, so it’s likely that they will compete with and even prey on native species.

It’s possible they could also transmit the frog-killing chytrid fungus. But the researchers are also concerned about a toxin the newts produce. So far it looks like the toxin is present at levels too low to do any harm, but there is still a worry that they could poison native species that prey on them.

Fortunately, it still seems to be early days for the newt invasion, so there is some hope that quick intervention could get them under control.

Anyone who encounters a newt is encouraged to report them to the Department of Environment and Primary Industries (DEPI).

Beth spoke to researcher Dr Reid Tingley about the newts on our show on 10 July 2014. You can listen to the podcast.

Doppler affects you and me, quite frequently

It’s making the news in oceans both Indian and Saturnian, tracking the movements of space probes and missing Malaysian airliners. And yet you encounter it every day, when you hear a car passing you on the road change from high to low pitch. So what exactly is the Doppler effect, and how does it work?

(Q: What sound does a cat make when it goes past at high speed? A: Meeeeeeeeeeeeeeeowwwww.)

As you might expect, the Doppler effect was named after Christian Doppler, an Austrian physicist who proposed it in Prague in 1842—although he only became a physicist because he was too frail to enter his father’s stonemasonry business.

It happens whenever there’s relative movement between you and a source that’s emitting waves, whether they’re light, sound, water or something else. In the case of the moving car, think of its sound waves as a series of peaks and troughs. The car emits one wave, i.e. one peak, and then another about 1 millisecond later.

But in that millisecond the car has moved closer to you, so the second peak has less distance to travel. It therefore reaches you less than 1 millisecond after the first peak does. This means that for you each peak is separated by less than a millisecond, so you hear the sound at a higher frequency.
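
To put some numbers on it: for a source moving straight towards a stationary listener, the observed frequency is the emitted frequency multiplied by v_sound / (v_sound - v_source), and divided the other way when it’s moving away. Here’s a quick sketch using an assumed 440 Hz horn on a car doing 60 km/h (the values are just for illustration):

```python
# Doppler shift for sound from a moving source, heard by a stationary observer.
# The horn frequency (440 Hz) and car speed (60 km/h) are example values only.
SPEED_OF_SOUND = 343.0   # m/s in air at about 20 degrees C

def doppler_observed(f_source, source_speed, approaching=True):
    """Frequency heard by a stationary observer as the source moves."""
    sign = -1 if approaching else 1
    return f_source * SPEED_OF_SOUND / (SPEED_OF_SOUND + sign * source_speed)

v = 60 / 3.6   # 60 km/h converted to m/s
print(f"Approaching: {doppler_observed(440, v):.0f} Hz")                     # ~463 Hz, higher pitch
print(f"Receding:    {doppler_observed(440, v, approaching=False):.0f} Hz")  # ~420 Hz, lower pitch
```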

OK, that may be a little hard to picture, so try it visually instead. Imagine the waves as concentric rings being emitted by the source: they bunch up in front of it and stretch out behind it. Or don’t imagine it: look at the picture below.

Doppler effect showing circular wave fronts emitted from a source moving to the right
Doppler effect from a source moving at 0.7 the speed of wave propagation (Image by Lookang with many thanks to Fu-Kwun Hwang and author of Easy Java Simulation Francisco Esquembre, via Wikimedia Commons)

However you imagine it, the frequency change due to the Doppler effect is a very convenient way to measure velocity, so it has many applications. Speaking of moving cars, it’s the Doppler effect that police radar uses to tell whether you’re speeding (see the NSW Police Radar Manual [PDF 4.3 MB]).

It’s also famously what we use to measure the expansion of the universe. When a light source like a star or a galaxy is moving away from us, the electromagnetic waves it emits go to the low frequency or red end of the spectrum, so we say it’s red-shifted. If it’s coming towards us, it’s blue-shifted. By measuring the redshift of galaxies depending on how far away they are from us, we can calculate how fast the universe is expanding (due to the expansion of the universe, the further something is, the faster it is moving away).
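
As a rough worked example (assuming the low-redshift approximation and a round Hubble constant of 70 km/s per megaparsec, not figures from any particular survey), a galaxy whose light is stretched by 1% is receding at about 3,000 km/s, which Hubble’s law puts at roughly 43 megaparsecs away:

```python
# Rough worked example: from redshift to recession velocity to distance.
# Assumes the low-redshift approximation v ~ c*z and a round Hubble constant.
C = 299_792.458   # speed of light, km/s
H0 = 70.0         # Hubble constant, km/s per megaparsec (illustrative round figure)

def recession_velocity(z):
    """Recession velocity in km/s for a small redshift z."""
    return C * z

def distance_mpc(z):
    """Distance in megaparsecs implied by Hubble's law, v = H0 * d."""
    return recession_velocity(z) / H0

z = 0.01          # wavelengths stretched by 1%
print(f"v ~ {recession_velocity(z):,.0f} km/s, d ~ {distance_mpc(z):,.0f} Mpc")
# v ~ 2,998 km/s, d ~ 43 Mpc
```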

But if understanding the history of the universe isn’t enough for you, the Doppler effect still makes the news; specifically, in the hunt for missing Malaysia Airlines flight MH370.

Using what the BBC called “cutting-edge methods”, the British satellite firm Inmarsat received radio pings from the missing plane, and by comparing the frequency of each signal with what it should have been when transmitted, they could work out how the plane was moving. That’s how they determined it flew to the southern Indian Ocean, where the search is currently focussed.

Diagram showing how, by triangulating the pings from MH370 with a calculation of its speed as determined by the Doppler effect, it was possible to calculate the aircraft's path

The other bit of recent Doppler effect news was the discovery of an ocean under the icy surface of Enceladus, a moon of Saturn. Again, the scientists used changes in the frequency of radio signals, this time from the spacecraft Cassini, which was flying past it (Iess L, Stevenson DJ, Parisi M, Hemingway D, Jacobson RA, Lunine JI, Nimmo F, Armstrong JW, Asmar SW, Ducci M & Tortora P 2014, “The gravity field and interior structure of Enceladus”, Science, vol. 344, no. 6179, pp. 78–80, DOI: 10.1126/science.1250551).

By looking at how Cassini’s speed changed as it flew past Enceladus, they could determine the forces of gravity acting on it, which in turn allowed them to calculate the distribution of mass inside the moon. These were changes in speed of mere millimetres per second, but allowed them to figure out there was liquid water—which is denser than ice—and a relatively light rocky core.
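
To get a sense of how delicate that is: the fractional frequency shift is just the speed change divided by the speed of light, so a 1 mm/s change in Cassini’s line-of-sight speed shifts a roughly 8.4 GHz X-band radio carrier (the nominal deep-space value, used here purely for illustration) by only about 0.03 Hz.

```python
# How big a Doppler shift does a millimetre-per-second change in speed produce?
# The carrier frequency below is the nominal X-band figure, for illustration only.
C = 299_792_458.0   # speed of light, m/s
F_CARRIER = 8.4e9   # Hz, approximate X-band downlink frequency

delta_v = 1e-3      # 1 mm/s change in line-of-sight speed
delta_f = F_CARRIER * delta_v / C
print(f"Frequency shift: {delta_f * 1000:.0f} mHz")   # ~28 mHz, i.e. about 0.03 Hz
```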

Cross-section image of Saturn's moon Enceladus, showing its rocky core and liquid ocean at the southern pole, emitting geysers through the icy crust
Diagram of the theorised interior of Saturn’s moon Enceladus, based on measurements by NASA’s Cassini spacecraft and NASA’s Deep Space Network. The gravity measurements suggest an ice outer shell and a low density, rocky core with a liquid water ocean sandwiched in between. This is also responsible for the plumes of water vapour shown at the moon’s South Pole (Image by NASA/JPL-Caltech)

So it may be commonplace, everyday science, but it’s good to see the Doppler effect is still making waves after all these years.

Fish still radioactive near Fukushima, but mostly safe elsewhere

Recent catches of fish with record levels of radiation show there is still contamination in the waters around Fukushima following the nuclear disaster in March 2011, but fears of dangerous levels reaching the West Coast of the United States seem to be mostly exaggerated.

In January 2013, a bottom-dwelling murasoi fish was caught with 2,540 times the legal limit for radioactivity of 100 becquerels per kilogram (a limit that’s roughly the level of radioactivity in a banana). And then in February 2013, a greenling with 7,400 times the limit was caught in a cage next to the Fukushima Dai-ichi plant.

Fat greenling, Hexagrammos otakii, seen on some oyster shells
Fat greenling, Hexagrammos otakii, the fish (not the actual fish) found near Fukushima with radioactive caesium at a record level of 740,000 becquerels per kilogram (photo from OpenCage, via Wikimedia Commons)
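
For anyone checking the arithmetic, those “times the limit” figures translate directly into the absolute numbers quoted above and in the caption, since they’re just multiples of the 100 Bq/kg limit:

```python
# Converting "x times the legal limit" into absolute activity,
# using the 100 Bq/kg limit quoted above.
LEGAL_LIMIT_BQ_PER_KG = 100

for fish, multiple in [("murasoi (January 2013)", 2_540),
                       ("greenling (February 2013)", 7_400)]:
    print(f"{fish}: {multiple * LEGAL_LIMIT_BQ_PER_KG:,} Bq/kg")
# murasoi (January 2013): 254,000 Bq/kg
# greenling (February 2013): 740,000 Bq/kg
```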

Although most fish caught in the area are actually below the safe level, a paper published by Ken Buesseler of the Woods Hole Oceanographic Institution in the United States found that the number above the limit is not decreasing with time as you’d expect. This indicates that radioactive caesium is still entering the food chain, either from sediments—these were bottom-feeding fish, remember—or from ongoing leaks (Buesseler KO 2012, “Fishing for answers off Fukushima”, Science, vol. 338, pp. 480–482, DOI: 10.1126/science.1228250 [PDF 3.7 MB]).

This has since been admitted by the Japanese government, with radioactive water leaking from containment tanks into groundwater, which then flows into the Pacific Ocean at a rate of about 300 tons per day. This could mean that fish from Fukushima will be inedible for at least a decade, which could paradoxically mean that they benefit from the lack of fishing—although the long-term effects of radiation on the fish themselves are rarely discussed.

This may be because many of the fish don’t live long enough for it to have an impact, and conversely could be why radioactive isotopes have been detected in long-lived, migratory species like Pacific bluefin tuna (Thunnus orientalis), although even then at levels comparable to natural sources (Fisher NS, Beaugelin-Seiller K, Hinton TG, Baumann Z, Madigan DJ and Garnier-Laplace J 2013, “Evaluation of radiation doses and associated risk from the Fukushima nuclear accident to marine biota and human consumers of seafood”, Proceedings of the National Academy of Sciences, vol. 110, no. 26, pp. 10670–10675, doi: 10.1073/pnas.1221834110).

(Of course, there are other reasons to avoid tuna, such as overfishing or build-up of mercury and other toxic chemicals.)

As for fears of radiation directly reaching the United States, it was expected to take about 3 years to travel across the Pacific, so it should be arriving right about now. The Woods Hole Oceanographic Institution is keeping an eye on that too, with a citizen science project asking people to send in samples of seawater for testing. So far, though, their results show no detectable caesium from Fukushima.

But what about Australia, you may ask? Well, because we’re in the southern hemisphere, it will take even longer to reach here—about 5 years, according to a report from the Australian Radiation Protection and Nuclear Safety Agency [PDF 1.7 MB] (although a small amount of atmospheric fallout was detected here in April 2011).  By then though, it will be diluted even more than it is in the United States, so is unlikely to be of concern.

A good night’s sleep literally clears your head

The excellent webcomic XKCD said it best when it pointed out that we all spend so much time sleeping—and most of us love doing it—but we still don’t know why. But never fear: some scientists finally think they may have a clue—a discovery so significant it was a runner-up for Science magazine’s Breakthrough of the Year.

Stanford sleep researcher William Dement said that after 50 years of studying sleep, the only really solid explanation he knows for why we do it is 'because we get sleepy'.
XKCD.com

(The quote in the alt text—and you should always read XKCD’s alt text—is from a National Geographic story where they interviewed Stanford University’s William Dement, a co-discoverer of both REM sleep and narcolepsy, who said, after 50 years of sleep research, “As far as I know, the only reason we need to sleep that is really, really solid is because we get sleepy.”)

Of course, we do know that we need sleep, on the basis that insomnia and sleep deprivation are harmful. The trouble is that how exactly they’re harmful isn’t known either—experiments by Allan Rechtschaffen in the 1980s found that sleep deprivation was eventually fatal to rats, but a specific cause of death couldn’t be found.

Naturally, there are theories:

  • An obvious one would be that sleep conserves energy, which is fine except that we use only 5–10% less energy asleep than awake, which from an evolutionary point of view doesn’t sound like enough to justify the increased vulnerability to predators.
  • However, it could also help protect from predators, as hiding somewhere without moving is safer than running around in the open; although again, simply sitting still, or quiescence, would be just as effective and make it possible to react in an emergency.
  • Growth hormones do seem to be influenced by sleep, but there doesn’t seem to be a strong correlation between children’s growth and their amount of sleep; plus, it doesn’t explain why adults sleep.
  • Memory does benefit, as numerous studies have shown that a good night’s sleep helps you remember what you learned the previous day, with some suggesting that it gives the brain the chance to remove unnecessary connections. Maybe not the whole story though, as animals with very simple brains also go through a sleep cycle.
  • Restoring the body may be closer to the mark, as a 2004 study (also on rats) showed that sleep deprivation may slow the healing of wounds.
  • The immune system also seems to benefit, as further rat studies found that sleep deprivation reduced white blood cell counts and increased other cells and chemicals that encourage cancer growth.

Furthering this theme of restoration, a new study by researchers led by Maiken Nedergaard at the University of Rochester in upstate New York, has found a connection between sleep and a brain-cleaning system they discovered (Xie L, Kang H, Xu Q, Chen MJ, Liao Y, Thiyagarajan M, O’Donnell J, Christensen DJ, Nicholson C, Iliff JJ, Takano T, Deane R & Nedergaard M 2013, “Sleep drives metabolite clearance from the adult brain”, Science, vol. 342, no. 6156, pp. 373–377, DOI: 10.1126/science.1241224 [PDF 1.9 MB]).

They call this the “glymphatic system”. It’s similar to the lymphatic system, and it actually connects to it. Lymph comes from the interstitial fluid, the stuff between cells, which has its own circulation system that removes bacteria and toxins, sending them through the lymph nodes and eventually to the veins so they can be cleaned from the body.

However, the lymphatic system doesn’t go through the brain. That’s because the brain has its own cerebro-spinal fluid. This is the liquid that the brain sits in—it actually floats in a big bag of the stuff, which prevents it being damaged by its own weight, as well as cushioning it against injury.

What Nedergaard’s team discovered is that this cerebro-spinal fluid (CSF) also flows through channels around ordinary blood vessels, and then on through other, smaller conduits formed by glial cells (these are cells in the nervous system that aren’t neurons—the name actually comes from the Greek word for “glue”, in that they glue the nerves together; basically, they’re the support system for the nervous system).

This flow of the CSF removes waste products that neurons produce, things like beta amyloid, which is a protein that accumulates and forms sticky plaques in patients with Alzheimer’s disease.

For such an important system, it’s surprising that it was only identified a couple of years ago. But it’s less surprising when you consider that it only operates in living brains: it needed to wait for the development of sophisticated brain scanning technology.

In this case, they used a technique called two-photon microscopy, in which fluorescent dyes are activated by two photons in the infrared range, which penetrates further into tissue.

With this method they found that during sleep the brain cells shrink, increasing the space between them by about 60% and giving the cerebro-spinal fluid more room to flow through and flush out waste products.

Flow of cerebro-spinal fluid, illuminated by fluorescent dye, around blood vessels in a sleeping brain compared with an awake brain. There is much more dye visible when asleep than when awake (watch a video).
Brain scans showing the much greater flow of cerebro-spinal fluid in the brain of a sleeping mouse, compared with after they woke it by “gently moving its tail”. They injected different-coloured dyes at different stages to tell them apart (University of Rochester Medical Center)

This is consistent with other research showing that levels of beta amyloid decline in human brains during sleep, although it hasn’t been confirmed that the same happens in mice. However, as Nedergaard points out, “Isn’t it interesting that Alzheimer’s and all other diseases associated with dementia, they are linked to sleep disorders?”

Whether this is the main explanation for why we sleep is still unknown, but as other experts have said, it does demonstrate one clear physiological function. And it’s consistent with the other studies that suggest restorative benefits, like those for wounds and the immune system.

So there’s another reason to get a good night’s sleep, to dream pure thoughts and wake up with a fresh mind. Not that you needed me to tell you that…

(This story first aired on 10 April 2014—you can listen to the podcast.)

Bushfires starting earlier as the climate changes

The new federal environment minister, Greg Hunt, “looked up what Wikipedia says” and concluded that Australia has always had bushfires in hotter months – but if he looked a bit harder, he might find that we’re getting more of those hot months, and that they’re arriving earlier in the year.

To be sure, Wikipedia is quite reliable; they even say so themselves. But a government minister can probably find more detailed information, as can anyone else if they dig a bit.

One good, independent site is Romsey Australia, which lists major historical bushfires for each Australian state and territory. If you look at the starting date for all the NSW fires listed (from 1926 to 2006) you see the following:

Plot of starting dates of NSW bushfires from 1926 to 2006, with an average for each decade moving earlier in the year

The spread of bushfires throughout the year definitely appears to be increasing, and there’s a clear trend of them starting earlier. Now, it’s likely that the increased availability of data is a factor here, but I reckon this is at least as good as perfunctory ministerial Wikipedia research.
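
For anyone who wants to repeat the exercise, the analysis behind that plot is simple enough to sketch in a few lines: convert each fire’s starting date to a day of the fire season, then average by decade. The dates below are hypothetical placeholders, not the Romsey data.

```python
# Sketch of the decade-average analysis. The dates here are made-up examples;
# the real list of fire starting dates comes from the Romsey Australia page.
from collections import defaultdict
from datetime import date
from statistics import mean

fires = [date(1926, 2, 14), date(1957, 12, 2), date(1994, 1, 5),
         date(2002, 10, 20), date(2006, 9, 24)]      # hypothetical examples

by_decade = defaultdict(list)
for d in fires:
    # Count days from 1 July, so a fire season spanning New Year
    # isn't split across two calendar years.
    season_start = date(d.year if d.month >= 7 else d.year - 1, 7, 1)
    by_decade[10 * (d.year // 10)].append((d - season_start).days)

for decade in sorted(by_decade):
    print(f"{decade}s: average start {mean(by_decade[decade]):.0f} days after 1 July")
```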

But is this climate change? After all, we also have the prime minister Tony Abbott claiming that “these fires are certainly not a function of climate change,” and that the United Nations climate chief Christiana Figueres was “talking through her hat” when she linked them.

Well, the Intergovernmental Panel on Climate Change (IPCC), in their 2007 Fourth Assessment Report, Working Group II on Impacts, Adaptation and Vulnerability, said that:

Fire frequency is expected to increase with human-induced climate change, especially where precipitation remains the same or is reduced (Stocks et al., 1998).

So far this year in their Fifth Assessment Report, the IPCC has only released the section on the physical science basis, with discussion of impacts yet to come. But they definitely predict that temperatures will continue to increase, and dry areas in the sub-tropics and mid-latitudes are likely to get drier – both factors that contribute to bushfires.

Now, I tend to agree that you can’t attribute a single event (or events, considering there were over 70 burning at the same time) to climate change, but if you look at the trend you see what the scientists were forecasting all along.

Brain boxes build bottle brains

The brain in a vat is a classic philosophical thought experiment, but now an actual vat has been used to grow actual brains. Well, tiny, tiny brainlets, grown out of stem cells.

Glass contraption used to grow the brains, which can be seen as tiny specks in a closeup
The spinning bioreactor system used to grow the brains, which you can see as little specks on the right (photo by Madeline A Lancaster)

Dr Madeline Lancaster and her colleagues from the Institute of Molecular Biotechnology at the Austrian Academy of Sciences grew embryonic stem cells into what they called “cerebral organoids”, in a gel-like substance under conditions similar to the human womb (Lancaster MA, Renner M, Martin C-A, Wenzel D, Bicknell LS, Hurles ME, Homfray T, Penninger JM, Jackson AP & Knoblich JA 2013, “Cerebral organoids model human brain development and microcephaly”, Nature, doi:10.1038/nature12517).

The miniature brains, each about 3-4 mm in diameter, developed simple cerebral cortices, retinas and other kinds of brain tissue.

Of course, it’s not really an attempt to create disembodied consciousness, but rather models to help understand brain functions and disorders.  The team has previously made models of other organs, like eyes, pituitary glands and livers, but these were the first brains.

To demonstrate how this can help understand disorders, some of the organoids were made from cells from a patient with microcephaly. As you’d expect, those brains turned out smaller, but in the process they revealed why: the stem cells seemed to differentiate earlier, before they could grow in volume.

So this minuscule grey matter, although not able to think, has already taught us something.

And next time someone tells you to grow a brain, you’ll know how to do it.

(This story first aired on 12 September 2013 – you can listen to the podcast.)

Big pharma brings interesting conflicts

Many of us are suspicious of pharmaceutical companies, and rightly so: they have an increasing influence on medical practice. Even if they’re not necessarily being evil, the opportunity and temptation for mischief is definitely there.

A recent study by Ray Moynihan of Bond University and colleagues showed that the expert panels that define diseases often have ties to companies that stand to benefit from broadening their scope (Moynihan RM, Cooke GPE, Doust JA, Bero L, Hill S & Glasziou PP 2013, “Expanding disease definitions in guidelines and expert panel ties to industry: a cross-sectional study of common conditions in the United States”, PLoS Medicine, vol. 10, no. 8, e1001500, doi:10.1371/journal.pmed.1001500).

These panels put out guidelines for doctors that are meant to summarise the latest thinking on how a disease should be diagnosed and what treatment is recommended. And these are fairly common and significant conditions: the study looked at panels covering asthma, bipolar, high cholesterol, chronic obstructive pulmonary (i.e., lung) disease, depression, type 2 diabetes, hypertension, gastric reflux (GERD), myocardial infarction (heart attack), multiple sclerosis and rheumatoid arthritis.

What they found was that of 16 disease redefinitions published between 2000 and April 2013, ten of them proposed changes widening disease diagnosis, and only one narrowed a definition.

Furthermore, among the 14 panels which disclosed industry ties, 75% of their members had such connections – to a median of seven companies each.  And 12 panels were actually chaired by people with industry ties.

Quite unsurprisingly – for cynics amongst us, at least – the highest proportions of ties were to companies that manufactured drugs used to treat the relevant disease.

For a closer look, let’s choose one of the most common conditions on the list: high blood pressure, otherwise known as hypertension.

World Health Organisation poster for their World Health Day 2013, telling people to cut their risk of heart attack and stroke by controlling their blood pressure
Is even the World Health Organisation in on the conspiracy? (Image: WHO)

The panel in question defined a new condition, “pre-hypertension”, in which people who would previously have been considered to have normal blood pressure are now regarded as being at risk of future adverse effects.

Although the panel cited research for their decision – which we’ll come to later – eight of the 11 panel members had financial ties to companies that make hypertension drugs: companies like Bristol-Myers Squibb, Merck and Novartis.

You’d have to imagine it’s in the interest of those companies for more people to be prescribed blood pressure medication. Which isn’t good, considering those drugs have been shown not to be much help for people with mildly high blood pressure, i.e. between 140/90 and 159/99 (Diao D, Wright JM, Cundiff DK & Gueyffier F, “Pharmacotherapy for mild hypertension”, Cochrane Database of Systematic Reviews 2012, issue 8, art. no.: CD006742. DOI: 10.1002/14651858.CD006742.pub2).

Not only that, but it’s part of a growing trend towards overdiagnosis, where more and more people are considered diseased, purely due to redefining illness rather than any change to the patients themselves.

So, a pretty clear case of conflict of interest, right? Well, maybe… But when you look deeper into the actual change, the story’s slightly different.

In the report where the panel introduced pre-hypertension – defined as blood pressure between 120/80 and 139/89 – they actually recommended patients change their lifestyle to prevent heart disease – not take medication (Chobanian AV, Bakris GL, Black HR, et al. 2003, “The seventh report of the Joint National Committee on Prevention, Detection, Evaluation, and Treatment of High Blood Pressure: The JNC 7 Report”, Journal of the American Medical Association, vol. 289, no. 19, pp. 2560-2571, doi:10.1001/jama.289.19.2560).
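
To keep the various thresholds quoted in this story straight, here they are gathered in one place. This just encodes the bands mentioned above (the JNC 7 pre-hypertension range and the Cochrane review’s “mild hypertension” range); it’s not complete clinical guidance.

```python
# Blood pressure bands as quoted in this article, in mmHg. A reading falls into
# the highest band that either the systolic or diastolic number reaches.
def category(systolic, diastolic):
    if systolic >= 160 or diastolic >= 100:
        return "higher-grade hypertension"
    if systolic >= 140 or diastolic >= 90:
        return "mild hypertension (140/90 to 159/99)"
    if systolic >= 120 or diastolic >= 80:
        return "pre-hypertension (120/80 to 139/89)"
    return "normal"

for reading in [(118, 75), (125, 82), (145, 92), (165, 105)]:
    print(reading, "->", category(*reading))
```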

So they’re not necessarily getting new customers for the companies they’re tied to – in fact, if the lifestyle changes work they could be losing future customers.

And given that there are indeed studies claiming pre-hypertension triples your risk of heart attack, improving your lifestyle probably isn’t a bad thing (Qureshi AI, Suri MFK, Kirmani JF, Divani AA & Mohammad Y 2005, “Is prehypertension a risk factor for cardiovascular diseases?”, Stroke: Journal of the American Heart Association, vol. 36, pp. 1859–1863, doi: 10.1161/01.STR.0000177495.45580.f1).

Now, to be fair, Moynihan et al.’s paper didn’t actually make any judgements about whether the disease redefinitions were good or bad. And they also admit that they haven’t proven that industry ties are causing the broadening of diagnoses – in the absence of control groups, you can’t say what a panel without industry ties would have concluded.

But it does show how the system works, and it raises questions about what can or should be done to avoid conflicts of interest. Part of the answer is probably to make sure frontline doctors are independent and fully informed about the biases, so that they can choose the best treatment for each patient.

Human nature being what it is, if there’s an opportunity for abuse of power, someone’s going to abuse it.

(This story first aired on 12 September 2013 – you can listen to the podcast.)