Monthly Archives: September 2012

Survival of the best fit

While we wait for next year’s commemorations of Alfred Wallace – who independently came up with the theory of evolution through natural selection and for many years was credited alongside Charles Darwin – it’s a good time to reflect on how natural selection works, and how ‘survival of the fittest’ may not be an accurate description.

The Darwin-Wallace medal of the Linnean Society, awarded for major advances in evolutionary biology. Charles Darwin and Alfred Wallace separately but simultaneously developed the theory of evolution by natural selection.
That famous term was first used by the philosopher Herbert Spencer as an interpretation of the basic premise of natural selection, but Darwin himself adopted it in later editions of his book. However, it’s since been misused and misapplied in many ways.

At its simplest, the Darwin-Wallace theory suggests that organisms reproduce in greater numbers according to how well suited they are to their environment. That suitability is really the ‘best fit’ for the environment, but it can also be referred to as ‘fitness’. The trouble is that this causes confusion, because ‘fitness’ has taken on a different meaning in everyday speech, where it’s now more associated with health.

The word ‘survival’ can also be a bit misleading: it shouldn’t be taken to mean the survival of any single organism, because evolution occurs across an entire population. We are instead referring to the successful continuation of the species, not individual persistence.

In biology, the success of any organism can be expressed as the relative number of offspring or, even more precisely, as the relative number of genes they pass down to the following generations. This counting of genes is a much more accurate definition, because there is no way of predicting an organism’s success from its physical features at any point during its life cycle. The only meaningful measure of success in biology is the ongoing propagation of particular inherited traits, measured by genes, in a population.
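To make that gene-counting definition concrete, here’s a minimal sketch in Python (the counts are entirely made up) showing how relative fitness is just the share of genes a variant contributes to the next generation, with no reference to physical features:

```python
# Minimal sketch with hypothetical counts: fitness measured as relative
# genetic contribution to the next generation, not physical condition.
offspring = {"variant_a": 120, "variant_b": 80}  # offspring carrying each variant

total = sum(offspring.values())
best = max(offspring.values())

for variant, count in offspring.items():
    frequency = count / total  # share of the next generation's gene pool
    fitness = count / best     # relative fitness: 1.0 = most successful variant
    print(f"{variant}: frequency {frequency:.2f}, relative fitness {fitness:.2f}")
```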

When people talk of competition in evolution, many will automatically think of an example such as lions hunting grazing animals. These are examples of competition in a way, except that the competition which drives natural selection is that between each lion and the other lions in catching their food, and between each antelope and the other antelopes in avoiding being eaten.

These examples are easy enough to understand, but in reality the competition is often not well defined, and is therefore impossible to gauge by looking at past performance.

If the climate in which the antelopes grazed warmed a couple of degrees, or the rainfall increased by a few millimetres a year, the range of plants on which they graze may shift. A higher protein or more easily digested grass may become prevalent, and the antelopes may change their grazing habits or get fatter, meaning they are slower in running away from lions. Or their meat could take on a different flavour and become repellent to lions, or they could develop greater muscle strength and outrun the lions, who may in turn have to become smaller and faster to keep feeding on them.

There is no way to predict which of the grasses, the antelopes or the lions will be more successful under the new conditions; only in hindsight can their success be determined.
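We can illustrate that hindsight-only quality with a toy simulation. This Python sketch (all survival rates invented for the purpose) shows the ‘fittest’ trait in one climate losing its advantage when the climate changes:

```python
import random

random.seed(42)

# Invented survival chances for two heritable traits under two climates.
SURVIVAL = {
    "dry": {"fast": 0.9, "fat": 0.6},
    "wet": {"fast": 0.6, "fat": 0.9},
}
CAPACITY = 100  # the grassland only supports 100 antelopes

population = ["fast"] * 50 + ["fat"] * 50

for generation, climate in enumerate(["dry"] * 5 + ["wet"] * 5, start=1):
    # Each survivor leaves two offspring with the same trait; the herd is
    # then trimmed at random back to what the grassland can support.
    survivors = [t for t in population if random.random() < SURVIVAL[climate][t]]
    offspring = survivors * 2
    population = random.sample(offspring, min(CAPACITY, len(offspring)))
    print(f"generation {generation:2d} ({climate}): "
          f"fast={population.count('fast')}, fat={population.count('fat')}")
```

Run it and the ‘fast’ trait should dominate while the climate stays dry, then lose ground once conditions turn wet – neither variant is fittest in any absolute sense.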

It has been suggested more recently that competition between organisms is less of a driver of evolutionary change than external changes in the environment, which open up new niches in which organisms can diversify. This highlights the unpredictability of advantage even further.

Taking the fittest in one type of competition gives no indication of their success in an unknown competition – like taking the Australian swimming team and entering them in the 200 metre hurdles. They may do well, but there is no way to know until they have run the race.

Honey hardly a help for hay fever

Ah, spring. Sun shining, birds singing, flowers blooming. In other words, it’s hay fever season.

If, like me, you suffer from hay fever – or allergic rhinitis, if you will – you’ve probably had some well-meaning people advising you to eat honey to hold back the symptoms. But not just any honey: it has to be local honey. The theory is that if you eat honey made from local flowers, you will reduce your sensitivity to their pollen.

But there’s one big problem with this: the plants that cause hay fever are not the same plants from which bees make honey.

Bees make honey by gathering nectar from flowers, distributing pollen in the process. These are not the same flowers whose pollen is spread by the wind and causes allergies – if it were, they wouldn’t need the bees (Photo by Louise Docker, via Wikimedia Commons)
Flowering plants need to spread their pollen to reproduce, and they’ve evolved different means of doing so. Some release their pollen to the wind, and these, like the grasses that give me such grief, are the ones that contaminate your air supply and cause an allergic reaction.

Others rely on insects to spread pollen from bloom to bloom, enticing them with sugary nectar. These are the ones that bees visit and use to make honey, and they’re completely different species, with completely different pollen, from the wind-pollinated plants.

Besides, even if they were the same plants, the time it takes to produce, collect and distribute the honey means they wouldn’t still be flowering by the time you ate it. So it couldn’t be the same pollen that causes your allergies anyway.

Of course, this is just countering one theory with another. But biology is complicated, theories can be wrong, and there could be some unexpected mechanism that makes a treatment work. So with medical issues, I think it’s always important to look at actual clinical studies, and some have been done on this question.

The most notable study failed to find a benefit from unfiltered, unpasteurised local honey compared with both filtered, pasteurised honey and a control of honey-flavoured corn syrup. The 36 participants, all of whom suffered from both runny noses and sore eyes, ate one tablespoon of their assigned substance per day. None of the groups got relief from their symptoms (Rajan TV, Tennen H, Lindquist RL, Cohen L & Clive J 2002, “Effect of ingestion of honey on symptoms of rhinoconjunctivitis”, Annals of Allergy, Asthma and Immunology, vol. 88, no. 2, pp. 198-203, PMID: 11868925).

Now, the authors admitted that a possible shortcoming of their research was that the dose was too low. But considering that 13 people dropped out of the trial because they couldn’t stand the sweetness, I think it’s fair to say you couldn’t expect people to eat much more.

Interestingly, there was a more recent study performed in Finland that found a benefit from honey laced with birch pollen – birch trees being a major source of allergies in Europe. But once again, that wasn’t honey actually made from birch trees: the pollen had to be artificially added to the honey. Also, the patients still took antihistamines, so even then it wasn’t a complete cure (Saarinen K, Jantunen J & Haahtela T 2011, “Birch pollen honey for birch pollen allergy – a randomized controlled pilot study”, International Archives of Allergy and Immunology, vol. 155, no. 2, pp. 160-166, DOI: 10.1159/000319821).

To sum up, there are plenty of good reasons to eat honey, especially local honey – if nothing else, you’re promoting pollination in your area – but it won’t help your hay fever.

So what can you do? More evidence-based recommendations are:

Mistletoe missed when it’s gone

Mistletoe is a parasitic plant that most people think is only useful for triggering Christmas kisses and as an ingredient in the magic strength potion of ancient Gaul. But recently published research suggests that it plays a crucial role in its ecosystem.

Professor David Watson gathering mistletoe – golden sickle not pictured (Photo: Charles Sturt University)

Professor David Watson and Matthew Herring of Charles Sturt University set out to test whether mistletoe is a keystone species – that’s an organism that appears insignificant but actually has a large effect on other species. The best way to test whether a species is a keystone is to remove it from an ecosystem and see what happens. However, that’s usually not only extremely difficult, but risks causing irreparable damage.

In this case, though, it’s possible to remove mistletoe without damaging its host trees. After obtaining the necessary permissions, the researchers and teams of volunteers spent two years removing 46 tonnes of mistletoe – predominantly Amyema miquelii, known variously as Box, Stalked or Drooping Mistletoe – from 17 woodland sites in the southern Riverina region of New South Wales. They then waited another three years before returning to compare the changes with 11 control sites and 12 where mistletoe was naturally absent.

As predicted, the absence of mistletoe affected the local bird population, with a third of species missing after only three years. But what was surprising was that the birds affected weren’t those that nested or fed in mistletoe, but the insect-eaters. Watson believes this is because mistletoe drops more leaf litter than its host tree, so its removal takes away the habitat for the insects on which the birds feed.

This study is believed to be the first experimental test of a keystone plant, and the most rigorous test yet of any keystone species. Its unexpected outcome demonstrates the subtle dependencies that can exist in ecosystems, and how a single species may do more than you think.

Reference: Watson DM & Herring M 2012, “Mistletoe as a keystone resource: an experimental test”, Proceedings of the Royal Society B: Biological Sciences, vol. 279, no. 1743, pp. 3853-3860, DOI: 10.1098/rspb.2012.0856

Junk DNA not all junk, but some still is

We sometimes pick on media reporting here on Lost in Science – like just the other day – but it pains me to have to bring up criticism of what sounded like one of the biggest science stories of the year. However, there has been a lot of hyperbole about the results of the ENCODE project and the claims that it ‘debunked’ the concept of junk DNA.

For those who missed the wall-to-wall media coverage, ENCODE stands for the ENCyclopedia Of DNA Elements. It’s an international collaboration of 442 scientists, who used 24 different tests on 147 human cell types and catalogued what each bit of DNA does.

In their press releases, ENCODE claimed to have found a function for 80% of DNA, including the non-coding parts, aka ‘junk’. But does this really overturn everything we thought we knew about DNA?

DNA, of course, is deoxyribonucleic acid. It’s a molecule found in the nucleus of each of our cells, and it’s the template for the construction of those cells and how they operate. Basically, it’s the blueprint to build a human being.

Some sections of DNA make up genes, which are codes for particular proteins. A gene is transcribed into a related substance called ribonucleic acid (RNA), which is then used to construct the protein; the protein then goes off and performs some sort of biological function.

The rest, which doesn’t code for genes, is the non-coding DNA. For years, it’s been known that some of this non-coding DNA contains switches that determine whether a gene gets read or not. But many have thought that the bulk of it is probably junk left over from evolution – things like old, mutated genes, and bits of viruses that have wormed their way into our DNA. This is the so-called ‘junk DNA’.

In 2003 the Human Genome Project – another big, international collaboration – successfully mapped all the genes in the human body. But it also found that only about 1.5% of our DNA codes for genes.

ENCODE aims to fill in the rest. Its members found plenty of those gene switches I mentioned earlier, but they also found sections that weren’t genes yet were still transcribed into RNA, and some that only seem to work when the DNA molecule is curled up in a 3D shape.

These add up to the figure quoted before, of 80% of DNA with a biological function. But the key question, and the one that has generated most of the controversy about these claims, is what exactly do they mean by ‘function’?

For instance, you’d expect that those old viruses and broken copies of genes could still be transcribed into RNA, even though they don’t do anything useful. All that’s really saying is that that bit of DNA can be copied. And although it’s true that ENCODE did find millions of switches, and we know they do something useful in cells, that’s not news, because biologists already knew those switches existed.

If you restrict the definition of ‘functional’ only to genes and their switches, the figure drops from the headline 80% to only around 20% of DNA having a function.
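Just to see what’s at stake in that choice of definition, here’s a quick back-of-the-envelope calculation in Python (using the approximate human genome size of 3.2 billion base pairs):

```python
# How much DNA each definition of 'functional' covers (approximate figures).
genome_bp = 3.2e9  # rough size of the human genome in base pairs

definitions = {
    "broad (any biochemical activity)": 0.80,   # ENCODE's headline figure
    "narrow (genes plus switches)":     0.20,
    "protein-coding only":              0.015,  # Human Genome Project estimate
}

for label, fraction in definitions.items():
    print(f"{label}: {fraction * genome_bp / 1e9:.2f} billion base pairs")
```

The gap between the broad and narrow definitions comes to nearly 2 billion base pairs, which is why the choice of words matters so much.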

A Canadian biologist, T. Ryan Gregory, who has been one of the scientists most critical of the ENCODE hype, has pointed out that you can get an idea of the usefulness of the bulk of non-coding DNA by comparing humans with other species.

Pufferfish, for example, have more genes than us, but their DNA is one-tenth the size. So either the rest of ours isn’t doing much, or they’ve found some way to get along without it. Then there are onions, which have about 5 times more DNA than humans. Assuming it’s not mostly junk, are onions really so complex as to justify the difference?

Of course, this is one point of view – among many others – and it doesn’t mean that ENCODE is worthless. It is, after all, a huge project that has catalogued the various functions for future study. It’s also already provided some clues to diseases – sections of DNA that are associated with diseases were more likely to be non-coding – and it’s introduced lots of innovative tools for exploring the data.

But the real concern is the exaggerated claims and the way they were repeated unquestioningly by news outlets (helpfully, Gregory has catalogued the major reports).

Although we often blame inaccurate science reporting on the low priority it’s assigned in the media – The Australian dumping their science reporter Leigh Dayton being one example – in this case it was seen in internationally respected news organisations, including many dedicated science publications.

Nature and Science, two of the world’s most prestigious science journals, worked closely with ENCODE to publish special features and ensure maximum impact, with the simultaneous release of 30 papers (see www.nature.com/encode).

And that’s perhaps the problem. Scientific revolutions make very appealing stories, and the backing of such a powerful public relations machine means that even the best news outlets find them hard to resist.

Sometimes these stories are true, but generally it pays to be sceptical of the hype. In this case that’s justified, as ENCODE hasn’t completely revolutionised the way we understand DNA. We already knew it wasn’t all junk, and there’s still good reason to believe that much of it is.

Myna inconvenience

The Common Myna (Acridotheres tristis) is hated so much that on ABC’s Wildwatch it was voted Australia’s number one pest or problem, above cane toads and rabbits. So much so, that even when research shows it might not be worth trying to eradicate, newspapers report the exact opposite. In fact, the birds’ only redeeming feature seems to be serving as an easy source of headline puns.

Common Mynas are also known as Indian Mynas, and as the name suggests they originally came from India. But, perhaps because they thrive in human habitats, they’ve since spread throughout the world, and are found on every continent except Antarctica.

So they’re an invasive species, but how damaging are they really? Well, it’s actually hard to tell. They lay their eggs in tree or wall cavities, and compete with native species that do the same. And they mate for life, forming a formidable pair that aggressively defends a 1-3 hectare territory from other birds.

This tendency to attack smaller birds is probably one of the reasons people hate them. Although, I should point out that many of these incidents are cases of mistaken identity involving the even more aggressive native species, the Noisy Miner (Manorina melanocephala). The two look fairly similar, with black heads and yellow beaks and eyeliner.

The big difference, apart from the spelling, is that Common Mynas are mostly brown and the native Noisy Miners are grey. And instead of attacking in pairs, Noisy Miners tend to gang up on other birds in larger numbers.

From left to right, Common Myna (Acridotheres tristis), Noisy Miner (Manorina melanocephala) and Bell Miner (Manorina melanophrys). Photos by Dick Daniels, Quartl and Brett Donald respectively, via Wikimedia Commons

But whether they’re mynas or miners, the impact of competition is difficult to measure compared to something like predation. If one species is actively killing another, it’s pretty easy to see the effect just by counting the victims.

But competition is more subtle. So what Kate Grarock and colleagues from the Australian National University and the University of Canberra have done is use data from a birdwatching club, the Canberra Ornithologists Group, to track how populations of various species across Canberra changed after the arrival of mynas (Grarock K, Tidemann CR, Wood J & Lindenmayer DB 2012, “Is it benign or is it a pariah? Empirical evidence for the impact of the Common Myna (Acridotheres tristis) on Australian birds”, PLOS ONE, vol. 7, no. 7, e40622, DOI: 10.1371/journal.pone.0040622).

They tracked bird populations according to the number of years since mynas arrived – they were first introduced to Melbourne in 1862 to control insects in market gardens, but didn’t reach Canberra until 1968 – together with a lot of sophisticated statistical analysis that corrected for factors like urban development and type of vegetation.
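The paper’s statistics are far more sophisticated than anything that fits here, but the underlying idea of correcting for confounding factors can be sketched with an ordinary regression. Everything in this example – the column names, the numbers, the model itself – is hypothetical, not the authors’ actual method:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey data: abundance of one species at six sites.
surveys = pd.DataFrame({
    "abundance":        [52, 61, 48, 70, 44, 58],   # birds per km^2 (made up)
    "years_since_myna": [0, 5, 10, 2, 15, 8],
    "urban_cover":      [0.2, 0.5, 0.7, 0.3, 0.8, 0.4],
    "vegetation":       ["open", "dense", "open", "open", "dense", "dense"],
})

# The coefficient on years_since_myna estimates the myna effect after
# accounting for urbanisation and vegetation type.
model = smf.ols("abundance ~ years_since_myna + urban_cover + C(vegetation)",
                data=surveys).fit()
print(model.params)
```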

The bird species they analysed were split into three groups: there were other cavity-nesters, such as cockatoos, parrots and kookaburras; then there were birds smaller than about 25 cm, like Willie Wagtails and Magpie-larks, which you might expect to be intimidated by Common Mynas; and finally there were large birds about 30 cm or bigger, like magpies and currawongs, which should be harder to push around and so serve as a sort of control group.

What they found was a definite negative correlation between myna numbers and three of the cavity-nesting species: the Sulphur-crested Cockatoo, the Crimson Rosella and the Laughing Kookaburra. And seven of the small bird species were affected as well.

As expected, there was no correlation with large species numbers. But interestingly, there was also no significant effect on four other cavity-nesting species, namely the Galah, Australian King-Parrot, Eastern Rosella and Common Starling. And one of the small bird species seemed to be OK, although that was only the House Sparrow.

But an important point is that the majority of the species actually increased in number over the 29-year study period. It’s just that they increased at a slower rate after the arrival of mynas than they otherwise would have.

There are two interesting aspects to this. The first is the point made earlier, that the level of hatred directed towards mynas in the community has led to some slightly inaccurate, hysteria-inducing science reporting.

The Age newspaper on 13 August 2012 reported on the Canberra study but got the numbers wrong. For instance, Crimson Rosellas increased in number by 5.9 birds per km² every year. But The Age mistakenly reported that rate of increase as the total population density. So because the rate of increase was slower in the presence of mynas – 3.5 birds per km² per year slower – they interpreted that as a decline in overall numbers to 2.4 birds per km². Which of course it wasn’t (actual Crimson Rosella numbers were around 50-150 birds per km²).
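To see the mistake, here’s a quick check of the arithmetic in Python (the starting density is hypothetical, chosen from within the 50-150 birds per km² the study reported):

```python
# Rate of increase vs total density: the distinction The Age missed.
start_density = 100.0   # birds per km^2 (hypothetical starting point)
rate_without = 5.9      # increase in birds per km^2 per year, no mynas
rate_with = 5.9 - 3.5   # with mynas: still an increase of 2.4 per year

for year in (0, 5, 10):
    no_mynas = start_density + rate_without * year
    with_mynas = start_density + rate_with * year
    print(f"year {year:2d}: {no_mynas:6.1f} vs {with_mynas:6.1f} birds per km^2")

# Both trajectories rise: mynas slow the increase; they don't cause a
# decline to 2.4 birds per km^2.
```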

These numbers may sound tricky, but they’re easy to verify, as the entire research paper is available for free online (look under ‘Results’). However, that didn’t prevent The Age publishing another article with exactly the same mistake on 23 September 2012.

This time, they even quoted Kate Grarock, saying: “I think it’s great that the community is involved with environmental management. However I do fear that the passion for hating the myna is way too extreme. Australians appear to be more worried about mynas than cane toads, foxes, feral cats and rabbits.”

Unfortunately, the newspaper had already decided on its conclusion: that Common Mynas are a major pest reducing species abundance. Which of course is the story everyone expected to read, even though the actual research found an increase in the abundance of most of the species.

Which brings me to my other point, which is that the biggest factor affecting species numbers was change in habitat. Many of these birds, including both Common Mynas and Noisy Miners, do best in urban or lightly-forested areas, and are not found in dense forests.

The researchers concluded that attempting to eradicate Common Mynas would be very difficult and probably nowhere near as cost-effective as improving habitat to encourage native species.

It seems though that you won’t read that in The Age. Instead, you can read Kate Grarock’s own commentary on The Conversation website, http://theconversation.edu.au/we-love-to-hate-the-common-myna-but-what-should-we-do-about-it-8530