
Tuesday 12 May 2020

Ancestral tales: why we prefer fables to fact for human evolution

It seems that barely a month goes by without a news article concerning human ancestry. In the eight years since I wrote a post on the apparent dearth of funding in hominin palaeontology there appears to have been an uptick in the amount of research in the field. This is all to the good of course, but what is surprising is that much of the non-specialist journalism - and therefore public opinion - is still riddled with fundamental flaws concerning both our origins and evolution in general.

It also seems that our traditional views of humanity's position in the cosmos are often the source of the errors. It's one thing to make such howlers as the BBC News website did some years back, in which it claimed chimpanzees were direct human ancestors, but there are a number of key, more subtle errors that are repeated time and again. What's interesting is that in order to explain evolution by natural selection, words and phrases have become imbued with incorrect meaning or, in some cases, just a slight shift of emphasis. Either way, it seems that evolutionary ideas have been tacked onto existing cultural baggage and, in the process, failed to explain the intended theories; personal and socio-political truths have triumphed over objective truth, as Neil deGrasse Tyson might say.

1) As evolutionary biologist Stephen Jay Gould used to point out constantly, the tree of life is like the branches of a bush, not a ladder of linear progression. It's still fairly common to see the phrase 'missing link' applied to our ancestry, among others; I even saw David Attenborough mention it in a TV series about three years ago. A recent news article reported - as if in surprise - that there were at least three species of hominins living in Africa during the past few million years, at the same time and in overlapping regions too. Even college textbooks use the phrase - albeit in quotation marks - among a plethora of other phrases that were once valid, so perhaps it isn't surprising that popular publications continue to use them without qualification.

Evolution isn't a simple, one-way journey through space and time from ancestors to descendants: separate but contemporaneous child species can arise via geographical isolation and then migrate to a common location, all while their parent species continues to exist. An example today would be the lesser black-backed and herring gulls of the Arctic circle, which form either a single, variable species or two clearly distinct species, depending on where you look within their range.

It might seem obvious, but species also migrate and then their descendants return to the ancestral homeland; the earliest apes evolved in Africa and then migrated to south-east Asia, some evolving into the ancestors of gibbons and orangutans while others returned to Africa to become the ancestors of gorillas and chimpanzees. One probable culprit behind the linear progression model is that some of the examples chosen to teach evolution, such as the horse, have few branches in their ancestry, giving the false impression of a ladder in which a descendant species always replaces an earlier one.

2) What defines a species is also much misunderstood. The standard description doesn't do any favours in disentangling human evolution; this is where Richard Dawkins' oft-repeated phrase 'the tyranny of the discontinuous mind' comes into play. Examine a range of diagrams for our family tree and you'll find distinct variations, with certain species sometimes being shown as direct ancestors and sometimes as cousins on extinct branches.

If Homo heidelbergensis is the main root stock of modern humans but some of us have small amounts of Neanderthal and/or Denisovan DNA, then do all three qualify as direct ancestors of modern humans? Just where do you draw the line, bearing in mind that every generation could breed with the generations immediately before and after it? Even with rapid speciation events between long periods of limited variability (A.K.A. punctuated equilibrium) there is no clear cut-off point separating us from them. Yet it's very rare to see Neanderthals labelled as Homo sapiens neanderthalensis and much more common to see them listed as Homo neanderthalensis, implying a wholly separate species.

Are religious beliefs and easy-to-digest just-so stories blinding us to the complex, muddled background of our origins? Obviously, the word 'race' has profoundly negative connotations these days, with old-school categorisations of human variation now known to be plain wrong. For example, there's greater genetic variation in the present-day sub-Saharan African population than in the rest of the world combined, thanks to it being the homeland of all hominin species and the out-of-Africa migrations of modern humans occurring relatively recently.

We should also consider that species can be separated by behaviour, not just obvious physical differences. Something as simple as the different pitches of mating calls separates some frog species, with experiments showing that the animals can be fooled by artificially changing the pitch. Also, just because species appear physically similar doesn't necessarily mean a close evolutionary relationship: humans and all other vertebrates are far closer to spiny sea urchins and knobbly sea cucumbers than they are to any land invertebrates such as the insects.

3) Since the Industrial Revolution, societies - at least in the West - have become obsessed with growth, progress and advance. This bias has clearly affected the popular conception that evolution always leads to improvements, along the lines of faster cheetahs to catch more nimble gazelles and 'survival of the fittest'. Books speak of our epoch as the Age of Mammals, when by most important criteria we live in the era of microbes; just think of the oxygen-generating cyanobacteria. Many diagrams of evolutionary trees place humans on the central axis and/or at the pinnacle, as if we were destined to be the best thing that over three billion years of natural selection could achieve. Of course, this is no better than what many religions have said, whereby humans are the end goal of the creator and the planet is ours to exploit and despoil as we like (let's face it, for a large proportion of our existence, modern Homo sapiens was clearly less well adapted to glacial conditions than the Neanderthals).

Above all, these charts give the impression of a clear direction for evolution with mammals as the core animal branch. Popular accounts still describe our distant ancestors, the synapsids, as the 'mammal-like reptiles', even though they evolved from a common ancestor of reptiles, not from reptiles per se. Even if this is purely due to lazy copying from old sources rather than fact-checking, doesn't it undermine the main point of the publication? Few general-audience articles admit that all of the earliest dinosaurs were bipedal, presumably because we would like to conflate standing on two legs with more intelligent or 'advanced' (a tricky word to use in a strict evolutionary sense) lineages.

The old ladder of fish-amphibian-reptile/bird-mammal still hangs over us and we seem unwilling to acknowledge extinct groups (technically, clades) that break our neat patterns. Incidentally, for the past 100 million years or so, about half of all vertebrate species have been teleost fish - so much for the Age of Mammals! No-one would describe the immensely successful but long-extinct trilobites as just being 'pill bug-like marine beetles' or similar, yet when it comes to humans, we have a definite sore spot. There is a deep psychological need to have an obvious series of ever-more sophisticated ancestors paving the way for us.

What many people don't realise is that organisms frequently evolve both physical and behavioural attributes that are subsequently lost and possibly later regained. Some have devolved into far simpler forms, frequently becoming parasites. Viruses are themselves a simplified life form, unable to reproduce without a hijacked cell doing the work for them; no-one could accuse them of not being highly successful - as we are currently finding out to our cost. We ourselves are highly adaptable generalists, but on a component-by-component level it would appear that only our brains make us as successful as we are. Let's face it, physically we're not up to much: even cephalopods such as squid and octopus have a form of camera eye that is superior to that of all vertebrates.

Even a cursory glance at the natural history of life, using scientific disciplines as disparate as palaeontology and comparative DNA analysis, shows that some lineages proved so successful that their outward physiology has changed very little. Today, there are over thirty species of lancelet that are placed at the base of the chordates and therefore closely related to the ancestors of all vertebrates. They are also extremely similar in appearance to 530-million-year-old fossils of the earliest chordates in the Cambrian period. If evolution were a one-way ticket to progress, why have they not long since been replaced by later, more sophisticated organisms?

4) We appear to conflate success simply with being in existence today, yet our species is a newcomer and barely out of the cradle compared to some old-timers. We recently learned that Neanderthals wove plant fibre to make string and ate a wide variety of seafood. This knowledge brings with it a dwindling uniqueness for modern Homo sapiens. The frequently given explanation of our superiority over our extinct cousins is simply that they aren't around anymore, except as minor components of our genome. But this is a tautology: they are inferior because they are extinct and therefore an evolutionary dead end; yet they became extinct because of their inferiority. Hmmm...there's not much science going on here!

The usual story until recently was that at some point (often centred around 40,000-50,000 years ago) archaic Homo sapiens developed modern human behaviour, principally in the form of imaginative, symbolic thinking. This of course ignores the (admittedly tentative) archaeological evidence of Neanderthal cave-painting, jewellery and ritual, all of which are supposed to be evidence of our direct ancestors' unique Great Leap Forward (yes, it was named after Chairman Mao's plan). Not only did Neanderthals have this symbolic behaviour, they appear to have developed it independently of genetically-modern humans. This is a complete about-turn from the previous position of them being nothing more than poor copyists.

There are alternative hypotheses to the Great Leap Forward, including:
  1. Founder of the Comparative Cognition Project and primate researcher Sarah Boysen observed that chimpanzees can create new methods for problem solving and processing information. Therefore, a gradual accumulation of cognitive abilities and behavioural traits over many millennia - and partially inherited from earlier species - may have reached a tipping point. 
  2. Some geneticists consider there to have been a sudden paradigm shift caused by a mutation of the FOXP2 gene, leading to sophisticated language and all that it entails.
  3. Other researchers consider that once a certain population size and density was achieved, complex interactions between individuals led the way to modern behaviour. 
  4. A better diet, principally in the form of larger amounts of cooked meat, led to increased cognition. 
In some ways, all of these are partly speculative and, as is often the case, we may eventually find that a combination of these and other factors was involved. This shouldn't stop us from realising how poor the communication of evolutionary theories still is and how many misconceptions exist, with the complex truth obscured by our need to feel special and to tell simple stories that rarely convey the amazing evolution of life on Earth.



Wednesday 20 March 2019

My family & other animals: what is it that makes Homo sapiens unique?

It's a curious thing, but I can't recall ever having come across a comprehensive assessment of what differentiates Homo sapiens from all other animals. Hence this post is a brief examination of what I have found out over the years. I originally thought of dividing it into three neat sections, but quickly discovered that this would be, as Richard Dawkins once put it, 'a gratuitously manufactured discontinuity in a continuous reality.' In fact, I found a reasonably smooth gradation between these segments:
  1. Long-held differences now found to be false
  2. Possible distinctions - but with caveats
  3. Uniquely human traits
Despite the carefully-observed, animal-centred stories of early civilisations - Aesop's fable of The Crow and the Pitcher springs to mind - the conventional wisdom until recently was that animals are primarily automatons and as such readily exploitable by humanity. Other animals were deemed vastly inferior to us in kind, not just degree, with a complete lack of awareness of themselves as individuals.

The mirror test, developed in 1970, has disproved that for a range of animals, from the great apes to elephants, dolphins to New Caledonian crows. Therefore, individuals of some species can differentiate themselves from their kin, leading to complex and fluid hierarchies within groups - and in the case of primates, some highly Machiavellian behaviour.

Man the tool-maker has been a stalwart example of humanity's uniqueness, but a wide range of animals in addition to the usual suspects (i.e. great apes, dolphins and Corvidae birds) are now known to make and use tools on a regular basis. Examples include sea otters, fish, elephants, and numerous bird species, the latter creating everything from fish bait to insect probes. Even octopuses are known to construct fences and shelters, such as stacking coconut shells - but then they do have eight ancillary brains in addition to the main one!

We recognise regional variations in human societies as the result of culture, but some animal species also have geographically-differentiated traits or tools that are the obvious equivalent. Chimpanzees are well known for their variety of techniques used in obtaining food or making tools. These skills are handed down through the generations, remaining different to those used in neighbouring groups.

Interestingly, farming has really only been adopted by the most humble of organisms, namely the social insects. Ants and termites farm aphids and fungi in their complex, air-conditioned cities that have more than a touch of Aldous Huxley's Brave New World about them; in a few species, the colonies may even largely consist of clones!

Although many animals construct nests, tunnels, dams, islets or mounds, these appear to serve purely functional purposes: there is no equivalent of the human architectural aesthetic. Octopus constructions aside, birds for example will always build a structure that resembles the same blueprint used by the rest of their kind.

Many species communicate by aural, gestural or pheromonal languages, but only humans can store information outside of the body and across generations living at different times. Bird song might sound pretty, but again, this appears to be a series of basic, hard-wired communications. Conversely, humpback whale song may contain artistic values, but we just don't know enough about it to judge it in this light.

Birds and monkeys are happy to hoard interesting objects, but there is little aesthetic sense in animals other than that required to identify a high-quality mate. In contrast, there is evidence to suggest that other species in the hominin line, such as Neanderthals and Homo erectus, created art in forms recognisable today, including geometric engravings and jewellery.

Some of our ancestors' earliest artworks are realistic representations, whereas when armed with a paint brush, captive chimps and elephants produce abstract work reminiscent of that of pre-school children. We should remember that only since the start of the Twentieth Century has abstract art become an acceptable form for professional artists.

Jane Goodall's research on the Gombe chimps shows that humans are not the only animal to fight and kill members of the same species for reasons other than predation or rivalry. Sustained group conflict may be on a smaller scale and have fewer rules than sanctioned warfare, but it still has enough similarity to our own violence to say that humanity is not its sole perpetrator. One interesting point is that although chimps have been known to use sharpened sticks to spear prey, they haven't as yet used their weapons on each other.

Chimpanzees again have been shown to empathise with other members of their group, for example after the death of a close relative. Altruism has also been observed in the wild, but research suggests there is frequently another motive involved as part of a long-term strategy. This is countered with the notion that humans are able to offer support without the expectation of profit or gain in the future; then again, what percentage of such interactions are due to a profitless motivation is open to question.

A tricky area is to speculate on the uniqueness of ritual to Homo sapiens. While we may have usurped the alpha male position in domesticated species such as dogs, their devotion and loyalty seem too far from deity worship to be a useful comparison; surely the idea of organised religion is alien to all other species? Archaeological evidence shows what appear to be Neanderthal rituals centred on cave bears, as well as funereal rites, but the DNA evidence for interbreeding with modern humans doesn't give enough separation to allow religion to be seen as anything other than a human invention. What is probably true though is that we are the only species aware of our own mortality.

One area in which humans used to be deemed sole practitioners is abstract thought, but even here there is evidence that the great apes have some capability, albeit no greater than that of a pre-schooler. Common chimps and bonobos raised in captivity have learnt - in some cases by observation, rather than being directly taught - how to use sign language or lexigrams to represent objects and basic grammar. It's one thing to see a button with a banana on it and to learn that pressing it produces a banana, but to receive the same reward for pressing an abstract symbol shows a deeper understanding of relationship and causality.

Consideration of potential futures is also shared with birds of the Corvidae family, which are able to plan several steps ahead. Where humans are clearly far ahead is in degree rather than kind: we have the ability to consider numerous future paths and act accordingly. This level of sophistication and branch analysis appears to be uniquely human, allowing us to cogitate about possibilities that might occur in the future - or may never be possible. Both prose and poetic literature are likely to be uniquely human; at least until we can decipher humpback whale song.

Finally, there is science, possibly the greatest of human inventions. The multifarious aspects of the scientific endeavour, from tentative hypothesis to experimentation, advanced mathematics to working theory, are unlikely to be understood let alone attempted by any other species. The combination of creative and critical thinking, rigour and repetition, and objectivity and analysis require the most sophisticated object in the known universe, the human brain. That's not to say there aren't far more intelligent beings out there somewhere, but for now there is one clear activity that defines us as unique. And thank goodness it isn't war!

Friday 11 January 2019

Hot, cold or in between: thermoregulation and public misunderstanding of science

I recently spotted an intriguing palaeontology article concerning the 180-million-year-old fossil remains of an ichthyosaur, a marine reptile from the Early Jurassic. The beastie, belonging to the genus Stenopterygius, is so well preserved that it shows coloration patterns (if not the colours themselves) on patches of scaleless skin, as well as a thick layer of insulating fat or blubber. What makes the latter so intriguing is that reptiles just aren't meant to have blubber. Then again, like some snakes and skinks today, ichthyosaurs must have given birth to live young. Thus the gap between reptiles and mammals surely grows ever smaller?

This conundrum touches on some interesting issues about the public's knowledge of science. Several times I've commented on what Richard Dawkins calls the "tyranny of the discontinuous mind", which is the way in which we use categorisation to make it easier to understand the world. It might seem that this is the very essence of some aspects of science, as in New Zealand physicist Ernest Rutherford's famously ungenerous quote that "Physics is the only real science. The rest are just stamp collecting." Indeed, examination of the life and work of many early botanists, for example, might appear to verify this statement. However, there needs to be an understanding that science requires a flexible mindset, a fundamental scientific process being the discarding of a pet theory in favour of a more accurate one.

I'm sure I've remarked countless times - again, echoing Professor Dawkins - that science is in this respect the antithesis of most religions, which set key ideas into stone and refuse to accept any challenges towards them. In the case of the blubber-filled Stenopterygius, it is still a reptile, albeit one that had many of the attributes of mammals. As for the latter, from our pre-school picture books onwards we tend to think of the main mammalian subclass, the placentals, but there are two smaller subclasses: the marsupials, such as the kangaroo; and the monotremes, for example the duck-billed platypus. It has been known since the 1880s that the platypus lays eggs rather than giving birth to live young, a characteristic it shares with the other four monotreme species alive today. In addition, their body temperature is five degrees Celsius lower than that of placental mammals, part of a suite of features presumably retained from their mammal-like reptile ancestors.

Even so, these traits do not justify the comment made by host Stephen Fry in a 2005 episode of the BBC TV quiz show QI, when he claimed that marsupials are not mammals! Richard Dawkins has frequently pointed out that it would be unacceptable to have a similar level of ignorance about the arts as there is on scientific matters, with this being a clear case in point as regards the cultured and erudite Mr Fry. Yet somehow, much of the general public has either a lack or a confusion concerning basic science. Indeed, only last week I listened to a BBC Radio topical comedy show in which none of the panel members could work out why one face of the moon is always hidden from our view. Imagine the response if it had been a basic lack of knowledge in the arts and literature, for example if an Oxbridge science graduate had claimed that Jane Austen had written Hamlet!

Coming back to the ichthyosaur, one thing we may have learnt as a child is that some animals are warm-blooded and others cold-blooded. This may be useful as a starting point, but it is an overly simplistic and largely outmoded evaluation of the relevant biology; such binary categorisation is of little use after primary school age. In fact, there is a series of steps from endothermic homeotherms (encompassing most mammals and birds) to ectothermic poikilotherms (most species of fish, reptiles, amphibians and invertebrates), with the former metabolic feature having evidently developed from the latter.

Ichthyosaurs are likely to have had one of the intermediate metabolisms, as may have been the case for some species of dinosaurs, possibly the smaller, feathered, carnivorous theropods. Likewise, some tuna and shark species are known to be able to produce heat internally, but in 2015 researchers at the US National Marine Fisheries Service announced that five species of the opah fish were found to be whole-body endotherms. Clearly, the boundaries between us supposedly higher mammals and everything else are far less secure than we had previously believed.

At times, science terminology might appear too abstruse, too removed from the everyday and of little practical use outside of a pub quiz, but then does being able to critique Shakespeare or Charles Dickens help to reduce climate change or create a cure for cancer? Of course we should strive to be fully-rounded individuals, but for too long STEM has been side-lined or stereotyped as too difficult or irrelevant when compared with the humanities.

Lack of understanding of the subtleties and gradations (as opposed to clearly defined boundaries) in science makes it easy for anti-science critics to generate public support. Ironically, this criticism tends to take one of two clearly opposing forms: firstly, that science is mostly useless - as epitomised by the Ig Nobel Prize; and alternatively, that it leads to dangerous inventions, as per the tabloid scare-mongering around genetically modified organisms (GMOs), or 'Frankenfoods' as they are caricatured.

Being able to discern nuanced arguments such as the current understanding of animal thermoregulation is a useful tool for all of us. Whether it is giving the public a chance to vote in scientifically-related referendums or just arming them so as to avoid quack medicine, STEM journalism needs to improve beyond the lazy complacency that has allowed such phrases as 'warm-blooded', 'living fossil', 'ice age' and 'zero gravity' to be repeatedly misused. Only then will science be seen as useful, relevant and, above all, a far more approachable discipline than it is currently deemed to be.

Thursday 27 September 2018

The anaesthetic of familiarity: how our upbringing can blind us to the obvious

In the restored Edwardian school classroom at Auckland's Museum of Transport and Technology (MOTAT) there is a notice on the wall stating 'Do not ask your teacher questions.' Fortunately, education now goes some way in many nations to emphasising the importance of individual curiosity rather than mere obedience to authority. Of course, there are a fair number of politicians and corporation executives who wish it wasn't so, as an incurious mind is easier to sway than a questioning one. As my last post mentioned, the World Wide Web can be something of an ally for them, since the 'winner takes all' approach of a review-based system aids the slogans and rhetoric of those who wish to control who we vote for and what we buy.

Even the most liberal of nations and cultures face self-imposed hurdles centred on which is the best solution and which is just the most familiar one from our formative years. This post therefore looks at another side of the subjective thinking discussed earlier this month, namely a trap that Richard Dawkins has described as the "anaesthetic of familiarity". Basically, this is when conventions are so accepted as to be seen as the primary option instead of being merely one of a series of choices. Or, as the British philosopher Susan Stebbing wrote in her 1939 book Thinking to Some Purpose: "One of the gravest difficulties encountered at the outset of the attempt to think effectively consists in the difficulty of recognizing what we know as distinguished from what we do not know but merely take for granted."

Again, this mindset is much loved by the manufacturing sector; in addition to such well-known ploys as deliberate obsolescence and staggered release cycles, there are worse examples, especially in everyday consumerism. We often hear how little nutritional value many highly processed foods contain, but think what this has done for the vitamin and mineral supplement industry, whose annual worldwide sales now approach US$40 billion!

Citizens of developed nations today face very different key issues from those of our pre-industrial ancestors, not the least among them being a constant barrage of decision making. Thanks to the enormous variety of choices available concerning almost every aspect of our daily lives, we have to consider everything from what we wear to what we eat. The deluge of predominantly useless information that we receive in the era of the hashtag makes it more difficult for us to concentrate on problem solving, meaning that the easiest way out is just to follow the crowd.

Richard Dawkins' solution to these issues is to imagine yourself as an alien visitor and then observe the world as a curious outsider. This seems to me to be beyond the reach of many, for whom daily routine appears to be their only way to cope. If this sounds harsh, it comes from personal experience; I've met plenty of people who actively seek an ostrich-like head-in-the-sand approach to life to avoid the trials and tribulations - as well as the wonders - of this rapidly-changing world.

Instead, I would suggest an easier option when it comes to some areas of STEM research: ensure that a fair proportion of researchers and other thought leaders are adult migrants from other nations. Then they will be able to apply an outside perspective, hopefully identifying givens that are too obvious to be spotted by those who have grown up with them.

New Zealand is a good example of this, with arguably its two best known science communicators having been born overseas: Siouxsie Wiles and Michelle Dickinson, A.K.A. Nanogirl. Dr Wiles is a UK-trained microbiologist at the University of Auckland. She frequently appears on Radio New Zealand as well as undertaking television and social media work to promote science in general, as well as for her specialism of fighting bacterial infection.

Dr Dickinson is a materials engineering lecturer and nanomaterials researcher at the University of Auckland who studied in both the UK and USA. Her public outreach work includes books, school tours and both broadcast and social media. She has enough sci-comm kudos that last year, despite not having a background in astronomy, she interviewed Professor Neil deGrasse Tyson during the Auckland leg of his A Cosmic Perspective tour.

The work of the above examples is proof that newcomers can recognise a critical need that their home-grown equivalents may overlook. What is interesting is that despite coming from English-speaking backgrounds - and therefore with limited cultural disparity to their adoptive New Zealand - there must have been enough that was different to convince Doctors Wiles and Dickinson of the need for a hands-on, media-savvy approach to science communication.

This is still far from the norm: many STEM professionals believe there is little point to promoting their work to the public except via print-based publications. Indeed, some famous science communicators such as Carl Sagan and Stephen Jay Gould were widely criticised during their lifetime by the scientific establishment for what were deemed undue efforts at self-promotion and the associated debasement of science by combining it with show business.

As an aside, I have to say that as brilliant as some volumes of popular science are, they do tend to preach to the converted; how many non-science fans are likely to pick up a book on say string theory, just for a bit of light reading or self-improvement (the latter being a Victorian convention that appears to have largely fallen from favour)? Instead, the outreach work of the expat examples above is aimed at the widest possible audience without over-simplification or distortion of the principles being communicated.

This approach may not solve all issues about how to think outside the box - scientists may be so embedded within their culture as to not realise that there is a box - but surely by stepping outside the comfort zone we grew up in we may find problems that the local population hasn't noticed?

Critical thinking is key to the scientific enterprise but, it would appear, to little else in human culture. If we can find methods to avoid the anaesthetic of familiarity and acknowledge that what we deem normal can be far from optimal, then these should be promoted with all gusto. If the post-modern creed is that all world views are equally valid and science is just another form of culture-biased story-telling, then now more than ever we need cognitive tools to break through the subjective barriers. If more STEM professionals are able to cross borders and work in unfamiliar locations, isn't there a chance they can recognise issues that fall under the local radar and so supply the new perspective we need if we are to fulfil our potential?

Wednesday 12 September 2018

Seasons of the mind: how can we escape subjective thinking?

According to some people I've met, the first day of spring in the Southern Hemisphere has been and gone with the first day of September. Not coincidentally, there are also some, myself included, who think that it has suddenly started to feel a bit warmer. Apparently, though, the official start date is at the spring equinox during the third week of September. So on the one hand the weather has been warming since the start of the month, but on the other, why should a planet follow neat calendrical conventions, i.e. the first of anything? Just how accurate is the official definition?

There are many who like to reminisce about how much better the summer weather was back in their school holidays. The rose-tinted memories of childhood can seem idyllic, although I also recall summer days of non-stop rain (I did grow up in the UK, after all). Our personal experiences, particularly during our formative years, can therefore promote an emotion-based response so far ingrained that we fail to consider it may be inaccurate. Subjectivity and wishful thinking are key to the human experience: how often do we remember the few hits and not the far more numerous misses? As science is practised by humans it is subject to the same lack of objectivity as anything else; only its built-in error-checking can steer practitioners onto a more rational course than in other disciplines.

What got me to ponder the above was that on meeting someone a few months ago for the first time, almost his opening sentence was a claim that global warming isn't occurring and that instead we are on the verge of an ice age. I didn't have time for a discussion on the subject, so I filed that one for reply at a later date. Now seems like a good time to ponder what it is that leads people to make such assertions that are seemingly contrary to the evidence.

I admit to being biased on this particular issue, having last year undertaken research for a post on whether agriculture has postponed the next glaciation (note that this woolly - but not mammoth, ho-ho - terminology is one of my bugbears: we are already in an ice age, but currently in an interglacial stage). Satellite imagery taken over the past few decades shows clear evidence of large-scale reductions in global ice sheets. For example, the northern polar ice cap has been reduced by a third since 1980, with what remains only half its previous thickness. Even so, are three decades a long enough period to make accurate predictions? Isn't using a scale commensurate with a human lifespan just as bad as relying on personal experience?

The UK's Met Office has confirmed that 2018 was that nation's hottest summer since records began - which in this instance only goes back as far as 1910. In contrast, climate change sceptics use a slight growth in Antarctic sea ice (contrary to its steadily decreasing continental ice sheet) as evidence of climate equilibrium. Now I would argue that this growth is just a local drop in the global ocean, but I wonder if my ice age enthusiast cherry-picked this data to formulate his ideas? Even so, does he believe that all the photographs and videos of glaciers, etc. have been faked by the twenty or so nations who have undertaken Earth observation space missions? I will find out at some point!

If we try to be as objective as possible, how can we confirm with complete certainty the difference between long term climate change and local, short term variability? In particular, where do you draw the line between the two? If we look at temporary but drastic variations over large areas during the past thousand years, there is a range of time scales to explore. The 15th to 18th centuries, predominantly the periods 1460-1550 and 1645-1715, contained climate variations now known as mini ice ages, although these may have been fairly restricted in geographic extent. Some briefer but severe, wide-scale swings can be traced to single events, such as the four years of cold summers following the Tambora eruption of 1815.

Given such variability over the past millennium, in itself a tiny fragment of geological time, how much certainty surrounds the current changes? The public have come to expect yes or no answers delivered with aplomb, yet some areas of science such as climate studies involve chaos mathematics, thus generating results based on levels of probability. What the public might consider vacillation, researchers consider the ultimate expression of scientific good practice. Could this lack of black-and-white certainty be why some media channels insist on providing a 'counterbalancing' viewpoint from non-expert sources, as ludicrous as this seems?

In-depth thinking about a subject relies upon compartmentalisation and reductionism. Otherwise, we would forever be bogged down in the details and never be able to form an overall picture. But this quantising of reality is not necessarily a good thing if it generates a false impression regarding cause and effect. By suffering from what Richard Dawkins calls the “tyranny of the discontinuous mind” we are prone to creating boundaries that just don't exist. In which case, could a line ever be found between short term local variation and global climate change? Having said that, I doubt many climate scientists would use this as an excuse to switch to weather forecasting instead. Oh dear: this is beginning to look like a 'does not compute' error!

In a sense of course we are exceptionally lucky to have developed science at all. We rely on language to define our ideas, so need a certain level of linguistic sophistication to achieve this focus; tribal cultures whose numbers consist of imprecise values beyond two are unlikely to achieve much headway in, for example, classifying the periodic table.

Unfortunately, our current obsession with generating information of every quality imaginable and then uploading it to every available channel for the widest possible audience inevitably leads to a tooth-and-claw form of meme selection. The upshot of this bombardment of noise and trivia is that an enormous amount of time is required just to filter it, the knock-on effect being that minimal time is left for identifying the most useful or accurate content rather than simply the most disseminated.

Extremist politicians have long been adept at exploiting this weakness to expound polarising phraseology that initially sounds good but lacks depth; they achieve cut-through with the simplest and loudest of arguments, fulfilling the desire most people have to fit into a rigid social hierarchy - as seen in many other primate species. The problem is that in a similar vein to centrist politicians who can see both sides of an argument but whose rational approach negates emotive rhetoric, scientists are often stuck with the unappealing options of either taking a stand when the outcome is not totally clear, or facing accusations of evasion. There is a current trend, particularly espoused by politicians, to disparage experts, but discovering how the universe works doesn't guarantee hard-and-fast answers supplied exactly when required and which provide comfort blankets in a harsh world.

Where then does this leave critical thinking, let alone science? Another quote from Richard Dawkins is that "rigorous common sense is by no means obvious to much of the world". This pessimistic view of the human race is supported by many a news article but somewhat negated by the immense popularity of star science communicators, at least in a number of countries.

Both the methods and results of science need to find a space amongst the humorous kitten videos, conspiracy theorists and those who yearn for humanity to be the pinnacle and purpose of creation. If we can comprehend that our primary mode of thinking includes a subconscious baggage train of hopes, fears and distorted memories, we stand a better chance of seeing the world for how it really is and not how we wish it to be. Whether enough of us can dissipate that fog remains to be seen. Meanwhile, the ice keeps melting and the temperature rising, regardless of what you might hear...

Sunday 15 July 2018

Minding the minuscule: the scale prejudice in everyday life

I was recently weeding a vegetable bed in the garden when out of the corner of my eye I noticed a centipede frantically heading for cover after I had inadvertently disturbed its hiding spot. In my experience, most gardeners are oblivious to the diminutive fauna and flora around them unless they are pests targeted for removal or obliteration. It's only when the likes of a biting or stinging organism - or even just a large and/or hairy yet harmless spider - comes into view do people consciously think about the miniature cornucopia of life around them.

Even then, we consider our needs rather greater than theirs: how many of us stop to consider the effect we are having when we dig up paving slabs and find a bustling ant colony underneath? In his 2004 essay Dolittle and Darwin, Richard Dawkins pondered what contemporary foible or -ism future generations will castigate us for. Something I consider worth looking at in this context is scale-ism, which might be defined as the failure to apply a suitable level of consideration to life outside of 'everyday' measurements.

I've previously discussed near-microscopic water-based life, but larger fauna visible without optical aids are also easy to overlook when humans are living in a primarily artificial environment - as over half our species is now doing. Several ideas spring to mind as to why breaking this scale-based prejudice could be important:
  1. Unthinking destruction or pollution of the natural environment doesn't just cause problems for 'poster' species, predominantly cuddly mammals. The invertebrates that live on or around larger life-forms may be critical to these ecosystems or even further afield. Removal of one, seemingly inconsequential, species could allow another to proliferate at potentially great cost to humans (for example, as disease vectors or agricultural pests). Food webs don't begin at the chicken and egg level we are used to from pre-school picture books onwards.
  2. The recognition that size doesn't necessarily equate to importance is critical to the preservation of the environment not just for nature's sake but for the future of humanity. Think of the power of the small water mould Phytophthora agathidicida which is responsible for killing the largest residents of New Zealand's podocarp forests, the ancient coniferous kauri Agathis australis. The conservation organisation Forest and Bird claims that kauri are the lynchpin for seventeen other plant species in these forests: losing them will have a severe domino effect.
  3. Early detection of small-scale pests may help to prevent their spread but this requires vigilance from the wider public, not just specialists; failure to recognise that tiny organisms may be far more than a slight nuisance can be immensely costly. In recent years there have been two cases in New Zealand where the accidental import of unwanted insects had severe if temporary repercussions for the economy. In late 2017 three car carriers were denied entry to Auckland when they were found to contain the brown marmorated stink bug Halyomorpha halys. If they had not been detected, it is thought this insect would have caused NZ$4 billion in crop damage over the next twenty years. Two years earlier, the Queensland fruit fly Bactrocera tryoni was found in central Auckland. As a consequence, NZ$15 million was spent eradicating it, a small price to pay for the NZ$5 billion per annum it would have cost the horticulture industry had it spread.
Clearly, these critters are to be ignored at our peril! Although the previous New Zealand government introduced the Predator Free 2050 programme, conservation organisations are claiming the lack of central funding and detailed planning makes the scheme unrealistic by a large margin (if anything, the official website suggests that local communities should organise volunteer groups and undertake most of the work themselves!) Even so, this scheme is intended to eradicate alien mammal species, presumably on the grounds that despite their importance, pest invertebrates are just too small to keep excluded permanently - the five introduced wasp species springing to mind at this point.

It isn't just smaller-scale animals that are important (and how many people have you met who think that the word 'animal' means only a creature with a backbone, not insects and other invertebrates?); minute and inconspicuous plants and fungi also need considering. As curator at Auckland Botanic Gardens, Bec Stanley is keen to point out that most of the public appear to have plant blindness. Myrtle rust is a fungus that attacks native plants such as the iconic pōhutukawa or New Zealand Christmas tree, having most probably been carried on the wind to New Zealand from Australia. It isn't just New Zealand's Department of Conservation that is asking the public to watch out for it: the Ministry for Primary Industries also requests notification of its spread across the North Island, due to the potential damage to commercial species such as eucalyptus. This is yet another example of a botanical David versus Goliath situation going on right under our oblivious noses.

Even without the economic impact, paying attention to the smaller elements within our environment is undoubtedly beneficial. Thinking more holistically and less parochially is often a good thing when it comes to science and technology; paradigm shifts are rarely achieved by being comfortable and content with the status quo. Going beyond the daily centimetre-to-metre range that we are used to dealing with allows us to comprehend a bit more of the cosmic perspective that Neil deGrasse Tyson and other science communicators endeavour to promote - surely no bad thing when it comes to lowering boundaries between cultures in a time with increasingly sectarian states of mind?

Understanding anything a little out of the humdrum can be interesting in and of itself. As Brian Cox's BBC documentary series Wonders of Life showed, a slight change of scale can lead to apparent miracles, such as the insects that can walk up glass walls or support hundreds of times their own weight and shrug off equally outsized falls. Who knows, preservation or research into some of our small-scale friends might lead to considerable benefits too, as with the recent discovery of the immensely strong silk produced by Darwin's bark spider Caerostris darwini. Expanding our horizons isn't difficult, it just requires the ability to look down now and then and see what else is going on in the world around us.

Wednesday 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to one search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally-published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit to explore reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works is it little wonder that the plethora of online sources - of immensely variable quality - can cause confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model and so culture, gender, politics and even religion play their part as to what is funded and how the results are presented (or even which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expenses associated with producing popular science books means that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy to find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers, largely on general news sites, and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results) but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes, the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So in the spirit of science (okay, so it's hardly comprehensive being just a single trial) I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it will only last another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that, in those early days of the hypothesis, very little of it could be confirmed with any degree of certainty.

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases lead to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years it has been stable for the past 25 million and so is no longer deteriorating. A third, far less common category, concerns the informal polls taken of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either category to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find around cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with body text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach is friendly to organic search at the same time as amalgamating socio-political concerns with the science.

You might expect that the results would show a change in trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far only offering support to the rapid degeneration hypothesis. The reason for this difference is readily apparent: the publication of a Danish study that bolsters support for it. This new report is available online, but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have to either rely upon second-hand summaries or, if there was enough time, wait for the next popular science book that discusses it in layman's terms.

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it only examined the genes of sixty-two Danish men, so I have no idea if this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better sort of popular science books would answer these queries, such is my faith in the general superiority of print volumes!

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research was only applicable to other types of animals, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hope of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly-resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?), why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Tuesday 29 August 2017

Cerebral celebrities: do superstar scientists harm science?

One of my earliest blog posts concerned the media circus surrounding two of the most famous scientists alive today: British physicist Stephen Hawking and his compatriot the evolutionary biologist Richard Dawkins. In addition to their scientific output, they are known in public circles thanks to a combination of their general readership books, television documentaries and charismatic personalities. The question has to be asked, though: how much of their reputation is due to their being easily-caricatured and therefore media-friendly characters, rather than to what they have contributed to human knowledge?

Social media has done much to democratise the publication of material from a far wider range of authors than was previously possible, but the current generation of scientific superstars who have arisen in the intervening eight years appear to be party to a feedback loop that places personality as the primary reason for their media success. As a result, are science heroes such as Neil deGrasse Tyson and Brian Cox merely adding the epithet 'cool' to STEM disciplines as they sit alongside the latest crop of media and sports stars? With their ability to fill arenas usually reserved for pop concerts or sports events, these scientists are seemingly known far and wide for who they are as much as for what they have achieved. It might seem counterintuitive to think that famous scientists and mathematicians could be damaging STEM, but I'd like to put forward five ways by which this could be occurring:

1: Hype and gossip

If fans of famous scientists spend their time reading, liking and commenting on trivial material, they may miss important stories from other, less famous sources. A recent example that caught my eye was a tweet by British astrophysicist and presenter Brian Cox, containing a photograph of two swans he labelled 'Donald' and 'Boris'. I assume this was a reference to the current US president and British foreign secretary, but with over a thousand 'likes' by the time I saw it, I wonder what other, more serious, STEM-related stories might have been missed in the rapid ebb and flow of social media.

As you would expect with popular culture fandom, the science celebrities' material aimed at a general audience receives the lion's share of attention, leaving the vast majority of STEM popularisations under-recognised. Although social media has exacerbated this, the phenomenon pre-dates it. For example, Stephen Hawking's A Brief History of Time was first published in 1988, the same year as Timothy Ferris's Coming of Age in the Milky Way, a rather more detailed approach to similar material that was left overshadowed by its far more famous competitor. There is also the danger that celebrities with a non-science background might try to cash in on the current appeal of science and write poor-quality popularisations. If you consider this unlikely, bear in mind that there are already numerous examples of extremely dubious health, diet and nutrition books written by pop artists and movie stars. If scientists can be famous, perhaps the famous will play at being science writers.

Another result of this media hubbub is that in order to be heard, some scientists may be guilty of the very hype usually blamed on the journalists who publicise their discoveries. Whether to guarantee attention or to self-promote in order to gain further funding, an Australian research team recently came under fire for discussing a medical breakthrough as if a treatment were imminent, despite having so far only experimented on mice! This sort of hyperbole both damages the integrity of science in the public eye and can lead to such dangerous outcomes as the MMR scandal, which resulted in large numbers of children not being immunised.

2: Hero worship

The worship of movie stars and pop music artists is nothing new, and the adulation accorded them reminds me of the not dissimilar veneration shown to earlier generations of secular and religious leaders. The danger here, then, is for impressionable fans to accept the words of celebrity scientists as if they were gospel and so refrain from any form of critical analysis. When I attended an evening with astrophysicist Neil deGrasse Tyson last month I was astonished to hear some fundamental misunderstandings of science from members of the public. It seemed as if Dr Tyson had gained a personality cult whose members hung on his every utterance but frequently failed to understand the wider context or the key issues regarding the practice of science. By transferring hero worship from one form of human activity to another, fans risk undermining the very basis - and differentiation - that delineates the scientific enterprise.

3: Amplifying errors

Let's face it, scientists are human and make mistakes. The problem is that if the majority of a celebrity scientist's fan base is prepared to lap up every statement, the lack of critical analysis can generate further issues. There are some appalling gaffes in the television documentaries and popular books of such luminaries as Sir David Attenborough (as previously discussed) and even superstar Brian Cox is not immune: his 2014 book Human Universe described lunar temperatures dropping below -2000 degrees Celsius - an impossibility, given that absolute zero is -273.15 degrees Celsius! Such basic errors imply that the material is ghost-written or edited by authors with little scientific knowledge and no time for fact-checking. Of course this may embarrass the science celebrity in front of their potentially jealous colleagues, but more importantly it can serve as ammunition for politicians, industrialists and pseudo-scientists in their battles to persuade the public of the validity of their own pet theories - post-truth will out, and all that nonsense.

4: Star attitude

With celebrity status come the trappings of success, most usually defined as a luxury lifestyle. A recent online discussion here in New Zealand concerned the high cost of tickets for events featuring Neil deGrasse Tyson, Brian Greene, David Attenborough, Jane Goodall and, later this year, Brian Cox. Those for Auckland-based events were more expensive than tickets to see Kiwi pop star Lorde and similar in price to those for rugby matches between the All Blacks and the British Lions. By making the tickets this expensive there is little chance of attracting new fans; it seems to be more a case of preaching to the converted.

Surely it doesn't have to be this way: the evolutionary biologist Beth Shapiro, author of How to Clone a Mammoth, gave an excellent free illustrated talk at Auckland Museum a year ago. It seems odd that the evening with Dr Tyson, for example, consisting of just himself, interviewer Michelle Dickinson (A.K.A. Nanogirl) and a large screen, cost approximately double that of the Walking with Dinosaurs Arena event at the same venue two years earlier, which utilised US$20 million worth of animatronic and puppet life-sized dinosaurs.

Dr Tyson claims that by having celebrity interviewees on his Star Talk series he can reach a wider audience, but clearly this approach is not feasible when his tour prices are so high. At least Dr Goodall's profits went into her conservation charity, but if you consider that Dr Tyson had an audience of probably over 8,000 in Auckland alone, paying between NZ$95 and NZ$349 (except for the NZ$55 student tickets), you have to wonder where all this money goes: is he collecting 'billions and billions' of fancy waistcoats? Nor does this trend look likely to stop soon, as Bill Nye (The Science Guy) has just announced that he will be touring Australia later this year, with tickets starting at around NZ$77.
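As a rough back-of-the-envelope check on the Auckland figures (the NZ$200 mean ticket price below is purely my own assumption, roughly midway between the quoted extremes):

# Back-of-envelope gross takings for the Auckland evening; the audience figure
# comes from the estimate above, the mean ticket price is my own guess.
attendees = 8000
mean_ticket_price_nzd = 200  # somewhere between the NZ$95 and NZ$349 tickets
print(f"Estimated gross: NZ${attendees * mean_ticket_price_nzd:,}")
# Prints: Estimated gross: NZ$1,600,000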

5: Skewing the statistics

The high profiles of sci-comm royalty and their usually cheery demeanour imply that all is well in the field of scientific research, with adequate funding for important projects. However, even a quick perusal of less well-known STEM professionals on social media proves that this is not the case. An example that came to my attention back in May was that of the University of Auckland microbiologist Dr Siouxsie Wiles, who had to resort to crowdfunding for her research into fungi-based antibiotics after five consecutive funding submissions were rejected. Meanwhile, Brian Cox's connection to the Large Hadron Collider gives the impression that even such blue-sky research as the LHC can be guaranteed enormous budgets.

As much as I'd like to thank these science superstars for promoting science, technology and mathematics, I can't quite shake the feeling that their cult status is centred too much on them rather than on the scientific enterprise as a whole. Now more than ever science needs a sympathetic ear from the public, but this should be brought about by a massive programme to educate the public (they are the taxpayers, after all) as to the benefits of such costly schemes as designing nuclear fusion reactors and researching climate change. Simply treating celebrity scientists in the same way as movie stars and pop idols won't help an area of humanity under siege from so many influential political and industrial leaders with their own private agendas. We simply mustn't allow such people to misuse the discipline that has raised us from apemen to spacemen.

Saturday 16 August 2014

The escalating armoury: weapons in the war between science and woolly thinking

According to that admittedly dubious font of broad knowledge Wikipedia, there are currently sixteen Creationist museums in the United States alone. These aren't minor attractions for a limited audience of fundamentalist devotees either: one such institution in Kentucky has received over one million visitors in its first five years. That's hardly small potatoes! So how much is the admittance fee and when can I go?

Or maybe not. It isn't just the USA that has become home to such anti-scientific nonsense either: the formerly robust secular societies of the UK and Australia now house museums and wildlife parks with similar anti-scientific philosophies. For example, Noah's Ark Zoo Farm in England espouses a form of Creationism in which the Earth is believed to be a mere 100,000 years old. And of course in addition to traditional theology, there is plenty of pseudo-scientific/New Age nonsense that fails every test science can offer and yet appears to be growing in popularity. Anyone for Kabbalah?

It's thirty-five years since Carl Sagan's book Broca's Brain: Reflections on the Romance of Science summarised the scientific response to the pseudo-scientific writings of Immanuel Velikovsky. Although Velikovsky and his bizarre approach to orbital mechanics - created in order to provide an astrophysical cause for Biblical events - have largely been forgotten, his ideas were popular enough in their time. A similar argument could be made for the selective evidence technique of Erich von Daniken in the 1970s, whose works have sold an astonishing 60 million copies, and to a lesser extent the similar approach of Graham Hancock in the 1990s. But a brief look at that powerhouse of publishing distribution, Amazon.com, shows that today there is an enormous market for best-selling gibberish, one that far outstrips the lifetime output of a few top-ranking pseudo-scientists:
  • New Age: 360,000 titles
  • Spirituality: 243,000 titles
  • Religion: 1,100,000 titles
  • (Science: 3,100,000 titles)
(In the best tradition of statistics, all figures have been rounded slightly up or down.)

Since there hasn't exactly been a decrease in the evidence for most scientific theories, the appeal of the genre must be due to changes in society. After writing off the fundamentalist/indoctrinated as an impossible-to-change minority, what has led to the upsurge in popularity of so many publications at odds with critical thinking?

It seems that those who misinterpret scientific methodology, or are in dispute with it due to a religious conviction, have become adept at using the techniques that genuine science popularisation utilises. What used to be restricted to the printed word has been expanded to include websites, TV channels, museums and zoos that parody the findings of science without the required rigorous approach to the material. Aided and abetted by well-meaning but fundamentally flawed popular science treatments such as Bill Bryson's A Short History of Nearly Everything, which looks at facts without real consideration of the science behind them, the public are often left with little understanding of what separates science from its shadowy counterparts. Therefore the impression of valid scientific content that some contemporary religious and pseudo-science writers offer can quite easily be mistaken for the genuine article. Once the appetite for a dodgy theory has been whetted, it seems there are plenty of publishers willing to further the interest.

If a picture is worth a thousand words, then the 'evidence' put forward in support of popular phenomena such as an ancient alien presence or faked moon landings seems all the more impressive. At a time when computer-generated Hollywood blockbusters can be replicated on a smaller scale in the home, most people are surely aware of how easy it is to be fooled by visual evidence. But it seems that pictorial support for a strongly-written idea can resonate with the search for fundamental meaning in an ever more impersonal technocratic society. And of course if you are flooded with up-to-the-minute information from a dozen sources, it is much easier to absorb evidence from your senses than to unravel the details from that most passé of communication methods, boring old text. Which perhaps fails to explain just why there are quite so many dodgy theories available in print!

But are scientists learning from their antithesis how to fight back? With the exception of Richard Dawkins and other super-strict rationalists, science communicators have started to take on board the necessity of appealing to hearts as well as minds. Despite the oft-mentioned traditional differentiation from the humanities, science is a human construct and so may never be purely objective. Therefore why should religion and the feel-good enterprises beloved of pseudo-scientists hold the monopoly on awe and wonder?

Carl Sagan appears to have been a pioneer in the field of utilising language that is more usually the domain of religion. In The Demon-Haunted World: Science as a Candle in the Dark, he argues that science is 'a profound source of spirituality'. Indeed, his novel Contact defines the numinous outside of conventional religiosity as 'that which inspires awe'. If that sounds like woolly thinking, I'd recommend viewing the clear night sky away from city lights...

Physicist Freeman Dyson's introduction to the year 2000 edition of Sagan's Cosmic Connection uses the word 'gospel' and the phrase 'not want to appear to be preaching'. Likewise, Ann Druyan's essay A New Sense of the Sacred in the same volume includes material to warm the humanist heart. And of course the Neil deGrasse Tyson-presented reboot of Cosmos likewise seeks to touch the emotions as well as improve the mind, a task at which it sometimes - in my humble opinion - overreaches.

The emergence of international science celebrities such as Tyson is also helping to spread the intentions, if not always the details, of science as a discipline. For the first time since Apollo, former astronauts such as Canadian Chris Hadfield undertake international public tours. Neil deGrasse Tyson, Michio Kaku and Brian Cox are amongst those practising scientists who host their own regular radio programmes, usually far superior to the majority of popular television science shows. Even the seven-Oscar-winning movie Gravity may have helped promote science, with its at times accurate portrayal of the hostile environment outside our atmosphere, far removed from the science fantasy of most Hollywood productions. What was equally interesting was that deGrasse Tyson's fault-finding tweets about the film received a good deal of public attention. Does this suggest that despite the immense numbers of anti-scientific publications on offer, the public is prepared to put its trust in scientists again? After all, to paraphrase Monty Python, what have scientists ever done for us?

There are far more important uses for the time and effort that goes into such nonsense as the 419,000 results on Google discussing 'moon landing hoax'. And there's worse: a search for 'flat earth' generates 15,800,000 results. Not that most of these are from advocates, but surely very little of the material discussing these ideas ad nauseam would be missed?

It should be remembered that scientific knowledge can be progressed by unorthodox thought - from Einstein considering travelling alongside a beam of light to Wegener's continental drift hypothesis that led to plate tectonics - but there is usually a fairly obvious line between an idea that may eventually be substantiated and one that can either be disproved by evidence or dismissed via parsimony. Dare we hope that science faculties might teach their students techniques for combating an opposition that doesn't fight fair, or possibly even how to turn the opposition's own methods back on them? After all, it's time to proselytise!

Saturday 15 March 2014

Cutting remarks: investigating five famous science quotations

If hearing famous movie lines being misquoted seems annoying, then misquoted or misused science citations can be exasperating, silly or downright dangerous. To this end, I thought that I would examine five well-known science quotations to find the truth behind the soundbite. By delineating the accurate (as far as I'm aware) words in the wider context in which they were said/written down/overheard by someone down the hallway, I may be able to understand the intended meaning, and not the autopilot definition frequently used. Here goes:

1) God does not play dice (Albert Einstein)

Possibly Einstein's most famous line, it sounds like the sort of glib comment that could be used by religious fundamentalists to denigrate science in two opposing fashions: either Einstein is being facetious and therefore sacrilegious, or he supports an old-fashioned version of conventional Judeo-Christian belief in which God can be perceived in the everyday world. Talk about having your cake and eating it!

Einstein is actually supposed to have said: "It is hard to sneak a look at God's cards. But that he would choose to play dice with the world...is something that I cannot believe for a single moment." This gives us much more material to work with: it was actually a quote Einstein himself supplied to a biographer. Some years earlier he had communicated with physicist Max Born along similar lines: "Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the 'old one'. I, at any rate, am convinced that He does not throw dice."

So here is the context behind the quote: Einstein's well-known disbelief in the fundamental nature of quantum mechanics. As I've discussed in a previous post, Einstein's opinion of the most accurate scientific theory ever devised was completely out of step with the majority of his contemporaries - and physicists ever since. Of course we haven't yet got to the bottom of it; speaking as a non-scientist I find the Copenhagen Interpretation nonsense. But then, many physicists have said something along the lines of: if you think you understand quantum mechanics, you haven't understood it. Perhaps at heart Einstein was stuck in a Nineteenth Century mindset, unable to conceive of fundamental limits to our knowledge or that probability lies at the heart of reality. He spent decades looking for a deeper, more obviously comfortable, cause behind quantum mechanics. And as for his interest in the 'Old One', Einstein frequently denied believing in a Judeo-Christian deity but referred to himself as an agnostic: the existence of any presence worthy of the name 'God' being "the most difficult in the world". Now there's a quote worth repeating!

2) Science is a way of thinking much more than it is a body of knowledge (Carl Sagan)

As I've mentioned before, Bill Bryson's A Short History of Nearly Everything is chock full of the results of scientific investigation but rarely stops to consider the unique aspects that drive the scientific method, or even define the limits of that methodology. Sagan's full quote is: "Science is more than a body of knowledge. It is a way of thinking; a way of sceptically interrogating the universe with a fine understanding of human fallibility. If we are not able to ask sceptical questions, to interrogate those who tell us that something is true, to be sceptical of those in authority, then, we are up for grabs for the next charlatan (political or religious) who comes rambling along."

It is interesting because it states some aspects of science that are rarely discussed, such as its subjective rather than purely objective nature. As human beings, scientists bring emotions, selective memory and personal preferences into their work. In addition, the socio-cultural baggage we carry is hardly ever discussed until a paradigm shift occurs (or just plain, old-fashioned time has passed) and we recognise the idiosyncrasies and prejudices embedded in earlier research. Despite being subject to our frailties and the zeitgeist, these limitations, once recognised, are part of the strength of the discipline: they allow us, at least eventually, to discover their effect on what was once considered the most dispassionate branch of learning.

Sagan's repeated use of the word sceptical is also of great significance. Behind the multitude of experimental, analytical and mathematical methods in the scientific toolkit, scepticism should be the universal constant. As well as aiding the recognition of the biases mentioned above, the sceptical approach allows parsimony to take precedence over authority. It may seem a touch idealistic, especially for graduate students having to kowtow to senior faculty when seeking research positions, but open-minded young Turks are vital in overcoming the conservative old guard. Einstein's contempt for authority is well known; he made it clear by delineating unthinking respect for authority as the greatest enemy of truth. I haven't read Stephen Jay Gould's Rocks of Ages: Science and Religion in the Fullness of Life, but from what I understand of his ideas, the distinction concerning authority marks a clear boundary worthy of his Non-Overlapping Magisteria.

3) The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain an agnostic (Charles Darwin)

From the original publication of On the Origin of Species in 1859 to the present day, one of the most prominent attacks on natural selection by devoutly religious critics has been the improbability of life starting without divine intervention. If we eventually find microbial life on Mars - or larger organisms on Titan, Europa or Enceladus - this may turn the tide against such an easy target, but one thing is for certain: Darwin did not attempt to detail the origin of life itself. Although he stated in a letter to a fellow scientist: "But if (and Oh! What a big if!) we could conceive in some warm little pond, with all sorts of ammonia and phosphoric salts, lights, heat, electricity etc., present that a protein compound was chemically formed ready to undergo still more complex changes", there are no such broad assumptions in his public writings.

As it turns out, Darwin may have got some of the details correct, although the 'warm little pond' is more likely to have been a deep-sea volcanic vent. But we are still far from understanding the process by which inert chemicals started to make copies of themselves. It's been more than sixty years since Harold Urey and Stanley Miller at the University of Chicago produced amino acids simply by recreating the conditions then thought to have existed on the early Earth. Despite numerous variations on this classic experiment in subsequent decades, we are little closer to comprehending the origin of life. So it was appropriate that Darwin, who was not known for flights of fancy (he once quipped "My mind seems to have become a kind of machine for grinding general laws out of large collections of facts"), kept speculation out of his strictly evidence-based publications.

Just as Darwin has been (at times deliberately) misquoted by religious fundamentalists determined to undermine modern biology, his most vociferous disciple today, Richard Dawkins, has also been selectively quoted in order to weaken the scientific arguments. For example, printing just "The essence of life is statistical improbability on a colossal scale", as opposed to the full text from The Blind Watchmaker discussing cumulative natural selection, is a cheap literary device - one that only backfires if the reader is astute enough to investigate the original source material.

4) Anybody who has been seriously engaged in scientific work of any kind realizes that over the entrance to the gates of the temple of science are written the words: 'Ye must have faith.' (Max Planck)

Thomas Henry Huxley (A.K.A. Darwin's Bulldog) once wrote that "Science is organized common sense where many a beautiful theory was killed by an ugly fact." But that was back in the Nineteenth Century, when classical physics ruled and scientists predicted a time in the near future when they would understand all the fundamentals of the universe. In these post-modern, quantum mechanical times, uncertainty (or rather, Uncertainty) is key, and common sense goes out of the window with the likes of entanglement, etc.

Back to Planck. It seems fairly obvious that his quote tallies closely with the physics of the past century, in which highly defined speculation and advanced mathematics join forces to develop hypotheses into theories long before hard evidence can be gleaned from the experimental method. Some of the key players in quantum physics have even furthered Copernicus' preference for beautiful mathematics over observation and experiment. Consider the one-time Lucasian Professor of Mathematics Paul Dirac's partiality for the beauty of equations over experimental results, even though he considered humanity's progress in maths to be 'feeble'. The strangeness of the sub-atomic world could be seen as a vindication of these views; another of Planck's quotes is "One must be careful, when using the word, real."

Leaving aside advanced physics, there are examples in the other scientific disciplines that confirm Planck's view. In the historical sciences, you can never know the full story. For example, fossils can provide some idea of how and when a species diverged into two daughter species, but not necessarily the where and why (vis-à-vis ecological 'islands' in the wider sense). Not that this lack of precision should be taken as doubt over validity. As evolutionary biologist Stephen Jay Gould once said, a scientific fact is something "confirmed to such a degree that it would be perverse to withhold provisional assent." So what might appear to primarily apply to one segment of the scientific endeavour can be applied across all of science.

5) Space travel is utter bilge (Richard van der Riet Woolley, Astronomer Royal)

In 1956 the then-Astronomer Royal made a prediction that was thoroughly disproved five years later by Yuri Gagarin's historic Vostok 1 flight. The quote has been used ever since as an example of how blind obedience to authority is unwise. But Woolley's complete quote was considerably more ambiguous: "It's utter bilge. I don't think anybody will ever put up enough money to do such a thing...What good would it do us? If we spent the same amount of money on preparing first-class astronomical equipment we would learn much more about the universe...It is all rather rot." He went on to say: "It would cost as much as a major war just to put a man on the moon." In fact, the latter appears to be quite accurate, and despite the nostalgia now aimed at the Apollo era, the lack of any follow-up only reinforces the notion that the race to the moon was simply the ultimate example of Cold War competition. After all, only one trained geologist ever got there!

However, I'm not trying to defend the edited version of Woolley's inopportune statement since he appears to have been an armchair naysayer for several decades prior to his most famous quote. Back in 1936, his review of Rockets Through Space: The Dawn of Interplanetary Travel by the first president of the British Interplanetary Society (BIS) was even more pessimistic: "The whole procedure [of shooting rockets into space]...presents difficulties of so fundamental a nature, that we are forced to dismiss the notion as essentially impracticable, in spite of the author's insistent appeal to put aside prejudice and to recollect the supposed impossibility of heavier-than-air flight before it was actually accomplished." Again, it might appear in hindsight that Woolley deserves scorn, were it not for the fact that nearly everyone with some knowledge of space and aeronautics was of a similar opinion, and the opposition were a few 'cranks' and the like, such as BIS members.

The moral of this story is that it is far from difficult to take a partial quote, or a statement out of context, and turn a sensible, realistic attitude (for its time and place) into an easy piece of fun. A recent tweet I saw was a plaintive request to read what Richard Dawkins actually says, rather than what his opponents claim he says. In a worst-case scenario, quote-mining makes it possible to imply the very opposite of an author's intentions. Science may not be one hundred percent provable, but it's by far the best approach we have to finding out that wonderful thing we humans call 'the truth'.

Saturday 19 October 2013

School sci-tech fairs: saviours of the future?

It's frequently said that a picture is worth a thousand words, but could it be true that hands-on experiments are worth even more when it comes to engaging children in science? As the current Google / iPad / your-designation-of-choice generation is being bombarded from the egg onwards with immense amounts of audio-visual noise, how will they get the opportunity to learn that science can be both rewarding and comprehensible when textbooks seem so dull by comparison with their otherwise digitally-enhanced lives?

The infant school my daughters attend recently held a science and technology exhibition based on the curriculum studied during the last term. An associated open evening (colloquially labelled a 'Sci-tech fair') showed that parents too could delight in simple hands-on demonstrations as well as gain an appreciation of the science that their five- to eleven-year olds practice.

In addition to the experiments, both the long-term projects undertaken over several months and those carried out on the night, the entries for a science-themed photographic competition gave interesting insights into the mentality of pre-teens today. All the submissions included a brief explanatory statement and ranged from reportage to self-organised experimentation. One entry that I can only assume was entirely the child's own work especially caught my eye: a photograph of their pet dog standing in front of half a dozen identically-sized sheets of paper, on each of which was a same-sized mound of the dog's favourite food. The sheets of paper were each a different colour, the question being whether the dog's choice of food was influenced by the colour it was placed upon. I say it was probably the child's own work since I assume most adults know that dogs do not see as wide a variety of colours as humans, being largely restricted to blues and yellows. But what a fantastic piece of work from a circa ten-year-old, nonetheless!
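Should any budding scientist want to take such a project one step further, a simple goodness-of-fit test would do the job; the sketch below uses entirely invented counts (and assumes the scipy library is available) purely to show the shape of the analysis.

from scipy.stats import chisquare

# Invented data: how many times the dog chose the food placed on each coloured
# sheet, over thirty trials in total.
choices = {"blue": 9, "yellow": 8, "red": 4, "green": 3, "white": 3, "black": 3}

# Null hypothesis: the dog picks a sheet at random, so every colour is equally likely.
result = chisquare(list(choices.values()))
print(f"chi-square = {result.statistic:.2f}, p-value = {result.pvalue:.3f}")
# A small p-value would suggest the colour really does influence the dog's choice.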

Apart from highlighting the enormous changes in science education - chiefly for the better, in my opinion - since my UK school days in the 1970s and 80s, the exhibition suggested that there is an innate wealth of enthusiasm at least for the practice of science, if not for the underlying theories.  If only more people could have access to such events, perhaps the notion that science largely consists of dry abstractions and higher mathematics would be dispelled. After all, if children in their first year of school can practice scientific methodology, from hypothesis via experimentation to conclusion, it can't be all that difficult, can it?

Each experiment in the sci-tech exhibition was beautifully described, following the structure of an aim or hypothesis, an experimental procedure, and then the results and conclusions; in effect, the fundamentals of the scientific method. Themes varied widely, from wave action to solar power (miniature cells being used to drive fans in scale model houses), and from animal husbandry to biological growth and decay. One of my favourite experiments involved the use of Mentos (mints, if you don't know the brand) to produce miniature geysers when added to various soft drinks. Much to the children's surprise the least favoured contender of the half dozen tried, Diet Coke, won outright, producing a rush of foam over five metres high. The reasons behind this result can be found on the Science Kids website, from which several of the term's projects were taken. The site looks to be a fantastic resource for both teachers and enthusiastic parents who want the entire family to pursue out-of-school science. I'll no doubt be exploring it in detail over the coming year...

Having dabbled in the world of commercially-available science-themed toys, the description of how to make your own volcanic eruption experiment on the Science Kids site led my daughters and me to spend a happy Sunday afternoon creating red and yellow lava flows in the garden, courtesy of some familiar ingredients such as sodium bicarbonate and citric acid. They may not have learnt the exact nature of volcanism, but they certainly understood something about creating chemical reactions.
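(For the record, the fizz is the standard acid-carbonate reaction - nothing specific to any particular kit, just textbook chemistry:

C6H8O7 (citric acid) + 3 NaHCO3 (sodium bicarbonate) → Na3C6H5O7 (sodium citrate) + 3 H2O + 3 CO2

with the escaping carbon dioxide providing the 'lava'.)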


Although these hands-on procedures are considerably more interesting than the dull-as-dishwater investigations I undertook at senior school, the idea of children's participation in experiments is nothing new. The Royal Institution in London has been holding its annual Christmas Lecture series since 1825, with audience members frequently invited to aid the speaker. Although I've never attended myself, I remember viewing some of the televised lectures, with excited children aiding and abetting in the - at times - explosive demonstrations. The lecturers over the past few decades have included some of the great names in science popularisation, from Sir David Attenborough to Richard Dawkins, Carl Sagan to Marcus du Sautoy. Anyone care to bet how long it will be before Brian Cox does a series (if he can find time in his busy media schedule, that is)?

Getting to grips with the scientific method via experimental procedures is a great start for children: it may give them the confidence to think critically and to question givens; after all, how many people - even students at top universities - still think the seasons are caused by the Earth's proximity to the Sun rather than by its axial tilt? If that's a bit of a tall order, perhaps hands-on experimenting might help children to appreciate that many scientific concepts are not divorced from everyday experience but, with a little knowledge, can be seen all around us.

Of course it's far more difficult to maintain interest in science during adolescence, but New Zealand secondary schools aren't left out, thanks to the National School Science and Technology Awards and the regional Science and Technology Fairs sponsored by the National Institute of Water and Atmospheric Research (NIWA). It's one thing to give scholarships to scientifically-gifted - or at least keen - children, but quite another to offer a wider audience the opportunities these programmes provide. All in all, it's most encouraging. I even have the sneaking suspicion that had such inspiration been available when I was at school, I might have eschewed the arts for a career in a scientific discipline - at least one with minimal complex mathematics, that is!