
Monday 25 January 2021

Ignorance is bliss: why admitting lack of knowledge could be good for science

"We just don't know" might be one of the best phrases in support of the scientific method ever written. But unfortunately it carries an inherent danger: if a STEM professional - or indeed an amateur scientist/citizen scientist - uses the term, it can be used by those wishing to disavow the subject under discussion. Even adding "- yet" to the end of it won't necessarily improve matters; we humans have an unfortunate tendency to rely on gut instinct rather than rational analysis for our world model, hence - well, just about any man-made problem you care to name, now or throughout history.

Even though trust in scientists and the real-world application of their work may have taken an upswing thanks to some rapid vaccine development during the current pandemic, there are many areas of scientifically-gleaned knowledge that are still as unpopular as ever. Incidentally, I wonder whether, if it weren't for the much stricter laws in most countries today, we would have seen far more of the quackery that arose during the 1918 Spanish flu pandemic. During that period low-tech 'cures' included gas inhalation, enemas and blood-letting, the first of these about as safe as last year's suggestion to drink bleach. I've seen very little about alternative cures, no doubt involving crystals, holy water or good old-fashioned prayer, but then I probably don't mix in those sorts of circles (and certainly don't have that type of online cookie profile). But while legislation might have prevented alternative pandemic treatments from being advertised as legitimate and effective, it hasn't helped other areas of science that suffer from widespread hostility.

Partly this is due to the concept - at least in liberal democracies - of free speech and the idea that every thesis must surely have an antithesis worthy of discussion. Spherical planets not your bag, baby? Why not join the Flat Earth Society. It's easy to be glib about this sort of thing, but there are plenty of more serious examples of anti-scientific thinking that show no sign of abating. The key element that disparate groups opposing science seem to have in common is simple: science is resisted wherever it disagrees with the world picture they learnt as a child. In most cases this can be reduced even further to just two words: religious doctrine.

This is where a humble approach to cutting-edge research comes in. Humility has rarely been a key characteristic of fictional scientists; Hollywood, for example, has often depicted (usually male) scientists as somewhere on a crude line between power-crazed megalomaniacs and naive, misguided innocents. The more sensational printed volumes and TV documentaries communicating scientific research to a popular audience likewise frequently eschew ambiguities or dead-ends in favour of a this-is-how-it-is approach. Only, quite often, that isn't how it works at all. Doubts and negative results are not only a key element of science, they are a fundamental component; only by discarding failures can the search for an answer to an hypothesis (or, if you prefer the description of the brilliant-yet-humble physicist Richard Feynman, a guess) be narrowed down.

There are plenty of examples where even the most accomplished of scientists have admitted they don't know the answer to something in their area of expertise, such as Sir Isaac Newton being unable to resolve the ultimate cause of gravity. As it was, it took over two centuries for another genius - Albert Einstein - to figure it out. Despite all the research undertaken over the past century or so, the old adage remains as true as ever: good science creates as many new questions as it answers. Key issues today that are unlikely to gain resolution in the next few years - although never say never - include the nature of dark energy (and possibly of dark, i.e. non-baryonic, matter too) and the ultimate theory behind quantum mechanics.

Of course, these questions, fascinating though they are, hold little appeal to most people; they are just too esoteric and far removed from everyday existence to be bothered about. So what areas of scientific knowledge or research do non-scientists worry about? As mentioned above, usually it is something that involves faith. This can be broken down into several factors:

  1. Disagreement with a key religious text
  2. Implication that humans lack a non-corporeal element, such as an immortal soul
  3. Removal of mankind as a central component or focal point for the universe 

These obviously relate to some areas of science - from a layman's viewpoint - far more than others. Most non-specialists, even religious fundamentalists, don't appear to have an issue with atomic theory and the periodic table. Instead, cosmology and evolutionary biology are the disciplines likely to raise their ire. Neither is in any sense complete; the number of questions still being asked is far greater than the answers so far gleaned from research. The former has yet to understand what 96% of the universe is composed of, while the latter is still piecing together the details of the origin and development of life on our planet, from primordial slime up to Donald Trump (so possibly more of a sideways move, then).

Herein lies the issue: if scientists claim they are 'certain' about the cause of a particular phenomenon or feature of reality, but further research confirms a different theory, then non-scientists can legitimately ask why the new idea should be any more final than the previous one. In addition, the word 'theory' is prone to misinterpretation, suggesting a mere idea rather than an hypothesis (guess, if you like) that has so far passed every test thrown at it, be they practical experiments, digital simulations or mathematical constructions. Bill Bryson's best-selling A Short History of Nearly Everything is an example of how science can be done a disservice by material meant to promote it, in that the book treats science as if it were an ever-expanding body of knowledge rather than as a collection of methods that are used to explore answerable questions about life, the universe, and of course, everything.

Perhaps one answer to all this would be for popular science journalism, from books written by professional scientists to short news items, to include elements related to what is not yet known. The simplistic approach that avoids the failures only serves to strengthen the opinion that experts are arrogant believers in their own personal doctrines, as inflexible and uncompromising as holy writ. 

Unfortunately, in efforts to be both concise and easy to comprehend, much science communication renders the discipline in this manner, avoiding dissension and doubt. In addition, the often wonderful - and yet to be resolved - subtleties of research are neglected. For example, the majority of specialists agree that birds are descended from theropod (mostly carnivorous) dinosaurs, and yet the primary growth axis on the forelimbs of the two groups differs. This issue has not been satisfactorily resolved, but the vast collection of evidence, from both fossils and experimentation, still supports the dinosaur origin as the most plausible reading of this particular branch of the phylogenetic tree. Further research, especially in embryology, may one day find a more complete solution.

Ultimately then, science education would probably benefit from acknowledging boundaries of uncertainty where they exist. This may help allay fears that the discipline wants to impose absolutes about everything; in most areas (the second law of thermodynamics excepted) we are still in the early stages of understanding. This doesn't mean that the Earth may be flat or only six thousand years old, but it does mean that science usually works in small steps, not giant paradigm shifts that offer the final say on an aspect of reality. After all, if scientists already knew everything about a subject, there wouldn't be any need for further research. What a boring world that would be!

Monday 23 November 2020

Self-destructive STEM: how scientists can devalue science

Following on from last month's exploration of external factors inhibiting the scientific enterprise, I thought it would be equally interesting to examine issues within the sector that can negatively influence STEM research. There is a range of factors that vary from the sublime to the ridiculous, showing that science and its practitioners are as prey to the whims of humanity as any other discipline. 

1) Conservatism

The German physicist Max Planck once said that a "new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." With peer review of submitted articles, it's theoretically possible that a new hypothesis could be prevented from seeing the light of day due to being in the wrong place at the wrong time; or more precisely, because the reviewers personally object to the ideas presented.

Another description of this view is that there are three stages before the old guard accept the theories of the young Turks, with an avant-garde idea eventually being taken as orthodoxy. One key challenge is the dislike shown by established researchers towards outsiders who promote a new hypothesis in a specialisation they have no formal training in.

A prominent example of this is the short shrift given to meteorologist Alfred Wegener when he described continental drift to the geological establishment; it took over thirty years and a plethora of evidence before plate tectonics was found to correlate with Wegener's seemingly madcap ideas. More recently, some prominent palaeontologists wrote vitriolic reviews of the geologist-led account of the Chicxulub impact as the main cause of the K-T extinction event. 

This also shows the effect impatience may have; if progress in a field is slow or seemingly negative, it may be prematurely abandoned by most if not all researchers as a dead end.

2) Putting personal preferences before evidence 

Although science is frequently sold to the public as having a purely objective attitude towards natural phenomena, disagreements at the cutting edge are common enough to become cheap ammunition for opponents of STEM research. When senior figures within a field disagree with younger colleagues, it's easy to see why there might be a catch-22 situation in which public funding is only available when there is consensus, and yet consensus can only be reached when sufficient research has placed an hypothesis on a fairly firm footing.

It is well known that Einstein wasted the last thirty or so years of his life trying to find a unified field theory without including quantum mechanics. To his tidy mind, the uncertainty principle and entanglement didn't seem to be suitable as foundation-level elements of creation, hence his famous quote usually truncated as "God doesn't play dice". In other words, just about the most important scientific theory ever didn't fit into his world picture - and yet the public's perception of Einstein during this period was that he was the world's greatest physicist.

Prominent scientists in other fields have negatively impacted their reputations late in their careers. Two well-known examples are the astronomer Fred Hoyle and microbiologist Lynn Margulis. Hoyle promoted increasingly fruity ideas as he got older, including the claim that the archaeopteryx fossil at London's Natural History Museum was a fake. Margulis, for her part, extended her genuine area of expertise - endosymbiotic theory for eukaryotic cells - to claim that her discoveries could account for an extremely wide range of biological phenomena, even the cause of AIDS. It doesn't take much to realise that if two such highly esteemed scientists can publish nonsense, then uninformed sections of the public might want to question the validity of a much wider variety of established scientific truths.

3) Cronyism and the academic establishment

While nepotism might not appear often in the annals of science history, there have still been plenty of instances in which favoured individuals gain a position at the expense of others. This is of course a phenomenon as old as natural philosophy, although thankfully the rigid social hierarchy that affected the careers of nineteenth century luminaries such as physicist Michael Faraday and dinosaur pioneer Gideon Mantell is no longer much of an issue. 

Today, competition for a limited number of places in university research faculties can lead to results as unfair as in any humanities department. A congenial personality and an ability to self-publicise may tip the balance on gaining tenure as a faculty junior; scientists with poor interpersonal skills can fare badly. As a result, their reputation can be denigrated even after their death, as happened with DNA pioneer Rosalind Franklin in James Watson's memoirs.

As opponents of string theory are keen to point out, graduates are often forced to get on bandwagons in order to gain vital grants or academic tenure. This suggests that playing safe by studying contemporary 'hot' areas of research is preferred to investigating a wider range of new ones. Nobel Laureate and former Stephen Hawking collaborator Roger Penrose describes this as being particularly common in theoretical physics, whereby the new kids on the block have to join the entourage of an establishment figure rather than strike out with their own ideas.

Even once a graduate student has gained a research grant, it doesn't mean that their work will be fairly recognised. Perhaps the most infamous example of this occurred with the 1974 Nobel Prize in Physics. One of the two recipients was Antony Hewish, who gained the prize for his "decisive role in the discovery of pulsars". Yet it was his student Jocelyn Bell who championed the signals as a genuine astronomical phenomenon while Hewish was still dismissing them as man-made interference.

4) Jealousy and competitiveness

Although being personable and a team player can be important, anyone deemed to be too keen on self-aggrandising may attract the contempt of the scientific establishment. Carl Sagan was perhaps the most prominent science communicator of his generation but was blackballed from the US National Academy of Sciences due to being seen as too popular! This is despite some serious planetary astronomy in his earlier career, including work on various Jet Propulsion Laboratory probes. 

Thankfully, attitudes towards sci-comm have started to improve. The Royal Society has advocated the notion that prominent scientists should become involved in promoting their field, since public engagement has commonly been judged by STEM practitioners as the remit of those at the lower end of scientific ability. Even so, there remains the perception that those engaged in communicating science to the general public are not proficient enough for a career in research. Conversely, research scientists should be able to concentrate on their work rather than having to spend large amounts of their time seeking grants or undertaking administration - but such ideals are not likely to come to pass in the near future!

5) Frauds, hoaxes and general misdemeanours 

Scientists are as human as everyone else and given the temptation have been known to resort to underhand behaviour in order to obtain positions, grants and renown. Such behaviour has been occurring since the Enlightenment and varies from deliberate use of selective evidence through to full-blown fraud that has major repercussions for a field of research. 

One well-known example is the Piltdown Man hoax, which wasn't uncovered for forty years. This was due more to the material fitting in with contemporary social attitudes than to the quality - or lack thereof - of the finds. However, other than generating public attention on how easily scientists can be fooled, it didn't damage science in the long run.

A far more insidious instance is that of Cyril Burt's research into the heritability of intelligence. After his death, others tried to track down Burt's assistants, only to find they didn't exist. This of course cast serious doubt on the reliability of both his data and his conclusions; even worse, his work was used by several governments in the late twentieth century as the basis for social engineering.

Scandals are not unknown in recent years, providing ammunition for those wanting to deny recognition of fundamental scientific theories (rarely the practical applications). In this age of social media, it can take only one person's mistake - deliberate or otherwise - to set in motion a global campaign that rejects the findings of science, regardless of the evidence in its favour. As the anti-vaccination lobby has proven, science communication still has a long way to go if we are to combine the best of both worlds: a healthy scepticism with an acceptance of how the weird and wonderful universe really works, and not how we would like it to.

Tuesday 27 October 2020

Bursting the bubble: how outside influences affect scientific research

In these dark times, when some moron (sorry, non-believer in scientific evidence) can easily reach large numbers of people on social media with their conspiracy theories and pseudoscientific nonsense, I thought it would be an apt moment to look at the sort of issues that block the initiation, development and acceptance of new scientific ideas. We are all aware of the long-term feud between some religions and science but aside from that, what else can influence or inhibit both theoretical and applied scientific research?

There are plenty of other factors, from simple national pride to the ideologies of the far left and right, that have prohibited theories considered inappropriate. Even some of the greatest twentieth-century scientists faced persecution; Einstein was one of the many whose papers were destroyed by the Nazis simply for falling under the banner of 'Jewish science'. At least this particular form of state-selective science was relatively short-lived: in the Soviet Union, theories deemed counter to dialectical materialism were banned for many decades. A classic example of this was Stalin's promotion of the crackpot biologist Trofim Lysenko - who denied the modern evolutionary synthesis - and whose scientific opponents were ruthlessly persecuted.

Even in countries with freedom of speech, if there is a general perception - no matter how unfounded - that a particular area of research has negative connotations, then public funding may suffer accordingly. After the seemingly high-profile adulation of STEM in the 1950s and 1960s (ironic, considering the threat of nuclear war), subsequent decades have seen a decreasing trust in both science and its practitioners. For example, the Ig Nobel awards have for almost thirty years been a high-profile way of publicising scientific projects deemed frivolous or a waste of resources. A similar attitude is frequently heard in arts graduate-led mainstream media; earlier this month, a BBC radio topical news comedy complimented a science venture that was seen as "doing something useful for once."

Of course, this attitude is commonly related to how research is funded, the primary question being why large amounts of resources should go to keeping STEM professionals employed if their work fails to generate anything of immediate use. I've previously discussed this contentious issue, and despite the successes of the Large Hadron Collider and the Laser Interferometer Gravitational-Wave Observatory, there are valid arguments in favour of such projects being postponed until our species has dealt with fundamental issues such as climate change mitigation.

There are plenty of far less grandiose projects that could benefit from even a few percent of the resources given to the international, mega-budget collaborations that gain the majority of headlines. Counter to the 'good science but wrong time' argument is the serendipitous nature of research; many unforeseen inventions and discoveries have been made by chance, with few predictions hitting the mark.

The celebrity-fixated media tends to skew the public's perception of scientists, representing them more often as solitary geniuses rather than team players. This has led to oversimplified distortions, such as that inflicted on Stephen Hawking for the last few decades of his life. Hawking was treated as a wise oracle on all sorts of science- and future-related questions, some far from his field of expertise. This does neither the individuals involved nor the scientific enterprise any favours. It makes it appear as if a lone mastermind can pull rabbits out of a hat, rather than hardworking groups spending years on slow, methodical and - let's face it - to the outsider's eye somewhat dull research.

The old-school caricature of the wild-haired, lab-coated boffin is thankfully no longer in evidence, but there are still plenty of popular misconceptions that even dedicated STEM media channels don't appear to have removed. For example, almost everyone I meet fails to differentiate between the science of palaeontology and the non-science of archaeology, the former of course usually being solely associated with dinosaurs. If I had to condense the popular media approach to science, it might be something along these lines:

  • Physics (including astronomy). Big budget and difficult to understand, but sometimes exciting and inspiring
  • Chemistry. Dull but necessary, focusing on improving products from food to pharmaceuticals
  • Biology (usually excluding conventional medicine). Possibly dangerous, both to human ego and our ethical and moral compass (involve religion at this point if you want to) due to both working theories (e.g. natural selection) and practical applications, such as stem cell research. 

Talking of applied science, a more insidious form of pressure has sometimes been used by industry, either to keep consumers purchasing their products or to prevent them moving to rival brands. Various patents, such as for longer-lasting products, have been snapped up and hidden by companies protecting their interests, while the treatment meted out to scientific whistle-blowers has been legendary. Prominent examples range from Rachel Carson's exposé of DDT, which led to attacks on her credibility, to industry lobbying of governments to prevent the banning of CFCs after they were found to be destroying the ozone layer.

When the might of commerce is combined with wishful thinking by the scientist involved, it can lead to dreadful consequences. Despite a gathering body of evidence for smoking-related illnesses, the geneticist and tobacco industry consultant Ronald Fisher - himself a keen pipe smoker - argued for a more complex relationship between smoking and lung disease. The industry used his prominence to obscure the truth, no doubt shortening the lives of immense numbers of smokers.

If there's a moral to all this, it is that even at a purely theoretical level science cannot be isolated from all manner of activities and concerns. Next month I'll investigate negative factors within science itself that have had deleterious effects on this uniquely human sphere of accomplishment.

Tuesday 17 March 2020

Printing ourselves into a corner? Mankind and additive manufacturing

One technology that has seemingly come out of nowhere in recent years is the 3D printer. More correctly called additive manufacturing, it has taken only a few years to go from early industrial models to a thriving consumer market - unlike, say, the long gestation period between the invention of the video cassette recorder and the availability of affordable domestic models.

Some years ago I mentioned the similarities between the iPad and Star Trek: The Next Generation's PADD, with only a few decades separating the real-world item from its science fiction equivalent. Today's 3D printers are not so much a primitive precursor of the USS Enterprise-D's replicator as a paradigm shift away in terms of their profound limitations. And yet they still have capabilities that would have seemed incredibly futuristic when I was a child. As an aside, devices such as 3D printers and tablets show just how flexible and adaptable we humans are. Although my generation would have considered them pure sci-fi, today's children regularly use them in schools and even at home, and regard the pocket calculators and digital watches of my childhood in the same way as I looked at steam engines.

But whilst the technology can't yet produce an instant cup of Earl Grey tea, additive manufacturing tools are now being tested to create organic, even biological, components. Bioprinting promises custom-made organs and replacement tissue in the next few decades, meaning that organ rejection and immune system suppression could become things of the past. Other naturally-occurring substances such as ice crystals are also being replicated, in this case for realistic testing of how aircraft wings can be designed to minimise problems caused by ice. All in all, the technology seems to find a home in practically every sector of our society and our lives.

Even our remotest of outposts such as the International Space Station are benefiting from the use of additive manufacturing in cutting-edge research as well as the more humdrum role of creating replacement parts - saving the great expense of having to ship components into space. I wouldn't be surprised if polar and underwater research bases are also planning to use 3D printers for these purposes, as well as for fabricating structures in hostile environments. The European Space Agency has even been looking into how to construct a lunar base using 3D printing, with tests involving Italian volcanic rock as a substitute for lunar regolith.

However, even such promising, paradigm-shifting technologies as additive manufacturing can have their negative aspects. In this particular case there are some obvious examples, such as home-printed handguns (originally with very short lifespans, but with the development of 3D-printed projectiles instead of conventional ammunition, that is changing). There are also subtler but more profound issues that arise from the technology, including how reliance on these systems can lead to over-confidence and the loss of ingenuity. It's easy to see the hubris behind such monumental disasters as the sinking of the Titanic, but the dangers of a potentially ubiquitous 3D printing technology are more elusive.

During the Apollo 13 mission in 1970, astronauts and engineers on the ground developed a way to connect the CSM's lithium hydroxide canisters to the LM's air scrubbers, literally a case of fitting a square peg into a round hole. If today's equivalents had to rely solely on a 3D printer - with its power consumption making it a less than viable option - they could very well be stuck. Might reliance on a virtual catalogue of components that can be manufactured at the push of a button sap the creativity vital to the next generation of space explorers?

I know young people who don't have some of the skills that my generation deemed fairly essential, such as map reading and basic arithmetic. But deeper than this, creative thinking is as important as analytical rigour and mathematics to the STEM disciplines. Great physicists such as Einstein and Richard Feynman stated that many new ideas in science come from daydreaming and guesswork, not from sticking to robot-like algorithmic processes. Could it be that by using unintelligent machines in so many aspects of our lives we are starting to think more like them, not vice versa?

I've previously touched on how consumerism may be decreasing our intelligence in general, but in this case might such wonder devices as 3D printers be turning us into drones, reducing our ability to problem-solve in a crisis? Yes, they are a brave new world - and bioprinting may prove to be a revolution in medicine - but we need to maintain good, old-fashioned ingenuity; what we in New Zealand call the 'Number 8 wire mentality'. Otherwise, our species risks falling into the trap that there is a wonder device for every occasion - when in actual fact the most sophisticated object in the known universe rests firmly inside our heads.

Thursday 19 December 2019

Our family and other animals: do we deliberately downplay other species' intelligence?

I recently heard about a project investigating canine intelligence, the results being that man's best friend can distinguish similar-sounding words, even if spoken by strangers. Yet again, it appears there is less and less that makes our species unique: from the problem-solving skills of birds to social insects' use of farming techniques, we find ourselves part of a continuum of life rather than standing alone at the apex.

Reading the Swedish philosopher Nick Bostrom's thought-provoking book Superintelligence, I was struck by his description of the variation in human intellect (from, as he put it, Einstein to the village idiot) as being startlingly narrow when compared to the potential range of possible intelligences, both biological and artificial.

The complexity of animal brains has been analysed by both quantitative and qualitative methods, the former dealing with such measurements as the number of neurons while the latter looks at the behaviour of members of a species, both in the wild and under laboratory conditions. However, a comparison of the two doesn't necessarily provide any neat correlation.

For example, although mammals are often - and totally incorrectly - described as the pinnacle of creation due to their complex behaviour and birth-to-adult learning curve, the quantitative differences in neural architecture within mammals are far greater than those between amphibians and some mammalian families. In addition, there are many birds, mostly in the Psittacidae (parrot) and Corvidae (crow) families, that are both quantitatively and qualitatively superior to most mammals, with the exception of some primates.

I think it was the essays of evolutionary biologist Stephen Jay Gould that introduced me to the concept of EQ, or encephalisation quotient, a measure of how large a species' brain is relative to that expected for its body mass. On these terms the human brain is far larger than that of nearly any other species with a similarly sized body, the exception (perhaps not surprisingly) being dolphins.
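As a rough illustration, here is a minimal sketch of how EQ is commonly calculated, assuming Jerison's formulation in which the expected brain mass of a mammal scales as roughly 0.12 times body mass to the power of two-thirds (masses in grams). The figures below are approximate textbook values, used purely to show the shape of the calculation.

```python
# Encephalisation quotient (EQ) sketch using Jerison's formulation:
# EQ = actual brain mass / expected brain mass, where the expected value
# for a mammal of body mass P (grams) is approximately 0.12 * P ** (2/3).
# Brain and body masses below are rough textbook figures, for illustration only.

def eq(brain_g: float, body_g: float) -> float:
    """Encephalisation quotient relative to the mammalian average."""
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

animals = {
    "human": (1350, 65_000),
    "bottlenose dolphin": (1600, 200_000),
    "chimpanzee": (400, 50_000),
    "gorilla": (500, 130_000),
}

for name, (brain, body) in animals.items():
    print(f"{name}: EQ of about {eq(brain, body):.1f}")
```

On these rough numbers humans come out at around seven times the mammalian average and dolphins at around four, in line with the general trend described above.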

However, it's difficult to draw accurate conclusions just from examination of this general trend: both the absolute size of the brain and neuron density play a fundamental role in cognitive powers. For example, gorillas have a lower EQ than some monkeys, but being large apes they have a far greater absolute brain mass. It could be said, then, that beyond a certain mass the absolute brain size renders the EQ scale of little use. A 2009 study found that different rules for scaling come into play, with humans also making highly optimal use of the volume available within the cranium, in addition to the economical architecture common among primates.

As historian and philosopher Yuval Noah Harari has pointed out, the development of farming, at least in Eurasia, went hand in hand with the evolution of sophisticated religious beliefs. This led to a change in human attitudes towards the other animals, with a downplay of the latter's emotional needs and their categorisation as inferior, vassal species in a pre-ordained (read: divinely-given) chain of being.

By directly connecting intelligence - or a lack thereof - to empathy and emotions, it is easy to claim that domesticated animal species don't mind their ruthless treatment. It isn't just industrial agriculture that makes the most of this lack of empathy today; I've seen small sharks kept in a Far Eastern jewellery store (i.e. as decoration, not as future food) in tanks barely longer than the creature's own body length.

Although the problem-solving antics of birds such as crows are starting to redress this, most people still consider animal intelligence strictly ordered by vertebrate classes, which leads to such inaccuracies as the 'three second goldfish memory'. I first noticed how incorrect this was when keeping freshwater invertebrates, namely shield shrimp A.K.A. triops, almost a decade ago. Even these tiny creatures appear to have a range of personalities, or perhaps I should say - in an effort to avoid blatant anthropomorphizing - a wide variety of behaviour.

Now on the verge of setting up a tropical aquarium for one of my children, I've been researching what is required to keep fish in fairly small tanks. I've spoken to various aquarium store owners and consulted numerous online resources, learning in the process that the tank environment needs to fulfil certain criteria. There's nothing unusual in this, you might think, except that the psychological requirements need to be considered alongside the physical ones.

For example, tank keepers use words such as 'unhappy' and 'depression' to describe what happens when schooling fish are kept in too small a group, active swimmers in too little space, and timid species housed in an aquarium without hiding places. We do not consider this fish infraclass - the teleosts - to be the Einsteins (there's that label again) of the animal kingdom, but it would appear we just haven't been observing them with enough rigour. They may have minute brains, but there is a complexity that suggests a certain level of emotional intelligence in response to their environment.

So where does all this leave us Homo sapiens, masters of all we survey? Neanderthal research is increasingly espousing the notion that in many ways these extinct cousins/partial ancestors could give us modern humans a run for our money. Perhaps our success is down to one particular component of uniqueness, namely our story-telling ability, a product of our vivid imagination.

Simply because other species lack this skill doesn't mean that they don't have any form of intellectual ability; they may indeed have a far richer sense of their universe than we would like to believe. If our greatest gift is our intelligence, don't we owe it to all other creatures we raise and hold captive to make their lives as pleasant as possible? Whether it's battery farming or keeping goldfish in a bowl, there's plenty we could do to improve things if we consider just what might be going on in the heads of our companion critters.

Monday 13 May 2019

Which side are you on? The mysterious world of brain lateralisation

There are many linguistic examples of ancient superstitions still lurking in open sight. Among the more familiar are sinister and dexterous, which are directly related to being left- and right-handed respectively. These words are so commonplace that we rarely consider the pre-scientific thinking behind them. I was therefore interested last year to find out that I am what is known as 'anomalous dominant'. Sounds ominous!

The discovery occurred during my first archery lesson where - on conducting the Miles test for ocular dominance - I discovered that despite being right-handed, I am left-eye dominant. I'd not heard of cross-dominance before, so I decided to do some research. As Auckland Central City Library didn't have any books on the subject I had to resort to the Web, only to find plenty of contradictory information, often of dubious accuracy, with some sites clearly existing so as to sell their strategies for overcoming issues related to the condition.

Being cross-dominant essentially means it takes longer for sensory information to be converted into physical activity, since the dominant senses and limbs must rely on additional signalling between the hemispheres of the brain. One common claim is that the extra time this requires has an effect on coordination and thus affects sporting ability. I'm quite prepared to accept that idea as I've never been any good at sport, although I must admit I got used to shooting a bow left-handed much quicker than I thought; lack of strength on my left side proved to be a more serious issue than lack of coordination due to muscle memory.

Incidentally, when I did archery at school in the 1980s, no mention was ever made about testing for eye dominance and so I shot right-handed! I did try right-handed shooting last year, only to find that I was having to aim beyond the right edge of the sight in order to make up for the parallax error caused by alignment of the non-dominant eye.

Research over the past century suggests children with crossed lateralisation could suffer a reduction in academic achievement or even general intelligence as a direct result, although a 2017 meta-analysis found little firm evidence to support this. Archery websites tend to claim that the percentage of people with mixed eye-hand dominance is around 18%, but other sources I have found vary anywhere from 10% to 35%. This lack of agreement over so fundamental a statistic suggests that there is still much research to be done on the subject, since anecdotal evidence is presumably being disseminated due to lack of hard data.

There is another type of brain lateralisation which is colloquially deemed ambidextrous, but this term covers a wide range of mixed-handedness abilities. Despite the descriptions of ambidextrous people as lucky or gifted (frequently-named examples include Leonardo da Vinci, Beethoven, Gandhi and Albert Einstein) parenting forums describe serious issues as a result of a non-dominant brain hemisphere. Potential problems include dyspraxia and dyslexia, ADHD, even autism or schizophrenia.

While the reporting of individual families can't be considered of the same quality as professional research, a 2010 report by Imperial College London broadly aligns with parents' stories. 'Functional disconnection syndrome' has been linked to learning disabilities and slower physical reaction times, rooted in the communications between the brain's hemispheres. There also seems to be evidence for the opposite phenomenon, in which the lack of a dominant hemisphere causes too much communication between left and right sides, generating noise that impedes normal mental processes.

What I would like to know is why so little information is publicly available. I can only conclude that this is why there is such a profusion of non-scientific (if frequently first-hand) evidence. I personally know of people with non-dominant lateralisation who have suffered from a wide range of problems, from dyslexia to ADHD, yet they have told me that their general practitioners failed to identify root causes for many years and suggested conventional solutions such as anti-depressants.

Clearly this is an area that could do with much further investigation; after all, if ambidexterity is a marker for abnormal brain development that arose in utero (there is some evidence that a difficult pregnancy could be the root cause), then surely there is a clearly defined pathway for wide-scale research? This could in turn lead to a reduction in the number of people born with these problems.

In the same way that a child's environment can have a profound effect on their mental well-being and behaviour, could support for at-risk pregnant women reduce the chance of their offspring suffering from these conditions? I would have thought there would be a lot to gain from this, yet I can't find evidence of any medical research seeking a solution. Meanwhile, why not try the Miles test yourself and find out where you stand when it comes to connectivity between your brain, senses and limbs?

Friday 21 December 2018

The Twelve (Scientific) Days Of Christmas

As Christmas approaches and we get over-saturated in seasonal pop songs and the occasional carol, I thought it would be appropriate to look at a science-themed variation to this venerable lyric. So without further ado, here are the twelve days of Christmas, STEM-style.

12 Phanerozoic periods

Although there is evidence that life on Earth evolved pretty much as soon as the conditions were in any way suitable, microbes had the planet to themselves for well over three billion years. Larger, complex organisms may have gained a kick-start thanks to a period of global glaciation - the controversial Snowball Earth hypothesis. Although we often hear of exoplanets being found in the Goldilocks zone, it may also take an awful lot of luck to produce a life-bearing environment. The twelve geological periods of the Phanerozoic (literally, well-displayed life) cover the past 542 million years or so and include practically every species most of us have ever heard of. Hard to believe that anyone who knows this could ever consider our species to be the purpose of creation!
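For reference, here is a minimal sketch listing the twelve periods in question, using the standard International Commission on Stratigraphy names (North American usage splits the Carboniferous into the Mississippian and Pennsylvanian).

```python
# The twelve geological periods of the Phanerozoic eon, oldest first.
phanerozoic_periods = [
    "Cambrian", "Ordovician", "Silurian", "Devonian",
    "Carboniferous", "Permian", "Triassic", "Jurassic",
    "Cretaceous", "Paleogene", "Neogene", "Quaternary",
]
assert len(phanerozoic_periods) == 12
```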

11 essential elements in humans

We often hear the phrase 'carbon-based life forms', but we humans actually contain over three times the amount of oxygen than we do of carbon. In order of abundance by mass, the eleven vital elements are oxygen, carbon, hydrogen, nitrogen, calcium, phosphorus, potassium, sulfur, sodium, chlorine and magnesium. Iron, which you might think to be present in larger quantities, is just a trace mineral; adults have a mere 3 or 4 grams. By comparison, we have about 25 grams of magnesium. In fact, iron and the other trace elements amount to less than one percent of our total body mass. Somehow, 'oxygen-based bipeds' just doesn't have the same ring to it.
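As a rough illustration of those proportions, here is a minimal sketch using approximate, rounded textbook values for the adult human body; the exact percentages vary from source to source and are shown only to make the relative abundances above concrete.

```python
# Approximate elemental composition of the adult human body by mass (per cent).
# Rounded textbook values, which differ slightly between sources.
composition = {
    "oxygen": 65.0, "carbon": 18.5, "hydrogen": 9.5, "nitrogen": 3.2,
    "calcium": 1.5, "phosphorus": 1.0, "potassium": 0.4, "sulfur": 0.3,
    "sodium": 0.2, "chlorine": 0.2, "magnesium": 0.1,
}

eleven_total = sum(composition.values())
print(f"oxygen / carbon ratio: {composition['oxygen'] / composition['carbon']:.1f}")
print(f"eleven essential elements: {eleven_total:.1f}% of body mass")
print(f"iron and all other trace elements: roughly {100 - eleven_total:.1f}%")
```

On these figures oxygen outweighs carbon by about three and a half to one, and everything outside the eleven essentials (iron included) amounts to well under one per cent.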

10 fingers and toes

The evolution of life via natural selection and genetic mutation consists of innumerable, one-off events. This is science as history, although comparative studies of fossils, DNA and anatomy are required instead of written texts and archaeology. It used to be thought that ten digits was canonical, tracing back to the earliest terrestrial vertebrates that evolved from lobe-finned fish. Then careful analysis of the earliest stegocephalians of the late Devonian period such as Acanthostega showed that their limbs terminated in six, seven or even eight digits. The evolution of five-digit limbs seems to have occurred only once, in the subsequent Carboniferous period, yet of course we take it - and the use of base ten counting - as the most obvious of things. Just imagine what you could play on a piano if you had sixteen fingers!

9 climate regions

From the poles to the equator, Earth can be broadly divided into the following climate areas: polar and tundra; boreal forest; temperate forest; Mediterranean; desert; dry grassland; tropical grassland; tropical rainforest. Mountains are the odd region out, appearing in areas at any latitude that contains the geophysical conditions suitable for their formation. Natural selection leads to the evolution of species suited to the local variations in daylight hours, weather and temperature but the labels can be deceptive; the Antarctic for example contains a vast polar desert. We are only just beginning to understand the complex feedback systems between each region and its biota at a time when species are becoming extinct almost faster than they can be catalogued. We upset the relative equilibrium at our peril.

8 major planets in our solar system

When I was a child, all astronomy books described nine known planets, along with dozens of moons and numerous asteroids. Today we know of almost four thousand planets in other solar systems, some of a similar size to Earth (and even some of these in the Goldilocks zone). However, since 2006 our solar system has been reduced to eight planets, with Pluto demoted to the status of a dwarf planet. Technically, this is because it fails one of the three criteria of major planets, in that it has not cleared its orbital neighbourhood of other bodies (it even crosses Neptune's orbit). However, as there is at least one Kuiper belt object, Eris, almost as large as Pluto, it makes sense to stick to a definition that won't see the number of planets continually rise with each generation of space telescope. This downgrading appears to have upset a lot of people, so it's probably a good idea to mention that science is as much a series of methodologies as it is a body of knowledge, with the latter being open to change when required - it's certainly not set-in-stone dogma! So as astronomer Neil deGrasse Tyson, author of the best-selling The Pluto Files: The Rise and Fall of America's Favorite Planet, put it: "Just get over it!"

7 colours of the rainbow

This is one of those everyday things that most of us never think about. Frankly, I don't know anyone who has been able to distinguish indigo from violet in a rainbow, and yet we owe this colour breakdown not to an artist but to one of the greatest physicists ever, Sir Isaac Newton. As well as fulfilling most of the criteria of the modern-day scientist, Newton was also an alchemist, numerologist, eschatologist (one of his predictions is that the world will end in 2060) and all-round occultist. Following the mystical beliefs of the Pythagoreans, Newton linked the colours of the spectrum to the notes of the Western musical scale, hence indistinguishable indigo bringing the total to seven. This is a good example of how even the best of scientists are only human.

6 mass extinction events

Episode two of the remake of Carl Sagan's Cosmos television series, presented by Neil deGrasse Tyson, was called 'Some of the Things That Molecules Do'. It explored the five mass extinction events that have taken place over the past 450 million years. Tyson also discusses what has come to be known as the Holocene extinction, the current, sixth period of mass dying. Although the loss of megafauna species around the world has been blamed on the arrival of Homo sapiens over the past 50,000 years, the rapid acceleration of species loss over the last ten millennia is shocking in the extreme. It is estimated that the current extinction rate is anywhere from a thousand to ten thousand times the background rate, resulting in the loss of up to two hundred plant or animal species every day. Considering that two-thirds of our pharmaceuticals are derived from or based on biological sources, we really are shooting ourselves in the foot. And that's without considering the advanced materials that we could develop from nature.
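As a back-of-the-envelope check of that daily figure, here is a minimal sketch; the background rate of roughly one extinction per million species per year and the total of around 8.7 million eukaryote species are commonly cited but hedged assumptions of mine, not values taken from the episode itself.

```python
# Rough check of the "up to two hundred species per day" figure.
# Assumptions (approximate, commonly cited estimates):
#   - background rate of ~1 extinction per million species per year
#   - ~8.7 million eukaryote species on Earth
#   - current rate 1,000 to 10,000 times the background rate
species_total = 8_700_000
background_per_species_per_year = 1e-6

for multiplier in (1_000, 10_000):
    per_year = species_total * background_per_species_per_year * multiplier
    print(f"x{multiplier}: roughly {per_year / 365:.0f} species lost per day")
```

On those assumptions the upper multiplier works out at a couple of hundred species a day, consistent with the figure quoted above.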

5 fundamental forces

Also known as interactions, the four confirmed forces in order from strongest to weakest are: the strong nuclear force; electromagnetism; the weak nuclear force; and gravity. One of the most surprising finds in late twentieth-century cosmology was that as the universe expands, it is being pushed apart at an ever-greater speed. The culprit has been named dark energy, but that's about where our knowledge of this possible fifth force ends. Although it appears to account for about 68% of the total energy of the known universe, the label 'dark' refers to the complete lack of understanding as to how it is generated. Perhaps the most radical suggestion is that Einstein's General Theory of Relativity is incorrect and that an overhaul of the mechanism behind gravity would remove the need for dark energy altogether. One thing is for certain: we still have a lot to learn about the wide-scale fabric of the universe.

4 DNA bases

Despite being one of the best-selling popular science books ever, Bill Bryson's A Short History of Nearly Everything manages to include a few howlers, including listing thiamine (AKA vitamin B1) as one of the four bases, instead of thymine. In addition to revealing how the bases (adenine, cytosine, guanine and thymine) pair up within the double helix, the 1953 discovery of DNA's structure also suggested the replication mechanism, in turn leading to the development of the powerful genetic editing tools in use today. The discovery itself also shows how creativity can be used in science: Watson and Crick's model-building technique proved to be a faster way of generating results than the more methodical X-ray crystallography of Rosalind Franklin and Maurice Wilkins - although it should be noted that one of Franklin's images gave her rivals a clue as to the correct structure. The discovery also shows that collaboration is often a vital component of scientific research, as opposed to the legend of the lonely genius.
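The replication point can be made concrete with a trivial sketch: because each base pairs with exactly one partner (adenine with thymine, cytosine with guanine), either strand of the helix carries all the information needed to rebuild the other.

```python
# Watson-Crick base pairing: each strand determines its partner, which is
# the insight that made the replication mechanism apparent in 1953.
# (Strand direction is ignored here for simplicity; a real complementary
# strand runs antiparallel to its template.)
PAIRS = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complementary_strand(strand: str) -> str:
    """Return the base sequence that pairs with the given strand."""
    return "".join(PAIRS[base] for base in strand)

template = "ATGCTTACG"
print(complementary_strand(template))  # TACGAATGC
```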

3 branches of science

When most people think of science, they tend to focus on the stereotypical white-coated boffin, beavering away in a laboratory filled with complex equipment. However, there are numerous branches or disciplines, covering the purely theoretical, the application of scientific theory, and everything in between. Broadly speaking, science can be divided into the formal sciences, natural sciences and social sciences, each covering a variety of categories themselves. The formal sciences include mathematics and logic and have an aspect of absolutism about them (2+2=4). The natural or 'hard' sciences are what we learn in school science classes and broadly divide into physics, chemistry and biology. These use observation and experiment to develop working theories, with maths often being a fundamental component of the disciplines. The social or 'soft' sciences speak for themselves, with sub-disciplines such as anthropology sometimes crossing over into humanities such as archaeology. So when someone tells you that all science is impossibly difficult, you know they obviously haven't considered just what constitutes science!

2 types of fundamental particles

Named after Enrico Fermi and Satyendra Nath Bose respectively, fermions and bosons are the fundamental building blocks of the universe. The former, for example quarks and electrons, are the particles of matter and obey the Pauli Exclusion Principle, meaning no two fermions can exist in the same place in the same state. The latter are the carriers of force, with photons being the best-known example. One problem with these particles and their properties, such as angular momentum or spin, is that most analogies are only vaguely appropriate. After all, we aren't used to an object that has to rotate 720 degrees in order to get back to its original state! In addition, there are many aspects of underlying reality that are far from being understood. String theory was once mooted as the great hope for unifying all the fermions and bosons, but has yet to achieve absolute success, while the 2012 discovery of the Higgs boson is only one potential advance in the search for a Grand Unified Theory of creation.
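That 720-degree oddity can be shown numerically. Here is a minimal sketch using the standard quantum-mechanical rotation operator for a spin-1/2 particle about the z-axis, exp(-i θ σz / 2), written out explicitly.

```python
# A spin-1/2 particle (a fermion) is rotated by the operator exp(-i*theta*sigma_z/2).
# A full 360-degree turn multiplies the state vector by -1; only after a further
# 360 degrees (720 in total) does it return to exactly where it started.
import numpy as np

def rotation(theta: float) -> np.ndarray:
    """Spin-1/2 rotation about the z-axis, written as an explicit diagonal matrix."""
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

state = np.array([1, 0], dtype=complex)   # spin-up along z
print(rotation(2 * np.pi) @ state)        # approximately [-1, 0]: sign flipped
print(rotation(4 * np.pi) @ state)        # approximately [ 1, 0]: back to the start
```

A full turn flips the sign of the state; only a second full turn restores it exactly, which is precisely the behaviour that has no everyday analogue.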

1 planet Earth

There is a decorative plate on my dining room wall that says "Other planets cannot be as beautiful as this one." Despite the various Earth-sized exoplanets that have been found in the Goldilocks zone of their solar systems, we have little chance in the near future of finding out whether they are inhabited as opposed to merely habitable. Although the seasonal methane on Mars hints at microbial life there, any human colonisation will be a physically and psychologically demanding ordeal. The idea that we can use Mars as a lifeboat to safeguard our species - never mind our biosphere - is little more than a pipedream. Yet we continue to exploit our home world with little consideration for the detrimental effects we are having on it. As the environmental movement says: there is no Planet B. Apart from the banning of plastic bags in some supermarkets, little else appears to have been done since my 2010 post on reduce, reuse and recycle. So why not make a New Year's resolution to help future generations? Wouldn't that be the best present for your children and your planetary home?

Wednesday 12 December 2018

New neurons: astrocytes, gene therapy and the public fear of brain modification

Ever since the first cyberpunk novels of the early 1980s - and the massive increase of public awareness in the genre thanks to Hollywood - the idea of artificially-enhanced humans has been a topic of intense discussion. Either via direct augmentation of the brain or the development of a brain-computer interface (BCI), the notion of Homo superior has been associated with a dystopian near-future that owes much to Aldous Huxley's Brave New World. After reading about current research into repairing damaged areas of the brain and spinal cord, I thought it would be good to examine this darkly-tinged area.

Back in 2009 I posted about how science fiction has to some extent been confused with science fact, which coupled with the fairly appalling quality of much mainstream media coverage of science stories, has led to public fear where none is necessary and a lack of concern where there should be heaps. When it comes to anything suggestive of enhancing the mind, many people immediately fall back on pessimistic fictional examples, from Frankenstein to Star Trek's the Borg. This use of anti-scientific material in the consideration of real-world STEM is not an optimal response, to say the least.

Rather than working to augment normal humans, real research projects on the brain are usually funded on the basis that they will generate improved medical techniques for individuals with brain or spinal cord injuries. However, a combination of the fictional tropes mentioned above and the plethora of internet-disseminated conspiracy theories, usually concerning alleged secret military projects, have caused the public to concentrate on entirely the wrong aspects.

The most recent material I have read concerning cutting-edge work on the brain covers three teams' use of astrocytes to repair damaged areas. This is an alternative to converting induced pluripotent stem cells (iPSCs) to nerve cells, which has shown promise for many other types of cell. Astrocytes are amazing things, able to connect with several million synapses. Apparently Einstein's brain had far more of them than usual in the region connected with mathematical thinking. The big question would be whether this accumulation was due to nature or nurture, the latter being the high level of exercise Einstein demanded of this region of his brain.

Astrocyte research for brain and spinal cord repair has been ongoing since the 1990s, in order to discover if they can be reprogrammed as functional replacements for lost neurons without any side effects. To this end, mice have been deliberately brain-damaged and then attempts made to repair that damage via converted astrocytes. The intention is to study if stroke victims could be cured via this method, although there are hopes that eventually it may also be a solution for Parkinson's disease, Alzheimer's and even ALS (motor neurone disease). The conversion from astrocyte to neuron is courtesy of a virus that introduces the relevant DNA, although none of the research has as yet proven that the converted cells are fully functional neurons.

Therefore, it would seem we are some decades away from claiming that genetic manipulation can cure brain-impairing diseases. But geneticists must share some of the blame for giving the public the wrong impression. The hyperbole surrounding the Human Genome Project gave both the public and medical workers a false sense of optimism regarding the outcome of the genome mapping. In the late 1990s, a pioneer gene therapist predicted that by 2020 virtually every disease would include gene therapy as part of the treatment. We are only just over a year short of that date, but most research is still in first-phase trials - and only concerns diseases that don't have a conventional cure. It turned out that the mapping was just the simplest stage of a multi-part programme to understand the complexities of which genes code for which disorders.

Meanwhile, gene expression in the form of epigenetics has inspired a large and extremely lucrative wave of pseudo-scientific quackery that belongs in the same genre as homeopathy, crystal healing and all the other New Age flim-flam that uses real scientific terminology to part the gullible from their cash. The poor standard of science education outside of schools (and in many regions, probably within them too) has led to the belief that changing your lifestyle can fix genetic defects or affect cures of serious brain-based illnesses.

Alas, although gene expression can be affected by environmental influences, we are ultimately at the mercy of what we inherited from our parents. Until the astrocyte research has been verified, or a stem cell solution found, the terrible truth is that the victims of strokes and other brain-based maladies must rely upon established medical treatments.

This isn't to say that we may in some cases be able to reduce or postpone the risk with a better lifestyle; diet and exercise (of both the body and brain) are clearly important, but they won't work miracles. We need to wait for the outcome of the current research into astrocytes and iPSCs to find out if the human brain can be repaired after devastating attacks from within or without. Somehow I doubt that Homo superior is waiting round the corner, ready to take over the world from us unenhanced humans…

Wednesday 30 May 2018

Photons vs print: the pitfalls of online science research for non-scientists


It's common knowledge that school teachers and university lecturers are tired of discovering that their students' research is often limited to one search phrase on Google or Bing. Ignoring the minimal amount of rewriting that often accompanies this shoddy behaviour - leading to some very same-y coursework - one of the most important questions to arise is how easy it is to confirm the veracity of online material compared to conventionally-published sources. This is especially important when it comes to science research, particularly when the subject matter involves new hypotheses and cutting-edge ideas.

One of the many problems with the public's attitude to science is that it is nearly always thought of as an expanding body of knowledge rather than as a toolkit to explore reality. Popular science books such as Bill Bryson's 2003 best-seller A Short History of Nearly Everything follow this convention, disseminating facts whilst failing to illuminate the methodologies behind them. If non-scientists don't understand how science works, is it any wonder that the plethora of online sources - of immensely variable quality - can cause confusion?

The use of models and the concurrent application of two seemingly conflicting theories (such as Newton's Universal Gravitation and Einstein's General Theory of Relativity) can only be understood with a grounding in how the scientific method(s) proceed. By assuming that scientific facts are largely immutable, non-scientists can become unstuck when trying to summarise research outcomes, regardless of the difficulty in understanding the technicalities. Of course this isn't true for every theory: the Second Law of Thermodynamics is unlikely to ever need updating; but as the discovery of dark energy hints, even Einstein's work on gravity might need amending in future. Humility and caution should be the bywords of hypotheses not yet verified as working theories; dogma and unthinking belief have their own place elsewhere!

In a 1997 talk Richard Dawkins stated that the methods of science are 'testability, evidential support, precision, quantifiability, consistency, intersubjectivity, repeatability, universality, and independence of cultural milieu.' The last phrase implies that the methodologies and conclusions for any piece of research should not differ from nation to nation. Of course the real world intrudes into this model and so culture, gender, politics and even religion play their part as to what is funded and how the results are presented (or even which results are reported and which obfuscated).

For those who want to stay ahead of the crowd by disseminating the most recent breakthroughs it seems obvious that web resources are far superior to most printed publications, professional journals excepted - although the latter are rarely suitable for non-specialist consumption. The expense associated with producing popular science books means that online sources are often the first port of call.

Therein lies the danger: in the rush to skim seemingly inexhaustible yet easy to find resources, non-professional researchers frequently fail to differentiate between articles written by scientists, those by journalists with science training, those by unspecialised writers, largely on general news sites, and those by biased individuals. It's usually quite easy to spot material from cranks, even within the quagmire of the World Wide Web (searching for proof that the Earth is flat will generate tens of millions of results) but online content written by intelligent people with an agenda can be more difficult to discern. Sometimes, the slick design of a website offers reassurance that the content is more authentic than it really is, the visual aspects implying an authority that is not justified.

So in the spirit of science (okay, so it's hardly comprehensive being just a single trial) I recently conducted a simple experiment. Having read an interesting hypothesis in a popular science book I borrowed from the library last year, I decided to see what Google's first few pages had to say on the same subject, namely that the Y chromosome has been shrinking over the past few hundred million years to such an extent that its days - or in this case, millennia - are numbered.

I had previously read about the role of artificial oestrogens and other disruptive chemicals in the loss of human male fertility, but the decline in the male chromosome itself was something new to me. I therefore did a little background research first. One of the earliest sources I could find for this contentious idea was a 2002 paper in the journal Nature, in which the Australian geneticist Professor Jennifer Graves described the steady shrinking of the Y chromosome in the primate order. Her extrapolation of the data, combined with the knowledge that several rodent groups have already lost their Y chromosome, suggested that the Homo sapiens equivalent has perhaps no more than ten million years left before it disappears.

2003 saw the publication of British geneticist Bryan Sykes' controversial book Adam's Curse: A Future Without Men. His prediction, based on the rate of atrophy in the human Y chromosome, was that it would only last another 125,000 years. To my mind, this eighty-fold difference in timescales suggests that in these early days of the hypothesis, very little of it could be confirmed with any degree of certainty.
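
To see where such numbers come from, here is a back-of-envelope linear extrapolation of the kind that underlies Graves' estimate. The gene counts and timescale below are rough, commonly quoted figures used purely for illustration; treat them as assumptions rather than data taken from either her paper or Sykes' book.

    # Crude linear extrapolation of Y chromosome gene loss.
    # The figures are illustrative assumptions, not data from Graves or Sykes.
    ancestral_genes = 1400     # approximate gene count on the proto-Y chromosome
    current_genes = 45         # approximate count of surviving Y genes today
    elapsed_my = 300           # assumed million years of degeneration

    loss_rate = (ancestral_genes - current_genes) / elapsed_my  # genes per million years
    remaining_my = current_genes / loss_rate                    # time until none are left

    print(f"Loss rate: ~{loss_rate:.1f} genes per million years")
    print(f"Time until the last gene goes: ~{remaining_my:.0f} million years")
    print(f"Ratio of the two published estimates: {10_000_000 / 125_000:.0f}-fold")

Of course a straight-line fit is exactly the assumption that the sceptics dispute, which is rather the point.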

Back to the experiment itself. The top results for 'Y chromosome disappearing' and similar search phrases lead to articles published between 2009 and 2018. They mostly fall into one of two categories: (1) that the Y chromosome is rapidly degenerating and that males, at least of humans and potentially all other mammal species, are possibly endangered; and (2) that although the Y chromosome has shrunk over the past few hundred million years it has been stable for the past 25 million and so is no longer deteriorating. A third, far less common category, concerns the informal polls taken of chromosomal researchers, who have been fairly evenly divided between the two opinions and thus nicknamed the "leavers" and the "remainers". Considering the wildly differing timescales mentioned above, perhaps this lack of consensus is proof of science in action; there just hasn't been firm enough evidence for either category to claim victory.

What is common to many of the results is that inflammatory terms and hyperbole are prevalent, with little of the caution you would hope to find around cutting-edge research. Article titles include 'Last Man on Earth?', 'The End of Men' and 'Sorry, Guys: Your Y Chromosome May Be Doomed', with paragraph text containing provocative phrases such as 'poorly designed' and 'the demise of men'. This approach is friendly to search engines while amalgamating socio-political concerns with the science.

You might expect that the results would show a change in trend over time, first preferring one category and then the other, but this doesn't appear to be the case. Rearranged in date order, the search results across the period 2009-2017 include both opinions running concurrently. This year, however, has seen a change, with the leading 2018 search results so far only offering support to the rapid degeneration hypothesis. The reason for this difference is readily apparent: the publication of a Danish study that bolsters support for it. This new report is available online, but is difficult for a non-specialist to digest. Therefore, most researchers such as myself would have to either rely upon second-hand summaries or, if there was enough time, wait for the next popular science book that discusses it in layman's terms.

As it is, I cannot tell from my skimming approach to the subject whether the new research is thorough enough to be completely reliable. For example, it only examined the genes of sixty-two Danish men, and I have no idea whether this is a large enough sample to be considered valid beyond doubt. However, all of the 2018 online material I read accepted the report without question, which at least suggests that after a decade and a half of vacillating between two theories, there may now be an answer. Even so, having examined the content in the "remainers" category, I wonder how the new research confirms a long-term trend rather than a short-term blip in chromosomal decline. I can't help thinking that the sort of authoritative synthesis found in the better sort of popular science books would answer these queries, such is my faith in the general superiority of print volumes!
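
Not being a geneticist, the best I can manage is a generic feel for how much statistical wobble a sample of sixty-two carries. The sketch below simply computes the worst-case 95% margin of error for estimating a simple proportion from that many people; it is emphatically not the Danish study's actual methodology, just an illustration of why small samples leave room for doubt.

    # Generic sample-size intuition: 95% margin of error for a proportion
    # estimated from 62 individuals. Not the Danish study's actual method.
    import math

    n = 62
    p = 0.5                                      # worst case, maximising uncertainty
    standard_error = math.sqrt(p * (1 - p) / n)
    margin_95 = 1.96 * standard_error

    print(f"95% margin of error: about ±{margin_95 * 100:.0f} percentage points")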

Of course books have been known to emphasise pet theories and denigrate those of opponents, but the risk of similar issues for online content is far greater. Professor Graves' work seems to dominate the "leavers" category, via her various papers subsequent to her 2002 original, but just about every reference to them is contaminated with overly emotive language. I somehow doubt that if her research was only applicable to other types of animals, say reptiles, there would be nearly so many online stories covering it, let alone the colourful phrasing that permeates this topic. The history of the Y chromosome is as extraordinary as the chromosome itself, but treating serious scientific speculation - and some limited experimental evidence - with tabloid reductionism and show business hoopla won't help when it comes to non-specialists researching the subject.

There may be an argument here for the education system to systematically teach such basics as common sense and rigour, in the hopes of giving non-scientists a better chance of detecting baloney. This of course includes the ability to accurately filter online material during research. Personally, I tend to do a lot of cross-checking before committing to something I haven't read about on paper. If even such highly-resourced and respected websites as the BBC Science News site can make howlers (how about claiming that chimpanzees are human ancestors?) why should we take any of these resources on trust? Unfortunately, the seductive ease with which information can be found on the World Wide Web does not in any way correlate with its quality. As I found out with the shrinking Y chromosome hypothesis, there are plenty of traps for the unwary.

Thursday 9 November 2017

Wonders of Creation: explaining the universe with Brian Cox and Robin Ince

As Carl Sagan once said: "if you wish to make an apple pie from scratch, you must first invent the universe." A few nights ago, I went to what its promoters bill as ‘the world's most successful and significant science show', which in just over two hours presented a delineation of the birth, history, and eventual death of the universe. In fact, it covered just about everything from primordial slime to the triumphs of the Cassini space probe, only lacking the apple pie itself.

The show in question is an evening with British physicist and presenter Professor Brian Cox. As a long-time fan of his BBC Radio show The Infinite Monkey Cage I was interested to see how the celebrity professor worked his sci-comm magic with a live audience. In addition to the good professor, his co-presenter on The Infinite Monkey Cage, the comedian Robin Ince, also appeared on stage. As such, I was intrigued to see how their combination of learned scientist and representative layman (or 'interested idiot' as he styles himself) would work in front of two thousand people.

I've previously discussed the trend for extremely expensive live shows featuring well-known scientists and (grumble-grumble) the tickets for Brian Cox were similarly priced to those for Neil deGrasse Tyson earlier this year. As usual, my friends and I went for the cheaper seats, although Auckland must have plenty of rich science fans, judging by the almost packed house (I did notice a few empty seats in the presumably most expensive front row). As with Professor Tyson, the most expensive tickets for this show included a meet and greet afterwards, at an eye-watering NZ$485!

When Cox asked if there were any scientists in the audience, there were very few cheers. I did notice several members of New Zealand's sci-comm elite, including Dr Michelle Dickinson, A.K.A. Nanogirl, who had met Ince on his previous Cosmic Shambles LIVE tour; perhaps the cost precluded many STEM professionals from attending. As I have said before, such inflated prices can easily lead to only dedicated fans attending, which is nothing less than preaching to the converted. In which case, it's more of a meet-the-celebrity event akin to a music concert than an attempt to spread the wonder - and rationality - of science.

So was I impressed? The opening music certainly generated some nostalgia for me, as it was taken from Brian Eno's soundtrack for Al Reinert's 1983 feature-length documentary on the Apollo lunar missions. Being of almost the same age as Professor Cox, I confess to having bought the album on vinyl in my teens - and I still have it! Unlike Neil deGrasse Tyson's show, the Cox-Ince evening was an almost non-stop visual feast, with one giant screen portraying a range of photographs and diagrams, even a few videos. At times, the images almost appeared to be 3D, seemingly hanging out of the screen, with shots of the Earth and various planets and moons bulging onto the darkened stage. I have to admit to being extremely impressed with the visuals, even though I had seen some of them before. Highlights included the Hubble Space Telescope's famous Ultra-Deep Field of the earliest galaxies and the montage of the cosmic microwave background taken by the WMAP probe.

The evening (okay, let's call it a cosmology lecture with comic interludes) began as per Neil deGrasse Tyson with the age and scale of the universe, then progressed through galaxy formation and a few examples of known extra-solar planets. However, the material was also bang up to date, as it included the recent discoveries of gravitational waves at LIGO and the creation of heavy elements such as gold and platinum in neutron star collisions.

[Image: the evolution of the universe - our universe, a potted history]

Professor Cox also took us through the future prospects of the solar system and the eventual heat death of the universe, generating a few "oohs" and "aahs" along the way. Interestingly, there was little explanation of dark matter and dark energy; perhaps it was deemed too speculative a topic to do it justice. Black holes had a generous amount of attention though, including Hawking radiation. Despite the audience consisting primarily of non-STEM professionals (admittedly a show of hands found a large proportion of them to be The Infinite Monkey Cage listeners), a certain level of knowledge was presupposed and there was little attempt to explain the basics. Indeed, at one point an equation popped up - and it wasn't E=mc². How refreshing!

Talking of which, there was a brief rundown of Einstein's Special and General Theories of Relativity, followed by the latter's development into the hypothesis of the expanding universe and eventual proof of the Big Bang model. Einstein's Cosmological Constant and his initial dismissal of physicist-priest Georges Lemaître's work were given as examples that even the greatest scientists sometimes make mistakes, showing that science is not a set of inviolable truths that we can never improve upon (the Second Law of Thermodynamics excluded, of course). Lemaître was also held up to be an example of how science and religion can co-exist peacefully, in this case, within the same person.

Another strand, proving that Cox is indeed deeply indebted to Carl Sagan (aren't we all?) was his potted history of life on Earth, with reference to the possibility of microbial life on Mars, Europa and Enceladus. The lack of evidence for intelligent extra-terrestrials clearly bothers Brian Cox as much as it did Sagan. However, Cox appeared to retain his scientific impartiality, suggesting that - thanks to the 3.5 billion year plus gap between the origin of life and the evolution of multi-cellular organisms - intelligent species may be extremely rare.

Despite being a fan of crewed space missions, Cox made little mention of future space travel, concentrating instead on robotic probes such as Cassini. The Large Hadron Collider also didn't feature in any meaningful way, although one of the audience questions around the danger of LHC-created black holes was put into perspective next to the natural black holes that might be produced by cosmic ray interactions with the Earth's atmosphere; the latter's energies of up to around 10^8 TeV (tera electron volts) far exceed those generated by the LHC, and we've not been compressed to infinity yet.
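
For a rough sense of the scale involved, here is the comparison in numbers, taking the figure quoted above and assuming the LHC's proton-proton collision energy of around 13 TeV at the time (an assumption for illustration only):

    # Comparing the most energetic cosmic rays with the LHC's collision energy.
    # 1e8 TeV is the figure quoted above; 13 TeV is an assumed LHC collision energy.
    cosmic_ray_tev = 1e8
    lhc_tev = 13.0

    print(f"Cosmic rays exceed the LHC by a factor of roughly {cosmic_ray_tev / lhc_tev:,.0f}")
    # Nature has been running far more violent collisions against our atmosphere
    # for billions of years, with no catastrophic black holes to show for it.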

Robin Ince's contributions were largely restricted to short if hilarious segments, but he also made a passionate plea (there's no other word for it) for the readability of Charles Darwin and his relevance today. He discussed Darwin's earthworm experiments and made short work of the American evangelicals' "no Darwin equals no Hitler" nonsense, concluding with one of his best jokes: "no Pythagoras would mean no Toblerone".

One of the friends I went with admitted to learning little that was new but as stated earlier I really went to examine the sci-comm methods being used and their effect on the audience. Cox and Ince may have covered a lot of scientific ground but they were far from neglectful of the current state of our species and our environment. Various quotes from astronauts and the use of one of the 'pale blue dot' images of a distant Earth showed the intent to follow in Carl Sagan's footsteps and present the poetic wonder of the immensity of creation and the folly of our pathetic conflicts by comparison. The Cox-Ince combination is certainly a very effective one, as any listeners to The Infinite Monkey Cage will know. Other science communicators could do far worse than to follow their brand of no-nonsense lecturing punctuated by amusing interludes. As for me, I'm wondering whether to book tickets for Richard Dawkins and Lawrence Krauss in May next year. They are slightly cheaper than both Brian Cox and Neil deGrasse Tyson. Hmmm…

Monday 10 July 2017

Genius: portraying Albert Einstein as a human being, not a Hollywood stereotype

I recently watched the National Geographic docudrama series Genius, presenting a warts-and-all look at the life and work of Albert Einstein. In these post-truth times in which even a modicum of intellectual thought is often regarded with disdain, it's interesting to see how a scientific icon is portrayed in a high-budget, high-profile series.

A few notable examples excepted, Dr Frankenstein figures still inform much of Hollywood's depiction of STEM practitioners. Inventors are frequently compartmentalised as either patriotic or megalomaniac, often with a love of military hardware; Jurassic Park's misguided and naive Dr John Hammond seemingly a rare exception. As for mathematicians, they are often depicted with more than a touch of insanity, such as in Pi or Fermat's Room.

So does Genius break the mould or follow the public perception of scientists as freaky, geeky, nerdy or plain evil? The script is a fairly sophisticated adaptation of real life events, although the science exposition suffers as a result. Despite some computer graphic sequences interwoven with the live action, the attempts to explore Einstein's thought experiments and theories are suggestive rather than comprehensive, the tip of the iceberg when it comes to his scientific legacy. Where the series succeeds is in describing the interaction of all four STEM disciplines: science, technology, engineering and mathematics; and the benefits when they overlap. The appalling attitudes prevalent in the academia of his younger years are also brought to vivid life, with such nonsense as not questioning tutors piled onto the usual misogyny and xenophobia.

[Image: Albert Einstein]

Contrary to the popular conception of the lone genius - and counter to the series' title - the role of Einstein's friends such as Marcel Grossmann and Michele Besso as his sounding boards and mathematical assistants is given a high profile. In addition, the creative aspect of science is brought to the fore in sequences that show how Einstein gained inspiration towards his special and general theories of relativity.

The moral dimension of scientific research is given prominence, from Fritz Haber's development of poison gas to Leo Szilard persuading Einstein first to encourage and later to discourage the development of atomic weapons. As much as the scientific enterprise might appear to be separate from the rest of human concern, it is deeply interwoven with society; the term 'laboratory conditions' applies to certain processes, not to a wall that isolates science from everything else. Scientists in Genius are shown to have the same human foibles as everyone else, from Einstein's serial adultery (admittedly veering towards Hollywood family drama at times, paternal guilt complex et al.) to Philipp Lenard's dismissal of Einstein's theories due to his anti-Semitism rather than any scientific evidence. So much for scientific impartiality!

The last few episodes offer a poignant description of how even the greatest of scientific minds lose impetus, passing from creative originality as young rebels to conservative, middle-aged sticks-in-the-mud, out of touch with the cutting edge. General readership books on physics often claim that theoretical physicists do their best work before they are thirty, a common example being that Einstein might as well have spent his last twenty years fishing. Although not as detailed as the portrayal of his early, formative years, Einstein's obsessive (but failed) quest to find fault with quantum mechanics is a good description of how even the finest minds can falter.

All in all, the first series of Genius is a very noble attempt to describe the inspiration and background that led to some revolutionary scientific theories. The irony is that by concentrating on Einstein as a human being it might help the wider public gain a better appreciation, if not comprehensive understanding, of the work of scientists and the role of STEM in society. Surely that's no bad thing, especially if it makes Hollywood rethink the lazy stereotype of the crazy-haired scientist seeking world domination. Or even encourages people to listen to trained experts rather than the rants of politicians and religious nutbars. Hardly a difficult choice, is it?

Friday 26 August 2016

The benefit of hindsight: the truth behind several infamous science quotes

With utmost apologies to Jane Austen fans, it is a truth universally acknowledged that most people misinterpret science as an ever-expanding corpus of knowledge rather than as a collection of methods for investigating natural phenomena. A simplistic view for those who adhere to the former misapprehension might include questioning science as a whole when high-profile practitioners make an authoritative statement that is proven - in a scientific sense - to be incorrect.

Amongst the more obvious examples of this are the numerous citations from prominent STEM (Science, Technology, Engineering and Mathematics) professionals that are inaccurate to such an extreme as to appear farcical in light of later evidence. I have already discussed the rather vague art of scientific prognostication in several connected posts but now want to directly examine several quotations concerning applied science. Whereas many quotes probably deserve the contempt in which they are popularly held, I believe the following require careful reading and a knowledge of their context before any meaningful judgement can be attempted.

Unlike Hollywood, STEM subjects are frequently too complex for simple black versus white analysis. Of course there have been rather risible opinions espoused by senior scientists, many of which - luckily - remain largely unknown to the wider public. The British cosmologist and astronomer Sir Fred Hoyle has a large number of these just to himself, from continued support for the Steady State theory long after the detection of cosmic microwave background radiation, to the even less defensible claims that the Natural History Museum's archaeopteryx fossil is a fake and that flu germs are really alien microbes!

Anyhow, here's the first quote:

1) Something is seriously wrong with space travel.

Richard van der Riet Woolley was the British Astronomer Royal at the dawn of the Space Age. His most infamous quote is the archetypal instance of Arthur C. Clarke's First Law:  "When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong."

Although a prominent astronomer, van der Riet Woolley had little knowledge of the practical mechanics that would be required for spaceflight. By the mid-1930s the British Interplanetary Society had developed detailed (although largely paper-only) studies into a crewed lunar landing mission. In 1936 van der Riet Woolley publicly criticised such work, stating that the development of even an unmanned rocket would present fundamental technical difficulties. Bear in mind that this was only six years before the first V2 rocket, which was capable of reaching an altitude of just over 200km!

In 1956, only one year before Sputnik 1 - and thirteen years prior to Apollo 11 - the astronomer went on to claim that near-future space travel was unlikely and a manned lunar landing "utter bilge, really". Of course this has been used as ammunition against him ever since, but the quote deserves some investigation. In it, van der Riet Woolley reveals that his primary objection appears to have changed (presumably post-V2 and its successors) from an engineering problem to an economic one, stating that it would cost as much as a "major war" to land on the moon.

This substantially changes the flavour of his quote, since it is after all reasonably accurate. In 2010 dollars, Project Apollo had an estimated budget of about US$109 billion - incidentally about 11% of the cost of the contemporary Vietnam War. In addition, we should bear in mind that a significant amount of the contractors' work on the project is said to have consisted of unpaid overtime. Is it perhaps time to reappraise the stargazer as an economic realist rather than a reactionary curmudgeon?
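
A quick sanity check on that comparison, using the figures as quoted above (both in 2010 dollars):

    # If Apollo's ~$109 billion was roughly 11% of the Vietnam War's cost,
    # the implied cost of the war is close to a trillion 2010 dollars.
    apollo_cost_bn = 109
    apollo_share = 0.11

    implied_war_cost_bn = apollo_cost_bn / apollo_share
    print(f"Implied Vietnam War cost: ~${implied_war_cost_bn:,.0f} billion")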

Indeed, had Apollo been initiated in a subsequent decade, there is reasonable evidence to suggest it would have failed to leave the ground, so to speak. The uncertainty of the post-Vietnam and Watergate period, followed by the collapse of the Soviet Union, suggest America's loss of faith in technocracy would have effectively cut Apollo off in its prime. After all, another colossal American science and engineering project, the $12 billion particle accelerator the Superconducting Super Collider, was cancelled in 1993 after being deemed unaffordable. Yet up to that point only about one-sixth of its estimated budget had been spent.

In addition, van der Riet Woolley was not alone among STEM professionals: for three decades from the mid-1920s the inventor of the vacuum tube Lee De Forest is said to have claimed that space travel was impractical. Clearly, the Astronomer Royal was not an isolated voice in the wilderness but part of a large consensus opposed to the dreamers in the British Interplanetary Society and their ilk. Perhaps we should allow him his pragmatism, even if it appears a polar opposite to one of Einstein's great aphorisms: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science."

Talking of whom…

2) Letting the genie out of the bottle.

In late 1934 an American newspaper carried this quotation from Albert Einstein: "There is not the slightest indication that (nuclear energy) will ever be obtainable. It would mean that the atom would have to be shattered at will." This seems rather amusing, considering the development of the first self-sustaining nuclear chain reaction only eight years later. But Einstein was first and foremost a theorist, a master of the thought experiment, his father's work in electrical engineering not being noticeably sustained in his son. There is obviously a vast difference between imagining riding a beam of light and the practical difficulties of assembling brand new technologies with little in the way of precedent. So why did Einstein make such a definitive prediction?

I think it may also have been wishful thinking on Einstein's part; as a pacifist he would have dreaded the development of a new super weapon. As the formulator of the equivalence between mass and energy, he could have felt in some way responsible for initiating the avalanche that eventually led to Hiroshima and Nagasaki. Yet there is no clear path between E=mc² and a man-made chain reaction; it took a team of brilliant experimental physicists and engineers in addition to theorists to achieve a practical solution, via the immense budget of $26 billion (in 2016 dollars).
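
As a reminder of why mass-energy equivalence looked so tantalising yet so remote, here is the textbook calculation for a single gram of matter converted entirely to energy; the constants are standard physics, not anything drawn from Einstein's own writing.

    # E = m * c^2 for one gram of matter fully converted to energy.
    c = 2.998e8                    # speed of light in metres per second
    m = 1e-3                       # one gram, expressed in kilograms

    energy_joules = m * c ** 2
    print(f"Energy released: ~{energy_joules:.1e} joules")                    # ~9.0e13 J
    print(f"Roughly {energy_joules / 4.184e12:.0f} kilotons of TNT equivalent")

The gulf between writing that down and engineering a controlled release of that energy is, of course, exactly the point.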

It is hardly as if the good professor was alone in his views either, as senior officials also doubted the ability to harness atomic fission for power or weaponry. In 1945 when the Manhattan Project was nearing culmination, the highest-ranking member of the American military, Fleet Admiral William Leahy, apparently informed President Truman that the atomic bomb wouldn't work. Perhaps this isn't as obtuse as it sounds, since due to the level of security only a very small percentage of the personnel working on the project knew any of the details.

Leahy clearly knew exactly what the intended outcome was, but even as "an expert in explosives" had no understanding of the complexity of engineering involved. An interesting associated fact is that despite being a military man, the Admiral considered the atomic bomb unethical for its obvious potential as an indiscriminate killer of civilians. Weapons of mass destruction lack any of the valour or bravado of traditional 'heroic' warfare.  Is it possible that this martial leader wanted the bomb to fail for moral reasons, a case of heart over mind? In which case, is this a rare example in which the pacifism of the most well-known scientist was in total agreement with a military figurehead?

Another potential cause is the paradigm shift that harnessing the power of the atom required. In the decade prior to the Manhattan Project, New Zealand physicist Ernest Rutherford had referred to the possibility of man-made atomic energy as "moonshine", whilst another Nobel laureate, American physicist Robert Millikan, had expressed similar sentiments in the 1920s. And this from men who were pioneers in understanding the structure of the atom!

As science communicator James Burke vividly described in his 1985 television series The Day the Universe Changed, major scientific developments often require substantial reappraisals in outlook, seeing beyond what is taken for granted. The cutting edge of physics is often described as being ruled by theorists in their twenties; eager young turks who are more prepared to ignore precedents. When he became a pillar of the establishment, Einstein ruefully commented: "To punish me for my contempt for authority, fate made me an authority myself."

Perhaps then, such fundamental shifts in technology as the development of space travel and nuclear fission require equally revolutionary changes in mindset, and we shouldn't judge the authors of our example quotes too harshly. Then again, if you are an optimist, Clarke's First Law might seem applicable here, in which case authority figures with some knowledge of the subject in hand should take note of the ingenuity of our species before pronouncing. If there is a moral to this story, it is that - apart from the speed of light in a vacuum and the Second Law of Thermodynamics - you should never say never...

Thursday 28 May 2015

Presenting the universe: 3 landmark science documentary series

They say you carry tastes from your formative years with you for the rest of your life, so perhaps this explains why there are three science documentary television series that still have the power to enchant some decades after first viewing. Whilst there has been no shortage of good television science programming since - Planet Earth and the Walking with... series amongst them - there are three that remain the standard by which I judge all others:
  1. The Ascent of Man (1972) - an account of how humanity has evolved culturally and technologically via biological and man-made tools. Presented by mathematician and renaissance man Jacob Bronowski.
  2. Cosmos (1980) - the history of astronomy and planetary exploration, interwoven with the origins of life. Presented by Carl Sagan (as if you didn't know).
  3. The Day the Universe Changed (1985) - a study of how scientific and technological breakthroughs in Western society generate paradigm shifts. Presented by the historian of science James Burke.

All three series have been proclaimed 'landmark' shows so it is interesting to compare their themes, viewpoints and production techniques, discovering just how similar they are in many ways. For a start, their excellent production values allowed for a wide range of international locations and historical recreations. They each have a charismatic presenter who admits to espousing a personal viewpoint, although it's quite easy to note that they get progressively more casual: if Jacob Bronowski has the appearance of a warm elder statesman then Carl Sagan is the father figure for a subsequent generation of scientists; James Burke's on-screen persona is more akin to the cheeky uncle, with a regular supply of puns, some good, some less so.

To some extent it is easy to see that the earliest series begat the second that in turn influenced the third. In fact, there is a direct link in that Carl Sagan hired several of the producers from The Ascent of Man for his own series, clearly seeing the earlier show as a template for Cosmos. What all three have is something extremely rare in other science documentaries: a passion for the arts that promotes a holistic interpretation of humanity's development; science does not exist in isolation. As such, the programmes are supported by superbly-illustrated tie-in books that extend the broadcast material from the latter two series whilst Bronowski's book is primarily a transcript of his semi-improvised monologue.

In addition to considering some of the standard examples of key developments in Western civilisation such as Ancient Greece and Galileo, the series include the occasional examination of Eastern cultures. The programmes also contain discussions of religions, both West and East. In fact, between them the series cover a vast amount of what has made the world the way it is. So not small potatoes, then!

The series themselves:

The Ascent of Man

To some extent, Jacob Bronowski was inspired by the earlier series Civilisation, which examined the history of Western arts. Both series were commissioned by David Attenborough, himself a natural sciences graduate who went on to present ground-breaking series in his own discipline as well as commissioning these landmark programmes. (As an aside, if there are any presenters around today who appear to embody the antithesis of C.P. Snow's 'two cultures' divide, then Sir David is surely in the top ten).

Bronowski's presentation is an astonishingly erudite (for all its improvisation) analysis of the development of our species and its technological society. Although primarily focused on the West, there is some consideration of other regions, from the advanced steel-making technology of medieval Japan to Meso-American astronomy or the relatively static culture of Easter Island. Time and again, the narrative predates the encumbrance of political correctness: that it was the West that almost solely generated our modern technological society - the 'rage for knowledge' for once outshining dogma and inertia.

Of course, it would be interesting to see how Bronowski might have written it today, in light of Jared Diamond's ground-breaking (in my humble opinion) Guns, Germs and Steel. Although he works hard to present science, the plastic arts, literature and myth as emerging from the same basic elements of our nature, it is clear that Bronowski considers the former to be much rarer - and therefore the more precious - discipline. Having said that, Bronowski makes a large number of Biblical references, primarily from the Old Testament. In light of the current issues with fundamentalism in the USA and elsewhere, it is doubtful that any science documentary today would so easily incorporate the breadth of religious allusions.

If there is a thesis underlying the series it is that, considering how natural selection has provided humanity with a unique combination of mental gifts, we should use them to exploit the opportunities thus presented. By having foresight and imagination, our species is the only one capable of great heights - and, as he makes no attempt to hide, terrible depths. As he considers the latter, Bronowski admits that we should remain humble as to the state of contemporary knowledge and technology, which five hundred years hence will no doubt appear childlike. In addition, he states that belief in absolute knowledge can lead to arrogance; if we aspire to be gods, it can only end in the likes of Auschwitz. But his final speeches contain the wonderful notion that the path to annihilation can be avoided if science is communicated to all of society with the same vigour and zest as given to the humanities.

Cosmos

I was already an astronomy and astronautics fan when I saw this series. Its first UK broadcast slot was somewhat later than my usual bedtime, so it seemed a treat to be allowed to stay up after the rest of the family had gone to bed. Like Star Wars a few years before, it appeared to me to be an audio-visual tour-de-force; not surprisingly, both the tie-in hardback and soundtrack album arrived on my birthday that year.

Nostalgia aside, another key reason for the series' success was the charisma of the presenter himself. Much has been written of Sagan's abilities as a self-publicist, and the programmes do suffer from rather too many staring-beatifically-into-the-distance shots (as to some extent replicated more recently by Brian Cox in his various Wonders Of... series). Of course, it must have taken considerable effort to get the series made in the first place, especially in gaining a budget of over $6 million. After all, another great science populariser, the evolutionary biologist Stephen Jay Gould, never managed to gain anything beyond the occasional one-off documentary.

What is most apparent is Sagan's deep commitment to presenting science to the widest possible audience without distorting the material through over-simplification. However, in retrospect it is also obvious that he was using ideas from several scientific disciplines, such as the Miller-Urey experiment, to bolster his opinions on the likelihood of extra-terrestrial life. To some extent his co-writers reined him in, with the final episode given over not to SETI but to a plea for environmental stewardship.

Whilst the series is primarily concerned with a global history of astronomy and astrophysics, supplemented with first-hand accounts of planetary exploration, Sagan like Bronowski is equally at home with other scientific disciplines. He discusses the evolution of intelligence and incorporates elements of the humanities with equal aplomb. Another key element is the discussion of the role superstition and dead ends have played in the hindrance or even advancement of scientific progress, from Pythagorean mysticism, via Kepler's conflation of planetary orbits with the five Platonic solids, to Percival Lowell's imaginary Martian canals. Although Sagan repeats his earlier debunking of astrology, UFO sightings and the like, he doesn't rule out the role of emotions in the advancement of science and technology, citing for example the rocket pioneer Robert Goddard's Mars-centred epiphany.

Perhaps the primary reason that the series - despite the obvious dating of some of the knowledge - is still so engaging and why Sagan's narration is so widely quoted, is that he was a prose poet par excellence. Even when discussing purely scientific issues, his tone was such that the information could be effortlessly absorbed whilst allowing the viewer to retain a sense of wonder. Of course, Sagan had ample assistance from his two co-writers Ann Druyan and Steven Soter, as clearly proven by their scripts for the Neil deGrasse Tyson-hosted remake Cosmos: A Spacetime Odyssey. Nonetheless, it is hard to think of another presenter who could have made the original series the success it was on so many levels.

The Day the Universe Changed

Although James Burke had already made a large-scale history of science and technology series called Connections in 1978, it contained a rather different take on some of the same material. By focussing on interactive webs, the earlier series was somewhat glib, in that some of the connections could probably be replaced by equally valid alternatives.

In contrast, The Day the Universe Changed uses a more conventional approach that clearly shares some of the same perspectives as the earlier programmes. Like The Ascent of Man and the Cosmos remake, mediaeval Islamic science is praised for its inquisitiveness as well as the preservation of Classical knowledge. Burke was clearly influenced by his predecessors, even subtitling the series 'A Personal View by James Burke'. Perhaps inevitably he covers some of the same material too, although it would be difficult to create a brief history without reference to Newton or Ancient Greece.

As with Bronowski, Burke integrates scientific advances within wider society, a notable example being the rediscovery of perspective and its profound effect on contemporary art. He also supports the notion that rather than a gradual series of changes, paradigm shifts are fundamental to major scientific breakthroughs. In effect, he claims that new versions of the truth - as understood by a scientific consensus - may rely on the abandonment of previous theories due to their irreconcilable differences. Having recently read Rachel Carson's 1951 book The Sea Around Us, I can offer some agreement: although Carson's geophysical analysis quietly screams in favour of plate tectonics, the contemporary lack of evidence led her to state the no doubt establishment mantra of the period concerning static land masses.

What Burke constantly emphasises even more than his predecessors is that time and place have a fundamental influence on the scientific enquiry of each period. Being immersed in the preconceived notions of their culture, scientists can find it as difficult as anyone else to gain an objective attitude. In actuality, it is all but impossible, leading to such farcical dead-ends as Piltdown Man, a hoax that lasted for decades because it fulfilled the jingoistic expectations of British scientists. Burke's definition of genius is someone who can escape the givens of their background and thus achieve mental insights that no amount of methodical plodding can equal. Well, perhaps, on occasion.

The series also goes further than its predecessors in defining religion as anti-scientific on two grounds: its demand for absolute obedience in the face of logic and evidence, with reference to Galileo; and the lack of interest in progress, as with the cyclical yet static Buddhist view, content for the universe to endlessly repeat itself. Burke also shows how scientific ideas can be perverted for political ends, as with social Darwinism. But then he goes on to note that as the world gets ever more complex, and changes at an ever faster rate, non-specialists are unable to test new theories to any degree and so have to rely on authority just as much as they did before the Enlightenment. How ironic!

All in all, these common threads are to my mind among the most important elements of the three series:
  1. Science and the humanities rely on the same basic processes of the human brain and so are not all that different;
  2. Scientific thinking can be as creative an endeavour as the arts;
  3. Scientists don't live in a cultural vacuum but are part and parcel of their world and time;
  4. Religion is the most change-resistant of human activities and therefore rarely appears sympathetic to science's aims and goals.

As Carl Sagan put it, "we make our world significant by the courage of our questions and the depth of our answers." For me, these three series are significant for their appraisal of some of those courageous explorers who have given us the knowledge and tools we call science.