Showing posts with label Frankenstein. Show all posts

Wednesday 12 December 2018

New neurons: astrocytes, gene therapy and the public fear of brain modification

Ever since the first cyberpunk novels of the early 1980s - and the massive increase in public awareness of the genre thanks to Hollywood - the idea of artificially-enhanced humans has been a topic of intense discussion. Whether via direct augmentation of the brain or the development of a brain-computer interface (BCI), the notion of Homo superior has been associated with a dystopian near-future that owes much to Aldous Huxley's Brave New World. After reading about current research into repairing damaged areas of the brain and spinal cord, I thought it would be good to examine this darkly-tinged area.

Back in 2009 I posted about how science fiction has to some extent been confused with science fact; this, coupled with the fairly appalling quality of much mainstream media coverage of science stories, has led to public fear where none is necessary and a lack of concern where there should be heaps. When it comes to anything suggestive of enhancing the mind, many people immediately fall back on pessimistic fictional examples, from Frankenstein to Star Trek's Borg. This use of anti-scientific material in the consideration of real-world STEM is not an optimal response, to say the least.

Rather than working to augment normal humans, real research projects on the brain are usually funded on the basis that they will generate improved medical techniques for individuals with brain or spinal cord injuries. However, a combination of the fictional tropes mentioned above and a plethora of internet-disseminated conspiracy theories, usually concerning alleged secret military projects, has caused the public to concentrate on entirely the wrong aspects.

The most recent material I have read concerning cutting-edge work on the brain covers three teams' use of astrocytes to repair damaged areas. This is an alternative to converting induced pluripotent stem cells (iPSCs) to nerve cells, which has shown promise for many other types of cell. Astrocytes are amazing things, able to connect with several million synapses. Apparently Einstein's brain had far more of them than usual in the region connected with mathematical thinking. The big question is whether this accumulation was due to nature or nurture - the latter being the intense exercise Einstein demanded of that region of his brain.

Astrocyte research for brain and spinal cord repair has been ongoing since the 1990s, in order to discover if they can be reprogrammed as functional replacements for lost neurons without any side effects. To this end, mice have been deliberately brain-damaged and then attempts made to repair that damage via converted astrocytes. The intention is to study if stroke victims could be cured via this method, although there are hopes that eventually it may also be a solution for Parkinson's disease, Alzheimer's and even ALS (motor neurone disease). The conversion from astrocyte to neuron is courtesy of a virus that introduces the relevant DNA, although none of the research has as yet proven that the converted cells are fully functional neurons.

Therefore, it would seem we are some decades away from claiming that genetic manipulation can cure brain-impairing diseases. But geneticists must share some of the blame for giving the public the wrong impression. The hyperbole surrounding the Human Genome Project gave both the public and medical workers a false sense of optimism regarding the outcome of the genome mapping. In the late 1990s, a pioneer gene therapist predicted that by 2020 virtually every disease would include gene therapy as part of the treatment. We are only just over a year short of that date, yet most research is still in first-phase trials - and concerns only diseases that lack a conventional cure. It turned out that the mapping was just the simplest stage of a multi-part programme to understand the complexities of which genes code for which disorders.

Meanwhile, gene expression in the form of epigenetics has inspired a large and extremely lucrative wave of pseudo-scientific quackery that belongs in the same genre as homeopathy, crystal healing and all the other New Age flim-flam that uses real scientific terminology to part the gullible from their cash. The poor standard of science education outside of schools (and in many regions, probably within them too) has led to the belief that changing your lifestyle can fix genetic defects or effect cures for serious brain-based illnesses.

Alas, although gene expression can be affected by environmental influences, we are ultimately at the mercy of what we inherited from our parents. Until the astrocyte research has been verified, or a stem cell solution found, the terrible truth is that the victims of strokes and other brain-based maladies must rely upon established medical treatments.

That isn't to say lifestyle is irrelevant: in some cases we may be able to reduce or postpone the risk, since diet and exercise (of both the body and brain) are clearly important - but they won't work miracles. We need to wait for the outcome of the current research into astrocytes and iPSCs to find out if the human brain can be repaired after devastating attacks from within or without. Somehow I doubt that Homo superior is waiting round the corner, ready to take over the world from us unenhanced humans…

Friday 11 August 2017

From steampunk to Star Trek: the interwoven strands between science, technology and consumer design

With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras for example now reach worldwide sales of over five million per year. That's hardly a niche market!

Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that whilst miniaturisation has reduced the energy consumption of many smaller pieces of technology - from the Frankenstein-laboratory appearance of valve-based computing and room-sized mainframes to the smart watch et al - the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken the previous perennial favourite example, space rockets, when it comes to power usage.

The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made, for example telescopes or crystal radios. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravity wave detectors or CRISPR-Cas9 genome editing tools any time soon.

The current trend in favour - or at least acknowledgement - of sustainable development is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) equates to progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to be five main phases since the late Victorian period:
  1. Imperial steam
  2. Streamlining and speed
  3. The Atomic Age
  4. Minimalism and information technology
  5. Virtual light

1) Imperial steam

In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that seemingly without effort has an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.

I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of the design of this phase, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that never went beyond the aeolipile toy turbine and the Antikythera mechanism.

2) Streamlining and speed

From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, where the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific Class steam locomotives.

3) The Atomic Age

By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. It seems ironic that somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (classical physics) model of the atom gained a key place in post-war design.

Combined with rockets and spacecraft, the imagery could readily be termed 'space cadet', but physics wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised x-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (which famously gave Crick and Watson the edge in winning the race to understand the structure of DNA).

Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.

4) Minimalism and information technology

From the early 1970s the bright, optimistic designs of the previous quarter century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied yet refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video game consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.

5) Virtual light

With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to holographic computer interfaces in many Hollywood blockbusters represent both the awesome power required by the likes of the Large Hadron Collider and the visually unspectacular reality of lasers and quantum teleportation. The ultimate fusion (sorry, couldn't resist that one) is the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek Into Darkness.

Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.

Monday 20 March 2017

Tsunamis and sunsets: how natural disasters can inspire creativity

Just as war is seen as a boost to developments in military technology, so major disasters can lead to fruitful outbursts in creativity. The word disaster - literally 'bad star', from its Ancient Greek roots - might seem more appropriate to meteorite impacts or portents associated with comets, but there are plenty of terrestrial events worthy of the name. One interesting geophysical example appears to have had an obvious effect on Western art and literature: the eruption of Mount Tambora in April 1815.

This Indonesian volcano exploded with such force that ash fell in a cloud over 2,500 km in diameter, with the initial flows and tsunami causing over 10,000 deaths. The subsequent death toll may have been ten times that number, primarily due to starvation and disease. The short-term changes in climate are thought to have accelerated the spread of a cholera strain, leading eventually to millions of deaths during the next few decades.

Although volcanic aerosols lasted for some months after the eruption, the effects were still being felt the following year. Indeed, 1816 earned such delightful nicknames as 'The Year Without a Summer' and 'Eighteen Hundred and Froze to Death', with global temperatures dropping just over half a degree Celsius. This might not sound like much, but as an example of the freak conditions, the northern USA received snow in June. Thanks to the recording of weather data at the time, it seems that the climate didn't return to normal for that period until 1819.

The terrible weather and its resulting famines and spread of disease led to riots in many nations, with the short-term appearance of vivid sunsets - due to the fine volcanic dust - failing to make up for the deprivations of food shortages and very cold conditions. One artist who was probably inspired by the former effect was J.M.W. Turner, whose paintings of evening skies appear extremely garish. As a child, I thought this seemingly unnatural colouration was due to artifice, not realising that Turner was depicting reality.

The post-Tambora aerosols contributed to Turner's stylistic change towards depicting the atmospheric effects of light at the expense of form. His radiant skies and translucent ambience inspired the Impressionist school of painting, so perhaps modern art can be said to have its roots in this two hundred year-old disaster.

Literature also owes a debt to Tambora's aftermath: during their famous Swiss holiday in June 1816, Lord Byron produced the outline of the first modern vampire story whilst Mary Shelley started writing Frankenstein. It's easy to suggest that the food riots and wintry weather then current in Switzerland could have contributed towards her tale, in which mankind's best efforts to control nature are doomed to failure.

However, it isn't just the arts that were affected by the aftermath of the volcanic eruption: several key technologies had their roots in the widespread food shortages generated by Tambora. In 1817 the German inventor Karl Drais, aware of the lack of fodder then available to feed horses, developed the earliest steerable - if pedal-less - bicycle. Although its use was short-lived, the velocipede or hobby horse was the first link in the chain (go on, spot the pun) that led to the modern bicycle.

If that doesn't appear too convincing, then the work of another German, the chemist Justus von Liebig, might do. Having as a child been a victim of the post-Tambora famine, von Liebig is known as the 'father of the fertiliser industry' for his work in the 1840s to increase crop yields via nitrogen-based fertilisers.

There is still a widespread perception that scientists' thought processes differ from the rest of humanity's, operating without any emotion. However, the after-effects of Tambora imply that creativity in response to surroundings can be just as important for scientific advance, in the same way that artists respond to their immediate environment. Hopefully, recognition of this will be another nail in the coffin for the harmful idea of C.P. Snow's 'Two Cultures' and lead more people to respect the values of science, upon which our civilisation so heavily relies. Perhaps that way we'll be rather better prepared for the next great natural disaster; after all, it's only a question of time...


Wednesday 25 May 2016

From Dr Strangelove to Dr Evil: Hollywood's anti-science stance

Despite the decades of hard work by the likes of Bill Nye, Stephen Hawking, Carl Sagan, Stephen Jay Gould et al, there is still an enormous amount of public suspicion surrounding scientists and their work. From wavering opinion concerning climate change to the negative publicity surrounding genetically-modified crops (A.K.A. 'Frankenfoods'), it seems that popular opinion of scientists isn't far above that meted out in recent years to politicians and merchant bankers.

Tabloid media cannot be solely to blame for this, although the ridiculous scaremongering stories given front page attention, frequently involving medical science, are certainly no help. Instead, I would argue that some of the blame for the public attitude to STEM (Science, Technology, Engineering and Mathematics) comes from that ubiquitous global communicator, mainstream Hollywood. So where did the world's movie capital get its ideas from?

It seems that the denigration of science and its technological applications has probably existed as long as modern science itself. Before there were films to spread the negativity, literature had a mixed opinion of the discipline. Could some of the most famous apparently anti-scientific publications from Europe have inspired Hollywood's pioneers, many of whom were European emigrés?

Jonathan Swift's third book of Gulliver's Travels concerns the scientific elite of a floating island called Laputa. First published in 1726 during the so-called Age of Enlightenment, the book is typical of Swift's no-holds-barred approach to satire, making much use of the learning of the day. Despite being far more concerned with social and political issues than with an anti-scientific stance, the material is still echoed today in the popular media.

Granted, many would agree that some of the more expensive STEM research projects such as the Large Hadron Collider could wait until global issues concerning hunger, medicine, environmental degradation - and poverty in general - are solved, but then wealth is rarely evenly distributed. After all, the USA apparently spends twice as much on pet grooming as it does on nuclear fusion research. Isn't that bizarre in itself, given that we consider ourselves so much more rational than all other animals - and that the human brain is the most complex object in the known universe? That's a pretty scary thought!

As for Mary Shelley's classic novel whose title is evoked during criticism of GM foods, she may have been inspired by the general feeling of doom then in the air; almost literally in fact, due to the 1815 eruption of Mount Tambora, with volcanic dust creating 1816's 'Year without a Summer'. As an aside, the astonishingly lurid colours of J.M.W. Turner's sunsets of the period were another artistic response associated with the high-altitude volcanic aerosols.

In addition to the extremely cold, wet conditions of that year, Shelley is thought to have stopped near to the original Frankenstein Castle in Germany, where alchemy and other dubious dark arts were reputed to have been practised. Combined with Luigi Galvani's experiments on frogs' legs - originally performed several decades earlier but much imitated still in Shelley's time, including on human cadavers - the novel is clearly a reflection of widespread anxieties of the time.

With the expansion of industrial cities and their associated squalor, the mid-Nineteenth Century saw the origin of philosophies that associated technological advances (and their scientific underpinnings) with a debasement of humanity. William Blake's description of 'dark satanic mills' epitomises this mode of thought, seen in as diverse a range of expression as the Pre-Raphaelite Brotherhood of artists, the Arts and Crafts movement, even the political writings of Marx and Engels. To blame the greed of the new captains of industry on science is obviously unfair, but then science was a far easier target. After all, the English chemist and political radical Joseph Priestley fled to the United States after an authority-sponsored mob burnt down his house in 1791.

Blake's overwrought emoting ("Science is the Tree of Death") is amongst the strongest negativity of the period, but can we blame him, considering science was, as it is today, often wrongly blamed as the root cause of the widespread destruction of nature to make way for a soulless, artificial environment? But it wasn't just the changes to society and landscape that Blake took exception to: he detested the mechanistic vision of the universe built upon the work of Galileo and Newton, believing that too much knowledge destroyed wonder and awe.

This is clearly as subjective a viewpoint as any discussion of a work of art; it can be easily rebuffed, although the attitude behind it should be treated seriously. Happily, today's plethora of glossy coffee table books on such scientifically-gleaned wonders as Hubble Space Telescope imagery show there is still plenty to be in awe of.

Mainstream cinema frequently paints a simplistic A-versus-B picture of the world (think classic westerns or war films). But science can rarely fit into such neat parcels: consider how the more accurate general theory of relativity can live alongside its predecessor from Newton. In addition, it's very tricky to make interesting drama within a traditional narrative structure that utilises scientist protagonists unless it's a disaster movie (even the likes of Jurassic Park fall within this category).

It isn't difficult to recall many negative examples of scientists in Hollywood movies, from at best those too wrapped up in their own work to notice its wider effects, to at worst insane megalomaniacs intent on either world domination or destruction. In contrast, how many sympathetic movie scientists are there?

It seems such a shame that such a ubiquitous form of entertainment consistently portrays such a lack of sympathy towards science. Even the film version of Carl Sagan's novel Contact lacked the cosmic spiritual elements of the source material, as if afraid that a combination of astrophysics and the mystical wouldn't be comprehensible to audiences (2001 syndrome, perhaps?) Science fiction films these days often seem keen to boast of their technical consultants, so what about a more sympathetic attitude to the practitioners of science itself? After all, most scientists don't live with their private armies in secret headquarters bases, planning to take over the world...