
Tuesday 17 March 2020

Printing ourselves into a corner? Mankind and additive manufacturing

One technology that has seemingly come out of nowhere in recent years is the 3D printer. More correctly called additive manufacturing, it has only taken a few years between the building of early industrial models and a thriving consumer market - unlike say, the gestation period between the invention and availability of affordable domestic video cassette recorders.

Some years ago I mentioned the similarities between the iPad and Star Trek: The Next Generation's PADD, with only a few decades separating the real-world item from its science fiction equivalent. Today's 3D printers are not so much a primitive precursor of the USS Enterprise-D's replicator as a paradigm shift away in terms of their profound limitations. And yet they still have capabilities that would have seemed incredibly futuristic when I was a child. As an aside, devices such as 3D printers and tablets show just how flexible and adaptable we humans are. Although my generation would have considered them pure sci-fi, today's children regularly use them in schools and even at home, and regard the pocket calculators and digital watches of my childhood much as I looked at steam engines.

But whilst they can't yet produce an instant cup of Earl Grey tea, additive manufacturing tools are now being tested to create organic, even biological components. Bioprinting promises custom-made organs and replacement tissue in the next few decades, meaning that organ rejection and immune system repression could become a thing of the past. Other naturally-occurring substances such as ice crystals are also being replicated, in this case for realistic testing of how aircraft wings can be designed to minimise problems caused by ice. All in all, the technology seems to find a home in practically every sector of our society and our lives.

Even our remotest of outposts such as the International Space Station are benefiting from the use of additive manufacturing in cutting-edge research as well as the more humdrum role of creating replacement parts - saving the great expense of having to ship components into space. I wouldn't be surprised if polar and underwater research bases are also planning to use 3D printers for these purposes, as well as for fabricating structures in hostile environments. The European Space Agency has even been looking into how to construct a lunar base using 3D printing, with tests involving Italian volcanic rock as a substitute for lunar regolith.

However, even such promising, paradigm-shifting technologies as additive manufacturing can have their negative aspects. In this particular case there are some obvious examples, such as home-printed handguns (originally with very short lifespans, although with the development of 3D-printed projectiles instead of conventional ammunition, that is changing). There are also subtler but more profound issues arising from the technology, including how reliance on these systems can lead to over-confidence and the loss of ingenuity. It's easy to see the part hubris played in such monumental disasters as the sinking of the Titanic, but the dangers of potentially ubiquitous 3D printing technology are more elusive.

During the Apollo 13 mission in 1970, astronauts and engineers on the ground developed a way to connect the CSM's lithium hydroxide canisters to the LM's air scrubbers, literally a case of fitting a square peg into a round hole. If today's equivalents had to rely solely on a 3D printer - with its power consumption making it a less than viable option - they could very well be stuck. Might reliance on a virtual catalogue of components that can be manufactured at the push of a button sap the creativity vital to the next generation of space explorers?

I know young people who don't have some of the skills that my generation deemed fairly essential, such as map reading and basic arithmetic. But deeper than this, creative thinking is as important as analytical rigour and mathematics to the STEM disciplines. Great physicists such as Einstein and Richard Feynman stated how much new ideas in science come from daydreaming and guesswork, not by sticking to robot-like algorithmic processes. Could it be that by using unintelligent machines in so many aspects of our lives we are starting to think more like them, not vice versa?

I've previously touched on how consumerism may be decreasing our intelligence in general, but in this case might such wonder devices as 3D printers be turning us into drones, reducing our ability to problem-solve in a crisis? Yes, they are a brave new world - and bioprinting may prove to be a revolution in medicine - but we need to maintain good, old-fashioned ingenuity; what we in New Zealand call the 'Number 8 wire mentality'. Otherwise, our species risks falling into the trap that there is a wonder device for every occasion - when in actual fact the most sophisticated object in the known universe rests firmly inside our heads.

Saturday 26 January 2019

Concrete: a material of construction & destruction - and how to fix it

How often is it that we fail to consider what is under our noses? One of the most ubiquitous of man-made artifices - at least to the 55% of us who live in urban environments - is concrete. Our high-rise cities and power stations, farmyard silos and hydroelectric dams wouldn't exist without it. As it is, global concrete consumption has quadrupled over the past quarter century, making it second only to water in terms of humanity's most-consumed substances. Unfortunately, it is also one of the most environmentally-unfriendly materials on the planet.

Apart from what you might consider to be the aesthetic crimes of the bland, cookie-cutter approach to International Modernist architecture, there is a far greater issue due to the environmental degradation caused by the concrete manufacturing process. Cement is a key component of the material, but generates around 8% of all carbon dioxide emissions worldwide. As such, there needs to be a 20% reduction over the next ten years in order to fulfil the Paris Agreement - yet it is thought there may be a 25% increase in demand for concrete during this time span, particularly from the developing world. Although lower-carbon cements are being developed, concrete production causes other environmental issues as well. In particular, sand and gravel extraction is bad for the local ecology, including catastrophic damage to the sea bed.

So are there any alternatives? Since the 1990s, television series such as Grand Designs have presented British, New Zealand and Australian projects for (at times) extremely sustainable houses made from materials such as shipping containers, driftwood, straw bales and even shredded newspaper. However, these are mostly the unique dream builds of entrepreneurs, visionaries and, let's face it, latter-day hippies. The techniques used might be suitable for domestic architecture, but they are impractical at a larger scale.

The US firm bioMASON studied coral in order to develop an alternative to conventional bricks, which generate large amounts of greenhouse gases during the firing process. They use a biomineralisation process, which basically consists of injecting microbes into nutrient-rich water containing sand and watching the rod-shaped bacteria grow into bricks over three to five days. It's still comparatively early days for the technology, so meanwhile, what about applying the three environmental 'Rs' of Reduce, Reuse and Recycle to conventional concrete design and manufacturing?

1 Reduce

3D printers are starting to be used in the construction industry to fabricate building and structural components, even small footbridges. Concrete extrusion designs require less material than conventional timber moulds - not to mention removing the need for the timber itself. One common technique is to build up shapes such as walls from thin, stacked layers. The technology is time-effective too: walls can be built up at a rate of several metres per hour, which may induce companies to make the initial outlay for the printing machinery.

As an example of the low cost, a 35 square metre demonstration house was built in Austin, Texas, last year at a cost of US$10,000 - and it took only two days to build. This year may see an entire housing project built in the Netherlands using 3D-printed concrete. Another technique has been pioneered at Exeter University in the UK, using graphene as an additive to reduce the amount of concrete required. This greatly increases both the water resistance and the strength compared with conventional material, thus halving the material requirement.
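Those headline figures imply a strikingly low unit cost. As a quick sanity check, using only the price and floor area quoted above:

```python
# Unit cost of the 3D-printed demonstration house described above.
cost_usd = 10_000      # quoted build cost in US dollars
floor_area_m2 = 35     # quoted floor area in square metres

cost_per_m2 = cost_usd / floor_area_m2
print(f"${cost_per_m2:.0f} per square metre")  # → $286 per square metre
```

Even allowing for the caveats of a demonstration build, that is a fraction of conventional construction costs per square metre.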

2 Reuse

Less than a third of the material from conventionally-built brick and timber structures can be reused after demolition. The post-war construction industry has continually reduced the quality of the building materials it uses, especially in the residential sector; think of pre-fabricated roof trusses, made of new-growth, comparatively unseasoned timber and held together by perforated connector plates. The intended lifespan of such structures could be as little as sixty years, with some integrated components such as roofing failing much sooner.

Compare this to Roman structures such as aqueducts and the Pantheon (the latter still being the world's largest unreinforced concrete dome) which are sound after two millennia, thanks to their volcanic ash-rich material and sophisticated engineering. Surely it makes sense to use concrete to construct long-lasting structures, rather than buildings that will not last as long as their architects? If the reuse of contemporary construction materials is minimal (about as far removed as you can get from the traditional approach of robbing out stone-based structures in their entirety) then longevity is the most logical alternative.

3 Recycle

It is becoming possible to both recycle other waste into concrete-based building materials and use concrete itself as a secure storage for greenhouse gases. A Canadian company called CarbonCure has developed a technique for permanently sequestering carbon dioxide in their concrete by converting it into a mineral during the manufacturing process, with the added benefits of increasing the strength of the material while reducing the amount of cement required.

As for recycling waste material as an ingredient, companies around the world have been developing light-weight concrete incorporating mixed plastic waste, the latter comprising anywhere from 10% to 60% of the volume, particularly with the addition of high density polyethylene.

For example New Zealand company Enviroplaz can use unsorted, unwashed plastic packaging to produce Plazrok, a polymer aggregate for creating a concrete which is up to 40% lighter than standard material. In addition, the same company has an alternative to metal and fibreglass panels in the form of Plaztuff, a fully recyclable, non-corroding material which is one-seventh the weight of steel. It has even been used to build boats as well as land-based items such as skips and playground furniture.

Therefore what might appear to be an intractable problem appears to have a variety of overlapping solutions that allow sustainable development in the building and civil engineering sector. It is somewhat unfortunate then that the conservative nature of these industries has until recently stalled progress in replacing a massive pollutant with much more environmentally sound alternatives. Clearly, green architecture doesn't have to be the sole prerogative of the driftwood dreamers; young entrepreneurs around the world are seizing the opportunity to create alternatives to the destructive effects of construction.

Saturday 3 March 2018

Hi-tech roadblock: is some upcoming technology just too radical for society to handle?

Many people still consider science to be a discipline wholly separate from other facets of human existence. If there's one thing I've learnt during the eight years I've been writing this blog it's that there are so many connections between STEM and society that much of the scientific enterprise cannot be considered in isolation.

Cutting-edge theories can take a long time to be assimilated into mainstream society, in some cases their complexity (for example, quantum mechanics) or their emotive value (most obviously, natural selection) meaning that they get respectively misinterpreted or rejected. New technologies emerge out of scientific principles and methodology, if not always from the archetypal laboratory. STEM practitioners are sometimes the driving force behind new devices aimed at the mass market; could it be that their enthusiasm and in-depth knowledge prevent them from realising that the world isn't yet ready for their brainchild? In some cases the "Hey, wow, cool, look what we can do!" excitement masks the elaborate web of socio-economic factors that mean the invention will never be suitable for a world outside the test environment.

There are plenty of examples of pioneering consumer-oriented technology that either could never fit into its intended niche (such as the UK's Sinclair C5 electric vehicle of the mid-1980s), or missed public demand, the Sony Betamax video recorder having been aimed at home movie makers rather than audiences just wanting to watch pre-recorded material (hence losing out to the inferior-quality VHS format).

At the opposite pole, mobile phone manufacturers in the early 1990s completely underestimated the public interest in their products, which were initially aimed at business users. Bearing in mind that there is considerable worldwide interest in certain new radical technologies that will presumably be aimed at the widest possible market, I thought I'd look at their pros and cons so as to ascertain whether non-STEM factors are likely to dictate their fortunes.

1) Driverless automobiles

There has been recent confirmation that in the next month or so vehicle manufacturers may be able to test their autonomous cars on California's state highways. With Nissan poised to test self-driving taxis in time for a 2020 launch, the era of human drivers could be entering its last few decades. Critics of the technology usually focus on the potential dangers, as shown by the system's first fatality in May 2016.

But what of the reverse? Could the widespread introduction of driverless road vehicles - once the public is convinced of their superior safety attributes - be opposed by authorities or multinational corporations? After all, in 2016 almost 21% of drivers in the USA received a speeding ticket, generating enormous revenue. Exact figures for these fines are unknown, but estimates for annual totals usually centre around six billion dollars. In addition to the fines themselves adding to national or local government coffers (for all sorts of traffic misdemeanours including parking offences), insurance companies benefit from the increase in premiums for drivers with convictions.
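That six-billion-dollar estimate is easy to sanity-check. The calculation below uses assumed inputs - roughly 225 million licensed US drivers and an average fine of around US$150, neither figure taken from an official source - combined with the 21% ticketing rate mentioned above:

```python
# Back-of-envelope estimate of annual US speeding-fine revenue.
# licensed_drivers and average_fine are illustrative assumptions.
licensed_drivers = 225_000_000  # assumed number of US licensed drivers
ticketed_fraction = 0.21        # share receiving a speeding ticket (from the post)
average_fine = 150              # assumed average fine, US dollars

annual_revenue = licensed_drivers * ticketed_fraction * average_fine
print(f"~${annual_revenue / 1e9:.1f} billion per year")  # → ~$7.1 billion per year
```

The result lands in the same ballpark as the commonly quoted six-billion-dollar figure, which is as much agreement as a rough estimate like this can offer.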

Whether vested interests would find the economic losses suitably offset by the prevention of thousands of deaths due to driver error remains to be seen. This stance might seem unjustly anti-corporate, but when the past half-century's history of private profit ahead of public interest is examined (for example, the millions paid by the fossil fuel and tobacco industries to support their products) there are obvious precedents.

One key scientific method is parsimony, A.K.A. Occam's razor. According to this principle, the simplest explanation is usually the correct one, at least in classical science; quantum mechanics plays by its own rules. An example counter to this line of thought can be seen in the work of the statistician, geneticist and tobacco industry spokesman R.A. Fisher, a keen pipe smoker who argued that rather than a cause-and-effect between smoking and lung cancer, there was a more complicated correlation between people who were both genetically susceptible to lung disease and hereditarily predisposed to nicotine addiction! Cigarette, anyone?

As for relinquishing the steering wheel to a machine, I think that a fair proportion of the public enjoy the 'freedom' of driving and that a larger contingent than just boy racers won't give up manual control without a fight; in other words, state intervention will be required to put safety ahead of individuality.

2) Extending human lifespan

It might seem odd that anyone would want to oppose technology that could increase longevity, but there would have to be some fairly fundamental changes to society to accommodate anything beyond the most moderate of extended lifespans. According to a 2009 report in The Lancet medical journal, about half of all children born since 2000 could reach their hundredth birthday.

Various reports state that from 2030-2050 - about as far in the future as anyone can offer realistic prognostication for - the proportion of retirees, including far greater numbers of Alzheimer's and dementia sufferers, will require many times more geriatricians than are practising today. The ratio of working-age population to retirees will also drop, from 5:1 to 3:1 in the case of the USA, implying a far greater pensions crisis than the one already looming. Numerous companies are using cutting-edge biotech to find cell renewal techniques, including the fifteen teams racing for the Palo Alto Longevity Prize, so the chances of a breakthrough are fairly high.
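The quoted fall in the dependency ratio understates how sharply the per-worker load grows. Assuming, purely for illustration, that each retiree's support cost is one unit spread evenly across the working-age population:

```python
# Per-worker support burden implied by the dependency ratios above.
# Each retiree is assumed to cost one arbitrary unit, shared evenly
# among workers - an illustrative simplification, not a pension model.
def burden_per_worker(workers_per_retiree: float) -> float:
    return 1 / workers_per_retiree

today = burden_per_worker(5)   # current 5:1 ratio
future = burden_per_worker(3)  # projected 3:1 ratio
increase = (future - today) / today
print(f"Per-worker burden rises by {increase:.0%}")  # → rises by 67%
```

So a seemingly modest slide from 5:1 to 3:1 means each worker shoulders two-thirds more of the cost, before any increase in healthcare expenses per retiree is counted.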

Japan offers a hint of how developed nations will alter once extended lifespans are available on a widespread basis: one-third of the population is over sixty and one in eight over seventy-five. In 2016 its public debt was more than double its GDP, and Japan also faces low labour productivity compared with other nations within the OECD. Figures such as these show that governments will find it economically challenging to support the corresponding population demographics, even if many of the healthcare issues usually associated with the elderly are diminished.

However, unlike driverless cars it's difficult to conceive of arguments in favour of legislation to prevent extended lifespans. If all nations achieved equilibrium in economy, technology and demographics there would be far fewer issues, but the gap between developed and developing nations is wide enough to deem that unlikely for many decades.

Discussions around quality of life for the elderly will presumably become more prominent as the age group grows as a proportion of the electorate. There are already various types of companion robots for those living alone, shaped like anything from cats to bears, along with anthropomorphic designs such as the French Buddy and the German Care-O-bot, the latter to my mind resembling a giant, mobile chess piece.

3) Artificial intelligence

I've already looked at international attitudes to the expansion and development of AI but if there's one thing most reports discuss it is the loss of jobs to even semi-intelligent machines. Even if there is a lower proportion of younger people, there will still be a need to keep the populace engaged, constructive or otherwise.

Surveys suggest that far from reducing working hours, information technology has caused employees in developed nations to spend more time outside work still working. For example, over half of all American and British employees now check their work email while on holiday. Therefore will governments be able to fund and organise replacement activities for an obsolete workforce, involving for example life-long learning and job sharing?

The old adage about idle hands rings true and unlike during the Great Depression, the sophistication of modern technology doesn't allow for commissioning of large-scale infrastructure projects utilising an unskilled labour pool. Granted that AI will generate new jobs in novel specialisms, but these will be a drop in the ocean compared to the lost roles. So far, the internet and home computing have created work, frequently in areas largely unpredicted by futurists, but it seems doubtful the trend will continue once heuristic machines and the 'internet of things' become commonplace.

So is it possible that governments will interfere with the implementation of cutting-edge technology in order to preserve the status quo, at least until the impending paradigm shift becomes manageable? I could include other examples, but many are developments that are more likely to incur the annoyance of certain industries rather than governments or societies as a whole. One of the prominent examples used for the upcoming Internet of Things is the smart fridge, which would presumably reduce grocery wastage - and therefore lower sales - via its cataloguing of use-by dates.

Also, if people can buy cheap (or dare I mention pirated?) plans for 3D printing at home, they won't have to repeatedly pay for physical goods, plus in some cases their delivery costs. Current designs that are available to print items for use around the home and garage range from soap dishes to measuring cups, flower vases to car windscreen ice scrapers. Therefore it's obvious that a lot of companies producing less sophisticated household goods are in for a sticky future as 3D printers become ubiquitous.

If these examples prove anything, it's that scientific advances cannot be treated in isolation when they have the potential for direct implementation in the real world. It's also difficult to predict how a technology developed for a single purpose can end up being co-opted into wholly different sectors, as happened with ferrofluids, designed to pump rocket fuel in the 1960s and now used in kinetic sculptures and toys. I've discussed the problems of attempting to predict upcoming technology and its future implementation; even if an area of technological progress follows some sort of predictable development, the wider society that encapsulates it may not be ready for its arrival.

It may not be future shock per se, but there are vested interests who like things just the way they are - certain technology may be just too good for the public. Consider, after all, how much the fossil fuel industries have spent denying man-made climate change. Or could it be time to consider Occam's razor again?

Friday 11 August 2017

From steampunk to Star Trek: the interwoven strands between science, technology and consumer design

With Raspberry Pi computers having sold over eleven million units by the end of last year, consumer interest in older technology appears to have become big business. Even such decidedly old-school devices as crystal radio kits are selling well, whilst replicas of vintage telescopes are proof that not everyone has a desire for the cutting-edge. I'm not sure why this is so, but since even instant Polaroid-type cameras are now available again - albeit with a cute, toy-like styling - perhaps manufacturers are just capitalising on a widespread desire to appear slightly out of the ordinary. Even so, such products are far closer to the mainstream than left field: instant-developing cameras for example now reach worldwide sales of over five million per year. That's hardly a niche market!

Polaroid cameras aside, could it be the desire for a less minimal aesthetic that is driving such purchases? Older technology, especially if it is pre-integrated circuit, has a decidedly quaint look to it, sometimes with textures - and smells - to match. As an aside, it's interesting that whilst current miniaturisation has reduced the energy consumption of many smaller pieces of technology - from the Frankenstein-laboratory appearance of valve-based computing and room-sized mainframes to the smart watch et al. - the giant scale of cutting-edge technology projects requires immense amounts of energy, with nuclear fusion reactors presumably having overtaken the previous perennial favourite example, space rockets, when it comes to power usage.

The interface between sci-tech aesthetics and non-scientific design is a complicated one: it used to be the case that consumer or amateur appliances were scaled-down versions of professional devices, or could even be home-made, for example telescopes or crystal radios. Nowadays there is a massive difference between the equipment in high-tech laboratories and the average home; even consumer-level 3D printers won't be able to reproduce gravity wave detectors or CRISPR-Cas9 genome editing tools any time soon.

The current trend in favour - or at least acknowledgement - of sustainable development, is helping to nullify the pervasive Victorian notion that bigger, faster, noisier (and smellier) is equated with progress. It's therefore interesting to consider the interaction of scientific ideas and instruments, new technology and consumerism over the past century or so. To my mind, there appear to be five main phases since the late Victorian period:
  1. Imperial steam
  2. Streamlining and speed
  3. The Atomic Age
  4. Minimalism and information technology
  5. Virtual light

1) Imperial steam

In the period from the late Nineteenth Century's first generation of professional scientists up to the First World War, there appears to have been an untrammelled optimism for all things technological. Brass, iron, wood and leather devices - frequently steam-powered - created an aesthetic that seemingly without effort has an aura of romance to modern eyes.

Although today's steampunk/alternative history movement is indebted to later authors, especially Michael Moorcock, as much as it is to Jules Verne and H.G. Wells, the latter pair are only the two most famous of a whole legion of late Victorian and Edwardian writers who extolled - and occasionally agonised over - the wonders of the machine age.

I must confess I much prefer steam engines to electric or diesel locomotives, despite the noise, smuts and burning of fossil fuels. Although the pistons and connecting rods of these locomotives might be the epitome of the design of this phase, it should be remembered that it was not unknown for Victorian engineers to add fluted columns and cornucopia reliefs to their cast iron and brass machinery, echoes of a pre-industrial past. An attempt was being made, however crude, to tie the might of steam power to the Classical civilisations that had failed to go beyond the aeolipile toy turbine and the Antikythera mechanism.

2) Streamlining and speed

From around 1910, the fine arts and then decorative arts developed new styles obsessed with mechanical movement, especially speed. The dynamic work of the Futurists led the way, depicting the increasing pace of life in an age when humans and machines were starting to interact ever more frequently. The development of heavier-than-air flight even led to a group of 'aeropainters' whose work stemmed from their experience of flying.

Although scientific devices still had some of the Rube Goldberg/Heath Robinson appearance of their Nineteenth Century forebears, both consumer goods and vehicles picked up the concept of streamlining to suggest a sophisticated, future-orientated design. Items such as radios and toasters utilised early plastics, stainless steel and chrome to imply a higher level of technology than their interiors actually contained. This is in contrast to land, sea and aerial craft, whereby the practical benefits of streamlining happily coincided with an attractive aesthetic, leading to design classics such as the Supermarine seaplanes (forerunners of the Spitfire) and the world speed record-holding A4 Pacific Class steam locomotives.

3) The Atomic Age

By the 1950s practically anything that could be streamlined was, whether buildings that looked like ocean liners or cars with rocket-like tailfins and dashboards fit for a Dan Dare spaceship. However, a new aesthetic was gaining popularity in the wake of the development of atomic weapons. It seems to have been an ironic move that somewhere between the optimism of an era of exciting new domestic gadgets and the potential for nuclear Armageddon, the Bohr (classical physics) model of the atom itself gained a key place in post-war design.

Combined with rockets and space the imagery could readily be termed 'space cadet', but it wasn't the only area of science to influence wider society. Biological research was undergoing a resurgence, which may explain why stylised X-ray forms, amoebas and bodily organs became ubiquitous on textiles, furnishings and fashion. Lighting fixtures were a standout example of items utilising designs based on the molecular models used in research laboratories (which famously gave Crick and Watson the edge in winning the race to understand the structure of DNA).

Monumental architecture also sought to represent the world of molecules on a giant scale, culminating in the 102 metre-high Atomium built in Brussels for the 1958 World's Fair. It could be said that never before had science- and technological-inspired imagery been so pervasive in non-STEM arenas.

4) Minimalism and information technology

From the early 1970s the bright, optimistic designs of the previous quarter century were gradually replaced by the cool, monochromatic sophistication of minimalism. Less is more became the ethos, with miniaturisation increasing as solid-state electronics and then integrated circuits became available. A plethora of artificial materials, especially plastics, meant that forms and textures could be incredibly varied yet refined.

Perhaps a combination of economic recession, mistrust of authority (including science and a military-led technocracy) and a burgeoning awareness of environmental issues led to the replacement of exuberant colour with muted, natural tones and basic if self-possessed geometries. Consumers could now buy microcomputers and video games consoles; what had previously only existed in high-tech labs or science fiction became commonplace in the household. Sci-fi media began a complex two-way interaction with cutting-edge science; it's amazing to consider that only two decades separated the iPad from its fictional Star Trek: The Next Generation predecessor, the PADD.

5) Virtual light

With ultra high-energy experiments such as nuclear fusion reactors and the ubiquity of digital devices and content, today's science-influenced designs aim to be simulacra of their professional big brothers. As stated earlier, although consumer technology is farther removed from mega-budget science apparatus than ever, the former's emphasis on virtual interfaces is part of a feedback loop between the two widely differing scales.

The blue and green glowing lights of everything from futuristic engines to holographic computer interfaces in many Hollywood blockbusters represent both the awesome power actually required by the likes of the Large Hadron Collider and an analogy for visually unspectacular real-life technologies such as lasers and quantum teleportation. The ultimate fusion (sorry, couldn't resist that one) is the use of the real National Ignition Facility target chamber as the engine core of the USS Enterprise in Star Trek Into Darkness.

Clearly, this post-industrial/information age aesthetic is likely to be with us for some time to come, as consumer-level devices emulate the cool brilliance of professional STEM equipment; the outer casing is often simple yet elegant, aiming not to distract from the bright glowing pixels that take up so much of our time. Let's hope this seduction by the digital world can be moderated by a desire to keep the natural, material world working.

Friday 23 December 2016

O Come, All ye Fearful: 12 woes for Christmas future

This month I thought I would try and adopt something of the Yuletide spirit by offering something short and sharp (if not sweet) that bears a passing resemblance to the carol The Twelve Days of Christmas. However, instead of gifts I'll be attempting to analyse twelve key concerns that humanity may face in the near future, some being more immediate - not to mention inevitable - than others.

I'll start off with the least probable issues then gradually work down to those most likely to have widespread effects during the next few decades. As it is meant to be a season of good cheer I'll even suggest a few solutions or mitigation strategies where these are applicable. The ultimate in low-carb gifts: what more could you ask for?

12. ET phones Earth. With the SETI Institute and Breakthrough Listen project leading efforts to pick up signals from alien civilisations, what are the chances that we might receive an extra-terrestrial broadcast in the near future? Although many people might deem this just so much science fiction, the contents of a translated message (or autonomous probe) could prove catastrophic. Whether it would spark faith-based wars or aid the development of advanced technology we couldn't control - or be morally fit enough to utilise - there may be as many negative issues as positive ones.

Solution: Keeping such information secret, especially the raw signal data, would be incredibly difficult. Whether an international translation project could be conducted in secret is another matter, with censorship allowing a regular trickle of the less controversial information into the public domain. Whilst this is the antithesis of good scientific practice, it could prove to be the best solution in the long term. Not that most politicians are ever able to see anything that way, however!

11. Acts of God. There is a multitude of naturally-occurring events outside human control, both terrestrial (e.g. supervolcanoes, tsunamis) and extra-terrestrial, such as asteroid impacts. Again, until recently few people took much interest in the latter, although Hollywood generated some awareness via several rather poor movies in the late 1990s. The Chelyabinsk meteor of February 2013 (a meteor rather than a meteorite, as most of the material exploded at altitude) led to around 1,500 injuries, showing that even a small object that doesn't reach the ground intact can cause havoc. Since 2000, there have been over twenty asteroid impacts or atmospheric break-ups ranging from a kiloton up to half a megaton.
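Those kiloton figures are just kinetic-energy arithmetic: half the mass times the speed squared, converted to TNT equivalent. Here's a rough sketch; the diameter, speed and density are assumed ballpark values for a Chelyabinsk-class stony object, not measured figures:

```python
import math

def impact_energy_kt(diameter_m, speed_ms, density_kgm3):
    """Rough kinetic energy of a spherical impactor, in kilotons of TNT."""
    radius = diameter_m / 2
    mass = density_kgm3 * (4 / 3) * math.pi * radius ** 3  # kg
    energy_j = 0.5 * mass * speed_ms ** 2                  # joules
    return energy_j / 4.184e12                             # 1 kt TNT ~ 4.184e12 J

# Assumed Chelyabinsk-like values: ~20 m across, ~19 km/s, stony density
print(round(impact_energy_kt(20, 19_000, 3300)), "kt")
```

Even with these back-of-envelope numbers the answer comes out at several hundred kilotons - dozens of Hiroshimas from a rock small enough to slip past most surveys.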

Solution: Although there are various projects to assess the orbits of near-Earth objects (NEOs), the development of technologies to deflect or destroy impactors requires much greater funding than is currently in place. Options range from devices that use just their velocity to knock NEOs off-course to the brute force approach of high-powered lasers and hydrogen bombs. However, with the cancellation of NASA's Ares V heavy launch vehicle it's difficult to see how such solutions could be delivered in time. Hopefully in the event something would be cobbled together pretty quickly!
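The "use just their velocity" option is the kinetic impactor, and the appeal is that a tiny nudge delivered early adds up. A back-of-envelope sketch, with all figures assumed for illustration (a DART-sized spacecraft against a modest asteroid) and ignoring the orbital mechanics that in practice amplifies the drift:

```python
SECONDS_PER_YEAR = 3.156e7

def deflection_m(impactor_kg, impactor_ms, asteroid_kg, years, beta=1.0):
    """Very rough along-track shift from a kinetic impactor.

    beta is the momentum-enhancement factor (ejecta thrown off the surface
    can push it above 1); treat the result as a floor, since real orbital
    dynamics magnifies the displacement over time.
    """
    dv = beta * impactor_kg * impactor_ms / asteroid_kg  # velocity change, m/s
    return dv * years * SECONDS_PER_YEAR                 # metres of drift

# Assumed figures: a 600 kg spacecraft at 6 km/s vs a 5 billion kg asteroid,
# hit a decade before the predicted impact
print(deflection_m(600, 6_000, 5e9, 10) / 1000, "km")
```

A sub-millimetre-per-second change in the asteroid's speed becomes a couple of hundred kilometres of miss distance after ten years - which is exactly why the surveys matter: with only months of warning, the same nudge achieves almost nothing.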

10. Grey goo scenario. As described by Eric Drexler in his 1986 book Engines of Creation, what if self-replicating nanobots (developed, for example, for medical purposes) break their programming and escape into the world, eating everything in their path? Like locust swarms, they would be limited only by the availability of raw materials.

Solution: The Royal Society's 2004 report on nanoscience declared that the possibility of von Neumann machines is some decades away and therefore of little concern to regulators. Since then, other research has suggested there should be limited need to develop such machines anyway. So that's good to know!

9. Silicon-destroying lifeforms. What if natural mutations led to biological organisms that could seriously damage integrated circuitry? A motherboard-eating microbe would be devastating, especially in the transport and medical sectors, never mind the resulting communication network outages and financial chaos. This might sound as ridiculous as any low-grade science fiction plot, but in 1975 nylon-eating bacteria were discovered. Since then, research into the most efficient methods of recovering metals from waste electronics has led to experiments in bioleaching. As well as bacteria, the fungus Aspergillus niger has been shown to break down the metals used in circuits.

Solution: As bioleaching is potentially cheaper and less environmentally damaging it could become widespread. Therefore it will be up to the process developers to control their creations. Fingers crossed, then!

8. NCB. Conventional weapons may be more commonplace, but the development of nuclear, chemical and biological weapons by rogue states and terrorist organisations is definitely something to worry about. The International Atomic Energy Agency has a difficult time keeping track of all the radioactive material that is stolen or goes missing each year. As the fatal release of the nerve agent sarin on the Tokyo subway in 1995 shows, terrorists are not unwilling to use weapons of mass destruction on the general public.

Solution: There's not much I can suggest here. Let's hope that the intelligence services can keep all the Dr Evils at bay.

7. Jurassic Park for real. At Harvard last year a chicken embryo's genes were tweaked in such a way as to create a distinctly dinosaurian snout rather than a beak. Although it may be some time before pseudo-velociraptors are prowling (high-fenced) reserves, what if genome engineering were used to develop Homo superior? A 2014 paper from Michigan State University suggests both intellectual and physical improvements via CRISPR-Cas9 technology are just around the corner.

Solution: If the tabloids are to be believed (as if) China will soon be editing human genomes, to fix genetic diseases as well as generating enhanced humans. Short of war, what's to stop them?

Planet Earth wrapped as a Christmas present

6. DIY weaponry. The explosion in 3D printers for the domestic market means that you can now make your own handguns. Although current designs wear out after a few firings, ammunition is also being developed that will fire without degrading the printed weapon. Since many nations have far more stringent gun laws than the USA, an increase in weaponry among the general public is just what we don't need.

Solution: how about smart locking systems on printers so they cannot produce components that could be used to build a weapon? Alternatively, there are now 3D printer models that can manufacture prototype bulletproof clothing. Not that I'd deem that a perfect solution!

5. Chemical catastrophe. There are plenty of chemicals no longer in production that might affect humanity or our agriculture. These range from the legacy effects of polychlorinated biphenyls (PCBs), known carcinogens, to the ozone depletion caused by CFCs, which could linger in the stratosphere for another century; the result isn't just increased human skin cancer - crops are also affected by the increased UV-B.

Solution: we can only hope that current chemical development now has more rigorous testing and government regulation than that accorded to PCBs, CFCs, DDT, et al. Let's hope all that health and safety legislation pays off.

4. The energy crisis. Apart from the obvious environmental issues around fossil fuels, the use of fracking generates a whole host of problems on its own, such as the release of methane and contamination of groundwater by toxic chemicals, including radioactive materials.

Solution: more funding is required for alternatives, especially nuclear fusion (a notoriously expensive area to research). Iceland generated 100% of its electricity from renewables whilst Portugal managed 4 consecutive days in May this year via wind, hydro, biomass and solar energy sources. Greater recycling and more incentives for buying electric and hybrid vehicles wouldn't hurt either!

3. Forced migration. The rise in sea levels due to meltwater means that it won't just be Venice and small Pacific nations that are likely to be submerged by the end of the century. Predictions vary widely, but all in the same direction: even an increase of 150mm would be likely to affect over ten million people in the USA alone, with probably five times that number in China facing similar issues.

Solution: a reduction in greenhouse gas emissions would seem to be the thing. This requires more electric vehicles and less methane-generating livestock. Arnold Schwarzenegger's non-fossil fuel Hummers and ‘Less meat, less heat, more life' campaign would appear to be good promotion for the shape of things to come - if he can be that progressive, there's hope for everyone. Then of course there's the potential for far more insect-based foodstuffs.

2. Food and water. A regional change in temperature of only a few degrees can seriously affect crop production and the amount of water used by agriculture. Over 700 million people are already without clean water, with shortages affecting agriculture even in developed regions - Australia and California spring to mind. Apparently, it takes a thousand litres of water to generate a single litre of milk!

Solution: A few far-sighted Australian farmers are among those developing methods to minimise water usage, including a few low-tech schemes that could be implemented anywhere. However, really obvious solutions would be to reduce the human population and eat food that requires less water. Again, bug farming seems a sensible idea.

1. Preventing vegegeddon. A former professor at Oxford University told me that some of his undergraduates have problems relating directly to others, having grown up in an environment with commonplace communication via electronic interfaces. If that's the problem facing the intellectual elite, what hope for the rest of our species? Physical problems such as poor eyesight are just the tip of the iceberg: the human race is in severe danger of degenerating into low-attention ‘sheeple' (as they say on Twitter). Children are losing touch with the real world, being enticed into virtual environments that on the surface are so much more appealing. Without knowledge or experience of reality, even stable democracies are in danger of being ruled by opportunistic megalomaniacs, possibly in orange wigs.

Solution: Richard Louv, author of Last Child in the Woods: Saving Our Children From Nature-Deficit Disorder, suggests children require unstructured time out of doors in order to gain an (occasionally painful) understanding of the real world; tree-climbing, fossicking, etc. Restricting time on electronic devices would seem to go hand in hand with this.

Well, that about wraps it up from me. And if the above seems somewhat scary, then why not do something about it: wouldn't working for a better future be the best Christmas present anyone could ever give?