Author: Dr Chris Smith

A long-held view in cosmology circles is that the solar system, comprising the sun, the earth and all of our neighbouring planets, was catapulted into existence about four and a half billion years ago by the arrival of a huge shockwave. A nearby star that had reached the end of its life is thought to have blown itself to pieces in a catastrophic explosion known as a supernova, producing in the process a multimillion-mile-an-hour maelstrom of dust and debris. This cannoned through space until it ran headlong into a cloud of swirling gas.

This cloud would have been about 100 times the present earth–sun distance in diameter and would have weighed about three times as much as the sun. The arriving shockwave is thought to have squeezed the gas together, causing it to begin to collapse under its own gravity, triggering the formation of a central ‘proto-star'. This would have been surrounded by a protoplanetary soup of spinning material from which the planets we see today would slowly have emerged as gravity glued the debris together.

A nice theory, but a recent scientific discovery has shown that it's a myth. If this model of earth's birth were true, then remnants from the early solar system, such as ancient meteorites, ought to contain the signature of a radioactive form of iron known as iron-60, which would have been blasted from the innards of the star that exploded. But when Copenhagen University researcher Martin Bizzarro and his colleagues[23] went looking for this iron-60 (in the form of its radioactive breakdown product, nickel-60), they couldn't find any. In fact, younger meteorites, formed after the solar system had come into existence, contained more iron-60 than the older ones. This totally turned the supernova theory on its head, because the story told by the meteorites suggested that a nearby star had indeed blown up back in history, but it happened after the solar system had already formed.

So what did spark us into existence? Luckily, there was another chemical clue lurking inside the samples that the team analysed. Both the young and old meteorites contained another element, aluminium-26, and this can only mean one thing: that a super-massive star, at least 30 times the size of our own sun and with a lifetime of just a few million years, must have existed in our cosmic backyard during the time when the solar system was forming.

Stars on this scale burn off their fuel very fast and produce a powerful stellar wind laden with material from their surface layers, which includes aluminium-26 but not iron-60. ‘This rules out the supernova trigger,' says Bizzarro. Instead, he thinks the wind from this giant star probably buffeted the ball of gas that became us into forming the solar system, adding aluminium-26 to the mixture as it went. A few million years later, the star blew itself to pieces, showering the young solar system with iron-60 from its core, thus explaining why the younger meteorites bore the hallmark of iron-60 but the older ones didn't.

It looks as if we may have to rewrite the history of how the solar system came to be, and on the basis of these findings, we may well have had gentler origins than space scientists first thought.

Nitrous oxide (formula N₂O) is a volatile gas discovered by the English clergyman and scientist Joseph Priestley in 1793. (Priestley was certainly a bit of a gas man, because he also discovered oxygen, carbon monoxide, carbon dioxide, ammonia and sulphur dioxide.) Priestley made his nitrous oxide by heating ammonium nitrate in the presence of iron filings, and then passing the nitric oxide (NO) that came off through water: 2NO + H₂O + Fe → N₂O + Fe(OH)₂.
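As an illustrative aside (this sketch isn't from the original text), the equation above can be sanity-checked by counting the atoms on each side. A few lines of Python do the bookkeeping, with a tiny formula parser that handles one level of brackets such as Fe(OH)2:

```python
import re
from collections import Counter

def parse(formula):
    """Count atoms in a simple formula, e.g. 'Fe(OH)2' -> Fe:1, O:2, H:2."""
    counts = Counter()
    # Expand bracketed groups first, e.g. '(OH)2' means two O-H units.
    for group, mult in re.findall(r'\(([^)]*)\)(\d*)', formula):
        for element, n in parse(group).items():
            counts[element] += n * int(mult or 1)
    # Then count the bare element symbols left outside the brackets.
    rest = re.sub(r'\([^)]*\)\d*', '', formula)
    for element, num in re.findall(r'([A-Z][a-z]?)(\d*)', rest):
        counts[element] += int(num or 1)
    return counts

def side(species):
    """Total atom counts for one side: a list of (coefficient, formula) pairs."""
    total = Counter()
    for coeff, formula in species:
        for element, n in parse(formula).items():
            total[element] += coeff * n
    return total

# Priestley's preparation: 2NO + H2O + Fe -> N2O + Fe(OH)2
left = side([(2, 'NO'), (1, 'H2O'), (1, 'Fe')])
right = side([(1, 'N2O'), (1, 'Fe(OH)2')])
print(left == right)  # True: 2 N, 3 O, 2 H and 1 Fe on each side
```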

Humphry Davy, from the Pneumatic Institute in Bristol, England, then began to experiment with the physiological properties of the gas, and visitors to the institute were given nitrous oxide to breathe. Their reactions, and his own experiments, led Davy to coin the term ‘laughing gas', and he also noticed that the gas had anaesthetic properties: ‘As nitrous oxide in its extensive operation appears capable of destroying physical pain, it may probably be used with advantage during surgical operations in which no great effusion of blood takes place.'

However, for 40 years or so after Davy made this observation, most N₂O was used for recreational purposes, including at public shows and carnivals where members of the public would pay a small fee to inhale a minute's worth of the gas. It wasn't until the mid-1800s that doctors and dentists began to re-explore its painkilling potential.

As is usually the case with a medical breakthrough, it took an accident to help a local dentist make the intellectual leap that was to catapult N₂O into the domain of medicine. Horace Wells watched as one of the volunteers breathing the gas, a man named Samuel Cooley, staggered into some nearby benches and injured his leg. What intrigued Wells was that Cooley remained unaware of his injury until the effects of the gas wore off. Realising that N₂O might possess painkilling qualities, Wells approached the demonstrator, a medical school dropout called Gardner Quincy Colton, and invited him to participate in an experiment the next day.

Colton agreed and subsequently administered nitrous oxide to Dr Wells while another dentist extracted one of Wells' teeth. Wells experienced no pain during the procedure, and N₂O was born as a dental and medical painkiller. That's the history of the gas. Since that time, it's been embraced as a safe agent that can be used for pain relief (such as during childbirth and dental procedures) and in general anaesthesia. On its own, it's not a sufficiently potent anaesthetic to induce (i.e. cause) anaesthesia, but once a patient is ‘under', it's a very good gaseous agent for anaesthetic ‘maintenance'.

In this respect, N₂O isn't that unusual, since most volatile gases can behave as anaesthetics with intoxicating effects – they differ only in their potency (i.e. how much of them is needed to have an effect). Volatiles with this property include the butane you squirt into your cigarette lighter and even petrol vapours. In fact, this latter example has been a serious problem in parts of Australia, where members of some communities have been sniffing petrol. This has resulted in BP (British Petroleum) recently producing a blend of unleaded petrol for the Australian market that is less suitable for sniffing.

No one knows exactly how general anaesthetics work, but the fact that they are usually organic, lipid-loving chemicals suggests that they probably alter nerve cell function by dissolving in the oily membrane that surrounds our cells and affecting the behaviour of membrane pores or channels which control the excitability of the cell. Alcohol probably works similarly, and there is now evidence that it specifically renders cells more sensitive to one of the brain's inhibitory nerve transmitters, called GABA. This means that cells become less responsive in the presence of alcohol, which is why booze is a central nervous system depressant.

As an aside, it's not just animals that can benefit from the effects of N₂O. Cars receive a boost in performance when a burst of ‘nitro' is injected into the cylinder during combustion. The heat of the burning fuel causes the nitrous oxide to decompose to nitrogen and oxygen: 2N₂O → 2N₂ + O₂. So two molecules of gas turn into three molecules of gas, which increases the volume of products inside the cylinder, boosting performance. A bit like Viagra really, although that relies initially on the effects of nitric oxide (NO), rather than nitrous!
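To make the molecule-counting explicit, here's a tiny illustrative sketch (again, not from the original text), leaning on Avogadro's law, under which gas volume at a fixed temperature and pressure scales with the number of molecules:

```python
# 2 N2O -> 2 N2 + O2: tally the gas molecules on each side.
# By Avogadro's law, at fixed temperature and pressure the volume of a
# gas is proportional to its molecule count, so the ratio below is also
# the volume ratio of products to reactants.
reactants = {"N2O": 2}
products = {"N2": 2, "O2": 1}

n_in = sum(reactants.values())    # 2 molecules in
n_out = sum(products.values())    # 3 molecules out
print(f"{n_in} molecules -> {n_out} molecules: volume x{n_out / n_in:.1f}")  # x1.5
```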

When American surgeon and Nobel laureate Joseph Murray performed the world's first successful kidney transplant in December 1954, he succeeded largely because the recipient, Richard Herrick, and the donor, his brother Ronald, were identical twins. Because the brothers shared the same DNA code, their individual immune systems could not distinguish the organs of one man from those of the other, and so there was no question of rejection.

Unfortunately, the majority of the 6.8 billion people currently alive on earth aren't lucky enough to have an identical twin from whom to beg or borrow an organ whenever they need one. Even if they did, it's unlikely that organs like the heart, of which most normal people have just one, would be volunteered willingly. As a result, the majority of the transplants carried out today are ‘allografts'. That is, they involve organs taken from someone who is genetically different from the recipient – often a healthy individual who has died following an accident.

But therein lies the problem. The immune system marshals a highly trained army of white blood cells and antibodies which are programmed to tell friend, known as ‘self', from foe. So whenever foreign materials are introduced to the body, with the clever exception of a developing foetus, the immune system can detect that the chemical markers or antigens displayed on the surfaces of the cells in foreign tissue do not match those in the rest of the body. When this occurs, the presumed impostor is attacked and destroyed. If this involves a donor organ, it's termed rejection, a process first described by the French surgeon and Nobel winner Alexis Carrel at the turn of the last century.

To prevent rejection from occurring, transplant doctors try to genetically match donors and recipients as closely as they can, but there will inevitably be differences between the two and so there is always the prospect of an immune attack. This problem held back the transplant field for many years until the 1970s, when the immune-suppressing drug cyclosporine was discovered. Cyclosporine chemically deafens white blood cells to the sounds of their own inflammatory signals, which prevents the immune system from mounting its normally well-orchestrated attack and thus protects the donor organ from destruction.

Sadly, suppressing the immune system in this way comes at a cost, because patients are less able to fight off infections and they are also more prone to malignancies, since the immune system also has a role in killing off cancer cells. Until recently, doctors thought that these side effects were necessary evils and that immunosuppression was vital for the continued survival of a transplanted organ. Now it looks like this might be at least part-myth.

Indeed, Harvard transplant researcher Megan Sykes and her colleagues[24] have discovered that if a patient undergoing a kidney transplant also simultaneously receives a bone marrow transplant from the same donor, together with a drug to temporarily remove all their white blood cells, they can subsequently stop all immunosuppressive drugs without any signs of rejection. ‘If the bone marrow of two different individuals exists in the same patient at the same time, the donor bone marrow can re-educate the immune system so it regards tissue from the donor as self,' says Sykes. ‘We've done five patients in a pilot study; four are doing very well. They've been off immunosuppression for several years, five years in one case, and their kidneys are being accepted despite the lack of any immunosuppression.'
