"I promise to work for a better world, where science and technology are used in socially responsible ways. I will not use my education for any purpose intended to harm human beings or the environment. Throughout my career, I will consider the ethical implications of my work before I take action. While the demands placed upon me may be great, I sign this declaration because I recognize that individual responsibility is the first step on the path to peace."
The above is the text of a ‘Hippocratic Oath for Scientists’ proposed by the physicist and Nobel Peace Prize laureate Sir Joseph Rotblat in Science in 1999 (1). Rotblat, who grew up in Poland and moved to England just before the Second World War, was originally part of the Manhattan Project that created the atom bomb. When in 1944 it became obvious that the Germans were not capable of building such a device, Rotblat was the only member of the group – which included many of the finest physicists of the 20th century, such as Niels Bohr and Richard Feynman – to decide he no longer wanted any part in creating such a powerful weapon of mass destruction. Throughout the rest of his life, he advocated that scientists have a responsibility to protect the environment and humankind, and should be main players in the quest for world peace.
Rotblat did not see science as amoral; he felt that scientists have a clear responsibility to the public, and that if the results of certain lines of research could harm humanity or the environment, scientists should decide against conducting those experiments. But many have argued that, while the misuse of scientific knowledge can certainly have dangerous consequences, the scientific process itself is, and has to be, ultimately neutral.
Arguably the strongest advocate for such a position is Lewis Wolpert, Emeritus Professor in Cell and Developmental Biology at University College London. In 2000, in a published exchange of letters (2) between Wolpert and the environmentalist and founder of The Ecologist, Edward Goldsmith, he argued that “it is the very nature of science that it is not possible to predict what will be discovered, or how these discoveries could be applied”, and suggested that one should not confuse knowledge gained by scientific research with the (technical) applications that follow. While he also argued that “scientists have neither the right nor the skill to make ethical decisions about the application of their work”, he did suggest that scientists have a responsibility to provide the public with the tools to make informed decisions about the application of science: they should explain their research to the public, explore possible consequences – positive and negative – of their scientific results, and make sure that the research is trustworthy (3). This would then allow for informed debates amongst policy makers and the general public, such that society as a whole would be involved in decisions about how scientific progress is applied.
The debate about the ethical responsibilities of scientists came to the fore again in May this year, when the controversial American scientist Dr Craig Venter announced that researchers at the J. Craig Venter Institute – a private organisation run by a board of which Dr Venter is President – had created ‘synthetic life’ by chemically replicating the DNA of a bacterial cell and replacing the cell’s original DNA with the artificial copy (4). The claim is disputed by some scientists, who argue that rebooting a living cell with synthetic DNA is not the same as creating new life.
While the technical advancements of this study are largely undisputed, it has been the ethical questions surrounding this research that have caught the headlines. Some of the more controversial reactions to Venter’s research came from Professor Julian Savulescu of the Uehiro Centre for Practical Ethics, who is quoted in the Times Online as saying that “Venter is creaking open the most profound door in humanity’s history, potentially peeking into its destiny”, and who even goes on to say that “he [Craig Venter] is going towards the role of a God: creating artificial life that could never have existed naturally … the risks are also unparalleled. These could be used in the future to make the most powerful bioweapons imaginable.” David King, director of the Human Genetics Alert watchdog, also criticised Venter for playing God: “what is really dangerous is these scientists’ ambitions for total and unrestrained control over nature.” Although these statements seem extreme and guided by emotion rather than reason, they illustrate an important issue: there is a discrepancy in the understanding of scientific research between scientists and lay audiences – even highly educated ones – that needs to be addressed.
This research has also grabbed the attention of environmentalist groups, who are calling for a moratorium on the release of any synthetic life forms into the environment. They argue that artificial life forms may threaten existing wildlife, hasten extinctions and diminish biodiversity. At the UN Convention on Biological Diversity held in Kenya at the end of May this year, the Action Group on Erosion, Technology and Concentration (ETC Group) – an international civil society organisation, based in Ottawa, Canada, that researches the impact of new technologies on marginalised peoples – helped formulate a ‘de facto moratorium’ on synthetic biology, which calls for a total ban on any experiment in which artificial life forms are released into the wild (5). Environment ministers of the Convention's 193 member countries will meet in Japan later this year to decide whether to adopt such a moratorium.
Browsing through the popular media coverage, it might seem as if Rotblat would not have approved of Craig Venter’s experiments; however, as Venter himself has pointed out, the ability to create synthetic life allows us to start understanding the function of the fundamental components of life. This may ultimately lead to cures for genetic diseases, such as cystic fibrosis and Alzheimer’s disease, by replacing damaged DNA with synthetic DNA that carries no mutations. The environment may benefit significantly too: synthetic cells may be used to develop synthetic fuels that could reduce our dependence on fossil fuels and cut the amount of carbon emitted into the atmosphere. Such technologies therefore have huge potential for business; BP and Exxon Mobil are already large funders of Dr Venter’s research.
To a young scientist, it might seem quite straightforward to sign up to Rotblat’s proposed oath, but what would it practically mean to adhere to this code? How do you decide where your personal responsibilities lie? Are you always aware of the possible effects of your research? These are particularly difficult questions to answer, especially in a scientific world where high-profile publications are ever more important as a currency for research money.
As Wolpert argues, scientific discovery is not necessarily predictable, and the suggestion that controversial results can be avoided by not carrying out such experiments seems naïve. Furthermore, while some findings may be used in harmful ways, often their benefits far outweigh their risks. For example, 14% of the world’s electricity needs (30% in the EU) are provided by nuclear power (6,7). Thus, despite its major image problem, this positive use of nuclear fission is of vital importance in today’s society.
While the argument that the scientific process is neutral seems well founded, it is harder to conceive exactly how decisions about the application of science should be made democratically. There are only a few scientific minds in the current UK parliament, and the crucial question is whether we would really want to leave decisions about the application of synthetic life or genetically modified crops to politicians or groups of activists. Many comments on Venter’s study came from philosophers, and they could play a pivotal role in democratising the application of scientific progress. In the current era, where major scientific discoveries follow each other in rapid succession, there is a need for ethicists to guide scientists and policy makers on how to use controversial results responsibly. This need is beginning to be widely acknowledged and has already led to the founding of numerous centres worldwide dealing with the ethical side of scientific discovery. Scientists could, or maybe even should, play a major role in this process by helping ethicists and policy makers understand the science behind discovery.
While it may seem that the media increasingly represent scientific advances as ever greater threats to our existence, such fears are by no means new. The image of the “mad scientist”, tinkering with the natural world, can be seen as far back as Aristophanes’ comedy ‘The Clouds’, and Mary Shelley’s famous 19th-century novel ‘Frankenstein’ expressed the uneasy feelings many people have about science. One has to bear in mind that controversial advances have always led to criticism and ethical questions; equally, controversial advances in science, as well as in literature, art and society, have made our society what it is today.
References
1. Rotblat, J. (1999). A Hippocratic Oath for Scientists. Science 286: 1475.
2. Wolpert, L. and Goldsmith, E. (2000). Letter exchange: Is science neutral? The Ecologist 30 No. 3 (May).
3. Wolpert, L. (1999). Is science dangerous? Nature 398: 281-282.
4. Gibson, D.G., Glass, J.I., Lartigue, C. et al., (2010). Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome. Science 329: 52-56.
5. ETC Group (2010). Synthia is Alive … and Breeding: Panacea or Pandora's Box? http://www.etcgroup.org/en/node/5142
6. International Energy Agency (2009). Key world energy statistics. http://www.iea.org/publications/free_new_Desc.asp?PUBS_ID=1199.
7. European Commission (2009). Eurostat. http://epp.eurostat.ec.europa.eu/portal/page/portal/product_details/publication?p_product_code=KS-SF-09-055.
This article appeared in the Science and Society section of the Michaelmas Term issue of 'Phenotype', the termly science magazine from the Oxford University Biochemical Society. Here you can read the magazine in its entirety.
Sunday, 17 October 2010
Finally ...
I know - months have gone past without fresh words on this blog. I have not been entirely idle during this period, but few of my energies have been directed towards writing (evidently!).
Moreover, I have also recently been moonlighting online elsewhere and have just posted a piece on Science Oxford Online provoked by the pointless nature-nurture argument on a recent edition of The Today Programme following the announcement of a possible genetic component to ADHD (Attention Deficit Hyperactivity Disorder).
This comes as part of my increasing involvement with Science Oxford, a charitable organisation that promotes local science to the general public and, more generally, aims to educate and inspire children and adults about science from across the globe.
For the younger generation, they provide what is called a "Discovery Zone", essentially a large, educative and interactive play pen, which provides hours of distraction and fun while also quietly demonstrating some of the laws of magnetism, gravity and the like.
While many adults also seem to gain as much enjoyment from these toys as their children, Science Oxford also organises events more specifically tailored to their age group on Thursday evenings. Over the past year, I have helped out at talks on topics ranging from sustainable farming to Alzheimer's disease via a celebration of the contribution of black scientists. They even allowed me an evening to consider how memory and identity interact. For those of you not in Oxford, many of the events are streamed and archived on the website and can be downloaded as video podcasts via iTunes (search for Science Oxford in the iTunes store). Their recent event on everything you always wanted to know about sex (whether or not you were afraid to ask) should be up there shortly and there will soon be talks on the universe and the atom (quite some change in scale!) along with a seasonal take on the science of fireworks.
But don't worry, Science Oxford has not entirely taken over my life and I shan't be forsaking this blog entirely. I hope shortly to write about some of the science that has caught my eye over the summer months. Without a four-month delay this time, I promise.
Thursday, 24 June 2010
Double blind & placebo controlled, but where are the statistics?
For those of you who missed this morning's The Today Programme on BBC Radio 4, listen here to a fragment in which homoeopathy sceptic Simon Singh and Conservative MP David Tredinnick take opposing views on the healing powers of homoeopathic remedies. Based on a couple of recently published studies, Tredinnick feels it is worth re-opening the debate on whether or not the NHS should fund homoeopathic remedies; Simon Singh explains why this would be a mistake. Singh's description of a French homoeopathic flu medicine – one crushed and highly diluted duck that serves millions of people, and therefore earns the company that created the remedy millions of euros – as the ultimate quack remedy is a wonderful, almost perfect put-down of homoeopathy's claims.
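To see why the duck joke bites, it is worth doing the dilution arithmetic. The back-of-the-envelope sketch below assumes the standard '200C' homoeopathic potency (200 successive 1-in-100 dilutions) commonly attributed to such remedies; the exact figure was not given in the programme, so treat the numbers as illustrative.

```python
import math

# "200C" preparation: 200 successive 1-in-100 dilutions (an assumed
# potency, for illustration only).
DILUTIONS = 200
log10_dilution = 2 * DILUTIONS          # each 1:100 step adds 2 orders of magnitude

# Even starting from a whole mole of duck extract (~6.022e23 molecules)...
log10_avogadro = math.log10(6.022e23)   # ~23.8

# ...the expected number of original molecules left per dose:
log10_left = log10_avogadro - log10_dilution
print(f"dilution: 1 part in 10^{log10_dilution}")
print(f"expected original molecules per dose: 10^{log10_left:.1f}")
# ~10^-376: effectively no chance that a single molecule of the original
# substance survives in any dose.
```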
Wednesday, 9 June 2010
Reading between the lines
Whenever we compare human and animal behaviour, one major difference that always comes up is our ability to use words to communicate with our fellow human beings. We can use words to order food in a restaurant and chat about everything and nothing over coffee with our friends, and the act of talking has become so normal to us that we don't even give much thought as to how extraordinary speech really is. And now researchers have discovered that our use of language gives away more than just the literal meaning of the words: there are hidden clues about ourselves in our words, in the frequency with which we use certain words, in the number of different words there are in our personal vocabulary and in the richness of our sentences.
In 'Vanishing Words', a recent episode of my ever-favourite science-based radio programme Radiolab, Jad and Robert talk about those hidden messages. Drs Ian Lancashire, Kelvin Lim and Serguei Pakhomov are all interested in how characteristics of our personal vocabulary may indicate our likelihood of developing memory-related diseases such as Alzheimer's. Moving from analysis of the crime novels of Agatha Christie to a comparison of how the linguistic content of teenage essays by a group of nuns relates, sixty years on, to their cognitive function, these researchers have found surprising associations between the richness of our language and our later susceptibility to cognitive decline.
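One of the simplest measures behind such work is lexical richness, often indexed by the type-token ratio: the number of distinct words divided by the total number of words in a fixed-size sample. The sketch below is a minimal illustration with invented text samples, not a reconstruction of the researchers' actual methods.

```python
import re

def type_token_ratio(text, sample_size=1000):
    """Crude vocabulary-richness index: distinct words / total words,
    over a fixed-size sample (raw TTR falls as texts grow longer)."""
    words = re.findall(r"[a-z']+", text.lower())[:sample_size]
    return len(set(words)) / len(words)

# Invented snippets, for illustration only; the real studies compared
# matched samples from e.g. early and late Agatha Christie novels.
early = "the garden was full of strange and delicate flowers " * 40
late = "the garden was nice and the flowers were nice too " * 40

print(f"'early' sample richness: {type_token_ratio(early):.3f}")
print(f"'late' sample richness:  {type_token_ratio(late):.3f}")
```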
And more messages can be found in this blog post on the British Psychological Society's Research Digest blog. Dr Paul Rozin of the University of Pennsylvania showed, in a study published in Cognition and Emotion, that our existential feelings about positive and negative events are reflected in our language. Analysing the frequency of positive and negative words in our language, the researchers found a large prevalence of the former, mirroring, they claim, the preponderance of positive events that occur. Nonetheless, while negative words are comparatively scarcer, they exist in greater variety, which the authors argue is again typical of how such events occur in our everyday environment.
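Rozin's basic contrast between frequency (how often valenced words occur) and variety (how many distinct valenced words are used) can be sketched in a few lines. The word lists and sample text below are invented for illustration; the published study worked from large corpora and validated affect lexicons.

```python
# Toy contrast between token counts (frequency) and type counts (variety)
# of valenced words. Word lists and text are invented for illustration.
POSITIVE = {"good", "happy", "nice", "great"}
NEGATIVE = {"sad", "angry", "afraid", "bitter", "gloomy", "tense"}

text = ("it was a good day a happy day a nice walk a good meal "
        "though one sad moment and one tense moment crept in").split()

pos = [w for w in text if w in POSITIVE]
neg = [w for w in text if w in NEGATIVE]

print(f"positive: {len(pos)} tokens, {len(set(pos))} distinct types")
print(f"negative: {len(neg)} tokens, {len(set(neg))} distinct types")
# The Rozin pattern: positive words occur more often overall, while the
# negative vocabulary is more varied relative to how often it is used.
```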
Perhaps this last finding is not so surprising. Even at its most gossipy or mundane, language is partly a translation of our needs and desires, and partly a reflection of our environment. Slips of the tongue have endlessly been interpreted not just as mistakes but as windows onto our souls (see, for instance, the furore when Gordon Brown once accidentally said that he had "saved the world" rather than just the monetary crisis). Similarly, the past fifty years have seen fierce battles over Whorfian linguistic relativism, questioning how language affects thought, perception and behaviour.
All this does give the phrase 'reading between the lines' quite a different meaning!
Wednesday, 2 June 2010
Illuminating the brain's bright future
In the 1920s Felix the Cat had a brilliant idea and a light bulb appeared over his head; thus was created the signature of an epiphany. But recent advances in neuroscience leave you wondering whether in the future we will be more familiar with light bulbs actually driving our thoughts and inspiration rather than just being a visual metaphor. Gero Miesenböck, currently Waynflete Professor of Physiology at Oxford University, has been pioneering work that uses light to control brain cells, a field known as optogenetics.
Our brain consists of approximately 100 billion neurons that, as Miesenböck lyrically describes, form “an intricate tapestry”. To understand how neuronal signalling drives our behaviour, he says, we need to tease apart the disparate contributions that each of the different populations of neurons make to our behaviour. Nobel laureate Francis Crick remarked in a famous article in 1979 that one thing scientists have dreamed about is a tool that would allow them to selectively activate or turn off certain groups of cells while leaving others unaffected. Twenty years later, he suggested how this might be achieved: with light and molecular engineering. And this is precisely what optogenetics does.
To understand this technique we have to go back to the 1990s when German biologist Peter Hegemann discovered that green algae, commonly found in ponds, respond to light by wagging their tail. This behaviour was intriguing because algae are unicellular creatures without eyes. Hegemann discovered that when light photons hit the protein coils packed in the algae’s cell membrane, a chemical reaction creates a tiny gap in the membrane, causing an ionic current to be produced and the algae’s tail to wag. The protein that allows this reaction with light is called channelrhodopsin and is comparable to rhodopsins found in our own eyes.
Meanwhile, Miesenböck and his colleagues, working in New York and later at Yale, wondered whether they could exploit a similar mechanism to control brain cells. They took light-sensitive proteins like the photoreceptors of our eyes and transplanted them into neurons; by simply shining a light on them, the team was able to activate the modified neurons, a first step towards neuronal control.
To exploit the full power of this method, however, the researchers needed to find a way to excite or inhibit only selected populations of cells, and with genetic engineering they were able to achieve this. By harnessing the cunning of viruses or by creating genetically modified mice and flies, it was possible to make expression of the rhodopsin-encoding gene specific to particular neurons, meaning that only those neurons would become active when illuminated.
The road to success for optogenetics was not easy. The first difficult step was to find out whether they could transplant the rhodopsin-containing photoreceptors of flies to other cells in a culture and activate them with a flash of light. Once they had succeeded in doing this, the second, even more complicated challenge was to move from changing neuronal activity in a cell culture to changing the behaviour of a living being, in Miesenböck's case the fruit fly. The promise first became clear when Susana Lima, Miesenböck's PhD student at the time, showed him the first baby steps taken by a fruit fly at the command of light. Within five years, they had learned how to remote-control a fly.
The technique is now so advanced there is a large volume of work looking at how brain cells control behaviour. Last year in Cell, Miesenböck and his team exposed the learning mechanisms of a fly by creating false memories (1). They placed a fly in a narrow chamber, half of which smelled of an old tennis shoe, the other half of sweet fruit. By observing how much time the fly spent on either side, the researchers were able to work out which was the fly’s preferred smell. When this location was later paired with a memorable, aversive signal – a painful electrical shock – the fly learned to avoid this location and spend more time on the opposite side of the chamber. From previous research, Miesenböck knew which neurons were involved in learning to associate the shock with an odour and could therefore directly target this system with optogenetics. By activating these cells with light when the fly was in the location of its preferred smell, Miesenböck’s team was able to provoke identical avoidance behaviour even though no electric shock was given. Thus, the fly learned from an experience it never had.
Might we be able to use this technique to control our minds in the future? Miesenböck thinks that it will be a while before optogenetics can be used in humans: “You would have to express a foreign gene in a targeted fashion and this is where the show-stopper currently lies”. While using this technique in humans may be a long way off, he does believe that optogenetic research in flies might nonetheless directly aid our understanding of the human brain because biology is generally conserved. “Nature rarely invents the wheel twice”.
For now, Miesenböck thinks the field should focus on blurring the boundaries between work in whole organisms and fine-scale research in cell cultures. They could make use of the fact that tissue in a cell culture can be treated as if it was still part of a functioning brain by activating the cells with flashes of light – a use of optogenetics that is currently underappreciated. “There will be room for brain-free neurobiology, where optogenetics provides the interface to allow researchers to really talk to and feed artificial information into neuronal systems”.
Miesenböck also advocates using light “to enable scientists to drive nervous systems outside their normal operating limits, because this is often where mechanisms reveal themselves”. Miesenböck’s team used this approach to investigate the origin of sex differences in flies. While male and female fly brains are very similar, they nonetheless display sex-specific courting behaviours. The gene that controls male courting behaviour is expressed in a very small number of neurons in the abdominal ganglia of the fly. By specifically targeting these cells with optogenetics and shining light onto this circuitry, Miesenböck’s team was able to produce male courting behaviour in all the flies, even the females (2). Thus, they were able to show that females possess a bisexual brain containing a motor programme necessary for male courtship behaviour, but do not activate it because the neuronal commands required for the behaviour are absent.
With the ability to dissect neuronal functioning in the healthy brain, optogenetics might also hold potential to help understand the exact mechanisms that cause neurological and psychiatric diseases such as depression and schizophrenia and even help treat them. For example, Karl Deisseroth and his team at Stanford University in California published a study in Science last year that used optogenetics in rats to investigate directly how deep brain stimulation might alleviate symptoms of Parkinson’s Disease, something that had previously been poorly understood (3).
Thus, despite the difficulties in applying the method to humans, Miesenböck is hopeful: “With optogenetics we can really identify the players that are responsible for particular behaviour and that may give us knowledge for targets of more conventional treatment. Then conventional treatment can become more effective and cleaner.”
References:
1. Claridge-Chang, A. et al. (2009). Writing memories with light-addressable reinforcement circuitry. Cell 139: 405-415.
2. Clyne, J.D. and Miesenböck, G. (2008). Sex-Specific Control and Tuning of the Pattern Generator for Courtship Song in Drosophila. Cell 133: 354-363.
3. Gradinaru, V. et al. (2009). Optical deconstruction of parkinsonian neural circuitry. Science 324: 354-359.
This article appeared in 'Phenotype'. Here you can read the magazine in its entirety.
A shorter, related post can be found on my blog here.
Monday, 3 May 2010
Death rites or delusions?
Yesterday evening I watched ‘Fantastic Mr. Fox’, Wes Anderson’s interpretation of Roald Dahl’s story about the adventures of Mr. Fox and his family. The film wonderfully juxtaposes human and animal behaviour: relationships between the animals are as layered and complex as those of humans, while their dining habits are highly characteristic of animals – faces in their plates, ripping the food to pieces. As well as being a delightful story, the film also touches upon an interesting and ever-present question: do animals have feelings in the same way humans do?
One aspect of this issue was addressed in two arresting articles in the always-interesting scientific journal Current Biology, both of which received a lot of press last week. Both concerned the reactions of groups of chimpanzees, one wild and one captive, to the death of a member of their troop. When Pansy, a female chimp of 50-plus years living in the Blair Drummond Safari and Adventure Park in Stirling, was dying, the females of the troop groomed her, and following her death her daughter spent the night by her side on a platform where she would never normally stay. The images from the other article were no less affecting: a mother carrying her deceased infant for as long as 68 days after its death.
When reading such stories, it is hard not to think in terms of the chimps' grief or anguish and to consider these rituals as part of a mourning process, just as we might respond to the loss of a mother or child. There are reports of death rituals in birds too, but I doubt they would affect us in quite the same way as watching these wild, yet recognisably close creatures in an apparent state of mourning – chimps are genetically our closest living relatives and possess features that we cannot help but respond to.
But at what point does such empathy turn into anthropomorphism? This touches on an old question of how well we can ever describe what animals actually feel, even one as close to us as a chimp. The philosopher Thomas Nagel famously raised this in his succinctly titled article "What is it like to be a bat?". This general problem was also a prime driver of the behaviourist movement in experimental psychology, where people such as B.F. Skinner, and more recently Howard Rachlin, have argued that there can only be observable inputs and outcomes – the animal responds to its environment partly as a sort of reflex – since internal emotions or motivations are just hypothetical constructs that we try to infer from an animal's responses.
The particular interpretations of these two recent Current Biology articles are discussed in a lucid comment piece by Ros Coward in the Guardian last Saturday, which points out the need for replicable research that measures tangible effects such as stress levels or the amount of time spent engaging in particular behaviours. These existential questions about what goes on in the minds of animals are approached from almost entirely the opposite angle across a few episodes of my favourite radio programme Radiolab, where the potential similarities between our minds and those of other mammals are explored (main episode: Animal Minds, Shorts: Fu Manchu and The shy baboon).
There is also another side to anthropomorphism that is less remarked upon, and which highlights something fundamental, perhaps unique, to being human: our incessant ability to find narrative. In a fascinating study, building on original research from the 1940s and published in the journal Brain in 2002, a group led by Uta Frith at UCL documented how people would ascribe intentions and emotions, such as happiness or even seduction, to abstract shapes moving around on a screen. Have a look at the coaxing, dancing and drifting shapes and see for yourself how easy it is to attach human emotion to these inanimate animated objects. Interestingly, this effect was much less evident in adults with autism, which accords with a prominent theory that autistic people lack the ability to ascribe mental states to others.
Do these results published in Current Biology mean that animals have the same feelings of fear and pain as we do? If so, does the only difference between them and us lie in the fact that we can put our feelings into words to explain them to others? And how ‘animal’ are we? Are we just as much responding to our environment in a reflex-like manner, but with the benefit of having an ability to narrate our actions, seemingly explaining them to ourselves and others?
Monday, 19 April 2010
Why does sleep deprivation alleviate symptoms of depression?
Here's a link to an interesting article in the New York Times about depression and sleep that surprised me. A couple of months ago the neighbours upstairs had a new baby. He cries a lot, day and night, and you would think that the sleepless nights his crying causes could easily make anyone depressed. Indeed, postpartum depression is remarkably common, affecting between 5 and 25% of new mothers in the first few months after giving birth. But if anything, according to this article by Terry Sejnowski, a renowned computational neurobiologist, an entire night without sleep can actually lift symptoms of depression in such afflicted women. Unfortunately, this is no miracle cure. You cannot escape the lack of concentration, irritability and memory loss inevitable after a night without sleep, and even the shortest nap can break the spell. Exactly how sleep deprivation alleviates symptoms of depression is still largely unknown, but one avenue scientists are currently exploring focuses on general sleep patterns, with a particular focus on the rapid eye-movement (REM) stages of sleep. A link between REM sleep and depression has already been made: some common antidepressants block REM sleep, and people with a genetic predisposition to entering REM sleep very early in their sleep cycle are at greater risk of becoming depressed. Thus, while staying awake for one night may not cure or even effectively treat depression in the longer term, the alleviating effects of staying awake may give scientists an interesting direction for research into depression.
Friday, 16 April 2010
Freud's full circle?
While Freud's theories have had an enormous impact on psychiatry – psychoanalysis today still uses methods similar to the ones Freud developed at the beginning of the 20th century – they have long been engulfed in controversy. Freud's psychoanalytical thinking focused on understanding human behaviour by gaining access to the unconscious mind. In a typical session on Freud's sofa you might talk about your dreams and fantasies, letting your mind wander and speak without controlling your thoughts. Freud would listen to you, absorbing your thoughts and interpreting them, unravelling the unconscious conflicts that caused the symptoms for which you came to the session. Unveiling and subsequently dealing with these unconscious conflicts would cure the original symptoms of your mental instability.
One of the major criticisms of Freud lies in the lack of experimental scrutiny surrounding his methods of baring the unconscious. Such lack of experimental evidence was, and still is, seen as unscientific. In the 1960s and 70s, however, the idea of the unconscious re-emerged and became of particular interest to neuropsychologists trying to understand seemingly unconscious processes in split-brain patients and in disorders such as Alien Hand Syndrome. In split-brain patients, all the connecting fibres between the two sides of the brain were surgically cut to alleviate severe symptoms of epilepsy, leaving no direct routes for communication between the two halves of the brain. While this undoubtedly helped reduce the severity of symptoms, the procedure also had some other interesting effects. In a series of experiments that went on to win him a Nobel Prize, Roger Sperry showed that each hemisphere could seemingly have its own simultaneous system of volition. For instance, when he showed a split-brain patient a picture on the left side of a computer screen, which is processed by the right side of the brain – the side that usually does not contain the language areas – they would tell him they had not seen anything. However, when he then asked them to select an object from several alternatives with their left hand (the one controlled by the right hemisphere), they would choose the object that had been presented to them just a second before, even though they could not express why they had picked that exact object.
While complete sectioning of the corpus callosum is no longer generally performed, similarly bizarre “unconscious” desires also manifest themselves in patients with brain damage that affects this region. For instance, in patients with Alien Hand Syndrome one hand does something completely different from, and independent of, the other. Perhaps the most famous example was Dr. Strangelove, who had to keep one hand in check with the other. Another compelling example is that of a woman who was determined to smoke a cigarette, but whenever one hand put the cigarette in her mouth, the other would grab it and throw it away.
In fact, as Larry Weiskrantz, Emeritus Professor of Neuropsychology at Oxford, has pointed out, a curious facet of many clinical syndromes caused by brain damage is that, while patients may lose particular conscious faculties, such as being able to recall past events or identify people by their faces, they still retain “unconscious” abilities to do exactly these things. A patient with prosopagnosia may not consciously be able to recognise faces as a result of damage to the temporal lobe, a region in the lower part of the brain particularly important for memory, but will still show changes in arousal when seeing someone familiar.
Today, with the ability to look inside the human brain while someone is ‘thinking’, we can observe the processes that go on inside, even the unconscious ones. With such brain imaging techniques, neuroeconomists have already started to gain insight into unconscious thought processing by showing that when we make economic decisions, for instance buying something on eBay, we tend to depend much less on conscious, rational deliberation and much more on subconscious gut feeling and emotion. Professor John-Dylan Haynes at the Bernstein Center for Computational Neuroscience in Berlin made a perhaps even more intriguing discovery: by looking at someone’s pattern of brain activity with functional neuroimaging, he was able to predict what a person was going to do, and when, nearly 10 seconds before they actually did it.
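Predictions of this kind rest on pattern classification: a decoder is trained to recognise which spatial patterns of activity precede each choice. The toy sketch below, run on synthetic data, shows only the general shape of the approach; it is in no way the study's actual analysis pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic "activity patterns": 200 trials x 50 voxels, with a faint
# trace of the upcoming choice (0 or 1) in the first five voxels.
n_trials, n_voxels = 200, 50
choice = rng.integers(0, 2, n_trials)
activity = rng.normal(size=(n_trials, n_voxels))
activity[:, :5] += 0.8 * choice[:, None]

# Train a linear decoder and estimate accuracy by cross-validation:
# can the pattern predict the choice better than the 50% chance level?
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, activity, choice, cv=5).mean()
print(f"decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```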
In an article published in Brain this week, Robin Carhart-Harris and Karl Friston argue that, with the aid of these brain-imaging techniques, Freudian concepts might now be tested experimentally. Until recently, one of the most common ways to analyse brain imaging data was to compare networks of brain activation during a specific task directly with networks of activation during periods where the brain was assumed to be at rest. However, over the past ten years, research pioneered by Marcus Raichle started looking into what was actually going on in the brain during these periods of rest. Surprisingly, he and his colleagues noticed that the patterns of activity during rest were remarkably consistent, which led him and other researchers to propose the existence of a “default” network. According to Carhart-Harris and Friston, this default network might represent intrinsic internal thought remarkably consistent with the unconscious thought processes of Freud’s later theories. Many of the key principles of Freud's theory, they argue, such as 'the ego' (our conscious self) and 'the id' (our unconscious self), echo our current knowledge of how the brain functions on a global level (i.e. one set of brain areas is active during conscious processing, another during unconscious processing).
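Resting-state networks of this kind are typically mapped by correlating spontaneous activity across regions: regions whose signals rise and fall together are assigned to the same network. A minimal sketch of such a seed-based correlation analysis, on invented time series, might look like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic resting-state time series (300 time points). A slow shared
# fluctuation couples the "seed" region to one region but not the other.
t = np.arange(300)
slow = np.sin(2 * np.pi * t / 100)

seed = slow + 0.5 * rng.normal(size=300)
coupled = slow + 0.5 * rng.normal(size=300)     # same putative network
unrelated = rng.normal(size=300)                # independent region

for name, series in [("coupled", coupled), ("unrelated", unrelated)]:
    r = np.corrcoef(seed, series)[0, 1]
    print(f"correlation with seed ({name} region): {r:+.2f}")
# Regions whose spontaneous activity fluctuates with the seed's are
# grouped into one network; this is how resting-state networks are mapped.
```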
Could it be that, after his initial success and subsequent fall from grace, Freud has now come full circle? Appropriately, it turns out that Freud himself had originally attempted a not dissimilar scientific approach in his Project for a Scientific Psychology, written in 1895. In this neurophysiological theory he suggested that the transfer of energy between neurons in the brain caused unconscious processes, but in the years that followed he decided that neuronal processing, as understood at the time, was much too complex for such an interpretation. Instead, as a result of his analyses of dreams, he proposed that the unconscious was the product of highly condensed, symbolic thoughts – the primary processes – and the conscious a highly rational and logical way of thinking – the secondary processes. That neuroscientists are, consciously or unconsciously, currently returning to these ideas would likely have amused Freud.
This post also appeared on Cherwell's Matter Scientific
One of the major criticisms of Freud lies in the lack of experimental scrutiny that surrounds his methods of baring the unconscious. Such lack of experimental evidence was, and still is, seen as unscientific. In the 1960s and 70s however, the idea of the presence of the unconscious re-emerged and became of particular interest for neuropsychologists who were trying to gain understanding in seemingly unconscious processes in split-brain patients and in disorders such as Alien Hand Syndrome. In split-brain patients, all the connecting fibres between the two sides of the brain were surgically cut to alleviate severe symptoms of epilepsy such that there are no direct routes for communication between the two halves of the brain any more. While this undoubtedly helped reduced the severity of symptoms, this procedure also had some other interesting effects. In a series of experiments that went on to gain him a Nobel Prize, Roger Sperry showed that each hemisphere could seemingly have simultaneous systems of volition. For instance, when he showed a split-brain patient a picture on the left side of a computer screen, which will be processed by the right side of the brain, the side that usually does not contain the language areas, they would tell him they had not seen anything. However, when he then asked them to select an object from several alternatives with their left hand (the one controlled by the right hemisphere), they would choose the object that was presented to them just a second ago even though they could not express why they had picked that exact object.
While complete sections of the corpus callosum tend no longer to be performed, similar bizarre “unconscious” desires also manifest themselves in patients with particular brain damage that affects this region. For instance, in patients with Alien Hand Syndrome one hand does something completely different and independent from the other. Perhaps the most famous example was Dr. Strangelove, who had to keep one hand in control with the other. Another compelling example is that of a woman who was determined to smoke a cigarette, but whenever her one hand had put the cigarette in her mouth, the other would grab it and throw it away.
In fact, as Emeritus Professor of Neuropsychology at Oxford Larry Weiskrantz has pointed out, a curious facet of many clinical syndromes caused by brain damage is that, while these patients may lose particular conscious faculties such as being able to recall past events or identify people by their faces, they still retain “unconscious” abilities to do exactly these things. A patient with prosopagnosia may not consciously be able to recognise faces as a result of damage to the temporal lobe, a region in the lower part of the brain particularly important for memory, but will still able to show changes in arousal when seeing someone familiar.
Today, with the ability to look inside the human brain while someone is ‘thinking’, we can observe the processes that go on inside, even the unconscious ones. With such brain imaging techniques neuroeconomists have already started to gain insight into unconscious thought processing by showing that when we make economic decisions, for instance buying something on eBay, we tend to depend much less on our conscious, rational deliberation and much more on subconscious gut feeling and emotion. Perhaps Professor John-Dylan Haynes at the Bernstein Center for Computational Neuroscience in Berlin made an even more intriguing discovery: he was able to predict, by looking at someone’s pattern of brain activity with functional neuroimaging, what a person is going to do and when they will do it nearly 10 seconds before he or she actually does it.
In an article published in Brain this week, Robin Carhart-Harris and Karl Friston argue that with the aid of these brain-imaging techniques, Freudian concepts might now be tested experimentally. Until recently, one of the most common ways to analyse brain imaging data was to directly compare networks of brain activation during a specific task to networks of activation during periods where the brain was assumed to be at rest. However, over the past ten years, research pioneered by Marcus Raichle started looking into what was actually going on in the brain during these periods of rest. Surprisingly, he and his colleagues noticed that the patterns of activity during rest periods were remarkably consistent, which lead him and other researchers to suggest the existence of a “default” network. According to Carhart-Harris and Friston this default network might represent intrinsic internal thought remarkably consistent with the unconscious thought processes in Freud’s later theories. Many of the key principles of Freud's theory they argue, such as 'the ego' (our conscious self) and 'the id' (our unconscious self), echo our current knowledge of how the brain functions on a global level (i.e. a different set of areas in the brain that is active during conscious processing from the set that is active during unconscious processing).
Could it be that, after his initial success and subsequent fall from grace, Freud has now come full circle? Appropriately, it turns out that even Freud himself had originally attempted a not dissimilar scientific approach in his Project for a Scientific Psychology, written in 1895. In this neurophysiological theory he suggested that the transfer of energy between neurons in the brain caused unconscious processes, but in the years that followed he decided that neuronal processing, as understood at the time, was much too complex for such an interpretation. Instead, as a result of his analyses of dreams, he proposed that the unconscious was the product of highly condensed, symbolic thoughts – the primary processes – and the conscious of a highly rational and logical way of thinking – the secondary processes. That neuroscientists are, consciously or unconsciously, currently returning to these ideas would likely have amused Freud.
This post also appeared on Cherwell's Matters Scientific
Thursday, 8 April 2010
A day at the museum ...
Much to my shame I have to admit that it has taken me well over two years of living in Oxford to visit the Pitt Rivers Museum and the History of Science Museum. Now that I finally have, I want to spread the word and urge everyone to go.
In 1884, Augustus Henry Lane Fox Pitt Rivers gave his collection of nearly 20,000 objects, gathered from many different cultures all over the world, to the University of Oxford. As Pitt Rivers wanted it, the collection, spanning two floors of the beautiful open-plan building behind the Natural History Museum (itself worth a visit - don't forget to look up to see the amazing glass roof!), is grouped according to how the objects were made and what they were used for, rather than on the basis of their age and cultural origin. This makes it easy to compare objects both through the ages and between cultures. The museum displays an excellent selection of medical tools through the ages and, in one of the many drawers you're allowed to open, you can find a collection of pendants, dead frogs, mole feet, and even a 20-year-old hot cross bun, all once used to cure diseases. Or move on to one of the most macabre things I've ever seen on display in a museum: the shrunken heads. The heads belonged to the less fortunate in battle and were hung around the neck of the conqueror.
Once you’ve had your fill of the Pitt Rivers’ gothic charms, you can turn your attention to the relics of past scientific endeavours down the road at the History of Science Museum. To gain entrance to the 17th-century building which houses the collection, you have to pass by four imposing “emperors' heads” and climb a grandiose set of steps, but once inside, it is the small scale and beauty of the place that strikes you. Throughout, you can find an amazing collection of scientific instruments, from the first camera to the largest collection of ancient astrolabes in the UK. Perhaps the most famous object in the museum is the blackboard bearing Einstein's notes from a 1931 lecture given in Oxford on some of the fundamental questions in cosmology. Rivalling the gruesome fascination of the Pitt Rivers’ shrunken heads is the display of medical instruments through the ages. As I stood in front of it, I couldn’t help but imagine the sheer brutality of the procedures and the pain people in the Middle Ages must have gone through to have their limbs amputated. The museum also features special exhibitions alongside its permanent collection. In Steampunk – at once the exhibition's name and an art form – the marriage between science and art was wonderfully displayed in objects (scroll down for some images) whose futuristic ideas are at the same time reminiscent of the past in the intricacy of their construction.
It is nice to know that one does not have to travel all the way to London to have one's imagination transported and to see the ways in which human ingenuity, engineering and superstition have all played a role in our scientific progress.
Thursday, 25 March 2010
The evolution of God?
In his Lent Lectures (scroll down for the text) the Reverend Professor Alister McGrath reflects on the relationship between natural sciences, faith and religion. He talks about the wonders of scientific discovery and where it has led us. Ultimately, he says, science is neutral. The good that has been done with scientific progress is as much down to human nature as is the evil it has been used for.
But McGrath captures perhaps a more profound point with this quote from the famous philosopher of science Karl Popper: "Science doesn't make assertions about ultimate questions - about the riddles of existence." Science has shed light on many of the mechanisms that surround and even compose us - we have a much better understanding of our body and brain thanks to science, we can talk about climate change because, among other things, we understand our own atmosphere, and our explorations reach far beyond the solar system. However, it has not (yet?) been able to answer the ultimate questions human beings tend to ask themselves: 'Why am I here?', 'What is the purpose of all this?'.
Could an inability to answer such questions be the cause of the human need for religion? In part one of the BBC Radio 4 programme 'God on my mind', Matthew Taylor goes on a journey to discover why so many of us feel the need for a higher power to make sense out of our being. He concludes that "religion depends on its own ability to change and adapt".
Monday, 15 March 2010
Man-made disaster? Cultural image of testosterone and not just its biological effects may contribute to our bargaining strategies
“Both feminist and mainstream economists have pointed out that the credit crunch is quite literally a man-made disaster, a monster created in the testosterone-drenched environment of Wall Street and the City.”
This quote from Ruth Sunderland in the Guardian – and there are many more examples in a similar vein – suggests that the global recession was at least partly driven by the hormone that has, in Western culture, become inherently associated with aggressive, masculine and risky behaviour. But to what degree is it fair to connect the crisis of the world economy to a single hormone?
University of Zurich economist Ernst Fehr has long believed that theoretical economics has failed to take adequate account of our social and biological nature when building models of decision-making. In a series of earlier experiments, scientists led by Fehr demonstrated that people could be made more trusting in an economic game simply by being given a dose of oxytocin, a hormone previously implicated in social bonding and maternal behaviour. He has now turned his attention to whether testosterone might also alter the way we interact and trade.
One paradigm used by Fehr to probe how social factors influence our decisions is the Ultimatum Game. In this game, a proposer divides a sum of money between himself and a responder, and the responder then decides whether to accept or reject the offer. If accepted, both players get the money as split by the proposer; otherwise, neither receives anything.
Although this game is played only once and participants often never meet, responders have consistently been shown to reject offers which strongly favour the proposer. The proposer’s task therefore is to find an appropriate balance between fairness and selfish monetary gain.
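The game's payoff structure is simple enough to capture in a few lines of code. The sketch below is purely illustrative: the stake and the responder's ‘fairness threshold’ are assumptions of mine, not parameters from Fehr's experiment.

```python
# Toy model of a one-shot Ultimatum Game (illustrative only; the stake and
# the responder's fairness threshold are assumed, not Fehr's parameters).

def ultimatum_round(total, offer, accept_threshold):
    """Proposer offers `offer` out of `total`; the responder accepts only if
    the offer meets their fairness threshold, otherwise both get nothing."""
    if offer >= accept_threshold:
        return total - offer, offer  # (proposer payoff, responder payoff)
    return 0, 0

# A purely self-interested responder would accept any positive offer, but
# real responders reliably reject splits they perceive as too unfair:
print(ultimatum_round(10, 2, accept_threshold=4))  # (0, 0): offer rejected
print(ultimatum_round(10, 5, accept_threshold=4))  # (5, 5): offer accepted
```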
In January of this year, Fehr and colleagues published their first findings on the effects of testosterone in the Ultimatum Game in the journal Nature. Against expectations, rather than making them more risk-taking and uncooperative, the women who received testosterone became more generous in their role as proposer, offering significantly more money to the responder than those who had taken a placebo.
Arguably, however, the most intriguing results emerged when Fehr examined the offers made by those participants who simply believed they had been given the drug, irrespective of whether they actually had. Those who reported that they had taken the hormone played the game less fairly, proposing to split the money significantly more unevenly than the women who believed they had received the placebo. In other words, while testosterone was working to encourage fairer offers, long-held notions of how testosterone should make us behave were paradoxically promoting selfish behaviour.
The precise interaction between hormones and economic behaviour, however, is likely to be complicated. Two studies published in Proceedings of the National Academy of Sciences last year reported contradictory results, with higher levels of testosterone being linked to reduced risk aversion in female MBA students but not in post-menopausal women. Moreover, testosterone may actually promote better trading in some real-life situations: in an experiment conducted on a City of London trading floor, John Coates and Joe Herbert showed that professional traders had higher levels of testosterone on days on which they enjoyed above-average returns.
During his 2010 Clarendon Lectures at the University of Oxford, Fehr argued that testosterone may only lead to aggression and risk-taking when such behaviour is necessary to uphold status in social situations. If true, it might turn out that testosterone is not the “monster of Wall Street and the City” but instead only turns into this when people feel required to take great risks to comply with expectations.
Thursday, 4 February 2010
The name Samantha tastes like bubblegum
“The number 4 is a bright acid-yellow and 5 is crayola-blue. Together they should make 8, which is a bright green, but instead they really make 9, which is wet-dirt-brown. It has never made sense to me. Algebra is what makes X turn brown, too. Letters least of all should be brought into that mess.”
If you are like me, this will undoubtedly not make any sense to you. However, to approximately one in twenty people it may, at least to a certain degree, seem familiar, even if they might disagree vehemently on the exact pairings of colours and numbers. The quotation above was written by a sixteen-year-old girl with synaesthesia. Synaesthesia (from syn, ‘together’, and aisthesis, ‘sensation’) is a neurological condition in which one type of sensation instantly and involuntarily co-occurs with another. This can happen between any of the senses – days of the week may have their own particular colours, G major a particular smell, and a triangle a specific taste.
While the concept of synaesthesia is not new – ancient Greek philosophers investigated the link between colour and music, and Newton suggested that colours and sounds may have similar frequencies – it was not until the 1980s that scientists started investigating synaesthesia in earnest.
Those without synaesthesia may wonder whether it is simply a set of made-up or delusional associations. However, one of the characteristics of synaesthesia is that the pairings between sensations remain stable over time, often life-long. Moreover, these sensations seem to have an organic basis: patterns of activity in a synaesthete’s brain reflect both the appropriate and the paired sensation, as if the brain ‘really’ were perceiving both types of stimulation. For example, while anyone’s auditory cortex will be activated when listening to music, a sound-colour synaesthete will also activate the visual cortex, reflecting the colours simultaneously experienced in their mind. Further investigation into synaesthetes’ brains has led to the discovery of ‘hyperconnectivity’: the existence of many more pathways between the cortical regions that process different sorts of sensory information than in a typical brain, perhaps allowing more opportunity for different sorts of sensory information to cross over.
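The stability test itself is conceptually simple: ask someone to pick colours for the same letters or numbers months apart and measure how far the chosen colours drift. Here is a toy version of that logic; the colour values are invented for illustration, not drawn from any real study.

```python
# Toy test-retest consistency check: colours chosen for the same inducers
# months apart should barely move for a synaesthete, while a control is
# essentially guessing anew. RGB values below are invented.
import math

def mean_colour_shift(session1, session2):
    """Average Euclidean distance in RGB space between the colours chosen
    for the same inducer in two sessions (lower = more consistent)."""
    return sum(math.dist(session1[k], session2[k]) for k in session1) / len(session1)

synaesthete_t1 = {"A": (255, 0, 0), "B": (0, 0, 255)}
synaesthete_t2 = {"A": (250, 8, 4), "B": (6, 2, 248)}   # nearly identical choices
control_t1 = {"A": (255, 0, 0), "B": (0, 0, 255)}
control_t2 = {"A": (20, 200, 60), "B": (255, 240, 0)}   # effectively re-guessed

print(mean_colour_shift(synaesthete_t1, synaesthete_t2))  # small shift
print(mean_colour_shift(control_t1, control_t2))          # much larger shift
```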
Behavioural studies in babies have shown that our brains are hyperconnected at birth and that, as part of the maturation process, we lose this hyperconnectivity in the first few months to years of our lives. Although a specific gene has not yet been discovered, synaesthesia tends to run in families, and this has led researchers to believe that a genetic abnormality prevents the brain from completing cortical maturation, thus leaving it hyperconnected.
Even though synaesthesia is a neurological condition, it is difficult to claim that people suffer from it. Many highly creative people, such as Nabokov, Kandinsky, and Messiaen, were synaesthetes, and for most it is just as innate as the colour of their eyes or the size of their feet. This ‘normalness’ of the condition may well be the reason why its prevalence in the general population has been estimated at anywhere between 1 in 20,000 and 1 in 20 people (the latter being the more likely estimate).
Although this may still leave the vast majority of us without such abilities, it is often overlooked that we all possess some synaesthetic tendencies. For instance, Professor Charles Spence of the University of Oxford showed, in an experiment conducted at Heston Blumenthal’s award-winning restaurant, that sounds play a particularly important role in our perception of food: a bacon-and-egg ice cream was perceived as tasting more strongly of bacon when it was accompanied by the sizzling sound of bacon being fried than when no such sound was present – the result of sensations crossing over. Similarly, many of us automatically associate shapes and sounds (does “Kiki” sound sharp or “Bouba” rounded?) or even perceive certain names as sexier than others. For a fun exploration of your own synaesthetic tendencies, test yourself here.
As someone who does not have the slightest hint of a synaesthetic mind, I can only wish I could, if only for one day, return to my infant state and see the days of the week in vivid colour once more, let Kandinsky’s paintings make music in my mind, and finally find out what circles really taste like …
This post was written for and published on Cherwell's Matters Scientific
Saturday, 16 January 2010
Radiolab's exciting scientific journeys
I have recently discovered a wonderful science-based radio programme on New York Public Radio called Radiolab. Since that first discovery, I have downloaded and listened to many of the podcasts in which hosts Jad Abumrad and Robert Krulwich explore the world of science in a unique and thoughtful way. Their genuine curiosity about the topics is what makes this programme so beautiful, sometimes even touching; and because the topics are often easy to relate to, it is not difficult to share the excitement. I could say a lot more about it, but this quote, taken from their website, says it all:
"Radiolab believes your ears are a portal to another world. Where sound illuminates ideas, and the boundaries blur between science, philosophy, and human experience. Big questions are investigated, tinkered with, and encouraged to grow. Bring your curiosity, and we'll feed it with possibility."
I hope you will enjoy this programme as much as I do!
"Radiolab believes your ears are a portal to another world. Where sound illuminates ideas, and the boundaries blur between science, philosophy, and human experience. Big questions are investigated, tinkered with, and encouraged to grow. Bring your curiosity, and we'll feed it with possibility."
I hope you will enjoy this programme as much as I do!
Wednesday, 13 January 2010
Birthdays and battle lines
Happy New Year!
This year promises to be an interesting one scientifically, not least because that venerable institution, the Royal Society, celebrates its 350th birthday, which is being marked with a year of events and activities. To start things off, Radio 4 and Melvyn Bragg presented an incisive and in-depth four-part history of the Society, from its early days in Wadham College here in Oxford – when the language had no term for ‘scientist’ and instead "natural philosophers" discussed ideas and witnessed early (and sometimes strange, even barbaric-sounding) experiments – to its present incarnation.
An interesting counterpoint can perhaps be observed in the travails of another old scientific society, the Royal Institution, and the removal of Susan Greenfield as its director. Away from the politics that appears to accompany this, there is an interesting post by Mark Henderson in the Times Online questioning the need for scientific popularisers and mediators between journalists and the researchers themselves.
It is a valid question: does someone who makes a career of explaining science really still have time to conduct research and keep abreast of current developments in their own specialist field? But equally, can a journalist stay on top of everything from stem cells to quarks, and thus be able to interrogate each new scientific finding, without some kind of expert mediation to suggest the sorts of questions to pose?