Faculty of Biology

Ability of multi-drug resistant infection to evolve within cystic fibrosis patients highlights need for rapid treatment

Thu, 29/04/2021 - 19:00

Around one in 2,500 children in the UK is born with cystic fibrosis, a hereditary condition that causes the lungs to become clogged up with thick, sticky mucus. The condition tends to decrease life expectancy among patients.

In recent years, M. abscessus, a species of multi-drug resistant bacteria, has emerged as a significant global threat to individuals with cystic fibrosis and other lung diseases. It can cause a severe pneumonia leading to accelerated inflammatory damage to the lungs, and may prevent safe lung transplantation. It is also extremely difficult to treat – fewer than one in three cases is treated successfully.

In a study published today in Science, a team led by scientists at the University of Cambridge examined whole genome data for 1,173 clinical M. abscessus samples taken from 526 patients to study how the organism has evolved – and continues to evolve. The samples were obtained from cystic fibrosis clinics in the UK, as well as centres in Europe, the USA and Australia.

The team found two key processes that play an important part in the organism’s evolution. The first is known as horizontal gene transfer – a process whereby the bacteria pick up genes or sections of DNA from other bacteria in the environment. Unlike classical evolution, which is a slow, incremental process, horizontal gene transfer can lead to big jumps in the pathogen’s evolution, potentially allowing it to become suddenly much more virulent.

The second process is within-host evolution. As a consequence of the shape of the lung, multiple versions of the bacteria can evolve in parallel – and the longer the infection exists, the more opportunities they have to evolve, with the fittest variants eventually winning out. Similar phenomena have been seen in the evolution of new SARS-CoV-2 variants in immunocompromised patients.

Professor Andres Floto, joint senior author from the Centre for AI in Medicine (CCAIM) and the Department of Medicine at the University of Cambridge and the Cambridge Centre for Lung Infection at Royal Papworth Hospital, said: “What you end up with is parallel evolution in different parts of an individual’s lung. This offers bacteria the opportunity for multiple rolls of the dice until they find the most successful mutations. The net result is a very effective way of generating adaptations to the host and increasing virulence. 

“This suggests that you might need to treat the infection as soon as it is identified. At the moment, because the drugs can cause unpleasant side effects and have to be administered over a long period of time – often as long as 18 months – doctors usually wait to see if the bacteria cause illness before treating the infection. But what this does is give the bug plenty of time to evolve repeatedly, potentially making it more difficult to treat.”

Professor Floto and colleagues have previously advocated routine surveillance of cystic fibrosis patients to check for asymptomatic infection. This would involve patients submitting sputum samples three or four times a year to check for the presence of M. abscessus infection. Such surveillance is carried out routinely in many centres in the UK.

Using mathematical models, the team have been able to step backwards through the organism’s evolution in a single individual and recreate its trajectory, looking for key mutations in each organism in each part of the lung. By comparing samples from multiple patients, they were then able to identify the key set of genes that enabled this organism to change into a potentially deadly pathogen.

These adaptations can occur very quickly, but the team found that their ability to transmit between patients was constrained: paradoxically, those mutations that allowed the organism to become a more successful pathogen within the patient also reduced its ability to survive on external surfaces and in the air – the key mechanisms by which it is thought to transmit between people. 

Potentially one of the most important genetic changes witnessed by the team was one that contributed towards M. abscessus becoming resistant to nitric oxide, a compound naturally produced by the human immune system. The team will shortly begin a clinical trial aimed at boosting nitric oxide in patients’ lungs by using inhaled acidified nitrite, which they hope will become a novel treatment for the devastating infection.

Examining the DNA taken from patient samples is also important in helping understand routes of transmission. Such techniques are used routinely in Cambridge hospitals to map the spread of infections such as MRSA and C. difficile – and more recently, SARS-CoV-2. Insights into the spread of M. abscessus helped inform the design of the new Royal Papworth Hospital building, opened in 2019, which has a state-of-the-art ventilation system to prevent transmission. The team recently published a study showing that this ventilation system was highly effective at reducing the amount of bacteria in the air.

Professor Julian Parkhill, joint senior author from the Department of Veterinary Medicine at the University of Cambridge, added: “M. abscessus can be a very challenging infection to treat and can be very dangerous to people living with cystic fibrosis, but we hope insights from our research will help us reduce the risk of transmission, stop the bug evolving further, and potentially prevent the emergence of new pathogenic variants.”

The team have used their research to develop insights into the evolution of M. tuberculosis – the pathogen that causes TB – around 5,000 years ago. In a similar way to M. abscessus, M. tuberculosis likely started life as an environmental organism, acquired genes by horizontal transfer that made particular clones more virulent, and then evolved through multiple rounds of within-host evolution. While M. abscessus is currently stopped at this evolutionary point, M. tuberculosis evolved further to be able to jump directly from one person to another.

Dr Lucy Allen, Director of Research at the Cystic Fibrosis Trust, said: “This exciting research brings real hope of better ways to treat lung infections that are resistant to other drugs. Our co-funded Innovation Hub with the University of Cambridge really shows the power of bringing together world-leading expertise to tackle a health priority identified by people with cystic fibrosis. We’re expecting to see further impressive results in the future coming from our joint partnership.”

The study was funded by the Wellcome Trust, Cystic Fibrosis Trust, NIHR Cambridge Biomedical Research Centre and Fondation Botnar.

Reference
Bryant, JM et al. Stepwise pathogenic evolution of Mycobacterium abscessus. Science; 30 Apr 2021

Scientists have been able to track how a multi-drug resistant organism is able to evolve and spread widely among cystic fibrosis patients – showing that it can evolve rapidly within an individual during chronic infection. The researchers say their findings highlight the need to treat patients with Mycobacterium abscessus infection immediately, counter to current medical practice.



The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.


Scientists develop new class of cancer drug with potential to treat leukaemia

Mon, 26/04/2021 - 16:00

Our genetic code is written in DNA, but in order to generate proteins – molecules that are vital to the function of living organisms – DNA first needs to be converted into RNA. The production of proteins is controlled by enzymes, which make chemical changes to RNA. Occasionally these enzymes become mis-regulated, being produced in over-abundance.

In a study published in 2017, a team led by Professor Tony Kouzarides from the Milner Therapeutics Institute and the Gurdon Institute at the University of Cambridge showed how one such enzyme, METTL3, plays a key role in the development and maintenance of acute myeloid leukaemia. The enzyme becomes over-expressed – that is, over-produced – in certain cell types, leading to the disease.

Acute myeloid leukaemia (AML) is a cancer of the blood in which bone marrow produces abnormal white blood cells known as myeloid cells, which normally protect the body against infection and against the spread of tissue damage. AML proceeds rapidly and aggressively, usually requiring immediate treatment, and affects both children and adults. Around 3,100 people are diagnosed with the condition every year in the UK, the majority of whom are over 65 years of age.

Now, Professor Kouzarides and colleagues at STORM Therapeutics, a Cambridge spinout associated with his team, and the Wellcome Sanger Institute, have identified a drug-like molecule, STM2457, that can inhibit the action of METTL3. In tissue cultured from individuals with AML and in mouse models of the disease, the team showed that the drug was able to block the cancerous effect caused by over-expression of the enzyme.

Professor Kouzarides said: “Proteins are essential for our bodies to function and are produced by a process that involves translating our DNA into RNA using enzymes. Sometimes, this process can go awry with potentially devastating consequences for human health. Until now, no one has targeted this essential process as a way of fighting cancer. This is the beginning of a new era for cancer therapeutics.”

To investigate the anti-leukaemic potential of STM2457, the researchers tested the drug on cell lines derived from patients with AML and found that it significantly reduced the growth and proliferation of these cells. It also induced apoptosis – programmed cell death – killing off the cancerous cells.

The researchers transplanted cells from patients with AML into immunocompromised mice to model the disease. When they treated the mice with STM2457, they found that it impaired the proliferation and expansion of the transplanted cells and significantly prolonged the lifespan of the mice. It reduced the number of leukaemic cells in the mouse bone marrow and spleen, while showing no toxic side effects, including no effect on body weight.

Dr Konstantinos Tzelepis from the Milner Therapeutics Institute at the University of Cambridge and the Wellcome Sanger Institute added: “This is a brand-new field of research for cancer and the first drug-like molecule of its type to be developed. Its success at killing leukaemia cells and prolonging the lifespans of our mice is very promising and we hope to begin clinical trials to test successor molecules in patients as early as next year.

“We also believe that this approach – of targeting these enzymes – could be used to treat a wide range of cancers, potentially offering us a new weapon in our arsenal against these terrible diseases.”

Michelle Mitchell, Chief Executive of Cancer Research UK, said: "This work is yet another example of how our researchers strive to get new cancer treatments into the clinic and improve outcomes for cancer patients. 

"Acute myeloid leukaemia is an aggressive form of cancer which grows rapidly. Treatment is required as soon as possible after diagnosis, which means research like this can't come soon enough. 

"We look forward to seeing the outcomes of the phase 1 trial and the benefits it may have for AML sufferers and their families in the future."

The research was supported by Cancer Research UK, the European Research Council, Wellcome, the Kay Kendall Leukaemia Fund, and Leukaemia UK.

STORM Therapeutics is a University of Cambridge spin-out, supported by Cambridge Enterprise. It specialises in translating research in RNA epigenetics into the discovery of first-in-class drugs in oncology and other diseases.

Reference
Yankova, E et al. Small molecule inhibition of METTL3 as a therapeutic strategy for acute myeloid leukaemia. Nature; 26 Apr 2021; DOI: 10.1038/s41586-021-03536-w

Scientists have made a promising step towards developing a new drug for treating acute myeloid leukaemia, a rare form of blood cancer. In a study published today in Nature, Cambridge researchers report a new approach to cancer treatment that targets enzymes which play a key role in translating DNA into proteins and which could lead to a new class of cancer drugs.





Simple treatment during pregnancy can protect baby from memory problems in later life, study in rats suggests

Wed, 21/04/2021 - 00:01

Low oxygen in the womb - known as chronic fetal hypoxia - is one of the most common complications in human pregnancy. It can be diagnosed when a routine ultrasound scan shows that the baby is not growing properly and is caused by a number of conditions including pre-eclampsia, infection of the placenta, gestational diabetes or maternal obesity. 

The new results show that chronic fetal hypoxia leads to a reduced density of blood vessels, and a reduced number of nerve cells and their connections in parts of the offspring’s brain. When the offspring reaches adulthood, its ability to form lasting memories is reduced and there is evidence of accelerated brain ageing. 

Vitamin C, an anti-oxidant, given to pregnant rats with chronic fetal hypoxia was shown to protect the future brain health of the offspring. The results are published today in the journal FASEB J.

“It’s hugely exciting to think we might be able to protect the brain health of an unborn child by a simple treatment that can be given to the mother during pregnancy,” said Professor Dino Giussani from the University of Cambridge’s Department of Physiology, Development and Neuroscience, who led the study.

The researchers used Vitamin C because it is a well-established and widely used anti-oxidant. However, only high doses were effective, which could cause adverse side-effects in humans. Follow-up studies are now searching for alternative anti-oxidants to treat chronic fetal hypoxia in humans.

To conduct the research, one group of pregnant rats was kept in air containing 13% oxygen – causing hypoxic pregnancies – while the rest were kept in normal air (21% oxygen). Half of the rats in each group were given Vitamin C in their drinking water throughout pregnancy. Following birth, the offspring were raised to four months of age, equivalent to early adulthood in humans, and then underwent various tests to assess locomotion, anxiety, spatial learning and memory.

The study found that rats born from hypoxic pregnancies took longer to perform the memory task, and didn’t remember things as well. Rats born from hypoxic pregnancies in which mothers had been given Vitamin C throughout their pregnancy performed the memory task just as well as offspring from normal pregnancies. 

Analysing the brains of the rat offspring, the researchers found that the hippocampus - the area associated with forming memories – was less developed in rats from hypoxic pregnancies. 

In deeper analysis, the scientists showed that hypoxic pregnancy causes excess production of reactive oxygen species, called ‘free radicals’, in the placenta. In healthy pregnancy the body keeps the level of free radicals in check by internal anti-oxidant enzymes, but excess free radicals overwhelm these natural defences and damage the placenta in a process called ‘oxidative stress’.  This reduces blood flow and oxygen delivery to the developing baby.

In this study, placentas from the hypoxic pregnancies showed oxidative stress, while those from the hypoxic pregnancies supplemented with Vitamin C looked healthy.

Taken together, these results show that low oxygen in the womb during pregnancy causes oxidative stress in the placenta, affecting the brain development of the offspring and resulting in memory problems in later life. 

“Chronic fetal hypoxia impairs oxygen delivery at critical periods of development of the baby’s central nervous system. This affects the number of nerve connections and cells made in the brain, which surfaces in adult life as problems with memory and an earlier cognitive decline,” said Dr Emily Camm from Cambridge’s Department of Physiology, Development and Neuroscience, first author of the report, who has recently taken up a new position at The Ritchie Centre in Australia.

The interaction between our genes and lifestyle plays a role in determining our risk of disease as adults. There is also increasing evidence that the environment experienced during sensitive periods of fetal development directly influences our long-term health - a process known as ‘developmental programming.’ 

Brain health problems that may start in the womb due to complicated pregnancy range from attention deficit hyperactivity disorder, to brain changes in later life that have been linked with Alzheimer’s disease. 

“In medicine today there has to be a shift in focus from treatment of the disease, when we can do comparatively little, to prevention, when we can do much more. This study shows that we can use preventative medicine even before birth to protect long term brain health,” said Giussani.

The research was funded by The British Heart Foundation and The Medical Research Council, and the programme of work was approved by the University of Cambridge Animal Welfare and Ethical Review Board.

Reference
Camm et al: ‘Maternal antioxidant treatment protects adult offspring against memory loss and hippocampal atrophy in a rodent model of developmental hypoxia.’ The FASEB Journal, April 2021. DOI: 10.1096/fj.202002557RR

A new study in laboratory rats has discovered a direct link between low oxygen in the womb and impaired memory function in the adult offspring. It also finds that anti-oxidant supplements during pregnancy may protect against this.





Stone Age bear genome reconstructed from DNA in Mexican cave

Mon, 19/04/2021 - 17:30

A team of scientists led by Professor Eske Willerslev in the University of Cambridge’s Department of Zoology and the Lundbeck Foundation GeoGenetics Centre, University of Copenhagen, have recreated the genomes of animals, plants and bacteria from microscopic fragments of DNA found in the remote Chiquihuite Cave in Mexico.

The findings have been described as the ‘moon landings of genomics’, because researchers will no longer have to rely on finding and testing fossils to determine genetic ancestry and connections.

The results, published today in the journal Current Biology, mark the first time environmental DNA has been sequenced from soil and sediment. They include the ancient DNA profile of a Stone Age American black bear, taken from samples in the cave.

Working with highly fragmented DNA from soil samples means scientists no longer have to rely on DNA samples from bone or teeth for enough genetic material to recreate a profile of ancient DNA.

The samples included faeces and droplets of urine from an ancestor of the American black bear, which allowed the scientists to recreate the entire genetic code of two species of the animal: the Stone Age American black bear, and a short-faced bear called Arctodus simus that died out 12,000 years ago. 

Professor Willerslev said: “When an animal or a human urinates or defecates, cells from the organism are also excreted. We can detect the DNA fragments from these cells in the soil samples and have now used these to reconstruct genomes for the first time. We have shown that hair, urine and faeces all provide genetic material which, in the right conditions, can survive for much longer than 10,000 years.

“Analysis of DNA found in soil could have the potential to expand the narrative about everything from the evolution of species to developments in climate change – fossils will no longer be needed.”

Chiquihuite Cave is a high-altitude site, situated 2,750 metres above sea level. DNA of mice, black bears, rodents, bats, voles and kangaroo rats was also found. The scientists say that DNA fragments in sediment will now be able to be tested in many former Stone Age settlements around the world.

Professor Willerslev said: “Imagine the stories those traces could tell. It’s a little insane – but also fascinating – to think that, back in the Stone Age, these bears urinated and defecated in the Chiquihuite Cave and left us the traces we’re able to analyse today.”

Reference

Pedersen, M.W. et al. Environmental genomics of Late Pleistocene black bears and giant short-faced bears. Current Biology, April 2021. DOI: 10.1016/j.cub.2021.04.027

Adapted from a press release by St John's College, Cambridge.

 

Scientists have reconstructed ancient DNA from soil for the first time, in an advance that will significantly enhance the study of animal, plant and microorganism evolution.





Artificial intelligence could be used to triage patients suspected of being at risk of early stage oesophageal cancer

Thu, 15/04/2021 - 16:00

When researchers applied the technique to analysing samples obtained using the ‘pill on a string’ diagnostic tool Cytosponge, they found that it could halve pathologists’ workload while matching the accuracy of even experienced pathologists.

Early detection of cancer often leads to better survival, because pre-malignant lesions and early stage tumours can be treated more effectively. This is particularly important for oesophageal cancer, the sixth most common cause of cancer-related deaths. Patients usually present at an advanced stage with swallowing difficulties and weight loss. Five-year overall survival can be as low as 13%.

One main subtype of oesophageal cancer is preceded by a condition known as Barrett oesophagus, in which cells in the lining of the oesophagus change shape. Barrett oesophagus occurs in patients with Gastro-oesophageal Reflux Disease (GORD), a digestive disorder in which acid and bile from the stomach flow back into the oesophagus, leading to heartburn symptoms. In Western countries, 10-15% of the adult population are affected by GORD and are hence at an increased risk of having Barrett oesophagus.

At present, Barrett oesophagus can only be detected by gastroscopy and tissue biopsy. Researchers at the University of Cambridge have developed a far less invasive diagnostic tool called the Cytosponge – a ‘pill on a string’ that dissolves in the stomach and which, as it is withdrawn, picks up some cells from the lining of the oesophagus. These cells are then stained using a laboratory marker called TFF3 and can be examined under a microscope.

Now, in a study published today in Nature Medicine, a team at Cambridge has applied deep learning techniques to the sample analysis, stratifying samples into eight triage classes that determine whether a sample requires manual review or whether automated review would suffice. The algorithms were trained using 4,662 pathology slides from 2,331 patients.
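The idea of class-based triage can be illustrated with a minimal sketch. Everything below is hypothetical – the class boundaries, thresholds and function names are illustrative, not taken from the paper: a model's per-sample confidence is binned into eight classes, and only the most confident classes at either extreme are handled automatically.

```python
# Hypothetical sketch of confidence-based triage (not the published pipeline):
# a model's probability that a sample shows Barrett oesophagus is binned into
# eight triage classes; only the most confident classes at either extreme are
# reported automatically, while intermediate classes go to a pathologist.

def triage_class(p_positive: float, n_classes: int = 8) -> int:
    """Bin a probability in [0, 1] into triage classes 0..n_classes-1."""
    return min(int(p_positive * n_classes), n_classes - 1)

def needs_manual_review(cls: int, auto_classes=(0, 1, 6, 7)) -> bool:
    """Route intermediate-confidence classes to manual review."""
    return cls not in auto_classes

for p in (0.03, 0.45, 0.97):
    c = triage_class(p)
    route = "manual" if needs_manual_review(c) else "automated"
    print(f"p={p:.2f} -> class {c}, {route} review")
```

The point of such a scheme is the trade-off the article describes: the wider the band of classes sent for manual review, the closer the overall accuracy stays to a human pathologist's, at the cost of less workload reduction.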

Professor Rebecca Fitzgerald from the MRC Cancer Unit at the University of Cambridge, who developed the Cytosponge and worked with the AI team, said: “Any system that supports clinical decisions needs to balance its performance against workload reduction and potential economic impact. Replacing pathologists entirely could lead to substantial workload reduction and speed up diagnoses, but such an approach would only be viable if the performance remains comparable to that of human experts and there are regulatory hurdles to overcome.”

For the analysis of Cytosponge-TFF3 samples, the triaging approach showed several benefits, substantially reducing workload and matching the sensitivity and specificity of experienced pathologists. Sensitivity is the ‘true positive’ rate – that is, how often a test correctly generates a positive result for people who have Barrett oesophagus. Specificity, on the other hand, measures a test’s ability to correctly generate a negative result for people who don’t have the disease.

The researchers showed that a fully manual review by a pathologist achieves 82% sensitivity and 93% specificity. In a fully automated approach, they observed a sensitivity of 73% and a specificity of 93%. The team was able to demonstrate that using a triage-driven approach, up to two-thirds of cases can be reviewed automatically while achieving a sensitivity of 83% and specificity of 93%. The team estimates that this approach would reduce workload for the pathologists by 57%.
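As a worked illustration of the two definitions above (the counts below are invented for the arithmetic only – the study reports rates, not raw numbers), sensitivity and specificity can be computed directly from confusion-matrix counts:

```python
# Sensitivity and specificity from confusion-matrix counts.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: correct positives among people who have the disease."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: correct negatives among people who don't."""
    return tn / (tn + fp)

# Invented counts: 83 of 100 diseased cases flagged, 93 of 100 healthy cleared.
print(sensitivity(tp=83, fn=17))  # 0.83
print(specificity(tn=93, fp=7))   # 0.93
```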

The team were able to build into their algorithm problem-solving techniques applied by pathologists familiar with Cytosponge-TFF3 samples. This meant that the algorithms were interpretable – in other words, a clinician would be able to understand why they had reached a particular decision. This is important for accountability.

Dr Florian Markowetz from the CRUK Cambridge Institute, who led the work on the AI algorithm, said: “We’ve shown that it’s possible to use computer-aided tools to streamline identification of people at risk of Barrett oesophagus. By semi-automating the process, we can reduce the workload by more than half while retaining the accuracy of a skilled pathologist. This could potentially speed up the diagnosis of Barrett oesophagus and, potentially, the identification of those individuals at greatest risk of oesophageal cancer.”

The team say that this triage-driven approach could be applied beyond the Cytosponge to a number of tests for other conditions such as pancreatic cancer, thyroid cancer or salivary gland malignancies.

The research was supported by Cancer Research UK, the Medical Research Council and Cambridge University Hospitals NHS Foundation Trust.

Reference
Gehrung, M et al. Triage-driven diagnosis of Barrett esophagus for early detection of esophageal adenocarcinoma using deep learning. Nat Med; 15 Apr 2021; DOI: 10.1038/s41591-021-01287-9

 

Artificial intelligence ‘deep learning’ techniques can be used to triage suspected cases of Barrett oesophagus, a precursor to oesophageal cancer, potentially leading to faster and earlier diagnoses, say researchers at the University of Cambridge.





Stress does not lead to loss of self-control in eating disorders, study finds

Mon, 12/04/2021 - 18:00

People who experience bulimia nervosa and a subset of those affected by anorexia nervosa share certain key symptoms, namely recurrent binge-eating and compensatory behaviours, such as vomiting. The two disorders are largely differentiated by body mass index (BMI): adults affected by anorexia nervosa tend to have a BMI of less than 18.5 kg/m2. More than 1.6 million people in the UK are thought to have an eating disorder, three-quarters of whom are women.

One prominent theory of binge-eating is that it is a result of stress, which causes individuals to experience difficulties with self-control. However, until now, this theory has not been directly tested in patients.

To examine this theory, researchers at the University of Cambridge, working with clinicians at Cambridgeshire and Peterborough NHS Foundation Trust, invited 85 women – 22 with anorexia nervosa, 33 with bulimia nervosa and 30 healthy controls – to attend a two-day stay at the Wellcome-MRC Institute of Metabolic Science Translational Research Facility (TRF). The facility, which includes an Eating Behaviour Unit, is designed so that a volunteer’s diet and environment can be strictly controlled and their metabolic status studied in detail during a residential stay. The setting is intended to be as naturalistic as possible.

During their stay, each morning the women received controlled meals provided by a nutritionist. They then underwent a fasting period, during which they were taken to the next-door Wolfson Brain Imaging Centre, where they performed tasks while their brain activity was monitored using a functional MRI scanner.

The first task involved stopping the progression of a bar rising up a computer screen by pressing a key, with the aim of stopping the moving bar as it reached the middle line. On a minority of trials, stop-signals were presented, where the moving bar stopped automatically before reaching the middle line; participants were instructed to withhold their response in the event of a stop-signal.
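The paradigm described above is a variant of the stop-signal task used to measure response inhibition. The trial logic can be sketched as below; this is an illustrative simplification, not the study's actual task code, and the outcome labels are hypothetical names:

```python
def classify_trial(is_stop_trial: bool, responded: bool) -> str:
    """Classify one trial of a stop-signal task.

    On ordinary 'go' trials the participant should press the key as
    the rising bar reaches the middle line; on the minority of stop
    trials, where the bar halts on its own, the response must be
    withheld.
    """
    if is_stop_trial:
        return "correct_stop" if not responded else "failed_stop"
    return "go_response" if responded else "omission"

# A failed stop - pressing the key despite the stop-signal - is the
# kind of inhibition error the task is designed to measure.
print(classify_trial(True, responded=True))  # failed_stop
```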

The women then performed a task aimed at raising their stress levels. They were asked to carry out a series of mental arithmetic tests while receiving mild but unpredictable electric shocks, and were told that if they failed to meet the performance criterion, their data would be dismissed from the study. They were given feedback throughout the task, such as ‘Your performance is below average’.

The women then repeated the stop-signal task again.

Once the tasks had been completed – but while the volunteers might still be expected to be in a heightened state of stress – they returned to the Eating Behaviour Unit, where they were offered an ‘all you can eat’ buffet in its relaxing lounge and were told they could eat as much or as little as they would like.

On the second day of their study, the volunteers carried out the same tasks, but without the added stress of unpleasant electric shocks and pressure to perform. (For some participants, the order of the days was reversed.)

Dr Margaret Westwater, who led the research while a PhD student at Cambridge’s Department of Psychiatry, said: “The idea was to see what happened when these women were stressed. Did it affect key regions of the brain important for self-control, and did that in turn lead to increases in food intake? What we found surprised us and goes counter to the prevailing theory.”

The team found that even when they were not stressed, the women with bulimia nervosa performed worse on the main task, where they had to stop the rising bar as it reached the middle line – but this was not the case for the women affected by anorexia nervosa. This impairment occurred alongside increased activity in a region of the prefrontal cortex, which the team say could mean these women were unable to recruit other brain regions needed to perform the task optimally.

Interestingly – and contrary to the theory – stress did not affect the actual performance in any way for either of the patient groups or the controls. However, the patient groups showed some differences in brain activity when they were stressed – and this activity differed between women with anorexia and those with bulimia.

While the researchers observed that the patients in general ate less in the buffet than the controls, the amount that they ate did not differ between the stress and control days. However, activity levels in two key brain regions were associated with the amount of calories consumed in all three groups, suggesting that these regions are important for dietary control.

Dr Westwater added: “Even though these two eating disorders are similar in many respects, there are clear differences at the level of the brain. In particular, women with bulimia seem to have a problem with pre-emptively slowing down in response to changes in their environment, which we think might lead them to make hasty decisions, leaving them vulnerable to binge-eating in some way.

“The theory suggests that these women should have eaten more when they were stressed, but that's actually not what we found. Clearly, when we're thinking about eating behaviour in these disorders, we need to take a more nuanced approach.”

In findings published last year, the team took blood samples from the women as they performed their tasks, to look at metabolic markers that are important for our sense of feeling hungry or feeling full. They found that levels of these hormones are affected by stress.

Under stress, patients with anorexia nervosa had an increase in ghrelin, a hormone that tells us when we are hungry. But they also had an increase in peptide tyrosine tyrosine (PYY), a satiety hormone. In other words, when they are stressed, people with anorexia nervosa produce more of the hunger hormone, but paradoxically also more of a hormone that should tell them that they are full, so their bodies send them confusing signals about what to do around food.

The situation with bulimia nervosa was again different: while the team saw no differences in levels of ghrelin or PYY, they did see lower levels of cortisol, the ‘stress hormone’, than in healthy volunteers. In times of acute stress, people who are chronically stressed or are experiencing depression are known to show this paradoxical low cortisol phenomenon.

Professor Paul Fletcher, joint senior author at the Department of Psychiatry, said: “It’s clear from our work that the relationship between stress and binge-eating is very complicated. It’s about the environment around us, our psychological state and how our body signals to us that we’re hungry or full.

“If we can get a better understanding of the mechanisms behind how our gut shapes those higher order cognitive processes related to self-control or decision-making, we may be in a better position to help people affected by these extremely debilitating illnesses. To do this, we need to take a much more integrated approach to studying these illnesses. That's where facilities such as Cambridge’s new Translational Research Facility can play a vital role, allowing us to monitor within a relatively naturalistic environment factors such as an individual’s behaviour, hormone levels and brain activity.”

The research was funded by the Bernard Wolfe Health Neuroscience Fund, Wellcome, the NIH-Oxford-Cambridge Scholars Program and the Cambridge Trust. Further support was provided by the NIHR Cambridge Biomedical Research Centre.

Reference
Westwater, ML, et al. Prefrontal responses during proactive and reactive inhibition are differentially impacted by stress in anorexia and bulimia nervosa. Journal of Neuroscience; 12 April 2021; DOI: 10.1523/JNEUROSCI.2853-20.2021

A unique residential study has concluded that, contrary to perceived wisdom, people with eating disorders do not lose self-control – leading to binge-eating – in response to stress. The findings of the Cambridge-led research are published today in the Journal of Neuroscience.





Conservationists may be unintentionally spreading pathogens between threatened animal populations

Mon, 12/04/2021 - 10:00

The new report published in the journal Conservation Letters focuses on freshwater mussels, which the researchers have studied extensively, but is applicable to all species moved around for conservation purposes. 

Mussels play an important role in cleaning the water of many of the world’s rivers and lakes, but are one of the most threatened animal groups on Earth. There is growing interest in moving mussels to new locations to boost threatened populations, or so they can be used as ‘biological filters’ to improve water quality. 

A gonad-eating parasitic worm, Rhipidocotyle campanula, which can leave mussels completely sterile, was identified as a huge risk for captive breeding programmes where mussels from many isolated populations are brought together.  

“We need to be much more cautious about moving animals to new places for conservation purposes, because the costs may outweigh the benefits,” said Dr David Aldridge in the Department of Zoology at the University of Cambridge, senior author of the report.

He added: “We’ve seen that mixing different populations of mussels can allow widespread transmission of gonad-eating worms – it only takes one infected mussel to spread this parasite, which in extreme cases can lead to collapse of an entire population.”

Pathogens can easily be transferred between locations when mussels are moved. In extreme cases, the pathogens may cause a population of mussels to collapse completely. In other cases, infections may not cause a problem until other factors, such as a lack of food or high temperatures, put a population under stress, leading to a sudden outbreak.

The report recommends that species are only relocated when absolutely necessary and quarantine periods, tailored to stop transmission of the most likely pathogens being carried, are used. 

It identifies four key factors that determine the risk of spreading pathogens when relocating animals: proportion of infected animals in both source and recipient populations; density of the resulting population; host immunity; and the life-cycle of the pathogen. Pathogens that must infect multiple species to complete their life-cycle, like parasitic mites, will only persist if all of the species are present in a given location.
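The four factors above can be combined into a toy screening score. The function below is purely illustrative – the multiplicative weighting and 0–1 scalings are assumptions, not from the paper – but it captures the qualitative logic: risk rises with infection prevalence and density, falls with host immunity, and vanishes for a multi-host pathogen missing one of its required host species:

```python
def relocation_risk(prev_source: float, prev_recipient: float,
                    density: float, immunity: float,
                    all_hosts_present: bool = True) -> float:
    """Toy 0-1 risk score for moving animals between populations.

    prev_source, prev_recipient: proportion infected (0-1) in the
        source and recipient populations.
    density: density of the resulting population, scaled to 0-1.
    immunity: host immunity, scaled to 0-1 (higher = better protected).
    all_hosts_present: for pathogens needing several host species to
        complete their life-cycle, whether every required host occurs
        at the destination.
    """
    if not all_hosts_present:
        return 0.0  # e.g. parasitic mites cannot persist without all hosts
    prevalence = max(prev_source, prev_recipient)
    return prevalence * density * (1.0 - immunity)

print(round(relocation_risk(0.5, 0.1, 0.8, 0.2), 2))                 # 0.32
print(relocation_risk(0.5, 0.1, 0.8, 0.2, all_hosts_present=False))  # 0.0
```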

“Moving animals to a new location is often used to protect or supplement endangered populations. But we must consider the risk this will spread pathogens that we don’t understand very well at all, which could put these populations in even greater danger,” said Josh Brian, a PhD student in the Department of Zoology at the University of Cambridge and first author of the report.

Different populations of the same species may respond differently to infection with the same pathogen because of adaptations in their immune system. For example, a pack of endangered wolves moved to Yellowstone National Park died because the wolves had no immunity to parasites carried by the local canines.

The researchers say that stocking rivers with fish for anglers, and sourcing exotic plants for home gardens could also move around parasites or diseases. 

“Being aware of the risks of spreading diseases between populations is a vital first step towards making sure we avoid unintentional harm in future conservation work,” said Isobel Ollard, a PhD student in the Department of Zoology at the University of Cambridge, who was also involved in the study.

This research was funded by the Woolf Fisher Trust.

Reference
Brian, J.I., Ollard, I.S., & Aldridge, D.C. ‘Don’t move a mussel? Parasite and disease risk in conservation action.’ Conservation Letters, April 2021. DOI: 10.1111/conl.12799

Moving endangered species to new locations is often used as part of species conservation strategies, and can help to restore degraded ecosystems. But scientists say there is a high risk that these relocations are accidentally spreading diseases and parasites.





Researchers call for greater awareness of unintended consequences of CRISPR gene editing

Mon, 12/04/2021 - 09:11

CRISPR-Cas9 genome editing is a widely used research tool which allows scientists to remove and replace sections of DNA in cells, allowing them, for example, to study the function of a given gene or to repair mutations. Last year the researchers who developed CRISPR-Cas9 were awarded the Nobel Prize in Chemistry.

In the study published in the journal PNAS, scientists retrospectively analysed data from previous research in which they had studied the role of the OCT4 protein in human embryos during the first few days of development.

The team found that while the majority of CRISPR-Cas9-induced mutations were small insertions or deletions, in approximately 16% of samples there were large unintended mutations that would have been missed by conventional methods to assess DNA changes. 

Research is ongoing to understand the exact nature of the changes at the target sites, but this could include deletions of sections of DNA or more complex genomic rearrangements. 

The discovery highlights the need for researchers who use CRISPR-Cas9-mediated genome editing to edit human cells, whether somatic or germline, to be aware of and test for these potential unintended consequences. This is even more essential if they hope their work will be used clinically, as unintended genetic changes like this could lead to diseases like cancer.  

“Other research teams have reported these types of unintended mutations in human stem cells, cancer cells and other cellular contexts, and now we’ve detected them in human embryos,” said Professor Kathy Niakan, group leader of the Human Embryo and Stem Cell Laboratory at the Francis Crick Institute and Professor of Reproductive Physiology at the University of Cambridge, and senior author of the study.

“This work underscores the importance of testing for these unintended mutations to understand exactly what changes have happened in any human cell type.” 

The researchers have developed an open-source computational pipeline to identify whether CRISPR-Cas9 has caused unintended on-target mutations based on different types of next-generation sequencing data. 
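The published pipeline works from full next-generation sequencing data, but one signal of this kind of mutation is simple to illustrate: a site that is heterozygous before editing yet shows only a single allele afterwards may indicate a large on-target deletion removing one chromosome's copy. The sketch below is a deliberately simplified illustration of that idea, not the published tool:

```python
def suggests_loss_of_heterozygosity(pre_edit_genotype: tuple,
                                    post_edit_reads: list,
                                    min_depth: int = 10) -> bool:
    """Flag a site that was heterozygous before editing but shows
    only one allele in reads from the edited cells - consistent with
    a large deletion having removed one chromosome's copy.
    Real pipelines model coverage, noise and many other signals.
    """
    if len(post_edit_reads) < min_depth:
        return False  # too few reads to make a confident call
    was_heterozygous = len(set(pre_edit_genotype)) == 2
    now_single_allele = len(set(post_edit_reads)) == 1
    return was_heterozygous and now_single_allele

# Twelve reads, all carrying the 'A' allele, at a formerly A/G site.
print(suggests_loss_of_heterozygosity(("A", "G"), ["A"] * 12))  # True
```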

“We and others are trying to develop and refine the tools to assess these complex mutations,” added Niakan.

“It is important to understand these events, how they arise and their frequency, so we can appreciate the current limitations of the technology and inform strategies to improve it in the future to minimise these mutations.” 

Gregorio Alanis-Lobato, lead author and former postdoctoral training fellow in the Human Embryo and Stem Cell Laboratory at the Crick, said: “Conventional tests used to check the accuracy of CRISPR-Cas9 can miss the types of unintended on-target mutations we identified in this study. There’s still so much for us to learn about the effects of CRISPR-Cas9 technology and while this valuable tool is refined, we need to thoroughly examine all changes.”  

There are important ongoing debates around the safety and ethics of using CRISPR-Cas9 genome editing on human embryos for reproductive purposes. And in 2019, there was international condemnation of the work of a researcher in China who edited embryos which led to the birth of twins. In the UK, its use on human embryos is closely regulated and is only allowed for research purposes. Research is restricted to the first 14 days of development and embryos are not allowed to be implanted into a womb. 

The data for this work related to embryos previously studied by the Crick’s Human Embryo and Stem Cell Laboratory. The embryos were at the blastocyst stage of early development, consisting of around 200 cells. They had been donated to research by people undergoing in vitro fertilisation (IVF) and were not needed during the course of their treatment. 

The research was led by scientists at the Francis Crick Institute, in collaboration with Professor Dagan Wells at the University of Oxford. Kathy Niakan is Director of the University of Cambridge’s Centre for Trophoblast Research, and Chair of the Cambridge Strategic Research Initiative in Reproduction.

Reference
Alanis-Lobato, G., et al: Frequent loss-of-heterozygosity in CRISPR-Cas9-edited early human embryos. PNAS, April 2021. DOI: 10.1073/pnas.2004832117

Adapted from a press release by the Francis Crick Institute.

CRISPR-Cas9 genome editing can lead to unintended mutations at the targeted section of DNA in early human embryos, researchers have revealed. This highlights the need for further research into the effects of CRISPR-Cas9 genome editing, especially when used to edit human DNA in laboratory research.



