Faculty of Biology

Scientists can detect brain tumours using a simple urine or blood plasma test

Latest Research in Cambridge - Fri, 23/07/2021 - 16:05

The team say that a test for detecting glioma using urine is the first of its kind in the world.

Although the research, published in EMBO Molecular Medicine, is in its early stages and only a small number of patients were analysed, the team say their results are promising.

The researchers suggest that in the future, these tests could be used by GPs to monitor patients at high risk of brain tumours, which may be more convenient than the standard method of an MRI scan every three months.

When people have a brain tumour removed, the likelihood of it returning can be high, so they are monitored with an MRI scan every three months, followed by a biopsy.

Blood tests for detecting different cancer types are a major focus of research for teams across the world, and there are some in use in the clinic. These tests are mainly based on finding mutated DNA, shed by tumour cells when they die, known as cell-free DNA (cfDNA).

However, detecting brain tumour cfDNA in the blood has historically been difficult because of the blood-brain barrier, which separates blood from the cerebrospinal fluid (CSF) that surrounds the brain and spinal cord, preventing the passage of cells and other particles, such as cfDNA.

Researchers have previously looked at detecting cfDNA in CSF, but the spinal taps needed to obtain it can be dangerous for people with brain tumours so are not appropriate for patient monitoring.

Scientists have known that cfDNA with mutations similar to those in the original tumour can be found in blood and other bodily fluids, such as urine, at very low levels, but the challenge has been developing a test sensitive enough to detect these specific mutations.

The researchers, led by Dr Florent Mouliere, who is based at the Rosenfeld Lab of the Cancer Research UK Cambridge Institute and at the Amsterdam UMC, and Dr Richard Mair, who is based at the Cancer Research UK Cambridge Institute and the University of Cambridge, developed two approaches in parallel to overcome the challenge of detecting brain tumour cfDNA.

The first approach works for patients who have previously had glioma removed and biopsied. The team designed a tumour-guided sequencing test that was able to look for the mutations found in the tumour tissue within the cfDNA in the patient’s urine, CSF, and blood plasma.

A total of eight patients with suspected brain tumours based on MRI scans were included in this part of the study. Tumour tissue samples were taken at their initial biopsies, alongside CSF, blood and urine samples.

By knowing where in the DNA strand to look, the researchers found that it was possible to find mutations even in the tiny amounts of cfDNA found in the blood plasma and urine.

The test was able to detect cfDNA in 7 out of 8 CSF samples, 10 out of 12 blood plasma samples and 10 out of 16 urine samples.

For the second approach the researchers looked for other patterns in the cfDNA that could also indicate the presence of a tumour, without having to identify the mutations.

They analysed 35 samples from glioma patients, 27 from people with non-malignant brain disorders, and 26 from healthy people. They used whole genome sequencing, in which all of the cfDNA in a sample is analysed, not just the mutations.

They found that fragments of cfDNA in the blood plasma and urine samples of patients with brain tumours were a different size from those of patients without tumours. They then fed this data into a machine learning algorithm, which was able to successfully differentiate between the urine samples of people with and without glioma.

The researchers say that while the machine learning test is cheaper and easier, and does not require a tissue biopsy from the tumour, it is less sensitive and less specific than the first, tumour-guided sequencing approach.

MRIs are not invasive or expensive, but they do require a trip to the hospital, and the three-month gap between checks can be a regular source of anxiety for patients.

The researchers suggest that their tests could be used between MRI scans, and could ultimately be able to detect a returning brain tumour earlier.

In the next stage of this research, the team will compare both tests against MRI scans in a trial of patients with brain tumours who are in remission, to see whether the tests can detect returning tumours at the same time as, or earlier than, MRI. If the tests prove they can detect brain tumours earlier than an MRI, the researchers will look at how to adapt the tests so they could be offered in the clinic, which could be within the next ten years.

“We believe the tests we’ve developed could in the future be able to detect a returning glioma earlier and improve patient outcomes,” said Mair. “Talking to my patients, I know the three-month scan becomes a focal point for worry. If we could offer a regular blood or urine test, not only will you be picking up recurrence earlier, you can also be doing something positive for the patient’s mental health.”

Michelle Mitchell, Chief Executive of Cancer Research UK said, “While this is early research, it’s opened up the possibility that within the next decade we could be able to detect the presence of a brain tumour with a simple urine or blood test. Liquid biopsies are a huge area of research interest right now because of the opportunities they create for improved patient care and early diagnosis. It’s great to see Cancer Research UK researchers making strides in this important field.”

Sue Humphreys, from Walsall, a brain tumour patient, said: “If these tests are found to be as accurate as the standard MRI for monitoring brain tumours, it could be life changing.

“If patients can be given a regular and simple test by their GP, it may help not only detect a returning brain tumour in its earliest stages, it can also provide the quick reassurance that nothing is going on, which is the main problem we all suffer from: the dreaded Scanxiety.

“The problem with three-monthly scans is that these procedures can get disrupted by other things going on, such as what we have seen with the Covid pandemic. As a patient, this causes worry as there is a risk that things may be missed, or delayed, and early intervention is the key to any successful treatment.”

 

Reference:
Florent Mouliere et al. ‘Fragmentation patterns and personalized sequencing of cell-free DNA in urine and plasma of glioma patients.’ EMBO Molecular Medicine (2021). DOI: 10.15252/emmm.202012881

Adapted from a Cancer Research UK press release.

Researchers from the Cancer Research UK Cambridge Institute have developed two tests that can detect the presence of glioma, a type of brain tumour, in patient urine or blood plasma.



The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.



Blushing plants reveal when fungi are growing in their roots

Latest Research in Cambridge - Fri, 23/07/2021 - 08:06

This is the first time this vital, 400 million year old process has been visualised in real time in full root systems of living plants. Understanding the dynamics of plant colonisation by fungi could help to make food production more sustainable in the future.

Almost all crop plants form associations with a particular type of fungi – called arbuscular mycorrhiza fungi – in the soil, which greatly expand their root surface area. This mutually beneficial interaction boosts the plant’s ability to take up nutrients that are vital for growth. 

The more nutrients plants obtain naturally, the less artificial fertilisers are needed. Understanding this natural process, as the first step towards potentially enhancing it, is an ongoing research challenge. Progress is likely to pay huge dividends for agricultural productivity.

In a study published in the journal PLOS Biology, researchers used the bright red pigments of beetroot – called betalains – to visually track soil fungi as they colonised plant roots in a living plant. 

“We can now follow how the relationship between the fungi and plant root develops, in real-time, from the moment they come into contact. We previously had no idea about what happened because there was no way to visualise it in a living plant without the use of elaborate microscopy,” said Dr Sebastian Schornack, a researcher at the University of Cambridge’s Sainsbury Laboratory and joint senior author of the paper. 

To achieve their results, the researchers engineered two model plant species – a legume and a tobacco plant – so that they would produce the highly visible betalain pigments when arbuscular mycorrhiza fungi were present in their roots. This involved combining the control regions of two genes activated by mycorrhizal fungi with genes that synthesise red-coloured betalain pigments.

The plants were then grown in a transparent structure so that the root system was visible, and images of the roots could be taken with a flatbed scanner without disturbing the plants.

Using their technique, the researchers could select red pigmented parts of the root system to observe the fungus more closely as it entered individual plant cells and formed elaborate tree-like structures – called arbuscules – which grow inside the plant’s roots. Arbuscules take up nutrients from the soil that would otherwise be beyond the reach of the plant. 

Other methods exist to visualise this process, but these involve digging up and killing the plant and the use of chemicals or expensive microscopy. This work makes it possible for the first time to watch by eye and with simple imaging how symbiotic fungi start colonising living plant roots, and inhabit parts of the plant root system over time.

“This is an exciting new tool to visualise this, and other, important plant processes. Beetroot pigments are a distinctive colour, so they’re very easy to see. They also have the advantage of being natural plant pigments, so they are well tolerated by plants,” said Dr Sam Brockington, a researcher in the University of Cambridge’s Department of Plant Sciences, and joint senior author of the paper.

Mycorrhiza fungi are attracting growing interest in agriculture. This new technique provides the ability to ‘track and trace’ the presence of symbiotic fungi in soils from different sources and locations. The researchers say this will enable the selection of fungi that colonise plants fastest and provide the biggest benefits in agricultural scenarios.

Understanding and exploiting the dynamics of plant root system colonisation by fungi has potential to enhance future crop production in an environmentally sustainable way. If plants can take up more nutrients naturally, this will reduce the need for artificial fertilisers – saving money and reducing associated water pollution. 

This research was funded by the Biotechnology and Biological Sciences Research Council, Gatsby Charitable Foundation, Royal Society, and Natural Environment Research Council. 

Reference
Timoneda, A., Yunusov, T. et al. ‘MycoRed: Betalain pigments enable in vivo real-time visualisation of arbuscular mycorrhizal colonisation.’ PLOS Biology, July 2021. DOI: 10.1371/journal.pbio.3001326

Scientists have created plants whose cells and tissues ‘blush’ with beetroot pigments when they are colonised by fungi that help them take up nutrients from the soil.






Scientists reverse age-related memory loss in mice

Latest Research in Cambridge - Thu, 22/07/2021 - 15:18

In a study published in Molecular Psychiatry, the team show that changes in the extracellular matrix of the brain – ‘scaffolding’ around nerve cells – lead to loss of memory with ageing, but that it is possible to reverse these using genetic treatments.

Recent evidence has emerged of the role of perineuronal nets (PNNs) in neuroplasticity – the ability of the brain to learn and adapt – and to make memories. PNNs are cartilage-like structures that mostly surround inhibitory neurons in the brain. Their main function is to control the level of plasticity in the brain. They appear at around five years old in humans, and turn off the period of enhanced plasticity during which the connections in the brain are optimised. Then, plasticity is partially turned off, making the brain more efficient but less plastic.

PNNs contain compounds known as chondroitin sulphates. Some of these, such as chondroitin 4-sulphate, inhibit the action of the networks, inhibiting neuroplasticity; others, such as chondroitin 6-sulphate, promote neuroplasticity. As we age, the balance of these compounds changes, and as levels of chondroitin 6-sulphate decrease, so our ability to learn and form new memories changes, leading to age-related memory decline.

Researchers at the University of Cambridge and University of Leeds investigated whether manipulating the chondroitin sulphate composition of the PNNs might restore neuroplasticity and alleviate age-related memory deficits.

To do this, the team looked at 20-month-old mice – considered very old – and using a suite of tests showed that the mice exhibited deficits in their memory compared to six-month-old mice.

For example, one test involved seeing whether mice recognised an object. The mouse was placed at the start of a Y-shaped maze and left to explore two identical objects at the end of the two arms. After a short while, the mouse was once again placed in the maze, but this time one arm contained a new object, while the other contained a copy of the familiar object. The researchers measured the amount of time the mouse spent exploring each object to see whether it had remembered the object from the previous task. The older mice were much less likely to remember the object.

The team treated the ageing mice with a ‘viral vector’ – a virus capable of restoring the level of chondroitin 6-sulphate in the PNNs – and found that this completely restored memory in the older mice, to a level similar to that seen in the younger mice.

Dr Jessica Kwok from the School of Biomedical Sciences at the University of Leeds said: “We saw remarkable results when we treated the ageing mice with this treatment. The memory and ability to learn were restored to levels they would not have seen since they were much younger.”

To explore the role of chondroitin 6-sulphate in memory loss, the researchers bred mice that had been genetically manipulated so that they were only able to produce low levels of the compound, mimicking the changes of ageing. Even at 11 weeks, these mice showed signs of premature memory loss. However, increasing levels of chondroitin 6-sulphate using the viral vector restored their memory and plasticity to levels similar to those of healthy mice.

Professor James Fawcett from the John van Geest Centre for Brain Repair at the University of Cambridge said: “What is exciting about this is that although our study was only in mice, the same mechanism should operate in humans – the molecules and structures in the human brain are the same as those in rodents. This suggests that it may be possible to prevent humans from developing memory loss in old age.”

The team have already identified a potential drug, licensed for human use, that can be taken by mouth and inhibits the formation of PNNs. When this compound is given to mice and rats it can restore memory in ageing and also improves recovery in spinal cord injury. The researchers are investigating whether it might help alleviate memory loss in animal models of Alzheimer's disease.

The approach taken by Professor Fawcett’s team – using viral vectors to deliver the treatment – is increasingly being used to treat human neurological conditions. A second team at the Centre recently published research showing their use for repairing damage caused by glaucoma and dementia.

The study was funded by Alzheimer’s Research UK, the Medical Research Council, European Research Council and the Czech Science Foundation.

 

Reference
Yang, S. et al. ‘Chondroitin 6-sulphate is required for neuroplasticity and memory in ageing.’ Molecular Psychiatry; 16 July 2021; DOI: 10.1038/s41380-021-01208-9

Scientists at Cambridge and Leeds have successfully reversed age-related memory loss in mice and say their discovery could lead to the development of treatments to prevent memory loss in people as they age.






Llama ‘nanobodies’ could hold key to preventing deadly post-transplant infection

Latest Research in Cambridge - Thu, 22/07/2021 - 10:07

Around four out of five people in the UK are thought to be infected with HCMV, and in developing countries this can be as high as 95%. For the majority of people, the virus remains dormant, hidden away inside white blood cells, where it can remain undisturbed and undetected for decades. If the virus reactivates in a healthy individual, it does not usually cause symptoms. However, for people who are immunocompromised – for example, transplant recipients who need to take immunosuppressant drugs to prevent organ rejection – HCMV reactivation can be devastating.

At present, there is no effective vaccine against HCMV, and anti-viral drugs often prove ineffective or have very serious side-effects.

Now, in a study published in Nature Communications, researchers at Vrije Universiteit Amsterdam in the Netherlands and at the University of Cambridge have found a way to chase the virus from its hiding place using a special type of antibody known as a nanobody.

Nanobodies were first identified in camels and exist in all camelids – a family of animals that also includes dromedaries, llamas and alpacas. Human antibodies consist of two heavy and two light chains of molecules, which together recognise and bind to markers, known as antigens, on the surface of a cell or virus. For this special class of camelid antibodies, however, only a single fragment of the antibody – often referred to as a single-domain antibody or nanobody – is sufficient to recognise antigens properly.

Dr Timo De Groof from Vrije Universiteit Amsterdam, the study’s joint first author, said: “As the name suggests, nanobodies are much smaller than regular antibodies, which makes them perfectly suited for particular types of antigens and relatively easy to manufacture and adjust. That’s why they’re being hailed as having the potential to revolutionise antibody therapies.”

The first nanobody drug has been approved and introduced onto the market by the biopharmaceutical company Ablynx, while other nanobodies are already in clinical trials for diseases such as rheumatoid arthritis and certain cancers. Now, the team in the Netherlands and the UK has developed nanobodies that target a specific viral protein (US28), one of the few elements detectable on the surface of a latently HCMV-infected cell and a main driver of this latent state.

Dr Ian Groves from the Department of Medicine at the University of Cambridge said: “Our team has shown that nanobodies derived from llamas have the potential to outwit human cytomegalovirus. This could be very important as the virus can cause life-threatening complications in people whose immune systems are not functioning properly.”

In laboratory experiments using blood infected with the virus, the team showed that the nanobody binds to the US28 protein and interrupts the signals, established through the protein, that help keep the virus in its dormant state. Once this control is broken, the host’s immune cells are able to 'see' that the cell is infected, enabling them to hunt down and kill the virus, purging the latent reservoir and clearing the blood of the virus.

Dr Elizabeth Elder, joint first author, who carried out her work while at the University of Cambridge, said: “The beauty of this approach is that it reactivates the virus just enough to make it visible to the immune system, but not enough for it to do what a virus normally does – replicating and spreading. The virus is forced to put its head above the parapet where it can then be killed by the immune system.”

Professor Martine Smit, also from the Vrije Universiteit Amsterdam, added: “We believe our approach could lead to a much-needed new type of treatment for reducing – and potentially even preventing – CMV infections in patients eligible for organ and stem cell transplants.”

The research was funded by the Dutch Research Council (NWO), Wellcome and the Medical Research Council, with support from the NIHR Cambridge Biomedical Research Centre.

 

Reference
De Groof TWM, Elder E, et al. Targeting the latent human cytomegalovirus reservoir for T-cell mediated killing with virus specific nanobodies. Nature Communications (2021). DOI: 10.1038/s41467-021-24608-5

Scientists have developed a ‘nanobody’ – a small fragment of a llama antibody – that is capable of chasing out human cytomegalovirus (HCMV) as it hides away from the immune system. This then enables immune cells to seek out and destroy this potentially deadly virus.






Biological ‘fingerprints’ of long COVID in blood could lead to diagnostic test, say Cambridge scientists

Latest Research in Cambridge - Mon, 19/07/2021 - 08:39

The team has received funding from the National Institute for Health Research to develop a test that could complement existing antibody tests. They also aim to use similar biological signatures to develop a test to diagnose and monitor long COVID.

While most people recover from COVID-19 in a matter of days or weeks, around one in ten people go on to develop symptoms that can last for several months. This can be the case irrespective of the severity of their COVID-19 – even individuals who were asymptomatic can experience so-called ‘long COVID’.

Diagnosing long COVID can be a challenge, however. A patient with asymptomatic or mild disease may not have taken a PCR test – the gold standard for diagnosing COVID-19 – at the time of infection, and so has never had a confirmed diagnosis. Even antibody tests – which look for antibodies produced in response to infection – are estimated to miss around 30% of cases, particularly among those who had only mild disease or are more than six months past their initial illness.

A team at the University of Cambridge and Cambridge University Hospital NHS Foundation Trust has received £370,000 from the National Institute for Health Research to develop a COVID-19 diagnostic test that would complement existing antibody tests and a test that could objectively diagnose and monitor long COVID.

The research builds on a pilot project supported by the Addenbrooke’s Charitable Trust. The team has been recruiting patients from the Long COVID Clinic established in May 2020 at Addenbrooke’s Hospital, part of Cambridge University Hospital NHS Foundation Trust.

During the pilot, the team recruited 85 patients to the Cambridge NIHR COVID BioResource, which collects blood samples from patients when they are first diagnosed and then at follow-up intervals over several months. They now hope to expand their cohort to 500 patients, recruited from Cambridgeshire and Peterborough.

In their initial findings, the team identified a biomarker – a biological fingerprint – in the blood of patients who had previously had COVID-19. This biomarker is a molecule known as a cytokine produced by T cells in response to infection. As with antibodies, this biomarker persists in the blood for a long time after infection. The team plans to publish their results shortly.

Dr Mark Wills from the Department of Medicine at the University of Cambridge, who co-leads the team, said: “We need a reliable and objective way of saying whether someone has had COVID-19. Antibodies are one sign we look for, but not everyone makes a very strong response and this can wane over time and become undetectable.

“We’ve identified a cytokine that is also produced in response to infection by T cells and is likely to be detectable for several months – and potentially years – following infection. We believe this will help us develop a much more reliable diagnostic for those individuals who did not get a diagnosis at the time of infection.”

By following patients for up to 18 months post-infection, the team hopes to address several questions, including whether immunity wanes over time. This will be an important part of helping understand whether people who have been vaccinated will need to receive boosters to keep them protected.

As part of their pilot study, the team also identified a particular biomarker found in patients with long COVID. Their work suggests these patients produce a second type of cytokine, which persists in patients with long COVID compared to those that recover quickly and might be one of the drivers behind the many symptoms that patients experience. This might therefore prove to be useful for diagnosing long COVID.

Dr Nyarie Sithole, also from the Department of Medicine at the University of Cambridge, who co-leads the team and helps to manage long COVID patients, said:  “Because we currently have no reliable way of diagnosing long COVID, the uncertainty can cause added stress to people who are experiencing potential symptoms. If we can say to them ‘yes, you have a biomarker and so you have long COVID’, we believe this will help allay some of their fears and anxieties.

“There is anecdotal evidence that patients see an improvement in symptoms of long COVID once they have been vaccinated – something that we have seen in a small number of patients in our clinic. Our study will allow us to see how this biomarker changes over a longer period of time in response to vaccination.”

At the moment, the team is using the tests for research purposes, but by increasing the size of their study cohort and carrying out further work, they hope to adapt and optimise the tests so that they can be scaled up, speeded up and used by clinical diagnostic labs.

As well as developing a reliable test, the researchers hope their work will help provide an in-depth understanding of how the immune system responds to coronavirus infection – and why it triggers long COVID in some people.

Dr Sithole added: “One of the theories of what’s driving long COVID is that it’s a hyperactive immune response – in other words, the immune system switches on at the initial infection and for some reason never switches off or never goes back to the baseline. As we’ll be following our patients for many months post-infection, we hope to better understand whether this is indeed the case.”

In addition, having a reliable biomarker could help in the development of new treatments against COVID. Clinical trials require an objective measure of whether a drug is effective. Changes in – or the disappearance of – long-COVID-related cytokine biomarkers with corresponding symptom improvement in response to drug treatment would suggest that a treatment intervention is working.

Markers in our blood – ‘fingerprints’ of infection – could help identify individuals who have been infected by SARS-CoV-2, the coronavirus that causes COVID-19, several months after infection even if the individual had only mild symptoms or showed no symptoms at all, say Cambridge researchers.






University of Cambridge launches roadmap to support future growth of life sciences cluster

Latest Research in Cambridge - Fri, 16/07/2021 - 10:49

The roadmap sets out a clear plan to create a bridge between two of Cambridge’s historical strengths – biomedical research and cutting-edge technology – and bring these specialisms together to develop new treatments and health tech with real-world applications. The solutions in the roadmap are scalable beyond Cambridge and applicable to other disciplines and sectors.

Professor Andy Neely, Pro-Vice-Chancellor for Enterprise and Business Relations at the University of Cambridge, said: “Cambridge has a deep and rich history of discovery and collaboration, and its interdisciplinary environment is the perfect testbed for new models of innovation in the life sciences. Our roadmap sets out a plan to do just that and will ensure that Cambridge remains a global leader in health technology into the next generation.

“This will require us to pioneer new ways of working and creating connections between different institutions across disciplines, be they academic or private enterprise. Such a model has been proven to work at a small scale – our proposal in the roadmap is to scale this up and apply it across the cluster and beyond.”

The University sits at the heart of the so-called ‘Cambridge cluster’, in which more than 5,300 knowledge-intensive firms employ more than 67,000 people and generate £18 billion in turnover. Cambridge has the highest number of patent applications per 100,000 residents in the UK.

The mission of the University is to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence. This includes cultivating and delivering excellent research and world-leading innovation and training of the next generation of highly skilled researchers and entrepreneurs, thereby underpinning the UK's economic growth and competitiveness.

Professor Tony Kouzarides, Director of the Milner Therapeutics Institute at the University of Cambridge, said: “The pandemic has clearly shown the importance of rapid innovation in healthcare. We are determined to harness the power of innovation, creativity and collaboration in Cambridge, and apply this towards solving some of the biggest medical challenges facing the country, and the world.”

The Connect: Health Tech roadmap is a result of consultation with major stakeholders and a series of road-mapping workshops with the Cambridge community. It aims to shape the future success of the Cambridge cluster in health tech through a supportive and dynamic ecosystem that aligns with the needs of the community.

The roadmap includes ambitious steps to build strong foundations for the Cambridge cluster for the next 20 years. It will support the region's post-pandemic economic recovery and bring cutting-edge research, businesses and innovators together to be better prepared and connected for the future. Connect: Health Tech will also increase access to the Cambridge ecosystem, extending its reach and helping to level up growth and investment across the East of England and the Oxford-Cambridge Arc.

One of the major recommendations in the report is to create and foster connectivity at the interface between medicine and technology and across sectors. This recommendation has been piloted by expanding the Cambridge cluster from a physical community to a digital one.

The COVID-19 pandemic has required the creation of an innovative model of access and navigation to Cambridge. The digital platform simplifies navigation of the Cambridge research community and enables new companies based all over the world to access expertise and knowledge across the University, with the aim of increasing inward investment in the life sciences. It also pilots an approach to navigation and connectivity that can be scaled up across the Arc and the UK. This new way of working will speed up the development of new healthcare innovations and technologies that the NHS will use in years to come.

Connect: Health Tech is a Cambridge University initiative funded by Research England. Connect: Health Tech UEZ has been created to build a highly effective interdisciplinary bridge between two Cambridge research hubs and beyond: the West science and technology hub anchored at the Maxwell Centre and South biomedical hub anchored at the Milner Therapeutics Institute. The bridge will bring together and integrate a community from across the University, research institutes, NHS, industry, investors, local and national Government, with a focus on medtech, digital health and therapeutics, to create opportunities that will transform ideas at the interface between medicine and technology into reality.

Read Creating a University Enterprise Zone for Cambridge across the life and physical sciences

Connect: Health Tech, the University of Cambridge Enterprise Zone, has today launched a roadmap, ‘Creating a University Enterprise Zone for Cambridge across the life and physical sciences’, that examines the challenges faced in futureproofing and sustaining the growth of the life sciences cluster to maintain Cambridge as a global centre of excellence for health tech.



Top UK organisations release annual statistics for use of animals in research

Latest Research in Cambridge - Thu, 15/07/2021 - 10:07

This is to coincide with the Home Office’s publication of Great Britain’s statistics for animals used in research in 2020.

These ten organisations carried out 1,343,893 procedures, nearly half (47%) of the 2,883,310 procedures carried out in Great Britain in 2020. More than 99% of these 1,343,893 procedures were carried out in rodents or fish.

The statistics are freely available on the organisations’ websites as part of their ongoing commitment to transparency and openness around the use of animals in research.

The ten organisations are listed below alongside the total number of procedures that they carried out in 2020. This is the sixth consecutive year organisations have come together to publicise their collective statistics and examples of their research.

Organisation                     Number of procedures
The Francis Crick Institute                  183,811
University of Cambridge                      177,219
Medical Research Council                     173,637
University of Oxford                         169,511
University of Edinburgh                      151,669
UCL                                          142,988
University of Glasgow                        102,526
University of Manchester                      93,448
King's College London                         85,414
Imperial College London                       63,670
TOTAL                                      1,343,893
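
As a quick arithmetic check (a sketch added here for illustration, not part of the original release), the counts in the table do sum to the stated total, and that total is just under half of the national figure:

```python
# Procedure counts for 2020 as listed in the table above
counts = {
    "The Francis Crick Institute": 183_811,
    "University of Cambridge": 177_219,
    "Medical Research Council": 173_637,
    "University of Oxford": 169_511,
    "University of Edinburgh": 151_669,
    "UCL": 142_988,
    "University of Glasgow": 102_526,
    "University of Manchester": 93_448,
    "King's College London": 85_414,
    "Imperial College London": 63_670,
}

total = sum(counts.values())      # 1,343,893
gb_total = 2_883_310              # all procedures in Great Britain, 2020
share = 100 * total / gb_total    # about 46.6%, i.e. "nearly half"

print(total, round(share, 1))
```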

A further breakdown of Cambridge’s numbers, including the number of procedures by species and detail of the levels of severity, can be found on our animal research pages.

Animal research has been essential for developing lifesaving vaccines and treatments for Covid-19. Ferrets and macaque monkeys were used to test the safety and efficacy of Covid-19 vaccines, including the successful Oxford / AstraZeneca vaccine. Hamsters are being used to develop Covid-19 treatment strategies as they display a more severe form of the disease than ferrets and monkeys. Guinea pigs have also been used in regulatory research to batch test vaccine potency.

Despite all this research to develop vaccines and treatments for Covid-19, the majority of UK research facilities carried out significantly less research than usual due to the various national lockdowns. Therefore, the 2020 figures cannot be reasonably compared with previous statistics.

All organisations are committed to the ‘3Rs’ of replacement, reduction and refinement. This means avoiding or replacing the use of animals where possible; minimising the number of animals used per experiment and optimising the experience of the animals to improve animal welfare. However, as institutions expand and conduct more research, the total number of animals used can rise even if fewer animals are used per study.

All organisations listed are signatories to the Concordat on Openness on Animal Research in the UK, a commitment to be more open about the use of animals in scientific, medical and veterinary research in the UK. More than 120 organisations have signed the Concordat including UK universities, medical research charities, research funders, learned societies and commercial research organisations.

Wendy Jarrett, Chief Executive of Understanding Animal Research, which developed the Concordat on Openness, said:

"Animal research has been essential to the development and safety testing of lifesaving COVID-19 vaccines and treatments. Macaque monkeys and ferrets have been used to develop vaccines, including the Oxford / AstraZeneca vaccine, hamsters are being used to develop treatments, and guinea pigs are used to quality-check each batch of vaccines.

"Animal testing provided scientists with initial data that the vaccines were effective and safe enough to move into human clinical trials. During these trials, thousands more humans than animals were used to test how effective and safe the vaccines were in people. The pandemic has led to increased public interest in the way vaccines and medicines are developed and UAR has worked with research institutions and funding bodies throughout the UK to develop resources that explain to the public how animals have been used in this critical research."

University of Cambridge Establishment Licence Holder Dr Martin Vinnell said:

“Animal research currently plays an essential role in our understanding of health and disease and in the development of modern medicines and surgical techniques. Without the use of animals, we would not have many of the modern medicines, antibiotics, vaccines and surgical techniques we take for granted in both human and veterinary medicine.

“We always aim to use as few animals as possible, refining our research and actively looking for ways of replacing their use, for example in the development of ‘mini-organs’ grown from human cells, which can be used to model disease.”

Adapted from a press release by Understanding Animal Research.

  Find out more

A team in the University of Cambridge’s Department of Engineering is developing implantable devices to bypass nerve damage and restore movement to paralysed limbs.

“Our aim is to make muscles wireless by intercepting electrical signals from the brain before they enter the damaged nerve and sending them directly to the target muscles via radio waves,” says Sam Hilton, a Research Assistant in the team.

The procedure has been tested and refined in computer simulations, and on cells grown in the lab. But before it can be tested in humans there is another important step: testing its safety in living rats. To avoid testing in animals entirely would place untenable risk on the first human recipients of this new device. All the experiments are carefully designed to ensure that just enough animals are used to produce convincing data, without using more than necessary.
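
One standard way of determining "just enough animals" is a statistical power calculation carried out before the experiment. The sketch below is a hypothetical illustration of that general approach, not the team's actual protocol: the normal-approximation formula for a two-group comparison gives roughly n = 2(z₁₋α⁄₂ + z₁₋β)² / d² animals per group for a standardised effect size d, significance level α and power 1 − β.

```python
from math import ceil
from statistics import NormalDist

def animals_per_group(effect_size: float, alpha: float = 0.05,
                      power: float = 0.8) -> int:
    """Normal-approximation sample size for a two-group comparison.

    effect_size is Cohen's d: the expected group difference divided by
    the common standard deviation. Values here are illustrative only.
    """
    z = NormalDist().inv_cdf
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 / effect_size ** 2
    return ceil(n)

# A large expected effect (d = 1.0) at the conventional 5% significance
# level and 80% power needs about 16 animals per group; halving the
# expected effect roughly quadruples the requirement.
print(animals_per_group(1.0))
```

Designing to the smallest group size that still gives adequate power is exactly the trade-off the paragraph above describes: enough animals for convincing data, no more.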

By working out how complex microelectronics can interface with living tissue in a very precise and controlled way, this work has potential to improve or restore movement in patients suffering severe nerve damage - improving their quality of life and easing the burden on our healthcare services.

The ten organisations in Great Britain that carry out the highest number of animal procedures – those used in medical, veterinary and scientific research - have today released their annual statistics.


Climate changed the size of our bodies and, to some extent, our brains

Latest Research in Cambridge - Thu, 08/07/2021 - 10:21

An interdisciplinary team of researchers, led by the Universities of Cambridge and Tübingen, has gathered measurements of body and brain size for over 300 fossils from the genus Homo found across the globe. By combining this data with a reconstruction of the world’s regional climates over the last million years, they have pinpointed the specific climate experienced by each fossil when it was a living human.

The study reveals that the average body size of humans has fluctuated significantly over the last million years, with larger bodies evolving in colder regions. Larger size is thought to act as a buffer against colder temperatures: less heat is lost from a body when its mass is large relative to its surface area. The results are published today in the journal Nature Communications.
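
The buffering effect described above can be illustrated with simple geometry (a toy model added here, treating a body as a sphere of uniform density): surface area grows with the square of the radius while mass grows with the cube, so the surface-to-mass ratio, and with it relative heat loss, falls as bodies get larger.

```python
from math import pi

def surface_to_volume(radius: float) -> float:
    """Surface-area-to-volume ratio of a sphere; for uniform density
    this is proportional to the surface-to-mass ratio."""
    surface = 4 * pi * radius ** 2
    volume = (4 / 3) * pi * radius ** 3
    return surface / volume  # simplifies to 3 / radius

# Doubling linear body size halves the relative surface area
# available for heat loss.
small, large = surface_to_volume(1.0), surface_to_volume(2.0)
print(small / large)  # 2.0
```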

Our species, Homo sapiens, emerged around 300,000 years ago in Africa. The genus Homo has existed for much longer, and includes the Neanderthals and other extinct, related species such as Homo habilis and Homo erectus.

A defining trait of the evolution of our genus is a trend of increasing body and brain size; compared to earlier species such as Homo habilis, we are 50% heavier and our brains are three times larger. But the drivers behind such changes remain highly debated.

“Our study indicates that climate - particularly temperature - has been the main driver of changes in body size for the past million years,” said Professor Andrea Manica, a researcher in the University of Cambridge’s Department of Zoology who led the study.

He added: “We can see from people living today that those in warmer climates tend to be smaller, and those living in colder climates tend to be bigger. We now know that the same climatic influences have been at work for the last million years.”

The researchers also looked at the effect of environmental factors on brain size in the genus Homo, but correlations were generally weak. Brain size tended to be larger when Homo was living in habitats with less vegetation, like open steppes and grasslands, but also in ecologically more stable areas. In combination with archaeological data, the results suggest that people living in these habitats hunted large animals as food - a complex task that might have driven the evolution of larger brains.

“We found that different factors determine brain size and body size – they’re not under the same evolutionary pressures. The environment has a much greater influence on our body size than our brain size,” said Dr Manuel Will at the University of Tübingen, Germany, first author of the study.

He added: “There is an indirect environmental influence on brain size in more stable and open areas: the amount of nutrients gained from the environment had to be sufficient to allow for the maintenance and growth of our large and particularly energy-demanding brains.”

This research also suggests that non-environmental factors were more important for driving larger brains than climate, prime candidates being the added cognitive challenges of increasingly complex social lives, more diverse diets, and more sophisticated technology.

The researchers say there is good evidence that human body and brain size continue to evolve. The human physique is still adapting to different temperatures, with on average larger-bodied people living in colder climates today. Brain size in our species appears to have been shrinking since the beginning of the Holocene (around 11,650 years ago). The increasing dependence on technology, such as an outsourcing of complex tasks to computers, may cause brains to shrink even more over the next few thousand years.

“It’s fun to speculate about what will happen to body and brain sizes in the future, but we should be careful not to extrapolate too much based on the last million years because so many factors can change,” said Manica.

This research was funded by the European Research Council and the Antarctic Science Platform.

Reference

Will, M. et al: ‘Different environmental variables predict body and brain size evolution in Homo.’ Nature Communications, July 2021. DOI: 10.1038/s41467-021-24290-7

The average body size of humans has fluctuated significantly over the last million years and is strongly linked to temperature. Colder, harsher climates drove the evolution of larger body sizes, while warmer climates led to smaller bodies. Brain size also changed dramatically but did not evolve in tandem with body size.

Image: Human fossils illustrating the variation in brain (skulls) and body size (thigh bones) during the Pleistocene.


The text in this work is licensed under a Creative Commons Attribution 4.0 International License. Images, including our videos, are Copyright ©University of Cambridge and licensors/contributors as identified.  All rights reserved. We make our image and video content available in a number of ways – as here, on our main website under its Terms and conditions, and on a range of channels including social media that permit your use and sharing of our content under their respective Terms.

YesNews type: News

Climate changed the size of our bodies and, to some extent, our brains

Research in Medicine - Thu, 08/07/2021 - 10:21

An interdisciplinary team of researchers, led by the Universities of Cambridge and Tübingen, has gathered measurements of body and brain size for over 300 fossils from the genus Homo found across the globe. By combining this data with a reconstruction of the world’s regional climates over the last million years, they have pinpointed the specific climate experienced by each fossil when it was a living human.

The study reveals that the average body size of humans has fluctuated significantly over the last million years, with larger bodies evolving in colder regions. Larger size is thought to act as a buffer against colder temperatures: less heat is lost from a body when its mass is large relative to its surface area. The results are published today in the journal Nature Communications.

Our species, Homo sapiens, emerged around 300,000 years ago in Africa. The genus Homo has existed for much longer, and includes the Neanderthals and other extinct, related species such as Homo habilis and Homo erectus.

A defining trait of the evolution of our genus is a trend of increasing body and brain size; compared to earlier species such as Homo habilis, we are 50% heavier and our brains are three times larger. But the drivers behind such changes remain highly debated.

“Our study indicates that climate - particularly temperature - has been the main driver of changes in body size for the past million years,” said Professor Andrea Manica, a researcher in the University of Cambridge’s Department of Zoology who led the study.

He added: “We can see from people living today that those in warmer climates tend to be smaller, and those living in colder climates tend to be bigger. We now know that the same climatic influences have been at work for the last million years.”

The researchers also looked at the effect of environmental factors on brain size in the genus Homo, but correlations were generally weak. Brain size tended to be larger when Homo was living in habitats with less vegetation, like open steppes and grasslands, but also in ecologically more stable areas. In combination with archaeological data, the results suggest that people living in these habitats hunted large animals as food - a complex task that might have driven the evolution of larger brains.

“We found that different factors determine brain size and body size – they’re not under the same evolutionary pressures. The environment has a much greater influence on our body size than our brain size,” said Dr Manuel Will at the University of Tübingen, Germany, first author of the study.

He added: “There is an indirect environmental influence on brain size in more stable and open areas: the amount of nutrients gained from the environment had to be sufficient to allow for the maintenance and growth of our large and particularly energy-demanding brains.”

This research also suggests that non-environmental factors were more important for driving larger brains than climate, prime candidates being the added cognitive challenges of increasingly complex social lives, more diverse diets, and more sophisticated technology.

The researchers say there is good evidence that human body and brain size continue to evolve. The human physique is still adapting to different temperatures, with on average larger-bodied people living in colder climates today. Brain size in our species appears to have been shrinking since the beginning of the Holocene (around 11,650 years ago). The increasing dependence on technology, such as the outsourcing of complex tasks to computers, may cause brains to shrink even more over the next few thousand years.

“It’s fun to speculate about what will happen to body and brain sizes in the future, but we should be careful not to extrapolate too much based on the last million years because so many factors can change,” said Manica.

This research was funded by the European Research Council and the Antarctic Science Platform.

Reference

Will, M. et al: ‘Different environmental variables predict body and brain size evolution in Homo.’ Nature Communications, July 2021. DOI: 10.1038/s41467-021-24290-7

The average body size of humans has fluctuated significantly over the last million years and is strongly linked to temperature. Colder, harsher climates drove the evolution of larger body sizes, while warmer climates led to smaller bodies. Brain size also changed dramatically but did not evolve in tandem with body size.


Rare genetic variants confer largest increase in type 2 diabetes risk seen to date

Latest Research in Cambridge - Wed, 07/07/2021 - 12:10

Type 2 diabetes is thought to be driven in part by inherited genetic factors, but many of the genes involved remain unknown. Previous large-scale studies have depended on efficient ‘array genotyping’ methods to measure genetic variations across the whole genome. This approach typically does a good job at capturing the common genetic differences between people, though individually these each confer only small increases in diabetes risk.

Recent technical advances have allowed more comprehensive genetic measurement by reading the complete DNA sequences of over 20,000 genes that code for proteins in humans. Proteins are essential molecules that enable our bodies to function. In particular, this new approach has allowed for the first time a large-scale approach to study the impact of rare genetic variants on several diseases, including type 2 diabetes.

By looking at data from more than 200,000 adults in the UK Biobank study, researchers from the Medical Research Council (MRC) Epidemiology Unit at the University of Cambridge used this approach to identify genetic variants associated with the loss of the Y chromosome. This is a known biomarker of biological ageing that occurs in a small proportion of circulating white blood cells in men and indicates a weakening in the body’s cellular repair systems. This biomarker has been previously linked to age-related diseases such as type 2 diabetes and cancer.

In results published today in Nature Communications, the researchers identified rare variants in the gene GIGYF1 that substantially increase susceptibility to loss of the Y chromosome, and also increase an individual’s risk of developing type 2 diabetes six-fold. In contrast, common variants associated with type 2 diabetes confer much more modest increases in risk, typically much lower than two-fold.

Around 1 in 3,000 individuals carries such a GIGYF1 genetic variant. Their risk of developing type 2 diabetes is around 30%, compared to around 5% in the wider population. In addition, people who carried these variants had other signs of more widespread ageing, including weaker muscle strength and more body fat.
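The six-fold figure follows directly from the two risks quoted above, and the carrier frequency puts the finding in population context. A short check using only the article's numbers:

```python
# Figures quoted in the article (approximate)
carrier_freq = 1 / 3000   # ~1 in 3,000 people carry a GIGYF1 variant
risk_carrier = 0.30       # ~30% risk of type 2 diabetes in carriers
risk_population = 0.05    # ~5% risk in the wider population

relative_risk = risk_carrier / risk_population
print(f"Relative risk: {relative_risk:.0f}-fold")           # 6-fold
print(f"Carriers per 100,000 people: {carrier_freq * 100_000:.0f}")
```

So although the variant's effect is large, only a few dozen people per hundred thousand carry it, which is why such variants are invisible to array genotyping of common variation and require sequencing to find.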

GIGYF1 is thought to control insulin and cell growth factor signalling. The researchers say their findings identify this as a potential target for future studies to understand the common links between metabolic and cellular ageing, and to inform future treatments.

Dr John Perry, from the MRC Epidemiology Unit and a senior author on the paper, said: “Reading an individual’s DNA is a powerful way of identifying genetic variants that increase our risk of developing certain diseases. For complex diseases such as type 2 diabetes, many variants play a role, but often only increasing our risk by a tiny amount. This particular variant, while rare, has a big impact on an individual’s risk.”

Professor Nick Wareham, Director of the MRC Epidemiology Unit, added: “Our findings highlight the exciting scientific potential of sequencing the genomes of very large numbers of people. We are confident that this approach will bring a rich new era of informative genetic discoveries that will help us better understand common diseases such as type 2 diabetes. By doing this, we can potentially offer better ways to treat – or even to prevent – the condition.”

Ongoing research will aim to understand how the loss of function variants in GIGYF1 lead to such a substantial increase in the risk of developing type 2 diabetes. Their future research will also examine other links between biomarkers of biological ageing in adults and metabolic disorders.

The research was funded by the Medical Research Council. UK Biobank is supported by Wellcome, the Medical Research Council, British Heart Foundation, Cancer Research UK, Department of Health, Northwest Regional Development Agency and the Scottish Government.

Reference
Zhao, Y. et al. GIGYF1 loss of function is associated with clonal mosaicism and adverse metabolic health. Nature Communications 2021; 07 Jul 2021; DOI: 10.1038/s41467-021-24504-y

Scientists at the University of Cambridge have identified rare genetic variants – carried by one in 3,000 people – that have a larger impact on the risk of developing type 2 diabetes than any previously identified genetic effect.


Autistic individuals may be more likely to use recreational drugs to self-medicate their mental health

Latest Research in Cambridge - Thu, 01/07/2021 - 23:30

There is significant debate about substance use among autistic adolescents and adults. Some studies indicate that autistic individuals are less likely to use substances, whereas others suggest that autistic individuals are at greater risk of substance misuse or abuse. The team at the Autism Research Centre in Cambridge used a ‘mixed methods’ design to consider both the frequency of substance use among autistic individuals, as well as their self-reported experiences of substance use.

Overall, 1,183 autistic and 1,203 non-autistic adolescents and adults (aged 16-90 years) provided information about the frequency of their substance use via an anonymous, online survey; of this group, 919 individuals also gave more in-depth responses about their experiences of substance use.

Autistic adults were less likely than non-autistic peers to use substances. Only 16% of autistic adults, compared to 22% of non-autistic adults, reported drinking on three or more days per week on average. Similarly, only 4% of autistic adults reported binge-drinking compared to 8% of non-autistic adults.

There were also some sex differences in patterns of substance use: autistic males were less likely than non-autistic males to report ever having smoked or used drugs. In contrast, the team did not find differences in the patterns of frequency of smoking or drug use between autistic and non-autistic females.

However, despite lower rates of substance use overall, the qualitative findings of the study provide a much less hopeful picture: autistic adults were nearly nine times more likely than non-autistic peers to report using recreational drugs (such as marijuana, cocaine and amphetamines) to manage unwanted symptoms, including autism-related symptoms.
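"Nine times more likely" in studies of this design is typically an odds ratio from a 2x2 table of counts. The sketch below shows how such a figure is computed; the counts are hypothetical, chosen only to illustrate the arithmetic, and are not the study's data.

```python
def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Odds ratio from a 2x2 contingency table of counts."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# Hypothetical counts for illustration only -- not from the paper:
# suppose 90 of 1,183 autistic respondents and 11 of 1,203
# non-autistic respondents reported self-medicating with drugs.
or_est = odds_ratio(90, 1183 - 90, 11, 1203 - 11)
print(round(or_est, 1))  # 8.9, i.e. "nearly nine times more likely"
```

Note that an odds ratio overstates a risk ratio when the outcome is common, which is one reason careful reporting distinguishes the two.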

Drugs were used to reduce sensory overload, help with mental focus, and provide routine, among other reasons. Several autistic participants also indirectly referenced using substances to mask their autism. Past research has shown that this behavioural management (also known as ‘camouflaging’ or ‘compensating’) has been linked to emotional exhaustion, worse mental health, and even increased risk of suicide among autistic adults.

Autistic adolescents and adults were also over three times more likely than others to report using substances to manage mental health symptoms, including anxiety, depression, and suicidal thoughts. Several participants specifically noted that they used drugs for self-medication. However, this self-medication was not always viewed as negative by participants, and several noted that using recreational drugs allowed them to reduce the doses of prescribed medications for mental health conditions, which was a welcome change due to the sometimes significant side effects from their prescribed medications.

Another area of concern was the strong association between vulnerability and substance use among autistic teenagers and adults. Previous work from the Cambridge team suggests that autistic adults may be much more likely to have adverse life experiences and be at greater risk of suicide than others. The findings of the new study indicate that autistic individuals are over four times more likely to report vulnerability associated with substance use compared to their non-autistic peers, including dependence/addiction, using drugs to deal with past trauma, and substance use associated with suicide.

In addition, the study identified two new areas of vulnerability not previously reported: being forced, tricked, or accidentally taking drugs; and childhood use of substances (at the age of 12 years or younger).

Elizabeth Weir, a PhD student at the Autism Research Centre in Cambridge, and the lead researcher of the study, said: “Whether or not the substances currently classed as ‘recreational’ could be used medically remains an open question. It is evident that the current systems of health and social care support are not meeting the needs of many autistic teenagers and adults.

“No one should feel that they need to self-medicate for these issues without guidance from a healthcare professional. Identifying new forms of effective support is urgent considering the complex associations between substance use, mental health, and behaviour management—particularly as camouflaging and compensating behaviours are associated with suicide risk among autistic individuals.”

Dr Carrie Allison, Director of Research Strategy at the Autism Research Centre and a member of the research team, said: “While some of our results suggest lower likelihood of substance use overall, physicians should not assume that their autistic patients aren’t using drugs. Drug use can be harmful so healthcare providers should aim to establish trusting relationships with autistic and non-autistic patients alike to foster frank and honest conversations about substance use.”

Professor Simon Baron-Cohen, Director of the Autism Research Centre and a member of the team, said: “We continue to see new areas in which autistic adults experience vulnerability: mental health, physical health, suicide risk, lifestyle patterns, the criminal justice system, and so on. Substance use is now another area that we need to consider when developing new forms of support for autistic individuals. It is essential that we ensure that autistic people have equal access to high quality social and healthcare that can appropriately support their specific needs; and, unfortunately, it seems clear that our current systems are still not meeting this mark.”

The research was funded by the Autism Research Trust, Rosetrees Trust, Cambridge and Peterborough NHS Foundation Trust, Corbin Charitable Trust, Medical Research Council, Wellcome and the Innovative Medicines Initiative.

Reference
Weir, E., Allison, C., & Baron-Cohen, S. Understanding the substance use of autistic adolescents and adults: a mixed methods approach. The Lancet Psychiatry (2021).

While autistic individuals are less likely to use substances, those who do so are more likely to self-medicate for their mental health symptoms, according to new research from the University of Cambridge and published today in The Lancet Psychiatry.
