Every DNA counts – and we would know

The National Measurement Laboratory at LGC turned 30 years old this year, and to celebrate we’ve been looking back at notable accomplishments and taking stock of where we are now. Clinical measurement is one field where our scientists have excelled and innovated throughout that time.

Clinical measurement “is the development, use, on-going support, and maintenance of technology for diagnosing, aiding or treating patients.” Modern medicine wouldn’t be possible if we couldn’t rely on the accuracy of clinical tests and diagnosis. Poor measurement can lead to misdiagnosis, incorrect prescription and dosage of medicine, or false interpretation of data. Therefore, reliable certified reference materials are absolutely necessary to ensure the quality and accuracy of clinical measurement.

Throughout the last 30 years, the National Measurement Laboratory (NML) at LGC has worked in this area to ensure that testing methods and reference materials are of the highest quality.

In one case study from 2006¹, scientists in the NML developed isotope dilution mass spectrometry (IDMS) methodologies based on liquid chromatography (LC-MS/MS), which were then used to assign reference values to clinical reference materials (CRMs), including creatinine in frozen serum and testosterone in frozen serum.

In another blog post, we outlined the work we’ve done to improve Alzheimer’s diagnosis, which could lead to techniques for earlier diagnosis of the disease, and in another, we illustrate the importance of harmonising newborn blood spot screening tests to ensure infants are diagnosed and treated early so that they can live as normal a life as possible.

An important part of working in the field of clinical medicine and measurement is sharing our knowledge with other scientists and medical professionals to ensure that good measurement is performed consistently across the board. We have worked with the NHS and England’s Chief Scientific Officer Sue Hill on doing just that as part of the Knowledge Transfer Partnership Programme, which aims to improve patient care through new approaches to measurement.

And now, our scientists can even count DNA and measure changes to that DNA over time. Identification and targeting of specific genetic sequences forms the basis of many promising advanced healthcare solutions, such as precision (personalised) medicine in cancer, gene therapies for genetic disorders in children, and the detection of pathogenic and non-pathogenic bacteria in a wide spectrum of infectious and autoimmune diseases.

However, the new methods and technologies currently being developed will only achieve their full potential if we can ensure they are safe and can be reproduced. High accuracy reference methods are one of the key factors in supporting their development into routine application.

Using tests that guide treatment of colorectal cancer as a model, our scientists outlined in a paper published in Clinical Chemistry how a range of dPCR assays and platforms compared and how precisely they measured the cancer mutation. An inter-laboratory study of clinical and National Measurement Institute laboratories demonstrated the reproducibility of the selected method. Together these results reveal the unprecedented accuracy of dPCR for measuring the copy number concentration of a frequently occurring gene mutation used to decide on drug treatment.

This study has shown that using high-accuracy dPCR measurements can support the traceable standardisation, translation and implementation of molecular diagnostic procedures that will advance precision medicine.
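The Poisson statistics that underpin dPCR quantification can be sketched in a few lines of Python. This is a minimal illustration with made-up partition counts and volume, not the assay workflow from the study:

```python
import math

def dpcr_concentration(positive_partitions, total_partitions, partition_vol_ul):
    """Estimate target copies per microlitre from digital PCR counts.

    dPCR splits a sample into thousands of partitions and scores each one
    as positive or negative for the target. Because copies distribute
    randomly across partitions, the mean copies per partition follows a
    Poisson law: lambda = -ln(fraction of negative partitions).
    """
    negatives = total_partitions - positive_partitions
    if negatives == 0:
        raise ValueError("All partitions positive: sample too concentrated")
    lam = -math.log(negatives / total_partitions)  # mean copies per partition
    return lam / partition_vol_ul                  # copies per microlitre

# Illustrative numbers: 10,000 partitions of 0.85 nL each, 4,000 positive
concentration = dpcr_concentration(4000, 10000, 0.00085)  # ~600 copies/uL
```

Because the count is derived directly from partition statistics rather than a calibration curve, dPCR can yield copy number concentrations traceable without external standards, which is what makes it attractive as a reference method.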

All of this just goes to show you how far we’ve come in 30 years!

¹VAM Bulletin, Issue 35, Autumn 2006, p. 13, ‘Case Study 3: IDMS certification of clinical reference materials using LC-MS/MS’

Nanotechnology: The big challenge behind the characterization of the small

Nanomaterials and nanotechnology developments are having an increasingly significant impact on human life, from enabling more targeted cancer treatments to improving the efficacy of vaccines or the delivery of agrochemicals. However, their small size can lead to potentially toxic effects.

To protect human health and the environment, it is crucial that we are able to characterise nanomaterials effectively and understand their behaviour within biological systems. What do we really know about the potential effects when they come into contact with complex matrices and how do we ensure that nanoproducts are safe?

The global market for nanomaterials was valued at $14.7 billion in 2015, according to Allied Market Research, and some reports forecast that it will grow to as much as $55 billion by 2022.

We know that the properties of nanomaterials can change significantly when used in complex matrices, such as biological systems, potentially affecting functionality and behaviour. Nanobiotechnology or nanomedical applications exploit these changes. For example, in some therapeutic applications, protein coated nanoparticles (apolipoprotein E coatings) can target specific locations, such as the brain.

However, there may be other currently unknown biological interactions which could pose a potential risk to human health. These risks are compounded by a lack of robust methods to characterise nanomaterials in complex biological matrices.

As the NML we have been instrumental in developing new international documentary standards (ISO) to support this field. For example, we provided expert input into a newly released Technical Specification (ISO TS 19590:2017) that outlines a novel method (single particle inductively coupled plasma-mass spectrometry, spICP-MS) for determining the size distribution and concentration of nanoparticles in aqueous samples. We’ve been invited to provide the UK expert view for a new standard on the analysis of nano-objects using a gentle separation technique (field flow fractionation, ISO TS 21362).

These standards have been produced as a response to the worldwide demand for suitable methods for the detection and characterisation of nanoparticles in food and consumer products. In addition, we provided the particle size reference measurements for a new silica reference material (ERM-FD101b) released this year by the European Commission (EC JRC Directorate F: Health, Consumers and Reference Materials). This material will support the implementation of the EC definition of ‘nanomaterial’.

The NML is co-ordinating the first international measurement comparison study between National Measurement Institutes (under the auspices of the CCQM) on the determination of number concentration of nanoparticles (colloidal gold). An interlaboratory comparison using the same material that is open to industrial and academic laboratories with an interest in nanoparticle analysis will be run in parallel through VAMAS (Versailles Project on Advanced Materials and Standards) in collaboration with NPL. This will allow a comparative evaluation across users and measurement institutes and may lead to the development of new international written standards to support regulation around nanoparticles.

LGC’s involvement supporting the development of nanotechnology regulation, and the underpinning standardisation efforts required at both a national and international level, recognises both the individual expertise of our scientists and our reputation in this field.

Our input will help ensure current and future consumer safety and ultimately protect human health and the environment whilst supporting the growth and development of this enabling technology.

You can read more about the work we do in our Annual Review, and have a look through our case studies to learn about our impact.

A4I is back for another round!

Analysis for Innovators is back! The latest round of the A4I programme from Innovate UK and its partners (LGC, NPL, NEL, & STFC) has now opened, with up to £3M available in total for Round 3.

In our role as the National Measurement Laboratory, we have worked with Innovate UK since the very start of A4I, back in January 2017, and the programme has proved such a success that we are already moving on to the third round!

But be quick to take advantage of this opportunity, as the first stage of the application closes at noon on 6th September. A4I is a unique programme from Innovate UK – it helps UK businesses address difficult problems that limit their productivity and competitiveness. The scope is very wide (chemical, physical, biological and computing) but the problems must be of a measurement or analysis nature.

A4I targets real industry problems that haven’t been solved with existing products or services. As such, it is of interest to companies that have not traditionally considered applying for funding. Businesses of any size, with any type of measurement or analysis problem, are eligible to apply. If your company makes it past the first stage, you will be matched with us, NPL, NEL or STFC for a consultation. After this stage, some companies will continue to work with us in our own world-class measurement labs.

The first two rounds of the A4I programme have seen us help several companies overcome measurement problems. In Round 1, we worked with the Coconut Collaborative, a manufacturer of coconut yoghurt, and STFC to develop a rapid and robust screening method to detect rancid coconut cream before its use. The use of rancid cream led to lost sales and waste for the company. We helped develop a novel screening approach with multi-spectral imaging, which will help the Coconut Collaborative avoid annual costs of £500k.

We also worked with Sistemic to help ensure the safety of cell therapy products by increasing the sensitivity of their novel technology for detecting contamination. Cell therapies are seen as the future of treatment in a number of areas, including diabetes and cardiovascular disease. However, one type of cell used to generate cell therapy products (pluripotent stem cells, or PSCs) has the potential to form tumours. The NML enhanced the sensitivity and specificity of Sistemic’s novel prototype miRNA assay to the levels required for market (<10 cells per million). This assay will ensure producers can accurately assess PSC contamination in their products.

Other examples of the companies that were funded under A4I Round 1 can be found at Analysis for Innovators winning projects, and for more information about the work and case studies of the NML at LGC, have a look here at our latest annual review.

And don’t forget to apply now – there’s £3 million up for grabs!

Peanut allergen quantification: a tough nut to crack

As part of the National Measurement Laboratory’s 30th anniversary, we’re sharing stories and case studies from the last three decades.

One of our case studies touches on an issue that affects hundreds of thousands of people across the UK alone: peanut allergies. So read on to learn how LGC scientists developed a unique allergen quality control material, which can be used to help protect the people in the UK with a peanut allergy and also help to prevent contamination in the food production process, potentially saving the food industry millions of pounds.

The problem

The prevalence of peanut allergy has nearly doubled in Europe over the past two decades and it now affects around 500,000 people in the UK [1]. Peanut allergy is the most common cause of fatal food allergic reactions. It occurs when the immune system mistakenly identifies peanut proteins as something harmful. The fear of accidental exposure in food reduces the quality of life of peanut allergy sufferers and severely limits the social habits of allergic individuals, their families and even their friends.

It is not only those with peanut allergies who have to worry about the risk of allergic reactions or death by anaphylaxis; it also creates problems for businesses. Testing for allergen proteins in food is difficult, as samples usually contain a lot of protein and it can be difficult to separate the allergen protein of interest. This has an impact on the ability of manufacturers and suppliers to adequately label their goods and also has implications for defining threshold levels and detecting food fraud.

All food companies throughout the EU are required by law to declare major allergens, including peanut, when they are included in food products as ingredients. The current labelling rules, brought into force in December 2014 by European Regulation 1169/2011 (the EU Food Information for Consumers Regulation, EU FIC), ensure that all consumers are given highlighted information about the use of allergenic ingredients in pre-packed food [2]. This is to make it easier for people with food allergies to identify the foods they need to avoid. The EU FIC also extends to food sold loose or served when eating out. Prevention of cross contamination with peanut through product testing, validation and verification of cleaning, and checking of ‘peanut-free’ products requires exacting testing.

ELISA (enzyme-linked immunosorbent assay), PCR (polymerase chain reaction) and mass spectrometry (MS) methods can be used to detect food allergens, but there are problems obtaining reliable quantitative results with all three. Prior to this project, there were no suitable reference materials available in the form of a food matrix, making it difficult for laboratories and test-kit manufacturers to validate quantitative methods for allergen measurement.

The solution

A quality control (QC) material that is a real food, containing a known amount of specific allergen protein, and is stable and homogenous could assist laboratories in the validation and monitoring of their analysis. Consequently, a project was undertaken by LGC to develop a food matrix peanut allergen QC material.

The chosen matrix was a chocolate dessert product developed for low-dose threshold studies in food allergic individuals in the European research project ‘EuroPrevall’. Two QC materials were prepared by University of Manchester researchers in the form of chocolate dessert product pastes designed to be reconstituted with water before analysis. One material (LGCQC1011) was prepared as a peanut-free negative control and the other material (LGCQC1012) was prepared as a positive control with the addition of light roast, partially defatted peanut flour (a commercial food ingredient) to give a peanut protein content of 10 mg kg⁻¹. The pastes were transferred to LGC, packaged in nitrogen-flushed sealed sachets to aid stability, and the units were numbered sequentially in fill order. LGC assessed and proved their homogeneity and stability, underpinned by a validation study of the test method using a commercially available ELISA kit (Romer AgraQuant® Peanut kit). The National Measurement System funded the ELISA kit validation studies, and a Technology Strategy Board and LGC co-funded research and development project established the design and production of the QC material.

Impact

Failures in food allergen management mean that allergen-related incidents are the most common reason for product withdrawals and recalls in the United Kingdom, according to the UK Food Standards Agency. The 34 recalls related to allergens in 2010 were estimated to cost stakeholders £10–15 million. In 2013, the number of Allergy Alerts issued to withdraw food or drink products had risen to 47.

Phil Goodwin, MD of Bio-Check (UK), a food allergen test kit manufacturer, has worked in this area for 30 years and welcomes LGC’s recent initiatives:

“The science of food allergen detection, let alone quantitation, has failed to move forward anything like quickly enough since it began in the late 1980s. The emergence of such high quality QC materials as are being produced by LGC is a significant step forward to a time when all commercial test kits can be demonstrated to show good agreement on allergen levels. LGC are to be applauded for taking on this difficult challenge and I urge all allergen kit producers and analysts to use the material to improve their products and results.”

 

[1] http://www.mrc.ac.uk/news-events/publications/outputs-outcomesand-impact-of-mrc-research-2013-14/

[2] http://allergytraining.food.gov.uk/english/rules-and-legislation/

This blog first appeared as a NML case study on the LGC Group website. To learn more about the NML, visit their site here.

What’s funny about your honey?

Ensuring the safety and authenticity of the food we eat is of paramount importance, and there is a growing focus, at both the EU and global level, on the quality control of food to protect the health and safety of consumers. And during the National Measurement Laboratory’s thirty years, we’ve done a lot of work to support reliable measurements in food testing and authentication.

Honey is known to have multiple health and nutritional benefits and is in high demand among consumers. It is defined as the natural sweet substance produced by bees, and there is significant regulation around the composition and labelling of honey in order to protect consumers from food fraud. However, due to the declining numbers of bees, the impact of weather conditions on supply and the high costs of production, honey is expensive. This makes it a prime target for economically motivated food fraud.

Some research suggests that humans began to hunt for honey 8,000 years ago, and the oldest known honey remains, dating back to between 4,700 and 5,500 years ago, were discovered in clay vessels inside a tomb in the country of Georgia.

The ancient Egyptians used honey to sweeten dishes and to embalm the dead, while the ancient Greeks actually practised beekeeping so much that laws were passed about it. Honey was prevalent around the ancient world, being used in ancient India, China, Rome and even among the Mayans. It even plays a role in many religions, representing the food of Zeus, an elixir of immortality, and a healing substance.

And just like any other important product, fraudsters have been faking it since it’s been in use. Ancient Greeks and Romans both mention honey adulteration, and back in 1889, Dr Harvey W. Wiley testified in front of Congress that it was the most adulterated product in the U.S.

Honey is still one of the most adulterated food products globally, with a report last year citing that more than 14% of tested samples were adulterated.

There are two types of food fraud associated with honey: adulteration and fraudulent labelling. Honey adulteration typically involves substituting cheaper sweeteners, such as high fructose corn syrup or cane or beet sugar syrup, for honey. Fraudulent labelling occurs because honeys from a particular geographic or botanical source, such as Manuka, command premium prices amongst consumers.

Detecting these types of fraud presents a significant measurement challenge for food regulators: adulterated products show very similar physical and chemical properties to pure honey, and mis-labelled products are, in fact, pure honey, just of lower quality. Several reports indicate that more Manuka honey is being sold than could actually be produced, which illustrates how often lower quality honeys are passed off as premium ones in order to maximise profit.

During our thirty years as the National Measurement Laboratory (NML) for chemical and bio-measurement, our scientists have conducted several reviews and studies of methods for detecting honey fraud¹. For instance, nearly forty years ago, scientists began to use stable carbon isotope ratio mass spectrometry (IR-MS) to detect high fructose corn syrup in honey. As our scientists found², it is possible to identify food fraud in honey using IR-MS, which measures small but observable variations in the ratios of the two stable isotopes of carbon (C-13 and C-12). Sugars, although chemically identical, have a different isotopic signature depending on the way in which the plant processes carbon dioxide. As the majority of honey-source plants use a different pathway from the plants whose sugars are typically used as honey adulterants, it is possible to detect adulteration using IR-MS. The specific geography of the plants also plays a role in the isotopic fingerprint, and IR-MS can be used to determine where honeys originated.
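The underlying calculation is simple enough to sketch: isotope ratios are reported as δ¹³C values in per mil relative to the VPDB standard. A minimal illustration in Python, where the sample ratio values below are typical textbook figures rather than data from our studies:

```python
R_VPDB = 0.0111802  # commonly cited 13C/12C ratio of the VPDB standard

def delta13C(r_sample):
    """delta-13C in per mil relative to the VPDB standard."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

# C3 plants (most honey-source plants) cluster around -25 per mil,
# while C4-derived sugars such as corn syrup sit nearer -10 per mil.
honey_like = delta13C(0.0109)   # ~ -25: consistent with genuine honey
syrup_like = delta13C(0.01106)  # ~ -11: suggests C4 sugar adulteration
```

In practice, classification uses validated thresholds and comparison against the honey’s own protein fraction, but the per-mil scale above is the common currency that makes inter-laboratory comparisons like the 2016 study possible.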

However, for these types of measurements to be robust and reliable in detecting food fraud across the supply chain, the comparability of results is critical. To support this, LGC co-ordinated an international comparison study in 2016 for isotope ratios in honey involving 6 national measurement institutes (NMIs) and 6 expert laboratories (contacted via the Forensic Isotope Ratio Mass Spectrometry (FIRMS) Network), and the results between participants showed good comparability.

Demonstrating the comparability of isotope ratio measurements is crucial to detecting many types of food fraud and supporting food authenticity claims, of which honey is just one example. The international study coordinated by LGC demonstrates the measurement framework is in place to support food fraud regulation in the future.

 

1 D. Thorburn Burns, Anne Dillon, John Warren, and Michael J. Walker, 2018, A Critical Review of the Factors Available for the Identification and Determination of Mānuka Honey, Food Analytical Methods, https://doi.org/10.1007/s12161-018-1154-9.

2 Helena Hernandez, “Detection of adulteration of honey: Application of continuous-flow IRMS”, VAM Bulletin, 1999, Vol 18, pp 12-14.

Food Safety Week and beyond: LGC’s long history in food testing

Food Safety Week, organised by the UK’s Food Standards Agency, is an opportunity to learn more about current food issues, including food crime, compliance and food hygiene. This year’s campaign celebrates “the people who protect your plate” – the workers who ensure the UK public can trust the food they eat, including inspectors, local authorities, and public analysts.

Also at the forefront of the fight for food safety are chemists, who analyse food, drinks and supplements to ensure manufacturers can verify the safety of their food products.

tea_world food day

The original Government Laboratory plaque and tea samples.

Consumers trust that when they buy food and drink, they are getting exactly what they’ve been told they are getting.  Each food has a distinct composition, much like its own fingerprint, and with the right expertise and tools, it’s possible to study these foods to determine their authenticity.  LGC has been involved in food testing for over 175 years. In fact, it’s the very reason we were established. In 1842, the Board of Excise needed a scientific authority to see that goods, like tea, tobacco and spirits, were not adulterated for profit, and so it created the Government Laboratory.

The Government Chemist role was created in 1909, to ensure the Laboratory of the Government Chemist could work independently of the Inland Revenue department (which provided staff to the Laboratory) and the Board of Customs and Excise (which controlled it). Nowadays the Government Chemist oversees the statutory function of referee analyst, resolving disputes over analytical measurements, particularly in relation to food regulatory enforcement.

As LGC grew, so did our role in food and feed testing. Not only do we act as the referee analyst for disputes in the food industry, we also provide products and solutions for food safety-related issues.

In order for food producers to know with certainty that their food is authentic, it’s necessary to compare what they’ve produced with a known and verified version of the food – this is called a reference material, or standard.  Currently, we have over 15,000 reference materials for food analysis, for everything from allergens, contaminants, and toxins to food flavourings, dyes and proteins, and much more.

Chemists also study new methods of authenticating foods, including mass spectrometry, which is considered the gold standard in analysis, especially when combined with chromatography. Mass spectrometers measure the mass-to-charge ratios of ionised molecules, building up an elemental and molecular ‘fingerprint’ of a sample. The tools and expertise of the National Measurement Laboratory at LGC allow our measurement scientists to quantify the content of a sample down to one part per quadrillion. In other words, we can detect one lump of sugar dissolved in a bay. These capabilities allow us to work on specific projects, tailoring our research to benefit many different sectors and solve specific problems.

This was particularly evident in a recent case study of selenium in food products and supplements. It is essential that the correct amount and species of selenium are present for fortified food products and supplements to be safe for human consumption. Selenium-enriched foods and supplements have become more prominent in Europe as the continent has moved to using more wheat that is naturally low in selenium.

However the accurate measurement of total selenium in food and food supplements presents analytical challenges due to the complex nature of food samples. Furthermore, selenium speciation analysis presents additional challenges due to the low levels of each specific selenium species and the molecular complexity of such samples.

LGC’s measurement research team for inorganic mass spectrometry has extensive experience in selenium speciation and was able to develop and characterise a range of reference materials, including a matrix selenium-enriched wheat flour standard, to support the food industry.

With over 175 years in the food testing arena, we have a lot to say about the subject, so if you want to learn more, head over to our website where you can read case studies and learn about our reference materials.

You can also join us at next week’s Government Chemist Conference, where we will be discussing current food safety issues at length, including Brexit, food authenticity, and food regulation, with many experts in their fields, including the FSA themselves. Visit the conference website to view the entire programme and register.

The National Measurement Laboratory turns 30!

In 1988, Government Chemist Alex Williams, seeing the need for improved quality of analytical measurements, initiated and launched the Valid Analytical Measurement (VAM) programme to develop a chemical measurement infrastructure in the UK.

This programme would go on to evolve into the National Measurement Laboratory for chemical and bio-measurement. The UK was one of the pioneers within the global measurement community to recognise the need to address the new and developing challenges of measurement across chemistry and biology.

An article from the early VAM bulletins (1989).

That means 2018 marks the NML’s 30th birthday and kicks off our ‘Year of Measurement’. It is an opportunity to celebrate the importance of measurement science (‘metrology’) and to join the upcoming Festival of Measurement, which launches in September and runs through May 2019.

In our thirty-year history of performing measurements to support the UK, we’ve experienced a lot of growth, seen big changes in the challenges we’ve been set and made some major breakthroughs. We’ve asked (and answered) a lot of questions, like ‘What are the best methods for detecting the adulteration of honey?’ or ‘Is the computer a friend or foe?’ (The answer is ‘friend’…or ‘both’ if you’ve invested heavily in encyclopaedias.)

We’ve already outlined in a recent blog post how important accurate measurement is, affecting everything from food and drink to medicine. Accurate and precise measurement is the foundation of public health and safety. But it’s also just as important to the economy.  In 2009, it was estimated that £622 billion of the UK’s total trade relied on measurement in some way, meaning that measurement plays a role in nearly every aspect of our lives.

Our Chief Scientific Officer, Derek Craston, agrees that good measurement is crucial to economies: “In my role, I am fortunate to be able to see the major benefits that chemical and biological measurements make to the prosperity of companies and the lives of individuals across areas as broad as clinical diagnosis, drug development, environmental protection and food security. Indeed, in a global economy, with complex supply chains and regulatory frameworks, it is hard to see how many markets could function without it.”

We’re proud of the work we’ve done as the National Measurement Laboratory, where our work supports manufacture and trade, protects consumers and enhances quality of life. And over the next few months, we plan to share stories and case studies from our thirty years at the forefront of measurement with you, as well as look forward to the next thirty years.

World Metrology Day: Setting the standard for measurement

This Sunday, 20th May, is World Metrology Day, the anniversary of the signing of the Metre Convention on 20 May 1875 (and pretty much the best day of the year for measurement scientists like us). This convention set the framework for global collaboration in the science of measurement (metrology). Its aim – to ensure we use uniform measurements across the globe – remains as important for industry, commerce and society today as it was over 140 years ago.

Measurement is present in everything: from food and drink safety to the efficacy of pharmaceuticals, from diagnosis and detection of disease to navigation, from air and water quality to forensics. Mobile phones and computers run on accurate measurement, and if you’ve ever had to consistently reset a clock, it was inaccurate measurement that was annoying you.

As the National Measurement Laboratory (designated for chemical and bio-measurement), LGC forms part of the UK National Measurement System (NMS) that provides the core measurement infrastructure for the UK. The measurements we make support manufacture and trade, protect consumers and enhance quality of life.

Did you know that, for most of human history, measurements have been based on actual physical weights and measures, called artefacts? Humans have been working on measurement standardisation for a long time. The ancient Egyptians used what is widely regarded as the first measurement standard, the cubit, a wooden rod used to determine standard lengths and heights, such as the flood levels of the Nile River. In ancient Babylon, the mina was created and used to measure weight, and the early Chinese civilisation used the chi. Even these standards varied considerably within their own societies, making wider trade and exchange difficult. The Magna Carta in 1215 required that the same set of standards be used throughout the realm. Finally, on 20 May 1875, representatives from seventeen countries signed the Metre Convention, setting out to close gaps and reach uniformity of measurement around the world, laying the framework from which the International System of Units (SI) would eventually grow.

Even now, the kilogram is defined by a cylinder of platinum-iridium alloy that sits in a vault near Paris. The vault is a necessary precaution to ensure the kilogram isn’t damaged, but the last time it was taken out and weighed against its copies, it had apparently lost weight. Think about that. Mass is always calibrated against another officially confirmed mass, but what happens when the official artefact is no longer reliable? Is the artefact the correct weight, or is the copy? Does this mean all of the weights in the world are incorrect?

This could have huge consequences, especially when you consider how integral accurate measurement is to our society, which is why scientists have long been looking for a way to redefine the standards: developing an independent system that means we don’t have to rely on a physical artefact which could be damaged or degraded. And the most logical way to revolutionise metrology is with the unchanging constants of nature.

Scientists have been searching for natural constants – unchanging quantities present in nature that could define each unit and so make accurate measurement reproducible without physical artefacts. The theme for this year’s World Metrology Day is ‘Constant evolution of the International System of Units (SI)’, chosen because this year sees the culmination of that change: the four base units not yet defined in terms of natural constants – the kilogram, the mole, the ampere and the kelvin – are expected to be revised.
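To make the proposed change concrete, here is a minimal Python sketch. The helper functions are illustrative only, not an official realisation of any unit; the numerical values are the CODATA values adopted ahead of the redefinition. Once a constant is fixed exactly, the corresponding unit becomes a derived measurement rather than a comparison against an artefact:

```python
# Constants proposed to be given exact fixed values in the revised SI
# (CODATA values adopted ahead of the redefinition):
PLANCK_H = 6.62607015e-34       # J s    -> redefines the kilogram
ELEMENTARY_E = 1.602176634e-19  # C      -> redefines the ampere
BOLTZMANN_K = 1.380649e-23      # J/K    -> redefines the kelvin
AVOGADRO_NA = 6.02214076e23     # 1/mol  -> redefines the mole

def kelvin_from_energy(energy_joules):
    """With k fixed exactly, temperature is simply thermal energy divided by k."""
    return energy_joules / BOLTZMANN_K

def moles_from_entities(n_entities):
    """With N_A fixed exactly, amount of substance is a pure count divided by N_A."""
    return n_entities / AVOGADRO_NA

# A mean thermal energy of ~4.14e-21 J corresponds to roughly room temperature:
room_temp = kelvin_from_energy(BOLTZMANN_K * 300.0)  # 300.0 K
```

No vault required: anyone, anywhere, with a sufficiently good experiment can realise the unit from the fixed constant.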

The world will come together at the General Conference on Weights and Measures in November 2018 and is expected to agree to this change. If approved, this will be the most radical change to the SI since its inception and it will hopefully improve measurement forever, providing a springboard for future innovation.

So feel free to celebrate this Metrology Day in style!

Alzheimer’s disease diagnosis: the end of the guessing game?

There are currently around 850,000 people living with dementia in the UK, and the number of people affected is expected to reach 2 million by 2051. The costs associated with dementia, estimated now at £26 billion a year, are likely to treble.

Alzheimer’s disease is the most common type of dementia, accounting for between 60 and 80 per cent of diagnoses. There is no known cure, and treatments are limited to preserving cognitive function. Currently there is no non-invasive method for diagnosing Alzheimer’s disease: GPs rely on in-depth cognitive tests, and clinical confidence in diagnosis is typically 70-80%.

If confident early diagnosis could be achieved through non-invasive techniques, treatment could be introduced earlier, delaying the onset of memory impairment.

The solution

The development of plaques or tangles of certain proteins (β-amyloid and tau) in the brain is a known feature of Alzheimer’s disease. It is also known that abnormal accumulation of metals underlies several neurodegenerative diseases; iron, in particular, is associated with the formation of neurofibrillary tangles in β-amyloid plaques. Recent advances in the use of magnetic resonance imaging (MRI) for earlier detection of neurological diseases require validation to ensure the integrity of the images obtained is adequate for diagnostic purposes.

Researchers at LGC, in collaboration with partners, have been working to establish a link between novel MRI scans and quantitative elemental mapping of soft tissues. A method of mapping the levels of iron in sections of the brain using laser ablation (LA) coupled to inductively coupled plasma mass spectrometry (ICP-MS) has been developed, along with a novel calibration strategy and standard to support quantitative tissue imaging. Correlating the metal content associated with β-amyloid protein with MRI images will help diagnose Alzheimer’s disease at an early stage, when preventative therapy will have greater impact.

LGC has developed a novel calibration strategy for LA-ICP-MS that produced quantitative images for iron in whole mouse brain sections (provided through collaboration with King’s College London and the University of Warwick) and compared them with results from micro X-ray fluorescence (μ-XRF) (provided through collaboration with Ghent University and the University of Warwick). The data showed good agreement in total iron concentrations for a selection of areas within the mouse brain sections. This finding supports the proposed method as a quantitative approach; the calibration strategy has been published in the Journal of Analytical Atomic Spectrometry¹.
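The principle behind matrix-matched calibration with internal standardisation can be sketched in a few lines of Python. The numbers and function names below are invented for illustration and are not LGC’s actual data: the analyte signal at each point is ratioed to an internal-standard signal to correct for ablation variation, matrix-matched standards of known concentration define a calibration line, and unknown signals are read off that line:

```python
# Hypothetical matrix-matched standards: (Fe concentration in ug/g,
# Fe-56 counts normalised to the internal-standard counts at the same spot).
standards = [(0.0, 0.02), (50.0, 1.05), (100.0, 2.01), (200.0, 4.03)]

def normalise(analyte_counts, internal_std_counts):
    """Correct for ablation/transport variation by ratioing to the internal standard."""
    return analyte_counts / internal_std_counts

def fit_line(points):
    """Ordinary least-squares slope and intercept for the calibration line."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

def quantify(signal_ratio, slope, intercept):
    """Convert a normalised signal into an iron concentration (ug/g)."""
    return (signal_ratio - intercept) / slope

slope, intercept = fit_line(standards)
# A spot with normalised signal 2.0 reads back as roughly 99 ug/g iron.
```

Repeating the `quantify` step for every ablated spot yields the quantitative iron map described above.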

Impact

The development of this method for quantitative imaging of iron in the brain has the potential to lead to techniques for earlier diagnosis of Alzheimer’s disease, enabling earlier intervention, therapies and treatment aimed at delaying the onset of symptoms.

Delaying the onset of neurodegenerative disorders such as Alzheimer’s disease by five years could halve the number of deaths from the condition, saving 30,000 lives a year and billions of pounds in treatment costs. Reducing severe cognitive impairment in the elderly by 1% per annum would offset all estimated increases in long-term care costs due to our ageing population.

The methodology will also provide a deeper understanding of the early development of Alzheimer’s disease, paving the way for new treatments aimed at preventing it.

Heidi Goenaga-Infante, Principal Scientist for inorganic analysis at LGC, commented: “This cutting-edge research is already proving to be of significant benefit to the validation of non-invasive diagnostic tools for Alzheimer’s disease. The potential for metal imaging mass spectrometry of other biological tissues to probe the reported links between metals and disease states is now a step closer.”

If you’d like to learn more about our work and read other case studies, visit our website.

¹ J. O’Reilly, D. Douglas, J. Braybrook, P.-W. So, E. Vergucht, J. Garrevoet, B. Vekemans, L. Vincze and H. Goenaga-Infante, “A novel calibration strategy for the quantitative imaging of iron in biological tissues by LA-ICP-MS using matrix-matched standards and internal standardisation”, J. Anal. At. Spectrom., 2014, 29, 1378-1384

Delivering impact to support AIDS research

LGC is helping to ensure that research into a cure for HIV is based on sound fundamental measurements.

Over 36 million people currently live with HIV, with approximately 2 million becoming infected each year (WHO 2015). Although HIV can be successfully managed with combination antiretroviral therapy (cART), the therapy must be continued indefinitely as no cure presently exists. This can be challenging in regions with high HIV prevalence, and long-term use can have toxic side effects.

One barrier to curing HIV is the presence of infected host cells that are not targeted by current therapies but lie dormant (the so-called ‘viral reservoir’). These cells have the potential to become re-activated, so novel strategies to cure HIV aim to target this reservoir. To determine whether these new approaches are successful, accurate and robust methods for measuring HIV DNA are required.

The Molecular and Cell Biology team at LGC performs research to support accurate and reliable measurement as part of our National Measurement Laboratory (NML) role. Recent work by NML scientists comparing different molecular methods (qPCR, digital PCR) for quantification of HIV DNA has raised concerns about the calibrator currently most widely used to compare results between HIV clinical studies (8E5, ATCC® CRL-8993): it appears to lose HIV DNA copies during cell growth, potentially producing misleading estimates of how much HIV DNA is present and affecting whether novel strategies towards curing HIV are deemed successful or not.
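To see why a drifting calibrator matters, here is a hedged Python sketch using idealised qPCR arithmetic and made-up numbers, not the NML’s actual analysis. If a standard assumed to contain 1000 HIV DNA copies has in fact dropped to 600, every sample quantified against it is overestimated by the same factor:

```python
import math

SLOPE = -3.32      # Ct change per 10-fold template increase (~100% PCR efficiency)
INTERCEPT = 38.0   # illustrative Ct for a single template copy

def ct(copies):
    """Idealised qPCR response: cycle threshold for a given copy number."""
    return INTERCEPT + SLOPE * math.log10(copies)

def read_against_standard(sample_ct, nominal_copies, actual_copies):
    """Quantify a sample against a standard whose true content has drifted.

    The standard is *assumed* to hold `nominal_copies` but really holds
    `actual_copies` (e.g. a calibrator that has lost HIV DNA during culture).
    """
    standard_ct = ct(actual_copies)  # the Ct the drifted standard produces
    return nominal_copies * 10 ** ((sample_ct - standard_ct) / SLOPE)

# A sample truly containing 1000 copies, read against a standard labelled
# 1000 copies that actually holds only 600, comes back as ~1667 copies:
estimate = read_against_standard(ct(1000), nominal_copies=1000, actual_copies=600)
```

Because digital PCR counts target molecules directly rather than relying on a calibration curve, it can reveal this kind of drift, which is how the instability of the 8E5 standard was detected.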

Based in part on our work, the NIH AIDS Reagent Program, which provides critical reagents and resources to support research in the areas of AIDS therapeutics and vaccine development, has recently highlighted the potential instability of the standard on its reagent database to support the research community and enable the best chances of success.


Citation:

Busby E et al. Instability of 8E5 calibration standard revealed by digital PCR risks inaccurate quantification of HIV DNA in clinical samples by qPCR (2017) Sci Rep 7(1):1209. doi:10.1038/s41598-017-01221-5