Every DNA counts – and we would know

The National Measurement Laboratory at LGC turned 30 years old this year, and to celebrate we’ve been looking back at notable accomplishments and at where we are now. Clinical measurement is one field where our scientists have excelled and innovated throughout those three decades.

Clinical measurement “is the development, use, on-going support, and maintenance of technology for diagnosing, aiding or treating patients.” Modern medicine wouldn’t be possible if we couldn’t rely on the accuracy of clinical tests and diagnoses. Poor measurement can lead to misdiagnosis, incorrect prescription and dosage of medicines, or false interpretation of data. Reliable certified reference materials are therefore absolutely necessary to ensure the quality and accuracy of clinical measurement.

Throughout the last 30 years, the National Measurement Laboratory (NML) at LGC has worked in this area to ensure that testing methods and reference materials are of the highest quality.

In one case study from 2006¹, scientists in the NML developed isotope dilution mass spectrometry (IDMS) methodologies based on liquid chromatography-tandem mass spectrometry (LC-MS/MS), which were used to assign reference values to clinical reference materials (CRMs), including CRMs for creatinine in frozen serum and testosterone in frozen serum.
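
How does IDMS achieve such accuracy? In essence, a known amount of an isotopically enriched ‘spike’ of the analyte is added to the sample, and the result follows from measured isotope ratios rather than absolute signal intensities, which drift far more. As a simplified single-spike illustration (the certified methods apply fuller corrections for isotopic abundances and mass bias):

```latex
% n_x : amount of analyte in the sample
% n_y : amount of isotopically enriched spike added
% R_x, R_y : isotope amount ratios of the sample and the spike
% R_B : measured isotope ratio of the sample/spike blend
n_x = n_y \,\frac{R_y - R_B}{R_B - R_x}
```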

In another blog post, we outlined the work we’ve done to improve Alzheimer’s diagnosis, which could lead to techniques for earlier detection of the disease, and in a third we illustrated the importance of harmonising newborn blood spot screening tests to ensure infants are diagnosed and treated early so that they can lead lives as normal as possible.

An important part of working in the field of clinical medicine and measurement is sharing our knowledge with other scientists and medical professionals to ensure that good measurement is performed consistently across the board. We have worked with the NHS and England’s Chief Scientific Officer Sue Hill on doing just that as part of the Knowledge Transfer Partnership Programme, which aims to improve patient care through new approaches to measurement.

And now, our scientists can even count DNA and measure changes to that DNA over time. Identification and targeting of specific genetic sequences forms the basis of many promising advanced healthcare solutions, such as precision (personalised) medicine in cancer, gene therapies to end genetic disorders in children, and the detection of pathogenic and non-pathogenic bacteria in a wide spectrum of infectious and autoimmune diseases.

However, the new methods and technologies currently being developed will only achieve their full potential if we can ensure they are safe and reproducible. High-accuracy reference methods are one of the key factors in supporting their development into routine application.

Using tests for guiding treatment of colorectal cancer as a model, our scientists outlined in a paper published in Clinical Chemistry how a range of digital PCR (dPCR) assays and platforms compared and how precisely they measured the cancer mutation. An inter-laboratory study involving clinical laboratories and National Measurement Institutes demonstrated the reproducibility of the selected method. Together, these results reveal the unprecedented accuracy of dPCR for measuring the copy number concentration of a frequently occurring gene mutation used to decide on drug treatment.

This study has shown that using high-accuracy dPCR measurements can support the traceable standardisation, translation and implementation of molecular diagnostic procedures that will advance precision medicine.
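
To give a flavour of why dPCR can be so accurate: the sample is divided into thousands of partitions, the fraction of partitions containing the target is counted, and a Poisson correction accounts for partitions that happened to receive more than one copy. A minimal sketch of that calculation (the partition volume here is illustrative, not that of any particular platform):

```python
from math import log

def dpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
    """Estimate target concentration from a digital PCR run.

    Some partitions receive more than one copy, so the mean copies per
    partition is recovered with the Poisson correction
    lambda = -ln(1 - p), where p is the fraction of positive partitions.
    The 0.85 nl partition volume is illustrative only.
    """
    p = positive / total
    lam = -log(1.0 - p)                         # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)   # copies per microlitre

# Example: 7,520 positive partitions out of 20,000
print(f"{dpcr_copies_per_ul(7520, 20000):.0f} copies/ul")
```

Because the result comes from counting partitions rather than reading off a calibration curve, the concentration estimate needs no external calibrant – one reason dPCR is so attractive as a reference method.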

All of this just goes to show you how far we’ve come in 30 years!

¹ VAM Bulletin, Issue 35, Autumn 2006, p. 13, ‘Case Study 3: IDMS certification of clinical reference materials using LC-MS/MS’.

Nanotechnology: The big challenge behind the characterisation of the small

Nanomaterials and nanotechnology developments are having an increasingly significant impact on human life, from enabling more targeted cancer treatments to improving the efficacy of vaccines or the delivery of agrochemicals. However, their small size can lead to potentially toxic effects.

To protect human health and the environment, it is crucial that we are able to characterise nanomaterials effectively and understand their behaviour within biological systems. What do we really know about the potential effects when they come into contact with complex matrices and how do we ensure that nanoproducts are safe?

Allied Market Research estimated the global nanomaterials market to be worth $14.7 billion in 2015, and some reports forecast that figure to grow to as much as $55 billion by 2022.

We know that the properties of nanomaterials can change significantly when used in complex matrices, such as biological systems, potentially affecting functionality and behaviour. Nanobiotechnology and nanomedical applications exploit these changes. For example, in some therapeutic applications, protein-coated nanoparticles (with apolipoprotein E coatings, for instance) can target specific locations, such as the brain.

However, there may be other currently unknown biological interactions which could pose a potential risk to human health. These risks are compounded by a lack of robust methods to characterise nanomaterials in complex biological matrices.

As the NML we have been instrumental in developing new international documentary standards (ISO) to support this field. For example, we provided expert input into a newly released Technical Specification (ISO TS 19590:2017) that outlines a novel method (single particle inductively coupled plasma-mass spectrometry, spICP-MS) for determining the size distribution and concentration of nanoparticles in aqueous samples. We’ve been invited to provide the UK expert view for a new standard on the analysis of nano-objects using a gentle separation technique (field flow fractionation, ISO TS 21362).

These standards have been produced as a response to the worldwide demand for suitable methods for the detection and characterisation of nanoparticles in food and consumer products. In addition, we provided the particle size reference measurements for a new silica reference material (ERM-FD101b) released this year by the European Commission (EC JRC Directorate F (Health, Consumers and Reference Materials)). This material will support the implementation of the EC definition of ‘nanomaterial’.

The NML is co-ordinating the first international measurement comparison study between National Measurement Institutes (under the auspices of the CCQM) on the determination of number concentration of nanoparticles (colloidal gold). An interlaboratory comparison using the same material that is open to industrial and academic laboratories with an interest in nanoparticle analysis will be run in parallel through VAMAS (Versailles Project on Advanced Materials and Standards) in collaboration with NPL. This will allow a comparative evaluation across users and measurement institutes and may lead to the development of new international written standards to support regulation around nanoparticles.
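
As a rough illustration of what a number concentration measurement involves: in spICP-MS, each detected pulse corresponds to a single particle, so the concentration follows from the event count and the volume of sample that actually reached the plasma. A minimal sketch with invented numbers (in practice the transport efficiency is itself determined against a reference material such as colloidal gold):

```python
def spicpms_number_concentration(n_events, flow_ul_min, transport_eff, t_s):
    """Particle number concentration (particles/ml) from an spICP-MS run.

    Each pulse above background is counted as one particle; dividing by
    the volume of sample delivered to the plasma gives the concentration.
    """
    volume_ml = flow_ul_min * 1e-3 * (t_s / 60.0) * transport_eff
    return n_events / volume_ml

# Example: 12,000 pulses in 60 s at 50 ul/min with 5% transport efficiency
print(f"{spicpms_number_concentration(12000, 50, 0.05, 60):.3e} particles/ml")
```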

LGC’s involvement in supporting the development of nanotechnology regulation, and the underpinning standardisation efforts required at both national and international level, reflects both the individual expertise of our scientists and our reputation in this field.

Our input will help ensure current and future consumer safety and ultimately protect human health and the environment whilst supporting the growth and development of this enabling technology.

You can read more about the work we do in our Annual Review, and have a look through our case studies to learn about our impact.

A4I is back for another round!

Analysis for Innovators is back! The latest round of the A4I programme from Innovate UK and its partners (LGC, NPL, NEL, & STFC) has now opened, with up to £3M available in total for Round 3.

In our role as the National Measurement Laboratory, we have worked with Innovate UK since the very start of A4I, back in January 2017, and the programme has proved such a success that we are already moving on to the third round!

But be quick to take advantage of this opportunity, as the first stage of applications closes at noon on 6 September. A4I is a unique programme from Innovate UK – it helps UK businesses address difficult problems that restrict their potential productivity and competitiveness. The scope is very wide (chemical, physical, biological and computing) but the problems must be of a measurement or analysis nature.

A4I targets real industry problems that haven’t been solved with existing products or services. As such, it is of interest to companies that have not traditionally considered applying for funding. Any size of business, with any type of measurement or analysis problem, is eligible to apply. If your company makes it past the first stage, you will be matched with us, NPL, NEL or STFC for a consultation. After this stage, some companies will continue to work with us in our own world-class measurement labs.

The first two rounds of the A4I programme have seen us help several companies overcome measurement problems. In Round 1, we worked with The Coconut Collaborative, a manufacturer of coconut yoghurt, and STFC to develop a rapid and robust screening method to detect rancid coconut cream before its use, since the use of rancid cream led to lost sales and waste for the company. The novel screening approach we helped develop, based on multi-spectral imaging, will help The Coconut Collaborative avoid annual costs of £500k.

We also worked with Sistemic to help ensure the safety of cell therapy products by increasing the sensitivity of their novel technology, which detects contamination in cell therapy products. Cell therapies are seen as the future of treatment in a number of areas, including diabetes and cardiovascular disease. However, one type of cell used to generate cell therapy products (pluripotent stem cells, or PSCs) has the potential to form tumours. The NML enhanced the sensitivity and specificity of Sistemic’s prototype miRNA assay to the levels required for market (<10 cells per million). This assay will ensure producers can accurately assess PSC contamination in their cell therapy products.

Other examples of the companies that were funded under A4I Round 1 can be found at Analysis for Innovators winning projects, and for more information about the work and case studies of the NML at LGC, have a look here at our latest annual review.

And don’t forget to apply now – there’s £3 million up for grabs!

How genotyping is aiding in the fight against malaria

3.2 billion people across 106 countries and territories live in areas at risk of malaria transmission. The serious and sometimes fatal mosquito-borne disease is caused by the Plasmodium parasite – in 2015, malaria caused 212 million clinical episodes and 429,000 deaths.

Malaria has been a public health problem in Brazil ever since it was brought to the region during colonisation. By the 1940s, an estimated six to eight million infections and 80,000 malaria-related deaths occurred every year in the country.

Thanks to a concerted series of malaria control policies, Brazil recorded a 76.8% decrease in malaria incidence between 2000 and 2014 – an effort for which the country was praised by the WHO. In 2014, there were 143,910 microscopically confirmed cases of malaria and 41 malaria-related deaths.

Part of Brazil’s malaria control policy involves the use of primaquine, a medication first made in 1946 to treat and prevent malaria. It is particularly effective against the Plasmodium vivax parasite that is prevalent in Brazil.

Unfortunately, primaquine can induce haemolytic anaemia in glucose-6-phosphate dehydrogenase (G6PD)-deficient individuals and may lead to severe and fatal complications. Around 330 million people worldwide are affected by G6PD deficiency, and recent studies suggest its prevalence could be as high as 10% in Brazil.

Recently, molecular biologists from LGC enabled a cutting-edge study in collaboration with researchers from Brazil and the London School of Hygiene and Tropical Medicine.

The researchers looked for mutations in a sample of 516 male volunteers that could be used as clinical indicators for G6PD deficiency, which can lead to complications in people prescribed primaquine.

Blood samples were collected at hospitals around Brazil during surgeries, and local Brazilian radio stations were used to ask people to come and donate blood.

Needing a fast and efficient way to generate results in high throughput, the team turned to LGC’s integrated genomics toolkit to facilitate the research. Each sample was screened against 24 KASP assays to assess the genetic bases of G6PD deficiency. In combination with the IntelliQube®, a fully automated point-and-click PCR system, the team collected the data in roughly three hours of instrument time and one hour of hands-on time.

KASP is a flexible, highly specific genotyping technology, which can be used to determine SNPs and InDels.  KASP uses unlabelled oligonucleotide primers, which gives the technology a cost advantage and allows more data to be generated, increasing data quality.
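
For illustration, a KASP genotype is read from an endpoint fluorescence plot: a strong FAM signal indicates one allele, a strong HEX signal the other, and samples falling between the two clusters are heterozygous. The toy classifier below mimics that logic for a single well – real analysis software clusters all samples on a plate together, and the thresholds here are invented for the example:

```python
import math

def call_kasp_genotype(fam, hex_, signal_min=0.2):
    """Toy endpoint genotype call from normalised FAM/HEX fluorescence."""
    if fam < signal_min and hex_ < signal_min:
        return "no call"                     # failed or empty well
    theta = math.degrees(math.atan2(hex_, fam))
    if theta < 30:
        return "homozygous, FAM allele"
    if theta > 60:
        return "homozygous, HEX allele"
    return "heterozygous"

for fam, hex_ in [(1.0, 0.1), (0.1, 0.9), (0.6, 0.55), (0.05, 0.04)]:
    print((fam, hex_), "->", call_kasp_genotype(fam, hex_))
```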

The data indicate that approximately one in 23 males from the Alto do Juruá could be G6PD deficient and at risk of haemolytic anaemia if treated with primaquine. The authors conclude that routine G6PD deficiency screening to personalise primaquine administration should be considered – particularly as complete treatment of patients with vivax malaria, using chloroquine and primaquine, is crucial for malaria elimination.

The teams are continuing their collaboration to further research into treatments for malaria, and we can’t wait to see more!

To access the paper, please click here, or to see the IntelliQube in action and learn more about this automated PCR instrument, click here.

 

 

Sources:

Malaria. (2017, July 13). Retrieved August 8, 2017, from https://www.cdc.gov/malaria/about/index.html

Maia, U. M., Batista, D. C., Pereira, W. O., & Fernandes, T. A. A. M. (n.d.). Prevalence of glucose-6-phosphate dehydrogenase deficiency in blood donors of Mossoró, Rio Grande do Norte. Retrieved August 8, 2017, from http://www.scielo.br/scielo.php?pid=S1516-84842010000500017&script=sci_arttext&tlng=en

 

This blog post was originally published on the Biosearch Technologies blog.

Fatbergs: The monster lurking below

If you haven’t been paying attention to sewer-related news throughout the past few years, you might have missed that fatbergs are a thing. Large (sometimes hundreds of metres long), congealed lumps of fat and other substances, fatbergs have been clogging up the sewer systems under major cities like London, Melbourne, Baltimore and Cardiff.

Just a quick Google search of the word ‘fatberg’ turns up a trove of related videos and news that could gross anyone out. Fatbergs now have their own museum exhibition and were even the subject of a prime time documentary, Fatberg Autopsy, which is exactly as captivating and weird as it sounds. And just as our fascination with these grotesque reflections of modern life has grown faster than a fatberg in a sewer, so has our understanding of them.

These beasts begin to form when large amounts of cooking oils, fats and grease are dumped into drains, where they thicken. Adding to the frequency of fatbergs is the increased usage of wet wipes, which don’t break down in drainage pipes but instead team up with the congealed cooking oils to form a monster from a subterranean horror film. Old pipes, or pipes with rough walls where debris can get trapped and build up, are particularly susceptible.

And despite the moniker, documented fatbergs are mostly made up of wet wipes, which account for 93 percent of the material blocking sewers, while actual fat and grease make up only 0.5 percent. In one case, a fatberg in London had grown to weigh as much as a blue whale, the largest animal known to have ever existed.

Studying products of human behaviour, like fatbergs, can provide a lot of insight into how people in these cities live.

Simon Hudson, Technical Director of Sport and Specialised Analytical Services at LGC, has been involved with method development and analysis for many projects looking into identifying the makeup of substances found in public systems, like fatbergs. In addition to analysing samples for Fatberg Autopsy, Simon has also worked with scientists from King’s College London, Guy’s and St Thomas’s NHS Foundation Trust and King’s Health Partners, Hull York Medical School and other institutions to analyse anonymised pooled urine from UK cities.

By using various analytical methods on samples from street urinals, the scientists have been able to provide a geographical trend analysis of the recreational drugs and novel psychoactive substances (NPS) that are being used, showing the most common drugs in specific cities.

Studies on recreational drug use have traditionally been done by self-reported user surveys, which are helpful but flawed if respondents either don’t know what drugs they are taking or don’t disclose everything they’ve used. By analysing samples from urinals, these methods can be used to confirm actual drugs being used and can be particularly useful for public health initiatives in identifying new psychoactive substances that may not have been reported or known to officials yet. It also provides insight into common potential adulterants of drugs.

By taking pooled samples from street urinals near nightclubs and bars, these studies provide a snapshot of what is happening inside the nightlife of UK cities.

Findings include everything from nicotine and caffeine to cocaine, cannabis, ketamine, methamphetamine, anabolic steroids and several uncontrolled psychoactive substances. In one specific study¹, cocaine and 3,4-methylenedioxymethamphetamine (MDMA, ecstasy) were the most common recreational drugs to turn up, while morphine and methadone were detected in seven and six cities, respectively.

Like his analysis of fatbergs, Simon’s work on urine samples provides insight into the hidden aspects of modern life, the things that aren’t talked about over coffee or seen while heading into the office. They’re also valuable in shaping public health knowledge and responses to potential issues.

If you’re interested in learning more about our science, head over to lgcgroup.com or read Simon’s various publications on pooled urine analysis listed below.

 

¹ Archer, J.R.H., S. Hudson, O. Jackson, T. Yamamoto, C. Lovett, H.M. Lee, S. Rao, L. Hunter, P.I. Dargan and D.M. Wood (2015). Analysis of anonymized pooled urine in nine UK cities: variation in classical recreational drug, novel psychoactive substance and anabolic steroid use. QJM: An International Journal of Medicine, 108(12), pp. 929-933.

Other publications:

  1. Archer, J.R.H., P.I. Dargan, S. Hudson, S. Davies, M. Puchnarewicz, A.T. Kicman, J. Ramsey, F. Measham, M. Wood, A. Johnston and D.M. Wood (2013). Taking the Pissoir – a novel and reliable way of knowing what drugs are being used in nightclubs. Journal of Substance Use, 00(0), pp. 1-5.
  2. Archer, J.R.H., P.I. Dargan, H.M.D. Lee, S. Hudson and D.M. Wood (2014). Trend analysis of anonymised pooled urine from portable street urinals in central London identifies variation in the use of novel psychoactive substances. Clinical Toxicology, 52(3), pp. 160-165. DOI: 10.3109/15563650.2014.885982.

Accelerating rice improvement in South Asia

Diversity is the spice of life and is also key to breeding rice that delivers increased yields. Rice is a crucial staple food for about half a billion people in Asia, but it suffers from diseases that reduce yields, destroy harvests and put food security and livelihoods at risk. There is hope, though – by tracking DNA markers of natural genetic variants through generations of crosses, breeders can identify better combinations that enrich crop vitality and resilience, leading to more reliable and sustainable rice production.

A collaborative project between Bangor University and LGC with a university partner in India (SKUAST), a research institute in Pakistan (NIBGE) and, in Nepal, a government research centre (NARC) and a private seed company (Anamolbiou) is addressing this challenge and has already identified over a million new markers in rice. These markers can reveal linkage to genes and patterns of diversity that help rice breeders select for a wide range of resistance genes to improve many different varieties. The project continues to develop these markers into more KASP assays that will be made available in publicly searchable databases.

Modern disease-resistant varieties are not always well adapted to specific environments, so breeders aim to incorporate markers for both biotic and abiotic stress resistance, as well as yield components, into locally accepted varieties that may already possess value traits, such as aroma. Molecular markers such as Simple Sequence Repeats (SSRs) in rice were developed in the 1990s for marker-assisted selection (MAS), and these are still used by some rice breeders in Asia to improve selection efficiency. Smaller breeding companies do not have all the resources (e.g. trained personnel, instrumentation for extraction or genotyping) to use such markers in-house. They can benefit from a service-based approach, such as LGC Genomics’ genotyping service using KASP technology, which offers a lower cost per data point and is faster to implement than running markers in their own lab. KASP assays also offer greater sensitivity, speed and safety than older techniques, such as SSRs, when carried out in breeders’ own labs.

The collaboration with Bangor University and partners has already developed new methods to identify suitable SNP and InDel markers that can replace existing SSRs in target breeding crosses, and these have been adopted by Nepalese breeders. Now, a broader survey of suitable SNP and InDel markers, across a set of 130 publicly available rice genome sequences selected for geographic diversity, is discovering novel markers that are relevant to both Indica and Japonica rice backgrounds.

Before the research team started this project, breeders had a choice of 2,055 useful KASP assays, depending on their breeding strategy, but the project has increased the choice to over 245,000 potential markers that should benefit a wider range of rice breeding programs. This increase in the number of KASP assays enables the project and the research community to utilize KASP technology on a scale that was previously available only to big breeding companies. These are exciting times for rice breeding!

Bangor University and partners plan to make thousands of the rice markers from this project available in the form of a searchable database so that rice breeders can easily find the most suitable options to replace their target SSRs in existing programs or to identify the appropriate loci for a range of possible new crosses. LGC will also offer them as validated KASP assays on its website. The large database of validated KASP assays produced by this project will thus give rice breeders the ability to carry out genomic selection (GS) with many thousands of loci across their populations, enabling smaller breeders to benefit from the same genomic scale technologies that generally require significant resource investment to develop on their own. The availability of this marker set to the public sector, and the services provided by LGC Genomics, will enable rice breeders of all sizes to apply genomic tools to accelerate their MAS and GS breeding programs to develop new rice varieties that will improve food security.
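
To make ‘genomic selection’ concrete, here is a minimal sketch of its core calculation: marker effects are estimated jointly by ridge regression (the heart of rrBLUP-style GS) from a training set of lines that have both genotypes and phenotypes, and the fitted effects are then used to predict breeding values for new, unphenotyped candidates. All numbers below are simulated purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated training set: 200 lines scored at 1,000 SNP markers
# (0/1/2 copies of the alternate allele) with measured yields.
n_lines, n_markers = 200, 1000
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)
y = X @ rng.normal(0, 0.1, n_markers) + rng.normal(0, 1.0, n_lines)

# Ridge regression: shrink all marker effects jointly.
lam = 50.0
Xc, yc = X - X.mean(axis=0), y - y.mean()
beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(n_markers), Xc.T @ yc)

# Genomic estimated breeding values (GEBVs) for unphenotyped candidates.
candidates = rng.integers(0, 3, size=(5, n_markers)).astype(float)
gebv = (candidates - X.mean(axis=0)) @ beta
print("GEBVs for 5 candidate lines:", np.round(gebv, 2))
```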

To learn more about our KASP genotyping services click here.

 

This blog originally appeared on the Biosearch Technologies blog.

Peanut allergen quantification: a tough nut to crack

As part of the National Measurement Laboratory’s 30th anniversary, we’re sharing stories and case studies from the last three decades.

One of our case studies touches on an issue that affects hundreds of thousands of people across the UK alone: peanut allergies. Read on to learn how LGC scientists developed a unique allergen quality control material, which can be used to help protect people in the UK with a peanut allergy and to prevent contamination in the food production process, potentially saving the food industry millions of pounds.

The problem

The prevalence of peanut allergy has nearly doubled in Europe over the past two decades and it now affects around 500,000 people in the UK [1]. Peanut allergy is the most common cause of fatal food allergy reactions. It occurs when the immune system mistakenly identifies peanut proteins as something harmful. The fear of accidental exposure in food reduces the quality of life of peanut allergy sufferers and severely limits the social habits of allergic individuals, their families and even their friends.

It is not only those with peanut allergies who have to worry about the risk of allergic reactions or death by anaphylaxis; the allergen also creates problems for businesses. Testing for allergen proteins in food is difficult, as samples usually contain a lot of protein and it can be difficult to separate out the allergen protein of interest. This has an impact on the ability of manufacturers and suppliers to adequately label their goods, and also has implications for defining threshold levels and detecting food fraud.

All food companies throughout the EU are compelled by law to declare major allergens, including peanut, if they are included in food products as ingredients. The current labelling rules, brought into force in December 2014 by European Regulation 1169/2011 (the EU Food Information for Consumers Regulation, EU FIC), ensure that all consumers are given highlighted information about the use of allergenic ingredients in pre-packed food [2]. This is to make it easier for people with food allergies to identify the foods they need to avoid. The EU FIC also extends to food sold loose or served when eating out. Prevention of cross-contamination with peanut through product testing, validation and verification of cleaning, and checking of ‘peanut-free’ products requires exacting testing.

ELISA (enzyme-linked immunosorbent assay), PCR (polymerase chain reaction) and mass spectrometry (MS) methods can be used to detect food allergens, but there are problems obtaining reliable quantitative results with all three. Prior to this project, there were no suitable reference materials available in the form of a food matrix, making it difficult for laboratories and test-kit manufacturers to validate quantitative methods for allergen measurement.
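
To see where quantification gets difficult, it helps to know how a quantitative ELISA result is usually produced: absorbances for a series of calibrants are fitted with a four-parameter logistic (4PL) curve, and sample readings are converted to concentrations by inverting that curve. A minimal sketch with invented calibrant values (any real kit defines its own calibrants and acceptance criteria):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4PL curve: a = zero-dose response, d = infinite-dose response,
    c = inflection point (EC50), b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Invented absorbances for peanut protein calibrants (mg/kg)
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
od = np.array([0.08, 0.15, 0.28, 0.55, 0.85, 1.10])
params, _ = curve_fit(four_pl, conc, od, p0=[0.02, 1.0, 5.0, 1.4], maxfev=10000)

def conc_from_od(y, a, b, c, d):
    """Invert the 4PL to read a sample concentration off the curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(f"sample at OD 0.60 ~ {conc_from_od(0.60, *params):.1f} mg/kg")
```

A matrix QC material with a known allergen content lets a laboratory check that this whole chain – extraction, antibody binding, curve fit – returns the right answer in a real food.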

The solution

A quality control (QC) material that is a real food, contains a known amount of a specific allergen protein, and is stable and homogeneous could assist laboratories in the validation and monitoring of their analyses. Consequently, LGC undertook a project to develop a food matrix peanut allergen QC material.

The chosen matrix was a chocolate dessert product developed for low-dose threshold studies in food allergic individuals in the European research project ‘EuroPrevall’. Two QC materials were prepared by University of Manchester researchers in the form of chocolate dessert product pastes designed to be reconstituted with water before analysis. One material (LGCQC1011) was prepared as a peanut-free negative control and the other material (LGCQC1012) was prepared as a positive control with the addition of light roast, partially defatted peanut flour (a commercial food ingredient) to give a peanut protein content of 10 mg kg⁻¹. The pastes were transferred to LGC, packaged in nitrogen-flushed sealed sachets to aid stability, and the units were numbered sequentially in fill order. LGC assessed and proved their homogeneity and stability, underpinned by a validation study of the test method using a commercially available ELISA kit (Romer AgraQuant® Peanut kit). The National Measurement System funded the ELISA kit validation studies, and a Technology Strategy Board and LGC co-funded research and development project established the design and production of the QC material.

Impact

According to the UK Food Standards Agency, failures in food allergen management make allergen-related incidents the most common reason for product withdrawals and recalls in the United Kingdom. The 34 recalls related to allergens in 2010 were estimated to cost stakeholders £10-15 million. In 2013, the number of Allergy Alerts issued to withdraw food or drink products had risen to 47.

Phil Goodwin, MD of Bio-Check (UK), a food allergen test kit manufacturer, has worked in this area for 30 years and welcomes LGC’s recent initiatives:

“The science of food allergen detection, let alone quantitation, has failed to move forward anything like quickly enough since it began in the late 1980s. The emergence of such high quality QC materials as are being produced by LGC is a significant step forward to a time when all commercial test kits can be demonstrated to show good agreement on allergen levels. LGC are to be applauded for taking on this difficult challenge and I urge all allergen kit producers and analysts to use the material to improve their products and results.”

 

[1] http://www.mrc.ac.uk/news-events/publications/outputs-outcomesand-impact-of-mrc-research-2013-14/

[2] http://allergytraining.food.gov.uk/english/rules-and-legislation/

This blog first appeared as an NML case study on the LGC Group website. To learn more about the NML, visit their site here.

Revolutionising cancer treatment one Array at a time

While the science of pharmacogenomics has been around for years, its popularity is picking up steam as precision medicine and individualised patient treatment become more and more commonplace in the medical world. Geneticists and doctors are fully embracing the fact that our individual genes make us all unique and that these genes hold clues to how each patient’s body will metabolise medications.

Pharmacogenetics, or the study of how people respond differently to medicines due to their genetics, is making a splash lately thanks to companies like Minneapolis, MN-based OneOme, which co-developed its RightMed test with Mayo Clinic. The company collects a patient’s DNA sample using a simple cheek swab, which is then analysed at OneOme’s lab with PCR – in this case on LGC’s IntelliQube® – to determine the patient’s genetics. This information is then used to determine whether the patient has any genetic variations that may cause them to react in a certain way to a medication. These results give doctors “graphic genetic pinpoint accuracy” on the medications that should work and those likely to be less effective. In simplest terms, these tests, combined with PCR instruments, are empowering patients and doctors with information that may make their lives not only better but also safer. Or as we like to say, science for a safer world.

Take a look at just how much pharmacogenomics is impacting and “revolutionizing” patient care by watching the video here, or visit our website.

 

This story was originally published on the Biosearch Technologies blog.

Food chain resilience in a changing world

A few weeks ago, we were joined by experts and industry leaders at our biennial Government Chemist Conference, and this year’s theme was ‘Food chain resilience in a changing world’.

Attendees were treated to a variety of presentations about food chain resilience from the Food Standards Agency, Public Health England, the European Commission’s Joint Research Centre, Cambridge University and many others.

Topics ranged from food crime to genome sequencing and genetics, as well as preparing the food industry for Brexit and systems for fighting fraud.

Among some of the popular topics discussed were meat speciation techniques and food authenticity, which underline current issues surrounding consumer trust in food manufacturing.

Methods for detecting trace amounts of undeclared ingredients in food have evolved enormously in recent years, but incidents still occur. Recent reports suggest that some ‘meat-free’ ready meals have even contained trace amounts of meat, although the exact amount and method of transfer have yet to be determined.

Any food used as an ingredient in a pre-packed processed product (i.e. in the ‘recipe’) must be declared in the list of ingredients. Adventitious meat cross-contamination below 1% isn’t generally regarded as deliberate fraud. But even below this ‘cut-off’ point there are implications for consumer choice, especially for consumers avoiding meat altogether (vegetarian or vegan preferences) or avoiding specific meat species for religious reasons.

When ‘trace’ amounts of a material are found in food, this suggests adventitious cross-contamination (which could result from inadequate cleaning of equipment, for example) rather than intentional adulteration. Particularly with foods that contain many ingredients, like ready meals, the contamination could come from any of the ingredients at any point along the supply chain.

This makes the methodology of detection all the more important, as each technique has its own level of accuracy. For instance, Polymerase Chain Reaction (PCR) screens for the absence or presence of specific DNA within a defined limit of detection, which requires the scientist to know what to look for; care is required in carrying out these tests and in interpreting the results. Meanwhile, Next Generation Sequencing (NGS) detects and sequences all DNA material in a sample, which allows for a greater understanding of the makeup of foods. Once sequencing is complete, millions of sequences can be analysed to identify species, but this method is more expensive and can be resource-intensive.
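
As a toy illustration of the NGS approach, a speciation pipeline classifies each sequencing read against reference sequences and reports the proportion per species. The eight-base ‘barcodes’ below are invented stand-ins for real mitochondrial references such as COI or cytochrome b, against which millions of reads would actually be aligned:

```python
from collections import Counter

# Invented stand-ins for real species reference sequences
REFERENCE_BARCODES = {
    "ACGTTGCA": "Bos taurus (beef)",
    "TTGACGGA": "Sus scrofa (pork)",
    "GGCATCAT": "Equus caballus (horse)",
}

def speciate(reads):
    """Count reads per species by exact match; unmatched reads are kept
    visible so low-level contamination is not silently dropped."""
    counts = Counter(REFERENCE_BARCODES.get(r, "unclassified") for r in reads)
    total = sum(counts.values())
    return {species: n / total for species, n in counts.items()}

reads = ["ACGTTGCA"] * 960 + ["TTGACGGA"] * 8 + ["AAAAAAAA"] * 32
for species, fraction in speciate(reads).items():
    print(f"{species}: {fraction:.1%}")
```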

These are just two examples of methods used recently to determine authenticity, but there isn’t uniformity in methods and standards around the world. Now that we are more globally focussed than ever before, in both trade and knowledge sharing, there should be more harmonisation among techniques used in different places. Ingredients might cross several borders before becoming food, being processed, tested and analysed to different standards along the way. It’s important that we have robust systems in place to ensure that food standards and methods for measurement are equivalent and that all food is both safe and exactly what it claims to be.

And many of the speakers and attendees of the GC Conference are working toward that goal, sharing their expertise on sound science, building systems for detection of fraud, and enforcing stronger regulations for food safety.

Watch the video from the conference, or you can learn more about the speakers and see their presentations here.

What’s funny about your honey?

Ensuring the safety and authenticity of the food we eat is of paramount importance, and there is growing concern, at both the EU and global level, with ensuring the quality control of food to protect the health and safety of consumers. During the National Measurement Laboratory’s thirty years, we’ve done a lot of work to support reliable measurements in food testing and authentication.

Honey is known to have multiple health and nutritional benefits and is in high demand among consumers. It is defined as the natural sweet substance produced by bees, and there is significant regulation around the composition and labelling of honey in order to protect consumers from food fraud. However, due to the declining numbers of bees, the impact of weather conditions on supply and the high costs of production, honey is expensive. This makes it a prime target for economically motivated food fraud.

Some research suggests that humans began to hunt for honey 8,000 years ago, and the oldest known honey remains, dating back 4,700 to 5,500 years, were discovered in clay vessels inside a tomb in the country of Georgia.

The ancient Egyptians used honey to sweeten dishes and to embalm the dead, while the ancient Greeks actually practised beekeeping so much that laws were passed about it. Honey was prevalent around the ancient world, being used in ancient India, China, Rome and even among the Mayans. It even plays a role in many religions, representing the food of Zeus, an elixir of immortality, and a healing substance.

And just like any other important product, fraudsters have been faking honey for as long as it has been in use. Ancient Greeks and Romans both mention honey adulteration, and back in 1889, Dr Harvey W. Wiley testified before Congress that it was the most adulterated product in the U.S.

Honey is still one of the most adulterated food products globally, with a report last year citing that more than 14% of tested samples were adulterated.

There are two types of food fraud associated with honey: adulteration and fraudulent labelling. Honey adulteration typically occurs by replacing some of the honey with cheaper sweeteners such as high fructose corn syrup or cane or beet sugar syrup. Fraudulent labelling occurs because honeys from a particular geographic or botanical source, such as Manuka, command premium prices amongst consumers.

Detecting these types of fraud presents a significant measurement challenge for food regulators: adulterated products show very similar physical and chemical properties to pure honey, and mis-labelled products are, in fact, pure honey, just of lower quality. Several reports indicate that more Manuka honey is being sold than is actually produced, which illustrates how often lower quality honeys are passed off as premium ones in order to maximise profit.

During our thirty years as the National Measurement Laboratory (NML) for chemical and bio-measurement, our scientists have conducted several reviews and studies of methods for detecting honey fraud¹. For instance, nearly forty years ago, scientists began to use stable carbon isotope ratio mass spectrometry (IRMS) to detect high fructose corn syrup in honey. As our scientists found², it is possible to identify food fraud in honey using IRMS, which measures small but observable variations in the ratio of the two stable isotopes of carbon (C-13 and C-12). Sugars, although chemically identical, have a different isotopic signature depending on the way in which the plant processes carbon dioxide. As the majority of honey-source plants use a different photosynthetic pathway (C3) from the plants whose sugars are typically used as adulterants (C4 plants such as corn and sugar cane), it is possible to detect adulteration using IRMS. The specific geography of the plants also plays a role in the isotopic fingerprint, and IRMS can be used to determine where honeys originated.
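
As a sketch of how the numbers work: δ13C values are reported in ‘per mil’ relative to the VPDB standard, and the widely used internal-standard approach compares the honey with its own extracted protein, whose isotopic signature an adulterant cannot easily shift. The constants below follow the commonly cited AOAC-style formula but are illustrative rather than a statement of any certified method:

```python
def delta13c_permil(r_sample, r_vpdb=0.0111802):
    """delta-13C in per mil, using a commonly quoted VPDB 13C/12C ratio."""
    return (r_sample / r_vpdb - 1.0) * 1000.0

def apparent_c4_sugar_pct(delta_protein, delta_honey, delta_c4=-9.7):
    """Internal-standard check: honey and its protein fraction should share
    the same delta-13C, so a honey pulled towards the C4-sugar value
    (~ -9.7 per mil for corn/cane syrups) yields an apparent % added sugar.
    """
    return (delta_protein - delta_honey) / (delta_protein - delta_c4) * 100.0

# Example: protein at -25.1 per mil but whole honey at -22.0 per mil
print(f"apparent C4 sugar: {apparent_c4_sugar_pct(-25.1, -22.0):.1f}%")
```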

However, for these types of measurements to be robust and reliable in detecting food fraud across the supply chain, the comparability of results is critical. To support this, LGC co-ordinated an international comparison study in 2016 for isotope ratios in honey, involving six national measurement institutes (NMIs) and six expert laboratories (contacted via the Forensic Isotope Ratio Mass Spectrometry (FIRMS) Network); the results between participants showed good comparability.

Demonstrating the comparability of isotope ratio measurements is crucial to detecting many types of food fraud and supporting food authenticity claims, of which honey is just one example. The international study co-ordinated by LGC demonstrates that the measurement framework is in place to support food fraud regulation in the future.

 

¹ D. Thorburn Burns, Anne Dillon, John Warren and Michael J. Walker (2018). A Critical Review of the Factors Available for the Identification and Determination of Mānuka Honey. Food Analytical Methods, https://doi.org/10.1007/s12161-018-1154-9.

² Helena Hernandez (1999). ‘Detection of adulteration of honey: Application of continuous-flow IRMS’. VAM Bulletin, Vol 18, pp. 12-14.