How genotyping is aiding in the fight against malaria

3.2 billion people across 106 countries and territories live in areas at risk of malaria transmission. The serious and sometimes fatal mosquito-borne disease is caused by the Plasmodium parasite – in 2015, malaria caused 212 million clinical episodes and 429,000 deaths.

Malaria has been a public health problem in Brazil ever since it was brought to the region during its colonization. By the 1940s, an estimated six to eight million infections and 80,000 malaria-related deaths occurred every year in the country.

Due to a concerted series of malaria control policies, Brazil recorded a 76.8% decrease in malaria incidence between 2000 and 2014 – an effort for which the country was praised by the WHO. In 2014, there were 143,910 microscopically confirmed cases of malaria and 41 malaria-related deaths.

Part of Brazil’s malaria control policy involves the use of primaquine, a medication first made in 1946 to treat and prevent malaria. It is particularly effective against the Plasmodium vivax parasite that is prevalent in Brazil.

Unfortunately, primaquine can induce haemolytic anaemia in glucose-6-phosphate dehydrogenase (G6PD)-deficient individuals, which may lead to severe and even fatal complications. An estimated 330 million people worldwide are affected by G6PD deficiency, with recent studies suggesting its prevalence could be as high as 10% in Brazil.

Recently, molecular biologists from LGC enabled a cutting-edge study in collaboration with researchers from Brazil and the London School of Hygiene and Tropical Medicine.

The researchers looked for mutations in a sample of 516 male volunteers that could serve as clinical indicators of G6PD deficiency, which can lead to complications in people prescribed primaquine.

Blood samples were collected at hospitals around Brazil during surgeries, and local Brazilian radio stations broadcast appeals asking people to come and donate blood.

Needing a fast and efficient way to generate results at high throughput, the team turned to LGC’s integrated genomics toolkit to facilitate the research. Each sample was screened against 24 KASP assays to assess the genetic basis of G6PD deficiency. In combination with the IntelliQube®, a fully automated point-and-click PCR system, the team collected the data in roughly three hours of instrument time and one hour of hands-on time.

KASP is a flexible, highly specific genotyping technology that can be used to detect SNPs and InDels. KASP uses unlabelled oligonucleotide primers, which gives the technology a cost advantage and allows more data to be generated, improving data quality.
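For readers curious what a KASP result looks like in practice: each assay reports two fluorescence signals, one per allele, and genotypes are called by seeing which signal (or both) dominates. The sketch below is a deliberately simplified illustration of that idea using assumed dye names (FAM/HEX) and invented thresholds – real KASP analysis software clusters all samples together rather than applying fixed cut-offs.

```python
# Simplified sketch of bi-allelic genotype calling from two-channel
# endpoint fluorescence (FAM reporting allele 1, HEX reporting allele 2).
# The ratio thresholds are illustrative assumptions, not LGC's algorithm.

def call_genotype(fam: float, hex_: float, min_signal: float = 0.2) -> str:
    """Call a genotype from background-corrected FAM/HEX intensities."""
    total = fam + hex_
    if total < min_signal:          # neither dye amplified -> no call
        return "NTC/fail"
    ratio = fam / total             # 1.0 = pure allele 1, 0.0 = pure allele 2
    if ratio > 0.75:
        return "homozygous allele 1"
    if ratio < 0.25:
        return "homozygous allele 2"
    return "heterozygous"

# Four hypothetical samples: strong FAM, strong HEX, mixed, and no signal.
samples = {"s1": (1.9, 0.1), "s2": (0.1, 2.0), "s3": (1.0, 1.1), "s4": (0.05, 0.05)}
calls = {name: call_genotype(f, h) for name, (f, h) in samples.items()}
```

In production, the heterozygous cluster sits between the two homozygous clusters on a 2D fluorescence plot, which is why unlabelled primers plus universal reporter dyes keep the per-sample cost low.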

The data indicate that approximately one in 23 males from the Alto do Juruá could be G6PD deficient and at risk of haemolytic anaemia if treated with primaquine. The authors conclude that routine G6PD deficiency screening to personalize primaquine administration should be considered – particularly as complete treatment of patients with vivax malaria using chloroquine and primaquine is crucial for malaria elimination.

The teams are continuing their collaboration to further research into treatments for malaria, and we can’t wait to see more!

To access the paper, please click here, or to see the IntelliQube in action and learn more about this automated PCR instrument click here.


Sources:

Malaria. (2017, July 13). Retrieved August 8, 2017, from https://www.cdc.gov/malaria/about/index.html

Maia, U. M., Batista, D. C., Pereira, W. O., & Fernandes, Thales Allyrio Araújo de Medeiros. (n.d.). Prevalence of glucose-6-phosphate dehydrogenase deficiency in blood donors of Mossoró, Rio Grande do Norte. Retrieved August 8, 2017, from http://www.scielo.br/scielo.php?pid=S1516-84842010000500017&script=sci_arttext&tlng=en


This blog post was originally published on the Biosearch Technologies blog.

Fatbergs: The monster lurking below

If you haven’t been paying attention to sewer-related news throughout the past few years, you might have missed that fatbergs are a thing. Large (sometimes hundreds of metres long), congealed lumps of fat and other substances, fatbergs have been clogging up the sewer systems under major cities like London, Melbourne, Baltimore and Cardiff.

Just a quick Google search of the word ‘fatberg’ turns up a trove of related videos and news that could gross anyone out. Fatbergs now have their own museum exhibition and were even the subject of a prime time documentary, Fatberg Autopsy, which is exactly as captivating and weird as it sounds. And just as our fascination with these grotesque reflections of modern life has grown faster than a fatberg in a sewer, so has our understanding of them.

These beasts begin to form when large amounts of cooking oils, fats and grease are dumped into drains, where they thicken. Adding to the frequency of fatbergs is the increased use of wet wipes, which don’t break down in drainage pipes but instead team up with the congealed cooking oils to form a monster from a subterranean horror film. Old pipes, or pipes with rough walls where debris can get trapped and build up, are particularly susceptible to fatbergs.

And despite their moniker, documented fatbergs are mostly made up of wet wipes, which account for 93 percent of the material blocking sewers, while actual fat and grease make up only 0.5 percent. In one case, a fatberg in London had grown to weigh as much as a blue whale, the largest animal known to have ever existed.

Studying products of human behaviour, like fatbergs, can provide a lot of insight into how people in these cities live.

Simon Hudson, Technical Director of Sport and Specialised Analytical Services at LGC, has been involved with method development and analysis for many projects looking into identifying the makeup of substances found in public systems, like fatbergs. In addition to analysing samples for Fatberg Autopsy, Simon has also worked with scientists from King’s College London, Guy’s and St Thomas’s NHS Foundation Trust and King’s Health Partners, Hull York Medical School and other institutions to analyse anonymised pooled urine from UK cities.

By using various analytical methods on samples from street urinals, the scientists have been able to provide a geographical trend analysis of the recreational drugs and novel psychoactive substances (NPS) that are being used, showing the most common drugs in specific cities.

Studies on recreational drug use have traditionally been done by self-reported user surveys, which are helpful but flawed if respondents either don’t know what drugs they are taking or don’t disclose everything they’ve used. Analysing samples from urinals instead confirms which drugs are actually being used, and can be particularly useful for public health initiatives in identifying new psychoactive substances that may not yet have been reported or known to officials. It also provides insight into common potential adulterants of drugs.

By taking pooled samples from street urinals near night clubs and bars, these studies provide a snapshot of what is happening inside the night life across UK cities.

Findings include everything from nicotine and caffeine to cocaine, cannabis, ketamine, methamphetamine, anabolic steroids and several uncontrolled psychoactive substances. In one specific study¹, cocaine and 3,4-methylenedioxy–methamphetamine (MDMA, Ecstasy) were the most common recreational drugs to turn up, while morphine and methadone were detected in seven and six cities, respectively.

Like his analysis of fatbergs, Simon’s work on urine samples provides insight into the hidden aspects of modern life, the things that aren’t talked about over coffee or seen while heading into the office. They’re also valuable in shaping public health knowledge and responses to potential issues.

If you’re interested in learning more about our science, head over to lgcgroup.com or read Simon’s various publications on pooled urine analysis listed below.


¹Archer, J.R.H, S. Hudson, O. Jackson, T. Yamamoto, C. Lovett, H.M. Lee, S. Rao, L. Hunter, P.I. Dargan, and D.M. Wood (2015). Analysis of anonymized pooled urine in nine UK cities: variation in classical recreational drug, novel psychoactive substance and anabolic steroid use.  QJM: An International Journal of Medicine. 108(12), pp. 929-933.

Other publications:

  1. J. R. H. Archer, P. I. Dargan, S. Hudson, S. Davies, M. Puchnarewicz, A. T. Kicman, J. Ramsey, F. Measham, M. Wood, A. Johnston, and D. M. Wood (2013). Taking the Pissoir – a novel and reliable way of knowing what drugs are being used in nightclubs. Journal of Substance Use. 00 (0), pp. 1-5.
  2. J. R. H. Archer, P. I. Dargan, H. M. D. Lee, S. Hudson & D. M. Wood (2014) Trend analysis of anonymised pooled urine from portable street urinals in central London identifies variation in the use of novel psychoactive substances, Clinical Toxicology, 52:3, 160-165, DOI: 10.3109/15563650.2014.885982

Accelerating rice improvement in South Asia

Diversity is the spice of life and is also key to breeding rice that delivers increased yields. Rice is a crucial staple food for about half a billion people in Asia, but it suffers from diseases that reduce yields, destroy harvests and put food security and livelihoods at risk. But there is hope – by tracking DNA markers of natural genetic variants through generations of crosses, breeders can identify better combinations that enrich crop vitality and resilience, leading to more reliable and sustainable rice production.

A collaborative project between Bangor University and LGC with a university partner in India (SKUAST), a research institute in Pakistan (NIBGE) and, in Nepal, a government research centre (NARC) and a private seed company (Anamolbiou) is addressing this challenge and has already identified over a million new markers in rice. These can reveal linkage to genes and patterns of diversity that help rice breeders select for a wide range of resistance genes to improve many different varieties. The project continues to develop these markers into more KASP assays that will be made available in publicly searchable databases.

Modern disease-resistant varieties are not always well adapted to specific environments, so breeders aim to incorporate markers for both biotic and abiotic stress resistance as well as yield components into locally accepted varieties that may already possess value traits, such as aroma. Molecular markers such as Simple Sequence Repeats (SSRs) in rice were developed in the 1990s for marker-assisted selection (MAS), and these are still used by some rice breeders in Asia to improve selection efficiency. Smaller breeding companies do not have all the resources (e.g. trained personnel, instrumentation for extraction or genotyping) to use such markers in-house. They can benefit from a service-based approach, such as LGC Genomics’ genotyping service using KASP technology, which offers a lower cost per data point and is faster to implement than running such markers in their own lab. KASP assays also offer greater sensitivity, speed and safety than older techniques, such as SSRs, when carried out in breeders’ own labs.

The collaboration with Bangor University and partners has already developed new methods to identify suitable SNP and InDel markers that can replace existing SSRs in target breeding crosses, and these have been adopted by Nepalese breeders. Now, a broader survey of suitable SNP and InDel markers, across a set of 130 publicly available rice genome sequences selected for geographic diversity, is discovering novel markers that are relevant to both Indica and Japonica rice backgrounds.

Before the research team started this project, breeders had a choice of 2,055 useful KASP assays to draw on, depending on their breeding strategy, but the project has increased that choice to over 245,000 potential markers that should benefit a wider range of rice breeding programs. This increase in the number of KASP assays enables the project and the research community to utilize KASP technology on a scale previously available only to big breeding companies. It’s exciting times for rice breeding!

Bangor University and partners plan to make thousands of the rice markers from this project available in the form of a searchable database so that rice breeders can easily find the most suitable options to replace their target SSRs in existing programs or to identify the appropriate loci for a range of possible new crosses. LGC will also offer them as validated KASP assays on its website. The large database of validated KASP assays produced by this project will thus give rice breeders the ability to carry out genomic selection (GS) with many thousands of loci across their populations, enabling smaller breeders to benefit from the same genomic scale technologies that generally require significant resource investment to develop on their own. The availability of this marker set to the public sector, and the services provided by LGC Genomics, will enable rice breeders of all sizes to apply genomic tools to accelerate their MAS and GS breeding programs to develop new rice varieties that will improve food security.

To learn more about our KASP genotyping services click here.


This blog originally appeared on the Biosearch Technologies blog.

Peanut allergen quantification: a tough nut to crack

As part of the National Measurement Laboratory’s 30th anniversary, we’re sharing stories and case studies from the last three decades.

One of our case studies touches on an issue that affects hundreds of thousands of people across the UK alone: peanut allergies. So read on to learn how LGC scientists developed a unique allergen quality control material, which can be used to help protect the people in the UK with a peanut allergy and also help to prevent contamination in the food production process, potentially saving the food industry millions of pounds.

The problem

The prevalence of peanut allergy has nearly doubled in Europe over the past two decades and it now affects around 500,000 people in the UK [1]. Peanut allergy is the most common cause of fatal food allergy reactions. It occurs when the immune system mistakenly identifies peanut proteins as something harmful. The fear of accidental exposure in food reduces the quality of life of peanut allergy sufferers and severely limits the social habits of allergic individuals, their families and even their friends.

It is not only those with peanut allergies who have to worry about the risk of allergic reactions or death by anaphylaxis; allergens also create problems for businesses. Testing for allergen proteins in food is difficult, as samples usually contain a lot of protein and it can be hard to separate the allergen protein of interest. This has an impact on the ability of manufacturers and suppliers to adequately label their goods, and also has implications for defining threshold levels and detecting food fraud.

All food companies throughout the EU are compelled by law to declare major allergens, including peanut, if included as ingredients in food products. The current labelling rules, brought into force in December 2014 by European Regulation 1169/2011 (the EU Food Information for Consumers Regulation, EU FIC), ensure that all consumers are given highlighted information about the use of allergenic ingredients in pre-packed food [2]. This is to make it easier for people with food allergies to identify the foods they need to avoid. The EU FIC also extends to food sold loose or served when eating out. Prevention of cross contamination with peanut through product testing, validation and verification of cleaning, and checking of ‘peanut-free’ products requires exacting testing.

ELISA (enzyme-linked immunosorbent assay), PCR (polymerase chain reaction) and mass spectrometry (MS) methods can be used to detect food allergens, but there are problems obtaining reliable quantitative results with all three. Prior to this project, there were no suitable reference materials available in the form of a food matrix, making it difficult for laboratories and test-kit manufacturers to validate quantitative methods for allergen measurement.

The solution

A quality control (QC) material that is a real food, containing a known amount of a specific allergen protein, and is stable and homogeneous could assist laboratories in the validation and monitoring of their analyses. Consequently, a project was undertaken by LGC to develop a food matrix peanut allergen QC material.

The chosen matrix was a chocolate dessert product developed for low-dose threshold studies in food allergic individuals in the European research project ‘EuroPrevall’. Two QC materials were prepared by University of Manchester researchers in the form of chocolate dessert product pastes designed to be reconstituted with water before analysis. One material (LGCQC1011) was prepared as a peanut free negative control and the other material (LGCQC1012) was prepared as a positive control with the addition of light roast, partially defatted peanut flour (a commercial food ingredient) to give a peanut protein content of 10 mg kg-1. The pastes were transferred to LGC, packaged in nitrogen-flushed sealed sachets to aid stability and the units were numbered sequentially in fill order. LGC assessed and proved their homogeneity and stability, underpinned by a validation study of the test method using a commercially available ELISA kit (Romer AgraQuant® Peanut kit). The National Measurement System funded the ELISA kit validation studies, and a Technology Strategy Board and LGC co-funded research and development project established the design and production of the QC material.
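To give a sense of the quantitative step behind an ELISA validation: a standard curve is fitted to the optical densities of known standards, typically with a four-parameter logistic (4PL) model, and unknown samples are read off its inverse. The sketch below uses illustrative curve parameters, not values from the Romer AgraQuant® kit or LGC’s study.

```python
# Sketch of 4PL standard-curve interpolation for a quantitative ELISA.
# Parameters a (zero-dose response), d (saturating response), c (inflection
# concentration, mg/kg) and b (slope) are assumed for illustration.

def od_from_conc(x, a=0.05, d=2.0, c=1.0, b=1.2):
    """4PL model: optical density expected at peanut-protein conc x (mg/kg)."""
    return d + (a - d) / (1 + (x / c) ** b)

def conc_from_od(y, a=0.05, d=2.0, c=1.0, b=1.2):
    """Invert the 4PL curve to estimate concentration from a measured OD."""
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

# Round-trip check: the OD predicted at 1 mg/kg maps back to 1 mg/kg.
estimate = conc_from_od(od_from_conc(1.0))
```

In a real validation study the four parameters are fitted to the kit’s calibrators on each plate, and QC materials like LGCQC1012 let a laboratory check that the fitted curve recovers the certified 10 mg kg-1 value.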

Impact

Failure in food allergen management means food-allergen-related incidents are the most common reason for product withdrawals and recalls in the United Kingdom, according to the UK Food Standards Agency. The 34 recalls related to allergens in 2010 were estimated to cost stakeholders £10-15 million. In 2013, the number of Allergy Alerts issued to withdraw food or drink products had risen to 47.

Phil Goodwin, MD of Bio-Check (UK), a food allergen test kit manufacturer, has worked in this area for 30 years and welcomes LGC’s recent initiatives:

“The science of food allergen detection, let alone quantitation, has failed to move forward anything like quickly enough since it began in the late 1980s. The emergence of such high quality QC materials as are being produced by LGC is a significant step forward to a time when all commercial test kits can be demonstrated to show good agreement on allergen levels. LGC are to be applauded for taking on this difficult challenge and I urge all allergen kit producers and analysts to use the material to improve their products and results.”


[1] http://www.mrc.ac.uk/news-events/publications/outputs-outcomesand-impact-of-mrc-research-2013-14/

[2] http://allergytraining.food.gov.uk/english/rules-and-legislation/

This blog first appeared as a NML case study on the LGC Group website. To learn more about the NML, visit their site here.

Revolutionising cancer treatment one Array at a time

While the science of pharmacogenomics has been around for years, its popularity is starting to pick up steam as precision medicine and individualised treatment become more and more commonplace in the medical world. Geneticists and doctors are fully embracing the fact that our individual genes make us all unique and that these genes hold clues to how each patient’s body will metabolise medications.

Pharmacogenetics, or the study of how people respond differently to medicines due to their genetics, is making a splash lately thanks to companies like Minneapolis, MN-based OneOme, which co-developed its RightMed test with Mayo Clinic. The company collects a patient’s DNA sample using a simple cheek swab that is then analysed at OneOme’s lab with PCR – in this case on LGC’s IntelliQube® – to determine the patient’s genetics. This information is then used to determine whether the patient has any genetic variations that may cause them to have a certain reaction to a medication. These results give doctors “graphic genetic pinpoint accuracy” on the medications that should work and those likely to be less effective. In simplest terms, these tests, combined with PCR instruments, are empowering patients and doctors with information that may make their lives not only better, but also safer. Or as we like to say, science for a safer world.

Take a look at just how much pharmacogenomics is impacting and “revolutionizing” patient care by watching the video here, or visit our website.


This story was originally published on the Biosearch Technologies blog.

Food chain resilience in a changing world

A few weeks ago, we were joined by experts and industry leaders at our biennial Government Chemist Conference, and this year’s theme was ‘Food chain resilience in a changing world’.

Attendees were treated to a variety of presentations about food chain resilience from the Food Standards Agency, Public Health England, the European Commission’s Joint Research Centre, Cambridge University, and many others.

Topics ranged from food crime to genome sequencing and genetics, as well as preparing the food industry for Brexit and systems for fighting fraud.

Among some of the popular topics discussed were meat speciation techniques and food authenticity, which underline current issues surrounding consumer trust in food manufacturing.

Methods for detecting trace amounts of undeclared ingredients in food have evolved enormously in recent years, but incidents still occur. Recent reports suggest that some ‘meat-free’ ready meals have even contained trace amounts of meat, although the exact amount and method of transfer have yet to be determined.

Any food used as an ingredient in a pre-packed processed product (i.e. in the ‘recipe’) must be declared in the list of ingredients. Adventitious meat cross-contamination below 1% isn’t generally regarded as deliberate fraud. But even below this ‘cut-off’ point there are implications for consumer choice, especially for those avoiding meat (vegetarian or vegan preferences) or specific meat species for religious reasons.

When ‘trace’ amounts of a material have been found in food, it suggests adventitious cross-contamination (which could result from inadequate cleaning of equipment, for example) rather than intentional adulteration. Particularly with foods that contain many ingredients, like ready meals, this could come from any of the ingredients at any point along the supply chain.

This makes the methodology of detection that much more important, as each technique has its own level of accuracy. For instance, the Polymerase Chain Reaction (PCR) screens for the absence or presence of specific DNA within a defined limit of detection, which requires the scientist to know what to look for; care is required in carrying out these tests and in interpreting the results. Meanwhile, Next Generation Sequencing (NGS) detects and sequences all DNA material in a sample, which allows for a greater understanding of the makeup of foods. Once sequencing is complete, millions of sequences can be analysed to identify species, but this method is more expensive and can be resource intensive.
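The contrast between the two approaches can be sketched in a few lines: a targeted PCR-style test asks a yes/no question about one known marker, while an NGS-style analysis assigns every read to whatever reference it matches. The sequences below are invented stand-ins, not real barcode markers, and real pipelines use alignment tools rather than exact substring matching.

```python
# Toy illustration of targeted (PCR-like) vs. untargeted (NGS-like)
# species detection. Sequences are made up for demonstration only.

references = {  # short fragments of species-diagnostic "barcode" sequences
    "pig":     "ATGCGTACGT",
    "chicken": "TTGACCATGA",
}

def targeted_test(reads, marker):
    """PCR analogy: detect presence/absence of a single known marker."""
    return any(marker in read for read in reads)

def classify_reads(reads):
    """NGS analogy: tally every read against all references."""
    counts = {sp: 0 for sp in references}
    for read in reads:
        for sp, ref in references.items():
            if ref in read:
                counts[sp] += 1
    return counts

reads = ["CCATGCGTACGTAA", "GGTTGACCATGACC", "TTGACCATGATT"]
pig_present = targeted_test(reads, references["pig"])
tally = classify_reads(reads)
```

The targeted test only ever answers the question it was designed for, which is exactly why a trace of an unexpected species can slip past PCR screening but show up in a full sequencing run.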

These are just two examples of methods used recently to determine authenticity, but there isn’t uniformity in methods and standards around the world. Now that we are more globally focussed than ever before, in both trade and knowledge sharing, there should be more harmonisation among the techniques used in different places. Ingredients might cross several borders before becoming food, being processed, tested and analysed to different standards along the way. It’s important that we have robust systems in place to ensure that food standards and methods for measurement are equivalent and that all food is both safe and exactly what it claims to be.

And many of the speakers and attendees of the GC Conference are working toward that goal, sharing their expertise on sound science, building systems for detection of fraud, and enforcing stronger regulations for food safety.

Watch the video from the conference, or you can learn more about the speakers and see their presentations here.

What’s funny about your honey?

Ensuring the safety and authenticity of the food we eat is of paramount importance, and there is a growing drive, at both EU and global level, to ensure the quality control of food to protect the health and safety of consumers. And during the National Measurement Laboratory’s thirty years, we’ve done a lot of work to support reliable measurements in food testing and authentication.

Honey is known to have multiple health and nutritional benefits and is in high demand among consumers. It is defined as the natural sweet substance produced by bees, and there is significant regulation around the composition and labelling of honey in order to protect consumers from food fraud. However, due to the declining numbers of bees, the impact of weather conditions on supply and the high costs of production, honey is expensive. This makes it a prime target for economically motivated food fraud.

Some research suggests that humans began to hunt for honey 8,000 years ago, and the oldest known honey remains, dating back 4,700 to 5,500 years, were discovered in clay vessels inside a tomb in the country of Georgia.

The ancient Egyptians used honey to sweeten dishes and to embalm the dead, while the ancient Greeks actually practised beekeeping so much that laws were passed about it. Honey was prevalent around the ancient world, being used in ancient India, China, Rome and even among the Mayans. It even plays a role in many religions, representing the food of Zeus, an elixir of immortality, and a healing substance.

And just like any other important product, fraudsters have been faking it since it’s been in use. Ancient Greeks and Romans both mention honey adulteration, and back in 1889, Dr Harvey W. Wiley testified in front of Congress that it was the most adulterated product in the U.S.

Honey is still one of the most adulterated food products globally, with a report last year citing that more than 14% of tested samples were adulterated.

There are two types of food fraud associated with honey: adulteration and fraudulent labelling. Honey adulteration typically occurs by substituting cheaper sweeteners, such as high fructose corn syrup or cane or beet sugar syrup, for honey. Fraudulent labelling occurs because honeys from a particular geographic or botanical source, such as Manuka, command premium prices amongst consumers.

Detecting these types of fraud presents a significant measurement challenge for food regulators: adulterated products show very similar physical and chemical properties to pure honey, and mis-labelled products are, in fact, pure honey, just of lower quality. Several reports indicate that there is more Manuka honey being sold than Manuka bees can produce, which illustrates how often lower quality honeys are passed off as premium ones in order to maximise profit.

During our thirty years as the National Measurement Laboratory (NML) for chemical and bio-measurement, our scientists have conducted several reviews and studies of methods for detecting honey fraud1. For instance, nearly forty years ago, scientists began to use stable carbon isotope ratio mass spectrometry (IR-MS) to detect high fructose corn syrup in honey. As our scientists found2, it is possible to identify food fraud in honey using IR-MS, which measures small but observable variations in the ratio of the two stable isotopes of carbon (C-13 and C-12). Sugars, although chemically identical, have a different isotopic signature depending on the way in which the plant processes carbon dioxide. As the majority of honey-source plants use a different photosynthetic pathway from the plants whose sugars are typically used as honey adulterants, it is possible to detect adulteration using IR-MS. The specific geography of the plants also plays a role in the isotopic fingerprint, and IR-MS can be used to determine where honeys originated.
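The arithmetic behind these measurements is compact enough to sketch. IR-MS results are expressed in delta notation, δ13C = (R_sample/R_standard − 1) × 1000 ‰, relative to the VPDB reference standard; honey-source (C3) plants sit near −25 ‰ while C4-derived syrups like corn syrup sit near −10 ‰. The estimator below, comparing bulk honey against its own extracted protein, follows the general logic of internal-standard methods, but the thresholds and example values are illustrative assumptions, not a validated method specification.

```python
# Delta-notation arithmetic behind IR-MS honey adulteration testing.
# delta13C (permil) = (R_sample / R_standard - 1) * 1000, where R is the
# measured 13C/12C ratio and VPDB is the international reference standard.

R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB reference standard

def delta13C(r_sample: float) -> float:
    """Express a measured 13C/12C ratio in permil relative to VPDB."""
    return (r_sample / R_VPDB - 1) * 1000

def c4_sugar_percent(d_protein, d_honey, d_c4=-9.7):
    """Estimate %C4 sugar by comparing bulk honey against its own protein.

    The protein fraction keeps the honey's original C3 signature, so any
    shift of the bulk value toward d_c4 (an assumed corn-syrup delta)
    indicates added C4 sugar.
    """
    return (d_protein - d_honey) / (d_protein - d_c4) * 100

# Example: protein at -25 permil but bulk honey shifted to -22 permil
# suggests roughly a fifth of the sugars came from a C4 source.
pct = c4_sugar_percent(d_protein=-25.0, d_honey=-22.0)
```

The VPDB ratio anchors every laboratory to the same scale, which is precisely why the inter-laboratory comparability described below matters: a few tenths of a permil of bias can shift a sample across the adulteration threshold.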

However, for these types of measurements to be robust and reliable in detecting food fraud across the supply chain, the comparability of results is critical. To support this, LGC co-ordinated an international comparison study in 2016 for isotope ratios in honey, involving 6 national measurement institutes (NMIs) and 6 expert laboratories (contacted via the Forensic Isotope Ratio Mass Spectrometry (FIRMS) Network), and the results between participants showed good comparability.

Demonstrating the comparability of isotope ratio measurements is crucial to detecting many types of food fraud and supporting food authenticity claims, of which honey is just one example. The international study coordinated by LGC demonstrates the measurement framework is in place to support food fraud regulation in the future.


1 D. Thorburn Burns, Anne Dillon, John Warren, and Michael J. Walker, 2018, A Critical Review of the Factors Available for the Identification and Determination of Mānuka Honey, Food Analytical Methods, https://doi.org/10.1007/s12161-018-1154-9.

2 Helena Hernandez, “Detection of adulteration of honey: Application of continuous-flow IRMS”, VAM Bulletin, 1999, Vol 18, pp 12-14.

Food Safety Week and beyond: LGC’s long history in food testing

Food Safety Week, organised by the UK’s Food Standards Agency, is an opportunity to learn more about current food issues, including food crime, compliance and food hygiene. This year’s campaign celebrates “the people who protect your plate” – the workers who ensure the UK public can trust the food they eat, including inspectors, local authorities, and public analysts.

Also at the forefront of the fight for food safety are chemists, who analyse food, drinks and supplements to ensure manufacturers can verify the safety of their food products.


The original Government Laboratory plaque and tea samples.

Consumers trust that when they buy food and drink, they are getting exactly what they’ve been told they are getting.  Each food has a distinct composition, much like its own fingerprint, and with the right expertise and tools, it’s possible to study these foods to determine their authenticity.  LGC has been involved in food testing for over 175 years. In fact, it’s the very reason we were established. In 1842, the Board of Excise needed a scientific authority to see that goods, like tea, tobacco and spirits, were not adulterated for profit, and so it created the Government Laboratory.

The Government Chemist role was created in 1909, to ensure the Laboratory of the Government Chemist could work independently of the Inland Revenue department (which provided staff to the Laboratory) and the Board of Customs and Excise (which controlled it). Nowadays the Government Chemist oversees the statutory function of referee analyst, resolving disputes over analytical measurements, particularly in relation to food regulatory enforcement.

As LGC grew, so did our role in food and feed testing. Not only are we the referee analyst for disputes in the food industry, we also provide products and solutions for food safety-related issues.

In order for food producers to know with certainty that their food is authentic, it’s necessary to compare what they’ve produced with a known and verified version of the food – this is called a reference material, or standard.  Currently, we have over 15,000 reference materials for food analysis, for everything from allergens, contaminants, and toxins to food flavourings, dyes and proteins, and much more.

Chemists also study new methods of authenticating foods, including mass spectrometry, which is considered the gold standard in analysis, especially when combined with chromatography. Mass spectrometers measure the mass-to-charge ratios of the ionised molecules in a sample, producing its characteristic ‘fingerprint’. The tools and expertise of the National Measurement Laboratory at LGC allow our measurement scientists to determine the content of a sample down to one part per quadrillion. In other words, we can detect one lump of sugar dissolved in a bay. These capabilities allow us to work on specific projects, tailoring our research to benefit many different sectors and solve specific problems.
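It’s worth sanity-checking the sugar-lump claim with some back-of-envelope arithmetic. Assuming a roughly 4 g sugar cube (an assumption for illustration), one part per quadrillion by mass works out as follows:

```python
# Back-of-envelope check: how much water makes a ~4 g sugar cube a
# one-part-per-quadrillion trace? (The 4 g cube mass is an assumption.)

sugar_g = 4.0
parts_per_quadrillion = 1e-15

water_g = sugar_g / parts_per_quadrillion   # 4e15 g of water needed
water_m3 = water_g / 1e6                    # 1 m^3 of water weighs ~1e6 g
water_km3 = water_m3 / 1e9                  # convert to cubic kilometres
```

That comes to about 4 cubic kilometres of water – a plausible volume for a small bay, so the comparison holds up.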

This was particularly evident in a recent case study on selenium in food products and supplements.  For fortified food products and supplements to be safe for human consumption, both the correct amount and the correct species of selenium must be present.  Selenium-enriched foods and supplements have become more prominent in Europe as the continent has moved to using more wheat that is naturally low in selenium.

However, the accurate measurement of total selenium in food and food supplements presents analytical challenges due to the complex nature of food samples. Furthermore, selenium speciation analysis presents additional challenges due to the low levels of each specific selenium species and the molecular complexity of such samples.

LGC’s measurement research team for inorganic mass spectrometry has extensive experience in selenium speciation and was able to develop and characterise a range of reference materials, including a selenium-enriched wheat flour matrix standard, to support the food industry.

With over 175 years in the food testing arena, we have a lot to say about the subject, so if you want to learn more, head over to our website where you can read case studies and learn about our reference materials.

You can also join us at next week’s Government Chemist Conference, where we will be discussing current food safety issues at length, including Brexit, food authenticity and food regulation, with many experts in their fields, including the FSA itself. Visit the conference website to view the entire programme and register.

The National Measurement Laboratory turns 30!

In 1988, Government Chemist Alex Williams, seeing the need for improved quality of analytical measurements, initiated and launched the Valid Analytical Measurement (VAM) programme to develop a chemical measurement infrastructure in the UK.

This programme would go on to evolve into the National Measurement Laboratory for chemical and bio-measurement. The UK was one of the pioneers within the global measurement community to recognise the need to address the new and developing challenges of measurement across chemistry and biology.

An article from the early VAM bulletins (1989).

That means 2018 marks the NML’s 30th birthday and kicks off our ‘Year of Measurement’. It is an opportunity to celebrate the importance of measurement science (‘metrology’) and to join the upcoming Festival of Measurement, which launches in September and runs through May 2019.

In our thirty-year history of performing measurements to support the UK, we’ve experienced a lot of growth, seen big changes in the challenges we’ve been set and made some major breakthroughs. We’ve asked (and answered) a lot of questions, like ‘What are the best methods for detecting the adulteration of honey?’ or ‘Is the computer a friend or foe?’ (The answer is ‘friend’…or ‘both’ if you’ve invested heavily in encyclopaedias.)

We’ve already outlined in a recent blog post how important accurate measurement is, affecting everything from food and drink to medicine. Accurate and precise measurement is the foundation of public health and safety. But it’s also just as important to the economy.  In 2009, it was estimated that £622 billion of the UK’s total trade relied on measurement in some way, meaning that measurement plays a role in nearly every aspect of our lives.

Our Chief Scientific Officer, Derek Craston, agrees that good measurement is crucial to economies: “In my role, I am fortunate to be able to see the major benefits that chemical and biological measurements make to the prosperity of companies and the lives of individuals across areas as broad as clinical diagnosis, drug development, environmental protection and food security. Indeed, in a global economy, with complex supply chains and regulatory frameworks, it is hard to see how many markets could function without it.”

We’re proud of the work we’ve done as the National Measurement Laboratory, where our work supports manufacture and trade, protects consumers and enhances quality of life. And over the next few months, we plan to share stories and case studies from our thirty years at the forefront of measurement with you, as well as look forward to the next thirty years.

World Metrology Day: Setting the standard for measurement

This Sunday, 20th May, is World Metrology Day, the anniversary of the signing of the Metre Convention on 20 May 1875 (and pretty much the best day of the year for measurement scientists like us). This convention set the framework for global collaboration in the science of measurement (metrology). Its aim, to ensure we use uniform measurements across the globe, remains as important for industry, commerce and society today as it was over 140 years ago.

Measurement is present in everything: from food and drink safety to the efficacy of pharmaceuticals, from diagnosis and detection of disease to navigation, from air and water quality to forensics. Mobile phones and computers run on accurate measurement, and if you’ve ever had to consistently reset a clock, it was inaccurate measurement that was annoying you.

As the National Measurement Laboratory (designated for chemical and bio-measurement), LGC forms part of the UK National Measurement System (NMS) that provides the core measurement infrastructure for the UK. The measurements we make support manufacture and trade, protect consumers and enhance quality of life.

Did you know that, for all of human history, measurements have been based on actual physical weights and measures, called artefacts? Humans have been working on measurement standardisation for a long time. The ancient Egyptians used what is widely regarded as the first measurement standard, the cubit, a wooden rod used to determine standard lengths and heights, such as the flood levels of the Nile River. In ancient Babylon, the mina was created and used to measure weight, and the early Chinese civilisation used the chi. Even these standards varied considerably within their own societies, making wider trade and exchange difficult. The Magna Carta in 1215 required that the same set of standards be used throughout the realm.  Finally, at the Metre Convention on 20 May 1875, representatives from seventeen countries set out to close the gaps and achieve uniformity of measurement around the world, laying the groundwork for what would become the International System of Units (SI).

Even now, the kilogram is a cylinder made from a platinum-iridium alloy that sits in a vault near Paris. The vault is a necessary precaution to ensure the kilogram isn’t damaged, but the last time it was taken out and weighed against its copies, it had apparently lost weight. Think about that. Mass is always calibrated against another officially confirmed mass, but what happens when the official artefact is no longer reliable? Is the artefact the correct weight, or is the copy? Does this mean all the weights in the world are incorrect?

This could have huge consequences, especially when you consider how integral accurate measurement is to our society, which is why scientists have long been looking for a way to redefine the standards, developing an independent system so that we don’t have to rely on a physical artefact that could be damaged or degraded. And the most logical way to revolutionise metrology is with mathematics.

Scientists have been searching for a natural constant, an unchanging number present in nature that would represent each unit and would therefore make accurate measurement reproducible without physical weights. The theme for this year’s World Metrology Day is ‘Constant evolution of the International System of Units (SI)’, chosen because this year sees the culmination of that change: the four base units not defined in terms of natural constants – the kilogram, the mole, the ampere and the kelvin – are expected to be revised.
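As a sketch of what ‘defining units by constants’ means in practice: once the Planck constant h is assigned an exact numerical value, a mass can in principle be tied to a measurable frequency through E = mc² = hν, which is the idea underpinning realisations such as the Kibble balance. The constant values below are those proposed for the 2018 vote, but the function and the worked number are purely our own illustration:

```python
# Proposed exact values of the defining constants (as put forward for the
# November 2018 CGPM vote), each fixing one of the four revised base units.
PLANCK_H = 6.62607015e-34          # J·s   -> fixes the kilogram
ELEMENTARY_CHARGE = 1.602176634e-19  # C   -> fixes the ampere
BOLTZMANN_K = 1.380649e-23         # J/K   -> fixes the kelvin
AVOGADRO_NA = 6.02214076e23        # 1/mol -> fixes the mole

SPEED_OF_LIGHT = 299792458.0       # m/s, already exact since 1983

def mass_equivalent_frequency(mass_kg: float) -> float:
    """Frequency of radiation whose energy equals m*c^2 (E = m c^2 = h v)."""
    return mass_kg * SPEED_OF_LIGHT**2 / PLANCK_H

# With h fixed, one kilogram corresponds to an exact, reproducible frequency
# rather than to a metal cylinder in a vault:
print(f"{mass_equivalent_frequency(1.0):.4e} Hz")  # ~1.3564e+50 Hz
```

No laboratory measures such a frequency directly, of course; the point is that the definition now rests on an unchanging number rather than an artefact.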

The world will come together at the General Conference on Weights and Measures in November 2018 and is expected to agree to this change. If approved, this will be the most radical change to the SI since its inception and it will hopefully improve measurement forever, providing a springboard for future innovation.

So feel free to celebrate this Metrology Day in style!