Counting down to the SI redefinition: kelvin and degree Celsius

In case you missed it, the redefinition of the International System of Units (SI) is going into effect this World Metrology Day, 20 May 2019. Each month we are bringing you a blog post featuring one of the units of the SI. This month we are focusing on the kelvin, the SI base unit for thermodynamic temperature. It’s winter in the Northern hemisphere and outdoor temperatures have dropped, so let’s jump in!

Unit of the month – kelvin

Accurate temperature measurement is essential in a wide range of everyday processes, from controlling chemical reactions and food production, to the assessment of weather and climate change. And almost every engineering process depends on temperature – sometimes critically. Knowing the correct temperature is also essential, but much more difficult, in more extreme conditions, like the intensely hot temperatures required to produce steel or the very low temperatures required to use superconductors.

Measuring temperature has a long history. About 2,000 years ago, the ancient Greek engineer Philo of Byzantium came up with what may be the earliest design for a thermometer: a hollow sphere filled with air and water, connected by a tube to an open-air pitcher. The idea was that the air inside the sphere would expand or contract as it was heated or cooled, pushing or pulling water into the tube. Later, people noticed that the air contracted in volume by about one third as the sphere was cooled from the boiling temperature of water to the ice point. This caused people to speculate on what would happen if one could keep cooling the sphere. In the middle of the 19th century, the British physicist William Thomson – later Lord Kelvin – also became interested in the idea of ‘infinite cold’, a state we now call the absolute zero of temperature. In 1848, he published a paper, ‘On an Absolute Thermometric Scale’, in which he estimated that absolute zero was approximately -273 °C. In honour of his investigations, we now name the unit of temperature, the kelvin, after him.

When Lord Kelvin carried out his investigations, it was not yet universally accepted that all substances were made out of molecules in ceaseless motion. We now know that temperature is a measure of the average energy of motion of these particles, and absolute zero – zero kelvin – corresponds to the lowest possible temperature, a state where the thermal motion of molecules has ceased.

In 1960, when the SI was established, the temperature of the triple point of water was defined to be 273.16 K exactly. This is the temperature at which (in the absence of air) liquid water, solid water (ice) and water vapour can all co-exist in equilibrium. This temperature was chosen as a standard temperature because it was convenient and highly reproducible. Accordingly, the kelvin was defined to be the fraction 1/273.16 of the temperature of the ‘triple point’ of water. We then measured the temperature of an object by comparing it against the standard temperature. Unusually in the SI, we also defined another unit of temperature, called the degree Celsius (°C). This is related to the kelvin by subtracting 273.15 from the numerical value of the temperature expressed in kelvin.

t(in °C) = T(in K) – 273.15

The reason for this is to make temperature easier to use in the wide variety of applications that had previously used the ‘centigrade’ scale. In our everyday life we are used to expressing temperature in degrees Celsius. On this scale water freezes at about 0 °C and boils at approximately 100 °C. Notice that the conversion from kelvin to degrees Celsius subtracts 273.15, so the triple point of water (273.16 K) is 0.01 °C.
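In code, the relation between the two units is a simple fixed offset. A minimal sketch (the function names are ours, for illustration):

```python
# Convert between kelvin and degrees Celsius using the fixed 273.15 offset.

def kelvin_to_celsius(temperature_k):
    return temperature_k - 273.15

def celsius_to_kelvin(temperature_c):
    return temperature_c + 273.15

print(f"{kelvin_to_celsius(273.16):.2f}")   # triple point of water: 0.01 °C
print(f"{celsius_to_kelvin(-273.15):.2f}")  # absolute zero: 0.00 K
```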

With the redefinition, the kelvin will no longer be defined in terms of an arbitrarily chosen reference temperature. Instead, we will define temperatures in terms of the energy of molecular motion. We will do this by taking the value of the Boltzmann constant k to be 1.380 649 × 10⁻²³ exactly when expressed in units of joules per kelvin (J K⁻¹). One joule per kelvin is equal to one kg m² s⁻² K⁻¹, where the kilogram, metre and second are defined in terms of h, c and ∆ν. So after this redefinition, we will effectively be measuring temperature in terms of the energy of molecular motion. The degree Celsius will be related to the kelvin in the same way as it was before May 2019.
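To give a feel for what ‘temperature as molecular energy’ means: for a monatomic ideal gas, the mean translational kinetic energy per molecule is (3/2)kT. A rough illustrative sketch:

```python
# The Boltzmann constant links temperature directly to energy per molecule.
K_BOLTZMANN = 1.380649e-23  # J/K, exact in the revised SI

def mean_kinetic_energy(temperature_k):
    """Mean translational kinetic energy per molecule, monatomic ideal gas."""
    return 1.5 * K_BOLTZMANN * temperature_k

print(f"{mean_kinetic_energy(293.15):.3e} J")  # about 6.07e-21 J at room temperature
```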

Why is the redefinition of the kelvin important?

For almost all users, the redefinition will pass unnoticed; water will still freeze at 0 °C, and thermometers calibrated before the change will continue to indicate the correct temperature. However, the redefinition opens up the possibility of using new technologies to measure temperature, something that is likely to be of benefit first at extremely high or low temperatures.

Range of temperatures

Coldest natural air temperature measured on Earth (Antarctic) -89.2 °C
Mercury Freezes -38.8 °C
Water freezes (1) 0 °C
Earth surface, average over year, land and ocean 1978 (2) 14.1 °C
Earth surface, average over year, land and ocean 2017 (2) 14.9 °C
Hottest natural air temperature measured on Earth, Furnace Creek, USA 56.7 °C
Water boils (1) 100 °C (actually 99.974 °C)
Molten Steel About 1 600 °C
Surface of the Sun About 5 500 °C
Centre of the Earth (estimate) About 7 000 °C
Centre of the Sun (estimate) About 15 million °C

(1) Changes with altitude; the value given is at sea level.
(2) Value for the year shown, representing the general trend at the time.

Most people use the degree Celsius (°C), where the temperature in °C is the temperature in K minus 273.15.


Christmas, candles and the countdown to the SI redefinition

Following the recent decision, taken by measurement scientists from around the world, to redefine the International System of Units (SI), on the 20th of each month we will be looking at one of the seven SI base units. You’ll be able to find out where it’s used in everyday life, how it’s defined now, and the changes that will come into force on 20 May 2019.


In a first for theatre, the Swan United Electric Light Company was commissioned to create miniature lights which twinkled from wreaths worn by the lead fairies. At the time electric lighting was still cutting edge and the tiny lights – powered by battery packs hidden in costumes – amazed audiences. The term ‘fairy lights’ was born. A year later Edward Johnson, a colleague of Thomas Edison, put fairy lights on a Christmas tree for the first time.

UNIT OF THE MONTH: CANDELA

Which brings us to our SI unit of the month: the candela. The light, or luminous intensity, from a single clear indoor fairy light is approximately one candela, regardless of whether you have traditional tungsten filament fairy lamps or modern LED versions.

The candela is the only SI base unit linked to human perception. As the eye cannot see all light colours equally well, being most sensitive to yellow-green light, luminous intensity measures light adjusted for our human sensitivity to different frequencies.

Although the candela will effectively stay the same from 2019, as it is already defined in relation to other base units, the accuracy will be improved by updates to the second (find out more on 20 March) and the metre (see November’s update). The new definition will be:  

The candela is defined by taking the fixed numerical value of the luminous efficacy of monochromatic radiation of frequency 540 × 10¹² Hz, Kcd, to be 683 when expressed in the unit lm W⁻¹, which is equal to cd sr W⁻¹, or cd sr kg⁻¹ m⁻² s³, where the kilogram, metre and second are defined in terms of h, c and ΔνCs.
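To make those numbers concrete, here is a minimal illustrative sketch. The fairy-light flux value is chosen to give roughly the one-candela figure quoted above; it is not a measured datum:

```python
import math

# K_cd fixes how many lumens one watt of 540 THz (green) light produces;
# luminous intensity in candela is luminous flux per unit solid angle (lm/sr).
K_CD = 683.0  # lm/W at 540 × 10^12 Hz, exact

def luminous_flux_540thz(radiant_power_w):
    return K_CD * radiant_power_w  # lumens

def luminous_intensity(flux_lm, solid_angle_sr):
    return flux_lm / solid_angle_sr  # candela

# A bulb radiating about 12.6 lm uniformly in all directions (a full
# sphere is 4π steradians) has an intensity of about one candela.
print(f"{luminous_intensity(12.6, 4 * math.pi):.2f} cd")
```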

With rapid innovation in energy efficient lighting, the need for reliable ways to compare the brightness of different light sources has become ever more important. This includes our fairy lights. Clear tungsten fairy lights use about ten times the electricity of modern LED lights, and coloured tungsten fairy lights are even less energy efficient, as most of the light is absorbed by the coloured coating. LEDs, by contrast, create colours using different semiconductor materials, producing more visible light per electrical watt. Accurately measuring their luminous intensity allows us to compare the visual appearance of each.

The lights most people hang on their trees this Christmas will be LEDs. As with most modern lighting, the tiny twinkling bulbs that amazed 19th Century opera fans have been superseded by energy efficient alternatives. And for that we have the candela to thank.

Merry Christmas!

Countdown to the SI redefinition

Throughout history, measurement has been a fundamental part of human advancement. The oldest systems of weights and measures discovered date back over 4,000 years. Early systems were tied to physical objects, like hands, feet, stones and seeds, and were used, as we still do now, for agriculture, construction and trade. Yet, with new inventions and global trade, ever more accurate and unified systems were needed. In Europe, it wasn’t until the 19th century that a universally agreed measurement system began to be adopted, out of which the International System of Units (SI) was born.

Now, after years of hard work and scientific progress, we are ready once again to update and improve the SI units. The redefinition of the International System of Units, enacted on 16 November 2018 at the General Conference on Weights and Measures, means that the SI units will no longer be based on any physical objects, but will instead be derived from fundamental properties of nature. Creating a system centred on stable and universal natural laws will ensure the long-term stability and reliability of measurements, and act as a springboard for the future of science and innovation.

The redefinition of the SI units will come into force on 20 May 2019, the anniversary of the signing of the Metre Convention in 1875, an international treaty establishing cooperation in metrology. To celebrate, we’ll be counting down each of the SI units – the metre, second, kilogram, kelvin, mole, candela and ampere. Join us on the 20th of every month to find out where units are commonly used, how they’re defined, and the changes that will take place!

UNIT OF THE MONTH: METRE

“You’ve never heard of the Millennium Falcon? … It’s the ship that made the Kessel run in less than 12 parsecs!” Han Solo’s description of the Millennium Falcon in Star Wars is impressive, but something’s not quite right. Do you know why? The unit he uses to illustrate the prowess of the Falcon – a parsec – isn’t actually a measure of time, but length! It probably won’t surprise anyone that Han Solo isn’t very precise when it comes to the physics of his ship, but in fact he isn’t too far from the truth. This is because we use time to define length.


What does this mean? Well, in the case of Han Solo, one parsec is about 3.26 light-years, and a light-year is the distance light travels in one year. Back down on Earth, we have the same method for defining length. In the International System of Units (SI), the base unit of length is the metre, and it can be understood as:

A metre is the distance travelled by light in 1/299792458 of a second.

The reason we use the distance travelled by light in a certain amount of time is because light is the fastest thing in the universe (that we know of) and it always travels at exactly the same speed in a vacuum. This means that if you measure how far light has travelled in a vacuum in 1/299792458 of a second in France, Canada, Brazil or India, you will always get exactly the same answer no matter where you are!
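A quick sketch of that definition in code (the helper name is ours, and the light-year line is just for fun):

```python
# Length from the light-travel-time definition: distance = c × time.
C = 299_792_458  # speed of light in vacuum, m/s (exact)

def distance_travelled_m(seconds):
    return C * seconds

print(f"{distance_travelled_m(1 / 299_792_458):.1f}")     # 1.0 metre, by definition
print(f"{distance_travelled_m(365.25 * 24 * 3600):.3e}")  # one light-year, about 9.461e15 m
```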

On 20 May next year the official definition of the metre will change to:

The metre is defined by taking the fixed numerical value of the speed of light in vacuum c to be 299 792 458 when expressed in the unit m s⁻¹, where the second is defined in terms of the caesium frequency ∆νCs.

We’ll be returning to the definition of the second on 20 March, so join us again then to find out more.

So, what’s the difference? Actually, there’s no big change coming for the metre. Although the word order has been rephrased, the physical concepts remain the same.

Making a mountain out of a Mole Day

Today is Mole Day, chemists’ #1 holiday! Mole Day occurs every year on October 23 from 6:02am to 6:02pm to commemorate Avogadro’s Number and the basic measuring unit of chemistry, the mole.

What is Avogadro’s Number?

Avogadro’s Number is currently defined as the number of atoms in 12 grams of carbon-12, which comes to approximately 6.02 × 10²³.

Amedeo Avogadro was a 19th century Italian scientist who first proposed, in 1811, that equal volumes of all gases at the same temperature and pressure contain equal numbers of molecules (known as Avogadro’s Law). Nearly one hundred years later, in 1909, chemists decided to adopt the mole as a unit of measure for chemistry. At the time, the scientists decided to define the mole based on the number of atoms in 12 grams of carbon-12. Jean Baptiste Perrin suggested this number should be named after Avogadro, due to his contributions to molecular theory.

Molecules and atoms are very tiny and numerous, which makes counting them particularly difficult. To put it into perspective, an atom is about one million times smaller than the width of the thickest human hair. It’s useful to know the precise amount of certain substances in a chemical reaction, but calculating the number of molecules would get very messy if we had to use numbers like 602,214,129,270,000,000,000,000 every time.

Enter Avogadro’s number! Using the mole simplifies complex calculations. Before the mole was adopted, other units were inadequate for measuring such minuscule amounts. After all, one millilitre of water still has 33,456,340,515,000,000,000,000 H₂O molecules!
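That figure is easy to reproduce in a few lines. A sketch, assuming a density of roughly 1 g/mL for water and a molar mass of about 18 g/mol:

```python
AVOGADRO = 6.02214076e23  # elementary entities per mole (exact in the revised SI)

def molecules_in_water(volume_ml, density_g_per_ml=1.0, molar_mass_g_per_mol=18.0):
    """Estimate the number of H2O molecules in a given volume of water."""
    moles = (volume_ml * density_g_per_ml) / molar_mass_g_per_mol
    return moles * AVOGADRO

print(f"{molecules_in_water(1.0):.4e}")  # about 3.3456e22 molecules in 1 mL
```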

This doesn’t mean that one mole of different substances equals another in mass or size; it simply refers to the number of something, while size and mass vary by object. For example, a mole of water molecules would be about 18 millilitres, while a mole of aluminium atoms would weigh about 27 grams. However, a mole of pennies would cover the Earth at a depth of over 400 metres. And a mole of moles would weigh over half the mass of the Moon!

Why Mole Day?

Schools around the U.S. and other places use the day as a chance to cultivate an interest in chemistry among students. Mole Day goes back to the 1980s, when an article in The Science Teacher magazine proposed celebrating the day. This inspired other teachers to get involved, and a high school chemistry teacher in Wisconsin founded the National Mole Day Foundation in 1991. The American Chemical Society then planned National Chemistry Week so that it falls in the same week as Mole Day every year.

Every year, chemistry teachers use this as an opportunity to perform fun experiments, bake mole-shaped desserts, and teach random facts about Avogadro’s number to students, with the aim of increasing science engagement.

What about the revised SI?

In a previous blog post, we outlined how several units of the International System of Units are undergoing a change. For example, the kilogram will no longer be based on a physical artefact, but on a constant. In the case of the mole, the current definition defines one mole as containing as many elementary entities as there are “atoms in 12 grams of carbon-12”. The new definition, which will likely come into effect next May, simply defines the mole as containing exactly 6.02214076 × 10²³ elementary entities. This eliminates any reference to mass and fixes Avogadro’s constant as an exact number, so the mole will not be dependent on any substance’s mass.

More Mole Facts

A mole of doughnuts would cover the earth in a layer five miles deep!

All of the living cells of every human on Earth add up to just over half a mole.

A mole of rice grains would cover all of the land area on Earth at a depth of 75 metres.

A mole of turkeys could form sixteen earths.

Head on over to our Twitter page to tell us what you think about Mole Day (or share more great facts), and to see what everyone is talking about!

Analysis for Innovators: Supporting industry

The Coconut Collaborative Ltd (CCL) manufactures coconut yogurt for the UK and a wide international market. Based on its innovative products and strong market presence, it has become the market-leading coconut brand in the UK.

Quality checks are required to ensure CCL maintains the high quality of product expected by its growing consumer base. The inadvertent use of a barrel of coconut cream tainted by rancidity in the manufacture of coconut yogurt renders the product unsuitable for sale and consumption. This leads to complete batches of coconut yogurt being rejected. Checks for rancidity are currently performed manually, with batches of coconut cream being tasted ahead of their use in production. With the growth of the business, this is becoming increasingly impractical, but there are currently no automated methods available to test for rancidity.

Through the Analysis for Innovators (A4I) partnership, CCL had access to innovative and advanced measurement and analytical technologies at both the National Measurement Laboratory (NML) and the Science and Technology Facilities Council (STFC) to assess the feasibility of developing a rapid and robust screening approach to detect rancidity in coconut cream.

Impact

Supply specialists, engineers and scientists from CCL, the NML and STFC assessed the feasibility of using multispectral imaging (MSI) and Raman spectroscopy to detect traces of rancid coconut cream ahead of its use in the production of coconut yogurt.

Multispectral imaging (MSI) methods showed the sensitivity and repeatability needed to screen for and detect rancid coconut cream, performing a non-destructive test in no more than 20 seconds. MSI has also been shown to have the potential to be used as a quantitative screening approach to determine the level of rancidity in a sample of coconut cream.

These encouraging results have demonstrated proof of principle for using MSI as the basis for an enhanced level of quality control and screening in CCL’s manufacturing plants. This screening approach will help avoid annual costs in excess of £500k through reduced production and material charges. With further optimisation, MSI could also be used as a predictive tool upstream in the sample production process, prior to the onset of full rancidity, making further efficiency and cost savings for the industry in general.

In addition, the method has been “future proofed” so that it can also be extended to understand variations in coconut cream consistency between batches, suppliers and even geographic origin, as well as screening for the presence of other undesirable materials which could affect the quality of coconut cream.

This project has allowed CCL to continue to support the growth of its business whilst benefiting from the expertise brought by the collaboration with the NML and STFC.

Every DNA counts – and we would know

The National Measurement Laboratory at LGC turned 30 years old this year, and to celebrate we’ve been looking back at notable accomplishments and at where we are now. Clinical measurement is one field where our scientists have excelled and innovated throughout that time.

Clinical measurement “is the development, use, on-going support, and maintenance of technology for diagnosing, aiding or treating patients.” Modern medicine wouldn’t be possible if we couldn’t rely on the accuracy of clinical tests and diagnoses. Poor measurement can lead to misdiagnosis, incorrect prescription and dosage of medicine, or false interpretation of data. Therefore, reliable certified reference materials are absolutely necessary to ensure the quality and accuracy of clinical measurement.

Throughout the last 30 years, the National Measurement Laboratory (NML) at LGC has worked in this area to ensure that testing methods and reference materials are of the highest quality.

In one case study from 2006¹, scientists in the NML developed isotope dilution liquid chromatography-mass spectrometry (IDMS) methodologies that were then used to generate reference values for clinical certified reference materials (CRMs), including creatinine in frozen serum and testosterone in frozen serum.

In another blog post, we outlined the work we’ve done to improve Alzheimer’s diagnosis, which could lead to techniques for earlier diagnosis of the disease, and in another, we illustrated the importance of harmonising newborn blood spot screening tests to ensure infants are diagnosed and treated early so that they can live lives as normal as possible.

An important part of working in the field of clinical medicine and measurement is communicating our knowledge with other scientists and medical professionals to ensure that good measurement is being performed consistently across the board. We have worked with the NHS and England’s Chief Scientific Officer Sue Hill on doing just that as part of the Knowledge Transfer Partnership Programme, which aims to improve patient care through new approaches to measurement.

And now, our scientists can even count DNA and measure changes to that DNA over time. Identification and targeting of specific genetic sequences forms the basis of many promising advanced healthcare solutions such as: precision (personalised) medicine in cancer, gene therapies to end genetic disorders in children and the detection of pathogenic and non-pathogenic bacteria in a wide spectrum of infectious and autoimmune diseases.

However, the new methods and technologies currently being developed will only achieve their full potential if we can ensure they are safe and can be reproduced. High accuracy reference methods are one of the key factors in supporting their development into routine application.

Using tests for guiding treatment of colorectal cancer as a model, our scientists outlined in a paper published in Clinical Chemistry how a range of dPCR assays and platforms compared, and how precisely they measured the cancer mutation. An inter-laboratory study of clinical and National Measurement Institute laboratories demonstrated the reproducibility of the selected method. Together these results reveal the unprecedented accuracy of dPCR for measuring the copy number concentration of a frequently occurring gene mutation used to decide on drug treatment.
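For readers unfamiliar with dPCR, the counting idea can be sketched in a few lines. This is a generic illustration with made-up numbers, not the method or data from the Clinical Chemistry study:

```python
import math

# Digital PCR estimates a copy number concentration by splitting the sample
# into many partitions, scoring each partition positive or negative for the
# target sequence, and using Poisson statistics to correct for partitions
# that happened to receive more than one copy.

def dpcr_copies_per_ul(positive, total, partition_volume_ul):
    p = positive / total              # fraction of positive partitions
    lam = -math.log(1.0 - p)          # mean copies per partition (Poisson)
    return lam / partition_volume_ul  # copies per microlitre of sample

# e.g. 4,500 positives out of 20,000 partitions of 0.85 nL (8.5e-4 µL) each
print(f"{dpcr_copies_per_ul(4500, 20000, 8.5e-4):.0f} copies/µL")
```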

This study has shown that using high-accuracy dPCR measurements can support the traceable standardisation, translation and implementation of molecular diagnostic procedures that will advance precision medicine.

All of this just goes to show you how far we’ve come in 30 years!

¹ VAM Bulletin, Issue 35, Autumn 2006, p. 13, ‘Case Study 3: IDMS certification of clinical reference materials using LC-MS/MS’.

Nanotechnology: The big challenge behind the characterization of the small

Nanomaterials and nanotechnology developments are having an increasingly significant impact on human life, from enabling more targeted cancer treatments to improving the efficacy of vaccines or the delivery of agrochemicals. However, their small size can lead to potentially toxic effects.

To protect human health and the environment, it is crucial that we are able to characterise nanomaterials effectively and understand their behaviour within biological systems. What do we really know about the potential effects when they come into contact with complex matrices and how do we ensure that nanoproducts are safe?

The global market for nanomaterials was estimated by Allied Market Research to be worth $14.7 billion in 2015, and some reports forecast that it will grow to as much as $55 billion by 2022.

We know that the properties of nanomaterials can change significantly when used in complex matrices, such as biological systems, potentially affecting functionality and behaviour. Nanobiotechnology or nanomedical applications exploit these changes. For example, in some therapeutic applications, protein coated nanoparticles (apolipoprotein E coatings) can target specific locations, such as the brain.

However, there may be other currently unknown biological interactions which could pose a potential risk to human health. These risks are compounded by a lack of robust methods to characterise nanomaterials in complex biological matrices.

As the NML, we have been instrumental in developing new international documentary standards (ISO) to support this field. For example, we provided expert input into a newly released Technical Specification (ISO TS 19590:2017) that outlines a novel method (single particle inductively coupled plasma-mass spectrometry, spICP-MS) for determining the size distribution and concentration of nanoparticles in aqueous samples. We’ve also been invited to provide the UK expert view for a new standard on the analysis of nano-objects using a gentle separation technique (field flow fractionation, ISO TS 21362).
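To illustrate the kind of calculation behind spICP-MS sizing (a generic sketch with illustrative values, not the procedure from the Technical Specification): each particle detected as an intensity spike is converted to a mass via calibration, and an equivalent spherical diameter then follows from the material’s density.

```python
import math

# spICP-MS detects individual nanoparticles as short intensity spikes.
# Once a spike has been converted to a particle mass (via calibration),
# an equivalent spherical diameter follows from the material density.

def particle_diameter_nm(mass_g, density_g_cm3):
    """Equivalent spherical diameter in nm from particle mass."""
    volume_cm3 = mass_g / density_g_cm3
    d_cm = (6.0 * volume_cm3 / math.pi) ** (1.0 / 3.0)
    return d_cm * 1e7  # cm to nm

# e.g. a gold nanoparticle (density about 19.3 g/cm³) weighing 1e-17 g
print(f"{particle_diameter_nm(1e-17, 19.3):.0f} nm")  # about 10 nm
```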

These standards have been produced in response to the worldwide demand for suitable methods for the detection and characterization of nanoparticles in food and consumer products. In addition, we provided the particle size reference measurements for a new silica reference material (ERM-FD101b) released this year by the European Commission (EC JRC Directorate F: Health, Consumers and Reference Materials). This material will support the implementation of the EC definition of ‘nanomaterial’.

The NML is co-ordinating the first international measurement comparison study between National Measurement Institutes (under the auspices of the CCQM) on the determination of number concentration of nanoparticles (colloidal gold). An interlaboratory comparison using the same material that is open to industrial and academic laboratories with an interest in nanoparticle analysis will be run in parallel through VAMAS (Versailles Project on Advanced Materials and Standards) in collaboration with NPL. This will allow a comparative evaluation across users and measurement institutes and may lead to the development of new international written standards to support regulation around nanoparticles.

LGC’s involvement supporting the development of nanotechnology regulation, and the underpinning standardisation efforts required at both a national and international level, recognises both the individual expertise of our scientists and our reputation in this field.

Our input will help ensure current and future consumer safety and ultimately protect human health and the environment whilst supporting the growth and development of this enabling technology.

You can read more about the work we do in our Annual Review, and have a look through our case studies to learn about our impact.

A4I is back for another round!

Analysis for Innovators is back! The latest round of the A4I programme from Innovate UK and its partners (LGC, NPL, NEL, & STFC) has now opened, with up to £3M available in total for Round 3.

In our role as the National Measurement Laboratory, we have worked with Innovate UK since the very start of A4I, back in January 2017, and the programme has proved such a success that we are already moving on to the third round!

But be quick to take advantage of this opportunity, as the first stage of the application closes at noon on 6th September. A4I is a unique programme from Innovate UK – it helps UK businesses address difficult problems that restrict their potential productivity and competitiveness. The scope is very wide (chemical, physical, biological and computing), but the problems must be of a measurement or analysis nature.

A4I targets real industry problems that haven’t been solved with existing products or services. As such, it is of interest to companies that have not traditionally considered applying for funding. Any size of business, with any type of measurement or analysis problem, is eligible to apply. If your company makes it past the first stage, you will be matched with us, NPL, NEL or STFC for a consultation. After this stage, some companies will continue to work with us in our own world-class measurement labs.

The first two rounds of the A4I programme have seen us help several companies overcome measurement problems. In Round 1, we worked with the Coconut Collaborative, a manufacturer of coconut yoghurt, and STFC to develop a rapid and robust screening method to detect rancid coconut cream before its use. The use of rancid cream led to lost sales and waste for the company. We helped develop a novel screening approach with multi-spectral imaging, which will help the Coconut Collaborative avoid annual costs of £500k.

We also worked with Sistemic to help ensure the safety of cell therapy products, by increasing the sensitivity of their novel technology, which detects contamination in cell therapy products. Cell therapies are seen as the future of treatment in a number of areas including diabetes and cardiovascular disease. However, one type of cell being used to generate cell therapy products (pluripotent stem cells, or PSCs) has the potential to form tumours. The NML enhanced the sensitivity and specificity of Sistemic’s novel prototype miRNA assay to the levels required for market (<10 cells per million). This assay will ensure producers can accurately assess PSC contamination in their cell therapy products.

Other examples of the companies that were funded under A4I Round 1 can be found at Analysis for Innovators winning projects, and for more information about the work and case studies of the NML at LGC, have a look here at our latest annual review.

And don’t forget to apply now – there’s £3 million up for grabs!

Peanut allergen quantification: a tough nut to crack

As part of the National Measurement Laboratory’s 30th anniversary, we’re sharing stories and case studies from the last three decades.

One of our case studies touches on an issue that affects hundreds of thousands of people across the UK alone: peanut allergies. So read on to learn how LGC scientists developed a unique allergen quality control material, which can be used to help protect people in the UK with a peanut allergy and to prevent contamination in the food production process, potentially saving the food industry millions of pounds.

The problem

The prevalence of peanut allergy has nearly doubled in Europe over the past two decades, and it now affects around 500,000 people in the UK [1]. Peanut allergy is the most common cause of fatal food-allergic reactions. It occurs when the immune system mistakenly identifies peanut proteins as something harmful. The fear of accidental exposure in food reduces the quality of life of peanut allergy sufferers and severely limits the social habits of allergic individuals, their families and even their friends.

It is not only those with peanut allergies who have to worry about the risk of allergic reactions or death by anaphylaxis; the allergen also creates problems for businesses. Testing for allergen proteins in food is difficult, as samples usually contain a lot of protein and it can be difficult to separate out the allergen protein of interest. This has an impact on the ability of manufacturers and suppliers to adequately label their goods, and also has implications for defining threshold levels and detecting food fraud.

All food companies throughout the EU are required by law to declare major allergens, including peanut, if they are included in food products as ingredients. The current labelling rules, brought into force in December 2014 by European Regulation 1169/2011 (the EU Food Information for Consumers Regulation, EU FIC), ensure that all consumers are given highlighted information about the use of allergenic ingredients in pre-packed food [2]. This makes it easier for people with food allergies to identify the foods they need to avoid. The EU FIC also extends to food sold loose or served when eating out. Prevention of cross-contamination with peanut through product testing, validation and verification of cleaning, and checking of ‘peanut-free’ products requires exacting testing.

ELISA (enzyme-linked immunosorbent assay), PCR (polymerase chain reaction) and mass spectrometry (MS) methods can be used to detect food allergens, but there are problems obtaining reliable quantitative results with all three. Prior to this project, there were no suitable reference materials available in the form of a food matrix, making it difficult for laboratories and test-kit manufacturers to validate quantitative methods for allergen measurement.
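To show why quantitation is hard, it helps to see how ELISA results are typically turned into concentrations: absorbance readings are calibrated against standards of known concentration, commonly with a four-parameter logistic (4PL) curve, and reference materials are what anchor those standards. A generic sketch with made-up values, not the kit or data from this project:

```python
import numpy as np
from scipy.optimize import curve_fit

# Four-parameter logistic (4PL) model, a curve shape commonly used to
# calibrate ELISA absorbance readings against known concentrations.
def four_pl(x, a, b, c, d):
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.5, 1, 2, 5, 10, 20])              # standards, mg/kg (made up)
od = np.array([0.08, 0.15, 0.28, 0.55, 0.85, 1.10])  # absorbance (made up)

params, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.0, 5.0, 1.3], maxfev=10000)

# Invert the fitted curve to estimate an unknown sample's concentration.
def estimate_conc(od_value, a, b, c, d):
    return c * ((a - d) / (od_value - d) - 1.0) ** (1.0 / b)

print(f"{estimate_conc(0.4, *params):.1f} mg/kg")
```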

The solution

A quality control (QC) material that is a real food, contains a known amount of a specific allergen protein, and is stable and homogeneous could assist laboratories in the validation and monitoring of their analyses. Consequently, a project was undertaken by LGC to develop a food-matrix peanut allergen QC material.

The chosen matrix was a chocolate dessert product developed for low-dose threshold studies in food-allergic individuals in the European research project ‘EuroPrevall’. Two QC materials were prepared by University of Manchester researchers in the form of chocolate dessert product pastes designed to be reconstituted with water before analysis. One material (LGCQC1011) was prepared as a peanut-free negative control and the other (LGCQC1012) was prepared as a positive control, with the addition of light-roast, partially defatted peanut flour (a commercial food ingredient) to give a peanut protein content of 10 mg kg⁻¹. The pastes were transferred to LGC and packaged in nitrogen-flushed sealed sachets to aid stability, and the units were numbered sequentially in fill order. LGC assessed and proved their homogeneity and stability, underpinned by a validation study of the test method using a commercially available ELISA kit (Romer AgraQuant® Peanut kit). The National Measurement System funded the ELISA kit validation studies, and a Technology Strategy Board and LGC co-funded research and development project established the design and production of the QC material.

Impact

Failure in food allergen management means that ‘food-allergen’ related incidents are the most common reason for product withdrawals and recalls in the United Kingdom, according to the UK Food Standards Agency. The 34 recalls related to allergens in 2010 were estimated to cost stakeholders £10–15 million. In 2013, the number of Allergy Alerts issued to withdraw food or drink products had risen to 47.

Phil Goodwin, MD of Bio-Check (UK), a food allergen test kit manufacturer, has worked in this area for 30 years and welcomes LGC’s recent initiatives:

“The science of food allergen detection, let alone quantitation, has failed to move forward anything like quickly enough since it began in the late 1980s. The emergence of such high quality QC materials as are being produced by LGC is a significant step forward to a time when all commercial test kits can be demonstrated to show good agreement on allergen levels. LGC are to be applauded for taking on this difficult challenge and I urge all allergen kit producers and analysts to use the material to improve their products and results.”

 

[1] http://www.mrc.ac.uk/news-events/publications/outputs-outcomesand-impact-of-mrc-research-2013-14/

[2] http://allergytraining.food.gov.uk/english/rules-and-legislation/

This blog first appeared as a NML case study on the LGC Group website. To learn more about the NML, visit their site here.

What’s funny about your honey?

Ensuring the safety and authenticity of the food we eat is of paramount importance, and there is a growing drive, at both the EU and global level, to ensure the quality control of food and protect the health and safety of consumers. And during the National Measurement Laboratory’s thirty years, we’ve done a lot of work to support reliable measurements in food testing and authentication.

Honey is known to have multiple health and nutritional benefits and is in high demand among consumers. It is defined as the natural sweet substance produced by bees, and there is significant regulation around the composition and labelling of honey in order to protect consumers from food fraud. However, due to the declining numbers of bees, the impact of weather conditions on supply and the high costs of production, honey is expensive. This makes it a prime target for economically motivated food fraud.

Some research suggests that humans began to hunt for honey 8,000 years ago, and the oldest known honey remains, dating back 4,700–5,500 years, were discovered in clay vessels inside a tomb in the country of Georgia.

The ancient Egyptians used honey to sweeten dishes and to embalm the dead, while the ancient Greeks actually practised beekeeping so much that laws were passed about it. Honey was prevalent around the ancient world, being used in ancient India, China, Rome and even among the Mayans. It even plays a role in many religions, representing the food of Zeus, an elixir of immortality, and a healing substance.

And just like any other important product, fraudsters have been faking it for as long as it has been in use. Ancient Greeks and Romans both mention honey adulteration, and back in 1889, Dr Harvey W. Wiley testified in front of Congress that it was the most adulterated product in the U.S.

Honey is still one of the most adulterated food products globally, with a report last year citing that more than 14% of tested samples were adulterated.

There are two types of food fraud associated with honey: adulteration and fraudulent labelling. Honey adulteration typically occurs by substituting cheaper sweeteners, such as high-fructose corn syrup or cane or beet sugar syrup, for honey. Fraudulent labelling occurs because honeys from a particular geographic or botanical source, such as Manuka, command premium prices amongst consumers.

Detecting these types of fraud presents a significant measurement challenge for food regulators: adulterated products show very similar physical and chemical properties to pure honey, and mis-labelled products are, in fact, pure honey, just of lower quality. Several reports indicate that there is more Manuka honey being sold than bees foraging on manuka can produce, which illustrates how often lower-quality honeys are passed off as premium ones in order to maximise profit.

During our thirty years as the National Measurement Laboratory (NML) for chemical and bio-measurement, our scientists have conducted several reviews and studies of methods for detecting honey fraud¹. For instance, nearly forty years ago, scientists began to use stable carbon isotope ratio mass spectrometry (IR-MS) to detect high-fructose corn syrup in honey. As our scientists found², it is possible to identify food fraud in honey using IR-MS, which measures small but observable variations in the ratio of the two stable isotopes of carbon (C-13 and C-12). Sugars, although chemically identical, have a different isotopic signature depending on the way in which the plant processes carbon dioxide. As the majority of honey-source plants use a different pathway from the plant sugars typically used as honey adulterants, it is possible to detect adulteration using IR-MS. The specific geography of the plants also plays a role in the isotopic fingerprint, and IR-MS can be used to determine where honeys originated.
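The isotope arithmetic can be sketched briefly. δ¹³C is the per-mil deviation of a sample’s ¹³C/¹²C ratio from the VPDB standard; most honey-source plants (C3 pathway) sit near −25‰, while C4-derived adulterants such as corn or cane syrup sit near −11‰. A generic illustration with made-up values, after the widely used AOAC 998.12 internal-standard approach rather than the NML’s own study:

```python
R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB reference standard

def delta_13c(r_sample):
    """Per-mil deviation of a sample's 13C/12C ratio from VPDB."""
    return (r_sample / R_VPDB - 1.0) * 1000.0

def c4_sugar_fraction(delta_honey, delta_protein, delta_c4=-9.7):
    """Approximate fraction of C4 sugar, using the honey's own protein
    fraction as an internal C3 reference (after AOAC 998.12)."""
    return (delta_protein - delta_honey) / (delta_protein - delta_c4)

print(f"{delta_13c(0.0109):.1f} per mil")  # about -25.1, a typical C3 value
print(f"{c4_sugar_fraction(delta_honey=-23.5, delta_protein=-25.5):.1%}")  # ~12.7%
```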

However, for these types of measurements to be robust and reliable in detecting food fraud across the supply chain, the comparability of results is critical. To support this, LGC co-ordinated an international comparison study in 2016 for isotope ratios in honey, involving 6 national measurement institutes (NMIs) and 6 expert laboratories (contacted via the Forensic Isotope Ratio Mass Spectrometry (FIRMS) Network); the results between participants showed good comparability.

Demonstrating the comparability of isotope ratio measurements is crucial to detecting many types of food fraud and supporting food authenticity claims, of which honey is just one example. The international study coordinated by LGC demonstrates the measurement framework is in place to support food fraud regulation in the future.

 

¹ D. Thorburn Burns, Anne Dillon, John Warren and Michael J. Walker, 2018, ‘A Critical Review of the Factors Available for the Identification and Determination of Mānuka Honey’, Food Analytical Methods, https://doi.org/10.1007/s12161-018-1154-9.

² Helena Hernandez, ‘Detection of adulteration of honey: Application of continuous-flow IRMS’, VAM Bulletin, 1999, Vol 18, pp 12-14.