Countdown to the SI redefinition

Throughout history, measurement has been a fundamental part of human advancement. The oldest systems of weights and measures discovered date back over 4,000 years. Early systems were tied to physical objects, like hands, feet, stones and seeds, and were used, as we still do now, for agriculture, construction and trade. Yet with new inventions and global trade, ever more accurate and unified systems were needed. In Europe, it wasn’t until the 19th century that a universally agreed measurement system began to be adopted and the International System of Units (SI units) was born.

Now, after years of hard work and scientific progress, we are ready once again to update and improve the SI units. The redefinition of the International System of Units, approved on 16 November 2018 at the General Conference on Weights and Measures, means that the SI units will no longer be based on any physical objects, but will instead be derived from fundamental properties of nature. Creating a system centred on stable and universal natural laws will ensure the long-term stability and reliability of measurements, and act as a springboard for the future of science and innovation.

The redefinition of the SI units will come into force on 20 May 2019, the anniversary of the signing of the Metre Convention in 1875, an international treaty establishing international cooperation in metrology. To celebrate, we’ll be counting down each of the SI units – the metre, second, kilogram, kelvin, mole, candela and ampere. Join us on the 20th of every month to find out where the units are commonly used, how they’re defined, and the changes that will take place!

UNIT OF THE MONTH: METRE

“You’ve never heard of the Millennium Falcon? … It’s the ship that made the Kessel run in less than 12 parsecs!” Han Solo’s description of the Millennium Falcon in Star Wars is impressive, but something’s not quite right. Do you know why? The unit he uses to illustrate the prowess of the Falcon – a parsec – isn’t actually a measure of time, but of length! It probably won’t surprise anyone that Han Solo isn’t very precise when it comes to the physics of his ship, but in fact he isn’t too far from the truth. This is because we use time to define length.

What does this mean? Well, in the case of Han Solo, one parsec is about 3.26 light-years, and a light-year is the distance light travels in one year. Back down on Earth, we have the same method for defining length. In the International System of Units (SI), the base unit of length is the metre, and it can be understood as:

A metre is the distance travelled by light in 1/299792458 of a second.

The reason we use the distance travelled by light in a certain amount of time is because light is the fastest thing in the universe (that we know of) and it always travels at exactly the same speed in a vacuum. This means that if you measure how far light has travelled in a vacuum in 1/299792458 of a second in France, Canada, Brazil or India, you will always get exactly the same answer no matter where you are!
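As a quick sanity check, the arithmetic behind these definitions fits in a few lines of Python. This is an illustrative sketch: the speed of light is exact by definition, but the year length used for the light-year (a Julian year of 365.25 days) and the 3.26 light-years per parsec are rounded conventions, not SI definitions.

```python
# The SI fixes the speed of light in vacuum exactly; the metre follows from it.
C = 299_792_458  # metres per second, exact by definition

def light_distance(seconds: float) -> float:
    """Distance (in metres) light travels in a vacuum in the given time."""
    return C * seconds

# One metre: the distance light covers in 1/299 792 458 of a second.
print(light_distance(1 / 299_792_458))   # ~1.0 metre, by construction

# A light-year, using a Julian year of 365.25 days.
light_year = light_distance(365.25 * 24 * 3600)
print(f"{light_year:.3e} m")             # ~9.461e+15 m

# Han Solo's parsec, at roughly 3.26 light-years.
print(f"{3.26 * light_year:.3e} m")      # ~3.084e+16 m
```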

On 20 May next year the official definition of the metre will change to:

The metre is defined by taking the fixed numerical value of the speed of light in vacuum c to be 299 792 458 when expressed in the unit m s−1, where the second is defined in terms of the caesium frequency, ∆νCs.

We’ll be returning to the definition of the second on 20 March, so join us again then to find out more.

So, what’s the difference? Actually, there’s no big change coming for the metre. Although the wording has been rearranged, the physical concepts remain the same.

Making a mountain out of a Mole Day

Today is Mole Day, chemists’ #1 holiday! Mole Day occurs every year on October 23 from 6:02am to 6:02pm to commemorate Avogadro’s Number and the basic measuring unit of chemistry, the mole.

What is Avogadro’s Number?

Avogadro’s Number is currently defined as the number of atoms in 12 grams of carbon-12, which comes to 6.02 x 10²³.

Amedeo Avogadro was a 19th-century Italian scientist who first proposed, in 1811, that equal volumes of all gases, at the same temperature and pressure, contain equal numbers of molecules (known as Avogadro’s Law). Nearly one hundred years later, in 1909, chemists decided to adopt the mole as a unit of measure for chemistry, defining it by the number of atoms in 12 grams of carbon-12. Jean Baptiste Perrin suggested this number should be named after Avogadro, in recognition of his contributions to molecular theory.

Molecules and atoms are very tiny and numerous, which makes counting them particularly difficult. To put it into perspective, an atom is about one million times smaller than the width of the thickest human hair. It’s useful to know the precise amount of certain substances in a chemical reaction, but calculating the number of molecules would get very messy if we had to write out numbers like 602,214,129,270,000,000,000,000 every time.

Enter Avogadro’s number! Using the mole simplifies complex calculations. Before the mole was adopted, other units were inadequate for measuring such minuscule amounts. After all, one millilitre of water still has 33,456,340,515,000,000,000,000 H₂O molecules!

This doesn’t mean that one mole of different substances is equal in mass or size; the mole simply refers to the number of something, while size and mass vary by substance. For example, a mole of water molecules occupies about 18 millilitres, while a mole of aluminium atoms weighs about 26 grams. A mole of pennies, however, would cover the Earth to a depth of over 400 metres. And a mole of moles would weigh more than half as much as the Moon!
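Figures like these are easy to reproduce. The sketch below takes water’s molar mass as roughly 18 g/mol (an approximation for illustration) and uses the fixed numerical value of the Avogadro constant from the new definition:

```python
# Avogadro constant, fixed exactly by the revised SI definition of the mole.
N_A = 6.02214076e23  # elementary entities per mole

def entities(mass_g: float, molar_mass_g_per_mol: float) -> float:
    """Number of atoms or molecules in a sample, from mass and molar mass."""
    return (mass_g / molar_mass_g_per_mol) * N_A

# One mole of water: ~18 g, which occupies about 18 mL.
print(f"{entities(18.0, 18.0):.3e}")   # 6.022e+23 molecules

# Even one millilitre (~1 g) of water holds an enormous number of molecules.
print(f"{entities(1.0, 18.0):.3e}")    # ~3.346e+22 molecules
```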

Why Mole Day?

Schools around the U.S. and elsewhere use the day as a chance to cultivate an interest in chemistry among students. Mole Day goes back to the 1980s, when an article in The Science Teacher magazine proposed celebrating the day. This inspired other teachers to get involved, and a high school chemistry teacher in Wisconsin founded the National Mole Day Foundation in 1991. The American Chemical Society then planned National Chemistry Week so that it falls in the same week as Mole Day every year.

Every year, chemistry teachers use this as an opportunity to perform fun experiments, bake mole-shaped desserts, and share facts about Avogadro’s number with students, all with the aim of increasing science engagement.

What about the revised SI?

In a previous blog post, we outlined how several units of the International System of Units are undergoing a change. For example, the kilogram will no longer be based on a physical artefact, but on a constant. In the case of the mole, the current definition states that one mole contains as many elementary entities as there are “atoms in 12 grams of carbon-12”. The new definition, which will likely come into effect next May, simply defines the mole as containing exactly 6.02214076 x 10²³ elementary entities. This eliminates any reference to mass and fixes the exact numerical value of the Avogadro constant, so the mole will no longer depend on any substance’s mass.

More Mole Facts

A mole of doughnuts would cover the earth in a layer five miles deep!

All of the living cells in a human body make up just over half a mole.

A mole of rice grains would cover all of the land area on Earth at a depth of 75 metres.

A mole of turkeys could form sixteen earths.

Head on over to our Twitter page to tell us what you think about Mole Day (or share more great facts), and to see what everyone is talking about!

Analysis for Innovators: Supporting industry

The Coconut Collaborative Ltd (CCL) manufactures coconut yogurt for the UK and a wide international market. On the strength of its innovative products and strong market presence, it has become the market-leading coconut brand in the UK.

Quality checks are required to ensure CCL maintains the high quality of product expected by its growing consumer base. If a barrel of coconut cream tainted by rancidity is inadvertently used in manufacture, the resulting yogurt is unsuitable for sale and consumption, and complete batches must be rejected. Checks for rancidity are currently performed manually, with batches of coconut cream being tasted ahead of their use in production. With the growth of the business this is becoming increasingly impractical, but there are currently no automated methods available to test for rancidity.

Through the Analysis for Innovators (A4I) partnership, CCL had access to innovative and advanced measurement and analytical technologies at both the National Measurement Laboratory (NML) and the Science and Technology Facilities Council (STFC) to assess the feasibility of developing a rapid and robust screening approach to detect rancidity in coconut cream.

Impact

Supply specialists, engineers and scientists from CCL, the NML and STFC assessed the feasibility of using multispectral imaging (MSI) and Raman spectroscopy to detect traces of rancid coconut cream ahead of its use in the production of coconut yogurt.

Multispectral imaging (MSI) showed the sensitivity and repeatability needed to screen for and detect rancid coconut cream, performing a non-destructive test in no more than 20 seconds. MSI has also shown potential as a quantitative screening approach for determining the level of rancidity in a sample of coconut cream.

These encouraging results have demonstrated proof of principle for using MSI as the basis for an enhanced level of quality control and screening in CCL’s manufacturing plants. This screening approach will help avoid annual costs in excess of £500k through reduced production and material charges. With further optimisation, MSI could also be used as a predictive tool upstream in the production process, prior to the onset of full rancidity, delivering further efficiency and cost savings for the industry in general.

In addition, the method has been “future proofed” so that it can also be extended to understand variations in coconut cream consistency between batches, suppliers and even geographic origin, as well as screening for the presence of other undesirable materials which could affect the quality of coconut cream.

This project has allowed CCL to continue to support the growth of its business whilst benefiting from the expertise brought by the collaboration with the NML and STFC.

International Coffee Day: Would you like chicory with that?

Today is International Coffee Day and, not that anyone needed an extra reason to have a cup, the world is taking the opportunity to enjoy a fine cup of one of the world’s most popular drinks. But what if we told you that 170 years ago, enjoying a cup of joe often meant drinking a hot cup of roasted root vegetables instead?

Coffee adulteration has been common since at least the early 1800s, when laws were already in place banning the substitution of coffee with other mixtures¹. One of the most common adulterants was chicory, a plant whose roots are baked, roasted and ground for use as a food additive.

Chicory is still in use today as a legitimate additive for various foods, including coffee and beer. But in the 1800s, many sellers advertised their mixtures as ‘pure coffee’ – so much so that our earliest lab, known then as the Government Laboratory, was tasked with analysing samples from coffee mixtures to determine if they were in fact pure coffee.

At that time, the Victorians were a bit obsessed with coffee, England’s most popular drink until tea overtook it in 1853. “In 1840, the year Victoria married Prince Albert, Britain imported 28 million pounds of tea, but we imported more than twice as much coffee at 70 million pounds,” said the Telegraph in a report on an old ONS survey.

Because the market for coffee was so strong, there was financial incentive for adulterating coffee with other substances. Now called economically motivated adulteration, this coffee adulteration led the government to have botanists and chemists study the composition of various plants, and ultimately led to advances in methods of analysing the differences between coffee, chicory and other substances.

Original analytical table in letter from John Lindley to John Wood, 1852.

In a letter dated 9 June 1852², English botanist John Lindley wrote to the Inland Revenue’s Chairman John Wood, “…we have carefully examined samples of Coffee and Chicory in different states and there is no difficulty in detecting their mixture however finely they may be ground if they be examined under a good microscope.” What follows are detailed descriptions of the cells of various substances, including chicory, to which Lindley pointed out, “When roasted Chicory in powder is dropped in mere water, cold, a pale amber yellow cloud will gradually form round each particle; but roasted coffee powder gives out no such colour.”

In another letter, dated 9 November 1852³, Lindley describes how, following their research, they have been able to identify other adulterants, saying “It appears that the articles usually employed for mixing are the Roots of Mangel Wurzel, Turnips, Parsnips and Carrots, or Seeds such as Beans, Peas, Lupines, Acorns and Malt.”

Coffee adulteration still occurs today, but our tools and analysis capabilities, as well as food safety laws, have come a long way. So while you’re enjoying your cup of joe on International Coffee Day, relish the fact that you are sharing a time-honoured tradition with the Victorians – and, more importantly, that you aren’t drinking roasted acorns!

¹Weighed in the Balance, by PW Hammond and Harold Egan, 1992, pp. 40-43.

²Lindley, John. Chicory & Coffee. Letter. London, 9 June 1852. Inland Revenue, Laboratory of the Government Chemist.

³Lindley, John. Chicory & Coffee. Letter. London, 9 November 1852. Inland Revenue, Laboratory of the Government Chemist.

Helping authorities detect fentanyl analogues

The past several years have seen growing awareness of a drug called fentanyl, which is increasingly cited in relation to drug overdoses, including many high-profile deaths, and has become a focus for many law enforcement agencies, especially in the United States.

The U.S. Drug Enforcement Administration (DEA) has named fentanyl the most significant synthetic opioid threat in the U.S. in 2018, while the Centers for Disease Control and Prevention has determined that the rate of drug overdose deaths from synthetic opioids, not including methadone, doubled in just one year from 2015 to 2016. In 2016, opioids caused 42,000 deaths, and nearly half of those were fentanyl-related, including the deaths of Prince, Tom Petty and Lil Peep.

But is fentanyl a new drug? And if not, why has it suddenly become a major factor in the opioid crisis?

Fentanyl was first synthesised in 1959 by Paul Janssen and has been used as a pain reliever and general anaesthetic in operating rooms. While there are legitimate uses for fentanyl, it presents a formidable public health risk, especially in the United States. While some drug users seek it out for recreational use, many are unaware that what they are buying is fentanyl, as illicit drug makers use it to adulterate more expensive pharmaceuticals and opioids, like heroin. It is 80 to 100 times more potent than morphine, and while it only costs $6,000 to purchase one kilogram of fentanyl in a lab, that one kilogram can have a distribution value of up to $1.6 million. This presents an enormous economic motive for replacing common opioids with fentanyl.

Fentanyl analogues, or compounds with a similar molecular structure, make it even more difficult to regulate. Analogues are manufactured in labs, and once one is discovered by law enforcement and outlawed, another analogue is already waiting to be put into use. Some, like carfentanil, are particularly dangerous. Carfentanil is 100 times stronger than fentanyl (making it 10,000 times stronger than morphine), and as such is used to sedate large mammals, like elephants. These highly potent drugs can rapidly incapacitate by causing central nervous system and respiratory depression.

Our Sport and Specialised Analytical Services team performs analysis for forensics laboratories, including those working with police authorities and coroners, to detect and identify drugs in body fluids and drug seizures.  Work is also performed to understand more about how the latest drugs are metabolised in the body. The study of drug metabolism, or pharmacokinetics, is vital to understanding how drugs break down in the human body.  In a forensic environment it is very important to know how the body changes a drug in order to be able to detect it in forensic tests.

LGC scientists Simon Hudson and Charlotte Cutler studied the metabolic fate of several analogues of fentanyl, including carfentanil, and published three white papers on their findings. Each paper goes through their methodology, which can be used as an aid to detect the analogues in biological fluids.

In one of the case studies, Simon studied carfentanil in post mortem blood samples. Not much has been understood about the metabolism of carfentanil, which suggests that the true extent of carfentanil-related deaths is unknown. After analysing over 70 carfentanil cases, Simon found that the parent drug was always present in blood and urine post mortem and that in some cases, due to the low levels of carfentanil, extremely sensitive analytical equipment was required to detect its presence.

In the other papers, Simon and Charlotte studied samples from UK seizures of drugs that were originally reported by authorities in Latvia and Slovenia between December 2016 and August 2017. They were able to identify many metabolites of cyclopropylfentanyl and methoxyacetylfentanyl. These studies are a beneficial tool to help authorities and scientists detect these analogues in the future.

To learn more about the history of fentanyl, its chemistry and current issues, watch our interesting webinar. And to understand more about Simon’s work and studies on the drug, read the white papers on carfentanil, methoxyacetylfentanyl, and cyclopropylfentanyl.

Every DNA counts – and we would know

The National Measurement Laboratory at LGC turned 30 years old this year, and to celebrate we’ve been looking back at notable accomplishments, as well as at where we are now. Clinical measurement is one field where our scientists have excelled and innovated throughout that time.

Clinical measurement “is the development, use, on-going support, and maintenance of technology for diagnosing, aiding or treating patients.” Modern medicine wouldn’t be possible if we couldn’t rely on the accuracy of clinical tests and diagnosis. Poor measurement can lead to misdiagnosis, incorrect prescription and dosage of medicine, or false interpretation of data. Therefore, reliable certified reference materials are absolutely necessary to ensure the quality and accuracy of clinical measurement.

Throughout the last 30 years, the National Measurement Laboratory (NML) at LGC has worked in this area to ensure that testing methods and reference materials are of the highest quality.

In one case study from 2006¹, scientists in the NML developed isotope dilution liquid chromatography-mass spectrometry (IDMS) methodologies that were then used to generate reference values for clinical reference materials (CRMs), including creatinine in frozen serum and testosterone in frozen serum.

In another blog post, we outlined the work we’ve done to improve Alzheimer’s diagnosis, which could lead to techniques for earlier diagnosis of the disease, and in another, we illustrate the importance of harmonising newborn blood spot screening tests to ensure infants are diagnosed and treated early so that they can live as normal lives as possible.

An important part of working in the field of clinical medicine and measurement is communicating our knowledge with other scientists and medical professionals to ensure that good measurement is being performed consistently across the board. We have worked with the NHS and England’s Chief Scientific Officer Sue Hill on doing just that as part of the Knowledge Transfer Partnership Programme, which aims to improve patient care through new approaches to measurement.

And now, our scientists can even count DNA and measure changes to that DNA over time. Identification and targeting of specific genetic sequences forms the basis of many promising advanced healthcare solutions, such as precision (personalised) medicine in cancer, gene therapies to end genetic disorders in children and the detection of pathogenic and non-pathogenic bacteria in a wide spectrum of infectious and autoimmune diseases.

However, the new methods and technologies currently being developed will only achieve their full potential if we can ensure they are safe and reproducible. High-accuracy reference methods are one of the key factors in supporting their development into routine application.

Using tests for guiding treatment of colorectal cancer as a model, our scientists outlined in a paper published in Clinical Chemistry how a range of dPCR assays and platforms compare and how precisely they measure the cancer mutation. An inter-laboratory study of clinical and National Measurement Institute laboratories demonstrated the reproducibility of the selected method. Together these results reveal the unprecedented accuracy of dPCR for measuring the copy number concentration of a frequently occurring gene mutation used to decide on drug treatment.

This study has shown that using high-accuracy dPCR measurements can support the traceable standardisation, translation and implementation of molecular diagnostic procedures that will advance precision medicine.

All of this just goes to show you how far we’ve come in 30 years!

¹VAM Bulletin, Issue 35, Autumn 2006, p. 13. ‘Case Study 3: IDMS certification of clinical reference materials using LC-MS/MS’

Nanotechnology: The big challenge behind the characterisation of the small

Nanomaterials and nanotechnology developments are having an increasingly significant impact on human life, from enabling more targeted cancer treatments to improving the efficacy of vaccines or the delivery of agrochemicals. However, their small size can lead to potentially toxic effects.

To protect human health and the environment, it is crucial that we are able to characterise nanomaterials effectively and understand their behaviour within biological systems. What do we really know about the potential effects when they come into contact with complex matrices and how do we ensure that nanoproducts are safe?

The global market for nanomaterials was estimated by Allied Market Research at $14.7 billion in 2015, and some reports forecast that it will grow to as much as $55 billion by 2022.

We know that the properties of nanomaterials can change significantly when used in complex matrices, such as biological systems, potentially affecting functionality and behaviour. Nanobiotechnology or nanomedical applications exploit these changes. For example, in some therapeutic applications, protein coated nanoparticles (apolipoprotein E coatings) can target specific locations, such as the brain.

However, there may be other currently unknown biological interactions which could pose a potential risk to human health. These risks are compounded by a lack of robust methods to characterise nanomaterials in complex biological matrices.

As the NML we have been instrumental in developing new international documentary standards (ISO) to support this field. For example, we provided expert input into a newly released Technical Specification (ISO TS 19590:2017) that outlines a novel method (single particle inductively coupled plasma-mass spectrometry, spICP-MS) for determining the size distribution and concentration of nanoparticles in aqueous samples. We’ve been invited to provide the UK expert view for a new standard on the analysis of nano-objects using a gentle separation technique (field flow fractionation, ISO TS 21362).

These standards have been produced in response to the worldwide demand for suitable methods for the detection and characterisation of nanoparticles in food and consumer products. In addition, we provided the particle size reference measurements for a new silica reference material (ERM-FD101b) released this year by the European Commission (EC JRC Directorate F – Health, Consumers and Reference Materials). This material will support the implementation of the EC definition of ‘nanomaterial’.

The NML is co-ordinating the first international measurement comparison study between National Measurement Institutes (under the auspices of the CCQM) on the determination of number concentration of nanoparticles (colloidal gold). An interlaboratory comparison using the same material that is open to industrial and academic laboratories with an interest in nanoparticle analysis will be run in parallel through VAMAS (Versailles Project on Advanced Materials and Standards) in collaboration with NPL. This will allow a comparative evaluation across users and measurement institutes and may lead to the development of new international written standards to support regulation around nanoparticles.

LGC’s involvement in supporting the development of nanotechnology regulation, and the underpinning standardisation efforts required at both national and international level, reflects both the individual expertise of our scientists and our reputation in this field.

Our input will help ensure current and future consumer safety and ultimately protect human health and the environment whilst supporting the growth and development of this enabling technology.

You can read more about the work we do in our Annual Review, and have a look through our case studies to learn about our impact.

A4I is back for another round!

Analysis for Innovators is back! The latest round of the A4I programme from Innovate UK and its partners (LGC, NPL, NEL, & STFC) has now opened, with up to £3M available in total for Round 3.

In our role as the National Measurement Laboratory, we have worked with Innovate UK since the very start of A4I, back in January 2017, and the programme has proved such a success that we are already moving on to the third round!

But be quick to take advantage of this opportunity, as the first stage of the application closes at noon on 6 September. A4I is a unique programme from Innovate UK – it helps UK businesses address difficult problems that restrict their productivity and competitiveness. The scope is very wide (chemical, physical, biological and computing) but the problems must be of a measurement or analysis nature.

A4I targets real industry problems that haven’t been solved with existing products or services. As such, it is of interest to companies that have not traditionally considered applying for funding. Any size of business, with any type of measurement or analysis problem, is eligible to apply. If your company makes it past the first stage, you will be matched with us, NPL, NEL or STFC for a consultation. After this stage, some companies will continue to work with us in our own world-class measurement labs.

The first two rounds of the A4I programme have seen us help several companies overcome measurement problems. In Round 1, we worked with the Coconut Collaborative, a manufacturer of coconut yogurt, and STFC to develop a rapid and robust screening method to detect rancid coconut cream before its use. The use of rancid cream led to lost sales and waste for the company. We helped develop a novel screening approach with multispectral imaging, which will help the Coconut Collaborative avoid annual costs of £500k.

We also worked with Sistemic to help ensure the safety of cell therapy products, by increasing the sensitivity of their novel technology, which detects contamination in cell therapy products. Cell therapies are seen as the future of treatment in a number of areas including diabetes and cardiovascular disease. However, one type of cell being used to generate cell therapy products (pluripotent stem cells, or PSCs) has the potential to form tumours. The NML enhanced the sensitivity and specificity of Sistemic’s novel prototype miRNA assay to the levels required for market (<10 cells per million). This assay will ensure producers can accurately assess PSC contamination in their cell therapy products.

Other examples of the companies that were funded under A4I Round 1 can be found at Analysis for Innovators winning projects, and for more information about the work and case studies of the NML at LGC, have a look here at our latest annual review.

And don’t forget to apply now – there’s £3 million up for grabs!

How genotyping is aiding in the fight against malaria

3.2 billion people across 106 countries and territories live in areas at risk of malaria transmission. The serious and sometimes fatal mosquito-borne disease is caused by the Plasmodium parasite – in 2015, malaria caused 212 million clinical episodes and 429,000 deaths.

Malaria has been a public health problem in Brazil ever since it was brought to the region during its colonization. By the 1940s, an estimated six to eight million infections and 80,000 malaria-related deaths occurred every year in the country.

Thanks to a concerted series of malaria control policies, Brazil recorded a 76.8% decrease in malaria incidence between 2000 and 2014 – an effort for which the country was praised by the WHO. In 2014, there were 143,910 microscopically confirmed cases of malaria and 41 malaria-related deaths.

Part of Brazil’s malaria control policy involves the use of primaquine – a medication first made in 1946 – to treat and prevent malaria. It is particularly effective against the Plasmodium vivax parasite that is prevalent in Brazil.

Unfortunately, primaquine can induce haemolytic anaemia in glucose-6-phosphate dehydrogenase (G6PD)-deficient individuals, which may lead to severe and fatal complications. Some 330 million people worldwide are affected by G6PD deficiency, with recent studies suggesting its prevalence could be as high as 10% in Brazil.

Recently, molecular biologists from LGC enabled a cutting-edge study in collaboration with researchers from Brazil and the London School of Hygiene and Tropical Medicine.

The researchers looked for mutations in a sample of 516 male volunteers that could serve as clinical indicators of G6PD deficiency, which can lead to complications in people prescribed primaquine.

Blood samples were collected at hospitals around Brazil during surgeries, and local Brazilian radio stations were also used to ask people to come and donate blood.

Needing a fast and efficient way to generate results at high throughput, the team turned to LGC’s integrated genomics toolkit to facilitate the research. Each sample was screened against 24 KASP assays to assess the genetic bases of G6PD deficiency. In combination with the IntelliQube®, a fully automated point-and-click PCR system, the team collected the data in roughly three hours of instrument time and one hour of hands-on time.

KASP is a flexible, highly specific genotyping technology that can be used to determine SNPs and InDels. KASP uses unlabelled oligonucleotide primers, which gives the technology a cost advantage and allows more data to be generated, increasing data quality.

The data indicate that approximately one in 23 males from the Alto do Juruá could be G6PD deficient and at risk of haemolytic anaemia if treated with primaquine. The authors conclude that routine G6PD deficiency screening to personalise primaquine administration should be considered – particularly as complete treatment of patients with vivax malaria, using chloroquine and primaquine, is crucial for malaria elimination.

The teams are continuing their collaboration to help further research into treatments for malaria, and we can’t wait to see more!

To access the paper, please click here, or to see the IntelliQube in action and learn more about this automated PCR instrument click here.

Sources:

Malaria. (2017, July 13). Retrieved August 8, 2017, from https://www.cdc.gov/malaria/about/index.html

Maia, U. M., Batista, D. C., Pereira, W. O., & Fernandes, Thales Allyrio Araújo de Medeiros. (n.d.). Prevalence of glucose-6-phosphate dehydrogenase deficiency in blood donors of Mossoró, Rio Grande do Norte. Retrieved August 8, 2017, from http://www.scielo.br/scielo.php?pid=S1516-84842010000500017&script=sci_arttext&tlng=en


This blog post was originally published on the Biosearch Technologies blog.

Fatbergs: The monster lurking below

If you haven’t been paying attention to sewer-related news throughout the past few years, you might have missed that fatbergs are a thing. Large (sometimes hundreds of metres long), congealed lumps of fat and other substances, fatbergs have been clogging up the sewer systems under major cities like London, Melbourne, Baltimore and Cardiff.

Just a quick Google search of the word ‘fatberg’ turns up a trove of related videos and news that could gross anyone out. Fatbergs now have their own museum exhibition and were even the subject of a prime-time documentary, Fatberg Autopsy, which is exactly as captivating and weird as it sounds. And just as our fascination with these grotesque reflections of modern life has grown faster than a fatberg in a sewer, so has our understanding of them.

These beasts begin to form when large amounts of cooking oils, fats and grease are dumped into drains, where they thicken. Adding to the frequency of fatbergs is the increased use of wet wipes, which don’t break down in drainage pipes but instead team up with the congealed cooking oils to form a monster from a subterranean horror film. Old pipes, or pipes with rough walls where debris can get trapped and build up, are particularly susceptible.

And despite their moniker, documented fatbergs are mostly made up of wet wipes, which account for 93 percent of the material blocking sewers, while actual fat and grease make up only 0.5 percent. In one case, a fatberg in London had grown to weigh as much as a blue whale, the largest animal known to have ever existed.

Studying the products of human behaviour, like fatbergs, can provide a lot of insight into how people in these cities live.

Simon Hudson, Technical Director of Sport and Specialised Analytical Services at LGC, has been involved with method development and analysis for many projects looking into identifying the makeup of substances found in public systems, like fatbergs. In addition to analysing samples for Fatberg Autopsy, Simon has also worked with scientists from King’s College London, Guy’s and St Thomas’s NHS Foundation Trust and King’s Health Partners, Hull York Medical School and other institutions to analyse anonymised pooled urine from UK cities.

By using various analytical methods on samples from street urinals, the scientists have been able to provide a geographical trend analysis of the recreational drugs and novel psychoactive substances (NPS) that are being used, showing the most common drugs in specific cities.

Studies on recreational drug use have traditionally relied on self-reported user surveys, which are helpful but flawed if respondents either don’t know what drugs they are taking or don’t disclose everything they’ve used. By analysing samples from urinals, these methods can confirm which drugs are actually being used, and they are particularly useful for public health initiatives in identifying new psychoactive substances that may not yet have been reported or known to officials. The analysis also provides insight into common potential adulterants of drugs.

By taking pooled samples from street urinals near night clubs and bars, these studies provide a snapshot of what is happening inside the night life across UK cities.

Findings include everything from nicotine and caffeine to cocaine, cannabis, ketamine, methamphetamine, anabolic steroids and several uncontrolled psychoactive substances. In one specific study¹, cocaine and 3,4-methylenedioxy–methamphetamine (MDMA, Ecstasy) were the most common recreational drugs to turn up, while morphine and methadone were detected in seven and six cities, respectively.

Like his analysis of fatbergs, Simon’s work on urine samples provides insight into the hidden aspects of modern life, the things that aren’t talked about over coffee or seen while heading into the office. They’re also valuable in shaping public health knowledge and responses to potential issues.

If you’re interested in learning more about our science, head over to lgcgroup.com or read Simon’s various publications on pooled urine analysis listed below.


¹Archer, J.R.H., S. Hudson, O. Jackson, T. Yamamoto, C. Lovett, H.M. Lee, S. Rao, L. Hunter, P.I. Dargan, and D.M. Wood (2015). Analysis of anonymized pooled urine in nine UK cities: variation in classical recreational drug, novel psychoactive substance and anabolic steroid use. QJM: An International Journal of Medicine, 108(12), pp. 929-933.

Other publications:

  1. J. R. H. Archer, P. I. Dargan, S. Hudson, S. Davies, M. Puchnarewicz, A. T. Kicman, J. Ramsey, F. Measham, M. Wood, A. Johnston, and D. M. Wood (2013). Taking the Pissoir – a novel and reliable way of knowing what drugs are being used in nightclubs. Journal of Substance Use, 00(0), pp. 1-5.
  2. J. R. H. Archer, P. I. Dargan, H. M. D. Lee, S. Hudson, and D. M. Wood (2014). Trend analysis of anonymised pooled urine from portable street urinals in central London identifies variation in the use of novel psychoactive substances. Clinical Toxicology, 52(3), pp. 160-165. DOI: 10.3109/15563650.2014.885982