Testing for Food Authenticity
Introduction
A conservative estimate of the annual cost of food fraud is around $15 billion globally, affecting 10% of all commercially sold food products.
Food fraud is of course not a new
concept. Historical and recent
records are littered with examples
where unscrupulous individuals have
adulterated both high value (e.g.
saffron and caviar) and mass
produced foods (e.g. fish, coffee,
wine) with low value substitutes to
increase their profits. In many cases
the additives are not fit for human
consumption. The first law set down in
British statute to combat food fraud
was the Assisa panis et cervisiae
(Assize of bread and ale) in the
13th Century. This was enacted to
regulate the price, weight and quality
of beer and bread. Since then, numerous other laws have been implemented to address this issue, including the European FIC Regulation, which outlines the rules for general food and nutrition labelling.
The relatively recent horsemeat
scandal is a prime example of the
impact that food fraud can have on
businesses and individuals. A year
after the initial reports, sales of frozen
ready meals were down 6% year on
year, which led to the demise of
several enterprises, such as Spanghero
and Silvercrest Foods, along with
associated job losses and a loss of
trust in some major supermarkets and
food producers, even those who were
not directly involved in the crisis. A key question posed at the end of this crisis was: how and why did it happen? Clearly the root cause was fraud, but how did it become widespread, and why did it go undetected for so long?
The UK government commissioned a
review, led by Prof. Chris Elliott, which
addressed these questions; his report
recommended that there should be
zero tolerance of food fraud and that
there needs to be a focus on
intelligence gathering.
Current Authenticity Analysis
When undeclared horsemeat was first
detected in processed foods in 2012,
only two analytical methods were
commonly in use in both commercial
and regulatory laboratories to detect
the presence of meat contaminants:
assays based on Enzyme Linked
Immunosorbent Assay (ELISA) or
Polymerase Chain Reaction (PCR). As with many analytical tests, neither technology is designed to detect meat directly; rather, both detect specific markers within a matrix that are unique to a particular mammalian species: ELISA uses antibodies to detect specific proteins, and PCR uses oligonucleotide primers to detect specific DNA sequences. Both assays can rapidly and reliably detect contamination of meat by surrogates in the majority of samples; however, they work very differently and have some specific drawbacks.
On the whole, most service laboratories that offer ELISA-based analysis do so using kits developed by a small number of manufacturers. These kits, although fit for purpose, are somewhat limited in their capabilities: they typically detect contaminants at a level of 1%, and only for a fixed, limited set of species (key contaminants such as chicken, pork, beef, sheep and horse meats). These restrictions are predominantly a result of the time
and significant cost associated with identifying and validating antibodies that are both sensitive (so as to avoid false negatives) and specific (so as to avoid false positives), and of the return on investment that can be made from selling such kits.
Furthermore, due to the way proteins
are modified during processing,
different antibodies and therefore
different kits are required to detect
both cooked and uncooked
contaminants.
PCR assays are also offered by
numerous kit manufacturers to detect
key contaminants. However, analytical laboratories with active molecular biology research capabilities are able to develop and validate assays with varying limits of detection (typically between 0.01% and 1%) and accredit them to ISO 17025 standards relatively rapidly (within weeks). The cost of development is relatively low (as primary analysis can be performed in silico), the costs of key reagents are negligible, and most analytical laboratories will have quality control material to hand. This allows for a relatively rapid response once a novel contaminant has been identified.
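As a rough illustration of that in silico step, the toy Python sketch below screens a hypothetical primer pair against invented reference sequences. Every name and sequence here is an assumption for illustration only; real assay development screens candidates against full sequence databases with dedicated alignment tools.

```python
# Toy in silico primer-specificity screen. The reference sequences and the
# primer pair are invented for illustration; real development screens
# candidates against public sequence databases with alignment tools.
REFERENCES = {
    "horse":   "TTGACCGAGCTAGGGATCCTTACGATCGAA",
    "beef":    "TTGTCCGTGCAAGGGATCATTACGTTCGAA",
    "chicken": "CTGACCGTGCTTGGGATCATTACGTACGAA",
}

FORWARD = "TTGACCGAGC"   # hypothetical horse-specific forward primer
REVERSE = "TTCGATCGTA"   # hypothetical reverse primer

def reverse_complement(seq: str) -> str:
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def would_amplify(template: str) -> bool:
    """Crude check: both primer sites present, forward site before reverse."""
    fwd = template.find(FORWARD)
    rev = template.find(reverse_complement(REVERSE))
    return fwd != -1 and rev != -1 and fwd < rev

for species, seq in REFERENCES.items():
    print(species, "->", "amplifies" if would_amplify(seq) else "no product")
# horse -> amplifies; beef -> no product; chicken -> no product
```

A pair that amplifies only the target species in such a screen would then go forward to wet-lab validation against quality control material.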
Despite the advantages afforded by
PCR and the fact that several
laboratories were offering horse
contaminant detection prior to 2012,
the assays were not performed. Why
not? As horsemeat was rarely processed alongside most of the commercial meats consumed in the UK, and was not considered to be a cheaper meat, any risk-based profile would have suggested that it was highly unlikely to be a contaminant in processed food. This, coupled with the relatively high costs of authenticity services (c. >£150 for detection of a single contaminant, with additional contaminants charged at c. £50 per sample), resulted in only certain commonly processed meats being assessed. With hindsight, more testing should have been done. However, given the large number of commonly eaten mammalian species globally, the cost of detecting all potential contaminants is vast and could not practically have been met by either PCR- or ELISA-based methods. It is clear that the analytical service provision was inadequate to support the food industry in policing and preventing the sort of unforeseen food fraud exemplified by the horsemeat scandal. However, novel diagnostics are now available to detect far larger numbers of contaminants in a cost-effective manner.
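To put those economics in perspective, here is a back-of-envelope sketch using the indicative prices quoted above (c. £150 for the first contaminant, c. £50 for each additional one); the species counts are arbitrary.

```python
# Rough per-sample cost of screening for N contaminant species by
# single-target assays, using the indicative prices quoted in the text:
# c. £150 for the first contaminant, c. £50 for each additional one.
def screen_cost(n_species: int, first: int = 150, additional: int = 50) -> int:
    return first + additional * (n_species - 1)

for n in (1, 5, 20, 100):
    print(f"{n:>3} species: £{screen_cost(n):,}")
#   1 species: £150
#   5 species: £350
#  20 species: £1,100
# 100 species: £5,100
```

Even a hundred-species screen per sample is prohibitively expensive at scale, yet still covers only a fraction of commonly eaten species; this is the gap the NGS approach described next aims to close.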
Next Generation Sequencing: the
future of authenticity analysis?
The most recent advance to be offered commercially is the detection of contaminants based on Next Generation Sequencing (NGS) technology (also known as massively parallel sequencing).
technology has advanced
significantly over the last 20 years. The
announcement of the completion of the first human genome sequence came in 2003; the project took around 13 years and cost $2.7 billion. In 2014, Illumina launched the HiSeq X Ten sequencer, capable of sequencing a human genome in 2-3 days at a cost of $1,000 per genome, with a throughput of 18,000 genomes a year. However, this
technology comes with a requirement
for significant investment, with the
hardware alone costing $10 million.
Fortunately, in parallel, Illumina and Life Technologies have adapted their sequencing technologies (the MiSeq and the Ion PGM respectively) to offer lower-cost platforms that provide cost-effective solutions for analytical laboratories.
NGS differs significantly from Sanger
sequencing. With Sanger sequencing, a large amount of template DNA is needed; this is typically in the form of a PCR product. To get an accurate output, the input needs to be homogeneous, i.e. for a meat authenticity test the DNA must all come from the same species. So if a complex matrix is assessed, even one containing only two different species, a sequencing reaction will fail to generate meaningful data. NGS technologies require only a single DNA molecule to derive an output and are termed ‘massively parallel’ because each single DNA molecule is sequenced independently in a ‘microreactor’. In this way very complex mixes of DNA can be assessed. This advance has been exploited by several research groups, which have developed methods of analysing and deconvoluting complex microbial communities (microbiomes): in clinical samples, to identify the causative agents of disease, and in foods, to investigate the microbes responsible for various fermentation processes and spoilage.
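To make the contrast concrete, the toy simulation below (invented eight-base sequences, a hypothetical 10% horse contamination in beef) shows how a single pooled Sanger-style read-out becomes ambiguous wherever the species differ, while per-molecule sequencing keeps every read assignable.

```python
# Toy contrast between pooled (Sanger-style) and per-molecule (NGS-style)
# read-out of a mixed template. Sequences are invented for illustration.
beef  = "ACGTTAGC"
horse = "ACGATTGC"
mix = [beef] * 9 + [horse]          # 10% horse contamination

# Sanger: one trace for the whole pool; positions where the templates
# disagree give overlapping peaks, i.e. ambiguous base calls ('N').
pooled_call = "".join(
    base if all(t[i] == base for t in mix) else "N"
    for i, base in enumerate(beef)
)
print(pooled_call)                  # ACGNTNGC: ambiguous at every difference

# NGS: each molecule is sequenced independently, so every read stays
# intact and can be assigned to a species (and counted).
for read in sorted(set(mix)):
    print(read, "->", "beef" if read == beef else "horse")
```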
Schematic representation of the analysis of a sample containing two species by:
A: ELISA, using a specific antibody (blue) that binds epitopes in the ‘blue’ species but not the ‘red’. The example shows detection by sandwich ELISA.
B: PCR, using oligonucleotides (blue) that can prime amplification from the ‘blue’ species but not the ‘red’. The example shows detection of double-stranded DNA using a fluorescent binding dye (SYBR Green).
C: NGS amplicon sequencing, using universal PCR oligonucleotides (black) that can prime amplification from all species. The amplified products are sequenced and the outputs are compared in silico to a database of sequences from several thousand species. In this way all contaminants (so long as their sequences are represented in the database) can be identified.
These assays, frequently termed ‘amplicon-based metagenomics’, employ three key steps: PCR amplification (specifically, amplification of a target region such as the mitochondrial 16S, 5S or 12S rRNA), followed by NGS and in silico comparison of each output sequence to a reference database. In contrast
to standard PCR assays, which
provide species-specific results
because oligonucleotide primers are
designed to highly variable regions of
a genome, NGS assays employ PCR to
amplify a fragment of DNA using
primers designed to regions of
exceptionally high conservation (i.e.
sequences common in all species)
that flank regions with relatively low
sequence conservation (i.e. vary
significantly from species to species).
PCR products are amplified from all
species in a mixed sample and each
DNA strand is independently
sequenced. By comparing the output
from the sequencer, which generates
in excess of 8 million sequence reads
even on the most basic platform, to
an annotated database of
sequences, one can discern the
nature of a sample. Whereas in a
microbiome sample there could be
several hundred thousand different
bacterial species or millions of fungal
species, the number of mammalian or plant species is significantly lower, making the analysis somewhat easier.
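A minimal sketch of that comparison step is given below, assuming a toy reference database of short invented sequences and a naive percent-identity score; production pipelines use curated databases of real target-region sequences and dedicated alignment tools such as BLAST.

```python
from collections import Counter

# Toy version of the in silico comparison step in amplicon-based
# metagenomics: assign each read to the best-matching reference species.
# The sequences and the naive scoring are purely illustrative.
REFERENCE_DB = {
    "Bos taurus (beef)":      "ACGTTAGCCTAG",
    "Equus caballus (horse)": "ACGATTGCCTAG",
    "Sus scrofa (pork)":      "TCGTTAGGCTAA",
}

def identity(a: str, b: str) -> float:
    """Fraction of positions at which two equal-length sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def classify(read: str, min_identity: float = 0.9) -> str:
    """Assign a read to its best-matching reference, or 'unclassified'."""
    species, score = max(
        ((name, identity(read, seq)) for name, seq in REFERENCE_DB.items()),
        key=lambda pair: pair[1],
    )
    return species if score >= min_identity else "unclassified"

# A mixed sample: each read derives from one independently sequenced molecule.
reads = ["ACGTTAGCCTAG"] * 90 + ["ACGATTGCCTAG"] * 10
print(Counter(classify(r) for r in reads))
# Counter({'Bos taurus (beef)': 90, 'Equus caballus (horse)': 10})
```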
An early attempt to adapt this
technology to the detection of
contaminants in food showed some
promise and now an NGS assay has
been commercialised for the
detection of in excess of 7000 different
meat and plant contaminants.
Conclusion
The introduction of an NGS assay to
detect several thousand biological
contaminants simultaneously removes
the need for the food industry to
second-guess what adulterant will be
used by fraudsters and provides
manufacturers, retailers and
authorities with the ability to identify
fraud that would have hitherto gone
undetected. However, significant challenges still face the analytical industry; for example, laboratories are still unable to accurately quantify the level of adulterants in food. NGS technologies have the potential to be adapted to provide true quantitative analysis, by performing deep sequencing without PCR amplification (i.e. true metagenomic analysis); however, these assays are still at the development stage.
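The root of the quantification problem can be sketched in a few lines: species-specific PCR amplification efficiencies differ, so after 30 cycles the read proportions can drift well away from the true tissue proportions. The efficiencies below are invented purely for illustration.

```python
# Why amplicon read counts are not reliably quantitative: per-species PCR
# amplification efficiency differs, so observed proportions drift away
# from the true ones. Efficiencies here are hypothetical.
true_fraction  = {"beef": 0.95, "horse": 0.05}
pcr_efficiency = {"beef": 0.90, "horse": 0.97}   # per-cycle, invented
CYCLES = 30

# In exponential phase, each template is multiplied by (1 + efficiency)**cycles.
amplified = {sp: f * (1 + pcr_efficiency[sp]) ** CYCLES
             for sp, f in true_fraction.items()}
total = sum(amplified.values())
for sp in amplified:
    print(f"{sp}: true {true_fraction[sp]:.0%}, observed ~{amplified[sp] / total:.0%}")
# beef: true 95%, observed ~87%; horse: true 5%, observed ~13%
```

PCR-free deep sequencing removes this amplification step entirely, which is why true metagenomic approaches are the route to genuinely quantitative results.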
