NanoSight visualizes, measures and characterizes virtually all nanoparticles. Please contact A&P Instrument Co. Ltd in Hong Kong for details. Email: anson@anp.com.hk
Nanoparticle Tracking Analysis (NTA) with NanoSight equipment. NTA can be used in a variety of research areas, including extracellular vesicle characterization, viral vaccine research and development, development of drug delivery systems, protein aggregation studies, and more.
3. Hardware: laser-illuminated sample chamber + optical microscope
As the schematic below shows, the NanoSight technology comprises:
• a metallised optical element
• illuminated by a laser beam
[Figures: laser as seen at low magnification; NanoSight LM20 device]
Mai 13 www.nanosight.com
4. Fast and easy-to-use system
[Video: loading of a 200 nm latex standard]
5. Titanium dioxide
Polydisperse titanium dioxide particles as seen at standard magnification.
6. Particles are visualised, not imaged
• Particles are too small to be imaged by the microscope
• Particles are seen as point scatterers moving under Brownian motion
• Larger particles scatter significantly more light
• Speed of particles varies strongly with particle size
[Video: microvesicles purified from serum by ultracentrifugation]
7. Principle of measurement
• Nanoparticles move under Brownian motion due to the random movement of the water molecules (red molecules in the movie) surrounding them.
• Small particles move faster than larger particles.
• The diffusion coefficient can be calculated by tracking the movement of each particle; particle size can then be calculated through application of the Stokes-Einstein equation.
8. Sizing is an absolute method
• Brownian motion of each particle is followed in real time via video.
• Tracking software establishes the mean squared displacement and the diffusion coefficient (Dt):
  ⟨(x,y)²⟩ = 4·Dt
• From the Stokes-Einstein equation, the (sphere-equivalent hydrodynamic) particle diameter dh can then be obtained:
  Dt = kB·T / (3·π·η·dh)
where kB = Boltzmann constant, T = temperature, η = viscosity.
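The sizing chain on this slide (track each particle, take the mean squared displacement, obtain Dt, then invert Stokes-Einstein for dh) can be sketched in a few lines of Python. This is an illustrative simulation, not NanoSight's code: the frame rate, temperature, viscosity and the 100 nm test size are assumed values, and the standard two-dimensional relation MSD = 4·D·Δt per frame interval is used.

```python
# Sketch of the NTA sizing chain: simulate a 2D Brownian track for a
# 100 nm sphere in water, recover the diffusion coefficient from the
# mean squared displacement, then invert Stokes-Einstein for the
# hydrodynamic diameter. All instrument values are assumptions.
import numpy as np

KB = 1.380649e-23      # Boltzmann constant, J/K
T = 298.15             # temperature, K (25 C assumed)
ETA = 0.89e-3          # viscosity of water at 25 C, Pa.s
DT_FRAME = 1 / 30      # camera frame interval, s (30 fps assumed)

def stokes_einstein_diameter(d_coeff):
    """Hydrodynamic diameter dh from a diffusion coefficient (m^2/s)."""
    return KB * T / (3 * np.pi * ETA * d_coeff)

def diffusion_from_track(xy, dt):
    """Estimate D from a 2D track: MSD at lag 1 equals 4*D*dt."""
    steps = np.diff(xy, axis=0)
    msd = np.mean(np.sum(steps**2, axis=1))
    return msd / (4 * dt)

# Simulate a track for a true 100 nm particle.
rng = np.random.default_rng(0)
d_true = 100e-9
d_coeff_true = KB * T / (3 * np.pi * ETA * d_true)   # ~4.9e-12 m^2/s
sigma = np.sqrt(2 * d_coeff_true * DT_FRAME)          # per-axis step std
track = np.cumsum(rng.normal(0, sigma, size=(2000, 2)), axis=0)

d_est = stokes_einstein_diameter(diffusion_from_track(track, DT_FRAME))
print(f"recovered diameter: {d_est * 1e9:.0f} nm")    # close to 100 nm
```

Note that, as the slide says, the result is absolute: nothing in the chain needs a calibration standard, only the temperature and viscosity.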
9. Nanoparticle Tracking Analysis
Nanoparticle Tracking Analysis (NTA) gathers unique information from the assessment of individual particles, rather than averaging over a bulk sample. This provides a distinct advantage in determining particle size.
The three steps: capture, tracking, analysis.
10. Particle sizing in action: software analysis
The NanoSight NTA 2.0 (nanoparticle tracking) analysis suite allows captured video footage to be simultaneously tracked and analysed.
[Video: 100 nm + 200 nm nanoparticles being tracked and analysed by NanoSight NTA 2.0]
11. NTA detection limits
Size
• Minimum size (approx. 10-40 nm) is related to:
  - material type
  - wavelength and power of the illumination source
  - sensitivity of the camera
• Maximum size (approx. 1,000-2,000 nm) is related to:
  - limited Brownian motion
  - viscosity of the solvent
Concentration
• Minimum concentration (approx. 10^5-10^6 /mL) is related to:
  - poor statistics (requiring longer analysis time)
• Maximum concentration (approx. 10^9 /mL) is related to:
  - inability to resolve neighbouring particles
  - tracks too short before crossing occurs
12. True size distribution profile
Mixture of 100 nm and 200 nm latex microspheres dispersed in water in a 1:1 ratio.
The particle distribution displays number count vs. particle size.
13. Particles are counted, by number count
Number count of particles in a determined volume gives an absolute concentration measurement (in units of 10^8 particles/mL).
[Plot: global concentration of all populations; concentration of a population selected by the user]
14. Relative intensity scattered
NanoSight has the unique ability to plot each particle's size as a function of its scattered intensity:
• 2D size vs. number
• 2D size vs. intensity
• 3D size vs. intensity vs. concentration
16. Resolving mixtures of different particle types and sizes
[3D plot legend: 100 nm PS, 60 nm Au, 30 nm Au]
In this mixture of 30 nm and 60 nm gold nanoparticles mixed with 100 nm polystyrene, the three particle types can be clearly seen in the 3D plot, confirming the indications of a tri-modal distribution given in the normal particle size distribution plot. Despite their smaller size, the 60 nm Au can be seen to scatter more light than the 100 nm PS.
17. Fluorescent mode available
NanoSight systems can be fitted with:
• a laser diode capable of exciting fluorophores and quantum dots (405 nm, 488 nm, 532 nm or 635 nm)
• a choice of long-pass or band-pass filters, allowing specific nanoparticles to be tracked in high backgrounds
Applications in:
• nanoparticle toxicity studies
• nano-rheology
• bio-diagnostics
• phenotyping specific exosomes
18. Analysis of 100 nm fluorescent standard particles suspended in FBS
• 100 nm fluorescently labelled particles suspended in 100% FBS
• Sample maintained at a constant temperature (37 °C)
• Sample viscosity: 1.33 cP
• Excitation wavelength: 532 nm
• Using a 565 nm long-pass optical filter
Modal particle size peak: 103 nm. Concentration: 9.5 × 10^8 particles/mL.
[Images: no filter vs. 565 nm long-pass optical filter]
The filter allows selective visualisation of fluorescent/fluorescently labelled particles.
19. Dynamic Light Scattering (DLS or PCS)
• DLS, alternatively known as Photon Correlation Spectroscopy (PCS), is deservedly an industry-standard technique and has been widely used for 40 years.
• It also relies on Brownian motion, relating the random motion of particles in liquid suspension to a hydrodynamic radius. The technique relates the rate of change of the speckle intensity to particle size using the autocorrelation function.
• It is a very effective technique for measuring particle size in monodisperse samples from a few nm up to microns.
• Widely used, with a large installed base.
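The autocorrelation idea above can be made concrete with the standard textbook relation for DLS: the intensity autocorrelation decays as exp(-2·Γ·τ) with Γ = D·q², so smaller (faster-diffusing) particles give a faster decay. The snippet below uses assumed instrument values (633 nm laser, 90° detection, water at 25 °C) that are not taken from the slides.

```python
# Illustrative DLS decay-time calculation: Gamma = D * q^2, where q is
# the scattering vector. Laser wavelength, detection angle and solvent
# are assumptions for the example.
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
T = 298.15           # K, 25 C assumed
ETA = 0.89e-3        # Pa.s, water at 25 C
N_WATER = 1.33       # refractive index of water
WAVELENGTH = 633e-9  # m, HeNe laser assumed
THETA = math.pi / 2  # 90-degree detection assumed

q = 4 * math.pi * N_WATER / WAVELENGTH * math.sin(THETA / 2)

def decay_time_s(diameter_m):
    """Characteristic decay time 1/(2*Gamma) of the speckle correlation."""
    d_coeff = KB * T / (3 * math.pi * ETA * diameter_m)
    return 1 / (2 * d_coeff * q**2)

for d_nm in (100, 200):
    print(f"{d_nm} nm sphere: decay time ~ {decay_time_s(d_nm * 1e-9) * 1e3:.2f} ms")
```

The decay time scales linearly with diameter, which is how DLS turns a sub-millisecond intensity fluctuation into a hydrodynamic size.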
20. Comparison of different particle size distributions
[Plots: particle-by-particle method vs. static light scattering vs. dynamic light scattering]
21. NTA vs. DLS: monodisperse
DLS analysis and its NanoSight equivalent for monodisperse systems.
[Plots: DLS vs. NTA]
22. NTA vs. DLS: bimodal sample
Polystyrene reference spheres in water (100 nm and 200 nm). With NanoSight, the analysis shows both the 100 nm and the 200 nm peaks.
23. NTA vs. DLS: bimodal sample
Mixed sample of 100 nm + 200 nm particles at a ratio of 10:1 by weight. In the DLS result the 100 nm particles are not seen, due to intensity bias.
24. NTA vs. DLS in publications
Mixtures of polystyrene of different sizes: NTA (red profiles) vs. DLS (blue bars).
Data reproduced from Filipe, Hawe and Jiskoot (2010), "Critical Evaluation of Nanoparticle Tracking Analysis (NTA) by NanoSight for the Measurement of Nanoparticles and Protein Aggregates", Pharmaceutical Research, DOI: 10.1007/s11095-010-0073-2.
25. NanoSight's advantages vs. DLS/PCS
• NTA is not intensity-weighted towards larger particles, unlike DLS
• While DLS is an ensemble technique, NTA operates particle by particle
• NTA has much higher resolving power for multimodal and polydisperse samples and heterogeneous/mixed sample types
• NanoSight's unique and direct view visually validates the particle size distribution data
• Number vs. intensity vs. size is provided for each particle size class
• NTA provides particle concentration information
• Fluorescence mode
• Zeta potential, particle by particle
• NTA requires no information about collection angle, wavelength or solvent refractive index, unlike DLS
26. Comparing different technologies
Electron microscopy (TEM, SEM):
  Strengths: higher magnification, greater resolution; compositional information, topography and morphology
  Weaknesses: high cost; complex sample preparation procedure
Analytical ultracentrifugation:
  Strengths: high resolution
  Weaknesses: high cost; complex sample preparation and data evaluation; low sample throughput
Coulter counters (e.g. IZON):
  Strengths: very low cost; small size
  Weaknesses: needs calibration; surface charge of the particle influences the result; clogging of the aperture is common
Flow cytometers:
  Strengths: high throughput; sorting of different particle sizes; fluorescence
  Weaknesses: not suitable for vesicle sizing (no accurate refractive index to calibrate against); lower limit is about 300 nm
Dynamic light scattering:
  Strengths: well suited to monodisperse samples from a few nm to a few microns
  Weaknesses: intensity-biased plot; no fluorescence and no counting ability
27. Application example: exosomes
Exosomes and nanovesicles
• Microvesicles: typically in the 100 nm to 1 µm range
• Nanovesicles or exosomes: 30-100 nm
• Originally thought to be cellular debris
• It is now appreciated that these cell-derived particles are of significant importance and play an important role in both cellular communication and signalling
• Recently found to be present at significantly elevated levels in a number of disease conditions
28. Application example: exosomes
Why use NanoSight in this application?
• We COUNT particles: concentration is key to this application. Researchers cannot get good concentration information any other way. Remember that an increase in the number of exosomes could be related to the onset of disease.
• We can work in FLUORESCENCE. This means researchers can fluorescently label these particles and therefore SPECIATE, i.e. they know which exosomes they are looking at.
• High-resolution SIZE.
• All of these parameters in a SINGLE MEASUREMENT, allowing the size, concentration and type of the exosome population to be determined.
29. Application example: exosomes
Comparison of different techniques
• Flow cytometry: limited lower detection limit; cannot detect the smaller exosomes.
• DLS: samples are often not very homogeneous and are therefore not well suited to measurement with DLS. DLS cannot count and cannot phenotype the exosomes, which limits its value here.
• EM: time-consuming and expensive.
31. Application example: purified influenza virus
Why NanoSight?
• Manufacturers are interested in the purity of their product, i.e. the size distribution of particles within a sample.
• Manufacturers are interested in the dosage of the vaccine, i.e. how many viruses are present.
• Manufacturers need to know the ratio of infective to non-infective viruses in their sample.
• Manufacturers are interested in detecting specific viruses as the samples are purified; fluorescence is therefore of interest to them.
• The COUNTING aspect is the most important feature, as manufacturers often use long-winded infectivity assays to characterise their samples.
32. Application example: purified influenza virus
• The ability to count viruses in liquid suspension is essential for those working in vaccine development.
• Particle aggregation and yield quality are factors which need to be understood when developing these viral vaccines.
• Current methodologies for counting, such as the plaque assay, only count infectious particles, which often represent a small component of attenuated vaccines; perhaps only 1% of the product remains infective.
33. Application example: protein aggregation at controlled temperature (15-55 °C)
• NanoSight technology has a unique application in the detection of early-stage aggregation in protein therapeutics.
• The protein monomer is too small to be individually resolved by this technique, but early-stage aggregates are readily detected.
• Protein monomer at high concentration causes a high background noise in the image, with the aggregates forming the resolvable particles.
• Both the size and the number of aggregates can be calculated and studied, providing insight into product stability.
Data reproduced from Filipe, Hawe and Jiskoot (2010), Pharmaceutical Research, DOI: 10.1007/s11095-010-0073-2.
34. Application example: protein aggregation, NTA vs. DLS
• DLS analysis of an aggregated protein sample generally produces a bimodal analysis.
• The protein monomer and the very large aggregates get picked up by DLS, as these are the regions which scatter the most light: the monomer by virtue of its high concentration, and the larger particles by virtue of their size.
• Clearly a protein sample cannot aggregate from monomer to large, micron-sized aggregates with nothing in between. While NanoSight cannot measure the monomer, it can provide valuable information in the 30 nm-and-above range, which is typically the region poorly served by alternative techniques.
35. Application example: drug delivery systems
The size of a liposome will affect:
1. its circulation and residence time in the blood,
2. the efficacy of the targeting,
3. the rate of cell absorption.
Advantages of using liposomes for drug delivery:
• Drugs can be targeted to specific areas by attaching ligands to the liposome
• Readily absorbed by cells
• The rate of drug release can be controlled
• Allows potentially lower doses of drug to be used, reducing toxicity and side-effects
36. The NanoSight system is widely applicable
• Multi-walled carbon nanotubes
• Cosmetics
• Foodstuffs
• Ink-jet inks and pigment particles
• Ceramics
• Fuel additives
• Liposomes and other drug delivery vehicles
• Virus and VLP samples
• Exosomes and nanovesicles
• Nanobubbles
• Magnetic nanoparticles
• Precursor chemicals for wafer fabrication
• Quantum dots
• Polymers and colloids
• CMP slurries
38. Selected users
• BASF, Europe
• BP Castrol, Europe
• DuPont, USA
• Epson, Japan
• ExxonMobil, USA
• GlaxoSmithKline, Europe
• Merck, USA
• MedImmune
• Nestlé, Europe
• NIST, USA
• Novartis, Europe
• Procter & Gamble, USA
• Roche, Europe
• Smith & Nephew, Europe
• Solvay, Europe
• Toshiba, Japan
• Unilever, Europe
• US EPA
• US Forces
• Wyeth Biopharma, USA
39. NTA devices: worldwide mapping
> 500 devices
≈ 800 publications
The above image shows a schematic of the laser sample chamber. Within the laser sample unit is a 35 mW, 638 nm red laser diode, a glass prism, and a top plate which forms a seal with the glass optic. The top plate and glass optic sandwich a layer of liquid between them which contains the sample. The laser beam hits the angled surface of the glass optic at 90 degrees and as such passes straight through this leading edge. The laser beam then propagates from the glass optic into the liquid layer. It enters the liquid layer from the glass at an angle and, due to the difference in refractive index between the glass and the liquid, the laser beam refracts across the surface of the glass optic. The high-power-density, high-SNR laser beam passes through the sample, and the nanoparticles scatter light which is collected using the microscope objective.
This video demonstrates the instrument in use and how easy it is to load a sample. Inject the sample using a syringe through the Luer ports into the sample chamber; this must be done slowly and vertically to avoid air bubbles. As the sample fills the cell, the red laser beam emerges from the right to the left-hand side of the chamber. At this point you can do an eyeball check on your sample: if the beam is "thick" then there are lots of particles and it is likely your sample is too concentrated. You ideally want to see a thin red line. Slot the sample chamber back onto the main instrument and there is an instant live feed from the camera to the PC, so you can visualise your samples immediately. In total this takes about 10 seconds. The sample chamber is also easy to clean, and can simply be flushed through when using the same solvent. Cleanliness is clear in the view the instrument provides.
Here is a polydisperse sample of titanium dioxide in water in NanoSight video format. The image is information-rich and the user can establish many things about the sample at a glance: the largest particle is about 500 nm (we know this because we are beginning to see its shape, i.e. it is no longer truly spherical); the smallest ones are around 60-70 nm and are point scatterers; in between those sizes we see a range of diffraction patterns (Mie scatter, Airy rings); and nothing is bigger than 1 micron.
Please note the viruses are not imaged directly: they are smaller than the wavelength of light and hence cannot be resolved. This is the reason you can image viruses in EM, where a low-wavelength electron source is used, allowing much smaller particles to be resolved. As such, the technique images the light scattered from the particles and not the particles themselves. The particles act as point scatterers, and the scattered light is used to identify the position of each particle for the purpose of tracking its movement. No information is available on the surface characteristics of the nanoparticle.
The principle of measurement for NTA is Brownian motion. The key factors pertinent to our measurement are: nanoparticles move randomly under solvent bombardment at a speed related to their size (large particles move slowly under Brownian motion whilst smaller particles move fast); and Brownian motion is independent of particle density, so 100 nm gold particles will move at the same speed as 100 nm polystyrene particles.
The technique measures the mean squared displacement, from which the particle diffusion coefficient can be calculated. The Stokes-Einstein equation can then be applied to calculate particle size. The size calculated is a sphere-equivalent size. The user needs to know the viscosity and temperature of the solvent, as these parameters influence the particle diffusion. No information is required on particle density or the solvent refractive index, as the technique is independent of these parameters. NTA is an absolute method requiring no calibration.
Unlike standard industry particle analysis, NanoSight tracks the movement of each individual nanoparticle/virus undergoing Brownian motion to establish particle size. This is known as Nanoparticle Tracking Analysis (NTA) and is unique to, and patented by, NanoSight. The specially designed NTA software determines the particle size in a three-step process. Capture: the software captures a movie file of the particles moving under Brownian motion. Tracking: the software follows the movement of each particle independently and then calculates the mean squared displacement for each particle. Analysis: application of the Stokes-Einstein equation means that particle size can be calculated for each particle.
This is the software in action; we are processing a recorded clip. You can see the individual particles being tracked and a calculation being made for each track. The longer the processing time, the better the repeatability of the result: approximately 40 seconds for a monodisperse system and about twice as long for a polydisperse system. The NTA programme allows for:
1. automatic optimisation of camera settings,
2. automatic rejection of particle tracks which cross,
3. automatic adjustment of optimum track length selection,
4. image noise reduction,
5. sample drift correction.
All these features contribute to a reproducible and accurate nanoparticle measurement system. What information does the NTA analytical software provide? Real-time dynamic nanoparticle visualisation; particle-by-particle analysis; particle counting and sizing; particle scatter intensity vs. count and size (3D plot); particle size distributions displayed as histograms, with output to spreadsheet; and full reporting and batch-processing facilities.
The size range is 10-1000 nm (1 micron). The lower end of the size limit is affected by the refractive index of the particle. The ratio of the refractive index of the solvent to that of the particle is also important: if the refractive indices are too similar ("index-matched") then no scatter occurs and particles cannot be detected or tracked. When working with samples near the lower limit of detection, there is no substitute for sample work, which NanoSight are always happy to undertake.
Unlike any other method, NanoSight is able to offer an approximate number count for sub-micron particles, and this is more accurate than a volume density calculation. This is because NanoSight tracks the individual particles, rather than an average as in DLS. The field of view is 120 x 80 x 20 microns, so we know we are counting within a discrete volume. The typical concentration range is 10^6-10^9 particles/mL; above this you will need to dilute. The video illustrates the accurate count of a bimodal sample of latex spheres at 100 and 200 nm dispersed 1:1 in water. The particle count can be seen on this section of the screen. The graph again shows particle size versus particle concentration. This cannot be directly compared to DLS/PCS, as those graphs are based on light intensity.
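A rough sketch of how a per-frame particle count becomes an absolute concentration, using the 120 x 80 x 20 micron field of view quoted above (the 20-particles-per-frame figure and the dilution factor are made-up example inputs):

```python
# Per-frame count to concentration: the scattering volume is fixed by
# the field of view and illuminated depth, so concentration is simply
# mean particles per frame divided by that volume.
FIELD_UM = (120, 80, 20)  # x, y, depth in micrometres (from the notes above)

# 1 cubic micrometre = 1e-12 mL
VOLUME_ML = FIELD_UM[0] * FIELD_UM[1] * FIELD_UM[2] * 1e-12

def concentration_per_ml(mean_particles_per_frame, dilution_factor=1):
    """Absolute concentration of the original sample, in particles/mL."""
    return mean_particles_per_frame / VOLUME_ML * dilution_factor

# Example: an average of 20 particles in view, sample diluted 100x before loading.
print(f"{concentration_per_ml(20, dilution_factor=100):.2e} particles/mL")
```

With this volume, roughly 20 particles in view corresponds to about 10^8 particles/mL, which is consistent with the working range quoted above.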
Here is the scatter intensity data added as a third axis, giving improved resolution. This is a unique capability of NanoSight.
Photon Correlation Spectroscopy (PCS), also known as Dynamic Light Scattering (DLS), is an industry-standard technique which relates Brownian motion to particle size. DLS does not generate an image of the particle; rather, it analyses the fluctuations in light intensity caused by the movement of the particles within a sample. Light scattered from particles is proportional to the 6th power of the particle radius, hence a 200 nm particle will scatter 64x more light than a 100 nm particle. Because the technique looks at fluctuations in light intensity, it has an inherent bias towards larger particles within a sample, by virtue of the fact that they scatter more light. The technique produces an average particle size, averaged over hundreds of thousands of particles. As such it is very well suited to analysis of monodisperse samples. Due to the skew towards larger particles, it is less well suited to samples with large aggregates or samples which are polydisperse. Accurate information on particle concentration cannot be obtained through DLS.
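The d^6 scaling is easy to see numerically. A small illustration (plain Python, not tied to any instrument) of how an intensity-weighted average buries the smaller population of an equal-number 100/200 nm mix:

```python
def intensity_ratio(d_large_nm, d_small_nm):
    """Relative scattered intensity in the Rayleigh regime (~ d^6)."""
    return (d_large_nm / d_small_nm) ** 6

print(intensity_ratio(200, 100))  # 64.0

# Intensity-weighted mean size of an equal-number 100/200 nm mix:
sizes = [100, 200]
weights = [d ** 6 for d in sizes]
iw_mean = sum(d * w for d, w in zip(sizes, weights)) / sum(weights)
print(round(iw_mean, 1))  # 198.5 nm: the 100 nm population is all but invisible
```

A number-weighted mean, as NTA reports, would give 150 nm for the same mix, and a particle-by-particle histogram would show both peaks.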
This slide compares the two systems' results for a monodisperse sample. As you can see, the NanoSight technique has good correlation with DLS for monodisperse samples. The only difference is that NanoSight can also produce an estimated number count.
And so to a bimodal dispersion. Here we have a mixed sample of 100 and 200 nm particles. The video shows the two particle types in the sample, with the 200 nm particles being brighter and slower-moving. You can clearly see with NanoSight that there are two sub-populations of particles.
With the bimodal sample, PCS/DLS misses the 100 nm particles because it assesses scatter intensity, and the signal from the 100 nm particles is much weaker than that from the 200 nm particles. The larger particles therefore occlude the smaller ones: you can liken this to looking for candlelight within a floodlight beam.
Direct view of what is in the sample – this is unique to NanoSight. Seeing is believing. Not an average, but a particle-by-particle count: we are tracking each individual particle and are able to give you number, intensity and size. Far less intensity-weighted towards larger particles, making the instrument perfect for polydisperse samples, but also great for checking your DLS results, particularly if you are using the instruments for QA purposes, whether mono- or polydisperse. Note: we went into one lab and found their apparently pure lab water was contaminated with so many particles that three years' worth of results were incorrect! The NanoSight NTA 2.0 software allows users to watch the particle size distribution grow as the analysis progresses through the captured footage. There is no requirement to input information on angle, wavelength or solvent refractive index. DLS is great for QA, as samples there tend to be very stable, but for troubleshooting in process development, R&D labs benefit from NanoSight.

Summary of the advantages of NTA over DLS:
- NTA conducts particle-by-particle analysis, compared to DLS's average, so NTA avoids DLS's bias towards larger particles.
- NTA is much better for polydisperse systems, where it can better resolve several peaks in a sample population.
- NTA can detect at much lower concentrations, down to 10^7 particles per ml.
- NTA gives a good estimate of concentration; DLS cannot.
- NTA does not need calibration, and knowledge of refractive index is not required (a key weakness in DLS).
- NanoSight images are rich in information, and they provide a quick and easy way to check a sample and validate the results.

Advantages of DLS over NTA:
- DLS samples many more particles.
- DLS is ideally suited to monodisperse particle systems.
- DLS has a wider size range (3 – 6,000 nm compared to NTA's 10 – 1,000 nm).
NanoSight is widely applicable. Contact us if your particles do not appear on this list.