This document surveys characterization techniques for nanoparticles. It describes microscopy methods, including scanning electron microscopy (SEM), transmission electron microscopy (TEM), and scanning tunneling microscopy (STM), that determine nanoparticle size, shape, composition, and crystalline structure at high resolution. Spectroscopy and scattering methods such as X-ray diffraction (XRD), small-angle X-ray scattering (SAXS), X-ray photoelectron spectroscopy (XPS), UV-vis spectroscopy, and Fourier-transform infrared spectroscopy (FT-IR) are also outlined. The key techniques of SEM, TEM, XRD, and SAXS are then explained in more detail, covering their basic principles and the types of information each provides about nanoparticles.
Introduction
Nanoparticle characterization techniques
Electron microscopy
Scanning electron microscopy
Transmission electron microscopy
X-ray powder diffraction
Nuclear magnetic resonance
Nanoparticles are solid colloidal particles ranging in size from 10 to 1000 nm.
Nanoparticles are made of a macromolecular material which can be of synthetic or natural origin.
Characterization methods - Nanoscience and nanotechnologies (NANOYOU)
An introduction to characterization methods.
This chapter is part of the NANOYOU training kit for teachers.
For more resources on nanotechnologies visit: www.nanoyou.eu
2. CHARACTERIZATION OF NANOPARTICLES
Characterization refers to the study of a material's features, such as its composition, structure, and various properties (physical, electrical, magnetic, etc.).
Importance of characterizing nanoparticles:
Nanoparticle properties vary significantly with size and shape. Accurate measurement of nanoparticle size and shape is therefore critical to their applications.
6. Basic principle
When a beam of electrons strikes the surface of the specimen and interacts with the atoms of the sample, signals in the form of secondary electrons, backscattered electrons, and characteristic X-rays are generated. These signals contain information about the sample's surface topography, composition, etc.
7. What can you see with an SEM?
- Topography: texture/surface of a sample
- Morphology: size, shape, and order of particles
- Composition: elemental composition of the sample
- Crystalline structure: arrangement present within the sample
8. Operation modes
There are three modes:
- Primary: high-resolution (1-5 nm) secondary-electron imaging
- Secondary: characteristic X-rays, used to identify the sample's elemental composition by the EDX technique
- Tertiary: backscattered-electron images, which give clues to the elemental composition of the sample
9. Electronic devices are used to detect and amplify the signals and display them as an image on a cathode-ray tube in which the raster scanning is synchronized with that of the microscope.
10. In a typical SEM, the beam passes through pairs of scanning coils or pairs of deflector plates in the electron column to the final lens, which deflect the beam horizontally and vertically. The image displayed is therefore a distribution map of the intensity of the signal being emitted from the scanned area of the specimen.
14. Advantages:
1- Bulk samples can be observed and a larger sample area can be viewed.
2- Generates photo-like images.
3- Very high-resolution images are possible.
4- SEM can yield valuable information regarding purity as well as the degree of aggregation.
Disadvantages:
1- Samples must have surface electrical conductivity.
2- Non-conductive samples need to be coated with a conductive layer.
3- Time-consuming and expensive.
4- Sometimes it is not possible to clearly differentiate nanoparticles from the substrate.
5- SEM cannot resolve the internal structure of these domains.
16. What can we see with a TEM?
- Morphology: shape, size, and order of particles in the sample
- Crystalline structure: arrangement of atoms in the sample; defects in the crystalline structure
- Composition: elemental composition of the sample
17. Basic principle
A crystalline sample interacts with the electron beam mostly by diffraction rather than by absorption. The intensity of the diffraction depends on the orientation of the planes of atoms in the crystal relative to the electron beam. A high-contrast image can be formed by blocking deflected electrons, which produces a variation in electron intensity that reveals information about the crystal structure. This can generate both 'bright-field' and 'dark-field' images.
22. Advantages:
1- Additional analysis techniques, such as X-ray spectrometry, are possible with the STEM.
2- High resolution.
3- Three-dimensional (3-D) image reconstruction is possible, though subject to aberrations.
4- Changes in nanoparticle structure as a result of interactions with gas-, liquid- and solid-phase substrates can also be monitored.
Disadvantages:
1- The sample must be able to withstand the electron beam and the high-vacuum chamber.
2- Sample preparation is necessary; mostly used for 2-D images.
3- Time-consuming.
24. Synthesis and optical characterization of copper oxide nanoparticles: SEM and TEM study
- Figure 2 shows the SEM image of the as-prepared CuO nanoparticles; the particles are rectangular in shape.
- Figure 3(a) shows the TEM image of the as-prepared nanoparticles. The particle size observed in the TEM image is in the range of 5-6 nm, in good agreement with that calculated from the XRD pattern using the Scherrer formula. Figure 3(b) shows the selected-area electron diffraction (SAED) pattern of the as-prepared CuO nanoparticles; it shows that the particles are well crystallized.
- The diffraction rings in the SAED image match the peaks in the XRD pattern, which also confirms the monoclinic structure of the as-prepared CuO nanoparticles [18].
27. Basic principle
STM is based on the concept of quantum tunneling. When a conducting tip is brought very near a metallic or semiconducting surface, a bias voltage between the two can allow electrons to tunnel through the vacuum between them. Variations in the tunneling current as the probe passes over the surface are translated into an image. An image is normally generated by holding the current between the tip and the specimen at a constant value, using a piezoelectric crystal to adjust the distance between the tip and the specimen surface.
30. Advantages:
1- Very high image resolution (capable of 'seeing' and manipulating individual atoms).
2- STM can be used not only in ultra-high vacuum but also in air and various other liquid or gas environments, at ambient and over a wide range of temperatures.
Disadvantages:
1- Image quality depends on the radius of curvature of the tip.
2- Extremely sensitive to ambient vibrations.
3- STM can be a challenging technique, as it requires extremely clean surfaces and sharp tips.
32. Optical Spectroscopy
Optical spectroscopy uses the interaction of light with matter, as a function of wavelength or energy, to obtain information about the material. The typical penetration depth is of the order of 50 nm. Optical spectroscopy is attractive for materials characterization because it is fast, nondestructive, and of high resolution.
33. X-ray Diffraction
XRD can be used to examine various characteristics of single-crystal or polycrystalline materials using Bragg's law:
nλ = 2d sinθ
where n is the diffraction order, λ the X-ray wavelength, d the interplanar spacing, and θ the angle between the incident beam and the diffracting planes.
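As a quick numerical illustration, Bragg's law can be rearranged to give the interplanar spacing d from a measured peak position. The snippet below is a minimal sketch; the Cu K-alpha default wavelength and the example peak angle are illustrative assumptions, not values from this document.

```python
import math

def bragg_d_spacing(two_theta_deg, wavelength_nm=0.15406, order=1):
    """Interplanar spacing d from Bragg's law, n*lambda = 2*d*sin(theta).

    two_theta_deg: measured peak position (2-theta) in degrees.
    wavelength_nm: X-ray wavelength; 0.15406 nm is Cu K-alpha (assumed default).
    """
    theta = math.radians(two_theta_deg / 2.0)  # the Bragg angle theta is half of 2-theta
    return order * wavelength_nm / (2.0 * math.sin(theta))

# An illustrative peak at 2-theta = 38.7 degrees corresponds to d of about 0.23 nm
d = bragg_d_spacing(38.7)
```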
34. X-ray Diffraction (XRD)
"Smaller crystals produce broader XRD peaks."
Scherrer's formula:
t = Kλ / (B cosθ)
where t is the crystallite thickness, K a shape constant (commonly about 0.9), λ the X-ray wavelength, B the full width at half maximum (FWHM) of the peak in radians, and θ the Bragg angle.
Characterizations:
1. Lattice constant
2. d-spacing
3. Crystal structure
4. Thickness (films)
5. Sample orientation
6. Particle size (grains)
Note that XRD is time-consuming and requires a large volume of sample.
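The Scherrer relation translates directly into a one-line calculation. This sketch assumes the common choice K = 0.9 and Cu K-alpha radiation, and ignores instrumental broadening for simplicity; the example peak width is hypothetical.

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite thickness t from Scherrer's formula, t = K*lambda / (B*cos(theta)).

    fwhm_deg: peak FWHM (B) measured in degrees of 2-theta; converted to radians here.
    k: dimensionless shape constant; 0.9 is a commonly assumed value.
    """
    b_rad = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return k * wavelength_nm / (b_rad * math.cos(theta))

# A fairly broad peak (FWHM ~1.5 degrees at 2-theta = 38.7 degrees) gives
# t of roughly 5-6 nm, i.e. nanometre-scale crystallites
t = scherrer_size(1.5, 38.7)
```

Note how a narrower peak yields a larger crystallite size, in line with the quoted rule that smaller crystals produce broader peaks.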
35. X-ray interaction with a crystal depends on the number of atoms on a plane; this is why only specific planes cause diffraction.
38. Small-Angle X-ray Scattering (SAXS)
"SAXS is scattering due to the existence of inhomogeneous regions with sizes from several nanometers to several tens of nanometers."
Characterization:
1. Particle size
2. Specific surface area
3. Morphology
4. Porosity
Fluctuations in electron density over lengths on the order of 10 nm or larger can be sufficient to produce appreciable scattered X-ray intensity at angles 2θ < 5°.
SAXS is capable of delivering structural information on molecules between 5 and 25 nm, and on repeat distances in partially ordered systems of up to 150 nm.
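The connection between scattering angle and probed length scale can be made explicit through the standard scattering vector q = (4π/λ) sinθ, with corresponding real-space distance d = 2π/q. This is a generic SAXS relation rather than one stated in the slides; the Cu K-alpha wavelength default and the example angle are assumptions.

```python
import math

def saxs_length_scale(two_theta_deg, wavelength_nm=0.154):
    """Real-space length d probed at a given small scattering angle.

    Uses q = (4*pi/lambda)*sin(theta) and d = 2*pi/q; the wavelength default
    is an assumed Cu K-alpha value.
    """
    theta = math.radians(two_theta_deg / 2.0)
    q = (4.0 * math.pi / wavelength_nm) * math.sin(theta)
    return 2.0 * math.pi / q

# At 2-theta = 1 degree the probed length scale is roughly 9 nm, i.e. within
# the several-nanometre regime that SAXS targets; smaller angles probe larger sizes
d_nm = saxs_length_scale(1.0)
```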
41. Example: SAXS data from a titania nanopowder, before and after background correction, together with the background measurement. In this comparison, the data are already corrected for absorption by the sample.
51. UV-vis spectroscopy
This technique involves the absorption of near-UV or visible light; one measures both intensity and wavelength. It is usually applied to molecules and inorganic ions in solution.
Its broad spectral features make it less suitable for sample identification. However, one can determine the analyte concentration from the absorbance at a single wavelength using the Beer-Lambert law:
A = abc
where A = absorbance, a = absorptivity, b = path length, and c = concentration.
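Concentration determination with the Beer-Lambert law reduces to a single division once the absorptivity is known. The function below is a minimal sketch; the absorbance and absorptivity values in the example are hypothetical.

```python
def beer_lambert_concentration(absorbance, absorptivity, path_length_cm=1.0):
    """Analyte concentration from the Beer-Lambert law, A = a*b*c  =>  c = A/(a*b).

    absorptivity: molar absorptivity a in L mol^-1 cm^-1 (hypothetical in the example).
    path_length_cm: cuvette path length b; 1 cm is a typical cell size.
    """
    return absorbance / (absorptivity * path_length_cm)

# Hypothetical example: A = 0.45 with a = 15000 L/(mol*cm) in a 1 cm cell
# gives c = 3.0e-5 mol/L
c = beer_lambert_concentration(0.45, 15000.0)
```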
53. Infrared Spectroscopy (FT-IR)
What is the principle behind IR spectroscopy?
Molecules and crystals can be thought of as systems of balls (atoms or ions) connected by springs (chemical bonds). These systems can be set into vibration, and they vibrate at frequencies determined by the mass of the balls (atomic weight) and the stiffness of the springs (bond strength).
An impinging beam of infrared electromagnetic radiation can couple with these oscillations and be absorbed. The absorption frequencies represent excitations of vibrations of the chemical bonds and are therefore specific to the type of bond and the group of atoms involved in the vibration.
In an infrared experiment, the intensity of a beam of IR radiation is measured before and after it interacts with the sample, as a function of light frequency.
54. Infrared Spectroscopy (FT-IR)
Characterization:
1. Composition
2. Concentration
3. Atomic structure
4. Surrounding environments or atomic arrangement
The mechanical molecular and crystal vibrations occur at very high frequencies, from 10^12 to 10^14 Hz (wavelengths of 3-300 μm), which lie in the infrared (IR) region of the electromagnetic spectrum. Oscillations at certain vibrational frequencies provide a means for matter to couple with an impinging beam of infrared electromagnetic radiation and to exchange energy with it when the frequencies are in resonance.
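The quoted frequency range maps onto the quoted wavelengths through λ = c/ν; the quick check below confirms that 10^12 to 10^14 Hz corresponds to roughly 300 μm down to 3 μm.

```python
C_M_PER_S = 2.998e8  # speed of light in vacuum, m/s

def vacuum_wavelength_um(freq_hz):
    """Vacuum wavelength in micrometres for a vibration frequency: lambda = c / nu."""
    return C_M_PER_S / freq_hz * 1e6

# 1e12 Hz -> ~300 um and 1e14 Hz -> ~3 um, spanning the infrared region
low_freq_wl = vacuum_wavelength_um(1e12)
high_freq_wl = vacuum_wavelength_um(1e14)
```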