The document describes the LA-950 Laser Diffraction Analyzer. It can measure particle sizes from 0.01 to 3000 μm with low end sensitivity down to 30 nm. It has various accessories for liquid and dry powder samples, including options for small sample volumes. The software allows for automated refractive index computation and has tools for method development. Accuracy is verified using ISO and USP standards, with test results on polydisperse standards shown to be within specification tolerances. Precision is high, as demonstrated by repeat measurements on polystyrene latex standards.
Selecting the Best Particle Size Analyzer for your Application - HORIBA Particle
Mark Bumiller from HORIBA Particle discusses the benefits and limitations of modern particle size analyzers and ideas on how to approach the choice of a new measurement technique or instrument.
This presentation is archived with the original webinar video in the Download Center at www.horiba.com/us/particle.
Introducing the LA-960 Laser Particle Size Analyzer - HORIBA Particle
HORIBA Instruments proudly announces the latest evolution in the LA series of particle size analyzers, the LA-960. Building on the successful LA-950 platform, this advanced model measures the particle size of suspensions, emulsions, powders, pastes, creams, and gels between 10 nanometers and 5 millimeters. The complex science of particle size analysis is simplified with HORIBA’s refined software and powerful sample handling systems.
Join us on Wednesday, September 17th at 1:30 PM Eastern to learn more about the world’s newest and most advanced laser particle size analyzer. This webinar will be suitable for anyone wishing to learn about the state of the art in laser particle size analysis.
Understanding Your Particle Size Analyzer Results - HORIBA Particle
Ian Treviranus, Product Line Manager for HORIBA Particle, covers everything from basic to advanced topics to understand your particle size distribution measurements.
Gas Chromatography: Advanced Study and Applications - Dr. Ravi Sankar
Gas chromatography, an advanced study of the topic and its applications: introduction, theory, column operation, instrumentation and detection, applications and advantages of GC, the principle of separation in GC, how a GC machine works, columns, and detectors.
By P. Ravisankar, Vignan Pharmacy College, Vadlamudi, Guntur, Andhra Pradesh, India.
A method of obtaining an Infrared spectrum by measuring the interferogram of a sample using an interferometer, then performing a Fourier Transform upon the interferogram to obtain the spectrum.
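The transform step described above can be illustrated numerically. The following is a minimal sketch, not instrument code: a synthetic interferogram built from two cosine components (standing in for two spectral lines at hypothetical, arbitrary frequencies) is Fourier-transformed with NumPy, and the two lines reappear as peaks in the recovered spectrum.

```python
import numpy as np

# Synthetic interferogram: two cosine components standing in for two
# spectral lines (frequency bins 50 and 120 are arbitrary choices).
n = 1024
x = np.arange(n)                      # optical path difference steps
interferogram = (np.cos(2 * np.pi * 50 * x / n)
                 + 0.5 * np.cos(2 * np.pi * 120 * x / n))

# The Fourier transform of the interferogram yields the spectrum.
spectrum = np.abs(np.fft.rfft(interferogram))

# The two strongest spectral lines sit at the original component frequencies.
peaks = sorted(int(i) for i in np.argsort(spectrum)[-2:])
print(peaks)  # [50, 120]
```

A real FT-IR instrument additionally applies apodization and phase correction before the transform; this sketch shows only the core interferogram-to-spectrum step.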
This HPLC presentation is available for download in its original PPT form.
Share it and improve your knowledge.
INTRODUCTION
HPLC is a form of liquid chromatography used to separate compounds that are dissolved in solution.
HPLC instruments consist of a reservoir of mobile phase, a pump, an injector, a separation column, and a detector.
Compounds are separated by injecting a sample mixture into the column. The different components in the mixture pass through the column at different rates due to differences in their partition behavior between the mobile phase and the stationary phase.
The mobile phase must be degassed to eliminate the formation of air bubbles.
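The partition behavior mentioned above is commonly summarized by the retention factor k' = (tR - t0) / t0, where t0 is the column dead time. A short sketch with hypothetical retention times (the function name and all numbers are illustrative, not from any specific method):

```python
# Retention factor k' quantifies how strongly a compound partitions into
# the stationary phase: k' = (t_R - t_0) / t_0.
def retention_factor(t_retention, t_dead):
    return (t_retention - t_dead) / t_dead

t0 = 1.0                            # hypothetical dead time, minutes
k_a = retention_factor(3.0, t0)     # weakly retained compound
k_b = retention_factor(6.0, t0)     # more strongly retained compound

# Selectivity alpha > 1 means the column can separate the pair.
alpha = k_b / k_a
print(k_a, k_b, alpha)  # 2.0 5.0 2.5
```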
Different Sources of Radiation Used in UV-Visible Spectroscopy - Anit Thakur
1. Basic terminology: transmittance and absorbance
2. The Beer-Lambert law
3. Advantages and disadvantages of the Beer-Lambert law
Different sources for UV-Visible spectroscopy:
Hydrogen lamp, deuterium lamp, tungsten filament lamp, mercury lamp, xenon lamp, and LEDs, with their advantages and disadvantages.
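The transmittance/absorbance relationship and the Beer-Lambert law listed above can be sketched in a few lines, assuming the usual forms A = -log10(T) and A = epsilon * l * c (the transmittance, molar absorptivity, and path length below are hypothetical example values):

```python
import math

# Beer-Lambert law: A = epsilon * l * c, with A = -log10(T).
def absorbance_from_transmittance(T):
    return -math.log10(T)

def concentration(A, epsilon, path_cm):
    # c = A / (epsilon * l); mol/L when epsilon is in L/(mol*cm)
    return A / (epsilon * path_cm)

# Hypothetical example: 10% transmittance in a 1 cm cuvette,
# molar absorptivity 20000 L/(mol*cm).
A = absorbance_from_transmittance(0.10)           # A = 1.0
c = concentration(A, epsilon=20000, path_cm=1.0)  # 5e-05 mol/L
print(A, c)
```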
These slides give an introduction to gas chromatography and guide the analyst in the proper selection of liner, column, and the main operating conditions.
Interfaces in chromatography [LC-MS, GC-MS, HPTLC, LC, GC] - Shikha Popali
The interfaces of chromatography, covering the chromatography criteria, where the different chromatographic techniques are explained in detail with practical examples and images.
Presented by: Raghav Sharma
Class: M. Pharm, 1st sem.
Department: Pharmaceutics
Institute: Parul Institute of Pharmacy
Content:
Instrumentation and working of flame photometry
Flame atomizer
Nebulizer
Atomizer burner
Monochromator
Detector
Amplifier
Advantages
Disadvantages
Reference
Particle Classroom Series IV: System Verification - HORIBA Particle
Confirming the performance of a particle analyzer is a critical step in ensuring and proving data quality. Join Dr. Jeff Bodycomb as he discusses performance expectations, confirming system performance, and recommended practices. This is part four of a six-part classroom series.
View recorded webinars:
http://bit.ly/particlewebinars
We are committed to solving this challenge and increasing the availability, uptime, and affordability of medical supplies and other equipment. Our focus is to establish close business relationships with our valued clients and to meet their needs. We manufacture quality products under the guidance of experts and engineers with innovative ideas and deep knowledge. We source our raw materials from reliable suppliers so that the final product meets our clients' expectations.
The NIR sensor of the TS x050 series is a 180° see-through sensor measuring absorption or turbidity in fluids in the near infrared range (880 nm wavelength). The sensor is installed in and/or on tanks or pipelines. The optical window of the sensor is submerged in the process medium in order to measure the physical properties by absorbing irradiated light.
Modern Particle Characterization Techniques Series: Laser Diffraction - HORIBA Particle
This is part two of the webinar series. It will introduce participants to basic experimental considerations when choosing laser diffraction for particle size analysis. The presentation will explain what makes laser diffraction a “modern technique.” Both wet and dry case studies will be shown along with brief demonstration videos.
In this webinar, you will learn:
- Method development
- Choosing an appropriate refractive index
- Understanding the analysis results
View recorded webinars:
http://bit.ly/particlewebinars
SBS strengthens its reputation as the world leader in automatic dynamic balancing and process monitoring for the grinding industry with the SB-5500 controller. With its all-digital electronic design, compact size, and larger high-resolution detachable display, the SB-5500 provides unsurpassed accuracy (up to 0.02 micron), speed (300-30,000 rpm), and flexibility (four-channel capability).
This presentation aims to separate the fact from the fiction when it comes to nanobubbles. We will explain what exactly a nanobubble is and why there is interest in this area. We will look at some of the standards activity in the area and also some of the characterization challenges associated with measuring nanobubbles. Finally we will look at how three techniques have been applied to measuring nanobubbles based around real data sets and some of the benefits of the different techniques. A narrated recording of this webinar can be accessed at bit.ly/nanobubble
Portable turbidity meters are commonly used to assess the clarity and quality of water in natural bodies of water, drinking water sources, and water distribution systems. High turbidity levels can indicate the presence of contaminants, sediments, or pollutants in water. Researchers in various fields may use portable turbidity meters for experiments and studies related to colloidal chemistry, particle characterization, and the stability of suspensions.
It's a powerful tool for measuring the absorbance of enzymatic reactions in a microplate. Our ELISA Reader is designed to provide efficient and accurate results, making it ideal for various applications in clinical diagnostics.
Dry powder or particles in aqueous/organic suspension.
- Truly automodal: you don't need to know anything about size or modality before running a sample.
- Highest accuracy: scattered data cannot be manipulated to meet modality expectations. You get what you have in there: one single answer every time.
Characterizing Nanoparticles used in Bio Applications - HORIBA Particle
Mark Bumiller from HORIBA Particle discusses current measurement technologies for investigating nanoparticles used in biologic and biotech applications.
This presentation is archived with the original webinar video in the Download Center at www.horiba.com/us/particle.
CONTACT US:
PT. Minds Indo Survey, Komp. Ruko Mega Kalimalang Kav.8, Jln. KH. Noer Ali No.11, Pekayon Jaya, Bekasi 17148. Fax: 02188860790, Mobile: 082119953499, Email: budi1080@gmail.com. Visit our website: mindsindosurvey.co.id
SCOPE
Sales, service/repair, and rental of measuring instruments and topographic survey equipment.
SALES:
• Measuring instruments
- Total station (new and second-hand)
- Theodolite (new and second-hand)
- Levels (new and second-hand)
- GPS
- USV
- Bathymetric
- Compass
- Clinometer
- Tandem/clino
- Binoculars
- Digital level
- Digital planimeter
• Accessories:
- Tripod
- Polygon prism
- Detail prism
- Leveling staff
- Measuring tape
- Ranging pole
- Clamp tripod
- Mini prism
RENTAL:
- Total station
- Theodolite
- Automatic level
- Geodetic GPS
AVAILABLE MEASURING INSTRUMENT BRANDS:
- Minds
- Spectra
- Nikon
- Suunto
- Garmin
- Topcon
- Horizon
- Sokkia
- Leica
- CHC
- Trimble
- CHCNAV
SERVICE/REPAIR AND CALIBRATION OF ALL TYPES OF MEASURING INSTRUMENTS
ATTRACTIVE/COMPETITIVE PRICES
Note: A price list will be sent on request.
Contact person:
Budianto
082119953499
Similar to Advantages of the LA-950 Laser Diffraction PSA (20)
Exosomes: Exploiting the Diagnostic and Therapeutic Potential of Nature’s Bio... - HORIBA Particle
Research on exosomes and other forms of extracellular vesicles (EVs) have rapidly expanded over the last two decades. These lipid-enclosed, nanoscale messengers are released from cells packed with diverse cargo and can travel long distances to modify the function of target cells. Found in abundant quantities in biological fluids like blood, there is great clinical interest in using EVs as diagnostic markers or altering their properties for therapeutic delivery. Tune in to find out more about what exosomes are, how researchers study them, and what challenges remain. This talk will highlight multi-laser nanoparticle tracking analysis (NTA) with the ViewSizer 3000 and what it offers in exosome research.
View recorded webinars:
http://bit.ly/particlewebinars
Mastering the Processing Methods of Engineered Particles - HORIBA Particle
This webinar will explain the development process for particles with specific attributes that can cause problems during production. Three case studies will be discussed: Engineered particles that protect omega-3 oil from oxidation using special microencapsulation methods; modified cellulose fibers with high water holding capacity; and engineered particles produced by melt atomization processes with unique attributes. The talk will focus on alternative processing methods, the importance of understanding the materials being used, and what can happen when you do not understand the functional properties that you are designing for.
View recorded webinars:
http://bit.ly/particlewebinars
Modern Particle Characterization Techniques Series I: Introduction - HORIBA Particle
Particle characterization is a rich field that touches industries from mining to pharmaceutical production. There are a number of characterization techniques available to the modern analyst. Understanding them is key to selecting the right technique as well as gaining deeper insight into the meaning of measurement results.
This webinar is the beginning of a new series reviewing a number of modern measurement techniques. Dr. Michael Pohl, Vice President of HORIBA Scientific, will describe some common ideas in particle characterization along with common questions to ask when selecting a technique. Mike will also give a very brief overview of some modern techniques before subsequent webinars go into detail.
View recorded webinars:
http://bit.ly/particlewebinars
Concentration and Size of Viruses and Virus-like Particles - HORIBA Particle
Accurate concentrations for viruses and virus-like particles can be determined by multi-laser nanoparticle tracking analysis because they are nanoparticles. Other biologically relevant materials have sizes close to those of viruses, whether protein aggregates that provoke an unwanted immune response or exosomes of similar size that do not.
In this webinar, Dr. Jeff Bodycomb will discuss the use of multi-laser nanoparticle tracking analysis (NTA) to determine the size distribution and concentration of these species, the latter which is correlated to viral infectivity.
Learn more about:
-How NTA determines concentration and size distribution
-Advantages and limits of the multi-laser technique
-Example measurement results
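As background on the first point: NTA derives particle size from the Brownian motion of individually tracked particles via the Stokes-Einstein relation, d = kB*T / (3*pi*eta*D). A minimal sketch, assuming water at 25 °C as the medium and a hypothetical measured diffusion coefficient (both values are illustrative, not instrument output):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(D, temp_k=298.15, viscosity=8.9e-4):
    """Stokes-Einstein: d = kB*T / (3*pi*eta*D).

    viscosity defaults to ~8.9e-4 Pa*s (water at 25 C, assumed medium).
    D is the measured diffusion coefficient in m^2/s.
    """
    return K_B * temp_k / (3 * math.pi * viscosity * D)

# Hypothetical tracked diffusion coefficient for a ~100 nm virus-like particle:
D = 4.9e-12  # m^2/s
d_nm = hydrodynamic_diameter(D) * 1e9
print(round(d_nm, 1), "nm")
```

Concentration then follows from counting tracked particles in a known scattering volume; the sizing step above is the part the Stokes-Einstein relation covers.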
View recorded webinars:
http://bit.ly/particlewebinars
The Value of Real-time Imaging: Integrating Particle Size Analysis onto Fluid... - HORIBA Particle
The capability to measure critical quality attributes (CQAs) such as particle size in real time reveals their functional relationships with the critical process parameters (CPPs). The Eyecon2™ is a true non-product-contact, real-time imaging system that can be used with dry and wet bulk-solid processing equipment, affording a competitive edge in digital maturity. We will dive into how the imaging technology works, the basic principles of analysis for particle size detection, and the methods of integration onto process equipment such as fluid beds, twin-screw granulators, and roller compactors, and discuss key applications where real-time in-line particle size results show how the Eyecon2 enables the transparency, agility, and productivity that align with the Factory of the Future and Pharma 4.0.
View recorded webinars:
http://bit.ly/particlewebinars
How and Why to Analyze Ceramic Powder Particles - HORIBA Particle
Packing density, mechanical strength, and processing of ceramics are all affected by the size distribution of the powders. Therefore, particle size analysis is an important quality control step. Due to its wide size range and flexibility, laser diffraction is often the preferred method of analysis. Laser diffraction can be used for particles with sizes from 10’s of nanometers to millimeters. In this webinar, Dr. Jeff Bodycomb of HORIBA Scientific discusses particle analysis of ceramic particles, including electronic materials and common oxides. He will cover the basic principles of analysis, practical methods for obtaining good data, and example data.
View recorded webinars:
http://bit.ly/particlewebinars
Interpreting Laser Diffraction Results for Non-Spherical Particles - HORIBA Particle
Particle shape can have a profound impact on particle size distribution (PSD) measurements. In the case of Laser Diffraction, the shape and aspect ratio of particles alter the diffraction pattern used to determine PSD, which is calculated on the basis of equivalent spherical diameter. For instance, it has been established that the reported size of an ellipsoid is always smaller than the physical major dimension of the particle. Furthermore, when non-spherical particles align within a flowing sample, laser diffraction instruments typically report a bi-modal size distribution even in the case of monodisperse samples.
Equipped with only qualitative knowledge of particle shape, the particle analyst can resolve this inherent ambiguity and use laser diffraction to obtain quantitative information (such as aspect ratio) about non-spherical particles. This webinar explains the origin of this effect, describes how to interpret PSD data in such cases, and demonstrates several practical applications for measurements of crystals, bacteria, and clays.
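The "reported size smaller than the major dimension" effect can be illustrated with the volume-equivalent sphere diameter, one common equivalent-spherical-diameter convention. A short sketch with hypothetical ellipsoid semi-axes (numbers chosen only for illustration):

```python
# Volume-equivalent sphere diameter of an ellipsoid with semi-axes a, b, c:
# the sphere of equal volume has d_v = 2 * (a*b*c)^(1/3).
def volume_equivalent_diameter(a, b, c):
    return 2.0 * (a * b * c) ** (1.0 / 3.0)

# Hypothetical rod-like particle, 10 um long and 2 um across
# (semi-axes 5, 1, 1 um):
d_v = volume_equivalent_diameter(5.0, 1.0, 1.0)
print(round(d_v, 2), "um")  # ~3.42 um, well below the 10 um major dimension
```

Laser diffraction's reported sizes are equivalent spherical diameters of this general kind, which is why an ellipsoid reports smaller than its physical length.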
View recorded webinars:
http://bit.ly/particlewebinars
Particle Classroom Series VI: Method Development - HORIBA Particle
Great results need a great method. In order to compare different lots of material or different manufacturing approaches, variation due to sample preparation should be minimized. Should the sample be run in suspension or as a dry powder? What salts or surfactants are needed for the suspension? How much energy should be applied and how? Systematically determining the answers to these questions is method development.
View recorded webinars:
http://bit.ly/particlewebinars
Particle Classroom Series V: Sampling and Dispersion - HORIBA Particle
The goal of a particle analysis is to understand the properties of a material, whether it is the size distribution of particles that are manufactured today or the size distribution of particles in the truck that just arrived. Naturally, a particle analyzer only encounters a tiny fraction of that material, the sample.
In addition, particles can be bound together to form agglomerates that do not represent the underlying materials. The instrument will then measure the agglomerates, not its constituent particles. In this webinar, Jeff discusses how to improve data quality by obtaining a representative sample and effectively disperse the sample to remove or prevent agglomerates.
View recorded webinars:
http://bit.ly/particlewebinars
Particle Size Analyses of Polydisperse Liposome Formulations with Multispectr... - HORIBA Particle
During this webinar, Dr. Singh will discuss the significance of liposome size characteristics in medicine. He will discuss the challenges of particle size measurement for formulations containing heterogeneous sizes (polydisperse). He will also discuss his recently published results on polydisperse bead and liposome formulations using DLS, conventional NTA, laser diffraction, and a novel multispectral NTA measurement technique.
View recorded webinars:
http://bit.ly/particlewebinars
Principle, Optimization, and Applications of Nanoparticle Tracking Analysis - HORIBA Particle
This webinar presents the theoretical foundations of nanoparticle tracking analysis (NTA), its optimization, and its most recent applications in industry and research.
- Principle of the nanoparticle tracking analysis (NTA) technique
- Information the NTA technique provides
- Limitations of the NTA technique
- Optimization
- Most recent applications
View recorded webinars:
http://bit.ly/particlewebinars
Surface area is an important physical property that influences the reactivity, dissolution, catalysis, and separation of materials. The surface area often must be carefully engineered and measured to optimize specific functions. In this Webinar, our applications lab will explain with real-world examples:
- Physical adsorption technique - BET theory
- Sample preparation – the start of a good measurement
- Calculating specific surface area from gas adsorption on solid surfaces
- Troubleshooting – what happens when things go wrong?
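The surface-area calculation step above rests on a linear fit of the transformed BET isotherm: 1/(v*((p0/p)-1)) = (c-1)/(vm*c) * (p/p0) + 1/(vm*c), from which the monolayer volume vm follows from the slope and intercept. The sketch below generates a synthetic isotherm from the BET model itself (the vm and c values are hypothetical) and recovers vm by fitting:

```python
import numpy as np

def bet_fit(rel_pressure, volume_adsorbed):
    """Fit the linearized BET equation and return (v_m, c)."""
    y = 1.0 / (volume_adsorbed * (1.0 / rel_pressure - 1.0))
    slope, intercept = np.polyfit(rel_pressure, y, 1)
    v_m = 1.0 / (slope + intercept)
    c = slope / intercept + 1.0
    return v_m, c

# Synthetic isotherm from the BET model (hypothetical v_m = 10 cm^3/g,
# c = 100) over the usual 0.05-0.30 relative-pressure range:
x = np.linspace(0.05, 0.30, 6)
v = 10 * 100 * x / ((1 - x) * (1 + (100 - 1) * x))

v_m, c = bet_fit(x, v)
print(round(v_m, 2), round(c, 1))  # recovers ~10.0 and ~100.0
```

In practice the specific surface area then follows from vm, the adsorbate cross-sectional area, and the molar volume; the fit above is the core of the calculation.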
View recorded webinars:
http://bit.ly/particlewebinars
Particle Classroom Series III: Refractive Index and Laser Diffraction - HORIBA Particle
Modern laser diffraction particle analyzers use particle refractive index to accurately model the behavior of light inside of the particle. However, this presents the analyst with the challenge of choosing the correct value. In this Webinar, Dr. Jeff Bodycomb will discuss:
- Why do we need a refractive index value?
- What is refractive index?
- How do we choose refractive index values?
View recorded webinars:
http://bit.ly/particlewebinars
How to Present and Compare Data Obtained by Particle Tracking Analysis and Ot... - HORIBA Particle
This Webinar is for anyone who wants to understand how experimental data should be presented and compared properly while using histograms.
Dr. Kuba Tatarkiewicz examines methods for particle size distributions as obtained by particle tracking analysis, fluorescence, as well as micro-sedimentation. Choices of binning schemes that users can design themselves will be discussed with examples of how various parameters (like mode and D50) change with different binnings, especially for highly polydisperse colloids. Methods for comparison of particle size distributions will be presented and explained in practical terms.
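The binning sensitivity of parameters like D50 is easy to demonstrate. In the sketch below (a synthetic lognormal "size" sample; all numbers are illustrative), the median read off a coarse linear-bin histogram can differ noticeably from the true sample median, while log-spaced bins typically track a broad distribution more closely:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic polydisperse sample: lognormal sizes around ~150 nm.
sizes = rng.lognormal(mean=np.log(150), sigma=0.8, size=10_000)

def d50_from_histogram(data, bin_edges):
    """Median estimated from a histogram: midpoint of the bin where the
    cumulative fraction first reaches 0.5."""
    counts, edges = np.histogram(data, bins=bin_edges)
    cdf = np.cumsum(counts) / counts.sum()
    i = np.searchsorted(cdf, 0.5)
    return 0.5 * (edges[i] + edges[i + 1])

d50_true = np.median(sizes)
d50_lin = d50_from_histogram(sizes, np.linspace(sizes.min(), sizes.max(), 20))
d50_log = d50_from_histogram(sizes, np.geomspace(sizes.min(), sizes.max(), 20))
print(round(d50_true), round(d50_lin), round(d50_log))
```

The same sample thus reports different D50 values depending only on the binning scheme, which is the ambiguity the webinar addresses.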
View recorded webinars:
http://bit.ly/particlewebinars
Why the University of Washington chose the HORIBA Laser Scattering Particle S... - HORIBA Particle
Join our users at the University of Washington (UW) as they discuss how the HORIBA particle size analyzer is used in their undergraduate courses and how the instrument manages to support a wide range of applications. Some examples include polymers spheres, ceramic powders, soil, and rocks. In this Webinar, Materials Science and Engineering (MSE) graduate students, Michelle Katz and Tiffany Tang will demonstrate with case studies how other orthogonal methods such as optical microscopy, scanning electron microscopy and x-ray diffraction help them cross-validate their particle size and size distribution. You will also learn in their own words, why the UW MSE program chose the HORIBA particle size analyzer over other options for their undergraduate environment.
View recorded webinars:
http://bit.ly/particlewebinars
Particle Classroom Series II: The Basics of Laser Diffraction - HORIBA Particle
Particle size analysis by laser diffraction offers many advantages. In laser diffraction, scattering as a function of angle is measured and the data are used to determine the particle size distribution. The technique is fast and reliable and covers a very wide range of particle sizes, from tens of nanometers to hundreds of microns. In this webinar, Dr. Jeff Bodycomb will discuss:
Exactly what happens when light strikes a particle
Light intensity and how it affects the measurement
Fraunhofer vs. Mie
Real and imaginary refractive index values
This is a great introduction for anyone who wants to understand the science behind the measurement.
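To give a feel for the Fraunhofer side of that comparison: in the Fraunhofer approximation, the first diffraction minimum for a particle of diameter d falls at sin(theta) = 1.22 * lambda / d, so larger particles scatter into narrower angles. A sketch assuming a red laser wavelength of 0.65 um (the wavelength and particle sizes are illustrative assumptions):

```python
import math

def first_minimum_deg(diameter_um, wavelength_um=0.65):
    """Angle (degrees) of the first Fraunhofer diffraction minimum:
    sin(theta) = 1.22 * lambda / d."""
    return math.degrees(math.asin(1.22 * wavelength_um / diameter_um))

# Larger particles -> narrower scattering pattern:
print(round(first_minimum_deg(100.0), 3))  # ~0.45 deg for a 100 um particle
print(round(first_minimum_deg(5.0), 2))    # ~9.1 deg for a 5 um particle
```

For particles approaching the wavelength of light, this approximation breaks down and full Mie theory (with real and imaginary refractive index) is needed, which is the crux of the Fraunhofer vs. Mie discussion.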
View recorded webinars:
http://bit.ly/particlewebinars
Particle Classroom Series I: Introduction to Particle Analysis - HORIBA Particle
If you're new to particle characterization, this is a webinar just for you! Dr. Jeff Bodycomb will discuss the basics of particles: why different size definitions give you different results, the various methods used to measure particles, and why the method you use matters! This webinar is the first in a series that will give you the knowledge you'll need to be the particle expert in your lab.
View recorded webinars:
http://bit.ly/particlewebinars
Improved Visualization, Counting and Sizing of Polydisperse Nanoparticle Coll... - HORIBA Particle
The ViewSizer® 3000 offers the ability to visualize nanoparticle colloids without requiring calibration standards or knowledge of any particle material properties, such as refractive index. It was developed by MANTA – the Most Advanced Nanoparticle Tracking Analysis – and offers the user an unprecedented ability to count and size highly polydisperse samples, such as milk, sea water, or blood plasma.
View recorded webinars:
http://bit.ly/particlewebinars
Key Points to Achieving Successful Laser Diffraction Method Development - HORIBA Particle
Unlock the secrets to the best measurement for particles. Topics covered include choosing appropriate accessories, selecting the best dispersing medium, assessing the effect of circulation pump speed, and evaluating the impact of using different imaginary refractive index values.
View recorded webinars:
http://bit.ly/particlewebinars
Particle Size Analysis for Homogenization Process Development - HORIBA Particle
Emulsions and suspensions are commonly used in pharmaceutical, chemical and consumer products. The pharmaceutical industry, in particular, uses emulsions and suspensions to increase drug efficacy by controlling their particle size and size distribution. Among various available preparation methods, high-pressure homogenization is one of the widely employed processes in the field. This webinar discusses ways to develop a robust homogenization process for making pharmaceutical emulsions by evaluating droplet size distribution.
View recorded webinars:
http://bit.ly/particlewebinars
UiPath Test Automation using UiPath Test Suite series, part 6 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
The UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI's advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! - SOFTTECHHUB
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
How to Get CNIC Information System with Paksim Ga.pptx - danishmna97
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 - Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdfPaige Cruz
Monitoring and observability aren’t traditionally found in software curriculums and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is a part of your current company’s observability stack.
While the dev and ops silo continues to crumble….many organizations still relegate monitoring & observability as the purview of ops, infra and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party will share these foundational concepts to build on:
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AIVladimir Iglovikov, Ph.D.
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
The graph in the upper left shows that with a 650 nm light source there is no difference in the scattering patterns of a 0.05 µm and a 0.07 µm particle. The graph in the upper right shows that there is a difference in the scattering patterns when a 405 nm light source is used. Because the patterns differ, we can distinguish between the two particle sizes. The graph at the lower left shows the results when 30, 40, 50, and 70 nm latex standards are measured (individually) on the LA-950.
The LA-950 can accurately detect particles as small as the 40 nm latex shown on the left and the 30 nm Ludox silica on the right. Ludox is the sample we use to verify performance of the DT-1200 system; it is well characterized and accepted as being 30 nm in size. The LA-950 has the best small-particle detection performance of any laser diffraction particle size analyzer.
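The wavelength effect described above can be made concrete with the dimensionless size parameter x = πd/λ from light-scattering theory: the smaller x is, the deeper the particle sits in the Rayleigh regime, where scattering patterns from nearby sizes become nearly indistinguishable. A minimal sketch (illustrative only, not the LA-950's actual Mie computation):

```python
import math

def size_parameter(diameter_um: float, wavelength_nm: float) -> float:
    """Dimensionless light-scattering size parameter x = pi * d / lambda."""
    wavelength_um = wavelength_nm / 1000.0
    return math.pi * diameter_um / wavelength_um

# A 50 nm (0.05 um) particle probed at the two source wavelengths:
x_red = size_parameter(0.05, 650)   # ~0.24 with the 650 nm source
x_blue = size_parameter(0.05, 405)  # ~0.39 with the 405 nm source
print(x_red, x_blue)
```

The larger size parameter at 405 nm is one way to see why the shorter-wavelength source recovers size differences that the red source cannot.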
This is the training screen when installing the DT-1200, showing the expected 30 nm result on the right.
The LA-950 also has excellent performance measuring large particles, typically dry powders. The graph at the top shows the data from the coffee application note; anything over 2000 µm would be missed by the MS2000. The data below, from the brochure, shows 2700 µm alumina balls. Powder samples containing any fine particles (<20 µm) are analyzed on the MS2000 using "Fine Powder Mode", which cuts off ALL data above 250 µm; otherwise many samples show ghost peaks around 1000-2000 µm. If Fine Powder Mode is used, the customer will never see particles >250 µm even if they are really there.
The PowderJet dry powder feeder is superior to the Sirocco sampler in several ways. First, the sample flow path runs straight down from the nozzle through the cell where the measurement is made; with the Sirocco, the powder makes a 90 degree turn within the sampler and is then transported via a tube to the cell (possible cross contamination). Second, the PowderJet maintains a constant mass flow rate by varying the feed rate to keep transmission constant, whereas the Sirocco only controls the vibration feed rate, so the sample concentration changes dramatically, decreasing the robustness of the measurement. See next slide.
If there are particles <20 µm in the sample, Malvern uses Fine Powder Mode. When the MS2000 uses Fine Powder Mode, it cuts off all data above 250 µm.
This is real-world data from running a pharmaceutical excipient, magnesium stearate. Note that the COV levels are extremely low, all fractions of a percent.
The LA-950 provides two error calculations: residual R and chi-square. Both indicate how well the final calculation matches the raw data, a good indicator of how good the refractive index choice was. The chi-square calculation also includes –
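As a sketch of what such a goodness-of-fit number can look like, here is a generic normalized residual between measured and back-calculated detector intensities. This is an assumption for illustration, not necessarily the exact formula HORIBA uses:

```python
def residual_percent(measured, calculated):
    """Normalized residual between measured and back-calculated
    scattering intensities, expressed as a percentage. A low value
    suggests the chosen refractive index reproduces the raw detector
    data well."""
    num = sum((m - c) ** 2 for m, c in zip(measured, calculated))
    den = sum(m ** 2 for m in measured)
    return 100.0 * (num / den) ** 0.5

# Hypothetical detector channel intensities:
meas = [10.0, 8.0, 5.0, 2.0]
calc = [9.8, 8.1, 5.2, 1.9]
print(residual_percent(meas, calc))
```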
Results from analyzing the Whitehouse PS202 polydisperse standard. Great results.
Results from analyzing the Whitehouse PS225 polydisperse standard. Great results.
Results from analyzing the Whitehouse PS181 polydisperse standard. Great results.
These results come from mixing 2 standards 50/50 – good resolution and assignment of proportions.
More data showing the ability to resolve peaks with accurate proportions.
Resolution is the ability of the instrument to measure small differences in particle size. Resolution is difficult to specify because it can have many possible meanings. For the user, the most important test is to run a series of samples that track the variables of interest, characterizing the performance of the instrument for that specific use.
Perhaps the most important measure of resolution is how small a difference in the sample itself will generate a difference in the measurement. These changes may come from additional processing (milling or agglomeration) or from differences between batches of material. Polystyrene latex (PSL) NIST-traceable standards are the easiest standards to use, but they are significantly different from the vast majority of materials manufactured and used in industry. This example does show the ability of the instrument to differentiate between a 553 nm and a 600 nm material.
This is an example of a mixture of three distinctly different size materials. Can the instrument identify that there are distinct distributions within the total sample? If so, are the relative amounts of the different components reported properly? Some companies have focused on tests to show closely spaced polystyrene latex as individual peaks in the overall distribution. As this example shows, it is possible, but the user must decide if this is a valid test of the ability of the instrument to respond to their samples.
This is another example of a mixture of three distinctly different size materials, but at a much larger size range. Instrument response can vary over the measurement range. Experiments should be designed to characterize performance in the range of the final application.
Resolution can also be the ability of the instrument to determine small amounts of a material outside of the main range of the sample, whether larger or smaller. A small amount of fines or coarse material can be a very important indicator of process problems. The ability to respond to these small amounts of material before they become a large contributor to the total particle distribution is a very important feature.
Precision is the ability of the instrument to get the same answer for the same material in repeat tests. The data shown are 24 samplings on one instrument, so they reflect the precision of both the sampling and the instrument. Repeat measurements of the same sample loading would characterize the precision of the instrument alone.
USP <429> is a new pharmaceutical standard for using laser diffraction. This test has many similarities to ISO 13320.
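The COV quoted throughout these slides (also reported as %RSD) is simply the standard deviation expressed as a percentage of the mean. A minimal sketch using hypothetical repeat d50 readings:

```python
import statistics

def cov_percent(values):
    """Coefficient of variation (%RSD): sample standard deviation
    divided by the mean, times 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical d50 values (in um) from repeat measurements:
d50_repeats = [10.02, 10.05, 9.98, 10.01]
print(cov_percent(d50_repeats))
```

"Fractions of a percent", as in the magnesium stearate data, means this number comes out well below 1.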
The USP <429> calculations are being written into the LA-950 software. When released, the user will have automatic pass/fail calculations and displays.
USP <429> also sets pass/fail criteria when analyzing a polydisperse standard to determine whether the system is working properly. Measure the sample 3 times; to pass the accuracy test, the d50 must be within 3% of the certified value, and the d10 and d90 must be within 5%. In addition, the test requires the user to calculate the COV for the 3 measurements and sets repeatability goals as shown in the slide.
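The accuracy criteria quoted above can be expressed directly in code. The 3% and 5% limits come from the text; everything else here (function name, data layout) is a hypothetical sketch, and the repeatability (COV) limits are not included since the slide's specific values are not reproduced in these notes:

```python
import statistics

def usp429_accuracy_pass(measured, certified, d50_tol=3.0, d10_d90_tol=5.0):
    """Check the mean of repeat measurements against certified values.
    `measured` is a list of (d10, d50, d90) tuples, one per run;
    `certified` is the (d10, d50, d90) from the standard's certificate.
    Tolerances are percent deviations: 3% at d50, 5% at d10/d90.
    """
    means = [statistics.mean(run[i] for run in measured) for i in range(3)]
    tols = (d10_d90_tol, d50_tol, d10_d90_tol)
    return all(
        abs(m - c) / c * 100.0 <= t
        for m, c, t in zip(means, certified, tols)
    )

# Hypothetical example: three runs vs. a certified polydisperse standard
runs = [(5.1, 20.2, 50.5), (5.0, 20.0, 50.0), (4.9, 19.9, 49.8)]
cert = (5.0, 20.0, 50.0)
print(usp429_accuracy_pass(runs, cert))  # True: all deviations within limits
```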
This is just a screen shot from the development of the USP software. Customers will be able to choose from three options: comply with USP <429> requirements, comply with ISO 13320 requirements (more strict), or create custom user-defined requirements. This is why all customers will benefit from the software features.
Here is a screen shot from the software development showing an example where accuracy was tested using a standard. The system passed at the d10 and d90 but failed at the d50. This is just an example; of course the units normally pass this test, as the next slides show.
Here are the accuracy and precision specifications for the LA-950, along with example data on the next slide. Does the competition talk about their specifications?
Here is data from 20 LA-950s testing instrument-to-instrument variation using polydisperse standards. Excellent results.
The last slides come from customer data. The customer wanted to compare the reproducibility of the LA-910 to the LA-950. This slide shows the protocol for testing the precision.
Here is the precision data for the LA-910. Two instruments were used in this study, as shown in the protocol on the previous slide. Note: %RSD is the same as COV, just another name for the same calculation. These values are acceptable and would pass the USP <429> requirements.
Here is data from 6-8 different LA-910s, testing instrument-to-instrument variability. This data is again acceptable.
Next, a similar study was performed using two LA-950s. Notice the extremely low RSD values. EXCELLENT!
Finally, a study was performed on four LA-950s. Notice the excellent RSD values. We expect this is the best in the business.