The document discusses using Dynochem software to determine suitable sampling endpoints for design of experiments (DOEs) investigating chemical reactions. It provides a case study of a Finkelstein alkylation reaction where an impurity forms. Dynochem is used to fit rate constants and activation energies to the reaction mechanism. This allows simulating different experimental conditions to identify suitable reaction times that control impurity formation before committing resources to a DOE. The kinetic model can then refine the factor ranges investigated in the DOE to efficiently establish critical process parameters.
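The rate-fitting step can be illustrated with a minimal two-point Arrhenius estimate. This is a generic sketch, not DynoChem's actual fitting routine, and every number below (pre-exponential factor, activation energy, temperatures) is hypothetical:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_k(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R*T)) at absolute temperature T (K)."""
    return A * math.exp(-Ea / (R * T))

def fit_ea_two_point(k1, T1, k2, T2):
    """Estimate the activation energy Ea (J/mol) from rate constants
    measured at two temperatures (two-point Arrhenius fit)."""
    return -R * math.log(k2 / k1) / (1.0 / T2 - 1.0 / T1)

# Hypothetical first-order rate constants from two isothermal experiments.
A_true, Ea_true = 1.0e7, 60_000.0           # assumed values, for illustration only
k25 = arrhenius_k(A_true, Ea_true, 298.15)  # 25 C
k50 = arrhenius_k(A_true, Ea_true, 323.15)  # 50 C

Ea_est = fit_ea_two_point(k25, 298.15, k50, 323.15)
```

With noisy data at several temperatures, a least-squares regression of ln k against 1/T would replace the two-point formula; the fitted Arrhenius parameters then let reaction time and impurity level be simulated at untested conditions.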
Analysis of High-Order Residual-Based Dissipation for Unsteady Compressible F... (kgrimich)
A comprehensive study of the numerical properties of high-order residual-based dissipation terms for unsteady compressible flows leads to the design of well-behaved, low-dissipative schemes of third-, fifth-, and seventh-order accuracy. The dissipation and dispersion properties of the schemes are then evaluated theoretically, through Fourier space analysis, and numerically, through selected test cases including the inviscid Taylor-Green vortex flow.
This document summarizes research on two proteins - Human Retinol Binding Protein-4 (RBP4) and Kvβ2, a subunit of the Kv1 potassium channel. For RBP4, the researcher optimized PCR conditions to amplify the gene and assessed RBP4's proposed role in insulin resistance and diabetes. For Kvβ2, the researcher overexpressed and purified the protein, then tested the inhibitory effects of rutin, quercetin, and resveratrol on Kvβ2 activity. The results provide new insights into physiological processes involving the Shaker potassium channel and identify resveratrol as a potential inhibitor of Kvβ2 activity. Overall, the research yielded beneficial PCR guidelines.
1. The document provides information about the Post Graduate Common Entrance Test conducted on 08-08-2015 for courses offered by VTU, UVCE and UBDTCE.
2. It lists the instructions for candidates regarding filling the answer sheet correctly, signing the sheet, and preserving the candidate's copy for self-evaluation.
3. The test details include the subjects, duration, number and type of questions, and distribution of marks for different sections. Instructions for candidates during the exam are also provided.
Welcome to International Journal of Engineering Research and Development (IJERD) (IJERD Editor)
The document proposes a novel approach for designing online-testable reversible circuits that can detect single-bit errors. The approach converts each reversible gate in the circuit into a testable reversible gate with an additional output bit used for error detection. These testable gates are then cascaded to form testable blocks. An error-checker circuit built from modified Fredkin gates examines the output bits of each testable block and produces a final error signal. Theoretical analysis proves the approach produces reversible, online-testable circuits. Experimental results show it outperforms existing methods, with fewer gates, fewer garbage outputs, and lower quantum cost.
Effect of Transformer Oil and Petroleum Hydrocarbons on PCB Screening (DTR-14... (Alvia Gaskill, Jr.)
A study evaluated the effects of transformer oil, petroleum hydrocarbons, and inorganic chloride on the accuracy of field tests for determining PCBs in soil. Soil samples spiked with varying levels of Aroclor 1242 and contaminants were tested using EPA Method 4020 (immunoassay), the L2000 method, and gas chromatography. Method 4020 failed to correctly classify soils containing >2% transformer oil or 0.5% diesel fuel due to negative interference from hydrocarbons. The L2000 and GC methods correctly classified all samples, even in the presence of hydrocarbons. Both methods were also unaffected by inorganic chloride contamination.
Scale-up of Safety Data using Dynochem. Tom Vickery (Scale-up Systems)
This document summarizes two case studies where Dynochem, a process modeling and simulation tool, was used to analyze experimental safety data and scale-up risks. In the first case, Dynochem was used to model the temperature-dependent decomposition of an unstable cryogenic reaction. The tool fitted the experimental heat flow data and allowed modeling of different reaction scenarios. In the second case, Dynochem fitted gas generation data from a reaction and allowed simulation of varying heat rates to understand their effect on runaway risks. The tool provided a consistent model for safety scale-up evaluations in both cases.
Crystallization process improvement driven by DynoChem process modeling. Flav... (Scale-up Systems)
The original continuous crystallization process used anti-solvent crystallization with heptanes and IPAc to crystallize an API. This led to varying composition, supersaturation, and volumes, producing small primary particles prone to agglomeration. The process had long cycle times and low throughput, and yielded primarily agglomerated particles. DynoChem modeling was used to improve the process by controlling crystallization parameters.
Dr. Reddy's Development of Kinetic Model and Process Prediction. Keerthi Pemula (Scale-up Systems)
This document discusses two case studies using kinetic modeling and DynoChem software to improve pharmaceutical synthesis processes. In the first case, three mechanisms were evaluated to predict an anti-bacterial reaction and reduce impurities. Mechanism 3 best fit the data and parameters from it improved yield. The second case developed a kinetic model for an API synthesis to minimize impurities and maximize yield through simulation and optimization. Process changes based on the mechanisms reduced reaction time and improved purity and yield. Overall, kinetic modeling with DynoChem helped analyze reaction mechanisms and improve two industrial synthesis processes.
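The mechanism-discrimination step described above amounts to comparing each candidate model's predictions against the measured data. A minimal residual-sum-of-squares comparison, with invented concentration profiles standing in for the real HPLC data, looks like:

```python
def sum_squared_error(observed, predicted):
    """Residual sum of squares between measured and model-predicted values."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted))

# Invented impurity-vs-time data and the predictions of three candidate mechanisms.
observed = [0.0, 1.2, 2.1, 2.6, 2.9]
mechanisms = {
    "mechanism 1": [0.0, 0.8, 1.5, 2.0, 2.4],
    "mechanism 2": [0.0, 1.5, 2.6, 3.3, 3.7],
    "mechanism 3": [0.0, 1.1, 2.0, 2.7, 3.0],
}
# The mechanism with the smallest residual best explains the data.
best = min(mechanisms, key=lambda m: sum_squared_error(observed, mechanisms[m]))
```

In practice the predictions come from integrating each mechanism's rate equations, and the winning mechanism's fitted parameters drive the subsequent process optimization.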
The purpose of this webinar is to highlight GSK's approach to:
- create a simple, mechanistically descriptive model
- verify its utility with clarity of objectives, and
- communicate understanding via creative but aligned metrics
... for a challenging chemical reaction.
Scale-up Systems India Mettler RC1. Sanket Salgaonkar (Scale-up Systems)
This document discusses the use of DynoChem software for modeling chemical processes. DynoChem can be used to develop process models based on experimental data from sources like reaction calorimeters. It allows modeling heat flows and predicting temperature profiles. This helps understand the process and enables process safety evaluations through what-if scenario analysis, like investigating the impact of loss of cooling capacity. DynoChem facilitates scale-up and process optimization by predicting large scale process performance based on the developed models.
Use of DynoChem in Process Development. Wilfried Hoffmann (Scale-up Systems)
1) Process modeling allows prediction of interactions between chemical and physical rates as a function of scale and equipment to enable safe and cost-effective scale-up from lab to production.
2) A case study reaction is used to demonstrate how kinetic data, heat of reaction, and process safety data can be combined in a model to optimize temperature profiles and process conditions at different scales.
3) Process safety is considered by simulating what would happen in case of cooling failure, and including a thermal risk metric in the optimization to safely transfer the process across scales.
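A standard first screen for the cooling-failure scenario is the adiabatic temperature rise, which bounds the temperature reached if accumulated reactant converts with no heat removal. The formula is textbook; the numbers below are hypothetical, not from the case study:

```python
def adiabatic_temperature_rise(dH_rxn, conc, rho, cp):
    """Adiabatic temperature rise (K): dH_rxn is the molar reaction enthalpy
    (J/mol, exothermic negative), conc the accumulated reactant concentration
    (mol/m^3), rho the mixture density (kg/m^3), cp its heat capacity (J/(kg K))."""
    return -dH_rxn * conc / (rho * cp)

# Hypothetical numbers: a strongly exothermic reaction run at 60 C.
dT_ad = adiabatic_temperature_rise(dH_rxn=-100_000.0, conc=2_000.0,
                                   rho=900.0, cp=2_000.0)
MTSR = 60.0 + dT_ad  # temperature reached if cooling fails at full accumulation
```

Comparing the MTSR against decomposition-onset temperatures from safety data gives the kind of thermal risk metric that can be folded into the optimization.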
Dom Hebrault presented on using real-time in situ FTIR analytics to enhance development and control of continuous processes. He discussed three case studies: [1] rapidly optimizing a Doebner modification reaction using inline FTIR to monitor concentrations in real time; [2] safely monitoring a hazardous indazole synthesis using hydrazine in flow; and [3] improving product quality of a Grignard reaction for drug synthesis, reducing an impurity from 40% to 1%, using inline FTIR process control. The case studies demonstrated how inline FTIR can provide major benefits for continuous-flow reaction optimization, monitoring of hazardous substances, and process quality control.
AIChE National Meeting 2012 Pittsburgh Presentation Flow Continuous (dominev)
1) In-situ FTIR spectroscopy using a ReactIR flow cell allows for real-time monitoring and analysis of continuous chemical reactions without interrupting flow.
2) Case studies demonstrated its use in optimizing a continuous ozonolysis reaction for safer API production, achieving a 2.7 kg yield in 4 days.
3) Rapid screening and optimization of a Doebner modification reaction was also demonstrated, identifying optimal conditions within hours using on-the-fly variation of temperature and residence time analyzed via the in-situ FTIR.
Applications in Kilo Lab Flow Chemistry and Scale-up. Edel Hughes (Scale-up Systems)
This presentation discusses Pfizer's Kilo Technology Lab (KTL) facility and its use of DynoChem software. The KTL uses modular equipment to develop and scale up processes from kg to commercial scale. The presentation characterizes the KTL's static mixer reactor using DynoChem and validates the model, shows how DynoChem was used to model centrifugation of a product, and outlines future plans to further characterize KTL equipment using DynoChem to enhance process understanding and better scale up processes.
Practical aspects of distillation modeling in DynoChem. Carolyn Cummings (Scale-up Systems)
The document discusses using DynoChem software to model and optimize distillation processes. It presents two case studies:
1) Modeling an azeotropic distillation of MTBE and methanol to determine the endpoint. The model accurately predicted the distillation time.
2) Modeling a batch concentration distillation to assess premature crystallization. The model was used to reproduce a manufacturing process, determine the optimal operating pressure to avoid crystallization, and predict batch properties over time. The optimized process incorporated additional solvent charges to control temperature and maintain solubility.
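Endpoint prediction of the kind in case 1 can be sketched by marching the Rayleigh equation for a binary batch distillation, assuming a constant relative volatility. That assumption is a deliberate simplification (the real MTBE/methanol system forms an azeotrope and needs proper vapor-liquid equilibrium), and all compositions below are hypothetical:

```python
def distill_to_endpoint(x0, alpha, x_target, frac_per_step=0.001):
    """March the Rayleigh equation for a binary batch distillation with
    constant relative volatility alpha, removing a small fraction of the
    pot per step, until the light-component mole fraction in the pot
    drops to x_target. Returns the fraction of the initial charge left."""
    x, remaining = x0, 1.0
    while x > x_target:
        y = alpha * x / (1.0 + (alpha - 1.0) * x)  # equilibrium vapor composition
        x = (x - y * frac_per_step) / (1.0 - frac_per_step)
        remaining *= 1.0 - frac_per_step
    return remaining

# Hypothetical charge: 30 mol% light solvent, distilled down to 1 mol% in the pot.
left = distill_to_endpoint(x0=0.30, alpha=4.0, x_target=0.01)
```

Given a distillate take-off rate, the fraction distilled converts directly into a predicted endpoint time, which is the quantity the model in the case study was validated against.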
Using DynoChem to Inform Experimental Design of Batch Crystallization. Rahn M... (Scale-up Systems)
The document discusses using modeling software to inform experimental design of batch crystallization processes. It presents two models - Model A acts as a "nucleation detector" to predict peak supersaturation without considering nucleation, while Model B solves the cooling or antisolvent addition curve. Case studies demonstrate how the models were used for optimization, robustness analysis, and scoping of new compounds. The models provide a simple tool to aid experimental design for scientists unfamiliar with crystallization kinetics.
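The "nucleation detector" idea behind Model A, tracking peak supersaturation with growth-only depletion during a cooling profile, can be sketched as follows. The exponential solubility curve and all rate constants here are assumed for illustration, not the models from the presentation:

```python
import math

def peak_supersaturation(c0, T0, T1, cool_rate, k_growth, a, b, dt=1.0):
    """Track supersaturation S = c - c_sat during a linear cool from T0 to T1,
    with growth-only depletion dc/dt = -k_growth * max(S, 0) and an assumed
    exponential solubility curve c_sat(T) = a * exp(b * T). Returns peak S."""
    c, T, S_peak = c0, T0, 0.0
    t, t_end = 0.0, (T0 - T1) / cool_rate
    while t < t_end:
        c_sat = a * math.exp(b * T)
        S = c - c_sat
        S_peak = max(S_peak, S)
        c -= k_growth * max(S, 0.0) * dt
        T -= cool_rate * dt
        t += dt
    return S_peak

# Same hypothetical system cooled fast vs slow; slower cooling keeps peak S lower.
fast = peak_supersaturation(c0=100.0, T0=60.0, T1=20.0, cool_rate=0.5,
                            k_growth=0.01, a=4.979, b=0.05)
slow = peak_supersaturation(c0=100.0, T0=60.0, T1=20.0, cool_rate=0.05,
                            k_growth=0.01, a=4.979, b=0.05)
```

Comparing the predicted peak against a nucleation threshold is what lets such a model flag cooling profiles likely to nucleate before any experiment is run.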
The document summarizes Robert Woodward's presentation on the kinetics of an SNAr (nucleophilic aromatic substitution) reaction. The presentation covered the reaction chemistry, experiments conducted using two substrates at varying temperatures and conditions, kinetic data collected, and a kinetic scheme and assumptions for modeling the reaction using iterative simulation and optimization in Dynochem software. The goal was to better understand the reaction mechanism and develop predictive chemistry to improve conversion and reduce impurities within timelines for a diabetes medication synthesis.
This document discusses several studies utilizing continuous flow microreactors for organic synthesis. One study produced an unstable Vilsmeier-Haack formylation intermediate in a safe and controlled manner using inline infrared analysis to optimize reaction conditions. Another used inline infrared to study gas-liquid homogeneous catalysis kinetics at high pressures. A third demonstrated automated optimization of a Paal-Knorr reaction using online infrared data in a microfluidic system.
Scale Up Challenges in Chemical Engineering: The Role of Chemical Engineers i... (jodoua)
This document discusses challenges in scaling up chemical processes from the laboratory to the industrial scale. It emphasizes the importance of a systematic approach using models informed by experiments at different scales. Dimensionless groups allow insights from small-scale experiments and models to be applied at larger scales by capturing key relationships unaffected by scale. Process design challenges include variability in feedstocks, utilizing waste streams, energy costs, regulations, and rapid time to market.
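Dimensionless groups make such scale transitions concrete: matching a group across scales fixes an operating parameter at the larger scale. As a sketch, holding the impeller Reynolds number constant sets the stirrer speed in the plant vessel (the fluid properties and equipment dimensions below are hypothetical):

```python
def reynolds_stirred_tank(rho, N, D, mu):
    """Impeller Reynolds number Re = rho * N * D^2 / mu for a stirred tank:
    rho fluid density (kg/m^3), N impeller speed (1/s), D impeller
    diameter (m), mu dynamic viscosity (Pa s)."""
    return rho * N * D**2 / mu

# Lab reactor (D = 0.05 m) vs plant reactor (D = 0.5 m), same fluid.
re_lab = reynolds_stirred_tank(rho=1000.0, N=5.0, D=0.05, mu=1e-3)

# Keeping Re constant implies N_plant = N_lab * (D_lab / D_plant)^2.
N_plant = 5.0 * (0.05 / 0.5) ** 2
re_plant = reynolds_stirred_tank(rho=1000.0, N=N_plant, D=0.5, mu=1e-3)
```

In a real scale-up several groups compete (Reynolds, power per volume, tip speed), and part of the engineering judgment is choosing which one to hold constant.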
Scale-up and scale-down of chemical processes (Seppo Karrila)
Explains the path from, for example, synthesizing a promising material in the lab to actual production of that material. Also explains what pilot machines are, how they are used, and why a unit operation is sometimes scaled down for bench-scale experimentation.
13th Brazilian Meeting on Organic Synthesis (dominev)
Combining real-time analytics and process control can enhance chemical development. FTIR was used as a PAT tool in two case studies:
1) Monitoring a deprotonation reaction in situ allowed precise endpoint determination, minimizing impurities. This improved process was successfully scaled up.
2) FTIR monitored three consecutive continuous reactions for a pharmaceutical intermediate. Real-time feedback controlled base feed rate and ensured proper stoichiometry, minimizing waste and impurities. This continuous process was also successfully scaled up.
Real-time flow analysis using FTIR allows more efficient process optimization, development, and scale-up through in-line monitoring and feedback control.
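The feedback loop in case 2, adjusting base feed from an in-line concentration reading, can be sketched as a simple proportional controller. The gain, setpoint, and flow values below are illustrative assumptions, not the published control scheme:

```python
def base_feed_rate(c_acid, setpoint, nominal_rate, gain, min_rate=0.0):
    """Proportional correction of the base feed rate from an in-line acid
    concentration reading: feed faster when acid runs above setpoint,
    slower when below, never below min_rate."""
    return max(nominal_rate + gain * (c_acid - setpoint), min_rate)

# Illustrative readings around a 0.10 M setpoint with a nominal 10 mL/min feed.
high = base_feed_rate(c_acid=0.12, setpoint=0.10, nominal_rate=10.0, gain=100.0)
low = base_feed_rate(c_acid=0.08, setpoint=0.10, nominal_rate=10.0, gain=100.0)
```

A production loop would typically add integral action and signal filtering, but even this sketch shows how a spectroscopic reading closes the stoichiometry loop in real time.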
1) Recent advances in continuous flow chemistry allow for safer and more efficient reactions through the use of inline monitoring techniques like ATR-FTIR.
2) A Strecker reaction was optimized in a flow reactor using ATR-FTIR to monitor the reaction in situ, which allowed for safer operation and higher yields through rapid stoichiometric optimization.
3) A chemoenzymatic sequence for the stereoselective synthesis of lactones was developed using a single-operation protocol combining continuous-flow hydrogenation and biocatalyzed Baeyer-Villiger oxidation, which provided a safer and simpler procedure.
Modeling of Granular Mixing using Markov Chains and the Discrete Element Method (jodoua)
The document presents a method for modeling granular mixing using Markov chains and the discrete element method (DEM). It motivates the use of Markov chains to efficiently simulate granular mixing as an alternative to computationally expensive DEM simulations. The theory and definitions of Markov chains and operators are provided. The method is applied to simulate mixing in a cylindrical drum, and the effects of the number of states, time step, and learning time are investigated. Properties of the resulting operator like the invariant distribution and mixing rates are analyzed to characterize the mixing dynamics.
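The invariant distribution mentioned above can be computed by repeatedly applying the transition operator to a state-occupancy vector until it stops changing. Here is a sketch on a toy three-state chain (not the drum model from the paper), where each state is a zone of the mixer and entries of P give the per-step probability of a particle moving between zones:

```python
def invariant_distribution(P, iters=500):
    """Invariant distribution of a row-stochastic transition matrix P
    (list of rows) by repeated application of p <- p P (power iteration)."""
    n = len(P)
    p = [1.0 / n] * n  # start from a uniform occupancy over the n states
    for _ in range(iters):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p

# Toy 3-zone mixer: particles mostly stay put, with some exchange via the middle.
P = [[0.8, 0.2, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.2, 0.8]]
pi = invariant_distribution(P)
```

The speed at which an initial occupancy relaxes toward this invariant distribution (set by the operator's second-largest eigenvalue) is exactly the kind of mixing-rate property the paper analyzes.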
The document discusses three key strategies for future growth at PHARMACO:
1. Centralizing call handling through a call center to reduce pharmacist and technician phone times.
2. Migrating refill prescriptions from stores to a mail order program to focus on new prescriptions in stores.
3. Focusing on growing the mail order business by installing business processes to fulfill prescriptions through mail order using retail stores, aiming to increase mail order volume to over 30% of prescriptions.
1) In-situ FTIR spectroscopy using a ReactIR flow cell allows for real-time monitoring and analysis of continuous chemical reactions without the need for offline sampling.
2) A case study demonstrated the use of in-situ FTIR to develop a continuous process for the ozonolysis of styrene and an API intermediate, allowing characterization of reaction kinetics, intermediates, and optimization of flow rate and reactor size.
3) This led to the safe, efficient production of 2.7 kg of an API intermediate over 4 days with 99% conversion and 85% ozone efficiency. In-situ FTIR enabled continuous monitoring and ensured high product quality and yield.
21st International Conference Organic Process Research & Development 2010 San... (dominev)
This document discusses using real-time calorimetry to improve operational efficiency. It presents case studies where ReactIR, FBRM, PVM and RTCal were used:
1) ReactIR developed kinetic models to minimize byproducts in pharmaceutical reactions and improve crystallization processes.
2) FBRM and PVM helped optimize a crystallization to reduce impurities below 0.5%.
3) RTCal validated switching to a low copper acrylamide grade for polymerization, showing a shorter induction period but similar maximum heat output. Real-time calorimetry provided process safety evaluation.
Analysis insight about a Flyball dog competition team's performance (roli9797)
Insights from my analysis of a Flyball dog competition team's performance over the past year. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
STATATHON: Unleashing the Power of Statistics in a 48-Hour Knowledge Extravag...sameer shah
"Join us for STATATHON, a dynamic 2-day event dedicated to exploring statistical knowledge and its real-world applications. From theory to practice, participants engage in intensive learning sessions, workshops, and challenges, fostering a deeper understanding of statistical methodologies and their significance in various fields."
Open Source Contributions to Postgres: The Basics POSETTE 2024ElizabethGarrettChri
Postgres is the most advanced open-source database in the world and it's supported by a community, not a single company. So how does this work? How does code actually get into Postgres? I recently had a patch submitted and committed and I want to share what I learned in that process. I’ll give you an overview of Postgres versions and how the underlying project codebase functions. I’ll also show you the process for submitting a patch and getting that tested and committed.
End-to-end pipeline agility - Berlin Buzzwords 2024Lars Albertsson
We describe how we achieve high change agility in data engineering by eliminating the fear of breaking downstream data pipelines through end-to-end pipeline testing, and by using schema metaprogramming to safely eliminate boilerplate involved in changes that affect whole pipelines.
A quick poll on agility in changing pipelines from end to end indicated a huge span in capabilities. For the question "How long time does it take for all downstream pipelines to be adapted to an upstream change," the median response was 6 months, but some respondents could do it in less than a day. When quantitative data engineering differences between the best and worst are measured, the span is often 100x-1000x, sometimes even more.
A long time ago, we suffered at Spotify from fear of changing pipelines due to not knowing what the impact might be downstream. We made plans for a technical solution to test pipelines end-to-end to mitigate that fear, but the effort failed for cultural reasons. We eventually solved this challenge, but in a different context. In this presentation we will describe how we test full pipelines effectively by manipulating workflow orchestration, which enables us to make changes in pipelines without fear of breaking downstream.
Making schema changes that affect many jobs also involves a lot of toil and boilerplate. Using schema-on-read mitigates some of it, but has drawbacks since it makes it more difficult to detect errors early. We will describe how we have rejected this tradeoff by applying schema metaprogramming, eliminating boilerplate but keeping the protection of static typing, thereby further improving agility to quickly modify data pipelines without fear.
Global Situational Awareness of A.I. and where its headedvikram sood
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be un-leashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
ViewShift: Hassle-free Dynamic Policy Enforcement for Every Data LakeWalaa Eldin Moustafa
Dynamic policy enforcement is becoming an increasingly important topic in today’s world where data privacy and compliance is a top priority for companies, individuals, and regulators alike. In these slides, we discuss how LinkedIn implements a powerful dynamic policy enforcement engine, called ViewShift, and integrates it within its data lake. We show the query engine architecture and how catalog implementations can automatically route table resolutions to compliance-enforcing SQL views. Such views have a set of very interesting properties: (1) They are auto-generated from declarative data annotations. (2) They respect user-level consent and preferences (3) They are context-aware, encoding a different set of transformations for different use cases (4) They are portable; while the SQL logic is only implemented in one SQL dialect, it is accessible in all engines.
#SQL #Views #Privacy #Compliance #DataLake
4th Modern Marketing Reckoner by MMA Global India & Group M: 60+ experts on W...Social Samosa
The Modern Marketing Reckoner (MMR) is a comprehensive resource packed with POVs from 60+ industry leaders on how AI is transforming the 4 key pillars of marketing – product, place, price and promotions.
4th Modern Marketing Reckoner by MMA Global India & Group M: 60+ experts on W...
Using Dynochem to determine a suitable sampling endpoint in a DoE. David Place.
1. Using Dynochem to determine a suitable sampling endpoint in a DOE
David W. Place, Ph.D.
401 N Middletown Rd, B222/2149, Pearl River, NY 10965
May 13-14, 2009
2. Outline
I. Comments on DoE Assessment Process
II. Case Study: Finkelstein activated alkylation
Establish control over impurity formation that carries through to API
III. Importance of sampling endpoint
Understand kinetics in order to remove time as a factor from the DoE
IV. Data Fitting: Establishing k’s and Ea’s
V. Simulating Alternate Design Points
Refine Factor/CPP ranges based on Dynochem solved kinetic model
VI. Summary
2 D. Place
3. I. DOE Investigation: Assessment process

Assessment sequence, in order of increasing process predictability:
1. Reproducibility Assessment: validation that parameters NOT investigated are being controlled
2. Kinetic Assessment: understanding of factor ranges to establish suitable process endpoints
3. Fractional Factorial DOE: establish/identify the most important CPPs and their rank order/interactions
4. Response Surface Model DOE: generate a predictive equation for the CQA based on the important CPPs
4. II. Case Study: Finkelstein activated alkylation

Reaction scheme: R-Cl ("substrate") reacts with x mol equiv of the amine ("amine") in the presence of y mol equiv NaI, at 50-82 degC in z parts solvent, to give the alkylated "product".

Experiment      | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g)
A (low)         | 2                 | 0               | 50          | 4
B (centerpoint) | 3.5               | 0.5             | 66          | 5.5
C (repeat)      | 3.5               | 0.5             | 66          | 5.5
D (high)        | 5                 | 1               | 82          | 7

Issue: Reaction conditions lead to formation of a quaternary salt impurity (0.1-2%) that carries through into the API and is difficult to remove.
Approach: Use a fractional factorial Res V design to determine the critical process parameters (CPPs) that control quat salt formation.
Problem: The reaction mixture is heterogeneous, requiring a sacrificial quench of the entire reaction mixture to determine the impurity profile.
5. Dynochem Models used
n Model: Dynochem's Yield loss from side reactions (batch)
n Data: HPLC assay data for substrate and product converted to mmol via calibration curves
n Assumption: Use the simplest mechanism that describes conversion (all steps at 50-82 degC in z parts solvent):

k1: substrate (R-Cl) + amine -> product + HCl
k2: substrate (R-Cl) + NaI -> intermediate (R-I) + NaCl
k3: intermediate (R-I) + amine -> product + HI
k4: product + substrate (R-Cl) -> impurity (quaternary ammonium chloride)
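The four-step mechanism above maps directly onto a small ODE system. Below is a minimal sketch (not the actual Dynochem model) that integrates it with SciPy, using the fitted rate constants reported later in the deck; the 1 mol/L substrate basis, the centerpoint-like charges, and all names are illustrative assumptions.

```python
# Minimal sketch of the four-step mechanism as a batch-reactor ODE system.
# k's are the fitted values reported later in the deck (L/mol/s at 66 degC);
# the 1 mol/L substrate basis and centerpoint-like charges are assumptions.
from scipy.integrate import solve_ivp

def rates(t, y, k1, k2, k3, k4):
    substrate, amine, nai, intermediate, product, impurity = y
    r1 = k1 * substrate * amine        # substrate + amine -> product + HCl
    r2 = k2 * substrate * nai          # substrate + NaI -> intermediate + NaCl
    r3 = k3 * intermediate * amine     # intermediate + amine -> product + HI
    r4 = k4 * product * substrate      # product + substrate -> impurity
    return [-r1 - r2 - r4,             # substrate
            -r1 - r3,                  # amine
            -r2,                       # NaI (taken as consumed, per the scheme)
            r2 - r3,                   # intermediate
            r1 + r3 - r4,              # product
            r4]                        # impurity (quat salt)

k_fit = (1.1e-5, 34e-5, 3.7e-5, 5.0e-5)   # L/mol/s at Tref = 66 degC
y0 = [1.0, 3.5, 0.5, 0.0, 0.0, 0.0]       # assumed centerpoint-like charges, mol/L
sol = solve_ivp(rates, (0.0, 30 * 3600.0), y0, args=k_fit,
                method="LSODA", rtol=1e-8, atol=1e-10)
```

Plotting `sol.y` against `sol.t` gives the kind of concentration-time profiles fitted on the later slides; swapping the charges in `y0` lets the same sketch scope alternate design points.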
6. III. Importance of Sampling Endpoint
Removing time as a factor from the DoE
If an adequate kinetic model of the mechanism can be elucidated:
  The Dynochem Simulator can be used to scope out suitable endpoints
  DoE factor (CPP) ranges can be investigated prior to committing time/resources

[Simulated impurity-vs-time profile, annotated with impurity levels of 3.5 mol% and <1 mol% (the limit known from previous batch experience) at candidate endpoints of 9 h and 15 h.]
* Simulated using Dynochem's Yield loss from side reactions (batch) Model
7. IV. Data Fitting: Procedure to fit rate constants and Ea

Process sheet:
k1: substrate + amine > product + HCl
k2: substrate + NaI > intermediate + NaCl
k3: intermediate + amine > product + HI
k4: product + substrate > impurity

Scenario sheet:
Experiment      | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g)
A (low)         | 2                 | 0               | 50          | 4
B (centerpoint) | 3.5               | 0.5             | 66          | 5.5
C (repeat)      | 3.5               | 0.5             | 66          | 5.5
D (high)        | 5                 | 1               | 82          | 7
1. Translate mechanistic proposal into process sheet
2. Translate design factors into the scenarios sheet
3. HPLC Area count data converted to mmol for substrate
and product using reference standard calibration curves
4. Use Dynochem fitting engine to solve 4 k’s and 4 Ea’s
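The fitting step can be illustrated outside Dynochem. The sketch below fits a reference-temperature rate constant and an activation energy to synthetic first-order decay data at two temperatures by least squares, using an Arrhenius-at-Tref parameterization like the one the deck reports; the synthetic data, starting guesses, and all names are assumptions for the demo, not the deck's actual data or engine.

```python
# Sketch of the fitting step: least-squares fit of k(Tref) and Ea to
# concentration-vs-time data. Synthetic first-order decays stand in for
# the HPLC-derived mmol values; all numbers here are demo assumptions.
import numpy as np
from scipy.optimize import least_squares

R = 8.314            # J/mol/K
TREF = 66 + 273.15   # reference temperature used in the deck, K

def k_at(T, k_ref, Ea):
    """Arrhenius extrapolation of k from TREF to T (Ea in J/mol)."""
    return k_ref * np.exp(-Ea / R * (1.0 / T - 1.0 / TREF))

# Synthetic "experiments": first-order substrate decay at 50 and 82 degC.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 8 * 3600.0, 20)            # s
true_kref, true_Ea = 1.0e-4, 80e3               # assumed ground truth
data = {T: np.exp(-k_at(T, true_kref, true_Ea) * t)
           * (1 + 0.01 * rng.standard_normal(t.size))
        for T in (323.15, 355.15)}

def residuals(p):
    log10_kref, Ea_kJ = p                        # log/kJ scaling keeps the
    k_ref, Ea = 10.0 ** log10_kref, Ea_kJ * 1e3  # two parameters comparable
    return np.concatenate([np.exp(-k_at(T, k_ref, Ea) * t) - c
                           for T, c in data.items()])

fit = least_squares(residuals, x0=[-5.0, 50.0])
k_fitted, Ea_fitted = 10.0 ** fit.x[0], fit.x[1] * 1e3
```

Fitting both temperatures simultaneously is what lets a single regression resolve k at Tref and Ea together, which is the same idea behind solving the 4 k's and 4 Ea's from the four shake-down runs.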
8. Yield loss from side reactions (batch)
Modified to model suspected reaction mechanism
[Screenshots: the model's Data sheet and Scenario sheet.]
9. Dynochem Fits of Experimental Data
[Fit plots: model curves vs. experimental points (Solution.substrate, Solution.product, Solution.impurity, in mmol) over roughly 1700-1800 min for each experiment:
- A (low factors): 50 degC, no NaI, 2 equiv amine, 4 parts solvent; substrate/product R^2 = 0.97/0.98
- B&C (centerpoint): 66 degC, 0.5 equiv NaI, 3.5 equiv amine, 5.5 parts solvent; substrate/product R^2 = 0.99/0.98
- D (high factors): 82 degC, 1 equiv NaI, 5 equiv amine, 7 parts solvent; substrate/product R^2 = 0.99/0.97]

n Model fits substrate loss fairly well over the set of data
n Model overestimates impurity content - model refinement necessary
10. Fitting Summary

k1: substrate + amine > product + HCl
k2: substrate + NaI > intermediate + NaCl
k3: intermediate + amine > product + HI
k4: product + substrate > impurity

     | k (10^-5 L/mol s) |     | Ea (kJ/mol) | Ea (kcal/mol)
k1   | 1.1 +/- 0.2       | Ea1 | 40 +/- 9    | 9 +/- 2
k2   | 34 +/- 6          | Ea2 | -           | -
k3   | 3.7 +/- 0.7       | Ea3 | 100 +/- 30  | 24 +/- 6
k4   | 5.0 +/- 0.8       | Ea4 | 90 +/- 10   | 23 +/- 3
n k values reported at T(ref) = 66 C
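As an illustrative check on what the fitted Ea's imply, the tabulated k's can be extrapolated from Tref = 66 degC across the studied range with the Arrhenius equation. The short sketch below (not from the deck) shows that heating to 82 degC roughly doubles k1 but roughly quadruples k4, i.e. the impurity-forming step is considerably more temperature-sensitive than the desired alkylation.

```python
# Extrapolate the tabulated k's from Tref = 66 degC to 82 degC.
# Ea4 (~90 kJ/mol) >> Ea1 (~40 kJ/mol), so the impurity-forming step
# accelerates far more on heating than the desired alkylation.
import math

R = 8.314            # J/mol/K
TREF = 66 + 273.15   # K

def arrhenius_scale(Ea, T):
    """k(T)/k(TREF) for a step with activation energy Ea (J/mol)."""
    return math.exp(-Ea / R * (1.0 / T - 1.0 / TREF))

T_HIGH = 82 + 273.15
speedup_k1 = arrhenius_scale(40e3, T_HIGH)   # desired alkylation (Ea1)
speedup_k4 = arrhenius_scale(90e3, T_HIGH)   # impurity formation (Ea4)
# speedup_k1 is about 1.9x; speedup_k4 is about 4.2x
```

This asymmetry is one reason temperature turns out to be such an influential CPP in the simulations that follow.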
11. V. Simulating Alternate Design Points
n Criteria for the reaction:
  Reaction completed to <1% substrate
  Reaction time <30 h
n Question: Which ranges of CPPs will fit the criteria?

Process sheet additions:
  Variables: Yield (%), molpctSM (%)
  Calculate:
    Yield := solution.product / solution.substrate.Y0
    molpctSM := solution.substrate / solution.substrate.Y0
    End time := if(molpctSM < 0.01, time, 14400)
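The End time logic above can be sketched in a few lines: scan a simulated substrate profile for the first time point below 1 mol% of the initial charge, falling back to the 14400 min horizon used in the sheet. The decay profile here is an assumed pseudo-first-order example for illustration, not the fitted model, and the function name is hypothetical.

```python
# Sketch of the scenario-sheet endpoint logic: first time the substrate
# drops below 1 mol% of its initial charge, else the 14400 min horizon.
import numpy as np

def endpoint_minutes(t_min, substrate, threshold=0.01, horizon=14400.0):
    """First t with substrate/substrate[0] < threshold, else horizon."""
    below = np.nonzero(substrate / substrate[0] < threshold)[0]
    return float(t_min[below[0]]) if below.size else float(horizon)

t = np.linspace(0.0, 14400.0, 1441)          # min, 10 min grid
profile = np.exp(-np.log(2.0) / 120.0 * t)   # assumed 2 h half-life
t_end = endpoint_minutes(t, profile)         # ~800 min for this profile
```

Running the same scan over profiles simulated at each candidate design point is what produces the reaction-endpoint columns in the tables that follow.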
12. Searching for New CPP Ranges
Use the Dynochem Simulator
Endpoint at 1 mol% substrate
13. Reaction Endpoint Predictions
n Initial design points predict reaction endpoints between 8 h and 136 h @ <1% substrate

Experiment        | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g) | Reaction Endpoint (h)
A (low)           | 2                 | 0               | 50          | 4                    | 136
B&C (centerpoint) | 3.5               | 0.5             | 66          | 5.5                  | 30
D (high)          | 5                 | 1               | 82          | 7                    | 8
A' (corner point) | 2                 | 0               | 50          | 7                    | 261
D' (corner point) | 5                 | 1               | 82          | 4                    | 4

n Simulation of alternate design points actually suggests that the reaction endpoint will vary between 4 and 261 h within the design space - the CPP ranges need to be altered to meet the criteria
14. Influence of CPP “decrease” on Reaction Endpoint
Experiment        | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g) | Reaction Endpoint (h)
B&C (centerpoint) | 3.5               | 0.5             | 66          | 5.5                  | 29
Scenario 1        | 2                 | 0.5             | 66          | 5.5                  | 47
Scenario 2        | 3.5               | 0               | 66          | 5.5                  | 43
Scenario 3        | 3.5               | 0.5             | 50          | 5.5                  | 70
Scenario 4        | 3.5               | 0.5             | 66          | 7                    | 38

n Reaction temperature is the most influential parameter governing reaction endpoint.
n Rank order: Rxn Temp > Amine > NaI mol > Parts Solvent
15. Identifying a new "All-factors-low" design point: Temperature effects

Experiment  | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g) | Reaction Endpoint (h)
A (low)     | 2                 | 0               | 50          | 4                    | 136
Scenario 1  | 2                 | 0               | 66          | 4                    | 52
Scenario 2  | 2                 | 0               | 66          | 7                    | 99
Scenario 2A | 2                 | 0               | 68          | 7                    | 89
Scenario 2B | 2                 | 0               | 70          | 7                    | 81
Scenario 2C | 2                 | 0               | 72          | 7                    | 73
Scenario 2D | 2                 | 0               | 82          | 7                    | 44

n In order to preserve the amine CPP range of 2-5 mol equiv and the NaI CPP range of 0-1 mol equiv:
  Parts solvent CPP would need to be set to <4 parts to meet the criteria
  -OR-
  Reaction temperature CPP would need to have its lowest value set to >82 degC to meet the criteria
16. Identifying a new "All-factors-low" design point: NaI effects

Experiment  | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g) | Reaction Endpoint (h)
A (low)     | 2                 | 0               | 50          | 4                    | 136
Scenario 1  | 2                 | 0               | 66          | 4                    | 52
Scenario 2  | 2                 | 0               | 66          | 7                    | 99
Scenario 2E | 2                 | 0.1             | 66          | 7                    | 91
Scenario 2F | 2                 | 0.2             | 66          | 7                    | 86
Scenario 2G | 2                 | 0.3             | 66          | 7                    | 78
Scenario 2H | 2                 | 0.5             | 66          | 7                    | 61
Scenario 2I | 2                 | 0.9             | 66          | 7                    | 28

n In order to preserve the amine CPP range of 2-5 mol equiv and the reaction temperature CPP range of 66-82 degC:
  NaI CPP would need to have its lowest value set to 0.9 mol equiv to meet the criteria
17. Conclusion: The Trade-Off

Experiment                  | Amine (mol equiv) | NaI (mol equiv) | Temp (degC) | Parts Solvent (mL/g) | Reaction Endpoint (h)
A' (low) - old design point | 2                 | 0               | 50          | 7                    | 261
New "low" design point      | 2                 | 0.5             | 72          | 6                    | 29

New and recommended CPP ranges for the DoE, based on the kinetic assessment:

CPP           | Unit      | Low | Centerpoint | High
Amine         | mol equiv | 2   | 3.5         | 5
NaI           | mol equiv | 0.5 | 1.5         | 2.5
Temp          | degC      | 72  | 77          | 82
Parts Solvent | mL/g      | 5   | 5.5         | 6
18. VI. Summary
n A kinetic assessment of the reaction prior to running a DoE is essential to ensure proper choice of design factor ranges.
n If an adequate kinetic model of the mechanism can be elucidated, DoE factor (CPP) ranges can be investigated prior to committing time & resources.
n Dynochem is a powerful tool that enables the process chemist to leverage data collected from 4 "shake-down" runs.
19. Acknowledgements
Jianxin Ren
Michael O’Brien
Marty Guinn
Peter Clark