This presentation explains the meaning of ECONOMETRICS and why the subject is studied as a separate discipline.
The reference is the book "BASIC ECONOMETRICS" by Damodar N. Gujarati.
For further explanation, see the YouTube link:
https://youtu.be/S3SUDiVpUGU
The aim of this course is to equip students with the necessary skills, including both the acquisition of habits of thought and knowledge of the techniques of modern econometrics.
The course is application oriented.
It also aims to give students the ability to use appropriate software effectively.
Statistics as a subject (field of study):
Statistics is defined as the science of collecting, organizing, presenting, analyzing, and interpreting numerical data in order to make decisions on the basis of such analysis. (Singular sense)
Statistics as numerical data:
Statistics is defined as aggregates of numerically expressed facts (figures) collected in a systematic manner for a predetermined purpose. (Plural sense) In this course, we shall be mainly concerned with statistics as a subject, that is, as a field of study.
2. International Journal of Science, Engineering and Management
4. Outcomes of the lecture
• Develop a basic understanding of econometrics
• Analyze the methodology used in economic models
5. METHODOLOGY OF ECONOMETRICS
Statement of Economic theory
Specification of the Mathematical model
Specification of the Econometric model
Obtaining Data
Estimation of econometric model
Hypothesis testing
Forecasting or prediction
Use of the model for policy purposes
6. METHODOLOGY OF ECONOMETRICS
• Broadly speaking, traditional econometric methodology proceeds along the following lines:
1. Statement of economic theory or hypothesis
2. Specification of the mathematical model of the theory
3. Specification of the statistical, or econometric, model
4. Collecting the data
5. Estimation of the parameters of the econometric model
6. Hypothesis testing
7. Forecasting or prediction
8. Using the model for control or policy purposes
• To illustrate the preceding steps, let us consider the well-known Keynesian theory of consumption.
7. 1. Statement of Economic Theory or Hypothesis
• Keynes states that, on average, consumers increase their consumption as their income increases, but not by as much as the increase in their income (MPC < 1).
MPC = rate of change of consumption with respect to a change in income.
2. Specification of the Mathematical Model of Consumption (single-equation model)
Y = a + β1X, 0 < β1 < 1
Y = consumption expenditure (dependent variable)
X = income (independent, or explanatory, variable)
a = the intercept
β1 = the slope coefficient
• The slope coefficient β1 measures the MPC.
9. 3. Specification of the Econometric Model of Consumption
• The relationships between economic variables are generally inexact. In addition to income, other variables affect consumption expenditure. For example, the size of the family, the ages of its members, family religion, etc., are likely to exert some influence on consumption.
• To allow for the inexact relationships between economic variables, (I.3.1) is modified as follows:
• Y = a + β1X + u
• where u, known as the disturbance, or error, term, is a random (stochastic) variable. The disturbance term u may well represent all those factors that affect consumption but are not taken into account explicitly.
10. • It hypothesizes that Y is linearly related to X, but that the relationship between the two is not exact; it is subject to individual variation. The econometric model can be depicted as shown in Figure I.2.
11. MCQ
• Consider the following simple regression model of house prices: house_price = b0 + b1*land_size + u. What is b1?
• (a) land_size.
• (b) the distance to the city.
• (c) slope parameter.
• (d) intercept parameter.
12. MCQ
• In the equation y = β0 + β1x1 + β2x2 + u, β0 is a(n) _____.
• (a) independent variable
• (b) dependent variable
• (c) slope parameter
• (d) intercept parameter
13. 4. Obtaining Data
• To obtain the numerical values of a and β1, we need data. Look at Table I.1, which relates to personal consumption expenditure (PCE) and gross domestic product (GDP). The data are in "real" terms.
16. 5. Estimation of the Econometric Model
• Regression analysis is the main tool used to obtain the estimates. Using this technique and the data given in Table I.1, we obtain the following estimates of a and β1, namely, −184.08 and 0.7064. Thus, the estimated consumption function is:
• Yˆ = −184.08 + 0.7064Xi
• The estimated regression line is shown in Figure I.3. The regression line fits the data quite well. The slope coefficient (i.e., the MPC) was about 0.70: an increase in real income of 1 dollar led, on average, to an increase of about 70 cents in real consumption.
17. 6. Hypothesis Testing
• That is, to find out whether the estimates obtained in Eq. (I.3.3) are in accord with the expectations of the theory being tested. Keynes expected the MPC to be positive but less than 1.
• In our example we found the MPC to be about 0.70.
• But before we accept this finding as confirmation of Keynesian consumption theory, we must enquire whether this estimate is sufficiently below unity.
• In other words, is 0.70 statistically less than 1? If it is, it may support Keynes' theory.
18. 7. Forecasting or Prediction
• To illustrate, suppose we want to predict the mean consumption expenditure for 1997. The GDP value for 1997 was 7269.8 billion dollars, so the predicted consumption would be:
Yˆ1997 = −184.0779 + 0.7064 (7269.8) = 4951.3
• Now suppose the government decides to propose a reduction in the income tax. What will be the effect of such a policy on income, and thereby on consumption expenditure, and ultimately on employment?
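The 1997 forecast above can be reproduced directly from the estimated equation:

```python
# Point forecast from the estimated consumption function.
a_hat, b1_hat = -184.0779, 0.7064
gdp_1997 = 7269.8                 # billions of dollars (from the text)

y_hat_1997 = a_hat + b1_hat * gdp_1997
print(f"Predicted 1997 consumption: {y_hat_1997:.1f}")  # matches the text's 4951.3
```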
19. 8. Use of the Model for Control or Policy Purposes
The estimated model can be used for policy-making: through an appropriate mix of fiscal and monetary policy, the government can manipulate the control variable X to produce a desired level of the target variable Y.
20. Economic Theory → Mathematical Model → Econometric Model → Data Collection → Estimation → Hypothesis Testing → Forecasting → Application in control or policy studies
21. Scope of Econometrics
• Developing statistical methods for the estimation of economic relationships
• Testing economic theories and hypotheses
• Evaluating and applying economic policies
• Forecasting
• Collecting and analyzing non-experimental or observational data
22. Goals of Econometrics
• The three main aims of econometrics are as follows:
1. Formulation and specification of econometric models
2. Estimation and testing of models
3. Use of models
23. Division of Econometrics
• Two types of econometrics:
1. Theoretical econometrics: statistical methods
2. Applied econometrics: empirical analysis
24. RELATIONSHIPS
Statistical versus deterministic: in statistical relationships among variables we essentially deal with random (stochastic) variables; in deterministic relationships the variables are not random.
Regression versus causation: although regression analysis deals with the dependence of one variable on others, there is no statistical reason to assume that rainfall does not depend on crop yield; it is common sense that tells us the dependence runs the other way. A statistical relationship in itself cannot logically imply causation.
25. Regression versus correlation: correlation measures the strength of association between two variables, via the correlation coefficient. Regression analysis, by contrast, tries to estimate or predict the average value of one variable on the basis of the fixed values of the other.
In regression analysis there is an asymmetry in the way the dependent and explanatory variables are treated; in correlation analysis the two variables are treated symmetrically.
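This asymmetry can be seen numerically: the correlation coefficient is the same whichever variable comes first, while the slope of the regression of Y on X differs from that of X on Y. A small sketch on invented data:

```python
import math

# Invented data, purely to illustrate the symmetry/asymmetry point.
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.0, 4.0, 5.0, 4.0, 6.0]

n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
sxx = sum((x - x_bar) ** 2 for x in X)
syy = sum((y - y_bar) ** 2 for y in Y)

r = sxy / math.sqrt(sxx * syy)   # correlation: symmetric in X and Y
b_yx = sxy / sxx                 # slope of the regression of Y on X
b_xy = sxy / syy                 # slope of the regression of X on Y

print(f"r = {r:.3f}, b_yx = {b_yx:.3f}, b_xy = {b_xy:.3f}")
```

Swapping X and Y leaves r unchanged, but the two regression slopes differ (here 0.800 versus 0.909), reflecting that regression treats one variable as dependent and the other as explanatory.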
27. TYPES OF DATA IN ECONOMETRICS
Time series data:
consists of observations on a variable or several variables over time
chronological ordering
frequency of time series data: hour, day, week, month, year
the time length between observations is generally equal
Examples of time series data include stock prices, money supply, the consumer price index, gross domestic product, annual homicide rates, and automobile sales figures.
29. Cross-sectional Data
• Data collected at the same point in time
• Consists of a sample of individuals, households, firms, cities, states, countries, or a variety of other units, taken at a given point in time
• Significant feature: random sampling from a target population
• Generally obtained through official records of individual units, surveys, or questionnaires (a data collection instrument that contains a series of questions designed for a specific purpose)
• For example, the household income, consumption, and employment surveys conducted by the Turkish Statistical Institute (TUIK/TURKSTAT)
31. Pooled data
Combines elements of both time series and cross-sectional data
Consists of cross-sectional data sets that are observed in different time periods and combined together
At each time period (e.g., year) a different random sample is chosen from the population
Individual units are not the same
For example, if we choose a random sample of 400 firms in 2002, choose another sample in 2010, and combine these cross-sectional data sets, we obtain a pooled cross-section data set.
Cross-sectional observations are pooled together over time.
33. Panel Data (longitudinal data)
• Also known as micropanel data
• A special type of pooled data in which the same cross-sectional unit is surveyed over time
• Consists of a time series for each cross-sectional member in the data set
• The same cross-sectional units (firms, households, etc.) are followed over time
• For example: wage, education, and employment history for a set of individuals followed over a ten-year period
• Another example: a cross-country data set for a 20-year period containing life expectancy, income inequality, real GDP per capita, and other country characteristics
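The difference between these data types can be sketched as data structures. The numbers and unit names below are hypothetical, chosen only to show the shape of each data set:

```python
# Time series: ONE unit observed over MANY periods.
time_series = {2001: 2.1, 2002: 2.4, 2003: 2.2}        # e.g. inflation by year

# Cross-section: MANY units observed at ONE point in time.
cross_section = {"firm_A": 120, "firm_B": 95, "firm_C": 210}  # sales in 2002

# Panel: the SAME units followed over several periods,
# so each observation is keyed by (unit, period).
panel = {
    ("firm_A", 2002): 120, ("firm_A", 2003): 130,
    ("firm_B", 2002): 95,  ("firm_B", 2003): 101,
}

# Unlike pooled cross-sections, panel data lets us track a given
# unit through time:
firm_a_path = {year: v for (firm, year), v in panel.items() if firm == "firm_A"}
print(firm_a_path)   # {2002: 120, 2003: 130}
```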
36. Significance of the error term
• Vagueness of theory
• Unavailability of data
• Randomness in human behavior
• Poor proxy variables
• Principle of parsimony (keeping the model as simple as possible)
• Wrong functional form