II-SDV 2014 Patent Valuation: Building the tools to extract and unveil intelligence and value from patent data (Laurent Hill and Renaud Garat – Questel, France)
1. PATENT EVALUATION
Building the tools to extract and unveil
intelligence and value from patent data
April 2014 - Nice
Laurent HILL - Renaud GARAT
Copyright 2014
2. PATENT EVALUATION
Evaluation is a systematic determination of a subject's
merit, worth and significance, using criteria governed
by a set of standards (…) to help in decision-making;
(…) any particular definition of evaluation would have
to be tailored to its context
3. Patent valuation vs. Patent evaluation
• Valuation: assess economic worth
– How much?
• Evaluation: systematic assessment of worth to
assist in decision-making in a given context
– Should I?
• The first part of this presentation contrasts the
methodology and intended use of the two approaches
4. Contrasting uses
• Global valuation model: how much?
– Provides a single number per patent, representing its
"economic worth"
– May be aggregated over any large corpus of patents
A single value for a large portfolio
• Goal-driven evaluation: should I? (and why?)
– Answers a specific question
– Provides a rationale for the decision-making process
Quickly reach high-quality, documented decisions on
specific IP issues
5. Examples: you choose the right approach
• How much has the value of US companies'
portfolios grown in 2013?
• Are HP's patents worth more than Apple's?
• Are my 10 nanotech patents worth licensing
out? To whom?
• Which patents should I stop renewing this year?
6. Global valuation models: how do they work?
• Three steps
1. Estimate value V for historical patents
• Based on renewal behaviour
2. Compute predictor metrics P for the same corpus
• Citations, family size, claim length, …
3. Build the model
• Use machine-learning techniques to estimate V = f(P)
• V can now be estimated for any patent for
which we can compute the metrics
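The three steps above can be sketched as follows. This is only an illustrative toy, not Questel's actual model: the historical "values" stand in for renewal-based estimates, a single predictor (citation count) stands in for the full metric set, and the model is a plain least-squares line.

```python
# Toy sketch of the three-step global valuation model.
# All numbers are made up for illustration.

def fit_linear(ps, vs):
    """Ordinary least squares for V = a*P + b (step 3, simplest case)."""
    n = len(ps)
    mean_p = sum(ps) / n
    mean_v = sum(vs) / n
    cov = sum((p - mean_p) * (v - mean_v) for p, v in zip(ps, vs))
    var = sum((p - mean_p) ** 2 for p in ps)
    a = cov / var
    b = mean_v - a * mean_p
    return a, b

# Step 1: values estimated for historical patents (stand-in for
# renewal-behaviour-based estimates).
historical_values = [10.0, 14.0, 22.0, 30.0]
# Step 2: one predictor metric (citation counts) for the same corpus.
citations = [1, 2, 4, 6]

# Step 3: build the model, then score any new patent whose metrics
# we can compute.
a, b = fit_linear(citations, historical_values)
new_patent_value = a * 5 + b
```

In practice the model uses many predictors and richer machine-learning techniques, but the shape is the same: learn V = f(P) on a historical corpus, then apply f to new patents.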
7. Goal driven evaluation
• Our Goal: make the best decision on an IP issue
– Can we license out some of our patents? To
whom?
– Is this patent cluster, offered to me for licensing, a
good match for our company?
– For which patents should we stop paying annuities?
– …
8. A process based on expert best practice
• To answer these questions, licensing professionals use
specific metrics to compare the patents at hand with
comparable patents
• There is a rough consensus in the industry on what
the main metrics are
• High-level, business-driven metrics: legal strength, originality,
predator presence, citation velocity…
• These metrics feed the decision process and are
understandable by decision stakeholders
• The metric list depends on the question being answered
9. Nothing is absolute
• The core of the evaluation process is comparing
metric values with the distribution of those metrics
over a comparable set
• Raw values have no intrinsic meaning for business
stakeholders
– Compare explaining "our average velocity is 3.2"
with "our patents get twice as many citations per
year as the rest of the industry"
• What matters is how the selected patents
compare to similar patents
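The benchmarking idea above can be made concrete with a small sketch: the raw number means little on its own, so we report it relative to the peer distribution. All figures here are made up, and "citation velocity" is simply citations per year.

```python
# Benchmarking one metric against a comparable set (illustrative).

def percentile_rank(value, comparables):
    """Fraction of comparable patents whose metric is below `value`."""
    return sum(1 for c in comparables if c < value) / len(comparables)

# Citation velocity (citations per year) across a comparable set.
peer_velocity = [0.5, 1.0, 1.2, 1.5, 1.6, 2.0, 2.2, 2.5]
our_velocity = 3.2

peer_median = sorted(peer_velocity)[len(peer_velocity) // 2]  # 1.6
rank = percentile_rank(our_velocity, peer_velocity)           # 1.0

# The stakeholder-friendly statement: "our patents get twice as many
# citations per year as the peer median."
ratio = our_velocity / peer_median
```

The absolute figure 3.2 only becomes meaningful through `ratio` and `rank`: twice the peer median, above every comparable patent.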
10. What is a comparable set?
• Building a good comparable set is key to this approach
• It may be manual, expert work:
– a combination of IPC classes, keywords, competitors…
– serious patent-information expertise, plus time
• Questel of course advocates the use of our similarity
algorithm, which allows instant comparable-set building
– Based on a proprietary semantic engine, plus all other available
information: classes, citations…
– Much faster: usually all that is needed is to decide where to set the
cutoff point
11. Goal driven evaluation process
Select patents → Build comparable set → Compute metrics
→ Assess patents against the comparable set
→ Make a fast, documented, high-quality decision
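The pipeline on this slide can be chained end to end as a minimal sketch. Every function body here is a hypothetical placeholder: the keyword-overlap similarity stands in for the proprietary semantic engine, and citation velocity stands in for the full metric set.

```python
# Toy end-to-end goal-driven evaluation pipeline (illustrative only).

def build_comparable_set(patents, universe, top_k=3):
    """Toy similarity: count of shared keywords (stand-in for a
    semantic engine plus classes and citations)."""
    def sim(a, b):
        return len(a["keywords"] & b["keywords"])
    scored = sorted(universe, key=lambda u: -max(sim(p, u) for p in patents))
    return scored[:top_k]

def compute_metric(patent):
    """One example metric: citation velocity (citations per year)."""
    return patent["citations"] / patent["age_years"]

def assess(patents, comparables):
    """Express each patent's metric relative to the peer median."""
    bench = sorted(compute_metric(c) for c in comparables)
    median = bench[len(bench) // 2]
    return {p["id"]: compute_metric(p) / median for p in patents}

# Step 1: select patents.
selected = [{"id": "P1", "keywords": {"nano", "coating"},
             "citations": 8, "age_years": 4}]
# Candidate universe for step 2.
universe = [
    {"id": "C1", "keywords": {"nano", "film"}, "citations": 3, "age_years": 3},
    {"id": "C2", "keywords": {"nano", "coating"}, "citations": 4, "age_years": 4},
    {"id": "C3", "keywords": {"engine"}, "citations": 10, "age_years": 2},
]

# Steps 2-4: build the comparable set, compute metrics, assess.
comparables = build_comparable_set(selected, universe)
verdict = assess(selected, comparables)
# Step 5: `verdict` now backs a fast, documented decision.
```

The output is deliberately relative ("P1 cites at twice the peer median"), matching the earlier point that raw metric values carry no intrinsic meaning for stakeholders.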
27. Conclusion
• Patents are key business-evaluation material
• Evaluation is driven by business needs
• Business needs define the metrics & methodology
• Evaluation is contextual
• Context is set by benchmarking
• Benchmarking relies on similarity algorithms
• Data quality & value-add are essential
• Metric adjustments lead to sharper conclusions