Delphi Berkeley 2016

Delphi Berkeley 2016 final presentation

  • On left hand side --
  • Kanu said we can just say this… we can keep it for emphasis, but we should be cutting at this point
  • SNIPE: DEEP DIVE INTO THE PATENT PROCESS
  • Two main ideas: value to the engineer and value to the IP committee member.
    Value to the engineer progressed from dynamic scraping (documentation issues), to a docketing system (we would get the info, but engineers wouldn’t adopt it), to integrating into their existing systems.
    Value to the IP committee member progressed from a simple patentability score (didn’t really take into account all aspects of patentability), to all aspects of patentability (which we could not compute), to what we could actually compute.
  • We refined our MVP, and initially we got feedback like: how do you come up with this score? How are you doing this?
  • So we started out by putting in only what we knew for sure we could do at the moment. What you see here is believable because we explain how we calculated the score and named it the “patent similarity score” (a rough, illustrative sketch of how such a score could be computed appears after these notes).
  • After this feedback, we refocused on what was VALUABLE to our customers instead of what was BELIEVABLE
  • We made a switch in storytelling from the product back to the company.
    Need to figure out how to bring the audience to realize this big shift.
  • The pain of patent discovery falls on the entire patent committee (CTO, GC, VP of IP, Paralegals, Patent agents, etc.), not just VP of IP.

    Small companies lack the bureaucracy needed for this to be seen as a problem (close integration of patent agents and engineers)

    Friction between stage gates leads to problems at big companies to the point where many companies build internal infrastructure for filing patents.

    But value prop changes based on whether they file patents ad-hoc (friction reduction), or have a standardized process (safety net)


  • We had features like this that even highlighted exactly what our algorithm picked up, to shed light on how we got to our results. After doing this, our customers started losing excitement, and we were told we were thinking too much like engineers.
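The notes above mention a “patent similarity score” without saying how it was computed. Purely as an illustrative sketch, assuming a plain TF-IDF text-similarity approach (an assumption on my part; the deck does not describe Delphi's actual algorithm), here is one way such a score could be produced. The tiny corpus, the function name, and the 0-100 scaling are all made up for the example.

```python
# Minimal sketch (assumed approach, not Delphi's actual method): score a draft
# invention disclosure against prior-patent abstracts with TF-IDF cosine
# similarity. The corpus and the 0-100 scaling are purely illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

prior_patent_abstracts = [
    "A method for scraping engineering documentation to surface inventions.",
    "A docketing system for tracking invention disclosure forms.",
    "An apparatus for ranking prior art references by textual relevance.",
]

def patent_similarity_score(disclosure_text, corpus):
    """Return a 0-100 score; higher means closer to existing prior art."""
    vectorizer = TfidfVectorizer(stop_words="english")
    corpus_matrix = vectorizer.fit_transform(corpus)          # prior patents
    disclosure_vec = vectorizer.transform([disclosure_text])  # new disclosure
    similarity = cosine_similarity(disclosure_vec, corpus_matrix).max()
    return round(float(similarity) * 100, 1)

print(patent_similarity_score(
    "A system that extracts patentable ideas from engineers' design documents.",
    prior_patent_abstracts,
))
```

A score framed this way is explainable to an IP committee, since the matching terms can be shown, which is the “believability” point the notes are making.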
  • Delphi Berkeley 2016

    1. Harvesting Patentable Innovations: 100 Interviews Later (and a few tears). Week 0 - Original Idea: helping patent prosecutors write better patents by providing a window into the development cycle. Week 10 - Current Idea: helping in-house IP committees harvest and evaluate patentable opportunities within an engineer's documentation.
    2. Harvesting Patentable Innovations: 100 Interviews Later. Week 0 TAM/SAM: TAM $15.48B globally, $5.95B domestically; SAM $3.7B globally, $1.42B domestically. Week 10 TAM/SAM: TAM ~$11B globally, ~$1.32B domestically; SAM $2.2B globally, ~$270M domestically.
    3. Matthew Ko Akilesh Bapu Prashanth Ganesh Giacomo Zacchia Business App Development Machine Learning Research Design Strategy/ Finance Front End Development Back End Development Strategy/ Project Management Harvesting Patentable Innovations Team
    4. Harvesting Patentable Innovations Team: Ewald Detjens, Kanu Gulati (Advisor, Mentor). Exemplar Logic.
    5. Act 1: We Think, Therefore We’re Right
    6. Harvesting Patentable Innovations Original Idea Taher Savliwala, Director of IP at Quixey: Delphi would serve to help integrate patent prosecutors into the development cycle by allowing them to view the different iterations of the technology and detailed analyses on these iterations. We took ONE piece of validation and took off running
    7. Original Idea
    8. Harvesting Patentable Innovations: Original Idea. Iteration Control Platform: unprecedented window into the development process; downstream patent prosecution & innovation identification; broader scope of innovation; stronger IP position and ability to monetize intangible assets.
    9. Harvesting Patentable Innovations FOCUS ON LEARNING, NOT BUILDING Key Learnings ASSUMPTIONS ≠ REALITY
    10. Harvesting Patentable Innovations: Customer Discovery. Learned the entire patent process; distilled actionable insights. “We rely on our own- we have lots of experience” - Chinh H. Pham, Esq. “Only certain kinds of software patents can go through after Alice” - Darryl Smith, VMware. Customers need to need your product.
    11. Harvesting Patentable Innovations: Patent Process [flowchart spanning two phases, Discovery and Filing]. Nodes: inventor works on invention; idea is realized to be novel in harvest session; inventor realizes novelty; discouraged inventor fails to patent; no one realizes patent potential; engineer files patent disclosure form; Patent Committee evaluates; PC does not file unworthy patent; PC files worthy patent; PC fails to file worthy patent; PC files unworthy patent; PC misses strategic patent opportunity; discouraged inventor now dislikes filing; PC brings in outside counsel; OC communicates with inventor; OC writes overly broad patent; OC writes good patent; OC writes overly specific patent.
    12. Harvesting Patentable Innovations: Target Customers (Companies / IP Committees). Outhouse Counsel; Hardware Engineers; Patent Committee (VPs of IP, Patent Engineers, etc.); Product Managers; Inventors (Engineers, Researchers); Startups/Independent Inventors; companies filing 100+ patents.
    13. Harvesting Patentable Innovations: Week 3 - Target Customers (87). “We get a lot of invention disclosures, and it takes time to go through each one and see if we want to pursue it, or justify why we are not” - Darryl Smith. Primary concerns: Discovery, Objective Criteria, Tracking.
    14. Act 2, Part 1: We Hypothesize, Therefore We Test
    15. Harvesting Patentable Innovations: MVP v1
    16. Harvesting Patentable Innovations: What We Learned - Integration. Dynamically scrape documentation to extract opportunities; analyze submitted Invention Disclosures and automatically generate an ID; build a docketing system for invention disclosures. We had initially prioritized seamlessness for the engineer at the expense of quality for the IP committee.
    17. Harvesting Patentable Innovations: What We Learned. Novelty, Detectability, Non-Obviousness, Value to Business; Novelty, Detectability, Non-obviousness; Novelty, patent similarity; Patentability. We tried to test many different value propositions, instead of testing them one at a time.
    18. Act 2, Part 2: Chasing Down The Signal
    19. Nailed Product Market Fit
    20. This Was a Partial Victory; There Was Still a Long Way to Go
    21. Harvesting Patentable Innovations: Customer Relationships - GET. Funnel: Get to Website → Sign-up to see Demo → Set up Call → Trial Run → Individual (Limited) Subscription → Enterprise Subscription. Channels (Primary: Direct Sale; Supplementary: Web): Direct, $6 CPC, 50% of IP committee members acquired through Direct; Earned, $0 CPC, 50% acquired through Earned; sales time costed at a $100k salary (~$48/hr).
       Get to Website: look at product features and case studies to gauge credibility. P/F: IP committee members view case studies as signs of credibility (Ex: 80% of IP committee members on the website will click case studies before). P/F: IP committee members view social proof as a sign of credibility (Ex: 90% of IP committee members will click social proof).
       Sign-up to see Demo: sign up for a demo video of our product as well as sample reports from case studies. P/F: IP committee members are willing to exchange their email for a demo video and a sample results page (Ex: 60% of IP committee members who click social proof or case studies will give us their email for a demo video and sample report of our case study).
       Set up Call / Trial Run: agree to a phone call and a personalized demo to test their own documents through Delphi. P/F: users who sign up for a demo will agree to a F2F (Ex: 80% of IP committee members agree to a phone call). P/F: users who requested demos will want to test their own documents through Delphi (Ex: 50% of IP committee members want to test their own documents through our system).
       Individual (Limited) Subscription: pay for a limited monthly subscription to use Delphi. We believe that 50% of IP committee members will pay $499 for a limited monthly subscription.
       Enterprise Subscription: pay for a fully customized Enterprise Subscription for the entire IP committee. 3% of IP committee members want a fully customized enterprise subscription for their entire team.
    22. Harvesting Patentable Innovations: Customer Relationships - Keep & Grow.
       Keep: 1. Interaction (a. customer check-in emails; b. customer service line; c. customer feedback channels). 2. Product improvements (a. software updates, e.g. up-to-date patent corpus; b. improved algorithms; c. added functionalities). Keep metrics: changes in # of searches; changes in # of prior-art requests; open rates for check-in emails; service interaction count (before and after added functionalities).
       Grow: Cross-sell: allow users reaching subscription limits to purchase extractions, searches, or prior art separately. Up-sell: allow users reaching subscription limits to purchase an unlimited monthly subscription. Next-sell: allow users to purchase year-long or multi-year unlimited subscriptions at a discounted rate. Referral Rewards Program: users whose referrals amount to a subscription will be awarded extraction/search/prior-art credit. Grow metrics: track what type of award the customer chooses (change extraction/search/prior-art amounts to gauge utility to users); track % of subscriptions coming from referrals; % of users reaching subscription limits but without enough need to purchase unlimited, and the type of users needing beyond the subscription limit; % of users reaching subscription limits with enough need to purchase unlimited, and the type of users; % of users wanting extended contracts and the type of users wanting unlimited.
    23. Harvesting Patentable Innovations: Customer Relationships - GET. Experiment result: 31% of engagements wanted a personalized trial. Funnel: Get to Website → Sign up to see Demo Video → Show Demo → Trial Run → Individual (Limited) Subscription → Enterprise Subscription. Channels: Direct, $6 CPC, 50% of customers acquired through Direct; Earned, $0 CPC, 50% acquired through Earned; salary ($100k, ~$48/hr).
       Direct sale (interviews as proxy): $48/hour salary with a 45-minute (3/4 hour) engagement = $48 * 0.75 = $36/engagement; with 31% of engagements converted to trials, $36/0.31 = $116.3/converted engagement; with 50% willing to purchase a limited subscription, $116.3/0.5 = $232.6 per individual subscription sale; with 3% conversion to an enterprise sale, $232.6/0.03 = $7,753/enterprise sale.
       Web: average CPC (Direct + Earned) = ($6 * 50%) + ($0 * 50%) = $3; loss rate 50%, demo rate 50%, so cost per web engagement = $3/0.5/0.5 = $12; $12 premium for a web-engagement lead + 45-minute engagement = $12 + ($48 * 0.75) = $48/engagement; with 31% of engagements converted, $48/0.31 = $155/converted engagement; with 50% willing to purchase a limited subscription, $155/0.5 = $310 per individual subscription sale; with 3% conversion to an enterprise sale, $310/0.03 ≈ $10,335/enterprise sale. (This arithmetic is replayed in the short sketch at the end of the transcript.)
    24. Harvesting Patentable Innovations Customer Relationships Can our LTV > CAC?
    25. Harvesting Patentable Innovations Revenue Streams - Pricing Model
    26. Harvesting Patentable Innovations: Revenue Streams - Pricing Model. Values: Increase Revenue, Decrease Cost, Increase Compliance. KPIs: Patentability Universe, Patentability Allowance Ratio, Infringement Litigation Rate. Pricing Model is Driven by VALUE.
    27. Act 3: Gathering the Team, Making the Plan
    28. Harvesting Patentable Innovations: Critical Activities and Partners (Coin-Operated vs. Non-Coin-Operated). Partners: University Tech Transfer Offices, First Adopter -> POC, Academic Journal Databases, USPTO, Amazon AWS, IP Blogs & Newsletters. What Delphi gives / gets: Delphi as a free service; validation in the form of a POC (DATA / Better Models); increased sales from funneling customers through prior-art results; access to non-patent literature (DATA / Better Models); a better practice to implement (they already release best practices); validation in the form of best practices (DATA / Better Models); cash; servers; cash or content; publicity/validation/social proof.
    29. Harvesting Patentable Innovations: Operating and Funding Timeline, 2017-2020 by quarter [timeline chart]. Operating milestones: Paid POC - Uni; USPTO Pilot; Patent Search + full research corpus; Eval. Beta Launch; Eval Launch; Begin Mining for Integration; Integration Beta; Integration Launch; Begin Mining for Extraction; Extraction Beta; Extraction Launch; Validation + Data; Commercial KPIs; POC; SELL SELL SELL. Industry milestones: industry we are in: Electrical / Software; move into Consumer Product/Electronics; move into Mechanical & Information Services. Fundraising: $1m, $1.5m, $2m, $3m, $4m, $5m.
    30. Harvesting Patentable Innovations: Investment Readiness.
       Worst case: model produces prior art at a lower level than a professional search firm; predicts patentability as determined by the USPTO at least 90% of the time; a very cheap platform for start-ups to generate prior art. Investment ready: one more step to de-risk before we scale.
       Best case: model produces prior art at the same level as, if not better than, a manual search; predicts patentability as determined by the USPTO at least 90% of the time; a standardized metric across the patent industry. Investment ready: use funding to get NPL, scale to get 5 POCs.
    31. Harvesting Patentable Innovations Appendix Any Questions?
    32. Week 1.
       Went in thinking: our customers were outhouse counsel who wanted a better way to communicate with in-house engineers for patent prosecution; outhouse counsel write lower-quality patents because they do not have the same domain experience as in-house counsel; lawyers would be open to talking to us; we knew what lawyers' pain points were.
       Found out: we actually didn't know anything about the patent process. We misunderstood the dynamics of the patent lawyer's workflow, the relationship between in-house and outhouse counsel, and the pain points of the patent process (we overgeneralized the pain point; "communication" is too broad). We made the assumption that outsourced lawyers write lower-quality patents because they do not have the domain expertise to reverse-engineer sophisticated inventions and identify innovation in that space. LAWYERS WERE HORRIBLE CUSTOMERS.
       Moving forward: STOP BUILDING AND LEARN ABOUT OUR CUSTOMERS (or really figure out who our customers were); figure out the intricacies of our value proposition.
    33. Week 2.
       Went in thinking: our value proposition was to help engineers write better patents by increasing transparency and communication. Customer discovery: use our interviews to understand our customers and their pain points instead of asserting assumptions about their work and sounding stupid. Signals from last week: explore in-house counsel as our target customer; focus on prior art instead of communication; explore patent litigation as well as prosecution (firms are more desperate for prior art in litigation); understand patent processes within different industries.
       Found out: don't sell to outhouse counsel. Communication was not a huge issue: they only made a few interactions, and they actually wanted it that way because of a trade-off between willful infringement and fewer interactions. There is friction through stage gates for in-house counsel, and a lot of effort goes into the initial invention disclosure form. We learned a lot this week (the patent prosecution cycle); with this higher understanding we were ready to make hypotheses. Make value propositions that can be disproved; don't disprove a hypothesis after one interview.
       Other notes: we were still focusing a bit too much on lawyers... we kept asking mainly technological questions... started to get an inkling that this would be useful for in-house counsel (Steven Horowitz); we started nailing down the steps of the invention disclosure process; we added additional customer segments; we were still shooting in the dark as far as the specific person within the company; we kept interviewing people involved in litigation; moving forward, we started conceptualizing the problems in terms of stage gates and figuring out where to focus; we started being able to build better mental models; started to realize that the disconnect between patent prosecutors and engineers was rarely significant; made value props that couldn't be disproved (Weinstein); started developing multiple strands of potential value props; we still were not focusing in (we start focusing in when we make our first real demo); communication gets shit on (Chinh H. Pham); got an inkling that startups and individual inventors may be a customer; we were making bold statements after one interview; started realizing that we needed to understand the nuances of customers in terms of industry segments.
    34. Week 3.
       Went in thinking: made hypotheses about the patent process (engineer recognition -> invention disclosure -> patent committee evaluation -> outhouse counsel -> outhouse/engineer interactions -> patent filing (provisional/non-provisional) -> patent); made hypotheses about current solutions for how patents are harvested (ID forms / harvesting sessions); made hypotheses about large companies having a CAPACITY issue and patent counsels missing opportunities; made hypotheses about incentive systems and how well they work; start-ups may be a good customer; quality of documentation is good enough to act as input for our model.
       Found out: make value propositions that can be disproved. The hypothesis about the patent process was confirmed; current solutions were confirmed. Large companies have a capacity issue (lack of depth) and have the wallet to not do as much due diligence on their patents. Small companies do not really have a large communication issue between patent agents and engineers since the firm is smaller; small companies spend $30-45k per patent and around 20 hours going back and forth: A LOT MORE TIME, AND THEY ARE A LOT MORE TACTICAL WITH PATENTS.
       Moving forward: decided we wanted to pursue in-house counsel at large corporations (fix the capacity issue). Started to really see that startups were different from big companies: there is a capacity issue (we saw capacity as not missing opportunities... we were still conceptualizing the value prop in terms of value to the business, not specific people within the patent committee) versus a finding-very-important-patents issue. We started to conceptualize theories of searching further... confirmed Steven Horowitz's findings... the spectrum from never search to always search. We started to really confirm that communication is not that big of an issue. We started to hear a bit about competitors: LECORPIO. Darren Cooke: we really started to realize the capacity distribution; he really liked the idea, but then was confused when we assumed he handled more than 6 applications a month. We started learning about the different engineer archetypes: some people love to patent, some people hate to patent. Further honed in on the changing incentive systems; started seeing other drivers of patenting: name recognition, internal company politics, and pride (the human aspect starts to come into play). Started to focus on the criteria for patenting as far as developing a score. We heard of another potential competitor: INNOGRAPHY. We started to explore the different types of documentation we are going to look at and plug in.
    35. Week 4.
       Went in thinking: made a strict interview doc; made a hypothesis that confined our customer segments to hardware because of patent frequency and documentation quality; started defining which qualities were important to deciding whether a patentable opportunity is pursuable (value to business, novelty score, detectability); after hearing about Lecorpio and Innography last week, we hypothesized that patent committees used some sort of enterprise software; made hypotheses around security (cloud-based vs. behind a firewall); made a hypothesis that 25% of the engineer's incentive could go to Delphi for discovering patentable opportunities.
       What we found: smaller start-ups bootstrap as much of the patent legwork as they can because it's a large investment. Startups and early-stage companies sometimes prefer trade secrets. Engineer documentation is not organized in a way that is conducive to extracting patentability. Patent approval rates for engineers go up over time as they get better at writing patents and evaluating what to patent. We were wrong in assuming that startups make breakthroughs all the time and constantly need to be on the watch to patent something: they usually have one product, so they combine as much as possible into one patent to save costs. Filing provisional patents faster could be helpful. Large firms use patent-troll insurance companies to fill the holes in their patent coverage; however, small companies may not have the luxury to buy into patent insurance portfolios. There isn't as much of a push for IP efficiency, or any worry in general about having the best IP, as we thought there was, even at the most active patent filers like IBM and Apple.
    36. Week 5.
       Went in thinking: defined our Get, Keep and Grow funnels as well as our CAC to model our business plan. Tested the hypothesis that customers (both engineers and IP committee members) would be willing to send us their personal documentation to run through Delphi, to see the resulting novelty, detectability, and business value scores. Tested the hypothesis that customers are interested in using Delphi to process unstructured technical documentation and extract patentable opportunities. Started to narrow down which of the score qualities we hypothesized could be technologically feasible. Began exploring how we could develop trust in our algorithm with the use of these scores.
       What we found (customer relationships): customers find all of the different metrics we were envisioning to be valuable, but only some to be technologically feasible. Comfort with the automatic extraction varies among our customer segments, with committee members being most accepting and engineers being least.
    37. Week 6.
       Went in thinking: industry segmentation - we thought we would just go for microelectronics, computer architecture and semiconductors. Showed the demonstration of our MVP to our interviewees. Tested pricing for our services, including invention disclosure generation, prior art search, and seat subscription.
       What we found (revenue streams): we found it was better to segment by industry... we focused on two groups to begin with, and we're going to expand later. We took off Biotech, but then were told to keep Biotech on the map → they might pay a premium since it is their lifeblood.
    38. For Example: Customers should not have been fragmented by age Harvesting Patentable Innovations Original Idea
    39. Week 7.
       Went in thinking: thinking about pilot programs, KPIs, and backtesting our machine.
       What we found: key activities are verbs. Started finding out more about the politics of universities. Started to realize that we should organize by project, not individual person. Convincing people to switch their operations, and onboarding, is a huge activity. We NEEDED TO REALIZE THAT activities around convincing people of the brand were key... but we didn't find this until after key resources. Enterprise sales were a key activity.
    40. Week 8.
       Went in thinking: universities will be primary customers.
       What we found: key resources are nouns; resources are partners. Universities are not customers. BRAND is a key resource.
       Next week: → needed to look at other companies that have made a brand off a number (FAIR ISAAC). We need to sell the vision, not the most palatable version.
    41. Week 9.
       Went in thinking: we thought any legitimate university would be a good partner; we thought costs were largely a numbers game of maximizing runway and staying afloat.
       What we found: costs - funding is to get to key milestones, not to sustain employees. Universities we partner with must be similar to our actual customers. Math vs. judgement → not everything is entirely math, e.g. housing... Be near customers. We want to have POC funders help us build a product THEY would buy.
    42. Customer Segments: Independent Researchers/Inventors; Hardware Engineers/Inventors at BioTech, Micro Electronics, Digital Signal Processing and Computer Architecture companies (over 100 patent filings a year); General Counsels and Vice Presidents of IP at BioTech, Micro Electronics, Digital Signal Processing, and Computer Architecture companies (over 100 patents per year); Principal Investigators & Business Directors at University Tech Transfer Offices (over 100 patent filings a year), to be further segmented; Academic and Industry Research Journals in patent-heavy fields, to be further segmented. Business Model Canvas: With Customer Segments Changed Everything...
    43. Harvesting Patentable Innovations: Original Idea. Topic Identification; Understand Derived Innovations; See Litigation History; See Similar Patents; News & Trends; Gauge Drafted Patent Strength.
    44. Pre-MVP Significant Parts of Business Model Canvas
    45. Harvesting Patentable Innovations Week 3 - Target Customers Understanding Pain Points of Harvesting Patents as opposed to simply Prosecuting Patents “The engineer can file an ID, sometimes the manager can push for one..”- Steven Horowitz “We use Anaqua- it works pretty well for us”- Kevin Brown “It is a small company...so they (engineers) come to us when they think they have a new idea”- Krishna Mehta Delete… Kanu didn’t understand why it was here, and neither did I to be honest
    46. Harvesting Patentable Innovations: Week 3 - Target Customers. 3 Concerns of Patent Committees: the ability to discover patent opportunities without the influence of the engineer's opinion or the time lag of harvest sessions; the ability to sort patent disclosure forms by relevant criteria (novelty, value to business, detectability); the ability to have objective criteria for evaluating invention disclosures. “We get a lot of invention disclosures, and it takes time to go through each one and see if we want to pursue it, or justify why we are not” - Darryl Smith. Too specific… how you said it… after customers and who they were… VERY HIGH LEVEL… there were three things to test: discover opportunities, number, track currently.
    47. Harvesting Patentable Innovations Lawyers are TERRIBLE customers “We rely on our own- we have lots of experience”- Chinh H. Pham, Esq Learned the entire patent process Key Players in the Patent Prosecution process and their main functions Patents behave differently in different industries “Only certain kinds of software patents can go through after Alice”- Darryl Smith, VMware Patent Processes can be different in different companies depending on size, strategy, etc. “We built an Invention Disclosure Filing system at Yahoo”- Steven Horowitz, Ovidian Customer Discovery
    48. You might just want to speak to it -- doesn’t add
    49. Post MVP v1: ● Allowed us to add startups to customer segments ● Allowed us to form hypotheses about customer relationships ● Allowed us to narrow down value propositions based on what stood out to customers in our MVP Don’t need it - just SAY IT -- delete
    50. Harvesting Patentable Innovations: Patent Process (Discovery, Filing). Focus on Discovery.
    51. “Documentation...is distributed among many formats. It contains all kinds of extraneous information that don’t matter for patents” - Tanj Bennet. “We have a great filing system we have been developing for years...we are not interested in switching” - Luis Mejia. “Lots of things determine patentability...novelty, detectability...value to business is important..” - Luv Kothari. “If we don’t know our value to business, we shouldn’t be doing this” - Darryl Smith. “Non-obviousness is very difficult to detect, I don’t think a computer could do it” - Dov Rosenfeld.
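The customer-acquisition arithmetic on slide 23 (“Customer Relationships - GET”) is easier to check in code. The sketch below simply replays the numbers printed on that slide ($6/$0 CPC split 50/50, 50% loss and demo rates, a ~$48/hr salary with a 45-minute engagement, then 31% → 50% → 3% conversion); the function and variable names are mine, and the results differ from the slide's figures only by rounding.

```python
# Replays the CAC funnel math from slide 23; all rates and costs are taken
# from the slide, only the function/variable names are invented here.

HOURLY_SALARY = 48.0     # ~$100k/yr  ->  ~$48/hr
ENGAGEMENT_HOURS = 0.75  # 45-minute personalized demo

def web_lead_premium(direct_cpc=6.0, earned_cpc=0.0, loss_rate=0.5, demo_rate=0.5):
    """Blended click cost marked up for web drop-off before a demo: $3 -> $12."""
    blended_cpc = 0.5 * direct_cpc + 0.5 * earned_cpc
    return blended_cpc / loss_rate / demo_rate

def cost_per_enterprise_sale(lead_premium, trial_rate=0.31,
                             subscription_rate=0.50, enterprise_rate=0.03):
    engagement = lead_premium + HOURLY_SALARY * ENGAGEMENT_HOURS
    trial = engagement / trial_rate            # cost per converted trial
    subscription = trial / subscription_rate   # cost per individual subscription
    return subscription / enterprise_rate      # cost per enterprise sale

# Direct-sale path (interviews as proxy): $36 -> ~$116 -> ~$232 -> ~$7,700
print(round(cost_per_enterprise_sale(lead_premium=0.0)))
# Web path: $12 lead premium -> $48 -> ~$155 -> ~$310 -> ~$10,300
print(round(cost_per_enterprise_sale(lead_premium=web_lead_premium())))

# LTV check (slide 24 asks "Can our LTV > CAC?"): with the slide-21 price of
# $499/month for a limited subscription and an ASSUMED average customer
# lifetime in months (not stated in the deck), LTV ~= 499 * lifetime_months
# would need to exceed the CAC figures printed above.
```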
