
TAR: Beginning to End


Have you ever wanted to use TAR (technology-assisted review) in a case, but hesitated to try it without first walking through the process in detail? Then this session is for you! We will walk through the TAR process from beginning to end so that you can create the project, run the reports, and interpret your results with confidence.



  1. Technology Assisted Review: From Beginning to End
  2. Session Overview – In this interactive session, we will walk you through project creation, training, validation, and report analysis.
  3. What do you need before you begin? (Private and Confidential – Copyright 2020) • Analytic Index • Subject Matter Expert • Pre-coded Documents (optional)
  4. TAR Project Creation
  5. Step 1: Create a TAR Project • TAR naming convention • Auto-creation of fields and tags needed • Training set size – what is this? • Validation round standard • Certification set standard
  6. Step 2: Start Training • Do you have any pre-coded documents? • Create your training round batch • Review your training documents • Run the TAR Project Summary Report
  7. TAR Project Summary Report – field definitions:
     • Documents in Categorization Set – Total number of documents in the Analytics Index selected for the project.
     • Training Document Set – Number of training documents, as determined by Eclipse analysis after the training review pass is created (part of TAR project creation).
     • Training Set - Responsive – Number of training documents tagged as Responsive by a (human) reviewer.
     • Training Set - Non-Responsive – Number of training documents tagged as Non-Responsive by a reviewer.
     • Training Set - Responsive Percentage – The Training Set - Responsive value divided by the Training Document Set value (calculated after the training review pass is completed).
     • Projected - Responsive – Estimated number of Responsive documents in the entire TAR project, based on Eclipse analysis of the training set results (calculated after the training review pass is completed).
     • Projected - Non-Responsive – Estimated number of Non-Responsive documents in the entire TAR project, based on Eclipse analysis of the training set results (calculated after the training review pass is completed).
     • Current Validation Round - Precision – Precision from the last completed validation round for the selected TAR project.
     • Current Validation Round - Recall – Recall from the last completed validation round for the selected TAR project.
     • Current Validation Round - F-Measure – F-measure from the last completed validation round for the selected TAR project.
     • Current Estimated Responsive Counts – Estimated number of Responsive documents in the entire TAR project, based on the latest categorization of the project.
     • Current Estimated Non-Responsive Counts – Estimated number of Non-Responsive documents in the entire TAR project, based on the latest categorization of the project.
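The projected counts in the summary report follow directly from the training-set tallies. A minimal sketch of that arithmetic, assuming a simple extrapolation of the training set's responsive percentage to the full categorization set (Eclipse's actual projection method may be more sophisticated; the function and field names here are illustrative):

```python
def project_responsive(total_docs, training_responsive, training_non_responsive):
    """Estimate responsive counts for the full categorization set by
    extrapolating the training set's responsive percentage.
    NOTE: a hypothetical sketch of the report arithmetic, not Eclipse's API."""
    training_total = training_responsive + training_non_responsive
    responsive_pct = training_responsive / training_total
    projected_responsive = round(total_docs * responsive_pct)
    return {
        "Training Set - Responsive Percentage": round(responsive_pct * 100, 1),
        "Projected - Responsive": projected_responsive,
        "Projected - Non-Responsive": total_docs - projected_responsive,
    }

# e.g. 100,000 docs in the categorization set; 2,000-doc training set, 300 responsive
print(project_responsive(100_000, 300, 1_700))
# → {'Training Set - Responsive Percentage': 15.0,
#    'Projected - Responsive': 15000, 'Projected - Non-Responsive': 85000}
```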
  8. Validating Your Results
  9. Step 3: Start Validating ❑ Create your validation round ❑ Run the Validation report ❑ Analyze whether additional review is needed ❑ REPEAT. Did you know? The average TAR project requires 7–8 validation rounds.
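The precision, recall, and F-measure figures in the validation report are standard classifier metrics. A minimal sketch of how they are computed from one validation round's tallies (the sample counts are illustrative, not from the deck):

```python
def validation_metrics(true_pos, false_pos, false_neg):
    """Standard definitions used in TAR validation reporting:
    precision = TP / (TP + FP)  -- of docs predicted Responsive, how many were
    recall    = TP / (TP + FN)  -- of truly Responsive docs, how many were found
    F-measure = harmonic mean of precision and recall."""
    precision = true_pos / (true_pos + false_pos)
    recall = true_pos / (true_pos + false_neg)
    f_measure = 2 * precision * recall / (precision + recall)
    return precision, recall, f_measure

# e.g. 80 predicted-Responsive docs confirmed Responsive by reviewers,
# 20 predicted-Responsive confirmed Non-Responsive, 40 Responsive docs missed
p, r, f = validation_metrics(80, 20, 40)
print(f"precision={p:.2f} recall={r:.2f} f-measure={f:.2f}")
# → precision=0.80 recall=0.67 f-measure=0.73
```

Low recall suggests the model is still missing responsive documents and another training/validation round is warranted; low precision means too many non-responsive documents are being swept in.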
  10. To Produce or Not to Produce?
  11. Step 4: Option #1 – Produce ❑ Run certification round ❑ Create certification report ❑ Review "Prediction_Undetermined" ❑ Gather "Predicted_Relevant" documents ❑ Produce. Definitions: Predicted_Relevant – documents identified as relevant to the issues/needs of the case. Predicted_Non_Relevant – documents identified as not relevant to the issues/needs of the case. Prediction_Undetermined – documents that could not be identified as relevant or non-relevant because the document was mostly binary data or contained little or no text.
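The three-way split above can be sketched as a simple partition. Everything in this sketch is an assumption for illustration – the score threshold, the minimum-word-count cutoff, and the input shape are hypothetical, not Eclipse's actual categorization logic:

```python
def bucket_documents(docs, threshold=0.5, min_words=10):
    """Partition categorized documents into the three prediction buckets.
    `docs` maps doc ID -> (extracted word count, relevance score or None).
    HYPOTHETICAL sketch: threshold and min_words are illustrative defaults."""
    buckets = {"Predicted_Relevant": [], "Predicted_Non_Relevant": [],
               "Prediction_Undetermined": []}
    for doc_id, (word_count, score) in docs.items():
        if score is None or word_count < min_words:
            # mostly binary data, or too little text to categorize
            buckets["Prediction_Undetermined"].append(doc_id)
        elif score >= threshold:
            buckets["Predicted_Relevant"].append(doc_id)
        else:
            buckets["Predicted_Non_Relevant"].append(doc_id)
    return buckets

buckets = bucket_documents({"DOC-1": (520, 0.92),
                            "DOC-2": (310, 0.08),
                            "DOC-3": (3, None)})
# DOC-1 -> Predicted_Relevant, DOC-2 -> Predicted_Non_Relevant,
# DOC-3 -> Prediction_Undetermined (too little text, no score)
```

Only the Predicted_Relevant bucket goes forward to production; the Prediction_Undetermined bucket is what the checklist says to review by hand.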
  12. Step 4: Option #2 – Prioritized Review [Diagram: documents from the database are routed into Relevant, Not Relevant, and Unsure queues]
  13. Thank you! For additional information on Ipro for enterprise, go to iprotech.info/techshow20enterprise
