BL Demo Day - July 2011 - (9) IMPACT Interoperability and Evaluation Framework

Slides from Clemens Neudecker's presentation on the IMPACT Interoperability and Evaluation Framework, given within the IMPACT project at the British Library Demo Day on 12 July 2011.


1. IMPACT Interoperability and Evaluation Framework
   Clemens Neudecker, National Library of the Netherlands
   IMPACT Demo Day, British Library, 12 July 2011
2. OCR: A multitude of challenges…
   - I. OCR challenges (gothic fonts, bleed-through, warping, etc.)
3. OCR: A multitude of challenges…
   - II. Language challenges (spelling variants, inflection, and many more!)
   - Example: historical variants of the Dutch word ‘wereld’ (world): werelt, weerelt, wereld, weerelds, wereldt, werelden, weereld, werrelts, waerelds, weerlyt, wereldts, vveerelts, waereld, weerelden, waerelden, weerlt, werlt, werelds, sweerels, zwerlys, swarels, swerelts, werelts, swerrels, weirelts, tsweerelds, werret, vverelt, werlts, werrelt, worreld, werlden, wareld, weirelt, weireld, waerelt, werreld, werld, vvereld, weerelts, werlde, tswerels, werreldts, weereldt, wereldje, waereldje, weurlt, wald, weëled
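Variant lists like the one above are typically tackled with a lexicon that maps historical spellings to a modern form, so that search and post-correction can treat them as one word. The Java sketch below illustrates the idea with a tiny hand-written map and a lookup method; the entries and method names are illustrative assumptions, not the IMPACT lexicon tools themselves.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch: normalise historical spelling variants via a lexicon lookup.
// The entries below are illustrative; real lexica are far larger and built
// from corpus evidence rather than hand-written maps.
public class VariantLexicon {
    private final Map<String, String> variantToModern = new HashMap<String, String>();

    public VariantLexicon() {
        variantToModern.put("werelt", "wereld");
        variantToModern.put("weerelt", "wereld");
        variantToModern.put("vvereld", "wereld");
        variantToModern.put("waereld", "wereld");
    }

    /** Returns the modern form if the token is a known variant, else the token itself. */
    public String normalise(String token) {
        String modern = variantToModern.get(token.toLowerCase());
        return modern != null ? modern : token;
    }

    public static void main(String[] args) {
        VariantLexicon lexicon = new VariantLexicon();
        System.out.println(lexicon.normalise("Werelt"));    // -> wereld
        System.out.println(lexicon.normalise("amsterdam")); // -> amsterdam (unknown token)
    }
}
```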
4. And a multitude of solutions!
   - 22 different ‘tools’ from diverse developers:
   - OCR (C++, C#)
   - Image Processing & Lexica (DLL)
   - Command Line Tools (Win/Linux)
   - Java, Ruby, PHP, Perl, etc.
   - + 3rd party software!
   - “One ring to rule them all...”
   - → the IMPACT Interoperability Framework (IIF)
5. Main requirements
   Behavioural:
   - Minimize integration effort
   - Minimize deployment effort
   - Maximize usability
   - Maximize scalability
   Functional:
   - Modular
   - Transparent
   - Expandable
   - Open source
   - Platform independent
6. Architecture
   IMPACT Interoperability Framework: Technologies
   - Java 6
   - Generic Web Service Wrapper
   - Apache Ant/Maven
   - Apache Tomcat/httpd
   - Apache Axis2
   - Apache Synapse
   - Taverna Workflow Engine
   IMPACT Evaluation Framework: Dataset
   - approx. 5 TB raw data (images, text files, metadata) and growing
   - Ground truth transcriptions
   - Evaluation modules
7. Components I: IIF
   - An Enterprise Service Bus receives (SOAP) requests from users and distributes the load to the available worker nodes
   - Main effects: process parallelization, load distribution, fail-over
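As a rough illustration of the load distribution and fail-over that the Enterprise Service Bus provides, the following self-contained Java sketch dispatches requests round-robin over a set of worker nodes and moves on to the next node when one fails. The Worker interface and the example request are hypothetical stand-ins; in the IIF this is handled by Apache Synapse and Axis2 rather than hand-written code.

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch of round-robin dispatch with fail-over across worker nodes.
// Worker and the example request are illustrative only.
public class RoundRobinDispatcher {

    interface Worker {
        String process(String request) throws Exception; // may fail if the node is down
    }

    private final List<Worker> workers;
    private int next = 0;

    RoundRobinDispatcher(List<Worker> workers) {
        this.workers = workers;
    }

    /** Tries each worker in turn, starting from the next one in round-robin order. */
    synchronized String dispatch(String request) throws Exception {
        Exception lastFailure = null;
        for (int attempt = 0; attempt < workers.size(); attempt++) {
            Worker worker = workers.get(next);
            next = (next + 1) % workers.size();
            try {
                return worker.process(request);
            } catch (Exception e) {
                lastFailure = e; // node failed: fail over to the next one
            }
        }
        throw new Exception("All worker nodes failed", lastFailure);
    }

    public static void main(String[] args) throws Exception {
        Worker broken = new Worker() {
            public String process(String request) throws Exception {
                throw new Exception("node offline");
            }
        };
        Worker healthy = new Worker() {
            public String process(String request) {
                return "OCR result for " + request;
            }
        };
        RoundRobinDispatcher dispatcher =
                new RoundRobinDispatcher(Arrays.asList(broken, healthy));
        System.out.println(dispatcher.dispatch("page-001.tif")); // fails over to the healthy node
    }
}
```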
8. Framework integration
   - Easy-to-use generic command line wrapper (open source)
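The generic wrapper's job is essentially to turn an existing command-line tool into something the framework can call. A minimal Java sketch of that idea, starting the tool as an external process and capturing its output and exit code, is shown below; the tool name and its flags are placeholders, not the wrapper's real configuration format.

```java
import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

// Minimal sketch of wrapping a command-line tool as a callable Java method.
// "some-ocr-tool" and its flags are placeholders; the real IIF wrapper is
// driven by a configuration describing each tool's inputs and outputs.
public class CommandLineWrapper {

    /** Runs the tool on an input file and returns its exit code. */
    static int run(String tool, File input, File output) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                tool, "--input", input.getAbsolutePath(), "--output", output.getAbsolutePath());
        pb.redirectErrorStream(true); // merge stderr into stdout
        Process process = pb.start();

        // Drain the tool's output so the process cannot block on a full pipe.
        BufferedReader reader =
                new BufferedReader(new InputStreamReader(process.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println("[tool] " + line);
        }
        return process.waitFor();
    }

    public static void main(String[] args) throws Exception {
        int exitCode = run("some-ocr-tool", new File("page.tif"), new File("page.xml"));
        System.out.println("Exit code: " + exitCode);
    }
}
```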
9. Workflow development
   - OCR workflow = data pipeline
   - Building blocks = processing steps (nodes)
   - Integration = interaction between nodes (mashup)
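To make the "workflow = data pipeline" picture concrete, the sketch below models each processing step as a node that transforms a document and a workflow as an ordered list of such nodes. The Step interface and the binarise/OCR steps are illustrative assumptions; in IMPACT the composition is done graphically in the Taverna Workbench rather than in code.

```java
import java.util.Arrays;
import java.util.List;

// Minimal sketch of a workflow as a pipeline of processing steps (nodes).
// Step and the string "document" are illustrative stand-ins for Taverna
// processors and the image/XML data that flows between them.
public class WorkflowPipeline {

    interface Step {
        String apply(String document);
    }

    /** Runs the document through every step in order. */
    static String run(List<Step> pipeline, String document) {
        String current = document;
        for (Step step : pipeline) {
            current = step.apply(current);
        }
        return current;
    }

    public static void main(String[] args) {
        Step binarise = new Step() {
            public String apply(String doc) { return doc + " -> binarised"; }
        };
        Step ocr = new Step() {
            public String apply(String doc) { return doc + " -> OCR text"; }
        };
        System.out.println(run(Arrays.asList(binarise, ocr), "page-001.tif"));
        // prints: page-001.tif -> binarised -> OCR text
    }
}
```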
10. Workflow management
   - Web 2.0 style registry: myExperiment
   - Local client: Taverna Workbench
   - Web client: project website
   - API: SOAP/REST
11. Community
   - Web 2.0 style workflow registry
   - Community of experts
   - Sharing of resources
   - Knowledge exchange
   - A central meeting point for users and researchers
12. Components II: Dataset
   - Database and front end, hosted at the PRIMA research group, School of Computing, University of Salford, United Kingdom
   - more than 500,000 images from digital libraries
   - more than 50,000 ground truth representations
   - up to 10,000 direct access calls per month
   - 4 TB of space and growing
13. Dataset
   - Access to a representative and annotated dataset of significant size, with metadata, ground truth and search facilities
14. Evaluation features
   - Text-based comparison of result with ground truth, using the Levenshtein distance method
   - Layout-based comparison of result with ground truth, using the Page Analysis and Ground Truth Elements (PAGE) framework
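For the text-based comparison, the Levenshtein (edit) distance between OCR output and ground truth yields a character accuracy of the form 1 - distance / length of the ground truth. The sketch below is a generic dynamic-programming implementation of that measure, not the project's actual evaluation module, and the sample strings are invented.

```java
// Minimal sketch: Levenshtein distance between OCR output and ground truth,
// and a character accuracy derived from it. Illustrative only; the IMPACT
// evaluation tools add normalisation, alignment reports, etc.
public class OcrTextEvaluation {

    /** Classic dynamic-programming edit distance (insert/delete/substitute, cost 1 each). */
    static int levenshtein(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++) {
            for (int j = 1; j <= b.length(); j++) {
                int cost = (a.charAt(i - 1) == b.charAt(j - 1)) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1,      // deletion
                                            d[i][j - 1] + 1),     // insertion
                                   d[i - 1][j - 1] + cost);       // substitution
            }
        }
        return d[a.length()][b.length()];
    }

    /** Character accuracy = 1 - edit distance / ground truth length. */
    static double characterAccuracy(String groundTruth, String ocrResult) {
        if (groundTruth.length() == 0) return ocrResult.length() == 0 ? 1.0 : 0.0;
        return 1.0 - (double) levenshtein(groundTruth, ocrResult) / groundTruth.length();
    }

    public static void main(String[] args) {
        String groundTruth = "de weerelt is groot";
        String ocrResult = "de vveerelt is gro0t";
        System.out.println("Distance: " + levenshtein(groundTruth, ocrResult));
        System.out.println("Accuracy: " + characterAccuracy(groundTruth, ocrResult));
    }
}
```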
15. The PAGE Format Framework
   - Two-level architecture:
     - root structure
     - task-specific sub-formats
   - Separate XML Schema definitions
   - Format identification via namespaces
   - Mapping of dependencies, process chains and alternative processing steps
   - Linking via IDs
   Processing results or ground truth (e.g. binarisation, dewarping, page content)
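To illustrate namespace-based format identification and ID-based linking in practice, the sketch below parses a minimal PAGE-style document and reads its namespace URI and a region ID. Both the XML snippet and the namespace URI are simplified placeholders rather than the exact PAGE schema definitions.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Minimal sketch: identify a PAGE-style document by its namespace and read a
// region's id. The XML content and the namespace URI are simplified
// placeholders, not the exact PAGE schema.
public class PageFormatExample {
    public static void main(String[] args) throws Exception {
        String xml =
            "<PcGts xmlns='http://example.org/page-content'>" +
            "  <Page imageFilename='page-001.tif'>" +
            "    <TextRegion id='r1'><Unicode>Example text</Unicode></TextRegion>" +
            "  </Page>" +
            "</PcGts>";

        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        factory.setNamespaceAware(true); // needed so namespace URIs are exposed
        Document doc = factory.newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));

        Element root = doc.getDocumentElement();
        System.out.println("Format namespace: " + root.getNamespaceURI());

        // Find the first text region by its namespace-qualified name and read its id
        // (elements are cross-referenced via such ids).
        Element region = (Element) doc
                .getElementsByTagNameNS("http://example.org/page-content", "TextRegion").item(0);
        System.out.println("Region id: " + region.getAttribute("id"));
        System.out.println("Text: " + region.getTextContent().trim());
    }
}
```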
16. Ground-Truthing Tools
   - Aletheia
   - FineReader
   - PAGE Exporter
   - GT Validator
   - GT Normalizer
17. Profile ‘Full Text Recognition’
   - Evaluation profile for general text recognition
   - Measure weights for: Merge, Allowable Merge, Split, Allowable Split, Miss, Partial Miss, Misclassification, False Detection
   - Region type weights for: Text, Image, Graphic, Chart, Table, Separator, Maths, Noise
   - Weight values in the profile: 1.5, 1.0, 2.0, 2.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5, 0.5
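Conceptually, a profile like this weights every detected error by its error type and by the type of region it affects, so that, for example, a missed text region costs far more than a missed image in a full-text scenario. The sketch below combines the two weights by multiplication purely as an illustration; the combination rule and all numeric values are assumptions, not the PRIMA evaluation implementation or the actual profile settings.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of profile-based error weighting: each error counts according
// to its error-type weight times the weight of the region type it affects.
// All weight values here are placeholders, not the actual profile settings.
public class EvaluationProfile {
    private final Map<String, Double> measureWeights = new HashMap<String, Double>();
    private final Map<String, Double> regionTypeWeights = new HashMap<String, Double>();

    EvaluationProfile() {
        measureWeights.put("Merge", 1.0);
        measureWeights.put("Miss", 2.0);
        regionTypeWeights.put("Text", 1.0);
        regionTypeWeights.put("Image", 0.0);
    }

    /** Weighted penalty of a single error of a given type on a given region type. */
    double penalty(String errorType, String regionType) {
        Double mw = measureWeights.get(errorType);
        Double rw = regionTypeWeights.get(regionType);
        return (mw == null ? 0.0 : mw) * (rw == null ? 0.0 : rw);
    }

    public static void main(String[] args) {
        EvaluationProfile profile = new EvaluationProfile();
        System.out.println(profile.penalty("Miss", "Text"));  // counts heavily
        System.out.println(profile.penalty("Miss", "Image")); // ignored in a text-oriented profile
    }
}
```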
18. Measures – Segmentation Errors
   (Diagram comparing Ground Truth regions, e.g. Paragraph and Caption, with a Segmentation Result, illustrating Merge, Split, Miss, Partial Miss and Misclassification errors)
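A simplified way to detect some of these segmentation errors is to compare ground-truth regions with result regions by geometric overlap: no overlapping result region is a miss, several overlapping result regions indicate a split, and incomplete coverage a partial miss. The Java sketch below implements only that subset on plain rectangles with invented example regions; the real layout evaluation works on polygons and also handles merges and misclassifications.

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of detecting a subset of segmentation errors by comparing
// ground-truth regions with result regions via rectangle overlaps. The regions
// and the 95% coverage threshold are invented for illustration.
public class SegmentationErrors {

    /** Classifies one ground-truth region against all result regions. */
    static String classify(Rectangle groundTruth, List<Rectangle> results) {
        List<Rectangle> overlapping = new ArrayList<Rectangle>();
        for (Rectangle r : results) {
            if (r.intersects(groundTruth)) overlapping.add(r);
        }
        if (overlapping.isEmpty()) return "Miss";
        if (overlapping.size() > 1) return "Split"; // one GT region covered by several result regions
        Rectangle covered = overlapping.get(0).intersection(groundTruth);
        double coverage = (double) (covered.width * covered.height)
                        / (groundTruth.width * groundTruth.height);
        return coverage < 0.95 ? "Partial Miss" : "Match";
    }

    public static void main(String[] args) {
        Rectangle paragraph = new Rectangle(0, 0, 100, 200);
        List<Rectangle> results = new ArrayList<Rectangle>();
        results.add(new Rectangle(0, 0, 100, 90));   // covers only the top of the paragraph
        results.add(new Rectangle(0, 110, 100, 90)); // covers only the bottom
        System.out.println(classify(paragraph, results)); // prints: Split
    }
}
```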
19. OCR Accuracy
20. Thank you! Questions?
