EG Reports - Delicious Data

Delicious Data: Automated Techniques for Complex Reports: Get data into the hands of those who need it most by automating SQL reports, scheduling data extracts using the Evergreen reporter, and extending the reporter with new source definitions. Ben Shum from Bibliomation and Jeff Godin from the Traverse Area District Library will show you how you can meet your advanced or complex reporting needs, both with and without direct database access. Join us in our efforts to eliminate manual, time-consuming reporting workflows!
Ben Shum (Bibliomation), Jeff Godin (TADL)

    1. Delicious Data: Automated Techniques for Complex Reports, presented by Jeff Godin and Benjamin Shum
    2. How NOT to run reports...
    3. Planning the framework
    4. Planning the framework: SQL report query
    5. Planning the framework: SQL report query → export results to CSV
    6. Planning the framework: SQL report query → export results to CSV → convert CSV to XLS
    7. Planning the framework: SQL report query → export results to CSV → convert CSV to XLS → email XLS to library
    8. Planning the framework: SQL report query → export results to CSV → convert CSV to XLS → email XLS to library, with the whole pipeline scheduled as a cron job
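The scheduling step above typically comes down to a single crontab entry. A hypothetical example, assuming a wrapper script that runs the whole pipeline (the script name and schedule are illustrative, not from the talk; the /home/stats paths follow the examples later in the slides):

```
# m  h  dom mon dow  command
  0  6  *   *   1    /home/stats/bin/run_report_pipeline.sh
```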
    9. Some strategies. Schedule as cron job: use our experience setting up crontab.
    10. SQL report query: use our existing SQL reports. Export results to CSV: CSV is a standard format to work with.
    11. Convert CSV to XLS: find a converter. Email XLS to library: find an email client.
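The framework above can be sketched as a small driver. This is a minimal illustration, not the presenters' implementation: psql does appear in the imports.ini examples later in the talk, but ssconvert (from Gnumeric) and mutt are assumed stand-ins for the "converter" and "email client" the slides deliberately leave open.

```python
"""Sketch of the pipeline: SQL query -> CSV -> XLS -> email."""
import subprocess
from pathlib import Path


def pipeline_commands(sql_file, csv_file, recipient):
    """Build the argv list for each stage of the pipeline."""
    xls_file = str(Path(csv_file).with_suffix(".xls"))
    return [
        # 1. Run the SQL report query, exporting unaligned CSV-ish output
        ["psql", "-A", "-F,", "-f", sql_file, "-o", csv_file],
        # 2. Convert CSV to XLS (assumes Gnumeric's ssconvert as the converter)
        ["ssconvert", csv_file, xls_file],
        # 3. Email the XLS to the library (assumes mutt as the email client)
        ["mutt", "-s", "Your report", "-a", xls_file, "--", recipient],
    ]


def run_pipeline(sql_file, csv_file, recipient):
    """Run each stage in order, stopping on the first failure."""
    for cmd in pipeline_commands(sql_file, csv_file, recipient):
        subprocess.run(cmd, check=True)
```

Scheduling `run_pipeline` from cron then covers the whole diagram on the previous slides.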
    12. Implementation: the scripts can be found online at http://evergreen-ils.org/dokuwiki/doku.php?id=scratchpad:automated_sql_reports
    13. Future of the work:
       • Additional formatting
       • Dealing with reports with zero results
       • Adding new reports
       • Implementing more refinements
    14. No database access?
       • Create scheduled reports
       • Completion notification via e-mail
       • Script pulls report data
       • Import and process
    15. Don’t do this unless you have to
    16. Reports as exports:
       • Create a template that outputs raw data
       • Schedule the report
       • Enable e-mail notification to a special address
    17. Act on the e-mail:
       • fetchmail -> procmail -> perl
       • Report data is automatically downloaded
       • Within a minute or so
    18. act-on-mail.pl:
       #!/usr/bin/perl
       # called by procmail, and receives the mail message on stdin
       use strict;
       use warnings;

       my $url;
       while (<>) {
           # grab the first line containing the reporter URL
           if ($_ =~ m{https://reporter}) {
               $url = $_;
               last;
           }
       }
       system("/home/stats/bin/fetch-report.pl", "-u", $url);
    19. fetch-report.pl logic:
       • Fetch URL like: https://host/reporter/[...]/report-data.html
       • Automatically log in if needed (keep a cookie)
       • Download report-data.csv and save to output dir
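The fetch-report.pl logic is easy to sketch in outline (the original is Perl; this Python sketch is not the presenters' code). The URL layout follows the slide: the e-mailed link ends in report-data.html, and the CSV lives alongside it as report-data.csv; the cookie file path is an assumption.

```python
"""Sketch of the fetch-report logic: derive the CSV URL and keep a login cookie."""
import http.cookiejar
import urllib.request


def csv_url(report_url: str) -> str:
    """Turn the e-mailed report-data.html link into the CSV download link."""
    return report_url.replace("report-data.html", "report-data.csv")


def make_opener(cookie_file: str) -> urllib.request.OpenerDirector:
    """Build an opener that persists the login cookie between runs."""
    jar = http.cookiejar.LWPCookieJar(cookie_file)
    return urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
```

With the opener in hand, the script can log in once if needed, then `opener.open(csv_url(url))` and save the response body to the output directory.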
    20. imports.ini example 1:
       [general]
       dir=/home/stats/reports
       state_dir=/home/stats/state

       [group holdratio]
       name=Hold Ratio
       file1=holds_outstanding.csv
       file2=holdable_copies.csv
       file3=bib_info.csv
       command=psql -f /home/stats/sql/holds_init.sql
       #post-command=/home/stats/bin/mail-holdratio.pl
    21. imports.ini example 2:
       [general]
       dir=/home/stats/reports
       state_dir=/home/stats/state

       [group items_for_export]
       name=Items for Export
       file1=items_for_export.csv
       command=psql -f /home/stats/sql/items_for_export.sql
       post-command=/home/stats/bin/export_bibs.pl -f /home/stats/output/bre_ids.csv
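An imports.ini in this shape parses cleanly with an off-the-shelf INI reader. A sketch with Python's configparser, assuming (the talk's own import script is not shown) that each `[group ...]` section carries numbered `fileN` keys plus `command`/`post-command`:

```python
"""Sketch: parse [group ...] sections out of an imports.ini-style file."""
import configparser


def read_groups(text: str) -> dict:
    """Return {group_name: {name, files, command, post_command}}."""
    cp = configparser.ConfigParser()
    cp.read_string(text)
    groups = {}
    for section in cp.sections():
        if not section.startswith("group "):
            continue  # skip [general] and anything else
        opts = dict(cp[section])
        # file1, file2, ... in order (fine for single-digit counts)
        files = [v for k, v in sorted(opts.items()) if k.startswith("file")]
        groups[section[len("group "):]] = {
            "name": opts.get("name"),
            "files": files,
            "command": opts.get("command"),
            "post_command": opts.get("post-command"),
        }
    return groups
```

Note that the `#post-command=` line in example 1 is a comment, so configparser ignores it, which matches the intent of commenting that step out.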
    22. Import script overview:
       • Runs from cron
       • Have all input files been updated since the last time I ran this import?
       • Run import
       • Update timestamp
       • Run post-import command
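The "have all input files been updated?" check can be done with plain file mtimes against a stored state timestamp. A sketch (the state-file mechanism is an assumption; the slides only say a timestamp is updated after each run):

```python
"""Sketch: skip the import unless every input file is newer than the last run."""
import os


def all_inputs_updated(input_files, state_file) -> bool:
    """True if every input file is newer than the last successful run."""
    if not os.path.exists(state_file):
        return True  # never run before, so go ahead
    last_run = os.path.getmtime(state_file)
    return all(os.path.getmtime(f) > last_run for f in input_files)


def mark_run(state_file) -> None:
    """Touch the state file after a successful import."""
    with open(state_file, "a"):
        pass
    os.utime(state_file, None)
```

Requiring *all* files to be fresh matters for multi-file groups like holdratio above: importing a new holds_outstanding.csv against a stale bib_info.csv would silently mix report runs.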
    23. The import process:
       • drop table
       • create table
       • import data
       • process to output
    24. The post-import:
       • Send the report output somewhere
       • E-mail notify that “report’s ready at some URL”
       • Trigger something like a bib export of the records that were output
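The "report's ready" notification is a one-message e-mail. A sketch of building it with Python's stdlib (addresses and URL are placeholders; the talk does not show its mail code):

```python
"""Sketch: build the post-import notification message."""
from email.message import EmailMessage


def notification(report_url, to_addr, from_addr="stats@example.org"):
    """Return a ready-to-send 'report's ready' message."""
    msg = EmailMessage()
    msg["Subject"] = "Report ready"
    msg["From"] = from_addr
    msg["To"] = to_addr
    msg.set_content(f"Your report's ready at {report_url}\n")
    return msg
```

The resulting message can be handed to `smtplib.SMTP.send_message()` from the post-import command.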
    25. One more thing... Don’t forget to delete your scheduled report output
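If the downloaded report output accumulates on the stats host, a small age-based sweep keeps it in check. A sketch (the directory and 30-day threshold are assumptions; server-side scheduled-report output still needs deleting through Evergreen itself):

```python
"""Sketch: delete downloaded report output older than a cutoff."""
import os
import time


def delete_old_output(directory, max_age_days=30):
    """Remove files older than max_age_days; return the names deleted."""
    cutoff = time.time() - max_age_days * 86400
    deleted = []
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            deleted.append(name)
    return deleted
```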
    26. Custom reporting views exposed in the IDL
    27. SQL views in the IDL:
       • Exposed via reporting interface
       • Good for on-demand reporting
       • Don’t break the IDL
          • xmllint is your friend
          • pick a class id unlikely to collide
    28. <class id="rlcd"
            controller="open-ils.cstore open-ils.pcrud open-ils.reporter-store"
            oils_obj:fieldmapper="reporter::last_copy_deleted"
            oils_persist:readonly="true"
            reporter:core="true"
            reporter:label="Last Copy Delete Time">
    29. <oils_persist:source_definition>
            SELECT b.id, MAX(dcp.edit_date) AS last_delete_date
            FROM biblio.record_entry b
                JOIN asset.call_number cn ON (cn.record = b.id)
                JOIN asset.copy dcp ON (cn.id = dcp.call_number)
            WHERE NOT b.deleted
            GROUP BY b.id
            HAVING SUM(CASE WHEN NOT dcp.deleted THEN 1 ELSE 0 END) = 0
        </oils_persist:source_definition>
    30. <!-- continued -->
        <fields oils_persist:primary="id" oils_persist:sequence="biblio.record_entry">
            <field reporter:label="Record ID" name="id" reporter:datatype="id"/>
            <field reporter:label="Delete Date/Time" name="last_delete_date" reporter:datatype="timestamp"/>
        </fields>
        <links>
            <link field="id" reltype="has_a" key="id" map="" class="bre"/>
        </links>
    31. <!-- continued -->
        <permacrud xmlns="http://open-ils.org/spec/opensrf/IDL/permacrud/v1">
            <actions>
                <retrieve/>
            </actions>
        </permacrud>
        </class>
    32. The future:
       • General third-party reporting frameworks?
       • Library dashboards?
       • Your ideas?
    33. Photo credits: “Surrounded by papers” by Flickr user suratlozowick, http://www.flickr.com/photos/suratlozowick/4522926028/
    34. Thanks!
