Lauren: Hi, everyone. Thank you for joining us for today's Tech Forum session. I'm Lauren
Stewart, the operations director at BookNet. Welcome to #StandardsGoalsFor2023 Standards
and Certification roundup.
Before we get started, BookNet Canada acknowledges that its operations are remote and our
colleagues contribute their work from the traditional territories of the Mississaugas of the
Credit, Anishinaabe, Haudenosaunee, Wendat, and Mi’kmaq Peoples, the original nations of
the lands we now call Beeton, Brampton, Guelph, Halifax, Toronto, and Vaughan. We
encourage you to visit the native-land.ca website to learn more about the peoples whose land
you are joining from today. Moreover, BookNet endorses the calls to action from the Truth
and Reconciliation Commission of Canada and supports an ongoing shift from gatekeeping
to space-making in the book industry.
The book industry has long been an industry of gatekeeping. Anyone who works at any stage
of the book supply chain carries a responsibility to serve readers by publishing, promoting,
and supplying works that represent the wide extent of human experiences and identities in all
that complicated intersectionality. We at BookNet are committed to working with our
partners in the industry as we move towards a framework that supports space-making, which
ensures that marginalized creators and professionals all have the opportunity to contribute,
work, and lead. For our webinar today, if you're having difficulties with Zoom or have any
tech-related questions, please put your questions in the chat or you can email us at
techforum@booknetcanada.ca.
As you can tell, we're providing live ASL and closed captioning for this session. To see the
captions, please find the show subtitle button in the Zoom menu normally located at the
bottom of your screen. If during the presentation you have questions for us, please use the
Q&A panel found in the bottom menu. Lastly, we'd like to remind all attendees of our code
of conduct. Please do be kind, be inclusive, be respectful of others, including of their privacy.
Be aware of your words and your actions and please report any violations to us at
techforum@booknetcanada.ca. Lastly, we'd like to... Oops, sorry.
And then also do not harass speakers, hosts, or attendees or record these sessions. We have a
zero-tolerance policy. You can find our entire code of conduct at bnctechforum.ca/code-of-
conduct. Now let me introduce our speaker and my colleague, Tom Richardson. Tom is
BookNet's bibliographic manager and friend to all ONIX producers and users, building on an
illustrious career in Canadian publishing and metadata management. In 2016, the Book
Industry Study Group awarded him with their distinguished service award for his work on
standards and best practices. He's someone I learn from every day and I'm thrilled to
introduce him today. Tom, please take it away.
Tom: Well, I guess I should just get right in. Well, the first thing I'd like to do is answer
what's becoming quite a regular question. What's with BiblioShare and ONIX 3.0? And it's
simply to say, "Mea culpa, the fault is ours." But I'll explain, BiblioShare intentionally stores
ONIX as ONIX. The two versions are different and stored as distinct instances. Now, a retail database, a typical data aggregator, would map the ONIX data to their central dataset. So the question of which version comes down to creating new programming to map the same data points from ONIX 3.0, and then just having it delivered thereafter. I mean, don't
misunderstand me, it is not that simple, but it does mean most companies can just do a switch
and use one over the other, and the new mapping meets their needs and the publisher's needs
or it doesn't. With the slide, I've tried to illustrate what the retailer may not know is the full
extent of the data available in the feeds. Now, they know what they program for, but not
always what's being sent.
So BiblioShare data aggregation is actually both simpler and more complicated, and this
illustration is awfully idealized. But we do know what's in the ONIX file because we load all
of it. And as you can see here, that is actually simpler in its way. A retailer needs to create
actionable information from the metadata. BiblioShare preserves your data throughout, knowing everything sent is an asset in our work, and we help you fix and improve your data, and in turn, that improves the quality of the flow through BiblioShare. Now, that simplicity helps support our APIs: fully standardized data has to be manipulated, and that makes it expensive, while BiblioShare can be exceptionally low cost. Now, retailers can also benefit from
this because we can help them understand the scope of the data available. We can do that
now for either the ONIX 2.1 or the 3.0 data that we have. And we have begun ensuring that the introduction of one of today's topics, ONIX 3.0, is as dull an event as we can possibly make it.
Now, this is actually an intentionally misleading slide. The problem we are having is not
with handling ONIX 2.1 and 3.0 together, that actually works fine. We just keep them
separate. Rather it's that we still have five 2.1 ISBNs for every 3.0 one that we have. We
don't have enough 3.0 data to service our clients to the APIs, but trying to process the 3.0 in
volume has also run into problems. Our transition has turned unexpectedly awkward. Every company has its own issues with its own systems, and ours are being solved. And once they are, soon, the next issue for us is that the 2.1 data that we serve is often merged from two sources, and we may or may not have the same two sources available for 3.0. So will the new metadata we serve be comparable? We have products depending on it. We have to move a little more carefully, account to account, and that makes the transition a little more thoughtful and slow in the changeover once it's begun.
We are sorry for the delay, and I can only ask you for your indulgence in continuing to
support the dual feeds into BiblioShare. Please get in touch if it's a problem and we'll be in
touch soon about the 3.0 data as we move forward, mea culpa. So BookNet's sister
organization in standards, BISG, the Book Industry Study Group, has produced an excellent
statement, decrying abuse of book titles in metadata. Now, standard aficionados may recall a
similar statement came out several years back from our UK-based sister, B-I-C, BIC. And I'll
be asking our own bibliographic committee to endorse this new one just the way we
endorsed the previous one. Please read this and distribute BISG's statement in your
company.
It's available at the URL here, but if you just go to bisg.org and find the metadata committee,
it's there as a link. In the context of this webinar, it's a great introduction to where we are
now in standards. Now this is a biased paraphrase. BISG is both politer and more specific.
But my takeaway is this: not following the definitions provided in the standard defeats the
purpose of having a standard. Sender and receiver need to be able to predict what the
metadata contains. Who knew? Senders need to create metadata before retail use can happen.
Retailers need to use the data they are sent following its named purpose, including, and this
is an underlying theme of this update, adopting new data in a timely fashion.
But senders need to appreciate that, assuming we can get retailers to obligingly follow reasonable advice, retailers would in turn need senders to follow the standard and maintain the data properly, in this case title, and not attempt to circumvent perceived problems by gaming the metadata to achieve some goal, generally based on the current data habits of one or maybe two players who may very well not follow the same rules for metadata ingestion in every market they serve. It's not a game anyone can win. Now, interestingly, applying the standard
implies understanding the goals of the standard. And there is a big change that has happened.
Not that it amounts to much yet. No worries, nothing that much has changed that affects your
metadata, but combining the BISG statement and EDItEUR's announcement of ONIX 3.1
suggests we have to admit that standards will change and agree on how to communicate so
that the implementation between sender and retailer actually occurs in a timely manner.
No one should be proud of the transition from 2.1 to 3.0. Twelve years is way too long, and obviously, I can't claim to have a solution when we're trying to stretch that out to 13 years. I can also ask both retailers and data senders to recognize that the delay creates a cost, in the same way I can admit that, to BookNet's data-sending clients, we are clearly becoming one. Still, in my opinion, it is cheaper to expect and plan for change. So let's move on to
what's changed. Introducing ONIX 3.1 and why isn't it ONIX 3.0.9? So this is as direct a
quote from EDItEUR's latest specifications on their introduction to release 3.1 as I can make
it. This new release of ONIX for books is characterized by two key changes. First, the
removal of a handful of elements from release 3.0 that have been deprecated for up to a
decade, and second, the addition of a range of new data elements, which I'll cover in a
moment.
It's ONIX 3.1 because it's no longer fully backwards compatible with 3.0. A full list of all
deprecated elements can be found in an appendix in the specification, which is appendix
four, the list of elements removed from release 3.1. I'm not gonna try and cover all the details
of the deprecations here. It's not a problem because there's almost no use of what's been
removed. Mostly, just be sure you update your documentation, and, what may be considered new, stop using anything that's labelled as deprecated in the new documentation. That is how upcoming removals from ONIX are marked. All of which is to say
ONIX for books version 3 is considered stable enough that it has to support changes that
affect version compatibility. And that's good news because ONIX data senders and receivers
need a stable standard.
It's cheaper, but in addition to supporting new data with each version update, they should consider the need to plan for change based on the standard's requirements if they want to remain within it. So to expand on that a little bit, maintaining data that's valid as ONIX means you
actually have to plan for the deprecation of data. ONIX 2.1 preserved deprecated bits for
backward compatibility. ONIX 3.0 famously broke that, and up to now, ONIX 3.0
maintained full backward compatibility within its version. Now, the version three series isn't
going anywhere, not soon at any rate. There is no 4.0 in the works or on anyone's horizon. So
after more than 12 years, it's stable enough that we actually need to get some cleanup in.
Now EDItEUR is seeking input about this, about how they should remove deprecations. If 12 years is clearly too long, and if version updates that come around every second year seem too short, how often and how should they manage this kind of change? Contact information will
be at the end, but get in touch with your thoughts on that and I'll come back to this a little bit
later. On deprecations, BiblioShare has your back. We haven't made the transition to ONIX
3.1 quite yet, but we will soon, but before we do, we're monitoring BiblioShare's citations for
issues. And I've started to notify clients if I find any. I am pleased to say and confirm it is not
a problem for most companies, but what follows are the most likely issues. This one is associated with CoreSource feeds. Largely, they're coming in from eBOUND.
Now I've gotten in touch with eBOUND, and I hope it'll disappear without anyone else being
involved. So the most common use of newly removed data in BiblioShare's ONIX 3.0 archive is dual data, providing both the deprecated and the expected presentation. So this is an audience code supplied as an element alongside an audience composite supplied with the current coding for the same data bit. So the bit in red is the deprecated audience code. Now, could I just be clear? It may exist in our data because eBOUND supplies ONIX 2.1 feeds to BiblioShare, and that is exactly the sort of problem that can occur if you try to support conversion. By continuing to need ONIX 2.1, we may be the authors of the problem we're receiving. The other common problem affects the various date composites, of which there are many, which have always, up to now, included a deprecated option that will now be invalid in 3.1.
So here's an example date composite. So this is a birth year, coded to show that the date format is atypical. So in this case, coded for birth date, you're only giving a year. So 1935, and the date format for that is 05. So what's in red above is the deprecated version. You should
never use date format as an element. It should appear as an attribute within the date tag as in
blue below. The main reason this one is worth noting is because dates in ONIX are the
default eight-digit year, month, date, and default dates don't require the use of the date
format. It's important that you provide a date format code anytime you provide a date that
isn't the default value. And some dates require more detail and use minutes and seconds for
effective use. And depending on your software source, particularly if you're using in-house
created software, supporting attributes may or may not have been built in.
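To make the two presentations concrete, here's a sketch of the date composite described above, using ONIX reference tag names for a contributor birth date (role code 50 on the contributor date role list, date format 05 for a year-only date); treat the exact surrounding structure as illustrative.

```xml
<!-- Deprecated: DateFormat supplied as a separate element (in red on the slide) -->
<ContributorDate>
  <ContributorDateRole>50</ContributorDateRole> <!-- 50 = date of birth -->
  <DateFormat>05</DateFormat>                   <!-- 05 = year only (YYYY) -->
  <Date>1935</Date>
</ContributorDate>

<!-- Current: the date format carried as an attribute on the Date tag (in blue) -->
<ContributorDate>
  <ContributorDateRole>50</ContributorDateRole>
  <Date dateformat="05">1935</Date>
</ContributorDate>
```

When a date uses the default eight-digit YYYYMMDD form, no dateformat is needed at all.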
The idea here is that data receivers should be able to know what they are processing based on the attribute, rather than loading the data, reading the data, and working out how the data is formatted, all of which is kind of wasteful processing time. You know, you warn them that you're not giving them the usual date format. So they expect it to be in an attribute, and they're warned, and they can process it efficiently. So if you find yourself adding an atypical
date, it might be worth checking. And I'm actually confident most of you, you'll find, you're
using the attribute now. Now, I have found one client using a value removed from the market
publishing detail composite, but I think it's related to conversion problems from 2.1, and probably misunderstanding how to use the market composite in ONIX 3.0. Anyway, promotion contact.
The first one on the list here is deprecated; use product contact instead. And there are some others
that are detailed here and in the appendix, but the final removal worth noting is gender in the
contributor composite has been removed, and there is no equivalent to it in release 3.1. So to
explain, get in touch if that's a concern. And there has been almost no use of it. But the
gender composite was never designed to support use as an identity marker. And it was
included to support using ONIX for submitting to ISNI, the International Standard Name
Identifier, which first asked for a binary male, female, or other code to help identification of
content producers. And they found after a decade, that wasn't a help for their uses, so they
dropped it from their request. Therefore, ONIX has dropped it from their standard.
The short answer for identity concerns is that ONIX has never provided coding for that
purpose. So its removal isn't a problem. And I'll touch on this a bit at the end, but what's
missing from diversity support is an ongoing question. But the following is important
because there are newly deprecated data points in ONIX 3.1. So these are still part of 3.1, but
they are now labelled as deprecated. And that means that they will someday much sooner
than 12 years from now be removed from the standard in the same way the previous data
elements were just removed from 3.1. So newly deprecated, all default values in the header
composite: language of text, price type, and currency. Now I do see these in headers, but I
don't know the use case for them. I mean, for over 20 years, the request has been to put the
value in the record and everyone does that. I'm not aware of anyone expecting end users to
insert values based on defaults found in the header, or of any end user looking to fill in
missing data based on a supplied default. I can't actually see it working in most people's
feeds. I think its use is simply tradition.
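As a sketch, here's what the newly deprecated header defaults look like, using the reference tag names from ONIX 3.0; the sender name and values are invented for illustration.

```xml
<Header>
  <Sender>
    <SenderName>Example Press</SenderName>
  </Sender>
  <SentDateTime>20230525</SentDateTime>
  <!-- All three defaults below are newly deprecated in ONIX 3.1 -->
  <DefaultLanguageOfText>eng</DefaultLanguageOfText>
  <DefaultPriceType>01</DefaultPriceType>
  <DefaultCurrencyCode>CAD</DefaultCurrencyCode>
</Header>
```

The fix is simply to drop the Default* elements and carry the language, price type, and currency in each product record, which is what virtually everyone already does anyway.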
IT departments include them because they have in the past and they can for now, but they need to stop doing it because these will be removed from version 3.1 at some future date. Now, the other deprecation to require some minor work is a little more serious. It represents the enforcement of an improvement made in the design of ONIX 3.0, and the ending of an ONIX 2.1 carryover.
Title text has been deprecated in the title element composite, which is used in both P.5
collection and P.6 product title. So instead of using title text, what you do is use title without prefix in all entries, period. It's either no prefix plus title without prefix, or it's a title prefix plus title without prefix. In ONIX 2.1, title text made some sense in that some, but not all, senders supplied all titles in title text and supplemented it with title with and without prefix as needed.
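So, as a sketch using the ONIX reference tag names, the three forms look like this; the first is the one being retired.

```xml
<!-- Deprecated in 3.1: TitleText -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>
  <TitleText>The Example of the Year</TitleText>
</TitleElement>

<!-- Title with a prefix: TitlePrefix plus TitleWithoutPrefix -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>
  <TitlePrefix>The</TitlePrefix>
  <TitleWithoutPrefix>Example of the Year</TitleWithoutPrefix>
</TitleElement>

<!-- Title with no prefix: an empty NoPrefix flag plus TitleWithoutPrefix -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>
  <NoPrefix/>
  <TitleWithoutPrefix>Example of the Year</TitleWithoutPrefix>
</TitleElement>
```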
Anyway, all of this makes the question posed earlier and repeated here a very, very real one.
There is some use of default values in headers. There is a better, simpler way to handle titles. There is also continuing use of that 2.1 carryover, and the header problem. Removing them is a task, and doing it sometime in the next two to four years seems quite a reasonable request. If they aren't removed by that time and they are then removed from the standard,
technically, the file including them is no longer valid as ONIX. That should be important
because supporting ONIX valid as XML documents measured against a published schema
should be a best practice goal for everyone. We have failed at this before. So how should this
sort of change be managed? Twelve years is too long; never doesn't make sense. Change in metadata is normal over long periods. How do we ensure both senders and receivers update
their system in a timely way?
Contact information is at the end. Let us know what you think. Now, because it's a version change, I have to just remind everyone about what a version is. So hopefully everyone knows this. I just feel I need to say it. Version changes are when new
composites and elements are added to support new data and new functionality. Typically
happens every year and a half to two years. The current version is ONIX 3.1. Issue changes
are when new codes are added to existing code lists to support new data points. Typically
happens four times a year. The current issue is 61. There is what might be considered a mini transition to ONIX 3.1, as cleaning out any deprecated elements may take a little extra time. So, I mean, it might be sensible to delay implementing 3.1 for just a little bit, but we are talking like a couple of months. I mean, it shouldn't be any longer than that.
Now, there are no problems. It's mostly good. So this is just a reminder: update your ONIX specifications at least yearly. It has minor corrections and tweaks made regularly. Always update it after a version change. We just had one. Absolutely update it when there are newly
deprecated data points. Understand how your software updates and code list issue changes
happen. Let me rephrase that because I said it so badly. Understand how your software
updates and code list issue changes occur and are implemented in your systems. You need to
ensure that staff using the code lists have access to full versions that include explanatory
notes. And excuse me for just a sec while I clear my throat. Glad you didn't hear that. Okay,
sorry. All right, so continuing on. Use version updates to review ONIX for metadata that
might benefit your business.
So ONIX 3.0 had nine version updates since 2009, and they are summarized in the
specifications introduction. It's a very useful list. I've just reviewed it. It's full of lots of things
no one has implemented. Now, if you do that and you see things you're interested in, ask BookNet for help. If you see potential targets for use, ask us if we see use of them in BiblioShare. We can confirm if they exist and whether people are using them. If you're unsure how to implement them, ask. We're pleased to help with that sort of thing. If you're unsure of the
purpose of any metadata structure or code that you see or are using, ask. If your business has
something to say, a communication need and metadata might help, ask. But this is the
important message that I'd like to have.
Tell your trading partners and BookNet, if you add or need additional metadata support.
Don't rely on them finding your changes. Don't rely on them guessing what your needs are.
So let's move on to the next great topic. What's new in ONIX 3.1? So they have added
sequence number to allow explicit prioritization of block 2 collateral material. So that's
added in all four sections, including prizes. And this answers the need to provide a priority to, say, a sequence of reviews. So if a receiver can only fit three reviews from your list of 18, they can take the first three in a list you control. People have been asking for this forever, so this is well worth implementing. We expect to see use; of this entire list, it's the thing we expect people to want and to be using. However, having said that, they have
also added awarding body within prize. And Canadian publishers love their prize information
and they love doing it right.
So this is an XHTML-enabled field that gives a dedicated space for the sponsoring organization, which should thrill everybody. This third one is the other particularly important one.
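Before getting to it, here's a sketch of the first two additions, using ONIX reference tag names; the sequence number placement and the awarding body tag follow the release description, and the names and values are invented for illustration.

```xml
<!-- Prioritizing collateral: the receiver takes SequenceNumber 1, 2, 3... in order -->
<CitedContent>
  <SequenceNumber>1</SequenceNumber>
  <CitedContentType>01</CitedContentType> <!-- 01 = review -->
  <SourceTitle>The Example Review of Books</SourceTitle>
</CitedContent>

<!-- A dedicated home for the prize sponsor -->
<Prize>
  <PrizeName>Example Book Award</PrizeName>
  <PrizeYear>2023</PrizeYear>
  <PrizeCode>01</PrizeCode>               <!-- 01 = winner -->
  <AwardingBody>Example Arts Council</AwardingBody>
</Prize>
```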
They have added a market reference to enable market-specific partial block updates. So this is actually the other reason why it's version 3.1: this is a major change to ONIX.
If you want to practically support block updates, the most likely block you want to update is
block 6. And it is unique because it can repeat to represent different markets. Now, block 6 is
the heart of the ONIX standard. But when you do a block update, you probably aren't sending the same block 6 to every data receiver. So how do you then update the block 6s? It gets kind of complicated. I mean, as it would work now, what you would try to do is supply all the repeated blocks, making the change in the one that needs it, and then the receiver loads all of them. Anyway, it's not very sensible. However, when you add a market reference, you then have a unique identifier that your trading partners can use to identify the one they need to update. So there's no appreciable use
of block updates in Canada yet. When we start, this will be a great help. There are two new
things in collections. They have added collection frequency to carry the frequency of
publications of products in a collection. And they've also added the ability to support a collection element level in identifiers, which allows you to cross-reference collection title elements to the identifiers. So the first one would be used for serialized products. I mean, listing comic books would be a simple example. I mean, you can now give a frequency code in the system. That should be very useful and we'll
probably see use.
The identifier problem is one that really exists for... Well, you have to have a collection and a
sub-collection really to make it make any sense. So, I don't actually expect to see much use
of it, but, you know, it's a very useful thing because identifiers are useful. They have also
added an affiliation identifier composite within a professional affiliation. So that's a
contributor composite section thing, professional affiliation. Heavily used by university
presses. Again, university presses would mainly care about this next one, which is there's
something called open access, which allows people to provide products for free. And it's
usually done by university presses for academic-type printing/publishing. But they have
extended some elements to appear in Block 3 content items that will allow the
support of hybrid open access, which allows you to have paid pricing and unpaid pricing at a
chapter level within one record.
Complete genius if you need it. Don't expect many do. The other changes are just some
sundry lists. I mean, every version includes some small tweaks to consolidate metadata
support for consistency and future-proofing. There was one other addition that I thought was
worth highlighting. So, newly added in 2023 for the issue 61 code list update to list 153, everyone's favourite, text type: code 37, cover line. So the definition, as you would find in the notes: U.S. (which identifies the terminology being used here), "reading line". A line of usually explanatory copy on the cover, somewhat like a subtitle, but not on the title page, added by the publisher, often on the cover.
So, cover line, reading line, something like that. A sample would be with 125 illustrated
recipes. Now, this addition ties neatly back to the BISG statement on title. Now, that
statement advocates for leaving the book title metadata alone to be what it's supposed to be.
Something that matches the title of the book, the title page of the book specifically, and
advocates for retailers to support and publishers to use another list 153 code added in 2009
for the issue 10 code list update, and supported by ONIX 2.1, 3.0, and 3.1, everyone. List
153, text type, code 10, promotional headline. A promotional phrase, which is intended to
headline a description of the product.
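Putting the two codes side by side, here's a sketch of how a publisher might carry both in a product record; the tag names are ONIX reference names and the text values are invented.

```xml
<!-- List 153, code 10: promotional headline, available since 2009 -->
<TextContent>
  <TextType>10</TextType>
  <ContentAudience>00</ContentAudience> <!-- 00 = unrestricted -->
  <Text>The cookbook home bakers have been waiting for</Text>
</TextContent>

<!-- List 153, code 37: cover line, new in issue 61 -->
<TextContent>
  <TextType>37</TextType>
  <ContentAudience>00</ContentAudience>
  <Text>With 125 illustrated recipes</Text>
</TextContent>
```

Both sit in Block 2 alongside the description, leaving the title composite to carry only the title proper.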
Now, the BISG statement suggests that publishers feel that they need to use book title for
marketing messages because it's one of the few fields that they can reliably find displayed
and indexed. Outside of CataList as far as I can see, promotional headline, if it appears at all,
appears as another random piece of text buried on a page of, well, metadata blat. Hopefully,
we'll get some traction on the sort of support that BNC CataList supplies; they cunningly match the definition supplied here. Now, cover line is a great addition. It creates a field that
adds to the book title, a place for that little bit of extra explanation not found in the title
proper. And it should be displayed for that purpose. Hence, that's why it is in the text type.
It's a display component, and it needs to be displayed for that purpose.
Note also, it's a U.S. addition. And I think that means it's likely gonna be used by Penguin
Random House, but I can't confirm that. Anyway, retailers should take note. So up to now, I
probably would've advocated for supplying that information in either, I don't know, product
form description or illustration description. But that's only because there wasn't a better
place. This is the better place. And it's better because it's there for promotional purposes, while the other two are typically displayed only in the fine print, you know, along with that little block of page numbers and that type of thing. So, promotional headline should work exactly how it sounds: for prominent display. But what we need in book metadata is
transparency by all players.
It starts with publishers, in this case, supporting the standards offerings and being transparent
by telling retailers about the tools they supply to support their sales. Okay. You can't make
them use them, but if you're looking at metadata results you had with one very large retailer
and modifying your data based on what you see, you're gonna very quickly wind up doing
funny things with your book titles. Experience has shown this, and it leads to statements like
BISG has made and BIC before it, and me whining, "You're basing your feeds on
experiments with a black box that's designed to reward the retailer." No one can blame them
for that. And short term rewards are possible for the publishers doing it. Who knows?
Now, retailers need to look more at the data they get. Having their IT department tell them
what is in the data will include, for one thing, all the results of those publisher experiments.
Take them however you will. So everyone does have to rely on analysis of the data to tell them what to do. But surely, in 13 years of availability, how is it that somehow no North American retailer has asked for or supported promotional headline? That has to show you that
there is a problem. If you're going to design a page for display, at least make sure you
support the elements designed to be displayed on it. That's yesterday's news.
A new tool has been added to the standard. It's called cover line. Let's start supporting it in
less time than it's taken promotional headline. And it starts with your being transparent and communicating with your trading partners. And BiblioShare can help with that, as I've
tried to explain earlier, and so can BookNet Canada. So I'm just about finished. We want you
to participate in standards, and here are the ones that we are most closely involved in. On the
left is some standard schemes we are typically associated with, and on the right, the
organizations. So EDI is Electronic Data Interchange. Anyway, it's still the heartbeat of
distributor to retailer information exchange. We do it.
Subject schemes, we are particularly involved with and have staff involved in BISAC subject
and Thema subjects. We closely watch identifiers, all of them, all the time. We monitor ISO
and W3C, GDSN and SAN stuff. We have staff members on BISG committees, metadata,
subjects, and supply chain in particular. Identifiers doesn't have a committee, and workflow
is more related to digital than we... Anyway, we monitor it closely, but we're not directly
involved at the moment. We can be. We have staff involved in the ONIX International
Steering Committee, the Thema International Steering Committee. We work closely with
BTLF, our Quebec partner.
Now, the BNC Bibliographic Committee is basically the group that we run, and it is made up of people from the Canadian industry. It is the Canadian English-language national group for ONIX and for Thema. It meets three or four times a year. Should be four
times a year. More if we could get it. But it is definitely a venue for participation. We
regularly discuss what is upcoming in ONIX. We tried one for Thema. If you have needs,
wants, desires, it is the group that would talk about them and agree to put them forward. We
want you to participate. We want you to be involved in these things. They are all changeable.
You need a code in ONIX, you can get a code in ONIX. It's not that difficult, but you do
have to participate and you do have to let people know, and other people have to agree with
you.
So that, I think, gets me down to my very last slide. And I will just simply observe that this is
identical to the slide I provided last year as part of my presentation at this time. So I am not trying to say that diversity, equity, inclusion, identity, indigeneity, and justice for book subjects and author promotion hasn't changed, or hasn't been moving. It's moved quite a bit actually since then. In particular, Lauren Stewart has been doing a great deal of work that is extremely valuable in BISAC, which we've gotten some new stuff in. We have some
ongoing discussions. We have a couple of working groups and things like this that we're
involved in. But it is a hard row to hoe. Getting it right is as hard this year as it was last year. The reference to OwnVoices, which was, you know, one of the problematic additions, I mean, one that became problematic, you know, still remains what it is.
We want to have more of this stuff, but it is something that we are working on, and that we
need as much participation by members of the community as we can get. And that ends this
part of it completely on time when I expected it to be 40 minutes in.
Lauren: Thank you, Tom. I'll give you a chance to drink some water and relax after that. So
helpful. You mentioned several times during your presentation that you would add contact
information, and that is now on the screen for anyone who needs it. And I trust that my
colleagues will likely put another contact link in the chat. So let's get into some Q&A. We've
got some great questions from the audience, and feel free to add more as we're chatting. But
just to get us started, Tom: I've seen you do this presentation for at least the 12 years that I've known you. And maybe it's because I've got all these TV show reunions on my mind
today, but I have to ask you a classic reality TV question, what is one rose and one thorn
from this past year in standards?
So for those of you who are not familiar with reality TV show reunions, on one particular
network, a rose is something that was successful like an accomplishment, a highlight,
something positive that came out of this past year. And a thorn is the opposite, something
that was challenging, a struggle, or something that perhaps needs another go. So Tom, what are your rose and thorn from the past year in standards?
Tom: Well, standards and roses don't really go together. I mean, slow change is actually what marks standards. There are three things I would highlight as roses, and Thema is just one of them. It is slowly increasing in use, and there's increased interest in it from the United States. It is easily still the best thing we have available for genuinely improving metadata that I am aware of, and people who are paying attention to it, I think, like it. We have a
presentation coming up on June 8th, which I just would like to mention because it's an important one, I think, where Karen Smith, the organizer of the website for Blackwell's in the UK, which is a Thema-based website, will present. Just go over and look at Blackwell's.
Go on over and look at the way you could set up search and discovery for books using better
metadata. It is wonderful, and Karen will talk about it. And that's a real highlight to my mind.
The other one is ISNI, which is seeing actual, increasing use. Major publishers, including U.S. ones, are planning to start introducing it if they haven't already; there's actual use. That means we have a second consistent, persistent identifier. And people who have followed Tech Forum for years know how much we love to talk about persistent identifiers. We haven't had a chance to recently, but we keep trying. This is the first persistent identifier that really has a prospect of changing publishing since the ISBN. So that's
exciting.
So that's a real rose. The other thing is all the work you've done in diversity, which is, like, a complete wonder and hugely important. And that is also the biggest frigging thorn in everybody's side: getting answers to the problem of identity in metadata is neither simple, obvious, nor workable out of the box. We could talk for 20 minutes on the problems around that, so I will stop now.
Lauren: No, I think you've summed up what my thoughts on the past year and couple of years of work in many of these spaces have been. So yeah, very well stated. The other thing that struck me, and I'm gonna pull in a combination of a question from the chat, one from the Q&A, and one of my own: one of the things that you stated a little bit early on was that chicken-and-egg scenario that anyone who's working with ONIX or any other standard and really looking towards implementation struggles with. You were very clear when you said that data senders need to create metadata before retail use can happen. And then you followed that up by saying retailers need to use the data that they are sent following its named purpose. And so my follow-up question to that is: can you elaborate on how retailers can work with data senders to drive earlier adoption of standards? And then I have a follow-up from the chat as well. It's not an easy question.
Tom: I tried to address that with the transparency slide and talk, so I don't wanna repeat that. I mean, it's kind of a problem. Anything retailers can do to stop the scenario where they pull the data in, talk to their IT department, do the best they can with the data they've got, and then don't spend much time looking at the standard would help. They rely on their IT department to tell them what's available in the data as they get it, not what they could be getting in the data. I mean, I've quite pointedly said, "Look, no one implemented promotional header, and it's existed and been used in the data." I don't think there's been a lot of publisher use until the past, say, two to three years.
But I think there has been at least a couple of years of use of promotional header by at least some publishers. And the problem, of course, when you rely on your IT department is that the amount of use you probably get of something like promotional header in an aggregated feed of millions of records is probably substantially less than 10%. So from the point of view of an IT department, it's like invisible data. You can't see it. You won't know it until you go looking for it. So if they had gone looking for it and said, "If we could just use promotional header, we could make a display better," they would've found some use. They could have put it up and found that it did actually improve sales on test records and things like this.
They would've had a chance to do something with it. And publishers, seeing it being used, would flock to it, because they would want to see it used. So we need to have the retailers interacting more with the better parts of the data, which aren't large amounts. One of the problems when you talk about diversity is that you're talking about things that affect less than 1% of the records, and that's where the real quality material on diversity will be found. You find it because you go looking for it; you don't find it because it drops out of the data. The counterpoint to that is the publisher tendency to want to make everything a default. A lot of publishers will take a data point that they feel they should support because it's important, and if they don't include it in every record, they feel it's invisible.
And they're probably right, from what I just said, so I can understand it. But by putting default data in, they basically put pointless pieces of data in. So you increasingly have companies loading sales restriction information just to say "I don't have a sales restriction." What's the point of that? Sales restriction exists to put sales restrictions up. You don't add to the dialogue by saying, "We have none." Unless it's new that you took it off or something like that, then that's important to put in. But people hide the data by doing that. We do need the retailers looking for the small data. Does that...?
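Tom's "invisible data" point can be illustrated with a quick feed scan: a field used in well under 10% of records in an aggregated feed never surfaces unless you count it deliberately. A minimal sketch, assuming reference-tag ONIX and TextType code 10 (the "promotional headline" code in List 153); the three-record feed here is a made-up stand-in, not real data:

```python
import xml.etree.ElementTree as ET

# Hypothetical mini-feed: only one of three records carries a promotional headline.
feed = """<ONIXMessage>
  <Product><RecordReference>r1</RecordReference>
    <CollateralDetail><TextContent><TextType>10</TextType>
      <Text>A headline worth displaying</Text></TextContent></CollateralDetail>
  </Product>
  <Product><RecordReference>r2</RecordReference></Product>
  <Product><RecordReference>r3</RecordReference></Product>
</ONIXMessage>"""

def field_usage(xml_text: str, text_type: str) -> float:
    """Share of Product records carrying a TextContent of the given TextType."""
    root = ET.fromstring(xml_text)
    products = root.findall("Product")
    hits = sum(
        1 for p in products
        if any(tc.findtext("TextType") == text_type
               for tc in p.iter("TextContent"))
    )
    return hits / len(products)

print(f"promotional headline coverage: {field_usage(feed, '10'):.0%}")
```

Run over millions of records instead of three, a sub-10% coverage number is exactly the data an IT department never notices unless someone asks for the count.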
Lauren: Well, I guess... Yeah, no, I think that gets to it, and I think that suggests a direction, because you kind of suggested that if publishers start creating the data, then there will be use for it. But the question that we had from the chat was: how can publishers advocate for more thorough use of display fields on retailer sites? The commenter mentions that they would love to see cover line use as an example, but I think it could apply to any of the other examples that you mentioned, particularly, you know, so-called diversity content qualifiers, things like that. How would you answer that in terms of advocacy?
Tom: I think if you have added something to your metadata because you think it's important, and you're structuring your data, you should send an example to the retailers. Very large U.S. publishers typically have a sheet. Now, large U.S. publishers keep their metadata really very, very basic, but when they add something, they add it to a list and they distribute that around, and they make sure everybody in the supply chain freaking knows that they support that piece of metadata. I think Canadian publishers have really good metadata, but they don't necessarily tell anyone about it. They just kind of expect that people know the standard and know to use it. So it's just a matter of putting in the thing: "We're supporting promotional header, with an expectation that it leads our descriptions." It doesn't have to be more complicated than that. "This code is being supplied." "We supply a short description for 3.0 so that you could have a main book page and a page with a short description on it. We make sure it's less than 350 words or characters so that it can fit." That type of thing, so that people know you're meeting standards, that you're providing the material, and then shipping it to them.
Because I suspect that retailers who had it pointed out to them that promotional header was being used might have gone looking for it by now. And the fact that they don't seem to have used it, and that we have to, like, shout about it before we can have any hope of it being used, just shows that no one has pointed it out. Now, people may say, "But it's your job, Tom, from BiblioShare to point out that we started using it," and they're probably right. BiblioShare could probably be doing a better job of trying to point out use and things like that. I mean, we do.
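The kind of sender's note Tom describes ("we supply a promotional header; we supply a short description that fits a display") corresponds to ONIX 3.0's CollateralDetail composite. A hedged sketch of what a sender might emit, assuming List 153 codes 10 (promotional headline) and 02 (short description) and List 154 code 00 (unrestricted audience); the headline and description text are invented examples:

```python
import xml.etree.ElementTree as ET

def collateral_detail(headline: str, short_desc: str) -> ET.Element:
    """Build a CollateralDetail carrying a promotional headline and a short description."""
    # Illustrative length cap, per the "make sure it fits" advice in the talk.
    assert len(short_desc) <= 350, "keep the short description display-friendly"
    cd = ET.Element("CollateralDetail")
    for text_type, text in (("10", headline), ("02", short_desc)):
        tc = ET.SubElement(cd, "TextContent")
        ET.SubElement(tc, "TextType").text = text_type    # List 153 code
        ET.SubElement(tc, "ContentAudience").text = "00"  # List 154: unrestricted
        ET.SubElement(tc, "Text").text = text
    return cd

cd = collateral_detail("Now a major motion picture",
                       "A short, shelf-ready description.")
print(ET.tostring(cd, encoding="unicode"))
```

The point of the accompanying sheet is that a retailer's IT department can be told exactly which TextType codes to expect, rather than discovering them by accident.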
Lauren: Yeah, I think there's something to that, like celebrating early adopters, celebrating adherence to best practices. It seems very anti-Canadian to overtly celebrate, but it's also very in line with what you've observed many times over the years: that so many of the small indie publishers are able to respond more nimbly to changes in standards and in best practices in a way that the larger ships can't steer so quickly around. And here's a big one to steer around, and I know that many people on this call have kind of been through the trenches, and their first question is gonna be: do they have to move over to 3.1 right away? And I don't want you to start thinking that you have to look into your crystal ball to see the future. But I think…
Tom: I want to just say it's not a problem. It's not a problem.
Lauren: Yes. Now, I guess what I'm getting at...
Tom: And I'm getting in touch with a few people who are having a small problem with it,
and it's a small problem, don't worry.
Lauren: Oh, and that's I think what we all got from that, that it seems relatively easy to manage. But what many of the people on this call, and even we at BookNet, dealt with was what seemed like a relatively rapid industry shift from 2.1 to 3.0, because there was a major push by a major industry player, and that seemed sudden. Do you think that we're in a situation like that with 3.0 to 3.1, where it is going to be a sudden push? Is there reason for panic? I'm hearing no. And I just wanna get some comparative sense, maybe, between that shift from 2.1 to 3.0 and this one from 3.0 to 3.1.
Tom: I mean, Graham Bell from EDitEUR was quite clear that the answer to this is: no problems. But just to list some minor issues that you might have to deal with: there's a tagline in your ONIX file that says, "Release 3.0, hello." It should move up to 3.1. Now, something I didn't realize when I went to modify BiblioShare, which again has schemas and things like this underlying it, is that it's just not a simple thing on our end. I need to talk to our developers; I need to get that 3.0 changed to 3.1. There's some subtle little things that will cause some very minor technical glitches, but nothing that is complicated.
Nothing that is difficult. And for most people, it literally is: you take your current thing, you go through, you make the 3.0 into 3.1, and you're done. Now, everyone should be downloading the information as it's updated. So if you go to EDitEUR's site right now, you can download the correct schema for 3.1. You should do that, but you might not want to implement it immediately, until you've modified these numbers. Because all of a sudden you might be loading information where the release numbers don't match what the schema expects. And guess what? The schema for 3.1 looks for 3.1, not for 3.0. That's what I mean.
There are some subtle little problems you can get into, but if anybody comes up with that type of thing, I can be a resource for that. They really are minor, and they really shouldn't be a problem for anyone. It's just that anytime you touch the XML, and this is an XML-level change, it can get a little fuzzy. So that's...
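For most senders, the "tagline" change Tom mentions is the release attribute on the ONIXMessage root element. A minimal sketch of that edit as a string-level pass (a real pipeline would also swap in the 3.1 schema before validating, as Tom notes):

```python
import re

def bump_release(onix_xml: str) -> str:
    """Move the ONIXMessage release attribute from 3.0 to 3.1, leaving the rest untouched."""
    return re.sub(r'(<ONIXMessage\b[^>]*\brelease=")3\.0(")',
                  r'\g<1>3.1\g<2>', onix_xml, count=1)

msg = '<ONIXMessage release="3.0"><Header/></ONIXMessage>'
print(bump_release(msg))  # the root now reads release="3.1"
```

This is exactly the mismatch trap described above: if the schema is updated before the files (or vice versa), the release numbers stop matching and validation fails for a trivial reason.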
Lauren: No, I think that's a fair answer. A very fair answer. We have another one from the audience. From this commenter's perspective, they write that deprecated elements need to be handled at a database level, since publishers don't always get to see how their data fields map to ONIX. Which, as the commenter says, means publishers don't know that what they're entering might end up being sent in a way that has been deprecated. And the question for you, Tom, in your wisdom, is: are database developers being asked the question about how to handle deprecation? And do you, Tom, have a sense of what they are saying?
Tom: I have literally no idea if developers are being asked. I mean, no. Okay, one of the reasons why this transition is not a problem is that literally anybody who implemented 3.0 up to now would've been using a manual, should have been using a manual, God help them, that was newer than version one. And version one is when most of these deprecations were added. Some of them were even added to version zero; they literally released version zero and deprecated some stuff almost immediately. So for anybody who's using this stuff, it was already labelled as deprecated. Now, when people were trying to handle 2.1 and 3.0 simultaneously for convenience, some of them added the deprecated component. In 2.1, they may have used the code for audience code, which was allowed, and they may have implemented it in 3.0 to make the transition easier for them.
So they may have created themselves a problem, but it was labelled as deprecated; they should have known what they were doing at the time. So there's not really much of a conversation you can have with the developer: if they're not using the manual and reading what's deprecated, well, someone should get them the manual and make them read it is the only thing I can kind of say. But I do think in general, going back to what I was saying about retailers, there's a real place, if the organization can handle it, for a local ONIX expert who's paying attention to the standard and the needs of the standard. I mean, you can't go to the schema and understand what a block update is. A block update is as much a philosophical position and a set of processing rules as it is something that is in the schema.
If you just look at the schema, it just confuses the developers, because a block update is represented by a very large container that surrounds sections and has no apparent purpose. The purpose is to support block updates, which happen as a processing thing that has to be built into your program. So everything works without them until you need to implement a block update, whereupon there's a set of rules that you have to follow to do it properly. It's no different from how you process an ONIX record now; there's a set of rules that go with it, like full replacement as a concept. There are some rules that come with processing ONIX that just exist outside of the XML. Somebody needs to be reading some of that stuff, the manual. Developers need support in doing that. Somebody should be helping them understand that.
Again, BookNet is there as a resource for this. It's kind of our job to help with this, but it's
really hard to answer that question better than that.
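The processing rule Tom is describing, that a block update replaces whole blocks of the stored record rather than patching individual fields, can be sketched as a simple merge. The block names follow ONIX 3.0's grouping; the dict-of-fragments record store is a hypothetical simplification for illustration:

```python
# ONIX 3.0 groups a Product's content into named blocks; a block update
# message carries only the blocks that changed, and the recipient swaps
# each supplied block in wholesale, keeping the rest of the stored record.
BLOCKS = ["DescriptiveDetail", "CollateralDetail", "ContentDetail",
          "PublishingDetail", "RelatedMaterial", "ProductSupply"]

def apply_block_update(stored: dict, update: dict) -> dict:
    """Replace only the blocks present in the update; untouched blocks survive."""
    merged = dict(stored)
    for block, content in update.items():
        assert block in BLOCKS, f"not a recognized block: {block}"
        merged[block] = content  # whole-block replacement, no field-level patching
    return merged

stored = {"DescriptiveDetail": "<old title info/>", "ProductSupply": "<old price/>"}
update = {"ProductSupply": "<new price/>"}
merged = apply_block_update(stored, update)
print(merged["ProductSupply"])      # "<new price/>"
print(merged["DescriptiveDetail"])  # unchanged
```

None of this is visible in the schema itself, which is exactly Tom's point: the container structure only makes sense once the processing rule is understood from the manual.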
Lauren: No, I think what you've gotten at there is that, ideally, if you've purchased a publishing management system that is creating ONIX for you at your house, you should have trust that there is someone at the software developer who knows how the standard works and is able to respond through software to the standard's needs as it evolves. But then you also mentioned having a local expert in the standard. And what I took from that, and what you've been saying throughout, is that BookNet is there for that. If you are an emerging press and you don't have that knowledge in-house, literally this is Tom's area of expertise. He is your local expert for the Canadian market. So please do be in touch.
That actually looks like our time now. I'm gonna wrap up with the last kind of snippet of wisdom that Tom left near the end of his slides: big problems are solved by participation and sharing of information. And I think that's what we're asking of all of you who have assembled here today. You have been left with a lot of information, both from Tom and in the chat, in terms of how to engage with the standards as well as how to engage with us at BookNet. We are your standards organisation. Not only do we help this marketplace implement international standards, we also represent the interests of this market on an international stage.
So we wanna work with you, we wanna hear from you, and we are so appreciative of your
participation today, your attendance, as well as the great questions that you've provided. And
we look forward to working alongside you and working in solidarity with one another. So
Tom, thank you again for everything. Officially, I'd like to wrap up the session today. We
would love it if you could provide feedback on the session. We'll be sending you an email.
We do read every response, so please let us know how this went for you and provide us any
information that you can, and what you'd like to see from future sessions or what perhaps
was missed today. And you'll also get a recording of this session as soon as it's available.
And finally, of course, we would like to thank the Department of Canadian Heritage for their
support through the Canada Book Fund and all of you for attending. Thanks again, and again,
please remember that big problems are solved by participation and sharing of information.
That is your wisdom nugget from Tom Richardson today to send you off into the afternoon.
Thank you, everyone. Have a wonderful day.

The details of description: Techniques, tips, and tangents on alternative tex...The details of description: Techniques, tips, and tangents on alternative tex...
The details of description: Techniques, tips, and tangents on alternative tex...
BookNet Canada
 

More from BookNet Canada (20)

Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...Transcript: Selling digital books in 2024: Insights from industry leaders - T...
Transcript: Selling digital books in 2024: Insights from industry leaders - T...
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
New from BookNet Canada for 2024: BNC CataList - Tech Forum 2024
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Transcript: New from BookNet Canada for 2024: BNC SalesData and LibraryData -...
Transcript: New from BookNet Canada for 2024: BNC SalesData and LibraryData -...Transcript: New from BookNet Canada for 2024: BNC SalesData and LibraryData -...
Transcript: New from BookNet Canada for 2024: BNC SalesData and LibraryData -...
 
Transcript: Green paths: Learning from publishers’ sustainability journeys - ...
Transcript: Green paths: Learning from publishers’ sustainability journeys - ...Transcript: Green paths: Learning from publishers’ sustainability journeys - ...
Transcript: Green paths: Learning from publishers’ sustainability journeys - ...
 
Green paths: Learning from publishers’ sustainability journeys - Tech Forum 2024
Green paths: Learning from publishers’ sustainability journeys - Tech Forum 2024Green paths: Learning from publishers’ sustainability journeys - Tech Forum 2024
Green paths: Learning from publishers’ sustainability journeys - Tech Forum 2024
 
Transcript: Book industry state of the nation 2024 - Tech Forum 2024
Transcript: Book industry state of the nation 2024 - Tech Forum 2024Transcript: Book industry state of the nation 2024 - Tech Forum 2024
Transcript: Book industry state of the nation 2024 - Tech Forum 2024
 
Book industry state of the nation 2024 - Tech Forum 2024
Book industry state of the nation 2024 - Tech Forum 2024Book industry state of the nation 2024 - Tech Forum 2024
Book industry state of the nation 2024 - Tech Forum 2024
 
Trending now: Book subjects on the move in the Canadian market - Tech Forum 2024
Trending now: Book subjects on the move in the Canadian market - Tech Forum 2024Trending now: Book subjects on the move in the Canadian market - Tech Forum 2024
Trending now: Book subjects on the move in the Canadian market - Tech Forum 2024
 
Transcript: Trending now: Book subjects on the move in the Canadian market - ...
Transcript: Trending now: Book subjects on the move in the Canadian market - ...Transcript: Trending now: Book subjects on the move in the Canadian market - ...
Transcript: Trending now: Book subjects on the move in the Canadian market - ...
 
Transcript: New stores, new views: Booksellers adapting engaging and thriving...
Transcript: New stores, new views: Booksellers adapting engaging and thriving...Transcript: New stores, new views: Booksellers adapting engaging and thriving...
Transcript: New stores, new views: Booksellers adapting engaging and thriving...
 
Show and tell: What’s in your tech stack? - Tech Forum 2023
Show and tell: What’s in your tech stack? - Tech Forum 2023Show and tell: What’s in your tech stack? - Tech Forum 2023
Show and tell: What’s in your tech stack? - Tech Forum 2023
 
Transcript: Show and tell: What’s in your tech stack? - Tech Forum 2023
Transcript: Show and tell: What’s in your tech stack? - Tech Forum 2023Transcript: Show and tell: What’s in your tech stack? - Tech Forum 2023
Transcript: Show and tell: What’s in your tech stack? - Tech Forum 2023
 
Redefining the book supply chain: A glimpse into the future - Tech Forum 2023
Redefining the book supply chain: A glimpse into the future - Tech Forum 2023Redefining the book supply chain: A glimpse into the future - Tech Forum 2023
Redefining the book supply chain: A glimpse into the future - Tech Forum 2023
 
Transcript: The Details of Description Techniques tips and tangents on altern...
Transcript: The Details of Description Techniques tips and tangents on altern...Transcript: The Details of Description Techniques tips and tangents on altern...
Transcript: The Details of Description Techniques tips and tangents on altern...
 
The details of description: Techniques, tips, and tangents on alternative tex...
The details of description: Techniques, tips, and tangents on alternative tex...The details of description: Techniques, tips, and tangents on alternative tex...
The details of description: Techniques, tips, and tangents on alternative tex...
 

Transcript: #StandardsGoals for 2023 Standards & certification roundup - Tech Forum 2023

  • 1. Lauren: Hi, everyone. Thank you for joining us for today's Tech Forum session. I'm Lauren Stewart, the operations director at BookNet. Welcome to the #StandardsGoals for 2023 Standards and Certification roundup.

Before we get started, BookNet Canada acknowledges that its operations are remote and our colleagues contribute their work from the traditional territories of the Mississaugas of the Credit, Anishinaabe, Haudenosaunee, Wendat, and Mi’kmaq Peoples, the original nations of the lands we now call Beeton, Brampton, Guelph, Halifax, Toronto, and Vaughan. We encourage you to visit the native-land.ca website to learn more about the peoples whose land you are joining from today. Moreover, BookNet endorses the calls to action from the Truth and Reconciliation Commission of Canada and supports an ongoing shift from gatekeeping to space-making in the book industry.

The book industry has long been an industry of gatekeeping. Anyone who works at any stage of the book supply chain carries a responsibility to serve readers by publishing, promoting, and supplying works that represent the wide extent of human experiences and identities in all that complicated intersectionality. We at BookNet are committed to working with our partners in the industry as we move towards a framework that supports space-making, which ensures that marginalized creators and professionals all have the opportunity to contribute, work, and lead.

For our webinar today, if you're having difficulties with Zoom or have any tech-related questions, please put your questions in the chat or you can email us at techforum@booknetcanada.ca. As you can tell, we're providing live ASL and closed captioning for this session. To see the captions, please find the show subtitle button in the Zoom menu, normally located at the bottom of your screen. If during the presentation you have questions for us, please use the Q&A panel found in the bottom menu. Lastly, we'd like to remind all attendees of our code of conduct. 
Please do be kind, be inclusive, be respectful of others, including of their privacy. Be aware of your words and your actions, and please report any violations to us at techforum@booknetcanada.ca. Lastly, we'd like to... Oops, sorry. And then also do not harass speakers, hosts, or attendees or record these sessions. We have a zero-tolerance policy. You can find our entire code of conduct at bnctechforum.ca/code-of-conduct.

Now let me introduce our speaker and my colleague, Tom Richardson. Tom is BookNet's bibliographic manager and friend to all ONIX producers and users, building on an illustrious career in Canadian publishing and metadata management. In 2016, the Book Industry Study Group awarded him their distinguished service award for his work on standards and best practices. He's someone I learn from every day, and I'm thrilled to introduce him today. Tom, please take it away.

Tom: Well, I guess I should just get right in. The first thing I'd like to do is answer what's becoming quite a regular question: what's with BiblioShare and ONIX 3.0? And it's simply to say, "Mea culpa, the fault is ours." But I'll explain: BiblioShare intentionally stores ONIX as ONIX. The two versions are different and stored as distinct instances. Now, a retail database, a typical data aggregator, would map the ONIX data to their central dataset. So the question of which version is a question of creating new programming to map the same data
  • 2. points from ONIX 3.0, and then just having it delivered thereafter. I mean, don't misunderstand me, it is not that simple, but it does mean most companies can just do a switch and use one over the other, and the new mapping meets their needs and the publisher's needs or it doesn't.

With this slide, I've tried to illustrate that what the retailer may not know is the full extent of the data available in the feeds. Now, they know what they program for, but not always what's being sent. So BiblioShare's data aggregation is actually both simpler and more complicated, and this illustration is awfully idealized. But we do know what's in the ONIX file, because we load all of it. And as you can see here, that is actually simpler in its way. A retailer needs to create actionable information from the metadata. BiblioShare preserves your data throughout, knowing everything sent is an asset in our work, and we help you fix and improve your data, and in turn, that improves the quality of the flow through BiblioShare.

Now, that simplicity helps support our APIs: getting data fully standardized means manipulating it, and that makes it expensive, but BiblioShare can be exceptionally low cost. Retailers can also benefit from this, because we can help them understand the scope of the data available. We can do that now for either the ONIX 2.1 or the 3.0 data that we have. And we have begun ensuring that the introduction of one of today's topics, ONIX 3.0, is as dull an event as we can possibly make it.

Now, this is actually an intentionally misleading slide. The problem we are having is not with handling ONIX 2.1 and 3.0 together; that actually works fine. We just keep them separate. Rather, it's that we still have five 2.1 ISBNs for every 3.0 one that we have. We don't have enough 3.0 data to service our clients through the APIs, but trying to process the 3.0 in volume has also run into problems. Our transition has turned unexpectedly awkward. 
Every company has its own issues and systems, and ours is being solved. And once it is, soon, the next issue for us is that the 2.1 data that we serve is often merged from two sources, and we may or may not have the same two sources available for 3.0. So will the new metadata we serve be comparable? We have products depending on it. We have to move a little more carefully, account to account, and that makes the transition a little more thoughtful and slow in the changeover once it's begun. We are sorry for the delay, and I can only ask you for your indulgence in continuing to support the dual feeds into BiblioShare. Please get in touch if it's a problem, and we'll be in touch soon about the 3.0 data as we move forward. Mea culpa.

So BookNet's sister organization in standards, BISG, the Book Industry Study Group, has produced an excellent statement decrying abuse of book titles in metadata. Now, standards aficionados may recall a similar statement came out several years back from our UK-based sister, B-I-C, BIC. And I'll be asking our own bibliographic committee to endorse this new one just the way we endorsed the previous one. Please read this and distribute BISG's statement in your company. It's available at the URL here, but if you just go to bisg.org and find the metadata committee, it's there as a link. In the context of this webinar, it's a great introduction to where we are now in standards.

Now, this is a biased paraphrase; BISG is both politer and more specific. But my takeaway is this: not following the definitions provided in the standard defeats the purpose of having a standard. Sender and receiver need to be able to predict what the
  • 3. metadata contains. Who knew? Senders need to create metadata before retail use can happen. Retailers need to use the data they are sent following its named purpose, including, and this is an underlying theme of this update, adopting new data in a timely fashion. But senders need to appreciate that, assuming we can get retailers obligingly following reasonable advice, they too would need senders to follow the standard and maintain the data properly, in this case title, and not attempt to circumvent perceived problems by gamifying the metadata to achieve some goal, generally based on the current data habits of one or maybe two players who may very well not follow the same rules for metadata ingestion in every market they serve. It's not a game anyone can win.

Now, interestingly, applying the standard implies understanding the goals of the standard. And there is a big change that has happened, not that it amounts to much yet. No worries, nothing that much has changed that affects your metadata, but combining the BISG statement and EDItEUR's announcement of ONIX 3.1 suggests we have to admit that standards will change, and agree on how to communicate so that the implementation between sender and retailer actually occurs in a timely manner. No one should be proud of the transition from 2.1 to 3.0. Twelve years is way too long, and obviously, I can't claim to have a solution when we're now trying to stretch that out to 13 years. I can also ask both retailers and data senders to recognize it creates a cost, in the same way I could also admit that BookNet's data-sending clients are clear that we are becoming one. Still, in my opinion, it is cheaper to expect and plan for change.

So let's move on to what's changed. Introducing ONIX 3.1, and why isn't it ONIX 3.0.9? This is as direct a quote from EDItEUR's latest specification, in their introduction to release 3.1, as I can make it: this new release of ONIX for Books is characterized by two key changes. 
First, the removal of a handful of elements from release 3.0 that have been deprecated for up to a decade, and second, the addition of a range of new data elements, which I'll cover in a moment. It's ONIX 3.1 because it's no longer fully backwards compatible with 3.0. A full list of all deprecated elements can be found in an appendix in the specification, appendix four, the list of elements removed from release 3.1.

I'm not gonna try and cover all the details of the deprecations here. It's not a problem, because there's almost no use of what's been removed. Mostly, just be sure you update your documentation, and, this is what may be considered to be new, stop (or don't start) using anything that's labelled as deprecated in the new documentation. That is how upcoming removals from ONIX are marked. All of which is to say ONIX for Books version 3 is considered stable enough that it has to support changes that affect version compatibility. And that's good news, because ONIX data senders and receivers need a stable standard. It's cheaper. But in addition to supporting new data with each version update, they should consider the need to plan for change based on the standard's requirements if they want to remain within it.

So to expand on that a little bit: maintaining valid ONIX means you actually have to plan for the deprecation of data. ONIX 2.1 preserved deprecated bits for backward compatibility. ONIX 3.0 famously broke that, and up to now, ONIX 3.0 maintained full backward compatibility within its version. Now, the version three series isn't going anywhere, not soon at any rate. There is no 4.0 in the works on anyone's horizon. So after more than 12 years, it's stable enough that we actually need to get some cleanup in.
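Planning for deprecations can start with something as simple as auditing outgoing files against the removal list. Here is a minimal, hypothetical sketch of that idea, using a handful of element names mentioned in this talk; the authoritative list is appendix four of EDItEUR's 3.1 specification, and a real checker would also need to handle the ONIX namespace and short-tag variants.

```python
# Sketch: flag ONIX 3.0 elements that release 3.1 removes or newly deprecates.
# The element lists below are illustrative examples from this talk, not the
# complete removal list -- consult appendix four of the 3.1 specification.
import xml.etree.ElementTree as ET

REMOVED_IN_3_1 = {"AudienceCode", "DateFormat"}          # removed outright
DEPRECATED_IN_3_1 = {"TitleText", "DefaultLanguageOfText",
                     "DefaultPriceType", "DefaultCurrencyCode"}

def audit_onix(xml_text: str) -> dict:
    """Return the removed and deprecated tag names found in an ONIX fragment."""
    root = ET.fromstring(xml_text)
    tags = {el.tag for el in root.iter()}
    return {"removed": sorted(tags & REMOVED_IN_3_1),
            "deprecated": sorted(tags & DEPRECATED_IN_3_1)}

sample = """<Product>
  <DescriptiveDetail>
    <AudienceCode>01</AudienceCode>
    <TitleDetail><TitleElement><TitleText>Example</TitleText></TitleElement></TitleDetail>
  </DescriptiveDetail>
</Product>"""

report = audit_onix(sample)
```

Run against the sample fragment, the report flags `AudienceCode` as removed and `TitleText` as deprecated, which is exactly the kind of "update your documentation and stop using it" warning a sender can act on before the element actually disappears from the standard.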
  • 4. Now, EDItEUR is seeking input about this, about how they remove deprecations. If 12 years is clearly too long, and if version updates around every second year seem too short, how often and how should they manage this kind of change? Contact information will be at the end, but get in touch with your thoughts on that, and I'll come back to this a little bit later.

On deprecations, BiblioShare has your back. We haven't made the transition to ONIX 3.1 quite yet, but we will soon. Before we do, we're monitoring BiblioShare's citations for issues, and I've started to notify clients if I find any. I am pleased to say and confirm it is not a problem for most companies, but the next are the most likely issues. This one is associated with CoreSource feeds; largely, they're coming in from eBOUND. Now, I've gotten in touch with eBOUND, and I hope it'll disappear without anyone else being involved.

So the most common use of newly removed data in BiblioShare's ONIX 3.0 archive is dual data: providing both the deprecated and the expected presentation. So this is an audience code supplied as an element alongside an audience composite supplied with internal coding for the same data bit. The bit in red is the deprecated audience code. Now, could I just be clear: it may exist in our data because eBOUND supplies ONIX 2.1 feeds to BiblioShare, and that is exactly the sort of problem that can occur if you try to support conversion. By continuing to need ONIX 2.1, we may be the author of the problem we're receiving.

The other common problem affects the various date composites, of which there are many, and which have always, up to now, included a deprecated option that will now be invalid with 3.1. So here's an example date composite. This is a birth year, coded to show the date format is atypical. So in this case, coded for birth date, you're only giving a year. So 1935, and the date format for that is 05. What's in red above is the deprecated version. 
You should never use date format as an element. It should appear as an attribute within the date tag, as in blue below. The main reason this one is worth noting is because dates in ONIX default to the eight-digit year, month, day, and default dates don't require the use of the date format. It's important that you provide a date format code anytime you provide a date that isn't the default value. And some dates require more detail and use minutes and seconds for effective use. And depending on your software source, particularly if you're using in-house created software, supporting attributes may or may not have been built in.

The idea here is that data receivers should be able to know what they are processing based on the attribute, rather than loading the data, reading the data, and deciding whether the data is in the default formatting, all of which is kind of wasteful processing time. You warn them that you're not giving them the usual date format, so they expect it to be in an attribute, and they can process it efficiently. So if you find yourself adding an atypical date, it might be worth checking. And I'm actually confident most of you will find you're using the attribute now.

Now, I have found one client using a value removed from the market publishing detail composite, but I think it's related to conversion problems from 2.1, and probably misunderstanding how to use the market composite in ONIX 3.0. Anyway, promotion contact, the first one on the list here, is deprecated; use product contact. And there are some others that are detailed here and in the appendix. But the final removal worth noting is that gender in the contributor composite has been removed, and there is no equivalent to it in release 3.1. So to
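The red and blue slide examples described above don't survive in the transcript, so here is a reconstruction of the two dual-data patterns with illustrative values (audience code type 01, contributor date role 50 for date of birth, and date format 05 for a year-only date, as in the talk's 1935 example):

```xml
<!-- Deprecated (the "red" form): audience supplied twice,
     as a bare element and again as the expected composite -->
<AudienceCode>01</AudienceCode>
<Audience>
  <AudienceCodeType>01</AudienceCodeType>
  <AudienceCodeValue>01</AudienceCodeValue>
</Audience>

<!-- Deprecated (red): DateFormat carried as a separate element -->
<ContributorDate>
  <ContributorDateRole>50</ContributorDateRole>
  <DateFormat>05</DateFormat>
  <Date>1935</Date>
</ContributorDate>

<!-- Expected (the "blue" form): the format carried as an attribute,
     so receivers know before parsing that the date is year-only -->
<ContributorDate>
  <ContributorDateRole>50</ContributorDateRole>
  <Date dateformat="05">1935</Date>
</ContributorDate>
```

Note that a default eight-digit YYYYMMDD date needs no `dateformat` attribute at all; the attribute is only required when, as here, the date deviates from that default.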
explain: get in touch if that's a concern, but there has been almost no use of it. The gender element was never designed to support use as an identity marker. It was included to support using ONIX for submitting to ISNI, the International Standard Name Identifier, which first asked for a binary male, female, or other code to help with identification of content producers. They found after a decade that it wasn't a help for their uses, so they dropped it from their request; therefore, ONIX has dropped it from the standard. The short answer for identity concerns is that ONIX has never provided coding for that purpose, so its removal isn't a problem. I'll touch on this a bit at the end, but what's missing from diversity support is an ongoing question. Now, the following is important, because there are newly deprecated data points in ONIX 3.1. These are still part of 3.1, but they are now labelled as deprecated. And that means they will someday, likely much sooner than 12 years from now, be removed from the standard in the same way the previous data elements were just removed from 3.1. So, newly deprecated: all the default values in the header composite, that is, language of text, price type, and currency. Now, I do see these in headers, but I don't know the use case for them. I mean, for over 20 years the request has been to put the value in the record, and everyone does that. I'm not aware of anyone expecting end users to insert values based on defaults found in the header, or of any end user looking to fill in missing data based on a supplied default. I can't actually see it working in most people's feeds. I think its use is simply tradition: IT departments include them because they have in the past and they can now, but they need to stop doing it, because these will be removed from version 3.1 at some future date. Now, the other deprecation requiring some minor work is a little more serious.
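For reference, those newly deprecated header defaults look like this; a sketch using ONIX reference tags, with illustrative values:

```xml
<Header>
  <Sender>
    <SenderName>Example Publisher</SenderName> <!-- placeholder name -->
  </Sender>
  <SentDateTime>20230315</SentDateTime>
  <!-- All three of these are now deprecated in ONIX 3.1: -->
  <DefaultLanguageOfText>eng</DefaultLanguageOfText>
  <DefaultPriceType>01</DefaultPriceType>       <!-- List 58: 01 = RRP excluding tax -->
  <DefaultCurrencyCode>CAD</DefaultCurrencyCode>
</Header>
```

Putting the language, price type, and currency into each product record, as almost everyone already does, makes those three header lines unnecessary.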
It forces an improvement made to the design of ONIX 3.0, and ends an ONIX 2.1 carryover. Title text has been deprecated in the title element composite, which is used in both P.5 collection and P.6 product title. So instead of using title text, what you do is use title without prefix in all entries, period. It's either no prefix plus title without prefix, or a title prefix plus title without prefix. In ONIX 2.1, title text made some sense, in that some, but not all, senders supplied all titles in title text and supplemented it with title with and without prefix as needed. Anyway, all of this makes the question posed earlier, and repeated here, a very, very real one. There is some use of default values in headers. There is a better, and simpler, way to handle titles, yet there is continuing use of that 2.1 carryover alongside the header problem. Removing them is a task, and doing it sometime in the next two to four years seems quite a reasonable request. If they are to be removed by that time, and they are then removed from the standard, then technically a file including them is no longer valid as ONIX. That should be important, because supplying ONIX that is valid as XML, measured against a published schema, should be a best practice goal for everyone. We have failed at this before. So how should this sort of change be managed? Twelve years is too long; never doesn't make sense. Change in metadata is normal over long periods. How do we ensure both senders and receivers update their systems in a timely way? Contact information is at the end. Let us know what you think. Because this is a version change, I have to just remind everyone about what a version is. Hopefully everyone knows this; I just feel I need to say it. Version changes are when new
composites and elements are added to support new data and new functionality. That typically happens every year and a half to two years. The current version is ONIX 3.1. Issue changes are when new codes are added to existing code lists to support new data points. That typically happens four times a year. The current issue is 61. There is what might be considered a mini transition to ONIX 3.1, as cleaning out any deprecated elements may take a little extra time. So it might be sensible to delay implementing 3.1 for just a little bit, but we are talking, like, a couple of months; it shouldn't be any longer than that. No, there are no problems. It's mostly good. So this is just a reminder: update your ONIX specification at least yearly. It has minor corrections and tweaks made regularly. Always update it after a version change, and we just had one. Absolutely update it when there are newly deprecated data points. Understand how your software updates and code list issue changes happen. Let me rephrase that, because I said it so badly: understand how your software updates and code list issue changes occur and are implemented in your systems. You need to ensure that staff using the code lists have access to full versions that include the explanatory notes. All right, so continuing on. Use version updates to review ONIX for metadata that might benefit your business. ONIX 3.0 has had nine version updates since 2009, and they are summarized in the specification's introduction. It's a very useful list; I've just reviewed it, and it's full of lots of things no one has implemented. Now, if you do that and you see things you're interested in, ask BookNet for help. If you see potential targets for use, ask us if we see use of them in BiblioShare. We can confirm whether they exist and whether people are using them. If you're unsure how to implement them, ask.
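To circle back and make the title deprecation concrete, here is a sketch of the deprecated pattern and the two valid 3.1 patterns, in reference tags; the element ordering and the empty no-prefix flag reflect my reading of standard practice rather than the slide itself:

```xml
<!-- Deprecated in 3.1: the 2.1 carryover -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>   <!-- List 149: 01 = product level -->
  <TitleText>The Example Title</TitleText>
</TitleElement>

<!-- Use instead: title without prefix, on its own... -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>
  <NoPrefix/>
  <TitleWithoutPrefix>Example Title</TitleWithoutPrefix>
</TitleElement>

<!-- ...or title prefix plus title without prefix -->
<TitleElement>
  <TitleElementLevel>01</TitleElementLevel>
  <TitlePrefix>The</TitlePrefix>
  <TitleWithoutPrefix>Example Title</TitleWithoutPrefix>
</TitleElement>
```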
We're pleased to help with that sort of thing. If you're unsure of the purpose of any metadata structure or code that you see or are using, ask. If your business has something to say, a communication need that metadata might help with, ask. But this is the important message I'd like to leave: tell your trading partners and BookNet if you add or need additional metadata support. Don't rely on them finding your changes. Don't rely on them guessing what your needs are. So let's move on to the next great topic: what's new in ONIX 3.1? They have added sequence number to allow explicit prioritization of block 2 collateral material. That's added in all four sections, including prizes. This answers the need to provide a priority to, say, a sequence of reviews. So if a receiver can only fit three reviews from your list of 18, they can fit the first three in a list you control. People have been asking for this forever, so this is well worth implementing, and we expect to see use; of everything in this list, it's the one we expect people to want and to be using. Having said that, they have also added awarding body within prize. Canadian publishers love their prize information, and they love doing it right. This is an XHTML-enabled field that gives a dedicated space for the sponsoring organization, which should thrill everybody. The third one is the other particularly important one: they have added a market reference to enable market-specific partial block updates. This is actually the other reason why it's version 3.1 as a change; it is a major change to ONIX. If you want to practically support block updates, the most likely block you want to update is block 6. And it is unique, because it can repeat to represent different markets. Now, block 6 is
the heart of the ONIX standard. But when you do a block update, you probably aren't sending the same block 6 to every data receiver. So how do you then update block 6? It gets kind of complicated. As it would work now, you'd have to supply all the repeats of block 6 together so the receiver can reload them all, and that's not very sensible. However, when you add a market reference, you then have a unique identifier that your trading partners can use to identify the one repeat they need to update. There's no appreciable use of block updates in Canada yet; when we start, this will be a great help. There are two new things in collections. They have added collection frequency to carry the frequency of publication of products in a collection. And they've also added the ability to support identifiers at the collection element level, which allows you to cross-reference collection title elements to identifiers. The first one would be used for serialized products. I mean, listing comic books would be a simple example: you can now put a frequency code into the system. That should be very useful, and we'll probably see use. The identifier one really only makes sense when you have a collection and a sub-collection, so I don't actually expect to see much use of it, but, you know, it's a very useful thing, because identifiers are useful. They have also added an affiliation identifier composite within professional affiliation. That's in the contributor composite, professional affiliation, heavily used by university presses. Again, university presses would mainly care about this next one, which concerns open access, which allows people to provide products for free.
It's usually done by university presses for academic-type publishing. They have extended some elements in block 3's content item composite to support hybrid open access, which allows you to have paid pricing and unpaid pricing at a chapter level within one record. Complete genius if you need it; don't expect many do. The other changes are just sundry: every version includes some small tweaks to consolidate metadata support for consistency and future-proofing. There was one other addition that I thought was worth highlighting. Newly added in 2023, in the issue 61 code list update, to List 153, everyone's favourite, text type: code 37, cover line. The definition, as you would find in the notes, flags it as U.S. terminology, also known as a reading line: a line of usually explanatory copy, somewhat like a subtitle, but not on the title page, added by the publisher, often on the cover. So, cover line, reading line, something like that. A sample would be "With 125 illustrated recipes." Now, this addition ties neatly back to the BISG statement on title. That statement advocates for leaving the book title metadata alone to be what it's supposed to be, something that matches the title of the book, the title page of the book specifically, and advocates for retailers to support and publishers to use another List 153 code, added back in 2009 in the issue 10 code list update and supported by ONIX 2.1, 3.0, and 3.1: List 153, text type, code 10, promotional headline. A promotional phrase which is intended to headline a description of the product.
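A sketch of how these block 2 text types might be supplied, combining the new 3.1 sequence number with code 37 and code 10; the sample text, the placement of the sequence number, and the content audience value are illustrative assumptions:

```xml
<!-- A prioritized review quote: SequenceNumber is the 3.1 addition -->
<TextContent>
  <SequenceNumber>1</SequenceNumber>
  <TextType>06</TextType>                <!-- List 153: 06 = review quote -->
  <ContentAudience>00</ContentAudience>  <!-- List 154: 00 = unrestricted -->
  <Text>"A triumph." (Sample Review Source)</Text>
</TextContent>

<!-- Code 37, cover line: the short explanatory copy from the cover -->
<TextContent>
  <TextType>37</TextType>
  <ContentAudience>00</ContentAudience>
  <Text>With 125 illustrated recipes</Text>
</TextContent>

<!-- Code 10, promotional headline: intended to headline the description -->
<TextContent>
  <TextType>10</TextType>
  <ContentAudience>00</ContentAudience>
  <Text>The only baking book you will ever need</Text>
</TextContent>
```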
Now, the BISG statement suggests that publishers feel they need to use book title for marketing messages because it's one of the few fields they can reliably find displayed and indexed. Outside of CataList, as far as I can see, promotional headline, if it appears at all, appears as another random piece of text buried on a page of, well, metadata blat. Hopefully we'll get some traction on the sort of support that BNC CataList supplies; they cunningly match the definition supplied here. Now, cover line is a great addition. It creates a field that adds to the book title, a place for that little bit of extra explanation not found in the title proper. And it should be displayed for that purpose; hence why it is in text type. It's a display component, and it needs to be displayed for that purpose. Note also, it's a U.S. addition, and I think that means it's likely gonna be used by Penguin Random House, but I can't confirm that. Anyway, retailers should take note. Up to now, I probably would've advocated for supplying that information in either, I don't know, product form description or illustration description, but that's only because there wasn't a better place. This is the better place, and it's better because it's there for promotional purposes, while the other two are typically displayed only in the fine print, along, you know, with that little block of page numbers and that type of thing. So promotional headline should work exactly how it sounds: for display, prominently. But what we need in book metadata is transparency by all players. It starts with publishers, in this case, supporting the standard's offerings and being transparent by telling retailers about the tools they supply to support their sales. Okay.
You can't make them use them, but if you're looking at the metadata results you had with one very large retailer and modifying your data based on what you see, you're gonna very quickly wind up doing funny things with your book titles. Experience has shown this, and it leads to statements like the one BISG has made, and BIC before it, and me whining, "You're basing your feeds on experiments with a black box that's designed to reward the retailer." No one can blame them for that, and short-term rewards are possible for the publishers doing it. Who knows? Now, retailers need to look more at the data they get. Having their IT department tell them what is in the data will include, for one thing, all the results of those publisher experiments. Take them however you will. So everyone does have to rely on analysis of the data to tell them what to do. But surely, in 13 years of availability, how is it that somehow no North American retailer has asked for or supported promotional headline? That has to show you that there is a problem. If you're going to design a page for display, at least make sure you support the elements designed to be displayed on it. That's yesterday's news. A new tool has been added to the standard; it's called cover line. Let's start supporting it in less time than it's taken promotional headline. And it starts with you being transparent and communicating with your trading partners. BiblioShare can help with that, as I've tried to explain earlier, and so can BookNet Canada. So I'm just about finished. We want you to participate in standards, and here are the ones we are most closely involved in. On the left are some standards schemes we are typically associated with, and on the right, the organizations. So EDI is Electronic Data Interchange. Anyway, it's still the heartbeat of distributor-to-retailer information exchange. We do it.
Subject schemes: we are particularly involved with, and have staff involved in, BISAC subjects and Thema subjects. We closely watch identifiers, all of them, all the time. We monitor ISO and W3C, GDSN and SAN stuff. We have staff members on BISG committees: metadata, subjects, and supply chain in particular. Identifiers doesn't have a committee, and workflow is more related to digital than we... Anyway, we monitor it closely, but we're not directly involved at the moment. We can be. We have staff involved in the ONIX International Steering Committee and the Thema International Steering Committee. We work closely with BTLF, our Quebec partner. Now, the BNC Bibliographic Committee is basically the group that we run, and it is made up of people from the Canadian industry. It is the Canadian English-language national group for ONIX and for Thema. It meets three or four times a year. It should be four times a year, more if we could get it. But it is definitely a venue for participation. We regularly discuss what is upcoming in ONIX, and we've tried the same for Thema. If you have needs, wants, desires, it is the group that would talk about them and agree to put them forward. We want you to participate. We want you to be involved in these things. They are all changeable. If you need a code in ONIX, you can get a code in ONIX. It's not that difficult, but you do have to participate, you do have to let people know, and other people have to agree with you. So that, I think, gets me down to my very last slide. And I will simply observe that this is identical to the slide I provided last year as part of my presentation at this time. So I am not trying to say that diversity, equity, inclusion, identity, Indigeneity, and justice for book subjects and author promotion hasn't changed, or hasn't been moving. It's moved quite a bit actually since then.
In particular, Lauren Stewart has been doing a great deal of work that is extremely valuable in BISAC, which has gotten some new stuff in. We have some ongoing discussions, and a couple of working groups and things like that we're involved in. But it is a hard row to hoe. Getting it right is as hard this year as it was last year. The reference to OwnVoices, which was, you know, one of the additions that became problematic, still remains what it is. We want to have more of this stuff, but it is something that we are working on, and that needs as much participation by members of the community as we can get. And that ends this part of it, completely on time, when I expected to be 40 minutes in. Lauren: Thank you, Tom. We'll give you a chance to drink some water and relax after that. So helpful. You mentioned several times during your presentation that you would add contact information, and that is now on the screen for anyone who needs it. And I trust that my colleagues will likely put another contact link in the chat. So let's get into some Q&A. We've got some great questions from the audience, and feel free to add more as we're chatting. But just to get you started, Tom: I've seen you do this presentation for at least the 12 years that I've known you. And maybe it's because I've got all these TV show reunions on my mind today, but I have to ask you a classic reality TV question: what is one rose and one thorn from this past year in standards? So for those of you who are not familiar with reality TV show reunions, on one particular network, a rose is something that was successful, like an accomplishment, a highlight,
something positive that came out of this past year. And a thorn is the opposite: something that was challenging, a struggle, or something that perhaps needs another go. So, Tom, what are your rose and thorn for the past year in standards? Tom: Well, standards and roses don't really go together. I mean, slow change is actually what marks standards. There are three things I would highlight as roses. Thema is one of them. It is slowly increasing in use, and there's increased interest in it from the United States. It is easily still the best thing we have available for genuinely improving metadata that I am aware of, and people who are paying attention to it, I think, like it. We have a presentation coming up on June 8th, which I'd just like to mention because it's an important one, I think, where Karen Smith, the organizer of the website for Blackwell's in the UK, which is a Thema-based website, will speak. Just go over and look at Blackwell's. Go on over and look at the way you could set up search and discovery for books using better metadata. It is wonderful, and Karen will talk about it. That's a real highlight to my mind. The other one is ISNI, which is seeing actual and increasing use. Major publishers are planning to start introducing it if they haven't already, U.S. ones included; actual use is happening. That means we have a second consistent, persistent identifier. And people who have followed Tech Forum for years know how much we love to talk about persistent identifiers. We haven't had a chance to recently, but we keep trying to. This is the first persistent identifier that really has a prospect of changing publishing since the ISBN. So that's exciting; that's a real rose. The other thing is all the work you've done in diversity, which is, like, a complete wonder and hugely important.
And that is the biggest frigging thorn in everybody's side: getting answers to the problem of identity and metadata is neither simple, obvious, workable, nor otherwise forthcoming. We could talk for 20 minutes on the problems around that, so I will stop now. Lauren: No, I think you've summed up what my thoughts on the past year and couple of years of work in many of these spaces have been. So yeah, very well stated. The other thing that struck me, and I'm gonna pull in a combination of a question from the chat, as well as from the Q&A, as well as one of my own: one of the things that you stated a little bit early on was the chicken-and-egg scenario that anyone who's working with ONIX or any other standards and really looking towards implementation struggles with. You were very clear when you said that data senders need to create metadata before retail use can happen. And then you followed that up by saying retailers need to use the data that they are sent following its named purpose. And so my follow-up question to that was: can you elaborate on how retailers can work with data senders to drive earlier adoption of standards? And then I have a follow-up from the chat as well. It's not an easy question. Tom: I tried to address that with the transparency slide and talk, so I don't wanna repeat that. I mean, it's kind of a problem. I mean, anything retailers can do to stop the scenario where they pull the data in, they talk to their IT department, they look at the data they can, they do the best they can with that data, and then they don't spend much time looking at the standard. And they rely on their IT department to tell them what's available in the data as they get it, not what they could be getting in the data. So, I mean, it's just that. I
mean, I've quite pointedly said, look, no one implemented promotional headline, and it's existed and is being used in the data. I don't think there's been a lot of publisher use until the past, say, two to three years, but I think there has been at least a couple of years of use of promotional headline by at least some publishers. And the problem, of course, when you rely on your IT department is that the amount of use you probably get of something like promotional headline in an aggregated feed of, like, millions of records is probably substantially less than 10%. So from the point of view of an IT department, it's, like, invisible data. You can't see it. You won't know it until you go looking for it. So if they had gone looking for it and said, "If we could just use promotional headline, we could make a display better," they would've found some use. They could have put it up, and they could have found that it did actually improve sales on, like, test records and stuff like this. They would've had a chance to do something with it. And publishers, seeing it being used, would flock to it, because they would want to see it used. So we need to have the retailers interacting more with the better parts of the data that exist in small amounts. One of the problems when you talk about diversity is that you're talking about things that affect less than 1% of the records, and that's where the real quality material on diversity will be found. You find it because you go looking for it; you don't find it because it drops out of the data. The counterpoint to that is the publisher tendency to want to make everything a default. A lot of publishers will take a data point that they feel they should support because it's important, and if they don't include it in every record, they feel it's invisible. And they're probably right, from what I just said, so I can understand it.
But by putting default data in, they basically put pointless pieces of data in. So you have companies increasingly loading sales restriction information just to carry the information that "I don't have a sales restriction." What's the point of that? I mean, sales restriction exists to put sales restrictions up. You don't add to the dialogue by saying, "We have none." Unless it's news that you took one off or something like that; then that's important to put in. But, I don't know, people hide the data by doing that. We do need the retailers looking for the small data. Does that...? Lauren: Well, I guess... Yeah, no, I think that gets to it, and I think that then suggests a direction. Because when you said that, you kind of suggested that if publishers start creating the data, then there will be use for it. But the question that we had from the chat was: how can publishers advocate for more thorough use of display fields on retailer sites? The commenter mentions that they would love to see cover line used, for example, but I think it could apply to any of the other examples that you mentioned, particularly, you know, so-called diversity content qualifiers, things like that. How could you answer that in terms of advocacy? Tom: I think if you have added something to your metadata because you think it's important, and you're structuring your data, you should send an example to the retailers. I mean, very large U.S. publishers typically have a sheet. Now, large U.S. publishers keep their metadata really very, very basic, but when they add something, they add it to a list and they distribute that around, and they make sure everybody in the supply chain freaking knows that
they support that piece of metadata. I mean, I think Canadian publishers have really good metadata, but they don't necessarily tell anyone about it. They just kind of expect that people know the standard and know to use it. So just put in the thing: "We're supporting promotional headline with the expectation that it leads our descriptions." It doesn't have to be more complicated than that. It's: this code is being supplied. "We supply a short description in our 3.0 so that you could have a main book page and a page with a short description on it. We make sure it's less than 350 words or characters," that type of thing, so that people know that you're meeting standards, that you're providing the material, and then you ship it to them. Because I suspect that retailers who had it pointed out to them that promotional headline was being used might have gone looking for it by now. And the fact that they don't seem to have used it, and we have to, like, shout about it before we can have any hope of it being used, just shows that no one has pointed it out. Now, people may say, "But it's your job, Tom, from BiblioShare, to point out that we started using it," and they're probably right. BiblioShare could probably be doing a better job of pointing out use and things like that. I mean, we do. Lauren: Yeah, I think there's something to that, like celebrating early adopters, celebrating adherence to best practices. There's something to that that I think seems very anti-Canadian, to overtly celebrate, but it's also very in line with what you've observed many times over the years: that so many of the small indie publishers are able to respond more nimbly to changes in standards and in best practices in a way that the larger ships can't steer so quickly around.
And here's a big one to steer around, and I know that many people on this call have kind of been through the trenches, and their first question is gonna be: do they have to move over to 3.1 right away? And I don't want you to start thinking that you have to look into your crystal ball to see the future. But I think… Tom: I want to just say it's not a problem. It's not a problem. Lauren: Yes. Now, I guess what I'm getting at... Tom: And I'm getting in touch with a few people who are having a small problem with it, and it's a small problem, don't worry. Lauren: Oh, and that's I think what we all got from that, is that it seems relatively easy to manage. But I think what many of the people on this call, and even we at BookNet, dealt with was what seemed like a relatively rapid industry shift from 2.1 to 3.0, because there was a major industry push by a major industry player, and that seemed sudden. Do you think that we're in a situation like that with 3.0 to 3.1, where it is going to be a sudden push? Is there reason for panic? I'm hearing no. And I just wanna get some comparative values, maybe, between that shift from 2.1 to 3.0 and this one from 3.0 to 3.1. Tom: I mean, Graham Bell from EDItEUR was quite clear that the answer to this is: no problems. But just to list some minor issues that you might have to deal with: there's a release line in your ONIX file's top-level tag that says, "Release 3.0, hello." It should move up to 3.1. Now, something I didn't realize when I went to modify BiblioShare, which again has schemas and things like this, is that the underlying schema similarly, all of a sudden, needs to change, and it's just
not a simple thing. I mean, I need to talk to our developers. I need to get that 3.0 changed to 3.1. There are some subtle little things that will cause some very minor technical glitches, but nothing that is complicated, nothing that is difficult. For most people, it literally is: you take your current thing, you go through, you make the 3.0 into 3.1, and you're done. Now, everyone should be downloading the information as it's updated. So if you go to EDItEUR's site right now, you can download the correct schema for 3.1. You should do that, but you might not want to implement it immediately, until you've modified those release numbers, because all of a sudden you might be loading information that doesn't match what the schema expects. It expects to see the release it's looking for, and guess what? The schema for 3.1 looks for 3.1, not for 3.0. That's what I mean: there are some subtle little problems you can get into, but if anybody comes up with that type of thing, I can be a resource for that. They really are minor, and they really shouldn't be a problem for anyone. It's just that this is an XML-level change, and anytime you touch the XML, it can get a little fuzzy. So that's... Lauren: No, I think that's a fair answer. A very fair answer. We have another one from the audience. From this commenter's perspective, they write that deprecated elements need to be handled at a database level, since publishers don't always get to see how their data fields map to ONIX. Which, as the commenter says, means publishers don't know that what they're entering might end up being sent in a way that has been deprecated. And the question for you, Tom, in your wisdom, is: are database developers being asked the question about how to handle deprecation? And do you, Tom, have a sense of what they are saying? Tom: I have literally no idea if developers are being asked.
I mean, no. One of the reasons why this transition is not a problem is because literally anybody who implemented 3.0 up to now would've been using a manual, should have been using a manual, God help them, that was newer than version one. And version one is when most of these deprecations were added; some of them were even added in version zero. I mean, they literally released version zero and deprecated some stuff almost immediately. So anybody who's using this stuff, it was already labelled as deprecated, and they should have known what they were doing at the time. Now, when people were trying to handle 2.1 and 3.0 simultaneously for convenience, some people added the deprecated component. In 2.1, they may have used the audience code element, which was allowed, and they may have implemented it in 3.0 to make the transition easier for them. So they may have created themselves a problem, but it was labelled as deprecated. There's not really much of a conversation you can have with a developer if they're not using the manual and reading what's deprecated; someone should get them the manual and make them read it, is the only thing I can say. But I do think in general, going back to what I was saying about retailers, there's a real place, if the organization can handle it, for a local ONIX expert who's paying attention to the standard and the needs of the standard. I mean, you can't go to the schema and understand what a block update is. A block update is as much
of a philosophical position and a set of processing rules as it is something that is in the schema. If you just look at the schema, it confuses the developers, because a block is represented by a very large container surrounding sections that has no apparent purpose. The purpose is to support block updates, which happen as a processing step that has to be built into your program. So everything works without them until you need to implement a block update, whereupon there's a set of rules that you have to follow to do it properly. It's no different from how you process an ONIX record now: there's a set of rules that go with it, like full replacement as a concept. There are some rules that come with processing ONIX that just exist outside of the XML. Somebody needs to be reading some of that stuff, the manual. Developers need support in doing that; somebody should be helping them understand it. Again, BookNet is there as a resource for this. It's kind of our job to help with this, but it's really hard to answer that question better than that.

Lauren: No, I think what you've gotten at there is that, ideally, whoever's doing your software development, if you've purchased a publishing management system that is creating ONIX for your house, you should have trust that there is someone at the software developer who knows how that standard works, and is able to respond through software to the standard's needs as it evolves. But you also mentioned having a local expert in the standard. And what I took from that, and what you've been saying throughout, is that BookNet is there for that. If you are an emerging press and you don't have that knowledge in-house, this is literally Tom's area of expertise. He is your local expert for the Canadian market. So please do be in touch. That actually looks like our time now.
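Before the wrap-up, the mechanical change Tom described earlier, turning the 3.0 in your message into 3.1 so it matches what the new schema expects, can be sketched in a few lines. This is a stdlib-only illustration: real feeds carry the ONIX namespace (omitted here), and the result should still be validated against EDItEUR's published 3.1 schema, which requires a third-party XSD library and is left out of this sketch:

```python
import xml.etree.ElementTree as ET

# A toy 3.0 message (namespace omitted for brevity).
SAMPLE = """<ONIXMessage release="3.0">
  <Header>
    <Sender><SenderName>Example Press</SenderName></Sender>
  </Header>
</ONIXMessage>"""

def bump_release(onix_xml: str) -> str:
    """Rewrite release="3.0" to release="3.1" on the ONIXMessage root."""
    root = ET.fromstring(onix_xml)
    if root.get("release") == "3.0":
        root.set("release", "3.1")
    # Validate the result against EDItEUR's 3.1 schema before sending;
    # XSD validation needs a third-party library (e.g. lxml), so it is
    # not shown in this stdlib-only sketch.
    return ET.tostring(root, encoding="unicode")

print(bump_release(SAMPLE))
```

As Tom notes, the point is to do the renumbering and the schema download together, so the message and the schema it is validated against agree on the version.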
I'm gonna wrap up with the last little snippet of wisdom that Tom left near the end of his slides: big problems are solved by participation and sharing of information. And I think that's what we're asking of all of you who have assembled here today. You have been left with a lot of information, both from Tom and in the chat, about how to engage with the standards as well as how to engage with us at BookNet. We are your standards organisation. Not only do we help this marketplace implement international standards, we also represent the interests of this market on an international stage. So we wanna work with you, we wanna hear from you, and we are so appreciative of your participation today, your attendance, as well as the great questions you've provided. And we look forward to working alongside you and in solidarity with one another. So Tom, thank you again for everything. Officially, I'd like to wrap up the session today. We would love it if you could provide feedback on the session. We'll be sending you an email. We do read every response, so please let us know how this went for you, what you'd like to see from future sessions, or what perhaps was missed today. You'll also get a recording of this session as soon as it's available. And finally, of course, we would like to thank the Department of Canadian Heritage for their support through the Canada Book Fund, and all of you for attending. Thanks again, and again,
please remember that big problems are solved by participation and sharing of information. That is your wisdom nugget from Tom Richardson today to send you off into the afternoon. Thank you, everyone. Have a wonderful day.