Moving Beyond The Desk

Government data has typically been shared via applications or downloadable datasets, essentially duplicating the functionality of the front desk or filing cabinet on the internet. We need to move beyond that and publish individual records directly, to improve both transparency and the user experience.

Slide notes

  • I have removed many of the technical details of implementations and other points from this version of the presentation, and re-focused on the value of turning your database inside out on the web. Please contact me if you are interested.
  • This is beautiful downtown Nanaimo, on Vancouver Island, Canada. Remember the part about the 85,000 residents. It’s important.
  • I work in the City’s IT department, one of several people in charge of ensuring that staff and residents have access to the information they need, in the most convenient and efficient way possible. We identified a problem in the way that information was being presented to the public, and a way to move beyond it.
  • So, this presentation talks about why we started changing how we publish data—especially geographic data—to the web, how this was accomplished, and what kind of results we’re seeing.
  • If you haven’t seen Paul Ramsey’s presentation on government web presences, it is definitely worth your time to read. It does a great job of summarizing Nanaimo’s rationale for changing the way we deliver information to the public, and I even borrowed a couple of the analogies from his presentation.
  • Let’s go back to the pre-internet times. If a citizen had a question they would either come to city hall or get on the telephone and ask someone sitting behind a desk the question. The person behind the desk had knowledge of and access to…
  • Lots of data. This system was relatively efficient. Most people were able to get the information they wanted in a relatively short amount of time. The problem with this is that it meant they had to either go to the trouble of getting to city hall, or at the very least get on the phone and talk to someone. The more questions people had, the more staff was required to answer them. So, when the internet started gaining popularity, it only made sense to “get on the web” to make it easier for people to answer their own questions. Only, this didn’t happen.
  • If you visited this site, where would you go to find out about the zoning on your property? Nanaimo’s website is designed around current best practices for navigability and multiple paths to information so it should be easy, right?
  • But if the information you want is stored in a database it’s quite possible that after three or twelve (or …) clicks you end up at something like this. An application or portal that you have to hope will give you the answer you want. The applications on Nanaimo’s site may look and function similarly, but I can guarantee that they are not the same as those on other government web sites. On each new site, a citizen has to re-learn the paths to the information they want (and the number of clicks required goes up exponentially with each level of government) and figure out a new user interface when they get there. You see this play out at each level…
  • Imagine if you want to find out about water leases on the creek that runs through your property.
  • You might end up here, but would you know what to do?
  • Or if you wanted to know about the federal park that you are planning to visit.
  • You might eventually end up in the same situation. Information technology professionals may know what to do in these cases, but most of our citizens at this point end up feeling like…
  • This. Feel free to insert a popular three-letter acronym there… I’m not trying to pick on Nanaimo, or the BC Provincial Government, or the Government of Canada here. It’s the same all over. The problem is that we have been focused on…
  • building better desks. Now these may be the coolest desks in the world. Like the one up there. Which I want. But they are missing something. We are trying to give our citizens somewhere to come to find out stuff on their own, without any help, instead of allowing them to find the information they want in a way that fits with the web. We have fallen into the trap of being ON THE WEB instead of being OF THE WEB. So…
  • How do we fix this?
  • This kind of interface is often proposed to allow global access to information (especially geospatial information), but the problem is that it requires either special knowledge or…
  • A specialized application. This is fine (actually great, since WMS is an open standard) for professional users, but not for most of my 85,000 residents. (A sample WMS request after these notes shows the kind of special knowledge involved.)
  • This kind of interface can also be incredibly useful, but in this case instead of building desks you’re packaging up your filing cabinets and shipping them out.
  • Data aggregation and publishing is definitely worth doing (like Nanaimo does at http://data.nanaimo.ca/ ) in the interest of economic development and open government. But unless someone else builds an application which presents the data, it is only immediately useful to application developers and data analysts. I’m guessing that we’ve got maybe 500 of those in Nanaimo. If we tried to propose this solution or the open geo standards to the other 84,500 residents, I’d imagine that they’d start feeling like…
  • This. And this is what we don’t want. We want our users to be able to access the information that they need…
  • At their leisure. So if the previous solutions won’t work, what will?
  • The answer may surprise you in its simplicity. Use the web. However, you can’t just be ON the web, you have to be OF it. Imagine that each of the drops of dew on this web is a single piece of information, a single page from one of the folders in your filing cabinet or a single record from your database. In itself, this doesn’t help our citizens.
  • But remember that every good web comes with a spider. And the most important one comes with many spiders. These spiders will crawl over your data, ingest it, remember how it is interlinked (and maybe one day, if they’re really smart, how it is spatially related), and use all of that metadata to…
  • Help people who are looking for information. For instance, pretend that someone is out for a walk along the Nanaimo waterfront and notices a nice looking statue of our first mayor, Mark Bate. There isn’t a lot of information on the statue, so when they get home they do the first thing that many people have learned to do: ask one of the spiders (yes, I know that’s not actually the spider, but stop being such a geek…)
  • Imagine their surprise when the first item in the results is a link to…
  • a record in the City of Nanaimo’s public art inventory database! That’s right, instead of sitting inside an application, this database is sitting directly on the web. Of course, the application is still there for users that want to perform complex searches or take advantage of a friendly art browsing interface, and because finding the information is sometimes only 80% of the problem. But the real value here is that the resident was able to get the information that they wanted, right when they wanted it, without even knowing that the application exists. So how did this happen?
  • It’s really quite simple, and it’s as old as the web. All of the features in Nanaimo’s public art database have a home on the web. A location that can be linked to, easily reached by spiders, and shared between users. Most online companies have figured out that it’s important to open up their databases this way to make their products and services easier for consumers to purchase or find information about through the search engines, but government generally hasn’t made this leap yet. We have to, for the good of our citizens. Just as importantly, we have to move beyond this, and publish location information just as openly. There is an incredible amount of information where a large part of its value comes from knowing where it is, and this information needs to be made available to the spiders as well if they are going to help our residents find and understand that information.
  • Here’s an example from further down the page on that art database record. You can see exactly where that statue is and, due to some embedded information and a link to a KML representation of that record, so can the spiders.
  • And if the citizen or spider manages to run into the KML representation first, they can easily get back to the HTML page as well by following the “read more” link. (A minimal sketch of such a KML record appears after these notes.)
  • The previous example was from a standard corporate application where the spatial reference is secondary to the rest of the information. What about a case where the spatial information is of primary importance… a parcel of land? Let’s take the previous example of determining the zoning of a lot, try typing in a random address, and see what happens.
  • If you look beyond the attempt to get you to launch Google Maps, you can see what looks suspiciously like government data on that address.
  • Yep, that’s what it is, and hey, there’s the answer to my zoning question: RS-1. And the beauty of this is that we didn’t have to write any code to make this happen. We used a geospatial application known as MapGuide Open Source, along with the GeoREST extension developed by Haris Kurtagic of SL-King, and were able to turn our geospatial data into dynamically served web and Google Earth pages with a bit of configuration.
  • This does look a lot like the public art application I showed earlier, so what is different about this? Well, apart from the cool RESTful framework that is being used to publish it, there are a few key parts.
  • The first is the use of sitemaps, both standard sitemaps and Geo Sitemaps. For instance…
  • The standard sitemap contains an entry for each parcel web page in Nanaimo.
  • And the Geo Sitemap includes all of the Google Earth (KML) parcel pages. In combination, these allow spiders to ensure they visit all of the dynamic pages on a site. It is best to make sure that they can all be reached by links too; just think of this as insurance. (Sketches of both sitemap types appear after these notes.)
  • Another interesting feature can be seen on this page. Can you tell what it is?
  • Still no?
  • That little glowing orange down-arrow means that our property search is available from your web browser through the magic of something called OpenSearch. And it’s not only available when you’re on our search page; you can take it with you if you want and never have to come back to our search page again.
  • Oh, and did I mention that it’s really easy? You just put a bit of code in the web page…
  • Write a little bit of XML…
  • And tell the web browser where to put your search words. These additional items are not a lot of extra work, but they have the potential to add a lot of value. (A sketch of all three pieces appears after these notes.)
  • After doing this, how do we know we’ve taken the right path?
  • Well, through measurement of course. You can see that almost 90% of all users accessing our mapping data website come directly from search queries...
  • And that at least one of those spiders has indexed the web pages for all of the parcels in our database, almost 28,000 of them.
  • About half of the visitors are coming from Nanaimo (and I suspect some of the Victoria visitors are misclassified), which means that this information is useful to our residents over 1,500 times per month.
  • And it is clear that this is not just a few users who are accessing the system. Judging from the number of visits, at least 1,000 residents performed these searches in the last month.
  • One of the most interesting things about this is that there is a very broad range of keywords being used to access this data. To take a part of the keyword report at random…
  • You can see it is people typing in addresses. Very specific addresses. They’re typing in the thing they want information about, and getting the answers they want.
  • Have a close look at the bounce rate above. This means that about two-thirds of users find the information that they want on the first page that they visit, and then don’t go elsewhere on the site. The beautiful thing is that instead of having to click through link after link, figure out different interfaces, and generally become frustrated with their experience on the City’s website (our main site has a low bounce rate, under 40%), they are able to find what they want…
  • Be happy
  • And get on to doing more important things
  • I just wanted to leave you with this. As proof that Nanaimo is really a part of the web, have a close look at Mark Bate’s face. Not only is he OF THE WEB, but the spider is still there!
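
Example snippets

The WMS note above mentions the special knowledge a raw map-service interface demands. As a hedged illustration, here is the kind of GetMap request a user would have to construct by hand to pull a map image out of a WMS endpoint; the host name, layer name, and bounding box are placeholders rather than Nanaimo’s actual service, and the parameters are standard WMS 1.1.1 (shown wrapped for readability, though it is a single URL):

    http://maps.example.gov/wms?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap
        &LAYERS=Parcels&STYLES=&SRS=EPSG:4326
        &BBOX=-124.05,49.14,-123.90,49.22
        &WIDTH=800&HEIGHT=600&FORMAT=image/png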
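
The public art note above describes a record page with an embedded location and a linked KML representation that points back to the HTML page. A minimal sketch of such a KML file follows; the URL, coordinates, and wording are illustrative only, not the City’s actual data.

    <?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <Placemark>
        <name>Mark Bate statue (public art inventory)</name>
        <description><![CDATA[
          Bronze statue of Mark Bate, Nanaimo's first mayor.
          <a href="http://www.example.gov/publicart/123">Read more</a>
        ]]></description>
        <Point>
          <!-- longitude,latitude,altitude; placeholder coordinates -->
          <coordinates>-123.936,49.166,0</coordinates>
        </Point>
      </Placemark>
    </kml>

The “Read more” link inside the description is what lets a citizen (or a spider) who lands on the KML representation first find their way back to the HTML page.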
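
The sitemap notes describe two files: a standard sitemap listing every parcel web page, and a Geo Sitemap listing the Google Earth (KML) pages. Rough sketches of both follow, with placeholder URLs; the second uses the Geo Sitemap extension namespace that Google documented at the time.

    <!-- Standard sitemap: one <url> entry per parcel web page -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.gov/property/12345</loc>
        <changefreq>monthly</changefreq>
      </url>
    </urlset>

    <!-- Geo Sitemap: points the spiders at the KML representations -->
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:geo="http://www.google.com/geo/schemas/sitemap/1.0">
      <url>
        <loc>http://www.example.gov/property/12345.kml</loc>
        <geo:geo>
          <geo:format>kml</geo:format>
        </geo:geo>
      </url>
    </urlset>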
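
The OpenSearch notes describe three small pieces: a bit of code in the search page, a little XML, and a template that tells the browser where to put the search words. A sketch of all three follows, assuming a hypothetical property search endpoint; the names and URLs are placeholders.

    <!-- 1. In the <head> of the search page: advertise the search plugin -->
    <link rel="search"
          type="application/opensearchdescription+xml"
          title="Property Search"
          href="http://www.example.gov/property/opensearch.xml" />

    <!-- 2. opensearch.xml: the OpenSearch description document -->
    <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
      <ShortName>Property Search</ShortName>
      <Description>Search City properties by civic address</Description>
      <!-- 3. {searchTerms} is replaced with whatever the user types -->
      <Url type="text/html"
           template="http://www.example.gov/property/search?q={searchTerms}"/>
    </OpenSearchDescription>

Once a browser sees the link tag, it offers to add “Property Search” to its search box, which is what makes the search portable in the way the notes describe.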
