Metaphors as design points for collaboration 2012

  • Credit: @adrants, John Sword, Andrson, Flickr
  • Credit: kimballphoto, Flickr
Transcript of "Metaphors as design points for collaboration 2012"

    1. Architectural, Spatial, and Navigational Metaphors as Design Points for Collaboration
       John "Boz" Handy-Bosma, Ph.D., Chief Architect for Collaboration, IBM Office of the CIO
       For KM Chicago, May 8, 2012
    2. Credit: Ogilvy
    3. Credit: Fellowship of the Rich, Flickr
    4. MOSFET Architecture, scaling recipes, and Moore's Law
       If scaled by constant K: dimensions, voltages, doping levels
       Decreased by: circuit delay (K), power per circuit (K²), power-delay product (K³)
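    A minimal sketch in Python of the scaling arithmetic above, assuming the classical Dennard relationships; the function name and the K = 2 example are illustrative, not from the deck.

        # Dennard-style scaling: shrinking dimensions and voltages by a factor K
        # (while raising doping by K) reduces circuit delay by K, power per
        # circuit by K^2, and the power-delay product by K^3.
        def dennard_scaling(k):
            delay = 1.0 / k                                  # circuit delay factor
            power_per_circuit = 1.0 / k ** 2                 # power per circuit factor
            power_delay_product = delay * power_per_circuit  # = 1 / k**3
            return delay, power_per_circuit, power_delay_product

        print(dennard_scaling(2.0))  # K = 2 -> (0.5, 0.25, 0.125)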
    5. Recipes in Dennard's Scaling Theory
       Factors maintained in constant ratio:
       - Consistent inputs in proportion
       - Multiple factors
       - Relationships among factors
       - No claims about development in relation to time
       - How to achieve the ratios is not addressed
       Predictable outcomes on figures of merit:
       - Not classical power laws
       - Figure of merit: a quantity characterizing performance, used for benchmarking and comparisons (e.g., clock speed in a CPU, wicking factor in fabrics)
       - Consistent measurement
       - Related figures held to constant performance, not degraded
    6. Where to look for scaling principles? (Answer: where things are clogged or crowded)
    7. An approach: Identify practical recipes for improving Collaboration and Search. Use these as input to decisions on architecture and design.
       Factors maintained in constant ratio (roughly), in proportion:
       - Consistent improvement in specific factors
       - Multiple factors: precision and recall; content and metadata; adoption and use
       - Wayfinding (e.g., navigating, searching, sorting, filtering)
       - Information production (e.g., quality and quantity of authoring, tagging, publishing)
       - Bidirectional (e.g., reciprocal networking among participants)
       - Relationships: mutually reinforcing, mutually impinging, exponential
       - Factors are expressed via specific solutions as used in the field
       Predictable outcomes in figures of merit:
       - Recipes enable balanced improvement in search, collaboration, and metrics
       - Experimentation allows for measurement and improvement on key measures, but it is important to identify potential trade-offs in figures of merit resulting from technical and social factors: serial navigation (similar to Fitts' Law) and the impact of follower models on the signal-to-noise ratio of communication
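    A small illustrative sketch (my own framing, not from the deck) of the serial-navigation trade-off mentioned above: scanning a ranked list item by item costs time proportional to the target's position, while a direct metadata lookup stays roughly constant.

        # Illustrative only: compare serial navigation (scan a list until the
        # target is found) with a direct metadata lookup (hash-based index).
        def serial_navigation_cost(results, target):
            for steps, item in enumerate(results, start=1):
                if item == target:
                    return steps       # cost grows with the target's rank
            return len(results)        # worst case: scanned the whole list

        def metadata_lookup_cost(index, target):
            _ = target in index        # single hash-based membership test
            return 1                   # roughly constant, regardless of list size

        results = [f"doc{i}" for i in range(1, 101)]
        index = set(results)
        print(serial_navigation_cost(results, "doc73"))  # 73
        print(metadata_lookup_cost(index, "doc73"))      # 1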
    8. Wayfinding and Isovist: How is search relevance measured?
       Key terms:
       - Relevance: a subjective measure of whether a document in a search result answers a query
       - Precision: a measure of the percentage of documents in a result list that answer a query
       - Recall: a measure of the percentage of documents in a result list relative to all relevant documents in a collection
       - Pertinence: a subjective measure of whether a document in a search result answers a query (in light of previous knowledge or experience)
       - Aboutness: the subjects and topics conveyed by a document or query
       - Isovist: pertinent items visible | not visible at any given point in a navigational sequence
       Traditional evaluation:
       - Tests for performance using known corpora and results (e.g., TREC)
       - Typically uses a single query and a query response, rather than a series of interactions between users and the search engine
       - Geared toward the top of the results list
       - But traditional approaches are not sufficient to measure relevance of results where relevance is determined by social interaction and collaboration outcomes!
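    A minimal sketch of the precision and recall definitions above; the result list and relevance judgments are made-up placeholders.

        # Precision: fraction of returned documents that are relevant.
        # Recall: fraction of all relevant documents that were returned.
        def precision_recall(returned, relevant):
            returned, relevant = set(returned), set(relevant)
            hits = returned & relevant
            precision = len(hits) / len(returned) if returned else 0.0
            recall = len(hits) / len(relevant) if relevant else 0.0
            return precision, recall

        returned_docs = ["d1", "d2", "d3", "d4"]   # result list for one query
        relevant_docs = ["d2", "d4", "d7"]         # judged relevant in the collection
        print(precision_recall(returned_docs, relevant_docs))  # (0.5, 0.666...)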
    9. Example: What aspects of metadata facilitate collaboration?
       Collaboration capability -> metadata features:
       - Integrating disparate bodies of content from multiple sources / communities: incorporate global and local extensions to vocabulary; query modification to allow lateral navigation; matching on shared interests
       - Team coordination: content previews, review and approval, collaborative workflow; tagging at group level; metadata suggestions
       - Positive network effects from sharing in social channels: social tagging and bookmarking; rankings and ratings; clickstream analysis for ranking
       - Knowledge elicitation: query expansion (a) conditional metadata, (b) "Did you mean?"; tag notifications
       - Facilitating collaboration among disparate language communities: unique and mapped display values, e.g., Social Authority
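    A rough sketch of two of the features above: expanding a query with global and local vocabulary extensions, and a "Did you mean?" suggestion based on fuzzy matching. The vocabularies and terms are invented for illustration and do not come from the deck.

        import difflib

        # Hypothetical shared (global) and community-specific (local) vocabularies.
        GLOBAL_VOCAB = {"collab": ["collaboration"], "kb": ["knowledge base"]}
        LOCAL_VOCAB = {"wiki": ["team wiki"]}

        def expand_query(terms):
            expanded = []
            for term in terms:
                expanded.append(term)
                expanded.extend(GLOBAL_VOCAB.get(term, []))
                expanded.extend(LOCAL_VOCAB.get(term, []))
            return expanded

        def did_you_mean(term, vocabulary):
            matches = difflib.get_close_matches(term, vocabulary, n=1)
            return matches[0] if matches else None

        print(expand_query(["collab", "wiki"]))
        # ['collab', 'collaboration', 'wiki', 'team wiki']
        print(did_you_mean("colaboration", ["collaboration", "metadata", "taxonomy"]))
        # 'collaboration'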
    10. Example: When is metadata search helpful to collaboration?
        When metadata search?
        ✔ Multiple set membership for searchables
        ✔ Sufficient completeness and quality of metadata
        ✔ Adequate accuracy of categorization
        ✔ Leads to improved effective precision and time to find
        When not metadata search?
        ✗ Precise results can be obtained without a classification scheme
        ✗ When metadata leads to undesirable phenomena such as conjunction search, serial navigation, or error propagation
        Often assumed, but questionable:
        ? That a single large corpus is to be searched
        ? That metadata require a hierarchical taxonomy with many classifiers
        ? That agreement on taxonomy is needed
        ? That searches are for documents (as opposed to collections of documents, parts of documents, people, facts, etc.)
        ? That metadata operations only involve "anding" on attributes to find instances
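    A small sketch of what "anding" on metadata attributes looks like as set intersection, alongside the union-style broadening the last point hints is also possible; the attribute index is a placeholder, not from the deck.

        # "Anding" on attributes: intersect the sets of items carrying each tag.
        INDEX = {
            "format:spec":  {"doc1", "doc3", "doc5"},
            "team:search":  {"doc2", "doc3", "doc5", "doc8"},
            "status:final": {"doc3", "doc7"},
        }

        def and_search(*attributes):
            sets = [INDEX.get(a, set()) for a in attributes]
            return set.intersection(*sets) if sets else set()

        def or_search(*attributes):
            return set().union(*(INDEX.get(a, set()) for a in attributes))

        print(and_search("format:spec", "team:search", "status:final"))  # {'doc3'}
        print(sorted(or_search("format:spec", "status:final")))
        # ['doc1', 'doc3', 'doc5', 'doc7']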
    11. Measuring effective precision of metadata search
        Sequence:
        - Log the sequence of user actions in a search session (queries, metadata selections, links)
        - Work backward from a known result (document click, download, print, tag, bookmark, notify, rate, exit)
        - Establish the influence of each step in the sequence on the ranking of the document(s) that elicited that result (via rankings in results lists)
        - Query by segments of interest using aggregated data
        Data sources: privacy-preserving cookies, clickstream repository, search queries, segmentation analysis database, clickstream data, survey and ratings repositories
        Example: Is stemming improving the search results? Method: A-B tests using stemming, sample measures of search precision
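    A simplified sketch of the stemming A-B comparison described above, assuming session logs that record the rank of the result the user finally acted on; the metric (mean reciprocal rank, used here as a stand-in for effective precision) and the data are illustrative, not the deck's actual method.

        # Each logged session records the rank, in the results list, of the
        # "known result" the user ended on (click, download, bookmark, etc.).
        sessions_stemming_on = [1, 3, 2, 10, 1]
        sessions_stemming_off = [2, 5, 4, 12, 3]

        def mean_reciprocal_rank(ranks):
            return sum(1.0 / r for r in ranks) / len(ranks)

        print("stemming on: ", round(mean_reciprocal_rank(sessions_stemming_on), 3))
        print("stemming off:", round(mean_reciprocal_rank(sessions_stemming_off), 3))
        # A higher score suggests users reach their target result nearer the top.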
    12. Optimization Cycle
        1. Configure practices and tools
        2. Observe practice
        3. Evaluate bottlenecks
        4. Propose new variables
        5. Build new
        6. Measure outcomes
        7. Transition variables to constants