CHI 2011 Case Study: Interactive, Dynamic Sparklines
Interactive Sparklines: A Dynamic Display of Quantitative Information

Leo Frishberg
Tektronix, Inc.
13975 SW Karl Braun Ave.
Beaverton, OR 97077 USA
email@example.com

Abstract
Initially proposed by Edward Tufte, "sparklines" present hundreds of data points in the space of a word or two. Tufte originally designed sparklines to be embedded in a sentence. Today they have moved off the printed page into websites, online applications, smart phone screens and interactive documents. Sparklines display hundreds of data points over time: stock prices or sports statistics for the prior year, for example. But how well do they perform with millions of data points acquired in microseconds? What if users capture these data every couple of minutes? How well do sparklines, primarily designed for static display of historical data, fare in the context of an interactive application? In this case study, the author describes interactive sparklines his team designed and developed to assist electronic engineers debugging their electronic circuits. The case study presents an iterative user-centered design process from the initial proposal of sparklines through to their refinement after several releases. The study concludes with reflections about future improvements to interactive sparklines.

Copyright is held by the author/owner(s).
CHI 2011, May 7–12, 2011, Vancouver, BC, Canada.
ACM 978-1-4503-0268-5/11/05.
Keywords
Sparklines, Interaction Design, Participatory Design, Progressive Display, Information Visualization

ACM Classification Keywords
H.5.2 [INFORMATION INTERFACES AND PRESENTATION]: User Interfaces---Graphical User Interfaces (GUI), Screen design, User-centered design; I.3.6 [COMPUTER GRAPHICS]: Methodology and Techniques---Interaction techniques, sparklines.

General Terms
Design, Human Factors

Introduction
In the world of debugging digital electronic circuits, speed is everything: the speed with which engineers resolve a bug and the speed of the measurement instruments they use to find it. Traditionally, the Test and Measurement (T&M) market has focused on "speeds and feeds" – how fast an instrument can acquire data and the number of features packed into the product.

The focus on features and functions as a competitive differentiator has shifted over the past several years toward improving the debugging engineer's experience.

In a typical debugging scenario, an engineer must determine whether a digital design (a circuit board, for example) is behaving as expected. The engineer may use an oscilloscope (to measure the analog characteristics, such as voltage or amplitude, of a few signals at a time); a logic analyzer (to capture digital attributes, such as 0s and 1s, of hundreds of signals); or perhaps a protocol analyzer (to view higher-order abstractions of "serial" data flowing in the circuit, such as transactions between two components).

Data transmitted on serial buses conform to standardized specifications and protocols. Common, everyday serial standards found around the house include RS-232, USB and Ethernet (TCP/IP).

In general, serial protocols are "layered": each layer is responsible for a type of behavior or domain of operation; lower layers provide support for higher layers.

For example, PCI-Express (a high-speed serial data standard for inter-chip communication commonly found in PCs and other computers) has three layers. The lowest layer (the PHY layer) defines how the electrical signals should operate – their frequency, voltage and the definition of a single bit of data. Among its other responsibilities, the PHY layer specifies the rules for combining a set of bits into a "symbol" and the valid sequences of symbols transmitted between two endpoints.

The middle layer (the data-link layer) is responsible for managing the link between the two endpoints: how much data receivers and transmitters can send, whether receivers or transmitters are ready for the next transmission and so forth.

In the topmost layer (the transaction layer), components send data back and forth across the link, permitting devices to interact and work together.

At each layer of a protocol, whatever can go wrong will go wrong: at the PHY layer, bits will flip; at the data-link layer, links will stop transmitting due to overflowing buffers; at the transaction layer, a request for data from a device will "time out", invalidating the request.

In early 2009, our company was pursuing the design of a hybrid type of instrument: a "logic protocol analyzer" that could display digital characteristics at the PHY layer as well as the packet information flowing in the upper layers. The launch of the instrument was to coincide with the emergence of the next generation of the PCI-Express standard: PCIE Gen 3.

We knew the basic use case scenario: an engineer acquires billions of data points from tens of channels in a fraction of a second and uses a computer to post-process the data into a variety of information views. The process repeats until the engineer identifies and resolves the problem.

Debugging engineers have several expectations of their measurement instruments:
- The instrument must accurately acquire the data flowing on users' circuits without affecting the circuit.
- The instrument must present the acquired data quickly.
- The data visualizations must allow the engineers to quickly detect (and hopefully isolate) any one of hundreds of possible problems.
- The data visualizations should look familiar. Engineers cannot afford to spend time learning new interfaces or new ways to view their results.

Creating an instrument that acquires billions of data points in milliseconds is an engineering and technical challenge. Creating a competitive business case without introducing too much novelty was equally daunting. We were the newcomer to the market, and we faced a strongly entrenched incumbent – the industry-standard protocol analyzer. "Familiar" data visualizations in this context meant "like the protocol analyzer" from the incumbent.

The competitor's solution was so entrenched that customer input (usually acquired through the sales channel) referred to "the way the other guys do it." Our product team didn't believe a "me-too" strategy would unseat the long-established incumbent. We hoped to find a distinctive solution compelling enough to overcome our prospects' objections to learning a new interface.

The key questions facing our team were:
- What do debugging engineers really need to debug their serial circuit designs?
- Which aspects of the competitive solution were truly satisfying underlying needs?
- What underlying needs were not being satisfied that would provide us a competitive advantage?

We pursued a four-prong approach to answer these questions:

1. We performed a light-weight contextual inquiry with a "lighthouse" customer (building on past contextual inquiries of this same customer).
2. Users demonstrated and critiqued the incumbent's product to help illustrate what was essential to them.
3. We facilitated several two-hour participatory design sessions to identify underlying challenges irrespective of the incumbent's approach.
4. We released prototypes monthly for 15 months, discussing with users what worked and refining or eliminating that which didn't work.

[Figure: Google Finance showing sparklines in the context of a chart of stocks]

The resulting design solution helps debugging engineers assess the health of their circuit and quickly identify problems, primarily through an innovative use of sparklines. The case study details the design of the sparkline, from the original inspiration as a static element to the final expression as an animated interactive control. The study illustrates how our team's user-centered design process was central to the evolution of the sparkline design. The case study concludes with reflections on user feedback for improving the sparklines.

The Background of Sparklines
When Edward Tufte introduced sparklines as "small, intense, simple datawords" he literally meant them to be graphical words embedded in sentences. He intended sparklines to maximize information density when communicating quantitative (statistical) information. Since their introduction, sparklines have found their way into a wide variety of media: print, web and interactive applications for the desktop and mobile phone [1,3,11].

Designers often use sparklines in combination with Tufte's notion of "small multiples": repetitive graphs of data placed next to one another. The scale of the graphs, their display characteristics and their layout help reveal interesting data patterns by leveraging principles of human perception.

In almost all cases, graphics use static sparklines: they summarize trend data to be interpreted passively. Online displays may update sparklines when the underlying data set is changed, but the updates occur immediately, with little delay. In rare instances, online applications design interactive sparklines. In a typical application such as Google Finance, a list of stocks is displayed in a table. One column includes sparklines reflecting the stocks' daily, monthly or yearly trends.

A review of Tufte's sparkline blog, a query on the ACM Digital Library and a query on the IEEE library return citations describing sparklines as a means of displaying data sets in small multiples. With one exception, none of the references mentions sparklines as interactive or navigational elements. The literature doesn't describe how the construction or animated display of sparklines might influence the observer's interpretation of their data.

The one exception is a recent iPhone application for comparing stock market funds and indexes: TraxStocks. The publisher moves beyond passive display of data with one enhancement: users touch the sparkline to display the value associated with that point on the line.

[Figure: Yahoo! Finance using a sparkline (at bottom) as an interactive control to adjust the upper graphs' span of time]
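Tufte's core idea — a whole data series compressed into a word-sized graphic — is simple enough to sketch in a few lines of code. The following is a hypothetical illustration (not any product's implementation) that renders a series as Unicode block characters so it can sit inline in text:

```python
# A word-sized "sparkline": map each value onto one of eight
# Unicode block characters, so a whole series reads like a word.
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Render a numeric series as a compact, inline text sparkline."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for flat data
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))]
                   for v in values)

# e.g. a week of (made-up) closing prices inlined in a sentence:
print("AAPL " + sparkline([310, 312, 308, 305, 311, 320, 318, 324]))
```

The point of the sketch is density: eight values occupy about as much horizontal space as the ticker symbol itself, which is exactly the property the case study exploits.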
Of equal interest is the lack of discussion regarding dynamically updating sparklines: the online examples refresh sparklines instantaneously.

Although no literature search uncovered interactive sparklines, interactivity is beginning to work its way into sparkline designs. Both Yahoo! Finance and Google Finance provide excellent examples of using a sparkline as the backdrop for focusing the analyst's attention on a specific span of time. DataPlace used sparklines to support interactive semantic zoom for visualizations of demographic data.

User Requirements Gathering through Participatory Design
Our design team, led by the author as Principal Architect, User Experience, included a Software Architect, a Hardware Architect and a Marketing Product Planner. We spent several days with our customer in their facilities at the outset of our program. Having a multi-disciplinary team on site, working closely with the users, was crucial to the success of the process – the biases each team member brought to the sessions broadened the team's perception and understanding of what we observed.

We spent the first day observing six user teams, each focused on different aspects of debugging. Some teams worked with logic analyzers in their workspaces on a specific protocol debugging problem. Others demonstrated their use of the incumbent product. We observed what worked well for each team, how they pursued debugging tasks and where potential work breakdowns might occur.

We followed a fairly typical contextual inquiry process: letting the users do their work, focusing on the specific issues they had, asking questions only to clarify what they were saying and letting them lead us where they felt it was important.

We spent several hours working as a group later in the evening generating a coherent set of observations from our dozens of photographs, hours of recordings and several pages of notes. The first day's sessions generated over 170 observations that became the initial foundation for our design investigations.

As an example, during a 15-minute observation of three debugging engineers, our team recorded dozens of instances when the instrument obstructed the users' desired outcomes. Users were either unaware of these roadblocks, apologized for their own lack of understanding, unconsciously created work-arounds or explicitly brought their concerns to our attention.

On the second day, we facilitated a series of two-hour participatory design sessions focusing on the needs of the individual teams observed the day before. The sessions did not go as smoothly as we would have hoped (see discussion in the sidebar).

[Sidebar: Participatory Design Sessions
Preparation: We introduced the sessions several weeks in advance in teleconferencing calls and emails, highlighting the process, agenda and the like. In spite of this effort, most participants were surprised about our agenda.
Materials: We use low-fidelity materials, for example, flip chart paper and common office products. Because we were traveling, we were able to purchase the materials near our destination. Participants were disoriented by the low-fidelity approach.
Process: We begin with warm-up exercises to get the group into a creative mood. Usually our participants enjoy these exercises. In this case, participants were uncomfortable.
Outcomes: In spite of our rocky start, participants agreed (in a post-process review) that the sessions helped identify important needs.]

We had several objectives for these sessions:
- Prioritize information in terms of its relevance to tasks.
- Identify how the engineers sequenced the information they used.
- Delineate the context surrounding the information.
- Clarify which interactions with information would facilitate debugging tasks.

In our approach to participatory design, we provide users with oversized sheets of paper, markers, yarn, pipe cleaners, glue guns, adhesive dots, sticky-notes and other materials to sketch desirable designs for screens. Working in groups of seven or fewer, participants are provided a blank sheet of paper; their task starts by placing the most important piece of information on the sheet.

Getting the first piece of information down is often the most challenging, but once that barrier is broken individuals participate by writing, editorializing and drawing.

The process requires strong facilitation on the part of the design team: much of the time the author acted as coach and "cheerleader," encouraging individuals to put items onto the sheet, helping them reorganize the elements and teasing out conflicts and clarifications.

The resulting sketches do not directly form the basis for our design; instead, through making marks on paper and making decisions about what elements should go where, the participants provide important insights into the relevance and priority of their information needs.

The participants created ten screen prototypes during the design sessions.

Participatory Design Analysis
The author analyzed the design sessions after returning to the office. Each sheet was hung on a wall or spread out onto a conference table. Audio recordings of the sessions played in the background while the author looked at the elements each participant had placed or annotated. The intention was to dive below the surface of the design to better understand the deeper structures the participants were trying to communicate.

For example, several users insisted specific elements had to be adjacent to one another. In looking deeper into this apparent requirement, the author determined the requirement was not about specific elements but about the relationship between specific types of information and the debug process itself.

Outcomes
The team identified several common themes from the participatory sessions on day two that differed from the first day's observations:
- Anomalies, errors and "stuff that didn't smell right" were rated as more important than patterns of normal data flow.
- Summary or statistical results were far more important than detailed information, especially when initially assessing the acquired data.
- Users wanted to stay in an information view as long as possible – preferably adding more information into the view instead of creating and navigating to a different view.
- When they had to navigate away, users wanted to minimize the impact of switching from the summary information to detailed views.
User Requirements Analysis
In their discussion of summary information, users listed the need to:
- Know a data element was present or absent;
- Know how many of one data type existed in relation to another;
- Know when data elements were present, relative to one another, in time sequence;
- Quickly view details of a specific data element in the time context of the other elements;
- Know a data element's relative position within the protocol hierarchy.

During the design sessions, users sketched tables of numbers to describe statistical data. For PCIE debugging, these numbers included the total number of packet types at each level of the protocol, the total number of a variety of error types and the total number of custom-defined packets.

Three factors that emerged from the analysis inspired us: the number of summarized data elements, the importance of the summary information and the variability of the data (the data sets changed from one acquisition to the next). These attributes suggested Information Visualization and Tufte's work as starting points for our design. Specifically, sparklines seemed a natural fit to increase the information density of the screens and to help users quickly interpret their numerical data.

The Design of Interactive, Dynamic Sparklines
From our preliminary design research, we prototyped a single screen: the Summary Profile Window (SPW). Our iterative design process incorporated several streams of information:
- User input;
- Fundamental design principles (see sidebar);
- Principles of Information Visualization; and
- Engineering constraints.

[Sidebar: The Team's Design Principles
1. Maintain context: Use Overview+Detail or Focus+Context patterns to maintain orientation as users move through data sets.
2. Leverage users' perceptual systems: Use graphical displays and clean visual design [3,6].
3. Employ semantic as well as visual zoom: Enhance users' navigation through the data sets.
4. Pixels are precious: Strive to make every pixel behave as both a readout and an opportunity for input.
5. Minimize time to first information: No matter how long the final bit takes to arrive, make sure the first bit arrives as soon as possible.]

The effort resulted in a novel form of sparklines that diverges from Tufte's original concept.

Sparklines became integral elements of the SPW (see Figure 1 below and accompanying video material) because of their good fit to users' needs for information density. In spite of our confidence in these data elements as well-established visualizations, nothing like them existed in our users' contexts. We were concerned about their novelty; user skepticism gave us reason to worry.

The SPW is primarily a table of summary data. The rows of the table correspond to the information hierarchy of PCIE. Users open the rows to drill down into greater detail or close them to view summary information. The columns display the total for each element in the acquisition as well as the total in "the Viewfinder," a user-specified region of time within the acquisition. Within each row, a sparkline displays the aggregate values for that element, ordered in time sequence across the entire acquisition.

PCIE data flow in two directions – an important attribute users need to see. In the SPW, columns distinguish between the upstream (Up) and downstream (Dn) traffic.

[Sidebar: The Summary Profile Window employs several types of zoom. Semantic zoom is applied in the hierarchy of the table: users drill down into more detailed information by opening up the outline. The Viewfinder column uses a second type of semantic zoom. Here, users see quantities of each protocol element within a specified region (the Viewfinder) of the sparkline. Dragging or sizing the Viewfinder updates the Viewfinder column; of greater utility, the interaction updates adjacent data views (not shown) by scrolling their data into view. Non-zero values in the table are hyperlinked to other, more detailed, data views. When users click on a hyperlink, the other view scrolls to the first instance of the hyperlinked element.]

[Sidebar: Users leverage the power of small multiples and sparklines by visually comparing the sparklines. The patterns among the different data flows are self-evident.]

Figure 1: An early version of the Summary Profile Window showing the relationship of sparklines to tabular information. See accompanying video of the SPW updating in real time.

Performance and Engineering Constraints
A debugging engineer's biggest concern is performance. Not only must our instrument keep pace with the circuit's data stream, it must display the desired information within seconds.
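The Viewfinder behavior described above — per-element totals recomputed for a user-selected time region, with non-zero values hyperlinking to the first matching instance — can be sketched roughly as follows. This is a hypothetical illustration only; the names (`Event`, `viewfinder_totals`, `first_instance`) are invented for the sketch and are not the product's API:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    """One decoded protocol element from an acquisition."""
    timestamp: float   # seconds relative to the trigger
    kind: str          # element type, e.g. "MemRd", "MemWr", "Ack"

def viewfinder_totals(events, start, end):
    """Count each element type inside the Viewfinder's time region."""
    return Counter(e.kind for e in events if start <= e.timestamp < end)

def first_instance(events, kind, start, end):
    """Target of a table hyperlink: earliest matching event in the region."""
    matches = (e for e in events
               if e.kind == kind and start <= e.timestamp < end)
    return min(matches, key=lambda e: e.timestamp, default=None)
```

Dragging or resizing the Viewfinder simply re-invokes the counting with new bounds, while adjacent detail views scroll to whatever `first_instance` returns — which is why the two interactions can stay synchronized.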
In our analysis of sparklines as a powerful visualization of summary information, we had to consider the performance constraints imposed on the solution by our hardware and software.

STATIC VERSUS DYNAMIC DATA
Sparklines on the printed page don't change – they have no performance constraint other than their printed resolution. Sparklines in interactive applications must be updated within a reasonable period of time – in Google Finance, for example, changing the data set updates the sparklines within one screen refresh – apparently instantaneously. In these and other interactive applications, the data set is pre-computed – it doesn't vary from moment to moment. This is not the case in the real-time capture environment of T&M: except under artificial circumstances, the flow of data in a circuit is in constant flux. An instrument must compute sparkline information in a period of seconds.

LAWS OF PHYSICS
The logic protocol analyzer is a PC-based standalone machine – it doesn't have access to the cloud, to server farms sitting in far-off places, or even to local high-speed servers over fast, wide networks. For many customers, the system is completely standalone; for others, it is connected to a host PC directly or through a network. The host may include top-of-the-line multi-core processing power, but our application is specified to run on stock PCs.

Our company takes great pride in its custom-designed hardware that acquires the data—it is a differentiating technology. But moving the data through the host system after it is acquired, and displaying it as meaningful information, has been a challenging system design problem. Moving and processing all of the data (which the summary statistics and sparklines necessitated) is made more challenging by the user-selectable size of the acquired data set: users may choose an acquisition size varying between roughly 1 KB and more than 16 GB.

USER EXPECTATIONS
Debugging engineers understand the burden large data sets impose on our instrument, but their expectations for performance do not diminish as data sets grow:
- Acquire as much data as requested in a few seconds or less;
- Display the entire acquisition on screen within a few seconds;
- Drill down into the data set to explore anomalies or patterns of interest instantaneously.

A quick calculation reveals it is impossible to move and process 16 GB of data to the screen in these timeframes using desktop-scale computers. Our solution had to overcome the constraints of the users' computing systems while still meeting their expectations for rapid visualization of the data.

Sparkline Design Details
We had to modify the canonical sparkline in several ways to integrate it into an interactive visualization. Some of these changes diverged from Tufte's original design intentions.

NAVIGATION MARKS
Most T&M instruments share a common vocabulary regarding data annotation and navigation:
- Cursors: one or more annotations the engineer uses to identify specific events of interest.
- Trigger: the event that caused the instrument to take an acquisition – a mark of special interest, as the user is likely interested in this event over all others.

According to Tufte, highlighting specific data points in a sparkline enhances their utility: a maximum value, a minimum value, the end value and/or values that deviate from a known good value. According to our user feedback, these points are of less value than the annotations on which users rely to navigate through the data. As a result, we enhanced our sparklines with navigational markers (two cursors and the trigger point) rather than the canonical values promoted by Tufte.

[Sidebar: In Tufte's design of sparklines, specific values are called out — Min, Max, First or Last, for example. In T&M contexts, other marks are more important: cursors and the trigger point. Users can quickly compare the reference point of a cursor in one sparkline with the same point in a different sparkline. The debugging engineer uses this objective grid of reference points to investigate the relationship of one set of data to another.]

Figure 2: Detail of sparklines. This vignette out of the SPW shows the upstream data traffic (on the left) versus downstream traffic for two types of transactions: Memory Reads (on top) and Memory Writes (below).

SEMANTIC ZOOM
Each value in the table not only provides the total number of elements in the acquisition (or Viewfinder), it also satisfies a user expectation to drill into the details of the data set. Each non-zero value is a hyperlink to the first instance of the element in an associated detailed view of the data.

Each sparkline takes advantage of a related form of zoom through the use of the Viewfinder: by adjusting the Viewfinder boundaries around a pattern of interest, users simultaneously cause detailed information views (in adjacent windows) to scroll to that region in the acquisition.

Based on the design research, we believed the addition of way-finding (through the use of marks) and interactivity (through the use of an interactive Viewfinder) would enhance the user experience. These were the first two ways in which we diverged from Tufte's definition of sparklines. During our evaluation sessions, we listened carefully for how these additional elements might negatively impact sparklines as dense data words.

An equally important issue was the speed with which we could render the sparklines. In Tufte's discussions, sparklines are printed; in the online examples, they are rendered virtually immediately. Given the performance constraints of our equipment, we were concerned about the implications for (and risk to) the user experience if we failed to render the sparklines quickly enough.

PROGRESSIVE DISPLAY OF DATA
We solved our performance constraints by progressively displaying the data: updating sparklines (and the table data) as the system processed the data.

The team considered several different approaches to updating:
- Checkerboarding (depth before breadth): update a chunk of sparklines, from the top down, completing the first chunk in its entirety before proceeding to the next.
- Striping (breadth before depth): update an entire region across all sparklines, from top to bottom, advancing all elements in each cycle.
- Interrupt-driven: allow the user to stop the update, take a new acquisition, put the window aside and so forth.

The sparklines became animated data visualizations, potentially enhancing (or degrading) engineers' understanding of the data. Throughout our iterative process, engineers offered their priorities for updates:
- Update around the trigger first.
- Prioritize root-level elements of the protocol hierarchy before the details.
- Prioritize updates inside the Viewfinder before updating outside.
- Update the display when the engineer moves the Viewfinder rather than continuing to process data in the Viewfinder's prior location.

Animating sparklines was the third way in which we deviated from Tufte's original concept.

SPARKLINE SCALING
Tufte describes in depth the relationship between y- and x-axis scaling to paint sparklines in the best possible light – to leverage the viewer's perceptual capabilities to their maximum advantage. In brief, slopes of lines should stay close to 45°.

y-axis scaling
Finding a proper scale to best display a sparkline is easy when the range of values in the data is relatively narrow—within one or two orders of magnitude. When the range is several orders of magnitude, however, the choice of scale is crucial to making sparklines readable. Similarly, to allow viewers to compare data quickly, small multiples should all have the same scale.

A PCIE stream contains several sub-streams, each associated with a layer of the protocol. An acquisition will likely contain symbols associated with the physical layer, data-link packets associated with the data-link layer and transaction packets associated with the transaction layer. On the physical layer there may be tens of thousands of symbols, while on the transaction layer there may be only a few packets.

What scale should the sparklines have on the y-axis to maintain the advantage of small multiples but still reveal meaningful information? Our initial prototypes inadvertently scaled each sparkline independently, a violation of the theory of small multiples. Subsequent releases, in which we applied a universal scale, were problematic for different reasons. The answer, as the final section of the study describes, was a fourth deviation from Tufte's design.

x-axis scaling
In Tufte's introduction to sparklines, each point on the x-axis represented a single event (a baseball game, a sample of glucose) or a discrete amount of time – a day, a week, a year.

We believed debugging engineers would understand the x-axis represented time because of their familiarity with time-oriented data. We didn't know what unit of time would be most important to them.
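The update priorities the engineers gave us for progressive display — Viewfinder before everything else, root rows before detail rows, the trigger region first — amount to a sort key over pending sparkline chunks. The following is a hypothetical sketch of that idea (the chunk representation and names are invented for illustration, not the instrument's code):

```python
# Order pending sparkline chunks so the most useful pixels paint first:
# inside the Viewfinder before outside, root rows before detail rows,
# and nearer the trigger before farther away.

def update_order(chunks, trigger_centile, viewfinder):
    """Sort sparkline chunks by the engineers' stated update priorities.

    chunks: list of (row_depth, centile) pairs; row_depth 0 = root row.
    viewfinder: (start_centile, end_centile) of the selected region.
    """
    vf_start, vf_end = viewfinder

    def priority(chunk):
        depth, centile = chunk
        outside_viewfinder = not (vf_start <= centile < vf_end)  # False (inside) sorts first
        distance_from_trigger = abs(centile - trigger_centile)
        return (outside_viewfinder, depth, distance_from_trigger)

    return sorted(chunks, key=priority)
```

Because the key is recomputed whenever it is called, moving the Viewfinder and re-sorting the remaining chunks naturally implements the last priority on the list: processing abandons the Viewfinder's prior location in favor of its new one.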
In addition, our sparklines didn’t map one display point complained they couldn’t determine whether theto one data point as Tufte’s did. In our case, each point system was hung or still processing.on the x-axis represented a slice of the acquisition – 2. Users didn’t know when the sparklines wereinitially a centile – one hundredth of the total time. finished. They suggested a progress bar to help know when the processing was complete.Would our users figure out the sparkline x-axis 3. Applying a universal scale across all of therepresented an arbitrary slice of time? Users quickly sparklines proved problematic: elements with veryaccepted this approach and then immediately expected few instances, such as errors, might be far moremore. Their interest in an arbitrary slice of time as a important than elements with thousands ofmeans of exploring the data has stretched the limits of instances, but the small numbers were attenuated.a sparkline. The discussion in the detailed evaluationsection below highlights some of these unanticipated 4. Some users wanted to treat each pixel on the(yet desirable) interactions. sparkline as a single data point, expecting to be able to click on a point and navigate directly to aEvaluation spot in the acquisition. That the points representedPrototype Phase an aggregate of a centile of elements was not self-In our monthly releases of the window to a key user evident from the displays.group, we provided working copies of early prototypes 5. Other users were frustrated by their inability toand received opportunistic feedback via email and zoom into the sparklines and explore the aggregateduring weekly conference calls. We also met on-site, data in depth, in situ. These users obviouslyletting users walk us through the interface while we understood the arbitrary slice concept.solicited detailed feedback. Overall, users spent several 6. Users accurately reported the meaning of and needdozens of hours working with the prototypes. 
for marks (cursors and trigger point). 7. Users validated the advantage of small multiples byEarly Working Phase pointing to artifacts between two or moreSubsequent to the initial prototype iterations, our sparklines when discussing the data.development teams integrated the window with theacquisition hardware. With the integration completed, 8. Most importantly, in spite of sparklines’ novelty,we could better evaluate and optimize system users understood their value to validate theirperformance using a combination of hardware and understanding of the data flows or to highlightsoftware to improve the update rates. potential areas of concern.During this phase, users offered significant feedback: Usability Study We ran a small usability test of the worst case1. The update rate of the early iterations was so slow (predicted) update rate using the checkerboarding as to make the window practically useless – users (depth-first) banding scheme to determine if this
During the test, we discussed this approach and alternative approaches to determine what concerns, if any, users had about each.

We performed the test with four users, individually. Three participated remotely (using WebEx™ and teleconferencing) and the fourth was in person.

The test (a brief Adobe Flash™ prototype, see accompanying material) focused on four objectives:

1. Definition of Done. Could we assist users in knowing when the sparklines finished rendering without using a separate progress bar?

2. Impact of Animation. Did the manner and direction of updating the sparklines matter? This included the duration of the update and whether the update started with the trigger, the Viewfinder or from one edge.

3. Scaling Using Max Value. Did adding a max value at the end of the sparkline assist in understanding the y-axis scale? Did it help or hinder the comparison between data elements? Would users prefer to adjust the y-scale maximum?

4. Independent versus Uniform Scale. Would users prefer to apply a uniform scale across all elements or let each element scale independently? Would they prefer a logarithmic or a linear scale?

Design Changes
As of this writing (September 2010), the SPW has been in front of users for almost 15 months in the form of paper prototypes, interactive low-resolution prototypes, interactive high-resolution prototypes, early working code and a focused usability test. We have converged on a direction to address the several concerns raised by users over this period.

y-axis scaling
After analyzing the range of values in PCIe streams, we've concluded a log scale for the y-axis is the best approach. We are encouraged by results from pre-testing with our internal debugging engineers.

Figure 3. Log displays (above) and linear displays (below) of the same data. Even without knowing the actual y-scale of these figures, the effect of log scaling versus linear scaling is easily seen. The log scale in the upper pair of figures reveals variability more appropriately: lesser quantities are accentuated and minor variability of higher quantities is dampened. The linear scale in the lower pair of figures does exactly the opposite: the variability of smaller quantities is attenuated and larger-quantity swings are enhanced. Large swings in large quantities are not nearly as important to users as large swings in small quantities.

In addition, rather than applying a universal scale across all sparklines, each root element (the element at the very top of the tree hierarchy) is scaled separately; each child is scaled according to its root. We struggled with this approach, given it would violate the compelling rationale of small multiples, but the decision is justified by the nature of the data.

The root elements differ significantly from each other: although they are all part of the same data stream, they are fundamentally different sets of numbers. There is no expectation to compare the sub-elements from the Transaction Layer with the sub-elements from the Physical Layer, for example.

The max value is placed into its own column rather than attached to the sparkline itself. This removed confusion about its meaning and permits us to expand on the possibilities for the column. Even with it separated from the sparkline, it provides a helpful dimension to a sparkline's scale.

We have also concluded the maximum y-scaling should be determined by the system and not be user-adjustable. The estimated impact of the additional user interaction, along with preliminary tests of the logarithmic scaling, convinced us adjustable scaling had marginal benefit.
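The rationale for log scaling can be sketched numerically. The following is a minimal Python illustration with invented counts and names, not the product's code: under linear scaling, a handful of errors collapses to invisibility next to bulk traffic, while a log scale keeps both legible.

```python
import math

def scale_heights(counts, max_height_px=12, use_log=True):
    """Map per-slice element counts to sparkline bar heights in pixels."""
    # log1p keeps a count of 0 at height 0 while compressing large counts.
    xform = math.log1p if use_log else float
    top = max(xform(c) for c in counts) or 1.0
    return [round(xform(c) / top * max_height_px) for c in counts]

counts = [3, 0, 5000, 1200]  # e.g. rare errors alongside bulk traffic
print(scale_heights(counts, use_log=False))  # → [0, 0, 12, 3]  (errors vanish)
print(scale_heights(counts, use_log=True))   # → [2, 0, 12, 10] (errors visible)
```

Scaling each root element's family separately, as described above, amounts to applying such a mapping once per root rather than once across all sparklines.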
x-axis scaling
Using a centile (1/100th) slice of the acquisition resulted in unacceptable performance. Slices were reduced to a quadragintile (1/40th) of the acquisition with no apparent reduction in the legibility or usefulness of the sparklines.

Definition of Done
All users understood the way we had rendered the meaning of done. As a result, we did not have to add progress bars or other interfaces – each sparkline served as its own progress bar, as Figure 4 below illustrates.

A selection of frames from the Flash usability prototype shows the progressive updating of the sparklines. One question resolved by the test was whether the sparklines could act as their own "progress" bars. In the test, two indications were provided: a change in tone from grey to black and the addition of terminating blocks. In the final design, the terminating blocks were removed to reduce visual noise.

Figure 4. Usability Prototype for Definition of Done. These selected frames from the Flash prototype show the sparklines updating from right to left using a checkerboarding scheme. They indicate they are complete by changing their tone from grey to black and terminate with 2x2 pixel blocks.

Updating
Dynamically updating the sparklines has the greatest potential to negatively impact the user experience. For some types of acquisitions, our updating algorithms take advantage of our proprietary acquisition hardware. In other cases, we use software algorithms to process the data. This means users experience different update behaviors based on the acquisition and the types of algorithms we employ.

As clever as we've been so far, however, we expect we'll need to do further work to reduce the impact of updates on the user experience. As users grow accustomed to acquiring larger data sets, they will likely tire of waiting and desire better performance.

Final Thoughts and Next Steps
The sparklines turned out to be one of the most controversial elements in our design proposals. Users were unfamiliar with them, they didn't exist in the "other guys'" interface and, being novel, there wasn't sufficient precedent for users to understand their benefit.

Our iterative design process was crucial to getting the design of sparklines right and helping overcome user resistance to our early rough prototypes.
Throughout our discussions, users expressed a tension between sparklines as a summary element versus the need to access more detail. The sparklines' novelty (in the users' context) inspired them to raise creative suggestions for enhancing sparklines further.

Clicking on a data point in the sparkline to navigate to a specific detailed element.
This idea, raised by several users, is problematic. The data points in the sparkline aren't individual elements, but rather an aggregate of the slice of elements. On the other hand, we are permitting a similar interaction in the hyperlinked elements in the adjacent table, so why not here as well?

We are considering the possibility of hyperlinking to the first instance of the element in the slice when the user clicks on a point in the sparkline.

Moving or placing marks in the sparklines.
Users wanted to place marks on the sparklines in addition to using them for orientation. This reveals an important underlying issue: many users interpreted sparklines as miniature forms of the most standard view of data in the T&M domain: the waveform view.

The waveform view is the most robust data view in our product. As a result, engineers often express their tasks in terms of working in the waveform view. As they explored the sparkline, the engineers offered suggestions about how to make it more waveform-like. If we don't enhance the sparkline with waveform-like interactivity, users may find the experience frustrating. Alternatively, we could reduce the waveform-like qualities of the sparkline by rendering it differently, perhaps as a miniature histogram, for example.

Using the Viewfinder to Zoom into the data set.
Users want to zoom into the data set in the context of the sparkline itself, as they are accustomed to doing in the waveform view. In trying to open up the slices to see more information, users clearly understand the aggregate nature of the x-scaling.

If we pursue a zoom function, we will need to carefully consider the Overview+Detail and Focus+Context patterns to reduce user disorientation as they move into the data.

Progressive Display and Interleaving.
We are still concerned about the sparklines' rate of update, especially with very large acquisitions. If update times exceed the maximum times users will tolerate, we will likely explore alternate methods of updating sparklines.

Are we really talking about sparklines?
In retrospect, we may have stretched the definition of sparklines beyond Tufte's original intention. He had designed them to be small, intense, simple datawords occupying space in a block of text or a cell of a table. We have enlarged these beyond the absolute minimum, reducing their information density as a result.

Users have come to embrace the sparklines, finding them an invaluable aid in their debugging efforts. Equally importantly, our sales teams have successfully moved prospects from "the other guys" to our solution, in part because of these engaging visualizations.
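The hyperlink-to-first-instance idea raised earlier could be prototyped along the following lines. This is a hypothetical Python sketch (the element storage and function names are invented, and the product's code is not shown here): map the clicked slice back to its time window, then find the earliest matching element inside it.

```python
import bisect

def first_instance_in_slice(timestamps, t_start, t_end, n_slices, clicked):
    """Return the index of the earliest element in the clicked slice.

    `timestamps` holds one element type's timestamps in acquisition
    order (sorted ascending). Returns None when the slice contains no
    instances of that element.
    """
    width = (t_end - t_start) / n_slices
    lo = t_start + clicked * width        # left edge of the slice
    hi = lo + width                       # right edge of the slice
    i = bisect.bisect_left(timestamps, lo)  # first element >= lo
    if i < len(timestamps) and timestamps[i] < hi:
        return i
    return None

ts = [0.05, 0.10, 0.30, 0.60, 0.61, 0.90]   # seconds into the acquisition
print(first_instance_in_slice(ts, 0.0, 1.0, 4, 2))  # slice [0.50, 0.75) → 3
print(first_instance_in_slice(ts, 0.0, 1.0, 4, 1))  # slice [0.25, 0.50) → 2
```

Jumping to a concrete element this way sidesteps the objection that a sparkline point is an aggregate: the click resolves to a well-defined representative of the slice rather than pretending the point is a single datum.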
This case study has highlighted the use of Edward Tufte's sparklines as a novel form of visualization in the context of a real-time test and measurement instrument. To improve their utility and usability, we diverged from Tufte's original concept in several ways:

 We used sparklines to summarize large data sets in real time.

 We used sparklines as the foundation for navigation into the data sets.

 We used sparklines as interactive elements within an information visualization space.

 We used sparklines as built-in progress indicators to progressively display data as the instrument processes it, continuously and smoothly animating the sparkline and transitioning it from grey to black to reinforce our users' understanding of "done."

We could have copied "familiar" offerings from a competitor and missed an opportunity to improve the debugging engineers' experience. Instead, we achieved two strategic goals through our design process: accelerating bug resolution for our users and offering a distinctive, competitive solution.

Acknowledgements
Huge thanks to the product line's team of hardware and software engineers who helped develop sparklines. Additional thanks go to the numerous debugging engineers who put up with extraordinarily bad early versions of our software. Finally, appreciation goes to David Stubbs for his detailed editorial suggestions. Any errors are solely the responsibility of the author.

Citations
(All Web citations were accessed on 23 Sep. 2010.)

 All of Zero. TraxStocks. http://www.allofzero.com/traxstocks/

 Ask E.T.: Sparklines: theory and practice. http://www.edwardtufte.com/bboard/q-and-a-fetch-msg?msg_id=0001OR&topic_id=1&topic=

 Card, S.K., Mackinlay, J., Shneiderman, B. Readings in Information Visualization: Using Vision to Think. Morgan Kaufmann Publishers, San Francisco, CA, USA (1999), 285-310.

 Google Finance. http://www.google.com/finance

 IEEE.org. http://ieee.org/searchresults/index.html?cx=006539740418318249752:f2h38l7gvis&cof=FORID:11&qp=&ie=UTF-8&oe=UTF-8&q=sparkline&siteurl=ieee.org/index.html

 The ACM Digital Library. Search query: "sparkline". http://portal.acm.org/portal.cfm

 Theisen, K., Frishberg, N. DataPlace: Exploring Statistics about Cities. CHI 2007, p. 2. http://www.andrew.cmu.edu/user/cdisalvo/chi2007workshop/papers/DataPlace-workshopCHi2007-3.pdf

 Tufte, E.R. Beautiful Evidence. Graphics Press LLC, Cheshire, CT, USA (2006), 46-63.

 Tufte, E.R. Envisioning Information. Graphics Press LLC, Cheshire, CT, USA (1990), 67.

 Ware, C. Information Visualization: Perception for Design. Morgan Kaufmann Publishers, San Francisco, CA, USA (2004).

 Yahoo! Finance. http://finance.yahoo.com/