Relevancy and Context: The Key to Great Mobile Experiences

While we've been using the term "personal computing" for well over 40 years, the device in your pocket is really the first computer worthy of the moniker. Consider that the typical smartphone possesses the senses of vision, audition, mechanoreception, equilibrioception, and thermoception; and they're all backed by information regarding the user's location, friends, schedule, correspondence, etc. Now, top it off with a full-time connection to the Internet. If we define context as the sum total of everything that the user is experiencing at the moment of engagement, then the mobile device has the unique ability to gather contextual information and provide relevant content in ways never before imagined.

(From David's Digital Atlanta 2012 presentation: http://digitalatlanta2012.sched.org/speaker/davidreeves1#.UJrw7GCD2oI)

Transcript

  • 1. Relevancy and Context: The Key to Great Mobile Experiences (@davidreevesATL, @22squared, #DigATL, #mobilecontext)
  • 2. In 1986, I wanted one of these. Badly. I mean really badly. I grew up about 90 miles from Atlanta, in a very small town in rural Alabama. The local Radio Shack in Piedmont, Alabama, had one of these - the Omnibot 2000 - on display. The problem was that the store managers never put batteries in the floor model. So that left the capabilities of the Omnibot 2000 to my imagination. And I had a really active imagination. At night, I'd sketch out all the things that I'd train Roy to do for me. (That's what I called him. I have no idea where the name came from.) I was going to teach him to wake me up in the morning with a gentle nudge and a Coke. As I dressed, he would scan the house for my books and arrange them neatly by the front door. Never again would I show up to school without a pencil or notebook. I had no freakin' idea that only one of his arms worked and that he actually couldn't sense much of anything. He had a couple of ultrasonic sensors that allowed him to detect (crudely) when he was about to run into a wall, but that was it. Those eyes? They didn't see anything. That left arm? Just for looks. And that glass of bourbon he's holding in his right arm? Yeah. Well, good luck with that. I never got an Omnibot 2000. And it's probably a good thing, because I would have been one disappointed kid. But lately, I've been thinking about robots again, though not in the same way that I did in 1986. Let's start here:
  • 3. ro-bot noun /ˈrōˌbät/ /ˈrōbət/ robots, plural. A machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer. Here's one definition of robot that comes up with a quick Google search. There's a lot of debate within robotics circles about what is and isn't a robot, but this definition is a good start. For many of us in the room, our idea of what is and isn't a robot is rooted in science fiction: those humanoid machines with super-advanced artificial intelligence. The Rosie the Robots and the Lost in Space robots of our youth shaped our ideas about what it means to be defined as a robot. But this definition is a little broader and, honestly, more accurate. A robot doesn't need arms and legs to carry out a complex series of actions. What it does need is information about its surroundings and the ability to respond in a meaningful way.
  • 4. ro-bot noun /ˈrōˌbät/ /ˈrōbət/ robots, plural. A machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer. So, let's cut Roy some slack. It's not that he wasn't up for the job. He just wasn't equipped for it. He wouldn't have ever been able to roam the house finding books because he didn't have any way of sensing much of anything about his environment. Those eyes are just lightbulbs, the ears are just pieces of molded plastic.
  • 5. Now consider this guy. Let's play a little game. How many sensors do you think are packed into the iPhone 5? Capacitive touchscreen, two cameras, Bluetooth, WiFi, cellular radios, accelerometer, digital compass, microphones, proximity sensor, four buttons, one switch, ambient light sensor, internal temperature sensor, GPS radio. This is certainly a device capable of carrying out a complex series of tasks automatically, in part because it knows a tremendous amount about its environment. Another way to put it: this device knows about its context.
  • 6. ro-bot noun /ˈrōˌbät/ /ˈrōbət/ robots, plural. A machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer. This device comes a lot closer to fulfilling the ideas I had about Roy's capabilities than anything I could fathom at the time. And it's possible, in part, because of its ability to sense what's going on around it. In other words, it's able to sense a lot about its context.
  • 7. con-text noun /ˈkäntekst/ The sum total of everything the user experiences at the moment of engagement. You'll hear context defined in a variety of ways. For the purposes of this discussion (since we're here to talk about building great mobile experiences), we'll define context as the sum total of EVERYTHING that the user experiences at the moment of engagement. That could be time of day, location, mood, their social network, or even combinations of these factors. (For example, the proximity of their friends at any given moment.) For the next hour or so, we'll walk through just what this device knows about its environment, as well as how to use it to create really great user experiences. We'll take a look at some great examples (and maybe point out a couple of almost-there-but-not-quite-yet examples). Then, we'll talk about what the future holds.
  • 8. touch noun /təCH/ The faculty by which external objects or forces are perceived through contact with the body (especially the hands). Let's do a little exercise. I'd like everyone in the room who possesses any type of mobile phone to raise their hands and keep them raised. Now, if the primary method of input is by interacting with a touchscreen, put your hand down. OK, how many of you have a stylus? And finally, how many people interact through a physical keyboard or keypad? It's kinda funny, right? If I'd done this five years ago, this exercise would have ended much differently. Why is it that devices that utilize touchscreens have, for the most part, completely replaced those devices with other input mechanisms? When it comes to typing, I don't know many people who would say that touchscreens allow them to type faster. In fact, that's the primary argument against touchscreens for most people. I was one of them. I could type pretty darn fast on my Blackberry, and I could do it without looking at the device. That was awesome in rush hour traffic. I'd drive with one knee on the steering wheel and type like a fiend with the other two hands - well, actually thumbs - all while juggling a Diet Coke. I have a theory about this. It all starts with the personal nature of mobile phones. Never before has a device truly been worthy of being called a "personal computer." Mobile phones connect you to your work, your friends, your spouses and partners, your parents and your children. Your vacation photos are there, and the photos you'd rather not share with the general public. According to research from the Pew Research Center, most of us sleep with our phones less than 10 feet from us. Does anyone here sleep with their laptop within arm's reach? If there's anyone in the room who disagrees with the "personal" thing, then please walk up here, unlock your phone, and let me read whatever I want to the audience... for the remainder of this presentation. So, it's a personal device. Got it. But that's only one part. The other part is, back to my point, the touch part. Smartphones are, by and large, devices used for content consumption. Content that is personal in nature. And interacting with that content (on a mobile device, anyway) via touch is new. You use your fingers to pinch to zoom in on that photo of your new nephew. When an annoying email hits your inbox, you roll your eyes and, as if by reflex, your right thumb swipes from left to right and with one pleasure-filled tap of delete, all memory of the thing that annoyed you disappears. Touching content brings us closer to it. We're touching pixels; but at a deeper level, we're touching the thing behind the content. It feels natural because that's the way we're programmed to interact with the world. Babies touch things. They reach for and grab objects. No one taught them to do that. Touch makes us human, and it makes objects - and content - come alive. (By the way, if you were a dog, you'd put it in your mouth. But that's a different presentation.) Touch is emotional. "It's touching. That's touched my heart. Reach out to someone in need." These metaphors ring true across all cultures. Ok. You get it. So, let's take a look at some apps that do a really good job with touch.
  • 9. Clear: Who would have thought that something as boring and mundane as a to-do list could be thrilling to use? Maybe thrilling is a bit of a stretch, but the to-do app, Clear, pushes the boundaries (in a good way, for the most part) of how gestures can make dull-as-paste activities like task management a little more palatable. Swiping, pinching, tapping and holding, panning gestures... you name it, this app takes advantage of touch unlike any other. It's not entirely intuitive in places, but the app does a good job of coaching you in the basics at start-up, and users can quickly navigate once they learn how the app is organized. The takeaway here is that gestures, once learned, can altogether eliminate the need for menus. Life doesn't have a menu system, and with a great mobile app that implements gestures in smart, intuitive ways, neither does your app. The content becomes the control, and even the spaces between the content. Letter School: If iPhones were around when I was in elementary school, and I had Letter School, maybe my handwriting would be a little more legible than it is today. This is one of my favorite touch examples for two reasons: one, it's just fun. I mean seriously. This game was written for kids, but it's beautifully art directed. The animations, the sounds, the interaction design - it's just a home run. The app works like this: letters of the alphabet are presented first as an example like "R is for Robot." Then, the letter is shown with animated cues to tap (in order) which demonstrate how you'd write the letter. Next, you trace the letter with visual coaching. Finally, you're on your own to draw the letter solo. Make a mistake? You're gently prodded and shown the correct way to write the letter. But, back to the second thing about this app that I love: it actually invites touch. From launch, and the animations... If I were forced to find a fault with this app - and trust me, it's hard - I can only find one thing, and it's so small that it's really ridiculous to even mention. Every letter has a pretty good starting point. "A is for Ant," "Z is for Zebra," "K is for Kite," and then you get to Q. "Q is for Quenton," a person's name. What if you're a kid and you don't know anyone named Quenton? What if "D was for David?" Maybe quack, or quilt, or queen. I'm not suggesting big words like "quarantine" or "quantum," but something other than a common name. That's a stretch, I know. I mean seriously, if that's my only gripe about this app, you know I'm reaching for something. The designers of this app seriously hit the nail on the head here.
  • 10. LEGO Harry Potter: How many people here read the Harry Potter series? One thing that author J. K. Rowling did very well was explain a myriad of spells to the reader. As the characters learned spells, she painted a picture of the wand movements and the precise incantations. LEGO Harry Potter recreates this by training game players in spell-casting through the use of gestures. In this screen shot, the player (as Harry Potter) is casting the Wingardium Leviosa (levitation charm) spell. There isn't a menu, but the spell is cast on the item by making a gesture that corresponds to the spell. (On a side note, this gesture mimics the same wand gesture that Hermione used in the first Harry Potter movie. I don't think that's by accident.) The Incendio spell, which sets things on fire, is cast by drawing a symbol that resembles a flame. Genius.
  • 11. touch: • Use content as controls • Invite touch • Touch is more than taps. The takeaways: use content as controls; invite touch; touch is more than taps - gestures can create connection and provide immersion. (A rough gesture-recognizer sketch illustrating content-as-controls appears after the transcript.)
  • 12. sight noun /sīt/ The faculty or power of seeing. Next up is sight. When we think about sight, we think about the recognition of objects. We look at a face and we see a friend. We look at a dozen cars and we're able to determine which one is ours. But let's dive a little deeper. What we're really talking about is the ability to sense a very narrow band of the electromagnetic spectrum. It's just between infrared and ultraviolet, what we refer to as the visible light spectrum. Normally, when we use the term "light," we're referring to the range of these waves that stimulate the retina in our eyes. Each individual wavelength is representative of a different color. (Physics again. Ugh.) The camera app in that smartphone of yours is, at its core, just sensing and recording different wavelengths of light. Your brain is responsible for the heavy lifting and making sense of it. The act of seeing is so natural that it's difficult to appreciate the vastly sophisticated machinery underlying the process. It may come as a surprise, but about one third of the human brain is devoted to vision. The brain has to perform an enormous amount of work to unambiguously interpret the billions of photons streaming into the eyes. Let's talk about how we can start using this sensed light in some interesting ways.
  • 13. Google Search (for the iPhone): When I have the opportunity to work on mobile experiences, whether it's a native application or mobile web, one of the first things I focus on is the way that the app solicits input from the user. In most cases, I find that apps introduce barriers to usability. Why type a search phrase when your phone can see what's in front of you? Google saw - no pun intended - an opportunity here, and they provided a method for search via image recognition. It isn't perfect, of course, but it's pretty good with text, book covers, and objects with distinct branding. Instant Heart Rate: I love this app because it uses the camera sensor for something that isn't immediately apparent. Because tiny capillaries in the tips of your fingers fill and empty with blood on each heartbeat, the color on the surface of your skin changes ever so slightly. You may not have ever noticed it, but the developers of Instant Heart Rate did, and they leveraged the camera on the iPhone to detect these subtle changes over time to calculate your heart rate. At first, I was a skeptic. But after repeated use, I'm sold, as are thousands of other users and health magazines, personal trainers, and physicians who have validated the app's accuracy time and time again. (A simplified sketch of this brightness-to-heart-rate idea appears after the transcript.) Resistor Photo ID: This is my absolute favorite of late. About three months ago, I was bitten by the Arduino bug. No, it's not some exotic insect. It's an open source electronic rapid prototyping platform. In a nutshell, it lets you build stuff. It's popular in DIY and modification circles. Basically, you can build new gadgets. But when I started tinkering with the platform, I ran into a big problem: resistors. Without getting too geeky, a resistor is a part in an electric circuit that resists the flow of electricity. This is an important component when designing new gadgets because different components may be particularly sensitive to even small changes in voltage. For example, to control the flow of electricity through LED lights to a safe value, you put a resistor in the circuit. But here's the deal with resistors: they come in different sizes. You can tell by looking at a resistor what its value is because they're all coded with these nifty little color bands. The number of bands and their colors in various positions are used to calculate values. Here's the important part for me: I'm colorblind. Specifically, I'm red/green colorblind. So, picking the right resistor out of a pile of 10 different values is tough for me. Actually, it's impossible. But with this app, I can snap a photo of a resistor and the app does the work of detecting the color values on the individual bands and telling me their value. As long as I know that I need a resistor of X value (for example, 20 ohm), then I can use this app to help me find the right part in the toolbox.
  • 14. sight: • Cameras aren't just for photos • Back to the basics: light • What can the camera see that you can't? The takeaways: cameras aren't just for photos; back to the basics - light; what can the camera see that you can't?
  • 15. hear-ing noun /ˈhi(ə)riNG/ The faculty of perceiving sound. I promise, I'm really trying to stay away from physics, but I have to go back to it for one more minute. We just talked about sight. Sight is tied to your retina (and cameras) sensing electromagnetic waves. Hearing is about the ability to sense sound. Sound is a mechanical wave that is an oscillation of pressure transmitted through a solid, liquid, or gas, composed of frequencies within the range of hearing. There's that word again, range. It's important because the range that the human ear can detect is somewhere between 20 Hz and 20,000 Hz (20 kHz), although these limits are not definite. They change with age: the more Metallica you listened to as a young adult, the more likely the upper range has decreased. They also vary by species. For example, dogs can perceive vibrations higher than 20 kHz, but they're deaf to anything below 40 Hz. It shouldn't be surprising that the microphones in your smartphone are designed specifically to detect sound in the same range as human hearing.
  • 16. IntoNow: Anyone familiar with Yahoo's IntoNow? It's pretty good. I like it because it bridges the gap between two channels: television and mobile (or tablet). Launch IntoNow during your favorite episode of Law & Order and in a few seconds, you have all of the details on special guest stars, plot and, in some cases, a total and complete spoiler which makes you want to hurl your iPad through the nearest window. Shazam: We're probably all familiar with this one. You're in a car (or a bar) and a song comes on that you like but don't know. Shazam solves the problem by listening for a few seconds and... wham. There's your song and artist. It's great for settling pop culture disputes, too. It's not so great at listening to your friend attempt to sing a song and determining what she's trying to perform. It works by referencing a vast catalog of audio fingerprints pre-recorded and stored. When a similar audio fingerprint is encountered later, a match is made and the data along with the song is sent over. (A toy sketch of this fingerprint-matching idea appears after the transcript.) Ocarina (2): Shazam is about recognizing music. Ocarina is about making music. Here, the microphone is used to detect air as it flows over it. As the microphone detects vibrations made by blowing across the bottom of the phone, the phone produces a sound at a particular pitch based on finger combinations. (Think: trumpet.)
  • 17. hearing: • Back to the basics: microphones detect sound waves • Microphones can be used for more than recording video and phone calls. The takeaways: back to the basics - microphones detect sound waves, so be innovative in how you leverage them; microphones can be used for more than recording video and phone calls.
  • 18. taste noun /tāst/ The sensation of flavor perceived in the mouth and throat on contact with a substance. Next up is taste. But, we can't talk about taste without talking about smell, because the two are so closely linked.
  • 19. smell noun /smel/ The faculty or power of perceiving odors or scents by means of the organs in the nose. Both taste and smell are tied to chemoreception. Let's go into biology for some background on taste and smell. To produce a behavioral response in an organism, a chemical must produce a signal in the organism's nervous system. This entails processes that are initiated at the taste or smell receptor cells. First, the molecule must be captured and traverse a layer of mucus, in which the endings of the receptor cell are bathed; these are known as perireceptor events. Second, the molecule must interact with the surface of the receptor cell in a specific way to produce reactions within the cell. These reactions lead to a change in cellular electrical charge, which generates a nerve impulse. Transformation of an external stimulus into a cellular response is known as signal transduction. More simply put, a chemical interacts with a receptor in a way that produces some sort of reaction in a cell. And those reactions result in a signal being generated and passed along nerves to be processed in the brain.
  • 20. Creepy...? Ok, this is getting a little creepy. A phone that can smell and taste? Why would I ever want that? Earlier, we talked about the limitations of the retina and the human ear, and their abilities to pick up signals that fall within a very limited range. Our noses have the same limitations. If we consider dogs, their sense of smell is thousands of times more sensitive than ours. There's evidence to suggest that they may even have the ability to detect a vastly broader range of chemical compounds than we can. A new study adds to the body of research suggesting that man's best friend may actually be able to smell cancer. Cancer detection? We don't know how they do it, but they're doing it. Researchers in Germany found that dogs were able to pick up on the scent of organic compounds linked to the presence of lung cancer in the human body, and that their keen sense of smell may be useful for the early detection of the disease. Four family dogs – two German shepherds, one Australian shepherd and one Labrador retriever – smelled test tubes containing breath samples of 220 patients, both those with lung cancer and those without. The dogs were trained to lie down in front of the test tubes where they smelled lung cancer and touch the vial with their noses. According to the study, the dogs successfully identified lung cancer in 71 out of 100 patients with the disease. On the hardware side, there's a chemical sensor for smartphones out of NASA's Cell-All program. This low-cost, high-speed device could be a huge boon for law enforcement officials who need to quickly assess a chemical spill or possible chemical attack. Knowing just what the chemical is would be a great start, but NASA's press release doesn't say if the device is able to measure concentration with any degree of precision. The device is, at this point, just a proof of concept, but it's an amazing use of technology, and we can't wait to see what other crazy iPhone apps they'll come up with next.
  • 21. But, what about something more useful for us? Imagine a phone that can pick up the scent of your favorite foods and alert you before you're able to smell them yourself. You're visiting a city, it's around lunchtime, you've 'Liked' pepperoni pizza on your Facebook page, and before you know it, your phone is letting you know that there's a great pizza place right around the corner that has fresh wood-fired pizza. And better yet, here's $5 off! Not so creepy anymore.
  • 22. ra-di-o-ac-tiv-i-ty noun /rādē-ō-āk-tĭvĭ-tē/ The radiation, including alpha particles, nucleons, electrons, and gamma rays, emitted by a radioactive substance. http://news.cnet.com/8301-1035_3-57444283-94/softbanks-geiger-counter-smartphone-start-of-a-global-trend/ A few weeks ago, Softbank introduced its summer smartphone lineup for Japan and included a device -- Sharp's Pantone 5 -- that includes a Geiger counter to track radiation. The backdrop for Softbank's Pantone 5 is obvious. In March 2011, Japan was hit with a tsunami and earthquakes that led to a nuclear meltdown. Since the disaster, there has been widespread distrust about nuclear power, the Japanese government's reaction to the event, and disclosure to citizens. Simply put, worries about radiation in Japan will persist for years. Mobile devices capable of sensing dangerous levels of radiation could go a long way to ease fears and could, quite literally, save your life.
  • 23. ra-di-o noun /rādē-ō/ Electromagnetic waves of a frequency between about 10^4 and 10^11 or 10^12 Hz. Crap. Physics again. Is he ever going to stop talking about electromagnetic waves? No. Your phone's ability to send and receive radio waves is what makes it a mobile phone. But, radio waves enable a lot more than just voice communication. There are also radio waves being used for WiFi communication, Bluetooth technology, location-based services through Global Positioning System (GPS) satellites orbiting the earth, and for some phone manufacturers, near field communication (NFC), which enables short-distance communication from one device to another.
  • 24. ...and more: proximity, ambient light, GPS, accelerometer, digital compass, thermometer. That's just the beginning. If you tear apart just about any modern smartphone, you'll find at least a half dozen more sensors hiding in there, silently detecting things about its environment. There are proximity sensors that work through passive infrared light, there are ambient light sensors detecting the brightness and allowing your phone to automatically adjust the brightness of the screen to match its surroundings, there's an accelerometer providing portrait and landscape support, a gyroscope that allows for advanced interaction around multiple axes, a digital compass that allows for navigation and heading information, and even internal thermometers to monitor safe operating temperatures. It's amazing how much this device knows, and it's amazing how much we're underutilizing it. Some of that isn't our fault, though. Apple's software development kit (SDK) doesn't give us a way to use the ambient light sensors through our own code. We don't have access to the internal temperature sensor on the phone, and it might not provide us with the data we really need, which is the external temperature. But don't let that deter you. Apple's SDK does provide us with the ability to harness data collected from external sensors, and it has a mature program for building certified iPhone applications. If you can dream it, you can build it. (A minimal accelerometer-reading sketch appears after the transcript.) Finally, I'll leave you with a few examples of apps that utilize some of these sensors in interesting ways.
  • 25. Sleep Cycle: Sleep Cycle uses the accelerometer to detect movements while you sleep. Less movement equals deeper sleep. Based on a similar concept, there's even an app that measures your performance in other bedroom-related activities. Passion: Passion uses the accelerometer (and the clock) to measure your sexual performance. That's all I'll say about this app. You can use your imagination. Disclaimer: this is not a screenshot from my phone, thank you very much. CrunchFu: CrunchFu turns a task I personally hate (sit-ups) into a game. Unfortunately, it also uses the accelerometer to measure the number of sit-ups, which means I can't cheat. You start the app, hold it against your chest, and the app counts the reps. You can use the app in training mode, or you can challenge your friends to sit-up battles. Pretty nifty.
  • 26. Star Walk: This is another one that I wish I'd had growing up. I mentioned before that I grew up in a small town in rural Alabama. It's not a great place on the excitement meter, but it is a great place for stargazing because there's almost no light pollution. I can't remember the first time I was introduced to constellations, but I do remember learning a few of them and then rushing outdoors at night to try to find them. I wasn't very successful, so I ended up naming my own constellations. I had the cow, the smiley face, the monkey... I was an only child, cut me a little slack. The Star Walk app makes use of so many sensors that I lose track: camera, GPS, accelerometer, gyroscope, various radios, and touchscreen (with really good gesture implementations). My favorite way to use this app is the feature that overlays its database of stars on top of the camera feed, allowing you to easily find planets, stars, and even man-made satellites. You can even look up things like the International Space Station and fast-forward through time to find out when it may be visible in your location. Total geek-out.
  • 27. WHEW! You're thinking two things right now: 1. Wow, that's a lot of sensors. 2. What in the hell does that have to do with context and relevancy in mobile experiences? Well, we're almost there. Stick with me for five more minutes.
  • 28. con-text noun /ˈkäntekst/ The sum total of everything the user experiences at the moment of engagement. re-le-vant adjective /ˈreləvənt/ Closely connected or appropriate to the matter at hand. The concept of relevance is studied in many different fields, including cognitive sciences, logic, and library and information science. Something (A) is relevant to a task (T) if it increases the likelihood of accomplishing the goal (G), which is implied by T.
  • 29. Something is relevant to a task ONLY if it increases the likelihood of accomplishing the goal. Just because you have access to all of these sensors (and data) doesn't mean that you should present it all to the user. You have to start with an understanding of what the user's goals are when they're mobile. Your app was clearly created to help the user do something. Even if it's a game, there is a goal; but the goal doesn't have to be utility. The goal may be to entertain or distract... to provide some escape from a dull moment. Understanding what the user's goals are is the first step. Next, identify the tasks that the user must accomplish to reach those goals, and then determine what pieces of context will assist the user in accomplishing each task, moving them closer to their goal. Those pieces of context are the ones that are relevant. Everything else is a distraction. (One way to write this down as a simple inequality appears after the transcript.)
  • 30. Road to Relevancy in Mobile Experiences: Understanding the Mobile Mindset
  • 31. Mindset: Micro-tasking. The iPhone is a device of convenience and context, built for short dashes of activity (micro-tasks wedged in between other activities). For example, in conversation, we use it to settle pop culture disagreements by going to IMDB, or to add our lunch date to our calendar. Users want to get in and get out. This device allows us to capture lost time in grocery store lines and on subway commutes. The most popular iPhone games are casual games designed to last just a few minutes. In all of its contexts, the device's quick-draw convenience lets users make the most of downtime, whether for work, play, or creative contemplation. For makers of great apps, that means you have to identify the recurring tasks that your users will perform with your app, and then polish, polish, polish. Optimize to make it fast to get in, get it done, and get out.
  • 32. Mindset: Local. "I'm local, tell me what's happening where I am right now!" We've celebrated the personal computer since the days of disco, but it took three decades to get the first truly personal computer in the iPhone. It knows so much about you and your surroundings. Camera, microphone, GPS, motion detector, and compass, all backed with Internet know-how. But local isn't just about latitude and longitude, and it isn't just about a push pin on a map. It's also about what is right in front of you. When we think of location, we tend to think about GPS. But start thinking about the camera. What can your phone sense about what's near it through other sensors? Camera, radio, even microphones.
  • 33. Mindset: Bored. What is more valuable than helping you survive a dull-as-paste moment? For all this talk of micro-tasking and efficient analysis of our surroundings, you'd think we were all paragons of productivity. However, games own the biggest slice of downloads, and the reason is that we're all looking for something to do with lost time. Our smartphones are great for staving off boredom for the same reason they're great for micro-tasking: they're always with you, at the ready with a video game, low humor, or high literature.
  • 34. Examine context (available through sensors), determine what is relevant, and provide information that assists the user in meeting their (mobile) goals. Eliminate everything else.
  • 35. Barriers: Sight isn't seeing. Hearing isn't listening. Listening isn't understanding. All the sensors in the world are useless without a way to interpret and understand the data coming into them. If you want to read some fascinating stuff on this topic, pick up the book Incognito by David Eagleman. Specifically, read chapter 2, "The Testimony of the Senses: What Is Experience Really Like?" It will challenge the way you think about the gap between your sensory system and your experience of reality. The point is this: to create new mobile apps that continue to inspire, we need technologists, for sure. But we also need biologists, those who specialize in understanding how our bodies sense, who can then replicate and evolve that in the digital realm. We need psychologists and linguists and computer geeks coming together around fields like computational linguistics. We need anthropologists who can provide insight into how differences in cultures influence the implementation of technology solutions. I don't want anyone to leave this room thinking that in order to build a great mobile app you have to know how to code, or that you have to have a degree in physics. Bring what you have to the table because what you have is human experience, and that experience is invaluable. There's a way to plug it in, I promise. It might not be obvious at first, but you have something that will spark an idea.
  • 36. THANKS! Relevancy and Context: The Key to Great Mobile Experiences (@davidreevesATL, @22squared, #DigATL, #mobilecontext)
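
Slide 11's touch takeaways ("use content as controls," "invite touch," "touch is more than taps") map directly onto UIKit's gesture recognizers. Here is a minimal sketch, assuming an iOS app; the view controller, image name, and handler names are illustrative and not from the talk. The photo itself becomes the control: pinch to zoom, swipe right to dismiss.

    import UIKit

    final class PhotoViewController: UIViewController {
        // "nephew" is a placeholder asset name for illustration.
        private let photoView = UIImageView(image: UIImage(named: "nephew"))

        override func viewDidLoad() {
            super.viewDidLoad()
            photoView.isUserInteractionEnabled = true   // image views ignore touches by default
            photoView.frame = view.bounds
            view.addSubview(photoView)

            // Pinch to zoom: the content itself is the control, no buttons or menus.
            let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
            photoView.addGestureRecognizer(pinch)

            // Swipe right to dismiss, mirroring the swipe-to-delete reflex from slide 8.
            let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
            swipe.direction = .right
            photoView.addGestureRecognizer(swipe)
        }

        @objc private func handlePinch(_ gesture: UIPinchGestureRecognizer) {
            guard let piece = gesture.view else { return }
            // Scale the content by the pinch amount, then reset so scaling is incremental.
            piece.transform = piece.transform.scaledBy(x: gesture.scale, y: gesture.scale)
            gesture.scale = 1.0
        }

        @objc private func handleSwipe(_ gesture: UISwipeGestureRecognizer) {
            dismiss(animated: true)
        }
    }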
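
Slide 13 describes Instant Heart Rate reading your pulse from subtle color changes under the camera. Below is a simplified sketch of the estimation step only, assuming you already have a series of average red-channel brightness values sampled at a known frame rate; the zero-crossing approach and the synthetic example are illustrative, not the app's actual algorithm.

    import Foundation

    /// Estimate beats per minute from average red-channel brightness samples
    /// captured at `sampleRate` frames per second. Assumes a fingertip covers
    /// the lens, so each heartbeat shows up as a small periodic brightness change.
    func estimateBPM(brightness: [Double], sampleRate: Double) -> Double? {
        guard brightness.count > 2, sampleRate > 0 else { return nil }

        // Remove the average level so only the pulse ripple remains.
        let mean = brightness.reduce(0, +) / Double(brightness.count)
        let detrended = brightness.map { $0 - mean }

        // Count upward zero crossings; each full beat crosses zero upward once.
        var beats = 0
        for i in 1..<detrended.count where detrended[i - 1] < 0 && detrended[i] >= 0 {
            beats += 1
        }

        let seconds = Double(brightness.count) / sampleRate
        guard beats > 0 else { return nil }
        return Double(beats) / seconds * 60.0
    }

    // Example: 10 seconds of synthetic samples at 30 fps with a 1.2 Hz "pulse" (a 72 BPM signal).
    let fps = 30.0
    let samples = (0..<300).map { 0.5 + 0.01 * sin(2 * Double.pi * 1.2 * Double($0) / fps) }
    print(estimateBPM(brightness: samples, sampleRate: fps) ?? "not enough signal")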
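
Slide 16 explains Shazam as matching a short recording against "a vast catalog of audio fingerprints." The toy sketch below covers only the matching step, assuming fingerprints have already been reduced to (time, key) landmarks; the hash values and scoring are illustrative and far cruder than any production system.

    import Foundation

    // A fingerprint is a set of (time, key) landmarks extracted from audio.
    // How the keys are derived (e.g. hashes of spectral peak pairs) is out of scope here.
    struct Landmark {
        let time: Int     // frame index within the track
        let key: UInt32   // hash of local audio features at that frame
    }

    /// Score how well a short query matches one catalog track: matching keys whose
    /// time offsets agree (track time minus query time is roughly constant) are
    /// strong evidence that the query is an excerpt of that track.
    func matchScore(query: [Landmark], track: [Landmark]) -> Int {
        var timesByKey: [UInt32: [Int]] = [:]
        for landmark in track {
            timesByKey[landmark.key, default: []].append(landmark.time)
        }

        var offsetVotes: [Int: Int] = [:]
        for landmark in query {
            for trackTime in timesByKey[landmark.key] ?? [] {
                offsetVotes[trackTime - landmark.time, default: 0] += 1
            }
        }
        return offsetVotes.values.max() ?? 0
    }

    // Example: a tiny two-song "catalog" and a query clipped out of the first song.
    let songA = (0..<20).map { Landmark(time: $0, key: UInt32($0 * 7 % 13)) }
    let songB = (0..<20).map { Landmark(time: $0, key: UInt32($0 * 5 % 11)) }
    let query = songA[8..<14].map { Landmark(time: $0.time - 8, key: $0.key) }

    let scores = ["Song A": matchScore(query: query, track: songA),
                  "Song B": matchScore(query: query, track: songB)]
    print(scores.max(by: { $0.value < $1.value })?.key ?? "no match")   // prints "Song A"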
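
Slides 24 and 25 note that some of these sensors, including the accelerometer behind Sleep Cycle and CrunchFu, are exposed directly through Apple's SDK via Core Motion. A minimal sketch, assuming an iOS app; the 10 Hz sample rate and the 0.2 g movement threshold are illustrative choices, not values from the talk.

    import Foundation
    import CoreMotion

    final class MovementMonitor {
        private let motion = CMMotionManager()
        private(set) var movementEvents = 0

        /// Start sampling the accelerometer and count samples whose total
        /// acceleration deviates noticeably from 1 g (i.e. the phone was moved).
        func start() {
            guard motion.isAccelerometerAvailable else { return }
            motion.accelerometerUpdateInterval = 1.0 / 10.0   // 10 Hz is plenty for coarse movement

            motion.startAccelerometerUpdates(to: .main) { [weak self] data, error in
                guard let self = self, let a = data?.acceleration, error == nil else { return }
                // Magnitude of the acceleration vector, in g. At rest this is about 1.0.
                let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
                if abs(magnitude - 1.0) > 0.2 {               // illustrative threshold
                    self.movementEvents += 1
                }
            }
        }

        func stop() {
            motion.stopAccelerometerUpdates()
        }
    }

A Sleep Cycle-style app would bucket movementEvents into time windows over the night; a CrunchFu-style rep counter would instead look for a repeated up-down pattern in the same stream.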
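
Slides 28 and 29 define relevance operationally: a piece of context is relevant only if it increases the likelihood of accomplishing the goal. One hedged way to write that down (a reading of the slide, not something stated in the talk), with G the goal implied by task T and A the piece of context or content under consideration:

    \text{A is relevant to } T \iff P(G \mid T, A) > P(G \mid T)

Read this way, slide 29 is a filter: surface only the context A that lifts P(G | T, A) above the baseline P(G | T), and treat everything else as a distraction.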