Extended Reality (XR) Development
Augmented Typography:
Legibility and readability
Creating visual contrast
Take control
Design with purpose
Color for XR:
Color appearance models
Light interactions
Dynamic adaptation
Reflection
Sound Design:
Hearing what you see
Spatial sound
Augmented audio
Voice experiences
Power of sound.
Extended Reality (XR) Development in Immersive Design
1. P. GOWSIKRAJA, M.E., (Ph.D.)
Assistant Professor
Department of Computer Science and Design
UNIT V - Extended Reality (XR) Development
KONGU ENGINEERING COLLEGE (AUTONOMOUS)
DEPARTMENT OF COMPUTER SCIENCE AND DESIGN
20CDH01 - HONOR DEGREE - IMMERSIVE DESIGN THEORY
2. UNIT V - Extended Reality (XR) Development
1. Augmented Typography:
Legibility and readability
Creating visual contrast
Take control
Design with purpose
2. Color for XR:
Color appearance models
Light interactions
Dynamic adaptation
Reflection
3. Sound Design:
Hearing what you see
Spatial sound
Augmented audio
Voice experiences
Power of sound.
3. Augmented Typography
This section explores how to optimize your type in augmented experiences.
Though many of the topics discussed also apply to virtual reality, the
emphasis is on best practices for AR, because AR gives the designer less
control over the environment.
UNDERSTAND LEGIBILITY AND READABILITY
CREATE VISUAL CONTRAST
TAKE CONTROL
5. Legibility and readability
Evolution
Back to the basics
Legibility
KEEP IT SIMPLE
THINK BIG
CONSIDER X-HEIGHT
STAY SANS DETAIL
Type made for XR
ARone
Type is meant to be read
Readability
GIVE IT SPACE
ARone Halo
SAY MORE WITH LESS
MAKE A CASE
LIMIT LINE LENGTH
WEIGH IN
KEEP IT FLAT
6. Displays covered in type are all around us, from gas station signage to
airport information boards to mobile applications in the palm of our hands.
With each new screen and each new context come adjustments that designers
need to notice and make.
Type on displays: presenting text that will appear on a variety of screens
creates design challenges.
Evolution: throughout the evolution of screens, from CRT displays onward,
typefaces have been designed to improve the user experience.
Screens have grown bigger, brighter, and more lightweight.
10. Back to the basics:-
Designing typography for ease of reading within XR involves similar
considerations as designing for a screen.
Typographical marks include:
● Letters
● Numbers
● Punctuation
● Dingbats/symbols
12. Legibility: As we learned from designing for screens, simple is better.
Typefaces that are made from simple shapes translate better onto lower
resolution displays, such as screens.
Legibility: How easily distinguishable one letter is from another within a
typeface.
KEEP IT SIMPLE. Typefaces created from simple shapes work better than
overly styled type.
Geometric type. Look for letterforms
that are created with basic geometric
shapes, right angles, and horizontal
finishing strokes.
GEOMETRY
13. THINK BIG. In print you can have body
copy ranging from 8 to 12 points (in
print design we measure the size or
height of type using points). That is too
small for pixel-based type; 14 to 16
pixels or larger is the optimal size.
CONSIDER X-HEIGHT. The height of
the lowercase letters is called the x-height.
Not all typefaces have the same
x-height.
14. STAY SANS DETAIL. Some typefaces were created
specifically for screens, so these are good places to start.
Helvetica, Verdana, and Georgia are some classics,
but this list continues to grow thanks to the availability
of the Web Open Font Format and fonts being designed
for both print and web formats.
ARone Halo
SERIF
17. Type is meant to be read:
Selecting a typeface that is legible to users means that they can easily
distinguish the characters from one another.
Readability The spacing and arrangement of characters and words in order
to make the content flow together to aid reading it.
Many of these remain connected to the foundations of typography, but just
need some optimization for XR. Keep these guidelines in mind:
GIVE IT SPACE:-
Increasing your overall tracking, the space between two or more characters,
will help with readability.
With many of the displays you often see a bit of a halo effect around the text,
so by tracking out your type you can avoid the overlap of the halos and the
letters themselves.
18. An ARone typeface that demonstrates how unusual shapes in letter forms
produce better results in the rendering from AR headsets.
SAY MORE WITH LESS. Reducing the amount of copy, especially
paragraphs of type, is a better practice.
You can complement this with tool tips, explainer type, closed
captioning, and an audio track.
MAKE A CASE. Select a case that works for the content.
Uppercase is hard to read in large amounts, but can add hierarchy for
headers or shorter phrases you want to stand out.
19. LIMIT LINE LENGTH To reduce eye strain, keep the length of your lines of
type to 50 to 60 characters per line. We lose our place when our eyes
have to jump back to the beginning of the next line if it’s too long.
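The 50-to-60-character guideline can be checked programmatically. Here is a minimal sketch using Python's standard textwrap module (the sample copy is illustrative only):

```python
import textwrap

copy = ("To reduce eye strain, keep each line of body text "
        "to roughly 50 to 60 characters so the reader's eye "
        "can easily find the start of the next line.")

# Wrap at 60 characters, the upper end of the guideline.
lines = textwrap.wrap(copy, width=60)
for line in lines:
    print(f"{len(line):2d} | {line}")
```

Every emitted line stays at or under the 60-character limit, so the same check can be run on any copy before it goes into an experience.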
WEIGH IN. Varying your weights of type is a great way to add hierarchy to
your designs and can help guide the user’s eye through the page.
Just watch for the extreme weights.
Light and Extra Bold weights are much less legible than regular, medium,
or bold weights.
20. KEEP IT FLAT. 2D type is easier to read than 3D type. Type that is
extruded and volumetric becomes much harder to read.
It makes sense if you consider we aren’t as used to reading type in 3D; most
of our reading is two dimensional.
Logotypes are an exception as they can work as a 3D element in an
experience.
24. Creating visual contrast
Viewing Distance- display and text
type
Spatial Zones
UI zone
Focal zone
Environmental zone
Different spatial zones
IMMERSIVE TYPE
UI TYPE
ANCHORED TYPE
RESPONSIVE TYPE
25. Creating visual contrast:-
Designers are used to considering
reading distance when designing.
A poster or billboard is expected to
be viewed from a further distance
than a brochure or postcard.
There are different design
considerations as a result of the
distance between the user and the
design element.
26. Viewing Distance. The viewing distance from text to our eyes changes
based on the medium.
Within an XR experience there may be type that is:
● Placed within the 3D space
● Static (such as any type that is part of the UI)
● Anchored within the environment
● Responsive
In print media, when choosing from the wide range of type options, you can
start by selecting from two main text types: display and text type.
(There may be other places where type is used, such as for a URL or caption
information, which will also be relatively small.)
27. Display type Type found in large
headings and titles; typically 16+ points.
Text type Type found in paragraphs and
meant for longer reading; typically 8 to 12
points and sometimes called body text.
Spatial Zones. Showing the three main
spatial zones in relationship to the user’s
display.
● UI zone
● Focal zone
● Environmental zone
28. UI ZONE The closest text to the user is within this space.
This type is anchored to the camera position on a mobile device or HMD, making
this information constant in placement and view.
FOCAL ZONE The next zone moving farther away from the user is the focal zone.
This is an optimal placement for some of the main part of the experience,
including any essential type.
This is the ideal reading distance for essential type within the experience. This
space is within 3 to 16 feet from the user.
ENVIRONMENTAL ZONE The space that reaches farther beyond this scope is
the environmental zone. It can be used for positioning, landmarks, and to add
any additional environmental context within the experience.
Because this is farther away from the user, it is intended to provide directional
cues for the user, showing them places that they can explore within the
experience, or to provide helpful context to what they are experiencing up close.
30. Keep your type in the center zone of what you are designing to avoid pixels
blurring at the edge of your peripheral sight. Here are the optimal degrees
to remember:
Field of view: 94°
Head turn limit: 154°
Maximum viewing at one time: 204°
With a 3D experience, there are important type design considerations for each
kind of type relative to the different spatial zones.
Immersive type
UI type
Anchored type
Responsive type
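As a rough sketch, the degree limits above can be turned into a simple placement check. The helper below is hypothetical (and assumes each limit is a symmetric half-angle around the center of the user's gaze):

```python
# Comfort limits from the slide, in degrees (assumed symmetric
# left/right around the center of the user's gaze).
FIELD_OF_VIEW = 94.0   # comfortably visible without moving
HEAD_TURN = 154.0      # reachable with a head turn
MAX_VIEWING = 204.0    # the most that can be viewed at one time

def placement_zone(angle_from_center: float) -> str:
    """Classify a horizontal angle (degrees from gaze center)."""
    offset = abs(angle_from_center)
    if offset <= FIELD_OF_VIEW / 2:
        return "field of view"     # safe for essential type
    if offset <= HEAD_TURN / 2:
        return "head turn"         # needs a deliberate head turn
    if offset <= MAX_VIEWING / 2:
        return "peripheral limit"  # directional cues only
    return "out of view"

print(placement_zone(30))   # field of view
print(placement_zone(70))   # head turn
print(placement_zone(100))  # peripheral limit
```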
31. IMMERSIVE TYPE This type needs to act like a 3D object, but will most
likely be a flat 2D element (for readability).
● This type is integrated into the 3D environment. As such, it should
match the perspective of the planes where it is placed.
● If you want the type to feel integrated into a space, then it needs to look
believable by following the same perspective.
● This dynamic type will rely on spatial computing to map out the space
in advance of the experience or having the user select a vertical or
horizontal plane where the type will be placed.
32. UI TYPE This type remains static in the experience.
This should be 2D and remain in one place on the screen, such as the
navigation bar or on the top and bottom of the screen.
This text is critical for the user experience and often provides identifying
information, such as the name of the app or experience.
The type can serve as a menu allowing the user to see what other options
are available at any given point.
UI type must be easy to find, easy to see, and easy to use, because it plays
an essential role in the approachability of the experience for a user.
33. ANCHORED TYPE This type is connected to a
specific plane or object within the environment.
As the user moves around the environment, the type
will remain in the same spot as the object to which it
is anchored.
Anchored type stays pinned to one specific location
or object to identify it, like the business labels in
this AR navigation app prototype.
For example, in a navigation experience, the tags
pinned to the surrounding businesses and
landmarks help the user identify them.
These visual tags are anchored to the physical
location.
So, as the user explores, they will always see the
correct name for each building.
34. RESPONSIVE TYPE Just as websites have to create responsive layouts and
size ratios for the desktop displays, tablets, and mobile devices, that
same concept applies in XR environments.
Currently, type in HMDs uses pixel or bitmap type, instead of vector or
outline type which would allow it to be scalable.
With the dynamic needs of the content and type used in an augmented
environment, the design can be seen from far away and also super close,
even inside it and all around it.
This means that the type needs to be crisp and clear in both near and far
viewing distances.
35. Just as in CSS we use the em unit of measurement to scale the type in
relation to the width of the screen, there is a benefit for a similar system
within AR.
Based on user movement and the viewing angle, this approach allows type to
automatically adjust for optimal readability.
Black-on-white text is not as effective across all devices because you cannot
reproduce pure black in a transparent or see-through display, which is
used for many AR/MR experiences.
Without the pure black there may not be enough contrast between the type
and the background for readability.
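The em-like scaling idea can be sketched in a few lines, assuming type should scale linearly with distance to hold a constant apparent (angular) size; the function name and numbers are illustrative, not from a specific engine:

```python
def scale_for_distance(base_size: float, base_distance: float,
                       current_distance: float) -> float:
    """Scale type so it keeps the same apparent (angular) size.

    Apparent size is roughly proportional to physical size divided
    by distance, so scaling linearly with distance keeps the type
    equally readable near and far.
    """
    return base_size * (current_distance / base_distance)

# Type designed to be 0.05 m tall at 2 m would be 0.2 m tall at 8 m.
print(scale_for_distance(0.05, 2.0, 8.0))
```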
38. Take control
Where type is used? Tags
How type is used
How to view type? perspective distortion.
Customization
Minimize
Design with purpose
DESIGN CHALLENGE
MAKE AN AUGMENTED EYE CHART
39. Take control
You are probably aware by this point that there are a lot of uncontrollable
components to working within augmented and mixed realities.
One of the most exciting aspects about these technologies is that you can
use them in varying environments and scenarios.
Where type is used
One way to achieve more control is to consistently place your type in the
same location within the experience, from the entry point until the end.
After the user sees type repeatedly show up in the same place multiple
times, they will start to look to that place for the information when they
need it.
40. This can apply to the UI type that helps users
figure out how to navigate through an
experience, but it can also relate to the
immersive type that is part of the 3D space.
For example, in tagAR the name tags always
appear above people’s heads.
After you see this happen two or three times,
you understand that is where the digital
augmented object appears, and then you will
look for it in that same location each time after
that.
41. Tags. An augmented name tag from the mobile application tagAR.
These tags always appear directly above each person’s head, making it easier
to see their name and make eye contact at the same time.
In a different example, cars with projected GPS directions appearing in the
road in front of them use this approach to take advantage of constants
within the driving experience.
How type is used
To allow the users to start associating a specific style with a specific
function, give a role to each of the type styles within the experience.
You can use headers to identify important information, for instance, and
body copy to provide tool tips or provide instructions within the experience.
42. It does take time to initially set up the styling for each of the needed styles,
such as:
Main header (h1)
Secondary header (h2)
Additional headers (max 6)
Body type (p)
Adding these categories of type to your experience will make it easier to
navigate and find content.
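Those categories can be planned ahead of time with a modular type scale, where each heading level multiplies the body size by a fixed ratio. The base size, ratio, and helper below are hypothetical:

```python
BASE_SIZE = 16.0  # body copy (p) in pixels, matching the 14-16px guideline
RATIO = 1.25      # a "major third" scale; any ratio above 1 works

def heading_size(level: int, max_level: int = 6) -> float:
    """Pixel size for heading level 1..6; h1 is the largest."""
    if not 1 <= level <= max_level:
        raise ValueError("heading levels run from 1 to 6")
    return round(BASE_SIZE * RATIO ** (max_level + 1 - level), 1)

# Build the full style set: p plus h1 through h6.
styles = {"p": BASE_SIZE}
styles.update({f"h{n}": heading_size(n) for n in range(1, 7)})
print(styles)
```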
43. How to view type
Because people can move through and around an AR experience, a world of
possibilities opens for how they can view any given element, including type.
Unlike a 3D object, however, type needs to be viewed from the correct angle
and perspective for it to be readable.
The way to control this viewing angle of text in 3D space is to have it always
face the user.
The positioning and orientation are relative to the user and their gaze. This
added control ensures that people will view the type without any
perspective distortion.
When users view text in 3D space from extreme angles, the type can get bent
and misshapen
44. Perspective Distortion. As type gets
warped to fit into a 3D scene, the use of
extreme perspectives makes the type
more distorted, reducing the readability
of the message.
Perspective distortion A warping of the
appearance of an object or image often
caused by viewing it from an extreme
angle or how it is placed into a 3D scene.
45. Customization
Knowing that people will each have a different experience based on their
physical location and environment, you can design for this.
Using your user research to identify the most common places people interact
with the experience, you can create different experiences for each.
When a user first launches the experience, they would have to provide
information about their physical environment.
Reduce the effort for users (and yourself) by providing a list they can choose
from; not only does this make providing the information easier for them, it
also is easier to design for.
46. Their answers, which could be as simple as selecting indoors or outdoors,
would activate different features or designs based on their choice.
Lighting is typically brighter outside than inside, for example, so you could
alter the design of your type, and other elements, according to their
selection.
47. Minimize
What information is essential to include in the copy?
It is important to look through all the wording you are including in an
experience and be as efficient as possible.
Reading large amounts of type in XR is not optimal, so you want to narrow
in on just what is needed and avoid anything that is not needed for the
experience itself.
You can also explore whether there are other ways to express the information
instead of in type form; type may not always be the best solution to
communicate an idea or action quickly.
48. Using simple icons, arrows, illustrations, photographs, videos, or even a
combination of these, like a data visualization or an infographic, could help
reduce the amount of type needed by communicating the same information
in a visual way.
For this to work well, you have to put in some work to narrow down what
type is needed and how best to use each word effectively.
When working with mobile AR especially, screen space is premium real
estate.
You want to reserve as much space as you can for people to see and interact
with the AR experience. Make use of user interactions or UI elements to
reveal more information.
49. Design with purpose
The key takeaway from this chapter is efficiency.
There are many challenges in displaying type in AR—everything from
working with lower resolution screens to choosing the best typeface to be
viewed up close and far away and everywhere in between.
These challenges reveal the need for efficiency of all the type you include
within an experience.
As you go through your full user journey, check to make sure the type holds
purpose everywhere you add it.
50. DESIGN CHALLENGE
MAKE AN AUGMENTED EYE CHART
The goal of this challenge is to help test your typographic design choices at
varying reading distances in augmented reality.
1. Based on some of the suggestions in this chapter, select three
typefaces and font weights that you think will be legible in AR.
2. Using Adobe Illustrator or Photoshop, design an eye chart with
different letters in each row. Use these letters in order and add line
breaks as shown in the figure.
EFPTOZLPEDPECFDEDFCZP
3. As you go down each line, reduce the point size as shown.
4. Save this file as a JPG.
5. Launch Adobe Dimension. From the basic shape library select a plane.
51. 6. Using the widget tool on the plane, rotate the
plane on the z-axis (blue) to lift the plane up
vertically, as you would expect to see a traditional
eye chart. Then position the plane on the x-axis
(magenta) to lift it up off the ground.
7. Now you need to add your eye chart to the plane.
To do this, move to the right side of the screen and
select your Plane layer in the Scene panel. Click the
arrow on this layer to view your customization
properties. Find the Properties panel, and double-
click the base color. Toggle from selecting a color to
selecting an image. Here you can upload the JPG
you saved earlier.
52. 8. Adjust the positioning of your plane as needed to make sure you can view
the letters correctly.
9. Now, the fun really begins. You are going to share this to Adobe Aero.
While still in Adobe Dimension, choose File > Export > Selected for Aero.
Choose Export from the pop-up window, and then save the file in your
Creative Cloud Files folder. This should be the default folder that comes up,
but if you don’t see it, you can find it in your user files.
10. Using a mobile device or iPad, launch the Adobe Aero application. When
it prompts you to choose an image, choose your eye chart from your Creative
Cloud files. Place this on a plane so you can start testing.
53. 11. Make sure you are clear to
move around the image. View the
type up close, and step back away
from it. How is the readability
affected? Take notes.
12. Based on your findings, choose
a different typeface and repeat the
process to help identify which
typefaces to try out in your next
AR project.
56. 2. Color for XR: Color appearance models
Color space
Additive
Subtractive
Linear versus gamma color space
Usability
●Legibility and readability
●Contrast
●Vibrancy
●Comfort
●Transparency
57. 2. Color for XR: Color appearance models
● Color is a personal and dynamic relationship.
● Color creates an emotional impact, as we carry cultural meanings to
the hues surrounding our society.
● “Not all reds are the same. Some are more intense,
some more passionate, some more full of life, and
some more cautionary”.
● The term color space is used to describe the capabilities of a display or
printer to reproduce color information.
58. Example, you will want to make sure that you match
the color space used with the medium (print or
digital).
In a closely related concept, software often allows you
to set the color mode.
Color space A specific organization of colors that
determines the color profile that is used to support
the reproduction of accurate color information on a
device. RGB and CMYK are two common examples.
In traditional print design, ensuring the accurate
creation of color is so important to brand identities
and marketing that the Pantone Matching System was created.
60. Additive color (RGB) versus subtractive color (CMYK):
Additive color/RGB:
● Red, Green, and Blue each take a color value between 0 and 255,
creating over 16 million color combinations.
● The 8-bit sRGB color format is the preferred input for images on many
XR devices.
● Each color has a specific value in the HSB or HSL format, which
provides a numeric value for the hue, saturation, and brightness (or
lightness) of the color.
Subtractive color/CMYK:
● The common color profile for this is called CMYK (cyan, magenta,
yellow, and key black), a mixing base.
● The term key is a direct reference to the key plate used in the
four-color printing process.
● These four colors can produce over 16,000 different color
combinations.
● CMYK is used to create elements such as image targets: a camera
scans a printed image and then applies augmented content to it.
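The additive arithmetic above is easy to verify in Python: 256 values per channel across three channels gives the "over 16 million" combinations, and the standard colorsys module expresses the same color numerically in HSL form:

```python
import colorsys

# Three 8-bit channels: 256 * 256 * 256 possible combinations.
print(256 ** 3)  # 16777216

# The same color expressed as hue, saturation, and lightness.
# colorsys works on 0-1 floats, so normalize the 8-bit values.
r, g, b = 255, 128, 0  # a vivid orange
h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
print(f"hue {h * 360:.0f} deg, saturation {s:.0%}, lightness {l:.0%}")
```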
61. Linear and Gamma
color space:
The difference between
increasing the shading
incrementally in the
linear color space
versus using the
gamma correction,
which is nonlinear.
62. Linear versus gamma color space:
The human eye is more sensitive to differences in dark tones, so when
creating digital images there is a need for more accuracy and variety
in the dark tones.
To accommodate this sensitivity in the way the brain perceives shades,
gamma correction, also referred to as tone mapping, was created.
Once an image or graphic has been gamma corrected, it should, in theory,
be displayed “correctly” for the human eye.
To replicate light in a way that is mathematically correct, the linear color
space was created to match our physical space.
63. Linear color space: Numeric color intensity values that are
mathematically proportionate.
Gamma correction: A process that increases the contrast of an image in
a nonlinear way to adjust for the human eye’s perception and the way
displays function.
Many XR and game designers prefer to use the linear color space to give
their work that realistic feel. This has also become a standard within
software focused on immersive experiences, such as Unity and Unreal
Engine.
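The gamma correction just defined can be sketched with the simple power-law approximation (gamma = 2.2, close to, but not exactly, the sRGB transfer function, which adds a short linear segment near black):

```python
GAMMA = 2.2  # common display approximation

def linear_to_gamma(v: float) -> float:
    """Encode a linear light intensity (0-1) for display."""
    return v ** (1 / GAMMA)

def gamma_to_linear(v: float) -> float:
    """Decode a gamma-encoded value (0-1) back to linear light."""
    return v ** GAMMA

# A mid-gray of 0.5 in linear light is stored noticeably brighter
# after encoding, which is what frees up precision in the darks.
encoded = linear_to_gamma(0.5)
print(round(encoded, 3))                   # 0.73
print(round(gamma_to_linear(encoded), 3))  # round-trips back to 0.5
```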
64. Tint The increased lightness of color by the addition of white.
Shade The increased darkness of a color by the addition of black.
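The tint and shade definitions translate directly into simple RGB mixes toward white or black; the helpers below are hypothetical illustrations:

```python
def _mix(color, target, amount):
    """Mix an RGB color toward a target by `amount` (0.0-1.0)."""
    return tuple(round(c + (t - c) * amount) for c, t in zip(color, target))

def tint(color, amount):
    """Lighten a color by mixing it with white."""
    return _mix(color, (255, 255, 255), amount)

def shade(color, amount):
    """Darken a color by mixing it with black."""
    return _mix(color, (0, 0, 0), amount)

red = (200, 30, 30)
print(tint(red, 0.5))   # (228, 142, 142), a pink
print(shade(red, 0.5))  # (100, 15, 15), a dark red
```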
Some HMDs will support only linear color, while others support only gamma.
Some will allow a combination: linear color with some gamma
corrections.
Usability:-
Selecting colors that will make the experience usable.
● Legibility and readability
● Contrast
● Vibrancy
● Comfort
● Transparency
65. Legibility and readability
● Legibility and readability refer not only to the color of type (text), but
also to the color of the elements surrounding the text.
● To ensure that type is easily read, use a shape as a color
background that helps separate the letters from the environmental
background.
● White is the most common color for text and icons in XR.
● Red text on a black background is hard to read because they are both
dark.
● Select colors that have varying shades, so you don’t have a dark color on
a dark color; instead, you want light on dark or dark on light.
67. Contrast
● When you have two colors that are close in shade or even saturation,
they will start to vibrate off one another.
● To avoid this effect, select colors that have visual contrast.
● Visual contrast means opposite qualities such as light and dark or
saturated and desaturated.
Color Vibration. Colors that are close in tonal range start to vibrate when
placed in close proximity.
● Contrast is essential for keeping your experience accessible.
● Making sure your color choices have solid contrast will make the
experience usable for a greater number of users.
● This approach is more likely to suit a user’s unique needs, even if those
needs change based on their environment.
69. Vibrancy
● A color at its purest form is called chroma: the color fully saturated,
without the addition of gray. These pure colors are bright and vibrant.
● Vibrancy increases the brightness of the desaturated tones.
● Vibrancy: the energy of a color caused by increasing or decreasing the
saturation of the least saturated tones.
● Vibrancy can also change the energy of the color and, as a result, the overall
experience.
● Bright oranges and reds will grab your attention over desaturated greens or grays.
70. Comfort
To create a positive user experience, you want the user to be comfortable.
If the colors you select are too intense or create too much strain, then this
will cause discomfort.
If a user is met with too much discomfort, they will likely leave the experience
to find a different one that is more comfortable.
Larger areas of color in XR, especially vibrant and fully saturated colors, will
be hard on the eyes. So, use these brighter colors sparingly to attract
attention, but don’t use them in large quantities.
72. Comfort:-
Have users test the experience, and even test with different color combinations
to see what works best for most people.
Because the colors will change in appearance between the computer you create
them on and the actual device that plays the XR experience, it is important to
test your designs.
View the colors in context, and then make adjustments to improve the ease
of use.
73. Transparency:- Color will be displayed differently based on the kind of
display you use.
An optical see-through (OST) display, such as the Microsoft HoloLens 2,
AR glasses, or smart glasses will show all elements as more transparent,
due to the nature of the technology.
Video see-through (VST) displays, such as mobile AR experiences that use
the camera to view the physical world, have different considerations.
Because any graphics or objects will be applied directly on top of the
camera view in a VST-based experience, they can be displayed fully opaque.
74. If you vary the amount of transparency in a 3D model or
object, then you can reserve opaque colors for UI
elements so that they stand out on the display.
Making the UI easy to see and interact with is a
high priority.
The perception of color is directly connected to
the light in the scene.
To ensure that users see the colors that you
select for the design, you need to design the
lighting as well.
77. Light interactions:
●Type of light
● POINT LIGHT
● SPOT LIGHT
● AREA LIGHT
● DIRECTIONAL OR PARALLEL LIGHT
● AMBIENT LIGHT
●Color of light- Light Temperatures
●Lighting setup
● Soft lighting
● One-point lighting
● Three-point lighting
● Sunlight
● Backlight
● Environmental
●Direction and distance of light:- Falloff, Feathering
●Intensity of light:- 100% (the highest brightness)
●Shadows
78. Adjusting light in a scene or onto an object does not just mean simply
brightening or darkening;
believable immersion relies on the use of light and its accompanying
shadow.
With the exception of some stylistic deviations, you will want your
lighting to mimic the real world.
It makes sense then to be inspired by light from your physical space.
Type of light: Think about lighting design as you would think about
determining the colors of a composition: Identify the key areas that you
would like to have the most attention.
The brightest and most vibrant colors will attract attention first.
79. POINT LIGHT A point light will emit light in
all directions from a single point.
This light has a specific location and
shines light equally in all directions,
regardless of orientation or rotation.
Examples are lightbulbs and candles.
SPOT LIGHT A spot light works just like a
spotlight used in stage design.
It emits light in a single direction, and you
can move the direction of the light as needed.
Example is a stage spot light for a soloist.
80. AREA LIGHT This light source is confined within
a single object, often in a geometric shape such
as a rectangle or sphere shape. Examples are a
rectangular fluorescent light and a softbox
light.
DIRECTIONAL OR PARALLEL LIGHT Parallel
rays that mimic the sun; these lights are
treated as infinitely distant, just like sunlight.
This means that the position of these lights doesn’t matter,
only their direction and brightness. An obvious
example is sunlight.
81. AMBIENT LIGHT Ambient light applies to the full scene. You cannot
choose a specific location for this light, and it will change the overall
brightness of the scene. Example: natural, indirect light from a window.
Color of light
If you have ever gone lightbulb shopping or bought Christmas lights, then
you’ve seen how many different colors of light there are.
Even if you want just “plain white” light, you are greeted with a multitude
of options. The reason is that no light is pure white.
Light is made up of three colors: red, green, and blue. Mixing these colors
in different proportions alters the color of the light we see, thanks to the
additive property we discussed earlier.
Light has a color temperature;
82. it can be warm or cool depending on the proportional mix of colors.
2700K is a warmer, yellower white;
7000K is a cooler, bluer white; daylight is 6400K.
Light Temperatures. The temperatures of various kinds of lights using the
Kelvin scale for measurement.
83. Lighting setup: It is quite common to use more than one light in your scene,
just as you would in the real world. You can have window light and a table lamp
in the same space.
When you add additional lights to the scene, you need to control the
relationship between the lights.
Soft lighting:Soft lighting is the best choice if you need to add evenly
distributed lighting to your scene.
The name actually refers to the soft quality of shadows in the scene, making
the overall contrast feel balanced and calm.
This kind of lighting is frequently used for
portrait photography.
84. Soft Light. One soft light provides
equal light across the 3D sphere.
One-point lighting: The one-point
lighting technique uses a single light
and, as a result, will create a dynamic
mood.
It also creates harsher shadows where
the light is not illuminating the
object.
One-point light hits the 3D sphere
making the light and shadows more
dramatic.
85. Three-point lighting
The three-point lighting technique
uses three lights—key, rim, and
fill—each of which has a specific role
in the overall lighting setup.
Three lights are set up around the 3D
sphere to demonstrate the positions
of the rim light (backlight), key
light, and fill light.
86. Three-Point Light:
Three lights are set up around the 3D sphere to demonstrate the positions of
the rim light (backlight), key light, and fill light.
● Key light illuminates the focal point of the scene or object and is the
primary light in the scene.
● Rim light illuminates the back of your subject, separating it from the
background and adding depth.
● Fill light fills in more light in the scene to reduce or eliminate harsh
shadows and even out the overall lighting.
87. Sunlight
In the sunlight approach there is a single
light source: the sun.
If you are looking to replicate an outdoor
scene, then you should use direct sunlight as
your lighting.
Unlike in the real world, however, you easily
can move the direction of the sun in a 3D
scene to mimic the type of sunlight you
prefer: sunrise, high noon, sunset, or
something in between.
88. Backlight
A primary light source behind your
object is a backlight.
This technique is not as commonly
used, but it can add mystery and
drama to the scene as needed.
This lighting also can cause harsh
shadows and a lot of contrast
between the light and the object,
often creating a silhouette and
reducing the number of details seen.
89. Environmental
The environmental lighting approach pulls lighting from an image that is
imported into the program.
This works best when using high-dynamic-range imagery (HDRI) for which
the luminosity data of the image, specifically the darkest and lightest
tones, are captured at a larger range.
This basically means that more lighting data is stored within the image file (it
is a 32-bit image, versus the standard 8-bit).
These images can be used to replicate the lighting in the image in the 3D
scene.
Using environmental lighting is a fast way to generate a custom and
believable lighting setup.
90. Environmental. The light was created to mimic
the lighting from the background image and
replicated on the 3D sphere.
Direction and distance of light
The relationship between the light and the shadow provides a lot of
information, and you can control the look and feel of that transition.
As the light weakens, so too will the shadow. This weakening of a light
along its outer edge is called falloff.
The falloff has a radius and a distance, and you can control both.
Lights with a smooth falloff have a high radius and a large distance,
which will show a gradient blur that slowly goes from light to dark.
91. Falloff :
● The visual relationship of shadow and light as illumination decreases
while becoming more distant from the light source.
● The edge of the light can be controlled through edge or cone feathering
to soften the line between the light and the shadow.
● This is how you can edit and control the edge itself.
● This option is often available for any lighting that is a cone shape, such
as a spot light.
Feathering :-
The smoothing, softening, or blurring of an edge in computer
graphics.
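Falloff and feathering can be modeled as a smooth ramp from full intensity inside the falloff radius down to zero at the falloff distance. The formula below is a common graphics idiom (smoothstep), used here as a sketch rather than the exact curve any particular 3D package applies:

```python
def smoothstep(edge0, edge1, x):
    """Hermite interpolation: 0 below edge0, 1 above edge1, smooth in between."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def light_intensity(distance, radius, falloff_distance):
    """Full intensity inside `radius`; feathered falloff out to `falloff_distance`."""
    return 1.0 - smoothstep(radius, falloff_distance, distance)
```

A narrow gap between `radius` and `falloff_distance` gives a hard shadow edge; a wide gap gives the gradient blur described above.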
92. Intensity of light
Once you have identified the kinds of lights, their positions, their roles in
the scene, and their color properties, the next step is to determine how
bright the light should be. This is the intensity.
The default is 100% (the highest brightness), but this amount can be
edited to make the light dimmer. The strength of the light can also be
called energy.
Shadows
Wherever there is light, there must be an accompanying shadow, either
where the light falls off or where the light is blocked by another object.
93. Without a shadow, the light will not be perceived as real
and won’t be believable.
Shadows also play a big part in our ability to perceive
where an object is in space. Seeing a shadow far away
from an object tells us that the object is suspended in the
air or not near the plane.
A shadow that connects to the bottom of the object
tells us that the object is sitting directly on the plane.
For example, natural sunlight casts stronger shadows than
artificial light.
94. The terms soft light and hard light actually
reference the characteristic of the shadows the
types of light create.
Soft lighting provides a more even light across
all of the subject and, in turn, creates soft
shadows with a fuzzy edge.
Hard lighting provides more dramatic lighting on
an object, creating sharp edges on shadows.
Shadows. 3D rendering highlighting where the main light source is (the
setting sun) and how the light falls off into increasing shadow inside the
cave. The farther away from the sunlight, the darker the shadows become.
95. Light interactions:
●Type of light
● POINT LIGHT
● SPOT LIGHT
● AREA LIGHT
● DIRECTIONAL OR PARALLEL LIGHT
● AMBIENT LIGHT
●Color of light- Light Temperatures
●Lighting setup
● Soft lighting
● One-point lighting
● Three-point lighting
● Sunlight
● Backlight
● Environmental
●Direction and distance of light:- Falloff, Feathering
●Intensity of light:- 100% (the highest brightness)
●Shadows
96. UNIT V- Extended Reality(XR) Development
1. Augmented Typography:
Legibility and readability
Creating visual contrast
Take control
Design with purpose
2. Color for XR:
Color appearance models
Light interactions
Dynamic adaptation
Reflection
3. Sound Design:
Hearing what you see
Spatial sound
Augmented audio
Voice experiences
Power of sound.
97. Dynamic adaptation
Lighting estimation
● Brightness
● Light color
● Color correction values
● Main light direction
● Ambient intensity
● Ambient occlusion
Environmental reflections
●Diffusion
●Roughness
●Metalness
98. Dynamic adaptation:
The idea of the copycat: you learn and adapt to new
interactions by imitating what someone else is doing, learning as you go
along. This simple concept can be applied at a larger scale when we look at
imitation in AR.
With dynamic backgrounds and environments, the light and the
properties of the light will constantly change. Just as a child sees a hand
movement and repeats the action on their own, so too can software
such as Google’s ARCore and Apple’s ARKit framework evaluate
environmental light and repeat it as digital light. The basic method used is
called lighting estimation.
99. Using sensors, cameras, and algorithms, the computer creates a
picture of the lighting found within a user’s physical space and then
generates similar lighting and shadows for digital objects added to the
space.
To be effective and realistic, this analysis should be continual
throughout the experience so it can adapt to changes in the lighting and
within the environment.
This is a key attribute in the ARCore and ARKit frameworks.
Lighting estimation A process that uses sensors, cameras, machine
learning, and mathematics to provide data dynamically on lighting
properties within a scene.
100. Lighting estimation
When using this lighting estimation method, the computer and AR
development framework work together to analyze the:
● Brightness
● Light color
● Color correction values
● Main light direction
● Ambient intensity
● Ambient occlusion
101. Brightness:
For each pixel on the display, the average lighting intensity can be
calculated and then applied to all digital objects. This is called pixel
intensity, and it adjusts the overall brightness based on the average of
the overall available light in the environment.
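The averaging step can be sketched directly. The tiny grayscale grid below is a stand-in for one camera frame, and the 0-to-1 intensity scale is an assumption for illustration:

```python
def pixel_intensity(frame):
    """Average the per-pixel brightness of a grayscale frame (values in 0..1)."""
    pixels = [value for row in frame for value in row]
    return sum(pixels) / len(pixels)

# A tiny 2x2 stand-in for one camera frame.
frame = [
    [0.2, 0.4],
    [0.6, 0.8],
]
```

A real AR framework performs this kind of analysis continually, frame by frame, so the digital lighting tracks changes in the room.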
Light color and color correction
The white balance can be detected and checked dynamically to allow for color
correction of any digital objects within the scene to react to the color of the
light.
Enhancing the color balance allows changes to occur smoothly and
naturally instead of as abrupt adjustments, preserving the illusion of realism.
When luminance properties are applied to your 3D model, it will still maintain
its color properties, but it will also receive the color correction from the light
estimation scan.
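One common way frameworks hand back color correction is as per-channel gain values; applying them to a material's base color is then a per-channel multiply. The gains below are made-up example numbers for a warm light, not values from any real scan:

```python
def color_correct(base_rgb, correction_rgb):
    """Scale each channel of a base color by an estimated correction gain,
    clamping to the displayable 0..1 range."""
    return tuple(min(1.0, b * c) for b, c in zip(base_rgb, correction_rgb))

# Warm evening light: boost red, cut blue (illustrative gains).
warm_light = (1.2, 1.0, 0.8)
```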
102. Main light direction:By identifying the
main directional light, the software ensures
that digital objects added to the scene will
have shadows cast in the same direction
as other objects around them.
It also enables specular highlights and
reflections to be correctly positioned on the
object to match the environment.
You want to make sure that all the
shadows and highlights follow
consistently from the single directional
light.
103. Having this consistent direction of light may seem minor, but it is
something that the brain sees and perceives without us even realizing.
The same applies to the intensity of the light and the falloff of those
shadows. You don’t want the intensity of the light to feel too bright for the
scene, or the reverse, where the light feels too dark to match the scene.
Light Direction:
3D rendering showing a prominent main light source that can be seen
as it enters through the window opening.
The position of this light source leaves the interior of the scene in
shadow.
104. Ambient intensity: Multiple lights
can work together to create a full lighting
setup. As an important part of the light
estimation scan, ARCore can re-create
what Google calls “ambient probes” that
add an ambient light to the full scene
coming from a broad direction to create a
softer overall tone.
It works with the directional light to help
the digital objects blend more seamlessly
into the scene.
Again, it is about replicating or
imitating the real-world scene.
105. Ambient occlusion
Every time you add a computer-generated light,
it will produce a generated shadow. Those
shadows need to fall into the physical space
to make them believable. To do so,
two things need to happen.
● When you add an ambient light, it should
both cast a shadow on the object and
have the shadows occlude all around
it.
● When the light hits the object itself, such
as on a piece of fabric, each wrinkle
should show a shadow.
106. Something like a brick wall should have shadows created inside of every
groove. Ambient light will hit multiple surfaces, and each one will create
their own shadow. This shadow casting is called ambient occlusion.
Ambient occlusion Simulation of shadows both on an object itself and
also on the other objects around it created by the addition of an ambient
light source.
107. Environmental reflections
Look around a space and you will see environmental reflections: places
where pieces of the space are reflected in objects.
Depending on the material of the objects, the relative reflectiveness
will change.
When you add a digital object to a scene, especially an object that has a
metallic or glass surface, it should respond to the light around it in the
form of a reflection.
For these virtual objects, the reflections have to happen in real time and
adjust according to the space to lend realism and believability to the
objects.
108. Reflection. A metallic sphere reflects images from the environment
surrounding it.When creating your 3D objects, you can adjust several
properties to affect how reflective an object is.
● Diffusion
● Roughness
● Metalness
Diffusion Even distribution of light across an object’s surface.
Each material you apply to your 3D object has a base color or texture.
Adjusting an object’s diffusion property affects the amount and color of
light that is reflected at each point of an object.
109. Diffusion: The diffusion stays consistent as you look around the object.
It is a property that is applied equally along the material’s surface.
Because this is an even distribution of light, it will result in a
nonreflective surface.
In 3D software, the default diffusion color is white, unless you change it
otherwise.
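Diffusion, the even scattering described above, is classically modeled with Lambert's cosine law: the reflected light is proportional to the cosine of the angle between the surface normal and the light direction. This is a standard graphics formula, shown here as a sketch:

```python
def lambert_diffuse(normal, light_dir):
    """Lambertian diffuse term: max(0, N . L) for unit vectors N and L.
    The result is the same from every viewing angle, which is why a
    purely diffuse surface looks nonreflective."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)
```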
110. Roughness:
● If the surface is smooth and shiny like a car’s chrome bumper, it will be highly
reflective. But if the surface has tiny bumps and cracks along the surface like the
surface of a rock or brick, then it will be less reflective.
● This roughness property can change how matte or shiny an object can become.
Increasing the roughness and using brighter colors will diffuse the light across the
surface more, making it appear matte or rough.
● Reducing the amount of roughness, in addition to using darker colors, will cause the
material to appear smooth and shiny.
● Materials that are shiny will also create specular highlights.
● These are the small shiny areas on the edges of an object’s surface that reflect a light.
● These specular highlights should change relative to the position of a viewer in a scene,
because they are created by the position of the light.
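One standard way roughness is translated into highlight shape is via a specular exponent, Blinn-Phong style: smoother surfaces get a larger exponent and a tighter highlight. The mapping from roughness to exponent below is an illustrative assumption, not a physically derived formula:

```python
def specular_highlight(n_dot_h, roughness):
    """Blinn-Phong-style specular term.
    n_dot_h: cosine between the surface normal and the half vector (0..1).
    roughness: 0 = mirror-smooth, 1 = fully matte."""
    shininess = 2.0 ** (10.0 * (1.0 - roughness))  # smoother -> larger exponent
    return max(0.0, n_dot_h) ** shininess
```

Away from the exact highlight direction, the rough surface returns more light (a broad, dull sheen) while the smooth one falls off sharply (a small, intense glint).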
111. Metalness:- For the physical surface of an object, you can set multiple
properties to determine how metallic or nonmetallic it is.
● The refraction index controls the ability for light to travel through the
material. Light that cannot travel through an object will reflect back,
and more metallic surfaces will produce sharper reflections.
● The grazing angle makes the surface appear more or less mirror-like.
● If the surface reflects the light sharply and has a mirror-like quality, it
will appear more metallic.
● These properties can be adjusted to lower or increase the metalness to
change the appearance of an object’s surface.
112. ● If the surface is made more metallic and mirror-like, this will increase the
need for environmental reflections on the object’s surface.
● Reflective surfaces also pick up colors and reflect images. So, a metallic
object placed in a green room will also have a green tone.
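The green-room observation can be sketched as a linear blend: as metalness rises, the reflected environment color dominates the base color. This lerp is a simplification of physically based shading, not a full model, and the colors are illustrative:

```python
def surface_color(base_rgb, environment_rgb, metalness):
    """Blend a base color toward the reflected environment color by metalness (0..1)."""
    return tuple(b * (1.0 - metalness) + e * metalness
                 for b, e in zip(base_rgb, environment_rgb))

# A white object in a green room (illustrative colors).
white = (1.0, 1.0, 1.0)
green_room = (0.2, 0.8, 0.2)
```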
Reflection: Light and color work together to create a sense of depth and
realism.
● To create and design digital objects, they should be reflective of the
environment around them.
● This process starts with selecting the appropriate color appearance mode
for your experience, works through adding and adjusting any custom
lighting options, and should come to life by adapting to the physical
spaces that the object augments.
113. LIGHTING DESIGN
Practice creating some lighting setups.
To get started, you need to add a sphere to your scene. Do not apply any
materials to the sphere so you can see the way the lights change the
surface. Using the lighting setups create the following: One-point light
(soft and hard), Three-point light (add a key, fill, and rim light)
● Sunlight, Backlight and Your own custom setup
For each lighting setup you create, go to your Render options, and save a
PNG file for each. You can name each lighting setup accordingly. Save
these images in a folder, and use them for reference as you work on more
complex 3D models. This will create a lighting reference library for you.
116. Sound Design: Hearing what you see
Exploring how sound plays an essential role in creating an immersive experience.
From how sound is created to how we can re-create it in a digital space, there
are a lot of exciting things happening within 3D sound experiences.
HEARING WHAT YOU SEE It is important first to understand how we hear so
that we can then look at the best ways to re-create that sound to create realism in
a soundscape.
SPATIAL SOUND Just as in physical spaces, sound has direction and distance.
There is different technology that will help create a sense of 3D sound.
AUGMENTED AUDIO Just as we can add a layer of visuals into a user’s view,
we can also add a layer of ambient audio to what they hear.
VOICE EXPERIENCES With XR devices becoming more hands free, voice is
becoming an intriguing way to interact with a computer.
118. HEARING WHAT YOU SEE
Listening
Sound localization
How do we hear sound?
Loudness, pitch
Raw audio to be captured and edited
Music and voice audio
Transferable and sharable audio
How is sound useful?
How do we use sound in XR?
● Ambient sound
● Feedback sound
● Spatial sound
119. HEARING WHAT YOU SEE
Find a place that you can sit comfortably for about five
minutes, and bring a notebook and something to write
with.
It can be inside or outside—it really can be anywhere.
1. Close your eyes, and be still. Bring your awareness to listening. Try to
avoid moving your head as you do this. Don’t turn your neck toward a
sound; try to keep your neck at a similar orientation.
2. Listen for what you hear. See if you can identify what sounds you are
hearing.
120. 3. Then go one step further and try to identify where those sounds are
coming from.
Are they close? Far?
Which direction are they coming from?
Keeping yourself as the central axis, do you hear them in front of you?
Behind you?
To the left or right of you?
Up high or down low?
4. When five minutes are up, draw out what you heard by placing a circle in
the middle of the page to represent you, and then map out all the sounds
that you heard around you in the locations you heard them from. If they felt
close, write them closer to you, and in the same way, if they felt far away,
then write them farther from you.
121. Sound localization: Start to pay attention to where you place the sound in context to
yourself. Also consider how you determine the source of the sound. This is called
sound localization. The ability of a listener to identify the origin of a sound based on
distance and direction.
It is impressive how well we can understand spatial and distance relationships just
from sound.
How do we hear sound?
Sound is created through the vibration of an object. This causes particles to
constantly bump into one another, sending vibrations as sound waves to our ears and,
more specifically, to our eardrums.
When a sound wave reaches the eardrum, it too will vibrate at the same rate. Then the
cochlea, inside the ear, processes the sound into a format that can be read by the
brain.
122. To do this, the sound has to travel from the ear to the brain along the
auditory nerve.
Sound requires an element or medium to travel through, such as air, water,
or even metal.
You may already understand this process, but as we look to design for
sound, there are some key properties that are essential to understand,
including loudness and pitch.
Loudness The intensity of a sound, measured in relation to the space that
the sound travels.
Because we can detect a wide range of sound, we need a way to measure the
intensity of the sound. This is called loudness, which uses the unit of decibels
(dB) to measure how loud or soft a sound is.
123. To help you add some perspective to dB measurements:
● A whisper is between 20 and 30 dB.
● Normal speech is around 50 dB.
● A vacuum cleaner is about 70 dB.
● A lawn mower is about 90 dB.
● A car horn is about 110 dB.
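Decibels are a logarithmic scale: sound pressure level relates a pressure p to the reference pressure p0 (the threshold of human hearing, 20 micropascals) by 20·log10(p/p0). That is the standard definition, so the jump from a whisper to a car horn is far larger than the numbers alone suggest:

```python
import math

REFERENCE_PRESSURE = 20e-6  # 20 micropascals: threshold of human hearing

def decibels(pressure_pa):
    """Sound pressure level in dB relative to the hearing threshold."""
    return 20.0 * math.log10(pressure_pa / REFERENCE_PRESSURE)
```

Every tenfold increase in pressure adds 20 dB, so a 110 dB car horn carries vastly more sound pressure than a 30 dB whisper.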
Pitch:-
Sound changes depending on how fast the object is vibrating. The faster the
vibration, the higher the sound. This pitch is measured using frequency, or
how many times the object vibrates per second.
124. Pitch The perceived highness or lowness of a sound based on the frequency of vibration.
Frequency is measured in hertz (Hz).
Human hearing ranges from 20 to 20,000 Hz. However, our hearing is most sensitive to
sounds ranging in frequency between 2000 and 5000 Hz. Those who experience hearing
loss will often start to lose or have the upper pitches affected first.
How do you choose which format to use? The answer depends on what you’re working with.
● Raw audio to be captured and edited: Uncompressed formats allow you to work
with the highest quality file, and then you can compress the files to be smaller
afterward.
● Music and voice audio: Lossless audio compression files maintain the audio quality
but also have larger file sizes.
● Transferable and sharable audio: Lossy audio compression formats produce smaller
file sizes, which facilitates sharing.
125. How is sound useful?
● Sound is much like a ripple in water; it starts in a central spot, and
then it slowly extends out gradually getting smaller and smaller (or
quieter and quieter) as it moves away from the center.
● Even if you hear a sound from far away, you can still detect where the
sound is coming from or at least an approximate direction.
● You can tell the difference between footsteps walking behind you or
down another hallway.
● You can tell the difference between a crowded restaurant and an empty
one from the lobby, all because of the sound cues of chatter.
● The more chatter you hear, the more people must be inside.
● Sound adds an additional layer of information that will help the user
further grow their understanding of what is going on around them.
● Just as light can be used to understand space and depth, sound can
also be used to calculate distance and depth.
● Through the use of SONAR (sound navigation and ranging), you can
measure the time it takes for a sound to reflect back its echo.
● This idea is used by boats and submarines to navigate at sea and to
learn about the depth of the ocean as well.
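The round-trip idea can be written down directly: distance is the speed of sound in the medium times half the echo time. The speed of sound in seawater (roughly 1500 m/s) is a standard approximate figure:

```python
SPEED_OF_SOUND_SEAWATER = 1500.0  # meters per second (approximate)

def sonar_distance(echo_seconds, speed=SPEED_OF_SOUND_SEAWATER):
    """Distance to a reflecting object: the sound travels out and back,
    so the one-way distance is speed * time / 2."""
    return speed * echo_seconds / 2.0
```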
How do we use sound in XR?
There are many ways that sounds play a role in our understanding of space.
Within XR there are three main ways sound is used.
● Ambient sound
● Feedback sound
● Spatial sound
127. Ambience for reality
In order to really create a sense of “being there,” sound adds another layer
of realness.
When you see a train approaching, that comes with the expectation of
hearing the wheels on the tracks, the chugging sound of the engine, steam
blowing, and the whistle, horn, or bell.
These sounds add to your perception of the train approaching.
Notice the ambient sounds that allow the user to feel truly immersed.
128. Listen for sounds that you can mimic to re-create the scene. Sounds that are
noise intensive and have consistent looping, such as fans, wind, or waves,
do not work as well in this medium, however, so you want to avoid them.
Sounds that have a start and stop to them will be more effective and less
intrusive.
When designing for AR and MR, you can rely more on the natural ambient
noise that will be in the user’s physical space.
129. Providing feedback
● For the user experience, sound can be a great way to provide feedback
about how the user is interacting within space.
● Hearing a sound when you select an interactive element will reinforce
that you have successfully activated it.
● These sounds can be very quiet, such as a click, or louder, such as a
chime.
● Just be sure to use these sounds in a consistent way, so that the user
will start to associate the sounds with their actions.
● Sound cues can guide interactions.
130. ● You can also use sound to direct the user to look or move to another
location, to make sure they see an object that may not be in their
gaze.
● It can also be used in VR to alert the user when they are close to the
edge of their space boundaries.
Creating depth
Because our understanding of sound is 3D, it makes sense that you would
also re-create sound to reflect depth.
It also provides information to the user, such as how close or far away an
object is.
This topic is such an essential part of XR sound design that we are going to
dive into how to make your sound have depth next.
134. To re-create sound in a spatial environment, look at two components.
● How the sound is recorded
● How the sound is played back through speakers or headphones
The traditional types of audio recordings are mono and stereo.
Mono sound is recorded from a single microphone.
Stereo is recorded with two microphones spaced apart.
Stereo is an attempt to create a sense of depth by having different sounds
heard on the right and left sides of a recording.
It is intended to create a sense of 3D audio.
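On the playback side, the simplest spatial cue stereo offers is panning: splitting one mono signal between the left and right channels. A constant-power pan law, a standard audio technique sketched here, keeps perceived loudness steady as a sound moves across the stereo field:

```python
import math

def constant_power_pan(sample, pan):
    """Split a mono sample into (left, right) using a constant-power pan law.
    pan: -1.0 = hard left, 0.0 = center, +1.0 = hard right."""
    angle = (pan + 1.0) * math.pi / 4.0  # map -1..1 to 0..pi/2
    return sample * math.cos(angle), sample * math.sin(angle)
```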
135. The concept of 360-degree sound has been experimented with for years,
looking at how surround sound can allow sound to come from different
speakers all around the room creating a full 3D audio experience.
This is used most commonly in cinema and must be designed around
people sitting in one fixed location.
Single-point audio capture:
To make stereo recordings sound even more natural, one option is a
binaural audio recording format.
To record binaurally, you record from two opposite sides and place each
microphone inside a cavity to replicate the position and chamber of an ear.
This concept is used to re-create sound as closely as possible to the way we
hear it ourselves.
Headphones are needed to accurately listen to binaural sound.
136. Binaural A method of recording two-channel sound that
mimics the human ears by placing two microphones within
a replicated ear chamber positioned in opposite locations to
create a 3D sound.
Ambisonic audio uses four channels (W, X, Y, and Z) of sound
versus the standard two channels.
An ambisonic microphone is almost like four microphones in
one. You can think of this as 2D (stereo) versus 4D (ambisonic)
sound.
Ambisonic microphones have four pickups, each pointed and
oriented in a different direction making a tetrahedral
arrangement. Sound from each direction is recorded to its own
channel to create a sphere of sound.
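The four channels (W, X, Y, Z) amount to one omnidirectional component plus three directional ones. Encoding a mono source at a given direction into first-order B-format uses the classic trigonometric weights, sketched below:

```python
import math

def encode_b_format(sample, azimuth, elevation):
    """Encode a mono sample into first-order ambisonic B-format (W, X, Y, Z).
    azimuth/elevation in radians; W carries the omnidirectional component,
    X/Y/Z the front-back, left-right, and up-down components."""
    w = sample / math.sqrt(2.0)
    x = sample * math.cos(azimuth) * math.cos(elevation)
    y = sample * math.sin(azimuth) * math.cos(elevation)
    z = sample * math.sin(elevation)
    return w, x, y, z
```

Because the direction lives in the channel weights rather than in fixed speaker feeds, the same recording can later be decoded for headphones or any speaker layout.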
137. Ambisonic Microphone. Able to
capture audio from four directions at
once, this Sennheiser ambisonic
microphone is creating a spatial audio
recording from nature.
Ambisonic A method of recording
four-channel sound that captures a
sphere of sound from a single point to
reproduce 360° sound.
It was developed by the British National Research Development
Council in the 1970s, more specifically, by engineer Michael
Gerzon.
138. Paradise case study
Many XR experiences rely on an individual user experience, where each
person will have their own set of headphones on or be inside their own
virtual space.
This is a feature for which there is more and more of a demand.
In a social situation, the sound may not track with the user.
However, it could be designed to be static within a space, allowing the sound
to change as the user moves through space (as in real life).
Paradise is an interactive sound installation and gestural instrument for
16 to more than 24 loudspeakers.
For this collaborative project, Douglas Quin and Lorne Covington joined
their backgrounds in interaction design and sound design to create a fully
immersive sound experience that they optimized for four to eight people.
139. Paradise case study
The installation allows users to “compose a collage of virtual acoustic
spaces drawn from the ‘natural’ world.” As users move through the space
and change their arm positioning,
sensors activate different soundscapes from wilderness and nature to create
a musical improvisation.
This composition is unique each time as it relies on how each user moves
and interacts within the space.
Motions can change the density of sounds, the volume of them, the motion or
placement of the sound in the space, and the overall mix of the sounds
together.
140. Paradise Experience:- Visitors
react to the interactive
soundscape environment of
Paradise. Venice International
Performance Art Week, 2016.
Photograph used by permission
of Douglas Quin
141. This experience was reimagined for
both interior and exterior spaces.
Changing the location “creates a
different spatial image,” Quin
explained when I spoke with him
and Covington about the challenges
of the project.
As they re-created the experience,
they had to adjust for the location.
The exterior exhibit required fewer
ambient sounds, as they were
provided naturally.
142. The interior exhibit
required more
planning based on
how sound would
be reflected and
reverbed by the
architecture of the
space.
143. Behind the Scenes. This behind-the-scenes screen capture shows the
installation environment for Paradise.
Numbers indicate loudspeakers. The green rectangular blocks are visitors.
The large red circles are unseen zones of sounds that slowly rotate.
Sounds are activated when a visitor breaks the edge of a circle.
The triangles with colored balls are sound sources for any given sound (with
volume indicated by the size of each ball).
146. Augmented audio
AR Sound
How does it work?
Speaker Closeup
More than a speaker
How can you get your own?
Imagine going for a bike ride while listening to your favorite playlist,
receiving voice-driven instructions, and still being able to hear car engines,
sirens, and horns honking all around you.
This is the power of augmented audio.
147. Augmented audio The layering of digital sound on top of, but not blocking
out, the ambient sounds of an environment.
Augmented audio, also referred to as open-ear audio, allows you to hear all
the ambient sounds around you while adding another layer of audio on top of
it.
This allows you to receive navigational directions, participate on a phone call,
listen to an audiobook, or listen to your favorite music—all while still being
connected to the world around you.
Smartglasses, also known as audio glasses, come in different shapes from a
number of different manufacturers.
Many of these come in a sunglasses option, as they are most likely to be
used outside. However, many come with customizable lens options to
personalize your experience.
148. Bose was the first to the market and started off demonstrating the technology
using 3D printed prototypes at South by Southwest (SXSW) Conference and
Festival in 2018.
I remember walking by the Bose house in Austin, Texas, where they had taken
over part of a local restaurant to showcase their AR glasses.
I was intrigued.
I wanted to know how Bose, known for their high-quality speakers and
headphones, was entering the world of AR.
Well, I quickly found out how important audio is to an immersive experience while
wearing their AR glasses on a walking tour of Austin.
The experience started by connecting the sunglasses to my phone through
Bluetooth.
By taking advantage of the processing power of a smartphone, Bose could
keep them lightweight and cool.
149. One person in the group spotted a famous
actor stepping out of their vehicle for a film
premiere at the festival and was able to tell
everyone else as we continued listening to our
guided tour.
AR Sound. 3D printed prototypes of the
original Bose AR glasses at SXSW 2018.
To be clear, these glasses and similar pairs
from other developers don’t show the user any
visuals. They are just to provide audio.
They allow for voice interactions without needing to take out a phone.
They allow the user to interact hands-free and ears-free.
They are essentially replacements for headphones or earbuds that allow the
user to still hear everything around them at the same time.
How does it work?
Using what is called open-ear technology, a speaker is built into each arm
of the audio glasses.
What helps make them augmented, while also staying private, is the position
and direction of the speakers.
One speaker is placed on each arm of the glasses near the temple so that the
sound is close, but still allows other sounds to enter the ear cavity.
151. The speakers point backward from
the face, so they are angled right
toward the ears.
This angle reduces how much of
the sound can be heard by others
around the wearer.
Even in a 3D printed prototype
there was not much sound
escaping from the glasses, and very
little could be heard even by those
standing on either side.
152. Speaker Closeup. The speakers on the Bose
AR glass prototypes are near the ear.
In addition to the speakers themselves, there is
also a head-motion sensor built in that can
send information from the multi-axis points to
your smartphone.
This allows the app to know both the wearer’s
location as well as what direction they are
looking.
This information can help customize
directions—knowing the wearer’s right from left
for example—as well as making sure they see
key parts of an experience along the way.
153. More than a speaker
Listening is only half of the conversation. To allow for user feedback,
these glasses also include a microphone.
This allows the glasses to connect with the user’s voice assistant (more on
this in the next section).
Once again, this function helps maintain the hands-free functionality.
It also allows the glasses to be used for phone calls and voice memos for
those who want to communicate on the go.
Many models have the option to turn the microphone feature on and off for
privacy when needed. This is an important consideration.
If you do purchase a pair, make sure that you can control when the device is
listening and when it is not.
154. To further customize the experience, one arm of the glasses is equipped with
a multi-function button that you can tap, touch, or swipe.
Other than microphone control, this is the only other button you will find on
the glasses.
This allows you to change your volume, change tracks, make a selection, and navigate within an experience—without having to access your phone directly.
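A multi-function control like this is often implemented as a simple dispatch table from gestures to actions. The gesture names and assignments below are hypothetical, not the actual Bose control scheme.

```python
# Hypothetical gesture-to-action table for a single multi-function control.
GESTURE_ACTIONS = {
    "tap": "play_pause",
    "double_tap": "next_track",
    "swipe_forward": "volume_up",
    "swipe_backward": "volume_down",
    "press_and_hold": "start_voice_assistant",
}

def handle_gesture(gesture):
    """Return the action bound to a gesture, or 'ignored' if unbound."""
    return GESTURE_ACTIONS.get(gesture, "ignored")
```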
155. How can you get your own?
Although Bose has recently announced they would stop manufacturing their
audio sunglasses line, they are still currently available for purchase as of this
writing.
They were first to market but decided not to continue manufacturing the glasses, as they didn’t generate as much profit as the company had hoped.
When interviewed about this, a Bose spokesperson said, “Bose AR didn’t become what we envisioned. It’s not the first time our technology couldn’t be commercialized the way we planned, but components of it will be used to help Bose owners in a different way. We’re good with that. Because our research is for them, not us.”
Roettgers, J. (2020, June 16). Another company is giving up on AR. This time,
it’s Bose. Protocol. www.protocol.com/bose-gives-up-on-augmented-reality.
156. Since Bose’s first launch, others have stepped up production of their own
version of Bluetooth shades.
Leading the way is Amazon with their Echo Frames, which bring their well-
known Alexa assistant into a pair of sunglasses.
Everything many have learned to love about having a voice-powered home
assistant is now available on the go.
Other options to check out include audio glasses from GELETE, Lucyd,
Scishion, AOHOGOD, Inventiv, and OhO.
157. If you are looking to use your glasses for more than just audio communication, some of the shades on the market also include cameras, allowing for some action-packed capture. Leading the way in this market is Snapchat with their Spectacles Bluetooth Video Sunglasses.
Audio glasses might remain as stand-alone audio devices. Augmented audio
may also be incorporated into full visual and auditory glasses. But in either
case, the focus on exceptional sound quality will pave the way.
159. UNIT V: Extended Reality (XR) Development
1. Augmented Typography:
Legibility and readability
Creating visual contrast
Take control
Design with purpose
2. Color for XR:
Color appearance models
Light interactions
Dynamic adaptation
Reflection
3. Sound Design:
Hearing what you see
Spatial sound
Augmented audio
Voice experiences
Power of sound.
160. Voice experiences & Power of sound
Voice experiences
● VUI for voice user interface
● NLP for natural language processing
● Not a replacement: virtual keyboard
● Context
● Scripts
Power of sound
161. Voice experiences
Voice is now an interface. Voice interfaces are found in cars, mobile devices,
smartwatches, and speakers.
They have become popular because of how they can be customized to the
user’s environment, the time of day, and the uniqueness of each situation.
Alexa, Siri, and Cortana have become household names, thanks to their
help as virtual assistants.
We are accustomed to using our voice to communicate with other people, not computers.
It makes sense that companies like Amazon, Apple, and Microsoft try to humanize their voice devices by giving them names.
162. It is important to make these interfaces feel conversational to match the
expectations that humans have for any kind of voice interaction.
As stated in Amazon’s developer resources for Alexa, “Talk with them,
not at them.”
This concept has also been supported by Stanford researchers Clifford
Nass and Scott Brave, authors of the book Wired for Speech.
Their work affirms how users relate to voice interfaces in the same way that
they relate to other people.
This makes sense, because conversation with other people has been our most prominent way of engaging through speech up until this point.
163. Voice user interface: the use of human speech recognition to communicate with a computer interface.
Alexa is one example of a voice interface that allows a user to interact
conversationally.
The challenge of this, of course, is that when we speak to a person, we rely
on context to help them make sense of what we are saying.
Natural language processing, or understanding the context of speech, is
the task that an NLP software engine performs for virtual-assistant devices.
The process starts with a script provided by a VUI designer.
Just as you’d begin learning a foreign language by understanding important
key words, a device like Alexa must do something similar.
This script allows the designer to train the assistant for an experience, or skill, as it is called in VUI design.
164. Natural language processing: the use of artificial intelligence to translate human language into a form a computer can understand.
With many XR experiences linking to a smartphone or even promoting
hands-free as an added benefit, that opens up the potential for other ways for
users to interact within an experience.
If you are relying on tapping into smartphone technology, then you first
need to understand how to design for it.
VUIs are not reliant on visuals, unlike graphic user interfaces (GUIs).
The first thing to understand is that voice interactions should not be
viewed as a replacement for a visual interface.
165. Not a replacement
It is important not to get into a mindset that a voice interaction can serve
as a replacement for a visual interaction.
For example, adding a voice component is not an exact replacement for providing a keyboard.
You also need to be aware that the design approach must be different.
If you show a keyboard to a user, they will likely understand what action
to complete thanks to their past experiences with keyboards.
A keyboard, in and of itself, will communicate to the user that they need
to enter each letter, number, or symbol.
If they are able to do this with both hands, like on a computer, it may be
an easy enough task.
166. But if they have to enter a long search term or password using an
interface where they have to move a cursor to each letter individually, this
task may be greeted with intense resentment.
One way to overcome this daunting task is to provide a voice input option
instead.
It is often much easier to say a word than type it all out.
However, the process of inputting data this way is much different than with a traditional QWERTY keyboard or even an alphabet button selection, and it is not as familiar.
167. When a user sees the letters
of the alphabet or a standard
QWERTY keyboard, they
connect their past experiences
with it, so they can easily start
to navigate to the letter they
want to choose.
But when you are relying on
voice, the user will connect to
their past communication with
other people as how to interact.
168. Virtual Keyboard. Concept of communicating with a friend via a screen hologram with a full QWERTY keyboard.
There needs to be something in the UI that
communicates to the user that they can
use their voice to interact.
This is often shown through the use of a
microphone icon.
However, it can also come in the form of
speech.
One way to let someone know that they can
speak is by starting the conversation with a
question such as “How can I help you?”
169. Depending on the device, and where it will be used, design this experience to
match what will work best to communicate to the user that they can use their
voice and to start the conversation.
What do you do first in a conversation?
Before you speak, you might make sure the person you are speaking to is
listening. But if you are speaking to a computer, you don’t have body cues or eye
contact to rely on. So, this active listening state needs to be designed.
Though most of the conversation experience will not use visuals, this is one area
where a visual provides great benefit.
Using a visual to let the user know that the device is listening can help substitute
for the eye contact they are used to when talking with another person.
This can be a visual change, such as a light turning on or a colorful animation, so
they know that what they are saying is being heard.
170. Tip
Use visual cues to give feedback to the user. A great place for this is communicating that a device is ready and listening.
Companies like Apple have created their own custom circular animations
that they use across all their devices;
when a user sees Apple’s colorful circle of purples, blues, and white, they
connect it with a voice interaction.
Seeing this animation communicates that the device is ready and listening
for a voice command.
All of this happens instead of a keyboard appearing. So, it isn’t a
replacement, but rather a totally different way of communicating, and
therefore in need of a totally different interface and design.
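One way to organize this kind of feedback is a small state machine that pairs each stage of the voice interaction with a visual cue. The states and cue descriptions below are illustrative, not any vendor’s actual design.

```python
from enum import Enum, auto

class VoiceState(Enum):
    IDLE = auto()        # waiting for a wake word; no attention-grabbing cue
    LISTENING = auto()   # actively hearing the user; the "eye contact" cue
    THINKING = auto()    # processing the request; keep subtle feedback going
    RESPONDING = auto()  # speaking back to the user

# Each state gets a distinct visual so the user always knows what is happening.
VISUAL_CUES = {
    VoiceState.IDLE: "dimmed indicator",
    VoiceState.LISTENING: "colorful animated ring",
    VoiceState.THINKING: "slow pulsing glow",
    VoiceState.RESPONDING: "steady glow",
}
```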
171. Context
When people communicate, we use our knowledge of the context to create a
shared understanding. Once the user knows the device is listening, they may
know to start talking, but how does the user know what to say or even what
is okay to say without any prompts?
With a voice interface there is no visual to show what the options are.
It is best practice to have some options voiced, such as “you can ask...”
followed by a few options.
You may be familiar with the options provided by an automated phone menu: “Press 1 to talk to HR, press 2 to talk to the front desk.”
Those are often very frustrating, because they are very one sided and not
conversational. The goal here is to start by understanding the user’s goal.
172. This is where everything we have been talking about is coming together.
Once you have identified the why in your project, planned out roughly how it
might look, done user research, and created a user flow, you can start to
predict some options that a user may be looking for.
You can start off by letting the user know what their options are based on
this research.
This can be laid out in a set of questions or by asking an open-ended
question.
When you record an interview with someone, it is best practice to ask open-ended questions or compound questions.
The reason is that you want the person to answer with context.
173. If you ask two questions in one, a compound question, it is a natural
tendency for them to clarify the answer as they respond.
Perhaps you ask “What is your favorite way to brew coffee, and what do you
put in it?” Instead of answering “French press and with cream,” it is likely
that they will specify which of the questions they are answering within the
answer itself.
174. We’re discussing question and answer methods here because such
exchanges point out an important way that humans communicate.
We like to make sure that our answers have the correct context to them. This
is especially true when there is more than one question asked.
So, a likely response would be, “My favorite way to brew coffee is using a
French press, and I like just a little cream in it.”
Traditional media interviews don’t include the interviewer’s questions in the final cut, so it’s important to get context in the answer.
175. The need for context is important to understand as it relates to voice
interfaces:
Humans may not provide needed context in their voice commands.
Having the computer ask questions that are open-ended or have multiple
options will trigger the user to provide more context in their answer, which
will help the device more successfully understand what the user is asking.
Using the power of machine learning and natural language processing,
the device creates a system that recognizes specific voice commands.
These commands must be written out as scripts.
176. Scripts
Think about all the different ways within just the English language someone
can say no:
nope, nah, no way, not now, no thanks, not this time...
And this is just to name a few. You also need to consider what questions the
user may ask and the answers they may give.
This requires anticipating what the user will say and then linking that
response to activate the next step.
With a traditional computer screen, the user has a limited number of options
based on the buttons and links you provide.
With the input of a click or a tap, the computer knows to load the connected
screen based on that action. With voice, the interaction is reliant only on
spoken language.
177. As part of the voice interface design, an
important step of the process is to create a
script.
This script should embrace the dynamic
qualities of conversation.
A successful script should go through
multiple levels of user testing to identify
questions that users answer—and also all
the different ways they answer.
When the user isn’t given a set number of
options to choose from, the script helps
translate the human response into
something actionable by the computer.
178. Script Sample. Sample answers collected to show possible answers to the
question “How have you been?”
All likely answers need to be collected to help a voice assistant understand
how to respond based on each possible answer.
While it is easy to tell a computer what to do if someone says “yes” or “no,” it
is less likely that the user will stick to these words.
Computers may easily be able to understand yes and no commands, but
what happens when a user, who is speaking as they always do to other
people, says “I’m going to call it a day.” Or “I’m going to hit the sack.”
These idioms are not going to be understood by a computer unless it is taught them. Without the cultural context, the computer might conclude that you are going to physically hit a bag, instead of going to sleep.
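A first pass at this kind of script can be sketched as a lookup from collected utterances to intents. The phrase lists and intent names here are illustrative, and a production assistant would use a trained NLP model rather than exact matching, but the sketch shows why every variant has to be gathered during user testing.

```python
# Utterances collected during user testing, grouped by the intent they express.
INTENT_PHRASES = {
    "deny": {"no", "nope", "nah", "no way", "not now", "no thanks",
             "not this time"},
    "confirm": {"yes", "yeah", "yep", "sure", "sounds good"},
    "go_to_sleep": {"i'm going to call it a day", "i'm going to hit the sack",
                    "time for bed", "good night"},
}

def match_intent(utterance):
    """Map a spoken utterance to a scripted intent, or None if unrecognized."""
    text = utterance.strip().lower().rstrip(".!?")
    for intent, phrases in INTENT_PHRASES.items():
        if text in phrases:
            return intent
    return None
```

An idiom such as “I’m going to hit the sack” only resolves correctly because someone anticipated it and added it to the script; anything missing falls through to None, where the assistant must ask a clarifying question.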
179. How do you anticipate responses of users to build into a script? You ask
them, and you listen to them.
Use a script and thorough user testing to collect anticipated responses to
help keep the experience complete.
Multiple rounds of quality assurance testing must be completed throughout
the whole process.
Think of how color, image choice, and copy set the tone for a design in print
or on the web. In the same way, sound quality, mood, and content of the
responses of the voice will set the tone for the voice experience.
As you can imagine, this is no small task.
180. To do this well requires a team of people dedicated to designing these VUI
skills. However, scripts that are created have components that can be used
across the different skills.
The conversational experiences we have will start to build a relationship with
the device and will also help establish a level of trust.
These experiences have a way of making an experience personal. Think about
how nice it is when someone knows and uses your name.
What if they could also learn and know your favorite settings and words?
What if they could mimic these preferences as they guide you, provide clear
instructions, and help reduce your anxiety by saying the right words?
This is what voice experiences can do. They are rooted in our experiences of conversations with friends and colleagues, so it is no surprise that we start to trust them like one, too.
181. Power of sound
Sound design should not be an afterthought; it makes or breaks the
experience.
Once you start to notice how sound plays a role in your physical world, you
can start to design ways for sound to create more immersion in your XR
experiences.
Using audio can help an experience feel more real, enhance your physical
space, or even help you interact with a computer interface, hands-free.
182. SOUND LOCALIZATION DESIGN
In the beginning of this chapter, you played the role of the listener. You
directed your awareness to the sounds that happened around you.
This time, you can take what you learned from that experience, and what you
have learned in this chapter, to design your own soundscape.
To do this, draw a chart similar to the sound localization diagram you
created from your listening experience.
However, this time you are going to design what sounds will be happening,
and where.
183. ● Think about where the experience will be happening, and if it is for VR
or AR, as that will determine how much ambient sound you will need
to plan for.
● Think about the distance and intensity of the sound from the user’s
perspective.
If you want the extra challenge, you can then record the sounds and bring
them into a sound editor of your choice, such as Apple Logic Pro or Adobe
Audition, to start editing them.
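When planning the distance and intensity of each sound on your chart, one common rule of thumb is the inverse-distance law that game engines use for rolloff: sound pressure roughly halves (drops about 6 dB) with each doubling of distance. A minimal sketch, with the reference distance as an assumption:

```python
def distance_gain(distance_m, reference_m=1.0, min_gain=0.0):
    """Linear gain for a sound source using the inverse-distance law.

    Inside the reference distance the source plays at full volume;
    beyond it, gain falls off as reference / distance (about -6 dB
    per doubling of distance), floored at min_gain.
    """
    if distance_m <= reference_m:
        return 1.0
    return max(min_gain, reference_m / distance_m)
```

So a sound placed 4 m from the listener on your diagram would play at about a quarter of its close-up volume, which gives you a starting intensity for each entry before fine-tuning by ear.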
184. To create a fully immersive experience, you will need to bring the edited sounds into a program, such as Unity Pro or Unreal Engine, that will allow you to spatially map the location of the sounds.