MPEG-V Awareness Event
1. MPEG-V: a standard for multi-sensorial and immersive experiences
Marius PREDA*
Institut TELECOM / TELECOM SudParis
*and several tens of people who shared their images on the Internet
2. Immersion, presence, stimuli, perception …
For centuries, we have been building two worlds: the Physical world and the Informational world (films, stories, novels, knowledge, music).
Immersion – a straightforward definition:
abstracting away the Physical world (removing all possible connections to it) and receiving stimuli only from the Informational world
3. In which world are our users?
The stimuli from the physical world are familiar or of very low intensity.
9. What is the secret of a good immersion in the Informational world?
Is it the quality of the stimuli? Not only.
A guess:
The "scenarisation" of the experience, or, in MPEG terms, "authoring the content"
We should have methods and tools to express this new type of complex content
10. Multi-sensorial content
A natural extension of the more traditional audio/video content
This is not new, but it is now the right time to bring it into home environments; therefore interoperability is an issue
MPEG-V is the solution, offering a rich set of tools for representing multi-sensorial content
Covered mainly by Parts 2, 3 & 5
12. Towards 3D content
3D Virtual Worlds, a promising trend in the 3DGFX space
- New technologies for creating, representing and visualizing 3D content are now available
- Fast development of high-performance 3D graphics cards
- Connecting the real and virtual economies
- Increased user demand for rich communication channels
(Research by kzero.org)
MPEG was already prepared for this take-off, having defined means for GFX representation
13. Virtual World assets
Generic Virtual Objects and Avatars:
- Container for personal data, personality, skills, …
- Communication support between users
- Interaction support between the user and the virtual environment
14. Closed VW vs Open VW
MPEG-V
Sharing content between Virtual Worlds becomes
possible with MPEG-V
Covered mainly by Part 4
15. MPEG-V
A standardisation effort initiated in 2007 and driven by
two complementary forces – multi-sensorial experience
and virtual worlds – with the goal of offering a solid
technical ground for immersive, multi-dimensional,
multimedia applications and services
Promoted as ISO/IEC 23005 in January 2011
16. Why, How and other more or less philosophical questions
Why is MPEG the appropriate place for MPEG-V?
Answer by Dr. Leonardo Chiariglione
How is MPEG dealing with the interoperability of immersive experiences?
Answer by Eng. Jean Gelissen (Part 1), Pr. Kyoungro Yoon (Parts 2 & 5), Dr. Christian Timmerer (Part 3) and Dr. Jae Joon Han (Part 4)
How does MPEG-V connect to MPEG-4 3D Graphics?
Answer by Dr. Minsu Ahn (invited speaker)
What does the future hold for Virtual Worlds?
Answer by Dr. Yesha Sivan (invited speaker)
18. MPEG-V Architecture & Use Case
Jean H.A. Gelissen
Philips Research
Eindhoven, The Netherlands
19. MPEG-V Architecture and Use Cases
MPEG-V defines an architecture that provides interoperability for information exchange with virtual worlds. This allows for simultaneous reactions in both worlds to changes in the environment and in human behavior. Key words are efficient, effective, intuitive and entertaining interfaces between both worlds, taking economics, rules and regulations into account.
20. Use Case driven Architecture
• Use cases have been crucial in the development of MPEG-V:
– For requirements gathering,
– During design, development and implementation,
– As the basis for validation (supported by reference software and conformance)
• Representation of Sensory Effects (RoSE)
• Full motion control and navigation of avatar/object with multi-input sources
• Virtual Travel
• Serious gaming for Ambient Assisted Living
• Virtual Traces of Real Places
• Avatar Appearance
• Social presence
• Group Decision-making in the context of Spatial Planning
• Consumer Collaboration in Product Design Processes along the Supply Chain
• Virtual Objects
• Internet Haptic Service - YouTube, Online Chatting
• Next Generation Classroom – Sensation Book
• Immersive Broadcasting – Home Shopping, Fishing Channels
• Entertainment – Game (Second Life, Star Craft), Movie Theater
• Virtual Simulation for Training – Military Task, Medical training
• Motion Effects
35. Capturing and controlling
the real world
with MPEG-V
Kyoungro Yoon (yoonk@konkuk.ac.kr)
School of Computer Science and
Engineering
Konkuk University
36. Contents
• MPEG-V Architecture
• Scope of Part 2 and Part 5
• Example Scenarios
• Part 2 Control Information
• Part 5 Data Formats for Interaction Devices
• Conclusion & Discussions
37. MPEG-V Architecture
[Architecture diagram] A provider (virtual world, (serious) game, simulator, DVD, …) delivers enriched digital content. Adaptation engines (VV, RV/VR, VR/RV) mediate, driven by user interaction, between the Virtual World Data Representations and the Real World Data Representation.
Standardization Area B: Sensory Information (Parts 3, 4, …)
Standardization Area A: Control Information and Device Commands (Parts 2 & 5), connecting sensors and actuators (Real World Devices 1…N)
Areas A & B are the targets of MPEG-V standardization.
38. Two Parts to capture and control
the real world
• ISO/IEC 23005 Part 2 Control Information
– The capability descriptions of actuators (sensory
devices) and sensors in the real world
– The user’s sensory preference information (USEP),
which characterizes devices and users, so that
appropriate information to control individual devices
(actuators and sensors) for individual users can be
generated
39. Two Parts to capture and control
the real world
• ISO/IEC 23005 Part 5 Data Formats for Interaction
Devices
– Data formats for industry-ready interaction devices:
sensors and actuators
41. Intention
• Part 5 alone can provide the functionality for capturing and controlling the real world.
– Device Commands: intensity provided as a percentage of the maximum intensity that the specific device can generate
– Sensed Information: sensed value provided in the specific unit of each individual sensor
• Part 2 helps to adapt the control to each individual user's case (see the sketch below).
– Device capability description with min/max intensity
– Sensor capability description with min/max value, accuracy, etc.
– User's preferences on sensory effects
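To make that division of labor concrete, here is a minimal Python sketch (hypothetical names, not the normative MPEG-V API) of adapting a Part 5 percentage command with a Part 2 device capability and user preference:

# Minimal sketch: adapt a Part 5 intensity command (percentage of the device
# maximum) using a Part 2 capability description and a user preference.
# All names are illustrative, not the normative MPEG-V schema.
from dataclasses import dataclass

@dataclass
class DeviceCapability:       # Part 2: what the actuator can do
    min_intensity: float
    max_intensity: float

@dataclass
class UserPreference:         # Part 2: how strong the user wants effects
    max_allowed_ratio: float  # e.g., 0.5 caps effects at 50%

def adapt_command(percent: float, cap: DeviceCapability,
                  pref: UserPreference) -> float:
    """Map a 0-100% command onto the device range, honoring the preference."""
    ratio = min(percent / 100.0, pref.max_allowed_ratio)
    return cap.min_intensity + ratio * (cap.max_intensity - cap.min_intensity)

# A fan blowing between 0 and 12 m/s, a user who caps wind effects at 50%:
fan = DeviceCapability(min_intensity=0.0, max_intensity=12.0)
print(adapt_command(80.0, fan, UserPreference(max_allowed_ratio=0.5)))  # 6.0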
42. Example Scenarios
• RoSE
– Sensory Effect Metadata, Sensory Device Capabilities, Sensory Device Commands, and User Sensory Preferences are within the scope of standardization and thus shall be normatively specified. On the other hand, the RoSE Engine as well as Provider entities and Consumer Devices are informative and left open for industry competition.
44. Bringing sensibility in a virtual world
" Environment
Sensors
-‐
The
real-‐0me
environmental
data
of
the
real
world
is
applied
to
reflect
on
the
virtual
enviro
nment.
-‐
Light,
ambient
noise,
temperature,
humidity,
distance,
atmospheric
pressure,
force,
torque,
pressure
sensors.
Sensing
in
Real
World
Reflec<ng
Environmental
context
to
Virtual
World
-‐
Sensed
Informa<on
(TimeStamp,
id,
sensorIdRef,
linkedlist,
groupID,
ac0vate,
priority)
-‐
Sensor
Capability
(Accuracy,
unit,
maxValue,
minValue,
offset,
numOfLevels,
sensi0vity,
SNR)
-‐
Brightness
changes
-‐
Turn
on
the
light
-‐
Change
day
(night)
to
night
(day)
-‐
Adapt
the
0me
series
sensor
data
seman0cally
to
the
virtual
world
based
on
the
-‐
Measure
Temperature,
Humidity,
Light,
sensor
capabili0es
and
adapta0on
Atmospheric
pressure,
and
etc.,
in
the
real
preferences.
world.
Courtesy
of
Samsung
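As a rough illustration of that adaptation step, the following sketch (hypothetical names) normalizes a sensed light value against the sensor's declared capability range before reflecting it in a virtual world property:

# Illustrative only: normalize a sensed value (Part 5) against the sensor
# capability (Part 2) before reflecting it in the virtual environment.
def reflect_light(sensed_lux: float, min_value: float, max_value: float) -> float:
    """Clamp the reading to the capability range and map it to 0..1 brightness."""
    clamped = max(min_value, min(sensed_lux, max_value))
    return (clamped - min_value) / (max_value - min_value)

# A light sensor declared with minValue=0 lux and maxValue=100000 lux:
print(reflect_light(25000.0, 0.0, 100000.0))  # 0.25, e.g., dusk in the VW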
45. Do what I do
" Mo<on
Sensor
-‐
detects/tracks
Real
0me
3D
mo0on
informa0on
of
a
control
device
-‐
Posi0on,
Velocity,
Accelera0on,
Orienta0on,
Angular
velocity,
Angular
accelera0on
sensors
" Can
be
applicable
to
mo<on
sensor
based
games.
Virtual
Naviga0on Virtual
Sports
Virtual
Concert Virtual
Games
Courtesy
of
Samsung
46. Part 5: Data Formats for Interaction
Devices
• Provides a simple interface to the physical world (real world)
[Scope diagram] Within the scope of Part 5: Sensory Device Commands and Sensed Information, exchanged with the (non-normative) Adaptation RV/VR Engine, which also consumes Sensory Information (Part 3), Virtual World Object Characteristics (Part 4), Device Capability, Sensor Capability and the User's Sensory Preference.
47. Part 5: Data Formats for Interaction
Devices
• Interaction Information Description Language (IIDL)
• Device Command Vocabulary
• Sensed Information Vocabulary
48. Part 5: Data Formats for Interaction
Devices
Device Command Vocabulary:
Light, Flash, Heating, Cooling, Wind, Vibration, Scent, Fog, Sprayer, ColorCorrection, Tactile, Kinesthetic, RigidBodyMotion
50. Part 5: Data Formats for Interaction
Devices
• Interaction Information Description Language (IIDL)
• Three Root Elements
– InteractionInfo
– DeviceCommand
– SensedInfo
51. Part 5: Data Formats for Interaction
Devices
• Interaction Information Description Language (IIDL)
– Provides base types for data formats for interaction
devices
• DeviceCommandBaseType
• SensedInfoBaseType
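For flavor, the sketch below builds a SensedInfo-style instance with Python's standard library; the element and attribute names echo the slides (id, sensorIdRef, TimeStamp) but are illustrative, not the normative XML Schema with its namespaces:

# Sketch of an IIDL-style instance document; names are illustrative.
import xml.etree.ElementTree as ET

interaction_info = ET.Element("InteractionInfo")
sensed = ET.SubElement(interaction_info, "SensedInfo",
                       id="SI001", sensorIdRef="tempSensor01",
                       timeStamp="2011-01-15T12:00:00", activate="true")
ET.SubElement(sensed, "Value", unit="celsius").text = "21.5"

print(ET.tostring(interaction_info, encoding="unicode"))
# <InteractionInfo><SensedInfo ...><Value unit="celsius">21.5</Value>...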
52. Part 5: Data Formats for Interaction
Devices
• Interaction Information Description Language (IIDL)
53. Part 2: Control Information
• Provides capability and preference descriptions for fine-tuned control of devices
[Scope diagram] Within the scope of Part 2: Device Capability, Sensor Capability and the User's Sensory Preference, which the (non-normative) Adaptation RV/VR Engine combines with Sensory Information (Part 3), Virtual World Object Characteristics (Part 4), Sensory Device Commands and Sensed Information.
54. Part 2: Control Information
• Control Information Description Language (CIDL)
• Device Capability Description Vocabulary (DCDV)
• Sensor Capability Description Vocabulary (SCDV)
• User’s Sensory Effect Preference Vocabulary (SEPV)
58. Part 2: Control Information
Control Information Description Language (CIDL)
Provides the basic structure of the tools defined in Part 2
59. Part 2: Control Information
Control Information Description Language (CIDL)
Also provides base types for each type of description
Sensory Device Capability Base Type
60. Part 2: Control Information
Control Information Description Language (CIDL)
Also provides base types for each type of description
Sensor Capability Base Type
61. Part 2: Control Information
Control Information Description Language (CIDL)
Also provides base types for each type of description
User Sensory Preference Base Type
62. Conclusion and Discussions
• Capturing and controlling the real world can be supported by MPEG-V Part 5
• For fine-tuned control/capture of the real world, MPEG-V Part 2 can help.
• For personalized effects, MPEG-V Part 2 is required.
• Questions?
• Thank You.
63. Immersive Future Media
Technologies Sensory Experience
Christian Timmerer
Klagenfurt University (UNI-KLU) Faculty of Technical Sciences (TEWI)
Department of Information Technology (ITEC) Multimedia Communication (MMC)
http://research.timmerer.com http://blog.timmerer.com
mailto:christian.timmerer@itec.uni-klu.ac.at
Acknowledgments. This work was supported in part by the European Commission in the context of the NoE INTERMEDIA (NoE 038419), the P2P-Next project (FP7-ICT-216217), and the ALICANTE project (FP7-ICT-248652).
64. Motivation
• Consumption of multimedia content may also stimulate senses other than vision or audition
– Olfaction, mechanoreception, equilibrioception, thermoception, …
• Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects
… giving her/him the sensation of being part of the particular media ➪ worthwhile, informative user experience
65. Outline
• Background / Introduction
– MPEG-V Media Context and Control
– Sensory Effect Description Language (SEDL) and Sensory
Effect Vocabulary (SEV)
– Software/Hardware components: SEVino, SESim, SEMP,
and amBX+SDK
• Improving the QoE through Sensory Effects ➪ Sensory
Experience
– A Brief Introduction to UME/QoE (UMA/QoS)
– Results from Subjective Tests
• Conclusions and Future Work
66. MPEG-V: Media Context and Control
System Architecture
Pt. 1: Architecture
Pt. 2: Control Information
Pt. 3: Sensory Information
Pt. 4: Virtual World Object Characteristics
Pt. 5: Data Formats for Interaction Devices
Pt. 6: Common Types and Tools
Pt. 7: Conformance and Reference Software
http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V
67. Sensory Effect Description Language
(SEDL)
• XML Schema-based language for describing sensory effects
– Basic building blocks to describe, e.g., light, wind, fog, vibration, scent
– MPEG-V Part 3, Sensory Information
– Adopted MPEG-21 DIA tools for adding time information (synchronization)
• Actual effects are not part of SEDL but defined within the Sensory Effect
Vocabulary (SEV)
– Extensibility: additional effects can be added easily w/o affecting SEDL
– Flexibility: each application domain may define its own sensory effects
• Description conforming to SEDL :== Sensory Effect Metadata (SEM)
– May be associated to any kind of multimedia content (e.g., movies, music,
Web sites, games)
– Steer sensory devices like fans, vibration chairs, lamps, etc. via an
appropriate mediation device
➪ Increase the experience of the user
➪ Worthwhile, informative user experience
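A hedged sketch of what a SEM description could look like, with element names in the spirit of SEDL/SEV (the normative schema uses its own namespaces and MPEG-21 DIA timing attributes):

# Illustrative SEM-like description: wind and vibration effects attached to
# points on the media timeline; not the normative SEDL schema.
import xml.etree.ElementTree as ET

sem = ET.Element("SEM")
ET.SubElement(sem, "Effect", type="Wind", intensity="60",   # % of device max
              start="00:01:05", duration="PT4S")
ET.SubElement(sem, "Effect", type="Vibration", intensity="80",
              start="00:01:07", duration="PT1S")

print(ET.tostring(sem, encoding="unicode"))

Such a description would then be interpreted by a mediation device that issues the corresponding device commands to fans, vibration chairs, lamps, etc.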
71. SEVino, SESim, SEMP, and amBX
Annotation Tool: SEVino
Player: SEMP
Simulator: SESim
amBX (Ambient Experience) system + SDK
• Two fan devices, a wrist rumbler, two sound
speakers, a subwoofer, two lights, and a wall
washer
• Everything controlled by SEM descriptions
except light effect
72. Quality of Experience
Factors impacting Quality of Experience (QoE)
[Diagram] Technical factors: network, content, device, format, application. Context: task, environment. Social and psychological factors: user, expectation, content.
T. Ebrahimi, "Quality of Multimedia Experience: Past, Present and Future", Keynote at ACM Multimedia 2009, Beijing, China, Oct 22, 2009. http://www.slideshare.net/touradj_ebrahimi/qoe
73. Quality of Experience
• Universal Multimedia Access (UMA)
– Anywhere, anytime, any device + technically feasible
– Main focus on devices and network connectivity issues ➪ QoS
• Universal Multimedia Experience (UME)
– Take the user into account ➪ QoE
• Multimedia Adaptation and Quality Models/Metrics
– Single modality (i.e., audio, image, or video only) or a simple combination of
two modalities (i.e., audio and video)
• Triple user characterization model
– Sensorial, e.g., sharpness, brightness
– Perceptual, e.g., what/where is the content
– Emotional, e.g., feeling, sensation
• Ambient Intelligence
– Add’l light effects are highly appreciated for both audio and visual content
– Calls for a scientific framework to capture, measure, quantify, judge, and
explain the user experience
B. de Ruyter, E. Aarts, "Ambient intelligence: visualizing the future", Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203–208.
E. Aarts, B. de Ruyter, "New research perspectives on Ambient Intelligence", Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5–14.
F. Pereira, "A triple user characterization model for video adaptation and quality of experience evaluation", Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1–4.
74. Experiment: Goal & Setup
• Aim: demonstrate that sensory effects are a vital tool for enhancing the user experience, depending on the actual genre
• Tools
– Sensory Effect Media Player (SEMP)
– Test sequences annotated with sensory effects: action (Rambo 4,
Babylon A.D.), news (ZIB Flash), documentary (Earth), commercials
(Wo ist Klaus), and sports (Formula 1)
– Double Stimulus Impairment Scale (DSIS) also known as Degradation
Category Rating (DCR)
• Five-level impairment scale ➪ new five-level enhancement scale
• Procedure
– First, show the reference sequence w/o sensory effects
– Second, show the same sequence enriched with sensory effects, with a two-second break in between
– Finally, subjects rate their overall opinion of the audio/video resource and the sensory effect quality
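The ratings collected this way are typically condensed into a mean score per sequence; a trivial post-processing sketch with made-up ratings (not the study's data):

# Trivial DSIS/DCR post-processing: mean opinion score per genre on the
# five-level enhancement scale. Ratings here are made up for illustration.
from statistics import mean

ratings = {
    "action":      [5, 4, 5, 4, 4],
    "news":        [2, 3, 2, 2, 3],
    "documentary": [4, 4, 5, 4, 4],
}
for genre, scores in ratings.items():
    print(f"{genre}: MOS = {mean(scores):.2f}")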
76. Conclusions
• Sensory effects are a vital tool for enhancing the user experience, leading to a unique, worthwhile Sensory Experience
– Action, sports and documentary genres benefit more from these additional effects
– Although Rambo 4 and Babylon A.D. are from the same genre, their results differ slightly
– The commercial genre can also profit from the additional effects, but not at the same level as documentary
– The news genre will not profit from these effects
• Interoperability through MPEG-V (ISO/IEC 23005)
77. Acknowledgments
• EC projects for funding this activity
– NoE INTERMEDIA (NoE 038419)
• http://intermedia.miralab.ch/
– P2P-Next project (FP7-ICT-216217)
• http://www.p2p-next.eu
– ALICANTE project (FP7-ICT-248652)
• http://www.ict-alicante.eu
– COST ICT Action IC1003
• QUALINET – European Network on Quality
of Experience in Multimedia Systems and Services
• Markus Waltl for implementing, preparing, conducting, and evaluating almost all the experiments
• Benjamin Rainer for implementing the Firefox plug-in + WWW tests
• Hermann Hellwagner for his advice and feedback
• ISO/IEC MPEG and its participating members for their constructive
feedback during the standardization process
78. References
• Markus Waltl, Christian Timmerer, Hermann Hellwagner, “A Test-Bed for Quality of
Multimedia Experience Evaluation of Sensory Effects”, Proceedings of the First
International Workshop on Quality of Multimedia Experience (QoMEX 2009), San
Diego, USA, July 29-31, 2009.
• C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, “Interfacing with Virtual
Worlds”, Proceedings of the NEM Summit 2009, Saint-Malo, France, September
28-30, 2009.
• M. Waltl, Enriching Multimedia with Sensory Effects, VDM Verlag Dr. Müller, February 2010.
• M. Waltl, C. Timmerer and H. Hellwagner, “Increasing the User Experience of
Multimedia Presentations with Sensory Effects”, Proceedings of the 11th
International Workshop on Image Analysis for Multimedia Interactive Services
(WIAMIS’10), Desenzano del Garda, Italy, April 12-14, 2010.
• C. Timmerer, M. Waltl, and H. Hellwagner, “Are Sensory Effects Ready for the World
Wide Web?”, Proceedings of the Workshop on Interoperable Social Multimedia
Applications (WISMA 2010), Barcelona, Spain, May 19-20, 2010.
• M. Waltl, C. Timmerer, and H. Hellwagner, “Improving the Quality of Multimedia
Experience through Sensory Effects”, Proceedings of the 2nd International
Workshop on Quality of Multimedia Experience (QoMEX2010), Trondheim, Norway,
June 21-23, 2010.
• M. Waltl, C. Raffelsberger, C. Timmerer, and H. Hellwagner, “Metadata-based Content
Management and Sharing System for Improved User Experience”, Proc. of the 4th
InterMedia Open Forum (IMOF 2010), Palma de Mallorca, Spain, September 1, 2010.
82. Introduction (1/2)
Architecture and specifications of associated information representations of virtual worlds
• Enable interoperability between virtual worlds (Adaptation VV)
• Provide controllability of virtual worlds with real-world devices
83. Introduction (2/2)
" Characterize virtual world objects by the two elements.
Avatar: a representation of the user inside the virtual environments.
Virtual object: any object except for avatars in the virtual environments.
Virtual
Object
Avatars
84. Why are common specifications needed?
• To import characters from one virtual world to another.
[Illustration: the parameters of a created avatar are imported across VW 1, VW 2, …, VW N]
• To provide common formats for interfacing between the virtual world and the real world.
[Illustration: scaling & rotation of an object]
85. Virtual world object characteristics (VWOC)
• Characterize various kinds of objects within the VW.
• Reference the resources.
• Provide an interaction with the VW.
[Diagram: devices interact with the VW as events; resources (sound, scent, animation) are referenced]
86. A base type of virtual world object
characteristics
• Common characteristics for both avatars and virtual objects
– Identification
– A list of sound resources
– A list of scent resources
(these characterize virtual world objects)
– A list of control parameters
– A list of events
– A list of behavioral models
(these provide interaction with virtual world objects)
• Inherit the base type to extend the specific aspects of each (a sketch follows below).
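As a mental model of this base type and its two extensions, a hedged Python sketch (field names paraphrase the slide, not the normative Part 4 schema):

# Mental model of the VWO base type; field names paraphrase the slide.
from dataclasses import dataclass, field

@dataclass
class VWOBaseType:
    identification: dict                                 # ownership, credits, user ID
    sounds: list = field(default_factory=list)           # sound resource descriptors
    scents: list = field(default_factory=list)           # scent resource descriptors
    control: list = field(default_factory=list)          # position/orientation/scale
    events: list = field(default_factory=list)           # input events (mouse, ...)
    behavior_models: list = field(default_factory=list)  # input->output ID mappings

@dataclass
class Avatar(VWOBaseType):            # avatars extend the base type ...
    appearance: dict = field(default_factory=dict)
    personality: dict = field(default_factory=dict)

@dataclass
class VirtualObject(VWOBaseType):     # ... and so do virtual objects
    components: list = field(default_factory=list)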
87. Identification
• Describes the ownership, credits, and the associated user ID.
• Supports digital rights according to ISO/IEC 21000-5:2004 (MPEG-21).
Example:
– Name: Cello
– Family: Musical Instrument
– UserID: JohnDoe
– Ownership: V. School of Art
– Rights: No duplication
– Credits: Ms. Jane Sound (Virtual Instruments shop) added a cello sound.
88. Sound and scent type
• Contains the URLs of the resources.
• Contains the descriptions of the resources
– ID, intensity, duration, loop, and name.
Example (V-Cello in a V-Concert performance):
– Name: CelloSound
– Resource: http://...
– Intensity: 50%
– Duration: 2 seconds
– Loop: unlimited
89. Control type
• Contains the control parameters for position,
orientation, and scale factor.
[Illustrations: scaling an object; rotating an object]
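These control parameters compose in the usual scale-rotate-translate order; a toy 2D sketch of that composition (illustrative, not the normative semantics):

# Toy sketch: apply control-type parameters (scale, orientation, position)
# to a local coordinate; 2D for brevity, names illustrative.
import math

def apply_control(x: float, y: float, pos=(0.0, 0.0),
                  angle_deg: float = 0.0, scale: float = 1.0):
    """Scale, then rotate, then translate a point."""
    a = math.radians(angle_deg)
    sx, sy = x * scale, y * scale
    rx = sx * math.cos(a) - sy * math.sin(a)
    ry = sx * math.sin(a) + sy * math.cos(a)
    return rx + pos[0], ry + pos[1]

print(apply_control(1.0, 0.0, pos=(2.0, 0.0), angle_deg=90.0, scale=2.0))
# approximately (2.0, 2.0): scaled x2, rotated 90 degrees, moved to x=2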
90. Event type and behavioral model
• Event type supports legacy input devices (mouse, keyboard) and user-defined input devices.
• Behavioral model provides the mapping between input and output events by their IDs.
Example: a mouse left click (ID: EventID1) triggers the walk animation (ID: AniID4) and the "Wow~" sound (ID: SoID5).
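In code, the behavioral model boils down to an ID-to-ID mapping; a minimal sketch using the IDs from the example above:

# Minimal sketch of the behavioral model: map an input event ID to the IDs
# of the outputs it triggers (IDs taken from the slide's example).
behavior_model = {
    "EventID1": ["AniID4", "SoID5"],  # mouse left click -> walk anim + wow sound
}

def on_event(event_id: str) -> list:
    """Return the resource IDs to trigger for an incoming input event."""
    return behavior_model.get(event_id, [])

print(on_event("EventID1"))  # ['AniID4', 'SoID5']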
91. Additional common types
• Haptic properties
– Material properties (e.g., stiffness, damping), tactile properties, dynamic force effects
• Animation description type
– Animation Name: RunAndJump, Duration: 10 seconds, Loop: 5 repetitions, URL: http://www.animation.com/jumping.ani
92. Avatar metadata
• Based on VWOBaseType (ID, Behavioral model,
Control, Event, Resources (Sound and Scent))
• Other resources defined for Avatar
– Appearance: body parts, accessories, and physical conditions, with links to the resources.
• e.g., NoseType = {size, width, upper bridge, lower bridge, tip angle, hapticIDRef, etc.}
– Animation
– Communication skills
• Supports input/output channels. Each channel supports verbal and non-verbal communication (e.g., text, voice, sign language, gesture, and so on).
– Personality
– Control features
– Haptic properties
– Gender
93. Avatar metadata (animation)
• Composed of various types of animation sequences
– Ex. idle, greeting, dancing, and fighting animation
types
• Each type has its own animation set defined in its
classification scheme.
Examples of the idle type: default idle, body noise, resting pose
94. Avatar metadata (personality)
• Based on the OCEAN model, a set of personality characteristics.
– Openness, agreeableness, neuroticism, extraversion, and conscientiousness (each ranged between -1 and 1)
• The characteristics can be used for:
– Designing the characteristics of the avatar.
– Letting the VW interpret its inhabitant's wishes.
– Adapting the communication to the personality.
Example (communication reaction created from personality): asked "Can you help me?", an avatar with agreeableness = -0.9 answers "No", while one with agreeableness = 0.9 answers "As always"; see the sketch below.
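A toy sketch of how a virtual world might consume the OCEAN values (the [-1, 1] ranges are from the slide; the response rule is invented for illustration):

# Toy sketch: OCEAN traits ranged in [-1, 1] as in Part 4, with an invented
# rule picking an avatar's reaction from its agreeableness.
from dataclasses import dataclass

@dataclass
class Personality:
    openness: float = 0.0
    agreeableness: float = 0.0
    neuroticism: float = 0.0
    extraversion: float = 0.0
    conscientiousness: float = 0.0

def react_to_help_request(p: Personality) -> str:
    return "As always!" if p.agreeableness > 0 else "No."

print(react_to_help_request(Personality(agreeableness=0.9)))   # As always!
print(react_to_help_request(Personality(agreeableness=-0.9)))  # No.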
95. Avatar metadata (Control feature)
Supports the feature points of the avatar to control
both face and body.
– Placeholder for sensor (sensed information)
– Facial control features are defined by the outlines of each facial
part.
– Body features are defined by the bones of the skeleton.
[Illustration: head with upper, middle and lower control regions]
96. Virtual Object
• Defines the representation of virtual objects inside the
environment
– to characterize various kinds of objects.
– to provide an interaction with the real world devices.
• Supports the following types of data, in addition to the common characteristics.
– Appearance, animation, haptic properties with the base type
– Components: allows building a virtual object as a combination of virtual objects
Example (virtual cooking class): Carrot
– sound: cutting sound, frying sound
– smell: carrot juice, carrot soup, etc.
– stiffness: 4th level
– components: green part and main part
– behavioral model: input: mouse left click; output: cutting sound + slicing animation
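Written out as plain data, the slide's carrot might look like this (the structure mirrors the Part 4 description; field names are illustrative):

# The slide's carrot as plain data, with its behavioral model fired once.
carrot = {
    "sounds": ["cutting sound", "frying sound"],
    "scents": ["carrot juice", "carrot soup"],
    "haptics": {"stiffness_level": 4},
    "components": ["green part", "main part"],
    "behavior_models": [
        {"input": "mouse left click",
         "outputs": ["cutting sound", "slicing animation"]},
    ],
}

for model in carrot["behavior_models"]:
    if model["input"] == "mouse left click":
        print("trigger:", ", ".join(model["outputs"]))
# -> trigger: cutting sound, slicing animation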
97. Covered and not yet covered (interfaces between virtual worlds)
Covered:
• Avatar characteristics: size, clothing, accessories, …
• Personal attributes: shape, animation, control
• VO characteristics: cars, houses, furniture, …
• Personal attributes: movement, behaviors
• User ID, user profile, ownership, rights and obligations of VOs
• Haptic properties
Not covered yet:
• Communication protocols: ensuring security, trust, privacy
• Virtual currency exchange
• Visual context on location and orientation information from the virtual world
98. Covered and not yet covered (interfaces between virtual worlds and the physical world)
Covered:
• Many sensors
• Mental state: emotional, physical condition
• Control feature points for avatar
• Input events
• Behavioral model
Not covered yet:
• Not all the sensors
• User conditions (preference, atmosphere, context, and so on)
• Easy-to-use privacy/openness control
• Contextual information (visual impression) from the real world
• Timing constraints for sensors and actuators
99. Conclusion
• MPEG-V Part 4 (Virtual world object characteristics) deals with the high-level description of the two elements in the virtual world.
• The specification describes
– identity,
– resource descriptions (sound, scent, haptics, animation, and appearance),
– real-time direct control (scaling, position, rotation, body skeleton, facial feature points),
– behavioral mapping for interaction (input devices and output events).
• The specification can support
– Easy import of characters from one virtual world to another.
– Common formats for interfacing between the virtual world and the real world devices.