Mobile and portable devices require new user interfaces (UIs) that reduce the level of attention users need to operate the applications they run, making the interaction calmer. To carry out this task, the next generation of UIs should be able to capture information from the context and act accordingly. This work defines an extension to the UsiXML methodology that specifies how information on the user is modeled and used to customize the UI. The extension is defined vertically, affecting all layers of the methodology. In the Tasks & Concepts layer, we define the user environment of the application, where roles and individuals are characterized to represent different user situations. In the Abstract UI layer, we relate groups of these individuals to abstract interaction objects, linking user situations to the abstract model of the UI. In the Concrete UI layer, we specify how information on the user is acquired and how it relates to the concrete components of the UI. This work also shows how to apply the proposed extensions to a case study. Finally, it discusses the advantages of this approach for modeling user-aware applications.
Extending UsiXML to support User-aware Interfaces
1. Extending UsiXML to support User-aware Interfaces — Ricardo Tesoriero (1,2) ricardo.tesoriero@uclm.es, Jean Vanderdonckt (1) jean.vanderdonckt@uclouvain.be. (1) Université catholique de Louvain, (2) University of Castilla-La Mancha
2. Agenda: Introduction — Scope — The UsiXML framework — The UsiXML extensions — The Case Study: Healthy Menu — Conclusions & future work
3. Introduction Ubiquitous computing is everywhere… Many computers are shared by each of us Information overload Calm Technology [1] Context-aware UIs Multi-modal UIs At home At work Public spaces Personal devices Ticket machines Digital blackboards Navigators [1] M. Weiser and J. S. Brown. The coming age of calm technology. The next fifty years of computing, pp. 75-85. Copernicus. 1997.
4. Scope: User-aware UIs. The context is any information that can be used to characterize the situation of an entity (person, place or object) that is considered relevant to the interaction between a user and an application [2]. Feature space for context [3]: the focus here is on the user perspective, which is only partially covered by existing approaches. [2] A. Dey. Understanding and using context. Personal and Ubiquitous Computing, 5, pp. 4-7. 2001. [3] A. Schmidt, M. Beigl, H. W. Gellersen. There is more to context than location. Computers & Graphics, 23(6), pp. 893-901. 1999
5. The UsiXML Framework. Defines a development process based on the Cameleon Reference Framework [4] to build multi-device interactive applications, using MDA to address the problem from different perspectives. Context of Use: Tasks & Concepts — CIM (TaskModel and DomainModel); Abstract UI — PIM (AbstractUserInterfaceModel) via a TransformationModel and MappingModel; Concrete UI — PSM (ConcreteUserInterfaceModel); Final UI — ISM (SourceCode). Moving down the layers is concretization; moving up is abstraction. [4] G. Calvary, J. Coutaz, D. Thevenin, Q. Limbourg, L. Bouillon, J. Vanderdonckt. A unifying reference framework for multi-target user interfaces. Interacting with Computers, 15(3), pp. 289-308. 2003
6. The ContextModel (context of use). A fixed description of the user characteristics / capabilities, e.g. systemExperience, deviceExperience, browserCapabilities, hardwarePlatform, softwarePlatform, etc.
7. The UsiXML extension. Characterize user capabilities according to the application domain. Express different user «situations» in terms of these characteristics. Extensions / Modifications
8. The Tasks & Concepts layer — UserModel & TaskModel extensions. Task-Individual relationship. User Feature level: defines the features of the user that affect the UI in terms of Roles and Features (e.g. Patient[R].temperature[F]); RoleSpecialization captures common characteristics. User Profile level: characterizes the features according to runtime situations in terms of Individuals and Feature Constraints (e.g. PatientWithFever[I].temperature[F] > 38.5[FCo]); RoleCharacterization links the two (e.g. Patient[R] <-> PatientWithFever[I]). Note: R = Role, I = Individual, F = Feature, FCo = Feature Constraint. TaskModel extension: allowed at the Feature level, not allowed at the Profile level.
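The two-level user model above can be sketched in a few lines of Python. This is a minimal illustration, not UsiXML syntax: the class and attribute names (`Role`, `Individual`, `matches`) are hypothetical, chosen only to mirror the Role / Individual / Feature Constraint vocabulary of the slide.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Role:
    """Feature level: a Role declares which user Features affect the UI."""
    name: str
    features: set[str] = field(default_factory=set)

@dataclass
class Individual:
    """Profile level: an Individual constrains Features to describe a runtime situation."""
    name: str
    role: Role  # RoleCharacterization link, e.g. Patient <-> PatientWithFever
    constraints: dict[str, Callable[[float], bool]] = field(default_factory=dict)

    def matches(self, readings: dict[str, float]) -> bool:
        # The Individual applies when every Feature Constraint holds.
        return all(check(readings[feat]) for feat, check in self.constraints.items())

patient = Role("Patient", {"temperature", "glycemia"})
with_fever = Individual("PatientWithFever", patient,
                        {"temperature": lambda t: t > 38.5})

print(with_fever.matches({"temperature": 39.0}))  # True
print(with_fever.matches({"temperature": 36.8}))  # False
```

At runtime, evaluating `matches` against current sensor readings decides which user situation is active, which is exactly what the later layers consume.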
9. The AUI extension. ContainerObserves: enables/disables and shows/hides an AbstractContainer. FacetObserves: enables/disables a facet of an AbstractInteractionComponent, modifying its behavior. Both are an extension to the MappingModel, relating the UserModel to the abstract UI.
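The observer relations can be read as simple mappings from Individuals to abstract UI elements. A sketch, again with hypothetical Python names standing in for ContainerObserves and FacetObserves:

```python
class AbstractContainer:
    def __init__(self, name):
        self.name, self.visible = name, False

class Facet:
    def __init__(self, name):
        self.name, self.enabled = name, False

def apply_mappings(active, container_obs, facet_obs):
    """Propagate the set of active Individual names into the abstract UI."""
    for individual, container in container_obs:  # ContainerObserves: show/hide
        container.visible = individual in active
    for individual, facet in facet_obs:          # FacetObserves: enable/disable
        facet.enabled = individual in active

patient_menu = AbstractContainer("PatientMenu")
drink_input = Facet("iDrink")
apply_mappings({"aPatient", "aPatientWithFever"},
               [("aPatient", patient_menu)],
               [("aPatientWithFever", drink_input)])
# patient_menu.visible and drink_input.enabled are now True
```

The point of the design is that the abstract UI never inspects sensor data directly; it only reacts to which Individuals are currently active.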
10. The CUI extension. Sensors & Interpreters (polling and event-driven). Mapping extension: Interpreter updates (Interpreter -> Feature). An extension to the MappingModel, relating the UserModel and the CUI.
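The sensing chain described here (sensor, interpreter, Feature update) can be sketched as follows. The class names and the polling style are assumptions for illustration; the slide also allows event-driven acquisition.

```python
class TemperatureSensor:
    """A polling sensor returning raw readings (a fixed value in this sketch)."""
    def __init__(self, raw):
        self.raw = raw

    def poll(self):
        return self.raw

class Interpreter:
    """Turns raw sensor output into a user Feature value."""
    def __init__(self, sensor, feature, user_model):
        self.sensor, self.feature, self.user_model = sensor, feature, user_model

    def update(self):
        # The "updates" relation: Interpreter -> Feature in the user model.
        self.user_model[self.feature] = self.sensor.poll()

user_model = {}
Interpreter(TemperatureSensor(39.2), "temperature", user_model).update()
# user_model now holds {"temperature": 39.2}
```

Once the Feature is written, the Feature Constraints of the Tasks & Concepts layer can be re-evaluated to activate or deactivate Individuals, closing the loop back to the AUI mappings.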
11. The transformation process and the FUI. At the AUI level, Observers (Individuals from the MappingModel) are observed by Interpreters from the CUI. At the CUI level, Updates connect Individuals from the MappingModel and from Tasks & Concepts to Sensors from the CUI.
12. The case study: The Healthy Menu. The goal of the application is to present dishes according to the role and the biophysical state of the user. The User Model.
14. The Healthy Menu — Mapping model. AUI mappings. Containers: Oc(aPatient, PatientMenu), Oc(aNurse, NurseMenu), Oc(aDoctor, DoctorMenu), Oc(aVisitor, VisitorMenu). Facets: Of(anUser, nAccept), Of(anUser, oIdNumber), Of(anUser, cAccept), Of(anUser, oDesease), Of(anUser, nReject), Of(anUser, cReject), Of(aPatientWithHipoGlycemia, oDrink), Of(aPatientWithFever, iDrink), Of(aPatientWithHipoGlycemia, oMeal), Of(aPatientWithFever, oMeal), Of(aPatientWithHipoGlycemia, oDessert), Of(aPatientWithFever, iDessert), Of(aPatientWithHyperGlycemia, oDrink), Of(aNormalPatient, iDrink), Of(aPatientWithHyperGlycemia, oMeal), Of(aNormalPatient, iMeal), Of(aPatientWithHyperGlycemia, oDessert), Of(aNormalPatient, iDessert)
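The Oc/Of lists above behave as lookup tables: given the active Individual, they select which containers and facets the abstract UI exposes. A sketch over a small subset of the facet mappings (the pairs are taken from the slide; the function name is hypothetical):

```python
# Subset of the Of(individual, facet) mapping table from the slide.
facet_observes = [
    ("aPatientWithFever", "iDrink"),
    ("aPatientWithFever", "oMeal"),
    ("aPatientWithFever", "iDessert"),
    ("aNormalPatient", "iDrink"),
    ("aNormalPatient", "iMeal"),
    ("aNormalPatient", "iDessert"),
]

def facets_for(individual):
    """Return the facets observed by one Individual, in table order."""
    return [facet for ind, facet in facet_observes if ind == individual]

print(facets_for("aPatientWithFever"))  # ['iDrink', 'oMeal', 'iDessert']
```

Here a feverish patient gets input facets for drink and dessert but only an output facet for the meal, matching the "Green Salad Only!" restriction shown in the next slide.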
16. CUI – FUI (Patient With Fever). A temperature sensor feeds an interpreter; the CUI elements are linked to the FUI through isRefinedBy/isAbstractedInto mappings. Slide annotations: "Wine is out!", "Green Salad Only!"
17. CUI – FUI (Patient With Hypo/Hyperglycemia). A glucose sensor feeds an interpreter; the CUI elements are linked to the FUI through isRefinedBy/isAbstractedInto mappings. Slide annotation: "Strict Diet!"
18. Conclusions. This work presents a model-based approach to developing user-aware, multi-platform and multi-modal UIs following the UsiXML framework. The approach encourages separating user modeling from the application domain to improve model reuse. It covers everything from the conceptual modeling of the user environment to the specification of the sensing infrastructure. User modeling is divided into two levels of abstraction: the specification of User Features (Roles) and the quantification of User Features (Individuals), giving designers the ability to define custom features for user profiles / Roles.
19. Future Work. Extending user awareness to model the social awareness of UIs. Including location awareness as part of the UI specification. Defining a common feature-based framework that lets designers express characteristics combining the social and location features of context-aware UIs, such as the co-location of users.
20. Thank you very much for your attention! Questions, suggestions, criticism and comments are always welcome! Ricardo Tesoriero (1,2) ricardo.tesoriero@uclm.es, Jean Vanderdonckt (1) jean.vanderdonckt@uclouvain.be. (1) Université catholique de Louvain, (2) University of Castilla-La Mancha. This work was funded by: