
2007 | Book

Task Models and Diagrams for Users Interface Design

5th International Workshop, TAMODIA 2006, Hasselt, Belgium, October 23-24, 2006. Revised Papers

Editors: Karin Coninx, Kris Luyten, Kevin A. Schneider

Publisher: Springer Berlin Heidelberg

Book Series: Lecture Notes in Computer Science


About this book

We are proud to present the TAMODIA 2006 proceedings. In 2006, the TAMODIA workshop celebrated its fifth anniversary. TAMODIA is an obscure acronym that stands for TAsk MOdels and DIAgrams for user interface design. The first edition of TAMODIA was organized in Bucharest (Romania) by Costin Pribeanu and Jean Vanderdonckt. The fact that five years later the TAMODIA series of workshops still continues successfully proves the importance of this research area for the human–computer interaction community! The first workshop aimed at examining how multiple forms of task expressions can significantly increase or decrease the quality of user interface design. This is still the scope of the current edition; we tried to assemble papers that discuss how the complexity of HCI design and development can be managed with tasks, models and diagrams. Much like the previous editions, the selection of papers from the 2006 edition reflects the broad scope of this field, which cannot be labeled with a single title or term. The invited paper is by Joëlle Coutaz and discusses meta-user interfaces for ambient spaces. Finding appropriate ways to design and develop user interfaces for interactive spaces is becoming an important challenge for the creation of future usable applications. This exciting work gives a good feel of the new type of user interfaces and the required new approaches we are evolving toward when we want to realize the vision of ambient intelligent environments and create systems that can be used and controlled by the end-users.

Table of Contents

Frontmatter

Invited Paper

Meta-User Interfaces for Ambient Spaces
Abstract
In this article, we propose the concept of meta-User Interface (meta-UI) as the set of functions (along with their user interfaces) that are necessary and sufficient to control and evaluate the state of interactive ambient spaces. This set is meta-, since it serves as an umbrella beyond the domain-dependent services that support human activities in an ambient interactive space. These functions are user interface-oriented, since their role is to help users control and evaluate the state of this space. We present a dimension space to classify, compare, and contrast disparate research efforts in the area of meta-UIs. We then exploit the generative power of our design space to suggest directions for future research.
Joëlle Coutaz

Tool Support

Tool Support for Handling Mapping Rules from Domain to Task Models
Abstract
The success of model-based approaches to user interface design depends on the ability to solve the mapping problem, as well as on the availability of tools that reduce the effort of establishing and maintaining links between models throughout the development life cycle. In this paper, a tool supporting a small set of mapping rules is presented. The tool enables the designer to produce task model fragments at the operational level based on patterns of mapping between task and domain models. The task model fragments are generated in XML format and can then be loaded into task modeling tools such as CTTE or Teresa.
Costin Pribeanu
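
A hedged sketch of the kind of mapping rule described above (the rule, the XML vocabulary, and all names are assumptions made for illustration, not the tool's actual rule set or the CTTE/Teresa file format): one rule maps each attribute of a domain-model class onto an operational-level leaf task and serializes the resulting fragment as XML.

```python
# Illustrative only: a single domain-to-task mapping rule emitting an XML
# task-model fragment. Element and attribute names are invented for the sketch.
import xml.etree.ElementTree as ET

def map_class_to_task_fragment(class_name: str, attributes: list[str]) -> ET.Element:
    """Assumed rule: editing a domain object decomposes into one
    'provide value' interaction subtask per attribute."""
    root = ET.Element("Task", name=f"Edit {class_name}", category="abstraction")
    for attr in attributes:
        ET.SubElement(root, "Task", name=f"Provide {attr}", category="interaction")
    return root

fragment = map_class_to_task_fragment("Customer", ["name", "address", "email"])
print(ET.tostring(fragment, encoding="unicode"))
```
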
Towards Visual Analysis of Usability Test Logs Using Task Models
Abstract
In this paper we discuss how task models can enhance the visualization of usability test logs. Evaluation of usability tests is a traditional method in user-centered design and part of the methodology for designing usable products. We developed a tool for visualizing usability logs that uses the hierarchical structure of task models to group and visualize observers’ annotations. This form of visualization is very natural and allows investigation of the test dynamics as well as comparison of participants’ behavior in various parts of the test. We also describe methods for visualizing multiple logs that allow test results to be compared between participants. For that purpose we present a new visualization method based on aligning the visual representations of tasks, using the binding between the task model and the log visualization. Finally, we present an evaluation of our tool on two usability tests conducted in our laboratory, and we discuss the observed findings.
Ivo Malý, Pavel Slavík
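
The core grouping step can be pictured in a few lines of code. This is a minimal sketch under assumed data shapes (the authors' tool, log format, and task-model format are not shown in the abstract): observer annotations tagged with a task are grouped under the corresponding node of the hierarchical task model, so each subtree can be inspected per participant.

```python
# Sketch: group usability-log annotations by task-model node (shapes assumed).
from collections import defaultdict

task_tree = {"Checkout": ["Login", "Pay"], "Login": [], "Pay": []}  # hypothetical model
log = [  # (participant, task tagged by the observer, annotation)
    ("P1", "Login", "typo in password"),
    ("P1", "Pay", "hesitated at card form"),
    ("P2", "Login", "used browser autofill"),
]

by_task = defaultdict(list)
for participant, task, note in log:
    by_task[task].append((participant, note))

def render(task: str, depth: int = 0) -> None:
    """Print the task hierarchy with the annotations attached to each node."""
    print("  " * depth + task)
    for participant, note in by_task.get(task, []):
        print("  " * (depth + 1) + f"[{participant}] {note}")
    for child in task_tree.get(task, []):
        render(child, depth + 1)

render("Checkout")
```
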

Model-Based Interface Development

Dialog Modeling for Multiple Devices and Multiple Interaction Modalities
Abstract
Today, the large variety of mobile interaction devices such as PDAs and mobile phones forces the development of a wide range of user interfaces, one for each platform. The complexity grows even further when multiple interaction devices are used to perform the same task and when different modalities have to be supported. We introduce a new dialog model for the abstraction of concrete user interfaces, with a separate advanced control layer for the integration of different modalities. In this context, we present the Dialog and Interface Specification Language (DISL), which comes with a proof-of-concept implementation.
Robbie Schaefer, Steffen Bleul, Wolfgang Mueller
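
DISL itself is an XML-based specification language whose concrete syntax is not reproduced here; the following sketch therefore illustrates only the underlying idea in generic Python terms (state and trigger names are invented): an abstract dialog is a state machine whose transitions are modality-independent triggers, and each device or modality maps its own input events onto those triggers.

```python
# Generic abstract-dialog sketch, not DISL syntax: states with trigger->state
# transitions; any modality (click, voice, key press) can fire a trigger.
DIALOG = {
    "start":   {"select_item": "details"},
    "details": {"confirm": "done", "back": "start"},
    "done":    {},
}

def run(events: list[str], state: str = "start") -> str:
    for event in events:
        state = DIALOG[state].get(event, state)  # triggers not enabled here are ignored
    return state

assert run(["select_item", "confirm"]) == "done"
assert run(["confirm"]) == "start"  # 'confirm' is not enabled in the start state
```
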
Model-Based Support for Specifying eService eGovernment Applications
Abstract
Model-based approaches are a suitable alternative for coping with the increasing complexity of the eServices made available in recent years by eGovernment applications. However, up to now, only a few studies have investigated the requirements for notations and tools devoted to supporting eService modeling. The main goal of this paper is to survey the state of knowledge on the specification of user activity and processes in eGovernment eServices. Our results advocate a hybrid modeling approach combining task models and process models.
Florence Pontico, Christelle Farenc, Marco Winckler
A Model-Based Approach to Develop Interactive System Using IMML
Abstract
The software engineering and human-computer interaction communities use methods, techniques and tools that are not easily integrated. Our work argues that the development process could be improved by providing the designer with models, languages and tools that offer a seamless integration of software engineering and human-computer interaction approaches. To achieve this goal we have developed a language, the Interactive Message Modeling Language (IMML), to support the development of interactive systems. This paper presents and discusses the concepts and models that are the foundation of IMML. We also compare our process with traditional task-based perspectives.
Jair C. Leite

User Interface Patterns

PIM Tool: Support for Pattern-Driven and Model-Based UI Development
Abstract
Model-based approaches describe the process of creating UI models and transforming them to build a concrete UI. Developers specify interactive systems on a more abstract and conceptual level instead of dealing with low-level implementation details. However, specifying the various models is a complex and time-consuming task. Pattern-based approaches encapsulate frequently used solutions in the form of building blocks that developers may combine to create a user interface model; they thus promote reuse and readability and reduce complexity. In this paper we present a comprehensive framework that unites model-based and pattern-driven approaches, and we introduce the “Patterns In Modelling” (PIM) tool, which implements this framework. We demonstrate the functioning of the tool with an illustrative example, focusing primarily on the creation of the task model, and give a brief outlook on how patterns will be applied to the other levels within the framework.
Frank Radeke, Peter Forbrig, Ahmed Seffah, Daniel Sinnig
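
To make the pattern-application idea concrete, here is a hedged sketch (the pattern structure and all names are assumptions; the PIM tool's actual representation is not shown in the abstract): a task pattern is a parameterized model fragment, and applying it instantiates the placeholders and splices the fragment into the task model under construction.

```python
# Sketch: applying a parameterized 'Search' task pattern to a task model.
def search_pattern(domain_object: str) -> dict[str, list[str]]:
    """Assumed pattern body: specify criteria, submit, browse results."""
    return {
        f"Search {domain_object}": [
            f"Specify {domain_object} criteria",
            "Submit query",
            f"Browse {domain_object} results",
        ],
    }

task_model = {"Book trip": ["Plan dates"]}
task_model["Book trip"].append("Search flight")   # reference the instantiated pattern
task_model.update(search_pattern("flight"))       # splice the pattern body in
print(task_model)
```
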
Pattern-Based UI Design: Adding Rigor with User and Context Variables
Abstract
In current practice, user interface development is often based on a vague and undocumented design process, relying solely on the designer’s experience. This paper defines a pattern-based design process, which adds rigor to user interface design. The process is based on the notion of user variables to capture user requirements in a formal manner – based on discrete values that are amenable for tool support and automated analysis. Other context of use information is captured as context variables. Using these values as input, design patterns are selected to leverage best design practices directly into user interface development. Pattern-Oriented Design is then employed to derive a conceptual design, or early prototype, of the user interface. A case study with a Bioinformatics information site exemplifies the feasibility and applicability of this process.
Homa Javahery, Daniel Sinnig, Ahmed Seffah, Peter Forbrig, T. Radhakrishnan
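
The role of discrete user and context variables can be sketched as follows (variable names, values, and selection rules are invented for illustration; the paper's actual catalogue is richer): because the variables take discrete values, pattern selection becomes a mechanical, tool-supportable step.

```python
# Sketch: discrete user/context variables driving design-pattern selection.
user = {"expertise": "novice", "age_group": "senior"}
context = {"device": "desktop"}

rules = [  # (condition over the variables, pattern suggested when it holds)
    (lambda u, c: u["expertise"] == "novice", "wizard"),
    (lambda u, c: u["age_group"] == "senior", "large_targets"),
    (lambda u, c: c["device"] == "mobile", "bottom_navigation"),
]

selected = [pattern for condition, pattern in rules if condition(user, context)]
print(selected)  # ['wizard', 'large_targets']
```
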
Error Patterns: Systematic Investigation of Deviations in Task Models
Abstract
We propose a model-based approach to integrate human error analysis with task modelling, introducing the concept of Error Pattern. Error Patterns are prototypical deviations from abstract task models, expressed in a formal way by a model transformation. A collection of typical errors taken from the literature on human error is described within our framework. The intent is that the human factors specialist produces the task models from an error-free perspective, keeping them small and useful. The specialist then chooses from the collection of error patterns and selectively applies them to parts of the original task model, producing a transformed model that exhibits erroneous user behaviour. This transformed task model can be used at various stages of the design process, to investigate the system’s reaction to erroneous behaviour or to generate test sequences.
Rémi Bastide, Sandra Basnyat
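
A small sketch of an error pattern as a model transformation (the tree representation and the pattern below are assumptions; the paper expresses patterns formally): the classic 'step omission' deviation rewrites a sequence so that a designated subtask is dropped, turning an error-free model into one exhibiting erroneous behaviour.

```python
# Sketch: apply a 'step omission' error pattern to a task tree (shapes assumed).
import copy

def apply_omission(task_tree: dict, parent: str, omitted: str) -> dict:
    """Return a transformed copy in which 'omitted' is skipped under 'parent'."""
    erroneous = copy.deepcopy(task_tree)
    erroneous[parent] = [t for t in erroneous[parent] if t != omitted]
    erroneous[f"{parent} (omitting '{omitted}')"] = erroneous.pop(parent)
    return erroneous

error_free = {"Start pump": ["Open valve", "Check pressure", "Switch on"]}
print(apply_omission(error_free, "Start pump", "Check pressure"))
```
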
Using an Interaction-as-Conversation Diagram as a Glue Language for HCI Design Patterns on the Web
Abstract
The benefits of using software design patterns have been widely reported. However, for user interface design patterns to achieve the same degree of success as software design patterns, it would be useful to document design pattern solutions using a representation language that can be readily transported into the definition or specification of the interactive solution. Moreover, patterns are fragmented, which may hinder the designers’ global comprehension of their design decisions. In this paper, we present a small study that illustrates the use of an interaction modeling language called MoLIC as a glue language, binding design pattern solutions and novel design constructs (for which no patterns are defined) into a whole interactive solution.
Ariane Moraes Bueno, Simone Diniz Junqueira Barbosa

Bridging the Gap: Driven by Models

An MDA Approach for Generating Web Interfaces with UML ConcurTaskTrees and Canonical Abstract Prototypes
Abstract
UML has become the standard language for modelling in different areas and domains, but it is widely recognized that it lacks support for User Interface Design (UID). On the other hand, ConcurTaskTrees (CTT) is one of the most widely used notations for task and dialogue modelling. An important achievement was the proposal of a notation and semantics for CTT that extends the UML metamodel, proving that task modelling in user interface design can be accomplished with a UML-compliant notation. For the design of the interface structure, it was proposed that UML’s CTT be complemented with Canonical Abstract Prototypes (CAP), leading to a model-based user interface design method co-specified from the presentation (CAP) and behaviour (UML’s CTT) perspectives. In this paper we propose another step in this UID method by defining a specific model compliant with the OMG-recommended Model Driven Architecture (MDA), which serves as the intermediary between the design model and an implementation of the user interface. This proposal aligns the UID method with the MDA recommendation, making it possible to automatically generate interface prototypes from conceptual models.
Duarte Costa, Leonel Nóbrega, Nuno Jardim Nunes
High-Level Modeling of Multi-user Interactive Applications
Abstract
With the shift of networked software applications away from desktop computers and into home appliances comes the challenge of finding new ways to model this type of software. The most important home appliance with newfound computing capabilities is the television set. Through television, people can now be participants instead of viewers and use interactive software to enter the participative stage. Staged participatory multimedia events are a subset of this kind of interactive software with a predefined temporal structure that lets television viewers and mobile users become participants in a distributed multimedia event. In this paper, we introduce an interaction model that is part of a modelling language for staged participatory multimedia events. We show how the language can be mapped onto UML constructs and briefly discuss related work using a common example.
Jan Van den Bergh, Kris Luyten, Karin Coninx
Goals: Interactive Multimedia Documents Modeling
Abstract
Multimedia has been widely applied to develop attractive and functional applications that support useful user tasks. These sophisticated applications are usually developed using bottom-up approaches, regardless of the complexity of the implementation. Furthermore, the design of complex Interactive Multimedia Documents (IMDs) introduces additional complexity; authoring these applications can be an error-prone task given the increasing number of media objects participating in these documents and the synchronization among them. For this reason, the authoring of IMDs should be supported by a structured methodology based on an intuitive abstraction that facilitates the design of complex IMDs. This paper presents Goals, a top-down, use-case-driven, architecture-centric, UML-based methodology that allows for the intuitive authoring of complex IMDs through the structured modeling of the presentation aspects, content and complete behavior of these documents.
Pedro Valente, Paulo N. M. Sampaio

Task-Centered Design

Using Task Models for Cascading Selective Undo
Abstract
Many studies have shown that selective undo, a variant of the widely-implemented linear undo, has many advantages over the prevailing model. In this paper, we define a task model for implementing selective undo in the face of dependencies that may exist between the undone action and other subsequent user actions. Our model accounts for these dependencies by identifying other actions besides the undone one that should also be undone to keep the application in a stable state. Our approach, which we call cascading selective undo, is built upon a process-programming language originally designed in the software engineering community. The result is a formal analytical framework by which the semantics of selective undo can be represented separately from the application itself. We present our task model, the selective undo algorithm, and discuss extensions that account for differing kinds of inter-action dependencies.
Aaron G. Cass, Chris S. T. Fernandes
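
The cascading step lends itself to a compact sketch (the dependency representation is assumed; the paper builds on a process-programming language rather than plain dictionaries): each action records the earlier actions it depends on, and selectively undoing one action must also undo everything that transitively depends on it.

```python
# Sketch: compute the cascade set for selective undo over a dependency relation.
def cascade(undone: str, depends_on: dict[str, set[str]]) -> set[str]:
    """Return the undone action plus all actions transitively depending on it."""
    to_undo, frontier = {undone}, [undone]
    while frontier:
        current = frontier.pop()
        for action, deps in depends_on.items():
            if current in deps and action not in to_undo:
                to_undo.add(action)
                frontier.append(action)
    return to_undo

history = {"create_box": set(), "color_box": {"create_box"}, "move_box": {"create_box"}}
print(cascade("create_box", history))  # {'create_box', 'color_box', 'move_box'}
```
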
Exploring Interaction Space as Abstraction Mechanism for Task-Based User Interface Design
Abstract
Designing a user interface is often a complex undertaking. Model-based user interface design is an approach where models and mappings between them form the basis for creating and specifying the design of a user interface. Such models usually include descriptions of the tasks of the prospective user, but there is considerable variation in the other models that are employed. This paper explores the extent to which the notion of interaction space is useful as an abstraction mechanism to reduce the complexity of creating and specifying a user interface design. We present how we designed a specific user interface using design techniques and models that employ the notion of interaction space. This design effort started from the task models in an object-oriented model of the users’ problem and application domains. The lessons learned emphasize that the notion of interaction spaces is a useful abstraction mechanism that can help user interface designers exploit object-oriented analysis results and reduce the complexity of designing a user interface.
Christian M. Nielsen, Michael Overgaard, Michael B. Pedersen, Jan Stage, Sigge Stenild

Multi-modal User Interfaces

Comparing NiMMiT and Data-Driven Notations for Describing Multimodal Interaction
Abstract
In the past few years, multimodal interaction has been gaining importance in virtual environments. Although multimodality makes interaction with the environment more intuitive and natural for the user, the development cycle of such an environment is often a long and expensive process. In our overall field of research, we investigate how model-based design can help shorten this process by designing the application with high-level diagrams. In this scope, we developed ‘NiMMiT’, a graphical notation especially suitable for expressing multimodal user interaction. We have already experienced the benefits of NiMMiT in several in-house applications, and are currently assessing the value of NiMMiT with respect to existing notations. In this paper we report on our comparison of NiMMiT against some well-known data-driven modeling notations.
Joan De Boeck, Chris Raymaekers, Karin Coninx
Incorporating Tilt-Based Interaction in Multimodal User Interfaces for Mobile Devices
Abstract
Emerging ubiquitous environments raise the need to support multiple interaction modalities on diverse types of devices. Designing multimodal interfaces for ubiquitous environments with development tools is challenging, since target platforms support different resources and interfaces. Model-based approaches have been recognized as useful for managing the increasing complexity arising from the many available interaction platforms, but they have usually focused on graphical and/or vocal modalities. This paper presents a solution for supporting the development of tilt-based hand gesture and graphical modalities for mobile devices in a multimodal user interface development tool. The challenges related to developing gesture-based applications for various types of devices, including mobile devices, are discussed in detail. The solution presented is based on a logical description language for hand-gesture user interfaces; such a language allows us to obtain a user interface implementation on the target mobile platform. The solution is illustrated with an example application that can be accessed from both desktop and mobile devices, the latter supporting tilt-based gesture interaction.
Jani Mäntyjärvi, Fabio Paternò, Carmen Santoro
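
As a toy illustration of a logical tilt-gesture description (thresholds, event names, and the mapping are assumptions, not the paper's language): raw accelerometer angles are abstracted into discrete UI events that the rest of the multimodal interface can handle uniformly.

```python
# Sketch: map a tilt angle onto an abstract UI event (values invented).
def tilt_to_event(pitch_deg: float, threshold: float = 25.0):
    """Return an abstract event name, or None inside the dead zone."""
    if pitch_deg > threshold:
        return "scroll_down"
    if pitch_deg < -threshold:
        return "scroll_up"
    return None

for sample in (3.0, 31.5, -40.2):
    print(sample, "->", tilt_to_event(sample))
```
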
An HCI Model for Usability of Sonification Applications
Abstract
Sonification is the representation of data using sound, with the intention of communication and interpretation. The process and technique of converting data into sound is called the sonification technique; a sonification application may require one or more such techniques. However, sonification techniques are not always suitable for all kinds of data, and often custom techniques are used, where the design is tailored to the domain and nature of the data as well as the users’ required tasks within the application. Therefore, it is important to assure the usability of the technique for the specific domain application being developed. This paper describes a new HCI model for the usability of sonification applications. It comprises two further models, namely the Sonification Application (SA) Model and the User Interpretation Construction (UIC) Model. The SA Model explains the application from the designer’s point of view; the UIC Model explains what the user might perceive and understand.
Ag. Asri Ag. Ibrahim, Andy Hunt
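
One common family of sonification techniques is parameter mapping; as a hedged example (the linear pitch mapping below is an assumption for illustration, not the paper's model), data values can be mapped onto MIDI pitch so that trends in the data become audible contours.

```python
# Sketch: linear parameter-mapping sonification of a data series to MIDI pitch.
def to_midi_pitch(value: float, lo: float, hi: float,
                  pitch_lo: int = 48, pitch_hi: int = 84) -> int:
    """Map value in [lo, hi] linearly onto the MIDI pitch range."""
    span = (value - lo) / (hi - lo)
    return round(pitch_lo + span * (pitch_hi - pitch_lo))

series = [0.1, 0.4, 0.35, 0.9]
print([to_midi_pitch(v, 0.0, 1.0) for v in series])  # rising data -> rising pitch
```
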

Reflections on Tasks and Activities in Modeling

Non-functional User Interface Requirements Notation (NfRn) for Modeling the Global Execution Context of Tasks
Abstract
This paper describes the rationale behind a user interface requirements management notation and a supporting tool suite. The notation is being developed to facilitate the design of interactions based on an account of non-functional requirements (NFRs), thus the acronym NfRn for the technique. NfRn is a graphical notation which is used to specify an interactive system’s global execution context (GEC). The resulting depiction is referred to as the Global Execution Context graph (GECg). The GECg is a visual construction, which consists of nodes, representing interaction scenarios, and directed links representing scenario relationships designating alternate execution, concurrency, ordering, and set-oriented relationships between two scenario nodes. The technique is particularly useful for specifying certain NFRs - such as adaptability, adaptivity, scalability and portability - which are especially relevant for anytime, anywhere access. In the paper, we demonstrate the application of the technique in the context of an on-going research project aiming to build an ‘electronic village’ of local interest in the region of Crete.
Demosthenes Akoumianakis, Athanasios Katsis, Nikolas Vidakis
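
The GECg structure can be sketched as plain data (node and relationship names are invented; the notation defines its own richer vocabulary): scenarios are nodes, and each directed link carries one relationship type.

```python
# Sketch: a Global Execution Context graph as typed, directed links.
scenarios = {"desktop_browse", "mobile_browse", "kiosk_browse"}
links = [  # (from, to, relationship type)
    ("desktop_browse", "mobile_browse", "alternate"),  # same task, other platform
    ("mobile_browse", "kiosk_browse", "ordering"),     # one must precede the other
]

def alternates_of(scenario: str) -> list[str]:
    return [dst for src, dst, rel in links if src == scenario and rel == "alternate"]

print(alternates_of("desktop_browse"))  # ['mobile_browse']
```
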
Requirements Elicitation and Elaboration in Task-Based Design Needs More Than Task Modelling: A Case Study
Abstract
In this paper, a small case study is presented to illustrate our conceptual understanding of a task-based requirements process. We argue that sub-models as known in model-based design (e.g. task models, dialog models) support the reflection about an existing work situation at a conceptual level and allow a formal specification of requirements. However, it is also shown that the integration of complementary analysis approaches facilitates a richer consideration of social as well as technical aspects. An intertwined creation of models differing in their focus and in the degree of abstraction and formality supports a more effective requirements elicitation and elaboration.
In addition, the paper discusses some crucial issues in task- and model-based design such as the ‘myth’ of generalised task models, the different roles of task and dialog models, or the influence of intentions on models of current situations. We hope to contribute to a further clarification of the problem space.
Anke Dittmar, Andreas Gellendin, Peter Forbrig
Discovering Multitasking Behavior at Work: A Context-Based Ontology
Abstract
Despite the availability of several task and personal information management tools, appropriate support for human multitasking at work is still lacking. Supporting multitasking behavior entails capturing and modeling this behavior. In this paper, we refine an approach to model multitasking behavior in organizations through an ontology based on two interrelated primitives: action and interaction contexts. The main contributions of the proposed ontology are (1) enabling the discovery of scheduling heuristics that combine personal and inter-personal elements, (2) enabling bottom-up discovery of tasks, and (3) suggesting a flexible system architecture for multitasking support. The first two contributions are illustrated through a case study.
Marielba Zacarias, H. Sofia Pinto, José Tribolet
The Tacit Dimension of User Tasks: Elicitation and Contextual Representation
Abstract
Traditional task-elicitation techniques provide prepared structures for acquiring and representing knowledge about user tasks. As different users might perceive work tasks quite differently, normative elicitation and representation schemes do not necessarily lead to accurate support of individual users. If the individual perception of tasks is to guide the development of user interfaces, personal constructs have to be taken into account. They can be elicited through repertory grids: personal work content and task-relevant information emerge in the course of structured interviews and can be transformed into conventional representation schemes, even for execution and prototyping. In this paper we introduce an elicitation procedure based on repertory grids and its embodiment in a working user-centered and task-based design approach.
Jeannette Hemmecke, Chris Stary

Context and Plasticity

The Comets Inspector: Towards Run Time Plasticity Control Based on a Semantic Network
Abstract
In this paper, we describe the Comets Inspector, a software tool providing user interface designers and developers with a semantic network for controlling the plasticity of their user interfaces (UIs) at run time. Thanks to a set of predefined relationships, the semantic network links together various concepts, ranging from the final UI (i.e., the UI described in terms of technological spaces) to the concrete and abstract UIs (i.e., the UI described respectively in terms of concrete interaction objects independent of any technological space, and of abstract individual components and containers independent of any interaction modality), up to the tasks and concepts of the interactive system. In this way, plasticity can be addressed at four levels of abstraction (tasks and concepts, abstract, concrete, and final user interface) for forward, reverse, and lateral engineering. The end user exploits the semantic network at run time to adapt the UI to another context of use by identifying, selecting, and applying suitable plasticity operations.
Alexandre Demeure, Gaëlle Calvary, Joëlle Coutaz, Jean Vanderdonckt
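
The four-level linkage can be pictured as a tiny triple store (the vocabulary and example below are assumptions for illustration): by tracing relationships from a final widget back to the task it serves, a run-time inspector can re-derive an alternative concrete UI for a new context of use.

```python
# Sketch: tracing a final UI element back through the abstraction levels.
triples = [  # (subject, relationship, object) -- names invented
    ("set_temperature_task", "is_abstracted_by", "value_input_AIO"),
    ("value_input_AIO", "is_concretized_by", "slider_CIO"),
    ("slider_CIO", "is_implemented_by", "tk_scale_widget"),
]

def trace_back(final_ui: str) -> list[str]:
    """Follow links upward from a final UI element to the task level."""
    chain = [final_ui]
    while True:
        up = [s for s, _, o in triples if o == chain[-1]]
        if not up:
            return list(reversed(chain))
        chain.append(up[0])

print(trace_back("tk_scale_widget"))
```
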
A Prototype-Driven Development Process for Context-Aware User Interfaces
Abstract
This paper describes a model-based development process for context-aware user interfaces. The development process consists of the specification and updates of several models followed by the generation and evaluation of a prototype. A generic runtime architecture will be presented supporting distinct prototyping renderers at different abstraction levels in order to support prototyped development during the whole design cycle of the context-aware user interface. To clarify the functioning of the architecture, a case study will be presented, demonstrating the possibilities of this prototype-driven development approach.
Tim Clerckx, Chris Vandervelpen, Kris Luyten, Karin Coninx
Backmatter
Metadata
Title: Task Models and Diagrams for Users Interface Design
Editors: Karin Coninx, Kris Luyten, Kevin A. Schneider
Copyright Year: 2007
Publisher: Springer Berlin Heidelberg
Electronic ISBN: 978-3-540-70816-2
Print ISBN: 978-3-540-70815-5
DOI: https://doi.org/10.1007/978-3-540-70816-2