
About this Book

This, the 30th issue of Transactions on Large-Scale Data- and Knowledge-Centered Systems, contains six in-depth papers focusing on the subject of cloud computing. Topics covered within this context include cloud storage, model-driven development, informative modeling, and security-critical systems.



Cloud Computing: Read Before Use

Cloud computing is evolving as a new paradigm in service computing that reduces initial infrastructure investment and maintenance costs. Virtualization technology creates a virtual infrastructure by sharing physical resources through virtual machines, enabling effective resource usage at an economical price for customers. Because of these advantages, the scientific community is also considering a shift from grid and cluster computing to cloud computing. However, virtualization comes with significant performance penalties, and scientific jobs differ from commercial workloads. To understand the reliability and feasibility of cloud computing for scientific workloads, we have to understand the technology and its performance. In this work, we evaluate scientific jobs as well as standard benchmarks on private and public clouds to quantify the performance penalties involved in adopting cloud computing. These jobs are categorized as CPU-, memory-, network-, and I/O-intensive. We analyze the results and compare the performance of private and public cloud virtual machines in terms of both execution time and price. The results show that cloud computing faces considerable performance overhead because of virtualization, and therefore needs improvement before it can efficiently execute scientific workloads.
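The abstract compares private and public cloud VMs by execution time as well as price. A minimal sketch of such a comparison is shown below; the benchmark categories follow the abstract, but the runtimes and hourly prices are purely illustrative, not the paper's measurements.

```python
# Illustrative cost/performance comparison of private vs. public cloud VMs.
# All numbers are hypothetical placeholders, not results from the paper.

def job_cost(runtime_hours, hourly_price):
    """Monetary cost of running one job on a VM billed per hour."""
    return runtime_hours * hourly_price

# Assumed runtimes (hours) per benchmark category on each cloud.
runtimes = {
    "cpu":     {"private": 1.0, "public": 1.2},
    "memory":  {"private": 0.8, "public": 1.1},
    "network": {"private": 0.5, "public": 0.9},
    "io":      {"private": 1.5, "public": 2.0},
}
prices = {"private": 0.08, "public": 0.10}  # assumed $/hour

def compare(category):
    """Return (cheaper_cloud, private_cost, public_cost) for a category."""
    t = runtimes[category]
    cost_private = job_cost(t["private"], prices["private"])
    cost_public = job_cost(t["public"], prices["public"])
    cheaper = "private" if cost_private <= cost_public else "public"
    return cheaper, cost_private, cost_public
```

With these assumed figures the private cloud wins every category, because its VMs are both faster and cheaper per hour; in practice each category would have to be measured separately, as the paper does.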
Amol Jaikar, Seo-Young Noh

Differential Erasure Codes for Efficient Archival of Versioned Data in Cloud Storage Systems

In this paper, we study the problem of storing an archive of versioned data in a reliable and efficient manner. The proposed technique is relevant in cloud settings where, because of the huge volume of data to be stored, distributed (scale-out) storage systems deploying erasure codes for fault tolerance are typical. However, existing erasure coding techniques do not leverage the redundancy of information across multiple versions of a file. We propose a new technique called differential erasure coding (DEC), in which the differences (deltas) between subsequent versions are stored rather than the whole objects, akin to typical delta encoding. Unlike delta encoding, however, DEC opportunistically exploits sparsity in the updates (i.e., when the differences between two successive versions have few non-zero entries) to store the deltas using sparse sampling techniques combined with erasure coding. We first show that DEC provides significant savings in storage size for versioned data whenever the update patterns are characterized by in-place alterations. We then propose a practical DEC framework that reaps these storage benefits not only for in-place alterations but also for real-world update patterns such as insertions and deletions, which alter the overall data size. Experiments with several synthetic and practical workloads demonstrate that the practical variant of DEC provides significant reductions in storage overhead.
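The core idea the abstract describes, storing a sparse delta instead of the full new version, can be sketched as follows. This is a minimal illustration of the delta-plus-sparsity step only; the sparse-sampling and erasure-coding layers of the actual DEC scheme, and its handling of insertions and deletions, are omitted. All function names and the sparsity threshold are assumptions for illustration.

```python
# Sketch of delta-based versioned storage with a sparsity check,
# assuming equal-length versions modified by in-place alterations.

def delta(old, new):
    """Elementwise difference (mod 256) of two equal-length byte versions."""
    assert len(old) == len(new)
    return [(b - a) % 256 for a, b in zip(old, new)]

def is_sparse(d, threshold=0.25):
    """A delta counts as sparse if few of its entries are non-zero."""
    nonzero = sum(1 for x in d if x != 0)
    return nonzero / len(d) <= threshold

def encode_version(old, new):
    """Store (position, value) pairs of a sparse delta; else the full object."""
    d = delta(old, new)
    if is_sparse(d):
        return ("delta", [(i, x) for i, x in enumerate(d) if x != 0])
    return ("full", list(new))

def decode_version(old, stored):
    """Reconstruct the new version from the previous one and the stored form."""
    kind, payload = stored
    if kind == "full":
        return list(payload)
    new = list(old)
    for i, x in payload:
        new[i] = (new[i] + x) % 256
    return new
```

When only a few bytes change in place, the stored delta is far smaller than the object itself, which is the storage saving DEC then amplifies by erasure-coding the sparse deltas.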
J. Harshan, Anwitaman Datta, Frédérique Oggier

Secure Integration of Third Party Components in a Model-Driven Approach

Model-driven approaches facilitate the development of applications by introducing domain-specific abstractions. Our model-driven approach, SecureMDD, supports the domain of security-critical applications that use web services. Because many applications use external web services (i.e., services developed and provided by someone else), integrating such services is an important task for a model-driven approach. In this paper we present an approach to integrating and exchanging externally developed web services that use standard or non-standard cryptographic protocols in security-critical applications. All necessary information is defined abstractly in the application model, so no manual changes to the generated code are necessary. We also show how security properties for the whole system, including external web services, can be defined and proved. As a demonstration we use an electronic ticketing case study that integrates an external payment service.
Marian Borek, Kurt Stenzel, Kuzman Katkalov, Wolfgang Reif

Comprehending a Service by Informative Models

Services are one of the main supporting facilities of modern societies: they support users in their everyday life and provide additional features to them. Services must be useful, usable in the user's environment, and must correspond to the utilisation patterns of potential users. In modern applications, the user must be able to understand a service on the fly and to appreciate the utility it provides. The user thus needs a comprehensible service.
Models are a mainstay of every scientific and engineering discipline. They are typically more accessible to study than the systems they represent. Models are instruments that function effectively within a scenario; this effectiveness rests on an associated set of methods and on satisfying the requirements of the model's utilisation. Typical utilisations of a model are explanation, informed selection, and appropriation of an opportunity. The mental model that a user may have of a service can be supported by a specific general model of the service: the informative model. It generalises such conceptions as the instruction leaflet, the package or product insert, the information and directions for use, and the enclosed label.
Bernhard Thalheim, Ajantha Dahanayake

Providing Ontology-Based Privacy-Aware Data Access Through Web Services and Service Composition

Web services have emerged as an open, standards-based means of publishing and sharing data over the Internet. Whenever web services disclose sensitive data to service consumers, data privacy becomes a fundamental concern for service providers. In many applications, sensitive data may only be disclosed to particular users for specific purposes. That is, access to sensitive data is often restricted, and web services must be aware of these restrictions so that the required privacy of sensitive data can be guaranteed. Privacy preservation is a major challenge that has attracted much attention from researchers and practitioners. Hippocratic databases have recently emerged to protect privacy in relational database systems, where access decisions (allowed or denied) are based on privacy policies and authorization tables; in particular, the specific purpose of a data access is taken into account. Ontologies have been used to represent classification hierarchies, which can be accessed efficiently via ontology query languages. In this paper, we propose an ontology-based data access model that provides different levels of data access to web service users with different roles and for different purposes. To this end, we utilize ontologies to represent purpose hierarchies and data generalization hierarchies. For more complex service requests that require composite web services, we discuss the privacy-aware composition of web services. To demonstrate the usefulness of our access control model, we have implemented prototypes of financial web services and used them to evaluate the performance of the proposed approach.
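A purpose-based access decision of the kind the abstract describes can be sketched as a walk up a purpose hierarchy. This is a hypothetical illustration, not the paper's model: the hierarchy, the policy contents, and the rule that a request purpose is permitted if it or one of its ancestors appears in the policy are all assumptions made for the sketch; the paper represents such hierarchies with ontologies rather than a hard-coded tree.

```python
# Hypothetical purpose hierarchy as a child -> parent map.
PARENT = {
    "marketing": "general",
    "billing": "general",
    "third-party-marketing": "marketing",
}

def ancestors(purpose):
    """Yield the purpose itself and all of its ancestors up the hierarchy."""
    while purpose is not None:
        yield purpose
        purpose = PARENT.get(purpose)

def access_allowed(policy, role, attribute, request_purpose):
    """Allow access if the request purpose, or any ancestor of it, is
    permitted for this (role, attribute) pair in the privacy policy."""
    allowed = policy.get((role, attribute), set())
    return any(p in allowed for p in ancestors(request_purpose))

# Illustrative policy: analysts may read account balances for marketing.
policy = {("analyst", "balance"): {"marketing"}}
```

Under this policy, a request for the more specific purpose "third-party-marketing" is allowed because "marketing" is an ancestor, while a "billing" request is denied; the ontology-based model additionally generalizes the returned data according to the data generalization hierarchy.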
Sven Hartmann, Hui Ma, Panrawee Vechsamutvaree

