
2005 | Book

Computational Science and Its Applications – ICCSA 2005

International Conference, Singapore, May 9-12, 2005, Proceedings, Part III

Edited by: Osvaldo Gervasi, Marina L. Gavrilova, Vipin Kumar, Antonio Laganà, Heow Pueh Lee, Youngsong Mun, David Taniar, Chih Jeng Kenneth Tan

Publisher: Springer Berlin Heidelberg

Book series: Lecture Notes in Computer Science


About this book

The four-volume set assembled following the 2005 International Conference on Computational Science and its Applications, ICCSA 2005, held at the Suntec International Convention and Exhibition Centre, Singapore, from 9 May 2005 till 12 May 2005, represents the fine collection of 540 refereed papers selected from nearly 2,700 submissions. Computational Science has firmly established itself as a vital part of many scientific investigations, affecting researchers and practitioners in areas ranging from applications such as aerospace and automotive, to emerging technologies such as bioinformatics and nanotechnologies, to core disciplines such as mathematics, physics, and chemistry. Due to the sheer size of many challenges in computational science, the use of supercomputing, parallel processing, and sophisticated algorithms is inevitable and becomes a part of fundamental theoretical research as well as endeavors in emerging fields. Together, these far-reaching scientific areas contribute to shape this Conference in the realms of state-of-the-art computational science research and applications, encompassing the facilitating theoretical foundations and the innovative applications of such results in other areas.

Table of Contents

Frontmatter

Grid Computing and Peer-to-Peer (P2P) Systems Workshop

Resource and Service Discovery in the iGrid Information Service

In this paper we describe the resource and service discovery mechanisms available in iGrid, a novel Grid Information Service based on the relational model. iGrid is developed within the GridLab project by the ISUFI Center for Advanced Computational Technologies (CACT) at the University of Lecce, Italy, and is deployed on the European GridLab testbed. The GridLab Information Service provides fast and secure access to both static and dynamic information through a GSI-enabled web service. Besides publishing system information, iGrid also allows the publication of user- or service-supplied information. The adoption of the relational model provides a flexible model for data, and the hierarchical distributed architecture provides scalability and fault tolerance.

Giovanni Aloisio, Massimo Cafaro, Italo Epicoco, Sandro Fiore, Daniele Lezzi, Maria Mirto, Silvia Mocavero
A Comparison of Spread Methods in Unstructured P2P Networks

In recent years, unstructured Peer-to-Peer (P2P) applications have become very popular on the Internet. Unstructured P2P topologies have a power-law link distribution, containing a few nodes of very high degree and many of low degree. This reflects the presence of central nodes which interact with many others and play a key role in relaying information. System performance can be improved by replicating file location information to the high-degree nodes of an unstructured P2P network. In this paper, we present an overview of several spread mechanisms for unstructured P2P networks and analyze their performance with respect to the number of information replicas and the bandwidth consumed. The high-degree spread method proves optimal. Simulation results empirically evaluate the mechanisms in direct comparison and verify our analysis.
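
The high-degree placement idea above can be sketched in a few lines. This is an illustrative toy, not the paper's simulator; the overlay graph, the record, and the replica count are all invented for the example.

```python
# Sketch: replicate a file-location record to the highest-degree
# nodes of an unstructured P2P overlay (adjacency-list form).

def high_degree_spread(adjacency, record, k):
    """Place `record` on the k nodes with the largest degree."""
    degree = {node: len(peers) for node, peers in adjacency.items()}
    targets = sorted(degree, key=degree.get, reverse=True)[:k]
    return {node: record for node in targets}

# A tiny power-law-like overlay: hub "a" links to everyone.
overlay = {"a": ["b", "c", "d", "e"], "b": ["a"], "c": ["a"],
           "d": ["a"], "e": ["a", "b"]}
replicas = high_degree_spread(overlay, "file123 -> peer b", k=2)
```

Because queries forwarded at random reach high-degree nodes with high probability, placing replicas there maximizes the chance a search hits the record early.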

Zhaoqing Jia, Bingzhen Pei, Minglu Li, Jinyuan You
A New Service Discovery Scheme Adapting to User Behavior for Ubiquitous Computing

Discovering the primary services requested by users is a very difficult but crucial task in networked systems, especially in ubiquitous computing systems. Service discovery also has to be an accurate, fast, and stable operation. Typical lookup services such as SLP, Jini, and UPnP focus on either wired networks or particular networks. In this paper, we therefore propose a new service discovery algorithm using service agents that adapts to user behavior. We compare the proposed algorithm with the DEAPspace algorithm using real experiments. They reveal that the proposed scheme offers services to users as seamlessly and accurately as DEAPspace, while using about half the number of packets.

Yeo Bong Yoon, Hee Yong Youn
The Design and Prototype of RUDA, a Distributed Grid Accounting System

The Grid environment contains a large and growing number of widely distributed sites with heterogeneous resources. It is a great challenge to dynamically manage and account for the usage data of Grid resources, such as computational, network, and storage resources. A distributed Resource Usage Data management and Accounting system (RUDA) is designed to perform accounting in the Grid environment. RUDA utilizes a fully decentralized design to enhance scalability and supports heterogeneous resources with no significant impact on local systems. It can easily be integrated into Grid infrastructures and maintains the integrity of the Grid security features.

M. L. Chen, A. Geist, D. E. Bernholdt, K. Chanchio, D. L. Million
An Adaptive Routing Mechanism for Efficient Resource Discovery in Unstructured P2P Networks

The widespread adoption of large-scale decentralized peer-to-peer (P2P) systems imposes huge challenges on distributed search and routing. Decentralized and unstructured P2P networks are very attractive because they require neither centralized directories nor precise control over network topology or data placement. However, their search mechanisms are extremely unscalable, generating large loads on the network participants. In this paper, to address this major limitation, we propose and evaluate the adoption of an innovative algorithm for routing user queries. The proposed approach aims at dynamically adapting the network topology to peer interests, on the basis of query interactions among users. Preliminary evaluations show that the approach is able to dynamically group peer nodes into clusters containing peers with shared interests, organized into a small-world topology.

Luca Gatani, Giuseppe Lo Re, Salvatore Gaglio
Enhancing UDDI for Grid Service Discovery by Using Dynamic Parameters

A major problem for a grid user is the discovery of currently available services. With a large number of services, it is beneficial for users to be able to discover the services that most closely match their requirements. This report shows how to extend some concepts of UDDI so that they are suitable for dynamic-parameter based discovery of grid services.

Brett Sinclair, Andrzej Goscinski, Robert Dew
A New Approach For Efficiently Achieving High Availability in Mobile Computing

Recent advances in hardware technologies such as portable computers and wireless communication networks have led to the emergence of mobile computing systems. Thus, availability and accessibility of data and services become important issues for mobile computing systems. In this paper, we present a data replication and management scheme tailored for such environments. In the proposed scheme, data is replicated synchronously over stationary sites, while for the mobile network, data is replicated asynchronously based on commonly visited sites for each user. The proposed scheme is compared with other techniques and is shown to incur lower communication cost per operation as well as provide a higher degree of data availability.

M. Mat Deris, J. H. Abawajy, M. Omar
A Flexible Communication Scheme to Support Grid Service Emergence

Next-generation grid systems exhibit a strong sense of automation. To address this challenge, our previous work viewed a grid as a number of interacting agents and applied key mechanisms of natural ecosystems to build a novel grid middleware system, in which a collection of distributed agents are searched, assembled, organized, and coordinated so that desirable grid services emerge. All these actions of agents depend on an effective communication scheme.

In this paper, we design a flexible communication scheme to implement the complicated coordination strategies among agents, comprising an RMI-IIOP-based transport mechanism, an ecological network communication language, and an ecological network interaction protocol, ordered from low- to high-level implementation strategy. To test our hypothesis that grid services with desired properties can emerge from individual agents via our communication scheme, simulations of a resource discovery service are carried out. The results show that the scheme supports this kind of bottom-up approach to building desirable services in grid environments.

Lei Gao, Yongsheng Ding
A Kernel-Level RTP for Efficient Support of Multimedia Service on Embedded Systems

Since the RTP is suitable for real-time data transmission in multimedia services like VoD, AoD, and VoIP, it has been adopted as a real-time transport protocol by RTSP, H.323, and SIP. Even though an RTP protocol stack for embedded systems has been in great need for the efficient support of multimedia services, such a stack has not been developed yet. In this paper, we describe embeddedRTP, which supports the RTP protocol stack at the kernel level so that it is suitable for embedded systems. Since embeddedRTP is designed to reside in the UDP module, existing applications which rely on TCP/IP services can be processed the same as before, while applications which rely on the RTP protocol stack can request RTP services through embeddedRTP's API. Our performance test shows that the packet-processing speed of embeddedRTP is about 7.8 times faster than that of UCL RTP for multimedia streaming services on a PDA, even though its object code size is reduced by about 58% with respect to UCL RTP's.

Dong Guk Sun, Sung Jo Kim
Group-Based Scheduling Scheme for Result Checking in Global Computing Systems

This paper considers the problem of guaranteeing correctness through fault tolerance in global computing systems. Global computing systems have been shown to be exposed to intentional attacks for which authentication is not relevant and network security techniques are insufficient. To guarantee the correctness of computations, fault-tolerance schemes such as majority voting and spot-checking have been used, but these schemes incur high computation delay because no scheduling scheme is applied to result checking. In this paper, we propose a new technique called GBSS (Group-Based Scheduling Scheme), which guarantees correctness and reduces computation delay by applying a fault-tolerant scheduling scheme to result checking. Additionally, simulation results show an increased usable rate of the CPU and increased performance of the system.

HongSoo Kim, SungJin Choi, MaengSoon Baik, KwonWoo Yang, HeonChang Yu, Chong-Sun Hwang
Service Discovery Supporting Open Scalability Using FIPA-Compliant Agent Platform for Ubiquitous Networks

The service discovery protocol is the main element that determines the efficiency of a middleware platform for ubiquitous networks. Recently, a large number of middlewares supporting scalability, which focus on distributed computing and data sharing among tremendous numbers of peers, have been proposed. However, due to the distributed nature of ad-hoc networks, peers may not be able to find other peers and the corresponding resources persistently. Furthermore, current service discovery engines do not provide open scalability that would allow them to interoperate with each other. In this paper, we propose a simple mechanism which provides cross-platform interoperability through directory federation combined with a DHT mechanism, to support open scalability and lightweight service discovery in ad-hoc networks, based on a FIPA-compliant agent framework.

Kee-Hyun Choi, Ho-Jin Shin, Dong-Ryeol Shin
A Mathematical Predictive Model for an Autonomic System to Grid Environments

One of the most important aims of Grid technology is using geographically distributed resources. Nevertheless, Grid environments have a great problem: system management is very complex because of the large number of resources. Thus, improving system performance is a hard task, and it would be advisable to build an autonomic system in charge of system management. The autonomic system must take decisions based on analysis of the monitored data of the whole Grid, trying to improve system performance. These decisions should take into account not only the current conditions of the Grid but also predictions of the future behaviour of the system. In this sense, we propose a mathematical model to decide the optimal policy based on predictions made from the known past behaviour of the system. This paper presents our model on the basis of decision theory.

Alberto Sánchez, María S. Pérez

Spatial Analysis and GIS: Local or Global? Workshop

Spatial Analysis: Science or Art?

Spatial Analysis is a relatively young discipline, descending from a modelling tradition where the analyst possesses all the knowledge and qualities that lead him/her to the definition of the optimal model. Numerous spatial analytical techniques are available these days in general-purpose GIS software, but their user interfaces are dry and do not offer structured choices in pull-down menus, as they do for more conventional operations in GIS. The average GIS user is often unprepared to identify the right solutions without guidance. Defining optimizing criteria and introducing them in structured software interfaces appears to be, at present, the best means to promote a widespread and appropriate use of spatial analysis. Defining such criteria constitutes at the same time an important line of research, potentially capable of furthering the theoretical underpinnings of the discipline, aiding its transition from infancy to maturity.

Stefania Bertazzon
Network Density Estimation: Analysis of Point Patterns over a Network

This research focuses on examining point pattern distributions over a network, therefore abandoning the usual hypotheses of homogeneity and isotropy of space and considering network spaces as frameworks for the distribution of point patterns. Many human-related point phenomena are distributed over a space that is usually not homogeneous and that depends on a network-led configuration. Kernel Density Estimation (KDE) and K-functions are commonly used and allow analysis of first- and second-order properties of point phenomena. Here an extension of KDE, called Network Density Estimation (NDE), is proposed. The idea is to consider the kernel as a density function based on network distances rather than Euclidean ones. That should allow identification of 'linear' clusters along networks and of a more precise surface pattern of network-related phenomena.
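
The core substitution NDE makes, network distance in place of Euclidean distance inside the kernel, can be sketched as follows. The graph, event locations, and bandwidth below are invented for illustration and are not the paper's data or its exact estimator.

```python
# Sketch of the NDE idea: a Gaussian kernel density where distance is
# shortest-path length along a network rather than Euclidean distance.
from collections import deque
import math

def network_distances(adjacency, source):
    """Unweighted shortest-path (hop) distances from `source` via BFS."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in adjacency[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

def nde(adjacency, events, bandwidth):
    """Density at every node: sum of Gaussian kernels of the network
    distance to each event location."""
    density = {node: 0.0 for node in adjacency}
    for event in events:
        dist = network_distances(adjacency, event)
        for node, d in dist.items():
            density[node] += math.exp(-((d / bandwidth) ** 2) / 2)
    return density

# A toy linear road with four junctions; events cluster near node 1.
road = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
dens = nde(road, events=[1, 1, 2], bandwidth=1.0)
```

Because the kernel spreads along edges instead of across empty space, density peaks fall on the network itself, which is what makes 'linear' clusters along roads detectable.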

Giuseppe Borruso
Linking Global Climate Grid Surfaces with Local Long-Term Migration Monitoring Data: Spatial Computations for the Pied Flycatcher to Assess Climate-Related Population Dynamics on a Continental Scale

Bird populations are known to be affected by climate and habitat change. Here we assess on a continental scale the relationship of a bird population index for the Pied Flycatcher (Ficedula hypoleuca) with spatially explicit long-term climate data. For 1971-2001, using multiple linear regression and AIC selection methods for candidate models, we found that log-transformed 30-year long-term fall bird monitoring data from Rybachy Station (Russia), Baltic Sea, can be explained by 40% with monthly mean temperatures in the West African wintering grounds; the positive relationship suggests that increasing bird numbers are explained by increasing mean November temperatures. Precipitation and European fall, spring, and breeding-range temperatures did not show a strong relationship, nor did bird monitoring data from two other international stations (Pape and Kabli). Our findings help to improve hypotheses to be tested in the poorly known wintering grounds. However, due to various biases, care has to be taken when interpreting international long-term bird monitoring data.

Nikita Chernetsov, Falk Huettmann
Classifying Internet Traffic Using Linear Regression

A globally weighted regression technique is used to classify 32 monitoring sites pinging data packets to 513 unique remote hosts. A statistic is developed relative to the line of best fit for a 360° manifold, measuring either global or local phase correlation for any given monitoring site in this network. The global slope of the regression line for the variables, phase and longitude, is standardised to unity to account for the Earth’s rotation. Monitoring sites with a high global phase correlation are well connected, with the observed congestion occurring at the remote host. Conversely, sites with a high local phase correlation are poorly connected and are dominated by local congestion. These 32 monitoring sites can be classified either globally or regionally by a phase statistic ranging from zero to unity. This can provide a proxy for measuring the monitoring site’s network capacity in dealing with periods of peak demand. The research suggests that the scale of spatial interaction is one factor to consider in determining whether to use globally or locally weighted regression, since beyond one thousand kilometres, random noise makes locally weighted regression problematic.

T. D. Mackay, R. G. V. Baker
Modeling Sage Grouse: Progressive Computational Methods for Linking a Complex Set of Local, Digital Biodiversity and Habitat Data Towards Global Conservation Statements and Decision-Making Systems

Modern conservation management needs to link biological questions with computational approaches. As a global template, here we present such an approach from a local study on sage grouse breeding habitat, leks, in North Natrona County, Wyoming, using remote sensing imagery, digital datasets, spatial statistics, predictive modelling and a Geographic Information System (GIS). Four quantitative models that describe sage grouse breeding habitat selection were developed for multiple scales using logistic regression and multivariate adaptive regression splines (MARS-Salford Systems). Based on candidate models and AIC, important habitat predictor variables were elevation, distance to human development, slope, distance to roads, NDVI and distance to water, but not Sagebrush. Some predictors changed when using different scales and MARS. For the year 2011, a cumulative prediction index approach is presented on how the population viability of sage grouse can be assessed over time and space using Markov chain models for deriving future landscape scenarios and MARS for species predictions.

Anthonia Onyeahialam, Falk Huettmann, Stefania Bertazzon
Local Analysis of Spatial Relationships: A Comparison of GWR and the Expansion Method

Considerable attention has been paid in recent years to the use and development of local forms of spatial analysis, including the method known as geographically weighted regression (GWR). GWR is a simple, yet conceptually appealing approach for exploring spatial non-stationarity that has been described as a natural evolution of the expansion method. The objective of the present paper is to compare, by means of a simulation exercise, these two local forms of spatial analysis. Motivation for the exercise derives from two basic research questions: Is spatial non-stationarity in GWR an artifact of the way the model is calibrated? And, how well does GWR capture spatial variability? The results suggest that, on average, spatial variability in GWR is not a consequence of the calibration procedure, and that GWR is sufficiently flexible to reproduce the type of map patterns used in the simulation experiment.
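
GWR's local calibration step can be illustrated with a deliberately simplified sketch: a one-predictor weighted least squares fit at each focal location, with Gaussian weights that decay with distance from it. The synthetic data, bandwidth, and one-dimensional study area below are assumptions for illustration, not the simulation design used in the paper.

```python
# Sketch of GWR's core step: locally weighted least squares, one predictor.
import math

def gwr_coefficient(x, y, coords, focal, bandwidth):
    """Local slope and intercept at `focal` via Gaussian-weighted LS."""
    w = [math.exp(-(((c - focal) / bandwidth) ** 2) / 2) for c in coords]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x))
    sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic non-stationary process on a 1-D study area: the true slope
# drifts from ~1 in the "west" to ~3 in the "east", a pattern a single
# global regression would average away.
coords = [float(i) for i in range(20)]
x = [float(i % 5) for i in range(20)]
y = [(1.0 + 2.0 * c / 19.0) * xi for c, xi in zip(coords, x)]
west, _ = gwr_coefficient(x, y, coords, focal=0.0, bandwidth=3.0)
east, _ = gwr_coefficient(x, y, coords, focal=19.0, bandwidth=3.0)
```

Mapping the fitted local slope across focal points is exactly the kind of spatial-variability surface the simulation exercise asks GWR to reproduce.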

Antonio Páez
Middleware Development for Remote Sensing Data Sharing and Image Processing on HIT-SIP System

Sharing spatial data derived from remote sensing is highly significant, and Grid computing and Web Service technology provide fundamental support for it. In this paper we mainly discuss the architecture and middleware for sharing and processing spatial data derived from remote sensing. Because middleware for automatic transfer and task execution on the grid is the key to this architecture, we study that middleware. It can effectively protect the property of the owners of data and middleware by giving users their required results rather than simply copying data and code resources to them. Based on this sharing architecture and middleware technology, a data and middleware transfer example is shown.

Jianqin Wang, Yong Xue, Chaolin Wu, Yanguang Wang, Yincui Hu, Ying Luo, Yanning Guan, Shaobo Zhong, Jiakui Tang, Guoyin Cai
A New and Efficient K-Medoid Algorithm for Spatial Clustering

A new k-medoids algorithm is presented for spatial clustering in large applications. The new algorithm utilizes the TIN of medoids to facilitate local computation when searching for the optimal medoids. It is more efficient than most existing k-medoids methods while retaining exactly the same clustering quality as the basic k-medoids algorithm. The application of the new algorithm to road network extraction from classified imagery is also discussed, and the preliminary results are encouraging.
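
For orientation, the basic k-medoids iteration whose clustering quality the TIN-accelerated version reproduces can be sketched as below. The 1-D points, the distance function, and the initial medoids are made up for the example; the paper's contribution is the acceleration, not this loop.

```python
# Basic k-medoids (no TIN acceleration): alternate between assigning
# points to the nearest medoid and re-picking each cluster's best member.

def k_medoids(points, medoids):
    """1-D k-medoids with given initial medoids; returns final medoids."""
    dist = lambda a, b: abs(a - b)
    while True:
        # Assign each point to its nearest medoid.
        clusters = {m: [] for m in medoids}
        for p in points:
            clusters[min(medoids, key=lambda m: dist(p, m))].append(p)
        # In each cluster, pick the member minimising total in-cluster cost.
        new = sorted(
            min(c, key=lambda cand: sum(dist(cand, p) for p in c))
            for c in clusters.values())
        if new == sorted(medoids):
            return new
        medoids = new

pts = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]   # two obvious groups
final = k_medoids(pts, medoids=[1.0, 3.0])
```

Unlike k-means, each representative is an actual data point, which is why medoids suit spatial objects such as road-network nodes.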

Qiaoping Zhang, Isabelle Couloigner

Computer Graphics and Rendering Workshop

Security Management for Internet-Based Virtual Presentation of Home Textile Product

Internet and e-commerce technologies are two fast-developing fields, and companies that adopt these new techniques have gained good economic profit. In this paper, we focus on the application of these techniques to the virtual presentation and sale of textile products. The paper addresses the following technical problems in detail: Internet-based garment CAD/electronic garment fitting, virtual presentation and design of home textile products, security problems in electronic commerce, and copyright protection of textile patterns on the Internet.

Lie Shi, Mingmin Zhang, Li Li, Lu Ye, Zhigeng Pan
An Efficient Approach for Surface Creation

In this paper, we present an efficient method for the generation of free-form surfaces using the solution to a fourth-order partial differential equation. In the interest of computational efficiency, the surface function is taken to be a combination of the boundary functions modulated by some unknown functions. Making use of the properties of the boundary functions, the fourth-order partial differential equation is transformed into a fourth-order ordinary differential equation. To solve this equation, we further convert it to a set of one-dimensional finite difference equations, where the number of unknowns is reduced significantly, allowing fast surface generation.

L. H. You, Jian J. Zhang
Interactive Visualization for OLAP

Business data collection has grown exponentially in recent years. A variety of industries and businesses have adopted new data storage technologies such as data warehouses. On-Line Analytical Processing (OLAP) has become an important tool for executives, managers, and analysts to explore, analyze, and extract interesting patterns from the enormous amounts of data stored in data warehouses and multidimensional databases. However, it is difficult for human analysts to interpret and extract meaningful information from large amounts of data if the data is presented in textual form as relational tables. Visualization and interactive tools employ graphical display formats that help analysts understand and extract useful information quickly from huge data sets. This paper presents a new visual interactive exploration technique for the analysis of multidimensional databases. Users can gain both overviews and refined views of any particular region of interest of data cubes through the combination of interactive tools and navigational functions such as drilling down, rolling up, and slicing. Our technique allows users who are not experts in OLAP technology to explore and analyze OLAP data cubes and data warehouses without generating sophisticated queries. Furthermore, the visualization in our technique displays the exploration path, enhancing the user's understanding of the exploration.

Kesaraporn Techapichetvanich, Amitava Datta
Interactive 3D Editing on Tiled Display Wall

Recent interest in large displays has led to the rapid development of cluster-based tiled display wall systems, which are composed of many individual projectors arranged in an array and driven by a cluster of PCs. In this paper we describe an interaction protocol and the essential synchronization mechanisms to support interactive 3D editing on a cluster-based tiled display wall. To examine the feasibility of our approach, we have developed a parallel interaction library named PIEL (Parallel Interactive Editing Library). Using PIEL, rendering applications running on a stand-alone PC can conveniently be ported to cluster-based tiled display wall systems.

Xiuhui Wang, Wei Hua, Hujun Bao
A Toolkit for Automatically Modeling and Simulating 3D Multi-articulation Entity in Distributed Virtual Environment

This paper describes the Entity Modeling and Simulating Platform (EMS), a toolkit which can automatically model and simulate multi-articulation entities (actor or avatar) in a distributed virtual environment. EMS focuses particularly on modeling the physical and behavioral attributes of an entity and defining the relationship among different attributes. Using EMS, a user can interactively define the attributes of an entity, run and control the entity in a distributed virtual environment without programming. In this paper, the main modules and key technologies of EMS are introduced.

Liang Xiaohui, Wang Chuanpeng, Che Yinghui, Yu Jiangying, Qu Na
Footprint Analysis and Motion Synthesis

In this paper, we describe a novel method to analyze captured human motion data. We analyze the sequence of footprints of the moving figure to extract different motion stages, as well as the switch frame between two adjacent stages. The displacement between left and right footprints is a time-varying function, and changes of motion (in direction, posture, speed, etc.) are reflected in it. Each motion stage is a sequence of motion data; two adjacent stages can differ merely in movement direction even though they share the same movement style. In the second half of this paper, we apply footprint analysis to a short motion clip of a human walking forward, and synthesize new motions of walking (or running) along a path whose direction can be changed interactively at run time.

Qinping Zhao, Xiaoyan Hu
An Adaptive and Efficient Algorithm for Polygonization of Implicit Surfaces

This paper describes an adaptive and efficient algorithm for the polygonization of implicit surfaces, which consists of two steps: initial polygonization and adaptive refinement. The algorithm first generates an initial coarse triangular mesh from the implicit surface using a variation of the traditional Marching Cubes (MC) algorithm. The triangles in the coarse mesh are then iteratively subdivided by employing a sampling rate that varies spatially according to the local complexity of the surface. The newly created vertices in the refined mesh are projected onto the implicit surface by a gradient descent method. Consequently, the algorithm produces the minimum number of polygons required to approximate the surface with a desired precision, and the final mesh is a simplicial complex. Our algorithm can be used in real-time visualization environments for implicit surfaces.
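
The vertex-projection step can be illustrated concretely. The sketch below uses the unit sphere as the implicit surface and a Newton-style gradient step; the specific surface, step rule, and iteration count are assumptions for illustration, not the paper's exact formulation.

```python
# Sketch: pull a refined-mesh vertex onto the implicit surface f = 0 by
# stepping along the gradient, here for the unit sphere
# f(x, y, z) = x^2 + y^2 + z^2 - 1.
import math

def f(p):
    return p[0] ** 2 + p[1] ** 2 + p[2] ** 2 - 1.0

def grad_f(p):
    return (2 * p[0], 2 * p[1], 2 * p[2])

def project(p, iterations=50):
    """Newton-style gradient steps: p <- p - f(p) * grad / |grad|^2."""
    for _ in range(iterations):
        g = grad_f(p)
        g2 = g[0] ** 2 + g[1] ** 2 + g[2] ** 2
        step = f(p) / g2
        p = (p[0] - step * g[0], p[1] - step * g[1], p[2] - step * g[2])
    return p

vertex = project((0.3, 0.4, 2.0))  # a subdivision vertex off the surface
```

Each step moves the point along the surface normal direction by an amount proportional to the residual f(p), so vertices created by subdivision quickly snap onto the surface.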

Mingyong Pang, Zhigeng Pan, Mingmin Zhang, Fuyan Zhang
A Framework of Web GIS Based Unified Public Health Information Visualization Platform

GIS plays a vital role in public health information visualization for public health information management, broadcasting, data management, statistical analysis, and decision support. This paper describes the elementary requirements and the essential technology for public health information visualization and proposes a framework for a unified public health information visualization platform based on Web GIS and visualization technology. The framework adopts a multi-tier system infrastructure that consists of the server tier and the front tier. In the server tier, a J2EE-based architecture is adopted to construct a distributed system infrastructure. In the front tier, a GIS map Java applet is used to show public health information on a spatial graphical map, and web-based graphics such as curves, bars, and maps, together with multi-dimensional visualization technology, are used to visualize the public health information. Public health information containing geo-referenced data, such as specific locations, area codes, latitude and longitude, street addresses, and geopolitical boundaries, can be visualized with GIS distribution maps. The system infrastructure, functions, system integration, and some key technologies are discussed in this paper. The platform has important practical value for constructing visible public health information systems.

Xiaolin Lu
An Improved Colored-Marker Based Registration Method for AR Applications

Registration is crucial in an Augmented Reality (AR) system, for it determines the quality of alignment between virtual objects and the real scene. Colored markers with known world coordinates are usually placed in the target scene beforehand to help obtain real-time, precise registration, because they provide explicit 3D/2D correspondences. Four such correspondences produce adequate and accurate equations for the pose matrix if the camera's intrinsic matrix has already been calibrated, and registration can then be achieved by solving these equations. However, often only a limited number (e.g. two or three) of the four markers can be captured, which makes the colored-marker based method fail. To overcome this shortcoming, an improved colored-marker based registration method is proposed in this paper, which works when the target scene is a plane. The proposed method integrates both 3D/2D and 2D/2D information by updating the cost function used in the optimization step of RANSAC, and thus combines the virtues of homography-based methods. Experimental results show that the proposed method provides acceptable pose estimation and has the potential to be applied in actual AR systems.

Xiaowei Li, Yue Liu, Yongtian Wang, Dayuan Yan, Dongdong Weng, Tao Yang
Non-photorealistic Tour into Panorama

In this paper, we describe a system called NP-TIP (Non-Photorealistic Tour Into Panorama). It provides a simple non-photorealistic scene model in which users can freely walk through and obtain an enjoyable, real-time artistic experience. In NP-TIP, we first design a new algorithm for quickly converting a photo or image into a synthesized painting following the painting style of an example image. By treating painting styles as sample textures, we reduce the problem of learning an example painting to that of texture synthesis, which improves the speed of non-photorealistic rendering. Second, we propose a new modeling scheme for TIP based on the cubic panorama, which not only overcomes the disadvantage of a fixed viewpoint when browsing a panorama, but is also easy to model and simple to compute. According to the user's selection from example paintings of different artistic styles, NP-TIP can provide stylized, interactive, real-time panoramic walkthroughs of scenes.

Yang Zhao, Ya-Ping Zhang, Dan Xu
Image Space Silhouette Extraction Using Graphics Hardware

In computer graphics, silhouette extraction and rendering play an important role in a number of applications. Special features such as silhouettes and creases of a polygonal scene are usually displayed by identifying the corresponding geometry, especially in non-photorealistic rendering (NPR). We present an algorithm for extracting and rendering silhouette outlines and crease edges of 3D polygonal meshes in image space. The algorithm is simple and suitable for implementation on modern programmable graphics hardware, and it can synthesize NPR images in real time. Our experimental results show that it satisfies real-time interactive needs.

Jiening Wang, Jizhou Sun, Ming Che, Qi Zhai, Weifang Nie
Adaptive Fuzzy Weighted Average Filter for Synthesized Image

Monte Carlo is a powerful tool for the computation of global illumination. However, the noise that results from the slow convergence of Monte Carlo is noticeable in synthesized images with global illumination effects and can be regarded as a combination of Gaussian noise and impulse noise. Filtering is a cheap way to eliminate such noise. In this paper, we investigate nonlinear filtering techniques to reduce this mixed noise. Based on fuzzy theory, we present an adaptive weighted average filter that optimizes the weights of the filters. Analysis and computational results obtained from experiments on noise attenuation and edge preservation indicate that the new algorithm is promising.
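The abstract does not give the fuzzy weighting itself; as a rough sketch of the general idea, an adaptive weighted-average filter can down-weight neighbors whose intensity differs strongly from the center pixel. The exponential range weight below is an illustrative stand-in for the paper's fuzzy membership functions:

```python
# Hypothetical sketch of an adaptive weighted-average filter: each
# neighbor's weight decays with its intensity difference from the
# center pixel, so edges are preserved while noise is smoothed.
# (The paper derives its weights from fuzzy theory; the exponential
# weighting here is only an illustrative stand-in.)
import math

def adaptive_weighted_average(image, sigma=30.0):
    """Filter a 2-D grayscale image (list of lists) with a 3x3 window."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = image[y][x]
            num = den = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    v = image[y + dy][x + dx]
                    wgt = math.exp(-((v - center) / sigma) ** 2)
                    num += wgt * v
                    den += wgt
            out[y][x] = num / den
    return out
```

On a constant region all weights are equal and the filter reduces to a plain average, while across a strong edge the distant intensities receive nearly zero weight.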

Qing Xu, Liang Ma, Weifang Nie, Peng Li, Jiawan Zhang, Jizhou Sun

Data Mining and Bioinformatics Workshop

The Binary Multi-SVM Voting System for Protein Subcellular Localization Prediction

The Support Vector Machine (SVM) is a learning system that has been widely employed for pattern recognition and data classification tasks such as biological data classification. Choosing appropriate parameters is essential for an SVM to achieve high global performance. In this paper, we propose a new binary multi-SVM voting system for protein subcellular localization prediction that avoids difficult parameter selection. Experimental results demonstrate that the multi-SVM voting system achieves higher average prediction accuracies for protein subcellular localization prediction than the traditional single-SVM system.
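The voting step of such an ensemble can be sketched independently of the SVMs themselves; here each ensemble member is a stand-in callable returning +1 or −1 (in the paper the members are SVMs trained with different parameter settings):

```python
# Minimal majority-vote combiner for an ensemble of binary classifiers.
# Each "classifier" is just a callable returning +1 or -1, which keeps
# the sketch self-contained; real ensemble members would be trained SVMs.
from collections import Counter

def majority_vote(classifiers, x):
    """Return the label predicted by the most ensemble members."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]
```

With an odd number of binary members there is always a strict majority, so no tie-breaking rule is needed.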

Bo Jin, Yuchun Tang, Yan-Qing Zhang, Chung-Dar Lu, Irene Weber
Gene Network Prediction from Microarray Data by Association Rule and Dynamic Bayesian Network

Using microarray technology to predict gene function has become important in research. However, microarray data are complicated and require a powerful systematic method of analysis. Many scholars use clustering algorithms to analyze microarray data, but these algorithms can find only genes with the same expression mode, not the transcriptional relations between genes. Moreover, most traditional approaches involve time-consuming all-against-all comparisons. To reduce the comparison time and find more relations, the proposed method first uses the Apriori algorithm to filter possibly related genes, which reduces the number of candidate genes, and then applies a dynamic Bayesian network to find the genes' interactions. Unlike previous techniques, this method not only reduces the comparison complexity but also reveals more mutual interactions among genes.

Hei-Chia Wang, Yi-Shiun Lee
Protein Interaction Prediction Using Inferred Domain Interactions and Biologically-Significant Negative Dataset

Protein domains are evolutionarily-conserved structural or functional subunits in proteins that are suggestive of the proteins’ propensity to interact or form a stable complex. In this paper, we propose a novel domain-based probabilistic classification method to predict protein-protein interactions. Our method learns the interacting probabilities of domain pairs based on domain pairing information derived from both experimentally-determined interacting protein pairs and carefully-chosen non-interacting protein pairs. Unlike conventional approaches that use random pairing to generate artificial non-interacting protein pairs as negative training data, we generate biologically meaningful non-interacting protein pairs based on the proteins’ biological information. Such careful generation of negative training data set is shown to result in a more accurate classifier. Our classifier predicts potential interaction between any pair of proteins based on the probabilistically inferred domain interactions. Comparative results showed that our probabilistic approach is effective and outperforms other domain-based techniques for protein interaction prediction.

Xiao-Li Li, Soon-Heng Tan, See-Kiong Ng
Semantic Annotation of Biomedical Literature Using Google

With the increasing amount of biomedical literature, there is a need for automatic extraction of information to support biomedical researchers. Because biomedical information databases are incomplete, the extraction is not straightforward using dictionaries, and several approaches using contextual rules and machine learning have previously been proposed. Our work is inspired by these approaches but is novel in that it uses Google for semantic annotation of biomedical words. The semantic annotation accuracy obtained – 52% on words not found in the Brown Corpus, Swiss-Prot or LocusLink (accessed using Gsearch.org) – justifies further work in this direction.

Rune Sætre, Amund Tveit, Tonje S. Steigedal, Astrid Lægreid
Fast Parallel Algorithms for the Longest Common Subsequence Problem Using an Optical Bus

A parallel algorithm for the longest common subsequence problem on LARPBS is presented. For two sequences of lengths m and n, the algorithm uses p processors and costs O(mn/p) computation time, where 1 ≤ p ≤ max{m, n}. The time-area cost of the algorithm is O(mn/p) and the memory space required is O((m+n)/p), both of which are optimal. We also show that the algorithm is scalable when the number of processors p satisfies 1 ≤ p ≤ max{m, n}. To the best of our knowledge, this is the fastest cost-optimal parallel algorithm for the LCS problem on array architectures.

Xiaohua Xu, Ling Chen, Yi Pan, Ping He
Estimating Gene Networks from Expression Data and Binding Location Data via Boolean Networks

In this paper, we propose a computational method for estimating gene networks with the Boolean network model. Boolean networks pose some practical problems in analyzing DNA microarray gene expression data. One is the choice of the threshold value for discretizing gene expression data, since expression data take continuous values. The other is that the optimal gene network is often not determined uniquely, and it is difficult to choose the optimal one from the candidates using expression data alone. To solve these problems, we use the binding location data produced by Lee et al. [8] together with expression data, and illustrate a strategy for deciding the optimal threshold and gene network. To show the effectiveness of the proposed method, we analyze Saccharomyces cerevisiae cell-cycle gene expression data as a real application.
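The threshold-choice problem mentioned above arises in the very first step of Boolean-network inference, discretizing continuous expression values; a minimal sketch of that step (the paper's contribution, choosing the threshold via binding location data, is omitted here):

```python
# Discretizing continuous expression values into Boolean states is the
# first step of Boolean-network inference. The paper's contribution is
# choosing the threshold using binding location data; this sketch only
# shows the discretization itself for a given threshold.
def discretize(expression, threshold):
    """Map each continuous expression value to 1 (on) or 0 (off)."""
    return [1 if v >= threshold else 0 for v in expression]
```

Different thresholds can produce very different Boolean profiles for the same gene, which is exactly why the choice matters.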

Osamu Hirose, Naoki Nariai, Yoshinori Tamada, Hideo Bannai, Seiya Imoto, Satoru Miyano
Efficient Matching and Retrieval of Gene Expression Time Series Data Based on Spectral Information

In this paper, we propose an efficient method based on spectral analysis for matching and retrieval of gene expression time series data. In this technique, we decompose a gene expression time series into a set of spectral components. The spectral parameters can then be used to compute the correlation between the expression data for a pair of genes using a closed-form mathematical equation. This method provides a reliable similarity metric for the comparison of gene expression data and can be used for efficient data retrieval.
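The general idea, comparing genes through their spectral components rather than raw time series, can be sketched by correlating DFT magnitude spectra. Note that the paper derives a closed-form correlation from fitted spectral parameters, so this is only an approximation of that approach:

```python
# Illustrative sketch: compare two expression time series through their
# magnitude spectra rather than raw values. The paper uses a closed-form
# correlation on fitted spectral parameters; the naive DFT + cosine
# similarity here only demonstrates the spectral-comparison idea.
import cmath

def dft_magnitudes(series):
    """Magnitude spectrum of a real-valued time series (naive DFT)."""
    n = len(series)
    return [abs(sum(series[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

def spectral_similarity(a, b):
    """Cosine similarity between two magnitude spectra."""
    sa, sb = dft_magnitudes(a), dft_magnitudes(b)
    dot = sum(x * y for x, y in zip(sa, sb))
    na = sum(x * x for x in sa) ** 0.5
    nb = sum(x * x for x in sb) ** 0.5
    return dot / (na * nb)
```

Because magnitude spectra are invariant to circular time shifts, two phase-shifted versions of the same periodic profile score as identical, a useful property for cell-cycle expression data.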

Hong Yan
SVM Classification to Predict Two Stranded Anti-parallel Coiled Coils based on Protein Sequence Data

The coiled coil is an important 3-D protein structure in which two or more alpha-helical strands wind around one another to form a “knobs-into-holes” structure. In this paper we propose an SVM classification approach to predict the two-stranded anti-parallel coiled coil structure from the primary amino acid sequence. The training data for the machine learning are collected from the SOCKET database, a database of coiled coils predicted by the SOCKET algorithm. A total of 41 sequences of at least two heptad repeats of the two-stranded anti-parallel coiled coil motif are extracted from 12 proteins as the positive dataset, and a total of 37 non-coiled-coil sequences and two-stranded parallel coiled coil motifs are extracted from 5 proteins as the negative dataset. The normalized positional weight matrix on each heptad register a, b, c, d, e, f and g is taken from the SOCKET database and is used to generate the positional weight of each entry. We performed SVM classification using cross-validated datasets as training and testing groups. Our result shows 73% accuracy in predicting two-stranded anti-parallel coiled coils on the cross-validated data, suggesting that SVM classification from the primary amino acid sequence is a useful approach for this motif.

Zhong Huang, Yun Li, Xiahohua Hu
Estimating Gene Networks with cDNA Microarray Data Using State-Space Models

A state-space model is used in this paper to analyze the dynamics of gene expression profile data. State-space models describe dynamics in which the observed measurements depend on hidden state variables. Hidden state variables can capture effects that cannot be measured in a gene expression profiling experiment, for example genes that have not been included among the observation variables, levels of regulatory proteins, or the effect of mRNA. System identification is achieved by the EM algorithm, which is based on the maximum likelihood method. We apply this method to published yeast cell-cycle gene expression time-series data under the assumption that the state and observation variables are generated by a Gaussian white noise process, which produces the simplest and most reasonable model to explain the behavior of the gene expression profiles.

Rui Yamaguchi, Satoru Yamashita, Tomoyuki Higuchi
A Penalized Likelihood Estimation on Transcriptional Module-Based Clustering

In this paper, we propose a new clustering procedure for high-dimensional microarray data. A major difficulty in cluster analysis of microarray data is that the number of samples to be clustered is much smaller than the dimension of the data, which equals the number of genes used in the analysis. In such a case, the applicability of conventional model-based clustering is limited by the occurrence of overlearning. The key idea of the proposed method is to seek a linear mapping of the data onto a low-dimensional subspace before proceeding to cluster analysis. The linear mapping is constructed such that the transformed data successfully reveal the clusters existing in the original data space; a clustering rule is then applied to the transformed data rather than the original data. We also establish a link between this method and a probabilistic framework, namely a penalized likelihood estimation of the mixed factors model. The effectiveness of the proposed method is demonstrated through a real application.

Ryo Yoshida, Seiya Imoto, Tomoyuki Higuchi
Conceptual Modeling of Genetic Studies and Pharmacogenetics

Genetic studies examine relationships between genetic variation and disease development. Pharmacogenetics studies drug responses against genetic variation. These two lines of research evaluate relationships among subjects' genotype, phenotype and environment, and demand a variety of other information, such as clinical observations, disease development history, demographics, life style and living environment. Correct and informative modeling of these data is critical for bioinformaticians: the model affects the capacity for data manipulation and the types of queries that can be asked, as well as the performance of the implemented system. In this paper, we present a conceptual model of genetic studies and pharmacogenetics using the Unified Modeling Language (UML). Our model provides a comprehensive view of integrated data for genetic studies and pharmacogenetics by incorporating genomics, experimental data, domain knowledge, research approaches, and interface data for other publicly available resources into one cohesive model. It can support diverse biomedical research activities that use both clinical and biomedical data to improve patient care by incorporating the roles of environment, life style and genetics. The model consists of a set of class diagrams organized into a hierarchy of package diagrams to show inter-object relationships clearly and intuitively at different levels of complexity.

Xiaohua Zhou, Il-Yeol Song

Parallel and Distributed Computing Workshop

A Dynamic Parallel Volume Rendering Computation Mode Based on Cluster

A cluster-based dynamic parallel mode, the MSGD (Master-Slave-Gleaner-Dynamic) mode, is presented in this paper. Combining load balancing with the MSG (Master-Slave-Gleaner) mode, this mode introduces a load-balancing strategy, the task pool, which can coordinate the workload of each node effectively. A volume rendering algorithm called splatting is used to test the performance of this mode on an IBM Cluster 1350. The results show that this mode can effectively increase the total calculation speed and improve global load balance. In addition, the mode has good scalability and is suitable for various parallel volume rendering algorithms with large-scale data volumes.

Weifang Nie, Jizhou Sun, Jing Jin, Xiaotu Li, Jie Yang, Jiawan Zhang
Dynamic Replication of Web Servers Using Rent-a-Servers

Some popular web sites attract large numbers of users and become hot spots on the Internet. Not only are their servers overloaded, but they also create excessive traffic on the Internet, and many of these hot spots appear and disappear rapidly. Traditional approaches such as web caching and static replication of web servers are not proper solutions to the hot spot problem. We present a dynamic replication method for web servers using a concept called rent-a-servers. The proposed method not only distributes web requests evenly among replicated web servers but also significantly reduces traffic on the Internet. We compare the proposed method with other approaches and show its effectiveness through simulation.

Young-Chul Shim, Jun-Won Lee, Hyun-Ah Kim
Survey of Parallel and Distributed Volume Rendering: Revisited

This paper covers a number of considerations about the subject of parallel volume rendering according to the rendering pipeline, including the choice of parallel architectures, the parallel volume rendering algorithms, the strategies for data distribution, and sorting and composition methods to achieve load balancing. Through the survey of recent parallel implementations, the general concerns and current research trend on the design of a parallel volume rendering system are discussed.

Jiawan Zhang, Jizhou Sun, Zhou Jin, Yi Zhang, Qi Zhai
Scheduling Pipelined Multiprocessor Tasks: An Experimental Study with Vision Architecture

This paper presents the application of scheduling algorithms on a class of multiprocessor architectures that exploit temporal and spatial parallelism simultaneously. The hardware platform is a multi-level, partitionable architecture: spatial parallelism is exploited with MIMD-type processor clusters (or layers), and temporal parallelism is exploited by pipelining operations on those independent clusters. In order to fully exploit the system's capacity, the multiprocessor tasks (MPTs) executed on such a system must be scheduled appropriately. In an earlier study, we proposed scheduling algorithms based on well-known local search heuristics such as simulated annealing, tabu search and genetic algorithms, and tested their performance computationally on a set of randomly generated test data. In this paper, we apply these scheduling algorithms to a multilayer architecture designed as the visual perception unit of an autonomous robot and evaluate the performance improvement achieved.

M. Fikret Ercan
Universal Properties Verification of Parameterized Parallel Systems

This paper presents a method for verifying universal properties of parameterized parallel systems using Parameterized Predicate Diagrams [10]. Parameterized Predicate Diagrams represent abstractions of such systems described by specifications written in temporal logic. The method presented here integrates deductive verification and algorithmic techniques: non-temporal proof obligations establish the correspondence between the original specification and the diagram, whereas model checking can be used to verify properties over finite-state abstractions.

Cecilia E. Nugraheni

Symbolic Computation, SC 2005 Workshop

2d Polynomial Interpolation: A Symbolic Approach with Mathematica

This paper extends previous work by the same authors on teaching 1d polynomial interpolation using Mathematica [1] to higher dimensions. It aims to simplify the theoretical discussion of multidimensional interpolation in a classroom environment by employing Mathematica's symbolic capabilities. In addition to symbolic derivations, some numerical tests are provided to show interesting properties of the higher-dimensional interpolation problem; Runge's phenomenon is displayed for 2d polynomial interpolation.

Ali Yazici, Irfan Altas, Tanil Ergenc
Analyzing the Synchronization of Chaotic Dynamical Systems with Mathematica: Part I

One of the most interesting and striking issues in dynamical systems is the possibility to synchronize the behavior of several (either identical or different) chaotic systems. This is the first of a series of two papers (both included in this volume) describing a new Mathematica package developed by the authors, ChaosSynchronization, for the analysis of chaotic synchronization. In particular, this first paper is devoted to the analysis of the Pecora-Carroll scheme (the first chaotic synchronization scheme reported in the literature) as well as a recent modification based on partial connections. The performance of the package is discussed by means of several illustrative and interesting examples. In our opinion, this package provides the users with an excellent, user-friendly computer tool for learning and analyzing this exciting topic within a unified (symbolic, numerical and graphical) framework.

A. Iglesias, A. Gálvez
Analyzing the Synchronization of Chaotic Dynamical Systems with Mathematica: Part II

This work concerns the synchronization of chaotic dynamical systems by using the program Mathematica and follows up a previous one (also included in this volume) devoted to the same topic [2]. In particular, this second paper classifies and illustrates the wide range of chaotic synchronization phenomena. Some examples of each kind are briefly analyzed in this paper with the help of a Mathematica package already introduced in the previous paper.

A. Iglesias, A. Gálvez
A Mathematica Package for Computing and Visualizing the Gauss Map of Surfaces

One of the most interesting and striking concepts in Differential Geometry is that of the Gauss map. In the case of surfaces, this map projects surface normals to a unit sphere. This strategy is especially useful when analyzing the shape structure of a smooth surface. This paper describes a new Mathematica package, GaussMap, for computing and displaying the tangent and normal vector fields and the Gauss map of surfaces described symbolically in either implicit or parametric form. The performance of the package is discussed by means of several illustrative and interesting examples. The package presented here can be applied for visualizing and studying the geometry of a surface under analysis, thus providing the users with an excellent computer tool for teaching and visualization purposes.

R. Ipanaqué, A. Iglesias
Numerical-Symbolic Matlab Toolbox for Computer Graphics and Differential Geometry

In the last few years, computer algebra systems (CAS) have become standard and very powerful tools for scientific computing. One of their most remarkable features is their ability to integrate numerical, symbolic and graphical capabilities within a uniform framework. In addition, in most cases these systems also incorporate a nice user interface, making them especially valuable for educational purposes. In this work we introduce a user-friendly Matlab toolbox for dealing with many of the most important topics in Computer Graphics and Differential Geometry. The paper describes the main features of this program (such as the toolbox architecture, its simulation flow, some implementation issues and the possibility of generating standalone applications) and how the symbolic, numerical and graphical Matlab capabilities have been effectively used in this process.

Akemi Gálvez, Andrés Iglesias
A LiE Subroutine for Computing Prehomogeneous Spaces Associated with Real Nilpotent Orbits

We describe an algorithm for decomposing certain modules attached to real nilpotent orbits into their irreducible components. These modules are prehomogeneous spaces in the sense of Sato and Kimura and arise in the study of nilpotent orbits and the representation theory of Lie groups. The output is a set of LaTeX statements that can be compiled in a LaTeX environment to produce tables. Although the algorithm is used to solve the problem in the case of exceptional real reductive Lie groups of inner type, it also describes these spaces for the classical cases of inner type. Complete tables for the exceptional groups can be found at http://www.math.umb.edu/~anoel/publications/tables/.

Steven Glenn Jackson, Alfred G. Noël
Applications of Graph Coloring

A graph G is a mathematical structure consisting of two sets, V(G) (the vertices of G) and E(G) (the edges of G). A proper coloring of a graph is an assignment of colors either to the vertices or to the edges of the graph in such a way that adjacent vertices/edges are colored differently. This paper discusses coloring and operations on graphs with Mathematica and webMathematica. We consider many classes of graphs to color, with applications. We can draw any graph and also show whether it has Eulerian and Hamiltonian cycles using our package ColorG.
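As an illustration of proper vertex coloring (not the ColorG package's own method), a standard greedy coloring assigns each vertex the smallest color not used by its already-colored neighbors:

```python
# Greedy proper vertex coloring: visit vertices in insertion order and
# give each one the smallest color absent from its colored neighbors.
# This always yields a proper coloring, though not necessarily one
# with the minimum number of colors.
def greedy_coloring(adjacency):
    """adjacency: dict vertex -> iterable of neighbors.
    Returns dict vertex -> color (0, 1, 2, ...)."""
    colors = {}
    for v in adjacency:
        taken = {colors[u] for u in adjacency[v] if u in colors}
        c = 0
        while c in taken:
            c += 1
        colors[v] = c
    return colors
```

For a triangle this produces three distinct colors, matching its chromatic number.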

Ünal Ufuktepe, Goksen Bacak
Mathematica Applications on Time Scales

Stefan Hilger introduced the calculus on time scales in 1988 in order to unify continuous and discrete analysis. The study of dynamic equations is an active area of research, since time scales unify discrete and continuous processes, among many others. In this paper we give many examples of differentiation and integration in the time scales calculus with Mathematica. We conclude by solving the first-order linear dynamic equation N^Δ(t) = N(t) and showing with Mathematica that the solution is a generalized exponential function.
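As a concrete check of the closing example (my illustration, not from the paper): on the time scale T = Z the delta derivative is the forward difference, so N^Δ(t) = N(t) becomes N(t+1) = 2N(t) and the generalized exponential is 2^t, while on T = R the same equation is N′ = N with solution e^t:

```python
# On T = Z the delta derivative is the forward difference, so
# N^Δ(t) = N(t) becomes N(t+1) - N(t) = N(t), i.e. N(t+1) = 2 N(t):
# the generalized exponential on this time scale is 2^t * N(0).
# On T = R the same dynamic equation is N' = N with solution e^t.
def solve_discrete(n0, steps):
    """Iterate N(t+1) = 2 N(t) from N(0) = n0 for the given steps."""
    values = [n0]
    for _ in range(steps):
        values.append(2 * values[-1])
    return values
```

This makes the "unification" tangible: one dynamic equation, two time scales, two familiar exponentials.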

Ahmet Yantır, Ünal Ufuktepe
A Discrete Mathematics Package for Computer Science and Engineering Students

Discrete mathematics is one of the basic mathematics courses in computer engineering (CE) and computer science (CS) departments. It covers fundamental concepts needed by many other courses in the curriculum and requires active learning from students. For this purpose, the concept of “propositions” in particular, which is often not well understood in earlier years, should be covered in full detail. Previous studies show that learning through entertaining activities and competition has a positive effect on student motivation. In this study, an Internet-based Discrete Mathematics Package (DMP) for “propositions” that runs on mobile devices and encourages competitive learning among students has been developed in line with the related literature.

Mustafa Murat Inceoglu
Circle Inversion of Two-Dimensional Objects with Mathematica

One of the most interesting and less known two-dimensional transformations is the so-called circle inversion. This paper presents a new Mathematica package, CircleInversion, for computing and displaying the images of two-dimensional objects (such as curves, polygons, etc.) under the circle inversion transformation. Those objects can be described symbolically in either parametric or implicit form. The output obtained is consistent with Mathematica's notation and results. The performance of the package is discussed by means of several illustrative and interesting examples.
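Numerically, circle inversion maps a point P ≠ C to the point on the ray from C through P satisfying |CP|·|CP′| = r²; a minimal numeric sketch of that formula (the package itself works symbolically in Mathematica) is:

```python
# Circle inversion of a 2-D point: the image lies on the ray from the
# circle's center through the point, with |CP| * |CP'| = radius**2.
# Points on the circle are fixed; applying the map twice is the identity.
def invert(point, center, radius):
    px, py = point
    cx, cy = center
    dx, dy = px - cx, py - cy
    d2 = dx * dx + dy * dy
    if d2 == 0:
        raise ValueError("the center has no inverse")
    s = radius * radius / d2
    return (cx + s * dx, cy + s * dy)
```

The involution property (inverting twice returns the original point) is the standard sanity check for an implementation like this.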

R. T. Urbina, A. Iglesias

Specific Aspects of Computational Physics for Modeling Suddenly-Emerging Phenomena Workshop

Specific Aspects of Training IT Students for Modeling Pulses in Physics

This study presents a method for teaching fundamental aspects of mathematics and physics to students in informatics and information technology, so that these concepts are available for modeling suddenly emerging phenomena at any time during the last years of study. By teaching basic aspects in the language of their specialty, and by posing questions about the mathematical or physical model adequate for their projects at unexpected moments (using a computer connected to a local network or the wireless e-mail connection of mobile phones), students improve their ability to analyze phenomena qualitatively; moreover, their technical judgment patterns can be established in an objective manner. In the end, students in informatics and information technology are able not only to understand basic phenomena in physics and electronics, but also to simulate complex phenomena such as suddenly emerging pulses in a proper manner, with the importance of quick and correct answers underlined in their minds.

Adrian Podoleanu, Cristian Toma, Cristian Morarescu, Alexandru Toma, Theodora Toma
Filtering Aspects of Practical Test-Functions and the Ergodic Hypothesis

This paper presents properties of dynamical systems able to generate practical test-functions (defined as functions which differ from zero on a certain interval and possess only a finite number of continuous derivatives on the whole real axis) when the free term of the differential equation (corresponding to the received input signal) is an alternating function. The shape of the output signal, obtained by numerical simulations in Matlab based on Runge-Kutta functions, is analyzed, showing that for high-frequency inputs an external observer could notice (under certain conditions) the generation of two different pulses corresponding to two distinct envelopes. This aspect differs from the oscillations of unstable second-order systems studied using difference equations.

Flavia Doboga, Ghiocel Toma, Stefan Pusca, Mihaela Ghelmez, Cristian Morarescu
Definition of Wave-Corpuscle Interaction Suitable for Simulating Sequences of Physical Pulses

This study presents a logic definition of the interaction between waves and corpuscles suitable for simulating the action of a sequence of electromagnetic waves upon corpuscles. First, the classes of measuring methods based on the wave aspect of matter and on the corpuscular aspect of matter are defined, using considerations about a possible memory of previous measurements (operators). A suitable algorithm associated with this formalization is applied on adjoining space-time intervals, so that the space-time validity of certain assertions can be proved. The results are applied to define the wave-corpuscle interaction in a logical manner.

Minas Simeonidis, Stefan Pusca, Ghiocel Toma, Alexandru Toma, Theodora Toma
Practical Test-Functions Generated by Computer Algorithms

As is well known, Runge-Kutta methods are widely used for numerical simulations [1]. This paper presents an application of such methods (performed using MATLAB procedures) for generating practical test-functions. First it is shown that differential equations can generate only functions similar to test functions (defined as practical test-functions); then the invariance properties of these practical test-functions are used to obtain a standard form for a differential equation able to generate such a function. This standard form is then used for computer-aided generation of practical test-functions, with a heuristic algorithm (based on MATLAB simulations) used to establish the simplest and most robust expression for the differential equation. Finally it is shown that we obtain an oscillating system (a system working at the limit of stability, from initial null conditions, on limited time intervals) which can easily be built as an analog circuit using standard electrical components and amplifiers.
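For contrast with the "practical" variant above: a classical test function in the strict sense is smooth everywhere yet nonzero only on (−1, 1), the standard example being the bump φ(t) = exp(−1/(1−t²)); practical test-functions relax the requirement of infinitely many continuous derivatives. A quick evaluation sketch:

```python
# The classical bump function: infinitely differentiable on all of R,
# but identically zero outside the open interval (-1, 1). Practical
# test-functions keep the compact support while requiring only finitely
# many continuous derivatives.
import math

def bump(t):
    """Evaluate phi(t) = exp(-1 / (1 - t^2)) on (-1, 1), else 0."""
    if abs(t) >= 1.0:
        return 0.0
    return math.exp(-1.0 / (1.0 - t * t))
```

The value peaks at φ(0) = e⁻¹ and decays to zero at both endpoints together with all of its derivatives.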

Ghiocel Toma
Possibilities for Obtaining the Derivative of a Received Signal Using Computer-Driven Second Order Oscillators

As is well known, a first step in modeling dynamic phenomena consists in measuring some physical quantities of the dynamic system with high accuracy. However, for suddenly-emerging phenomena, data acquisition cannot be restricted to sampling a received signal (corresponding to a certain physical quantity). A significant quantity is the derivative (the slope) of the received signal, because all dynamical models must take it into consideration. Usually the derivative of a received signal is obtained by filtering the signal and dividing the difference between the filtered values at two different moments of time by the time difference between those moments. Often these filtering and sampling devices consist of low-pass filters represented by asymptotically stable systems, sometimes with an integration of the filter output over a certain time interval added. However, such a structure is very sensitive to random variations of the integration period, so it is recommended that the signal being integrated be approximately zero at the end of the integration period. It will be shown that the simplest structure with these properties is an oscillating second-order computer-driven system working over a time period.

Andreea Sterian, Ghiocel Toma
Simulating Laser Pulses by Practical Test Functions and Progressive Waves

This study presents simulations performed in Matlab of the change in dielectric properties of mixtures of fatty acids under the influence of external optical or electrical pulses. These simulations are based on the use of practical test-functions. It is shown that, for an external observer, the behavior of the material medium for an alternating high-frequency input (corresponding to an optical pulse) presents a single oscillation on the whole working interval, similar to the dynamics of ε noticed during the experiments. For the case of suddenly-applied electric fields, it is shown that the use of a similar dynamical model, in conjunction with the hypothesis of progressive waves generated inside mixtures of fatty acids by suddenly-applied electric fields, can simulate the dynamics of the electrical properties of such mixtures in a correct manner.

Rodica Sterian, Cristian Toma
Statistical Aspects of Acausal Pulses in Physics and Wavelets Applications

Test-functions (which differ from zero only on a limited interval and have continuous derivatives of any order on the whole real axis) are widely used in the mathematical theory of distributions, and their use is also recommended in the Fourier analysis of wavelets. However, less attention has been given to connections between test-functions and the equations used in mathematical physics (such as the wave equation). This paper shows that test-functions, considered at the macroscopic scale (that is, not as δ-functions), can represent solutions of the wave equation in the form of acausal pulses (which appear under initial null conditions and without any source term). This implies that supplementary requirements must be added to the wave equation so that the possibility of such pulses appearing is rejected. It will be shown that such a possibility represents in fact a kind of bifurcation point, and a statistical interpretation (based on the probability of state variables making certain jumps) is presented to justify the fact that such pulses are not observed. Finally, the advantage of using practical test-functions for wavelet processing is presented.

Cristian Toma, Rodica Sterian
Wavelet Analysis of Solitary Wave Evolution

The problem of solitary wave propagation in dispersive media is considered. The nonlinear equation (a hyperbolic modification of Burgers' equation), in the presence of a dispersive propagation law, gives rise to nonlinear effects such as the breaking down of the wave into localized chaotic oscillations. Finally, it is shown how to handle the representation of solitary profiles by using elastic wavelets.

Carlo Cattani
Numerical Analysis of Some Typical Finite Differences Simulations of the Waves Propagation Through Different Media

The field of Numerical Physics has its own specific phenomena, which always intervene but are usually confined to very restricted limits. However, the descriptive capabilities (accuracy, computing time, etc.) of computers are limited, and sometimes, in strong connection with specific features of the algorithms used and with computer errors, these numerical phenomena reach significant amplitudes, so that the numerical simulations present significant distortions relative to the simulated (true) physical evolutions. The goal of this work is therefore to study the main features of some classical and newly discovered numerical phenomena associated with Finite Differences (FD) simulations of wave propagation through media with sharp interfaces and attenuative character (considered as suddenly-emerging phenomena), and of other physical processes. The mechanisms of these numerical phenomena were studied in detail, and the findings allow us to predict the distortions of the simulated physical processes. A good knowledge of the main features and mechanisms of the most important numerical phenomena also allows us to avoid drastic distortions of the simulated evolutions, as well as to optimize some numerical simulations.
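The kind of numerical phenomenon the paper analyzes can be reproduced in a few lines. The following sketch (with hypothetical grid parameters; the paper's actual test cases are not given in the abstract) runs a standard leapfrog finite-difference scheme for the 1D wave equation: for Courant numbers at or below 1 the simulated pulse stays bounded, while above 1 the scheme's own numerical phenomena swamp the physical evolution.

```python
import numpy as np

def simulate_wave_1d(c=1.0, dx=0.01, courant=0.9, steps=200):
    """Leapfrog finite-difference scheme for u_tt = c^2 u_xx on [0, 1]
    with fixed (sharp) boundaries.  The Courant number s = c*dt/dx
    governs the numerical phenomena: for s <= 1 the scheme is stable,
    for s > 1 the simulation diverges from the true physical evolution."""
    nx = int(round(1.0 / dx)) + 1
    x = np.linspace(0.0, 1.0, nx)
    dt = courant * dx / c
    s2 = (c * dt / dx) ** 2
    u_prev = np.exp(-200.0 * (x - 0.5) ** 2)    # initial Gaussian pulse
    u = u_prev.copy()                           # zero initial velocity
    for _ in range(steps):
        u_next = np.empty_like(u)
        u_next[1:-1] = (2.0 * u[1:-1] - u_prev[1:-1]
                        + s2 * (u[2:] - 2.0 * u[1:-1] + u[:-2]))
        u_next[0] = u_next[-1] = 0.0
        u_prev, u = u, u_next
    return u

stable = simulate_wave_1d(courant=0.9)      # bounded, physically plausible
unstable = simulate_wave_1d(courant=1.5, steps=120)   # numerical blow-up
```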

Dan Iordache, Stefan Pusca, Ghiocel Toma
B–Splines and Nonorthogonal Wavelets

The necessary and sufficient conditions for (non-orthogonal) wavelet multiresolution analysis with an arbitrary (for example B-spline) scaling function are established.

The following results are obtained:

1. the general theorem stating necessary and sufficient conditions for the possibility of multiresolution analysis in the case of an arbitrary scaling function;

2. the reformulation of this theorem for the case of a B-spline scaling function from $W_{2}^{m}$;

3. the complete description of the family of wavelet bases generated by a B-spline scaling function;

4. the concrete construction of the unconditional wavelet bases (with minimal supports of wavelets) generated by B-spline scaling functions belonging to $W_{2}^{m}$.

These wavelet bases are simple and convenient for applications. In spite of their non-orthogonality, they possess the following advantages: 1) compactness of the set $\mbox{supp\,}\psi$ and minimality of its measure; 2) simple explicit formulas for the change of level. These advantages compensate for the non-orthogonality of the described bases.
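The B-spline scaling functions in question can be evaluated with the standard recursion. The snippet below is a generic illustration (not the paper's construction) showing two properties that make them convenient as scaling functions: compact support and the partition-of-unity property of their integer translates.

```python
def cardinal_bspline(m, t):
    """Cardinal B-spline N_m of order m (degree m - 1) at t, via the
    standard recursion
        N_m(t) = (t * N_{m-1}(t) + (m - t) * N_{m-1}(t - 1)) / (m - 1),
    with N_1 the indicator function of [0, 1).  B-splines of order >= 2
    are the classic example of a non-orthogonal scaling function."""
    if m == 1:
        return 1.0 if 0.0 <= t < 1.0 else 0.0
    return (t * cardinal_bspline(m - 1, t)
            + (m - t) * cardinal_bspline(m - 1, t - 1.0)) / (m - 1)

# N_2 is the hat function, peaking at t = 1 with value 1 ...
assert cardinal_bspline(2, 1.0) == 1.0
# ... and its integer translates form a partition of unity.
assert abs(sum(cardinal_bspline(2, 1.7 - k) for k in range(-3, 4)) - 1.0) < 1e-12
```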

Nikolay Strelkov
Optimal Wavelets

The complete description of wavelet bases is given such that each of them is generated by a fixed function whose Fourier image is the characteristic function of some set. In particular, for the case of Sobolev spaces, wavelet bases with the following property of universal optimality are constructed: the subspaces generated by these functions are extremal for the projection-net widths (if n = 1, then also for Kolmogorov widths) of the unit ball in $W^m_2({\mathbb{R}}^n)$ with the $W^s_2({\mathbb{R}}^n)$-metric for the whole scale of Sobolev classes simultaneously (i.e., for all s, m ∈ ℝ such that s < m). Some results concerning the completeness and basis property of exponential systems are established in passing.

Nikolay Strelkov, Vladimir Dol’nikov
Dynamics of a Two-Level Medium Under the Action of Short Optical Pulses

Optical Bloch equations are used to describe the dynamics of a two-level atomic medium subjected to ultrafast optical pulses. Starting with an uninverted system, the final atomic state is numerically studied. The generalization of the Bloch equations for a dense medium such that induced near dipole-dipole interactions are significant can result in a switching behavior of the atomic population inversion. Based on numerical investigations for typical pulse shapes, the mechanism of switching is explained in terms of dynamical systems theory.
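The optical Bloch dynamics described here can be integrated numerically along the following lines. The sketch below uses a standard damping-free Bloch system on resonance, without the near dipole-dipole terms the paper adds for dense media, so it only reproduces the textbook baseline: a resonant π pulse drives an uninverted atom to full inversion.

```python
import numpy as np

def bloch_rhs(state, omega, delta):
    """Damping-free optical Bloch equations in the rotating frame:
    u' = delta*v,  v' = -delta*u + omega*w,  w' = -omega*v."""
    u, v, w = state
    return np.array([delta * v, -delta * u + omega * w, -omega * v])

def rk4_step(state, omega, delta, dt):
    """One classical Runge-Kutta (RK4) step."""
    k1 = bloch_rhs(state, omega, delta)
    k2 = bloch_rhs(state + 0.5 * dt * k1, omega, delta)
    k3 = bloch_rhs(state + 0.5 * dt * k2, omega, delta)
    k4 = bloch_rhs(state + dt * k3, omega, delta)
    return state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def apply_pulse(area, steps=2000):
    """Resonant square pulse of the given area acting on an atom that
    starts uninverted (w = -1); unit pulse duration, omega = area."""
    dt = 1.0 / steps
    state = np.array([0.0, 0.0, -1.0])
    for _ in range(steps):
        state = rk4_step(state, area, 0.0, dt)
    return state

u, v, w = apply_pulse(np.pi)   # pi pulse: population fully inverted
```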

Valerică Ninulescu, Andreea-Rodica Sterian
Nonlinear Phenomena in Erbium-Doped Lasers

The nonlinear dynamics of an erbium-doped fiber laser is explained based on a simple model of the ion pairs present in heavily doped fibers. The single-mode laser dynamics is reducible to four coupled nonlinear differential equations. Depending on the ion-pair concentration, the pumping level, and the photon lifetime in the laser cavity, numerical calculations predict cw, self-pulsing, and sinusoidal dynamics. The regions of these dynamics in the space of the laser parameters are determined.

Andreea Sterian, Valerică Ninulescu

Internet Comunications Security (WICS) Workshop

An e-Lottery Scheme Using Verifiable Random Function

A number of e-lottery schemes have been proposed; however, none of them satisfies all the identified requirements. In particular, some of them require a certain subset of players to remain online, or a trusted third party (TTP) to exist, in order to generate the winning number(s), and some suffer from the ticket-forgery attack. In this paper, we propose a new e-lottery scheme based on a Verifiable Random Function that satisfies all the identified requirements without the presence of a TTP, while the result of the winning-number generation remains publicly verifiable.
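For contrast with the paper's VRF-based approach, what "publicly verifiable generation" means can be sketched with a naive commit-then-reveal draw (this is not the paper's scheme): everyone can recompute the winning number from the revealed seeds, but, unlike the VRF construction, every player must come back online to reveal, which is precisely the kind of participation requirement the paper sets out to remove.

```python
import hashlib

def commit(seed: bytes) -> bytes:
    """Binding commitment to a player's random seed (published first)."""
    return hashlib.sha256(b"commit|" + seed).digest()

def winning_number(seeds, n_tickets: int) -> int:
    """Winning number derived from all revealed seeds; anyone can
    recompute it, so the draw is publicly verifiable."""
    h = hashlib.sha256()
    for s in sorted(seeds):            # order-independent combination
        h.update(s)
    return int.from_bytes(h.digest(), "big") % n_tickets

# Phase 1: players publish commitments.  Phase 2: they reveal the seeds;
# everyone checks the commitments and recomputes the winning number.
seeds = [b"player-one-seed", b"player-two-seed", b"player-three-seed"]
commitments = [commit(s) for s in seeds]
winner = winning_number(seeds, n_tickets=1000)
```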

Sherman S. M. Chow, Lucas C. K. Hui, S. M. Yiu, K. P. Chow
Related-Mode Attacks on Block Cipher Modes of Operation

In this paper, we present a generalization of the notion of the recently proposed related-cipher attacks. In particular, we show that when the cryptanalyst has access to an oracle under one mode, then almost all other related-cipher modes can be attacked with ease. Typically only one chosen plaintext/ciphertext query is required, while computational complexity is negligible.
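The core observation behind such attacks can be demonstrated with a toy cipher: because each CBC ciphertext block is the block-cipher encryption of the plaintext block XORed with the previous ciphertext block, a single chosen query to an ECB oracle under the same key confirms a guessed plaintext block. The "cipher" below is a deliberately toy, non-invertible, hash-based stand-in, used only to exhibit the relation between the two modes; it is not from the paper.

```python
import hashlib

BLOCK = 16

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    """Toy, non-invertible stand-in for a block cipher (demo only)."""
    return hashlib.sha256(key + block).digest()[:BLOCK]

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def ecb_oracle(key: bytes, plaintext: bytes) -> bytes:
    return b"".join(toy_block_encrypt(key, plaintext[i:i + BLOCK])
                    for i in range(0, len(plaintext), BLOCK))

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    out, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        prev = toy_block_encrypt(key, xor(plaintext[i:i + BLOCK], prev))
        out.append(prev)
    return b"".join(out)

# The adversary observes a CBC ciphertext (and its IV) and wants to
# confirm a guessed plaintext block.  Since C1 = E_K(P1 XOR IV), one
# chosen query to an ECB oracle under the same key settles it.
key, iv = b"K" * BLOCK, b"I" * BLOCK
c1 = cbc_encrypt(key, iv, b"attack at dawn!!")[:BLOCK]
guess_ok = ecb_oracle(key, xor(b"attack at dawn!!", iv)) == c1
```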

Raphael C. –W. Phan, Mohammad Umar Siddiqi
A Digital Cash Protocol Based on Additive Zero Knowledge

In this paper, we introduce the concept of Additive Non-Interactive Zero Knowledge (NIZK). We extend the notion of NIZK proofs to include the prover's identity as part of the theorem being proved. An additive proof allows a verifier to construct a new proof of knowledge using the information from an old proof. Intuitively, an additive proof is a proof of knowledge of knowledge. As an application of this concept, we propose a digital cash scheme with transferable coins.

Amitabh Saxena, Ben Soh, Dimitri Zantidis
On the Security of Wireless Sensor Networks

Wireless sensor networks are extremely vulnerable to any kind of internal or external attack, due to factors such as resource-constrained nodes and the lack of tamper-resistant packaging. As a result, security must be an important factor to keep in mind when designing the infrastructure and protocols of sensor networks. In this paper we survey the state of the art in sensor network security and highlight the open areas of research.

Rodrigo Roman, Jianying Zhou, Javier Lopez
Dependable Transaction for Electronic Commerce

Electronic transactions have become common practice in real-world business. This paper focuses on the issue of dependability in critical transactions such as electronic payment and electronic contract signing. Recent fair protocols can recover transactions from network crashes, but cannot survive local system crashes. A two-party dependable transaction protocol is proposed in which both parties can recover the transaction from network and local system failures in a transparent way: after recovery, the outcome messages are just the same as those from a successful run of the transaction.

Hao Wang, Heqing Guo, Manshan Lin, Jianfei Yin, Qi He, Jun Zhang
On the Security of a Certified E-Mail Scheme with Temporal Authentication

Certified e-mail is a value-added service for standard e-mail systems, in which the intended recipient gets the mail content if and only if the mail originator receives a non-repudiation evidence that the message has been received by the recipient. As far as security is concerned, fairness is one of the most important requirements. Recently, Galdi and Giordano (2004) presented an optimistic protocol for certified e-mail with temporal authentication. In this paper, we analyze their protocol and demonstrate that it cannot achieve true fairness and has some other weaknesses. We further propose the improvements to avoid those security problems.

Min-Hua Shao, Jianying Zhou, Guilin Wang
Security Flaws in Several Group Signatures Proposed by Popescu

In recent years, Popescu et al. proposed several group signature schemes [8, 9, 10, 11] based on the Okamoto-Shiraishi assumption. Their schemes are claimed to be secure. However, we identify several security flaws in these schemes and show that they are all insecure. By exploiting those flaws, anybody (not necessarily a group member) can forge valid group signatures on arbitrary messages of his/her choice. In other words, these schemes are universally forgeable.

Guilin Wang, Sihan Qing
A Simple Acceptance/Rejection Criterium for Sequence Generators in Symmetric Cryptography

A simple method of checking the degree of balancedness in keystream generators for cryptographic applications has been developed. The procedure is based exclusively on the handling of bit-strings by means of logic operations, and can be applied to standard generators proposed and published in the open literature (combinational generators, multiple-clocking generators, irregularly clocked generators). The time and memory requirements are negligible. The method developed here is believed to be a first selective criterion for the acceptance or rejection of this type of generator in symmetric cryptography.
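A balancedness check of the kind described, using only logic operations on the bit-string, can be sketched as follows. The LFSR serving as test generator is a toy example of my own, not one of the standard generators the paper targets.

```python
def balancedness(bits: str) -> float:
    """Fraction of ones in a keystream segment, computed with logic
    operations on the bit-string alone (x &= x - 1 clears one set bit)."""
    x, ones = int(bits, 2), 0
    while x:
        x &= x - 1
        ones += 1
    return ones / len(bits)

def lfsr_stream(taps, state, n):
    """Keystream of a simple Fibonacci LFSR (a toy generator; the paper
    addresses combinational and clock-controlled generators)."""
    out = []
    for _ in range(n):
        out.append(state[-1])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]
    return "".join(map(str, out))

# A maximal-length 4-bit LFSR emits 8 ones and 7 zeros per period of 15,
# i.e. it is as balanced as an odd-length sequence can be.
stream = lfsr_stream(taps=[0, 3], state=[1, 0, 0, 1], n=15)
```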

Amparo Fúster-Sabater, Pino Caballero-Gil
Secure Electronic Payments in Heterogeneous Networking: New Authentication Protocols Approach

Recent research efforts have been directed towards keeping heterogeneous networking transparent under the powerful all-IP concept. As an example, a current global standardization initiative specifies 3G cellular system to wireless LAN inter-working. On the other hand, smart cards are presented as devices powerful enough to perform strong authentication at the lower layers of the protocol stack. Our work proposes a novel reference model and a scenario of applicability for secure electronic payment in this environment. The impact on trust relations is assessed, and a set of authentication requirements is provided. Finally, a new approach based on end-to-end layer-2 authentication protocols is adapted to this proposal, considering the most interesting improvements in the authentication mechanisms applicable to this context.

J. Torres, A. Izquierdo, A. Ribagorda, A. Alcaide

Component Based Software Engineering and Software Process Model Workshop

Software Reliability Measurement Using a Software Reliability Growth Model in Testing

In this paper, we study software reliability measurement methods and reliability testing metrics. Software reliability is very important, but it is very difficult to test for software reliability measurement. We therefore describe the software reliability metrics of ISO/IEC 9126 and introduce the Gamma-Lomax software reliability model for multiple-error debugging. We calculate the software reliability measures (reliability metrics, parameter estimation, etc.) and introduce a method for measuring the software reliability quality of a software product.

Hye-Jung Jung, Hae-Sool Yang
Thesaurus Construction Using Class Inheritance

A lot of methodologies have been proposed for component retrieval. Among them, the thesaurus concept has been introduced for similar-component retrieval. In this paper, for efficient component retrieval, we classify classes using the concept of the inheritance relation, apply fuzzy logic to the thesaurus method, and construct an object-oriented thesaurus. The proposed method can express the category between concepts automatically, calculate the fuzzy degree between classes by comparing the matching weight with the mismatching weight of each class and each category, and finally construct the thesaurus. By using the classes of a component in component retrieval, candidate components can be retrieved in priority order by fuzzy similarity. The retrieval performance was also improved greatly by the thesaurus and by choosing the most suitable threshold value through simulation.
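The fuzzy-similarity retrieval step can be illustrated with a small sketch. The weighting formula below (minimum overlap for the matching weight, absolute difference for the mismatching weight) is an illustrative reading of the abstract, not the paper's exact definition, and the class features are hypothetical.

```python
def fuzzy_similarity(a, b):
    """Fuzzy degree between two classes from matching versus mismatching
    feature weights (illustrative formula, not the paper's own)."""
    features = set(a) | set(b)
    match = sum(min(a.get(f, 0.0), b.get(f, 0.0)) for f in features)
    mismatch = sum(abs(a.get(f, 0.0) - b.get(f, 0.0)) for f in features)
    total = match + mismatch
    return match / total if total else 1.0

def retrieve(query, candidates, threshold=0.5):
    """Rank candidate components by fuzzy similarity to the query class,
    keeping only those above a threshold (priority-ordered retrieval)."""
    scored = [(name, fuzzy_similarity(query, feats))
              for name, feats in candidates.items()]
    return sorted([s for s in scored if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)

query = {"draw": 1.0, "resize": 0.5}
candidates = {
    "Shape":  {"draw": 1.0, "resize": 0.5},   # identical features
    "Widget": {"draw": 1.0},                  # partial match
    "Logger": {"log": 1.0},                   # unrelated
}
ranked = retrieve(query, candidates)
```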

Gui-Jung Kim, Jung-Soo Han
An Object Structure Extraction Technique for Object Reusability Improvement Based on Legacy System Interface

This paper suggests OSET (an Object Structure Extraction Technique for object reusability improvement based on a legacy system interface), a technique for reuse and reengineering that analyzes the legacy system interface to distill meaningful information and disassemble it into object units to be integrated into the next generation of systems. The OSET method consists of a procedure with four steps: 1) the interface use case analysis step, 2) the interface object dividing step, 3) the object structure modeling step, and 4) the object model integration step. In step 1, the interface structure and information about the interaction between the user and the legacy system are obtained. In step 2, the interface information is divided into semantic fields. In step 3, the structural and collaborative relationships among interface objects are studied and modeled. Finally, step 4 integrates the models and improves the integrated model at a higher level.

Chang-Mog Lee, Cheol-Jung Yoo, Ok-Bae Chang
Automatic Translation From Requirements Model into Use Cases Modeling on UML

Given the investment organizations are making in use cases and the increasing use of computers to control safety-critical applications, research on integrating use cases and safety considerations is in high demand. However, approaches for incorporating safety analysis into use case modeling are currently scarce. In this paper, we present an approach to integrating safety requirements into use case modeling. We demonstrate how deductive and inductive safety techniques can be used hand-in-hand with use cases. An application of the proposed approach facilitates early hazard identification and assessment, as well as the elicitation and tracing of safety requirements through the entire development process. The proposed approach is illustrated by a realistic case study: a liquid-handling workstation.

Haeng-Kon Kim, Youn-Ky Chung
A Component Identification Technique from Object-Oriented Model

Today’s software system environment requires rapid development and high productivity. In order to satisfy these requirements, research has been devoted to the development of software reuse technology, and thanks to these advances, component-based software development is now common. One of the main issues raised in component-based development (CBD) is how to identify reusable and independent components. Existing methodologies have dealt with the problem based only on developers’ heuristics, so that it is difficult for ordinary developers to identify the components. Therefore, in this paper, we propose a new technique to identify business components based on system components. The proposed technique applies the characteristics of, and degree of dependency between, classes in an object-oriented model. We also present a case study and experimental results to prove the practical use of our technique. We examined various examples to obtain objective foundations.

Mi-Sook Choi, Eun-Sook Cho
Retrieving and Exploring Ontology-Based Human Motion Sequences

A framework for the semantic annotation of human motion sequences is proposed in this paper. Motion capture technology is widely used for producing animation, but it has a significant weakness due to the lack of an industry-wide standard for archiving and retrieving motion capture data. It is difficult for animators to retrieve the desired motion sequences from motion capture files, as there is no semantic annotation on already captured motion data. Our goal is to improve the reusability of motion capture data. To achieve our goal, first, we propose a standard format for integrating different motion capture file formats. Second, we define motion ontologies that are used to annotate and semantically organize human motion sequences. This ontology-based approach provides the means for discovering and exploiting the information and knowledge surrounding motion capture data.

Hyun-Sook Chung, Jung-Min Kim, Yung-Cheol Byun, Sang-Yong Byun
An Integrated Data Mining Model for Customer Credit Evaluation

Based on customer information relating to details of financing and payment histories from a financial institution, this study derived single data mining models using MLP, MDA, and DTM. The results obtained from these single models were subsequently compared with the results from an integrated model developed using GA. This study not only verifies existing single models but also attempts to overcome their limitations. While our comparative analysis of single models for the purpose of identifying the best-fit model relies upon existing techniques, this study presents a new methodology for building an integrated data mining model using GA.

Kap Sik Kim, Ha Jin Hwang
A Study on the Component Based Architecture for Workflow Rule Engine and Tool

Component-based development and architecture technology have the potential to be more powerful than traditional approaches. In this paper, we propose four views for architecture development: the use case view, logical view, component view, and implementation view. We present a component-based architecture through the ABCD pattern from a component viewpoint. In addition, we apply the four viewpoints to workflow rule and tool development and propose a design for each. We present the user interface of the developed workflow engine and tool in the implementation view. It allows the various stakeholders to find what they want to know about the software architecture. This separation of architectural viewpoints accurately reflects practice in the development of real software systems. We also expect it to decrease implementation complexity and to improve reuse, reconfiguration, and productivity in workflow systems and similar domains.

Ho-Jun Shin, Kwang-Ki Kim, Bo-Yeon Shim
A Fragment-Driven Process Modeling Methodology

In this paper, we propose an advanced modeling approach, called the fragment-driven process modeling methodology, that enables several real actors/workers to cooperatively define a process model. We also suggest a feasible design that realizes the methodology cooperatively, maximizing efficiency and the involvement of real workers in the process modeling work. In traditional approaches, the modeling work is done by a single designer who has to know all of the detailed and complex information needed for a process model, such as relevant data, organizational data, roles, activities, related software and programs, scripts, etc. However, as processes have recently become more complicated and large-scale, this approach is hardly reasonable anymore. Therefore, we propose a more realistic approach that enables several real actors/workers to cooperatively define a process model, disclosing just the required information in a completely distributed environment. In this approach, the actors need to define only their own activities, not the whole model; the system then gathers these partial sub-models (called process fragments) and finally composes the complete process model. We strongly believe that the methodology is applicable and valuable for cooperatively modeling not only intra-organizational processes but also cross-organizational e-business processes, such as SCM, e-Commerce, e-Logistics, and so on.

Kwang-Hoon Kim, Jae-Kang Won, Chang-Min Kim
A FCA-Based Ontology Construction for the Design of Class Hierarchy

One of the main tasks of object-oriented software designers is the design of the class hierarchy and the relationships among classes. Since there are many conceptual similarities with the design of an ontology, and an ontology is semantically richer than a UML class model, it makes sense to put the emphasis on ontology design. That is, an object-oriented software designer can design an ontology by organizing object classes in a class hierarchy and creating relationships among classes. UML models can then be generated from the ontology.

In this paper, we introduce Formal Concept Analysis (FCA) as the basis for a practical and well-founded methodological approach to the construction of ontologies. We present a semi-automatic, graphical, and interactive tool to support this approach. The purpose of this work is to provide semi-automatic methods for ontology developers. We describe the basic ideas of the work and its current state.

Suk-Hyung Hwang, Hong-Gee Kim, Hae-Sool Yang
Component Contract-Based Formal Specification Technique

When we analyze a business domain, we have to decide which business concepts are to be encapsulated into a component and find which business concepts are to be built using a reusable component. Also, as a component is reused in the form of a black box, the reuser must have detailed information about the component, such as the functional and non-functional performance necessary to reuse or integrate it. We therefore propose a formal approach to designing a robust component. First, we analyze a business domain by using Z and category theory. Second, we extract the components and the interfaces from the analysis results. Lastly, we add component contracts (functional and non-functional performance) to the result. We specify business concepts based on DbC, which is broadly used to specify the behavior of an interface in the object-oriented area. We also define rules for the extraction of components and component contracts from the specification. In particular, we use category theory to analyze the relations between components.

Ji-Hyun Lee, Hye-Min Noh, Cheol-Jung Yoo, Ok-Bae Chang
A Business Component Approach for Supporting the Variability of the Business Strategies and Rules

In this paper we present how to use rules for managing the variability of a business. Business components can be classified as business domain components and business process components. In designing a component, the most important thing is to identify the variability among the same business components. If we put the variability into the components, we have to modify those components whenever we reuse them, and modifying a component is not easy, since its user might not be its writer. We extend component properties to business rules, defining component properties that customize the component execution environment. In this way, we can express different business strategies as rules. With our rule engine, the component can use the rules at execution time, so that the user of the component can redefine or rewrite the rules according to their business context. Based on this approach, the reusability of components is improved. In this paper we describe the new component architecture, including the rule component, our rule engine, and a case study.

Jeong Ah Kim, YoungTaek Jin, SunMyung Hwang
A CBD Application Integration Framework for High Productivity and Maintainability

In a rapidly evolving e-business environment with frequent employee turnover and limited resources, a large volume of web-based software development projects require the CBD method for high reusability and flexibility. This paper presents the architecture of our CBD application integration framework, which has been used for developing large-scale e-business applications with high development productivity and maintainability. The framework is flexible and dynamically extensible, so it can be customized to various applications. It provides a development and maintenance toolkit as well as common business logic that can be reused with a few adaptation steps. The framework addresses a number of software architectural qualities in various respects.

Yonghwan Lee, Eunmi Choi, Dugki Min
Integrated Meta-model Approach for Reengineering from Legacy into CBD

There is an increasing interest in migrating legacy systems to new hardware platforms and to new software development paradigms, owing to high maintenance costs and a lack of documentation. In order to migrate or transform legacy systems, various approaches, such as screen scraping, wrapping, semi-development, and re-development, as well as tools and methodologies, have been introduced. However, transformation at the architecture or requirements level has not been addressed, because most of those approaches focus on code-level transformation or only a few model-level transformations. In this paper, we suggest a meta-model driven approach applying a 3D space concept, which can be applied in the architecture and requirements phases. The proposed integrated model drives seamless migration or co-evolution from code to architecture in reverse engineering and from architecture to code in forward engineering.

Eun Sook Cho
Behavior Modeling Technique Based on EFSM for Interoperability Testing

With the rapid growth of network technology, two or more products from different vendors are integrated and interact with each other to perform a certain function in the latest systems. However, there are cases where products from different vendors, or even from the same vendor, do not interoperate properly. Thus, interoperability testing is considered essential for the correctness of integrated systems. Interoperability testing tests the ability of software and hardware on different machines from different vendors to share data. Existing research on interoperability testing usually focuses on optimal test scenario generation applying graph and automata theory. Most of this research models communication system behavior using EFSMs (Extended Finite State Machines) and uses the EFSM as the input of a test scenario generation algorithm. There are many studies on systematic and optimal test case generation algorithms using EFSMs, but the study of generating the EFSM model itself, which is the foundation of test scenario generation, is not sufficient. This paper proposes an EFSM generation technique based on use case specifications, as a foundation of test scenario generation for more complete interoperability testing. The EFSM generated through the proposed technique can be used as an input to the EFSM-based test scenario generation algorithms proposed in other studies.
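The EFSM models used as input for test scenario generation pair plain states with guarded, variable-updating transitions. A minimal sketch of the data structure (the connection protocol used as the example is hypothetical, not from the paper):

```python
class EFSM:
    """Minimal Extended Finite State Machine: each transition carries a
    guard over context variables and an update action."""

    def __init__(self, initial, variables):
        self.state = initial
        self.vars = dict(variables)
        self.transitions = []          # (src, event, guard, action, dst)

    def add(self, src, event, guard, action, dst):
        self.transitions.append((src, event, guard, action, dst))

    def fire(self, event):
        """Take the first enabled transition for the event, if any."""
        for src, ev, guard, action, dst in self.transitions:
            if src == self.state and ev == event and guard(self.vars):
                action(self.vars)
                self.state = dst
                return True
        return False                   # no enabled transition

# Hypothetical connection protocol: 'open' is guarded by a retry budget.
m = EFSM("idle", {"retries": 2})
m.add("idle", "open", lambda v: v["retries"] > 0,
      lambda v: v.update(retries=v["retries"] - 1), "connected")
m.add("connected", "close", lambda v: True, lambda v: None, "idle")
```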

Hye-Min Noh, Ji-Hyun Lee, Cheol-Jung Yoo, Ok-Bae Chang
Automatic Connector Creation for Component Assembly

In this paper, we present an automatic connector creation technique that connects and assembles components without method calls or component modification. The connector is automatically created from the definitions of the component ports, specification, and architecture, and components are assembled through the created connector.

Jung-Soo Han, Gui-Jung Kim, Young-Jae Song
MaRMI-RE : Systematic Componentization Process for Reengineering Legacy System

Most legacy systems have many problems in accommodating new technologies or being expanded or changed in accordance with complicated business requirements, since they lack standardization, openness, a distributed architecture, and so on. It is therefore necessary to reengineer legacy systems to maximize their utility as an important asset of an organization. In this paper, we provide a componentization process, MaRMI-RE (Magic and Robust Methodology Integrated-ReEngineering), for reengineering legacy systems into component systems so that they can continue to be developed to comply with varying business and technical environments. We define and specify concrete procedures, work products, guidelines, and considerations for the transformation into a component-based system with a well-defined architecture and more reusable assets.

Jung-Eun Cha, Chul-Hong Kim
A Study on the Mechanism for Mobile Embedded Agent Development Based on Product Line

In most mobile embedded agent systems (MEAS), agents are required to achieve their own goals. An agent's goals, however, can conflict with others' either when agents compete with each other to achieve a common goal or when they have to use a set of limited resources to accomplish divergent goals. In either case, agents need to be designed to reach a mutually acceptable state where they can avoid goal conflicts through negotiation with others. In this paper, we consider an ABCD architecture (shorthand for Architecture platform, Business, Common, and Domain) as a core component of agents' mental attitudes, and represent resource-bounded ABCD agents in a logic programming framework. We propose an algorithm in which ABCD agents with different goals solve their problems through negotiation, resolving goal conflicts. Finally, we develop a negotiation meta-language to show the effectiveness of the proposed negotiation system.

Haeng-Kon Kim
Frameworks for Model-Driven Software Architecture

In every new development, new issues are taken into account to cope with environmental or stakeholder needs that evolve over time. Several approaches have faced this problem. Some of them exhibit this evolution ability using static design/compile-time techniques, whereas others introduce the ability into the system at run-time. In both cases, however, evolving requirements give rise to the need for adaptability, which is inherent in every software development. This paper sketches our work in this field, which concerns the MDSAD (Model Driven Software Architecture Development) methodology for guiding the reflexive development of architectures from software requirements. In particular, we detail the first step of this methodology, i.e., the definition of the goals model, whose constituents are the fundamental basis for the overall process defined in MDSAD, proving its suitability for obtaining traceable architectural models. The value of our work lies both in its ability to specify and manage positive and negative interactions among goals and in its capability to trace low-level details back to high-level concerns.

Soung Won Kim, Myoung Soo Kim, Haeng Kon Kim
Parallel and Distributed Components with Java

This paper presents an environment for supporting distributed applications using shared components in Java within a heterogeneous environment. In particular, it offers a set of components such as lists, queues, and stacks that can be shared across a network of heterogeneous machines in the same way as in DSM systems. Sharing is achieved without recourse to Java RMI or component proxies as in other component systems. An implementation of the environment is provided, together with performance timings.

Chang-Moon Hyun
CEB: Class Quality Evaluator for BlueJ

Today most programming courses for beginners use an object-oriented language. This has led to many new learning tools, theories, and methods for teaching object orientation. One such tool is BlueJ, which was developed to teach Java and object orientation to beginners. BlueJ has no support for program quality evaluation, and this paper describes the development of CEB, a BlueJ extension for class quality evaluation based on the CK metrics. The CEB is developed as a plug-in for BlueJ and designed to be readily used by Java programming beginners. This paper presents BlueJ and its extensions, and describes the design and implementation of the CEB.

Yu-Kyung Kang, Suk-Hyung Hwang, Hae-Sool Yang, Jung-Bae Lee, Hee-Chul Choi, Hyun-Wook Wee, Dong-Soon Kim
Workflow Modeling Based on Extended Activity Diagram Using ASM Semantics

The Unified Modeling Language (UML) provides rich notations for representing and analyzing the architecture and behavior of systems. Among these notations, the UML activity diagram is well known for describing systems' dynamic behavior, and it is useful for modeling business processes and workflow. Currently, the UML semantics is informally defined in plain text and is often unclear, ambiguous, or even contradictory; with only the guidance provided by the OMG, it is difficult to give the UML activity diagram the precise semantics that workflow systems require. In this paper, an alternative approach using Abstract State Machines (ASM) to formalize UML activity diagrams is presented. We propose a workflow modeling methodology that applies ASM semantics to the activity diagram. Through an exact definition of the formal semantics based on ASM, workflow can be modeled effectively.

Eun-Jung Ko, Sang-Young Lee, Hye-Min Noh, Cheol-Jung Yoo, Ok-Bae Chang
Unification of XML DTD for XML Documents with Similar Structure

XML documents often have different DTDs in spite of having a similar structure and being logically the same kind of document. As a result, such XML documents may be given different database schemas and stored in different databases, so that every relevant database must be accessed to process a user's query, which seriously decreases retrieval efficiency. To solve this problem, we propose an algorithm that unifies the DTDs of such XML documents using finite automata and a tree structure. Finite automata are well suited to representing the repetition operators and connectors of a DTD and provide a simple representation of it; by using them, we are able to reduce the complexity of the algorithm. We apply the proposed algorithm to unify the DTDs of science journals.
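As a rough sketch of the automaton view used here (illustrative only, not the unification algorithm itself), a DTD content model with connectors (sequence ',', choice '|') and the repetition operator '*' can be compiled into an epsilon-NFA via the Thompson construction and then used to check a sequence of child elements. The nested-tuple encoding of the content model is an assumption of this sketch:

```python
from itertools import count

_new = count()  # global state-id generator

def compile_nfa(model):
    """Return (start, accept, transitions) for a content-model tree.
    transitions: state -> list of (symbol or None for epsilon, next state)."""
    trans = {}
    def node():
        s = next(_new); trans[s] = []; return s
    def build(m):
        if isinstance(m, str):                       # element name
            s, t = node(), node()
            trans[s].append((m, t))
            return s, t
        op, *parts = m
        if op == 'seq':                              # a, b, ...
            s, t = build(parts[0])
            for p in parts[1:]:
                s2, t2 = build(p)
                trans[t].append((None, s2))
                t = t2
            return s, t
        if op == 'or':                               # a | b | ...
            s, t = node(), node()
            for p in parts:
                ps, pt = build(p)
                trans[s].append((None, ps))
                trans[pt].append((None, t))
            return s, t
        if op == 'star':                             # a*
            s, t = node(), node()
            ps, pt = build(parts[0])
            trans[s] += [(None, ps), (None, t)]
            trans[pt] += [(None, ps), (None, t)]
            return s, t
        raise ValueError(op)
    s, t = build(model)
    return s, t, trans

def accepts(nfa, children):
    """Does the child-element sequence conform to the content model?"""
    start, accept, trans = nfa
    def eclose(states):
        stack, seen = list(states), set(states)
        while stack:
            q = stack.pop()
            for sym, r in trans[q]:
                if sym is None and r not in seen:
                    seen.add(r); stack.append(r)
        return seen
    cur = eclose({start})
    for c in children:
        cur = eclose({r for q in cur for sym, r in trans[q] if sym == c})
    return accept in cur
```

Unifying two DTDs can then operate on these automata (e.g. via product or union constructions), which is where the paper's algorithm comes in.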

Chun-Sik Yoo, Seon-Mi Woo, Yong-Sung Kim
Secure Payment Protocol for Healthcare Using USIM in Ubiquitous

The ubiquitous environment aims to provide user-friendly services, and it is a promising area that creates new value by grafting in diverse Internet business models. Healthcare service, which deals with health and the quality of life, is one area that actively features ubiquitous business models. This paper introduces remote-controlled hospital treatment services and payment protocols for ubiquitous healthcare services. They make use of mobile terminals and USIM under different ubiquitous network environments. Through these new services, a more efficient treatment system will be studied and developed by connecting mobile terminals, USIM, and networks.

Jang-Mi Baek, In-Sik Hong
Verification of UML-Based Security Policy Model

Since the security policy model plays an important role in any secure information system, its specification has been studied extensively. In particular, UML-based specification has been widely used because of its visual characteristics. Although visual specifications are easy to write, it is difficult to verify whether desired properties hold in a given specification. This paper shows our techniques for verifying UML-based specifications with a running example.

Sachoun Park, Gihwon Kwon

Computer Graphics and Geometric Modeling (TSCG 2005) Workshop

From a Small Formula to Cyberworlds

Cyberworlds created on the web make it possible to provide personal mentoring to students with different cultural and educational backgrounds. The Virtual Campus of Nanyang Technological University is designed to be such a cyberworld: a place for research and education, fun and immersion in campus life. Besides standard VRML, its hybrid function-based extension is used in the design of the Virtual Campus. In place of thousands of polygons, small formulas are used to constitute complex geometric shapes and appearances. The Collaborative Shape Modeling Laboratory, which is part of the Virtual Campus, is based on this extension. It was developed to help students with their computer graphics assignments.

Alexei Sourin
Visualization and Analysis of Protein Structures Using Euclidean Voronoi Diagram of Atoms

A protein consists of amino acids, and an amino acid consists of atoms. Given a protein, understanding its functions is critical for designing new drugs, treating diseases, and so on. Recent research has shown that the structure of a protein directly influences its functions; hence, there has been a strong research trend towards understanding the geometric structure of proteins. In this paper, we present the Euclidean Voronoi diagram of the atoms constituting a protein and show how this computational tool can contribute effectively and efficiently to various important problems in biology. Examples include the computation of the molecular surface, the solvent accessible surface, the extraction of pockets, the interaction interface, and the convex hull.
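In the Euclidean Voronoi diagram of spheres, a point belongs to the atom whose surface is nearest, i.e. the atom minimizing ||p − c|| − r rather than the plain center distance. A minimal sketch (hypothetical coordinates and radii; the 1.4 Å probe is the conventional water-probe radius):

```python
import math

def nearest_atom(p, atoms):
    """Index of the atom owning point p in the Euclidean Voronoi diagram
    of spheres: minimize distance to the atom surface, ||p - c|| - r.
    Each atom is (x, y, z, radius)."""
    return min(range(len(atoms)),
               key=lambda i: math.dist(p, atoms[i][:3]) - atoms[i][3])

def solvent_accessible(p, atoms, probe=1.4):
    """True if a probe sphere centered at p does not clash with any atom."""
    return all(math.dist(p, a[:3]) >= a[3] + probe for a in atoms)
```

The distance-minus-radius metric is what distinguishes this diagram from both the ordinary point Voronoi diagram and the power diagram.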

Deok-Soo Kim, Donguk Kim, Youngsong Cho, Joonghyun Ryu, Cheol-Hyung Cho, Joon Young Park, Hyun Chan Lee
C² Continuous Spline Surfaces over Catmull-Clark Meshes

An efficient method for generating a C² continuous spline surface over a Catmull-Clark mesh is presented in this paper. The spline surface is the same as the Catmull-Clark limit surface except in the immediate neighborhood of the irregular mesh points. The construction process consists of three steps: subdividing the initial mesh at most twice using the Catmull-Clark subdivision rules; generating a bi-cubic Bézier patch for each regular face of the resultant mesh; and generating a C² Gregory patch around each irregular vertex of the mesh. The union of all patches forms a C² spline surface. Differing from the previous methods proposed by Loop, DeRose and Peters, this method achieves overall C² smoothness rather than only C¹ continuity.

Jin Jin Zheng, Jian J. Zhang, Hong Jun Zhou, L. G. Shen
Constructing Detailed Solid and Smooth Surfaces from Voxel Data for Neurosurgical Simulation

This paper deals with a neurosurgical simulation system with precise volume rendering and smooth tactile sensation. In the system, an Octree-based hierarchical representation of volume data with continuous tri-cubic parametric functions, called volumetric implicit functions, and smooth boundary surfaces are introduced to provide detailed solids and smooth tactile sensation in an interactive environment. The volume data, represented as voxel data created from CT or MRI images, are divided into sub-volumes until the volumetric implicit functions can approximate the voxel values accurately. An Octree manages the divided volume and the parameters of the implicit functions in a hierarchical manner. Furthermore, smooth boundary surfaces are constructed by fitting points on a level surface of the implicit functions. In order to render solids in more detail than voxel precision when objects are zoomed in, sub-sampled voxels are generated using the implicit functions. For tactile sensation, a haptic device, the PHANToM, is used to realize a smooth reaction force, which is calculated from the surface normal and the distance from the position of an instrument to the nearest surface. Incision with tactile sensation can be executed by making the voxels underlying the instrument transparent when the reaction force exceeds a limit. Several experiments reveal the effectiveness of the proposed methods.

Mayumi Shimizu, Yasuaki Nakamura
Curvature Estimation of Point-Sampled Surfaces and Its Applications

In this paper, we propose a new approach to estimating curvature information for point-sampled surfaces. We estimate curvatures in terms of the extremal points of a one-dimensional energy function for discrete surfels (points equipped with normals) and a multi-dimensional energy function for discrete unstructured point clouds. Experimental results indicate that our approaches estimate curvatures faithfully and reflect subtle curvature variations. Some applications of curvature information, such as surface simplification and feature extraction for point-sampled surfaces, are given.

Yongwei Miao, Jieqing Feng, Qunsheng Peng
The Delaunay Triangulation by Grid Subdivision

This study presents an efficient algorithm for Delaunay triangulation by grid subdivision. The proposed algorithm shows superior performance in execution time compared to the incremental algorithm and the uniform grid method, mainly due to its efficient way of searching for a mate. In the proposed algorithm, uniform grids are divided into sub-grids depending on the density of points, and areas with a high chance of containing a mate are explored first. Most previous research has focused on theoretical aspects of triangulation, but this study presents empirical results of computer implementations in two and three dimensions.
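The mate search that dominates the running time can be sketched as follows. This is a simplified 2D uniform grid (the paper additionally subdivides dense cells into sub-grids); the ring-by-ring scan explores the cells nearest the query first and stops as soon as no unscanned cell can contain a closer point:

```python
import math
from collections import defaultdict

class UniformGrid:
    def __init__(self, points, cell):
        self.cell = cell
        self.buckets = defaultdict(list)
        for p in points:
            self.buckets[self._key(p)].append(p)

    def _key(self, p):
        return (int(p[0] // self.cell), int(p[1] // self.cell))

    def nearest(self, q):
        """Nearest point to q, scanning grid cells in expanding rings."""
        ci, cj = self._key(q)
        best, best_d = None, float('inf')
        for ring in range(0, 64):
            # scan cells at Chebyshev distance `ring` from q's cell
            for i in range(ci - ring, ci + ring + 1):
                for j in range(cj - ring, cj + ring + 1):
                    if max(abs(i - ci), abs(j - cj)) != ring:
                        continue
                    for p in self.buckets.get((i, j), ()):
                        d = math.dist(p, q)
                        if d < best_d:
                            best, best_d = p, d
            # points in unscanned cells are at distance >= ring * cell,
            # so the current best is provably correct once this holds
            if best is not None and best_d <= ring * self.cell:
                return best
        return best
```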

Si Hyung Park, Seoung Soo Lee, Jong Hwa Kim
Feature-Based Texture Synthesis

We introduce a new method for texture synthesis from regular and irregular example textures. In this paper, an enhanced patch-based algorithm is proposed to select patches with the best structural similarity and to avoid discontinuity at the boundary of adjacent patches. The method utilizes a feature-weighted function to measure structural similarity, and a re-synthesis technique is presented to reduce structural discontinuity. We conduct a comparison study to verify the proposed method; preliminary experiments show that it yields better results than other well-known methods on the examples used in this study.

Tong-Yee Lee, Chung-Ren Yan
A Fast 2D Shape Interpolation Technique

This paper proposes a computationally inexpensive 2D shape interpolation technique for two compatible triangulations. Each triangle in a triangulation is represented using a stick structure, and the intermediate shape of each triangle is interpolated using these sticks. All of the intermediate triangles are then assembled to obtain the intermediate shape of the triangulation according to a predetermined order. Our approach is inspired by Alexa et al.'s work [1], but is simpler and more efficient. Even though we ignore the local error, our approach can generate a satisfactory (as-rigid-as-possible) morph sequence like Alexa et al.'s.

Ping-Hsien Lin, Tong-Yee Lee
Triangular Prism Generation Algorithm for Polyhedron Decomposition

RP (Rapid Prototyping) is often called Layered Manufacturing because of its layer-by-layer building strategy. Layer building strategies fall into two categories: one based on 2D layers and the other based on 3D layers. A 2D layer is simply created by the intersection between the polyhedron and a slicing plane, whereas a 3D layer is created under constraints such as cuttability and manufacturability. Currently, 3D layers are generated using the boundary surface information in a native solid modeling format; however, most input data in Rapid Prototyping is polyhedral surface data. We propose a geometric algorithm that uses triangular prisms to create 3D layers. Examples are given to demonstrate its validity.

Jaeho Lee, JoonYoung Park, Deok-Soo Kim, HyunChan Lee
Tweek: A Framework for Cross-Display Graphical User Interfaces

Developers of virtual environments (VEs) face an often-difficult problem: users must have some way to interact with the virtual world. The VE application designers must determine how to map available inputs to actions within the virtual world. However, manipulating large amounts of data, entering alphanumeric information, or performing abstract operations may not map well to current VE interaction methods, which are primarily spatial. Furthermore, many VE applications are derived from mature desktop applications that typically have a very rich user interface (UI). This paper presents Tweek, a reusable, extensible framework for UI construction that allows use of the same UI on a desktop system, on a hand-held computer, or in an immersive 3D space. Designers can maintain interaction consistency across conventional visualization settings such as desktop systems and multi-screen immersive systems. This paper covers in detail the design of Tweek and its use as an input device for virtual environments.

Patrick Hartling, Carolina Cruz-Neira
Surface Simplification with Semantic Features Using Texture and Curvature Maps

We propose a polygonal surface simplification algorithm that preserves semantic features without user control. The semantic features of a model are important for human perception but insensitive to small geometric errors. Three kinds of maps, produced with an edge detector, are employed to extract these features. First, an image map is generated in which boundary lines represent changes of chroma in the texture image, found by the edge detector. Second, the discrete curvatures at 3D vertices are mapped to a curvature map, and this data is also analyzed by the edge detector. Finally, a feature map is generated by combining the image and curvature maps. By relating areas of the 2D map to corresponding areas of the 3D model, semantic features can be preserved after simplification. We demonstrate this experimentally.

Soo-Kyun Kim, Jung Lee, Cheol-Su Lim, Chang-Hun Kim
Development of a Machining Simulation System Using the Octree Algorithm

The overall goal of this paper is to develop a new algorithm, based on the octree model, that handles the geometric and mechanistic aspects of milling operations at the same time. To achieve high accuracy, fast computation, and low memory consumption, an advanced octree model is suggested. By adopting the supersampling technique of computer graphics, accuracy can be significantly improved at approximately equal computation time. The proposed algorithm can verify the NC machining process and estimate the material removal volume at the same time.

Y. H. Kim, S. L. Ko
A Spherical Point Location Algorithm Based on Barycentric Coordinates

An algorithm based on barycentric coordinates is presented to solve the point location problem in spherical triangulation meshes. During the preprocessing stage, a subdivision connectivity mesh is constructed to partition the spherical domain into subdivision regions. We then find a representation triangle from the triangle set of the original spherical mesh for each subdivision region. During the locating stage, we first find the subdivision region containing the query point p and select the corresponding representation triangle as the starting one for the walk. The barycentric coordinates are then used to extract local heuristic information about the location of p, so as to find the shortest path from the start triangle to the target one. In comparison with traditional algorithms, our approach has better time-space performance.
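A planar sketch of the barycentric-guided walk (the paper works on spherical triangles, where the same idea applies with spherical barycentric coordinates): a negative coordinate identifies the edge through which to leave the current triangle toward the query point. The triangle and neighbor tables here are hypothetical:

```python
def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    den = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / den
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / den
    return u, v, 1.0 - u - v

def locate(p, triangles, neighbors, start):
    """Walk from triangle `start` to the triangle containing p.
    triangles[t] = (a, b, c) vertex coordinates;
    neighbors[t][i] = triangle across from vertex i (None on the hull)."""
    t = start
    while True:
        coords = barycentric(p, *triangles[t])
        i = min(range(3), key=lambda k: coords[k])
        if coords[i] >= 0:            # all coordinates non-negative: inside
            return t
        t = neighbors[t][i]           # step through the most-violated edge
```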

Yong Wu, Yuanjun He, Haishan Tian
Realistic Skeleton Driven Skin Deformation

Skeleton-driven animation is a popular method for animating deformable human and creature characters; its main advantage is computational performance. However, it suffers from a number of problems, such as the collapsing elbow and the candy-wrapper joint. In this paper, we present a new method that solves these defects, reduces the animator's manual work while still allowing full control over the process, and realistically simulates the fat-bulge effect around a joint.

X. S. Yang, Jian J. Zhang
Implementing Immersive Clustering with VR Juggler

Continuous, rapid improvements in commodity hardware have allowed users of immersive visualization to employ high-quality graphics hardware, high-speed processors, and significant amounts of memory for much lower costs than would be possible with high-end, shared memory computers traditionally used for such purposes. Mimicking the features of a single shared memory computer requires that the commodity computers act in concert—namely, as a tightly synchronized cluster. In this paper, we describe the clustering infrastructure of VR Juggler that enables the use of distributed and clustered computers for the display of immersive virtual environments. We discuss each of the potential ways to synchronize a cluster for immersive visualization in use today. Then, we describe the VR Juggler cluster infrastructure in detail, and we show how it allows virtual reality application developers to combine various existing clustering techniques to meet the needs of their specific applications.

Aron Bierbaum, Patrick Hartling, Pedro Morillo, Carolina Cruz-Neira
Adaptive Space Carving with Texture Mapping

Space carving reconstructs a 3D object from multiple images, but existing algorithms rely on a regular grid which makes poor use of memory. By using the image information, adaptive space carving uses a recursively generated structure which reduces memory requirements and thus allows a finer grid. After reconstruction, models are triangulated to facilitate texture mapping. Experimental results show the enhanced appearance of models reconstructed in this way.

Yoo-Kil Yang, Jung Lee, Soo-Kyun Kim, Chang-Hun Kim
User-Guided 3D Su-Muk Painting

We present a technique for rendering animated 3D models in a Su-Muk painting style under user guidance. First, a user can sketch directly over 3D models by varying ink and water values, and we simulate the behavior of ink and water directly on the 3D model to obtain consistent shade information over it. After simulation, ink and water are spread appropriately over the 3D surface. Second, to achieve the real hand-crafted look of Su-Muk painting, the ink-water behavior is simulated again on the 2D screen, using the overall shade information from the previous 3D ink-water simulation. We demonstrate several images of Su-Muk painting produced with our user-guided drawing system.

Jung Lee, Joon-Yong Ji, Soo-Kyun Kim, Chang-Hun Kim
Sports Equipment Based Motion Deformation

We propose a novel scheme for generating the locomotion of sports equipment along with motion deformation. To accomplish this, we search the event frames in order to find the points where the character is in contact with the sports equipment (contact points) or not (split points). To find these points, it is necessary to locate the corresponding event points in the signal using the Discrete Fourier Transform. This technique generates a left- and right-hand cross analysis and sports-equipment locomotion adapted to the character motion. The locomotion of the sports equipment is changed by adjusting its properties, which may require the posture of the character to be modified. In conclusion, we are able to deform the motion of both the sports equipment and the character simply by adjusting the properties of the sports equipment. Our scheme can be used for the motion deformation of any character with associated sports equipment.

Jong-In Choi, Chang-Hun Kim, Cheol-Su Lim
Designing an Action Selection Engine for Behavioral Animation of Intelligent Virtual Agents

This paper presents a new action selection scheme for behavioral animation in computer graphics. The scheme provides a powerful mechanism for determining the sequence of actions to be performed by virtual agents emulating real-world life. In particular, the present contribution focuses on the description of the system architecture and some implementation issues. The performance of our approach is then analyzed by means of a simple yet illustrative example. Finally, some advantages of our scheme and a comparison with previous approaches are also briefly discussed.

F. Luengo, A. Iglesias
Interactive Transmission of Highly Detailed Surfaces

Transmitting highly detailed surfaces interactively, to meet the real-time requirements of large-scale model visualization on clients, is challenging. In this paper, we propose a novel approach to the interactive, viewpoint-dependent transmission of highly detailed surfaces. We first map the 3D surfaces onto the parameter space through surface parametrization, and geometry images (GIM) and a normal map atlas are obtained by regular re-sampling. A quadtree-based hierarchical representation is then constructed from the GIM. Since the hierarchical structure is regular, an efficient compression scheme can be applied to encode the structure and the vertices of its nodes. The encoded nodes can be transmitted in arbitrary order, giving extreme flexibility in transmission. By taking advantage of the normal texture atlas, the rendering quality of a partially transmitted model is greatly improved; only the geometry on the silhouette with respect to the current viewpoint needs to be refined and transmitted, so the amount of data transferred per frame is greatly reduced.

Junfeng Ji, Sheng Li, Enhua Wu, Xuehui Liu
Contour-Based Terrain Model Reconstruction Using Distance Information

In order to create three-dimensional terrain models, we reconstruct geometric models from the contour lines on a two-dimensional map. Previous methods divide a set of contour lines into simple matching regions and clefts. Since reconstructing clefts takes a long time, performance may degrade when manipulating complicated models. We propose a fast reconstruction method that, for simple matching regions, generates triangle strips by computing the distances of corresponding vertex pairs in adjacent slices. If there are branches or dissimilarities, it computes the midpoints of corresponding vertices and reconstructs the geometry of those areas by tiling the midpoints and remaining vertices. Experimental results show that our method reconstructs geometric models well and is faster than the previous method.
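For the simple matching regions, the strip generation can be sketched as a greedy tiling that always advances along the shorter diagonal between the two contours. The exact distance criterion and the cleft/branch handling in the paper differ, so this is illustrative only:

```python
import math

def tile(lower, upper):
    """Triangles (as vertex triples) connecting two open, matched
    contour polylines from adjacent slices."""
    i, j, tris = 0, 0, []
    while i < len(lower) - 1 or j < len(upper) - 1:
        if j == len(upper) - 1:            # upper contour exhausted
            advance_lower = True
        elif i == len(lower) - 1:          # lower contour exhausted
            advance_lower = False
        else:  # choose the shorter of the two candidate diagonals
            advance_lower = (math.dist(lower[i + 1], upper[j])
                             <= math.dist(lower[i], upper[j + 1]))
        if advance_lower:
            tris.append((lower[i], lower[i + 1], upper[j])); i += 1
        else:
            tris.append((lower[i], upper[j + 1], upper[j])); j += 1
    return tris
```

Each step emits exactly one triangle, so two polylines of m and n vertices always yield (m − 1) + (n − 1) triangles.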

Byeong-Seok Shin, Hoe Sang Jung
An Efficient Point Rendering Using Octree and Texture Lookup

As modern 3D scanning devices produce enormous amounts of point data, generating a triangle mesh becomes a time-consuming job. Furthermore, projected triangles can be smaller than pixel size, increasing the rasterization overhead. In recent years, point-based rendering has become an efficient method for rendering complex models. We propose an acceleration method for rendering point-based geometry. It resolves the visibility of point samples by taking advantage of an octree structure constructed directly from the point cloud. This enables graphics hardware to render a scene in a single pass, avoiding an additional pass for visibility computation. In addition, we present an efficient splatting technique that uses a lookup table of alpha textures, alleviating the pixel-processing load. The method achieves better performance than comparable approaches.
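The octree built directly from the point cloud can be sketched as follows: a leaf holds points until it exceeds a capacity, then splits into eight children. The capacity and the minimum cell size are assumptions of this sketch, not values from the paper:

```python
class Octree:
    def __init__(self, center, half, cap=8):
        self.center, self.half, self.cap = center, half, cap
        self.points, self.children = [], None

    def insert(self, p):
        if self.children is not None:
            return self.children[self._octant(p)].insert(p)
        self.points.append(p)
        if len(self.points) > self.cap and self.half > 1e-6:
            self._split()

    def _octant(self, p):
        """Child index from the sign of p relative to the cell center."""
        cx, cy, cz = self.center
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

    def _split(self):
        """Create eight children and push the stored points down."""
        cx, cy, cz = self.center
        h = self.half / 2
        self.children = [Octree(((cx + h if o & 1 else cx - h),
                                 (cy + h if o & 2 else cy - h),
                                 (cz + h if o & 4 else cz - h)),
                                h, self.cap)
                         for o in range(8)]
        for q in self.points:
            self.children[self._octant(q)].insert(q)
        self.points = []
```

Front-to-back traversal of such a tree, ordered by the viewpoint, is what lets splats be composited in a single pass.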

Yun-Mo Koo, Byeong-Seok Shin
Faces Alive: Reconstruction of Animated 3D Human Faces

This paper presents a new method for reconstructing anatomy-based, animatable facial models with minimal manual intervention. The technique is based on deforming a multi-layered prototype model to the acquired surface data in an “outside-in” manner: deformation applied to the skin layer is propagated, with the final effect of deforming the underlying muscles. In the skin layer deformation, the generic skin mesh is represented as a dynamic deformable model subjected to internal forces stemming from the elastic properties of the surface and external forces generated by input data points and features. A fully automated approach has been developed for deforming the muscle layer, which includes three types of muscle models. Our method generates animatable models from incomplete input data, and the reconstructed facial models can be animated directly to synthesize various expressions.

Yu Zhang, Terence Sim, Chew Lim Tan
Quasi-interpolants Based Multilevel B-Spline Surface Reconstruction from Scattered Data

This paper presents a new, fast, and local method of 3D surface reconstruction for scattered data. The algorithm uses quasi-interpolants to compute the control points, from a coarse to a fine hierarchy, generating a sequence of bicubic B-spline functions whose sum approaches the desired interpolation function. Quasi-interpolants provide a procedure for deriving local spline approximation methods, in which a B-spline coefficient depends only on data points taken from the neighborhood of the support of the corresponding B-spline. Experimental results demonstrate that high-fidelity reconstruction is possible from a selected set of irregular samples.
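The quasi-interpolant idea can be shown in 1D (the paper uses bicubic tensor-product surfaces). For a uniform cubic B-spline, the local rule c_i = (−f_{i−1} + 8 f_i − f_{i+1}) / 6 produces control coefficients directly from samples, with no global interpolation system, and the resulting spline reproduces cubic polynomials at interior knots. The copy rule at the boundary is a crude assumption of this sketch:

```python
def quasi_interpolant_coeffs(f):
    """Control coefficients for a uniform cubic B-spline from samples f.
    Interior coefficients use the local quasi-interpolant rule; the end
    coefficients simply copy the samples (a crude boundary treatment)."""
    c = [float(v) for v in f]
    for i in range(1, len(f) - 1):
        c[i] = (-f[i - 1] + 8.0 * f[i] - f[i + 1]) / 6.0
    return c

def spline_at_knot(c, i):
    """Uniform cubic B-spline value at knot i: (c_{i-1} + 4 c_i + c_{i+1}) / 6."""
    return (c[i - 1] + 4.0 * c[i] + c[i + 1]) / 6.0
```

Because each coefficient touches only three samples, refining one region of the hierarchy never requires re-solving for distant control points, which is what makes the coarse-to-fine scheme fast and local.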

Byung-Gook Lee, Joon-Jae Lee, Ki-Ryoung Kwon

Methodology of Information Engineering Workshop

Efficient Mapping Rule of IDEF for UMM Application

Various methodologies for business process analysis and design have been developed by many organizations. However, these methodologies were each developed to fit one organization's own way of working and are thus not compatible with each other. This poor compatibility incurs unnecessary analysis cost, and methods to map between the different methodologies need to be developed to reduce it. UMM (UN/CEFACT Modeling Methodology), which takes an object-oriented point of view, can resolve the limits of existing bottom-up approaches, make them more reasonable, and simplify business and administrative procedures. IDEF (Integrated Definition Language), which takes a structural point of view and is widely used as a system analysis and design method, needs to be mapped to UMM so that existing IDEF models can be reused. In this study, we present a guideline for deriving UMM models from IDEF models in order to develop an electronic commerce system that includes electronic document exchange. By comparing IDEF and UMM, we analyze the differences between the two methodologies, and based on these differences we propose basic strategies for mapping from IDEF to UMM. We also propose a mapping guideline that can be used to obtain preliminary UMM results from existing IDEF modeling results. The existing IDEF analysis and design results can thus be reused when adopting the UMM methodology for an electronic business system, so analysts who are familiar with IDEF can develop UMM workflows using their existing results and skills.

Kitae Shin, Chankwon Park, Hyoung-Gon Lee, Jinwoo Park
A Case Study on the Development of Employee Internet Management System

As enterprise computing environments become more Internet-oriented, the importance of Internet traffic monitoring and filtering increases. Internet monitoring and filtering systems were introduced as simple utilities for analyzing and blocking access to objectionable Web sites, and they have since evolved into major corporate applications. This paper describes our experience designing and developing a Web traffic monitoring and filtering system called WebKeeper. We present the requirements, design, implementation, and operational statistics of WebKeeper. Used together with advanced databases, the system provides enhanced security, productivity, and increased bandwidth in the form of Internet access management.

Sangkyun Kim, Ilhoon Choi
Cost-Benefit Analysis of Security Investments: Methodology and Case Study

We live in an unsafe world in which we encounter threats against our safety and security every day, and this is especially true in the information processing environment. Management faces difficult problems in handling information security issues. One of the most challenging is: how can decisions on security-related investment be made so as to maximize the economic balance? To solve this problem, the ROI of security investments must be measured and managed. This paper provides an integrated methodology, consisting of a process model and analysis criteria for cost and benefit factors, to support the economic justification of security investments. A case study is also provided to show the practicality of this methodology.
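A common way to frame the economic balance is the ROSI (return on security investment) calculation built on annualized loss expectancy, ALE = SLE × ARO. This generic formula is standard in the security-economics literature and is shown here only as an illustration, not as the paper's specific cost and benefit criteria:

```python
def ale(single_loss_expectancy, annual_rate_of_occurrence):
    """Annualized loss expectancy: expected loss per incident (SLE)
    times expected incidents per year (ARO)."""
    return single_loss_expectancy * annual_rate_of_occurrence

def rosi(ale_before, ale_after, annual_cost):
    """Return on security investment: (risk reduction - cost) / cost."""
    return (ale_before - ale_after - annual_cost) / annual_cost
```

For example, a control costing 20,000 per year that cuts the ARO of a 100,000-per-incident loss from 0.5 to 0.1 reduces ALE by 40,000 and yields a ROSI of 1.0, i.e. a 100% return. These input figures are, of course, the hard part to estimate, which is where a structured methodology is needed.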

Sangkyun Kim, Hong Joo Lee
A Modeling Framework of Business Transactions for Enterprise Integration

A modeling framework and an integration platform for unifying B2Bi (Business to Business integration) and EAI (Enterprise Application Integration) systems are required to effectively manage collaborative B2B business processes. In this research, we suggest a modeling framework for enterprise integration. The suggested framework includes various activity controllers derived from several business process modeling standards. An extension mechanism is included in the suggested framework for incorporating future changes in those standards or system’s operational requirements. The modeling framework has been implemented as a real business integration platform and runs successfully in production systems to execute enterprise B2B collaborations.

Minsoo Kim, Dongsoo Kim, Yong Gu Ji, Hoontae Kim
Process-Oriented Development of Job Manual System

Today’s business environment involves increasingly complex and uncertain processes. To maintain or increase their competitiveness, businesses look for new methodologies that permit them to manage their processes more effectively. One of these methodologies, Business Process Management (BPM), which includes the workflow concept, has gained attention. The process-centric integration of BPM has demonstrated the capability to manage business processes more effectively; however, it is difficult to integrate systematically using existing IT development paradigms. Therefore, a new IT development paradigm is required, in which business processes are managed independently and automated by calling the relevant business applications. This new paradigm is the ‘Process Orientation’ (PO) concept, which focuses on the independence of the business processes and the BPM system. In this paper, we describe a new kind of job manual system as a business application under the PO concept. In the new job manual system, the BPM system becomes a development platform on which various applications are implemented. Contrary to existing job manual systems, the new system can directly transfer a manual prepared in it to a BPM process definition. It also enables more efficient management of business processes and helps users of both systems perform relevant tasks more easily.

Seung-Hyun Rhee, Hoseong Song, Hyung Jun Won, Jaeyoung Ju, Minsoo Kim, Hyerim Bae
An Information System Approach and Methodology for Enterprise Credit Rating

This research constructs an enterprise credit evaluation model that incorporates non-financial elements, namely IT indices considered to influence the financial output of business entities. To examine this influence closely, the paper first analyzes the relationship between enterprises' IT investment and IT level on the one hand and their productivity and financial output on the other, and then proposes a new evaluation model in which IT elements serve as non-financial factors in evaluating the credit of each enterprise, based on the discovered linkage between IT level and financial output.

Hakjoo Lee, Choonseong Leem, Kyungyup Cha
Privacy Engineering in ubiComp

In the ubiquitous age, privacy will be a matter of trade-offs between the pros and cons of revealing personal information for personalized services. Ubiquitous computing demands a fundamental shift in the control of personal information and requires its disclosure: as we enjoy a comfortable life, invasions of personal information can occur at the same time. Privacy requires effective security, but effective security does not guarantee privacy. We present privacy engineering in order to prevent privacy invasion and to measure the economic value of privacy, and we hope this privacy engineering in ubiComp will serve as a tool for protecting users in the ubiquitous age. The approach includes the following: the architecture of privacy engineering, database modeling, privacy impact assessment, and economic value assessment of privacy.

Tae Joong Kim, Sang Won Lee, Eung Young Lee
Development of a BSC-Based Evaluation Framework for e-Manufacturing Project

Recently, in order to swiftly satisfy customers' varied product demands, reinforced collaboration among industries and business types, as well as increased outsourcing, have emerged as indispensable elements of modern manufacturing. An e-Manufacturing pilot project supporting the design collaboration of a domestic mould company is now underway. To drive the project efficiently, different evaluation frameworks are required for each stakeholder in the e-Manufacturing project. In this study, we propose a unified evaluation template composed of two viewpoints (user and operator) and three steps (pre-evaluation, operation evaluation, and post-evaluation). In particular, we propose an evaluation framework for contractor mould companies from the user's viewpoint based on the BSC concept.

Yongju Cho, Wooju Kim, Choon Seong Leem, Honzong Choi
Design of a BPR-Based Information Strategy Planning (ISP) Framework

In this paper we propose a BPR-based Information Strategy Planning (ISP) Framework, which aims to fully incorporate the BPR concept into ISP. Since business processes and information systems (IS) are tightly linked together, business processes need to be reengineered during information systems planning. Five sub-processes are defined in the framework – Business Strategy Analysis, Process Analysis and Redesign, IS Analysis and Modeling, Organization Analysis, and ROI (Return On Investment) Analysis and Integrated Execution Planning of IS. With this framework, it is expected that enterprises will be able to perform more effective, efficient, and strategic IS planning.

Chiwoon Cho, Nam Wook Cho
An Integrated Evaluation System for Personal Informatization Levels and Their Maturity Measurement: Korean Motors Company Case

It is an incontestable fact that an enterprise's competitiveness these days depends on whether its individuals have the application capability and knowledge related to information technology. This study improves upon previous research on evaluation systems for personal informatization levels and their maturity measurement. To measure the personal informatization level, three types of questions are developed: Answer-Driven Question (ADQ), Personality-Fit Question (PFQ), and Understanding-Oriented Question (UOQ). These three types are matched to the characteristics of the evaluation indices. The evaluation indices consist of three evaluation domains: Mind of IT, Knowledge of IT, and Application of IT. Each domain is composed of three evaluation factors and their specific items, which have cause-and-effect relations among them. We applied this evaluation system to individuals working in an informatization environment at a representative Korean motor company, and verified its applicability and practicality by presenting evaluation results for 18,898 individuals.

Eun Jung Yu, Choon Seong Leem, Seung Kyu Park, Byung Wan Kim
Critical Attributes of Organizational Culture Promoting Successful KM Implementation

Many organizations are implementing Knowledge Management (KM) technologies to promote knowledge sharing. An extensive review of recent articles and journals about such implementations reveals that one of the main barriers to implementing KM technology is the absence of an organizational culture that supports knowledge sharing. The purpose of this research is to explore the possible relationship between the successful implementation of knowledge management technology and specific organizational culture attributes. The OCP and KMTP instruments were used to identify and rank the organizational culture attributes most critical to promoting knowledge sharing and successful KM technology implementation. Data were collected from twenty-six US organizations involved in a KM effort.

Heejun Park
Backmatter
Metadata
Title
Computational Science and Its Applications – ICCSA 2005
Edited by
Osvaldo Gervasi
Marina L. Gavrilova
Vipin Kumar
Antonio Laganà
Heow Pueh Lee
Youngsong Mun
David Taniar
Chih Jeng Kenneth Tan
Copyright Year
2005
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-32045-6
Print ISBN
978-3-540-25862-9
DOI
https://doi.org/10.1007/b136271