
About this Book

This book presents the edited proceedings of the 16th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2017), which was held on May 24–26, 2017 in Wuhan, China. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the various fields of computer science, share their experiences and exchange new ideas and information. The research results included relate to all aspects (theory, applications and tools) of computer and information science, and discuss the practical challenges encountered and the solutions adopted to solve them. The work selected represents 17 of the most promising papers from the conference, written by authors who are certain to make further significant contributions to the field of computer and information science.



Big Data and IoT for U-healthcare Security

Big Data is a topic of great interest to many researchers because of its potential applications across many areas of science and technology. Big Data is taking strong root in the healthcare ecosystem, but healthcare data are becoming more complex and are difficult to manage with common database management tools, traditional data processing applications, or conventional security systems. IoT, meanwhile, helps users track their health through fitness devices, calorie meters, and heart rate monitors, to name a few, up to a fridge that reminds its owner it is running out of water. Big Data and IoT are built on networks and cloud computing that gather data using sensors, but the challenges of using both, especially security for healthcare, are vital. IoT and Big Data have the potential to transform the way healthcare providers use sophisticated technologies on their clinical and other data repositories to make informed decisions, but without the right security and encryption solution, Big Data and IoT can mean big problems, especially for healthcare security systems. In this study, we discuss the use and application of IoT and Big Data for u-healthcare and present an architecture to address these challenges. Privacy of patient and user data is a critical requirement. The architecture enforces access controls on medical device data: the patient remains in control of what is viewed by whom, can view and set the access control policies, and anonymity and masking of data are maintained wherever possible.
Mechelle Grace Zaragoza, Haeng-Kon Kim, Roger Y. Lee

Retrospection and Perspectives on Pragmatic Software Architecture Design: An Industrial Report

It is commonly recognized that research on software architecture has enjoyed a golden age of innovation and concept formulation and has begun to enter a mature stage of utilization. It is therefore natural to expect that the relevant concepts and methods have been widely adopted and that improvements have been achieved in practice. In this paper, we give a retrospective report on the extensive architecture design of a large-scale mission-critical system conducted in our company. The project sought to incorporate newly proposed concepts and methods from the academic realm and remained introspective throughout the process. At the end of the project, we considered it a successful architecting effort and believed that much experience could be gained from it. We describe the process, approaches, and results of this industrial practice, and analyze and summarize experience and perspectives from the practitioners' points of view. The contribution of this paper is twofold. First, it presents a real-world software architecture design effort, shares first-hand experience and perspectives on pragmatic architecture design practices, and thereby provides insight into the state of the practice. Second, it may inspire further research on more effective and efficient methods and tools for better practices.
Xiaofeng Cui

Distributed Coding and Transmission Scheme for Wireless Communication of Satellite Images

In this chapter, we propose a novel coding and transmission scheme for satellite image broadcasting. First, we use a 2D-DWT to divide the full-size satellite image band into four sub-bands: three detail sub-bands and a small version of the image band in the LL sub-band. Second, our scheme applies coset coding based on distributed source coding (DSC) to the LL sub-band to achieve high compression efficiency and low encoding complexity. After that, without syndrome coding, the transmission power is allocated directly to the band details and coset values according to their distributions and magnitudes, without forward error correction (FEC). Finally, these data are transformed by a Hadamard matrix and transmitted over a dense constellation. Experiments on satellite images demonstrate that the proposed scheme improves the average image quality by 2.28 dB, 2.83 dB, and 3.66 dB over LineCast, SoftCast-3D, and SoftCast-2D, respectively, and achieves up to 6.26 dB gain over JPEG2000 with FEC.
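The first step, splitting a band into the LL sub-band and three detail sub-bands with a single-level 2D-DWT, can be sketched with an unnormalized Haar wavelet (a minimal illustration only; the chapter does not state which wavelet it uses):

```python
import numpy as np

def haar_dwt2(band):
    """Single-level 2D Haar DWT (averaging variant): split an image
    band into the LL, LH, HL, and HH sub-bands."""
    b = band.astype(float)
    # Rows: average and difference of adjacent pixel pairs.
    lo = (b[:, 0::2] + b[:, 1::2]) / 2.0
    hi = (b[:, 0::2] - b[:, 1::2]) / 2.0
    # Columns: repeat the split on each half.
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0   # small version of the band
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0   # horizontal details
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0   # vertical details
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0   # diagonal details
    return ll, lh, hl, hh

band = np.arange(16.0).reshape(4, 4)
ll, lh, hl, hh = haar_dwt2(band)
print(ll.shape)  # (2, 2)
```

Each sub-band is a quarter of the original band, so the LL sub-band handed to the coset coder is a half-resolution copy of the image.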
Ahmed Hagag, Ibrahim Omara, Souleyman Chaib, Xiaopeng Fan, Fathi E. Abd El-Samie

Experimental Evaluation of HoRIM to Improve Business Strategy Models

Aligning organizational goals and strategies is important in Business Process Management (BPM). The Horizontal Relation Identification Method (HoRIM), which is our extension of the GQM+Strategies framework, improves the strategic alignment between organizations. GQM+Strategies aligns the strategies across organizational units at different levels by a strategy model, which is a tree structure of strategies called a GQM+Strategies grid. HoRIM identifies and handles horizontal relations (e.g., conflicting and similar strategies) between strategies in different branches, but we have yet to adequately inspect the impact of HoRIM on identifying correct horizontal relations and improving grids. This lack of clarity hampers the application of HoRIM to industrial business strategy models. Herein, we evaluate the impact of HoRIM on the review process and the improvement process of GQM+Strategies grids using two experiments. The review experiment confirms that HoRIM identifies about 1.5 more horizontal relations than an ad hoc review. The modification experiment where four researchers evaluated the validity of improved grids by the ranking method suggests that HoRIM effectively modifies GQM+Strategies grids.
Yohei Aoki, Hironori Washizaki, Chimaki Shimura, Yuichiro Senzaki, Yoshiaki Fukazawa

Combining Lexicon-Based and Learning-Based Methods for Sentiment Analysis for Product Reviews in Vietnamese Language

Social media websites are a major hub for users to express their opinions online. Businesses spend an enormous amount of time and money to understand their customers' opinions about their products and services. Sentiment analysis, also called opinion mining, involves building a system to collect and examine opinions about a product expressed in blog posts, comments, or reviews. In this paper, we propose a framework for sentiment analysis of product reviews in the Vietnamese language that combines lexicon-based and learning-based methods. Text analytics, linguistic analysis, and a Vietnamese emotional dictionary were built, and features adapted to the language were proposed. The experiments show that our system performs very well when it combines the advantages of the lexicon-based and learning-based methods, and that it can be applied in online systems for sentiment analysis of product reviews.
Son Trinh, Luu Nguyen, Minh Vo

Reducing Misclassification of True Defects in Defect Classification of Electronic Board

This paper proposes a method to discriminate between defects on an electronic board and foreign matter attached to the circuit. The purpose is to reduce the misclassification of true defects as pseudo defects in an automatic classification approach. The proposed method improves classification accuracy under multiple illumination conditions. The result of each of multiple classifiers is used in a voting process, and incorrect classification is reduced by dividing the output into three classes: true defect, pseudo defect, and difficult defect. The approach is evaluated by the correct-classification ratio on circuit board images containing actual defects, and its effectiveness is confirmed experimentally.
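The voting step over per-illumination classifiers can be sketched as follows (a minimal illustration; the label names and the majority threshold are assumptions, not the paper's actual rule):

```python
from collections import Counter

def vote(predictions, threshold=0.75):
    """Combine the outputs of classifiers run under different illumination
    conditions by voting. If no label reaches a clear majority, the sample
    is marked a 'difficult defect' instead of forcing a decision."""
    counts = Counter(predictions)
    label, n = counts.most_common(1)[0]
    if n / len(predictions) >= threshold:
        return label
    return "difficult defect"

print(vote(["true defect", "true defect", "true defect", "pseudo defect"]))
print(vote(["true defect", "pseudo defect", "true defect", "pseudo defect"]))
```

Routing low-agreement samples into the third class is what keeps true defects from silently being relabeled as pseudo defects.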
Tokiko Shiina, Yuji Iwahori, Yohei Takada, Boonserm Kijsirikul, M. K. Bhuyan

Virtual Prototyping Platform for Multiprocessor System-on-Chip Hardware/Software Co-design and Co-verification

This paper describes the implementation of a virtual prototyping platform to address the ever-challenging multiprocessor system-on-chip (MPSoC) hardware/software co-design and co-verification requirements. The increasingly popular deployment of MPSoCs brings complexity to system modeling, design, and verification, and a fiercely competitive business environment makes it absolutely critical to rein in time-to-market and chip fabrication costs. The holy grail is to be able to verify the hardware design and synthesize it to the gate level for physical layout, while at the same time carrying out software development for that hardware design using the same system models and verification platforms. One approach is to raise the abstraction of system design and verification to the electronic system level (ESL). In this paper, a virtual prototyping platform is built using SystemC with transaction-level modeling (TLM) and the Open Virtual Platforms (OVP) processor model with an instruction set simulator (ISS). As a demonstration of concept and feasibility, the virtual platform prototypes a 128-bit advanced encryption standard (AES) cryptosystem MPSoC. The supporting subsystems and environment are also modeled, for example the system peripherals, the network-based interconnect scheme or network-on-chip (NoC), system firmware, interrupt service handling, and drivers. The virtual platform is scalable up to, but not limited to, twelve processing elements, and is configurable in terms of the OVP generic memory models' (RAM and ROM) addresses and sizes, simulation parameters, and debugging and tracing options.
Arya Wicaksana, Chong Ming Tang

A Data-Mining Model for Predicting Low Birth Weight with a High AUC

Birth weight is a significant determinant of a newborn's probability of survival. Data-mining models are receiving considerable attention for identifying low-birth-weight risk factors. However, prediction of actual birth weight values based on the identified risk factors, which can play a significant role in identifying mothers at risk of delivering low-birth-weight infants, remains unsolved. This paper presents a study of data-mining models that predict the actual birth weight, with particular emphasis on achieving a high area under the receiver operating characteristic curve (AUC). The prediction is based on birth data from the North Carolina State Center for Health Statistics of 2006. The steps followed to extract meaningful patterns from the data were data selection, handling missing values, handling imbalanced data, model building, feature selection, and model evaluation. Decision trees were used for classifying birth weight and were tested on the actual imbalanced dataset and on a dataset balanced using the synthetic minority oversampling technique (SMOTE). The results highlight that models built with datasets balanced by the SMOTE algorithm produce a higher AUC than models built with imbalanced datasets. The J48 model built with balanced data outperformed REPTree and Random Tree with an AUC of 90.3%, and it was therefore selected as the best model. In conclusion, the feasibility of using J48 in birth weight prediction offers the possibility of reducing obstetric-related complications and thus improving overall obstetric health care.
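The balancing step can be sketched as follows. This is the textbook SMOTE idea, interpolating between a minority sample and one of its k nearest minority neighbours, not the exact Weka configuration the study used:

```python
import random

def smote(minority, n_new, k=3, seed=0):
    """Generate synthetic minority-class samples (SMOTE): each new point
    lies on the segment between a real sample and one of its k nearest
    minority-class neighbours."""
    rng = random.Random(seed)
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest minority neighbours of x, excluding x itself.
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: dist(x, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
new_pts = smote(minority, n_new=4)
print(len(new_pts))  # 4
```

Because the synthetic points are interpolations rather than copies, the classifier sees a denser minority region instead of duplicated rows, which is what lifts the AUC relative to the imbalanced data.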
Uzapi Hange, Rajalakshmi Selvaraj, Malatsi Galani, Keletso Letsholo

A Formal Approach for Maintaining Forest Topologies in Dynamic Networks

In this paper, we focus on maintaining a forest of spanning trees in dynamic networks. We propose a two-level approach for specifying and proving distributed algorithms in a forest. The first level allows us to control the dynamic structure of the network by triggering a maintenance operation when the forest is altered. To do so, we develop a formal pattern using the Event-B method, based on the refinement technique. The proposed pattern relies on dynamicity-aware graph relabeling systems (DA-GRS), an existing model for building and maintaining a spanning forest in dynamic networks, which in turn builds on evolving graphs, a powerful model for recording the evolution of a network topology. The second level of our approach deals with the distributed algorithms that can be applied to the spanning trees of the forest. We illustrate our pattern through the example of a leader election algorithm. The proof statistics show that our solution can save effort in specifying, as well as proving the correctness of, distributed algorithms in a forest topology.
Faten Fakhfakh, Mohamed Tounsi, Mohamed Mosbah, Dominique Méry, Ahmed Hadj Kacem

A Multicriteria Approach for Selecting the Optimal Location of Waste Electrical and Electronic Treatment Plants

This paper presents a multicriteria decision-making approach for selecting the optimal location of waste electrical and electronic equipment (WEEE) treatment plants. Hesitant fuzzy sets are used to deal with situations in which the decision maker hesitates among several values when assessing the alternatives. A hesitant fuzzy Hamacher geometric operator is proposed for producing a preference value for every WEEE treatment plant location alternative across all selection criteria. An example is given to demonstrate the applicability of the proposed multicriteria decision-making approach for selecting the optimal location of WEEE treatment plants.
Santoso Wibowo, Srimannarayana Grandhi

Localization Strategy for Island Model Genetic Algorithm to Preserve Population Diversity

Years after first being introduced by Fraser and remodeled for modern applications by Bremermann, the genetic algorithm (GA) has made significant progress in solving many kinds of optimization problems, and has branched into many variant models and approaches. The multi-population, or island model, GA (IMGA) is one of the commonly used GA models: it aims at a better result (ideally the global optimum) by intrinsically preserving population diversity. The localization strategy for IMGA is a new approach which treats an island as a single living environment for its individuals, where each island's characteristics must differ from those of the other islands. Operator parameter configuration, or even the core engine (algorithm), represents the nature of an island, and these differences lead to different evolutionary tracks, whether in speed or in pattern. The localization strategy for IMGA uses three kinds of single-GA core: standard GA, pseudo GA, and informed GA; it implements a migration protocol with a bias value to control individual movement. The experimental results show that the localization strategy for IMGA succeeds in solving 3-SAT with excellent performance, and this new approach is also shown to have high consistency and durability.
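The island-model skeleton, several islands with different local parameters plus periodic ring migration, can be sketched on a toy OneMax problem (a hedged illustration; the abstract's actual cores and 3-SAT fitness are replaced here by per-island mutation rates and bit counting):

```python
import random

def island_ga_onemax(n_bits=20, islands=3, pop=12, gens=40,
                     migrate_every=10, seed=1):
    """Island-model GA on OneMax: each island runs its own GA with a
    different mutation rate (its 'local environment'); every few
    generations the best individual migrates to the next island in a ring."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)                 # OneMax: count of 1-bits
    mut_rates = [0.5 / n_bits * (i + 1) for i in range(islands)]
    pops = [[[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
            for _ in range(islands)]
    for g in range(1, gens + 1):
        for i, population in enumerate(pops):
            nxt = []
            for _ in range(pop):
                # Tournament selection of two parents.
                p1 = max(rng.sample(population, 3), key=fitness)
                p2 = max(rng.sample(population, 3), key=fitness)
                cut = rng.randrange(1, n_bits)     # one-point crossover
                child = p1[:cut] + p2[cut:]
                child = [b ^ (rng.random() < mut_rates[i]) for b in child]
                nxt.append(child)
            pops[i] = nxt
        if g % migrate_every == 0:
            # Ring migration: best of island i replaces worst of island i+1.
            bests = [max(p, key=fitness) for p in pops]
            for i in range(islands):
                tgt = pops[(i + 1) % islands]
                tgt[min(range(pop), key=lambda j: fitness(tgt[j]))] = bests[i][:]
    return max((max(p, key=fitness) for p in pops), key=fitness)

best = island_ga_onemax()
```

Keeping the mutation rates different per island is the "localization" idea in miniature: each island follows its own evolutionary track, and migration occasionally mixes their progress.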
Alfian Akbar Gozali, Shigeru Fujimura

HM-AprioriAll Algorithm Improvement Based on Hadoop Environment

To improve the efficiency of mining frequent item-sets with the AprioriAll algorithm, the Hadoop environment and the MapReduce model are introduced, yielding a new algorithm for mining frequent item-sets in a big-data environment: the HM-AprioriAll algorithm. Compared with the original algorithm, the new algorithm introduces user attributes and pruning, which reduce the number of elements in the candidate sets and the number of scans over the data sets, greatly reduce the time and space complexity of the computation, and produce a rule model at large scale. Tests of the HM-AprioriAll algorithm on a Hadoop platform show that these techniques give HM-AprioriAll higher efficiency when scaling up.
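The candidate generation and pruning that HM-AprioriAll optimizes can be illustrated with the classic single-machine Apriori loop (a sketch of the base algorithm only, without the Hadoop/MapReduce, user-attribute, or sequence-handling parts of the paper):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Classic Apriori frequent-itemset mining: build k-item candidates
    from frequent (k-1)-itemsets and prune any candidate that has an
    infrequent (k-1)-subset before counting support."""
    def support(itemset):
        return sum(itemset <= t for t in transactions)
    items = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    level = {s for s in items if support(s) >= min_support}
    k = 1
    while level:
        frequent.update({s: support(s) for s in level})
        k += 1
        # Join step: merge pairs of frequent (k-1)-itemsets into k-itemsets.
        candidates = {a | b for a in level for b in level if len(a | b) == k}
        # Prune step: every (k-1)-subset of a candidate must be frequent.
        candidates = {c for c in candidates
                      if all(frozenset(s) in level
                             for s in combinations(c, k - 1))}
        level = {c for c in candidates if support(c) >= min_support}
    return frequent

txns = [frozenset(t) for t in
        (["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a", "b", "c"])]
freq = apriori(txns, min_support=3)
print(freq[frozenset(["a", "b"])])  # 3
```

The prune step is the lever the paper pulls: shrinking the candidate sets before the support count is what cuts the number of full scans over the data, and MapReduce then distributes those counts across the cluster.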
Wentian Ji, Qingju Guo, Yanrui Lei

Architecture of a Real-time Weather Monitoring System in a Space-time Environment Using Wireless Sensor Networks

This paper presents a Web-mapping framework for collecting, storing, and analyzing meteorological data recorded by a wireless sensor network (WSN). The main objective is to design and implement a WSN-based climate monitoring system able to intercept and filter meteorological records and to generate alerts in real time in case of emergency. The application transfers the data recorded by the sensor network via a gateway to an application server using the CloudMQTT transfer protocol. The server, which forwards sensor data in real time via WebSocket to a rich Internet application (RIA), is designed to index and record data in an in-memory database. These data are stored in a spatio-temporal database and can be visualized via Web-service APIs (REST, GWS).
Walid Fantazi, Tahar Ezzedine

Mobile Application Development on Domain Analysis and Reuse-Oriented Software (ROS)

Reuse is suggested to be a key to improving any software development effort and its productivity, particularly where one can identify a family of systems. For mobile application development, one should consider not only component development, but also a classification of reusable software domains. We classify domain components by considering functional and nonfunctional factors identified through domain analysis. This paper briefly describes a study of mobile application development using domain analysis, and reviews the advantages and issues that software reuse raises. We also propose a flowchart based on a reuse method for large-scale embedded software that relies on inter-module relations in the process flow of the proposed reuse method.
Mechelle Grace Zaragoza, Haeng-Kon Kim

A Transducing System Between Hichart and XC on a Visual Software Development Environment

In recent years, embedded systems have been widely used in various fields. However, the development burden of embedded systems tends to be high because they are complex. A visual development environment is one way to reduce this burden. We have already proposed a visual programming development environment for program diagrams called Hichart. In this paper, we describe a visual development environment for the XC language, which is a programming language for XMOS evaluation boards.
Takaaki Goto, Ryo Nakahata, Tadaaki Kirishima, Takeo Yaku, Kensei Tsuchida

Development of an Interface for Volumetric Measurement on a Ground-Glass Opacity Nodule

Although radiologists easily recognize lung nodules in CT volume data and then judge whether they are benign or malignant based on the type of nodule, some lung nodules, such as ground-glass opacity (GGO) nodules, are difficult to detect because of their size, shape, and other properties. Features of GGO nodules are needed because they can help radiologists judge whether a GGO nodule is benign or malignant; in particular, finding the boundaries makes it possible to obtain the volume of a GGO nodule. However, different radiologists may draw different boundaries for the same GGO nodule depending on their personal habits, and it is difficult to obtain boundaries that satisfy all radiologists. This study develops an interface that obtains the boundaries of GGO nodules using the expectation–maximization (EM) algorithm (US Cancer Statistics Working Group, United States Cancer Statistics: 1999–2012 Incidence and Mortality Web-based Report, US Department of Health and Human Services, CDC, National Cancer Institute, Atlanta, GA, 2015 [1]) and the histogram method, adapted to each radiologist's personal habits, because the parameters of the EM algorithm and the threshold values of the histogram method can be adjusted. Experimental results show that the proposed interface obtains the boundaries of GGO nodules in accordance with radiologists' personal habits. This study can effectively reduce the burden on radiologists.
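The EM step can be sketched on one-dimensional intensities as a two-component Gaussian mixture, nodule versus background (a minimal illustration; the chapter's actual feature space, initialization, and adjustable parameters are not specified here):

```python
import math, random

def em_two_gaussians(values, iters=50):
    """Fit a two-component 1D Gaussian mixture with EM. Thresholding
    between the two fitted means separates two intensity populations,
    e.g. nodule voxels from background."""
    mu = [min(values), max(values)]      # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each value.
        resp = []
        for x in values:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: re-estimate weights, means, and variances.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(values)
            mu[k] = sum(r[k] * x for r, x in zip(resp, values)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, values)) / nk, 1e-6)
    return mu, var, pi

rng = random.Random(0)
data = ([rng.gauss(-2, 0.5) for _ in range(200)]
        + [rng.gauss(3, 0.8) for _ in range(200)])
mu, var, pi = em_two_gaussians(data)
```

Exposing the number of iterations and the initialization as adjustable knobs is what lets such an interface be tuned to an individual radiologist's boundary-drawing habits.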
Weiwei Du, Dandan Yuan, Xiaojie Duan, Jianming Wang, Yanhe Ma, Hong Zhang

Efficient Similarity Measurement by the Combination of Distance Algorithms to Identify the Duplication Relativity

This paper studies efficient similarity measurement in order to improve duplication-detection techniques in a programming class. The work combines three proficient algorithms, Smith-Waterman, longest common subsequence, and Damerau-Levenshtein distance, to measure the distance between each pair of code files. To identify how closely people duplicate from each other, the work applies a frequency-of-occurrence technique that scores the accumulated similarity of duplication, the number of times or the regularity with which duplication happens, and the relationship between incidence and time period. Moreover, the work proposes a technique to identify groups of people positioned closely together. The results show that the proposed combination of algorithms can measure file similarity efficiently. The work can identify the relations of a person who frequently duplicates with others as well as the people positioned closely together, and, finally, it reports the number of times a student duplicated code files.
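Combining several distance algorithms into one similarity score can be sketched with two of the three named measures (the equal weighting and the normalization are assumptions for illustration, not the paper's actual formula, and Smith-Waterman is omitted for brevity):

```python
def lcs_len(a, b):
    """Length of the longest common subsequence of strings a and b."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = (dp[i-1][j-1] + 1 if ca == cb
                        else max(dp[i-1][j], dp[i][j-1]))
    return dp[len(a)][len(b)]

def levenshtein(a, b):
    """Edit distance between a and b (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j-1] + 1,         # insertion
                           prev[j-1] + (ca != cb)))  # substitution
        prev = cur
    return prev[len(b)]

def similarity(a, b):
    """Equal-weight average of two normalized similarities in [0, 1]."""
    n = max(len(a), len(b)) or 1
    return 0.5 * (lcs_len(a, b) / n) + 0.5 * (1 - levenshtein(a, b) / n)

print(similarity("int x = 0;", "int x = 0;"))  # 1.0
```

Accumulating such pairwise scores over time per student pair gives the frequency-of-occurrence signal the paper uses to find people who repeatedly duplicate from each other.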
Manop Phankokkruad

Correction to: Virtual Prototyping Platform for Multiprocessor System-on-Chip Hardware/Software Co-design and Co-verification

Without Abstract
Arya Wicaksana, Chong Ming Tang

