
2011 | Book

Software Engineering and Computer Systems

Second International Conference, ICSECS 2011, Kuantan, Pahang, Malaysia, June 27-29, 2011, Proceedings, Part II

Edited by: Jasni Mohamad Zain, Wan Maseri bt Wan Mohd, Eyas El-Qawasmeh

Publisher: Springer Berlin Heidelberg

Book Series: Communications in Computer and Information Science

About this Book

This three-volume set constitutes the refereed proceedings of the Second International Conference on Software Engineering and Computer Systems, ICSECS 2011, held in Kuantan, Malaysia, in June 2011. The 190 revised full papers presented together with invited papers in the three volumes were carefully reviewed and selected from numerous submissions. The papers are organized in topical sections on software engineering; network; bioinformatics and e-health; biometrics technologies; Web engineering; neural network; parallel and distributed e-learning; ontology; image processing; information and data management; engineering; software security; graphics and multimedia; databases; algorithms; signal processing; software design/testing; e-technology; ad hoc networks; social networks; software process modeling; miscellaneous topics in software engineering and computer systems.

Table of Contents

Frontmatter

Information and Data Management

A Mean Mutual Information Based Approach for Selecting Clustering Attribute

Rough set theory based attribute selection clustering approaches for categorical data have attracted much attention in recent years. However, they have some limitations in the process of selecting the clustering attribute. In this paper, we analyze the limitations of three rough set based approaches: total roughness (TR), min-min roughness (MMR) and maximum dependency attribute (MDA), and propose a mean mutual information (MMI) based approach for selecting the clustering attribute. It is proved that the proposed approach is able to overcome the limitations of the rough set based approaches. In addition, we define the concept of mean inter-class similarity to measure the accuracy of selecting the clustering attribute. Experimental results show that the accuracy of selecting the clustering attribute using our method is higher than with the TR, MMR and MDA methods.

Hongwu Qin, Xiuqin Ma, Jasni Mohamad Zain, Norrozila Sulaiman, Tutut Herawan
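
A minimal Python sketch of the selection idea, assuming MMI is computed as the average mutual information between a candidate attribute and each remaining attribute, with the highest-scoring attribute chosen (the paper's exact formulation may differ):

    from collections import Counter
    from math import log2

    def mutual_information(xs, ys):
        """I(X;Y) for two equal-length categorical columns."""
        n = len(xs)
        px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
        return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    def select_clustering_attribute(table):
        """table: attribute name -> list of categorical values.
        Pick the attribute with the highest mean mutual information
        against all other attributes."""
        names = list(table)
        def mmi(a):
            others = [b for b in names if b != a]
            return sum(mutual_information(table[a], table[b])
                       for b in others) / len(others)
        return max(names, key=mmi)

    # Toy example: "colour" and "size" share the most information.
    data = {"colour": ["r", "r", "b", "b"],
            "size":   ["s", "s", "l", "l"],
            "shape":  ["o", "x", "o", "x"]}
    print(select_clustering_attribute(data))  # "colour" (tied with "size")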
A Soft Set Model on Information System and Its Application in Clustering Attribute Selection

In this paper, we define a soft set model on the set of equivalence classes in an information system, which can easily be applied to obtain the approximation sets of rough set theory. Furthermore, we use it to select the clustering attribute for categorical data clustering, and a heuristic algorithm is presented. Experimental results on UCI benchmark data sets show that the proposed approach provides faster decisions for selecting a clustering attribute compared with the maximum dependency attribute (MDA) approach.

Hongwu Qin, Xiuqin Ma, Jasni Mohamad Zain, Norrozila Sulaiman, Tutut Herawan
Alternative Model for Extracting Multidimensional Data Based-On Comparative Dimension Reduction

In line with technological developments, current data tends to be multidimensional and high dimensional, which is more complex than conventional data and needs dimension reduction. Dimension reduction is important in cluster analysis: it creates a new representation of the data that is smaller in volume and yields the same analytical results as the original representation. To obtain efficient processing time in clustering and to mitigate the curse of dimensionality, a clustering process needs data reduction. This paper proposes an alternative model for extracting multidimensional data clustering based on comparative dimension reduction. We implemented five dimension reduction techniques: ISOMAP (Isometric Feature Mapping), Kernel PCA, LLE (Locally Linear Embedding), Maximum Variance Unfolding (MVU), and Principal Component Analysis (PCA). The results show that dimension reduction significantly shortens processing time and increases cluster performance. DBSCAN with Kernel PCA and Support Vector clustering with Kernel PCA have the highest cluster performance compared with clustering without dimension reduction.

Rahmat Widia Sembiring, Jasni Mohamad Zain, Abdullah Embong
Knowledge Taxonomy for Developing Organizational Memory System (OMS) for Public Institutions of Higher Learning (IHL) in Malaysia

Knowledge in an organization, supported by appropriate technology, produces organizational memories that provide the support an organization needs to learn and grow. As the scope of organizational memory in any one organization is too broad, it is necessary to classify it into an acceptably narrower view. Drawing on the literature on computer-based organizational memory, this study establishes a basic understanding of organizational memory systems and produces a knowledge taxonomy and metadata as a basis for developing an organizational memory system for IHL in the Malaysian scenario. The taxonomy and metadata are developed based on pertinent literature on organizational memory system models, organizational memory source types, and knowledge management technology. Further studies can be conducted to validate and verify the developed taxonomy.

Suzana Basaruddin, Haryani Haron
A Framework for Optimizing Malware Classification by Using Genetic Algorithm

Malware classification is vital in combating malware. A malware classification system works together with malware identification to prepare the right and effective antidote for malware. Current techniques in malware classification do not give good classification results when dealing with new and unique types of malware. For this reason, we propose the use of a Genetic Algorithm to optimize the malware classification system as well as to help in malware prediction. The new malware classification system is based on a malware's target and its operational behavior. The result of this study is a new framework designed to optimize the classification of malware. This new malware classification system also has the ability to train and learn by itself, so that it can predict current and upcoming trends in malware attacks.

Mohd Najwadi Yusoff, Aman Jantan
Building a Corpus-Derived Gazetteer for Named Entity Recognition

Gazetteers, or entity dictionaries, are an important element of Named Entity Recognition, which in turn is an essential component of Information Extraction. Gazetteers work as specialized dictionaries that support initial tagging; they provide quick entity identification and thus create a richer document representation. However, the compilation of such gazetteers is sometimes cited as a stumbling block in Named Entity Recognition. Machine learning, rule-based, and look-up based approaches are often used to perform this process. In this paper, a gazetteer developed from MUC-3 annotated data for the 'person name' entity type is presented. The process used has a small computational cost. We combine rule-based grammars and a simple filtering technique to automatically induce the gazetteer. We conclude with experiments that compare the content of the gazetteer with a manually crafted one.

Norshuhani Zamin, Alan Oxley
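
A Python sketch of the induction recipe the abstract describes: a rule-based trigger grammar proposes candidate names, and a simple frequency filter prunes them. The trigger list, pattern, and threshold are illustrative assumptions, not the paper's actual grammar:

    import re
    from collections import Counter

    # Illustrative trigger-based rule: a title word followed by
    # up to three capitalized tokens. Triggers are assumptions.
    TRIGGERS = r"(?:Mr|Mrs|Gen|Col|President|Minister)\.?"
    PATTERN = re.compile(TRIGGERS + r"\s+((?:[A-Z][a-z]+\s?){1,3})")

    def induce_person_gazetteer(corpus_texts, min_freq=2):
        """Collect candidate person names via the rule, then keep
        only candidates seen at least min_freq times (filtering)."""
        counts = Counter()
        for text in corpus_texts:
            for match in PATTERN.finditer(text):
                counts[match.group(1).strip()] += 1
        return {name for name, c in counts.items() if c >= min_freq}

    docs = ["Gen. Noriega met Mr Smith.", "Gen. Noriega spoke again."]
    print(induce_person_gazetteer(docs))  # {'Noriega'}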
Fuzzy Goal Programming for Multi-level Multi-objective Problem: An Additive Model

The coordination of decision authority is noteworthy, especially in a complex multi-level structured organization that faces multi-objective problems in achieving overall organization targets. However, the standard formulation of mathematical programming problems assumes that a single decision maker makes the decisions, whereas it is more appropriate to establish formulations based on multi-level programming methods embracing multiple objectives. Yet estimating the coefficients of the objective functions in a multi-objective model is sometimes difficult when the statistical data contain random and fuzzy information. Hence, this paper proposes a fuzzy goal programming additive model to solve a multi-level multi-objective problem in a fuzzy environment, which can attain a satisfactory solution. A numerical example of a production planning problem illustrates the proposed solution approach and highlights its advantage of considering the inherent uncertainties in developing the multi-level multi-objective model.

Nureize Arbaiy, Junzo Watada
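
For orientation, a standard single-level additive fuzzy goal programming formulation in LaTeX; it is assumed here, as a sketch, that the paper's multi-level variant adds analogous membership goals per decision level:

    % Additive FGP: maximize the sum of goal memberships.
    % g_i is the aspiration level of goal f_i, L_i its lowest
    % tolerable value (for goals of the "approximately >=" type).
    \begin{align*}
    \max \quad & V(\mu) = \sum_{i=1}^{k} \mu_i \\
    \text{s.t.} \quad & \mu_i = \frac{f_i(x) - L_i}{g_i - L_i},
        \qquad i = 1, \dots, k, \\
    & 0 \le \mu_i \le 1, \qquad x \in X .
    \end{align*}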
Software Agents for E-Commerce Data Workflow Management

Software agent technology has started to play a key role in the e-commerce domain. Agents are now used as a pervasive supporting technology for different partners in the e-business environment, in which various virtual business processes are incorporated and facilitated. Agent technology is used to automate these processes, as well as to enhance e-marketplaces where sellers, vendors, and retailers provide a virtual shop for consumers and buyers to purchase items online by delegating requirements to software agents. This paper proposes a new framework for the use of software agent technology. The paper presents the underlying framework, which is implemented using agent coordination and collaboration in a distributed computing environment. The use of an agent controller pattern provides robustness and scalability to the e-marketplace. The framework allows multiple sellers to be registered, while buyers satisfy their requirements by using a mobile purchasing agent, which conveys their requirements to the e-marketplace. In addition, the framework is customized to satisfy e-business transactions for buyers and sellers.

Faiz Al-Shrouf, Aiman Turani, Khalil Al-Shqeerat
Mining Opinions in User-Generated Contents to Improve Course Evaluation

The purpose of this paper is to show how opinion mining may offer an alternative way to improve course evaluation using students' attitudes posted on Internet forums, discussion groups and/or blogs, which are collectively called user-generated content.

We propose a model to mine knowledge from students' opinions to improve teaching effectiveness in academic institutes. Opinion mining is used to evaluate course quality in two steps: opinion classification and opinion extraction. In opinion classification, machine learning methods are applied to classify each student post as positive or negative. In opinion extraction, we extract features, such as teacher, exams and resources, from the user-generated content for a specific course, then group the features and assign an orientation to each.

Alaa El-Halees
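
The classification step can be pictured with any standard text classifier. A minimal Python sketch using a bag-of-words naive Bayes model from scikit-learn, on hypothetical posts and labels (the paper's choice of learning methods and data is not specified here):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB

    # Tiny labelled sample of student posts (hypothetical data).
    posts = ["the teacher explains very well",
             "exams were unfair and too long",
             "great resources and clear slides",
             "worst course, confusing lectures"]
    labels = ["positive", "negative", "positive", "negative"]

    vec = CountVectorizer()          # bag-of-words features
    X = vec.fit_transform(posts)
    clf = MultinomialNB().fit(X, labels)

    new = vec.transform(["the exams were unfair"])
    print(clf.predict(new))          # likely ['negative']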
Rough Set Theory Approach for Classifying Multimedia Data

The huge size of multimedia data calls for efficient classification and organization to provide effective multimedia data manipulation. These valuable data must be captured and stored for potential purposes. One of the main problems in Multimedia Information Systems (MIS) is the management of multimedia data; consequently, multimedia data management has emerged as an important research area for querying, retrieving, inserting and updating these vast multimedia data. This research applies the rough set theory technique to organize and categorize multimedia data. The rough set theory method is useful for exploring multimedia data and simple to use in constructing multimedia data classification. Classification helps improve the performance of multimedia data retrieval and organization.

M. Nordin A. Rahman, Yuzarimi M. Lazim, Farham Mohamed, Suhailan Safei, Sufian Mat Deris, M. Kamir Yusof
Application of Rasch Model in Validating the Content of Measurement Instrument for Blog Quality

Research in blog quality is crucial nowadays in order to have good quality blogs in the blogosphere. The blog quality criteria have been derived from a rigorous metadata analysis, yet these criteria have not been reviewed and their significance has not been proven systematically. In this paper, the Rasch Model is applied to produce empirical evidence of the content validity of the blog quality criteria. This study confirms, by means of an online survey, that the definitions of the 11 families and the 49 assigned criteria have content validity. These criteria will then be used as a basis for constructing an instrument to measure the acceptability of the criteria for blog quality.

Zuhaira Muhammad Zain, Abdul Azim Abd Ghani, Rusli Abdullah, Rodziah Atan, Razali Yaakob
Super Attribute Representative for Decision Attribute Selection

Soft set theory, proposed by Molodtsov, is a general mathematical tool for dealing with uncertainties. Recently, several algorithms have been proposed for decision making using soft set theory. However, these algorithms still focus on Boolean-valued information systems. In this paper, Super Attribute Representative (SAR), a soft set based technique for decision making in categorical-valued information systems, is proposed. The proposed technique has been tested on two datasets. The results of this research will provide useful information for decision makers handling categorical datasets.

Rabiei Mamat, Tutut Herawan, Mustafa Mat Deris
The Normal Parameter Value Reduction of Soft Sets and Its Algorithm

Some work has been done on issues concerning parameter reduction of soft sets. However, up to the present, few documents have focused on parameter value reduction of soft sets. In this paper, we introduce the definition of normal parameter value reduction (NPVR) of soft sets, which can overcome the problems of suboptimal choice and added parameter values. More specifically, minimal normal parameter value reduction (MNPVR) is defined as a special case of NPVR, and a heuristic algorithm is presented. Finally, an illustrative example is employed to show our contribution.

Xiuqin Ma, Norrozila Sulaiman
Improving Language Identification of Web Page Using Optimum Profile

Language is an indispensable tool for human communication, and presently the language that dominates the Internet is English. Language identification is the process of automatically determining the language of given content (e.g., English, Malay, Danish, Estonian, Czech, Slovak, etc.). The ability to identify other languages in relation to English is highly desirable, and the goal of this research is to improve the methods used to this end. Three methods are studied in this research: distance measurement, the Boolean method, and the proposed method, namely optimum profile. From initial experiments, we found that distance measurement and the Boolean method are not reliable for European web page identification. Therefore, we propose optimum profile, which uses N-gram frequency and N-gram position to identify the language of a web page. The results show that the proposed method gives the highest performance, with an accuracy of 91.52%.

Choon-Ching Ng, Ali Selamat
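
The proposed optimum profile combines N-gram frequency and N-gram position. As a point of reference, here is a minimal Python sketch of the classic rank-order (out-of-place) N-gram baseline on which such methods build; the training snippets are toy data:

    from collections import Counter

    def profile(text, n=3, top=300):
        """Ranked character n-gram profile (Cavnar & Trenkle style)."""
        text = " " + text.lower() + " "
        grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
        return [g for g, _ in grams.most_common(top)]

    def out_of_place(doc_profile, lang_profile):
        """Sum of rank differences, with a fixed penalty for
        n-grams missing from the language profile."""
        penalty = len(lang_profile)
        return sum(abs(i - lang_profile.index(g)) if g in lang_profile
                   else penalty
                   for i, g in enumerate(doc_profile))

    def identify(text, training):
        """training: dict mapping language -> sample text."""
        doc = profile(text)
        return min(training,
                   key=lambda lang: out_of_place(doc, profile(training[lang])))

    langs = {"english": "the quick brown fox jumps over the lazy dog",
             "malay": "kucing itu duduk di atas tikar dan anjing itu berlari"}
    print(identify("the dog and the fox", langs))  # english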
The Grounded Process of KMS Adoption: The Case Study of Knowledge Sharing and CoPs in Oil and Gas Industry in Malaysia

This paper presents the adoption of a knowledge management system (KMS). A KMS is an innovation in the field of IT, and its adoption rests within the literature of IT adoption theories. This research documents, via Grounded Theory (GT), the adoption of a KMS in the oil and gas industry in Malaysia. The model generated offers insights into the technology adoption process and documents 12 factors that can prove useful in stimulating employees to go online and share.

Sureena Matayong, Ahmad Kamil Bin Mahmood
File Integrity Monitor Scheduling Based on File Security Level Classification

The integrity of operating system components must be carefully handled in order to optimize system security, as attackers always attempt to alter or modify these components to achieve their goals, with system files being common targets. File integrity monitoring tools are widely used to detect any malicious modification to these critical files. The two existing methods, off-line and on-line file integrity monitoring, each have their own disadvantages. This paper proposes an enhancement to the scheduling algorithm of the current file integrity monitoring approach by combining off-line and on-line monitoring with dynamic inspection scheduling based on a file classification technique. Files are divided into groups by their security level, and the integrity monitoring schedule is defined per group. Initial testing results show that our system is effective in on-line detection of file modification.

Zul Hilmi Abdullah, Nur Izura Udzir, Ramlan Mahmod, Khairulmizam Samsudin
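
A Python sketch of level-based inspection scheduling with cryptographic baselines; the levels, intervals, and grouping below are illustrative assumptions, not the paper's actual classification:

    import hashlib
    import time
    from pathlib import Path

    # Assumed inspection intervals (seconds) per security level.
    INTERVALS = {"high": 60, "medium": 600, "low": 3600}

    def sha256(path):
        return hashlib.sha256(Path(path).read_bytes()).hexdigest()

    def monitor(files_by_level, rounds=3):
        """files_by_level: dict level -> list of paths. Baseline each
        file, then re-check it whenever its level's interval elapses."""
        baseline = {p: sha256(p)
                    for ps in files_by_level.values() for p in ps}
        next_check = {p: time.time() + INTERVALS[lvl]
                      for lvl, ps in files_by_level.items() for p in ps}
        for _ in range(rounds):
            now = time.time()
            for p, due in next_check.items():
                if now >= due:
                    if sha256(p) != baseline[p]:
                        print(f"ALERT: {p} modified")
                    lvl = next(l for l, ps in files_by_level.items()
                               if p in ps)
                    next_check[p] = now + INTERVALS[lvl]
            time.sleep(1)

    # Example (hypothetical paths):
    # monitor({"high": ["/etc/passwd"], "low": ["/var/tmp/app.log"]})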
A Convex Hull-Based Fuzzy Regression to Information Granules Problem – An Efficient Solution to Real-Time Data Analysis

Regression models are well known and widely used as one of the important categories of models in system modeling. In this paper, we extend the concept of fuzzy regression in order to handle real-time data analysis over information granules. An ultimate objective of this study is to develop a hybrid of a genetically guided clustering algorithm called genetic algorithm-based Fuzzy C-Means (GA-FCM) and a convex hull-based regression approach, regarded as a potential solution to the formation of information granules. It is shown that a Granular Computing setting helps reduce computing time, especially in the case of real-time data analysis, as well as overall computational complexity. We propose an efficient real-time regression analysis of information granules based on the convex hull approach, in which a Beneath-Beyond algorithm is employed to build sub convex hulls as well as a main convex hull structure. In the proposed design setting, we emphasize the pivotal role of the convex hull approach, more specifically the Beneath-Beyond algorithm, which becomes crucial in alleviating the limitations of linear programming manifested in system modeling.

Azizul Azhar Ramli, Junzo Watada, Witold Pedrycz
Genetic Algorithm Approach to Path Planning for Intelligent Camera Control for Scientific Visualization

In this paper, we propose an intelligent camera control algorithm for scientific visualization. Intelligent camera control refers to a path planning algorithm that allows a virtual camera to navigate a scene autonomously. It overcomes some shortcomings of traditional manual navigation, such as the risk of getting lost in the scene or the user's distraction from the main goal of the study. In past years, several path planning approaches have been proposed. While those approaches focus on determining the shortest path between two points, they cannot adapt to the multiple constraints that a virtual camera is subject to in scientific visualization. Inspired by Unmanned Aerial Vehicle path planning, our algorithm uses a genetic algorithm as an optimization tool. Finally, the paper presents experimental results for our algorithm, including an empirical study to determine optimal values for the genetic parameters.

Djamalladine Mahamat Pierre, Nordin Zakaria
Software Risk Assessment: A Review on Small and Medium Software Projects

Software risk assessment is a process of identifying, analyzing, and prioritizing risks. In general, there are large, medium, and small software projects, and each of them can be influenced by risk; each therefore needs a distinct assessment of the possible risks that may cause failure or loss of the project if they occur. In the literature, a wide range of risk assessment research has been conducted on software projects, but few studies focus on risk assessment of small and medium software projects. This leaves a gap in the risk assessment research field, which can leave most small and medium projects without risk assessment. Therefore, the main focus of this paper is to give researchers insight into the current state of risk assessment for small and medium software development projects. Finally, some future directions are discussed, with the aim of closing this gap.

Abdullahi Mohamud Sharif, Shuib Basri
Building Knowledge Representation for Multiple Documents Using Semantic Skolem Indexing

The rapid growth of digital data and of users' information needs has made the demand for automatic indexing more important than before. Keyword-based indexing has proven unable to cater to current needs. Thus, this paper presents a new approach for creating a semantic skolem index over multiple documents that automatically indexes all the documents into a single knowledge representation. The skolem indexing matrix is then incorporated into a question answering system to retrieve the answers to users' queries.

Kasturi Dewi Varathan, Tengku Mohd. Tengku Sembok, Rabiah Abdul Kadir, Nazlia Omar
Synchronous Replication: Novel Strategy of Software Persistence Layer Supporting Heterogeneous System

Synchronous replication is the ideal solution for organizations seeking the fastest possible data recovery, minimal data loss, and protection against database integrity problems. It ensures that a remote copy of the data, identical to the primary copy, is created at the same time the primary copy is updated. However, most synchronous replication schemes do not consider heterogeneous systems. In this paper, a software persistence layer for heterogeneous synchronous replication, known as PLSR, has been designed and developed based on multi-threading. The main objective of this strategy is to make the persistence layer adaptive and to make the synchronous replication process reliable and faster than other existing replication processes, with cost minimization in mind. In the proposed PLSR replication technique, the replication servers are OS independent and the replication process does not depend on the main server. Adding a new replication server is easier than in other processes, and the technique also allows modification of replication servers without impairing the entire process. Comparative results with SQL Server data replication show that PLSR is preferable in terms of transactional insertions and sync time: PLSR performs 88.5% faster than SQL Server for transactional inserts.

A. H. Beg, Noraziah Ahmad, Ahmed N Abd Alla, E. I. Sultan, Noriyani Mohd Zin
Preserving Data Replication Consistency through ROWA-MSTS

In modern distributed systems, replication receives particular attention as a means to provide high data availability and reliability and to enhance system performance. Replication is a significant mechanism because it enables organizations to provide users with access to current data where and when they need it. Integrated VSFTPD with Read-One-Write-All Monitoring Synchronization Transaction System (ROWA-MSTS) has been developed to monitor data replication and transactions performed in distributed system environments. This paper presents the ROWA-MSTS framework and process flow for preserving data replication consistency. The implementation shows that ROWA-MSTS is able to monitor the replicated data distribution while maintaining data consistency over multiple sites.

Noraziah Ahmad, Roslina Mohd. Sidek, Mohammad F. J. Klaib, Ainul Azila Che Fauzi, Mohd Helmy Abd Wahab, Wan Maseri Wan Mohd
Halal Product Price Indicator Portal: Promoting Malaysia’s Local Halal SMEs

Local Halal small and medium enterprises (SMEs) play an important role in Malaysia's future economy. Currently, the popularity of their products is still low compared to those of large and multinational companies. A Halal Product Price Indicator Portal is proposed to help promote and improve the condition of these SMEs. The portal involves the Malaysia Department of Islamic Development (JAKIM) and local Halal SMEs in Malaysia. The main feature of the portal is Halal product price information and comparison functionality. It is believed that the establishment of the portal will encourage people to view it and, in time, will help promote local Halal SMEs, make their products more accessible to customers, and eventually contribute to national development.

Alfi Khairiansyah Machfud, Jawdat Khatib, Ahmed Abdulaziz Haji-Ahmed, Abdul Rahman Ahmad Dahlan
A Pattern for Structuring the Information System Architecture - Introducing an EA Framework for Organizing Tasks

Enterprise Architecture (EA) is an approach which aims to align IT and business strategy to help organizations invest wisely in IT and make the most of their current IT facilities. EA is a conceptual framework which defines the structure of an enterprise, its components, and their relationships. Developing an EA is considered a massive and complicated job; if proper methods and approaches are not used, a great deal of chaos and overhead will result. This paper aims to introduce a structure for organizing EA tasks. This structure provides guidelines on how to organize and document business services, so that a more organized EA process is carried out and both time and effort are saved. A case study is used to elaborate and explain the structure, accompanied by evaluation and discussion.

Shokoofeh Ketabchi, Navid Karimi Sani, Kecheng Liu
The Specifications of the Weakly Hard Real-Time Systems: A Review

A real-time system is one in which the temporal aspects of its behaviour are part of its specification. The problem with traditional real-time specifications is that no guarantees can be given to tasks on when and how many deadlines may be missed, because the specification focuses on meeting all deadlines, and a task that misses a deadline has failed completely. The weakly hard specification solves this problem by defining a gradation of how deadlines may be missed while still guaranteeing that tasks meet their deadlines. In this paper, we review three specifications of real-time systems in which occasional losses or misses of deadlines are permitted. Three criteria are used in the evaluation: the process model, the temporal specification, and predictability. These criteria were chosen because the tasks in real-time systems are usually periodic in nature, have timing constraints such as deadlines, and must exhibit predictable system behaviour. The three specifications reviewed in this paper are the skip constraint, known as skip factor s; (m, k)-firm deadlines; and the weakly hard constraints. The objective of the review is to find which specification is better for predicting the behaviour of a task based on those three criteria. Based on our review and an evaluation using a mobile robot case study, we conclude that the weakly hard constraints outperform the two conventional specifications of weakly hard real-time systems, owing to their ability to specify clearly the distribution of deadlines met and missed.

Habibah Ismail, Dayang N. A. Jawawi
Extracting Named Entities from Prophetic Narration Texts (Hadith)

In this paper, we report our work on a Finite State Transducer-based entity extractor, which applies named-entity extraction techniques to identify useful entities from prophetic narration texts. A Finite State Transducer has been implemented in order to capture different types of named entities. For development and testing purposes, we collected a set of prophetic narration texts from the "Sahîh Al-Bukhari" corpus. Preliminary evaluation results demonstrate that our approach is feasible. Our system achieved encouraging precision and recall rates: the overall precision and recall are 71% and 39%, respectively. Our future work includes conducting larger-scale evaluation studies and enhancing the system to capture named entities from chains of transmitters (Salasil Al-Assanid) and biographical texts of narrators (Tarajims).

Fouzi Harrag, Eyas El-Qawasmeh, Abdul Malik Salman Al-Salman

Engineering

Security as a Service for User Customized Data Protection

Some Internet services require users to provide sensitive information such as a credit card number or an ID-password pair. In these services, the manner in which the provided information is used is solely determined by the service providers. As a result, even when a service provider's use of the information appears vulnerable, users have no choice but to allow such usage. In this paper, we propose a framework that enables users to select the manner in which their sensitive information is protected. In our framework, a policy, which defines the type of information protection, is offered as a Security as a Service. According to the policy, users can incorporate the type of information protection into a program. By allowing a service provider to use their sensitive information through this program, users can protect their sensitive information in the manner chosen by them.

Kenichi Takahashi, Takanori Matsuzaki, Tsunenori Mine, Takao Kawamura, Kazunori Sugahara

Software Security

Centralizing Network Digital Evidences

The forensic community has long focused on investigating only the operating system (computer) to discover the secrets of digital crimes. However, these techniques are no longer reliable for achieving investigation aims, since operating system data can be tampered with by the attacker. Hence, the focus shifts to an alternative field: network forensics. In this paper, a methodology for collecting and centralizing network digital evidence in order to produce reliable investigations is introduced. In a case study, a laboratory is designed and set up to examine the proposed solution for collecting and centralizing network digital evidence. Finally, the weaknesses of operating system forensics are demonstrated, and a solution to these shortcomings, based on collecting and centralizing network digital evidence for use in investigation, is presented.

Mohammed Abbas, Elfadil Sabeil, Azizah Abdul Manaf
Challenges and Opportunities in the Information Systems Security Evaluation and Position of ISO/IEC 15408

Organizations encounter challenges that cannot be overcome without a systematic engineering approach and the preparation of a secure information system. The most important and greatest challenge concerns the security of the area that provides the information systems. The main contribution of this work is a security standard-based process for software product line development. It is based on categorized vulnerabilities and concepts of software engineering, and on a redefinition of the information system life cycle that integrates Common Criteria (ISO/IEC 15408) controls into the product line life cycle. The present approach reduces the complexity and ambiguity inherent in information systems security through an engineered, well-defined, repeatable process. Thus, security organizations that implement secure products can ensure the security level of their products and use a time- and cost-effective engineering process to improve their future products.

Nasser Vali, Nasser Modiri
Computer Security Threats Towards the E-Learning System Assets

An e-learning system is a web-based system and is therefore exposed to computer threats. The services or assets of the e-learning system must be protected from any computer threats to ensure that users have peace of mind when using it. It is important to identify and understand the threats to the system in order to develop a secure system. The main objectives of this paper are to discuss the computer security threats to e-learning system assets and to study six categories of computer security threats to those assets. The activities involving the e-learning assets are analyzed and evaluated using the STRIDE model. The results show that e-learning system assets are exposed to threats on availability, integrity and confidentiality. Our findings also show that the highest-risk assets are assessments and students' assessment marks.

Zainal Fikri Zamzuri, Mazani Manaf, Adnan Ahmad, Yuzaimi Yunus
User Acceptance for Extended Function Point Analysis in Software Security Costing

This paper explains the user acceptance evaluation of the Extended Function Point Analysis (Extended FPA) in software security costing. The construction of the Software Security Characteristics Model (SSCM), the validation of SSCM, the prototype, as well as the adaptation of user acceptance models, are discussed in this paper. The paper also emphasizes the user acceptance test for the prototype. The experiment construct includes variable selection, subject selection, hypothesis formulation, and treatment. A user acceptance questionnaire is set up, followed by the experiment. Results show that Extended FPA is perceived as easier to use, more useful, and more likely to be used than IFPUG FPA in calculating software security cost.

Nur Atiqah Sia Abdullah, Rusli Abdullah, Mohd Hasan Selamat, Azmi Jaafar
Analyzing Framework for Alternate Execution of Workflows for Different Types of Intrusion Threats

The main objective of this research is to analyze the framework for alternate execution of workflows under intrusion threat with respect to different types of threats. The framework for alternate execution of workflows under threat keeps the system available to the end user even when it is under attack by some intrusion threat. The framework must be assessed in terms of which, and how many, types of threats it may work against. For this purpose, 34 different types of threats as described by SOPHOS have been considered. First, the types of threats are categorized based on their goals; then the framework is assessed for each category. On the basis of that assessment, it is analyzed for which types of threats the framework can be enabled completely or partially, and the number of threats for which it is enabled completely is determined. Based on the analysis, recommendations are made for possible extensions to the framework where it is enabled only partially.

Sohail Safdar, Mohd Fadzil Hassan
Taxonomy of C Overflow Vulnerabilities Attack

Various software vulnerability classifications have been constructed since the early 70s for the correct understanding of vulnerabilities, and these act as a strong foundation to protect and prevent software from exploitation. However, despite all research efforts, exploitable vulnerabilities still exist in most major software, the most common still being C overflow vulnerabilities. C overflow vulnerabilities are the most frequent vulnerabilities to appear in various advisories with high impact or critical severity. Partially but significantly, this is due to the absence of a source code perspective taxonomy addressing all types of C overflow vulnerabilities. Therefore, we propose such a taxonomy, which also classifies the latest C overflow vulnerabilities into four new categories. We also describe ways to detect and overcome these vulnerabilities; the taxonomy thus acts as a valuable reference for developers and security analysts to identify potential C security loopholes and to reduce or prevent exploitation altogether.

Nurul Haszeli Ahmad, Syed Ahmad Aljunid, Jamalul-lail Ab Manan
A Secure Storage Model to Preserve Evidence in Network Forensics

The main challenge in network forensics, especially during the trial session, is to protect the evidence and preserve its contents from malicious attempts to modify and tamper with it. Any potential evidence that is not accurate, complete, reliable, and verifiable will certainly affect the decisions of the jury and judges. In this paper, we classify the potential evidence that will be stored in network storage based on its contents, characteristics, and functions. We also propose a Secure Storage Model, which implements components that preserve evidence using cryptographic hashing and a logging report. As a result, we present the flow of our storage mechanisms and show the importance of hashing in forensics work for securing collected network evidence.

Mohd Izham Ibrahim, Aman Jantan
Attack Intention Analysis Model for Network Forensics

In network forensics, attack intention analysis plays a major role in helping and accelerating decision-making for apprehending the real perpetrator; it is a prediction factor that helps investigators conclude a case with high accuracy. However, current techniques in attack intention analysis focus only on recognizing alert correlations for certain evidence and predicting future attacks. In reality, investigators should use more prediction factors, such as attack intention and incident path, to come to a more concise decision. This paper proposes an attack intention analysis model that focuses on reasoning about attacks under uncertain intention. A new model is introduced that combines the mathematical Dempster-Shafer (D-S) theory of evidence with a probabilistic technique, through a causal network, to predict attack intention. We found that by analyzing the attacker's intention, forensic investigation agents are able to audit and process evidence efficiently. Experiments were performed on samples of attack intention probabilities to evaluate the proposed model. Arguably, the attack intention analysis model can provide a clear and influential factor for investigator decision-making.

M. Rasmi, Aman Jantan
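
The evidential core named in the abstract is the Dempster-Shafer theory; its rule of combination, in LaTeX, fuses two independent mass functions m1 and m2 over candidate hypotheses (e.g., possible intentions), discounting the conflict K:

    % Dempster's rule of combination:
    \begin{align*}
    K &= \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C), \\
    (m_1 \oplus m_2)(A) &= \frac{1}{1 - K}
        \sum_{B \cap C = A} m_1(B)\, m_2(C),
        \qquad A \neq \emptyset .
    \end{align*}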
Formal Verification for Interaction Protocol in Agent-Based E-Learning System Using Model Checking Toolkit - MCMAS

This paper presents formal verification of an interaction protocol in an agent-based e-learning system using a model checking toolkit, MCMAS (Model Checking Multi-Agent Systems). The goal of this interaction protocol is to automate document downloading and notification in e-learning on behalf of students. The specification of the interaction protocol for each agent is translated into an Interpreted Systems Programming Language (ISPL) file, the input language of MCMAS, and a combination of temporal logic operators from Computation Tree Logic (CTL) and Linear Temporal Logic (LTL) is used to verify the formula for each tested reachability property. The purpose of this formal verification is to establish that the interaction protocol is sound and that each state is reachable. Overall, this paper describes the idea of the agent-based e-learning system; the MCMAS toolkit used as the model checker; the CTL and LTL logics used in verification; ISPL, the programming language used under the MCMAS platform; and the results derived from running the verification.

Norizal Abd Latif, Mohd Fadzil Hassan, Mohd Hilmi Hasan
A New Stream Cipher Using Natural Numbers Based on Conventional Encryption Techniques: MINNSC

Stream ciphers are an important part of symmetric cryptosystems. The one-time pad is the basic idea behind stream ciphers: it applies the XOR operation to the plain text and the key to generate the cipher text. This research proposes a new stream cipher called MINNSC, based on an encryption-decryption process using natural numbers. For a chosen natural number, blocks of non-zero divisors are generated and used as the encryption keys. These non-zero divisors form a group whose distribution is quite random in nature, and this randomness is the desired property of a stream cipher.

Rajendra Hegadi
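
A Python sketch of the one-time-pad idea the abstract starts from. The keystream here is drawn, purely for illustration, from the non-zero divisors (units) modulo a chosen natural number; this is an assumption, not MINNSC's actual construction:

    from itertools import cycle
    from math import gcd

    def keystream_bytes(n):
        """Illustrative keystream: the units modulo n (the elements
        with gcd(k, n) = 1 form a group); MINNSC may differ."""
        units = [k for k in range(1, n) if gcd(k, n) == 1]
        return cycle(b % 256 for b in units)

    def xor_cipher(data: bytes, n: int) -> bytes:
        """One-time-pad style: XOR each byte with the next keystream
        byte. The same call decrypts, since (x ^ k) ^ k == x."""
        return bytes(b ^ k for b, k in zip(data, keystream_bytes(n)))

    msg = b"attack at dawn"
    ct = xor_cipher(msg, 257)
    print(xor_cipher(ct, 257) == msg)  # True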
Remote User Authentication Scheme with Hardware-Based Attestation

Many previous works on remote user authentication schemes relate to remote service environments such as online banking and electronic commerce. However, these schemes depend solely on one parameter, namely user legitimacy, to fulfill the authentication process. Furthermore, most of the schemes rely on a prearranged shared secret key or a server secret key to generate a session key to secure the communication. Consequently, these schemes are vulnerable to malicious software attacks that can compromise the integrity of the platform used for the communication. As a result, the user identity or shared secret key can potentially be exposed, owing to the schemes' limitations in providing trust or evidence of the claimed platform identity. In this paper, we propose a remote authentication scheme with hardware-based attestation and a secure key exchange protocol to resist malicious software attacks. In addition, we propose a pseudonym identity enhancement to improve user identity privacy.

Fazli Bin Mat Nor, Kamarularifin Abd Jalil, Jamalul-lail Ab Manan

Graphics and Multimedia

Optimal Camera Placement for 3D Environment

Efficient camera placement is important to ensure that the cost of a monitoring system is no higher than it should be, and that maintaining the system will not be complex and time-consuming. Based on these issues, optimizing the number of cameras placed inside a particular environment has become an important requirement. This problem is based on the well-known Art Gallery Problem, but most previous works only proposed solutions in 2D. We propose a method for finding the minimum number of cameras that can observe the maximum space of a 3D environment. In this method, we assume that each camera has a limited field of view of 90° and can only be placed on the walls of the environment. Placement in the 3D environment uses a volume approach that takes the frustum's volume and the space's volume to calculate the minimum number of cameras.

Siti Kamaliah Mohd Yusoff, Abas Md Said, Idris Ismail

Databases

XRecursive: A Storage Method for XML Document Based on Relational Database

Storing XML documents in a relational database is a promising solution because relational databases are mature and scale very well, and they have the advantage that XML data and structured data can coexist, making it possible to build applications that involve both kinds of data with little extra effort. In this paper, we propose an algorithm named XRecursive that translates XML documents into a relational database according to the proposed storage structure. The steps and the algorithm are described in detail, showing how to use the storage structure to store and query XML documents in a relational database. We then report experimental results on a real database to show the performance of our method on several features.

M. A. Ibrahim Fakharaldien, Jasni Mohamed Zain, Norrozila Sulaiman
Enhancement Algorithms for SQL-Based Chatbot

An artificial intelligence chatbot is a technology that makes interaction between humans and machines using natural language possible. From the literature on chatbots' keyword/pattern matching techniques, some potential issues for enhancement have been discovered. These issues concern the relation between previous and next responses/outputs, as well as keyword arrangement for matching precedence and keyword variety for matching flexibility. To address these issues, two algorithms are proposed: Extension and Prerequisite, and OMAMC (One-Match and All-Match Categories). Implemented in an SQL-based chatbot, both algorithms are shown to enhance the chatbot's keyword/pattern matching process by providing augmented ways of storing the data and performing the matching. This paper presents the significant results of implementing both proposed algorithms in an SQL-based chatbot, showing enhancements in certain areas of the chatbot's processing.

Abbas Saliimi Lokman, Jasni Mohamad Zain
An Alternative Measure for Mining Weighted Least Association Rule and Its Framework

Mining weight-based association rules has received great attention and is considered one of the important areas in data mining. Items in transactional databases do not always carry the same binary value; some are associated with different levels of importance, such as profit margins, weights, etc. However, study in this area is quite complex and thus requires an appropriate scheme for rule detection. Therefore, this paper proposes a new measure called Weighted Support Association Rules (WSAR*) to discover significant association rules, together with a Weighted Least Association Rules (WELAR) framework. Experimental results show that the significant association rules are successfully mined and the unimportant rules are easily differentiated. Our algorithm in the WELAR framework also outperforms the benchmark FP-Growth algorithm.

Zailani Abdullah, Tutut Herawan, Mustafa Mat Deris
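
A Python sketch of the general weighted-support idea, assuming the common form in which an itemset's support is scaled by the mean weight of its items; the WSAR* measure itself is not specified in the abstract and may be defined differently:

    from itertools import combinations

    def weighted_support(itemset, transactions, weights):
        """support(X) scaled by the mean weight of X's items."""
        sup = sum(itemset <= t for t in transactions) / len(transactions)
        w = sum(weights[i] for i in itemset) / len(itemset)
        return w * sup

    transactions = [{"bread", "milk"}, {"bread", "gold"}, {"milk"}]
    weights = {"bread": 0.2, "milk": 0.1, "gold": 0.9}

    for pair in combinations(sorted(weights), 2):
        x = set(pair)
        # e.g. ('bread', 'gold') scores highest despite modest support
        print(pair, round(weighted_support(x, transactions, weights), 3))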
Mining Interesting Association Rules of Student Suffering Mathematics Anxiety

Association rule mining is one of the most important and popular techniques in data mining applications. The purpose of this study is to apply an enhanced association rule mining method, called SLP-Growth (Significant Least Pattern Growth), proposed in [9], to capture interesting rules in a dataset on students suffering mathematics anxiety. The dataset was taken from a survey exploring mathematics anxiety among engineering students at Universiti Malaysia Pahang (UMP). The results of this research provide useful information for educators to make more accurate decisions about their students and to adapt their teaching strategies accordingly. The results can also help in assisting students to handle their fear of mathematics and be useful in increasing the quality of learning.

Tutut Herawan, Prima Vitasari, Zailani Abdullah
Methodology for Measuring the Quality of Education Using Fuzzy Logic

Fuzzy logic has been considered as a strategy for assigning values to the complex realities that define every aspect of education and for formalizing educational issues. The quality of education has awakened the interest of investigators worldwide, because it can be the answer to education problems and because some countries spend more resources on funding education than others, which leads to higher levels of growth. This article proposes a new methodology that uses fuzzy logic to measure the quality of education from quantitative and qualitative values, with the hope of developing criteria for the quality of education that are closer to the realities of Latin American countries.

Sergio Valdés-Pasarón, Bogart Yail Márquez, Juan Manuel Ocegueda-Hernández
Rough Set Theory for Feature Ranking of Traditional Malay Musical Instruments Sounds Dataset

This paper presents an alternative feature ranking technique for a dataset of Traditional Malay musical instrument sounds, using rough set theory based on the maximum degree of dependency of attributes. The modeling process comprises seven phases: data acquisition, sound editing, data representation, feature extraction, data discretization, data cleansing, and finally feature ranking using the proposed technique. The results show that the features selected by the proposed technique are able to reduce the complexity of the process.

Norhalina Senan, Rosziati Ibrahim, Nazri Mohd Nawi, Iwan Tri Riyadi Yanto, Tutut Herawan
Pi-Sigma Neural Network for Temperature Forecasting in Batu Pahat

In this study, two artificial neural network (ANN) models, a Pi-Sigma Neural Network (PSNN) and a three-layer multilayer perceptron (MLP), are applied to temperature forecasting. PSNN is used to overcome the limitations of the widely used MLP, which can easily get stuck in local minima and is prone to overfitting, so that good generalisation may not be obtained. The models were trained with the backpropagation algorithm on historical temperature data of the Batu Pahat region. Through 810 experiments, we found that PSNN performs considerably better than MLP for daily temperature forecasting and can be suitably adapted to forecast for a particular region using historical data over larger geographical areas.

Noor Aida Husaini, Rozaida Ghazali, Nazri Mohd Nawi, Lokman Hakim Ismail
Revisiting Protocol for Privacy Preserving Sharing Distributed Data: A Review with Recent Results

In this paper, we review some recent results concerning an efficient protocol that allows parties to share data privately, with no restrictions and without loss of accuracy. Privacy policies might discourage people who would otherwise participate in a joint contribution to a data mining task. Previously, we proposed a method with immediate application to horizontally partitioned databases, which can be brought together and made public without disclosing the source/owner of each record. We also showed an additional benefit: our protocol can be applied to privately share patterns in the form of association rules. We have performed further experiments showing that our protocol is more efficient than previous protocols. Aside from the above, we propose a new categorization of the privacy-preserving data mining field.

Ahmed HajYasien

Algorithms

The Performance of Compound Enhancement Algorithm on Abnormality Detection Analysis of Intra-oral Dental Radiograph Images

Dentists look for abnormalities in radiographs to determine any diseases that may appear at the apices of the teeth. However, a poor quality radiograph produces a weak visual signal that may lead to misleading interpretations. Hence the quality of the radiograph influences the dentists' decisions, which in turn reflect the success or failure of any suggested treatment. Thus this work aims to analyze the abnormalities found in intra-oral dental radiographs by comparing the original images with images enhanced using compound enhancement algorithms (CEA), namely Sharp Adaptive Histogram Equalization (SAHE) and Sharp Contrast Limited Adaptive Histogram Equalization (SCLAHE). Results show that SCLAHE-enhanced images provide a slight improvement, compared to the original images, in detecting the widened periodontal ligament space abnormality.

Siti Arpah Ahmad, Mohd Nasir Taib, NoorElaiza Abd Khalid, Rohana Ahmad, Haslina Taib
Performance Study of Two-Dimensional Orthogonal Systolic Array

The systolic array implementation of artificial neural networks is one of the ideal solutions to the communication problems generated by highly interconnected neurons. A systolic array is an arrangement of processors in an array where data flows synchronously across the array between neighbours, usually with different data flowing in different directions. Simulating a systolic array for matrix multiplication is a practical way to evaluate its performance. In this paper, a two-dimensional orthogonal systolic array for matrix multiplication is presented. The Perl scripting language is used to simulate the two-dimensional orthogonal systolic array and compare it with conventional matrix multiplication in terms of average execution time. The comparison is made using matrices of size 5xM versus Mx5, where M ranges from 1 to 10, 10 to 100, and 100 to 1000. The results show that the orthogonal systolic array achieves a better average execution time than conventional matrix multiplication when M is more than 30, i.e., as the size of the matrix multiplication increases.

Ahmad Husni Mohd Shapri, Norazeani Abdul Rahman, Mohamad Halim Abd. Wahid
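
A minimal cycle-based Python model of the orthogonal dataflow being simulated: inputs are skewed so that A[i][s] meets B[s][j] at processing element (i, j) on cycle i + j + s. This is a generic textbook model of the array, not the paper's Perl simulator, and it omits the timing measurements:

    def systolic_matmul(A, B):
        """2-D orthogonal systolic array: A streams in from the left
        (skewed by row), B from the top (skewed by column); PE (i, j)
        multiply-accumulates the operand pair arriving each cycle."""
        n, k, m = len(A), len(B), len(B[0])
        C = [[0] * m for _ in range(n)]
        for t in range(n + m + k):       # enough cycles for all wavefronts
            for i in range(n):
                for j in range(m):
                    s = t - i - j        # operand index at PE (i, j) now
                    if 0 <= s < k:
                        C[i][j] += A[i][s] * B[s][j]
        return C

    A = [[1, 2], [3, 4]]
    B = [[5, 6], [7, 8]]
    print(systolic_matmul(A, B))  # [[19, 22], [43, 50]]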
Similarity Approach on Fuzzy Soft Set Based Numerical Data Classification

The application of soft set theory to the classification of natural textures has been successfully carried out by Mushrif et al. However, that approach cannot be applied to certain classification problems, such as text classification. In this paper, we propose a new numerical data classification method based on the similarity of fuzzy soft sets. Besides being applicable to text classification, this new fuzzy soft set classifier (FSSC) can also be used for general numerical data classification. Compared with the previous soft set classifier in experiments on seven real data sets, the proposed approach gives a high degree of accuracy with low computational complexity.

Bana Handaga, Mustafa Mat Deris
An Empirical Study of Density and Distribution Functions for Ant Swarm Optimized Rough Reducts

Ant Swarm Optimization refers to the hybridization of the Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) algorithms to enhance optimization performance. It is used in rough reduct calculation for identifying optimally significant attribute sets. Coexistence, cooperation, and individual contribution to food searching by a particle (ant) as a swarm (colony) depict the common characteristics of both the PSO and ACO algorithms. The ant colony component of the Ant Swarm algorithm generates local solutions which satisfy the Gaussian distribution for global optimization by the PSO algorithm. The density and distribution functions are the two common representations of the Gaussian distribution; however, descriptions and comparisons of the two functions are very limited. Hence, this paper compares representing the ACO solution vector with the density function and with the distribution function, in order to search for a better solution, specify a probability function for every particle (ant), and generate components of the solution vector that satisfy Gaussian distributions. To describe the relative probability of different random variables, the Probability Density Function (PDF) and the Cumulative Distribution Function (CDF) each provide their own characterization of the Gaussian distribution. The comparison is based on experimental results, seeking higher fitness values and better reducts.

Lustiana Pratiwi, Yun-Huoy Choo, Azah Kamilah Muda
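
For reference, the two standard Gaussian representations under comparison, in LaTeX:

    % PDF and CDF of a Gaussian with mean \mu and std. deviation \sigma:
    \begin{align*}
    \text{PDF:} \quad f(x) &= \frac{1}{\sigma\sqrt{2\pi}}\,
        e^{-\frac{(x-\mu)^2}{2\sigma^2}}, \\
    \text{CDF:} \quad F(x) &= \frac{1}{2}\left[1 +
        \operatorname{erf}\!\left(\frac{x-\mu}{\sigma\sqrt{2}}\right)\right].
    \end{align*}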
The UTLEA: Uniformization of Non-uniform Iteration Spaces in Three-Level Perfect Nested Loops Using an Evolutionary Algorithm

The goal of uniformization, which is based on the concept of vector decomposition, is to find a basic dependence vector set such that any vector in the iteration space can be represented as a non-negative integer combination of these vectors. To approach an optimal solution, an approximate algorithm can be used. In this paper, uniformization for three-level perfectly nested loops is presented using an evolutionary method called the UTLEA, which minimizes both the number of vectors and the dependence cone size. Most available approaches cannot be used here; moreover, they have problems that prevent generalizing them to three levels. The proposed approach attempts to solve these problems, and according to the tests performed, the results achieved are close to optimal.

Shabnam Mahjoub, Shahriar Lotfi
A Discrete Event Simulation for Utility Accrual Scheduling in Uniprocessor Environment

This research focuses on the proposal and development of an event-based discrete event simulator for the existing General Utility Scheduling (GUS) algorithm, to facilitate reuse of the algorithm under a common simulation environment. GUS is one of the existing TUF/UA scheduling algorithms that consider the Time/Utility Function (TUF) of the executed tasks in its scheduling decisions in a uniprocessor environment. The scheduling optimality criterion is to maximize the utility accrued from the execution of all tasks in the system; this criterion is named Utility Accrual (UA). TUF/UA scheduling algorithms are designed for adaptive real-time system environments. The developed GUS simulator derives the set of parameters, events, performance metrics, and other unique TUF/UA scheduling elements from a detailed analysis of the base model.

Idawaty Ahmad, Shamala Subramaniam, Mohamad Othman, Zuriati Zulkarnain
Engineering Design Analysis Using Evolutionary Grammars with Kano’s Model to Refine Product Design Strategies

The ability of a product developer to successfully launch useful products to a market is tied to the company's product development strategies, and thus to its profitability. This paper investigates the shape formulation process for product design strategies. The research focuses on nonlinear product design analysis with Kano's model to refine product design strategies, using an interactive evolutionary-grammar-based design framework.

In analyzing the generated designs, Kano's attribute curves (Basic, Performance and Exciting) are mapped to the analysis results to form reference models. By comparing the user preference curves with the Kano reference models, patterns emerge that delineate different approaches to product design strategies, such as fostering innovation and creativity in product design, controlling production expense, targeting users, enhancing or lowering functionality, and increasing or decreasing resources, features, and services. Upon determining the appropriate Kano reference models, product developers can refine their product design strategies to suit the targeted market.

Ho Cheong Lee
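
For readers unfamiliar with the three curves, the sketch below gives one schematic parameterization (illustrative shapes only; the paper's reference models are derived from its analysis results, not from these formulas): satisfaction as a function of how fully an attribute is implemented.

import math

def basic(x):        # must-be attribute: absence dissatisfies, presence is merely neutral
    return -math.exp(-5 * x)

def performance(x):  # one-dimensional attribute: satisfaction grows with implementation
    return 2 * x - 1

def exciting(x):     # attractive attribute: even partial presence delights
    return math.exp(5 * (x - 1))

for x in (0.0, 0.5, 1.0):
    print(x, round(basic(x), 3), round(performance(x), 3), round(exciting(x), 3))
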
Applications of Interval-Valued Intuitionistic Fuzzy Soft Sets in a Decision Making Problem

Soft set theory, in combination with the interval-valued intuitionistic fuzzy set, has been proposed as the concept of the interval-valued intuitionistic fuzzy soft set. However, up to the present, few documents have focused on practical applications of interval-valued intuitionistic fuzzy soft sets. In this paper, we first present an algorithm to solve decision making problems based on interval-valued intuitionistic fuzzy soft sets, which can help the decision maker obtain the optimal choice. We then propose a definition of normal parameter reduction of interval-valued intuitionistic fuzzy soft sets, which addresses the problems of suboptimal choice and an added parameter set, and give a heuristic algorithm to achieve this normal parameter reduction. Finally, an illustrative example is employed to show our contribution.

Xiuqin Ma, Norrozila Sulaiman, Mamta Rani
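
The decision step of such algorithms can be sketched as follows (a hedged toy example: the table values are invented, and the score function is one common choice from the interval-valued intuitionistic fuzzy literature, not necessarily the paper's).

def score(cell):
    # cell = ((mu_lo, mu_hi), (nu_lo, nu_hi)): membership and non-membership intervals.
    (mu_lo, mu_hi), (nu_lo, nu_hi) = cell
    return (mu_lo + mu_hi - nu_lo - nu_hi) / 2

table = {  # alternatives x parameters, hypothetical values
    "house_1": [((0.5, 0.6), (0.2, 0.3)), ((0.4, 0.7), (0.1, 0.2))],
    "house_2": [((0.3, 0.5), (0.3, 0.4)), ((0.6, 0.8), (0.1, 0.2))],
}
totals = {alt: sum(score(c) for c in cells) for alt, cells in table.items()}
print(max(totals, key=totals.get), totals)   # the optimal choice and all scores
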
Elasticity Study: DBrain Project Grid-Cloud Integration

DBrain is a collaborative, multi-disciplinary project aimed primarily at providing information and assistance for patient-physician interaction on Dementia. This paper explores the possibility and feasibility of a Grid implementation over Cloud infrastructure to extend and enhance the current resource capacity in anticipation of future expansion. The paper presents a study of the Cloud's elasticity for achieving effective operational cost within the context of the DBrain project. Finer details of the current metric are explored and extended to form an elasticity indicator.

Dani Adhipta, Mohd Fadzil Hassan, Ahmad Kamil Mahmood
Embedded Backups for Link Failure Recovery in Shortest Path Trees

The Shortest Path Tree (SPT) problem has always been popular, and with the advent of Dijkstra's algorithm, SPT and many related problems received immense attention from researchers. Shortest path trees play an important role in many applications such as robot navigation, games, transportation, and communication routing. In many applications, such as network and vehicle routing, fast and reliable recovery from failures is desired. Recovery from failed links needs a plan that incurs the least additional healing cost, so that the process continues with no or minimal delay. This paper presents an approach to recover from the undesirable state of link failure(s) back to the normal working state: a shortest path tree is extracted from the graph with an alternate path embedded at each node (junction), keeping the cost as low as possible.

Muhammad Aasim Qureshi, Mohd Fadzil Hassan
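
One way to realize such embedded backups (a sketch under simplifying assumptions, not necessarily the paper's exact construction) is to run Dijkstra's algorithm and, at each node, remember a displaced or non-tree incoming edge as the alternate parent.

import heapq

def spt_with_backups(graph, source):
    dist, parent, backup = {source: 0}, {}, {}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                                  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                if v in parent and parent[v] != u:
                    backup[v] = parent[v]             # displaced edge becomes the backup
                dist[v], parent[v] = nd, u
                heapq.heappush(pq, (nd, v))
            elif v in parent and parent[v] != u:
                backup.setdefault(v, u)               # non-tree edge can serve as backup
    return dist, parent, backup

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 6)],
         "c": [("d", 3)], "d": []}
print(spt_with_backups(graph, "a"))
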
Managing Differentiated Services in Upstream EPON

In order to enhance bandwidth utilization while maintaining the required quality of service (QoS), an efficient dynamic bandwidth allocation (DBA) algorithm that supports global priority is proposed for upstream Ethernet passive optical networks (EPON). The algorithm aims to allocate bandwidth to the different traffic classes according to their priority as a whole. The simulation is done in MATLAB, comparing the proposed algorithm with a DBA algorithm from the literature. The proposed algorithm improves bandwidth utilization by as much as 11.78%, and reduces expedited forwarding (EF), assured forwarding (AF), and best effort (BE) delays by as much as 25.22%, 18.96%, and 21.34% respectively, compared to Min's DBA.

Nurul Asyikin Mohamed Radzi, Norashidah Md. Din, Mohammed Hayder Al-Mansoori, Mohd. Shahmi Abdul Majid
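
The global-priority idea can be illustrated with a deliberately simplified grant loop (class names follow the abstract; capacities and requests are hypothetical, and the real algorithm's per-ONU accounting is omitted).

def allocate(requests, capacity):
    # Grant EF before AF before BE until the cycle's upstream capacity runs out.
    grants = {}
    for cls in ("EF", "AF", "BE"):
        grants[cls] = min(requests.get(cls, 0), capacity)
        capacity -= grants[cls]
    return grants

print(allocate({"EF": 30, "AF": 50, "BE": 40}, capacity=100))
# {'EF': 30, 'AF': 50, 'BE': 20} -- best effort absorbs only the remainder
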
Real-Time Volume Shadow Using Stencil Buffer

Two of the best methods of silhouette detection for creating real-time volume shadows in virtual environments are described in this paper. A volume shadow algorithm is implemented for a virtual environment with a movable light source. The triangular method and the visible/non-visible method are introduced, improving on the recent traditional silhouette detection and implementation techniques used in volume shadow algorithms. Flowcharts of both algorithms are presented, the volume shadow algorithm using the stencil buffer is rewritten, and a very simple pseudo code for creating volume shadows is proposed. These techniques are poised to bring realism into commercial games and may be used in virtual reality applications.

Hoshang Kolivand, Mohd Shahrizal Sunar, Yousef Farhang, Haniyeh Fattahi
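
For orientation, the stencil-buffer passes commonly used for volume shadows look roughly as follows (a depth-fail sketch using standard OpenGL calls; draw_scene and draw_shadow_volumes are hypothetical placeholders, and this is not the paper's rewritten algorithm verbatim).

from OpenGL.GL import *

def draw_scene():             # placeholder: application-specific geometry pass
    pass

def draw_shadow_volumes():    # placeholder: extruded silhouette geometry
    pass

def render_with_shadows():
    draw_scene()                                      # pass 1: depth + ambient
    glEnable(GL_STENCIL_TEST)                         # pass 2: mark shadowed pixels
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)
    glDepthMask(GL_FALSE)
    glStencilFunc(GL_ALWAYS, 0, 0xFF)
    glEnable(GL_CULL_FACE)
    glCullFace(GL_FRONT)                              # back faces: increment on depth fail
    glStencilOp(GL_KEEP, GL_INCR_WRAP, GL_KEEP)
    draw_shadow_volumes()
    glCullFace(GL_BACK)                               # front faces: decrement on depth fail
    glStencilOp(GL_KEEP, GL_DECR_WRAP, GL_KEEP)
    draw_shadow_volumes()
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE)   # pass 3: light unshadowed pixels
    glDepthMask(GL_TRUE)
    glStencilFunc(GL_EQUAL, 0, 0xFF)                  # stencil == 0 means lit
    glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP)
    draw_scene()
    glDisable(GL_STENCIL_TEST)
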
Highest Response Ratio Next (HRRN) vs First Come First Served (FCFS) Scheduling Algorithm in Grid Environment

Research on Grid scheduling nowadays focuses on solving three problems: finding a good algorithm, automating the process, and building a flexible, scalable, and efficient scheduling mechanism. The complexity of the scheduling predicament increases linearly with the size of the Grid. Jobs submitted to a Grid environment are put in a queue due to the large number of job submissions, and an adequate Grid scheduling technique is needed to schedule these jobs and send them to their assigned resources. This paper simulates, in C, the First Come First Served (FCFS) and Highest Response Ratio Next (HRRN) Grid scheduling algorithms. A good scheduling algorithm normally shows lower total waiting and schedule times. HRRN was selected because the algorithm outperforms the scheduling of the existing gLite Grid middleware. The simulation results show that HRRN achieves a better total waiting time due to its priority scheme policy.

Rohaya Latip, Zulkhairi Idris
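
The policy difference is easy to state in code. In the sketch below (toy jobs, not the paper's C simulator), HRRN repeatedly serves the ready job with the highest response ratio, (waiting time + service time) / service time, so short jobs that have waited can overtake long ones.

def hrrn(jobs):                 # jobs: list of (arrival_time, service_time)
    pending = sorted(enumerate(jobs), key=lambda j: j[1][0])
    clock, order = 0, []
    while pending:
        ready = [j for j in pending if j[1][0] <= clock] or [pending[0]]
        pick = max(ready, key=lambda j: (clock - j[1][0] + j[1][1]) / j[1][1])
        pending.remove(pick)
        clock = max(clock, pick[1][0]) + pick[1][1]
        order.append(pick[0])
    return order

print(hrrn([(0, 8), (1, 4), (2, 1)]))   # [0, 2, 1]: the short waiter overtakes job 1
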
Reduced Support Vector Machine Based on k-Mode Clustering for Classification Large Categorical Dataset

The smooth support vector machine (SSVM) is one of the promising algorithms for classification problems. However, it is restricted to work well only on small to moderate datasets; computational difficulties arise when SSVM with a nonlinear kernel is applied to large datasets. Based on SSVM, the reduced support vector machine (RSVM) was proposed to overcome these difficulties by using a randomly selected subset of the data to obtain a nonlinear separating surface. In this paper, we propose an alternative algorithm, k-mode RSVM (KMO-RSVM), which combines RSVM with the k-mode clustering technique to handle classification problems on large categorical datasets. In our experiments, we tested the effectiveness of KMO-RSVM on four publicly available datasets. It turns out that KMO-RSVM improves running time significantly over SSVM while still obtaining high accuracy. Comparison with RSVM indicates that KMO-RSVM is faster, produces a smaller reduced set, and achieves comparable testing accuracy.

Santi Wulan Purnami, Jasni Mohamad Zain, Abdullah Embong
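
The k-mode step at the heart of the proposal clusters categorical records around modes rather than means. The toy sketch below shows that step only (naive initialization, invented data); selecting the resulting modes as an RSVM reduced set is how such a pipeline might proceed, not a reproduction of KMO-RSVM.

from collections import Counter

def dist(a, b):                 # simple-matching dissimilarity for categorical tuples
    return sum(x != y for x, y in zip(a, b))

def k_modes(data, k, iters=10):
    modes = data[:k]            # naive initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for rec in data:
            clusters[min(range(k), key=lambda i: dist(rec, modes[i]))].append(rec)
        modes = [tuple(Counter(col).most_common(1)[0][0] for col in zip(*c)) if c else m
                 for c, m in zip(clusters, modes)]
    return modes

data = [("red", "s"), ("red", "m"), ("blue", "l"), ("blue", "m"), ("red", "s")]
print(k_modes(data, k=2))       # two categorical prototypes as a reduced set
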

Signal Processing

Fractal Analysis of Surface Electromyography (EMG) Signal for Identify Hand Movements Using Critical Exponent Analysis

Recent advances in non-linear analysis have led to an understanding of the complexity and self-similarity of the surface electromyography (sEMG) signal. This paper examines the use of the critical exponent analysis method (CEM), a fractal dimension (FD) estimator, to study properties of the sEMG signal and to use these properties to identify various kinds of hand movements for prosthesis control and human-machine interfaces. The sEMG signals were recorded from ten healthy subjects performing seven hand movements, at eight muscle positions. Mean values and coefficients of variation of the FDs across all the experiments show that there are large variations between hand movement types but small variation within each hand movement. The FD related to the self-affine property of the sEMG signal extracted from the different hand activities ranged from 1.944 to 2.667. These results have also been evaluated with box plots and analysis of variance (p values). The evaluation demonstrates that the FD is more suitable as an EMG feature for characterizing sEMG signals than the common and popular sEMG feature, the root mean square (RMS): the p values of the FDs for six muscle positions were less than 0.0001, while those of the RMS, a candidate feature, ranged between 0.0003 and 0.1195. The FD computed by the CEM can thus be applied as a feature for different kinds of sEMG applications.

Angkoon Phinyomark, Montri Phothisonothai, Pornpana Suklaead, Pornchai Phukpattaranont, Chusak Limsakul
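
The critical exponent method itself is not reproduced here; as a hedged stand-in, the sketch below computes the RMS baseline feature alongside Higuchi's widely used fractal dimension estimator on a synthetic signal, to show what an FD-versus-RMS feature comparison involves.

import math, random

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

def higuchi_fd(x, kmax=8):
    # Estimate FD as the slope of log L(k) versus log(1/k).
    n, points = len(x), []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            pts = [x[i] for i in range(m, n, k)]
            if len(pts) > 1:
                curve = sum(abs(b - a) for a, b in zip(pts, pts[1:]))
                lengths.append(curve * (n - 1) / ((len(pts) - 1) * k) / k)
        points.append((math.log(1.0 / k), math.log(sum(lengths) / len(lengths))))
    mx = sum(p for p, _ in points) / len(points)
    my = sum(q for _, q in points) / len(points)
    return sum((p - mx) * (q - my) for p, q in points) / sum((p - mx) ** 2 for p, _ in points)

random.seed(0)
signal = [random.gauss(0, 1) for _ in range(1000)]
print("RMS:", rms(signal), "FD:", higuchi_fd(signal))   # FD near 2 for white noise
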
Robust Eye Movement Recognition Using EOG Signal for Human-Computer Interface

The electrooculography (EOG) signal is one of the most useful biomedical signals, and its development as a control signal has received increasing interest in the last decade. In this study, we propose a robust classification algorithm for eight useful directional eye movements that can avoid the effects of noise, particularly the eye-blink artifact. Threshold analysis is used to detect the onset of eye movements. Four beneficial time features are then proposed: peak and valley amplitude positions, and upper and lower lengths of the two EOG channels. Suitable threshold conditions were defined and evaluated; from the experimental results, optimal threshold values were selected for each parameter, and classification accuracies approached 100% in tests on three subjects. To avoid the eye-blink artifact, the first derivative was additionally implemented.

Siriwadee Aungsakun, Angkoon Phinyomark, Pornchai Phukpattaranont, Chusak Limsakul
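
A schematic version of the two detection ideas (threshold onset plus derivative-based blink rejection) follows; the thresholds, window values, and left/right rule are hypothetical, not the paper's tuned parameters.

def classify(window, onset_thr=100.0, blink_slope_thr=80.0):
    peak = max(window, key=abs)
    if abs(peak) < onset_thr:
        return "rest"                             # no movement onset detected
    slope = max(abs(b - a) for a, b in zip(window, window[1:]))
    if slope > blink_slope_thr:
        return "blink"                            # blinks rise much faster than saccades
    return "right" if peak > 0 else "left"        # toy rule on one EOG channel

print(classify([0, 40, 120, 150, 130, 60]))       # gradual rise -> "right"
print(classify([0, 10, 150, 20, 5, 0]))           # sharp spike  -> "blink"
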
Recognizing Patterns of Music Signals to Songs Classification Using Modified AIS-Based Classifier

Human capabilities in recognizing different types of music and grouping them into genre categories are so remarkable that experts in music can perform such classification using their hearing senses and logical judgment. For decades now, the scientific community has been involved in research to automate this human process of recognizing song genres. These efforts normally imitate the human method of recognizing music by considering every essential component of a song, from the artist's voice and the melody of the music through to the type of instruments used. As a result, various approaches and mechanisms have been introduced and developed to automate the classification process. The results of these studies so far have been remarkable, yet can still be improved. The aim of this research is to investigate the Artificial Immune System (AIS) domain, focusing on a modified AIS-based classifier to solve this problem, with emphasis on the censoring and monitoring modules. The stages of music recognition are highlighted, and the feature extraction, feature selection, and feature classification processes are explained. A comparison of the performance of the proposed classifier and the WEKA application is discussed.

Noor Azilah Draman, Sabrina Ahmad, Azah Kamilah Muda
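
In the AIS spirit of the censoring and monitoring modules named above, here is a minimal negative-selection sketch (binary toy patterns stand in for real music features; the paper's modified classifier is not reproduced).

import random

def matches(detector, pattern, r=3):
    # r-contiguous-bits matching rule.
    return any(detector[i:i + r] == pattern[i:i + r] for i in range(len(pattern) - r + 1))

def censor(self_set, n_detectors=20, length=8):
    detectors = []
    while len(detectors) < n_detectors:
        d = "".join(random.choice("01") for _ in range(length))
        if not any(matches(d, s) for s in self_set):   # keep only non-self-matching detectors
            detectors.append(d)
    return detectors

def monitor(detectors, pattern):
    return "non-self" if any(matches(d, pattern) for d in detectors) else "self"

random.seed(1)
self_set = ["00001111", "00110011"]     # stand-in for one genre's feature codes
detectors = censor(self_set)
print(monitor(detectors, "11110000"))
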
Backmatter
Metadata
Title: Software Engineering and Computer Systems
Edited by: Jasni Mohamad Zain, Wan Maseri bt Wan Mohd, Eyas El-Qawasmeh
Copyright Year: 2011
Publisher: Springer Berlin Heidelberg
Electronic ISBN: 978-3-642-22191-0
Print ISBN: 978-3-642-22190-3
DOI: https://doi.org/10.1007/978-3-642-22191-0
