
2010 | Book

Advances in Intelligent Decision Technologies

Proceedings of the Second KES International Symposium IDT 2010

Editors: Gloria Phillips-Wren, Lakhmi C. Jain, Kazumi Nakamatsu, Robert J. Howlett

Publisher: Springer Berlin Heidelberg

Book Series: Smart Innovation, Systems and Technologies


About this book

Intelligent Decision Technologies (IDT) seeks an interchange of research on intelligent systems and intelligent technologies which enhance or improve decision making in industry, government and academia. The focus is interdisciplinary in nature, and includes research on all aspects of intelligent decision technologies, from fundamental development to the applied system. This volume represents leading research from the Second KES International Symposium on Intelligent Decision Technologies (KES IDT’10), hosted and organized by the Sellinger School of Business and Management, Loyola University Maryland, USA, in conjunction with KES International. The symposium was concerned with the theory, design, development, implementation, testing and evaluation of intelligent decision systems. Topics include decision making theory, intelligent agents, fuzzy logic, multi-agent systems, Bayesian networks, optimization, artificial neural networks, genetic algorithms, expert systems, decision support systems, geographic information systems, case-based reasoning, time series, knowledge management systems, Kansei communication, rough sets, spatial decision analysis, and multi-criteria decision analysis. These technologies have the potential to revolutionize decision making in many areas of management, healthcare, international business, finance, accounting, marketing, military applications, ecommerce, network management, crisis response, building design, information retrieval, and disaster recovery.

Table of Contents

Frontmatter

Keynote Papers

Intelligence Analysis as Agent-Assisted Discovery of Evidence, Hypotheses and Arguments

This paper presents a computational approach to intelligence analysis which is viewed as mixed-initiative discovery of evidence, hypotheses and arguments by an intelligence analyst and a cognitive assistant. The approach is illustrated with the analysis of wide area motion imagery of fixed geographic locations where the goal is to discover threat events such as an ambush or a rocket launch. This example is used to show how the Disciple cognitive assistants developed in the Learning Agents Center can help the analysts in coping with the astonishing complexity of intelligence analysis.

Gheorghe Tecuci, David Schum, Mihai Boicu, Dorin Marcu, Benjamin Hamilton
Intelligent Software for Ecological Building Design

Building design is a complex process because of the number of elements and issues involved and the number of relationships that exist among them. Adding sustainability issues to the list increases the complexity of design by an order of magnitude. There is a need for computer assistance to manage the increased complexity of design and to provide intelligent collaboration in formulating acceptable design solutions. Software development technology today offers opportunities to design and build an intelligent software system environment that can serve as a reliable intelligent partner to the human designer. In this paper the authors discuss the requirements for an intelligent software design environment, explain the major challenges in designing this environment, propose an architecture for an intelligent design support system for sustainable design and present the existing technologies that can be used to implement that architecture.

Jens Pohl, Hisham Assal, Kym Jason Pohl

Decision Making Theory

Issues in Aggregating AHP/ANP Scales

Additive synthesis of ratio scales requires the scales to be in a common unit of measure. Unlike regular ratio scales, the unit of measure for relative ratio scales is not readily identifiable. That obscurity complicates the problem of achieving commensurability before multiple scales are synthesized. Examples are given of how conventional AHP may fail to aggregate commensurable values. Several techniques are presented that address the issue of commensurability. The analysis is then extended to more complex forms of aggregation such as benefit/cost analysis and the ANP.

William C. Wedley
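
As background to the aggregation issue the abstract raises, the relative ratio scale AHP works with is the priority vector of a pairwise comparison matrix. A minimal sketch, using power iteration and hypothetical judgments (not values from the paper):

```python
# Illustrative sketch: derive a relative ratio scale (priority vector)
# from an AHP pairwise comparison matrix via power iteration.
# The judgment values are hypothetical, not taken from the paper.

def priority_vector(M, iters=100):
    """Approximate the principal eigenvector of a positive reciprocal
    matrix, normalized to sum to 1 (the usual AHP priority scale)."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Three criteria compared pairwise on Saaty's 1-9 scale (hypothetical).
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]

w = priority_vector(M)
```

Because the vector is normalized to sum to 1, its unit of measure is implicit, which is exactly the obscurity the abstract says complicates commensurability across scales.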
An Application of Dominant Method: Empirical Approach to Public Sector Reform

Under tight financial conditions, local governments in Japan are required to conduct performance-based administrative management and subsequent public sector reform. Concerning public sector reform, a joint research group proposed a framework for how to reform the public sector based on a theoretical approach. The research concluded that the present public sector should be decomposed into two parts: one is the organization specializing in decision-making and ruling with authority and legal power; the other is the collective entities that coproductively address social issues. On the other hand, by employing the dominant method of the Analytic Hierarchy Process, the author of this paper developed a rational approach to the delegation of power from the public sector to other potential sectors. Based on the results of that study, this paper considers the scheme of public sector reform. A principal component analysis of the properties featuring the projects of a local government is carried out, and new concepts representing aspects of the projects are extracted. The results illustrate the structure of publicness and show the scheme of public sector reform.

Yuji Sato
General Application of a Decision Support Framework for Software Testing Using Artificial Intelligence Techniques

The use of artificial intelligence (AI) techniques for testing software applications has been investigated for over a decade. This paper proposes a framework to assist test managers in evaluating the use of AI techniques as a potential tool for testing software. The framework is designed to facilitate decision making and to provoke the decision maker into a better understanding of the use of AI techniques as a testing tool. We provide an overview of the framework and its components. Fuzzy Cognitive Maps (FCMs) are employed to evaluate the framework and make decision analysis easier, thereby helping the decision making process about the use of AI techniques to test software. What-if analysis is used to explore and illustrate the general application of the framework.

Deane Larkman, Masoud Mohammadian, Bala Balachandran, Ric Jentzsch
A Double-Shell Design Approach for Multiobjective Optimal Design of Microgrids

This work develops a new double-shell approach to the optimal design of multi-objective, optimally managed systems. The cost of each design solution can be defined by evaluating operational issues and capital costs. In most systems, the correct definition of the operational issues requires the solution of a multi-objective optimization problem. The evaluation of each design solution must thus be deduced from the outcome of a multi-objective optimization run, namely a Pareto hyper-surface in the n-dimensional space of operational objectives. In the literature, the design problem is usually solved by considering a single-objective formulation of the operational issue. In this paper, the proposed double-shell approach is implemented using evolutionary computation and is explained through the problem of optimal microgrid design. For this problem, the identification of the multiple operational impacts corresponds to the solution of the optimal unit commitment of the generators. After an introductory part, the particular problem formulation is presented and an application of the approach to a medium-size microgrid is shown.

Maria Luisa Di Silvestre, Giuseppe Fileccia Scimemi, Mariano Giuseppe Ippolito, Eleonora Riva Sanseverino, Gaetano Zizzo
A Comparison of Dominant AHP/CCM and AHP/ANP

The Theory of Games is a conflict-descriptive model designed to minimize one’s loss. AHP, on the other hand, is a conflict-solving model that offers a method to describe which element in the conflict is more critical. This paper compares Dominant AHP/CCM (Concurrent Convergence Method), proposed by Kinoshita and Nakanishi, with AHP/ANP, proposed by Saaty, and presents the calculation methods and the mathematical structure of the former in the process.

Eizo Kinoshita
The Weighted Least Square Method Applied to the Binary and Ternary AHP

The weighted least square method is applied to the binary and ternary AHP (Analytic Hierarchy Process) based on the Bradley-Terry model. We transform this problem to have constant variance of errors (the basic property needed for the least square method to give the best estimators). The transformed problem takes a form that can be solved directly by geometric programming. Moreover, a multi-stage Bradley-Terry model is proposed to cover the case where paired comparisons have various weights, which can also be solved by geometric programming. Various examples previously solved by other methods are solved by the proposed method, and improved results are obtained.

Kazutomo Nishizawa, Iwaro Takahashi
Decision-Making by “Minor ANP” and Classification of the Types

Evaluation is not easy when only the alternatives are available and the comparison matrix contains missing values or is non-square. An ANP that contains such irregular alternatives is defined as a "Minor ANP". This study describes methods for deriving priorities from the alternatives' matrix values alone by using the ANP. A new method is proposed and compared with the Harker method and the Nishizawa method for the estimation of missing values; another new method is proposed and compared with the Kinoshita method for the non-square case. Moreover, the paper presents the types of decision-making, the meaning of the eigenvalue, and the treatment of negative elements of the criteria matrix in the ANP.

Toshimasa Ozaki, Mei-Chen Lo, Eizo Kinoshita, Gwo-Hshiung Tzeng
Improving the E-Store Business Model for Satisfying Customers’ Needs Using a Hybrid MCDM Combined DANP with Grey Relational Model

The current paper aims to qualitatively and quantitatively measure and evaluate the e-store index criteria used to achieve the aspired levels of customer satisfaction. Such research should help e-store managers understand customers’ feelings and requirements for improving the e-store business. Therefore, in the current approach, we use a hybrid MCDM model to address the dependence relations among the criteria. Specifically, we combine the DANP (DEMATEL-based ANP) and grey relational analysis (GRA) methods to calculate the relative importance weights and the relations of the criteria under interdependence and feedback. We also propose a strategy to close the gaps in the criteria toward the aspired levels of human life and convenient service. As such, this research can provide e-store managers with an understanding of how to improve business models in order to satisfy consumers’ needs and promote repurchases, enabling them to devise the best marketing strategies to provide the most effective and efficient service for their customers.

Wan-Yu Chiu, Gwo-Hshiung Tzeng, Han-Lin Li
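
The grey relational step the abstract combines with DANP can be sketched as follows. The DEMATEL/DANP weighting stage is omitted here; the scores and weights are hypothetical, not from the paper:

```python
# Sketch of grey relational analysis (GRA): rank alternatives by closeness
# to an ideal reference sequence. Scores and weights are hypothetical and
# all criteria are assumed benefit-type (larger is better).

def grey_relational_grades(data, weights, rho=0.5):
    # Min-max normalize each criterion column to [0, 1].
    cols = list(zip(*data))
    norm_cols = []
    for col in cols:
        lo, hi = min(col), max(col)
        norm_cols.append([(x - lo) / (hi - lo) for x in col])
    rows = list(zip(*norm_cols))
    # Distance of each alternative from the ideal (aspired) level of 1.0.
    deltas = [[abs(1.0 - x) for x in row] for row in rows]
    dmin = min(min(d) for d in deltas)
    dmax = max(max(d) for d in deltas)
    # Grey relational coefficients, then weighted into a single grade.
    coeffs = [[(dmin + rho * dmax) / (d + rho * dmax) for d in row]
              for row in deltas]
    return [sum(w * c for w, c in zip(weights, row)) for row in coeffs]

# Three e-stores scored on three criteria, with hypothetical weights.
scores = [[7, 9, 6], [8, 6, 7], [5, 8, 9]]
grades = grey_relational_grades(scores, [0.5, 0.3, 0.2])
```

The alternative with the highest grade is closest to the aspired level, which is the sense in which GRA "closes the gap" toward the ideal.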

Advances in Intelligent Decision Systems

Multi-Agent System Protecting from Attacking with Elliptic Curve Cryptography

Today’s software applications are mainly characterized by their component-based structures, which are usually heterogeneous and distributed. Agent technology provides a method for handling increasing software complexity and supporting rapid and accurate decision making. This paper first investigates the application of multiagent systems to key exchange. Security systems have been drawing great attention as cryptographic algorithms have gained popularity owing to properties that make them suitable for use in constrained environments such as mobile sensor applications, where computing resources and power availability are limited. Elliptic curve cryptography (ECC) is one such algorithm, requiring less computational power, communication bandwidth, and memory than other cryptosystems. In particular, to save pre-computation storage there is a recent trend in sensor networks for sensor group leaders, rather than individual sensors, to communicate with the end database, which highlights the need to prevent man-in-the-middle attacks. The use of a multiagent system (MAS) makes the whole system easier to use. In particular, we designed a hidden generator point that offers good protection from the man-in-the-middle (MinM) attack, which has become a major concern for sensor networks with MAS.

Xu Huang, Pritam Gajkumar Shah, Dharmendra Sharma
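
The primitive underlying the key exchange discussed above is scalar multiplication of a curve point. A toy sketch with double-and-add, on an illustrative curve that is NOT a secure parameter set (the paper's hidden-generator-point construction is not reproduced here):

```python
# Toy ECC sketch: double-and-add scalar multiplication on a small curve
# over a prime field. Curve and base point are illustrative toys only.

P_MOD = 97            # small prime modulus (toy field)
A, B = 2, 3           # curve: y^2 = x^3 + 2x + 3 over F_97

def ec_add(p, q):
    """Add two points; None represents the point at infinity."""
    if p is None:
        return q
    if q is None:
        return p
    (x1, y1), (x2, y2) = p, q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None   # p + (-p) = infinity
    if p == q:
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def ec_mul(k, p):
    """Double-and-add: compute the scalar multiple k*p."""
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, p)
        p = ec_add(p, p)
        k >>= 1
    return acc

G = (3, 6)            # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
# Diffie-Hellman-style exchange: both sides derive the same shared point.
shared_a = ec_mul(5, ec_mul(7, G))
shared_b = ec_mul(7, ec_mul(5, G))
```

The exchange works because scalar multiplication commutes (5·(7·G) = 7·(5·G)); the man-in-the-middle risk the abstract addresses arises because the exchanged points are not themselves authenticated.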
An Implementation of a Multi-attribute Negotiation Protocol for E-Commerce

Agent-mediated eCommerce (AMEC) is rapidly emerging as a new paradigm for developing distributed and intelligent eCommerce systems. Such systems are built upon the foundations of agent technology with a strong emphasis on automated negotiation. In this paper, we address negotiation problems where agreements must resolve several different issues. We propose a one-to-many multi-attribute negotiation model based on decision making theory. The proposed model is capable of processing agents’ preferences and arriving at an optimal solution from a set of alternatives by ranking them according to the scores they achieve. We present our experimental system architecture, together with a discussion of the underlying negotiation framework. We then report on our prototype implementation using the JADE and Eclipse platforms and illustrate it with an eCommerce example. Our concluding remarks and future research are presented.

B. M. Balachandran, Tauhid Tayeb, Dharmendra Sharma, Masoud Mohammadian
A Decision Support System for Ore Blending Cost Optimization Problem of Blast Furnaces

In iron and steel enterprises, it is difficult to obtain the lowest-cost optimal solution to an ore blending problem for blast furnaces using the traditional trial-fault-trial (TFT) method, because of the complexity of the materials and the burden of the workflow. Here, we develop decision support system (DSS) software to solve the problem. Building on an analysis of the business flow and the working process of ore blending, we pre-process the data for materials and elements, abstract a non-linear model of ore blending for a blast furnace, design the architecture for an ore blending cost optimization DSS which integrates a database, a model base and a knowledge base, and solve the problem. The system has produced economic gains since it was implemented at Xiangtan Iron & Steel Group Co. Ltd., China, in September 2008.

Ruijun Zhang, Jizhong Wei, Jie Lu, Guangquan Zhang

Intelligent Decision Technologies in Accounting and Finance

A Study on the Relationship between Corporate Governance and Pricing for Initial Public Offerings: The Application of Artificial Neural Networks

The purpose of this study is to investigate the relationship between corporate governance and pricing for initial public offerings (IPOs). Empirical results show that IPO pricing predictions that include corporate governance variables achieve higher accuracy on both training and testing samples than predictions without them. It can therefore be observed that the corporate governance mechanism can affect the pricing of IPOs.

Chei-Chang Chiou, Wang Sen-Wei
Combining ICA with Kernel Based Regressions for Trading Support Systems on Financial Options

Options are highly non-linear and complicated products in financial markets. Owing to the high risk associated with option trading, investment in options is a knowledge-intensive industry. This study develops a novel decision support system for option trading. In the first stage, independent component analysis (ICA) is employed to uncover the independent hidden forces of the stock market that drive the price movement of an option. In the second stage, dynamic kernel predictors are constructed for trading decisions. Compared with conventional feature extraction and pure regression models, the performance improvement of the new method is significant and robust. The cumulative trading profits are substantially increased. The resulting intelligent investment decision support system can help investors, fund managers and investment decision-makers make profitable decisions.

Shian-Chang Huang, Chuan-Chyuan Li, Chih-Wei Lee, M. Jen Chang
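
The second stage of the pipeline above (kernel-based regression) can be illustrated with a Nadaraya-Watson estimator, used here purely as a stand-in for the paper's dynamic kernel predictors; the ICA stage is omitted and the data are synthetic, not market data:

```python
# Kernel regression sketch: a Nadaraya-Watson estimator with an RBF kernel.
# Illustrative stand-in for kernel-based prediction; synthetic data only.

import math

def rbf(u, v, gamma=1.0):
    return math.exp(-gamma * (u - v) ** 2)

def kernel_predict(x_train, y_train, x, gamma=1.0):
    """Kernel-weighted average of training targets around x."""
    weights = [rbf(xi, x, gamma) for xi in x_train]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, y_train)) / total

# Synthetic "hidden factor" vs. target relationship.
xs = [i / 10 for i in range(21)]       # factor values 0.0 .. 2.0
ys = [math.sin(x) for x in xs]         # smooth target
pred = kernel_predict(xs, ys, 1.0, gamma=50.0)
```

The kernel width (gamma) controls how local the prediction is; a "dynamic" predictor in the paper's sense would additionally update the training window over time.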
Integration of Financial and Non-financial Information for Decision-Making by Using Goal Programming and Fuzzy Analytic Hierarchy Process on a Capital Budgeting Investment Case Study

Our objective in this paper is to develop a decision model to assist decision-makers and researchers in understanding multiple criteria decision making for a capital budgeting investment. This decision model helps decision makers reduce decision-making time and choose a suitable alternative for a capital budgeting investment under a company's goals, constraints and strategies. The methods utilized in this paper integrate goal programming (GP) and the fuzzy analytic hierarchy process (FAHP). We demonstrate a case study of capital budgeting investment using these two methods on a small car hire company.

Yu-Cheng Tang, Ching-Ter Chang

Optimization-Based Intelligent Techniques in Image Processing

A Statistical Tailored Image Reconstruction from Projections Method

This paper is concerned with the image reconstruction from projections problem, the key problem in the area of computed tomography. The paper describes a reconstruction method based on a recurrent neural network structure. This structure is designed considering a probabilistic profile of the distortion observed in X-ray computed tomography. The reconstruction process is performed by a neural network solving the optimization problem. Experimental results show that the appropriately designed neural network is able to reconstruct an image with better quality than that obtained from conventional algorithms.

Robert Cierniak
Realistic 3D-Modeling of Forest Growth with Natural Effect

At present, laser scanning integrated with traditional aerial photography is a priority direction for forest assessment and monitoring. This line of research is based on modern techniques of digital photogrammetry and geographic information systems (GIS), as well as on digital multidimensional signal processing. Terrain 3D-modeling with mapped growth is one of the main tasks during the initial stage of virtual forest assessment. The proposed method permits the use of LIDAR and aerial photography data for modeling terrain and rendering fractal textures of growth. Our approach also includes the imitation of natural effects such as fog, rain and snow blanket.

M. N. Favorskaya, A. G. Zotin, I. M. Danilin, S. S. Smolentcheva

E-commerce and Logistics Management

A Parallel Simulated Annealing Solution for VRPTW Based on GPU Acceleration

In order to improve the performance of the simulated annealing (SA) algorithm in solving the large-scale vehicle routing problem with time windows (VRPTW), we propose a parallel SA (PSA) algorithm based on GPU acceleration, which maps the parallel SA algorithm to thread blocks executing on consumer-level graphics cards. The analytical results demonstrate that the proposed method increases the population size, speeds up execution and provides ordinary users with a feasible PSA solution.

Jian-Ming Li, Hong-Song Tan, Xu Li, Lin-Lin Liu
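
The sequential SA core that the paper parallelizes can be sketched as follows; the GPU thread-block mapping is omitted, the coordinates are illustrative, and time-window constraints are ignored:

```python
# Sequential simulated-annealing sketch on a toy routing instance.
# The paper runs many such chains in parallel on GPU thread blocks;
# that parallel mapping is omitted here. Illustrative data only.

import math
import random

def route_len(route, pts):
    return sum(math.dist(pts[route[i]], pts[route[(i + 1) % len(route)]])
               for i in range(len(route)))

def anneal(pts, t0=10.0, cooling=0.995, steps=5000, seed=0):
    rng = random.Random(seed)
    route = list(range(len(pts)))
    best = route_len(route, pts)
    t = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(len(pts)), 2))
        cand = route[:i] + route[i:j + 1][::-1] + route[j + 1:]  # 2-opt reversal
        delta = route_len(cand, pts) - route_len(route, pts)
        # Accept improvements always, worsenings with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            route = cand
            best = min(best, route_len(route, pts))
        t *= cooling
    return route, best

# Six customers on a grid; the optimal tour length for this layout is 6.0.
pts = [(0, 0), (0, 1), (1, 1), (1, 0), (2, 0), (2, 1)]
route, best = anneal(pts)
```

A GPU version runs one such chain (or one population member) per thread, which is how the paper obtains its larger population sizes.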
Evidential Reasoning Approach for MADA under Group and Fuzzy Decision Environment

Multiple attribute decision analysis (MADA) is common in everyday life. A MADA problem often includes both qualitative attributes with uncertain assessment values and quantitative attributes with accurate assessments. Evidential reasoning (ER) is a MADA approach that can tackle problems involving ignorance, incompleteness and uncertainty. In this paper, a group of experts is involved, and the attribute weights elicited from the experts are treated as triangular fuzzy numbers. The weighted arithmetic mean method is used to combine the triangular fuzzy weights from each expert. An α-cut is then used to transform the combined fuzzy weights into interval weights, and several pairs of evidential reasoning nonlinear programming models are constructed to calculate the total assessments.

Xin-Bao Liu, Mi Zhou, Jian-Bo Yang
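
The two weight-handling steps named in the abstract, arithmetic-mean combination of triangular fuzzy weights and the α-cut to interval weights, can be sketched directly; the expert judgments below are hypothetical:

```python
# Sketch of fuzzy weight handling: average experts' triangular fuzzy
# weights, then alpha-cut to interval weights. Hypothetical judgments.

def mean_tfn(tfns):
    """Arithmetic mean of triangular fuzzy numbers given as (l, m, u)."""
    n = len(tfns)
    return tuple(sum(t[k] for t in tfns) / n for k in range(3))

def alpha_cut(tfn, alpha):
    """Interval [l + a*(m - l), u - a*(u - m)] of a triangular fuzzy number."""
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

# Three experts' fuzzy weights for one attribute (hypothetical).
experts = [(0.2, 0.3, 0.4), (0.25, 0.35, 0.5), (0.15, 0.25, 0.3)]
combined = mean_tfn(experts)
interval = alpha_cut(combined, 0.5)   # halfway between support and core
```

The resulting interval weights are what the paper's pairs of nonlinear programming models then take as bounds when computing the total assessments.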
Study on the Inventory Forecasting in Supply Chains Based on Rough Set Theory and Improved BP Neural Network

Research on inventory management problems has never stopped, and a variety of inventory control models have been proposed, but the existing models have shortcomings and are not well suited to inventory forecasting in supply chains. In view of those shortcomings and the actual situation in supply chains, this paper combines rough set theory and a BP neural network to analyze inventory forecasting in supply chains. The introduction of rough sets cuts down the input dimensions of the BP neural network, and the neural network algorithm is improved by adding a momentum factor and applying an adaptive learning rate. Using the inventory data of a manufacturing enterprise in Handan city, the paper demonstrates the validity of the proposed model.

Xuping Wang, Yan Shi, Junhu Ruan, Hongyan Shang
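
The two training tweaks the abstract names, a momentum factor and an adaptive learning rate, can be shown on a single weight with a quadratic error surface; this is an illustrative toy, not the paper's full rough-set/BP pipeline:

```python
# Illustrative gradient update with momentum and an adaptive learning
# rate, on a single weight with error err = w^2. Toy example only.

def train(w=5.0, lr=0.1, momentum=0.9, steps=100):
    velocity = 0.0
    err = w * w
    for _ in range(steps):
        grad = 2 * w                          # d(err)/dw for err = w^2
        velocity = momentum * velocity - lr * grad
        w_new = w + velocity
        err_new = w_new * w_new
        if err_new < err:                     # progress: accept, grow the rate
            w, err = w_new, err_new
            lr *= 1.05
        else:                                 # overshoot: shrink rate, reset momentum
            lr *= 0.5
            velocity = 0.0
    return w

w_final = train()
```

Momentum smooths successive updates, while the adaptive rate grows while the error falls and shrinks after an overshoot, which is the general behavior the paper exploits to speed up BP convergence.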
A Model of Disruption Management for Solving Delivery Delay

When a delivery vehicle encounters disruptions in a distribution network, it is usually difficult to generate new plans dynamically to minimize the negative impact. A method for measuring the system deviation caused by disruptions is presented in this paper. First of all, the criterion for identifying whether a deviation has occurred is clarified. Secondly, based on the experience and knowledge of the decision-maker, revised plans to cope with disruptions are summarized. Furthermore, by taking human behavior into consideration and adopting hierarchical cluster analysis to segment customers, the delivery delay is divided into multiple stages. A disruption management model that is multi-stage and multi-objective, and that combines qualitative and quantitative analysis, is then formed by constructing a submodel at each stage. Finally, the effectiveness of the method is validated through a real-world case study.

Qiulei Ding, Xiangpei Hu, Yunzeng Wang
A Real-Time Scheduling Method for a Variable-Route Bus in a Community

Real-time vehicle scheduling that can vary the routes of a bus within a community in response to new customer requests benefits flexible routing through quick response and cost savings, especially when the density of customer requests is low. In this paper, a real-time scheduling method for assigning immediate requests, which allows variable routes, is proposed. A multi-objective model is built for the real-time scheduling problem, considering cost, passengers on the bus, and passengers waiting at stops. A two-phase quick-response approach and a local optimization method are proposed, which trade off computation time against solution quality. A numerical experiment based on a real-world case has been designed to test the effectiveness of the proposed method. Comparison of the variable-route method with the fixed-route approach demonstrates that potential savings and reduced waiting times can be obtained through the proposed approach. The proposed method also has potential applications for handling real-time vehicle routing in schools, tourist areas, and manufacturing logistics.

Yan Fang, Xiangpei Hu, Lirong Wu, Yidi Miao
A Fair Transaction Protocol with an Offline Semi-Trusted Third Party

A Trusted Third Party (TTP) must be completely trustworthy when it is used in a transaction protocol to achieve fairness. In previous works, however, most of the proposed protocols depend on strong assumptions about third-party trust, without fully considering possible misbehavior by the third party or conspiracy with the main parties. In this paper, a fair transaction protocol using an offline STTP (semi-trusted third party) is proposed, based on the idea of RBAC (Role-Based Access Control). In the novel protocol, the STTP is also prevented from misbehaving on its own or conspiring with either of the main participants by adopting an interactive validation signature protocol. The proposed protocol is analyzed for security and efficiency, and the results show that it not only provides improved security but also features high efficiency and practicality.

Wang Qian, Su Qi
Impacts of Supply Chain Globalization on Quality Management and Firm Performance: Some Evidences in Shanghai, China

In this paper, a research model and hypotheses regarding the effects of supply chain globalization on quality management and firm performance are proposed based on a comprehensive literature review. Structural equation modeling (SEM) techniques are employed to test the conceptual model using empirical data collected from practicing managers of firms operating in Shanghai, China. The findings show that global supply chain management is significantly correlated with quality management and, both directly and indirectly, positively influences firm performance. The implications of the findings for researchers and practitioners are further discussed.

Jiancheng Guan, Lei Fan

Intelligent Spatial Decision Analysis

Analysis of Fuzzyness in Spatial Variation of Real Estate Market: Some Italian Case Studies

The paper presents a method for measuring the fuzziness of the change in real estate value from one area to another within the same urban context. This measure is based on Munda’s “semantic distance” (1997). The measure is considered helpful for validating the traditional subdivision of the city by the Italian Cadastral System into so-called "cadastral census sections". The paper starts by explaining the cadastral approach that guides the partition of an urban area, according to the hypothesis of homogeneity of real estate values and of the physical context. After explaining the partition used by the Italian Cadastre, the concept of semantic distance is introduced as a measure of the difference among estate values referring to the cadastral sections, which are here also treated as fuzzy variables. The semantic distance is compared with the expected real estate value; starting from this comparison it is possible to estimate a degree of uncertainty in the variation of values from area to area of the urban context.

The case studies refer to the biggest Southern Italian metropolitan areas: Naples, Bari and Palermo. The work is a joint effort: C. Mariano wrote the first paragraph, and C.M. Torre the second, third and fourth paragraphs.

Carmelo M. Torre, Claudia Mariano
Assessing Macroseismic Data Reliability through Rough Set Theory: Application on Vulture Area (Basilicata, Southern Italy)

This paper deals with the analysis of the reliability of information concerning damage caused to buildings by earthquakes. The research began with the analysis of a huge number of written sources drawn up after the 1930 Irpinia (southern Italy) earthquake. The analysis led to the delineation of damage ‘scenarios’, useful in trying to mitigate seismic risk for the most affected towns. Once the effects induced by the quake had been analyzed, it was appropriate to assess the reliability of the retrieved information. A data set was built concerning the administrative-technical aspects of the 1930 earthquake, referring to the most important towns of the area. The data have been analyzed through the rough set approach, a non-parametric statistical methodology.

Fabrizio Gizzi, Nicola Masini, Maria Rosaria Potenza, Cinzia Zotta, Lucia Tilio, Maria Danese, Beniamino Murgante
Fire Data Analysis and Feature Reduction Using Computational Intelligence Methods

Fire is basically the fast oxidation of a substance that produces gases and chemical products. These chemical products can be read by sensors to yield insight into the type and location of the fire. However, as fires may occur in indoor or outdoor areas, the types of gases, and therefore the sensor readings, differ. Recently, wireless sensor networks (WSNs) have been used for environmental monitoring and real-time event detection because of their low implementation costs and their capability for distributed sensing and processing. In this paper, the authors investigate spatial analysis of data for indoor and outdoor fires using data-mining approaches for WSN-based fire detection. This paper also delves into correlated features in fire data sets and investigates the features that contribute most to fire detection applications.

Majid Bahrepour, Berend Jan van der Zwaag, Nirvana Meratnia, Paul Havinga
The Effect of Standardization in Multicriteria Decision Analysis on Health Policy Outcomes

Health planners and epidemiologists have begun to use spatial analysis and Geographic Information Systems (GIS) to explore socioeconomic inequalities that can affect population health. In particular, the use of area-based composite indices, also known as deprivation indices, has been effective at incorporating multiple indicators into an analysis. We used GIS-based Multicriteria Decision Analysis (MCDA) to create a weighted index of health service need, and explored the standardization step in MCDA within a geovisualization environment. In a neighbourhood prioritization scenario for the City of Toronto, we implemented an MCDA using two common standardization techniques and three methods for standardizing cost criteria. We compared the resulting scores and rankings of neighbourhoods, and show that standardization is an important consideration in the data analysis process. We conclude with an assessment of the appropriateness of using one technique over the other as well as the potential effect on decision-making related to health policy.

Jacqueline Young, Claus Rinner, Dianne Patychuk
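
The standardization step examined above can be made concrete with two common techniques and the inversion of cost criteria; the indicator values below are hypothetical, not Toronto data:

```python
# Sketch of criterion standardization in MCDA: min-max scaling and
# z-scores, with cost criteria (higher = worse) inverted so that higher
# standardized values always mean "better". Hypothetical data.

def minmax(values, cost=False):
    """Scale to [0, 1]; invert for cost criteria so 1 is always best."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1 - s for s in scaled] if cost else scaled

def zscore(values, cost=False):
    """Standard scores; negate for cost criteria so higher is better."""
    n = len(values)
    mu = sum(values) / n
    sd = (sum((v - mu) ** 2 for v in values) / n) ** 0.5
    z = [(v - mu) / sd for v in values]
    return [-x for x in z] if cost else z

# Hypothetical neighbourhood indicator: distance to nearest clinic (a cost).
distance_to_clinic = [1.2, 3.5, 0.8, 2.1]
mm = minmax(distance_to_clinic, cost=True)
zz = zscore(distance_to_clinic, cost=True)
```

Because min-max scores depend on the observed extremes while z-scores depend on the mean and spread, the two techniques can rank the same neighbourhoods differently once weighted, which is the sensitivity the paper investigates.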
A Fuzzy Approach to the Small Area Estimation of Poverty in Italy

Urban poverty, especially in metropolitan areas, represents one of the most significant problems in both developed and developing countries. The aim of the present work is to identify territorial zones characterized by the presence of this phenomenon. In particular, data gathered from the EU-SILC

Silvestro Montrone, Francesco Campobasso, Paola Perchinunno, Annarita Fanizzi
Geographical Information Systems and Ontologies: Two Instruments for Building Spatial Analysis Systems

Starting from previous experiences and from examples available in the literature, this paper describes the usefulness of building a framework based on ontologies for the integration of geographic information in a Spatial Analysis System. The framework has to allow the integration of information at different levels of detail. Since there is no unifying concept of space, it is necessary to be able to deal with multiple views of the geographic world. Therefore, GIS developers must be able to integrate different ontologies. After discussing the relevance of using ontologies in GIS to build a Spatial Analysis System, the paper highlights future perspectives for research in this field, which is important for the geosciences.

Francesco Rotondo
Real Estate Decision Making Processes and Web-Based Applications: An Integrated Approach

Decision-making is a critical part of a typical real estate property valuation, which aims to quantify the market value of a property according to its qualitative characteristics. Since visualization is a prominent feature of this kind of problem, GIS are commonly used as a support for spatial decision analysis. This paper presents a new approach that regards decision making as a data-driven process whose tasks can be decomposed and made executable individually as granular applications. Since applications are implemented on networked resources (GIS, applications, databases, and so on), this paper aims at defining a computational environment, enabled by web technologies, that supports the analyst in planning, designing and coordinating different real estate scenarios. The paper also describes an application of the computational environment which deals with the use of relevant geospatial services, user knowledge and statistical information.

Michele Argiolas, Nicoletta Dessì, Giampaolo Marchi, Barbara Pes
Geographical and Multi-criteria Approach for Hydro-geological Risk Evaluation in Decision Making Processes

The catastrophes of recent years, and a changed awareness of the causes of hydro-geological calamities not only among “experts” but in society as a whole, are pointing out the urgency of a correct policy for the prevention of hydro-geological risk and the development of more adequate systems for forecasting disasters. In the light of a problem of this magnitude, it is necessary to act on two fronts: careful planning and programming of the use of territorial resources, as well as setting up decision making support systems that can improve the efficiency of the actions taken. In this paper the topic of planning in areas with hydro-geological risk is dealt with through the implementation of two different methodologies: the former is a probabilistic-quantitative one for the definition of hydraulic dangerousness; the latter is a qualitative one (able to consider also social, economic, cultural and political aspects) for the definition of vulnerability, considered as the ability of a territorial system to respond to calamitous events.

Francesco Selicato, Grazia Maggio
Analysis of Vulnerability of Road Networks on the Basis of Graph Topology and Related Attribute Information

The safety of people and the security of the vital functions of society are among the core tasks of governments. Various networks, especially transportation networks, are important for human life. Therefore, governments should place greater emphasis on preparedness planning and mitigation actions. Much research has been done to analyse the vulnerability of road networks, and most methods are based on analysing the topological structure of the network using only topological attributes. This paper introduces a multi-attribute value theory approach which can combine the values of all attributes, not only topological but also non-topological, and weight them according to a decision maker’s preference in order to produce an overall value. This overall value is used to compute the vulnerability of a road network: a road with a higher overall value is considered more vulnerable, and more effort needs to be put into preparedness for it in crisis management. The decision making software package Web-HIPRE is also introduced and illustrated.
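
The overall-value computation described above can be sketched as a weighted additive value function. The road attributes, their normalized values and the weights below are hypothetical, for illustration only:

```python
def overall_value(attribute_values, weights):
    """Additive multi-attribute value: v(road) = sum_i w_i * v_i(road),
    with attribute values already normalized to [0, 1] and weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(w * attribute_values[attr] for attr, w in weights.items())

# Hypothetical road links with normalized topological and non-topological values
roads = {
    "bridge_A": {"betweenness": 0.9, "traffic": 0.8, "detour_length": 0.7},
    "street_B": {"betweenness": 0.2, "traffic": 0.5, "detour_length": 0.1},
}
weights = {"betweenness": 0.5, "traffic": 0.3, "detour_length": 0.2}

# Roads ranked from most to least vulnerable
ranking = sorted(roads, key=lambda r: overall_value(roads[r], weights), reverse=True)
```

A higher overall value flags a road as more vulnerable, so preparedness effort would be directed to the top of the ranking.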

Zhe Zhang, Kirsi Virrantaus

Using Intelligent Systems for Decision Support in Health Systems

Adoption of Open Source Software in Healthcare

The Open Source (OS) platform is a new paradigm for software development in which several parties serve as volunteers in the design, coding, testing, debugging, distribution, and documentation of OS software projects. Open source software (OSS) is experiencing exponential growth in several industries such as finance, sales and marketing, pharmaceuticals, and manufacturing. However, the adoption of OSS in healthcare has been slow despite the availability of several quality applications for the healthcare industry. In this paper, we outline the salient characteristics of OSS, including the development process for OSS, license types, and revenue models, and then discuss major factors that affect the uptake of OSS by organizations. With a detailed discussion of BI (Business Intelligence) and other open source applications available in healthcare, we conclude that the adoption of OSS in healthcare requires a comprehensive understanding not only of the needs of the healthcare sector, but also of the types and complexities of OS applications and how they interact with various organizational factors.

Gokul Bhandari, Anne Snowdon
Symbiotic Simulation Decision Support System for Injury Prevention

Symbiotic simulation decision support systems refer to a class of decision support systems in which there is a presence of beneficial feedback between a physical system and a simulation system. In this paper, we report the design and development of such a system in the area of injury prevention. Specifically, we used our decision support system to lower the occurrences of patient falls in hospitals and to minimize injury and death due to the improper use of child safety seats in vehicles. Empirical results from our study show a great potential of our DSS for assisting decision makers and stakeholders in the healthcare sector.

Gokul Bhandari, Anne Snowdon
Application of Subjective Logic to Health Research Surveys

The application of semi-automated decision support systems in health care faces challenging tasks, mainly in generating evidence-based recommendations within a short critical time window. Traditional data collection and survey methodology for generating evidence for decision support systems also suffers from a slow turn-around time. In addition to probabilistic Bayesian analysis, it is necessary to support reasoning with uncertainty in the context of the total survey error paradigm. Following Jøsang, subjective logic provides a suitable framework for connecting survey data collection directly to a model of evidence-based opinions with uncertainty that also supports subjective reasoning. We report on the current design and implementation aspects of a system for applying subjective logic to health research surveys.
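
Jøsang's subjective logic referred to above maps evidence to opinions with an explicit uncertainty component. A minimal sketch of the standard binomial-opinion mapping (not the paper's system itself), where r and s are positive and negative survey responses:

```python
def opinion_from_evidence(r, s, W=2.0):
    """Josang's evidence-to-opinion mapping for a binomial opinion:
    belief b = r/(r+s+W), disbelief d = s/(r+s+W), uncertainty u = W/(r+s+W),
    where W is the non-informative prior weight (conventionally 2)."""
    k = r + s + W
    return r / k, s / k, W / k

def expected_probability(b, u, a=0.5):
    """Probability expectation of an opinion: E = b + a*u, with base rate a."""
    return b + a * u
```

With few responses, u stays large and E is pulled toward the base rate; as evidence accumulates, u shrinks, which is how uncertainty-aware recommendations can be derived from partial survey data.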

Robert D. Kent, Jason McCarrell, Gilles Paquette, Bryan St. Amour, Ziad Kobti, Anne W. Snowdon
A Survey of Text Extraction Tools for Intelligent Healthcare Decision Support Systems

Due to the abundance of textual corpora in the medical field, it is difficult for humans to process text and infer knowledge efficiently. Traditionally, gathered information would be analyzed by domain experts who would infer knowledge and provide decisions or solutions for a problem. The medical field would greatly benefit from an automated decision support system that makes intelligent and justified decisions about a given topic. The approach taken in this study is to survey existing text extraction tools by analyzing human-generated comments on newspaper articles relating to the H1N1 hype. The goal is to evaluate existing tools for later integration into an automated healthcare decision support system. From our results, Alchemy seems to be the most promising term extraction tool for use within a larger decision support framework.

Ryan Ramirez, Jordan Iversen, John Ouimet, Ziad Kobti

Ontology-Based KMS and DMSS for Service Systems

Towards Semantic-Aware and Ontology-Based e-Government Service Integration – An Applicative Case Study of Saudi Arabia’s King Abdullah Scholarship Program

By improving the quality of e-government services and enabling access to services across different government agencies through one portal, service integration plays a key role in e-government development. This paper proposes a conceptual framework for ontology-based e-government service integration, using Saudi Arabia’s King Abdullah Scholarship Program (SAKASP) as a case study. SAKASP is a multi-domain program in which students must collect information from various Ministries to complete applications, and the administering authority must verify the information supplied by the Ministries. The current implementation of SAKASP is clumsy because it is a mixture of online submission and manual collection and verification of information; its time-consuming and tedious procedures are inconvenient for the applicants and inefficient for the administrators. The proposed framework provides an integrated service by employing semantic web services (SWS) and ontology, improving the current implementation of SAKASP by automatically collecting and processing the information related to a given application. The article includes a typical scenario that demonstrates the workflow of the framework. The framework is applicable to other multi-domain e-government services.

Abdullah Alqahtani, Haiyan Lu, Jie Lu
Using Feature Selection with Bagging and Rule Extraction in Drug Discovery

This paper investigates different ways of combining feature selection with bagging and rule extraction in predictive modeling. Experiments on a large number of data sets from the medicinal chemistry domain, using standard algorithms implemented in the Weka data mining workbench, show that feature selection can lead to significantly improved predictive performance. When combining feature selection with bagging, employing the feature selection on each bootstrap obtains the best result. When using decision trees for rule extraction, the effect of feature selection can actually be detrimental, unless the transductive approach oracle coaching is also used. However, employing oracle coaching will lead to significantly improved performance, and the best results are obtained when performing feature selection before training the opaque model. The overall conclusion is that exactly how feature selection is used in conjunction with other techniques can make a substantial difference to predictive performance.

Ulf Johansson, Cecilia Sönströd, Ulf Norinder, Henrik Boström, Tuve Löfström
Validating and Designing a Service Centric View for C2TP: Cloud Computing Tipping Point Model

This paper attempts to extend current research efforts in cloud computing by creating a value model that lets ICT organisations objectively decide whether to invest in cloud computing or continue with on-premise hosting options. The model factors in ICT organisations’ business attributes, financial attributes and technical attributes to provide a balanced view. The authors have previously proposed a Cloud Computing Tipping Point model to address these gaps in research, and attempt to enable the model with a package of services to exchange information with 3rd parties and agents that consume the model. This paper attempts to validate the Cloud Computing Tipping Point model through surveys, and by creating an artifact to demonstrate the model’s capability. It then discusses the authors’ attempt to enable service-centric capability by leveraging service-oriented architecture (SOA) concepts. The paper concludes by introducing a standards-based XML taxonomy to exchange information between the model and its clients or agents.

C. Peiris, D. Sharma, B. Balachandran
Utilization of Agents for Key Distribution in IEEE 802.11

Applications of IEEE 802.11 wireless networks are increasing sharply in modern communications, particularly for the so-called “last mile communications” across the world. Since the communication medium is wireless, these networks are vulnerable to security attacks. In our previous research, we discussed the use of Quantum Key Distribution (QKD) to distribute the keys used to encrypt and decrypt data in Wi-Fi networks, and a novel method of using a Multi Agent System (MAS) to implement QKD in Wi-Fi networks. In this paper we extend our previous work to investigate the implementation of the QKD protocol via a modified MAS solution, which improves the performance of the whole system and makes it more efficient and effective.

Shirantha Wijesekera, Xu Huang, Dharmendra Sharma

Service-Oriented Innovation for Designing Intelligent Environment

Approximately Solving Aggregate k-Nearest Neighbor Queries over Web Services

In this paper, we propose a procedure for solving aggregate k-nearest neighbor (k-ANN) queries, which is effective in building location-based services (LBSs) by mashing up several Web services. In order to get the exact result of a k-ANN query, the Cartesian product has to be computed over the data of information sources, each of which is located at a different Web site. However, this requires a large amount of communication, because the entire data of at least one information source must be sent to the other site. To reduce this cost, a representative query point (RQP) is introduced to represent a set of query points, and a k-nearest neighbor (k-NN) query with the RQP as a key is issued to disseminate information of data objects. Although this computes an approximate result, experimental results with synthetic data on precision and recall length ratio show that, for some combinations of k and the number of query points, a k-NN query result with the minimal point of sum distance as RQP is acceptable for a sum k-ANN query result, and a k-NN query result with the minimal point of max distance as RQP is acceptable for a max k-ANN query result.
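
The RQP idea above can be sketched in a few lines: the exact sum k-ANN ranks objects by their summed distance to all query points, while the approximation takes an ordinary k-NN around a single representative point. Here the minimal point of sum distance is approximated by the geometric median via Weiszfeld iteration; the data are synthetic and illustrative:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def exact_sum_kann(objects, queries, k):
    """Exact sum k-ANN: rank objects by total distance to all query points."""
    return sorted(objects, key=lambda o: sum(dist(o, q) for q in queries))[:k]

def geometric_median(points, iters=200):
    """Weiszfeld iteration: approximates the point minimizing the sum of
    distances to `points` -- used here as the representative query point (RQP)."""
    x = [sum(p[0] for p in points) / len(points),
         sum(p[1] for p in points) / len(points)]
    for _ in range(iters):
        num, den = [0.0, 0.0], 0.0
        for p in points:
            d = dist(x, p) or 1e-12   # guard against division by zero
            num[0] += p[0] / d
            num[1] += p[1] / d
            den += 1.0 / d
        x = [num[0] / den, num[1] / den]
    return tuple(x)

def approx_sum_kann(objects, queries, k):
    """Approximate sum k-ANN: a plain k-NN query around the RQP."""
    rqp = geometric_median(queries)
    return sorted(objects, key=lambda o: dist(o, rqp))[:k]
```

The approximation replaces one multi-site Cartesian-product computation with a single k-NN request, trading exactness for communication cost, as the abstract describes.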

Hideki Sato
Remotely Accessible Exercise Environment for Intrusion Detection/Defense Exercises Based on Virtual Machine Networks

Network security exercises carried out in laboratories have limitations of time and place: students cannot exercise during their free time or outside the campus. Furthermore, teaching attack skills to students raises the issue of maintaining ethical standards. We have developed LiNeS (Linux Network Simulator), which generates networks for network administration exercises by using virtual machines; such a network is called a virtual machine network. In this study, we realize a virtual machine network in which attacks are generated automatically, in a remotely accessible exercise environment, by using LiNeS. Our system can resolve the above-mentioned problems because students can focus entirely on intrusion detection/defense via PCs connected to the Internet.

Yuichiro Tateiwa, Shoko Tatematsu, Tomohiro Iwasaki, Takami Yasuda
Supporting Design and Composition of Presentation Document Based on Presentation Scenario

In this paper, we propose a framework for supporting presentations. Our framework has two features. One is to provide presenters with functions for organizing their ideas in a top-down style. The other is to visualize a presentation document so that it is easy for presenters and audience to grasp the context. We realize these features by using a presentation scenario. A scenario is composed by associating topics and keywords with each other using hierarchical/sequential relations. Then, a presentation document is generated by arranging topics and components such as texts, images, and navigation cues to other topics according to the scenario. In our framework, a presentation document is generated as spatially arranged subdocuments in a three-dimensional space. Namely, a subdocument of a topic is located on the background of the subdocuments of more detailed topics. This arrangement makes it easy for both presenters and audience to grasp the presentation context. Based on these approaches, we have implemented a prototype tool for composing scenarios and documents.

Koichi Hanaue, Toyohide Watanabe
Translation Unit for Simultaneous Japanese-English Spoken Dialogue Translation

There is a strong demand for a simultaneous translation system that supports smooth cross-lingual communication. We are putting forward a framework for simultaneous translation that incrementally executes detection, translation and generation processing on a translation unit smaller than a sentence. In this paper, we propose a translation unit which can be translated independently and immediately, and define how to segment a source sentence into such translation units by using a parallel corpus. To confirm that this translation unit is effective for simultaneous translation, we evaluated it from the viewpoints of length and automatic detectability.

Koichiro Ryu, Shigeki Matsubara, Yasuyoshi Inagaki
Automatic Extraction of Phrasal Expressions for Supporting English Academic Writing

English academic writing is not easy for non-native researchers. They often refer to lexica of phrases from English research papers to learn expressions useful in academic writing. However, commercially available lexica do not contain a sufficient number of expressions. Therefore, we propose a method for automatically extracting useful expressions from English research papers. We identified four characteristics of such expressions by analyzing an existing lexicon of phrases from English research papers. The expressions are extracted from research papers based on statistical and syntactic information. In an experiment using 1,232 research papers, our proposed method achieved 57.5% precision and 51.9% recall. The f-measure was higher than the baselines, and therefore we confirmed the feasibility of our method.
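
A crude sketch of the statistical side of such extraction is frequent word n-gram counting; the paper's actual method also applies syntactic filtering and the four identified characteristics, which are omitted here:

```python
from collections import Counter

def extract_phrases(sentences, n=4, min_freq=2):
    """Collect word n-grams across sentences and keep those occurring at
    least `min_freq` times, as candidate phrasal expressions."""
    counts = Counter()
    for sentence in sentences:
        words = sentence.lower().split()
        for i in range(len(words) - n + 1):
            counts[tuple(words[i:i + n])] += 1
    return [(" ".join(gram), c) for gram, c in counts.most_common()
            if c >= min_freq]
```

Run over a corpus of papers, recurring academic formulas such as "in this paper we" surface near the top of the list.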

Shunsuke Kozawa, Yuta Sakai, Kenji Sugiki, Shigeki Matsubara
A Simulation System of Disaster Areas for Evaluating Communication Systems

In this paper, we propose a simulation system for evaluating communication systems in disaster situations. In this simulation system, virtual disaster areas are constructed based on hazard maps, which are provided for predicting disaster damage. Virtual disaster areas enable us to conduct experiments on communication systems effectively, since we cannot conduct experiments in real disaster situations. Our system also provides movement patterns of refugees. Experimental results show that our system can clarify the effectiveness of a proposed communication protocol.

Koichi Asakura, Toyohide Watanabe
Re-ranking of Retrieved Web Pages, Based on User Preference

Generally, it is difficult to retrieve only the Web pages appropriate to a retrieval purpose, even for smart experts who can compose well-specified queries. In order to resolve this problem intelligently, many researchers have investigated advanced retrieval methods which satisfy users’ requests with a feedback control mechanism operating on retrieval results. However, since traditional approaches use feedback control mechanisms which users do not directly operate or judge, unnecessary operational loads are imposed on users. In this paper, we propose a new re-ranking method based on user actions which indicate whether the retrieved pages are relevant to their preferences or not: this method does not impose unnecessary operational loads on users, but provides easy operations which can indicate preferences appropriately. In addition, a means for modifying existing queries with intended words is addressed, and our method makes it possible to infer the words which are included in target Web pages.

Toyohide Watanabe, Kenji Matsuoka

Applying Intelligent Decision Technology

Automated N-Step Univariate Time Series Forecasts with Bayesian Networks

Learning structure for univariate time series with a Bayesian network and few assumptions about variable structure other than horizon provides some interesting insights into time series analysis. The paper shows an application of Bayesian networks to univariate time series forecasting and compares their performance with that of neural networks and exponential smoothing algorithms. Due to the shortness of the time series under consideration, the models’ performance was evaluated only on the basis of their in-sample forecast accuracy.
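
One of the baselines mentioned, exponential smoothing, together with in-sample one-step-ahead evaluation, can be sketched as follows (simple exponential smoothing; the α value is an arbitrary illustrative choice, not the paper's setting):

```python
def ses_forecasts(series, alpha=0.3):
    """Simple exponential smoothing: one-step-ahead in-sample forecasts
    f[t] = alpha * y[t-1] + (1 - alpha) * f[t-1], with f[0] = y[0]."""
    forecasts = [series[0]]
    for y in series[:-1]:
        forecasts.append(alpha * y + (1 - alpha) * forecasts[-1])
    return forecasts

def mean_absolute_error(series, forecasts):
    """In-sample forecast accuracy, as used when a series is too short
    to hold out a test set."""
    return sum(abs(y - f) for y, f in zip(series, forecasts)) / len(series)
```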

Gordon Rios, Antonino Marvuglia, Richard Wallace
Application of EVALPSN to Network Routing

The goal of this research is to construct an ad-hoc network routing protocol system based on EVALPSN in which various kinds of protocols can be dealt with uniformly. In this paper, we introduce the formalization of both single-path and multipath ad-hoc network routing in EVALPSN, which is a translation of the DSR and MDSR protocols.

Kazumi Nakamatsu, Toshiaki Imai, Jair Minoro Abe, Takashi Watanabe
A Combination of Case-Based Reasoning and Analytic Hierarchy Process to Support Innovation in Industry

This paper proposes an approach based on a combination of case-based reasoning and the analytic hierarchy process to support innovation in industry. The combination of these two methods has the objective of overcoming some of their individual weaknesses. Case-based reasoning is a method used to compute similarity between a new situation and previous stored cases, which results in a list of similar situations. The Analytic Hierarchy Process is a method used to order a set of alternatives considered as options in a decision-making process. With the proposed combination, the case-based reasoning identifies the set of alternatives to be used and ordered by the analytic hierarchy process. The approach considers group decision-making, involving actors with complementary expertise engaged in innovation processes in companies, such as product design.
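
The two stages combined above can be sketched roughly: a CBR retrieval step that returns similar stored cases as alternatives, then an AHP step that orders them. The geometric-mean approximation of the AHP priority vector and the Euclidean case similarity are standard simplifications, assumed here for illustration rather than taken from the paper:

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector from a pairwise comparison matrix
    using the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def retrieve_similar(cases, query_features, k=3):
    """CBR retrieval: the k stored cases nearest to the new situation."""
    return sorted(cases,
                  key=lambda c: math.dist(c["features"], query_features))[:k]
```

The retrieved cases become the alternatives whose scores on each criterion are aggregated with the AHP weights, which is the division of labour the paper proposes.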

Ana Campos, Rui Neves-Silva
Urban Spatiotemporal Data Modeling: Application to the Study of Pedestrian Walkways

In this paper we propose a methodology describing a global process for modeling urban spatiotemporal data. During the design phase, we use a tool based on the Entity-Relationship formalism that is dedicated to spatiotemporal data design (MADS). The resulting conceptual schemas are adapted to the particularities of urban data by distinguishing between physical concepts (continuants) and events (occurrents). This distinction enables better readability of the conceptual schemas and more powerful interoperability. The resulting schemas are then implemented in an Object Oriented environment. Finally, we use a GIS to query and graphically render the urban data. In order to test the capacity of the proposed modeling methodology in a real case, we apply it to a case study whose objective is to analyze pedestrian walkways in urban areas.

Chamseddine Zaki, Elyes Zekri, Myriam Servières, Guillaume Moreau, Gérard Hégron
An Efficient Pruning Approach for Class Association Rule Mining

We propose an efficient pruning approach to build a faster classifier based on CARs (Class Association Rules). First, we develop a structure called LECR (Lattice of Equivalence Class Rules) and propose an algorithm for fast mining of CARs. Second, we propose an algorithm to prune rules that are redundant in the LECR. Experimental results show that our approach is more efficient than the one based on the ECR-tree (Equivalence Class Rules-tree) by Vo and Le in 2009. The rule sets generated by the two approaches, ECR-tree and LECR, are the same, and therefore the accuracy does not change.
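
Independently of the LECR structure itself, the notions of class association rule quality and redundancy-based pruning can be illustrated generically (this is one common redundancy criterion, not necessarily the exact one used in the paper):

```python
def car_metrics(transactions, antecedent, cls):
    """Support and confidence of the class association rule antecedent -> cls,
    over transactions given as (itemset, class_label) pairs."""
    ante = set(antecedent)
    covered = [label for items, label in transactions if ante <= set(items)]
    hits = sum(1 for label in covered if label == cls)
    support = hits / len(transactions)
    confidence = hits / len(covered) if covered else 0.0
    return support, confidence

def prune_redundant(rules):
    """Drop a rule when a strictly more general rule (subset antecedent,
    same class) has confidence at least as high."""
    return [(ante, cls, conf) for ante, cls, conf in rules
            if not any(set(a) < set(ante) and c == cls and cf >= conf
                       for a, c, cf in rules)]
```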

Loan T. T. Nguyen, Thang N. Nguyen

Soft Data Analysis Based Fuzzy Systems, Control and Decision Making

Binary Tree Classifier Based on Kolmogorov-Smirnov Test

The classification of large-dimensional data sets arising from the merging of remote sensing data with more traditional forms of ancillary data poses a significant computational problem. Decision tree classification is a popular approach to the problem and an efficient form for representing a decision process in hierarchical pattern recognition systems. Such classifiers are characterized by the property that samples are subjected to a sequence of decision rules before they are assigned to a unique class.

The scope of development presented here is limited to a certain class of decision trees: binary decision trees in which each decision involves the comparison of a single feature to a threshold. Since any combination of features can be expressed as a newly defined single feature, the restriction to binary trees can be made without loss of generality.

The algorithm for partitioning of a feature space developed in this paper is based on the Kolmogorov-Smirnov test (K-S test), which requires the calculation of K-S distance and the threshold coefficients of the tree nodes.
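
For a single feature, the two-sample K-S distance and a candidate split threshold can be computed directly from the empirical class-conditional distributions. A minimal sketch (the tree-building and threshold-coefficient machinery of the paper is omitted):

```python
def ks_split(class_a, class_b):
    """Two-sample Kolmogorov-Smirnov distance between the feature values of
    two classes, and the threshold where their empirical CDFs differ most."""
    candidates = sorted(set(class_a) | set(class_b))
    best_d, best_t = 0.0, candidates[0]
    for t in candidates:
        f_a = sum(1 for v in class_a if v <= t) / len(class_a)
        f_b = sum(1 for v in class_b if v <= t) / len(class_b)
        if abs(f_a - f_b) > best_d:
            best_d, best_t = abs(f_a - f_b), t
    return best_d, best_t
```

In a binary decision tree, the feature (or derived single feature) with the largest K-S distance would be chosen at each node, with `best_t` as the comparison threshold.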

George Georgiev, Iren Valova, Natacha Gueorguieva
A Stackelberg Location Problem on a Tree Network with Fuzzy Random Demands

This paper focuses on a new Stackelberg location problem on a tree network with demands whose sites are given uncertainly and vaguely. By representing these sites as fuzzy random variables on the tree network, the distance between a facility and a customer can be defined as a fuzzy random number. For solving the Stackelberg location problem with fuzzy random distances, we first introduce the α-level set for fuzzy random numbers and transform the problem into a Stackelberg location problem with random demands. Next, the randomness of demands is represented as scenarios for each demand. Then, by using their expectations and variances, the problem can be reformulated as a version of the conventional Stackelberg location problem on a tree network. Its complexity and a solution method are shown based upon the characteristics of the facility location.

Takeshi Uno, Hideki Katagiri, Kosuke Kato
Learning Based Self-organized Additive Fuzzy Clustering Method

In this paper, we propose a self-organized additive fuzzy clustering method that incorporates a learning process for self-organized similarity. The proposed method adds this learning process to the conventional self-organized additive fuzzy clustering method by using the inner product between a pair of degrees of belongingness of objects. By learning the status of the noise in each iteration of the algorithm, the proposed method can obtain a more adaptable result. Several numerical examples show its better performance.

Tomoyuki Kuwata, Mika Sato-Ilic

Kansei Communication and Value Creation in Human Mind

A Modeling and Systems Thinking Approach to Activity Rousing Consumer’s Buying Motivation Focusing on “Kansei Information” in POP ADS at the Store

This paper discusses causalities between kansei information and consumer buying behavior, and actions a firm can take knowing that kansei information influences such behavior. Our focus is on kansei information, which is the result of cerebral processing of information obtained through the five senses, leading to a positive affect. Since 2000, we have conducted research into the effects of kansei information on consumer buying behavior in various industries in Japan. Based on observations of such effects as well as firms’ marketing and sales activities, we have modeled the process in which purchases are generated by the presentation of kansei information, and we have explored a way to build a systems dynamics model centering on the effects of kansei information. In this paper, we consider messages contained in kansei information through POP ADS, explain causal relationships using a systems thinking approach, and construct causal loop diagrams. Making comparisons with the causal loops, we also examine an experiment from an actual retail store which takes advantage of kansei information in an attempt to increase customer buying motivation.

Yuji Kosaka, Hisao Shiizuka
Ageing Society and Kansei Communication

This paper emphasizes the necessity of Kansei communication in the ageing society. It is important that Kansei communication gives rise to empathy. Important elements that give rise to Kansei communication are non-verbal communication and a comfortable interpersonal distance. There is a possibility of promoting the use of new communication media, such as cell phones and e-mail, by motivating Kansei communication.

Ayako Hashizume, Hisao Shiizuka
A Study on Group Decision Making with Observation on the Process of Consensus Building

Consensus building can be regarded as group decision making. Research on group decision making support methods has focused on whether a rational decision can be made by using mathematical methods. However, in daily life, people build consensus naturally and make decisions that satisfy all members. Therefore, the authors observed the process of consensus building on a selection problem and compared a decision made by a mathematical method with one reached by discussion. We then propose a GDSS (Group Decision Support System) in which participants input their personal evaluation reasons for the choices online and discuss them.

Yuri Hamada, Hiroko Shoji
Application of Modeling and Recommendation of Sensitivity to Get Tired

Most previous studies on recommendation have not taken into account the feeling of being in a rut, that is, the tiredness users come to feel when they continue to use the same systems and services. To keep users using a service, it is necessary to make recommendations that take such characteristics into consideration. Targeting meal planning, this study proposes a model of people’s sensitivity to getting tired of things and applies it to recommendation and its verification. An experiment comparing subjective evaluations of real data with simulation results showed that the proposed model mathematically represents the human feeling of mannerism. We also implemented the system and evaluated it with actual users, and found it to be very effective in assisting meal planning when recommendations consider the sensitivity to getting tired.

Hiroo Inamura, Yuko Noma, Akihiro Ogino, Hiroko Shoji
Evaluation of Feelings Received from the Rhythms of Percussive Timbre and Relationships between Affective Values

Studies of the relationships between music and human feeling have been conducted for a long time. According to Gabrielsson (1982), experiences with music rhythms involve several aspects, including the “construction of rhythms,” such as beat and complexity, the “movement of rhythms,” such as tempo and the feeling of procession, and the “affective aspect of rhythms,” which involves feelings such as vivacity or quietness. There have been few studies of the affective aspects of musical rhythms. Although music therapy has attracted attention in recent years, only a few studies are available on the characteristics of sounds and music in music therapy. This paper discusses our listening evaluations of the affective values of different percussive rhythms produced by two kinds of MIDI timbre. The results confirmed that affective values change with changes in rhythms. The applicability of this study to music therapy is also examined.

Yuta Kurotaki, Hisao Shiizuka
A Rough-Set-Based Two-Class Classifier for Large Imbalanced Dataset

The objective of this paper is to provide a rough-set-based two-class classifier for classifying samples in a large and imbalanced dataset. A database holds plenty of hidden knowledge, which can be used in decision making to support commerce, research and other activities. Prediction is another form of extended data analysis: it enables us to build a model from existing data and to predict future trends in the data. In this paper, a method consisting of data scaling, rough sets analysis and a support vector machine with radial basis function (SVM-RBF) is used to classify a large and imbalanced data set obtained from the semiconductor industry.

Junzo Watada, Lee-Chuan Lin, Lei Ding, Mohd. Ibrahim Shapiai, Lim Chun Chew, Zuwairie Ibrahim, Lee Wen Jau, Marzuki Khalid

Future Direction of Innovative Decision Technologies

A Hybrid MADM Based Competence Set Expansion for Marketing Imagination Capabilities

Creativity and marketing imagination capabilities are among the most important factors for achieving marketing success and, thus, firms’ profitability. However, many marketing programs for new products lack creativity. Further, although scholars have found decision support systems useful for enhancing creativity, few past studies have examined this issue in detail. To resolve these problems, this paper proposes a decision-making framework to derive the factors which affect the creativity of marketing programs and the relationships between those factors, and finally to define appropriate strategies for maximizing the performance of marketing imagination: a decision support system based on a hybrid multiple attribute decision making (MADM) method that introduces the (creativity) competence set expansion concept. A case study on expanding the creativity, and thus the marketing imagination capabilities, of a system vendor of telematic products is provided to demonstrate the feasibility of this MADM framework.

Chi-Yo Huang, Gwo-Hshiung Tzeng, Shu Hor
Semiconductor Foundry Technology Life Cycle Strategy Portfolio Definitions of Fabless IC Design Firms by Using the ISM and Fuzzy Integral Method

Fabless integrated circuit (IC) design firms are semiconductor companies without wafer fabrication facilities. Appropriate semiconductor foundry strategies are therefore critical for fabless IC design firms’ profitability. Further, since the characteristics of the semiconductor wafer foundry industry vary across the stages of a technology life cycle (TLC), the foundry strategies of fabless IC design firms should be defined according to the characteristics of each TLC stage. Albeit important, very few past studies have illustrated the foundry strategies of fabless IC design firms, let alone the definition of foundry strategy portfolios with respect to the TLC concept. Thus, this research develops a fuzzy-integral-based fuzzy multiple criteria decision making (FMCDM) framework for defining foundry strategies for each stage of a TLC, using interpretive structural modeling (ISM), the fuzzy analytic hierarchy process (FAHP) and the non-additive fuzzy integral method. With the input of twenty-one industry experts from leading semiconductor foundries, fabless IC design firms and IC design service companies, strategy portfolios for each stage of a TLC were developed.

Chi-Yo Huang, Chao-Yu Lai, Gwo-Hshiung Tzeng
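The non-additive fuzzy integral the abstract refers to is commonly computed as a discrete Choquet integral: criterion scores are aggregated against a fuzzy measure that can reward or penalize coalitions of criteria, rather than against additive weights. The sketch below assumes the standard Choquet formulation; the function name, the criterion names and the example measure values are illustrative, not from the paper.

```python
def choquet_integral(scores, measure):
    """Discrete Choquet integral of criterion scores with respect to
    a non-additive fuzzy measure defined on frozenset coalitions."""
    # Rank criteria by score, best first.
    order = sorted(scores, key=scores.get, reverse=True)
    result = 0.0
    for i, c in enumerate(order):
        coalition = frozenset(order[: i + 1])
        nxt = scores[order[i + 1]] if i + 1 < len(order) else 0.0
        # Each score increment is weighted by the measure of the
        # coalition of criteria scoring at or above that level.
        result += (scores[c] - nxt) * measure[coalition]
    return result
```

With `scores = {"cost": 0.9, "quality": 0.6}` and a measure assigning 0.5 to `{"cost"}` alone but 1.0 to both criteria together, the integral captures the interaction between criteria that a simple weighted average cannot.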
An Emotional Designed Based Hybrid MCDM Framework for the Next Generation Embedded System Configurations

Emotions play a critical role in humans’ ability to understand and learn about the world and new things. Furthermore, the concept of user-centered system design (UCSD), which emphasizes the importance of a good understanding of users, is emerging as a core of modern design methods. Traditionally, new product development teams defined high technology products in general, and embedded systems in particular, based on their intuition. However, this kind of design methodology can be subjective and misleading, as users’ emotions and requirements are not really considered. Thus, this paper resolves the issue by introducing a hybrid fuzzy multiple criteria decision making (FMCDM) based design methodology consisting of interpretive structural modeling (ISM), the fuzzy analytic hierarchy process (FAHP) and the fuzzy integral, built around the concept of emotional design. Thirteen senior managers from the world’s leading IT system companies and IC design houses were invited to configure an FMCDM framework for next generation handset designs and to verify its feasibility. In the future, the proposed framework can be used to configure any embedded system built into consumer electronics products.

Chi-Yo Huang, Hsiang-Chun Lin, Gwo-Hshiung Tzeng
Derivations of Factors Influencing Segmental Consumer Behaviors Using the RST Combined with Flow Graph and FCA

Consumer behavior analysis and prediction are both important for marketers in general and high technology marketers in particular. Given the fast evolution of high technology products, precise predictions of consumer behavior can serve as the foundation of product/specification definitions. Traditionally, qualitative approaches (e.g. brainstorming) or multivariate statistics (e.g. principal component analysis, factor analysis, etc.) were widely applied to consumer behavior analysis. However, qualitative methods can be subjective, while statistical approaches can be hard to manipulate. A rule-based prediction method can therefore be very helpful for analyzing and predicting consumer behavior, and precise consumer behavior prediction rules derived by such a forecast mechanism can be very useful for marketers and designers in defining product features. Thus, this research defines a forecast mechanism for predicting segmental consumer behavior based on cluster analysis (CA), rough set theory (RST), flow graphs (FG) and formal concept analysis (FCA). An empirical study of 124 Taiwanese 4G handset users verifies the feasibility of the proposed forecast mechanism, which can also be leveraged to define the features of other high technology products/services.

Chi-Yo Huang, Ya-Lan Yang, Gwo-Hshiung Tzeng, Hsiao-Cheng Yu, Hong-Yuh Lee, Shih-Tsung Cheng, Sang-Yeng Lo
Power System Equipments Investment Decision-Making under Uncertainty: A Real Options Approach

Power supply failures cause major social losses in today’s information society. Such a loss is estimated at up to approximately two trillion yen when a large power failure hits a big city such as Metropolitan Tokyo. It is therefore necessary to provide remedies such as diagnosis of power system equipment, not only to prevent equipment failures before they occur but also to keep the social cost from increasing. The objective of this paper is to provide preliminary research on life cycle management. Net present value (NPV) analysis and the real options approach (ROA) are employed for life cycle management of investment in and maintenance of power supply systems, in order to maintain continuous normal operation under uncertainty.

Shamshul Bahar Yaakob, Junzo Watada
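The abstract contrasts NPV analysis with the real options approach, which values the flexibility to defer, expand or abandon on top of the static NPV baseline. As background, here is a minimal NPV sketch; the discount rate and cash flows are illustrative numbers, not figures from the paper.

```python
def npv(rate, cash_flows):
    """Net present value of a cash flow stream, where cash_flows[0]
    is the initial (typically negative) investment at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))
```

For example, an investment of 100 returning 60 in each of the next two years at a 10% discount rate has an NPV of about 4.13; ROA would then ask whether waiting a year for uncertainty to resolve is worth more than investing now.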
Combining DEMATEL and ANP with the Grey Relational Assessment Model for Improving the Planning in Regional Shopping Centers

This paper presents a novel MCDM (Multiple Criteria Decision-Making) model for evaluating, comparing and improving the effectiveness of network-competence indicators across the various needs involved in choosing an environment plan at the primary regional shopping center level. The ANP (Analytic Network Process) weights are based on the DEMATEL (Decision-Making Trial and Evaluation Laboratory) technique, combined with a grey relational assessment model, to handle interdependence and feedback and to evaluate and reduce the gap in each criterion toward the aspired/desired level. An empirical case demonstrates that the hybrid MCDM model can measure and evaluate regional shopping center problems in real cases, and the results can help the decision-maker choose a regional shopping center location. The proposed model can help practitioners improve their decision processes, especially when criteria are numerous and interrelated. Data from Taiwanese regional shopping centers are used to demonstrate the model.

Vivien Y. C. Chen, Chui-Hua Liu, Gwo-Hshiung Tzeng, Ming-Huei Lee, Lung-Shih Yang
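The abstract combines DEMATEL with ANP and grey relational assessment; only the DEMATEL core is sketched below, following the usual formulation in which the direct-relation matrix is normalized by its largest row sum and the total-relation matrix is T = N(I − N)⁻¹. The function name and the tiny example matrix are illustrative assumptions, not data from the paper.

```python
def dematel_total_relation(direct):
    """Total-relation matrix T = N (I - N)^-1, where N is the
    direct-relation matrix normalized by its largest row sum."""
    n = len(direct)
    s = max(sum(row) for row in direct)
    N = [[v / s for v in row] for row in direct]
    # A = I - N
    A = [[(1.0 if i == j else 0.0) - N[i][j] for j in range(n)]
         for i in range(n)]
    # Invert A in place with Gauss-Jordan elimination.
    inv = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        inv[col], inv[pivot] = inv[pivot], inv[col]
        p = A[col][col]
        A[col] = [v / p for v in A[col]]
        inv[col] = [v / p for v in inv[col]]
        for r in range(n):
            if r != col:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
                inv[r] = [a - f * b for a, b in zip(inv[r], inv[col])]
    # T = N * (I - N)^-1
    return [[sum(N[i][k] * inv[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
```

Row and column sums of T then give the prominence and net cause/effect of each criterion, which is what feeds the ANP weighting in hybrid models like the one described above.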
Key Success Factors of Brand Marketing for Creating the Brand Value Based on a MCDM Model Combining DEMATEL with ANP Methods

When consumers purchase products, they consider the brand first, because it indirectly leads them to associations with the quality, functions and design of the products. According to the smiling curve, production interacts with marketing and R&D; therefore, enterprises that strengthen marketing and R&D can create brand value. This study focuses on marketing and builds on traditional marketing strategies to develop a brand marketing mix. However, there are many criteria among the strategies, and the criteria are interrelated. Thus, we probe the key success factors of brand marketing for satisfying customers’ needs. An MCDM model combining the DEMATEL and ANP methods is used to find the importance and influence relations among the dimensions and criteria, with which marketing strategies can be evaluated. The results will provide enterprises with a reference for planning brand marketing.

Yung-Lan Wang, Gwo-Hshiung Tzeng, Wen-Shiung Lee
Backmatter
Metadata
Title
Advances in Intelligent Decision Technologies
Editors
Gloria Phillips-Wren
Lakhmi C. Jain
Kazumi Nakamatsu
Robert J. Howlett
Copyright Year
2010
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-14616-9
Print ISBN
978-3-642-14615-2
DOI
https://doi.org/10.1007/978-3-642-14616-9