
2009 | Book

Cutting-Edge Research Topics on Multiple Criteria Decision Making

20th International Conference, MCDM 2009, Chengdu/Jiuzhaigou, China, June 21-26, 2009. Proceedings

Edited by: Yong Shi, Shouyang Wang, Yi Peng, Jianping Li, Yong Zeng

Publisher: Springer Berlin Heidelberg

Book series: Communications in Computer and Information Science


About this book

MCDM 2009, the 20th International Conference on Multiple-Criteria Decision Making, emerged as a global forum dedicated to the sharing of original research results and practical development experiences among researchers and application developers from different multiple-criteria decision making-related areas such as multiple-criteria decision aiding, multiple criteria classification, ranking, and sorting, multiple objective continuous and combinatorial optimization, multiple objective metaheuristics, multiple-criteria decision making and preference modeling, and fuzzy multiple-criteria decision making. The theme for MCDM 2009 was “New State of MCDM in the 21st Century.” The conference seeks solutions to challenging problems facing the development of multiple-criteria decision making, and shapes future directions of research by promoting high-quality, novel and daring research findings. With the MCDM conference, these new challenges and tools can easily be shared with the multiple-criteria decision making community. The workshop program included nine workshops which focused on different topics in new research challenges and initiatives of MCDM. We received more than 350 submissions for all the workshops, out of which 121 were accepted. This includes 72 regular papers and 49 short papers. We would like to thank all workshop organizers and the Program Committee for the excellent work in maintaining the conference’s standing for high-quality papers.

Table of contents

Frontmatter

Workshop on Evolutionary Methods for Multi-Objective Optimization and Decision Making

An Evolutionary Algorithm for the Multi-objective Multiple Knapsack Problem

In this study, we consider the multi-objective multiple knapsack problem (MMKP) and we adapt our favorable weight based evolutionary algorithm (FWEA) to approximate the efficient frontier of MMKP. The algorithm assigns fitness to solutions based on their relative strengths as well as their non-dominated frontiers. The relative strength is measured based on a weighted Tchebycheff distance from the ideal point where each solution chooses its own weights that minimize its distance from the ideal point. We carry out experiments on test data for MMKP given in the literature and compare the performance of the algorithm with several leading algorithms.
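
The fitness construction above can be illustrated with a small sketch. Assuming (not stated in the abstract) that distances are measured to the ideal point componentwise, the weights that minimize a solution's own weighted Tchebycheff distance make all weighted deviations equal, which gives a closed form; this is an illustrative sketch, not the exact FWEA fitness.

```python
import numpy as np

def favorable_tchebycheff_distance(z, z_ideal, eps=1e-12):
    """Weighted Tchebycheff distance from the ideal point, using the weights
    most favorable to the solution itself (a sketch, not the authors' exact
    FWEA fitness). Minimizing max_i w_i * d_i over the weight simplex makes
    all terms w_i * d_i equal, so the minimal distance is 1 / sum_i (1 / d_i)."""
    d = np.abs(np.asarray(z_ideal, float) - np.asarray(z, float)) + eps
    w = (1.0 / d) / np.sum(1.0 / d)          # solution-specific weights
    return np.max(w * d), w                  # = 1 / sum(1/d), and the weights

# toy bi-objective example (both objectives maximized)
dist, weights = favorable_tchebycheff_distance(z=[3.0, 5.0], z_ideal=[10.0, 8.0])
print(dist, weights)
```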

Banu Soylu, Murat Köksalan
Adaptive Differential Evolution for Multi-objective Optimization

No existing multi-objective evolutionary algorithm (MOEA) has ever been applied to problems with more than 1000 real-valued decision variables. Yet the real world is full of large and complex multi-objective problems. Motivated by the recent success of SaNSDE [1], an adaptive differential evolution algorithm that is capable of dealing with more than 1000 real-valued decision variables effectively and efficiently, this paper extends the ideas behind SaNSDE to develop a novel MOEA named MOSaNSDE. Our preliminary experimental studies have shown that MOSaNSDE outperforms state-of-the-art MOEAs significantly on most problems we have tested, in terms of both convergence and diversity metrics. Such encouraging results call for a more in-depth study of MOSaNSDE in the future, especially of its scalability.
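
For readers unfamiliar with differential evolution, here is a minimal sketch of the classic DE/rand/1/bin variation step that algorithms such as SaNSDE build on; the scale factor F, crossover rate CR, and the self-adaptation used in SaNSDE/MOSaNSDE are simplified here and are assumptions, not the paper's exact settings.

```python
import numpy as np

def de_rand_1_bin(pop, F=0.5, CR=0.9, rng=None):
    """One DE/rand/1/bin variation step: for each target vector, mutate three
    distinct random vectors and apply binomial crossover. A generic sketch;
    SaNSDE additionally self-adapts F and CR."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = pop.shape
    trials = np.empty_like(pop)
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[r1] + F * (pop[r2] - pop[r3])
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True   # keep at least one gene from the mutant
        trials[i] = np.where(cross, mutant, pop[i])
    return trials

trial_pop = de_rand_1_bin(np.random.rand(20, 1000))   # 1000-variable example
```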

Zai Wang, Zhenyu Yang, Ke Tang, Xin Yao
An Evolutionary Approach for Bilevel Multi-objective Problems

Evolutionary multi-objective optimization (EMO) algorithms have been extensively applied to find multiple near Pareto-optimal solutions over the past 15 years or so. However, EMO algorithms for solving bilevel multi-objective optimization problems have not yet received adequate attention. These problems appear in many practical applications and involve two levels, each comprising multiple conflicting objectives. They require every feasible upper-level solution to satisfy optimality of a lower-level optimization problem, thereby making them difficult to solve. In this paper, we discuss a recently proposed bilevel EMO procedure and show its working principle on a couple of test problems and on a business decision-making problem. This paper should motivate other EMO researchers to engage more in this practically important optimization task.

Kalyanmoy Deb, Ankur Sinha
Multiple Criteria Decision Making: Efficient Outcome Assessments with Evolutionary Optimization

We propose to derive assessments of outcomes to MCDM problems instead of just outcomes, and to carry out decision making processes with the former. In contrast to earlier works in that direction, which calculate assessments using subsets of the efficient set (shells), here we provide formulas for calculating assessments based on upper and lower approximations (upper and lower shells) of the efficient set, derived by evolutionary optimization. Hence, by replacing shells, which in general have to be derived by exact optimization, with pairs of upper and lower shells, exact optimization methods can be eliminated from MCDM.

Ignacy Kaliszewski, Janusz Miroforidis

Workshop on Mining Text, Semi-structured, Web, or Multimedia Data

Automatic Detection of Subjective Sentences Based on Chinese Subjective Patterns

Subjectivity analysis requires lists of subjective terms and corpus resources. Little work to date has attempted to automatically recognize subjective sentences and create corpus resources for Chinese subjectivity analysis. In this paper, we present a bootstrapping process that uses subjective phrases to automatically create a training set from unannotated data, which is then fed to a subjective-phrase extraction algorithm. The learned phrases are then used to identify more subjective sentences. The bootstrapping process can learn many subjective sentences and phrases. We show that the recall for subjective sentences increases with only a slight drop in reliability.

Ziqiong Zhang, Qiang Ye, Rob Law, Yijun Li
Case Study on Project Risk Management Planning Based on Soft System Methodology

This paper analyzes the soft-system characteristics of construction projects and the applicability of Soft System Methodology (SSM) to risk analysis, after a brief review of SSM. Taking a hydropower project as an example, it constructs the general framework of project risk management planning (PRMP) and establishes a risk management planning (RMP) system from the perspective of coordinating interests. Through the practice of SSM, the paper provides ideas and methods for constructing RMP in a win-win situation.

Xie Lifang, Li Jun
Experiments with Bicriteria Sequence Alignment

In this article we investigate the performance of a multicriteria dynamic programming algorithm for pairwise global sequence alignment that maximizes the number of matches and minimizes the number of indels or gaps. We provide estimates of the number of optimal alignments for pairs of random sequences, as well as computational results on a benchmark dataset. Our empirical analysis indicates that this approach is feasible from a practical point of view.

Luís Paquete, João P. O. Almeida
Integrating Decision Tree and Hidden Markov Model (HMM) for Subtype Prediction of Human Influenza A Virus

Multiple criteria decision making (MCDM) has a significant impact in bioinformatics. In the research reported here, we explore the integration of decision trees (DT) and Hidden Markov Models (HMM) for subtype prediction of human influenza A virus. Infection with influenza viruses continues to be an important public health problem. Viral strains of subtypes H3N2 and H1N1 circulate in humans at least twice annually. Subtype detection depends mainly on antigenic assays, which are time-consuming and not fully accurate. We have developed a Web system for accurate subtype detection of human influenza virus sequences. A preliminary experiment showed that this system is easy to use and powerful in identifying human influenza subtypes. Our next step is to examine the informative positions at the protein level and extend the current functionality to detect more subtypes. The web functions can be accessed at http://glee.ist.unomaha.edu/.

Pavan K. Attaluri, Zhengxin Chen, Aruna M. Weerakoon, Guoqing Lu
Fuzzy Double Linear Regression of the Financial Assets Yield

We construct a fuzzy double linear regression FDLR(p, q) model to deal with interval financial data, and then derive a fuzzy least squares method to estimate the unknown parameters of the model. Mean square error (MSE) and mean absolute error (MAE) are employed to evaluate and compare the fitting results of the two models, FDLR(p, q) and FAR(p), as well as their forecasts. Empirical analysis shows that the former is more effective.

Taiji Wang, Weiyi Liu, Zhuyu Li
Detection of Outliers from the Lognormal Distribution in Financial Economics

The lognormal distribution is widely applied in many fields such as economics, finance, and management. A method for detecting outliers in a lognormal sample is discussed. New test statistics based on superior estimators of the population parameters are given. These test statistics can resist contamination from the outliers, and their approximate distribution is proposed in this article.

Yunfei Li, Zongfang Zhou, Hong Chen
A Bibliography Analysis of Multi-Criteria Decision Making in Computer Science (1989-2009)

In the past two decades, there have been several research area shifts in the field of multi-criteria decision making (MCDM). MCDM in computer science (CS) is one of the rising sub-areas. This paper conducted a bibliographic study of the MCDM literature, covering 2171 research papers from 1989 to 2009. It summarized basic statistics about the longitudinal changes in MCDM-in-CS research activities and evaluated the distribution of subject areas within MCDM in CS.

Gang Kou, Yi Peng

Workshop on Knowledge Management and Business Intelligence

A Six Sigma Methodology Using Data Mining: A Case Study on Six Sigma Project for Heat Efficiency Improvement of a Hot Stove System in a Korean Steel Manufacturing Company

Recently, Six Sigma has been widely adopted in a variety of industries as a disciplined, data-driven problem-solving approach or methodology supported by a handful of powerful statistical tools to reduce variation through continuous process improvement. Data mining has also been widely used to discover unknown knowledge from large volumes of data using various modeling techniques such as neural networks, decision trees, and regression analysis. This paper proposes a Six Sigma methodology based on data mining for effectively and efficiently processing massive data in Six Sigma projects. The proposed methodology is applied to the hot stove system, a major energy-consuming process in a Korean steel company, to improve heat efficiency and reduce energy consumption. The results show optimal operating conditions and a 15% reduction in hot stove energy cost.

Gil-Sang Jang, Jong-Hag Jeon
Internal and External Beliefs as the Determinants of Use Continuance for an Internet Search Engine

The antecedents of IS use continuance have recently gained research attention. The traditional beliefs based on TRA have been adopted as the main drivers of IS use continuance. They rest on the assumption that continuance is a simple repetition of initial adoption, because those belief constructs are the same as those of TAM. However, continued usage is a matter of continuous decision making among competing alternative systems. Furthermore, control factors such as habit, self-efficacy, and self-identity have been suggested to extend TRA. This study introduces self-identity, self-efficacy and perceived switching cost as new drivers of IS use continuance. A total of 395 samples were collected from the users of a Korean internet search engine portal site. Data analysis using Structural Equation Modeling (SEM) shows that while the effect of habit overwhelms the effect of intention on continuance, it is also mediated by self-identity and self-efficacy, which can be considered internal beliefs formed by the internalization of habit. On the contrary, external beliefs such as switching cost and perceived usefulness showed no significant effect on continuance.

Soongeun Hong, Young Sik Kang, Heeseok Lee, Jongwon Lee
Development of Knowledge Intensive Applications for Hospital

Most studies of medical intelligence systems have focused on the development of backend data repositories rather than frontend user applications. They also tend to lack systematic development models that demonstrate how user requirements inform system design and implementation. For these reasons, it is highly desirable to show the process of eliciting knowledge requirements in addition to the development process for knowledge-intensive applications. This research covers the implementation of a medical intelligence system based on an analysis of knowledge requirements such as OLAP (On-line Analytical Processing) fundamental functionalities and knowledge types. The proposed medical intelligence system provides health care providers and their supporters with the ability to draw meaningful insights from very large, complex data sets. A real-life case is presented to illustrate the system’s practical usage. Six application packages are defined, namely: explorer, analyzer, reporter, statistician, visualizer, and meta administrator. Finally, the study concludes with an evaluation of the developed system and future research directions.

Jongho Kim, Han-kuk Hong, Gil-sang Jang, Joung Yeon Kim, Taehun Kim
Organizational Politics, Social Network, and Knowledge Management

This research identifies the social relationships and structure among members as well as the organization’s political inclination, and through these it also identifies the current status of knowledge management. The results show that socio-technological factors (individual, knowledge and IT factors) affect knowledge transfer, that knowledge transfer influences performance, and that the members’ relationships based on the political inclination of the organization have a major moderating effect on the above two relations.

Hyun Jung Lee, Sora Kang, Jongwon Lee
Implementation of Case-Based Reasoning System for Knowledge Management of Power Plant Construction Projects in a Korean Company

Recently, plant construction industries have been enjoying a favorable business climate centered on developing countries and oil-producing countries rich in oil money. This paper proposes a methodology for implementing a corporation-wide case-based reasoning (CBR) system for effectively managing lessons-learned knowledge, such as the experience and know-how obtained in performing power plant construction projects. Our methodology consists of 10 steps: user requirement analysis, information modeling, case modeling, case base design, similarity function design, user interface design, case base building, CBR module development, user interface development, and integration test. To illustrate the usability of the proposed methodology, a practical CBR system is implemented for the plant construction business division of ’H’ company, which is internationally competitive in the field of plant construction. At present, our CBR system is successfully used for storing, sharing, and reusing the knowledge accumulated in performing power plant construction projects in the target enterprise.

Gil-Sang Jang

Workshop on Data Mining Based Extension Theory

Mining Conductive Knowledge Based on Transformation of Same Characteristic Information Element in Practical Applications

The transformation of one or several characteristics of an object, or of a class of objects, can lead to transformations of the same characteristics of other objects. From an engineering point of view, this paper defines the basic concepts of conductive objects and conductive class objects based on the transformation of same-characteristic information elements, together with the conductive degree, its interval, and the scope of the characteristic values; the paper also gives a theory and steps for mining conductive knowledge based on transformations of same-characteristic information elements in practical applications.

Li Xiao-Mei
Research on Customer Value Based on Extension Data Mining

Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product of combining Extenics with data mining. It explores the acquisition of knowledge based on extension transformations, called extension knowledge (EK), by taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the essentiality of a customer relationship for an enterprise, treating the enterprise as the subject that assesses value and customers as the objects being assessed. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative-change knowledge and qualitative-change knowledge, can provide a foundation for an enterprise’s decisions on its customer relationship management (CRM) strategy. It can also provide a new idea for studying CV.

Yang Chun-yan, Li Wei-hua
The Intelligent System of Cardiovascular Disease Diagnosis Based on Extension Data Mining

This paper gives general definitions of the concepts of extension knowledge, extension data mining and the extension data mining theorem in high-dimensional space. It also builds an integrated IDSS from rough sets, an expert system and a neural network, and develops the corresponding computer software. Diagnosis tests on the common diseases of myocardial infarction, angina pectoris and hypertension, with results compared against physicians, show that the sensitivity, specificity and diagnostic accuracy of the IDSS are all higher than those of the physicians. With the auxiliary help of this system, physicians can improve their diagnostic accuracy, which has obvious significance for lowering mortality and disability rates and raising survival rates; the system thus has strong practical value and broader social benefits.

Baiqing Sun, Yange Li, Lin Zhang
From Satisfaction to Win-Win: A Novel Direction for MCDM Based on Extenics

Multi-criteria decision making (MCDM) problems are often imprecise and changeable. Therefore, an important step in many applications of MCDM is to analyze the criteria from input data and finally obtain compromise or satisfactory solutions. Seeking solutions to challenging problems facing the development of MCDM, we present a methodology called extension multi-criteria decision making (eMCDM) to obtain win-win solutions based on Extenics and data mining. While data mining technology helps to find the reasons why particular criteria and goals were selected, Extenics helps to find win-win solutions by implication analysis and by transforming the criteria or goals. A new direction for refining MCDM is presented, and the application shows its value and Viable Vision.

Xingsen Li, Liuying Zhang, Aihua Li
The Methods to Construct Multivariate and Multidimensional Basic-Element and the Corresponding Extension Set

Extenics aims to study the possibility of extension and the rules of innovation by using formalized models, and to obtain approaches for resolving incompatible problems. The basic-element theory and the extension set theory are the mainstays of Extenics. Building on existing results, we propose definitions of multivariate and multidimensional basic-elements and the corresponding extension sets. These results enrich the contents of the basic-element theory and the extension set theory, and enlarge the application fields of Extenics.

Li Qiao-Xing, Yang Jian-Mei
A New Process Modeling Method Based on Extension Theory and Its Application in Purified Terephthalic Acid Solvent System

Owing to the complexity and nonlinearity of chemical processes, intelligent process modeling has become a hot issue. For the purified terephthalic acid (PTA) solvent system, acetic acid consumption is a crucial production index that has received much attention in monitoring and guiding production conditions. In this work, the focus is on understanding and modeling the influence of operating conditions on acetic acid consumption, using a combination of the matter-element model, extension transformation and a general regression neural network (GRNN). This yields an extension engineering method for process modeling in the chemical industry. Case studies of an actual application in the PTA solvent system of a chemical plant show that the proposed method enables us to understand matters in an overall way, to describe human thinking processes in a formalized way, and to model the chemical process precisely, which opens a new way to simulate and guide production processes in the chemical industry.

Xu Yuan, Zhu Qunxiong
Research on Liquidity Risk Evaluation of Chinese A-Shares Market Based on Extension Theory

This research defines the liquidity risk of the stock market in terms of matter-element theory and affair-element theory, establishes an indicator system for forewarning liquidity risks, and designs an early-warning model and process using the extension set method, the extension dependent function and a comprehensive evaluation model. The paper then studies the A-shares market empirically using data on index 1A0001, which proves that the model can describe the liquidity risk of China’s A-share market well. Finally, it gives corresponding policy recommendations.

Sun Bai-qing, Liu Peng-xiang, Zhang Lin, Li Yan-ge
Contradictory Problems and Space-Element Model

Many contradictory problems generally exist in the design of architectural interior space, so it is necessary to study measures for solving the contradictory problems that arise in spatial form. The basic theory of Extenics provides this thesis with the leading idea of extension transformation and with matter-element theory as one of its tools. On the basis of these theories, the thesis sets up a space-element model to describe the contradictory problems in architectural interior space, aiming to guide future theory and practice and to provide intelligent methods for solving such contradictory problems.

Wang Tao, Zou Guang-tian

Workshop on Intelligent Knowledge Management

Knowledge Intelligence: A New Field in Business Intelligence

This paper discusses the development of business intelligence in light of the development of data mining. Business intelligence plays an important role in producing up-to-date information for operative and strategic decision making. We propose a new kind of knowledge, named intelligent knowledge, derived from data. We illustrate a way to combine business intelligence and intelligent knowledge, and propose an approach to the management of intelligent knowledge that is more structured than the management of traditional knowledge.

Guangli Nie, Xiuting Li, Lingling Zhang, Yuejin Zhang, Yong Shi
Mining Knowledge from Multiple Criteria Linear Programming Models

As a promising data mining tool, Multiple Criteria Linear Programming (MCLP) has been widely used in business intelligence. However, a possible limitation of MCLP is that it generates unexplainable black-box models which can only tell us results without reasons. To overcome this shortcoming, in this paper we propose a Knowledge Mining strategy which mines black-box MCLP models to obtain explainable and understandable knowledge. Different from the traditional Data Mining strategy, which focuses on mining knowledge from data, this Knowledge Mining strategy provides a new vision of mining knowledge from black-box models, which can be taken as a special topic of “Intelligent Knowledge Management”.

Peng Zhang, Xingquan Zhu, Aihua Li, Lingling Zhang, Yong Shi
Research on Domain-Driven Actionable Knowledge Discovery

Traditional data mining is a data-driven trial-and-error process that stops when a general pattern is discovered. However, in many cases the knowledge mined by this process cannot meet real-world business needs. In real-world business, knowledge must be actionable, that is, one must be able to act on it to profit. Actionable knowledge discovery is a complex task, because it strongly depends on domain knowledge such as background knowledge, expert experience, user interest, environmental context and business logic, and even law, regulation, habit and culture. The main challenge is moving from data-driven to domain-driven data mining (DDDM), whose goal is to discover actionable knowledge rather than general patterns. The main ideas of DDDM, as a new generation of data mining approaches, are introduced. Two types of process models show the difference between loose coupling and tight coupling. The main characteristics, such as constraint-based, human-machine-cooperated, loop-closed iterative refinement and meta-synthesis-based process management, are also presented, and a system architecture and a paradigm are introduced.

Zhengxiang Zhu, Jifa Gu, Lingling Zhang, Wuqi Song, Rui Gao
Data Mining Integrated with Domain Knowledge

Traditional data mining is a data-driven trial-and-error process [1] that aims at discovering patterns or rules. People either view data mining as an autonomous process, or only analyze the issues in an isolated, case-by-case manner. Because it overlooks valuable information such as existing knowledge, expert experience, context and real constraints, the resulting output cannot be directly applied to support business decisions. This paper proposes a new methodology called Data Mining Integrated with Domain Knowledge, aiming to discover more interesting and more actionable knowledge.

Anqiang Huang, Lingling Zhang, Zhengxiang Zhu, Yong Shi
A Simulation Model of Technological Adoption with an Intelligent Agent

Operational optimization models of technology adoption commonly assume the existence of a social planner who knows the long-term future. Such a planner does not exist in reality. This paper introduces a simulation model in which an intelligent agent forms its expectations of the future by continuously learning from its previous experience, and adjusts its decisions on technology development accordingly. Simulations with the model show that with the intelligent agent, an advanced but currently expensive technology will be adopted, but at a much slower pace than in optimization models.

Tieju Ma, Chunjie Chi, Jun Chen, Yong Shi
Research on Ratchet Effects in Enterprises’ Knowledge Sharing Based on Game Models

Knowledge sharing in enterprises is conducive to the richness and growth of employees’ knowledge. The purpose of this paper is to apply game models to analyze knowledge sharing. First, we conclude that knowledge sharing is often trapped in a “prisoner’s dilemma”. Then we find that “ratchet effects” exist and weaken the validity of incentive mechanisms. Finally, we conclude that a more objective evaluation standard and long-term incentive contracts would help to eliminate these ratchet effects.

Ying Wang, Lingling Zhang, Xiuyu Zheng, Yong Shi
Application of Information Visualization Technologies in Masters’ Experience Mining

Experiences, a kind of tacit knowledge, are gradually accumulated by experts over their long working careers. Analyzing and passing on those experiences is worthwhile for society. We build a platform composed of visualization methods and analysis methods to present and analyze data (from databases, papers, the web, etc.), so that students can understand the academic thinking of masters more intuitively than before. The platform has been applied to investigating the experiences of masters of Traditional Chinese Medicine (TCM), and the positive results are also reported.

Song Wuqi, Gu Jifa
Study on an Intelligent Knowledge Push Method for Knowledge Management System

In this paper, we design a mechanism that measures the affinity between knowledge and users, and the affinity among users, to achieve intelligent management of knowledge. Based on these affinities, we can implement knowledge push to provide the right knowledge to the right person automatically. Several matrices are designed to calculate the affinity.
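
As a rough illustration of matrix-based affinity and push: the concrete affinity measures of the paper are not specified here, so this sketch simply uses cosine similarity over hypothetical user and knowledge feature vectors.

```python
import numpy as np

def cosine_affinity(A, B):
    """Cosine similarity between the rows of A and the rows of B."""
    A = A / (np.linalg.norm(A, axis=1, keepdims=True) + 1e-12)
    B = B / (np.linalg.norm(B, axis=1, keepdims=True) + 1e-12)
    return A @ B.T

# hypothetical feature matrices: 4 users and 6 knowledge items over 5 topics
users = np.random.rand(4, 5)
items = np.random.rand(6, 5)

user_item = cosine_affinity(users, items)   # affinity between users and knowledge
user_user = cosine_affinity(users, users)   # affinity among users

# push the top-2 most affine knowledge items to each user
top2 = np.argsort(-user_item, axis=1)[:, :2]
for u, picks in enumerate(top2):
    print(f"push items {picks.tolist()} to user {u}")
```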

Lingling Zhang, Qingxi Wang, Guangli Nie
Extension of the Framework of Knowledge Process Analysis: A Case Study of Design Research Process

This study applies the approach of knowledge process analysis (KPA) to an academic research project. It investigates the knowledge-creation primitives of KPA used in previous studies and tests other possible primitives from the domain of design studies. This is a step toward improving KPA with design research experience.

Georgi V. Georgiev, Kozo Sugiyama, Yukari Nagai

The Ninth International Workshop on Meta-Synthesis and Complex Systems

On Heterogeneity of Complex Networks in the Real World

Although recent studies have made great progress in the research on heterogeneity of complex networks with power-law degree distribution in the real world, they seem to ignore that there may be different types of heterogeneities. Therefore, this paper, from the perspective of power-law degree distribution, suggests a comprehensive analysis taking several coefficients into account to reveal heterogeneity of complex networks in the real world more accurately. We show that there are at least two types of heterogeneities.

Ruiqiu Ou, Jianmei Yang, Jing Chang, Weicong Xie
Some Common Properties of Affiliation Bipartite Cooperation-Competition Networks

This article presents a brief review of some common properties of cooperation-competition networks described by affiliation bipartite graphs. Firstly, the distributions of three statistical quantities, the two bipartite graph degrees and a projected unipartite graph degree, which describe the network’s cooperation-competition configuration, are introduced. The common functional forms of the distributions are deduced by analytic and numerical analyses of a network evolution model, and then verified by empirical investigations of 23 real-world cooperation-competition systems. Secondly, to describe the competition results, a node weight is proposed which represents a kind of competition gain. A common node weight distribution function is empirically observed in the 23 real-world systems. Thirdly, the relationships between the properties describing the cooperation-competition configuration and the competition properties are discussed; the only example reported in this article is the correlation between the node weight and a bipartite graph degree. These studies may be helpful for the development of complex system theory and the understanding of some important real-world systems.

Da-Ren He
Cases of HWMSE

In 1992, the concept of the Hall for Workshop on Meta-synthetic Engineering (HWMSE) was proposed by the Chinese system scientist Qian Xuesen as a platform for practicing the meta-synthesis system approach (MSA). Throughout the continuing study of MSA and HWMSE, doubts and critical opinions have always existed. In this paper, two cases of HWMSE are presented to expand the understanding of the concept of HWMSE and the practice of the meta-synthesis system approach.

Xijin Tang
Group Argumentation Info-visualization Model in the Hall for Workshop of Meta-synthetic Engineering

In order to improve the effectiveness and efficiency of computer-mediated group argumentation, which is one of the main activities in HWME, it is necessary to explore new information-visualization technologies. The paper analyzes the attributes of argument information, defines their structures and relationships, and organizes the argument information as a directed graph at a given time point. The info-visualization model provides experts with an easy-to-understand way to discern the state of argumentation. Finally, the paper studies consensus estimation based on the model.

Wang Ming-li, Dai Chao-fan
Study on Improving the Fitness Value of Multi-objective Evolutionary Algorithms

The Pareto sort classification method is often used to compute the fitness values of evolutionary populations in multi-objective evolutionary algorithms. However, this kind of computation may produce great selection pressure and result in premature convergence. To address this problem, an improved method for computing fitness values in multi-objective evolutionary algorithms, based on the relative relationship between objective function values, is proposed in this paper, which improves the convergence and distribution of multi-objective evolutionary algorithms. Results on test functions show that the improved computation method achieves better convergence and distribution than an evolutionary algorithm based on the Pareto sort classification method.
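
For reference, here is a minimal sketch of the Pareto-dominance check and the fitness-by-nondominated-rank idea that the abstract starts from (minimization is assumed); this is the baseline scheme, not the authors' improved relative-relationship fitness.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_rank(points):
    """Assign each point the index of the nondominated front it belongs to."""
    remaining = dict(enumerate(points))
    ranks, front = {}, 0
    while remaining:
        current = [i for i in remaining
                   if not any(dominates(remaining[j], remaining[i])
                              for j in remaining if j != i)]
        for i in current:
            ranks[i] = front
            del remaining[i]
        front += 1
    return [ranks[i] for i in range(len(points))]

print(nondominated_rank([(1, 5), (2, 2), (3, 1), (4, 4)]))  # (4, 4) falls on front 1
```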

Yong Gang Wu, Wei Gu
Simulation for Collaborative Competition Based on Multi-Agent

A system of collaborative competition exhibits swarm intelligence through the interactions of individuals. Given the advantages of multi-agent simulation in describing and modeling complex systems, a model of enterprise collaborative competition was built. The model defines every enterprise as an agent and defines agent actions by combining Holland’s ECHO model. The behavioral mechanism and simulation parameters are designed in the model. The dynamic competitive behavior of the enterprises was simulated in the Java language, and finally the simulation results were analyzed.

Zhiyuan Ge, Jiamei Liu
Fuzzy Optimal Decision for Network Bandwidth Allocation with Demand Uncertainty

In this paper, a fuzzy methodology is proposed to optimize network bandwidth allocation under demand uncertainty in communication networks (CNs). In the proposed methodology, uncertain traffic demands are first handled by fuzzification. A fuzzy optimization methodology is then presented for the network bandwidth allocation problem, considering the trade-off between resource utilization and service performance in CNs. Accordingly, the optimal network bandwidth is allocated to obtain maximum network revenue in CNs. Finally, a numerical example is presented for purposes of illustration.

Lean Yu, Wuyi Yue, Shouyang Wang
A Comparison of SVD, SVR, ADE and IRR for Latent Semantic Indexing

Recently, singular value decomposition (SVD) and its variants, singular value rescaling (SVR), approximation dimension equalization (ADE) and iterative residual rescaling (IRR), have been proposed for latent semantic indexing (LSI). Although they are all based on the same linear algebraic method for term-document matrix computation, namely SVD, the basic motivations behind them concerning LSI differ from each other. In this paper, a series of experiments is conducted to examine their effectiveness for LSI in practical text mining applications, including information retrieval, text categorization and similarity measurement. The experimental results demonstrate that SVD and SVR perform better than the other proposed LSI methods in the above applications. Meanwhile, ADE and IRR, because their approximation matrices differ too much from the original term-document matrix in Frobenius norm, cannot achieve good performance in text mining applications using LSI.
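
A minimal sketch of plain SVD-based LSI (the SVR/ADE/IRR variants are not reproduced here), assuming a small hypothetical term-document count matrix:

```python
import numpy as np

# hypothetical term-document count matrix: rows = terms, columns = documents
A = np.array([[2, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 2, 0, 1],
              [0, 0, 1, 2]], dtype=float)

k = 2                                   # number of latent dimensions to keep
U, s, Vt = np.linalg.svd(A, full_matrices=False)
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # documents in the k-dimensional latent space

# cosine similarity between documents in the latent space
norms = np.linalg.norm(doc_vecs, axis=1, keepdims=True)
sims = (doc_vecs / norms) @ (doc_vecs / norms).T
print(np.round(sims, 2))
```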

Wen Zhang, Xijin Tang, Taketoshi Yoshida
The Bilevel Programming Model of Earthwork Allocation System

Earthwork allocation, which is common in construction projects and directly affects the quality, cost and schedule of projects, is a transportation problem with a hierarchical structure. A linear programming (LP) model cannot clearly reflect the characteristics of such a system. Since bilevel programming (BLP) is one of the useful tools for solving problems with this structure, the BLP model of earthwork allocation is established in this paper. The objective of the upper level is to minimize the transportation cost, and the objective of the lower level is to balance the supply and demand of earthwork between the excavation and embankment activities. In addition, a hybrid particle swarm optimization algorithm, combining particle swarm optimization (PSO) with the simplex algorithm, is proposed to solve the model.

Wang Xianjia, Huang Yuan, Zhang Wuyue
Knowledge Diffusion on Networks through the Game Strategy

In this paper, we develop a knowledge diffusion model in which agents decide whether to give their knowledge to others according to certain exchange strategies. A typical network, the small-world network, is used for modeling: agents with knowledge are viewed as the nodes of the network and the edges are viewed as the social relationships through which knowledge is transmitted. Agents interact repeatedly with the neighbors to whom they are directly connected and change their strategies by choosing the most beneficial neighbors to diffuse knowledge to. Two kinds of knowledge transmission strategies are proposed for the theoretical model based on game theory and are then used in different simulations to examine the effect of the network structure on knowledge diffusion. The analyses yield two main observations: first, contrary to the intuition that agents maximize their benefit by only accepting but never sharing knowledge, the simulation results show otherwise; second, the number of agents who acquire knowledge and the corresponding knowledge stock turn out to be independent of the percentage of agents who choose to contribute their knowledge.
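
A toy sketch of knowledge diffusion on a small-world graph, under hypothetical rules (each sharing agent passes one unit of knowledge to its neighbors per round, and agents copy the strategy of their best-endowed neighbor); the paper's actual payoff and strategy-update rules are not reproduced here.

```python
import random
import networkx as nx

random.seed(1)
G = nx.watts_strogatz_graph(n=100, k=4, p=0.1)      # small-world social network

share = {v: random.random() < 0.5 for v in G}        # strategy: share or not
stock = {v: random.random() for v in G}              # initial knowledge stock

for _ in range(50):
    # sharing agents transmit a unit of knowledge to each neighbor
    gains = {v: 0.0 for v in G}
    for v in G:
        if share[v]:
            for u in G.neighbors(v):
                gains[u] += 1.0
    for v in G:
        stock[v] += gains[v]
    # each agent imitates the strategy of its highest-stock neighbor
    share = {v: share[max(G.neighbors(v), key=stock.get, default=v)] for v in G}

print(sum(share.values()), "sharers;", round(sum(stock.values()), 1), "total stock")
```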

Shu Sun, Jiangning Wu, Zhaoguo Xuan
The Analysis of Complex Structure for China Education Network

We collected data on the documents of the China Education and Research Network and the links among them, which constitute a large complex directed network, the China Education Network (CEN), whose nodes are documents and whose edges are URLs. This paper analyzes several statistical properties of the CEN based on this empirical data, including its degree distributions, average path length, clustering coefficient, and community structure. We find that both the in-degree and out-degree distributions of the CEN have power-law tails and that the network displays both small-world and scale-free properties. The CEN has a considerably small average path length and a moderate clustering coefficient. As a large-scale complex network, the China Education Network clearly presents a community structure, in which the colleges within a school generally constitute communities with a large modularity.
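
The statistics named above can be computed with networkx on any directed link graph; this sketch uses a small random directed graph standing in for the crawled CEN data, which is not available here.

```python
import networkx as nx

# a random directed graph stands in for the crawled CEN link data
G = nx.gnp_random_graph(500, 0.01, directed=True, seed=42)

in_deg = [d for _, d in G.in_degree()]
out_deg = [d for _, d in G.out_degree()]
print("max in/out degree:", max(in_deg), max(out_deg))

# clustering coefficient and average path length on the undirected version,
# restricted to the largest connected component
U = G.to_undirected()
giant = U.subgraph(max(nx.connected_components(U), key=len))
print("clustering coefficient:", round(nx.average_clustering(giant), 3))
print("average path length:", round(nx.average_shortest_path_length(giant), 3))
```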

Zhu-jun Deng, Ning Zhang
Priority-Pointing Procedure and Its Application to an Intercultural Trust Project

In the Western cultural background, the Priority-Pointing Procedure (PPP), which is a qualitative research-based diagnostic procedure, has been proven to be able to point to a priority for action by measuring imbalances in the context of Nomology. As the starting point to prove its feasibility in the environment of the Eastern cultural background, we applied PPP to the research process of an Intercultural Trust Project, which was performed in Dublin, Ireland. In this paper we present the application of PPP in the environment of a mixed cultural background. We find that PPP is useful for defining variables and identifying imbalances.

Rong Du, Shizhong Ai, Cathal M. Brugha
Exploring Refinability of Multi-Criteria Decisions

This paper used the Structured Multi-Criteria Methodology and the Direct-Interactive Structured-Criteria (DISC) Multi-Criteria Decision-Making (MCDM) system to explore the refinability of decisions in dialogue between Decision Advisors (DAs) and Decision-Makers (DMs). The study showed the importance of a sensitive DA/DM interaction, of using iterative cycles to complete stages, and that the DAs should have confidence in the full eight stage MCDM structured process when helping DMs to reach a decision.

Cathal M. Brugha
Methodology for Knowledge Synthesis

This paper considers the problem of knowledge synthesis and proposes a theory of knowledge construction, which consists of three fundamental parts: a knowledge integration model, the structure-agency-action paradigm, and the evolutionally constructive objectivism. The first is a model of gathering and integrating knowledge, the second relates to necessary abilities when gathering knowledge in individual domains, and the third comprises a set of principles to evaluate gathered and integrated knowledge.

Yoshiteru Nakamori
Study on Public Opinion Based on Social Physics

Social combustion theory, social shock wave theory and social behavior entropy theory are the three basic theories of social physics. This paper studies public opinion formation based on social combustion theory, explores the evolution process of public opinion based on social shock wave theory, and analyzes individual behavior, specifically that of public opinion leaders, based on social behavior entropy theory.

Yijun Liu, Wenyuan Niu, Jifa Gu
Context-Based Decision Making Method for Physiological Signal Analysis in a Pervasive Sensing Environment

With the advent of light-weight, high-performance sensing and processing technology, pervasive physiological sensing devices have been actively studied. However, a pervasive sensing device is easily affected by external factors and environmental changes such as noise, temperature or weather. In addition, it is hard to deal with the internal factors of a user and personal differences in physiological characteristics while measuring physiological signals with a pervasive sensing device. To address these issues, we propose a context-based decision making method for pervasive sensing environments that considers users’ age, gender and sensing environment when detecting the normal physiological condition of a user. From the research conducted, we found that context-based physiological signal analysis of multiple users’ regular data showed reliable results and reduced errors.

Ahyoung Choi, Woontack Woo
A Framework of Task-Oriented Decision Support System in Disaster Emergency Response

Based on an analysis of the organization of the rescue process for the Wenchuan Earthquake in China, this paper develops a task-oriented management model for disaster emergency response. A management mechanism for task generation in emergency response is established; four kinds of task generation mechanisms are studied and three decision-making patterns are suggested. The routines for producing the task system are discussed, which decompose the essential task into sub-tasks and form the task system through Work Breakdown Structure (WBS) processes. A framework for a decision support system in emergency response is proposed, based on the Hall for Workshop of Meta-synthetic Engineering. It can help the operation team transfer the predetermined plan into an execution plan in emergency response and assign and dynamically supervise the task system.

Jun Tian, Qin Zou, Shaochuan Cheng, Kanliang Wang
Study on the Developing Mechanism of Financial Network

A financial network is a capital flow network made up of a great number of account nodes. Based on the theories of economic physics, behavioral economics and complex networks, a developing model of the financial network is constructed from the point of view of the weight properties of vertices and edges in the network. Analysis of the model shows that the financial network exhibits a power-law degree distribution ($p_{k}\sim k^{-2}$) as time tends to infinity. Finally, the degree distribution of the financial network is simulated with the experimental data in this paper.

Xiaohui Wang, Yaowen Xue, Pengzhu Zhang, Siguo Wang
Solving Sudoku with Constraint Programming

Constraint Programming (CP) is a powerful paradigm for modeling and solving complex combinatorial problems (generally arising from decision making). In this work, we model the well-known Sudoku puzzle as a Constraint Satisfaction Problem and solve it with CP, comparing the performance of different variable and value selection heuristics in the enumeration phase. We encourage the use of this kind of benchmark problem because it may suggest new techniques in constraint modeling and solving of complex systems, or aid understanding of their main advantages and limits.
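
As a self-contained illustration of the CSP view, here is a small backtracking solver with the minimum-remaining-values (MRV) variable-selection heuristic; this is a generic sketch, not the specific CP system or heuristics benchmarked in the paper.

```python
def solve(grid):
    """Backtracking Sudoku solver with the MRV variable-selection heuristic.
    grid is a 9x9 list of lists with 0 for empty cells; solves in place."""
    def candidates(r, c):
        used = set(grid[r]) | {grid[i][c] for i in range(9)}
        br, bc = 3 * (r // 3), 3 * (c // 3)
        used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
        return [v for v in range(1, 10) if v not in used]

    empties = [(r, c) for r in range(9) for c in range(9) if grid[r][c] == 0]
    if not empties:
        return True
    # MRV: branch on the empty cell with the fewest remaining candidates
    r, c = min(empties, key=lambda rc: len(candidates(*rc)))
    for v in candidates(r, c):
        grid[r][c] = v
        if solve(grid):
            return True
        grid[r][c] = 0
    return False

print(solve([[0] * 9 for _ in range(9)]))  # the empty grid is a trivially consistent instance
```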

Broderick Crawford, Carlos Castro, Eric Monfroy
A Study of Crude Oil Price Behavior Based on Fictitious Economy Theory

The strong fluctuations of international crude oil prices have aroused wide concern in society and academia. Based on the theory of fictitious economy, this paper studies and explains crude oil price behavior from January 1946 to December 2008. It concludes that long-term crude oil prices are subject to mean reversion in accordance with the law of value, fluctuating around the long-term marginal opportunity cost. At the same time, however, prices have also deviated far from the long-term marginal opportunity cost for several relatively long periods. This paper highlights four aspects of this issue: the diversification of international crude oil market participants, the structural changes of the participants, the evolution of the pricing mechanism, and the periodic change of the world economy.

Xiaoming He, Siwei Cheng, Shouyang Wang
Study on the Method of Determining Objective Weight of Decision-Maker (OWDM) in Multiple Attribute Group Decision-Making

In multi-attribute group decision making, the aggregated result depends heavily on the objective weights of the decision makers. To obtain a more accurate aggregated result quickly, a method of determining the OWDM with respect to attributes in interactive decision making is presented in this paper, which refines the objective weight of each decision maker down to objective weights on individual attributes. A definition of consensus degree and a flow of interactive decision making based on the objective weights of decision makers on attributes are then proposed.

Donghua Pan, Yong Zhang
Machining Parameter Optimal Selection for Blades of Aviation Engine Based on CBR and Database

Blades of aviation engines are usually composed of complex three-dimensional twisted surfaces that require high geometrical precision, so their machining is very difficult. Hence, reusing successful machining processes becomes an important and effective measure for improving blade machining quality. Machining parameter optimization for blades of aviation engines based on CBR and a database is discussed in this paper, and the system architecture and workflow are presented. The machining parameter database based on CBR consists of a case library and a machining database; both can run independently and can also be integrated through an application interface. Case representation includes two aspects, namely the problem and objective description and the solution scheme. Similarity calculation is divided into local similarity and integral similarity. Through system development, it is shown to be feasible to realize optimal machining parameter selection based on CBR and a database.
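
A sketch of the local/integral similarity idea: each attribute contributes a local similarity, and the integral similarity is their weighted combination. The attribute names, weights and ranges below are hypothetical, not the paper's machining parameters.

```python
def local_similarity(a, b, value_range):
    """Local similarity of one numeric attribute, scaled by its value range."""
    return 1.0 - abs(a - b) / value_range

def integral_similarity(case, query, weights, ranges):
    """Weighted combination of local similarities over all attributes."""
    total_w = sum(weights.values())
    return sum(w * local_similarity(case[k], query[k], ranges[k])
               for k, w in weights.items()) / total_w

# hypothetical blade-machining attributes
ranges  = {"curvature": 10.0, "material_hardness": 60.0, "tolerance": 0.5}
weights = {"curvature": 0.5, "material_hardness": 0.3, "tolerance": 0.2}
case    = {"curvature": 6.2, "material_hardness": 42.0, "tolerance": 0.05}
query   = {"curvature": 5.8, "material_hardness": 45.0, "tolerance": 0.08}

print(round(integral_similarity(case, query, weights, ranges), 3))
```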

Yan Cao, Yu Bai, Hua Chen, Lina Yang
A Multi-regional CGE Model for China

With the development of China’s economy, regional diversity and interregional economic linkages have become more and more remarkable and are now two important factors in studying China’s national and regional economy. Based on the multi-regional input-output table for China, this paper develops a multi-regional CGE (MRCGE) model for China that is expected to provide a useful tool for the analysis of regional economies and regional policies. The model depicts regional diversity in scale and structure and interregional economic linkages, i.e. commodity flows, labor flows and capital flows. As an application, the paper simulates an increase in investment in the Northwestern region to reveal the important effects of regional differences and linkages.

Na Li, Minjun Shi, Fei Wang

Workshop on Risk Correlation Analysis and Risk Measurement

The Method Research of Membership Degree Transformation in Multi-indexes Fuzzy Decision-Making

The conversion of membership degrees is the key computation in fuzzy evaluation for multi-index fuzzy decision making. The usual method is questionable, however, because redundant data in the index membership degrees are also used to compute the object membership degrees, which is not useful for object classification. The new method is: based on entropy-based data mining, mine the knowledge about object classification hidden in every index, affirm the relation between object classification and index membership, eliminate the redundant data in index membership for object classification by defining a distinguishable weight, and extract valid values to compute object membership. This yields a new membership degree conversion method that is not affected by redundant data and can be used for fuzzy decision making with multiple indexes.
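
For orientation, here is the standard entropy-weight calculation that this line of work builds on: indexes whose membership degrees discriminate more between objects receive larger weights. The paper's specific distinguishable-weight definition and valid-value extraction are not reproduced here.

```python
import numpy as np

def entropy_weights(M):
    """Entropy weight method over an object x index membership matrix M:
    columns with lower entropy (more discrimination between objects) get
    larger weights."""
    P = M / (M.sum(axis=0, keepdims=True) + 1e-12)          # normalize each index column
    n = M.shape[0]
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(n)    # entropy per index, in [0, 1]
    d = 1.0 - E                                             # degree of diversification
    return d / d.sum()

# hypothetical membership degrees of 4 objects under 3 indexes
M = np.array([[0.9, 0.5, 0.2],
              [0.1, 0.5, 0.3],
              [0.8, 0.5, 0.3],
              [0.2, 0.5, 0.2]])
print(np.round(entropy_weights(M), 3))   # the uniform second index gets weight ~0
```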

Kaidi Liu, Jin Wang, Yanjun Pang, Jimei Hao
Study on Information Fusion Based Check Recognition System

Automatic check recognition techniques play an important role in financial systems, especially in risk management. This paper presents a novel check recognition system based on multi-cue information fusion theory. For Chinese bank checks, the amount can be independently determined from the legal amount, the courtesy amount, or the E13B code. The check recognition algorithm consists of four steps: preprocessing, check layout analysis, segmentation and recognition, and information fusion. For layout analysis, an adaptive template matching algorithm is presented to locate the target recognition regions on the check. A hidden Markov model is used to segment and recognize the legal amount, while the courtesy amount and E13B code are recognized by artificial neural network methods. Finally, D-S evidence theory is introduced to fuse the three recognition results for better recognition performance. Experimental results demonstrate that the system can robustly recognize checks and that the information-fusion-based algorithm improves the recognition rate by 5 to 10 percent.
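
Dempster's rule of combination, the core of the fusion step, can be sketched as follows for two mass functions over a shared frame of discernment; the recognizers' actual outputs and frames below are assumptions, not the paper's data.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping frozensets of
    hypotheses to masses) with Dempster's rule, normalizing out the conflict."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# toy example: two recognizers giving masses over candidate digits "3" and "8"
m_legal    = {frozenset({"3"}): 0.7, frozenset({"3", "8"}): 0.3}
m_courtesy = {frozenset({"3"}): 0.6, frozenset({"8"}): 0.2, frozenset({"3", "8"}): 0.2}
print(dempster_combine(m_legal, m_courtesy))
```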

Dong Wang
Crisis Early-Warning Model Based on Exponential Smoothing Forecasting and Pattern Recognition and Its Application to Beijing 2008 Olympic Games

A large number of methods, such as discriminant analysis, logic analysis and the recursive partitioning algorithm, have been used in the past for business failure prediction. Although some of these methods lead to models with a satisfactory ability to discriminate between healthy and bankrupt firms, they suffer from limitations, often because they only raise an alarm but cannot forecast. This is why we have undertaken research aimed at weakening these limitations. In this paper, we propose an Exponential Smoothing Forecasting and Pattern Recognition (ESFPR) approach and illustrate how it can be applied to business failure prediction modeling. The results are very encouraging and prove the usefulness of the proposed method for bankruptcy prediction. The approach discovers relevant subsets of financial characteristics and represents in these terms all important relationships between the image of a firm and its risk of failure.
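
For reference, a simple exponential smoothing forecaster, one ingredient named in the ESFPR approach; the pattern-recognition part and the actual financial ratios are not reproduced here, and the smoothing constant and sample values are illustrative.

```python
def exp_smooth_forecast(series, alpha=0.3):
    """Simple exponential smoothing: return the smoothed level after the last
    observation, which serves as the one-step-ahead forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# hypothetical quarterly values of one financial ratio
ratio_history = [0.42, 0.40, 0.37, 0.33, 0.30]
print(round(exp_smooth_forecast(ratio_history), 3))   # forecast for the next quarter
```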

Baojun Tang, Wanhua Qiu
Measuring Interdependency among Industrial Chains with Financial Data

Industrial chains exhibit strong interdependency within a large-scale resource-based enterprise group. When analyzing the interdependency effect in industrial chains, the interdependency of financial indexes is often ignored. In this paper, we focus on measuring the long-term interdependency effect by historical simulation and cointegration tests with financial data. A large-scale coal-mining group is studied empirically as a case to explain the framework of interdependency analysis among industrial chains. The results show a high degree of interdependency in production costs and marketing costs, and a low degree in revenues and profits.

Jingchun Sun, Ye Fang, Jing Luo
Multi-objective Economic Early Warning and Economic Risk Measure

The leading indicators approach is the most prominent and widely used method for economic early warning. However, it focuses only on the analysis of a single target, while there is more to any economy than a single overarching business target. In this paper, the multi-dimensional business climate index approach is introduced to carry out multi-objective economic early warning and to measure economic risk. First, a multi-dimensional coincident index is constructed directly in a unified framework based on the FHLR method. Second, a vector space approach and probability analysis of multi-dimensional impact points are introduced to provide multi-objective early warning analysis and measure economic risk. The approach is then applied to the Chinese two-objective economic system. The empirical results show that the multi-dimensional climate index approach can provide a coherent and evolving outlook for multi-objective early warning and build a consistent track record of predictions.

Guihuan Zheng, Jue Wang
An Analysis on Financial Crisis Prediction of Listed Companies in China’s Manufacturing Industries Based on Logistic Regression and Bayes

In this paper, listed companies in China’s manufacturing industries are taken as the research objects, and the financial indicators of ST companies in the 1 to 5 years before the occurrence of their financial crises are collected. Non-ST companies are selected as control samples, and the data are empirically analyzed by logistic regression; a logistic regression model is established to forecast financial crises, and its prediction capacity 1 to 5 years ahead of their occurrence is summarized. Based on the established model and Bayes’ theorem, the financial crisis probabilities of listed companies in China’s manufacturing industries in the following years are then amended.
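
The Bayesian amendment step can be sketched as a standard posterior update; the sensitivity, specificity and prior below are illustrative numbers, not the paper's estimates.

```python
def bayes_amend(prior, sensitivity, specificity):
    """Posterior probability of financial crisis given that the logistic model
    flags the company, via Bayes' theorem."""
    flagged_given_crisis = sensitivity
    flagged_given_healthy = 1.0 - specificity
    evidence = prior * flagged_given_crisis + (1.0 - prior) * flagged_given_healthy
    return prior * flagged_given_crisis / evidence

# illustrative values: 10% base rate, model sensitivity 0.85, specificity 0.90
print(round(bayes_amend(prior=0.10, sensitivity=0.85, specificity=0.90), 3))  # ~0.486
```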

Wenhua Yu, Hao Gong, Yuanfu Li, Yan Yue
Comparative Analysis of VaR Estimation of Double Long-Memory GARCH Models: Empirical Analysis of China’s Stock Market

GARCH models are widely used to model the volatility of financial assets and to measure VaR. Based on the long memory, leptokurtosis and fat tails of stock market return series, we compare the ability of double long-memory GARCH models with a skewed Student-t distribution to compute VaR, through an empirical analysis of the Shanghai Composite Index (SHCI) and the Shenzhen Component Index (SZCI). The results show that the ARFIMA-HYGARCH model performs better than the others, and that at VaR levels less than or equal to 2.5 percent the double long-memory GARCH models evaluate in-sample VaR more accurately for long positions than for short positions, while the opposite conclusion holds for out-of-sample VaR forecasts.
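
The in-sample and out-of-sample evaluation referred to above amounts to counting VaR violations; here is a minimal sketch of the exceedance count and the Kupiec unconditional-coverage likelihood ratio, a generic backtest rather than the paper's exact test battery, on toy numbers.

```python
import math

def kupiec_test(returns, var_series, p=0.025):
    """Count VaR violations for a long position (loss worse than -VaR) and
    return the Kupiec unconditional-coverage LR statistic."""
    n = len(returns)
    x = sum(1 for r, v in zip(returns, var_series) if r < -v)   # violations
    pi = x / n
    if x in (0, n):
        return x, float("nan")
    lr = -2 * (x * math.log(p) + (n - x) * math.log(1 - p)
               - x * math.log(pi) - (n - x) * math.log(1 - pi))
    return x, lr  # compare lr with the chi-square(1) critical value, e.g. 3.84 at 5%

# toy example with a constant 2.5% VaR of a 3% daily loss
rets = [0.01, -0.02, -0.035, 0.005, -0.01, -0.04, 0.02, 0.0, -0.015, 0.01]
print(kupiec_test(rets, [0.03] * len(rets), p=0.025))
```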

Guangxi Cao, Jianping Guo, Lin Xu
Estimation of Value-at-Risk for Energy Commodities via CAViaR Model

This paper uses the Conditional Autoregressive Value at Risk (CAViaR) model proposed by Engle and Manganelli (2004) to evaluate the value-at-risk of daily spot prices of Brent crude oil and West Texas Intermediate crude oil covering the period May 21, 1987 to November 18, 2008. The accuracy of the estimates of the CAViaR model, Normal-GARCH, and GED-GARCH is then compared. The results show that all the methods do a good job at the low confidence level (95%): GED-GARCH is the best for the spot WTI price, and Normal-GARCH and Adaptive-CAViaR are the best for the spot Brent price. However, at the high confidence level (99%), Normal-GARCH does a good job for spot WTI, while GED-GARCH and the four CAViaR specifications do well for the spot Brent price; Normal-GARCH does badly for the spot Brent price. The results suggest that CAViaR performs as well as GED-GARCH, since CAViaR directly models the quantile autoregression, but it does not outperform GED-GARCH, although it does outperform Normal-GARCH.

Zhao Xiliang, Zhu Xi
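As background rather than a result of this paper, the symmetric absolute value specification from Engle and Manganelli (2004) models the α-quantile f_t of returns directly as an autoregression,

```latex
f_t(\beta) \;=\; \beta_1 + \beta_2\, f_{t-1}(\beta) + \beta_3\, \lvert y_{t-1} \rvert ,
```

with the parameters estimated by minimizing the quantile (pinball) loss \sum_t \bigl(\alpha - \mathbf{1}\{y_t < f_t\}\bigr)\bigl(y_t - f_t\bigr).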
An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results validate the explanation of how informal lending differs from commercial loans: the factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit-scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate about what this may imply for the roles of the formal and informal financial sectors.

Wei Lu, Xiaobo Yu, Juan Du, Feng Ji
Empirical Study of Relations between Stock Returns and Exchange Rate Fluctuations in China

Existing theories suggest that stock prices and exchange rates interact. However, empirical results do not always support these theories. This paper uses quantile regression techniques to check whether the theories hold in Chinese markets. We first eliminate the impact of calendar effects and time trends on stock market returns and exchange rate fluctuations using the method of Gallant, Rossi, and Tauchen (1992) combined with stepwise regression, and then perform quantile regression analysis on the adjusted data. The empirical results are summarized as follows: the influence of exchange rate fluctuations on stock returns is negative at most quantiles of stock returns, while the reverse influence is not significant at most quantiles of exchange rate fluctuations. Some explanations of these results are given.

Jian-bao Chen, Deng-ling Wang, Ting-ting Cheng
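The paper's data and variable definitions are not available here; the sketch below only illustrates the kind of quantile regression it describes, using statsmodels on synthetic stand-ins for adjusted stock returns (`ret`) and adjusted exchange-rate fluctuations (`fx`).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic, pre-adjusted data (calendar effects and trends assumed removed).
rng = np.random.default_rng(0)
fx = rng.normal(size=200)
ret = -0.3 * fx + rng.normal(size=200)
df = pd.DataFrame({"ret": ret, "fx": fx})

# Estimate the effect of exchange-rate fluctuations at several return quantiles.
for q in (0.1, 0.5, 0.9):
    res = smf.quantreg("ret ~ fx", df).fit(q=q)
    print(q, res.params["fx"])
```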
Cost Risk Tolerance Area of Material Supply in Biomass Power Generation

This research aims to shed light on the interaction mechanism of cost risks for biomass material supply in power generation. First, a game model is established to analyze the interactions among factors including unit procurement cost, unit transportation cost, basic price, and coal price for crop residue collection. Second, a Monte Carlo simulation based on two profit ratios compares the profit increase under material competition with that under a price alliance. Third, the risk tolerance area approach is applied to map the cost risk. The Shi Liquan Biomass Power Plant in Shandong Province, China, is analyzed as a case to show that, under the proposed methodology, a price alliance benefits both the allied enterprises and the local collection area.

Sun Jingchun, Chen Jianhua, Fang Ye, Hou Junhu
The Effect of Subjective Risk Attitudes and Overconfidence on Risk Taking Behaviors: A Experimental Study Based on Traders of the Chinese Stock Market

Our research analyzes the effect of traders' subjective risk attitudes, optimism, and overconfidence on their risk-taking behavior in the Chinese stock market using an experimental approach. We find that investors' risk-taking behavior is significantly affected by their subjective risk attitude, optimism, and overconfidence. Our results also indicate that the objective return and volatility of a stock are not as good predictors of risk-taking behavior as subjective risk and return measures. Moreover, we show that overconfidence and optimism have a significant impact on risk-taking behavior, in line with theoretical models.

Qi-an Chen, Yinghong Xiao, Hui Chen, Liang Chen
Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.

Jigang Xie, Wenyun Song
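The abstract gives no formulas; for orientation only, the generic maximum entropy problem with moment constraints, of the kind the ME method builds on, is

```latex
\max_{p}\; -\sum_{i} p_i \ln p_i
\quad \text{s.t.}\quad \sum_i p_i = 1,\qquad \sum_i p_i\, g_k(x_i) = \mu_k,\; k=1,\dots,K,
```

whose solution has the exponential form p_i ∝ exp(Σ_k λ_k g_k(x_i)), with the Lagrange multipliers λ_k chosen so that the distribution matches the observed pre-acquisition moments μ_k.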
Internal Control, CPA Recognition and Performance Consequence: Evidence from Chinese Real Estate Enterprises

In recent years, internal control has attracted growing attention worldwide. However, whether internal control can improve business efficiency still lacks empirical support. Based on a sample of 146 Chinese real estate enterprises, this study analyses the degree to which CPAs recognize firms' implementation of internal control, and the performance consequences. The evidence suggests that CPAs are able to evaluate firms' internal control implementation accurately, and that the better internal control is implemented, the better the enterprise performs.

Chuan Zhang, Lili Zhang, Yi Geng
The Influence of IPO to the Operational Risk of Chinese Commercial Banks

This paper analyzes the effect of initial public offerings (IPOs) on operational risk. We divide banks into listed and non-listed groups, simulate annual operational losses with the bootstrap method, and compare the annual unexpected loss distributions using VaR, expected shortfall, and annual operational economic capital. The results show that the annual operational capital of listed banks is considerably lower than that of non-listed banks, suggesting that an IPO may spur banks to improve operational risk management.

Lijun Gao, Jianping Li
The Measurement and Analysis Risk Factors Dependence Correlation in Software Project

The complexity of the software process gives rise to various fuzzy correlations among process management risk factors, such as dependence correlations among software risk factors. It is difficult to analyze risk data directly with mathematical tools because the data are uncertain and rough. Based on rough set theory and the data in a risk management library, a risk factors dependence correlation analysis system (RFDCAS) is established, and a dependence coefficient and its calculation formula based on equivalence classes are proposed. The RFDCAS reveals the dependence correlations among risk factors, which contributes to risk management and can help discover problems in software process improvement.

Ding JianJie, Hou Hong, Hao KeGang, Guo XiaoQun
Assessment of Disaster Emergency Management Ability Based on the Interval-Valued Fuzzy TOPSIS Method


Jing Kun-peng, Song Zhi-jie
Dynamic Project Risk Analysis and Management Based on Influence Diagrams

This paper presents a real-time, process-oriented project risk analysis and management model that can be combined with general project management, based on a hierarchical risk influence diagram constructed from network planning. Through network planning, the model supports the dynamic and comprehensive identification of risk elements, the representation and analysis of risk transfer along the work procedure, and timely, dynamic risk prevention and control. The influence diagram effectively represents risk combination and transfer in temporal and logical order, supports analysis of the sensitivity and control value of risk elements, and facilitates communication among experts, managers, and the owner. The hierarchical risk influence diagram therefore makes risk management decisions timelier and more accurate. The problems of risk description, diagram construction, and risk evaluation are well resolved when the general influence diagram is applied to dynamic project risk analysis, and good results are obtained in the example.

Xiaohua Liu, Chaoyuan Yue
Risk Prediction and Measurement for Software Based on Service Oriented Architecture

The paper presents a risk prediction model for software based on service oriented architecture and discusses risk measurement based on the presented model. The risk prediction model, ECORPM, is derived from EXPERT COCOMO, providing a risk prediction baseline for software using service oriented architecture.

Jun Liu, Jianzhong Qiao, Shukuan Lin
Risk Measurement and Control of Water Inrush into Qiyue Mountain Tunnel

The Qiyue Mountain tunnel lies on the Hurongxi highway in Hubei Province, where soluble rocks are intensely karstified. To predict and prevent water inrush into the tunnel during construction, risk measurement is needed to guide comprehensive advance geological prediction and prevention measures. A fuzzy mathematics model is proposed to identify potentially dangerous sections along the tunnel. Seven main controls of rock karstification are selected as risk-rank indicators, and the fuzzy model divides the tunnel into four ranks of dangerous sections according to the risk index. The approach is applied to the Qiyue Mountain tunnel, and practice confirms that the measurement and the resulting measures are useful.

Ge Yan-hui, Ye Zhi-hua, Li Shu-cai, Lu Wei, Zhang Qing-song
Operational Risk Measurement of Chinese Commercial Banks Based on Extreme Value Theory

Financial institutions and supervisory bodies agree on the need to strengthen the measurement and management of operational risk. This paper builds a model of operational risk losses based on the Peak Over Threshold model, emphasizing a weighted least squares refinement of Hill's estimation method, discussing the small-sample situation, and fixing the sample threshold more objectively, using media-published data on primary banks' operational risk losses from 1994 to 2007.

Jiashan Song, Yong Li, Feng Ji, Cheng Peng
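The paper's weighted least squares refinement is not reproduced in the abstract; as a baseline only, a minimal sketch of the classical Hill estimator of the tail index over the k largest operational losses is shown below.

```python
import numpy as np

def hill_tail_index(losses, k):
    """Classical Hill estimator of the tail index gamma using the k largest
    observations (the paper refines this with weighted least squares; this
    is only the textbook baseline)."""
    x = np.sort(np.asarray(losses, dtype=float))[::-1]   # descending order
    return np.mean(np.log(x[:k])) - np.log(x[k])

# Hypothetical heavy-tailed loss sample with true tail index 1/2.
rng = np.random.default_rng(1)
losses = rng.pareto(a=2.0, size=5000) + 1.0
print(hill_tail_index(losses, k=200))   # roughly 0.5
```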
A Multi-criteria Risk Optimization Model for Trustworthy Software Process Management

In this paper, a trustworthiness measurement model is defined on the basis of a process assumption about risk transfer. Based on methods for calculating risk management effectiveness, an integrated trustworthy software process risk optimization model is proposed to give an optimized risk management scheme for trustworthy software under process cost and schedule constraints. Simulation cases are then analyzed within this model framework, and the experimental results are discussed.

Jianping Li, Minglu Li, Hao Song, Dengsheng Wu
Country Risk Volatility Spillovers of Emerging Oil Economies: An Application to Russia and Kazakhstan

Emerging oil economies (EOEs) in geographical proximity are usually affected by common risk factors, which may make their country risks closely related. This paper focuses on the interaction of country risk between EOEs by investigating the volatility spillovers of country risk. Taking Russia and Kazakhstan as examples, a multivariate conditional volatility model is used to capture the dynamic spillovers of country risk. Empirical results show significant bidirectional spillover effects with asymmetric volatility between Russia and Kazakhstan.

Xiaolei Sun, Wan He, Jianping Li
Modeling the Key Risk Factors to Project Success: A SEM Correlation Analysis

Researchers have put forward a number of project risk factors at different levels or dimensions. However, most studies focus on assessing a single factor's effect on project performance, neglecting the relationships between factors and the impact those relationships have on project outcomes. Taking domestic construction projects as an example and using a structural equation model, this study analyses the correlations between risk factors and their effects on project success. The study intends to help project managers better understand the risk characteristics of domestic projects and to provide practical recommendations for effective project management.

Juan Song, Jianping Li, Dengsheng Wu
Research on R&D Project Risk Management Model

An R&D project is an exploratory, high-risk investment activity with potential management flexibility. In the R&D project risk management process, risk is hard to quantify when very little past information is available. This paper introduces quality function deployment and real options into the traditional project risk management process. Through a waterfall decomposition mode, the R&D project risk management process is constructed step by step; through real options, the managerial flexibility inherent in R&D projects can be modeled. In the paper, a risk priority list is first obtained from the relation matrix between R&D project success factors and risk indexes; then the risk features of the various stages are analyzed; finally, real options are embedded into the various stages of the R&D project according to those risk features. To manage R&D risk effectively in a dynamic cycle, these steps should be carried out repeatedly.

Xiaoyan Gu, Chen Cai, Hao Song, Juan Song
Software Risks Correlation Analysis Using Meta-analysis

Software risk identification is the first and most critical activity in the software risk management process. The checklist, as a low-cost and easy-to-operate method, struggles to give a comprehensive view of all kinds of risks because of cultural diversity and location restrictions. After deriving a synthetic risk list by examining a series of checklists, we formed a clear and integral picture of the frequency and importance of the risks with the help of a meta-analysis framework. The top ten risks are listed and compared with Boehm's, and SPSS is used to analyze the correlations between different categories and major risks.

Hao Song, Chen Cai, Minglu Li, Dengsheng Wu
A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment

The least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has proved to be a useful tool for pattern recognition, with excellent generalization performance and low computational cost. In this paper, we propose a new method, the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) with the linear programming form of the LS-SVM. With this method, sparseness and robustness are obtained when handling high-dimensional and large-scale databases. A U.S. commercial credit card database is used to test the efficiency of our method, and the results are satisfactory.

Jingli Liu, Jianping Li, Weixuan Xu, Yong Shi
Credit Risk Evaluation Using a C-Variable Least Squares Support Vector Classification Model

Credit risk evaluation is one of the most important issues in financial risk management. In this paper, a C-variable least squares support vector classification (C-VLSSVC) model is proposed for credit risk analysis. The main idea of this model is based on the prior knowledge that different classes may have different importance for modeling, so more weight should be given to the more important classes. The C-VLSSVC model can be constructed by a simple modification of the regularization parameter in LSSVC, whereby more weight is given to the least squares classification errors of important classes than to those of unimportant classes, while keeping the regularized terms in their original form. For illustration purposes, a real-world credit dataset is used to test the effectiveness of the C-VLSSVC model.

Lean Yu, Shouyang Wang, K. K. Lai
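The exact C-VLSSVC formulation is given in the paper; as a generic sketch of the kind of class-weighted least squares SVM objective it describes, where the regularization weight depends on each sample's class, one can write

```latex
\min_{w,b,e}\; \tfrac{1}{2}\lVert w\rVert^2 + \tfrac{1}{2}\sum_{i=1}^{n} C_{y_i}\, e_i^2
\quad\text{s.t.}\quad y_i\bigl(w^{\top}\phi(x_i)+b\bigr) = 1 - e_i,\quad i=1,\dots,n,
```

with C_{+1} ≠ C_{-1}, so that the least squares errors of the more important class are penalized more heavily while the regularization term keeps its original form.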
Ecological Risk Assessment with MCDM of Some Invasive Alien Plants in China

Alien plant invasion is an urgent global issue that threatens ecosystem health and sustainable development. The study of its ecological risk assessment (ERA) can help to prevent and reduce invasion risk more effectively. Based on ERA theory and the analytic hierarchy process (AHP) of multi-criteria decision-making (MCDM), and through analyses of the characteristics and processes of alien plant invasion, this paper discusses methodologies for the ERA of alien plant invasion. The assessment procedure consists of risk source analysis, receptor analysis, exposure and hazard assessment, integral assessment, and risk management countermeasures. An indicator system for risk source assessment, together with the indices and formulas used to measure ecological loss and risk, is established, and a method for comprehensively assessing the ecological risk of alien plant invasion is worked out. The ecological risk analysis of nine representative invasive alien plants in China shows that the ecological risk of Erigeron annuus, Ageratum conyzoides, Alternanthera philoxeroides, and Mikania micrantha is high (grades 1-2), that of Oxalis corymbosa and Wedelia chinensis comes next (grade 3), while that of Mirabilis jalapa, Pilea microphylla, and Calendula officinalis is lowest (grade 4). Risk strategies are put forward on this basis.

Guowen Xie, Weiguang Chen, Meizhen Lin, Yanling Zheng, Peiguo Guo, Yisheng Zheng
Empirically-Based Crop Insurance for China: A Pilot Study in the Down-middle Yangtze River Area of China

The factors that caused slow growth in crop insurance participation and its ultimate failure in China were multi-faceted, including high agricultural production risk, low participation rates, inadequate public awareness, high loss ratios, and insufficient and interrupted government financial support. A clear and present need therefore exists in China for data-driven analyses and empirically based risk management. In the present investigation, agricultural production data for two crops (corn, rice) in five counties in Jiangxi and Hunan provinces were used to design a pilot crop insurance program for China. The program (1) provides 75% coverage, (2) offers a 55% premium rate reduction for the farmer compared with the catastrophic coverage most recently offered, and (3) uses the currently approved governmental premium subsidy level. A safety net for Chinese farmers that helps maintain agricultural production at a self-sufficient level and costs less than half of the current plans thus requires only one change to the program: at least 80% of producers must participate in an area.

Erda Wang, Yang Yu, Bertis B. Little, Zhongxin Chen, Jianqiang Ren
A Response Analysis of Economic Growth to Environmental Risk: A Case Study of Qingdao

Economic growth is one source of environmental pollution risk. Taking Qingdao as a sample and using principal component analysis, Granger causality tests, and impulse response functions, this paper aims to identify the relationship between economic growth and environmental pollution. The results show that economic growth is a Granger cause of environmental risk, with effects subject to time lags; the influence is progressive, gradual, and long-standing.

Yunpeng Qin, Jianyue Ji, Xiaoli Yu
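The paper's Qingdao data are not available here; the sketch below, on synthetic series, only shows how a Granger causality test of the kind mentioned can be run with statsmodels (column order: the series being explained first, the candidate cause second).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 200
growth = rng.normal(size=n)        # stand-in for an economic-growth index
pollution = np.zeros(n)            # stand-in for an environmental-risk index
for t in range(2, n):
    # pollution responds to growth with a lag, plus noise
    pollution[t] = 0.5 * pollution[t - 1] + 0.4 * growth[t - 2] + rng.normal()

data = pd.DataFrame({"pollution": pollution, "growth": growth})
# Tests H0: 'growth' does NOT Granger-cause 'pollution', for lags 1..4.
results = grangercausalitytests(data[["pollution", "growth"]], maxlag=4)
```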

Workshop on Optimization-Based Data Mining Method and Applications

A Multiple Criteria and Multiple Constraints Mathematical Programming Model for Classification

Mathematical programming has been widely used in data classification. A general strategy for building classifiers is to optimize a single global objective function such as the square-loss function. However, in many real-life situations, optimizing only one objective function can hardly achieve a satisfactory classifier. A series of models based on multiple criteria mathematical programming (MCMP) have therefore been proposed recently, such as the multiple criteria linear programming (MCLP) model and the linear discriminant analysis (LDA) model. In this paper, we argue that, due to the inherent complexity of real-world data, multiple criteria mathematical programming may also be inadequate for identifying a genuine classification boundary. Based on this observation, we present a multiple criteria, multiple constraints mathematical programming (MC²MP) model for classification. More specifically, we extend a recent multiple criteria programming model, the MEMBV model, into a multiple constraints MEMBV model.

Peng Zhang, Yingjie Tian, Dongling Zhang, Xingquan Zhu, Yong Shi
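For readers unfamiliar with this family of models, a common statement of the basic two-class MCLP classifier, which the MC²MP model generalizes (the notation here is generic, not copied from the paper), is

```latex
\min \sum_{i} \alpha_i, \quad \max \sum_{i} \beta_i
\quad\text{s.t.}\quad
x_i^{\top} w = b + \alpha_i - \beta_i \;\; (x_i \in \text{Bad}),\qquad
x_i^{\top} w = b - \alpha_i + \beta_i \;\; (x_i \in \text{Good}),\qquad
\alpha_i,\beta_i \ge 0,
```

where α_i measures the overlap of misclassified points with the boundary b, β_i the distance of correctly classified points from it, and the two criteria are typically combined through a compromise solution.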
New Unsupervised Support Vector Machines

Support Vector Machines (SVMs) have been dominant learning techniques for more than ten years and have mostly been applied to supervised learning problems. Recently, good results have been obtained by two-class unsupervised classification algorithms in which the optimization problems based on Bounded C-SVMs, Bounded ν-SVMs, and Lagrangian SVMs, respectively, are relaxed to semi-definite programming. In this paper we propose another approach to the unsupervised classification problem, which directly relaxes a modified version of the primal problem of SVMs with label variables to a semi-definite program. Preliminary numerical results show that our new algorithm often obtains more accurate results than other unsupervised classification methods, although the relaxation has no tight bound, as shown by an example in which the approximation ratio of the optimal values can be arbitrarily large.

Kun Zhao, Ying-jie Tian, Nai-yang Deng
Data Mining for Customer Segmentation in Personal Financial Market

Personal financial market segmentation plays an important role in retail banking. It is widely acknowledged that conventional, knowledge-based approaches to customer segmentation have many limitations and often yield biased results. In contrast, data mining can deal with massive amounts of data without missing useful knowledge. Given the huge volume of unlabeled transaction data, this paper proposes a clustering ensemble method based on a majority voting mechanism, with two alternative procedures, to further enhance the performance of customer segmentation in real banking business. Experiments and examinations in a real business environment show that our model reflects the true characteristics of various types of customers and can be used to discover customers' investment preferences.

Guoxun Wang, Fang Li, Peng Zhang, Yingjie Tian, Yong Shi
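The paper's exact voting scheme and banking data are not described in the abstract; the sketch below is only one plausible reading of a clustering ensemble with majority voting: several k-means runs whose labels are aligned to a reference partition (Hungarian matching) and then combined by per-sample majority vote. All parameters and data are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.stats import mode
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

def align_labels(reference, labels, k):
    """Relabel `labels` so its clusters best match `reference` (Hungarian method)."""
    cost = np.zeros((k, k))
    for a in range(k):
        for b in range(k):
            cost[a, b] = -np.sum((labels == a) & (reference == b))
    rows, cols = linear_sum_assignment(cost)
    mapping = dict(zip(rows, cols))
    return np.array([mapping[l] for l in labels])

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)  # stand-in for customer features
k, runs = 4, 10
base = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
aligned = [base]
for seed in range(1, runs):
    lab = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    aligned.append(align_labels(base, lab, k))

# Majority vote across the aligned runs gives the consensus segmentation.
consensus = mode(np.vstack(aligned), axis=0, keepdims=False).mode
print(np.bincount(consensus))
```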
Nonlinear Knowledge in Kernel-Based Multiple Criteria Programming Classifier

The kernel-based multiple criteria linear programming (KMCLP) model is used as a classification method that learns from training examples, whereas data sets can also be classified using prior knowledge alone. Some works combine these two classification principles to overcome the drawbacks of each approach. In this paper, we propose a model that incorporates nonlinear knowledge into KMCLP, in order to handle problems whose input consists not only of training examples but also of nonlinear prior knowledge. On a real-world breast cancer diagnosis case, the model performs better than a model based solely on training data.

Dongling Zhang, Yingjie Tian, Yong Shi
A Note on the 1-9 Scale and Index Scale In AHP

This paper demonstrates that the 1-9 scale and the index scale used in the Analytic Hierarchy Process (AHP) can both be derived from the same scale-selection criteria and are both consistent with Saaty's basic ideas about scale selection in the AHP. These two kinds of ratio scales result from two different ways of applying the Weber-Fechner psychophysical law. We believe that the index scale is theoretically preferable to the 1-9 scale and hope that its wider use will be encouraged in the future; nevertheless, given the rich application experience with the existing 1-9 scale, practitioners may continue to use it in practice.

Zhiyong Zhang, Xinbao Liu, Shanlin Yang
Linear Multi-class Classification Support Vector Machine

Support Vector Machines (SVMs) for classification have been shown to be promising classification tools for many real-world problems. How to effectively extend binary SVC to multi-class classification is still an open research issue. In this article, instead of solving the quadratic programming (QP) problem of the algorithm in [1], we use a linear objective function, leading to a linear programming (LP) problem and thus to a new algorithm for the multi-class problem, named the linear multi-class classification support vector machine. Numerical experiments on artificial and benchmark data sets show that the proposed method is comparable to the algorithm in [1] in error rates and equally robust, while roughly ten times faster.

Yan Xu, Yuanhai Shao, Yingjie Tian, Naiyang Deng
A Novel MCQP Approach for Predicting the Distance Range between Interface Residues in Antibody-Antigen Complex

Antibody-antigen association plays a key role in the immune system, and the distance range between interface residues in a protein complex is one of the interface features. Three machine learning approaches, Multiple Criteria Quadratic Programming (MCQP), Linear Discriminant Analysis (LDA) (SPSS 2004), and the decision-tree-based See5 (Quinlan 2003), are used to predict the distance range between interface residues in antigen-antibody complexes. We explore how different surface patch sizes affect the prediction accuracy for different distance ranges, and the results of the three approaches are compared with each other.

Yong Shi, Ruoying Chen, Jia Wan, Xinyang Zhang
Robust Unsupervised Lagrangian Support Vector Machines for Supply Chain Management

Support Vector Machines (SVMs) have been dominant learning techniques for more than ten years and have mostly been applied to supervised learning problems. In recent years, two-class unsupervised and semi-supervised classification algorithms based on Bounded C-SVMs, Bounded ν-SVMs, Lagrangian SVMs (LSVMs), and a robust version of Bounded C-SVMs, all relaxed to semi-definite programming (SDP), have obtained good classification results. However, the method based on the robust version of Bounded C-SVMs is too time-consuming, so a faster method with at least comparable accuracy is needed. We therefore propose robust versions of unsupervised and semi-supervised classification algorithms based on Lagrangian Support Vector Machines and apply them to the evaluation of supply chain management performance. Numerical results confirm the robustness of the proposed method and show that our new unsupervised and semi-supervised classification algorithms based on LSVMs often obtain almost the same accuracy as the other algorithms while being considerably faster.

Kun Zhao, Yong-sheng Liu, Nai-yang Deng
A Dynamic Constraint Programming Approach

Constraint Programming (CP) is a powerful paradigm for solving combinatorial problems, generally arising from decision making. In CP, enumeration strategies are crucial for resolution performance. In previous work, we proposed dynamically replacing strategies that show bad performance and using meta-backtracking to restore better states when bad decisions have been made. In this work, we design and evaluate strategies to improve the resolution performance on a set of problems.

Eric Monfroy, Carlos Castro, Broderick Crawford
The Evaluation of the Universities’ Science and Technology Comprehensive Strength Based on Management Efficiency

This paper proposes evaluation methods for universities' science and technology comprehensive strength based on management efficiency. The methods can eliminate the effects of initial advantages and disadvantages among the objects being evaluated and truly reflect improvements in comprehensive strength that are due to subjective effort, making the evaluation fair and objective. Because the benchmark indexes and current indexes of universities' science and technology comprehensive strength change dynamically, management efficiency methods keep universities under pressure, which helps university managers find gaps and improve their management. The results show that management efficiency methods provide an incentive for all universities and have broad application prospects in performance evaluation.

Baiqing Sun, Yange Li, Lin Zhang

Topics in Risk Analysis with Multiple Criteria Decision Making

MCDM and SSM in the Public Crisis Management: From the Systemic Point of View

From the perspective of systems science, this paper analyzes the system mechanism of public crisis formation and argues that the essence of a public crisis is a variety of bifurcations of the social system. Accordingly, the main characteristic of a public crisis is an ill-structured problem situation. In public crisis management, MCDM, as a hard-systems approach, is strong in coping with managerial problems that have clear objectives and a well-defined structure. When ill-structured problem situations are encountered, SSM can serve as a supplementary methodology to MCDM.

Yinyin Kuang, Dongping Fan
The Diagnosis of Blocking Risks in Emergency Network

For a maximum flow f* in a network N, if an arc set B has the property that increasing the capacity of every arc in B increases the maximum flow value v* of the network while decreasing the total expense Z, then we call B a "blocking set" and a network containing a blocking set an "ill-conditioned network". In this paper we first characterize blocking sets and ill-conditioned networks, and then present the optimal investment decision for increasing capacity and removing the ill-conditioned state.

Xianglu Li, Wei Sun, Haibo Wang
How Retailer Power Influence Its Opportunism Governance Mechanisms in Marketing Channel?– An Empirical Investigation in China

This study investigates the relationships among power, norms, contracts, and opportunism; analyzes the effect of retailer power on the choice of norms and contracts; examines the influence of retailer power on opportunism; and discusses the moderating effect of communication between retailer power and governance mechanisms, as well as the moderating effect of monitoring between governance mechanisms and opportunism. We reach the following conclusions: (1) retailer power affects the supplier's choice of governance mechanisms; (2) the supplier's use of different types of governance mechanisms has a significant negative impact on the retailer's opportunistic behavior; (3) communication significantly decreases the frequency of the supplier's use of contracts but does not significantly decrease the frequency of the use of norms, and monitoring does not show a significant moderating effect between governance mechanisms and opportunistic behavior.

Yu Tian, Xuefang Liao
Applications in Oil-Spill Risk in Harbors and Coastal Areas Using Fuzzy Integrated Evaluation Model

Based on a statistical analysis of the causes and impacts of oil-spill accidents in Chinese ports and offshore areas, an index system for oil-spill risk possibility and an impact assessment index system were set up. Taking Tianjin Port as an example, the weight of each evaluation factor was determined by the analytic hierarchy process (AHP), and the membership functions of the evaluation factors were established using a lower semi-trapezoidal distribution. The possibility and impacts of oil spills were then evaluated with a multi-level fuzzy integrated evaluation model, and the overall oil-spill risk of Tianjin Port was determined with a two-dimensional risk matrix.

Chaofeng Shao, Yufen Zhang, Meiting Ju, Shengguang Zhang
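The paper's index systems and weights are not given in the abstract; the snippet below only sketches the fuzzy integrated (comprehensive) evaluation step it refers to, using the weighted-average operator and made-up numbers.

```python
import numpy as np

# Hypothetical AHP weights for three evaluation factors.
W = np.array([0.5, 0.3, 0.2])

# Hypothetical membership matrix R: each row gives one factor's membership
# degrees in the risk grades (low, medium, high), e.g. obtained from
# semi-trapezoidal membership functions.
R = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.4, 0.5],
])

# Fuzzy integrated evaluation with the weighted-average operator M(., +):
B = W @ R
B = B / B.sum()   # normalize so the grade memberships sum to 1
print(B)          # overall membership in (low, medium, high) risk
```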
Coexistence Possibility of Biomass Industries

This research aims to shed light on the mechanism of agricultural biomass material competition between the power generation and straw pulp industries and its impact on their coexistence. A two-stage game model is established to analyze factors including unit transportation cost and the firms' profit margins. The participants in the competition are a biomass supplier, a power plant, and a straw pulp plant. From an industrial economics perspective, our analysis shows that raw material competition leads to a low possibility of coexistence for the two industries when they rely on agricultural residues from a circular collection area.

Sun Jingchun, Hou Junhu
How Power Mechanism Influence Channel Bilateral Opportunism

Against the background of asymmetric power structures in marketing channels, this article discusses the relationship between the power-dominant member's use of power mechanisms and the opportunistic behavior of both the power-disadvantaged member and the power-dominant member itself, and tests whether distributive fairness perception and procedural fairness perception moderate this relationship. The results show that the power-dominant member's use of coercive power increases the opportunistic tendency of both sides, whereas its use of noncoercive power inhibits that tendency. Distributive fairness perception and procedural fairness perception negatively moderate the relationship between the power-dominant member's use of noncoercive power and the power-disadvantaged member's opportunism, and procedural fairness perception also negatively moderates the relationship between the power-dominant member's use of coercive power and the other side's opportunism.

Yu Tian, Shaodan Chen

Workshop on Applications of Decision Theory and Method to Financial Decision Making

Compromise Approach-Based Genetic Algorithm for Constrained Multiobjective Portfolio Selection Model

In this paper, fuzzy set theory is incorporated into a multiobjective portfolio selection model for investors, taking into account three criteria: return, risk, and liquidity. The cardinality constraint, the buy-in threshold constraint, and round-lot constraints are considered in the proposed model. To overcome the difficulty of evaluating a large set of efficient solutions and selecting the best one on the non-dominated surface, a compromise approach-based genetic algorithm is presented to obtain a compromise solution for the proposed constrained multiobjective portfolio selection model.

Jun Li
Financial Time Series Analysis in a Fuzzy View

In financial markets, fuzzy data are frequently observed and recorded in the form of intervals. In this paper, we construct a fuzzy financial time series to describe the dynamics of such uncertain observations, test its stationarity, and then propose a fuzzy auto-regression (FAR) model. We estimate the unknown parameters with the fuzzy linear programming (FLP) method and give a principle for evaluating model fitness using the sample average of closeness (SAC). An empirical analysis of the Shanghai Composite Index (SCI) illustrates the modeling of fuzzy financial time series, the evaluation of model fitness, and the forecasting performance.

Zhuyu Li, Taiji Wang, Cheng Zhang
Asset Allocation and Optimal Contract for Delegated Portfolio Management

This article studies the portfolio selection and contracting problems between an individual investor and a professional portfolio manager in a discrete-time principal-agent framework. Portfolio selection and optimal contracts are obtained in closed form. The optimal contract consists of a fixed fee, the cost, and a fraction of the excess expected return, and the optimal portfolio is similar to the classical two-fund separation theorem.

Jingjun Liu, Jianfeng Liang
The Heterogeneous Investment Horizon and Dynamic Strategies for Asset Allocation

This paper discusses the influence of the portfolio rebalancing strategy on the efficiency of long-term investment portfolios under the assumption of independent stationary distribution of returns. By comparing the efficient sets of the stochastic rebalancing strategy, the simple rebalancing strategy and the buy-and-hold strategy with specific data examples, we find that the stochastic rebalancing strategy is optimal, while the simple rebalancing strategy is of the lowest efficiency. In addition, the simple rebalancing strategy lowers the efficiency of the portfolio instead of improving it.

Heping Xiong, Yiheng Xu, Yi Xiao
Tracking Models for Optioned Portfolio Selection

In this paper we study a target tracking problem for the portfolio selection involving options. In particular, the portfolio in question contains a stock index and some European style options on the index. A refined tracking-error-variance methodology is adopted to formulate this problem as a multi-stage optimization model. We derive the optimal solutions based on stochastic programming and optimality conditions. Attention is paid to the structure of the optimal payoff function, which is shown to possess rich properties.

Jianfeng Liang
Size, Book-to-Market Ratio and Relativity of Accounting Information Value: Empirical Research on the Chinese Listed Company

Many recent studies examine the effect of factors such as size or book-to-market ratio on the fluctuation of accounting earnings, stock prices, or returns, but their effect on the relativity of accounting information value has scarcely been addressed. This paper analyses in detail the effect of these two factors on the relativity of accounting information value, taking the Shanghai and Shenzhen stock markets as samples. The analyses support the following two hypotheses: (1) the relativity of accounting information value of large firms is greater than that of small firms; (2) the relativity of accounting information value of low B/M-ratio firms is greater than that of high B/M-ratio firms.

Jing Yu, Siwei Cheng, Bin Xu

New Frontiers of Hybrid MCDM Techniques for Problems-Solving

Fuzzy MCDM Technique for Planning the Environment Watershed

In the real world, decision making problems are vague and uncertain in a number of ways, and most criteria are interdependent and interactive, so they cannot be evaluated by conventional measurement methods. To approximate the human subjective evaluation process, it is therefore more suitable to apply a fuzzy method to environment-watershed planning. This paper describes the design of a fuzzy decision support system using a multi-criteria analysis approach for selecting the best planning alternatives or strategies for an environment watershed. The Fuzzy Analytic Hierarchy Process (FAHP) method is used to determine the preference weightings of criteria from the decision makers' subjective perceptions, based on a questionnaire answered by three related groups comprising fifteen experts. Subjectivity and vagueness in the criteria and alternatives are handled in the selection process and simulation results by using fuzzy numbers with linguistic terms. Incorporating the decision makers' preference attitudes, the overall performance value of each alternative can be obtained based on the concept of Fuzzy Multiple Criteria Decision Making (FMCDM). An example consisting of five alternatives, drawn from environment-watershed planning work in Taiwan, is used to demonstrate the effectiveness and usefulness of the proposed approach.

Yi-Chun Chen, Hui-Pang Lien, Gwo-Hshiung Tzeng, Lung-Shih Yang, Leon Yen
Nonlinear Deterministic Frontier Model Using Genetic Programming

In economics, several parametric regression-based models have been proposed to measure the technical efficiency of decision making units (DMUs). However, the problem of misspecification restricts the use of these methods. In this paper, symbolic regression is employed to obtain an approximately optimal production function automatically using genetic programming (GP). Monte Carlo simulation is used to compare the performance of data envelopment analysis (DEA), deterministic frontier analysis (DFA), and GP-based DFA for three different production functions and sample sizes. The simulation results indicate that the proposed method outperforms the others for nonlinear production functions.

Chin-Yi Chen, Jih-Jeng Huang, Gwo-Hshiung Tzeng
A Revised VIKOR Model for Multiple Criteria Decision Making - The Perspective of Regret Theory

VIKOR is one of the multiple criteria decision making (MCDM) models for determining a preference ranking over a set of alternatives in the presence of conflicting criteria. The justification for VIKOR is to use the concept of compromise programming to determine the preference ranking from the individual and group regrets. However, VIKOR has a critical problem in deriving the preference ranking of alternatives, and this issue is discussed in the paper. Here, the perspective of regret theory is applied to explain the content of VIKOR, and a revised VIKOR model is proposed based on the concept of regret theory. Two examples are then given to justify the proposed model and to compare it with VIKOR and the regret model.

Jih-Jeng Huang, Gwo-Hshiung Tzeng, Hsiang-Hsi Liu
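For context, the standard VIKOR ranking indices (general background, not a result of this paper) are, with criterion weights w_j, ideal values f_j^* and worst values f_j^-,

```latex
S_i = \sum_{j} w_j \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}},\qquad
R_i = \max_{j}\, w_j \frac{f_j^{*} - f_{ij}}{f_j^{*} - f_j^{-}},\qquad
Q_i = v\,\frac{S_i - S^{*}}{S^{-} - S^{*}} + (1-v)\,\frac{R_i - R^{*}}{R^{-} - R^{*}},
```

where S_i and R_i are the group and individual regret measures and v weights the "majority rule" strategy; the revised model in the paper reinterprets these quantities through regret theory.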
A Novel Evaluation Model for the Vehicle Navigation Device Market Using Hybrid MCDM Techniques

A development strategy for vehicle navigation devices (NDs) is presented to initiate the product roadmap. Criteria for evaluation are constructed by reviewing papers, interviewing experts, and brainstorming. The ISM (interpretive structural modeling) method is used to construct the relationships between the criteria. Existing NDs are sampled to benchmark the gap between consumers' aspired/desired utilities and the utilities of existing or developing NDs, and the VIKOR method is applied to rank the sampled NDs. The paper identifies the key criteria driving the purchase of new NDs and compares the behavior of consumers with different characteristics. These conclusions can serve as a reference for ND producers in improving existing functions or planning further utilities for the next generation of NDs.

Chia-Li Lin, Meng-Shu Hsieh, Gwo-Hshiung Tzeng
A VIKOR Technique with Applications Based on DEMATEL and ANP

In multiple criteria decision making (MCDM) methods, the compromise ranking method (named VIKOR) was introduced as one applicable technique to implement within MCDM. It was developed for multicriteria optimization of complex systems. However, few papers discuss conflicting (competing) criteria with dependence and feedback in the compromise solution method. Therefore, this study proposes and provides applications for a novel model using the VIKOR technique based on DEMATEL and the ANP to solve the problem of conflicting criteria with dependence and feedback. In addition, this research also uses DEMATEL to normalize the unweighted supermatrix of the ANP to suit the real world. An example is also presented to illustrate the proposed method with applications thereof. The results show the proposed method is suitable and effective in real-world applications.

Yu-Ping Ou Yang, How-Ming Shieh, Gwo-Hshiung Tzeng
Identification of a Threshold Value for the DEMATEL Method: Using the Maximum Mean De-Entropy Algorithm

To deal with complex problems, structuring them through graphical representations and analyzing causal influences can help illuminate complex issues, systems, or concepts. The DEMATEL method is a methodology that can be used for researching and solving complicated and intertwined problem groups. The end product of the DEMATEL process is a visual representation, the impact-relations map, by which respondents organize their own actions in the world. The applicability of the DEMATEL method is widespread, ranging from analyzing world problematique decision making to industrial planning. Its most important property in the multi-criteria decision making (MCDM) field is the construction of interrelations between criteria. To obtain a suitable impact-relations map, an appropriate threshold value is needed so that adequate information is retained for further analysis and decision making. In this paper, we propose an entropy-based method, the maximum mean de-entropy algorithm, to achieve this purpose. Using real cases of finding the interrelationships between criteria for evaluating effects in e-learning programs as examples, we compare the results obtained from the respondents with those from our method and discuss the different impact-relations maps produced by the two approaches.

Li Chung-Wei, Tzeng Gwo-Hshiung
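The maximum mean de-entropy algorithm itself is defined in the paper; the snippet below only reconstructs the standard DEMATEL step that precedes it, computing the total relation matrix to whose entries the threshold is applied. The direct-influence matrix here is invented, and the mean-based threshold at the end is a naive placeholder, not the paper's method.

```python
import numpy as np

# Hypothetical direct-influence matrix A (expert scores, 0 = no influence).
A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Normalize by the largest row sum, then compute the total relation matrix
# T = D (I - D)^-1, as in standard DEMATEL.
D = A / A.sum(axis=1).max()
I = np.eye(A.shape[0])
T = D @ np.linalg.inv(I - D)

# A threshold on T's entries determines which arcs appear in the
# impact-relations map; the paper chooses it by maximum mean de-entropy.
print(T > T.mean())
```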
High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model

The emergence of the Internet has thoroughly changed high technology marketing channels over the past decade, and e-commerce has become one of the most efficient channels through which high technology firms can skip intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high technology products or services through the Internet is not easy. To overcome these problems, a novel analytic framework is proposed, based on the concept of expanding high technology customers' competence sets by leveraging high technology service firms' capabilities and resources, together with novel multiple criteria decision making (MCDM) techniques, in order to define an appropriate e-business model. An empirical study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided to verify the effectiveness of this framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customers' SIP transactions. In the future, the MCDM framework can be applied to defining novel business models elsewhere in the high technology industry.

Chi-Yo Huang, Gwo-Hshiung Tzeng, Wen-Rong Ho, Hsiu-Tyan Chuang, Yeou-Feng Lue
Airline Maintenance Manpower Optimization from the De Novo Perspective

Human resource management (HRM) is an important issue in today's competitive airline market. In this paper, we discuss a multi-objective model designed from the De Novo perspective to help airlines optimize their maintenance manpower portfolio. The effectiveness of the model and solution algorithm is demonstrated in an empirical study of the human resources needed for airline line maintenance. Both De Novo and traditional multiple objective programming (MOP) methods are analyzed, and a comparison of the results shows that the proposed model and solution algorithm provide better performance and an improved human resource portfolio.

James J. H. Liou, Gwo-Hshiung Tzeng
A Novel Hybrid MADM Based Competence Set Expansions of a SOC Design Service Firm

As the IC (integrated circuit) industry migrates to the System-on-Chip (SOC) era, a novel business model, the SOC design service (DS), is emerging. However, expanding a firm's innovation competences while satisfying multiple objectives, including highest quality, lowest cost, fastest time to market, and sufficient revenues for economies of scale, is always a problem for a design service firm. Expanding the innovation competences, and thus the competitiveness, of latecomers in the SOC DS industry has therefore become a critical issue for the top managers of SOC design service firms. In this paper, a novel multiple attribute decision making (MADM) analytic framework based on the concept of competence set expansion, together with MADM methods consisting of DEMATEL, ANP, and multiple objective decision making (MODM), is proposed to define a path for expanding a late-coming SOC DS firm's innovation capabilities. An empirical study on expanding the innovation competence sets of a late-coming Taiwanese DS firm is then presented.

Chi-Yo Huang, Gwo-Hshiung Tzeng, Yeou-Feng Lue, Hsiu-Tyan Chuang
A Genetic Local Search Algorithm for the Multiple Optimisation of the Balanced Academic Curriculum Problem

We deal with the Balanced Academic Curriculum Problem, a real-world problem that is currently part of CSPLIB. We introduce a Genetic Local Search algorithm to solve this problem using two objectives, which is a more realistic model than the one used in our previous research. The tests carried out show that our algorithm obtains better solutions than systematic search techniques in the same amount of time.

Carlos Castro, Broderick Crawford, Eric Monfroy
Using Consistent Fuzzy Preference Relations to Risk Factors Priority of Metropolitan Underground Project

Executing a large and complex underground project in a metropolis involves many risk factors, and the successful implementation of such a project depends on effective management of the key risk factors. This study takes the key risk factors of an underground railway project identified by Ghosh and Jintanapakanont (2004) and uses consistent fuzzy preference relations (CFPR) to assess the degree of impact of these risk factors. It shows that the CFPR approach is an easy and practical way to rank a larger number of risk factors in decision making and satisfies the consistency requirement using only n - 1 pairwise comparisons.

Shih-Tong Lu, Cheng-Wei Lin, Gwo-Hshiung Tzeng
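As background, summarized from the general CFPR literature (Herrera-Viedma et al.) rather than from this paper: a fuzzy preference relation P = (p_ij) with p_ij + p_ji = 1 is additively consistent when

```latex
p_{ik} \;=\; p_{ij} + p_{jk} - \tfrac{1}{2}\qquad \text{for all } i,j,k,
```

so once the n - 1 adjacent comparisons p_{12}, p_{23}, ..., p_{(n-1)n} have been elicited, every remaining entry can be filled in from this relation (values falling outside [0,1] are handled by a simple rescaling).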
Using MCDM Methods to Adopt and Assess Knowledge Management

This paper proposes a fuzzy group decision approach for making strategic decisions about knowledge management (KM) adoption. Implementing KM is becoming more complicated: practitioners must assess complex and confusing situations, initiate KM, identify the causal relationships between problems, make appropriate decisions, and guarantee that the recommended action plan will be effective, so effective group decision making is essential. A 27-item list constituting a complete domain for measuring OKMR (Organizational Knowledge Management Readiness) was obtained and used to build the eight constructs/criteria of knowledge management adoption. The DEMATEL (Decision MAking Trial and Evaluation Laboratory) method gathers collective knowledge to capture the causal relationships between strategic criteria, and this paper applies it in the strategic planning of knowledge management to help managers address the above situations and related questions. The ANP (analytic network process) handles dependence within a cluster and among different clusters, with knowledge management adoption as the goal and the alternatives at the lower levels, based on the dynamic concept of a Markov chain.

Ying-Hsun Hung, Seng-Cho T. Chou, Gwo-Hshiung Tzeng
Backmatter
Metadata
Title
Cutting-Edge Research Topics on Multiple Criteria Decision Making
Edited by
Yong Shi
Shouyang Wang
Yi Peng
Jianping Li
Yong Zeng
Copyright year
2009
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-02298-2
Print ISBN
978-3-642-02297-5
DOI
https://doi.org/10.1007/978-3-642-02298-2
