Published in: Research in Engineering Design 4/2007

01.03.2007 | Original Paper

Feature-based classifiers for design optimization

Authors: Haoyang Liu, T. Igusa


Abstract

We present a design optimization method for systems with high-dimensional parameter spaces using inductive decision trees. The essential idea is to map designs into a relatively low-dimensional feature space, and to derive a classifier to search for high-performing design alternatives within this space. Unlike learning classifier systems that were pioneered by Holland and Goldberg, classifiers defined by inductive decision trees were not originally developed for design optimization. In this paper, we explore modifications to such classifiers to make them more effective in the optimization problem. We expand the notions of feature space, generalize the tree construction heuristic beyond the original information-theoretic definitions, increase the reliance on domain expertise, and facilitate the transfer of design knowledge between related systems. There is a relatively small but rapidly growing body of work in the use of inductive trees for engineering design; the method presented herein is complementary to this research effort.
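
As a rough illustration of this workflow, the following sketch maps randomly generated designs into a two-dimensional feature space and trains a decision tree to screen candidate designs. It is a minimal sketch only: the feature map, the synthetic performance function, and the use of scikit-learn's DecisionTreeClassifier are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def features(x):
    """Hypothetical feature map: compress a high-dimensional design x
    into a few physically meaningful coordinates."""
    return np.array([x.mean(), x.max() - x.min()])

def performance(x):
    """Hypothetical (expensive) performance function g(x)."""
    return -np.sum((x - 0.5) ** 2)

# Supervised training data: evaluate g on a modest sample of designs and
# label each design as high (1) or low (0) performing.
X = rng.random((200, 20))                      # 20-dimensional design space
g = np.array([performance(x) for x in X])
y = (g > np.median(g)).astype(int)
F = np.array([features(x) for x in X])         # low-dimensional feature space

# Inductive decision tree acting as the classifier over the feature space.
tree = DecisionTreeClassifier(max_depth=3).fit(F, y)

# Screen a large pool of candidates and retain only those the tree predicts
# to be high-performing, before any expensive evaluation.
candidates = rng.random((5000, 20))
Fc = np.array([features(x) for x in candidates])
promising = candidates[tree.predict(Fc) == 1]
print(len(promising), "of", len(candidates), "candidates retained")
```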


Appendices (accessible only with authorization)
Footnotes
1
All columns have the same length and cross-section properties; the beams are 50% longer than the columns with a second moment of cross-section area that is twice that of the columns; the same elastic material is used for all beams and columns.
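
Restating footnote 1 as explicit relations (the symbols \(L\), \(I\), and \(E\) for member length, second moment of cross-section area, and elastic modulus are introduced here for illustration, with subscripts b and c denoting beam and column):

\[
L_b = 1.5\,L_c, \qquad I_b = 2\,I_c, \qquad E_b = E_c .
\]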
 
2
\(C_j^n\) is the binomial coefficient of \(n\) and \(j\); the reduction by the factor of 2 is due to symmetry.
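
As a worked form of this count, assuming \(j\) members are chosen from \(n\) candidate positions and each selection is paired with its mirror image:

\[
\tfrac{1}{2}\,C_j^n \;=\; \frac{n!}{2\,j!\,(n-j)!} .
\]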
 
3
Standard finite-element analysis is used to compute g.
 
4
We can set \(\gamma_0 = \infty\) and \(\gamma_M = -\infty\) so that classes \(C_1\) and \(C_M\) would correspond to the highest- and lowest-performing designs.
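
One plausible formalization of the threshold-based classes, assuming the thresholds decrease with the class index (this restates the footnote; the exact form is given by Eq. (2) in the paper):

\[
x \in C_i \quad\Longleftrightarrow\quad \gamma_i \le g(x) < \gamma_{i-1}, \qquad i = 1, \ldots, M,
\]

so that setting \(\gamma_0 = \infty\) and \(\gamma_M = -\infty\) leaves the first and last classes unbounded above and below, respectively.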
 
5
Here \(\overline{g}\) and \(s_g\) are the sample average and standard deviation of the performance values \(g(x_j)\) of the training data set, and the multiplier of 0.8 was chosen so that the three classes had approximately the same number of supervised data.
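
A minimal sketch of this three-class labeling, assuming the two thresholds are \(\overline{g} \pm 0.8\,s_g\) (an interpretation of the footnote; the paper's Eq. (2) defines the thresholds precisely):

```python
import numpy as np

def label_three_classes(g_values, k=0.8):
    """Label designs 1 (high), 2 (middle), or 3 (low) from performance values.

    Thresholds are taken as g_bar + k*s_g and g_bar - k*s_g, with k = 0.8
    chosen, per the footnote, to balance the three classes approximately.
    """
    g = np.asarray(g_values, dtype=float)
    g_bar, s_g = g.mean(), g.std(ddof=1)          # sample mean and std dev
    hi, lo = g_bar + k * s_g, g_bar - k * s_g
    return np.where(g >= hi, 1, np.where(g >= lo, 2, 3))
```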
 
6
This is also known as the naive Bayes classifier and is based on the maximum likelihood principle.
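
A minimal sketch of such a classifier, using a Gaussian naive Bayes model from scikit-learn; the feature matrix, the labels, and the Gaussian assumption for each feature are illustrative and are not specified by the paper:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Illustrative training data: rows of F are feature vectors of designs,
# y holds the class labels obtained from the performance thresholds.
F = np.array([[0.8, 1.2], [0.9, 1.1], [2.0, 0.3], [2.2, 0.4]])
y = np.array([1, 1, 3, 3])

clf = GaussianNB().fit(F, y)       # per-class, per-feature Gaussian likelihoods
print(clf.predict([[1.0, 1.0]]))   # most likely class for a new design's features
```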
 
7
For relatively small \(m\), the knowledge abstraction level and the associated evaluation effort of the feature functions are usually high; the corresponding features are termed high-level features (as opposed to low-level features). It is preferable, however, to separate any quantities that require substantial evaluation effort from the feature coordinates and to treat them separately in the knowledge modeling process. For instance, low-fidelity approximate models \(g_{\mathrm{approx}}(x)\), which are simpler than the original performance function \(g(x)\), may be too complex to satisfy the simplicity attribute for the feature vector. It is shown in Sect. 4.2 how \(g_{\mathrm{approx}}(x)\) can be used in the design problem.
 
8
This is simply the ratio of the column height to its width. Other measures of slenderness depend on the type of column being examined. For thin-walled steel members, such measures would be in terms of the first and second moments of cross-section area and other aggregated dimensional quantities (Liu et al. 2004).
 
9
The domain expert that we use is a novice structural engineer with knowledge of frame behavior gained primarily from a graduate-level course in structural mechanics. This level of expertise is sufficient for meaningful interaction with the design process as illustrated in Fig. 1.
 
10
For instance, reduced-error pruning repeatedly analyzes the classifier resolution, quantified by an average information entropy, at each non-leaf node. If the resolution of the subtree with the non-leaf node as the root is not much higher than that of the trivial subtree where the same node is replaced by a leaf, then the tree is pruned at this node.
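
The following sketch of this pruning rule is illustrative: it assumes binary splits, stores class counts at every node, and uses an arbitrary tolerance `tol`; the paper does not specify these details.

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    counts: dict                     # class label -> training designs reaching this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def entropy(counts):
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values() if c > 0)

def avg_leaf_entropy(node):
    """Count-weighted average entropy over the leaves of a subtree
    (the 'resolution' measure described in the footnote)."""
    if node.left is None and node.right is None:
        return entropy(node.counts)
    n = sum(node.counts.values())
    return sum(sum(ch.counts.values()) / n * avg_leaf_entropy(ch)
               for ch in (node.left, node.right))

def prune(node, tol=0.05):
    """Bottom-up pruning: a non-leaf node becomes a leaf when its subtree
    lowers the average entropy by less than tol."""
    if node.left is None and node.right is None:
        return node
    node.left, node.right = prune(node.left, tol), prune(node.right, tol)
    if entropy(node.counts) - avg_leaf_entropy(node) < tol:
        node.left = node.right = None
    return node
```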
 
11
It may be necessary to adjust the performance thresholds used in (2) so that the expanded training data would be more evenly distributed among the classes \(C_j\).
 
12
Although the average information entropy in (10) could be viewed as a special case of the expected utility in (12) by setting \(u_{ij} = -\log_2 P_{ij}\), the utilities are not usually defined in terms of probabilities.
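
The correspondence is immediate if the expected utility has the usual form (a sketch only; the precise forms of Eqs. (10) and (12) are as given in the paper):

\[
\mathrm{E}[u] \;=\; \sum_{i,j} P_{ij}\, u_{ij}
\;=\; -\sum_{i,j} P_{ij} \log_2 P_{ij},
\]

which is the average information entropy of the probabilities \(P_{ij}\).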
 
13
Statistically derived models (Rudnyi 1996; Buzas 1997; Haq and Kibria 1997; De La Cruz-Mesia and Marshall 2003) are not of interest herein because they are typically in terms of basis function expansions with non-informative coefficients.
 
14
In this method, it is assumed that the inflection points are at the midpoint of every beam and column, except for the lower columns.
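
Under this assumption the member end moments follow from statics alone; for example, for an upper-story column of height \(h\) carrying shear \(V\) (symbols introduced here for illustration, not taken from the paper):

\[
M_{\mathrm{col}} \;\approx\; V\,\frac{h}{2},
\]

since the bending moment vanishes at the assumed mid-height inflection point.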
 
15
The thresholds that define the classes \(C_j\) used in the second system would have to be adjusted so that class \(C_1\) would still represent high-performing designs.
 
16
Our classifier approach has also been successfully applied to a much less intuitive design problem involving thin-walled steel columns (Liu et al. 2004). It is noted that the emphasis of that paper was on an exploration of a new nonlinear model for cold-formed steel columns; only a brief outline of a simpler form of the classifier approach was given.
 
References
Bailey R, Bras B, Allen JK (1999) Using robust concept exploration and system dynamics models in the design of complex industrial ecosystems. Eng Optim 32(1):33–58
Breiman L, Friedman JH, Olshen RA, Stone CJ (1984) Classification and regression trees. Wadsworth International, Belmont
Buntine W (1990) A theory of learning classification rules. PhD thesis, University of Technology, Sydney
Buntine W (1992) Learning classification trees. Statistics and Computing 2:63–73, DOI:10.1007/BF01889584
Buzas JS (1997) Instrumental variable estimation in nonlinear measurement error models. Commun Stat Theory Methods 26(12):2861–2877
Chen W, Allen JK, Mavris D, Mistree F (1996) A concept exploration method for determining robust top-level specifications. Eng Optim 26:137–158
DeGroot MH (1970) Optimal statistical decisions. McGraw-Hill, New York
De La Cruz-Mesia R, Marshall G (2003) A Bayesian approach for nonlinear regression models with continuous errors. Commun Stat Theory Methods 32:1631–1646, DOI:10.1081/STA-120022248
Devroye L, Gyorfi L, Lugosi G (1996) A probabilistic theory of pattern recognition. Springer, Berlin Heidelberg New York
Duda RO, Hart PE, Stork DG (2001) Pattern classification, 2nd edn. Wiley, New York
Forouraghi B (1999) On utility of inductive learning in multi-objective robust design. Artif Intell Eng Des Anal Manuf 13:27–36, DOI:10.1017/S0890060499131032
Gero JS, Kazakov VA (1995) Evolving building blocks for design using genetic engineering. In: IEEE international conference on evolutionary computing, pp 340–345
Goldberg DE (1987) Simple genetic algorithm and the minimal deceptive problem. In: Research notes in artificial intelligence, chapter 6. Morgan Kaufmann Publishers, San Francisco, pp 74–88
Goldberg DE (1989) Genetic algorithms in search, optimization and machine learning. Addison-Wesley, Reading
Haq M, Kibria B (1997) Predictive inference for linear and multivariate linear models with MA(1) error processes. Commun Stat Theory Methods 26(2):331–353
Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning. Springer, Berlin Heidelberg New York
Holland JH (1975) Adaptation in natural and artificial systems. University of Michigan Press, Ann Arbor
Igusa T, Liu H, Schafer BW, Naiman DQ (2003) Bayesian classification trees and clustering for rapid generation and selection of design alternatives. In: Reddy RG (ed) NSF design, manufacturing, and industrial innovation research conference, January 6–9, Birmingham
Kovacs T (2004) Bibliography of real-world classifier systems applications. In: Bull L (ed) Applications of learning classifier systems. Springer, Berlin Heidelberg New York, pp 300–305
Lee J, Hajela P (2001) Application of classifier systems in improving response surface based approximations for design optimization. Comput Struct 79:333–344, DOI:10.1016/S0045-7949(00)00132-2
Liu H (2003) Bayesian classifiers for uncertainty modeling with applications to global optimization and solid mechanics problems. PhD thesis, Department of Civil Engineering, Johns Hopkins University, Baltimore
Liu H, Motoda H (1998) Feature selection for knowledge discovery and data mining. Kluwer Academic Publishers, Norwell
Liu H, Igusa T, Schafer BW (2004) Knowledge-based global optimization of cold-formed steel columns. Thin-Walled Struct 42:785–801, DOI:10.1016/j.tws.2004.01.001
Matheus C (1991a) The need for constructive induction. In: Machine learning: proceedings of the eighth international workshop, pp 173–177
Matheus C (1991b) The need for constructive induction. In: Machine learning: proceedings of the eighth international workshop, pp 173–177
Mili F, Shen W, Martinez I, et al (2001) Knowledge modeling for design decisions. Artif Intell Eng 15:153–164, DOI:10.1016/S0954-1810(01)00013-9
Myers RH, Khuri AI, Carter WH (1989) Response surface methodology: 1966–1988. Technometrics 31:137–157, DOI:10.2307/1268813
Perremans P (1996) Feature-based description of modular fixturing elements: the key to an expert system for the automatic design of the physical fixture. Adv Eng Software 25:19–27, DOI:10.1016/0965-9978(95)00082-8
Quinlan JR (1993) C4.5: Programs for machine learning. Morgan Kaufmann, San Mateo
Reckhow KH (1999) Water quality prediction, mechanism, and probability network models. Canad J Fish Aquat Sci 56:1150–1158, DOI:10.1139/cjfas-56-7-1150
Rosenman MA (1997) The generation of form using an evolutionary approach. In: Dasgupta D, Michalewicz Z (eds) Evolutionary algorithms in engineering applications. Springer, Berlin Heidelberg New York, pp 69–86
Rudnyi EB (1996) Statistical model of systematic errors: linear error model. Chemometrics and Intelligent Laboratory Systems 34:41–54, DOI:10.1016/0169-7439(96)00004-4
Salustri FA, Venter RD (1992) An axiomatic theory of engineering design information. Eng Comput 8:197–211, DOI:10.1007/BF01194322
Schwabacher M, Ellman T, Hirsh H (1998) Learning to set up numerical optimizations of engineering designs. Artif Intell Eng Des Anal Manuf 12:173–192, DOI:10.1017/S0890060498122084
Stahovich TF, Bal H (2002) An inductive approach to learning and reusing design strategies. Res Eng Des 13:109–121, DOI:10.1007/s00163-001-0010-9
Varadarajan S, Chen W, Pelka CJ (2000) Robust concept exploration of propulsion systems with enhanced model approximation capabilities. Eng Optim 32:309–334
Witten IH, Frank E (2000) Data mining: practical machine learning tools and techniques with Java implementations. Morgan Kaufmann Publishers, San Francisco
Wolpert DH, Macready WG (1997) No free lunch theorems for optimization. IEEE Trans Evol Comput 1(1):67–82, DOI:10.1109/4235.585893
Wyse N, Dubes R, Jain A (1980) A critical evaluation of intrinsic dimensionality algorithms. In: Pattern recognition in practice. Morgan Kaufmann Publishers, San Francisco, pp 415–425
Metadata
Title
Feature-based classifiers for design optimization
Authors
Haoyang Liu
T. Igusa
Publication date
01.03.2007
Publisher
Springer-Verlag
Published in
Research in Engineering Design / Issue 4/2007
Print ISSN: 0934-9839
Electronic ISSN: 1435-6066
DOI
https://doi.org/10.1007/s00163-006-0024-4
