GA-optimized Support Vector Regression for an Improved Emotional State Estimation Model
KSII Transactions on Internet and Information Systems (TIIS). 2014. Jun, 8(6): 2056-2069
Copyright © 2014, Korean Society For Internet Information
  • Received : March 14, 2014
  • Accepted : May 30, 2014
  • Published : June 28, 2014
About the Authors
Hyunchul Ahn
Graduate School of Business IT, Kookmin University Seoul, 136-702, Republic of Korea
Seongjin Kim
Graduate School of Business IT, Kookmin University Seoul, 136-702, Republic of Korea
Jae Kyeong Kim
School of Management, Kyunghee University Seoul, 130-701, Republic of Korea

In order to implement interactive and personalized Web services properly, it is necessary to understand the tangible and intangible responses of the users and to recognize their emotional states. Recently, some studies have attempted to build emotional state estimation models based on facial expressions. Most of these studies have applied multiple regression analysis (MRA), artificial neural network (ANN), and support vector regression (SVR) as the prediction algorithm, but the prediction accuracies have been relatively low. In order to improve the prediction performance of the emotion prediction model, we propose a novel SVR model that is optimized using a genetic algorithm (GA). Our proposed algorithm—GASVR—is designed to optimize the kernel parameters and the feature subsets of SVRs in order to predict the levels of two aspects—valence and arousal—of the emotions of the users. In order to validate the usefulness of GASVR, we collected a real-world data set of facial responses and emotional states via a survey. We applied GASVR and other algorithms including MRA, ANN, and conventional SVR to the data set. Finally, we found that GASVR outperformed all of the comparative algorithms in the prediction of the valence and arousal levels.
1. Introduction
Affective computing has recently been gaining attention from researchers who are studying interactive and personalized Web services. Affective computing technologies deal with the systems, devices, and computers that can recognize, interpret, and express affective and emotional states [1] . These devices can improve human-computer interaction by enabling the communication of the users’ emotional states [2] .
In order to implement affective computing, it is necessary to understand the tangible and intangible responses of the users such as their speech, gestures, and facial expressions. From this wide array of responses, facial expressions have often been selected as the major source of information for estimating the users’ emotional states [3] [4] [5] .
Facial affect detection using emotional state estimation models may enhance user experiences by providing intelligent interaction with the users. For example, Thompson and McGill [2] proposed affective tutoring systems that enable e-learning applications to have the ability to detect and respond to the emotions that are experienced by the learner. Lin et al. [6] also proposed a learning emotional recognition model that enhances the students’ understanding during distance learning courses. Jung and Kim [5] and Kim et al. [7] presented emotional state estimation models for implementing interactive exhibitions.
The process of facial affect detection using an emotional state estimation model requires the selection of an appropriate prediction algorithm. Various algorithms have been used for estimating emotional states, including multiple regression analysis (MRA), artificial neural network (ANN), and support vector regression (SVR). The SVR method has been used in many recent studies because of its high level of prediction accuracy [7] [8] [9] .
SVR—a quantitative support vector machine (SVM) model—attempts to minimize the generalized error bound so as to achieve generalized performances [10] . Compared to ANN, it is known that SVR offers good prediction performances, even with a limited number of learning patterns [7] . In spite of these advantages, SVR and SVM are often criticized because their architectures are often determined by heuristic factors, such as the parameters of the kernel function and appropriate feature subsets [11] . However, optimization of these factors may lead to better results for emotional state estimation.
Given this background, our study proposes a novel algorithm, called Genetic Algorithm-based Support Vector Regression (GASVR), as the prediction algorithm for emotional state estimation. Our proposed algorithm uses a genetic algorithm (GA) to optimize the parameters and the feature subset selection for SVR. We use facial responses and the GASVR method to predict the two indicators of emotional state—the levels of valence and arousal.
The rest of this paper is organized as follows. In section 2, we present theoretical background information about emotional state estimation, SVR, and GA. In section 3, we describe the research model that is proposed in this paper and the procedures that are associated with it. Section 4 describes the empirical analysis that we used to validate the effectiveness of the proposed algorithm with a real-world data set. The conclusions and limitations of the study are discussed in the final section.
2. Theoretical Background
- 2.1 Emotional state estimation
Emotional state estimation plays an important role in affective computing. When implementing affective computing in practical situations, the computer systems should estimate the emotional states of humans in order to output an appropriate response for those emotions. As a result, researchers in the field of affective computing have studied methods for constructing effective emotional state estimation models.
Before building an emotional state estimation model, we first have to adopt a theoretical model that quantifies the levels of emotional states. Various emotional state models have been proposed in the field of psychology. We have adopted the V-A (valence-arousal) model, which is the most popular model for emotional state estimation.
The V-A model was proposed to measure the emotions felt by humans. It uses a two-dimensional approach [12] [13] . The dimensions of the V-A model are valence and arousal. The valence dimension (V) represents how positive or negative the emotion is. It ranges from unpleasant to pleasant. The arousal dimension (A) refers to the level of excitement or apathy of the emotion and it ranges from sleepiness to frantic excitement [5] [8] [12] [13] . Fig. 1 shows the two dimensions of the V-A model and the positions of blended emotions.
Fig. 1. V-A model.
- 2.2 Support Vector Regression
SVM is a statistical learning technique that was introduced by Vapnik [14] . SVM is a data mining algorithm that can be applied to classification (referred to as support vector classification—SVC) and prediction (referred to as support vector regression—SVR) [10] . SVM, which includes SVC and SVR, can lead to superior performances in practical applications because of its structural risk minimization principle, which is more general and superior to the empirical risk minimization principle that is adopted by conventional neural networks [15] .
SVR, the regression model of SVM, is able to solve nonlinear estimation problems effectively. In order to extend SVM from classification to regression, Vapnik et al. [16] adopted an ε-insensitive loss function. In order to illustrate the concept of SVR, a typical regression problem will be formulated [15] [17] . Let us assume that there is a set of data

$$D = \{(x_i, q_i)\}_{i=1}^{n}, \quad x_i \in \mathbb{R}^d, \; q_i \in \mathbb{R}$$

where $x_i$ is a vector of the model’s input features, $q_i$ is the value of the output variable, and $n$ is the number of data patterns. Then, the goal of the general regression analysis is to find a function $f(x)$ that is able to accurately predict the desired output values ($q$). A typical linear regression function can be derived as $q = f(x) + \delta$, where $\delta$ is a random error with distribution $N(0, \sigma^2)$.
In SVR, in order to solve nonlinear regression problems, the input features ($x_i$) are first mapped into a high-dimensional feature space ($F$) where they are correlated linearly with the outputs. Thus, SVR derives the following linear estimation function [15] [18] :

$$f(x) = v \cdot \Phi(x) + b \tag{1}$$

where $v$ is the weight vector, $b$ is the constant, and $\Phi(x)$ represents a mapping function into the feature space.
In SVR, the problem of nonlinear regression in the input space ($x$) is transformed into the problem of linear regression in a higher-dimensional feature space ($F$). Here, the robust ε-insensitive loss function ($L_\varepsilon$) presented in (2) is the most commonly used cost function [15] [18] :

$$L_\varepsilon(f(x), q) = \begin{cases} |q - f(x)| - \varepsilon, & \text{if } |q - f(x)| \ge \varepsilon \\ 0, & \text{otherwise} \end{cases} \tag{2}$$

where $\varepsilon$ denotes a precision parameter that represents the radius of the tube around the regression function $f(x)$, as shown in Fig. 2 [15] .
Fig. 2. SVR using ε-insensitive loss function (adopted from [15]).
The weight vector ($v$) and the constant ($b$) in (1) can be estimated by minimizing the following regularized risk function [15] [18] :

$$R(f) = C \, \frac{1}{n} \sum_{i=1}^{n} L_\varepsilon(f(x_i), q_i) + \frac{1}{2} \|v\|^2 \tag{3}$$

where $L_\varepsilon$ is the ε-insensitive loss function, $\frac{1}{2}\|v\|^2$ is the regularization term that controls the trade-off between the complexity and the approximation accuracy of the regression model, and $C$ is the regularization constant that is used to specify the trade-off between the empirical risk and the regularization term.
By adopting slack variables $\xi_i$ and $\xi_i^*$, (3) can be transformed into the following constrained form:

$$\begin{aligned} \min_{v,\, b,\, \xi,\, \xi^*} \quad & \frac{1}{2} \|v\|^2 + C \sum_{i=1}^{n} (\xi_i + \xi_i^*) \\ \text{subject to} \quad & q_i - v \cdot \Phi(x_i) - b \le \varepsilon + \xi_i \\ & v \cdot \Phi(x_i) + b - q_i \le \varepsilon + \xi_i^* \\ & \xi_i, \xi_i^* \ge 0, \quad i = 1, \dots, n \end{aligned} \tag{4}$$
The application of Lagrangian multipliers and Karush-Kuhn-Tucker (KKT) conditions to (4) finally leads to the following general form of the SVR-based regression function [18] :

$$f(x) = \sum_{i=1}^{n} (\alpha_i - \alpha_i^*) K(x, x_i) + b \tag{5}$$

where $K(x, x_i)$ denotes the kernel function and $\alpha_i$, $\alpha_i^*$ are the Lagrangian multipliers.
Although there are several possible choices for the kernel function, the Gaussian radial basis function (RBF), defined as

$$K(x, x_i) = \exp\left(-\frac{\|x - x_i\|^2}{2\sigma^2}\right)$$

is the most popular [15] [17] . For further information on the theoretical background of SVR, refer to Lu et al. [15] and Vapnik [18] .
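To make the formulation above concrete, the following sketch fits an ε-SVR with a Gaussian RBF kernel on toy data. It uses scikit-learn rather than the LIBSVM toolkit employed later in this paper, and the data set and parameter values are illustrative assumptions, not settings from this study:

```python
import numpy as np
from sklearn.svm import SVR

# Toy nonlinear regression problem: a noisy sine wave.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
q = np.sin(X).ravel() + rng.normal(0, 0.1, size=200)

# epsilon-SVR with a Gaussian RBF kernel; C, gamma, and epsilon are
# exactly the quantities a GA search would later tune.
model = SVR(kernel="rbf", C=10.0, gamma=0.5, epsilon=0.1)
model.fit(X, q)

pred = model.predict(X)
mae = float(np.mean(np.abs(pred - q)))
```

Note that scikit-learn parameterizes the RBF kernel with gamma, which corresponds to $1/(2\sigma^2)$ in the notation above.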
- 2.3 GA and its application with SVR
GA is a simple but effective optimization method that attempts to simulate biological evolution [19] [20] . Through the repeated application of genetic operations such as selection, crossover, and mutation, it gradually improves the quality of the search results. In particular, the mutation operator helps prevent GA from falling into local optima, while selection and crossover enable efficient searches. Goldberg [21] provides more information about the evolution process and the genetic operators of GA.
So far, many researchers have applied GA for searching optimal architectural parameters of machine learning algorithms, including case-based reasoning (CBR) [22] [23] , artificial neural network (ANN) [24] [25] , and SVM (SVC) [26] [27] . Additionally, some recent studies have attempted to use GA to optimize the architectural factors of SVR [28] [29] [30] [31] [32] [33] [34] [35] [36] .
Chen [28] , Chen and Wang [29] , Ji et al. [30] , and Liu et al. [31] used GA as the tool for optimizing the kernel parameters (C, σ, and ε) of SVR in order to forecast the target variable more effectively. Lahiri and Ghanta [32] and Wu et al. [33] tried to use GA to optimize both the kernel functions of SVR and their parameters simultaneously. All of these studies reported that SVR methods with kernel optimizations based on GA led to improved prediction accuracies as compared to conventional SVR methods.
Since Huang and Wang [26] proposed GA for feature selection and kernel parameter optimization for SVC, several recent studies, like He et al. [34] , Oliveira et al. [35] , and Huang [36] , have tried to apply GA for feature selection and kernel parameter optimization for SVR. Huang [36] reported that optimal feature selection and kernel parameter selection both contributed significantly to the prediction accuracy of the algorithm. However, there have been few previous studies that have applied the optimization of the feature subsets and the kernel parameters of SVR to the estimation of emotional states. Thus, this study applies GASVR, which optimizes both feature subsets and kernel parameters, to emotional state estimation.
3. Research Model
In this study, we propose a novel SVR algorithm that uses GA to optimize the feature selection and the kernel parameter settings simultaneously. Hereafter, we will call our algorithm GASVR (GA-optimized SVR). Fig. 3 shows how the GASVR process works. The detailed explanation for each step of GASVR is as follows.
Fig. 3. Process of GASVR.
- 3.1 Phase I: Initiation
In the first phase, the initial population of the GA search is generated based on the structure of a chromosome. In order to apply the genetic operators of GA, the values that are encoded in a chromosome must be transformed into binary form. In the case of GASVR, each chromosome contains information about the feature selection and the kernel parameter settings. Feature selection can easily be encoded as a binary string: each bit is set to ‘1’ if the corresponding feature is selected and ‘0’ if it is not. For the kernel parameters, the values must be converted to binary numbers; in this study, we assign 14 bits per parameter. Our study uses the Gaussian RBF as the kernel function for SVR. Regarding the Gaussian RBF kernel function, Tay and Cao [37] reported that the upper bound C and the kernel parameter σ² are critical to the performance of SVM. Thus, the chromosome of GASVR is designed to optimize the two kernel parameters (C, σ²) for the Gaussian RBF and the precision parameter ε of the ε-insensitive loss function. Finally, the length of each chromosome becomes ( m + 52) bits, where m is the number of features.
GASVR generates the initial population based on the chromosome structure that is described above. At this time, the values of the chromosomes in the population are initiated to random values.
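As a sketch of this encoding, the following hypothetical fragment builds and decodes such a chromosome, assuming 14 bits per kernel parameter and the parameter ranges used later in Section 4.2. The function names and the linear bit-to-value mapping are our own illustrative assumptions:

```python
import random

N_FEATURES = 35          # number of candidate facial features (m)
BITS_PER_PARAM = 14      # bits assigned to each kernel parameter
PARAM_RANGES = {"C": (10, 100), "sigma2": (1, 100), "eps": (0.05, 0.5)}

def random_chromosome():
    """Random bit string: feature-selection mask followed by parameter bits."""
    length = N_FEATURES + BITS_PER_PARAM * len(PARAM_RANGES)
    return [random.randint(0, 1) for _ in range(length)]

def decode(chrom):
    """Split a chromosome into a feature mask and real-valued parameters."""
    mask = chrom[:N_FEATURES]
    params, pos = {}, N_FEATURES
    for name, (lo, hi) in PARAM_RANGES.items():
        bits = chrom[pos:pos + BITS_PER_PARAM]
        value = int("".join(map(str, bits)), 2)
        # Map the integer linearly onto the parameter's search range.
        params[name] = lo + (hi - lo) * value / (2 ** BITS_PER_PARAM - 1)
        pos += BITS_PER_PARAM
    return mask, params
```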
- 3.2 Phase II: SVR training and evaluation
In the second phase, GASVR repeats a typical ε -SVR training process based on the assigned value in the chromosomes. Then it evaluates the fitness value of each chromosome. The main objective of the GA search in GASVR is to find the optimal or near optimal feature subsets and kernel parameters that lead to the most accurate predictions. From this perspective, we use the mean absolute error (MAE) of the training data set as the fitness function for GASVR.
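A minimal sketch of this fitness evaluation follows, with scikit-learn's SVR standing in for LIBSVM and the decoded chromosome supplying the feature mask and kernel parameters; the function name and toy data are hypothetical:

```python
import numpy as np
from sklearn.svm import SVR

def fitness(mask, C, sigma2, eps, X_train, q_train):
    """Training-set MAE of an epsilon-SVR built from one chromosome
    (lower is better; the GA minimizes this value)."""
    cols = [i for i, bit in enumerate(mask) if bit]
    if not cols:                      # an empty feature subset is invalid
        return float("inf")
    Xs = X_train[:, cols]
    # sklearn's gamma corresponds to 1/(2*sigma^2) in the paper's notation.
    model = SVR(kernel="rbf", C=C, gamma=1.0 / (2.0 * sigma2), epsilon=eps)
    model.fit(Xs, q_train)
    return float(np.mean(np.abs(model.predict(Xs) - q_train)))
```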
- 3.3 Phase III: Evolution by genetic operators
In this phase, GASVR applies genetic operators, such as selection, crossover, and mutation, to the current population based on the fitness values that were generated in Phase II. As a result, a new generation of the population is created in this phase.
From this point, Phases II and III are iterated until the stopping conditions are satisfied. When the stopping conditions are satisfied, the chromosome that shows the best fitness value in the last population is selected, and the values of the optimal chromosome (i.e., optimal feature subset and kernel parameters) are finally determined.
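The Phase II-III loop can be sketched generically as follows. Truncation selection, one-point crossover, and per-bit mutation are our own simplifying assumptions; the paper does not specify the exact operators used by its GA package:

```python
import random

def evolve(pop, fitness, n_gen=50, cx_rate=0.5, mut_rate=0.1):
    """Minimize `fitness` over a population of bit-string chromosomes."""
    for _ in range(n_gen):
        # Selection: keep the better half of the population as parents.
        parents = sorted(pop, key=fitness)[:len(pop) // 2]
        children = []
        while len(children) < len(pop):
            a, b = random.sample(parents, 2)
            if random.random() < cx_rate:          # one-point crossover
                cut = random.randrange(1, len(a))
                child = a[:cut] + b[cut:]
            else:
                child = a[:]
            # Mutation: flip each bit with probability mut_rate.
            child = [bit ^ (random.random() < mut_rate) for bit in child]
            children.append(child)
        pop = children
    return min(pop, key=fitness)                   # best chromosome found
```

With the training-set MAE as the fitness function, the returned chromosome encodes the feature subset and kernel parameters that are passed on to Phase IV.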
- 3.4 Phase IV: Checking for Generalizability
The optimized feature subset and kernel parameters that are derived from Phase III generally fit well with the training data set because the GA search in the GASVR method is guided by the prediction accuracy of the training data set. However, GA searches may show disappointing performances when they are applied to unknown data sets because of overfitting. In order to avoid this type of danger, we apply the finally selected features and kernel parameters to the hold-out (unknown) data set in the last phase. In this way, we can check the general applicability of GASVR.
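A sketch of this final check on synthetic data follows; the feature mask and parameter values below stand in for a hypothetical GA outcome and are not results from this study:

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical GA outcome: a feature mask and kernel parameter values.
mask = [1, 0, 1, 1, 0, 1]
C, sigma2, eps = 45.0, 12.0, 0.1

# Synthetic stand-in for the 297-sample data set used in Section 4.
rng = np.random.default_rng(2)
X = rng.normal(size=(297, 6))
y = X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 0.1, size=297)

# 80/20 split into training and hold-out sets, as in the paper.
cols = [i for i, bit in enumerate(mask) if bit]
X_tr, X_ho = X[:238, cols], X[238:, cols]
y_tr, y_ho = y[:238], y[238:]

# Train on the training set only, then measure MAE on the hold-out set.
model = SVR(kernel="rbf", C=C, gamma=1.0 / (2.0 * sigma2), epsilon=eps)
model.fit(X_tr, y_tr)
holdout_mae = float(np.mean(np.abs(model.predict(X_ho) - y_ho)))
```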
4. Empirical Validation
- 4.1 Data collection and preprocessing
We applied the proposed algorithm to a real-world data set in order to establish its validity. The data set used in this study was collected from the participants at the 2011-2012 Digital Media City (DMC) Culture Open Festival held in Seoul, Republic of Korea. At the festival, we ran a booth that displayed video stimuli to the participants via a large digital information display (DID) and captured their facial responses using a high-resolution camera. Funny, sad, and disgusting user-created content (UCC) clips were used as the video stimuli. At the conclusion of the booth experience, we surveyed the participants about the emotional states that they experienced while they were exposed to the video stimuli. Their emotional states were measured in two dimensions, valence and arousal, on a 7-point Likert scale.
We collected 381 responses from a total of 244 participants. This study used 297 valid responses for the empirical validation. In order to quantify the facial responses of the participants, we extracted the coordinates of 64 facial points from their facial response images (refer to Fig. 4 ), and derived 35 facial feature values that were proposed by Pantic and Rothkrantz [38] . These facial feature values were used as the candidate input variables for the prediction of the participants’ emotional states. Table 1 presents the candidate input variables for the emotional state estimation model in this study.
Fig. 4. The positions of the sixty-four facial points used in this study.
Table 1. Candidate input variables (* A: the middle point of points 11 and 15; ** B: the middle point of points 19 and 23).
In order to build a model for predicting the levels of valence and arousal from these candidate input variables, we filtered out the variables that did not have statistically significant relationships with the dependent variables (valence or arousal). To accomplish this, we used correlation analysis for the ratio input variables (f) and an independent-samples t-test for the nominal input variables (AU). As a result, two ratio variables (f1 and f18) and six nominal variables (AU5, AU7, AU10, AU18, AU20, and AU41) were selected as the independent variables for the prediction of valence. Three ratio variables (f4, f11, and f14) and seven nominal variables (AU1, AU2, AU13, AU15, AU18, AU20, and AU24) were selected as the independent variables for the prediction of arousal.
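This filtering step can be sketched as follows, assuming Pearson correlation for the ratio features and an independent-samples t-test for binary (on/off) nominal features; the function name and the 5% threshold are our own illustrative assumptions:

```python
import numpy as np
from scipy import stats

def filter_features(X_ratio, X_nominal, y, alpha=0.05):
    """Keep ratio features significantly correlated with y (Pearson) and
    binary nominal features whose two groups differ in y (t-test)."""
    keep_ratio = [j for j in range(X_ratio.shape[1])
                  if stats.pearsonr(X_ratio[:, j], y)[1] < alpha]
    keep_nominal = []
    for j in range(X_nominal.shape[1]):
        g0 = y[X_nominal[:, j] == 0]
        g1 = y[X_nominal[:, j] == 1]
        if len(g0) > 1 and len(g1) > 1 and stats.ttest_ind(g0, g1)[1] < alpha:
            keep_nominal.append(j)
    return keep_ratio, keep_nominal
```

Only the surviving columns are then offered to the GA as candidate features.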
Of the 297 valid samples, 238 (80%) were used as the training data set and the remaining 59 (20%) were used as the hold-out data set.
- 4.2 Experimental design
The experimental system for GASVR was developed using Microsoft EXCEL VBA (Visual Basic for Applications) and LIBSVM [39] . Palisade Software’s Evolver 5.5, a commercial software package for implementing GA, was also used in our experimental system.
As mentioned in Section 3.1, the Gaussian RBF was used as the kernel function for GASVR. The ranges of the parameters C, σ², and ε for the GA search were set to [10, 100], [1, 100], and [0.05, 0.5], respectively [7] [15] [37] . The population size was set to 100 organisms, and the crossover and mutation rates were set to 50% and 10%, respectively. As the stopping condition, 50 generations were permitted.
In order to test the effectiveness of GASVR, we also adopted three algorithms for comparison purposes – MRA, ANN, and conventional SVR with a grid search. The detailed descriptions of the settings for these algorithms that were used for comparison purposes were reported in our previous study [7] .
- 4.3 Experimental results
As the criterion for evaluating the prediction performance, we adopted the mean absolute error (MAE) method that is shown in the following equation:
$$MAE = \frac{1}{n} \sum_{i=1}^{n} |f_i - y_i|$$

where $n$ is the number of samples, $f_i$ is the predicted output, and $y_i$ is the actual output.
Table 2 shows the features and the optimal kernel parameters that were finally selected and Table 3 describes the prediction performances of GASVR and the other algorithms that were used for comparison purposes. As shown in Table 2 , GASVR used only six variables for the prediction of valence and nine variables for the prediction of arousal. Although GASVR used fewer input variables than the other algorithms, it achieved the highest level of prediction accuracy for the hold-out data set in the prediction of both valence and arousal, as shown in Table 3 .
Table 2. Result of GASVR.
Table 3. MAEs of the algorithms.
A related-samples t-test was used to examine whether the prediction error of GASVR was significantly lower than that of the other algorithms. Table 4 presents the results of the t-test. As shown in the table, GASVR was better than MRA and SVR at the 1% statistical significance level, and better than ANN at the 5% significance level, in the prediction of valence. However, it did not outperform any of the other algorithms with statistical significance in the prediction of arousal. This may be interpreted to mean that our model fits the estimation of valence better than that of arousal. However, this result should be interpreted with caution, as the small number of hold-out samples (just 59) may have been the cause of this disappointing outcome.
Table 4. Results of the related-samples t-test (* significant at the 5% level; ** significant at the 1% level).
5. Conclusion
In this paper, we proposed a novel hybrid model of SVR and GA, named GASVR, in order to improve the prediction performance of the typical SVR algorithm for emotional state estimation. From the experimental results, we found that GASVR led to better estimation results when predicting valence and arousal levels from facial features.
From a practical perspective, the algorithm that was proposed in this study could be applied in various personalized Web services, including distance learning, content recommendations, and personalized advertisements on the Web. In particular, estimations of Web users’ arousal levels will enable the system to respond properly when the users feel bored or uninterested.
The future research directions of this study are as follows. First, the empirical validation should be refined. In this study, we validated the proposed algorithm with a limited number of samples. As a result, it was not possible to observe statistically significant differences between GASVR and other algorithms during the predictions of arousal levels. Thus, in the future, we need to validate GASVR using larger samples.
Second, the optimization of SVR by GA can be extended to ‘instance selection’. Some prior studies have indicated that proper selection of training instances might play a critical role in improving the prediction performances of support vector-based learning algorithms [11] [27] . Thus, we expect that there will be studies about the application of proper instance selection to SVR in the near future.
Lastly, although we used it for emotional state estimation, the proposed algorithm, GASVR, can be applied to many other types of prediction and estimation problems. Thus, future research should endeavor to apply GASVR to other prediction domains in order to resolve various business problems.
Hyunchul Ahn is an associate professor in the School of Management Information Systems at Kookmin University, Seoul, Korea. He has a master’s degree and a Ph.D. in management engineering from the Korea Advanced Institute of Science and Technology (KAIST) Graduate School of Management. His research interests include technical issues on intelligent information systems for marketing and finance, and behavioral issues on the adoption of information systems. His work has been published in Annals of Operations Research, Applied Soft Computing, Computers & Operations Research, Computers in Human Behavior, Expert Systems with Applications, International Journal of Electronic Commerce, Information & Management, and Technological Forecasting and Social Change.
Seongjin Kim is currently enrolled in a master’s program at Graduate School of Business IT, Kookmin University, where he also earned his B.A. degree in Management Information Systems. His primary research interests include data mining for business, Information Technology adoption and use, and Human-Computer Interaction.
Jae Kyeong Kim is a professor at School of Management, Kyunghee University. He obtained his M.S. and Ph.D. in Management Information Systems (MIS) from KAIST (Korea Advanced Institute of Science and Technology), and his B.S. in Industrial Engineering from Seoul National University. His current research interests focus on business intelligence, network management, and green business/IT. He has published numerous papers which have appeared in Artificial Intelligence Review, Electronic Commerce Research and Applications, European Journal of Operational Research, Expert Systems with Applications, Group Decision and Negotiations, IEEE Transactions on Services Computing, International Journal of Human-Computer Studies, International Journal of Information Management, and Technological Forecasting and Social Change. He is also a chief editor of JIIS (Journal of Intelligence and Information Systems), and an AE (associate editor) of Information Technology and Management (SSCI).
References

[1] J. Tao and T. Tan, "Affective Computing: A Review," Lecture Notes in Computer Science, vol. 3784, pp. 981-995, 2005.
[2] N. Thompson and T. J. McGill, "Affective Tutoring Systems: Enhancing e-Learning with the Emotional Awareness of a Human Tutor," International Journal of Information and Communication Technology Education, vol. 8, no. 4, pp. 75-89, 2012. DOI: 10.4018/jicte.2012100107
[3] P. Ekman, "Universals and Cultural Differences in Facial Expression of Emotion," Nebraska Symposium on Motivation, vol. 19, pp. 207-283, 1971.
[4] P. Ekman and W. Friesen, Facial Action Coding System: A Technique for the Measurement of Facial Movement, Consulting Psychologists Press, Palo Alto, 1978.
[5] M.-K. Jung and J.-K. Kim, "The Intelligent Determination Model of Audience Emotion for Implementing Personalized Exhibition," Journal of Intelligence and Information Systems, vol. 18, no. 1, pp. 39-57, 2012.
[6] K. C. Lin, T.-C. Huang, J. C.-S. Hung, N. Y. Yen, and S.-J. Chen, "Facial Emotion Recognition towards Affective Computing-based Learning," Library Hi Tech, vol. 31, no. 2, pp. 294-307, 2013. DOI: 10.1108/07378831311329068
[7] S. Kim, E. Ryoo, M. K. Jung, J. K. Kim, and H. Ahn, "Application of Support Vector Regression for Improving the Performance of the Emotion Prediction Model," Journal of Intelligence and Information Systems, vol. 18, no. 3, pp. 185-202, 2012.
[8] M. Nicolaou, H. Gunes, and M. Pantic, "Continuous Prediction of Spontaneous Affect from Multiple Cues and Modalities in Valence-Arousal Space," IEEE Transactions on Affective Computing, vol. 2, no. 2, pp. 92-105, 2011. DOI: 10.1109/T-AFFC.2011.9
[9] M. Nicolaou, H. Gunes, and M. Pantic, "Output-associative RVM regression for dimensional and continuous emotion prediction," Image and Vision Computing, vol. 30, no. 3, pp. 186-196, 2012. DOI: 10.1016/j.imavis.2011.12.005
[10] D. Basak, S. Pal, and D. C. Patranabis, "Support Vector Regression," Neural Information Processing - Letters and Reviews, vol. 11, no. 10, pp. 203-224, 2007.
[11] H. Ahn, K. Lee, and K.-j. Kim, "Global Optimization of Support Vector Machines Using Genetic Algorithms for Bankruptcy Prediction," Lecture Notes in Computer Science, vol. 4234, pp. 420-429, 2006.
[12] A. Mehrabian and J. A. Russell, An Approach to Environmental Psychology, MIT Press, 1974.
[13] J. A. Russell, "A Circumplex Model of Affect," Journal of Personality and Social Psychology, vol. 39, no. 6, pp. 1161-1178, 1980. DOI: 10.1037/h0077714
[14] V. Vapnik, Statistical Learning Theory, Wiley, New York, 1998.
[15] C. J. Lu, T. S. Lee, and C. C. Chiu, "Financial time series forecasting using independent component analysis and support vector regression," Decision Support Systems, vol. 47, no. 2, pp. 115-125, 2009. DOI: 10.1016/j.dss.2009.02.001
[16] V. Vapnik, S. Golowich, and A. Smola, "Support vector method for function approximation, regression estimation, and signal processing," in M. Mozer, M. Jordan, and T. Petsche (eds.), Advances in Neural Information Processing Systems 9, MIT Press, 1997.
[17] T. H. Hong and E. M. Kim, "The Prediction of Purchase Amount of Customers Using Support Vector Regression with Separated Learning Method," Journal of Intelligence and Information Systems, vol. 16, no. 4, pp. 213-225, 2010.
[18] V. N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 2000.
[19] Y.-H. Kim and Y. Yoon, "Effect of Changing the Basis in Genetic Algorithms Using Binary Encoding," KSII Transactions on Internet and Information Systems, vol. 2, no. 4, pp. 184-193, 2008. DOI: 10.3837/tiis.2008.04.002
[20] Y. Yang, W. Zheng, and S. Huang, "A Novel Automatic Block-based Multi-focus Image Fusion via Genetic Algorithm," KSII Transactions on Internet and Information Systems, vol. 7, no. 7, pp. 1671-1689, 2013. DOI: 10.3837/tiis.2013.07.009
[21] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, New York, 1989.
[22] H. Ahn and K.-j. Kim, "Global Optimization of Case-based Reasoning for Breast Cytology Diagnosis," Expert Systems with Applications, vol. 36, no. 1, pp. 724-734, 2009. DOI: 10.1016/j.eswa.2007.10.023
[23] H. Ahn and K.-j. Kim, "Bankruptcy Prediction Modeling with Hybrid Case-Based Reasoning and Genetic Algorithms Approach," Applied Soft Computing, vol. 9, no. 2, pp. 599-607, 2009. DOI: 10.1016/j.asoc.2008.08.002
[24] K.-j. Kim and H. Ahn, "Simultaneous optimization of artificial neural networks for financial forecasting," Applied Intelligence, vol. 36, no. 4, pp. 887-898, 2012. DOI: 10.1007/s10489-011-0303-2
[25] S.-K. Oh, H.-S. Park, C.-W. Jeong, and S.-C. Joo, "GA-based Feed-forward Self-organizing Neural Network Architecture and Its Applications for Multi-variable Nonlinear Process Systems," KSII Transactions on Internet and Information Systems, vol. 3, no. 3, pp. 309-330, 2009. DOI: 10.3837/tiis.2009.03.006
[26] C. L. Huang and C. J. Wang, "A GA-based feature selection and parameters optimization for support vector machines," Expert Systems with Applications, vol. 31, no. 2, pp. 231-240, 2006. DOI: 10.1016/j.eswa.2005.09.024
[27] K.-j. Kim and H. Ahn, "Optimization of Support Vector Machines for Financial Forecasting," Journal of Intelligence and Information Systems, vol. 17, no. 4, pp. 241-254, 2011.
[28] K.-Y. Chen, "Forecasting systems reliability based on support vector regression with genetic algorithms," Reliability Engineering & System Safety, vol. 92, no. 4, pp. 423-432, 2007. DOI: 10.1016/j.ress.2005.12.014
[29] K.-Y. Chen and C.-H. Wang, "Support vector regression with genetic algorithms in forecasting tourism demand," Tourism Management, vol. 28, no. 1, pp. 215-226, 2007. DOI: 10.1016/j.tourman.2005.12.018
[30] J. Huang, Y. Bo, and H. Wang, "Electromechanical equipment state forecasting based on genetic algorithm - support vector regression," Expert Systems with Applications, vol. 38, no. 7, pp. 8399-8402, 2011. DOI: 10.1016/j.eswa.2011.01.033
[31] S. Liu, H. Tai, Q. Ding, D. Li, L. Xu, and Y. Wei, "A hybrid approach of support vector regression with genetic algorithm optimization for aquaculture water quality prediction," Mathematical and Computer Modelling, vol. 58, no. 3-4, pp. 458-465, 2013. DOI: 10.1016/j.mcm.2011.11.021
[32] S. K. Lahiri and K. C. Ghanta, "Prediction of pressure drop of slurry flow in pipeline by hybrid support vector regression and genetic algorithm model," Chinese Journal of Chemical Engineering, vol. 16, no. 6, pp. 841-848, 2008. DOI: 10.1016/S1004-9541(09)60003-3
[33] C.-H. Wu, G.-H. Tzeng, and R.-H. Lin, "A novel hybrid genetic algorithm for kernel function and parameter optimization in support vector regression," Expert Systems with Applications, vol. 36, no. 3, pp. 4725-4735, 2009. DOI: 10.1016/j.eswa.2008.06.046
[34] W. He, Z. Wang, and H. Jiang, "Model optimizing and feature selecting for support vector regression in time series forecasting," Neurocomputing, vol. 72, no. 1-3, pp. 600-611, 2008. DOI: 10.1016/j.neucom.2007.11.010
[35] A. L. I. Oliveira, P. L. Braga, R. M. F. Lima, and M. L. Cornélio, "GA-based method for feature selection and parameters optimization for machine learning regression applied to software effort estimation," Information and Software Technology, vol. 52, no. 11, pp. 1155-1166, 2010. DOI: 10.1016/j.infsof.2010.05.009
[36] C.-F. Huang, "A hybrid stock selection model using genetic algorithms and support vector regression," Applied Soft Computing, vol. 12, no. 2, pp. 807-818, 2012. DOI: 10.1016/j.asoc.2011.10.009
[37] F. E. H. Tay and L. Cao, "Application of support vector machines in financial time series forecasting," Omega, vol. 29, no. 4, pp. 309-317, 2001. DOI: 10.1016/S0305-0483(01)00026-3
[38] M. Pantic and L. J. M. Rothkrantz, "Expert System for Automatic Analysis of Facial Expressions," Image and Vision Computing, vol. 18, no. 11, pp. 881-905, 2000. DOI: 10.1016/S0262-8856(00)00034-2
[39] C.-C. Chang and C.-J. Lin, "LIBSVM: a library for support vector machines," ACM Transactions on Intelligent Systems and Technology, vol. 2, no. 3, pp. 27:1-27:27, 2011.