
Novel Multitask Conditional Neural-Network Surrogate Models for Expensive Optimization

Luo, Jianping*; Chen, Liang; Li, Xia; Zhang, Qingfu
Science Citation Index Expanded

Abstract

Multiple related tasks can be learned simultaneously by sharing information among them, which avoids tabula rasa learning and improves performance over the no-transfer case (i.e., when each task is learned in isolation). This study investigates multitask learning with conditional neural process (CNP) networks and proposes two multitask learning network models based on CNPs, namely, the one-to-many multitask CNP (OMc-MTCNP) and the many-to-many MTCNP (MMc-MTCNP). Compared with existing multitask models, the proposed models add an extensible correlation learning layer that learns the correlations among tasks. Moreover, the proposed multitask CNP (MTCNP) networks are used as surrogate models within a Bayesian optimization framework, replacing the Gaussian process (GP) and thereby avoiding its costly covariance computations. The proposed Bayesian optimization framework infers multiple tasks simultaneously, exploiting possible dependencies among them to share knowledge across tasks. The proposed surrogate models augment the observed dataset with a number of related tasks so that model parameters can be estimated confidently. Experimental studies under several scenarios indicate that the proposed algorithms are competitive with GP-, single-task-, and other multitask-model-based Bayesian optimization methods.
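The core idea of replacing the GP surrogate with a CNP in Bayesian optimization can be illustrated with a minimal, untrained NumPy mock-up. This is a sketch under stated assumptions, not the paper's exact design: the layer sizes, the softplus variance head, and the lower-confidence-bound acquisition rule are illustrative choices, and in practice the CNP would be trained on related tasks before use.

```python
import numpy as np

def mlp_params(sizes, rng):
    # Random weights: an untrained stand-in for a trained CNP (assumption).
    return [(rng.normal(0, 1 / np.sqrt(m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # Simple tanh MLP; linear final layer.
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

class CNPSurrogate:
    """Minimal conditional neural process: encode (x, y) context pairs,
    aggregate by mean pooling, decode a Gaussian predictive at target x."""
    def __init__(self, dim_x=1, dim_r=8, rng=None):
        if rng is None:
            rng = np.random.default_rng(0)
        self.enc = mlp_params([dim_x + 1, 16, dim_r], rng)
        self.dec = mlp_params([dim_x + dim_r, 16, 2], rng)

    def predict(self, xc, yc, xt):
        # Permutation-invariant context representation r.
        r = mlp(self.enc, np.concatenate([xc, yc], axis=1)).mean(axis=0)
        rt = np.tile(r, (len(xt), 1))
        out = mlp(self.dec, np.concatenate([xt, rt], axis=1))
        mu, log_sigma = out[:, 0], out[:, 1]
        sigma = 0.01 + np.log1p(np.exp(log_sigma))  # softplus keeps std positive
        return mu, sigma

# One Bayesian optimization step on a toy objective (minimization):
# the CNP's predictive mean/std drive a lower-confidence-bound acquisition,
# in place of a GP posterior.
def f(x):
    return np.sin(3 * x) + x ** 2

rng = np.random.default_rng(0)
xc = rng.uniform(-1, 1, (5, 1))     # observed context points
yc = f(xc)
xt = np.linspace(-1, 1, 101)[:, None]  # candidate targets

model = CNPSurrogate(rng=rng)
mu, sigma = model.predict(xc, yc, xt)
x_next = xt[np.argmin(mu - 2.0 * sigma)]  # next point to evaluate
```

The multitask variants in the paper extend this single-task sketch by sharing the encoder across tasks and adding a correlation learning layer, so that context data from related tasks also shapes the predictive distribution.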

Keywords

Task analysis; Correlation; Optimization; Neural networks; Bayes methods; Linear programming; Computational modeling; Evolutionary optimization; Gaussian process (GP); multitask neural network; surrogate model