
Correlation Expert Tuning System for Performance Acceleration

Chai, Yanfeng; Ge, Jiake; Zhang, Qiang; Chai, Yunpeng; Wang, Xin*; Zhang, Qingpeng
Beijing Jiaotong University; Tianjin University

Abstract

One configuration cannot fit all workloads and the diverse resource limitations in modern databases. Auto-tuning methods based on reinforcement learning (RL) normally depend on an exhaustive offline training process that requires a huge number of performance measurements, many of which cover inefficient knob combinations explored by trial and error. The most time-consuming part of this process is not training the RL network but taking the performance measurements needed to obtain reward values for target goals such as higher throughput or lower latency. In other words, the whole process can almost be regarded as a zero-knowledge method, with no experience or rules to constrain it. We therefore propose a correlation expert tuning system (CXTuning) for acceleration, which contains a correlation knowledge model to remove unnecessary training costs and a multi-instance mechanism (MIM) to support fine-grained tuning for diverse workloads. The models define the importance of and correlations among configuration knobs with respect to the user's specified target. However, knob-based optimization should not be the final destination for auto-tuning, so we further integrate an abstracted architectural optimization method into CXTuning as part of the progressive expert knowledge tuning (PEKT) algorithm. Experiments show that CXTuning effectively reduces training time and achieves additional performance improvement compared with the state-of-the-art auto-tuning method.
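The abstract's core idea, pruning inefficient knob combinations with expert correlation rules before paying for costly performance measurements, can be illustrated with a minimal Python sketch. This is not the paper's CXTuning implementation; the knob names, the rules, and the measure_throughput stand-in are hypothetical placeholders chosen only to show how such rules shrink the measurement budget.

    # Illustrative sketch only: expert correlation rules filter out knob combinations
    # known to be inefficient, so the tuner never spends a benchmark run on them.
    # All knob names, rules, and functions below are hypothetical placeholders.
    import itertools
    import random

    # Hypothetical candidate values for a few database knobs.
    KNOB_SPACE = {
        "buffer_pool_mb": [512, 1024, 2048, 4096],
        "io_threads": [2, 4, 8, 16],
        "log_flush_interval_ms": [0, 100, 1000],
    }

    # Assumed expert rules: each one rejects configurations considered inefficient.
    RULES = [
        # Many IO threads are wasted when the buffer pool is tiny (assumed rule).
        lambda c: not (c["io_threads"] >= 16 and c["buffer_pool_mb"] <= 512),
        # Aggressive flushing plus a large pool rarely helps throughput (assumed rule).
        lambda c: not (c["log_flush_interval_ms"] == 0 and c["buffer_pool_mb"] >= 4096),
    ]

    def candidate_configs():
        keys = list(KNOB_SPACE)
        for values in itertools.product(*(KNOB_SPACE[k] for k in keys)):
            yield dict(zip(keys, values))

    def passes_rules(config):
        return all(rule(config) for rule in RULES)

    def measure_throughput(config):
        # Stand-in for an expensive benchmark run that would feed the RL reward.
        return random.random()

    if __name__ == "__main__":
        total = list(candidate_configs())
        pruned = [c for c in total if passes_rules(c)]
        print(f"measuring {len(pruned)} of {len(total)} knob combinations")
        best = max(pruned, key=measure_throughput)
        print("best configuration found:", best)

In this toy setup the rules remove a fraction of the 48 possible combinations before any measurement is taken; the paper's correlation knowledge model plays the analogous role of constraining the RL tuner's search space.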

Keywords

Auto-tuning; Database optimization; Correlation expert rules; Reinforcement learning; Training time reduction