Several recent successes in deep learning (DL), such as state-of-the-art performance on several image classification benchmarks, have been achieved through improved configuration. Hyperparameter (HP) tuning is a key factor affecting the performance of machine learning (ML) algorithms, and various state-of-the-art DL models use different HPs in different ways for classification tasks on different datasets. This manuscript provides a brief overview of learning parameters and configuration techniques, and shows the benefits of using a large-scale hand-drawn sketch dataset for classification problems. We analyzed the impact of different learning parameters and top-layer configurations, with batch normalization (BN) and dropout, on the performance of the pre-trained Visual Geometry Group 19 (VGG-19) network. The analyzed learning parameters include different learning rates and momentum values for two optimizers, stochastic gradient descent (SGD) and Adam. Our analysis demonstrates that using the SGD optimizer with small learning rates and high momentum values, together with both BN and dropout in the top layers, improves sketch image classification accuracy.
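As a rough illustration of the setup described above (not the authors' exact implementation), the configuration can be sketched in PyTorch as follows: a pre-trained VGG-19 backbone whose top (classifier) layers are rebuilt with BN and dropout, trained with SGD using a small learning rate and a high momentum value. The number of sketch classes, layer widths, dropout rate, learning rate, and momentum are illustrative assumptions chosen to match the qualitative description, and the torchvision >= 0.13 weights API is assumed.

import torch
import torch.nn as nn
from torchvision import models

num_classes = 250  # assumption: number of categories in the sketch dataset

# Load VGG-19 pre-trained on ImageNet.
model = models.vgg19(weights="IMAGENET1K_V1")

# Freeze the convolutional feature extractor; only the top layers are tuned.
for param in model.features.parameters():
    param.requires_grad = False

# Replace the top (classifier) layers with a configuration that uses both
# batch normalization and dropout, as analyzed in the manuscript.
model.classifier = nn.Sequential(
    nn.Linear(512 * 7 * 7, 4096),
    nn.BatchNorm1d(4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),
    nn.Linear(4096, 4096),
    nn.BatchNorm1d(4096),
    nn.ReLU(inplace=True),
    nn.Dropout(p=0.5),
    nn.Linear(4096, num_classes),
)

# SGD with a small learning rate and high momentum (values are illustrative).
optimizer = torch.optim.SGD(model.classifier.parameters(), lr=1e-3, momentum=0.9)
criterion = nn.CrossEntropyLoss()

With this kind of sketch, the same training loop can be rerun while swapping the optimizer (SGD vs. Adam) or varying the learning rate and momentum values, which mirrors the comparisons reported in the manuscript.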