Article Citation Information

Srivastava, R.K., Greff, K. and Schmidhuber, J. (2015) Highway Networks. Advances in Neural Information Processing Systems (NIPS), Montréal, 7-12 December 2015.

Cited by the following article:

  • Title: The Influence of the Amount of Parameters in Different Layers on the Performance of Deep Learning Models

    Authors: Xibin Yue, Xiaolin Hu, Liang Tang

    Keywords: Convolutional Neural Network, Recurrent Convolutional Neural Network, Deep Learning

    Journal: Computer Science and Application, Vol. 5, No. 12, 30 December 2015

    Abstract: In recent years, deep learning has been widely used in many pattern recognition tasks, including image classification and speech recognition, owing to its excellent performance, but a general rule for network structure design is still lacking. We explored the influence of the number of parameters in different layers of two deep learning models, the convolutional neural network (CNN) and the recurrent convolutional neural network (RCNN), through extensive experiments on three benchmark datasets: CIFAR-10, CIFAR-100 and SVHN. The results showed that when the total number of parameters was held roughly constant near the saturation point, increasing the number of parameters in higher layers improved model performance, while increasing the number of parameters in lower layers degraded it. Based on this simple rule, our RCNN model structures achieved the lowest single-model recognition error rates to date on CIFAR-100 and SVHN.
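
As a rough illustration of the parameter-redistribution rule described in the abstract, the sketch below (not from the paper; the 3×3 kernels, four-layer depth and channel widths are assumptions chosen for illustration) counts the parameters of two CNN channel configurations with nearly equal totals, one bottom-heavy and one top-heavy. Under the paper's finding, the top-heavy configuration would be expected to perform better at roughly the same parameter budget.

```python
# Hypothetical example: parameter counts for two 4-layer CNN configurations
# with approximately equal totals. Layer sizes are illustrative assumptions,
# not the architectures used in the cited paper.

def conv_params(c_in, c_out, k=3):
    """Parameters of one k x k convolution layer: weights plus biases."""
    return k * k * c_in * c_out + c_out

def total_params(channels, c_in=3):
    """Sum parameters over a stack of conv layers given output channel widths."""
    total = 0
    for c_out in channels:
        total += conv_params(c_in, c_out)
        c_in = c_out
    return total

# "Bottom-heavy": wide early layers, narrow later layers.
bottom_heavy = [128, 128, 64, 64]
# "Top-heavy": narrow early layers, wide later layers.
top_heavy = [64, 64, 128, 128]

print(total_params(bottom_heavy))  # 261888
print(total_params(top_heavy))     # 260160 -- nearly the same budget
```

Because a 3×3 convolution's parameter count scales with c_in * c_out, moving width from lower to higher layers keeps the total nearly fixed while concentrating capacity in the higher layers, which is exactly the redistribution the abstract reports as beneficial.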
