Combined Prediction Model of Network Traffic Based on Kohonen Neural Network
DOI: 10.12677/HJWC.2014.46018
Authors: Tang Xingle, Sun Wensheng, Yao Jinsong: School of Communication Engineering, Hangzhou Dianzi University, Hangzhou
Keywords: Wavelet Transform, Neural Network, Traffic Prediction, Self-Organizing Mapping
Abstract: To address the shortcomings of traditional prediction models, namely low prediction accuracy, heavy dependence on training data, and an inability to capture the characteristics of network traffic well, this paper proposes a hybrid traffic prediction model. Exploiting the Kohonen neural network's fast learning rate, high classification accuracy, and strong noise resistance, the model uses the wavelet transform to decompose network traffic into a high-frequency part and a low-frequency part: the high-frequency part is predicted with a Kohonen neural network, and the low-frequency part with an autoregressive (AR) model. Matlab simulations show that this combined prediction model improves prediction accuracy for nonlinear network traffic that varies over multiple time scales.
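The decompose-predict-recombine pipeline described in the abstract can be sketched in code. The paper itself used Matlab and does not publish its implementation; the following is a minimal Python illustration under stated assumptions: a hand-rolled one-level Haar transform stands in for the wavelet decomposition, a least-squares fit stands in for the AR model, and a toy one-dimensional self-organizing map (each unit stores a prototype window plus the value that followed it) stands in for the Kohonen predictor. All function names, class names, and parameter values here are illustrative, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_split(x):
    """One-level Haar wavelet transform: returns (approximation, detail)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # low-frequency trend
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # high-frequency fluctuation
    return a, d

def ar_predict(series, order=4):
    """Fit an AR(order) model by least squares and forecast one step ahead."""
    s = np.asarray(series, dtype=float)
    X = np.column_stack([s[i:len(s) - order + i] for i in range(order)])
    y = s[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(s[-order:] @ coef)

class KohonenPredictor:
    """Toy 1-D self-organizing map over sliding windows. Each unit learns a
    prototype window and the value that follows it, so the best-matching
    unit's stored value serves as a one-step forecast."""

    def __init__(self, n_units=8, window=4, epochs=50, lr=0.5):
        self.n_units, self.window, self.epochs, self.lr = n_units, window, epochs, lr

    def fit(self, series):
        s = np.asarray(series, dtype=float)
        wins = np.array([s[i:i + self.window] for i in range(len(s) - self.window)])
        nxt = s[self.window:]
        self.W = wins[rng.choice(len(wins), self.n_units)]    # prototype windows
        self.out = nxt[rng.choice(len(nxt), self.n_units)]    # per-unit forecasts
        for e in range(self.epochs):
            lr = self.lr * (1 - e / self.epochs)              # decaying learning rate
            for w, t in zip(wins, nxt):
                bmu = int(np.argmin(((self.W - w) ** 2).sum(axis=1)))
                for j in range(self.n_units):                  # neighborhood decays with
                    h = np.exp(-abs(j - bmu))                  # distance from the BMU
                    self.W[j] += lr * h * (w - self.W[j])
                    self.out[j] += lr * h * (t - self.out[j])
        return self

    def predict(self, series):
        w = np.asarray(series[-self.window:], dtype=float)
        bmu = int(np.argmin(((self.W - w) ** 2).sum(axis=1)))
        return float(self.out[bmu])

# Hybrid forecast on a synthetic "traffic" trace (trend + periodicity + noise).
t = np.arange(256)
traffic = 10 + 0.02 * t + np.sin(2 * np.pi * t / 16) + 0.3 * rng.standard_normal(256)
approx, detail = haar_split(traffic)
low_pred = ar_predict(approx)                                # AR on the smooth part
high_pred = KohonenPredictor().fit(detail).predict(detail)   # SOM on the fluctuation
# Inverse Haar of one coefficient pair gives the next even-indexed sample.
forecast = (low_pred + high_pred) / np.sqrt(2.0)
print(round(forecast, 3))
```

The split of labor mirrors the paper's rationale: the AR model suits the slowly varying low-frequency component, while the SOM's noise tolerance suits the bursty high-frequency component.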
Article citation: Tang, X.L., Sun, W.S. and Yao, J.S. (2014) Combined Prediction Model of Network Traffic Based on Kohonen Neural Network. Hans Journal of Wireless Communications, 4, 113-119. http://dx.doi.org/10.12677/HJWC.2014.46018
