局部极小值

  • 网络: local minimum; local minima
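
For reference, the headword admits the standard textbook definition below (a general definition, not drawn from any of the example sentences):

$$
x^{*} \text{ is a local minimum of } f \iff \exists\,\delta > 0 \ \text{such that}\ f(x^{*}) \le f(x)\ \text{for all } x \text{ with } \lVert x - x^{*} \rVert < \delta .
$$

A global minimum requires the inequality to hold for every $x$ in the domain; gradient-based methods such as BP can stall at any local minimum, which is why the sentences below treat it as a defect to be overcome.
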
  1. 本文针对粒子群算法易出现早熟陷入局部极小值问题对标准粒子群算法进行改进并将改进的粒子群算法应用于BP神经网络中。

    To address the problem that the particle swarm algorithm is prone to premature convergence and getting trapped in local minima, this thesis improves the standard particle swarm algorithm and then applies the improved algorithm to the BP neural network.

  2. 特别地,在最优化、神经控制、信号处理和模式区别的应用中,须要求网络避免伪响应或发生局部极小值,因此,ANN网络的全局渐近稳定性是研究的核心问题。

    In particular, in applications such as optimization, neural control, signal processing and pattern recognition, the network must avoid spurious responses and local minima, so the global asymptotic stability of ANNs has become the core problem of this research.

  3. 当共混比为50/50时,B产生局部极小值现象。

    B exhibits a local minimum when the blending ratio is 50/50.

  4. BP网络有易陷入局部极小值的缺点。

    A drawback of the BP network is that it easily falls into local minima.

  5. 避免路径规划NA算法局部极小值的仿真试验

    Simulation experiments on avoiding local minima in the NA algorithm for path planning

  6. BP网络能较好地解决贮灰坝监测系统的预报问题,但容易陷入局部极小值;

    The BP network can solve the prediction problem of the ash-storage dam monitoring system well, but it easily falls into local minima.

  7. 将混沌优化算法与BP算法相结合,使模糊神经网络克服了容易陷入局部极小值的缺点。

    Combining the chaos optimization algorithm with the BP algorithm enables the fuzzy neural network to overcome its tendency to fall into local minima.

  8. 因此,克服了传统的BP学习算法过程中难以跳出局部极小值与收敛速度慢的缺点。

    It thus overcomes the drawbacks of the traditional BP learning algorithm, namely the difficulty of escaping local minima and the slow convergence speed.

  9. 采用GA作为训练算法,解决ANN局部极小值问题;

    In this thesis, GA is adopted as the training algorithm to solve the local minimum problem of the ANN.

  10. 对于传统BP算法存在的收敛速度慢和易陷入局部极小值问题,人们提出了径向基函数网络。

    Radial basis function networks were proposed to address the slow convergence and the tendency to fall into local minima of the conventional BP algorithm.

  11. 由于标准的BP算法在训练系统网络时,不仅收敛速度慢,而且其最优解很容易陷入局部极小值,这就在一定程度上限制了鉴定系统的稳定性和准确率。

    Because the standard BP algorithm not only converges slowly when training the network but also lets the optimal solution easily fall into local minima, the stability and accuracy of the identification system are limited to some extent.

  12. 先对传统的BP人工神经网络进行了分析,针对其收敛速度慢,存在局部极小值的缺点提出了一种改进后的BP人工神经网络。

    The conventional BP artificial neural network is analyzed first, and an improved BP neural network is then proposed to overcome its slow convergence and its tendency to fall into local minima.

  13. 为克服BP算法收敛速度慢,易陷入局部极小值等的缺点,提高BP预测精度等性能,提出了变共轭梯度法(VCG)。

    To overcome the slow convergence and the tendency to fall into local minima of the BP algorithm, and to improve the prediction accuracy of the BP network, the variable conjugate gradient (VCG) algorithm was proposed.

  14. 这种方法解决了BP神经网络中存在的收敛速度慢、训练容易陷入瘫痪和容易陷入局部极小值等问题。

    This approach solves the problems of training a BP neural network, such as slow convergence, training paralysis, and easily falling into local minima.

  15. 但是在训练过程中BP神经网络的寻优不具有唯一性,容易陷入局部极小值,从而可能造成误判。

    However, the optimum found when training a BP neural network is not unique; the training easily falls into local minima, which may lead to misjudgment.

  16. 但是传统的BP神经网络由于本身存在收敛速度慢,人工神经网络易陷入局部极小值的原因,使得其在应用方面具有一定的局限性。

    However, because the traditional BP neural network converges slowly and easily becomes trapped in local minima, its applications are somewhat limited.

  17. 在Needle方法陷入局部极小值点时,可以运用增加总厚度的方法来使优化过程不断地进行下去。一般采用增加薄膜中某一层的厚度。

    When the Needle method is trapped in a local minimum, the optimization can be kept going by increasing the total thickness, usually by increasing the thickness of one layer of the film.

  18. 然而,k-means算法具有对初始值敏感、易陷入局部极小值等缺点。

    However, the k-means algorithm has two serious defects: it is sensitive to the initial values and easily falls into local minima (see the sketch below).
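
Item 18 (like item 20 below) concerns k-means being sensitive to initialization and falling into local minima. A common generic mitigation, not the specific improved algorithm any of the cited papers propose, is to run Lloyd's algorithm from several random starts and keep the solution with the lowest within-cluster sum of squares; a minimal self-contained sketch on toy data:

```python
import numpy as np

def kmeans_once(X, k, rng, n_iter=100):
    """One run of Lloyd's algorithm from a random initialization."""
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assign every point to its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its points (keep it if the cluster is empty).
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    labels = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
    inertia = ((X - centers[labels]) ** 2).sum()   # within-cluster sum of squares
    return centers, labels, inertia

def kmeans_restarts(X, k, n_restarts=10, seed=0):
    """Run k-means several times and keep the lowest-inertia run, which
    reduces (but does not eliminate) the risk of a poor local minimum."""
    rng = np.random.default_rng(seed)
    return min((kmeans_once(X, k, rng) for _ in range(n_restarts)),
               key=lambda run: run[2])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy data: three well-separated Gaussian blobs in the plane.
    X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in [(0, 0), (3, 3), (0, 3)]])
    centers, labels, inertia = kmeans_restarts(X, k=3)
    print("best inertia:", round(float(inertia), 2))
```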

  19. 该算法可以有效抵消非线性噪声,但由于BP神经网络存在容易陷入局部极小值、收敛速度慢等不足,所以该算法还需要进一步优化。

    This algorithm can effectively cancel nonlinear noise, but because the BP neural network converges slowly and easily falls into local minima, the algorithm still needs further optimization.

  20. 实验结果证明,该算法有很好的全局收敛性,不仅有效地克服了传统的k均值算法易陷入局部极小值和对初始值敏感的问题,而且具有较快的收敛速度。

    The experimental results show that the algorithm has good global convergence: it not only effectively overcomes the traditional k-means algorithm's tendency to fall into local minima and its sensitivity to the initial values, but also converges quickly.

  21. 通过混沌遍历,可使Hopfield网络在整个相空间进行搜索,从而避免网络在运行过程中陷入局部极小值。

    Through chaotic ergodicity, the Hopfield network can search the entire phase space and thus avoid being trapped in local minima during its operation.

  22. 针对传统BP算法即梯度下降法,收敛速度慢、容易陷入局部极小值等缺点,提出了基于附加动量法和自适应学习速率的改进方法。

    To address the drawbacks of the traditional BP algorithm, i.e. gradient descent, such as slow convergence and easily falling into local minima, an improved method based on an additional momentum term and an adaptive learning rate is proposed (see the sketch below).
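
Item 22 mentions two classic remedies for plain gradient descent: an additional momentum term and an adaptive learning rate. The following is a minimal sketch of that general idea on a one-dimensional function with two minima, using common textbook-style update rules rather than the exact scheme of the cited paper; the function f and all constants here are illustrative choices.

```python
def f(x):
    # Illustrative function with a shallow local minimum near x ~ +1.35
    # and a deeper (global) minimum near x ~ -1.45.
    return x**4 - 4 * x**2 + x

def grad(x):
    return 4 * x**3 - 8 * x + 1

def gd_momentum_adaptive(x0, lr=0.05, beta=0.9, n_steps=300):
    """Gradient descent with an additional momentum term plus a simple
    adaptive learning rate: grow the step while the loss keeps falling,
    shrink it (and reset the momentum) when a step overshoots."""
    x, v, best = x0, 0.0, f(x0)
    for _ in range(n_steps):
        v = beta * v - lr * grad(x)      # momentum accumulates past gradients
        x_trial = x + v
        if f(x_trial) < best:            # successful step: accept and speed up
            x, best = x_trial, f(x_trial)
            lr *= 1.05
        else:                            # overshoot: reject, damp, slow down
            v = 0.0
            lr *= 0.7
    return x

if __name__ == "__main__":
    # Different starting points can still end in different minima,
    # which is exactly the local-minimum issue these sentences describe.
    for x0 in (2.0, 0.5, -2.0):
        x_end = gd_momentum_adaptive(x0)
        print(f"start {x0:+.1f} -> ends near x = {x_end:+.3f}, f = {f(x_end):+.3f}")
```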

  23. 具有高度非线性和很强的自学习能力的BP神经网络使预知维修成为可能,但由于BP算法存在收敛速度慢、易出现局部极小值等缺陷。

    With its high nonlinearity and strong self-learning ability, the BP neural network makes predictive maintenance possible; however, the BP algorithm converges slowly and easily falls into local minima.

  24. 该方法用粒子群优化算法来优化神经网络的初始权值,克服了BP神经网络易陷入局部极小值、收敛速度慢和泛化性差等缺点。

    The method uses the PSO algorithm to optimize the initial weights of the neural network, overcoming the disadvantages of the BP neural network such as easily falling into local minima, slow convergence, and poor generalization (see the sketch below).
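
Items 1 and 24 describe using particle swarm optimization as a global search that is less prone than gradient-based BP training to getting stuck in local minima. Below is a minimal sketch of a plain global-best PSO loop minimizing the Rastrigin test function as a stand-in objective; in the cited papers the objective would instead be the network's training error over its weight vector, and the inertia and acceleration constants used here are common defaults, not values taken from any of these papers.

```python
import numpy as np

def rastrigin(x):
    """A standard multimodal test function with many local minima;
    the global minimum is 0 at x = 0."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def pso(objective, dim, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, bound=5.12, seed=0):
    """Plain global-best PSO: each particle is pulled toward its own best
    position and toward the best position found by the whole swarm."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-bound, bound, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                                      # velocities
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()

    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, -bound, bound)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[pbest_val.argmin()].copy()
    return gbest, gbest_val

if __name__ == "__main__":
    best_x, best_val = pso(rastrigin, dim=5)
    print("best value found:", round(float(best_val), 4))
```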

  25. 为克服这些困难,本文采用新的快速训练算法,同时引入全局优化算法&遗传算法(GA),克服局部极小值。

    To overcome these difficulties, this thesis adopts new fast training algorithms and introduces a global optimization algorithm, the genetic algorithm (GA), to overcome the local minimum problem.

  26. 经典LBG算法的局部极小值问题是制约其性能的重要因素。

    The local minimum problem is an important factor that restricts the performance of the classic LBG algorithm.

  27. 在提取人头部磁共振图像T2加权像的横断面的脑轮廓时,因为在图像内部存在许多的局部极小值,严重的影响了算法向全局极小收敛。

    When deformable models are used to extract the brain contour from axial T2-weighted head MRI images, the many local minima inside the image severely hinder the algorithm's convergence to the global minimum.

  28. 结果显示,采用本模型可缩短训练时间,避免训练陷入局部极小值,并可对近期4d的车流量差值进行可靠预测。

    The results indicate that the model shortens the training time, keeps the training from falling into local minima, and reliably predicts the traffic-flow differences over the recent four days.

  29. 拉格朗日对象松弛技术能够通过调节拉格朗日因子帮助HNN脱离目前的局部极小值。

    The Lagrangian relaxation technique can help the HNN escape from its current local minimum by adjusting the Lagrange multipliers.

  30. BP算法因收敛速度慢、易于陷入局部极小值等缺点,使得对于较大的搜索空间、多峰值和不可微函数常常不能搜索到全局极小点,这些制约了BP网络在各个领域中的应用。

    The BP algorithm converges slowly and easily falls into local minima, so it often fails to find the global minimum for multimodal or non-differentiable functions over large search spaces, which restricts the application of BP networks in many fields.