
CN110837886A - A Soft-Sensing Method for Effluent NH4-N Based on ELM-SL0 Neural Network - Google Patents


Info

Publication number
CN110837886A
CN110837886A (application CN201911030774.8A)
Authority
CN
China
Prior art keywords
network
output
input
weight
function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911030774.8A
Other languages
Chinese (zh)
Inventor
杨翠丽
聂凯哲
乔俊飞
武战红
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Technology
Priority to CN201911030774.8A
Publication of CN110837886A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G06N3/048 Activation functions
    • G06N3/08 Learning methods
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/26 Government or public services

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Tourism & Hospitality (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Measuring Or Testing Involving Enzymes Or Micro-Organisms (AREA)

Abstract

The invention discloses a soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network, belonging to the fields of water treatment and intelligent information control. The main procedure is as follows: an L0 regularization penalty term is first added to the conventional error function so that unimportant weights are driven toward 0, and the improved error function is then minimized with a batch gradient descent algorithm, which trains and prunes the network. A soft-sensing method for effluent NH4-N based on a neural network constructed by these steps falls within the scope of protection of the invention. By combining the regularization technique with the batch gradient algorithm to optimize the ELM network structure, the invention reduces the computational complexity of the network, improves prediction accuracy, and enhances generalization performance.

Description

A Soft-Sensing Method for Effluent NH4-N Based on an ELM-SL0 Neural Network

Technical Field

Aiming at the difficulty of measuring ammonia nitrogen concentration in the wastewater treatment process, the invention applies a batch gradient descent algorithm combined with L0 regularization to a neural network to predict the ammonia nitrogen concentration during treatment. Neural networks are one of the main branches of intelligent information processing technology, so a neural-network-based technique for predicting the ammonia nitrogen concentration of sewage belongs not only to the field of water treatment but also to the field of intelligent information.

Background

With the rapid urbanization and industrialization of modern society, China's water environment has been seriously damaged. Sewage discharge not only disrupts residents' daily lives but also upsets the ecological balance of nature. To reduce sewage discharge and enable water reuse, sewage treatment plants have been built across the country. In the treatment process, the NH4-N concentration is an important parameter for evaluating the performance of a wastewater treatment plant (WWTP). However, the treatment process is a complex system with strong nonlinearity, large lag, large time variation, and multivariable coupling, and measurement instruments are costly to maintain, so predicting NH4-N remains an open problem. Low-cost, efficient prediction of the effluent NH4-N concentration is therefore essential both for effluent-quality compliance assessment and for the stable operation of sewage treatment plants.

Soft-sensing methods use easily measured variables and a constructed model to predict hard-to-measure variables in real time, providing an efficient and fast solution for measuring key water-quality parameters in the wastewater treatment process. Thanks to their learning ability, information-processing capacity, and adaptivity, neural networks can approximate nonlinear systems with high accuracy. The invention designs a soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network to realize online prediction of the effluent NH4-N concentration.

Summary of the Invention

A soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network. The main procedure is as follows: an L0 regularization penalty term is first added to the conventional error function so that unimportant weights are driven toward 0, and the improved error function is then minimized with a batch gradient descent algorithm, which trains and prunes the network simultaneously. The method exploits the learning ability of the neural network: the output weights are optimized according to the training error, unimportant output weights are eliminated, and the ammonia nitrogen concentration in the wastewater treatment process is then predicted with minimal error and a sparser network structure. The method comprises the following steps:

Step 1: Initialize the network structure and parameters

Step 1.1: Initialize the network structure

Temperature, dissolved oxygen, total suspended solids, pH, and effluent oxidation-reduction potential are taken as input variables, and the ammonia nitrogen concentration as the output variable. The network structure is set to 5-N-1, where N is the number of hidden (reservoir) nodes. For a typical network of this kind, N is chosen in the range 50 ≤ N ≤ 1000; to observe the pruning effect of the proposed algorithm clearly, N should not be too small. Here N = 500, i.e., the network has 5 input nodes, 500 hidden nodes, and 1 output node.

Step 1.2: Initialize the network parameters

The sigmoid function is used as the network activation function G(·). Set the initial iteration count i = 0 and the maximum number of iterations imax ≥ 5000. The training set is {(uk, tk)}, k = 1, …, L, where uk ∈ R^n is the k-th input sample (here n = 5), tk is the k-th measured output value, and L is the total number of samples. The input weights Win and the threshold vector b are randomly initialized in (0, 1), and the initial output weight vector is set to W = 0.
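As an illustration, Step 1 can be sketched in NumPy as follows (a minimal sketch; the names `W_in` and `W_out` are ours, introduced to distinguish the randomly fixed input weights from the trainable output weights):

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, N, n_out = 5, 500, 1                       # 5-N-1 structure, N = 500
W_in = rng.uniform(0.0, 1.0, size=(N, n_in))     # input weights in (0, 1)
b = rng.uniform(0.0, 1.0, size=N)                # thresholds in (0, 1)
W_out = np.zeros(N)                              # initial output weights W = 0

def sigmoid(x):
    """Network activation function G(.)."""
    return 1.0 / (1.0 + np.exp(-x))
```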

Step 2: Determine the learning rate η and the regularization parameter λ by grid search

(1) First, set the regularization parameter to zero (λ = 0), sweep the learning rate over [0.0005, 0.01] in steps of 0.0005, run the training program, and select the learning rate η that gives the smallest training error.

(2) With the optimal learning rate η fixed, sweep the regularization parameter over [0.0025, 0.05] in steps of 0.0025 and select the λ that yields the best sparsity without degrading the training error.
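The two-stage grid search above can be sketched as follows. The routine is generic: `train(eta, lam)` is a stand-in for one full training run returning a (training error, sparsity) pair, and the 5% error tolerance used to decide that λ "does not affect" the training error is our assumption, not a value from the patent:

```python
def grid_search(train, etas, lams, tol=1.05):
    """Stage 1: with lam = 0, pick the learning rate with the smallest
    training error.  Stage 2: at that learning rate, among the lam values
    whose error stays within `tol` of the baseline, pick the one with the
    best sparsity."""
    best_eta = min(etas, key=lambda e: train(e, 0.0)[0])
    base_err, _ = train(best_eta, 0.0)
    ok = [l for l in lams if train(best_eta, l)[0] <= base_err * tol]
    best_lam = max(ok, key=lambda l: train(best_eta, l)[1]) if ok else 0.0
    return best_eta, best_lam

etas = [round(0.0005 * k, 4) for k in range(1, 21)]   # [0.0005, 0.01]
lams = [round(0.0025 * k, 4) for k in range(1, 21)]   # [0.0025, 0.05]
```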

Step 3: Compute the network output yk and the prediction error dk for the k-th input sample

For a given activation function G(·), input sample uk, input weight matrix Win, and threshold vector b, the hidden-layer output is:

G(Win, b, uk) = [g1(W1·uk + b1), …, gN(WN·uk + bN)]^T  (1)

where gj (1 ≤ j ≤ N) is the activation function of the j-th hidden neuron, Wj·uk (1 ≤ j ≤ N) is the inner product of the input weight vector Wj between the j-th hidden neuron and the input layer with the input vector uk, and bj (1 ≤ j ≤ N) is the threshold of the j-th hidden neuron.

For the k-th input sample, the network output yk is obtained from:

yk = W·G(Win·uk + b)  (2)

where W is the output weight vector and Win the input weight matrix of equation (1). The training error dk between the desired output tk and the actual output yk is defined as:

dk = tk - yk  (3)
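Equations (1) to (3) can be sketched as follows (a minimal illustration with a small hidden layer standing in for the 500-node one; `W_in` is the input weight matrix and `W` the output weight vector):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(W, W_in, b, u):
    """Eq. (1): hidden output G(W_in.u + b); Eq. (2): y = W.G."""
    g = sigmoid(W_in @ u + b)          # (N,) hidden-layer outputs
    return float(W @ g)                # scalar network output y_k

rng = np.random.default_rng(1)
N, n_in = 8, 5                         # small stand-in for the 500-node net
W_in = rng.uniform(0, 1, (N, n_in))
b = rng.uniform(0, 1, N)
W = np.zeros(N)                        # initial output weights

u_k, t_k = rng.uniform(0, 1, n_in), 0.7
y_k = forward(W, W_in, b, u_k)
d_k = t_k - y_k                        # Eq. (3): training error
```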

Step 4: Compute the output-weight gradient and update the output weights

Define the standard mean-square error function as:

E(W) = (1/2) Σ(k=1..L) dk²  (4)

where dk = tk - yk is the training error of equation (3).

An L0 regularization term is added to the error function, giving the improved error function:

Ẽ(W) = E(W) + λ‖W‖0  (5)

where ‖W‖0, the L0 norm of W, counts the nonzero entries of the output weight vector:

‖W‖0 = #{j : Wj ≠ 0}, 1 ≤ j ≤ N  (6)

where Wj (1 ≤ j ≤ N) is the j-th output weight.

However, the L0 norm is a non-convex function, so equation (5) is an NP-hard combinatorial minimization problem. To overcome this, the L0 norm is approximated by a continuously differentiable function f(·). The element-wise function f(γ, Wj) is defined as:

f(γ, Wj) = Wj² / (Wj² + γ)  (7)

so that the L0 norm is approximated by:

‖W‖0 ≈ Σ(j=1..N) f(γ, Wj)  (8)

Here γ is a positive number controlling how closely f(γ, Wj) approximates ‖W‖0: when γ is large, f(γ, Wj) prunes the weight vector only weakly; when γ is close to 0, f(γ, Wj) prunes the nonzero elements of the weight vector W more effectively. In this patent γ = 0.05. The first derivative of f(γ, Wj) is then:

f′(γ, Wj) = 2γWj / (Wj² + γ)²  (9)
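As a sketch of the smoothed-L0 machinery, the following uses f(γ, w) = w²/(w² + γ), one common continuously differentiable surrogate with the properties described in the text; since the formulas in the original are equation images, this exact form is an assumption:

```python
import numpy as np

GAMMA = 0.05   # gamma = 0.05, the value used in the patent

def f(w, gamma=GAMMA):
    """Assumed smooth element-wise surrogate of the L0 indicator:
    f(0) = 0 and f -> 1 as |w| >> gamma."""
    w = np.asarray(w, dtype=float)
    return w**2 / (w**2 + gamma)

def f_prime(w, gamma=GAMMA):
    """First derivative of the surrogate (analogue of eq. (9))."""
    w = np.asarray(w, dtype=float)
    return 2.0 * gamma * w / (w**2 + gamma)**2

def smoothed_l0(W, gamma=GAMMA):
    """||W||_0 approximated by sum_j f(gamma, W_j) (analogue of eq. (8))."""
    return float(np.sum(f(W, gamma)))
```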

Equation (5) is accordingly updated to:

Ẽ(W) = E(W) + λ Σ(j=1..N) f(γ, Wj)  (10)

A batch gradient descent algorithm is introduced. Starting from the initial output weights W = W0, the gradient of Ẽ(W) is:

∇Ẽ(Wi) = ∇E(Wi) + λ∇F(Wi)  (11)

where ∇E(Wi) is the gradient of E(W) at the i-th iteration and ∇F(Wi) is the gradient of the regularization term Σ(j=1..N) f(γ, Wj) at the i-th iteration, whose j-th component is f′(γ, Wj) from equation (9).

The output-weight update formula is therefore:

Wi+1 = Wi - η∇Ẽ(Wi)  (12)

where Wi+1 is the output weight vector at iteration i+1 and Wi that at iteration i. Each time the output weights are updated, i is incremented by 1, i.e., i = i + 1.
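One batch-gradient update of the output weights, combining the error gradient with the regularization gradient, might look as follows (a sketch under the same assumed surrogate f(γ, w) = w²/(w² + γ); dimensions are toy-sized):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bgd_step(W, U, t, W_in, b, eta, lam, gamma=0.05):
    """One step of eq. (12): W <- W - eta * gradient of the improved error,
    where the improved error is E(W) + lam * sum_j f(gamma, W_j) and f is
    the assumed smooth L0 surrogate w^2/(w^2 + gamma)."""
    G = sigmoid(U @ W_in.T + b)                    # (L, N) hidden outputs, eq. (1)
    d = t - G @ W                                  # (L,) training errors, eq. (3)
    grad_E = -G.T @ d                              # gradient of the error term
    grad_R = 2.0 * gamma * W / (W**2 + gamma)**2   # gradient of the surrogate
    err = 0.5 * float(d @ d)                       # E(W) before the update
    return W - eta * (grad_E + lam * grad_R), err
```

Iterating this step until i ≥ imax reproduces the training loop of Steps 3 to 5.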

Step 5: Check whether training is finished

If i ≥ imax, go to Step 6; otherwise return to Step 3.

Step 6: Test the network

Using the output weights W obtained in the steps above, feed in the test samples and evaluate the network.

The inventive contributions of the present invention are mainly as follows:

(1) Aiming at the difficulty of measuring ammonia nitrogen concentration in the wastewater treatment process, and exploiting the strong nonlinear mapping capability of the extreme learning machine, the invention designs a soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network. The method offers high prediction accuracy, strong stability, and low maintenance cost.

(2) The invention combines the L0 regularization method with batch gradient descent to train the neural network, effectively pruning neurons with low contribution, reducing the computation time of the network, and increasing the sparsity of the network structure.

Brief Description of the Drawings

Fig. 1. Topology of the neural network of the present invention;

Fig. 2. Training root-mean-square error (RMSE) curve of the effluent NH4-N concentration prediction method of the present invention;

Fig. 3. Evolution during training of the number m of output weights whose absolute value is below 0.005;

Fig. 4. Effluent NH4-N concentration prediction results of the present invention;

Fig. 5. Effluent NH4-N concentration prediction errors of the present invention.

Detailed Description

A soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network. The main procedure is as follows: an L0 regularization penalty term is first added to the conventional error function so that unimportant weights are driven toward 0, and the improved error function is then minimized with a batch gradient descent algorithm, which trains and prunes the network simultaneously. The method exploits the learning ability of the neural network: the output weights are optimized according to the training error, unimportant output weights are eliminated, and the ammonia nitrogen concentration in the wastewater treatment process is then predicted with minimal error and a sparser network structure. The method comprises the following steps:

Step 1: Initialize the network structure and parameters

Step 1.1: Initialize the network structure

Temperature, dissolved oxygen, total suspended solids, pH, and effluent oxidation-reduction potential are taken as input variables, and the ammonia nitrogen concentration as the output variable. The network structure is set to 5-N-1, where N is the number of hidden (reservoir) neurons. Here N = 500, i.e., the network has 5 input nodes, 500 hidden nodes, and 1 output node.

Step 1.2: Initialize the network parameters

The sigmoid function is used as the network activation function G(·). The initial iteration count is i = 0, the maximum number of iterations is imax ≥ 5000, and the training set is {(uk, tk)}, k = 1, …, L, where uk ∈ R^n is the k-th input sample (here n = 5), tk is the k-th measured output value, and L is the total number of samples. The input weights Win and the threshold vector b are randomly initialized in (0, 1), and the initial output weight vector is set to W = 0.

Step 2: Determine the learning rate η and the regularization parameter λ by grid search

(1) First, set the regularization parameter to zero (λ = 0), sweep the learning rate over [0.0005, 0.01] in steps of 0.0005, run the training program, and select the learning rate with the smallest training error; this gives η = 0.01.

(2) With the optimal learning rate η = 0.01 fixed, sweep the regularization parameter over [0.0025, 0.05] in steps of 0.0025 and select the λ with the best sparsity that does not degrade the training error; this gives λ = 0.05.

Step 3: Compute the network output yk and the prediction error dk for the k-th input sample

For a given activation function G(·), input sample uk, input weight matrix Win, and threshold vector b, the hidden-layer output is:

G(Win, b, uk) = [g1(W1·uk + b1), …, gN(WN·uk + bN)]^T  (1)

where gj (1 ≤ j ≤ N) is the activation function of the j-th hidden neuron, Wj·uk (1 ≤ j ≤ N) is the inner product of the input weight vector Wj between the j-th hidden neuron and the input layer with the input vector uk, and bj (1 ≤ j ≤ N) is the threshold of the j-th hidden neuron.

For the k-th input sample, the network output yk is obtained from:

yk = W·G(Win·uk + b)  (2)

where W is the output weight vector and Win the input weight matrix of equation (1). The training error dk between the desired output tk and the actual output yk is defined as:

dk = tk - yk  (3)

Step 4: Compute the output-weight gradient and update the output weights

Define the standard mean-square error function as:

E(W) = (1/2) Σ(k=1..L) dk²  (4)

where dk = tk - yk is the training error of equation (3).

An L0 regularization term is added to the error function, giving the improved error function:

Ẽ(W) = E(W) + λ‖W‖0  (5)

where ‖W‖0, the L0 norm of W, counts the nonzero entries of the output weight vector:

‖W‖0 = #{j : Wj ≠ 0}, 1 ≤ j ≤ N  (6)

where Wj (1 ≤ j ≤ N) is the j-th output weight.

However, the L0 norm is a non-convex function, so equation (5) is an NP-hard combinatorial minimization problem. To overcome this, the L0 norm is approximated by a continuously differentiable function f(·). The element-wise function f(γ, Wj) is defined as:

f(γ, Wj) = Wj² / (Wj² + γ)  (7)

so that the L0 norm is approximated by:

‖W‖0 ≈ Σ(j=1..N) f(γ, Wj)  (8)

Here γ is a positive number controlling how closely f(γ, Wj) approximates ‖W‖0: when γ is large, f(γ, Wj) prunes the weight vector only weakly; when γ is close to 0, f(γ, Wj) prunes the nonzero elements of the weight vector W more effectively. In this patent γ = 0.05.

The first derivative of f(γ, Wj) is then:

f′(γ, Wj) = 2γWj / (Wj² + γ)²  (9)

Equation (5) is accordingly updated to:

Ẽ(W) = E(W) + λ Σ(j=1..N) f(γ, Wj)  (10)

A batch gradient descent algorithm is introduced. Starting from the initial output weights W = W0, the gradient of Ẽ(W) is:

∇Ẽ(Wi) = ∇E(Wi) + λ∇F(Wi)  (11)

where ∇E(Wi) is the gradient of E(W) at the i-th iteration and ∇F(Wi) is the gradient of the regularization term Σ(j=1..N) f(γ, Wj) at the i-th iteration, whose j-th component is f′(γ, Wj) from equation (9).

The output-weight update formula is therefore:

Wi+1 = Wi - η∇Ẽ(Wi)  (12)

where Wi+1 is the output weight vector at iteration i+1 and Wi that at iteration i. Each time the output weights are updated, i is incremented by 1, i.e., i = i + 1.

Step 5: Check whether training is finished

If i ≥ imax, go to Step 6; otherwise return to Step 3.

Step 6: Test the network

Using the output weights W obtained in the steps above, feed in the test samples and evaluate the network.
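Putting the steps of this embodiment together, an end-to-end sketch might look as follows. It runs on synthetic stand-in data, since the patent's data tables are reproduced as images, and it uses a smaller network and learning rate than the η = 0.01, λ = 0.05 found above; the surrogate f(γ, w) = w²/(w² + γ) is our assumption:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_elm_sl0(U, t, N=20, eta=0.0005, lam=0.05, gamma=0.05,
                  iters=3000, seed=0):
    """Steps 1-5: random input weights/thresholds, zero output weights,
    then batch gradient descent on E(W) + lam * sum_j f(gamma, W_j)."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(0, 1, (N, U.shape[1]))
    b = rng.uniform(0, 1, N)
    W = np.zeros(N)
    G = sigmoid(U @ W_in.T + b)                  # hidden outputs are fixed
    for _ in range(iters):
        d = t - G @ W                            # eq. (3)
        grad = -G.T @ d + lam * 2.0 * gamma * W / (W**2 + gamma)**2
        W = W - eta * grad                       # eq. (12)
    return W_in, b, W

def predict(W_in, b, W, U):
    """Step 6: evaluate the trained network on test inputs."""
    return sigmoid(U @ W_in.T + b) @ W
```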

Data Samples

Tables 1-12 contain the experimental data of the present invention. Tables 1-5 are the training input samples: influent temperature, dissolved oxygen at the end of the aerobic stage, total suspended solids at the end of the aerobic stage, effluent pH, and effluent oxidation-reduction potential; Table 6 gives the effluent ammonia nitrogen concentration of the training samples. Tables 7-11 are the test input samples for the same five variables, and Table 12 gives the effluent ammonia nitrogen concentration of the test samples.

Training samples:

Table 1. Auxiliary variable: influent temperature (°C)


Table 2. Auxiliary variable: dissolved oxygen (mg/L)


Table 3. Auxiliary variable: total suspended solids (mg/L)


Table 4. Auxiliary variable: pH

Table 5. Auxiliary variable: oxidation-reduction potential


Table 6. Measured effluent NH4-N concentration (mg/L)


Test samples:

Table 7. Auxiliary variable: influent temperature (°C)


Table 8. Auxiliary Variable Dissolved Oxygen (mg/L)

Figure BDA0002250087240000141 (Table 8 data rendered as image)

Table 9. Auxiliary Variable Total Suspended Solids (mg/L)

2.8203 2.9460 2.8678 2.8202 2.5611 2.5829 2.8432 2.8892 2.5314 3.0358 2.9424 2.5450 2.7539 2.3089 2.9585 2.9651 2.6572 2.4949 2.9497 2.8061 2.8056 2.9405 3.1266 2.5765 3.0128 2.8251 2.7974 2.7827 3.0233 2.8753 2.8377 2.2740 2.4693 2.8942 2.8151 2.4982 3.2238 3.0289 2.2692 2.7131 2.7684 3.1727 2.9420 3.0138 2.4700 2.9379 2.8182 2.9699 2.9699 2.9696 2.5363 2.4573 2.9005 2.4428 2.4121 2.4505 2.3100 2.8173 2.8868 3.0912 2.8053 2.5025 3.1527 2.9324 2.9416 2.3157 2.3829 2.8973 3.0728 3.1456 2.8617 3.0857 3.0329 2.2105 2.8024 2.4376 2.6005 2.9275 2.4709 2.7997 2.8238 2.4789 2.9423 2.9435 3.1618 2.9997 2.8217 2.7176 2.7800 3.0250 2.7410 2.8029 2.2935 2.3933 2.4443 3.0369 3.0349 2.9285 2.7858 2.9329 3.0151 2.3839 2.7219 2.5113 3.0535 2.4245 2.7999 2.9979 2.9201 2.4916 2.3119 2.5664 2.7491 2.8509 2.8060 2.7973 2.9019 2.8119 2.5754 3.1621 2.4192 2.7953 2.8213 2.3439 2.9265 2.4068 2.2200 2.3514 2.8738 2.2805 2.8895 2.4196 2.5045 2.4345 2.7979 2.8979 2.8572 2.4255 2.6941 2.8306 2.8052 2.7744 2.7306 2.4820 2.8343 2.8523 2.3883 2.8536 2.9709 2.5321 2.8260 2.7556 2.8632
3.1004 2.8337 3.0059 2.4971 2.7832 2.3155 3.0640 2.8295 2.8165 2.4155 3.0494 2.9023 2.3655 2.4784 2.4161 2.4331 3.0726 2.4347 2.9480 2.7790 2.5286 2.7725 2.8985 2.7998 2.9557 2.5519 2.8087 2.4082 2.2835 2.4440 2.2668 2.5590 2.6305 2.3938 2.7067 2.4866 2.9067

Table 10. Auxiliary Variable pH

Figure BDA0002250087240000142 (Table 10 data rendered as image)

Figure BDA0002250087240000151 (Table 10 data, continued, rendered as image)

Table 11. Auxiliary Variable Redox Potential

-5.3838 38.3272 -45.9542 -122.6730 -196.9560 -196.1870 -126.8390 -40.0577 -194.4560 16.3435 35.3790 -170.4860 -87.8065 -163.3710 33.1357 32.1744 -168.6270 -190.8670 0.0641 -66.2715 -21.5991 29.9952 19.7404 -194.4560 48.0052 -17.4331 -41.2755 -97.8049 46.8515 36.0199 -88.8961 -165.2940 -163.0510 27.6238 -5.7042 -200.9940 18.2663 19.3559 -199.9040 -170.1650 -46.2106 -115.4940 33.8408 7.9475 -161.0000 34.9303 -67.2329 45.9542 -3.0764 41.2114 -196.8920 -205.3520 36.6608 -154.1420 -158.5640 -202.5960 -170.1650 -146.5150 30.4439 21.3427 -16.7281 -161.0640 18.6509 30.7643 35.3149 -194.8410 -155.1680 29.9952 8.1397 17.8177 -15.5103 19.4200 45.3133 -169.7810 -142.6700 -205.0960 -187.0860 26.4701 -196.5710 -5.7683 -45.6337 -172.3440 15.8949 30.4439 19.5482 6.8579 -27.8161 -16.5358 -10.2548 4.8069 -172.7930 -117.9940 -161.3850 -205.9290 -202.7240 30.3798 28.7775 34.0971 -13.5876 21.0223 9.0370 -190.8030 -163.2430 -170.6140 20.7018 -161.3200 -15.5103 36.0199 34.5458 -201.8910 -165.9990 -197.0200 -155.4880 -8.7807 -112.1620 -13.3312 33.3280 -12.4980
-171.9600 18.9713 -196.6990 -63.9001 -12.8185 -158.5000 31.3412 -202.3400 -174.0750 -157.7950 -61.3364 -164.5250 37.6863 -164.0120 -163.4350 -206.2490 -76.3340 -138.9520 -18.4586 -152.8600 -178.4330 -38.0068 -55.9526 -160.8720 -176.7030 -162.3460 -16.7922 -73.8344 -162.6020 -121.6470 46.0183 -157.5390 -39.7373 -89.2806 39.5450 19.4200 -6.0888 44.9287 -196.6990 -102.4200 -163.0510 18.0740 -20.3814 -99.7277 -161.7690 17.5613 -135.4270 -159.9750 -151.5140 -173.4980 -177.4080 18.6509 -161.9610 35.0585 -108.6370 -186.5730 -5.2556 -71.1425 -76.8467 37.4940 -172.8570 -150.6170 -202.7880 -145.4900 -157.8590 -201.2500 -194.5840 -161.6410 -160.5510 -157.2830 -174.7800 -120.7500

Table 12. Measured Effluent NH4-N Concentration (mg/L)

Figure BDA0002250087240000152 (Table 12 data rendered as image)

Figure BDA0002250087240000161 (Table 12 data, continued, rendered as image)

Claims (1)

1. A soft-sensing method for effluent NH4-N based on an ELM-SL0 neural network, characterized by comprising the following steps:

Step 1: Initialize the network structure and parameters

Step 1.1: Initialize the network structure

Take temperature, dissolved oxygen, total suspended solids, pH, and effluent redox potential as the input variables and the ammonia-nitrogen (NH4-N) concentration as the output variable, and set the echo state network structure to 5-N-1, where N is the number of reservoir nodes; for a typical echo state network, 50 ≤ N ≤ 1000.

Step 1.2: Initialize the network parameters

Take the sigmoid function as the network activation function G(·); set the initial iteration count i = 0 and the maximum iteration count i_max ≥ 5000. The training samples are {(u_k, t_k)}, k = 1, ..., L, where u_k ∈ R^n is the k-th input sample, t_k is the k-th measured output value, n is the input dimension, and L is the total number of samples. Randomly initialize the input weights W_in and the threshold vector b in (0, 1), and set the initial output weights W_out = 0.
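Step 1 can be sketched in code as follows. This is a minimal illustration assuming NumPy; the 5-N-1 structure and the uniform-(0, 1) initialization follow the claim, while the reservoir size N = 100, the variable names, and the fixed seed are illustrative choices, not from the patent:

```python
import numpy as np

rng = np.random.default_rng(0)   # fixed seed, for reproducibility only

# Step 1.1: 5-N-1 structure -- 5 auxiliary input variables,
# N reservoir (hidden) nodes, 1 output (effluent NH4-N concentration)
n_in, N, n_out = 5, 100, 1
i_max = 5000                     # maximum number of iterations

def sigmoid(x):
    """Network activation function G(.)."""
    return 1.0 / (1.0 + np.exp(-x))

# Step 1.2: input weights and thresholds drawn uniformly from (0, 1);
# output weights start at zero, as the claim specifies
W_in = rng.uniform(0.0, 1.0, size=(N, n_in))
b = rng.uniform(0.0, 1.0, size=N)
W_out = np.zeros(N)
```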
Step 2: Determine the learning rate η and regularization parameter λ by grid search

(1) First, set the regularization parameter to zero (λ = 0), set the search range of the learning rate to [0.0005, 0.01] with a step of 0.0005, run the program, and select the optimal learning rate η as the one with the smallest training error;

(2) With the optimal learning rate η fixed, set the search range of the regularization parameter to [0.0025, 0.05] with a step of 0.0025, and select the optimal regularization parameter λ that gives the best sparsity without degrading the training error;

Step 3: Compute the network output y_k and prediction error d_k for the k-th input sample

For a given activation function G(·), input sample u_k, input weights W_in, and threshold vector b, the hidden-layer output is:

G(W_in·u_k + b) = [g_1(W_in,1·u_k + b_1), g_2(W_in,2·u_k + b_2), ..., g_N(W_in,N·u_k + b_N)]^T    (1)
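The hidden-layer mapping of Eq. (1) is the activation function applied elementwise to W_in·u_k + b; a sketch, assuming NumPy (sizes and names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_output(W_in, b, u_k):
    # Eq. (1): the j-th component is g_j(W_j . u_k + b_j)
    return sigmoid(W_in @ u_k + b)

rng = np.random.default_rng(1)
W_in = rng.uniform(0, 1, size=(100, 5))  # 100 reservoir nodes, 5 inputs
b = rng.uniform(0, 1, size=100)
u_k = rng.uniform(0, 1, size=5)          # one input sample

g = hidden_output(W_in, b, u_k)          # one activation per reservoir node
```

The network output of Eq. (2) is then the inner product of the output weight vector with g.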
where g_j (1 ≤ j ≤ N) is the activation function of the j-th reservoir neuron, W_in,j·u_k (1 ≤ j ≤ N) is the inner product of the input weight vector W_in,j between the j-th reservoir neuron and the input layer with the input vector u_k, and b_j (1 ≤ j ≤ N) is the threshold of the j-th reservoir neuron.

For the k-th input sample, the network output y_k is obtained as:

y_k = W_out·G(W_in·u_k + b)    (2)

The training error d_k between the desired output t_k and the actual output y_k is defined as:

d_k = t_k − y_k    (3)

Step 4: Compute the output-weight gradient and update the output weights

The standard mean squared error function is defined as:

E(W_out) = (1/2) Σ_{k=1}^{L} d_k²    (4)

where d_k = t_k − y_k is the training error of Eq. (3).
Adding an L0 regularization term to the error function, the improved error function is:

E(W_out) = (1/2) Σ_{k=1}^{L} d_k² + λ‖W_out‖_0    (5)

where ‖W_out‖_0 is the L0 norm of W_out, defined as the number of nonzero output weights:

‖W_out‖_0 = Σ_{j=1}^{N} 1(W_out,j ≠ 0)    (6)
where W_out,j (1 ≤ j ≤ N) is the j-th output weight.

However, the L0 norm is non-convex, so minimizing Eq. (5) is an NP-hard combinatorial problem. The L0 norm is therefore approximated by a continuously differentiable function f(·); with a Gaussian-type smoother, the standard choice in smoothed-L0 (SL0) methods, the function f(γ, W_out,j) is defined as:

f(γ, W_out,j) = exp(−W_out,j² / (2γ²))    (7)

‖W_out‖_0 ≈ N − Σ_{j=1}^{N} f(γ, W_out,j)    (8)
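The smoothed-L0 surrogate appears only as images in the published text; a Gaussian-type smoother, the usual choice in SL0 methods, is one consistent reading of f and is sketched below (the exact functional form is an assumption):

```python
import numpy as np

GAMMA = 0.05  # the positive smoothing parameter from the claim

def f(gamma, w):
    # Assumed Gaussian form: f -> 1 as w -> 0, f -> 0 when |w| >> gamma
    return np.exp(-w ** 2 / (2.0 * gamma ** 2))

def l0_approx(w, gamma=GAMMA):
    # ||w||_0 ~= N - sum_j f(gamma, w_j): near-zero weights barely count
    return w.size - np.sum(f(gamma, w))

def f_grad(gamma, w):
    # First derivative: -(w / gamma^2) * exp(-w^2 / (2 gamma^2))
    return -(w / gamma ** 2) * np.exp(-w ** 2 / (2.0 * gamma ** 2))
```

Under this form, l0_approx of an all-zero vector is 0, and each weight whose magnitude is well above gamma contributes roughly 1 to the count.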
where γ is a positive constant, here taken as γ = 0.05. The first derivative of f(γ, W_out,j) is then:

∂f(γ, W_out,j)/∂W_out,j = −(W_out,j/γ²)·exp(−W_out,j²/(2γ²))    (9)

Accordingly, Eq. (5) is updated to:

E(W_out) = (1/2) Σ_{k=1}^{L} d_k² + λ(N − Σ_{j=1}^{N} f(γ, W_out,j))    (10)
A batch gradient descent algorithm is introduced. Starting from the initial weights W_out = W_0, the gradient of E(W_out) at iteration i is:

∇E(W_out^i) = ∇E_d(W_out^i) + λ·∇F(W_out^i)    (11)

where ∇E_d(W_out^i) = −Σ_{k=1}^{L} d_k·G(W_in·u_k + b) is the gradient of the error term (1/2)Σ_k d_k² at the i-th iteration, and ∇F(W_out^i) is the gradient of the smoothed L0 term N − Σ_j f(γ, W_out,j) at the i-th iteration, whose j-th component is (W_out,j^i/γ²)·exp(−(W_out,j^i)²/(2γ²)).
This yields the update formula for the output weights:

W_out^{i+1} = W_out^i − η·∇E(W_out^i)    (12)

where W_out^{i+1} and W_out^i are the output weights at iterations i+1 and i, respectively; each time the output weights are updated, i is incremented by 1 (i = i + 1).

Step 5: Check whether training is finished

If i ≥ i_max, go to Step 6; otherwise, return to Step 3.

Step 6: Test the network

Using the output weights W_out obtained in the steps above, feed in the test samples and evaluate the network.
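Steps 1 through 6 can be strung together into a minimal end-to-end sketch. Synthetic data stands in for the five water-quality variables, the Gaussian smoother is an assumed form of the image-rendered f, and the sample-averaged gradient together with the particular η, λ values are implementation choices for this toy problem rather than the patent's grid-search results:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- Step 1: 5-N-1 structure, random input weights, zero output weights ---
n_in, N, L = 5, 60, 200
U = rng.uniform(0, 1, size=(L, n_in))              # synthetic inputs
t = U.sum(axis=1) + 0.1 * rng.standard_normal(L)   # synthetic targets

W_in = rng.uniform(0, 1, size=(N, n_in))
b = rng.uniform(0, 1, size=N)
W_out = np.zeros(N)

H = sigmoid(U @ W_in.T + b)     # Eq. (1) for all L samples at once

# --- Steps 3-5: batch gradient descent on the L0-regularised error ---
eta, lam, gamma = 0.005, 0.0025, 0.05
for i in range(3000):
    y = H @ W_out               # Eq. (2): network outputs
    d = t - y                   # Eq. (3): training errors
    grad_mse = -(H.T @ d) / L   # averaged over samples for scale stability
    # gradient of the smoothed-L0 penalty (assumed Gaussian smoother)
    grad_l0 = (W_out / gamma ** 2) * np.exp(-W_out ** 2 / (2 * gamma ** 2))
    W_out -= eta * (grad_mse + lam * grad_l0)

# --- Step 6: run the trained network on fresh samples ---
U_test = rng.uniform(0, 1, size=(20, n_in))
y_test = sigmoid(U_test @ W_in.T + b) @ W_out
```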
CN201911030774.8A 2019-10-28 2019-10-28 A Soft-Sensing Method for Effluent NH4-N Based on ELM-SL0 Neural Network Pending CN110837886A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911030774.8A CN110837886A (en) 2019-10-28 2019-10-28 A Soft-Sensing Method for Effluent NH4-N Based on ELM-SL0 Neural Network


Publications (1)

Publication Number Publication Date
CN110837886A true CN110837886A (en) 2020-02-25

Family

ID=69575622

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911030774.8A Pending CN110837886A (en) 2019-10-28 2019-10-28 A Soft-Sensing Method for Effluent NH4-N Based on ELM-SL0 Neural Network

Country Status (1)

Country Link
CN (1) CN110837886A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
BR8200467U (en) * 2002-03-18 2003-12-09 Volnei Jaco Knorst Synthetic marble sink with tub in differentiated material
CN104616030A (en) * 2015-01-21 2015-05-13 北京工业大学 Extreme learning machine algorithm-based recognition method
CN104965971A (en) * 2015-05-24 2015-10-07 北京工业大学 Ammonia nitrogen concentration soft-measuring method based on fuzzy neural network
CN106503730A (en) * 2016-09-30 2017-03-15 暨南大学 A kind of bridge moving load identification method based on concatenate dictionaries and sparse regularization
CN106803237A (en) * 2016-12-14 2017-06-06 银江股份有限公司 A kind of improvement self-adaptive weighted average image de-noising method based on extreme learning machine
US20180093092A1 (en) * 2016-04-22 2018-04-05 Newton Howard Biological co-processor (bcp)
CN108469507A (en) * 2018-03-13 2018-08-31 北京工业大学 A kind of water outlet BOD flexible measurement methods based on Self organizing RBF Neural Network
CN109242194A (en) * 2018-09-25 2019-01-18 东北大学 A kind of thickener underflow concentration prediction method based on mixed model
JP2019040414A (en) * 2017-08-25 2019-03-14 日本電信電話株式会社 Learning device and learning method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YU YL ET AL: "A Homotopy Iterative Hard Thresholding Algorithm With Extreme Learning Machine for Scene Recognition" *
慈能达 (CI NENGDA): "Compressed Sensing DOA Estimation in an Integrated Vehicle-Mounted Millimeter-Wave Radar and Communication System" *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116151121A (en) * 2023-02-21 2023-05-23 北京工业大学 A Neural Network-Based NH4-N Soft-Sensing Method for Effluent Water
CN116151121B (en) * 2023-02-21 2024-03-26 北京工业大学 A soft measurement method for effluent NH4-N based on neural network
CN116451763A (en) * 2023-03-17 2023-07-18 北京工业大学 Effluent NH based on EDDESN 4 -N prediction method
CN116451763B (en) * 2023-03-17 2024-04-12 北京工业大学 Effluent NH based on EDDESN 4 -N prediction method

Similar Documents

Publication Publication Date Title
US10570024B2 (en) Method for effluent total nitrogen-based on a recurrent self-organizing RBF neural network
Wang et al. Pan evaporation modeling using four different heuristic approaches
CN105740619B (en) Weighting extreme learning machine sewage disposal on-line fault diagnosis method based on kernel function
CN111291937A (en) Method for predicting quality of treated sewage based on combination of support vector classification and GRU neural network
CN106022954B (en) Multiple BP neural network load prediction method based on grey correlation degree
CN109828089B (en) DBN-BP-based water quality parameter nitrous acid nitrogen online prediction method
CN109657790B (en) PSO-based recursive RBF neural network effluent BOD prediction method
CN109344971B (en) Effluent ammonia nitrogen concentration prediction method based on adaptive recursive fuzzy neural network
CN110542748B (en) A knowledge-based robust effluent ammonia nitrogen soft-sensing method
CN112182709B (en) Method for rapidly predicting water drainage temperature of large reservoir stoplog gate layered water taking facility
CN105354620A (en) Method for predicting fan generation power
CN114037163A (en) An early warning method of wastewater treatment effluent quality based on dynamic weight PSO optimization BP neural network
CN108416460A (en) Cyanobacterial bloom prediction technique based on the random depth confidence network model of multifactor sequential-
CN108595892A (en) Soft-measuring modeling method based on time difference model
CN110837886A (en) A Soft-Sensing Method for Effluent NH4-N Based on ELM-SL0 Neural Network
CN110991616B (en) Method for predicting BOD of effluent based on pruning feedforward small-world neural network
CN112924646B (en) A BOD Soft Sensing Method Based on Adaptive Pruning Feedforward Small World Neural Network
CN110929809B (en) A Soft Sensing Method for Key Water Quality Indicators of Sewage with Feature Self-enhancing Recurrent Neural Network
Shang et al. Research on intelligent pest prediction of based on improved artificial neural network
CN116306803A (en) Method for predicting BOD concentration of outlet water of ILSTM (biological information collection flow) neural network based on WSFA-AFE
CN112765902A (en) RBF neural network soft measurement modeling method based on TentFWA-GD and application thereof
CN115905821A (en) State monitoring method of urban sewage treatment process based on multi-stage dynamic fuzzy width learning
Vaněk et al. On-line estimation of biomass concentration using a neural network and information about metabolic state
CN118625667A (en) Soft measurement and multi-objective optimization control method, device and medium for sewage treatment process
CN110276478B (en) Short-term wind power prediction method based on segmented ant colony algorithm optimization SVM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200225