
CN102267095A - Method for monitoring and dressing grinding wheel on line - Google Patents


Info

Publication number
CN102267095A
CN102267095A CN201110247142A
Authority
CN
China
Prior art keywords
grinding wheel
neural network
frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110247142
Other languages
Chinese (zh)
Other versions
CN102267095B (en)
Inventor
王洪
许世雄
许君
王东昱
戴瑜兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HUNAN YUHUAN TONGXIN CNC MACHINE TOOL CO Ltd
Original Assignee
HUNAN YUHUAN TONGXIN CNC MACHINE TOOL CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HUNAN YUHUAN TONGXIN CNC MACHINE TOOL CO Ltd filed Critical HUNAN YUHUAN TONGXIN CNC MACHINE TOOL CO Ltd
Priority to CN 201110247142 priority Critical patent/CN102267095B/en
Publication of CN102267095A publication Critical patent/CN102267095A/en
Application granted granted Critical
Publication of CN102267095B publication Critical patent/CN102267095B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Constituent Portions Of Grinding Lathes, Driving, Sensing And Control (AREA)

Abstract

The invention discloses a method for monitoring and dressing a grinding wheel on line. The method comprises the following steps: detecting, with the existing grinding wheel dressing system, the acoustic emission (AE) digital signals emitted and processed during grinding by the wheel; performing a fast Fourier transform on the N AE digital signals a(n) to obtain the N/2 different spectral components of the real part AR(k) and the imaginary part AI(k); and calculating, by formulas (1), (2) and (3), the root mean square value Y, the total energy E and the frequency-domain center frequency fc of the AE digital signals, which serve as input samples of a back propagation (BP) neural network. The method is the first to use the root mean square value Y, the total energy E and the frequency-domain center frequency fc of the AE digital signals as feature extraction values and then as input values of the BP neural network; through training and testing of the BP neural network, the starting time and ending time of grinding wheel dressing are accurately predicted, so an automatic grinding wheel dressing function is realized. The method has high prediction accuracy.

Description

Grinding wheel online monitoring and dressing method
Technical Field
The invention belongs to the field of grinding machine machining control methods, and particularly relates to an online monitoring and dressing method for a grinding wheel.
Background
Dressing of the grinding wheel has a great influence on its grinding performance and grinding effect. With the development of intelligent and adaptive control of grinding, the ability to automatically detect the exact moments at which the grinding wheel becomes dull (passivated) and at which its dressing is complete, and to start and finish dressing in time, improves both grinding quality and production efficiency. In the past, timed dressing was mostly adopted to avoid grinding burn on the workpiece; the wheel was dressed in advance, before reaching its working life limit, so the approach was essentially blind ("Research on on-line monitoring of grinding quality", Liu Guijie et al., Diamond and Abrasives Engineering, 2004.10). Frequent dressing not only affects machining efficiency but also accelerates wheel wear; cubic boron nitride (CBN) grinding wheels in particular are expensive, so this increases production cost. Conversely, if dressing is delayed, the precision and surface quality of the workpiece suffer and scrap is produced. Therefore, dressing the grinding wheel accurately, automatically and in time is an important way to improve grinding productivity and guarantee grinding quality.
In recent research and development, acoustic emission (AE) signals have been widely used as an information source for grinding control. Many studies have shown that, by monitoring the amplitude change of the AE signal, the sharpness of the grinding wheel can be evaluated and its service life determined ("Neural network-based on-line monitoring of grinding wheel state", Liu Guijie et al., Journal of Northeastern University, 2002.10; "Application of STFT in AE signal feature extraction", Liao Jun et al., Chinese Journal of Scientific Instrument, 2008.9; "AE monitoring model of the grinding process", Wu Guoyue, Grinding Machine and Grinding, 1999.1). However, when the workpiece material, processing conditions and processing parameters change frequently, the amplitude of the acoustic emission signal also fluctuates sharply, and the degree of wheel passivation cannot be judged from the amplitude alone. An FFT spectral-analysis approach is therefore adopted: when the grinding wheel is sharp, the high-frequency components generated by machining have large amplitudes and rich spectral content; when the wheel is passivated, the high-frequency amplitudes are small and the spectral content is sparse. A frequency-band limit is set: when the high-frequency component falls below a certain value, the wheel needs dressing, and when it returns to a certain range, dressing is finished, so automatic dressing of the grinding wheel is realized.
Disclosure of Invention
The invention aims to provide an online monitoring and dressing method for a grinding wheel.
The online monitoring and dressing method for a grinding wheel provided by the invention is realized by the following steps:
step 1, using the existing grinding wheel dressing system, detecting the Acoustic Emission (AE) digital signal emitted by the grinding wheel during grinding and processed by the system;
step 2, performing a Fast Fourier Transform (FFT) on the N Acoustic Emission (AE) digital signals a(n) from step 1 to obtain the N/2 different spectral components of the real part AR(k) and the imaginary part AI(k);
step 3, calculating, by formulas (1), (2) and (3), the root mean square value Y, the total energy E and the frequency-domain center frequency fc of the Acoustic Emission (AE) digital signal as input samples of the BP neural network:
$$Y = \sqrt{\frac{\sum_{n=1}^{N} a^{2}(n)}{N}} \qquad (1)$$

$$E = \sum_{k=0}^{N/2} \left( AR(k)^{2} + AI(k)^{2} \right) \qquad (2)$$

$$f_{c} = \frac{\sum_{k=0}^{N/2} k \times f \times \left( AR(k)^{2} + AI(k)^{2} \right)}{2 \pi E} \qquad (3)$$
in the formula: y is the root mean square value of the digital signal of the Acoustic Emission (AE), a (N) is the digital signal of the Acoustic Emission (AE), N is the sampling number of the digital signal of the input Acoustic Emission (AE), k is 0, 1, N, AR (k) is the real part frequency spectrum after FFT calculation, AI (k) is the imaginary part frequency spectrum after FFT calculation, E is the total energy of the digital signal of the Acoustic Emission (AE), fcThe frequency-domain center frequency of the Acoustic Emission (AE) digital signal, f being the resolution frequency of the EFT spectral analysis, e.g. the Acoustic Emission (AE) signalMaximum frequency of number fmaxSampling frequency of fsIf N is selected as the sampling point, the resolution frequency is f ═ fs/N。
Step 4, the BP neural network takes a part of the samples from step 3 for learning and training, so that the error produced by the sample training falls within a set range, giving the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk; the remaining samples are then input into the trained BP neural network for testing, yielding values in two distinct ranges for the passivated state and the sharp state of the grinding wheel, which are used to predict the starting time and the ending time of grinding wheel dressing respectively;
step 5, programming the processes of steps 2, 3 and 4 with the VB and Matrix-VB software tools, and calculating the root mean square value Y, the signal energy E and the center frequency fc of the signal, the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk, and the values of the grinding wheel in the two distinct ranges for the passivated and sharp states, so as to verify whether the result of step 4 achieves the expected aim of automatic grinding wheel dressing; the program is then ported into the numerical control system of the camshaft grinding machine to realize the automatic grinding wheel dressing function.
The method is the first to use the root mean square value (Y), the total energy (E) and the frequency-domain center frequency (fc) of the Acoustic Emission (AE) digital signal as feature extraction values and then as input values of the BP neural network; through training and testing of the BP neural network, the starting time and ending time of grinding wheel dressing are accurately predicted, so automatic dressing of the grinding wheel is realized. The method has the advantages of high prediction accuracy, improved machining precision and efficiency, a prolonged grinding wheel service life, and reduced production cost for the enterprise.
The invention is further described with reference to the following figures and detailed description.
Drawings
FIG. 1 is a block diagram of an automatic grinding wheel dressing system for a camshaft grinding machine.
FIG. 2 is a diagram of a BP neural network model.
Fig. 3 is a flowchart of the BP neural network training procedure.
Fig. 4 is a flowchart of a BP neural network test procedure.
FIG. 5 is a parameter input diagram of a BP neural network.
FIG. 6 is a graph of the result of the BP neural network training operation.
FIG. 7 is a graph of the result of the BP neural network test operation.
Detailed Description
Fig. 1 shows a block diagram of the grinding wheel dressing system of a conventional YTMCNC8326 numerically controlled camshaft grinder, which comprises a grinding wheel, an AE sensor, a diamond roller dresser, a workpiece clamping device, a Marposs P7WB dynamic balancer, a Siemens 840D numerical control system, a Siemens 611D drive and so on. The AE sensor is mounted at the axle center of the dynamic-static piezoelectric spindle. The Marposs dynamic balancer detects the AE signal, digitizes and filters it, and then sends it to the MMC user interface (OEM application) either via the NCU over the Profibus bus or directly over the serial COM port. Spectral analysis is performed in the OEM application, and the moments at which grinding wheel dressing starts and ends are controlled according to the different spectra at the different stages, realizing automatic dressing of the grinding wheel.
The steps and the principle of the method are as follows:
step 1, using the existing grinding wheel dressing system, detecting the Acoustic Emission (AE) digital signal emitted by the grinding wheel during grinding and processed by the system;
step 2, performing a Fast Fourier Transform (FFT) on the N Acoustic Emission (AE) digital signals a(n) from step 1 to obtain the N/2 different spectral components of the real part AR(k) and the imaginary part AI(k);
principle of FFT algorithm
In digital signal processing, Discrete Fourier Transform (DFT) provides a mathematical method for Fourier Transform operations using a digital computer. The DFT calculation is:
$$X(k) = \mathrm{DFT}[x(n)] = \sum_{n=0}^{N-1} x(n) W_{N}^{nk}, \qquad 0 \le k \le N-1 \qquad (4)$$
where $W_{N} = e^{-j 2\pi / N}$ is called the Fourier (twiddle) factor.
However, the computational workload of the DFT is very large. The Fast Fourier Transform (FFT) is an algorithm, developed on the basis of the DFT, that reduces the DFT computation time and greatly improves computational efficiency. The present invention uses a typical form of the FFT, the Cooley-Tukey radix-2 algorithm, which requires N to be a power of 2, N = 2^M with M a positive integer. The algorithm starts by decomposing the N-point DFT into two N/2-point DFTs, computed as:
$$X(k) = \sum_{r=0}^{\frac{N}{2}-1} x(2r) W_{N}^{2rk} + \sum_{r=0}^{\frac{N}{2}-1} x(2r+1) W_{N}^{(2r+1)k} \qquad (5)$$
where W is as defined above; r = 0, 1, ..., N/2 - 1; 2r indexes the even-numbered samples and 2r + 1 the odd-numbered samples.
By applying the same method again, each of these halves can itself be split into its even- and odd-indexed parts, giving four groups, and so on.
The spectral distributions of the various signals can be obtained by the algorithm described above.
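As a concrete illustration of the decomposition in equation (5), the following is a minimal sketch of a recursive radix-2 decimation-in-time FFT. The patent's implementation is written in VB6.0 with Matrix-VB, so the Python language and the function name used here are illustrative assumptions only.

```python
import cmath

def fft_radix2(x):
    """Recursive radix-2 Cooley-Tukey FFT; len(x) must be a power of 2."""
    N = len(x)
    if N == 1:
        return list(x)
    even = fft_radix2(x[0::2])   # N/2-point DFT of the even-indexed samples x(2r)
    odd  = fft_radix2(x[1::2])   # N/2-point DFT of the odd-indexed samples x(2r+1)
    X = [0j] * N
    for k in range(N // 2):
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]   # W_N^k * X_odd(k)
        X[k] = even[k] + t                               # butterfly: first half of the spectrum
        X[k + N // 2] = even[k] - t                      # butterfly: second half of the spectrum
    return X
```

The real parts of X correspond to the AR(k) components and the imaginary parts to the AI(k) components used in step 3.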
Step 3, a well-known feature-space representation is used to choose suitable "features": the root mean square value Y, the frequency-domain center frequency fc and the total energy E of the Acoustic Emission (AE) digital signal are selected, calculated by formulas (1), (2) and (3), and used as input samples for the BP neural network:
$$Y = \sqrt{\frac{\sum_{n=1}^{N} a^{2}(n)}{N}} \qquad (1)$$

$$E = \sum_{k=0}^{N/2} \left( AR(k)^{2} + AI(k)^{2} \right) \qquad (2)$$

$$f_{c} = \frac{\sum_{k=0}^{N/2} k \times f \times \left( AR(k)^{2} + AI(k)^{2} \right)}{2 \pi E} \qquad (3)$$
in the formula: y is the root mean square value of the digital signal of the Acoustic Emission (AE), a (N) is the digital signal of the Acoustic Emission (AE), N is the sampling number of the digital signal of the input Acoustic Emission (AE), k is 0, 1, N, AR (k) is the real part frequency spectrum after EET calculation, AI (k) is the imaginary part frequency spectrum after FFT calculationSpectrum, E is the total energy of the Acoustic Emission (AE) digital signal, fcThe frequency domain center frequency of the Acoustic Emission (AE) digital signal, f is the resolution frequency of FFT spectral analysis, e.g. the maximum frequency of the Acoustic Emission (AE) signal is fmaxSampling frequency of fsIf N is selected as the sampling point, the resolution frequency is f ═ fs/N。
Step 4, the BP neural network takes a part of the samples from step 3 for learning and training, so that the error produced by the sample training falls within a set range, giving the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk; the remaining samples are then input into the trained BP neural network for testing, yielding values in two distinct ranges for the passivated state and the sharp state of the grinding wheel, which are used to predict the starting time and the ending time of grinding wheel dressing respectively;
establishing a BP neural network model:
(1) topology of neural network
The neural network can establish a relation model for a multi-input and multi-output complex nonlinear system, and is very suitable for replacing human decision behaviors based on experience.
In this application, a three-layer BP neural network is used as the grinding wheel state identification model; FIG. 2 shows a general three-layer BP neural network model. In fact there is no fixed rule for choosing the number of hidden-layer nodes; the practical solution is to increase the number of neurons gradually, from small to large, until sufficiently high accuracy and a fast convergence rate are obtained. A self-constructing neural network is therefore adopted: the network is first given one hidden-layer node and its error is calculated; hidden-layer nodes are then added one at a time, and once the error requirement is met the procedure exits and the number of hidden-layer nodes is fixed, as sketched below. Training according to this method shows that, with a Sigmoid activation function and a 3-15-1 topology, the recognition accuracy is about 98%, so this topology is selected.
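The incremental hidden-node search described above can be sketched as follows; the helper train_fn is a hypothetical callable (train the network with a given hidden-layer size and return its error), not an interface defined in the patent.

```python
def choose_hidden_nodes(train_fn, max_nodes=30, target_error=0.01):
    """Grow the hidden layer one node at a time until the training error is acceptable."""
    best_nodes, best_err = 1, float("inf")
    for hidden in range(1, max_nodes + 1):   # start with one hidden node, then add one each pass
        err = train_fn(hidden)
        if err < best_err:
            best_nodes, best_err = hidden, err
        if err <= target_error:              # error within the set range: keep this size
            return hidden
    return best_nodes                        # otherwise return the best size tried
```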
(2) Neural network learning algorithm
Assume a given pattern xi is the network input, Ok is the network output and dk is the target value; for the output layer, j = 0, 1, ..., m and k = 1, 2, ..., l; for the hidden layer, i = 0, 1, ..., n and j = 1, 2, ..., m.
$$E_{p} = \frac{1}{2} \sum_{k} \left( d_{k} - O_{k} \right)^{2} \qquad (6)$$
where dk is the target value, Ok is the output value, and Ep is the sum of squared errors.
For the output layer, the error signal $\delta_{k}^{v}$ can be expanded as:

$$\delta_{k}^{v} = \left( d_{k} - O_{k} \right) f'\left( net_{k} \right) \qquad (7)$$
where $f(x) = \frac{1}{1 + e^{-x}}$ is the Sigmoid activation function and $\delta_{k}^{v}$ is the output-layer error signal.
For the hidden layer, the error signal $\delta_{j}^{y}$ can be expanded as:

$$\delta_{j}^{y} = f'\left( net_{j} \right) \sum_{k=1}^{l} \delta_{k}^{v} w_{jk} \qquad (8)$$
where $\delta_{j}^{y}$ is the hidden-layer error signal and $w_{jk}$ is the output-layer weight.
$$f'\left( net_{k} \right) = \frac{\partial O_{k}}{\partial net_{k}} = O_{k} \left( 1 - O_{k} \right)$$

$$f'\left( net_{j} \right) = \frac{\partial y_{j}}{\partial net_{j}} = y_{j} \left( 1 - y_{j} \right)$$
Thus, there are:
$$\delta_{k}^{v} = \left( d_{k} - O_{k} \right) O_{k} \left( 1 - O_{k} \right) \qquad (9)$$

$$\delta_{j}^{y} = y_{j} \left( 1 - y_{j} \right) \sum_{k=1}^{l} \delta_{k}^{v} w_{jk} \qquad (10)$$
where yj is the hidden-layer output value.
The output-layer weight correction is

$$\Delta w_{jk} = \eta \left( d_{k} - O_{k} \right) O_{k} \left( 1 - O_{k} \right) y_{j} \qquad (11)$$
The hidden layer weight correction amount is
$$\Delta v_{ij} = \eta \left( \sum_{k=1}^{l} \delta_{k}^{v} w_{jk} \right) y_{j} \left( 1 - y_{j} \right) x_{i} \qquad (12)$$
To improve the convergence characteristics of the network, a correction expression with a momentum term is used:

$$\Delta W(n) = \eta \delta X + \alpha \Delta W(n-1)$$

where W is the weight matrix of a given layer, X is the input vector of that layer, η is the learning factor, and α is a proportionality (momentum) constant that adjusts the correction.
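The learning rule of equations (6)-(12), including the momentum term, can be sketched in numpy as follows. The function name, the hyperparameters (η = 0.5, α = 0.7, the epoch limit and tolerance) and the batch-style update are assumptions added for illustration; the patent's actual implementation is the VB6.0 program described in the next subsection.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_bp(X, d, hidden=15, eta=0.5, alpha=0.7, epochs=5000, tol=1e-3, seed=0):
    """Train a three-layer BP network (e.g. 3-15-1) with sigmoid units and momentum.

    X: (samples, 3) feature matrix [Y, E, fc]; d: (samples, 1) targets (1 = sharp, 0 = dull).
    Returns input-hidden weights V, hidden-output weights W and bias terms R, Q, plus the final error.
    """
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], d.shape[1]
    V = rng.uniform(-0.5, 0.5, (n_in, hidden))    # input -> hidden weights V_ij
    R = np.zeros(hidden)                          # hidden biases (thresholds R_j)
    W = rng.uniform(-0.5, 0.5, (hidden, n_out))   # hidden -> output weights W_jk
    Q = np.zeros(n_out)                           # output biases (thresholds Q_k)
    dV = np.zeros_like(V); dW = np.zeros_like(W)  # previous corrections for the momentum term
    Ep = np.inf
    for _ in range(epochs):
        y = sigmoid(X @ V + R)                    # hidden-layer output y_j
        O = sigmoid(y @ W + Q)                    # network output O_k
        Ep = 0.5 * np.sum((d - O) ** 2)           # error sum of squares, eq. (6)
        if Ep < tol:                              # error within the set range: stop training
            break
        delta_o = (d - O) * O * (1 - O)           # output error signal, eq. (9)
        delta_h = y * (1 - y) * (delta_o @ W.T)   # hidden error signal, eq. (10)
        dW = eta * y.T @ delta_o + alpha * dW     # eq. (11) plus momentum term
        dV = eta * X.T @ delta_h + alpha * dV     # eq. (12) plus momentum term
        W += dW; V += dV
        Q += eta * delta_o.sum(axis=0)            # bias (threshold) updates
        R += eta * delta_h.sum(axis=0)
    return V, W, R, Q, Ep
```

Here V/R play the role of the input-to-hidden weights and thresholds and W/Q the hidden-to-output weights and thresholds referred to in steps 4 and 5.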
(3) VB6.0 programming for realizing FFT spectral analysis and training and testing of BP neural network
The grinding wheel dressing control interface is mainly used to collect the AE signal, perform spectral analysis on it, draw the AE signal spectrogram, extract the characteristic values of the AE signal (the root mean square value Y, the frequency-domain center frequency fc of the amplitude spectrum and the total energy E), and judge, through BP neural network prediction, the starting time and the ending time of grinding wheel dressing.
Digital AE signals are received from the dynamic balancer through the serial port or the Profibus bus and are divided into 4 groups: group 1 is the digital AE signal collected while a sharp grinding wheel grinds the workpiece, group 2 is the digital AE signal collected while a passivated (dull) grinding wheel grinds the workpiece, group 3 is the digital AE signal collected while the diamond roller dresses a passivated grinding wheel, and group 4 is the digital AE signal collected while the diamond roller dresses a sharp grinding wheel. Each group contains 200 segments, and each segment contains 256 sample points. Spectral analysis is carried out on the 4 groups of AE signals, and the characteristic values (root mean square value Y, frequency-domain center frequency fc and total energy E) are calculated. The characteristic values of groups 1 and 2 are then used as input values to train the BP neural network until the error falls within the allowed range; the trained network recognizes the grinding state of the wheel and gives the initial moment at which the wheel needs dressing. The characteristic values of groups 3 and 4 are trained in the same way to obtain the moment at which grinding wheel dressing is finished. A sketch of this training and testing workflow is given below.
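The following is a hedged end-to-end sketch of that workflow, reusing the ae_features and train_bp sketches above. The synthetic random segments, the per-feature rescaling by the training maxima and the helper names are assumptions added purely for illustration; in the real system the segments come from the Marposs balancer and the processing runs in the VB6.0 program.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-ins for the acquired AE segments (200 per group); in the real system
# these arrive from the Marposs dynamic balancer via the serial port or Profibus.
group1_segments = rng.normal(size=(200, 1024))          # sharp wheel grinding (group 1)
group2_segments = 0.3 * rng.normal(size=(200, 1024))    # lower-amplitude stand-in for dull wheel grinding (group 2)

def predict_bp(X, V, W, R, Q):
    """Forward pass of the trained network (sigmoid hidden and output layers)."""
    y = 1.0 / (1.0 + np.exp(-(X @ V + R)))
    return 1.0 / (1.0 + np.exp(-(y @ W + Q)))

sharp = np.array([ae_features(seg, fs=250_000.0) for seg in group1_segments])
dull  = np.array([ae_features(seg, fs=250_000.0) for seg in group2_segments])

# Rescale each feature to a comparable magnitude so the sigmoid units do not saturate
# (the application example scales fc by 0.001 and E by 0.1 for the same reason).
ref = np.max(np.vstack([sharp[:160], dull[:160]]), axis=0)

X_train = np.vstack([sharp[:160], dull[:160]]) / ref
d_train = np.vstack([np.ones((160, 1)), np.zeros((160, 1))])   # ideal output: 1 = sharp, 0 = dull
V, W, R, Q, _ = train_bp(X_train, d_train)

X_test = np.vstack([sharp[160:], dull[160:]]) / ref
out = predict_bp(X_test, V, W, R, Q)   # outputs drifting toward 0 signal that dressing should start
```

Training a second network on groups 3 and 4 in the same way gives the signal that dressing is complete.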
Fig. 3 is a flowchart of a training procedure of the BP neural network, and fig. 4 is a flowchart of a testing procedure of the BP neural network.
Step 5, programming the processes of steps 2, 3 and 4 with the VB and Matrix-VB software tools, and calculating the root mean square value Y, the signal energy E and the center frequency fc of the signal, the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk, and the values of the grinding wheel in the two distinct ranges for the passivated and sharp states, so as to verify whether the result of step 4 achieves the expected aim of automatic grinding wheel dressing; the program is then ported into the numerical control system of the camshaft grinding machine to realize the automatic grinding wheel dressing function.
Examples of the applications
(1) The AE digital signals are read from the Marposs dynamic balancer through the serial port and divided into 4 groups: group 1 is the digital AE signal collected while a sharp grinding wheel grinds the workpiece, group 2 is the digital AE signal collected while a passivated (dull) grinding wheel grinds the workpiece, group 3 is the digital AE signal collected while the diamond roller dresses a passivated grinding wheel, and group 4 is the digital AE signal collected while the diamond roller dresses a sharp grinding wheel. Each group contains 200 segments, and each segment contains 256 sample points.
(2) A Fast Fourier Transform (FFT) is applied to the 4 groups (200 segments per group, 256 sampling points per segment) using formula (5) of step 2, programmed as in step 5, and the different spectral components of the real parts AR(0)-AR(127) and the imaginary parts AI(0)-AI(127) are calculated for every segment of the 4 groups.
(3) From the spectral components AR(0)-AR(127) and AI(0)-AI(127) of every segment (4 groups, 200 segments per group, 256 sampling points per segment), the root mean square value Y, the frequency-domain center frequency fc and the total energy E of the 4 groups of 200 Acoustic Emission (AE) digital signals are calculated with formulas (1), (2) and (3) of step 3, programmed as in step 5; Table 1 shows some of the calculated values for one group. With fs = 250 kHz and N = 1024, the resolution frequency is f = fs/N = 250000/1024 ≈ 244 Hz.
TABLE 1 Extracted characteristic values
(4) The 4 groups of characteristic values are divided into two pairs for BP neural network training. The first pair consists of the characteristic values (root mean square value Y, frequency-domain center frequency fc and total energy E) of the 200 Acoustic Emission (AE) digital signals collected while the sharp grinding wheel of group 1 grinds the workpiece and of the 200 AE digital signals collected while the passivated grinding wheel of group 2 grinds the workpiece. These are fed into the formulas of step 4, programmed as in step 5: the characteristic values of 160 AE signals from group 1 (sharp wheel grinding) and of 160 AE signals from group 2 (passivated wheel grinding) are sent to the BP neural network for training, with the ideal output set to 1 when the grinding wheel is sharp and to 0 when it is passivated, and training continues until the error curve is within the allowed range. Training is performed with the neural network interface programmed in VB6.0; the BP neural network parameter input screen is shown in FIG. 5 and the training result in FIG. 6. Note: the network inputs are the root mean square value Y, the frequency-domain center frequency scaled as fc × 0.001 and the energy scaled as E × 0.1.
(5) The weights and threshold values obtained from training (Vji, Wkj, Rj, Qk, as shown in the BP neural network training result of FIG. 6) are stored, and the remaining 40 samples of each of the two groups are taken to test the neural network, judging whether the wheel is correctly identified as sharp or passivated while it machines the workpiece; the operation result is shown in the BP neural network test result of FIG. 7. In the same way, the other pair, consisting of the characteristic values (root mean square value Y, frequency-domain center frequency fc and total energy E) of 160 AE signals collected while the diamond roller dresses the passivated grinding wheel (group 3) and of the 200 Acoustic Emission (AE) digital signals collected while the diamond roller dresses the sharp grinding wheel (group 4), is used to train and test a second neural network, which judges whether the wheel being dressed is passivated or already sharp at that moment. In this way the automatic grinding wheel dressing function is realized.
The method of the invention uses only the acoustic emission (AE) digital signal output by the dynamic balancer: it performs FFT spectral analysis on the signal, calculates the root mean square value Y, the frequency-domain center frequency fc and the total energy E of the Acoustic Emission (AE) digital signal, and sends these to the BP neural network for training and testing, obtaining accurate times for grinding wheel passivation (start of dressing) and for the wheel becoming sharp again (end of dressing), thereby realizing the on-line grinding wheel dressing function. Compared with the original methods, it is intelligent, predicts the grinding wheel state with high accuracy, improves machining precision and efficiency, prolongs the service life of the grinding wheel and reduces enterprise production costs.

Claims (1)

1. An online monitoring and dressing method for a grinding wheel, characterized by comprising the following steps:
step 1, using the existing grinding wheel dressing system, detecting the Acoustic Emission (AE) digital signal emitted by the grinding wheel during grinding and processed by the system;
step 2, performing a Fast Fourier Transform (FFT) on the N Acoustic Emission (AE) digital signals a(n) from step 1 to obtain the N/2 different spectral components of the real part AR(k) and the imaginary part AI(k);
step 3, calculating, by formulas (1), (2) and (3), the root mean square value Y, the total energy E and the frequency-domain center frequency fc of the Acoustic Emission (AE) digital signal as input samples of the BP neural network:
$$Y = \sqrt{\frac{\sum_{n=1}^{N} a^{2}(n)}{N}} \qquad (1)$$

$$E = \sum_{k=0}^{N/2} \left( AR(k)^{2} + AI(k)^{2} \right) \qquad (2)$$

$$f_{c} = \frac{\sum_{k=0}^{N/2} k \times f \times \left( AR(k)^{2} + AI(k)^{2} \right)}{2 \pi E} \qquad (3)$$
in the formula: y is the root mean square value of the digital signal of the Acoustic Emission (AE), a (N) is the digital signal of the Acoustic Emission (AE), N is the sampling number of the digital signal of the input Acoustic Emission (AE), k is 0, 1, N, AR (k) is the real part frequency spectrum after FFT calculation, AI (k) is the imaginary part frequency spectrum after FFT calculation, E is the total energy of the digital signal of the Acoustic Emission (AE), fcThe frequency domain center frequency of the Acoustic Emission (AE) digital signal, f is the resolution frequency of FFT spectral analysis, e.g. the maximum frequency of the Acoustic Emission (AE) signal is fmaxSampling frequency of fsIf N is selected as the sampling point, the resolution frequency is f ═ fs/N。
Step 4, the BP neural network takes a part of the samples from step 3 for learning and training, so that the error produced by the sample training falls within a set range, giving the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk; the remaining samples are then input into the trained BP neural network for testing, yielding values in two distinct ranges for the passivated state and the sharp state of the grinding wheel, which are used to predict the starting time and the ending time of grinding wheel dressing respectively;
step 5, programming the processes of steps 2, 3 and 4 with the VB and Matrix-VB software tools, and calculating the root mean square value Y, the signal energy E and the center frequency fc of the signal, the BP neural network weights Vij, Wjk and threshold (control) values Rj, Qk, and the values of the grinding wheel in the two distinct ranges for the passivated and sharp states, so as to verify whether the result of step 4 achieves the expected aim of automatic grinding wheel dressing; the program is then ported into the numerical control system of the camshaft grinding machine to realize the automatic grinding wheel dressing function.
CN 201110247142 2011-08-26 2011-08-26 Method for monitoring and dressing grinding wheel on line Active CN102267095B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 201110247142 CN102267095B (en) 2011-08-26 2011-08-26 Method for monitoring and dressing grinding wheel on line

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN 201110247142 CN102267095B (en) 2011-08-26 2011-08-26 Method for monitoring and dressing grinding wheel on line

Publications (2)

Publication Number Publication Date
CN102267095A true CN102267095A (en) 2011-12-07
CN102267095B CN102267095B (en) 2013-04-03

Family

ID=45049656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 201110247142 Active CN102267095B (en) 2011-08-26 2011-08-26 Method for monitoring and dressing grinding wheel on line

Country Status (1)

Country Link
CN (1) CN102267095B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102621226A (en) * 2012-04-16 2012-08-01 上海理工大学 Device for measuring grinding vibration performance of CBN (cubic boron nitride) grinding wheel matrix
CN105021706A (en) * 2015-07-16 2015-11-04 郑州磨料磨具磨削研究所有限公司 Grinding wheel broken state early warning recognition device and method
CN104625966B (en) * 2013-11-13 2017-01-25 中国科学院沈阳计算技术研究所有限公司 Slow-advancing grinding online dressing and machining method based on 840D
CN106649930A (en) * 2016-09-24 2017-05-10 上海大学 Online measurement method for grinding wheel arc finishing outline
CN107457703A (en) * 2017-09-07 2017-12-12 哈尔滨工业大学 A kind of end surface full jumping is better than 2 μm of bronze boart boart wheel disc precise dressing method
CN107577736A (en) * 2017-08-25 2018-01-12 上海斐讯数据通信技术有限公司 A kind of file recommendation method and system based on BP neural network
CN112372379A (en) * 2020-11-12 2021-02-19 中国航发南方工业有限公司 Grinding method for complex curved surface type blade tip for aero-engine
CN113798929A (en) * 2021-08-03 2021-12-17 郑州大学 Diamond tool finishing state identification method based on acoustic emission
CN114714200A (en) * 2022-05-07 2022-07-08 哈尔滨工业大学 A process optimization method for hemispheric harmonic oscillator grinding based on acoustic analysis
CN118905946A (en) * 2024-09-04 2024-11-08 东莞市冠锋金刚石制品有限公司 Dressing control method and system for diamond grinding wheel


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7229336B2 (en) * 1999-08-31 2007-06-12 Micron Technology, Inc. Apparatus and method for conditioning and monitoring media used for chemical-mechanical planarization
WO2007043263A1 (en) * 2005-10-14 2007-04-19 Asahi Glass Company, Limited Truing member for abrasive pad and truing method of abrasive pad
CN200951522Y (en) * 2006-10-09 2007-09-26 北京第二机床厂有限公司 Micro precision repairing and maintaining tool setting device of abrasion wheel
CN101642895B (en) * 2009-09-11 2011-06-29 湖南大学 Laser Dressing Method of Superabrasive Grinding Wheel

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102621226A (en) * 2012-04-16 2012-08-01 上海理工大学 Device for measuring grinding vibration performance of CBN (cubic boron nitride) grinding wheel matrix
CN104625966B (en) * 2013-11-13 2017-01-25 中国科学院沈阳计算技术研究所有限公司 Slow-advancing grinding online dressing and machining method based on 840D
CN105021706A (en) * 2015-07-16 2015-11-04 郑州磨料磨具磨削研究所有限公司 Grinding wheel broken state early warning recognition device and method
CN106649930A (en) * 2016-09-24 2017-05-10 上海大学 Online measurement method for grinding wheel arc finishing outline
CN107577736A (en) * 2017-08-25 2018-01-12 上海斐讯数据通信技术有限公司 A kind of file recommendation method and system based on BP neural network
CN107457703B (en) * 2017-09-07 2019-03-19 哈尔滨工业大学 A kind of bronze boart boart wheel disc precise dressing method of the end surface full jumping better than 2 μm
CN107457703A (en) * 2017-09-07 2017-12-12 哈尔滨工业大学 A kind of end surface full jumping is better than 2 μm of bronze boart boart wheel disc precise dressing method
CN112372379A (en) * 2020-11-12 2021-02-19 中国航发南方工业有限公司 Grinding method for complex curved surface type blade tip for aero-engine
CN112372379B (en) * 2020-11-12 2022-04-01 中国航发南方工业有限公司 Grinding method for complex curved surface type blade tip for aero-engine
CN113798929A (en) * 2021-08-03 2021-12-17 郑州大学 Diamond tool finishing state identification method based on acoustic emission
CN113798929B (en) * 2021-08-03 2022-06-24 郑州大学 Diamond tool finishing state identification method based on acoustic emission
CN114714200A (en) * 2022-05-07 2022-07-08 哈尔滨工业大学 A process optimization method for hemispheric harmonic oscillator grinding based on acoustic analysis
CN114714200B (en) * 2022-05-07 2023-05-30 哈尔滨工业大学 A Hemispherical Harmonic Oscillator Grinding Process Optimization Method Based on Acoustic Wave Analysis
CN118905946A (en) * 2024-09-04 2024-11-08 东莞市冠锋金刚石制品有限公司 Dressing control method and system for diamond grinding wheel

Also Published As

Publication number Publication date
CN102267095B (en) 2013-04-03

Similar Documents

Publication Publication Date Title
CN102267095A (en) Method for monitoring and dressing grinding wheel on line
Kwak et al. Neural network approach for diagnosis of grinding operation by acoustic emission and power signals
CN114619292B (en) A Milling Tool Wear Monitoring Method Based on Wavelet Noise Reduction and Attention Mechanism Fusion GRU Network
CN103786069B (en) Flutter online monitoring method for machining equipment
CN104723171B (en) Cutter wear monitoring method based on current and acoustic emission compound signals
CN101817163B (en) Neural network-based grinding machining working condition detection method
CN108747590A (en) A kind of tool wear measurement method based on rumble spectrum and neural network
Kwak et al. Trouble diagnosis of the grinding process by using acoustic emission signals
CN108942409A (en) The modeling and monitoring method of tool abrasion based on residual error convolutional neural networks
CN102335872B (en) Artificial neural network-based method and device for automatically trimming grinding wheel of grinding machine
CN113741377A (en) Machining process intelligent monitoring system and method based on cutting characteristic selection
CN112372371B (en) Method for evaluating abrasion state of numerical control machine tool cutter
Miranda et al. Monitoring single-point dressers using fuzzy models
EP3736648A1 (en) Method for autonomous optimization of a grinding process
CN109605127A (en) A kind of cutting-tool wear state recognition methods and system
CN108037034A (en) The multisensor on-line checking and data handling system of wheel grinding performance
CN109885900A (en) A Turning Parameter Optimization Method Based on Grayscale Analysis
Deng et al. A hybrid model using genetic algorithm and neural network for process parameters optimization in NC camshaft grinding
CN114253219A (en) A grinding force adaptive control method and system based on face grinding
CN101819119B (en) Wavelet analysis-based grinding machining working condition detection system and method thereof
Biera et al. Time-domain dynamic modelling of the external plunge grinding process
CN109117597B (en) Processing key factor grading method based on corresponding analysis method
Gupta et al. Investigation of tool chatter features at higher metal removal rate using sound signals
Mahata et al. In-process characterization of surface finish in cylindrical grinding process using vibration and power signals
CN114888635A (en) Cutter state monitoring method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent for invention or patent application
CB02 Change of applicant information

Address after: 410323 Weft Road Manufacturing Base, Liuyang, Hunan

Applicant after: Yuhuan CNC Machine Tool Co., Ltd.

Address before: 410323 Weft Road Manufacturing Base, Liuyang, Hunan

Applicant before: Hunan Yuhuan Tongxin CNC Machine Tool Co., Ltd.

C14 Grant of patent or utility model
GR01 Patent grant