
CN201543633U - Cleaning robot and dirt identifying device thereof - Google Patents


Info

Publication number
CN201543633U
CN201543633U (application CN2009201778895U)
Authority
CN
China
Prior art keywords
belief
foul
degree
image
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
CN2009201778895U
Other languages
Chinese (zh)
Inventor
汤进举
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ecovacs Robotics Suzhou Co., Ltd.
Original Assignee
Ecovacs Robotics Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ecovacs Robotics Suzhou Co Ltd filed Critical Ecovacs Robotics Suzhou Co Ltd
Priority to CN2009201778895U priority Critical patent/CN201543633U/en
Application granted granted Critical
Publication of CN201543633U publication Critical patent/CN201543633U/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Landscapes

  • Manipulator (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The utility model discloses a cleaning robot and a dirt identification device thereof. The identification device comprises an image acquisition module (10) and an image processing module (20). The image acquisition module (10) acquires image information of the surface to be cleaned by the cleaning robot (1) and sends it to the image processing module (20). The image processing module (20) divides the acquired image of the surface to be cleaned into N blocks, extracts image information from each block, and processes that information to determine which of the N blocks corresponds to the dirtiest part of the surface. With this scheme, the cleaning robot can actively identify dirt such as dust, enter the work area accurately and quickly, improve its working efficiency, and save working time.

Description

Cleaning robot and dirt recognition device thereof
Technical field
The utility model relates to a smart appliance, and in particular to a cleaning robot and a dirt recognition device thereof.
Background art
With the progress of the times and the rapid development of science and technology, smart appliances such as robots have become household words. Intelligent vacuum cleaners, intelligent floor-mopping machines and similar home-service robots, being convenient, time-saving and labor-saving, have freed people from tedious housework and entered ordinary family life.
In existing home-service robot technology and products, a robot can perform the most basic dust-removal functions as required by the user.
In the prior art, cleaning robots can be roughly divided into intelligent vacuum cleaners, intelligent sweepers, and intelligent cleaning machines that combine suction and sweeping. An intelligent vacuum cleaner works by a motor driving fan blades to rotate at high speed; the blades pump air strongly, creating a large pressure difference between the inside and outside of the cleaner body and an airflow at the suction inlet, so that dust and dirt on the surface to be cleaned are sucked with the airflow into the dust-collecting unit. After filtration, the dust and dirt remain in the dust-collecting unit while clean air passes through the filter and the fan blades and is discharged into the atmosphere. An intelligent sweeper has a rotating roller brush at its bottom; the rotation of the brush carries particles and other dirt on the surface to be cleaned into the dust-collecting unit of the sweeper. An intelligent cleaning machine is equipped with a motor and fan blades that generate vacuum suction as well as a rotating roller brush; through the combination of suction and brushing, dust and particulate dirt on the surface to be cleaned are drawn and swept into the dust-collecting unit, and the cleaning effect is better than that of a vacuum cleaner or sweeper of the same power.
The above cleaning robots generally move autonomously while working. When such a robot, for example an intelligent cleaning machine, encounters dirt such as dust along its path, dust sensors installed on both sides of the suction channel detect the amount of dirt. Driven by the roller brush and the suction force, the dirt enters the dust box through the suction channel; the dust sensor compares the detected amount of dirt passing through the channel with a preset standard value, and the robot decides accordingly whether a small-area, fixed-point cleaning of that spot is required.
It can be seen that the detection of dirt such as dust by existing cleaning robots is a passive process: only when the robot happens to run over the dirt does the detection function take effect, and there is no active identification or selection. Because such a cleaning robot cannot actively recognize dust while cleaning, it usually wanders over the surface to be cleaned and needs a long time to clean it completely, which is inefficient and wastes electric energy. To overcome these defects, a person has to guide the cleaning robot to the places where dust and other dirt are located, which in effect adds manual intervention and defeats the ultimate purpose of freeing people from housework.
In addition, there is also a cleaning robot equipped with a camera. The camera captures an image of the floor ahead, and the image is compared with a standard image to decide whether the area needs cleaning. Although this method actively probes the area to be cleaned, the judgment is too simple: it cannot accurately determine whether the captured area really needs cleaning, so the false-judgment rate is high.
Summary of the invention
The technical problem to be solved by the utility model is to overcome the above deficiencies of the prior art by providing a dirt recognition device and a cleaning robot that can actively identify dirt such as dust and accurately judge whether cleaning is needed, thereby improving the cleaning efficiency of the cleaning robot, saving cleaning time, and truly freeing people from the work.
To solve the above technical problem, the utility model provides a dirt recognition device for a cleaning robot, comprising an image acquisition module and an image processing module;
the image acquisition module is used to acquire image information of the surface to be cleaned by the cleaning robot and send the image information to the image processing module;
the image processing module divides the acquired image of the surface to be cleaned into N blocks (N>1), extracts image information from each block, and processes the image information so as to determine which of the N blocks corresponds to the dirtiest part of the surface to be cleaned, thereby enabling the device to actively identify dirt.
In a second scheme provided by the utility model, in the dirt recognition device of the first scheme, the image processing module comprises an image splitting unit, an information extraction unit and a computing unit;
the image splitting unit divides the acquired image of the surface to be cleaned into N blocks (N>1);
the information extraction unit extracts the image information of each block and sends it to the computing unit;
the computing unit judges, from the image information of each block, which block corresponds to the dirtiest part of the surface to be cleaned.
In a third scheme, in the dirt recognition device of the second scheme, the information extraction unit is a gray-value extraction unit and the computing unit is a comparison unit; the gray-value extraction unit extracts the corresponding gray value from each block, and the comparison unit compares the gray values of the blocks and determines the block with the largest gray value to be the dirtiest part of the surface to be cleaned.
In a fourth scheme, in the dirt recognition device of the second scheme, the information extraction unit is a gray-value extraction unit and the computing unit comprises a feature-value extraction unit and a comparison unit; the gray-value extraction unit extracts the corresponding gray value from each block; the feature-value extraction unit converts the gray value of each block into a corresponding feature value, thereby extracting the dirt feature of each block; and the comparison unit compares the dirt feature values of the blocks and determines the block with the largest feature value to be the dirtiest part of the surface to be cleaned.
In a fifth scheme, in the dirt recognition device of the second scheme, the information extraction unit is a gray-value extraction unit and the computing unit comprises a feature-value extraction unit, a theoretical belief degree computing unit and a comparison unit; the gray-value extraction unit extracts the corresponding gray value from each block; the feature-value extraction unit converts the gray value of each block into a corresponding feature value, thereby extracting the dirt feature of each block; the theoretical belief degree computing unit, taking the feature value as a parameter, obtains from a database the instant belief degree corresponding to that feature value and, according to the functional relation among the feature value, the instant belief degree and the theoretical belief degree, obtains the theoretical belief degree of each block; and the comparison unit compares the theoretical belief degrees of the blocks and determines the block with the largest theoretical belief degree to be the dirtiest part of the surface to be cleaned.
In a sixth scheme, the dirt recognition device of the fifth scheme further comprises a dust sensor and a theoretical belief degree correction unit; the dust sensor senses the amount of dirt on the surface to be cleaned and sends the sensed actual dirt amount to the theoretical belief degree correction unit; the correction unit calculates the difference between the actual dirt amount and a standard dirt amount and, according to the functional relation between this difference and the instant belief degree deviation, calculates the instant belief degree deviation; and the theoretical belief degree computing unit calculates the corrected theoretical belief degree of each block according to the functional relation among the feature value, the instant belief degree and the instant belief degree deviation.
In a seventh scheme, in the dirt recognition device of the sixth scheme, the comparison unit compares the corrected theoretical belief degrees of the blocks and determines the block with the largest theoretical belief degree to be the dirtiest part of the surface to be cleaned.
In an eighth scheme, in the dirt recognition device of the fifth scheme, the functional relation among the feature value, the instant belief degree and the theoretical belief degree is: P_A = A × F_A,
where A is the feature value, F_A is the instant belief degree for feature value A, and P_A is the theoretical belief degree for feature value A.
In a ninth scheme, in the dirt recognition device of the sixth scheme, the functional relation between the difference of the actual and standard dirt amounts and the instant belief degree deviation is:
ΔF_A(n) = f(x)
where x is the difference between the actual dirt amount and the standard dirt amount, ΔF_A(n) is the instant belief degree deviation for feature value A at the n-th correction, and f is a functional relation.
The theoretical belief degree computing unit (232E) calculates the corrected theoretical belief degree of each block by the following formula:
P_A(n)' = A × (F_A + ΔF_A(n) + ΔF_A(n-1) + … + ΔF_A(1))
where A is the feature value; F_A is the instant belief degree for feature value A; ΔF_A(n), ΔF_A(n-1), …, ΔF_A(1) are the instant belief degree deviations corresponding to the difference between the actual and standard dirt amounts at the n-th, (n-1)-th, …, 1st corrections respectively; and P_A(n)' is the theoretical belief degree for feature value A after the n-th correction.
In a tenth scheme, in the dirt recognition device of the ninth scheme, the functional relation f in ΔF_A(n) = f(x) is:
ΔF_A(n) = a_n·x^n + a_(n-1)·x^(n-1) + a_(n-2)·x^(n-2) + … + a_1·x + a_0
where x is the difference between the actual dirt amount and the standard dirt amount, ΔF_A(n) is the instant belief degree deviation for feature value A at the n-th correction, and a_n, a_(n-1), …, a_1, a_0 are polynomial coefficients.
In an eleventh scheme, the dirt recognition device of the fifth scheme further comprises the database, which stores mutually corresponding feature values, instant belief degrees, initial belief degrees and theoretical belief degrees.
In a twelfth scheme, the dirt recognition device of the sixth scheme further comprises the database, which stores mutually corresponding feature values, instant belief degrees, initial belief degrees, instant belief degree deviations, theoretical belief degrees and the threshold of the standard dirt amount.
In a thirteenth scheme, in the dirt recognition device of the eleventh or twelfth scheme, the instant belief degree corresponding to a feature value is initially equal to the initial belief degree.
In a fourteenth scheme, the dirt recognition device of the sixth scheme further comprises a setting unit for setting the initial belief degree and/or the threshold of the standard dirt amount.
In a fifteenth scheme, the setting unit in the dirt recognition device of the fourteenth scheme is a button, a knob, a touch-type device or a remote-control-type device.
The utility model also provides a cleaning robot comprising a robot body, a control unit, a drive unit, a walking unit and a cleaning unit. The control unit controls the cleaning unit to work and controls the drive unit, which drives the walking unit to walk. The robot further comprises the dirt recognition device of any one of the first to fifteenth schemes; according to the dirtiest part of the surface to be cleaned determined by the dirt recognition device, the control unit determines a walking route for the walking unit with that dirtiest part as the destination, and the dirtiest part of the surface is cleaned.
In the cleaning robot, the image processing module of the dirt recognition device of any one of the first to fifteenth schemes may be part of the control unit.
In the utility model, the image processing module processes the image acquired by the image acquisition module block by block and compares the image information of the blocks, thereby judging which block of the image corresponds to the dirtiest part of the surface. With this scheme, the cleaning robot can actively identify dirt such as dust and thus enter the work area accurately and quickly; compared with the prior art, the accuracy of judgment and the working efficiency are improved and working time is saved. Since the selection and cleaning of the work area require no human intervention, people are truly freed from the work. In addition, because the image information is converted into gray values, there is no need to store picture information, which would require a large amount of memory; this speeds up the operation of the control unit and, since no extra memory capacity is required, also reduces the purchase cost of components.
The technical solution of the utility model is described in detail below with reference to the drawings and specific embodiments.
Description of drawings
Fig. 1 is a control block diagram of the recognition device for actively identifying dirt according to the utility model;
Fig. 2 is a control block diagram of embodiment one of the recognition device for actively identifying dirt;
Fig. 3 is a workflow diagram of the recognition device of Fig. 2;
Fig. 4 is a control block diagram of embodiment two of the recognition device for actively identifying dirt;
Fig. 5 is a workflow diagram of the recognition device of Fig. 4;
Fig. 6 is a control block diagram of embodiment three of the recognition device for actively identifying dirt;
Fig. 7 is a control block diagram of embodiment four of the recognition device for actively identifying dirt;
Fig. 8 is a workflow diagram of the recognition device of Fig. 7;
Fig. 9 is a control block diagram of embodiment five of the recognition device for actively identifying dirt;
Fig. 10 is a workflow diagram of the recognition device of Fig. 9;
Fig. 11 is an overall schematic view of the self-moving cleaning robot according to the utility model;
Fig. 12 is a bottom schematic view of the self-moving cleaning robot;
Fig. 13 is a control block diagram of a specific embodiment of the self-moving cleaning robot;
Fig. 14 is a workflow diagram of the self-moving cleaning robot;
Figs. 15A-15C are schematic views of the self-moving cleaning robot in operation.
Specific embodiments
The utility model provides a recognition device for actively identifying dirt, comprising an image acquisition module 10 and an image processing module 20. The image acquisition module 10 acquires image information of the surface to be cleaned by the cleaning robot 1 and sends the image information to the image processing module 20. The image processing module 20 divides the acquired image of the surface to be cleaned into N blocks (N>1), extracts the image information of each block and processes it to finally judge which block corresponds to the dirtiest part of the surface; alternatively, it may send the extracted image information of each block to the cleaning robot 1, which then makes the judgment. In other words, the judgment of which block corresponds to the dirtiest part of the surface can be completed either by the recognition device or by the cleaning robot. This is explained below through specific embodiments.
Embodiment one of the recognition device for actively identifying dirt
As shown in Fig. 2, the recognition device provided in embodiment one comprises an image acquisition module 10A and an image processing module 20A, wherein the image processing module 20A comprises an image splitting unit 210A, an information extraction unit 220A and a computing unit 230A. Fig. 3 is the workflow diagram of this recognition device.
With reference to Figs. 2 and 3, after the image acquisition module 10A acquires the image information of the surface to be cleaned (step S10A), the image splitting unit 210A in the image processing module 20A divides the acquired image information into N blocks (N>1) according to the actual working requirements (step S20A); the information extraction unit 220A extracts the image information of each block (step S30A) and sends it to the computing unit 230A, which judges from the image information of each block which block corresponds to the dirtiest part of the surface to be cleaned (step S40A).
The information extraction unit 220A may extract only part of the information from each block, such as a gray value or a feature value, so the full image data need not be stored, which greatly saves memory; and because the computing unit 230A judges on the basis of this extracted information, the operating speed is improved.
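As a minimal illustration of this block splitting and partial-information extraction, the following Python sketch divides an image into N blocks and keeps only one summary value per block. It is not part of the original disclosure; the use of NumPy, the 2×2 grid and the function names are assumptions.

import numpy as np

def split_into_blocks(image: np.ndarray, rows: int = 2, cols: int = 2):
    """Divide a grayscale image (H x W array) into rows*cols blocks (N > 1)."""
    h, w = image.shape
    blocks = []
    for r in range(rows):
        for c in range(cols):
            blocks.append(image[r * h // rows:(r + 1) * h // rows,
                                c * w // cols:(c + 1) * w // cols])
    return blocks

def extract_block_info(block: np.ndarray) -> float:
    """Keep only a small summary (here the mean gray value) per block,
    so the full image never needs to be stored."""
    return float(block.mean())

if __name__ == "__main__":
    img = np.random.randint(0, 256, size=(120, 160), dtype=np.uint8)  # stand-in image
    infos = [extract_block_info(b) for b in split_into_blocks(img)]
    dirtiest_index = int(np.argmax(infos))
    print("block info:", infos, "dirtiest block:", dirtiest_index)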
Embodiment two of the recognition device for actively identifying dirt
As shown in Fig. 4, embodiment two provides a recognition device for actively identifying dirt; Fig. 5 is its workflow diagram. With reference to Figs. 4 and 5, the recognition device comprises an image acquisition module 10B and an image processing module 20B, wherein the image processing module 20B comprises an image splitting unit 210B, a gray-value extraction unit 220B and a comparison unit 230B.
The image acquisition module 10B acquires the image information of the surface to be cleaned by the cleaning robot (step S10B) and sends it to the image splitting unit 210B; the image splitting unit 210B divides the acquired image information into N blocks (N>1) according to the actual working requirements (step S20B); the gray-value extraction unit 220B extracts the corresponding gray value from each block (step S30B); and the comparison unit 230B compares the gray values of the blocks and determines the block with the largest gray value to be the dirtiest part of the surface to be cleaned (step S40B).
The method of extracting a gray value from an image is common knowledge; a gray value normally lies between 0 and 255.
This embodiment uses the gray value as the criterion because, for surfaces of the same character, the gray value of a surface carrying more dust or other dirt is usually larger than that of a surface carrying less, so the dirtiest part of the surface can be identified simply by comparing gray values.
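A minimal sketch of this gray-value comparison, assuming each block has already been reduced to a single representative gray value in 0-255; the helper name is hypothetical.

def pick_dirtiest_by_gray(block_gray_values: list[float]) -> int:
    """Sketch of the comparison unit (230B): return the index of the block
    whose representative gray value is largest."""
    if not block_gray_values:
        raise ValueError("at least one block is required (N > 1)")
    return max(range(len(block_gray_values)), key=lambda i: block_gray_values[i])

# Values from the worked example further below: left block 125, right block 180.
assert pick_dirtiest_by_gray([125, 180]) == 1  # the right-hand block is judged dirtier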
Embodiment three of the recognition device for actively identifying dirt
Fig. 6 is a schematic diagram of the recognition device for actively identifying dirt provided in embodiment three. The recognition device comprises an image acquisition module 10C and an image processing module 20C, wherein the image processing module 20C comprises an image splitting unit 210C, a gray-value extraction unit 220C, a feature-value extraction unit 231C and a comparison unit 230C; the feature-value extraction unit 231C and the comparison unit 230C constitute the computing unit.
The difference between this embodiment and embodiment one is that the feature-value extraction unit 231C converts the gray value into a feature value; the criterion for judging the dirtiest part of the surface is the size of the feature value, and the surface corresponding to the block with the largest feature value is determined to be the dirtiest.
The conversion uses a statistical method from pattern recognition, such as clustering. Specifically, after digitization, the image information of each block is pre-processed to remove interfering information and reduce certain kinds of distortion and deformation. Feature extraction is then performed, that is, a set of features is extracted from the digitized or pre-processed input pattern. A feature here means a chosen measure that remains (almost) unchanged under general distortion and deformation and contains as little redundancy as possible. The gray value, as common knowledge, is a number between 0 and 255; in this embodiment, the feature value obtained by the pattern-recognition algorithm lies between 0 and 1.
The recognition device of this embodiment converts the gray value of each block of image information, by a pattern-recognition algorithm, into a feature value, so that different gray values correspond to different feature values; by comparing the feature values it judges which block of the divided surface to be cleaned is the dirtiest.
As is well known, a digital image exists in the form of a bitmap, which is a rectangular grid of pixels whose brightness is identified by gray values. Comparing the gray values of entire blocks of image information therefore occupies a large amount of memory and puts a heavy load on the memory unit. In this embodiment, only one or several gray values are taken from each block and converted into a feature value by the algorithm, so that a gray-value comparison over the entire image information of each block is avoided, the amount of data is smaller and the data are more reliable.
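The patent does not fix a particular pattern-recognition algorithm for this gray-to-feature conversion. The sketch below only shows one assumed possibility, a simple normalization of a few sampled gray values into the stated 0-1 range; it does not reproduce the example values 0.15 and 0.56 used in the worked embodiment later, which come from an unspecified algorithm.

def gray_to_feature(sampled_gray_values: list[int]) -> float:
    """Illustrative stand-in for the feature-value extraction unit (231C):
    map one or a few sampled gray values (0-255) of a block to a feature
    value in [0, 1]; larger features are taken to indicate more dirt."""
    mean_gray = sum(sampled_gray_values) / len(sampled_gray_values)
    return mean_gray / 255.0

print(gray_to_feature([125]))  # ~0.49
print(gray_to_feature([180]))  # ~0.71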
Embodiment four of the recognition device for actively identifying dirt
As shown in Fig. 7, embodiment four provides a recognition device for actively identifying dirt. Compared with embodiment three, a theoretical belief degree computing unit 232D is added. According to the feature value, the theoretical belief degree computing unit 232D obtains from a database the instant belief degree corresponding to that feature value; the database may be stored in a memory unit inside the recognition device or in a memory unit outside it.
The database stores many pairs of corresponding feature values and instant belief degrees, so that given a feature value the corresponding instant belief degree can be obtained by lookup.
Fig. 8 is the workflow diagram of the device shown in Fig. 7. The recognition method of the recognition device is:
Step S10D: the image acquisition module 10D acquires the image information of the surface to be cleaned by the cleaning robot and sends it to the image splitting unit 210D;
Step S20D: the image splitting unit 210D divides the acquired image information into N blocks (N>1) according to the actual working requirements;
Step S30D: the gray-value extraction unit 220D extracts the corresponding gray value from each block;
Step S40D: the feature-value extraction unit 231D converts the gray value into a feature value by a pattern-recognition algorithm;
Step S50D: the theoretical belief degree computing unit 232D looks up the instant belief degree corresponding to the feature value in a database; initially, the instant belief degrees in the database are preset values that are all identical, and this initial instant belief degree is defined as the initial belief degree;
Step S60D: the theoretical belief degree computing unit 232D obtains the theoretical belief degree of each block according to the functional relation among the feature value, the instant belief degree and the theoretical belief degree; this relation is a conversion method, for example multiplication, i.e. P_A = A × F_A, where A is the feature value, F_A is the instant belief degree for feature value A, and P_A is the theoretical belief degree for feature value A, so that a theoretical belief degree is obtained for each block;
Step S70D: the comparison unit 230D compares the theoretical belief degrees of the blocks and determines the block with the largest theoretical belief degree to be the dirtiest part of the surface to be cleaned.
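A sketch of steps S50D-S70D under the multiplicative relation P_A = A × F_A. The dictionary used as the "database" and the uniform initial belief degree of 1 are assumptions, chosen only to be consistent with the worked example later in the text.

INITIAL_BELIEF = 1.0

def instant_belief(database: dict[float, float], feature_value: float) -> float:
    """Look up F_A; unseen feature values start at the initial belief degree."""
    return database.setdefault(feature_value, INITIAL_BELIEF)

def theoretical_belief(database: dict[float, float], feature_value: float) -> float:
    """P_A = A * F_A (the multiplicative functional relation of the eighth scheme)."""
    return feature_value * instant_belief(database, feature_value)

def pick_dirtiest_block(database: dict[float, float], features: list[float]) -> int:
    """Step S70D: the block with the largest theoretical belief degree wins."""
    beliefs = [theoretical_belief(database, a) for a in features]
    return max(range(len(beliefs)), key=lambda i: beliefs[i])

db: dict[float, float] = {}
print(pick_dirtiest_block(db, [0.15, 0.56]))  # -> 1, i.e. the block with A = 0.56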
Embodiment five of the recognition device for actively identifying dirt
Fig. 9 is a control block diagram of embodiment five of the recognition device for actively identifying dirt according to the utility model; Fig. 10 is the workflow diagram of the recognition device of Fig. 9.
Compared with embodiment four, this embodiment adds a dust sensor 235E and a theoretical belief degree correction unit 236E. The sensor may be an infrared emitting/receiving sensor or a sonar sensor. It senses the amount of dirt on the surface to be cleaned and sends the sensed actual dirt amount to the theoretical belief degree correction unit 236E, which calculates the difference between the actual dirt amount and the standard dirt amount and, according to the functional relation between this difference and the instant belief degree deviation, calculates the instant belief degree deviation; the theoretical belief degree computing unit 232E then uses this deviation to correct the previously calculated theoretical belief degree.
Step S10E: the image acquisition module 10E acquires the image information of the surface to be cleaned by the cleaning robot and sends it to the image splitting unit 210E;
Step S20E: the image splitting unit 210E divides the acquired image information into N blocks (N>1) according to the actual working requirements;
Step S30E: the gray-value extraction unit 220E extracts the corresponding gray value from each block;
Step S40E: the feature-value extraction unit 231E converts the gray value into a feature value by a pattern-recognition algorithm;
Step S50E: the theoretical belief degree computing unit 232E looks up the instant belief degree corresponding to the feature value in a database; initially, the instant belief degrees in the database are preset values that are all identical, and this initial instant belief degree is defined as the initial belief degree;
Step S60E: the theoretical belief degree computing unit 232E obtains the theoretical belief degree of each block according to the functional relation among the feature value, the instant belief degree and the theoretical belief degree;
Step S70E: the comparison unit 230E compares the theoretical belief degrees of the blocks and determines the block with the largest theoretical belief degree to be the dirtiest part of the surface to be cleaned.
Step S80E: when the cleaning robot works on this surface, the dust sensor 235E detects dirt such as dust and particles and sends the actually detected dust amount L to the theoretical belief degree correction unit 236E;
Step S90E: the theoretical belief degree correction unit 236E receives the actual dirt amount L sent by the dust sensor 235E and compares it with the standard dirt amount λ; according to the functional relation between this difference and the instant belief degree deviation, ΔF = f(x) = f(L − λ), it calculates the instant belief degree deviation ΔF, where x is the difference between the actual dirt amount and the standard dirt amount, i.e. (L − λ), ΔF is the instant belief degree deviation, and f is a functional relation representing a feedback correction algorithm. The deviation ΔF is then returned to the theoretical belief degree computing unit 232E, which recalculates the theoretical belief degree and thereby adjusts the theoretical belief degree of this block.
The feedback correction algorithm for the instant belief degree deviation ΔF mentioned above is obtained by relating the instant belief degree to the dust-amount deviation, and the functional relation ΔF_A(n) = f(x) between the two is:
ΔF_A(n) = a_n·x^n + a_(n-1)·x^(n-1) + a_(n-2)·x^(n-2) + … + a_1·x + a_0
where x is the difference between the actual dirt amount and the standard dirt amount;
ΔF_A(n) is the instant belief degree deviation for feature value A at the n-th correction;
a_n, a_(n-1), …, a_1, a_0 are polynomial coefficients.
Many functional relations can be derived from the above general form; which one is adopted depends on how the feedback effect of the dust-amount deviation is to be handled. For a fixed feedback effect, the relation reduces to the linear form y = a_1·x + a_0; for a non-fixed feedback effect, the relation between y and x is a curve, i.e. a higher-order polynomial in x such as a quadratic or an n-th-order function. The utility model only gives the idea of the feedback correction algorithm for the instant belief degree deviation ΔF here; those skilled in the art can flexibly choose the most suitable functional relation according to the actual working environment of the robot.
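The sketch below shows one possible encoding of this feedback correction. The linear coefficients chosen are purely illustrative assumptions, and the clamping to the 0-1 deviation range mirrors the value ranges given later in the text.

def belief_deviation(actual_dirt: float, standard_dirt: float,
                     coefficients: list[float]) -> float:
    """Sketch of the correction unit (236E): evaluate
    ΔF = f(x) = a_n*x^n + ... + a_1*x + a_0 with x = L - λ,
    then clamp the result to the stated deviation range 0-1."""
    x = actual_dirt - standard_dirt
    delta = sum(a * x ** i for i, a in enumerate(coefficients))  # coefficients[i] = a_i
    return min(max(delta, 0.0), 1.0)

def corrected_theoretical_belief(feature_value: float, initial_belief: float,
                                 deviations: list[float]) -> float:
    """P_A(n)' = A * (F_A + ΔF_A(n) + ... + ΔF_A(1))."""
    return feature_value * (initial_belief + sum(deviations))

# With an assumed linear relation ΔF = 0.001*x (a_1 = 0.001, a_0 = 0):
dF = belief_deviation(700, 500, [0.0, 0.001])         # ≈ 0.2, as in the worked example
print(corrected_theoretical_belief(0.56, 1.0, [dF]))  # ≈ 0.672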
In addition, to prevent image acquisition, processing and analysis from entering an endless loop, upper and lower limits can be imposed in the computing module on the range of the instant belief degree. When all instant belief degree values fall below the lower limit, the corresponding surface to be cleaned is clean enough and the image acquisition module need not acquire another image; when, within a set period of time, all instant belief degree values stay above the upper limit, the corresponding surface has been corrected and adjusted enough times, and the next image acquisition should be carried out.
When the next image is acquired, the last adjusted instant belief degree of the previous image is used as the current instant belief degree for that feature value.
Compared with embodiment four, in this embodiment, when the cleaning robot has moved to the determined surface and performs cleaning, the dust sensor measures the dust amount of the current working surface and the recognition device adjusts the current instant belief degree accordingly; the current instant belief degree equals the previous belief degree plus the sum of the deviations obtained from each adjustment.
Fig. 11 is an overall schematic view of the self-moving cleaning robot according to the utility model, and Fig. 12 is its control block diagram. With reference to Figs. 11 and 12, the cleaning robot has the function of actively identifying dirt. It comprises a robot body 1, a control unit 2, a drive unit 3, a walking unit 4 and a cleaning unit 5. The control unit 2 controls the cleaning unit 5 to work and controls the drive unit 3, which drives the walking unit 4 to walk. The robot also comprises the dirt recognition device described above, which determines the dirtiest part of the surface to be cleaned; according to this result, the control unit 2 determines a walking route for the walking unit 4 with the dirtiest part as the destination, and the dirtiest part of the surface is cleaned.
The control unit 2 controls the drive unit 3, which drives the walking unit 4 (driving wheels, see Fig. 13) and thus moves the robot body 1. To realize the automatic cleaning function, a cleaning unit 5 is arranged inside the self-moving cleaning robot; it comprises a vacuum suction unit, brushes and related components for cleaning the surface to be cleaned. A suction channel is provided at the bottom of the robot body; dirt such as dust enters the robot body 1 through this channel under the action of the roller brush and the vacuum suction. Dust sensors 235E are installed on both sides of the suction channel and measure the amount of dirt passing through the channel.
The aforementioned dirt recognition device may be a device independent of the robot's control unit 2, or a device whose composition overlaps with the control unit 2, i.e. the image processing module of the dirt recognition device may be part of the control unit 2. Since the recognition device has been explained above, it is not described in detail here.
The following describes in detail how a cleaning robot with the active dirt recognition function realizes the active identification of dirt during operation, and its workflow. For clarity, this embodiment is described with concrete data and the corresponding figures, as follows:
With reference to Figs. 11, 12, 13, 14 and 15A-15C, the workflow of the self-moving cleaning robot is as follows:
Step S101: the image acquisition module 10E (e.g. a camera) arranged at the head of the cleaning robot acquires an image of the surface to be cleaned in front of the robot;
Step S102: the image splitting unit 210E divides the image into a left block and a right block. As shown in Fig. 15A, the two blocks correspond to areas B and C respectively, and the image information of each block is converted into a gray value, for example 125 for the left block and 180 for the right block;
Step S103: the feature-value extraction unit 231E converts the gray values into feature values by a pattern-recognition algorithm, thereby extracting the dirt feature of each block; the corresponding feature values are, say, 0.15 and 0.56;
Step S104: the theoretical belief degree computing unit 232E, taking the feature value as a parameter, obtains from a database the instant belief degree corresponding to that feature value and, according to the functional relation among feature value, instant belief degree and theoretical belief degree, obtains the theoretical belief degree of each block. The data in the database are shown in Table 1; the theoretical belief degree computing unit finds from this table the instant belief degrees F_0.15 and F_0.56 for feature values 0.15 and 0.56. The instant belief degrees in the initial database are the initial belief degrees, all of which have the same value; in this embodiment the initial belief degree is preset to 1, i.e. F_0.15(0) = F_0.56(0) = 1. With each feature value and instant belief degree known, the theoretical belief degree is obtained by multiplication: theoretical belief degree = feature value × instant belief degree. Each block therefore has a corresponding theoretical belief degree: P_0.15(0) = 0.15 × (1+0) = 0.15; P_0.56(0) = 0.56 × (1+0) = 0.56;
Step S105: the two theoretical belief degrees are compared; the larger one is 0.56, so the right-hand block is determined to be the surface to be cleaned;
Step S106: the control unit 2 sends a control signal to the drive unit 3, which drives the walking unit 4 to move to area C, as shown in Fig. 15B; the robot moves to this surface, i.e. area C, and, as shown in Fig. 15C, the control unit 2 controls the cleaning unit 5 to clean this surface. While the drive unit 3 drives the walking unit 4 within the area, the robot can be located accurately by an encoder, an odometer or a device operating on a similar principle;
Step S107: while working on this surface, the dust sensor 235E detects dirt such as dust and particles in real time and sends the actually detected dust amount L_0.56 = 700 to the theoretical belief degree correction unit 236E in the recognition unit, which compares this actual dirt amount L_0.56 = 700 with the standard dirt amount λ = 500 (the threshold of the standard dirt amount λ can be set flexibly by the user through a setting unit). Through the feedback correction algorithm, the instant belief degree deviation for feature value 0.56 is calculated as ΔF_0.56(1) = f(L_0.56 − λ) = 0.2;
Step S108: the theoretical belief degree computing unit 232E then recalculates the theoretical belief degree for feature value 0.56, and the adjusted instant belief degree becomes the current instant belief degree for this feature value.
First the instant belief degree is calculated: it equals the previous instant belief degree plus each adjusted instant belief degree deviation ΔF(n). In this embodiment only one adjustment has been made, so F_0.56(1) = F_0.56(0) + ΔF_0.56(1) = 1 + 0.2 = 1.2. The theoretical belief degree corresponding to feature value 0.56 is then updated to P_0.56(1) = 0.56 × F_0.56(1) = 0.56 × 1.2 = 0.672.
In this concrete example, the feature value corresponding to an image gray value is any number in the range 0-1; the corresponding instant belief degree is any number in the range 0-10; and the instant belief degree deviation is any number in the range 0-1.
Table 1. Correspondence between feature values and the associated quantities
Feature value A:              0,       …, 0.15,           …, 0.56,            0.6,           …, 1
Instant belief degree F:      F_0(n),     F_0.15(n),         F_0.56(n),       F_0.6(n),         F_1(n)
Initial belief degree:        1,          1,                 1,               1,                1
Instant belief deviation ΔF:  ΔF_0,       ΔF_0.15,           ΔF_0.56,         ΔF_0.6,           ΔF_1
Theoretical belief degree P:  0,          0.15·F_0.15(n),    0.56·F_0.56(n),  0.6·F_0.6(n),     F_1(n)
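The following short sketch recomputes the worked example of steps S104-S108 above (the two-block case with feature values 0.15 and 0.56). It is only an arithmetic check written for this text, not an implementation of the patented device.

features = {"B": 0.15, "C": 0.56}               # left and right block feature values
instant = {a: 1.0 for a in features.values()}   # initial belief degrees F_A(0) = 1

# Steps S104/S105: theoretical belief degrees and selection of the dirtiest block.
theoretical = {name: a * instant[a] for name, a in features.items()}
print(theoretical)                    # {'B': 0.15, 'C': 0.56} -> area C is chosen

# Steps S107/S108: feedback correction after the dust sensor reads L = 700, λ = 500.
delta_f = 0.2                         # ΔF_0.56(1) = f(700 - 500), value taken from the text
instant[0.56] += delta_f              # F_0.56(1) = 1 + 0.2 = 1.2
print(0.56 * instant[0.56])           # P_0.56(1) ≈ 0.672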
In the above description of this embodiment, the initial belief degree is preset. Besides being preset, the initial belief degree can also be chosen by the user through a setting unit 7, either from several options or arbitrarily within a certain range (e.g. greater than 0 and not greater than 1). Through such manual setting, every initial belief degree, instant belief degree and theoretical belief degree is adjusted and changed correspondingly. With the trust criterion of the theoretical belief degree unchanged, if a smaller initial belief degree is chosen, the instant and theoretical belief degrees corresponding to each feature value become smaller; when images are used to actively identify dirt, the area is then considered less dirty and a correspondingly adapted working mode can be adopted. If a larger initial belief degree is chosen, the instant and theoretical belief degrees corresponding to each feature value become relatively larger, the area is considered dirtier, and a correspondingly adapted working mode can likewise be adopted.
Besides the initial belief degree described above, the dust sensor 235E can likewise be configured through a setting unit 7: the value of its standard dust amount, or the corresponding cleanliness level, can be chosen by the user, either from several options or arbitrarily within a certain range. If a smaller standard dust amount is chosen, the dust sensing signal is more sensitive and the dust sensor detects dirt such as dust more easily; if a larger standard dust amount is chosen, the dust sensing signal is less sensitive and the dust sensor is less likely to detect dirt such as dust.
While the robot is cleaning the current working surface, it judges in real time: (1) whether its battery level is below a preset value (step S109); (2) whether the current instant belief degrees corresponding to all feature values have converged to 0, the lower limit of the instant belief degree (step S111); (3) whether the actual dirt amount sensed by the dust sensor stays below a set value within a preset period of time, or stays above the set value within another preset period of time (step S112). If the battery level is below the preset value (step S109), the instant belief degrees corresponding to all feature values are retained, the robot leaves the current working state and cleaning stops (step S110). If the current instant belief degrees corresponding to all feature values have converged to 0 (the lower limit of the instant belief degree), the instant belief degrees stored in the database are cleared and reset to the default initial belief degrees, and the robot leaves the current working state, indicating that all working surfaces have been cleaned. If the dust amount detected by the dust sensor stays below the preset threshold, the surface is clean enough and the process returns to step S101 to select the next image; if the dust amount detected by the dust sensor stays above the preset threshold throughout a set period of time (e.g. 10 minutes), the surface cannot be cleaned completely, so cleaning of this surface is abandoned and the process returns to step S101 to select the next image.
In the above embodiment, the criterion used by the cleaning robot to decide where to move is the theoretical belief degree; the gray value or the feature value can of course also be used as the criterion, with the judging method as described in embodiments two and three of the recognition device for actively identifying dirt.
Because dirt is actively identified from image information alone, the recognition device is inevitably subject, to a greater or lesser extent, to interference from the external environment, and an identification based only on image information is not completely reliable. Therefore, in this embodiment the recognition device divides the acquired image information of the surface to be cleaned into N blocks (N>1), converts the image information of each block into a gray value, then performs a series of calculations to obtain the theoretical belief degree of each block, compares these theoretical belief degrees, takes the maximum, and on this basis selects the block of the surface to be cleaned corresponding to the maximum theoretical belief degree. The dust sensor then actually measures the block judged from the image information to be the dirtiest, the actually measured dust amount is compared with the preset standard dust value, the instant belief degree is continually corrected by the feedback correction algorithm, and the theoretical belief degree is updated accordingly.
With the above method, this embodiment performs, for the block currently judged to be the dirtiest, image acquisition of the surface to be cleaned, information conversion, value extraction, value comparison, maximum selection, auxiliary judgment by the dust sensor, feedback correction, data correction and data updating, i.e. continual judgment and error correction, which effectively improves efficiency and strengthens the reliability of the information.
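Tying the above steps together, a compact end-to-end sketch of this judge-then-correct loop might look as follows. It is a hypothetical illustration only: the camera capture and the dust-sensor reading are stubbed out with random numbers, and the gray-to-feature mapping and polynomial coefficients are assumptions.

import random

def acquire_block_grays() -> list[float]:
    """Stand-in for camera capture plus splitting into two blocks."""
    return [random.uniform(80, 200), random.uniform(80, 200)]

def run_one_cycle(instant_belief: dict[float, float], standard_dirt: float = 500.0):
    grays = acquire_block_grays()
    features = [round(g / 255.0, 2) for g in grays]              # assumed gray->feature map
    beliefs = [a * instant_belief.setdefault(a, 1.0) for a in features]
    target = max(range(len(beliefs)), key=lambda i: beliefs[i])  # dirtiest block

    actual_dirt = random.uniform(300, 800)                       # stand-in dust sensor reading
    delta = min(max(0.001 * (actual_dirt - standard_dirt), 0.0), 1.0)
    instant_belief[features[target]] += delta                    # feedback correction
    return target, features, beliefs

belief_db: dict[float, float] = {}
for _ in range(3):
    print(run_one_cycle(belief_db))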
In this embodiment, the recognition processing unit that performs image splitting, information extraction, judgment and calculation is part of the control unit 2. In a specific implementation it can be realized by the hardware and the internal control program of the control unit, or by a separate set of hardware and control program, in which case it should cooperate with the other control parts of the control unit 2 of the robot to complete the robot's cleaning.
The setting unit can be located on the human-machine interface of the robot, such as the setting panel 7 in Fig. 11, which carries a setting button for the initial belief degree and a setting button for the threshold of the standard dust amount of the dust sensor.
Through the setting unit, the user purposefully sets or chooses the initial belief degree, either from several options or arbitrarily within a certain range (e.g. greater than 0 and not greater than 1). Through such manual setting, every initial instant belief degree and every initial theoretical belief degree is adjusted and changed correspondingly. With the trust criterion of the theoretical belief degree unchanged, if a smaller initial belief degree is chosen, the instant and theoretical belief degrees corresponding to each feature value become smaller; when images are used to actively identify dirt, the area is considered less dirty and a correspondingly adapted working mode can be adopted, for example the self-moving cleaning robot performs fan-shaped cleaning in a small area, or the control unit turns the suction power down. If a larger initial belief degree is chosen, the instant and theoretical belief degrees corresponding to each feature value become relatively larger, the area is considered dirtier, and a correspondingly adapted working mode can likewise be adopted, for example the self-moving cleaning robot performs a spiral-like cleaning in a small area, or the control unit turns the suction power up.
The user can likewise manually choose and set the value of the standard dust amount, or the threshold corresponding to a certain cleanliness level, either from several options or arbitrarily within a certain range. If a smaller standard dust amount is chosen, the dust sensing signal is more sensitive and the dust sensor detects dirt such as dust more easily; if a larger standard dust amount is chosen, the dust sensing signal is less sensitive and the dust sensor is less likely to detect dirt such as dust.
In this embodiment, the settings are made with buttons; a knob, a touch screen, or a wired or wireless remote control can of course also be used.
The above recognition device and cleaning method for actively identifying dirt can be applied to various self-moving robots; depending on the built-in functional units of the self-moving robot, they can be used by a dust-removal robot, an intelligent cleaning machine and the like.
Finally, it should be noted that the above embodiments are only intended to illustrate the utility model and not to limit it. Although the utility model has been described in detail with reference to the preferred embodiments, those of ordinary skill in the art should understand that modifications or equivalent replacements made to the utility model without departing from its spirit and scope shall all be covered by the claims of the utility model.

Claims (17)

1. A dirt recognition device of a cleaning robot, comprising an image acquisition module (10), characterized in that it further comprises an image processing module (20);
the image acquisition module (10) is configured to acquire image information of the surface to be processed by the cleaning robot and to send the image information to the image processing module (20);
the image processing module (20) divides the acquired image of the surface to be processed into N blocks, where N > 1, extracts the image information of each block, and processes the image information so as to determine which one of the N blocks corresponds to the dirtiest part of the surface to be processed, thereby enabling the device to actively identify dirt.
2. The dirt recognition device of a cleaning robot according to claim 1, characterized in that the image processing module (20A) comprises an image segmentation unit (210A), an information extraction unit (220A) and a computing unit (230A);
the image segmentation unit (210A) is configured to divide the acquired image of the surface to be processed by the cleaning robot into N blocks, where N > 1;
the information extraction unit (220A) extracts the image information of each block and sends it to the computing unit (230A);
the computing unit (230A) determines, from the image information of each block, which one of the N blocks corresponds to the dirtiest part of the surface to be processed.
3. The dirt recognition device of a cleaning robot according to claim 2, characterized in that the information extraction unit (220A) is a gray-value extraction unit (220B), and the computing unit (230A) is a comparison unit (230B);
the gray-value extraction unit (220B) extracts the corresponding gray value from each block;
the comparison unit (230B) compares the gray values of the blocks, and determines the block with the largest gray value as the dirtiest part of the surface to be processed.
4. The dirt recognition device of a cleaning robot according to claim 2, characterized in that the information extraction unit (220A) is a gray-value extraction unit (220C), and the computing unit comprises a feature-value extraction unit (231C) and a comparison unit (230C);
the gray-value extraction unit (220C) extracts the corresponding gray value from each block;
the feature-value extraction unit (231C) converts the gray value of each block into its corresponding feature value, thereby extracting the dirt feature from each block;
the comparison unit (230C) compares the dirt feature values of the blocks, and determines the block with the largest dirt feature value as the dirtiest part of the surface to be processed.
5. The dirt recognition device of a cleaning robot according to claim 2, characterized in that the information extraction unit (220A) is a gray-value extraction unit (220D), and the computing unit (230A) comprises a feature-value extraction unit (231D), a theoretical degree of belief computing unit (232D) and a comparison unit (230D);
the gray-value extraction unit (220D) extracts the corresponding gray value from each block;
the feature-value extraction unit (231D) converts the gray value of each block into its corresponding feature value, thereby extracting the dirt feature from each block;
the theoretical degree of belief computing unit (232D) takes the feature value as a parameter, obtains from a database the instant degree of belief corresponding to the feature value, and obtains the theoretical degree of belief corresponding to each block according to the functional relation among the feature value, the instant degree of belief and the theoretical degree of belief;
the comparison unit (230D) compares the theoretical degrees of belief of the blocks, and determines the block with the largest theoretical degree of belief as the dirtiest part of the surface to be processed.
6. The dirt recognition device of a cleaning robot according to claim 5, characterized in that it further comprises a dust sensor (235E) and a theoretical degree of belief correction unit (236E);
the dust sensor (235E) is configured to sense the amount of dirt on the surface to be processed and to send the sensed actual dirt amount to the theoretical degree of belief correction unit (236E);
the theoretical degree of belief correction unit (236E) computes the difference between the actual dirt amount and the standard dirt amount and, according to the functional relation between this difference and the instant degree of belief deviation, computes the instant degree of belief deviation;
the theoretical degree of belief computing unit (232E) computes the corrected theoretical degree of belief corresponding to each block according to the functional relation among the feature value, the instant degree of belief and the instant degree of belief deviation.
7. The dirt recognition device of a cleaning robot according to claim 6, characterized in that the comparison unit (230E) compares the corrected theoretical degrees of belief of the blocks, and determines the block with the largest theoretical degree of belief as the dirtiest part of the surface to be processed.
8. The dirt recognition device of a cleaning robot according to claim 5, characterized in that the functional relation among the feature value, the instant degree of belief and the theoretical degree of belief is:
P_A = A · F_A
where A is the feature value, F_A is the instant degree of belief for feature value A, and P_A is the theoretical degree of belief for feature value A.
9. The dirt recognition device of a cleaning robot according to claim 6, characterized in that the functional relation between the instant degree of belief deviation and the difference of the actual dirt amount and the standard dirt amount is:
ΔF_A(n) = f(X)
where X is the difference between the actual dirt amount and the standard dirt amount, ΔF_A(n) is the instant degree of belief deviation for feature value A at the n-th correction, and f is a functional relation;
the theoretical degree of belief computing unit (232E) computes the corrected theoretical degree of belief corresponding to each block by the following formula:
P_A(n)' = A · (F_A + ΔF_A(n) + ΔF_A(n-1) + … + ΔF_A(1))
where A is the feature value; F_A is the instant degree of belief for feature value A; ΔF_A(n) is the instant degree of belief deviation corresponding to the difference between the actual dirt amount and the standard dirt amount at the n-th correction; ΔF_A(n-1) is that deviation at the (n-1)-th correction; ΔF_A(1) is that deviation at the 1st correction; and P_A(n)' is the theoretical degree of belief for feature value A after n corrections.
10. The dirt recognition device of a cleaning robot according to claim 9, characterized in that in ΔF_A(n) = f(X) the functional relation f is:
ΔF_A(n) = a_n·X^n + a_(n-1)·X^(n-1) + a_(n-2)·X^(n-2) + … + a_1·X + a_0
where X is the difference between the actual dirt amount and the standard dirt amount; ΔF_A(n) is the instant degree of belief deviation for feature value A at the n-th correction; and a_n, a_(n-1), …, a_1, a_0 are polynomial coefficients.
11. The dirt recognition device of a cleaning robot according to claim 5, characterized in that it further comprises the database, the database storing mutually corresponding feature values, instant degrees of belief, initial degrees of belief and theoretical degrees of belief.
12. The dirt recognition device of a cleaning robot according to claim 6, characterized in that it further comprises the database, the database storing mutually corresponding feature values, instant degrees of belief, initial degrees of belief, instant degree of belief deviations, theoretical degrees of belief and the threshold of the standard dirt amount.
13. The dirt recognition device of a cleaning robot according to claim 11 or 12, characterized in that the value of the instant degree of belief corresponding to the feature value is initially equal to the value of the initial degree of belief.
14. The dirt recognition device of a cleaning robot according to claim 6, characterized in that it further comprises a setup unit (7) for setting the initial degree of belief and/or the threshold of the standard dirt amount.
15. The dirt recognition device of a cleaning robot according to claim 14, characterized in that the setup unit (7) is a button, a knob, a touch-type device or a remote-control-type device.
16. A cleaning robot, comprising a robot body (1), a control unit (2), a drive unit (3), a travelling unit (4) and a cleaning unit (5), the control unit (2) controlling the cleaning unit (5) to work and controlling the drive unit (3) so that the drive unit (3) drives the travelling unit (4) to travel, characterized in that it further comprises the dirt recognition device according to any one of claims 1 to 15; according to the dirtiest part of the surface to be processed determined by the dirt recognition device, the control unit (2) determines a travel route for the travelling unit (4) with the dirtiest part of the surface to be processed as its destination, so that the dirtiest part of the surface to be processed is cleaned.
17. The cleaning robot according to claim 16, characterized in that the image processing module (20) in the dirt recognition device according to any one of claims 1 to 15 is a part of the control unit (2).
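By way of illustration only, the recognition pipeline recited in claims 1 to 10 can be sketched in Python as follows. The block grid size, the feature-value mapping, the default instant degree of belief, the polynomial coefficients and the database contents are assumptions; only the overall structure (splitting the image into N blocks, extracting a gray value and a feature value per block, computing the theoretical degree of belief P_A = A · F_A, correcting it with the dust-sensor deviation, and selecting the block with the largest value) follows the claims.

import numpy as np

def split_into_blocks(gray_image, rows=2, cols=2):
    """Divide the acquired image into N = rows * cols blocks (claim 1, N > 1)."""
    h, w = gray_image.shape
    bh, bw = h // rows, w // cols
    return [gray_image[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            for r in range(rows) for c in range(cols)]

def block_gray_value(block):
    """Gray-value extraction: one representative gray value per block (claim 3)."""
    return float(block.mean())

def feature_value(gray_value):
    """Feature-value extraction (claim 4): here simply a normalised gray value,
    so that, as in claim 3, a larger gray value indicates more dirt."""
    return gray_value / 255.0

def instant_belief(database, a):
    """Look up the instant degree of belief F_A for feature value A (claim 5);
    the default value used for missing entries is an assumption."""
    return database.get(round(a, 1), 0.5)

def theoretical_belief(a, f_a, deltas=()):
    """P_A = A * F_A (claim 8); with corrections,
    P_A(n)' = A * (F_A + dF_A(n) + ... + dF_A(1)) (claim 9)."""
    return a * (f_a + sum(deltas))

def belief_deviation(actual_dirt, standard_dirt, coeffs=(0.0, 0.01)):
    """dF_A(n) as a polynomial in X = actual - standard (claim 10);
    the coefficients here are placeholders."""
    x = actual_dirt - standard_dirt
    return sum(c * x ** i for i, c in enumerate(coeffs))

def dirtiest_block(gray_image, database, actual_dirt=None, standard_dirt=20):
    """Return the index of the block judged to be the dirtiest."""
    beliefs = []
    for block in split_into_blocks(gray_image):
        a = feature_value(block_gray_value(block))
        deltas = ()
        if actual_dirt is not None:                    # dust-sensor correction (claims 6-7)
            deltas = (belief_deviation(actual_dirt, standard_dirt),)
        beliefs.append(theoretical_belief(a, instant_belief(database, a), deltas))
    return int(np.argmax(beliefs))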
CN2009201778895U 2009-09-22 2009-09-22 Cleaning robot and dirt identifying device thereof Expired - Lifetime CN201543633U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009201778895U CN201543633U (en) 2009-09-22 2009-09-22 Cleaning robot and dirt identifying device thereof

Publications (1)

Publication Number Publication Date
CN201543633U true CN201543633U (en) 2010-08-11

Family

ID=42598792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009201778895U Expired - Lifetime CN201543633U (en) 2009-09-22 2009-09-22 Cleaning robot and dirt identifying device thereof

Country Status (1)

Country Link
CN (1) CN201543633U (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8924019B2 (en) 2009-07-03 2014-12-30 Ecovacs Robotics Suzhou Co., Ltd. Cleaning robot, dirt recognition device thereof and cleaning method of robot
CN102613944A (en) * 2012-03-27 2012-08-01 复旦大学 Dirt recognizing system of cleaning robot and cleaning method
CN102599855B (en) * 2012-03-30 2014-08-27 东莞市万锦电子科技有限公司 Automatic dust collection power converting device of intelligent cleaning equipment
CN102599855A (en) * 2012-03-30 2012-07-25 东莞市万锦电子科技有限公司 Automatic dust collection power converting device of intelligent cleaning equipment
CN102601797A (en) * 2012-04-07 2012-07-25 大连镔海自控股份有限公司 A high-speed parallel robot with three-dimensional translation and one-dimensional rotation
CN102601797B (en) * 2012-04-07 2014-08-06 大连创奇科技有限公司 A high-speed parallel robot with three-dimensional translation and one-dimensional rotation
CN104737085B (en) * 2012-09-24 2018-08-14 罗巴特公司 Robot for automatically detecting or handling ground and method
CN109613913A (en) * 2012-09-24 2019-04-12 罗巴特公司 Working method of autonomous mobile robot, autonomous mobile robot and system
CN103637749A (en) * 2013-11-25 2014-03-19 银川博聚工业产品设计有限公司 Washing integrated machine capable of conducting washing according to cleanliness degree
CN104317199A (en) * 2014-09-16 2015-01-28 江苏大学 Mobile smart housekeeper
CN105170528A (en) * 2015-09-29 2015-12-23 北京洁禹通环保科技有限公司 Automatic cleaning method and device for movable escalator steps
CN106821151A (en) * 2017-01-03 2017-06-13 合肥旋极智能科技有限公司 A kind of Intelligent cleaning robot
CN107144576A (en) * 2017-04-12 2017-09-08 丁永胜 A kind of colony house environment automatic cleaning method and system based on IMAQ
US12070181B2 (en) 2017-05-04 2024-08-27 Alfred Kärcher SE & Co. KG Floor cleaning appliance and method for cleaning a floor surface
CN112401737A (en) * 2017-06-05 2021-02-26 碧洁家庭护理有限公司 Autonomous floor cleaning system
CN107194436A (en) * 2017-06-19 2017-09-22 安徽味唯网络科技有限公司 A kind of method of sweeping robot Intelligent Recognition spot
CN107900056A (en) * 2017-10-27 2018-04-13 安徽育安实验室装备有限公司 One kind experiment house infrastructure system for washing intelligently
CN107876530A (en) * 2017-10-27 2018-04-06 安徽育安实验室装备有限公司 One kind experiment house infrastructure intelligence cleaning method
CN109576946A (en) * 2018-12-04 2019-04-05 余姚市朗硕电器科技有限公司 The mini washing tub of automatic aspirating
USD907868S1 (en) 2019-01-24 2021-01-12 Karcher North America, Inc. Floor cleaner
CN111084589A (en) * 2019-12-17 2020-05-01 万翼科技有限公司 Cleaning method and related product
CN116571489A (en) * 2023-05-25 2023-08-11 北京金轮坤天特种机械有限公司 High-pressure water jet cleaning method for environmental sediment on surface of thermal barrier coating

Similar Documents

Publication Publication Date Title
CN201543633U (en) Cleaning robot and dirt identifying device thereof
CN101941012B (en) Cleaning robot, dirt recognition device thereof and cleaning method of cleaning robot
US8871030B2 (en) Cleaning path guidance method combined with dirt detection mechanism
CN109124499B (en) Cleaning control method and chip based on cleaning robot
CN109124473B (en) Cleaning control method and chip of cleaning robot based on spiral walking
US11458628B1 (en) Method for efficient operation of mobile robotic devices
CN1745693A (en) The clean method of cleaner and this cleaner of use
CN107943044A (en) A kind of sweeping robot
CN110693396B (en) Obstacle avoidance processing mode of sweeper based on free move technology
CN107014633B (en) A kind of sweeping robot dust filter device intelligent detecting method
CN101992190A (en) Ground processing system and dirt cleaning and emptying method thereof
CN107669215A (en) Chip clean method, system and the sweeping robot being applicable
CN112294191A (en) Dirt blockage detection method for dust box filter screen of sweeper and sweeper
CN106725123A (en) A kind of Multifunctional floor-sweeping machine people
CN112515536B (en) Control method and device of dust collection robot and dust collection robot
CN108319266A (en) Sweeping method and control system of cleaning machine based on shape recognition
CN107664748B (en) Method and chip for detecting carpet by robot
JP2014079513A (en) Self-propelled vacuum cleaner
JP3926209B2 (en) Self-propelled vacuum cleaner
CN206565898U (en) A kind of full self-braking Intelligent robot for sweeping floor of ash
CN118078154A (en) Cleaning robot and cleaning control method thereof
CN114177731A (en) Building construction dust collector
CN111225592B (en) Self-propelled vacuum cleaner and extended area identification method
CN115644739B (en) Commercial cleaning robot control method and system based on Internet of things
EP3437535B1 (en) Vacuum cleaner

Legal Events

Date Code Title Description
C14 Grant of patent or utility model
GR01 Patent grant
C56 Change in the name or address of the patentee
CP01 Change in the name or title of a patent holder

Address after: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee after: ECOVACS ROBOTICS Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee before: ECOVACS ROBOTICS Co.,Ltd.

Address after: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee after: ECOVACS ROBOTICS Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee before: ECOVACS ROBOTICS (SUZHOU) Co.,Ltd.

Address after: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee after: ECOVACS ROBOTICS (SUZHOU) Co.,Ltd.

Address before: 215168 Wuzhong District, Jiangsu, Stone Lake Road West, No. 108

Patentee before: TEK ELECTRICAL (SUZHOU) Co.,Ltd.

CX01 Expiry of patent term
CX01 Expiry of patent term

Granted publication date: 20100811