SE1550114A1 - Milking arrangement - Google Patents
Milking arrangement
- Publication number
- SE1550114A1
- Authority
- SE
- Sweden
- Prior art keywords
- image
- camera
- milking
- teats
- control unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01J—MANUFACTURE OF DAIRY PRODUCTS
- A01J5/00—Milking machines or devices
- A01J5/017—Automatic attaching or detaching of clusters
- A01J5/0175—Attaching of clusters
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B42/00—Obtaining records using waves other than optical waves; Visualisation of such records by using optical means
- G03B42/02—Obtaining records using waves other than optical waves; Visualisation of such records by using optical means using X-rays
- G03B42/026—Obtaining records using waves other than optical waves; Visualisation of such records by using optical means using X-rays for obtaining three-dimensional pictures
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Animal Husbandry (AREA)
- Environmental Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Milking arrangement (1) comprising: a milking parlour (2) with teat cups (8); a robot arm (7) for connecting the teat cups to the teats (5) of a milking animal (3); and a control means (6, 9) for controlling the robot arm, comprising: a coherent light source (12) arranged to emit a coherent beam (13) of optical radiation; a speckle pattern generator (14) for forming a speckled beam (16); a camera (9) for imaging a reflected part of said beam; and a control unit (6) arranged to form a three-dimensional image from said image by comparing said image with a reference image of said beam, and statistically cross-correlating said speckle pattern with the speckle pattern in said reference image, to identify said teats and said teat cups and to detect the location thereof with respect to the camera, on the basis of said three-dimensional image.
Description
Milking arrangement The present invention relates to a milking arrangement. More in particular, the invention relates to a milking arrangement comprising a milking parlour with teat cups, a robot arm for connecting the teat cups to the teats of a milking animal, and a control means for controlling the robot arm.
Such milking arrangements are known as milking robots, available on the market today. Milking robots are able to milk cows in a completely automated way. Thereto, a dairy animal such as a cow enters the milking parlour. The control means is arranged to guide the robot arm to the teats for connecting the teat cups to the teats of the milking animal. Various types of control means are known in the prior art, such as laser detection systems, stereoscopic cameras and 3D time-of-flight cameras.
It turns out that in practice, the known control means do not always function sufficiently reliably and quickly, sometimes to the point that they are unable to connect all teat cups to the teats. This is of course undesirable, as it not only reduces the capacity of the milking arrangement, but might also reduce milk production for the dairy animal, as one or more quarters will not be milked out. Even if the dairy farmer were warned and milked the cow by connecting the teat cups manually, this would reduce the overall capacity of the milking arrangement, and would furthermore lead to more work for the farmer and more stress for the dairy animal.
It is therefore an object of the present invention to provide a milking arrangement of the kind mentioned above, that is more reliable and/or faster, or at least provides the public with a reasonable alternative.
The invention achieves this object by means of a milking arrangement according to claim 1, and in particular comprising a milking parlour with teat cups, a robot arm for connecting the teat cups to the teats of a milking animal, a control means for controlling the robot arm, wherein said control means comprise a coherent light source arranged to emit a coherent beam of optical radiation, a speckle pattern generator arranged to impart a speckle pattern to said beam, thereby forming a speckled beam, a camera for repeatedly obtaining an image of a reflected part of said beam, and a control unit arranged to form a three-dimensional image from said image by comparing said image with at least one reference reflection image of said beam and taken with said camera, and statistically cross-correlating said speckle pattern in said image with said speckle pattern in said at least one reference reflection image, wherein the control unit is further arranged to identify said teats and to detect the location thereof with respect to the camera, on the basis of said three-dimensional image. In such a milking arrangement, and such a control means, it turns out to be possible to detect teats, and thus connect the teat cups, reliably and fast. In particular, a fast detection is advantageous as milking animals are live creatures that can move in an unpredictable way, hindering easy detection and connection. Therefore, if that detection is performed swiftly, connection becomes more reliable.
Special embodiments of the invention are described in the dependent claims, as well as in what follows.
It is noted that the technology of creating a speckle pattern, such as caused by interference among different components of a diffused beam, and of using the speckle pattern to establish position, orientation and/or movement, is itself known from e.g. the game console controller Kinect and its software developer PrimeSense; see e.g. http://www.joystiq.com/2010/06/19/kinect-how-it-works-from-the-company-behind-the-tech/. Therefore, for specific technical details reference is made to corresponding publications such as WO2007/043036. In the present application, the presence of a memory for one or more reference images of the speckles at known distances, and other parts necessary to perform the method from the cited WO'036 document, are deemed implicitly included in the milking arrangement, in particular in the control means. Herein, various steps regarding the cross-correlation of an image to a reference image are given in an exemplary embodiment on pages 15 and further, hereby included by reference.
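The matching principle referred to here can be sketched in one dimension: a patch of the live speckle image is slid along the stored reference image, the shift with the highest normalized cross-correlation gives a disparity, and triangulation against the known reference distance yields a depth. This is a minimal illustrative sketch only; the focal length, baseline and reference depth below are hypothetical stand-ins, not values from the patent or the cited WO'036 document.

```python
import numpy as np

def best_disparity(patch, reference_row):
    """Slide `patch` along `reference_row` and return the shift with the
    highest normalized cross-correlation (the core matching step)."""
    p = (patch - patch.mean()) / (patch.std() + 1e-9)
    best, best_score = 0, -np.inf
    for s in range(reference_row.size - patch.size + 1):
        r = reference_row[s:s + patch.size]
        r = (r - r.mean()) / (r.std() + 1e-9)
        score = float(np.dot(p, r)) / patch.size
        if score > best_score:
            best, best_score = s, score
    return best

def depth_from_disparity(disparity_px, focal_px, baseline_mm, ref_depth_mm):
    """Convert a disparity (pixels) relative to the reference image, taken
    at ref_depth_mm, into an absolute depth by simple triangulation."""
    return 1.0 / (1.0 / ref_depth_mm + disparity_px / (focal_px * baseline_mm))

rng = np.random.default_rng(0)
ref = rng.random(256)        # stand-in for one row of the stored speckle reference
patch = ref[100:132].copy()  # an observed patch that matches at offset 100
assert best_disparity(patch, ref) == 100
```

In the real arrangement this search would run per patch over the whole image, producing one depth sample per patch, from which the three-dimensional image is assembled.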
It is furthermore noted that detection of structures and objects in a three-dimensional image is also known in the art. Depending on the objects to be detected, a number of criteria may be applied to the image. For example, if a teat has to be detected, one can look for a more or less cylindrical object with a diameter of about 2-3 cm and a length roughly between 2 and 8 cm, with a rounded tip at the lower side, connected at the upper side to a much bigger spherical structure, and moreover provided in fourfold in a trapezoidal symmetry. Furthermore, as finger detection has already been contemplated for Kinect and suchlike systems, with only computing power determining the required resolution, and as teats and fingers are geometrically alike objects, the presently contemplated system is well suited for teat detection. Of course, if other objects need to be detected, suitable criteria can be provided, based on knowledge of the geometry of those objects.
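The geometric criteria listed above can be captured by a simple plausibility filter. A hedged sketch: the candidate parameters (diameter, length, rounded tip) are assumed to come from an upstream shape-fitting stage that is not shown here, and the numeric bounds are the approximate values named in the text.

```python
def looks_like_teat(diameter_cm, length_cm, tip_rounded):
    """Geometric plausibility test for a candidate structure extracted
    from the 3-D image: roughly cylindrical, ~2-3 cm across, ~2-8 cm
    long, with a rounded lower tip."""
    return (2.0 <= diameter_cm <= 3.0
            and 2.0 <= length_cm <= 8.0
            and tip_rounded)

# (diameter_cm, length_cm, tip_rounded) for three fitted candidates:
candidates = [(2.5, 5.0, True),   # plausible teat
              (6.0, 5.0, True),   # too wide (e.g. a leg)
              (2.5, 12.0, True)]  # too long
teats = [c for c in candidates if looks_like_teat(*c)]
assert teats == [(2.5, 5.0, True)]
```

A full detector would additionally check the fourfold trapezoidal arrangement and the connection to the larger udder structure mentioned in the text.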
In advantageous embodiments, the control unit is further arranged to detect at least one of said teat cups and the position thereof with respect to the camera, on the basis of said three-dimensional image. By not only identifying a teat and detecting its location, but also detecting the position of a teat cup in the image, the control unit can determine the mutual distance between the teat cup and the teat to which it is to be connected. By controlling the guiding of the teat cup by minimising that distance, the efficiency will be improved. It also ensures that any mispositioning of the teat cup on the robot arm can be corrected. Failing to do so and using only a basic position might cause unsuccessful attempts to connect, which decreases system efficiency.
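Guiding by minimising the mutual distance amounts to a control loop that repeatedly shrinks the camera-measured offset between cup and teat. A minimal sketch, assuming both positions are already expressed in the same camera coordinate frame; the 1 cm per-cycle step limit is an arbitrary illustrative choice, not a value from the patent.

```python
import numpy as np

def guidance_step(cup_pos, teat_pos, max_step=0.01):
    """One control-loop cycle: move the cup toward the teat by at most
    max_step (metres), so the measured offset shrinks each cycle."""
    cup = np.asarray(cup_pos, float)
    teat = np.asarray(teat_pos, float)
    delta = teat - cup
    dist = float(np.linalg.norm(delta))
    if dist <= max_step:
        return teat                       # within one step: snap to target
    return cup + delta * (max_step / dist)

cup = np.array([0.0, 0.0, 0.0])
teat = np.array([0.03, 0.0, 0.04])        # 5 cm away
for _ in range(10):
    cup = guidance_step(cup, teat)
assert np.allclose(cup, teat)
```

Because both positions come from the same three-dimensional image, a mispositioned cup on the arm is corrected automatically: the loop always acts on the measured offset, not on an assumed cup position.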
A particular advantage of the system with respect to for example triangulation systems, using two or more cameras, is that a single camera suffices to do measurements. An actual image by the camera is compared and cross-correlated with a stored reference image taken before by the same camera at a known distance.
This makes the system less vulnerable to malfunctions of a camera (as it has only one), but also faster, as only one image at a time has to be acquired. This increased speed helps in obtaining the increased detection speed, in particular for unpredictably moving animals. Furthermore, it also helps that, contrary to time-of-flight measurements, no sampling of a modulated signal is necessary for each pixel separately, nor integration of said signal for averaging to improve poor signal-to-noise ratios. Inevitably, this takes time, while the arrangement according to the invention only requires the grabbing of a single image to detect momentary position, since the processing can be done externally, with a high-power computer that thus does not need to be present very near the camera or even on the robot arm.
Therefore, in a particular embodiment, the control means has a single camera that is arranged to repeatedly obtain an image of a reflected part of said beam, from which image the control unit forms a three-dimensional image. It is stressed here that this single camera relates to the grabbing of the image for making the three-dimensional image. Other cameras may be present for other purposes. The forming of the three-dimensional image still takes place by cross-correlating the speckle pattern in it with the speckle pattern of one or more reference images, as stored in the control means.
Advantageously, the speckle pattern generator is arranged to generate a constant and random speckle pattern. This allows easier cross-correlation of the image with the reference image, as each part of the pattern is in principle unique and can be traced in the distorted received image more easily. However, it is also possible to use a speckle pattern generator that is arranged to generate a speckle pattern having some degree of regularity, up to a completely regular pattern. In embodiments, the milking arrangement according to the invention further comprises an additional sensor different from said camera, said sensor being arranged to obtain at least one additional image of a reflected part of said beam in a way that differs from the way of said camera, and wherein the control unit is arranged to use said additional image in identifying at least said teats. With such embodiments, an additional image is available to improve the detection capabilities of the control means, because in use, the detection of teats will often depend on e.g. edge detection and the like. When a surface is inclined sharply, the reflected speckle pattern will be locally weak and/or much distorted. It is then relatively difficult to determine whether the signal is just weak but real, or whether there is some noise or other signal disturbance. In other words, many likelihood criteria used to detect edges and the like will have difficulties determining such edges with a high accuracy. Now, by using an additional image, this accuracy and likelihood may be improved, because it becomes easier to determine whether a structure in the acquired image is a real edge or similar structure by comparing it with the corresponding part of the additional image.
If the latter for example shows a similar discontinuity, the likelihood of there being a real edge is higher, while a smooth surface in the additional image helps to conclude that there is not. In embodiments, the additional sensor comprises a thermal image camera or an ultrasound sensor. Generally, such sensors are less susceptible to dirt, as a layer of dirt will often still reflect ultrasound in much the same way as clean tissue does, and dirt will assume a temperature that is often much the same as that of the underlying tissue. Therefore, using such additional images improves the reliability of the original, three-dimensional image. E.g., such a thermal image camera produces a thermal image of the region of interest. Since an udder and the teats, but also most other body parts, are at more than 30 °C (such as at about 33-35 °C), which is almost always much more than ambient temperature, these structures are easily visible.
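Because body structures sit well above ambient temperature, segmenting them out of a thermal image can be sketched as a simple threshold. The ~30 °C body threshold comes from the text; the ambient pixel values below are merely illustrative.

```python
import numpy as np

def warm_mask(thermal_c, body_min_c=30.0):
    """Mark pixels at body temperature: udder and teats (~33-35 °C)
    stand out against a cooler ambient background."""
    return np.asarray(thermal_c, float) >= body_min_c

# two rows of a toy thermal frame: mostly ambient (~14-16 °C)
# with three body-temperature pixels
frame = np.array([[14.0, 15.0, 33.5],
                  [16.0, 34.0, 35.0]])
assert int(warm_mask(frame).sum()) == 3
```

A real thermographic pipeline would of course be more involved (calibration, emissivity, connected-component analysis), but the visibility argument in the text reduces to exactly this contrast against ambient temperature.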
Also, since such a sensor measures a completely different, but relevant parameter, this additional image is also available for other purposes, such as health monitoring and general animal management. For example, if a teat or udder quarter has an inflammation, this may show up in the thermographic image as a rise in temperature.
Additionally or alternatively, the additional sensor comprises a visual camera.
Similarly, any perceived matching (dis)continuities in the visual image as compared with the original image may support the finding of an edge or other structure, while non-matching images support the absence of such structures. Furthermore, very cheap, compact, versatile and reliable visual cameras are available.
In particular embodiments, the control unit is arranged to determine a movement for at least a part of said at least one additional image. Preferably, the control unit is furthermore arranged to use said movement in detecting the location of at least the teats with respect to the control means. Specifically, the movement relates to speed, direction of movement, or preferably both. Furthermore, the movement is, whenever possible, determined for a structure already recognised in either the additional image or the (original) image.
As mentioned above, animals are live creatures that can move unexpectedly. More in particular, any movement of the milking animal as a whole will have a relatively large effect on the position of the teats. Therefore, fast imaging is important in order to obtain an arrangement that can reliably connect teat cups to teats. But being able to monitor and track such movements in processing the image for detecting the teats also helps in improving the reliability of the milking arrangement. By using movement recognition for the additional sensor, use can be made of any greater ease in doing so than for the image detection based on speckle patterns.
In particular, the additional sensor comprises a visual camera arranged to repeatedly obtain an additional image, wherein the control unit comprises visual image recognition software and is arranged to determine a rate of movement for one or more structures as detected in the additional image by the image recognition software. Such motion detection is much more easily done with visual techniques, and its results can be used in correcting any measurements and determinations for the original image and its subsequent processing into a three-dimensional image. For example, when the direction and rate of movement of a structure recognised as a teat tip is determined by means of the additional visual images, it becomes easier to predict where to find matching parts of the speckle pattern in a subsequent image in the cross-correlating step. After all, if a movement is swift, the relative displacement of e.g. a teat tip in the image can be such that the speckle pattern will be distorted with respect to the previous image to a relatively high degree. With knowledge of the movement, which can be determined in parallel with the acquiring of the original image, and of course with knowledge of the previously obtained three-dimensional image, the new three-dimensional image, and thus the speckle pattern to be expected, can be predicted to a higher degree, so that the actual determination of the new three-dimensional image can be performed more quickly.
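The prediction step described above can be sketched as dead reckoning on the visually tracked structure: estimate a per-frame velocity from two consecutive RGB detections, then centre the speckle cross-correlation search on the extrapolated position. The pixel coordinates below are illustrative only.

```python
def velocity_px(p0, p1):
    """Per-frame pixel velocity of a tracked structure (e.g. a teat tip)
    from its positions in two consecutive visual images."""
    return (p1[0] - p0[0], p1[1] - p0[1])

def predict_search_center(p1, v, frames_ahead=1):
    """Extrapolate where the structure should appear in the next speckle
    image, so the cross-correlation can start its search there instead
    of scanning the whole image."""
    return (p1[0] + v[0] * frames_ahead, p1[1] + v[1] * frames_ahead)

# teat tip seen at (316, 238) then (320, 240): 4 px right, 2 px down per frame
v = velocity_px((316, 238), (320, 240))
assert v == (4, 2)
assert predict_search_center((320, 240), v) == (324, 242)
```

The payoff is the one stated in the text: a good starting guess narrows the cross-correlation search window, so the new three-dimensional image can be determined more quickly even when the animal moves swiftly.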
The invention will now be described in more detail by means of some non-limiting embodiments and drawings, in which: - Figure 1 very diagrammatically shows a milking arrangement 1 according to the invention, in a perspective side elevational view; - Figure 2 very diagrammatically shows a part of the milking arrangement 1 in more detail; and - Figures 3A and 3B diagrammatically show a part of a reference image and of an actual image, respectively.
Figure 1 very diagrammatically shows a milking arrangement 1 according to the invention, in a perspective side elevational view.
The milking arrangement 1 comprises a milking parlour 2, that is presently occupied by a milking animal 3 having an udder 4 with teats 5. The arrangement further comprises a control unit 6 with a robot arm 7, here carrying a teat cup 8, and with a camera 9 having a field of view 10. Herein, the camera 9 is comprised in the control means for controlling movement of the robot arm 7.
The milking parlour 2 may be a spatially fixed milking parlour, or may be a parlour on a rotary platform. The parlour may comprise a single milking box, or may be a part of a multi-parlour arrangement. The robot arm 7 may be a dedicated robot arm for just a single milking parlour 2, or may be a displaceable robot arm to operate in a plurality of milking parlours 2, in particular in a herring bone set-up or for a rotary platform.
The control unit 6 is arranged to control a.o. the milking process, with settings for milking, quality control and so on, but is in particular also arranged to control operation of the robot arm 7 with the help of information from the camera 9. Camera 9 has a field of view 10 that is arranged to be suitable to acquire a view of a relevant part of the milking animal 3 and/or the milking parlour 2. In particular, the field of view 10 is arranged to be able to comprise, when in use, a view of at least a part of the udder 4, teats 5 and at least one teat cup 8. The camera 9 may be positioned on the robot arm 7, on the control unit 6, directly connected to the milking parlour 2, or at any other suitable position, as long as in use a suitable field of view can be arranged. In each case, the three-dimensional image then provides coordinates in a respective coordinate frame, such as with respect to the robot arm, the control unit or the milking parlour, respectively. In particular, positioning the camera on the robot arm is very suitable for stationary milking parlours. For milking arrangements with a rotary platform on which a plurality of milking parlours has been provided, it may be advantageous to provide the camera in a fixed position with respect to the ground.
Figure 2 very diagrammatically shows a part of the milking arrangement 1 in more detail, in particular the camera 9 and the control unit 6. The camera 9 comprises an illumination unit 11 and an imaging unit 17. The illumination unit 11 comprises a laser 12, emitting a laser beam 13. A diffuser 14 and a diffractive element 15 turn the beam 13 into an emitted speckled beam 16.
The imaging unit 17, having the field of view 10 of the camera 9, comprises imaging optics 18, that create an image of the field of view onto the sensor 19, that provides images to the control unit 6, that in turn comprises an image processor 20 and a memory 21. An additional rgb camera has been indicated by numeral 22, and has a field-of-view 23. It is explicitly noted here that the rgb camera 22 is not a part of the camera 9 that is arranged to acquire images for generating the three-dimensional image. In use, the laser 12 emits a laser beam 13 of a useful wavelength, having a wavelength between 600 and 1200 nm, preferably between 700 and 1000 nm (NIR). NIR has the advantage that ambient levels are relatively low and the sensitivity of most eyes is also low. Therefore, inconvenience for e.g. cows is relatively low. The laser beam is then sent through a diffuser 14, such as a piece of ground glass, that generates a speckle pattern in the beam, or a speckled beam, by interference effects within the propagated beam. The diffractive element 15 helps to control the brightness level in a direction transverse to the beam propagation direction. It is noted that the position of the diffractive element 15 may also be between laser 12 and diffuser 14. The diffuser 14 may also be a different kind of speckle pattern generator, such as a holographic plate, or a transparency with an imprinted pattern. For more information regarding the diffuser 14 and the diffractive element 15, and regarding the technical background of this technique, reference is made to WO2007/043036, incorporated by reference, and in particular to pages 5-8.
The speckled beam is emitted and hits an object, in this case an udder 4 with teats 5. A part of the radiation is reflected towards the imaging unit 17, in which the imaging optics 18 form an image of the reflected radiation onto the sensor 19. The sensor 19 could be a CCD sensor, a CMOS sensor or the like. Advantageously, the imaging unit comprises a filter, transmitting substantially only radiation with the wavelength of the laser source 12, in order to filter out as much ambient light as possible. The image formed on sensor 19 is then sent to the image processor 20, in which it is processed and a.o. compared to one or more reference images stored in memory 21. The memory 21 could also serve to store, temporarily or permanently, the image from sensor 19, as well as any subsequent image from said sensor. It is noted that Figure 2 is not to scale, and that the illumination unit 11 and the imaging unit 17 are preferably positioned very close to each other. Furthermore, it is not relevant where, that is in which part of the arrangement, the processing takes place. Alternatively, the processing could take place within the imaging unit 17 or in a device physically separate from the control unit 6. Of course, the control unit 6 should be connected to the image processor 20 in such a way that the results and information of the latter can be used by the former in performing its controlling tasks.
With reference numeral 22, an additional sensor in the form of an rgb camera has been indicated. Its field-of-view 23 should overlap with the field-of-view 10 of the camera/imaging unit 17 as much as possible. The rgb camera 22 serves to provide a visual image for supporting the formation of the three-dimensional image and the image and object recognition in said three-dimensional image. Thereto, the rgb camera is operatively connected to the image processor 20. The latter can compare an image from the rgb camera 22 to the actual image from the imaging unit 17 and/or to its subsequent three-dimensional image. If a structure, edge or object is apparently detected in the (actual or) three-dimensional image but something is determined to be wrong, such as a wrong number of detected teats, then the rgb image may be taken into account. For example, assume five teats have been determined in the three-dimensional image, but the rgb image shows a continuous colour and/or intensity at the position (i.e. spatial angle) of one of the teats and its immediate surroundings; then it is safe to conclude that that particular position does not contain a teat.
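This sanity check can be sketched as a per-candidate veto: if the co-registered RGB patch around a claimed teat position is essentially uniform in colour/intensity, the candidate is rejected. The standard-deviation threshold below is an arbitrary illustrative choice, not a value from the patent.

```python
import numpy as np

def candidate_supported_by_rgb(rgb_patch, min_std=8.0):
    """A real teat edge should show some colour/intensity variation in
    the co-registered RGB patch; an essentially flat patch argues that
    the 3-D detection at this position is spurious."""
    return float(np.asarray(rgb_patch, float).std()) >= min_std

flat = np.full((8, 8), 120.0)                     # continuous intensity: veto
edgy = np.tile(np.array([0.0, 255.0]), (8, 4))    # strong variation: keep
assert not candidate_supported_by_rgb(flat)
assert candidate_supported_by_rgb(edgy)
```

Applied to the five-teats example in the text, the one candidate whose surrounding RGB patch is continuous would fail this test and be discarded, leaving the expected four.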
In a similar fashion, it is possible to use the rgb camera 22 as a means to determine movement of objects in the image. It is relatively easy to determine movement by means of an optical (rgb) camera and image. Then, if a structure in the three-dimensional image has been coupled (cross-correlated) to the rgb image, in other words its position and distance are known, a displacement in the rgb image can be coupled to a displacement in the three-dimensional image. Then, the new and distorted speckle pattern can be predicted to some degree. This makes the cross-correlating of the new speckle pattern in the subsequent image from imaging unit 17 easier.
Figure 3A diagrammatically shows a part of a reference image, and Figure 3B diagrammatically shows a part of an actual image taken by the imaging unit 17.
The reference image 3A is an image of the speckled beam, taken at a known distance. The image shows the pattern of the speckles as present in space at said distance. Just for convenience, the pattern is shown as completely regular. This greatly simplifies the following discussion. However, it is to be noticed that a random, non-repetitive pattern is much more convenient in practice, as this allows a part of the actual image to be identified much more easily and with much more certainty. Furthermore, although dots have been indicated, this does not mean that there are only bright spots while all the rest is dark. Rather, the dots indicate brighter parts in the image, while the parts around and in between the dots are darker, but not necessarily completely dark, even without a view to ambient light. The actual image 3B, highly idealised in this case, shows how the emitted speckled beam 16 would be imaged when illuminating a part of a milking animal 3. One can see a pattern of the speckles. Some parts are lacking dots, obviously the parts where no reflective object is present. Furthermore, some parts do show a pattern that, however, has been deformed with respect to the original. The deformation of the pattern, and in particular the distance between neighbouring speckles, and also the (average) size of the speckles, is an indication of the orientation and the distance with respect to the camera (or imaging unit) of the surface reflecting the speckle pattern, but can also be compared with the distance at which the reference image 3A was taken. For example, a part of the pattern slightly above the centre of Figure 3B shows speckles at about the same distance as in Figure 3A, and also in about a square pattern. This indicates that the reflecting surface is oriented substantially transversely with respect to the camera and at about the same distance as for image 3A.
To the left and right thereof, the speckles are closer and closer together, and run off to the top of the page. This indicates that the surface bends further away, i.e. bends to the back, and furthermore is slightly inclined so as to face the ground. In all, the central part of the image seems to resemble roughly a semi-circle, or better, a half sphere. Looking more closely, four structures can be found having a more or less cylindrical shape with a rounded tip. These are obviously the teats. Towards the extreme left and right edges of Figure 3B, similar cylindrical structures can be seen, which can be recognised as the legs, while the large structure at the top of the Figure will be the belly. Note that in this case the image analysis is a kind of two-step analysis. First, a three-dimensional image is created by determining, for as many points or speckles as possible, the spatial coordinates thereof. Then, the three-dimensional image is further analysed in order to extract surfaces and shapes therefrom, by means of image and shape recognition techniques. These are per se deemed known to the skilled person.
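The link between speckle spacing and surface orientation used in this reading of Figure 3B can be illustrated with a first-order foreshortening model: an inclined surface compresses the projected spacing by roughly the cosine of its tilt. This is a deliberate simplification (it ignores perspective and the projector-camera baseline), and the frontal reference spacing is assumed known.

```python
import math

def tilt_from_spacing(observed_px, frontal_px):
    """Estimate surface inclination (degrees) from foreshortened speckle
    spacing: observed ≈ frontal * cos(tilt) for a locally flat patch."""
    ratio = min(1.0, observed_px / frontal_px)  # clamp measurement noise
    return math.degrees(math.acos(ratio))

assert tilt_from_spacing(10.0, 10.0) == 0.0          # frontal surface
assert abs(tilt_from_spacing(5.0, 10.0) - 60.0) < 1e-9  # strongly inclined
```

Regions where the dots "run closer and closer together" therefore read as surfaces turning away from the camera, which is exactly the qualitative argument made for the udder's sides above.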
It will be clear that in the above Figure 3B, the picture is much clearer than will be the case in practice. For example, there is no dirt, noise or background signal present, and the structures (legs, udder, teats) are all separated and easily recognisable. On the other hand, it is more difficult to actually position the structures in space (i.e. determine the right orientation and distance) with a completely regular pattern, as it is not possible to distinguish parts in a repetitive pattern. Thereto, an irregular pattern is used, for example random, or regular though non-repetitive, patterns. With such patterns, it is easier to cross-correlate parts of the collected image and similar parts of the reference image, as no so-called wrapping problem occurs. Reference is again made to WO2007/043036. The above embodiments and drawings are not intended to limit the invention, the scope of which is determined by the appended claims.
Claims (8)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12177270 | 2012-07-20 | ||
PCT/NL2013/050481 WO2014014341A1 (en) | 2012-07-20 | 2013-07-02 | Milking arrangement |
Publications (1)
Publication Number | Publication Date |
---|---|
SE1550114A1 true SE1550114A1 (en) | 2015-02-04 |
Family
ID=48794166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SE1550114A SE1550114A1 (en) | 2012-07-20 | 2013-07-02 | Milking arrangement |
Country Status (3)
Country | Link |
---|---|
DE (1) | DE112013003612T5 (en) |
SE (1) | SE1550114A1 (en) |
WO (1) | WO2014014341A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10527711B2 (en) * | 2017-07-10 | 2020-01-07 | Aurora Flight Sciences Corporation | Laser speckle system and method for an aircraft |
US10964019B2 (en) | 2018-08-22 | 2021-03-30 | EIO Diagnostics, Inc. | System for high performance, AI-based dairy herd management and disease detection |
MX2021013961A (en) | 2019-05-14 | 2022-04-01 | Delaval Holding Ab | System and method for providing a decision basis for controlling a robotic arm, computer program and non-volatile data carrier. |
CN116267625B (en) * | 2023-04-19 | 2024-09-24 | 内蒙古欧牧机械设备有限公司 | Intelligent machine vision system and method for milking robot |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SE514924C2 (en) * | 1998-08-31 | 2001-05-21 | Delaval Holding Ab | Apparatus and method for monitoring the movement of an animal |
WO2005094565A1 (en) * | 2004-03-30 | 2005-10-13 | Delaval Holding Ab | Arrangement and method for determining positions of the teats of a milking animal |
US8400494B2 (en) | 2005-10-11 | 2013-03-19 | Primesense Ltd. | Method and system for object reconstruction |
CA2711388C (en) * | 2008-01-22 | 2016-08-30 | Delaval Holding Ab | Arrangement and method for determining the position of an animal |
-
2013
- 2013-07-02 WO PCT/NL2013/050481 patent/WO2014014341A1/en active Application Filing
- 2013-07-02 DE DE112013003612.6T patent/DE112013003612T5/en not_active Withdrawn
- 2013-07-02 SE SE1550114A patent/SE1550114A1/en not_active Application Discontinuation
Also Published As
Publication number | Publication date |
---|---|
DE112013003612T5 (en) | 2015-06-03 |
WO2014014341A1 (en) | 2014-01-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1656014B1 (en) | Improvements in or relating to milking machines | |
US8794181B2 (en) | System and method for three dimensional teat modeling for use with a milking system | |
EP3520605B1 (en) | Arrangement and method for determining a weight of an animal | |
US9576368B2 (en) | Method and device for optically determining a position and/or orientation of an object in space using a two dimensional image to generate three dimensional information | |
EP3076892B1 (en) | A medical optical tracking system | |
US20070215052A1 (en) | Time of flight teat location system | |
EP2822448A1 (en) | Apparatus for optical coherence tomography of an eye and method for optical coherence tomography of an eye | |
CN102670170A (en) | Optical tomographic image photographing apparatus and control method therefor | |
NL2010213C2 (en) | Camera system, animal related system therewith, and method to create 3d camera images. | |
Azouz et al. | Development of a teat sensing system for robotic milking by combining thermal imaging and stereovision technique | |
US20180128736A1 (en) | Image forming device | |
SE1550114A1 (en) | Milking arrangement | |
Pezzuolo et al. | A comparison of low-cost techniques for three-dimensional animal body measurement in livestock buildings | |
CN113393436B (en) | Skin detection system based on multi-angle image acquisition | |
CA2799358C (en) | Sensor array for locating an object in space | |
US12039792B2 (en) | Position-determining device | |
EP3281517A2 (en) | Dairy animal treatment system | |
CN204944449U (en) | Depth data measuring system | |
NO20092446L (en) | Device and method for fish counting or biomass determination | |
NL2009923C2 (en) | System for detecting and determining positions of animal parts. | |
RU185290U1 (en) | Device for positioning the manipulator of robotic plants for pre-milking the udder and milking | |
Ben Azouz | Development of teat sensing system for automated milking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
NAV | Patent application has lapsed | ||