
SE2350163A1 - Plant handling arrangement and a method - Google Patents

Plant handling arrangement and a method

Info

Publication number
SE2350163A1
Authority
SE
Sweden
Prior art keywords
plant
plants
image data
predetermined
candidate
Prior art date
Application number
SE2350163A
Inventor
Filip Runesson
Joakim Svensson
Nils-Anders Brunnsgård
Rickard Bergh
Original Assignee
Soedra Skogsaegarna Ekonomisk Foerening
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Soedra Skogsaegarna Ekonomisk Foerening filed Critical Soedra Skogsaegarna Ekonomisk Foerening
Priority to SE2350163A priority Critical patent/SE2350163A1/en
Priority to PCT/EP2024/053202 priority patent/WO2024170405A1/en
Publication of SE2350163A1 publication Critical patent/SE2350163A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C11/00 Transplanting machines
    • A01C11/02 Transplanting machines for seedlings
    • A01C11/025 Transplanting machines using seedling trays; Devices for removing the seedlings from the trays
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B79/00 Methods for working soil
    • A01B79/02 Methods for working soil combined with other agricultural processing, e.g. fertilising, planting
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C1/00 Apparatus, or methods of use thereof, for testing or treating seed, roots, or the like, prior to sowing or planting
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01C PLANTING; SOWING; FERTILISING
    • A01C11/00 Transplanting machines
    • A01C11/02 Transplanting machines for seedlings
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G23/00 Forestry
    • A01G23/02 Transplanting, uprooting, felling or delimbing trees
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 Manipulators mounted on wheels or on carriages
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G9/00 Cultivation in receptacles, forcing-frames or greenhouses; Edging for beds, lawn or the like
    • A01G9/02 Receptacles, e.g. flower-pots or boxes; Glasses for cultivating flowers
    • A01G9/029 Receptacles for seedlings
    • A01G9/0295 Units comprising two or more connected receptacles
    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G9/00 Cultivation in receptacles, forcing-frames or greenhouses; Edging for beds, lawn or the like
    • A01G9/08 Devices for filling-up flower-pots or pots for seedlings; Devices for setting plants or seeds in pots
    • A01G9/083 Devices for setting plants in pots
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40053 Pick 3-D object from pile of objects
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/45 Nc applications
    • G05B2219/45017 Agriculture machine, tractor
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/45 Nc applications
    • G05B2219/45063 Pick and place manipulator

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Environmental Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Forests & Forestry (AREA)
  • Image Analysis (AREA)

Abstract

A method performed by a plant handling arrangement for handling one or more plants of a set of plants is provided. The method comprises obtaining (201) image data indicative of the set of plants. The method further comprises, based on the obtained image data, identifying (203) a candidate plant out of the set of plants. The method further comprises, based on the obtained image data, determining (204) a position of the identified candidate plant. The method further comprises handling (205) the identified candidate plant based on the determined position.

Description

PLANT HANDLING ARRANGEMENT AND A METHOD

TECHNICAL FIELD

Embodiments herein relate to a plant handling arrangement and a method therein. Furthermore, a vehicle comprising the plant handling arrangement, a computer program and a carrier are also provided herein. In some aspects, embodiments herein relate to handling one or more plants of a set of plants.
BACKGROUND

Handling plants in an automated manner is a complex and difficult task, since each plant has its own unique shape, is sensitive, and may be prone to damage unless handled in a delicate manner. Therefore, to handle plants, for example when planting them using a planting device, the plants are typically packed in an ordered manner to be handled by a device one by one. As an example, in a planting device, plants are typically packed in individual containers, e.g., pots, capsules and/or cassettes, such that the planting device can, in an ordered manner, output one plant at a time from its respective individual container. While such a manner of handling plants can be quick and effective compared to handling the plants manually, such systems are complex, expensive, take up a lot of space, and their productivity is limited by the speed of handling each individual container.

Hence, there is a drive to improve the efficiency of handling plants.
SUMMARY

As a part of developing embodiments herein, further problems were identified by the inventors; these will first be discussed. Handling plants, such as planting the plants, by the use of individual containers for each plant, e.g., pots, capsules or cassettes, is time-consuming, requires excessive space, and may involve very monotonous work tasks. The packing procedure may be slow, as it may need manual labor, and may also need more space due to the size of the pots, capsules and/or cassettes. Furthermore, when a plant has been planted, an empty individual container remains, adding to the cargo weight of a plant handling machine and taking up valuable space, which limits the number of plants the plant handling machine can carry.
The above plant handling problems may pertain to any kind of plant handling, but may be more apparent when handling tree plants, i.e. tree saplings, where a lot of trees may need to be handled within the same time period.
An object of embodiments herein is to improve efficiency of handling plants.
According to an aspect of embodiments herein, the object is achieved by providing a method performed by a plant handling arrangement for handling one or more plants of a set of plants.
The method comprises obtaining image data indicative of the set of plants. The method further comprises, based on the obtained image data, identifying a candidate plant out of the set of plants. The method further comprises, based on the obtained image data, determining a position of the identified candidate plant. The method further comprises handling the identified candidate plant based on the determined position.
According to another aspect of embodiments herein, the object is achieved by providing a plant handling arrangement configured to handle a set of plants. The plant handling arrangement is configured to: obtain image data indicative of the set of plants, based on the obtained image data, identify a candidate plant out of the set of plants, based on the obtained image data, determine a position of the identified candidate plant, and handle the identified candidate plant based on the determined position.

It is furthermore provided herein a computer program comprising instructions, which, when executed on at least one processor, cause the at least one processor to carry out the methods above, as performed by the plant handling arrangement.

It is additionally provided herein a carrier, having stored thereon a computer program product comprising instructions which, when executed on at least one processor, cause the at least one processor to carry out the methods above, as performed by the plant handling arrangement.

It is furthermore provided a vehicle comprising the plant handling arrangement.
As the candidate plant is identified out of the set of plants based on the image data, and since its corresponding position is determined based on the image data, the identified plant can be handled by the plant handling arrangement in a more efficient manner. This is since the plant handling arrangement is enabled to handle the set of plants arranged in any suitable manner, and is not restricted to having the plants ordered in individual containers. In other words, the set of plants may comprise multiple plants mixed together and may be comprised in one or more containers, arranged on one or more trays, conveyor belts, and/or tables/workbenches, etc. This means that the act of preparing the set of plants, e.g., packing or arranging the set of plants, is significantly reduced, as they do not need to be put into separate containers, which thereby significantly reduces preparation time, total handling time, and consequently improves efficiency for handling plants.
BRIEF DESCRIPTION OF THE DRAWINGS

Examples of embodiments herein are described in more detail with reference to attached drawings in which:

Fig. 1 is a schematic block diagram illustrating embodiments of a plant handling arrangement.

Fig. 2 is a flowchart depicting an embodiment of a method of embodiments herein.
Fig. 3 is a diagram illustrating exemplary embodiments herein.
Figs. 4a-b are diagrams illustrating exemplary embodiments herein.
Fig. 5 is a diagram illustrating exemplary embodiments herein.
Fig. 6 is a diagram illustrating exemplary embodiments herein.
Fig. 7 is a diagram illustrating exemplary embodiments herein.
Fig. 8 is a diagram illustrating exemplary embodiments herein.
Fig. 9 is a diagram illustrating exemplary embodiments herein.
Fig. 10 is a diagram illustrating exemplary embodiments herein.
Fig. 11 is a schematic block diagram illustrating embodiments of a control unit.
DETAILED DESCRIPTION

Embodiments provide aspects of handling plants by obtaining image data of a set of plants. The image data can be used to identify a candidate plant out of the set of plants and, further, to determine a position of the candidate plant. A plant handling arrangement can then be configured to handle the candidate plant based on the position.
For example, the plant handling arrangement may comprise a robot arm which can grab, i.e. pick up, the candidate plant at, or based on, the determined position. An example advantage of embodiments herein is that the preparation needed for the set of plants before being handled by the plant handling arrangement is reduced, i.e., the plants just need to be arranged within a set distance from the plant handling arrangement, e.g., in a shared area such as inside an open container or tray. As the set of plants can be arranged in a shared area, such as packed in a box and delivered to the plant handling arrangement, the set of plants may also be handled in a manner which requires a lot less space, can be more tightly packed when transported, and the packing may be automated. Furthermore, the set of plants can in this way be mixed in any suitable manner, e.g., multiple types of plants in the same box, arranged with their roots in different directions, etc.
Fig. 1 is a schematic overview depicting a plant handling arrangement 1.
The plant handling arrangement 1 may comprise any suitable components for performing the embodiments herein. In, or within a set distance of, e.g., an interval of 0.1-2 meters from, the plant handling arrangement, a set of plants 20 may be arranged. The set of plants 20 may comprise any number of plants of any suitable type and/or size. Type as used herein may mean a variant of a species or a species of plant. In some embodiments, the set of plants 20 comprises tree plants, i.e. saplings. The tree plants may be any one or more out of: pine, spruce, Picea abies, Pinus sylvestris, Betula pendula, Larix eurolepis, Pseudotsuga menziesii, Populus, just to name a few examples. The set of plants 20 may additionally or alternatively comprise any one or more out of: tomato plants, perennials, ornamental shrubs, lettuce, cabbage, onion, and other plants, including but not limited to plants grown in plug trays.
The set of plants 20 may typically comprise only one or two types of plants, but may comprise any suitable combination of plants.
The set of plants 20 may comprise more than one plant, and the set of plants 20 may be at least partially unorganized and/or at least partially unordered. This means that the set of plants 20 may be freely arranged, e.g., mixed, in at least one shared area 30. The at least one shared area 30 may comprise any one or more out of: a box, an open-ended container, a tray, a table, a workbench, etc. In other words, the set of plants 20 may be mixed, e.g., in one or more piles, in one or more areas of the at least one shared area 30 when being handled by the plant handling arrangement 1.
The plant handling arrangement 1 may comprise means for obtaining image data, e.g., any suitable unit for obtaining image data such as at least one camera 2. The image data may be one or more images obtained from, e.g., captured by, each respective camera in the at least one camera 2. The image data comprises one or more images of the set of plants 20. The at least one camera 2 may comprise one or more cameras arranged above the set of plants 20. The at least one camera 2 may additionally or alternatively comprise one or more cameras arranged on a side of the set of plants 20. The at least one camera 2 may comprise one or more cameras arranged at a respective angle to the set of plants 20. The at least one camera 2 may be arranged at a predetermined position, e.g., an absolute position or a relative position, relative to any entity of the plant handling arrangement 1, e.g., relative to the set of plants 20 and/or relative to the shared area 30. The at least one camera 2 may comprise at least one two-dimensional (2D) camera. In some embodiments, the at least one camera 2 comprises one, two, three, or more 2D cameras. When the at least one camera 2 comprises one 2D camera, depth information of the set of plants 20, e.g., the respective height of the set of plants 20, may be estimated, e.g., using color, contrast, trained machine learning models, etc. When the at least one camera 2 comprises more than one camera, e.g., at least one 2D camera, depth information may be extrapolated from different images of different cameras in image data obtained from the at least one camera 2. The at least one camera 2 may comprise at least one three-dimensional (3D) camera. The at least one 3D camera may be a camera wherein some depth information of the set of plants 20 may be obtained in image data from the at least one 3D camera.
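Purely as a non-limiting illustration of the depth extrapolation mentioned above, the following sketch estimates per-pixel depth from two 2D cameras using stereo block matching. It assumes a calibrated and rectified 8-bit grayscale image pair; the focal length, baseline and matcher parameters are illustrative assumptions and not part of the embodiments herein.

```python
# Illustrative sketch only: stereo depth estimation from two 2D cameras.
# Assumes rectified, 8-bit grayscale images; focal length (pixels) and
# baseline (meters) come from an assumed calibration.
import cv2
import numpy as np

def estimate_depth(left_gray: np.ndarray, right_gray: np.ndarray,
                   focal_px: float, baseline_m: float) -> np.ndarray:
    """Return a per-pixel depth map in meters from a rectified stereo pair."""
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    # StereoBM returns fixed-point disparities scaled by 16
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mark invalid / occluded pixels
    return focal_px * baseline_m / disparity  # depth = f * B / d
```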
Using the at least one camera 2, image data is obtained, and based on said image data, a candidate plant 10 is identified out of the set of plants 20. Furthermore, based on said image data, a position of the candidate plant 10 is determined. The candidate plant 10 is then handled based on the determined position. For example, at the determined position, or at a predetermined position relative to the determined position, the candidate plant 10 may be handled by being picked up at a preferable part of the candidate plant 10, for example, at a stem position between a root of the candidate plant 10 and branches of the candidate plant 10.
The plant handling arrangement 1 may, for example, comprise a robot arm 40. The robot arm 40 may be arranged to handle the set of plants 20 and/or the candidate plant 10, e.g., by grabbing the candidate plant 10 out of the set of plants 20, at, or based on, the determined position of the candidate plant 10. The robot arm 40 may be configured to use a predetermined or dynamically determined amount of pressure when grabbing the candidate plant 10 to reduce damage to the candidate plant 10.
The plant handling arrangement 1 may be comprised in a vehicle 50, and/or comprised in a trailer of the vehicle 50. The vehicle 50 may be a tractor, harvester, forwarder, or custom-built vehicle, just to name a few suitable examples. For example, the vehicle 50 may be arranged to drive in an area where the set of plants are to be planted. The robot arm 40 picks up identified candidate plants out of the set of plants 20 and initiates a planting of the picked-up plant, for example, by releasing the candidate plant 10, causing a planting of the candidate plant 10. As an example, the robot arm 40 may release the candidate plant 10 into a planting device or a funnel leading the candidate plant 10 to the planting device. A planting device may be a machine configured to receive the candidate plant 10 and to insert the candidate plant 10 into a suitable place in the ground. In some embodiments, the plant handling arrangement 1 is not arranged in the vehicle 50 and may instead be in a stationary position, e.g., at a plant nursery. In these embodiments, the candidate plant 10 may be moved, e.g., by the robot arm 40, from the set of plants 20 to a target position, e.g., a container or a plug tray for replanting. The robot arm 40 may, for example, move the candidate plant 10 between workstations, wherein each workstation may be arranged for different steps in a production process, e.g., different steps of breeding/nurturing plants. The workstations may comprise one or more out of: a workstation for sorting plants, a workstation for protecting against pests, a workstation for preparing plants for delivery, etc.
Embodiments herein may be controlled by a control unit 60. The control unit 60 may be a control unit comprised in the plant handling arrangement 1 or remote to the plant handling arrangement 1, e.g., in a remote cloud environment. The control unit 60 may be communicatively coupled, e.g., wired or wirelessly, with the at least one camera 2 and/or the robot arm 40. The control unit 60 may obtain image data from the at least one camera 2 and/or may trigger movement of the robot arm 40, such as triggering the robot arm 40 to pick up the candidate plant 10 at a given position.
A number of embodiments will now be described, some of which may be seen as alternatives, while some may be used in combination.
Fig. 2 shows embodiments of the method performed by the plant handling arrangement 1 for handling one or more plants of a set of plants 20. The set of plants 20 may be arranged at least partly in the at least one shared area 30. In other words, the set of plants 20 does not comprise an individual container per plant. Instead, the set of plants 20 may be mixed together in one or more containers, boxes, trays, on tables, on workbenches, etc. The set of plants 20 may be unordered or partially unordered. When stored in a container or box, the set of plants 20 may have been shuffled therein during transport, so even if the plants were initially packed in a certain manner, the set of plants 20 may be at least partly unorganized when handled. The set of plants 20 may for example be packed, e.g., by a machine or by hand, in one or more open-ended boxes to be picked up, i.e. grabbed, by the robot arm 40. In some embodiments, the plant handling arrangement 1 comprises at least one camera 2 arranged at a predetermined position.
The control unit 60 may perform the method as described below. The method comprises the following actions, which actions may be taken in any suitable order. Optional actions are referred to as dashed boxes in Fig. 2.
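Before the individual actions are detailed, the following non-authoritative sketch indicates how actions 201-205 could be chained in software. The camera and arm interfaces and the four helper callables are hypothetical placeholders, not part of the embodiments herein; each step is discussed in the actions below.

```python
# Non-authoritative sketch chaining actions 201-205. The camera and arm
# objects and the four helper callables are hypothetical placeholders.
def plant_handling_loop(camera, arm, select_subset, identify_candidate,
                        determine_position, handle):
    while True:
        image_data = camera.capture()        # action 201: obtain image data
        subset = select_subset(image_data)   # action 202: optional narrowing
        plant = identify_candidate(subset)   # action 203: identify candidate
        if plant is None:
            return                           # no identifiable plant remains
        position = determine_position(image_data, plant)  # action 204
        handle(arm, position)                # action 205: grab and move/plant
```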
Action 201

The method comprises obtaining image data indicative of the set of plants 20. In some embodiments, the image data comprises height data of the set of plants 20.
The height data may indicate a depth in the image data. The height data may indicate which plant is most suitable to handle, for example, wherein the most suitable plant may be the one at the greatest height. In some embodiments, obtaining the image data comprises obtaining one or more images captured by the at least one camera 2. In other words, the image data may be images from different cameras and/or different angles of the set of plants 20. In some embodiments, obtaining the image data indicative of the set of plants 20 comprises obtaining 3D image data of the set of plants 20. The 3D image data is indicative of a point cloud of the set of plants 20. The 3D image data may be indicative of height data of the set of plants 20, e.g., by relative measures of points in the point cloud. In some embodiments, obtaining the image data indicative of the set of plants 20 comprises obtaining 2D image data of the set of plants 20. The 2D image data may be indicative of height data of the set of plants 20, e.g., by extrapolating depth of the images by using multiple images from different angles, or by estimating the height data by use of a predictive model, e.g., by estimating the height data based on any one or more out of: contrast, shape, and color, in the 2D image data.
Additionally or alternatively, the image data of the set of plants 20 may be obtained in any suitable manner. The image data may be indicative of the set of plants 20 in any suitable manner, such that it is possible to identify a candidate plant based on the image data, and to determine a position of the candidate plant based on the image data.
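As an illustration of how height data may be derived from 3D image data, the following sketch bins a point cloud into a top-down height map, keeping the highest point per cell so the map represents the top profile of the set of plants 20. The grid resolution and coordinate conventions are assumptions.

```python
# Illustrative sketch: binning a point cloud (N x 3 array of x, y, z in
# meters) into a top-down height map, keeping the highest point per cell.
import numpy as np

def height_map(points: np.ndarray, cell: float = 0.005) -> np.ndarray:
    ix = ((points[:, 0] - points[:, 0].min()) / cell).astype(int)
    iy = ((points[:, 1] - points[:, 1].min()) / cell).astype(int)
    grid = np.full((iy.max() + 1, ix.max() + 1), np.nan)
    for x, y, z in zip(ix, iy, points[:, 2]):
        if np.isnan(grid[y, x]) or z > grid[y, x]:
            grid[y, x] = z   # highest point seen so far in this cell
    return grid
```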
Action 202

In some embodiments, the method may comprise selecting a subset based on an area of the image data. The subset may be a subset of the image data, i.e. the subset is an image data subset. The subset may fulfil one or more criteria, e.g., a size or distance of the subset and/or that the subset is likely to comprise a plant which can be identified.
The subset may be used to narrow a search space for where to identify a candidate plant 10 out of the set of plants 20. For example, when there are a lot of areas to look for a candidate plant in the image data, it may be faster, and thereby improve productivity, to narrow down how much of the image data to use as a basis for identifying the candidate plant.
The subset may be selected based on predetermined knowledge of how the set of plants 20 are packed or likely to be packed in the at least one shared area 30, e.g., the subset may be selected as one or more predetermined areas where roots of the set of plants are packed/arranged in the at least one shared area, e.g., an area within a predetermined distance of a predetermined side of the at least one shared area 30.
The subset may be selected based on using a 2D image of the 2D image data to predict a suitable area to use as a subset, e.g., where one or more roots of the set of plants 20 are detected in the image data.
Additionally or alternatively, the subset may be selected based on an analysis of the image data of where most of the set of plants 20 are located, and/or where most of their respective roots are located.
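A minimal sketch of such a subset selection follows, assuming the roots are expected within a strip along one predetermined side of the shared area; the strip fraction is an illustrative assumption.

```python
# Minimal sketch of action 202: narrowing the search space to a strip of
# the image along one predetermined side of the shared area, where roots
# are expected to lie. The fraction is an illustrative assumption.
import numpy as np

def select_root_subset(image: np.ndarray, side_fraction: float = 0.3) -> np.ndarray:
    """Return the strip of the image where roots are expected to lie."""
    h, w = image.shape[:2]
    return image[:, : int(w * side_fraction)]   # strip along the left side
```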
Action 203

The method comprises identifying the candidate plant 10 out of the set of plants 20 based on the obtained image data.
The candidate plant 10 may be a plant to be grabbed, e.g., picked up by the robot arm 40. In some embodiments, identifying the candidate plant 10 out of the set of plants 20 comprises identifying a predetermined plant shape from the image data. The predetermined plant shape may be a predetermined shape of a root of a plant. In other words, identifying the candidate plant 10 out of the set of plants 20 may comprise searching for image details in the image data which look like a root of a plant, for example, a particular root type. In some embodiments, identifying the predetermined plant shape from the image data comprises matching one or more features of the predetermined plant shape with the image data.
For example, the one or more features may be one or more features of a root to look for in the image data. The one or more features of the predetermined plant shape may for example comprise any one or more out of:

- a size parameter, e.g., length, circumference, and/or a 2D projection size,
- a 3D shape of at least a part of the predetermined plant shape, e.g., a cylinder shape, or other shape, representing a root of a plant,
- a 2D shape of a cross-section or a side of at least a part of the predetermined plant shape, e.g., for comparison/matching with a cross-section or height data of the image data, and
- a 2D representation of the predetermined plant shape, e.g., wherein matching the one or more features comprises comparing a contrast and/or color difference between the predetermined plant shape and the image data.

In other words, identifying the candidate plant 10 out of the set of plants 20 may comprise searching the image data for a match with the one or more features, e.g., with respect to some error margin. In other words, matching the one or more features of the predetermined plant shape with the image data may further comprise estimating an error margin of the one or more features, wherein identifying the candidate plant 10 is performed based on the estimated error margin. For example, the candidate plant 10 may be identified if the error margin fulfils a success condition, such as the error margin being below a threshold. In other words, identifying the candidate plant 10 out of the set of plants 20 may be based on a correlation threshold and/or a score between a trained pattern, e.g., the predetermined plant shape, and the image, such as the image data, which correlation threshold and/or score is reached or exceeded for the pattern to be considered found, i.e., the candidate plant 10 is identified.
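As one possible, non-limiting realisation of matching a predetermined plant shape against the image data with a correlation threshold, the following sketch uses normalized cross-correlation template matching; the template image and threshold value are assumptions, and both images are assumed to be 8-bit grayscale.

```python
# Sketch: matching a predetermined root shape against the image data via
# normalized cross-correlation; the candidate is considered identified
# only if the correlation score reaches a threshold.
import cv2

def find_candidate(image_gray, root_template, threshold: float = 0.8):
    """Return (x, y) of the best template match, or None below the threshold."""
    scores = cv2.matchTemplate(image_gray, root_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    return max_loc if max_val >= threshold else None
```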
When the image data comprises height data of the set of plants 20, matching the one or more features of the predetermined plant shape with the image data may comprise matching the one or more features of the predetermined plant shape with the height data. In other words, the one or more features of the predetermined plant shape, e.g., a 2D or 3D shape of a plant, may be matched with the height data of the image data. This means that the height data may be a profile of a top side of the set of plants 20, and the predetermined plant shape may be matched with said profile. The profile may be projected in 2D and matched with a predetermined 2D shape, e.g., a cylinder shape of a root matched with the height data to see if the cylinder shape fits in the top part of the height data.

In some embodiments, matching the height data with the one or more features of the predetermined plant shape comprises, based on the height data and the one or more features of the predetermined plant shape, identifying a first plant associated with a first height and a second plant associated with a second height. In some of these embodiments, identifying the candidate plant 10 out of the set of plants 20 comprises identifying the candidate plant 10 out of the first plant and the second plant based at least partly on a comparison between the first height and the second height. In other words, two plants may be identified, and the plant which is highest up in the height data may be identified as the candidate plant. As the candidate plant is to be handled, this may pose the lowest risk of damaging other plants, e.g., when grabbing the identified plant 10.

In some embodiments, identifying the candidate plant 10 out of the set of plants 20 comprises identifying the candidate plant 10 based on a subset of the image data. The subset may be a predetermined selected subset of the image data, or the subset is the selected subset, e.g., as in action 202. Identifying the candidate plant 10 based on the subset of the image data means that only the subset of the image data may be used for searching for plants to identify as the candidate plant 10. Since the candidate plant 10 is identified based on the subset of the image data, improved productivity, a higher margin of error, and accuracy are achieved. This is since only part of the image data needs to be analyzed to identify a candidate plant.

In some embodiments herein, matching the one or more features with the predetermined plant shape and/or identifying the candidate plant 10 may be performed using any suitable image analysis method. Identifying the candidate plant 10 out of the set of plants 20 based on the obtained image data may be performed based on using a first machine learning model. The first machine learning model may comprise a neural network. The first machine learning model may have been trained based on training image data of plants, e.g., of the same type and/or species as the plants in the set of plants. The training image data of plants may comprise single images of plants and/or images of multiple plants such as the set of plants 20. The training image data of plants may comprise images of plants in the at least one shared area 30. The training of the first machine learning model may be supervised, e.g., by using truth labels for which plant is the candidate plant to be identified out of a set of plants. The training of the first machine learning model may comprise deep learning. Embodiments herein may further comprise training the first machine learning model.
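The height-profile matching described above could, purely as a sketch, be realised by sliding a predetermined cross-section (here a circular root cross-section) along a 1D top profile and preferring the highest-lying good fit. The radius, the scoring, and the assumption of a NaN-free profile are illustrative.

```python
# Sketch: matching a circular root cross-section against a 1D top profile
# (height data); on equally good fits, the higher-lying plant is preferred.
import numpy as np

def match_root_profile(heights: np.ndarray, radius_px: int = 10):
    x = np.arange(-radius_px, radius_px + 1)
    template = np.sqrt(radius_px**2 - x**2)   # upper half of a circle
    best_idx, best_key = None, None
    for i in range(len(heights) - len(template) + 1):
        window = heights[i:i + len(template)]
        # shape fit: negated squared error between centered profiles
        fit = -np.sum(((window - window.mean()) - (template - template.mean())) ** 2)
        key = (fit, window.max())   # good fit first; higher plant breaks ties
        if best_key is None or key > best_key:
            best_idx, best_key = i + radius_px, key
    return best_idx   # center index of the best match, or None
```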
Action 204

The method comprises determining the position of the identified candidate plant 10 based on the obtained image data.
The position may be a position for use when handling the candidate plant 10. For example, the determined position may be a position wherein the robot arm 40 is arranged to grab, i.e., pick up, the candidate plant 10. The position may be a predetermined part of a plant, e.g., a center of a root of the candidate plant 10, a position between the root and a first branch of the candidate plant 10, a center of the candidate plant 10, an end-point of the candidate plant 10, etc.
When the obtained image data comprises one or more images captured by the at least one camera 2, determining the position of the identified candidate plant 10 may, at least partially, be based on the predetermined position of the camera. Alternatively, any other suitable reference point in terms of positioning may be used. In some embodiments, determining the position of the identified candidate plant 10 comprises determining one or more coordinates for handling the identified candidate plant 10. The one or more coordinates may be with respect to a coordinate system for the at least one shared area 30, for the robot arm 40, or for the candidate plant 10. In some embodiments herein, determining the position of the identified candidate plant 10 may be performed using any suitable image analysis method. For example, determining the position of the identified candidate plant 10 may comprise checking that there is space around a position, e.g., to check that there is space for gripper fingers, e.g., of the robot arm 40. Determining the position of the identified candidate plant may further comprise assessing whether the determined position is a suitable position, e.g., whether or not the robot arm 40 can reach the position. In some embodiments herein, determining the position of the identified candidate plant 10 comprises, for example consists of, determining an x and y coordinate of the identified candidate plant 10. The x and y coordinates may correspond to a width and length of the at least one shared area 30. In these embodiments, the robot arm 40 may be instructed to grab the identified candidate plant 10 at the x and y coordinate. As a z-coordinate, i.e., a height coordinate, is also needed to actually grab the identified candidate plant 10, the robot arm 40 may be configured to sense the identified candidate plant 10 in the z-coordinate, e.g., by moving and sensing the identified candidate plant 10 in the z-domain.
Determining the position of the identified candidate plant 10 may be performed based on using a second machine learning model, e.g., the first machine learning model of action 203 or a separate machine learning model. The second machine learning model may comprise a neural network. The second machine learning model may have been trained based on training image data of plants, e.g., of the same type and/or species as the plants in the set of plants. The training image data of plants may comprise single images of plants and/or images of multiple plants such as the set of plants 20. The training image data of plants may comprise images of plants in the at least one shared area 30. The training of the second machine learning model may be supervised, e.g., by using truth labels for positions of the identified candidate plant. The training of the second machine learning model may comprise deep learning. Embodiments herein may further comprise training the second machine learning model.
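As a hedged illustration of determining coordinates for handling, including the check that there is space around the position for gripper fingers, consider the following sketch operating on a top-down height map; the scale, clearance and height tolerance are assumptions.

```python
# Hedged sketch: converting an identified pixel position on a top-down
# height map into (x, y, z) handling coordinates, and checking clearance
# for the gripper fingers around the grasp point.
import numpy as np

def grasp_coordinates(hmap: np.ndarray, px: int, py: int, m_per_cell: float,
                      clearance_cells: int = 5, max_neighbor_rise: float = 0.02):
    """Return (x, y, z) in meters, or None if the grasp point is crowded."""
    z = hmap[py, px]
    if np.isnan(z):
        return None                  # no height measurement at this point
    neighborhood = hmap[max(0, py - clearance_cells): py + clearance_cells + 1,
                        max(0, px - clearance_cells): px + clearance_cells + 1]
    if np.nanmax(neighborhood) > z + max_neighbor_rise:
        return None                  # another plant could block the fingers
    return (px * m_per_cell, py * m_per_cell, float(z))
```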
Action 205

The method comprises handling the identified candidate plant 10 based on the determined position.
As the position is determined, the candidate plant 10 may be handled in any suitable manner based on the position. For example, in some embodiments, handling the identified candidate plant 10 based on the determined position comprises triggering the robot arm 40 of the plant handling arrangement 1 to grab the identified candidate plant 10 at the determined coordinate. Another example of handling the identified candidate plant 10 based on the determined position may involve analysis of the candidate plant 10 at the determined position, e.g., by means of further obtained image data of the candidate plant 10, e.g., to determine the health of the candidate plant 10 and/or the suitability of moving the candidate plant 10 and/or to determine the species/type of the candidate plant 10.

In some embodiments, triggering the robot arm 40 to grab the identified candidate plant 10 comprises triggering the robot arm 40 to grab the identified candidate plant 10 using a set configuration, e.g., wherein the set configuration defines any one or more out of: a pressure to apply when grabbing the identified candidate plant 10, where to move the identified candidate plant 10, how fast to move the identified candidate plant 10, and/or a combination thereof. In some embodiments, triggering the robot arm 40 to grab the identified candidate plant 10 comprises detecting that one or more plants out of the set of plants 20 have been grabbed.

In some embodiments, the robot arm 40 may further be triggered to detect, i.e., sense, the height of the candidate plant 10 if no height data is provided in the determined position. In other words, the robot arm 40 may be triggered to grab the candidate plant 10 without knowing the height position, and may instead sense the position by lowering the robot arm 40 until reaching the candidate plant 10. This may or may not be coordinated with image data obtained from the at least one camera 2.

In some embodiments, handling the identified candidate plant 10 based on the determined position may comprise handling the identified candidate plant 10 at the determined position, e.g., the robot arm 40 grabbing the identified candidate plant 10 at the determined position, or handling the identified candidate plant 10 based on the determined position may comprise handling the identified candidate plant 10 at a position relative to the determined position. The determined position or the position relative to the determined position may be a preferred position to grab the identified candidate plant 10, e.g., a position arranged between a root of the candidate plant 10 and a branch of the candidate plant 10.
The determined position or the position relative to the determined position may be the root or stem of the candidate plant 10.
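The following sketch illustrates a set configuration and the triggering of a grab and release, e.g., into a funnel. The arm interface (move_to, close_gripper, open_gripper) is hypothetical and not a real robot API; all numeric values are illustrative assumptions.

```python
# Illustrative sketch of action 205 with a set configuration; the arm
# interface is hypothetical and the values are assumptions.
from dataclasses import dataclass

@dataclass
class GripConfig:
    pressure_n: float = 4.0                # gentle force to limit plant damage
    speed_mm_s: float = 150.0              # travel speed while carrying a plant
    drop_target: tuple = (0.6, 0.0, 0.3)   # e.g., above a funnel (x, y, z in m)

def handle_candidate(arm, position_xyz, cfg: GripConfig = GripConfig()):
    arm.move_to(position_xyz, speed=cfg.speed_mm_s)   # approach determined position
    arm.close_gripper(force=cfg.pressure_n)           # grab the candidate plant
    arm.move_to(cfg.drop_target, speed=cfg.speed_mm_s)
    arm.open_gripper()                                # release, e.g., into a funnel
```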
The above embodiments will now be further explained and exemplified below. The embodiments below may be combined with any suitable embodiment above.
Fig. 3 illustrates different plants which may be part of the set of plants 20. The illustration of Fig. 3 shows that the set of plants 20 may comprise any suitable plant, and that different types of plants may have different characteristic features, e.g., different root shapes, which can be used for identifying the candidate plant 10, e.g., as part of action 203.
Fig. 3 illustrates a first plant 301. The first plant 301 is a generic plant which is illustrated with a generic predetermined plant shape, which features may be used when identifying the candidate plant 10 out of the set of plants 20. For example, the first plant 301 has an illustrated first root 301r, which shape may be used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203.
The set of plants 20 may additionally or alternatively comprise a second plant 302. The shape of the second plant 302 may be a predetermined shape used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203. In particular, a second root 302r of the second plant 302 may have a particular shape, e.g., characteristic for that type and/or species, which may be used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203.

The set of plants 20 may additionally or alternatively comprise a third plant 303. The shape of the third plant 303 may be a predetermined shape, e.g., characteristic for that type and/or species, used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203. In particular, a third root 303r of the third plant 303 may have a particular shape which may be used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203.

The set of plants 20 may additionally or alternatively comprise a fourth plant 304. A shape of the fourth plant 304 may be a predetermined shape, e.g., characteristic for that type and/or species, used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203. In particular, a fourth root 304r of the fourth plant 304 may have a particular shape which may be used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203.

The set of plants 20 may additionally or alternatively comprise a fifth plant 305. The shape of the fifth plant 305 may be a predetermined shape, e.g., characteristic for that type and/or species, used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203. In particular, a fifth root 305r of the fifth plant 305 may have a particular shape which may be used for matching with the image data obtained in action 201 to identify the candidate plant 10, e.g., as part of action 203.
Fig. 4a illustrates the set of plants 20 arranged in different areas and different arrangements, e.g., as part of the at least one shared area 30.
The set of plants 20 may be arranged in a box 401, e.g., as part of the at least one shared area 30. The box 401 may be open-ended, e.g., any suitable box for transporting plants. The set of plants 20 may be packed in the box 401, e.g., manually or by a machine.
The set of plants 20 may be arranged in a second shared area 402, e.g., as part of the at least one shared area 30. The second shared area 402 may be any suitable area, e.g., a box, tray, table, workbench. The set of plants 20 may be arranged in the second shared area 402 wherein the roots of the set of plants 20 are arranged within a same second subarea 402s of the second shared area 402. The second subarea 402s may be the selected subset as part of action 202.
The set of plants 20 may be arranged in a third shared area 403, e.g., as part of the at least one shared area 30. The third shared area 403 may be any suitable area, e.g., a box, tray, table, workbench. The set of plants 20 may be arranged in the third shared area 403 wherein the roots of the set of plants 20 are arranged in different directions, such that the roots occupy two third subareas 403s of the third shared area 403. The two third subareas 403s may be the selected subset as part of action 202.
The set of plants 20 may be arranged in a fourth shared area 404, e.g., as part of the at least one shared area 30. The fourth shared area 404 may be any suitable area, e.g., a box, tray, table, workbench. The set of plants 20 may be arranged in the fourth shared area 404 wherein the plants are arranged with roots in different directions. The roots may have been shuffled around due to transportation. The two third subareas 403s or the second subarea 402s could still be used as subsets to find most of the plants in the set of plants 20. When no roots remain in the subareas, the entire fourth shared area 404 may need to be searched for identifying the candidate plant 10.

In Fig. 4b, the set of plants 20 is illustrated in a fifth shared area 405, e.g., as part of the at least one shared area 30. The fifth shared area 405 is illustrated as a box. The set of plants 20 is in Fig. 4b arranged in the fifth shared area 405 such that the roots of the set of plants 20 are arranged within a same fifth subarea 405s of the fifth shared area 405. The fifth subarea 405s may be the selected subset as part of action 202.
Fig. 5 illustrates first exemplary image data 500 of the set of plants 20 in the at least one shared area 30. The first exemplary image data 500 may be part of the image data as obtained in action 201. The first exemplary image data 500 of Fig. 5 illustrates an example scenario wherein the at least one shared area 30 comprises two containers 502 in which the set of plants 20 are comprised. The first exemplary image data 500 may be a point cloud of the set of plants 20 in the at least one shared area 30, or may be a 2D image using contrast of the image to differentiate plants in the set of plants 20. The white area of the first exemplary image data 500 of Fig. 5 illustrates where a part of a plant in the set of plants 20 is located in the at least one shared area 30. For some other scenarios not visible in the illustration of Fig. 5, the white areas of the first exemplary image data 500 may be differently colored or use a grayscale for differentiating different parts of the set of plants 20. The example of Fig. 5 further shows a first area 504 and a second area 505. The first area 504 and the second area 505 are areas where a lot of roots of the set of plants are found. The first area 504 and/or the second area 505 may be predetermined or may be determined by automatic image analysis of the first exemplary image data 500. The first area 504 and/or the second area 505 may be a subset, e.g., as used in any one or both of actions 203-204. The first area 504 and/or the second area 505 may then be used to identify the candidate plant 10, e.g., as in action 203. In other words, the first area 504 and/or the second area 505 may be used to search for the candidate plant, e.g., based on a predetermined shape or features of a root.
Fig. 6 illustrates exemplary height data 600, e.g., as part of the image data as obtained in action 201. The exemplary height data 600 may be part of the first exemplary image data 500 illustrated in Fig. 5. The exemplary height data 600 may be a 2D projection of a 3D image of the image data obtained in action 201.
The exemplary height data 600 may be extracted from a point cloud image as provided by a 3D camera in the image data.
The exemplary height data 600 is in this example plotted on two axes, a height axis 601 and a position axis 602. The position axis 602 may correspond to a side or cross-section of the first or second area 504, 505 illustrated in Fig. 5. In this example, the height data 600 is matched with features of a predetermined plant shape, e.g., a cross-section of a root. Identifying the candidate plant 10 may therefore comprise matching the height data 600 with a predetermined shape of a root, wherein a match with the predetermined shape of a root may also fulfil a height condition, e.g., the match may be considered if it is located at a height above a threshold, e.g., the match with the greatest height in the height data 600. In the example of Fig. 6, a first area of interest 603 is detected, where a first example candidate plant 604 is found. The first example candidate plant 604 may be the candidate plant 10, and may be identified at least partly using the height data 600. For example, a center point may be determined for the first example candidate plant 604, which may further be used as height data for the first example candidate plant 604. Additionally or alternatively, a lowest point and/or a highest point may be determined for the first example candidate plant 604 based on the height data 600. In some embodiments, the first area of interest 603 is searched to identify the first example candidate plant 604. Searching the first area of interest 603 may in some embodiments be needed to determine a position of the first example candidate plant 604, e.g., based on the height data 600. Embodiments herein may further comprise performing a 3D comparison with a predetermined 3D model of a plant in a search area established around the first example candidate plant 604, e.g., based on height data of the first example candidate plant 604, such as any one or more of the center point, the lowest point, and/or the highest point of the first example candidate plant 604. The search area may be used to identify the first example candidate plant 604, its position, and optionally to determine an angle of the first example candidate plant 604. In other words, identifying the candidate plant 10 and/or determining the position of the candidate plant 10, e.g., as in actions 203 and/or 204, may be performed based on the height data 600 and/or based on the first area of interest 603.
Fig. 7 illustrates second exemplary image data 700 of the set of plants 20 in the at least one shared area 30, e.g., as obtained in action 201. The second exemplary image data 700 may be the same as or based on the first exemplary image data 500. The second exemplary image data 700 comprises a second area of interest 701. The second exemplary image data 700 may comprise the height data 600 of Fig. 6. The second area of interest 701 may be a 3D area of the second exemplary image data 700. The second area of interest 701 may be obtained based on the first example candidate plant 604 and its height data, e.g., a center point as a z-coordinate, e.g., as obtained in combination with the first area 504 of Fig. 5. Identifying the candidate plant 10 and/or determining the position of the candidate plant 10, e.g., as in actions 203 and/or 204, may be performed based on the second area of interest 701, e.g., in combination with the first area of interest 603 and/or based on a predetermined size.
Fig. 8 illustrates third exemplary image data 800 of the set of plants 20 in the at least one shared area 30, e.g., as obtained in action 201. The third exemplary image data 800 may be the same as or based on the first exemplary image data 500 and/or the second exemplary image data 700. The third exemplary image data 800 illustrates a third example candidate plant 801. The third example candidate plant 801 may be the candidate plant 10. The third example candidate plant 801 may be identified, e.g., as in action 203, based on the second area of interest 701. In the scenario of Fig. 8, an example position 802 of the third example candidate plant 801 may be determined, e.g., as in action 204. The example position 802 is in this case at the end-point of a root of the third example candidate plant 801, but may also be at another suitable part of the third example candidate plant 801. Furthermore, e.g., when determining the position of the third example candidate plant 801, e.g., as in action 204, an angle 804 of the third example candidate plant 801 may further be determined. The angle 804 may, e.g., be determined with respect to an axis 803 of the third exemplary image data 800.
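Determining the angle 804 could, for example, be done from the principal axis of the candidate plant's pixels, as in the following sketch; the binary segmentation mask of the candidate plant is an assumed input from a prior segmentation step.

```python
# Sketch: determining the angle of the candidate plant relative to the
# image x-axis from the principal axis of its pixel coordinates.
import numpy as np

def plant_angle_deg(mask: np.ndarray) -> float:
    ys, xs = np.nonzero(mask)                     # pixel coordinates of the plant
    coords = np.stack([xs, ys]).astype(float)
    coords -= coords.mean(axis=1, keepdims=True)  # center the point set
    eigvals, eigvecs = np.linalg.eigh(np.cov(coords))
    major = eigvecs[:, np.argmax(eigvals)]        # direction of largest spread
    return float(np.degrees(np.arctan2(major[1], major[0])))
```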
Fig. 9 illustrates an example plant handling arrangement 1. The set of plants 20 are arranged in the at least one shared area 30, which in this example comprises two boxes. The at least one camera 2 comprises one camera, e.g., a 2D or 3D camera, arranged at a predetermined location above the at least one shared area 30. The robot arm 40 is arranged at a predetermined position with respect to the at least one shared area 30 such that the robot arm 40 can grab the identified candidate plant 10 at the determined position of action 204.
Fig. 10 illustrates another example of the plant handling arrangement 1, wherein the plant handling arrangement 1 is arranged to be comprised in the vehicle 50, e.g., on the vehicle 50 or a trailer of the vehicle 50. The set of plants 20 are arranged in the at least one shared area 30. The at least one camera 2 comprises one camera, e.g., a 2D or 3D camera, arranged at a predetermined location above the at least one shared area 30. The robot arm 40 is configured to handle the set of plants 20 by grabbing the identified candidate plant 10 based on the determined position of the candidate plant 10, and by releasing the candidate plant 10 into a funnel 1001 leading the candidate plant 10 to a planting device 1002, which will then be arranged to plant the candidate plant 10 in a ground surface travelled by the vehicle 50. The funnel 1001 and/or the planting device 1002 may be exchanged for any suitable device for planting an individual plant.
To perform the method actions above, the plant handling arrangement 1 is configured to handle one or more plants in the set of plants 20. The plant handling arrangement 1 may comprise a control unit, e.g., the control unit 60, configured to perform the above-mentioned actions, e.g., as part of the plant handling arrangement 1. The control unit 60 is illustrated in Fig. 11. The components of the control unit 60 may also be directly integrated into the plant handling arrangement 1 in any suitable manner. In other words, while Fig. 11 illustrates the control unit 60, the illustration may also apply to the plant handling arrangement 1, e.g., as a single entity.
The plant handling arrangement 1 e.g., the control unit 60, may comprise an input and output interface 1100 configured to communicate with and/or to control entities of the plant handling arrangement 1, e.g., the at least one camera 2 and/or the robot arm 40. The input and output interface 1100 may e.g., comprise a wired or wireless receiver (not shown) and a Wireless or wired transmitter (not shown).
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to: - obtain image data indicative of the set of plants 20, - based on the obtained image data, identify a candidate plant 10 out of the set of plants 20, and - based on the obtained image data, determine a position of the identified candidate plant 10, and - handle the identified candidate plant based on the determined position.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to perform any of the actions 201-205 as described with respect to Fig. 2, in any suitable order.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to: identify the candidate plant 10 out of the set of plants 20 by identifying a predetermined plant shape from the image data. 19 The plant handling arrangement 1 e.g., the control unit 60, may further be configured to identify the predetermined plant shape from the image data by matching one or more features of the predetermined plant shape with the image data. ln some embodiments, the image data comprises height data of the set of plants 20.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to match the one or more features of the predetermined plant shape with the image data by matching the one or more features of the predetermined plant shape with the height data.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to match the height data with the one or more features of the predetermined plant shape based on the height data and the one or more features of the predetermined plant shape, by identifying a first plant associated with a first height and a second plant associated with a second height, and to identify the candidate plant 10 out of the set of plants 20 by identifying the candidate plant 10 out of the first plant and the second plant based at least partly on a comparison between the first height and the second height. ln some embodiments, the one or more features of the predetermined plant shape comprises any one or more out of: a size parameter, a three-dimensional shape of at least a part of the predetermined plant shape, and a two-dimensional shape of a cross-section or a side of at least a part of the predetermined plant shape, a two-dimensional representation of the predetermined plant shape, wherein matching the one or more features comprises comparing a contrast and/or color difference between the predetermined plant shape and the image data.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to identify the candidate plant 10 out of the set of plants 20 by identifying the candidate plant 10 based on a subset of the image data. ln some embodiments, the subset is a predetermined selected subset of the image data.
The plant handling arrangement 1 e.g., the control unit 60, may further be configured to select the subset based on an area of the image data fulfilling one or more criteria. ln some embodiments, the plant handling arrangement 1 comprises at least one camera 2 arranged at a predetermined position. The plant handling arrangement 1 e.g., the control unit 60, may further be configured to obtain the image data by obtaining one or more images captured by the at least one camera 2, and to determine the position of the identified candidate plant 10 is, at least partially, based on the predetermined position of the camera.
The plant handling arrangement 1, e.g., the control unit 60, may further be configured to determine the position of the identified candidate plant 10 by determining one or more coordinates for handling the identified candidate plant 10.
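One way the predetermined camera position can enter the coordinate determination is a fixed camera-to-world transform, sketched below with a standard pinhole back-projection; the intrinsic parameters (fx, fy, cx, cy) and the pose (cam_rotation, cam_translation) are hypothetical calibration values, not values given by this disclosure.

```python
import numpy as np

def pixel_to_world(u, v, depth_m, fx, fy, cx, cy, cam_rotation, cam_translation):
    # Back-project the pixel (u, v) at the measured depth into the camera frame.
    p_cam = np.array([(u - cx) * depth_m / fx,
                      (v - cy) * depth_m / fy,
                      depth_m])
    # Move the point into the world frame using the camera's predetermined pose.
    return cam_rotation @ p_cam + cam_translation
```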
The plant handling arrangement 1, e.g., the control unit 60, may further be configured to handle the identified candidate plant based on the determined position by triggering the robot arm 40 of the plant handling arrangement 1 to grab the identified candidate plant 10 at the determined coordinate.
The plant handling arrangement 1, e.g., the control unit 60, may further be configured to obtain the image data indicative of the set of plants 20 by obtaining three-dimensional 3D image data of the set of plants 20. The 3D image data may be indicative of a point cloud of the set of plants 20.
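If the 3D image data arrives as a point cloud, one common intermediate representation is a rasterized height map, from which height data such as that used in the comparison above can be read; the grid resolution below is an arbitrary illustrative value.

```python
import numpy as np

def cloud_to_height_map(points, cell_m=0.005):
    # points is an (N, 3) array of x, y, z samples; keep the maximum z per cell.
    xy_min = points[:, :2].min(axis=0)
    idx = np.floor((points[:, :2] - xy_min) / cell_m).astype(int)
    height_map = np.full(tuple(idx.max(axis=0) + 1), np.nan)
    for (ix, iy), z in zip(idx, points[:, 2]):
        if np.isnan(height_map[ix, iy]) or z > height_map[ix, iy]:
            height_map[ix, iy] = z
    return height_map
```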
The plant handling arrangement 1, e.g., the control unit 60, may further be configured to obtain the image data indicative of the set of plants 20 by obtaining two-dimensional 2D image data of the set of plants 20. In some embodiments, the set of plants 20 is arranged at least partly in the at least one shared area 30.
The embodiments herein may be implemented through a respective processor or one or more processors, such as the processor 1160 of a processing circuitry in the plant handling arrangement 1, e.g., the control unit 60, together with respective computer program code for performing the functions and actions of the embodiments herein. The program code mentioned above may also be provided as a computer program, e.g., a computer program product, for instance in the form of a data carrier carrying computer program code for performing the embodiments herein when being loaded into the plant handling arrangement 1, e.g., the control unit 60. One such carrier may be in the form of a CD-ROM disc. It is however feasible with other data carriers such as a memory stick. The computer program code may furthermore be provided as pure program code on a server and downloaded to the plant handling arrangement 1, e.g., the control unit 60.

The plant handling arrangement 1, e.g., the control unit 60, may further comprise a memory 1170 comprising one or more memory units. The memory 1170 comprises instructions executable by the processor in the plant handling arrangement 1, e.g., the control unit 60. The memory 1170 is arranged to be used to store e.g. information, indications, data, configurations, and applications to perform the methods herein when being executed in the plant handling arrangement 1, e.g., the control unit 60.

In some embodiments, a computer program 1180 comprises instructions, which when executed by the respective at least one processor 1160, cause the at least one processor of the plant handling arrangement 1, e.g., the control unit 60, to perform the actions above.

In some embodiments, a respective carrier 1190 comprises the respective computer program 1180, wherein the carrier 1190 is one of an electronic signal, an optical signal, an electromagnetic signal, a magnetic signal, an electric signal, a radio signal, a microwave signal, or a computer-readable storage medium.
Those skilled in the art will appreciate that the units in the plant handling arrangement 1, e.g., the control unit 60, described above may refer to a combination of analog and digital circuits, and/or one or more processors configured with software and/or firmware, e.g. stored in the plant handling arrangement 1, e.g., the control unit 60, that, when executed by the respective one or more processors, perform as described above. One or more of these processors, as well as the other digital hardware, may be included in a single Application-Specific Integrated Circuit (ASIC), or several processors and various digital hardware may be distributed among several separate components, whether individually packaged or assembled into a system-on-a-chip (SoC).
When using the word "comprise" or "comprising" it shall be interpreted as non- limiting, i.e. meaning "consist at least of".
The embodiments herein are not limited to the above described preferred embodiments. Various alternatives, modifications and equivalents may be used.
Any embodiments and/or examples mentioned above may be used in any suitable combination with any other one or more embodiments and/or examples above.

Claims (20)

1. A method performed by a plant handling arrangement (1) for handling one or more plants of a set of plants (20), the method comprising:
- obtaining (201) image data indicative of the set of plants (20),
- based on the obtained image data, identifying (203) a candidate plant (10) out of the set of plants (20),
- based on the obtained image data, determining (204) a position of the identified candidate plant (10), and
- handling (205) the identified candidate plant (10) based on the determined position.

2. The method according to claim 1, wherein identifying (203) the candidate plant (10) out of the set of plants (20) comprises identifying a predetermined plant shape from the image data.

3. The method according to claim 2, wherein identifying the predetermined plant shape from the image data comprises matching one or more features of the predetermined plant shape with the image data.

4. The method according to any one of claims 2-3, wherein the image data comprises height data of the set of plants (20), and wherein matching the one or more features of the predetermined plant shape with the image data comprises matching the one or more features of the predetermined plant shape with the height data.

5. The method according to claim 4, wherein matching the height data with the one or more features of the predetermined plant shape comprises, based on the height data and the one or more features of the predetermined plant shape, identifying a first plant associated with a first height and a second plant associated with a second height, and wherein identifying (203) the candidate plant (10) out of the set of plants (20) comprises identifying the candidate plant (10) out of the first plant and the second plant based at least partly on a comparison between the first height and the second height.

6. The method according to any one of claims 2-5, wherein the one or more features of the predetermined plant shape comprises any one or more out of:
- a size parameter,
- a three-dimensional shape of at least a part of the predetermined plant shape,
- a two-dimensional shape of a cross-section or a side of at least a part of the predetermined plant shape, and
- a two-dimensional representation of the predetermined plant shape, wherein matching the one or more features comprises comparing a contrast and/or color difference between the predetermined plant shape and the image data.

7. A method according to any one of claims 1-6, wherein identifying (203) the candidate plant (10) out of the set of plants (20) comprises identifying the candidate plant (10) based on a subset of the image data.

8. The method according to claim 7, wherein the subset is a predetermined selected subset of the image data.

9. The method according to claim 8, further comprising selecting (202) the subset based on an area of the image data fulfilling one or more criteria.

10. The method according to any one of claims 1-9, wherein the plant handling arrangement (1) comprises at least one camera (2) arranged at a predetermined position, wherein obtaining (201) the image data comprises obtaining one or more images captured by the at least one camera (2), and wherein determining (204) the position of the identified candidate plant (10) is, at least partially, based on the predetermined position of the camera.

11. The method according to any one of claims 1-10, wherein determining (204) the position of the identified candidate plant (10) comprises determining one or more coordinates for handling the identified candidate plant (10).

12. The method according to claim 11, wherein handling (205) the identified candidate plant based on the determined position comprises triggering a robot arm (40) of the plant handling arrangement (1) to grab the identified candidate plant (10) at the determined coordinate.

13. The method according to any one of claims 1-12, wherein the obtaining (201) the image data indicative of the set of plants (20) comprises obtaining three-dimensional (3D) image data of the set of plants (20), wherein the 3D image data is indicative of a point cloud of the set of plants (20).

14. The method according to any one of claims 1-13, wherein the obtaining (201) the image data indicative of the set of plants (20) comprises obtaining two-dimensional (2D) image data of the set of plants (20).

15. The method according to any one of claims 1-14, wherein the set of plants (20) is arranged at least partly in at least one shared area (30).

16. A plant handling arrangement (1) configured to handle a set of plants (20), wherein the plant handling arrangement (1) is configured to:
- obtain image data indicative of the set of plants (20),
- based on the obtained image data, identify a candidate plant (10) out of the set of plants (20),
- based on the obtained image data, determine a position of the identified candidate plant (10), and
- handle the identified candidate plant based on the determined position.

17. The plant handling arrangement (1) configured to perform the method according to any one of claims 2-

18. A vehicle (50) comprising the plant handling arrangement (1) according to claim

19. A computer program (1180) comprising instructions, which when executed by a processor (1160), causes the processor to perform actions according to any of the claims 1-

20. A carrier (1190) comprising the computer program (1180) of claim 19, wherein the carrier is one of an electronic signal, an optical signal, an electromagnetic signal, a magnetic signal, an electric signal, a radio signal, a microwave signal, or a computer-readable storage medium.
SE2350163A 2023-02-16 2023-02-16 Plant handling arrangement and a method SE2350163A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
SE2350163A SE2350163A1 (en) 2023-02-16 2023-02-16 Plant handling arrangement and a method
PCT/EP2024/053202 WO2024170405A1 (en) 2023-02-16 2024-02-08 Plant handling arrangement and a method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
SE2350163A SE2350163A1 (en) 2023-02-16 2023-02-16 Plant handling arrangement and a method

Publications (1)

Publication Number Publication Date
SE2350163A1 true SE2350163A1 (en) 2024-08-17

Family

ID=89901266

Family Applications (1)

Application Number Title Priority Date Filing Date
SE2350163A SE2350163A1 (en) 2023-02-16 2023-02-16 Plant handling arrangement and a method

Country Status (2)

Country Link
SE (1) SE2350163A1 (en)
WO (1) WO2024170405A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2719273A1 (en) * 2012-10-12 2014-04-16 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Method and apparatus for cutting plants
CN107529718A (en) * 2015-04-20 2018-01-02 三原诚 Seedling tree transplanting device and seedling tree transplanting vehicle
EP3552479A1 (en) * 2018-04-10 2019-10-16 IG Specials B.V. Apparatus and method for placing plant bulbs
US10602664B1 (en) * 2015-04-27 2020-03-31 X Development Llc Tagging of fruit-producing flowers for robotic selective harvesting
US20210150208A1 (en) * 2018-07-05 2021-05-20 Iron Ox, Inc. Method for selectively deploying sensors within an agricultural facility
US20210368686A1 (en) * 2020-05-28 2021-12-02 Automated Harvesting Solutions, LLC End-effector with rotary actuator for harvesting crops
CN114128461A (en) * 2021-10-27 2022-03-04 江汉大学 Control method of a plug seedling raising and transplanting robot, and plug seedling raising and transplanting robot
US20230028506A1 (en) * 2019-11-25 2023-01-26 Robert Bosch Gmbh Method for Processing Plants in a Field

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FI20106090A0 (en) * 2010-10-21 2010-10-21 Zenrobotics Oy Procedure for filtering target image images in a robotic system
US12256682B2 (en) * 2016-12-22 2025-03-25 Inevitable Technology Inc. System and method for automating transfer of plants within an agricultural facility
CN115657531B (en) * 2022-10-18 2024-11-26 兰州大学 A system and method for determining bonsai grasping posture and parking robot

Also Published As

Publication number Publication date
WO2024170405A1 (en) 2024-08-22

Similar Documents

Publication Publication Date Title
US10779472B2 (en) Robotic fruit picking system
Defterli Review of robotic technology for strawberry production
US10524425B2 (en) Method for automating transfer of plants within an agricultural facility
JP6661211B1 (en) Control device and control method for robot system
US8306663B2 (en) Robot with 3D grasping capability
US9913429B1 (en) Tagging of fruit-producing flowers for robotic selective harvesting
Raja et al. Crop signalling: A novel crop recognition technique for robotic weed control
Hua et al. Recent advances in intelligent automated fruit harvesting robots
BR112018001902B1 (en) METHOD FOR PERFORMING TASKS, AND ROBOT CONFIGURED
Burks et al. Engineering and horticultural aspects of robotic fruit harvesting: Opportunities and constraints
ES2910073T3 (en) Multi-arm robot for complex order picking tasks
Hayashi et al. Automation technologies for strawberry harvesting and packing operations in Japan
US20240373787A1 (en) Robot fruit picking system
Ren et al. Mobile robotics platform for strawberry sensing and harvesting within precision indoor farming systems
Kounalakis et al. Development of a tomato harvesting robot: Peduncle recognition and approaching
Tejada et al. Proof-of-concept robot platform for exploring automated harvesting of sugar snap peas
Khort et al. Robotized platform for picking of strawberry berries
Yoshida et al. A tomato recognition method for harvesting with robots using point clouds
US11413762B2 (en) Method and device for identifying objects
Bhattarai et al. Design, integration, and field evaluation of a robotic blossom thinning system for tree fruit crops
SE2350163A1 (en) Plant handling arrangement and a method
EP4404012A2 (en) Method for packaging pieces by means of a manipulator robot and head for fastening and transporting said pieces by means of the manipulator robot
Yung et al. Partially structured robotic picking for automation of tomato transplantation
Hemming Current developments in greenhouse robotics and challenges for the future
Scarfe Development of an autonomous kiwifruit harvester: a thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Industrial Automation at Massey University, Manawatu, New Zealand.