CN109436820A - De-stacking method and de-stacking system for stacks of goods - Google Patents
Publication number: CN109436820A (application CN201811082406.3A, China)
Legal status: Granted
Classification: B65G61/00 (Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for)
Abstract
The present invention relates to a de-stacking method and a de-stacking system for stacks of goods. The method comprises: acquiring a 2D image obtained by a 2D camera shooting the goods stack from above and a 3D image obtained by a 3D camera shooting the goods stack from above; determining, from the 2D image and the 3D image, the pose information of each top-layer cargo on the top layer of the stack in the robot base coordinate system; planning, from the pose information of all top-layer cargos, the motion trajectory of the robot toward the stack and the de-stacking order; controlling the robot to take all top-layer cargos off the top layer according to the trajectory and the order; and, while the current top layer is not the last layer of the stack, continuing to take all top-layer cargos off the top layer, thereby de-stacking the whole stack. Since the robot's motion trajectory and de-stacking order are planned automatically from the 2D and 3D images, and the robot de-stacks the goods layer by layer accordingly, a high degree of automation and high de-stacking efficiency are achieved.
Description
Technical Field
The invention relates to the technical field of goods storage and transportation, in particular to a goods stack unstacking method and a goods stack unstacking system.
Background
At present, automatic palletizing systems are widely used in the goods storage and transportation process at home and abroad: goods are stacked directly onto trays, which are then conveyed into the warehouse by a forklift or an AGV trolley. The goods, however, are usually still taken off the trays manually before being delivered out of the warehouse, so manual unstacking is difficult and inefficient.
Disclosure of Invention
The invention provides a goods stack unstacking method and a goods stack unstacking system, aiming at the technical problems that manual unstacking in the conventional unstacking process is difficult and inefficient.
In one aspect, the invention provides a method for unstacking a stack of goods, which comprises the following specific steps:
step 1, acquiring a 2D image obtained by a 2D camera shooting the cargo stack from above and a 3D image obtained by a 3D camera shooting the cargo stack from above;
step 2, determining pose information of each top-layer cargo on the top layer of the cargo stack based on a robot base coordinate system according to the 2D image and the 3D image;
step 3, planning a motion track and a destacking sequence of the robot to the goods stack according to pose information of all top-layer goods based on a robot base coordinate system;
step 4, controlling the robot to take out all top-layer cargos from the top layer of the cargo stack according to the motion track and the unstacking sequence;
and 5, judging whether the top layer of the goods stack is the last layer of the goods stack, if not, skipping to execute the step 1, and if so, stopping unstacking.
The goods stack unstacking method provided by the invention has the following beneficial effects. The 2D camera and the 3D camera shoot the goods stack from above at the same time to obtain a 2D image and a 3D image, and the two images are used to identify the pose information of each top-layer cargo on the top layer of the stack in the robot base coordinate system, so the pose information can be calibrated quickly and accurately. The motion trajectory of the robot toward the stack and the unstacking order are planned from this pose information, so the robot unstacks the goods layer by layer and an optimal motion path can be planned for different stacks. Following the unstacking order, the robot moves along the motion trajectory to the top layer of the stack and removes the top-layer cargos one by one until all of them have been taken out; whenever the top layer is not the last layer of the stack, the process repeats on the next layer. The method therefore achieves a high degree of automation and high unstacking efficiency.
In another aspect, the present invention provides a system for unstacking a stack of goods, the system comprising: a conveyor belt, a 2D camera, a 3D camera, a processor, and a robot;
the conveying belt is used for conveying the goods stack to a working area of the robot;
the 2D camera is used for shooting a 2D image of the goods stack from above;
the 3D camera is used for shooting a 3D image of the goods stack from above;
the processor is used for acquiring the 2D image and the 3D image, determining the position and attitude information of each top-layer cargo on the top layer of the cargo stack based on a robot base coordinate system according to the 2D image and the 3D image, and planning the motion track and the unstacking sequence of the robot to the cargo stack according to the position and attitude information of all the top-layer cargos based on the robot base coordinate system;
the robot is used for taking all the top-layer cargos out of the top layer of the cargo stack according to the motion track and the unstacking sequence;
the processor is further used for judging whether the top layer of the goods stack is the last layer of the goods stack; if not, unstacking continues, and if so, unstacking stops.
The goods stack unstacking system provided by the invention has the following beneficial effects. The top layer of the goods stack is the layer of goods closest to the 2D camera and the 3D camera, and the heights from the 2D camera and from the 3D camera to the top layer of the stack are equal; the two cameras shoot the goods stack from above at the same time to obtain a 2D image and a 3D image, and the system applies whether the cameras are mounted at the same or at different heights above the stack. The two images are used to identify the pose information of each top-layer cargo on the top layer of the stack in the robot base coordinate system, so the pose information can be calibrated quickly and accurately. The processor plans the motion trajectory of the robot toward the stack and the unstacking order from this pose information, so the robot unstacks the goods layer by layer and an optimal motion path can be planned for different stacks. Following the unstacking order, the robot moves along the motion trajectory to the top layer of the stack and removes the top-layer cargos one by one until all of them have been taken out; whenever the top layer is not the last layer of the stack, the process repeats on the next layer. The system therefore achieves a high degree of automation, high unstacking efficiency and high practical engineering value.
Drawings
Fig. 1 is a schematic flow chart of a method for unstacking a stack of goods according to an embodiment of the present invention;
FIG. 2 is a schematic structural diagram of a system for unstacking stacks of items according to an embodiment of the present invention;
fig. 3 is a schematic diagram of the electrical connections in a system for unstacking packs of goods according to fig. 2.
In the drawings, the components represented by the respective reference numerals are listed below:
11-base, 12-robot, 13-conveyor belt, 14-elevator, 15-processor, 121-robot arm, 122-unstacking fixture, 141-2D camera, 142-3D camera.
Detailed Description
The principles and features of this invention are described below in conjunction with examples, which are set forth to illustrate, but are not to be construed to limit the scope of the invention.
Example one
As shown in fig. 1, an embodiment of the present invention provides a method for unstacking a stack of goods, including the following steps:
step 1, acquiring a 2D image obtained by a 2D camera shooting the goods stack from above and a 3D image obtained by a 3D camera shooting the goods stack from above;
step 2, determining the position and posture information of each top-layer cargo on the top layer of the cargo stack based on the robot base coordinate system according to the 2D image and the 3D image;
step 3, planning the motion trail and unstacking sequence of the robot to the goods stack according to the pose information of all top-layer goods based on the robot base coordinate system;
step 4, controlling the robot to take out all top-layer goods from the top layer of the goods stack according to the motion track and the unstacking sequence;
and 5, judging whether the top layer of the goods stack is the last layer of the goods stack, if not, skipping to execute the step 1, and if so, stopping unstacking.
In this embodiment, the top layer of the goods stack is the layer of goods closest to the 2D camera and the 3D camera, and the heights from the 2D camera and from the 3D camera to the top layer of the stack are equal; the two cameras shoot the goods stack from above at the same time to obtain a 2D image and a 3D image, and the method applies whether the cameras are mounted at the same or at different heights above the stack. The two images are used to identify the pose information of each top-layer cargo on the top layer of the stack in the robot base coordinate system, so the pose information can be calibrated quickly and accurately. The motion trajectory of the robot toward the stack and the unstacking order are planned from this pose information, so the robot unstacks the goods layer by layer and an optimal motion path can be planned for different stacks. Following the unstacking order, the robot moves along the motion trajectory to the top layer of the stack and removes the top-layer cargos one by one until all of them have been taken out; whenever the top layer is not the last layer of the stack, the process repeats on the next layer. The method therefore has the characteristics of a high degree of automation, high unstacking efficiency and high practical engineering value.
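As an informal illustration of steps 1-5, the layer-by-layer loop can be sketched in Python. Everything here is a hypothetical simplification: a stack is just a list of layers (top first), a layer is a list of cargo labels, and sorting a layer stands in for the imaging, pose estimation and trajectory-planning stages.

```python
def unstack(stack_layers):
    """Remove layers from a stack (a list of layers, top layer first) until empty."""
    removed = []
    while stack_layers:                 # step 5: stop once the last layer is gone
        top = stack_layers[0]           # steps 1-2: image the stack, locate top layer
        order = sorted(top)             # step 3: plan an unstacking order
        for cargo in order:             # step 4: take out each top-layer cargo in order
            removed.append(cargo)
        stack_layers.pop(0)             # the next layer becomes the new top layer
    return removed

print(unstack([[3, 1, 2], [5, 4]]))  # → [1, 2, 3, 4, 5]
```

In the real system, steps 1 to 3 are re-run for every layer, because the cameras are lowered and new images are captured each time a layer is removed.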
Preferably, step 2 specifically comprises:
Step 2.1, let the first camera coordinate system of the 2D camera be {o1-x1y1z1}, the second camera coordinate system of the 3D camera be {o2-x2y2z2}, and the robot base coordinate system be {ob-xbybzb}; acquire a first homogeneous transformation matrix of the first camera coordinate system in the second camera coordinate system and a second homogeneous transformation matrix of the second camera coordinate system in the robot base coordinate system.
And 2.2, detecting the 2D image to obtain two-dimensional coordinates of each top cargo based on a first camera coordinate system, and detecting the 3D image to obtain depth coordinates and posture information of each cargo in the 3D image based on a second camera coordinate system.
And 2.3, transforming the two-dimensional coordinates of each top-layer cargo in the first camera coordinate system with the first homogeneous transformation matrix to obtain the two-dimensional coordinates of each top-layer cargo in the second camera coordinate system.
And 2.4, sequencing the depth coordinates of each cargo in the 3D image based on the second camera coordinate system to obtain a depth coordinate sequence, determining the depth coordinates of each top cargo based on the second camera coordinate system according to the distance between adjacent depth coordinates in the depth coordinate sequence, and correspondingly forming the three-dimensional coordinates of each top cargo based on the second camera coordinate system by using the two-dimensional coordinates of each top cargo based on the second camera coordinate system and the depth coordinates of each top cargo based on the second camera coordinate system.
And 2.5, determining the position information of each top-level cargo in the second camera coordinate system according to each three-dimensional coordinate, and transforming the position information and the posture information of each top-level cargo in the second camera coordinate system by applying a second homogeneous transformation matrix to obtain the posture information of each top-level cargo based on the robot base coordinate system.
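The chain of frames in steps 2.1-2.5 (first camera frame to second camera frame to robot base frame) amounts to composing homogeneous transformation matrices. The sketch below shows that composition with plain 4×4 matrices; H1, H2 and the sample point are invented values for illustration, not calibration data from the patent.

```python
def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(h, p):
    """Apply a 4x4 homogeneous transform h to a 3D point p."""
    v = [p[0], p[1], p[2], 1.0]
    return [sum(h[i][k] * v[k] for k in range(4)) for i in range(3)]

# Pure translations for illustration: H1 (cam1 -> cam2) shifts x by 0.25 m,
# H2 (cam2 -> robot base) shifts z by 2.0 m.
H1 = [[1, 0, 0, 0.25], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
H2 = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 2.0], [0, 0, 0, 1]]

p_cam1 = (0.5, 0.2, 1.5)                 # cargo centre seen in the first camera frame
p_base = transform(matmul(H2, H1), p_cam1)
print(p_base)                            # → [0.75, 0.2, 3.5]
```

Composing the second matrix with the first in this way is exactly why only two calibrated transforms (camera-to-camera and camera-to-base) are needed to land every cargo pose in the robot base frame.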
After the 2D image and the 3D image are filtered, corrected, enhanced and depth-detected, the number of top-layer cargos on the top layer of the stack and the pixel coordinates of each top-layer cargo in the 2D image are identified from the 2D image, and the two-dimensional coordinates of each top-layer cargo, which represent its planar position on the top layer, are calculated from those pixel coordinates. In addition, the depth coordinate and attitude information of each cargo in the second camera coordinate system can be identified quickly from the 3D image: the depth coordinate represents the distance from the 3D camera to the layer the cargo belongs to, and the attitude information represents the cargo's orientation and angle in the second camera coordinate system. This improves the efficiency and accuracy of identifying the two-dimensional coordinates, the depth coordinates and the attitude information.
The cargos in the 3D image are sorted by their depth coordinates in the second camera coordinate system in ascending order to obtain a depth coordinate sequence, and the difference between each pair of adjacent depth coordinates in the sequence is calculated. Each difference is compared with a preset cargo height: if the difference is smaller than the preset cargo height, the two cargos corresponding to the adjacent depth coordinates lie on the same layer; if the difference is greater than or equal to the preset cargo height, they are not on the same layer. The cargos on the shallowest layer are determined to be the top-layer cargos, and their depth coordinates in the sequence are taken as the depth coordinates of the top-layer cargos in the second camera coordinate system; cargos not on the top layer are excluded, which ensures the accuracy of the top-layer cargos and of their depth coordinates.
For example: for n top-layer cargos (e.g., boxes) box_1, box_2, …, box_n with three-dimensional coordinates (x1, y1, z1), (x2, y2, z2), …, (xn, yn, zn), sorted so that z1 ≤ z2 ≤ … ≤ zn, the cargos all belong to the top layer when |zi − zi+1| < D for every i from 1 to n − 1, where D is the preset cargo height.
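The adjacent-depth test of step 2.4 can be sketched as follows; the depth values and the preset cargo height D = 0.25 m are made up for illustration.

```python
def top_layer_depths(depths, cargo_height):
    """Return the depths belonging to the top (shallowest) layer.

    Depths are sorted ascending; cargos stay in the top layer while each
    gap to the previous depth is smaller than one cargo height.
    """
    ordered = sorted(depths)
    top = [ordered[0]]
    for prev, cur in zip(ordered, ordered[1:]):
        if cur - prev < cargo_height:   # same layer: gap smaller than one cargo
            top.append(cur)
        else:                           # a full cargo height lower: next layer
            break
    return top

# Three cargos at ~1.2 m from the camera, two roughly one 0.3 m layer lower.
print(top_layer_depths([1.21, 1.50, 1.19, 1.20, 1.52], cargo_height=0.25))
# → [1.19, 1.2, 1.21]
```

The early `break` mirrors the step-2.4 exclusion rule: once one gap reaches the preset cargo height, everything deeper belongs to a lower layer and is discarded.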
The three-dimensional coordinates give the position information in the form of a 3×1 vector, and the attitude information is represented as a 3×3 rotation matrix; both are transformed into the robot base coordinate system through the second homogeneous transformation matrix to obtain the pose information, so the pose is located by the robot hand-eye calibration technique.
Preferably, step 3 specifically comprises:
step 3.1, for the pose information of each top-layer cargo based on the robot base coordinate system, let the position information be Pb and the attitude information be Rb, and establish a motion equation;
the motion equation is expressed as

    bHf · fHee = [ Rb  Pb ]
                 [ 0    1 ]

wherein bHf is the homogeneous transformation matrix of the flange coordinate system of the robot in the robot base coordinate system, and fHee is the homogeneous transformation matrix of the unstacking-tool coordinate system of the robot in the flange coordinate system;
step 3.2, planning the motion trail of the robot arm corresponding to the flange coordinate system to each top-layer cargo by using a motion equation;
and 3.3, sorting the position information in the pose information of all top-layer cargos based on the robot base coordinate system in ascending order to obtain a position sequence, and taking this ascending position sequence as the unstacking order.
The homogeneous transformation matrix of the flange coordinate system in the robot base coordinate system is obtained by reading the robot joint angles through the teach pendant, and the parameters of the homogeneous transformation matrix of the unstacking-tool coordinate system in the flange coordinate system are fixed values.
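Because fHee is a fixed, known transform, the motion equation of step 3.1 can be solved for the flange target pose as bHf = bHtarget · (fHee)^(-1), where bHtarget packs the cargo pose (Rb, Pb). A minimal sketch, assuming a purely translational tool offset and identity cargo rotation (both invented values):

```python
def inv_h(h):
    """Invert a 4x4 rigid homogeneous transform: [R|t]^-1 = [R^T | -R^T t]."""
    r = [[h[j][i] for j in range(3)] for i in range(3)]            # R^T
    t = [-sum(r[i][k] * h[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]], [0, 0, 0, 1]]

def matmul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Unstacking tool hangs 0.25 m below the flange; cargo pose has identity
# rotation Rb and position Pb = (0.4, 0.1, 0.9) in the base frame.
fHee = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0.25], [0, 0, 0, 1]]
target = [[1, 0, 0, 0.4], [0, 1, 0, 0.1], [0, 0, 1, 0.9], [0, 0, 0, 1]]

bHf = matmul(target, inv_h(fHee))
print([row[3] for row in bHf[:3]])   # flange position with the tool offset removed
```

The flange must stop one tool length short of the cargo, which is exactly what composing with the inverse tool transform produces.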
A motion equation is constructed from the position and attitude information of each top-layer cargo in the robot base coordinate system, and the optimal motion trajectory of the robot arm to each top-layer cargo is planned from the equation. The three-dimensional coordinates of the position information are sorted in ascending order: the smaller the coordinates, the closer the top-layer cargo is to the robot, and the larger the coordinates, the farther away it is, which determines an unstacking order from top to bottom and from front to back.
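The ascending-position ordering of step 3.3 can be sketched as a sort key: pick higher cargos first and, among cargos at the same height, pick the ones nearer the robot base first. The tie-break by planar distance is one plausible reading of "from top to bottom and from front to back"; the coordinates below are invented.

```python
def unstacking_order(positions):
    """positions: list of (x, y, z) cargo centres in the robot base frame, z up.
    Returns cargo indices sorted top-to-bottom, then near-to-far from the base."""
    def key(i):
        x, y, z = positions[i]
        return (-z, x * x + y * y)       # higher first, then closer to the base first
    return sorted(range(len(positions)), key=key)

# Two cargos on the top layer (z = 0.5), one on a lower layer (z = 0.2).
boxes = [(0.8, 0.0, 0.5), (0.3, 0.1, 0.5), (0.4, 0.0, 0.2)]
print(unstacking_order(boxes))   # → [1, 0, 2]
```

Sorting by a composite key like this keeps the planner a single pass over the detected poses, with no extra layer bookkeeping.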
Preferably, step 4 specifically includes: according to the unstacking order, driving the robot arm to move along each motion trajectory to each top-layer cargo in turn, so that the unstacking tool in the unstacking-tool coordinate system moves with the robot arm to each top-layer cargo, and driving the unstacking tool to take each top-layer cargo off the top layer of the stack until all top-layer cargos have been removed.
The robot arm starts from its initial position, moves along a motion trajectory to a top-layer cargo and, after the unstacking tool has taken that cargo off the top layer of the stack, returns to the initial position; the robot arm and the unstacking tool repeat this cycle in the unstacking order until all top-layer cargos have been removed, which improves the accuracy with which the cargos are taken out.
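The pick cycle just described (move out from the home pose, pick, return home, repeat in unstacking order) can be sketched as a simple action trace; the trace format and the poses are illustrative, not part of the patent.

```python
def pick_cycle(home, cargo_positions, order):
    """Return the sequence of (action, detail) pairs for one layer's picks."""
    trace = []
    for i in order:
        trace.append(("move", cargo_positions[i]))   # follow planned trajectory out
        trace.append(("pick", i))                    # unstacking tool grabs cargo i
        trace.append(("move", home))                 # return to the initial position
    return trace

trace = pick_cycle((0, 0, 1), [(0.4, 0.1, 0.5), (0.6, 0.2, 0.5)], [0, 1])
print([op for op, _ in trace])   # → ['move', 'pick', 'move', 'move', 'pick', 'move']
```

Returning to a fixed home pose between picks trades cycle time for repeatability, which matches the accuracy claim above.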
Preferably, step 5 specifically includes: when the top layer of the goods stack is not the last layer of the goods stack, synchronously reducing the 2D camera and the 3D camera to the same height as the top layer of the goods stack, and skipping to execute the step 1; the de-stacking is stopped when the top layer of the stack is the last layer of the stack.
If the top layer of the stack is not the last layer, at least one more layer of goods lies below the layer just unstacked, and that layer becomes the next top layer to be unstacked. To keep the new top layer within the shooting range of the 2D camera and the 3D camera, the two cameras are lowered synchronously to the height previously occupied by the goods of the layer just removed, so the height difference between the cameras and the top layer of the stack remains unchanged.
Alternatively, if the top level of the stack is the last level of the stack, indicating that the stack has been de-stacked, the 2D camera and the 3D camera are raised simultaneously to the initial height to wait for the next stack to be de-stacked.
Example two
As shown in fig. 2 and 3, an embodiment of the present invention provides a system for unstacking a stack of goods, including: the automatic unstacking device comprises a base 11, a robot 12, a conveyor belt 13, a lifter 14 and a processor 15, wherein the robot 12 comprises a robot arm 121 and a unstacking clamp 122 which is arranged at the tail end of the robot arm 121 and can move along with the robot arm 121, and the lifter 14 is provided with a 2D camera 141 and a 3D camera 142.
The robot 12 is fixed on the pedestal of the base 11, one side face of the base 11 faces the conveyor belt 13, the conveyor belt 13 is flush with one side edge of the base 11, and the 2D camera 141, the 3D camera 142 and the robot 12 are each electrically connected to the processor 15.
The conveyor belt 13 is used to convey the goods stack into the working area of the robot arm 121, the 2D camera 141 is used to take a 2D image of the goods stack from above, and the 3D camera 142 is used to take a 3D image of the goods stack from above.
The processor 15 is configured to acquire the 2D image and the 3D image, determine the pose information of each top-level cargo on the top layer of the stack in the robot base coordinate system from the two images, and plan the motion trajectory of the robot toward the stack and the unstacking order from the pose information of all top-level cargos; it is further configured to judge whether the top layer of the stack is the last layer, continuing to unstack if not and stopping if so.
The robot 12 is used to take all top-level goods from the top level of the stack according to the motion trajectory and unstacking sequence, and the elevator 14 is used to synchronously lower the 2D camera and the 3D camera to the same height as the top level of the stack when the top level of the stack is not the last level of the stack.
In this embodiment, the top layer of the goods stack is the layer of goods closest to the 2D camera and the 3D camera, and the heights from the 2D camera and from the 3D camera to the top layer of the stack are equal; the two cameras shoot the goods stack from above at the same time to obtain a 2D image and a 3D image, and the system applies whether the cameras are mounted at the same or at different heights above the stack. The two images are used to identify the pose information of each top-layer cargo on the top layer of the stack in the robot base coordinate system, so the pose information can be calibrated quickly and accurately. The processor plans the motion trajectory of the robot toward the stack and the unstacking order from this pose information, so the robot unstacks the goods layer by layer and an optimal motion path can be planned for different stacks. Following the unstacking order, the robot moves along the motion trajectory to the top layer of the stack and removes the top-layer cargos one by one until all of them have been taken out; whenever the top layer is not the last layer of the stack, the process repeats on the next layer. The system therefore has the characteristics of a high degree of automation, high unstacking efficiency and high practical engineering value.
Preferably, the processor 15 is specifically configured to:
Let the first camera coordinate system of the 2D camera be {o1-x1y1z1}, the second camera coordinate system of the 3D camera be {o2-x2y2z2}, and the robot base coordinate system be {ob-xbybzb}; acquire a first homogeneous transformation matrix of the first camera coordinate system in the second camera coordinate system and a second homogeneous transformation matrix of the second camera coordinate system in the robot base coordinate system.
And detecting the 2D image to obtain a two-dimensional coordinate of each top cargo based on a first camera coordinate system, and detecting the 3D image to obtain a depth coordinate and attitude information of each top cargo in the 3D image based on a second camera coordinate system.
And transforming the two-dimensional coordinates of each top-layer cargo in the first camera coordinate system with the first homogeneous transformation matrix to obtain the two-dimensional coordinates of each top-layer cargo in the second camera coordinate system.
And sequencing each cargo in the 3D image based on the depth coordinate of the second camera coordinate system to obtain a depth coordinate sequence, determining the depth coordinate of each top cargo based on the second camera coordinate system according to the distance between adjacent depth coordinates in the depth coordinate sequence, and correspondingly forming the three-dimensional coordinate of each top cargo based on the second camera coordinate system by the two-dimensional coordinate of each top cargo based on the second camera coordinate system and the depth coordinate of each top cargo based on the second camera coordinate system.
And determining the position information of each top cargo in the second camera coordinate system according to each three-dimensional coordinate, and transforming the position information and the posture information of each top cargo in the second camera coordinate system by applying a second homogeneous transformation matrix to obtain the position and posture information of each top cargo based on the robot base coordinate system.
Preferably, the processor 15 is further configured to:
for the position and attitude information of each top-layer cargo based on the robot base coordinate system, the position information is PbThe attitude information is RbAnd then, establishing a motion equation,
the equation of motion is expressed as
Wherein,bHfis a homogeneous transformation matrix of a flange coordinate system of the robot in a robot base coordinate system,fHeeis a homogeneous transformation matrix of the coordinate system of the unstacking tool of the robot in the coordinate system of the flange plate.
And planning, with the motion equation, the motion trajectory along which the robot arm corresponding to the flange coordinate system moves to each top-layer cargo; sorting the position information in the pose information of all top-layer cargos based on the robot base coordinate system in ascending order to obtain a position sequence, and taking this ascending position sequence as the unstacking order.
Preferably, the processor 15 is specifically configured to: and according to the unstacking sequence, sequentially driving the robot arm to move to each top-layer cargo along each motion track, enabling the unstacking tool arranged at the tail end of the robot arm in the unstacking tool coordinate system to move to each top-layer cargo along the robot arm, and driving the unstacking tool to take out each top-layer cargo from the top layer of the cargo stack until all the top-layer cargos are taken out completely.
Preferably, the elevator 14 is also used to simultaneously raise the 2D camera 141 and the 3D camera 142 to an initial height when the top level of the stack of goods is the last level of the stack of goods.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.
Claims (10)
1. A method of unstacking a stack of goods, comprising the steps of:
step 1, acquiring a 2D image obtained by a 2D camera shooting the cargo stack from above and a 3D image obtained by a 3D camera shooting the cargo stack from above;
step 2, determining pose information of each top-layer cargo on the top layer of the cargo stack based on a robot base coordinate system according to the 2D image and the 3D image;
step 3, planning a motion track and a destacking sequence of the robot to the goods stack according to pose information of all top-layer goods based on a robot base coordinate system;
step 4, controlling the robot to take out all top-layer cargos from the top layer of the cargo stack according to the motion track and the unstacking sequence;
and 5, judging whether the top layer of the goods stack is the last layer of the goods stack, if not, skipping to execute the step 1, and if so, stopping unstacking.
2. A method of unstacking a stack of goods according to claim 1 wherein step 2 comprises in particular:
step 2.1, let the first camera coordinate system of the 2D camera be {o1-x1y1z1}, the second camera coordinate system of the 3D camera be {o2-x2y2z2}, and the robot base coordinate system be {ob-xbybzb}; acquiring a first homogeneous transformation matrix of the first camera coordinate system in the second camera coordinate system and a second homogeneous transformation matrix of the second camera coordinate system in the robot base coordinate system;
step 2.2, detecting the 2D image to obtain two-dimensional coordinates of each top cargo based on the first camera coordinate system, and detecting the 3D image to obtain depth coordinates and posture information of each cargo in the 3D image based on the second camera coordinate system;
2.3, transforming the two-dimensional coordinates of each top-level cargo in the first camera coordinate system with the first homogeneous transformation matrix to obtain the two-dimensional coordinates of each top-level cargo in the second camera coordinate system;
step 2.4, sequencing the depth coordinates of each cargo in the 3D image based on the second camera coordinate system to obtain a depth coordinate sequence, determining the depth coordinates of each top cargo based on the second camera coordinate system according to the distance between adjacent depth coordinates in the depth coordinate sequence, and correspondingly forming the three-dimensional coordinates of each top cargo based on the second camera coordinate system by the two-dimensional coordinates of each top cargo based on the second camera coordinate system and the depth coordinates of each top cargo based on the second camera coordinate system;
step 2.5, determining the position information of each top-layer cargo in the second camera coordinate system according to each three-dimensional coordinate, and transforming the position information and the attitude information of each top-layer cargo in the second camera coordinate system by applying the second homogeneous transformation matrix to obtain the pose information of each top-layer cargo based on the robot base coordinate system.
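Steps 2.1-2.5 chain two homogeneous transformations: coordinates detected in the 2D camera frame {o_1} are mapped into the 3D camera frame {o_2} by the first matrix, then into the robot base frame {o_b} by the second. A minimal numpy sketch of that chain (all matrix and point values below are invented for illustration; in practice both matrices come from camera-to-camera and hand-eye calibration):

```python
import numpy as np

def make_H(R, t):
    """Assemble a 4x4 homogeneous transformation from rotation R (3x3) and translation t (3,)."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = t
    return H

# First homogeneous transformation: 2D-camera frame {o_1} expressed in the 3D-camera frame {o_2}.
H_1_in_2 = make_H(np.eye(3), [0.10, 0.00, 0.00])
# Second homogeneous transformation: 3D-camera frame {o_2} expressed in the robot base frame {o_b}.
H_2_in_b = make_H(np.eye(3), [1.00, 0.50, 2.00])

# One top-layer cargo: (x, y) from the 2D image, depth z from the 3D image,
# written as a homogeneous point in {o_1}.
p_cam1 = np.array([0.20, 0.30, 1.50, 1.0])

# Chain the two transforms to express the same point in the robot base frame.
p_base = H_2_in_b @ H_1_in_2 @ p_cam1
```

With these illustrative values `p_base` is `[1.30, 0.80, 3.50, 1.0]` — the position information that step 2.5 hands to the motion planner.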
3. A method of unstacking a stack of goods according to claim 2, characterized in that step 3 specifically comprises:
step 3.1, letting the position information in the pose information of each top-layer cargo based on the robot base coordinate system be P_b and the attitude information be R_b, and establishing a motion equation;
the motion equation is expressed as
^bH_f · ^fH_ee = [R_b, P_b; 0, 1]
wherein ^bH_f is the homogeneous transformation matrix of the flange coordinate system of the robot in the robot base coordinate system, and ^fH_ee is the homogeneous transformation matrix of the unstacking tool coordinate system of the robot in the flange coordinate system;
step 3.2, planning the motion track of the robot arm corresponding to the flange coordinate system to each top-layer cargo by applying the motion equation;
step 3.3, sequencing the position information in the pose information of all the top-layer cargos based on the robot base coordinate system to obtain a position sequence, and taking the position sequence, ordered from small to large, as the unstacking sequence.
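The motion equation in step 3.1 fixes the flange pose: since the target pose [R_b, P_b; 0, 1] is where the unstacking tool (not the flange) must arrive, the planner solves ^bH_f = [R_b, P_b; 0, 1] · (^fH_ee)^-1. A hedged numpy sketch, in which the tool offset and target pose are invented values rather than anything specified in the patent:

```python
import numpy as np

# Target cargo pose in the robot base frame (illustrative values only).
Rb = np.eye(3)                     # attitude information R_b
Pb = np.array([0.80, 0.20, 0.50])  # position information P_b

target = np.eye(4)                 # right-hand side [R_b, P_b; 0, 1]
target[:3, :3] = Rb
target[:3, 3] = Pb

# Unstacking tool frame in the flange frame: assumed 15 cm offset along flange z.
fHee = np.eye(4)
fHee[:3, 3] = [0.0, 0.0, 0.15]

# Solve the motion equation  bHf @ fHee = target  for the flange pose bHf.
bHf = target @ np.linalg.inv(fHee)
```

With these values the flange is commanded to [0.80, 0.20, 0.35], i.e. 15 cm short of the cargo along z, so the tool tip lands exactly at P_b.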
4. A method of unstacking a stack of goods according to claim 3, wherein step 4 specifically comprises:
sequentially driving the robot arm to move to each top-layer cargo along each motion track according to the unstacking sequence, so that the unstacking tool in the unstacking tool coordinate system moves to each top-layer cargo along with the robot arm, and driving the unstacking tool to take out each top-layer cargo from the top layer of the cargo stack until all the top-layer cargos are taken out.
5. A method of unstacking a stack of goods according to any one of claims 1 to 4, wherein step 5 specifically comprises:
when the top layer of the goods stack is not the last layer of the goods stack, synchronously lowering the 2D camera and the 3D camera to the same height as the top layer of the goods stack, and skipping to execute the step 1;
stopping unstacking when the top layer of the stack of goods is the last layer of the stack of goods.
6. A system for unstacking a stack of goods, the system comprising: a conveyor belt, a 2D camera, a 3D camera, a processor, and a robot;
the conveying belt is used for conveying the goods stack to a working area of the robot;
the 2D camera is used for shooting a 2D image of the goods stack above the goods stack;
the 3D camera is used for shooting a 3D image of the goods stack above the goods stack;
the processor is used for acquiring the 2D image and the 3D image, determining the position and attitude information of each top-layer cargo on the top layer of the cargo stack based on a robot base coordinate system according to the 2D image and the 3D image, and planning the motion track and the unstacking sequence of the robot to the cargo stack according to the position and attitude information of all the top-layer cargos based on the robot base coordinate system;
the robot is used for taking all the top-layer cargos out of the top layer of the cargo stack according to the motion track and the unstacking sequence;
the processor is further used for judging whether the top layer of the goods stack is the last layer of the goods stack; if not, the unstacking is continued, and if so, the unstacking is stopped.
7. The system of unstacking a stack of goods of claim 6 wherein the processor is specifically configured to:
let the first camera coordinate system of the 2D camera be {o_1-x_1y_1z_1}, the second camera coordinate system of the 3D camera be {o_2-x_2y_2z_2}, and the robot base coordinate system be {o_b-x_by_bz_b}, and acquire a first homogeneous transformation matrix of the first camera coordinate system in the second camera coordinate system and a second homogeneous transformation matrix of the second camera coordinate system in the robot base coordinate system;
detecting the 2D image to obtain the two-dimensional coordinates of each top-layer cargo based on the first camera coordinate system, and detecting the 3D image to obtain the depth coordinates and attitude information of each cargo in the 3D image based on the second camera coordinate system;
transforming the two-dimensional coordinates of each top-layer cargo based on the first camera coordinate system by applying the first homogeneous transformation matrix to obtain the two-dimensional coordinates of each top-layer cargo based on the second camera coordinate system;
sequencing each cargo in the 3D image based on the depth coordinate of the second camera coordinate system to obtain a depth coordinate sequence, determining the depth coordinate of each top-level cargo based on the second camera coordinate system according to the distance between adjacent depth coordinates in the depth coordinate sequence, and correspondingly forming the three-dimensional coordinate of each top-level cargo based on the second camera coordinate system by the two-dimensional coordinate of each top-level cargo based on the second camera coordinate system and the depth coordinate of each top-level cargo based on the second camera coordinate system;
determining the position information of each top-layer cargo in the second camera coordinate system according to each three-dimensional coordinate, and transforming the position information and the attitude information of each top-layer cargo in the second camera coordinate system by applying the second homogeneous transformation matrix to obtain the pose information of each top-layer cargo based on the robot base coordinate system.
8. The system of unstacking a stack of goods according to claim 7 wherein the processor is further configured to:
for each top-layer cargo, letting the position information in its pose information based on the robot base coordinate system be P_b and the attitude information be R_b, and establishing a motion equation;
the motion equation is expressed as
^bH_f · ^fH_ee = [R_b, P_b; 0, 1]
wherein ^bH_f is the homogeneous transformation matrix of the flange coordinate system of the robot in the robot base coordinate system, and ^fH_ee is the homogeneous transformation matrix of the unstacking tool coordinate system of the robot in the flange coordinate system;
planning a motion track of the robot arm corresponding to the flange coordinate system to move to each top-layer cargo by applying the motion equation;
sequencing the position information in the pose information of all the top-layer cargos based on the robot base coordinate system to obtain a position sequence, and taking the position sequence, ordered from small to large, as the unstacking sequence.
9. The system of unstacking a stack of goods of claim 8 wherein the processor is specifically configured to:
sequentially driving the robot arm to move to each top-layer cargo along each motion track according to the unstacking sequence, so that the unstacking tool in the unstacking tool coordinate system moves to each top-layer cargo along with the robot arm, and driving the unstacking tool to take out each top-layer cargo from the top layer of the cargo stack until all the top-layer cargos are taken out.
10. A system for unstacking a stack of goods according to any one of claims 6 to 9 further comprising a lift on which the 2D camera and the 3D camera are mounted, the lift being adapted to synchronously lower the 2D camera and the 3D camera to the same height as the top level of the stack of goods when the top level of the stack of goods is not the last level of the stack of goods.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811082406.3A CN109436820B (en) | 2018-09-17 | 2018-09-17 | Destacking method and destacking system for goods stack |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109436820A true CN109436820A (en) | 2019-03-08 |
CN109436820B CN109436820B (en) | 2024-04-16 |
Family
ID=65530527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811082406.3A Active CN109436820B (en) | 2018-09-17 | 2018-09-17 | Destacking method and destacking system for goods stack |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109436820B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11333770A (en) * | 1998-03-20 | 1999-12-07 | Kobe Steel Ltd | Loading position and attitude recognizing device |
CN1293752A (en) * | 1999-03-19 | 2001-05-02 | 松下电工株式会社 | Three-D object recognition method and pin picking system using the method |
CN104331894A (en) * | 2014-11-19 | 2015-02-04 | 山东省科学院自动化研究所 | Robot unstacking method based on binocular stereoscopic vision |
US9102055B1 (en) * | 2013-03-15 | 2015-08-11 | Industrial Perception, Inc. | Detection and reconstruction of an environment to facilitate robotic interaction with the environment |
CN105217324A (en) * | 2015-10-20 | 2016-01-06 | 上海影火智能科技有限公司 | Novel unstacking method and system
CN106276325A (en) * | 2016-08-31 | 2017-01-04 | 长沙长泰机器人有限公司 | Van automatic loading system
CN108313748A (en) * | 2018-04-18 | 2018-07-24 | 上海发那科机器人有限公司 | 3D vision carton unstacking system
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110422521B (en) * | 2019-07-17 | 2021-06-01 | 上海新时达机器人有限公司 | Robot side unstacking method and device for irregular random materials |
CN110422521A (en) * | 2019-07-17 | 2019-11-08 | 上海新时达机器人有限公司 | Robot-side unstacking method and device for irregular random materials |
CN110642025A (en) * | 2019-09-26 | 2020-01-03 | 华中科技大学 | Stacking and unstacking device for automatic transfer of box body structure |
CN112975943A (en) * | 2019-12-13 | 2021-06-18 | 广东弓叶科技有限公司 | Processing method and system for judging optimal grabbing height of robot clamping jaw |
CN112978392A (en) * | 2019-12-13 | 2021-06-18 | 上海佳万智能科技有限公司 | Method for disassembling paperboard stack |
CN112978392B (en) * | 2019-12-13 | 2023-04-21 | 上海佳万智能科技有限公司 | Paperboard stack disassembling method |
CN111754515A (en) * | 2019-12-17 | 2020-10-09 | 北京京东尚科信息技术有限公司 | Method and device for sequential gripping of stacked articles |
CN111754515B (en) * | 2019-12-17 | 2024-03-01 | 北京京东乾石科技有限公司 | Sequential gripping method and device for stacked articles |
CN112077843A (en) * | 2020-08-24 | 2020-12-15 | 北京配天技术有限公司 | Robot graphical stacking method, computer storage medium and robot |
CN112520431B (en) * | 2020-11-23 | 2024-12-20 | 配天机器人技术有限公司 | A palletizing verification method and related device for a palletizing robot |
CN112520431A (en) * | 2020-11-23 | 2021-03-19 | 配天机器人技术有限公司 | Stacking calibration method and related device for stacking robot |
US12246354B2 (en) | 2020-12-11 | 2025-03-11 | Intelligrated Headquarters, Llc | Methods, apparatuses, and systems for automatically performing sorting operations |
CN114682529A (en) * | 2020-12-11 | 2022-07-01 | 因特利格雷特总部有限责任公司 | Method, device and system for automatically performing sorting operations |
US11911801B2 (en) | 2020-12-11 | 2024-02-27 | Intelligrated Headquarters, Llc | Methods, apparatuses, and systems for automatically performing sorting operations |
TWI746333B (en) * | 2020-12-30 | 2021-11-11 | 所羅門股份有限公司 | Destacking method and destacking system |
CN112509024A (en) * | 2021-02-08 | 2021-03-16 | 杭州灵西机器人智能科技有限公司 | Lifting device based mixed unstacking control method, device, equipment and medium |
CN113688704A (en) * | 2021-08-13 | 2021-11-23 | 北京京东乾石科技有限公司 | Item sorting method, item sorting device, electronic device, and computer-readable medium |
CN114029250A (en) * | 2021-10-27 | 2022-02-11 | 因格(苏州)智能技术有限公司 | Article sorting method and system |
CN114029250B (en) * | 2021-10-27 | 2022-11-18 | 因格(苏州)智能技术有限公司 | Article sorting method and system |
CN114030843A (en) * | 2021-10-27 | 2022-02-11 | 因格(苏州)智能技术有限公司 | Article circulation method and system |
CN114012720A (en) * | 2021-10-27 | 2022-02-08 | 因格(苏州)智能技术有限公司 | Robot |
CN115159402A (en) * | 2022-06-17 | 2022-10-11 | 杭州海康机器人技术有限公司 | Goods putting and taking method and device, electronic equipment and machine readable storage medium |
CN117485929A (en) * | 2023-12-29 | 2024-02-02 | 中国电力工程顾问集团西南电力设计院有限公司 | Unmanned material stacking and taking control system and method based on intelligent control |
CN117485929B (en) * | 2023-12-29 | 2024-03-19 | 中国电力工程顾问集团西南电力设计院有限公司 | Unmanned material stacking and taking control system and method based on intelligent control |
Also Published As
Publication number | Publication date |
---|---|
CN109436820B (en) | 2024-04-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109436820A (en) | Unstacking method and unstacking system for a stack of goods | |
US12103792B2 (en) | Apparatus and method for building a pallet load | |
US9707682B1 (en) | Methods and systems for recognizing machine-readable information on three-dimensional objects | |
JP6305213B2 (en) | Extraction device and method | |
EP3169489B1 (en) | Real-time determination of object metrics for trajectory planning | |
US20240408746A1 (en) | Manipulating boxes using a zoned gripper | |
JP6461712B2 (en) | Cargo handling device and operation method thereof | |
CA3155138A1 (en) | Vision-assisted robotized depalletizer | |
CN109081026B (en) | Robot unstacking system and method based on laser ranging radar positioning guidance | |
JP2019509559A (en) | Box location, separation, and picking using a sensor-guided robot | |
Nakamoto et al. | High-speed and compact depalletizing robot capable of handling packages stacked complicatedly | |
CN115582827A (en) | A Grasping Method of Unloading Robot Based on 2D and 3D Vision Positioning | |
CN112173518A (en) | Control method and automatic guided vehicle | |
JP2023115274A (en) | Extracting device | |
CN112173519A (en) | Control method and automatic guided vehicle | |
CN113307042B (en) | Object unstacking method and device based on conveyor belt, computing equipment and storage medium | |
US12269164B2 (en) | Method and computing system for performing robot motion planning and repository detection | |
CN113800270A (en) | Robot control method and system for logistics unstacking | |
JP6600026B2 (en) | Extraction device and method | |
US12202145B2 (en) | Robotic system with object update mechanism and methods for operating the same | |
US20230027984A1 (en) | Robotic system with depth-based processing mechanism and methods for operating the same | |
CN119731098A (en) | Apparatus and method for automatic pallet builder calibration | |
US20240391101A1 (en) | Workspace adaptive robotic system to load or unload pallets or other receptacles | |
CN219708017U (en) | Cargo handling system | |
CN204487582U (en) | Based on the automatic de-stacking system of industrial robot that 3D machine vision guides |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||