CN115317136A - Control method and control device for surgical robot and robot

Info

Publication number
CN115317136A
Authority
CN
China
Prior art keywords
control
controlled instrument
controlled
current
input device
Prior art date
Legal status
Pending
Application number
CN202211014750.5A
Other languages
Chinese (zh)
Inventor
杨辉
贺绍台
刘娟娟
王�锋
桂凯
Current Assignee
Tuodao Medical Technology Co Ltd
Original Assignee
Tuodao Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Tuodao Medical Technology Co Ltd filed Critical Tuodao Medical Technology Co Ltd
Priority to CN202211014750.5A
Publication of CN115317136A
Current legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/35 Surgical robots for telesurgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 34/74 Manipulators with manual electric input means
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/254 User interfaces for surgical systems being adapted depending on the stage of the surgical procedure

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Endoscopes (AREA)

Abstract

The embodiments of the present application provide a control method and a control device for a surgical robot, and a surgical robot. By monitoring in real time the pose states of a first controlled instrument and a second controlled instrument arranged on the surgical robot, the method determines whether the endoscope has flipped relative to an initial time. After the endoscope flips and the picture displayed on the display screen flips accordingly, the control relationship between each control device and the corresponding controlled instrument is switched in time; that is, the master-slave mapping between the control devices and the controlled instruments is exchanged as the relative positions of the controlled instruments in the endoscope's field of view are exchanged, which ensures hand-eye coordination consistency and operation intuitiveness for the doctor during surgery.

Description

Control method and control device of surgical robot and robot
Technical Field
The present application relates to the technical field of medical devices, and in particular, to a control method and a control device for a surgical robot, and a surgical robot.
Background
A surgical robot is an instrument capable of assisting a doctor in performing a surgical operation, and the surgical robot is generally in a master-slave teleoperation type structure, and the doctor controls the movement of controlled instruments such as surgical instruments (including harmonic scissors, a traction hook, a medicine knife, a flusher, a probe, surgical scissors and the like) and an endoscope by operating control devices (such as a master input device and an auxiliary input device) to perform a desired surgical action.
In a traditional surgical robot, a control device and a controlled instrument meeting intuitive operation conditions are generally in a control relationship, and during surgery, a doctor observes a picture displayed on a display screen of the surgical robot and operates the controlled instrument through the control device to realize surgery. For example, the doctor may control the left controlled instrument to work through the left control device (the position of the left controlled instrument in the screen satisfies the left-hand left-eye coordination and consistency operation when the doctor operates the left control device), and the doctor may control the right controlled instrument to work through the right control device (the position of the right controlled instrument in the screen satisfies the right-hand right-eye coordination and consistency operation when the doctor operates the right control device).
However, in the surgical procedure, after the endoscope is turned over, the image displayed on the display screen is also changed, and then the left controlled instrument is displayed on the right side of the image, and the right controlled instrument is displayed on the left side of the image, so that the control device no longer satisfies the intuitive operation condition for controlling the controlled instrument, thereby affecting the safety and smoothness of the surgical operation.
Disclosure of Invention
The embodiments of the present application provide a control method and a control device for a surgical robot, and a surgical robot, which can ensure that the control devices still satisfy the intuitive operation condition for controlling the controlled instruments after the endoscope flips, that is, after the picture displayed on the display screen has changed, thereby ensuring the safety and smoothness of the surgical operation.
A first aspect of the embodiments of the present application provides a control method for a surgical robot, where the surgical robot includes a first control device, a first controlled instrument, a second control device, a second controlled instrument, and an endoscope; the method comprises the following steps:
acquiring initial poses of the first controlled instrument and the second controlled instrument at an initial moment;
acquiring current poses of the first controlled instrument and the second controlled instrument at the current moment;
determining whether the endoscope is flipped at the current time relative to the initial time based on the initial pose and the current pose;
if the endoscope has flipped, acquiring a current control relationship of the surgical robot;
if the current control relationship is a first control relationship, switching it to a second control relationship, and if it is the second control relationship, switching it to the first control relationship; the first control relationship is that the first control device controls the first controlled instrument to work and the second control device controls the second controlled instrument to work; the second control relationship is that the first control device controls the second controlled instrument to work and the second control device controls the first controlled instrument to work.
A second aspect of the embodiments of the present application provides a control apparatus for a surgical robot, the control apparatus being applied to the robot, the surgical robot including a first control device, a first controlled instrument, a second control device, a second controlled instrument, and an endoscope, the control apparatus including:
the first acquisition module is used for acquiring the initial poses of the first controlled instrument and the second controlled instrument at the initial moment;
the second acquisition module is used for acquiring the current poses of the first controlled instrument and the second controlled instrument at the current moment;
a determination module configured to determine whether the endoscope is flipped at the current time relative to the initial time based on the initial pose and the current pose;
the third acquisition module is used for acquiring a current control relationship of the surgical robot after it is determined that the endoscope has flipped;
the switching module is used for switching the current control relationship to the second control relationship when the current control relationship is a first control relationship, and to the first control relationship when it is a second control relationship; the first control relationship is that the first control device controls the first controlled instrument to work and the second control device controls the second controlled instrument to work; the second control relationship is that the first control device controls the second controlled instrument to work and the second control device controls the first controlled instrument to work.
A third aspect of embodiments of the present application provides a surgical robot, including: a processor and a memory for storing processor-executable instructions;
the processor is configured to read the executable instructions from the memory and execute the instructions to implement the control method according to the first aspect of the present application.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the control method according to the first aspect.
A fifth aspect of embodiments of the present application provides a computer program product, which includes a computer program, and is characterized in that the computer program, when executed by a processor, implements the control method according to the first aspect.
The technical scheme provided by the embodiment of the application can at least achieve the following beneficial effects:
according to the control method, the control device and the surgical robot of the surgical robot provided by the embodiment of the application, whether the endoscope is turned over relative to an initial time is judged by monitoring pose states of a first controlled instrument and a second controlled instrument arranged on the surgical robot, so that after the endoscope is turned over and a corresponding picture displayed on a display screen is turned over, a control relation between the control device and the corresponding controlled instrument is timely switched.
Drawings
FIG. 1 is an environmental diagram illustrating a method for controlling a surgical robot according to an exemplary embodiment of the present application;
FIG. 2 is a flow chart illustrating a method of controlling a surgical robot according to an exemplary embodiment of the present application;
FIG. 3a is an image displayed on a display screen at an initial time according to an exemplary embodiment of the present application;
FIG. 3b is an image displayed on a display screen at a current time shown in an exemplary embodiment of the present application;
FIG. 3c is an image of the corresponding image of FIG. 3b after reconstruction of the lens coordinate system;
FIG. 4 is a flow chart illustrating yet another method of controlling a surgical robot in accordance with an exemplary embodiment of the present application;
FIG. 5 is a flow chart illustrating a further method of controlling a surgical robot in accordance with an exemplary embodiment of the present application;
FIG. 6 is a flow chart illustrating yet another method of controlling a surgical robot in accordance with an exemplary embodiment of the present application;
FIG. 7 is a flow chart illustrating a further method of controlling a surgical robot according to an exemplary embodiment of the present application;
FIG. 8 is a schematic illustration of a control device for a surgical robot shown in an exemplary embodiment of the present application;
fig. 9 is a schematic view of an internal structure of a surgical robot according to an exemplary embodiment of the present application.
Reference numerals:
10. a first surgical cart; 20. a master control device; 30. a second surgical cart;
11. a first control device; 12. a second control device; 13. a controlled instrument; 14. a display screen; 15. a processor;
111. a first primary input device; 112. a first auxiliary input device; 121. a second primary input device; 122. a second auxiliary input device; 131. a first controlled instrument; 132. a second controlled instrument; 133. an endoscope;
1312. a first towing hook; 1322. a second towing hook.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present application. The word "if," as used herein, may be interpreted as "when" or "upon" or "in response to determining," depending on the context.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. Unless otherwise defined, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. The use of "first," "second," and similar terms in the description and in the claims does not indicate any order, quantity, or importance, but rather is used to distinguish one element from another. Likewise, the terms "a" or "an" and the like do not denote a limitation of quantity, but rather denote the presence of at least one. "Plurality" or "a number" means two or more. Unless otherwise indicated, terms such as "front", "rear", "lower" and/or "upper" are used for convenience of description only and are not limited to one position or one spatial orientation. The word "comprising" or "comprises", and the like, means that the element or item preceding the word encompasses the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect.
Embodiments of the disclosure may be implemented in electronic devices such as terminal devices, computer systems, servers, etc., which are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known terminal devices, computing systems, environments, and/or configurations that may be suitable for use with electronic devices, such as terminal devices, computer systems, servers, and the like, include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network pcs, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
Electronic devices such as terminal devices, computer systems, servers, etc. may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
A surgical robot is an instrument capable of assisting a surgeon in performing a surgical operation, and generally has a master-slave teleoperation structure in which the surgeon controls the movement of a controlled instrument 13, such as a surgical instrument (which may include harmonic scissors, a traction hook, a drug knife, a flusher, a probe, surgical scissors, etc.) or an endoscope 133, by operating a master input device to complete a desired surgical action. Further, the surgeon controls the surgical instrument to perform additional auxiliary functions, such as monopolar/bipolar energy supply, suction, cutting or stapling, by simultaneously operating the primary input device and the corresponding auxiliary input device (e.g., foot pedal, finger clutch, photosensor, etc.).
In general, in order to ensure that the doctor can operate the controlled instrument 13 in a manner of coordinating hands and eyes during the operation to improve the safety and smoothness of the operation, the main input device and the auxiliary input device are bound to be used as control devices and establish a control relationship with the controlled instrument 13 meeting intuitive operation conditions.
In practice, there may be a plurality of controlled instruments 13; for example, the controlled instruments 13 include a left controlled instrument 13 and a right controlled instrument 13, and the control devices include a left control device and a right control device. During surgery, a doctor performs the operation by observing the picture displayed on the display screen 14 of the surgical robot and operating the controlled instruments 13 through the control devices. For example, the doctor can control the left controlled instrument 13 to move through the left control device (the left controlled instrument 13 is located on the left side of the picture displayed on the display screen 14), so that the position of the left controlled instrument 13 in the picture satisfies the coordinated, consistent left-hand, left-eye operation when the doctor operates the left control device, and the doctor can control the right controlled instrument 13 to work through the right control device (the right controlled instrument 13 is located on the right side of the picture displayed on the display screen 14).
The picture displayed on the display screen 14 is the image captured by the endoscope 133 on the surgical robot. As the endoscope 133 moves during the operation, the picture displayed on the display screen 14 changes correspondingly. After the endoscope 133 flips, the displayed picture changes, and the change may be that the positions of the left controlled instrument 13 and the right controlled instrument 13 are exchanged, for example the left controlled instrument 13 is displayed on the right side of the picture and the right controlled instrument 13 on the left side. In that case, the position of the right controlled instrument 13 no longer satisfies the coordinated left-hand, left-eye operation when the doctor operates the left control device, and the position of the left controlled instrument 13 no longer satisfies the coordinated right-hand, right-eye operation when the doctor operates the right control device. The control devices therefore no longer satisfy the intuitive operation condition for controlling the controlled instruments 13, which affects the safety and smoothness of the operation.
In view of this, an embodiment of the present application provides a control method for a surgical robot. By monitoring in real time the pose states of a first controlled instrument 13 and a second controlled instrument 13 arranged on the surgical robot, the method determines whether the endoscope 133 has flipped relative to an initial time, so that the control relationship between the control devices and the controlled instruments 13 can be adjusted in time after the endoscope 133 flips. In other words, the master-slave mapping between the control devices and the controlled instruments 13 is exchanged as the relative positions of the controlled instruments 13 in the field of view of the endoscope 133 are exchanged, thereby ensuring hand-eye coordination consistency and operation intuitiveness for the doctor during the operation.
Referring to fig. 1, an application scenario of the embodiment of the present application is described as follows:
the present embodiment is applied to a surgical robot including a first surgical cart 10, a master control device 20, and a second surgical cart 30. The first operation cart 10 is provided with a first control device, a second control device and a display screen 14; the general control device 20 is provided with a plurality of processors 15, and the number of the processors 15 can be multiple; the second surgical cart 30 is provided with a plurality of controlled instruments 13, wherein the plurality of controlled instruments 13 may include a first controlled instrument 131, a second controlled instrument 132 and an endoscope 133. The first surgical cart 10 may be communicatively connected to the second surgical cart 30 via the overall control device 20, and the first surgical cart 10 and the second surgical cart 30 may each be wirelessly connected or wired to the overall control device 20 to physically enable control relationships for controlling the controlled instruments 13 on the second surgical cart 30 via the control device on the first surgical cart 10. The first surgical cart 10, the general control device 20, and the second surgical cart 30 may further include a processor 15, a memory, an input device, an output device, a display, a sensor, and other devices, which are not limited herein.
Wherein the first control device 11 may be a device comprising a first main input device 111 and a first auxiliary input device 112, and the second control device 12 may be a device comprising a second main input device 121 and a second auxiliary input device 122. The primary input devices including the first primary input device 111 and the second primary input device 121 may be, for example, control knobs provided on the first surgical cart 10, and the auxiliary input devices including the first auxiliary input device 112 and the second auxiliary input device 122 may be, for example, foot pedals, finger clutches, photosensors, etc., provided on the first surgical cart 10. When the user operates the first and second master input devices 111 and 121 by both hands, the first master input device 111 may control the first controlled instrument 131 or the second controlled instrument 132 to move to a designated position, and accordingly, the second master input device 121 may control the second controlled instrument 132 or the first controlled instrument 131 to move to a designated position. That is, either the first controlled instrument 131 is controlled to move by the first master input device 111 and the second controlled instrument 132 is controlled to move by the second master input device 121; either the second controlled instrument 132 is controlled to move by the first master input device 111 and the first controlled instrument 131 is controlled to move by the second master input device 121. Likewise, when the user operates the first auxiliary input device 112 and the second auxiliary input device 122 by foot or hand, the auxiliary function of the first controlled instrument 131 or the second controlled instrument 132 may be activated to cause the first controlled instrument 131 or the second controlled instrument 132 to perform the corresponding auxiliary operation. That is, either the first controlled instrument 131 is controlled by the first auxiliary input device 112 to perform the corresponding auxiliary operation, and the second controlled instrument 132 is controlled by the second auxiliary input device 122 to perform the corresponding auxiliary operation; or the second controlled instrument 132 is controlled to perform a corresponding auxiliary operation by the first auxiliary input device 112 and the first controlled instrument 131 is controlled to perform a corresponding auxiliary operation by the second auxiliary input device 122.
Specifically, based on the control relationship established in the above physical layer, signals may be sent from the first control device 11 and the second control device 12 on the first surgical cart 10 to the processor 15 of the general control device 20, and then the signals are analyzed by the processor 15 and corresponding commands are generated, so as to send the commands to the interface component connected to the corresponding controlled instrument 13 in the second surgical cart 30 to operate the corresponding controlled instrument 13.
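For illustration only, the following sketch models the signal flow just described: a control device sends a signal to a central processor, which parses it into a command and forwards it to the interface component of the instrument currently bound to that device. All names (ControlDevice, InstrumentInterface, CentralProcessor, dispatch) are hypothetical and are not defined in this application; the sketch assumes a simple dictionary-based binding between control devices and instrument interfaces.

```python
from dataclasses import dataclass

@dataclass
class ControlDevice:
    """A bound pair of a main input (e.g. a control handle) and an auxiliary input (e.g. a pedal)."""
    name: str

@dataclass
class InstrumentInterface:
    """Interface component on the second surgical cart that drives one controlled instrument."""
    name: str

    def execute(self, command: dict) -> None:
        # In the real system this would drive the instrument's actuators.
        print(f"{self.name} executes {command}")

class CentralProcessor:
    """Parses signals from the control devices and forwards commands to the bound instruments."""

    def __init__(self, mapping: dict):
        # mapping: control-device name -> interface of the instrument it currently controls
        self.mapping = mapping

    def dispatch(self, device: ControlDevice, signal: dict) -> None:
        command = {"type": signal.get("type", "move"), "value": signal.get("value")}
        self.mapping[device.name].execute(command)

# Example wiring: first/second control device bound to first/second controlled instrument.
first_device = ControlDevice("first_control_device")
first_instrument = InstrumentInterface("first_controlled_instrument")
second_instrument = InstrumentInterface("second_controlled_instrument")
processor = CentralProcessor({"first_control_device": first_instrument,
                              "second_control_device": second_instrument})
processor.dispatch(first_device, {"type": "move", "value": (0.01, 0.0, 0.0)})
```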
The first controlled device 131 and the second controlled device 132 are, for example, harmonic scissors, a traction hook, a medical knife, a flusher, a probe and surgical scissors, an endoscope 133, and the like. First and second controlled instruments 131, 132 may be used for energy supply, clamping, stapling, cutting, suction, irrigation, laser energy, and endoscope 133 may be used to provide real-time visualization of a remote surgical site, among other operations.
At a certain moment, the first main input device 111 of the first control device 11 may be used to control a plurality of first controlled instruments 131 (the positional relationship between the first main input device 111 and the plurality of first controlled instruments 131 on the display screen 14 satisfies the doctor's coordinated hand-eye operation during the operation), and the first main input device 111 can then switch among the plurality of first controlled instruments 131 by operating the first auxiliary input device 112 that has a binding relationship with the first main input device 111 (the first auxiliary input device 112 and the first main input device 111 are located on the same side in their actual positions).
Similarly, the second main input device 121 of the second control device 12 may be used to control a plurality of second controlled instruments 132 (the positions of the second main input device 121 and the plurality of second controlled instruments 132 on the display screen 14 are satisfied with the consistent hand-eye coordination operation performed by the doctor during the operation), and then the second main input device 121 may be used to control the switching of the plurality of second controlled instruments 132 (the second auxiliary input device 122 and the second main input device 121 are located on the same side of the actual position) by operating the second auxiliary input device 122 having a binding relationship with the second main input device 121.
Of course, after the frame displayed on the display screen 14 changes, the first main input device 111 and the first auxiliary input device 112 may also establish a control relationship with the second controlled apparatus 132 to cooperatively control the second controlled apparatus 132 to perform the surgical operation through the first main input device 111 and the first auxiliary input device 112, and the second main input device 121 and the second auxiliary input device 122 may also establish a control relationship with the first controlled apparatus 131 to cooperatively control the first controlled apparatus 131 to perform the surgical operation through the second main input device 121 and the second auxiliary input device 122. The control relationship between the control device and the controlled instrument 13 can be switched according to the requirements in the operation process.
Illustratively, the first main input device 111 and the second main input device 121 are, for example, a first control handle and a second control handle provided on the first surgical cart 10 (the first control handle is provided on the left-hand side of the surgeon, and the second control handle is provided on the right-hand side of the surgeon), and the first auxiliary input device 112 and the second auxiliary input device 122 may be, for example, a first pedal and a second pedal provided on the first surgical cart 10 (the first pedal is provided on the left side of the set of auxiliary input devices, the second pedal is provided on the right side of the set of auxiliary input devices, and the surgeon steps on the first pedal and the second pedal with the left foot and the right foot, respectively).
In addition, the second surgical cart 30 is also provided with an endoscope 133. The screen displayed on the display 14 is a screen in the field of view of the endoscope 133, and is also the field of view of the doctor. Therefore, during the operation of the doctor, the positions of the first controlled instrument 131 and the second controlled instrument 132 are determined by viewing the images captured by the endoscope 133 displayed on the display screen 14, the corresponding first control device 11 or second control device 12 is selectively operated to control the first controlled instrument 131 and the second controlled instrument 132 to move to the corresponding positions, and then the first auxiliary input device 112 and the second auxiliary input device 122 are selectively operated to control the first controlled instrument 131 and the second controlled instrument 132 to perform the corresponding auxiliary function operations. Therefore, the coordination of hands and eyes can be satisfied during the operation of a doctor, and the smoothness of the operation is improved.
When a surgeon performs a surgical operation by using the surgical robot of the embodiment of the present application, the surgeon can operate the first control handle on the left-hand side of the surgeon by the left hand while observing the display screen 14 (for example, by grasping the first control handle and moving the first control handle in various directions), so as to output a control signal to the processor 15 of the overall control device 20 through the first control handle, interpret the signal through the processor 15 and generate a corresponding instruction, so as to send the instruction to the control interface on the second surgical cart 30 for controlling the first controlled instrument 131, thereby controlling the first controlled instrument 131.
Similarly, the physician can operate the second control handle located on the right hand side of the physician by the right hand while viewing the display screen 14 (for example, by grasping the second control handle and moving the second control handle in various directions), so as to input the control signal to the processor 15 of the overall control device 20 through the second control handle, and the signal is interpreted by the processor 15 and generates a corresponding instruction, so as to send the instruction to the control interface of the second surgical cart 30 for controlling the second controlled instrument 132, thereby controlling the second controlled instrument 132.
In some examples, after the doctor moves the endoscope 133 to the target position by operating the first control handle and the second control handle, the first control handle and the second control handle can be switched at the same time to control the movement of one of the first controlled instrument 131 and the second controlled instrument 132 by stepping with the left foot (for example, on the first pedal or the second pedal on the left side shown in fig. 1), and that controlled instrument 13 is then moved to the target position.
The technical solutions of the embodiments of the present application are exemplarily described below with reference to the drawings.
Fig. 2 is a flowchart illustrating a control method of a surgical robot according to an exemplary embodiment of the present application. Referring to fig. 2, the method comprises the following steps:
step S100, acquiring an initial pose of a first controlled instrument 131 or a second controlled instrument 132 at an initial moment;
wherein the initial pose of the first controlled instrument 131 represents the position and posture of the first controlled instrument 131 at the initial time, and the initial pose of the second controlled instrument 132 represents the position and posture of the second controlled instrument 132 at the initial time. The initial posture of the first controlled instrument 131 or the second controlled instrument 132 may be acquired by at least one or a plurality of posture detection devices provided on the second surgical cart 30. The pose detection device is, for example, an inertial measurement unit, a wheel type odometer, or the like.
After the initial pose of the first controlled instrument 131 or the second controlled instrument 132 is acquired by the pose detection device, the initial pose may be transmitted to the memory of the second surgical cart 30 and stored, may be transmitted to the general control device 20 through the output device of the second surgical cart 30, and may be further transmitted to the input device of the first surgical cart 10 through the general control device 20.
It should be noted here that, at the initial time, the endoscope 133 provided on the second surgical cart 30 may already have been moved to the target position, or may be in a stationary state without having been moved; this is not limited in the present application. If the endoscope 133 has been moved to the target position at the initial time, it may have been moved there by operating the first control device 11 and the second control device 12, or by operating a button provided on a scope-holding arm connected to the endoscope 133, which is not limited in the present application.
Then, the present embodiment may determine the time when first controlled instrument 131 and second controlled instrument 132 have not started to move as the initial time; the time when the first controlled instrument 131 and the second controlled instrument 132 move to the target position and then stand still to wait for the start of the operation may be determined as the initial time; the initial time may also be determined according to the endoscope 133 or other controlled device 13, which is not limited in this application. The initial poses of first controlled instrument 131 and second controlled instrument 132 may be poses of first controlled instrument 131 and second controlled instrument 132 acquired by the pose detection apparatus at the initial time.
Step S200, acquiring the current poses of the first controlled instrument 131 and the second controlled instrument 132 at the current moment;
Here, the current poses of the first controlled instrument 131 and the second controlled instrument 132 are a relative concept. During the actual operation, the poses of the first controlled instrument 131 and the second controlled instrument 132 at the current time may not really have changed relative to the initial time and may be the same poses; however, if the endoscope 133 has flipped at the current time relative to the initial time, the display of the first controlled instrument 131 and the second controlled instrument 132 on the display screen 14 (i.e., the picture in the doctor's field of view) at the current time changes from that at the initial time, so that whether the endoscope 133 has flipped can be determined from the current pose and the initial pose reflected in the displayed picture.
Fig. 3a is a schematic view of the image captured by the endoscope 133 at the initial time (i.e., the picture displayed on the display screen 14 at the initial time) according to an exemplary embodiment of the present application. As can be seen from the picture, the first surgical instrument included in the first controlled instrument 131, for example a first towing hook 1312, is located on the left side of the picture (i.e., the doctor's field of view), and the second surgical instrument included in the second controlled instrument 132, for example a second towing hook 1322, is located on the right side of the picture. Therefore, having the first control device 11 control the first towing hook 1312 and the second control device 12 control the second towing hook 1322 satisfies the doctor's coordinated hand-eye operation.
When the endoscope 133 is turned over, the images displayed on the display screen 14 may change, for example, the positions of the first controlled instrument 131 and the second controlled instrument 132 in the display image of the display screen 14 (i.e., under the field of view of the doctor) may change.
Fig. 3b is a schematic diagram of the image acquired by the endoscope 133 at the current time (i.e., the picture displayed on the display screen 14 at the current time), and fig. 3c is a schematic diagram of the image corresponding to fig. 3b after the lens coordinate system is reconstructed. As can be seen from the picture, the first towing hook 1312 is now located on the right side of the picture (i.e., the doctor's field of view), and the second towing hook 1322 is located on the left side. If the first control device 11 continues to control the first towing hook 1312 and the second control device 12 continues to control the second towing hook 1322, the doctor's coordinated hand-eye operation will no longer be satisfied.
Therefore, whether the endoscope 133 has flipped at the current time relative to the initial time can be determined from the poses of the first controlled instrument 131 and the second controlled instrument 132 at the initial time and at the current time, so as to provide an accurate basis for deciding whether to switch the control relationship between the control devices and the controlled instruments 13.
Specifically, the current poses of first controlled instrument 131 and second controlled instrument 132 can be obtained by:
in one embodiment, as shown in fig. 4, fig. 4 is an alternative embodiment of a method for determining the current poses of a first controlled instrument 131 and a second controlled instrument 132 according to an exemplary embodiment of the present application, and includes the following specific steps:
step S201, acquiring a first pose of the endoscope 133 at an initial time and a second pose of the endoscope 133 at a current time;
the first pose of the endoscope 133 at the initial time and the second pose of the endoscope 133 at the current time can be acquired by the same pose detection device, so that the first pose and the second pose of the endoscope 133 can be determined based on the same coordinate system, and the first pose and the second pose of the endoscope 133 can be rapidly compared after the first pose and the second pose of the endoscope 133 are obtained, so as to obtain a comparison result, thereby achieving the purpose of improving the efficiency of determining pose change relationship, and further improving the efficiency of performing surgical operation by a doctor through a surgical robot.
Step S202, determining a pose change relation of the endoscope 133 at the current moment based on the initial moment according to the first pose and the second pose;
in one embodiment, the pose change relationship may be calculated by: first, the endoscope 133 is connected to the scope holding arm, which has n joints, and the coordinate transformation of adjacent joints satisfies the common homogeneous transformation matrix a.
The joint positions of the joints before the endoscope 133 moves are θ_b1, θ_b2, …, θ_b(n-1), θ_bn, so the first pose of the endoscope 133 can be described as:
T_b = A_1(θ_b1) · A_2(θ_b2) · … · A_n(θ_bn)
The joint positions of the joints after the endoscope 133 moves are θ_a1, θ_a2, …, θ_a(n-1), θ_an, so the second pose of the endoscope 133 can be described as:
T_a = A_1(θ_a1) · A_2(θ_a2) · … · A_n(θ_an)
The following relationship holds:
T_b · R_endo = T_a
R_endo = T_b^(-1) · T_a
where R_endo represents the pose change relationship, and the base coordinate system does not change with the translation or rotation of the endoscope 133.
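As a numerical illustration of the relations above, the sketch below composes per-joint homogeneous transforms into T_b and T_a and recovers R_endo = T_b^(-1) · T_a. The DH-style joint parameterization and the sample joint values are assumptions made for the example; they are not the actual kinematics of the scope-holding arm.

```python
import numpy as np

def joint_transform(theta: float, d: float, a: float, alpha: float) -> np.ndarray:
    """DH-style homogeneous transform A_i(theta) between adjacent joints."""
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(thetas, dh_params) -> np.ndarray:
    """T = A_1(theta_1) @ A_2(theta_2) @ ... @ A_n(theta_n)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(thetas, dh_params):
        T = T @ joint_transform(theta, d, a, alpha)
    return T

# Placeholder scope-holding arm with n = 4 joints (parameters chosen only for the example).
dh = [(0.1, 0.0, np.pi / 2), (0.0, 0.2, 0.0), (0.0, 0.2, 0.0), (0.05, 0.0, 0.0)]
theta_before = [0.0, 0.3, -0.2, 0.1]            # joint positions before the endoscope moves
theta_after = [np.pi, 0.3, -0.2, 0.1 + np.pi]   # joint positions after the endoscope moves

T_b = forward_kinematics(theta_before, dh)      # first pose of the endoscope
T_a = forward_kinematics(theta_after, dh)       # second pose of the endoscope
R_endo = np.linalg.inv(T_b) @ T_a               # pose change relationship: T_b @ R_endo = T_a
```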
Step S203, determining the current poses of the first controlled instrument 131 and the second controlled instrument 132 based on a reference coordinate system constructed from the first pose of the endoscope 133 and on the pose change relationship.
The reference coordinate system may be constructed according to the initial pose of the endoscope 133, that is, the position and posture of the endoscope 133 at the initial time. For example, the reference coordinate system may be obtained by taking a horizontal axis and a vertical axis in the plane where the mirror surface of the endoscope 133 lies in its initial posture at the initial time, and constructing a third axis along the direction perpendicular to that plane. The reference coordinate system may also be constructed from a world coordinate system, from a coordinate system customized for the robot, from a hanging scaffold connected to the scope-holding arm of the endoscope 133, etc., which is not limited herein. The reference coordinate system does not change with the translation or rotation of the endoscope 133 and can provide a reliable basis for subsequently judging whether the endoscope 133 has flipped. Referring to FIGS. 3a to 3c, the coordinate system (o0x0y0) or the coordinate system (oxy) may be used as the reference coordinate system.
The current poses of the first controlled instrument 131 and the second controlled instrument 132 may be obtained by transforming with the pose change relationship in the reference coordinate system, or by combining the pose change relationship with a unit vector or a vector sum expressed in the reference coordinate system, which is not limited herein.
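The sketch below shows one possible construction of such a reference coordinate system from the endoscope's first pose, with two axes spanning the mirror-surface plane and a third axis perpendicular to it. Which columns of the pose matrix are taken to span the mirror plane is an assumption for illustration, and the function name reference_frame_from_pose is hypothetical.

```python
import numpy as np

def reference_frame_from_pose(T_first: np.ndarray) -> np.ndarray:
    """Build a 3x3 reference frame [x, y, z] from the endoscope's initial pose T_first.

    Assumes the endoscope mirror surface lies in the plane spanned by the first two
    columns of the rotation part of T_first; the third axis is their cross product.
    The frame is fixed at the initial time and does not follow later endoscope motion.
    """
    x_axis = T_first[:3, 0] / np.linalg.norm(T_first[:3, 0])  # horizontal axis in the mirror plane
    y_axis = T_first[:3, 1] / np.linalg.norm(T_first[:3, 1])  # vertical axis in the mirror plane
    z_axis = np.cross(x_axis, y_axis)                         # axis perpendicular to the mirror plane
    return np.stack([x_axis, y_axis, z_axis], axis=1)
```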
In some embodiments, as shown in fig. 5, fig. 5 is an alternative method embodiment for determining the current poses of first controlled instrument 131 and second controlled instrument 132 based on the reference coordinate system and the pose change relationship, provided by an exemplary embodiment of the present application, and includes the following steps:
step S601, determining a vector sum of the first controlled instrument 131 and the second controlled instrument 132 in the reference coordinate system at the initial time;
in some embodiments, as shown in fig. 6, fig. 6 is an alternative embodiment of a method for determining a vector sum according to an exemplary embodiment of the present application, and includes the following steps:
step S701, determining a first vector projected to a target plane by the first controlled device 131 and a second vector projected to the target plane by the second controlled device 132 at the initial moment, wherein the target plane is a plane where the mirror surface of the endoscope 133 is located;
the target plane is a plane where the mirror surface of the endoscope 133 is located, and the first controlled instrument 131, the second controlled instrument 132 and the endoscope 133 are all located in the same space in the process of performing the operation.
As described above, the initial poses of the first controlled instrument 131 and the second controlled instrument 132 include the position information and posture information of the first controlled instrument 131 and the position information and posture information of the second controlled instrument 132. The positions at which the first controlled instrument 131 and the second controlled instrument 132 project onto the target plane can be determined from their initial poses, yielding the position equations (i.e., straight-line equations) of the first controlled instrument 131 and the second controlled instrument 132; the first vector can then be determined from the position equation of the first controlled instrument 131, and likewise the second vector can be determined from the position equation of the second controlled instrument 132. In other words, three-dimensional coordinates are converted into two-dimensional coordinates, which facilitates the subsequent calculation.
Step S702, determining a vector sum of first controlled instrument 131 and second controlled instrument 132 in the reference coordinate system based on the first vector and the second vector.
Here, the vector sum of the first controlled instrument 131 and the second controlled instrument 132 in the reference coordinate system can be obtained by vector addition of the first vector and the second vector:
S = v_1 + v_2
It is also possible to first calculate the unit vectors of the first vector and the second vector, and then add these unit vectors to obtain the vector sum of the first controlled instrument 131 and the second controlled instrument 132 in the reference coordinate system:
S = v_1/|v_1| + v_2/|v_2|
Referring to FIGS. 3a to 3c, S is the vector sum (or unit-vector sum), v_1 is the first vector, and v_2 is the second vector. Whether the endoscope 133 has flipped can be determined from the initial poses and current poses of the first and second instruments, and equally from the unit vectors of the first controlled instrument 131 and the second controlled instrument 132, since the magnitudes of the vectors are irrelevant.
Step S602, determining the current poses of first controlled instrument 131 and second controlled instrument 132 based on the vector sum and pose change relationship.
For example, the pose change relationship may be multiplied by the vector sum to obtain the current poses of the first controlled instrument 131 and the second controlled instrument 132; the pose change relationship may also be multiplied by the unit vector to obtain the current poses of the first controlled instrument 131 and the second controlled instrument 132, which is not limited in the present application.
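Combining steps S701, S702 and S602, the following sketch projects each instrument's direction onto the mirror-surface plane at the initial time, normalizes and sums the projections, and applies the pose change relationship R_endo to obtain the vector sum at the current time. Representing each instrument by a single direction vector, and the helper names used here, are simplifying assumptions for illustration.

```python
import numpy as np

def unit(v: np.ndarray) -> np.ndarray:
    """Normalize a vector to unit length."""
    return v / np.linalg.norm(v)

def project_onto_plane(v, plane_normal) -> np.ndarray:
    """Project a 3D vector onto the plane whose normal is plane_normal."""
    n = unit(np.asarray(plane_normal, dtype=float))
    v = np.asarray(v, dtype=float)
    return v - np.dot(v, n) * n

def current_vector_sum(first_dir, second_dir, plane_normal, R_endo: np.ndarray) -> np.ndarray:
    """Unit-vector sum of the two instrument directions in the reference frame, rotated by R_endo."""
    v1 = unit(project_onto_plane(first_dir, plane_normal))   # first vector (step S701)
    v2 = unit(project_onto_plane(second_dir, plane_normal))  # second vector (step S701)
    s_initial = v1 + v2                                      # vector sum at the initial time (step S702)
    return R_endo[:3, :3] @ s_initial                        # vector sum at the current time (step S602)

# Example: instruments pointing toward the left/right of the view, mirror-plane normal along z,
# and a 180-degree roll of the endoscope about its optical axis.
R_endo = np.eye(4)
R_endo[:3, :3] = np.array([[-1.0,  0.0, 0.0],
                           [ 0.0, -1.0, 0.0],
                           [ 0.0,  0.0, 1.0]])
s_now = current_vector_sum([-1.0, 0.2, 0.0], [1.0, 0.2, 0.0], [0.0, 0.0, 1.0], R_endo)
print(s_now)  # the vertical component has reversed sign relative to the initial sum
```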
Step S300, based on the initial pose and the current pose, determines whether the endoscope 133 has been flipped at the current time relative to the initial time.
In this embodiment, whether the endoscope 133 has flipped at the current time relative to the initial time may be determined from the difference or change between the initial poses and the current poses of the first controlled instrument 131 and the second controlled instrument 132. Specifically, it may be determined that the endoscope 133 has flipped when the posture (i.e., angle) components of the initial pose and of the current pose differ by a preset value; whether the position components of the initial pose and the current pose change in the actual process is not limited in the present application.
Based on the initial poses and the current poses of the first controlled instrument 131 and the second controlled instrument 132 obtained through the above process, the initial pose and the current pose may be compared, the comparison result may be compared with a preset threshold, and whether the endoscope 133 has flipped at the current time relative to the initial time may be determined accordingly; alternatively, the initial pose and the current pose may be input into an analysis and comparison model, and whether the endoscope 133 has flipped at the current time relative to the initial time may be determined based on the output of the model, which is not limited herein.
In one embodiment, as shown in fig. 7, fig. 7 is an alternative method embodiment for determining that the endoscope 133 is flipped at the current time relative to the initial time provided by the embodiment of the present application, and the method embodiment includes the following steps:
step S801, acquiring a reference component of the attitude data in the initial pose in a reference coordinate system and a target component of the attitude data in the current pose in the reference coordinate system;
the target component and the reference component are components of the same coordinate axis, and are both components on the horizontal axis, for example. Based on the initial and current poses of first controlled instrument 131 and second controlled instrument 132 obtained as described above, the specific data of each component can be easily obtained, so that the efficiency of determining whether endoscope 133 is flipped can be improved.
In step S802, if the target component is opposite to the reference component, it is determined that the endoscope 133 is turned over at the current time with respect to the initial time.
After the target component and the reference component are obtained from the initial poses and the current poses of the first controlled instrument 131 and the second controlled instrument 132, it may be determined that the endoscope 133 has flipped at the current time relative to the initial time by comparing the directions of the target component and the reference component. Illustratively, the target component and the reference component are, for example, components on the vertical axis. If the target component is 1 and the reference component is -1, the comparison shows that the direction of the target component has reversed relative to the reference component, and it is therefore determined that the endoscope 133 has flipped at the current time relative to the initial time.
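A minimal sketch of the comparison in steps S801 and S802: take the same component of the pose data (here, by assumption, the vertical-axis component) at the initial time and at the current time, and report a flip when the two components have opposite signs. The function name and the choice of axis index are hypothetical.

```python
import numpy as np

def endoscope_flipped(initial_sum: np.ndarray, current_sum: np.ndarray, axis: int = 1) -> bool:
    """Return True if the chosen component has reversed direction relative to the initial time.

    initial_sum / current_sum: vector sums of the two instruments in the reference frame.
    axis: index of the reference-frame axis to compare (1 = vertical axis, an assumption here).
    """
    reference_component = initial_sum[axis]   # component of the pose data at the initial time
    target_component = current_sum[axis]      # component of the pose data at the current time
    return reference_component * target_component < 0  # opposite signs -> the endoscope has flipped

# Example matching the text: reference component -1, target component +1 -> flipped.
print(endoscope_flipped(np.array([0.0, -1.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # True
```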
Step S400, if the endoscope has flipped, acquiring the current control relationship of the surgical robot;
after the endoscope 133 is determined to be turned over based on the above process, it is also necessary to determine the current control relationship of the surgical robot to determine whether the control relationship needs to be switched, so that the doctor can control the controlled apparatus 13 to move or execute the corresponding auxiliary function operation by the control device while observing the display frame on the display screen 14, thereby enabling the coordination between the hands and the eyes of the doctor to be consistent.
The current control relationship of the surgical robot may then be, for example, the first control device 11 having a control relationship with the first controlled instrument 131 (i.e., the first controlled instrument 131 moves or performs a corresponding auxiliary function operation when the surgeon manipulates the first control device 11), and the second control device 12 having a control relationship with the second controlled instrument 132 (i.e., the second controlled instrument 132 moves or performs a corresponding auxiliary function operation when the surgeon manipulates the second control device 12). Of course, the current control relationship of the surgical robot may also be that the first control device 11 has a control relationship with the second controlled instrument 132, and the second control device 12 has a control relationship with the first controlled instrument 131, which may be specifically determined according to the current detection result, which is not limited herein. The embodiment of the present application is merely an example showing the current control relationship of the surgical robot, and is for convenience of the following description and is not particularly limited.
For example, at the moment immediately before the current moment, the doctor may operate the first control device 11 located on the doctor's left-hand side with the left hand while observing the display screen 14, so as to control the first controlled instrument 131, for example the first towing hook 1312 located on the left side of the picture displayed on the display screen 14, instructing the first towing hook 1312 to move to a target position or to perform a corresponding auxiliary function operation. Likewise, the doctor can operate the second control device 12 located on the doctor's right-hand side with the right hand while observing the display screen 14, so as to control the second controlled instrument 132, for example the second towing hook 1322 located on the right side of the picture, instructing the second towing hook 1322 to move to a target position or to perform a corresponding auxiliary function operation. The position of the first controlled instrument 131 on the left side of the picture satisfies the coordinated, consistent left-hand, left-eye operation when the doctor operates the first control device 11, and the position of the second controlled instrument 132 on the right side of the picture satisfies the coordinated, consistent right-hand, right-eye operation when the doctor operates the second control device 12. The control relationship of the surgical robot at that previous moment therefore allows the doctor to operate with consistent hand-eye coordination. However, if the endoscope 133 flips at the current time and the surgical robot keeps this control relationship, the doctor's coordinated, consistent hand-eye operation will no longer be satisfied.
Step S500: if the current control relationship is the first control relationship or the second control relationship, switching the current control relationship to the second control relationship or the first control relationship, respectively. The first control relationship is that the first control device 11 controls the first controlled instrument 131 to work and the second control device 12 controls the second controlled instrument 132 to work; the second control relationship is that the first control device 11 controls the second controlled instrument 132 to work and the second control device 12 controls the first controlled instrument 131 to work.
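As an illustration only, the switching described in Step S500 can be sketched in a few lines of Python. The names ControlRelationship, switch_control_relationship, device_to_instrument, and the device/instrument identifiers are hypothetical and are used solely to make the swap explicit; they are not part of the disclosed implementation.

from enum import Enum

class ControlRelationship(Enum):
    FIRST = 1   # control device 11 -> instrument 131, control device 12 -> instrument 132
    SECOND = 2  # control device 11 -> instrument 132, control device 12 -> instrument 131

def switch_control_relationship(current: ControlRelationship) -> ControlRelationship:
    # Swap the master-slave mapping once the endoscope is detected as turned over.
    return (ControlRelationship.SECOND
            if current is ControlRelationship.FIRST
            else ControlRelationship.FIRST)

def device_to_instrument(relationship: ControlRelationship) -> dict:
    # Device-to-instrument mapping implied by the relationship.
    if relationship is ControlRelationship.FIRST:
        return {"control_device_11": "instrument_131", "control_device_12": "instrument_132"}
    return {"control_device_11": "instrument_132", "control_device_12": "instrument_131"}

For example, if the current relationship is FIRST when a turnover is detected, switch_control_relationship returns SECOND, and device_to_instrument then yields the exchanged mapping used for subsequent master-slave control.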
Based on the obtained current control relationship of the surgical robot, it can then be judged whether this relationship still satisfies the doctor's hand-eye coordination when the surgical operation continues. If it does not, the control relationship needs to be switched; if it does, the current control relationship is maintained.
The first control relationship is, for example, that the first control device 11 controls the first controlled instrument 131 and the second control device 12 controls the second controlled instrument 132; of course, the first control relationship may also be that the first control device 11 controls the second controlled instrument 132 and the second control device 12 controls the first controlled instrument 131, which is not limited herein. Correspondingly, the second control relationship is, for example, that the first control device 11 controls the second controlled instrument 132 and the second control device 12 controls the first controlled instrument 131, or that the first control device 11 controls the first controlled instrument 131 and the second control device 12 controls the second controlled instrument 132.
For example, at the moment immediately before the current moment, the doctor may observe through the display screen 14 that the first controlled instrument 131 is located on the right side of the picture and the second controlled instrument 132 is located on the left side of the picture. At this time the control relationship of the surgical robot is that the first control device 11 controls the first controlled instrument 131 to work and the second control device 12 controls the second controlled instrument 132 to work. The doctor operates the first control device 11 on the doctor's left-hand side with the left hand while observing the display screen 14 to control the operation of the first controlled instrument 131, and operates the second control device 12 on the doctor's right-hand side with the right hand to control the operation of the second controlled instrument 132, so that the doctor can operate with hand-eye coordination.
However, if the endoscope 133 is turned over at the current moment, the position of the first controlled instrument 131 observed by the doctor through the display screen 14 may be switched to the left side of the picture and the position of the second controlled instrument 132 switched to the right side. If the doctor continues the operation under the existing control relationship, consistent hand-eye coordination can no longer be achieved, which hinders the smoothness of the operation and reduces surgical efficiency.
Therefore, it is necessary to switch the control relationship between the control device and the corresponding controlled instrument 13 to ensure that the surgeon can achieve consistent hand-eye coordination when performing the corresponding surgical operation.
After the endoscope 133 is turned over, the second controlled instrument 132, now on the left side of the picture, satisfies coordinated left-hand and left-eye operation when the doctor operates the first control device 11, and the first controlled instrument 131, now on the right side of the picture, satisfies coordinated right-hand and right-eye operation when the doctor operates the second control device 12. That is, the master-slave mapping between the control devices and the controlled instruments 13 can be exchanged along with the exchange of the relative positions of the controlled instruments 13 in the view of the endoscope 133, ensuring hand-eye coordination and operational intuitiveness during the doctor's operation. Therefore, if the current control relationship of the surgical robot is the first control relationship and a switch is required, the first control relationship is switched to the second control relationship; if the current control relationship is the second control relationship and a switch is required, the second control relationship is switched to the first control relationship.
It can be understood that the above switching is completed automatically during the surgical procedure; the doctor does not need to frequently switch the first control device 11 and the second control device 12 manually, or repeatedly step on a foot pedal to switch control, which increases the safety and smoothness of the surgical operation and also improves surgical efficiency.
In one embodiment, the first control device 11 includes a first main input device 111 and a first auxiliary input device 112, the second control device 12 includes a second main input device 121 and a second auxiliary input device 122, the first main input device 111 may control the movement of the first controlled instrument 131 or the second controlled instrument 132, the second main input device 121 may control the movement of the second controlled instrument 132 or the first controlled instrument 131, the first auxiliary input device 112 may activate an auxiliary function of the first controlled instrument 131 or the second controlled instrument 132, and the second auxiliary input device 122 may activate an auxiliary function of the second controlled instrument 132 or the first controlled instrument 131, the method includes:
if the turnover occurs and it is detected that the first main input device 111 or the second main input device 121 controls the first controlled instrument 131 and the second main input device 121 or the first main input device 111 controls the second controlled instrument 132, the control of the auxiliary function of the first controlled instrument 131 is selectively assigned to either the first auxiliary input device 112 or the second auxiliary input device 122, and the control of the auxiliary function of the second controlled instrument 132 is assigned to either the second auxiliary input device 122 or the first auxiliary input device 112.
As described above, the first control device 11 includes the first main input device 111 and the first auxiliary input device 112, and the second control device 12 includes the second main input device 121 and the second auxiliary input device 122. The first main input device 111 can control the first controlled instrument 131, or the second controlled instrument 132, to move to a corresponding position; correspondingly, the second main input device 121 can control the second controlled instrument 132, or the first controlled instrument 131, to move to a corresponding position. The first main input device 111 can be bound to the first auxiliary input device 112 to cooperatively control the first controlled instrument 131 or the second controlled instrument 132, and the second main input device 121 can be bound to the second auxiliary input device 122 to cooperatively control the second controlled instrument 132 or the first controlled instrument 131. Therefore, if the binding is switched from the first main input device 111 controlling the first controlled instrument 131 to the first main input device 111 controlling the second controlled instrument 132, the activation of the auxiliary function of the second controlled instrument 132 correspondingly needs to be switched from the second auxiliary input device 122 to the first auxiliary input device 112, so that the auxiliary function of the second controlled instrument 132 is activated through the first auxiliary input device 112 and the second controlled instrument 132 is controlled to perform the corresponding auxiliary function operation through the first auxiliary input device 112. The reason is that the auxiliary input devices may be disposed at the bottom of the first surgical cart 10, which is convenient for the doctor to operate with the feet or other parts of the lower limbs; switching the control relationship between the auxiliary input devices and the controlled instruments 13 therefore also preserves the consistency of the doctor's eye and foot operation, further improving the efficiency of the operation and ensuring the fluency of the surgical procedure.
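The re-binding of the auxiliary inputs can likewise be sketched as follows. This is a minimal illustration under the assumption that each auxiliary input is always paired with its own main input (111 with 112, 121 with 122); all identifiers are hypothetical and not taken from the disclosure.

def reassign_auxiliary_inputs(main_binding: dict) -> dict:
    # main_binding maps a main input ("main_111" / "main_121") to the instrument
    # it currently controls ("instrument_131" / "instrument_132"); the returned
    # dict gives the matching auxiliary-input binding.
    paired_aux = {"main_111": "aux_112", "main_121": "aux_122"}
    return {paired_aux[main_input]: instrument
            for main_input, instrument in main_binding.items()}

# After a turnover, main input 111 drives instrument 132, so auxiliary input 112
# now activates the auxiliary function of instrument 132.
aux_binding = reassign_auxiliary_inputs(
    {"main_111": "instrument_132", "main_121": "instrument_131"})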
It should be understood that although the steps in the flowcharts of the above embodiments are displayed sequentially as indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the execution of these steps is not strictly limited to the illustrated order, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts of the above embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; their execution order is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Based on the same inventive concept, an embodiment of the present application further provides a control apparatus of a surgical robot for implementing the above control method of a surgical robot. The implementation scheme for solving the problem provided by the apparatus is similar to that described for the method, so for specific limitations in one or more embodiments of the control apparatus of the surgical robot provided below, reference may be made to the limitations of the control method of the surgical robot described above, and details are not repeated here.
In one embodiment, as shown in fig. 8, there is provided a control apparatus 900 of a surgical robot, the control apparatus being applied to the surgical robot, the control apparatus including: a first obtaining module 901, a second obtaining module 902, a determining module 903, a third obtaining module 904 and a switching module 905;
a first obtaining module 901, configured to obtain initial poses of the first controlled instrument 131 and the second controlled instrument 132 at an initial time;
a second obtaining module 902, configured to obtain current poses of the first controlled instrument 131 and the second controlled instrument 132 at the current time;
a determining module 903, configured to determine whether the endoscope 133 is turned over at the current time relative to the initial time based on the initial pose and the current pose;
a third obtaining module 904, configured to obtain the current control relationship of the surgical robot after the endoscope 133 is turned over;
a switching module 905, configured to switch the current control relationship to the second control relationship or the first control relationship when the current control relationship is the first control relationship or the second control relationship; the first control relationship is that the first control device 11 controls the first controlled instrument 131 to work and the second control device 12 controls the second controlled instrument 132 to work; the second control relationship is that the first control device 11 controls the second controlled instrument 132 to work and the second control device 12 controls the first controlled instrument 131 to work.
In an embodiment, the determining module 903 is specifically configured to compare the initial pose with the current pose, determine a pose change relationship based on the comparison result, and determine, based on the pose change relationship, whether the endoscope 133 is flipped at the current time relative to the initial time.
In an embodiment, the second obtaining module 902 is further configured to obtain a first pose of the endoscope 133 at the initial time and a second pose of the endoscope 133 at the current time; determining a pose change relation of the endoscope 133 at the current time based on the initial time according to the first pose and the second pose; determining the current poses of the first controlled instrument 131 and the second controlled instrument 132 based on a reference coordinate system and the pose change relationship, the reference coordinate system being constructed based on the first pose of the endoscope 133.
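One possible reading of this step, sketched below with hypothetical names, treats each pose as a 4x4 homogeneous transform in a common base frame and assumes the controlled instruments remain roughly stationary in that base frame while the endoscope moves; under those assumptions, the pose change relationship is the relative transform between the first and second endoscope poses, and the instruments' current poses as seen from the endoscope follow by applying the inverse of that change to their reference-frame poses. This is an illustration, not the disclosed implementation.

import numpy as np

def pose_change(T_first: np.ndarray, T_second: np.ndarray) -> np.ndarray:
    # Relative transform of the endoscope at the current time with respect to
    # the initial time, expressed in the reference frame built from T_first.
    return np.linalg.inv(T_first) @ T_second

def current_instrument_pose(T_change: np.ndarray, T_instr_ref: np.ndarray) -> np.ndarray:
    # Re-express an instrument pose, known in the reference frame at the initial
    # time, in the endoscope frame at the current time.
    return np.linalg.inv(T_change) @ T_instr_ref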
In one embodiment, the second obtaining module 902 further comprises a determining unit (not shown in the figure),
a determining unit, configured to determine a vector sum of the first controlled instrument 131 and the second controlled instrument 132 in the reference coordinate system at the initial time; based on the vector sum and the pose change relationship, the current poses of the first controlled instrument 131 and the second controlled instrument 132 are determined.
In one embodiment, the determining unit is further configured to determine a first vector projected by the first controlled instrument 131 to a target plane at the initial time and a second vector projected by the second controlled instrument 132 to the target plane, where the target plane is a plane on which a mirror of the endoscope 133 is located; determining a vector sum of the first controlled instrument 131 and the second controlled instrument 132 in the reference coordinate system based on the first vector and the second vector.
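A short sketch of this projection-and-sum step follows; the mirror-plane normal is taken here as the z axis of the reference frame, which is an assumption made for illustration rather than a detail taken from the disclosure.

import numpy as np

def project_to_plane(v: np.ndarray, normal: np.ndarray) -> np.ndarray:
    # Remove the component of v along the (unit) plane normal.
    n = normal / np.linalg.norm(normal)
    return v - np.dot(v, n) * n

def instrument_vector_sum(p1: np.ndarray, p2: np.ndarray,
                          normal: np.ndarray = np.array([0.0, 0.0, 1.0])) -> np.ndarray:
    # Sum of the two instrument position vectors after projection onto the
    # plane of the endoscope mirror.
    return project_to_plane(p1, normal) + project_to_plane(p2, normal)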
In an embodiment, the determining module 903 is specifically configured to acquire a reference component of the pose data in the initial pose in the reference coordinate system and a target component of the pose data in the current pose in the reference coordinate system; if the target component and the reference component are in opposite directions, it is determined that the endoscope 133 is flipped over at the current time relative to the initial time.
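The flip test itself then reduces to a sign comparison, as in the following sketch; which component of the vector sum is compared (here the first axis of the reference frame) is an assumption made for illustration.

import numpy as np

def is_flipped(initial_sum: np.ndarray, current_sum: np.ndarray,
               axis: int = 0, eps: float = 1e-9) -> bool:
    # Opposite signs of the selected components indicate that the endoscope has
    # been turned over at the current time relative to the initial time.
    return float(initial_sum[axis]) * float(current_sum[axis]) < -eps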
In one embodiment, the control apparatus further comprises an assigning module (not shown in the figure),
and an assigning module, configured to, if the turnover occurs and it is detected that the first main input device 111 or the second main input device 121 controls the first controlled instrument 131 and the second main input device 121 or the first main input device 111 controls the second controlled instrument 132, selectively assign control of the auxiliary function of the first controlled instrument 131 to either the first auxiliary input device 112 or the second auxiliary input device 122 and assign control of the auxiliary function of the second controlled instrument 132 to either the second auxiliary input device 122 or the first auxiliary input device 112.
The respective modules in the control apparatus of the surgical robot described above may be implemented entirely or partially by software, hardware, or a combination thereof. Each of the above modules may be embedded in, or independent of, a processor of the computer device in the form of hardware, or stored in a memory of the computer device in the form of software, so that the processor can invoke and execute the operations corresponding to the respective modules.
In one embodiment, a surgical robot is provided, the internal structure of which may be as shown in fig. 9. The surgical robot includes a processor, a memory, and a network interface connected by a system bus. The processor of the surgical robot is configured to provide computing and control capabilities. The memory of the surgical robot includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program stored in the non-volatile storage medium. The database of the surgical robot is used to store the poses of the controlled instruments. The network interface of the surgical robot is used to communicate with an external terminal through a network connection. The computer program, when executed by the processor, implements a control method of a surgical robot.
Those skilled in the art will appreciate that the structure shown in fig. 9 is a block diagram of only a portion of the structure related to the solution of the present application and does not constitute a limitation on the surgical robot to which the solution is applied; a particular surgical robot may include more or fewer components than those shown, or combine certain components, or have a different arrangement of components.
In one embodiment, there is provided a surgical robot comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing any of the steps of the control method of the surgical robot when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, realizes any of the steps of the above-mentioned control method of a surgical robot.
In one embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, carries out any of the steps of the above-described method of controlling a surgical robot.
It can be understood that, on the basis of the several embodiments provided by the present application, a person skilled in the art may combine, split, or recombine the embodiments to obtain other embodiments, none of which depart from the scope of the present application.
The above further describes in detail the objects, technical solutions, and advantages of the embodiments of the present application. It should be understood that the above are only specific embodiments of the present application and are not intended to limit the scope of the embodiments of the present application; any modification, equivalent substitution, improvement, or the like made on the basis of the technical solutions of the embodiments of the present application shall fall within the scope of the embodiments of the present application.

Claims (10)

1. A control method of a surgical robot is characterized in that the surgical robot comprises a first control device, a first controlled instrument, a second control device, a second controlled instrument and an endoscope; the method comprises the following steps:
acquiring initial poses of the first controlled instrument and the second controlled instrument at an initial moment;
acquiring current poses of the first controlled instrument and the second controlled instrument at the current moment;
determining whether the endoscope is flipped at the current time relative to the initial time based on the initial pose and the current pose;
if the endoscope is turned over, acquiring a current control relationship of the surgical robot;
if the current control relationship is a first control relationship or a second control relationship, switching the current control relationship to the second control relationship or the first control relationship; wherein the first control relationship is that the first control device controls the first controlled instrument to work and the second control device controls the second controlled instrument to work, and the second control relationship is that the first control device controls the second controlled instrument to work and the second control device controls the first controlled instrument to work.
2. The control method according to claim 1, wherein the acquiring the current poses of the first controlled instrument and the second controlled instrument at the current time comprises:
acquiring a first pose of the endoscope at the initial moment and a second pose of the endoscope at the current moment;
determining a pose change relation of the endoscope at the current moment based on the initial moment according to the first pose and the second pose;
determining current poses of the first and second controlled instruments based on a reference coordinate system and the pose change relationship, the reference coordinate system being constructed based on the first pose of the endoscope.
3. The control method according to claim 2, wherein the determining the current poses of the first controlled instrument and the second controlled instrument based on a reference coordinate system and the pose change relationship comprises:
determining the vector sum of the first controlled instrument and the second controlled instrument in the reference coordinate system at the initial moment;
and determining the current poses of the first controlled instrument and the second controlled instrument based on the vector sum and the pose change relation.
4. The control method according to claim 3, wherein the determining the vector sum of the first controlled instrument and the second controlled instrument in the reference coordinate system at the initial moment comprises:
determining a first vector projected to a target plane by the first controlled instrument and a second vector projected to the target plane by the second controlled instrument at the initial moment, wherein the target plane is a plane in which a mirror surface of the endoscope is located;
determining a vector sum of the first controlled instrument and the second controlled instrument in the reference coordinate system based on the first vector and the second vector.
5. The control method according to claim 4, wherein the determining whether the endoscope has flipped at the current time relative to the initial time based on the initial pose and the current pose comprises:
acquiring a reference component of the attitude data in the initial pose in the reference coordinate system and a target component of the attitude data in the current pose in the reference coordinate system;
and if the target component and the reference component are opposite in direction, determining that the endoscope is turned over at the current moment relative to the initial moment.
6. The method according to any one of claims 1-5, wherein the first control device includes a first primary input device and a first secondary input device, the second control device includes a second primary input device and a second secondary input device, the first primary input device can control movement of the first controlled instrument or the second controlled instrument, the second primary input device can control movement of the second controlled instrument or the first controlled instrument, the first secondary input device can activate a secondary function of the first controlled instrument or the second controlled instrument, the second secondary input device can activate a secondary function of the second controlled instrument or the first controlled instrument, the method comprising:
if the first main input device or the second main input device is detected to control the first controlled instrument, and the second main input device or the first main input device is detected to control the second controlled instrument, selectively distributing the control of the auxiliary function of the first controlled instrument to any one of the first auxiliary input device or the second auxiliary input device and distributing the control of the auxiliary function of the second controlled instrument to any one of the second auxiliary input device or the first auxiliary input device.
7. A control apparatus of a surgical robot, the control apparatus being applied to the surgical robot including a first control device, a first controlled instrument, a second control device, a second controlled instrument, and an endoscope, the control apparatus comprising:
the first acquisition module is used for acquiring initial poses of the first controlled instrument and the second controlled instrument at an initial moment;
the second acquisition module is used for acquiring the current poses of the first controlled instrument and the second controlled instrument at the current moment;
a determination module configured to determine whether the endoscope is flipped at the current time relative to the initial time based on the initial pose and the current pose;
the third acquisition module is used for acquiring the current control relationship of the surgical robot after the endoscope is turned over;
a switching module, configured to switch the current control relationship to the second control relationship or the first control relationship when the current control relationship is the first control relationship or the second control relationship; wherein the first control relationship is that the first control device controls the first controlled instrument to work and the second control device controls the second controlled instrument to work, and the second control relationship is that the first control device controls the second controlled instrument to work and the second control device controls the first controlled instrument to work.
8. A surgical robot, characterized in that the robot comprises: a processor and a memory for storing processor-executable instructions;
the processor is used for reading the executable instructions from the memory and executing the instructions to realize the control method according to any one of claims 1 to 6.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the control method according to any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, carries out the control method of any one of claims 1 to 6.
CN202211014750.5A 2022-08-23 2022-08-23 Control method and control device for surgical robot and robot Pending CN115317136A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211014750.5A CN115317136A (en) 2022-08-23 2022-08-23 Control method and control device for surgical robot and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211014750.5A CN115317136A (en) 2022-08-23 2022-08-23 Control method and control device for surgical robot and robot

Publications (1)

Publication Number Publication Date
CN115317136A true CN115317136A (en) 2022-11-11

Family

ID=83926834

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211014750.5A Pending CN115317136A (en) 2022-08-23 2022-08-23 Control method and control device for surgical robot and robot

Country Status (1)

Country Link
CN (1) CN115317136A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116999178A (en) * 2023-10-07 2023-11-07 北京科鹏医疗器械有限公司 Dual-frequency filtering visual master-slave mapping method operated by natural channel endoscope
CN116999178B (en) * 2023-10-07 2024-01-12 北京科鹏医疗器械有限公司 Dual-frequency filtering visual master-slave mapping method operated by natural channel endoscope


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination