WO2026025418A1 - Methods and systems for assisting doctors in work - Google Patents
- Publication number: WO2026025418A1 (application PCT/CN2024/109063)
- Authority: WIPO (PCT)
- Prior art keywords: doctor, patient, surgery, virtual, terminal
- Legal status: Pending (assumed status; not a legal conclusion)
Abstract
Provided is a method for assisting doctors in work, implemented on a processing device communicatively connected to a doctor terminal of a doctor, comprising: obtaining, from the doctor terminal, an access request to a doctor space application; in response to the access request, determining, based on a receipt time of the access request and schedule information of the doctor, one or more pending tasks to be completed by the doctor; and causing the doctor terminal to present, via the doctor space application, an interactive interface that includes first interface elements for accessing assistance services relating to at least one of the one or more pending tasks.
Description
The present disclosure relates to the field of medical service technology, and in particular to a method and a system for assisting doctors in work.
In their daily work routines, which can include ward rounds, consultations, surgeries, and meetings, doctors need timely access to various types of information, such as work schedules, inpatient data, outpatient details, and surgery plans. Whether doctors can view this information comprehensively and conveniently directly affects their work efficiency.
To this end, it is desirable to provide a method and a system for assisting doctors in work, so as to assist the doctors in managing their daily tasks efficiently.
One or more embodiments of the present disclosure provide a method for assisting doctors in work, implemented on a processing device communicatively connected to a doctor terminal of a doctor. The method may comprise: obtaining, from the doctor terminal, an access request to a doctor space application; in response to the access request, determining, based on a receipt time of the access request and schedule information of the doctor, one or more pending tasks to be completed by the doctor; and causing the doctor terminal to present, via the doctor space application, an interactive interface that includes first interface elements for accessing assistance services relating to at least one of the one or more pending tasks.
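The task-determination step recited above (selecting pending tasks from the receipt time of the access request and the doctor's schedule information) can be illustrated with a minimal sketch. All names and data structures below are hypothetical illustrations, not part of the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical schedule entry; the disclosure does not prescribe a format.
@dataclass
class ScheduledTask:
    name: str
    start: datetime
    end: datetime
    completed: bool = False

def determine_pending_tasks(receipt_time: datetime,
                            schedule: list[ScheduledTask]) -> list[ScheduledTask]:
    """Return tasks that are not yet completed and whose scheduled
    window has not ended before the receipt time of the access request."""
    return [t for t in schedule
            if not t.completed and t.end >= receipt_time]
```

For example, given a morning schedule in which ward rounds are already marked complete, an access request received mid-morning would yield only the remaining consultation task; the interactive interface would then present first interface elements for the assistance services of that task.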
In some embodiments, the method may further comprise determining, based on the receipt time and the schedule information, one or more completed tasks of the doctor, wherein the interactive interface further includes one or more collapsible elements relating to the one or more completed tasks.
In some embodiments, the interactive interface may further include a second interface element for accessing a real-time 3D map relating to a target location corresponding to the at least one pending task.
In some embodiments, the one or more pending tasks may include conducting ward rounds in a hospital ward. The first interface elements may include a first interface element for applying to participate in the ward rounds remotely.
In some embodiments, the method may further comprise obtaining, from the doctor terminal, a request to participate in the ward rounds remotely input by the doctor; in response to detecting that the ward rounds are conducted in the hospital ward, obtaining sensed information collected by one or more sensing devices in the hospital ward during the ward rounds; generating, based on the sensed information and patient data, a virtual ward space; and causing the doctor terminal to present the virtual ward space.
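The generation of a virtual ward space from sensed information and patient data, as described above, can be sketched as assembling a renderable scene description. The schema below is purely illustrative; the disclosure does not define one:

```python
def build_virtual_ward_space(sensed_frames: list[dict],
                             patient_data: dict) -> dict:
    """Combine live sensor frames collected in the hospital ward with
    stored patient data into a scene description for the doctor terminal.
    (Keys and structure are hypothetical.)"""
    return {
        "scene": "virtual_ward",
        "live_feed": sensed_frames,            # e.g., image/audio frames from ward sensors
        "patient_overlays": {                  # data panels presented alongside each patient
            patient_id: {"vitals": record.get("vitals"),
                         "notes": record.get("notes")}
            for patient_id, record in patient_data.items()
        },
    }
```

In such a sketch, the processing device would refresh `live_feed` as the ward rounds proceed and cause the doctor terminal to render the resulting scene.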
In some embodiments, the one or more pending tasks may include providing consultation services in a consultation room. The first interface elements may include a first interface element for accessing patient data of patients who have booked the consultation services.
In some embodiments, the method may further comprise obtaining, from the doctor terminal, a request for accessing the patient data of a target patient among the patients; generating, based on the patient data of the target patient, a virtual character representing the target patient; and causing the doctor terminal to present the virtual character to explain the patient data of the target patient to the doctor.
In some embodiments, the one or more pending tasks may include providing remote consultation services. The first interface elements may include a first interface element for entering a virtual consultation room.
In some embodiments, the method may further comprise obtaining, from the doctor terminal, a request to enter the virtual consultation room to provide the remote consultation services to a target patient; causing the doctor terminal to present a 3D patient model of the target patient; obtaining, from the doctor terminal, an examination instruction input by the doctor via interacting with the 3D patient model; and causing, based on the examination instruction, a wearable device worn by the target patient to collect measurement data of the target patient.
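The step of translating an examination instruction (entered by the doctor via the 3D patient model) into a measurement performed by the patient's wearable device can be sketched as a simple dispatch. The instruction names and device commands below are hypothetical, not taken from the disclosure:

```python
# Hypothetical mapping from a doctor's examination instruction to the
# command sent to the target patient's wearable device.
INSTRUCTION_TO_COMMAND = {
    "check_pulse": "measure_heart_rate",
    "check_blood_pressure": "measure_blood_pressure",
    "check_temperature": "measure_body_temperature",
}

def dispatch_examination(instruction: str) -> str:
    """Return the wearable-device command corresponding to an
    examination instruction; reject unsupported instructions."""
    try:
        return INSTRUCTION_TO_COMMAND[instruction]
    except KeyError:
        raise ValueError(f"unsupported examination instruction: {instruction}")
```

The resulting command would be transmitted to the wearable device, and the collected measurement data returned for presentation via the 3D patient model.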
In some embodiments, the one or more pending tasks may include performing surgery on a target patient. The first interface elements may include a first interface element for accessing patient data relating to the target patient.
In some embodiments, the interactive interface may further comprise a third interface element for conducting preoperative education.
In some embodiments, the doctor terminal may include a second XR device worn by the doctor. The method may further comprise: obtaining a request for conducting preoperative education for a target patient, the request being obtained from the second XR device and input by the doctor via interacting with the third interface element; in response to the request, generating explanatory materials for explaining a candidate surgery plan of the target patient; and causing a first XR device worn by the target patient and the second XR device to simultaneously present the explanatory materials to the target patient and the doctor.
In some embodiments, the interactive interface may further comprise a fourth interface element for conducting surgery simulation.
In some embodiments, the doctor terminal may include a second XR device worn by the doctor. The method may further comprise: obtaining a request for simulating a target surgery, the request being obtained from the second XR device and input by the doctor via interacting with the fourth interface element; in response to the request, generating a virtual surgery scene corresponding to the target surgery, the virtual surgery scene including a virtual surgery site and virtual surgery equipment; causing the second XR device to present the virtual surgery scene to the doctor; obtaining an interaction instruction with respect to the virtual surgery equipment input by the doctor via the second XR device or an interactive device corresponding to the virtual surgery equipment; and updating the virtual surgery site and the virtual surgery equipment in the virtual surgery scene based on the interaction instruction.
In some embodiments, the interactive interface may further comprise a fifth interface element for performing surgery planning.
In some embodiments, the method may further comprise: obtaining a request for performing surgery planning for a target patient, the request being obtained from the doctor terminal and input by the doctor via interacting with the fifth interface element; determining an operation difficulty factor based on patient data of the target patient; determining whether an expert meeting is needed based on the operation difficulty factor; and in response to determining that the expert meeting is needed, causing the doctor terminal to present a sixth interface element for initiating the expert meeting.
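The decision step above — scoring an operation difficulty factor from patient data and comparing it against a threshold to decide whether an expert meeting is needed — can be sketched as follows. The disclosure specifies neither the factors nor the threshold; the weights and flag names below are illustrative assumptions only:

```python
# Hypothetical difficulty factors and weights; not part of the disclosure.
DIFFICULTY_WEIGHTS = {
    "age_over_70": 1.0,
    "comorbidity": 1.5,
    "lesion_near_major_vessel": 2.0,
}
EXPERT_MEETING_THRESHOLD = 2.5  # illustrative cutoff

def operation_difficulty_factor(patient_flags: set[str]) -> float:
    """Sum the weights of the difficulty flags present in the patient data."""
    return sum(w for flag, w in DIFFICULTY_WEIGHTS.items() if flag in patient_flags)

def expert_meeting_needed(patient_flags: set[str]) -> bool:
    """An expert meeting is indicated when the difficulty factor
    reaches the configured threshold."""
    return operation_difficulty_factor(patient_flags) >= EXPERT_MEETING_THRESHOLD
```

When `expert_meeting_needed` returns true, the processing device would cause the doctor terminal to present the sixth interface element for initiating the expert meeting.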
In some embodiments, the interactive interface may further comprise a seventh interface element for patient management.
In some embodiments, the method may further comprise: obtaining a request for accessing a preliminary admission record of a target patient, the request being obtained from the doctor terminal and input by the doctor via interacting with the seventh interface element; in response to the request, causing the doctor terminal to present the preliminary admission record; and updating the preliminary admission record based on feedback information regarding the preliminary admission record input by the doctor via the doctor terminal.
In some embodiments, the updating the preliminary admission record may comprise: determining, based on the feedback information, inquiry content of a supplementary inquiry; causing, based on the inquiry content, a third terminal device in a hospital ward where the target patient is located to conduct the supplementary inquiry; obtaining sensed information collected by one or more sensing devices in the hospital ward during the supplementary inquiry; and updating the preliminary admission record based on the sensed information.
In some embodiments, in response to the access request, the determining one or more pending tasks to be completed by the doctor may comprise: in response to the access request, causing the doctor terminal to present, via the doctor space application, a preliminary interactive interface that includes an eighth interface element for reminding the doctor to check a work schedule; and in response to a request for accessing the work schedule input by the doctor via the doctor terminal, determining the one or more pending tasks.
In some embodiments, the preliminary interactive interface may further present a virtual character configured to communicate with the doctor.
One or more embodiments of the present disclosure provide a method for assisting doctors in work, implemented on a doctor terminal of a doctor communicatively connected to a processing device, the doctor terminal being installed with a doctor space application. The method may comprise: receiving an access request to the doctor space application input by the doctor; transmitting the access request to the processing device; receiving, from the processing device, an instruction to present an interactive interface via the doctor space application, wherein the interactive interface may include first interface elements for accessing assistance services relating to at least one pending task of one or more pending tasks to be completed by the doctor, and the one or more pending tasks may be determined by the processing device based on a receipt time of the access request at the processing device and schedule information of the doctor.
The present disclosure will be further illustrated by way of exemplary embodiments, which will be described in detail by means of the accompanying drawings. These embodiments are not limiting, and in these embodiments, the same numbering indicates the same structure, wherein:
FIG. 1 is a block diagram illustrating an exemplary medical service system 100 according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating an exemplary medical service system 200 according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating an exemplary hospital support platform 300 according to some embodiments of the present disclosure;
FIG. 4 is a schematic flowchart illustrating an exemplary method for assisting doctors in work according to some embodiments of the present disclosure;
FIG. 5 is a schematic diagram illustrating an exemplary initial interactive interface according to some embodiments of the present disclosure;
FIG. 6 is a schematic diagram illustrating an exemplary interface for presenting one or more pending tasks according to some embodiments of the present disclosure;
FIG. 7 is a schematic diagram illustrating an exemplary interactive interface according to some embodiments of the present disclosure;
FIG. 8 is a schematic diagram illustrating an exemplary interface for presenting in-hospital tour services according to some embodiments of the present disclosure;
FIG. 9 is a schematic diagram illustrating an exemplary ward round service interface according to some embodiments of the present disclosure;
FIG. 10 is a flowchart illustrating an exemplary process of participating in ward rounds remotely according to some embodiments of the present disclosure;
FIG. 11A is a schematic diagram illustrating an exemplary consultation service interface according to some embodiments of the present disclosure;
FIG. 11B is a schematic diagram illustrating an exemplary interface for presenting patient data of consultation patients according to some embodiments of the present disclosure;
FIG. 11C is a schematic diagram illustrating an exemplary interface for presenting past multimodal data of consultation patients according to some embodiments of the present disclosure;
FIG. 12 is a flowchart illustrating an exemplary process of viewing patient data according to some embodiments of the present disclosure;
FIG. 13 is a flowchart illustrating an exemplary process of remote consultation according to some embodiments of the present disclosure;
FIG. 14 is a schematic diagram illustrating an exemplary interface of performing a surgery service according to some embodiments of the present disclosure;
FIG. 15 is a flowchart illustrating an exemplary process of preoperative education according to some embodiments of the present disclosure;
FIG. 16 is a flowchart illustrating an exemplary process of surgery simulation according to some embodiments of the present disclosure;
FIG. 17 is a flowchart illustrating an exemplary process of performing surgery planning according to other embodiments of the present disclosure;
FIG. 18 is a flowchart illustrating an exemplary process of updating a preliminary admission record according to some embodiments of the present disclosure; and
FIG. 19 is a flowchart illustrating an exemplary method for assisting doctors in work according to some embodiments of the present disclosure.
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the accompanying drawings used in the description of the embodiments are briefly introduced below. Obviously, the accompanying drawings in the following description are only some examples or embodiments of the present disclosure, and a person of ordinary skill in the art may apply the present disclosure to other similar scenarios in accordance with these drawings without creative effort. Unless apparent from the context or stated otherwise, the same numeral in the drawings refers to the same structure or operation.
It should be understood that the terms "system", "device", "unit", and/or "module" used herein are a way to distinguish between different components, elements, parts, sections, or assemblies at different levels. However, the terms may be replaced by other expressions if those expressions accomplish the same purpose.
Usually, the terms "module", "unit", or "block" as used herein refer to logic embodied in hardware or firmware, or to a collection of software instructions. The modules, units, or blocks described herein may be implemented as software and/or hardware, and may be stored in any type of non-transitory computer-readable medium or other storage device. In some embodiments, a software module/unit/block may be compiled and linked into an executable program. It should be appreciated that software modules may be called from other modules/units/blocks or from themselves, and/or may be called in response to a detected event or interrupt. Software modules/units/blocks configured for execution on a computing device may be provided on a computer-readable medium (e.g., a CD-ROM, a digital video disc, a flash drive, a diskette, or any other tangible medium), or as a digital download (which may initially be stored in a compressed or installable format that needs to be installed, decompressed, or decrypted prior to execution). The software code herein may be stored, in part or in whole, in a storage device of the computing device that performs an operation and applied in the operation of the computing device. Software instructions may be embedded in firmware, such as an EPROM. It should also be understood that hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or may include programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functions described herein may be implemented as software modules/units/blocks, but may also be represented in hardware or firmware. Generally, the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks, regardless of their physical organization or storage.
This description may apply to a system, an engine, or a portion thereof.
It is to be understood that, unless the context explicitly states otherwise, when a unit, engine, module, or block is referred to as being "on", "connected to", or "coupled to" another unit, engine, module, or block, it may be directly on, connected or coupled to, or in communication with the other unit, engine, module, or block, or intervening units, engines, modules, or blocks may be present. In the present disclosure, the term "and/or" may include any one or more of the relevant listed items or combinations thereof. In the present disclosure, the term "image" may refer to a 2D image, a 3D image, or a 4D image.
These and other features, characteristics, and methods of functioning and operation of the relevant structural elements, as well as combinations of components and economies of manufacture of the present disclosure, may become more apparent in light of the following description in conjunction with the accompanying drawings, all of which form a part of this disclosure. It should be understood, however, that the accompanying drawings are for illustrative and descriptive purposes only and are not intended to limit the scope of the present disclosure. It should also be understood that the accompanying drawings are not drawn to scale.
As shown in the present disclosure and in the claims, unless the context clearly suggests an exception, the words "one", "a", "an", "one kind", and/or "the" do not refer specifically to the singular but may also include the plural. Generally, the terms "including" and "comprising" suggest only the inclusion of clearly identified steps and elements; however, such steps and elements do not constitute an exclusive list, and the method or apparatus may also include other steps or elements.
Flowcharts are used in the present disclosure to illustrate the operations performed by a system according to embodiments of the present disclosure, and the related descriptions are provided to aid in a better understanding of the method and/or system for assisting doctors in work. It should be appreciated that the preceding or following operations are not necessarily performed in an exact sequence. Instead, the operations may be processed in reverse order or simultaneously. Also, other operations may be added to these processes, or one or more operations may be removed from these processes.
In the daily work scenarios of doctors, doctors may need to handle multiple tasks, such as ward rounds, consultation services, surgeries, attending meetings, etc. In order to complete these tasks, doctors not only need to know the schedule of each task, but also need to know the information (e.g., inpatient data, outpatient information, surgery plans, etc.) related to each task in advance, and/or create information relating to each task (e.g., create a surgery plan, etc.). Doctors mostly view the above information via text or pictures, which may not fully and intuitively display the pathological features (e.g., the complex geometric features of a lesion) of patients, the real-time conditions (e.g., the recovery of inpatients) of the patients, and details (e.g., an implantation path of an implant surgery) of the surgery plan. As a result, the doctors may not be able to conveniently and intuitively understand the detailed information required for each task, affecting their work efficiency. In addition, ward rounds for inpatients traditionally require the doctors to be present on-site in the inpatient area, and outpatient work requires the doctors to be present on-site in the consultation room. Even in these areas, however, the doctors may be unable to view patient data comprehensively and intuitively, which also affects their work efficiency and may even cause inexperienced doctors to make misjudgments, resulting in adverse consequences.
Some embodiments of the present disclosure provide a method and a system for assisting doctors in work. The method and the system for assisting the doctors in work may combine a physical hospital and a digital twin hospital, and realize the interaction between the doctors and the system via virtual and real linkage, which can not only eliminate the incompleteness of 2D image information and reduce the difficulty of task processing, but also enable the doctors to handle certain tasks remotely, thereby improving the work efficiency of the doctors. More descriptions regarding the method may be found in FIGs. 4-18 and related descriptions thereof.
FIG. 1 is a block diagram illustrating an exemplary medical service system 100 according to some embodiments of the present disclosure.
The medical service system 100 can also be referred to as a meta hospital system, and is built based on various innovative technologies including metaverse technology, XR technology (e.g., augmented reality (AR) technology, virtual reality (VR) technology, mixed reality (MR) technology, etc.), AI technology, digital twin technology, IoT technology, data circulation technology (e.g., blockchain technology, data privacy computing technology), spatial computing technology, image rendering technology, etc.
As illustrated in FIG. 1, the medical service system 100 may include a physical hospital 110, a virtual hospital 130, at least one user space application 120, and a hospital support platform 140. In some embodiments, the hospital support platform 140 may map data relating to the physical hospital 110 into the virtual hospital 130 corresponding to the physical hospital 110, and provide user services to relevant users of the physical hospital 110 via the at least one user space application 120.
The physical hospital 110 refers to a hospital that exists in the physical world and has tangible properties. As used herein, healthcare institutions that offer medical, surgical, and psychiatric care and treatment for people are collectively referred to as hospitals.
As shown in FIG. 1, the physical hospital 110 may include a plurality of physical entities. For example, the plurality of physical entities may include departments, users, hardware devices, user services, public areas, medical service procedures, or the like, or any combination thereof.
A department refers to a specialized unit or division dedicated to providing specific types of medical care, treatments, and services. Each of the departments may focus on a particular area of medicine and may be staffed by healthcare professionals with expertise in that area. For example, the departments may include a consultation department, a hospitalization department, a surgery department, a support department (e.g., a registration department, a pharmacy department) , an internal medicine department, a surgical department, a specialized medical department, a children’s health department, or the like, or any combination thereof.
The users may include any users associated with the physical hospital 110 (or referred to as relevant users of the physical hospital 110) . For example, the users may include patients (or a portion of the patients (e.g., organs) ) , companions of the patients, visitors of the patients, hospital staff of the physical hospital 110, suppliers of the physical hospital 110, application developers of the physical hospital 110, or the like, or any combination thereof. The hospital staff of the physical hospital 110 may include medical service providers (e.g., doctors, nurses, technicians, etc. ) , hospital managers, support staff, or the like, or any combination thereof. Exemplary hospital managers may include a departmental nursing manager, a clinical leader, a departmental dean, a hospital dean, a hospital executive, a functional manager, or the like, or any combination thereof.
The hardware devices may include hardware devices located in the physical hospital 110 and/or hardware devices in a communication with the hardware devices in the physical hospital 110. Exemplary hardware devices may include terminal devices, medical service devices, sensing devices, basic devices, or the like, or any combination thereof.
The terminal devices may include terminal devices that interact with the users relating to the medical service system 100. For example, the terminal devices may include a terminal device that interacts with a patient (also referred to as a patient terminal device), a terminal device that interacts with a doctor of the patient (also referred to as a doctor terminal device), a terminal device that interacts with a nurse (also referred to as a nurse terminal device), a terminal device that interacts with a remote visitor (also referred to as a remote terminal device), a public terminal of the hospital (e.g., a consultation room terminal, a bedside terminal device, a terminal device in a waiting region, an intelligent surgery terminal), or the like, or any combination thereof. In the present disclosure, unless apparent from the context or stated otherwise, a terminal device that is owned by a user and a terminal device that is provided to the user by the physical hospital 110 are collectively referred to as a terminal device of the user or a terminal device that interacts with the user.
The terminal devices may include a mobile terminal, an XR device, a smart wearable device, etc. The mobile terminal may include a smart phone, a personal digital assistant (PDA) , a display, a gaming device, a navigation device, a handheld terminal (POS) , a tablet computer, or the like, or any combination thereof.
The XR device may include a device that allows a user to be engaged in an extended reality experience. For example, the XR device may include a VR assembly, an AR assembly, an MR assembly, or the like, or any combination thereof. In some embodiments, the XR device may include an XR helmet, XR glasses, an XR patch, a stereoscopic headset, or the like, or any combination thereof. For example, the XR device may include a Google Glass™, an Oculus Rift™, a Gear VR™, an Apple Vision Pro™, etc. Specifically, the XR device may include a display component on which virtual content may be rendered and/or displayed. In some embodiments, the XR device may further include an input component. The input component may enable user interactions between a user and the virtual content (e.g., the virtual surgery environment) displayed by the display component. For example, the input component may include a touch sensor, a microphone, an image sensor, etc., configured to receive user input, which may be provided to the XR device and used to control the virtual world by varying the visual content rendered on the display component. The input component may include a handle, a glove, a stylus, a console, etc.
The smart wearable device may include smart bracelets, smart shoes and socks, smart glasses, smart helmets, smart watches, smart clothes, smart backpacks, smart accessories, or the like, or any combination thereof. In some embodiments, the smart wearable device may obtain physiological data (e.g., heart rate, blood pressure, body temperature, etc. ) of the user.
The medical service devices may be configured to provide medical services to the patients. For example, the medical service devices may include examination devices, nursing care devices, therapeutic devices, or the like, or any combination thereof.
The examination devices may be configured to provide examination services to the patients, such as collecting examination data of the patients. Exemplary examination data may include a heart rate, a respiratory rate, a body temperature, blood pressure, medical imaging data, a body fluid test report (e.g., a blood test report) , or the like, or any combination thereof. Correspondingly, the examination devices may include a vital sign monitor (e.g., a blood pressure monitor, a glucometer, a cardiotachometer, a thermometer, a digital stethoscope, etc. ) , a medical imaging device (e.g., a computed tomography (CT) device, a digital subtraction angiography (DSA) device, a magnetic resonance (MR) device, etc. ) , a laboratory device (e.g., a blood routine examination device, etc. ) , or the like, or any combination thereof.
The nursing care devices may be configured to provide nursing care services to the patients and/or assist the medical service providers to provide the nursing care services. Exemplary nursing care devices may include a hospital bed, a patient-care robot, an intelligent nursing trolley, an intelligent medicine box, an intelligent wheelchair, etc.
The therapeutic devices may be configured to provide therapeutic services to the patients and/or assist the medical service providers to provide the therapeutic services. Exemplary therapeutic devices may include surgical devices, radiotherapeutic devices, physical therapy devices, or the like, or any combination thereof.
The sensing devices may be configured to collect sensed information relating to the environment where they are located. For example, the sensing devices may include an image sensor, an acoustic sensor, etc. The image sensor may be configured to collect image data in the physical hospital 110, and the acoustic sensor may be configured to collect acoustic data in the physical hospital 110. In some embodiments, a sensing device may be an independent device or be integrated into another device. For example, the acoustic sensor may be part of a medical service device or a terminal device.
The basic devices may be configured to support data transmission, storage, and processing. For example, the basic devices may include networks, machine room facilities, computing devices, computing chips, storage devices, etc.
In some embodiments, at least part of the hardware devices of the physical hospital 110 are IoT devices. The IoT devices refer to devices with sensors, processing ability, software, and other technologies that connect and exchange data with other devices and systems over the Internet or other communications networks. For example, one or more medical service devices and/or sensing devices of the physical hospital 110 are IoT devices and configured to transmit the collected data to the hospital support platform 140 for storage and/or processing.
The user services may include any services provided by the hospital support platform 140 to the users. For example, the user services include medical services provided to the patients and/or the companions of the patients, support services provided to the staff of the physical hospital 110 and/or suppliers of the physical hospital 110, etc. In some embodiments, the user services may be provided to patients, doctors, and hospital managers via the user space application (s) 120, which will be described in detail in the following descriptions.
The public areas refer to shared spaces accessible to the users (or a portion of the users) in the physical hospital 110. For example, the public areas may include a reception area (e.g., a front desk) , waiting areas, corridors and hallways, or the like, or any combination thereof.
A medical service procedure refers to a procedure that provides a corresponding medical service to the patients. The medical service procedure normally includes several stages and/or steps that a user needs to go through to receive the corresponding medical service. Exemplary medical service procedures may include a consultation procedure, a hospitalization procedure, a surgery procedure, or the like, or any combination thereof. In some embodiments, the medical service procedure may include medical service procedures corresponding to different departments, different diseases, etc. In some embodiments, a preset data acquisition protocol may be set, which specifies standard stages involved in the medical service procedure and how to collect data relating to the medical service procedure.
The at least one user space application 120 provides the users with access to the user services provided by the hospital support platform 140. A user space application 120 may be an application program, a plug-in, a website, an applet, or in any other suitable form. For example, the user space application 120 is an application program installed on a user’s terminal device, and the application program includes user interfaces for the user to initiate requests and receive corresponding services.
In some embodiments, the at least one user space application 120 may include different applications corresponding to different types of users. For example, the at least one user space application 120 includes a patient space application corresponding to patients, a doctor space application corresponding to doctors, a manager space application corresponding to managers, or the like, or any combination thereof. User services provided via the patient space application, the doctor space application, and the manager space application are also referred to as patient space services, doctor space services, and manager space services, respectively.
Exemplary patient space services include registration services, navigation services, pre-consultation services, remote consultation services, hospitalization admission services, hospitalization discharge services, etc. Exemplary doctor space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward round services, remote consultation services, etc. Example manager space services include monitoring services, medical service evaluation services, equipment parameter setting services, service parameter setting services, resource scheduling services, etc.
In some embodiments, the patient space application, the doctor space application, and the manager space application may be integrated into one user space application 120, and the user space application 120 may be configured to provide access for each type of the users (e.g., the patients, the medical service providers, the managers, etc. ) . Merely by way of example, a specific user may have a corresponding identity that can be used to log into the user space application, view corresponding diagnosis and treatment data, and obtain corresponding user services.
According to some embodiments of the present disclosure, by providing the user space applications for different types of users, each type of user can easily obtain the various user services that he/she may need via his/her corresponding user space application. In addition, at present, users are usually required to install various applications to obtain different user services, which results in poor user experience and high development costs. Therefore, the user space applications in the present disclosure can improve the user experience, improve the service quality and efficiency, enhance the service safety, and reduce the development or operational costs.
In some embodiments, the at least one user space application 120 may be configured to provide access for the relevant users of the physical hospital 110 to interact with the virtual hospital 130. For example, via a user space application 120, a user may input an instruction for retrieving digital content of the virtual hospital 130 (e.g., a digital twin model of a hardware device, a patient organ, a public area) , view the digital content, and interact with the digital content. As another example, via a user space application 120, a user may communicate with a virtual character representing an intelligent agent. In some embodiments, a public terminal of the hospital may be installed with a manager space application, and a manager account of the department corresponding to the public terminal may be logged into the manager space application. Users may receive user services via the manager space application installed in the public terminal.
The virtual hospital 130 is a digital twin (i.e., a virtual representation or virtual copy) of the physical hospital 110 that is used to simulate, analyze, predict, and optimize the operation status of the physical hospital 110. For example, the virtual hospital 130 may be a digital copy of the physical hospital 110 in real time.
In some embodiments, the virtual hospital 130 may be presented to the users using digital technologies. For example, at least a portion of the virtual hospital 130 may be presented to the relevant users using the XR technology when the relevant users interact with the virtual hospital 130. Merely by way of example, the at least a portion of the virtual hospital 130 may be superimposed on a real-world view of the relevant users using the MR technology.
In some embodiments, the virtual hospital 130 may include digital twins of the physical entities relating to the physical hospital 110. A digital twin refers to a virtual representation (e.g., a virtual copy, a mapping body, a digital simulator) of a physical entity. The digital twins may reflect and predict status, behaviors, and performances of the physical entities in real time. For example, the virtual hospital 130 may include digital twins of at least a portion of the medical services, the departments, the users, the hardware devices,
the user services, the public areas, the medical service procedures, etc., of the physical hospital 110. The digital twin of a physical entity may be in various forms including a model, an image, a graph, text, numerical values, etc. For example, the digital twins may be a virtual hospital corresponding to the physical hospital, virtual personnel (e.g., virtual doctors, virtual nurses, and virtual patients) corresponding to personnel entities (e.g., the doctors, the nurses, and the patients) , virtual devices (e.g., the virtual imaging device and a virtual scalpel) corresponding to medical service devices (e.g., an imaging device and a scalpel) , etc.
In some embodiments, the digital twins may include one or more first digital twins and/or one or more second digital twins. The status of each first digital twin may be updated based on an update of the status of the corresponding physical entity. For example, the one or more first digital twins may be updated during a process of mapping the data relating to the physical hospital 110 into the virtual hospital 130. The one or more second digital twins may be updatable via at least one of the at least one user space application 120, and the update of each second digital twin may result in a status update of the corresponding physical entity. In other words, when the corresponding physical entity changes its status, a first digital twin may be updated accordingly; when a second digital twin is updated, the status of the corresponding physical entity changes accordingly. For example, the one or more first digital twins may include the digital twins of the public areas, the medical services, the users, the hardware devices, etc., and the one or more second digital twins may include the digital twins of the hardware devices, the user services, the medical service procedures, etc. It should be understood that a digital twin can be both a first digital twin and a second digital twin.
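The two update directions described above (physical-to-virtual for first digital twins, virtual-to-physical for second digital twins) can be illustrated with a minimal sketch. All class and method names below are hypothetical and are not part of the disclosed system; they only illustrate the data flow.

```python
class DigitalTwin:
    """A minimal virtual representation of a physical entity."""

    def __init__(self, entity_id, status=None):
        self.entity_id = entity_id
        self.status = status


class FirstDigitalTwin(DigitalTwin):
    """Updated when the corresponding physical entity changes (physical -> virtual)."""

    def sync_from_physical(self, physical_status):
        self.status = physical_status


class SecondDigitalTwin(DigitalTwin):
    """Updatable via a user space application; the update propagates back to
    the physical entity (virtual -> physical)."""

    def __init__(self, entity_id, controller, status=None):
        super().__init__(entity_id, status)
        self._controller = controller  # callback that drives the physical entity

    def update_from_user(self, new_status):
        self.status = new_status
        self._controller(self.entity_id, new_status)  # propagate to the physical entity


# Demo: updating a second digital twin drives the physical entity.
events = []
bed_twin = SecondDigitalTwin("bed-240-2", controller=lambda eid, s: events.append((eid, s)))
bed_twin.update_from_user("occupied")

# Demo: a first digital twin merely mirrors the physical entity.
sensor_twin = FirstDigitalTwin("sensor-1")
sensor_twin.sync_from_physical("active")
```

As the disclosure notes, a single twin can play both roles; in this sketch that would be a class inheriting both behaviors.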
According to some embodiments of the present disclosure, by generating the virtual hospital 130 including the digital twins of the physical entities relating to the physical hospital 110, the physical hospital 110 (encompassing the hardware devices, the users, the user services, the medical service procedures, etc. ) can be simulated and tested in a safe and controllable environment. Through a virtual-real linkage (e.g., real-time interactions between the physical hospital 110 and the virtual hospital 130) , various medical scenarios can be predicted and responded to more accurately, thereby improving the quality and efficiency of the medical services. Additionally, the use of the XR technology and virtual-real integration technology enables more natural and intuitive interactions for the relevant users, providing a more comfortable and efficient medical environment, thereby enhancing the user experience.
In some embodiments, the virtual hospital 130 may further include intelligent agents that achieve self-evolution based on the data relating to the physical hospital 110 and AI technology.
An intelligent agent refers to an agent acting in an intelligent manner. For example, the intelligent agent may include a computing/software entity that can learn and evolve autonomously, and perceive and analyze data to perform specific tasks and/or achieve specific goals (e.g., the medical service procedures) . Through AI technology (e.g., reinforcement learning, deep learning, etc. ) , the intelligent agent may continuously learn and self-optimize in the interaction with the environment. In addition, the intelligent agent may collect and analyze massive amounts of data (e.g., the data relating to the physical hospital 110) through big data technology, and mine patterns and learn rules from the data to optimize a decision-making process, so as to identify environmental changes, respond quickly, and make reasonable judgments in uncertain or dynamic environments. For example, the intelligent agents may autonomously learn and evolve based on the AI technology to adapt to changes in the physical hospital 110. Merely by way of example, the intelligent agents may be built based on an NLP technology (e.g., a large language model, etc. ) , and may automatically learn and
autonomously update via a large amount of language texts (e.g., hospital business data and patient feedback information) to improve the quality of the user services provided by the physical hospital 110.
In some embodiments, the intelligent agents may include different types corresponding to different medical service procedures, different user services, different departments, different diseases, different hospital positions (e.g., the nurses, doctors, technicians, etc. ) , different stages in a medical service procedure, etc. An intelligent agent of a specific type is used to handle tasks corresponding to the specific type. In some embodiments, one intelligent agent may correspond to different medical service procedures (or different medical services, or different departments, or different diseases, or different hospital positions) . In some embodiments, the intelligent agent may operate with reference to essential data (e.g., dictionaries, knowledge graphs, templates, etc. ) of the department and/or disease corresponding to the intelligent agent. In some embodiments, a plurality of intelligent agents may collaborate with each other and share information via network communication to accomplish complex tasks together.
In some embodiments, configurations of an intelligent agent may be set. For example, essential data used by the intelligent agent in operation may be set. The essential data may include a dictionary, a knowledge database, a template, etc. As another example, usage permissions of the intelligent agent may be set for different users. In some embodiments, a manager of the physical hospital 110 may set configurations of the intelligent agent via a manager space application.
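The configuration items just listed (essential data and per-user permissions) could be held in a simple record such as the following sketch. The `AgentConfig` class and all field names are assumptions made for illustration only.

```python
from dataclasses import dataclass, field


@dataclass
class AgentConfig:
    """Hypothetical configuration record for an intelligent agent."""

    agent_type: str                                        # e.g., "surgery", "hospitalization"
    essential_data: dict = field(default_factory=dict)     # dictionaries, knowledge databases, templates
    permissions: dict = field(default_factory=dict)        # user role -> set of allowed operations

    def allowed(self, role, operation):
        """Check whether a user role may perform an operation with this agent."""
        return operation in self.permissions.get(role, ())


# A manager might set up a surgery agent like this:
cfg = AgentConfig(
    agent_type="surgery",
    essential_data={"dictionary": "surgical-terms-v1", "template": "op-record-v2"},
    permissions={"doctor": {"query", "simulate"}, "patient": {"query"}},
)
```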
In some embodiments, an intelligent agent may be integrated into or deployed on a hardware device. For example, an intelligent agent corresponding to the hospitalization services may be integrated into the hospital bed or a presentation device of the hospital bed. In some embodiments, an intelligent agent may be integrated into or deployed on an embodied intelligence robot. The embodied intelligence robot refers to a robotic system that integrates physical presence (embodiment) with intelligent behavior (cognition) . The embodied intelligence robot may be configured to interact with the real world in a manner that mimics or complements human capabilities, utilizing physical form and cognitive functions to perform tasks, make decisions, and adapt to the environment. By leveraging AI and sensor technologies, the embodied intelligence robot may operate autonomously, interact with the environment, and continuously improve the performance. For example, the embodied intelligence robot may be configured with the intelligent agent corresponding to the surgery services and assist the doctors to perform surgeries.
In some embodiments, at least a portion of the user services may be provided based on the intelligent agents. For example, the at least a portion of the user services may be provided to the relevant users based on a processing result, wherein the processing result is generated by at least one of the intelligent agents based on the data relating to the physical hospital 110. Merely by way of example, the data relating to the physical hospital 110 may include data relating to a medical service procedure of the physical hospital 110, the intelligent agents may include an intelligent agent corresponding to the medical service procedure, and the user services may be provided to relevant users of the medical service procedure by processing the data using the intelligent agent corresponding to the medical service procedure.
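The routing just described (procedure data goes to the agent matching the procedure, and the agent's processing result drives the user service) can be sketched as follows. The toy agent and all names are hypothetical; a real agent would apply the AI techniques described above rather than a hard-coded rule.

```python
class ConsultationAgent:
    """Toy stand-in for an intelligent agent corresponding to a consultation procedure."""

    def process(self, procedure_data):
        # A real agent would apply learned models; here we just flag abnormal readings.
        abnormal = [k for k, v in procedure_data.items() if v == "abnormal"]
        return {"follow_up_needed": bool(abnormal), "abnormal_items": abnormal}


def provide_service(procedure_type, procedure_data, agents):
    """Route procedure data to the matching agent and wrap its processing
    result as a user-service payload."""
    result = agents[procedure_type].process(procedure_data)
    return {"service": procedure_type, "result": result}


agents = {"consultation": ConsultationAgent()}
payload = provide_service(
    "consultation",
    {"blood_pressure": "abnormal", "heart_rate": "normal"},
    agents,
)
```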
The hospital support platform 140 may be configured to provide technical support for the medical service system 100. For example, the hospital support platform 140 may include computational hardware and software to support the innovative technologies including XR technology, the AI technology, digital twin technology, data circulation technology, etc. In some embodiments, the hospital support platform 140 may at
least include a storage device for data storage and a processing device for data computation.
In some embodiments, the hospital support platform 140 may support the interaction between the physical hospital 110 and the virtual hospital 130. For example, the processing device of the hospital support platform 140 may obtain data relating to the physical hospital 110 from the hardware devices and map the data relating to the physical hospital 110 into the virtual hospital 130. For instance, the processing device of the hospital support platform 140 may update a portion of the digital twins in the virtual hospital 130 (e.g., the one or more first digital twins) based on the obtained data, so that each of the portion of the digital twins in the virtual hospital 130 may reflect an updated status of the corresponding physical entity in the physical hospital 110. Based on such digital twins that are constantly updated with the corresponding physical entities, the users can understand the real-time statuses of the physical entities relating to the physical hospital 110, thereby realizing monitoring and evaluation of the physical entities. As another example, intelligent agent (s) corresponding to the data relating to the physical hospital 110 may be self-evolving and self-learning by training and/or updating based on the data relating to the physical hospital 110.
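The mapping step described above (pushing hardware readings into the corresponding first digital twins so each twin reflects the updated physical status) might look like the following sketch. The function name and dictionary shapes are assumptions for illustration only.

```python
def map_to_virtual_hospital(readings, twins):
    """Push the latest hardware readings into the matching first digital twins.

    `readings` maps entity IDs to their current physical status; `twins` maps
    entity IDs to mutable twin records. Twins without a fresh reading keep
    their previous status.
    """
    for entity_id, status in readings.items():
        twin = twins.get(entity_id)
        if twin is not None:
            twin["status"] = status  # the twin now reflects the physical entity
    return twins


# Demo: only the bed has a new reading, so only its twin is updated.
twins = {
    "bed-240-2": {"status": "vacant"},
    "wheelchair-240-5": {"status": "idle"},
}
updated = map_to_virtual_hospital({"bed-240-2": "occupied"}, twins)
```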
In some embodiments, the hospital support platform 140 may support and/or provide the user services to the relevant users of the physical hospital 110. For example, in response to receiving a user service request from a user, the processing device of the hospital support platform 140 may provide the user service corresponding to the service request. As another example, in response to detecting that a user service needs to be provided to a user, the processing device of the hospital support platform 140 may control a physical entity or a virtual entity corresponding to the user service to provide the user service. For instance, in response to detecting that the patient is admitted to a hospital ward, the processing device of the hospital support platform 140 may control the intelligent nursing trolley to guide a nurse to the hospital ward to perform an initial examination on the patient.
In some embodiments, at least a portion of the user services may be provided to the relevant users based on the interactions between the relevant users and the virtual hospital 130. An interaction refers to a reciprocal action or influence (e.g., conversation, behavior, etc. ) between the relevant users and the virtual hospital 130. For example, the interactions between the relevant users and the virtual hospital 130 may include interactions between the relevant users and the digital twins in the virtual hospital 130, interactions between the relevant users and the intelligent agents, interactions between the relevant users and the virtual characters, or the like, or any combination thereof.
In some embodiments, at least a portion of the user services may be provided to the relevant users based on the interactions between the relevant users and at least one of the digital twins. For example, an updating instruction of a second digital twin inputted by a relevant user may be received via the at least one user space application 120, and the corresponding physical entity of the second digital twin may be updated based on the updating instruction. As another example, a user may view a first digital twin of a physical entity (e.g., a 3D digital twin model of a patient’s organ or a hardware device) via the user space application 120 to understand the status of the physical entity. Optionally, the user may change the display angle, the display size, etc., of the digital twin.
In some embodiments, the processing device of the hospital support platform 140 may present a virtual character corresponding to an intelligent agent via the at least one user space application to interact with the relevant users, and provide at least a portion of the user services to the relevant users based on the interactions
between the relevant users and the virtual character.
In some embodiments, the hospital support platform 140 may have a five-layer structure, including a hardware device layer, an interface layer, a data processing layer, an application development layer, and a service layer, which will be described in detail in connection with FIG. 3. In some embodiments, the hardware devices of the physical hospital 110 may be part of the hospital support platform 140.
According to some embodiments of the present disclosure, by comprehensively integrating various internal and external resources (e.g., the medical service devices, hospital staff, drugs and consumables, etc. ) of the physical hospital, the virtual hospital corresponding to the physical hospital can be established. This virtual hospital can reflect the real-time statuses (e.g., changes, updates, etc. ) of the physical entities relating to the physical hospital, thereby enabling monitoring and evaluation of the physical entities. This integration can provide accurate data support for the operation of the medical services and intelligent decision-making. Furthermore, through the virtual hospital, the relevant users relating to the medical services can collaboratively establish an open and shared ecosystem, thereby fostering innovation and enhancement of medical services.
In addition, full-life cycle patient medical and health services with in-hospital and out-of-hospital linkage may be provided. The perspective of the medical services is expanded from simple disease treatment to encompass the entire life cycle of the patients, including prevention, diagnosis, treatment, rehabilitation, health management, etc. By establishing the in-hospital and out-of-hospital linkage, the physical hospital can better integrate online and offline resources to provide the patients with comprehensive and continuous medical and health services. For example, through remote monitoring and online consultation, the patients’ health status can be tracked in real time, allowing treatment plans to be adjusted promptly and treatment outcomes to be improved.
FIG. 2 is a schematic diagram illustrating an exemplary medical service system 200 according to some embodiments of the present disclosure.
As illustrated in FIG. 2, the medical service system 200 may include a processing device 210, a network 220, a storage device 230, one or more medical service devices 240, one or more sensing devices 250, one or more patient terminal devices 260 of a patient 261, and one or more doctor terminal devices 270 of a doctor 271 associated with the patient 261. In some embodiments, components in the medical service system 200 may be connected to and/or communicate with each other via a wireless connection, a wired connection, or a combination thereof. The connection between the components of the medical service system 200 may be variable.
The processing device 210 may process data and/or information obtained from the storage device 230, the medical service device (s) 240, the sensing device (s) 250, the patient terminal device (s) 260, and/or the doctor terminal device (s) 270. For example, the processing device 210 may provide user services to the patient 261 and the doctor 271 via the patient terminal device (s) 260 and/or the doctor terminal device (s) 270, respectively.
In some embodiments, the processing device 210 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the processing device 210 may be local to or remote from the medical service system 200. In some embodiments, the processing device 210 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof.
In some embodiments, the processing device 210 may include one or more processors (e.g., single-
core processor (s) or multi-core processor (s) ) . Merely for illustration, only one processing device 210 is described in the medical service system 200. However, it should be noted that the medical service system 200 in the present disclosure may also include multiple processing devices. Thus, operations and/or method steps that are performed by one processing device 210 as described in the present disclosure may also be jointly or separately performed by the multiple processing devices.
The network 220 may include any suitable network that can facilitate the exchange of information and/or data for the medical service system 200. The network 220 may be or include a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network) , a Bluetooth™ network, a near field communication (NFC) network, or the like, or any combination thereof.
The storage device 230 may store data, instructions, and/or any other information. In some embodiments, the storage device 230 may store data obtained from other components of the medical service system 200. In some embodiments, the storage device 230 may store data and/or instructions that the processing device 210 may execute or use to perform exemplary methods described in the present disclosure.
In some embodiments, the data stored in the storage device 230 may include multimodal data. The multimodal data may include data in multiple forms (e.g., images, graphics, video, text, etc. ) , data of various types, data obtained from different sources, data relating to different medical businesses (e.g., diagnosis, surgery, rehabilitation, etc. ) , data relating to different users (e.g., the patients, the medical staff, managers, etc. ) . For example, the data stored in the storage device 230 may include medical data of the patient 261 reflecting a health condition of the patient 261. For instance, the medical data may include an electronic health record of the patient 261. The electronic health record refers to an electronic file that records various types of patient data (e.g., basic information, examination data, imaging data) . For example, the electronic health record may include three-dimensional models of a plurality of organs and/or tissues of the patient 261.
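The record types named above (basic information, examination data, imaging data, and 3D organ models within an electronic health record) could be organized as in the following sketch. The class and field names are hypothetical and chosen only to mirror the description.

```python
from dataclasses import dataclass, field


@dataclass
class OrganModel:
    """Reference to a three-dimensional model of an organ or tissue."""

    name: str
    mesh_uri: str  # location of the 3D model data


@dataclass
class ElectronicHealthRecord:
    """Hypothetical container mirroring the patient data types named above."""

    patient_id: str
    basic_info: dict = field(default_factory=dict)
    examination_data: list = field(default_factory=list)
    imaging_data: list = field(default_factory=list)
    organ_models: list = field(default_factory=list)


# Demo: a minimal record for patient 261 with one organ model.
ehr = ElectronicHealthRecord(
    patient_id="P-261",
    basic_info={"age": 42},
    organ_models=[OrganModel("liver", "models/p261/liver.obj")],
)
```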
In some embodiments, the storage device 230 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. In some embodiments, the storage device 230 may include a data lake and a data warehouse, which will be described in detail in connection with FIG. 3.
The medical service device (s) 240 may be used to provide or assist medical services. As shown in FIG. 2, the medical service device (s) 240 include a consultation room terminal 240-1, a hospital bed 240-2, an intelligent surgery terminal 240-3, an intelligent nursing trolley 240-4, an intelligent wheelchair 240-5, or the like, or any combination thereof.
The consultation room terminal 240-1 refers to a terminal device configured in a consultation room for use by doctors and patients in a medical consultation process. For example, the consultation room terminal 240-1 may include one or more of a screen, a sound output component, an image sensor, or an acoustic sensor. The screen of the consultation room terminal 240-1 may present a consultation interface, and data may be presented on the consultation interface for facilitating the communication between patients and doctors. Exemplary data may include an electronic health record (or a portion thereof) , a pre-consultation record, a medical image, a 3D organ model, an examination result, a decision recommendation, etc.
The hospital bed 240-2 refers to a bed in a hospital ward that can support a patient admitted to the hospital ward and provide user services to the patient. The hospital bed 240-2 may include a bed, a bedside terminal device, a bedside examination device, sensors, or the like, or any combination thereof. The bedside
terminal device may include an XR device, a display device, a mobile device, or the like, or any combination thereof. In some embodiments, the hospital bed 240-2 may be controlled by an intelligent agent corresponding to hospitalization services, wherein such a hospital bed may be also referred to as an intelligent hospital bed or a meta-hospital bed.
The intelligent surgery terminal 240-3 refers to a device configured for assisting surgeries and controlled by an intelligent agent corresponding to the surgery service. The intelligent surgery terminal 240-3 may perceive interactions (e.g., conversation, behavior, etc. ) between the medical service providers, the patients, and the intelligent agent, and obtain data captured by the sensing device (s) 250, so as to provide surgery assistance. In some embodiments, the intelligent surgery terminal 240-3 may be configured to perform risk warnings of surgical operations, generate surgical records for the surgery procedure, etc., based on the intelligent agent configured therein.
The intelligent nursing trolley 240-4 refers to a nursing trolley that has an automatic driving function and can assist in patient treatment and care. For example, the intelligent nursing trolley 240-4 may be configured to guide a nurse to the hospital ward to perform an initial examination on the patient. In some embodiments, the intelligent nursing trolley can be controlled by an intelligent agent (e.g., an intelligent agent corresponding to the hospitalization services, a nursing intelligent agent) . In some embodiments, the intelligent nursing trolley 240-4 may include a trolley, a presentation device, one or more examination devices and/or nursing tools, sensors (e.g., an image sensor, a GPS sensor, an acoustic sensor, etc. ) , etc. In some embodiments, the intelligent nursing trolley 240-4 may be configured to obtain relevant treatment and care information of the patient and generate measurement data, nursing data, etc. The measurement data may include vital signs data of the patient. The nursing data may include a detailed record of a nursing operation, such as a nursing time, a nursing operator, nursing measures, patient responses, etc.
The intelligent wheelchair 240-5 refers to a transport device for intelligently picking up and dropping off the patients. In some embodiments, the intelligent wheelchair 240-5 may be configured to perform autonomous navigation through integrated sensors and maps, locate a location of a patient using a radio frequency identification device (RFID) , Bluetooth, or Wi-Fi signals, identify the patient through biometric technology. In some embodiments, the intelligent wheelchair 240-5 may be controlled by an intelligent agent (e.g., an intelligent agent corresponding to the hospitalization services, an intelligent agent corresponding to the surgery services) . In some embodiments, the intelligent wheelchair 240-5 may be configured to generate data (e.g., records of interaction content between the intelligent agent and the patients) by sensing interaction data through built-in cameras/sensors.
The sensing device (s) 250 may be configured to collect sensed information relating to the environment in which they are located. In some embodiments, the sensing device (s) 250 may include sensing device (s) in the physical hospital 110. For example, the sensing device (s) 250 may include an image sensor 250-1, an acoustic sensor 250-2, a temperature sensor, a humidity sensor, etc.
The patient terminal device (s) 260 may be a terminal device that interacts with the patient 261. In some embodiments, the patient terminal device (s) 260 may include a mobile terminal 260-1, an XR device 260-2, a smart wearable device 260-3, etc. The doctor terminal device (s) 270 may be a terminal device that interacts with the doctor 271. In some embodiments, the doctor terminal device (s) 270 may include a mobile terminal 270-1, an XR device 270-2, etc. In some embodiments, the patient 261 may access the user space application
(e.g., the patient space application) through a patient terminal device 260, and the doctor 271 may access the user space application (e.g., the doctor space application) through a doctor terminal device 270. In some embodiments, the patient 261 and the doctor 271 may communicate with each other remotely via a patient terminal device 260 and a doctor terminal device 270, so as to provide remote medical services, such as remote consultation service, remote ward round service, remote follow-up service, etc.
The sensing device (s) 250, the patient terminal device (s) 260, and the doctor terminal device (s) 270 may be configured as data sources to provide information for the medical service system 200. For example, these devices may transmit collected data to the processing device 210, and the processing device 210 may provide user services based on the received data.
It should be noted that the above description of the medical service systems 100 and 200 is intended to be illustrative, and not to limit the scope of the present disclosure. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the medical service system 200 may include one or more additional components, such as terminal devices of other users, public terminal devices of the hospital, etc. As another example, two or more components of the medical service system 200 may be integrated into a single component.
FIG. 3 is a schematic diagram illustrating an exemplary hospital support platform 300 according to some embodiments of the present disclosure.
As shown in FIG. 3, the hospital support platform 300 may include a hardware layer 310 (also referred to as a hardware module) , an interface layer 320 (also referred to as an interface module) , a data processing layer 330 (also referred to as a data processing module) , an application development layer 340 (also referred to as an application development module) , and a service layer 350 (also referred to as a service module) . It should be understood that the “layer” and “module” in the present disclosure are only used to logically divide the components of the hospital support platform, and are not intended to be limiting.
The hardware layer 310 may be configured to provide a hardware foundation for the interaction between a real world and a digital world, and may include one or more hardware devices related to hospital operation. An exemplary hardware device may include a medical service device, a sensing device, a terminal device, and a basic device.
The interface layer 320 may be connected with the hardware layer 310 and the data processing layer 330. The interface layer 320 may be configured to obtain the data collected by the hardware devices of the hardware layer 310 and send the data to the data processing layer 330 for storage and/or processing. The interface layer 320 may also be configured to control at least a portion of the hardware devices of the hardware layer 310. In some embodiments, the interface layer 320 may include hardware interfaces and software interfaces (e.g., data interface, control interface) .
The data processing layer 330 may be configured to store and/or process data. The data processing layer 330 may include a processing device, and multiple data processing units may be configured on the processing device. The data processing layer 330 may be configured to obtain data from the interface layer 320 and process the data via at least one of the data processing units to implement user services related to the hospital business.
The data processing units may include various preset algorithms for implementing data processing. In some embodiments, the data processing layer 330 may include a processing device (e.g., the processing device 220 in FIG. 2) . The data processing units may be configured on the processing device. In some embodiments, the data processing units may include XR units configured to process data using XR technologies to achieve XR services, AI units (e.g., intelligent agent units) configured to process the data using AI technologies to achieve AI services, digital twin units configured to process the data using digital twin technologies to achieve digital twin service, data circulation units configured to process the data using data circulation technologies (e.g., blockchain technologies, data privacy computing technologies) to achieve data circulation services, etc.
In some embodiments, the data processing layer 330 may also include a data center configured to store data. In some embodiments, the data center may adopt a lake-warehouse integrated architecture, which may include a data lake and a data warehouse. The data lake may be used to persistently store massive data in a tamper-proof manner. The data warehouse may be used to store index data corresponding to the data in the data lake. The data stored in the data lake may include native (or raw) data collected by the hardware devices, derived data generated based on the native data, etc. In some embodiments, the data in the data lakehouse may be processed by a processing device (e.g., the processing device 210) .
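The lake-warehouse split described above can be sketched as follows; the names and data shapes (`lake`, `warehouse`, `ingest`, `lookup`) are illustrative assumptions, not the platform's actual storage API. Raw records go into an append-only "lake" keyed by content hash, which also gives a form of tamper evidence (any change to a payload changes its key), while the "warehouse" keeps lightweight, queryable index rows pointing back into the lake.

```python
import hashlib

# In-memory stand-ins for the two stores in this sketch.
lake = {}        # content_hash -> raw payload (append-only)
warehouse = []   # index rows: (record_type, patient_id, content_hash)

def ingest(record_type: str, patient_id: str, payload: bytes) -> str:
    """Persist a raw record in the lake and index it in the warehouse."""
    digest = hashlib.sha256(payload).hexdigest()
    lake[digest] = payload
    warehouse.append((record_type, patient_id, digest))
    return digest

def lookup(record_type: str, patient_id: str) -> list:
    """Resolve warehouse index rows back to raw data in the lake."""
    return [lake[h] for t, p, h in warehouse
            if t == record_type and p == patient_id]

ingest("vitals", "P001", b"hr=72,bp=120/80")
```

Queries then scan only the compact warehouse index, and the heavyweight native and derived data in the lake are fetched only for matching rows.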
The application development layer 340 may be configured to support application development, publishing, subscription, etc. The application development layer 340 is also referred to as an ecological suite layer. In some embodiments, the application development layer 340 may be configured to provide open interfaces for application developers to access or invoke at least a portion of the data processing units and utilize the at least a portion of the data processing units to develop applications. In some embodiments, as shown in FIG. 3, the application development layer 340 may provide a development toolkit, an application marketplace, a multi-tenant operation platform, a cloud official website, a workspace, and other support kits to assist the developers in their work.
The service layer 350 may be configured for relevant users of the hospital business to access the user services relating to the hospital business via user space applications.
The present disclosure provides a hospital support platform designed for the comprehensive management of various resources within a hospital, including hardware resources, software resources, and data resources. In certain embodiments, the platform further incorporates data processing units capable of supporting advanced technologies, such as AI, XR, digital twin, and blockchain. These advanced technologies are harnessed to enhance service efficiency and quality within the healthcare industry. For instance, AI technologies enable autonomous evolution and continuous optimization of hospital operations, while XR and digital twin technologies facilitate the creation and maintenance of a virtual hospital. This virtual hospital can engage with users, offering an immersive and novel service experience. Additionally, the platform includes an application development layer that grants access to these advanced technologies to third-party developers within the healthcare industry. This access fosters an open ecosystem that promotes application development and innovation, thereby driving advancements in healthcare services.
FIG. 4 is a schematic flowchart illustrating an exemplary process for assisting doctors in work according to some embodiments of the present disclosure. As shown in FIG. 4, in some embodiments, a process 400 may include the following operations. In some embodiments, the process 400 may be implemented by a processing device 210.
In 410, an access request to a doctor space application may be obtained from a doctor terminal.
The doctor terminal refers to a device used by a doctor. For example, as shown in FIG. 4, a doctor terminal 270 may include a mobile terminal device 270-1 (e.g., a smart phone or a tablet computer) , a desktop terminal device 270-3 (e.g., a laptop computer and a display screen) , or the like, used by a doctor 271.
In some embodiments, the doctor terminal 270 may include an extended reality (XR) device. For example, as shown in FIG. 4, the doctor terminal 270 may include a second XR device 270-2 worn by the doctor.
The doctor space application refers to an assistant application designed to support a doctor’s daily clinical tasks. The doctor space application may allow doctors to complete at least a portion of daily work tasks. For example, the doctor may conduct ward rounds and view patient data remotely via the doctor space application. The doctor space application may be installed and operated on the doctor terminal 270. For example, the doctor 271 may activate the doctor space application via the second XR device 270-2 that the doctor wears.
The access request refers to a doctor's request to initiate the doctor space application. The doctor may generate the access request in various ways. For example, the doctor space application may be displayed on a display screen of the mobile terminal device 270-1, and the doctor 271 may generate the access request by clicking an icon of the doctor space application on the display screen or by issuing a voice command, such as "start the doctor space application" . As another example, the second XR device 270-2 may generate a virtual reality version of the doctor space application, and the doctor 271 may generate the access request by interacting with the virtual reality doctor space application or by using a voice command.
In 420, in response to the access request, one or more pending tasks to be completed by the doctor may be determined based on a receipt time of the access request and schedule information of the doctor.
The receipt time of the access request refers to a time point when the processing device 210 receives the access request sent from the doctor terminal 270.
The schedule information of the doctor refers to work arrangement details of the doctor for the day. The schedule information of the doctor may include various work tasks that the doctor needs to complete on the day and a time corresponding to each of the work tasks (e.g., a planned start time and a planned end time of each of the work tasks). Merely by way of example, the schedule information of the doctor may be "ward round preparation (7:45-8:00), ward rounds (8:00-9:00), pre-consultation preview (9:00-9:10), consultation (9:10-11:30), preoperative preparation (13:30-14:00), surgery (14:00-17:00)". Ward round preparation refers to one or more tasks that the doctor needs to complete before ward rounds, such as reviewing a surgery record and a medical history of a patient. Pre-consultation preview refers to one or more tasks that the doctor needs to complete before carrying out consultation work, such as checking the patient data of the patient who has booked the consultation. Preoperative preparation refers to one or more tasks that the doctor needs to complete before conducting surgery, such as preoperative cleaning, preoperative disinfection, etc.
A pending task refers to a task that the doctor has not completed on the day, which may include a task that the doctor is currently working on, a task that has not yet started, or a combination thereof. For example, taking the schedule information of the doctor in the above example as an example, if the doctor is currently conducting the consultation, the one or more pending tasks may include the consultation, the preoperative preparation, and the surgery.
In some embodiments, the one or more pending tasks may include conducting ward rounds in a hospital ward.
In some embodiments, a doctor (e.g., the doctor 271) may conduct the ward rounds by participating in the ward rounds remotely. More descriptions regarding conducting the ward rounds remotely may be found in FIG. 7, FIG. 9, or FIG. 10 and related descriptions.
In some embodiments, the one or more pending tasks may include providing consultation services in a consultation room. More descriptions regarding the embodiment may be found in FIG. 11A and FIG. 12 and related descriptions thereof.
In some embodiments, the one or more pending tasks may include providing a remote consultation service.
The remote consultation service refers to the provision of medical diagnosis and consultation by a doctor through an online platform (e.g., the doctor space application). More descriptions regarding the embodiment may be found in FIG. 13 and related descriptions thereof.
In some embodiments, the one or more pending tasks may include performing a surgery on a target patient. More descriptions regarding the embodiment may be found in FIG. 14 and related descriptions thereof.
In some embodiments, the one or more pending tasks may include writing a work record.
The work record may document the details of the doctor's daily activities, such as the content of the work done that day, working hours, emergencies during the task, a task summary, etc. For example, the work record may include a ward round record of the day (including an actual start/end time of the ward rounds, emergencies encountered during the ward rounds, corresponding solutions, etc.), a consultation record (including an actual start/end time of the consultation, a count of diagnosed patients, special patients encountered, problems of patients, etc.), a surgery record (including an actual start/end time of the surgery, whether the surgery is successful, whether abnormal conditions occurred during the surgery, etc.), a work error (e.g., an incorrect operation during the surgery), etc.
In some embodiments, the work record may include a daily work record, a work record for each task, or a work record for a preset time period (e.g., within 3 h, 5 h, 24 h, etc.). In some embodiments, the pending task of writing the work record may be displayed all the time. For example, until the doctor completes the task of writing the work record, the pending task may remain displayed on the display interface.
In some embodiments, the one or more pending tasks may include reviewing records of one or more tasks that have been completed. For example, the doctor 271 may access records related to the ward rounds, the consultation, the surgery, etc., via the doctor terminal 270 and review the records. When the records of one or more tasks that have been completed are reviewed, the doctor may add, modify and/or delete content in the records. For example, the doctor 271 may input (e.g., text input or voice input, etc. ) the end time of the surgery via the doctor terminal 270 to modify the end time of the surgery already in the record.
The ward round record refers to a record of relevant data during ward rounds. The ward round record may include the time of the ward rounds (e.g., the start time of the ward rounds) , the participants of the ward rounds (e.g., the names of doctors and nurses participating in the ward rounds) , the patient data, communication between the patient and at least one doctor, the doctor order during the ward round, or the like,
or a combination thereof. In some embodiments, the processing device 210 may generate a preliminary ward round record based on the sensing data collected during the ward rounds by the sensing devices in the ward (e.g., voice information of the doctors and the patients), and generate the ward round record based on the doctor feedback (e.g., the modification information or the confirmation information) on the preliminary ward round record. More descriptions regarding the preliminary ward round record may be found in FIG. 9 and related descriptions thereof.
The diagnosis record may include a patient's condition, cause, diagnosis, outpatient doctor order, etc., during a consultation process. In some embodiments, the processing device 210 may generate a preliminary diagnosis record based on sensing data (e.g., voice information of the doctor and the patient) collected by one or more sensing devices during the consultation process, and generate the diagnosis record based on feedback information (e.g., modification information) of the doctor on the preliminary diagnosis record. More descriptions regarding the preliminary diagnosis record may be found in FIG. 11A and related descriptions thereof.
The surgery record may include a record of events (e.g., doctor's operations) during a surgery process. For example, the surgery record may include details (e.g., a start and end time of the surgery, whether the surgery is successful, etc. ) related to the surgery, details (e.g., vital signs of the patient, etc. ) related to the patient, details (e.g., an operation record, voice information record, or the like, of the doctor and nurses) related to participants, etc.
In some embodiments, the processing device 210 may generate a preliminary surgery record based on data (e.g., sensing information collected by the one or more sensing devices in an operation room during the surgery) collected during the surgery. In some embodiments, the processing device 210 may generate the surgery record based on the preliminary surgery record and feedback information (e.g., the modification information) on the preliminary surgery record input by the doctor. More descriptions regarding the preliminary surgery record may be found in FIG. 14 and related descriptions thereof.
In some embodiments, the processing device 210 may determine one or more tasks whose planned end time is later than the receipt time of the access request as the pending tasks. For example, assuming the receipt time of the access request is 11:13 and the schedule information of the doctor is "ward round preparation (7:45-8:00), ward rounds (8:00-9:00), pre-consultation preview (9:00-9:10), consultation (9:10-11:30), preoperative preparation (13:30-14:00), surgery (14:00-17:00)", the tasks whose planned end time is later than the receipt time of 11:13 include the consultation, the preoperative preparation, and the surgery, and the processing device 210 may determine the consultation, the preoperative preparation, and the surgery as the pending tasks.
In some embodiments, since some tasks may be completed ahead of schedule, the doctor (e.g., the doctor 271) may input a task completion instruction (e.g., a voice command "the consultation task completed") via the doctor terminal 270 based on the actual completion status of the task. The doctor terminal 270 may send the task completion instruction to the processing device 210, and the processing device 210 may determine the corresponding task as a completed task (i.e., not a pending task) based on the task completion instruction. For example, continuing with the previous example, if the receipt time of the access request is 11:13, but the doctor has already indicated that the consultation task is completed, the determined one or more pending tasks may not include the consultation task. Thus, the one or more pending tasks may only include the preoperative preparation and the surgery.
In some embodiments, the processing device 210 may determine one or more completed tasks of the doctor based on the receipt time and the schedule information of the doctor.
In some embodiments, the processing device 210 may determine a task whose planned end time is earlier than the receipt time of the access request as a completed task. For example, if the receipt time of the access request is 11:13 and the schedule information of the doctor is "ward round preparation (7:45-8:00), ward rounds (8:00-9:00), pre-consultation preview (9:00-9:10), consultation (9:10-11:30), preoperative preparation (13:30-14:00), surgery (14:00-17:00)", the tasks whose planned end time is earlier than the receipt time of 11:13 include the ward round preparation, the ward rounds, and the pre-consultation preview, and the processing device 210 may determine the ward round preparation, the ward rounds, and the pre-consultation preview as the completed tasks.
In some embodiments, since some tasks may be completed ahead of schedule, the doctor (e.g., the doctor 271) may input the task completion instruction (e.g., a voice command "the consultation task completed") via the doctor terminal 270 based on the actual completion status of the task. The doctor terminal 270 may send the task completion instruction to the processing device 210, and the processing device 210 may determine the corresponding task as the completed task based on the task completion instruction. For example, continuing with the previous example, if the receipt time of the access request is 11:13, but the doctor has already indicated that the consultation task is completed, the determined one or more completed tasks may include the consultation task.
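The task-partitioning logic described in the preceding paragraphs (comparing each task's planned end time against the receipt time, with explicit task completion instructions taking precedence) can be sketched as follows. The schedule representation and the function name `partition_tasks` are illustrative assumptions for this sketch, not the disclosed implementation.

```python
from datetime import time

# Illustrative schedule mirroring the example in the text:
# task name -> (planned start time, planned end time).
SCHEDULE = {
    "ward round preparation": (time(7, 45), time(8, 0)),
    "ward rounds": (time(8, 0), time(9, 0)),
    "pre-consultation preview": (time(9, 0), time(9, 10)),
    "consultation": (time(9, 10), time(11, 30)),
    "preoperative preparation": (time(13, 30), time(14, 0)),
    "surgery": (time(14, 0), time(17, 0)),
}

def partition_tasks(schedule, receipt_time, completed_by_instruction=()):
    """Split the day's tasks into pending and completed lists.

    A task counts as pending if its planned end time is later than the
    receipt time of the access request, unless the doctor has explicitly
    marked it complete via a task completion instruction.
    """
    pending, completed = [], []
    for task, (_start, end) in schedule.items():
        if task in completed_by_instruction or end <= receipt_time:
            completed.append(task)
        else:
            pending.append(task)
    return pending, completed

pending, completed = partition_tasks(SCHEDULE, time(11, 13))
# pending -> ['consultation', 'preoperative preparation', 'surgery']

# With an explicit completion instruction for the consultation task:
pending2, _ = partition_tasks(SCHEDULE, time(11, 13), {"consultation"})
# pending2 -> ['preoperative preparation', 'surgery']
```

This reproduces both worked examples from the text: at a receipt time of 11:13 the consultation remains pending by schedule alone, but drops to the completed list once the doctor's completion instruction is received.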
In some embodiments, in response to the access request, the processing device 210 may cause the doctor terminal 270 to present a preliminary interactive interface that includes an eighth interface element for reminding the doctor to check a work schedule (i.e., the schedule information of the doctor) via the doctor space application. In some embodiments, in response to a request for accessing the work schedule input by the doctor via the doctor terminal, the processing device 210 may determine the one or more pending tasks. More descriptions regarding the above embodiments may be found in FIG. 5 and related descriptions thereof.
In 430, the doctor terminal may be caused to present an interactive interface via the doctor space application. For example, as shown in FIG. 4, the processing device 210 may cause the doctor terminal 270 to present an interactive interface 431. The interactive interface 431 may be presented as follows: if the doctor terminal 270 is a terminal device with a display screen, such as the mobile terminal device 270-1 or the desktop terminal device 270-3, the interactive interface 431 may be presented directly on the display screen; if the doctor terminal 270 is the second XR device 270-2, the second XR device 270-2 may present the interactive interface 431 in a virtual reality space generated by the second XR device 270-2. More descriptions regarding the interactive interface 431 may be found in FIG. 7 and related descriptions thereof.
The interactive interface (e.g., the interactive interface 431) may include at least one interface element. The doctor (e.g., the doctor 271) may access one or more assistance services corresponding to the at least one interface element by accessing the at least one interface element. The doctor may access the at least one interface element by clicking, long pressing, voice selection (inputting an interface element to be accessed in voice) , etc.
The assistance services refer to functions provided by the doctor space application to assist the doctor in completing work tasks. For example, the assistance services may include displaying patient data, displaying 3D maps of target locations within the hospital, and providing services for the doctor to participate remotely in consultation or ward rounds. More descriptions regarding the above services may be found in the following description.
In some embodiments, the interactive interface (e.g., the interactive interface 431) may include first interface elements for accessing the assistance services relating to at least one of the one or more pending tasks. The doctor (e.g., the doctor 271) may access the assistance services corresponding to the first interface elements by clicking or selecting the first interface elements in voice. More descriptions regarding the interactive interface may be found in FIG. 7 and related descriptions thereof.
In some embodiments, in response to an access request, the processing device 210 may cause the doctor terminal 270 to present a preliminary interactive interface via the doctor space application. The interface may include an eighth interface element for reminding a doctor to check a work schedule. For example, FIG. 5 is a schematic diagram illustrating an exemplary preliminary interactive interface according to some embodiments of the present disclosure. In response to the access request, the processing device 210 may cause the doctor terminal 270 to present a preliminary interactive interface 500 as shown in FIG. 5 via the doctor space application. The preliminary interactive interface 500 may include an eighth interface element 510.
As shown in FIG. 5, the eighth interface element 510 may include an input box element (e.g., a position of "Check today schedule" in FIG. 5) and a functional element (e.g., a "Back to Home" element and an "Open" element in FIG. 5) . The input box element may display a reminder message (e.g., a text "Check today schedule" ) for checking the work schedule in gray font or other forms to remind the doctor to check the work schedule. For example, as shown in FIG. 5, the doctor 271 may initiate a request to access the work schedule by entering the text "Check today schedule" in the input box and clicking the "Open" element.
It is understood that FIG. 5 and the description thereof are only examples, and in some embodiments, the eighth interface element 510 may be presented in other ways, such as a "Check Work Schedule" button, a "Check Work Schedule" voice reminder, etc., which is not limited in the present disclosure.
In some embodiments, the processing device 210 may cause the doctor terminal (e.g., the doctor terminal 270) to present a corresponding interface based on an interaction (e.g., clicking or selecting in voice, etc.) between the doctor (e.g., the doctor 271) and the eighth interface element 510. For example, in response to the doctor 271 clicking the "Open" element, the processing device 210 may cause the doctor terminal 270 to present schedule information of the doctor. As another example, in response to the doctor 271 entering a name of an assistance service (e.g., hospital tour) in the input box element and clicking the "Open" element, the processing device 210 may cause the doctor terminal 270 to present a doctor's interface of the assistance service. As another example, the doctor 271 may click the "Back to Home" element to cause the doctor terminal 270 to present a main interface (e.g., the interactive interface 431 shown in FIG. 7). As used herein, "hospital tour" refers to accessing a real-time 3D map relating to at least one target location within the hospital. More descriptions regarding the target location and the real-time 3D map may be found in FIG. 7 and related descriptions thereof.
In some embodiments, when the doctor clicks the "Open" element, the processing device 210 may cause the doctor terminal (e.g., the doctor terminal 270) to present a work schedule interface.
FIG. 6 is a schematic diagram illustrating an exemplary interface for presenting one or more pending tasks according to some embodiments of the present disclosure.
In some embodiments, as shown in FIG. 6, a work schedule interface 600 may include various tasks that a doctor (e.g., the doctor 271) needs to handle on the day. For example, as shown in FIG. 6, tasks presented on the work schedule interface 600 may include a ward round task 610, a consultation task 620, and a surgery task 630. In some embodiments, the work schedule interface 600 may further include the specific work content included in each task and corresponding time information (e.g., planned start time). For example, as shown in FIG. 6, the ward round task 610 may include ward rounds, and a planned start time of the ward rounds may be 8:00; the surgery task 630 may include preoperative preparation and intraoperative execution, and planned start times of the preoperative preparation and the intraoperative execution may be 13:30 and 14:00, respectively.
In some embodiments, as shown in FIG. 6, the work schedule interface 600 may further include a tour element 640. The processing device 210 may cause the doctor terminal (e.g., the doctor terminal 270) to present an interface (e.g., an interface shown in FIG. 8) corresponding to a hospital tour service based on an interaction (e.g., clicking or selecting in voice, etc. ) between the doctor (e.g., the doctor 271) and the tour element 640. More descriptions regarding the hospital tour services may be found in FIG. 8 and related descriptions thereof.
In some embodiments, the preliminary interactive interface may further present a virtual character configured to communicate with the doctor. For example, as shown in FIG. 5, the preliminary interactive interface 500 may include a virtual character 520.
In some embodiments, the doctor (e.g., the doctor 271) may access a desired assistance service via communicating with the virtual character 520 in voice. For example, the doctor 271 may say "back to home" in voice, the doctor terminal 270 may receive a voice message and send the voice message to the processing device 210, and the processing device 210 may perform voice recognition on the voice message and cause the doctor terminal 270 to jump to the main interface (e.g., the interactive interface 431 shown in FIG. 7) according to a recognition result.
In some embodiments, the access request to the work schedule may be input by the doctor via communicating with the virtual character. For example, the doctor 271 may say "Check today schedule" in voice, the doctor terminal 270 may receive a voice message and send the voice message to the processing device 210, and the processing device 210 may perform voice recognition on the voice message and cause the doctor terminal 270 to jump to the work schedule interface (e.g., the work schedule interface 600 shown in FIG. 6) according to a recognition result.
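The voice interactions above reduce to mapping a recognized utterance to an interface target. A minimal dispatch sketch follows; the command strings mirror the examples in the text, while the interface identifiers and the function name `route_voice_command` are illustrative assumptions (the voice recognition step itself is out of scope here).

```python
# Hypothetical routing table from recognized voice commands to interface
# identifiers; the identifiers are illustrative, not the application's
# real names.
COMMAND_ROUTES = {
    "back to home": "main_interface",         # e.g., interactive interface 431
    "check today schedule": "work_schedule",  # e.g., work schedule interface 600
}

def route_voice_command(recognized_text: str):
    """Return the interface the terminal should present for a recognized
    utterance, or None if the utterance matches no known command."""
    return COMMAND_ROUTES.get(recognized_text.strip().lower())
```

For instance, `route_voice_command("Check today schedule")` resolves to the work schedule interface, while an unrecognized utterance yields `None` and could fall through to the virtual character's general dialogue handling.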
In some embodiments, in response to a request for accessing the work schedule input by the doctor via the doctor terminal, the processing device 210 may determine the one or more pending tasks. That is, after receiving the request for accessing the work schedule input by the doctor via the doctor terminal, the processing device 210 may determine the one or more pending tasks to be completed by the doctor.
FIG. 7 is a schematic diagram illustrating an exemplary interactive interface according to some embodiments of the present disclosure.
In some embodiments, as shown in FIG. 7, the interactive interface 431 may include first interface elements 710 (hereinafter referred to as a plurality of first interface elements 710) for accessing assistance services relating to at least one of one or more pending tasks.
In some embodiments, when the one or more pending tasks include conducting ward rounds in a hospital ward, the first interface elements may include a first interface element (hereinafter referred to as ward round interface element) for applying to participate in the ward rounds remotely. For example, as shown in FIG. 7, the plurality of first interface elements 710 may include a ward round interface element 711.
In some embodiments, in response to an interaction (e.g., the doctor 271 clicks on the ward round interface element 711) between the doctor (e.g., the doctor 271) and the ward round interface element 711, the processing device 210 may cause the doctor terminal 270 to present a ward round service interface 7110 as shown in FIG. 9.
In some embodiments, the doctor (e.g., the doctor 271) may access an assistance service (e.g., sending a request to participate in the ward rounds remotely, and checking patient data of patients under ward rounds, etc. ) relating to participating in the ward rounds remotely by interacting (e.g., clicking on interface elements in the ward round service interface 7110, voice interaction, etc. ) with the ward round service interface 7110 as shown in FIG. 9.
Participating in the ward rounds remotely refers to that the doctor (e.g., the doctor 271) conducts the ward rounds via a virtual ward space presented by the doctor terminal 270 (e.g., the second XR device 270-2) .
More descriptions regarding the embodiment and the virtual ward space may be found in FIG. 9 and/or FIG. 10 and related descriptions thereof.
In some embodiments, when the one or more pending tasks include providing a consultation service in a consultation room, the first interface elements may include a first interface element (hereinafter referred to as a consultation interface element) for accessing patient data of patients who have booked the consultation services. For example, as shown in FIG. 7, the plurality of first interface elements 710 may include a consultation interface element 712.
In some embodiments, in response to an interaction (e.g., the doctor 271 clicks on the consultation interface element 712) between the doctor (e.g., the doctor 271) and the consultation interface element 712, the processing device 210 may cause the doctor terminal 270 to present a consultation service interface 7120 as shown in FIG. 11A.
In some embodiments, the doctor (e.g., the doctor 271) may access an assistance service (e.g., viewing the patient data of the patients who have booked the consultation, etc.) relating to the consultation via interacting (e.g., clicking on interface elements on the consultation service interface 7120, interacting in voice, etc.) with the consultation service interface 7120 as shown in FIG. 11A.
More descriptions regarding the embodiment may be found in FIG. 11A and related descriptions thereof.
In some embodiments, when the one or more pending tasks include providing a remote consultation service, the first interface elements may include a first interface element (hereinafter referred to as a consultation room interface element) for entering a virtual consultation room. For example, as shown in FIG. 7, the plurality of first interface elements 710 may include a consultation room interface element 713.
In some embodiments, in response to an interaction (e.g., the doctor 271 clicks on the consultation room interface element 713) between the doctor (e.g., the doctor 271) and the consultation room interface element 713, the processing device 210 may cause the doctor terminal 270 (e.g., the second XR device 270-2) to present the virtual consultation room.
More descriptions regarding the embodiment and the virtual consultation room may be found in FIG. 13 and related descriptions thereof.
In some embodiments, when the one or more pending tasks include performing a surgery on a target patient, the first interface elements may include a first interface element (hereinafter referred to as a surgery interface element) for accessing patient data relating to the target patient. For example, as shown in FIG. 7, the plurality of first interface elements 710 may include a surgery interface element 714.
In some embodiments, in response to an interaction (e.g., the doctor 271 clicks on the surgery interface element 714) between the doctor (e.g., the doctor 271) and the surgery interface element 714, the processing device 210 may cause the doctor terminal 270 to present a surgery service interface 7140 as shown in FIG. 14.
In some embodiments, the doctor (e.g., the doctor 271) may access an assistance service (e.g., viewing the patient data of the target patient on whom the surgery is to be performed, etc.) relating to the surgery via interacting (e.g., clicking on an interface element on the surgery service interface 7140, interacting in voice, etc.) with the surgery service interface 7140 as shown in FIG. 14.
More descriptions regarding the embodiment may be found in FIG. 14 and related descriptions thereof.
In some embodiments, as shown in FIG. 7, the interactive interface 431 may further include a second interface element 720. The second interface element 720 may be configured to access a real-time 3D map relating to a target location corresponding to at least one pending task (i.e., to tour the target location).
The target location corresponding to the at least one pending task refers to a spatial location where the doctor performs the at least one pending task. For example, if the at least one pending task includes ward rounds, the target location may be a ward. As another example, if the at least one pending task includes surgery, the target location may be an operation room.
In some embodiments, the real-time 3D map of the target location may be generated based on a preliminary 3D map of a hospital and real-time information of the target location.
The preliminary 3D map of the hospital may be a static map of the hospital, which can reflect the physical structure and/or layout of the hospital. In some embodiments, the processing device 210 may generate the preliminary 3D map of the hospital based on a 3D model of the hospital.
The real-time information is real-time dynamic information about users entering the hospital and hospital devices. The real-time information may include current location information (e.g., the current location of the patient, the current location of the doctor, the current location of the nurse) of the user, the current medical process information (e.g., the patient is receiving the consultation service) of the user, the medical service information (e.g., the pre-consultation record, the diagnosis record, the examination result, etc. ) of the user, and the operation information (e.g., operation information of elevators, smart beds, smart nursing carts, smart chairs, etc. ) of the hospital devices.
In some embodiments, the real-time information may be obtained based on one or more sensing devices (e.g., in-hospital camera equipment, sound sensing equipment, etc.) of the hospital and/or the user terminals (e.g., the doctor terminal 270). For example, the doctor terminal 270 may obtain the current location information of the doctor, the task that the doctor is handling, etc. As another example, the sensing device of the hospital may pick up consultation conversations between a doctor and a patient.
The real-time 3D map of a target location refers to a 3D map that reflects the layout of the hospital at the target location and a real-time condition (e.g., the real-time location of the doctor or the patient) . Specifically, the processing device 210 may dynamically update the preliminary 3D map of the target location according to the real-time information of the target location and obtain the real-time 3D map of the target location. For example, taking the target location of a consultation room (hereinafter referred to as consultation room A) as an example, if the real-time information of a doctor shows that the location information of the doctor is in the consultation room A, the processing device 210 may place a 3D model representing the doctor in the real-time 3D map of the consultation room A.
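The dynamic update described above (placing or removing 3D models on a preliminary map according to real-time location information) can be sketched as follows; the class, method, and field names (e.g., RealTime3DMap, update) and the data layout are illustrative assumptions, not part of the disclosed system:

```python
from dataclasses import dataclass, field

@dataclass
class RealTime3DMap:
    """Sketch of a real-time 3D map: a static (preliminary) layout plus
    dynamically placed 3D models keyed by user identifier."""
    location_name: str                                   # e.g., "consultation room A"
    static_layout: dict = field(default_factory=dict)    # walls, furniture, etc.
    dynamic_models: dict = field(default_factory=dict)   # user id -> 3D position

    def update(self, real_time_info: dict) -> None:
        """Place a 3D model for every user currently reported to be at
        this location; models of users who left are dropped."""
        self.dynamic_models = {
            uid: info["position"]
            for uid, info in real_time_info.items()
            if info.get("location") == self.location_name
        }

# Hypothetical usage: the doctor is reported inside consultation room A,
# so only the doctor's model is placed on this room's map.
ward_map = RealTime3DMap(location_name="consultation room A")
ward_map.update({
    "doctor_271": {"location": "consultation room A", "position": (2.0, 3.5, 0.0)},
    "patient_12": {"location": "ward 5", "position": (9.0, 1.0, 0.0)},
})
```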
FIG. 8 is a schematic diagram illustrating an exemplary interface for presenting in-hospital tour services according to some embodiments of the present disclosure.
In some embodiments, in response to an interaction (e.g., clicking or selecting in voice, etc. ) between a doctor (e.g., the doctor 271) and the second interface element 720, the processing device 210 may cause the doctor terminal 270 (e.g., the second XR device 270-2) to present an in-hospital tour service interface 7200.
For example, taking one or more pending tasks of the doctor including ward rounds, consultation, and a surgery as an example, as shown in FIG. 8, the in-hospital tour service interface 7200 may include a ward tour element 7201, a consultation room tour element 7202, and an operation room tour element 7203. In response to an interaction (e.g., clicking or selecting in voice, etc. ) between the doctor (e.g., the doctor 271) and one of the ward tour element 7201, the consultation room tour element 7202, and the operation room tour element 7203, the processing device 210 may cause the doctor terminal 270 (e.g., the second XR device 270-2) to present one of a real-time 3D map of a ward, a consultation room, or an operation room, correspondingly.
In some embodiments, as shown in FIG. 7, the interactive interface 431 may further include a third interface element 730 for conducting preoperative education.
In some embodiments, in response to an interaction (e.g., clicking the third interface element 730 or selecting the third interface element 730 in voice) between the doctor (e.g., the doctor 271) and the third interface element 730, the doctor terminal 270 may generate a request for conducting the preoperative education for a target patient and send the request to the processing device 210.
The preoperative education may include explaining a patient condition, a surgery plan, and postoperative recovery to a patient and/or a patient family. For example, the processing device 210 may present a lesion and surrounding tissues or organs thereof in a visualized manner via a terminal device (e.g., XR glasses or an intelligent display terminal) worn by participants (e.g., the patient, the doctors, the patient family, etc.) based on a 3D anatomical model of the patient. Then, the processing device 210 may obtain explanatory information of the doctor to the patient and/or the patient family via a sensor disposed on the terminal device worn by the doctor, and send the explanatory information to the terminal device of the patient or the patient family such that the patient or the patient family may receive the explanatory information. The explanatory information may include explanations of various surgery plans and simulated surgery procedures, potential risks, the progress of postoperative recovery, discharge criteria, etc., or a combination thereof. The explanatory information may be in the form of voice information, pattern information, text information, etc.
In some embodiments, the processing device 210 may obtain a request for conducting the preoperative education for the target patient. In response to the request for conducting the preoperative education for the target patient, the processing device 210 may generate an explanatory material for explaining a surgery plan for the target patient. The processing device 210 may cause a first XR device worn by the patient and a second XR device to simultaneously present the explanatory material to the target patient and the doctor. More descriptions regarding the preoperative education and the embodiment may be found in FIG. 15 and related descriptions thereof.
In some embodiments, as shown in FIG. 7, the interactive interface 431 may further include a fourth interface element 740 for conducting surgical simulation.
In some embodiments, in response to an interaction (e.g., clicking on the fourth interface element 740) between the doctor (e.g., the doctor 271) and the fourth interface element 740, the doctor terminal 270 may generate a request for simulating a target surgery and send the request for simulating a target surgery to the processing device 210. The target surgery refers to a surgery corresponding to a target surgery plan.
Surgery simulation refers to a process in which the doctor (e.g., the doctor 271) practices surgery in a safe and controllable environment, improves the surgery plan, and/or improves surgery skills. For example, the doctor may conduct surgery simulation on a virtual patient in a virtual surgery scene (e.g., an XR surgery scene) based on a formulated surgery plan using an XR device (e.g., the second XR device 270-2) to identify a risk point that may occur during the surgery, thereby formulating a corresponding risk prevention measure. The risk point refers to a risky surgery operation, an abnormal status of an organ, or another emergency situation during the surgery. For example, the risk point may be that the patient has low blood pressure, bradypnea, liver failure, etc. As another example, for a plurality of surgery plans, the doctor may simulate each of the plurality of surgery plans in the virtual surgery scene using the XR device (e.g., the second XR device 270-2) to compare the pros and cons of different surgery plans and determine an optimal surgery plan. As another example, the doctor may repeatedly practice the same surgery procedure in the virtual surgery scene using the XR device (e.g., the second XR device 270-2) to enhance the understanding and memory of the surgery operation and improve the accuracy and proficiency of the surgery operation.
In some embodiments, the processing device 210 may obtain a request for simulating a target surgery. In response to the request for simulating a target surgery, the processing device 210 may generate the virtual surgery scene corresponding to the target surgery. The virtual surgery scene may include a virtual surgery site and virtual surgery equipment. The processing device 210 may cause the second XR device to present the virtual surgery scene to the doctor. The processing device 210 may obtain an interaction instruction with respect to the virtual surgery equipment input by the doctor via the second XR device or an interactive device corresponding to the virtual surgery equipment. The processing device 210 may update the virtual surgery site and the virtual surgery equipment in the virtual surgery scene based on the interaction instruction. More descriptions regarding the surgery simulation and the embodiment may be found in FIG. 16 and related descriptions thereof.
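As a minimal sketch of how an interaction instruction might update the virtual surgery site and virtual surgery equipment, consider the following; the scene structure, the instruction fields, and the depth-based risk rule are all hypothetical illustrations, not the disclosed implementation:

```python
class VirtualSurgeryScene:
    """Toy virtual surgery scene holding a virtual surgery site and
    virtual surgery equipment, updated from interaction instructions
    input via an XR or interactive device."""
    def __init__(self):
        self.site = {"incision": None}
        self.equipment = {"scalpel": {"position": (0.0, 0.0, 0.0)}}
        self.risk_points = []       # risky operations detected during simulation

    def apply_instruction(self, instruction: dict) -> None:
        action = instruction.get("action")
        if action == "move_tool":
            # Move a piece of virtual surgery equipment.
            self.equipment[instruction["tool"]]["position"] = instruction["position"]
        elif action == "incise":
            # Update the virtual surgery site; flag a (toy) risk point
            # when the requested incision depth exceeds a threshold.
            self.site["incision"] = instruction["location"]
            if instruction.get("depth_mm", 0) > 20:
                self.risk_points.append("incision too deep")

# Hypothetical usage: the doctor moves the scalpel, then makes an incision.
scene = VirtualSurgeryScene()
scene.apply_instruction({"action": "move_tool", "tool": "scalpel",
                         "position": (1.0, 2.0, 0.0)})
scene.apply_instruction({"action": "incise", "location": "abdomen", "depth_mm": 25})
```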
In some embodiments, as shown in FIG. 7, the interactive interface 431 may include a fifth interface element 750 for performing surgery planning.
In some embodiments, in response to an interaction (e.g., clicking or selecting the fifth interface element 750 in voice) between the doctor (e.g., the doctor 271) and the fifth interface element 750, the doctor terminal 270 may generate a request for performing surgery planning for the target patient and send the request to the processing device 210.
The surgery plan refers to a surgery scheme for performing surgery on a patient undergoing the surgery. In some embodiments, the surgery plan may include at least one of the surgery site (e.g., abdomen, chest, etc. ) , surgery time, operating procedures, an estimated surgery duration, an anesthesia dosage, a surgeon, a surgery type (e.g., minimally invasive surgery, laparoscopic surgery, or invasive surgery, etc. ) , a surgery incision location, an incision depth, an incision path, an implant (e.g., a heart stent) , an implant path, types and quantities of surgery tools, risk prevention measures, etc., or a combination thereof.
Performing the surgery plan refers to a process of performing the surgery on the patient based on the surgery plan.
In some embodiments, the processing device 210 may determine the surgery plan using a plan generation model. For example, the surgery plan may be determined by processing and analyzing a condition assessment result using the plan generation model. The plan generation model may be a machine learning model. In some embodiments, an input of the plan generation model may include the condition assessment result, and an output of the plan generation model may include the surgery plan. In some embodiments, the plan generation model may be integrated in the processing device 210. After the processing device 210 determines the surgery plan using the plan generation model, the surgery plan may be output and presented to the doctor through an interactive device (e.g., the doctor terminal 270 or a smart display screen, etc. ) of a doctor workstation (e.g., a doctor's office) . In some embodiments, the processing device 210 may confirm or update the surgery plan based on feedback information (e.g., confirmation information or modification information of the surgery plan, etc. ) of the doctor obtained from the interactive device.
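The flow above (model inference followed by confirmation or modification based on the doctor's feedback) can be sketched as below; the toy model, field names, and feedback format are illustrative assumptions, since the disclosure does not specify a concrete model:

```python
def generate_surgery_plan(condition_assessment: dict, plan_model) -> dict:
    """Run a plan generation model on a condition assessment result.
    `plan_model` is any callable mapping an assessment to a plan; a real
    system would load a trained machine learning model here."""
    return plan_model(condition_assessment)

def apply_doctor_feedback(plan: dict, feedback: dict) -> dict:
    """Confirm or update the plan with the doctor's feedback; modified
    fields override the model's suggestion."""
    return {**plan, **feedback.get("modifications", {})}

def toy_plan_model(assessment: dict) -> dict:
    # Hypothetical stand-in for the trained plan generation model.
    invasive = assessment.get("stage") == "advanced"
    return {"surgery_type": "invasive" if invasive else "minimally invasive",
            "estimated_duration_h": 4 if invasive else 2}

# Hypothetical usage: generate a plan, then apply the doctor's modification.
plan = generate_surgery_plan({"stage": "advanced"}, toy_plan_model)
plan = apply_doctor_feedback(plan, {"modifications": {"estimated_duration_h": 5}})
```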
The condition assessment result may reflect a health status and/or condition of the patient. The condition assessment result may include a disease stage, disease progression, a deterioration rate of the disease, patient's tolerance for drugs or surgery, etc. For example, taking cancer as an example, the condition assessment result may be that "the disease stage is advanced, the cancer has spread, the cancer develops and spreads rapidly, the patient has no history of drug allergies, and the patient has a strong tolerance for the surgery" .
An assessment manner of the condition assessment result may include manual assessment and/or intelligent assessment.
The manual assessment refers to that the condition assessment result is determined by a condition assessor (e.g., the doctor) based on the patient data. In some embodiments, after the patient data is retrieved and presented on a display interface of the interactive device (e.g., the doctor terminal 270, and the intelligent display terminal in a doctor's office) , the doctor may assess the health status and condition of the patient based on the patient data, and input the condition assessment result via the interactive device (e.g., input in voice, via the interface, etc. ) . For example, the doctor may access the patient data in the UHR via the intelligent display terminal of the doctor workstation, locate the lesion via tools provided by the UHR, and view a location and size of the lesion, and the relationship with the surrounding tissues of the lesion, so as to intuitively understand the condition of the patient and conduct preoperative assessment. A sensing device (e.g., a microphone, a gesture sensor, etc. on a second terminal device worn by the doctor) may capture input information (e.g., voice information and gesture information) , etc., of the doctor to generate the condition assessment result.
The intelligent assessment refers to that the condition assessment result is determined by processing the patient data using a condition assessment model. The condition assessment model may be a machine
learning model. An input of the condition assessment model may include the patient data, and an output of the condition assessment model may include the condition assessment result. In some embodiments, the condition assessment model may be integrated into the processing device 210. After the processing device 210 determines the condition assessment result of the patient using the condition assessment model, the condition assessment result may be output and presented to the doctor via the interactive terminal (e.g., doctor terminal 270, the smart display screen, etc. ) of the doctor workstation. In some embodiments, the processing device 210 may update the condition assessment result based on the feedback information (e.g., the confirmation information or the modification information on the condition assessment result, etc. ) of the doctor on the presented condition assessment result.
The patient data refers to physiological or pathological information related to the patient. The patient data may include personal data of the patient, historical diagnosis and treatment data, medical examination data, a digital twin of the patient (e.g., a 3D anatomical model of the patient) , etc.
The patient personal data refers to basic information of the patient. The patient personal data may include gender, age, height, weight, etc., of the patient. For example, the patient personal data may be "male, age 39, height 174cm, and weight 73kg" .
The historical diagnosis and treatment data refers to data related to treatment and care before a current surgery of the patient or a current moment. The historical diagnosis and treatment data may include a medical record, a historical treatment record (e.g., historical surgery records, historical chemotherapy records, etc. ) , a hospitalization care record (e.g., medical order records, nursing records, medication records, etc. during hospitalization) , a historical medication record (e.g., medication records before the current hospitalization, etc. ) of the patient, etc.
The medical examination data refers to data reflecting results of medical examinations performed on the patient. The medical examinations may include blood detection, biochemical detection, urine detection, immunological detection, microbiological detection, allergen detection, imaging detection (e.g., a CT scan, an MR scan, a PET scan, an ultrasound scan, etc.), etc. The medical examination data may include a medical examination report, such as a blood detection record, a urine detection record, an immunological detection record, a CT record, an MR record, a PET record, etc.
The patient data may be obtained from a unified health record (UHR) or supplemented by the doctor. The UHR is an archive for storing and/or visualizing the patient data in various ways. For example, the patient personal data (e.g., after the first log-in) , physical examination records, the medical examination records, etc., may be stored in a folder (e.g., a folder named by a patient number, name, etc. ) corresponding to the patient in the UHR. For example, the doctor may initiate a patient data acquisition instruction by operating the interactive terminal (e.g., the doctor terminal 270, and the intelligent display terminal) of the doctor workstation, and the processing device 210 may retrieve the patient data of the current patient from the UHR based on the acquisition instruction and present the patient data of the current patient via the interactive terminal.
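The per-patient folder structure of the UHR described above might be modeled as follows; the class and method names and the storage layout are illustrative assumptions:

```python
class UnifiedHealthRecord:
    """Minimal sketch of a UHR: patient data filed in per-patient
    'folders' keyed by patient number, each folder grouping records
    by record type (e.g., personal data, medical examination records)."""
    def __init__(self):
        self._folders = {}

    def store(self, patient_no: str, record_type: str, record: dict) -> None:
        """File a record under the patient's folder."""
        self._folders.setdefault(patient_no, {}).setdefault(record_type, []).append(record)

    def retrieve(self, patient_no: str) -> dict:
        """Return everything filed under the patient, as the processing
        device would when serving a doctor's acquisition instruction."""
        return self._folders.get(patient_no, {})

# Hypothetical usage with made-up patient number and records.
uhr = UnifiedHealthRecord()
uhr.store("P001", "personal_data", {"gender": "male", "age": 39})
uhr.store("P001", "medical_examination", {"type": "CT", "result": "..."})
patient_data = uhr.retrieve("P001")
```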
In some embodiments, the doctor may determine the surgery plan based on the condition assessment result. For example, the doctor may determine the surgery plan by determining the necessity of a surgery for the current patient, the most suitable surgery manner and technique, the surgery time and the estimated duration of the surgery, etc., based on the condition assessment result.
In some embodiments, when the surgery plan is formulated, the processing device 210 may generate the surgery plan by initiating an expert meeting.
In some embodiments, the processing device 210 may obtain the request for performing surgery planning for the target patient. The processing device 210 may determine an operation difficulty factor based on the patient data of the target patient. The processing device 210 may determine whether an expert meeting is required based on the operation difficulty factor. In response to determining that the expert meeting is needed, the processing device 210 may cause the doctor terminal to present a sixth interface element (e.g., a sixth interface element 760) for initiating the expert meeting. More descriptions regarding the embodiment may be found in FIG. 17 and related descriptions thereof.
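The decision logic above (difficulty factor from patient data, then a threshold test for the expert meeting) can be sketched as follows; the particular factors, weights, and threshold are hypothetical, as the disclosure does not specify them:

```python
def operation_difficulty_factor(patient_data: dict) -> float:
    """Toy operation difficulty score computed from patient data;
    the factors and weights below are assumptions for illustration."""
    score = 0.0
    if patient_data.get("stage") == "advanced":
        score += 0.5
    if patient_data.get("age", 0) >= 70:
        score += 0.3
    score += 0.2 * len(patient_data.get("comorbidities", []))
    return score

def expert_meeting_required(difficulty: float, threshold: float = 0.7) -> bool:
    """The interface element for initiating the expert meeting is
    presented only when the difficulty factor reaches a (hypothetical)
    threshold."""
    return difficulty >= threshold

# Hypothetical usage with made-up patient data.
d = operation_difficulty_factor(
    {"stage": "advanced", "age": 72, "comorbidities": ["diabetes"]})
show_meeting_element = expert_meeting_required(d)
```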
In some embodiments, as shown in FIG. 7, the interactive interface 431 may further include a seventh interface element 770 for patient management.
In some embodiments, in response to an interactive behavior (e.g., the doctor 271 clicks on the seventh interface element 770) between the doctor (e.g., the doctor 271) and the seventh interface element 770, the doctor terminal 270 may generate a request for accessing a preliminary admission record of the target patient and send the request to the processing device 210.
Patient management refers to management of data (e.g., the patient data, the surgery record, the admission record, the nursing record, the postoperative recovery record, etc. ) related to the patient.
In some embodiments, the processing device 210 may obtain a request for accessing the preliminary admission record of the target patient. In response to the request, the processing device 210 may cause the doctor terminal to present the preliminary admission record. The processing device 210 may update the preliminary admission record based on feedback information regarding the preliminary admission record input by the doctor via the doctor terminal. More descriptions regarding the embodiment may be found in FIG. 18 and related descriptions thereof.
In some embodiments, the interactive interface 431 may further include time schedule information relating to at least one pending task. For example, as shown in FIG. 7, the interactive interface 431 may include time schedule information of the ward rounds (planned start time may be 8:00), the pre-consultation/preview (planned start time may be 9:00), the consultation (planned start time may be 9:10), etc.
In some embodiments, the interactive interface 431 may further include one or more collapsible elements relating to one or more completed tasks. A collapsible element in the interactive interface may be reduced (e.g., reduced to 1/4 of an original size) or enlarged via human-computer interaction (e.g., clicking or selecting in voice, etc.). Accordingly, a collapsible element may include a folded status (a status when reduced) and an unfolded status (a status when enlarged). In the unfolded status, the collapsible element may present a completed task and time information (e.g., start time of the completed task) of the completed task. For example, as shown in FIG. 7, the interactive interface 431 may include a collapsible element 790 (in the unfolded status), the collapsible element 790 may present a ward round preparation task, and start time of the ward round preparation task may be 7:45. The doctor (e.g., the doctor 271) may click on the collapsible element 790 on the interactive interface 431, and the processing device 210 may cause the collapsible element 790 to enter the folded status (an area of the collapsible element 790 may be reduced and the completed task may not be presented any more) in response to the click operation.
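The folded/unfolded behavior of a collapsible element can be sketched as below; the class name, the render format, and the 1/4 size ratio default are illustrative (the ratio echoes the example above but is not mandated):

```python
class CollapsibleElement:
    """Sketch of a collapsible element for a completed task: a click
    toggles between an unfolded status (task and start time shown) and
    a folded status (area reduced, task no longer presented)."""
    def __init__(self, task: str, start_time: str, fold_ratio: float = 0.25):
        self.task, self.start_time = task, start_time
        self.fold_ratio = fold_ratio   # e.g., reduced to 1/4 of the original size
        self.folded = False            # initially in the unfolded status

    def on_click(self) -> None:
        self.folded = not self.folded

    def render(self) -> dict:
        if self.folded:
            return {"size": self.fold_ratio, "content": None}
        return {"size": 1.0, "content": f"{self.task} (started {self.start_time})"}

# Hypothetical usage: the doctor clicks the element, which folds it.
elem = CollapsibleElement("ward round preparation", "7:45")
elem.on_click()
folded_view = elem.render()
```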
In some embodiments, the configuration of the interactive interface 431 may be determined based on preference information of the doctor.
The preference information reflects preference of the doctor (e.g., the doctor 271) for the content, specifications (including shape, color, font, etc. ) and/or position of the elements (e.g., the first interface element, the second interface element, etc. ) on the interface. For example, if the preference information is that "the interface elements are circular, the font is regular script, and the spacing is 5 mm" , then the processing device 210 may set the shape of each of the interface elements on the interactive interface 431 to be circular, the font contained in each of the interface elements to be regular script, and the spacing between the interface elements to be 5 mm.
In some embodiments, the preference information may also reflect the preference of the doctor (e.g., the doctor 271) for an interface style. The interface style may include a simple style and a detailed style.
The simple style may include fewer interface elements than the detailed style. For example, the interactive interface 431 in the detailed style may include all the interface elements as shown in FIG. 7, and the interactive interface 431 in the simple style may only include the first interface element 711, the second interface element 720, the third interface element 730, the fourth interface element 740, the fifth interface element 750, and the seventh interface element 770 as shown in FIG. 7.
In some embodiments, the preference information may include whether to display the virtual character 520 as shown in FIG. 5. Further, the preference information may include an image preference for the virtual character 520. For example, a character image of the virtual character 520 may be a cartoon character, a cartoon animal character, etc. The preference information may include the doctor's (e.g., the doctor 271) selection of a specific character image.
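Applying the preference information (element specifications, interface style, element set) to the interactive interface might look like the following sketch; the function signature and the preference field names are assumptions for illustration:

```python
def configure_interface(elements: list, preference: dict) -> dict:
    """Apply the doctor's preference information (shape, font, spacing,
    and interface style) to the interactive interface configuration.
    In the simple style, only the elements listed in the preference
    are kept; the detailed style keeps all elements."""
    if preference.get("style") == "simple":
        keep = preference.get("simple_elements", elements)
        elements = [e for e in elements if e in keep]
    return {
        "elements": elements,
        "shape": preference.get("shape", "rectangular"),
        "font": preference.get("font", "default"),
        "spacing_mm": preference.get("spacing_mm", 5),
    }

# Hypothetical usage mirroring the simple-style example above
# (element 712 and 713 are omitted in the simple style).
cfg = configure_interface(
    ["711", "712", "713", "720", "730", "740", "750", "770"],
    {"style": "simple",
     "simple_elements": ["711", "720", "730", "740", "750", "770"],
     "shape": "circular", "font": "regular script", "spacing_mm": 5},
)
```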
FIG. 9 is a schematic diagram illustrating an exemplary ward round service interface according to some embodiments of the present disclosure.
In some embodiments, as shown in FIG. 9, the ward round service interface 7110 may include a first interface element 7111 for applying to participate in ward rounds remotely. In response to an interaction (e.g., clicking or selecting the first interface element 7111 in voice) between a doctor (e.g., the doctor 271) and the first interface element 7111, the doctor terminal (e.g., the doctor terminal 270) may generate a request for applying to participate in the ward rounds remotely and send the request to the processing device 210.
In some embodiments, as shown in FIG. 9, the ward round service interface 7110 may further include a first interface element 7112 for accessing patient data of patients to be visited in the ward rounds. In response to an interaction (e.g., clicking or selecting the first interface element 7112 in voice) between the doctor (e.g., the doctor 271) and the first interface element 7112, the doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the patient data of the patient to be visited in the ward rounds and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230 or the UHR) the patient data and send the patient data to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the patient data to the doctor.
More descriptions regarding the patient data may be found in FIG. 7 and related descriptions thereof.
In some embodiments, as shown in FIG. 9, the ward round service interface 7110 may further include a first interface element 7113 for accessing a preliminary ward round record relating to the ward
rounds. In response to an interaction (e.g., clicking or selecting the first interface element 7113 in voice) between the doctor (e.g., the doctor 271) and the first interface element 7113, the doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the preliminary ward round record and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the preliminary ward round record and send the preliminary ward round record to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the preliminary ward round record to the doctor.
In some embodiments, the preliminary ward round record may be generated based on sensed information (hereinafter referred to as first sensed information) collected by one or more sensing devices in the hospital ward during the ward rounds.
The one or more sensing devices in the hospital ward may include an image sensing device, a sound sensing device, a light intensity sensing device, etc., in the hospital ward.
The first sensed information refers to information collected by the one or more sensing devices during the ward rounds. For example, the first sensed information collected by the image sensing device may be image information inside the ward, and the first sensed information collected by the sound sensing device may be sound information (e.g., voice of the patient, voice of a ward round participant (e.g., doctors and/or nurses in the ward) , interactive voice between the patient and the ward round participant, etc. ) inside the ward.
In some embodiments, the processing device 210 may obtain a ward round record template. For example, the ward round record template may be preset and stored in the storage device 230, and the processing device 210 may retrieve the ward round record template from the storage device 230. In some embodiments, the processing device 210 may generate a preliminary ward round record by filling in the ward round record template based on the first sensed information. For example, the processing device 210 may perform face recognition on the ward round participants based on image information of personnel in the ward in the first sensed information to determine identity information (e.g., the name) of the ward round participants and fill the identity information of the ward round participants in the ward round record template. As another example, the processing device 210 may perform voice recognition based on the voice information of the doctor in the ward in the first sensed information to determine the content of the doctor order, and fill the doctor order in the ward round record template.
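The template-filling step can be sketched as follows, assuming face recognition and voice recognition have already been run on the first sensed information (the template fields and sensed-data layout are illustrative):

```python
def fill_ward_round_template(template: dict, sensed: dict) -> dict:
    """Generate a preliminary ward round record by filling a preset
    template from first sensed information: participant identities from
    face recognition and doctor orders from voice recognition."""
    record = dict(template)  # keep preset fields (e.g., the ward name)
    record["participants"] = sensed.get("recognized_faces", [])
    record["doctor_orders"] = sensed.get("recognized_speech", {}).get("doctor_orders", [])
    return record

# Hypothetical usage with made-up recognition results.
template = {"participants": [], "doctor_orders": [], "ward": "ward 5"}
sensed = {
    "recognized_faces": ["Dr. Li", "Nurse Wang"],
    "recognized_speech": {"doctor_orders": ["increase IV drip to 60 ml/h"]},
}
preliminary_record = fill_ward_round_template(template, sensed)
```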
In some embodiments, after viewing the preliminary ward round record through the doctor terminal (e.g., the doctor terminal 270) , the doctor may further input (e.g., voice input) feedback information (e.g., the confirmation information or the modification information) on the preliminary ward round record via the doctor terminal (e.g., the doctor terminal 270) to confirm or modify the preliminary ward round record, thereby obtaining the ward round record.
FIG. 10 is a flowchart illustrating an exemplary process of participating in ward rounds remotely according to some embodiments of the present disclosure. As shown in FIG. 10, in some embodiments, a process 1000 may include the following operations. In some embodiments, the process 1000 may be performed by the processing device 210.
In 1010, a request to participate in ward rounds remotely input by a doctor may be obtained from a doctor terminal. More descriptions regarding the request to participate in the ward rounds remotely may be found in FIG. 9 and related descriptions thereof.
In 1020, in response to detecting that the ward rounds are conducted in a hospital ward, sensed information (e.g., the first sensed information) collected by one or more sensing devices in the hospital ward during the ward rounds may be obtained. More descriptions regarding the first sensed information may be found in FIG. 9 and related descriptions thereof.
In 1030, a virtual ward space may be generated based on the first sensed information and patient data.
The virtual ward space refers to a virtual reality (VR) representation of a ward. The virtual ward space may be a virtualized result of a real ward and/or people or objects in the ward. The virtual ward space may include a virtual bed, a virtual patient, a virtual device in the ward (e.g., a virtual electrocardiograph, a virtual blood pressure monitor, etc.), etc.
In some embodiments, when a doctor (e.g., the doctor 271) participates in the ward round remotely, on-site ward round personnel (e.g., doctors and nurses participating in on-site ward rounds) may exist in the real ward corresponding to the virtual ward space. Accordingly, the first sensed information may include image information and voice information inside the physical ward, such that the virtual ward space may also include virtual on-site ward round personnel. The doctor (e.g., the doctor 271) participating in the ward round remotely may obtain the virtual personnel (e.g., the virtual on-site ward round personnel, and virtual patients) and corresponding voice information (e.g., voice information of the patients, voice information of the on-site ward round personnel, etc. ) via the doctor terminal 270 (e.g., the second XR device 270-2) .
In some embodiments, the processing device 210 may construct a 3D model (i.e., the virtual ward space) corresponding to a real ward space at a preset ratio (e.g., a volume ratio of an object in the virtual ward space to an object in the real ward may be 1:100) based on the first sensed information (e.g., an internal image of the ward) using a technique such as 3D modeling. For example, the processing device 210 may construct a 3D model of the real ward round personnel at a preset ratio (e.g., a volume ratio of the virtual ward round personnel to the real ward round personnel may be 1:100) using a technique such as 3D modeling. The processing device 210 may construct a 3D model of the device in the ward at a preset ratio (e.g., a volume ratio of the virtual ward devices to the real ward devices may be 1:100) using a technique such as 3D modeling.
In some embodiments, the processing device 210 may generate the virtual ward space based on the patient data and the first sensed information. For example, the processing device 210 may construct virtual bodies (e.g., the virtual beds, the virtual patients, the virtual ward devices, the virtual on-site ward round personnel, etc. ) in the virtual ward space based on the above method. Accordingly, the processing device 210 may construct a virtual display screen for presenting the patient data in the virtual ward space based on the patient data (e.g., pathological data, medical examination data, etc. of the patients) . The virtual display screen may present the patient data for the doctor participating in the ward rounds remotely to view.
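The preset-ratio scaling described above can be sketched as simple arithmetic. This is a minimal sketch under the assumption that real object dimensions are available as (length, width, height) in meters; a production system would use a full 3D modeling pipeline instead.

```python
# Volume ratio of a virtual object to the corresponding real object (1:100),
# as in the example above.
PRESET_RATIO = 1 / 100

def scale_to_virtual(real_dims):
    """Scale real-world dimensions so the virtual volume is 1/100 of real."""
    # A 1:100 volume ratio corresponds to scaling each linear dimension by
    # the cube root of the ratio.
    linear = PRESET_RATIO ** (1 / 3)
    return tuple(d * linear for d in real_dims)

# A hypothetical real hospital bed, dimensions in meters.
bed = scale_to_virtual((2.0, 1.0, 0.5))
```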
In 1040, the doctor terminal may be caused to present the virtual ward space. For example, the processing device 210 may send the generated virtual ward space to the second XR device 270-2 and control the second XR device 270-2 to present the virtual ward space.
FIG. 11A is a schematic diagram illustrating an exemplary consultation service interface according to some embodiments of the present disclosure. FIG. 11B is a schematic diagram illustrating an exemplary interface for presenting patient data of consultation patients according to some embodiments of the present disclosure. FIG. 11C is a schematic diagram illustrating an exemplary interface for presenting past multimodal data of consultation patients according to some embodiments of the present disclosure.
In some embodiments, as shown in FIG. 11A, the consultation service interface 7120 may include a first interface element 7121 for accessing patient data of one or more patients who have booked a consultation service. In response to an interaction (e.g., clicking or selecting the first interface element 7121 in voice) between a doctor (e.g., the doctor 271) and the first interface element 7121, a doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the patient data of the patients who have booked the consultation service and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the patient data and send the patient data to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the patient data to the doctor.
In some embodiments, as shown in FIG. 11B, the patient data of the patients who have booked the consultation service presented by the doctor terminal 270 may include consultation dates (e.g., December 29, 2022 as shown in FIG. 11B) , numbers, names, attending physicians, medical insurance time, contact numbers, last consultation time, etc., of the patients who have booked the consultation services.
In some embodiments, the processing device 210 may obtain, from the doctor terminal, a request for accessing patient data of a target patient among the patients. The processing device 210 may generate a virtual character representing the target patient based on the patient data of the target patient (e.g., one of the patients who have booked the consultation service). The processing device 210 may cause the doctor terminal to present the virtual character to explain the patient data of the target patient to the doctor. More descriptions regarding the embodiment may be found in FIG. 12 and related descriptions thereof.
In some embodiments, the consultation service interface 7120 may further include a first interface element 7122 for accessing a preliminary diagnosis record relating to the consultation service. In response to an interaction (e.g., clicking or selecting the first interface element 7122 in voice) between the doctor (e.g., the doctor 271) and the first interface element 7122, the doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the preliminary diagnosis record and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the preliminary diagnosis record and send the preliminary diagnosis record to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the preliminary diagnosis record to the doctor.
In some embodiments, the preliminary diagnosis record may be generated based on sensed information (hereinafter referred to as second sensed information) collected by one or more sensing devices in a consultation room during the consultation service.
The second sensed information refers to information collected by the one or more sensing devices in the consultation room. For example, when the sensing device is a sound sensing device, the second sensed information may be voice information of the doctor and the patient in the consultation room during the consultation service.
In some embodiments, the preliminary diagnosis record may be automatically generated. In some embodiments, the preliminary diagnosis record may include a preliminary patient medical history, a preliminary diagnosis opinion, a preliminary diagnosis prescription (e.g., a preliminary treatment prescription and a preliminary examination prescription) , a preliminary doctor order, or the like, or a combination thereof.
In some embodiments, the preliminary diagnosis record may be generated by an intelligent agent corresponding to the consultation service. The intelligent agent may learn a generation mechanism of the diagnosis record from various data such as a diagnosis record template, a knowledge dictionary, a knowledge database, etc., and process the second sensed information and the patient data based on the generation mechanism to generate the diagnosis record.
The intelligent agent refers to a program (e.g., a machine learning model with a self-evolving capability) that can make decisions or provide services based on environmental information (e.g., the second sensed information) , user input (e.g., voice information input by the doctor) , and/or empirical data. The program may be configured to autonomously collect information and make decisions on a regular basis, at a scheduled time, or in real time when prompted by a user.
In some embodiments, the processing device 210 may extract a key content from sensed data based on the diagnosis record template.
The diagnosis record template is used to define a format and/or content of the diagnosis record. For example, the diagnosis record template may contain a plurality of template fields arranged in a specific format, and these template fields may represent the content to be included in the diagnosis record. In some embodiments, the diagnosis record template may include the template fields such as the patient medical history (including basic information of the patient, description of patient symptoms, physical examination data, etc. ) , the diagnosis opinion, the diagnosis prescription (e.g., the treatment prescription and the examination prescription) , the doctor order, or the like.
The key content refers to content relating to a template field in the diagnosis record template. In some embodiments, the second sensed information may include a voice signal. Since the voice signal records a conversation between the patient and the doctor, the key content extracted from the voice signal may include content in natural language form. Specifically, the processing device 210 may transcribe the voice signal into a text, and extract the key content from the transcribed text based on the plurality of template fields in the diagnosis record template. For example, the key content "take a CT scan of the leg" may be extracted from the transcribed text based on the template field "examination prescription" .
In some embodiments, the processing device 210 may extract the key content from the physical examination data collected by a physical examination device during a consultation process. The physical examination device refers to a device used in the consultation room to perform physical examination on the patient, such as a blood pressure monitor, etc. For example, when the diagnosis record template contains a template field "blood pressure" , a blood pressure value of the patient may be extracted from the data collected by the blood pressure monitor as the key content. The physical examination data refers to data of the patient collected by the physical examination device. For example, taking the physical examination device as the blood pressure monitor for an example, the physical examination data of the patient may be the blood pressure data of the patient.
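The extraction of key content from the transcribed text and from physical examination data can be sketched as follows. The template fields, the keyword cues used to locate content in the text, and the data structures are hypothetical assumptions for illustration; a real system might use NLP models instead of keyword matching.

```python
# Hypothetical cues that map template fields to phrases in the transcript.
FIELD_CUES = {
    "examination prescription": ["take a", "scan", "x-ray"],
    "blood pressure": [],  # filled from physical examination data, not text
}

def extract_key_content(transcript, exam_data):
    """Extract content for each template field from text and device data."""
    key_content = {}
    for sentence in transcript.split(". "):
        for field, cues in FIELD_CUES.items():
            if any(cue in sentence.lower() for cue in cues):
                key_content[field] = sentence.strip(". ")
    # Physical examination data (e.g., from a blood pressure monitor)
    if "blood_pressure" in exam_data:
        key_content["blood pressure"] = exam_data["blood_pressure"]
    return key_content

content = extract_key_content(
    "The pain is in the left leg. Please take a CT scan of the leg.",
    {"blood_pressure": "120/80 mmHg"},
)
```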
In some embodiments, the processing device 210 may convert the key content into professional content based on the knowledge dictionary.
The knowledge dictionary refers to a reference dictionary between natural language descriptions and professional language descriptions. Specifically, the processing device 210 may use the key content as an index to retrieve the corresponding professional language from the knowledge dictionary as the professional content. For example, the key content "take a CT scan of the leg" may be converted into the professional content "CT scan of the leg" based on the knowledge dictionary.
In some embodiments, the processing device 210 may convert the key content into the professional content based on a term conversion model. The term conversion model may be a machine learning model. The processing device 210 may obtain the professional content by processing the key content using the term conversion model. An input of the term conversion model may include the key content, and an output of the term conversion model may include the professional content.
In some embodiments, different departments may correspond to different knowledge dictionaries and/or term conversion models, and the professional content may be generated using a knowledge dictionary and/or a term conversion model corresponding to a registration department of the patient.
In some embodiments, the processing device 210 may generate the preliminary diagnosis record by updating the diagnosis record template based on the professional content and the knowledge database.
The knowledge database refers to a knowledge database of the registration department, which includes a consultation specification (e.g., a symptom description specification, a diagnosis specification, a prescription specification, a doctor order specification, etc. ) of the department. In some embodiments, the processing device 210 may assess and/or adjust the professional content according to the knowledge database to make the professional content conform to the consultation specification of the department. Further, the processing device 210 may fill in the corresponding template fields in the diagnosis record template with assessed or adjusted professional content, thereby generating the preliminary diagnosis record. In some embodiments, the processing device 210 may further generate the preliminary diagnosis record based on an electronic health record of the patient. For example, the processing device 210 may search for content corresponding to the template fields in the electronic health record, assess and/or adjust the content according to the knowledge database, and fill in the corresponding template fields in the diagnosis record template with the assessed or adjusted content.
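The convert-and-fill steps described above (knowledge dictionary lookup followed by filling the template fields) can be sketched as follows. The dictionary entries, field names, and fallback behavior are simplified assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical knowledge dictionary: natural-language key content mapped to
# professional language.
KNOWLEDGE_DICTIONARY = {
    "take a CT scan of the leg": "CT scan of the leg",
}

def generate_preliminary_record(key_content, template_fields):
    """Convert key content to professional content and fill the template."""
    record = {}
    for field in template_fields:
        raw = key_content.get(field)
        if raw is None:
            continue  # leave fields with no extracted content unfilled
        # Use key content as an index into the knowledge dictionary; fall
        # back to the raw content when no professional term is registered.
        record[field] = KNOWLEDGE_DICTIONARY.get(raw, raw)
    return record

record = generate_preliminary_record(
    {"examination prescription": "take a CT scan of the leg"},
    ["patient medical history", "diagnosis opinion", "examination prescription"],
)
```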
In some embodiments, as shown in FIG. 11C, the preliminary diagnosis record presented by the doctor terminal 270 may include a number, a name, an age, a preliminary diagnosis result, an examination image (e.g., a liver CT image in FIG. 11C), ECG data, EEG data, examination results of multiple indicators (e.g., Glucose, BUN, etc. in FIG. 11C), and a normal range corresponding to each examination item (e.g., a normal range of Glucose and BUN in FIG. 11C) of the patient, etc.
FIG. 12 is a flowchart illustrating an exemplary process of viewing patient data according to some embodiments of the present disclosure. As shown in FIG. 12, in some embodiments, a process 1200 may include the following operations. In some embodiments, the process 1200 may be performed by the processing device 210.
In 1210, a request for accessing patient data of a target patient among patients may be obtained from a doctor terminal. The target patient may be a patient to be visited during ward round, or a patient who has booked a consultation service. More descriptions regarding the request for accessing the patient data may be found in FIG. 9 or FIG. 11A and related descriptions thereof.
In 1220, a virtual character (hereinafter referred to as a patient virtual character) representing the target patient may be generated based on the patient data of the target patient.
The patient virtual character refers to a VR image of the target patient. The patient virtual character may be used to explain physical features (e.g., height, weight, a body outline, etc.) and a lesion condition (e.g., a location, a size, and a shape of a lesion, etc.) of the patient.
In some embodiments, the processing device 210 may construct a 3D model (i.e., the patient virtual character) corresponding to the target patient at a preset ratio (e.g., a ratio of a volume of the patient virtual character to an actual volume of the target patient may be 1: 10) based on the patient data of the target patient using a technique such as 3D modeling.
In 1230, the doctor terminal may be caused to present the virtual character to explain the patient data of the target patient to the doctor.
The processing device 210 may send the patient virtual character to the second XR device 270-2, and cause the second XR device 270-2 to present the patient virtual character. Meanwhile, the processing device 210 may cause the patient virtual character presented on the second XR device 270-2 to explain the patient data of the target patient in a preset form (e.g., a voice form or a video form, etc. ) . For example, the patient virtual character may describe basic information, historical diagnosis and treatment data, and/or current symptoms of the corresponding patient in the voice form.
In some embodiments, the processing device 210 may cause the patient virtual character to interact with the doctor in voice to answer the doctor's questions. For example, the doctor may ask questions after the explanation, and the second XR device 270-2 may collect question information of the doctor and send the question information to the processing device 210. The processing device 210 may determine answer information by performing voice recognition on the question information, send the answer information to the second XR device 270-2, and cause the patient virtual character to express the answer information to the doctor. For example, taking the doctor's question "How many surgeries have you undergone" as an example, the processing device 210 may recognize the voice, search for a surgery record in the patient data, and determine a count of surgeries on the patient. Then, the processing device 210 may send the count of surgeries on the patient to the second XR device 270-2, and cause the patient virtual character to inform the doctor of the count of surgeries on the patient in the voice form.
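The surgery-count example above can be sketched as a simple lookup. The surgery-record structure and the question-matching rule are hypothetical assumptions; a real system would use full voice recognition and a richer answer mechanism.

```python
def answer_question(question, patient_data):
    """Answer a recognized doctor question from the patient data."""
    if "how many surgeries" in question.lower():
        # Count the surgery records found in the patient data.
        count = len(patient_data.get("surgery_records", []))
        return f"I have undergone {count} surgeries."
    return "I do not have that information."

patient_data = {"surgery_records": [{"type": "appendectomy"}, {"type": "knee"}]}
reply = answer_question("How many surgeries have you undergone?", patient_data)
```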
In some embodiments, the above explanatory content may be obtained by the processing device 210 by processing the patient data of the target patient based on a language model. For example, the processing device 210 may obtain one or more template fields (e.g., basic information, symptoms, operation records, medical examination reports of the patient, etc. ) in an explanatory content template, retrieve content corresponding to each of the template fields from the patient data of the target patient, and generate corresponding voice information or video information based on the above retrieval content, thereby obtaining the explanatory content.
In some embodiments, the processing device 210 may update the explanatory content of the patient virtual character based on real-time data (e.g., real-time actions, voice, diagnosis information of the patient, etc. ) of the patient. For example, the doctor 271 may ask questions to the patient virtual character via the second XR device 270-2, and the processing device 210 may send the question content to a first XR device worn by the target patient corresponding to the patient virtual character and obtain real-time feedback information of the target patient, so as to cause the patient virtual character to answer the questions based on the real-time feedback information.
FIG. 13 is a flowchart illustrating an exemplary process of remote consultation according to some embodiments of the present disclosure. As shown in FIG. 13, in some embodiments, a process 1300 may include the following operations. In some embodiments, the process 1300 may be performed by the processing device 210.
In 1310, a request for entering a virtual consultation room to provide a remote consultation service to a target patient may be obtained from a doctor terminal.
The virtual consultation room refers to a VR consultation room. The virtual consultation room may be a virtualized result of a real consultation room and objects in the consultation room. The virtual consultation room may include a virtual consultation bed, a virtual consultation chair, a virtual consultation device, etc., or a combination thereof.
In some embodiments, the processing device 210 may construct a 3D model (i.e., the virtual consultation room) of the real consultation room according to a preset ratio (e.g., a volume ratio of the object in the virtual consultation room to the object in the real consultation room may be 1: 100) using a technique such as 3D modeling.
In 1320, the doctor terminal may be caused to present a 3D patient model of the target patient.
The 3D patient model refers to a virtual model corresponding to an entire body of the patient or a portion (e.g., upper body) of the body of the patient. Merely by way of example, the processing device 210 may obtain a preliminary 3D patient model from an electronic health record of the patient, and update the preliminary 3D patient model based on real-time dynamic data and physiological data of the patient to obtain the 3D patient model. Further, the processing device 210 may cause the doctor terminal to present the 3D patient model to the doctor. For example, the processing device 210 may cause the XR device 270-2 to present the 3D patient model in a doctor's field of view. The doctor's field of view may be a real field of view within a sight range of the doctor, or may be a virtual background (e.g., the virtual consultation room) .
In 1330, an examination instruction input by a doctor via interacting with the 3D patient model may be obtained from the doctor terminal.
The examination instruction of the doctor refers to an instruction to check vital signs of the patient. The examination instruction may include an examination site, an examination device, and/or an examination operation. The doctor may input the examination instruction in various ways such as a voice, a gesture, and an operation input device (e.g., smart gloves, a smart handle, etc. ) . For example, the XR device 270-2 may present virtual examination devices corresponding to various examination devices. The doctor may select the virtual examination device via the input device and perform a virtual examination operation on the 3D patient model using the virtual examination device. The processing device 210 may determine the examination site, the examination device, the examination operation, etc., based on the virtual examination operation performed by the doctor, thereby generating the examination instruction.
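Assembling an examination instruction from the doctor's virtual examination operation, as described above, can be sketched as follows. The field names of the virtual operation and the instruction are assumptions for illustration only.

```python
def build_examination_instruction(virtual_operation):
    """Derive site, device, and operation from the doctor's virtual action."""
    return {
        "examination_site": virtual_operation["target_region"],
        "examination_device": virtual_operation["selected_device"],
        "examination_operation": virtual_operation["gesture"],
    }

# A hypothetical virtual examination operation captured from the XR device.
instruction = build_examination_instruction({
    "target_region": "left arm",
    "selected_device": "blood pressure monitor",
    "gesture": "wrap cuff and inflate",
})
```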
In addition to the examination instruction, the doctor may also input other instructions, such as an operation instruction to instruct the XR device 270-2 to rotate, enlarge, reduce, etc., the 3D patient model.
In 1340, a wearable device worn by the target patient may be caused to collect measurement data of the target patient based on the examination instruction.
The wearable device may be a wearable examination device. For example, the wearable device may be a blood pressure monitor, an electrocardiogram belt, etc. Accordingly, the measurement data of the target patient collected by the wearable device may be blood pressure data, electrocardiogram data, etc., of the patient. More descriptions regarding the measurement data may be found in FIG. 11A and related descriptions thereof.
FIG. 14 is a schematic diagram illustrating an exemplary interface of performing a surgery service according to some embodiments of the present disclosure.
In some embodiments, as shown in FIG. 14, the surgery service interface 7140 may include a first interface element 7141 for accessing patient data relating to a target patient (i.e., a patient for whom the surgery is performed) for surgery. In response to an interaction (e.g., clicking or selecting the first interface element 7141 in voice) between a doctor (e.g., the doctor 271) and the first interface element 7141, a doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the patient data relating to the target patient for a surgery and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the patient data and send the patient data to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the patient data to the doctor.
In some embodiments, the patient data of the patient may include data relating to a target surgery plan corresponding to the target patient. For example, the patient data of the patient may include content of the target surgery plan. The target surgery plan refers to a final surgery plan (i.e., the surgery plan adopted when the surgery is performed on the patient) . For example, the target surgery plan may be an optimal (e.g., a surgery plan that causes the least harm to the patient) surgery plan selected from a plurality of surgery plans. More descriptions regarding the patient data and the surgery plan may be found in FIG. 7 and related descriptions thereof.
In some embodiments, as shown in FIG. 14, the surgery service interface 7140 may further include a first interface element 7142 for updating a doctor order for the target patient.
For example, in response to an interaction (e.g., clicking or selecting the first interface element 7142 in voice) between the doctor (e.g., the doctor 271) and the first interface element 7142, the doctor terminal (e.g., the doctor terminal 270) may generate a request for updating the doctor order for the target patient and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the doctor order for the target patient and send the doctor order to the doctor terminal (e.g., the doctor terminal 270), and then cause the doctor terminal (e.g., the doctor terminal 270) to present the doctor order for the target patient to the doctor. Then, the doctor terminal (e.g., the doctor terminal 270) may receive modification information regarding the doctor order input (e.g., a text input, a voice input, etc.) by the doctor and send the modification information to the processing device 210, and the processing device 210 may update the doctor order for the target patient according to the modification information. For example, if the modification information input by the doctor is "modify the frequency of medication to three times a day" , the processing device 210 may update the frequency of medication in the doctor order to three times a day according to the information.
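Applying the doctor's modification information to a stored doctor order, as in the example above, can be sketched as a simple field update. The order fields and the already-parsed modification are hypothetical assumptions; parsing the voice or text input itself is not shown.

```python
def update_doctor_order(order, field, new_value):
    """Return a copy of the doctor order with one field updated."""
    updated = dict(order)
    updated[field] = new_value
    return updated

order = {"medication": "amoxicillin", "frequency": "twice a day"}
# Assumed to be parsed from "modify the frequency of medication to three
# times a day".
updated_order = update_doctor_order(order, "frequency", "three times a day")
```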
In some embodiments, as shown in FIG. 14, the surgery service interface 7140 may further include a first interface element 7143 for accessing a preliminary surgery record of the surgery. In response to an interaction (e.g., clicking or selecting the first interface element 7143 in voice) between the doctor (e.g., the doctor 271) and the first interface element 7143, the doctor terminal (e.g., the doctor terminal 270) may generate a request for accessing the preliminary surgery record of the surgery and send the request to the processing device 210. In response to the request, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the preliminary surgery record and send the preliminary surgery record to the doctor terminal (e.g., the doctor terminal 270), and then cause the doctor terminal (e.g., the doctor terminal 270) to present the preliminary surgery record to the doctor.
The preliminary surgery record refers to original record data of information relating to the surgery and events that occurred during the surgery. For example, the preliminary surgery record may include the information relating to the surgery, a record relating to the patient, a record relating to participants, etc.
The information relating to the surgery may include a start/end time of the surgery, a duration of the surgery, a risk that occurred, the types and quantity of surgery tools before the surgery, the types and quantity of the surgery tools after the surgery (to prevent surgery tools from being left in the patient's body), the types and quantity of surgery consumables before the surgery, the types and quantity of the surgery consumables after the surgery, etc. For example, the information relating to the surgery may be that "the surgery starts at 14:00, ends at 17:12, and lasts for 3 hours and 12 minutes; the patient has severe bleeding during the surgery; the surgery tools before the surgery include 2 scalpels and 2 hemostatic forceps, and after the surgery, the surgery tools are counted and include 2 scalpels and 2 hemostatic forceps; the surgery consumables before the surgery include 2 blood transfusion packs (each 400 ml), 1 dose of cardiac stimulant, and 5 packs of hemostatic gauze, and after the surgery, the surgery consumables are counted and include 0 blood transfusion packs remaining (indicating that the blood transfusion packs are used up), 1 dose of cardiac stimulant remaining (indicating that the cardiac stimulant is not used), and 1 pack of hemostatic gauze remaining" .
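The before/after tool count described above can be reconciled with a simple comparison, used to flag the possibility that a surgery tool was left in the patient's body. The data structure is an assumption for illustration.

```python
def check_tool_counts(before, after):
    """Return tool types whose post-surgery count differs from pre-surgery."""
    return {
        tool: (before[tool], after.get(tool, 0))
        for tool in before
        if before[tool] != after.get(tool, 0)
    }

# Counts from the example above: 2 scalpels and 2 hemostatic forceps both
# before and after the surgery.
before = {"scalpel": 2, "hemostatic forceps": 2}
after = {"scalpel": 2, "hemostatic forceps": 2}
discrepancies = check_tool_counts(before, after)  # empty dict: counts match
```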
The record relating to the patient may include a duration of anesthesia (a duration from the start of anesthesia to awakening), vital sign data (e.g., the heart rate, the blood pressure, the respiratory rate, etc.), the blood loss, etc., of the patient during the surgery. For example, the record relating to the patient may be in the form of a record sheet relating to the patient, which records the duration of anesthesia (e.g., 2 hours), the blood loss (e.g., 350 ml), and the heart rate, the blood pressure, the respiratory rate, etc., of the patient at multiple time points (e.g., recorded every 10 seconds during the surgery) during the surgery.
The record relating to the participants may include an action record, a force record, and a standing position record of a surgeon, and a standing position record, an action record, etc., of nursing staff (nurses), etc. The action record may include an action type of each action (e.g., cutting, pressing, lifting, etc.) of the surgery participants and a time corresponding to each action. The force record may include a force magnitude of the surgery participants at multiple moments (e.g., recorded every 1 second) during the surgery.
The standing position record may include standing positions of the surgery participants at multiple moments during the surgery. The action record may be obtained via an action recognition device. The force record may be obtained via a force sensing device.
In some embodiments, the preliminary surgery record may be generated based on sensed information (hereinafter referred to as third sensed information) collected by one or more sensing devices in an operation room during the surgery.
The third sensed information may include all types of data during the surgery, such as gesture data of the participants (including doctors and nurses participating in the surgery) , image data and voice data in the operation room, etc.
In some embodiments, the processing device 210 may generate the preliminary surgery record based on the third sensed information using a record generation model (e.g., a preset record template) .
FIG. 15 is a flowchart illustrating an exemplary process of preoperative education according to some embodiments of the present disclosure. As shown in FIG. 15, in some embodiments, a process 1500 may include the following operations. In some embodiments, the process 1500 may be performed by the processing device 210.
In 1510, a request for conducting preoperative education for a target patient may be obtained.
The preoperative education refers to explaining matters relating to a surgery to a patient before the surgery, for example, explaining to the patient the content of a surgery plan, the implementation process of the surgery plan at a surgery site of the patient, and the postoperative recovery process of the patient after the surgery plan is implemented.
In some embodiments, the request may be obtained from a second XR device and input by a doctor via interacting with the third interface element 730.
For example, in response to an interaction (e.g., clicking or selecting the third interface element 730 in voice) between the doctor (e.g., the doctor 271) and the third interface element 730, the second XR device may generate the request for conducting the preoperative education for the target patient and send the request to the processing device 210.
In 1520, in response to the request, explanatory materials for explaining a surgery plan of the target patient may be generated.
The explanatory materials may be used to explain information relating to the surgery plan, such as an explanatory description of the surgery plan, an implementation process of the surgery plan at a surgery site of the patient, a postoperative recovery process of the patient after the surgery plan is used, etc. In some embodiments, the explanatory materials may include text, picture, audio, or video materials.
In some embodiments, the explanatory materials may be generated based on a digital twin model (e.g., a 3D anatomical model) of the surgery site of the patient. For example, the processing device 210 may simulate a process and a result of the surgery (e.g., a size of a postoperative incision, etc. ) based on the surgery plan on the 3D anatomical model of the surgery site of the patient.
In some embodiments, the explanatory materials may include a surgery video. The surgery video may present a process of performing the surgery on the surgery site of the patient based on the surgery plan. For example, taking the surgery type in the surgery plan as an invasive surgery as an example, the surgery video may present an appearance of the surgery site of the patient before the surgery, a process of the surgery site being cut open with a scalpel, a process of a lesion being removed, a process of suturing the incision, an appearance of the surgery site of the patient after suturing, etc.
In some embodiments, the explanatory materials may reflect the postoperative recovery process of the patient. In some embodiments, the processing device 210 may generate explanatory materials in the form of texts, pictures, audios, or videos to present the postoperative recovery process of the patient.
In some embodiments, the surgery video may further present the postoperative recovery process of the patient.
In some embodiments, the processing device 210 or the preoperative preparation module 430 may predict the postoperative recovery process of the patient based on the patient data of the patient.
The recovery process may reflect the vital signs of the patient and/or a wound recovery progress after the surgery. In some embodiments, the recovery process may include a wound healing speed, whether the vital signs of the patient are normal, whether there are complications, and/or an expected recovery time. For example, the recovery process may be that "the wound healing speed of the patient is 1 mm/day, the postoperative vital signs are normal without complications, and the expected recovery time is 1 month. "
In some embodiments, the processing device 210 or the preoperative preparation module 430 may determine the recovery process by processing the patient data and the surgery plan using a recovery prediction model. The recovery prediction model may be a machine learning model. An input of the recovery prediction model may include the patient data and the surgery plan, and an output of the recovery prediction model may include the recovery process.
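The recovery prediction step described above can be sketched as follows. This is a minimal, illustrative stand-in only: the feature names, the toy rule-based formula, and the output fields are assumptions, whereas the disclosure contemplates a trained machine learning model taking the patient data and the surgery plan as input and producing the recovery process as output.

```python
# Hypothetical sketch of the recovery prediction model's interface.
# The rule-based formula below is a toy stand-in for a trained model.

def build_features(patient_data: dict, surgery_plan: dict) -> list:
    """Flatten patient data and surgery plan into a feature vector."""
    return [
        patient_data.get("age", 0),
        patient_data.get("bmi", 0.0),
        1 if surgery_plan.get("invasive") else 0,
        surgery_plan.get("incision_length_cm", 0.0),
    ]

def predict_recovery(patient_data: dict, surgery_plan: dict) -> dict:
    """Stand-in for the recovery prediction model: maps features to a
    structured recovery process (healing speed, complication risk, time)."""
    age, bmi, invasive, incision = build_features(patient_data, surgery_plan)
    healing_mm_per_day = 1.2 - 0.005 * age - 0.2 * invasive  # toy formula
    expected_days = (
        int(incision * 10 / max(healing_mm_per_day, 0.1)) if invasive else 7
    )
    return {
        "wound_healing_speed_mm_per_day": round(max(healing_mm_per_day, 0.1), 2),
        "expected_recovery_days": expected_days,
        "complication_risk": "low" if bmi < 30 and age < 65 else "elevated",
    }

process = predict_recovery(
    {"age": 40, "bmi": 24.5},
    {"invasive": True, "incision_length_cm": 3.0},
)
```

In a real deployment the formula would be replaced by an inference call to the trained recovery prediction model, with the same input/output contract.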
In 1530, a first XR device worn by the patient and a second XR device may be caused to simultaneously present the explanatory materials to the target patient and the doctor.
In some embodiments, the processing device 210 may send the explanatory materials to the first XR device worn by the patient and the second XR device worn by the doctor, respectively, and cause the first XR device and the second XR device to present the explanatory materials.
In some embodiments, the processing device 210 may send the explanatory materials to a third XR device worn by a family of the patient, and cause the third XR device to present the explanatory materials.
In some embodiments, after the surgery plan is determined, the processing device 210 may further obtain a first confirmation instruction and/or a second confirmation instruction. The first confirmation instruction refers to an instruction regarding the surgery plan input by the patient via the first XR device. The second confirmation instruction refers to an instruction regarding the surgery plan input by the family of the patient via the third XR device. In response to receiving the first confirmation instruction and the second confirmation instruction, the processing device 210 may cause the first XR device, the second XR device 270-2, and the third XR device to present a surgery consent form, respectively. The processing device 210 may obtain signature information of the surgery consent form from the first XR device, the second XR device 270-2, and the third XR device, respectively.
The first confirmation instruction and the second confirmation instruction may be confirmation information for the patient and the family of the patient to confirm that a currently determined surgery plan is a final surgery plan (i.e., the target surgery plan) . An input manner of the confirmation instruction may include a key input, a gesture input, a voice input, etc. For example, the patient may say "Confirm the surgery plan" to the first XR device, and the family of the patient may say "Confirm the surgery plan" to the third XR device. The first XR device and the third XR device may obtain a voice signal via a microphone disposed on the first XR device and the third XR device, respectively, and send the voice signals to the processing device 210.
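The gating logic above, where the consent form is presented only after both voice confirmations arrive, can be sketched as below. The phrase-matching rule is an illustrative assumption; the disclosure leaves the concrete voice recognition unspecified.

```python
# Illustrative sketch: present the consent form only after both the
# patient (first XR device) and the family member (third XR device)
# have issued a confirmation instruction. Phrase matching is assumed.

CONFIRM_PHRASE = "confirm the surgery plan"

def is_confirmation(transcribed_voice: str) -> bool:
    """Treat a transcribed voice input as a confirmation instruction
    if it contains the confirmation phrase."""
    return CONFIRM_PHRASE in transcribed_voice.lower()

def consent_form_ready(patient_voice: str, family_voice: str) -> bool:
    """Both confirmation instructions are required before the consent
    form is caused to be presented on the XR devices."""
    return is_confirmation(patient_voice) and is_confirmation(family_voice)
```

Key, gesture, and fingerprint inputs mentioned in the disclosure would feed the same boolean gate through their own recognizers.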
The surgery consent form refers to an electronic informed consent form regarding the surgery plan that the patient and the family of the patient need to sign simultaneously after the patient and the family of the patient communicate with the doctor to jointly determine the optimal surgery plan and the doctor has informed the patient and the family of the patient of precautions during the surgery.
In some embodiments, the patient, the doctor, and the family of the patient may view the surgery consent form and sign the surgery consent form via the first XR device, the second XR device 270-2, and the third XR device, respectively. For example, the processing device 210 may generate signature information
(e.g., a signed text or pattern) on the surgery consent form based on the signature information of the surgery consent form obtained from the first XR device, the second XR device 270-2, and the third XR device. In some embodiments, the patient may input the signature information based on a fingerprint input, and the processing device 210 may verify the identity of the patient based on the fingerprint input by the patient, and generate the signature information of the patient in response to determining that identity verification is correct. The patient may input a fingerprint via a fingerprint sensor of the first XR device or other terminal devices (e.g., a touch screen device used by the patient) .
FIG. 16 is a flowchart illustrating an exemplary process of surgical simulation according to some embodiments of the present disclosure. As shown in FIG. 16, in some embodiments, a process 1600 may include the following operations. In some embodiments, the process 1600 may be performed by the processing device 210.
In 1610, a request for simulating a target surgery may be obtained.
In some embodiments, the request may be obtained from the second XR device 270-2 and input by a doctor via interacting with the fourth interface element 740.
For example, in response to an interaction (e.g., clicking or selecting the fourth interface element 740 in voice) between a doctor (e.g., the doctor 271) and the fourth interface element 740, the second XR device may generate the request for simulating a target surgery and send the request to the processing device 210.
In 1620, in response to the request, a virtual surgery scene corresponding to the target surgery may be generated.
The virtual surgery scene may be generated based on a digital twin technique and presented to the doctor 271. The virtual surgery scene may be a virtual form of a real operation room.
The virtual surgery scene may include a virtual surgery site and/or virtual surgical equipment.
The virtual surgical equipment refers to a virtual form of real surgical equipment required for the surgery. For example, the virtual surgical equipment may include a virtual scalpel, a virtual hemostat, a virtual implant, a virtual blood transfusion pack, a virtual hemostatic gauze, etc.
The virtual surgery site refers to a virtual form of a part of the patient where surgery is required.
For example, the virtual surgery site may be a 3D anatomical model of a chest cavity of a patient and the organs in the chest cavity.
In some embodiments, the processing device 210 may generate the virtual surgery scene for surgical simulation based on a surgery plan of the target surgery. For example, the processing device 210 may determine the surgery site based on the surgery plan, and construct the 3D anatomical model corresponding to the surgery site at a preset ratio (e.g., a volume ratio of the real surgery site to the 3D anatomical model may be 1:1) using a technique such as 3D modeling. The processing device 210 may determine a model and a specification of a target implant based on the surgery plan, and obtain a virtual target implant by selecting a target implant model from a database or generating a personalized target implant model. The processing device 210 may determine target surgery equipment to be used based on the surgery plan, and construct a 3D model of the target surgery equipment at a preset ratio (e.g., a volume ratio of the surgical equipment to the 3D model may be 1:1).
In 1630, the second XR device may be caused to present the virtual surgery scene to the doctor.
In some embodiments, the processing device 210 may send a generated virtual surgery scene to the
second XR device and cause the second XR device to present the virtual surgery scene.
In 1640, an interaction instruction with respect to the virtual surgery equipment input by the doctor via the second XR device or an interactive device corresponding to the virtual surgery equipment may be obtained.
The interactive device corresponding to the virtual surgical equipment refers to a device for sensing a behavior of a wearer (e.g., a doctor). For example, the interactive device may be a sensing wearable device. The sensing wearable device may sense an action and/or a gesture of the wearer (e.g., the doctor). For example, the sensing wearable device may include sensing gloves, a sensing bracelet, sensing clothing, etc.
The interaction instruction may reflect operation data of the doctor on the virtual surgery equipment. For example, the operation data on the virtual surgery equipment may include a type of the virtual surgery equipment used by the doctor 271, a fixed position, a movement direction, a movement range (e.g., a movement distance and angle) , and other data of the virtual surgery equipment.
In some embodiments, the interaction instruction may be a voice instruction, or an operation instruction (e.g., a key input, a gesture input, etc. ) input by the doctor via the second XR device or the interactive device (e.g., the sensing wearable device) . For example, the interaction instruction may be the operation instruction such as picking up, putting down, and moving the virtual surgery equipment by the doctor 271 via a sensing wearable device 271-3.
In some embodiments, the doctor 271 (e.g., a surgeon and/or a medical student) may enter a virtual surgery environment by wearing the second XR device 270-2, and perform operations on the 3D anatomical model of the patient in the virtual surgery scene via the second XR device 270-2 or the interactive device (e.g., the sensing wearable device) corresponding to the virtual surgery environment, thereby performing a simulated surgery in the virtual surgery environment.
In some embodiments, the doctor 271 (e.g., the surgeon and/or the medical student) may perform the simulated surgery by operating on the 3D anatomical model of the patient in the virtual surgery scene using real simulation equipment (e.g., a scalpel, an operating table, a forceps, etc.) in a physical space.
In 1650, the virtual surgery site and the virtual surgery equipment in the virtual surgery scene may be updated based on the interaction instruction.
Updating the virtual surgery site and the virtual surgery equipment in the virtual surgery scene means updating a form and/or a position of the virtual surgery site and the virtual surgery equipment based on the content of the interaction instruction after the interaction instruction is obtained. For example, for an interaction instruction of "lowering the scalpel tip until it contacts the skin, then moving downward to cut 1 cm into the skin, and then cutting 3 cm to the left," the processing device 210 may present a virtual incision 1 cm deep and 3 cm long in a left-right direction on a skin surface of the virtual surgery site in the virtual operation room. Assuming that the virtual scalpel is 5 cm above the skin of the virtual surgery site before the update, after the update the virtual scalpel may be 6 cm lower and 3 cm further to the left than before the update.
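The scalpel-position arithmetic in the example above can be expressed as a small geometry update. The coordinate convention (x increases rightward, z increases upward) and the function name are illustrative assumptions, not part of the disclosure.

```python
# Toy geometry for the scene-update example: the virtual scalpel starts
# 5 cm above the skin, descends to contact, cuts 1 cm deep, then moves
# 3 cm to the left. Coordinates and units are illustrative assumptions.

def apply_cut(scalpel_pos, height_above_skin_cm, depth_cm, lateral_cm):
    """Return the updated scalpel position (x, y, z) after the cut:
    z decreases by the approach height plus the cut depth, and x
    decreases by the leftward cutting distance."""
    x, y, z = scalpel_pos
    return (x - lateral_cm, y, z - (height_above_skin_cm + depth_cm))

# Scalpel tip initially 5 cm above the skin surface (skin at z = 0).
new_pos = apply_cut((0.0, 0.0, 5.0), height_above_skin_cm=5.0,
                    depth_cm=1.0, lateral_cm=3.0)
```

The total downward displacement of 6 cm (5 cm approach plus 1 cm cut) matches the worked example in the text.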
FIG. 17 is a flowchart illustrating an exemplary process of performing surgery planning according to other embodiments of the present disclosure. As shown in FIG. 17, in some embodiments, a process 1700 may include the following operations. In some embodiments, the process 1700 may be performed by the processing device 210.
In 1710, a request for performing surgery planning for a target patient may be obtained.
In some embodiments, the request may be obtained from a doctor terminal (e.g., the doctor terminal 270) and input by a doctor via interacting with the fifth interface element 750.
For example, in response to an interaction (e.g., clicking or selecting the fifth interface element 750 in voice) between the doctor (e.g., doctor 271) and the fifth interface element 750, the doctor terminal 270 may generate the request for performing surgery planning for the target patient and send the request to the processing device 210.
In some embodiments, when the processing device 210 obtains the request for performing surgery planning for the target patient, the processing device 210 may further obtain patient data of the target patient (e.g., retrieved from the storage device 230 or the UHR).
In 1720, an operation difficulty factor may be determined based on patient data of the target patient.
The operation difficulty factor reflects a difficulty degree of the surgery. The operation difficulty factor may be characterized based on an operation difficulty factor value. For example, the operation difficulty factor value may be an integer within a range of [1, 10] . The larger the value, the higher the operation difficulty factor (i.e., the more difficult the surgery) .
A determination manner of the operation difficulty factor may include manual determination and/or intelligent determination.
The manual determination refers to that the operation difficulty factor is determined by difficulty factor determination personnel (e.g., a doctor expert) based on the patient data. For example, the doctor may determine the operation difficulty factor based on surgery simulation and/or historical surgery experience via a second terminal device worn by the doctor or an intelligent display terminal of a doctor workstation.
The intelligent determination refers to determining the operation difficulty factor by processing the patient data using a factor determination model. The factor determination model may be a machine learning model. An input of the factor determination model may include the patient data, and an output of the factor determination model may include the operation difficulty factor value. In some embodiments, the factor determination model may be integrated in the processing device 210. After the processing device 210 determines the operation difficulty factor using the factor determination model, the output operation difficulty factor may be presented to the doctor via an interactive device (e.g., a second terminal device, an intelligent display terminal, etc.) of a doctor workstation. In some embodiments, the doctor may input modification information of the operation difficulty factor via an interactive device (e.g., a microphone, a gesture sensor, a touch screen, etc., on the second terminal device worn by the doctor) to modify the operation difficulty factor determined by the factor determination model (e.g., increase the operation difficulty factor value determined by the factor determination model by 2 as a final operation difficulty factor). In some embodiments, in order to ensure the accuracy of the operation difficulty factor, the processing device 210 may combine the intelligent determination with the manual determination. For example, the processing device 210 may first perform the intelligent determination and send a result of the intelligent determination to the factor determination personnel, and the factor determination personnel may confirm or modify the result of the intelligent determination to determine the final operation difficulty factor.
In 1730, whether an expert meeting is needed may be determined based on the operation difficulty factor.
In some embodiments, the processing device 210 may determine whether the expert meeting is needed by determining whether the operation difficulty factor is greater than a difficulty factor threshold. In response to determining that the operation difficulty factor is greater than the difficulty factor threshold, operation 1740 may be performed; in response to determining that the operation difficulty factor is not greater than the difficulty factor threshold, the processing device 210 may directly generate a surgery plan (e.g., the processing device 210 may generate a surgery plan using the process described in FIG. 7) . The difficulty factor threshold may be manually preset or automatically determined by the system.
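The threshold comparison in operation 1730 can be sketched as below. The threshold value of 7 is an illustrative assumption; the disclosure only states that the threshold may be manually preset or automatically determined.

```python
# Minimal sketch of the decision in operation 1730: compare the operation
# difficulty factor (an integer within [1, 10]) against a threshold.
# The threshold value 7 is an illustrative assumption.

DIFFICULTY_THRESHOLD = 7

def needs_expert_meeting(difficulty_factor: int,
                         threshold: int = DIFFICULTY_THRESHOLD) -> bool:
    """An expert meeting is needed only when the factor exceeds the
    threshold; otherwise a surgery plan is generated directly."""
    if not 1 <= difficulty_factor <= 10:
        raise ValueError("difficulty factor must be within [1, 10]")
    return difficulty_factor > threshold
```

When the manual and intelligent determinations are combined, the final (possibly expert-modified) factor value would be what is passed into this comparison.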
In 1740, in response to determining that an expert meeting is required, the doctor terminal may be controlled to present the sixth interface element 760 for initiating an expert meeting.
In some embodiments, in response to an interaction (e.g., clicking or selecting the sixth interface element 760 in voice) between a doctor (e.g., the doctor 271) and the sixth interface element 760, the doctor terminal (e.g., the doctor terminal 270) may generate a request for initiating the expert meeting and send the request to the processing device 210. In response to the request, the processing device 210 may send a meeting invitation to terminal devices (e.g., the XR devices) used by other participants (e.g., remote experts, and doctors from other departments) . In response to receiving attendance confirmation information sent back by the terminal devices of the participants, the processing device 210 may generate a virtual meeting space and cause the terminal devices used by the participants (including the doctor interacting with the sixth interface element 760, the remote experts, or the doctors from other departments, etc. ) to present the virtual meeting space.
The virtual meeting space refers to a virtual meeting scene presented by an XR device (e.g., the second XR device 270-2) .
The participants of the expert meeting may enter the virtual meeting space via the interactive devices (e.g., the XR devices) worn by the participants, and then view the patient data (e.g., the 3D anatomical model of the patient) presented in the virtual meeting space, and share views (e.g., conduct voice discussions in the virtual meeting space based on the interactive devices worn by the participants) of the participants in real time, and determine the final surgery plan.
In some embodiments, the doctor (e.g., the doctor 271) may determine whether the expert meeting is needed based on the patient data. If the doctor determines that the expert meeting is needed, the doctor may initiate the request for initiating the expert meeting via the doctor terminal 270 (e.g., the second XR device 270-2) and send the request to the processing device 210. In response to the request, the processing device 210 may cause the doctor terminal 270 to present the sixth interface element 760 for initiating the expert meeting. The doctor may also initiate the request for initiating the expert meeting by interacting with the doctor terminal 270 in other manners, such as by interacting with the virtual character 520 shown in FIG. 5 in voice.
FIG. 18 is a flowchart illustrating an exemplary process of updating a preliminary admission record according to some embodiments of the present disclosure. As shown in FIG. 18, in some embodiments, a process 1800 may include the following operations. In some embodiments, the process 1800 may be performed by the processing device 210.
In 1810, a request for accessing a preliminary admission record of a target patient may be obtained.
In some embodiments, the request may be obtained from a doctor terminal (e.g., the doctor terminal
270) and input by the doctor via interacting with the seventh interface element 770.
For example, in response to an interaction (e.g., clicking or selecting the seventh interface element 770 in voice) between the doctor (e.g., doctor 271) and the seventh interface element 770, the doctor terminal 270 may generate the request for accessing the preliminary admission record of the target patient and send the request to the processing device 210.
The preliminary admission record refers to record data of relevant information when the patient begins hospitalization. The preliminary admission record may include basic information (including gender, age, height, weight, etc. of the patient) , a consultation record (including doctor's inquiries and patient's responses during the consultation) , physical examination data, medical examination data (including blood detection, biochemical detection, urine detection, immunological detection, microbiological detection, allergen detection, imaging tests (e.g., a CT scan, an MR scan, a PET scan, an ultrasound scan, etc. ) , etc. ) , a preliminary diagnosis result, etc., of the patient.
The basic information, the physical examination data, and/or the medical examination data of the patient may be obtained from the UHR and/or supplemented by the doctor. For example, personal information (e.g., after first log-in), the physical examination records, the medical examination records, etc., of the patient may be stored in a folder (e.g., a folder named by a number, name, etc., of the patient) corresponding to the patient in the UHR.
The consultation record may be recorded and acquired by a sound sensor (e.g., a microphone) configured at the doctor terminal (e.g., the doctor terminal 270) during the consultation.
The preliminary diagnosis result may be determined by the doctor based on the consultation record, the physical examination data, and the medical examination data.
In 1820, in response to the request for accessing a preliminary admission record, the doctor terminal may be caused to present the preliminary admission record.
For example, in response to the request for accessing a preliminary admission record, the processing device 210 may retrieve (e.g., retrieve from the storage device 230) the preliminary admission record and send the preliminary admission record to the doctor terminal (e.g., the doctor terminal 270) , and then cause the doctor terminal (e.g., the doctor terminal 270) to present the preliminary admission record to the doctor. For example, the preliminary admission record may be presented to the doctor via the second XR device 270-2 or a large display screen of a doctor room.
In 1830, the preliminary admission record may be updated based on feedback information regarding the preliminary admission record input by the doctor via the doctor terminal.
The feedback information refers to feedback on missing information and/or change information in the preliminary admission record. The missing information refers to information that the doctor wants to know from the preliminary admission record but is not recorded in the preliminary admission record. The change information refers to information that is incorrectly recorded in the preliminary admission record and needs to be adjusted.
For example, the doctor terminal 270 may receive the feedback information on the preliminary admission record input (e.g., a text input, a voice input, etc. ) by the doctor and send the feedback information to the processing device 210, and the processing device 210 may update the preliminary admission record of the target patient according to the feedback information. For example, if the feedback information input by
the doctor is "the preliminary admission record lacks a patient's abdominal CT image" , the processing device 210 may add the patient's abdominal CT image to the preliminary admission record according to the information to update the preliminary admission record.
In some embodiments, the processing device 210 may update the preliminary admission record via the operations 1831-1834.
In 1831, an inquiry content of a supplementary inquiry may be determined based on the feedback information.
The supplementary inquiry refers to a process of communicating with the patient to obtain information that needs to be supplemented and/or amended in the preliminary admission record. The inquiry content of the supplementary inquiry may include questions that the patient needs to answer during the inquiry.
In some embodiments, the processing device 210 may determine the corresponding inquiry content based on the missing information and/or the change information in the feedback information. For example, if the feedback information is "the family genetic history of the patient is missing in the preliminary admission record," the inquiry content of the corresponding supplementary inquiry may be "Does your family have a family genetic history? If so, what genetic disease?"
In 1832, a third terminal device in a hospital ward where a target patient is located may be caused to conduct the supplementary inquiry based on the inquiry content.
The third terminal device refers to a terminal device provided by a hospital for use in an inpatient ward during a patient's stay. The third terminal device may include an XR device, a mobile device, a presentation device, or a similar device, or any combination thereof. In some embodiments, the third terminal device may be a portion of a hospital bed.
In some embodiments, the processing device 210 may perform the supplementary inquiry by presenting the inquiry content of the supplementary inquiry via the third terminal device. For example, the processing device 210 may present the above inquiry content to the target patient in the form of text via a screen of the third terminal device. As another example, the processing device 210 may broadcast the above inquiry content to the target patient via a loudspeaker disposed on the third terminal device.
In some embodiments, if the third terminal device is an XR device worn by the target patient, the processing device 210 may cause the third terminal device to present a virtual inquiry character that performs the supplementary inquiry. The virtual inquiry character refers to a computer-generated virtual person or virtual object set to interact with a patient in an XR environment. The virtual inquiry character may be configured to conduct the supplementary inquiry via communicating with the target patient. For example, the virtual inquiry character may be a digital role with certain appearance features, acoustic features, etc. Merely by way of example, the processing device 210 may cause the XR device worn by the target patient to present the virtual inquiry character and cause the virtual inquiry character to present the inquiry content of the supplementary inquiry to the target patient. Meanwhile, the virtual inquiry character may simulate the expressions and/or actions of people during speaking, providing the target patient with an almost real communication experience.
In 1833, sensed information (hereinafter referred to as fourth sensed information) collected by one or more sensing devices in the hospital ward may be obtained during the supplementary inquiry.
The fourth sensed information refers to information obtained by one or more sensing devices (e.g.,
microphones, etc. ) in the ward during the supplementary inquiry. For example, the fourth sensed information may be voice information expressed by the target patient, which is picked up by a microphone disposed on the third terminal device during the supplementary inquiry.
In 1834, the preliminary admission record may be updated based on the sensed information.
In some embodiments, the processing device 210 may determine response content of the patient to the inquiry based on the fourth sensed information. The processing device 210 may update the preliminary admission record based on the response content.
For example, if the inquiry content is "Does your family have a family genetic history? If so, what genetic disease?", the processing device 210 may determine a statement relating to the genetic disease in the voice information by performing voice recognition on the voice information of the patient in the fourth sensed information, and determine the response content based on the statement. For example, if the content of the voice information relating to the genetic disease is "there is no genetic disease in my family," the processing device 210 may determine that the response content is "no family genetic history." Accordingly, the processing device 210 may supplement a record of no family genetic disease of the target patient in the preliminary admission record based on the response content, or change an original record relating to the genetic disease in the preliminary admission record to no genetic disease.
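The mapping from a recognized reply to a record update in operations 1833-1834 can be sketched as follows. The keyword rule is a stand-in for the voice recognition and language understanding the disclosure leaves unspecified, and the field names are illustrative assumptions.

```python
# Hedged sketch: turn a recognized voice reply into an update of the
# preliminary admission record. The keyword rule stands in for the
# actual voice/NLU processing; field names are assumptions.

def parse_genetic_history_reply(transcript: str) -> str:
    """Map the patient's reply about family genetic history to a
    record value."""
    text = transcript.lower()
    if "no genetic" in text or "no family" in text:
        return "no family genetic history"
    return "family genetic history reported: " + transcript

def update_record(record: dict, field: str, value: str) -> dict:
    """Supplement or overwrite one field of the preliminary admission
    record, leaving the original record unmodified."""
    updated = dict(record)
    updated[field] = value
    return updated

record = {"name": "target patient"}
reply = "there is no genetic disease in my family"
record = update_record(record, "family_genetic_history",
                       parse_genetic_history_reply(reply))
```

A production system would route the transcript through a proper speech-recognition and information-extraction pipeline before writing to the record.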
FIG. 19 is a flowchart illustrating an exemplary method for assisting a doctor in work according to some embodiments of the present disclosure. As shown in FIG. 19, in some embodiments, a process 1900 may include the following operations. In some embodiments, the process 1900 may be performed by a doctor terminal (e.g., the doctor terminal 270) .
In 1910, an access request to a doctor space application input by a doctor may be received.
The access request refers to an initiation request of the doctor for the doctor space application. The doctor may input the access request in various ways. For example, the doctor space application may be presented on a display screen of the mobile terminal device 270-1, and the doctor 271 may generate the access request by clicking on the doctor space application or by inputting the access request in voice (e.g., the doctor 271 says "start the doctor space application"). As another example, the second XR device 270-2 may present a VR doctor space application, and the doctor 271 may generate the access request by clicking on the VR doctor space application or by inputting the access request in voice.
In 1920, the access request may be transmitted to a processing device (e.g., the processing device 210) .
In 1930, an instruction to present an interactive interface via the doctor space application may be received from the processing device (e.g., processing device 210) .
In response to receiving the instruction, the doctor terminal 270 may present the interactive interface via the doctor space application.
In some embodiments, the interactive interface may include one or more first interface elements for accessing one or more assistance services relating to at least one pending task of one or more pending tasks to be completed by the doctor. For example, the interactive interface may include a first interface element for conducting ward rounds remotely, a first interface element for accessing patient data of patients who have
booked consultation services, a first interface element for entering a virtual consultation room, a first interface element for accessing patient data relating to a target patient, etc. More descriptions regarding the first interface elements may be found in FIG. 7 and related descriptions thereof.
In some embodiments, the processing device (e.g., processing device 210) may determine the one or more pending tasks based on a receipt time of the access request at the processing device and schedule information of the doctor. More descriptions regarding the embodiment may be found in FIG. 4 and related descriptions thereof.
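The determination described above (comparing the receipt time of the access request against the doctor's schedule) can be sketched in a few lines. The schedule format, the task names, and the `split_tasks` helper below are assumptions made for illustration, not the disclosed implementation:

```python
from datetime import datetime

# Assumed schedule model: each entry holds a task name and its scheduled
# end time on the day of the access request.
schedule = [
    {"task": "ward rounds", "end": datetime(2024, 7, 31, 10, 0)},
    {"task": "consultation services", "end": datetime(2024, 7, 31, 15, 0)},
    {"task": "surgery", "end": datetime(2024, 7, 31, 18, 0)},
]


def split_tasks(schedule, receipt_time):
    """Treat a task whose scheduled end precedes the receipt time as
    completed; the remaining tasks are pending."""
    pending = [s["task"] for s in schedule if s["end"] > receipt_time]
    completed = [s["task"] for s in schedule if s["end"] <= receipt_time]
    return pending, completed


pending, completed = split_tasks(schedule, datetime(2024, 7, 31, 12, 30))
print(pending)    # ['consultation services', 'surgery']
print(completed)  # ['ward rounds']
```

Under this sketch, the pending list would drive the first interface elements and the completed list would drive the collapsible elements of claim 2.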
Some embodiments of the present disclosure further provide a system comprising at least one storage medium including a set of instructions, and one or more processors configured to communicate with the at least one storage medium. When the set of instructions is executed, the one or more processors are configured to implement the method described above (e.g., the processes 400-1900).
Some embodiments of the present disclosure further provide a non-transitory computer-readable storage medium comprising computer instructions that, when read by a computer, direct the computer to implement the method described above (e.g., the processes 400-1900).
The basic concepts have been described above, and it is apparent to those skilled in the art that the foregoing detailed disclosure is intended as an example only and does not constitute a limitation of the present disclosure. Although not expressly stated herein, various modifications, improvements, and amendments may be made to the present disclosure by those skilled in the art. Such modifications, improvements, and amendments are suggested by the present disclosure, and thus remain within the spirit and scope of the exemplary embodiments of the present disclosure.
Also, the present disclosure uses specific words to describe embodiments of the present disclosure, such as "an embodiment," "one embodiment," and/or "some embodiments," which mean that a feature, structure, or characteristic is associated with at least one embodiment of the present disclosure. Therefore, it should be emphasized and noted that "one embodiment," "an embodiment," or "an alternative embodiment" referred to in different places in the present disclosure does not necessarily refer to the same embodiment. In addition, certain features, structures, or characteristics in one or more embodiments of the present disclosure may be suitably combined.
In addition, unless expressly stated in the claims, the order of processing elements and sequences, the use of numbers or letters, or the use of other names described herein is not intended to limit the order of the processes and methods of the present disclosure. Although the foregoing disclosure discusses, by way of various examples, some embodiments of the invention that are currently considered useful, it should be appreciated that such details serve only illustrative purposes, and that the appended claims are not limited to the disclosed embodiments; rather, the claims are intended to cover all modifications and equivalent combinations that are consistent with the substance and scope of the embodiments of the present disclosure. For example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be noted that, in order to simplify the presentation of the present disclosure and thereby aid in the understanding of one or more embodiments of the invention, the foregoing description of embodiments of the present disclosure sometimes groups multiple features together in a single embodiment, accompanying drawing, or description thereof. However, this method of disclosure does not imply that the subject matter of the present disclosure requires more features than are mentioned in the claims. Rather, claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.
Some embodiments use numbers to describe quantities of components and attributes, and it should be understood that such numbers used in the description of embodiments are modified in some examples by the modifiers "about," "approximately," or "generally." Unless otherwise noted, "about," "approximately," or "generally" indicates that a ±20% variation in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters used in the present disclosure and claims are approximations that can change depending on the desired characteristics of individual embodiments. In some embodiments, the numerical parameters should take into account the specified number of significant digits and employ a general method of retaining digits. Although the numerical ranges and parameters used to confirm the breadth of the ranges in some embodiments of the present disclosure are approximations, in specific embodiments such values are set as precisely as is feasible.
The entire contents of each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in the present disclosure are hereby incorporated herein by reference. Application history documents that are inconsistent with or conflict with the contents of the present disclosure are excluded, as are documents (currently or hereafter appended to the present disclosure) that limit the broadest scope of the claims of the present disclosure. It should be noted that in the event of any inconsistency or conflict between the descriptions, definitions, and/or use of terminology in the materials appended to the present disclosure and those set forth in the present disclosure, the descriptions, definitions, and/or use of terminology in the present disclosure prevail.
Finally, it should be understood that the embodiments described in the present disclosure are used only to illustrate the principles of the embodiments of the present disclosure. Other variations may also fall within the scope of the present disclosure. Accordingly, alternative configurations of the embodiments of the present disclosure may be viewed as consistent with the teachings of the present disclosure, as an example rather than as a limitation. Correspondingly, the embodiments of the present disclosure are not limited to the embodiments expressly presented and described herein.
Claims (31)
- A method for assisting doctors in work, implemented on a processing device communicatively connected to a doctor terminal of a doctor, comprising:
  obtaining, from the doctor terminal, an access request to a doctor space application;
  in response to the access request, determining, based on a receipt time of the access request and schedule information of the doctor, one or more pending tasks to be completed by the doctor; and
  causing the doctor terminal to present, via the doctor space application, an interactive interface that includes first interface elements for accessing assistance services relating to at least one of the one or more pending tasks.
- The method of claim 1, wherein the method further comprises:
  determining, based on the receipt time and the schedule information, one or more completed tasks of the doctor, wherein the interactive interface further includes one or more collapsible elements relating to the one or more completed tasks.
- The method of claim 1, wherein the interactive interface further includes a second interface element for accessing a real-time 3D map relating to a target location corresponding to the at least one pending task.
- The method of claim 1, wherein the one or more pending tasks include conducting ward rounds in a hospital ward, and the first interface elements include a first interface element for applying to participate in the ward rounds remotely.
- The method of claim 4, wherein the method further comprises:
  obtaining, from the doctor terminal, a request to participate in the ward rounds remotely input by the doctor;
  in response to detecting that the ward rounds are conducted in the hospital ward, obtaining sensed information collected by one or more sensing devices in the hospital ward during the ward rounds;
  generating, based on the sensed information and patient data, a virtual ward space; and
  causing the doctor terminal to present the virtual ward space.
- The method of claim 1, wherein the one or more pending tasks include providing consultation services in a consultation room, and the first interface elements include a first interface element for accessing patient data of patients who have booked the consultation services.
- The method of claim 6, wherein the method further comprises:
  obtaining, from the doctor terminal, a request for accessing the patient data of a target patient among the patients;
  generating, based on the patient data of the target patient, a virtual character representing the target patient; and
  causing the doctor terminal to present the virtual character to explain the patient data of the target patient to the doctor.
- The method of claim 1, wherein the one or more pending tasks include providing remote consultation services, and the first interface elements include a first interface element for entering a virtual consultation room.
- The method of claim 8, wherein the method further comprises:
  obtaining, from the doctor terminal, a request to enter the virtual consultation room to provide the remote consultation services to a target patient;
  causing the doctor terminal to present a 3D patient model of the target patient;
  obtaining, from the doctor terminal, an examination instruction input by the doctor via interacting with the 3D patient model; and
  causing, based on the examination instruction, a wearable device worn by the target patient to collect measurement data of the target patient.
- The method of claim 1, wherein the one or more pending tasks include performing surgery on a target patient, and the first interface elements include a first interface element for accessing patient data relating to the target patient.
- The method of claim 1, wherein the interactive interface further comprises a third interface element for conducting preoperative education.
- The method of claim 11, wherein the doctor terminal includes a second XR device worn by the doctor, and the method further comprises:
  obtaining a request for conducting preoperative education for a target patient, the request being obtained from the second XR device and input by the doctor via interacting with the third interface element;
  in response to the request, generating explanatory materials for explaining a candidate surgery plan of the target patient; and
  causing a first XR device worn by the target patient and the second XR device to simultaneously present the explanatory materials to the target patient and the doctor.
- The method of claim 1, wherein the interactive interface further comprises a fourth interface element for conducting surgery simulation.
- The method of claim 13, wherein the doctor terminal includes a second XR device worn by the doctor, and the method further comprises:
  obtaining a request for simulating a target surgery, the request being obtained from the second XR device and input by the doctor via interacting with the fourth interface element;
  in response to the request, generating a virtual surgery scene corresponding to the target surgery, the virtual surgery scene including a virtual surgery site and virtual surgery equipment;
  causing the second XR device to present the virtual surgery scene to the doctor;
  obtaining an interaction instruction with respect to the virtual surgery equipment input by the doctor via the second XR device or an interactive device corresponding to the virtual surgery equipment; and
  updating the virtual surgery site and the virtual surgery equipment in the virtual surgery scene based on the interaction instruction.
- The method of claim 1, wherein the interactive interface further comprises a fifth interface element for performing surgery planning.
- The method of claim 15, wherein the method further comprises:
  obtaining a request for performing surgery planning for a target patient, the request being obtained from the doctor terminal and input by the doctor via interacting with the fifth interface element;
  determining an operation difficulty factor based on patient data of the target patient;
  determining whether an expert meeting is needed based on the operation difficulty factor; and
  in response to determining that the expert meeting is needed, causing the doctor terminal to present a sixth interface element for initiating the expert meeting.
- The method of claim 1, wherein the interactive interface further comprises a seventh interface element for patient management.
- The method of claim 17, wherein the method further comprises:
  obtaining a request for accessing a preliminary admission record of a target patient, the request being obtained from the doctor terminal and input by the doctor via interacting with the seventh interface element;
  in response to the request, causing the doctor terminal to present the preliminary admission record; and
  updating the preliminary admission record based on feedback information regarding the preliminary admission record input by the doctor via the doctor terminal.
- The method of claim 18, wherein the updating the preliminary admission record comprises:
  determining, based on the feedback information, inquiry content of a supplementary inquiry;
  causing, based on the inquiry content, a third terminal device in a hospital ward where the target patient is located to conduct the supplementary inquiry;
  obtaining sensed information collected by one or more sensing devices in the hospital ward during the supplementary inquiry; and
  updating the preliminary admission record based on the sensed information.
- The method of claim 1, wherein in response to the access request, the determining one or more pending tasks to be completed by the doctor comprises:
  in response to the access request, causing the doctor terminal to present, via the doctor space application, a preliminary interactive interface that includes an eighth interface element for reminding the doctor to check a work schedule; and
  in response to a request for accessing the work schedule input by the doctor via the doctor terminal, determining the one or more pending tasks.
- The method of claim 20, wherein the preliminary interactive interface further presents a virtual character configured to communicate with the doctor.
- A method for assisting doctors in work, implemented on a doctor terminal of a doctor communicatively connected to a processing device, the doctor terminal being installed with a doctor space application, comprising:
  receiving an access request to the doctor space application input by the doctor;
  transmitting the access request to the processing device; and
  receiving, from the processing device, an instruction to present an interactive interface via the doctor space application, wherein
  the interactive interface includes first interface elements for accessing assistance services relating to at least one pending task of one or more pending tasks to be completed by the doctor, and
  the one or more pending tasks are determined by the processing device based on a receipt time of the access request at the processing device and schedule information of the doctor.
- The method of claim 22, wherein the interactive interface further includes a second interface element for accessing a real-time 3D map relating to a target location corresponding to the at least one pending task.
- The method of claim 22, wherein the one or more pending tasks include conducting ward rounds in a hospital ward, and the first interface elements include a first interface element for applying to participate in the ward rounds remotely.
- The method of claim 22, wherein the one or more pending tasks include providing consultation services in a consultation room, and the first interface elements include a first interface element for accessing patient data of patients who have booked the consultation services.
- The method of claim 22, wherein the one or more pending tasks include providing remote consultation services, and the first interface elements include a first interface element for entering a virtual consultation room.
- The method of claim 22, wherein the one or more pending tasks include performing surgery on a target patient, and the first interface elements include a first interface element for accessing patient data relating to the target patient.
- The method of claim 22, wherein the interactive interface further comprises a third interface element for conducting preoperative education.
- The method of claim 22, wherein the interactive interface further comprises a fourth interface element for conducting surgery simulation.
- The method of claim 22, wherein the interactive interface further comprises a fifth interface element for performing surgery planning.
- The method of claim 22, wherein the interactive interface further comprises a seventh interface element for patient management.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2024/109063 WO2026025418A1 (en) | 2024-07-31 | 2024-07-31 | Methods and systems for assisting doctors in work |
| CN202411385860.1A CN121460091A (en) | 2024-07-31 | 2024-09-30 | Method and system for assisting doctor in working |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2026025418A1 true WO2026025418A1 (en) | 2026-02-05 |
Family
ID=98579655
Also Published As
| Publication number | Publication date |
|---|---|
| CN121460091A (en) | 2026-02-03 |