
CN119541906A - A method, system and storage medium for providing medical consultation services - Google Patents

A method, system and storage medium for providing medical consultation services

Info

Publication number
CN119541906A
CN119541906A
Authority
CN
China
Prior art keywords
patient
terminal
doctor
inquiry
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202411742797.2A
Other languages
Chinese (zh)
Inventor
张丽
张瑞祺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Lianying Zhiyuan Medical Technology Co ltd
Original Assignee
Shanghai Lianying Zhiyuan Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Lianying Zhiyuan Medical Technology Co ltd filed Critical Shanghai Lianying Zhiyuan Medical Technology Co ltd
Priority to CN202510043844.2A priority Critical patent/CN119541908A/en
Priority to CN202510043800.XA priority patent/CN119581067A/en
Priority to CN202510052426.XA priority patent/CN119601261A/en
Priority to CN202510046056.9A priority patent/CN119560184A/en
Priority to CN202510052613.8A priority patent/CN119541909A/en
Priority to CN202510043977.XA priority patent/CN119724628A/en
Publication of CN119541906A publication Critical patent/CN119541906A/en
Pending legal-status Critical Current

Landscapes

  • Medical Treatment And Welfare Office Work (AREA)

Abstract


The embodiments of this specification provide a method, system and storage medium for providing medical consultation services. The method is executed by at least one processor, and is characterized in that it includes: registering and inquiring the patient through the patient terminal of the patient to determine the doctor with whom the patient is registered; based on the doctor's department, conducting a pre-consultation inquiry on the patient through the patient terminal to generate a pre-consultation record for the patient; in response to detecting that the patient has started a consultation, displaying interface elements related to the pre-consultation record through the clinic terminal in the clinic to show the pre-consultation record to the doctor and the patient; generating an initial diagnosis record based on the perception information collected by the perception device in the clinic during the patient's consultation, wherein: registering and inquiring the patient through the patient terminal of the patient includes: performing a registration inquiry by displaying a first virtual character through the patient terminal, and performing a pre-consultation inquiry on the patient through the patient terminal includes: performing a pre-consultation inquiry by displaying a second virtual character through the patient terminal.

Description

Method, system and storage medium for providing medical treatment service
Cross reference
The present application claims priority from international application number PCT/CN2024/109057, filed on July 31, 2024, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of medical services, and more particularly, to a method, system, and storage medium for providing medical services.
Background
Outpatient services play an important role in hospital systems: they facilitate early diagnosis, improve disease management, reduce the need for hospitalization, provide health education and counseling, improve patient access, and relieve pressure on the medical system. An outpatient visit involves multiple links, such as registration, waiting, the consultation itself, and collecting medication. However, current outpatient services suffer from a number of problems, including patients struggling to register with the appropriate department or doctor, excessive waiting times before a visit, inefficient service, and a fragmented visit flow that lacks continuity.
Accordingly, there is a need for a method, system, and storage medium for providing medical treatment services that improve the efficiency and quality of those services.
Disclosure of Invention
One aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: presenting a first interface element to a doctor and a patient through a first terminal in a consulting room, wherein the first interface element includes at least an electronic health record of the patient; acquiring a control instruction initiated by the patient and/or the doctor, based on perception information collected by one or more perception devices in the consulting room during the consultation, wherein the control instruction is used to retrieve at least a portion of the electronic health record; and, in response to the control instruction, retrieving at least a portion of the electronic health record and presenting it through the first terminal.
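As a concrete illustration of the control-instruction flow described in this aspect, the sketch below maps a spoken request (one form of perception information) to a section of the electronic health record and retrieves it. The function names, the keyword table, and the record layout are illustrative assumptions only, not the implementation described in this specification, which leaves the mapping to an AI agent.

```python
from typing import Optional

def parse_control_instruction(utterance: str) -> Optional[str]:
    """Map a spoken request (perception information) to an EHR section key.

    The keyword table is a toy stand-in for the specification's agent.
    """
    section_keywords = {
        "allergy": "allergies",
        "lab": "lab_results",
        "imaging": "imaging_reports",
    }
    for keyword, section in section_keywords.items():
        if keyword in utterance.lower():
            return section
    return None

def retrieve_ehr_section(ehr: dict, utterance: str) -> Optional[dict]:
    """Retrieve the requested portion of the electronic health record,
    or None if no control instruction is recognized."""
    section = parse_control_instruction(utterance)
    if section is None:
        return None
    return {section: ehr.get(section)}
```

For example, given a record holding an `allergies` section, the utterance "Show the allergy history" would retrieve just that section for display on the consulting room terminal.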
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: conducting a first inquiry with a patient through a patient terminal to determine the doctor with whom the patient is registered; controlling the patient terminal to conduct a second inquiry with the patient, based on the doctor's department, to generate a pre-consultation record for the patient; in response to detecting that the patient's consultation has started, controlling the consulting room terminal to display the pre-consultation record to the doctor and the patient on an interface element; and generating an initial diagnosis record based on perception information collected by a perception device in the consulting room during the patient's consultation.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: conducting a first inquiry with a patient through a patient terminal to determine the doctor with whom the patient is registered; conducting a second inquiry with the patient through the patient terminal, based on the doctor's department, to generate a pre-consultation record for the patient; in response to detecting that the patient's consultation has started, controlling the patient terminal and the doctor terminal to display the pre-consultation record on interface elements to the patient and the doctor, respectively; and generating an initial diagnosis record based on perception information collected by the patient terminal and the doctor terminal during the patient's consultation.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: presenting, by at least one terminal, a first interface element to target users, the first interface element relating to an electronic health record of a patient receiving a medical visit, the target users including at least the patient and a doctor; obtaining a control instruction initiated by at least one of the target users, based on sensory information acquired by one or more sensory devices during the visit, the control instruction being used to present at least a portion of medical data; and, in response to the control instruction, retrieving at least a portion of the medical data and presenting it through the at least one terminal.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: conducting a registration inquiry with a patient through the patient's terminal to determine the doctor with whom the patient is registered; conducting a pre-consultation inquiry with the patient through the patient terminal, based on the doctor's department, to generate a pre-consultation record for the patient; in response to detecting that the patient's consultation has started, displaying, through a consulting room terminal in the consulting room, interface elements related to the pre-consultation record so as to present it to the doctor and the patient; and generating an initial diagnosis record based on perception information collected by a perception device in the consulting room during the patient's visit, wherein conducting the registration inquiry comprises displaying a first virtual character through the patient terminal, and conducting the pre-consultation inquiry comprises displaying a second virtual character through the patient terminal.
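The staged flow of this aspect (registration inquiry, pre-consultation inquiry, consulting room display, initial diagnosis record) can be sketched as a simple pipeline. The department mapping, record layout, and all names below are toy assumptions standing in for the virtual characters and agents of this specification.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class VisitState:
    """Illustrative per-visit state accumulated across the four stages."""
    doctor: str = ""
    department: str = ""
    pre_consultation: Dict[str, str] = field(default_factory=dict)
    transcript: List[str] = field(default_factory=list)

def run_visit(complaint: str, pre_answers: Dict[str, str],
              perception_feed: List[str]) -> Dict[str, object]:
    """Chain the stages: registration inquiry (first virtual character),
    pre-consultation inquiry (second virtual character), consulting room
    display, and initial-diagnosis generation from perception information."""
    state = VisitState()
    # Stage 1: the registration inquiry determines department and doctor
    # (a toy keyword rule here; the specification uses an AI agent).
    state.department = "Respiratory" if "cough" in complaint.lower() else "General Medicine"
    state.doctor = f"On-duty doctor, {state.department}"
    # Stage 2: department-specific pre-consultation inquiry fills the record.
    state.pre_consultation = dict(pre_answers)
    # Stage 3: the consulting room terminal would display state.pre_consultation.
    # Stage 4: perception information collected during the visit feeds the
    # initial diagnosis record.
    state.transcript = list(perception_feed)
    return {
        "doctor": state.doctor,
        "pre_consultation_record": state.pre_consultation,
        "initial_diagnosis_record": {"dialogue": state.transcript},
    }
```

The sketch only shows how the outputs of each stage feed the next; the actual inquiries, displays, and diagnosis generation are performed by the terminals and agents described elsewhere in this specification.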
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: conducting a registration inquiry with a patient through the patient's terminal to determine the doctor with whom the patient is registered; conducting a pre-consultation inquiry with the patient through the patient terminal, based on the doctor's department, to generate a pre-consultation record for the patient; in response to detecting that the patient's consultation has started, displaying interface elements related to the pre-consultation record through the patient terminal and the doctor's terminal so as to present the record to the patient and the doctor, respectively; and generating an initial diagnosis record based on perception information collected by the patient terminal and the doctor terminal during the patient's visit, wherein conducting the registration inquiry comprises displaying a first virtual character through the patient terminal, and conducting the pre-consultation inquiry comprises displaying a second virtual character through the patient terminal.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: presenting, by at least one terminal, a first interface element to target users, the first interface element relating to an electronic health record of a patient receiving a medical visit, the target users including at least the patient and a doctor; obtaining a control instruction initiated by at least one of the target users, based on perception information collected by a perception device during the visit, the control instruction being used to retrieve at least a portion of the electronic health record; and, in response to the control instruction, retrieving at least a portion of the electronic health record and presenting it to the target users through the at least one terminal, wherein the at least one terminal presents the retrieved portion to the target users using an augmented reality technique.
Another aspect of embodiments of the present description provides a method of providing a registration service. The method is performed by at least one processor and is characterized by comprising: acquiring a patient's chief complaint through the patient's terminal or a registration terminal of a hospital; determining at least one candidate department based on the chief complaint; displaying a virtual character through the patient terminal or the registration terminal to conduct a registration inquiry based on the chief complaint and the at least one candidate department; and determining the doctor based on data collected by the patient terminal or the registration terminal during the registration inquiry.
Another aspect of embodiments of the present specification provides a method of providing a pre-consultation service. The method is performed by at least one processor and is characterized by comprising: displaying a virtual character on a patient's terminal based on the department of the doctor with whom the patient is registered; conducting a pre-consultation inquiry with the patient through the patient terminal; generating a pre-consultation record based on data collected by the patient terminal during the pre-consultation inquiry; and, in response to detecting that the patient's consultation has started, presenting interface elements related to the patient's pre-consultation record through a consulting room terminal, wherein the processor is configured with a pre-consultation agent corresponding to the department's pre-consultation service, the pre-consultation inquiry is conducted by the pre-consultation agent, and the pre-consultation agent stores knowledge data corresponding to the department and can evolve itself using artificial intelligence techniques.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: acquiring perception information in real time during a patient's visit, the perception information being collected by a perception device in the consulting room; processing the perception information with a visit agent during the visit to generate visit suggestions, where a visit suggestion comprises at least one of a supplementary query suggestion, a physical examination suggestion, a prescription suggestion, and a treatment proposal; and processing the perception information with the visit agent after the visit ends to generate an initial diagnosis record, the visit agent evolving itself using artificial intelligence techniques.
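The visit agent's real-time suggestion loop described in this aspect might be sketched as follows. The keyword rules are a toy stand-in for the self-evolving AI model, and every name here is an illustrative assumption rather than the specification's implementation.

```python
from typing import Iterable, List

# Toy trigger rules standing in for the visit agent's model; the real
# agent described in this specification is a self-evolving AI model,
# not a fixed rule table.
SUGGESTION_RULES = {
    "chest pain": "supplementary query: does the pain radiate to the left arm?",
    "fever": "physical examination: measure body temperature",
    "infection": "prescription: consider an antibiotic sensitivity test",
}

def stream_visit_suggestions(utterances: Iterable[str]) -> List[str]:
    """Process perception information (here, a stream of utterances) in
    real time and emit visit suggestions as triggers are recognized."""
    suggestions = []
    for utterance in utterances:
        for trigger, suggestion in SUGGESTION_RULES.items():
            if trigger in utterance.lower():
                suggestions.append(suggestion)
    return suggestions
```

In the described system, suggestions like these would surface on the doctor's interface during the visit, while the same perception stream is reprocessed after the visit to draft the initial diagnosis record.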
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: receiving a remote accompanying request initiated by a patient, the request designating a remote companion; and, in response to the remote accompanying request and to detecting that the patient's visit has started, displaying a real-time view of the remote companion to the patient and the doctor through at least one terminal device using an augmented reality technique.
Another aspect of embodiments of the present description provides a method of providing medical services. The method is performed by at least one processor and is characterized by comprising: generating a health care plan based on a target diagnosis record of a patient, the health care plan relating to at least one of medication care, health habit care, and physiological data care; generating health care instructions based on the health care plan, the instructions comprising at least one of medication instructions, health habit instructions, and physiological data care instructions; sending the health care instructions to a monitoring device to obtain health care information about the patient from the monitoring device; and updating the health care plan based on the health care information, wherein the processor is configured with a post-diagnosis agent that performs the method and evolves itself using artificial intelligence techniques.
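The post-diagnosis feedback loop of this aspect (plan, instructions, monitoring information, updated plan) can be sketched in miniature. The `CarePlan` layout and the threshold-tightening rule below are illustrative assumptions, not the specification's actual logic.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class CarePlan:
    """Illustrative health care plan: medication entries plus alert
    thresholds for physiological data care."""
    medication: Dict[str, str] = field(default_factory=dict)
    thresholds: Dict[str, float] = field(default_factory=dict)

def update_plan(plan: CarePlan, monitoring_report: Dict[str, float]) -> CarePlan:
    """Feed monitoring-device readings back into the plan: as a toy rule,
    tighten a physiological alert threshold whenever a reading exceeds it,
    flagging the metric for closer monitoring."""
    for metric, reading in monitoring_report.items():
        limit = plan.thresholds.get(metric)
        if limit is not None and reading > limit:
            plan.thresholds[metric] = round(limit * 0.95, 2)
    return plan
```

In the described system, the post-diagnosis agent would perform this update step, with the monitoring device supplying the health care information.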
Another aspect of embodiments of the present description provides a system for providing medical services. The system comprises at least one processor and at least one memory, the at least one memory storing computer instructions and the at least one processor executing at least part of the computer instructions to implement a method of providing medical treatment services.
Another aspect of the embodiments of the present specification provides a computer-readable storage medium storing computer instructions that, when read by a computer, cause the computer to perform a method of providing medical services.
Drawings
The present specification will be further elucidated by way of example embodiments, which are described in detail with reference to the accompanying drawings. The embodiments are not limiting; in the drawings, like numerals represent like structures, wherein:
FIG. 1 is a block diagram of an exemplary healthcare system shown according to some embodiments of the present application;
FIG. 2 is a schematic diagram of an exemplary healthcare system shown according to some embodiments of the present application;
FIG. 3 is a schematic diagram of an exemplary hospital support platform shown according to some embodiments of the present application;
FIG. 4 is an exemplary block diagram of a medical treatment system according to some embodiments of the present disclosure;
FIG. 5 is a schematic diagram of an exemplary medical treatment procedure shown in accordance with some embodiments of the present description;
FIG. 6 is a schematic diagram of an exemplary flow for determining the doctor with whom a patient is registered, shown in accordance with some embodiments of the present description;
FIG. 7 is a schematic diagram of a path planning and path guidance service shown in accordance with some embodiments of the present description;
FIG. 8 is a schematic diagram of an exemplary process for providing a pre-consultation service according to some embodiments of the present description;
FIG. 9 is a schematic diagram of an exemplary flow of initiating a second query shown in accordance with some embodiments of the present description;
FIG. 10 is a schematic diagram of a process for presenting an avatar through an XR device, shown in accordance with some embodiments of the present description;
FIG. 11 is a schematic diagram of an exemplary medical treatment system shown in accordance with some embodiments of the present description;
FIG. 12 is a flow chart illustrating an exemplary presentation of medical data to a target user during a visit according to some embodiments of the present disclosure;
FIG. 13 is a flow chart of an exemplary process for providing medical services based on perceived information according to some embodiments of the present description;
FIG. 14 is a schematic diagram of an exemplary clinic interface shown in accordance with some embodiments of the present description;
FIG. 15 is a flowchart of an exemplary process of generating an initial diagnostic record, according to some embodiments of the present description;
FIG. 16 is a schematic diagram of an exemplary process for providing remote companion service shown in accordance with some embodiments of the present description;
FIG. 17 is a schematic diagram of an exemplary process for providing healthcare services according to some embodiments of the present description;
FIG. 18 is a schematic diagram of an exemplary medical treatment procedure shown in accordance with some embodiments of the present disclosure;
Fig. 19 is a schematic diagram of an exemplary medical treatment procedure shown in accordance with some embodiments of the present description.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings used in describing the embodiments are briefly introduced below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and those of ordinary skill in the art can apply the present specification to other similar situations based on these drawings without inventive effort. Unless otherwise apparent from the language context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit," and/or "module" as used herein is one way of distinguishing between different components, elements, parts, portions, or assemblies at different levels. However, these words may be replaced by other expressions that achieve the same purpose.
As used in this specification and the claims, the terms "a," "an," and/or "the" do not refer specifically to the singular and may include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the operations need not be performed precisely in the order shown. Rather, steps may be processed in reverse order or simultaneously, and other operations may be added to or removed from these processes.
Fig. 1 is a block diagram of an exemplary healthcare system 100 shown according to some embodiments of the present application.
The healthcare system 100, which may also be referred to as a meta-hospital system, is built on a variety of innovative technologies, including metaverse technology, XR technology (e.g., Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) technologies), AI technology, digital twin technology, IoT technology, data flow technology (e.g., blockchain technology and data privacy computing technology), spatial computing technology, image rendering technology, and so on.
As shown in fig. 1, the healthcare system 100 may include a physical hospital 110, a virtual hospital 130, a user space application 120, and a hospital support platform 140. In some embodiments, the hospital support platform 140 may map data related to the physical hospital 110 into a virtual hospital 130 corresponding to the physical hospital 110 and provide user services to related users of the physical hospital 110 through the user space application 120.
The physical hospital 110 refers to a hospital existing in the physical world and having a tangible attribute. Health care institutions that provide medical, surgical and psychiatric care and treatment for humans are collectively referred to herein as hospitals.
As shown in fig. 1, the physical hospital 110 may include a plurality of physical entities. For example, the plurality of physical entities may include departments, users, hardware devices, user services, public areas, medical service procedures, and the like, or any combination thereof.
A department refers to a specialized unit that provides a particular type of medical care, treatment, and service. Each department may focus on a particular medical field and may be staffed with healthcare professionals having expertise in that field. For example, the departments may include an outpatient department, an inpatient department, a surgical department, a support department (e.g., a registration department or a pharmacy department), a medical department, a specialty medical department, a child care department, etc., or any combination thereof.
The user may include any user associated with the physical hospital 110 (also referred to as a related user of the physical hospital 110). For example, the user may include a patient (or a portion of a patient (e.g., an organ)), a physician, a patient's companion, a hospital staff member of the physical hospital 110, a supplier of the physical hospital 110, an application developer of the physical hospital 110, or the like, or any combination thereof. Hospital staff of the physical hospital 110 may include healthcare providers (e.g., doctors, nurses, technicians, etc.), hospital administrators, support staff, or the like, or any combination thereof. Exemplary hospital administrators may include department care administrators, clinical administrators, department heads, hospital administrative staff, operations management staff, or the like, or any combination thereof.
The hardware devices may include hardware devices located in the physical hospital 110 and/or hardware devices in communication with hardware devices in the physical hospital 110. Exemplary hardware devices may include terminal devices, healthcare devices, sensing devices, base devices, etc., or any combination thereof.
The terminal device may comprise a terminal device that interacts with a user of the medical services system 100. For example, the terminal devices may include terminal devices that interact with patients (also referred to as patient terminals), terminal devices that interact with a patient's doctor (also referred to as doctor terminals), terminal devices that interact with nurses (also referred to as nurse terminals), terminal devices that interact with remote companions (also referred to as remote terminal devices), or public terminals of the hospital (e.g., office terminals, bedside terminal devices, terminal devices in waiting areas, intelligent surgical terminals), etc., or any combination thereof. In the present application, unless explicitly obtained from the context or otherwise stated, terminal devices owned by the user and terminal devices provided to the user by the physical hospital 110 are collectively referred to as the user's terminal devices or the terminal devices interacting with the user.
The terminal device may include a mobile terminal, an XR device, an intelligent wearable device, etc. The mobile terminal may include a smart phone, a Personal Digital Assistant (PDA), a display, a gaming device, a navigation device, a hand-held terminal (POS), a tablet computer, etc., or any combination thereof.
The XR device may comprise a device that allows a user to participate in an extended reality experience. For example, the XR device may include VR components, AR components, MR components, and the like, or any combination thereof. In some embodiments, the XR device may include an XR helmet, XR glasses, an XR patch, a stereo headset, or the like, or any combination thereof. For example, the XR device may include Google Glass™, Oculus Rift™, Gear VR™, Apple Vision Pro™, etc. In particular, the XR device may include a display component on which virtual content may be presented and/or displayed. In some embodiments, the XR device may further comprise an input component. The input component enables interaction between the user and virtual content (e.g., a virtual surgical environment) displayed by the display component. For example, the input component may include a touch sensor, a microphone, an image sensor, etc. configured to receive user input, which may be provided to the XR device and used to control the virtual world by changing the visual content presented on the display component. The input components may include handles, gloves, styluses, consoles, and the like.
The intelligent wearable device may include an intelligent wristband, intelligent footwear, intelligent glasses, intelligent helmet, intelligent watch, intelligent garment, intelligent backpack, intelligent accessory, etc., or any combination thereof. In some embodiments, the smart wearable device may acquire physiological data of the user (e.g., heart rate, blood pressure, body temperature, etc.).
The healthcare device may be configured to provide healthcare to the patient. For example, the medical services device may include an examination device, a care device, a treatment device, etc., or any combination thereof.
The examination apparatus may be configured to provide examination services to a patient, e.g., to collect examination data of the patient. Exemplary examination data may include heart rate, respiratory rate, body temperature, blood pressure, medical imaging data, body fluid test reports (e.g., blood test reports), and the like, or any combination thereof. Accordingly, the examination device may include a vital sign monitor (e.g., a blood pressure monitor, blood glucose meter, heart rate meter, thermometer, digital stethoscope, etc.), a medical imaging device (e.g., a Computed Tomography (CT) device, Digital Subtraction Angiography (DSA) device, Magnetic Resonance (MR) device, etc.), a laboratory device (e.g., a blood routine examination device, etc.), or any combination thereof.
The care device may be configured to provide care services to the patient and/or assist the healthcare provider in providing care services. Exemplary care devices may include hospital beds, patient care robots, smart care carts, smart kits, smart wheelchairs, and the like.
The treatment device may be configured to provide treatment services to the patient and/or assist the medical service provider in providing treatment services. Exemplary treatment devices may include surgical devices, radiation treatment devices, physical treatment devices, and the like, or any combination thereof.
The sensing device may be configured to gather sensing information related to the environment in which it is located. For example, the sensing device may include an image sensor, a sound sensor, or the like. The image sensor may be configured to collect image data in the physical hospital 110 and the sound sensor may be configured to collect voice signals in the physical hospital 110. In some embodiments, the sensing device may be a stand-alone device or may be integrated into another device. For example, the sound sensor may be part of a medical service device or a terminal device.
The base device may be configured to support data transmission, storage, and processing. For example, base devices may include networks, machine room facilities, computing devices, computing chips, storage devices, and the like.
In some embodiments, at least a portion of the hardware devices of the physical hospital 110 are IoT devices. An IoT device refers to a device with sensors, processing power, software, and other technologies that connects and exchanges data with other devices and systems through the internet or other communication networks. For example, one or more healthcare devices and/or sensing devices of the physical hospital 110 are IoT devices and are configured to transmit collected data to the hospital support platform 140 for storage and/or processing.
The user services may include any service provided by the hospital support platform 140 to the user. For example, user services include medical services provided to patients and/or accompanying persons, support services provided to staff members of physical hospital 110 and/or suppliers of physical hospital 110, and the like. In some embodiments, user services may be provided to patients, doctors, and hospital administrators through the user space application 120, which will be described in detail in the following description.
The public area refers to a shared space accessible to users (or portions of users) in the physical hospital 110. For example, the common area may include a reception area (e.g., a foreground), a waiting area, a hallway, etc., or any combination thereof.
A healthcare procedure is a procedure through which a corresponding medical service is provided to a patient. Medical service procedures typically include several stages and/or steps, through which a user may need to obtain a corresponding medical service. Exemplary healthcare procedures may include outpatient procedures, hospitalization procedures, surgical procedures, or the like, or any combination thereof. In some embodiments, the healthcare procedures may include corresponding healthcare procedures for different departments, different diseases, and the like. In some embodiments, a preset data acquisition protocol may be set to specify the standard stages involved in the healthcare procedure and how to acquire data related to the healthcare procedure.
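As an illustration only, such a preset data acquisition protocol could be represented as a mapping from the standard stages of a procedure to the data items acquired at each stage and their sources. The structure and all names below are hypothetical; the application does not prescribe a concrete format.

```python
# Minimal sketch of a preset data acquisition protocol (hypothetical structure).
# Each entry maps a standard stage of a medical service procedure to the data
# items to acquire at that stage and the device or system that supplies them.
OUTPATIENT_PROTOCOL = {
    "registration":     {"items": ["basic_info"],            "source": "patient_space_app"},
    "pre_consultation": {"items": ["symptoms", "history"],   "source": "pre_consultation_service"},
    "consultation":     {"items": ["dialogue_record"],       "source": "clinic_terminal"},
    "examination":      {"items": ["images", "lab_results"], "source": "imaging_device"},
}

def items_for_stage(protocol, stage):
    """Return the data items the protocol prescribes for a given stage."""
    return protocol[stage]["items"]
```

Under this sketch, data acquisition for a procedure reduces to walking the stages in order and collecting the prescribed items from each listed source.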
The user space application 120 provides the user with access to user services provided by the hospital support platform 140. The user space application 120 may be an application, plug-in, website, applet, or any other suitable form. For example, the user space application 120 is an application installed on a user terminal device that includes a user interface for a user to initiate requests and receive corresponding services.
In some embodiments, the user space application 120 may include different applications corresponding to different types of users. For example, the user space application 120 includes a patient space application corresponding to a patient, a medical space application corresponding to a doctor, a management space application corresponding to an administrator, and the like, or any combination thereof. User services provided through the patient space application, the medical space application, and the management space application are also referred to as a patient space service, a medical space service, and a management space service, respectively. Exemplary patient space services include registration services, route guidance services, pre-consultation services, remote consultation services, hospitalization services, discharge services, and the like. Exemplary medical space services include scheduling services, surgical planning services, surgical simulation services, patient management services, remote ward services, remote outpatient services, and the like. Exemplary management space services include monitoring services, medical service assessment services, device parameter setting services, service parameter setting services, resource scheduling services, and the like.
In some embodiments, the patient space application, the medical space application, and the management space application may be integrated into one user space application 120, and the user space application 120 may be configured to provide access portals for each type of user (e.g., patient, healthcare provider, manager, etc.). By way of example only, a particular user may have a corresponding account number that may be used to log into a user space application, view corresponding diagnostic data, and obtain corresponding user services.
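The account-based access described above can be sketched as a simple role-to-service mapping (the role and service names are illustrative, taken from the examples in this description; a real implementation would also involve authentication and per-account data access):

```python
# Sketch of role-based service access in a unified user space application.
# Role and service names are illustrative only.
SERVICES_BY_ROLE = {
    "patient":       ["registration", "route_guidance", "pre_consultation",
                      "remote_consultation", "hospitalization", "discharge"],
    "doctor":        ["scheduling", "surgical_planning", "surgical_simulation",
                      "patient_management", "remote_ward", "remote_outpatient"],
    "administrator": ["monitoring", "medical_service_assessment",
                      "device_parameter_setting", "resource_scheduling"],
}

def accessible_services(account_role):
    """Return the services exposed to an account of the given user type."""
    return SERVICES_BY_ROLE.get(account_role, [])
```

Logging in with a given account would thus expose only the access portal and services of the corresponding user type.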
According to some embodiments of the present application, by providing user space applications for different types of users, each type of user can easily obtain the various user services he or she may need through the corresponding user space application. In addition, users currently often need to install various applications to obtain different user services, which results in poor user experience and high development costs. Therefore, the user space application of the present application can improve the user experience, improve service quality and efficiency, enhance service safety, and reduce development and operation costs.
In some embodiments, the user space application 120 may be configured to provide access portals for relevant users of the physical hospital 110 to interact with the virtual hospital 130. For example, through the user space application 120, a user may enter instructions for retrieving digital content of the virtual hospital 130 (e.g., digital twin models of hardware devices, patient organs, or public areas), view the digital content, and interact with the digital content. As another example, through the user space application 120, a user may communicate with an avatar representing an agent. In some embodiments, a public terminal of a hospital may have a management space application installed, and an administrator account of the department to which the public terminal corresponds may be logged into the management space application. The user may then obtain user services through the management space application installed on the public terminal.
The virtual hospital 130 is a digital twin (i.e., virtual representation or virtual copy) of the physical hospital 110 for simulating, analyzing, predicting, and optimizing the operating state of the physical hospital 110. For example, the virtual hospital 130 may be a real-time digital copy of the physical hospital 110.
In some embodiments, the virtual hospital 130 may be presented to the user using digital technology. For example, when the relevant user interacts with the virtual hospital 130, at least a portion of the virtual hospital 130 may be presented to the relevant user using XR technology. For example only, MR technology may be used to superimpose at least a portion of the virtual hospital 130 on the real-world view of the relevant user.
In some embodiments, the virtual hospital 130 may include digital twins of physical entities associated with the physical hospital 110. A digital twin refers to a virtual representation (e.g., a virtual copy, mapping, or digital simulation) of a physical entity. The digital twin can reflect and predict the state, behavior, and performance of the physical entity in real time. For example, the virtual hospital 130 may include digital twins of at least a portion of the medical services, departments, users, hardware devices, user services, public areas, medical service procedures, and the like of the physical hospital 110. The digital twin of a physical entity can take a variety of forms, including models, images, graphics, text, numerical values, and the like. For example, the digital twin may be a virtual hospital corresponding to a physical hospital, virtual personnel (e.g., virtual doctors, virtual nurses, and virtual patients) corresponding to personnel entities (e.g., doctors, nurses, and patients), virtual devices (e.g., virtual imaging devices and virtual scalpels) corresponding to medical service devices (e.g., imaging devices and scalpels), and the like.
In some embodiments, the digital twins may include one or more first digital twins and/or one or more second digital twins. The state of each first digital twin may be updated based on an update of the state of the corresponding physical entity. For example, one or more first digital twins may be updated during the mapping of data associated with the physical hospital 110 to the virtual hospital 130. One or more second digital twins can be updated through at least one of the user space applications 120, and the update of each second digital twin can result in a status update of the corresponding physical entity. In other words, a first digital twin is updated accordingly when the corresponding physical entity changes its state, and the state of the corresponding physical entity changes accordingly when a second digital twin is updated. For example, the one or more first digital twins may include digital twins of a public area, a medical service, a user, a hardware device, etc., and the one or more second digital twins may include digital twins of a hardware device, a user service, a medical service procedure, etc. It should be appreciated that a given digital twin may be a first digital twin, a second digital twin, or both.
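The two update directions can be illustrated with a minimal sketch (the classes below are hypothetical; the application does not prescribe an implementation): a first digital twin mirrors state changes of its physical entity, while updating a second digital twin drives a state change in the entity.

```python
# Sketch of the two synchronization directions for digital twins.
class PhysicalEntity:
    """Stand-in for a hardware device or other physical entity."""
    def __init__(self, state="idle"):
        self.state = state

    def apply(self, new_state):
        self.state = new_state


class FirstDigitalTwin:
    """State flows physical -> digital: the twin mirrors its entity."""
    def __init__(self, entity):
        self.entity = entity
        self.state = entity.state

    def on_entity_update(self, new_state):  # called when mapped data arrives
        self.state = new_state


class SecondDigitalTwin:
    """State flows digital -> physical: updating the twin drives the entity."""
    def __init__(self, entity):
        self.entity = entity
        self.state = entity.state

    def update(self, new_state):            # e.g., via a user space application
        self.state = new_state
        self.entity.apply(new_state)        # the physical entity follows
```

A twin playing both roles would simply expose both the `on_entity_update` and `update` paths.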
According to some embodiments of the present application, physical hospitals 110 (including hardware devices, users, user services, healthcare procedures, etc.) may be simulated and tested in a secure and controllable environment by generating a virtual hospital 130 that includes digital twins of physical entities associated with the physical hospitals 110. By virtual reality linkage (e.g., real-time interaction between physical hospital 110 and virtual hospital 130), various medical scenarios can be more accurately predicted and responded to, thereby improving the quality and efficiency of medical services. In addition, the application of the XR technology and the virtual reality integration technology enables the interaction of related users to be more natural and visual, and provides a more comfortable and efficient medical environment, so that the user experience is improved.
In some embodiments, the virtual hospital 130 may further include agents that implement self-evolution based on data related to the physical hospital 110 and AI technology.
An agent refers to an entity that acts in an intelligent manner. For example, an agent may include a computing/software entity that can autonomously learn and evolve, and sense and analyze data to perform specific tasks and/or achieve specific goals (e.g., healthcare procedures). Through AI techniques (e.g., reinforcement learning, deep learning, etc.), an agent can constantly learn and self-optimize in interactions with the environment. In addition, the agent can collect and analyze massive data (e.g., data related to the physical hospital 110) through big data technology, mine patterns and learn rules from the data, and optimize decision flows, thereby identifying environmental changes in uncertain or dynamic environments, responding quickly, and making reasonable judgments. For example, agents may learn and evolve autonomously based on AI technology to accommodate changes in the physical hospital 110. By way of example only, agents may be built based on NLP technology (e.g., large language models, etc.) and may automatically learn and autonomously update through large amounts of language text (e.g., hospital business data and patient feedback information) to improve the quality of user services provided by the physical hospital 110.
In some embodiments, the agents may include different types of agents corresponding to different healthcare procedures, different user services, different departments, different diseases, different hospital positions (e.g., nurses, doctors, technicians, etc.), different stages of healthcare procedures, and the like. A particular type of agent is used to process tasks corresponding to the particular type. In some embodiments, one agent may correspond to different healthcare procedures (or different healthcare services, different departments, different diseases, or different hospital positions). In some embodiments, an agent may operate with reference to basic configuration data (e.g., dictionaries, knowledge graphs, templates, etc.) of the department and/or disease corresponding to the agent. In some embodiments, multiple agents may cooperate and share information through network communications to collectively accomplish complex tasks.
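The type-to-task correspondence can be sketched as a simple registry (hypothetical; real agents would wrap models and configuration data rather than plain callables): each agent is registered under a task type, and tasks are dispatched by type.

```python
# Sketch of routing tasks to type-specific agents via a registry.
class AgentRegistry:
    def __init__(self):
        self._agents = {}

    def register(self, task_type, agent):
        """Associate an agent (here a plain callable) with a task type."""
        self._agents[task_type] = agent

    def dispatch(self, task_type, payload):
        """Hand a task to the agent registered for its type."""
        return self._agents[task_type](payload)


registry = AgentRegistry()
registry.register("hospitalization", lambda p: f"care plan for {p}")
registry.register("surgery", lambda p: f"surgical record for {p}")
```

An agent serving several procedures or departments would simply be registered under multiple task types, and cooperating agents could forward subtasks to each other through the same dispatch interface.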
In some embodiments, a configuration of the agent may be provided. For example, basic configuration data for use by the agent in operation may be set. The basic configuration data may include dictionaries, knowledge databases, templates, etc. As another example, usage rights of the agent may be set for different users. In some embodiments, an administrator of the physical hospital 110 may set the configuration of the agent through a managed space application.
In some embodiments, the agent may be integrated into or deployed on a hardware device. For example, agents corresponding to hospitalization services may be integrated into a hospital bed or a presentation device of a hospital bed. In some embodiments, the agent may be integrated into or deployed on an embodied intelligent robot. An embodied intelligent robot refers to a robotic system that combines physical presence (embodiment) with intelligent behavior (cognition). The embodied intelligent robot may be configured to interact with the real world in a manner that mimics or complements human capabilities, utilizing its physical form and cognitive functions to perform tasks, make decisions, and adapt to the environment. By utilizing artificial intelligence and sensor technology, the embodied intelligent robot can operate autonomously, interact with the environment, and continuously improve its performance. For example, the embodied intelligent robot may be configured with an agent corresponding to a surgical service and assist a doctor in performing a surgery.
In some embodiments, at least a portion of the user services may be provided based on the agent. For example, at least a portion of the user services may be provided to the relevant users based on the processing results, wherein the processing results are generated by at least one of the agents based on data related to the physical hospital 110. For example only, the data related to the physical hospital 110 may include data related to a healthcare procedure of the physical hospital 110, the agent may include an agent corresponding to the healthcare procedure, and the user service may be provided to an associated user of the healthcare procedure by using the agent processing data corresponding to the healthcare procedure.
The hospital support platform 140 may be configured to provide technical support to the healthcare system 100. For example, the hospital support platform 140 may include computing hardware and software to support innovative technologies including XR technology, AI technology, digital twinning technology, data flow technology, and the like. In some embodiments, the hospital support platform 140 may include at least a storage device for data storage and a processing device for data computation.
In some embodiments, the hospital support platform 140 may support interactions between the physical hospitals 110 and the virtual hospitals 130. For example, the processing device of the hospital support platform 140 may obtain data related to the physical hospital 110 from the hardware device and map the data related to the physical hospital 110 into the virtual hospital 130. For example, the processing device of the hospital support platform 140 may update a portion of the digital twins (e.g., one or more first digital twins) in the virtual hospital 130 based on the obtained data such that each portion of the digital twins in the virtual hospital 130 may reflect the updated status of the corresponding physical entity in the physical hospital 110. Based on the digital twin body which is continuously updated with the corresponding physical entity, the user can know the state of the physical entity related to the physical hospital 110 in real time, so that the monitoring and evaluation of the physical entity are realized. As another example, agents corresponding to data related to the physical hospital 110 may train and/or update based on the data related to the physical hospital 110 to self-evolve and self-learn.
In some embodiments, the hospital support platform 140 may support and/or provide user services to the relevant users of the physical hospital 110. For example, in response to receiving a user service request from a user, the processing device of the hospital support platform 140 may provide a user service corresponding to the service request. As another example, in response to detecting a need to provide a user service to a user, the processing device of the hospital support platform 140 may control a physical entity or virtual entity corresponding to the user service to provide the user service. For example, in response to detecting that a patient is being sent to a hospital ward, the processing device of the hospital support platform 140 may control the intelligent care cart to direct a nurse to the hospital ward for a hospital admission check of the patient.
In some embodiments, at least a portion of the user services may be provided to the relevant users based on interactions between the relevant users and the virtual hospital 130. Interaction refers to interactions or effects (e.g., conversations, behaviors, etc.) between the relevant user and the virtual hospital 130. For example, interactions between the relevant user and the virtual hospital 130 may include interactions between the relevant user and a digital twin in the virtual hospital 130, interactions between the relevant user and an agent, interactions between the relevant user and a virtual character, and the like, or any combination thereof.
In some embodiments, at least a portion of the user services may be provided to the associated user based on interactions between the associated user and at least one of the digital twins. For example, an update instruction of the second digital twin input by the relevant user may be received by the user space application 120, and the corresponding physical entity of the second digital twin may be updated according to the update instruction. As another example, a user may view a first digital twin of a physical entity (e.g., a 3D digital twin model of a patient organ or hardware device) through the user space application 120 to learn about the state of the physical entity. Alternatively, the user may change the display angle, display size, etc. of the digital twin.
In some embodiments, the processing device of the hospital support platform 140 may present virtual characters corresponding to the agents through the user space application, interact with the associated user, and provide at least a portion of the user services to the associated user based on the interactions between the associated user and the virtual characters.
In some embodiments, the hospital support platform 140 may have a five-layer structure including a hardware device layer, an interface layer, a data processing layer, an application development layer, and a service layer, see fig. 3 and its associated description. In some embodiments, the hardware devices of the physical hospital 110 may be part of the hospital support platform 140.
According to some embodiments of the present application, a virtual hospital corresponding to a physical hospital may be established by integrating various internal and external resources (e.g., medical service equipment, hospital personnel, medicines and consumables, etc.) of the physical hospital. The virtual hospital may reflect real-time status (e.g., changes, updates, etc.) of physical entities associated with the physical hospital, thereby enabling monitoring and assessment of the physical entities. Such integration may provide accurate data support for the operation and intelligent decision-making of medical services. In addition, through the virtual hospital, users related to medical services can commonly establish an open shared ecosystem, thereby promoting innovation and promotion of medical services.
In addition, linkage between the inside and outside of the hospital enables medical care services covering the whole life cycle of the patient. The perspective of medical services extends from mere disease treatment to covering the entire life cycle of a patient, including prevention, diagnosis, treatment, rehabilitation, health management, and the like. By establishing intra- and extra-hospital linkage, the physical hospital can better integrate online and offline resources and provide comprehensive and continuous medical and health services for patients. For example, through remote monitoring and online consultation, the health condition of the patient can be followed in real time, the treatment scheme can be adjusted in time, and the treatment effect can be improved.
Fig. 2 is a schematic diagram of an exemplary healthcare system 200 shown according to some embodiments of the present application.
As shown in fig. 2, the healthcare system 200 may include a processing device 210, a network 220, a storage device 230, one or more healthcare devices 240, one or more sensing devices 250, one or more patient terminals 260 of a patient 261, and one or more doctor terminals 270 of a doctor 271 associated with the patient 261. In some embodiments, components in the healthcare system 200 may be interconnected and/or communicate by a wireless connection, a wired connection, or a combination thereof. The connections between the components of the healthcare system 200 may be variable.
The processing device 210 may process data and/or information obtained from the storage device 230, the healthcare device 240, the sensing device 250, the patient terminal 260, and/or the doctor terminal 270. For example, the processing device 210 may map data related to a physical hospital to a virtual hospital corresponding to the physical hospital and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by processing the data related to the physical hospital. As another example, the processing device 210 may maintain an agent and provide user services to the patient 261 and the doctor 271 through the patient terminal 260 and/or the doctor terminal 270, respectively, by using the agent to process data related to the physical hospital.
In some embodiments, the processing device 210 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 210 may be located locally or remotely from the healthcare system 200. In some embodiments, the processing device 210 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof.
In some embodiments, the processing device 210 may include one or more processors (e.g., single-core processors or multi-core processors). For illustration only, only one processing device 210 is depicted in the healthcare system 200. It should be noted, however, that the healthcare system 200 of the present application may also include multiple processing devices. Thus, operations and/or method steps described in the present application as being performed by one processing device 210 may also be performed by multiple processing devices jointly or separately.
The network 220 may include any suitable network capable of facilitating the exchange of information and/or data for the healthcare system 200. The network 220 may be or include a wired network, a wireless network (e.g., an 802.11 network, a Wi-Fi network), a Bluetooth™ network, a near field communication (NFC) network, etc., or any combination thereof.
The storage device 230 may store data, instructions, and/or any other information. In some embodiments, the storage device 230 may store data obtained from other components of the healthcare system 200. In some embodiments, the storage device 230 may store data and/or instructions that the processing device 210 may execute or use to perform the exemplary methods described herein.
In some embodiments, the data stored in the storage device 230 may include multi-modal data. Multi-modal data may include various forms of data (e.g., images, graphics, video, text, etc.), various types of data, data obtained from different sources, data related to different medical services (e.g., diagnosis, surgery, rehabilitation, etc.), and data related to different users (e.g., patients, medical personnel, management personnel, etc.). For example, the data stored in the storage device 230 may include medical data of the patient 261 reflecting the health of the patient 261. For example, the medical data may include an electronic health record of the patient 261. An electronic health record refers to an electronic file that records various types of patient data (e.g., basic information, examination data, imaging data). For example, the electronic health record may include three-dimensional models of a plurality of organs and/or tissues of the patient 261.
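A minimal sketch of such an electronic health record follows (field names are hypothetical, chosen from the examples above: basic information, examination data, imaging data, and organ models):

```python
# Sketch of a multi-modal electronic health record (hypothetical fields).
from dataclasses import dataclass, field

@dataclass
class ElectronicHealthRecord:
    patient_id: str
    basic_info: dict = field(default_factory=dict)       # e.g., name, age
    examination_data: list = field(default_factory=list)  # e.g., lab results
    imaging_data: list = field(default_factory=list)      # e.g., scan references
    organ_models: dict = field(default_factory=dict)      # e.g., 3D model handles

    def add_examination(self, record):
        """Append one examination record to the profile."""
        self.examination_data.append(record)
```

In practice the imaging data and 3D organ models would be stored as references into the storage device rather than inline, but the grouping of modalities per patient is the same.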
In some embodiments, storage device 230 may include mass storage devices, removable storage devices, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. In some embodiments, storage device 230 may include a data lake and a data warehouse, as will be described in detail in connection with FIG. 3.
The healthcare device 240 may be used to provide or assist in healthcare. As shown in FIG. 2, the medical service devices 240 include a clinic terminal 240-1, a hospital bed 240-2, an intelligent surgical terminal 240-3, an intelligent care cart 240-4, an intelligent wheelchair 240-5, etc., or any combination thereof.
The clinic terminal 240-1 is a terminal device that is configured within a clinic for use by doctors and patients in a medical outpatient procedure. For example, the clinic terminal 240-1 may include one or more of a screen, a sound output component, an image sensor, or a sound sensor. A doctor interface may be displayed on the screen of the clinic terminal 240-1, and data may be displayed on the doctor interface to facilitate communication between the doctor and the patient. Exemplary data may include electronic health records (or portions thereof), pre-consultation records, medical images, 3D organ models, examination results, consultation advice, and the like.
The hospital bed 240-2 refers to a hospital bed that is capable of supporting inpatients in a hospital ward and providing user services to the patient. The hospital bed 240-2 may include a bed, bedside terminal equipment, bedside inspection equipment, sensors, and the like, or any combination thereof. The bedside terminal device may include an XR device, a display device, a mobile device, etc., or any combination thereof. In some embodiments, the hospital bed 240-2 may be controlled by an agent corresponding to the hospitalization service, in which case the hospital bed may also be referred to as a smart hospital bed or a meta hospital bed.
The intelligent surgical terminal 240-3 refers to a device configured with an agent for assisting surgery, and is controlled by the agent corresponding to a surgical service. The intelligent surgical terminal 240-3 may sense interactions (e.g., conversations, behaviors, etc.) between the healthcare provider, the patient, and the agent and obtain data captured by the sensing device 250 to provide surgical assistance. In some embodiments, the intelligent surgical terminal 240-3 may be configured to perform a risk alert for a surgical procedure, generate a surgical record of a surgical procedure, etc., based on the agent configured therein.
The intelligent care cart 240-4 is a care cart that has an autonomous driving function and is capable of assisting in patient treatment and care. For example, the intelligent care cart 240-4 may be configured to guide a nurse to a hospital ward for admission of the patient. In some embodiments, the intelligent care cart 240-4 may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, a care agent). In some embodiments, the intelligent care cart 240-4 may include a cart, a presentation device, one or more examination devices and/or care tools, a sensing device (e.g., an image sensor, a GPS sensor, a sound sensor, etc.), and so on. In some embodiments, the intelligent care cart 240-4 may be configured to obtain relevant treatment and care information for the patient and generate physical examination data, care data, and the like. The physical examination data may include vital sign data of the patient. The care data may include detailed records of care operations, such as care time, care operator, care measure, patient response, and the like.
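As a sketch of one way the generated vital sign data might be used, the cart could flag readings outside configured normal ranges. The ranges and field names below are illustrative placeholders, not clinical reference values.

```python
# Sketch: flag vital sign readings outside configured ranges.
# Ranges are illustrative placeholders, not clinical reference values.
NORMAL_RANGES = {
    "heart_rate_bpm": (60, 100),
    "temperature_c": (36.1, 37.2),
}

def flag_abnormal(vitals):
    """Return the names of readings that fall outside their normal range."""
    flags = []
    for name, value in vitals.items():
        low, high = NORMAL_RANGES[name]
        if not low <= value <= high:
            flags.append(name)
    return flags
```

Flagged readings could then be attached to the care record alongside the care time, operator, measure, and patient response.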
The intelligent wheelchair 240-5 refers to a transport device for intelligently transferring patients. In some embodiments, the intelligent wheelchair 240-5 may be configured to perform autonomous navigation through integrated sensors and maps, locate the patient's position using radio frequency identification (RFID), Bluetooth, or Wi-Fi signals, and identify the patient through biometric technology. In some embodiments, the intelligent wheelchair 240-5 may be controlled by an agent (e.g., an agent corresponding to a hospitalization service, an agent corresponding to a surgical service). In some embodiments, the intelligent wheelchair 240-5 may be configured to generate data (e.g., a record of the interaction between the agent and the patient) by sensing interaction data through built-in cameras/sensors.
The sensing device 250 may be configured to gather sensing information related to the environment in which it is located. In some embodiments, the sensing device 250 may comprise a sensing device in a physical hospital 110. For example, the sensing device 250 may include an image sensor 250-1, a sound sensor 250-2, a temperature sensor, a humidity sensor, and the like.
The patient terminal 260 may be a terminal device that interacts with the patient 261. In some embodiments, patient terminal 260 may include a mobile terminal 260-1, an XR device 260-2, a smart wearable device 260-3, and so forth. Doctor terminal 270 may be a terminal device that interacts with doctor 271. In some embodiments, the physician terminal 270 may include a mobile terminal 270-1, an XR device 270-2, or the like. In some embodiments, patient 261 may access a user space application (e.g., a patient space application) through patient terminal 260 and doctor 271 may access a user space application (e.g., a doctor space application) through doctor terminal 270. In some embodiments, patient 261 and doctor 271 may communicate with each other remotely through patient terminal 260 and doctor terminal 270, thereby providing remote medical services, such as remote outpatient services, remote ward services, remote follow-up services, and the like.
The sensing device 250, patient terminal 260, and doctor terminal 270 may be configured as data sources to provide information to the healthcare system 200. For example, the devices may transmit the collected data to the processing device 210, and the processing device 210 may provide user services based on the received data.
It should be noted that the above description of the healthcare systems 100 and 200 is intended to be illustrative, and not limiting of the scope of the present application. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other features of the example embodiments herein may be combined in various ways to obtain additional and/or alternative example embodiments. For example, the healthcare system 200 may include one or more additional components, such as other users' terminal devices, a hospital's public terminal device, and the like. As another example, two or more components of the healthcare system 200 may be integrated into a single component.
Fig. 3 is a schematic diagram of an exemplary hospital support platform 300 shown according to some embodiments of the present application.
As shown in fig. 3, the hospital support platform 300 may include a hardware layer 310 (also referred to as a hardware module), an interface layer 320 (also referred to as an interface module), a data processing layer 330 (also referred to as a data processing module), an application development layer 340 (also referred to as an application development module), and a service layer 350 (also referred to as a service module). It should be understood that the "layers" and "modules" in this disclosure are used only for logically dividing the components of the hospital support platform and are not intended to be limiting.
The hardware layer 310 may be configured to provide a hardware basis for interactions between the real world and the digital world, and may include one or more hardware devices related to hospital operations. Exemplary hardware devices may include healthcare devices, sensing devices, terminal devices, and base devices.
The interface layer 320 may be connected with the hardware layer 310 and the data processing layer 330. The interface layer 320 may be configured to obtain data collected by hardware devices of the hardware layer 310 and send the data to the data processing layer 330 for storage and/or processing. Interface layer 320 may also be configured to control at least a portion of the hardware devices of hardware layer 310. In some embodiments, interface layer 320 may include hardware interfaces and software interfaces (e.g., data interfaces, control interfaces).
The data processing layer 330 may be configured to store and/or process data. The data processing layer 330 may include a processing device on which a plurality of data processing units may be configured. The data processing layer 330 may be configured to obtain data from the interface layer 320 and process the data by at least one data processing unit to enable user services related to hospital services.
The data processing unit may comprise various preset algorithms for implementing data processing. In some embodiments, data processing layer 330 may include a processing device (e.g., processing device 210 in fig. 2). The data processing unit may be configured on the processing device. In some embodiments, the data processing unit may include an XR unit configured to process data using XR technology to implement XR services, an AI unit (e.g., an agent unit) configured to process data using AI technology to implement AI services, a digital twin unit configured to process data using digital twin technology to implement digital twin services, a data flow unit configured to process data using data flow technology (e.g., blockchain technology, data privacy computing technology) to implement data flow services, and so forth.
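The routing of data to configured processing units could be sketched as a simple registry, as below. All class, method, and unit names here are illustrative assumptions, not part of the platform described:

```python
# Hypothetical sketch of the data processing layer's unit registry.
# Unit names (e.g., "ai", "xr") follow the description above; the class
# and method names are illustrative assumptions.

class DataProcessingLayer:
    def __init__(self):
        self._units = {}  # unit name -> processing callable

    def register_unit(self, name, unit):
        """Register a data processing unit (e.g., an XR, AI, or digital twin unit)."""
        self._units[name] = unit

    def process(self, name, data):
        """Route data obtained from the interface layer to the named unit."""
        if name not in self._units:
            raise KeyError(f"no processing unit registered for {name!r}")
        return self._units[name](data)


layer = DataProcessingLayer()
# An "AI unit" stub that tags incoming data as an AI service result.
layer.register_unit("ai", lambda data: {"service": "ai", "input": data})
result = layer.process("ai", {"patient_id": 42})
```

In such a design, new units (e.g., a data flow unit) could be registered without changing the layer itself.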
In some embodiments, data processing layer 330 may also include a data center configured to store data. In some embodiments, the data center may employ a lake-warehouse integrated architecture, which may include data lakes and data warehouses. The data lake may be used to persist large amounts of data in a tamper-proof manner. The data warehouse may be used to store index data corresponding to data in the data lake. The data stored in the data lake may include native (or raw) data collected by the hardware device, derived data generated based on the native data, and the like. In some embodiments, the data in the data lake may be processed by a processing device (e.g., processing device 210).
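The lake-warehouse relationship described above might be sketched as follows, with an append-only record list standing in for the data lake and a lookup index standing in for the data warehouse. The class and field names are illustrative assumptions:

```python
import hashlib
import json


class LakeWarehouse:
    """Toy lake-warehouse sketch: the 'lake' is an append-only list of raw
    records; the 'warehouse' keeps index entries (record id -> offset and
    content digest) so records can be found without scanning the lake."""

    def __init__(self):
        self._lake = []        # append-only raw/derived records
        self._warehouse = {}   # record_id -> {"offset": int, "digest": str}

    def ingest(self, record_id, record):
        payload = json.dumps(record, sort_keys=True).encode()
        digest = hashlib.sha256(payload).hexdigest()
        self._lake.append(payload)
        self._warehouse[record_id] = {"offset": len(self._lake) - 1, "digest": digest}
        return digest

    def fetch(self, record_id):
        entry = self._warehouse[record_id]
        payload = self._lake[entry["offset"]]
        # Re-hash the stored payload as a simple tamper check.
        assert hashlib.sha256(payload).hexdigest() == entry["digest"]
        return json.loads(payload)


store = LakeWarehouse()
store.ingest("scan-001", {"modality": "CT", "patient": "p1"})
record = store.fetch("scan-001")
```

A production lake would of course use durable storage and stronger tamper-proofing; the digest check here merely illustrates the idea.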
The application development layer 340 may be configured to support application development, publishing, subscribing, and the like. The application development layer 340 is also referred to as an ecological suite layer. In some embodiments, the application development layer 340 may be configured to provide an open interface for application developers to access or invoke at least a portion of the data processing units and to utilize at least a portion of the data processing units to develop applications. In some embodiments, as shown in fig. 3, the application development layer 340 may provide development kits, an application market, a multi-tenant operation platform, a cloud official website, workspaces, and other support kits to assist developers in their work.
The service layer 350 may be configured to enable relevant users of the hospital service to access user services related to the hospital service through the user space application.
The present application provides a hospital support platform designed for comprehensive management of various resources in a hospital, including hardware resources, software resources, and data resources. In some embodiments, the platform further integrates data processing units capable of supporting advanced technologies, such as artificial intelligence, XR, digital twin, and blockchain. These advanced technologies are used to improve the efficiency and quality of service in the healthcare industry. For example, artificial intelligence technology enables autonomous evolution and continuous optimization of hospital operations, while XR and digital twin technologies facilitate the creation and maintenance of virtual hospitals. The virtual hospital can interact with the user, providing an immersive novel service experience. In addition, the platform includes an application development layer for granting third-party developers in the healthcare industry access to these advanced technologies. This access fosters an open ecosystem that encourages the development and innovation of applications, and thus advances medical services.
Fig. 4 is a block diagram of an exemplary medical treatment system 400 according to some embodiments of the present description. As shown in fig. 4, the medical treatment system 400 may include a registration module 410, a waiting module 420, an inquiry module 430, and/or a post-diagnosis service module 440. In some embodiments, one or more of the modules shown in fig. 4 may be implemented by the processing device 210.
Registration module 410 may be used to perform operations related to the registration link and/or provide services related to the registration link, such as acquisition/creation of electronic health records, intelligent registration services, path planning, and path guidance. For a detailed description of registration module 410, see the description below regarding the registration link 510.
The waiting module 420 may be configured to perform operations related to the waiting link and/or provide services related to the waiting link, such as pre-inquiry services, force/temperature feedback, and the like. For a detailed description of the waiting module 420, see the description below regarding the waiting link 520.
The inquiry module 430 may be used to perform related operations of the inquiry link and/or provide related services of the inquiry link, such as current-day outpatient previews, medical data presentation, providing consultation advice, diagnostic record generation, and remote companion services. For a detailed description of the inquiry module 430, reference is made to the description of the inquiry link 530 that follows.
The post-diagnosis service module 440 may be used to perform operations related to the post-diagnosis link and/or provide services related to the post-diagnosis link, such as providing medication pickup services, providing examination services, health care, and the like. For a detailed description of the post-diagnosis service module 440, reference is made to the description of the post-diagnosis link 540.
In some embodiments, registration module 410, waiting module 420, inquiry module 430, and/or post-diagnosis service module 440 may be implemented on the same or different processing devices. In some embodiments, the medical treatment system 400 may include one or more other modules and/or omit one or more of the modules described above. In some embodiments, a module may be split into multiple modules, which may also be combined into a module.
Fig. 5 is a schematic diagram of an exemplary medical treatment procedure 500 shown in accordance with some embodiments of the present description. As shown in fig. 5, the medical treatment procedure 500 includes a registration link 510, a waiting link 520, an inquiry link 530, a post-diagnosis link 540, and so on. In some embodiments, the processing device 210 and/or the medical treatment system 400 may perform the operations involved in the various links of the medical treatment procedure 500.
The patient may reserve a doctor for a visit in the registration link 510. The relevant operations of the registration link 510 may be performed by the registration module 410. As shown in fig. 5, the relevant operations of the registration link 510 include electronic health record acquisition/establishment, intelligent registration, path planning, path guidance, and the like. In some embodiments, the patient may initiate a registration request through a third terminal (also referred to as a patient terminal) or a registration terminal of the hospital. After receiving the registration request, registration module 410 may perform one or more operations corresponding to the registration link 510. In some embodiments, the patient may initiate the registration request at the hospital site via a third terminal or registration terminal, or at another site outside the hospital via a third terminal. In some embodiments, the third terminal has installed thereon a patient space application through which the patient may initiate a registration request.
In some embodiments, upon receiving a patient's registration request, registration module 410 may obtain or establish an electronic health profile of the patient. For example, if the electronic health record of the patient is already stored in the storage device of the hospital, the registration module 410 may obtain the electronic health record of the patient, otherwise the registration module 410 may establish the electronic health record of the patient.
The electronic health record is an electronic record for recording various patient data. The patient data includes one or more of patient basic information, genetic information, medical history information, health management information, and the like. The patient basic information may include the patient's identity ID (identification card number, medical insurance card number, etc.), name, age, sex, height, weight, occupation, etc. Genetic information includes family history information, genetic test information, and the like. The medical record information includes information generated during the patient's historical visits, such as complaints, registration records, pre-inquiry records, diagnosis records, hospitalization records, operation records, pathology examination results, image data, and the like. The health management information relates to behavioral habits that have an impact on health (e.g., life habits that are beneficial or detrimental to health). In some embodiments, the electronic health record includes a three-dimensional model of the patient, three-dimensional models of multiple organs and/or tissues of the patient, etc., for visually displaying the health/condition of the patient. In some embodiments, the health management information may also include healthcare information for the patient. For a detailed description of health care information, see fig. 17 and its associated description.
In some embodiments, the information recorded in the electronic health record is multi-modal. For example, the electronic health record contains various forms of information, such as text, pictures, graphics, and voice. For another example, the electronic health record may include various types of information related to a plurality of dimensions and/or aspects, such as basic information, physical examination data, and imaging data. In some embodiments, the electronic health record may be updated as the patient's visit procedure advances. For example, the electronic health record may include the patient's basic information at registration; after the patient completes registration, it may be updated to further include the patient's complaints and registration records; after the patient completes a visit at the consulting room, it may be updated to further include diagnostic records; and while the patient is at home, it may be updated to further include the patient's healthcare information.
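The record structure and its stage-by-stage growth could be sketched as below. The section and field names mirror the description above, but the data structure itself is an illustrative assumption:

```python
from dataclasses import dataclass, field


@dataclass
class ElectronicHealthRecord:
    """Illustrative multi-section health record; section names follow the
    description above, but this layout is an assumption, not a specification."""
    basic_info: dict = field(default_factory=dict)
    genetic_info: dict = field(default_factory=dict)
    medical_history: list = field(default_factory=list)
    health_management: dict = field(default_factory=dict)

    def add_visit_record(self, record):
        """Append a record (complaint, registration, diagnosis, ...) as the
        visit procedure advances."""
        self.medical_history.append(record)


ehr = ElectronicHealthRecord(basic_info={"name": "Zhang", "age": 42})
ehr.add_visit_record({"stage": "registration", "department": "cardiology"})
ehr.add_visit_record({"stage": "diagnosis", "note": "sinus arrhythmia"})
```

Multi-modal content (images, voice) would in practice be stored by reference rather than inline.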
In some embodiments, registration module 410 may obtain patient data and build an electronic health profile of the patient based on the patient data. In some embodiments, registration module 410 may determine at least a portion of the patient data (e.g., the base information) based on the patient input. Patient input includes text input, voice input, and the like. The patient input may be obtained from a third terminal or registration terminal. In some embodiments, registration module 410 may obtain at least a portion of the patient data by reading data stored in a storage device, invoking an associated interface, and/or otherwise. For example, registration module 410 may obtain medical record information and electronic physical examination reports of a patient from hospital and physical examination center databases, respectively. For another example, registration module 410 may obtain movement information for a patient over a month from a smart terminal (e.g., cell phone, smart watch) authorized by the patient.
In some embodiments, before acquiring the patient data, the registration module 410 may use the third terminal or the registration terminal to authenticate the patient's identity and obtain the patient's authorization. Identity authentication may be performed by biometric identification (e.g., face recognition, fingerprint recognition, palm print recognition, voiceprint recognition, behavioral recognition, etc.), identity document chip recognition (e.g., identification card chip recognition, clinic card chip recognition), and/or password verification, among others.
In some embodiments, upon receiving a patient's registration request, registration module 410 may provide intelligent registration services to the patient. The intelligent registration service is used for matching a registration department, a registration doctor and a registration time period for a patient.
When the patient has an explicit registration department in mind, the corresponding registration department may be selected directly. When the patient does not, the registration module 410 may provide intelligent registration services based on the patient's complaints, examination reports uploaded by the patient, and/or historical patient information stored in the storage device, to determine a recommended department that matches the patient's visit needs. After a registration department is determined based on the recommended department, the registration module 410 may generate a registration hyperlink and send it to the patient; after clicking the hyperlink, the patient may further select a registering doctor and a registration time period. Alternatively, a registration recommendation scheme may be provided to the patient directly in the original interface.
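Department matching from complaints could be approximated with a keyword-overlap heuristic as below; a deployed system would more likely use an AI unit, and the function name, scoring rule, and keyword table are all illustrative assumptions:

```python
def recommend_department(complaint, department_keywords):
    """Score each department by keyword overlap with the patient's complaint
    and return the best match, or None if nothing matches. Illustrative only."""
    words = set(complaint.lower().split())
    scores = {
        dept: len(words & keywords) for dept, keywords in department_keywords.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None


keywords = {
    "cardiology": {"chest", "pain", "palpitations"},
    "dermatology": {"rash", "itching", "skin"},
}
dept = recommend_department("chest pain and palpitations for two days", keywords)
```

Here "chest", "pain", and "palpitations" all hit the cardiology entry, so cardiology is recommended.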
When the patient has an explicit registering doctor in mind, the corresponding registering doctor may be selected directly. When the patient does not, the registration module 410 may recommend a registering doctor to the patient based on the patient's complaints, and/or present the patient with doctor information of the registration department to assist the patient in selecting a registering doctor.
When the patient has an explicit registration time in mind, the corresponding registration time period may be selected directly. When the patient does not, the registration module 410 may recommend a registration time period with fewer existing appointments based on the appointment information of the registering doctor. In some embodiments, when a patient registers on-site at the hospital, registration module 410 may recommend the earliest registration time period for which the patient can check in on time, based on the patient's current location and the location of the registration department. In some embodiments, when the patient selects a plurality of registration departments and/or registering doctors, the registration module 410 may recommend registration time periods corresponding to the plurality of registration departments and/or registering doctors based on information such as their appointment records, treatment efficiency, and department locations, so that the patient can complete the medical treatment services of the plurality of registration departments and/or registering doctors in the shortest time.
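The two slot-selection rules just described (least-booked slot for remote registration, earliest reachable slot for on-site registration) could be sketched as one function. The slot representation and function name are illustrative assumptions:

```python
def recommend_slot(slots, now_minutes=None, travel_minutes=0):
    """Pick a registration time slot.

    slots: list of dicts {"start": minutes-from-midnight, "booked": int}.
    With now_minutes set (on-site registration), return the earliest slot the
    patient can still reach on time given travel_minutes; otherwise return
    the least-booked slot. Illustrative heuristic only.
    """
    if now_minutes is not None:
        reachable = [s for s in slots if s["start"] >= now_minutes + travel_minutes]
        if reachable:
            return min(reachable, key=lambda s: s["start"])
    return min(slots, key=lambda s: s["booked"])


slots = [
    {"start": 9 * 60, "booked": 12},
    {"start": 10 * 60, "booked": 3},
    {"start": 14 * 60, "booked": 7},
]
# Remote registration: the least-booked slot (10:00) wins.
remote_choice = recommend_slot(slots)
# On-site at 09:40 with a 30-minute walk: 10:00 is no longer reachable,
# so the earliest reachable slot is 14:00.
onsite_choice = recommend_slot(slots, now_minutes=9 * 60 + 40, travel_minutes=30)
```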
In some embodiments, after the patient determines the registering doctor and the registration period, the registration module 410 may send the patient a check-in credential (e.g., a two-dimensional code or key) and appointment information, and update the patient's electronic health record based on the registration record.
In some embodiments, the patient may be remotely registered at another location outside the hospital via the third terminal. After registration, registration module 410 may send inquiry information about the mode of visit to the third terminal. For example, the query information is used to query whether the patient is going to a hospital visit or a remote visit. When the patient chooses to go to the hospital visit, the registration module 410 may recommend an appropriate departure time to the patient via the third terminal based on the appointment information, the current location of the patient, and location information of the consulting room. When the patient selects a remote visit, the registration module 410 may alert the patient before the appointment time via the third terminal.
In some embodiments, during registration, the processing device 210 (e.g., an agent disposed on the processing device) may actively query the patient as to whether the patient needs to receive the telemedicine visit service. When the patient chooses to receive the telemedicine visit service, the registration module 410 may recommend an appropriate visit time to the patient via the third terminal based on the registration information.
In some embodiments, the registration module 410 may make a first query (also referred to as a registration query) with the patient using a third terminal or registration terminal to determine a registered doctor of the patient. For more description of determining a registering doctor, see fig. 6 and its associated description.
The path planning service is used to provide the patient with a path suggestion to the doctor's office. In particular, registration module 410 may generate a planned path to a consulting room based on the current location of the patient and location information of the consulting room of the registering doctor. In some embodiments, registration module 410 may generate the planned path after the patient registers at the hospital site. In some embodiments, the registration module 410 may generate the planned path when the patient arrives at the hospital through a remote registration and on a appointment date of a visit. For more description of path planning see fig. 7 and its associated description.
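Planning a route from the patient's current location to the consulting room can be treated as a shortest-path search over a corridor graph. The sketch below uses breadth-first search over an unweighted adjacency map; the node names and graph are illustrative assumptions:

```python
from collections import deque


def plan_path(floor_graph, start, consulting_room):
    """Breadth-first search over a hospital corridor graph (adjacency dict).
    Returns the list of waypoints from start to the consulting room, or None
    if the room is unreachable."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == consulting_room:
            return path
        for nxt in floor_graph.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None


graph = {
    "entrance": ["registration_hall"],
    "registration_hall": ["elevator", "pharmacy"],
    "elevator": ["cardiology_waiting"],
    "cardiology_waiting": ["consulting_room_3"],
}
path = plan_path(graph, "entrance", "consulting_room_3")
```

A real deployment would weight edges by walking distance (e.g., Dijkstra's algorithm) and feed the resulting waypoints to the path guidance service.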
The path guidance service is used to guide the patient to a doctor's office based on the planned path. In some embodiments, registration module 410 may present guidance information to the patient via a third terminal (e.g., an XR device) of the patient. For more description of path guidance see fig. 7 and its associated description.
In some embodiments, registration module 410 may provide relevant services of a registration link through a registration terminal of a hospital. The hospital registration terminal is a terminal provided at the hospital site and may include one or more of a display screen, an input device (e.g., a keyboard), a sound output device, a sound sensor, an XR device, and the like. Illustratively, after the patient arrives at the registration terminal of the hospital, the registration module 410 may interact with the patient through a display screen, a sound output device, a sound sensor, etc. on the registration terminal, to provide services such as intelligent registration services, path planning services, etc. for the patient. For another example, after the patient arrives at the registration terminal of the hospital, the XR device provided by the registration terminal may be worn, and the registration module 410 may provide services such as intelligent registration, path planning, path guidance, etc. for the patient through the XR device. After the patient has completed the visit, the XR device may be returned to the hospital.
In some embodiments, registration module 410 may provide services related to the registration link via a third terminal of the patient. Specifically, the registration module 410 may interact with the patient through a third terminal of the patient to provide services such as intelligent registration, path planning, path guidance, etc. to the patient. For example, the registration module 410 may provide a remote intelligent registration service to the patient through the patient's third terminal, and the registration module 410 may provide a path planning service and a path guidance service to the patient through the patient's third terminal after the patient arrives at the hospital on the appointment date.
In some embodiments, registration module 410 may receive monitoring data collected by one or more monitoring devices (e.g., infrared monitors) in the registration hall and monitor the status of the registration hall based on the monitoring data. For example, registration module 410 may monitor the congestion level of the registration hall and monitor whether a dangerous event has occurred in the registration hall.
The patient may wait for a visit and prepare for a visit in the waiting section 520. In the waiting section 520, the patient may be located at a hospital site (e.g., in a waiting area at the gate of a doctor's office), or may be located at another location outside the hospital (e.g., at the patient's home). The relevant operations of the waiting section 520 may be performed by the waiting module 420. As shown in fig. 5, the relevant operations of the waiting section 520 include pre-consultation, force feedback/temperature feedback, etc.
The pre-consultation service is used to make a preliminary inquiry to the patient to obtain patient-related information before the patient enters the consulting room for a formal consultation. Specifically, while the patient is waiting, the waiting module 420 may conduct a pre-inquiry with the patient through the patient's third terminal or a waiting terminal configured in the waiting area. This can reduce the patient's anxiety while waiting, and the generated pre-inquiry record can serve as a reference for the doctor, improving the doctor's consultation efficiency.
Force feedback/temperature feedback is used to provide soothing feedback to the patient. Specifically, while the patient is waiting, after the waiting module 420 detects that the patient is experiencing emotions such as tension, fear, or anxiety, force feedback and/or temperature feedback may be applied to the patient through the patient's wearable device, so that the patient can feel actions such as a handshake or a hug, thereby soothing the patient's negative emotions.
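The emotion-to-actuator mapping could be sketched as a simple lookup, as below. The specific force, temperature, and pattern values are illustrative assumptions, not clinical or device settings:

```python
def soothing_feedback(emotion):
    """Map a detected waiting-room emotion to wearable actuator commands
    (force in newtons, temperature in Celsius, haptic pattern). The mapping
    values here are placeholders for illustration only."""
    mapping = {
        "tension": {"force_n": 5, "temperature_c": 36.0, "pattern": "handshake"},
        "fear": {"force_n": 8, "temperature_c": 36.5, "pattern": "hug"},
        "anxiety": {"force_n": 6, "temperature_c": 36.0, "pattern": "hug"},
    }
    return mapping.get(emotion)  # None -> no feedback needed


command = soothing_feedback("fear")
```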
For more description of pre-inquiry and force feedback/temperature feedback, see fig. 8 and its associated description.
In some embodiments, the waiting module 420 may provide relevant services of the waiting link through a waiting terminal of the hospital. Similar to the registration terminal, the waiting terminal is a terminal provided at the hospital site, and may include one or more of a display screen, a sound output device, a sound sensor, an XR device, a wearable device, and the like. For example, after the patient arrives at the waiting area of the registered department, the waiting module 420 may interact with the patient through the waiting terminal arranged in the waiting area to provide a pre-inquiry service for the patient. For another example, after the patient arrives at the waiting area of the consulting room, the patient may wear a wearable device provided by the waiting terminal, through which the waiting module 420 may provide force feedback/temperature feedback services for the patient.
In some embodiments, the waiting module 420 may provide relevant services of the waiting link via the XR equipment of the hospital. For example, after a patient wearing the XR device of the registration terminal arrives at the waiting area, the waiting module 420 may provide pre-inquiry services to the patient via that XR device. In some embodiments, the waiting terminal includes an XR device, and the waiting module 420 may provide pre-inquiry services to the patient via the XR device of the waiting terminal.
In some embodiments, the waiting module 420 may receive monitoring data collected by one or more monitoring devices (e.g., infrared monitoring) in the waiting area and monitor the status of the waiting area based on the monitoring data. For example, the waiting module 420 may monitor waiting time of a waiting patient and monitor whether a dangerous event has occurred in the waiting area.
In some embodiments, the waiting module 420 may provide relevant services of the waiting link through the patient's third terminal. For example, the waiting module 420 may remotely provide a pre-inquiry service, through the patient's third terminal, for a patient waiting for an online visit. For another example, the waiting module 420 may provide a pre-inquiry service through the patient's third terminal for a patient waiting in the hospital.
The patient may communicate with the registering doctor in the inquiry link 530 to receive medical treatment services. In the inquiry link 530, the patient may be located at the clinic site for on-site consultation services, or may be located at another site outside the hospital (e.g., at the patient's home) for remote consultation services. Specifically, the relevant operations of the inquiry link 530 may be performed by the inquiry module 430. As shown in FIG. 5, the relevant operations of the inquiry link 530 include current-day outpatient previews, medical data presentation, providing consultation advice, diagnostic record generation, remote companion service, and the like.
The current-day outpatient preview is used to show the doctor information about the patients to be treated that day. For example, before the doctor begins the visit, the inquiry module 430 may present the doctor with the categories of patients who have registered or are awaiting the doctor's visit that day. The categories of patients may include first-visit patients, return-visit patients, and the like. When a return-visit patient is present, the inquiry module 430 may present the doctor with changes in that patient's condition based on the return-visit patient's information. In some embodiments, the doctor may view the current-day outpatient preview through a first terminal in the consulting room or through his or her own second terminal (e.g., an XR device). In some embodiments, a doctor's second terminal has installed thereon a doctor space application through which the doctor can obtain the current-day outpatient preview. Alternatively, before the doctor needs to provide outpatient service, the doctor space application may issue a prompt to remind the doctor to review the current-day outpatient preview.
The medical data presentation is used to present medical data, such as an electronic health record, of a patient to the doctor, the patient, and/or a remote companion. Specifically, the inquiry module 430 can synchronously display the patient's medical data to the doctor, the patient, and/or the remote companion through at least one terminal, and update the display mode and/or display content of the electronic health record according to the interaction operations of the doctor, the patient, and/or the remote companion on the medical data, facilitating communication among them. For more description of the presentation of medical data, see fig. 12 and its associated description.
Providing consultation advice refers to providing advice that the doctor can refer to during a consultation. Exemplary consultation advice may include supplemental inquiry advice, physical examination advice, prescription advice, treatment advice, and the like. Specifically, the inquiry module 430 may generate consultation advice for the doctor based on the sensing information collected by the sensing device during the consultation, so that the doctor can adjust the consultation approach, the diagnosis result, and the prescription, thereby improving consultation efficiency and diagnostic accuracy. For further description of providing consultation advice, reference may be made to the relevant description of sub-process 1310, which is not repeated here.
Diagnostic record generation is used to record the patient's condition information, the doctor's diagnosis, doctor's orders, etc., to assist the doctor in generating a diagnostic record of the patient. Specifically, the inquiry module 430 may generate an initial diagnostic record based on the sensing information collected by the sensing device during the consultation, and generate a target diagnostic record according to the doctor's feedback on the initial diagnostic record, thereby reducing the doctor's paperwork and improving the efficiency and accuracy of diagnostic record generation. For further description of diagnostic record generation, reference may be made to sub-flow 1320 and the associated description of FIG. 15, which are not repeated here.
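The two-step pipeline (initial record from sensed data, then doctor feedback producing the target record) could be sketched as below. A real system would use speech recognition and an AI unit; the keyword rules, function names, and transcript format here are purely illustrative assumptions:

```python
def generate_initial_record(transcript):
    """Very rough sketch: pull the complaint and a stated diagnosis out of a
    sensed consulting-room transcript, given as (speaker, text) pairs."""
    record = {"complaint": None, "diagnosis": None}
    for speaker, text in transcript:
        if speaker == "patient" and record["complaint"] is None:
            record["complaint"] = text
        if speaker == "doctor" and "diagnosis:" in text:
            record["diagnosis"] = text.split("diagnosis:", 1)[1].strip()
    return record


def apply_doctor_feedback(initial_record, corrections):
    """Merge the doctor's corrections into the initial record to produce
    the target diagnostic record."""
    return {**initial_record, **corrections}


transcript = [
    ("patient", "I have had chest pain for three days."),
    ("doctor", "diagnosis: suspected angina, ECG recommended"),
]
initial = generate_initial_record(transcript)
target = apply_doctor_feedback(initial, {"diagnosis": "stable angina"})
```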
The remote companion service is used to provide an immersive accompanying experience for the patient and the remote companion. For example, the inquiry module 430 may present to the remote companion a virtual consulting room space emulating the real-world consulting room, a real-time view of the doctor within the consulting room, and a real-time view of the patient. For another example, the inquiry module 430 may display a real-time view of the remote companion through the first terminal in the consulting room. For another example, the inquiry module 430 may present real-time views of the remote companion to the doctor and the patient through the doctor's second terminal and the patient's third terminal, respectively. For another example, the inquiry module 430 may obtain force feedback and/or temperature feedback from a wearable device worn by the remote companion and apply corresponding force feedback and/or temperature feedback to the patient via a wearable device worn by the patient. For further description of the remote companion service, reference may be made to sub-flow 1330 and the associated description of fig. 16, which are not repeated here.
In some embodiments, the inquiry module 430 may also provide other services related to the inquiry link 530, such as assisting the doctor with workflow-related tasks such as calling patient numbers and approving accompanying requests.
In some embodiments, the inquiry module 430 may provide relevant services of the inquiry link to the doctor and patient through a first terminal in the consulting room. The first terminal is a terminal disposed at the clinic site and may include one or more of a display screen, a sound output device, a sound sensor, an XR device, a wearable device, and the like. For example, after the patient enters the consulting room, the inquiry module 430 may present the doctor and/or patient with an electronic health record, consultation advice, pre-inquiry records, diagnostic records, real-time views of remote companions, etc., via the large screen of the first terminal. For another example, the patient may wear a wearable device of the first terminal and receive force feedback of the remote companion through the wearable device. For another example, the doctor and patient may wear XR devices of the first terminal, and the inquiry module 430 may present various types of information to the doctor and/or patient via the XR devices of the first terminal. In some embodiments, the XR device worn by the patient may also be an XR device configured on the registration terminal. For more description of the first terminal's presentation content, see fig. 14 and its associated description.
In some embodiments, the inquiry module 430 may provide the doctor and the patient with relevant services of the inquiry link through the doctor's second terminal and the patient's third terminal. For example, the inquiry module 430 may display an electronic health record, consultation suggestions, diagnosis records, a real-time view of the remote companion, etc. to the doctor and the patient through the doctor's second terminal and the patient's third terminal, respectively. At least part of the content presented by the second terminal and the third terminal is synchronized. For example, the electronic health records presented on the second terminal and the third terminal are synchronized: when the doctor or the patient updates the display mode or display content of the electronic health record through their respective terminal, the electronic health record on the other terminal is updated accordingly. In some embodiments, the inquiry module 430 may further provide the remote companion with relevant services of the inquiry link through a fourth terminal (e.g., an XR terminal) of the remote companion. The fourth terminal may also be referred to as a companion terminal.
After the patient has finished communicating with the doctor, a post-diagnosis link 540 may be entered. The operations associated with the post-diagnosis link 540 may be performed by the post-diagnosis service module 440. As shown in FIG. 5, the operations associated with the post-diagnosis link 540 include providing a medication intake service, providing an examination service, health monitoring, and the like.
The medication intake service is used to assist the patient in obtaining the medication prescribed by the doctor. For example, the medication intake service assists the patient in paying for medications, reserving pickup at a pharmacy, guiding the patient to the pharmacy, and the like. The examination service is used to assist the patient in receiving the examinations the doctor has ordered. For example, the examination service assists the patient in paying examination fees, reserving examinations with the examination department, guiding the patient to the examination department, and the like.
In some embodiments, in response to detecting the end of the inquiry process, the post-diagnosis service module 440 may determine a target service to be provided to the patient after the inquiry process and make a reservation for the patient with a target business department that provides the target service. The post-diagnosis service module 440 may detect whether the inquiry process has ended in various ways. For example, the post-diagnosis service module 440 may determine that the patient's inquiry process is over when it detects that the doctor has submitted a target diagnosis record for the patient or that the doctor has called the next patient.
In some embodiments, the post-diagnosis service module 440 may determine the target service required by the patient based on the target treatment prescription in the target diagnosis record. For example, the target service may be the medication intake service and the target business department may be a pharmacy. When the patient requires the medication intake service, the post-diagnosis service module 440 may send the patient's medication order information to the pharmacy and reserve the medication pickup. As another example, the target service may be the examination service and the target business department may be an examination department. When the patient requires the examination service, the post-diagnosis service module 440 may transmit the patient's examination prescription information to the examination department and reserve the examination. Alternatively, after making an appointment with the target business department, the post-diagnosis service module 440 may send the appointment information to the third terminal of the patient. In some embodiments, the post-diagnosis service module 440 may first send a fee payment reminder to the third terminal of the patient after detecting the end of the inquiry process, and make the reservation with the target business department after the patient pays the fee for the target service.
In some embodiments, the post-diagnosis service module 440 may provide the patient with a path planning service and a path guidance service to the target business department based on the current location of the patient and the location of the target business department. These services are similar to the path planning and path guidance of registration link 510 and are not described in detail here.
The health monitoring service is used to continuously monitor the health condition of the patient after the inquiry process has ended. A detailed description of the health monitoring service can be found in fig. 17 and its related description, and will not be repeated here.
In some embodiments, one or more operations in the medical treatment procedure 500 may be performed by an agent corresponding to the medical treatment service/procedure. An agent is a computational entity capable of autonomous learning and autonomous evolution that senses and analyzes data to perform specific tasks and/or achieve specific goals. The agent can continuously learn and self-evolve through interaction with its environment via AI technologies such as reinforcement learning and deep learning. In addition, the agent can collect and analyze massive amounts of information using big data technology and mine patterns and learn rules from the data, thereby optimizing its decision process, recognizing environmental changes, responding quickly, and making reasonable judgments in uncertain or dynamic environments.
In some embodiments, different links of the medical treatment procedure 500 may share an agent, or each have a corresponding agent. For example, the operations associated with registration link 510, waiting link 520, inquiry link 530, and post-diagnosis link 540 may be performed by a registration agent, a pre-consultation agent, a visit agent, and a post-diagnosis agent, respectively. For another example, the intelligent registration service in registration link 510 may be performed by a registration agent, the pre-consultation service in waiting link 520 may be performed by a pre-consultation agent, and the path planning and path guidance in registration link 510 and post-diagnosis link 540 may be performed by a guidance agent. As another example, all links or operations in the medical treatment procedure 500 may be performed by the same medical treatment service agent.
In some embodiments, different departments, different disease types, and/or different hospitals may correspond to different agents. The medical service corresponding to a specific department and/or a specific disease type can be realized by the agent corresponding to that department and/or disease type. In some embodiments, an agent may refer to knowledge data of its corresponding department and/or disease type, such as a dictionary, knowledge graph, template, etc., at run time. Alternatively, the knowledge data may be set by a hospital administrator (e.g., a department head). In some embodiments, multiple agents may cooperate through network communications, sharing information and jointly accomplishing complex tasks.
Fig. 6 is a schematic diagram of an exemplary flow of determining a doctor with whom a patient is to register, according to some embodiments of the present description. The flow 600 shown in fig. 6 may be performed in the registration link 510.
At step 610, patient complaints are obtained.
A patient complaint is the patient's self-description of his or her condition and/or symptoms. For example, a patient complaint may include descriptions of the affected area, the time course of the condition, symptom intensity, symptom frequency, related life events, the purpose of the visit, and the like. In some embodiments, the form of a patient complaint may include, but is not limited to, text, speech, pictures, gestures, and the like. For example, a patient complaint may include a textual, spoken, or sign-language description of the patient's condition. For another example, a patient complaint may include a picture of the patient's lesion. For another example, a patient complaint may include the patient's description of his or her pain level.
In some embodiments, registration module 410 may obtain the patient complaint through the third terminal of the patient or a registration terminal of the hospital. Illustratively, the registration module 410 may obtain a patient complaint in text form through a keyboard (physical keyboard, on-screen keyboard, and/or virtual keyboard) of the third terminal or the registration terminal. For another example, registration module 410 may obtain a patient complaint in voice form through a sound sensor of the third terminal or the registration terminal. As yet another example, the registration module 410 may obtain a video of the patient through an image sensor of the third terminal or the registration terminal, recognize the patient's gestures in the video, and then determine the patient complaint in sign language from those gestures. As yet another example, the registration module 410 may capture a picture of the patient's lesion through an image sensor of the third terminal or the registration terminal.
At step 620, at least one candidate department is determined based on the patient complaint.
A candidate department is a department preliminarily determined based on the patient's condition and/or symptoms.
Specifically, registration module 410 may extract a first keyword of the patient complaint. The first keyword may include keywords that summarize the condition and/or symptoms, such as the affected part, symptom, time, intensity, frequency, etc. Illustratively, registration module 410 may extract the first keyword in a patient complaint in text form through a keyword extraction algorithm. Keyword extraction algorithms may include, but are not limited to, the TF-IDF algorithm, topic model algorithms, the TextRank algorithm, the RAKE algorithm, and the like. For another example, registration module 410 may identify the first keyword in a patient complaint in speech form through speech recognition techniques. Speech recognition techniques may include, but are not limited to, automatic speech recognition (ASR) techniques, computer speech recognition techniques, speech-to-text (STT) techniques, and the like. For another example, registration module 410 may identify body parts, symptoms, etc. in a patient complaint in picture form as the first keyword through image recognition techniques. Image recognition techniques may include, but are not limited to, image feature extraction techniques, object detection techniques, object recognition techniques, and the like. As yet another example, registration module 410 may identify the first keyword in a patient complaint in gesture form through gesture recognition techniques, which may include gesture trajectory recognition, static gesture recognition, gesture analysis techniques, and the like.
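As an illustrative sketch only (not the implementation described here), a minimal TF-IDF-style keyword ranking over a text-form complaint might look as follows; the corpus of historical complaints and the smoothing scheme are assumptions for demonstration:

```python
from collections import Counter
import math

def extract_keywords(complaint, corpus, top_k=3):
    """Rank words in a complaint by a simple TF-IDF score.

    `corpus` is a list of reference documents (e.g., historical
    complaints) used to estimate inverse document frequency, so
    common words like "pain" are demoted relative to rarer ones.
    """
    words = complaint.lower().split()
    tf = Counter(words)
    n_docs = len(corpus)
    scores = {}
    for word, count in tf.items():
        # Number of corpus documents containing the word
        df = sum(1 for doc in corpus if word in doc.lower().split())
        idf = math.log((1 + n_docs) / (1 + df)) + 1  # smoothed IDF
        scores[word] = (count / len(words)) * idf
    return [w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]]
```

In practice a production system would use a dedicated extraction algorithm (TF-IDF, TextRank, RAKE) over a tokenizer suited to the language of the complaint; this sketch only shows the scoring idea.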
Further, registration module 410 can determine at least one candidate department based on the first keyword. For example, registration module 410 may determine at least one candidate department by retrieving a preset keyword-department lookup table. The preset keyword-department lookup table can be generated in advance based on medical knowledge, doctor experience, historical registration records, and other information, and records various complaint keywords and their corresponding departments. For another example, registration module 410 can input the first keyword into a department determination model to determine at least one candidate department. The department determination model is a pre-trained machine learning model that processes model inputs to output recommended candidate departments. In some embodiments, the preset keyword-department lookup table and/or the department determination model may be learned from historical registration records by the agent corresponding to the medical treatment procedure.
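A minimal sketch of the lookup-table approach; the table entries below are hypothetical placeholders (a real table would be built from medical knowledge and historical registration records):

```python
# Hypothetical keyword-department lookup table (illustrative entries only).
KEYWORD_DEPARTMENT_TABLE = {
    "headache": ["neurology"],
    "fever": ["internal medicine", "orthopedics"],
    "knee pain": ["orthopedics"],
    "chest pain": ["cardiology"],
}

def candidate_departments(first_keywords):
    """Collect every department whose table entry matches a keyword,
    preserving first-match order and removing duplicates."""
    departments = []
    for kw in first_keywords:
        for dept in KEYWORD_DEPARTMENT_TABLE.get(kw, []):
            if dept not in departments:
                departments.append(dept)
    return departments
```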
Step 630, based on the at least one candidate department, controlling the third terminal or registration terminal to make a first query to the patient.
The first query is used to further clarify the patient's needs for the visit, so as to determine a matching registration department and/or registering doctor for the patient. The first query may also be referred to as a registration query. In some embodiments, the first query may include multiple rounds of dialog. Each round of dialog may include one query and one patient answer.
In some embodiments, registration module 410 may determine first query content of the first query based on the patient complaint and the at least one candidate department.
The first query content includes at least the content of the first round of query. For example, registration module 410 may input the patient complaint and the at least one candidate department into a first query model that outputs the content of the first round of query. In some embodiments, the first query model may include, but is not limited to, a convolutional neural network (CNN) model, a recurrent neural network (RNN) model, a long short-term memory (LSTM) network model, a BERT model, a ChatGPT model, and the like. In some embodiments, the first query model may be trained on a first training sample set. The first training sample set may include a plurality of first training samples and a plurality of corresponding first training labels. A first training sample may include a sample patient complaint and sample candidate departments, and the corresponding first training label may include the content of a sample first round of query. The first training samples and first training labels may be derived from historical registration records and/or determined manually by a user. In some embodiments, a first training sample may be input into an initial model, the value of a loss function determined based on the output of the initial model and the first training label, and the parameters of the initial model iteratively updated based on the value of the loss function until the value of the loss function reaches a preset value or the number of iterations reaches a preset number, thereby obtaining the trained first query model.
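The stopping rule described above (iterate until the loss reaches a preset value or the iteration count reaches a preset number) can be sketched with a toy one-parameter model and gradient descent; the model, learning rate, and thresholds are illustrative assumptions, not the actual first query model:

```python
def train(samples, labels, lr=0.1, loss_target=1e-4, max_iters=1000):
    """Fit a one-parameter model y = w * x by gradient descent,
    mirroring the loop: update parameters until the loss reaches a
    preset value or the iteration count reaches a preset number."""
    w = 0.0
    loss = float("inf")
    for _ in range(max_iters):
        # Mean squared error over the training set
        preds = [w * x for x in samples]
        loss = sum((p - y) ** 2 for p, y in zip(preds, labels)) / len(samples)
        if loss <= loss_target:
            break  # preset loss value reached
        # Gradient of the MSE loss with respect to w
        grad = sum(2 * (p - y) * x for p, x, y in zip(preds, samples, labels)) / len(samples)
        w -= lr * grad
    return w, loss
```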
For another example, the registration module 410 may obtain the keywords corresponding to each candidate department based on the preset keyword-department lookup table, determine the difference words between the keywords corresponding to any two candidate departments, and then determine the content of the first round of query from the difference words. For example, if the keywords corresponding to orthopedics are "local pain, fever" and the keywords corresponding to vascular surgery are "leg pain, cold", the difference words between the two departments' keywords are "fever" and "cold", and the content of the first round of query determined from the difference words may be "do your legs feel hot or cold?".
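The difference-word computation can be sketched as a set symmetric difference; the department keyword lists here are illustrative:

```python
def difference_words(keywords_a, keywords_b):
    """Return keywords that appear in exactly one of the two
    departments' keyword lists; these are the words that
    discriminate between the two candidate departments."""
    return sorted(set(keywords_a) ^ set(keywords_b))
```

The resulting words can then seed the first round of query, e.g. asking the patient which of the discriminating symptoms applies.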
In some embodiments, for subsequent rounds of the first query, registration module 410 may utilize a second query model to determine the corresponding query content based on the patient complaint, the at least one candidate department, and the historical rounds of dialog. The second query model may include a CNN model, an RNN model, an LSTM model, a BERT model, a ChatGPT model, and the like. In some embodiments, the second query model may be trained on a second training sample set. The second training sample set may include a plurality of second training samples and a plurality of corresponding second training labels. A second training sample may include a sample patient complaint, sample candidate departments, and a sample history of dialog rounds, and the corresponding second training label may include the content of the sample current round of query. The second training samples and second training labels may be derived from historical registration records and/or determined manually by a user. The training process of the second query model is similar to that of the first query model and is not described in detail here.
In some embodiments, registration module 410 may make the first query through the third terminal or the registration terminal by way of text display, voice output, or the like. For example, the first query content may be displayed on a screen of the third terminal or the registration terminal. For another example, the first query content may be played through a speaker of the third terminal or the registration terminal.
In some embodiments, registration module 410 may control the third terminal or the registration terminal to display a first virtual character that makes the first query. The first virtual character is a digital character with specific appearance characteristics, language characteristics, etc., which can communicate with the patient and assist the patient in registering. Specifically, the registration module 410 may display the first virtual character through a screen of the third terminal or the registration terminal and play the first query content through a speaker of the third terminal or the registration terminal. Meanwhile, the first virtual character can simulate the expressions, movements, etc. of human speech, providing the patient with a realistic communication experience.
In some embodiments, the first virtual character has preset appearance characteristics. For example, the first virtual character may be a nurse. In some embodiments, registration module 410 may determine the appearance characteristics of the first virtual character based on the patient's basic information, which may include the patient's age, sex, occupation, etc. Specifically, registration module 410 may determine appearance characteristics of the first virtual character based on the patient's basic information and generate a corresponding first virtual character based on those characteristics. For example only, the appearance characteristics may include body shape features, skin features, facial features, clothing features, and the like. For example, for a 70-year-old female patient, the first virtual character may be a virtual female nurse with strong affinity. For another example, for a patient whose occupation is doctor, the first virtual character may have a professional image. In some embodiments, registration module 410 may select the first virtual character from a plurality of candidate first virtual characters based on the patient's basic information. In some embodiments, the first virtual character may initiate the first query to the patient using sound features. For a detailed description of sound features, see the relevant description of step 820.
In some embodiments, the third terminal or registration terminal may include an XR device, and registration module 410 may control the XR device to present the first virtual character making the first query. The XR device may display the first virtual character within the field of view of the patient and play the first query content. The patient's field of view may present the real world within the patient's gaze or present a virtual background. In some embodiments, registration module 410 may synchronously display the first query content within the field of view of the patient via the XR device. FIG. 10 is a schematic diagram of a process for presenting a virtual character by an XR device, according to some embodiments of the present description. As shown in FIG. 10, the XR device may display a virtual character 1010 (e.g., the first virtual character) within the patient's field of view and simultaneously display query content 1020 (e.g., the first query content).
In some embodiments, registration module 410 may end the first query based on a first preset condition. The first preset condition may be that the number of dialog rounds reaches a threshold, e.g., 10 rounds. The first preset condition may also be that the patient inputs a preset answer or that the query content matches a preset query. For example, the first preset condition is satisfied when the patient inputs the answer "end" or "no other symptoms". For another example, the first preset condition is satisfied when the content of the current round of query is "No other questions, thank you!".
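The first preset condition can be sketched as a small predicate; the round threshold and the preset answer/query strings below are assumptions taken from the examples above:

```python
MAX_ROUNDS = 10  # assumed dialog-round threshold
END_ANSWERS = {"end", "no other symptoms"}
END_QUERIES = {"no other questions, thank you!"}

def should_end_first_query(round_count, patient_answer, query_content):
    """Return True when any variant of the first preset condition is met:
    round threshold reached, preset patient answer, or preset query."""
    if round_count >= MAX_ROUNDS:
        return True
    if patient_answer.strip().lower() in END_ANSWERS:
        return True
    if query_content.strip().lower() in END_QUERIES:
        return True
    return False
```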
In some embodiments of the present description, the first query is made to the patient by a first virtual character, and the anthropomorphic first virtual character enhances interaction with the patient without increasing labor costs, thereby improving the quality and efficiency of the registration service. In some embodiments of the present description, a machine learning model (e.g., the first query model and/or the second query model) is utilized to determine the first query content of the first query. Compared with using preset query content, this approach can improve query efficiency and quality and improve the accuracy of the subsequently determined registration department and/or registering doctor.
Step 640, determining a doctor with whom the patient is to register based on first data collected by the third terminal or registration terminal during the first query.
The first data may include various types of data input by the patient via the third terminal or the registration terminal, such as voice data, text data, image data, and the like. In some embodiments, a sound sensor is configured on the third terminal or the registration terminal, and the first data may be collected by the sound sensor. Specifically, registration module 410 may determine a confidence level for each candidate department based on the first data, the patient complaint, and the at least one candidate department. The confidence level of a candidate department is a numerical value evaluating the degree to which the candidate department matches the patient's condition/symptoms. The higher the confidence level of a candidate department, the higher its degree of match with the patient's condition/symptoms.
For example, for each candidate department, the registration module 410 may determine the keywords of the candidate department based on the preset keyword-department lookup table, count the number and frequency of the candidate department's keywords occurring in the transcribed text of the first data and in the patient complaint, and determine the confidence level of the candidate department based on that number and frequency. For another example, the transcribed text of the first data, the patient complaint, and the at least one candidate department may be processed using a trained department determination model, which outputs a confidence level for each candidate department.
Further, the registration module 410 may determine at least one recommended department based on the confidence levels of the at least one candidate department, so that the patient may select a registration department from the at least one recommended department. For example, registration module 410 may determine candidate departments with confidence levels greater than a preset threshold as recommended departments and control the third terminal or registration terminal to display the recommended departments, from which the patient selects the registration department. After the registration department is determined, the third terminal or the registration terminal can display a registration link corresponding to the registration department, and the patient further selects a registering doctor and a registration time by clicking the registration link on the terminal.
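One way to sketch the count-based confidence and the threshold selection described above; the scoring weights, the saturating frequency term, and the threshold value are illustrative assumptions:

```python
def department_confidence(dept_keywords, transcript, complaint):
    """Score a candidate department from the number and frequency of
    its keywords occurring in the transcribed first data and the
    patient complaint. Returns a value in [0, 1]."""
    text = (transcript + " " + complaint).lower()
    matched, occurrences = 0, 0
    for kw in dept_keywords:
        n = text.count(kw.lower())
        if n:
            matched += 1
            occurrences += n
    if not dept_keywords:
        return 0.0
    coverage = matched / len(dept_keywords)      # distinct keywords hit
    frequency = occurrences / (occurrences + 1)  # saturates toward 1
    return 0.7 * coverage + 0.3 * frequency

def recommended_departments(confidences, threshold=0.5):
    """Keep candidate departments whose confidence exceeds the threshold."""
    return [dept for dept, c in confidences.items() if c > threshold]
```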
In some embodiments, registration module 410 may present to the patient, via the third terminal or registration terminal, the basic information and available appointment times of each registering doctor corresponding to the registration department. For example only, the registration module 410 may control the XR device of the third terminal or registration terminal to present a virtual character of a registering doctor, which introduces to the patient the basic information and available appointment times of that doctor. In some embodiments, the registration module 410 may recommend a registering doctor and a registration time slot to the patient.
In some embodiments of the present disclosure, by communicating with the patient through the first query, the patient's needs for the visit are fully understood, thereby improving the accuracy of the registration advice provided to the patient, improving registration efficiency, and improving the patient's interactive experience.
In some embodiments, the process 600 may be performed by an agent corresponding to the medical care service/process. For example, the intelligent registration service may be provided by a registration agent. The registration agent can understand and process natural language and images, allowing the patient to describe symptoms in his or her own words, and can accurately evaluate the patient's condition to give registration advice with high confidence. Meanwhile, the registration agent can learn from knowledge data and historical service data, continuously improving the accuracy and efficiency of the intelligent registration service and improving the user experience.
In some embodiments, a manager of the hospital may configure the registration agent, for example, through a management space application installed on the manager's terminal. The configuration of the registration agent includes the configuration of general parameters related to hardware configuration, concurrent request processing, caching policies, security settings, model parameters, error handling and logging, performance monitoring and alerting, version management, etc. The configuration of the registration agent also includes the configuration of business parameters. The business parameters relate to selectable departments, doctor lists and number sources, and query rounds.
Fig. 7 is a schematic diagram of a path planning and path guidance service shown in accordance with some embodiments of the present description. Path planning and path guidance are associated with registration link 510.
After the patient has registered with the corresponding doctor, registration module 410 may generate a planned path based on the current location of the patient and the location information of the doctor's consulting room. The current location of the patient is the location at which the patient is at the current time. Registration module 410 may obtain the current location of the patient based on the location information of the third terminal or registration terminal. The planned path is a path that guides the patient to the consulting room. In some embodiments, registration module 410 may determine a path from the current location of the patient to the location of the consulting room based on a real-time three-dimensional map of the hospital. The real-time three-dimensional map of the hospital is a three-dimensional map reflecting the layout and real-time status of the hospital.
In some embodiments, when a patient has registered with multiple doctors and needs to travel to multiple consulting rooms, registration module 410 may determine the order of visits for the multiple consulting rooms, and the planned path through them, based on one or more of the patient's current location, registration records, location information of the multiple consulting rooms, and the like. For example, registration module 410 may determine the patient's estimated wait time for each consulting room based on the current day's visit records of each consulting room's doctor and the patient's registration records. A detailed description of determining the estimated wait time can be found in the description of step 810. Further, registration module 410 may sort the consulting rooms in descending order of estimated wait time, and then determine the planned path through the sorted consulting rooms.
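A sketch of the room-ordering step, assuming the estimated wait times are already computed (sorted in descending order of estimated wait, as stated above):

```python
def visit_order(rooms):
    """rooms: list of (room_name, estimated_wait_minutes) tuples.
    Return room names sorted by estimated wait time, longest first,
    mirroring the descending-order rule described above."""
    return [name for name, wait in sorted(rooms, key=lambda r: r[1], reverse=True)]
```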
After the planned path is generated, registration module 410 may control the third terminal or registration terminal of the patient to present guidance information related to the planned path. The guidance information is information directing the patient to advance along the planned path. For example, the guidance information may include guiding gestures and guiding speech of a virtual guide character (e.g., the first virtual character). For another example, the guidance information may include arrows, text, and the like.
In some embodiments, the third terminal or registration terminal includes an XR device, and registration module 410 may cause the XR device to overlay the guidance information on the patient's real-world field of view via AR or MR technology. That is, while wearing the XR device, the patient can simultaneously see the surrounding real world and the guidance information. The real-world field of view of a patient is the real world within the patient's eye gaze range. The patient's eye gaze range changes as the patient's head moves, and the corresponding real-world field of view changes with it.
For example, as shown in fig. 7, when the patient's head is in the orientation shown, the patient's real-world field of view may include the real hospital scene within the dashed box, and registration module 410 may superimpose a virtual guide character on the real-world field of view corresponding to the dashed box, guiding the patient along the planned path through the gestures and speech of the virtual guide character. For another example, registration module 410 may superimpose a virtual arrow on the real-world field of view corresponding to the dashed box, thereby guiding the patient along the planned path. As the patient moves along the planned path, the patient's real-world field of view changes in real time, and registration module 410 may superimpose guidance information on the changing view to continue guiding the patient.
In some embodiments of the present description, providing path planning for the patient and path guidance based on it helps the patient reach the consulting room quickly, improving the efficiency of the visit. In addition, in some embodiments, route guidance through AR/MR technology allows the guidance information to be presented to the patient more precisely, improving guidance accuracy without affecting the patient's perception of the real world.
Fig. 8 is a schematic diagram of an exemplary flow of providing a pre-consultation service according to some embodiments of the present description. The process 800 shown in fig. 8 may be performed in the waiting section 520.
Step 810, determining second query content of the second query based on the department of the doctor.
The second query, which may also be referred to as a pre-consultation query, is used to make a preliminary inquiry of the patient before the formal consultation. The second query may include multiple rounds of queries. The second query content may include the query content of each round, or may include only the query content of the first round.
In some embodiments, the waiting module 420 may obtain a pre-consultation record template corresponding to the doctor's department and determine the second query content based on the template. For example, the pre-consultation record template records the various types of patient information to be collected in the pre-consultation, and the waiting module 420 can use each type of information as the content of one round of query. Alternatively, the doctor may modify and personalize the pre-consultation record template.
In some embodiments, the waiting module 420 may obtain known information (e.g., an electronic health record, a chief complaint, etc.) of the patient, and determine missing information that has not yet been collected by comparing the pre-consultation record template with the known information. For example, if the known information includes the patient's family history, the missing information need not include the family history. As another example, if the patient's chief complaint includes the patient's medical history, the missing information need not include the medical history. In some embodiments, the missing information may also be determined based on basic information of the patient. For example, for a male patient, the missing information need not include a menstrual history or a fertility history. Further, the waiting module 420 may determine the second query content based on the missing information. Specifically, the query content of each round of queries includes a query for one or more items of the missing information. For example, the waiting module 420 may compare the patient's electronic health record with the pre-consultation record template, determine that the allergy history and pain level in the pre-consultation record template are not contained in the electronic health record, and take the allergy history and pain level as the missing information. The waiting module 420 may then determine that the second query includes a question about the allergy history and a question about the pain level. In some embodiments, before performing the pre-consultation, the waiting module 420 may display the electronic health record to the patient through the third terminal of the patient, and determine the second query content after the patient confirms that the electronic health record is correct.
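The template-versus-known-information comparison described above can be sketched as a simple set difference over template fields. This is a minimal illustration only; the field names, the conditional-field rule, and the data layout are assumptions for demonstration, not the actual data model of this disclosure.

```python
# Pre-consultation record template for a hypothetical department: the items
# that should be collected before the visit (illustrative field names).
TEMPLATE_FIELDS = ["chief_complaint", "medical_history", "family_history",
                   "allergy_history", "pain_level", "menstrual_history"]

# Items that only apply to some patients, keyed by a basic-information test
# (e.g., a menstrual history is only collected for female patients).
CONDITIONAL_FIELDS = {"menstrual_history": lambda basic: basic.get("sex") == "female"}

def determine_missing_information(template_fields, known_info, basic_info):
    """Return template items not yet covered by the patient's known information."""
    missing = []
    for field in template_fields:
        applies = CONDITIONAL_FIELDS.get(field, lambda _b: True)(basic_info)
        if applies and field not in known_info:
            missing.append(field)
    return missing

# Example: the electronic health record already covers the family history,
# and the chief complaint covers the medical history, so neither is missing.
known = {"chief_complaint": "headache for 3 days",
         "medical_history": "migraine",
         "family_history": "none reported"}
missing = determine_missing_information(TEMPLATE_FIELDS, known, {"sex": "male"})
# missing -> ["allergy_history", "pain_level"]
```

Each remaining item in `missing` would then seed one or more rounds of the second query.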
In some embodiments, the waiting module 420 may determine the second query content based on the known information of the doctor's department and patient using a third query model. The third query model may include a CNN model, an RNN model, an LSTM model, a BERT model, a ChatGPT model, and the like. In some embodiments, the third interrogation model may include a missing information determination model and a first interrogation content determination model.
The missing information determination model is used to process the doctor's department and the known information of the patient to output the missing information. In some embodiments, the missing information determination model may be obtained by training based on a third training sample set. The third training sample set may include a plurality of third training samples and a plurality of corresponding third training labels. A third training sample may include known information of a sample patient and the department of the corresponding sample doctor, and the third training label may include sample missing information. The third training sample set may be determined based on historical pre-consultation records or manually by the user. In some embodiments, the waiting module 420 may determine the sample missing information based on the third training sample and a knowledge database corresponding to the department of the sample doctor. The knowledge database may include the visit specifications of the corresponding department, e.g., disorder description specifications, diagnosis specifications, prescription specifications, order specifications, etc. Specifically, the waiting module 420 may traverse the knowledge database based on the third training sample, and determine information in the knowledge database that is not covered by the third training sample as the sample missing information.
The first query content determination model is used to process the missing information of the patient to output the second query content for each round of queries in the second query, or the second query content for the first round of queries only. The first query content determination model may be obtained by training based on a fourth training sample set. The fourth training sample set may include a plurality of fourth training samples and a plurality of corresponding fourth training labels. A fourth training sample may include sample missing information, and the fourth training label may include sample query content. The fourth training sample set may be determined based on historical pre-consultation records or manually by the user.
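The third and fourth training samples described above can be sketched as plain input/label pairs. The structure below is an illustrative assumption (field names and example values are invented), intended only to make the sample layout concrete.

```python
def build_third_sample(department, known_info, sample_missing_info):
    """Pair (department, known info) with its missing-information label."""
    return {"input": {"department": department, "known_info": known_info},
            "label": sample_missing_info}

def build_fourth_sample(sample_missing_info, sample_query_content):
    """Pair missing information with the query content to generate for it."""
    return {"input": sample_missing_info, "label": sample_query_content}

# Example pairs (all values are hypothetical):
sample3 = build_third_sample(
    "neurology",
    {"chief_complaint": "headache"},
    ["allergy_history", "pain_level"])
sample4 = build_fourth_sample(
    ["allergy_history"],
    ["Are you allergic to any medications or foods?"])
```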
Step 820, controlling the third terminal of the patient to make a second query to the patient based on the second query content.
In some embodiments, after patient registration, the waiting module 420 may determine an estimated wait time for the patient. For example, the estimated wait time may be the time difference between the current time and the patient's registration period. For another example, the estimated wait time may be determined based on a doctor's current day visit record and a patient registration record. The patient's registration record may include a registration period for which the patient was subscribed. The doctor's daily visit record is a record reflecting the doctor's daily visit. For example only, the waiting module 420 may determine a number of waiting patients prior to the patient visit and an average time to visit for each patient by the doctor based on the doctor's current day visit record and the patient registration record, and take the product of the number and average time to visit as the estimated wait time.
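The last estimate above (patients ahead times the doctor's average visit time) can be sketched as follows. The record formats are illustrative assumptions; an actual visit record and registration record would carry more structure.

```python
def estimate_wait_minutes(visit_record, registration_record, patient_id):
    """Estimate waiting time as (patients ahead) x (average visit minutes)."""
    queue = registration_record["queue"]        # patient ids in visit order
    ahead = queue.index(patient_id)             # patients before this one
    durations = visit_record["visit_minutes"]   # today's completed visit durations
    avg = sum(durations) / len(durations) if durations else 0
    return ahead * avg

visit_record = {"visit_minutes": [8, 12, 10]}   # three completed visits today
registration_record = {"queue": ["p01", "p02", "p03", "p04"]}
wait = estimate_wait_minutes(visit_record, registration_record, "p03")
# 2 patients ahead x 10-minute average visit -> 20.0 minutes
```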
In some embodiments, in response to determining that the estimated wait time is greater than a first preset time threshold, the waiting module 420 may initiate the second query or display a suggestion to make the second query to the patient via the third terminal of the patient. For example, if the patient's estimated wait time of 10 minutes is greater than a first preset time threshold of 5 minutes, the waiting module 420 may initiate the second query to the patient via the patient's third terminal. In some embodiments of the present disclosure, deciding whether to initiate the second query based on the estimated wait time of the patient can ensure that the pre-consultation time is sufficient, avoiding the doctor calling the patient during the pre-consultation and the patient missing his or her turn.
In some embodiments, in response to determining that the estimated wait time is less than a second preset time threshold, the waiting module 420 may initiate the second query to the patient or present a suggestion to make the second query via the third terminal of the patient. The second preset time threshold is greater than the first preset time threshold. For example, when it is detected that the time remaining before the patient's registration period is less than 24 hours (i.e., the estimated wait time is less than 24 hours), the third terminal may display a suggestion to the patient to make the second query (e.g., a suggestion delivered by a virtual character), thereby reminding the patient in time to complete the pre-consultation.
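The two threshold conditions described above can be sketched as simple comparisons. The threshold values below are illustrative assumptions (5 minutes and 24 hours, following the examples in the text), not prescribed values.

```python
FIRST_THRESHOLD_MIN = 5          # minimum wait needed to fit a pre-consultation
SECOND_THRESHOLD_MIN = 24 * 60   # remind once the visit is within 24 hours

def should_initiate_pre_consultation(estimated_wait_min):
    """Enough waiting time remains to complete the pre-consultation query."""
    return estimated_wait_min > FIRST_THRESHOLD_MIN

def should_remind_pre_consultation(estimated_wait_min):
    """The visit is near enough that a pre-consultation reminder is timely."""
    return estimated_wait_min < SECOND_THRESHOLD_MIN

start = should_initiate_pre_consultation(10)   # 10 > 5: the query may begin
remind = should_remind_pre_consultation(10)    # 10 < 1440: a reminder is timely
```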
In some embodiments, the waiting module 420 may detect that the patient actively initiates a pre-consultation request through the third terminal. In response to the pre-consultation request, the waiting module 420 may initiate the second query to the patient through the third terminal of the patient. In some embodiments, the third terminal has a patient space application installed thereon, with which the second query is made. The patient space application is an interactive portal between the patient and the third terminal. The patient can obtain various medical services through the patient space application.
In some embodiments, a second avatar may be presented via the third terminal, the second avatar being configured to conduct a second query based on the second query content. The second virtual character is a digital character image with specific appearance characteristics, language characteristics and the like, and can communicate with the patient to conduct pre-consultation on the patient. Specifically, the waiting module 420 may display the second virtual character through a screen of the third terminal and play the second query content through a sound output device of the third terminal. Meanwhile, the second virtual character can simulate the expression, action and the like of human speaking, and the real communication experience is provided for the patient. For example, the third terminal may include an XR device through which the waiting module 420 may present the second virtual character. For example only, as shown in fig. 10, the XR device may display a virtual character 1010 (e.g., a second virtual character) within the patient's field of view and synchronously display interrogation content 1020 (e.g., a second interrogation content).
In some embodiments, the second avatar has preset appearance characteristics. In some embodiments, the appearance characteristics of the second virtual character may be determined based on an optical image of the doctor with whom the patient registered. Specifically, the waiting module 420 may extract the appearance characteristics (e.g., body shape information, clothing information) of the doctor from the optical image of the registered doctor, and generate a second virtual character having the same or similar appearance characteristics as the doctor based on the appearance characteristics of the doctor. In some embodiments, the appearance characteristics of the second avatar may be determined based on the patient's basic information. For a detailed description of determining the appearance characteristics of the second avatar based on the basic information, reference may be made to the related description in step 630 of determining the appearance characteristics of the first avatar based on the basic information. In some embodiments, the waiting module 420 may select a suitable avatar from an avatar library as the second avatar based on the doctor's profile and/or the patient's profile.
In some embodiments, as previously described, the second query comprises a plurality of rounds of queries, the second query content may comprise the query content of each round of queries in the second query, and the second query may be performed by the process 900 shown in fig. 9.
As shown in fig. 9, for the first round of interrogation, the waiting module 420 may control the third terminal to make the first round of interrogation based on the corresponding interrogation content.
For each current round of interrogation (abbreviated as current interrogation) except for the first round of interrogation, the waiting module 420 may adjust the interrogation content of the current interrogation (abbreviated as current interrogation content) based on the second data collected prior to the current interrogation to make the interrogation content more consistent with the patient's condition. In particular, the waiting module 420 may determine semantic information and affective information of the patient's historical answers based on the second data collected prior to the current query. The second data may include voice data, image data, text data, etc. collected by the third terminal. For example, the second data may be collected after the sound is detected by the sound sensor of the third terminal. The historical answers are answers to the patient's queries of the historical rounds. For example, if the current query is a round 3 query, the historical answers may include the patient's answers to round 1 and round 2 queries.
The semantic information of the historical answers characterizes the content of the historical answers. The mood information of the historical answer characterizes the mood of the patient at the time of the answer (e.g., calm, tension, anxiety, fear, confusion, dysphoria, etc.). The waiting module 420 may perform text transcription, voice content recognition, etc. on the second data to determine semantic information. The waiting module 420 may analyze the voice content, mood, intonation, speed of speech, etc. characteristics of the second data to determine emotion information. In some embodiments, the waiting module 420 may extract the target voice signal of the patient from the second data, and then perform semantic information and emotion information extraction on the target voice signal.
With continued reference to fig. 9, the waiting module 420 may adjust the current query content based on the semantic information and emotion information. For example, when the patient's emotion information is "tension" or "fear", the waiting module 420 may add a soothing utterance to the current query content. For another example, when the emotion information of the patient is "confusion", the waiting module 420 may add an explanatory utterance to the current query content. For another example, when the semantic information indicates that the patient did not explicitly answer a historical query, the waiting module 420 may adjust the current query content to repeat the historical query, thereby guiding the patient to answer it explicitly. The originally determined current query content may then be used as the query content of the next round of queries.
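The adjustment rules described above can be sketched as a small rule function. The emotion labels, semantic labels, and utterance texts below are illustrative assumptions; a deployed system would derive them from the recognition models discussed earlier.

```python
SOOTHING = "There is no need to worry; please take your time. "
EXPLAIN = "Let me put the question another way. "

def adjust_current_query(current_query, semantic_info, emotion):
    """Adjust the current round's query content to the patient's state."""
    if semantic_info == "unclear_answer":
        # Repeat the previous round so the patient can answer it explicitly;
        # the originally planned content moves to the next round.
        return "Could you answer the previous question again? "
    if emotion in ("tension", "fear"):
        return SOOTHING + current_query     # prepend a soothing utterance
    if emotion == "confusion":
        return EXPLAIN + current_query      # prepend an explanatory utterance
    return current_query                    # no adjustment needed

q = adjust_current_query("Do you have any allergies?", "clear_answer", "tension")
# -> soothing utterance prepended to the planned question
```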
In some embodiments of the present description, the current query content is adjusted based on semantic information and mood information of the patient's historical answers, so that the current query content can be adjusted in time for the patient's status, thereby improving the quality of service of the pre-consultation.
In some embodiments, in addition to adjusting the current query content, the sound characteristics used for the query may also be adjusted in real time based on the patient's state. The sound characteristics include speech rate characteristics, mood characteristics, intonation characteristics, volume characteristics, etc. As shown in fig. 9, the waiting module 420 may determine the sound characteristics of the current query based on the semantic information and the emotion information of the patient's historical answers, and control the third terminal to make the current query based on the adjusted query content and sound characteristics. Specifically, the waiting module 420 may determine the sound characteristics of the current round of queries based on the semantic information and the emotion information of the patient's historical answers according to a preset correspondence. The preset correspondence represents the correspondence among semantic information, emotion information, and sound characteristics. By way of example only, the preset correspondence may specify that when the semantic information is a positive answer and the emotion information is calm, the speech rate is medium, the mood is polite, the intonation is calm, and the volume is medium.
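The preset correspondence can be sketched as a lookup table keyed by (semantic information, emotion information). The table entries and the fallback default are illustrative assumptions following the example in the text.

```python
PRESET_SOUND_FEATURES = {
    # (semantic_info, emotion) -> sound features for the current round
    ("positive_answer", "calm"):
        {"speech_rate": "medium", "mood": "polite",
         "intonation": "calm", "volume": "medium"},
    ("unclear_answer", "tension"):
        {"speech_rate": "slow", "mood": "gentle",
         "intonation": "soothing", "volume": "soft"},
}

DEFAULT_FEATURES = {"speech_rate": "medium", "mood": "polite",
                    "intonation": "calm", "volume": "medium"}

def sound_features_for(semantic_info, emotion):
    """Look up the sound features for the current round of the query."""
    return PRESET_SOUND_FEATURES.get((semantic_info, emotion), DEFAULT_FEATURES)

features = sound_features_for("positive_answer", "calm")
# -> medium speech rate, polite mood, calm intonation, medium volume
```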
In some embodiments of the present disclosure, the voice features used in the second query are adjusted in real time based on semantic information and emotion information of the patient's historical answers, so that emotion changes of the patient can be better attended, thereby improving the personification effect of the second virtual character and improving the quality of service of the pre-consultation.
In some embodiments, as shown in fig. 9, the waiting module 420 may further obtain physiological status information of the patient. The physiological state information of the patient may reflect a real-time physiological state of the patient. The physiological state information may include physiological parameter values of the patient, such as heart rate, pulse, respiratory rate, etc. The physiological state information may also include information related to the posture, limb behavior, facial expression, muscle state, etc. of the patient. In some embodiments, the physiological state information of the patient may be obtained using a wearable device worn by the patient. For example, the physiological parameter values of the patient may be acquired by physiological sensors integrated on the wearable device. In some embodiments, physiological state information of the patient may be acquired by an image sensor in the environment of the patient. For example, the posture, facial expression, etc. of the patient may be collected by a monitoring device in the waiting area.
Further, the waiting module 420 may adjust the current query content based on the semantic information, the emotion information, and the physiological state information. Specifically, the waiting module 420 may update the emotion information of the patient based on the physiological state information of the patient. It will be appreciated that the patient's inner emotion is not necessarily fully expressed by the patient's answers; therefore, the patient's emotion information may be updated or revised based on the patient's physiological state information. For example, assuming that the patient is determined to be in a calm state based on the second data, but the physiological state information of the patient indicates that the patient is in a stressed state (e.g., the heart rate exceeds a preset threshold), the waiting module 420 may modify the emotion information of the patient to a stressed state. Further, the waiting module 420 may adjust the current query content based on the semantic information and the updated emotion information.
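The emotion-update step described above can be sketched as a simple override: the answer-derived emotion is revised when physiological signals disagree. The heart-rate threshold is an illustrative assumption, not a value prescribed by this disclosure.

```python
HEART_RATE_TENSION_THRESHOLD = 100  # beats per minute (assumed value)

def update_emotion(answer_emotion, physiological_state):
    """Revise the answer-derived emotion using physiological state information."""
    if physiological_state.get("heart_rate", 0) > HEART_RATE_TENSION_THRESHOLD:
        # The patient sounds calm, but the body indicates tension.
        return "tension"
    return answer_emotion

emotion = update_emotion("calm", {"heart_rate": 112})
# -> "tension": the physiological signal overrides the calm-sounding answer
```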
In some embodiments of the present description, the emotional information of the patient may be more accurately captured by further considering the physiological state information of the patient, so that the adjustment of the second query content is more accurate, thereby improving the service quality of the pre-query.
As shown in fig. 9, in some embodiments, the waiting module 420 may determine feedback parameters based on at least a portion of the semantic information, the emotional information, and the physiological state information, and control the wearable device to apply feedback to the patient based on the feedback parameters. The feedback may include at least one of force feedback and temperature feedback. The feedback parameters are used to control the manner in which the feedback is applied, e.g., the type of feedback, the body part on which the feedback is applied, the size of the feedback, etc. In some embodiments, the waiting module 420 may determine the emotion and emotion level of the patient based on at least a portion of the semantic information, the emotional information, and the physiological state information, and determine the feedback parameters based on the emotion and emotion level. For example, the mood of the patient may be used to determine the type of feedback and the body part on which the feedback is applied, and the mood level of the patient may be used to determine the size of the feedback.
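The mapping described above — emotion determining the feedback type and body part, and emotion level determining the feedback size — can be sketched as follows. The specific pairings and the magnitude cap are illustrative assumptions.

```python
def feedback_parameters(emotion, emotion_level):
    """Map emotion to feedback type/body part, and level to magnitude."""
    type_and_part = {
        "tension": ("force", "shoulder"),     # light pressure to soothe
        "anxiety": ("temperature", "wrist"),  # gentle warmth
    }
    feedback_type, body_part = type_and_part.get(emotion, ("force", "wrist"))
    magnitude = min(emotion_level, 5)         # cap the feedback strength
    return {"type": feedback_type, "body_part": body_part,
            "magnitude": magnitude}

params = feedback_parameters("tension", 3)
# -> force feedback applied on the shoulder with magnitude 3
```

The resulting parameters would then be sent to the wearable device to apply the force or temperature feedback.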
In some embodiments of the present disclosure, feedback parameters are determined based on semantic information, emotion information, and physiological state information, and the wearable device is controlled to apply feedback to the patient according to the feedback parameters, so that bad emotion of the patient can be pacified in time, and thereby pre-consultation service quality is improved.
In some embodiments, the waiting module 420 may end the second query based on a second preset condition. The second preset condition may be that the amount of remaining missing information is 0. The second preset condition may be that the time difference between the current time and the patient's estimated visit time is less than a threshold. The second preset condition may also be similar to the first preset condition. For example, the second preset condition may be that the number of rounds of queries performed is equal to a threshold. For another example, the second preset condition may be that the current query content or the patient's answer is preset content.
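The end-of-query checks described above can be sketched as a single predicate over the session state. The threshold values and argument names are illustrative assumptions.

```python
TIME_TO_VISIT_THRESHOLD_MIN = 2   # end if the visit is this close (assumed)
MAX_ROUNDS = 10                   # round budget for the pre-consultation (assumed)

def should_end_second_query(remaining_missing, minutes_to_visit, rounds_done):
    """End the pre-consultation query when any preset condition is met."""
    if remaining_missing == 0:                          # nothing left to collect
        return True
    if minutes_to_visit < TIME_TO_VISIT_THRESHOLD_MIN:  # visit is imminent
        return True
    if rounds_done >= MAX_ROUNDS:                       # round budget exhausted
        return True
    return False

done = should_end_second_query(remaining_missing=2, minutes_to_visit=1,
                               rounds_done=3)
# -> True: the patient is about to be called in
```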
In some embodiments, the second query content determined in step 810 includes only the query content of the first round of queries. The current query content of each current query other than the first round may be determined while the second query is in progress. For example, in the current query, the waiting module 420 may input the query content of the historical queries, the historical answers of the patient, the known information of the patient, etc. into a second query content determination model, which outputs the current query content. The second query content determination model may be trained based on a fifth training sample set. The fifth training sample set may include a plurality of fifth training samples and a plurality of corresponding fifth training labels. A fifth training sample may include the query content of sample historical queries, sample historical answers of a sample patient, and sample known information of the sample patient, and the fifth training label may include the content of the sample current round of queries. The fifth training sample set may be determined based on historical pre-consultation records or manually by the user. The training process of the second query content determination model is similar to that of the first query model, and will not be described in detail herein.
In some embodiments of the present disclosure, by performing the second query to communicate with the patient, various types of information of the patient are fully known, thereby improving accuracy of the pre-consultation record and providing more accurate reference information for subsequent consultations. In some embodiments of the present description, a second query is performed on the patient with a second avatar, and the patient's interaction is enhanced by the personified second avatar without increasing labor costs, thereby improving the quality and efficiency of the pre-consultation service. In some embodiments of the present description, a machine learning model (e.g., missing information determination model, first query content determination model, second query content determination model) is utilized to determine second query content of the second query, making the second query more natural and flexible.
In step 830, a pre-consultation record is generated based on the second data collected by the third terminal in the second query.
The second data includes voice data, text data, image data, etc. input by the patient through the third terminal in the second query. The pre-consultation record is used to record patient information collected in the second query (i.e., the pre-consultation query). Optionally, the pre-consultation record may also record certain known information of the patient. In some embodiments, the pre-consultation record is generated according to a preset template. The preset template may be a template corresponding to the department where the doctor is located, or a template set by the doctor. In some embodiments, the pre-consultation record may be presented to the doctor for reference at the time of the patient's visit; for a specific description, reference may be made to FIG. 11 and its associated description.
For example, when the second data includes a voice signal, the waiting module 420 may first transcribe the second data into text, and then extract second keywords from the text through a keyword extraction algorithm. Further, the waiting module 420 may convert the second keywords into medical terms. Still further, the waiting module 420 may obtain a plurality of template fields in the pre-consultation record template, retrieve the content corresponding to each template field from the converted medical terms, and fill it into the corresponding location of the pre-consultation record template. The conversion of the second keywords may be performed based on a term conversion model, or based on a knowledge dictionary. For a detailed description of the knowledge dictionary, see step 1520. The term conversion model is used to convert spoken descriptions into medical terms, and may be obtained by training based on a sixth training sample set. The sixth training sample set may include a plurality of sixth training samples and a plurality of corresponding sixth training labels. A sixth training sample may include sample keywords, and the sixth training label may include corresponding sample medical terms. The sixth training sample set may be determined based on historical pre-consultation records or manually by the user. The training process of the term conversion model is similar to that of the first query model, and will not be described in detail herein.
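The pipeline described above — transcribe, extract keywords, map to medical terms, fill template fields — can be sketched as follows. The dictionary lookup stands in for the keyword-extraction algorithm and the term conversion model / knowledge dictionary mentioned in the text; all entries and field names are illustrative assumptions.

```python
# Assumed spoken-phrase -> medical-term dictionary (illustrative entries).
TERM_DICTIONARY = {"tummy ache": "abdominal pain",
                   "can't sleep": "insomnia"}

def extract_keywords(transcript):
    """Toy keyword extraction: keep phrases found in the dictionary."""
    return [phrase for phrase in TERM_DICTIONARY if phrase in transcript]

def fill_pre_consultation_record(transcript, template_fields):
    """Fill each template field with the matching converted medical terms."""
    terms = [TERM_DICTIONARY[k] for k in extract_keywords(transcript)]
    record = {field: None for field in template_fields}
    # Naive field matching: assign all converted terms to the chief complaint.
    if terms:
        record["chief_complaint"] = "; ".join(terms)
    return record

record = fill_pre_consultation_record(
    "I have had a tummy ache since yesterday and I can't sleep",
    ["chief_complaint", "allergy_history"])
# -> {"chief_complaint": "abdominal pain; insomnia", "allergy_history": None}
```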
In some embodiments of the present disclosure, a pre-consultation service is provided to a patient while the patient is waiting, and a pre-consultation record is generated, so that on one hand, bad emotion of the patient during waiting can be reduced, and on the other hand, doctor's visit efficiency can be improved through the pre-consultation record.
In some embodiments, the second query may be performed via a terminal other than the third terminal. For example, a patient may wait for a visit in a waiting area of a hospital in which a waiting terminal is deployed, and the waiting terminal may be used to make the second query to the patient. The process of making the second query to the patient using another terminal is similar to the process of making the second query using the third terminal, and will not be described in detail herein.
In some embodiments, the process 800 may be performed by an agent corresponding to a medical service/medical process. For example, the pre-consultation service may be provided by a pre-consultation agent. The pre-consultation agent can understand and process multimodal data such as natural language and images, and stores knowledge data (such as dictionaries, knowledge graphs, knowledge databases, and templates) and patient data corresponding to the department of the visit, so as to provide personalized and specialized pre-consultation services for the patient. Moreover, the pre-consultation agent can convert the data acquired in the pre-consultation query into accurate medical terms, so that the doctor can conveniently review a structured pre-consultation report during the visit, reducing the doctor's paperwork. Furthermore, the pre-consultation agent can achieve self-evolution and self-learning based on artificial intelligence technology, and can continuously improve the accuracy and efficiency of the pre-consultation service by learning from knowledge data and historical service data, thereby improving the user experience.
In some embodiments, a hospital administrator may configure the pre-consultation agent, for example, through a management space application installed on an administrator terminal. The configuration of the pre-consultation agent includes the configuration of general parameters related to hardware configuration, concurrent request processing, caching policies, security settings, model parameters, error handling and logging, performance monitoring and alerting, version management, etc. The configuration of the pre-consultation agent also includes the configuration of business parameters. The business parameters relate to department dictionary configuration, answers to common questions, the number of query rounds, key pre-consultation fields, structuring rules, etc.
Fig. 11 is a schematic diagram of an exemplary medical treatment system 1100 shown according to some embodiments of the present description. The medical treatment system 1100 may be used to provide user services related to the inquiry link 530 to a target user. As shown in fig. 11, the medical treatment system 1100 may include a processing device 210, a terminal 1110, a sensing device 1120, and a physical examination device 1130. The processing device 210 may be communicatively coupled to the terminal 1110, the sensing device 1120, and the physical examination device 1130. The target user may include a patient and a doctor providing medical services to the patient. In some cases, the target user also includes a remote accompanying doctor.
The sensing device 1120 is capable of sensing the surrounding environment in which it is located to collect sensing information. Exemplary sensing devices 1120 can include various types of sensor devices, such as sound sensors, image sensors, temperature sensors, humidity sensors, and the like. The sensing device 1120 may be a stand-alone device, such as a monitoring device installed in a consulting room. The sensing device 1120 may also be integrated into other devices. For example, a sound sensor may be integrated in at least one terminal 1110. In some embodiments, the sensing device 1120 may transmit (e.g., in real time) the collected sensing information to the processing device 210 over a network. In some embodiments, the sensing device 1120 may be an internet of things device.
When the medical treatment system 1100 is used to provide on-site medical treatment services, the sensing device 1120 may be located in a consulting room for sensing the consulting room environment. When the medical treatment system 1100 is used to provide remote medical treatment services, the sensing device 1120 may include a first sensing device located in the environment of the doctor and a second sensing device located in the environment of the patient. The first sensing device is used for collecting first sensing information related to the environment where the doctor is located. The second sensing device is used for collecting second sensing information related to the environment of the patient.
In some embodiments, the sensing device 1120 further comprises a third sensing device located in the environment of the remote accompanying doctor for collecting sensing information related to the environment of the remote accompanying doctor.
The physical examination device 1130 performs a health examination on the patient to collect physical examination data of the patient. Exemplary physical examination devices 1130 may include, but are not limited to, stethoscopes, blood pressure monitors, thermometers, heart rate monitors, oximeters, and the like. The physical examination device 1130 may be a stand-alone device, or may be integrated into other devices (e.g., a wearable device). In some embodiments, the physical examination device 1130 may transmit (e.g., in real time) the acquired examination data to the processing device 210 over a network. In some embodiments, the physical examination device 1130 may be an internet of things device.
When the medical treatment system 1100 is used to provide on-site medical treatment services, the physical examination device 1130 may be located in a consulting room, and the doctor may operate the physical examination device 1130 to perform a health examination on the patient. When the medical treatment system 1100 is used to provide remote medical treatment services, the physical examination device 1130 may be located in the patient's environment, and the patient may operate the physical examination device 1130 to perform a health check on himself or herself. Alternatively, the doctor may remotely operate the physical examination device 1130 via the second terminal (e.g., a second XR device) to perform a health examination on the patient. In some embodiments, the physical examination device 1130 may be worn on the patient as a wearable device for continuously acquiring examination data of the patient during a visit.
The terminal 1110 is a device that interacts with a target user. In some embodiments, the terminals 1110 may include at least one of a first terminal 1111, a second terminal 1112, a third terminal 1113, and/or a fourth terminal 1114. During a visit, at least one terminal may be used to present data to a corresponding target user.
The first terminal 1111 is a terminal that the hospital equips the consulting room with. The first terminal 1111 may include one or more of a display screen, a sound output device, a sound sensor, an image sensor, and the like. The second terminal 1112, the third terminal 1113, and the fourth terminal 1114 are terminals for interacting with the doctor, the patient, and the remote accompanying doctor, respectively. The second terminal 1112, the third terminal 1113, and the fourth terminal 1114 may each include one or more of a cell phone, a computer device, a wearable device, an XR device, etc. As shown in fig. 11, the second terminal 1112 may include a second XR device, the third terminal 1113 may include a first XR device, and the fourth terminal 1114 may include a third XR device.
When a patient receives an on-site medical visit service in a consulting room, a visit interface may be presented to the patient and the doctor simultaneously through the first terminal 1111, and various data may be presented on the visit interface to facilitate communication between the patient and the doctor. In some embodiments, patient data, such as medical data relating to the patient, may be presented on the visit interface. Exemplary medical data includes electronic health records (or portions thereof), pre-consultation records, medical images, three-dimensional organ models, examination results, and the like. In some embodiments, the data presented on the visit interface is multimodal, including various forms of data such as images, graphics, video, text, and the like.
Fig. 14 is a schematic diagram of an exemplary visit interface shown in accordance with some embodiments of the present disclosure. As shown in fig. 14, the visit interface 1400 may include a first interface element 1410 related to the patient's electronic health record, a second interface element 1420 related to the remote companion service, a third interface element 1430 related to medical documents, and a fourth interface element 1440 related to the consultation advice. It should be understood that the visit interface 1400 is for illustrative purposes only and may include one or more other interface elements, or omit one or more of the interface elements shown in FIG. 14.
Moreover, the content and/or form presented in the visit interface 1400 may change as the visit progresses. For example, a doctor and/or patient may issue a control instruction to adjust what is presented in the visit interface 1400. The control instruction may be a voice control instruction, a gesture control instruction, a touch control instruction, a control instruction input via an input device (such as a mouse, a keyboard, etc.), or the like. In some embodiments, a hospital administrator may configure the visit interface 1400, for example, through a management application installed on the administrator terminal.
The first interface element 1410 is an icon corresponding to the electronic health record, or is used to present the content of the electronic health record. During a visit, a doctor and/or patient may retrieve at least a portion of the electronic health record by issuing control instructions and present it via the first interface element 1410. A detailed description of presenting at least a portion of an electronic health record may be found in fig. 12 and its associated description.
The second interface element 1420 is an icon corresponding to the remote companion service, or is used to present a live picture of the remote companion. When a patient's remote companion request is approved by a doctor, the patient may communicate with the remote companion during the inquiry and view the remote companion's live picture through the second interface element 1420. A detailed description of the remote companion service may be found in fig. 16 and its associated description.
The third interface element 1430 is an icon corresponding to a medical document, or is used to present a medical document such as a pre-consultation record, a diagnostic record (e.g., an initial diagnostic record or a target diagnostic record), etc. During the visit, when the patient is detected to have started the visit, a pre-consultation record may be presented through the third interface element 1430; after the doctor and the patient finish communicating, the third interface element 1430 may be updated to present an initial diagnostic record; and after the doctor confirms and/or modifies the initial diagnostic record, the third interface element 1430 may be updated to present a target diagnostic record.
In some embodiments, the inquiry module 430 may determine that the patient has started the visit based on a call-number instruction. The call-number instruction is an instruction by which the doctor indicates that the next patient's visit should begin. The doctor may actively initiate the call-number instruction through the first terminal or the second terminal. Alternatively, the inquiry module 430 may automatically generate a call-number instruction after the doctor submits the target diagnostic record of the previous patient. In some embodiments, when the patient receives the on-site medical visit service, the inquiry module 430 may determine that the patient has started the visit based on the location information of the third terminal of the patient. Alternatively, the inquiry module 430 may determine that the patient has started the visit based on a sensing device (e.g., an image sensor) acquiring an image of the patient entering the consulting room. When the patient receives the remote medical visit service, the inquiry module 430 may acquire remote visit confirmation messages transmitted by the doctor and the patient through the doctor's second terminal and the patient's third terminal, respectively, to determine that the patient has started the visit. For more description of pre-consultation records and diagnostic records, see FIG. 13 and its associated description.
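The visit-start triggers described in this paragraph can be sketched as a simple predicate. This is a minimal illustration only; the structure and field names are hypothetical and not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class VisitContext:
    call_number_issued: bool = False       # doctor issued a call-number instruction
    terminal_in_room: bool = False         # patient's third terminal located in the consulting room
    patient_seen_entering: bool = False    # image sensor captured the patient entering
    doctor_confirmed_remote: bool = False  # remote-visit confirmation from the doctor's terminal
    patient_confirmed_remote: bool = False # remote-visit confirmation from the patient's terminal

def visit_started(ctx: VisitContext, service: str) -> bool:
    """Any single on-site signal suffices for an on-site visit;
    a remote visit requires confirmation from both sides."""
    if service == "on-site":
        return (ctx.call_number_issued
                or ctx.terminal_in_room
                or ctx.patient_seen_entering)
    if service == "remote":
        return ctx.doctor_confirmed_remote and ctx.patient_confirmed_remote
    return False
```

Treating any one on-site signal as sufficient, while requiring both confirmations for a remote visit, mirrors the description above.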
The fourth interface element 1440 is an icon corresponding to the consultation advice, or is used to present the content of the consultation advice. After the consultation advice is generated based on the perception information during the visit, it may be presented via the fourth interface element 1440. For a detailed description of the consultation advice, reference may be made to fig. 13 and its associated description.
In some embodiments, various data may be presented to the doctor and the patient simultaneously via the second terminal 1112 and the third terminal 1113, respectively. For example, the doctor and the patient may each wear an XR device; through mixed reality or augmented reality techniques, the doctor and the patient can share various data simultaneously while perceiving the real physical world. The second terminal 1112 and the third terminal 1113 may present content similar to that of the visit interface. When a remote companion participates in the visit, the fourth terminal 1114 may present the remote companion with a real-time view of the consulting room and content similar to that presented on the visit interface.
When the patient receives the remote medical visit service, the virtual consulting room space may be presented to the doctor through the first terminal 1111 or the second terminal 1112, and to the patient through the third terminal 1113. For convenience of description, presentation of data to the doctor through the second terminal 1112 is taken as an example hereinafter. For example, the third terminal of the patient may include a first XR device and the second terminal of the doctor may include a second XR device, and the virtual consulting room space may be presented to the patient and the doctor by the first XR device and the second XR device, respectively.
In some embodiments, various data may be presented to the doctor and the patient simultaneously via the second terminal 1112 and the third terminal 1113, respectively. For example, the first XR device and the second XR device allow the patient and the doctor to share various data simultaneously while perceiving the virtual consulting room space. The second terminal 1112 and the third terminal 1113 may present content similar to that of the visit interface. When a remote companion participates in the visit, the virtual consulting room space and various data may be presented to the remote companion through the third XR device of the fourth terminal 1114.
The virtual consulting room space is a virtual space in which the visit participants interact. In some embodiments, the virtual consulting room space may include a virtual consulting room environment and virtual visit participants. The virtual consulting room environment may be an entirely virtual creation, or may be generated based on real-time information of the environment in which the doctor or patient is located (e.g., a consulting room). Further, the virtual visit participants may be generated based on real-time information of the visit participants, and the virtual consulting room space may then be generated based on the virtual consulting room environment and the virtual visit participants. The inquiry module 430 may present a first view of the virtual consulting room space to the patient via the first XR device, a second view of the virtual consulting room space to the doctor via the second XR device, and a third view of the virtual consulting room space to the remote companion via the remote companion's third XR device. The first, second, and third views of the virtual consulting room space may be the views of the virtual patient, the virtual doctor, and the virtual remote companion, respectively, in the virtual consulting room space. In some embodiments, when a participant wearing an XR device moves or rotates, the inquiry module 430 may obtain the distance moved and the angle rotated by the participant through the XR device, and change the view corresponding to the participant based on the distance moved and the angle rotated.
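As a rough sketch of the last step, a participant's viewpoint could be advanced from the movement distance and rotation angle reported by the XR device. The function and parameter names are illustrative assumptions; the source does not specify a coordinate model, so a 2D floor-plane position plus a yaw angle is assumed here, with a separately supplied movement heading.

```python
import math

def update_view(position, yaw_deg, moved_distance, moved_heading_deg, rotated_deg):
    """Advance a participant's virtual viewpoint by the distance moved
    (along moved_heading_deg, an assumed extra input) and the angle rotated."""
    x, y = position
    heading = math.radians(moved_heading_deg)
    new_pos = (x + moved_distance * math.cos(heading),
               y + moved_distance * math.sin(heading))
    new_yaw = (yaw_deg + rotated_deg) % 360.0  # keep yaw within [0, 360)
    return new_pos, new_yaw
```

A real XR runtime would report full 6-DoF head poses; this reduction to distance and angle simply matches the two quantities named in the paragraph above.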
The processing device 210 may provide services related to the inquiry link 530 by processing the perception information and the physical examination data. For a detailed description of services related to the inquiry link 530, see fig. 5 and its associated description.
In some embodiments, the processing device 210 may provide services related to the inquiry link through an agent corresponding to the medical visit service/medical visit process. For example, services such as presenting medical data, providing consultation advice, and generating a diagnostic record may be provided by the medical agent processing the perception information, the physical examination data, and other source data (e.g., electronic health records).
In some embodiments, a hospital administrator may configure the medical agent, for example, through a management application installed on the administrator terminal. The configuration of the medical agent includes the configuration of general parameters related to hardware configuration, concurrent request processing, caching policies, security settings, model parameters, error handling and logging, performance monitoring and alerting, version management, etc. The configuration of the medical agent also includes the configuration of business parameters. The business parameters relate to specialized knowledge graphs, document templates such as medical records, custom vocabularies, and the like.
Fig. 12 is a flowchart illustrating an exemplary process of presenting medical data to a target user during a visit according to some embodiments of the present disclosure. The process 1200 shown in fig. 12 may be performed in the inquiry link 530.
At step 1210, a first interface element is presented to a target user by at least one terminal. The first interface element is associated with medical data of a patient receiving a medical visit service, and the target user includes at least the patient and a doctor.
The medical data of the patient can comprise various data reflecting the health condition of the patient, such as electronic health files, medical images, medical examination results and the like. In some embodiments, the medical data may include multimodal data, which may include various forms of data in text, pictures, graphics, voice, and the like. In some embodiments, the medical data may include an electronic health record, and the first interface element is related to the electronic health record of the patient. As mentioned above, the electronic health record is an electronic record for recording various patient data. For example, the electronic health record includes three-dimensional models of a plurality of organs and/or tissues of the patient.
The first interface element may be in various forms. For example, the first interface element may present at least a portion of the medical data. For example only, the first interface element may present a home page of the electronic health record. For another example, the first interface element may be an icon corresponding to medical data of the patient (e.g., an icon corresponding to an electronic health record).
In some embodiments, the medical visit service is an on-site medical visit service performed in a consulting room, i.e., both the patient and the doctor are in the consulting room. In this case, the at least one terminal may include a first terminal in the consulting room (e.g., first terminal 1111 as depicted in fig. 11), which may present the first interface element to the patient and the doctor together. Alternatively, the at least one terminal may include a second terminal of the doctor (e.g., second terminal 1112 shown in fig. 11) and a third terminal of the patient (e.g., third terminal 1113 shown in fig. 11). For example, the patient and the doctor may each wear an XR device during the visit. The patient-worn XR device may overlay the first interface element in the patient's real-world field of view using AR/MR technology, and the doctor-worn XR device may overlay the first interface element in the doctor's real-world field of view using AR/MR technology. The patient and the doctor can view the first interface element together while perceiving the real environment of the consulting room.
In some embodiments, the medical visit service is a remote medical visit service, i.e., both the patient and the doctor are outside the consulting room, or only the patient is outside the consulting room.
When both the patient and the doctor are outside the consulting room, the at least one terminal may include a second terminal of the doctor (e.g., second terminal 1112 shown in fig. 11) and a third terminal of the patient (e.g., third terminal 1113 shown in fig. 11). In some embodiments, the second and third terminals may include a second XR device and a first XR device, which present the virtual consulting room space to the doctor and the patient, respectively. The first XR device worn by the patient may present the corresponding virtual consulting room space to the patient and superimpose the first interface element in the patient's virtual consulting room space. The second XR device worn by the doctor may present the corresponding virtual consulting room space to the doctor and superimpose the first interface element in the doctor's virtual consulting room space. The patient and the doctor may view the first interface element together while perceiving their respective virtual consulting room spaces.
When the patient is outside the consulting room and the doctor is in the consulting room, the at least one terminal may include a second terminal of the doctor (e.g., second terminal 1112 shown in fig. 11) and a third terminal of the patient (e.g., third terminal 1113 shown in fig. 11). Alternatively, the at least one terminal may further include the first terminal in the consulting room (e.g., first terminal 1111 described in fig. 11), which may present to the doctor an interface element synchronized with the third terminal.
In some embodiments, the target user further includes a remote companion of the patient, and the at least one terminal further includes a fourth terminal of the remote companion. For example, the fourth terminal may be a third XR device worn by the remote companion, which may present the virtual consulting room space and the first interface element to the remote companion. For more description of the virtual consulting room space, reference may be made to fig. 19 and its associated description.
In some embodiments, the inquiry module 430 may present the first interface element via the at least one terminal based on a trigger condition. For example, the trigger condition may include detecting that the patient has started the visit.
At step 1220, based on the perception information collected by one or more sensing devices during the visit, a control instruction issued by at least one of the target users is obtained.
The control instruction is an instruction for retrieving at least a portion of the medical data (e.g., the electronic health record) for display. For example, the control instruction may be used to retrieve, from the electronic health record, a three-dimensional model of an organ of interest of the patient for display. In some embodiments, the control instruction may also be used to set display parameters (e.g., display angle, display size, display position). In some embodiments, the control instruction may also be used to annotate key data on the medical data (e.g., a three-dimensional model of an organ of interest).
For an on-site medical visit service, the sensing device may include one or more sensing devices located in the consulting room. For example, the sensing device may include an image sensor, a sound sensor, etc. in the consulting room. The sensing device may be installed at any location in the consulting room. For example, the image sensor may be mounted on a wall or ceiling of the consulting room. In some embodiments, a sensing device in the consulting room may be integrated in a terminal in the consulting room. For example, the sound sensor may be integrated in the first terminal. As another example, the sound sensor may be integrated in the second terminal of the doctor and/or the third terminal of the patient.
For a remote medical visit service, the sensing devices may include one or more sensing devices in the doctor's environment (e.g., the doctor's consulting room or residence) and the patient's environment (e.g., the patient's residence). A sensing device in the doctor's environment may be a stand-alone sensing device or may be integrated in the second terminal. A sensing device in the patient's environment may be a stand-alone sensing device or may be integrated in the third terminal.
In some embodiments, the perception information may include a voice signal collected by the sound sensor, and the control instruction may be obtained by performing semantic analysis on the voice signal. Semantic analysis refers to analyzing the speech content contained in a voice signal. For example, when the inquiry module 430 detects that the speech content includes keywords related to interface display, such as "display", "present", "retrieve", etc., it may determine that a control instruction has been received, and further determine, according to the speech content, the type of medical data the user wants to retrieve and/or the display parameters the user wants to set. In some embodiments, the target user may issue the control instruction by speaking a preset wake-up word. When the inquiry module 430 detects that the speech content includes the wake-up word, it may determine that a control instruction has been received.
In some embodiments, the perception information may include an optical image of a target user (e.g., the patient and/or the doctor) captured by an image sensor, and the control instruction may be obtained by performing gesture recognition on the target user in the optical image. For example, the target user may issue a control instruction by performing a preset gesture (e.g., a tap gesture, a swipe gesture, a rotate gesture, a two-finger pinch or spread gesture, etc.). When the inquiry module 430 detects a preset gesture in the optical image, it may determine that a control instruction has been received, and further determine, based on the type of the gesture, the type of medical data the user wants to retrieve and/or the display parameters the user wants to set.
In some embodiments, the target user may issue a control instruction using a control device. For example, when the doctor and/or patient is on site in the consulting room, the doctor and/or patient may issue control instructions via a remote control, a smart control glove, etc., to control the first terminal to display at least a portion of the medical data.
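A minimal keyword-spotting sketch of the voice-based control instructions described above follows. The wake-up word and the target-extraction rule are illustrative assumptions; a production system would rely on full semantic analysis rather than substring matching.

```python
KEYWORDS = {"display", "present", "retrieve"}  # interface-display keywords from the description
WAKE_WORD = "hello clinic"                     # hypothetical preset wake-up word

def parse_voice_command(transcript: str):
    """Return (is_command, target): is_command is True when a keyword or the
    wake-up word is detected; target is the data the user asked for, if any."""
    text = transcript.lower()
    if WAKE_WORD in text or any(k in text for k in KEYWORDS):
        # naive target extraction: whatever follows the first matched keyword
        for k in KEYWORDS:
            if k in text:
                target = text.split(k, 1)[1].strip() or None
                return True, target
        return True, None  # wake-word only, no explicit target
    return False, None
```

The gesture path described above would feed the same downstream logic, with a gesture classifier replacing the keyword check.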
In some embodiments of the present disclosure, the target user may adjust the displayed content and/or display parameters in flexible ways such as voice and gesture, which may optimize the user experience and improve visit efficiency.
Step 1230, in response to the control instructions, retrieving and presenting at least a portion of the medical data via the at least one terminal.
For example, the inquiry module 430 may retrieve at least a portion of the medical data from the storage device and control the at least one terminal to present it. When the control instruction includes display parameters, the inquiry module 430 may control the at least one terminal to present the at least a portion of the medical data based on the display parameters. In some embodiments, during a remote visit, a control instruction issued by the doctor to retrieve the three-dimensional model of the organ of interest for display, adjust display parameters, or annotate key data may be detected based on the sensing data collected by a sensing device in the doctor's environment; in response to the control instruction, the displayed content on the at least one terminal (e.g., the first and second XR devices) may be updated synchronously.
In some embodiments of the present disclosure, a plurality of target users may browse medical data together through at least one terminal, with the presented content and presentation manner of the medical data on each terminal changing synchronously, which helps improve communication efficiency between target users and enhances interactivity during the visit.
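The synchronous presentation described above can be illustrated with a shared display state that re-renders every attached terminal whenever the content or display parameters change. The class and method names are hypothetical, not from the source.

```python
class Terminal:
    """Stand-in for a terminal (e.g., an XR device); records what it last rendered."""
    def __init__(self, name):
        self.name = name
        self.last = None

    def render(self, content, params):
        self.last = (content, params)

class DisplayState:
    """Single source of truth: any change is pushed to all attached terminals,
    so every target user sees the same content with the same parameters."""
    def __init__(self):
        self.content = None
        self.params = {}   # e.g., display angle, size, position
        self.terminals = []

    def attach(self, terminal):
        self.terminals.append(terminal)

    def apply(self, content=None, **params):
        if content is not None:
            self.content = content
        self.params.update(params)
        for t in self.terminals:
            t.render(self.content, dict(self.params))
```

Keeping one authoritative state and broadcasting it is a common way to guarantee the per-terminal views never drift apart, regardless of which terminal originated the control instruction.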
Fig. 13 is a flowchart illustrating an exemplary process for providing medical visit services based on perception information, in accordance with some embodiments of the present disclosure. In some embodiments, process 1300 may include one or more of sub-processes 1310, 1320, and 1330.
The sub-process 1310 may be used to provide consultation advice based on the perception information. The sub-process 1310 may be performed in the inquiry link 530. As shown in fig. 13, sub-process 1310 may include steps 1312 and 1314.
At step 1312, consultation advice is generated based on the perception information and patient data of the patient.
For a detailed description of the perception information, reference may be made to the related descriptions of figs. 11 and 12. For more description of patient data, see the related description of the registration link 510. The consultation advice is advice that assists the doctor in providing the medical visit service. Exemplary consultation advice may include supplemental inquiry advice, physical examination advice, prescription advice, treatment advice, and the like.
In some embodiments, the consultation advice may be determined based on a knowledge database, a visit specification, etc. corresponding to the registered department. For example, the inquiry module 430 may determine the dialogue content of the doctor and the patient based on the voice signals collected by the sound sensors, and search the knowledge database, the visit specification, etc. based on the dialogue content and/or the patient data to determine the consultation advice. For example only, a search may be conducted in the visit specification based on the dialogue content and/or the patient data to determine which information required by the visit specification has not yet been collected, and supplemental inquiry advice may be provided based on such information.
In some embodiments, the consultation advice may be generated based on a diagnostic model. Specifically, the inquiry module 430 may determine a model input based on the perception information and the patient data, and input the model input to the diagnostic model, which may output the corresponding consultation advice. For example, the model input may include one or more of the patient data, dialogue content determined based on the voice signals, state information of the patient determined based on the image data, and the like.
In some embodiments, the diagnostic model may be obtained by training on a seventh training sample set. The seventh training sample set may include a plurality of seventh training samples and a plurality of corresponding seventh training labels. A seventh training sample may include a sample model input similar to the model input described above, and a seventh training label may include sample consultation advice. The seventh training sample set may be determined based on historical visit records and/or manually. The training process of the diagnostic model is similar to that of the first inquiry model and is not described in detail herein.
In some embodiments, the consultation advice may be generated by the agent corresponding to the medical visit service. The agent can learn the generation mechanism of consultation advice from various data such as historical diagnostic records, knowledge databases, and visit specifications, and process the perception information and the patient data based on this mechanism to provide the consultation advice.
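For the rule-based path, in which the visit specification is checked for information not yet collected and supplemental inquiry advice is issued, a minimal sketch might look as follows. The specification fields are invented placeholders, not from the source.

```python
# Hypothetical fields a visit specification might require to be collected.
VISIT_SPEC_FIELDS = {
    "chief complaint",
    "symptom duration",
    "allergy history",
    "medication history",
}

def supplemental_inquiry_advice(dialog_fields: set, patient_data_fields: set) -> list:
    """Suggest a supplemental question for every required field that neither
    the doctor-patient dialogue nor the existing patient data has covered."""
    missing = VISIT_SPEC_FIELDS - dialog_fields - patient_data_fields
    return sorted(f"ask about {field}" for field in missing)
```

The model-based and agent-based paths described above would replace this set difference with a learned mapping from perception information and patient data to advice.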
At step 1314, at least a portion of the at least one terminal is controlled to present the consultation advice.
For example, when the patient receives the medical visit service on site in the consulting room, the inquiry module 430 may send the consultation advice to the first terminal, which may present the consultation advice to the patient and the doctor via the fourth interface element. As another example, the inquiry module 430 may send the consultation advice to the second terminal of the doctor, which may present the consultation advice to the doctor. As another example, when the patient receives the remote medical visit service, the inquiry module 430 may control the second terminal of the doctor and the third terminal of the patient to present the consultation advice, respectively.
As the visit progresses, the perception information is updated in real time, and the consultation advice is updated accordingly. For example, before the doctor performs a physical examination of the patient, the consultation advice may include physical examination advice. After the doctor completes the physical examination of the patient, the consultation advice may be updated to prescription advice.
In some embodiments of the present disclosure, automatically generating consultation advice based on the perception information may improve the accuracy of diagnosis and prescription while improving the efficiency of the doctor's visit.
The sub-process 1320 may be used to generate a target diagnostic record based on the perception information. The sub-process 1320 may be performed after the end of the inquiry link 530. As shown in fig. 13, sub-process 1320 may include steps 1322, 1324, and 1326.
At step 1322, an initial diagnostic record is generated based on the perception information.
The initial diagnostic record is an automatically generated diagnostic record. In some embodiments, the initial diagnostic record may include an initial patient medical record, an initial diagnostic opinion, an initial diagnostic prescription (e.g., an initial treatment prescription and an initial examination prescription), an initial medical order, and the like. For a detailed description of generating an initial diagnostic record, reference may be made to FIG. 15 and its associated description.
In some embodiments, the initial diagnostic record may be generated by the agent corresponding to the medical visit service. The agent may learn the diagnostic record generation mechanism from various data such as diagnostic record templates, knowledge dictionaries, and knowledge databases, and process the perception information and the patient data based on this mechanism to generate the diagnostic record.
At step 1324, the initial diagnostic record is presented to the doctor.
Illustratively, the inquiry module 430 may send the initial diagnostic record to the first terminal, which may present the initial diagnostic record. For example, when the patient starts the visit, the third interface element presents the pre-consultation record to the patient and the doctor; when the visit is completed, the third interface element may be updated to present the initial diagnostic record.
As another example, the inquiry module 430 may send the initial diagnostic record to the second terminal of the doctor, which may present the initial diagnostic record to the doctor. In some embodiments, the second terminal may present the initial diagnostic record to the doctor at a preset time (e.g., after the doctor has completed the day's inquiry work). Alternatively, the second terminal may present the initial diagnostic record to the doctor after the doctor inputs a paperwork review request. In some embodiments, the second terminal is provided with a medical space application, and the doctor may obtain the initial diagnostic record through the medical space application and give feedback information on the initial diagnostic record.
At step 1326, a target diagnostic record is generated based on the initial diagnostic record and feedback information input by the doctor for the initial diagnostic record.
The feedback information input by the doctor may include the doctor's modifications to and/or confirmations of the initial diagnostic record. The target diagnostic record is a diagnostic record that has been modified and/or confirmed by the doctor. In some embodiments, the target diagnostic record may include a target patient medical record, a target diagnostic opinion, a target diagnostic prescription (e.g., a target treatment prescription and a target examination prescription), a target medical order, and the like. In some embodiments, the doctor may input the feedback information via the first terminal or the second terminal in various ways, such as typing, voice, or gesture.
In some embodiments of the present disclosure, an initial diagnostic record may be generated based on the perception information, and a target diagnostic record may then be generated based on the doctor's feedback information on the initial diagnostic record. On the one hand, this reduces manual writing errors in the target diagnostic record and improves its generation efficiency. On the other hand, it reduces the doctor's paperwork, leaving the doctor more energy to care for patients and improving the quality of the medical visit service.
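The merge of the doctor's feedback into the target diagnostic record can be sketched as a simple field-level override. The record fields are illustrative; confirmation of a field is modeled as the absence of a modification for it.

```python
def apply_feedback(initial_record: dict, feedback: dict) -> dict:
    """Produce the target diagnostic record: fields the doctor modified are
    overridden by the feedback; untouched fields are kept as confirmed."""
    target = dict(initial_record)  # leave the auto-generated record intact
    target.update(feedback)
    return target
```

Keeping the initial record immutable and deriving the target record from it preserves an audit trail of what the agent generated versus what the doctor changed.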
Sub-process 1330 may be used to provide a remote companion service based on the perception information. The patient may apply for a remote companion service before the visit; a detailed description of this may be found in fig. 16. Sub-process 1330 may be performed in the inquiry link 530. As shown in fig. 13, sub-process 1330 may include steps 1332 and 1334.
At step 1332, whether the patient needs to communicate with the remote companion is determined based on the perception information.
In some embodiments, the inquiry module 430 may detect, based on the perception information (e.g., voice data and/or image data), whether the patient has issued a request to communicate with the remote companion. For example, the patient may make the request by speaking or by making a specific gesture.
In some embodiments, the inquiry module 430 may determine state information of the patient based on the perception information, and determine, based on the state information, whether the patient needs to communicate with the remote companion. For example, when the state information indicates that the patient is in a state of high tension, fear, etc., it is determined that the patient needs to communicate with the remote companion. As another example, when the state information indicates that the patient is unable to select or decide on a treatment plan by himself or herself, it is determined that the patient needs to communicate with the remote companion.
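The two triggers above, an explicit request or patient state information suggesting support is needed, might be combined as follows. The state labels and the decision rule are illustrative assumptions.

```python
# Hypothetical state labels a classifier over the perception information might emit.
ANXIOUS_STATES = {"high tension", "fear"}

def needs_remote_companion(explicit_request: bool, state: str, can_decide: bool) -> bool:
    """True when the patient explicitly asked (by speech or gesture), appears
    anxious, or cannot select/decide on a treatment plan unaided."""
    return explicit_request or state in ANXIOUS_STATES or not can_decide
```

In practice the `state` and `can_decide` inputs would themselves be inferred from the voice and image data, as described above.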
Upon determining that the patient needs to communicate with the remote companion, the consultation module 430 executes step 1334.
Step 1334 controls at least a portion of the at least one terminal to magnify the second interface element.
When the patient receives the on-site medical consultation service in the consulting room, the consultation module 430 may control the first terminal to enlarge the second interface element. When the patient receives the remote medical consultation service, the consultation module 430 may control the third terminal of the patient to enlarge the second interface element. Through the enlarged second interface element, the patient can view a real-time picture of the remote companion and communicate with the remote companion more effectively.
In some embodiments, when a patient receives the on-site medical consultation service in the consulting room, upon detecting that the patient needs to communicate with a remote companion, the consultation module 430 may prompt the patient and the doctor to wear the first XR device and the second XR device, respectively, and control the first and second XR devices to present image data of the remote companion. That is, when the patient does not need to communicate with a remote companion, the patient may communicate with the doctor face-to-face without wearing an XR device; when the patient does need to communicate with a remote companion, the patient and the doctor can wear the XR devices, thereby achieving a better communication experience.
In some embodiments of the present description, the communication needs of a patient can be detected based on the perception information and satisfied in a timely manner, providing more humanized care to the patient and a more realistic, immersive companion experience.
FIG. 15 is a flowchart illustrating an exemplary process of generating an initial diagnostic record, according to some embodiments of the present description. The flow 1500 shown in fig. 15 may be used to implement step 1322.
At step 1510, key content is extracted from the perception information based on a diagnostic record template.
The diagnostic record template is used to define the format and content of the diagnostic record. For example, a diagnostic record template includes a plurality of template fields arranged in a particular format, the template fields representing what is to be included in the diagnostic record. In some embodiments, the diagnostic record templates may include template fields for patient medical records (including patient basic information, descriptions of patient conditions, physical examination data, etc.), diagnostic comments, diagnostic prescriptions (e.g., treatment prescriptions and examination prescriptions), medical orders, and the like.
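The template structure described above can be illustrated with a minimal sketch; the concrete field names are examples chosen for illustration, not fields fixed by the disclosure.

```python
# Illustrative diagnostic record template: an ordered mapping of template
# fields to (initially empty) content slots. Field names are assumptions.
DIAGNOSTIC_RECORD_TEMPLATE = {
    "patient_basic_information": "",
    "condition_description": "",
    "physical_examination_data": "",
    "diagnostic_opinion": "",
    "treatment_prescription": "",
    "examination_prescription": "",
    "medical_orders": "",
}

def template_fields(template):
    """Return the template fields in their defined order."""
    return list(template)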
The key content is content related to the template fields in the diagnostic record template. In some embodiments, the perception information comprises a speech signal. Since the speech signal records the dialogue between the patient and the doctor, the key content extracted from the speech signal is in natural-language form. Specifically, the consultation module 430 may transcribe the speech signal into text and extract key content from the transcribed text based on the plurality of template fields in the diagnostic record template. For example, based on the template field "examination prescription", the key content "CT on the leg" may be extracted from the transcribed text.
In some embodiments, the consultation module 430 may extract key content from physical-sign data acquired by sign-monitoring devices during the visit. For example, when the template field "blood pressure" is included in the diagnostic record template, the patient's blood pressure value may be extracted from the data collected by a blood pressure monitor as key content.
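Step 1510 can be sketched as a field-driven extraction over the transcribed text. The pattern table below is a stand-in assumption: the embodiment does not specify the extraction mechanism, and a production system would likely use a trained model rather than regular expressions.

```python
import re

# Illustrative field-to-pattern table for extracting key content from a
# transcript; both fields and patterns are assumptions for demonstration.
FIELD_PATTERNS = {
    "examination prescription": re.compile(r"(CT|MRI|X-ray)[^.]*"),
    "blood pressure": re.compile(r"\b\d{2,3}/\d{2,3}\s*mmHg"),
}

def extract_key_content(transcript, template_fields):
    """Return {template_field: extracted text} for each field with a match."""
    key_content = {}
    for field in template_fields:
        pattern = FIELD_PATTERNS.get(field)
        if pattern:
            match = pattern.search(transcript)
            if match:
                key_content[field] = match.group(0).strip()
    return key_content
```

Running it on a short dialogue fragment shows how each template field indexes one piece of key content.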
At step 1520, the key content is converted to professional content based on the knowledge dictionary.
The knowledge dictionary is a lookup dictionary that maps natural-language descriptions to professional-language descriptions. Specifically, the consultation module 430 may, using the key content as an index, retrieve the corresponding professional language from the knowledge dictionary as professional content. For example, based on the knowledge dictionary, the key content "CT on the leg" may be converted into the professional content "lower-extremity computed tomography (CT) examination".
In some embodiments, the consultation module 430 may convert the key content into professional content based on a term conversion model. For a detailed description of the term conversion model, see the related description of step 830.
In some embodiments, the professional content may be generated using a knowledge dictionary and/or a term conversion model corresponding to the patient's registered department, where different departments correspond to different knowledge dictionaries and/or term conversion models.
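Step 1520's dictionary lookup, with the per-department selection just described, can be sketched as follows. The department names and dictionary entries are invented placeholders, not content from the disclosure.

```python
# Illustrative per-department knowledge dictionaries mapping natural-language
# key content to professional terminology. All entries are assumptions.
KNOWLEDGE_DICTIONARIES = {
    "orthopedics": {
        "ct on the leg": "lower-extremity computed tomography (CT) examination",
        "sore knee": "knee-joint pain",
    },
}

def to_professional_content(key_content, department):
    """Look up professional terminology; fall back to the original phrasing."""
    dictionary = KNOWLEDGE_DICTIONARIES.get(department, {})
    return dictionary.get(key_content.lower(), key_content)
```

The fallback branch reflects a design choice this sketch assumes: content with no dictionary entry is passed through unchanged for the doctor to review, rather than dropped.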
In step 1530, the diagnostic record template is updated based on the professional content and a knowledge database, generating an initial diagnostic record.
The knowledge database refers to the knowledge database of the registered department, which includes the visit specifications of the department (such as condition description specifications, diagnosis specifications, prescription specifications, and medical order specifications). In some embodiments, the consultation module 430 may evaluate and/or adjust the professional content based on the knowledge database so that it conforms to the department's visit specifications. Further, the consultation module 430 may populate the corresponding template fields in the diagnostic record template with the evaluated or adjusted professional content to generate the initial diagnostic record. In some embodiments, the consultation module 430 may generate the initial diagnostic record further based on the patient's electronic health record. For example, the consultation module 430 may search the electronic health record for content corresponding to the template fields, evaluate and/or adjust that content according to the knowledge database, and fill the evaluated or adjusted content into the corresponding template fields in the diagnostic record template.
In some embodiments of the present description, the initial diagnostic record is generated based on the knowledge database such that the initial diagnostic record conforms to the visit specifications of the registered department, thereby improving the accuracy of the initial diagnostic record.
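The template-filling of step 1530 can be sketched as below. The specification check is reduced here to an allowed-values lookup, which is an assumption; the disclosure only says the content is evaluated and/or adjusted against the department's visit specifications.

```python
# Illustrative step 1530: fill template fields with professional content after
# a (toy) conformance check against the department knowledge database.
def generate_initial_record(template_fields, professional_content, knowledge_db):
    record = {}
    for field in template_fields:
        value = professional_content.get(field, "")
        allowed = knowledge_db.get(field)
        if allowed is not None and value not in allowed:
            value = ""  # leave non-conforming content for the doctor to complete
        record[field] = value
    return record
```

Fields with no entry in the knowledge database are filled as-is; fields whose content fails the check are left empty so the doctor's feedback in the later stage can complete them.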
Fig. 16 is a schematic diagram illustrating an exemplary process for providing remote companion services according to some embodiments of the present description.
Step 1610, a remote companion request initiated by the patient before the visit is received.
The remote companion request is used to request that a remote companion participate in the visit process and communicate remotely with the doctor and the patient. In some embodiments, the remote companion request may include patient information (e.g., basic information, registration information, visit mode), information about the remote companion, information about the fourth terminal of the remote companion, and the like. In some embodiments, the patient may initiate the remote companion request through a patient space application installed on the third terminal.
Step 1620, the request is sent to the second terminal of the doctor for approval.
For example only, the consultation module 430 may send the request to the doctor's second terminal via the network, and after receiving the remote companion request the second terminal may prompt the doctor for approval by vibration, ringtone, or the like. In some embodiments, before the doctor provides outpatient service, the second terminal may display all remote companion requests submitted by patients that day for the doctor to approve together. In some embodiments, the current-day outpatient preview may contain information on whether a patient has submitted a remote companion request, which the doctor may approve when reviewing the preview. For more description of the current-day outpatient preview, reference may be made to fig. 5.
Step 1630, in response to determining that the request is approved, controlling at least one terminal to display a second interface element associated with the remote companion service.
When the patient receives the on-site medical visit service in the consulting room, the at least one terminal may include the first terminal and/or the third terminal. The second interface element on the first terminal and/or the third terminal may be used to present a real-time picture of the remote companion, while the sound output device on the first terminal and/or the third terminal plays the remote companion's real-time sound. The at least one terminal further comprises the fourth terminal of the remote companion. The second interface element on the fourth terminal is used to present real-time pictures of the doctor and the patient in the consulting room, while the sound output device on the fourth terminal plays their real-time sounds. In some embodiments, the fourth terminal of the remote companion may be an XR device. The consultation module 430 may generate a virtual consulting room space based on real-time monitoring information of the consulting room; the virtual consulting room space is then displayed to the remote companion via the remote companion's XR device, giving the remote companion an immersive experience of the consulting room scene. For a detailed description of the virtual consulting room space, see fig. 11 and its associated description.
When the patient receives the remote medical visit service, the at least one terminal may include the second terminal of the doctor, the third terminal of the patient, and the fourth terminal of the remote companion. The second interface elements on the second terminal and the third terminal may be used to present a real-time picture of the remote companion. The second interface element on the fourth terminal may be used to present a real-time picture of the patient and the doctor. Meanwhile, the sound output device of each terminal may play the real-time sound of the other participants in the visit process.
In some embodiments, when the request is approved, the consultation module 430 may send a companion entry link to the third terminal of the patient and/or the fourth terminal of the remote companion. During the patient's visit, the remote companion can access the remote companion service through the entry link. In some embodiments, the at least one terminal may display the second interface element when it is detected that the patient's visit has begun.
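The request-approval-display flow of steps 1610-1630 can be sketched as a small state machine. The terminal identifiers and the shape of the state dictionary are assumptions made for illustration only.

```python
# Illustrative remote companion flow: receive the request, route it for
# approval, then select which terminals display the second interface element.
def handle_companion_request(request, approve):
    """approve: callable standing in for the doctor's decision on the second terminal."""
    state = {"approved": False, "display_on": []}
    if approve(request):  # step 1620: doctor approval
        state["approved"] = True
    return state

def on_visit_start(state, mode):
    """Step 1630: choose display terminals once the visit is detected to begin."""
    if not state["approved"]:
        return state
    if mode == "on_site":
        state["display_on"] = ["first_terminal", "fourth_terminal"]
    else:  # remote medical visit
        state["display_on"] = ["second_terminal", "third_terminal", "fourth_terminal"]
    return state
```

The two terminal lists mirror the two visit modes described above: on-site visits use the consulting room terminal plus the companion's terminal, while remote visits involve the doctor's, patient's, and companion's terminals.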
In some embodiments of the present description, unobstructed real-time communication among the patient, the doctor, and the remote companion can be achieved through the remote companion service, improving the visit experience.
Fig. 17 is a schematic diagram of an exemplary process for providing healthcare services according to some embodiments of the present description. The process 1700 shown in FIG. 17 may be performed in the post-diagnosis stage 540.
At step 1710, a healthcare plan is generated based on the target diagnostic record.
The healthcare plan is used to monitor the health of a patient after the patient has finished the consultation. In some embodiments, the healthcare plan may monitor whether the patient is following the prescription and the order. In some embodiments, monitoring the health condition may include one or more of medication monitoring, health habit monitoring, and physiological data monitoring, among others. The medication monitoring is used for monitoring medication conditions (such as medication time, medication type, medication mode and medication amount) of patients. The health habit monitoring is used for monitoring life habits such as exercise, diet and the like of patients. Physiological data monitoring is used to monitor physiological data of a patient. The healthcare plan may define monitoring parameters, monitoring frequencies, monitoring times, associated thresholds, etc.
The healthcare plan may be generated based on the target diagnostic prescription, the target medical orders, and other content in the target diagnostic record. For example, the post-diagnosis service module 440 may generate a medication monitoring plan based on the target diagnostic prescription, and generate a health habit monitoring plan and a physiological data monitoring plan based on the target medical orders in the target diagnostic record. In some embodiments, the post-diagnosis service module 440 may further generate the healthcare plan based on other data of the patient (e.g., other data in the electronic health record).
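The derivation of monitoring sub-plans from a target diagnostic record can be sketched as below. The record fields (`treatment_prescription`, `medical_orders`) and order types are illustrative assumptions about the record's shape, not fields fixed by the disclosure.

```python
# Illustrative step 1710: derive medication, habit, and physiological
# monitoring sub-plans from a target diagnostic record.
def generate_healthcare_plan(target_record):
    plan = {}
    prescription = target_record.get("treatment_prescription")
    if prescription:
        plan["medication_monitoring"] = {
            "drug": prescription["drug"],
            "times_per_day": prescription.get("times_per_day", 1),
        }
    for order in target_record.get("medical_orders", []):
        if order["type"] == "exercise":
            plan["habit_monitoring"] = {"activity": order["detail"]}
        elif order["type"] == "measurement":
            plan["physiological_monitoring"] = {"parameter": order["detail"]}
    return plan
```

Each sub-plan would additionally carry monitoring frequencies, times, and thresholds as the paragraph above describes; they are omitted here to keep the mapping from record fields to sub-plans visible.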
After the healthcare plan is generated, one or more monitoring devices of the patient may be controlled to collect healthcare information for the patient based on the healthcare plan. The monitoring device may include any device capable of collecting health monitoring information of a patient, such as a home monitoring device (e.g., an image sensor, a sound sensor), a smart medicine box, a smart injection device, a smart watch, a physical examination device, etc. In particular, steps 1720-1740 can be performed to gather healthcare information.
At 1720, healthcare instructions are generated based on the healthcare plan. At step 1730, healthcare instructions are sent to one or more monitoring devices. At 1740, healthcare information for the patient is obtained from the one or more monitoring devices.
The healthcare instruction is an instruction that instructs a monitoring device to conduct healthcare for the patient. In some embodiments, corresponding to the healthcare plan, the healthcare instructions may include one or more of medication instructions, health habit instructions, physiological data monitoring instructions, and the like, and the healthcare information may include one or more of the patient's medication information, health habit information (e.g., motion information, dietary information), physiological parameter information, and the like. For example, a medication instruction may instruct the smart medicine box to remind the patient to take medicine at the scheduled medication time and collect the patient's medication information. As another example, a health habit instruction may instruct the smart watch to remind the patient to exercise at the scheduled exercise time and collect the patient's motion information. As another example, a physiological data monitoring instruction may instruct the physical examination device to remind the patient to measure a specific physiological parameter at the scheduled examination time and record the patient's physiological parameter information.
Upon receiving the healthcare instruction, the monitoring device may perform a specific operation based on the healthcare instruction and send the collected healthcare information to the processing device 210.
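Steps 1720-1740 can be sketched as building per-device instructions from the plan and gathering the replies. The device names and message shape are assumptions; `send` stands in for whatever transport the system uses to reach a monitoring device.

```python
# Illustrative steps 1720-1740: map each monitoring sub-plan to a device,
# build instructions, and collect the healthcare information returned.
DEVICE_FOR = {
    "medication_monitoring": "smart_medicine_box",
    "habit_monitoring": "smart_watch",
    "physiological_monitoring": "physical_examination_device",
}

def build_instructions(plan):
    return [
        {"device": DEVICE_FOR[kind], "kind": kind, "params": params}
        for kind, params in plan.items()
    ]

def collect_healthcare_info(instructions, send):
    """send: callable standing in for transmitting an instruction and awaiting the reply."""
    return {inst["kind"]: send(inst) for inst in instructions}
```

Injecting `send` keeps the sketch independent of any particular network stack and mirrors the separation between step 1730 (sending) and step 1740 (obtaining information).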
Step 1750, updating the healthcare plan based on the healthcare information.
Specifically, the post-diagnosis service module 440 may evaluate the patient's health condition based on the patient's healthcare information and update the healthcare plan based on the evaluation result. For example, when the patient's health condition is determined to be poor based on the healthcare information, the patient's medication monitoring plan (e.g., the medication amount), health habit monitoring plan (e.g., an increased exercise duration), and physiological data monitoring instructions (e.g., a shorter interval between blood pressure measurements) may be updated. In some embodiments, the post-diagnosis service module 440 may evaluate the patient's health risk level based on the healthcare information and update the healthcare plan based on the health risk level.
In some embodiments, when a change (e.g., improvement or deterioration) in the patient's health condition is detected, the post-diagnosis service module 440 may send the healthcare information to the doctor, who determines whether the patient's healthcare plan needs to be updated. In some embodiments, when an abnormality in the patient's health condition is detected, the post-diagnosis service module 440 may issue a medical alert to the patient, for example, through the patient space application on the patient's third terminal. In some embodiments of the present description, a healthcare plan may be generated based on the target diagnostic record, the patient may receive healthcare according to the plan, and the plan may then be updated based on the collected healthcare information. On the one hand, the patient can be reminded to follow the medical orders and prescription in the target diagnostic record; on the other hand, the patient's health condition can be understood in depth so that the patient's medication, living habits, and the like can be adjusted accordingly in a timely manner.
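The risk-level evaluation and plan update of step 1750 can be sketched as follows. The scoring rule and the chosen thresholds are toy placeholders; only the flow (assess, then adjust the relevant sub-plan) comes from the description above.

```python
# Illustrative step 1750: score a health risk level from collected healthcare
# information and tighten the physiological monitoring plan when risk is high.
def assess_risk(info):
    score = 0
    if info.get("missed_doses", 0) > 0:
        score += 1
    if info.get("blood_pressure_high"):
        score += 2
    return "high" if score >= 2 else "low"

def update_plan(plan, info):
    if assess_risk(info) == "high":
        monitoring = plan.setdefault("physiological_monitoring", {})
        # Shorten the blood-pressure measurement interval for high-risk patients.
        monitoring["interval_hours"] = min(monitoring.get("interval_hours", 24), 6)
    return plan
```

A low-risk evaluation leaves the plan untouched, matching the description that the plan is adjusted only when the evaluation warrants it.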
Fig. 18 is a schematic diagram of an exemplary medical visit procedure 1800 shown in accordance with some embodiments of the present description. The visit procedure 1800 is used to provide the on-site medical visit service, in which the patient goes to the hospital to communicate face-to-face with the doctor in the consulting room. As depicted in fig. 18, the visit procedure 1800 includes a registration link 1810, a waiting link 1820, and an inquiry link 1830.
In a registration link 1810, a first query is made to the patient by controlling a third terminal of the patient to determine a registration doctor of the patient. Specifically, registration link 1810 may be performed by registration module 410.
A detailed description of the first query, the third terminal, and the determination of the registering doctor may be found in fig. 5-7 and the associated description.
In the waiting link 1820, a second query is made to the patient by controlling the third terminal of the patient to generate a pre-consultation record. Specifically, the waiting link 1820 may be performed by the waiting module 420.
A related description of the second query and generation of the pre-query record may be found in fig. 5 and 8 and related descriptions thereof.
In the inquiry link 1830, in response to detecting that the patient has begun the visit, a third interface element is displayed to the doctor and the patient using the first terminal. The third interface element is associated with the pre-consultation record. For further description of the third interface element and the pre-consultation record, see figs. 8, 9, and 11 and their associated descriptions.
In the inquiry link 1830, an initial diagnostic record may also be generated based on the perception information collected by the one or more perception devices in the consulting room during the consultation. For a detailed description of the initial diagnostic record, reference may be made to figs. 13 and 15 and their associated descriptions.
In some embodiments of the present description, a first query is initiated to the patient via the patient's third terminal to provide a registration service, thereby improving the efficiency and accuracy of registration. A second query is initiated while the patient waits, providing a pre-consultation service and generating a pre-consultation record; the pre-consultation record can be displayed to the patient and the doctor in the consulting room after the visit begins, improving the doctor's consultation efficiency. By generating the initial diagnostic record based on the perception information, the doctor's paperwork can be reduced and the doctor's overall working efficiency improved.
Fig. 19 is a schematic diagram of an exemplary medical visit procedure 1900 shown in accordance with some embodiments of the present description. The visit procedure 1900 is used to provide the remote medical visit service, in which the patient can communicate with the doctor online from any location without going to the hospital. As depicted in fig. 19, the visit procedure 1900 includes a registration link 1910, a waiting link 1920, and an inquiry link 1930.
The registration link 1910 is similar to the registration link 1810 in fig. 18. The waiting link 1920 is similar to the waiting link 1820 in fig. 18.
In the inquiry link 1930, in response to detecting that the patient has begun the visit, the second terminal of the doctor and the third terminal of the patient are controlled to respectively present third interface elements related to the pre-consultation record. For further description of the third interface element and the pre-consultation record, see figs. 8, 9, and 11 and their associated descriptions.
In some embodiments, the second terminal may include the second XR device, the third terminal may include the first XR device, and the first XR device and the second XR device may present a virtual consulting room space to the patient and the doctor, respectively. For a detailed description of presenting the virtual consulting room space, see fig. 11 and its associated description. When a remote companion participates in the visit process, the virtual consulting room space may be presented synchronously through a third XR device of the remote companion. In some embodiments of the present description, by presenting the virtual consulting room space so that the visit participants (the patient, the doctor, the remote companion, etc.) can communicate remotely, the distance and time constraints on patients and remote companions are reduced.
In the inquiry link 1930, an initial diagnostic record may further be generated based on the perception information collected by the second terminal and the third terminal during the visit. In some embodiments, the consultation module 430 may collect the patient's physical examination data based on the doctor's second XR device and the patient's wearable device. The initial diagnostic record may be generated based on the perception information and the physical examination data. For a detailed description of generating the initial diagnostic record, reference may be made to figs. 11 and 15 and their associated descriptions.
In particular, the consultation module 430 may acquire or generate a three-dimensional patient model. The three-dimensional patient model may correspond to the patient or to a portion of the patient (e.g., the upper body). For example only, the consultation module 430 may obtain an initial three-dimensional patient model from the patient's electronic health record and update it based on the patient's real-time dynamic data and physiological data to obtain the three-dimensional patient model. Further, the consultation module 430 may present the three-dimensional patient model to the doctor via the second XR device and obtain the doctor's physical examination instructions. Specifically, the consultation module 430 may display the three-dimensional patient model in the doctor's second field of view via the second XR device. The second field of view may be the real field of view within the doctor's gaze range, or may be a virtual background. The doctor's physical examination instructions may specify an examination site, an examination device, and an examination operation, and the doctor may input them in various ways such as voice, gestures, or operating an input device (e.g., smart gloves or a smart handle). For example, the second XR device may present virtual examination devices corresponding to a variety of examination devices; the doctor may select a virtual examination device via the input device and perform a virtual examination operation on the three-dimensional patient model using it. The consultation module 430 may determine the examination site, examination device, examination operation, etc. based on the virtual examination operation performed by the doctor, thereby generating the physical examination instructions.
In addition to the physical examination instructions, the doctor may also input other instructions to instruct the second XR device to rotate, zoom in on, or zoom out of the three-dimensional patient model.
After acquiring the doctor's physical examination instructions, the consultation module 430 may control the patient's wearable device to collect the patient's physical examination data. For example, based on the physical examination instructions, the consultation module 430 may control the patient's smart watch to collect the patient's pulse over one minute.
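The routing of a physical examination instruction to the matching wearable can be sketched as follows. The parameter-to-device table and the `read_sensor` callable are assumptions standing in for the wearable integration the embodiment leaves unspecified.

```python
# Illustrative remote physical examination: route an instruction derived from
# the doctor's virtual examination operation to a wearable and collect data.
WEARABLE_FOR = {"pulse": "smart_watch", "blood_pressure": "cuff_monitor"}

def run_physical_exam(instruction, read_sensor):
    """read_sensor: callable (device, parameter, duration_s) -> measured value."""
    device = WEARABLE_FOR[instruction["parameter"]]
    value = read_sensor(device, instruction["parameter"],
                        instruction.get("duration_s", 60))
    return {"device": device, "parameter": instruction["parameter"], "value": value}
```

The 60-second default mirrors the one-minute pulse collection in the example above.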
In some embodiments of the present description, the doctor and the patient may communicate remotely in the virtual consulting room space through the second XR device and the first XR device, giving both an unobstructed and immersive visit experience. Moreover, by presenting a three-dimensional patient model to the doctor, remote physical examination can be performed more accurately and conveniently, improving diagnostic accuracy.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly stated herein, various modifications, improvements, and adaptations of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by this specification and are intended to fall within the spirit and scope of its exemplary embodiments.
Meanwhile, this specification uses specific terms to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is included in at least one embodiment of the present description. Thus, it should be emphasized and appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification are not necessarily referring to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of the present description may be combined as appropriate.
Furthermore, unless explicitly recited in the claims, the order of processing elements and sequences, the use of alphanumeric labels, or the use of other designations in this specification is not intended to limit the order of the described processes and methods. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments; on the contrary, the claims are intended to cover all modifications and equivalent arrangements within the spirit and scope of the embodiments of this specification. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as by installing the described system on an existing server or mobile device.
Likewise, it should be noted that, to simplify the presentation of this specification and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure does not imply that the subject matter of this specification requires more features than are recited in the claims. Indeed, claimed subject matter may lie in fewer than all features of a single disclosed embodiment.
In some embodiments, numbers describing quantities of components or attributes are used; it should be understood that such numbers used in the description of the embodiments are in some instances qualified by the modifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that a variation of 20% in the stated number is allowed. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending on the desired properties of the individual embodiment. In some embodiments, numerical parameters should take into account the specified significant digits and employ a general method of preserving digits. Although the numerical ranges and parameters used in some embodiments of this specification to confirm the breadth of their ranges are approximations, in particular embodiments such numerical values are set as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, and documents, referred to in this specification is hereby incorporated by reference in its entirety, except for any application history document that is inconsistent with or conflicts with the content of this specification, and except for any document (whether attached now or later) that would limit the broadest scope of the claims of this specification. It is noted that if the description, definition, and/or use of a term in material attached to this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are possible within the scope of this description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (87)

1. A method of providing medical services for a doctor, performed by at least one processor, comprising:
performing a registration inquiry on a patient through a patient terminal of the patient to determine a doctor with whom the patient registers;
performing, based on the department of the doctor, a pre-consultation inquiry on the patient through the patient terminal to generate a pre-consultation record of the patient;
in response to detecting that the patient begins a consultation, displaying an interface element related to the pre-consultation record through a consulting room terminal in a consulting room to present the pre-consultation record to the doctor and the patient; and
generating an initial diagnostic record based on perception information acquired by a perception device in the consulting room during the patient's visit, wherein:
the performing a registration inquiry on the patient through the patient terminal of the patient includes displaying a first virtual character through the patient terminal to perform the registration inquiry, and
the performing a pre-consultation inquiry on the patient through the patient terminal includes displaying a second virtual character through the patient terminal to perform the pre-consultation inquiry.
2. The method of claim 1, wherein,
an appearance characteristic of the first virtual character is determined based on basic information of the patient; and
an appearance characteristic of the second virtual character is determined based on an optical image of the doctor.
3. The method of claim 1, wherein the method further comprises:
receiving a remote co-diagnosis request initiated by the patient before the visit of the patient;
transmitting the remote co-diagnosis request to a doctor terminal of the doctor for approval; and
in response to determining that the remote co-diagnosis request is approved and detecting that the patient begins a consultation, presenting interface elements related to a remote co-diagnosis service through the consulting room terminal.
4. The method of claim 3, wherein the method further comprises:
during the visit of the patient,
determining, based on the perception information, whether the patient needs to communicate with a remote companion; and
in response to determining that the patient needs to communicate with the remote companion, controlling the consulting room terminal to enlarge the interface elements related to the remote co-diagnosis service.
5. The method of claim 1, wherein the registering inquiry of the patient through the patient terminal to determine the doctor with whom the patient is registered comprises:
acquiring a patient complaint of the patient through the patient terminal;
determining at least one candidate department based on the patient complaint;
performing the registering inquiry on the patient through the patient terminal based on the at least one candidate department; and
determining the doctor based on first data acquired by the patient terminal in the registering inquiry.
6. The method of claim 5, wherein the registering inquiry comprises one or more rounds of inquiry, and the inquiry content of the first round of inquiry is determined by one of the following:
inputting the patient complaint and the at least one candidate department into a first query model that outputs the content of the first round of inquiry, the first query model being a trained machine learning model; or
acquiring, based on a preset keyword-department comparison table, keywords corresponding to each candidate department of the at least one candidate department, determining difference words between the keywords corresponding to any two candidate departments, and determining the content of the first round of inquiry according to the difference words.
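The table-lookup branch of claim 6 above can be illustrated with a short sketch. This is not part of the patent: the keyword table, department names, question wording, and function names are all hypothetical stand-ins for the preset keyword-department comparison table described in the claims.

```python
# Hypothetical keyword-department comparison table (toy data, not the patent's).
KEYWORD_TABLE = {
    "cardiology": {"chest pain", "palpitations", "shortness of breath"},
    "gastroenterology": {"chest pain", "stomach ache", "nausea"},
}

def difference_words(dept_a, dept_b, table=KEYWORD_TABLE):
    """Keywords that distinguish dept_a from dept_b (symmetric difference)."""
    return table[dept_a] ^ table[dept_b]

def first_round_question(candidates, table=KEYWORD_TABLE):
    """Build first-round inquiry content from all pairwise difference words."""
    words = set()
    for i, a in enumerate(candidates):
        for b in candidates[i + 1:]:
            words |= difference_words(a, b, table)
    return "Do you also have any of: " + ", ".join(sorted(words)) + "?"
```

Note that a keyword shared by both candidate departments ("chest pain" above) never appears in the question, which is exactly why the difference words are the discriminating ones to ask about.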
7. The method of claim 1, wherein the method further comprises:
after the patient finishes registering, determining an estimated waiting time of the patient based on a current-day visit record of the doctor and a registration record of the patient; and
in response to the estimated waiting time being greater than a first preset time threshold and less than a second preset time threshold, performing the pre-consultation inquiry on the patient through the patient terminal, the second preset time threshold being greater than the first preset time threshold.
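The two-threshold gate of claim 7 above amounts to a simple window test: the pre-consultation inquiry is triggered only when the wait is long enough to fill with questions but short enough that the patient is still likely to be seen soon. A toy sketch, with hypothetical threshold values:

```python
# Hypothetical thresholds in minutes; the patent does not specify values.
FIRST_THRESHOLD_MIN = 10    # below this, the patient will be called in soon
SECOND_THRESHOLD_MIN = 120  # above this, pre-consultation is deferred

def should_pre_inquire(estimated_wait_min: float) -> bool:
    """Trigger the pre-consultation inquiry only inside the open window."""
    return FIRST_THRESHOLD_MIN < estimated_wait_min < SECOND_THRESHOLD_MIN
```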
8. The method of claim 1, wherein the pre-consultation inquiry is performed based on pre-consultation inquiry content and includes a plurality of rounds of inquiry, the pre-consultation inquiry content including inquiry content of each round of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining semantic information and emotion information of historical answers of the patient based on second data acquired by the patient terminal before the current round of inquiry;
adjusting the inquiry content of the current round of inquiry based on the semantic information and the emotion information; and
performing the current round of inquiry through the patient terminal based on the adjusted inquiry content.
9. The method of claim 1, wherein the pre-consultation inquiry comprises a plurality of rounds of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining inquiry content corresponding to the current round of inquiry by using an inquiry content determination model based on inquiry content of historical inquiries, historical answers of the patient, and known information of the patient, the inquiry content determination model being a trained machine learning model; and
performing the current round of inquiry through the patient terminal based on the inquiry content corresponding to the current round of inquiry.
10. The method of claim 1, wherein the processor is configured with a diagnostic agent that implements self-evolution based on artificial intelligence techniques, and the processor generates the initial diagnostic record by using the diagnostic agent to:
extract key content from the perception information based on a diagnostic record template;
convert the key content into professional content based on a knowledge dictionary; and
update the diagnostic record template based on the professional content and a knowledge database to generate the initial diagnostic record.
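The three steps of claim 10 above (extract key content guided by a template, convert lay wording to professional terms via a knowledge dictionary, fill the template) can be sketched as below. The dictionary entries, template fields, and selection heuristic are toy stand-ins, not the patent's data or its actual agent:

```python
# Toy knowledge dictionary mapping lay terms to professional terms.
KNOWLEDGE_DICT = {"tummy ache": "abdominal pain", "throwing up": "emesis"}
# Toy diagnostic record template.
TEMPLATE = {"chief_complaint": None, "symptoms": None}

def extract_key_content(utterances):
    """Keep only utterances mentioning a known lay term (trivial heuristic)."""
    return [u for u in utterances if any(k in u for k in KNOWLEDGE_DICT)]

def to_professional(text):
    """Rewrite lay terms as professional terms via the dictionary."""
    for lay, pro in KNOWLEDGE_DICT.items():
        text = text.replace(lay, pro)
    return text

def generate_initial_record(utterances):
    """Extract, professionalize, then fill the template fields."""
    key = [to_professional(u) for u in extract_key_content(utterances)]
    record = dict(TEMPLATE)
    record["chief_complaint"] = key[0] if key else ""
    record["symptoms"] = "; ".join(key)
    return record
```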
11. The method of claim 1, wherein the method further comprises:
in response to detecting that the patient begins a consultation, displaying, through the consulting room terminal in the consulting room, interface elements related to an electronic health record of the patient;
acquiring, based on the perception information, control instructions initiated by the patient and/or the doctor, the control instructions being used to retrieve at least a portion of the electronic health record; and
in response to the control instructions, retrieving the at least a portion of the electronic health record and presenting it to the patient and the doctor through the consulting room terminal.
12. The method of claim 1, wherein the processor is configured with a consultation agent that implements self-evolution based on artificial intelligence techniques, the method further comprising:
processing the perception information with the consultation agent to generate a consultation suggestion, the consultation suggestion including at least one of a supplemental inquiry suggestion, a physical examination suggestion, a prescription suggestion, and a treatment plan suggestion.
13. The method of claim 1, wherein the method further comprises:
detecting, based on the perception information, whether a visit process of the patient has ended;
in response to detecting that the visit process has ended, generating a target diagnostic record based on the initial diagnostic record and feedback information entered by the doctor for the initial diagnostic record;
determining, based on the target diagnostic record, a target service to be provided to the patient after the visit process; and
making a reservation for the patient with a target business department providing the target service.
14. The method of claim 1, wherein the patient terminal presents the first virtual character and the second virtual character using augmented reality technology.
15. A method of providing medical services for a doctor, performed by at least one processor, comprising:
performing a registering inquiry on a patient through a patient terminal of the patient to determine a doctor with whom the patient is registered;
performing, based on the department of the doctor, a pre-consultation inquiry on the patient through the patient terminal to generate a pre-consultation record of the patient;
in response to detecting that the patient begins a consultation, displaying interface elements related to the pre-consultation record through the patient terminal and a doctor terminal of the doctor, respectively, to present the pre-consultation record to the patient and the doctor; and
generating an initial diagnostic record based on perception information acquired by the patient terminal and the doctor terminal during a visit of the patient, wherein:
the registering inquiry of the patient through the patient terminal of the patient includes displaying a first virtual character through the patient terminal to perform the registering inquiry, and
the pre-consultation inquiry of the patient through the patient terminal includes displaying a second virtual character through the patient terminal to perform the pre-consultation inquiry.
16. The method of claim 15, wherein the method further comprises:
presenting virtual consulting room spaces to the patient and the doctor, respectively, through the patient terminal and the doctor terminal using augmented reality technology.
17. The method of claim 15, wherein,
The appearance characteristics of the first virtual character are determined based on basic information of the patient; and
the appearance characteristics of the second virtual character are determined based on an optical image of the doctor.
18. The method of claim 15, wherein the generating an initial diagnostic record comprises:
displaying a three-dimensional model of the patient to the doctor through the doctor terminal using augmented reality technology;
acquiring a physical examination instruction of the doctor through the doctor terminal;
controlling, based on the physical examination instruction, a wearable device of the patient to acquire physical examination data of the patient; and
generating the initial diagnostic record based on the physical examination data and the perception information.
19. The method of claim 15, wherein the method further comprises:
receiving a remote co-diagnosis request initiated by the patient before the visit of the patient;
transmitting the remote co-diagnosis request to the doctor terminal for approval; and
in response to determining that the remote co-diagnosis request is approved, presenting interface elements related to a remote co-diagnosis service through the patient terminal and the doctor terminal.
20. The method of claim 19, wherein the method further comprises:
during the visit of the patient,
determining, based on the perception information, whether the patient needs to communicate with a remote companion; and
in response to determining that the patient needs to communicate with the remote companion, controlling the patient terminal and the doctor terminal to respectively enlarge the interface elements related to the remote co-diagnosis service.
21. The method of claim 15, wherein the registering inquiry of the patient through the patient terminal to determine the doctor with whom the patient is registered comprises:
acquiring a patient complaint of the patient through the patient terminal;
determining at least one candidate department based on the patient complaint;
performing the registering inquiry on the patient through the patient terminal based on the at least one candidate department; and
determining the doctor based on first data acquired by the patient terminal in the registering inquiry.
22. The method of claim 21, wherein the registering inquiry comprises one or more rounds of inquiry, and the inquiry content of the first round of inquiry is determined by one of the following:
inputting the patient complaint and the at least one candidate department into a first query model that outputs the content of the first round of inquiry, the first query model being a trained machine learning model; or
acquiring, based on a preset keyword-department comparison table, keywords corresponding to each candidate department of the at least one candidate department, determining difference words between the keywords corresponding to any two candidate departments, and determining the content of the first round of inquiry according to the difference words.
23. The method of claim 15, wherein the method further comprises:
after the patient finishes registering, determining an estimated waiting time of the patient based on a current-day visit record of the doctor and a registration record of the patient; and
in response to the estimated waiting time being greater than a first preset time threshold and less than a second preset time threshold, performing the pre-consultation inquiry on the patient through the patient terminal, the second preset time threshold being greater than the first preset time threshold.
24. The method of claim 15, wherein the pre-consultation inquiry is performed based on pre-consultation inquiry content and includes a plurality of rounds of inquiry, the pre-consultation inquiry content including inquiry content of each round of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining semantic information and emotion information of historical answers of the patient based on second data acquired by the patient terminal before the current round of inquiry;
adjusting the inquiry content of the current round of inquiry based on the semantic information and the emotion information; and
performing the current round of inquiry through the patient terminal based on the adjusted inquiry content.
25. The method of claim 15, wherein the pre-consultation inquiry comprises a plurality of rounds of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining inquiry content corresponding to the current round of inquiry by using an inquiry content determination model based on inquiry content of historical inquiries, historical answers of the patient, and known information of the patient, the inquiry content determination model being a trained machine learning model; and
performing the current round of inquiry through the patient terminal based on the inquiry content corresponding to the current round of inquiry.
26. The method of claim 15, wherein the processor is configured with a diagnostic agent that implements self-evolution based on artificial intelligence techniques, and the processor generates the initial diagnostic record by using the diagnostic agent to:
extract key content from the perception information based on a diagnostic record template;
convert the key content into professional content based on a knowledge dictionary; and
update the diagnostic record template based on the professional content and a knowledge database to generate the initial diagnostic record.
27. The method of claim 15, wherein the method further comprises:
in response to detecting that the patient begins a consultation, displaying, through the patient terminal and the doctor terminal, respectively, interface elements related to an electronic health record of the patient;
acquiring, based on the perception information, control instructions initiated by the patient and/or the doctor, the control instructions being used to retrieve at least a portion of the electronic health record; and
in response to the control instructions, retrieving the at least a portion of the electronic health record and presenting it to the patient and the doctor through the patient terminal and the doctor terminal, respectively.
28. The method of claim 15, wherein the processor is configured with a consultation agent that implements self-evolution based on artificial intelligence techniques, the method further comprising:
processing the perception information with the consultation agent to generate a consultation suggestion, the consultation suggestion including at least one of a supplemental inquiry suggestion, a physical examination suggestion, a prescription suggestion, and a treatment plan suggestion.
29. The method of claim 15, wherein the method further comprises:
detecting, based on the perception information, whether a visit process of the patient has ended;
in response to detecting that the visit process has ended, generating a target diagnostic record based on the initial diagnostic record and feedback information entered by the doctor for the initial diagnostic record;
determining, based on the target diagnostic record, a target service to be provided to the patient after the visit process; and
making a reservation for the patient with a target business department providing the target service.
30. The method of claim 15, wherein the patient terminal presents the first virtual character and the second virtual character using augmented reality technology.
31. A method of providing a medical visit service, performed by at least one processor, comprising:
presenting, through at least one terminal, a first interface element to a target user, the first interface element being related to an electronic health record of a patient receiving the medical visit service, the target user including at least the patient and a doctor;
acquiring control instructions initiated by at least one of the patient and the doctor based on perception information acquired by a perception device during a visit, the control instructions being used to retrieve at least a portion of the electronic health record; and
in response to the control instructions, retrieving the at least a portion of the electronic health record and presenting it to the target user through the at least one terminal, wherein the at least one terminal presents the at least a portion of the electronic health record to the target user using augmented reality technology.
32. The method of claim 31, wherein the medical visit service is a telemedicine visit service, the at least one terminal includes a patient terminal of the patient and a doctor terminal of the doctor, and the method further comprises:
presenting a virtual consulting room space through the patient terminal and the doctor terminal using the augmented reality technology.
33. The method of claim 31, wherein the medical visit service is a telemedicine visit service, the at least one terminal includes a patient terminal of the patient and a doctor terminal of the doctor,
the at least a portion of the electronic health record includes a three-dimensional model of an organ of interest of the patient, and
the patient terminal and the doctor terminal synchronously display the three-dimensional model using augmented reality technology.
34. The method of claim 33, wherein the control instructions are further used for:
setting display parameters of the three-dimensional model, the display parameters including at least one of a display angle, a display size, and a display position; and/or
marking key data on the three-dimensional model.
35. The method of claim 33, wherein the target user further includes a remote companion, the at least one terminal further includes a remote companion terminal, and the companion terminal displays the three-dimensional model in synchronization with the patient terminal and the doctor terminal using augmented reality technology.
36. The method of claim 31, wherein:
the perception information includes a voice signal acquired by a sound sensor, and the control instructions are acquired after semantic analysis is performed on the voice signal; or
the perception information includes an optical image of the target user acquired by an image sensor, and the control instructions are acquired after gesture recognition is performed on the target user in the optical image.
37. The method of claim 31, wherein the processor is configured with a diagnostic agent, the processor processes the perception information with the diagnostic agent to acquire the control instructions, and the diagnostic agent implements self-evolution using artificial intelligence techniques.
38. The method of claim 31, wherein the method further comprises:
receiving a remote co-diagnosis request initiated by the patient before the visit of the patient;
transmitting the remote co-diagnosis request to a doctor terminal of the doctor for approval; and
in response to determining that the remote co-diagnosis request is approved and detecting that the patient begins a consultation, presenting, through the at least one terminal, a second interface element related to a remote co-diagnosis service.
39. The method of claim 31, wherein the method further comprises:
in response to detecting that the patient begins a consultation, presenting, through the at least one terminal, a third interface element related to a pre-consultation record of the patient, the pre-consultation record being obtained by:
performing, based on the department of the doctor, a pre-consultation inquiry on the patient through a patient terminal; and
generating the pre-consultation record based on data acquired by the patient terminal in the pre-consultation inquiry.
40. The method of claim 31, wherein the method further comprises:
generating an initial diagnostic record based on the perception information;
presenting the initial diagnostic record to the doctor; and
generating a target diagnostic record based on the initial diagnostic record and feedback information entered by the doctor for the initial diagnostic record.
41. The method of claim 31, wherein the presenting, through the at least one terminal, of the first interface element to the target user is performed based on a trigger condition, the trigger condition including detecting that the patient begins a visit.
42. A method of providing a registration service, performed by at least one processor, comprising:
acquiring a patient complaint of a patient through a patient terminal of the patient or a registration terminal of a hospital;
determining at least one candidate department based on the patient complaint;
displaying, based on the patient complaint and the at least one candidate department, a virtual character through the patient terminal or the registration terminal to perform a registration inquiry; and
determining a registered doctor based on data acquired by the patient terminal or the registration terminal in the registration inquiry.
43. The method of claim 42, wherein the determining at least one candidate department based on the patient complaint comprises:
extracting a first keyword of the patient complaint; and
determining the at least one candidate department based on the first keyword.
44. The method of claim 43, wherein the form of the patient complaint includes at least one of text, speech, picture, and gesture, and the extracting the first keyword of the patient complaint is performed by at least one of:
extracting the first keyword from the patient complaint in text form through a keyword extraction algorithm;
identifying the first keyword in the patient complaint in speech form through a speech recognition technique;
identifying body parts, symptoms, and the like in the patient complaint in picture form through an image recognition technique as the first keyword; and
identifying the first keyword in the patient complaint in gesture form through a gesture recognition technique.
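The four extraction branches of claim 44 above are, in effect, a dispatch on the form of the complaint. A hedged sketch in which the recognition techniques are replaced by trivial stand-ins (a real system would call keyword-extraction, speech-, image-, and gesture-recognition components; all names and the symptom vocabulary here are hypothetical):

```python
def _from_text(payload):
    """Keyword-extraction stand-in: pick known symptom words from text."""
    vocab = {"headache", "fever", "cough"}
    return [w for w in payload.lower().split() if w.strip(".,") in vocab]

def extract_first_keyword(complaint):
    """Route the complaint to a modality-specific extractor by its form."""
    extractors = {
        "text": _from_text,
        "speech": lambda p: _from_text(p["transcript"]),      # after speech-to-text
        "picture": lambda p: p.get("recognized_labels", []),  # after image recognition
        "gesture": lambda p: p.get("recognized_signs", []),   # after gesture recognition
    }
    return extractors[complaint["form"]](complaint["payload"])
```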
45. The method of claim 43, wherein the determining the at least one candidate department based on the first keyword comprises:
inputting the first keyword into a department determination model to determine the at least one candidate department, the department determination model being a trained machine learning model; or
determining the at least one candidate department by searching a preset keyword-department comparison table, the preset keyword-department comparison table being generated in advance based on medical knowledge, doctor experience, and historical registration record information.
46. The method of claim 42, wherein the processor is configured with a registration agent corresponding to the registration service, the method is performed by the registration agent, and the registration agent implements self-evolution using artificial intelligence techniques.
47. The method of claim 42, wherein the registration inquiry includes a plurality of rounds of inquiry, and the displaying, based on the patient complaint and the at least one candidate department, the virtual character through the patient terminal or the registration terminal to perform the registration inquiry comprises:
inputting the patient complaint and the at least one candidate department into a first query model, the first query model outputting content of a first round of inquiry, the first query model being a trained machine learning model; and
performing the first round of inquiry by displaying the virtual character through the patient terminal or the registration terminal based on the content of the first round of inquiry.
48. The method of claim 42, wherein the registration inquiry includes a plurality of rounds of inquiry, and the displaying, based on the patient complaint and the at least one candidate department, the virtual character through the patient terminal or the registration terminal to perform the registration inquiry comprises:
acquiring a keyword corresponding to each candidate department of the at least one candidate department based on a preset keyword-department comparison table;
determining difference words between the keywords corresponding to any two candidate departments;
determining content of a first round of inquiry according to the difference words; and
performing the first round of inquiry by displaying the virtual character through the patient terminal or the registration terminal based on the content of the first round of inquiry.
49. The method of claim 42, wherein the registration inquiry includes a plurality of rounds of inquiry, and the displaying, based on the patient complaint and the at least one candidate department, the virtual character through the patient terminal or the registration terminal to perform the registration inquiry comprises:
for each current round of inquiry except the first round of inquiry,
determining inquiry content of the current round based on the patient complaint, the at least one candidate department, and historical rounds of dialogue by using a second query model, the second query model being a trained machine learning model; and
performing the current round of inquiry by displaying the virtual character through the patient terminal or the registration terminal based on the inquiry content of the current round.
50. The method of claim 42, wherein the determining the registered doctor based on the data acquired by the patient terminal or the registration terminal in the registration inquiry comprises:
determining a confidence level of each candidate department of the at least one candidate department based on the data, the patient complaint, and the at least one candidate department, the confidence level reflecting a degree of match between the corresponding candidate department and the patient's condition or symptoms;
determining at least one recommended department based on the confidence level of each candidate department of the at least one candidate department;
displaying the at least one recommended department to the patient through the patient terminal or the registration terminal; and
determining the registered doctor based on a registration department selected by the patient from the at least one recommended department through the patient terminal or the registration terminal.
51. The method of claim 50, wherein the determining the confidence level of each candidate department of the at least one candidate department based on the data, the patient complaint, and the at least one candidate department comprises:
for each candidate department,
determining keywords of the candidate department based on a preset keyword-department comparison table;
determining the number and frequency of the keywords of the candidate department appearing in the data and the patient complaint; and
determining the confidence level of the candidate department based on the number and the frequency.
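The count-based confidence of claim 51 above can be sketched as a score blending how many of a department's keywords appear and how often they occur. The table, the weights (0.7/0.3), and the normalization are illustrative assumptions only, not the patent's formula:

```python
# Hypothetical keyword-department comparison table.
KEYWORD_TABLE = {
    "cardiology": ["chest pain", "palpitations"],
    "neurology": ["headache", "dizziness"],
}

def confidence(department, text, table=KEYWORD_TABLE):
    """Blend distinct-keyword coverage with total occurrence count."""
    hits = [kw for kw in table[department] if kw in text]
    occurrences = sum(text.count(kw) for kw in hits)
    return 0.7 * len(hits) / len(table[department]) + 0.3 * min(occurrences / 5, 1.0)

def recommended_departments(text, table=KEYWORD_TABLE, top_k=1):
    """Rank candidate departments by confidence and keep the top ones."""
    ranked = sorted(table, key=lambda d: confidence(d, text, table), reverse=True)
    return ranked[:top_k]
```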
52. The method of claim 42, wherein the method further comprises:
after the patient finishes registering, generating a planned path based on a current position of the patient and position information of a consulting room; and
displaying, through the patient terminal, guidance information of the planned path on a real-world view of the patient using augmented reality technology.
53. A method of providing a pre-consultation service, performed by at least one processor, comprising:
displaying, based on a department of a doctor with whom a patient is registered, a virtual character through a patient terminal of the patient to perform a pre-consultation inquiry on the patient;
generating a pre-consultation record based on data acquired by the patient terminal in the pre-consultation inquiry; and
in response to detecting that the patient begins a consultation, presenting, through a consulting room terminal in a consulting room, an interface element related to the pre-consultation record of the patient, wherein the processor is configured with a pre-consultation agent corresponding to a pre-consultation service of the consulting room, the pre-consultation inquiry is performed with the pre-consultation agent, and the pre-consultation agent stores knowledge data corresponding to the consulting room and is capable of self-evolution using artificial intelligence techniques.
54. The method of claim 53, wherein the pre-consultation inquiry includes a plurality of rounds of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining inquiry content corresponding to the current round of inquiry by using a second inquiry content determination model based on inquiry content of historical inquiries, historical answers of the patient, and known information of the patient, the second inquiry content determination model being a trained machine learning model; and
performing the current round of inquiry through the patient terminal based on the inquiry content corresponding to the current round of inquiry.
55. The method of claim 53, wherein the pre-consultation inquiry is performed based on pre-consultation inquiry content, the pre-consultation inquiry content being determined by:
acquiring a pre-consultation record template corresponding to the department of the doctor and known information of the patient;
determining missing information not yet collected in the pre-consultation record template by comparing the pre-consultation record template with the known information of the patient; and
determining the pre-consultation inquiry content based on the missing information.
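The template-comparison step of claim 55 above is essentially a set difference between the fields a department's pre-consultation record template requires and the patient's known information. A sketch with hypothetical field names and template contents:

```python
# Hypothetical per-department pre-consultation record templates.
TEMPLATE_FIELDS = {
    "cardiology": ["chief_complaint", "symptom_duration", "smoking_history",
                   "family_heart_disease"],
}

def missing_fields(department, known_info):
    """Template fields not yet collected for this patient, in template order."""
    return [f for f in TEMPLATE_FIELDS[department] if f not in known_info]

def pre_inquiry_content(department, known_info):
    """Turn each missing field into one inquiry question."""
    return ["Please describe your %s." % f.replace("_", " ")
            for f in missing_fields(department, known_info)]
```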
56. The method of claim 53, wherein the pre-consultation inquiry is performed based on pre-consultation inquiry content, the pre-consultation inquiry content being determined by:
processing the department of the doctor and known information of the patient by using a missing information determination model to determine missing information to be acquired from the patient; and
processing the missing information by using a first inquiry content determination model to determine the pre-consultation inquiry content, the missing information determination model and the first inquiry content determination model being trained machine learning models.
57. The method of claim 55 or 56, wherein the pre-consultation inquiry includes a plurality of rounds of inquiry, the pre-consultation inquiry content includes inquiry content of each round of inquiry, and the performing the pre-consultation inquiry on the patient comprises:
for each current round of inquiry except the first round of inquiry,
determining semantic information and emotion information of historical answers of the patient based on data collected before the current round of inquiry;
adjusting the inquiry content of the current round of inquiry based on the semantic information and the emotion information; and
performing the current round of inquiry through the patient terminal based on the adjusted inquiry content.
58. The method of claim 57, wherein the performing the current round of inquiry through the patient terminal based on the adjusted inquiry content comprises:
determining sound features of the current round of inquiry based on the semantic information and the emotion information; and
controlling the patient terminal of the patient to perform the current round of inquiry on the patient based on the sound features and the adjusted inquiry content.
59. The method of claim 57, wherein the adjusting the inquiry content of the current round of inquiry based on the semantic information and the emotion information comprises:
acquiring physiological state information of the patient corresponding to the current round of inquiry, the physiological state information being collected by a wearable device worn by the patient; and
adjusting the inquiry content of the current round of inquiry based on the semantic information, the emotion information, and the physiological state information.
60. The method of claim 59, wherein the method further comprises:
determining a feedback parameter based on at least a portion of the semantic information, the emotion information, and the physiological state information; and
controlling the wearable device to apply feedback to the patient based on the feedback parameter, the feedback including at least one of force feedback and temperature feedback.
61. The method of claim 53, wherein the patient terminal displays the virtual character via an augmented reality technique.
62. The method of claim 53, wherein appearance features of the virtual character are determined based on an optical image of the doctor and/or basic information of the patient.
63. The method of claim 53, wherein the controlling the patient terminal of the patient to perform the pre-inquiry on the patient comprises:
after the patient completes registration, determining an estimated waiting time for the patient based on the doctor's visit records for the current day and the patient's registration record; and
in response to the estimated waiting time being greater than a first preset time threshold and less than a second preset time threshold, performing the pre-inquiry on the patient through the patient terminal, the second preset time threshold being greater than the first preset time threshold.
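The threshold logic of claim 63 amounts to triggering the pre-inquiry only when the wait is long enough to be useful but not so long that it is pointless. A minimal sketch; the estimation formula and threshold values are illustrative, not taken from the claims:

```python
def should_start_pre_inquiry(patients_ahead, avg_visit_minutes,
                             first_threshold=10, second_threshold=120):
    """Trigger the pre-inquiry only when the estimated wait (in minutes)
    falls strictly between the two preset thresholds.
    Estimation formula and defaults are illustrative stand-ins."""
    estimated_wait = patients_ahead * avg_visit_minutes
    return first_threshold < estimated_wait < second_threshold

print(should_start_pre_inquiry(3, 8))    # 24 min wait -> True
print(should_start_pre_inquiry(1, 5))    # 5 min wait -> False, too short
print(should_start_pre_inquiry(30, 8))   # 240 min wait -> False, too long
```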
64. A method of providing medical services for a doctor, performed by at least one processor, comprising:
acquiring perception information in real time during a visit of a patient, the perception information being collected by a perception device in the consulting room during the visit;
during the visit, processing the perception information with a visit agent to generate a visit suggestion, the visit suggestion including at least one of a supplemental inquiry suggestion, a physical examination suggestion, a prescription suggestion, and a treatment plan suggestion; and
after the visit ends, processing the perception information with the visit agent to generate an initial diagnostic record, the visit agent achieving self-evolution using artificial intelligence techniques.
65. The method of claim 64, wherein generating an initial diagnostic record comprises:
extracting key content from the perception information based on a diagnostic record template;
converting the key content into professional content based on a knowledge dictionary; and
updating the diagnostic record template based on the professional content and a knowledge database to generate the initial diagnostic record.
66. The method of claim 65, further comprising:
acquiring physical examination data of the patient collected by physical examination equipment during the visit; and
extracting the key content further based on the physical examination data.
67. The method of claim 65, wherein the converting the key content into professional content based on a knowledge dictionary comprises:
using the key content as an index, retrieving corresponding professional terminology from the knowledge dictionary as the professional content.
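Claims 65 and 67 describe a lookup that maps colloquial key content extracted from the visit dialogue to professional terminology. A minimal dictionary-indexed sketch; the entries are invented for illustration:

```python
# Hypothetical knowledge dictionary mapping colloquial key content
# to professional terminology (entries invented for illustration).
KNOWLEDGE_DICTIONARY = {
    "stomach ache": "epigastric pain",
    "can't sleep": "insomnia",
    "feeling dizzy": "vertigo",
}

def to_professional_content(key_contents):
    """Use each key content item as an index into the dictionary,
    falling back to the original wording when no entry exists."""
    return [KNOWLEDGE_DICTIONARY.get(k.lower(), k) for k in key_contents]

print(to_professional_content(["Stomach ache", "fever"]))
# -> ['epigastric pain', 'fever']
```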
68. The method of claim 64, further comprising:
presenting the initial diagnostic record to the doctor at a preset time; or
presenting the initial diagnostic record to the doctor after the doctor inputs a record review request.
69. The method of claim 64, wherein the processing the perception information with the visit agent to generate the visit suggestion comprises:
determining dialogue content between the doctor and the patient based on the perception information;
searching a knowledge database and/or diagnosis standards corresponding to the patient's registration department based on the dialogue content and patient data of the patient to determine uncollected information; and
determining the supplemental inquiry suggestion based on the uncollected information.
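The "uncollected information" step of claim 69 can be viewed as a difference between what the registration department's diagnosis standard requires and what the dialogue has already covered. A simplified sketch with hypothetical field names:

```python
# Hypothetical required fields per registration department, standing in
# for the knowledge database / diagnosis standards of claim 69.
DIAGNOSIS_STANDARDS = {
    "cardiology": {"symptom onset", "chest pain character", "medication history"},
}

def uncollected_information(department, collected_fields):
    """Return required fields not yet covered by the dialogue or patient data,
    sorted for a stable suggestion order."""
    required = DIAGNOSIS_STANDARDS.get(department, set())
    return sorted(required - set(collected_fields))

missing = uncollected_information("cardiology", {"symptom onset"})
print(missing)  # -> ['chest pain character', 'medication history']
```

A supplemental inquiry suggestion would then be phrased from each missing field, e.g. prompting the doctor to ask about medication history.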
70. The method of claim 64, wherein the visit agent processes the perception information and patient data of the patient using a diagnosis model to generate the visit suggestion.
71. The method of claim 64, further comprising:
presenting the initial diagnostic record to the doctor through a consulting room terminal of the consulting room or a doctor terminal of the doctor; and
generating a target diagnostic record based on the initial diagnostic record and feedback information of the doctor on the initial diagnostic record, the feedback information being input by the doctor through the consulting room terminal or the doctor terminal.
72. A method of providing medical services for a doctor, performed by at least one processor, comprising:
receiving a remote accompanying request initiated by a patient, the remote accompanying request relating to a remote accompanying person; and
in response to the remote accompanying request and detecting that the patient has started a visit, displaying a real-time view of the remote accompanying person to the patient and the doctor through at least one terminal device using augmented reality technology.
73. The method of claim 72, wherein the processor is configured with a visit agent that achieves self-evolution based on artificial intelligence techniques, and the method is performed by the visit agent.
74. The method of claim 72, further comprising:
determining whether the patient needs to communicate with the remote accompanying person based on perception information collected in real time by a perception device during the visit; and
in response to determining that the patient needs to communicate with the remote accompanying person, controlling the at least one terminal device to enlarge the real-time view of the remote accompanying person.
75. The method of claim 74, wherein the patient is located in a consulting room, the at least one terminal device includes a first augmented reality device and a second augmented reality device, and the controlling the at least one terminal device to enlarge the real-time view of the remote accompanying person in response to determining that the patient needs to communicate with the remote accompanying person further comprises:
in response to determining that the patient needs to communicate with the remote accompanying person, reminding the patient and the doctor to wear the first augmented reality device and the second augmented reality device respectively, and controlling the first augmented reality device and the second augmented reality device to enlarge the real-time view of the remote accompanying person.
76. The method of claim 74, wherein the determining whether the patient needs to communicate with the remote accompanying person based on the perception information collected in real time by the perception device during the visit comprises:
determining state information of the patient based on the perception information; and
determining, based on the state information, whether the patient needs to communicate with the remote accompanying person.
77. The method of claim 74, wherein the perception information includes voice data and/or image data, and the determining whether the patient needs to communicate with the remote accompanying person based on the perception information collected in real time by the perception device during the visit comprises:
detecting, based on the voice data and/or the image data, whether the patient has issued a request to communicate with the remote accompanying person, thereby determining whether the patient needs to communicate with the remote accompanying person.
78. The method of claim 72, wherein the patient receives a remote medical visit service, the at least one terminal device includes a first augmented reality device of the patient and a second augmented reality device of the doctor, and the method further comprises:
generating a virtual consulting room space based on real-time information of an environment in which the patient or the doctor is located; and
displaying the virtual consulting room space through the first augmented reality device, the second augmented reality device, and a third augmented reality device of the remote accompanying person.
79. The method of claim 78, further comprising:
presenting, through the first augmented reality device, a view of the virtual consulting room space from a first perspective to the patient, the first perspective being the perspective of a virtual patient corresponding to the patient in the virtual consulting room space;
presenting, through the second augmented reality device, a view of the virtual consulting room space from a second perspective to the doctor, the second perspective being the perspective of a virtual doctor corresponding to the doctor in the virtual consulting room space; and
presenting, through the third augmented reality device, a view of the virtual consulting room space from a third perspective to the remote accompanying person, the third perspective being the perspective of a virtual remote accompanying person corresponding to the remote accompanying person in the virtual consulting room space.
80. A method of providing medical services for a doctor, performed by at least one processor, comprising:
generating a health monitoring plan related to at least one of medication monitoring, health habit monitoring, and physiological data monitoring based on a target diagnostic record of a patient;
generating health monitoring instructions based on the health monitoring plan, the health monitoring instructions including at least one of a medication instruction, a health habit instruction, and a physiological data monitoring instruction;
sending the health monitoring instructions to a monitoring device to acquire health monitoring information of the patient from the monitoring device; and
updating the health monitoring plan based on the health monitoring information, wherein the processor is configured with a post-diagnosis agent, the method is performed by the post-diagnosis agent, and the post-diagnosis agent achieves self-evolution based on artificial intelligence techniques.
81. The method of claim 80, wherein the health monitoring plan includes at least one of a monitoring parameter, a monitoring frequency, a monitoring time, and a related threshold.
82. The method of claim 80, wherein the sending the health monitoring instructions to the monitoring device to acquire the health monitoring information of the patient from the monitoring device comprises at least one of:
using the medication instruction to instruct a smart pill box to remind the patient to take medication at a medication time and to collect medication information of the patient;
using the health habit instruction to instruct a smart watch to remind the patient to exercise at an exercise time and to collect motion information of the patient during exercise; and
using the physiological data monitoring instruction to instruct a physical examination device to remind the patient to collect specific physiological parameters at a physical examination time and to record physiological parameter information of the patient.
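The dispatch described in claim 82 can be modelled as routing each instruction type to its target device. A minimal sketch; the device names, instruction types, and payload format are invented for illustration:

```python
# Hypothetical routing of health monitoring instructions to devices
# (device names and message formats are illustrative, not from the claims).
DEVICE_FOR_INSTRUCTION = {
    "medication": "smart_pill_box",
    "health_habit": "smart_watch",
    "physiological_data": "physical_examination_device",
}

def dispatch(instructions):
    """Map each recognized instruction to a (device, payload) pair for sending;
    unknown instruction types are skipped."""
    return [
        (DEVICE_FOR_INSTRUCTION[i["type"]], i["payload"])
        for i in instructions
        if i["type"] in DEVICE_FOR_INSTRUCTION
    ]

plan = [
    {"type": "medication", "payload": {"time": "08:00", "drug": "aspirin"}},
    {"type": "health_habit", "payload": {"time": "18:00", "goal": "30 min walk"}},
]
print(dispatch(plan))
```

Each device would then both deliver the reminder and report the collected information back, closing the monitoring loop described in claim 80.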
83. The method of claim 80, wherein the updating the health monitoring plan based on the health monitoring information comprises:
assessing a health condition of the patient based on the health monitoring information; and
updating the health monitoring plan based on an assessment result of the health condition of the patient.
84. The method of claim 83, wherein the updating the health monitoring plan based on the assessment result of the health condition of the patient comprises:
when a change in the health condition of the patient is detected, sending the health monitoring information to a doctor so that the doctor determines whether the health monitoring plan of the patient needs to be updated.
85. The method of claim 83, further comprising:
sending a medical treatment reminder to the patient when an abnormality in the health condition of the patient is detected.
86. The method of claim 80, wherein the processor is further configured with a visit agent that achieves self-evolution based on artificial intelligence techniques, and the target diagnostic record is generated by the visit agent by:
generating an initial diagnostic record based on perception information, the perception information being collected by a perception device during the patient's visit;
presenting the initial diagnostic record to a doctor; and
generating the target diagnostic record based on the initial diagnostic record and feedback information input by the doctor for the initial diagnostic record.
87. The method of claim 86, wherein the generating the initial diagnostic record based on the perception information comprises:
extracting key content from the perception information based on a diagnostic record template;
converting the key content into professional content based on a knowledge dictionary; and
updating the diagnostic record template based on the professional content and a knowledge database to generate the initial diagnostic record.
CN202411742797.2A 2024-07-31 2024-11-29 A method, system and storage medium for providing medical consultation services Pending CN119541906A (en)
