
CN115803785B - Display device, control system and drawing method - Google Patents

Display device, control system and drawing method

Info

Publication number: CN115803785B
Application number: CN202080102240.6A
Authority: CN (China)
Prior art keywords: display device, abnormality, unit, image, rotation
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN115803785A
Inventors: 山本晶仁, 森山直希
Original and current assignee: Mitsubishi Electric Corp
Application filed by Mitsubishi Electric Corp
Publication of CN115803785A (application) and CN115803785B (grant)

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 23/00: Testing or monitoring of control systems or parts thereof
    • G05B 23/02: Electric testing or monitoring
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • User Interface Of Digital Computer (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

A display device (1) is provided with: a communication unit (10) capable of communicating with a control device that controls a controlled device; an image generation unit (12) that generates a three-dimensional image for rotationally displaying the controlled device on a screen; a superimposed object generation unit (14) that generates an object indicating the presence or absence of an abnormality at a portion of the controlled device, the object being superimposed at the position of that portion in the three-dimensional image; a rotation information acquisition unit (11) that acquires rotation information indicating a rotation pattern of the controlled device displayed on the screen; and an image display unit (16) that displays the three-dimensional image on which the object is superimposed. The image generation unit (12) causes the three-dimensional image to be displayed while rotating, based on the rotation information.

Description

Display device, control system and drawing method
Technical Field
The present invention relates to a display device that can be connected to a control device controlling a controlled device, and to a control system and a drawing method.
Background
When an abnormality occurs in a device used for work at a site such as a construction site or a production line, an operator performing repair work may identify the site of the abnormality based on information displayed on a display device. The display device is connected to a control device that controls the device. The display device and the control device constitute a control system for controlling the device, that is, the controlled device.
In order to enable efficient repair work when an abnormality occurs, techniques have been proposed that display, in addition to text information about the site of the abnormality, visual information indicating that site. Patent Document 1 discloses a display unit included in a sequence controller system that shows the portion of the sequence controller system where an abnormality has occurred on a three-dimensional figure representing the system. Further, according to the technique of Patent Document 1, the displayed three-dimensional figure rotates in response to operations on the terminal. The operator can search for the portion where the abnormality has occurred by rotating the displayed three-dimensional figure.
Patent Document 1: Japanese Patent Application Laid-Open No. 2010-160540
Disclosure of Invention
According to the conventional technique disclosed in Patent Document 1, the operator must rotate the three-dimensional figure manually in order to find the site where an abnormality has occurred. An operator with little experience in repair work takes more time to find that site, because the operator must search for it while changing the direction in which the three-dimensional figure is rotated. Thus, the related art has the problem that identifying the portion where an abnormality has occurred takes time.
The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a display device capable of notifying occurrence of an abnormality in a controlled device so that an operator can easily identify a portion of the controlled device where the abnormality has occurred.
In order to solve the above problems and achieve the object, a display device according to the present invention includes: a communication unit capable of communicating with a control device that controls a controlled device; an image generation unit that generates a three-dimensional image for rotationally displaying the controlled device on a screen; a superimposed object generating unit that generates an object indicating whether or not an abnormality exists at a portion of the controlled device, the object being superimposed at the position of that portion in the three-dimensional image; a rotation information acquisition unit that acquires, from the communication unit, the rotation information corresponding to the portion where an abnormality has occurred, from among pieces of rotation information that indicate rotation patterns of the controlled device displayed on the screen and that are preset for each of a plurality of portions monitored for abnormalities; and an image display unit that rotationally displays the three-dimensional image on which the object is superimposed. The image generation unit acquires the rotation information corresponding to the portion where the abnormality has occurred from the rotation information acquisition unit, and generates a three-dimensional image that rotates based on that rotation information.
Advantageous Effects of Invention
The display device according to the present invention has an effect that the occurrence of an abnormality in the controlled device can be notified so that an operator can easily identify the site of the controlled device where the abnormality has occurred.
Drawings
Fig. 1 is a diagram showing a functional configuration of a display device according to embodiment 1.
Fig. 2 is a diagram showing a hardware configuration of the display device according to embodiment 1.
Fig. 3 is a diagram showing a configuration of a control system including the display device according to embodiment 1.
Fig. 4 is a diagram showing a hardware configuration of a control device connected to the display device according to embodiment 1.
Fig. 5 is a diagram showing a case where a three-dimensional image is displayed on the display device according to embodiment 1.
Fig. 6 is a diagram for explaining creation of a job assistance screen displayed on the display device according to embodiment 1.
Fig. 7 is a diagram showing a hardware configuration of a drawing device for creating a screen of the display device according to embodiment 1.
Fig. 8 is a flowchart showing a flow of preparation steps until the display of the job support screen by the display device according to embodiment 1 is started.
Fig. 9 is a flowchart showing a flow of a drawing method for creating a job assistance screen displayed on the display device according to embodiment 1.
Fig. 10 is a flowchart showing a flow of processing when the display device according to embodiment 1 performs rotation display of the controlled device.
Fig. 11 is a flowchart showing a flow of processing when the three-dimensional image is generated by the image generating unit of the display device according to embodiment 1.
Fig. 12 is a flowchart showing a flow of processing when a two-dimensional object (object) is generated by the display device according to embodiment 1.
Fig. 13 is a diagram for explaining a process of superimposing a two-dimensional object representing a normal situation on a three-dimensional image by the display device according to embodiment 1.
Fig. 14 is a diagram for explaining a process of superimposing a two-dimensional object representing an abnormal situation on a three-dimensional image by the display device according to embodiment 1.
Fig. 15 is a diagram showing a functional configuration of a display device according to embodiment 2.
Fig. 16 is a diagram for explaining an operation of manually rotating an image of a controlled device in the display device according to embodiment 2.
Fig. 17 is a flowchart showing a flow of processing in which the display device according to embodiment 2 rotates the image of the controlled device in accordance with an operation.
Fig. 18 is a flowchart showing a flow of processing for changing rotation information in the display device according to embodiment 2.
Fig. 19 is a diagram showing a functional configuration of a display device according to embodiment 3.
Fig. 20 is a diagram showing an example in which a two-dimensional object displayed in an opaque manner is superimposed on a three-dimensional image by the display device according to embodiment 3.
Fig. 21 is a diagram showing an example in which a two-dimensional object displayed in a transparent manner is superimposed on a three-dimensional image by the display device according to embodiment 3.
Fig. 22 is a flowchart showing a flow of processing when a two-dimensional object is generated by the display device according to embodiment 3.
Fig. 23 is a diagram showing a functional configuration of a display device according to embodiment 4.
Fig. 24 is a diagram showing a display of an alarm list in the display device according to embodiment 4 and a case where a three-dimensional image is displayed in the display device.
Fig. 25 is a flowchart showing a flow of processing when the display device according to embodiment 4 performs the rotation display of the controlled device in the window of the alarm screen.
Detailed Description
The display device, the control system, and the drawing method according to the embodiment will be described in detail below with reference to the drawings.
Embodiment 1
Fig. 1 is a diagram showing a functional configuration of a display device 1 according to embodiment 1. Fig. 2 is a diagram showing a hardware configuration of the display device 1 according to embodiment 1. Fig. 3 is a diagram showing a configuration of a control system 100 including the display device 1 according to embodiment 1.
The control system 100 includes the display device 1, the control device 2 as a connection device that controls the controlled device, and the robot 3 as the controlled device. The control system 100 is a system for controlling the robot 3. The display device 1 is an HMI (Human Machine Interface) such as a programmable display. The control device 2 is a controller such as a programmable logic controller (PLC). The display device 1 is connected to the control device 2. The robot 3 is connected to the control device 2.
The display device 1 communicates with the control device 2. The display device 1 has a function of displaying information related to the operation state of the control device 2 and a function of receiving operations. The control device 2 controls the robot 3 with a control program such as a ladder program. The screens displayed on the display device 1 are created by a drawing device, which will be described later.
Next, a functional configuration of the display device 1 will be described. The display device 1 includes: a communication unit 10 capable of communicating with the control device 2; a rotation information acquisition unit 11 that acquires rotation information indicating a rotation pattern of the controlled device displayed on the screen; an image generation unit 12 that generates a three-dimensional image for rotationally displaying the controlled device on the screen; and an abnormality information acquisition unit 13 that acquires abnormality information indicating the presence or absence of abnormality in a portion of the controlled device. The display device 1 further includes: a superimposed object generating unit 14 that generates a two-dimensional object that is an object indicating whether or not an abnormality exists in a portion of the controlled device; a data storage unit 15 for storing various data; and an image display unit 16 that displays a three-dimensional image on which the two-dimensional object is superimposed.
Next, a hardware configuration of the display device 1 will be described. The display device 1 has a processor 20 that executes various processes, a memory 21 that is a built-in memory, a storage device 22 that holds various information, a communication interface 23 that performs communication with the control device 2, a display 24 that displays information, and a touch panel 25 for inputting information to the display device 1.
The processor 20 is a CPU (Central Processing Unit). The processor 20 may instead be a processing device, an arithmetic device, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor). The memory 21 is a volatile memory such as a RAM (Random Access Memory). The storage device 22 is a nonvolatile memory such as a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (registered trademark) (Electrically Erasable Programmable Read Only Memory).
The communication interface 23 is an interface circuit for connecting the display device 1 to external devices. The control device 2 and the drawing device are connected to the communication interface 23 via a network using Ethernet (registered trademark) or USB (Universal Serial Bus). The display 24 is, for example, a liquid crystal display (LCD). The touch panel 25 detects contact of a contact object such as a finger and outputs touch information. The touch information includes information indicating the identification number, coordinates, contact state, and the like of each finger.
The functions of the rotation information acquisition unit 11, the image generation unit 12, the abnormality information acquisition unit 13, and the superimposed object generation unit 14 are realized by using a combination of the processor 20 and software. These functions may instead be realized by using a combination of the processor 20 and firmware, or a combination of the processor 20, software, and firmware. The software or firmware is written as a program and stored in the storage device 22. The function of the communication unit 10 is realized by using the communication interface 23. The function of the image display unit 16 is realized by using the display 24.
The program executed in the display device 1 may be stored in a storage medium readable by a computer system. The display device 1 may store the program recorded on the storage medium in the memory 21. The storage medium may be a removable storage medium such as a flexible disk, or a semiconductor memory such as a flash memory.
Next, the hardware configuration of the control device 2 will be described. Fig. 4 is a diagram showing a hardware configuration of the control device 2 connected to the display device 1 according to embodiment 1. The control device 2 includes a processor 26 for executing various processes, a memory 27 as a built-in memory, a storage device 28 for storing various information, and a communication interface 29 for performing communication with the display device 1 and the robot 3.
The storage device 28 is an HDD (Hard Disk Drive) or an SSD (Solid State Drive). The control program is stored in the storage device 28. The processor 26 reads out the control program stored in the storage device 28 to the memory 27 and executes it. The processor 26 performs processing such as generating instructions to be output to the robot 3. Here, descriptions of the contents common to the processor 20, the memory 21, the storage device 22, and the communication interface 23 are omitted.
Next, the display of the three-dimensional image in the display device 1 will be described. Fig. 5 is a diagram showing a case where a three-dimensional image is displayed on the display device 1 according to embodiment 1. The job assistance screen 30 is one of the screens displayed on the display 24. The display device 1 displays a work support screen 30 for notifying occurrence of an abnormality in the controlled device. The work support screen 30 is also a screen displayed for supporting the repair work of the worker. The repair operation is an operation for recovering the operation of the controlled device, which is interrupted by the occurrence of an abnormality.
An abnormality is a state in which normal operation is prevented by a problem such as deterioration or failure of a component. A plurality of portions to be monitored for the presence or absence of an abnormality are set in advance for the robot 3. For each of these portions, a data area for storing a value indicating the presence or absence of an abnormality is secured in the storage device 28 of the control device 2. Variables used in programming the ladder program are associated with each data area. The variable or data area is sometimes referred to as a "device", and the value stored in the data area is sometimes referred to as a "device value".
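A minimal Python sketch may make this device/device-value convention concrete; the part names, device addresses, and dictionary layout below are illustrative assumptions, not the patent's data format.
```python
# Sketch (assumed names): each monitored portion of the robot is associated with a
# data area ("device") secured in the control device's storage; the value stored
# there (the "device value") encodes normal (0) or abnormal (1).
MONITORED_PORTIONS = {
    "arm_joint_1": "M1000",  # device associated with the first monitored portion
    "arm_joint_2": "M1001",
    "gripper": "M1002",
}

device_values = {dev: 0 for dev in MONITORED_PORTIONS.values()}  # all portions normal

def report_abnormality(portion: str) -> None:
    """Rewrite the device value of the portion from the normal value to the abnormal value."""
    device_values[MONITORED_PORTIONS[portion]] = 1

def is_abnormal(portion: str) -> bool:
    """Interpret the device value of the portion as abnormality information."""
    return device_values[MONITORED_PORTIONS[portion]] == 1

report_abnormality("gripper")
print(is_abnormal("gripper"))  # -> True
```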
When an abnormality occurs in the robot 3, the control device 2 detects the occurrence of the abnormality based on a signal received from the robot 3. The control device 2 rewrites the device value corresponding to the portion where the abnormality has occurred from the value indicating normal to the value indicating abnormality. The control device 2 notifies the display device 1 that an abnormal situation has occurred. When notifying the display device 1, the control device 2 transmits rotation information and abnormality information indicating the presence or absence of an abnormality to the display device 1.
The rotation information is preset for each of the plurality of portions. The storage device 28 of the control device 2 stores the rotation information set in advance. The control device 2 reads out the rotation information corresponding to the portion where the abnormality has occurred from the storage device 28, and transmits the rotation information to the display device 1. The communication unit 10 of the display device 1 receives the rotation information and the abnormality information. The communication unit 10 outputs the received rotation information to the rotation information acquisition unit 11. The communication unit 10 outputs the received abnormality information to the abnormality information acquisition unit 13. In this way, the rotation information acquisition unit 11 acquires the rotation information transmitted from the control device 2. The abnormality information acquisition unit 13 acquires the abnormality information transmitted from the control device 2. The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14. The abnormality information acquisition unit 13 outputs the acquired abnormality information to the superimposed object generation unit 14.
The data storage unit 15 stores Computer-Aided Design (CAD) data as design data of the robot 3. The image generating unit 12 reads the CAD data, which is three-dimensional data, from the data storage unit 15. The image generating unit 12 generates a three-dimensional image for rotationally displaying the robot 3 on the work support screen 30 based on the CAD data and the rotation information. The rotation information includes information such as the direction of rotation, the angle of rotation, and the period of rotation. The image generating unit 12 rotates the displayed three-dimensional image based on the rotation information. The image generation unit 12 outputs the data of the generated three-dimensional image to the image display unit 16.
The data storage unit 15 stores two-dimensional data, which is data of a two-dimensional object. The superimposed object generating unit 14 reads out two-dimensional data from the data storage unit 15. The superimposed object generating unit 14 generates a two-dimensional object indicating whether or not there is an abnormality at each portion of the robot 3 based on the two-dimensional data and the abnormality information. The object 33 indicating the presence of an abnormality is generated as a two-dimensional object superimposed at a portion where an abnormality exists. The superimposed object generating unit 14 outputs the generated data of the object 33 to the image display unit 16.
The image display unit 16 displays a three-dimensional image on which the object 33 is superimposed on the work support screen 30. An image 31 of the robot 3 rotated in a manner indicated by the rotation information is displayed on the work support screen 30. The object 33 is superimposed at the position of the portion of the image 31 where the abnormality is generated. The object 33 is a highlighted two-dimensional object for indicating the existence of an abnormal situation. Fig. 5 shows a case where the image 31 of the robot 3 in the installed posture is rotated about the rotation axis in the vertical direction. When the state of the image 31 shown in the left diagram of fig. 5 is set to the reference state, the image 31 shown in the right diagram of fig. 5 is rotated 180 degrees from the reference state. The work support screen 30 displays a case where the image 31 is rotated between the reference state and the state rotated 180 degrees from the reference state.
The rotation information is set so that rotating the image 31 from the reference state brings the portion where the abnormality has occurred into a position that is easy to recognize visually. For each portion of the robot 3 to be monitored for an abnormality, rotation information is set so that the portion can be viewed appropriately by rotating the image 31. In this example the rotation information is described in the control program, although it may be held in another form.
As shown in fig. 5, an object 32 indicating a normal condition may be superimposed on the image 31 at a position of a portion where no abnormality occurs, that is, at a position of a portion in a normal state. The control device 2 transmits abnormality information indicating that there is no abnormality to the display device 1 for the normal portion. The superimposed object generating section 14 generates an object 32 indicating a normal condition as a two-dimensional object superimposed on a normal portion. The object 32 is superimposed at the position of the normal portion in the image 31. The object 32 is a two-dimensional object for highlighting that represents a normal condition.
The object 32 and the object 33 can be visually distinguished from each other. The superimposed object generating section 14 generates the objects 32 and 33 in different colors, for example blue for the object 32 and red for the object 33. In this way, the superimposed object generating unit 14 generates two-dimensional objects from which abnormal and normal states can be clearly recognized at a glance. The objects 32 and 33 need only be visually distinguishable; they may differ from each other by a factor other than color.
The display device 1 can display not only an abnormality of a portion displayed in the image 31 in the reference state but also an abnormality of a portion not displayed in the image 31 in the reference state on the work support screen 30 by rotating the image 31. The display device 1 can automatically rotate the image 31 so as to be suitable for displaying the portion where the abnormality has occurred. Therefore, the operator can easily check the portion where the abnormality has occurred without performing an operation for searching the portion where the abnormality has occurred. Even an operator with little repair work experience can identify the abnormal part in a short time.
The display device 1 acquires, from the control device 2, the rotation information preset for each portion to be monitored for the presence or absence of an abnormality. The display device 1 therefore does not need to hold, in advance, rotation information for rotating the image 31 into a position suitable for visually recognizing each portion. The display device 1 can display a three-dimensional image in which an abnormality at each portion is easy to recognize visually, without increasing its storage capacity.
Next, creation of the job support screen 30 will be described. Fig. 6 is a diagram for explaining creation of the job support screen 30 displayed on the display device 1 according to embodiment 1.
The drawing device 4 is a computer on which a drawing program is installed. An edit screen for receiving operations for creating screens is displayed on the display 44 of the drawing device 4. The creator who creates the screens of the display device 1 performs editing work of arranging objects as display elements on the edit screen. The drawing device 4 creates a monitor screen for displaying information on the operation state of the control device 2, an operation screen for receiving operations, and the job assistance screen 30 in accordance with the editing work performed by the creator.
A still image of the robot 3 is displayed on the edit screen for creating the job assistance screen 30. A window 34 for setting the two-dimensional objects to be superimposed on the three-dimensional image is displayed beside the still image of the robot 3. The creator places a two-dimensional object displayed in the window 34 at the position of a portion set as a monitoring target in the still image of the robot 3. Through this operation, the drawing device 4 associates the two-dimensional data with coordinates representing the position of that portion in the three-dimensional data of the robot 3. The creator can place the two-dimensional objects while rotating the still image of the robot 3 as appropriate.
Then, the drawing device 4 associates a device, which is a data area for storing a value indicating whether or not an abnormality exists at the portion, with the two-dimensional data that has been associated with the coordinates of the three-dimensional data. For example, the control program describes a process in which, when a certain part of the robot 3 is normal, the device "M1000" is set to OFF, that is, the device value of "M1000" is set to "0", and when the part is abnormal, "M1000" is set to ON, that is, the device value of "M1000" is set to "1". The drawing device 4 associates "M1000" with the two-dimensional data of that portion. With this association in place, the display device 1 selects one of the two objects 32 and 33 in accordance with the device value of "M1000". In this way, the display device 1 can use the device value of the associated device as the abnormality information.
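As a hedged sketch of how such an association might be evaluated at display time: the lookup structure and coordinates below are illustrative; only the device name "M1000" and the rule "0 selects object 32, 1 selects object 33" come from the description above.
```python
# Sketch: select which two-dimensional object to superimpose from the device value.
OBJECT_NORMAL = 32    # object 32: indicates a normal condition (e.g. blue)
OBJECT_ABNORMAL = 33  # object 33: indicates an abnormality (e.g. red)

# Association created by the drawing device: coordinates of the portion in the
# three-dimensional data -> associated device (coordinates here are made up).
associations = {(120.0, 45.0, 300.0): "M1000"}

def select_object(device_value: int) -> int:
    """Device value used as abnormality information: 1 (ON) -> object 33, 0 (OFF) -> object 32."""
    return OBJECT_ABNORMAL if device_value == 1 else OBJECT_NORMAL

print(select_object(0))  # -> 32, the portion is normal
print(select_object(1))  # -> 33, the portion is abnormal
```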
The display device 1 downloads screen data 35, which is the data of the screens created by the drawing device 4, from the drawing device 4. The screen data 35 includes the data of the job assistance screen 30, the data of the monitoring screen, and the data of the operation screen. The display device 1 stores the downloaded screen data 35 in the data storage unit 15. The image display unit 16 reads the data of each screen from the data storage unit 15 as appropriate and displays it.
Next, the hardware configuration of the drawing device 4 will be described. Fig. 7 is a diagram showing a hardware configuration of the drawing device 4 for creating a screen of the display device 1 according to embodiment 1. The drawing device 4 has a processor 40 that executes various processes, a memory 41 that is a built-in memory, a storage device 42 that holds various information, a communication interface 43 that performs communication with the display device 1, a display 44 that displays information, and an input device 45 that inputs information to the drawing device 4.
The drawing program is stored in the storage device 42. The processor 40 reads out the drawing program stored in the storage device 42 to the memory 41 and executes it. The processor 40 performs processing for creating screens in accordance with the input. Here, descriptions of the contents common to the processor 26, the memory 27, the storage device 28, the communication interface 29, and the display 24 are omitted. The input device 45 is a device such as a keyboard, a mouse, or a touch panel.
Next, a preparation process until the display of the work support screen 30 by the display device 1 is started will be described. Fig. 8 is a flowchart showing a flow of preparation steps until the display of the job support screen 30 by the display device 1 according to embodiment 1 is started.
In step S1, the drawing device 4 creates a job assistance screen 30. In step S2, the display apparatus 1 downloads the created screen data 35 from the drawing apparatus 4. In step S3, the display device 1 is connected to the control device 2. The display device 1 is connected to the control device 2 via a communication cable. In step S4, the control device 2 and the display device 1 are started. Thus, the preparation process according to the flow shown in fig. 8 ends.
Fig. 9 is a flowchart showing a flow of a drawing method for creating the job support screen 30 displayed on the display device 1 according to embodiment 1. In step S5, the drawing device 4 receives an operation of associating the two-dimensional data of the object indicating whether or not there is an abnormality at the portion of the robot 3 with the three-dimensional data indicating the three-dimensional shape of the robot 3. The drawing device 4 correlates the two-dimensional data with coordinates indicating the position of the portion to be monitored for the presence or absence of an abnormality.
In step S6, the drawing device 4 associates the two-dimensional data with the data area for storing a value indicating whether or not there is an abnormality at the portion. That is, for each portion to be monitored for the presence or absence of an abnormality, the drawing device 4 further associates the data area with the two-dimensional data that has already been associated with the coordinates indicating the position of that portion.
In step S7, the drawing device 4 determines whether or not the association of the two-dimensional data and the data area at all the parts of the monitoring target has been completed. When there are portions where the association between the two-dimensional data and the data area has not been completed (No in step S7), the drawing device 4 repeats the flow of steps S5 to S7 for each portion. When the association of the two-dimensional data and the data area has been completed for all the parts (Yes in step S7), the drawing device 4 ends the creation of the job support screen 30 according to the flow shown in fig. 9.
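The drawing-method loop of steps S5 to S7 might look roughly as follows in code; the tuple layout and function name are assumptions used only for illustration.
```python
# Sketch of the association loop in fig. 9 (steps S5-S7).
def create_job_assistance_screen(monitored_portions):
    """monitored_portions: iterable of (two_dimensional_data, coordinates, device_name)."""
    screen_data = []
    for two_d_data, coordinates, device in monitored_portions:
        entry = {"2d_data": two_d_data, "coordinates": coordinates}  # step S5
        entry["device"] = device                                     # step S6
        screen_data.append(entry)
    # Step S7: the loop ends only after all monitored portions have been associated.
    return screen_data

screen = create_job_assistance_screen([("lamp.png", (120.0, 45.0, 300.0), "M1000")])
print(screen[0]["device"])  # -> M1000
```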
Next, the processing performed by the display device 1 when the rotation display of the controlled device is performed on the work support screen 30 will be described. Fig. 10 is a flowchart showing a flow of processing when the display device 1 according to embodiment 1 performs rotation display of the controlled device.
In step S11, the display device 1 starts communication with the control device 2. In step S12, the display device 1 determines whether or not there is a notification of the occurrence of an abnormality. If there is no such notification (No in step S12), the display device 1 repeats step S12 until a notification of the occurrence of an abnormality is received. When there is a notification of the occurrence of an abnormality (Yes in step S12), the display device 1 acquires rotation information through the rotation information acquiring unit 11 in step S13. The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
Here, the rotation information will be described. In the example shown in fig. 5, the rotation axis is an axis perpendicular to the reference plane. The reference plane is a plane parallel to the installation surface of the robot 3. In the virtual three-dimensional space in which the image 31 of the robot 3 is rotated, the direction of the rotation axis can be set to an arbitrary direction. In the example shown in fig. 5, the direction of the rotation axis is the vertical direction in the three-dimensional space. The rotation information includes information indicating the direction of the rotation axis as the direction of rotation.
The rotation information also includes information indicating the angle of rotation and information indicating the period of rotation. The angle of rotation can be set to any angle in the range of 0 to 360 degrees with respect to the reference state. For example, when the angle is 180 degrees, the image 31 of the robot 3 is rotated within a range of 180 degrees from the reference state. The period of rotation can be set to any number of seconds. For example, when the period is 1 second, the image 31 of the robot 3 rotates with a period of 1 second.
Because the rotation information includes the direction of rotation, the angle of rotation, and the period of rotation, the display device 1 can rotate the display so that, for each monitored portion, the position of the portion where the abnormality has occurred is easy to observe. The rotation information may also include information other than the direction, angle, and period of rotation.
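A small sketch of how rotation information of this kind could be represented and turned into a display angle; the field names and the back-and-forth sweep model are assumptions chosen to match the 0-to-180-degree example above.
```python
# Sketch: rotation information (direction of axis, angle, period) and the display
# angle derived from it for a sweep between the reference state and the set angle.
import math
from dataclasses import dataclass

@dataclass
class RotationInfo:
    axis: tuple        # direction of the rotation axis, e.g. (0.0, 0.0, 1.0) for vertical
    angle_deg: float   # rotation range from the reference state, 0 to 360 degrees
    period_s: float    # period of the rotation in seconds

def display_angle(info: RotationInfo, t: float) -> float:
    """Angle at time t when the image sweeps 0 -> angle_deg -> 0 once per period."""
    phase = (t % info.period_s) / info.period_s
    sweep = 0.5 * (1.0 - math.cos(2.0 * math.pi * phase))  # 0 -> 1 -> 0 smoothly
    return info.angle_deg * sweep

info = RotationInfo(axis=(0.0, 0.0, 1.0), angle_deg=180.0, period_s=1.0)
print(round(display_angle(info, 0.5), 1))  # -> 180.0, halfway through the period
```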
In step S14, the display device 1 generates a three-dimensional image with the image generating unit 12. In step S15, the display device 1 generates a two-dimensional object with the superimposed object generating section 14. The display device 1 superimposes the generated two-dimensional object on the generated three-dimensional image through the image display unit 16. The image display unit 16 displays the image newly generated by superimposing the two-dimensional object on the three-dimensional image on the work support screen 30. Thus, in step S16, the display device 1 updates the work support screen 30 via the image display unit 16. The data of the three-dimensional image, the data of the superimposed two-dimensional object, and the rotation information are stored in the data storage unit 15. The display device 1 thereby displays the rotation of the robot 3 on the work support screen 30 and ends the processing relating to the flow of fig. 10.
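Put together, the flow of fig. 10 can be sketched as below; the object names and method calls are placeholders standing in for the display device's units, not an actual API.
```python
# Sketch of steps S11-S16 (assumed interfaces on control_device and display).
import time

def rotation_display_flow(control_device, display):
    control_device.connect()                                   # S11: start communication
    while not control_device.has_abnormality_notification():   # S12: wait for a notification
        time.sleep(0.1)
    rotation_info = control_device.read_rotation_info()        # S13: rotation information acquisition unit
    three_d_image = display.generate_3d_image(rotation_info)   # S14: image generation unit
    two_d_object = display.generate_2d_object()                # S15: superimposed object generation unit
    display.update_job_assistance_screen(three_d_image,        # S16: image display unit updates screen 30
                                          two_d_object,
                                          rotation_info)
```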
Next, generation of a three-dimensional image in step S14 shown in fig. 10 will be described. Fig. 11 is a flowchart showing a flow of processing when the image generating unit 12 of the display device 1 according to embodiment 1 generates a three-dimensional image.
In step S21, the image generation unit 12 determines whether or not there is a change in the rotation information acquired by the rotation information acquisition unit 11. The data storage unit 15 stores rotation information used for the previous display of the operation support screen 30. The image generation unit 12 compares the rotation information acquired this time with the previous rotation information to determine whether there is a change.
When the rotation information has not changed (No in step S21), the image generation unit 12 ends the processing relating to the flow shown in fig. 11. In this case, the image generation unit 12 does not generate a three-dimensional image based on the rotation information acquired this time, but reads out the data of the three-dimensional image stored in the data storage unit 15. That is, the display device 1 reuses the three-dimensional image used for the previous display without updating it. On the other hand, when there is a change in the rotation information (Yes in step S21), the image generation unit 12 advances the flow to step S22.
In step S22, the image generating unit 12 reads three-dimensional data, which is CAD data of the robot 3, from the data storage unit 15 to acquire three-dimensional data. In step S23, the image generation unit 12 performs drawing processing of three-dimensional data based on the rotation information. Thereby, the image generating unit 12 generates a three-dimensional image.
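The change check of fig. 11 amounts to a small cache keyed on the rotation information; a sketch under assumed names follows.
```python
# Sketch of steps S21-S23: regenerate the three-dimensional image only when the
# rotation information differs from the one used for the previous display.
_cache = {"rotation_info": None, "three_d_image": None}

def get_three_d_image(rotation_info, cad_data, render):
    """render(cad_data, rotation_info) stands in for the drawing processing of step S23."""
    if rotation_info == _cache["rotation_info"]:      # S21: no change in the rotation information
        return _cache["three_d_image"]                # reuse the previously generated image
    image = render(cad_data, rotation_info)           # S22-S23: draw from the three-dimensional data
    _cache["rotation_info"] = rotation_info
    _cache["three_d_image"] = image
    return image
```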
Next, the generation of the two-dimensional object in step S15 shown in fig. 10 will be described. Fig. 12 is a flowchart showing a flow of processing when a two-dimensional object is generated by the display device 1 according to embodiment 1.
In step S31, the abnormality information acquisition unit 13 acquires abnormality information. The abnormality information acquisition unit 13 acquires abnormality information on each part to be monitored for the presence or absence of an abnormality. The abnormality information acquisition unit 13 outputs the acquired abnormality information to the superimposed object generation unit 14. In step S32, the superimposed object generating unit 14 determines whether or not there is a change in the acquired abnormality information.
The data storage unit 15 stores abnormality information used for the previous display of the operation support screen 30. The superimposed object generating unit 14 compares the abnormality information acquired this time with the abnormality information of the previous time for each part. The superimposed object generating unit 14 determines whether or not there is a change in the abnormality information for each part.
When the abnormality information has not changed (No in step S32), the superimposed object generating unit 14 ends the processing relating to the flow shown in fig. 12. In this case, the superimposed object generating unit 14 does not generate a two-dimensional object based on the abnormality information acquired this time, but reads out the data of the two-dimensional object stored in the data storage unit 15. That is, the display device 1 reuses the two-dimensional object used for the previous display without updating it. On the other hand, when there is a change in the abnormality information (Yes in step S32), the superimposed object generating unit 14 advances the flow to step S33.
In step S33, the superimposed object generating unit 14 determines whether or not the value of the abnormality information indicates an abnormality. The superimposed object generating unit 14 makes this determination for the abnormality information acquired for each portion.
When the value of the abnormality information indicates an abnormality (Yes in step S33), the superimposed object generating unit 14 reads out the two-dimensional data of the object 33 indicating an abnormality from the data storage unit 15 in step S34, thereby acquiring that data. The superimposed object generating unit 14 also obtains the coordinates indicating the position at which the object 33 is to be placed.
On the other hand, when the value of the abnormality information does not indicate an abnormality (No in step S33), the superimposed object generating unit 14 reads out the two-dimensional data of the object 32 indicating a normal condition from the data storage unit 15 in step S35, thereby acquiring that data. The superimposed object generating unit 14 also obtains the coordinates indicating the position at which the object 32 is to be placed.
The superimposed object generating unit 14 further acquires the rotation information from the rotation information acquiring unit 11. Once the superimposed object generating unit 14 has acquired the two-dimensional data, the coordinates, and the rotation information, the flow advances to step S36. The abnormality information is stored in the data storage unit 15.
In step S36, the superimposed object generating section 14 performs the superimposed drawing processing of the two-dimensional object based on the two-dimensional data, the coordinates, and the rotation information. The superimposed object generating section 14 thereby generates the two-dimensional object to be superimposed on the three-dimensional image.
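The flow of fig. 12 can be summarized in a sketch like the following; the data_storage dictionary and the superimpose callable stand in for the data storage unit 15 and the texture-mapping drawing processing, and are assumptions.
```python
# Sketch of steps S31-S36: choose object 32 or 33 from the abnormality information
# and superimpose it onto the three-dimensional image.
def generate_two_d_object(abnormality_value, previous_value, data_storage,
                          rotation_info, superimpose):
    if abnormality_value == previous_value:                 # S32: no change in the abnormality information
        return data_storage["previous_2d_object"]           # reuse the previous two-dimensional object
    if abnormality_value == 1:                              # S33: value indicates an abnormality
        two_d_data, coordinates = data_storage["object_33"]     # S34
    else:
        two_d_data, coordinates = data_storage["object_32"]     # S35
    # S36: superimposed drawing processing (texture mapping onto the portion's surface).
    return superimpose(two_d_data, coordinates, rotation_info)
```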
Fig. 13 is a diagram for explaining a process of superimposing a two-dimensional object representing a normal situation on a three-dimensional image by the display device 1 according to embodiment 1. Fig. 13 shows an image of the object 32 generated for a portion where a value indicating a normal condition is acquired as abnormality information, and an image in which the object 32 is superimposed on the portion, of the image 31 of the robot 3.
The superimposed object generating unit 14 acquires two-dimensional data of the object 32 among two-dimensional objects associated with the devices corresponding to the portions. The superimposed object generating section 14 performs superimposed drawing processing of superimposing the object 32 at the surface of the portion in the robot 3 by a texture mapping method. The superimposed object generating unit 14 obtains parameters necessary for texture mapping from the rotation information. Thus, the superimposed object generating unit 14 performs superimposed drawing processing for superimposing the object 32 representing a normal situation on the three-dimensional image.
Fig. 14 is a diagram for explaining a process of superimposing a two-dimensional object representing an abnormal situation on a three-dimensional image by the display device 1 according to embodiment 1. Fig. 14 shows an image of the object 33 generated for a portion where a value indicating an abnormal situation is acquired as abnormality information, and an image in which the object 33 is superimposed on the portion, among the images 31 of the robot 3.
The superimposed object generating unit 14 acquires two-dimensional data of the object 33 among two-dimensional objects associated with the devices corresponding to the portions. The superimposed object generating section 14 performs superimposed drawing processing of superimposing the object 33 at the surface of the portion in the robot 3 by a texture mapping method. The superimposed object generating unit 14 obtains parameters necessary for texture mapping from the rotation information. Thus, the superimposed object generating unit 14 performs superimposed drawing processing for superimposing the object 33 representing the abnormal situation on the three-dimensional image.
In this way, the superimposed object generating unit 14 performs superimposed drawing processing for superimposing the two-dimensional object on the surface of the robot 3 in the three-dimensional image based on the rotation information. Thus, the display device 1 can display the abnormal portion and the normal portion of the robot 3 in an easy-to-observe manner.
According to embodiment 1, the display device 1 generates a three-dimensional image for rotationally displaying the controlled device based on the rotation information, and generates a two-dimensional object superimposed on the three-dimensional image, whereby the three-dimensional image superimposed with the object can be rotationally displayed. The display device 1 can automatically rotate the image 31 of the controlled device so as to be suitable for displaying the portion where the abnormality has occurred. The operator can identify the site where the abnormality has occurred in a short time by visually checking the screen of the display device 1. Further, since the display device 1 acquires the rotation information stored in the control device 2, it is possible to display a three-dimensional image in which abnormality at each portion can be easily visually recognized without increasing the storage capacity. As a result, the display device 1 can notify the occurrence of an abnormality in the controlled device, so that the operator can easily identify the location of the controlled device where the abnormality has occurred. In addition, the display device 1 can display a portion where an abnormality has occurred in an easily observable manner without increasing the storage capacity.
Embodiment 2
Fig. 15 is a diagram showing a functional configuration of a display device 1A according to embodiment 2. The display device 1A stops the rotation of the image 31 of the controlled device in response to an operation. Furthermore, the display device 1A rotates the image 31 manually from the state in which the rotation has been stopped. In embodiment 2, the same reference numerals are given to the same components as those in embodiment 1, and mainly the configurations different from embodiment 1 will be described.
The display device 1A has the same functional configuration as the display device 1 according to embodiment 1. The display device 1A includes a coordinate determination unit 52, a rotation information change unit 53, and an operation unit 51 that receives an operation to stop rotation of the controlled device on the work support screen 30. The function of the operation unit 51 is realized by using the touch panel 25. The functions of the coordinate determination unit 52 and the rotation information modification unit 53 are realized by using a combination of the processor 20 and software.
The operation unit 51 receives a 1st operation for stopping the rotation of the robot 3 on the work support screen 30. The 1st operation is an operation of touching one point on the screen with a contact object. The operation unit 51 also receives a 2nd operation after the 1st operation. The 2nd operation is a slide operation, that is, an operation of moving the contact object from the touched position on the screen. The 2nd operation rotates the image of the robot 3, which has been stopped by the 1st operation, in the direction specified by the movement of the contact object. In the following description, the 1st operation and the 2nd operation are collectively referred to as a touch operation.
When an operation is performed, the operation unit 51 outputs touch information to the coordinate determination unit 52. The coordinate determination unit 52 determines coordinates representing the position of the contact object based on the touch information. The coordinate determination unit 52 determines whether or not there is a touch operation based on the coordinates. When there is a touch operation, the coordinate determination unit 52 outputs the coordinates to the rotation information changing unit 53. The rotation information changing unit 53 performs processing of changing the rotation information according to the coordinates.
Fig. 16 is a diagram for explaining an operation of manually rotating the image 31 of the controlled device in the display device 1A according to embodiment 2. The image 31 shown in the left view of fig. 16 is the image 31 at the moment rotation is stopped by touching one point on the work support screen 30 with a finger as the contact object. The automatic rotation of the image 31 remains stopped while an arbitrary position on the work support screen 30 is touched by the finger.
When the finger in contact with the work support screen 30 is moved from the state shown in the left view of fig. 16, the image 31 rotates in the direction in which the finger is moved. That is, the image 31 is rotated manually. The right view of fig. 16 shows a case where the finger is moved rightward on the work support screen 30 from the state shown in the left view. By moving the finger touching the work support screen 30 rightward, the image 31 displayed on the work support screen 30 rotates rightward from the state shown in the left view of fig. 16.
In the left view of fig. 16, the image 31 shows the robot 3 with its upper part, as installed, tilted to the right. When the finger is moved rightward on the work support screen 30 from a position near the lower part of the robot 3 as installed, the image 31 rotates so that the lower part of the robot 3 moves rightward, resulting in the state shown in the right view of fig. 16. The amount of rotation of the image 31 corresponds to the amount of movement of the finger touching the work support screen 30. The operator can move the finger touching the work support screen 30 in an arbitrary direction, and the image 31 rotates from its stopped state in the direction in which the finger moves.
Because the automatic rotation of the image 31 is stopped by the operator's operation, the portion where the abnormality has occurred can be checked in detail on the display device 1A. Furthermore, since the display device 1A rotates the image 31 from the stopped state based on the direction and amount of the finger's movement, the display device 1A improves operability for the operator.
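A minimal sketch of the manual rotation: the rotation applied to the stopped image is proportional to the finger's movement on the screen; the sensitivity constant is an assumption.
```python
# Sketch: convert a slide operation into a rotation of the stopped image.
DEGREES_PER_PIXEL = 0.5  # assumed sensitivity of the slide operation

def manual_rotation_delta(previous_xy, current_xy):
    """Return (horizontal, vertical) rotation deltas in degrees from the finger movement."""
    dx = current_xy[0] - previous_xy[0]
    dy = current_xy[1] - previous_xy[1]
    return dx * DEGREES_PER_PIXEL, dy * DEGREES_PER_PIXEL

# Moving the finger 100 px to the right rotates the image 50 degrees to the right.
print(manual_rotation_delta((200, 300), (300, 300)))  # -> (50.0, 0.0)
```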
Next, a process for rotating the image 31 of the controlled device by the display device 1A in accordance with the operation will be described. Fig. 17 is a flowchart showing a flow of processing for rotating the image 31 of the controlled device by the display device 1A according to the operation according to embodiment 2.
In step S41, the display device 1A starts communication with the control device 2. In step S42, the display device 1A determines whether or not there is a notification of the occurrence of an abnormality. If there is no such notification (No in step S42), the display device 1A repeats step S42 until a notification of the occurrence of an abnormality is received. When there is a notification of the occurrence of an abnormality (Yes in step S42), the display device 1A acquires rotation information through the rotation information acquiring unit 11 in step S43. The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generation unit 14.
When there is a touch operation to the operation unit 51, the display device 1A changes the rotation information according to the touch operation in step S44. In step S45, the display device 1A generates a three-dimensional image based on the changed rotation information. In step S46, the display apparatus 1A generates a two-dimensional object. The display device 1A superimposes the generated two-dimensional object on the generated three-dimensional image. The image display unit 16 displays an image newly generated by superimposing a two-dimensional object on a three-dimensional image on the work support screen 30. Thus, in step S47, the display device 1A updates the work support screen 30 via the image display unit 16. Thereby, the display device 1A ends the processing related to the flow of fig. 17.
Next, the change of the rotation information in step S44 shown in fig. 17 will be described. Fig. 18 is a flowchart showing a flow of processing for changing rotation information in the display device 1A according to embodiment 2.
The operation unit 51 outputs touch information generated by touching the work support screen 30 with a contact object to the coordinate determination unit 52. In step S51, the coordinate determination unit 52 acquires the touch information. In step S52, the coordinate determination unit 52 determines whether or not there is a touch operation based on the touch information. If there is no touch operation (No in step S52), the display device 1A ends the processing of the flow shown in fig. 18. On the other hand, if there is a touch operation (Yes in step S52), the display device 1A advances the flow to step S53. The coordinate determination unit 52 outputs the coordinates of the position of the touch operation to the rotation information changing unit 53.
The rotation information changing unit 53 stores the previous coordinates, that is, the display coordinates before the rotation corresponding to the movement amount is applied. In step S53, the rotation information changing unit 53 determines whether or not the current coordinates, that is, the coordinates of the touched position, have changed from the stored previous coordinates. If the coordinates of the touched position have changed from the previous coordinates (Yes in step S53), the display device 1A advances the flow to step S54. On the other hand, if the coordinates of the touched position have not changed from the previous coordinates (No in step S53), the display device 1A advances the flow to step S57.
In step S54, the rotation information changing unit 53 calculates the amount of change in coordinates between the previous coordinates and the coordinates of the touched position. In step S55, the rotation information changing unit 53 turns OFF the initial flag. The initial flag is a flag indicating whether the automatic rotation or the manual rotation is set. In step S56, the rotation information changing unit 53 adds the change amount calculated in step S54 to the manual rotation information, which is the rotation information set by the operation to the operation unit 51. That is, the rotation information changing unit 53 reflects the rotation of the movement amount based on the sliding operation to the rotation display. If the flow of step S56 ends, the display device 1A advances the flow to step S61.
In step S57, the rotation information changing unit 53 determines whether or not the initial flag is OFF. When the initial flag is ON (No in step S57), the display device 1A advances the flow to step S61. On the other hand, when the initial flag is OFF (Yes in step S57), the rotation information changing unit 53 turns ON the initial flag in step S58. In step S59, the rotation information changing unit 53 updates the stored coordinates to the coordinates of the touched position. In step S60, the rotation information changing unit 53 saves the rotation information acquired from the control device 2 as the manual rotation information. When the flow of step S60 ends, the display device 1A advances the flow to step S61.
In step S61, the rotation information changing unit 53 changes the rotation information by the manual rotation information. That is, the rotation information changing unit 53 changes the rotation information acquired by the rotation information acquiring unit 11 to manual rotation information. Thereby, the display device 1A ends the process for changing the rotation information.
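The change procedure of fig. 18 could be mirrored roughly as below; the class, the fallback used when no manual rotation information has been saved yet, and the way the slide amount is folded into an angle are all assumptions added only to keep the sketch runnable.
```python
# Sketch of steps S51-S61: maintain the manual rotation information from touch input.
def apply_slide(rotation_info, dx, dy, gain=0.5):
    """Add the slide movement to the manual rotation information (gain is assumed)."""
    changed = dict(rotation_info)
    changed["angle_deg"] = changed.get("angle_deg", 0.0) + dx * gain
    return changed

class RotationInfoChanger:
    def __init__(self):
        self.initial_flag = True            # initial flag ON
        self.saved_coords = (0, 0)          # previous coordinates
        self.manual_rotation_info = None    # manual rotation information

    def change(self, touched_coords, acquired_rotation_info):
        if touched_coords != self.saved_coords:                      # S53: coordinates changed
            dx = touched_coords[0] - self.saved_coords[0]            # S54: amount of change
            dy = touched_coords[1] - self.saved_coords[1]
            self.initial_flag = False                                # S55: initial flag OFF
            base = self.manual_rotation_info or acquired_rotation_info  # fallback (assumption)
            self.manual_rotation_info = apply_slide(base, dx, dy)    # S56: reflect the slide
        elif not self.initial_flag:                                  # S57: initial flag OFF
            self.initial_flag = True                                 # S58: initial flag ON
            self.saved_coords = touched_coords                       # S59: update stored coordinates
            self.manual_rotation_info = dict(acquired_rotation_info) # S60: save as manual info
        return self.manual_rotation_info or acquired_rotation_info   # S61: changed rotation info

changer = RotationInfoChanger()
print(changer.change((10, 0), {"angle_deg": 180.0, "period_s": 1.0})["angle_deg"])  # -> 185.0
```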
According to embodiment 2, the display device 1A stops the automatic rotation of the image 31 of the controlled device on the work support screen 30 in accordance with the operation to the operation unit 51. In addition, the display device 1A rotates the image 31 from a state in which the automatic rotation of the image 31 has stopped, based on the direction in which the contact object is moved and the movement amount of the contact object. In this way, the display device 1A can display a three-dimensional image so that the portion where the abnormality has occurred can be confirmed in detail, and operability of the operator can be improved.
Embodiment 3
Fig. 19 is a diagram showing a functional configuration of a display device 1B according to embodiment 3. The display device 1B generates a two-dimensional object to which the transparency process is applied. In embodiment 3, the same components as those in embodiment 1 or embodiment 2 are denoted by the same reference numerals, and configurations different from those in embodiment 1 or embodiment 2 will be mainly described.
The display device 1B has the same functional configuration as the display device 1A according to embodiment 2. The display device 1B further includes a transparency processing unit 61 for causing the superimposed object generating unit 14 to generate a two-dimensional object to be displayed in a transparent manner. The function of the transparency processing unit 61 is realized by using a combination of the processor 20 and software. The display device 1B is not limited to the configuration in which the transparency processing unit 61 is added to the same functional configuration as the display device 1A. The display device 1B may have a configuration in which the transparency processing unit 61 is added to the same functional configuration as the display device 1 according to embodiment 1.
The transparency processing unit 61 determines whether or not transparency processing can be applied to the two-dimensional object superimposed on the three-dimensional image. Based on the result of the determination, the transparency processing unit 61 sets a transparency flag indicating whether or not transparency processing is to be performed. The transparency processing unit 61 determines that transparency processing of the two-dimensional object is possible when the format of the two-dimensional data supports transparency processing, for example, the ARGB8888 format. In this case, the transparency processing unit 61 turns ON the transparency flag. The transparency processing unit 61 determines that transparency processing of the two-dimensional object is not possible when the format of the two-dimensional data does not support transparency processing, for example, the RGB888 format. In this case, the transparency processing unit 61 turns OFF the transparency flag.
When the transparency flag is ON, the superimposed object generating unit 14 performs the superimposed drawing processing on the two-dimensional object subjected to transparency processing. Thus, the superimposed object generating unit 14 superimposes the transparently displayed two-dimensional object on the three-dimensional image. On the other hand, when the transparency flag is OFF, the superimposed object generating unit 14 performs the superimposed drawing processing on the two-dimensional object to which no transparency processing is applied. Thus, the superimposed object generating unit 14 superimposes the opaquely displayed two-dimensional object on the three-dimensional image.
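The choice between the two drawing paths can be pictured with the short Python sketch below. The pixel representation, the per-pixel blending helper, and the function names are assumptions made only for this illustration, with ARGB8888 standing in for a format that carries an alpha channel and RGB888 for one that does not:

    def transparency_flag(pixel_format):
        """Transparency processing unit 61: ON (True) only for an alpha-capable format."""
        return pixel_format == "ARGB8888"

    def blend_pixel(src_argb, dst_rgb):
        """Alpha-blend one ARGB object pixel over one RGB pixel of the three-dimensional image."""
        a, r, g, b = src_argb
        alpha = a / 255.0
        return tuple(int(alpha * s + (1.0 - alpha) * d)
                     for s, d in zip((r, g, b), dst_rgb))

    def superimpose_pixel(src_argb, dst_rgb, flag_on):
        """Transparent drawing when the flag is ON, opaque drawing when it is OFF."""
        if flag_on:
            return blend_pixel(src_argb, dst_rgb)
        _, r, g, b = src_argb                 # alpha byte is ignored in the opaque path
        return (r, g, b)                      # the object pixel simply replaces the image pixel

    # Example: a half-transparent red object pixel drawn over a blue image pixel.
    print(superimpose_pixel((128, 255, 0, 0), (0, 0, 255), transparency_flag("ARGB8888")))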
Fig. 20 is a diagram showing an example in which a two-dimensional object displayed in an opaque manner is superimposed on a three-dimensional image by the display device 1B according to embodiment 3. When the transparency flag is OFF, the superimposed object generating unit 14 generates an object 33 which is a two-dimensional object to be displayed in an opaque manner. The superimposed object generating unit 14 performs superimposed drawing processing for superimposing the object 33 on the three-dimensional image.
Fig. 21 is a diagram showing an example in which a two-dimensional object displayed in a transparent manner is superimposed on a three-dimensional image by the display device 1B according to embodiment 3. When the transparency flag is ON, the superimposed object generating unit 14 generates an object 62, which is a two-dimensional object displayed in a transparent manner. The superimposed object generating unit 14 performs the superimposed drawing processing for superimposing the object 62 on the three-dimensional image.
Because the display device 1B can generate a two-dimensional object displayed in a transparent manner, the operator can easily confirm whether or not an abnormality is present, and can also confirm the surface state of the portion where the abnormality has occurred while the two-dimensional object is displayed. Thus, the display device 1B can further improve the visibility of the portion where the abnormality has occurred.
Next, generation of a two-dimensional object by the display device 1B according to embodiment 3 will be described. Fig. 22 is a flowchart showing a flow of processing when a two-dimensional object is generated by the display device 1B according to embodiment 3.
In step S71, the superimposed object generating unit 14 acquires the abnormality information from the abnormality information acquisition unit 13. The superimposed object generating unit 14 acquires the abnormality information for each portion monitored for an abnormality. In step S72, the superimposed object generating unit 14 determines whether or not the acquired abnormality information has changed. When the abnormality information has not changed (No in step S72), the superimposed object generating unit 14 ends the processing of the flow shown in Fig. 22. On the other hand, when the abnormality information has changed (Yes in step S72), the superimposed object generating unit 14 advances the flow to step S73.
In step S73, the superimposed object generating unit 14 determines whether or not the value of the abnormality information indicates an abnormality. When the value of the abnormality information indicates an abnormality (Yes in step S73), the superimposed object generating unit 14 acquires, in step S74, the two-dimensional data of the object 33 indicating an abnormality. When step S74 ends, the display device 1B advances the flow to step S76. On the other hand, when the value of the abnormality information does not indicate an abnormality (No in step S73), the superimposed object generating unit 14 acquires, in step S75, the two-dimensional data of the object 32 indicating a normal state. When step S75 ends, the display device 1B advances the flow to step S76.
In step S76, the transparency processing unit 61 sets the transparency flag. When the format of the two-dimensional data acquired in step S74 or step S75 supports transparency processing, the transparency processing unit 61 turns ON the transparency flag. On the other hand, when the format of the two-dimensional data acquired in step S74 or step S75 does not support transparency processing, the transparency processing unit 61 turns OFF the transparency flag.
In step S77, the superimposed object generating unit 14 determines whether or not the transparency flag set in step S76 is OFF. When the transparency flag is OFF (Yes in step S77), the superimposed object generating unit 14 executes superimposed drawing processing of the two-dimensional object in step S78. That is, the superimposed object generating unit 14 superimposes the two-dimensional object displayed in an opaque manner on the three-dimensional image.
On the other hand, when the transparency flag is ON (No in step S77), the superimposed object generating unit 14 performs the transparency processing and the superimposed drawing processing of the two-dimensional object in step S79. That is, the superimposed object generating unit 14 superimposes the transparently displayed two-dimensional object on the three-dimensional image. The superimposed object generating unit 14 performs the transparency processing of the two-dimensional object by adjusting the degree of transparency or the alpha value of the pixels in the two-dimensional data. In this way, the display device 1B generates either a transparently displayed or an opaquely displayed two-dimensional object to be superimposed on the three-dimensional image.
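A compressed Python sketch of steps S71 to S79 follows. The dictionary layout of the stored two-dimensional data, the example alpha value of 128, and the function name are assumptions made for this illustration:

    def generate_two_dimensional_object(abnormal, changed, data_storage):
        """Return the two-dimensional data to superimpose, or None when nothing changed."""
        if not changed:                                    # step S72: abnormality information unchanged
            return None
        key = "object_33" if abnormal else "object_32"     # steps S73 to S75: choose the object
        two_d = dict(data_storage[key])                    # copy of the stored two-dimensional data
        flag_on = two_d["format"] == "ARGB8888"            # step S76: set the transparency flag
        if flag_on:                                        # step S79: transparency processing
            # Adjust the alpha value of every pixel; 128 is an arbitrary example value.
            two_d["pixels"] = [(128, r, g, b) for (_a, r, g, b) in two_d["pixels"]]
        # Step S78 or S79: the returned object is then drawn over the three-dimensional image.
        return two_d

    # Example storage with one alpha-capable object and one opaque-only object (assumed layout).
    storage = {
        "object_33": {"format": "ARGB8888", "pixels": [(255, 255, 0, 0)]},
        "object_32": {"format": "RGB888", "pixels": [(0, 255, 0)]},
    }
    print(generate_two_dimensional_object(abnormal=True, changed=True, data_storage=storage))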
The transparency processing unit 61 may also switch the transparency flag between ON and OFF for two-dimensional data whose transparency flag has been set to ON. The transparency processing unit 61 switches the transparency flag between ON and OFF in accordance with an operation on the operation unit 51. By switching the transparency flag, the display device 1B can freely change the two-dimensional object superimposed on the three-dimensional image between the transparent display and the opaque display.
According to embodiment 3, the display device 1B can generate the two-dimensional object to be transparently displayed by the superimposed object generating unit 14, and can further improve the visibility of the portion where the abnormality occurs.
Embodiment 4
Fig. 23 is a diagram showing a functional configuration of a display device 1C according to embodiment 4. When alarm information, which is information on each abnormality that has occurred, is selected from an alarm list, the display device 1C displays the portion where the abnormality indicated by the selected alarm information has occurred. In embodiment 4, the same components as those in embodiments 1 to 3 are denoted by the same reference numerals, and configurations different from those in embodiments 1 to 3 will be mainly described.
The display device 1C has the same functional configuration as the display device 1B according to embodiment 3. The display device 1C further includes an alarm selecting unit 71 for determining alarm information selected by an operation, a window generating unit 72 for generating a window displayed on an alarm screen, and an alarm display unit 73 for displaying an alarm list on the alarm screen. The functions of the alarm selecting unit 71 and the window generating unit 72 are realized by using a combination of the processor 20 and software. The function of the alarm display unit 73 is realized by using the display 24. The display device 1C is not limited to the configuration in which the alarm selecting unit 71, the window generating unit 72, and the alarm display unit 73 are added to the same functional configuration as the display device 1B. The display device 1C may have the same functional configuration as the display device 1 according to embodiment 1 or the same functional configuration as the display device 1A according to embodiment 2, with the alarm selecting unit 71, the window generating unit 72, and the alarm display unit 73 added thereto.
The alarm list is a list of alarm information, which is information on each abnormality that has occurred. The display device 1C displays the alarm list on the alarm screen. When alarm information is selected, the alarm selecting unit 71 identifies the selected alarm information and notifies the window generating unit 72, the image generation unit 12, and the superimposed object generating unit 14 of the identified alarm information. Upon receiving the notification of the alarm information, the window generating unit 72 generates a window to be displayed on the alarm screen.
The image generation unit 12 identifies the three-dimensional data for generating the three-dimensional image based on the notified alarm information and reads out the identified three-dimensional data from the data storage unit 15. The superimposed object generating unit 14 identifies the two-dimensional data for generating the two-dimensional object based on the notified alarm information and reads out the identified two-dimensional data from the data storage unit 15. When alarm information is selected from the alarm list, the alarm display unit 73 displays, in the window displayed on the alarm screen, a three-dimensional image on which the two-dimensional object is superimposed. In this way, the display device 1C links the display of the image 31 of the controlled device with the alarm list.
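The notification path from the selection of alarm information to the read-out of the drawing data can be sketched as follows; the AlarmInfo fields, the lookup keys into the data storage, and the dictionary returned for the window are assumptions for illustration:

    from dataclasses import dataclass

    @dataclass
    class AlarmInfo:
        occurred_at: str   # time of occurrence of the abnormality
        message: str       # message indicating the content of the abnormality
        part_id: str       # portion of the controlled device where the abnormality occurred

    def on_alarm_selected(alarm, data_storage):
        """Identify the data used for the window displayed on the alarm screen."""
        three_d = data_storage["3d"][alarm.part_id]   # read out by the image generation unit 12
        two_d = data_storage["2d"][alarm.part_id]     # read out by the superimposed object generating unit 14
        # The window generation described above then builds the window shown on the alarm screen.
        return {"title": alarm.message, "model": three_d, "object": two_d}

    # Example: selecting an alarm for an assumed joint of the robot 3.
    storage = {"3d": {"joint_2": "joint_2.model"}, "2d": {"joint_2": "abnormal.icon"}}
    print(on_alarm_selected(AlarmInfo("10:15", "Motor overload", "joint_2"), storage))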
Next, the display of the alarm list and the display of the three-dimensional image in the display device 1C will be described. Fig. 24 is a diagram showing the display of the alarm list and of the three-dimensional image on the display device 1C according to embodiment 4. Fig. 24 shows the alarm screen 74 displayed on the display device 1C and the image 31 of the robot 3 rotationally displayed as a result of selecting alarm information on the alarm screen 74.
The alarm information includes the time of occurrence of the abnormality, a message indicating the content of the abnormality, and the time of recovery from the abnormality. The alarm information may also include information such as an error code indicating the content or cause of the abnormality and the cumulative number of occurrences of the abnormality.
The operation unit 51 receives an operation for selecting alarm information from the alarm list on the alarm screen 74. The operator selects alarm information by touching, on the alarm screen 74, the line 76 in which that alarm information is listed. When alarm information is selected, the alarm display unit 73 displays a window 75 on the alarm screen 74. The alarm display unit 73 displays, in the window 75, the image 31 of the robot 3 rotated in accordance with the rotation information. A two-dimensional object indicating the abnormality is superimposed at the position, in the image 31, of the portion where the abnormality indicated by the alarm information has occurred. In the window 75, the image 31 on which the two-dimensional object is superimposed is rotationally displayed in accordance with the rotation information.
When an alarm occurs, the operator may not be able to immediately grasp the content of the abnormality. From the alarm list, the operator can confirm information related to the abnormality, such as the time at which the abnormality occurred. Furthermore, by selecting the alarm information from the alarm list, the operator can identify the portion where the abnormality has occurred in the rotationally displayed image 31. This improves the work efficiency of the operator.
The alarm list may include not only alarm information on abnormalities that are currently occurring but also alarm information on abnormalities that occurred in the past. By selecting the alarm information from the alarm list, the operator can identify, in the rotationally displayed image 31, both the portion where an abnormality is currently occurring and the portion where an abnormality occurred in the past.
Next, a process when the display device 1C performs the rotation display of the controlled device in the window 75 of the alarm screen 74 will be described. Fig. 25 is a flowchart showing a flow of processing when the display device 1C according to embodiment 4 performs the rotation display of the controlled device in the window 75 of the alarm screen 74.
In step S81, the display device 1C starts communication with the control device 2. In step S82, the display device 1C determines whether or not there is a notification of the occurrence of an abnormality. When there is no notification of the occurrence of an abnormality (No in step S82), the display device 1C repeats step S82 until such a notification is received. When there is a notification of the occurrence of an abnormality (Yes in step S82), the display device 1C advances the flow to step S83.
In step S83, the display device 1C displays the alarm list on the alarm screen 74. In step S84, the display device 1C determines whether or not the alarm information is selected. When the alarm information is not selected (No in step S84), the display device 1C repeats the flow of step S84 until the alarm information is selected. When the alarm information is selected (Yes in step S84), the display device 1C displays the window 75 on the alarm screen 74 in step S85.
In step S86, the display device 1C acquires the rotation information by the rotation information acquisition unit 11. The rotation information acquisition unit 11 outputs the acquired rotation information to the image generation unit 12 and the superimposed object generating unit 14. In step S87, the display device 1C generates a three-dimensional image by the image generation unit 12. In step S88, the display device 1C generates a two-dimensional object by the superimposed object generating unit 14. The display device 1C superimposes the generated two-dimensional object on the generated three-dimensional image by the alarm display unit 73. The alarm display unit 73 displays, in the window 75, the image newly generated by superimposing the two-dimensional object on the three-dimensional image. Thus, in step S89, the display device 1C updates the display of the window 75 by the alarm display unit 73. In this way, the display device 1C performs the rotation display of the robot 3 in the window 75 of the alarm screen 74.
In step S90, the display device 1C determines whether or not there is an operation to close the window 75. One example of the operation for closing the window 75 is touching a button displayed in the window 75. When there is no operation to close the window 75 (No in step S90), the display device 1C returns the flow to step S86. When there is an operation to close the window 75 (Yes in step S90), the display device 1C clears the display of the window 75 in step S91 to close the window 75. The display device 1C thereby ends the processing of the flow shown in Fig. 25.
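Steps S86 to S90 form an update loop, which can be structured as in the Python sketch below; every function argument is a hypothetical callback and the update interval is an assumption:

    import time

    def run_alarm_window(get_rotation, generate_3d, generate_2d, superimpose,
                         update_window, close_requested, interval_s=0.1):
        """Repeat acquisition, generation, and display until the window 75 is closed."""
        while not close_requested():                          # step S90: operation to close the window
            rotation = get_rotation()                         # step S86: acquire the rotation information
            image_3d = generate_3d(rotation)                  # step S87: generate the three-dimensional image
            object_2d = generate_2d()                         # step S88: generate the two-dimensional object
            update_window(superimpose(object_2d, image_3d))   # step S89: update the display of the window 75
            time.sleep(interval_s)                            # pacing between updates (assumed)
        # Step S91 (clearing the window display) is left to the caller in this sketch.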
According to embodiment 4, when alarm information is selected from the alarm list, the display device 1C displays a three-dimensional image on which the two-dimensional object is superimposed. The display device 1C can thereby improve the work efficiency of the operator.
In embodiments 1 to 4, the controlled device is not limited to the robot 3 as long as it can be controlled by the control device 2. The display devices 1, 1A, 1B, and 1C are not limited to programmable displays and may each be a computer system such as a personal computer or a general-purpose computer. In that case, a program for realizing the functions of the display device 1 is installed in the computer system. The storage device 22 shown in Fig. 2 may be an HDD or an SSD. The program is stored in the storage device 22. The processor 20 reads the program stored in the storage device 22 into the memory 21 and executes it.
The configuration shown in the above embodiments represents an example of the content of the present invention. The structure of each embodiment can be combined with other known techniques. The structures of the embodiments may be combined with each other as appropriate. A part of the structure of each embodiment can be omitted or changed without departing from the scope of the present invention.
Description of the reference numerals
1, 1A, 1B, 1C display device, 2 control device, 3 robot, 4 drawing device, 10 communication unit, 11 rotation information acquisition unit, 12 image generation unit, 13 abnormality information acquisition unit, 14 superimposed object generation unit, 15 data storage unit, 16 image display unit, 20, 26, 40 processor, 21, 27, 41 memory, 22, 28, 42 storage device, 23, 29, 43 communication interface, 24, 44 display, 25 touch panel, 30 work support screen, 31 image, 32, 33, 62 object, 34, 75 window, 35 screen data, 45 input device, 51 operation unit, 52 coordinate determination unit, 53 rotation information changing unit, 61 transparency processing unit, 71 alarm selecting unit, 72 window generating unit, 73 alarm display unit, 74 alarm screen, 76 line, 100 control system.

Claims (12)

1. A display device, comprising:
a communication unit capable of communicating with a control device that controls a controlled device;
an image generation unit that generates a three-dimensional image for rotationally displaying the controlled device on a screen;
a superimposed object generating unit that generates an object indicating the presence or absence of an abnormality at a portion of the controlled device, the object being superimposed at a position of the portion in the three-dimensional image;
a rotation information acquisition unit that acquires, from the communication unit, rotation information corresponding to a portion in which an abnormality has occurred, among rotation information that indicates a rotation pattern of the controlled device displayed on the screen and that is preset for each of a plurality of portions in which the presence or absence of the abnormality is monitored; and
an image display unit that rotationally displays the three-dimensional image on which the object is superimposed,
the image generation unit acquires the rotation information corresponding to the portion in which the abnormality has occurred from the rotation information acquisition unit, and generates the three-dimensional image rotated based on the rotation information corresponding to the portion in which the abnormality has occurred.
2. The display device according to claim 1, wherein,
the rotation information is information transmitted from the control device to the communication unit.
3. The display device according to claim 1 or 2, wherein,
comprises an abnormality information acquisition unit for acquiring abnormality information indicating the presence or absence of abnormality in the portion,
the superimposed object generating unit acquires the abnormality information from the abnormality information acquiring unit, and generates the object based on the abnormality information.
4. The display device according to claim 1 or 2, wherein,
the superimposed object generating unit acquires the rotation information corresponding to the portion in which the abnormality has occurred from the rotation information acquisition unit, and performs processing for superimposing the object on the surface of the controlled device in the three-dimensional image based on the rotation information corresponding to the portion in which the abnormality has occurred.
5. The display device according to claim 1 or 2, wherein,
the rotation information includes information indicating a direction of rotation, information indicating an angle of rotation, and information indicating a period of rotation.
6. The display device according to claim 1 or 2, wherein,
an operation unit is provided, which receives an operation to stop rotation of the controlled device on the screen.
7. The display device according to claim 6, wherein,
the image generation unit generates a three-dimensional image in which the image of the controlled device, whose rotation has been stopped in accordance with a first operation, is rotated in a direction corresponding to a second operation on the operation unit.
8. The display device according to claim 1 or 2, wherein,
comprises a transparency processing unit for causing the superimposed object generating unit to generate the object to be displayed in a transparent state.
9. The display device according to claim 1 or 2, wherein,
comprises an alarm display unit for displaying an alarm list, which is a list of alarm information that is information on each abnormality that has occurred,
the alarm display unit displaying the three-dimensional image on which the object is superimposed when the alarm information is selected from the alarm list.
10. A control system, characterized by comprising:
a control device that controls a controlled device; and
a display device, wherein
the display device includes:
a communication unit capable of communicating with the control device;
an image generation unit that generates a three-dimensional image for rotationally displaying the controlled device on a screen;
a superimposed object generating unit that generates an object indicating the presence or absence of an abnormality at a portion of the controlled device, the object being superimposed at a position of the portion in the three-dimensional image;
a rotation information acquisition unit that acquires, from the communication unit, rotation information corresponding to a portion in which an abnormality has occurred, among rotation information that indicates a rotation pattern of the controlled device displayed on the screen and that is preset for each of a plurality of portions in which the presence or absence of the abnormality is monitored; and
an image display unit that rotationally displays the three-dimensional image on which the object is superimposed,
the image generation unit acquires the rotation information corresponding to the portion in which the abnormality has occurred from the rotation information acquisition unit, and generates the three-dimensional image rotated based on the rotation information corresponding to the portion in which the abnormality has occurred.
11. The control system according to claim 10, wherein,
the control device stores the rotation information preset for each of the plurality of portions, and transmits the rotation information corresponding to the portion in which the abnormality has occurred to the display device.
12. A drawing method for creating a screen to be displayed on the display device according to any one of claims 1 to 9 by using a drawing device, the drawing method being characterized by comprising the steps of:
associating two-dimensional data representing an object indicating the presence or absence of an abnormality at a portion of the controlled device with coordinates indicating the position of the portion in three-dimensional data representing the three-dimensional shape of the controlled device; and
associating, with the two-dimensional data, a data area of a control device that controls the controlled device, the data area storing data indicating whether or not an abnormality is present in the portion.
CN202080102240.6A 2020-12-28 2020-12-28 Display device, control system and drawing method Active CN115803785B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/049246 WO2022145014A1 (en) 2020-12-28 2020-12-28 Display device, control system, and drawing method

Publications (2)

Publication Number Publication Date
CN115803785A CN115803785A (en) 2023-03-14
CN115803785B true CN115803785B (en) 2024-03-15

Family

ID=80447954

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080102240.6A Active CN115803785B (en) 2020-12-28 2020-12-28 Display device, control system and drawing method

Country Status (3)

Country Link
JP (1) JP6991396B1 (en)
CN (1) CN115803785B (en)
WO (1) WO2022145014A1 (en)

GR01 Patent grant