
CN110580806A - display control device and computer-readable storage medium - Google Patents

Display control device and computer-readable storage medium

Info

Publication number
CN110580806A
CN110580806A (Application CN201910434601.6A)
Authority
CN
China
Prior art keywords
vehicle
captured image
request information
imaging target
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910434601.6A
Other languages
Chinese (zh)
Inventor
片山睦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honda Motor Co Ltd
Original Assignee
Honda Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Publication of CN110580806A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/01: Detecting movement of traffic to be counted or controlled
    • G08G 1/0104: Measuring and analyzing of parameters relative to traffic conditions
    • G08G 1/0108: Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G 1/0112: Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
    • G08G 1/012: Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
    • G08G 1/0125: Traffic data processing
    • G08G 1/0133: Traffic data processing for classifying traffic situation
    • G08G 1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Navigation (AREA)

Abstract

It is desirable to realize a technique capable of preferentially providing a newer captured image when providing a captured image of an imaging target spot. Provided is a display control device including: a target point acquisition unit that acquires an imaging target spot; a captured image receiving unit that, when a vehicle capable of capturing the imaging target spot is present, receives a captured image of the imaging target spot captured by that vehicle from that vehicle, and that, when no such vehicle is present, receives the captured image from a vehicle that stores a captured image obtained by capturing the imaging target spot; and a display control unit that displays the captured image received by the captured image receiving unit.

Description

Display control device and computer-readable storage medium
Technical Field
The present invention relates to a display control device and a computer-readable storage medium.
Background
An in-vehicle system is known that includes means for receiving a setting of an observation point from a user, means for requesting another in-vehicle system to capture an image of the observation point, and means for receiving and displaying an image of the observation point from the other in-vehicle system (see, for example, Patent Document 1).
Documents of the prior art
Patent document
Patent Document 1: Japanese Patent Laid-Open Publication No. 2006-031583
Disclosure of Invention
It is desirable to realize a technique capable of preferentially providing a newer captured image when providing a captured image of an imaging target spot.
According to the 1 st aspect of the present invention, there is provided a display control device. The display control device may include a target point acquisition unit that acquires an imaging target point. The display control device may include a captured image receiving unit that, when a vehicle capable of capturing the imaging target spot is present, receives a captured image of the imaging target spot captured by that vehicle from that vehicle, and that, when no such vehicle is present, receives the captured image from a vehicle that stores a captured image obtained by capturing the imaging target spot. The display control device may include a display control unit that displays the captured image received by the captured image receiving unit.
The captured image receiving unit may receive the captured image of the imaging target spot from an image management server that manages captured images captured by a plurality of vehicles when there is no vehicle that can capture the imaging target spot and the captured image cannot be received from a vehicle that stores the captured image obtained by capturing the imaging target spot. The display control unit may display, in correspondence with the captured image, imaging time information indicating a time at which the captured image is captured.
The display control device may further include a request information transmitting unit that broadcasts request information including position information indicating the imaging target spot. The captured image receiving unit may receive the captured image of the imaging target spot captured by the vehicle that transmitted 1 st response information when the 1 st response information, indicating that the imaging target spot can be captured, is received with respect to the request information. The 1 st response information may include time information indicating the time until the vehicle that received the request information reaches the imaging target spot, and the captured image receiving unit may receive the captured image from the vehicle that transmitted the 1 st response information when the time information satisfies a predetermined condition. The 1 st response information may include distance information indicating the distance from the vehicle that received the request information to the imaging target spot, and the captured image receiving unit may receive the captured image from the vehicle that transmitted the 1 st response information when the distance information satisfies a predetermined condition. The request information transmitting unit may transmit request information including time information indicating a predetermined time, and the captured image receiving unit may receive the captured image of the imaging target spot captured by the vehicle that transmitted the 1 st response information when 1 st response information indicating that the imaging target spot can be captured within the predetermined time from the reception of the request information is received with respect to the request information. The request information transmitting unit may transmit request information including distance information indicating a predetermined distance, and the captured image receiving unit may receive the captured image of the imaging target spot captured by the vehicle that transmitted the 1 st response information when 1 st response information indicating that the distance to the imaging target spot is shorter than the predetermined distance is received with respect to the request information.
The captured image receiving unit may receive the captured image from the vehicle that transmitted 2 nd response information when the 1 st response information is not received and the 2 nd response information, indicating that a captured image obtained by capturing the imaging target spot is stored, is received with respect to the request information. The 2 nd response information may include captured-image time information indicating the time at which the stored captured image of the imaging target spot was captured, and the captured image receiving unit may receive the captured image from the vehicle that transmitted the 2 nd response information when the captured-image time information satisfies a predetermined condition. The request information transmitting unit may transmit request information including the position information and time information indicating a predetermined time, and the captured image receiving unit may receive the captured image from the vehicle that transmitted the 2 nd response information when 2 nd response information indicating that a captured image of the imaging target spot captured within the predetermined time before the reception of the request information is stored is received with respect to the request information.
According to the 2 nd aspect of the present invention, there is provided a computer-readable storage medium storing a program for causing a computer to function as the above display control device.
Moreover, the above summary does not enumerate all features of the present invention. Sub-combinations of these feature groups may also constitute the invention.
Drawings
Fig. 1 schematically shows an example of a communication environment of a vehicle 100.
Fig. 2 schematically shows an example of the structure of the vehicle 100.
Fig. 3 schematically shows an example of the positional relationship of the plurality of vehicles 100.
Fig. 4 schematically shows an example of the functional configuration of the control device 200.
Fig. 5 schematically shows an example of the flow of processing performed by the control device 200.
Fig. 6 schematically shows an example of the hardware configuration of the computer 1000 that functions as the control device 200.
Fig. 7 schematically shows an example of a functional configuration of communication terminal 500.
Fig. 8 schematically shows an example of a hardware configuration of a computer 1100 that functions as the communication terminal 500.
description of the reference numerals
10: a network; 100: a vehicle; 110: an operation unit; 120: a display unit; 130: a wireless communication unit; 140: an imaging unit; 150: a GNSS receiving unit; 160: a sensor unit; 200: a control device; 202: a target point acquisition unit; 204: a request information transmitting unit; 206: a response receiving unit; 208: a request information transmitting unit; 210: a captured image receiving unit; 212: a display control unit; 214: a vehicle information acquisition unit; 300: a vehicle management server; 410, 420, 430, 440, 450: vehicles; 500: a communication terminal; 502: a target point acquisition unit; 504: a request information transmitting unit; 506: a response receiving unit; 508: a request information transmitting unit; 510: a captured image receiving unit; 512: a display control unit; 514: a vehicle information acquisition unit; 1000: a computer; 1010: a CPU; 1020: a ROM; 1030: a RAM; 1040: a communication I/F; 1050: a hard disk drive; 1080: an input/output chip; 1085: a graphics controller; 1092: a host controller; 1094: an input/output controller; 1100: a computer; 1110: an SoC; 1122: a main memory; 1124: a flash memory; 1132: an antenna; 1134: an antenna; 1136: an antenna; 1140: a display; 1142: a microphone; 1144: a speaker; 1152: a USB port; 1154: a card slot.
Detailed Description
The present invention will be described below with reference to embodiments thereof, but the following embodiments do not limit the invention according to the claims. In addition, not all combinations of the features described in the embodiments are essential to the solution of the invention.
Fig. 1 schematically shows an example of a communication environment of a vehicle 100 according to the present embodiment. The vehicle 100 wirelessly communicates with other vehicles 100. The vehicle 100 can perform wireless communication with another vehicle 100 by at least one of wireless communication with the other vehicle 100 via the network 10, direct wireless communication with the other vehicle 100 (sometimes referred to as vehicle-to-vehicle direct communication), and wireless communication with the other vehicle 100 via a roadside unit (sometimes referred to as road-to-vehicle communication).
The network 10 may be any network. For example, the network 10 may include at least one of the internet, a mobile telephone network such as so-called 3G (3rd Generation), LTE (Long Term Evolution), 4G (4th Generation), and 5G (5th Generation) networks, a public wireless LAN (Local Area Network), and a private network.
The vehicle 100 may perform vehicle-to-vehicle direct communication and road-to-vehicle communication using any known vehicle-to-vehicle communication technique and road-to-vehicle communication technique. For example, the vehicle 100 executes vehicle-to-vehicle direct communication and road-to-vehicle communication using a predetermined frequency band such as the 700 MHz band or the 5.8 GHz band. The vehicle 100 may also wirelessly communicate with another vehicle 100 via other vehicles 100. For example, an inter-vehicle network may be formed by a plurality of vehicles 100 cooperating with one another through vehicle-to-vehicle direct communication or road-to-vehicle communication, and vehicles 100 at remote locations may communicate with each other via the inter-vehicle network.
The vehicle management server 300 manages a plurality of vehicles 100. The vehicle management server 300 may manage the vehicle information of each of the plurality of vehicles 100. The vehicle information may include the location of the vehicle 100. The vehicle information may include a running condition of the vehicle 100. For example, the vehicle information includes a traveling direction, a traveling speed, and the like of the vehicle 100. In addition, the vehicle information includes, for example, route information indicating a route to the destination of the vehicle 100. The vehicle management server 300 may periodically receive various pieces of vehicle information from the vehicle 100 via the network 10.
The vehicle 100 may receive various vehicle information from the vehicle management server 300 via the network 10. The vehicle 100 may receive various pieces of vehicle information from another vehicle 100 via at least one of vehicle-to-vehicle direct communication, road-to-vehicle communication, and the inter-vehicle network. The vehicle 100 recognizes the status of the other vehicle 100 from the received vehicle information.
The vehicle 100 includes an imaging unit that images the surroundings of the vehicle 100, and transmits an image captured by the imaging unit to the vehicle management server 300 or to another vehicle 100. Further, the vehicle 100 receives the captured image captured by the imaging unit of the other vehicle 100 from the other vehicle 100, or receives the captured image captured by the imaging unit of the other vehicle 100 from the vehicle management server 300. In this way, the plurality of vehicles 100 share the captured image. The captured image may be a still image or a video (moving image). The vehicle management server 300 may be an example of an image management server.
The vehicle 100 receives and displays a captured image of a capturing target spot specified by the user of the vehicle 100, for example, from another vehicle 100 or the vehicle management server 300. The vehicle 100 can receive the captured image from, for example, the vehicle 100 that captures the image capture target spot while traveling in the vicinity of the image capture target spot. Further, the vehicle 100 can receive the captured image from the vehicle 100 that stores the captured image captured when traveling near the imaging target point. In addition, the vehicle 100 can also receive a captured image of the imaging target point from the vehicle management server 300.
In many cases, the user of the vehicle 100 wishes to view as new a captured image as possible, for example to check the current congestion state or the current road state at a certain point. It is therefore preferable that a real-time captured image be received from a vehicle 100 that is traveling near the imaging target point and capturing it, and be presented to the user. However, since a vehicle 100 does not necessarily travel near the imaging target point, a real-time captured image cannot always be presented to the user. In such a case, it is still preferable that a past captured image can be provided to the user rather than no image at all, and it is more preferable that, among the past captured images, a newer one can be provided.
The vehicle 100 according to the present embodiment receives, when a vehicle capable of capturing the imaging target spot is present, a captured image of the imaging target spot captured by that vehicle from that vehicle, and receives, when no such vehicle is present, the captured image from a vehicle that stores a captured image obtained by capturing the imaging target spot. This makes it possible to display a real-time captured image when one is available, and to display a captured image taken not long before even when no real-time image is available. In addition, the vehicle 100 may receive the captured image of the imaging target spot from the vehicle management server 300 when there is no vehicle capable of capturing the imaging target spot and the captured image cannot be received from a vehicle that stores a captured image of the imaging target spot. This makes it possible to display a captured image that was captured in the past and uploaded to the vehicle management server 300 even when neither of the other two sources is available.
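The priority order described above (a vehicle that can capture the spot now, then a vehicle holding a stored image, then the vehicle management server 300) can be pictured with the following minimal Python sketch. It is only an illustration of the ordering; every identifier in it is hypothetical and does not appear in the embodiment.

```python
# Minimal sketch of the source-selection priority described above.
# All identifiers are hypothetical; the point is only the fallback order:
# (1) a vehicle that can capture the imaging target spot now,
# (2) a vehicle that stores a past captured image of the spot,
# (3) the image management (vehicle management) server.

def choose_image_source(live_vehicles, storing_vehicles, server):
    """Return (kind, source) indicating where the captured image comes from."""
    if live_vehicles:
        # A real-time image is preferred whenever some vehicle can capture
        # the imaging target spot right now.
        return ("live", live_vehicles[0])
    if storing_vehicles:
        # Otherwise fall back to the newest stored image of the spot.
        newest = max(storing_vehicles, key=lambda v: v["captured_at"])
        return ("stored", newest)
    # Finally, fall back to whatever the server holds for that spot.
    return ("server", server)


if __name__ == "__main__":
    kind, source = choose_image_source(
        live_vehicles=[],
        storing_vehicles=[{"id": "vehicle-410", "captured_at": 1700000000}],
        server={"id": "vehicle-management-server"},
    )
    print(kind, source["id"])  # -> stored vehicle-410
```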
Fig. 2 schematically shows an example of the structure of the vehicle 100. The vehicle 100 includes an operation unit 110, a display unit 120, a wireless communication unit 130, an imaging unit 140, a GNSS (Global Navigation Satellite System) receiving unit 150, a sensor unit 160, and a control device 200. At least a part of these structures may be included in a so-called vehicle navigation system.
The operation unit 110 receives an operation performed by a user of the vehicle 100. The operation section 110 may include physical operation buttons. The operation unit 110 and the display unit 120 may be touch panel displays. The operation unit 110 may also receive voice operation. The operation part 110 may include a microphone and a speaker.
The wireless communication unit 130 performs wireless communication with the vehicle management server 300 and other vehicles 100. The wireless communication section 130 may include a communication section that communicates with the network 10 via a wireless base station of a mobile phone network. In addition, the wireless communication section 130 may include a communication means that communicates with the network 10 via a WiFi (registered trademark) access point. In addition, the wireless communication section 130 may include a communication section that performs vehicle-to-vehicle communication. In addition, the wireless communication section 130 may include a communication section that performs road-to-vehicle communication.
The imaging unit 140 includes one or more cameras. The camera may be a drive recorder (dashcam). When the imaging unit 140 includes a plurality of cameras, the plurality of cameras are disposed at different positions of the vehicle 100. In addition, the plurality of cameras capture images in different imaging directions.
The GNSS receiving unit 150 receives radio waves transmitted from GNSS satellites. The GNSS receiving unit 150 may determine the position of the vehicle 100 based on the signals received from the GNSS satellites.
The sensor unit 160 includes one or more sensors. The sensor unit 160 includes, for example, an acceleration sensor. The sensor unit 160 includes, for example, an angular velocity sensor (gyro sensor). The sensor unit 160 includes, for example, a geomagnetic sensor. The sensor unit 160 includes, for example, a vehicle speed sensor.
The control device 200 controls the operation unit 110, the display unit 120, the wireless communication unit 130, the imaging unit 140, the GNSS receiving unit 150, and the sensor unit 160, and executes various processes. The control device 200 executes, for example, navigation processing. The control device 200 may execute navigation processing similar to that executed by a known vehicle navigation system.
For example, the control device 200 specifies the current position of the vehicle 100 based on the outputs from the GNSS receiving unit 150 and the sensor unit 160, reads map data corresponding to the current position, and displays the map data on the display unit 120. Further, the control device 200 receives an input of a destination via the operation unit 110, specifies a recommended route from the current position of the vehicle 100 to the destination, and displays the route on the display unit 120. When the route is selected, the control device 200 guides the vehicle 100 along the selected route via the display unit 120 and the speaker.
The control device 200 according to the present embodiment executes display processing for receiving and displaying a captured image of a capturing target spot designated by a user from another vehicle 100 or the vehicle management server 300. The control device 200 first acquires an imaging target spot designated via the operation unit 110. Next, the control device 200 broadcasts request information for requesting transmission of the captured image of the imaging target point, including position information indicating the imaging target point, to the other vehicle 100. Communication between control device 200 and the outside of vehicle 100 may be performed via wireless communication unit 130. The request information transmitted by the wireless communication unit 130 reaches the plurality of vehicles 100 via at least one of the network 10 and the inter-vehicle network.
When receiving the 1 st response information indicating that the imaging target spot can be captured with respect to the transmitted request information, the control device 200 receives the captured image from the vehicle 100 that transmitted the 1 st response information. When the 1 st response information is not received but the 2 nd response information indicating that a captured image obtained by capturing the imaging target spot is stored is received with respect to the request information, the control device 200 may receive the captured image from the vehicle 100 that transmitted the 2 nd response information. When neither the 1 st response information nor the 2 nd response information is received, the control device 200 may receive the captured image of the imaging target point from the vehicle management server 300. The control device 200 can display the received captured image on the display unit 120.
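The pieces of information exchanged here can be pictured as small records. The following dataclass sketch is only an assumed shape; the field names are hypothetical and simply mirror the contents the description attributes to the request information and to the two kinds of response information.

```python
# Hypothetical message shapes for the exchange described above; every field
# name is an assumption chosen to mirror the contents listed in the text.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class RequestInfo:
    requester_id: str                     # identification information of the requester
    target_position: Tuple[float, float]  # position information of the imaging target spot
    max_wait_minutes: Optional[float] = None   # a "predetermined time", if included
    max_distance_km: Optional[float] = None    # a "predetermined distance", if included


@dataclass
class FirstResponse:
    """1st response information: the sender can capture the imaging target spot."""
    responder_id: str
    position: Tuple[float, float]
    minutes_to_target: Optional[float] = None       # time until reaching the spot
    distance_to_target_km: Optional[float] = None   # distance to the spot


@dataclass
class SecondResponse:
    """2nd response information: the sender stores a past image of the spot."""
    responder_id: str
    position: Tuple[float, float]
    captured_at: float   # imaging time of the stored captured image
```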
Fig. 3 schematically shows an example of the positional relationship of the plurality of vehicles 100. Fig. 3 shows vehicle 100, and vehicles 410, 420, 430, 440, and 450 other than vehicle 100, as an example of a plurality of vehicles 100. Vehicle 410, vehicle 420, vehicle 430, vehicle 440, and vehicle 450 have the same configuration as vehicle 100. When vehicle 410, vehicle 420, vehicle 430, vehicle 440, and vehicle 450 are not distinguished, they may be referred to as other vehicles.
When receiving a designation of an imaging target location from a user via the operation unit 110, for example, the control device 200 of the vehicle 100 broadcasts request information including position information indicating the imaging target location. In the example shown in fig. 3, the request information reaches vehicle 410, vehicle 420, vehicle 430, vehicle 440, and vehicle 450 via at least one of network 10 and the inter-vehicle network.
Another vehicle that has received the request information can determine, by referring to its own position and route information, whether it will pass the imaging target spot 402 and, if it has already passed the imaging target spot 402, whether it stores a captured image obtained by capturing the imaging target spot 402.
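As a rough illustration of that decision, the sketch below shows one way a receiving vehicle might choose between 1st response information, 2nd response information, and no response at all. The helper logic and thresholds are assumptions made for the example, not part of the embodiment.

```python
# Hypothetical decision logic for a vehicle that received the request
# information; all names and thresholds are assumptions for illustration.
import math


def distance_km(a, b):
    # Rough planar approximation of the distance between two lat/lon points;
    # sufficient for an illustrative sketch.
    return math.hypot(a[0] - b[0], a[1] - b[1]) * 111.0


def decide_response(request, planned_route, stored_images, pass_radius_km=0.1):
    """Return ("first", info), ("second", info), or None (no response)."""
    target = request["target_position"]
    if any(distance_km(p, target) <= pass_radius_km for p in planned_route):
        # The planned route passes the imaging target spot: the vehicle can
        # capture it, so it answers with 1st response information.
        return ("first", {"position": planned_route[0]})
    matches = [img for img in stored_images
               if distance_km(img["position"], target) <= pass_radius_km]
    if matches:
        # The vehicle already passed the spot and stores a captured image,
        # so it answers with 2nd response information.
        newest = max(matches, key=lambda img: img["captured_at"])
        return ("second", {"captured_at": newest["captured_at"]})
    return None  # like the vehicle 430 in Fig. 3: no response is sent
```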
In the example shown in Fig. 3, a case will be described where the vehicle 410 has already passed the imaging target spot 402 and stores a captured image obtained by capturing the imaging target spot 402, the vehicle 420 will pass the imaging target spot 402 after receiving the request information, the vehicle 430 will not pass the imaging target spot 402, the vehicle 440 will pass the imaging target spot 402 after receiving the request information, and the vehicle 450 has already passed the imaging target spot 402 and stores a captured image obtained by capturing the imaging target spot 402.
The vehicle 410 transmits, to the vehicle 100, the 2 nd response information including the identification information and the position information of the vehicle 410 and the imaging time at which the imaging target spot was captured. The identification information of the vehicle 410 may be any information capable of identifying the vehicle 410 in communication. The identification information of the vehicle 410 is, for example, an ID or an IP address assigned to the vehicle 410. The vehicle 420 transmits the 1 st response information including the identification information and the position information of the vehicle 420 to the vehicle 100. The vehicle 430 does not transmit any response information. The vehicle 440 transmits the 1 st response information including the identification information and the position information of the vehicle 440 to the vehicle 100. The vehicle 450 transmits, to the vehicle 100, the 2 nd response information including the identification information and the position information of the vehicle 450 and the imaging time at which the imaging target spot was captured.
On condition that the 1 st response information is received, the control device 200 of the vehicle 100 determines to receive the captured image from a vehicle that transmitted the 1 st response information. When there are a plurality of vehicles that transmitted the 1 st response information, the control device 200 selects the vehicle from which to receive the captured image based on the positions of those vehicles. The control device 200 selects, for example, the vehicle closest to the imaging target spot 402. In the example shown in Fig. 3, the vehicle 100 selects the vehicle 420. Then, the vehicle 100 transmits request information requesting the captured image of the imaging target spot 402 to the vehicle 420, and receives the captured image of the imaging target spot 402 from the vehicle 420.
In the example shown in Fig. 3, if the vehicle 420 and the vehicle 440 are not present, the control device 200 of the vehicle 100 receives no 1 st response information but receives the 2 nd response information, and therefore determines to receive the captured image from a vehicle that transmitted the 2 nd response information. When there are a plurality of vehicles that transmitted the 2 nd response information, the control device 200 selects the vehicle from which to receive the captured image according to the imaging time at which the imaging target spot was captured. The control device 200 selects, for example, the vehicle whose imaging time is the latest. In the example shown in Fig. 3, the vehicle 100 selects the vehicle 410. Then, the vehicle 100 transmits request information requesting the captured image of the imaging target spot 402 to the vehicle 410, and receives the captured image of the imaging target spot 402 from the vehicle 410.
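The two selection rules just described, nearest vehicle among the 1st responses and newest stored image among the 2nd responses, can be written compactly. The following sketch assumes each response record carries a position or an imaging time, as in the hypothetical shapes above.

```python
# Hypothetical selection rules on the requesting side, mirroring the text:
# among 1st responses pick the vehicle closest to the imaging target spot;
# among 2nd responses pick the vehicle whose stored image is newest.
import math


def pick_first_responder(first_responses, target_position):
    return min(
        first_responses,
        key=lambda r: math.hypot(r["position"][0] - target_position[0],
                                 r["position"][1] - target_position[1]),
    )


def pick_second_responder(second_responses):
    return max(second_responses, key=lambda r: r["captured_at"])
```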
In the example shown in Fig. 3, when the vehicle 410, the vehicle 420, the vehicle 440, and the vehicle 450 are not present, the control device 200 of the vehicle 100 receives neither the 1 st response information nor the 2 nd response information. In this case, the control device 200 may transmit position information indicating the imaging target spot 402 to the vehicle management server 300 and receive a captured image obtained by capturing the imaging target spot 402 from the vehicle management server 300.
Fig. 4 schematically shows an example of the functional configuration of the control device 200. The control device 200 includes a target point acquisition unit 202, a request information transmission unit 204, a response reception unit 206, a request information transmission unit 208, a captured image reception unit 210, a display control unit 212, and a vehicle information acquisition unit 214. In addition, the control device 200 does not necessarily have all of the above-described configurations.
The target point acquisition unit 202 acquires an imaging target point. The target point acquisition unit 202 can acquire the imaging target point designated via the operation unit 110. The target point acquisition unit 202 acquires, for example, an imaging target point designated via a pointing input received by the operation unit 110. The target point acquisition unit 202 may also acquire an imaging target point designated via a voice input received by the operation unit 110.
The request information transmitting unit 204 broadcasts the request information to the other vehicles 100. The request information transmitting unit 204 may transmit, to the other vehicles 100, request information including identification information for identifying the vehicle 100 on which the control device 200 is mounted (which may be referred to as the own vehicle) and position information indicating the imaging target spot acquired by the target point acquisition unit 202.
The request information transmitting unit 204 may transmit request information including time information indicating a 1 st predetermined time. In this case, a vehicle 100 that has received the request information and will pass the imaging target spot transmits the 1 st response information indicating that it can capture the imaging target spot if it can do so within the 1 st predetermined time, indicated by the time information, from the reception of the request information. The 1 st predetermined time may be set by the user of the vehicle 100 or the like. For example, if the 1 st predetermined time is set to 5 minutes, a vehicle that can capture the imaging target spot only after 5 minutes or more have elapsed may not transmit the 1 st response information.
The request information transmitting unit 204 may transmit request information including time information indicating a 2 nd predetermined time. In this case, a vehicle 100 that has received the request information and stores a captured image obtained by capturing the imaging target spot transmits the 2 nd response information if the imaging time of that captured image falls within the 2 nd predetermined time, indicated by the time information included in the request information, before the reception of the request information. The 2 nd predetermined time may be set by the user of the vehicle 100 or the like. For example, if the 2 nd predetermined time is set to 5 minutes, the 2 nd response information may not be transmitted when 5 minutes or more have elapsed since the imaging target spot was captured. The 1 st predetermined time and the 2 nd predetermined time may be the same or different.
The request information transmitting unit 204 may transmit request information including distance information indicating a 1 st predetermined distance. In this case, a vehicle 100 that has received the request information and will pass the imaging target spot transmits the 1 st response information if its distance to the imaging target spot is shorter than the 1 st predetermined distance indicated by the distance information. The 1 st predetermined distance may be set by the user of the vehicle 100 or the like. For example, if the 1 st predetermined distance is set to 5 km, a vehicle 100 that is more than 5 km away from the imaging target spot may not transmit the 1 st response information.
The request information transmitting unit 204 may transmit request information including distance information indicating a 2 nd predetermined distance. In this case, a vehicle 100 that has received the request information and stores a captured image obtained by capturing the imaging target spot transmits the 2 nd response information if the distance from the position of the vehicle 100 to the position of the imaging target spot is within the 2 nd predetermined distance. For example, if the 2 nd predetermined distance is set to 5 km, the 2 nd response information may not be transmitted when the vehicle 100 has moved 5 km or more away from the imaging target spot after capturing it. The 1 st predetermined distance and the 2 nd predetermined distance may be the same or different.
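Taken together, the four thresholds above act as gates on whether a vehicle responds at all. A hedged sketch, assuming the request carries the thresholds and the vehicle can estimate its own arrival time and distances; all parameter names are hypothetical.

```python
# Hypothetical gating of responses by the thresholds carried in the request
# information; the parameter names are assumptions made for illustration.

def may_send_first_response(minutes_to_target, distance_to_target_km,
                            max_minutes=None, max_distance_km=None):
    """A vehicle that will pass the spot responds only within these limits."""
    if max_minutes is not None and minutes_to_target > max_minutes:
        return False   # e.g. cannot capture the spot within 5 minutes
    if max_distance_km is not None and distance_to_target_km > max_distance_km:
        return False   # e.g. more than 5 km away from the spot
    return True


def may_send_second_response(image_age_minutes, distance_to_spot_km,
                             max_age_minutes=None, max_distance_km=None):
    """A vehicle holding a stored image responds only if the image is recent
    enough and the vehicle has not moved too far from the spot since then."""
    if max_age_minutes is not None and image_age_minutes > max_age_minutes:
        return False   # e.g. the stored image is older than 5 minutes
    if max_distance_km is not None and distance_to_spot_km > max_distance_km:
        return False   # e.g. already more than 5 km away from the spot
    return True
```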
the response receiving unit 206 receives a response to the request information transmitted by the request information transmitting unit 204. The response receiving unit 206 receives, for example, the 1 st response information indicating that the imaging target spot can be imaged. The 1 st response message may include identification information of the vehicle 100 that transmitted the 1 st response message. In addition, the 1 st response information may include time information indicating a time until the vehicle 100 that received the request information reaches the image pickup target point. The 1 st response information may include distance information indicating a distance to the imaging target point of the vehicle 100 that has received the request information.
The response receiving unit 206 receives, for example, the 2 nd response information indicating that the captured image obtained by capturing the image of the image capture target spot is stored. The 2 nd response message may include identification information of the vehicle 100 that transmitted the 2 nd response message. In addition, the 2 nd response information may include an image capturing time for capturing an image of the image capturing target spot.
The request information transmitting unit 208 transmits the request information according to the reception status of the response received by the response receiving unit 206. For example, when the response receiving unit 206 receives the 1 st response information, the request information transmitting unit 208 transmits request information requesting the captured image of the imaging target spot to the vehicle 100 that transmitted the 1 st response information.
when the 1 st response information includes time information, the request information transmitting unit 208 may transmit the request information to the vehicle 100 that transmitted the 1 st response information, on the condition that the time information satisfies a predetermined condition. The predetermined condition may be, for example, that the time indicated by the time information is shorter than a predetermined time. The predetermined condition may be set by a user of the vehicle 100 or the like. The control device 200 may determine that the 1 st response information is not received when the 1 st response information does not satisfy a predetermined condition.
when the 1 st response information includes the distance information, the request information transmitting unit 208 may transmit the request information to the vehicle 100 that transmitted the 1 st response information, on the condition that the distance information satisfies a predetermined condition. The predetermined condition may be, for example, that the distance indicated by the distance information is shorter than a predetermined distance. The predetermined condition may be set by a user of the vehicle 100 or the like. The control device 200 may determine that the 1 st response information is not received when the 1 st response information does not satisfy a predetermined condition.
When there are a plurality of vehicles 100 transmitting the 1 st response information, the request information transmitting unit 208 may select the vehicle 100 transmitting the request information according to the positions of the plurality of vehicles.
When the response receiving unit 206 does not receive the 1 st response information but receives the 2 nd response information, the request information transmitting unit 208 transmits request information requesting the captured image of the imaging target spot to the vehicle 100 that transmitted the 2 nd response information. When the 2 nd response information includes the imaging time for imaging the imaging target point, the request information transmitting unit 208 may transmit the request information to the vehicle that transmitted the 2 nd response information, on the condition that the imaging time satisfies a predetermined condition. The predetermined condition may be, for example, that the imaging time is after a certain time. The predetermined condition may be set by a user of the vehicle 100 or the like. When the 2 nd response message does not satisfy the predetermined condition, the control device 200 may determine that the 2 nd response message is not received. When there are a plurality of vehicles 100 transmitting the 2 nd response information, the request information transmitting unit 208 may select the vehicle 100 transmitting the request information according to the imaging time taken to image the imaging target point.
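On the requesting side, the checks described in the last few paragraphs amount to filtering the received responses before requesting the image. The sketch below assumes the thresholds and response fields introduced in the earlier hypothetical shapes; a response that fails its check is treated as if it had not been received.

```python
# Hypothetical requester-side filtering of responses, as described above;
# the thresholds and field names are assumptions used only for illustration.

def first_response_acceptable(response, max_minutes=None, max_distance_km=None):
    """Treat 1st response information as 'not received' if it fails the check."""
    if max_minutes is not None and response.get("minutes_to_target", 0.0) >= max_minutes:
        return False
    if max_distance_km is not None and response.get("distance_to_target_km", 0.0) >= max_distance_km:
        return False
    return True


def second_response_acceptable(response, earliest_acceptable_time=None):
    """Accept a stored image only if it was captured after a certain time."""
    if earliest_acceptable_time is None:
        return True
    return response["captured_at"] >= earliest_acceptable_time
```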
The request information transmitting unit 208 may transmit request information requesting the captured image of the imaging target point to the vehicle management server 300 when the response receiving unit 206 receives neither the 1 st response information nor the 2 nd response information. The request information transmitting unit 208 may transmit, to the vehicle management server 300, request information requesting the captured image whose imaging time is the latest among the captured images of the imaging target point stored in the vehicle management server 300. The vehicle management server 300 may transmit the captured image whose imaging time is the latest among the captured images of the imaging target point, based on the request information. The vehicle management server 300 may transmit the imaging time of the captured image together with the captured image.
The captured image receiving unit 210 receives the captured image transmitted from the vehicle 100 or the vehicle management server 300, based on the request information transmitted from the request information transmitting unit 208.
The display control unit 212 displays the captured image received by the captured image receiving unit 210. The display control unit 212 can display the captured image on the display unit 120. The display control unit 212 may transmit the captured image to a communication terminal designated in advance and display the captured image on the communication terminal. Examples of the communication terminal include a mobile phone such as a smartphone owned by the user of the vehicle 100, a tablet terminal, and the like.
The display control unit 212 may display the captured image in association with imaging time information indicating the time at which the captured image was captured. For example, when displaying the captured image received from the vehicle 100 that transmitted the 1 st response information, the display control unit 212 displays, together with the captured image, an indication that the captured image is a real-time image. For example, when displaying the captured image received from the vehicle 100 that transmitted the 2 nd response information, the display control unit 212 displays the captured image together with the imaging time included in the 2 nd response information. For example, when displaying the captured image received from the vehicle management server 300, the display control unit 212 may display the captured image together with the imaging time transmitted along with it. This makes it possible for the viewer of the captured image to easily grasp whether the displayed captured image represents the imaging target spot at the current moment or at some point in the past.
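One concrete way to present that distinction is to attach a short caption to the displayed image. The following sketch is only an assumed presentation; the label wording is hypothetical, the point being that the viewer can tell a real-time image from a past one.

```python
# Hypothetical caption composition for the displayed captured image; the
# label wording is an assumption, not taken from the embodiment.
from datetime import datetime


def caption_for(source_kind, captured_at=None):
    if source_kind == "live":
        # Image received from a vehicle that is capturing the spot right now.
        return "LIVE: real-time image of the imaging target spot"
    # Image stored in a vehicle or held by the vehicle management server.
    stamp = datetime.fromtimestamp(captured_at).strftime("%Y-%m-%d %H:%M")
    return f"Captured at {stamp}"


print(caption_for("live"))
print(caption_for("stored", captured_at=1700000000))
```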
The vehicle information acquisition unit 214 acquires vehicle information of other vehicles 100. The vehicle information acquisition unit 214 may receive the vehicle information from the vehicle management server 300 via the network 10. The vehicle information acquisition unit 214 may receive various types of vehicle information from another vehicle 100 via at least one of vehicle-to-vehicle direct communication, road-to-vehicle communication, and the inter-vehicle network. When the target point acquisition unit 202 acquires the imaging target point, the request information transmitting unit 204 may first refer to the vehicle information of the other vehicles 100 acquired by the vehicle information acquisition unit 214 to determine whether there is a vehicle 100 that can capture the imaging target point. If it is determined that such a vehicle 100 exists, the request information transmitting unit 204 may transmit the request information to the vehicle 100 that can capture the imaging target point. If it is determined that no such vehicle exists, the request information transmitting unit 204 may broadcast the request information to the other vehicles 100.
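That pre-check can be pictured as choosing between unicast and broadcast. The sketch below assumes the acquired vehicle information includes each vehicle's planned route; every name and threshold is hypothetical.

```python
# Hypothetical pre-check using vehicle information already acquired by the
# vehicle information acquisition unit: if a known vehicle's route passes
# the imaging target spot, request it directly; otherwise broadcast.
import math


def choose_request_targets(known_vehicles, target_position, radius_km=0.1):
    near = [
        v for v in known_vehicles
        if any(math.hypot(p[0] - target_position[0],
                          p[1] - target_position[1]) * 111.0 <= radius_km
               for p in v.get("route", []))
    ]
    if near:
        return ("unicast", near)   # send the request only to capable vehicles
    return ("broadcast", None)     # no candidate is known: broadcast instead
```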
Fig. 5 schematically shows an example of the flow of processing performed by the control device 200. Fig. 5 shows an example of processing from the reception of the designation of the imaging target spot to the display of the captured image. Each process shown in Fig. 5 is executed mainly by a control unit provided in the control device 200.
In step 102 (hereinafter, "step" may be abbreviated as "S"), when the operation unit 110 receives a designation of an imaging target spot, the target point acquisition unit 202 acquires the designated imaging target spot. In S104, the request information transmitting unit 204 broadcasts request information including the identification information of the own vehicle to the other vehicles 100.
For the request information broadcast in S104, if the response receiving unit 206 receives the 1 st response information within a predetermined time from the transmission of the request information (YES in S106), the process proceeds to S120; if not (NO in S106), the process proceeds to S108. In S108, if the response receiving unit 206 receives the 2 nd response information within a predetermined time from the transmission of the request information (YES in S108), the process proceeds to S114; if not (NO in S108), the process proceeds to S110.
In S110, the request information transmitting unit 208 transmits the request information to the vehicle management server 300, and the captured image receiving unit 210 receives the captured image of the imaging target point from the vehicle management server 300. In S112, the display control unit 212 causes the display unit 120 to display the captured image.
In S114, the request information transmitting unit 208 transmits the request information to the vehicle 100 that transmitted the 2 nd response information, and the captured image receiving unit 210 receives the captured image of the imaging target spot from that vehicle 100. In S118, the display control unit 212 causes the display unit 120 to display the captured image.
In S120, the request information transmitting unit 208 transmits the request information to the vehicle 100 that transmitted the 1 st response information. In S122, the control device 200 waits for that vehicle 100 to start imaging the imaging target point. When starting imaging of the imaging target point, the vehicle 100 may notify the control device 200 of that fact. When imaging has started (YES in S122), the process proceeds to S124.
In S124, the captured image receiving unit 210 receives a captured image captured by the vehicle 100 while traveling toward the imaging target point. In S126, the display control unit 212 causes the display unit 120 to display the captured image. In S128, it is determined whether the imaging of the imaging target point by the vehicle 100 has ended.
If it is determined in S128 that imaging has not been completed, the process returns to S124, and the captured image is received and displayed. If it is determined in S128 that imaging has ended, the process ends.
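The whole flow of Fig. 5 can be summarized in a few lines of control logic. In the sketch below the collaborators are passed in as plain callables so that only the branching of S104 to S128 is shown; every function name is an assumption made for illustration.

```python
# Hypothetical end-to-end sketch of the flow in Fig. 5 (S104 to S128).
# The collaborators are injected as callables; none of these names come
# from the embodiment, they only make the branching explicit.

def run_display_flow(target_spot, broadcast, wait_first, wait_second,
                     stream_from_vehicle, fetch_stored_image,
                     fetch_from_server, display):
    broadcast(target_spot)                        # S104: broadcast the request information
    first = wait_first()                          # S106: 1st response within the time limit?
    if first is not None:
        for frame in stream_from_vehicle(first):  # S120 to S128: receive images while the
            display(frame)                        # vehicle images the target spot (S124, S126)
        return
    second = wait_second()                        # S108: 2nd response within the time limit?
    if second is not None:
        display(fetch_stored_image(second))       # S114, S118: stored captured image
        return
    display(fetch_from_server(target_spot))       # S110, S112: vehicle management server
```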
Fig. 6 schematically shows an example of a computer 1000 that functions as the control device 200. The computer 1000 according to the present embodiment includes: a CPU peripheral portion having a CPU1010, a RAM1030, and a graphics controller 1085 connected to each other through a host controller 1092; and an input/output unit having a ROM1020 connected to the host controller 1092 via an input/output controller 1094, a communication I/F1040, a hard disk drive 1050, and an input/output chip 1080.
The CPU1010 operates according to programs stored in the ROM1020 and the RAM1030 and controls each unit. The graphics controller 1085 acquires image data that the CPU1010 or the like generates in a frame buffer provided in the RAM1030, and displays the image data on a display. Alternatively, the graphics controller 1085 may itself include a frame buffer that stores the image data generated by the CPU1010 or the like.
The communication I/F1040 communicates with other apparatuses via a network by wire or wirelessly. The communication I/F1040 functions as hardware for performing communication. The hard disk drive 1050 stores programs and data used by the CPU 1010.
The ROM1020 stores a boot program executed when the computer 1000 is started, a program dependent on hardware of the computer 1000, and the like. The input/output chip 1080 connects various input/output devices to the input/output controller 1094 via, for example, a parallel port, a serial port, a keyboard port, a mouse port, or the like.
A program to be supplied to the hard disk drive 1050 is stored in a recording medium such as an IC card and provided by the user. The program is read from the recording medium, installed in the hard disk drive 1050 via the RAM1030, and executed by the CPU 1010.
A program installed in the computer 1000 to cause the computer 1000 to function as the control device 200 may act on the CPU1010 or the like, and cause the computer 1000 to function as each unit of the control device 200. The information processing described in these programs is read by the computer 1000 and functions as specific means in which software cooperates with the various hardware resources described above, namely the target point acquisition unit 202, the request information transmitting unit 204, the response receiving unit 206, the request information transmitting unit 208, the captured image receiving unit 210, the display control unit 212, and the vehicle information acquisition unit 214. By these specific means, computation or processing of information corresponding to the intended use of the computer 1000 in the present embodiment is realized, and a control device 200 unique to that intended use is constructed.
In the above-described embodiment, the description has been given by taking the control device 200 mounted on the vehicle 100 as an example of the display control device, but the present invention is not limited to this, and for example, a communication terminal owned by a user who is riding on the vehicle 100 may function as the display control device.
Fig. 7 schematically shows an example of a functional configuration of communication terminal 500. The communication terminal 500 includes a target point acquisition unit 502, a request information transmission unit 504, a response reception unit 506, a request information transmission unit 508, a captured image reception unit 510, a display control unit 512, and a vehicle information acquisition unit 514. Here, the point that the processing content is different from the control device 200 shown in fig. 4 will be mainly described.
The target point acquisition unit 502 acquires an imaging target point. The target point acquisition unit 502 can acquire, for example, an imaging target point that has been designated by the map application.
The request information transmitting unit 504 broadcasts the request information to the other vehicles 100. The request information transmitting unit 504 may transmit, to the other vehicles 100, request information including identification information for identifying the communication terminal 500 and position information indicating the imaging target spot acquired by the target point acquisition unit 502.
The response receiving unit 506 receives a response to the request information transmitted by the request information transmitting unit 504. The request information transmitting unit 508 transmits the request information according to the reception status of the response received by the response receiving unit 506. The captured image receiving unit 510 receives the captured image transmitted from the vehicle 100 or the vehicle management server 300, based on the request information transmitted from the request information transmitting unit 508. The display control unit 512 displays the captured image received by the captured image receiving unit 510. The display control unit 512 can display the captured image on a display provided in the communication terminal 500.
Fig. 8 shows an example of a hardware configuration of a computer 1100 that functions as the communication terminal 500. The computer 1100 according to this embodiment includes an SoC1110, a main memory 1122, a flash memory 1124, an antenna 1132, an antenna 1134, an antenna 1136, a display 1140, a microphone 1142, a speaker 1144, a USB port 1152, and a card slot 1154.
The SoC1110 operates according to programs stored in the main memory 1122 and the flash memory 1124 and controls each unit. The antenna 1132 is a so-called cellular antenna. The antenna 1134 is a so-called WiFi (registered trademark) antenna. The antenna 1136 is an antenna for so-called short-range wireless communication such as Bluetooth (registered trademark). The SoC1110 may implement various communication functions using the antenna 1132, the antenna 1134, and the antenna 1136. The SoC1110 may receive a program used by the SoC1110 via the antenna 1132, the antenna 1134, or the antenna 1136 and store the program in the flash memory 1124.
The SoC1110 may implement various display functions using the display 1140. The SoC1110 may implement various voice input functions using a microphone 1142. The SoC1110 may implement various sound output functions using a speaker 1144.
The USB port 1152 implements a USB connection. The card slot 1154 realizes connection with various cards such as an SD card. The SoC1110 may receive programs used by the SoC1110 from a machine or a memory connected to the USB port 1152 and a card connected to the card slot 1154, and store the programs in the flash memory 1124.
A program that is installed in the computer 1100 and causes the computer 1100 to function as the communication terminal 500 may act on the SoC1110 and the like, and cause the computer 1100 to function as each unit of the communication terminal 500. The information processing described in these programs is read by the computer 1100 and functions as specific means in which software cooperates with the various hardware resources described above, namely the target point acquisition unit 502, the request information transmitting unit 504, the response receiving unit 506, the request information transmitting unit 508, the captured image receiving unit 510, the display control unit 512, and the vehicle information acquisition unit 514. By these specific means, computation or processing of information corresponding to the intended use of the computer 1100 in the present embodiment is realized, and a communication terminal 500 unique to that intended use is constructed.
The present invention has been described above with reference to the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. Those skilled in the art will appreciate that various modifications and improvements can be made to the above-described embodiments. It is apparent from the description of the claims that embodiments to which such modifications or improvements are made can also be included in the technical scope of the present invention.
It should be noted that the execution order of the operations, procedures, steps, stages, and the like in the apparatus, system, program, and method shown in the claims, the specification, and the drawings may be any order unless it is explicitly indicated by "before", "prior to", or the like, or unless the output of a preceding process is used in a subsequent process. Even if "first", "next", and the like are used for convenience of description in the operation flows in the claims, the specification, and the drawings, this does not mean that the operations must be performed in this order.

Claims (12)

1. A display control device is provided with:
A target point acquisition unit that acquires an imaging target point;
A captured image receiving unit that, when a vehicle capable of capturing the imaging target spot is present, receives a captured image of the imaging target spot captured by the vehicle from the vehicle, and that, when no vehicle capable of capturing the imaging target spot is present, receives the captured image from a vehicle that stores a captured image obtained by capturing the imaging target spot;
and a display control unit that displays the captured image received by the captured image receiving unit.
2. The display control apparatus according to claim 1,
The captured image receiving unit receives a captured image of the imaging target spot from an image management server that manages captured images captured by a plurality of vehicles when there is no vehicle that can capture the imaging target spot and the captured image cannot be received from a vehicle that stores the captured image obtained by capturing the imaging target spot.
3. The display control apparatus according to claim 1 or 2,
The display control unit displays imaging time information indicating a time at which the captured image is captured, in association with the captured image.
4. The display control apparatus according to any one of claims 1 to 3, comprising a request information transmitting unit,
The request information transmitting unit broadcasts request information including position information indicating the imaging target location,
The captured image receiving unit receives a captured image of the imaging target spot captured by the vehicle that transmitted the 1 st response information when the 1 st response information indicating that the imaging target spot can be captured is received with respect to the request information.
5. The display control apparatus according to claim 4,
The 1 st response information includes time information indicating a time until the vehicle having received the request information reaches the image pickup object point,
The captured image receiving unit receives the captured image from the vehicle that transmitted the 1 st response information when the time information satisfies a predetermined condition.
6. The display control device according to claim 4, wherein
the 1st response information includes distance information indicating a distance from the vehicle that received the request information to the imaging target point, and
the captured image receiving unit receives the captured image from the vehicle that transmitted the 1st response information when the distance information satisfies a predetermined condition.
7. The display control device according to claim 4, wherein
the request information transmitting unit transmits request information including time information indicating a predetermined time, and
the captured image receiving unit, when 1st response information indicating that the imaging target point can be captured before the predetermined time elapses from the time point at which the request information is received is received in response to the request information, receives the captured image of the imaging target point captured by the vehicle that transmitted the 1st response information.
8. The display control device according to claim 4, wherein
the request information transmitting unit transmits request information including distance information indicating a predetermined distance, and
the captured image receiving unit, when 1st response information indicating that the distance to the imaging target point is shorter than the predetermined distance is received in response to the request information, receives the captured image of the imaging target point captured by the vehicle that transmitted the 1st response information.
9. The display control device according to claim 4, wherein
the captured image receiving unit receives the captured image from the vehicle that transmitted 2nd response information when the 1st response information is not received and the 2nd response information, indicating that a captured image obtained by capturing the imaging target point is stored, is received in response to the request information.
10. The display control device according to claim 9, wherein
the 2nd response information includes imaging time information indicating a time at which the stored captured image of the imaging target point was captured, and
the captured image receiving unit receives the captured image from the vehicle that transmitted the 2nd response information when the imaging time information satisfies a predetermined condition.
11. The display control device according to claim 9, wherein
the request information transmitting unit transmits request information including the position information and time information indicating a predetermined time, and
the captured image receiving unit receives the captured image from the vehicle that transmitted the 2nd response information when the 2nd response information, indicating that a captured image obtained by capturing the imaging target point within the predetermined time before the time point at which the request information was received is stored, is received in response to the request information.
12. A computer-readable storage medium storing a program,
the program causing a computer to function as the display control device according to any one of claims 1 to 11.
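To make the fallback order defined across claims 1, 2, 4, and 9 easier to follow, the following sketch gives one illustrative reading of the claimed selection logic. It is an interpretation for illustration only, not code from the disclosure; all identifiers, thresholds, and response fields (kind, eta_s, age_s, request_image_from_server) are hypothetical.

```python
# Illustrative reading of the claim structure; all identifiers are hypothetical.

def obtain_captured_image(terminal, target_point, max_wait_s=60, max_age_s=600):
    """Return a captured image of the target point, following the claimed fallback order."""
    terminal.transmit_request_information(target_point, time_limit_s=max_wait_s)
    responses = terminal.receive_responses()

    # Claims 4-8: prefer a vehicle that reports it can capture the point,
    # subject to a predetermined time or distance condition.
    for r in responses:
        if r["kind"] == "can_capture" and r.get("eta_s", 0) <= max_wait_s:
            return terminal.receive_captured_image(r["vehicle_id"])

    # Claims 9-11: otherwise accept a stored image that is recent enough.
    for r in responses:
        if r["kind"] == "stored_image" and r.get("age_s", 0) <= max_age_s:
            return terminal.receive_captured_image(r["vehicle_id"])

    # Claim 2: finally fall back to the image management server.
    return terminal.request_image_from_server(target_point)
```

The ordering mirrors the claims: a live capture is preferred, a sufficiently recent stored image is the first fallback, and the image management server is the last resort.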
CN201910434601.6A 2018-06-11 2019-05-23 display control device and computer-readable storage medium Pending CN110580806A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-111467 2018-06-11
JP2018111467A JP7026003B2 (en) 2018-06-11 2018-06-11 Display control device and program

Publications (1)

Publication Number Publication Date
CN110580806A true CN110580806A (en) 2019-12-17

Family

ID=68810907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910434601.6A Pending CN110580806A (en) 2018-06-11 2019-05-23 display control device and computer-readable storage medium

Country Status (2)

Country Link
JP (1) JP7026003B2 (en)
CN (1) CN110580806A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3125600A1 (en) * 2021-07-21 2023-01-27 Psa Automobiles Sa Method and device for aiding navigation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015161592A (en) 2014-02-27 2015-09-07 パイオニア株式会社 Navigation device, communication device, server device, control method, program, and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120176500A1 (en) * 2003-06-12 2012-07-12 Denso Corporation Image server, image deliver based on image information and condition, and image display terminal
CN101228785A (en) * 2005-07-26 2008-07-23 松下电器产业株式会社 Image data management device and image data management method
CN101516058A (en) * 2008-02-22 2009-08-26 富士通株式会社 Image management apparatus
CN102568240A (en) * 2010-11-15 2012-07-11 株式会社电装 Traffic Information System, Traffic Information Acquisition Device And Traffic Information Supply Device
JP2015070350A (en) * 2013-09-27 2015-04-13 日産自動車株式会社 Monitor image presentation system
CN105741535A (en) * 2016-03-10 2016-07-06 江苏南亿迪纳数字科技发展有限公司 Real time road condition on-demand method and system based on image or video

Also Published As

Publication number Publication date
JP7026003B2 (en) 2022-02-25
JP2019215638A (en) 2019-12-19

Similar Documents

Publication Publication Date Title
CN110519555B (en) Display control device and computer-readable storage medium
US10997853B2 (en) Control device and computer readable storage medium
US11322026B2 (en) Control device and computer readable storage medium
CN110706497B (en) Image processing apparatus and computer-readable storage medium
CN110620901B (en) Control device and computer-readable storage medium
CN110581981B (en) Display control device and computer-readable storage medium
CN110730326B (en) Imaging system, imaging device, communication terminal, and computer-readable storage medium
JP7026003B2 (en) Display control device and program
CN110782686B (en) Control device and computer-readable storage medium
CN110782685B (en) Display control device and computer-readable storage medium
CN110576793B (en) Display control device and computer-readable storage medium
JP2006031583A (en) On-vehicle system and remote observation system
JP7016773B2 (en) Display control device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20191217)