
WO2024195154A1 - Image system and imaging apparatus - Google Patents


Info

Publication number
WO2024195154A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
images
camera
imaging
captured
Prior art date
Application number
PCT/JP2023/033342
Other languages
French (fr)
Japanese (ja)
Inventor
哲也 伊藤
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気
Priority to JP2025508109A (published as JPWO2024195154A1)
Publication of WO2024195154A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B61RAILWAYS
    • B61KAUXILIARY EQUIPMENT SPECIALLY ADAPTED FOR RAILWAYS, NOT OTHERWISE PROVIDED FOR
    • B61K13/00Other auxiliaries or accessories for railways
    • B61K13/04Passenger-warning devices attached to vehicles; Safety devices for preventing accidents to passengers when entering or leaving vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B61RAILWAYS
    • B61LGUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
    • B61L25/00Recording or indicating positions or identities of vehicles or trains or setting of track apparatus
    • B61L25/02Indicating or recording positions or identities of vehicles or trains
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to an imaging system.
  • IP-based CCTV systems encode images captured by IP cameras in H.264, MPEG-4, JPEG, or a similar format; a server or similar device then decodes the received images and displays them on a monitor.
  • Compared with analog CCTV systems, IP-based CCTV systems allow the image resolution and frame rate to be selected, and can transfer multiple images over a single cable, which simplifies wiring. For this reason, IP-based CCTV systems are used in on-board surveillance systems for railways, where installation space is limited. On-board surveillance systems use surveillance cameras to check the safety of multiple boarding and alighting doors in order to reduce the number of staff required for train operation. To this end, the display area of a single monitor is divided and surveillance images of the multiple doors are displayed in the divided areas.
  • Patent Document 1 JP 2018-113602 A describes a surveillance system in which a camera that captures at least the vicinity of the doors of the cars that make up the train and a monitor that displays the camera images captured by the camera are mounted on the train, the camera is disposed for each of a number of doors provided on both sides of the cars, the display area of the monitor is divided into a number of areas, and a display control unit is provided that controls the allocation of the camera images to each area based on information indicating the direction in which the train is traveling and information indicating the side on which the door opens.
  • In-vehicle surveillance systems use multiple cameras to capture surveillance images, but because each camera only outputs one image, nearby objects may be framed out of the field of view, or distant objects may be small and difficult to see.
  • the display area of the surveillance monitor installed in the driver's seat is divided to display images of multiple boarding and alighting doors in each area, and passengers getting on and off near the doors when the vehicle is stopped are monitored.
  • a single-chip camera with a single image sensor is usually attached to the side of the vehicle, and when the vehicle is long or when monitoring multiple doors, objects close to the camera may be framed out of the field of view or distant objects may appear small, making it difficult to obtain the necessary information.
  • the present invention aims to provide an imaging system that can accurately monitor both areas close to the camera and areas far away.
  • An image system comprising a camera that captures images and a display device that displays the images captured by the camera, the camera having an image sensor that captures images and an image processing section that processes the images captured by the image sensor, the image processing section being characterized in that it outputs a number of images with different capture ranges.
  • the camera has a plurality of the image sensors and a plurality of lenses provided in front of each of the plurality of image sensors, the plurality of lenses having different focal lengths so as to have different shooting ranges, and the image processing unit outputs a plurality of images having different angles of view captured by the plurality of image sensors.
  • the display device displays multiple images output from the camera with different shooting ranges side by side.
  • the display device selectively displays one of a number of images output from the camera and having different shooting ranges.
  • the image processing unit is characterized in that it synthesizes multiple images with different shooting ranges into a single image and outputs the image.
  • An example of an imaging device is an imaging device that captures images, and includes an imaging element that captures images, and an image processing unit that processes the images captured by the imaging element, and the image processing unit is characterized in that it outputs multiple images with different capture ranges.
  • a camera outputs multiple images with different shooting ranges, allowing a wider shooting range of nearby subjects and allowing distant subjects to be captured in large size.
  • FIG. 1 is a diagram showing an overview of an image system according to an embodiment of the present invention.
  • FIG. 2 is a diagram showing the configuration of the image system of this embodiment.
  • FIG. 3 is a diagram showing the configuration of a camera of this embodiment.
  • FIG. 4 is a diagram showing an image displayed on a surveillance monitor in a conventional image system.
  • FIG. 5 is a diagram showing an image captured by the camera of the leading car in a conventional image system.
  • FIG. 6 is a diagram showing an image displayed on the surveillance monitor in the image system of this embodiment.
  • FIG. 7 is a diagram showing a telephoto image captured by the camera of the leading car in the image system of this embodiment.
  • FIG. 8 is a diagram showing a wide-angle image captured by the camera of the leading car in the image system of this embodiment.
  • FIG. 1 shows an overview of an image system according to an embodiment of the present invention.
  • the image system of this embodiment is implemented in a three-car train consisting of a leading car 1, a middle car 2, and a trailing car 3.
  • Each of the leading car 1, middle car 2, and trailing car 3 has two doors 7-12 on one side, and passengers board and disembark through these doors 7-12.
  • Cameras 4-6 are provided on the side of each of the cars 1-3 to monitor passengers boarding and disembarking through the doors 7-12. As shown in FIG. 1, in this embodiment, one camera is provided per car, and this one camera monitors passengers boarding and disembarking through the two doors.
  • the cameras 4-6 are provided near the leading end of each car, but they may also be provided near the trailing end.
  • Surveillance monitors 13, 14 are provided in the driver's cabs of the leading car 1 and the trailing car 3.
  • the surveillance monitors 13, 14 are display devices that display images captured by the cameras 4-6.
  • the crew can check the images captured by the cameras 4-6 on the surveillance monitors 13, 14.
  • passengers 31-35 are present on the platform where the train is stopped.
  • FIG. 2 shows the configuration of the image system in this embodiment.
  • the leading car 1 is equipped with an image server 19, a recording device 18, a higher-level server 20, a surveillance monitor 13, and a camera 4, and these devices are connected by a network device (e.g., an L2 switch) 15.
  • the network connecting these devices may be configured as a wired network, or may be configured as a wireless network in part or in whole.
  • the image server 19 is a control device that performs image processing such as switching between images captured by the cameras 4 to 6 and displaying them on the surveillance monitors 13 and 14.
  • the recording device 18 records the images captured by the cameras 4 to 6 on a non-volatile recording medium as needed. Crew members can display the images recorded on the recording device 18 by operating the operation panels provided on the surveillance monitors 13 and 14. The images recorded on the recording device 18 are used to identify the cause of any trouble that may occur.
  • the higher-level server 20 is a control device that controls the operation of the train.
  • the image server 19 obtains the train status (e.g., running status, door open/close status) from the higher-level server 20.
  • when the image server 19 receives a door-open signal from the higher-level server 20, it controls the surveillance monitors 13 and 14 to display images of the doors.
  • when the image server 19 receives a departure signal (an over-5 km/h signal) from the higher-level server 20, it erases the images of the doors from the surveillance monitors 13 and 14.
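The display control described above can be sketched as a small state machine. The class and method names below are assumptions for illustration; only the door-open trigger and the over-5 km/h departure threshold come from the text.

```python
class ImageServer:
    """Sketch of image server 19's display control (names are illustrative)."""

    def __init__(self, monitors):
        self.monitors = monitors      # e.g. surveillance monitors 13 and 14
        self.showing_doors = False    # whether door images are on screen

    def on_door_open(self):
        # Door-open signal from the higher-level server: show door images.
        self.showing_doors = True

    def on_speed(self, km_h):
        # Departure signal (over 5 km/h): erase the door images.
        if km_h > 5:
            self.showing_doors = False

server = ImageServer(["monitor 13", "monitor 14"])
server.on_door_open()
print(server.showing_doors)   # True
server.on_speed(12)
print(server.showing_doors)   # False
```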
  • the surveillance monitor 13 has a display unit that displays images captured by the cameras 4 to 6, and an operation unit for issuing instructions to the image server 19.
  • the operation unit may be configured as a touch panel, and the operation unit and display unit may be configured as one unit.
  • the surveillance monitor 13 is not limited to a display device provided in the driver's cab, and may be a portable terminal device carried by the crew.
  • Camera 4 is installed on the side of the leading car 1 and captures images of passengers getting on and off through two doors 7 and 8. The configuration of camera 4 will be described later with reference to FIG. 3.
  • the rear vehicle 3 is equipped with a surveillance monitor 14 and a camera 6, and these devices are connected by a network device (e.g., an L2 switch) 17.
  • the network connecting these devices may be configured as a wired network, or may be configured as a wireless network in part or in whole.
  • Surveillance monitor 14 has the same configuration as surveillance monitor 13, and includes a display unit that displays images captured by cameras 4 to 6, and an operation unit for issuing instructions to image server 19.
  • the operation unit may be configured as a touch panel, and the operation unit and display unit may be integrated.
  • Surveillance monitor 14 is not limited to a display device provided in the driver's cab, and may be a portable terminal device carried by the crew.
  • Camera 6 is installed on the side of the rear car 3 and captures images of passengers getting on and off through the two doors 11, 12.
  • the configuration of camera 6 is the same as that of camera 4, and will be described later with reference to FIG. 3.
  • a camera 5 is provided on the intermediate car 2, and the camera 5 is connected to a network device (e.g., an L2 switch) 16.
  • Camera 5 is installed on the side of intermediate car 2 and captures images of passengers getting on and off through two doors 9, 10.
  • the configuration of camera 5 is the same as that of camera 4, and will be described later with reference to FIG. 3.
  • Network devices 15, 16, and 17 are connected via communication path 50 to form an in-train network.
  • FIG. 3 is a diagram showing the configuration of camera 4 in this embodiment. Camera 4 will be described in FIG. 3, but cameras 5 and 6 have the same configuration.
  • camera 4 is a two-chip camera that can output images via a network.
  • Camera 4 has two light receiving windows 81, 82, and inside each of the light receiving windows 81, 82 are lenses 83, 84 with different characteristics (e.g. different focal lengths).
  • Lens 83 is a telephoto lens with a field of view in the range of 95 to 92, and captures door 8 at a long distance.
  • Lens 84 is a wide-angle lens with a field of view in the range of 93 to 94, and captures door 7 at a close distance. Note that although lenses 83, 84 are illustrated as single convex lenses, in reality they are composed of a combination of multiple lenses.
  • Image sensors 85 and 86 capture images with different shooting ranges.
  • Image sensors 85 and 86 generate raw image data (e.g., RAW data) through photoelectric conversion.
  • Signal processors 87 and 88 perform signal processing such as noise removal and edge processing to clarify contours from the raw image data, and generate image data.
  • the image processing unit 89 cuts out the necessary areas from the image data generated by the signal processing units 87 and 88, or combines multiple images into one image, and inputs the generated image to the encoder unit 90. For example, it generates image 37 displayed in area 25 shown in FIG. 7, which is a cut-out of the area around the door, or image 38 displayed in area 26 shown in FIG. 8, or it combines images 37 and 38 to generate one image.
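As a rough illustration of the cut-out-and-combine step described above, the following Python sketch crops a door region from each of the two processed frames and joins them into a single frame for the encoder unit. The frame sizes and crop coordinates are assumptions for illustration; the camera's actual processing is not disclosed at this level of detail.

```python
import numpy as np

def crop(frame, top, left, height, width):
    """Cut a rectangular region (e.g. the area around a door) out of a frame."""
    return frame[top:top + height, left:left + width]

def combine_side_by_side(img_a, img_b):
    """Join two equally sized crops into one frame for a single encoder stream."""
    assert img_a.shape == img_b.shape
    return np.concatenate([img_a, img_b], axis=1)

# Hypothetical 1080p frames from the two signal processors (87 and 88).
tele = np.zeros((1080, 1920, 3), dtype=np.uint8)
wide = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Cut out the door regions (coordinates are illustrative only).
door_far = crop(tele, 300, 600, 480, 640)   # cf. area 25 / image 37
door_near = crop(wide, 300, 200, 480, 640)  # cf. area 26 / image 38

combined = combine_side_by_side(door_far, door_near)
print(combined.shape)  # (480, 1280, 3)
```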
  • the encoder unit 90 converts the image data into a predetermined data format (e.g., JPEG/H.264) and outputs it to the network via the physical interface 91.
  • the image processing unit 89 receives image data from the signal processing unit 87 and image data from the signal processing unit 88, i.e., two images with different shooting ranges, and controls whether to output only the image from the signal processing unit 87, only the image from the signal processing unit 88, or both images, depending on the user's operation on the surveillance monitors 13, 14. Furthermore, when outputting two images, the image processing unit 89 can select whether to output them as two independent images or to combine the two images and output them as one image. Also, since the two images are output from the same address, it is advisable to assign a different port number to the transmission of each output image. Door identification information can also be added to each of the two image data so that it is possible to identify which door is shown in each image.
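The per-stream port assignment and door-identification metadata mentioned above might look like the following sketch. The IP address, port numbers, and header format are all hypothetical; only the idea of one address with distinct ports and per-image door IDs comes from the text.

```python
# Hypothetical per-stream configuration for one camera: both streams share the
# camera's single address, so each output image gets its own port, plus
# door-identification metadata so the monitor knows which door it shows.
CAMERA_IP = "192.168.10.4"  # illustrative address for camera 4

streams = {
    "tele": {"addr": (CAMERA_IP, 50001), "door_id": 8},  # telephoto: far door
    "wide": {"addr": (CAMERA_IP, 50002), "door_id": 7},  # wide-angle: near door
}

def stream_header(name):
    """Build a small metadata header sent with each encoded image."""
    s = streams[name]
    return {"src": "%s:%d" % s["addr"], "door": s["door_id"]}

print(stream_header("tele"))  # {'src': '192.168.10.4:50001', 'door': 8}
```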
  • Although cameras 4, 5, and 6 are described above as two-chip cameras having two image sensors 85 and 86, it is sufficient that cameras 4, 5, and 6 output images with different angles of view.
  • cameras 4, 5, and 6 may each have one high-resolution image sensor, generate multiple images with different shooting ranges (e.g., two images with different angles of view) from an image captured by that image sensor using digital zoom processing, and output the multiple images generated.
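The single-sensor variant could derive the two angles of view by digital zoom, for example as in this sketch. Nearest-neighbour resampling stands in for the camera's scaler, and the resolutions and 2x zoom factor are assumptions.

```python
import numpy as np

def digital_zoom(frame, zoom, out_h, out_w):
    """Crop the centre 1/zoom of the frame and resample it to the output
    size (nearest-neighbour, as a stand-in for the camera's scaler)."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    top, left = (h - ch) // 2, (w - cw) // 2
    centre = frame[top:top + ch, left:left + cw]
    ys = np.arange(out_h) * ch // out_h   # source row for each output row
    xs = np.arange(out_w) * cw // out_w   # source column for each output column
    return centre[ys][:, xs]

# Hypothetical 4K sensor frame; two 1080p outputs with different angles of view.
sensor = np.zeros((2160, 3840, 3), dtype=np.uint8)
wide = digital_zoom(sensor, zoom=1.0, out_h=1080, out_w=1920)  # full field of view
tele = digital_zoom(sensor, zoom=2.0, out_h=1080, out_w=1920)  # 2x "telephoto"
print(wide.shape, tele.shape)  # (1080, 1920, 3) (1080, 1920, 3)
```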
  • Optical zoom, which moves a lens inside the lens barrel, cannot be used in in-vehicle image systems, since these are required to be vibration-resistant. It is therefore preferable to generate the two images by digital zoom rather than by changing the optical zoom magnification to capture the two images in turn.
  • In FIG. 3, the configuration of camera 4 is illustrated along the flow of the image signal, but camera 4 also has a control unit (logic and a CPU) that is not illustrated.
  • the control unit controls the operation of the image sensors 85 and 86, the signal processing units 87 and 88, the image processing unit 89, and the encoder unit 90.
  • a series including a lens 83, an image sensor 85, and a signal processor 87 and a series including a lens 84, an image sensor 86, and a signal processor 88 are arranged one above the other, but since the two series will be upside down when mounted on the opposite side of the vehicle, the two series may be arranged horizontally.
  • the camera 4 has a light receiving window 81 facing to the right and a light receiving window 82 facing diagonally downward to the right, but the positions of the light receiving windows can be adjusted as desired depending on the positions of the camera 4 and the door to be monitored.
  • the camera 4 outputs the two images acquired by the signal processing units 87 and 88 as one or two images.
  • cameras 5 and 6 have the same configuration. That is, camera 5 captures an image of door 10 at a long distance with lens 83, and captures an image of door 9 at a close distance with lens 84. Similarly, camera 6 captures an image of door 12 at a long distance with lens 83, and captures an image of door 11 at a close distance with lens 84.
  • FIG. 4 is a diagram showing an image displayed on a monitor in a conventional imaging system
  • FIG. 5 is a diagram showing an image captured by the camera in the leading car 1 in the conventional imaging system
  • FIG. 6 is a diagram showing an image displayed on a monitor in the imaging system of this embodiment
  • FIG. 7 is a diagram showing a telephoto image captured by the camera 4 in the leading car 1 in the imaging system of this embodiment
  • FIG. 8 is a diagram showing a wide-angle image captured by the camera 4 in the leading car 1 in the imaging system of this embodiment.
  • In a conventional image system, images taken by each camera are displayed side by side on the monitor of the leading car 1 or the trailing car 3.
  • the monitor screen is divided into three parts as shown in FIG. 4, and image 21 taken by the camera of the leading car 1, image 22 taken by the camera of the middle car 2, and image 23 taken by the camera 6 of the trailing car 3 are displayed side by side.
  • Image 21 of the leading car 1 in the three-part display is image 36, taken by the camera of the leading car 1 and shown in FIG. 5, cropped to area 24.
  • the image of leading car 1 displayed on the monitor of the conventional image system shows passengers 31 and 32 getting on through door 7, and passenger 33 getting on through door 8.
  • the image of middle car 2 shows no passengers.
  • the image of rear car 3 shows passenger 34 getting off through door 11, and passenger 35 getting off through door 12.
  • the lower bodies of passengers 31 and 32 near door 7 and of passenger 34 near door 11 fall in a blind spot outside the image, from the waist down.
  • passenger 33 near door 8 and passenger 35 near door 12 appear small and are difficult to see.
  • In the image system of this embodiment, the images taken by the cameras 4 to 6 are displayed side by side on the monitoring monitor 13 of the leading car 1 or the monitoring monitor 14 of the trailing car 3.
  • the screen of the monitoring monitors 13 and 14 is divided into six parts as shown in FIG. 6.
  • a telephoto image 40 taken by the camera 4 of the leading car 1, a wide-angle image 41 taken by the camera 4 of the leading car 1, a telephoto image 42 taken by the camera 5 of the middle car 2, a wide-angle image 43 taken by the camera 5 of the middle car 2, a wide-angle image 44 taken by the camera 6 of the trailing car 3, and a telephoto image 45 taken by the camera 6 of the trailing car 3 are displayed side by side.
  • the image 40 displayed on the six-split screen is the telephoto image 37 taken by the camera 4 of the leading car 1 shown in FIG. 7, cropped to area 25, and the image 41 is the wide-angle image 38 taken by the camera 4 of the leading car 1 shown in FIG. 8, cropped to area 26.
  • the surveillance monitors 13 and 14 can display one image selected by the driver's operation, as shown in FIG. 7 and FIG. 8.
  • Camera 4 cuts out image 37 at area 25 and image 38 at area 26, combines the two images into one image, and transfers it to surveillance monitors 13 and 14 via the network.
  • the other cameras 5 and 6 process the images they capture in the same way, and transfer the combined image to surveillance monitors 13 and 14 via the network.
  • Surveillance monitors 13 and 14 display the image shown in FIG. 6 by arranging the three images received from cameras 4, 5, and 6. Note that while an example has been described in which cameras 4, 5, and 6 cut out and combine the images, surveillance monitors 13 and 14 may instead perform the cutting and combining of images, or image server 19 may perform the cutting and combining of images.
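The arrangement step above amounts to stacking the three per-car combined frames into the six-split grid; a minimal sketch, assuming each camera delivers a side-by-side (telephoto|wide-angle) frame of equal size:

```python
import numpy as np

def six_split(combined_frames):
    """Stack three per-car (telephoto|wide-angle) frames into a 2-wide,
    3-high grid like the six-split monitor display."""
    assert len(combined_frames) == 3
    return np.concatenate(combined_frames, axis=0)

# Hypothetical combined frames received from cameras 4, 5, and 6.
frames = [np.zeros((480, 1280, 3), dtype=np.uint8) for _ in range(3)]
screen = six_split(frames)
print(screen.shape)  # (1440, 1280, 3)
```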
  • the above shows an example of applying the image system of the present invention to monitoring boarding and alighting from trains, but the present invention is not limited to monitoring boarding and alighting from trains and can be applied to other systems that display images from surveillance cameras on a multi-split screen.
  • For example, the camera's wide-angle image captures the entire pantograph, while the telephoto image captures the contact strip that comes into contact with the overhead wire. This makes it possible to simultaneously monitor the operating state of the pantograph and the wear of the contact strip.
  • Likewise, the camera's wide-angle image may capture the coupling section of the cars, while the telephoto image captures the track in the direction of travel.
  • The camera's wide-angle image may capture the interior of the nearby passenger compartment, while the telephoto image captures the interior of the distant passenger compartment (for example, the end of the car). This allows a wide area inside the car to be captured with sufficient resolution.
  • Alternatively, the camera's wide-angle image captures the inside of a nearby door, while the telephoto image captures the inside of a distant door. This allows passengers getting on and off at multiple doors to be monitored simultaneously.
  • a close-up wide-angle image and a long-distance telephoto image are captured from the camera, making it possible to capture a wide range, including blind spots that would be outside the range of a monocular camera, and to capture large images of distant subjects.
  • close-up images and long-distance images are captured with different image sensors, making it possible to achieve both wide-angle capture of a wide range and telephoto capture of distant subjects.
  • the present invention is not limited to the above-described embodiments, but includes various modified examples and equivalent configurations within the spirit of the appended claims.
  • the above-described embodiments have been described in detail to clearly explain the present invention, and the present invention is not necessarily limited to having all of the configurations described.
  • part of the configuration of one embodiment may be replaced with the configuration of another embodiment.
  • the configuration of another embodiment may be added to the configuration of one embodiment.
  • part of the configuration of each embodiment may be added, deleted, or replaced with other configurations.
  • each of the configurations, functions, processing units, processing means, etc. described above may be realized in part or in whole in hardware, for example by designing them as integrated circuits, or may be realized in software by a processor interpreting and executing a program that realizes each function.
  • Information such as programs, tables, and files that realize each function can be stored in a storage device such as a memory, hard disk, or SSD (Solid State Drive), or in a recording medium such as an IC card, SD card, or DVD.
  • control lines and information lines shown are those considered necessary for explanation, and do not necessarily represent all control lines and information lines necessary for implementation. In reality, it is safe to assume that almost all components are interconnected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

In the present invention, an image system comprises a camera for capturing images, and a display device for displaying the images captured by the camera. The camera has an imaging element for capturing images, and an image processing unit for processing the images captured by the imaging element. The image processing unit outputs a plurality of images having different capturing ranges.

Description

Image system and imaging device

Incorporation by Reference

This application claims priority to Japanese Patent Application No. 2023-47049, filed on March 23, 2023, the contents of which are incorporated herein by reference.

The present invention relates to an image system.

IP-based CCTV systems encode images captured by IP cameras in H.264, MPEG-4, JPEG, or a similar format; a server or similar device then decodes the received images and displays them on a monitor. Compared with analog CCTV systems, IP-based CCTV systems allow the image resolution and frame rate to be selected, and can transfer multiple images over a single cable, which simplifies wiring. For this reason, IP-based CCTV systems are used in on-board surveillance systems for railways, where installation space is limited. On-board surveillance systems use surveillance cameras to check the safety of multiple boarding and alighting doors in order to reduce the number of staff required for train operation. To this end, the display area of a single monitor is divided and surveillance images of the multiple doors are displayed in the divided areas.

The following prior art is included as background technology in this technical field. For example, Patent Document 1 (JP 2018-113602 A) describes a surveillance system in which a camera that captures at least the vicinity of the doors of the cars that make up the train and a monitor that displays the camera images captured by the camera are mounted on the train, the camera is disposed for each of a number of doors provided on both sides of the cars, the display area of the monitor is divided into a number of areas, and a display control unit is provided that controls the allocation of the camera images to each area based on information indicating the direction in which the train is traveling and information indicating the side on which the door opens.

In-vehicle surveillance systems use multiple cameras to capture surveillance images, but because each camera outputs only one image, nearby objects may be framed out of the field of view, and distant objects may be small and difficult to see. In particular, the display area of the surveillance monitor installed in the driver's cab is divided to display images of multiple boarding and alighting doors in the divided areas, and passengers getting on and off near the doors while the train is stopped are monitored.

However, a single-chip camera with a single image sensor is usually attached to the side of the vehicle, and when the vehicle is long or when monitoring multiple doors, objects close to the camera may be framed out of the field of view or distant objects may appear small, making it difficult to obtain the necessary information.

The present invention aims to provide an image system that can accurately monitor both areas close to the camera and areas far away.

A representative example of the invention disclosed in this application is as follows: An image system comprising a camera that captures images and a display device that displays the images captured by the camera, the camera having an image sensor that captures images and an image processing section that processes the images captured by the image sensor, the image processing section being characterized in that it outputs a number of images with different capture ranges.

In one example of the image system of the present invention, the camera has a plurality of the image sensors and a plurality of lenses provided in front of each of the plurality of image sensors, the plurality of lenses having different focal lengths so as to have different shooting ranges, and the image processing unit outputs a plurality of images having different angles of view captured by the plurality of image sensors.

In one example of the image system of the present invention, the display device displays multiple images output from the camera with different shooting ranges side by side.

In one example of the image system of the present invention, the display device selectively displays one of a number of images output from the camera and having different shooting ranges.

In one example of the image system of the present invention, the image processing unit is characterized in that it synthesizes multiple images with different shooting ranges into a single image and outputs the image.

An example of an imaging device according to the present invention is an imaging device that captures images, and includes an imaging element that captures images, and an image processing unit that processes the images captured by the imaging element, and the image processing unit is characterized in that it outputs multiple images with different capture ranges.

 本発明の一態様によれば、カメラから撮影範囲が異なる複数の画像を出力するので、近傍の撮影範囲を広くでき、遠方の被写体も大きく撮影できる。上記した以外の課題、構成及び効果は、以下の発明を実施するための形態の説明により明らかにされる。 According to one aspect of the present invention, a camera outputs multiple images with different shooting ranges, allowing a wider shooting range of nearby subjects and allowing distant subjects to be captured in large size. Problems, configurations, and effects other than those described above will become clear from the following description of the embodiment of the invention.

図1:本発明の実施例の画像システムの概要を示す図である。 FIG. 1 is a diagram showing an overview of an image system according to an embodiment of the present invention.
図2:本実施例の画像システムの構成を示す図である。 FIG. 2 is a diagram showing the configuration of the image system of this embodiment.
図3:本実施例のカメラの構成を示す図である。 FIG. 3 is a diagram showing the configuration of the camera of this embodiment.
図4:従来の画像システムにおいて監視モニタに表示される画像を示す図である。 FIG. 4 is a diagram showing an image displayed on a surveillance monitor in a conventional image system.
図5:従来の画像システムにおいて先頭車両のカメラが撮影した画像を示す図である。 FIG. 5 is a diagram showing an image captured by the camera of the leading car in a conventional image system.
図6:本実施例の画像システムにおいて監視モニタに表示される画像を示す図である。 FIG. 6 is a diagram showing an image displayed on a surveillance monitor in the image system of this embodiment.
図7:本実施例の画像システムにおいて先頭車両のカメラが撮影した望遠画像を示す図である。 FIG. 7 is a diagram showing a telephoto image captured by the camera of the leading car in the image system of this embodiment.
図8:本実施例の画像システムにおいて先頭車両のカメラが撮影した広角画像を示す図である。 FIG. 8 is a diagram showing a wide-angle image captured by the camera of the leading car in the image system of this embodiment.

 本発明の実施例について、図面を参照して説明する。 An embodiment of the present invention will be described with reference to the drawings.

 図1は、本発明の実施例の画像システムの概要を示す図である。 FIG. 1 shows an overview of an image system according to an embodiment of the present invention.

 本実施例の画像システムは、先頭車両1、中間車両2、及び後尾車両3による3両編成の列車に実装されている。 The image system of this embodiment is implemented in a three-car train consisting of a leading car 1, a middle car 2, and a trailing car 3.

 先頭車両1、中間車両2、及び後尾車両3の各々は片側に二つのドア7~12を有し、これらのドア7~12から乗客が車両へ乗降する。各車両1~3の側面には、ドア7~12で乗降する乗客を監視するカメラ4~6が設けられる。図1に示すように、本実施例では、1両に1台のカメラを設け、当該1台のカメラで二つのドアで乗降する乗客を監視する。カメラ4~6は、各車両の先頭車両寄りに設けられているが、後尾車両寄りに設けられてもよい。 Each of the leading car 1, middle car 2, and trailing car 3 has two of the doors 7 to 12 on one side, and passengers board and alight through these doors. Cameras 4 to 6 are provided on the side of each of the cars 1 to 3 to monitor passengers boarding and alighting through doors 7 to 12. As shown in FIG. 1, in this embodiment one camera is provided per car, and that single camera monitors passengers boarding and alighting through the two doors. Cameras 4 to 6 are mounted at the end of each car closer to the leading car, but they may instead be mounted at the end closer to the trailing car.

 先頭車両1及び後尾車両3の運転台には監視モニタ13、14が設けられる。監視モニタ13、14は、カメラ4~6が撮影した画像を表示する表示装置である。乗務員は、監視モニタ13、14によってカメラ4~6が撮影した画像を確認できる。図示した例では、列車が停車しているプラットフォームに乗客31~35が居る。 Surveillance monitors 13, 14 are provided in the driver's cabs of the leading car 1 and the trailing car 3. The surveillance monitors 13, 14 are display devices that display images captured by the cameras 4-6. The crew can check the images captured by the cameras 4-6 on the surveillance monitors 13, 14. In the example shown, passengers 31-35 are present on the platform where the train is stopped.

 図2は、本実施例の画像システムの構成を示す図である。 FIG. 2 shows the configuration of the image system in this embodiment.

 先頭車両1には、画像サーバ19、録画装置18、上位サーバ20、監視モニタ13、及びカメラ4が設けられており、これらの装置は、ネットワーク機器(例えばL2スイッチ)15によって接続される。これらの装置を接続するネットワークは、有線ネットワークで構成しても、一部又は全部を無線ネットワークで構成してもよい。 The leading car 1 is equipped with an image server 19, a recording device 18, a host server 20, a surveillance monitor 13, and a camera 4, and these devices are connected by a network device (e.g., an L2 switch) 15. The network connecting these devices may be a wired network, or part or all of it may be a wireless network.

 画像サーバ19は、カメラ4~6が撮影した画像を切り替えて、監視モニタ13、14に表示するなどの画像処理を実行する制御装置である。録画装置18は、カメラ4~6が撮影した画像をその必要に応じて不揮発性記録媒体に記録する。乗務員は、監視モニタ13、14に設けられる操作パネルの操作によって、録画装置18に記録された画像を表示できる。録画装置18に記録された画像は、トラブル発生時の原因究明のために使用される。 The image server 19 is a control device that performs image processing such as switching between images captured by the cameras 4 to 6 and displaying them on the surveillance monitors 13 and 14. The recording device 18 records the images captured by the cameras 4 to 6 on a non-volatile recording medium as needed. Crew members can display the images recorded on the recording device 18 by operating the operation panels provided on the surveillance monitors 13 and 14. The images recorded on the recording device 18 are used to identify the cause of any trouble that may occur.

 上位サーバ20は、列車の運行を制御する制御装置である。本実施例では、画像サーバ19は、上位サーバ20から、列車の状態(例えば、走行状態、ドアの開閉状態)を取得する。例えば、画像サーバ19は、上位サーバ20からドア開信号を受信すると、ドアを撮影した画像を監視モニタ13、14に表示するように制御する。また、画像サーバ19は、上位サーバ20から発車信号(5km/h超信号)を受信すると、監視モニタ13、14からドアを撮影した画像を消去する。 The host server 20 is a control device that controls the operation of the train. In this embodiment, the image server 19 obtains the train status (e.g., running state, door open/closed state) from the host server 20. For example, when the image server 19 receives a door-open signal from the host server 20, it controls the surveillance monitors 13 and 14 to display images of the doors. When the image server 19 receives a departure signal (an over-5 km/h signal) from the host server 20, it clears the door images from the surveillance monitors 13 and 14.
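 The monitor-switching behavior described above can be sketched as a small state machine: door images appear on a door-open signal and are cleared on a departure signal. The following Python sketch is illustrative only; the signal names, the `MonitorController` class, and its interface are assumptions, not part of the disclosed system.

```python
class MonitorController:
    """Minimal sketch of the image server's door-image switching logic.

    The signal names below are hypothetical labels for the door-open and
    departure (over 5 km/h) signals received from the host server.
    """

    def __init__(self):
        self.showing_door_images = False

    def on_signal(self, signal):
        if signal == "DOOR_OPEN":
            # Door-open signal: show the door images on the monitors.
            self.showing_door_images = True
        elif signal == "DEPARTURE_OVER_5KMH":
            # Departure signal: clear the door images from the monitors.
            self.showing_door_images = False
        return self.showing_door_images
```

 A controller starts with no door images displayed; a door-open signal enables them and a subsequent departure signal clears them again.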

 監視モニタ13は、カメラ4~6が撮影した画像を表示する表示部と、画像サーバ19に指示をするための操作部を有する。タッチパネルで操作部を構成し、操作部と表示部を一体に構成してもよい。監視モニタ13は、運転台に設けられる表示装置に限らず、乗務員が携行する可搬型の端末装置でもよい。 The surveillance monitor 13 has a display unit that displays images captured by the cameras 4 to 6, and an operation unit for issuing instructions to the image server 19. The operation unit may be configured as a touch panel, and the operation unit and display unit may be configured as one unit. The surveillance monitor 13 is not limited to a display device provided in the driver's cab, and may be a portable terminal device carried by the crew.

 カメラ4は、先頭車両1の側面に設けられており、二つのドア7、8で乗降する乗客の画像を撮影する。カメラ4の構成は図3を参照して後述する。 Camera 4 is installed on the side of the leading car 1 and captures images of passengers getting on and off through two doors 7 and 8. The configuration of camera 4 will be described later with reference to Figure 3.

 後尾車両3には、監視モニタ14、及びカメラ6が設けられており、これらの装置は、ネットワーク機器(例えばL2スイッチ)17によって接続される。これらの装置を接続するネットワークは、有線ネットワークで構成しても、一部又は全部を無線ネットワークで構成してもよい。 The trailing car 3 is equipped with a surveillance monitor 14 and a camera 6, and these devices are connected by a network device (e.g., an L2 switch) 17. The network connecting these devices may be a wired network, or part or all of it may be a wireless network.

 監視モニタ14は、監視モニタ13と同じ構成を有し、カメラ4~6が撮影した画像を表示する表示部と、画像サーバ19に指示をするための操作部を有する。タッチパネルで操作部を構成し、操作部と表示部を一体に構成してもよい。監視モニタ14は、運転台に設けられる表示装置に限らず、乗務員が携行する可搬型の端末装置でもよい。 Surveillance monitor 14 has the same configuration as surveillance monitor 13, and includes a display unit that displays images captured by cameras 4 to 6, and an operation unit for issuing instructions to image server 19. The operation unit may be configured as a touch panel, and the operation unit and display unit may be integrated. Surveillance monitor 14 is not limited to a display device provided in the driver's cab, and may be a portable terminal device carried by the crew.

 カメラ6は、後尾車両3の側面に設けられており、二つのドア11、12で乗降する乗客の画像を撮影する。カメラ6の構成はカメラ4と同じであり、図3を参照して後述する。 Camera 6 is installed on the side of the rear car 3 and captures images of passengers getting on and off through the two doors 11, 12. The configuration of camera 6 is the same as camera 4, and will be described later with reference to Figure 3.

 中間車両2には、カメラ5が設けられており、カメラ5は、ネットワーク機器(例えばL2スイッチ)16に接続される。 A camera 5 is provided on the intermediate car 2, and the camera 5 is connected to a network device (e.g., an L2 switch) 16.

 カメラ5は、中間車両2の側面に設けられており、二つのドア9、10で乗降する乗客の画像を撮影する。カメラ5の構成はカメラ4と同じであり、図3を参照して後述する。 Camera 5 is installed on the side of intermediate car 2 and captures images of passengers getting on and off through two doors 9, 10. The configuration of camera 5 is the same as camera 4, and will be described later with reference to Figure 3.

 ネットワーク機器15、16、17は通信路50で接続されて、列車内ネットワークを構成する。 Network devices 15, 16, and 17 are connected via communication path 50 to form an in-train network.

 図3は、本実施例のカメラ4の構成を示す図である。図3ではカメラ4について説明するが、カメラ5、6も同じ構成である。 FIG. 3 is a diagram showing the configuration of camera 4 in this embodiment. Camera 4 will be described in FIG. 3, but cameras 5 and 6 have the same configuration.

 本実施例のカメラ4は、ネットワーク経由で画像を出力可能な2板カメラである。 In this embodiment, camera 4 is a two-chip camera that can output images via a network.

 カメラ4は、二つの受光窓81、82を有し、受光窓81、82の各々の内部には異なる特性(例えば焦点距離が異なる)のレンズ83、84を有する。レンズ83は望遠レンズであり、その視野は95~92の範囲であり、遠距離のドア8を撮像する。レンズ84は広角レンズであり、その視野は93~94の範囲であり、近距離のドア7を撮像する。なお、レンズ83、84として1枚の凸レンズを図示しているが、実際には複数のレンズの組み合わせによって構成される。 Camera 4 has two light receiving windows 81 and 82, and behind each window is a lens 83, 84 with different characteristics (e.g., a different focal length). Lens 83 is a telephoto lens whose field of view spans the range 95 to 92, and it images the distant door 8. Lens 84 is a wide-angle lens whose field of view spans the range 93 to 94, and it images the nearby door 7. Although each of lenses 83 and 84 is illustrated as a single convex lens, in practice each is composed of a combination of multiple lenses.

 レンズ83で集光された光は撮像素子85に入射し、レンズ84で集光された光は撮像素子86に入射する。レンズ83とレンズ84の焦点距離及び光軸方向の差によって、撮像素子85と撮像素子86は撮影範囲が異なる画像を撮影する。撮像素子85、86は、光電変換によって画像の生データ(例えばRAWデータ)を生成する。信号処理部87、88は、画像の生データからノイズ除去や輪郭を明確にするエッジ処理などの信号処理を行い、画像データを生成する。 Light focused by lens 83 enters image sensor 85, and light focused by lens 84 enters image sensor 86. Due to differences in focal length and optical axis direction between lenses 83 and 84, image sensor 85 and image sensor 86 capture images with different shooting ranges. Image sensors 85 and 86 generate raw image data (e.g., RAW data) through photoelectric conversion. Signal processors 87 and 88 perform signal processing such as noise removal and edge processing to clarify contours from the raw image data, and generate image data.

 画像処理部89は、信号処理部87、88で生成された画像データから必要な領域を切り取ったり、複数の画像を一つの画像に合成して、生成された画像をエンコーダ部90に入力する。例えば、ドア周辺を切り取った図7に示す領域25に表示される画像37や、図8に示す領域26に表示される画像38を生成したり、画像37と画像38を合成して一つの画像を生成する。 The image processing unit 89 cuts out the necessary areas from the image data generated by the signal processing units 87 and 88, or combines multiple images into one image, and inputs the generated image to the encoder unit 90. For example, it generates image 37 displayed in area 25 shown in FIG. 7, which is a cut-out of the area around the door, or image 38 displayed in area 26 shown in FIG. 8, or it combines images 37 and 38 to generate one image.
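 The cropping and compositing performed by the image processing unit 89 can be illustrated with plain Python lists standing in for image buffers. The region coordinates and the vertical-stack layout below are assumptions made for illustration; the actual extents of regions 25 and 26 and the composite layout are determined by the system design.

```python
def crop(image, top, left, height, width):
    """Cut a rectangular region out of a 2-D image (a list of pixel rows)."""
    return [row[left:left + width] for row in image[top:top + height]]

def stack_vertically(upper, lower):
    """Combine two equally wide crops into one composite image."""
    assert len(upper[0]) == len(lower[0]), "crops must have the same width"
    return upper + lower

# Dummy frames standing in for the telephoto and wide-angle sensor outputs.
telephoto = [[r * 16 + c for c in range(16)] for r in range(8)]
wide_angle = [[r * 16 + c + 1000 for c in range(16)] for r in range(8)]

# Cut the door regions (coordinates are illustrative) and merge them into
# a single frame for encoding and transmission.
region_25 = crop(telephoto, 2, 4, 4, 8)
region_26 = crop(wide_angle, 1, 0, 4, 8)
composite = stack_vertically(region_25, region_26)
```

 The same helpers cover both modes the unit supports: emitting each crop as its own image, or stacking the two crops into one frame before encoding.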

 エンコーダ部90は、画像データを所定のデータ形式(例えばJPEG/H.264)に変換し、物理インターフェース91を介してネットワークに出力する。 The encoder unit 90 converts the image data into a predetermined data format (e.g., JPEG/H.264) and outputs it to the network via the physical interface 91.

 画像処理部89は、信号処理部87からの画像データと信号処理部88から画像データ、すなわち撮影範囲が異なる二つの画像が入力されるが、ユーザの監視モニタ13、14による操作によって、信号処理部87の画像だけを出力するか、信号処理部88の画像を出力するか、二つの画像を出力するかを制御する。さらに、画像処理部89は、二つの画像を出力する場合、独立した二つの画像として出力するか、二つの画像を結合して一つの画像として出力するかも選択可能である。また、二つの画像は、同一アドレスから出力されることから、各出力画像の伝送に異なるポート番号を割り当てるとよい。また、どのドアが映っている画像かを識別できるよう、二つの画像データのそれぞれにドアの識別情報を付与してもよい。 The image processing unit 89 receives image data from the signal processing unit 87 and image data from the signal processing unit 88, i.e., two images with different shooting ranges, and controls whether to output only the image from the signal processing unit 87, the image from the signal processing unit 88, or two images, depending on the user's operation on the surveillance monitors 13, 14. Furthermore, when outputting two images, the image processing unit 89 can select whether to output them as two independent images, or to combine the two images and output them as one image. Also, since the two images are output from the same address, it is advisable to assign different port numbers to the transmission of each output image. Also, door identification information can be added to each of the two image data so that it is possible to identify which door is shown in the image.
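 The text suggests distinguishing the two output streams, which share one source address, by port number, and tagging each with a door identifier. A sketch of that bookkeeping follows; the address, base port, and door labels are hypothetical example values, and only the scheme itself (shared address, distinct ports, per-stream door ID) comes from the text.

```python
def make_stream_configs(camera_address, base_port, door_ids):
    """Assign one port per output image and attach a door ID to each stream."""
    return [
        {"address": camera_address, "port": base_port + i, "door_id": door_id}
        for i, door_id in enumerate(door_ids)
    ]

# Camera 4 outputs two images, one covering door 7 and one covering door 8.
streams = make_stream_configs("10.0.0.4", 50000, ["door7", "door8"])
```

 A receiver can then tell the telephoto and wide-angle streams apart by port, and identify the monitored door from the attached ID.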

 カメラ4、5、6が二つの撮像素子85、86を有する2板カメラである例を説明したが、カメラ4、5、6が異なる画角の画像を出力すればよい。例えば、カメラ4、5、6が一つの高解像度の撮像素子を有し、当該撮像素子が撮影した画像からデジタルズーム処理によって撮影範囲が異なる複数の画像(例えば、画角が異なる二つの画像)を生成し、生成された複数の画像を出力するものでもよい。なお、車載用の画像システムは、耐振動性が要求されることから鏡筒内でレンズを動かす光学ズームを採用できない。よって、光学ズームの倍率を変更して二つの画像をそれぞれ撮影するのではなく、デジタルズームで二つの画像を生成する方が望ましい。 Although the example has been described in which cameras 4, 5, and 6 are two-chip cameras having two image sensors 85, 86, it is sufficient that cameras 4, 5, and 6 output images with different angles of view. For example, cameras 4, 5, and 6 may each have one high-resolution image sensor, generate multiple images with different shooting ranges (e.g., two images with different angles of view) from an image captured by that image sensor using digital zoom processing, and output the multiple images generated. Note that optical zoom, which moves the lens inside the lens barrel, cannot be used for in-vehicle image systems, since they are required to be vibration-resistant. Therefore, it is preferable to generate two images using digital zoom, rather than changing the magnification of the optical zoom to capture two images each.
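 The single-sensor variant generates the narrower-angle image from one high-resolution frame by digital zoom, i.e., cropping and rescaling rather than moving a lens. A minimal nearest-neighbour sketch is shown below; the zoom factor and the centred crop are assumptions for illustration.

```python
def digital_zoom(image, factor):
    """Crop the centre 1/factor region of a frame and scale it back up to
    the original size by pixel repetition (nearest-neighbour digital zoom)."""
    h, w = len(image), len(image[0])
    ch, cw = h // factor, w // factor
    top, left = (h - ch) // 2, (w - cw) // 2
    cropped = [row[left:left + cw] for row in image[top:top + ch]]
    zoomed = []
    for row in cropped:
        expanded = [px for px in row for _ in range(factor)]  # widen columns
        zoomed.extend([expanded] * factor)                    # repeat rows
    return zoomed

# One high-resolution frame yields both output images: the wide-angle image
# is the frame itself, and the narrow-angle image is a digitally zoomed crop.
frame = [[r * 10 + c for c in range(8)] for r in range(8)]
wide_image = frame
tele_image = digital_zoom(frame, 2)
```

 Because both images come from the same exposure, no moving optics are involved, which matches the vibration-resistance constraint noted above.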

 図3では、画像信号の流れに従ってカメラ4の構成を図示したが、カメラ4は、図示を省略した制御部(ロジックやCPU)を有する。制御部は、撮像素子85、86、信号処理部87、88、画像処理部89、及びエンコーダ部90の動作を制御する。 In FIG. 3, the configuration of the camera 4 is illustrated according to the flow of the image signal, but the camera 4 has a control unit (logic and CPU) that is not illustrated. The control unit controls the operation of the image sensors 85 and 86, the signal processing units 87 and 88, the image processing unit 89, and the encoder unit 90.

 図3は、レンズ83、撮像素子85及び信号処理部87を含む系列と、レンズ84、撮像素子86及び信号処理部88を含む系列を上下に配置しているが、反対側の車両側面に取り付けると上下が逆になるため、二つの系列を水平に配置してもよい。また、カメラ4は、受光窓81を右方向に、受光窓82を右斜め下方向に設けているが、カメラ4と監視すべきドアの位置によって、受光窓の位置は任意に調整するとよい。 In Fig. 3, a series including a lens 83, an image sensor 85, and a signal processor 87 and a series including a lens 84, an image sensor 86, and a signal processor 88 are arranged one above the other, but since the two series will be upside down when mounted on the opposite side of the vehicle, the two series may be arranged horizontally. Also, the camera 4 has a light receiving window 81 facing to the right and a light receiving window 82 facing diagonally downward to the right, but the positions of the light receiving windows can be adjusted as desired depending on the positions of the camera 4 and the door to be monitored.

 このように、カメラ4は信号処理部87、88が取得した二つの画像を、一つ又は二つの画像として出力する。 In this way, the camera 4 outputs the two images acquired by the signal processing units 87 and 88 as one or two images.

 以上、カメラ4について説明したが、カメラ5、6も同じ構成を有する。すなわち、カメラ5は、レンズ83によって遠距離のドア10を撮像し、レンズ84によって近距離のドア9を撮像する。同様に、カメラ6は、レンズ83によって遠距離のドア12を撮像し、レンズ84によって近距離のドア11を撮像する。 Camera 4 has been described above, but cameras 5 and 6 have the same configuration. That is, camera 5 images the distant door 10 with lens 83 and the nearby door 9 with lens 84. Similarly, camera 6 images the distant door 12 with lens 83 and the nearby door 11 with lens 84.

 次に、本実施例の画像システムが出力する画像について、従来の画像システムが出力する画像と比較して説明する。 Next, we will explain the images output by the imaging system of this embodiment, comparing them with images output by a conventional imaging system.

 図4は、従来の画像システムにおいてモニタに表示される画像を示す図であり、図5は、従来の画像システムにおいて先頭車両1のカメラが撮影した画像を示す図である。また、図6は、本実施例の画像システムにおいてモニタに表示される画像を示す図であり、図7は、本実施例の画像システムにおいて先頭車両1のカメラ4が撮影した望遠画像を示す図であり、図8は、本実施例の画像システムにおいて先頭車両1のカメラ4が撮影した広角画像を示す図である。 FIG. 4 is a diagram showing an image displayed on a monitor in a conventional imaging system, and FIG. 5 is a diagram showing an image captured by the camera in the leading car 1 in the conventional imaging system. Also, FIG. 6 is a diagram showing an image displayed on a monitor in the imaging system of this embodiment, FIG. 7 is a diagram showing a telephoto image captured by the camera 4 in the leading car 1 in the imaging system of this embodiment, and FIG. 8 is a diagram showing a wide-angle image captured by the camera 4 in the leading car 1 in the imaging system of this embodiment.

 従来の画像システムでは、駅停車時の乗客の乗降を監視するために、先頭車両1又は後尾車両3のモニタに各カメラが撮影した画像を並べて表示する。モニタの画面は、図4に示すように3分割されており、先頭車両1のカメラが撮影した画像21、中間車両2のカメラが撮影した画像22、後尾車両のカメラ6が撮影した画像23が並べて表示される。3分割画像に表示される先頭車両1のカメラが撮影した画像21は、図5に示す先頭車両1のカメラが撮影した画像36を領域24で切り取ったものである。 In conventional image systems, to monitor passengers getting on and off when the train stops at a station, images taken by each camera are displayed side by side on the monitor of the leading car 1 or the trailing car 3. The monitor screen is divided into three parts as shown in Figure 4, and image 21 taken by the camera of the leading car 1, image 22 taken by the camera of the middle car 2, and image 23 taken by the camera 6 of the trailing car are displayed side by side. Image 21 taken by the camera of the leading car 1 and displayed in the three-part image is image 36 taken by the camera of the leading car 1 shown in Figure 5, cropped by area 24.

 図1に示す状況において、従来の画像システムのモニタに表示される先頭車両1の画像には、乗客31と32がドア7から乗り、乗客33がドア8から乗っている様子が映っている。そして、中間車両2の画像には乗客が映っていない。後尾車両3の画像には、乗客34がドア11から降り、乗客35がドア12から降りている様子が映っている。しかし、ドア7付近の乗客31、32の腰から下、ドア11付近の乗客34の腰から下が画像からはみ出した死角となっている。さらに、ドア8付近の乗客33とドア12付近の乗客35が小さく見づらい。 In the situation shown in Figure 1, the image of leading car 1 displayed on the monitor of the conventional image system shows passengers 31 and 32 getting on through door 7, and passenger 33 getting on through door 8. The image of middle car 2 shows no passengers. The image of rear car 3 shows passenger 34 getting off through door 11, and passenger 35 getting off through door 12. However, the lower parts of passengers 31 and 32 near door 7, and passenger 34 near door 11, are in blind spots outside the image from the waist down. Furthermore, passenger 33 near door 8 and passenger 35 near door 12 are small and difficult to see.

 一方、本実施例の画像システムでは、駅停車時の乗客の乗降を監視するために、先頭車両1の監視モニタ13又は後尾車両3の監視モニタ14に各カメラ4~6が撮影した画像を並べて表示する。監視モニタ13、14の画面は、図6に示すように6分割されており、先頭車両1のカメラ4が撮影した望遠側の画像40、先頭車両1のカメラ4が撮影した広角側の画像41、中間車両2のカメラ5が撮影した望遠側の画像42、中間車両2のカメラ5が撮影した広角側の画像43、後尾車両3のカメラ6が撮影した広角側の画像44、後尾車両3のカメラ6が撮影した望遠側の画像45が並べて表示される。6分割画面に表示される画像40は、図7に示す先頭車両1のカメラ4が撮影した望遠画像37を領域25で切り取ったものであり、画像41は、図8に示す先頭車両1のカメラ4が撮影した広角画像38を領域26で切り取ったものである。監視モニタ13、14は、図6に示す6分割画像の他、乗務員の操作によって選択された一つの画像を図7や図8に示すように表示できる。 In the image system of this embodiment, on the other hand, to monitor passengers boarding and alighting when the train stops at a station, the images taken by the cameras 4 to 6 are displayed side by side on the surveillance monitor 13 of the leading car 1 or the surveillance monitor 14 of the trailing car 3. The screen of the surveillance monitors 13 and 14 is divided into six parts as shown in FIG. 6, displaying side by side a telephoto image 40 taken by the camera 4 of the leading car 1, a wide-angle image 41 taken by the camera 4 of the leading car 1, a telephoto image 42 taken by the camera 5 of the middle car 2, a wide-angle image 43 taken by the camera 5 of the middle car 2, a wide-angle image 44 taken by the camera 6 of the trailing car 3, and a telephoto image 45 taken by the camera 6 of the trailing car 3. The image 40 displayed on the six-split screen is the telephoto image 37 taken by the camera 4 of the leading car 1 shown in FIG. 7, cropped to the area 25, and the image 41 is the wide-angle image 38 taken by the camera 4 of the leading car 1 shown in FIG. 8, cropped to the area 26. In addition to the six-split view shown in FIG. 6, the surveillance monitors 13 and 14 can display a single image selected by the crew member's operation, as shown in FIG. 7 and FIG. 8.
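 The six-split layout of FIG. 6 can be sketched as tiling six equally sized images into a grid. The 2-row by 3-column arrangement below is an assumption about the on-screen layout; only the side-by-side display of six images follows the text.

```python
def hstack(images):
    """Place equally tall images side by side (concatenate their rows)."""
    return [sum((img[r] for img in images), []) for r in range(len(images[0]))]

def six_split(images):
    """Tile six equally sized images into an assumed 2x3 monitor layout."""
    assert len(images) == 6
    return hstack(images[:3]) + hstack(images[3:])

# Six dummy tiles (3 rows x 4 columns each), one per camera output 40-45.
tiles = [[[k] * 4 for _ in range(3)] for k in range(6)]
screen = six_split(tiles)
```

 In the real system the tiling may be done by the camera, the image server, or the monitor itself, as the following paragraph notes.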

 カメラ4は、画像37を領域25で切り取り、画像38を領域26で切り取って、二つの画像を一つの画像に合成して、ネットワーク経由で監視モニタ13、14へ転送する。他のカメラ5、6も、撮影した画像を同様に処理して、合成された一つの画像をネットワーク経由で監視モニタ13、14へ転送する。監視モニタ13、14は、カメラ4、5、6から受信した三つの画像を並べて図6に示す画像を表示する。なお、カメラ4、5、6が画像を切り取り、監視モニタ13、14が画像を合成する例を説明したが、監視モニタ13、14が画像の切り取りや画像の合成を行ってもよく、画像サーバ19が画像の切り取りや画像の合成を行ってもよい。 Camera 4 crops image 37 to area 25 and image 38 to area 26, combines the two images into one, and transfers it to the surveillance monitors 13 and 14 via the network. The other cameras 5 and 6 process their captured images in the same way and transfer a single combined image via the network. The surveillance monitors 13 and 14 arrange the three images received from cameras 4, 5, and 6 to display the view shown in FIG. 6. Although an example has been described in which cameras 4, 5, and 6 crop the images and the surveillance monitors 13 and 14 composite them, the surveillance monitors 13 and 14 may instead perform the cropping and compositing, or the image server 19 may perform the cropping and compositing.

 従来の画像システムでは、図4に示すように、ドア7付近の乗客31、32の腰から下、ドア11付近の乗客34の腰から下が画像からはみ出した死角となっているが、本実施例の画像システムでは、図6に示すように、乗客の全身が映っており、ドア周辺の死角が殆ど無い。また、従来の画像システムでは、図4に示すように、カメラから遠いドア8の乗客33やドア12の乗客35は小さく映っているが、本実施例の画像システムでは、図6に示す画像40や画像44のように、大きく十分な解像度で表示される。 In a conventional imaging system, as shown in FIG. 4, passengers 31 and 32 near door 7 and passenger 34 near door 11 are in blind spots that extend beyond the image from the waist down, but in the imaging system of this embodiment, as shown in FIG. 6, the entire bodies of passengers are captured, with almost no blind spots around the doors. Also, in a conventional imaging system, passengers 33 at door 8 and passenger 35 at door 12, who are far from the camera, are captured small, as shown in FIG. 4, but in the imaging system of this embodiment, they are displayed large and with sufficient resolution, as in images 40 and 44 shown in FIG. 6.

 以上、本発明の画像システムを列車の乗降監視に適用する例を示したが、本発明は列車の乗降監視に限定されず、監視カメラの画像を多分割画面で表示する他のシステムに適用できる。 The above shows an example of applying the image system of the present invention to monitoring boarding and alighting on trains, but the present invention is not limited to this use and can be applied to other systems that display surveillance camera images on a multi-split screen.

 例えば、列車の屋根に設置されたカメラでパンタグラフを撮影するパンタグラフ監視システムにおいて、当該カメラの広角画像でパンタグラフ全体を撮影し、望遠画像で架線と接触する擦り板を撮影する。これにより、パンタグラフの動作状態と擦り板の摩耗を同時に監視できる。 For example, in a pantograph monitoring system that photographs the pantograph with a camera installed on the roof of a train, the wide-angle image of the camera captures the entire pantograph, and the telephoto image captures the contact strip that comes into contact with the overhead wire. This makes it possible to simultaneously monitor the operating state of the pantograph and the wear of the contact strip.

 また、列車の前方(例えば乗務員室)に設置されたカメラで進行方向を撮影する前方監視システムにおいて、当該カメラの広角画像で車両の連結部を撮影し、望遠画像で進行方向の線路を撮影する。これにより、列車の前方と連結時及び切り離し時の列車の接近状態や連結部の状態を一つのカメラで監視できる。さらに、カメラの広角画像で乗務員室(例えば運転席)を撮影し、望遠画像で進行方向の線路を撮影すると、列車の前方と乗務員を同時に監視でき、乗務員に発生した異常事態を把握できる。 In addition, in a forward monitoring system that uses a camera installed at the front of the train (e.g., in the crew cabin) to capture the direction of travel, the camera's wide-angle image captures the coupling section of the cars, and the telephoto image captures the track in the direction of travel. This makes it possible to monitor the front of the train, the approaching state of trains when coupling and uncoupling, and the state of the coupling section with a single camera. Furthermore, by capturing the crew cabin (e.g., the driver's seat) with a wide-angle image from the camera and the track in the direction of travel with a telephoto image, it is possible to monitor the front of the train and the crew simultaneously, and to identify any abnormalities that occur to the crew.

 また、列車の客室内に設置されたカメラで客室内を撮影する客室監視システムにおいて、当該カメラの広角画像で近傍の客室内を撮影し、望遠画像で遠方の客室内(例えば車両端部)を撮影する。これにより、車両内の広い領域を十分な解像度で撮影できる。さらに、カメラの広角画像で近傍のドアを撮影し、望遠画像で遠方のドアを撮影する。これにより、複数のドアの乗降客を同時に監視できる。 In a passenger compartment monitoring system in which a camera installed inside a train car captures the compartment, the camera's wide-angle image captures the nearby part of the compartment, and the telephoto image captures the distant part (e.g., the end of the car). This allows a wide area inside the car to be captured with sufficient resolution. Furthermore, the camera's wide-angle image can capture a nearby door while the telephoto image captures a distant door, allowing passengers boarding and alighting at multiple doors to be monitored simultaneously.

 以上に説明したように、本発明の実施例の画像システムによると、カメラから近距離の広角画像と遠距離の望遠画像を撮像するので、単眼カメラによる撮影では撮影範囲外となる死角領域も含めて広範囲を撮影でき、かつ、遠方の被写体を大きく撮影できる。また、1台のカメラ内に複数のレンズ及び複数の撮像素子を設けたので、近傍の画像と遠方の画像を異なる撮像素子で撮像でき、広範囲の広角撮影と、遠方の望遠撮影を両立できる。 As described above, according to the image system of the embodiment of the present invention, the camera captures a wide-angle image at close range and a telephoto image at long range, so it can cover a wide area, including blind spots that would fall outside the range of a single-lens camera, while capturing distant subjects at a large size. Furthermore, because one camera contains multiple lenses and multiple image sensors, nearby and distant views are captured by different sensors, achieving both wide-angle coverage of a broad area and telephoto capture of distant subjects.

 なお、本発明は前述した実施例に限定されるものではなく、添付した特許請求の範囲の趣旨内における様々な変形例及び同等の構成が含まれる。例えば、前述した実施例は本発明を分かりやすく説明するために詳細に説明したものであり、必ずしも説明した全ての構成を備えるものに本発明は限定されない。また、ある実施例の構成の一部を他の実施例の構成に置き換えてもよい。また、ある実施例の構成に他の実施例の構成を加えてもよい。また、各実施例の構成の一部について、他の構成の追加・削除・置換をしてもよい。 The present invention is not limited to the above-described embodiments, but includes various modified examples and equivalent configurations within the spirit of the appended claims. For example, the above-described embodiments have been described in detail to clearly explain the present invention, and the present invention is not necessarily limited to having all of the configurations described. Furthermore, part of the configuration of one embodiment may be replaced with the configuration of another embodiment. Furthermore, the configuration of another embodiment may be added to the configuration of one embodiment. Furthermore, part of the configuration of each embodiment may be added, deleted, or replaced with other configurations.

 また、前述した各構成、機能、処理部、処理手段等は、それらの一部又は全部を、例えば集積回路で設計する等により、ハードウェアで実現してもよく、プロセッサがそれぞれの機能を実現するプログラムを解釈し実行することにより、ソフトウェアで実現してもよい。 Furthermore, each of the configurations, functions, processing units, processing means, etc. described above may be realized in part or in whole in hardware, for example by designing them as integrated circuits, or may be realized in software by a processor interpreting and executing a program that realizes each function.

 各機能を実現するプログラム、テーブル、ファイル等の情報は、メモリ、ハードディスク、SSD(Solid State Drive)等の記憶装置、又は、ICカード、SDカード、DVD等の記録媒体に格納することができる。 Information such as programs, tables, and files that realize each function can be stored in a storage device such as a memory, hard disk, or SSD (Solid State Drive), or in a recording medium such as an IC card, SD card, or DVD.

 また、制御線や情報線は説明上必要と考えられるものを示しており、実装上必要な全ての制御線や情報線を示しているとは限らない。実際には、ほとんど全ての構成が相互に接続されていると考えてよい。 Furthermore, the control lines and information lines shown are those considered necessary for explanation, and do not necessarily represent all control lines and information lines necessary for implementation. In reality, it is safe to assume that almost all components are interconnected.

Claims (7)

 画像システムであって、
 画像を撮影するカメラと、
 前記カメラが撮影した画像を表示する表示装置とを備え、
 前記カメラは、画像を撮影する撮像素子と、前記撮像素子が撮影した画像を処理する画像処理部とを有し、
 前記画像処理部は、撮影範囲が異なる複数の画像を出力することを特徴とする画像システム。
1. An imaging system comprising:
A camera for taking images;
a display device for displaying an image captured by the camera;
The camera includes an image sensor that captures an image, and an image processor that processes the image captured by the image sensor.
The image system according to the present invention is characterized in that the image processing unit outputs a plurality of images having different shooting ranges.
 請求項1に記載の画像システムであって、
 前記カメラは、複数の前記撮像素子と、前記複数の撮像素子の各々の前面に設けられる複数のレンズとを有し、
 前記複数のレンズは、撮影範囲が異なるように焦点距離が異なるものであって、
 前記画像処理部は、前記複数の撮像素子が撮影した画角が異なる複数の画像を出力することを特徴とする画像システム。
2. The imaging system of claim 1,
the camera includes a plurality of the imaging elements and a plurality of lenses provided in front of each of the imaging elements;
The plurality of lenses have different focal lengths so as to have different photographing ranges,
The image processing unit outputs a plurality of images captured by the plurality of image pickup elements, the images having different angles of view.
 請求項1に記載の画像システムであって、
 前記表示装置は、前記カメラから出力された撮影範囲が異なる複数の画像を並べて表示することを特徴とする画像システム。
2. The imaging system of claim 1,
The image system is characterized in that the display device displays a plurality of images output from the camera with different shooting ranges side by side.
 請求項3に記載の画像システムであって、
 前記表示装置は、前記カメラから出力された撮影範囲が異なる複数の画像の一つを選択的に表示することを特徴とする画像システム。
4. The imaging system of claim 3,
The image system is characterized in that the display device selectively displays one of a plurality of images having different shooting ranges output from the camera.
 請求項1に記載の画像システムであって、
 前記画像処理部は、撮影範囲が異なる複数の画像を一つの画像に合成して出力することを特徴とする画像システム。
2. The imaging system of claim 1,
The image system according to the present invention, wherein the image processing unit synthesizes a plurality of images with different shooting ranges into a single image and outputs the single image.
 画像を撮影する撮像装置であって、
 画像を撮影する撮像素子と、
 前記撮像素子が撮影した画像を処理する画像処理部とを備え、
 前記画像処理部は、撮影範囲が異なる複数の画像を出力することを特徴とする撮像装置。
An imaging device for capturing an image,
An image sensor for capturing an image;
an image processing unit that processes an image captured by the imaging element,
The imaging device, wherein the image processing unit outputs a plurality of images having different shooting ranges.
 請求項6に記載の撮像装置であって、
 複数の前記撮像素子と、
 前記複数の撮像素子の各々の前面に設けられる複数のレンズとを備え、
 前記複数のレンズは、撮影範囲が異なるように焦点距離が異なるものであって、
 前記画像処理部は、前記複数の撮像素子が撮影した画角が異なる複数の画像を出力することを特徴とする撮像装置。
7. The imaging device according to claim 6,
A plurality of the imaging elements;
a plurality of lenses provided in front of each of the plurality of image pickup elements;
The plurality of lenses have different focal lengths so as to have different photographing ranges,
The imaging device, wherein the image processing unit outputs a plurality of images having different angles of view captured by the plurality of imaging elements.
PCT/JP2023/033342 2023-03-23 2023-09-13 Image system and imaging apparatus WO2024195154A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2025508109A JPWO2024195154A1 (en) 2023-03-23 2023-09-13

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023047049 2023-03-23
JP2023-047049 2023-03-23

Publications (1)

Publication Number Publication Date
WO2024195154A1 true WO2024195154A1 (en) 2024-09-26

Family

ID=92841695

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/033342 WO2024195154A1 (en) 2023-03-23 2023-09-13 Image system and imaging apparatus

Country Status (2)

Country Link
JP (1) JPWO2024195154A1 (en)
WO (1) WO2024195154A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017181634A (en) * 2016-03-29 2017-10-05 パナソニックIpマネジメント株式会社 Imaging apparatus and on-vehicle camera system
WO2019102935A1 (en) * 2017-11-21 2019-05-31 シャープ株式会社 Display device, imaging and display system, and train


Also Published As

Publication number Publication date
JPWO2024195154A1 (en) 2024-09-26

Similar Documents

Publication Publication Date Title
JP5520142B2 (en) Railway vehicle status monitoring device
JP6758203B2 (en) Monitoring system and monitoring method
JP6456358B2 (en) Monitoring system and monitoring method
JP6845306B2 (en) Monitoring system and monitoring method
JP6764031B2 (en) Video display system and video display method
EP3371969B1 (en) Rail vehicle provided with an internally mounted video camera with external field of vision
KR20020010498A (en) Surveillance apparatus for a vehicle
US20200280699A1 (en) Display device, imaging and display system, and train
GB2570185A (en) Monitoring system and monitoring method
JP6955584B2 (en) Door image display system and monitor
JP2021002835A (en) Monitoring system and monitoring method
CZ237495A3 (en) Remote representation system for the control of railway vehicles and trains
WO2024195154A1 (en) Image system and imaging apparatus
JP6934986B2 (en) Video display system and video display method
WO2023248663A1 (en) Onboard device, operation-recording method, and operation-recording system
JPH10166943A (en) External monitoring device for vehicle
RU2018147792A (en) VIDEO SURVEILLANCE SYSTEM AND NOTIFICATION OF PASSENGERS FOR ELECTRIC TRAINS TYPE ES2G (SWAP)
JP7549466B2 (en) Surveillance system and display control method
JP2023144233A (en) Cab monitor system
CN112744259A (en) External electronic monitoring system for train door
WO2025032867A1 (en) Video system and imaging apparatus
JP2022104716A (en) Image recording device, server device, program, etc.
WO2021191998A1 (en) Monitoring system and camera
JP2024074695A (en) Cameras, Trains, and Surveillance Systems
JP2005303523A (en) Vehicle monitor system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23928729

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2025508109

Country of ref document: JP