WO2018105436A1 - Cooperative Display System (連携表示システム) - Google Patents
Cooperative Display System
- Publication number
- WO2018105436A1 (PCT/JP2017/042512)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- terminal
- information
- light emission
- server
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F19/00—Advertising or display means not otherwise provided for
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/12—Synchronisation between the display unit and other units, e.g. other display units, video-disc players
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
Definitions
- the present invention relates to a cooperative display system in which a plurality of terminals having a display unit cooperate to form a single pseudo display.
- Traditionally, at events such as concerts, spectators have cheered for performers using chemical light sticks and battery-powered penlights. In recent years, penlights that can be controlled via wireless communication have also appeared (see Patent Document 1). According to the penlight described in Patent Document 1, it is possible to simultaneously turn the lights off and on and to change their color, producing a sense of unity in the venue.
- Patent Document 2 discloses a more versatile cooperative system that allows each device to perform a unified operation. According to the cooperative system described in Patent Document 2, a parent device transmits a signal, a cooperative device receives it and repeatedly transmits a similar signal to the other cooperative devices, so that the signal can be propagated so as to spread like a ripple.
- However, neither Patent Document 1 nor Patent Document 2 describes a method for grasping the accurate position of each penlight or cooperating device, so a single display in which each device is responsible for a specific pixel cannot be configured.
- The present invention has been made in view of the above-described problems, and an object thereof is to provide a cooperative display system in which a plurality of terminals having a display unit cooperate to form a single pseudo display.
- A first invention for achieving the above-described object is a cooperative display system in which a plurality of terminals having a display unit cooperate to form a single pseudo display and display moving image content, the system comprising a server capable of communicating with the terminals. Each terminal executes a preparation process including the steps of: storing moving image data including display information for all pixels of the content; accepting input of a seat number; specifying the pixel position of the pixel it is responsible for, based on the seat number and map data indicating the correspondence between pixel positions in the pseudo display and seat numbers; and extracting from the moving image data the display information relating to the specified pixel position. The terminal then executes a display process including the step of displaying the display information in accordance with a command from the server.
- According to the first invention, a plurality of terminals having a display unit can cooperate to form a single pseudo display. In particular, in a venue where seat numbers are fixed, a pseudo display can be configured easily.
- The preparation process according to the first invention may further include a step of placing the extracted display information in a state where the terminal can immediately read it out in time-series order. As a result, it is possible to prevent an increase in the delay before display starts that would otherwise be caused by differences in the processing speed of each terminal.
- Decoding moving images places a heavy load on the CPU, so the time until display starts varies greatly with processing speed.
- By preparing in advance the display data of the single pixel each terminal is responsible for, instead of moving image data for the full number of pixels, the terminal can display immediately without decoding each time. As a result, variation in the display start delay can be minimized.
- The preparation process may further include the steps of the terminal specifying an audio channel and a volume based on information obtained from the display information, and placing sound information based on the specified audio channel and volume in a state where it can be read out in time-series order. As a result, the terminal can immediately read out the sound information during the display process.
- The display process may further include the steps of: the server transmitting a display start command for the first frame of the moving image data to the terminals; the terminal, upon receiving the display start command, displaying the display information in time-series order from the first frame; the server transmitting a display synchronization command to the terminals at predetermined intervals; the terminal, when it receives the display synchronization command before finishing the display information for a predetermined number of frames, interrupting that display and displaying the display information for the next predetermined number of frames; and the terminal, when it finishes the display information for the predetermined number of frames before receiving the display synchronization command, waiting until the command is received.
- the terminal may output the sound information in synchronization with displaying the display information.
- the display on the pseudo display and the sound can be linked.
- A second invention is a cooperative display system in which a plurality of terminals having a display unit cooperate to form a single pseudo display and display moving image content, the system comprising a server capable of communicating with the terminals and a photographing device for photographing the area of the pseudo display. Each terminal stores a program providing a function of transmitting its identification information to the server using a color or a light emission pattern. The system executes a preparation process including the steps of: the terminal converting its identification information into a color or a light emission pattern based on identification-information/light-emission conversion data indicating the correspondence between terminal identification information and colors or light emission patterns; the terminal displaying the converted color or light emission pattern; the photographing device photographing the display units of the terminals included in the pseudo display; the server analyzing the photographing result, extracting the color or light emission pattern for each terminal, and specifying the pixel position in the pseudo display from the position at which the color or light emission pattern was extracted; the server specifying the identification information from the extracted color or light emission pattern by referring to the identification-information/light-emission conversion data; the server transmitting the pixel position associated with the identification information to the terminals; the terminal receiving the pixel position associated with its own identification information; and the terminal extracting from the content the display information relating to that pixel position.
- The terminal then executes a display process including the step of displaying the display information in accordance with a command from the server.
- a plurality of terminals having a display unit can cooperate to form a single pseudo display.
- the assigned pixel can be mapped to each terminal even in a venue where a seat number is not defined.
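The identification-information-to-light-emission conversion used by the second invention can be sketched as a simple blink encoding. The bit width, MSB-first order, and on/off framing below are assumptions for illustration; the patent does not specify a particular encoding.

```python
# Hypothetical sketch: a terminal encodes its identification information as a
# per-frame on/off blink pattern, and the server decodes the pattern observed
# by the photographing device back into the ID. Bit width and MSB-first
# ordering are illustrative assumptions.

def id_to_emission_pattern(terminal_id: int, bits: int = 16):
    """Terminal side: convert an ID into one on/off value per frame."""
    if terminal_id >= 1 << bits:
        raise ValueError("terminal_id does not fit in the chosen bit width")
    # MSB first: each bit becomes one frame of light on (1) or off (0).
    return [(terminal_id >> i) & 1 for i in reversed(range(bits))]

def emission_pattern_to_id(pattern):
    """Server side: recover the ID from the observed on/off sequence."""
    value = 0
    for bit in pattern:
        value = (value << 1) | bit
    return value

pattern = id_to_emission_pattern(0b1010_0000_0000_0011)
assert emission_pattern_to_id(pattern) == 0b1010_0000_0000_0011
```

In practice the pattern would be repeated and framed so the server can find its start in the captured video; that detail is omitted here.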
- A third invention is a cooperative display system in which a plurality of terminals having a display unit cooperate to form a single pseudo display and display moving image content, the system comprising a server capable of communicating with the terminals and a photographing device for photographing the area of the pseudo display. The system executes an information collection process including the steps of: the terminal accepting input information from a user; the terminal converting the input information into a color or a light emission pattern based on input-information/light-emission conversion data indicating the correspondence between input information and colors or light emission patterns; the terminal displaying the converted color or light emission pattern; the photographing device photographing the display units of the terminals included in the pseudo display; the server analyzing the photographing result and extracting the color or light emission pattern; and the server specifying the input information from the extracted color or light emission pattern by referring to the input-information/light-emission conversion data.
- According to the third invention, since the server can read questionnaire results and messages from the terminals in the pseudo display simultaneously, the questionnaire results of tens of thousands of people can be aggregated at high speed, and a plurality of messages can be acquired in real time. It is thereby possible, for example, to select the moving image to be reproduced based on the questionnaire results.
- The questionnaire results can also be used for effects, such as displays that distinguish spectators by gender, age, or which performer they are a fan of.
- The display content can also be changed for each group of spectator seats (for example, the left side and the right side) based on information such as gender.
- A diagram showing an example of the output data 50; a flowchart showing the flow of the preparation process in the first embodiment
- A diagram showing an example of the input-information/light-emission conversion data 70; a flowchart showing the flow of the questionnaire aggregation process in the second embodiment
- a plurality of terminals having a display unit cooperate to form a single pseudo display.
- content such as a moving image is displayed on the pseudo display.
- FIG. 1 is a diagram showing an overview of a cooperative display system 1a in the first embodiment.
- the pseudo display 4 in FIG. 1 is not a display composed of a single casing, but a pseudo display composed of display units of a plurality of terminals 3 in different casings.
- the terminals 3 are arranged apart from each other.
- FIG. 1 as a space in which the pseudo display 4 is configured, a part of the spectator seats in a venue for holding an event such as a concert is schematically shown.
- A spectator possesses a terminal 3 such as a smartphone, for example, and sits in a seat with a predetermined seat number 5. Numbers such as A1 and A2 shown in FIG. 1 are examples of seat numbers 5.
- the cooperative display system 1a includes a plurality of terminals 3 and a server 2 that is communicably connected to each terminal 3.
- the communication means is not particularly limited, but short-range wireless communication technology such as BLE (Bluetooth (registered trademark) Low Energy) or WiFi may be used within the venue.
- the terminal 3 and the server 2 may perform communication not only directly but also via a relay terminal having a relay function. When exchanging data in advance outside the venue, the terminal 3 and the server 2 may communicate via the Internet.
- When tens of thousands of individual terminals 3 and the server 2 perform individual communications, the communication line can become overloaded. Therefore, in such a case, it is common to use a communication method (broadcast) in which information is distributed unilaterally from the server 2 side, without individual communication between each terminal 3 and the server 2.
- Each terminal 3 is arranged indoors where external light does not enter, outdoors at night, etc., and is responsible for display processing of a part of the content displayed on the pseudo display 4.
- each terminal 3 is in charge of display processing for one pixel, and a plurality of terminals 3 cooperate to realize a single pseudo display 4.
- The RGB values output to the display units of the plurality of terminals 3, that is, the collection of light spots, appear as if they were a dot picture.
- a plurality of terminals 3 cooperate to display the number “1” on the pseudo display 4.
- FIG. 1 illustrates 35 terminals 3, but the number of terminals 3 is not limited. Depending on the number of spectators of the event, a single pseudo display 4 can be constituted by several to tens of thousands of terminals 3. A plurality of pseudo displays 4 may be set in one venue. For example, when the audience seats are arranged around the stage, the pseudo display 4 may be set for each of the four directions viewed from the stage.
- The server 2 and the terminal 3 each include a CPU (abbreviation of “Central Processing Unit”) as a control unit, a memory as a main storage unit, an HDD (abbreviation of “Hard Disk Drive”), flash memory, or an external storage medium as an auxiliary storage unit, a liquid crystal display or touch panel display as a display unit, a keyboard, mouse, or touch panel display as an input unit, a wireless module as a wireless communication unit, and the like.
- An OS (abbreviation of “Operating System”), application programs, data necessary for processing, and the like are stored in the HDD or flash memory serving as the auxiliary storage unit.
- The CPUs of the terminal 3 and the server 2 read the OS and application programs from the auxiliary storage unit, load them into the main storage unit, control the other devices while accessing the main storage unit, and execute the processing described later.
- the terminal 3 may have a speaker or the like as a sound output unit.
- the server 2 may be composed of a single unit or a plurality of units.
- the server 2 connected to the terminal 3 via the Internet may be installed in a data center outside the venue.
- the server 2 connected to the terminal 3 via short-range wireless communication such as BLE is installed in the venue.
- the server 2 is not distinguished depending on the installation location.
- FIG. 2 is a diagram showing an example of download data.
- the download data is stored in advance in the server 2 and is common data distributed to each terminal 3.
- the download data includes, for example, moving image data 10, map data 20, hue audio conversion data 30, and luminance volume conversion data 40.
- the hue sound conversion data 30 and the luminance / volume conversion data 40 are used when the terminal 3 controls sound information switching based on display information when the sound information is output.
- Each terminal 3 downloads these data from the server 2 in advance.
- the moving image data 10 includes display information of all pixels of content displayed on the pseudo display 4.
- the display information included in the moving image data 10 is, for example, an RGB value for each pixel.
- the moving image data 10 may include sound information.
- the moving image data 10 may be a single moving image file or may be divided into a plurality of moving image files.
- the map data 20 is used for mapping the assigned pixel to each terminal 3.
- the map data 20 includes, for example, a seat number 21 of a seat included in the pseudo display 4 and a pixel position 22 of content corresponding to the seat number 21.
- the seat number 21 is a data item having the same meaning as the seat number 5 shown in FIG.
- the pixel position 22 indicates the position of each pixel of the content by the X coordinate and the Y coordinate.
- the terminal 3 can determine the assigned pixel based on the seat number 21 by referring to the map data 20.
- a plurality of moving image data 10 and map data 20 that are different for each pseudo display 4 may be distributed.
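As a concrete illustration of the map data 20 lookup described above, the sketch below maps a seat number to its assigned pixel position. The table contents are invented examples; the patent specifies only that each seat number 21 corresponds to a pixel position 22 given as X and Y coordinates.

```python
# Illustrative sketch of the map data 20: seat number 21 -> pixel position 22.
# The entries below are invented sample values, not data from the patent.
MAP_DATA = {
    "A1": (0, 0),
    "A2": (1, 0),
    "B1": (0, 1),
    "B2": (1, 1),
}

def assigned_pixel(seat_number: str):
    """Return the (X, Y) pixel this terminal is responsible for."""
    return MAP_DATA[seat_number]

assert assigned_pixel("A2") == (1, 0)
```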
- the hue audio conversion data 30 is used to control switching of the audio channel 32 by the hue 31, and the hue 31 and the audio channel 32 are associated with each other.
- In the hue audio conversion data 30, for example, the hue 31 specified by the RGB values may be divided into two ranges and associated with the left and right (L, R) audio channels 32.
- An existing color model can be used as a method for specifying a hue from RGB values.
- Although the hue 31 is taken as an example here, the information associated with the audio channel 32 is not limited to this example; any information obtained from the display information can be used.
- the terminal 3 may use the LSB (least significant bit) of RGB included in the moving image data and the LSB of the luminance value as a signal for switching the audio channel independently of the display.
- the luminance / volume conversion data 40 is used for controlling switching of the volume 42 by the luminance 41, and the luminance 41 and the volume 42 are associated with each other.
- the luminance / volume conversion data 40 may be associated so that the volume increases as the luminance 41 specified by the RGB value increases.
- An existing color model can be used as a method for specifying the luminance from the RGB values.
- Although the luminance 41 is mentioned here as an example, the information associated with the volume 42 is not limited to this example; any information obtained from the display information may be used.
- In this way, the terminal 3 can convert an RGB value into an audio channel and a volume using the hue audio conversion data 30 and the luminance volume conversion data 40, and can control display and sound in synchronization. Thereby, the display on the pseudo display 4 and the sound can be synchronized. Even though the volume of a single terminal 3 is small, by gathering tens of thousands of terminals it is possible, for example, to produce an effect in which the sound moves around the venue in conjunction with the movement of the light spots.
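The two conversions above can be sketched as follows. This is a minimal illustration assuming a half/half split of the hue circle for the L/R channels and a linear lightness-to-volume mapping; the patent does not fix these particular ranges, and `colorsys` is used here only as one convenient way to derive hue and a luminance-like value from RGB.

```python
import colorsys

# Illustrative sketch of the hue audio conversion data 30 and the luminance
# volume conversion data 40: hue selects the L/R channel, lightness sets the
# volume. The half/half hue split and the linear 0-100 volume scale are
# assumptions for this sketch.

def rgb_to_channel_and_volume(r: int, g: int, b: int):
    h, l, s = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)
    channel = "L" if h < 0.5 else "R"  # split the hue circle into two ranges
    volume = round(l * 100)            # louder as lightness increases
    return channel, volume

channel, volume = rgb_to_channel_and_volume(255, 255, 255)
assert volume == 100  # white has maximum lightness
```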
- FIG. 3 is a diagram illustrating an example of the output data 50.
- the output data 50 is display information and sound information in chronological order output from each terminal 3, and is generated based on the moving image data 10 downloaded in advance by each terminal 3. A specific generation method will be described later with reference to FIG.
- the output data 50 includes an RGB value 51, a volume 52, an audio channel 53, and the like.
- the RGB value 51 is display information displayed on the display unit of the terminal 3.
- the volume 52 is one piece of sound information output to the sound output unit of the terminal 3 and is the volume of the sound output from the speaker of the terminal 3.
- the audio channel 53 is one of sound information and is a number for identifying the audio channel of the audio output from the speaker of the terminal 3.
- the display information is not limited to the RGB value 51, but may be anything as long as it relates to the pixel value of each pixel, such as a gray scale value or an ON / OFF value relating to a specific hue.
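As an illustration of how the output data 50 relates to the moving image data 10, the sketch below extracts the time series for one assigned pixel from per-frame pixel arrays. The nested-list frame layout and the sample values are assumptions; the patent does not specify an in-memory format.

```python
# Illustrative sketch: from moving image data holding display information for
# every pixel and frame, keep only the time series of the assigned pixel.
# The list-of-rows frame layout is an assumed format.

def extract_output_data(video_frames, pixel_xy):
    x, y = pixel_xy
    # One RGB value per frame for the assigned pixel, in time-series order.
    return [frame[y][x] for frame in video_frames]

frames = [
    [[(255, 0, 0), (0, 255, 0)],
     [(0, 0, 255), (255, 255, 255)]],  # frame 0: 2x2 pixels
    [[(0, 0, 0), (10, 10, 10)],
     [(20, 20, 20), (30, 30, 30)]],    # frame 1
]
assert extract_output_data(frames, (1, 0)) == [(0, 255, 0), (10, 10, 10)]
```

The extracted list is what a terminal would keep resident in its main storage unit, so that each frame can be displayed without decoding.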
- FIG. 4 is a flowchart showing the flow of pre-preparation processing in the first embodiment.
- the terminal 3 downloads the application and data related to the cooperative display system 1a from the server 2 (step S101), and installs the application.
- the data to be downloaded is the data shown in FIG.
- the terminal 3 receives an input of the seat number 5 from the user via a touch display or the like (step S102).
- the means for receiving the seat number 5 is not limited to this example.
- the terminal 3 may scan a bar code having seat number information, or may acquire a seat number from an electronic ticket.
- the terminal 3 specifies the pixel position of its own responsible pixel based on the seat number and the map data 20 (step S103). Specifically, the terminal 3 refers to the map data 20 and specifies the pixel position 22 corresponding to the seat number input in step S102.
- The volume may be specified based on the luminance, and the sound may be stored as a single-channel (monaural) audio signal.
- the information for specifying the audio channel is not limited to the hue, and any information may be used as long as it is information obtained from the display information.
- the terminal 3 may use the LSB (least significant bit) of RGB included in the moving image data and the LSB of the luminance value as a signal for switching the audio channel independently of the display.
- the information for specifying the volume is not limited to the luminance, and any information may be used as long as the information is obtained from the display information.
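The LSB-based switching mentioned above can be illustrated as follows; the choice of the red component's least significant bit is an assumption made for this sketch.

```python
# Illustrative sketch: the least significant bit of an RGB component is
# visually negligible, so it can carry an audio-channel switch signal
# independently of the displayed image. Reading the red component's LSB
# is an assumed convention, not specified by the patent.

def audio_channel_from_lsb(red: int) -> str:
    return "R" if red & 1 else "L"

assert audio_channel_from_lsb(254) == "L"  # even red value: left channel
assert audio_channel_from_lsb(255) == "R"  # odd red value: right channel
```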
- the terminal 3 can display the display information for one pixel on the pseudo display 4 or output the sound information by generating the output data 50 of the pixel for which the terminal 3 is responsible.
- the terminal 3 makes the output data 50 of the assigned pixel resident in the main storage unit (step S105).
- Instead of making the output data 50 resident in the main storage unit, the terminal 3 may store the output data 50 in the auxiliary storage unit, divided into files of small capacity. In either case, the terminal 3 is in a state where it can immediately read the output data 50 of its assigned pixel during the display process. Similarly, the terminal 3 is in a state where it can immediately read, in time-series order, the sound information based on the volume and audio channel specified in step S104.
- Although the entire moving image data may be several hundred MB to several GB, the display is shared among several thousand to several tens of thousands of terminals 3, so the output data 50 for the pixel each terminal 3 is responsible for is several MB or less; this size can remain resident in the main storage unit. Each terminal 3 may also make the output data 50 resident in the main storage unit separately for each short moving image of several seconds.
- When controlling the sound information using the hue audio conversion data 30 and the luminance volume conversion data 40, the terminal 3 may make only the RGB values resident in the main storage unit as the output data 50, and convert them from RGB values into sound information in real time during the display process.
- the data that is finally required by the terminal 3 is only the output data 50 of the pixel that it is responsible for. Therefore, it is conceivable that after the user arrives at the venue, the terminal 3 transmits the seat number to the server 2, and the server 2 generates and transmits the output data 50 for each terminal 3. However, when the number of terminals 3 reaches tens of thousands, the load on the server 2 is concentrated in the same time zone. Therefore, in the present embodiment, the terminal 3 downloads common data from the server 2 in advance, and the terminal 3 generates necessary data. As a result, the load on the server 2 can be distributed over time.
- FIG. 5 is a flowchart showing the flow of display processing in the first embodiment.
- FIG. 5 shows the processing contents of the terminal 3.
- the terminal 3 displays the display information prepared in the preliminary preparation process in accordance with the command from the server 2.
- the output data 50 is divided into a plurality of sets at a predetermined interval, and the server 2 transmits a synchronization command to all the terminals 3 at the timing of switching the set.
- the server 2 transmits a synchronization command by broadcasting.
- The time-series order of the moving image files is specified by appending a serial number to the end of each moving image file name, so that a plurality of moving image files can be reproduced continuously.
- the terminal 3 sets the first frame of the first moving image file as a display target frame (step S201). If there are a plurality of contents, before step S201, the terminal 3 may receive from the server 2 a content specifying command for specifying which content is to be displayed, and determine the content to be displayed. .
- a display start command is received from the server 2 (step S202).
- the server 2 transmits a display start command by broadcasting.
- the terminal 3 may output sound information based on the volume 52 and the audio channel 53 of the output data 50 to the sound output unit in synchronization with displaying the pixel value of the current display target frame on the display unit.
- the terminal 3 confirms whether or not the final frame of the final moving image file has been reached (step S204).
- the terminal 3 sets the next frame as a display target frame (Step S205) and increments the synchronization counter by 1 (Step S206).
- step S207 the terminal 3 confirms whether or not a synchronization command has been received from the server 2 (step S207).
- the terminal 3 proceeds to step S210.
- the terminal 3 proceeds to step S208.
- step S208 the terminal 3 checks whether or not the synchronization counter has reached a predetermined number, that is, the number of frames included in one set. If the synchronization counter has not reached the predetermined number (No in step S208), the terminal 3 repeats from step S203. On the other hand, if the synchronization counter has reached the predetermined number (Yes in step S208), the terminal 3 waits until a synchronization command is received from the server 2 (step S209). When the synchronization command is received, the process proceeds to step S210.
- step S210 the terminal 3 resets the synchronization counter. Then, the terminal 3 sets the first frame of the next set (which may be the first set of the next moving image file) as a display target frame (step S211), and repeats from step S203. When the display target frame reaches the final frame of the final moving image file (Yes in step S204), the terminal 3 ends the process.
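The set-based synchronization of steps S201 to S211 can be sketched roughly as follows. The model is an illustrative simplification, not the patent's implementation: frames are grouped into fixed-size sets, and the number of frames a terminal manages to display before the synchronization command arrives stands in for its processing speed.

```python
# Illustrative sketch of the display loop of FIG. 5: on a synchronization
# command, a slow terminal abandons the remainder of the current set
# (S210/S211), while a fast terminal simply waits for the command before
# starting the next set (S209).

def play(sets, frames_before_sync):
    """sets: list of frame lists (one list per set).
    frames_before_sync: for each set, how many frames this terminal displays
    before the synchronization command arrives (a stand-in for speed)."""
    displayed = []
    for frames, budget in zip(sets, frames_before_sync):
        # Display at most `budget` frames of this set; any remainder is
        # skipped so that all terminals start the next set together.
        displayed.extend(frames[:budget])
    return displayed

sets = [[0, 1, 2, 3], [4, 5, 6, 7]]
# A fast terminal finishes every set and waits: nothing is skipped.
assert play(sets, [4, 4]) == [0, 1, 2, 3, 4, 5, 6, 7]
# A slow terminal is interrupted after 3 frames of the first set: frame 3
# is skipped and display resumes at the first frame of the next set.
assert play(sets, [3, 4]) == [0, 1, 2, 4, 5, 6, 7]
```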
- FIG. 6 is a diagram for explaining the synchronization processing in the first embodiment.
- As shown in FIG. 6A, when synchronization processing is not performed, even if the display timing is aligned by a display start command at the first frame, differences in the processing speed of the terminals 3 cause the display of a moving image with a long reproduction time to drift gradually until it shifts greatly. Therefore, as shown in FIG. 6B, the present embodiment performs synchronization processing: the server 2 periodically transmits a synchronization command. A terminal 3 with a low processing speed, that is, a terminal 3 that receives the synchronization command while it is still displaying the frames included in one set (like the terminal 3a shown in FIG. 6B),
- interrupts the display of the remaining frames of that set and displays the first frame of the next set.
- Conversely, a terminal 3 with a high processing speed, that is, a terminal 3 that finishes displaying all the frames included in one set before receiving the synchronization command, waits without displaying the next frame until the command is received. Accordingly, even a moving image with a long reproduction time can be kept from drifting gradually, and the display shift is minimized to the end.
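The terminal-side behavior above (the slow and fast cases of FIG. 6B, corresponding to steps S207–S211) can be sketched as follows; this is an illustrative model, not the specification's implementation, and the names and the set size of 30 frames are assumptions.

```python
FRAMES_PER_SET = 30  # illustrative: one set = 30 frames

def advance(frame_counter, sync_received):
    """Decide a terminal's next action at the end of one frame.

    Returns one of:
      "next_set"   - jump to the first frame of the next set
      "wait"       - whole set shown; wait for the sync command
      "next_frame" - display the next frame of the current set
    """
    if sync_received:
        # Slow terminal: a sync command arrived mid-set, so the remaining
        # frames of the set are skipped (FIG. 6B, terminal 3a).
        return "next_set"
    if frame_counter >= FRAMES_PER_SET:
        # Fast terminal: the whole set was displayed before the sync
        # command arrived, so it waits without displaying the next frame.
        return "wait"
    return "next_frame"
```

Both cases converge on the same first frame of the next set, which is what bounds the cumulative display shift.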
- As described above, the pseudo display 4 can be composed of a plurality of terminals 3 having display units operating in cooperation.
- The user inputs a seat number, and the terminal 3 maps the assigned pixel to itself by referring to the predetermined map data 20, so that a single pseudo display 4 can easily be configured.
- FIG. 7 is a diagram illustrating an outline of the cooperative display system 1b according to the second embodiment.
- In the first embodiment, the assigned pixel was mapped to each terminal 3 using the seat number 5, from which the position of each terminal 3 can be specified.
- The second embodiment provides a method for mapping the assigned pixel in a venue where the seat number 5 cannot be used.
- Elements similar to those of the first embodiment are given the same reference numerals, and duplicate description is omitted.
- The cooperative display system 1b comprises a plurality of terminals 3, a server 2 communicably connected to each terminal 3, and a photographing device 6 communicably connected to the server 2 that photographs the display units of all the terminals 3 included in the pseudo display 4.
- the server 2 and the imaging device 6 are connected by a wired connection such as a USB cable or an HDMI (registered trademark) cable.
- each terminal 3 is in charge of display processing for one pixel, and a plurality of terminals 3 cooperate to realize a single pseudo display 4.
- a plurality of terminals 3 cooperate to display the number “2” on the pseudo display 4.
- the imaging device 6 is a video camera that captures a moving image at a predetermined frame rate, and is installed at a position where a moving image having an image quality capable of identifying the light emission of the display unit of each terminal 3 is obtained.
- the imaging device 6 may be composed of one unit or a plurality of units.
- FIG. 8 is a diagram illustrating an example of the identification information light emission conversion data 60.
- The identification information light emission conversion data 60 is stored in advance in the storage units of the server 2 and the terminal 3.
- In the second embodiment, predetermined identification information light emission conversion data 60 is used.
- the identification information light emission conversion data 60 includes, for example, an RGB value 62 and a light emission pattern 63 for each identification number 61 for identifying the terminal 3.
- the RGB value 62 is color information displayed on the display unit of the terminal 3.
- The light emission pattern 63 indicates the pattern of timing at which the terminal 3 emits light: "0" means no light emission and "1" means light emission. When the frame rate is 30 fps, the light emission pattern 63 may consist of 30 such "0" or "1" digits.
- the RGB value 62 and the light emission pattern 63 are different combinations for each identification number 61.
- When the terminal 3 emits light based on the identification information light emission conversion data 60, the photographing device 6 photographs the display unit of the terminal 3, and the server 2 analyzes the photographing result, so that
- the identification number of the terminal 3 can be transmitted to the server 2. Further, since the server 2 can also specify the position of the terminal 3 within the pseudo display 4, the assigned pixel can be mapped to each terminal 3.
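This identification scheme can be sketched as a lookup in both directions; the table contents below are illustrative placeholders (the actual RGB values and patterns of FIG. 8 are not reproduced here).

```python
# Hypothetical identification information light emission conversion table:
# each identification number maps to a unique (RGB value, light emission
# pattern) pair, the pattern being one "0"/"1" digit per frame over a
# one-second window at 30 fps.
ID_TABLE = {
    1: ((255, 0, 0), "101010101010101010101010101010"),
    2: ((0, 255, 0), "110011001100110011001100110011"),
    3: ((0, 0, 255), "111000111000111000111000111000"),
}

def encode(identification_number):
    """Terminal side: look up the color and pattern to display (step S302)."""
    return ID_TABLE[identification_number]

def decode(rgb, pattern):
    """Server side: find the identification number whose (RGB, pattern)
    combination matches the one observed at a light emitting point."""
    for ident, (table_rgb, table_pattern) in ID_TABLE.items():
        if table_rgb == rgb and table_pattern == pattern:
            return ident
    return None  # no entry matches the observed combination
```

Because each identification number has a distinct (RGB, pattern) combination, the decode step is unambiguous as long as the photographed color and pattern are recovered correctly.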
- The identification information for identifying the terminal 3 is not limited to a serial identification number 61 as shown in FIG. 8 (for example, a serial number automatically assigned within the cooperative display system 1b); it may be an individual identification number of the terminal 3 or the mail address of the user of the terminal 3.
- FIG. 9 is a flowchart showing the flow of pre-preparation processing in the second embodiment.
- the server 2 and the terminal 3 store the identification information light emission conversion data 60 in the storage unit in advance.
- the terminal 3 downloads the application program and data from the server 2 (step S301), and installs the application program.
- This application program is a program for providing a function of transmitting the identification information of the terminal 3 to the server 2 using a color or a light emission pattern, that is, processing in steps S302 and S303 described later.
- Step S301 may be executed in advance outside the venue.
- the application includes the contents of the display position specifying process described later.
- The data to be downloaded includes the moving image data 10 shown in FIG. 2 and the identification information light emission conversion data 60 shown in FIG. 8, and also includes the hue sound conversion data 30 and the luminance volume conversion data 40.
- Each terminal 3 converts its identification information into a light emission pattern based on the identification information light emission conversion data 60 (step S302), and the terminals simultaneously display their light emission patterns on their display units in the hall (step S303).
- the imaging device 6 captures a moving image on the display unit of the terminal 3 in the pseudo display 4 and transmits the imaging result to the server 2 (step S304).
- the server 2 analyzes the photographing result by the photographing device 6, extracts the light emission pattern for each terminal 3, and specifies the light emission position, the identification number, and the light emission correction value of the terminal 3 (step S305).
- The server 2 stores the number of seats in the horizontal direction and the vertical direction in the storage unit as the number of pixels in the horizontal direction and the vertical direction. Then, the server 2 divides the image of each frame of the photographing result by the photographing device 6 into pixels based on the number of pixels in the horizontal and vertical directions, and specifies the pixel position of the light emitting point of the terminal 3.
- Alternatively, the server 2 may calculate the number of light emitting points from the image of the photographing result by the photographing device 6 and, in accordance with the content aspect ratio,
- calculate the number of pixels in the horizontal and vertical directions from the number of light emitting points. For example, when the number of light emitting points is 1,200 and the content aspect ratio is 4:3, the number of pixels in the horizontal direction is 40 and the number of pixels in the vertical direction is 30.
- the server 2 divides the image of the photographing result by the photographing device 6 into each pixel based on the number of pixels in the horizontal direction and the vertical direction, and specifies the pixel position of the light emitting point of the terminal 3.
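The arithmetic in the example above follows from w/h = 4/3 and w·h = 1,200; a minimal sketch of the calculation (function name is illustrative, and it assumes the light emitting points exactly fill a grid of the given aspect ratio):

```python
import math

def grid_from_points(num_points, aspect_w, aspect_h):
    """Derive the horizontal and vertical pixel counts from the number of
    light emitting points and the content aspect ratio.

    With w = aspect_w * k and h = aspect_h * k, the point count is
    num_points = aspect_w * aspect_h * k**2, so k is the integer square
    root of num_points / (aspect_w * aspect_h).
    """
    k = math.isqrt(num_points // (aspect_w * aspect_h))
    return aspect_w * k, aspect_h * k
```

For the example in the text, `grid_from_points(1200, 4, 3)` yields the 40 × 30 grid.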
- each terminal 3 is generally responsible for one pixel.
- a plurality of terminals 3 may be in charge of the same pixel, or there may be pixels that are not in charge of any terminal 3.
- the server 2 may store the number of pixels in the horizontal direction and the vertical direction as set values in the storage unit. In this case, in order to eliminate missing pixels in the content, it is desirable to set the number of pixels in the horizontal and vertical directions small.
- the server 2 calculates the color information and the light emission pattern for each light emission point from the image of the image taken by the imaging device 6. Then, the server 2 searches the identification information light emission conversion data 60 for the closest combination of the RGB value 62 and the light emission pattern 63 and specifies the identification number 61.
- The display units of terminals 3 such as smartphones differ in color and brightness depending on the model. Therefore, in order to absorb these differences, the server 2 specifies the light emission correction value of each terminal 3 from the image of the photographing result by the photographing device 6. For example, after displaying the light emission pattern for its identification number on the display unit, the terminal 3 displays a light emission pattern for performing color tone correction and luminance correction on the display unit.
- For example, all the terminals 3 display the light emission pattern with the same luminance setting value,
- and the server 2 calculates the average luminance of all the light emitting points and sets the difference between that average and the luminance of each light emitting point as the light emission correction value relating to luminance correction.
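The luminance part of this correction can be sketched as follows; the sign convention (positive means "raise the luminance setting") is an assumption for illustration.

```python
def luminance_corrections(luminances):
    """Given the measured luminance of each terminal's light emitting point
    (keyed by identification number), return a per-terminal correction
    value: the difference between the average of all points and the
    terminal's own luminance. A positive value means the terminal should
    raise its luminance setting; a negative value means lower it."""
    average = sum(luminances.values()) / len(luminances)
    return {ident: average - lum for ident, lum in luminances.items()}
```

A dimmer-than-average terminal thus receives a positive correction, matching the text's example of displaying the assigned pixel with a higher luminance setting value.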
- The server 2 transmits the pixel position of the assigned pixel and the light emission correction value for each terminal 3 to that terminal 3 (step S306). Further, the server 2 transmits the number of pixels in the horizontal and vertical directions to the terminal 3 as necessary. The server 2 may identify the destination terminal 3 based on the identification number and transmit the pixel position of the assigned pixel and the light emission correction value via the Internet or one-to-one wireless communication. Alternatively, the server 2 may broadcast the correspondence between the identification numbers of all the terminals 3, the pixel positions of the assigned pixels, and the light emission correction values via WiFi or the like.
- The terminal 3 receives the pixel position of the assigned pixel and the light emission correction value (step S307), and corrects the output data of the assigned pixel based on the light emission correction value (step S308). For example, a terminal 3 whose light emitting point is dimmer than the average corrects the output data so that the output data of the assigned pixel is displayed with a higher luminance setting value. Then, the terminal 3 makes the output data of the assigned pixel resident in the main storage unit (step S309). As necessary, the terminal 3 decodes the moving image of the content based on the number of pixels in the horizontal and vertical directions received from the server 2, and extracts the output data of the assigned pixel.
- the identification information light emission conversion data 60 including the light emission pattern for each identification number is created in advance, and the terminal 3 displays the light emission pattern corresponding to its own identification number on the display unit.
- the imaging device 6 images the display unit of the terminal 3, and the server 2 analyzes the imaging result of the imaging device 6 to identify the identification number of the terminal 3 and the pixel position of the assigned pixel.
- the assigned pixel can be mapped to each terminal 3. Note that the display process in the second embodiment can be executed in the same way as the display process in the first embodiment, and thus the description thereof is omitted.
- FIG. 10 is a diagram showing an example of the input information light emission conversion data 70.
- the input information light emission conversion data 70 is stored in advance in the storage unit of the server 2 and the terminal 3.
- In the second embodiment, predetermined input information light emission conversion data 70 is used in order to aggregate questionnaire results from the audience.
- the input information light emission conversion data 70 includes, for example, an RGB value 72 and a light emission pattern 73 for each input value 71 related to a questionnaire response.
- the RGB value 72 is a data item having the same meaning as the RGB value 62 shown in FIG.
- the light emission pattern 73 is a data item having the same meaning as the light emission pattern 63 shown in FIG.
- the RGB value 72 and the light emission pattern 73 are different combinations for each input value 71.
- In the example of FIG. 10, the questionnaire is answered with numbers such as 1, 2, 3, and so on.
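Because a photographed color rarely matches a table RGB value exactly, the server searches for the closest combination (as described for step S405 below). A hypothetical nearest-match lookup, with illustrative table contents:

```python
# Hypothetical input information light emission conversion table: each
# questionnaire answer (input value) maps to an RGB value and a pattern.
INPUT_TABLE = {
    1: ((255, 0, 0), "1010"),
    2: ((0, 255, 0), "1100"),
    3: ((0, 0, 255), "1110"),
}

def nearest_input_value(observed_rgb, observed_pattern):
    """Server side: among entries whose light emission pattern matches the
    observed one, pick the input value whose RGB value is closest (squared
    Euclidean distance in RGB space) to the observed color."""
    candidates = [
        (sum((a - b) ** 2 for a, b in zip(rgb, observed_rgb)), value)
        for value, (rgb, pattern) in INPUT_TABLE.items()
        if pattern == observed_pattern
    ]
    return min(candidates)[1] if candidates else None
```

Treating the pattern as an exact key and only the color as approximate is one plausible reading; the specification leaves the matching metric open.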
- FIG. 11 is a flowchart showing the flow of the questionnaire counting process in the second embodiment.
- the server 2 can execute an information collection process in which information is collected from a plurality of terminals 3 at once.
- the questionnaire totalization process is an example of an information collection process.
- The server 2 and the terminal 3 store the input information light emission conversion data 70 in the storage unit in advance. As illustrated in FIG. 11, the terminal 3 accepts input of a questionnaire answer from the user via the input unit (step S401).
- The terminal 3 converts the questionnaire answer input in step S401 into a light emission pattern based on the input information light emission conversion data 70 (step S402), and the converted light emission patterns are displayed all at once in the hall (step S403).
- the imaging device 6 images the display unit of the terminal 3 in the pseudo display 4 and transmits the imaging result to the server 2 (step S404).
- the server 2 analyzes the photographing result by the photographing device 6, extracts the light emission pattern for each terminal 3, and specifies the questionnaire result (step S405). Specifically, the server 2 calculates the color information and the light emission pattern for each light emission point from the image of the photographing result obtained by the photographing device 6. Then, the server 2 searches the input information light emission conversion data 70 for the closest combination of the RGB value 72 and the light emission pattern 73 and specifies the input value 71.
- The server 2 may transmit the questionnaire results to the terminals 3 via BLE or the like. Further, when there is a large-screen display different from the pseudo display 4 in the venue, the server 2 may output the questionnaire results tabulated in step S406 to the large-screen display.
- The number of terminals 3 is not limited as long as the light emission of the display unit of each terminal 3 can be identified from the imaging result of the imaging device 6. Since the server 2 can read the questionnaire answers from all the terminals 3 in the pseudo display 4 simultaneously, it can tabulate the answers of tens of thousands of people at high speed. This makes it possible to select the moving image to be reproduced based on the questionnaire results; for example, the video content with the most votes can be reproduced preferentially.
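Once each light emitting point has been decoded to an input value, tabulating the answers and choosing the most-voted content reduces to a simple count; a sketch (function name illustrative):

```python
from collections import Counter

def tally(decoded_votes):
    """Count the decoded questionnaire answers and return the totals
    together with the winning input value (the most common answer)."""
    totals = Counter(decoded_votes)
    winner, _ = totals.most_common(1)[0]
    return dict(totals), winner
```

This per-answer counting is independent of the number of terminals, which is why tens of thousands of answers read in one photographed frame set can be tabulated quickly.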
- The questionnaire results can also be used for effects such as displays that distinguish viewers by gender, age, or which performer they are a fan of.
- the display content can be changed for each group of passenger seats (for example, the left side and the right side) based on information such as sex.
- the cooperative display system 1b can also implement message collection processing as an example of information collection processing.
- the input information light emission conversion data 70 shown in FIG. 10 is expanded, and light emission patterns are associated not only with numbers but also with characters (for example, light emission patterns are defined in association with character codes).
- the terminal 3 switches the light emission pattern for each character constituting the message to be sent and displays it on the display unit.
- the server 2 analyzes the photographing result by the photographing device 6, specifies the character for each light emission pattern, and restores the message. Then, the server 2 may output the restored message on a large screen display different from the pseudo display 4 in real time.
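Extending the conversion data to character codes as described, a terminal can emit one pattern per character and the server can restore the message. A sketch in which each character's code point is sent as a fixed-width binary pattern; the 16-bit width and the direct use of code points are assumptions, since the specification does not fix the pattern format.

```python
BITS = 16  # illustrative fixed width, enough for the Basic Multilingual Plane

def char_to_pattern(ch):
    """Terminal side: encode one character of the message as a light
    emission pattern of "0"/"1" digits (its code point in binary)."""
    return format(ord(ch), f"0{BITS}b")

def restore_message(patterns):
    """Server side: decode the sequence of photographed patterns back into
    the original message, one character per pattern."""
    return "".join(chr(int(p, 2)) for p in patterns)
```

At 30 fps a 16-bit pattern costs about half a second per character, which suggests why the text pairs patterns with color: color can carry additional bits per frame.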
Description
FIG. 1 is a diagram showing an outline of the cooperative display system 1a according to the first embodiment. The pseudo display 4 in FIG. 1 is not a display composed of a single housing but a pseudo display composed of the display units of a plurality of terminals 3 in different housings. The terminals 3 are arranged apart from one another. FIG. 1 schematically shows, as the space in which the pseudo display 4 is configured, part of the audience seating of a venue holding an event such as a concert. Each spectator carries a terminal 3 such as a smartphone and sits in a seat with a predetermined seat number 5. The numbers such as A1 and A2 shown in FIG. 1 are the seat numbers 5.
FIG. 7 is a diagram showing an outline of the cooperative display system 1b according to the second embodiment. In the first embodiment, the assigned pixel was mapped to each terminal 3 using the seat number 5, from which the position of each terminal 3 can be specified; the second embodiment provides a method for mapping the assigned pixel in a venue where the seat number 5 cannot be used. Elements similar to those of the first embodiment are given the same reference numerals, and duplicate description is omitted.
2………Server
3………Terminal
4………Pseudo display
5………Seat number
6………Photographing device
10………Moving image data
20………Map data
30………Hue sound conversion data
40………Luminance volume conversion data
50………Output data
60………Identification information light emission conversion data
70………Input information light emission conversion data
Claims (7)
- A cooperative display system in which a plurality of terminals each having a display unit cooperate to constitute a single pseudo display and display moving image content, the system comprising
a server capable of communicating with the terminals,
wherein the system executes preparation processing including:
a step in which the terminal stores moving image data containing display information for all pixels of the content;
a step in which the terminal accepts input of a seat number;
a step in which the terminal specifies the pixel position of its assigned pixel based on map data indicating the correspondence between seat numbers and pixel positions in the pseudo display; and
a step in which the terminal extracts, from the moving image data, the display information relating to the specified pixel position,
and display processing including:
a step in which the terminal displays the display information in accordance with a command from the server. - The cooperative display system according to claim 1, wherein the preparation processing further includes
a step in which the terminal places the extracted display information in a state in which it can be read immediately in chronological order. - The cooperative display system according to claim 2, wherein the preparation processing further includes:
a step in which the terminal specifies an audio channel and a volume based on information obtained from the display information; and
a step in which the terminal places sound information based on the specified audio channel and volume in a state in which it can be read immediately in chronological order. - The cooperative display system according to any one of claims 1 to 3, wherein the display processing further includes:
a step in which the server transmits to the terminal a display start command for the first frame of the moving image data;
a step in which, upon receiving the display start command, the terminal displays the display information in chronological order from the first frame;
a step in which the server transmits a display synchronization command to the terminal at predetermined intervals; and
a step in which, when the terminal has displayed the display information for a predetermined number of frames before receiving the display synchronization command, the terminal displays the display information for the next predetermined number of frames after receiving the display synchronization command, and when the terminal receives the display synchronization command before displaying the display information for the predetermined number of frames, the terminal immediately displays the display information for the next predetermined number of frames. - The cooperative display system according to claim 3, wherein in the display processing, the terminal outputs the sound information in synchronization with displaying the display information.
- A cooperative display system in which a plurality of terminals each having a display unit cooperate to constitute a single pseudo display and display moving image content, the system comprising
a server capable of communicating with the terminals, and a photographing device that photographs an area within the pseudo display,
wherein the system executes preparation processing including:
a step in which the terminal stores a program for providing a function of transmitting identification information of the terminal to the server using a color or a light emission pattern;
a step in which the terminal converts its own identification information into the color or the light emission pattern based on identification information light emission conversion data indicating the correspondence between the identification information of the terminal and the color or the light emission pattern;
a step in which the terminal displays the converted color or light emission pattern;
a step in which the photographing device photographs the display units of the terminals included in the pseudo display;
a step in which the server analyzes the photographing result by the photographing device and extracts the color or the light emission pattern for each terminal;
a step in which the server specifies a pixel position within the pseudo display based on the position of the extracted color or light emission pattern;
a step in which the server refers to the identification information light emission conversion data and specifies the identification information based on the extracted color or light emission pattern;
a step in which the server transmits to the terminal the pixel position associated with the identification information; and
a step in which the terminal receives the pixel position associated with the identification information and extracts, from the content, display information relating to the pixel position associated with its own identification information,
and display processing including:
a step in which the terminal displays the display information in accordance with a command from the server. - A cooperative display system in which a plurality of terminals each having a display unit cooperate to constitute a single pseudo display and display moving image content, the system comprising
a server capable of communicating with the terminals, and a photographing device that photographs an area within the pseudo display,
wherein the system executes information collection processing including:
a step in which the terminal inputs input information entered by a user;
a step in which the terminal converts the entered input information into a color or a light emission pattern based on input information light emission conversion data indicating the correspondence between the input information and the color or the light emission pattern;
a step in which the terminal displays the converted color or light emission pattern;
a step in which the photographing device photographs the display units of the terminals included in the pseudo display;
a step in which the server analyzes the photographing result by the photographing device and extracts the color or the light emission pattern for each terminal; and
a step in which the server refers to the input information light emission conversion data and specifies the input information based on the extracted color or light emission pattern.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-238929 | 2016-12-09 | ||
JP2016238929A JP2020030223A (ja) | 2016-12-09 | 2016-12-09 | 連携表示システム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018105436A1 true WO2018105436A1 (ja) | 2018-06-14 |
Family
ID=62491954
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/042512 WO2018105436A1 (ja) | 2016-12-09 | 2017-11-28 | 連携表示システム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2020030223A (ja) |
WO (1) | WO2018105436A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023067933A1 (ja) * | 2021-10-19 | 2023-04-27 | キヤノン株式会社 | 画像表示システム |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024034412A1 (ja) * | 2022-08-10 | 2024-02-15 | ソニーグループ株式会社 | 情報処理システム、情報処理装置および方法、並びにプログラム |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08125760A (ja) * | 1994-10-28 | 1996-05-17 | Hitachi Ltd | 情報処理装置 |
JP2004264735A (ja) * | 2003-03-04 | 2004-09-24 | Nec Corp | 同期式映像再生システム |
JP2005244931A (ja) * | 2004-01-26 | 2005-09-08 | Seiko Epson Corp | マルチ画面映像再生システム |
JP2007110582A (ja) * | 2005-10-17 | 2007-04-26 | Sony Corp | 画像表示装置および方法、並びにプログラム |
JP2008164986A (ja) * | 2006-12-28 | 2008-07-17 | Fuji Electric Holdings Co Ltd | 映像表示システム |
JP2015022592A (ja) * | 2013-07-19 | 2015-02-02 | 株式会社リコー | 集合出力システム、端末装置および出力プログラム |
JP2016038514A (ja) * | 2014-08-08 | 2016-03-22 | キヤノン株式会社 | 表示制御装置、表示装置、それらの制御方法、およびプログラム |
- 2016-12-09: JP application JP2016238929A filed (patent JP2020030223A, active, Pending)
- 2017-11-28: WO application PCT/JP2017/042512 filed (patent WO2018105436A1, active, Application Filing)
Non-Patent Citations (1)
Title |
---|
"Three companies in Akita will collaborate to develop ''huge screen'' using smartphones", AKITAKEIZAI SHIMBUN, 21 July 2016 (2016-07-21), Retrieved from the Internet <URL:https://akita.keizai.biz/headline/2532> [retrieved on 20180118] * |
Also Published As
Publication number | Publication date |
---|---|
JP2020030223A (ja) | 2020-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11812232B2 (en) | Electronic device and music visualization method thereof | |
KR101970358B1 (ko) | 중앙 서버 및 이를 포함하는 공연 시스템 | |
US11297253B2 (en) | Control system for controlling plurality of user terminals | |
US9398255B2 (en) | Information processing apparatus, information processing system and information processing method | |
US9524698B2 (en) | System and method for collectively displaying image by using a plurality of mobile devices | |
US10303419B2 (en) | Information processing system, display processing apparatus, display processing method, and recording medium | |
JPWO2016110943A1 (ja) | 映像表示装置、映像表示方法、及び映像表示システム | |
CN105472399A (zh) | 一种转播视频与现场显示视频分离方法及装置 | |
WO2018105436A1 (ja) | 連携表示システム | |
US20230283888A1 (en) | Processing method and electronic device | |
CN104469078A (zh) | 互动投影控制方法和系统 | |
JP6241103B2 (ja) | 集合出力システム、端末装置および出力プログラム | |
US10244207B2 (en) | Videoconference communication device | |
WO2019102886A1 (ja) | 演出制御システム、制御装置、および無線通信端末 | |
JP6600975B2 (ja) | コレオグラフィ作成支援方法、コレオグラフィ作成支援プログラムおよび情報処理装置 | |
US10970828B2 (en) | Image processing apparatus, image processing system, image processing method, and recording medium | |
JP2016225823A (ja) | 表示システム、情報端末、表示装置、再生制御プログラム及び再生制御方法 | |
CN111143014B (zh) | 一种拼墙系统场景缩略图的生成方法及系统 | |
JP2015046016A (ja) | 画像処理サーバ、画像処理システム及びプログラム | |
KR20170063273A (ko) | 위치 기반 스마트 단말 그룹 제어 장치 및 방법 | |
KR20240050054A (ko) | 디스플레이 장치 및 그 동작 방법 | |
JP2022038824A (ja) | 照明システム、照明演出方法およびプログラム | |
KR20140108385A (ko) | 다중 모바일 기기를 통한 영상의 화소분할 동시 재생 기술 | |
JP2021086372A (ja) | 表示装置およびその制御方法 | |
JP2013229775A (ja) | カメラ映像中継システム、スタジオ装置、映像中継用カメラ |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17878014 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 16/09/2019) |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17878014 Country of ref document: EP Kind code of ref document: A1 |