WO2014045699A1 - Sensor share control device, method, and computer program - Google Patents
Sensor share control device, method, and computer program
- Publication number
- WO2014045699A1 (PCT/JP2013/069637)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- policy
- request
- permission
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/10—Network architectures or network communication protocols for network security for controlling access to devices or network resources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/20—Network architectures or network communication protocols for network security for managing network security; network security policies in general
Definitions
- the present invention relates to a sensor sharing control method, and more particularly to a sensor control apparatus, method, and computer program for a plurality of sensor providers and a plurality of sensor users.
- In Patent Documents 1 to 3, a special-purpose real-time information collection system is disclosed in which the sensor provider registers its own permission policy and sensors in the management system, the user registers usage conditions and the sensor functions to be connected, and the management system, tracking the state of each endpoint, compares usage requests with the acceptable policies.
- In these systems, however, the policy on the sensor user's side is not taken into consideration.
- Nor is any device provided that dynamically controls sharing while a sensor or sensor user is moving.
- The present invention has been made in view of such circumstances, and an object thereof is to provide an apparatus, a method, and a computer program for performing sensor sharing control dynamically, flexibly, and seamlessly. Another object is to provide a sensor sharing control apparatus and method that appropriately consider the policy on the side using the sensor. Another object is to provide an apparatus and a method for performing dynamic sensor sharing control when the policy of a sensor or of a sensor user is changed. Another object is to provide a sensor sharing control apparatus and method that take into account the attributes of the sensor user or sensor provider. Another object is to provide an apparatus and method for performing sensor sharing control in real time. Another object is to provide an apparatus and a method for dynamically tracking a moving object by real-time sensor sharing control.
- The present invention is a sensor sharing control apparatus in which: a usage request processing unit has means for receiving sensor usage request information from a sensor user terminal, recording the usage request as a request policy, and notifying the sensor search unit that the request policy has been recorded; a usage permission processing unit has means for receiving sensor usage permission information from a sensor provider terminal, recording it as a permission policy, and notifying the sensor search unit that the permission policy has been recorded; and the sensor search unit has means for searching for a permission policy that matches the request policy and creating a list of the sensors included in the matched permission policy, based on the records in the request policy.
- The use request processing unit has means for authenticating, from the sensor use request information, whether the sensor user is a valid user and, when the authentication succeeds, records the sensor use request information in the request policy.
- The use permission processing unit has means for authenticating, from the sensor use permission information, whether the sensor provider is a valid provider and, when the authentication succeeds, records the sensor use permission information in the permission policy.
- the status management unit has means for managing the status of each sensor, creating a list of available sensors in response to a request from the sensor search unit, and transmitting the list to the sensor search unit.
- The access management unit has means for receiving a list of the sensors from the sensor search unit, creating a token that allows the sensor user terminal to control the sensors in the list, and transmitting the token to the sensor user terminal.
- the means for receiving the sensor use request information includes means for periodically receiving the position information of the sensor user terminal and updating the request policy.
- the status management unit has means for periodically acquiring sensor position information and updating the permission policy.
- the sensor use request information includes the user terminal ID, sensor type, position, and sensitivity.
- the sensor usage permission information includes the sensor type, installation location, sensor serial, and sensitivity.
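As a concrete illustration, the two information records above might be encoded as follows. This is a minimal sketch; all field names and values are hypothetical, since the text lists only which items are included, not a data format.

```python
# Hypothetical encodings of the sensor use request and use permission records.
# Field names and values are illustrative, not taken from the patent.
sensor_use_request = {
    "user_terminal_id": "TERM-0001",                      # sensor user's terminal
    "sensor_type": "MOVIE",                               # e.g. MOVIE, still image, infrared
    "position": {"lat": 35.6812, "lon": 139.7671, "radius_m": 500},
    "sensitivity": "Low",                                 # requested security level
}

sensor_use_permission = {
    "sensor_type": "MOVIE",
    "installation_location": {"lat": 35.6815, "lon": 139.7680, "fixed": True},
    "sensor_serial": "SN-001234",                         # sensor serial number
    "sensitivity": "Mid",                                 # security level of the sensor
}
```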
- Also provided is a sensor user terminal that communicates with the sensor sharing control apparatus, receives from it a list of controllable sensors together with their position information, displays each controllable sensor existing in the designated range as an object at the sensor's position, and comprises means for transmitting a signal for controlling a selected sensor.
- Also provided is a sensor that communicates with the sensor sharing control device, receives from it a token for enabling control of the sensor's functions and stores the token, and comprises means for receiving a token from a sensor user terminal and means for enabling control of its functions from that terminal when the stored token matches the received token.
- a usage request processing unit that receives a sensor usage request
- a usage permission processing unit that receives a sensor usage permission
- a sensor search unit that searches for available sensors by matching the sensor usage request and sensor usage permission.
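The matching performed by the sensor search unit can be sketched as follows. This is an illustrative simplification (the field names, including `clearance`, are our assumptions); a real implementation would also compare periods, locations, functions, and user-limited attributes.

```python
LEVELS = {"Low": 0, "Mid": 1, "High": 2}

def policy_match(request: dict, permission: dict) -> bool:
    """Minimal policy matching: the sensor type must agree, and the
    sensor's security level must not exceed the requester's clearance."""
    if request["sensor_type"] != permission["sensor_type"]:
        return False
    return LEVELS[permission["security_level"]] <= LEVELS[request["clearance"]]

request = {"sensor_type": "MOVIE", "clearance": "Mid"}
permissions = [
    {"sensor_serial": "SN-1", "sensor_type": "MOVIE", "security_level": "Low"},
    {"sensor_serial": "SN-2", "sensor_type": "MOVIE", "security_level": "High"},
    {"sensor_serial": "SN-3", "sensor_type": "infrared", "security_level": "Low"},
]
# the list of sensors included in the matched permission policies
sensor_list = [p["sensor_serial"] for p in permissions if policy_match(request, p)]
```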
- Each controllable sensor existing in the range is displayed as an object at the sensor's position.
- The method comprises the step of transmitting a signal for controlling the selected sensor to the sensor sharing control device.
- the controllable sensor is dynamically updated so that the updated sensor is displayed as an object.
- the center of the sensor search range is a moving object
- the position of the moving object is captured by information from the sensor
- the sensor search range is dynamically changed as the moving object moves.
- A block diagram showing the structure of Embodiment 1 of the present invention is shown.
- An example of a request policy in which a sensor user requests the management server to use a sensor is shown.
- An example of an authorization policy in which the sensor provider permits the management server to use the sensor is shown.
- the flowchart for sensor sharing control is shown.
- the flowchart of user authentication and policy recording is shown.
- the flowchart of a sensor search part is shown.
- the transition flowchart of MAP mode, VIEW mode, and solo mode is shown.
- The flowchart of MAP mode is shown.
- the flowchart of VIEW mode is shown.
- the block diagram of Embodiment 2 of this invention is shown.
- the flowchart of solo mode is shown.
- the present invention can be implemented by dedicated hardware.
- The present invention can also be implemented as a computer program that can be executed by a computer. Therefore, the present invention can take the form of a hardware embodiment of a management server connected so as to be capable of data communication, a software embodiment, or a combination of software and hardware.
- the computer program can be recorded on any computer-readable recording medium such as a hard disk, DVD, CD, optical storage device, magnetic storage device or the like.
- FIG. 1 is a block diagram showing the configuration of the management server 110 according to Embodiment 1 of the present invention.
- The general flow is that the sensor user 120 requests the use request processing unit 111 to use a sensor, a suitable sensor is searched for in the sensor group 150 provided by the sensor providers, and that sensor becomes usable from the terminal 121 owned by the sensor user.
- The management server 110 includes a use request processing unit 111 that receives requests from users, a request policy DB 112 that records request policies, a use permission processing unit 113 that registers sensors provided by the sensor provider 130, a permission policy DB 114 that records sensor usage permission contents, a sensor search unit 115 that searches for available sensors by matching request policies against permission policies, an access management unit 116 that presents the retrieved sensors to the user, and a status management unit 117 that manages the statuses of the sensor group 150.
- The access control unit 140 controls the various accesses from the user terminal 121 to each function of the sensor.
- the access control unit 140 is illustrated separately from the sensor.
- the access control unit 140 performs identity authentication (confirms the identity of the user) when using the sensor.
- re-authentication is omitted in cooperation with the access management unit 116.
- the access management unit 116 preferably performs personal authentication when using the sensor.
- the access management unit 116 issues a sensor usable token to the user terminal 121, and the access control unit 140 permits access to the user terminal having the token.
- the sensor group 150 represents a plurality of sensors provided by a plurality of sensor providers. Examples of sensors include still images, moving images, voice, human body detection, infrared rays, atmospheric pressure, temperature, humidity, wind speed, smoke detection, and the like. Generally, anything that converts various external information into an electrical signal and sends it can be used as a sensor. In the present invention, an embodiment using a video camera as a sensor will be described, but other sensors can be implemented without departing from the scope of the present invention.
- the certification body 160 is an organization that issues digital certificates required for encryption communication and the like to electronic commerce businesses.
- When the sensor user 120 or the sensor provider 130 submits a usage request or usage permission, the certification authority (CA) 160 determines whether the identity of the user, the user terminal, and the sensor itself can be trusted. Note that the identification is not necessarily performed by a CA and can be replaced by a known authentication technique.
- Using the electronic certificate, the issuing certificate authority is checked to determine whether the certificate can be trusted. Trustworthiness is confirmed by checking the higher-level authorities that certify that authority, finally reaching a root certificate authority that matches a root certificate already held.
- the usage request or usage permission may be encrypted with a digital certificate (key). Encryption and decryption techniques using public keys and private keys (personal keys) are known and will not be described in detail.
- After verification by the certification authority 160, the contents of the use request or use permission are recorded as the request policy and the permission policy, respectively.
- Fig. 2 shows an example of a request policy in which the sensor user requests the management server to use the sensor.
- The request policy 112 is information representing the attributes of the user; for the most part, it is filled in with the minimum conditions of the sensor that the user wishes to use.
- “filling” refers to recording in a storage area as electronic information rather than writing on a physical medium such as paper.
- It consists of a “use requester ID” for identifying the user, a “user terminal ID” for identifying the sensor user's terminal, a “sensor type” indicating the type of sensor to be used, a “control level” indicating the degree to which the sensor is controlled, a “security level” indicating the sensitivity of the sensor, a “period” indicating the period of use, a “location” indicating the location of the sensor, a “function” indicating the functions of the sensor, and a “user attribute” indicating additional attributes of the user.
- the request policy 112 can be appropriately added as a set of user attribute names and attribute values.
- the “use requester ID” for identifying the user may be anything as long as it can identify the user. For example, a resident identification number. If there is no such number, the use request processing unit 111 may assign a sequential number to each user.
- User terminal ID is information for identifying a sensor user's terminal. Preferably, further detailed information is added to the subsequent user attribute column.
- the terminal type, terminal vendor, operating OS, etc. are described as user attributes, and the attribute values are described. This makes it possible to perform fine sharing control for a specific device, model, and OS.
- “Sensor type”, indicating the type of sensor to be used, is filled in with information indicating the kind of mechanical sensor that obtains external information, such as MOVIE (camera), still image (camera), or infrared sensor.
- The “security level” indicating the sensitivity of the sensor specifies its level of confidentiality. Sensors operated by public bodies, government, and police have a high security level; sensors installed by individuals have a low one; and those installed by companies are intermediate. If this item is specified as Low, almost all sensors will be found by the search, but whether each can actually be used depends on the attributes of the user. In particular, when highly sensitive sensors are used, digital certificates are mandatory for users, user terminals, providers, and the provided sensors in order to confirm identity and the reliability of the data.
- “Location” indicating the location of the sensor specifies the area in which the desired sensor is located. Normally a specific location is required, so “Yes” is selected and the range in which the sensor exists is specified by latitude, longitude, and radius. A rectangular range (X1, Y1)-(X2, Y2) can also be specified. A direction is entered when the orientation of the camera is to be instructed. In the case of FIG. 2, “no designation” is entered, so the orientation of the sensor is not questioned.
- “Function” indicating the function of the sensor specifies the function that the sensor wants to have and its parameters.
- When the sensor type is MOVIE (camera), the items stream, audio, resolution, transfer speed, and snapshot (still image) can be specified. If audio is required, YES is selected for the audio item.
- The resolution is specified by the number of horizontal and vertical pixels. To use a resolution of XGA or higher, “1024x768 upper” is entered as shown in FIG. 2. Similarly, the transfer speed of the moving image is specified, as is the presence or absence of a snapshot function for still images.
- controllable items such as camera orientation (left / right / up / down), zoom in / out, etc. are additionally specified.
- policies can be additionally specified for various functions depending on the sensor type. For example, in the case of a radio wave sensor, an antenna rotation function by a motor can be specified.
- User attribute indicating the attribute of the user specifies what category the user belongs to by attribute name and attribute value. For example, information such as gender, age, address, occupation, workplace, terminal type, terminal vendor, and operating OS is included. These are referred to when the sensor provider designates a user-limited attribute, and a user who does not meet the designated condition cannot use the sensor.
- For example, if “Linux 3.0.0.0 upper” is described in the sensor provider's user-limited attributes and the user's terminal does not satisfy it, the sensor cannot be used. If the content of the request policy 112 changes, the use request processing unit 111 notifies the sensor search unit 115 that the content has changed, and the sensor search unit 115 performs the search again based on this information.
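Values such as “1024x768 upper” and “Linux 3.0.0.0 upper” both use a trailing “upper” to mean “this value or higher”. A sketch of how such conditions might be evaluated during matching follows; the function names are ours, as the text only describes the behavior.

```python
def resolution_at_least(offered: str, required: str) -> bool:
    """Evaluate a resolution condition such as '1024x768 upper'
    against an offered resolution such as '1920x1200'."""
    base = required.replace("upper", "").strip()
    rw, rh = (int(v) for v in base.split("x"))
    ow, oh = (int(v) for v in offered.split("x"))
    if required.endswith("upper"):
        return ow >= rw and oh >= rh
    return (ow, oh) == (rw, rh)

def meets_os_limit(user_os: str, user_version: str, limit: str) -> bool:
    """Evaluate a user-limited attribute such as 'Linux 3.0.0.0 upper':
    the OS name must match and the version must be at least as high."""
    name, version, *rest = limit.split()
    if user_os != name:
        return False
    required = tuple(int(v) for v in version.split("."))
    actual = tuple(int(v) for v in user_version.split("."))
    if rest and rest[0] == "upper":
        return actual >= required
    return actual == required
```

A terminal reporting Linux 2.6.39.4 would thus fail a “Linux 3.0.0.0 upper” limitation, so that permission policy would not match.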
- Fig. 3 shows an example of an authorization policy that allows the sensor provider to allow the management server to use the sensor.
- The permission policy 114 is information representing the attributes on the sensor provider side; in many cases the usage permission conditions of the provided sensor are written in it. It consists of a “sensor provider ID” for identifying the provider, a “sensor type” indicating the type of sensor to be permitted, a “sensor serial” indicating the sensor serial number, a “control level” indicating the degree to which the sensor may be controlled, a “security level” indicating the sensitivity of the sensor, a “period” indicating the period of use, an “installation location” indicating the location of the sensor, a “function” indicating the functions of the sensor, and “user-limited attributes” that restrict who may use it.
- the permission policy 114 can be appropriately added as a set of attribute names and attribute values of sensors provided or used as necessary.
- the “sensor provider ID” for identifying the provider may be any information as long as it can identify the provider. For example, a resident identification number. When there is no such number, the use permission processing unit 113 may allocate a sequential number for each provider.
- Sensor type indicating the type of sensor to be provided is information indicating the type of mechanical sensor that obtains external information, such as MOVIE (camera), still image (camera), infrared sensor.
- “Control level”, indicating the degree to which the sensor may be controlled, is set to “FULL” when full access to all functions of the sensor is permitted; otherwise LOW, MID, or a specific level is entered.
- The “security level” indicating the sensitivity of the sensor specifies its level of confidentiality. Sensors operated by public bodies, government, and police have a high security level; sensors installed by individuals have a low one; and those installed by companies are intermediate. When this item is High, users are restricted; usually the conditions are presented in the subsequent user-limited attributes. In particular, when permitting the use of highly sensitive sensors, digital certificates are mandatory for users, user terminals, providers, and the provided sensors in order to confirm identity and ensure data reliability.
- “Installation location” indicating the installation location of the sensor specifies where the provided sensor is installed. “Fixed” indicates that the position does not move; “non-fixed” means a sensor whose position may move, such as an in-vehicle camera. For a fixed sensor, latitude and longitude are entered as the coordinates, and if the camera direction is fixed, the direction is entered as well. For a non-fixed sensor, the status management unit 117 periodically updates the coordinates based on position information from the sensor, such as GPS. The update period differs by sensor type: a sensor that hardly moves may be updated every few hours to a few days, while a very fast-moving sensor may be updated every few milliseconds.
- “Function” indicating the sensor function specifies the function of the sensor and its parameters.
- When the sensor type is MOVIE (camera), the items stream, audio, resolution, transfer speed, and snapshot can be selected.
- If the sensor is a type that transmits camera video in stream format, YES is selected for the stream item. If the sensor can also transmit voice information, YES is selected for the voice item.
- The resolution of the sensor is specified by the number of horizontal and vertical pixels; in the case of FIG. 3, “1920×1200” is entered. Similarly, the transfer speed of moving images and the presence or absence of a snapshot function for still images are designated.
- “User-limited attributes” that limit users, specify the user attribute conditions that must be met at least when using the sensor.
- The attribute condition is specified by an attribute name and an attribute value, for example information such as gender, age, address, occupation, workplace, terminal type, terminal vendor, and operating OS. Users who do not meet this condition cannot use the sensor. If the content of the permission policy 114 changes, the use permission processing unit 113 notifies the sensor search unit 115 that the content has changed, and the sensor search unit 115 performs the search again based on this information.
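The “location” and “installation location” fields above specify coordinates with either a radius or a rectangle. Matching them requires a geometric containment test; one plausible sketch follows (the haversine great-circle formula is our choice, not prescribed by the patent).

```python
import math

def within_radius(lat, lon, c_lat, c_lon, radius_m):
    """True if (lat, lon) lies within radius_m metres of the centre
    (c_lat, c_lon), using the haversine great-circle distance."""
    r = 6371000.0  # mean Earth radius in metres
    dp = math.radians(lat - c_lat)
    dl = math.radians(lon - c_lon)
    a = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(c_lat)) * math.cos(math.radians(lat))
         * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m

def within_rect(lat, lon, x1, y1, x2, y2):
    """True if (lat, lon) lies inside the rectangle (X1, Y1)-(X2, Y2)."""
    return (min(x1, x2) <= lat <= max(x1, x2)
            and min(y1, y2) <= lon <= max(y1, y2))
```

The sensor search unit would apply one of these tests to each candidate sensor's coordinates when the request policy designates a range.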
- FIG. 9 shows a flowchart for dynamic sensor sharing control in the first embodiment.
- the sensor user 120 issues a sensor use request to the management server 110.
- the usage request processing unit 111 records the usage request in the request policy 112.
- the sensor search unit reads the permission policy of each sensor provided by the sensor provider 130, and searches for a sensor that matches (policy matching) with the request policy of the sensor user 120.
- the searched sensor list is transmitted to the access management unit 116.
- The access management unit 116 receives the sensor list from the sensor search unit 115, transmits it to the sensor user terminal 121, receives the controllable sensors selected by the sensor user 120, and finally sends to the access control unit information identifying the sensor users permitted to operate them (user ID, user terminal ID, a certificate including them, a temporarily issued token, and the like).
- the access management unit 116 basically performs a process of passing access control information to the access control unit 140. As a simpler mode, an ACL may be issued that describes which terminal can access what sensor. Preferably, the access management unit 116 transmits a token to the sensor user terminal 121 and the sensor. Each access control unit permits access only to a user terminal having a token that can control its own sensor.
- the token may be configured to include a set of a sensor serial number and a user terminal ID that can use the sensor.
- a hash function may be used to calculate a hash value from a sensor serial number and a user terminal ID that can use the sensor, and this may be used as a token.
- this token is also transmitted to the access control unit 140 of the sensor. Only the user terminal having this token can control the sensor, thereby preventing illegal access control.
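The hash-based token described above can be sketched as follows. SHA-256 is our choice of hash function; the text says only that “a hash function may be used”.

```python
import hashlib

def issue_token(sensor_serial: str, user_terminal_id: str) -> str:
    """Derive a token from the sensor serial number and the ID of the
    user terminal permitted to control that sensor."""
    data = f"{sensor_serial}:{user_terminal_id}".encode()
    return hashlib.sha256(data).hexdigest()

def access_allowed(stored_token: str, presented_token: str) -> bool:
    """The sensor's access control unit grants control only when the
    token presented by a terminal matches the token it has stored."""
    return stored_token == presented_token
```

The management server would send the same token to both the user terminal 121 and the access control unit 140, so only the terminal holding it can control the sensor.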
- In step 940, the sensor user 120 controls the sensors accessible from the sensor user terminal 121 via the access control unit 140.
- FIG. 10 shows a flowchart of user authentication and policy recording.
- The flow is the same for the use requester and the use permitter. It is described from the viewpoint of the use requester, with the use permitter's case shown in parentheses.
- In step 1010, the use request processing unit 111 (use permission processing unit 113) accepts a use request (provision) from the sensor user 120 (sensor provider 130).
- In step 1020, if a certificate is attached to the request content (provided content), or if the request content (provided content) involves sensitive information or attribute condition designations, the certification authority is asked to authenticate whether the sensor user 120 (sensor provider 130) is a valid user and whether the sensor user terminal 121 (the sensor group 150 provided by the sensor provider) is a valid device. If the authentication fails, the recording of the request content (provided content) as a policy and the subsequent processing are not executed.
- In step 1030, the request content (provided content) is recorded in the request policy 112 (permission policy 114).
- In step 1040, the use request processing unit 111 (use permission processing unit 113) notifies the sensor search unit 115 that the policy record has been updated, and transmits the necessary information including the updated part.
- FIG. 12 shows a flowchart of the sensor search unit. Normally, the sensor search unit 115 searches for a sensor when a policy is updated from the use request processing unit 111 or the use permission processing unit 113.
- In step 1210, an unprocessed policy is selected.
- An unprocessed policy refers to a request policy that has been added or changed, a request policy that is affected by the addition or change of a permission policy, and the like. That is, a request policy that requires a new search is selected.
- In step 1220, a permission policy that matches the selected request policy is searched for in the permission policy DB 114.
- the status management unit 117 is inquired about the status of the sensor having the retrieved permission policy.
- the status management unit 117 returns the usable status of the designated sensor to the sensor search unit 115.
- The status management unit 117 periodically polls each sensor (the period varies depending on the sensor type) and transmits the sensor's status to the sensor search unit 115.
- This configuration effectively functions as a dynamic re-search trigger of the sensor search unit 115.
- the GPS position information of the sensor is also acquired at the time of polling.
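One polling pass by the status management unit 117 can be sketched as below. The interface is hypothetical; the text states only that status and GPS position are polled periodically and that changes act as a dynamic re-search trigger.

```python
def poll_once(sensors, query, on_change):
    """Query each sensor's status and GPS position; report any change so
    the sensor search unit can re-run policy matching (the dynamic
    re-search trigger described in the text)."""
    for sensor in sensors:
        status, position = query(sensor["serial"])
        if (status, position) != (sensor.get("status"), sensor.get("position")):
            sensor["status"], sensor["position"] = status, position
            on_change(sensor["serial"], status, position)
```

A scheduler would call `poll_once` at an interval appropriate to each sensor type, as noted above.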
- the sensor search unit 115 transmits an available sensor list to the access management unit 116 as a search result. Then, the process returns to step 1210, and the sensor search unit 115 selects the next unprocessed policy.
- the sensor search unit 115 performs a re-search (policy matching). This enables dynamic sensor sharing control.
- FIG. 11 shows, as an example, a block diagram of computer hardware used for the terminal of the sensor user 120, the terminal of the sensor provider 130, and the management server 110 of the present invention.
- the computer device (1101) includes a CPU (1102) and a main memory (1103), which are connected to a bus (1104).
- As the CPU (1102), CPUs based on a 32-bit or 64-bit architecture can be used, such as zSeries (TM) and PowerPC (TM), Intel's Xeon (TM), Core (TM), Atom (TM), Pentium (TM), and Celeron (TM) series, and AMD's Phenom (TM), Athlon (TM), Turion (TM), and Sempron (TM) series.
- a display (1106) corresponding to a display device such as an LCD monitor is connected to the bus (1104) via a display controller (1105).
- the display (1106) is used to display an application, a sensor request screen, a sensor provision screen, and a GUI 330.
- the bus (1104) is also connected to a hard disk (1108) or silicon disk and a CD-ROM, DVD drive or Blu-ray drive (1109) via a storage device controller (1107).
- the storage device of the management server 110 stores programs that perform processing of the use request processing unit 111, the use permission processing unit 113, the sensor search unit 115, the access management unit 116, and the status management unit 117.
- the request policy 112 and the permission policy 114 are also stored in the storage device.
- an OS, an application, and a program for displaying the GUI 330 are stored.
- Programs and data are preferably loaded from the hard disk (1108) into the main memory (1103) and executed by the CPU (1102).
- The CD-ROM, DVD, or Blu-ray drive (1109) is used, as necessary, to install the program of the present invention onto the hard disk from a CD-ROM, DVD-ROM, or Blu-ray disc, which are computer-readable media, or to read data.
- a keyboard (1111) and a mouse (1112) are further connected to the bus (1104) via a keyboard / mouse controller (1110).
- shared control of a plurality of sensors is smoothly performed in real time between a plurality of sensor providers and a plurality of sensor users.
- FIG. 16 shows a block diagram of the second embodiment of the present invention. It is substantially the same as Embodiment 1, but differs in that a GUI 330 has been added to the sensor user terminal 121 and in that GPS 310 and GPS 320 are used to measure positions in real time.
- the real-time sensor sharing control is enabled by the GPS 310 and the GPS 320.
- a user attribute (user terminal) and its attribute value (terminal coordinates) are added to the request policy 112.
- the location information of the user terminal 121 is periodically transmitted based on information such as GPS.
- the usage request processing unit 111 receives this and updates the terminal coordinates of the request policy 112.
- the status management unit 117 periodically updates the coordinates based on position information of sensors such as GPS.
- the sensor search unit 115 refers to these coordinate information as necessary during policy matching.
- here, "periodically" means on the order of several hours to a few days for a user who hardly moves, and on the order of seconds for a user who moves very fast.
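As an illustration of the matching that the sensor search unit performs against these coordinates, the following is a minimal sketch in Python. The record layout (the `sensor_type`, `coords`, `radius_km`, and `sensitivity` fields) is invented for the example and is not prescribed by this disclosure:

```python
import math

def distance_km(a, b):
    # Rough planar approximation of the distance between two (lat, lon)
    # points; adequate for the short ranges in this example.
    lat = math.radians((a[0] + b[0]) / 2)
    dx = (b[1] - a[1]) * 111.32 * math.cos(lat)
    dy = (b[0] - a[0]) * 111.32
    return math.hypot(dx, dy)

def match_policies(request, permissions):
    """Return the permission policies that satisfy the request policy."""
    hits = []
    for perm in permissions:
        if perm["sensor_type"] != request["sensor_type"]:
            continue
        if perm["sensitivity"] > request["sensitivity"]:
            continue  # sensor data too sensitive for this requester
        if distance_km(request["coords"], perm["coords"]) > request["radius_km"]:
            continue  # outside the requested search radius
        hits.append(perm)
    return hits

request = {"sensor_type": "camera", "coords": (35.68, 139.76),
           "radius_km": 1.0, "sensitivity": 2}
permissions = [
    {"id": 470, "sensor_type": "camera", "coords": (35.684, 139.762), "sensitivity": 1},
    {"id": 480, "sensor_type": "camera", "coords": (35.75, 139.90), "sensitivity": 1},
]
print([p["id"] for p in match_policies(request, permissions)])  # [470]
```

Because the coordinates on both sides are updated periodically, re-running such a match whenever either policy changes yields the dynamic re-search behaviour.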
- the GUI 330 greatly contributes to highly usable sensor sharing control, in terms of how efficiently the sensor group 150 can be shared and controlled and how information from a plurality of sensors can be grasped simultaneously in real time.
- Fig. 4 shows the GUI 330 MAP mode display screen (initial screen).
- the GUI 330 provides an efficient UI for providing the sensor information that can be shared and controlled to the user.
- a sensor group that can be used by the sensor user terminal 121 is displayed on the map based on the GPS position information.
- in this example, the sensor user terminal 121 does not move and exists at a fixed position (for example, in an information management center), shown at the center of the displayed map.
- the sensor user terminal 121 may instead be in a moving vehicle. In that case, the center of the screen may be aligned with the terminal's own position based on position information from GPS.
- the sensor user 120 designates an area where sensor information is to be received.
- a specific location on the screen is selected (clicked) with the pointer 405.
- the specific location is a point that the sensor user 120 wants to check, such as a violation vehicle, an accident vehicle, an incident site, an event site, or a disaster location. This location may automatically acquire position information from specific sensor information.
- the point where the criminal escape vehicle is considered to be present is selected.
- next, the radius is designated by numerical input or by GUI operation (dragging with the right mouse button).
- in this example, a radius of 1 km is designated.
- the area is designated by a circle, but a rectangular area may be designated.
- the sensor selection area 495 is displayed as a circle.
- the sensor group existing in the area is identified.
- the in-vehicle cameras 420, 430, 440, 450, 460, and the fixed camera 470 are in the area.
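The identification of the sensor group inside the designated area can be sketched as follows. The sensor IDs and coordinates are invented for the example, and both the circular and the rectangular designation described above are shown:

```python
import math

def in_circle(pos, center, radius):
    # True if pos lies within the circular selection area.
    return math.hypot(pos[0] - center[0], pos[1] - center[1]) <= radius

def in_rect(pos, p1, p2):
    # Normalise the drag start/end points so either corner order works.
    x1, x2 = sorted((p1[0], p2[0]))
    y1, y2 = sorted((p1[1], p2[1]))
    return x1 <= pos[0] <= x2 and y1 <= pos[1] <= y2

# Hypothetical sensor positions in km relative to the clicked point.
sensors = {420: (0.2, 0.1), 430: (0.5, -0.3), 470: (0.9, 0.2), 480: (2.5, 2.5)}

center, radius = (0.0, 0.0), 1.0  # 1 km radius around the clicked point
in_area = sorted(sid for sid, pos in sensors.items() if in_circle(pos, center, radius))
print(in_area)  # [420, 430, 470]

rect_hits = sorted(sid for sid, pos in sensors.items() if in_rect(pos, (1.0, 1.0), (3.0, 3.0)))
print(rect_hits)  # [480]
```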
- a transition is made to a solo mode (to be described later) for visually recognizing information of each sensor.
- FIG. 5 shows a screen example of the GUI 330 in the VIEW mode.
- the sensors in the specified area are divided and displayed on the screen.
- in FIG. 5, the in-vehicle cameras 420, 430, 440, 450, 460 and the fixed camera 470 are identified as being in the area.
- the VIEW mode is a multi-sensor simultaneous viewing mode.
- the MAP button 510 is a button for making a transition to the MAP mode of FIG. 4.
- the Record button 520 is a button for recording video.
- the SNAP button 530 is a button for recording a still image.
- the Measurement button 540 is a button for performing surveying.
- the Track button 550 is a button for tracking the selected target.
- FIG. 6 shows a screen display example in the solo mode.
- the solo mode is a mode for displaying single sensor information on one screen.
- in the solo mode, single sensor information is displayed on one screen, and function buttons to which a plurality of control commands are assigned (VIEW button 610, MAP button 620, Record button 630, SNAP button 640, Analyze button 650) and control buttons are displayed at the right end.
- the MAP button 610 is a button for making a transition to the MAP mode of FIG. 4.
- the VIEW button 620 is a button for making a transition to the VIEW mode of FIG. 5.
- the Record button 630 is a button for recording video.
- the SNAP button 640 is a button for recording a still image.
- the Analyze button 650 is a button for analyzing the selected area.
- buttons 660 to 694 are displayed when the sensor can control the direction. If the up / down / left / right camera direction can be controlled, the camera direction can be adjusted by clicking the buttons 660, 670, 680, and 690, respectively. Buttons 692 and 694 are displayed in the case of a sensor capable of zooming in and zooming out.
- the SNAP button 640 is pressed to record and display a still image.
- image analysis is performed.
- the result of the image analysis is displayed at the bottom of the screen. Since various kinds of image analysis for the selected area exist and are well known, the details are omitted.
- in this example, the vehicle type determined from the shape, the license plate character analysis, and the color analysis results are displayed.
- the Measurement button 540 in the VIEW mode of FIG. 5 is a button for measuring the physical parameter of the object (the speed of the escape vehicle in the example of FIG. 5).
- FIG. 7 shows an example of the measurement.
- the window 430 is focused with a cursor or the like.
- the rightmost vehicle position is clicked with the pointer 710, and then the leftmost vehicle position is clicked.
- the speed of the escape vehicle can then be measured from the distance between the two clicked positions, the elapsed time, and the moving direction.
- if the fixed camera 470 is a speed measurement sensor, the speed of the escape vehicle can be measured simply by pressing the Measurement button in the window 470.
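The two-click measurement can be sketched as follows — a hypothetical illustration that assumes the metre-per-pixel scale of the window and the time between the two clicks are known:

```python
import math

def measure_speed_kmh(p1, p2, dt_s, metres_per_px):
    """Speed from two clicked pixel positions and the elapsed time between clicks."""
    dist_m = math.hypot(p2[0] - p1[0], p2[1] - p1[1]) * metres_per_px
    return dist_m / dt_s * 3.6  # m/s -> km/h

# Vehicle moved 400 px in 2 s at 0.05 m per pixel: 20 m in 2 s = 10 m/s = 36 km/h.
print(measure_speed_kmh((600, 300), (200, 300), 2.0, 0.05))  # 36.0
```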
- a lock-on function is provided by selecting an object in a specific window.
- Image analysis is performed by selecting the object 700 in the window of FIG. 7 with a pointer or the like, and its shape is determined.
- when an object determined to have the same shape is found in another sensor window, it is automatically selected. In this way, the two-dimensional position of the object is calculated.
- by pressing the Track button 550, the movement of the selected target can be traced; in this example, the escape vehicle can be tracked. That is, the position selected by the pointer 405 in FIG. 4 is replaced with the escape vehicle.
- the current position is acquired by the lock-on function, or the position is automatically calculated from the speed and direction of the escape vehicle, and the range of the sensor selection area 495 is automatically changed.
- when the Track button is pressed while an object is locked on, the object can be tracked. Since the two-dimensional position of the object is obtained by calculation from a plurality of sensor information, the position selected by the pointer 405 in FIG. 4 is replaced with the position of the object, and the object can be automatically tracked in the MAP mode. At the same time, the range of the sensor selection area 495 moves dynamically as the object moves.
- FIG. 8 shows an example of a screen in which the escape vehicle is moving to the point 820.
- the sensor selection area 495 is changed to the sensor selection area 810 as the escape vehicle moves. Accordingly, the in-region sensor is also changed to the in-vehicle cameras 410, 420, 430, 440 and the fixed cameras 470, 480.
- FIG. 13 shows a transition flowchart of the MAP mode, VIEW mode, and solo mode provided by the GUI 330.
- the mode transitions in FIG. 13 conceptually represent an event-driven model: in any mode, a transition is made to the mode of the selected button. In the MAP mode, only the VIEW button exists on the GUI 330, but a transition to the VIEW mode is made in response to area designation. A transition to the solo mode is made by selecting an individual sensor displayed on the map with a pointer or the like.
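The event-driven transitions of FIG. 13 can be sketched as a small table-driven state machine; the event names here are invented for the example:

```python
# mode -> {event: next_mode}; events not listed leave the mode unchanged.
TRANSITIONS = {
    "MAP":  {"area_designated": "VIEW", "sensor_selected": "SOLO"},
    "VIEW": {"MAP_button": "MAP", "sensor_selected": "SOLO"},
    "SOLO": {"MAP_button": "MAP", "VIEW_button": "VIEW"},
}

def next_mode(mode, event):
    """Return the mode reached from `mode` when `event` occurs."""
    return TRANSITIONS[mode].get(event, mode)

mode = "MAP"
for event in ["area_designated", "sensor_selected", "MAP_button"]:
    mode = next_mode(mode, event)
print(mode)  # MAP
```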
- Fig. 14 shows a flowchart of the MAP mode.
- the MAP mode is the initial screen of the GUI 330.
- the position information of the sensor user terminal 121 is acquired from the GPS.
- if the sensor user terminal 121 is fixed (for example, in a house or a building), it is not necessary to confirm the position information by GPS.
- in step 1420, the map is displayed with the position of the sensor user terminal 121 aligned with the center of the screen. In this way, the center of the screen keeps tracking the position of the sensor user terminal 121 as it moves.
- in step 1430, an area is designated.
- the second embodiment shows an example of specifying a circular area.
- the user selects the position to be searched and inputs the radius.
- a rectangular area may be designated.
- the mouse is dragged to specify (X1, Y1)-(X2, Y2) from the start point to the end point. If there is no area designation, or if an area has already been set by transition from another mode, step 1430 is skipped.
- in step 1440, a use request for the sensors existing in the designated area is made to the management server 110.
- an available sensor list is received from the access management unit 116.
- a UI for excluding unused or unnecessary sensors from the list may be presented.
- the sensor list may also include information on sensors that cannot yet be controlled but exist in a range obtained by expanding the specified range (for example, doubling it).
- in this way, sensors that may become newly controllable can be visually recognized, and pre-authentication can further be performed. That is, by completing device authentication, such as authentication token issuance and certificate authentication, in advance between the user terminal 121 and a sensor that may be controlled, a seamless connection with high real-time responsiveness is possible even when the user terminal 121 moves fast and the usage conditions change frequently.
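The pre-authentication idea can be sketched as follows. This is a deliberately minimal token-matching illustration (comparable to the sensor-side token check in claim 11), with invented class and method names, not the full device-authentication protocol:

```python
import hmac
import secrets

class Sensor:
    def __init__(self):
        self.tokens = set()  # tokens pre-registered by the management server

    def register_token(self, token):
        self.tokens.add(token)

    def allow_control(self, presented):
        # Control is granted only if the presented token matches a stored one;
        # compare_digest avoids timing side channels in the comparison.
        return any(hmac.compare_digest(presented, t) for t in self.tokens)

# The management server issues a token to both the sensor and the user
# terminal before the terminal actually needs the sensor, so the later
# connection requires no extra authentication round trip.
token = secrets.token_hex(16)
sensor = Sensor()
sensor.register_token(token)

print(sensor.allow_control(token))        # True
print(sensor.allow_control("bad-token"))  # False
```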
- a deformed icon corresponding to the type of sensor (a car shape for an in-vehicle camera, a camera shape for a fixed camera, and so on) is displayed on the screen as an object at the location of each sensor in the sensor list.
- in step 1460, sensor information is acquired from the available sensors, and the sensors are controlled via the access control unit 140 as necessary.
- Fig. 15 shows a flowchart of the VIEW mode.
- the VIEW mode is a mode in which each sensor information is simultaneously displayed in a plurality of windows and visually recognized.
- in step 1510, it is determined whether the Record button has been pressed. If it has been pressed, video and audio are recorded in step 1560.
- in step 1520, it is determined whether the SNAP button has been pressed. If it has been pressed, a snapshot (still image) is recorded and displayed in step 1570.
- in step 1530, it is determined whether the Measurement button has been pressed. If it has been pressed, a physical parameter of the object is measured; in the example of the second embodiment, speed measurement is performed.
- in step 1580, it is determined whether the Track button has been pressed. If it has been pressed, it is confirmed in step 1550 whether the speed has already been measured. If it has not, the speed is first measured in step 1590, the current position is calculated from that speed in step 1595, an area centered on that position is designated, and the mode shifts to the MAP mode.
- the lock-on function calculates the position of the object by searching for the same shape object in a plurality of windows.
- shape analysis is performed on an object selected in one window, and the same shape object in another window is automatically selected.
- the current position of the object can be calculated two-dimensionally. This calculation requires at least two window viewpoints, but the accuracy is further increased when three or more window viewpoints are detected.
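The two-dimensional position estimate from two or more window viewpoints can be sketched as a least-squares intersection of bearing rays, a standard triangulation technique; the camera positions and bearings below are invented for the example:

```python
import math

def triangulate(observations):
    """Least-squares intersection of bearing rays.
    observations: list of ((cam_x, cam_y), bearing_radians) pairs."""
    # Each ray contributes (I - d d^T)(x - p) = 0; accumulate A x = b
    # with A = sum(I - d d^T) and b = sum((I - d d^T) p).
    a11 = a12 = a22 = b1 = b2 = 0.0
    for (px, py), theta in observations:
        dx, dy = math.cos(theta), math.sin(theta)
        m11, m12, m22 = 1 - dx * dx, -dx * dy, 1 - dy * dy
        a11 += m11; a12 += m12; a22 += m22
        b1 += m11 * px + m12 * py
        b2 += m12 * px + m22 * py
    det = a11 * a22 - a12 * a12  # requires at least two non-parallel rays
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Two cameras both sighting an object located at (1, 1).
obs = [((0.0, 0.0), math.atan2(1, 1)),   # bearing 45 degrees
       ((2.0, 0.0), math.atan2(1, -1))]  # bearing 135 degrees
x, y = triangulate(obs)
print(round(x, 6), round(y, 6))  # 1.0 1.0
```

Adding a third or fourth observation to `observations` simply adds more terms to the sums, which is why accuracy improves with additional window viewpoints.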
- then an area centered on that position is designated, and the object is tracked in the MAP mode.
- the image analysis may be used for the lock-on function.
- the two-dimensional position of the object is obtained.
- the region is designated centering on the position, and the mode is changed to the MAP mode. Since the sensor information is acquired in the background, the position of the object is dynamically recalculated even if the mode is changed.
- vehicle tracking by the Track button is performed by estimating the current position from the previously measured vehicle speed and moving direction, or by estimating the position with the lock-on function, then designating an area including that position and switching to the MAP mode. In this way, the object can be tracked and the available sensors can be changed dynamically in real time.
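The estimate-then-recenter tracking loop can be sketched as follows; the units, tick interval, and field names are invented for the example:

```python
import math

def estimate_position(last_pos, speed_mps, heading_rad, dt_s):
    """Dead-reckon the object's position dt_s seconds after the last fix."""
    d = speed_mps * dt_s
    return (last_pos[0] + d * math.cos(heading_rad),
            last_pos[1] + d * math.sin(heading_rad))

def recenter_area(area, pos):
    # Keep the radius, move the center to the estimated object position.
    return {"center": pos, "radius": area["radius"]}

area = {"center": (0.0, 0.0), "radius": 1000.0}
pos = (0.0, 0.0)
for _ in range(3):  # three 10-second update ticks at 20 m/s, heading 0
    pos = estimate_position(pos, 20.0, 0.0, 10.0)
    area = recenter_area(area, pos)
print(area["center"])  # (600.0, 0.0)
```

Re-running the in-area sensor search against the recentered area on every tick is what makes the set of available sensors change dynamically as the vehicle moves.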
- Fig. 17 shows a flowchart of the solo mode.
- the solo mode is a mode for displaying information of a single sensor.
- in step 1710, it is determined whether the Record button has been pressed. If it has been pressed, video and audio are recorded in step 1760.
- in step 1720, it is determined whether the SNAP button has been pressed. If it has been pressed, a snapshot (still image) is recorded and displayed in step 1770.
- in step 1730, it is determined whether a control button (or control key) has been pressed. If it has been pressed, sensor control of up/down/left/right movement, zoom-in, and zoom-out is performed in step 1780 according to the button (or key) pressed.
- in step 1740, it is determined whether the Analyze button has been pressed. If it has been pressed, the image is analyzed in step 1750 and the result is displayed on the screen. Processing then returns to step 1710.
- Image analysis can be done with moving images, but it is usually performed with still images taken with snapshots.
- a region to be analyzed in the still image is designated with a mouse or the like, and shape analysis, color analysis, person analysis, and character recognition are performed on the region, and the analysis result is displayed in the lower part of the screen.
- Reference signs: 110 Management server, 111 Use request processing unit, 112 Request policy, 113 Use permission processing unit, 114 Permission policy, 115 Sensor search unit, 116 Access management unit, 117 Status management unit, 120 Sensor user, 121 Sensor user terminal, 130 Sensor provider, 131 Sensor provider terminal, 140 Access control unit, 150 Sensor, 160 Certification authority, 310, 320 GPS, 330 GUI, 405 Pointer
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Health & Medical Sciences (AREA)
- Bioethics (AREA)
- General Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Medical Informatics (AREA)
- Telephonic Communication Services (AREA)
- Storage Device Security (AREA)
Abstract
Description
The present invention has been made in view of such circumstances, and an object thereof is to provide an apparatus, a method, and a computer program for performing sensor sharing control dynamically, flexibly, and seamlessly.
Another object is to provide a sensor sharing control apparatus and method that appropriately considers the policy on the side using the sensor.
Another object is to provide an apparatus and a method for performing dynamic sensor sharing control when a policy of a sensor or a sensor user is changed.
Another problem is to provide a sensor sharing control apparatus and method that take into account the attributes of the sensor user or sensor provider.
Another object is to provide an apparatus and method for performing sensor sharing control in real time.
Another object is to provide an apparatus and a method for dynamically tracking a moving object by real-time sensor sharing control.
(Embodiment 1)
FIG. 1 is a block diagram showing the configuration of the management server 110 according to the first embodiment of the present invention. The general flow is as follows: when the sensor user 120 requests use of a sensor from the use request processing unit 111, a matching sensor is searched for among the sensor group 150 provided by sensor providers, and that sensor becomes usable from the terminal 121 owned by the sensor user.
(Embodiment 2)
FIG. 16 shows a block diagram of the second embodiment of the present invention. It is substantially the same as the first embodiment, except that a GUI 330 is added to the sensor user terminal 121 and that GPS 310 and GPS 320 are used to measure positions in real time. GPS 310 and GPS 320 enable real-time sensor sharing control when the sensor user terminal 121 is moving, when the sensor group 150 is moving, or in both cases.
Claims (22)
- 1. A sensor sharing control device comprising:
means for a use request processing unit to receive sensor use request information from a sensor user terminal;
means for the use request processing unit to record the use request in a request policy and to notify a sensor search unit that the request policy has been recorded;
means for a use permission processing unit to receive sensor use permission information from a sensor provider terminal;
means for the use permission processing unit to record the use permission in a permission policy and to notify the sensor search unit that the permission policy has been recorded;
means for the sensor search unit to search, in response to receiving notification of the recording of the request policy or the permission policy, for a permission policy that matches the request policy; and
means for creating a list of the sensors included in the matched permission policy and transmitting the list to the sensor user terminal recorded in the request policy,
wherein the searching means dynamically performs a re-search when the request policy or the permission policy is changed.
- 2. The device according to claim 1, wherein the use request processing unit has means for authenticating, according to the sensor use request information, whether the sensor user is a valid user, and records the sensor use request information in the request policy when authentication succeeds.
- 3. The device according to claim 1, wherein the use permission processing unit has means for authenticating, according to the sensor use permission information, whether the sensor provider is a valid provider, and records the sensor use permission information in the permission policy when authentication succeeds.
- 4. The device according to claim 1, further comprising a status management unit, wherein the status management unit has means for managing the status of each sensor and, in response to a request from the sensor search unit, creating a list of available sensors and transmitting it to the sensor search unit.
- 5. The device according to claim 1, further comprising an access management unit, wherein the access management unit has means for receiving the list of sensors from the sensor search unit, creating a token that enables the sensor user terminal to control the sensors in the list, and transmitting the token to the sensor user terminal.
- 6. The device according to claim 1, wherein the means for receiving the sensor use request information includes means for periodically receiving position information of the sensor user terminal and updating the request policy.
- 7. The device according to claim 4, wherein the status management unit has means for periodically acquiring position information of the sensors and updating the permission policy.
- 8. The device according to claim 1, wherein the sensor use request information includes a user terminal ID, a sensor type, a position, and a sensitivity.
- 9. The device according to claim 1, wherein the sensor use permission information includes a sensor type, an installation location, a sensor serial number, and a sensitivity.
- 10. A terminal that communicates with the sensor sharing control device according to claim 7, comprising:
means for receiving a list of controllable sensors from the sensor sharing control device;
means for displaying a map on a display device based on position information of the sensor user terminal and of the sensors;
means for displaying, in response to a sensor search range being designated on the displayed map, each controllable sensor existing in the range as an object at the position of that sensor;
means for displaying information of a sensor in response to its object being selected; and
means for transmitting a signal for controlling the sensor to the sensor sharing control device in response to a control instruction for the sensor.
- 11. A sensor that communicates with the sensor sharing control device according to claim 7, comprising:
means for receiving, from the sensor sharing control device, a token for enabling control of a function of the sensor, and storing the token;
means for receiving a token from a sensor user terminal; and
means for enabling the function to be controlled from the sensor user terminal when the stored token matches the received token.
- 12. A method of dynamically controlling the sharing of sensors in a device including a use request processing unit that receives sensor use requests, a use permission processing unit that receives sensor use permissions, and a sensor search unit that searches for available sensors by matching sensor use requests against sensor use permissions, the method comprising the steps of:
the use request processing unit receiving sensor use request information from a sensor user terminal;
the use request processing unit recording the use request in a request policy and notifying the sensor search unit that the request policy has been recorded;
the use permission processing unit receiving sensor use permission information from a sensor provider terminal;
the use permission processing unit recording the use permission in a permission policy and notifying the sensor search unit that the permission policy has been recorded;
the sensor search unit searching, in response to receiving notification of the recording of the request policy or the permission policy, for a permission policy that matches the request policy; and
creating a list of the sensors included in the matched permission policy and transmitting the list to the sensor user terminal recorded in the request policy,
wherein the searching step dynamically performs a re-search when the request policy or the permission policy is changed.
- 13. The method according to claim 12, wherein the use request processing unit has a step of authenticating, according to the sensor use request information, whether the sensor user is a valid user, and records the sensor use request information in the request policy when authentication succeeds.
- 14. The method according to claim 12, wherein the use permission processing unit has a step of authenticating, according to the sensor use permission information, whether the sensor provider is a valid provider, and records the sensor use permission information in the permission policy when authentication succeeds.
- 15. The method according to claim 12, wherein the device further includes a status management unit, and the status management unit manages the status of each sensor and, in response to a request from the sensor search unit, creates a list of available sensors and transmits it to the sensor search unit.
- 16. The method according to claim 12, wherein the device further includes an access management unit, and the access management unit receives the list of sensors from the sensor search unit, creates a token that enables the sensor user terminal to control the sensors in the list, and transmits the token to the sensor user terminal.
- 17. The method according to claim 12, wherein the step of receiving the sensor use request information includes the step of periodically receiving position information of the sensor user terminal and updating the request policy.
- 18. The method according to claim 15, wherein the status management unit periodically acquires position information of the sensors and updates the permission policy.
- 19. A method for communicating with a sensor sharing control device that includes a use request processing unit receiving sensor use requests, a use permission processing unit receiving sensor use permissions, and a sensor search unit searching for available sensors by matching sensor use requests against sensor use permissions, the method comprising the steps of:
receiving a list of controllable sensors from the sensor sharing control device;
displaying a map on a display device based on position information of the sensor user terminal and of the sensors;
displaying, in response to a sensor search range being designated on the displayed map, each controllable sensor existing in the range as an object at the position of that sensor;
displaying information of a sensor in response to its object being selected; and
transmitting a signal for controlling the sensor to the sensor sharing control device in response to a control instruction for the sensor.
- 20. The method according to claim 19, wherein, when the sensor search range is changed or moved, the controllable sensors are dynamically updated and the updated sensors are displayed as objects.
- 21. The method according to claim 20, wherein the center of the sensor search range is a moving object, the position of the moving object is captured from information from the sensors, and the sensor search range is dynamically changed as the moving object moves.
- 22. A computer program causing a computer to execute the steps of the method according to any one of claims 12 to 21.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/425,458 US9614852B2 (en) | 2012-09-21 | 2013-07-19 | Sensor sharing control |
DE112013003833.1T DE112013003833T5 (en) | 2012-09-21 | 2013-07-19 | Apparatus, methods and computer program for controlling the sharing of sensors |
JP2014536649A JP5823050B2 (en) | 2012-09-21 | 2013-07-19 | Sensor sharing control apparatus, method, and computer program |
GB1505985.0A GB2520898B (en) | 2012-09-21 | 2013-07-19 | Sensor sharing control apparatus, method, and computer program |
CN201380049439.7A CN104685512B (en) | 2012-09-21 | 2013-07-19 | Sensor Compliance control device and method thereof |
US15/446,088 US9916470B2 (en) | 2012-09-21 | 2017-03-01 | Sensor sharing control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012208125 | 2012-09-21 | ||
JP2012-208125 | 2012-09-21 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/425,458 A-371-Of-International US9614852B2 (en) | 2012-09-21 | 2013-07-19 | Sensor sharing control |
US15/446,088 Continuation US9916470B2 (en) | 2012-09-21 | 2017-03-01 | Sensor sharing control |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014045699A1 true WO2014045699A1 (en) | 2014-03-27 |
Family
ID=50341025
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069637 WO2014045699A1 (en) | 2012-09-21 | 2013-07-19 | Sensor share control device, method, and computer program |
Country Status (6)
Country | Link |
---|---|
US (2) | US9614852B2 (en) |
JP (1) | JP5823050B2 (en) |
CN (1) | CN104685512B (en) |
DE (1) | DE112013003833T5 (en) |
GB (1) | GB2520898B (en) |
WO (1) | WO2014045699A1 (en) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9614852B2 (en) | 2012-09-21 | 2017-04-04 | International Business Machines Corporation | Sensor sharing control |
WO2017104287A1 (en) * | 2015-12-14 | 2017-06-22 | オムロン株式会社 | Data flow control device and data flow control method |
JP2018055571A (en) * | 2016-09-30 | 2018-04-05 | 横河電機株式会社 | Application development environment providing system, application development environment providing method, application development environment providing program, and terminal device |
JP6451909B1 (en) * | 2017-08-03 | 2019-01-16 | オムロン株式会社 | Sensor management unit, sensor device, sensor management method, and sensor management program |
WO2019026709A1 (en) * | 2017-08-03 | 2019-02-07 | オムロン株式会社 | Sensor management unit, sensor device, sensor management method, and sensor management program |
JP6473864B1 (en) * | 2018-07-09 | 2019-02-27 | Next Sharing Co., Ltd. | Confidentiality processing/restoration pre-processing application device, confidentiality processing/restoration pre-processing application device terminal, pre-processing method performed by the device for the terminal, and confidentiality processing/restoration pre-processing method of the terminal |
WO2020116610A1 (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data retrieval device, data retrieval method and program, and edge server and program thereof |
JP2020091705A (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search device, data search method and program thereof, edge server and program thereof |
KR102226606B1 (en) * | 2020-08-06 | 2021-03-11 | Gloquad Tech Co., Ltd. | Home gateway apparatus, sensor terminal, and method thereof |
US12019911B2 (en) | 2018-12-06 | 2024-06-25 | Ntt Communications Corporation | Storage management apparatus, method and program |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10425414B1 (en) * | 2015-08-31 | 2019-09-24 | United Services Automobile Association (Usaa) | Security platform |
US10097999B2 (en) | 2015-12-07 | 2018-10-09 | International Business Machines Corporation | Satisfying virtual machine security criteria using remote sensor devices |
US10587616B2 (en) * | 2016-09-16 | 2020-03-10 | Google Llc | Methods, systems, and media for authentication of user devices to a display device |
US10846417B2 (en) * | 2017-03-17 | 2020-11-24 | Oracle International Corporation | Identifying permitted illegal access operations in a module system |
CN107566488B (en) * | 2017-09-01 | 2021-01-05 | 西安万像电子科技有限公司 | Sensor data processing method and system |
EP3687147B1 (en) * | 2017-09-19 | 2024-06-26 | Omron Corporation | Mobile sensor management unit, mobile sensor apparatus, matching apparatus, sensing data distribution system, data provision method, and data provision program |
JP6468377B1 (en) * | 2018-02-13 | 2019-02-13 | オムロン株式会社 | Output management apparatus, output management method and program |
US11481509B1 (en) | 2018-07-10 | 2022-10-25 | United Services Automobile Association (Usaa) | Device management and security through a distributed ledger system |
CN111983929B (en) * | 2019-05-23 | 2024-03-12 | 美智纵横科技有限责任公司 | Household appliance, control method thereof and computer readable storage medium |
US11856419B2 (en) * | 2021-08-17 | 2023-12-26 | Xerox Corporation | Method and system for commissioning environmental sensors |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002016956A (en) * | 2000-06-30 | 2002-01-18 | Toshiba Corp | System and method for acquiring positional information |
JP2004328310A (en) * | 2003-04-24 | 2004-11-18 | Nippon Telegr & Teleph Corp <Ntt> | Information providing system, information providing method, and service providing server on the basis of positional information |
JP2006252436A (en) * | 2005-03-14 | 2006-09-21 | Ntt Docomo Inc | User privacy authentication system and user privacy authentication method for multiple simultaneous positioning requests |
JP2008052601A (en) * | 2006-08-25 | 2008-03-06 | Kagaku Joho Systems KK | Information providing system, information providing method, and service server |
JP2009278357A (en) * | 2008-05-14 | 2009-11-26 | Nec Corp | Communication support method, communication support system and terminal |
JP2009296433A (en) * | 2008-06-06 | 2009-12-17 | Nec Corp | Target image display system |
JP2010217952A (en) * | 2009-03-13 | 2010-09-30 | Zenrin Datacom Co Ltd | Information providing device and program |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6914893B2 (en) * | 1998-06-22 | 2005-07-05 | Statsignal Ipc, Llc | System and method for monitoring and controlling remote devices |
US20020097322A1 (en) * | 2000-11-29 | 2002-07-25 | Monroe David A. | Multiple video display configurations and remote control of multiple video signals transmitted to a monitoring station over a network |
AUPP776498A0 (en) * | 1998-12-17 | 1999-01-21 | Portus Pty Ltd | Local and remote monitoring using a standard web browser |
AU2002230389A1 (en) * | 2000-06-14 | 2002-04-29 | Vermeer Manufacturing Company | Utility mapping and data distribution system and method |
JP2002063694A (en) | 2000-08-18 | 2002-02-28 | Nippon Telegr & Teleph Corp <Ntt> | Method and system for selectively receiving information |
JP2002063964A (en) | 2000-08-21 | 2002-02-28 | Ricoh Co Ltd | Image forming equipment |
US7266503B2 (en) * | 2001-03-22 | 2007-09-04 | International Business Machines Corporation | System and method for generating a company group user profile |
US20030208490A1 (en) * | 2001-06-15 | 2003-11-06 | Jean-Jacques Larrea | System and method for data storage, control and access |
WO2003019450A2 (en) * | 2001-08-24 | 2003-03-06 | March Networks Corporation | Remote health-monitoring system and method |
US6658091B1 (en) * | 2002-02-01 | 2003-12-02 | @Security Broadband Corp. | LIfestyle multimedia security system |
US7249177B1 (en) * | 2002-11-27 | 2007-07-24 | Sprint Communications Company L.P. | Biometric authentication of a client network connection |
US7467400B1 (en) * | 2003-02-14 | 2008-12-16 | S2 Security Corporation | Integrated security system having network enabled access control and interface devices |
US20040255167A1 (en) * | 2003-04-28 | 2004-12-16 | Knight James Michael | Method and system for remote network security management |
JP2005227841A (en) | 2004-02-10 | 2005-08-25 | Nec Fielding Ltd | Information providing system, server, mobile communication terminal, and its method |
US8289390B2 (en) * | 2004-07-28 | 2012-10-16 | Sri International | Method and apparatus for total situational awareness and monitoring |
JP4792823B2 (en) * | 2005-06-09 | 2011-10-12 | Sony Corporation | Network system, mobile device, its control method, and computer program |
US7424399B2 (en) * | 2005-06-10 | 2008-09-09 | Ge Analytical Instruments, Inc. | Systems and methods for fluid quality sensing, data sharing and data visualization |
US8281386B2 (en) * | 2005-12-21 | 2012-10-02 | Panasonic Corporation | Systems and methods for automatic secret generation and distribution for secure systems |
KR100792293B1 (en) | 2006-01-16 | 2008-01-07 | 삼성전자주식회사 | Service providing method and device therein considering user's context |
US20070254641A1 (en) * | 2006-04-30 | 2007-11-01 | International Business Machines Corporation | Integration of Instant Messaging Systems with Sensors |
US20110268274A1 (en) * | 2008-05-28 | 2011-11-03 | Agency For Science, Technology And Research | Authentication and Key Establishment in Wireless Sensor Networks |
US8299920B2 (en) * | 2009-09-25 | 2012-10-30 | Fedex Corporate Services, Inc. | Sensor based logistics system |
RU2477929C2 (en) * | 2011-04-19 | 2013-03-20 | Закрытое акционерное общество "Лаборатория Касперского" | System and method for prevention safety incidents based on user danger rating |
US9160536B2 (en) * | 2011-11-30 | 2015-10-13 | Advanced Biometric Controls, Llc | Verification of authenticity and responsiveness of biometric evidence and/or other evidence |
US8646032B2 (en) * | 2011-12-30 | 2014-02-04 | Nokia Corporation | Method and apparatus providing privacy setting and monitoring user interface |
US9066125B2 (en) * | 2012-02-10 | 2015-06-23 | Advanced Biometric Controls, Llc | Secure display |
JP2014045699A (en) | 2012-08-30 | 2014-03-17 | Fuji Oil Co Ltd | Method for enhancing milk flavor of food product |
US9614852B2 (en) | 2012-09-21 | 2017-04-04 | International Business Machines Corporation | Sensor sharing control |
- 2013
- 2013-07-19 US US14/425,458 patent/US9614852B2/en not_active Expired - Fee Related
- 2013-07-19 JP JP2014536649A patent/JP5823050B2/en active Active
- 2013-07-19 GB GB1505985.0A patent/GB2520898B/en active Active
- 2013-07-19 DE DE112013003833.1T patent/DE112013003833T5/en active Pending
- 2013-07-19 CN CN201380049439.7A patent/CN104685512B/en active Active
- 2013-07-19 WO PCT/JP2013/069637 patent/WO2014045699A1/en active Application Filing
- 2017
- 2017-03-01 US US15/446,088 patent/US9916470B2/en active Active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002016956A (en) * | 2000-06-30 | 2002-01-18 | Toshiba Corp | System and method for acquiring positional information |
JP2004328310A (en) * | 2003-04-24 | 2004-11-18 | Nippon Telegr & Teleph Corp <Ntt> | Information providing system, information providing method, and service providing server on the basis of positional information |
JP2006252436A (en) * | 2005-03-14 | 2006-09-21 | Ntt Docomo Inc | User privacy authentication system and user privacy authentication method for multiple simultaneous positioning requests |
JP2008052601A (en) * | 2006-08-25 | 2008-03-06 | Kagaku Joho Systems KK | Information providing system, information providing method, and service server |
JP2009278357A (en) * | 2008-05-14 | 2009-11-26 | Nec Corp | Communication support method, communication support system and terminal |
JP2009296433A (en) * | 2008-06-06 | 2009-12-17 | Nec Corp | Target image display system |
JP2010217952A (en) * | 2009-03-13 | 2010-09-30 | Zenrin Datacom Co Ltd | Information providing device and program |
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9614852B2 (en) | 2012-09-21 | 2017-04-04 | International Business Machines Corporation | Sensor sharing control |
US10896347B2 (en) | 2015-12-14 | 2021-01-19 | Omron Corporation | Dataflow control apparatus and dataflow control method for metadata matching and device extraction |
WO2017104287A1 (en) * | 2015-12-14 | 2017-06-22 | Omron Corporation | Data flow control device and data flow control method |
JP2017111501A (en) * | 2015-12-14 | 2017-06-22 | Omron Corporation | Data flow control device and data flow control method |
JP2018055571A (en) * | 2016-09-30 | 2018-04-05 | Yokogawa Electric Corporation | Application development environment providing system, application development environment providing method, application development environment providing program, and terminal device |
WO2018061621A1 (en) * | 2016-09-30 | 2018-04-05 | Yokogawa Electric Corporation | Application development environment provision system, application development environment provision method, computer-readable non-transitory medium, and terminal device |
US11134320B2 (en) | 2017-08-03 | 2021-09-28 | Omron Corporation | Sensor management unit, sensor device, sensor management method, and sensor management program |
JP6451909B1 (en) * | 2017-08-03 | 2019-01-16 | Omron Corporation | Sensor management unit, sensor device, sensor management method, and sensor management program |
WO2019026709A1 (en) * | 2017-08-03 | 2019-02-07 | Omron Corporation | Sensor management unit, sensor device, sensor management method, and sensor management program |
JP6473864B1 (en) * | 2018-07-09 | 2019-02-27 | Next Sharing Co., Ltd. | Confidentiality processing/restoration pre-processing application device, confidentiality processing/restoration pre-processing application device terminal, pre-processing method performed by the device for the terminal, and confidentiality processing/restoration pre-processing method of the terminal |
JP2020009150A (en) * | 2018-07-09 | 2020-01-16 | Next Sharing Co., Ltd. | Confidentiality processing/restoration pre-processing application device, confidentiality processing/restoration pre-processing application device terminal, method performed by the device for causing the terminal to perform confidentiality processing/restoration pre-processing, and confidentiality processing/restoration pre-processing method for the terminal |
WO2020116611A1 (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data retrieval device, data retrieval method and program, edge server, and program thereof |
JP2020091707A (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search device, data search method and program thereof, edge server and program thereof |
JP2020091705A (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data search device, data search method and program thereof, edge server and program thereof |
WO2020116610A1 (en) * | 2018-12-06 | 2020-06-11 | NTT Communications Corporation | Data retrieval device, data retrieval method and program, and edge server and program thereof |
JP7150584B2 (en) | 2018-12-06 | 2022-10-11 | NTT Communications Corporation | Edge server and its program |
JP7150585B2 (en) | 2018-12-06 | 2022-10-11 | NTT Communications Corporation | Data retrieval device, its data retrieval method and program, edge server and its program |
US11695832B2 (en) | 2018-12-06 | 2023-07-04 | Ntt Communications Corporation | Data search apparatus, and data search method and program thereof, and edge server and program thereof |
US11886520B2 (en) | 2018-12-06 | 2024-01-30 | Ntt Communications Corporation | Data search apparatus, and data search method and program thereof, and edge server and program thereof |
US12019911B2 (en) | 2018-12-06 | 2024-06-25 | Ntt Communications Corporation | Storage management apparatus, method and program |
KR102226606B1 (en) * | 2020-08-06 | 2021-03-11 | Gloquad Tech Co., Ltd. | Home gateway apparatus, sensor terminal, and method thereof |
KR20220018402A (en) * | 2020-08-06 | 2022-02-15 | Texen Co., Ltd. | Home gateway apparatus, sensor terminal, and method thereof |
KR102538552B1 | 2020-08-06 | 2023-05-31 | Texen Co., Ltd. | Home gateway apparatus, sensor terminal, and method thereof |
Also Published As
Publication number | Publication date |
---|---|
GB2520898B (en) | 2015-10-14 |
US20170177897A1 (en) | 2017-06-22 |
CN104685512B (en) | 2017-10-17 |
US9614852B2 (en) | 2017-04-04 |
GB201505985D0 (en) | 2015-05-20 |
DE112013003833T5 (en) | 2015-04-30 |
GB2520898A (en) | 2015-06-03 |
JP5823050B2 (en) | 2015-11-25 |
US9916470B2 (en) | 2018-03-13 |
US20150229643A1 (en) | 2015-08-13 |
CN104685512A (en) | 2015-06-03 |
JPWO2014045699A1 (en) | 2016-08-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5823050B2 (en) | Sensor sharing control apparatus, method, and computer program | |
CN110689460B (en) | Traffic accident data processing method, device, equipment and medium based on block chain | |
US11582711B2 (en) | Systems, devices, methods, and program products enhancing structure walkthroughs | |
KR101016556B1 (en) | Method, server and computer readable recording medium for accessing a person's information using augmented reality | |
US20150181548A1 (en) | Indoor Remote Triggered Location Scanning | |
KR20150106856A (en) | System and method for encrypting folder in device | |
TWI606334B (en) | Method and apparatus for selecting events from event log for timeline generation | |
US20130102335A1 (en) | Mobile device, information processing device, location information acquisition method, location information acquisition system, and program | |
KR20120075580A (en) | A system and method for car service management using mobile augmented reality in smart phone | |
EP3680807A1 (en) | Password verification method, password setting method, and mobile terminal | |
JP2018512106A (en) | Method and system for anti-phishing using smart images | |
US20200168015A1 (en) | Systems, devices, methods, and program products enhancing structure walkthroughs | |
US11627139B2 (en) | System, device, and method for transferring security access permissions between in-camera users | |
JP5339316B1 (en) | IDENTIFICATION INFORMATION MANAGEMENT SYSTEM, IDENTIFICATION INFORMATION MANAGEMENT SYSTEM CONTROL METHOD, INFORMATION PROCESSING DEVICE, AND PROGRAM | |
US10362321B2 (en) | Image distribution device, image distribution system, and image distribution method | |
EP2373117A1 (en) | Connection management device, communication terminal, connection management method, connection method, connection management program, connection program, and recording medium | |
EP3139582B1 (en) | Information processing apparatus and method | |
KR20150106803A (en) | System and method for encrypting file system structure in device | |
KR101780566B1 (en) | Mobile device and operating method hereof | |
JP6179328B2 (en) | Information processing apparatus and information processing program | |
CN115240299B (en) | Resource using method and system | |
CN110717605A (en) | Access information processing method and device based on block chain | |
JP6015590B2 (en) | Information processing apparatus and information processing program | |
US20230199707A1 (en) | Systems, Devices, Methods, and Program Products Enhancing Structure Walkthroughs | |
JP2024131723A (en) | Information processing device, information processing method, and information processing program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13839431 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014536649 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14425458 Country of ref document: US Ref document number: 1120130038331 Country of ref document: DE Ref document number: 112013003833 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 1505985 Country of ref document: GB Kind code of ref document: A Free format text: PCT FILING DATE = 20130719 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1505985.0 Country of ref document: GB |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13839431 Country of ref document: EP Kind code of ref document: A1 |