
CN112384902B - Real-time data acquisition and recording data sharing system - Google Patents


Info

Publication number
CN112384902B
CN112384902B
Authority
CN
China
Prior art keywords
data
user
remote
asset
real
Prior art date
Legal status
Active
Application number
CN201980045424.0A
Other languages
Chinese (zh)
Other versions
CN112384902A (en)
Inventor
L·B·乔丹
D·迪内希
M·D·哈姆史密斯
D·阿尔温
Current Assignee
Wi Tronix LLC
Original Assignee
Wi Tronix LLC
Priority date
Filing date
Publication date
Priority claimed from US 16/431,466 (US11423706B2)
Application filed by Wi Tronix LLC filed Critical Wi Tronix LLC
Publication of CN112384902A
Application granted
Publication of CN112384902B
Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/008 Registering or indicating the working of vehicles communicating information to a remotely located station
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/38 Information transfer, e.g. on bus
    • G06F 13/382 Information transfer, e.g. on bus using universal interface adapter
    • G06F 13/385 Information transfer, e.g. on bus using universal interface adapter for adaptation of a particular data processing system to different peripheral devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/54 Interprogram communication
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 5/00 Registering or indicating the working of vehicles
    • G07C 5/08 Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C 5/0841 Registering performance data
    • G07C 5/085 Registering performance data using electronic data carriers
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C 17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C 17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Television Signal Processing For Recording (AREA)
  • Information Transfer Between Computers (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)
  • Storage Device Security (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The real-time data acquisition and recording data sharing system works with the real-time data acquisition and recording system and viewers to provide remotely located users, such as asset owners, operators, and investigators, with real-time or near real-time access to a variety of data, such as event and operational data, video data, and audio data. The data sharing system allows a user to share data obtained from the data acquisition and recording system with remotely located users. The user may share data in a secure, controlled, tracked, and audited manner with remote recipient end users who have internet access and a modern web browser. Instead of sharing files, users share URLs to the data. URL-based data sharing enables the user to control, track, and review access to sensitive data.

Description

Real-time data acquisition and recording data sharing system
Cross-reference to related applications
The present application claims priority to and the benefit of U.S. Provisional Application No. 62/680,907, filed in 2018; U.S. Provisional Application No. 62/825,943, filed in 2019; U.S. Provisional Application No. 62/337,227, filed May 16, 2016; U.S. Non-Provisional Application No. 16/595,650, filed May 15, 2017 (now U.S. Patent No. 9,934,623, issued in 2018); and U.S. Non-Provisional Application No. 15/907,486, filed in February 2018.
Technical Field
The present disclosure relates to systems and methods for viewing video, images, and data from a real-time data acquisition and recording system and sharing the video, images, and/or data with other individuals.
Background
High-value mobile assets such as locomotives, airplanes, public transportation systems, mining equipment, transportable medical equipment, cargo, marine vessels, and military vessels typically employ onboard data acquisition and recording "black box" systems. These data acquisition and recording systems (e.g., event data recorders or flight data recorders) record a variety of system parameters used for accident investigation, crew performance assessment, fuel efficiency analysis, maintenance planning, and predictive diagnostics. A typical data acquisition and recording system comprises digital and analog inputs, as well as pressure switches and pressure transducers, which record data from various onboard sensor devices. Recorded data may include parameters such as speed, distance traveled, location, fuel level, engine revolutions per minute (RPM), fluid levels, operator controls, pressures, current and forecasted weather conditions, and environmental conditions. In addition to basic event and operational data, video and audio event/data recording capabilities are also deployed on many of these same mobile assets. Typically, data is extracted from the data logger after an incident involving the asset occurs and requires investigation, once the data logger has been recovered. Certain situations may arise in which the data logger cannot be recovered or the data is otherwise unavailable. In these circumstances, it is desirable to have quick access to the data, such as event and operational data, video data, and audio data, recorded by the data acquisition and recording system, regardless of whether physical access to the data acquisition and recording system or the data is available, and to allow a user to share the data, or portions thereof, with other authorized individuals.
Disclosure of Invention
The present disclosure relates generally to real-time data acquisition and recording systems for use in high-value mobile assets. The teachings herein may provide real-time or near real-time access to data, such as event and operational data, video data, and audio data, recorded by a real-time data acquisition and recording system. One embodiment of a method for processing, storing, and transmitting data from at least one asset includes receiving, using a web server, a request from a first user, the request including specified data stored in a remote data store and an email address of a second user; determining a Uniform Resource Locator (URL) suitable for providing access to the specified data; generating an email including the URL; and sending the email to the email address.
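The request-to-email flow of this embodiment can be sketched as follows. This is a minimal illustrative sketch only: the names `share_data`, `make_share_url`, and `SHARE_BASE`, and the use of a random token, are assumptions for illustration, not the patent's implementation.

```python
import secrets

# Hypothetical base URL for shared-data links (assumption, not from the patent).
SHARE_BASE = "https://example.invalid/share"

def make_share_url(data_id: str) -> str:
    """Determine a hard-to-guess URL suitable for providing access to the
    specified data."""
    token = secrets.token_urlsafe(16)
    return f"{SHARE_BASE}/{data_id}/{token}"

def share_data(data_id: str, recipient_email: str, outbox: list) -> str:
    """Handle the first user's request: determine a URL for the specified
    data, generate an email containing the URL, and queue it for sending
    to the second user's email address."""
    url = make_share_url(data_id)
    outbox.append({"to": recipient_email,
                   "body": f"Data has been shared with you: {url}"})
    return url
```

Because only a URL is shared, never the underlying file, access can later be revoked, tracked, and audited server-side.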
Another embodiment of a system for processing, storing and transmitting data from at least one asset includes a web server adapted to receive a request from a first user, determine a Uniform Resource Locator (URL) providing access to specified data, generate an email including the URL, and send the email to an email address, the request including the specified data stored in a remote data store and an email address of a second user; and a data logger onboard the asset comprising at least one local memory component, a data encoder, an onboard data manager, and a queuing repository, the data logger being adapted to receive data based on at least one data signal from at least one of at least one data source onboard the asset and at least one data source remote from the asset, and the data encoder being adapted to encode the data into encoded data.
These and other aspects of the disclosure are disclosed in the following detailed description of the embodiments, the appended claims and the accompanying drawings.
Drawings
The various features, advantages and other uses of the apparatus will become more fully apparent by reference to the following detailed description and drawings in which like reference numerals refer to like parts throughout the several views. It should be emphasized that, in accordance with common practice, the various features of the drawing are not to scale. On the contrary, the dimensions of the various features are arbitrarily expanded or reduced for clarity.
FIG. 1 illustrates a field implementation of a first example of an exemplary real-time data acquisition and recording system according to an embodiment of the present disclosure;
FIG. 2 illustrates a field implementation of a second example of an exemplary real-time data acquisition and recording system according to an embodiment of the present disclosure;
FIG. 3 is a flow chart of a process for recording data and/or information from a mobile asset according to an embodiment of the present disclosure;
FIG. 4 is a flow chart of a process for appending data and/or information from a mobile asset after a power outage, according to an embodiment of the present disclosure;
FIG. 5 is a diagram illustrating exemplary temporary and full record blocks saved to a crash-hardened memory module in accordance with an embodiment of the present disclosure;
FIG. 6 is a diagram illustrating an exemplary temporary record block in a crash-hardened memory module before power interruption and after power restoration, according to an embodiment of the present disclosure;
FIG. 7 is a diagram illustrating an exemplary recording segment in a crash-hardened memory module after power restoration, according to an embodiment of the present disclosure;
FIG. 8 illustrates a field implementation of a first example of a real-time data acquisition and recording system viewer according to an embodiment of the present disclosure;
FIG. 9 is a flowchart of a process for recording video data, audio data, and/or information from a mobile asset according to an embodiment of the present disclosure;
FIG. 10 is a flowchart of a process for recording video data, audio data, and/or information from a mobile asset according to an embodiment of the present disclosure;
FIG. 11 is a diagram illustrating an exemplary fisheye view of a 360 degree camera of a real-time data acquisition and recording system viewer in accordance with an embodiment of the present disclosure;
FIG. 12 is a diagram illustrating an exemplary panoramic view of a 360 degree camera of a real-time data acquisition and recording system viewer, according to an embodiment of the present disclosure;
FIG. 13 is a diagram illustrating an exemplary four-view of a 360 degree camera of a real-time data acquisition and recording system viewer, according to an embodiment of the present disclosure;
FIG. 14 is a diagram illustrating an exemplary fisheye correction (dewarped) view of a 360 degree camera of a real-time data acquisition and recording system viewer in accordance with an embodiment of the present disclosure;
FIG. 15 illustrates a field implementation of a first example of a data acquisition and recording system video content analysis system according to an embodiment of the present disclosure;
FIG. 16A is a diagram illustrating exemplary track detection according to an embodiment of the present disclosure;
FIG. 16B is a diagram illustrating exemplary track detection and switch detection in accordance with embodiments of the present disclosure;
FIG. 16C is a diagram illustrating exemplary track detection, track number count, and signal detection according to an embodiment of the present disclosure;
FIG. 16D is a diagram illustrating exemplary crossing detection and track detection in accordance with an embodiment of the present disclosure;
FIG. 16E is a diagram illustrating exemplary dual overhead signal detection according to an embodiment of the present disclosure;
FIG. 16F is a diagram illustrating an exemplary multi-track detection according to an embodiment of the present disclosure;
FIG. 16G is a diagram illustrating exemplary switch and track detection according to an embodiment of the present disclosure;
FIG. 16H is a diagram illustrating exemplary switch detection according to an embodiment of the present disclosure;
FIG. 17 is a flowchart of a process for determining an internal state of a mobile asset according to an embodiment of the present disclosure;
FIG. 18 is a flowchart of a process for determining object detection and obstacle detection occurring outside of a mobile asset, according to an embodiment of the present disclosure;
FIG. 19 illustrates a field implementation of a first example of an exemplary real-time data acquisition and recording system according to an embodiment of the present disclosure;
FIG. 20 illustrates a field implementation of a second example of an exemplary real-time data acquisition and recording system according to an embodiment of the present disclosure; and
FIG. 21 is a flow chart of a process for sharing data and/or information from an asset according to an embodiment of the disclosure.
Detailed Description
A first embodiment of the real-time data acquisition and recording system described herein provides remotely located users, such as asset owners, operators, and investigators, with real-time or near real-time access to a variety of data related to a high-value asset, such as event and operational data, video data, and audio data. The data acquisition and recording system records data related to the asset via a data logger and streams the data to a remote data repository and to remotely located users before, during, and after an incident. Streaming the data to a remote data repository in real-time or near real-time makes the information available at least up to the time an incident or emergency occurs, thereby virtually eliminating the need to locate and download the "black box" in order to investigate an incident involving the asset, and eliminating the need to interact with the data logger on the asset to request the download of specific data, locate and transfer files, and view the data using a custom application. The system of the present disclosure retains typical recording capability and adds the ability to stream data to a remote data repository and to remote end users before, during, and after an incident. In most situations, the information recorded in the data logger is redundant and not needed, as the data has already been acquired and stored in the remote data repository.
Prior to the system of the present disclosure, data was extracted from the "black box," or "event recorder," after an incident occurred and an investigation was required. Data files containing the time slices recorded by the "black box" had to be downloaded and retrieved from the "black box" and then viewed by a user through proprietary software. The user had to gain physical or remote access to the asset, select the desired data to download from the "black box," download the file containing the desired information to a computing device, and locate the appropriate file with the desired data using a custom application operating on the computing device. The system of the present disclosure eliminates the need for the user to perform these steps; the user need only navigate to the desired data using a common web browser. A remotely located user can access a common web browser and navigate to desired data related to a selected asset to view and analyze the operational efficiency and safety of the asset in real-time or near real-time.
Remotely located users, such as asset owners, operators, and/or investigators, can access a common web browser and navigate to current and/or historical desired data related to a selected asset to view and analyze the operational efficiency and safety of the asset in real-time or near real-time. The ability to view operations in real-time or near real-time enables rapid evaluation and adjustment of behavior. During an incident, for example, real-time information and/or data can help triage the situation and provide valuable information to first responders. During normal operation, for example, near real-time information and/or data can be used to audit crew performance and to aid in network-wide situational awareness.
The data may include, but is not limited to: analog and frequency parameters such as speed, pressure, temperature, current, voltage, and acceleration that originate from the asset and/or nearby assets; Boolean data such as switch positions, actuator positions, warning light illumination, and actuator commands; global positioning system (GPS) data and/or geographic information system (GIS) data such as position, speed, and altitude; internally generated information, such as the regulatory speed limit for the asset given its current location; video and image information from cameras located at various locations in, on, or in the vicinity of the asset; audio information from microphones located at various locations in, on, or in the vicinity of the asset; information about the operational plan for the asset, such as route, schedule, and manifest information, that is sent to the asset from a data center; information about environmental conditions, including current and forecasted weather conditions, of the area in which the asset is currently operating or is scheduled to operate; asset control status and operational data generated by systems such as positive train control (PTC) in locomotives; and data derived from a combination of any of the above, including, but not limited to, additional data, video and audio analysis, and analysis results.
FIGS. 1 and 2 illustrate field implementations of first and second embodiments, respectively, of an exemplary real-time data acquisition and recording system (DARS) 100, 200 in which aspects of the present disclosure may be implemented. DARS 100, 200 is a system that delivers real-time information from data logging devices to remotely located end users. DARS 100, 200 includes a data logger 154, 254 mounted on a vehicle or mobile asset 148, 248 that communicates with any number of various information sources through any combination of onboard wired and/or wireless data links 170, 270, such as a wireless gateway/router, or with off-board information sources via a data center 150, 250 of DARS 100, 200 over a data link such as wireless data link 146, 246. The data logger 154, 254 includes an onboard data manager 120, 220, a data encoder 122, 222, a vehicle event detector 156, 256, a queuing repository 158, 258, and a wireless gateway/router 172, 272. Additionally, in these embodiments, the data logger 154, 254 may include a crash-hardened memory module 118, 218 and/or an Ethernet switch 162, 262 with or without Power over Ethernet (POE). The exemplary crash-hardened memory module 118, 218 may be, for example, a crashworthy event recorder memory module that complies with federal regulations and Federal Railroad Administration regulations, a crash-protected memory unit that complies with federal regulations and Federal Aviation Administration regulations, a crash-hardened memory module that complies with any other applicable federal regulations, or any other suitable hardened memory device known in the art. In the second embodiment shown in FIG. 2, the data logger 254 may additionally include an optional non-crash-hardened removable storage device 219.
The wired and/or wireless data links 170, 270 may include any one or combination of discrete signal inputs, standard or proprietary Ethernet, serial connections, and wireless connections. Ethernet-connected devices may utilize the Ethernet switch 162, 262 of the data logger 154, 254 and may utilize POE. The Ethernet switch 162, 262 may be internal or external and may support POE. In addition, data from remote data sources, such as the map component 164, 264, the route/crew manifest component 124, 224, and the weather component 126, 226 in the embodiments of FIGS. 1 and 2, is available to the onboard data manager 120, 220 and the vehicle event detector 156, 256 from the data center 150, 250 through the wireless data link 146, 246 and the wireless gateway/router 172, 272.
The data logger 154, 254 gathers data or information from a wide variety of sources, which can vary widely based on the asset's configuration, through the onboard data links 170, 270. The data encoder 122, 222 encodes at least a minimum set of data that is typically defined by a regulatory agency. In this embodiment, the data encoder 122, 222 receives data from a wide variety of asset 148, 248 sources and data center 150, 250 sources. The information sources may include any number of components in the asset 148, 248, such as any of: analog inputs 102, 202, digital inputs 104, 204, I/O modules 106, 206, a vehicle controller 108, 208, an engine controller 110, 210, inertial sensors 112, 212, a global positioning system (GPS) 114, 214, cameras 116, 216, positive train control (PTC)/signal data 166, 266, fuel data 168, 268, cellular transmission detectors (not shown), internal drive data, and any additional data signals, and any number of components in the data center 150, 250, such as any of the route/crew manifest component 124, 224, the weather component 126, 226, the map component 164, 264, and any additional data signals. The data encoder 122, 222 compresses or encodes the data and time-synchronizes it to facilitate efficient real-time transmission and replication to the remote data store 130, 230. The data encoder 122, 222 transmits the encoded data to the onboard data manager 120, 220, which then saves the encoded data in the crash-hardened memory module 118, 218 and the queuing repository 158, 258 for replication to the remote data store 130, 230 via a remote data manager 132, 232 located in the data center 150, 250. Optionally, the onboard data manager 120, 220 may save a tertiary copy of the encoded data in the non-crash-hardened removable storage device 219 of the second embodiment shown in FIG. 2. The onboard data manager 120, 220 and the remote data manager 132, 232 work together to manage the data replication process.
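The encode-then-dual-store step can be sketched as below. This is an illustrative sketch only: the names `encode_record` and `DataLogger`, and the use of JSON plus zlib as the compression scheme, are assumptions, not the encoding actually used by the system.

```python
import json
import time
import zlib
from collections import deque

def encode_record(signals: dict, timestamp: float) -> bytes:
    """Time-stamp and compress one snapshot of asset signals, standing in
    for the data encoder's compress/encode step."""
    payload = json.dumps({"t": timestamp, "signals": signals}, sort_keys=True)
    return zlib.compress(payload.encode("utf-8"))

class DataLogger:
    """Saves each encoded record both to a stand-in for the crash-hardened
    memory module and to a stand-in for the queuing repository that feeds
    replication to the remote data store."""
    def __init__(self):
        self.crash_hardened = []   # protected local copy
        self.queue = deque()       # awaiting replication

    def record(self, signals: dict, timestamp: float = None) -> None:
        ts = time.time() if timestamp is None else timestamp
        encoded = encode_record(signals, ts)
        self.crash_hardened.append(encoded)
        self.queue.append(encoded)
```

Keeping the queued copy separate from the hardened copy lets replication drain the queue without ever touching the protected record.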
A single remote data manager 132, 232 in a data center 150, 250 may manage the replication of data from multiple assets 148, 248.
Data from the various input components and data from an in-cab audio/graphical user interface (GUI) 160, 260 are sent to the vehicle event detector 156, 256. The vehicle event detector 156, 256 processes the data to determine whether an event, incident, or other predefined situation involving the asset 148, 248 has occurred. When the vehicle event detector 156, 256 detects signals indicating that a predefined event occurred, the vehicle event detector 156, 256 sends the processed data indicating that the predefined event occurred, along with supporting data surrounding the predefined event, to the onboard data manager 120, 220. The vehicle event detector 156, 256 detects events based on data from a wide variety of sources, such as the analog inputs 102, 202, digital inputs 104, 204, I/O modules 106, 206, vehicle controller 108, 208, engine controller 110, 210, inertial sensors 112, 212, GPS 114, 214, cameras 116, 216, route/crew manifest component 124, 224, weather component 126, 226, map component 164, 264, PTC/signal data 166, 266, and fuel data 168, 268, which can vary based on the asset's configuration. When the vehicle event detector 156, 256 detects an event, the detected asset event information is stored in the queuing repository 158, 258 and can optionally be presented to the crew of the asset 148, 248 via the in-cab audio/graphical user interface (GUI) 160, 260.
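Rule-based event detection of this kind can be sketched as follows. The rule names, signal names, and thresholds below are purely illustrative assumptions; the patent does not specify them.

```python
# Hypothetical predefined-event rules keyed by event name. Each rule
# inspects one sample of decoded signal data.
EVENT_RULES = {
    "emergency_brake": lambda d: d.get("brake_pressure", 0) > 90,
    "rapid_decel": lambda d: d.get("accel_x", 0) < -0.5,
    "power_loss": lambda d: d.get("input_voltage", 74) < 1,
}

def detect_events(sample: dict) -> list:
    """Return the names of all predefined events matched by one sample,
    in rule-declaration order."""
    return [name for name, rule in EVENT_RULES.items() if rule(sample)]
```

A matched event would then be queued with its surrounding supporting data for the onboard data manager.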
The onboard data manager 120, 220 also sends data to the queuing repository 158, 258. In near real-time mode, the onboard data manager 120, 220 stores the encoded data received from the data encoder 122, 222, along with any event information, in the crash-hardened memory module 118, 218 and in the queuing repository 158, 258. In the second embodiment of FIG. 2, the onboard data manager 220 can optionally store the encoded data in the non-crash-hardened removable storage device 219. After five minutes of encoded data has accumulated in the queuing repository 158, 258, the onboard data manager 120, 220 copies the five minutes of encoded data to the remote data store 130, 230 via the remote data manager 132, 232 in the data center 150, 250 over the wireless data link 146, 246 accessed through the wireless gateway/router 172, 272. In real-time mode, the onboard data manager 120, 220 stores the encoded data received from the data encoder 122, 222, along with any event information, in the crash-hardened memory module 118, 218, optionally in the non-crash-hardened removable storage device 219 of FIG. 2, and in the remote data store 130, 230 via the remote data manager 132, 232 in the data center 150, 250 over the wireless data link 146, 246 accessed through the wireless gateway/router 172, 272. The onboard data manager 120, 220 and the remote data manager 132, 232 can communicate over a variety of wireless communication links, such as Wi-Fi, cellular, satellite, and private wireless systems, utilizing the wireless gateway/router 172, 272. The wireless data link 146, 246 may be, for example, a wireless local area network (WLAN), a wireless metropolitan area network (WMAN), a wireless wide area network (WWAN), a private wireless system, a cellular telephone network, or any other means of transferring data from the data logger 154, 254 of DARS 100, 200 to, in this example, the remote data manager 132, 232 of DARS 100, 200.
When a wireless data connection is not available, the data is stored in memory and queued in the queuing repository 158, 258 until wireless connectivity is restored and the data replication process can resume.
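This queue-until-connected behavior is the classic store-and-forward pattern, sketched below. The class and method names are illustrative assumptions; only the behavior (queue while the link is down, drain in order once it returns) comes from the description above.

```python
from collections import deque

class StoreAndForward:
    """Queue records while the wireless link is down; drain them in
    order once connectivity is restored."""
    def __init__(self, send):
        self.pending = deque()
        self.send = send          # callable that raises ConnectionError on link failure

    def offer(self, record) -> None:
        """Accept a new record and opportunistically try to replicate."""
        self.pending.append(record)
        self.flush()

    def flush(self) -> None:
        """Replicate queued records oldest-first, stopping (but keeping
        the queue intact) if the link is still down."""
        while self.pending:
            try:
                self.send(self.pending[0])
            except ConnectionError:
                return            # link down: retry on a later flush
            self.pending.popleft()
```

Sending the head of the queue before removing it guarantees no record is lost if the link drops mid-transfer.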
In parallel with the data recording, the data logger 154, 254 continuously and autonomously replicates data to the remote data store 130, 230. The replication process has two modes: a real-time mode and a near real-time mode. In real-time mode, the data is replicated to the remote data store 130, 230 every second. In near real-time mode, the data is replicated to the remote data store 130, 230 every five minutes. The rate used for near real-time mode is configurable, and the rate used for real-time mode can be adjusted to support high-resolution data by replicating data to the remote data store 130, 230 every 0.10 seconds. When DARS 100, 200 is in near real-time mode, the onboard data manager 120, 220 queues the data in the queuing repository 158, 258 before replicating it to the remote data manager 132, 232. The onboard data manager 120, 220 also replicates the vehicle event detector information queued in the queuing repository 158, 258 to the remote data manager 132, 232. Near real-time mode is used during normal operation, under most conditions, in order to improve the efficiency of the data replication process.
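The replication cadences stated above (five minutes for near real-time, one second for real-time, down to 0.10 seconds for high-resolution data) can be captured in a small schedule table. The function name and table are an illustrative sketch, not part of the described system.

```python
# Replication intervals, in seconds, taken from the description:
# near real-time every five minutes (configurable), real-time every
# second, adjustable to 0.10 s for high-resolution data.
INTERVALS = {
    "near_real_time": 300.0,
    "real_time": 1.0,
    "real_time_high_res": 0.10,
}

def next_replication_due(mode: str, last_sent: float) -> float:
    """Return the time at which the next copy to the remote data store
    is due for the given replication mode."""
    return last_sent + INTERVALS[mode]
```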
Real-time mode can be initiated based on an event occurring onboard the asset 148, 248 and detected by the vehicle event detector 156, 256, or by a request initiated from the data center 150, 250. A typical data-center-initiated request for real-time mode occurs when a remotely located user 152, 252 requests real-time information through the web client 142, 242. Typical reasons for real-time mode being initiated onboard the asset 148, 248 are the vehicle event detector 156, 256 detecting an event or incident, such as an operator initiating an emergency stop request, emergency braking activity, rapid acceleration or deceleration on any axis, or loss of input power to the data logger 154, 254. When transitioning from near real-time mode to real-time mode, all data that has not yet been copied to the remote data store 130, 230 is copied and stored in the remote data store 130, 230, and then live copying begins. The transition between near real-time mode and real-time mode typically occurs in less than five seconds. The data logger 154, 254 reverts to near real-time mode after a predetermined amount of time has elapsed since the event or incident, after a predetermined amount of inactivity, or when the user 152, 252 no longer needs real-time information from the asset 148, 248. The predetermined amount of time required to initiate the transition is configurable and is typically set to ten minutes.
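The mode transitions described above amount to a small state machine: a trigger (onboard event or data-center request) enters real-time mode, and the logger reverts after a configurable hold period, typically ten minutes. The class below is an illustrative sketch under those assumptions; the backlog flush on transition is omitted for brevity.

```python
REALTIME_HOLD_SECONDS = 600.0   # configurable; typically ten minutes

class ReplicationMode:
    """Tracks the near-real-time / real-time transition."""
    def __init__(self):
        self.mode = "near_real_time"
        self.realtime_until = 0.0

    def on_trigger(self, now: float) -> None:
        """An onboard event or a data-center request starts real-time
        mode and (re)arms the hold timer."""
        self.mode = "real_time"
        self.realtime_until = now + REALTIME_HOLD_SECONDS

    def tick(self, now: float) -> str:
        """Revert to near-real-time mode once the hold period elapses."""
        if self.mode == "real_time" and now >= self.realtime_until:
            self.mode = "near_real_time"
        return self.mode
```

Re-arming the timer on every trigger means repeated events keep the logger in real-time mode until activity subsides.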
While the data logger 154, 254 is in real-time mode, the onboard data manager 120, 220 continuously attempts to empty its queue to the remote data manager 132, 232, storing the data in the crash-hardened memory module 118, 218, and optionally in the non-crash-hardened removable storage device 219 of FIG. 2, while simultaneously sending the data to the remote data manager 132, 232. The onboard data manager 120, 220 also sends the detected vehicle event information queued in the queuing repository 158, 258 to the remote data manager 132, 232.
Upon receiving the data to be replicated from the data logger 154, 254, along with the data from the map component 164, 264, the route/group member inventory component 124, 224, and the weather component 126, 226, the remote data manager 132, 232 stores the compressed data in the remote data store 130, 230 in the data center 150, 250 of the DARS 100, 200. The remote data store 130, 230 may be, for example, a cloud-based data store or any other suitable remote data store. When data is received, a process is initiated that causes the data decoder 136, 236 to decode the most recently copied data from the remote data store 130, 230 and send the decoded data to the remote event detector 134, 234. The remote data manager 132, 232 stores the vehicle event information in the remote data store 130, 230. When the remote event detector 134, 234 receives the decoded data, it processes the decoded data to determine whether an event of interest is found in the decoded data. The remote event detector 134, 234 uses the decoded information to detect events, incidents, or other predefined situations occurring involving the asset 148, 248. Upon detecting an event of interest in the decoded data, the remote event detector 134, 234 stores the event information and supporting data in the remote data store 130, 230. When the remote data manager 132, 232 receives the remote event detector 134, 234 information, the remote data manager 132, 232 stores the information in the remote data store 130, 230.
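The remote event detector's scan of decoded data for events of interest (e.g., emergency braking or rapid deceleration) might look like the following sketch; the record fields and the deceleration threshold are illustrative assumptions, not values from the disclosure:

```python
def detect_events(decoded_records, decel_threshold_mps2=-3.0):
    """Scan decoded records and return any events of interest.

    Each record is assumed to be a dict of decoded signal values;
    the field names and the threshold are hypothetical.
    """
    events = []
    for rec in decoded_records:
        # Rapid deceleration on the longitudinal axis.
        if rec.get("accel_mps2", 0.0) <= decel_threshold_mps2:
            events.append({"type": "rapid_deceleration", "record": rec})
        # Operator-initiated emergency braking.
        if rec.get("emergency_brake"):
            events.append({"type": "emergency_brake", "record": rec})
    return events
```

In the described system, the detected event information and its supporting data would then be stored back into the remote data store for later retrieval.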
Remotely located users 152, 252 may access information related to a specific asset 148, 248, or a plurality of assets, including vehicle event detector information, using a standard web client 142, 242 (e.g., a web browser) or a virtual reality device (not shown); in this embodiment, thumbnail images from selected cameras may be displayed. The network client 142, 242 communicates the user's 152, 252 request for information to the network server 140, 240 over the network 144, 244 using common network standards, protocols, and techniques. The network 144, 244 may be, for example, the Internet. The network 144, 244 may also be a Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular telephone network, or any other means of communicating data from the network server 140, 240 to, in this example, the network client 142, 242. The web server 140, 240 requests the desired data from the data decoder 136, 236. Upon the request from the web server 140, 240, the data decoder 136, 236 obtains the requested data pertaining to the specific asset 148, 248, or plurality of assets, from the remote data store 130, 230. The data decoder 136, 236 decodes the requested data and sends the decoded data to the locator 138, 238. Localization is the process of converting data into the format desired by the end user, such as converting the data into the user's preferred language and units of measure. Because the original encoded data and detected event information are saved to the remote data store 130, 230 using Coordinated Universal Time (UTC) and International System of Units (SI) units, the locator 138, 238 identifies the profile settings set by the user 152, 252 upon access through the web client 142, 242 and uses those settings to prepare the information sent to the web client 142, 242 for presentation to the user 152, 252.
The locator 138, 238 converts the decoded data into the format desired by the user 152, 252, such as the language and units of measure preferred by the user 152, 252. The locator 138, 238 sends the localized data, in the format preferred by the user 152, 252, to the web server 140, 240 in response to the request. The web server 140, 240 then sends the localized data for the asset or assets to the web client 142, 242 for viewing and analysis, providing playback and real-time display of standard video and 360 degree video. The network client 142, 242 can display, and the user 152, 252 can view, data, video, and audio for a single asset, or for multiple assets simultaneously. The network client 142, 242 may also provide synchronized playback and real-time display of data, along with multiple video and audio streams from the asset, nearby assets, and/or standard and 360 degree video sources located in, on, or near the asset or at remote locations.
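Since the stored data is in UTC and SI units and is converted on request, the locator's unit and time-zone conversions can be illustrated with a minimal sketch; the profile representation here is an assumption, while the conversion factors are standard:

```python
from datetime import datetime, timedelta, timezone

# Data is stored in SI units (m/s) and UTC; the locator converts it to
# the user's profile settings. The "imperial"/"metric" profile keys are
# hypothetical names for illustration.
MPS_TO_MPH = 2.23693629
MPS_TO_KMH = 3.6

def localize_speed(speed_mps, units):
    """Convert a stored SI speed to the user's preferred unit of measure."""
    if units == "imperial":
        return round(speed_mps * MPS_TO_MPH, 1)
    return round(speed_mps * MPS_TO_KMH, 1)

def localize_time(utc_dt, utc_offset_hours):
    """Shift a stored UTC timestamp into the user's local time zone."""
    return utc_dt.astimezone(timezone(timedelta(hours=utc_offset_hours)))
```

A full locator would also handle language selection and formatting, but the principle is the same: store once in canonical units, convert per user profile at presentation time.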
FIG. 3 is a flow chart illustrating a process 300 for recording data and/or information from the asset 148, 248 in accordance with an embodiment of the present disclosure. The data logger 154, 254 receives data signals (302) from various input components, including physical or calculated data elements from the asset 148, 248 and the data center 150, 250, such as speed, latitude coordinates, longitude coordinates, horn detection, throttle position, weather data, map data, or route and/or crew data. The data encoder 122, 222 creates a record (304) containing a structured series of bits used to configure and record the data signal information. The encoded records are then sent to the on-board data manager 120, 220 (306), which chronologically combines a series of records into a record block containing up to five minutes of data. A temporary record block contains less than five minutes of data, while a full record block contains the entire five minutes of data. Each record block contains all the data required to fully decode the contained signals, including a data integrity check. At a minimum, a record block must start with a begin record and end with an end record.
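A record block that begins with a begin record, ends with an end record, and carries a data integrity check can be sketched as follows. The actual bit-level layout is proprietary, so the field names and the use of JSON and CRC-32 here are illustrative assumptions:

```python
import json
import zlib

def make_record_block(records):
    """Build a sketch of a record block: a begin record, the encoded
    payload, and an end record carrying a CRC-32 integrity check."""
    payload = json.dumps(records).encode("utf-8")
    return {
        "begin": {"type": "begin", "count": len(records)},
        "payload": payload,
        "end": {"type": "end", "crc32": zlib.crc32(payload)},
    }

def block_is_valid(block):
    """Recompute the integrity check over the payload and compare."""
    return zlib.crc32(block["payload"]) == block["end"]["crc32"]
```

The same integrity check is what allows a block recovered after a power loss to be verified before it is appended to the record segment.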
To ensure that all encoded signal data is saved to the anti-collision memory module 118, 218, and optionally to the non-anti-collision removable storage 219 of FIG. 2, in the event that the data logger 154, 254 loses power or experiences extreme temperatures or mechanical stresses due to a collision or other catastrophic event, the on-board data manager 120, 220 stores temporary record blocks in the anti-collision memory module 118, 218, and optionally in the non-anti-collision removable storage 219 of FIG. 2, at a predetermined rate (308), wherein the predetermined rate is configurable and/or variable, as shown in the exemplary representation of FIG. 5. Temporary record blocks are stored at least once per second but may be stored as often as every tenth of a second; the rate at which temporary record blocks are saved depends on the sampling rate of each signal. Each temporary record block contains the complete set of records since the last full record block. To prevent the corruption or loss of more than one second of data in the event that the data logger 154, 254 loses power, the data logger 154, 254 alternates between two temporary storage locations in the anti-collision memory module 118, 218, and optionally in the non-anti-collision removable storage 219 of FIG. 2, when storing each temporary record block. Whenever a new temporary record block is saved to a temporary anti-collision memory location, the previously stored temporary record block in that location is overwritten.
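The alternating-location scheme for temporary record blocks behaves like a simple double buffer, which can be sketched as follows; the class is hypothetical and stands in for the two anti-collision memory locations:

```python
class TemporaryBlockWriter:
    """Sketch of the two alternating temporary storage locations.

    The location not currently being written always holds an intact
    block, so at most one write interval of data can be lost if power
    fails mid-write.
    """
    def __init__(self):
        self.locations = [None, None]  # the two temporary locations
        self._next = 0                 # index to overwrite next

    def save(self, temp_block):
        # Overwrite the older of the two locations, leaving the other intact.
        self.locations[self._next] = temp_block
        self._next = 1 - self._next

    def latest(self):
        # The most recently written location (checked first on power-up).
        return self.locations[1 - self._next]
```

Because `save` never touches both locations in one call, a crash during a write can corrupt at most the block being written, never the previously saved one.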
In this implementation, when the data logger 154, 254 is in near real-time mode, every five minutes the on-board data manager 120, 220 stores a full record block, comprising the last five minutes of encoded signal data, into a record segment in the anti-collision memory module 118, 218 shown in FIG. 7, and sends a copy of the full record block to the remote data manager 132, 232 for storage in the remote data store 130, 230 for a predetermined retention period, e.g., two years (310). The anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data logger 254 of FIG. 2 stores the record segments of the most recently recorded blocks for a specified storage duration, which in this embodiment is a federally mandated duration during which the data logger 154, 254 must retain the operational or video data in the anti-collision memory module 118, 218, plus an additional 24-hour buffer, after which the oldest data is overwritten.
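The retention behavior just described, where record blocks are kept for a specified storage duration and the oldest data is then overwritten, is that of a circular buffer. A minimal sketch, with an illustrative capacity rather than a mandated one:

```python
from collections import deque

class RecordSegment:
    """Circular buffer of full record blocks: once capacity is reached,
    the oldest block is discarded to make room for the newest.
    The capacity (number of five-minute blocks) is illustrative."""
    def __init__(self, max_blocks):
        self._blocks = deque(maxlen=max_blocks)

    def append(self, block):
        # deque with maxlen drops the oldest entry automatically.
        self._blocks.append(block)

    def blocks(self):
        return list(self._blocks)

    def __len__(self):
        return len(self._blocks)
```

For a federally mandated duration of, say, N hours of five-minute blocks plus the 24-hour buffer, `max_blocks` would be sized to cover that whole window.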
FIG. 4 is a flow chart of a process 400 for appending data and/or information from the asset 148, 248 after a power outage, in accordance with an embodiment of the present disclosure. Once power is restored, the data logger 154, 254 identifies the last temporary record block stored in one of the two temporary anti-collision memory locations (402) and verifies that temporary record block using the 32-bit cyclic redundancy check contained in the end record of each record block (404). The verified temporary record block is then appended to the anti-collision memory record segment, and the record segment, which may contain up to five minutes of data recorded prior to the loss of power, is sent to the remote data manager 132, 232 for storage for the retention period (406). The encoded signal data is stored in a circular buffer of the specified storage duration in the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2. Because the anti-collision memory record segment is broken up into a plurality of record blocks, whenever a full record block is saved to the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2, the data logger 154, 254 removes older record blocks as needed to free memory space.
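The power-restoration recovery step, verifying the newest temporary block with its CRC-32 and falling back to the older location if it is invalid, can be sketched as follows; representing each location as a (payload, crc32) pair is an assumption for illustration:

```python
import zlib

def recover_block(newest_location, older_location):
    """On power restoration, return the first temporary block whose
    CRC-32 verifies, preferring the newest location; return None if
    neither block is valid. Each location is a (payload_bytes, crc32)
    pair, a hypothetical layout for this sketch."""
    for payload, crc in (newest_location, older_location):
        if payload is not None and zlib.crc32(payload) == crc:
            return payload
    return None  # neither block valid; nothing to append
```

The recovered block would then be appended to the record segment and replicated to the remote data manager, as the process in FIG. 4 describes.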
FIG. 6 is a diagram illustrating exemplary temporary record blocks before a loss of power to the data recorder 154, 254 and after power is restored. When the temporary record block stored in temporary location 2 at (2/1/2016 10:10:08 am) 602 is valid, the temporary record block is appended to the record segment 702 (FIG. 7) in the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2, as shown in FIG. 7. When the temporary record block stored in temporary location 2 at (2/1/2016 10:10:08 am) is not valid, the temporary record block in temporary location 1 at (2/1/2016 10:10:07 am) is verified and, if valid, appended to the record segment in the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2.
Record segments are flushed to disk immediately whenever any record block needs to be saved in the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2. Because the data logger 154, 254 alternates between two different temporary storage locations when saving temporary record blocks, there is always one temporary storage location that is not being modified or flushed to the anti-collision memory or non-anti-collision removable storage. This ensures that at least one of the two temporary record blocks stored in the temporary storage locations is valid, and that no more than one second of data is lost whenever the data logger 154, 254 loses power. Similarly, when the data logger 154, 254 writes data to the anti-collision memory module 118, 218 and/or the optional non-anti-collision removable storage 219 of the data recorder 254 of FIG. 2 every tenth of a second, the data logger 154, 254 loses at most one tenth of a second of data whenever it loses power.
For simplicity of explanation, process 300 and process 400 are depicted and described as a series of steps. However, steps in accordance with the present disclosure may occur in various orders and/or concurrently. Additionally, steps in accordance with the present disclosure may occur with other steps not presented and described herein. Furthermore, not all illustrated steps may be required to implement a method in accordance with the disclosed subject matter.
A third embodiment of the real-time data acquisition and recording system and viewer described herein provides real-time or near real-time access to various data related to high-value assets (e.g., event and operational data, video data, and audio data) for remotely located users, such as asset owners, operators, and investigators. The data acquisition and recording system records data related to the asset via a data logger and streams the data to a remote data repository and to remotely located users before, during, and after an incident. Streaming the data to a remote data store in real-time or near real-time makes the information available as soon as an incident or emergency occurs, virtually eliminating the need to locate and download a "black box" to investigate an incident involving the asset, and eliminating the need to interact with the data logger on the asset to request the download of specific data, locate and transfer files, and view the data using a custom application. The system of the present disclosure retains typical recording capabilities while adding the ability to stream data to remote data stores and remote end users before, during, and after an incident. In most situations, the information recorded in the data logger becomes redundant because the data has already been acquired and stored in the remote data repository.
Prior to the system of the present disclosure, data was extracted from a "black box" or "event recorder" after the incident occurred and when investigation was required. The data files containing the time slices recorded by the "black box" must be downloaded and retrieved from the "black box" and then viewed by the user through proprietary software. The user must gain physical or remote access to the asset, select the desired data to be downloaded from the "black box," download the file containing the desired information to the computing device, and locate the appropriate file with the desired data using a custom application operating on the computing device. The system of the present disclosure eliminates the need for the user to perform these steps, requiring only the user to navigate to the desired data using a common web browser. Remotely located users may access a common web browser to navigate to desired data related to the selected asset to view and analyze the operational efficiency and security of the asset in real-time or near real-time.
Remotely located users (e.g., asset owners, operators, and/or investigators) may access a common web browser to navigate to current and/or historical desired data related to a selected asset to view and analyze the operational efficiency and safety of the asset in real-time or near real-time. The ability to view operations in real-time or near real-time enables rapid assessment and adjustment of behavior. During an incident, for example, real-time information and/or data can help triage the situation and provide valuable information to first responders. During normal operation, for example, near real-time information and/or data can be used to review crew performance and to support network-wide situational awareness.
The real-time data acquisition and recording system of the third embodiment uses at least one of, or any combination of, an image measurement device, a video measurement device, and a range measurement device in, on, or near a mobile asset as part of the data acquisition and recording system. Image measurement devices and/or video measurement devices include, but are not limited to, 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, and/or other cameras. Range measurement devices include, but are not limited to, radar and light detection and ranging ("LiDAR"). LiDAR is a surveying method that measures the distance to an object by illuminating the object with pulsed laser light and measuring the reflected pulses with a sensor. Prior to the system of the present disclosure, the "black box" and/or "event recorder" did not include a 360 degree camera or other cameras in, on, or near the mobile asset. The system of the present disclosure adds the ability to use 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras as part of the data acquisition and recording system to capture and record video, thereby providing 360 degree, narrow, wide, fisheye, and/or other views in, on, or near the mobile asset to a remote data repository and to remote users and investigators before, during, and after an incident involving the mobile asset. The ability to view operations, 360 degree video, and/or other video in real-time or near real-time enables rapid assessment and adjustment of crew behavior. Owners, operators, and investigators can view and analyze operational efficiency and the safety of personnel, vehicles, and infrastructure, and can investigate or inspect incidents. The ability to view 360 degree video and/or other video from a mobile asset enables rapid assessment and adjustment of crew behavior.
During an incident, for example, 360 degree video and/or other video can help triage the situation and provide valuable information to first responders and investigators. During normal operation, for example, 360 degree video and/or other video can be used to review crew performance and to support network-wide situational awareness. 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras provide a complete picture of the situation, supporting surveillance video for law enforcement and/or railroad police, critical infrastructure inspection, railway crossing monitoring, viewing of track construction progress, in-cab and in-cabin crew monitoring, real-time remote surveillance, and the like.
Previous systems required users to download video files containing time slices of data and view them using proprietary software applications or other external video playback applications that the user had to purchase separately. The data acquisition and recording system of the present disclosure provides 360 degree video, other video, image information, and audio information, as well as range measurement information, which may be displayed to a remote user through the use of a virtual reality device and/or through a standard network client, thereby eliminating the need to download and view video using external applications. In addition, remotely located users may view 360 degree video and/or other video in various modes through the use of a virtual reality device or through a standard web client (e.g., a web browser).
The data may include, but is not limited to, video and image information from cameras located at various positions in, on, or near the asset and audio information from microphones located at various positions in, on, or near the asset. A 360 degree camera is a camera that provides a 360 degree spherical field of view, a 360 degree hemispherical field of view, and/or a 360 degree fisheye field of view. The use of 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, and/or other cameras in, on, or near an asset, as part of the DARS, makes it possible to capture and record video so that 360 degree views and/or other views in, on, or near the asset are available to remote data stores, remotely located users, and investigators before, during, and after an incident.
FIG. 8 illustrates a field implementation of a third embodiment of an exemplary real-time Data Acquisition and Recording System (DARS) 800 in which aspects of the present disclosure may be implemented. DARS 800 is a system that streams real-time information, video information, and audio information from a data logger 808 on a mobile asset 830 to remotely located end users via a data center 832. The data logger 808 is mounted on the vehicle or mobile asset 830 and communicates with any number of various information sources through any combination of wired and/or wireless data links, such as a wireless gateway/router (not shown). The data logger 808 includes an anti-collision memory module 810, an onboard data manager 812, and a data encoder 814. In a fourth embodiment, the data logger 808 may also include a non-anti-collision removable storage device (not shown). The exemplary anti-collision memory module 810 may be, for example, a crash event recorder memory module that complies with the Code of Federal Regulations and Federal Railroad Administration regulations, a crash-protected memory unit that complies with the Code of Federal Regulations and Federal Aviation Administration regulations, a crash-resistant memory module that complies with any other applicable federal regulations, or any other suitable hardened memory device known in the art. The wired and/or wireless data links may include any one of, or combination of, discrete signal inputs, standard or proprietary Ethernet, serial connections, and wireless connections.
The data logger 808 gathers video data, audio data, and other data and/or information from a variety of sources, which may vary based on the configuration of the asset, over an onboard data link. In this implementation, the data logger 808 receives data from a video management system 804 that continuously records video data and audio data from the 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras 802 and fixed cameras 806 placed in, on, or near the asset 830, and stores the video and audio data in the anti-collision memory module 810 and, optionally, in the non-anti-collision removable storage of the fourth embodiment. Different versions of the video data are created using different bit rates or spatial resolutions and are separated into variable-length segments, such as thumbnail images, five-minute low-resolution segments, and five-minute high-resolution segments.
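The set of video variants maintained per stream (thumbnails plus five-minute low- and high-resolution segments) and the near-real-time restriction to limited off-board video can be sketched as follows; the resolutions and variant names are assumptions for illustration:

```python
# Hypothetical variant table for one camera stream; the disclosure
# specifies thumbnails and five-minute low/high-resolution segments,
# but the exact resolutions are not stated and are assumed here.
VIDEO_VARIANTS = [
    {"name": "thumbnail", "resolution": (160, 120), "segment_s": None},
    {"name": "low_res", "resolution": (640, 480), "segment_s": 300},
    {"name": "high_res", "resolution": (1920, 1080), "segment_s": 300},
]

def variants_for_offboard(near_real_time):
    """In near-real-time mode, only limited video (e.g. thumbnails) is
    sent off-board; full segments stay in onboard storage."""
    if near_real_time:
        return [v for v in VIDEO_VARIANTS if v["name"] == "thumbnail"]
    return VIDEO_VARIANTS
```

The full-resolution segments remain available in the anti-collision memory module and can be replicated on demand or when real-time mode is active.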
The data encoder 814 encodes at least a minimum set of data, typically defined by a regulatory agency. The data encoder 814 receives video and audio data from the video management system 804, compresses or encodes the data, and synchronizes the data in time in order to facilitate efficient real-time transmission and replication to the remote data store 820. The data encoder 814 transmits the encoded data to the onboard data manager 812, which then sends the encoded video and audio data to the remote data store 820, via the remote data manager 818 located in the data center 832, in response to an on-demand request by a remotely located user 834 or in response to certain operating conditions being observed on the asset 830. The onboard data manager 812 and the remote data manager 818 work together to manage the data replication process. The remote data manager 818 in the data center 832 may manage the replication of data from multiple assets. Video and audio data stored in the remote data store 820 are available to the web server 822 for access by the remotely located user 834.
The onboard data manager 812 also sends the data to a queuing repository (not shown). The onboard data manager 812 monitors the video and audio data stored in the anti-collision memory module 810 and/or the optional non-anti-collision removable storage device of the fourth embodiment via the video management system 804 and determines whether it is in near real-time mode or real-time mode. In near real-time mode, the onboard data manager 812 stores the encoded data (including video data, audio data, and any other data or information) and any event information received from the data encoder 814 in the anti-collision memory module 810 and/or the optional non-anti-collision removable storage of the fourth embodiment, and in the queuing repository. After five minutes of encoded data have accumulated in the queuing repository, the onboard data manager 812 stores the five minutes of encoded data to the remote data store 820 via the remote data manager 818 in the data center 832 over a wireless data link 816. In real-time mode, the onboard data manager 812 stores the encoded data (including video data, audio data, and any other data or information) and any event information received from the data encoder 814 to the remote data store 820, via the remote data manager 818 in the data center 832 over the wireless data link 816, at each configurable predetermined period of time, e.g., every second or every 0.10 seconds. The onboard data manager 812 and the remote data manager 818 may communicate over a variety of wireless communication links. The wireless data link 816 may be, for example, a Wireless Local Area Network (WLAN), a Wireless Metropolitan Area Network (WMAN), a Wireless Wide Area Network (WWAN), a private wireless system, a cellular telephone network, or any other means of communicating data from the data logger 808 to the remote data manager 818.
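The near-real-time accumulation of five minutes of encoded data before an off-board flush can be sketched as follows; the class and the way durations are tracked are illustrative assumptions:

```python
class QueuingRepository:
    """Sketch of near-real-time queueing: encoded records accumulate
    until five minutes' worth are present, then are flushed off-board
    as one batch."""
    FLUSH_THRESHOLD_S = 300.0  # five minutes of data

    def __init__(self):
        self._queued = []
        self._queued_seconds = 0.0

    def enqueue(self, record, duration_s):
        self._queued.append(record)
        self._queued_seconds += duration_s

    def maybe_flush(self, send):
        """Flush the batch through `send` once the threshold is reached."""
        if self._queued_seconds >= self.FLUSH_THRESHOLD_S:
            send(list(self._queued))
            self._queued.clear()
            self._queued_seconds = 0.0
            return True
        return False
```

In real-time mode, the equivalent loop would bypass the threshold and send each record at the configured one-second (or 0.10-second) interval.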
The process of sending and retrieving video data and audio data remotely from the asset 830 requires a wireless data connection between the asset 830 and the data center 832. When a wireless data connection is not available, the data is stored and queued in the anti-collision memory module 810 and/or the optional non-anti-collision removable storage device of the fourth embodiment until wireless connectivity is restored. Once wireless connectivity is restored, the retrieval of video, audio, and any other additional data resumes.
In parallel with data recording, the data logger 808 continuously and autonomously copies data to the remote data store 820. The replication process has two modes, real-time mode and near real-time mode. In real-time mode, data is copied to remote data store 820 every second. In near real-time mode, data is copied to remote data store 820 every five minutes. The rate for near real-time mode is configurable and the rate for real-time mode is adjustable to support high resolution data by copying data to remote data store 820 every 0.10 seconds. Near real-time mode is used under most conditions during normal operation in order to improve the efficiency of the data replication process.
The real-time mode may be initiated based on an event occurring on the asset 830 or by a request initiated from the data center 832. A typical data-center-initiated request for real-time mode occurs when a remotely located user 834 requests real-time information through the network client 826. Typical reasons for initiating real-time mode on the asset 830 are the detection of an event or incident, such as an operator initiating an emergency stop request, emergency braking activity, rapid acceleration or deceleration on any axis, or a loss of input power to the data logger 808. When transitioning from near real-time mode to real-time mode, all data that has not yet been copied to the remote data store 820 is copied and stored in the remote data store 820, and then real-time replication begins. The transition between near real-time mode and real-time mode typically occurs in less than five seconds. The data logger 808 reverts to near real-time mode after a predetermined amount of time has elapsed since the event or incident, after a predetermined amount of inactivity, or when the user 834 no longer needs real-time information from the asset 830. The predetermined amount of time required to initiate the transition is configurable and is typically set to ten minutes.
While the data logger 808 is in real-time mode, the onboard data manager 812 continuously attempts to empty its queue to the remote data manager 818, storing the data in the anti-collision memory module 810, and optionally in the non-anti-collision removable storage of the fourth embodiment, while simultaneously sending the data to the remote data manager 818.
Upon receiving video data, audio data, and any other data or information to be replicated from the data logger 808, the remote data manager 818 stores the data in the remote data store 820 in the data center 832. The remote data store 820 may be, for example, a cloud-based data store or any other suitable remote data store. When data is received, a process is initiated that causes a data decoder (not shown) to decode the most recently copied data from the remote data store 820 and send the decoded data to a remote event detector (not shown). The remote data manager 818 stores the vehicle event information in the remote data store 820. When the remote event detector receives the decoded data, it processes the decoded data to determine whether an event of interest is found in the decoded data. The remote event detector then uses the decoded information to detect events, incidents, or other predefined situations occurring involving the asset 830. Upon detecting an event of interest in the decoded data previously stored in the remote data store 820, the remote event detector stores the event information and supporting data in the remote data store 820.
The video data, audio data, and any other data or information may be made available to the user 834 in response to an on-demand request by the user 834 and/or transmitted to the remote data store 820 through the onboard data manager 812 in response to certain operating conditions being observed on the asset 830. Video data, audio data, and any other data or information stored in the remote data store 820 are available to the user 834 via the network server 822. The remotely located user 834 may access the video data, audio data, and any other data or information stored in the remote data store 820 relating to a specific asset 830, or a plurality of assets, using a standard web client 826 (e.g., a web browser) or a virtual reality device 828; in this implementation, thumbnail images from selected cameras may be displayed. The network client 826 communicates the user's 834 requests for video, audio, and/or other information to the network server 822 over the network 824 using common network standards, protocols, and techniques. The network 824 may be, for example, the Internet. The network 824 may also be, for example, a Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular telephone network, or any other means of communicating data from the network server 822 to, in this example, the network client 826. The web server 822 requests the desired data from the remote data store 820. The web server 822 then sends the requested data to the web client 826, which provides playback and real-time display of standard video, 360 degree video, and/or other video. The network client 826 plays the video data, audio data, and any other data or information for the user 834, who may interact with the 360 degree video data and/or other video data and/or still image data for viewing and analysis.
The user 834 may also download video data, audio data, and any other data or information from the network client 826 and may then interact with the 360 degree video data using the virtual reality device 828 for viewing and analysis.
The network client 826 may be enhanced by a software application that provides playback of 360 degrees of video and/or other video in a number of different modes. The user 834 may select a mode in which the software application presents video playback (e.g., a fisheye view as shown in fig. 11, a panoramic view as shown in fig. 12, a double panoramic view (not shown), a four view as shown in fig. 13, and a fisheye correction view as shown in fig. 14).
Fig. 9 is a flowchart illustrating a process 840 for recording video data, audio data, and/or information from an asset 830 according to an embodiment of the disclosure. The video management system 804 receives data signals (842) from various input components (e.g., 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras 802 and fixed cameras 806) on, in, or near the asset 830. The video management system 804 then stores the video data, audio data, and/or information in the anti-collision memory module 810 and/or the optional non-anti-collision removable storage device of the fourth embodiment using any combination of industry standard formats, such as still images, thumbnails, sequences of still images, or compressed video formats (844). The data encoder 814 creates a record containing a structured series of bits to configure and record the data signal information (846). In near real-time mode, the video management system 804 stores video data into the anti-collision memory module 810 and/or the optional non-anti-collision removable storage of the fourth embodiment, while sending only limited video data (e.g., thumbnail images or very short low resolution video clips) off-board to the remote data store 820 (848).
In another implementation, the encoded records are then sent to the on-board data manager 812, which sequentially combines a series of records in chronological order into a record block containing up to five minutes of data. A temporary record block contains less than five minutes of data, while a full record block contains the entire five minutes of data. Each record block contains all the data required to fully decode the contained signals, including a data integrity check. At a minimum, a record block must start with a beginning record and end with an ending record.
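The record-block accumulation described above can be sketched as follows. This is a minimal illustration, not the patented implementation: the five-minute span comes from the text, but the CRC-32 integrity check, the timestamp units, and all names are assumptions.

```python
import zlib

# A full record block spans five minutes of records, per the text.
BLOCK_SPAN_S = 5 * 60

class RecordBlockBuilder:
    """Combines a chronological series of encoded records into a record
    block. The block is "temporary" until it spans the full five minutes."""

    def __init__(self):
        self.records = []  # list of (timestamp_s, payload_bytes)

    def add(self, timestamp_s, payload):
        self.records.append((timestamp_s, payload))

    def is_full(self):
        # Full once the newest record is five minutes after the oldest.
        return bool(self.records) and (
            self.records[-1][0] - self.records[0][0] >= BLOCK_SPAN_S)

    def finalize(self):
        # A block starts with a beginning record, ends with an ending record,
        # and carries a data integrity check (CRC-32 here, an assumption).
        body = b"".join(payload for _, payload in self.records)
        return {
            "begin_ts": self.records[0][0],
            "end_ts": self.records[-1][0],
            "payload": body,
            "crc32": zlib.crc32(body),
        }
```

A caller would keep calling `add()` as records arrive and ship the result of `finalize()` once `is_full()` returns true, starting a fresh builder for the next block.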
To ensure that all encoded signal data is saved to the anti-collision memory module 810, and/or the optional non-anti-collision removable memory device of the fourth embodiment, in the event that the data logger 808 loses power, the on-board data manager 812 stores temporary record blocks in the anti-collision memory module 810, and/or the optional non-anti-collision removable memory device of the fourth embodiment, at a predetermined rate, where the predetermined rate is configurable and/or variable. Temporary record blocks are stored at least once per second but may be stored as often as every tenth of a second. The rate at which temporary record blocks are saved depends on the sampling rates of the individual signals. Each temporary record block contains a complete set of records since the last full record block. Temporary record blocks are written so that a loss of power to the data logger 808 cannot corrupt or lose more than one second of data; to this end, the data logger 808 may alternate between two temporary storage locations in the anti-collision memory module 810. Whenever a new temporary record block is saved to a temporary anti-collision memory location, the previously stored temporary record block in that location is overwritten.
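The alternating two-location scheme can be illustrated with a short sketch. The double-buffering idea is from the text; the in-memory list standing in for hardened memory, and all names, are assumptions.

```python
class TemporaryBlockStore:
    """Sketch of the alternating-slot scheme: each new temporary record
    block overwrites the older of two fixed locations, so a power loss
    mid-write can corrupt at most the slot being written, never the most
    recent intact copy."""

    def __init__(self):
        self.slots = [None, None]  # two temporary storage locations
        self.write_index = 0       # the slot the NEXT save overwrites

    def save(self, temp_block):
        self.slots[self.write_index] = temp_block
        self.write_index ^= 1      # alternate between the two slots

    def latest(self):
        # The most recently written slot is the one we are NOT about to write.
        return self.slots[self.write_index ^ 1]
```

After recovery from a power loss, the recorder would validate both slots (e.g., with the block's integrity check) and keep the newest valid one.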
In this implementation, when the data logger 808 is in near real-time mode, every five minutes the on-board data manager 812 stores a full record block, including the last five minutes of encoded signal data, into a record segment in the anti-collision memory module 810 and/or the optional non-anti-collision removable storage device of the fourth embodiment, and sends a copy of the full record block (including five minutes of video data, audio data, and/or information) to the remote data manager 818 for storage in the remote data store 820 for a predetermined retention period (e.g., two years). The anti-collision memory module 810 and/or the optional non-anti-collision removable memory device of the fourth embodiment stores the record segments of the most recent record blocks for a specified storage duration, which in this embodiment is a federally mandated duration during which the data logger 808 must store the operational or video data in the anti-collision memory module 810, plus an additional 24-hour buffer, after which the data may be overwritten.
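The local retention rule (mandated duration plus a 24-hour buffer, then overwrite) can be sketched as a time-bounded buffer. The durations and names below are assumptions for illustration; the real mandated duration is set by regulation.

```python
from collections import deque

class RetentionBuffer:
    """Sketch of the local retention rule: record segments are kept for the
    mandated storage duration plus a 24-hour buffer, after which they may
    be overwritten. Durations are in seconds."""

    def __init__(self, mandated_s, extra_buffer_s=24 * 3600):
        self.horizon_s = mandated_s + extra_buffer_s
        self.segments = deque()  # (end_timestamp_s, segment), oldest first

    def store(self, end_timestamp_s, segment):
        self.segments.append((end_timestamp_s, segment))
        # Drop (allow overwrite of) segments older than the retention horizon.
        while self.segments and (
                end_timestamp_s - self.segments[0][0] > self.horizon_s):
            self.segments.popleft()
```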
Fig. 10 is a flowchart illustrating a process 850 of viewing data and/or information from an asset 830 through a network client 826 (e.g., a web browser) or virtual reality device 828. When an event occurs, or when a remotely located authorized user 834 requests a segment of video data stored in the anti-collision memory module 810 via the network client 826, the on-board data manager 812 begins sending video data off-board in real time at the best resolution available for the bandwidth of the given wireless data link 816. The remotely located user 834 initiates a request (852) for specified video and/or audio data in a specified view mode through the network client 826, which communicates the request to the web server 822 over the network 824. The web server 822 requests the specified video and/or audio data from the remote data store 820 and sends the requested video and/or audio data to the network client 826 (854) via the network 824. The network client 826 displays the video and/or audio data in the view mode specified by the user 834 (856). The user 834 may then download the specified video and/or audio data for viewing on the virtual reality device 828. In another implementation, in real-time mode thumbnails are sent first at one-second intervals, then short segments of lower resolution video, and then short segments of higher resolution video.
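The thumbnails-then-low-resolution-then-high-resolution ordering at the end of the process above amounts to sending the smallest representations first. A minimal sketch, assuming three item kinds (the kind names and priority table are illustrative, not from the patent):

```python
# Ascending-size send order per the text: thumbnails first, then short
# low-resolution clips, then short high-resolution clips.
SEND_PRIORITY = {"thumbnail": 0, "low_res_clip": 1, "high_res_clip": 2}

def stream_plan(pending_items):
    """Orders pending video items so the remotely located user sees
    something as early as the wireless link's bandwidth allows."""
    return sorted(pending_items, key=lambda kind: SEND_PRIORITY[kind])
```

Because `sorted` is stable, items of the same kind keep their original (chronological) order.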
For simplicity of explanation, process 840 and process 850 are depicted and described as a series of steps. However, steps according to the present disclosure may occur in various orders and/or concurrently. Additionally, steps according to the present disclosure may occur where other steps are not present and not as described herein. Furthermore, not all illustrated steps may be required to implement a methodology in accordance with the disclosed subject matter.
A fifth embodiment of the real-time data acquisition and recording system and video analysis system described herein provides remotely located users real-time or near real-time access to various data of high-value assets (e.g., event and operational data, video data, and audio data). A data acquisition and recording system records data related to the asset and streams the data to a remote data store and to remotely located users before, during, and after an incident occurs. Streaming the data to a remote data store in real time or near real time makes the information available at least as soon as an incident or emergency occurs, thereby virtually eliminating the need to locate and download a "black box" in order to investigate an incident involving the asset. The DARS performs video analysis of recorded video data of the mobile asset to determine, for example, cab occupancy, track detection, and detection of objects proximate to the track. Remotely located users may use a common web browser to navigate to and view desired data related to a selected asset and do not need to interact with the data acquisition and recording system on the asset to request that specific data be downloaded, to locate or transfer files, or to view the data using a custom application.
DARS provides remotely located users access to video data and to the video analysis performed by the video analysis system by streaming the data to the remote data store and to remotely located users before, during, and after an incident, thereby eliminating the need for users to manually download, extract, and replay video in order to review video data to determine cab occupancy, whether crew members or unauthorized individuals were present during an incident, track detection, object detection near the track, surveys, or any other time of interest. In addition, the video analysis system provides cab occupancy status determination, track detection, object detection near the track, and lead and trailing unit determination by processing image and video data in real time, thereby ensuring that the user always has correct data available. For example, real-time image processing ensures that locomotives designated as trailing locomotives are not in traction service, improving railway safety. Previous systems provided locomotive positions within a train by using train consist functionality in a dispatch system. The dispatch system information can become outdated because it is not updated in real time and the crew may change locomotives as needed.
Prior to the system of the present disclosure, an inspection crew and/or asset personnel had to manually inspect track conditions; manually check whether the vehicle was in a pulling or trailing position; manually survey the location of each individual object of interest; manually create a database of the geographic locations of all objects of interest; periodically perform a manual field survey of each object of interest to verify its location and identify any changes in geographic location from the original survey; manually update the database when an object of interest changed location due to repair or additional infrastructure development occurring after the original database was created; select and download desired data from digital video recorders and/or data recorders and review the downloaded data and/or video offline; and inspect the track for any obstructions, with the vehicle operator physically checking for any obstructions and/or switch changes. The system of the present disclosure eliminates the need for the user to perform these steps, requiring only that the user navigate to the desired data using a common web browser. Asset owners and operators can automate and improve the efficiency and safety of a mobile asset in real time, can actively monitor track conditions, and can obtain warning information in real time. The system of the present disclosure eliminates the need for asset owners and operators to download data from data loggers in order to monitor track conditions and survey events. As an active safety system, DARS may assist an operator in checking for any obstructions, send alerts in real time and/or save information offline, and send alert information for remote monitoring and storage.
Track detection information for both current and past tracks and/or information related to object detection near the tracks may be stored in real-time in a remote data store to assist a user in viewing the information when needed. Remotely located users may access a common web browser to navigate to desired data related to the selected asset to view and analyze the operational efficiency and security of the asset in real-time or near real-time.
The real-time data acquisition and recording system of the fifth embodiment may be used to continuously monitor an object of interest and to identify in real time when the object of interest has moved, been damaged, become blocked by foliage, and/or fallen out of repair and requires maintenance. DARS uses video, image, and/or audio information to detect and identify various infrastructure objects (e.g., tracks) in the video, has the ability to follow the tracks as the mobile asset progresses, and has the ability to create, review, and periodically update a database of objects of interest with geographic locations. The real-time data acquisition and recording system of the fifth embodiment uses at least one or any combination of an image measurement device, a video measurement device, and a range measurement device in, on, or near a mobile asset as part of the data acquisition and recording system. Image measurement devices and/or video measurement devices include, but are not limited to, 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, 360 degree fisheye view cameras, and/or other cameras. Range measurement devices include, but are not limited to, radar and light detection and ranging ("LiDAR"). LiDAR is a surveying method that measures the distance to an object by illuminating the object with pulsed laser light and measuring the reflected pulses with a sensor.
The DARS may automatically check track conditions, for example counting the number of tracks present, identifying the current track on which the asset is traveling, and detecting any obstructions or defects present (e.g., ballast washed away by floods, track breaks, track overruns, switch misalignments, switch run-throughs, in-track flooding, snow accumulation, etc.), and schedule any needed inspections to avoid catastrophic events. DARS can also detect track switches and follow track changes. The DARS may further detect changes in the location of objects of interest, including whether an object is missing, blocked, and/or not present at the expected location. Track detection, infrastructure diagnostic information, and/or infrastructure monitoring information may be displayed to a user using any standard web client (e.g., a web browser), thereby eliminating the need to download files from a data logger and view the information using proprietary application software or other external applications as required by previous systems. This process may be extended to automatically create, review, and/or update databases with the geographic locations of objects of interest and to ensure compliance with federal regulations. With the system of the present disclosure, various tasks previously requiring human interaction, specialized vehicles, and/or additional equipment are performed with previously installed, federal-regulation-compliant cameras. DARS allows these tasks to be performed automatically as part of normal revenue service and daily operations as the mobile asset travels through the territory. DARS can save countless person-hours of manual work by utilizing the normal operation of the vehicle and previously installed cameras to accomplish tasks that previously required manual effort.
DARS may also perform tasks previously performed using specialized vehicles, which required closing track segments in order to inspect and locate tracks and objects of interest, typically at the expense of revenue service and the purchase and maintenance of expensive equipment. DARS further reduces the amount of time humans need to spend near the track, thereby reducing overall accidents and the number of potential casualties.
The data may include, but is not limited to, measured analog and frequency parameters such as speed, pressure, temperature, current, voltage, and acceleration derived from the asset and/or nearby assets; measured Boolean data such as switch positions, actuator positions, warning light illumination, and actuator commands; position, speed, and altitude information from the global positioning system (GPS) and additional data from a geographic information system (GIS), such as the latitude and longitude of each object of interest; internally generated information such as the legal speed limit for the asset at its current position; train control status and operational data generated by systems such as positive train control (PTC); vehicle and inertial parameters such as speed, acceleration, and bearing, for example as received from GPS; GIS data such as the latitude and longitude of each object of interest; video and image information from at least one camera located at various positions in, on, or near the asset; audio information from at least one microphone located at various positions in, on, or near the mobile asset; information about the operating plan of the mobile asset, such as route, schedule, and manifest information, sent to the mobile asset from a data center; information about environmental conditions, including current and forecasted weather conditions, for the area in which the mobile asset is currently operating or is scheduled to operate; and data derived from a combination of any of the above sources, including additional data, video and audio analysis, and analysis results.
"Track" may include, but is not limited to, railroad rails and ties for locomotive and/or train traffic. "objects of interest" may include, but are not limited to, individual infrastructure objects installed and maintained within the vicinity of a railroad track, which may be identified through artificial intelligence (e.g., supervised learning or reinforcement learning) using asset camera images and video. Supervised learning and/or reinforcement learning uses previously tagged datasets defined as "training" data to allow remote and autonomous identification of objects within a view of cameras in, on, or near the mobile asset. Supervised learning and/or reinforcement learning trains neural network models to identify patterns that occur within visual images obtained from cameras. These patterns (e.g., people, crossing gates, cars, trees, signals, switches, etc.) may be visible in only a single image. Subsequent frames within the video may also be analyzed for patterns such as blinking signals, moving cars, sleeping persons, etc. DARS may or may not require human interaction at any stage of the implementation, including but not limited to the training dataset required for marker supervised learning and/or reinforcement learning. Objects of interest include, but are not limited to, tracks, track centerline points, mileage tags, signals, crossing gates, switches, crossing, and text-based tags. "video analysis" refers to any understandable information such as, but not limited to, objects of interest, geographic orientation of objects, track obstructions, distance between objects of interest and the fluid asset, track misalignments, etc., gathered by analyzing video and/or images recorded from images, video and/or range measurement devices (e.g., at least one camera, such as a 360 degree camera, a fixed camera, a narrow view camera, a wide view camera, a 360 degree fisheye view camera, radar, photoarrival, and/or other cameras) in, on, or near the fluid asset. 
The video analytics system may also be used in any mobile asset, residential area, space, or room equipped with monitoring cameras to enhance video surveillance. In a mobile asset, a video analytics system economically and efficiently provides autonomous cab occupancy event detection to remotely located users.
FIG. 15 illustrates a field implementation of a fifth embodiment of an exemplary real-time data acquisition and recording system (DARS) 900 in which aspects of the present disclosure may be implemented. DARS 900 is a system that delivers real-time information, video information, and audio information from a data logger 902 on a mobile asset 964 to remotely located end users 968 via a data center 966. The data logger 902 is mounted on the vehicle or mobile asset 964 and communicates with any number of various information sources through any combination of wired and/or wireless data links 942, such as a wireless gateway/router (not shown). The data logger 902 gathers video data, audio data, and other data or information from a variety of sources (which may vary based on the configuration of the asset) over the onboard data link 942. The data logger 902 includes local memory components, such as an anti-collision memory module 904, in the asset 964, an on-board data manager 906, and a data encoder 908. In a sixth embodiment, the data logger 902 may also include a non-crash-resistant removable memory device (not shown). The exemplary hardened memory module 904 may be, for example, a crash event recorder memory module that complies with federal regulations and Federal Railroad Administration regulations, a crash-protected memory unit that complies with federal regulations and Federal Aviation Administration regulations, a crash-resistant memory module that complies with any applicable federal regulations, or any other suitable hardened memory device as known in the art. The wired and/or wireless data links may include any one or combination of discrete signal inputs, standard or proprietary Ethernet, serial connections, and wireless connections.
DARS 900 additionally includes a video analytics system 910 that includes a track and/or object detection and infrastructure monitoring component 914. The track detection and infrastructure monitoring component 914 includes a supervised learning and/or reinforcement learning component 924, or other neural network or artificial intelligence component, an object detection and location component 926, and an obstacle detection component 928 that detects obstacles present on or near the track and/or camera obstructions (e.g., people blocking the view of the camera). In this embodiment, live video data is captured by at least one camera 940 mounted in the cab of the asset 964, on the asset 964, or near the asset 964. The camera 940 is placed at an appropriate height and angle to capture video data in and around the asset 964 and obtain a sufficient view for further processing. Live video data and image data of the area in front of and/or around the asset 964 are captured by the camera 940 and fed to the track and/or object detection and infrastructure monitoring component 914 for analysis. The track detection and infrastructure monitoring component 914 of the video analytics system 910 processes the live video and image data frame by frame to detect the presence of tracks and any objects of interest. Camera position parameters (e.g., height, angle, shift, focal length, and field of view) may be fed to the track and/or object detection and infrastructure monitoring component 914, or the camera 940 may be configured to allow the video analytics system 910 to detect and determine the camera position and parameters.
To make status determinations, such as cab occupancy detection, the video analytics system 910 uses the supervised learning and/or reinforcement learning component 924 and/or other artificial intelligence and learning algorithms to evaluate, for example, video data from the cameras 940, asset data 934 such as speed, GPS data, and inertial sensor data, weather component 936 data, and route/crew manifest and GIS component 938 data. Cab occupancy detection is inherently susceptible to sources of environmental noise, such as light reflecting off clouds and sunlight filtering through buildings and trees as the asset moves. To handle environmental noise, the supervised learning and/or reinforcement learning component 924, the object detection and location component 926, the obstacle detection component 928, the asset component 934 data (which may include speed, GPS data, and inertial sensor data), the weather component 936 data, and other learning algorithms are combined to form internal and/or external status determinations involving the mobile asset 964. The track and/or object detection and infrastructure monitoring component 914 may also include a facial recognition system adapted to allow authorized access to the locomotive as part of a locomotive security system, a fatigue detection component adapted to monitor crew member alertness, and an activity detection component that detects unauthorized activities (e.g., smoking).
Additionally, the video analytics system 910 may receive location information (including latitude and longitude coordinates) for signals (e.g., stop signals, traffic signals, speed limit signals, and/or signals for objects approaching a track) from an asset owner. The video analytics system 910 then determines whether the location information received from the asset owner is correct. If the location information is correct, the video analytics system 910 stores the information and does not review the location information again for a predetermined amount of time, such as on a monthly basis. If the location information is incorrect, the video analytics system 910 determines and reports the correct location information to the asset owner, stores the corrected location information, and does not review the location information again for a predetermined amount of time, such as on a monthly basis. Storing the location information makes it easier to detect signals such as stop signals, traffic signals, speed limit signals, and/or signals for objects approaching a track.
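The verify-then-store-or-correct logic above can be sketched in a few lines. The comparison tolerance, the function name, and the (lat, lon) tuple representation are assumptions for illustration; only the correct/incorrect branching comes from the text.

```python
def verify_signal_location(reported, detected, tolerance_deg=1e-4):
    """Compares an owner-reported signal location (lat, lon) against the
    location derived from video analytics. Returns (is_correct, stored),
    where `stored` is the location kept until the next periodic review."""
    correct = (abs(reported[0] - detected[0]) <= tolerance_deg and
               abs(reported[1] - detected[1]) <= tolerance_deg)
    # If correct, keep the reported location as-is; otherwise report and
    # store the corrected location determined by the analytics.
    return (True, reported) if correct else (False, detected)
```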
The supervised learning and/or reinforcement learning component 924 is used to perform supervised and/or reinforcement learning on the track to determine learned data, utilizing information obtained from successive video and/or image frames as well as additional information received from the data center 966 and the vehicle data component 934 (including inertial sensor data and GPS data). The object detection and location component 926 uses the learned data received from the supervised learning and/or reinforcement learning component 924 and specific information about the mobile asset 964 and railways (e.g., track width and curvature, tie positioning, and vehicle speed) to distinguish tracks, signs, signals, etc. from other objects and determine object detection data. The obstacle detection component 928 uses the object detection data received from the object detection and location component 926 (e.g., information regarding obstacles present on or near the track and/or camera obstructions such as people blocking the camera view) and additional information from the weather component 936, the route/crew manifest and GIS data component 938, and the vehicle data component 934 (including inertial sensor data and GPS data) to enhance accuracy and determine obstacle detection data. The mobile asset data from the vehicle data component 934 includes, but is not limited to, speed, bearing, acceleration, yaw/pitch rate, and rail crossings. Additional information received and utilized from the data center 966 includes, but is not limited to, day and night details and the geographic location of the mobile asset 964.
The infrastructure object of interest information and diagnostic and monitoring information processed by the track and/or object detection and infrastructure monitoring component 914 is sent via an on-board data link 942 to the data encoder 908 of the data logger 902 to encode the data. The data logger 902 stores the encoded data in the anti-collision memory module 904, and optionally in the optional non-anti-collision removable memory device of the sixth embodiment, and sends the encoded information to a remote data manager 946 in the data center 966 via a wireless data link 944. Remote data manager 946 stores the encoded data in a remote data store 948 in data center 966.
To make obstacle detection 928 or object detection 926 determinations, such as the track in front of the asset 964, objects on and/or near the track, obstacles on or near the track, and/or obstructions blocking the camera's view, the video analytics system 910 uses the supervised learning and/or reinforcement learning component 924 or other artificial intelligence, the object detection and location component 926, the obstacle detection component 928, and other image processing algorithms to process and evaluate camera image and video data from the camera 940 in real time. The track and/or object detection and infrastructure monitoring component 914 uses the processed video data and asset component 934 data (which may include speed, GPS data, and inertial sensor data), weather component 936 data, and route/crew manifest and GIS component 938 data to determine external status determinations, such as whether the mobile asset is in a pulling or trailing position, in real time. When processing image and video data for track and/or object detection, for example, the video analytics system 910 automatically configures the camera 940 parameters required for track detection, detects switch crossings, counts the number of tracks, detects any additional tracks alongside the asset 964, determines the track on which the asset 964 is currently running, detects track geometry defects, detects track erosion scenarios, such as water near the track within the track's defined limits, and detects missing slope or track scenarios. Object detection accuracy depends on the existing lighting conditions in and around the asset 964. DARS 900 handles different lighting conditions with the aid of additional data collected from the asset 964 and the data center 966.
DARS 900 is enhanced to operate under a variety of lighting conditions, operate under a variety of weather conditions, detect more objects of interest, integrate with existing database systems to create, audit, and automatically update data, detect multiple tracks, operate on curved tracks, detect any obstructions, detect any track defects that may cause safety problems, and operate in low-cost embedded systems.
Internal and/or external status determinations (e.g., cab occupancy), object detection and location (e.g., track detection and detection of objects proximate to the track), and obstacle detection (e.g., obstacles on or near the track and obstructions blocking the camera) from the video analytics system 910 are provided to the data logger 902, along with any data from the vehicle management system (VMS) or digital video recorder component 932, via the on-board data link 942. The data logger 902 stores the internal and/or external status determinations, the object detection and location component 926 data, and the obstacle detection component 928 data in the anti-collision memory module 904, and optionally in the non-anti-collision removable memory device of the sixth embodiment, and in the remote data store 948 via the remote data manager 946 located in the data center 966. The web server 958 provides the internal and/or external status determinations, object detection and location component 926 information, and obstacle detection component 928 information to remotely located users 968 via a web client 962 upon request.
The data encoder 908 encodes at least a minimum set of data, typically defined by regulatory authorities. The data encoder 908 receives video, image, and audio data from any of the camera 940, video analysis system 910, and video management system 932 and compresses or encodes the data and synchronizes the data in time in order to facilitate efficient real-time transmission and replication to the remote data store 948. The data encoder 908 transmits the encoded data to the on-board data manager 906, which on-board data manager 906 then sends the encoded video, image, and audio data to the remote data store 948 via the remote data manager 946 located in the data center 966 in response to an on-demand request by the user 968 or in response to certain operating conditions being observed on the asset 964. The onboard data manager 906 and the remote data manager 946 work together to manage the data replication process. A remote data manager 946 in the data center 966 may manage the copying of data from the plurality of assets 964.
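The time-synchronization step the encoder performs can be illustrated as a chronological merge of per-source streams. This is a minimal stand-in, not the patented encoder: it assumes each source yields (timestamp, sample) pairs already in time order, and all names are illustrative.

```python
import heapq

def time_synchronize(*streams):
    """Merges per-source streams of (timestamp_s, sample) pairs into one
    chronologically ordered sequence, a minimal stand-in for the encoder's
    time-synchronization step before replication to the remote data store.
    Each input stream must itself already be in time order."""
    return list(heapq.merge(*streams, key=lambda item: item[0]))
```

`heapq.merge` keeps only one pending item per stream in memory, which suits long-running capture where whole streams never fit in RAM at once.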
The onboard data manager 906 determines whether detected events, internal and/or external status determinations, object detection and location data, and/or obstacle detection data should be queued or sent immediately based on the prioritization of the detected event. For example, in a normal operating scenario, detecting an obstacle on the track is far more critical than detecting whether someone is in the cab of the asset 964. The onboard data manager 906 also sends the data to a queuing store (not shown). In near real-time mode, the onboard data manager stores the encoded data and any event information received from the data encoder 908 in the anti-collision memory module 904 and in the queuing store. After five minutes of encoded data have accumulated in the queuing store, the on-board data manager 906 stores the five minutes of encoded data to the remote data store 948 via the remote data manager 946 in the data center 966 over the wireless data link 944. In real-time mode, the onboard data manager 906 stores the encoded data and any event information received from the data encoder 908 to the anti-collision memory module 904 and to the remote data store 948 over the wireless data link 944, via the remote data manager 946 in the data center 966, at each configurable predetermined time period, for example every second or every 0.10 seconds.
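The near real-time accumulate-then-flush behavior of the queuing store can be sketched as follows. The five-minute default span comes from the text; the class name, timestamp units, and return convention are assumptions.

```python
class QueuingStore:
    """Sketch of the near real-time queuing store: encoded data accumulates
    until it spans the configured period (five minutes by default), then
    the whole batch is handed off for replication to the remote data
    store."""

    def __init__(self, flush_span_s=300):
        self.flush_span_s = flush_span_s  # 300 s near real-time; ~1 s real-time
        self.pending = []
        self.first_ts = None

    def add(self, timestamp_s, encoded):
        if self.first_ts is None:
            self.first_ts = timestamp_s
        self.pending.append(encoded)
        if timestamp_s - self.first_ts >= self.flush_span_s:
            # Hand the accumulated batch to the remote data manager and reset.
            batch, self.pending, self.first_ts = self.pending, [], None
            return batch
        return None
```

Configuring `flush_span_s` down to one second (or less) effectively reproduces the real-time mode's per-period sends.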
In this embodiment, the on-board data manager 906 transmits video data, audio data, internal and/or external state determinations, object detection and bearing information, obstacle detection information, and any other data or event information to the remote data store 948 via a remote data manager 946 in the data center 966 through a wireless data link 944. The wireless data link 944 may be, for example, a Wireless Local Area Network (WLAN), a Wireless Metropolitan Area Network (WMAN), a Wireless Wide Area Network (WWAN), a Wireless Virtual Private Network (WVPN), a cellular telephone network, or any other means of transferring data from the data logger 902 to the remote data manager 946 (in this example). The process of remotely retrieving data from the asset 964 requires a wireless connection between the asset 964 and the data center 966. When the wireless data connection is not available, the data is stored and queued until the wireless connectivity is restored.
In parallel with the data recording, the data recorder 902 continuously and autonomously copies data to the remote data store 948. The replication process has two modes: real-time mode and near real-time mode. In real-time mode, data is copied to the remote data store 948 every second. In near real-time mode, data is copied to the remote data store 948 every five minutes. The rate for near real-time mode is configurable, and the rate for real-time mode is adjustable to support high-resolution data by copying data to the remote data store 948 every 0.10 seconds. Near real-time mode is used under most conditions during normal operation in order to improve the efficiency of the data replication process.
The real-time mode may be initiated based on an event occurring at the asset 964 or by a request initiated from the data center 966. A typical data center 966 initiated request for real-time mode occurs when a remotely located user 968 requests real-time information from a network client 962. Typical reasons for initiating real-time mode on the asset 964 are the detection of an event or incident involving the asset 964, such as an operator initiating an emergency stop request, emergency braking activity, rapid acceleration or deceleration on any axis, or a loss of input power to the data logger 902. When transitioning from near real-time mode to real-time mode, all data that has not yet been copied to the remote data store 948 is copied and stored in the remote data store 948, and then current copying is initiated. The transition between near real-time mode and real-time mode typically occurs in less than five seconds. The data logger 902 reverts to near real-time mode after a predetermined amount of time has elapsed since the event or incident, after a predetermined amount of inactivity time, or when the user 968 no longer needs real-time information from the asset 964. The predetermined amount of time required to initiate the transition is configurable and is typically set to ten minutes.
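The mode switching described above can be sketched as a small state machine. The intervals (five minutes, one second, optionally 0.10 seconds) and the ten-minute revert come from the text; the class and method names are illustrative assumptions.

```python
class ReplicationModeController:
    """Minimal sketch of near real-time / real-time mode switching."""

    NEAR_REAL_TIME_INTERVAL = 300.0    # seconds between copies (configurable)
    REAL_TIME_INTERVAL = 1.0           # adjustable down to 0.10 s for high resolution
    REVERT_AFTER = 600.0               # configurable, typically ten minutes

    def __init__(self):
        self.mode = "near_real_time"
        self._last_trigger = None

    def trigger_real_time(self, now):
        """Called on an onboard event or a data center request; the caller is
        expected to flush any uncopied backlog before current copying begins."""
        self.mode = "real_time"
        self._last_trigger = now

    def copy_interval(self, now):
        """Return the current replication interval, reverting to near
        real-time mode once the revert timeout has elapsed."""
        if (self.mode == "real_time"
                and now - self._last_trigger >= self.REVERT_AFTER):
            self.mode = "near_real_time"
        return (self.REAL_TIME_INTERVAL if self.mode == "real_time"
                else self.NEAR_REAL_TIME_INTERVAL)
```

A replication loop would call `copy_interval` before each copy, so an event or user request immediately tightens the copy cadence and the system relaxes back on its own after the configured timeout.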
While the data logger 902 is in real-time mode, the onboard data manager 906 attempts to continuously empty its queue to the remote data manager 946, storing the data to the anti-collision memory module 904, and optionally to the non-anti-collision removable storage of the sixth embodiment, while simultaneously sending the data to the remote data manager 946.
Upon receiving video data, audio data, internal and/or external state determinations, object detection and bearing information, obstacle detection information, and any other data or information from the data logger 902, the remote data manager 946 stores the data, as well as the data received from the onboard data manager 906 (e.g., encoded data and detected event data), in the remote data store 948 in the data center 966. The remote data store 948 can be, for example, a cloud-based data store or any other suitable remote data storage device. When data is received, a process is initiated that causes the data decoder 954 to decode the most recently copied data from the remote data store 948 and send the decoded data to the track/object detection/bearing information component 950, which searches the stored data for additional 'post-processed' events. In this implementation, the track/object detection/bearing information component 950 includes an object/obstacle detection component for determining internal and/or external state determinations, object detection and bearing information, and obstacle detection information. Upon detecting internal and/or external information, object detection and bearing information, and/or obstacle detection information, the track/object detection/bearing information component 950 stores the information in the remote data store 948.
Remotely located users 968 may access video data, audio data, internal and/or external status determinations, object detection and bearing information, obstacle detection information, and any other information stored in the remote data store 948 (including track information, asset information, and cab occupancy information) related to a particular asset 964 or multiple assets using a standard network client 962 (e.g., a web browser) or a virtual reality device (not shown) (e.g., virtual reality device 828 in fig. 8); in this implementation, thumbnail images from selected cameras may be displayed. The network client 962 communicates the user's 968 requests for information to the network server 958 over the network 960 using common network standards, protocols, and techniques. The network 960 may be, for example, the Internet. The network 960 may also be a Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular telephone network, or, in this example, any other means of communicating data from the network server 958 to the network client 962. The network server 958 requests the desired data from the remote data store 948, and the data decoder 954, upon request from the network server 958, obtains the requested data regarding the particular asset 964 from the remote data store 948. The data decoder 954 decodes the requested data and sends the decoded data to the locator 956. Because the original encoded data and detected track/object detection/bearing information are saved to the remote data store 948 using Coordinated Universal Time (UTC) and the International System of Units (SI), the locator 956 identifies the profile settings set by the user 968 accessing the network client 962 and uses the profile settings to prepare the information to be sent to the network client 962 for presentation to the user 968. The locator 956 converts the decoded data into a format desired by the user 968, such as the unit of measure and language preferred by the user 968.
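The locator's conversion from stored UTC/SI values into a user's preferred presentation can be sketched as follows. The storage in UTC and SI units is stated in the text; the profile fields, record layout, and the imperial/metric split are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

def localize(record, profile):
    """Convert a stored record (SI units, UTC timestamp) per a user profile."""
    out = {}
    if profile.get("units") == "imperial":
        out["speed"] = round(record["speed_mps"] * 2.236936, 2)   # m/s -> mph
        out["speed_unit"] = "mph"
    else:
        out["speed"] = round(record["speed_mps"] * 3.6, 2)        # m/s -> km/h
        out["speed_unit"] = "km/h"
    utc = datetime.fromtimestamp(record["utc_ts"], tz=timezone.utc)
    # Present the stored UTC timestamp in the user's local offset.
    local = utc.astimezone(timezone(timedelta(minutes=profile.get("utc_offset_min", 0))))
    out["time"] = local.isoformat()
    return out
```

Keeping storage canonical (UTC, SI) and converting only at presentation time, as sketched here, lets a single stored record serve users with different locales without re-encoding.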
In response to the request, the locator 956 sends the localized data, in the format preferred by the user 968, to the network server 958. The network server 958 then sends the localized data, along with internal and/or external status determinations, object detection and bearing information, and obstacle detection information (e.g., track and/or object detection (fig. 16A), track and switch detection (fig. 16B), track and/or object detection, track number count and signal detection (fig. 16C), crossing and track and/or object detection (fig. 16D), double overhead signal detection (fig. 16E), multi-track and/or multi-object detection (fig. 16F), switch and track and/or object detection (fig. 16G), and switch detection (fig. 16H)), to the network client 962 for viewing and analysis, providing playback and real-time display of standard video and 360 degree video.
The network client 962 is enhanced by a software application that provides playback of 360 degree video and/or other video in a number of different modes. The user 968 may select the mode in which the software application presents the video playback (e.g., fisheye view, fisheye-corrected view, panoramic view, double panoramic view, and quad view).
Fig. 17 is a flow chart of a process 970 for determining an internal state of the asset 964 according to an embodiment of the present disclosure. The video analytics system 910 receives data signals (972) from various input components, such as cameras 940 on, in, or near the asset 964 (including but not limited to 360 degree cameras, fixed cameras, narrow-view cameras, wide-view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras), vehicle data components 934, weather component 936, and route/manifest and GIS components 938. The video analytics system 910 processes the data signals using the supervised learning and/or reinforcement learning components (974) and determines an internal state, such as cab occupancy (976).
Fig. 18 is a flow chart illustrating a process 980 for determining object detection/bearing and obstacle detection occurring outside and inside the asset 964 according to an embodiment of the present disclosure. The video analytics system 910 receives data signals (982) from various input components, such as cameras 940 on, in, or near the asset 964 (including but not limited to 360 degree cameras, fixed cameras, narrow-view cameras, wide-view cameras, 360 degree fisheye view cameras, radar, LiDAR, and/or other cameras), vehicle data components 934, weather component 936, and route/manifest and GIS components 938. The video analytics system 910 processes the data signals (984) using the supervised learning and/or reinforcement learning component 924, the object detection/bearing component 926, and the obstacle detection component 928, and determines obstacle detection (986) and object detection and bearing, such as track presence (988).
For simplicity of explanation, process 970 and process 980 are depicted and described as a series of steps. However, steps in accordance with the present disclosure may occur in various orders and/or concurrently. Additionally, steps in accordance with the present disclosure may occur with other steps not presented and described herein. Furthermore, not all illustrated steps may be required to implement a method in accordance with the disclosed subject matter.
The real-time data acquisition and recording data sharing system works with the real-time data acquisition and recording system and viewer to provide remotely located users (e.g., asset owners, operators, and investigators) real-time or near real-time access to various data (e.g., event and operational data, video data, and audio data) of high-value assets. The data acquisition and recording system records data related to the asset and streams the data to a remote data store and to remotely located users before, during, and after an incident occurs. Streaming the data to the remote data store in real-time or near real-time makes the information available at least by the time an incident or emergency occurs, thereby virtually eliminating the need to locate and download a "black box" in order to investigate an incident involving the asset, and eliminating the need to interact with the data acquisition and recording system on the asset to request the download of specific data, to locate and transfer files, and to use a custom application to view the data. The real-time data acquisition and recording system retains typical recording capability and adds the ability to stream data to the remote data store and to remote end users before, during, and after an incident.
Remotely located users (e.g., asset owners, operators, and/or investigators) may access a standard web browser and navigate to the current and/or historical desired data related to a selected asset to view and analyze the operational efficiency and safety of the asset in real-time or near real-time. The ability to view operations in real-time or near real-time enables rapid evaluation and adjustment of behavior. During an incident, for example, real-time information and/or data may facilitate triage of the situation and provide valuable information to first responders. During normal operation, near real-time information and/or data may be used, for example, to audit crew performance and to aid in network-wide situational awareness.
Remotely located users may access a standard web browser to use the viewer and navigate to the desired data related to a selected asset to view and analyze the operational efficiency and safety of the asset in real-time or near real-time. The viewer provides the ability to view operations and/or 360 degree video in real-time or near real-time, which enables rapid evaluation and adjustment of crew behavior. Owners, operators, and investigators can view and analyze operational efficiency and the safety of people, vehicles, and infrastructure, and can investigate or verify incidents. During an incident, for example, 360 degree video may facilitate triage of the situation and provide valuable information to first responders and investigators. During normal operation, for example, 360 degree video may be used to audit crew performance and to aid in network-wide situational awareness. In addition, remotely located users may use the viewer to view 360 degree video in various modes, either through the use of a virtual reality device (e.g., virtual reality device 828 in fig. 8) or through a standard network client (e.g., a web browser), thereby eliminating the need to download and view the video using external applications.
The data sharing system allows a user to share data obtained from the data acquisition and recording system with remotely located users. Users can share data in a secure, controlled, tracked, and audited manner with remote recipient end users who have internet access and a modern web browser. Instead of sharing files, users share URLs to the data. URL-based data sharing enables users to control, track, and audit access to sensitive data. Users can share data to improve the safety of the global transportation system without fear of unauthorized data dissemination. An investigator can share data with remotely located users using a network client, without the need to locate and download a "black box".
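The URL-based sharing model described above can be sketched as follows. The rights model, function names, and URL format are illustrative assumptions; the sketch only demonstrates the two properties the text emphasizes: access is capped by the sharer's own rights, and every share is logged for audit.

```python
import secrets

SHARES = {}      # token -> {"asset": ..., "rights": ...}
AUDIT_LOG = []   # (action, sharer, recipient, asset, rights) tuples

def share(sharer_name, sharer_rights, recipient, asset_id, requested_rights):
    """Issue a share URL; granted rights never exceed the sharer's own."""
    granted = requested_rights & sharer_rights
    token = secrets.token_urlsafe(16)
    SHARES[token] = {"asset": asset_id, "rights": granted}
    AUDIT_LOG.append(("share", sharer_name, recipient, asset_id, sorted(granted)))
    return "https://viewer.example/share/" + token   # hypothetical viewer URL

def open_share(token, recipient_rights):
    """The recipient sees only data allowed by both the share and their own rights."""
    entry = SHARES.get(token)
    if entry is None:
        return None
    return entry["rights"] & recipient_rights
```

Because the recipient receives a URL rather than a file, revoking the token (deleting it from `SHARES`) withdraws access after the fact, which file-based sharing cannot do.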
The data may include, but is not limited to: analog and frequency parameters such as speed, pressure, temperature, current, voltage, and acceleration derived from the asset and/or nearby assets; Boolean data such as switch positions, actuator positions, warning light illumination, and actuator commands; Global Positioning System (GPS) data and/or Geographic Information System (GIS) data such as location, speed, and altitude; internally generated information, such as the legal speed limit of the asset at its current location; video and image information from cameras located at various positions in, on, or near the asset; audio information from microphones located at various positions in, on, or near the asset; information about the operating scheme of the asset, such as route, schedule, and manifest information, sent to the asset from a data center; information about environmental conditions, including current and forecasted weather conditions, of the area in which the asset is currently operating or is scheduled to operate; asset control status and operational data generated by systems such as positive train control (PTC) in a locomotive; and data derived from a combination of any of the above, including but not limited to additional data, video and audio analytics, and analysis results.
FIGS. 19 and 20 illustrate field implementations of first and second embodiments, respectively, of an exemplary real-time Data Acquisition and Recording System (DARS) 100, 200 in which aspects of the present disclosure may be implemented. The DARS 100, 200 includes a data logger 154, 254 that is mounted on the vehicle or mobile asset 148, 248 and communicates with any number of various information sources through any combination of onboard wired and/or wireless data links 170, 270 (e.g., a wireless gateway/router), or with off-board information sources via a data link (e.g., wireless data link 146) to a data center 150, 250 of the DARS 100, 200. The data logger 154, 254 includes an onboard data manager 120, 220, a data encoder 122, 222, a vehicle event detector 156, 256, a queuing store 158, 258, and a wireless gateway/router 172, 272. Additionally, in this embodiment, the data logger 154, 254 may include an anti-collision memory module 118, 218 and/or an Ethernet switch 162, 262 with or without Power over Ethernet (POE). The exemplary hardened memory module 118, 218 may be, for example, a crashworthy event recorder memory module that complies with federal regulations and Federal Railroad Administration regulations, a crash-protected memory unit that complies with federal regulations and Federal Aviation Administration regulations, a crash-resistant memory module that complies with any applicable federal regulations, or any other suitable hardened memory device as known in the art. In the second embodiment shown in figs. 2 and 20, the data logger 254 may additionally include an optional non-crash-resistant removable storage device 219.
The wired and/or wireless data links 170, 270 may include any one or combination of discrete signal inputs, standard or proprietary Ethernet, serial connections, and wireless connections. Ethernet-connected devices may utilize the Ethernet switch 162, 262 of the data logger 154, 254, which may utilize POE. The Ethernet switch 162, 262 may be internal or external and may support POE. In addition, data from remote data sources (e.g., the map component 164, 264, route/crew manifest component 124, 224, and weather component 126, 226 in the embodiments of figs. 1, 2, 19, and 20) is available to the onboard data manager 120, 220 and the vehicle event detector 156, 256 from the data center 150, 250 through the wireless data link 146, 246 and the wireless gateway/router 172, 272.
The data logger 154, 254 gathers data or information from a variety of sources (which may vary widely based on the asset's configuration) over the onboard data links 170, 270. The data encoder 122, 222 encodes at least a minimum set of data that is typically defined by a regulatory agency. In this embodiment, the data encoder 122, 222 receives data from a variety of sources on the asset 148, 248 and in the data center 150, 250. The information sources may include any number of components in the asset 148, 248, such as any of the following: analog inputs 102, 202, digital inputs 104, 204, I/O module 106, 206, vehicle controller 108, 208, engine controller 110, 210, inertial sensors 112, 212, Global Positioning System (GPS) 114, 214, cameras 116, 216, positive train control (PTC)/signal data 166, 266, fuel data 168, 268, cellular transmission detectors (not shown), internally driven data, and any additional data signals; and any number of components in the data center 150, 250, such as any of the route/crew manifest component 124, 224, weather component 126, 226, map component 164, 264, and any additional data signals. The cameras 116, 216, or image and/or video measuring devices, include, but are not limited to, 360 degree cameras, fixed cameras, narrow-view cameras, wide-view cameras, 360 degree fisheye view cameras, and/or other cameras, both inside and outside the asset 148. The data encoder 122, 222 compresses or encodes the data and time-synchronizes the data for efficient real-time transmission and replication to the remote data store 130, 230. The data encoder 122, 222 transmits the encoded data to the onboard data manager 120, 220, which then saves the encoded data in the anti-collision memory module 118, 218 and the queuing store 158, 258 for replication to the remote data store 130, 230 via the remote data manager 132, 232 located in the data center 150, 250.
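The encode step above, time-stamping each sample so streams from different sources can be aligned and compressing it for transmission, can be sketched minimally as follows. The JSON record layout is an assumption for illustration; a real encoder would use a regulatory-defined binary format rather than JSON.

```python
import json
import time
import zlib

def encode_sample(source, values, utc_ts=None):
    """Stamp one data sample with a UTC time and compress it for transmission."""
    record = {"src": source,
              "utc": time.time() if utc_ts is None else utc_ts,
              "values": values}
    return zlib.compress(json.dumps(record).encode("utf-8"))

def decode_sample(blob):
    """Inverse of encode_sample, as a remote data decoder would apply it."""
    return json.loads(zlib.decompress(blob).decode("utf-8"))
```

Stamping every sample with a shared UTC clock at encode time is what later allows video, audio, and sensor streams to be replayed in synchronization on the viewer.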
Optionally, the onboard data manager 120, 220 may save a tertiary copy of the encoded data in the non-anti-collision removable storage 219 of the second embodiment shown in figs. 2 and 20. The onboard data manager 120, 220 and the remote data manager 132, 232 work together to manage the data replication process. A single remote data manager 132, 232 in the data center 150, 250 can manage the replication of data from a plurality of assets 148, 248.
Data from the various input components and data from the in-cab audio/Graphical User Interface (GUI) 160, 260 are sent to the vehicle event detector 156, 256. The vehicle event detector 156, 256 processes the data to determine whether an event, incident, or other predefined situation involving the asset 148, 248 has occurred. When the vehicle event detector 156, 256 detects signals indicating that a predefined event has occurred, the vehicle event detector 156, 256 sends the processed data indicating that the predefined event occurred, together with supporting data surrounding the predefined event, to the onboard data manager 120, 220. The vehicle event detector 156, 256 detects events based on data from a variety of sources (e.g., the analog inputs 102, 202, digital inputs 104, 204, I/O module 106, 206, vehicle controller 108, 208, engine controller 110, 210, inertial sensors 112, 212, GPS 114, 214, cameras 116, 216, route/crew manifest component 124, 224, weather component 126, 226, map component 164, 264, PTC/signal data 166, 266, and fuel data 168, 268), which may vary based on the asset's configuration. When the vehicle event detector 156, 256 detects an event, the detected asset event information is stored in the queuing store 158, 258 and may optionally be presented to the crew of the asset 148, 248 via the in-cab audio/Graphical User Interface (GUI) 160, 260.
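A vehicle event detector of the kind described above can be sketched as a set of checks applied to each multi-source data sample. The signal names and thresholds below are assumptions for illustration, not values from the disclosure.

```python
# Hypothetical predicates over one data sample; each maps a sample dict to
# True when its predefined event condition is met.
CHECKS = {
    "emergency_brake": lambda s: bool(s.get("emergency_brake")),
    "rapid_accel_decel": lambda s: abs(s.get("accel_mps2", 0.0)) > 3.0,
    "input_power_loss": lambda s: s.get("input_voltage", 74.0) < 9.0,
}

def detect_events(sample):
    """Return the predefined events indicated by one multi-source data sample."""
    return [name for name, check in CHECKS.items() if check(sample)]
```

Each detected name would then be queued with its supporting data and, for high-priority events, used to trigger the switch to real-time replication.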
The onboard data manager 120, 220 also sends the data to the queuing store 158, 258. In near real-time mode, the onboard data manager 120, 220 stores the encoded data and any event information received from the data encoder 122, 222 in the anti-collision memory module 118, 218 and in the queuing store 158, 258. In the second embodiment of figs. 2 and 20, the onboard data manager 220 may optionally store the encoded data in the non-anti-collision removable storage 219. After five minutes of encoded data have accumulated in the queuing store 158, 258, the onboard data manager 120, 220 copies those five minutes of encoded data to the remote data store 130, 230 via the remote data manager 132, 232 in the data center 150, 250 over the wireless data link 146, 246 accessed through the wireless gateway/router 172, 272. In real-time mode, the onboard data manager 120, 220 stores the encoded data and any event information received from the data encoder 122, 222 to the anti-collision memory module 118, 218, and optionally in the non-anti-collision removable storage 219 of figs. 2 and 20, and to the remote data store 130, 230 via the remote data manager 132, 232 in the data center 150, 250 over the wireless data link 146, 246 accessed through the wireless gateway/router 172, 272. The onboard data manager 120, 220 and the remote data manager 132, 232 may communicate over a variety of wireless communication links (e.g., Wi-Fi, cellular, satellite, and private wireless systems) utilizing the wireless gateway/router 172, 272. The wireless data link 146, 246 may be, for example, a Wireless Local Area Network (WLAN), a Wireless Metropolitan Area Network (WMAN), a Wireless Wide Area Network (WWAN), a private wireless system, a cellular telephone network, or, in this example, any other means of transferring data from the data logger 154, 254 of the DARS 100, 200 to the remote data manager 132, 232 of the DARS 100, 200.
When a wireless data connection is not available, the data is stored in memory and queued in the queuing store 158, 258 until the wireless connectivity is restored and the data replication process can resume.
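The store-and-forward behavior described above, queue while the link is down, drain in order once connectivity returns, can be sketched as follows. The class and method names are illustrative assumptions.

```python
from collections import deque

class StoreAndForward:
    """Queue data blocks while the wireless link is down; drain on reconnect."""

    def __init__(self, send):
        self._queue = deque()
        self._send = send          # callable returning True when delivery succeeds

    def offer(self, block, link_up):
        self._queue.append(block)
        if link_up:
            self.flush()

    def flush(self):
        """Drain the queue in order, stopping at the first failed send."""
        while self._queue and self._send(self._queue[0]):
            self._queue.popleft()

    def pending(self):
        return len(self._queue)
```

Draining strictly in order preserves the time sequence of the recorded data, so the remote data store receives the backlog exactly as it would have arrived had the link never dropped.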
In parallel with the data recording, the data logger 154, 254 continuously and autonomously copies the data to the remote data store 130, 230. The replication process has two modes, real-time mode and near real-time mode. In real-time mode, data is copied to the remote data store 130, 230 every second. In near real-time mode, data is copied to the remote data store 130, 230 every five minutes. The rate for near real-time mode is configurable and the rate for real-time mode is adjustable to support high resolution data by copying data to the remote data store 130, 230 every 0.10 seconds. When the DARS100, 200 is in near real-time mode, the onboard data manager 120, 220 queues the data in the queuing store 158, 258 before copying the data to the remote data manager 132, 232. The onboard data manager 120, 220 also copies the vehicle event detector information queued in the queuing store 158, 258 to the remote data manager 132, 232. Near real-time mode is used under most conditions during normal operation in order to improve the efficiency of the data replication process.
The real-time mode may be initiated based on an event occurring and detected by the vehicle event detector 156, 256 onboard the asset 148, 248, or may be initiated by a request from the data center 150, 250. A typical data center 150, 250 initiated request for real-time mode occurs when a remotely located user 152, 252 requests real-time information from the network client 142, 242. Typical reasons for initiating real-time mode on the asset 148, 248 are the detection by the vehicle event detector 156, 256 of an event or incident, such as an operator initiating an emergency stop request, emergency braking activity, rapid acceleration or deceleration on any axis, or a loss of input power to the data logger 154, 254. When transitioning from near real-time mode to real-time mode, all data that has not yet been copied to the remote data store 130, 230 is copied and stored in the remote data store 130, 230, and then current copying is initiated. The transition between near real-time mode and real-time mode typically occurs in less than five seconds. The data logger 154, 254 reverts to near real-time mode after a predetermined amount of time has elapsed since the event or incident, after a predetermined amount of inactivity time, or when the user 152, 252 no longer needs real-time information from the asset 148, 248. The predetermined amount of time required to initiate the transition is configurable and is typically set to ten minutes.
While the data logger 154, 254 is in real-time mode, the onboard data manager 120, 220 attempts to continuously empty its queue to the remote data manager 132, 232, store the data to the anti-collision memory module 118, 218, and optionally to the non-anti-collision removable storage 219 of fig. 2 and 20, and simultaneously transmit the data to the remote data manager 132, 232. The onboard data manager 120, 220 also transmits the detected vehicle information queued in the queuing store 158, 258 to the remote data manager 132, 232.
Upon receiving the data to be replicated from the data logger 154, 254, along with the data from the map component 164, 264, the route/crew manifest component 124, 224, and the weather component 126, 226, the remote data manager 132, 232 stores the compressed data in the remote data store 130, 230 in the data center 150, 250 of the DARS 100, 200. The remote data store 130, 230 may be, for example, a cloud-based data store or any other suitable remote data store. When data is received, a process is initiated that causes the data decoder 136, 236 to decode the most recently copied data from the remote data store 130, 230 and send the decoded data to the remote event detector 134, 234. The remote data manager 132, 232 stores the vehicle event information in the remote data store 130, 230. When the remote event detector 134, 234 receives the decoded data, it processes the decoded data to determine whether an event of interest is found in the decoded data. The remote event detector 134, 234 then uses the decoded information to detect events, incidents, or other predefined situations occurring with the asset 148, 248 in the data. Upon detecting an event of interest in the decoded data stored in the remote data store 130, 230, the remote event detector 134, 234 stores the event information and supporting data in the remote data store 130, 230. When the remote data manager 132, 232 receives remote event detector 134, 234 information, the remote data manager 132, 232 stores the information in the remote data store 130, 230.
Remotely located users 152, 252 may access information related to a specific asset 148, 248 or assets, including vehicle event detector information, using a standard network client 142, 242 (e.g., a web browser) or a virtual reality device (not shown) (e.g., virtual reality device 828 in fig. 8); in this embodiment, thumbnail images from selected cameras may be displayed. The network client 142, 242 communicates the user's 152, 252 requests for information to the network server 140, 240 over the network 144, 244 using common network standards, protocols, and techniques. The network 144, 244 may be, for example, the Internet. The network 144, 244 may also be a Local Area Network (LAN), Metropolitan Area Network (MAN), Wide Area Network (WAN), Virtual Private Network (VPN), cellular telephone network, or, in this example, any other means of communicating data from the network server 140, 240 to the network client 142, 242. The network server 140, 240 requests the desired data from the data decoder 136, 236. Upon request from the network server 140, 240, the data decoder 136, 236 obtains the requested data pertaining to the specific asset 148, 248 or assets from the remote data store 130, 230. The data decoder 136, 236 decodes the requested data and sends the decoded data to the locator 138, 238. Localization is the process of converting data into a format desired by the end user, such as converting the data into the user's preferred language and units of measure. Because the original encoded data and detected event information are saved to the remote data store 130, 230 using Coordinated Universal Time (UTC) and the International System of Units (SI), the locator 138, 238 identifies the profile settings set by the user 152, 252 accessing the network client 142, 242 and uses the profile settings to prepare the information to be sent to the network client 142, 242 for presentation to the user 152, 252.
The locator 138, 238 converts the decoded data into the format desired by the user 152, 252, such as the user's 152, 252 preferred language and units of measure. In response to the request, the locator 138, 238 sends the localized data, in the format preferred by the user 152, 252, to the network server 140, 240. The network server 140, 240 then sends the localized data of the asset or assets to the network client 142, 242 for viewing and analysis, providing playback and real-time display of standard video and 360 degree video through the viewer. The network client 142, 242 can display, and the user 152, 252 can view, data, video, and audio for a single asset or for multiple assets simultaneously. The network client 142, 242 can also provide synchronized playback and real-time display of data, along with multiple video and audio streams, from standard and 360 degree video sources located on, in, or near the asset, nearby assets, and/or remote locations. The network client 142, 242 can play the video data on a viewer with which the user 152, 252 may interact for viewing and analysis. The user 152, 252 may also download the video data using the network client 142, 242 and may then interact with the video data on the viewer, using a virtual reality device (e.g., virtual reality device 828 in fig. 8), for viewing and analysis.
The network client 142, 242 is enhanced by a software application that provides playback of video data and/or 360 degree video in a number of different modes. The user 152, 252 selects the mode in which the software application presents the video playback (e.g., fisheye view, fisheye-corrected panoramic view, fisheye-corrected double panoramic view, and fisheye-corrected quad view).
Users 152, 252 may further use the data sharing system of the present disclosure to share data in a secure, controlled, tracked, and audited manner with remotely located recipient end users who have internet access and a modern web browser. Rather than sharing files, the users 152, 252 share URLs to the data. URL-based data sharing enables users to control, track, and audit sensitive data. Users can share data to improve the safety of the global transportation system without fear of unauthorized data proliferation. An administrator has the authority to increase and/or decrease the inherent rights of the users 152, 252 and of each remotely located recipient end user. The inherent rights of the user 152, 252 and of each remotely located recipient end user determine whether a particular remotely located recipient end user has the right to view data on the web client 142, 242. Owners, operators, and investigators of the assets 148, 248 use the data sharing system to share real-time data regarding the operational efficiency and safety of their vehicles. Data sharing enables rapid assessment and adjustment of behavior.
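The interplay of inherent rights described above can be sketched in a few lines: a sharer cannot grant more than they hold, a recipient cannot see more than their own rights allow, and only an administrator may raise or lower a user's inherent rights. The level names and numeric encoding below are assumptions for illustration, not taken from the disclosure.

```python
# Illustrative rights model: higher integers mean broader access.
# The level names and values are assumptions, not defined by the patent.
RIGHTS = {"none": 0, "summary": 1, "full": 2}

def viewable_level(sharer_rights: int, recipient_rights: int) -> int:
    """What a recipient may actually view: the lesser of what the sharer
    is allowed to share and what the recipient is allowed to see."""
    return min(sharer_rights, recipient_rights)

def admin_adjust(current: int, delta: int) -> int:
    """An administrator raises (positive delta) or lowers (negative delta)
    a user's inherent rights, clamped to the defined range."""
    return max(RIGHTS["none"], min(RIGHTS["full"], current + delta))

# A "full"-rights sharer sharing with a "summary"-rights recipient:
print(viewable_level(RIGHTS["full"], RIGHTS["summary"]))  # recipient sees only summary
```

Computing the effective view level at request time, rather than copying data to the recipient, is what lets the system keep sharing controlled and auditable.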
FIG. 21 is a flow chart illustrating a process 1000 for sharing data and/or information from assets 148, 248 through a web client 142, 242 or a virtual reality device (e.g., virtual reality device 828 in FIG. 8). Typically, the user 152, 252 will request that the data center 150, 250 share asset 148, 248 data (FIG. 3) using the web client 142, 242 (1002). Typical reasons for data sharing are the detection of an incident, such as an operator-initiated emergency stop request, emergency braking activity, rapid acceleration or deceleration on any axis, and/or loss of input power to the DARS 100, 200. No file is downloaded or sent to a remotely located recipient end user. The users 152, 252 cannot share more content than their inherent rights on the web client 142, 242 allow. Remotely located recipient end users are able to view the data based on their own inherent rights on the web client 142, 242. The web clients 142, 242 record such sharing activity in the data centers 150, 250. An administrator can use a rights upgrade to share data through the web client 142, 242 with multiple users 152, 252 who do not themselves have access to the data. The web clients 142, 242 also record such rights upgrade activity in the data centers 150, 250.
As previously discussed, the users 152, 252 access information, including vehicle event detector 156, 256 information, using the web clients 142, 242. The web clients 142, 242 exchange the information desired by the users 152, 252 with the web servers 140, 240 over the networks 144, 244, such as the internet or a private network, using common network standards, protocols, and technologies. As described above, the web servers 140, 240 request the desired data from the data decoders 136, 236. The data decoder 136, 236 extracts the data, which is then localized by the locator 138, 238, converting it into the format desired by the user 152, 252. The web server 140, 240 then sends the localized data to the web client 142, 242 for viewing and analysis (1004) (FIG. 21).
The sharer end user 152, 252 may share this information, including vehicle event detector 156, 256 information and video data, with a plurality of remotely located recipient end users using the web client 142, 242, regardless of whether the recipient end users have pre-registered accounts on the web client 142, 242. During this process, the web client 142, 242 generates an email with a URL pointing to the data in the data center 150, 250 (1006) (FIG. 21). The remotely located recipient end users receive the email with the URL address used to access the data. The URL address is not a link to a file; files are not shared with the recipient end user. The data is not a discrete file, but rather a range of data pulled from the remote data store 130, 230 based on a shared web-based viewer link. The URL address sent via email is a link to the web-based viewer that allows the recipient end user to view specific pieces of data synchronized with still images and video. When a remotely located recipient end user clicks on the URL, the shared information can be accessed using their own web client 142, 242, and the web client 142, 242 records the sharing activity in the data center 150, 250.
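The URL-based sharing step can be illustrated as follows. The viewer endpoint, query-parameter names, and token scheme below are hypothetical; the disclosure specifies only that the email carries a link to a web-based viewer over a range of stored data, not a file attachment.

```python
import urllib.parse
import uuid

# Hypothetical viewer endpoint; the real host and path are not part of the
# disclosure.
VIEWER_BASE = "https://viewer.example.com/share"

def make_share_url(asset_id: str, start_utc: str, end_utc: str) -> str:
    """Build a link to a web-based viewer over a range of stored data.
    No file is referenced; the viewer pulls the range on demand."""
    token = uuid.uuid4().hex  # lets the data center log, audit, and revoke the share
    query = urllib.parse.urlencode(
        {"asset": asset_id, "from": start_utc, "to": end_utc, "token": token}
    )
    return f"{VIEWER_BASE}?{query}"

def make_share_email(recipient: str, url: str) -> str:
    """Compose the notification email carrying only the viewer URL."""
    return (f"To: {recipient}\nSubject: Shared asset data\n\n"
            f"View the shared data in the web-based viewer:\n{url}\n")

url = make_share_url("48", "2019-06-05T14:00:00Z", "2019-06-05T14:05:00Z")
print(make_share_email("investigator@example.com", url))
```

Because the email carries only a link plus a revocable token, access can be checked against the recipient's rights on every view and every access can be logged, which a mailed file copy could not support.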
For simplicity of explanation, the process 1000 is depicted and described as a series of steps. However, steps according to the present disclosure may occur in various orders and/or concurrently. Additionally, steps according to the present disclosure may occur where other steps are not present and not as described herein. Furthermore, not all illustrated steps may be required to implement a methodology in accordance with the disclosed subject matter.
As used in this application, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, "X includes A or B" is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then "X includes A or B" is satisfied under any of the foregoing instances. Furthermore, "X includes at least one of A and B" is intended to mean any of the natural inclusive permutations. That is, if X includes A; X includes B; or X includes both A and B, then "X includes at least one of A and B" is satisfied under any of the foregoing instances. The articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form. Moreover, use of the term "an embodiment" or "one embodiment" throughout is not intended to mean the same example, aspect, or implementation unless described as such.
While the disclosure has been described in connection with certain embodiments, it is to be understood that the disclosure is not to be limited to the disclosed embodiments, but on the contrary is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.

Claims (20)

1. A method for providing access to view data from at least one mobile asset, comprising:
using a data logger onboard the mobile asset, receiving data based on at least one data signal from at least one of:
at least one data source onboard the mobile asset; and
At least one data source remote to the mobile asset;
Encoding a predetermined amount of the data into encoded data using a data encoder of the data recorder;
appending the encoded data to a data segment using an on-board data manager of the data logger;
Storing, using the on-board data manager, at least one of the encoded data and the data segments in at least one of at least one local memory component of the data logger and a queuing store of the data logger at a first predetermined rate that is configurable;
transmitting at least one of the encoded data and the data segments to a remote data manager using the on-board data manager at a configurable second predetermined rate, wherein the second predetermined rate is in the range of zero seconds and five minutes and includes zero seconds and five minutes;
Storing, using the remote data manager, at least one of the encoded data and the data fragment to a remote data store;
receiving, using a web server, a request from a first user, the request including at least one email address of at least one second user and specified data stored in the remote data store selected by the first user;
identifying the specified data in the remote data store;
determining a uniform resource locator, URL, the URL comprising a shared web-based viewer link adapted to provide access to view the specified data under a first condition that the first user includes a first inherent right to allow access to view the specified data;
Generating an email including the URL;
Sending the email to the at least one email address using the web server; and
The specified data is displayed on the web browser of the at least one second user when the second user selects the URL provided in the email under a second condition that the second user includes a second inherent right to allow access to view the specified data.
2. The method of claim 1, further comprising:
requesting the specified data from a data decoder using the web server; and
The specified data is decoded using the data decoder.
3. The method of claim 2, further comprising:
The specified data is processed using a locator into processed specified data comprising a predetermined language and at least one predetermined unit of measure.
4. The method of claim 1, further comprising:
Using a web client, storing a record in the remote data store, the record including at least one of the email, the URL, the specified data, the first user, and the second user.
5. The method according to claim 1, wherein:
the at least one data source onboard the mobile asset includes at least one of analog input, digital input, input and output modules, a vehicle controller, an engine controller, inertial sensors, a global positioning system, at least one camera, fuel data, positive train control (PTC) signal data, 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, and 360 degree fisheye view cameras, and wherein the at least one data source remote from the mobile asset includes at least one of map components, route and crew manifest components, weather components, 360 degree cameras, fixed cameras, narrow view cameras, wide view cameras, and 360 degree fisheye view cameras; and
wherein the data includes at least one of speed, pressure, temperature, current, voltage, acceleration from the asset, acceleration from a remote asset, switch position, actuator position, warning light illumination, actuator command, position, altitude, internally generated information, video information, audio information, route, schedule, manifest information, environmental conditions, current weather conditions, and forecasted weather conditions.
6. The method of claim 1, wherein an administrator is capable of performing at least one of: increasing the first inherent rights of the first user and the second inherent rights of the at least one second user; and reducing the first inherent rights of the first user and the second inherent rights of the at least one second user.
7. The method of claim 6, wherein at least one of:
the first user is able to request the specified data on condition that the administrator raises the first inherent rights of the first user; and
the at least one second user is able to view the specified data on condition that the administrator raises the second inherent rights of the at least one second user.
8. The method as recited in claim 1, further comprising:
Transmitting the request for the specified data from a first network client of the first user to the network server;
requesting the specified data from a data decoder using the web server;
Decoding the specified data into decoded specified data using the data decoder; and
The decoded specified data is processed using a locator into processed specified data, the processed specified data comprising a predetermined language and at least one predetermined unit of measure.
9. The method of claim 1, wherein the first user is able to share the URL with a plurality of remotely located recipient end users.
10. The method of claim 1, wherein the at least one second user does not need to have a pre-registered account.
11. A system for providing access to view data from at least one mobile asset, comprising:
a data recorder onboard the mobile asset, the data recorder comprising at least one local memory component, a data encoder, an on-board data manager, and a queuing repository, the data recorder adapted to receive data based on at least one data signal from at least one of:
at least one data source onboard the mobile asset; and
At least one data source remote to the mobile asset; and
The data encoder adapted to encode a predetermined amount of the data into encoded data;
the on-board data manager adapted to:
appending the encoded data to a data segment;
Storing at least one of the encoded data and the data fragments at a configurable first predetermined rate in at least one of the at least one local memory component and the queuing repository; and
Transmitting a predetermined amount of at least one of the encoded data and the data fragments to a remote data manager via a wireless data link at a configurable second predetermined rate, wherein the second predetermined rate is in the range of and including zero seconds and five minutes, the remote data manager being adapted to store the predetermined amount of at least one of the encoded data and the data fragments to a remote data store;
a web server adapted to receive a request from a first user, the request comprising at least one email address of at least one second user and specified data stored in the remote data store selected by the first user, the web server further adapted to determine a uniform resource locator, URL, the URL comprising a shared web-based viewer link adapted to provide access to view the specified data under a first condition that the first user includes a first inherent right to allow access to view the specified data, generate an email comprising the URL, and send the email to the at least one email address of the at least one second user.
12. The system of claim 11, further comprising:
A remote data decoder adapted to decode the specified data into decoded data.
13. The system of claim 11, further comprising:
A locator adapted to process the specified data into processed specified data comprising a predetermined language and at least one predetermined unit of measure.
14. The system of claim 11, further comprising:
A web client adapted to store a record in the remote data store, the record comprising the URL, the specified data, the first user, an email address of the first user, the at least one second user, and the at least one email address of the at least one second user.
15. The system of claim 11, further comprising:
a first network client adapted to send the request for specified data to the network server;
said web server adapted to request said specified data from a data decoder;
the data decoder adapted to decode the specified data into decoded specified data; and
A locator adapted to process the decoded specified data into processed specified data, the processed specified data comprising a predetermined language and at least one predetermined unit of measure.
16. The system of claim 11, wherein the first user is capable of sharing the URL with a plurality of remotely located recipient end users.
17. The system of claim 11, wherein the at least one second user does not need to have a pre-registered account.
18. The system of claim 11, further comprising:
At least one second web client adapted to display the specified data on the at least one second user's web browser when the at least one second user selects the URL provided in the email under a second condition that the at least one second user includes a second inherent right to allow access to view the specified data.
19. The system of claim 18, wherein an administrator is capable of performing at least one of: increasing the first inherent rights of the first user and the second inherent rights of the at least one second user; and
reducing the first inherent rights of the first user and the second inherent rights of the at least one second user.
20. The system of claim 18, wherein at least one of:
the first user is able to request the specified data on condition that an administrator raises the first inherent rights of the first user; and
the at least one second user is able to view the specified data on condition that the administrator raises the second inherent rights of the at least one second user.
CN201980045424.0A 2018-06-05 2019-06-05 Real-time data acquisition and recording data sharing system Active CN112384902B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201862680907P 2018-06-05 2018-06-05
US62/680,907 2018-06-05
US16/431,466 2019-06-04
US16/431,466 US11423706B2 (en) 2016-05-16 2019-06-04 Real-time data acquisition and recording data sharing system
PCT/US2019/035606 WO2019236719A1 (en) 2018-06-05 2019-06-05 Real-time data acquisition and recording data sharing system

Publications (2)

Publication Number Publication Date
CN112384902A CN112384902A (en) 2021-02-19
CN112384902B true CN112384902B (en) 2024-06-25

Family

ID=68769554

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980045424.0A Active CN112384902B (en) 2018-06-05 2019-06-05 Real-time data acquisition and recording data sharing system

Country Status (9)

Country Link
JP (1) JP7419266B2 (en)
KR (1) KR102680231B1 (en)
CN (1) CN112384902B (en)
AU (1) AU2019280705B2 (en)
CA (1) CA3102127C (en)
MX (1) MX2020013121A (en)
PE (1) PE20210637A1 (en)
WO (1) WO2019236719A1 (en)
ZA (1) ZA202007451B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220044183A1 (en) * 2020-08-05 2022-02-10 Wi-Tronix, Llc Engineer recertification assistant
US12118833B2 (en) * 2020-11-06 2024-10-15 Wi-Tronix, Llc Connected diagnostic system and method
CN112907782A (en) * 2021-02-18 2021-06-04 江西洪都航空工业集团有限责任公司 Multisource multi-type data playback device based on time synchronization
US20240112705A1 (en) * 2022-10-01 2024-04-04 It Us Acquisition Co, Llc Cloud-connected dash camera with continuous recording capability
GB2624689A (en) * 2022-11-28 2024-05-29 Continental Automotive Tech Gmbh A 360-degrees motor vehicle monitoring system

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6278936B1 (en) * 1993-05-18 2001-08-21 Global Research Systems, Inc. System and method for an advance notification system for monitoring and reporting proximity of a vehicle
US7103357B2 (en) * 1999-11-05 2006-09-05 Lightsurf Technologies, Inc. Media spooler system and methodology providing efficient transmission of media content from wireless devices
US20020184485A1 (en) * 1999-12-20 2002-12-05 Dray James F. Method for electronic communication providing self-encrypting and self-verification capabilities
JP2003202917A (en) 2002-01-08 2003-07-18 Mitsubishi Heavy Ind Ltd Plant monitoring device, plant data server to be used together with the same device and plant monitoring system using the server
JP2004030113A (en) 2002-06-25 2004-01-29 Yamatake Corp Notification device
US20040039614A1 (en) * 2002-08-26 2004-02-26 Maycotte Higinio O. System and method to support end-to-end travel service including disruption notification and alternative flight solutions
JP2004282257A (en) 2003-03-13 2004-10-07 Iwatsu Electric Co Ltd Network system
US7283047B2 (en) * 2003-08-01 2007-10-16 Spectrum Tracking Systems, Inc. Method and system for providing tracking services to locate an asset
JP4454358B2 (en) 2004-03-26 2010-04-21 株式会社野村総合研究所 System monitoring work support system and support program
JP4938527B2 (en) 2007-03-28 2012-05-23 株式会社日本総合研究所 E-mail mistransmission prevention system, e-mail mistransmission prevention method and e-mail mistransmission prevention program
US20090313703A1 (en) 2008-06-17 2009-12-17 Fujitsu Network Communications, Inc. File-Based Chat System And Method
JP2010257066A (en) 2009-04-22 2010-11-11 Hitachi Software Eng Co Ltd Troubleshooting support system
JP4673941B1 (en) 2010-02-26 2011-04-20 株式会社野村総合研究所 Screen design support system for failure handling
KR20130136634A (en) * 2012-06-05 2013-12-13 주식회사 아이에스디엔코리아 Car wireless black box system and control method
KR20140016467A (en) * 2012-07-30 2014-02-10 주식회사 케이티 Method of real-time content information service based on vehicle location and system for it
KR101430155B1 (en) * 2012-09-03 2014-08-14 삼성중공업 주식회사 Vehicle accident management system and method
US9298896B2 (en) * 2013-01-02 2016-03-29 International Business Machines Corporation Safe auto-login links in notification emails
KR101481051B1 (en) 2013-09-12 2015-01-16 (주)유즈브레인넷 Private black box apparatus and driviing method thereof
US9934623B2 (en) 2016-05-16 2018-04-03 Wi-Tronix Llc Real-time data acquisition and recording system

Also Published As

Publication number Publication date
CA3102127A1 (en) 2019-12-12
AU2019280705A1 (en) 2021-01-07
AU2019280705B2 (en) 2022-08-25
JP7419266B2 (en) 2024-01-22
MX2020013121A (en) 2021-02-18
CA3102127C (en) 2024-04-23
ZA202007451B (en) 2024-06-26
WO2019236719A1 (en) 2019-12-12
CN112384902A (en) 2021-02-19
KR20210028181A (en) 2021-03-11
PE20210637A1 (en) 2021-03-23
KR102680231B1 (en) 2024-07-01
JP2021527251A (en) 2021-10-11

Similar Documents

Publication Publication Date Title
AU2022228123C1 (en) Video content analysis system and method for transportation system
CN113874921B (en) Automatic signal coincidence monitoring and alarming system
CN112384902B (en) Real-time data acquisition and recording data sharing system
US12118833B2 (en) Connected diagnostic system and method
RU2786372C2 (en) Data-sharing system for obtaining and recording data in real time
RU2812263C2 (en) Automated monitoring system of signal conformity and emergency notification
EP3803607A1 (en) Real-time data acquisition and recording data sharing system
RU2784892C2 (en) Method and system for analysis of video content for transport system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40037627

Country of ref document: HK

GR01 Patent grant