The present application claims priority from U.S. provisional application No. 63/292,275 entitled "SYSTEM AND METHOD FOR AN AUTONOMOUS VEHICLE" filed on month 21 of 2021 and U.S. provisional application No. 63/292,281 entitled "SYSTEM AND METHOD FOR AN AUTONOMOUS VEHICLE" filed on month 21 of 2021, both of which are incorporated herein by reference.
Detailed Description
As noted above, the prior art fails to provide an efficient, reliable, and secure solution for facilitating secure and trusted communication between the various components of an autonomous vehicle and for routing and scheduling incoming messages. The present disclosure provides various systems, methods, and devices that facilitate such secure and trusted communication and such routing and scheduling. Embodiments of the present disclosure and their advantages may be understood by referring to figs. 1 through 6, which are used to describe systems and methods for facilitating secure and trusted communication between the various components of an autonomous vehicle and for routing and scheduling incoming messages.
Overview of the System
Fig. 1 illustrates one embodiment of a system 100 configured to implement a communication gateway architecture for an autonomous vehicle 402 to facilitate secure and trusted communications between various components of the autonomous vehicle 402. In certain embodiments, system 100 includes an autonomous vehicle 402 communicatively coupled with supervisory server 170 via network 110. Network 110 enables communication between components of system 100. The network 110 allows the autonomous vehicle 402 to communicate with other autonomous vehicles 402, systems, supervisory servers 170, databases, devices, etc. The autonomous vehicle 402 includes a control device 450. The control device 450 includes a gateway processor 120 in signal communication with a memory 126. The memory 126 stores software instructions 128 that, when executed by the gateway processor 120, cause the gateway processor 120 to perform one or more operations described below. The supervisory server 170 includes a processor 172 in signal communication with a memory 178. Memory 178 stores software instructions 180 that, when executed by processor 172, cause supervisory server 170 to perform one or more operations described herein. In other embodiments, system 100 may not have all of the components listed and/or may have other elements in place of or in addition to those described above. The system 100 may be configured as shown, or in any other configuration.
In general, the system 100 provides improvements to autonomous vehicle technology, for example, by improving data routing or data communication between components of the control device 450. In one example, the system 100 may improve data routing between components of the control device 450 by establishing a particular boundary domain for each component of the control device 450. For example, the system 100 may establish an autonomous vehicle component boundary domain 102 that includes a first set of components configured to facilitate autonomous operation of the autonomous vehicle 402. In another example, the system 100 may establish a vehicle component boundary domain 104 that includes a second set of components configured to facilitate non-autonomous operation of the autonomous vehicle 402. In another example, the system 100 may establish a secure boundary domain 106 that includes a third set of components configured to facilitate authentication of components in the control device 450, authentication of messages 140 received from external devices (e.g., the supervisory server 170, etc.), and authentication of messages 140 communicated internally within the control device 450 (e.g., from any component in one boundary domain to another component).
The system 100 is configured to establish a trusted communication path between any combination of the boundary domains 102, 104, and 106 using the secure boundary domain 106. For example, the system 100 is configured to provide an initial security key 156 to each component in each of the boundary domains 102, 104, and 106, and to query the security key 156 from a component in order to authenticate that component (e.g., in response to receiving a request from that component to initiate communication with another component). If the received security key 156 matches or corresponds to the initially provided security key 156, the component is determined to be authenticated, and the communication from that component is secure and trusted.
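By way of illustration only, the key provisioning and comparison described above may be sketched as follows. The class, method, and component names are hypothetical and are not part of the disclosure; Python is used purely as an illustrative notation.

```python
import hmac
import secrets


class SecureBoundaryAuthenticator:
    """Hypothetical sketch of the security key 156 flow of system 100."""

    def __init__(self):
        # Maps a component identifier to its initially provided key.
        self._issued_keys = {}

    def provision(self, component_id):
        # Provide an initial security key 156 to a component in one of
        # the boundary domains 102, 104, or 106.
        key = secrets.token_hex(16)
        self._issued_keys[component_id] = key
        return key

    def authenticate(self, component_id, presented_key):
        # Compare the key queried from the component against the
        # initially provided key; a match means the component is
        # authenticated and its communication is trusted.
        issued = self._issued_keys.get(component_id)
        return issued is not None and hmac.compare_digest(issued, presented_key)
```

The constant-time comparison (`hmac.compare_digest`) is an implementation choice of this sketch, not a requirement stated in the disclosure.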
The system 100 is also configured to provide improvements to data routing techniques, particularly to data routing between components of an autonomous vehicle. For example, after receiving a message 140 (from an external device, such as the supervisory server 170), the control device 450 may evaluate the received message 140 and determine its priority 210. If the priority 210 of the message 140 is determined to be high (e.g., as indicated by a priority flag bit or priority data field included in the message 140), the control device 450 may move the message 140 to the top of a dispatch queue or route the message 140 to a particular dispatch queue dedicated to messages 140 having a high priority. Similarly, if the priority 210 of the message 140 is determined to be medium, the control device 450 may route the message 140 to a particular dispatch queue dedicated to messages 140 having a medium priority; if the priority 210 of the message 140 is determined to be low, the control device 450 may route the message 140 to a particular dispatch queue dedicated to messages 140 having a low priority. In this way, if a particular message 140 includes particular instructions that require more urgent execution than other messages, the particular message 140 may be associated with a high priority and its execution may be prioritized. Thus, the system 100 improves the underlying operation of the autonomous vehicle 402, as well as network communication between components of the autonomous vehicle 402.
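By way of illustration only, the per-priority dispatch queues described above may be sketched as follows. The priority values, class name, and message structure are hypothetical, not part of the disclosure.

```python
from collections import deque

# Hypothetical values for the priority flag/data field of a message 140.
HIGH, MEDIUM, LOW = "high", "medium", "low"


class MessageRouter:
    """Hypothetical sketch of routing messages 140 into dispatch queues."""

    def __init__(self):
        # One dispatch queue dedicated to each priority level.
        self._queues = {HIGH: deque(), MEDIUM: deque(), LOW: deque()}

    def route(self, message):
        # Read the priority field included in the message and place the
        # message in the dispatch queue dedicated to that priority.
        priority = message.get("priority", LOW)
        self._queues[priority].append(message)

    def next_message(self):
        # Drain higher-priority queues first so that urgent instructions
        # execute before lower-priority ones.
        for priority in (HIGH, MEDIUM, LOW):
            if self._queues[priority]:
                return self._queues[priority].popleft()
        return None
```

A message lacking a priority field defaults to the low-priority queue in this sketch; the disclosure does not specify a default.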
This in turn results in an improvement in autonomous vehicle navigation technology: for example, by prioritizing the execution of messages with a high priority, instructions that may include navigation instructions, updated map data 134, or any other suitable instructions/information may be accessed and executed faster than in current autonomous vehicle navigation technology. This provides safer driving conditions and a safer experience for the autonomous vehicle 402, surrounding vehicles, and pedestrians.
System components
Network 110 may include any interconnection system capable of transmitting audio, video, signals, data, messages, or any combination of the preceding. The network 110 may include all or a portion of the following: local area networks (LANs), wide area networks (WANs), metropolitan area networks (MANs), personal area networks (PANs), wireless PANs (WPANs), overlay networks, software-defined networks (SDNs), virtual private networks (VPNs), packet data networks (e.g., the Internet), mobile telephone networks (e.g., cellular networks such as 4G or 5G), plain old telephone service (POTS) networks, wireless data networks (e.g., WiFi, WiGig, WiMAX, etc.), Long Term Evolution (LTE) networks, Universal Mobile Telecommunications System (UMTS) networks, peer-to-peer (P2P) networks, Bluetooth networks, near-field communication (NFC) networks, Zigbee networks, Z-wave networks, and/or any other suitable networks.
Example autonomous vehicle
In one embodiment, autonomous vehicle 402 may include a semi-truck tractor unit (see fig. 4) attached to a trailer to transport cargo or goods from one location to another. Autonomous vehicle 402 is generally configured to travel along a roadway in an autonomous mode. Autonomous vehicle 402 may navigate using a number of components described in detail in figs. 4-6. The operation of autonomous vehicle 402 is described in more detail in figs. 4-6. The following description includes a brief overview of certain components of autonomous vehicle 402.
The control device 450 may generally be configured to control the operation of the autonomous vehicle 402 and its components and facilitate autonomous driving of the autonomous vehicle 402. The control device 450 may also be configured to determine a path ahead of the autonomous vehicle 402 that is safe to travel and free of objects or obstacles, and navigate the autonomous vehicle 402 to travel in the path. This process is described in more detail in fig. 4-6. The control device 450 may generally include one or more computing devices in signal communication with other components of the autonomous vehicle 402 (see fig. 4). In this disclosure, the control device 450 may be interchangeably referred to as an onboard control computer 450.
The control device 450 may be configured to detect objects on and around the road on which the autonomous vehicle 402 is traveling by analyzing the sensor data 130 and/or the map data 134. For example, the control device 450 may detect objects on and around a roadway by implementing the object detection machine learning module 132. The object detection machine learning module 132 may be implemented using a neural network and/or machine learning algorithm for detecting objects from images, video, infrared images, point clouds, audio feeds, radar data, and the like. The object detection machine learning module 132 will be described in more detail below. The control device 450 may receive sensor data 130 from sensors 446 positioned on the autonomous vehicle 402 to determine a safe path of travel. The sensor data 130 may include data captured by the sensors 446.
The sensors 446 may be configured to capture any object within their detection area or field of view, such as landmarks, lane markers, lane boundaries, road boundaries, vehicles, pedestrians, road/traffic signs, and the like. In some embodiments, the sensors 446 may be configured to detect rain, fog, snow, and/or any other weather condition. The sensors 446 may include light detection and ranging (LiDAR) sensors, radar sensors, video cameras, infrared cameras, ultrasonic sensor systems, gust detection systems, microphone arrays, thermocouples, humidity sensors, barometers, inertial measurement units, positioning systems, infrared sensors, motion sensors, rain sensors, and the like. In some embodiments, the sensors 446 may be positioned around the autonomous vehicle 402 to capture the environment around the autonomous vehicle 402. For further description of the sensors 446, see the corresponding description of fig. 4.
Control apparatus
The control device 450 is described in more detail in fig. 4. In short, the control device 450 may facilitate autonomous driving of the autonomous vehicle 402. In the illustrated embodiment, the control device 450 includes an autonomous vehicle component boundary domain 102, a vehicle component boundary domain 104, and a secure boundary domain 106. The control device 450 may establish these boundary domains based on the operation of the various components of the autonomous vehicle 402.
Autonomous vehicle component boundary domain
The autonomous vehicle component boundary domain 102 may include a first set of components configured to facilitate autonomous operation of the autonomous vehicle 402. For example, components in the autonomous vehicle component boundary domain 102 may be configured to enable autonomous driving of the autonomous vehicle 402 (e.g., transitioning from a non-autonomous state to an autonomous state) and to execute various software instructions 128 for sensing, actuation, control 570 (see fig. 5), planning 562 (see fig. 5), object detection (e.g., the LiDAR-based object detection module 512 of fig. 5, the image-based object detection module 518 of fig. 5, the object detection machine learning module 132), and so forth. The autonomous vehicle component boundary domain 102 may include a gateway processor 120, one or more automatic driving calculation (ADC) units 122a-c, a Pulse Per Second (PPS) synchronization unit 123, a network interface 124, a Controller Area Network (CAN) controller 125, and a memory 126. The components of the autonomous vehicle component boundary domain 102 are operably coupled to each other by wired and/or wireless communication.
The gateway processor 120 is in signal communication with ADC units 122a-c, PPS unit 123, network interface 124, CAN controller 125, memory 126, and other components in the other domains 104, 106. Gateway processor 120 may include one or more processing units that perform various functions as described herein. Memory 126 may store any data and/or instructions used by gateway processor 120 to perform its functions. For example, the memory 126 may store software instructions 128 that, when executed by the gateway processor 120, cause the control device 450 to perform one or more functions described herein.
Gateway processor 120 may be one of the data processors 470 depicted in fig. 4. Gateway processor 120 includes one or more processors. Gateway processor 120 may be any electronic circuitry including a state machine, one or more Central Processing Unit (CPU) chips, logic units, cores (e.g., a multi-core processor), a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), or a Digital Signal Processor (DSP). Gateway processor 120 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination thereof. Gateway processor 120 may be communicatively coupled to and in signal communication with other components of control device 450. The one or more processors may be configured to process data and may be implemented in hardware or software. For example, gateway processor 120 may be 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. Gateway processor 120 may include an Arithmetic Logic Unit (ALU) to perform arithmetic and logical operations, processor registers to provide operands to the ALU and store the results of the ALU operations, and a control unit to retrieve instructions from memory and execute the instructions by directing coordinated operations of the ALU, registers, and other components. The one or more processors may be configured to implement various instructions. For example, the one or more processors may be configured to execute the software instructions 128 to implement the functions disclosed herein, such as some or all of the functions described with respect to fig. 1-6. In some embodiments, the functions described herein are implemented using logic units, FPGA, ASIC, DSP, or any other suitable hardware or electronic circuitry.
Each ADC unit 122a-c may include hardware processing circuitry or a hardware processor configured to execute a software algorithm that, when executed, facilitates one or more autonomous operations of autonomous vehicle 402. For example, each ADC unit 122a-c may be configured to facilitate the transition from a non-autonomous state to an autonomous state, autonomous driving of the autonomous vehicle 402, and so forth.
The ADC units 122a-c may be among the data processors 470 depicted in fig. 4. The ADC units 122a-c include one or more processors operatively coupled to other components of the control device 450, such as the gateway processor 120. The ADC units 122a-c may be any electronic circuitry including a state machine, one or more CPU chips, logic units, cores (e.g., multi-core processors), FPGAs, ASICs, or DSPs. The ADC units 122a-c may be programmable logic devices, microcontrollers, microprocessors, or any suitable combination thereof. The ADC units 122a-c may be communicatively coupled to and in signal communication with the network interface 124, the memory 126, and other components of the control device 450. The one or more processors may be configured to process data and may be implemented in hardware or software. For example, the ADC units 122a-c may be 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The ADC units 122a-c may include an ALU for performing arithmetic and logic operations, processor registers that provide operands to the ALU and store the results of the ALU operations, and a control unit that retrieves instructions from memory and executes the instructions by directing coordinated operations of the ALU, registers, and other components. The one or more processors may be configured to implement various instructions. For example, one or more processors may be configured to execute software instructions (e.g., autonomous instructions) to implement the functions disclosed herein, such as some or all of the functions described with respect to figs. 1-6. In some embodiments, the functions described herein are implemented using logic units, FPGA, ASIC, DSP, or any other suitable hardware or electronic circuitry. Although fig. 1 shows control device 450 as including three ADC units 122a-c, control device 450 may include any suitable number of ADC units 122a-c.
The PPS synchronization unit 123 may be implemented in software and/or hardware (e.g., by the gateway processor 120 executing the software instructions 128) and is generally configured to synchronize the timing of operations between components of the control device 450. For example, the PPS synchronization unit 123 may distribute the timing of operations from the gateway processor 120 (or cause the gateway processor 120 to distribute the timing) to other components of the control device 450. For example, the PPS synchronization unit 123 may distribute the timing of executing instructions among components as one instruction per millisecond, two instructions per millisecond, and the like. The PPS synchronization unit 123 may be interchangeably referred to herein as a timing synchronization component.
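By way of illustration only, the distribution of an execution rate to registered components may be sketched as follows. The class name, registration mechanism, and rate values are hypothetical and not part of the disclosure.

```python
class PPSSynchronizer:
    """Hypothetical sketch of distributing an execution rate to components."""

    def __init__(self, instructions_per_ms=1):
        # Rate to distribute, e.g. one or two instructions per millisecond.
        self.instructions_per_ms = instructions_per_ms
        self._subscribers = []

    def register(self, component):
        # Components of the control device subscribe to timing updates.
        self._subscribers.append(component)

    def distribute(self):
        # Distribute the timing of executing instructions to every
        # registered component.
        for component in self._subscribers:
            component["rate_per_ms"] = self.instructions_per_ms
```

In this sketch a component is modeled as a plain dictionary; an actual unit would expose its own interface for receiving timing updates.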
The network interface 124 may be a component of the network communication subsystem 492 depicted in fig. 4. The network interface 124 may be configured to enable wired and/or wireless communication. The network interface 124 may be configured to communicate data between the autonomous vehicle 402 and other devices, systems, or domains. For example, network interface 124 may include an NFC interface, a Bluetooth interface, a Zigbee interface, a Z-wave interface, a Radio Frequency Identification (RFID) interface, a WiFi interface, a Local Area Network (LAN) interface, a Wide Area Network (WAN) interface, a Metropolitan Area Network (MAN) interface, a Personal Area Network (PAN) interface, a Wireless PAN (WPAN) interface, a modem, a switch, and/or a router. Gateway processor 120 may be configured to send and receive data using network interface 124. The network interface 124 may be configured to use any suitable type of communication protocol, as will be appreciated by those of ordinary skill in the art.
CAN controller 125 may be a component of the vehicle subsystem interface 460 depicted in fig. 4. CAN controller 125 may be configured to allow communication between components of the control device 450 without a host computer device. CAN controller 125 may use a message-based protocol or any other suitable type of communication protocol. CAN controller 125 may allow for serial and/or parallel data transmission. For example, for a high-priority message 140, communication of the message 140 may take precedence over other messages 140 having lower priorities. For example, for high-priority messages 140, communication of the messages 140 may be accomplished by parallel data transmission, while other data is serially transmitted or queued in a dispatch queue according to its priority.
Memory 126 may be one of the data storage devices 490 depicted in fig. 4. Memory 126 may be volatile or nonvolatile and may include read-only memory (ROM), random access memory (RAM), ternary content-addressable memory (TCAM), dynamic random access memory (DRAM), and static random access memory (SRAM). The memory 126 may include one or more of a local database, a cloud database, a network-attached storage (NAS), and the like. Memory 126 may store any of the information described in figs. 1-6, as well as any other data, instructions, logic, rules, or code that may be used to implement the functionality described herein when executed by gateway processor 120 and/or any of ADC units 122a-c. For example, the memory 126 may store software instructions 128, sensor data 130, object detection machine learning module 132, map data 134, route plans 136, driving instructions 138, messages 140, priorities 210, domain label data 212, destination data 214, and/or any other data/instructions. The software instructions 128 include code that, when executed by the gateway processor 120 and/or the ADC units 122a-c, cause the control device 450 to perform the functions described herein (such as some or all of the functions described in figs. 1-6). Memory 126 includes one or more magnetic disks, tape drives, or solid state drives, and may be used as an overflow data storage device to store programs as such programs are selected for execution, and to store instructions and data that are read during program execution.
The object detection machine learning module 132 may be implemented by the gateway processor 120 and/or the ADC units 122a-c executing the software instructions 128 and may generally be configured to detect objects and obstructions from the sensor data 130. The object detection machine learning module 132 may be implemented using a neural network and/or machine learning algorithms for detecting objects from any data type, such as images, video, infrared images, point clouds, audio feeds, radar data, and the like.
In some embodiments, the object detection machine learning module 132 may be implemented using a machine learning algorithm, such as a support vector machine (SVM), naive Bayes, logistic regression, k-nearest neighbors, decision trees, and the like. In some embodiments, the object detection machine learning module 132 may utilize multiple neural network layers, convolutional neural network layers, long short-term memory (LSTM) layers, bi-directional LSTM layers, recurrent neural network layers, etc., wherein the weights and biases of these layers are optimized during the training process of the object detection machine learning module 132. The object detection machine learning module 132 may be trained by a training data set that may include samples of each data type labeled with one or more objects in each sample. For example, the training data set may include sample images labeled with the object(s) (e.g., vehicles, lane markers, pedestrians, roadways, obstacles, etc.) in each sample image. Similarly, the training data set may include samples of other data types, such as video, infrared images, point clouds, audio feeds, radar data, and the like, labeled with the object(s) in each sample. The object detection machine learning module 132 may be trained, tested, and refined using the training data set and the sensor data 130. The object detection machine learning module 132 uses the sensor data 130 (which is not labeled with objects) to improve its prediction accuracy in detecting objects. For example, upon detecting an object in the sensor data 130, supervised and/or unsupervised machine learning algorithms may be used to verify the predictions of the object detection machine learning module 132.
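By way of illustration only, one of the algorithms listed above (k-nearest neighbors) may be sketched as follows for classifying a feature vector against labeled training samples. The feature vectors, labels, and function name are hypothetical, not part of the disclosure.

```python
import math
from collections import Counter


def knn_predict(training_set, sample, k=3):
    """Classify a feature vector with k-nearest neighbors, one of the
    algorithms listed for the object detection machine learning module.

    training_set: list of (feature_vector, label) pairs, e.g. labeled
    sensor-data samples such as "vehicle" or "pedestrian".
    """
    # Sort the labeled samples by Euclidean distance to the query sample.
    by_distance = sorted(
        training_set,
        key=lambda item: math.dist(item[0], sample),
    )
    # Majority vote among the k nearest labeled samples.
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]
```

In practice the feature vectors would be derived from the sensor data 130 (e.g., image or point-cloud features) rather than raw coordinates as in this sketch.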
Map data 134 may include a virtual map of a city or region that includes roads traveled by autonomous vehicle 402. In some examples, map data 134 may include map 558 and map database 1136 (see fig. 5 for a description of map 558 and map database 1136). Map data 134 may include drivable regions such as roads, paths, highways, and the like, and non-drivable regions such as terrain, as determined by occupancy grid module 1160 (see fig. 5 for a description of occupancy grid module 1160). The map data 134 may specify location coordinates of road markers, lanes, lane markers, lane boundaries, road boundaries, traffic lights, obstacles, and the like.
Route plan 136 may be a plan to travel from a starting location (e.g., a first autonomous vehicle launch pad/landing pad) to a destination (e.g., a second autonomous vehicle launch pad/landing pad). For example, route plan 136 may specify one or more street, road, and highway combinations in a particular order from a starting location to a destination. Route plan 136 may specify stages including a first stage (e.g., moving out of a starting location/launch pad), a plurality of intermediate stages (e.g., traveling along a particular lane of one or more particular streets/roads/highways), and a last stage (e.g., entering a destination/landing pad). The route plan 136 may include other information about the route from the starting location to the destination, such as road/traffic signs in the route plan 136, and the like.
The driving instructions 138 may be implemented by the planning module 562 (see the description of the planning module 562 in fig. 5). The driving instructions 138 may include instructions and rules for adapting the autonomous driving of the autonomous vehicle 402 according to the driving rules of each stage of the route plan 136. For example, the driving instructions 138 may include instructions for remaining within the speed range of the road on which the autonomous vehicle 402 is traveling, adapting the speed of the autonomous vehicle 402 relative to changes observed by the sensors 446 (such as the speed of a surrounding vehicle or the speed of objects within the detection area of the sensors 446), and so forth.
Vehicle component boundary domain
The vehicle component boundary domain 104 may include a second set of components configured to facilitate non-autonomous operation of the autonomous vehicle 402. For example, the vehicle component boundary domain 104 may include components that perform mechanical operations of the autonomous vehicle 402, such as a vehicle drive subsystem 442 (see fig. 4), a vehicle control subsystem 448 (see fig. 4), and the like. For example, the vehicle component boundary domain 104 may include a communication module 142, a vehicle component controller 144, a vehicle component 146, and an authentication component 148.
The communication module 142 may be or include a hardware processor, modem, router, or network interface configured to provide software and/or hardware resources to other components of the control device 450. The communication module 142 may be one of the components of the data processor 470 (see fig. 4). The communication module 142 may be interchangeably referred to as a communication processor. The communication module 142 includes one or more processors operatively coupled to other components of the control device 450, such as the gateway processor 120. The communication module 142 may be any electronic circuitry including a state machine, one or more CPU chips, logic units, cores (e.g., a multi-core processor), FPGA, ASIC, or DSP. The communication module 142 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination thereof. The communication module 142 may be communicatively coupled to and in signal communication with the vehicle component controller 144, the vehicle component 146, the authentication component 148, and other components of the control device 450. The one or more processors may be configured to process data and may be implemented in hardware or software. For example, the communication module 142 may be 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The communication module 142 may include an ALU for performing arithmetic and logic operations, processor registers that provide operands to the ALU and store the results of the ALU operations, and a control unit that retrieves instructions from memory and executes the instructions by directing coordinated operations of the ALU, registers, and other components. The one or more processors may be configured to implement various instructions. For example, one or more processors may be configured to execute software instructions to implement the functions disclosed herein, such as some or all of the functions described with respect to figs. 1-6. In some embodiments, the functions described herein are implemented using logic units, FPGA, ASIC, DSP, or any other suitable hardware or electronic circuitry.
The vehicle component controller 144 may include hardware processing circuitry configured to control the vehicle component 146. The vehicle component controller 144 may be associated with the vehicle control subsystem 448 depicted in fig. 4. The vehicle component 146 may be any of the components in the vehicle control subsystem 448 depicted in fig. 4. For example, the vehicle components 146 may include a human-machine interface, a brake unit, a power distribution unit, a camera array, a microphone array, a speaker array, a sensor 446, and the like. Each vehicle component 146 may be configured to perform its respective operation as described in fig. 4. The human-machine interface may be configured to provide support for audio, video, and/or message-based communications. The human-machine interface may be configured to support one-way or two-way communications. Using the human-machine interface, a person can communicate with another device (e.g., the supervisory server 170) and/or a remote operator via the network 110. The power distribution unit may be implemented in hardware and/or software and is configured to distribute power to components of the autonomous vehicle 402. The power distribution unit may be a component of the power source 442e depicted in fig. 4. The components of the vehicle component boundary domain 104 are operably coupled to each other by wired and/or wireless communication.
The authentication component 148 may include a hardware processor, memory, and/or circuitry (not explicitly shown) and is generally configured to authenticate the components of the vehicle component boundary domain 104 and communications between those components. For example, software applications designed using software code may be stored in the memory and executed by the processor to perform the functions of the authentication component 148. Examples of the authentication component 148 may include near-field communication (NFC) devices, mobile phones (e.g., smartphones), laptops, computing devices, and the like. The authentication component 148 is configured to communicate with other components of the vehicle component boundary domain 104 via wired and/or wireless communications.
Secure boundary domain
The secure boundary domain 106 may include a third set of components configured to facilitate authentication/authorization of any component in the control device 450, authentication of any message 140 received from an external device (e.g., the supervisory server 170, etc.), and authentication of messages 140 communicated internally within the control device 450 (e.g., from any component of the control device 450 to another component). The secure boundary domain 106 may include a memory 152. The memory 152 may be one of the data storage devices 490 depicted in fig. 4. Memory 152 may be volatile or nonvolatile and may include ROM, RAM, TCAM, DRAM, and SRAM. The memory 152 may include one or more of a local database, a cloud database, a NAS, and the like. The memory 152 may store any of the information described in figs. 1-6, as well as any other data, instructions, logic, rules, or code that may be used to implement the function(s) described herein when executed by the gateway processor 120 and/or any ADC unit 122a-c. For example, the memory 152 may store authentication/authorization instructions 154, security keys 156, access management 158, and/or any other data/instructions. The authentication/authorization instructions 154 include code that, when executed by the gateway processor 120 and/or the ADC units 122a-c, causes the control device 450 to perform the functions described herein (such as some or all of the functions described in figs. 1-6). Memory 152 includes one or more magnetic disks, tape drives, or solid state drives, and may serve as an overflow data storage device to store programs as such programs are selected for execution and to store instructions and data that are read during program execution.
The authentication/authorization instructions 154 include code that, when executed by the gateway processor 120, the ADC units 122a-c, and/or a processor in the security boundary domain 106, causes the control device 450 to perform the functions described herein, such as authenticating a component of the control device 450 that initiates communication with another component, authorizing the communication after authenticating the component, and distributing the security keys 156 to the components of the control device 450 so that each component can be authenticated.
The security keys 156 may include a plurality of security keys, security codes, etc. for authenticating each component of the control device 450. The security keys 156 may also be used to establish a secure communication path between any two components in one or more of the boundary domains 102, 104, and 106, and between any two of the boundary domains 102, 104, and 106. For example, the control device 450 and/or the gateway processor 120 (e.g., by executing the authentication/authorization instructions 154) may establish a trusted communication path between the autonomous vehicle component boundary domain 102 and the vehicle component boundary domain 104 by: receiving an initial private security key 156 from the supervisory server 170; sharing the initial private security key 156 with the communication module 142; receiving a request from the communication module 142 to communicate a message 140 to, for example, the gateway processor 120, where the request includes the message 140 and a private security key 156; and comparing the received private security key 156 to the initial private security key 156 (received from the supervisory server 170). If it is determined that the received private security key 156 corresponds to or matches the initial private security key 156, the control device 450 may determine that the communication module 142 is authenticated and authorized to communicate the message 140 to, for example, the gateway processor 120. The control device 450 may perform similar operations to establish a trusted communication path between any two domains or components of the control device 450.
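The key-comparison check described above can be sketched as follows; the function and parameter names are illustrative only and not part of the disclosure:

```python
import hmac

def is_authenticated(initial_key: bytes, presented_key: bytes) -> bool:
    """Return True when the key presented by a component (e.g., the
    communication module) matches the initial private security key
    received from the supervisory server. hmac.compare_digest performs
    a constant-time comparison, avoiding timing side channels."""
    return hmac.compare_digest(initial_key, presented_key)
```

A gateway could call `is_authenticated(initial_key, request_key)` before forwarding a message 140 on behalf of the requesting component.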
Access management 158 may include a record of access to secure key 156 (e.g., a record of components associated with a particular secure key 156), a history of access to secure key 156, and so forth. The access management 158 may indicate which component(s) are authorized to initiate communications, i.e., are trusted. The access management 158 may also indicate which component(s) are not authorized to access the message(s) 140.
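As one illustrative (non-normative) sketch, the access records and authorization lists described above might be represented as a small container; all field and method names are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class AccessManagement:
    """Illustrative container for the access management data 158: which
    component holds which security key, which components are trusted to
    initiate communications, and a history of key accesses."""
    key_assignments: dict = field(default_factory=dict)   # component id -> key id
    trusted_initiators: set = field(default_factory=set)  # components allowed to initiate
    access_history: list = field(default_factory=list)    # (component, key id, action)

    def record_access(self, component: str, key_id: str, action: str) -> None:
        self.access_history.append((component, key_id, action))

    def is_trusted(self, component: str) -> bool:
        return component in self.trusted_initiators
```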
Supervisory server
The supervisory server 170 may include one or more processing devices and is generally configured to oversee the operation of the autonomous vehicle 402 while it is in transit and while it is at a terminal. The supervisory server 170 may provide software and/or hardware resources (e.g., map data 134, route plans 136, messages 140, advice, feedback from a remote operator regarding the navigation of the autonomous vehicle 402, etc.) to the autonomous vehicle 402. The supervisory server 170 may include a processor 172, a network interface 174, a user interface 176, and a memory 178. The components of the supervisory server 170 are operably coupled to each other. The processor 172 may include one or more processing units for performing various functions of the supervisory server 170. Memory 178 may store any data and/or instructions used by the processor 172 to perform its functions. For example, the memory 178 may store software instructions 180 that, when executed by the processor 172, cause the supervisory server 170 to perform one or more functions described herein. The supervisory server 170 may be configured as shown or in any other suitable configuration.
In one embodiment, the supervisory server 170 may be implemented by a cluster of computing devices for supervising the operation of the autonomous vehicle 402. For example, the supervisory server 170 may be implemented by a plurality of computing devices using distributed computing and/or cloud computing systems. In another example, the supervisory server 170 may be implemented by a plurality of computing devices in one or more data centers. Thus, in one embodiment, the supervisory server 170 may include more processing power than the control device 450. The supervisory server 170 is in signal communication with the autonomous vehicle 402 and its components (e.g., control device 450).
The processor 172 includes one or more processors. The processor 172 may be any electronic circuitry including a state machine, one or more CPU chips, logic units, cores (e.g., a multi-core processor), FPGA, ASIC, or DSP. The processor 172 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination thereof. The processor 172 may be communicatively coupled to and in signal communication with a network interface 174, a user interface 176, and a memory 178. The one or more processors are configured to process data and may be implemented in hardware or software. For example, the processor 172 may be 8-bit, 16-bit, 32-bit, 64-bit, or any other suitable architecture. The processor 172 may include an ALU for performing arithmetic and logic operations, processor registers that provide operands to the ALU and store the results of the ALU operations, and a control unit that retrieves instructions from memory and executes the instructions by directing the coordinated operation of the ALU, registers, and other components. The one or more processors are configured to implement various instructions. For example, the one or more processors are configured to execute the software instructions 180 to implement the functions disclosed herein, such as some or all of the functions described with respect to fig. 1-6. In some embodiments, the functions described herein may be implemented using logic units, FPGA, ASIC, DSP, or any other suitable hardware or electronic circuitry.
The network interface 174 may be configured to enable wired and/or wireless communication of the supervisory server 170. The network interface 174 may be configured to communicate data between the supervisory server 170 and other devices, servers, autonomous vehicles 402, systems, or domains. For example, network interface 174 may include an NFC interface, a Bluetooth interface, a Zigbee interface, a Z-wave interface, an RFID interface, a WIFI interface, a LAN interface, a WAN interface, a PAN interface, a modem, a switch, and/or a router. The processor 172 may be configured to send and receive data using the network interface 174. The network interface 174 may be configured to use any suitable type of communication protocol, as will be appreciated by those of ordinary skill in the art.
The user interface 176 may include one or more user interfaces configured to interact with a user, such as a remote operator. The remote operator may access the supervisory server 170 via a communication path. In some embodiments, the user interface 176 may include peripherals of the supervisory server 170, such as a display, keyboard, mouse, touch pad, microphone, webcam, speakers, and the like. In some embodiments, the user interface 176 may include a graphical user interface, a software application, or a web application. The remote operator may use the user interface 176 to access the memory 178 and view any data stored therein. The remote operator may confirm, update, and/or override the route plan 136, the message 140, the map data 134, and/or any other data stored in the memory 178.
Memory 178 may be volatile or non-volatile and may include ROM, RAM, TCAM, DRAM, and SRAM. The memory 178 may include one or more of a local database, cloud database, NAS, and the like. The memory 178 may store any of the information described in fig. 1-6, as well as any other data, instructions, logic, rules, or code operable to implement the function(s) described herein when executed by the processor 172. For example, the memory 178 may store the software instructions 180, sensor data 130, object detection machine learning module 132, map data 134, route plans 136, driving instructions 138, messages 140, and/or any other data/instructions. The software instructions 180 may include code that, when executed by the processor 172, causes the supervisory server 170 to perform the functions described herein, such as some or all of the functions described in fig. 1-6. Memory 178 includes one or more magnetic disks, tape drives, or solid-state drives, and may be used as an overflow data storage device to store programs as such programs are selected for execution, as well as to store instructions and data that are read during program execution.
Operational flow for facilitating secure communication of an autonomous vehicle
FIG. 2 illustrates an example operational flow 200 of the system 100 of FIG. 1 for facilitating secure communications of an autonomous vehicle 402. Gateway processor 120 may be configured to coordinate communication between the autonomous vehicle 402 and external devices and systems (such as the supervisory server 170), other autonomous vehicles 402, and the like. The gateway processor 120 may also be configured to coordinate communications between internal components of the control device 450, including components in the autonomous vehicle component boundary domain 102, the vehicle component boundary domain 104, and the security boundary domain 106, similar to that described in fig. 1. Gateway processor 120 may also be configured to coordinate communication between any combination of the autonomous vehicle component boundary domain 102, the vehicle component boundary domain 104, the security boundary domain 106, and the supervisory server 170.
Operational flow 200 may begin when the gateway processor 120 receives a message 140 from the supervisory server 170, for example, via a network (110 in fig. 1). Examples of the message 140 may include a command to enable autonomous functions, map data 134, an autonomous software algorithm update, a minimum risk maneuver command (which includes instructions to pull over or stop the autonomous vehicle at the roadside without impeding traffic), a safety software instruction update, an autonomous vehicle profile, an access management update, diagnostic data, log data of the ADC units 122a-c, configuration data of the ADC units 122a-c, event triggers, human interface audio (e.g., when a communication path is established between a device at the autonomous vehicle 402 and a remote operator at the supervisory server 170 so that the remote operator may be heard through the device), human interface video (e.g., when a communication path is established between a device at the autonomous vehicle 402 and a remote operator at the supervisory server 170 so that the remote operator may be seen through the device), and any other suitable data/instructions that may be transmitted to the control device 450. The message 140 may be associated with one of the domains 102, 104, 106. For example, the message 140 may be designated to be received by a particular component in one of the domains 102, 104, 106. The message 140 may be in the form of a particular data structure/format, a data object field, or software code that may be evaluated by the gateway processor 120, such as a structured data packet.
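One way to picture such a structured data packet (the field names here are hypothetical, chosen only to mirror the priority, domain, and destination data discussed below):

```python
from dataclasses import dataclass

@dataclass
class GatewayMessage:
    """Illustrative shape of a message 140 evaluated by the gateway."""
    priority_flag: str   # e.g., "11" = high, "01" = medium, "00" = low
    domain_tag: int      # e.g., 102, 104, or 106
    destination: str     # identifier of the component 216 within that domain
    payload: bytes       # the command, data, or instructions to act on

# Example: a high-priority access management update destined for memory 152.
msg = GatewayMessage(priority_flag="11", domain_tag=106,
                     destination="memory_152",
                     payload=b"access-management-update")
```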
Determining priority and destination of a message
Gateway processor 120 may evaluate the message 140 to extract information from it, e.g., to determine the priority 210, domain label data 212, and destination data 214 associated with the message 140. To this end, the gateway processor 120 may parse the message 140. The priority 210 associated with the message 140 may indicate scheduling requirements associated with the message 140.
In some embodiments, the priority 210 associated with the message 140 may indicate, for example, whether the priority for executing the message 140 and/or routing the message 140 to the respective destination is low, medium, or high. In some embodiments, determining the priority 210 associated with the message 140 may include determining that the message 140 is associated with priority tag data that indicates the priority 210 of the message 140. For example, the priority 210 may be indicated by a priority flag bit or priority data field in the message 140. For example, if the priority flag is "11", the priority 210 may be determined to be high, if the priority flag is "01", the priority 210 may be determined to be medium, and if the priority flag is "00", the priority 210 may be determined to be low. Other levels of priority 210 are also possible.
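The flag-bit mapping above could be decoded as follows; the two-bit encoding is taken from the example, while the function name and error handling are illustrative assumptions:

```python
PRIORITY_FLAGS = {"11": "high", "01": "medium", "00": "low"}

def priority_from_flag(flag_bits: str) -> str:
    """Map the priority flag bits carried in a message to a priority level."""
    try:
        return PRIORITY_FLAGS[flag_bits]
    except KeyError:
        # "10" and malformed flags are undefined in the example; rejecting
        # them here is one possible implementation choice.
        raise ValueError(f"unrecognized priority flag: {flag_bits!r}")
```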
In some embodiments, the priority 210 may be indicated by a value. For example, when priority 210 is greater than a threshold (e.g., greater than 8/10), priority 210 may be determined to be high, when priority 210 is determined to be between two thresholds (e.g., between 4/10 and 8/10), priority 210 may be determined to be medium, and when priority 210 is determined to be less than a threshold (e.g., less than 4/10), priority 210 may be determined to be low.
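With a numeric priority on a 0-10 scale, the thresholds above translate to a simple classifier. Behavior at exactly 4 or 8 is not specified in the text, so the boundary comparisons below are one possible choice:

```python
def priority_from_value(value: float) -> str:
    """Classify a 0-10 priority value using the 4/10 and 8/10 thresholds."""
    if value > 8:
        return "high"
    if value >= 4:      # "between 4/10 and 8/10" -> medium (boundary assumed)
        return "medium"
    return "low"        # less than 4/10
```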
In some embodiments, determining the priority 210 associated with the message 140 may include determining that the message 140 is associated with a particular Internet Protocol (IP) address that is associated with the priority 210 of the message 140. For example, different IP addresses may be used to transmit messages 140 having different priorities 210. The control device 450 may be provided by the supervisory server 170 with a table or list of IP addresses, each used for transmitting messages 140 with a particular priority 210.
In response to determining that message 140 is associated with a first IP address that is preset to transmit message 140 having a high priority based on the IP address table, gateway processor 120 may determine that priority 210 of message 140 is a high priority. In response to determining that message 140 is associated with a second IP address that is preset to transmit message 140 having a medium priority based on the IP address table, gateway processor 120 may determine that priority 210 of message 140 is a medium priority. In response to determining that message 140 is associated with a third IP address that is preset to transmit message 140 having a low priority based on the IP address table, gateway processor 120 may determine that priority 210 of message 140 is a low priority.
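The IP-address table provisioned by the supervisory server could be consulted with a lookup of this shape; the addresses and the fallback policy are assumptions made for illustration:

```python
# Source address -> priority, as provisioned by the supervisory server.
IP_PRIORITY_TABLE = {
    "198.51.100.1": "high",    # first IP address, preset for high-priority traffic
    "198.51.100.2": "medium",  # second IP address, medium priority
    "198.51.100.3": "low",     # third IP address, low priority
}

def priority_from_source_ip(source_ip: str) -> str:
    """Determine a message's priority from the IP address it arrived from.
    Treating an unknown source as low priority is an assumed default, not
    something the text specifies."""
    return IP_PRIORITY_TABLE.get(source_ip, "low")
```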
Gateway processor 120 may also determine the domain label data 212 associated with the message 140. The domain label data 212 may indicate that the message 140 is associated with or assigned to a particular one of the domains 102, 104, 106. In some embodiments, determining the domain label data 212 may be responsive to determining domain data (e.g., a domain flag bit or a domain data field) transmitted along with the message 140 or included in the message 140, where the domain data may indicate the particular domain to which the message 140 is assigned.
Gateway processor 120 may also identify the destination data 214 associated with the message 140. The destination data 214 may indicate that the message 140 is assigned to a particular component 216 within the particular domain 102, 104, 106 identified from the domain label data 212. In some cases, the particular component 216 may be a software or hardware component internal with respect to the gateway processor 120. In other cases, the particular component 216 may be a software or hardware component external with respect to the gateway processor 120, such as any of the ADC units 122a-c, the PPS 123, the memory 126, the memory 152, the communication module 142, the vehicle component controller 144, the vehicle component 146, or any of the other components described in figs. 1 and 4-6.
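Extracting the domain label and destination might look like the following, with the message represented as a dict and all field names illustrative:

```python
VALID_DOMAINS = {102, 104, 106}  # the three boundary domains

def parse_routing_fields(message: dict) -> tuple:
    """Pull the domain tag and destination component out of a message,
    rejecting unknown domains before any routing decision is made."""
    domain = message["domain_tag"]
    if domain not in VALID_DOMAINS:
        raise ValueError(f"unknown domain tag: {domain}")
    return domain, message["destination"]
```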
Scheduling messages to determined destinations based on priority
In the scheduling and routing operations, the gateway processor 120 may schedule the message 140 to be transmitted to the particular domain 102, 104, 106 identified from the domain label data 212, based on the priority 210, the domain label data 212, and the destination data 214.
In some embodiments, where the priority 210 associated with the message 140 is high (as indicated by the priority tag data or priority data fields included in the message 140, similar to that described above), scheduling the message 140 to be transmitted to a particular domain 102, 104, 106 based on the priority 210 and the identified domain tag data 212 may include moving or routing the message 140 to the top of a scheduling queue, which may include a plurality of messages associated with various priorities.
In some embodiments, where the priority 210 associated with the message 140 is high (as indicated by the priority tag data or priority data field included in the message 140, similar to that described above), scheduling the message 140 to be transmitted to a particular domain 102, 104, 106 based on the priority 210 and the identified domain tag data 212 may include moving or routing the message 140 to a particular scheduling queue dedicated to messages having high priority (e.g., messages having a priority greater than a threshold, such as greater than 8/10).
In some embodiments, where the priority 210 associated with the message 140 is medium (as indicated by the priority tag data or priority data fields included in the message 140, similar to that described above), scheduling the message 140 to be transmitted to a particular domain 102, 104, 106 based on the priority 210 and the identified domain tag data 212 may include moving or routing the message 140 to a particular scheduling queue dedicated to messages having medium priority (e.g., messages having priorities within two thresholds, such as between 4/10 and 8/10).
In some embodiments, where the priority 210 associated with the message 140 is low (as indicated by the priority tag data or priority data field included in the message 140, similar to that described above), scheduling the message 140 to be transmitted to a particular domain 102, 104, 106 based on the priority 210 and the identified domain tag data 212 may include moving or routing the message 140 to a particular scheduling queue dedicated to messages having low priority (e.g., messages having a priority less than a threshold, such as less than 4/10).
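Taken together, the per-priority queues described in the embodiments above can be sketched as a small scheduler; the class name and the drain order are illustrative:

```python
from collections import deque

class PriorityScheduler:
    """Three dedicated scheduling queues, one per priority level;
    higher-priority queues are drained before lower-priority ones."""

    LEVELS = ("high", "medium", "low")

    def __init__(self):
        self.queues = {level: deque() for level in self.LEVELS}

    def enqueue(self, message, priority: str) -> None:
        """Route a message to the queue dedicated to its priority."""
        self.queues[priority].append(message)

    def next_message(self):
        """Return the next message, high before medium before low."""
        for level in self.LEVELS:
            if self.queues[level]:
                return self.queues[level].popleft()
        return None  # nothing pending
```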
In some embodiments, messages 140 transmitted to components within the security boundary domain 106 may not be shared with components in the other domains 102 and 104. For example, if a message 140 is destined for the security boundary domain 106, the gateway processor 120 may route the message 140 to the security boundary domain 106 such that the message 140 is not shared with the other domains 102, 104. This is because the message 140 may include the private security key 156 of a particular component.
Gateway processor 120 may route the message 140 to the particular component 216 based on the priority 210, the domain label data 212, and the destination data 214. The particular component 216 may receive the message 140 and execute or process it according to the information/instructions it includes. For example, if the message 140 includes instructions to enable autonomous functions (i.e., to initiate autonomous driving of the autonomous vehicle 402), the particular component 216 may execute a particular autonomous driving algorithm (and optionally instruct other related components) to enable the autonomous functions of the autonomous vehicle 402. In another example, if the message 140 includes a command to perform a minimum risk condition maneuver (e.g., pulling over or stopping the autonomous vehicle 402), the particular component 216 may execute particular minimum risk condition maneuver instructions (and optionally instruct other related components) to perform the maneuver. In another example, if the message 140 includes updated map data 134, the particular component 216 may use the updated map data 134 for travel of the autonomous vehicle 402. In this manner, the gateway processor 120 may receive, process, schedule, route, and act on incoming messages 140.
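Acting on a routed message could reduce to a dispatch table keyed on the command it carries; the command names and handler bodies below are invented for illustration:

```python
def make_dispatcher():
    """Build a dispatcher mapping message commands to handlers.
    All commands and handler results here are hypothetical."""
    handlers = {
        "enable_autonomy": lambda payload: "autonomous functions enabled",
        "minimum_risk_maneuver": lambda payload: "pulling the vehicle over",
        "map_update": lambda payload: "map data updated",
    }

    def act_on(command: str, payload=None):
        """Execute the handler registered for the message's command."""
        if command not in handlers:
            raise KeyError(f"no handler for command: {command}")
        return handlers[command](payload)

    return act_on
```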
In some embodiments, the gateway processor 120 may perform similar operations on outgoing messages 140. An outgoing message 140 may include a request to enable autonomous functions for the autonomous vehicle 402, location coordinates of the autonomous vehicle 402, autonomous vehicle health data, sensor data 130 captured by the sensors 446, information received from the supervisory server 170, information received from another autonomous vehicle 402, and/or any other data/instructions/requests. The outgoing message 140 may be transmitted to the supervisory server 170, other autonomous vehicles 402, devices associated with authorized personnel attempting to access the autonomous vehicle 402 or information associated with the autonomous vehicle 402, etc. For example, to transmit an outgoing message 140, the gateway processor 120 may process the outgoing message 140, determine the priority 210 and destination data 214 associated with it, schedule it (in a particular scheduling queue based on the determined priority 210, similar to the scheduling queues described above with respect to incoming messages 140), and transmit it to the destination component 216 defined in the destination data 214.
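The outgoing path mirrors the incoming one. A compressed sketch, with the actual transmission injected as a callable since real I/O is outside the scope of this example and every name is an assumption:

```python
def send_outgoing(message: dict, transmit) -> None:
    """Determine priority and destination for an outgoing message, then
    hand it to a transport function for transmission."""
    priority = message.get("priority", "low")   # assumed default priority
    destination = message["destination"]        # e.g., the supervisory server
    transmit(destination, priority, message["payload"])
```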
Example methods for facilitating secure communications for autonomous vehicles
FIG. 3 illustrates an example flow chart of a method 300 for facilitating secure communications of an autonomous vehicle 402. Modifications, additions, or omissions may be made to the method 300. The method 300 may include more, fewer, or other operations. For example, operations may be performed in parallel or in any suitable order. While at times discussed as being performed by the system 100, the autonomous vehicle 402, the control device 450, the gateway processor 120, or components of any of these, any suitable system or component may perform one or more operations of the method 300. For example, one or more operations of the method 300 may be implemented, at least in part, in the form of software instructions 128 and processing instructions 480 from figs. 1 and 4, respectively, stored on a non-transitory, tangible, machine-readable medium (e.g., memory 126 and data storage device 490 from figs. 1 and 4, respectively), that when executed by one or more processors (e.g., gateway processor 120 and processor 470 from figs. 1 and 4, respectively) may cause the one or more processors to perform operations 302-314.
At operation 302, the control device 450 coordinates (e.g., via the gateway processor 120) secure communications between the autonomous vehicle component boundary domain 102, the vehicle component boundary domain 104, the security boundary domain 106, and the supervisory server 170. In this process, the control device 450 (e.g., via the gateway processor 120) may share the security keys 156 between the boundary domains 102, 104, 106, where the security keys 156 are received from the supervisory server 170. The control device 450 (e.g., via the gateway processor 120) may establish a secure communication path between any combination of the boundary domains 102, 104, 106 and the supervisory server 170 by sharing and/or evaluating the security keys 156, similar to that described in figs. 1 and 2.
At operation 304, the control device 450 determines (e.g., via the gateway processor 120) whether the message 140 was received. For example, the control device 450 (e.g., via the gateway processor 120) may determine whether the message 140 was received from the supervisory server 170 or from another autonomous vehicle 402. If it is determined that the message 140 was received, the method 300 may proceed to operation 306. Otherwise, the method 300 may remain at operation 304 until the message 140 is received.
At operation 306, the control device 450 determines (e.g., via the gateway processor 120) the priority 210 associated with the message 140. To this end, the control device 450 (e.g., via the gateway processor 120) may process the message 140, for example, by extracting various data fields and/or information from the message 140, such as the priority 210, the domain label data 212, and the destination data 214, similar to that described in fig. 2. For example, control device 450 (e.g., via gateway processor 120) may determine priority 210 associated with message 140 based on priority tag data, a priority data field, and/or an IP address associated with message 140 for transmitting message 140, similar to that described in fig. 2.
At operation 308, the control device 450 (e.g., via the gateway processor 120) identifies the domain label data 212 associated with the message 140, wherein the domain label data 212 indicates the particular domain 102, 104, 106. In this process, the control device 450 (e.g., via the gateway processor 120) may identify the domain label data 212 included in the message 140, similar to that described in fig. 2.
At operation 310, the control device 450 (e.g., via the gateway processor 120) identifies the destination data 214 associated with the message 140, where the destination data 214 indicates that the message 140 is assigned to a particular component 216, similar to the components described in fig. 2.
At operation 312, the control device 450 schedules (e.g., via the gateway processor 120) the message 140 to be transmitted to the particular domain 102, 104, 106. In this process, the control device 450 (e.g., via the gateway processor 120) may route the message 140 to a particular scheduling queue based on the determined priority 210, domain label data 212, and destination data 214 associated with the message 140, similar to that described in fig. 2. At operation 314, the control device 450 routes the message 140 (e.g., via the gateway processor 120) to the particular component 216. Upon receipt of the message 140, the particular component 216 may act on the message 140, such as by executing a command or instruction included in the message 140.
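Operations 304-314 can be condensed into a single pass over an incoming message. This is a sketch with illustrative names, not the claimed implementation; routing is injected as a callable:

```python
def handle_incoming(message: dict, router) -> None:
    """Determine priority (op 306), domain (op 308), and destination
    (op 310) for a received message, then schedule and route it
    (ops 312-314) via the supplied router function."""
    priority = message.get("priority", "low")   # assumed default
    domain = message["domain_tag"]
    destination = message["destination"]
    router(domain, destination, priority, message["payload"])
```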
Automatic driving vehicle and operation example thereof
FIG. 4 illustrates a block diagram of an example vehicle ecosystem 400 in which autonomous driving operations may be determined. As shown in fig. 4, the autonomous vehicle 402 may be a semi-trailer truck. The vehicle ecosystem 400 may include several systems and components that can generate and/or deliver one or more sources of information/data and related services to an in-vehicle control computer 450, which may be located in the autonomous vehicle 402. The in-vehicle control computer 450 may be in data communication with a plurality of vehicle subsystems 440, all of which may reside in the autonomous vehicle 402. A vehicle subsystem interface 460 may be provided to facilitate data communication between the in-vehicle control computer 450 and the plurality of vehicle subsystems 440. In some embodiments, the vehicle subsystem interface 460 may include a controller area network (CAN) controller to communicate with devices in the vehicle subsystems 440.
Autonomous vehicle 402 may include various vehicle subsystems that support the operation of autonomous vehicle 402. The vehicle subsystems 440 may include a vehicle drive subsystem 442, a vehicle sensor subsystem 444, a vehicle control subsystem 448, and/or a network communication subsystem 492. The components or devices of the vehicle drive subsystem 442, the vehicle sensor subsystem 444, and the vehicle control subsystem 448 shown in fig. 4 are examples. Autonomous vehicle 402 may be configured as shown, or according to any other configuration.
Vehicle drive subsystem 442 may include components operable to provide powered movement of autonomous vehicle 402. In an example embodiment, the vehicle drive subsystem 442 may include an engine/motor 442a, wheels/tires 442b, a transmission 442c, an electrical subsystem 442d, and a power source 442e.
The vehicle sensor subsystem 444 may include a plurality of sensors 446 configured to sense information about the environment or condition of the autonomous vehicle 402. The vehicle sensor subsystem 444 may include one or more cameras 446a or image capture devices, a radar unit 446b, one or more thermal sensors 446c, a wireless communication unit 446d (e.g., a cellular communication transceiver), an Inertial Measurement Unit (IMU) 446e, a laser rangefinder/LiDAR unit 446f, a Global Positioning System (GPS) transceiver 446g, and a wiper control system 446h. The vehicle sensor subsystem 444 may also include sensors configured to monitor internal systems of the autonomous vehicle 402 (e.g., O2 monitors, fuel gauges, engine oil temperature, etc.).
The IMU 446e may include any combination of sensors (e.g., accelerometers and gyroscopes) configured to sense changes in the position and orientation of the autonomous vehicle 402 based on inertial acceleration. The GPS transceiver 446g may be any sensor configured to estimate the geographic location of the autonomous vehicle 402. To this end, the GPS transceiver 446g may include a receiver/transmitter operable to provide information regarding the position of the autonomous vehicle 402 relative to earth. Radar unit 446b may represent a system that utilizes radio signals to sense objects within the local environment of autonomous vehicle 402. In some embodiments, in addition to sensing objects, radar unit 446b may be configured to sense speed and heading of objects proximate autonomous vehicle 402. The laser rangefinder or LiDAR unit 446f may be any sensor configured to sense objects in the environment in which the autonomous vehicle 402 is located using a laser. Camera 446a may include one or more devices configured to capture a plurality of images of the environment of autonomous vehicle 402. The camera 446a may be a still image camera or a motion video camera.
The cameras 446a may be rearward and forward facing such that pedestrians, any gesture signals they issue, and any signs they hold may be observed from around the autonomous vehicle. These cameras 446a may include video cameras, cameras with filters for specific wavelengths, and any other cameras suitable for detecting gesture signals, hand-held traffic signs, or both. A sound detection array (such as a microphone or microphone array) may be included in the vehicle sensor subsystem 444. The microphones of the sound detection array may be configured to receive audio indications of the presence of, or instructions from, authorities, including alarms and commands such as "pull over". These microphones may be mounted or positioned on the exterior of the vehicle, particularly on the exterior of the tractor portion of the autonomous vehicle. The microphones used may be of any suitable type, mounted so that they are effective both when the autonomous vehicle is stationary and when it is traveling at normal travel speeds.
The vehicle control subsystem 448 may be configured to control the operation of the autonomous vehicle 402 and its components. Accordingly, the vehicle control subsystem 448 may include various elements such as throttle and gear selector 448a, brake unit 448b, navigation unit 448c, steering system 448d, and/or autonomous control unit 448e. The throttle and gear selector 448a may be configured to control, for example, the operating speed of the engine and, in turn, the speed of the autonomous vehicle 402. The throttle and gear selector 448a may be configured to control gear selection of the transmission. The brake unit 448b may include any combination of mechanisms configured to slow down the autonomous vehicle 402. The brake unit 448b can decelerate the autonomous vehicle 402 in a standard manner, including by using friction to slow the wheels or engine brake. The brake unit 448b may include an anti-lock braking system (ABS) that may prevent the brakes from locking when the brakes are applied. Navigation unit 448c may be any system configured to determine a travel path or route of autonomous vehicle 402. The navigation unit 448c may also be configured to dynamically update the driving path as the autonomous vehicle 402 operates. In some embodiments, the navigation unit 448c may be configured to combine data from the GPS transceiver 446g with one or more predetermined maps in order to determine the travel path of the autonomous vehicle 402. The steering system 448d may represent any combination of mechanisms that may be operable to adjust the heading of the autonomous vehicle 402 in an autonomous mode or a driver controlled mode.
The autonomous control unit 448e may represent a control system configured to identify, evaluate, and avoid or otherwise negotiate potential obstacles or obstructions in the environment of the autonomous vehicle 402. In general, the autonomous control unit 448e may be configured to control the autonomous vehicle 402 to operate without a driver or to provide driver assistance in controlling the autonomous vehicle 402. In some embodiments, the autonomous control unit 448e may be configured to combine data from the GPS transceiver 446g, the radar unit 446b, the LiDAR unit 446f, the cameras 446a, and/or other vehicle subsystems to determine a travel path or trajectory of the autonomous vehicle 402.
The network communication subsystem 492 may include network interfaces such as routers, switches, modems, and the like. The network communication subsystem 492 may be configured to establish communication between the autonomous vehicle 402 and other systems, servers, and the like. The network communication subsystem 492 may also be configured to transmit data to and receive data from other systems.
Many or all of the functions of the autonomous vehicle 402 may be controlled by the onboard control computer 450. The onboard control computer 450 may include at least one data processor 470 (which may include at least one microprocessor) that executes processing instructions 480 stored in a non-transitory computer readable medium, such as the data storage device 490 or memory. The onboard control computer 450 may also represent a plurality of computing devices that may be used to control individual components or subsystems of the autonomous vehicle 402 in a distributed manner. In some embodiments, the data storage device 490 may contain processing instructions 480 (e.g., program logic) executable by the data processor 470 to perform various methods and/or functions of the autonomous vehicle 402, including those described with respect to figs. 1-6.
The data storage device 490 may also contain additional instructions including instructions for transmitting data to, receiving data from, interacting with, or controlling one or more of the vehicle drive subsystem 442, vehicle sensor subsystem 444, and vehicle control subsystem 448. The on-board control computer 450 may be configured to include a data processor 470 and a data storage device 490. The on-board control computer 450 may control the functions of the autonomous vehicle 402 based on inputs received from various vehicle subsystems (e.g., vehicle drive subsystem 442, vehicle sensor subsystem 444, and vehicle control subsystem 448).
Fig. 5 illustrates an exemplary system 500 for providing accurate autonomous driving operations. The system 500 may include several modules that may be operated in an onboard control computer 450, as described in fig. 4. The onboard control computer 450 may include a sensor fusion module 502 as shown in the upper left corner of fig. 5, wherein the sensor fusion module 502 may perform at least four image or signal processing operations. The sensor fusion module 502 may acquire images from cameras located on the autonomous vehicle to perform image segmentation 504 to detect the presence of moving objects (e.g., other vehicles, pedestrians, etc.) and/or static obstacles (e.g., parking signs, speed bumps, terrain, etc.) located around the autonomous vehicle. The sensor fusion module 502 may obtain LiDAR point cloud data items from LiDAR sensors located on the autonomous vehicle to perform LiDAR segmentation 506 to detect the presence of objects and/or obstacles located around the autonomous vehicle.
The sensor fusion module 502 may perform instance segmentation 508 on the image and/or point cloud data items to identify contours (e.g., boxes) around objects and/or obstacles located around the autonomous vehicle. The sensor fusion module 502 may perform a time fusion 510 in which objects and/or obstructions from one image and/or frame of point cloud data items are correlated or associated with objects or obstructions from one or more images or frames subsequently received in time.
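The temporal fusion step described above can be sketched with a simple greedy intersection-over-union matcher, which associates detections in the current frame with detections from the previous frame. This is a hypothetical illustration; the box format, function names, and the 0.3 threshold are assumptions, not part of the disclosure:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def time_fusion(prev_boxes, curr_boxes, threshold=0.3):
    """Associate each current detection with the best-overlapping previous one."""
    matches = {}
    for i, curr in enumerate(curr_boxes):
        best_j, best_score = None, threshold
        for j, prev in enumerate(prev_boxes):
            score = iou(curr, prev)
            if score > best_score:
                best_j, best_score = j, score
        matches[i] = best_j  # None means a newly appeared object
    return matches
```

Each current detection is matched to the best-overlapping previous detection, or marked as newly appeared when no overlap exceeds the threshold.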
The sensor fusion module 502 can fuse objects and/or obstacles from the images acquired from the cameras and/or the point cloud data items acquired from the LiDAR sensors. For example, the sensor fusion module 502 may determine, based on the locations of two cameras, that an image from one camera that includes half of a vehicle in front of the autonomous vehicle shows the same vehicle captured by the other camera. The sensor fusion module 502 may send the fused object information to the tracking or prediction module 546 and the fused obstacle information to the occupancy grid module 560. The onboard control computer may include an occupancy grid module 560 that may retrieve landmarks from a map database 558 stored in the onboard control computer. The occupancy grid module 560 may determine drivable areas and/or obstacles from the fused obstacles obtained from the sensor fusion module 502 and the landmarks stored in the map database 558. For example, the occupancy grid module 560 may determine that a drivable area includes a speed bump obstacle.
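The occupancy grid determination can be illustrated as marking grid cells blocked by fused obstacles and mapped landmarks. This is a minimal sketch under the assumption that obstacles and landmarks are given as integer cell coordinates; all names here are hypothetical:

```python
def build_occupancy_grid(width, height, obstacles, landmarks):
    """Mark grid cells blocked by fused obstacles or mapped landmarks."""
    grid = [[True] * width for _ in range(height)]  # True = drivable
    for (x, y) in obstacles + landmarks:
        if 0 <= x < width and 0 <= y < height:
            grid[y][x] = False  # cell is occupied, not drivable
    return grid

def drivable_cells(grid):
    """Count the cells still marked drivable."""
    return sum(cell for row in grid for cell in row)
```

A planner could then restrict candidate trajectories to paths whose cells remain drivable.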
Below the sensor fusion module 502, the onboard control computer 450 may include a LiDAR-based object detection module 512 that may perform object detection 516 based on point cloud data items acquired from a LiDAR sensor 514 located on the autonomous vehicle. The object detection 516 technique may provide the location of an object (e.g., in 3D world coordinates) from the point cloud data items. Below the LiDAR-based object detection module 512, the onboard control computer may include an image-based object detection module 518 that may perform object detection 524 based on images acquired from a camera 520 located on the autonomous vehicle. For example, the image-based object detection module 518 may employ deep learning-based object detection 524 (e.g., machine learning techniques) to provide the location of an object (e.g., in 3D world coordinates) from the images provided by the camera 520.
The radar 556 on the autonomous vehicle may scan an area in front of the autonomous vehicle or an area toward which the autonomous vehicle is driving. The radar data may be sent to a sensor fusion module 502, which may use the radar data to correlate objects and/or obstructions detected by radar 556 with objects and/or obstructions detected from both LiDAR point cloud data items and camera images. The radar data may also be sent to a tracking or prediction module 546, which may perform data processing on the radar data to track objects through an object tracking module 548, as described further below.
The onboard control computer 450 (shown in fig. 1 and 4) may include a tracking or prediction module 546 that receives the location of the object from the point cloud and the location of the object from the image, as well as the fusion object from the sensor fusion module 502. The tracking or prediction module 546 also receives radar data with which the tracking or prediction module 546 can track objects through the object tracking module 548 from one point cloud data item and one image acquired at one time to another (or next) point cloud data item and another image acquired at another subsequent time.
The tracking or prediction module 546 may perform object attribute estimation 550 to estimate one or more attributes of objects detected in the images or point cloud data items. The one or more attributes of an object may include a type of the object (e.g., pedestrian, automobile, truck, etc.). The tracking or prediction module 546 may perform behavior prediction 552 to estimate or predict a motion pattern of an object detected in an image and/or a point cloud. Behavior prediction 552 may be performed to detect a location of an object in a set of images (e.g., sequential images) received at different points in time or in a set of point cloud data items (e.g., sequential point cloud data items) received at different points in time. In some embodiments, behavior prediction 552 may be performed for each image received from a camera and/or each point cloud data item received from a LiDAR sensor. In some embodiments, the tracking or prediction module 546 may reduce computational load by performing behavior prediction 552 only on every other image, or only after a predetermined number of images or point cloud data items have been received (e.g., after every second image or after every third point cloud data item).
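The load-reduction strategy of running behavior prediction only on every Nth input can be sketched as follows. The class and method names are assumptions, and `predict` stands in for the real model inference:

```python
class BehaviorPredictor:
    """Run the (expensive) behavior prediction only every `stride` frames."""

    def __init__(self, stride=2):
        self.stride = stride
        self.frame_count = 0
        self.last_prediction = None

    def on_frame(self, frame):
        """Return a fresh prediction every `stride` frames, else the cached one."""
        self.frame_count += 1
        if self.frame_count % self.stride == 0:
            self.last_prediction = self.predict(frame)
        return self.last_prediction  # reuse the cached result on skipped frames

    def predict(self, frame):
        # placeholder for the real model inference
        return {"frame": frame, "pattern": "tracked"}
```

The cached result is reused on skipped frames, trading some prediction freshness for lower compute.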
Behavior prediction 552 may determine the speed and direction of objects surrounding the autonomous vehicle from the radar data, where the speed and direction information may be used to predict or determine a motion pattern of an object. A motion pattern may include predicted trajectory information for an object over a predetermined length of time in the future after an image is received from a camera. Based on the predicted motion pattern, the tracking or prediction module 546 may assign a motion pattern context label to an object (e.g., "located at coordinates (x, y)", "stopped", "traveling at 50 mph", "accelerating", or "decelerating"). The context label may describe the motion pattern of the object. The tracking or prediction module 546 may send the one or more object attributes (e.g., the type of the object) and the motion pattern context label to the planning module 562. The tracking or prediction module 546 may use any information obtained by the system 500, and any number and combination of its components, to perform the environmental analysis 554.
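Assigning a motion pattern context label from radar-derived speed and acceleration might look like the following sketch; the thresholds and label strings are illustrative assumptions:

```python
def motion_context_label(speed_mph, accel_mph_s):
    """Assign a coarse motion-pattern label from speed and acceleration."""
    if speed_mph == 0:
        return "stopped"
    if accel_mph_s > 0.5:       # assumed acceleration threshold
        return "accelerating"
    if accel_mph_s < -0.5:      # assumed deceleration threshold
        return "decelerating"
    return f"traveling at {speed_mph:.0f} mph"
```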
The onboard control computer may include a planning module 562 that receives the object attributes and motion pattern context labels from the tracking or prediction module 546, the drivable areas and/or obstacles from the occupancy grid module 560, and the vehicle position and pose information from the fusion positioning module 526 (described further below).
The planning module 562 can execute navigation planning 564 to determine a set of trajectories on which the autonomous vehicle can travel. The set of trajectories may be determined based on the drivable area information, the one or more attributes of the objects, the motion pattern context labels of the objects, and the positions of the obstacles. In some embodiments, navigation planning 564 may include determining an area beside the road where the autonomous vehicle may safely pull over in an emergency. The planning module 562 can include behavior decision making 566 to determine a driving action (e.g., steering, braking, throttle) in response to determining a changing condition on the road (e.g., a traffic light turning yellow, or the autonomous vehicle being in an unsafe driving condition because another vehicle is traveling in front of the autonomous vehicle and within a predetermined safe distance of the location of the autonomous vehicle). The planning module 562 performs trajectory generation 568 and selects a trajectory from the set of trajectories determined by the navigation planning operation 564. The selected trajectory information may be sent by the planning module 562 to the control module 570.
The onboard control computer may include a control module 570 that receives the proposed trajectory from the planning module 562 and the position and pose of the autonomous vehicle from the fusion positioning module 526. The control module 570 may include a system identifier 572. The control module 570 may perform model-based trajectory refinement 574 to refine the proposed trajectory. For example, the control module 570 may apply filtering (e.g., a Kalman filter) to smooth the proposed trajectory data and/or minimize noise. The control module 570 may perform robust control 576 by determining, based on the refined proposed trajectory information and the current position and/or pose of the autonomous vehicle, an amount of brake pressure to apply, a steering angle, a throttle amount to control vehicle speed, and/or a transmission gear. The control module 570 may send the determined brake pressure, steering angle, throttle amount, and/or transmission gear to one or more devices in the autonomous vehicle to control and facilitate accurate driving operations of the autonomous vehicle.
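The smoothing step can be illustrated with a one-dimensional Kalman filter over a single trajectory coordinate. This is a simplified sketch: a production filter would track the full vehicle state, and the noise parameters here are assumptions:

```python
class ScalarKalman:
    """One-dimensional Kalman filter for smoothing a noisy trajectory coordinate."""

    def __init__(self, process_var=1e-3, measurement_var=0.1):
        self.q = process_var      # assumed process noise variance
        self.r = measurement_var  # assumed measurement noise variance
        self.x = None             # state estimate
        self.p = 1.0              # estimate variance

    def update(self, z):
        if self.x is None:
            self.x = z            # initialize from the first measurement
            return self.x
        self.p += self.q                  # predict: variance grows
        k = self.p / (self.p + self.r)    # Kalman gain
        self.x += k * (z - self.x)        # correct toward the measurement
        self.p *= (1 - k)
        return self.x

def smooth(points):
    """Smooth a sequence of noisy scalar measurements."""
    kf = ScalarKalman()
    return [kf.update(p) for p in points]
```

A larger measurement variance relative to the process variance yields a lower gain, so the filter trusts its own prediction more and damps measurement noise more aggressively.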
The deep learning-based object detection 524 performed by the image-based object detection module 518 may also be used to detect landmarks on the road (e.g., stop signs, speed bumps, etc.). The fusion positioning module 526 obtains information regarding landmarks detected from the images, landmarks obtained from a map database 536 stored on the onboard control computer, and landmarks detected from the point cloud data items by the LiDAR-based object detection module 512. The fusion positioning module 526 also obtains information regarding speed and displacement from an odometer sensor 544 or rotary encoder, and obtains an estimated location of the autonomous vehicle from the GPS/IMU sensors 538 (i.e., the GPS sensor 540 and the IMU sensor 542) located on or in the autonomous vehicle. Based on this information, the fusion positioning module 526 can perform a positioning operation 528 to determine the location of the autonomous vehicle, which can be sent to the planning module 562 and the control module 570.
The fusion positioning module 526 can estimate the pose 530 of the autonomous vehicle based on the GPS and/or IMU sensors 538. The pose of the autonomous vehicle may be sent to the planning module 562 and the control module 570. The fusion positioning module 526 can also estimate the status (e.g., position, possible angle of movement) of the trailer unit based on information (e.g., angular rate and/or linear velocity) provided by, for example, the IMU sensor 542 (e.g., trailer status estimation 534). The fusion positioning module 526 may also check map content 532.
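Blending dead-reckoned odometry with absolute GPS fixes, as a fusion positioning module might do, can be sketched as a one-dimensional complementary filter. This is an illustrative simplification; the blend weight is an assumption:

```python
def fuse_position(gps_positions, odom_deltas, alpha=0.9):
    """Blend dead-reckoned odometry with absolute GPS fixes (complementary filter)."""
    fused = [gps_positions[0]]  # start from the first GPS fix
    for gps, delta in zip(gps_positions[1:], odom_deltas):
        predicted = fused[-1] + delta                        # dead reckoning from odometry
        fused.append(alpha * predicted + (1 - alpha) * gps)  # pull toward the GPS fix
    return fused
```

Odometry dominates over short horizons (smooth, drift-prone), while the GPS term bounds long-term drift.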
Fig. 6 shows an exemplary block diagram of an onboard control computer 450 included in an autonomous vehicle 402. The onboard control computer 450 may include at least one processor 604 and a memory 602 having instructions (e.g., software instructions 128 and processing instructions 480 in figs. 1 and 4, respectively) stored thereon. The instructions, when executed by the processor 604, configure the onboard control computer 450 and/or the various modules of the onboard control computer 450 to perform the operations described in figs. 1-6. The transmitter 606 may transmit information or data to one or more devices in the autonomous vehicle. For example, the transmitter 606 may send instructions to one or more motors of the steering wheel to steer the autonomous vehicle. The receiver 608 receives information or data transmitted or sent by one or more devices. For example, the receiver 608 receives a current speed status from the odometer sensor or a current transmission gear from the transmission. The transmitter 606 and receiver 608 may also be configured to communicate with the plurality of vehicle subsystems 440 and the onboard control computer 450 described above in figs. 4 and 5.
Although several embodiments are provided in this disclosure, it should be understood that the disclosed systems and methods may be embodied in many other specific forms without departing from the spirit or scope of the present disclosure. The present embodiments are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, various elements or components may be combined or integrated into another system, or certain features may be omitted or not implemented.
Furthermore, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of the present disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component, whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.
To assist the patent office and any readers of any patent issued on this application in interpreting the claims appended hereto, applicant notes that it does not intend to invoke 35 U.S.C. § 112(f) as it exists on the date of filing hereof unless the word "means" or "step" is explicitly used in a particular claim.
Implementations of the present disclosure may be described in terms of the following clauses, which may be combined in any reasonable manner.
Clause 1. A system comprising:
a memory configured to store a first message; and
a gateway processor operably coupled with the memory and configured to:
coordinate communication among an autonomous vehicle component boundary domain, a vehicle component boundary domain, and a supervisory server;
receive a first message from the supervisory server, wherein:
the first message is associated with one of the autonomous vehicle component boundary domain, the vehicle component boundary domain, or a security domain; and
the security domain includes a third set of components configured to facilitate authentication of received messages;
determine a priority associated with the first message, wherein the priority associated with the first message indicates a scheduling requirement associated with the first message;
identify domain tag data associated with the first message, wherein the domain tag data indicates that the first message is associated with a particular domain from among the autonomous vehicle component boundary domain, the vehicle component boundary domain, or the security domain;
identify destination data associated with the first message, wherein the destination data indicates that the first message is addressed to a particular component within the particular domain;
schedule the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data; and
route the first message to the particular component based at least in part on the destination data.
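As a hedged, non-authoritative illustration only, the receive/schedule/route flow recited in clause 1 could be sketched with a priority heap; all names and the message representation here are hypothetical, not the claimed implementation:

```python
import heapq

class GatewayProcessor:
    """Sketch of a priority-ordered message gateway (illustrative names only)."""

    def __init__(self):
        self._queue = []  # min-heap ordered by numeric priority (0 = most urgent)
        self._seq = 0     # tie-breaker preserving arrival order

    def receive(self, message):
        """Enqueue a message dict with 'priority', 'domain', 'destination', 'payload'."""
        heapq.heappush(self._queue, (message["priority"], self._seq, message))
        self._seq += 1

    def dispatch(self):
        """Pop the highest-priority message and route it to its destination."""
        if not self._queue:
            return None
        _, _, message = heapq.heappop(self._queue)
        return (message["domain"], message["destination"], message["payload"])
```

A more urgent message dispatches first even when it arrives later, which is the essence of priority-based scheduling ahead of destination-based routing.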
Clause 2. The system of clause 1, wherein the particular component is a software component internal to the gateway processor.
Clause 3. The system of clause 1, wherein the particular component is a hardware component external to the gateway processor.
Clause 4. The system of clause 1, wherein the priority associated with the first message further indicates whether the priority of the first message is low, medium, or high.
Clause 5. The system of clause 4, wherein scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data comprises: in response to determining that the priority associated with the first message is high, moving the first message to the top of a dispatch queue that includes a plurality of messages associated with various priorities, wherein the priority is determined to be high when the priority is greater than a threshold.
Clause 6. The system of clause 4, wherein scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data comprises: in response to determining that the priority associated with the first message is high, moving the first message to a dispatch queue dedicated to messages having a high priority, wherein the priority is determined to be high when the priority is greater than a threshold.
Clause 7. The system of clause 4, wherein scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data comprises: in response to determining that the priority associated with the first message is medium, moving the first message to a dispatch queue dedicated to messages having a medium priority, wherein the priority is determined to be medium when the priority is between a first threshold and a second threshold.
Clause 8. The system of clause 4, wherein scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data comprises: in response to determining that the priority associated with the first message is low, moving the first message to a dispatch queue dedicated to messages having a low priority, wherein the priority is determined to be low when the priority is less than a threshold.
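The dedicated per-priority queues described in clauses 5 through 8 can be sketched as follows; this is a minimal illustration, and the threshold values and names are assumptions:

```python
from collections import deque

class PriorityDispatcher:
    """Three dedicated queues selected by threshold comparison."""

    def __init__(self, low_threshold=3, high_threshold=7):
        self.low_threshold = low_threshold
        self.high_threshold = high_threshold
        self.queues = {"high": deque(), "medium": deque(), "low": deque()}

    def enqueue(self, message, priority):
        """Place the message in the queue dedicated to its priority band."""
        if priority > self.high_threshold:
            self.queues["high"].append(message)
        elif priority < self.low_threshold:
            self.queues["low"].append(message)
        else:  # between the two thresholds
            self.queues["medium"].append(message)

    def next_message(self):
        """Drain the high-priority queue first, then medium, then low."""
        for band in ("high", "medium", "low"):
            if self.queues[band]:
                return self.queues[band].popleft()
        return None
```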
Clause 9. A method comprising:
coordinating communication among an autonomous vehicle component boundary domain, a vehicle component boundary domain, and a supervisory server;
receiving a first message from the supervisory server, wherein:
the first message is associated with one of the autonomous vehicle component boundary domain, the vehicle component boundary domain, or a security domain; and
the security domain includes a third set of components configured to facilitate authentication of received messages;
determining a priority associated with the first message, wherein the priority associated with the first message indicates a scheduling requirement associated with the first message;
identifying domain tag data associated with the first message, wherein the domain tag data indicates that the first message is associated with a particular domain from among the autonomous vehicle component boundary domain, the vehicle component boundary domain, or the security domain;
identifying destination data associated with the first message, wherein the destination data indicates that the first message is addressed to a particular component within the particular domain;
scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data; and
routing the first message to the particular component based at least in part on the destination data.
Clause 10. The method of clause 9, wherein:
the autonomous vehicle component boundary domain includes a first set of components configured to facilitate autonomous operation of the autonomous vehicle; and
the first set of components includes at least one of:
one or more autonomous driving computing units;
a memory associated with the autonomous vehicle;
a controller area network controller; or
a timing synchronization component.
Clause 11. The method of clause 9, wherein:
the vehicle component boundary domain includes a second set of components configured to facilitate non-autonomous operation of the autonomous vehicle; and
the second set of components includes at least one of:
a modem;
a vehicle component controller;
a human-machine interface;
a brake unit;
a power distribution unit;
a camera array;
a microphone array; or
a speaker array.
Clause 12. The method of clause 9, wherein the third set of components comprises at least one of:
an authentication software component; or
one or more security keys for establishing a secure communication path between any two of the autonomous vehicle component boundary domain, the vehicle component boundary domain, and the security domain.
Clause 13. The method of clause 9, further comprising establishing a trusted communication path between the autonomous vehicle component boundary domain and the vehicle component boundary domain.
Clause 14. The method of clause 13, wherein establishing the trusted communication path between the autonomous vehicle component boundary domain and the vehicle component boundary domain comprises:
receiving an initial private security key from the supervisory server;
sharing the initial private security key with a communication processor associated with the vehicle component boundary domain;
receiving a request from the communication processor to transmit a second message, wherein the request includes the second message and a private security key; and
determining that the received private security key corresponds to the initial private security key.
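The key check recited in clause 14 can be illustrated with a constant-time comparison; this is a sketch, the class and method names are hypothetical, and `hmac.compare_digest` is used here to avoid timing side channels:

```python
import hmac

class TrustedChannel:
    """Verify that a sender's presented key matches the provisioned key."""

    def __init__(self, initial_key: bytes):
        self._initial_key = initial_key  # key received from the supervisory server

    def accept(self, message: bytes, presented_key: bytes) -> bool:
        """Accept the message only if the presented key matches the initial key."""
        # constant-time comparison avoids leaking key bytes via timing
        return hmac.compare_digest(presented_key, self._initial_key)
```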
Clause 15. A non-transitory computer readable medium storing instructions that, when executed by a processor, cause the processor to:
coordinate communication among an autonomous vehicle component boundary domain, a vehicle component boundary domain, and a supervisory server;
receive a first message from the supervisory server, wherein:
the first message is associated with one of the autonomous vehicle component boundary domain, the vehicle component boundary domain, or a security domain; and
the security domain includes a third set of components configured to facilitate authentication of received messages;
determine a priority associated with the first message, wherein the priority associated with the first message indicates a scheduling requirement associated with the first message;
identify domain tag data associated with the first message, wherein the domain tag data indicates that the first message is associated with a particular domain from among the autonomous vehicle component boundary domain, the vehicle component boundary domain, or the security domain;
identify destination data associated with the first message, wherein the destination data indicates that the first message is addressed to a particular component within the particular domain;
schedule the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data; and
route the first message to the particular component based at least in part on the destination data.
Clause 16. The non-transitory computer readable medium of clause 15, wherein scheduling the first message to be transmitted to the particular domain based at least in part on the priority associated with the first message and the identified domain tag data comprises:
determining that the particular domain is the security domain; and
in response to determining that the particular domain is the security domain, routing the first message to the security domain.
Clause 17. The non-transitory computer readable medium of clause 15, wherein determining the priority associated with the first message comprises: determining that the first message is associated with priority tag data that indicates the priority associated with the first message.
Clause 18. The non-transitory computer readable medium of clause 15, wherein determining the priority associated with the first message comprises: determining that the first message is associated with a particular Internet Protocol (IP) address, the particular IP address being associated with the priority associated with the first message.
Clause 19. The non-transitory computer readable medium of clause 15, wherein:
in response to determining that the first message is associated with a first IP address, the priority associated with the first message is determined to be a high priority;
in response to determining that the first message is associated with a second IP address, the priority associated with the first message is determined to be a medium priority; and
in response to determining that the first message is associated with a third IP address, the priority associated with the first message is determined to be a low priority.
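The IP-address-to-priority mapping described in clauses 18 and 19 can be sketched as a simple lookup; the addresses and table here are purely illustrative assumptions:

```python
PRIORITY_BY_IP = {  # hypothetical address-to-priority table
    "10.0.0.1": "high",
    "10.0.0.2": "medium",
    "10.0.0.3": "low",
}

def priority_from_ip(source_ip, default="low"):
    """Look up a message's priority from its source IP address."""
    return PRIORITY_BY_IP.get(source_ip, default)
```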
Clause 20. The non-transitory computer readable medium of clause 15, wherein the first message comprises one of:
a command to enable autonomous operation;
map data including a virtual map of an area in which the autonomous vehicle is traveling;
an update to an autonomous driving software algorithm; or
a minimal risk maneuver command that includes instructions for pulling the autonomous vehicle over to the side of the road or bringing the autonomous vehicle to a stop.
Clause 21. The system of any of clauses 1 to 8, wherein the processor is further configured to perform one or more operations of the method of any of clauses 9 to 14.
Clause 22. The system of any of clauses 1 to 8, wherein the processor is further configured to perform one or more operations according to any of clauses 15 to 20.
Clause 23. An apparatus comprising means for performing the method of any of clauses 9 to 14.
Clause 24. An apparatus comprising means for executing one or more instructions according to any of clauses 15 to 20.
Clause 25. The non-transitory computer readable medium of any of clauses 15 to 20, storing instructions that, when executed on a system, further cause one or more processors to perform one or more operations of the method of any of clauses 9 to 14.