
CN110007752A - Augmented reality vehicle interfacing - Google Patents


Info

Publication number
CN110007752A
CN110007752A (application CN201910009386.5A)
Authority
CN
China
Prior art keywords
user
vehicle
autonomous vehicle
trip
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910009386.5A
Other languages
Chinese (zh)
Inventor
张洵铣
A·阿拉劳
J·A·米勒
H·拉维钱德兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motional AD LLC
Original Assignee
Nutonomy Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nutonomy Inc.
Publication of CN110007752A
Legal status: Pending


Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 1/00 — Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60R — VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 2300/00 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R 2300/20 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of display used
    • B60R 2300/207 — Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of display used, using multi-purpose displays, e.g. camera image and navigation or video on same display
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 — Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 — Indexing scheme relating to G06F 3/01
    • G06F 2203/012 — Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

This application provides augmented reality vehicle interfacing. In one aspect, an apparatus includes: a processor; and storage for instructions executable by the processor to perform operations including: in connection with a trip by a person in an autonomous vehicle, selecting a specific location at which the person will be picked up for the trip or a specific location at which the trip will end; and presenting, through a user interface of a device, visual information describing the specific location; among other such subject matter.

Description

Augmented reality vehicle interfacing
Priority claim
This application claims, under 35 U.S.C. § 119(e), the benefit of priority of U.S. Provisional Patent Application No. 62/629,764, filed February 13, 2018, and U.S. Provisional Patent Application No. 62/613,664, filed January 4, 2018, the entire contents of both of which are incorporated herein by reference.
Background technique
This description relates to augmented reality vehicle interfacing.
As suggested by articles such as "Augmented Reality's Future Isn't Glasses. It's the Car" (https://venturebeat.com/2017/08/23/ar-will-drive-the-evolution-of-automated-cars/), the use of augmented reality with vehicles has become a topic of interest.
Summary of the invention
In general, in one aspect, an apparatus includes: a processor; and storage for instructions executable by the processor to perform operations including: (a) receiving information about a trip of an autonomous vehicle; (b) identifying a real-world scene corresponding to the trip of the autonomous vehicle; and (c) causing a device to present augmentation elements for the trip of the autonomous vehicle, the augmentation elements being presented at positions that enable a user of the device to view the graphical augmentation elements at or near visual elements of the real-world scene.
Implementations may include one or a combination of two or more of the following features. The device includes an augmented reality viewing device. The device includes augmented reality glasses. The device includes a head-up display. The device includes the display screen of a mobile device. The device includes a monitor located remotely from the autonomous vehicle. The user views the real-world scene live, and the augmentation elements are presented on or near the real-world scene. Both the real-world scene and the augmentation elements are presented on the device. The device is located inside the autonomous vehicle. The device is located outside of and near the autonomous vehicle. The device is located in a teleoperation facility. The augmentation elements include labels identifying moving objects in the real-world scene. The augmentation elements include graphical representations of moving objects in the real-world scene. The augmentation elements include a planned trajectory of the autonomous vehicle. The augmentation elements include an actual specific pickup location. The augmentation elements include a label identifying, from among two or more autonomous vehicles visible to the user in the real-world scene, the specific autonomous vehicle that will pick up the user.
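The label-near-object feature described above can be illustrated with a minimal Python sketch. All names here (`SceneObject`, `place_labels`) are hypothetical and not from the patent; the sketch only shows the idea of anchoring a graphical augmentation element near the screen position of the visual element it annotates.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SceneObject:
    """A visual element detected in the real-world scene (hypothetical model)."""
    object_id: str
    kind: str                    # e.g. "pedestrian", "vehicle"
    screen_xy: Tuple[int, int]   # where the object appears in the user's view

@dataclass
class AugmentationElement:
    label: str
    anchor_xy: Tuple[int, int]   # position near the associated visual element

def place_labels(objects: List[SceneObject], y_offset: int = -20) -> List[AugmentationElement]:
    """Anchor one graphical label just above each detected object, so the
    device's user sees the augmentation element near the visual element."""
    return [
        AugmentationElement(label=o.kind,
                            anchor_xy=(o.screen_xy[0], o.screen_xy[1] + y_offset))
        for o in objects
    ]

labels = place_labels([SceneObject("obj-1", "pedestrian", (100, 200))])
# labels[0].anchor_xy is (100, 180): 20 pixels above the object
```

A renderer would redo this placement every frame so the labels track the moving objects.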
In general, in one aspect, an apparatus includes: a processor; and storage for instructions executable by the processor to perform operations including: (a) in connection with a trip by a person in an autonomous vehicle, selecting a specific location at which the person will be picked up for the trip or a specific location at which the trip will end; and (b) presenting, through a user interface of a device, visible information describing the specific location.
Implementations may include one or a combination of two or more of the following features. The specific location includes a specific physical location on or near a road. The specific location includes a specific physical location identified before the person requests the trip. The visible information includes a real-world image of the specific physical location. The visible information includes a real-world image of an autonomous vehicle presented together with the real-world image of the specific physical location. The specific location is not identified by a street address.
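One way to realize a pre-identified, non-street-address pickup location is to match the rider to the nearest entry in a table of pre-surveyed curbside spots. The sketch below assumes such a table (`PICKUP_SPOTS` and the coordinates are invented for illustration) and uses a simple equirectangular distance approximation, which is adequate over short urban distances:

```python
import math

# Hypothetical pre-surveyed curbside pickup spots: (id, latitude, longitude).
# None of these is identified by a street address.
PICKUP_SPOTS = [
    ("spot-a", 42.3601, -71.0589),
    ("spot-b", 42.3650, -71.0600),
    ("spot-c", 42.3550, -71.0500),
]

def nearest_pickup_spot(user_lat, user_lon, spots=PICKUP_SPOTS):
    """Select the pre-identified physical location closest to the rider."""
    def dist(spot):
        _, lat, lon = spot
        # Equirectangular approximation of great-circle distance (unitless here,
        # since only the ordering matters).
        x = math.radians(lon - user_lon) * math.cos(math.radians((lat + user_lat) / 2))
        y = math.radians(lat - user_lat)
        return math.hypot(x, y)
    return min(spots, key=dist)[0]

# A rider standing at spot-a's coordinates is matched to spot-a.
assert nearest_pickup_spot(42.3601, -71.0589) == "spot-a"
```

The service could then show the rider a real-world image of the chosen spot rather than a street address.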
In general, in one aspect, an apparatus includes a mobile device that includes a display, a processor, and an app or browser. The app or browser causes the processor to present on the display a description of an actual specific location at which a person will be picked up for a trip by an autonomous vehicle, or at which the trip will end, the specific location having been determined before the person made the request for the trip.
In general, in one aspect, an apparatus includes: a processor; and storage for instructions executable by the processor to perform operations including: (a) receiving, through a user interface of a device, a signal from a user about a trip of the user in an autonomous vehicle, the signal relating to a feature of the autonomous vehicle that can be controlled in response to signals from the user; and (b) determining an action to be taken by the autonomous vehicle to respond to the signal from the user by controlling the feature of the autonomous vehicle.
Implementations may include one or a combination of two or more of the following features. The feature of the autonomous vehicle that can be controlled in response to the signal from the user includes the temperature in the autonomous vehicle. The signal received from the user includes a temperature in the autonomous vehicle. The feature of the autonomous vehicle that can be controlled in response to the signal from the user includes the passenger capacity of the vehicle. The signal received from the user includes the number of passengers for the trip. The feature of the autonomous vehicle that can be controlled in response to the signal from the user includes the state of an entertainment system in the vehicle. The state of the entertainment system includes at least one type of entertainment content. The state of the entertainment system includes the identifier of a broadcast station. The signal received from the user includes an identification of a content source or a content type for the entertainment system. The feature of the autonomous vehicle that can be controlled in response to the signal from the user includes accommodation of a child who requires a child seat. The signal received from the user includes an indication of a child who will be present on the trip. The feature of the autonomous vehicle that can be controlled in response to the signal from the user includes accommodation of a package for the trip. The signal received from the user includes information about a package for the trip. The instructions are executable by the processor to select an autonomous vehicle from among two or more available autonomous vehicles based on the signal from the user. The selection of the autonomous vehicle is based on at least one of: passenger capacity, package capacity, or the availability of a child car seat. The processor and the storage are parts of a central AV system. The instructions executable by the processor are for transmitting the action to the autonomous vehicle.
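The fleet-side selection just described — choosing among available autonomous vehicles based on passenger capacity, package capacity, and child-seat availability — can be sketched as a simple filter-then-rank step. The field names and ETA-based ranking are assumptions for illustration, not the patent's implementation:

```python
def select_vehicle(fleet, passengers, packages=0, needs_child_seat=False):
    """Filter available AVs by passenger capacity, package capacity, and
    child-seat availability, then pick the one with the shortest ETA."""
    candidates = [
        v for v in fleet
        if v["seats"] >= passengers
        and v["cargo"] >= packages
        and (v["has_child_seat"] or not needs_child_seat)
    ]
    return min(candidates, key=lambda v: v["eta_min"], default=None)

fleet = [
    {"id": "av-1", "seats": 4, "cargo": 2, "has_child_seat": False, "eta_min": 3},
    {"id": "av-2", "seats": 4, "cargo": 2, "has_child_seat": True,  "eta_min": 7},
]
assert select_vehicle(fleet, passengers=2)["id"] == "av-1"
# Requiring a child seat rules out the closer vehicle:
assert select_vehicle(fleet, passengers=2, needs_child_seat=True)["id"] == "av-2"
```

In a central AV system this selection would run server-side and the chosen vehicle would then be sent the resulting action.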
In general, in one aspect, an apparatus includes a mobile device that includes a display, a processor, and an app or browser. The app or browser causes the processor to present on the display at least one option for controlling a feature of an autonomous vehicle for a trip, the feature including at least one of the following: passenger capacity, package capacity, the availability of a child car seat, the temperature in the vehicle, or a feature of an entertainment system.
In general, in one aspect, an apparatus includes: a processor; and storage for instructions executable by the processor to perform operations including: (a) receiving information about a real-world environment through which an autonomous vehicle is traveling; (b) identifying one or more moving objects in the real-world environment; and (c) causing the user interface of a device to present to a person a scene that includes a currently visible representation of the real-world environment and visible indications confirming that the moving objects have been identified.
Implementations may include one or a combination of two or more of the following features. The received information about the real-world environment includes images from an image-capturing device, and the representation presented to the person includes the images. The representation presented to the person includes a schematic representation of the real-world environment. The schematic representation of the real-world environment includes a graphical representation of a road network. The representation presented to the person includes a schematic representation of the planned trajectory of the vehicle. The visible indications confirming that the moving objects have been identified include illustrative arrays of graphical elements based on signals from lidar, radar, or cameras. The visible indications confirming that the moving objects have been identified include labels on the moving objects presented in live video of the real-world environment. The labels include graphical boxes. The instructions are executable to cause the user interface to present, simultaneously with the scene, a second scene; the scene includes live video of the real-world environment, and the second scene includes illustrative elements corresponding to the real-world environment. The device includes a mobile device. The device includes a workstation. The device is located in the autonomous vehicle. The device is located outside the autonomous vehicle. The device is located at a teleoperation site. In the apparatus of any of the foregoing examples, the instructions executable by the processor classify the identified moving objects in the real-world environment, and the labels differ according to the class of the identified moving object.
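The class-dependent graphical boxes mentioned above can be sketched as a lookup from object class to label style. `CLASS_STYLES`, the color choices, and the detection dictionary layout are all hypothetical; the point is only that the confirming label differs by class:

```python
# Hypothetical per-class styles; labels differ according to object class.
CLASS_STYLES = {
    "pedestrian": {"color": "yellow"},
    "vehicle":    {"color": "blue"},
    "cyclist":    {"color": "green"},
}

def make_overlay(detections):
    """Turn classified detections into graphical boxes that confirm to the
    viewer that each moving object has been identified."""
    overlay = []
    for det in detections:
        style = CLASS_STYLES.get(det["class"], {"color": "white"})
        overlay.append({
            "type": "box",
            "bbox": det["bbox"],   # (x, y, width, height) in pixels
            "color": style["color"],
            "text": det["class"],
        })
    return overlay

boxes = make_overlay([{"class": "pedestrian", "bbox": (40, 60, 30, 80)}])
assert boxes[0]["color"] == "yellow" and boxes[0]["text"] == "pedestrian"
```

A renderer would draw these boxes over the live video, updating them as the detections move.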
In general, in one aspect, an apparatus includes: a head-up display interposed between an occupant of an autonomous vehicle and the occupant's view of the real-world environment experienced from the autonomous vehicle; a processor; and storage for instructions executable by the processor to perform operations including: (a) identifying objects in motion in the real-world environment; and (b) causing the head-up display to present indications of the objects in motion in the real-world environment, the indications being visible to the occupant together with the objects in the real-world environment.
Implementations may include one or a combination of two or more of the following features. The indications include graphical elements representing the moving objects. The instructions are executable to coordinate the presentation of the moving objects on the head-up display with changes in the position of the autonomous vehicle. The head-up display includes a substantially transparent screen on which graphical elements representing the moving objects are shown. The head-up display is interposed between the occupant and the windshield of the vehicle. The head-up display is interposed between the occupant and a window of the vehicle. The screen includes a coating on the windshield or a window of the vehicle.
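Coordinating the head-up graphic with the vehicle's motion amounts to re-projecting each moving object into screen coordinates every frame. A minimal pinhole-style sketch is below; the eye position, focal constant, and screen geometry are invented assumptions, and a real HUD would additionally calibrate for the occupant's head position and the optics of the combiner:

```python
def project_to_hud(point_vehicle, eye=(0.0, 0.0, 1.2),
                   focal=800.0, screen_center=(640, 360)):
    """Project a point in the vehicle frame (x forward, y left, z up, metres)
    onto HUD screen pixels. Re-running this each frame as the vehicle pose
    changes keeps the graphic aligned with the moving object."""
    x, y, z = (p - e for p, e in zip(point_vehicle, eye))
    if x <= 0:
        return None  # behind the viewer; nothing to draw
    u = screen_center[0] - focal * (y / x)
    v = screen_center[1] - focal * (z / x)
    return (round(u), round(v))

# An object dead ahead at eye height lands at the screen center.
assert project_to_hud((10.0, 0.0, 1.2)) == (640, 360)
# An object one metre to the left appears left of center.
assert project_to_hud((10.0, 1.0, 1.2)) == (560, 360)
```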
In general, in one aspect, an apparatus includes a vehicle that includes multiple sensors, a processor, and a display configured by the processor to allow a user to view information about the vehicle. The processor receives data about the vehicle's environment from the multiple sensors, analyzes the data to generate the information about the vehicle, and shows the information on the display, the information including features that are relevant to the operation of the vehicle but are not present in the vehicle's environment.
In general, in one aspect, a vehicle includes: driving components including an acceleration component, a steering component, and a deceleration component; autonomous driving capabilities for issuing signals to the driving components to drive the vehicle in an at least partly autonomous driving mode; an analysis component for analyzing data accumulated by sensors on the vehicle and generating information about the environment of the vehicle; and a display that shows the information about the environment of the vehicle, the information including features not present in the environment of the vehicle.
In general, in one aspect, a method includes: receiving, from one or more sensors, data about the surroundings of the one or more sensors; analyzing the data to identify one or more features of the surroundings of the one or more sensors, the one or more features being related to the operation of an autonomous vehicle (AV); and showing on a screen a rendering of the surroundings of the one or more sensors, the rendering including data from at least one of the sensors and at least one object representing one or more of the identified features.
Realization may include the combination of one or two or more items in following characteristics.Rendering includes and autonomous vehicle Operate at least one related imaginary object.Imaginary object includes the rendering of vehicle.Imaginary object includes being located at user to select The rendering of the vehicle of position.Imaginary object includes the label for indicating the planned trajectory of autonomous vehicle.One or more sensors quilt It is attached to headphone.One or more sensors are attached to a pair of glasses.One or more sensors are attached to intelligence It can phone.This at least one object is the label for identifying class belonging to institute's identification characteristics.
Implementations may include one or a combination of two or more of the following features. Showing the rendering includes presenting an interactive interface. Presenting the interactive interface includes presenting the field of view of a visual sensor of the vehicle, or a bird's-eye view. Presenting the interactive interface includes presenting current perception information. Presenting the interactive interface includes presenting current or past motion-planning information, or both. Presenting the interactive interface includes presenting a system diagram of the vehicle, the system diagram including one or more hardware components or one or more software processes, or both. The data includes one or more of the following: maps, sensor data from the vehicle or an associated AV system, trajectory data from the vehicle or an associated AV system, or vision data from the vehicle or an associated AV system. The display is located in the autonomous vehicle. The display is remote from the autonomous vehicle.
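The rendering method above — live sensor data, labels on identified features, plus an imaginary object such as the planned trajectory — can be sketched as an ordered layer stack. The layer schema is invented for illustration:

```python
def build_rendering(camera_frame, detections, planned_trajectory):
    """Compose the rendering as ordered layers: live sensor data first,
    then one marker per identified feature, then an imaginary object
    (the AV's planned trajectory) that is not present in the real
    environment."""
    layers = [{"type": "image", "data": camera_frame}]
    for det in detections:
        layers.append({"type": "label", "bbox": det["bbox"], "text": det["class"]})
    layers.append({"type": "polyline", "points": planned_trajectory,
                   "text": "planned trajectory"})
    return layers

frame = "camera-frame-0"   # stand-in for real image data
layers = build_rendering(frame,
                         [{"class": "vehicle", "bbox": (10, 10, 50, 30)}],
                         [(0, 0), (5, 1), (10, 3)])
assert [l["type"] for l in layers] == ["image", "label", "polyline"]
```

Drawing the layers in order gives the viewer the sensor view with the identified features and the imaginary trajectory superimposed.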
These and other aspects, features, and implementations will become apparent from the following description, including the claims, and can be expressed as methods, apparatus, systems, components, program products, methods of doing business, means or steps for performing functions, and in other ways.
Detailed description of the invention
Fig. 1, Fig. 6A, Fig. 5A, Fig. 7, Fig. 8, and Fig. 9 are block diagrams.
Fig. 2A-2C, Fig. 3A-3J, Fig. 4B-4C, and Fig. 6B-6C are screenshots.
Fig. 4A, Fig. 4D-4F, and Fig. 5B-5E are schematic diagrams.
Specific embodiment
General view
As shown in Fig. 8, when people 10 use the system 200 to, for example, look for one or more vehicles 12 (such as, but not limited to, autonomous vehicles), observe their operation, and avoid accidents with one or more vehicles 12 (such as, but not limited to, autonomous vehicles), activities such as hailing, riding, driving, and others can be associated with augmented reality technology 13 implemented on or by various user interface devices 14, such as mobile devices 16, head-mounted interface devices 18, head-up displays or vehicle displays 20, and so on. The user interface devices may be located inside or outside a vehicle. In some implementations, the augmented reality technology includes presenting augmented reality elements 22 superimposed on a real-world scene 24. In some cases, the augmented reality elements 22 are superimposed on the real-world scene 24 by presentation through a user interface 26 on the display 27 of a user interface device. In some instances, the augmented reality elements are superimposed on the real-world scene by presentation, through the user interface, of both the augmented reality elements 22 and a view 28 of the real-world scene; the view 28 of the real-world scene can be captured by a camera, a microphone, or another scene-capture device 30, or a combination of them. A user interface device can present the augmented reality elements (and, in some cases, the real-world scene) using a native application (or web browser) running on the user interface device. The native application (or web browser) can receive data representing the real-world scene as input, generate the augmented display elements, and combine them for presentation on the display of the user interface. In general, the real-world scene and the augmented reality elements are presented in real time (for example, "live") so that they are relevant and useful to a person in a real-world situation associated with a vehicle.
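Combining a generated augmentation element with a captured view of the real-world scene, as described above, is ordinary alpha compositing. A single-pixel sketch of the standard Porter-Duff "over" operator (not the patent's own method, just the common technique) is:

```python
def over(src_rgba, dst_rgb):
    """Porter-Duff 'over' for one pixel: composite an AR overlay pixel
    (with alpha in [0, 1]) onto the underlying camera pixel."""
    sr, sg, sb, sa = src_rgba
    return tuple(round(s * sa + d * (1.0 - sa))
                 for s, d in zip((sr, sg, sb), dst_rgb))

# A fully opaque overlay pixel replaces the camera pixel...
assert over((255, 0, 0, 1.0), (0, 0, 255)) == (255, 0, 0)
# ...and a fully transparent one leaves it unchanged.
assert over((255, 0, 0, 0.0), (0, 0, 255)) == (0, 0, 255)
```

Applying this per pixel, frame by frame, yields the "live" superimposition of augmentation elements on the captured scene.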
Term " augmented reality " or " AR " be widely used in including for example by present be not physically real-world scene The real-world scene that partial any sense organ element is added, enhances, amplifying, extending or otherwise " enhancing " it is any Direct or indirect view.Sensation element can be (such as, video, figure or the GPS data) of vision, tactile or the sense of hearing Or two of them or more combination and other kinds of element.
Autonomous vehicle example
We will frequently use the example of an autonomous vehicle as the context for our description. Nevertheless, at least some of the technologies described here are applicable to, and useful for, human-driven vehicles.
Term " autonomous vehicle " or " AV " or " pilotless automobile " are widely used in " from driving a car " including for example having There is any vehicle of one or more autonomous driving abilities.
Term " autonomous driving ability " be widely used in including for example can in addition to by people manipulate the steering wheel of AV, accelerator, Any function, feature or the facility of the driving of AV are participated in outside brake or other entity controllers.
The technologies described here are applicable to any vehicle that has one or more autonomous driving capabilities, including fully autonomous vehicles, highly autonomous vehicles, and conditionally autonomous vehicles, such as so-called level 5, level 4, and level 3 vehicles, respectively (see SAE International standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, the entire contents of which are incorporated by reference, for more details on the classification of levels of vehicle autonomy). Autonomous driving capabilities may attempt to control the steering or speed of a vehicle. The technologies described in this document can also be applied to partly autonomous vehicles and driver-assisted vehicles, such as so-called level 2 and level 1 vehicles (see SAE International standard J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems). One or more of the level 1, 2, 3, 4, and 5 vehicle systems may automate certain vehicle operations (e.g., steering, braking, and using maps) under certain driving conditions based on processing of sensor inputs. The technologies described in this document can benefit vehicles at any level, ranging from fully autonomous vehicles to human-operated vehicles.
Compared with a person navigating a conventional vehicle, an autonomous vehicle is typically capable of sensing its environment and navigating through that environment with no human input or with reduced human input. Autonomous vehicles detect their surroundings using a variety of technologies (such as radar, laser, GPS, odometry, and computer vision) and generate corresponding sensory information. Advanced control systems (ACS) interpret this sensory information to identify appropriate navigation paths, as well as obstacles and relevant signage. Autonomous vehicles can reduce mobility costs and infrastructure costs and improve safety, thereby significantly reducing traffic collisions and the injuries they cause. Autonomous vehicles can facilitate transportation services, especially via the various business models of the shared economy.
As shown in Fig. 1, a typical activity of an autonomous vehicle (AV) 100 is to safely and reliably drive, autonomously or partly manually or both, along a trajectory 198 through an environment 190 toward a destination 199, while avoiding objects (e.g., natural obstructions 191, vehicles 193, pedestrians 192, cyclists, and other obstacles) and obeying the rules of the road (e.g., rules of operation or driving preferences). The features, functions, and facilities of the AV 100 or AV system 120 that enable the AV 100 to perform autonomous driving are often referred to as autonomous driving capabilities.
Term " track " be widely used in including for example from one place to another any path or route;Example Such as, the path from boarding position to out-of-the-car position;Towards the path of target position.
Term " target " or " target position " are widely used in from anywhere in reaching including such as AV, including for example interim Out-of-the-car position, final out-of-the-car position or destination etc..
Driving of the AV 100 is typically supported by an array of technologies (e.g., hardware, software, and stored and real-time data), which together are referred to in this document as an AV system 120. In some implementations, one or some or all of the technologies are onboard the AV 100. In some cases, one or some or all of the technologies are located at another location, such as at a server (e.g., in a cloud computing infrastructure).
Components of the AV system 120 can include one or more or all of the following:
1. Functional devices 101 of the AV 100 that are instrumented to receive and act on commands from one or more computing processors 146 and 148, the commands being for driving (e.g., steering 102, acceleration, deceleration, gear selection, and braking 103) and for auxiliary functions (e.g., turn-signal activation).
2. Data storage units 142 or memory 144 or both, for storing machine instructions or various types of data or both.
3. One or more sensors 121 for measuring or inferring, or both, properties of the AV's state or condition, such as the AV's position, linear and angular velocity and acceleration, and heading (e.g., the orientation of the front end of the AV). For example, such sensors can include, but are not limited to: GPS; inertial measurement units that measure the vehicle's linear accelerations and angular rates; individual wheel-speed sensors for measuring or estimating individual wheel slip; individual wheel-brake-pressure or braking-torque sensors; engine-torque or individual wheel-torque sensors; and steering-wheel angle and angular-rate sensors.
4. One or more sensors for sensing or measuring properties of the AV's environment. For example, such sensors can include, but are not limited to: monocular or stereo cameras 122 in the visible-light, infrared, or thermal (or both) spectra; lidar 123; radar; ultrasonic sensors; time-of-flight (TOF) depth sensors; speed sensors; and temperature and rain sensors.
5. One or more communication devices 140 for communicating measured or inferred, or both measured and inferred, properties of other vehicles' states and conditions, such as positions, linear and angular velocities, linear and angular accelerations, and linear and angular headings. These devices include vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication devices, and devices for wireless communication over point-to-point or ad hoc networks or both. The communication devices 140 can communicate across the electromagnetic spectrum (including radio and optical communications) or via other media (e.g., air and acoustic media).
6. One or more communication interfaces 140 (e.g., wired, wireless, WiMAX, Wi-Fi, Bluetooth, satellite, cellular, optical, near-field, or radio, or combinations of them) for transmitting data from remotely located databases 134 to the AV system 120, for transmitting sensor data or data related to driving performance to remotely located databases 134, or for transmitting information related to teleoperation, or combinations of them.
7. One or more data sources for providing historical, real-time, or predicted information about the environment 190, or combinations of any two or more of them, including, for example, maps, driving performance, traffic congestion updates, or weather conditions. Such data can be stored on a data storage unit 142 or memory 144 on the AV 100, or can be transmitted to the AV 100 via a communication channel from a remote database 134, or a combination of the two.
8. One or more data sources 136 for providing digital road map data drawn from GIS databases, potentially including one or more of the following: high-precision maps of the roadway's geometric properties; maps describing road-network connectivity properties; maps describing roadway physical properties (such as traffic speed, traffic volume, the number of vehicular and bicycle traffic lanes, lane width, lane traffic direction, or lane marker type and location, or combinations of them); and maps describing the spatial locations of road features (such as crosswalks, traffic signs, or other travel signals of various types). Such data can be stored on the memory 144 on the AV 100, or transmitted to the AV 100 via a communication channel from a remotely located database server, or a combination of the two.
9. One or more data sources 136 or sensors 132 for providing historical information about driving attributes (e.g., speed and acceleration profiles) of vehicles that have previously traveled along local road sections, for example, at similar times of day. Such data can be stored on the memory 144 on the AV 100, or transmitted to the AV 100 via a communication channel from a remotely located database 134, or a combination of the two.
10. One or more computing devices 146 and 148 located on the AV 100 (or located remotely, or both) for executing algorithms that generate control actions online based on both real-time sensor data and prior information, thereby allowing the AV system 120 to execute its autonomous driving capabilities.
11. One or more of the following processes: processing sensor data; perceiving the environment by understanding the conditions that the perceived environment currently presents and may present in the future; performing trajectory planning; performing motion control; and making decisions based on those perceptions and understandings. The processes can be implemented by integrated circuits, field-programmable gate arrays, hardware, software, or firmware, or combinations of two or more of them.
12. One or more interface devices 150 (for example, displays, mice, trackballs, keyboards, touch screens, speakers, biometric readers, and gesture readers) coupled to the computing devices 146 and 148, for providing information and alerts to a user of the AV 100 (for example, an occupant or a remote user) and for receiving input from the user of the AV 100. The coupling may be wireless or wired. Any two or more of the interface devices may be integrated into a single device.
Other features and components may also be integrated into the AV system 120.
Augmented reality in vehicle interfacing
The technologies described here have a wide variety of applications in vehicle interfacing. Applications can be located inside or outside the vehicle, and can be useful to drivers, passengers, people hailing vehicles, pedestrians, and others. We describe some applications below as examples.
Requesting transportation services
In some applications, such as ride-sharing services, autonomous and other vehicles can provide transportation services to users who do not own vehicles. In such examples, technology is provided that enables a user to request or hail a vehicle. In some cases, the user requests transportation services or hails a vehicle using a user interface presented on a mobile device on which, for example, a native application or a Web browser is running. Some aspects of such a user interface may include augmented reality features.
For example, with reference to Figs. 2A-2C, a user 202 may request a vehicle ride in an AV 100 operating as part of an AV transportation service system 200 (Fig. 8). The user requests the vehicle ride using a mobile device or a display 203 on another human interface device 150 used to communicate with the AV system 120. For example, the user interface 201 may be part of a mobile phone application (or web page) displayed on a smartphone serving as the human interface device 150, and operated as part of the AV hailing system 200.
The user 202 can interact with the AV transportation service system in various ways; for example, a ride request can be made based on the user's gestures, voice, or typing, a controller, or a 2D/3D user interface (such as 2D/3D augmented reality). In the example shown in the figures, the user 202 uses a sequence of user interface screens beginning with a start screen 201 launched from a smartphone app. Such screens have an upper portion 204, which may include a menu access icon 205, status updates 207, and other header information. Example screens, including the start screen 201, also have a lower portion 206, which displays information such as a map 208 of the user's 202 location, and augmented reality elements such as enhancement features 220 (not shown in Figs. 2A through 2C), which are discussed in more detail below.
Login page
Referring to Fig. 2A, the screen illustrated shows the initial login page or start screen 201 of the app or website, on which the user 202 is shown: a map 208 of the vicinity of his or her current location, with an icon 209 showing his or her current location with high accuracy (for example, within 1 meter, or within 10 centimeters), optionally other relevant information including the nearby road network and geophysical features, and icons 213, 215, 217 showing the positions of nearby AVs 100 that can be hailed. The upper portion 204 of the screen also includes information such as a prompt box 210, which, in the case of the start screen 201, invites the user 202 to select a destination. The user 202 can type or otherwise enter a destination, or a suggestion screen 211 (Fig. 2B) can display icons 19 of suggested nearby destinations, with the prompt box 210 displaying a request for the user to select one of the displayed potential destinations. The suggestion screen 211 can also include additional information 207 in the lower portion 206 of the screen, in this case information about one of the illustrated selectable destinations. In Fig. 2C, a destination 199 has been selected, its address 222 is displayed, and one pickup position 214 among those available to the vehicle is highlighted. Fig. 2C also includes a button 223 that the user can invoke to set the pickup position, and a view of the currently highlighted pickup position showing its address and other information. A computing processor (for example, computing processors 146 and 148) may execute instructions, including instructions stored in the memory 144, for selecting and displaying the address 222.
Pickup position and trip details pages
Referring to Fig. 3 A-3J, user 202 may specify about his or her boarding position 214 further details and about To the details of the stroke that will be carried out of destination 199.Screen 231 of getting on the bus allows user 202 to select using the second prompting frame 212 His or her boarding position.The current location 237 of user 202 is shown on the map 208 on screen 231 getting on the bus, and is being started Compared on screen 201, screen 231 of getting on the bus be localized to the current position of user from it is his or her it is closer near.Map The possible substitution boarding position 214,216 (being indicated by number 1 and 2) of 208 displays, at these substitution boarding positions 214,216, AV 100 can be parked near the current location 237 of user, so that he or she enters the AV 100 stopped and proceeds to destination 199。
The AV transportation service system 200 uses and combines information from various sources, such as the data sources 136 and sensors 121, to select the two possible alternative pickup positions 214, 216 shown on the map 208. The AV transportation service system 200 analyzes this information in combination with the user's 202 current location (his or her general pickup position 222) to determine potential pickup positions 214, 216 that comply with safety rules, road restrictions, user preferences, and the like. As shown, the potential pickup positions 214, 216 are preferred coordinates at which the AV 100 will stop so that the user 202 can enter. In some examples, the AV transportation service system 200 can include, in its data storage unit 142 or memory 144, predetermined pickup points within an acceptable range or walking distance of the user's location (for example, within a 2-minute walk, or within 250 meters). These predetermined pickup points can be parking spots or curb points known to be safe locations, at which the AV 100 can park while waiting for the user to enter.
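The pickup-point selection described above — filtering a stored set of predetermined safe pickup points down to those within walking distance of the user — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the point format and the 250-meter threshold are assumptions taken from the example in the text.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two latitude/longitude points.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def candidate_pickup_points(user_loc, predefined_points, max_walk_m=250.0):
    """Return the predetermined pickup points within walking distance,
    nearest first, so the UI can label them 1, 2, ... on the map."""
    lat, lon = user_loc
    measured = [(haversine_m(lat, lon, p["lat"], p["lon"]), p)
                for p in predefined_points]
    return [p for d, p in sorted(measured, key=lambda t: t[0])
            if d <= max_walk_m]
```

A real system would additionally filter by the safety rules, road restrictions, and user preferences the text mentions before ranking by distance.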
In the pickup screen 231, two potential pickup positions 214, 216 are shown, but a single potential pickup position, or more than two potential pickup positions, are also possible. In the pickup screen 231, the first potential pickup position 214 is highlighted for the case where the vehicle approaches from the left of the user's current location, and the second potential pickup position 216 is highlighted for the case where the vehicle approaches from the right of the user's current location.
The pickup screen 231 has additional information 207 in the lower portion 206 of the screen. The additional information 207 displays elements that do not actually exist in the real world near the user's current location, for example augmented reality (AR) elements. In this example, the additional information 207 shows a camera view 218 of the user's general current location. In some examples, the view is derived from the camera of a smartphone held by the user, or alternatively the view can be a saved conventional view (for example, a street view from a service such as Google Maps). In the camera view 218, the first potential pickup position 214 is shown together with an enhancement feature 220, that is, an object that does not exist in the user's real-world environment but is superimposed on the camera view 218 so as to appear to belong to the real world as seen in the camera view. In this example, the enhancement feature 220 is an image of an AV 100 located at the first potential pickup position 214. The user can assess the relationship of the augmented reality element (the AV) to the potential pickup position to determine whether the displayed potential pickup position 214 is desirable (for example, whether it is clear of an overhang if it is raining, or clear of obstacles or obstructions temporarily located in the potential pickup position 214). The user 202 selects the desired pickup position, here the first potential pickup position 214.
The AV hailing system 200 may include multiple processes implemented in integrated circuits, field programmable gate arrays, hardware, software, or firmware, or combinations of two or more of these. Processing and analysis of signals and data can be accomplished by a processor or the computing resources of a processor. The processor can communicate with the user through an interface and a display that shows screens such as the pickup screen 231. The computing device can be implemented in many different forms; for example, it can be implemented as part of a smartphone 682.
Referring to Fig. 3 B, user 202 can continue in itinerary screen 241 specify about his or her boarding position details with And the details about the stroke to destination 199.User 202 selected general boarding position 222, for example, address or four crossway A point on mouth or map, as shown in Figure 2 B, to select destination.General boarding position 222 appears in the second prompting frame In 212.Itinerary screen 241 includes user's details prompt 224, and in user's details prompt 224, user 202 informs AV traffic Details of the service system 200 about user, for example, the quantity of adult travelers, whether will have children in stroke and be expert at Whether there will be luggage in journey.Allow AV transport services by the information that AV traffic service system 200 is given in user's details prompt 224 System 200 selects vehicle appropriate, prompts those of 224 inputs to match in user's details with by user for example, having Vehicle near space requirement.
A trip display screen 251 in Fig. 3C shows the user 202 the planned route from the general pickup position 222 to the destination 199 on the map 208. The additional information 207 includes the estimated trip distance and time. If the user agrees to the planned route, he or she presses the "request ride" button 227. The AV transportation service system 200 then matches the request (for example, identifying whether vehicles are near the user, and whether they match the criteria included in the user details prompt 224 and the requested vehicle type from the vehicle details prompt 226). Referring to Fig. 3D, at a vehicle selection screen 261, the user 202 can select a desired vehicle type for the trip to be taken in the vehicle details prompt 226. In some examples, a certain type of vehicle will not be displayed to the user 202, or the user 202 will not be able to select it, for example if that type of vehicle is unavailable, or if it does not match the information entered at the user details prompt 224 because it lacks the necessary space for the travelers or luggage, or lacks a car seat available for a traveling child.
Referring also to Fig. 9, the AV hailing system 200 may include an AV matcher 700 (for example, an optimization algorithm) for determining which AV (if any) matches the request, by matching the user's information included in the user details prompt 224 and vehicle details prompt 226 against AV fleet information 720 (for example, whether each vehicle is in use, its current trajectory, data from sensors, or combinations thereof), thereby determining which AV in the AV fleet is suited to the user 202. The AV transportation service system 200 identifies the most suitable particular vehicle and displays vehicle-specific information 228 (Fig. 3E) to the user on a selected vehicle screen 271. The vehicle-specific information 228 may include a picture of the particular vehicle, the license plate of the particular vehicle, and the name of the particular vehicle. The vehicle-specific information 228 can be visual or auditory, or both. The vehicle-specific information 228 can also be transmitted to a remote receiver or computing device 730, such as a remote operator or another autonomous system.
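The matching step performed by the AV matcher 700 can be sketched as a filter over the fleet followed by a ranking. This is a hypothetical illustration under assumed field names; the patent does not specify the optimization criterion, so arrival time is used here as a plausible tiebreaker.

```python
def match_av(request, fleet):
    """Pick an available AV satisfying the rider's stated needs, or None.

    request: {"seats", "child_seat", "luggage", "vehicle_type"}
    fleet entries: {"id", "in_use", "seats", "child_seat",
                    "luggage_space", "vehicle_type", "eta_minutes"}
    (field names are illustrative assumptions)
    """
    def eligible(av):
        # Mirror the user details prompt: capacity, child seat, luggage, type.
        return (not av["in_use"]
                and av["seats"] >= request["seats"]
                and (av["child_seat"] or not request["child_seat"])
                and (av["luggage_space"] or not request["luggage"])
                and av["vehicle_type"] == request["vehicle_type"])

    candidates = [av for av in fleet if eligible(av)]
    # Among eligible vehicles, prefer the one that can arrive soonest.
    return min(candidates, key=lambda av: av["eta_minutes"]) if candidates else None
```

Vehicles that fail the eligibility test are exactly those the vehicle selection screen 261 would hide from the user.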
As shown in Fig. 3 F and Fig. 3 G with screen 281 and 291, map 208 can show approaching for AV vehicle 100 Details, such as, the current track 198 in the way of current location 232 and the selected boarding position 214 of arrival.Screen can It shows and reaches information 234, such as, until vehicle reaches the remaining time of boarding position 214.
In Fig. 3 G, user 202 can continue to specify at vehicle customization screen 281 about the stroke to destination 199 Details.Vehicle particular hint 230 allows user to customize his or her seating, including selection music (for example, radio station or music Service, if had in selected vehicle one of available) or vehicle interior temperature.Vehicle particular hint 230 will show for difference to Determine vehicle and its feature different selections and information (for example, the control of (multiple) seat heater, window upward or downward, seat Chair is forward or backward etc.).
In Fig. 3 H, AV vehicle 100 arrived and be located at boarding position 214.Enhanced feature 220 is shown on selected The automobile of vehicle point 214, the actual view seen now with such as user match, thus reach screen 311 have should with it is existing The corresponding Enhanced feature 220 of reality.Screen 281,291 can also indicates to occupant on how to reaching boarding position 214, and It reaches screen 311 and informs user (Fig. 3 H) when AV vehicle 100 reaches.
Referring to Fig. 3 I, when user 202 is inside the vehicle and in the way for going to destination 199, in interface equipment 150 Upper display itinerary screen 321.Itinerary screen 321 shows the current location 232 of AV 100 on map 208.During the trip, it uses Family 202 can be interacted with the man-machine interface on the used interface equipment 150 (for example, smart phone) of user.
In Fig. 3 J, user 202 arrived his or her destination, and interface equipment 150 is requested about the anti-of stroke Feedback.AV traffic service system 200 can store the information (for example, being stored in memory 144) of user to recommend following AV pre- It orders or takes.AV traffic service system 200 also records seating, for example, being stayed in unintentionally with more easily tracking after user leaves Any article in vehicle.
In-vehicle augmented reality experience
Fig. 4 A shows the user 202 being sitting in inside AV in 104 back seat.Vehicular display device 331 can be located at (for example, Be installed on) in car-mounted display equipment 341 inside AV 100 and when user is sitting in inside in its visual field (for example, figure Shown in like that close to passenger seat behind, towards back seat occupant).Various technologies can be used to control Vehicular display device 331, for example, gesture, voice, typewriting, controller, such as 2D/3D augmented reality interface etc 2D/3D user interface or it Combination, and can for example be shown on tablet computer, or project on the window or windshield 364 of AV 100.Such as Fruit, which does not show, to be projected on windshield 364, then as shown, showing the environment 190 of AV 100 through windshield 364 Normal real world.
Information about the trip in progress may be displayed on the interface device 150 and/or the vehicle display 331, including the results of safety checks (for example, whether seat belts are fastened, whether the children listed for the trip are secured in child seats, whether the doors are locked). In some examples, the user 202 can use their smartphone's camera to conduct a visual inspection for the safety check; for example, the user can take a photograph of a child sitting in a legally required safety seat and upload the image to the AV transportation service system 200. In some examples, the various sensors 121 in the vehicle, or the cameras 122 in the vehicle, can record safety checks (for example, a seat belt sensor, or an in-vehicle camera that automatically takes and uploads a photograph of a child secured in a legally required safety seat).
Referring to Figs. 4B and 4C, the onboard display device 341 (and/or the interface device 150) of Fig. 4A can display a three-dimensional perspective view of the vicinity of the vehicle 345 in the direction of travel, including the road 347 currently being traveled, a perspective view of the intersection 343, a road 349, the vehicle's current location 232 in the direction of travel, objects 360 in the driving environment, and the approaching portion of the trajectory 198 toward the destination 199. The view can be a schematic display 358 or a pictorial display 368, or both. The objects 360 depicted in the view can be objects detected in real time via the sensors 121 (including the cameras 122) that form part of the AV system 120.
Fig. 4 B and Fig. 4 C show two examples of Vehicular display device 341;Around the movement of AV 100 and AV 100 When object (such as pedestrian or other vehicles) is mobile, Vehicular display device 341 is continuous in real time to be updated to show about working as forward pass The information of the object identified in sensor data.
This depiction, in the schematic display 358 and the pictorial display 368, can provide comfort to a user riding in the vehicle who may be uneasy about having no human operator. The depiction can also provide information and guidance about the vehicle's operation. The cameras 122 sense 360 degrees around the vehicle, so the view can present information from the full 360° range. Trip information 370 is also depicted, including the current position, the destination, the remaining time and distance in the trip, the current time, and so forth.
In the schematic display 358, the objects 360 are shown using points, dashes, or other abstract representations of the information received by the lidar or radar sensors. As the vehicle proceeds along its trajectory, the array of these points, dashes, or other abstract representations changes continuously to reflect the changing data received from the lidar or radar sensors.
The pictorial display 368 of the onboard display 341 includes a live video camera presentation 369 of the real-world scene in the vehicle's direction of travel. The pictorial display 368 includes enhancement features 220. In the example figure, the enhancement features 220 are red boxes or labels 362 that mark features or objects 360 in the environment around the vehicle. The enhancement features 220 are overlaid on true data; for example, the labels 362 are overlaid on visual data from one of the cameras 122 mounted on the AV 100. The onboard display 341 thus shows augmented reality, that is, a camera view supplemented by enhancement features that identify one or more objects shown in the camera view. In some examples, the appearance of the enhancement features 220 can vary according to the type of object identified by the AV system 120. For example, the labels 362 can have colors, shapes, outlines, or other identifying marks that distinguish different categories of objects, for example pedestrians, vehicles, or traffic control features such as traffic cones or traffic lights.
A key function of the labels 362 is to show an observer which objects in the environment, especially moving objects, the AV system has identified while the vehicle is operating. By implying that the AV system's sensors can identify objects that could pose a hazard to the vehicle, displaying this identification of objects, including moving objects, helps make occupants comfortable.
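The per-category styling of the labels 362 described above can be sketched as a simple lookup that assigns each detection a color and marker shape before it is drawn over the camera view. The category names and style values below are illustrative assumptions, not taken from the patent.

```python
# Style table: object category -> overlay color and marker shape, so an
# observer can tell at a glance what class the AV system assigned.
LABEL_STYLES = {
    "pedestrian":    {"color": "yellow", "shape": "outline"},
    "vehicle":       {"color": "red",    "shape": "box"},
    "cone":          {"color": "orange", "shape": "box"},
    "traffic_light": {"color": "green",  "shape": "circle"},
}

def style_detections(detections):
    """Attach a display style to each detection dict; unknown classes
    fall back to a neutral default so they are still rendered."""
    default = {"color": "white", "shape": "box"}
    return [dict(d, **LABEL_STYLES.get(d["category"], default))
            for d in detections]
```

A rendering layer would then draw each styled detection at its image coordinates on top of the live camera frame.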
In addition to this depiction of the objects detected by the sensors 121 providing comfort to passengers riding in the vehicle, the enhancement features 220 can be used by vehicle operators (in the vehicle or remote) and by system operators such as safety engineers. A system operator viewing the onboard display 341 can analyze and evaluate the effectiveness of the AV system's ability to detect and identify the objects detected by the sensors 121. The system operator can also view the onboard view from the cameras in the AV 100.
In general, operating an autonomous system includes planning its motion. A trajectory can be associated with multiple motion actions (for example, accelerating, maintaining speed, decelerating, changing orientation, stopping, obeying traffic signals, and avoiding collisions with objects) that the autonomous system will execute in the driving environment to realize the trajectory. Some motion actions can be executed in parallel (for example, changing orientation and decelerating), and some motion actions can be executed serially (for example, accelerating and then decelerating). For example, the operation of the autonomous system 120 along a trajectory 198 may include the following: accelerating at the start, decelerating and turning right at a given position, maintaining a slow speed, turning left at a second position when the traffic signal permits a left turn, accelerating, decelerating, and stopping at a target position. An implementation of a motion planner may include a trajectory identifier. The trajectory identifier can analyze a map in order to navigate the autonomous system from an initial position to a target position. The map can show non-drivable regions and other vehicles on the road. To identify possible trajectories, the trajectory identifier can begin by sampling the map. It then removes samples that lie in non-drivable regions or are blocked by objects (for example, vehicles). Based on the remaining sample points, the trajectory identifier can identify multiple candidate trajectories. A safety engineer developing this motion planning can be assisted by the AR features (enhancement features 220) included in the map. Such a safety engineer may be present in the AV 100 with access to the vehicle's system displays, or may be located remote from the AV 100.
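The sample-then-prune step of the trajectory identifier described above can be sketched on a toy occupancy grid: sample map cells, discard samples in non-drivable or blocked cells, and build candidate trajectories from the survivors. This is a deliberately minimal sketch under assumed data structures (a 2D grid where 1 means non-drivable or occupied), not the patent's planner.

```python
import random

def sample_waypoints(grid, n, seed=0):
    """Sample map cells uniformly, then discard those in non-drivable
    regions or occupied by objects (grid value 1)."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    pts = [(rng.randrange(rows), rng.randrange(cols)) for _ in range(n)]
    return [p for p in pts if grid[p[0]][p[1]] == 0]

def candidate_trajectories(grid, start, goal, n=50, seed=0):
    """Build simple start -> waypoint -> goal candidates from the
    surviving samples; a real planner would chain many waypoints and
    check edge feasibility between them."""
    return [[start, wp, goal] for wp in sample_waypoints(grid, n, seed)]
```

Each candidate would then be scored (length, comfort, clearance) and the best one passed to motion control.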
In addition, a risk monitoring process can identify risks by monitoring the environment near the AV, the operation of the AV system, or the interior of the AV, or combinations thereof. For example, analyzing signals from the sensors 121 (for example, vision sensors, lidar, or radar, or combinations thereof) can produce information about other objects in the environment (for example, vehicles, infrastructure, and pedestrians); examples of such information include position, speed, orientation, boundary, size, dimensions, traffic lights, manufacturer, license plate number, vehicle owner, driver, and vehicle operation. This information can be analyzed, for example, to predict potential collisions or to detect existing collisions. Analyzing maps from a database and/or images from vision sensors can further distinguish foreground from background. For example, the map used by the AV system 120 can encode information about the height profile of the road surface. This information can be used to identify background regions and/or foreground objects by analyzing depth information acquired from vision sensors (for example, stereo cameras) and applying segmentation to classify points as belonging to the road surface. A safety engineer developing risk management algorithms can be assisted by the AR features (enhancement features 220) included in the map. For example, the system may include enhancement features 220 on a real-world view of a road surface image, segmenting its points into those belonging to the background and those belonging to the foreground. A safety engineer inspecting such an image can easily classify the points, and reclassify them if they were misidentified, thereby improving the accuracy of the detection algorithm.
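The foreground/background separation described above — comparing sensed point heights against the map's road height profile — can be sketched as follows. The interface is a hypothetical simplification: `road_height` stands in for the map's encoded height profile, and the tolerance value is an assumption.

```python
def segment_points(points, road_height, tolerance=0.15):
    """Classify 3D points as road-surface background or foreground objects
    by comparing each measured height z against the map's road height
    profile at (x, y). Points within `tolerance` meters of the road
    surface are treated as background."""
    background, foreground = [], []
    for x, y, z in points:
        target = background if abs(z - road_height(x, y)) <= tolerance else foreground
        target.append((x, y, z))
    return background, foreground
```

In the workflow described in the text, misclassified points flagged by a safety engineer via the AR overlay would feed back into retuning such a classifier.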
Referring to Figs. 4D and 4E, an AR screen 262 can be part of an onboard display visible on the windshield 364 of the AV 100. In this case, the windshield 364 is mainly a traditional see-through surface, and an actual vehicle 193 can be seen through most of the windshield 364. In addition, the AR screen 262 occupies a projection area at the bottom of the windshield 364; for example, the instruments 366 of the AV 100 project information and enhancement features 220 onto the AR screen 262. In some examples, the AR screen 262 is at least partly see-through. Information, such as the trip information 370, can be displayed on the AR screen 262. The AR screen 262 can also display enhancement features 220; in this case, an enhancement feature 220 is a label or tag indicating that a vehicle 193 (or other object) has been detected, via the sensors 121 of the AV system 120, at a position near or above the position of the label, and has been identified or classified as a vehicle. The enhancement features 220 can be different for each type or category of object identified. For example, the labels or tags can have different colors or different shapes to identify a pedestrian, another vehicle, a traffic cone, and so on. In some examples, the bottom of the windshield 364 can be coated with a coating that allows information to be displayed on it, in place of (or in addition to) a separate physical screen; for example, the coating can permit the information of the AR screen 262 to be projected onto the glass of the windshield 364.
The user interface information presented on the onboard display screen 331 or on the AR screen 262 can be shared with other devices. For example, the information can be shared to a screen on a smartphone (for example, the smartphone of a waiting family member), or it can be shared to a screen observed by a teleoperator or fleet manager, as discussed below.
In some examples, the onboard display 341 can show virtual scenery, such as a natural scene. Such virtual scenery can allow a passenger in the AV 100 to experience a virtual environment (since his or her attention is not required for vehicle operation), in place of the enhancement features 220 shown in Fig. 4D that indicate the position and operation of the AV system 100. The virtual scenery may be displayed on the windows and/or windshield, so an occupant can immerse himself or herself in a virtual environment (for example, virtual reality) within the AV 100. The vehicle may thus appear to be passing through different (tranquil or exotic) scenery rather than the actual environment the vehicle is traversing (for example, an urban landscape).
In some examples, a virtual barrier 381 is located between the seats inside the AV 104 or between vehicle displays 331. The virtual barrier 381 ensures that one user 202 cannot see the screen of another vehicle display 331 (which can be an AR headset 380). If there are multiple users in the vehicle (for example, carpoolers), a first passenger wearing AR glasses can have his or her own immersive environment, separate and private from the AR environment of a second, accompanying passenger. The virtual barrier 381 may be advantageous where multiple users 202 share the same AV 100. In some examples, the virtual barrier 381 can be a separate screen that does not allow a viewer to see what the screen displays beyond, for example, a predetermined angle relative to the screen. In other cases, the virtual barrier 381 can be a physical divider.
The various screen displays discussed above are to be considered as examples, and other information, details, and inputs are also possible and within the scope of this disclosure.
Fig. 5 A shows augmented reality (AR) the wear-type ear that can be used as a part operation of AV traffic service system 200 The figure of machine 380.AR headphone may include processor 302.Processor receives data from AV system 120 and returns data to AV system 120, and data are received from tracking cell 310.380 headphone of AR is shown in by the data that processor 302 is handled Present on screen, which is a pair of of lens (lens) 304, for rendering the visualization of image.Tracking cell is via more The movement of a sensor tracking user's head, these sensors determine the information of such as user's orientation coordinate etc, and can be with Including three-axis gyroscope, three axis accelerometer, three axle magnetometer (or compass) and detection sensing environment light and object and biography The sensor of the degree of approach of sensor.These data may include Angle Position (yaw/pitching/rolling), velocity and acceleration and line The property position (x/y/z), velocity and acceleration data.These data are fed to processor 302, and processor 302 generates real-time wash with watercolours Contaminate image.Then, the image of rendering is shown on screen or lens 304, to generate stereoscopic vision impression.AR headphone 380 can also include be able to carry out video record one or more cameras 120 and wireless transmitter and receiver (for example, Wi-Fi and/or bluetooth).Such AR headphone 380 detects the real world environments of user 202, the orientation including user The rendering for the content seen in his or her environment with user.
Fig. 5 B shows birds-eye perspective, and Fig. 5 C-E shows the user 202 for wearing augmented reality (AR) headphone 380 3D schematic diagram, augmented reality (AR) headphone 380 can be used as AV transmission service system 200 a part grasped Make.AR headphone 380 can be by user 202) wear goggles or glasses, it illustrates Enhanced features 220.It is this AR headphone 380 may include the optics head-mounted display for being designed to a pair of glasses shape.
In the example shown in the figures, the enhancement features 220 are an overlay of two or more potential pickup positions, including the potential pickup positions 214, 216, and the suggested route or trajectory 198 along which the AV 100 called by the user 202 is being directed. The AR headset 380 adds these features to the display (for example, makes them appear on the glasses of the AR headset 380) so that they appear to be objects in the real-world environment. Thus the user 202, viewing through the AR headset 380, sees real-world features, such as buildings or pedestrians 192, in the headset's 380 field of view 385, and also sees the enhancement features 220 displayed on the AR headset 380, as if the enhancement features 220 were part of the real-world environment.
When the user changes his or her field of view 385 to include different sight lines and objects, the AR headset 380 can adapt the displayed enhancement features 220 and the view of the displayed enhancement features. The gyroscope, accelerometer, and magnetometer in the AR headset 380 detect the new orientation, and the camera detects the new view visible to the AR headset 380. Different objects (for example, different pedestrians 192) can be seen through the headset 380. At the same time, different enhancement features 220 can also be seen through the headset 380. A processor associated with the AR headset 380 generates the different views of the enhancement features and embeds the AR features in real time into the real-world view. In this example, the enhancement feature 220 seen through the headset 380 is the portion of the trajectory 198 to be traversed by the called AV 100. When the user 202 changes his or her orientation, for example to the left, different portions of the real-world environment and of the enhancement features 220 become visible. The system is able to detect the actual view of the user's current environment and project accurate enhancement features 220 onto the glasses, for simultaneous viewing appropriate to the given field of view 385 at the given time.
As seen in Fig. 5 E, Enhanced feature 220 can be used for headphone 380.Camera, the top of AR headphone 380 The real world-view and orientation of spiral shell instrument, accelerometer and magnetometer detection headphone 380.User 202 is seen by glasses It sees and sees the real world visual field, and also seem to see the part for the track 198 being present in present viewing field 385.The increasing Strong feature 220 is shown on the glasses of AR headphone 380 itself, so as in the visual field 385 of given time covering user 202 Interior real-world feature.
In some examples, the enhanced features 220 allow the user to distinguish the assigned vehicle from any other vehicles that may be in the same area. In Fig. 5E, two enhanced features 220 are visible: the trajectory 198 and the AV marker 387. The AV marker 387 is a distinctive feature (for example, a color, a flashing light, a spotlight, an arrow, etc.) shown on the AR headset 380 that identifies a specific AV 100. The AV marker 387 is particularly useful when multiple AVs 100 are present in the area; when the user 202 looks through the AR headset 380, the AV marker 387 identifies the particular vehicle that has been assigned to that user 202.
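A minimal sketch of attaching an AV marker 387 only to the vehicle assigned to the viewing user follows; the data layout, function name, and marker styles are hypothetical illustrations, not details from the patent.

```python
def label_assigned_av(visible_vehicles, assigned_av_id):
    """Given vehicle detections in the headset's field of view, return
    render instructions: only the vehicle assigned to this user gets a
    distinctive AV marker; other vehicles in view get nothing.

    visible_vehicles: list of dicts with 'id' and 'screen_pos' keys.
    """
    instructions = []
    for v in visible_vehicles:
        if v["id"] == assigned_av_id:
            instructions.append({
                "screen_pos": v["screen_pos"],
                "marker": "arrow",      # could equally be a color,
                "highlight": True,      # flashing light, spotlight, ...
            })
    return instructions

# Three AVs are in view, but only AV-2 was dispatched for this user,
# so only AV-2 receives a marker overlay.
in_view = [{"id": "AV-1", "screen_pos": (200, 300)},
           {"id": "AV-2", "screen_pos": (640, 310)},
           {"id": "AV-3", "screen_pos": (900, 295)}]
marks = label_assigned_av(in_view, "AV-2")
```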
Teleoperation
In some cases, for example when an AV is traveling on a road and experiences an event such as a system failure, extreme weather conditions, or a temporary detour, it may be useful to allow remote personnel to assist in the operation of the AV. A teleoperation system (which can be remote from or local to the AV or the AV system, or a combination of remote and local) enables a teleoperator to interact with the AV system over a communication channel (for example, issuing commands, visualizing driving conditions, and investigating the functioning of hardware components or software processes). These interactions can help the AV system respond adequately to various events. The use of a teleoperation system with an AV system is described in U.S. Patent Application No. 15/624,780, filed on June 16, 2017, entitled "Intervention in Operation of a Vehicle Having Autonomous Driving Capabilities," the contents of which are incorporated herein in their entirety.
Fig. 6A shows the architecture of a teleoperation system. The teleoperation system 490 may include several elements, including a teleoperation client 401 (for example, hardware, software, firmware, or a combination of two or more of them), typically installed on the AV 100 of the AV system 120. The teleoperation client 401 can interact with components of the AV system 120 (for example, sensors 121, communication devices 140, database 412, user interface devices, memory 140, or functional devices, or combinations of them), for example to send and receive information and commands. The teleoperation client 401 can communicate with a teleoperation server 410 through the communication interface 140 (which can be at least partly wireless).
The teleoperation system 490 has a teleoperation server 410, which can be located in the AV 100 or at a remote location, for example, at least 0.1, 1, 2, 3, 4, 5, 10, 20, 30, 40, 50, 100, 200, 300, 400, 500, 600, 700, 800, 900, or 1000 meters away from the AV 100. The teleoperation server 410 communicates with the teleoperation client 401 using the communication interface 140. In some embodiments, the teleoperation server 410 can communicate with multiple teleoperation clients simultaneously; for example, the teleoperation server 410 communicates with another teleoperation client 451 that is part of another AV system of another AV. The client 401 can communicate with one or more data sources 420 (for example, a central server 422, a remote sensor 424, and a remote database 426, or combinations of them) to collect data (for example, road networks, maps, weather, and traffic) for implementing autonomous driving capabilities. The teleoperation server 410 can also communicate with the remote data sources 420 for teleoperation of the AV system 120.
A teleoperation event can be triggered when one or more components of the AV system 120 (Fig. 1) are in an abnormal or unexpected condition (for example, experiencing a failure or producing anomalous output). Examples include: a brake failure; a flat tire; the field of view of a vision sensor being blocked; the frame rate of a vision sensor falling below a threshold; a mismatch between the motion of the AV system and the current steering angle, throttle level, brake level, or combinations of them; a software fault; reduced signal strength; increased noise levels; an unknown object perceived in the environment of the AV system; a motion planning process unable to find a trajectory toward the goal because of a planning error; a data source (for example, a database, a sensor, or a map data source) becoming inaccessible; or combinations of them. A teleoperation event can also be triggered by an occurrence or a request. Examples include: a detour, a protest, a fire, an accident, a flood, fallen trees or rocks, road conditions, a police request, a request by an occupant (for example, a passenger dislikes the driving behavior of the AV system), a request by a user of the AV (for example, a package sender using the AV system to transport a package wants to change to a new trajectory or destination), or initiation by a teleoperator, or combinations of them.
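The triggering conditions above amount to evaluating a set of health rules against a snapshot of system state. A minimal sketch, in which the field names and thresholds are assumptions made for illustration rather than values from the patent:

```python
def should_trigger_teleop_event(status):
    """Return the list of reasons a teleoperation event should be
    raised, based on a snapshot dict of AV system health. An empty
    list means no event is triggered.
    """
    reasons = []
    if status.get("brake_fault"):
        reasons.append("brake failure")
    if status.get("camera_fps", 30.0) < 10.0:   # frame rate below threshold
        reasons.append("vision sensor frame rate below threshold")
    if status.get("unknown_object_detected"):
        reasons.append("unknown object perceived in environment")
    if status.get("planner_found_trajectory", True) is False:
        reasons.append("motion planner cannot find trajectory to goal")
    if status.get("occupant_request"):
        reasons.append("occupant requested intervention")
    return reasons

# Two conditions hold in this snapshot, so a teleoperation event
# would be triggered with two reasons attached.
snapshot = {"brake_fault": False, "camera_fps": 6.5,
            "unknown_object_detected": True}
reasons = should_trigger_teleop_event(snapshot)
```

A production system would likely evaluate such rules continuously and debounce transient conditions before escalating to a human teleoperator.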
The teleoperation system 490 also has a user interface 412, presented by the teleoperation server 410, through which a human teleoperator 414 participates in teleoperation of the AV system 120. In some cases, the interface 412 can render to the teleoperator 414 what the AV system 120 has perceived or is perceiving in the real world. Referring to Fig. 6B, an example teleoperation screen 500 can look similar to the vehicle display 341 (which is shown on the user interface device 150 in the AV 100) and can show a virtual map or pictorial display (Fig. 6C) depicting the vehicle's current position 232, objects 360 in the driving environment, and a portion of the trajectory 198. The rendering can be a schematic map and can be the same as or similar to the vehicle display 341, augmented with teleoperation features 502 that allow the human user 414 to interact with the teleoperation client 401.
The rendering on the teleoperation screen 500 may include enhanced features 220, such as the boxes 220 in Figs. 6B and 6C that highlight identified objects 360 in the surroundings of the AV 100. The enhanced features 220 can help the human user 414 decide whether or how to respond to a teleoperation event or a teleoperation request. In some embodiments, the teleoperation server 410 renders the environment of the AV system 120 to the teleoperator through the user interface, and the teleoperator can view that environment in order to select the best teleoperation. In some embodiments, the user interface that renders the environment of the AV system 120 to the teleoperator can be one screen or multiple screens. Multiple user interface screens can be joined and curved or warped so as to at least partly surround the teleoperator; in some examples, the teleoperator can be completely surrounded by the rendering of the environment of the AV system 120, creating an immersive environment for the teleoperator. This lets the teleoperator see everything happening in a full 360-degree field of view around the vehicle, including vehicles and objects behind and to the sides of the vehicle that would normally be outside the field of view of a person looking forward. A rendering that surrounds the teleoperator in this way can simulate the experience of a passenger actually riding in the AV 100. The rendering may include enhanced features 220 and teleoperation features 502. In some examples, the rendering may be coupled to an AR headset 380 worn by a user in the AV 100; when the passenger turns his or her head and thereby changes the field of view of the headset 380, the teleoperator's view can be updated along with the passenger's view.
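One small piece of the surround rendering, keeping the teleoperator's active screen in sync with the in-vehicle passenger's headset yaw, can be sketched as follows. The six-screen layout and the function name are assumptions for illustration, not details from the patent.

```python
def operator_view_sector(passenger_yaw_deg, num_screens=6):
    """Map the passenger's headset yaw (degrees, 0 = straight ahead)
    to the index of the surrounding teleoperator screen that covers
    that direction, so the operator's highlighted view tracks the
    passenger's head movement.
    """
    sector_width = 360.0 / num_screens     # e.g. six 60-degree screens
    yaw = passenger_yaw_deg % 360.0        # normalize, incl. negatives
    return int(yaw // sector_width)

# Passenger looks slightly over the right shoulder: screen 3 of 0-5.
active = operator_view_sector(185.0)
```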
The teleoperation server 410 can recommend possible teleoperations to the teleoperator through the interface, and the teleoperator 414 can select one or more of the recommended teleoperations and cause them to be sent to the AV system 120. In some examples, the teleoperator uses the interface to draw a recommended trajectory for the AV, and the AV continues driving along the recommended trajectory.
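The selection-and-dispatch step can be sketched as packaging the chosen recommendation into a message bound for the teleoperation client 401. The JSON schema and names here are a hypothetical illustration, not the patent's actual protocol.

```python
import json

def build_teleop_command(recommendations, chosen_index, operator_id):
    """Package the teleoperation selected by the human operator into a
    serialized message that could be sent to the teleoperation client
    on the AV over the communication interface.
    """
    chosen = recommendations[chosen_index]
    return json.dumps({
        "type": "teleop_command",
        "operator": operator_id,
        "action": chosen["action"],
        # e.g. a hand-drawn recommended trajectory as (x, y) waypoints
        "trajectory": chosen.get("trajectory", []),
    })

# The server recommends two options; the operator picks the drawn path.
recs = [
    {"action": "follow_drawn_trajectory",
     "trajectory": [(3.0, 0.0), (6.0, 1.5), (9.0, 1.5)]},
    {"action": "pull_over"},
]
msg = build_teleop_command(recs, 0, "op-414")
```

Serializing to JSON (tuples become arrays) keeps the command transport-agnostic; the client would deserialize and hand the waypoints to the motion planner.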
Technical components
Fig. 7 shows an example computing device 600 and an example mobile computing device 650, which can be used to implement the techniques we describe. For example, some or all of the operations of the AV system 150 can be performed by the computing device 600 and/or the mobile computing device 650. Computing device 600 is intended to represent various forms of digital computers, including, for example, laptop computers, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. Computing device 650 is intended to represent various forms of mobile devices, including, for example, personal digital assistants, mobile phones, smartphones, and other similar computing devices. The components shown here, their connections and relationships, and their functions are examples only, and are not meant to limit the embodiments of the technology described and/or claimed here.
Computing device 600 includes a processor 602, a memory 604, a storage device 606, a high-speed interface 608 connecting to the memory 604 and high-speed expansion ports 610, and a low-speed interface 612 connecting to a low-speed bus 614 and the storage device 606. Each of the components 602, 604, 606, 608, 610, and 612 is interconnected using various buses and may be mounted on a common motherboard or in other manners as appropriate. The processor 602 can process instructions for execution within the computing device 600, including instructions stored in the memory 604 or on the storage device 606, to display graphical data for a GUI on an external input/output device, including, for example, a display 616 coupled to the high-speed interface 608. In other embodiments, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 600 may be connected, with each device providing portions of the necessary operations (for example, as a server bank, a group of blade servers, or a multiprocessor system).
The memory 604 stores data within the computing device 600. In one embodiment, the memory 604 is a volatile memory unit or units. In another embodiment, the memory 604 is a non-volatile memory unit or units. The memory 604 may also be another form of computer-readable medium, including, for example, a magnetic or optical disk.
The storage device 606 can provide mass storage for the computing device 600. In one embodiment, the storage device 606 may be or contain a computer-readable medium, including, for example, a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations. A computer program product can be tangibly embodied in a data carrier. The computer program product may also contain instructions that, when executed, perform one or more methods, including those described above. The data carrier is a computer-readable or machine-readable medium, including, for example, the memory 604, the storage device 606, or memory on the processor 602.
The high-speed controller 608 manages bandwidth-intensive operations for the computing device 600, while the low-speed controller 612 manages lower bandwidth-intensive operations. This allocation of functions is an example only. In one embodiment, the high-speed controller 608 is coupled to the memory 604, the display 616 (for example, through a graphics processor or accelerator), and the high-speed expansion ports 610, which may accept various expansion cards (not shown). In this embodiment, the low-speed controller 612 is coupled to the storage device 606 and the low-speed expansion port 614. The low-speed expansion port, which may include various communication ports (for example, USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, including, for example, a keyboard, a pointing device, a scanner, or a networking device, including, for example, a switch or router (for example, through a network adapter).
The computing device 600 can be implemented in a number of different forms, as shown in the figure. For example, it can be implemented as a standard server 620, or multiple times in a group of such servers. It can also be implemented as part of a rack server system 624. Additionally or alternatively, it can be implemented in a personal computer (for example, a laptop computer 622). In some examples, components from the computing device 600 can be combined with other components in a mobile device (not shown), for example the device 650. Each such device may contain one or more of the computing devices 600, 650, and an entire system may be made up of multiple computing devices 600, 650 communicating with each other.
Computing device 650 includes a processor 652, a memory 664, and input/output devices including, for example, a display 654, a communication interface 666, and a transceiver 668, among other components. The device 650 may also be provided with a storage device, including, for example, a microdrive or other device, to provide additional storage. Each of the components 650, 652, 664, 654, 666, and 668 is interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
The processor 652 can execute instructions within the computing device 650, including instructions stored in the memory 664. The processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors. The processor may provide, for example, coordination of the other components of the device 650, including, for example, control of user interfaces, applications run by the device 650, and wireless communication by the device 650.
The processor 652 may communicate with a user through a control interface 658 and a display interface 656 coupled to a display 654. The display 654 may be, for example, a TFT LCD (thin-film-transistor liquid crystal display) or an OLED (organic light-emitting diode) display, or other appropriate display technology. The display interface 656 may include appropriate circuitry for driving the display 654 to present graphical and other data to a user. The control interface 658 may receive commands from a user and convert them for submission to the processor 652. In addition, an external interface 662 may communicate with the processor 652, so as to enable near-area communication of the device 650 with other devices. The external interface 662 may provide, for example, wired communication in some embodiments, or wireless communication in other embodiments, and multiple interfaces may also be used.
The memory 664 stores data within the computing device 650. The memory 664 can be implemented as one or more of a computer-readable medium, a volatile memory unit or units, or a non-volatile memory unit or units. Expansion memory 674 may also be provided and connected to the device 650 through an expansion interface 672, which may include, for example, a SIMM (single in-line memory module) card interface. Such expansion memory 674 may provide extra storage space for the device 650 and/or may store applications or other data for the device 650. Specifically, the expansion memory 674 may also include instructions to carry out or supplement the processes described above, and may include secure data. Thus, for example, the expansion memory 674 may be provided as a security module for the device 650, and may be programmed with instructions that permit secure use of the device 650. In addition, secure applications may be provided via the SIMM cards, along with additional data, including, for example, placing identifying data on the SIMM card in a non-hackable manner.
The memory may include, for example, flash memory and/or NVRAM memory, as discussed below. In one embodiment, a computer program product is tangibly embodied in a data carrier. The computer program product contains instructions that, when executed, perform one or more methods, including those described above. The data carrier is a computer-readable or machine-readable medium, including, for example, the memory 664, the expansion memory 674, and/or memory on the processor 652, which may be received, for example, over the transceiver 668 or the external interface 662.
The device 650 may communicate wirelessly through the communication interface 666, which may include digital signal processing circuitry where necessary. The communication interface 666 may provide for communications under various modes or protocols, including, for example, GSM (Global System for Mobile communications) voice calls, SMS (Short Message Service), EMS (Enhanced Messaging Service), or MMS (Multimedia Messaging Service) messaging, CDMA (code division multiple access), TDMA (time division multiple access), PDC (Personal Digital Cellular), WCDMA (wideband code division multiple access), CDMA2000, or GPRS (General Packet Radio Service), among others. Such communication may occur, for example, through the RF transceiver 668. In addition, short-range communication may occur, including using, for example, Bluetooth, Wi-Fi, or other such transceivers (not shown). In addition, a GPS (Global Positioning System) receiver module 670 may provide additional navigation- and location-related wireless data to the device 650, which may be used as appropriate by applications running on the device 650.
The device 650 may also communicate audibly using an audio codec 660, which may receive spoken data from a user and convert it to usable digital data. The audio codec 660 may likewise generate audible sound for a user, including, for example, through a speaker, for example in a handset of the device 650. Such sound may include sound from voice telephone calls, recorded sound (for example, voice messages, music files, and so forth), and sound generated by applications operating on the device 650.
The computing device 650 can be implemented in a number of different forms, as shown in the figure. For example, it can be implemented as a mobile phone 680. It can also be implemented as part of a smartphone 682, a personal digital assistant, or other similar mobile device.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include one or more computer programs that are executable and/or interpretable on a programmable system. This includes at least one programmable processor, which may be special-purpose or general-purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications, or code) include machine instructions for a programmable processor and can be implemented in a high-level procedural and/or object-oriented programming language and/or in assembly/machine language. As used here, the terms machine-readable medium and computer-readable medium refer to a computer program product, apparatus, and/or device (for example, magnetic disks, optical disks, memory, programmable logic devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having a display device (for example, a CRT (cathode-ray tube) or LCD (liquid crystal display) monitor) for presenting data to the user, and a keyboard and a pointing device (for example, a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can also be used to provide for interaction with a user. For example, feedback provided to the user can be a form of sensory feedback (for example, visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in acoustic, speech, or tactile form.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (for example, as a data server), or a middleware component (for example, an application server), or a front-end component (for example, a client computer having a user interface or web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (for example, a communication network). Examples of communication networks include a local area network (LAN), a wide area network (WAN), and the Internet.
The computing system can include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
In some implementations, the engines described here can be separated, combined, or incorporated into a single or combined engine. The engines depicted in the figures are not intended to limit the systems described here to the software architectures shown in the figures.
A number of embodiments of the invention have been described. For example, although the teleoperator has been described here as a human, the teleoperator's functions can be carried out partially or even wholly automatically.
Other embodiments are also within the scope of the appended claims.

Claims (30)

1. An apparatus, comprising:
a processor; and
storage for instructions executable by the processor to perform operations comprising:
in connection with a trip of a person in an autonomous vehicle, selecting a specific location at which the person will be met for the trip or a specific location at which the destination of the trip is located; and
presenting, through a user interface of a device, visual information depicting the specific location.
2. The apparatus of claim 1, wherein the specific location comprises a specific physical location on or near a road.
3. The apparatus of claim 1, wherein the specific location comprises a specific physical location identified before the person requested the trip.
4. The apparatus of claim 1, wherein the visual information comprises a real image of the specific physical location.
5. The apparatus of claim 1, wherein the visual information comprises a real image of the autonomous vehicle presented together with a real image of the specific physical location.
6. The apparatus of claim 1, wherein the specific location is not identified by a street address.
7. A mobile device, comprising:
a display;
a processor; and
an app or browser for the processor to present on the display a depiction of an actual specific location at which a person will be met by an autonomous vehicle for a trip, or of a specific location at which the destination of the trip is located, the actual specific location having been determined before the person made the request for the trip.
8. The mobile device of claim 7, wherein the specific location comprises a specific physical location on or near a road.
9. The mobile device of claim 7, wherein the specific location comprises a specific physical location identified before the person requested the trip.
10. The mobile device of claim 7, wherein the visual information comprises a real image of the specific physical location.
11. The mobile device of claim 7, wherein the visual information comprises a real image of the autonomous vehicle presented together with a real image of the specific physical location.
12. The mobile device of claim 7, wherein the specific location is not identified by a street address.
13. An apparatus, comprising:
a processor; and
storage for instructions executable by the processor to perform operations comprising:
receiving, through a user interface of a device, a signal from a user regarding a trip of the user in an autonomous vehicle, the signal indicating a feature of the autonomous vehicle that can be controlled in response to the signal from the user; and
determining an action to be taken by the autonomous vehicle so as to respond to the signal from the user by controlling the feature of the autonomous vehicle.
14. The apparatus of claim 13, wherein the feature of the autonomous vehicle that can be controlled in response to the signal from the user comprises a temperature in the autonomous vehicle.
15. The apparatus of claim 13, wherein the signal received from the user comprises a temperature in the autonomous vehicle.
16. The apparatus of claim 13, wherein the feature of the autonomous vehicle that can be controlled in response to the signal from the user comprises a passenger capacity of the vehicle.
17. The apparatus of claim 13, wherein the signal received from the user comprises a number of passengers for the trip.
18. The apparatus of claim 13, wherein the feature of the autonomous vehicle that can be controlled in response to the signal from the user comprises a state of an entertainment system inside the vehicle.
19. The apparatus of claim 13, wherein the state of the entertainment system comprises at least one type of entertainment content.
20. The apparatus of claim 13, wherein the state of the entertainment system comprises an identifier of a broadcast station.
21. The apparatus of claim 13, wherein the signal received from the user comprises an identification of a content source or content type for the entertainment system.
22. The apparatus of claim 13, wherein the feature of the autonomous vehicle that can be controlled in response to the signal from the user comprises acceptance of a child requiring a child seat.
23. The apparatus of claim 13, wherein the signal received from the user comprises an indication that a child will be present on the trip.
24. The apparatus of claim 13, wherein the feature of the autonomous vehicle that can be controlled in response to the signal from the user comprises acceptance of a package for the trip.
25. The apparatus of claim 13, wherein the instructions executable by the processor comprise receiving, from the user, a signal comprising information about a package for the trip.
26. The apparatus of claim 13, wherein the instructions executable by the processor are to select an autonomous vehicle from among two or more available autonomous vehicles based on the signal from the user.
27. The apparatus of claim 13, wherein the selection of the autonomous vehicle is based on at least one of: passenger capacity, package capacity, or availability of a child car seat.
28. The apparatus of claim 13, wherein the processor and the storage are part of a central AV system.
29. The apparatus of claim 13, wherein the instructions executable by the processor are operable to transmit the action to the autonomous vehicle.
30. A mobile device, comprising:
a display;
a processor; and
an app or browser for the processor to present on the display at least one option for controlling a feature of an autonomous vehicle for a trip, the feature comprising at least one of: passenger capacity, package capacity, availability of a child car seat, a temperature in the vehicle, or a feature of an entertainment system.
CN201910009386.5A 2018-01-04 2019-01-04 The connection of augmented reality vehicle interfaces Pending CN110007752A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201862613664P 2018-01-04 2018-01-04
US62/613,664 2018-01-04
US201862629764P 2018-02-13 2018-02-13
US62/629,764 2018-02-13

Publications (1)

Publication Number Publication Date
CN110007752A true CN110007752A (en) 2019-07-12

Family

ID=67165350

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910009386.5A Pending CN110007752A (en) 2018-01-04 2019-01-04 The connection of augmented reality vehicle interfaces

Country Status (1)

Country Link
CN (1) CN110007752A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110703753A (en) * 2019-10-16 2020-01-17 北京京东乾石科技有限公司 Path planning method and device, electronic equipment and storage medium
CN112351241A (en) * 2019-08-09 2021-02-09 丰田自动车株式会社 Vehicle operation system
CN113247015A (en) * 2021-06-30 2021-08-13 厦门元馨智能科技有限公司 Vehicle driving auxiliary system based on somatosensory operation integrated glasses and method thereof
CN113903184A (en) * 2020-06-22 2022-01-07 奥迪股份公司 Vehicle state display method and device, vehicle-mounted electronic equipment and vehicle
US11292458B2 (en) 2019-07-31 2022-04-05 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories
US11292457B2 (en) * 2019-07-31 2022-04-05 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories
US11328593B2 (en) 2019-07-31 2022-05-10 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103339009A (en) * 2010-10-05 2013-10-02 谷歌公司 Diagnosis and repair for autonomous vehicles
US20150134180A1 (en) * 2013-11-08 2015-05-14 Electronics And Telecommunications Research Institute Autonomous driving control apparatus and method using navigation technology
US20160370194A1 (en) * 2015-06-22 2016-12-22 Google Inc. Determining Pickup and Destination Locations for Autonomous Vehicles
WO2017079222A1 (en) * 2015-11-04 2017-05-11 Zoox, Inc. Software application to request and control an autonomous vehicle service
US20170282821A1 (en) * 2016-04-01 2017-10-05 Uber Technologies, Inc. Transport facilitation system for configuring a service vehicle for a user
CN107450531A (en) * 2016-05-31 2017-12-08 通用汽车环球科技运作有限责任公司 The system for dynamically directing the user to the loading position of the autonomous driving vehicles

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11292458B2 (en) 2019-07-31 2022-04-05 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories
US11292457B2 (en) * 2019-07-31 2022-04-05 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories
US11328593B2 (en) 2019-07-31 2022-05-10 Toyota Research Institute, Inc. Autonomous vehicle user interface with predicted trajectories
CN112351241A (en) * 2019-08-09 2021-02-09 丰田自动车株式会社 Vehicle operation system
CN110703753A (en) * 2019-10-16 2020-01-17 北京京东乾石科技有限公司 Path planning method and device, electronic equipment and storage medium
CN110703753B (en) * 2019-10-16 2022-11-08 北京京东乾石科技有限公司 Path planning method and device, electronic equipment and storage medium
CN113903184A (en) * 2020-06-22 2022-01-07 奥迪股份公司 Vehicle state display method and device, vehicle-mounted electronic equipment and vehicle
CN113247015A (en) * 2021-06-30 2021-08-13 厦门元馨智能科技有限公司 Vehicle driving auxiliary system based on somatosensory operation integrated glasses and method thereof

Similar Documents

Publication Publication Date Title
US11676346B2 (en) Augmented reality vehicle interfacing
KR102470217B1 (en) Utilization of passenger attention data captured from vehicles for localization and location-based services
US11710251B2 (en) Deep direct localization from ground imagery and location readings
JP7599493B2 (en) Geolocation models for perception, prediction or planning
US11077850B2 (en) Systems and methods for determining individualized driving behaviors of vehicles
US11472291B2 (en) Graphical user interface for display of autonomous vehicle behaviors
US11884155B2 (en) Graphical user interface for display of autonomous vehicle behaviors
CN110007752A (en) The connection of augmented reality vehicle interfaces
US12062136B2 (en) Mixed reality-based display device and route guide system
US10242457B1 (en) Augmented reality passenger experience
KR20210035296A (en) System and method for detecting and recording anomalous vehicle events
CN108205830 (en) Method and system for identifying personal driving preferences for autonomous driving vehicles
US20250033653A1 (en) Dynamic control of remote assistance system depending on connection parameters
US20240232457A1 (en) Test validation
US20220414387A1 (en) Enhanced object detection system based on height map data
US12107747B1 (en) Chargeable button latency check
US20240005359A1 (en) Projected Advertisement Modification
US11741721B2 (en) Automatic detection of roadway signage
US20240231201A9 (en) Projected av data, hud and virtual avatar on vehicle interior
US20250218055A1 (en) Visualization of a driving scene overlaid with predictions
WO2024225024A1 (en) Information processing device, information processing method, and program
CN118435609A (en) Content distribution system, method for operating same, mobile body, method for operating same, terminal device, method for operating same, and program

Legal Events

Date Code Title Description

PB01 Publication

SE01 Entry into force of request for substantive examination

CB02 Change of applicant information
  Address after: Massachusetts, USA
  Applicant after: Dynamic ad Ltd.
  Address before: Massachusetts, USA
  Applicant before: NUTONOMY Inc.

TA01 Transfer of patent application right
  Effective date of registration: 20201119
  Address after: Massachusetts, USA
  Applicant after: Motional AD LLC
  Address before: Massachusetts, USA
  Applicant before: Dynamic ad Ltd.