
CN121220065A - Position measurement technology - Google Patents

Position measurement technology

Info

Publication number
CN121220065A
CN121220065A (application CN202480036809.1A)
Authority
CN
China
Prior art keywords
mobile device
sensor
location
user
fingerprint
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202480036809.1A
Other languages
Chinese (zh)
Inventor
Y·费因梅瑟
A·辛格·阿尔瓦拉多
J·R·休恩伯格
E·G·克里明格
J·M·彼尔德
H·R·菲格罗阿
E·沃瑟曼
R·维托里
R·埃亚尔
Y·叶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/677,583 (published as US 2024/0402211 A1)
Application filed by Apple Inc
Publication of CN121220065A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/33Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/01Determining conditions which influence positioning, e.g. radio environment, state of motion or energy consumption
    • G01S5/017Detecting state or type of motion
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0252Radio frequency fingerprinting
    • G01S5/02521Radio frequency fingerprinting using a radio-map
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/28Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803Home automation networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60Network streaming of media packets
    • H04L65/61Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
    • H04L65/612Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/131Protocols for games, networked simulations or virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/50Network services
    • H04L67/52Network services specially adapted for the location of the user terminal
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72457User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to geographic location
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/414Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42202Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/43615Interfacing a Home Network, e.g. for connecting the client to a plurality of peripherals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/436Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
    • H04N21/4363Adapting the video stream to a specific local network, e.g. a Bluetooth® network
    • H04N21/43637Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4524Management of client data or end-user data involving the geographical location of the client
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/4508Management of client data or end-user data
    • H04N21/4532Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/45Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
    • H04N21/466Learning process for intelligent management, e.g. learning user preferences for recommending movies
    • H04N21/4667Processing of monitored end-user data, e.g. trend analysis based on the log file of viewer selections
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81Monomedia components thereof
    • H04N21/816Monomedia components thereof involving special video data, e.g 3D video
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/021Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/023Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/025Services making use of location information using location based information parameters
    • H04W4/027Services making use of location information using location based information parameters using movement velocity, acceleration information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W52/00Power management, e.g. Transmission Power Control [TPC] or power classes
    • H04W52/02Power saving arrangements
    • H04W52/0209Power saving arrangements in terminal devices
    • H04W52/0251Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity
    • H04W52/0254Power saving arrangements in terminal devices using monitoring of local events, e.g. events related to user activity detecting a user operation or a tactile contact or a motion of the device
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S2205/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S2205/01Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations specially adapted for specific applications
    • G01S2205/02Indoor
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0252Radio frequency fingerprinting
    • G01S5/02521Radio frequency fingerprinting using a radio-map
    • G01S5/02524Creating or updating the radio-map
    • G01S5/02525Gathering the radio frequency fingerprints
    • G01S5/02526Gathering the radio frequency fingerprints using non-dedicated equipment, e.g. user equipment or crowd-sourcing
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0284Relative positioning
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Hardware Design (AREA)
  • Remote Sensing (AREA)
  • Software Systems (AREA)
  • Environmental & Geological Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Social Psychology (AREA)
  • Ecology (AREA)
  • Emergency Management (AREA)
  • General Health & Medical Sciences (AREA)
  • Environmental Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Mobile Radio Communication Systems (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Telephone Function (AREA)

Abstract

In some implementations, in response to a trigger signal at an associated first time, the mobile device generates a first location value using a first ranging session with one or more other devices. The technique may include storing the first location value in memory. The technique may include using the mobile device's motion sensors to track the mobile device's motion to determine its current location relative to the first location value. Furthermore, the technique may include determining that the mobile device's current location has changed from the first location value by a predetermined threshold amount since the associated first time. In response to the mobile device's current location changing more than the predetermined threshold amount since the associated first time, the technique may include generating a second location value using a second ranging session with one or more other devices.
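The flow in the abstract can be sketched as a short loop (a hypothetical illustration in Python; the `ranging_session` callable, the 2-D positions, and the threshold value are assumptions, not taken from the patent):

```python
from dataclasses import dataclass

THRESHOLD_M = 3.0  # assumed displacement threshold, in meters


@dataclass
class Position:
    x: float
    y: float


def displacement(a: Position, b: Position) -> float:
    # Straight-line distance between two position values.
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5


def track(ranging_session, motion_deltas):
    """Range once on the trigger, then dead-reckon from motion-sensor
    deltas; re-range only after movement exceeds the threshold."""
    stored = ranging_session()          # first ranging session -> first location value
    current = Position(stored.x, stored.y)
    for dx, dy in motion_deltas:        # motion-sensor updates
        current = Position(current.x + dx, current.y + dy)
        if displacement(current, stored) > THRESHOLD_M:
            stored = ranging_session()  # second ranging session -> second location value
            current = Position(stored.x, stored.y)
    return stored
```

Under this sketch, only two ranging sessions occur for a device that drifts past the threshold once; every intermediate position comes from the motion sensors.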

Description

Position measurement technique
Cross-reference to other applications
This patent application claims priority from U.S. Provisional Application No. 63/470,675, entitled "LOCATION MEASUREMENT TECHNIQUES," filed on June 2, 2023, and U.S. Patent Application No. 18/677,583, entitled "LOCATION MEASUREMENT TECHNIQUES," filed on May 29, 2024, which are incorporated herein by reference in their entirety for all purposes.
Background
A mobile device may use several different sensors for accurate navigation. These sensors may be used to determine the location of the mobile device even indoors or when Global Navigation Satellite System (GNSS) information is unavailable. However, some of these sensors incur processing time when computing the position of the mobile device from sensor information. This processing time may cause undesirable delays in loading applications and can result in a poor user experience when using these applications. Further, it is desirable for the mobile device to actively balance navigation sensor usage against battery power savings.
An Augmented Reality (AR) application may exhibit latency when launched because the location of the mobile device relative to an augmented reality area must first be determined. Conventional ranging services can be time consuming and can use battery resources inefficiently. Latency in determining device position can result in a poor user experience.
Accordingly, improvements in determining whether an updated RF scan is needed due to movement of the mobile device are desirable.
Disclosure of Invention
Embodiments provide techniques for determining the location of a mobile device even when the mobile device is indoors. These techniques allow a device to determine when to update its location, balancing sensor usage in ranging sessions against the mobile device's battery. Motion sensors may be used to decide when the location should be updated (e.g., using ranging techniques). In addition, these techniques may be used with Augmented Reality (AR) technology. The AR technology may have a predetermined AR area within which the sensors of the mobile device can accurately determine its location. The techniques may determine when the trajectory of the mobile device indicates that it is entering and/or exiting an AR area.
As one example, the device may integrate motion sensor features to determine whether the device has moved significantly (e.g., greater than a threshold) since the last use. Such motion sensor techniques may be performed in a background mode with minimal power usage even if the device is in a standby mode (e.g., screen off, or the host application is in a sleep mode). The motion sensor feature may determine whether the mobile device has moved a predetermined amount that would trigger a re-determination of the device's location using a subsequent ranging session.
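The displacement check described above can be reduced to a small accumulator (a sketch; the class name, the per-step interface, and the default threshold are assumptions, not from the patent):

```python
class MotionMonitor:
    """Accumulates motion-sensor movement in the background and flags
    when a re-determination of the device location is warranted."""

    def __init__(self, threshold_m: float = 2.0):  # assumed threshold
        self.threshold_m = threshold_m
        self._accum_m = 0.0

    def on_motion_sample(self, step_length_m: float) -> bool:
        """Called per detected step (e.g., from a pedometer). Returns True
        when accumulated movement since the last ranging session exceeds
        the threshold, signalling that a new ranging session is needed."""
        self._accum_m += step_length_m
        return self._accum_m > self.threshold_m

    def reset(self) -> None:
        """Call after a ranging session updates the stored location."""
        self._accum_m = 0.0
```

Because the monitor only adds and compares scalars, it is cheap enough to run while the screen is off or the host application is asleep, matching the low-power background mode described above.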
As another example, the mobile device may determine its location in a background mode. The mobile device may determine whether it is within an augmented reality area or on a trajectory toward the augmented reality area. If the mobile device is determined to be on a trajectory toward the augmented reality area, the augmented reality application may be preloaded into memory to reduce latency. If the mobile device is within the augmented reality area, the mobile device may prompt the user as to whether an augmented reality application should be launched.
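One way to sketch this decision (a hypothetical illustration; the circular AR area, the linear velocity model, and the 5-second prediction horizon are all assumptions, not from the patent):

```python
import math


def inside(area_center, radius_m, pos):
    # Is a position within a circular AR area?
    return math.dist(area_center, pos) <= radius_m


def on_trajectory(area_center, radius_m, pos, velocity, horizon_s=5.0):
    """Linearly predict the position `horizon_s` seconds ahead from the
    current velocity and check whether it falls inside the AR area."""
    predicted = (pos[0] + velocity[0] * horizon_s,
                 pos[1] + velocity[1] * horizon_s)
    return inside(area_center, radius_m, predicted)


def decide(area_center, radius_m, pos, velocity):
    if inside(area_center, radius_m, pos):
        return "prompt"    # inside the AR area: ask whether to launch the app
    if on_trajectory(area_center, radius_m, pos, velocity):
        return "preload"   # heading toward the area: preload to reduce latency
    return "idle"
```

A device standing still outside the area stays idle, one moving toward the area triggers a preload, and one already inside triggers the user prompt.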
Other embodiments relate to systems, portable consumer devices, and computer-readable media associated with the methods described herein.
The nature and advantages of embodiments of the disclosure may be better understood by reference to the following detailed description and the accompanying drawings.
Drawings
FIG. 1A is a simplified diagram illustrating a cluster of physical locations in physical space.
FIG. 1B is a simplified diagram illustrating a cluster of sensor locations in a sensor space corresponding to the physical locations in the physical space of FIG. 1A.
FIG. 2A is a flow chart of a method for generating clusters of sensor locations.
FIG. 2B is a flow chart of a method for identifying an application based on sensor positioning.
Fig. 3 illustrates a block diagram of a system for determining a trigger event.
FIG. 4 illustrates a block diagram of a system for identifying an application or playback device for a user based on sensor positioning.
Fig. 5 illustrates a first exemplary environment for determining a location of a mobile device.
Fig. 6 illustrates a second exemplary environment for determining a location of a mobile device.
Fig. 7 illustrates a flow chart of a process for determining when to update a location of a mobile device, in accordance with some embodiments.
FIG. 8 illustrates a third exemplary environment for demonstrating an exemplary use case for updated positioning of a mobile device.
Fig. 9 is a flow chart of a method for determining a location of a mobile device, according to some embodiments.
Fig. 10 illustrates an exemplary AR map according to some embodiments.
FIG. 11 illustrates a first map of one potential use of Pedestrian Dead Reckoning (PDR) and geofencing for an Augmented Reality (AR) application.
FIG. 12 illustrates a second map of a potential use of PDR and geofencing for an AR application.
Fig. 13 illustrates a third map of a home for an AR application.
Fig. 14 is a flowchart of a process for determining, using positioning techniques, when an electronic device's trajectory is predicted to enter an area of interest (AOI), according to some embodiments.
Fig. 15 illustrates a simplified block diagram of a mobile device obtaining a fingerprint during a first time period, in accordance with various embodiments.
Fig. 16 illustrates a simplified block diagram of a mobile device obtaining a fingerprint during a second time period, in accordance with various embodiments.
FIG. 17 is a swim lane diagram depicting a technique for detecting device proximity, according to some embodiments.
FIG. 18 is a swim lane diagram depicting a technique for detecting device proximity at a server device, in accordance with some embodiments.
Fig. 19 is a simplified block diagram illustrating an example architecture of a system for detecting device proximity at a mobile device, in accordance with some embodiments.
Fig. 20 is a simplified block diagram illustrating an example architecture of a system for detecting device proximity at a server device, in accordance with some embodiments.
Fig. 21 is a flow chart of a method for providing proximity classification according to some embodiments.
Fig. 22 is a block diagram of an example device, which may be a mobile device, in accordance with some embodiments.
Detailed Description
A ranging feature may determine the location of a user in a home. However, the budget (e.g., power or transmitter/receiver cycles) for performing the Radio Frequency (RF) scans used to update range values may be limited. Techniques are therefore needed to use this limited budget as efficiently as possible; it is undesirable to waste ranging cycles determining that the user's device is in the same location it was in when the range value was last determined. Furthermore, it is desirable for the device to have very low latency, so that when a user unlocks the device and launches a media application (app), a recommendation for a playback device can be presented quickly as the application enters the foreground.
An "application" (or app) may be a client software program that is executed by a processor of the device (e.g., within an operating system). An application may be provided as part of the operating system, or may be provided by a third-party developer and downloaded to the device. An application may also be a particular portion of an operating system that is designed to perform a particular function when executed by a processor. An application may stream media content (e.g., audio and/or video content) to another device. For example, a music application may stream music to one or more speakers that are in communication with the mobile device. An application on a mobile device (e.g., a home application) may be used to control other devices throughout the home, such as accessory devices (e.g., kitchen appliances, lighting, thermostats, smart locks on doors, curtains, etc.). The user of the home application may be in the same room as the accessory device being controlled or may be in a different room. For example, a user may be in the kitchen when using the home application on his or her mobile device to close a garage door.
An "accessory device" may be a remotely controllable device. An accessory device may be located within a particular environment, zone, or location, such as a home, apartment, or office. Accessory devices can include garage door openers, door locks, fans, lighting devices (e.g., lights), thermometers, windows, shutters, kitchen appliances, and any other device configured to be controlled by an application, such as a home application. An accessory device may be discovered by the home application and associated with the home. An accessory device may be discovered by, for example, the mobile device automatically scanning the environment for accessory devices, or the user may manually input accessory device information via, for example, the home application.
Users often perform the same or repeated actions on an accessory device when in a particular location. For example, each time a user comes home from work, they may close the garage door while in the kitchen. Similarly, when it is dark outside, the user may turn on a lamp in the living room or change the temperature on the thermostat in the living room. Thus, certain activities may be performed on devices in the home periodically and repeatedly (e.g., daily, or several times a day) while the user is in a particular location. In another example, video content may be selected for streaming to a living room smart TV while the user is sitting on a sofa in the living room. Performing these tasks through various user interfaces can be time-consuming and tedious for a user, as the tasks need to be performed periodically or several times a day.
Embodiments provide improved mobile devices and methods for recommending applications and/or accessory devices, or for using sensor measurements to automatically perform actions with applications, based on historical use of the applications at identifiable locations (which may be referred to as micro-locations). A sensor (e.g., an antenna and associated circuitry) on the mobile device may measure sensor values from wireless signals transmitted by one or more signal sources that are substantially stationary (e.g., a wireless router in a home or a network-enabled appliance). These sensor values are reproducible at the same physical location of the mobile device, so the sensor values may be used as proxies for physical location. In this way, the sensor values may form a sensor location, defined in sensor space rather than in physical space. A "sensor location" may be a multi-dimensional data point defined by an individual sensor value for each dimension. In various embodiments, the parameter of the wireless signal may be a signal characteristic (e.g., signal strength or time of flight, such as Round Trip Time (RTT)), or other sensor value measured by a sensor of the mobile device (e.g., related to data conveyed in one or more wireless signals).
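The notion of a sensor location as a multi-dimensional data point can be sketched in Python as follows (an illustrative sketch only; the source identifiers and dB values are invented for the example):

```python
# A sensor location is a multi-dimensional data point: one dimension per
# signal source, each holding a measured sensor value (here, RSSI in dB).
from typing import Dict, Tuple

def make_sensor_location(measurements: Dict[str, float],
                         sources: Tuple[str, ...]) -> Tuple[float, ...]:
    """Order per-source measurements into a fixed-dimension data point."""
    return tuple(measurements[s] for s in sources)

# Two stationary sources (e.g., a wireless router and a Bluetooth speaker).
SOURCES = ("source_102A", "source_102B")
point = make_sensor_location({"source_102A": -48.0, "source_102B": -71.5},
                             SOURCES)
# point is a 2-dimensional data point in sensor space
```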
I. Sensor measurements and clusters
When the mobile device is positioned at a physical location within a home environment (e.g., a house or building) or other location area, the mobile device may detect a triggering event and then measure signals transmitted from one or more signal sources present at that point in physical space. For example, the mobile device may detect a button press that acts as a trigger event and causes the mobile device to measure signals (e.g., Wi-Fi, Bluetooth (BT), Bluetooth Low Energy (BLE), Ultra Wideband (UWB), Zigbee, etc.) emitted from any signal source (e.g., an electronic device such as a wireless router, a Wi-Fi equipped appliance (e.g., a set top box or smart home device), or a Bluetooth device). The detected signals may be used to generate a multidimensional data point of sensor values in a sensor space, where each dimension in the sensor space may correspond to a characteristic of a signal emitted from a signal source. The multi-dimensional data point may represent the sensor location of the mobile device in the sensor space, where the sensor location corresponds to the physical location of the mobile device in physical space.
Fig. 1A is a simplified diagram illustrating multiple physical locations in physical space 103. For example, the physical space 103 may be the interior of a home, office, store, or other building. Physical space 103 may include multiple signal sources, such as signal sources 102A and 102B. Each signal source may transmit a wireless communication signal, such as a signal transmitted from a wireless router or a Bluetooth device. A signal source may be considered a stationary device because its position does not typically change.
A "cluster" corresponds to a set of sensor locations (e.g., scalar data points, multidimensional data points, etc.) where measurements have been made. According to embodiments described herein, it may be determined that a sensor location is in a cluster. For example, the sensor locations of the clusters may have parameters that are within a threshold distance from each other or from the centroid of the clusters. When viewed in sensor space, the clusters of sensor locations appear as groups of sensor locations that are in close proximity to each other. The cluster of sensor locations may be located, for example, in a room of a house or in a particular area of the house (e.g., hallway, front door area).
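The cluster-membership test described here (a sensor location within a threshold distance of a cluster centroid) might be sketched as follows; the two-dimensional RSSI points and the 6 dB threshold are hypothetical values chosen for illustration:

```python
import math

def centroid(points):
    """Per-dimension mean of a cluster's sensor locations."""
    dims = len(points[0])
    return tuple(sum(p[d] for p in points) / len(points)
                 for d in range(dims))

def in_cluster(point, cluster_points, threshold_db):
    """True if the point lies within the threshold distance of the centroid."""
    return math.dist(point, centroid(cluster_points)) <= threshold_db

# Three prior sensor locations measured in roughly the same physical spot.
cluster_124 = [(-50.0, -70.0), (-48.0, -72.0), (-52.0, -68.0)]
near = in_cluster((-49.0, -71.0), cluster_124, threshold_db=6.0)
far = in_cluster((-80.0, -40.0), cluster_124, threshold_db=6.0)
```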
A given location in a house or building may also be referred to as a "micro-location". The location may be referred to as a micro-location because the location refers to a particular area of a user's home, for example. In addition, a location or micro-location may also be referred to as a location cluster. The terms location, micro-location, and location cluster may refer to the same area or region. A home may have multiple locations. The location may correspond to a room in the house, a portion of a room, or other area in the house. For example, the location may be a backyard area, a front door area, or a hallway area.
The mobile device may be located within physical space 103 such that one or more signals transmitted from signal sources 102A and 102B are detected. For example, the mobile device may be located at physical location 104 in fig. 1A, where signals 101 and 100 are detected from signal sources 102A and 102B, respectively. It should be appreciated that at some locations, the mobile device may only measure one of signals 101 and 100 (e.g., due to signal attenuation at some locations). Furthermore, the mobile device may be in a physical location where signals from other signal sources (not shown) outside of physical space 103 can be detected; the techniques herein are not limited to physical locations at which only signals 101 and 100 can be detected.
Typical human behavior results in mobile devices being used more frequently in some physical locations than in others. For example, a user may use a mobile device more frequently while on a sofa or in bed. These physical locations may be represented by clusters of physical locations, such as clusters 114 and 116. Each cluster may include a group of physical locations positioned close to one another. For example, cluster 114 may include physical locations 104, 106, and 112. As shown, cluster 116 includes physical locations 108 and 110. The mobile device may be configured to determine when the mobile device is located in one of the clusters and identify an application associated with the cluster based on the detected signals (e.g., signals 100 and 101).
As part of detecting signals at any of the physical locations using the sensors of the mobile device, the mobile device may measure one or more sensor values from signals transmitted from the signal sources 102A and 102B. For example, if the mobile device is at the physical location 104, the mobile device may measure sensor values from the signal 101 transmitted from the signal source 102A and the signal 100 transmitted from the signal source 102B. The measured sensor values may be signal characteristics of signal 101 and signal 100. The measured sensor values can be used to form a sensor location in the sensor space, as shown in fig. 1B.
Fig. 1B is a simplified diagram illustrating a plurality of sensor locations in a sensor space 105, which corresponds to the physical space 103. The sensor space 105 is depicted as a graph of sensor locations measured in signal strength. The X-axis may represent measured values of the signal from signal source 102B in dB, increasing to the right along the X-axis, and the Y-axis may represent measured values of the signal from signal source 102A in dB, increasing upward along the Y-axis. Although fig. 1B illustrates an example in which the sensor space has two dimensions (e.g., sensor values of signals from signal sources 102A and 102B, respectively), the sensor space may include more or fewer dimensions.
The sensor locations in the sensor space correspond to corresponding physical locations in the physical space 103. For example, the sensor values measured at physical location 104 in FIG. 1A correspond to sensor locations 132 in the sensor space shown in FIG. 1B. The sensor locations 132 are represented as two-dimensional data points, with one dimension corresponding to sensor values from the signal source 102A and another dimension corresponding to sensor values from the signal source 102B. Sensor space 105 may include clusters of sensor locations, such as clusters of sensor locations 124 and clusters of sensor locations 126. The clusters 124 and 126 of sensor locations correspond to the physically located clusters 114 and 116, respectively, of fig. 1A.
Clusters 124 and 126 may be unlabeled locations, meaning that the mobile device does not know the actual physical coordinates corresponding to clusters 124 and 126. The device may simply know that there is a cluster of sensor locations with similar sensor values and that the cluster represents a discrete location in physical space. Nevertheless, the mobile device may perform functions based on sensor locations in the sensor space such that use of the mobile device in physical space is improved. For example, the mobile device may determine a sensor location of the mobile device and suggest an application to the user based on whether the sensor location is within a cluster for which an application usage pattern is known. Methods of forming clusters and suggesting applications based on sensor locations are discussed further below.
Thus, the sensor location may correspond to a set of one or more sensor values measured by a sensor of the mobile device at a physical location of the physical space, the one or more sensor values from one or more wireless signals transmitted by one or more signal sources (e.g., external devices such as networking devices). The sensor value may be a measurement of a signal characteristic (e.g., signal strength, time of flight, or data transmitted in a wireless signal), which may occur if the signal source measures the signal characteristic from the mobile device and sends the value back to the mobile device. Each sensor value in the set may correspond to a different dimension in the sensor space, wherein the set of one or more sensor values forms a data point (e.g., a multi-dimensional data point, also referred to as a feature vector) in the sensor space.
In the example shown in fig. 1A, for example, when the sensor value is signal strength, the sensor value of the sensor locations in cluster 114 may be higher than the sensor value of the sensor locations in cluster 116 for signal source 102A (which is on the vertical axis in fig. 1B). This may be caused by the fact that the physical locations in cluster 114 are closer to signal source 102A than the physical locations in cluster 116 are to signal source 102A. When the sensor value is a signal characteristic of time of flight, the value of cluster 114 will be less than the value of cluster 116.
A given measurement of one or more wireless signals obtained at a physical location may be obtained at one or more times within a certain time interval to obtain a set of sensor values. For example, when two measurements are made at the same physical location at two different times, the two measurements may correspond to the same sensor location. The sensor location may have a zero value for a given dimension (e.g., if no particular wireless signal is measured) or a nominal value (e.g., in the case of low signal power, such as -100 decibels (dB) Received Signal Strength Indication (RSSI), or a large uncertainty).
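Substituting a nominal value for an unmeasured dimension, as described above, might look like this sketch (the -100 dB nominal value follows the example in the text; the source names are invented):

```python
NOMINAL_RSSI_DB = -100.0  # placeholder for a source that was not detected

def to_data_point(measurements, sources, nominal=NOMINAL_RSSI_DB):
    """Build a fixed-dimension point, substituting a nominal value for any
    source whose wireless signal was not measured at this location."""
    return tuple(measurements.get(s, nominal) for s in sources)

SOURCES = ("router", "speaker", "thermostat")
# Only the router was heard at this physical location.
p = to_data_point({"router": -55.0}, SOURCES)
```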
Groups of sensor locations with similar parameters may form clusters that may be used to define discrete locations. One or more clusters may be used to identify applications or accessory devices to suggest to a user in, for example, a message (e.g., an on-screen display or an audio message).
The application and/or accessory device can be associated with one or more clusters. In particular, the application and accessory device may be associated with a particular location of the mobile device. The location of a particular area in a user's home may be referred to as a micro-location. The micro-location may also be referred to as a location cluster. The terms location, micro-location, and location cluster may refer to the same area or region. The location may correspond to a room in a house or other area in a house. For example, the location may be a backyard area, a front door area, or a hallway area. Although a home is used as an example, any area or room in which an accessory device is located may be used to determine the location cluster.
II. Predicting user interactions with devices
The mobile device can identify which applications (or accessory devices) the user runs at each sensor location. After collecting the sensor locations and the corresponding applications that the user runs at those sensor locations, the device may generate clusters of sensor locations (e.g., generated periodically during the night) and associate with each cluster one or more applications likely to be run by the user. Thus, upon detection of a subsequent trigger event, the device may generate a new sensor location and compare the new sensor location to the generated clusters of sensor locations. If the new sensor location is determined to be within a threshold distance of one of the clusters, one or more applications associated with that cluster may be identified and used in an action, for example, provided to the user as suggested applications. The threshold distance may be expressed in units of decibels (e.g., for Received Signal Strength Indication (RSSI)) or meters (e.g., for time of flight (TOF)), depending on the units of the sensor location, as will be discussed further herein. The mobile device may use the location information to determine an associated playback device of a plurality of playback devices at a location. In various implementations, the mobile device may use the location information to trigger activation or deactivation of one or more sensors when the mobile device enters or leaves an Augmented Reality (AR) region.
A. Learning and generating clusters
Fig. 2A is a flow chart of a method 200 for generating clusters of sensor locations. The cluster of sensor locations may later be used by the mobile device to identify an application and suggest the application to the user. The cluster of sensor locations may also be used to recommend playback devices for streaming media (e.g., video or audio) to an associated playback device based on the micro-locations. The method 200 may be performed by a mobile device (e.g., a phone, tablet, wearable computing device, etc.).
At block 202, a trigger event is detected. The trigger event may be identified as an event that is sufficiently likely to be associated with operation of the mobile device. The trigger event may be caused by a user and/or an external device. For example, the trigger event may be a specific interaction of the user with the mobile device. In various implementations, the triggering event may be the mobile device being stationary for longer than a specified period of time, as determined by one or more motion sensors (e.g., an IMU). The specific interactions may be used to learn what the user did at a specific location and thus may be considered learning trigger events. Examples of learning trigger events are an application launch, specific activities within an application (e.g., making a selection within an application), voice commands (e.g., invoking a voice assistant to perform a search or other activity with an application), and the first interaction of the day. As other examples, the triggering event may be an accessory device connecting to the mobile device, such as a headset being plugged into a headphone jack, a Bluetooth connection being made, and so on. An event list of trigger events may be stored on the mobile device. Such a list may be a default list maintained as part of the operating system, and may or may not be configurable by the user.
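The event list of trigger events could be represented as a simple lookup, as in this hypothetical sketch (the event names are invented labels for the examples given above):

```python
# Hypothetical default list of learning trigger events maintained as part
# of the operating system; the event names are invented for illustration.
LEARNING_TRIGGERS = {
    "app_launch",
    "in_app_selection",         # specific activity within an application
    "voice_command",            # e.g., invoking a voice assistant
    "first_interaction_of_day",
    "accessory_connected",      # e.g., headset plugged in, Bluetooth pairing
    "stationary_timeout",       # device stationary past a set period (IMU)
}

def is_learning_trigger(event: str) -> bool:
    """Check whether a detected event should start a learning measurement."""
    return event in LEARNING_TRIGGERS
```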
At block 204, one or more sensor values may be measured by one or more sensors of the device to generate a sensor location (e.g., sensor location 132 in FIG. 1B) in the form of a data point (e.g., a multi-dimensional data point or a one-dimensional data point). The sensor location is measured at a physical location in physical space. The sensor values may correspond to one or more signal characteristics of signals (e.g., signals 100 and 101 in fig. 1A) transmitted from one or more signal sources (e.g., signal sources 102A and 102B in fig. 1A). For example, the sensor value may be a value corresponding to the signal strength of the measured signal, such as a Received Signal Strength Indication (RSSI) value or any other suitable signal characteristic whose value varies with distance from the signal origin. In other cases, the sensor values may include signal characteristics, such as time of flight (TOF) measurements, indicative of the distance between the device and the signal origin. For example, the one or more sensors may include one or more antennas and associated circuitry to measure characteristics of the signal. Other examples of sensors include light sensors and audio sensors.
The signal source may be any suitable device configured to transmit wireless signals. For example, the signal source may be an access point, such as a wireless router, a Bluetooth device, or any other networking device suitable for transmitting and/or receiving signals (e.g., a Bluetooth speaker, refrigerator, thermostat, home automation portal, etc.). Different signal sources may be identified based on identifiers in the signals. Each measurement from a different device may correspond to a different sensor value.
Even if only signals from one signal source are measured, the data point representing the location may still have multiple dimensions. For example, multiple measurements may be made on signals from the same signal source, where each measurement corresponds to a different dimension of the data point (e.g., a measurement of a different signal characteristic of the signal). The additional dimensions may correspond to other devices, even if no signal from them is detected at a given location. Undetected devices may be assigned a zero sensor value, so the data point still has values for these dimensions.
At block 206, the device may identify an application that is used in measuring the sensor value. The identified applications may be assigned to corresponding sensor locations. When an application is used multiple times at or near a sensor location (e.g., as part of the same cluster), a correlation between the sensor location and the application may be determined. The correlation can be used to predict which application the user will likely use when a given sensor location is measured.
In various embodiments, at block 206, the device may identify a playback device of a plurality of playback devices in a location (e.g., a room). The mobile device may also use the history information to recommend a playback device of the plurality of playback devices. For example, when a user is located on a living room sofa, a smart television may typically be selected to play back video media files. If the user selects a video media file for playback and sits on the sofa, the mobile device may recommend the smart television for playback.
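The history-based playback-device recommendation could be sketched as a frequency count over past selections within a cluster (a hypothetical illustration; the cluster and device names are invented):

```python
from collections import Counter

def recommend_playback_device(history, cluster_id):
    """Recommend the playback device most often selected in this cluster.
    `history` is a list of (cluster_id, device) records of past selections."""
    counts = Counter(dev for cid, dev in history if cid == cluster_id)
    if not counts:
        return None  # no history for this cluster yet
    return counts.most_common(1)[0][0]

history = [("living_room", "smart_tv"), ("living_room", "smart_tv"),
           ("living_room", "hifi_speaker"), ("kitchen", "kitchen_speaker")]
```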
In various embodiments, the application may be an augmented reality application. When the mobile device enters the AR application area, the mobile device may activate one or more sensors of the mobile device.
An application may correspond to any software executing on a processor of a device. For example, the application may be a client application (e.g., electronic recipe software or software for reading an electronic book) executed by an operating system. As another example, the application may be software (e.g., daemon) that is part of an operating system.
Once the application is identified, the method 200 loops back to block 202 and detects the occurrence of another trigger event at that block. In this case, the method 200 may again measure the sensor values at block 204 to generate a sensor location and identify another application used at the sensor location at block 206. Blocks 202, 204, and 206 may be performed multiple times to collect multiple records of individual sensor locations and associated applications.
At block 208, a plurality of records of individual sensor locations may be analyzed to form clusters of sensor locations. The sensor locations may be analyzed during the night, when the user is asleep or is not expected to use the mobile device for a long period of time (e.g., at least 4 hours). The analysis may be a batch analysis performed over days or weeks of sensor locations stored during use of the mobile device. A cluster of sensor locations may correspond to a group of sensor locations in the sensor space that are in close proximity to each other. For example, the sensor locations of a cluster may have data points that are within a threshold distance from each other or from the centroid of the cluster. The sensor locations of a cluster in the sensor space will typically have their corresponding physical locations form a cluster in the physical space.
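The batch clustering of stored sensor locations might be sketched with a simple greedy centroid-based scheme (one possible approach among many; the actual clustering algorithm is not specified in the text):

```python
import math

def cluster_sensor_locations(points, threshold):
    """Greedy batch clustering: assign each point to the first cluster whose
    centroid is within the threshold distance; otherwise start a new cluster."""
    clusters = []  # each cluster is a list of sensor-location points
    for p in points:
        for c in clusters:
            cen = tuple(sum(q[d] for q in c) / len(c) for d in range(len(p)))
            if math.dist(p, cen) <= threshold:
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

# Four stored sensor locations forming two discrete physical spots.
pts = [(-50.0, -70.0), (-49.0, -71.0), (-30.0, -90.0), (-31.0, -89.0)]
clusters = cluster_sensor_locations(pts, threshold=5.0)
```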
As shown in FIG. 1A, physical locations 104, 106, 108, and 110 may be grouped into clusters 114 and 116 based on having similar sensor values. For example, physical locations 104 and 106 may be grouped within cluster 114 because the sensor values of physical location 104 are within a threshold distance of the sensor values of physical location 106. Likewise, physical locations 108 and 110 may be grouped within cluster 116 because the difference between the sensor values of physical location 108 and the sensor values of physical location 110 may be within a threshold distance. However, the sensor values of the sensor locations in cluster 114 may differ from the sensor values of the sensor locations in cluster 116, thereby keeping clusters 114 and 116 separate. The threshold distance may be defined by typical use of the device in physical space, e.g., a width of a few feet. The threshold distance in physical space may be mapped to a sensor distance based on a mapping function that may be determined via a calibration process performed by a manufacturer of the mobile device. Further details of the threshold distance can be found in a concurrently filed application entitled "DETERMINING LOCATION OF MOBILE DEVICE USING SENSOR SPACE TO PHYSICAL SPACE MAPPING," which is incorporated by reference in its entirety.
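One way a calibration mapping from physical distance to sensor distance could behave is via a log-distance path-loss model, sketched below; this model and its path-loss exponent are assumptions for illustration, and a real device would determine the mapping via calibration:

```python
import math

def rssi_delta_for_distance(d_near_m: float, d_far_m: float,
                            n: float = 2.0) -> float:
    """Approximate RSSI change (dB) between two distances from a source,
    using the log-distance path-loss model with exponent n (assumed)."""
    return 10.0 * n * math.log10(d_far_m / d_near_m)

# Moving from 2 m to 4 m away from a source at path-loss exponent 2
# corresponds to roughly a 6 dB drop in RSSI.
delta = rssi_delta_for_distance(2.0, 4.0)
```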
Because the device has identified the applications that it runs at the various sensor locations, one or more applications may be associated with each cluster of sensor locations. For example, if the user runs a food-related application at sensor location 132, the food-related application may be associated with cluster 124. Cluster 124 may correspond to a physical location, such as the kitchen of a residence (e.g., the area of cluster 114). However, the device generally may not be aware of the association between cluster 124 and the kitchen. The device may simply know that cluster 124 is associated with a food-related application. Different applications may be associated with the same cluster, which may result in the applications having different probabilities of being run by the user, where such probabilities may be used to determine actions to be performed with one or more of the applications, e.g., using a set of one or more evaluation rules.
At block 209, a set of one or more evaluation rules for each cluster is optionally determined based on the user's historical interactions with the mobile device. The evaluation rules for a cluster may be determined based on the number of runs of a particular application in the cluster or the number of selections of a playback device in the cluster, thereby determining which applications run most often and which operations the applications are most likely to perform. In various implementations, a cluster may correspond to an AR area. Once the clustering is performed, evaluation rules may optionally be generated to minimize processing time when the clusters are used as part of determining a predicted application in response to a detected event. Alternatively, the evaluation rules may be generated later (e.g., upon detection of an event). This determination may be made by a predictive model, as discussed in more detail below.
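An evaluation rule derived from run counts might be sketched as follows (the application names, counts, and the 0.5 probability floor are invented for illustration):

```python
def evaluation_rule(app_run_counts, min_probability=0.5):
    """Derive a per-cluster rule: the most frequently run application and
    its empirical probability of being the one the user launches.
    Returns (None, prob) when no application is likely enough."""
    total = sum(app_run_counts.values())
    app = max(app_run_counts, key=app_run_counts.get)
    prob = app_run_counts[app] / total
    return (app, prob) if prob >= min_probability else (None, prob)

# In this cluster, the recipe app was run 8 of 10 times.
app, prob = evaluation_rule({"recipes": 8, "ebook_reader": 2})
```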
B. Performing proactive actions based on measured sensor locations and clusters
Once the clusters have been determined, and possibly after the predictive model has been run, the mobile device may be used in a mode that allows a predicted application to be identified based on measured sensor values. The determination of sensor locations and associated applications may continue even while predictions are made, with clustering and any updates to the prediction model performed periodically.
Fig. 2B is a flow chart of a method 201 for suggesting an application or playback device to a user by referencing clusters of sensor locations. At block 210, a new trigger event is detected. The detection of the new trigger event may be similar to the detection of the trigger event discussed with reference to block 202 in fig. 2A, but the new trigger event may occur at a later time and a different physical location. In various implementations, the triggering event may be the mobile device being stationary for longer than a specified period of time, as determined by one or more motion sensors (e.g., an IMU). The new physical location may be in a cluster of physical locations corresponding to a cluster of previous sensor measurements.
In some cases, the new trigger event is a predictive trigger event that is used to detect when an action that an application on the mobile device may take is to be predicted, not just an event used to learn what the user does. As an example, a user may press a button or touch a touch-sensitive display screen to turn on a backlight of the mobile device (such as pressing a home button), thereby indicating that the user intends to interact with the mobile device. In such examples, the turning on of the backlight may be the triggering event. Other examples of trigger events include the user moving the device substantially while on a lock screen or home screen. Some predictive trigger events may also be learning trigger events.
At block 212, one or more sensor values are measured by the mobile device to generate a new sensor location. Generating a new sensor location may also be similar to generating a sensor location discussed with reference to block 204.
At block 214, it is determined whether the new sensor location is within a known cluster of sensor locations. The new sensor location may be determined to be within a known cluster if it is within a threshold distance from the centroid of that cluster. If the new sensor location is not within any known cluster of sensor locations, then at block 216, the application used in conjunction with the new trigger event may be identified. For example, sensor location 128, corresponding to physical location 118 in fig. 1A, may be at a distance from clusters 124 and 126 that exceeds the threshold distance. Thus, the mobile device can record sensor location 128 and identify an associated application run by the user at sensor location 128 for future reference, but may not take a specific action because a cluster for that location has not yet been identified.
However, if the new sensor location is within a known cluster of sensor locations, at block 218, an application corresponding to the known cluster is identified. If the new sensor location is within a threshold distance of a cluster of sensor locations, one or more applications associated with that cluster may be identified. The one or more applications may be determined using one or more evaluation rules generated at the time the sensor location is measured or at a previous time (e.g., after clustering).
For example, referring back to FIG. 1A, physical location 112 may correspond to sensor location 134. Sensor location 134 may be within a threshold distance of cluster 124. Thus, applications associated with cluster 124 may be identified for sensor location 134. Continuing the example discussed with respect to block 208, the application identified for sensor location 134 may be a food-related application. As mentioned herein, the mobile device may not be aware that clusters 124 and 126 are associated with physical locations in the home, such as a kitchen or bedroom. Instead, the mobile device may simply know that the measured sensor locations form sets of sensor locations (as shown in fig. 1B), and may associate each set of sensor locations with a discrete location.
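The lookup at blocks 214 and 218 (matching a new sensor location against known clusters and returning the associated applications) might be sketched as follows; the centroids, threshold, and application names are hypothetical:

```python
import math

def identify_apps(new_point, clusters, threshold):
    """Return the applications of the first cluster whose centroid lies
    within the threshold distance of the new sensor location, else None.
    `clusters` maps a centroid tuple to its associated applications."""
    for cen, apps in clusters.items():
        if math.dist(new_point, cen) <= threshold:
            return apps
    return None  # not in any known cluster; record for future learning

clusters = {(-50.0, -70.0): ["recipes"],
            (-30.0, -90.0): ["ebook_reader"]}
```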
At block 220, an action is performed in association with the application. In various implementations, the action may be streaming a media file (e.g., a video file or an audio file). In various embodiments, the action may be activating one or more sensors (e.g., inertial sensors, optical sensors) for an AR application.
The action may be to provide a message including or using the identified application, such as a notification including the application. For example, the message may be a user interface that allows a user to select to run the application. The user interface may be provided in a variety of ways, such as by being displayed on a screen of the mobile device, projected onto a surface, or provided as an audio interface. The particular user interface provided to the user may depend on the probability that the user will run the application. For example, the higher the probability of use (e.g., based on more instances of such use), the more aggressive the action that may be taken, such as automatically opening the application with a corresponding user interface (e.g., visual or voice commands), rather than merely providing a mechanism to more easily open the application (e.g., an icon on a lock screen). In some implementations, if the probability is too low for any application, no action may be taken.
A "message" may correspond to any form of data communicated, which may be provided to a user (e.g., displayed to a user) or provided to another device. The message may be generated by the application or may include information related to the application running on the device (e.g., including a link to the application). For example, the message may be an alert, notification, suggested application, etc. The message does not necessarily have to include text conveying readable messages, as the message may be another application, and thus the message may be in binary form.
At block 222, an application is identified. The application may be the same application identified at block 218, thereby reinforcing the identification of the application and the action taken, or it may be a different application that the user runs, e.g., even after the message is provided to the user. Thus, even if the sensor location is within a cluster, the application actually used may be identified, as iterations of the clustering and updates to the evaluation rules of the predictive model may be performed continually. Execution of block 222 is an example in which a prediction trigger event is also used as a learning trigger event. Alternatively, block 222 may be performed in response to a different learning trigger event (e.g., an application launch rather than a home button press), in which case the new sensor location may be reused instead of performing a new measurement.
The method 201 may enable a mobile device to accurately predict applications of a user at a particular location where location information (e.g., information from GPS, Global Navigation Satellite System (GLONASS), BeiDou, Galileo, and wireless fidelity (Wi-Fi) based positioning) does not exist or is unreliable, or where the spatial resolution of the existing location information is too coarse. For example, the method 201 may allow the mobile device to predict a food-related application when the mobile device is positioned in the kitchen of the user's residence, even though the mobile device is not aware that it is positioned in the kitchen, and may allow the mobile device to predict another application (such as a news-related application) when the mobile device is positioned in the user's bedroom, again without the mobile device being aware that it is positioned in the bedroom.
In another example, the method 201 may allow the mobile device to transmit a message or alert to the user when the mobile device detects that it is located in a particular location. For example, the method 201 may allow the mobile device to send the correct meeting location when the mobile device is positioned in the wrong meeting room. In another example, the method 201 may allow the mobile device to send a reminder to call a client when the mobile device enters the user's office (e.g., when the user accesses the mobile device). As such, performing method 201 makes the mobile device more user friendly and gives it a deeper connection with the user.
Event-triggered prediction
A prediction trigger event may be one of a predetermined set of events that trigger the identification of one or more applications to be provided to the user. The event may be detected using a signal generated by a device component. Examples of prediction trigger events are discussed above. The following description is also applicable to other trigger events.
Fig. 3 illustrates a simplified block diagram of a detection system 300 for determining a trigger event. The detection system 300 may reside within a device for which a triggering event (also referred to simply as an "event") is being determined. As shown, the detection system 300 may detect a number of different events. One or more of the detected events may be determined by the detection system 300 to be a triggering event. Other processing modules may then use the trigger event to perform processing.
A. Detecting events
The detection system 300 includes hardware components and software components for detecting events. For example, the detection system 300 may include a plurality of input devices 302. In various embodiments, the input devices 302 may include a motion sensor. The motion sensor may detect motion of the mobile device. For example, the motion sensor may detect that the mobile device moved and then stopped. The motion sensor may detect that the mobile device has been stationary for a period of time. The motion sensor may detect various orientations of the mobile device that may be used to detect events. In various embodiments, motion may be used as part of a combined trigger event (e.g., motion detected in conjunction with a touch screen or button input). An input device 302 may be any suitable device capable of generating a signal in response to an event. For example, the input devices 302 may include a device connection input device 304 and a user interaction input device 306 that may detect device connection events (e.g., a headset interface, a Bluetooth device, a Wi-Fi connection, etc.) and user interaction events (e.g., buttons, switches, latches, a touch screen, etc.), respectively. When an event is detected at an input device, the input device may send a signal indicating the particular event for further analysis.
1. User interaction event
The user interaction input device 306 may be used to detect user interaction events. User interaction events may occur when a user interacts with a device. Any suitable device component of the user interface may be used as the user interaction input device 306. The user interface may correspond to any interface that is typically used for a user to interact with the mobile device or with a particular application. Examples of suitable user interaction input devices are buttons 314 and a touch screen 316. The buttons 314 of the mobile device may be home buttons, power buttons, volume buttons, and the like. In some cases, any input device that turns on the backlight of the mobile device may be a user interaction input device 306. The user interaction event may include the user moving the mobile device or changing its orientation (e.g., flipping the mobile device). When a user interacts with the device, it may be determined that the user has provided user input, and a corresponding user interaction event may be generated.
Touch screen 316 may allow a user to provide user input via the display screen. For example, the user may swipe his or her finger across the display to generate a user input signal. When the user performs the action, a corresponding user interaction event is detected.
The user interaction event may also include a user action within the application. For example, in a home application, a user selection of an accessory device (e.g., turning on an interior light, opening a garage door, etc.) may be a user interaction event. The user may select the accessory device via a touch screen or via a voice command.
2. Device connection event
A device connection event may be an event that occurs when another device is connected to the mobile device. For example, the device connection input device 304 may detect an event in which an external device is communicatively coupled to the mobile device. Any suitable device component that forms a wired or wireless connection with an external device may be used as the device connection input device 304. Examples of device connection input devices 304 include a headset interface 310 and a data connection 312, such as wireless connection circuitry (e.g., Bluetooth, Wi-Fi, Bluetooth Low Energy or BLE, etc.) or wired connection circuitry (e.g., Ethernet, etc.).
The headset interface 310 allows a headset to be coupled to the device. When a headset is coupled, a signal may be generated, for example, by making an electrical connection when a plug is inserted into the headset interface 310. In more complex cases, the headset interface 310 may include circuitry that provides an identification signal identifying the type of connected device. The event may thus be detected in various ways, and the signal may be generated and/or transmitted in various ways.
The data connection 312 may be communicatively coupled with an external device, for example, through a wireless connection. For example, a Bluetooth connection may be made to a vehicle's computer or to a wireless headset. Thus, when an external device is coupled to the mobile device via the data connection 312, it may be determined that the external device is connected, and a corresponding device connection event signal may be generated. As another example, when a beacon communication via BLE is received, it may be determined that an external device is connected. When an accessory device (e.g., a smart lock or a room light) is controlled via a wireless connection, it may also be determined that an external device is connected.
B. Determining trigger events
As further illustrated in FIG. 3, the input devices 302 may output a detected event 322, e.g., as a result of any corresponding event occurring. The detected event 322 may include information about which input device transmitted the signal for the detected event 322 and a subtype of the particular event (e.g., which type of button was pressed). Such information may be used to determine whether the detected event 322 is a trigger event, and such information may be passed to later modules for use in determining which predictor module to use, which application to suggest, which message should be transmitted, or which action to perform.
The detected event 322 may be received by the event manager 330. Event manager 330 may receive signals from input devices 302 and determine which type of event was detected. Depending on the type of event, event manager 330 may output a signal (e.g., event signal 332) to a different engine. Different engines may have subscriptions to event manager 330 to receive specific event signals 332 that are important to their functionality. For example, trigger event engine 324 may be subscribed to receive event signals 332 generated in response to detected events 322 from input device 302. Event signal 332 may correspond to an event type determined from detected event 322.
The trigger event engine 324 may be configured to determine whether the detected event 322 is a trigger event and possibly the type of trigger event. To make this determination, the trigger event engine 324 may reference a specified trigger event database 326, which may be coupled to the trigger event engine 324. The specified trigger event database 326 may include a list of predetermined events that are specified as trigger events, and potentially what type of trigger event each event is.
The trigger event engine 324 may compare the received detected event 322 to a list of predetermined events and output a trigger event 328 if the detected event 322 matches a predetermined event listed in the specified trigger event database 326. In various implementations, the triggering event may be the mobile device being stationary longer than a specified period of time determined by one or more motion sensors (e.g., IMU). Examples of the list of predetermined events may include pressing a power button, pressing a home button, or performing any other action that turns on the backlight of the mobile device, indicating that the user wishes to interact with the mobile device to perform an action or run an application.
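As a rough illustration, the comparison performed by a trigger event engine against a list of predetermined trigger events might look like the following sketch. The event names, the dictionary standing in for the specified trigger event database 326, and the returned event types are all assumptions for illustration.

```python
# Hypothetical stand-in for the specified trigger event database 326:
# maps predetermined detected events to trigger event types.
TRIGGER_EVENT_DB = {
    "power_button_press": "backlight_on",
    "home_button_press": "backlight_on",
    "stationary_timeout": "device_stationary",  # e.g., reported by IMU motion sensors
}

def check_trigger(detected_event):
    """Return the trigger event type if the detected event matches a
    predetermined trigger event, else None (not a trigger event)."""
    return TRIGGER_EVENT_DB.get(detected_event)

print(check_trigger("home_button_press"))  # -> backlight_on
print(check_trigger("screen_rotate"))      # not in the list -> None
```

A matching event would be output as trigger event 328 for downstream prediction; a non-matching event would simply be ignored by this engine.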
C. Identifying an application and performing an associated action
Once the trigger event is detected, the application may be identified based on the trigger event. In some cases, the identification of the application is not a preprogrammed action. Rather, the identification of the application may be a dynamic action that may change based on additional information. For example, the identity of the suggested application may be determined based on the context information.
The "context information" is collectively referred to as any data that may be used to define the context of a device. The context information for a given context may include one or more context data, each context data corresponding to a different characteristic of the device. The potential characteristics may belong to different categories, such as a temporal category (e.g., temporal information) or a location category. The context data may be used as features of the model (or sub-model), while the data used to train the model may include different characteristics of the same class. The particular context may correspond to a particular combination of device characteristics, or to only one characteristic.
Prediction using location
FIG. 4 illustrates a simplified block diagram of a prediction system 400 for identifying applications and corresponding actions based on trigger events and context information. The prediction system 400 may reside within the mobile device that identifies the application. The prediction system 400 may include hardware and software components.
The prediction system 400 includes a prediction manager 402 for identifying suggested applications, playback devices, or entries into or exits from AR areas. The prediction manager 402 may receive a trigger event, such as the trigger event 328 discussed for FIG. 3. The prediction manager 402 may use the information collected from the trigger event 328 to identify a suggested application or playback device 404. As shown, in addition to the trigger event 328, the prediction manager 402 may also receive context data 406.
Playback/streaming of media content from one electronic device to a remote device can be cumbersome. A user may have to click through a user interface multiple times to navigate to, identify, and select a desired playback device. It is therefore desirable for the electronic device to determine a preferred remote/playback device based on the location of the user and the known behavior of the user. These preferences are not limited to activity in certain applications, but also consider user behavior at certain locations, at certain times of day or days of the week, and at home or away.
A. Contextual information
The context information may be collected from the context data 406 and may be received at any time. For example, the context information may be received before and/or after the trigger event 328 is detected. Additionally, context information may be received during detection of the trigger event 328. The context information may specify one or more characteristics of the mobile device for a particular context. The context may be the surrounding environment (a type of context) of the mobile device when the trigger event 328 is detected. For example, the context information may be the time at which the trigger event 328 was detected. In another example, the context information may be the particular location of the device when the trigger event 328 was detected. In yet another example, the context information may be the particular date when the trigger event 328 was detected. Further, the context information may be data collected from a calendar, for example, the amount of time (e.g., days or hours) between the current time and the time of a calendar event.
The context information may include information about the current media content being played on the mobile device. For example, the context information may include information indicating whether the mobile device is currently streaming a media file to the playback device. The context information may indicate the type of media file being played (e.g., audio file, video file, etc.). The context information may also include information about devices associated with the mobile device. For example, the mobile device may be associated with a user account or group of user accounts (e.g., a family plan), and the context information may include information about other mobile devices or playback devices associated with the account. Such context information may provide more meaningful information about the context of the mobile device such that predictive application manager 402 may accurately suggest applications that the user may use in that context. Thus, the predictive application manager 402 using the context information may suggest an application to the user more accurately than without the context information.
Context data 406 may be generated by context sources 408. A context source 408 may be a component in the mobile device that provides data related to the current context of the mobile device. For example, the context sources 408 may be hardware devices and/or software code operating as an internal digital clock 410, a GPS device 412, a calendar 414, a motion sensor 432, and a sensor location module 416 to provide information related to the time of day, the location of the mobile device, the day of the year, the motion (or acceleration) of the mobile device, and the sensor location of the mobile device, respectively. The sensors 418 may include an optical sensor configured for an AR application. Other context sources may be used.
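A minimal sketch of how such context sources might be assembled into context data for the prediction manager is shown below. The field names and the `gather_context` function are hypothetical; the specification does not define a concrete data format.

```python
from datetime import datetime

def gather_context(now=None, sensor_location=None, is_streaming=False):
    """Collect illustrative context data from several assumed sources."""
    now = now or datetime.now()
    return {
        "time_of_day": now.strftime("%H:%M"),  # from the internal digital clock
        "day_of_week": now.strftime("%A"),     # from the calendar source
        "sensor_location": sensor_location,    # from a sensor location module
        "is_streaming": is_streaming,          # current media playback state
    }

ctx = gather_context(datetime(2024, 5, 31, 7, 30), sensor_location=(0.5, 0.5))
print(ctx["day_of_week"], ctx["time_of_day"])  # Friday 07:30
```

Each field corresponds to one characteristic of the device, matching the notion that context information comprises one or more pieces of context data.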
The sensor location module 416 may be software code configured to receive information from the sensor 418 and write data to the sensor location database 420. In an embodiment, the sensor location module 416 may receive measurements of sensor values from the sensor 418 and store the measured values as sensor locations in entries of the sensor location database 420. The sensor 418 may be a hardware component configured to detect transmitted signals, such as Wi-Fi signals, bluetooth signals, radio Frequency (RF) signals, and any other type of signal capable of wirelessly transmitting information. The sensor location module 416 may be coupled to a sensor location database 420 to store the detected sensor values for future reference by a learning expert module 422, as will be discussed further herein. The sensor location module 416 may then output the sensor locations as context data to the predictive application manager 402 using the measured sensor values.
B. Predictor module for determining recommendations
The predictive application manager 402 may then use the information collected from both the trigger event 328 and the context data 406 to identify the suggested application 404. Predictive application manager 402 may also determine actions to be performed, e.g., how and when to provide a message including or using suggested application 404, as may occur through a user interface provided to a user for interaction with suggested application 404.
Predictive application manager 402 may be coupled to several predictor modules 424A-424D to identify suggested applications 404. Each predictor module 424A-424D may be configured to receive information from the predictive application manager 402 and output predictions back to the predictive application manager 402. The information communicated to the predictor modules 424A-424D may include the trigger event 328 and any associated context data 406, and the predictions output to the predictive application manager 402 may include one or more applications and their corresponding confidence values that indicate the likelihood that a user will run an application based on the received information.
The predictor modules 424A-424D may be configured for different purposes. For example, the predictor modules 424A-424D may be configured to predict applications based on trigger events, predict actions for controlling accessories in the home, predict applications that are not currently installed on the mobile device but may be of interest to the user, and predict applications based on sensor location (i.e., the sensor location predictor module 424D). The predictor modules 424A-424D may also be used to recommend a playback device of a plurality of playback devices at a location.
The predictor modules 424A-424D may be used to detect when the mobile device is on a trajectory toward, within, or on a trajectory away from an AR area. For example, an AR map may define multiple AR areas, where one or more sensors (e.g., optical sensors) may be used to determine a user's location within the AR areas. The predictor modules 424A-424D may use the mobile device location and the predicted motion of the mobile device to determine a trajectory of the mobile device relative to the defined AR areas.
The location may be determined using GNSS sensors (e.g., GPS), micro-location from various clustering techniques, and/or motion sensors. Depending on the type of trigger event detected, the predictive application manager 402 may communicate information only to the predictor modules relevant to that type of event. Thus, the predictive application manager 402 may communicate information to one predictor module, a subset of the predictor modules 424A-424D, or all of the predictor modules 424A-424D.
Each predictor module may have a set of one or more evaluation rules for determining the predictions (e.g., applications and actions) that are communicated to the predictive application manager 402. The set of evaluation rules for the sensor location predictor module 424D may include a list of one or more applications corresponding to a sensor location or cluster of sensor locations, along with one or more criteria and actions to be taken for the one or more applications. The evaluation rules may select one or more applications based on the one or more criteria. For example, a likelihood (e.g., a confidence value) for each application may be provided, and a criterion may be to provide the top 5 most likely applications on a screen of the user interface, where such a display may include a message. The set of evaluation rules may also include confidence values for the applications in the list. The one or more criteria may include a predetermined set of context information that, when measured upon detection of the trigger event, indicates which applications in the list are likely to be accessed by the user.
Each set of evaluation rules may be a set of strings stored in memory or code compiled as part of an operating system. When the predictor modules 424A-424D receive information from the predictive application manager 402, the predictor modules 424A-424D may compare the received information to the evaluation rules and output the predicted application and confidence that best fit the received information. For example, the sensor location predictor module 424D may have a set of evaluation rules establishing that if the sensor location is within cluster 1, the likelihood of the user running a food-related application has a 90% confidence value, and if the sensor location is within cluster 2, the likelihood of the user running a news-related application has an 80% confidence value.
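The evaluation-rule example above (cluster 1 → food-related application at 90% confidence, cluster 2 → news-related application at 80%) can be sketched as a simple lookup with a top-N criterion. The rule table, application names, and function are illustrative assumptions rather than the specification's actual data structures.

```python
# Hypothetical evaluation rules: each known cluster maps to candidate
# applications with confidence values.
EVALUATION_RULES = {
    "cluster_1": [("food_app", 0.90), ("recipes_app", 0.40)],
    "cluster_2": [("news_app", 0.80), ("podcast_app", 0.35)],
}

def predict(cluster_id, top_n=5):
    """Return up to top_n (application, confidence) predictions for a cluster,
    most confident first; an unknown cluster yields no predictions."""
    candidates = EVALUATION_RULES.get(cluster_id, [])
    return sorted(candidates, key=lambda c: c[1], reverse=True)[:top_n]

print(predict("cluster_1"))  # [('food_app', 0.9), ('recipes_app', 0.4)]
```

The `top_n` parameter mirrors the criterion of providing the top 5 most likely applications on a screen of the user interface.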
While this example discusses considering sensor locations of mobile devices, other context data 406 may also be considered to determine predicted applications and their corresponding confidence values. For example, the time of day and day of the week may also affect the predictions determined by the predictor modules 424A-424D.
Once the predictive application manager 402 receives the predicted applications from the predictor modules 424A-424D, the predictive application manager 402 may communicate the suggested application 404 to an expert center 426. The expert center 426 may be part of code that manages what content is displayed on the mobile device (e.g., on a lock screen, when a search screen is opened, or on another screen). For example, the expert center 426 may coordinate which information is displayed to the user, such as suggested applications, suggested contacts, and/or other information. The expert center 426 may also determine how to provide such information to the user. As previously described herein, the particular user interface provided to the user may depend on the probability that the user will run the application. The higher the probability of use, the more aggressive the action that may be taken, such as automatically opening the application with a corresponding user interface (e.g., visual or voice commands), rather than merely providing a mechanism to more easily open the application.
If the expert center 426 determines that it is now the proper time to output the suggested application (or a message generated by the suggested application) to the user (e.g., when the user has not yet been running the application on the mobile device but is actively interacting with the mobile device), the expert center 426 may output a message 428 with the suggested application 404 to the recipient component 430. The recipient component 430 may be a user interface of the mobile device itself, or of another device, such as a tablet, laptop, smart watch, smart phone, or other mobile device. As another example, the recipient component 430 can be another application on the mobile device or an application of another device, where the application can be an operating system of the other device (e.g., in firmware), as may occur when a command message is transmitted to the other device to perform an action. In various embodiments, the recipient component 430 may be a playback device. Prediction manager 402 may use the available information to recommend playback devices for users seeking to stream media files. In the case where the suggested application 404 is included in the message 428, the recipient component 430 (e.g., a user interface) can communicate the suggested application 404 to the user and request a response from the user regarding the suggested application 404.
The recipient component 430 can require different levels of interaction in order for the user to run the suggested application 404. The respective levels may correspond to a degree of probability that the user will run the suggested application 404. For example, if predictive application manager 402 determines that the probability that suggested application 404 is being run by the user is greater than a threshold probability, recipient component 430 may output a prompt that allows the user to run the application faster by skipping one or more intermediate steps.
Alternatively, if predictive application manager 402 determines that the probability of the user running the identified application is less than the high threshold probability but still above the lower threshold probability, the identified application may be displayed as an icon. The lower threshold probability may be higher than the baseline threshold probability. The baseline threshold probabilities may establish a minimum probability that the corresponding application will be suggested. Thus, the user may have to perform the additional step of clicking on the icon to run the identified application. The number of clicks may still be less than that required when no application is suggested to the user. The threshold probability may vary depending on the application type. In some cases, the high threshold probability may be in a range between 75% and 100%, the lower threshold probability may be in a range between 50% and 75%, and the baseline threshold may be in a range between 25% and 50%. In an example, the high threshold probability is 75%, the lower threshold probability is 50%, and the baseline probability is 25%.
In some cases, a higher probability may result in more aggressive application suggestions. For example, if an application has a high probability of about 90%, predictive application manager 402 may provide an icon on a lock screen of the mobile device to allow a user to access the application by clicking on the icon once. If the application has a higher probability of about 95%, predictive application manager 402 may even automatically run the suggested application for the user without requiring the user to click on any content. In such cases, predictive application manager 402 may output not only the suggested application, but also commands specific to the application, such as a command to open a first article in a news-related application or a command asking the user to accept or decline to initiate a predetermined set of actions.
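The tiered behavior described above (baseline, lower, and high threshold probabilities, with automatic launch at very high probability) can be sketched as follows, using the example thresholds of 25%, 50%, 75%, and 95%. The action labels are hypothetical names, not terms from the specification.

```python
def choose_action(probability, high=0.75, lower=0.50, baseline=0.25):
    """Map a predicted usage probability to an illustrative action tier."""
    if probability >= 0.95:
        return "auto_run"          # run the suggested application automatically
    if probability >= high:
        return "lock_screen_icon"  # one click from the lock screen
    if probability >= lower:
        return "display_icon"      # extra click (icon) required
    if probability >= baseline:
        return "notification"      # passive suggestion only (assumed tier)
    return "no_action"             # below the baseline: do not suggest

print(choose_action(0.96))  # auto_run
print(choose_action(0.60))  # display_icon
print(choose_action(0.10))  # no_action
```

The thresholds are parameters because, as noted above, the threshold probability may vary depending on the application type.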
In some cases, predictive application manager 402 may determine what level of interaction is needed and then output this information to expert center 426. The expert center 426 may then communicate this information to the recipient component 430 for output to the user.
In some implementations, the recipient component 430 can display a notification to the user on a display screen. The notification may be transmitted, for example, by a system notification. The notification may be a visual notification that includes a picture and/or text informing the user of the suggested application. The notification may suggest an application to the user for selection and operation at the user's leisure. In some cases, the notification may also include suggested actions within the suggested application in order to make more aggressive predictions. That is, the notification may notify the user of the suggested application and suggested actions within the suggested application. Thus, the user may be provided with the option to run the suggested application or to perform suggested actions within the suggested application. For example, the notification may notify the user that the suggested application is a news-related application and that the suggested action is to access the first article within the news-related application. The user may indicate that he or she wants to read the first article by clicking on an icon indicating the first article. Alternatively, the user may indicate that he or she would like to run the application more to read another article by swiping the notification on the screen.
In some cases, the mobile device may identify which application is running at a sensor location and then draw an association between the sensor location and the application. The application may be stored in the sensor location database 420 along with the corresponding sensor location. In some cases, the sensor location database 420 may store sensor location data for a particular period of time. As an example, the sensor location database 420 may store sensor location data measured during the past seven weeks. Knowing which application is running at a sensor location helps to evaluate the user's habits in order to update the evaluation rules stored in the sensor location predictor module 424D for predicting applications based on those habits. In some cases, a learning expert module may routinely update the predictor modules 424A-424D.
C. Notifying a prediction module using historical information
As shown in fig. 4, the learning expert module 422 is coupled to the sensor location database 420 and the sensor location predictor module 424D. The learning expert module 422 may be configured to update the set of evaluation rules contained in the sensor location predictor module 424D. Although fig. 4 shows only one learning expert for updating one predictor module, the technique is not so limited. For example, the learning expert module 422 may also be configured to update any of the predictor modules 424A-424C. In other cases, additional learning experts may be implemented in the prediction system 400 for updating the predictor modules 424A-424C.
The learning expert module 422 may be a software module configured to access the sensor location database 420 and analyze its stored information to generate an updated set of evaluation rules for the sensor location predictor module 424D. The learning expert module 422 may include one or more predictive models (not shown). Each predictive model may be a portion of code and/or data specifically designed to identify an application for a particular trigger event. For example, one predictive model may be specifically designed to identify an application for a trigger event associated with turning on a backlight of a mobile device. Each predictive model may be coupled to a context source such that each predictive model may utilize context information to identify suggested applications. Examples of predictive models include neural networks, decision trees, multi-labeled logistic regression, and combinations thereof, as well as other types of supervised learning. Further details can be found in U.S. patent application Ser. Nos. 14/732,359 and 14/732,287, which are incorporated by reference in their entireties.
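As a hedged illustration of what one such predictive model could look like, the sketch below scores applications with a per-application logistic model over context features; the applications, feature names, weights, and suggestion threshold are all hypothetical and not taken from this application:

```python
import math

# Hypothetical per-application logistic scorers over context features.
# The apps, features, and weights below are illustrative only.
WEIGHTS = {
    "news_app":  {"morning": 1.2, "couch_cluster": 0.4, "bias": -0.5},
    "music_app": {"morning": -0.3, "couch_cluster": 1.5, "bias": -0.8},
}

def score(app, context_features):
    """Logistic probability that `app` is the one the user wants now."""
    w = WEIGHTS[app]
    z = w["bias"] + sum(w.get(f, 0.0) for f in context_features)
    return 1.0 / (1.0 + math.exp(-z))

def suggest(context_features, threshold=0.5):
    """Return the applications whose score clears the threshold."""
    return [a for a in WEIGHTS if score(a, context_features) > threshold]
```

A learning expert could periodically refit such weights from the sensor location database and swap the updated rules into the predictor module.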
As mentioned herein, the sensor location database 420 may store sensor location data over a particular period of time (e.g., over the last seven weeks of use). Thus, the updated set of evaluation rules generated by the learning expert module 422 for the sensor location predictor module 424D may reflect the device usage patterns over the past seven weeks. In some cases, once the learning expert module 422 has generated an updated set of evaluation rules, the learning expert module 422 may be deleted or removed from memory. After the next update of the sensor location predictor module 424D, the learning expert module 422 may be initiated again to generate an updated set of evaluation rules, which are then deleted again. Deleting the learning expert module 422 after generating the updated set of evaluation rules may save memory space and improve device performance.
In some cases, the learning expert module 422 may be run periodically. The time at which the learning expert module 422 is run may depend on the availability of the mobile device and the likelihood of use by the user. For example, the learning expert module 422 may run every night when the user sleeps, e.g., after the sensor location module determines a cluster of sensor locations. In such cases, the mobile device is typically connected to a power source to charge its battery and it is unlikely that the user will access the mobile device and interrupt the operation of the learning expert module 422.
D. Example use case
The location may be used to predict applications and actions of the mobile device. The prediction may be based on past actions of the mobile device, and associations between the mobile device, playback device, and other mobile devices. For example, the mobile device may present a graphical user interface for a music program (e.g., an application program), and the interface may suggest to the user to stream an audio file to a particular playback device (e.g., an action). Actions and applications may be suggested because the mobile device has an audio file that was previously streamed to the playback device at the current location of the device.
The mobile device may use the current location of the mobile device to suggest an application to the user. The device may be located where the user has previously streamed media content to a playback device, and the mobile device may suggest (e.g., as a graphical element presented on the user interface) an application program that allows streaming to the playback device. The suggested application may be based on past behavior of the mobile device (e.g., activity logs). For example, the list of suggested playback devices may include only playback devices to which the mobile device has previously streamed content, whether at the current location or at any location.
Suggested actions for a particular application may be based on historical information regarding actions of the mobile device and the application. For example, the suggested actions may include a prompt to stream media content from the mobile device to a remote playback device. Some applications may be able to stream different types of media content. For example, an application may include audio media content and video media content. The history information may indicate that the mobile device is more likely to stream audio content at the current location, from an application, or from an application at the current location. After identifying the application, the mobile device may present a graphical user interface with suggested actions for the application. The user interface may include graphical elements that suggest to the user to stream an audio file to an eligible playback device because the history information indicates that the mobile device is more likely to stream audio files from the application. Similarly, the user interface may include graphical elements that suggest to the user to stream a video file to an eligible playback device because the history information indicates that the mobile device is more likely to stream video files from the application.
The suggested actions and applications may be based on historical information regarding interactions between devices. The suggested actions may include identifying a playback device. Because of the association between the user account of the mobile device and the user account of the playback device (e.g., the two user accounts are the same or associated), these playback devices may be suggested. If the history information indicates that the mobile device has performed a currently suggested action in the past with the associated playback device, the device may be suggested. In some embodiments, if the history information indicates that the mobile device does not have a history of interactions with the playback device, the associated playback device may not be suggested.
The current actions of the mobile device may be used to suggest additional actions or applications. For example, the mobile device may stream media content to a playback device. The mobile device may present a graphical user interface that identifies other eligible playback devices within range of the mobile device. An eligible playback device may be a device that is located at the current location of the mobile device and is capable of playing the currently streamed media content. The movement of the mobile device may also be used to suggest playback devices. For example, when a mobile device that is streaming media content to a playback device associated with a first location moves to a second location, it may be presented with eligible playback devices in the second location.
Actions or applications may be suggested for the mobile device based on current actions of other mobile devices. For example, two or more mobile devices may be associated with a user account. One of the other mobile devices may stream media content to the playback device and the action may cause the mobile device to present a graphical user interface for controlling the playback device. If both of the associated devices are co-located, a graphical user interface may be presented. The graphical user interface, as well as any other graphical user interface in the present disclosure, may be presented on a lock screen of the mobile device.
The suggested applications or actions may be presented in an interface graphic (e.g., a widget). The interface graphic may present information from an application executing on the mobile device and receive input to the application. The interface graphic may have limited functionality when compared to an application associated with the interface graphic, but the interface graphic may have a higher priority access to the processor of the mobile device. For example, the interface graphic may be capable of waking up the processor from a low power mode, and the application may only be capable of interacting with the processor when the processor wakes up.
The interface graphic may present suggested actions and applications within the graphic. For example, the interface graphic may present the suggested media file for streaming. The suggested actions may be based on the context of the mobile device. The mobile device may detect that the movement of the device is above a threshold (e.g., based on motion sensor output). In response, the mobile device may determine the current location of the device. If a subsequent movement of the mobile device is detected, the location may be updated. Using the location, the mobile device may suggest playback devices in the location in response to input to the interface graphic (e.g., pressing a play button).
The suggested action may include disconnecting the mobile device from the playback device. For example, the mobile device may receive input to the application program and, in response, the mobile device may connect to the playback device for streaming media content from the application program. The mobile device may not stream the media content to the playback device or the device may pause streaming. After a threshold amount of time has elapsed, the mobile device may disconnect the playback device so that the mobile device does not inadvertently stream media content to an unintended playback device. The threshold amount of time may be 1 second, 5 seconds, 10 seconds, 15 seconds, 30 seconds, 45 seconds, 1 minute, 5 minutes, 8 minutes, or 10 minutes.
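The idle-disconnect rule described above can be sketched as follows; the 30-second threshold is an assumption picked from the example values listed, and the function names are illustrative:

```python
# Assumed threshold picked from the example values above (30 seconds).
DISCONNECT_THRESHOLD_S = 30

def should_disconnect(is_streaming, idle_seconds):
    """Drop a paused or idle playback connection once the threshold
    elapses, so media is not later streamed to an unintended device."""
    if is_streaming:
        return False  # actively streaming: keep the connection
    return idle_seconds >= DISCONNECT_THRESHOLD_S
```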
Efficient triggering of new ranging sessions
The mobile device may use several different sensors for accurate navigation. These sensors may be used to determine the location of the mobile device even when indoors or when GNSS information is not available. However, some of these location techniques may incur processing time when calculating the position of the mobile device from the sensor information. This processing time may cause undesirable delays in loading applications and may result in a poor user experience when using these mobile device applications. Further, it is desirable for the mobile device to actively maintain a balance between navigation sensor usage and battery power savings. Other types of sensors (e.g., motion sensors) may be used to determine that the location of the mobile device has changed sufficiently to warrant activating the positioning sensors.
A. Motion sensor and use of sensor positioning
A motion sensor (e.g., accelerometer or IMU) may be incorporated into the mobile device. These sensors may operate in the background and use minimal power. The motion sensor may detect acceleration over a period of time. Acceleration and time information may be used to determine the distance moved. The mobile device may experience multiple accelerations during discrete periods of time. The motion may be determined for a plurality of discrete time periods to determine a number of distances that may be combined to determine movement of the device from the starting position. For example, if the user is walking back and forth, the mobile device may experience movement in a first direction for a first period of time and in a second direction for a second period of time. The mobile device may determine a distance of the first time period and the second time period. The distances may be combined to determine whether the mobile device has moved more than a preset amount (e.g., 3 meters).
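The back-and-forth example above can be sketched as vector dead reckoning: per-interval displacements are summed, and the magnitude of the net movement is compared against the preset amount (3 meters in the text; the function names are illustrative):

```python
import math

def net_displacement(segments):
    """Sum per-interval (dx, dy) displacements, in meters, and return
    the magnitude of the net movement from the starting position."""
    x = sum(dx for dx, _ in segments)
    y = sum(dy for _, dy in segments)
    return math.hypot(x, y)

def moved_enough(segments, threshold_m=3.0):
    """True if the device ended up farther than `threshold_m` from
    where it started, warranting a location update."""
    return net_displacement(segments) > threshold_m
```

Walking back and forth largely cancels out, so no update is triggered, while a one-way walk past the threshold triggers one.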
The mobile device may combine the various distances calculated to determine movement of the mobile device. For example, if the mobile device has moved more than a certain amount, this may trigger activation of a ranging session to update the location of the mobile device. The updated location may be used to determine the relevant playback device for streaming the media file. In various implementations, the updated location may be used to determine whether the mobile device is on a trajectory into, or is inside or outside, the AR bubble so as to automatically activate or deactivate various sensors (e.g., optical sensors).
Fig. 5 illustrates an exemplary environment 500 for determining an updated location of a mobile device. Fig. 5 illustrates an indoor environment in which GNSS signals may not be available. Furthermore, GNSS signals (if available) may not be accurate enough to determine the location of the mobile device within the room. A user with a mobile device at a first location 502 may want to stream some media content to a playback device (e.g., a smart speaker 506 or a smart television 508). If only GNSS information is used, the mobile device may not be able to distinguish between a first location 502 in a room and a second location 504 in a different room.
The mobile device may periodically use ranging techniques to update the precise location of the mobile device. However, if the mobile device is stationary (e.g., the user is on a couch), the location of the mobile device may not change and the ranging techniques may unnecessarily consume power of the mobile device. To avoid unnecessary ranging sessions, the mobile device may use a motion sensor to determine a valid time for updating the location of the mobile device.
The updated location of the mobile device may be used for many different applications. For example, the updated location may be used to determine the relevant playback device for streaming the media content. In various embodiments, an application executing on a mobile device of a first user at a first location 502 may recommend one or more playback devices based on the first user's location and previous behavior of the first user. The updated location may be used to determine when the mobile device is on a trajectory into or out of the AR area for sensor activation or deactivation.
If the location of the mobile device is not updated, the application may provide a poor user experience. For example, if a user moves from a first location 502 in a first room to a second location 504 in a second room, a playback device (e.g., smart speaker 506 or smart television 508) in the first room may not be relevant to the user. The user's time at the second location 504 may be separated from the time at the first location 502 by seconds, minutes, days, weeks, or months.
The second location 504 may be determined using the ranging techniques described above. Various techniques for determining the second location 504 may include RF scanning (e.g., wi-Fi, BLE, UWB). However, this RF scan may cause some hysteresis in determining the second location 504 and may increase the power consumption of the mobile device to perform the RF scan.
Techniques using motion sensors can determine motion between two moments (e.g., seconds, perhaps minutes) that are relatively close in time. If the motion causes the mobile device to move a distance greater than a threshold amount, an updated location may be calculated.
B. Using motion as a trigger for ranging
Fig. 6 illustrates a second exemplary environment 600 for efficient location technology. The mobile device of the user at the first location 602 may determine a first geofence 606. The first geofence may be a defined area around the mobile device. For example, the first geofence may be a circular area with a radius of three meters. The mobile device can use geofences of different sizes to determine whether the mobile device has stopped moving. The radius of the geofence may be predefined and may be changed as desired. For example, if the geofence is used to determine whether the mobile device has stopped moving, the geofence may initially be a first value (e.g., three meters). If the geofence is not crossed within a certain period of time, the geofence can be reduced in size (e.g., two meters, one meter, or half a meter) to determine if the mobile device crosses a shorter-range geofence. For example, if the geofence is not breached within a predetermined period of time, a subsequent smaller geofence may be used to determine if the mobile device is still moving. Several consecutive smaller geofences can be used to detect if a mobile device is stationary. Once the device has not moved beyond a certain threshold distance, an updated position determination (e.g., an RF scan) may be performed.
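The successively smaller geofences can be sketched as below; the radius sequence follows the example values in the text (three meters down to half a meter), while the per-window bookkeeping and the requirement that all windows be observed are assumptions:

```python
# Assumed shrinking radius sequence, in meters.
RADII_M = [3.0, 2.0, 1.0, 0.5]

def is_stationary(window_displacements):
    """Each entry is the distance (meters) the device moved during one
    monitoring window. The device is considered stationary only if it
    stays inside every geofence in the shrinking sequence."""
    for radius, moved in zip(RADII_M, window_displacements):
        if moved > radius:
            return False  # geofence breached: still moving
    return len(window_displacements) >= len(RADII_M)
```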
One or more sensors (e.g., inertial measurement units) may detect movement of the mobile device. An Inertial Measurement Unit (IMU) is an electronic device that uses a combination of accelerometers, gyroscopes, and sometimes magnetometers to measure and report the specific force, angular rate, and sometimes also the orientation of the body of the device. The sensor may determine the speed of the mobile device. The distance traveled by the mobile device may be determined by determining the speed and the time elapsed at the speed.
In various embodiments, the mobile device can establish a geofence a predetermined distance (e.g., three meters) from the current location of the mobile device. The predetermined distance may vary and multiple geofences of different sizes may be used. The mobile device can use the sensor to detect when the geofence has been breached. For example, if the mobile device moves greater than a predetermined distance, the mobile device may exit the first geofence 606. When the mobile device leaves the first geofence, a second geofence 608 can be established at a second location 604. The motion sensor may then determine whether the second geofence 608 is breached.
The mobile device can use the geofence to determine whether an updated location (e.g., RF scan) is needed to update the current location of the mobile device. In various embodiments, the mobile device may wait until the mobile device is stationary for a predetermined period of time and then update the location.
Fig. 7 illustrates a flow chart of a process 700 for determining when to update a location of a mobile device (e.g., update an RF scan). It may not be efficient for a mobile device to update its location while it is in motion. For example, if a mobile device is moving from a first room to a second room, the mobile device may span several geofences. However, if the user is not using an application on the mobile device while traveling, the mobile device may wait until stationary to update the position fix. The mobile device may also use the orientation (e.g., screen right side up for user use) to determine the best time to update the location of the mobile device.
At block 702, the mobile device can determine whether the geofence is breached, as described above. The mobile device can use one or more sensors to determine whether the geofence is breached. The one or more motion sensors may include an IMU. If the motion sensor detects that the mobile device has moved beyond a predetermined distance, the geofence is considered breached. Even if the mobile device has moved beyond a predetermined distance (e.g., breached a geofence), conditions may exist that cause the mobile device to delay RF scanning. For example, the mobile device may wait until the display is active and then perform an RF scan. For example, the device may be stowed in a pocket or bag, and the motion sensor may detect that the geofence has been breached. However, the user may not be actively using the mobile device, so the screen is not active, and performing an RF scan would waste power.
At block 704, the mobile device may determine whether the screen is active. The screen is active if the mobile device is turned on, the screen is unlocked, and the device is not in a screen saver or power saving mode. One or more display sensors may be used to determine whether the screen is active.
At block 706, having determined that the screen is active and the geofence is breached, the mobile device may update the location (e.g., conduct an RF scan). After the scan is complete, the mobile device may set a new geofence. The mobile device can then monitor whether the new geofence is breached.
At block 708, the mobile device may wait for the next screen activity event. After detecting that the screen is active, the mobile device may proceed to block 706.
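The control flow of blocks 702-708 can be sketched as a single decision function; this is a simplified assumption of the logic, with illustrative action names:

```python
def next_action(geofence_breached, screen_active):
    """Decide what process 700 does next.

    Block 702: no breach means keep monitoring motion.
    Block 706: breach with an active screen triggers an RF scan
               and a new geofence.
    Block 708: breach with an inactive screen waits for the next
               screen-on event before scanning.
    """
    if not geofence_breached:
        return "keep_monitoring"
    if screen_active:
        return "rf_scan_and_set_new_geofence"
    return "wait_for_screen_event"
```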
C. Example use cases (e.g., predictive routing)
Fig. 8 illustrates a third exemplary environment 800 for determining updated position location of a mobile device. A first user at a first location 502 may select a media file 802 on a mobile device to stream to a playback device (e.g., smart speaker 506 or smart television 508). Although only two playback devices are depicted, the location may include many different playback devices (e.g., multiple smart speakers, one or more tablet computers, one or more laptop computers, one or more desktop computers, or one or more wearable devices). The mobile device 804 can display a list of streaming devices 806 and potentially related playback devices 808 (e.g., living room devices, bedroom devices, kitchen devices, office devices). The user at the first location 502 may select one of the playback devices 808 by tapping the name of the selected device on the screen of the mobile device 804.
In various embodiments, the mobile device may store a user behavior history for each application and for each location (e.g., a room or a location in a room). For example, if a first user often plays back audio media from a music application using the smart speaker 506 while sitting on the living room sofa, the smart speaker 506 may be given a higher priority in the list of recommended playback devices 808.
The user experience imposes low latency requirements: the recommendations must be listed quickly so that the user can select the desired playback device. The mobile device may detect patterns in the behavior history using the locations of events that may span several days.
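One low-latency way to realize such per-location recommendations is to count past (location, device) streaming events and rank devices for the current location; the data shape and names below are illustrative assumptions:

```python
from collections import Counter

def rank_playback_devices(history, location):
    """Rank playback devices for `location` by how often the mobile
    device streamed to them there. `history` is a list of
    (location, device) events, e.g. drawn from an activity log."""
    counts = Counter(device for loc, device in history if loc == location)
    return [device for device, _ in counts.most_common()]
```

Because the counts can be maintained incrementally, the ranked list is available immediately when the recommendation sheet is opened.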
In other embodiments, the updated position of the mobile device may be used to determine whether the mobile device is on a trajectory into the AR area. The AR area may be a designated area of a plurality of areas in which a sensor (e.g., inertial, optical, or inertial-optical) of the mobile device may be used to determine the precise location. The mobile device may automatically activate one or more sensors if the mobile device is expected to enter or be located within the AR area. If the mobile device is expected to leave the AR area, the mobile device may actively deactivate one or more sensors. Other use cases are also conceivable.
Flow for efficient positioning of mobile devices
Fig. 9 illustrates a flow chart of a process 900 for determining when to update a location of a mobile device (e.g., update an RF scan). The method may be performed by one or more processors of a mobile device. The method may also use motion sensors (e.g., IMUs). In various implementations, if the motion sensor detects that the mobile device has moved more than a threshold amount, the mobile device may update the location of the mobile device using one or more wireless sensors.
At block 902, in response to the trigger signal at the associated first time, process 900 may include generating a first location value using a first ranging session with one or more other devices. The ranging session may include transmitting and receiving wireless signals. The ranging session may use the received wireless signals to triangulate the location or cluster in which the mobile device is located. The wireless signal (e.g., wi-Fi, BT, BLE, UWB signal) may be transmitted from any signal source, for example, an electronic device, such as a wireless router, wi-Fi equipped appliance (e.g., set top box, smart home device), or bluetooth device. In various implementations, the first location value may be triangulated from receiving various wireless signals.
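As a hedged illustration of how a location value could be triangulated from ranging distances, the sketch below solves 2-D trilateration against three anchors with known coordinates (e.g., Wi-Fi or UWB devices); real implementations typically use more anchors and robust least-squares estimation:

```python
def trilaterate(anchors, distances):
    """Solve for (x, y) from three anchor positions and measured
    ranges. Subtracting pairs of circle equations yields two linear
    equations in x and y, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero if the anchors are collinear
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```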
The trigger event may be identified as an event that is sufficiently likely associated with operation of the mobile device. The trigger event may be caused by a user and/or an external device. For example, the trigger event may be a specific interaction of the user with the mobile device. In various implementations, the triggering event may be the mobile device being stationary for longer than a specified period of time, as determined by one or more motion sensors (e.g., an IMU). The specific interactions may be used to learn what the user did at a specific location and thus may be considered learning trigger events. Examples of learning trigger events are application launch, specific activities within an application (e.g., making a selection within an application), voice commands (e.g., asking a voice assistant to perform a search or other activity with an application), and first interactions during the day. As other examples, the triggering event may be an accessory device connecting to the mobile device, such as inserting a headset into a headset interface, making a bluetooth connection, and so forth. An event list of trigger events may be stored on the mobile device. Such a list may be a default list maintained as part of the operating system, and may or may not be configurable by the user.
According to an example, one or more of the process blocks of fig. 9 may be performed by a mobile device (e.g., a smart phone, a tablet device, a wearable device, or a laptop).
At block 904, the process 900 may include storing the first location value in a memory of the electronic device. The first location value may be stored locally on the mobile device memory or may be stored in a cloud-based service.
At block 906, the process 900 may include tracking motion of the mobile device using a motion sensor of the mobile device to determine a current location relative to the first location value. In various embodiments, the motion sensor may be an accelerometer of the mobile device. In various embodiments, the sensor may be an IMU of the mobile device. The accelerometer may determine movement of the mobile device over a period of time. The motion and time may be used to determine a dead reckoning position fix for the mobile device. For example, if the movement of the mobile device is 1.5 meters per second, after 2 seconds the mobile device may have moved 3 meters from the previous location. A series of locations may be stored in a memory of the electronic device. The series of positions may be used to determine movement of the mobile device.
At block 908, the process 900 may include determining that the current location of the mobile device has changed from the first location value by a threshold amount since the associated first time. The mobile device can establish one or more geofences. The geofence may be a predetermined radius around a particular location. The predetermined threshold amount may vary. For example, the threshold amount may be 0.5 meters, 1 meter, 2 meters, 4 meters, etc. The mobile device may use the detected motion over a period of time to determine whether the mobile device has moved from the first location beyond the threshold. For example, a user may be sitting on a sofa while using the mobile device; in that case, the movement of the mobile device will not exceed the predetermined threshold amount, and the mobile device does not use ranging techniques to update its location. However, if the person stands up and walks out of the room, the mobile device may detect that the predetermined threshold amount has been exceeded and trigger an update of the location. In various implementations, the motion sensor may include an accelerometer.
In various embodiments, the mobile device may verify that other conditions are met in addition to the mobile device moving beyond a threshold since the associated first time. For example, in various embodiments, the second ranging session also requires the screen of the mobile device to face the user. In various embodiments, the second ranging session also requires the screen of the mobile device to be active.
At block 910, in response to the current location of the mobile device having changed more than a predetermined threshold amount since the associated first time, process 900 may include generating a second location value using a second ranging session with one or more other devices. The second ranging session may include transmitting and receiving wireless signals. The second ranging session may use the received wireless signals to triangulate the location or cluster in which the mobile device is located. The wireless signal (e.g., wi-Fi, BT, BLE, UWB signal) may be transmitted from any signal source, for example, an electronic device, such as a wireless router, wi-Fi equipped appliance (e.g., set top box, smart home device), or bluetooth device. In various implementations, the second location value may be triangulated from receiving various wireless signals.
In various embodiments, the mobile device delays generating the second location value until the screen of the mobile device is active. The screen of the mobile device is active if the mobile device is not displaying a screen saver and is not in a power saving mode.
In various embodiments, the mobile device may detect whether the display screen is facing down, which may be an indication that the mobile device is not being used. If the display is detected facing down, the subsequent ranging session may be delayed until the device is in an orientation indicating that it is in use.
In various implementations, process 900 can include establishing a series of progressively smaller geofences to determine that the mobile device has stopped moving. For example, the first geofence may be 5 meters, the second geofence may be 2 meters, the third geofence may be 1 meter, and so on.
In various embodiments, the mobile device delays generating the second location value until the mobile device is stationary for a predetermined period of time.
At block 912, the process 900 may include storing the second location value in memory. For example, as described above, the device may store the second location value in memory. The second location value may be stored locally on the mobile device memory or may be stored in a cloud-based service.
Process 900 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein. In a first implementation, process 900 further includes determining a playback device for the streaming service based on the first location value or the second location value.
In various embodiments, process 900 further includes receiving a notification from the playback device, the notification instructing the mobile device to generate a third location value using a third ranging session for the mobile device. The third location value may be generated using one or more of the location techniques described above.
It should be noted that while fig. 9 shows example blocks of process 900, in some implementations, process 900 may include additional blocks, fewer blocks, different blocks, or differently arranged blocks than those depicted in fig. 9. Additionally or alternatively, two or more of the blocks of process 900 may be performed in parallel.
According to some embodiments, one or more target objects may be provided to the user as recommendations. According to some other embodiments, the action may be performed automatically based on one or more target objects.
According to some embodiments, the mobile device may periodically measure the sensor value. Once the likelihood of using an application, accessory device, or performing an action is sufficiently high, a target object may be predicted by performing steps 910 and 912.
Predicting the one or more target objects may include comparing, by an arbiter module, the outputs of the respective prediction models to determine one or more target objects to be provided on the user interface, to be acted upon, or to be provided to another software module on the mobile device. For example, another software module in communication with the mobile device may be provided with actions to be performed or recommendations (e.g., actions to be performed by the device).
It should be appreciated that the specific steps illustrated in fig. 9 provide a particular method of predicting a target object according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Furthermore, each step illustrated in fig. 9 may include a plurality of sub-steps, which may be performed in various sequences appropriate to each step. Also, additional steps may be added or removed depending on the particular application. Those of ordinary skill in the art will recognize many variations, modifications, and alternatives.
In various aspects, a mobile device may include one or more processors and memory storing a plurality of instructions that, when executed by the one or more processors, perform all or part of process 900 as described above.
In various aspects, a computer-readable medium may store a plurality of instructions that, when executed by one or more processors, may perform all or part of process 900 as described above.
Predicted actions of location-based augmented reality bubbles
The augmented reality system may use one or more sensors (e.g., cameras) to locate the position of the mobile device within the virtual map. The one or more sensors may determine the location of the mobile device based on the relative positions of various objects in the region of interest. For example, a user with a mobile device may map their location from a room in the real world to a location in the virtual world based on the relative positions of various items, objects, and obstacles from the real world, which may be mapped into the virtual world. For example, a room such as a bedroom may include four walls, a bed, a bedside table, a wardrobe, a door, and a light. Items of the bedroom may be mapped into the virtual space. Once items of the bedroom are mapped into the virtual bedroom, sensors on the mobile device may map the user into the virtual world based on the relative positions to the various mapping objects.
An Augmented Reality (AR) application may experience lag when activated due to the time needed to determine the location of the mobile device relative to an augmented reality area (also referred to as a bubble). There may also be lag in loading the AR application. Preloading techniques may help the process run faster. Conventional ranging services may be time consuming and may result in inefficient use of battery resources. Lag in determining device positioning may result in a poor user experience. Positioning techniques may be used to determine when a mobile device is approaching, departing from, or within some defined location.
A. example AR device
Optical tracking may use cameras placed on or around a device (e.g., a mobile device such as a phone or a headset) to determine position and orientation based on computer vision algorithms. The method is based on the same principle as stereoscopic human vision. When a person views an object with both eyes, he or she can roughly judge its distance due to the difference between the two eyes' views. In optical tracking, the cameras may be calibrated to determine the distance to an object and its position in space. Optical systems are reliable and relatively inexpensive, but they can be difficult to calibrate. Furthermore, the system requires an unoccluded line of sight; otherwise it will receive erroneous data.
Optical tracking may be accomplished with or without markers. Tracking with markers may involve using targets with known patterns as reference points; cameras continually look for these markers and then use various algorithms (e.g., the POSIT algorithm) to extract the position of the object. The markers may be visible, such as a printed Quick Response (QR) code, but many markers use infrared (IR) light that can only be picked up by a camera. Active implementations feature markers with built-in IR LED lights that can be turned on and off to synchronize with the camera, making it easier to block out other IR light in the tracking area. Passive implementations may include retroreflectors that reflect IR light back to the source with little scatter. Markerless tracking does not require any pre-placed targets; instead it uses natural features of the surrounding environment to determine position and orientation.
In outside-in tracking, a camera may be placed in a stationary position in the environment to track the location of a marker on a tracked device (such as a head mounted display or controller). Having multiple cameras allows different views of the same marker and this overlap allows accurate readings of the device's position. For example, a Virtual Reality (VR) system (e.g., original Oculus Rift) may utilize this technique to place a group of IR LEDs on its head-mounted device and controller to allow external cameras in the environment to read their positioning. Outside-in tracking is most mature and is applied not only to VR but also to motion capture technology for movies. However, this solution is spatially limited, requiring the external sensor to be placed in a constant field of view of the device.
For inside-out tracking, a camera may be placed on the tracked device and look outward to determine its location in the environment. A head mounted device using this technology has multiple cameras facing in different directions to obtain a view of its entire surroundings. The method may work with or without markers. The Lighthouse system used by the HTC Vive is an example of active markers. Each external Lighthouse module contains IR LEDs and a laser array that scans in the horizontal and vertical directions, and sensors on the head mounted device and controller can detect these sweeps and use timing to determine position. Markerless tracking (such as on the Oculus Quest) does not require anything to be installed in the external environment. It may use cameras on the head mounted device to perform a process called SLAM, or simultaneous localization and mapping, in which a 3D map of the environment is generated in real time. A machine learning algorithm then uses feature detection to determine the position of the head mounted device within the 3D map and to reconstruct and analyze its surroundings. This technique allows high-end headsets (e.g., Microsoft HoloLens) to be self-contained, and it also opens the door for cheaper mobile headsets that do not require a connection to an external computer or sensor.
Inertial tracking may use data from accelerometers and gyroscopes, and sometimes from magnetometers. The accelerometer measures linear acceleration. Since the derivative of position with respect to time is velocity and the derivative of velocity is acceleration, the output of the accelerometer can be integrated to find the velocity and then integrated again to find the position relative to some initial point. The gyroscope measures angular velocity. The angular velocity may likewise be integrated to determine the angular position relative to an initial point. Magnetometers measure magnetic fields and magnetic dipole moments. The direction of the earth's magnetic field may be used to obtain an absolute orientation reference and to compensate for gyroscope drift. Modern inertial measurement units (IMUs) are based on MEMS technology, allowing orientation (roll, pitch, yaw) to be tracked in space with high update rates and minimal latency. Gyroscopes are always used for rotational tracking, but different techniques are used for positional tracking based on factors such as cost, ease of setup, and tracking volume.
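The double integration described above can be sketched as follows. This is a minimal one-dimensional illustration and omits the gravity removal, frame rotation, and drift correction that a real IMU pipeline would need:

```python
def integrate_imu(accel_samples, dt, v0=0.0, x0=0.0):
    """Double-integrate 1-D linear acceleration into velocity and position.

    Uses simple rectangular integration: dv = a*dt, then dx = v*dt.
    """
    velocities, positions = [], []
    v, x = v0, x0
    for a in accel_samples:
        v += a * dt          # integrate acceleration -> velocity
        x += v * dt          # integrate velocity -> position
        velocities.append(v)
        positions.append(x)
    return velocities, positions

# Constant 2 m/s^2 acceleration sampled at 10 Hz for 1 s.
v, x = integrate_imu([2.0] * 10, dt=0.1)
```

Because each integration step accumulates measurement error, position estimates drift over time, which is why the document later pairs inertial tracking with an absolute reference such as optical tracking.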
Dead reckoning may be used to track positioning data, which may change the virtual environment by updating the user's motion changes. Dead reckoning update rates and predictive algorithms used in virtual reality systems affect the user experience, but since many different techniques have been used, no consensus has been reached for best practice. Since dead reckoning can cause drift, it is difficult to determine accurate positioning by inertial tracking alone, and thus this type of tracking is not used in isolation in virtual reality. It has been found that a lag between the movement of the user and the virtual reality display exceeding a defined time (e.g., 100 milliseconds) can cause nausea.
Inertial sensors are able to track not only rotational movements (roll, pitch, yaw), but also translational movements. Together, these two types of motion are referred to as six degrees of freedom. Many applications of virtual reality need to track not only the user's head rotation, but also how their body moves with them (left/right, back/front, up/down). Not all virtual reality experiences require six degrees of freedom capability, but it is useful when a user needs to move something other than his head.
Sensor fusion may combine data from several tracking algorithms and may produce better outputs than any single technique. One variant of sensor fusion is to combine inertial and optical tracking. These two techniques are often used together because inertial sensors are optimal for tracking fast movements but accumulate errors quickly, while optical sensors provide an absolute reference to compensate for inertial weaknesses. Furthermore, inertial tracking may offset some of the deficiencies of optical tracking. For example, optical tracking may be the primary tracking method, but when occlusion occurs, inertial tracking estimates the position until the object is again visible to the optical camera. Inertial tracking may also generate positioning data between optical tracking samples because inertial tracking has a higher update rate. Optical tracking also helps to cope with drift in inertial tracking. Combined optical and inertial tracking has been shown to reduce misalignment errors that typically occur when users move their heads too fast. Advances in microelectromechanical systems (MEMS) have made magnetic/electric tracking more popular due to its small size and low cost.
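One simple form of optical/inertial fusion is a complementary filter. The sketch below is an illustrative assumption (the blending weight alpha is not from the disclosure); it shows the fallback to the inertial estimate during occlusion and the pull toward the optical (absolute) estimate otherwise:

```python
def fuse(optical_pos, inertial_pos, alpha=0.98):
    """Complementary filter for one position axis.

    Trusts the high-rate inertial estimate, blended toward the optical
    estimate to cancel inertial drift. alpha is an assumed weight.
    """
    if optical_pos is None:
        # Occlusion: the optical camera cannot see the object,
        # so rely on inertial tracking alone.
        return inertial_pos
    return alpha * inertial_pos + (1 - alpha) * optical_pos
```

In a loop, the fused output would feed back as the next inertial starting point, so each optical sample gradually corrects accumulated drift.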
Acoustic tracking systems use techniques for identifying the location of an object or device that are similar to those found naturally in animals that use echolocation. Similar to bats that use the time difference of return of sound waves to their two ears to locate an object, an acoustic tracking system in a VR may use a set of at least three ultrasonic sensors and at least three ultrasonic transmitters on a device in order to calculate the location and orientation of the object (e.g., a handheld controller). There are two ways to determine the location of an object, measuring the time of flight of an acoustic wave from a transmitter to a receiver, or measuring the phase coherence of a sinusoidal acoustic wave by receiving a transmission.
The time-of-flight method may use a set of three non-collinear sensors (or receivers) with distances d1 and d2 between them. Given the propagation times of the ultrasonic waves (waves with acoustic frequencies greater than 20 kHz) from the transmitter to the three receivers, the relative Cartesian position of the transmitter may be calculated as follows:
x = (l1² − l2² + d1²) / (2·d1)
y = (l1² − l3² + d2²) / (2·d2)
z = √(l1² − x² − y²)
Here, each li represents the distance from the transmitter to one of the three receivers, calculated from the propagation time tus of the ultrasonic wave using the equation l = C·tus. The constant C represents the speed of sound, which is 343.2 m/s in dry air at a temperature of 20 °C. These calculations are often referred to as triangulation because at least three receivers are needed.
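The time-of-flight calculation can be sketched as follows, assuming an illustrative receiver layout with R1 at the origin, R2 at (d1, 0, 0), and R3 at (0, d2, 0); this layout is an assumption for the sketch, not specified in the disclosure:

```python
import math

C = 343.2  # speed of sound in dry air at 20 degrees C, m/s


def locate(t1, t2, t3, d1, d2):
    """Trilaterate a transmitter from ultrasonic times of flight.

    Assumes receivers R1=(0,0,0), R2=(d1,0,0), R3=(0,d2,0).
    """
    # l = C * t for each receiver
    l1, l2, l3 = C * t1, C * t2, C * t3
    x = (l1**2 - l2**2 + d1**2) / (2 * d1)
    y = (l1**2 - l3**2 + d2**2) / (2 * d2)
    # Clamp to avoid a domain error from measurement noise.
    z = math.sqrt(max(l1**2 - x**2 - y**2, 0.0))
    return x, y, z
```

For example, a transmitter at (1, 2, 2) with d1 = d2 = 2 yields times of flight of 3/C, 3/C, and √5/C seconds, and the function recovers the original coordinates.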
In addition to its position, determining the orientation of the device (i.e., its rotation in all directions) requires knowledge of at least three non-collinear points on the tracked object. This requires at least three ultrasonic transmitters per tracked device, in addition to the three aforementioned receivers. The transmitters emit ultrasonic waves in sequence to the three receivers, and spatial data for the three transmitters can then be derived using the methods described above. The orientation of the device may then be derived based on the known placement of the transmitters on the device and their spatial positions relative to each other.
B. loading an AR environment from memory (e.g., database)
The user may create an AR map using the mobile application. The mobile application may use the mobile device camera to define the boundaries of the AR map. The mobile application may use the images from the camera of the mobile device, generate virtual representations of objects in the images, and place these virtual representations in the AR map. The map may be stored in memory. In various implementations, the map may be stored on a server. Fig. 10 illustrates an exemplary AR map 1000 of a home. The home may include a kitchen, a restaurant, a living room, and two bedrooms. Various AR areas may be defined in the AR map 1000. The various AR regions may include, but are not limited to, a first bedroom region (AOI-1) 1002, a second bedroom region (AOI-2) 1004, a hallway region (AOI-3) 1006, a restaurant region (AOI-4) 1008, a kitchen region (AOI-5) 1010, and a living room region 1012. Other regions may be defined.
The AR map 1000 may be stored in a memory of the mobile device. In various implementations, the AR map 1000 may be uploaded from a server via a network (e.g., the internet).
Simultaneous localization and mapping (SLAM) techniques may be used to construct a map of an environment (e.g., a bedroom) and simultaneously locate a user in the map. The SLAM algorithm allows the mobile device to draw an unknown environment.
Visual SLAM (or vSLAM) uses images acquired from cameras and other image sensors. Visual SLAM may use simple cameras (e.g., wide-angle, fish-eye, and spherical cameras), compound-eye cameras (stereo and multi-camera), and RGB-D cameras (depth and ToF cameras). Visual SLAM can be implemented at low cost with a relatively inexpensive camera. In addition, because cameras provide a large amount of information, they can be used to detect landmarks (previously measured locations). Landmark detection may also be combined with graph-based optimization to achieve flexibility in SLAM implementation.
Monocular SLAM is a case in which vSLAM uses a single camera as the sole sensor, which makes defining depth challenging. This may be addressed by detecting AR markers, checkerboards, or other known objects in the image for localization, or by fusing camera information with another sensor, such as an inertial measurement unit (IMU), which can measure physical quantities such as speed and orientation. Technologies related to vSLAM include structure from motion (SfM), visual odometry, and bundle adjustment.
SLAM techniques may use landmarks or areas of interest (AOI) to minimize positioning errors. Using previous mapping techniques, the mobile device may detect when the mobile device is near an AOI and may identify which AOI of a plurality of AOIs (e.g., in a house) the mobile device is near. However, the user will need to manually turn on the sensor to perform the positioning technique within the AOI. Additionally, when the mobile device leaves the AOI, the user will need to manually turn off the sensor.
AOI (AR bubble) creation may involve several technical steps. The location manager module may activate optical and inertial sensors to provide accurate positioning information for the mobile device. The location manager module may update the heading information (based on the heading information from the compass of the mobile device) to determine the azimuth of the mobile device relative to true north. The AR toolkit module may begin visual SLAM world mapping, including geometry-based planar detection of regions to define AOI (or AR bubbles). The AR toolkit and scene toolkit module may add enhancements (e.g., spatialized audio or video content) to the AOI. The user interface may be used to store AOI to a world map. The stored information of the AOI may include latitude, longitude, altitude information, and absolute altitude and time accuracy provided by a location manager.
C. positioning of tracking device relative to region of interest
Various positioning techniques may be used to automatically detect when the trajectory of the mobile device is predicted to be within the AOI and then automatically turn on the sensor. These techniques may include Pedestrian Dead Reckoning (PDR), wireless ranging, GNSS signals, acoustic techniques, and using micro-positioning derived from wireless signals (e.g., wi-Fi, bluetooth low energy, etc.). In these techniques, the origin of the AR world may be the center of the geofence. The geofence may be a predetermined distance around the point. Positioning techniques may be used to determine when a mobile device moves beyond a predetermined distance. If the mobile device is outside of the AR area and the sensor detects that the mobile device is on a trajectory into the AR area, the mobile device may load the AR map 1000 of the area and initiate a sensor (e.g., an optical and/or inertial sensor). When the mobile device is within the AR area and on the trajectory exiting the geofence, the sensor may be turned off, thereby saving limited power resources for the mobile device. Positioning techniques may also be used to detect the proximity to the second AOI and once within the second AOI, turn the sensor on again.
These techniques allow sensor (e.g., camera) collection to be suspended. In various embodiments, sensor collection may be reduced to a low collection rate (e.g., 1 Hz). The techniques may also pause the visual SLAM session and/or the augmented display session until the predicted trajectory indicates that the mobile device is on a path into the AOI. Positioning techniques allow low power techniques to be used to determine movement between AOIs. However, a single positioning technique (e.g., PDR) may not by itself provide sufficient resolution to determine toward which of multiple AOIs the mobile device is heading. Thus, various positioning techniques (e.g., GNSS and wireless ranging) may be combined with a geofence that determines when a mobile device has moved a predetermined distance. Combining various positioning techniques allows for improved resolution in determining the position and the trajectory toward the AOI.
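A minimal sketch of geofence-driven sensor duty cycling might look like the following; the function names and the simple on/off rule are illustrative assumptions, not the disclosed implementation:

```python
import math


def distance(p, q):
    """Planar distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])


def update_sensors(device_pos, aoi_center, geofence_radius, sensors_on):
    """Return the new sensor power state for one position update.

    Sensors turn on when the device crosses into the geofence around an
    AOI (so the AR map can be loaded early) and off when it leaves,
    conserving battery.
    """
    inside = distance(device_pos, aoi_center) <= geofence_radius
    if inside and not sensors_on:
        return True   # power up optical/inertial sensors
    if not inside and sensors_on:
        return False  # power down to save battery
    return sensors_on
```

A production implementation would likely add a small buffer between the on and off radii so a device hovering at the boundary does not toggle the sensors repeatedly.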
D. Entering the AR region
Previous AR techniques may detect when a mobile device is in proximity to an AOI. However, manual user interface input is required to start the AR session and initiate the sensors for the session. This manual interaction also introduces lag in positioning the mobile device in the AR area, because the sensors need time to receive input and calculate the precise position of the mobile device within that area. The following techniques describe using positioning techniques to automate the positioning process and reduce lag, thereby improving the user experience.
Fig. 11 illustrates an example of a map 1100 of one potential use of a positioning technique for entering an AR area. Map 1100 illustrates the location of mobile device 1102 and a trajectory 1104 of mobile device 1102 based on the detected motion of mobile device 1102. There are two potential AOIs that the mobile device 1102 can be predicted to enter, such as a first AOI 1106 and a second AOI 1108. Without the use of PDR techniques, there may be lag while the AR system determines which AOI to locate. If the trajectory of the device changes, such as to the second trajectory 1110, the system may disambiguate and begin positioning for the first AOI 1106. As the mobile device 1102 passes the geofence, the sensors may be activated to begin collecting and analyzing sensor data and prepare to begin positioning techniques for the first AOI 1106. By activating the sensors and collecting data early, these techniques can reduce the lag of previously used manual techniques.
AOI (AR bubble) entry may involve several technical steps. A positioning module in the mobile device may be started to provide accurate positioning information prior to AOI entry. The positioning module may be a combination of hardware (e.g., optical sensor, inertial sensor, GNSS sensor, camera, compass, accelerometer, wireless sensor) and software code that can determine the precise location of the mobile device. The positioning module may begin by updating heading information of the mobile device relative to the azimuth of true north. The mobile device may load the saved AOI file (ARBubble) into active memory. The mobile device may begin coarsely sorting the AOI files (ARBubble) by the nearest AOI as determined by the coarse location from the positioning module. The positioning module may begin updating the motion of the mobile device, including pose (referenced to true north), rotation rate, gravity, and user acceleration. The mobile device may calculate a predicted trajectory of the mobile device based on three motion update points and the heading information. The mobile device may begin fine sorting of AOI (ARBubble) locations based on positioning techniques. Once the fine sorting has identified a candidate ARBubble, the augmented reality module may begin visual SLAM techniques to enable world map relocalization. The sensor may be turned on at this point. Once the world has been relocalized, the mobile device may load the augmented content. Once the augmentation is loaded, the mobile device can set a geofence and the AOI (ARBubble) "world center".
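The trajectory prediction from three motion update points can be sketched as follows; the linear extrapolation and the number of look-ahead steps are illustrative assumptions:

```python
def predict_entry(points, aoi_center, radius, steps=5):
    """Extrapolate a straight-line trajectory from the last three motion
    update points and test whether it enters an AOI geofence.
    """
    (x1, y1), (x2, y2), (x3, y3) = points[-3:]
    # Average displacement per update as a crude velocity estimate.
    vx = (x3 - x1) / 2.0
    vy = (y3 - y1) / 2.0
    for k in range(1, steps + 1):
        px, py = x3 + k * vx, y3 + k * vy
        if (px - aoi_center[0]) ** 2 + (py - aoi_center[1]) ** 2 <= radius**2:
            return True  # predicted to enter the AOI: start SLAM early
    return False
```

When this prediction fires, the device could begin the fine sorting and visual SLAM relocalization described above before the user actually crosses into the AOI.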
E. Exit the AR region
AOI (ARBubble) position sorting may be suspended while the user operates inside the AOI (ARBubble). When the mobile device exits the AOI (AR bubble), the mobile device can use various positioning features to maintain tracking of the mobile device within the AOI (ARBubble) within the constraints of the geofence. Once the mobile device has exited the geofence, the (ARKit) ARBubble session pauses and the sensor is turned off. The mobile device may return to the AOI "bubble entry" state and continue sorting until the system is interactively turned off via the user interface.
Fig. 12 illustrates an exemplary map 1200 of a user within an AR area. Map 1200 illustrates the location of mobile device 1102 and a trajectory 1204 of mobile device 1102 that is based on the detected motion of mobile device 1102. In fig. 12, trajectory 1204 shows that the mobile device is about to leave the first AOI 1106.
The prior art cannot detect when the mobile device has exited the AOI, and the sensors and AR toolkit will continue running positioning techniques until the user manually deactivates them. Using various positioning techniques, the mobile device application may determine when the mobile device 1102 has traveled a predetermined distance (e.g., 3 or 4 meters) to exit the AR area. When the mobile device exits the AR area, power may be removed from the sensors (e.g., optical and inertial sensors), and accurate positioning techniques may be suspended or turned off, thereby conserving the power resources of the mobile device battery.
The initial trigger of AR localization using visual SLAM techniques may originate from the core location. The core location may be from GPS positioning, wi-Fi SLAM technology, micro positioning, UWB ranging, or another positioning technology.
Example use of F.AR positioning techniques
Fig. 13 illustrates a second map 1300 of a home. PDR techniques may be used to determine a first path 1302 of a first mobile device of a first user and a second path 1304 of a second mobile device of a second user. The identified points 1306 may identify known areas (e.g., kitchen, bedroom, bathroom, etc.) in the second map 1300. The identified points 1306 may be mapped into coordinate space as AOIs or AR bubbles.
The location manager module may be warmed up to provide accurate location information for the mobile device. The location manager module may update the heading information (based on the heading information from the compass of the mobile device) to determine the azimuth of the mobile device relative to true north. The AR toolkit module may begin a visual SLAM world map that includes geometry-based planar detection of regions to define the AOI (or AR bubbles). The AR toolkit and scene toolkit module may add enhancements (e.g., spatialized audio or video content) to the AOI. The user interface may be used to store the AOI to a world map. The stored information of the AOI may include latitude, longitude, altitude information, and the absolute altitude and time accuracy provided by a location manager.
G. Flow for augmented reality positioning techniques
Fig. 14 is a flow chart of a process 1400 for determining when a mobile device trajectory is predicted into a region of interest (AOI) using positioning techniques according to examples of the present disclosure. According to an example, one or more of the process blocks of fig. 14 may be performed by a mobile device (e.g., a wearable device, a smart phone, a tablet computer, a laptop computer).
At block 1402, in response to a trigger signal at an associated first time, the process 1400 may include determining a first location value of the mobile device. For example, the mobile device may determine the first location value of the mobile device using one or more of GNSS (e.g., GPS timing signals), a micro-positioning determination, a wireless ranging session, or an RSSI determination from one or more transmitting devices.
At block 1404, process 1400 may include storing the first location value in memory. For example, the mobile device may store the first location value in a local memory of the mobile device. In various embodiments, the mobile device may store the first location in a memory of a cloud-based server device.
At block 1406, the process 1400 may include tracking a motion of the mobile device using a motion sensor of the mobile device to determine a current location relative to the augmented reality area. For example, the device may track the motion of the mobile device using a motion sensor of the mobile device to determine a current location relative to the augmented reality area, as described above.
At block 1408, the process 1400 may include initiating an augmented reality mode on the mobile device when the mobile device is within a predetermined distance from the augmented reality area. The predetermined distance may vary depending on the desired size of the AOI. In various embodiments, the predetermined distance may be between 1 meter and 7 meters. For example, as described above, the mobile device may initiate an augmented reality mode on the mobile device when the mobile device is within a predetermined distance from the augmented reality region.
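The four blocks above can be sketched as a minimal class; the class and method names, the planar coordinates, and the simple dead-reckoning update are illustrative assumptions, not the disclosed implementation:

```python
class ARLocationTracker:
    """Minimal sketch of process 1400 (blocks 1402-1408)."""

    def __init__(self, ar_area_center, threshold_m=3.0):
        self.ar_area_center = ar_area_center
        self.threshold_m = threshold_m  # predetermined distance (1-7 m)
        self.location = None
        self.ar_mode = False

    def on_trigger(self, location_fix):
        # Blocks 1402/1404: determine and store the first location value.
        self.location = location_fix

    def on_motion(self, dx, dy):
        # Block 1406: dead-reckon the current location from motion data.
        x, y = self.location
        self.location = (x + dx, y + dy)
        # Block 1408: enter AR mode within the predetermined distance.
        cx, cy = self.ar_area_center
        dist = ((self.location[0] - cx) ** 2
                + (self.location[1] - cy) ** 2) ** 0.5
        if dist <= self.threshold_m:
            self.ar_mode = True
```

For example, a device triggered at the origin that then moves 8 meters toward an AR area centered 10 meters away would cross the 3-meter threshold and enter AR mode.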
Process 1400 may include additional implementations, such as any single implementation or any combination of implementations described below and/or in connection with one or more other processes described elsewhere herein. In a first implementation, the process 1400 may include capturing sensor data using a second sensor on the mobile device and storing the sensor data in a memory.
In various implementations, process 1400 may include suspending an augmented reality mode on the mobile device when the current location of the mobile device is outside of an augmented reality region (such as an AOI or AR bubble). In various implementations, suspending the augmented reality mode may include disabling, powering down, or changing the detection rate of one or more of the sensors of the mobile device that are available for positioning.
In various embodiments, process 1400 further includes tracking motion of the mobile device using a motion sensor of the mobile device to determine a trajectory of the mobile device, and initiating an augmented reality mode when the trajectory of the mobile device is to be within a predetermined range from the augmented reality area.
In various implementations, the process 1400 can include suspending an augmented reality mode on the mobile device when the trajectory of the mobile device is outside of the augmented reality region. In various implementations, suspending the augmented reality mode may include disabling, powering down, or changing the detection rate of one or more of the sensors of the mobile device that are available for positioning.
It should be noted that while fig. 14 shows example blocks of process 1400, in some implementations, process 1400 may include additional blocks, fewer blocks, different blocks, or differently arranged blocks than those depicted in fig. 14. Additionally or alternatively, two or more of the blocks of process 1400 may be performed in parallel.
It should be appreciated that the specific steps illustrated in fig. 14 provide a particular method of predicting an action according to some embodiments. Other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Furthermore, each step illustrated in fig. 14 may include a plurality of sub-steps, which may be performed in various sequences appropriate to each step. Also, additional steps may be added or removed depending on the particular application. Those of ordinary skill in the art will recognize many variations, modifications, and alternatives.
VIII. Proximity determination using sensor locations stored at a server
Detecting device proximity may be accomplished using a simultaneous exchange of ranging messages between nearby devices. These messages can be exchanged if both devices are active, close to each other, and share a common message protocol. Thus, these simultaneous techniques cannot be used to detect a nearby device whose battery is depleted, that has failed to synchronize, that cannot support ranging measurements, or that is otherwise offline.
According to some embodiments of the present disclosure, a device may use sensor positioning (e.g., as described above in sections I and II) to determine device proximity non-simultaneously without directly exchanging messages. The signal fingerprint may include a set of electromagnetic signal measurements (e.g., a set of sensor values) made by the device using nearby anchors (e.g., signal sources). The signal fingerprint (e.g., sensor location) may include a list of detected signal sources (e.g., wi-Fi access points) and a distance measurement (e.g., time of flight or Received Signal Strength Indicator (RSSI)) for each signal source. Such signal sources may be referred to as anchor devices. The first mobile device may upload the signal fingerprint to the server, for example, at regular intervals or in response to a triggering event, such as pairing with the device or detecting a low battery. The signal fingerprint may include a set of sensor values and corresponding anchor identifiers of corresponding anchor devices.
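A signal fingerprint of this kind might be assembled as follows; the field names and JSON encoding are illustrative assumptions, not a format specified in the disclosure:

```python
import json
import time


def build_fingerprint(device_id, measurements):
    """Assemble a signal fingerprint from per-anchor measurements.

    `measurements` maps an anchor identifier (e.g., a Wi-Fi access
    point ID) to a distance estimate in meters, derived from time of
    flight or RSSI.
    """
    return {
        "device_id": device_id,
        "timestamp": time.time(),
        "anchors": [
            {"anchor_id": anchor_id, "distance_m": dist}
            for anchor_id, dist in sorted(measurements.items())
        ],
    }


fp = build_fingerprint("phone-1", {"ap-kitchen": 2.5, "ap-hall": 6.1})
payload = json.dumps(fp)  # body of a periodic or event-triggered upload
```

The serialized payload could be uploaded to the server at regular intervals, or on a triggering event such as pairing or detecting a low battery, so that a second device can later retrieve it by device identifier.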
The second device may use the stored sensor locations (also referred to as signal fingerprints) of the one or more first devices in various ways. For example, an action of a first user of a first device at a previous location may be used to predict an action that a second user may take when the second device is in a similar location. The action may include suggesting a device to be paired with, for example, for delivering content to be played, such as audio or video. As another example, the second device may guide the user to find the first device, for example, when the battery of the first device has run out and thus the first device is no longer responsive. This is an example of a proximity classification.
To determine the current or previous location of the first device, the second device may retrieve the signal fingerprint from the server. The fingerprint (e.g., a fingerprint generated by a lost device associated with a device identifier) may be retrieved in response to a request that may include a device identifier associated with a previously generated fingerprint. The second device may determine its own signal fingerprint and provide the fingerprint and the retrieved fingerprint as inputs to the similarity function. The output of the similarity function may be used to determine whether the two sensor locations are sufficiently similar to provide an active suggestion (e.g., an application or an action of an application) or to direct a user of the second device to the first device. Such steps may also be performed by the server to determine a proximity classification.
As an example, the similarity function may translate the similarity of sensor locations into a probability of spatial proximity (e.g., a probability that two devices are within a specified distance of each other) or potentially into a distance vector. If the probability is above a threshold, the second device may designate the first device as a proximate device. The threshold may depend on the ranging technique used to calculate the distance. For example, a Received Signal Strength Indicator (RSSI) measurement may not be as accurate as a time-of-flight measurement, and thus the threshold used with time-of-flight measurements may be smaller in magnitude than the threshold used with RSSI measurements.
The similarity function may compare, for each anchor, the Cartesian distances recorded in the two fingerprints. If the difference between the distances to each anchor is within a threshold, the two fingerprints may be classified as being close to each other. The Cartesian (Euclidean) distance between two points in three-dimensional space can be given by:

d(p, q) = √((p1 − q1)² + (p2 − q2)² + (p3 − q3)²)

where p = (p1, p2, p3) and q = (q1, q2, q3) are points on a three-dimensional coordinate system having mutually perpendicular axes.
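By way of illustration, the per-anchor distance comparison described above may be sketched as follows (a non-limiting Python sketch; representing a fingerprint as a mapping from anchor identifier to measured distance, and the particular threshold value, are assumptions made for illustration only):

```python
import math

def euclidean_distance(p, q):
    # Cartesian (Euclidean) distance between two 3-D points p and q.
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def fingerprints_close(fp1, fp2, threshold):
    # Classify two fingerprints as close if, for every anchor present in
    # both, the recorded distance measurements differ by at most `threshold`.
    common = fp1.keys() & fp2.keys()
    if not common:
        return False
    return all(abs(fp1[a] - fp2[a]) <= threshold for a in common)
```

For example, the fingerprints {"A1": 2.0, "A2": 5.0} and {"A1": 2.3, "A2": 4.8} would be classified as close under a threshold of 0.5, since each per-anchor difference is below the threshold.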
As another example, the similarity function may determine the cosine similarity between vectors in each fingerprint. The vectors in the fingerprint may be vectors from the device to the anchors, one for each anchor. Cosine similarity is the cosine of the angle between two vectors, and it ranges from −1 (if the two vectors point in opposite directions), through 0 (for orthogonal vectors), to 1 (for vectors pointing in the same direction). A cosine similarity may be calculated for each anchor device, and the per-anchor similarities may be summed. If the sum is above a threshold, the two fingerprints may be classified as corresponding to the same location. Cosine similarity can be calculated by the following formula:

cos(θ) = (A · B) / (‖A‖ ‖B‖) = Σ AiBi / (√(Σ Ai²) · √(Σ Bi²))

where A and B are vectors and Ai and Bi are the ith components of vectors A and B, respectively.
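A cosine-similarity check consistent with the formula above might be sketched as follows (illustrative Python; the vector components are assumed to be pre-computed device-to-anchor vectors):

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (A . B) / (||A|| * ||B||); ranges from -1 (opposite
    # directions) through 0 (orthogonal) to 1 (same direction).
    dot = sum(ai * bi for ai, bi in zip(a, b))
    norm_a = math.sqrt(sum(ai * ai for ai in a))
    norm_b = math.sqrt(sum(bi * bi for bi in b))
    return dot / (norm_a * norm_b)
```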
The similarity function may compare the lists of anchors and signals in the two fingerprints. For example, each fingerprint may include a list of signals from anchors, each anchor having a unique identifier. These lists may be compared to each other to determine the Hamming distance between them. The Hamming distance is the minimum number of substitutions that change the first list into the second list. For example, fingerprint 1 may be (A1, A2, A3, A4, A5), and fingerprint 2 may be (A1, B1, A3, A4, B2). In this example, the Hamming distance is 2, because substituting B1 for A2 and B2 for A5 is sufficient to change fingerprint 1 into fingerprint 2. The similarity function may also use any combination of the techniques described above. For example, the similarity function may be a weighted sum of the Euclidean distance, cosine similarity, and Hamming distance between two fingerprints.
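The Hamming-distance comparison and a combined weighted score might be sketched as follows (illustrative Python; the particular weights are hypothetical and not taken from the disclosure):

```python
def hamming_distance(seq1, seq2):
    # Minimum number of substitutions turning seq1 into seq2 (equal length).
    if len(seq1) != len(seq2):
        raise ValueError("sequences must have equal length")
    return sum(a != b for a, b in zip(seq1, seq2))

def weighted_score(euclid_dist, cosine_sim, hamming_dist, weights=(0.5, 0.3, 0.2)):
    # Distances are negated so that a larger score always means "more
    # similar"; the relative weights are illustrative only.
    w_e, w_c, w_h = weights
    return -w_e * euclid_dist + w_c * cosine_sim - w_h * hamming_dist
```

For the example above, hamming_distance(["A1", "A2", "A3", "A4", "A5"], ["A1", "B1", "A3", "A4", "B2"]) evaluates to 2.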
A. Obtaining and storing a first sensor location
Fig. 15 illustrates a simplified block diagram 1500 of a first mobile device obtaining a fingerprint during a first time period, in accordance with various embodiments. Mobile device 1505 may be any suitable computing device (e.g., smart phone, smart watch, laptop, tablet, etc.). Communication software executing on mobile device 1505 may detect signals received at the sensors of the mobile device and these detected signals may be used to generate a fingerprint (e.g., sensor location) for a first period of time.
The fingerprint may include a list of detected signal sources (e.g., Wi-Fi access points) and distance measurements (e.g., time-of-flight or Received Signal Strength Indicator (RSSI)) for each signal source (e.g., anchor device). The list of detected signal sources (anchor devices) may include anchor identifiers to distinguish the anchor devices from each other. As described elsewhere in this disclosure, the signal sources may be detected via wireless signals (e.g., a set of sensor values) received at mobile device 1505 during a certain period of time. The time period may be, for example, 1 millisecond, 10 milliseconds, 50 milliseconds, 100 milliseconds, 0.5 seconds, 1 second, 5 seconds, 10 seconds, 30 seconds, 1 minute, 5 minutes, 15 minutes, or 1 hour. The wireless signals may include any wireless electromagnetic signals; for example, the wireless signals may include personal area network signals (e.g., Bluetooth Low Energy (BLE) signals) or local area network signals (e.g., Wi-Fi signals).
Wireless signals may be received at mobile device 1505 from anchor devices 1510-1525. The wireless signal is illustrated in diagram 1500 as a double-headed arrow, and may include an electromagnetic signal and an anchor identifier of the anchor device that transmitted the signal. In some embodiments, the electromagnetic signal may be a ranging measurement. The ranging signal may include a time-of-flight measurement (e.g., a measurement of the time it takes for a message to travel between mobile device 1505 and one of anchor devices 1510-1525) or a Received Signal Strength Indicator (RSSI) (e.g., a measurement of the strength of a wireless signal received at mobile device 1505 from one of anchor devices 1510-1525).
The anchor identifier may be a unique identifier assigned to the anchor device by the device manufacturer or a device user. For example, the anchor identifier may be a Service Set Identifier (SSID) for a Wi-Fi access point or a Bluetooth advertiser address. As another example, if the user marks anchor devices, a geographic location (e.g., as determined using GPS) may be used as part of the identifier in order to ensure unique values across all anchor devices.
Thus, the fingerprint may include information provided by mobile device 1505, such as a Global Positioning System (GPS) location measured by the mobile device. Including information measured by mobile device 1505 may reduce the likelihood of conflating two anchor devices that have the same or similar identifiers (e.g., a name such as "home network" may be the SSID of multiple anchor devices at different locations). This information may also reduce the search space when mobile device 1505 retrieves a fingerprint from the server (e.g., by retrieving only fingerprints generated within a threshold distance of a particular GPS location).
The set of sensor values in the fingerprint may be organized by identifier; for example, the sensor values for anchor device 1510 may be associated with the SSID of anchor device 1510, and the sensor values for anchor devices 1515-1525 may be associated with a list of Bluetooth advertiser addresses of anchor devices 1515-1525. As mentioned above, the combination of sensor values measured at a particular location may be a sensor location (e.g., a fingerprint). The measured sensor values may vary between locations, and the sensor values may be used to determine whether two sets of sensor values were measured at the same location. The fingerprint may be an average of sensor values measured during a certain period of time or a snapshot of sensor measurements taken at a certain point in time. Table 1 depicts example sensor values with a single Wi-Fi access point and a single Bluetooth device. The identifier type and sensor values may vary, as shown in Table 1 below:
TABLE 1
Mobile device 1505 can upload the fingerprint or set of sensor values to server device 1530. The upload is illustrated by the double-headed dashed arrow between mobile device 1505 and server device 1530. Mobile device 1505 may upload the fingerprint at regular intervals (e.g., once every 5 minutes) or in response to an event. For example, mobile device 1505 may upload a fingerprint or set of sensor values in response to a low battery indication, in response to turning on a Bluetooth setting, or in response to receiving user input to a streaming service.
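One possible shape for such an uploaded fingerprint is sketched below (illustrative Python; the field names and the JSON encoding are assumptions for illustration, not a published schema):

```python
import json
import time

def build_fingerprint(device_id, measurements, gps=None):
    # Assemble a signal fingerprint from (anchor_id, id_type, value) tuples.
    return {
        "device_id": device_id,
        "timestamp": time.time(),
        "gps": gps,  # optional; can narrow later fingerprint searches
        "anchors": [
            {"anchor_id": aid, "id_type": id_type, "rssi_dbm": value}
            for aid, id_type, value in measurements
        ],
    }

# The upload itself could then be, e.g., an HTTP POST of the JSON payload,
# triggered at a regular interval or by an event such as a low battery.
payload = json.dumps(build_fingerprint(
    "device-123",
    [("HomeNetwork", "ssid", -42), ("AA:BB:CC:DD:EE:FF", "bt_addr", -57)],
    gps=(37.33, -122.01),
))
```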
B. Obtaining a second sensor location and retrieving a first sensor location
Fig. 16 illustrates a simplified block diagram 1600 of a second mobile device obtaining a second fingerprint during a second time period, in accordance with various embodiments. The mobile device 1605 may be any suitable computing device (e.g., smart phone, smart watch, laptop, tablet, etc.). The fingerprint from diagram 1600 may be measured at the location depicted in diagram 1500. However, diagram 1500 depicts a first time period, and diagram 1600 depicts a second time period. The previous location 1635 of the first mobile device is shown to indicate that the two mobile devices are located in similar areas.
The mobile device 1605 may measure a set of sensor values received from the anchor devices to generate a fingerprint (e.g., sensor location). If an anchor is added to or removed from a location after one fingerprint is generated and before a second fingerprint is obtained, the set of anchor devices for which the sensor location is determined may vary between time periods. In addition to devices being added or removed, the identifier of an anchor may change; for example, if the user renames an access point, the SSID of the Wi-Fi access point may change. Additionally, moving a portable anchor (such as a Bluetooth-enabled speaker) may change the sensor value received from that anchor device. There is no need for a perfect correspondence between the devices present at each time period, and device proximity can be determined even if the anchor devices change between time periods. However, during both time periods, at least two anchor devices should be present, identifiable, and located at similar locations.
In this case, anchor devices 1610 to 1620 (e.g., anchor devices 1510 to 1520 during the first time period) are present during both time periods. However, anchor device 1525 is not present during the second time period depicted in diagram 1600, and anchor device 1625 was added after the first time period shown in diagram 1500. Removing anchor device 1525 and adding anchor device 1625 may mean that the fingerprint generated by mobile device 1505 and the fingerprint generated by mobile device 1605 do not correspond exactly, because the second fingerprint contains no sensor value for anchor device 1525 while it does include a sensor value for anchor device 1625.
Mobile device 1605 may retrieve the fingerprint of first mobile device 1505 from server device 1630. The fingerprint may be retrieved in response to the request transmitted to the server device 1630, and the request may include one or more of a device identifier of the first mobile device 1505, global positioning coordinates of the location depicted in fig. 16, or a fingerprint generated by the mobile device 1605. If the request includes a fingerprint, server device 1630 may compare the fingerprints and return a proximity classification to mobile device 1605.
If the server returns a fingerprint generated by mobile device 1505, mobile device 1605 may provide both fingerprints to a similarity model (e.g., a machine learning model) to determine whether the two fingerprints correspond to the same location. Mobile device 1605 may determine that its fingerprint and the fingerprint generated by mobile device 1505 correspond to the same location because the sensor values generated by a threshold number of anchor devices represented in the fingerprints overlap (e.g., the sensor values are identified as similar by a similarity function). In addition to comparing the similarity of the received signals, the lists of anchor devices in the fingerprints may also be compared to determine the correspondence between identifiers in each fingerprint. The number of matching anchor identifiers may be compared to a threshold to determine whether the two fingerprints correspond to the same location (e.g., so that their sensor locations may be compared to each other). The threshold number of anchor devices may be 1 device, 2 devices, 3 devices, 4 devices, 5 devices, 6 devices, 7 devices, 8 devices, 9 devices, 10 devices, 15 devices, or 20 devices. Alternatively, the threshold may be a threshold percentage (e.g., 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, or 90%) of overlapping devices.
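The identifier-overlap check described above might be sketched as follows (illustrative Python; the count and percentage thresholds are example values chosen from the ranges given above):

```python
def anchors_overlap(ids1, ids2, min_count=2, min_fraction=0.5):
    # Decide whether two fingerprints share enough anchor identifiers to be
    # worth comparing; both thresholds are illustrative choices.
    set1, set2 = set(ids1), set(ids2)
    common = set1 & set2
    smaller = min(len(set1), len(set2))
    if smaller == 0:
        return False
    return len(common) >= min_count and len(common) / smaller >= min_fraction
```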
Anchor devices 1610 to 1625 and anchor devices 1510 to 1525 are devices that broadcast electromagnetic signals and are associated with locations. For example, an anchor device (e.g., signal source) may be a smart television, streaming device, smart home appliance (e.g., lights, speakers, ovens, refrigerators, locks, security systems, etc.), wireless access point, desktop computer, or printer. However, some devices may broadcast signals but are not associated with a particular location, such as a smart phone, a smart watch, a laptop computer, a tablet computer, a headset, or a portable wireless access point.
The mobile device 1605 may use a unique identifier or serial number in the payload of a signal received at the mobile device 1605 to distinguish primarily static devices from mobile devices. For example, mobile device 1605 may exclude headphones from the fingerprint while retaining a wireless speaker. Excluding mobile devices while retaining static devices may help improve the reliability of the proximity classification by excluding devices that are not primarily associated with a location.
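Filtering out anchors that are not primarily associated with a location might be sketched as follows (illustrative Python; the device-type labels are hypothetical, standing in for whatever type information a signal payload actually carries):

```python
# Hypothetical device-type labels grouped by whether the device is usually
# stationary (location-associated) or mobile.
STATIONARY_TYPES = {"wireless_access_point", "smart_tv", "smart_speaker",
                    "streaming_device", "printer"}
MOBILE_TYPES = {"smartphone", "smartwatch", "laptop", "tablet",
                "headphones", "portable_access_point"}

def filter_static_anchors(detections):
    # Keep only detections whose advertised type is usually tied to a place.
    return [d for d in detections if d["type"] in STATIONARY_TYPES]
```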
C. Example flow for determining proximity classification
The proximity classification may be determined locally at the mobile device or may be determined at a server. The proximity classification is determined by inputting the fingerprints into a similarity model that determines whether the input fingerprints are similar. The similarity model may be executed on the mobile device or on the server device.
1. Flow for detecting device proximity at a mobile device
Fig. 17 is a swim lane diagram 1700 depicting a technique for detecting device proximity at a mobile device, in accordance with some embodiments.
At S1, the first mobile device 1702 may measure sensor values and obtain anchor identifiers for one or more anchor devices 1704. Measuring the sensor values may mean exchanging ranging measurements between the first mobile device 1702 and the anchor device 1704, or the first mobile device 1702 may measure electromagnetic signals broadcast by the anchor device 1704. The sensor values may include unique identifiers of some or all of the anchor devices 1704.
At S2, the first mobile device 1702 may generate a first fingerprint using the sensor values measured at S1. Generating the first fingerprint may include excluding one or more of the sensor values measured at S1, associating the sensor values with the unique identifier of the anchor device 1704, and recording an action taken by the first mobile device 1702 during a period of time prior to measuring the sensor values.
At S3, the first fingerprint may be transmitted to the server device 1706. The first fingerprint may be transmitted to the server device 1706 via a network connection. The server device 1706 may be one or more network accessible computing devices. At S4, the server device 1706 may store a first fingerprint. The first fingerprint may be stored with a unique identifier of the first mobile device 1702.
At S5, the second mobile device 1708 may measure sensor values and obtain anchor identifiers for one or more anchor devices 1704. Measuring the sensor values may mean exchanging ranging measurements between the second mobile device 1708 and the anchor device 1704, or the second mobile device 1708 may measure electromagnetic signals broadcast by the anchor device 1704. The sensor values may include unique identifiers of some or all of the anchor devices 1704.
At S6, the second mobile device 1708 may generate a second fingerprint using the sensor values measured at S5. Generating the second fingerprint may include excluding one or more of the sensor values measured at S5, associating the sensor values with the unique identifiers of the anchor devices 1704, and recording an action taken by the second mobile device 1708 during a period of time prior to measuring the sensor values.
At S7, a request may be transmitted to the server device 1706. The request may include information that may identify the second fingerprint such that a matching fingerprint in the memory of the server device may be retrieved. The information identifying the second fingerprint may include the second fingerprint, one or more anchor identifiers from the second fingerprint, or a Global Positioning System (GPS) location captured simultaneously with the second fingerprint. The request may be transmitted to the server device 1706 via a network connection. The server device 1706 may be one or more network accessible computing devices.
At S8, the server device 1706 may identify the first fingerprint as a matching fingerprint. The server device 1706 may use the information identifying the second fingerprint to retrieve one or more fingerprints corresponding to the second fingerprint from a memory of the server. These retrieved fingerprints may include the first fingerprint generated at S2. One or more fingerprints may be retrieved by comparing the second fingerprint or information identifying the second fingerprint to fingerprints stored in the server device 1706. If the request includes a second fingerprint, processing the request may mean storing the second fingerprint with the unique identifier of the second mobile device 1708.
At S9, a response to the request may be transmitted from the server device 1706 to the second mobile device 1708. In some embodiments, the second device may perform the proximity classification using a proximity function, and the response may include one or more fingerprints identified at S8. The response may include information identifying one or more actions associated with each of the retrieved fingerprints. If an action was performed by a mobile device during a period of time prior to that mobile device generating a fingerprint, the action may be associated with the fingerprint. The action may be used to suggest an action to the second mobile device. For example, if the first mobile device 1702 was paired with a particular streaming device prior to generating the fingerprint at S2, the associated action may be used to identify the streaming device as a suggested pairing device for the second mobile device 1708. The suggested actions are described in more detail below in Section VIII.E, "Use Cases".
At S10, the second mobile device 1708 may determine a proximity classification using the first fingerprint and the second fingerprint. Determining the proximity classification may include providing the first fingerprint and the second fingerprint as inputs to a similarity model, and the similarity model may output the classification.
At S11, the second mobile device 1708 may perform an action commensurate with the proximity classification determined at S10. The action commensurate with the proximity classification may include presenting a distance between the locations at which the first fingerprint and the second fingerprint were generated. Additionally, processing the response may include presenting, via the user interface, one or more actions identified at S9, or performing the identified actions.
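The client-side portion of this flow (roughly steps S5 through S11) can be sketched end to end as follows (illustrative Python; the measurement, server lookup, similarity model, and resulting action are passed in as callables so the sketch stays self-contained):

```python
def detect_proximity(measure, fetch_matching, similarity_model, act):
    # S5-S6: measure sensor values and build the local (second) fingerprint.
    second_fp = measure()
    # S7-S9: ask the server for a previously stored matching fingerprint.
    first_fp = fetch_matching(second_fp)
    if first_fp is None:
        return None  # no stored fingerprint matched the request
    # S10: classify proximity from the two fingerprints.
    classification = similarity_model(first_fp, second_fp)
    # S11: perform an action commensurate with the classification.
    act(classification)
    return classification
```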
2. Flow for detecting device proximity at a server device
Fig. 18 is a swim lane diagram 1800 depicting a technique for detecting device proximity at a server device, in accordance with some embodiments.
At S1, the first mobile device 1802 may measure sensor values and obtain anchor identifiers for one or more anchor devices 1804. Measuring sensor values may mean exchanging ranging measurements between the first mobile device 1802 and the anchor device 1804, or the first mobile device 1802 may measure electromagnetic signals broadcast by the anchor device 1804. The sensor values may include unique identifiers for some or all of the anchor devices 1804.
At S2, the first mobile device 1802 may generate a first fingerprint using the sensor values measured at S1. Generating the first fingerprint may include excluding one or more of the sensor values measured at S1, associating the sensor values with the unique identifier of the anchor device 1804, and recording an action taken by the first mobile device 1802 during a period of time prior to measuring the sensor values.
At S3, the first fingerprint may be transmitted to the server device 1806. The first fingerprint may be transmitted to the server device 1806 via a network connection. The server device 1806 may be one or more network accessible computing devices. At S4, the server device 1806 may store a first fingerprint. The first fingerprint may be stored with a unique identifier of the first mobile device 1802.
At S5, the second mobile device 1808 may measure sensor values and obtain anchor identifiers for the one or more anchor devices 1804. Measuring the sensor values may mean exchanging ranging measurements between the second mobile device 1808 and the anchor device 1804, or the second mobile device 1808 may measure electromagnetic signals broadcast by the anchor device 1804. The sensor values may include unique identifiers for some or all of the anchor devices 1804.
At S6, the second mobile device 1808 may generate a second fingerprint using the sensor values measured at S5. Generating the second fingerprint may include excluding one or more of the sensor values measured at S5, associating the sensor values with the unique identifiers of the anchor devices 1804, and recording an action taken by the second mobile device 1808 during a period of time prior to measuring the sensor values.
At S7, a request may be transmitted to the server device 1806. The request may include information that may identify the second fingerprint such that a matching fingerprint in the memory of the server device may be retrieved by the server. The information identifying the second fingerprint may include the second fingerprint, one or more anchor identifiers from the second fingerprint, or a Global Positioning System (GPS) location captured simultaneously with the second fingerprint. The request may be transmitted to the server device 1806 via a network connection. The server device 1806 may be one or more network accessible computing devices.
At S8, the server device 1806 may identify the first fingerprint as a matching fingerprint. The server device 1806 may retrieve one or more fingerprints corresponding to the second fingerprint from a memory of the server using information identifying the second fingerprint. These retrieved fingerprints may include the first fingerprint generated at S2. One or more fingerprints may be retrieved by comparing the second fingerprint or information identifying the second fingerprint to fingerprints stored in the server device 1806. If the request includes a second fingerprint, processing the request may mean storing the second fingerprint with the unique identifier of the second mobile device 1808.
At S9, the server device 1806 may determine a proximity classification using the first fingerprint and the second fingerprint. Determining the proximity classification may include providing the first fingerprint and the second fingerprint as inputs to a similarity model, and the similarity model may output the classification. The similarity model may be executed on the server device 1806.
At S10, a response to the request may be transmitted from the server device 1806 to the second mobile device 1808. The response may include the proximity classification generated at S9. Additionally, the response may include information identifying one or more actions associated with each of the retrieved fingerprints. If an action was performed by a mobile device during a period of time prior to that mobile device generating a fingerprint, the action may be associated with the fingerprint. The action may be used to suggest an action to the second mobile device. For example, if the first mobile device 1802 was paired with a particular streaming device prior to generating the fingerprint at S2, the associated action may be used to identify the streaming device as a suggested pairing device for the second mobile device 1808. The suggested actions are described in more detail below in Section VIII.E, "Use Cases".
At S11, the second mobile device 1808 may perform an action commensurate with the proximity classification determined at S9. The action commensurate with the proximity classification may include presenting a distance between the locations at which the first fingerprint and the second fingerprint were generated. Additionally, processing the response may include presenting, via the user interface, one or more actions identified at S10, or performing the identified actions.
D. system for detecting proximity of devices
The device proximity may be determined on the mobile device or on the server device. The proximity may be determined by inputting two fingerprints into a similarity model. The following describes a system for determining device proximity on a mobile device and on a server device.
1. System for detecting device proximity on a mobile device
Fig. 19 is a simplified block diagram 1900 illustrating an example architecture of a system for detecting device proximity on a mobile device, in accordance with some embodiments. The diagram includes a representative mobile device 1902, one or more mobile devices 1904, one or more networks 1908, and a server device 1910. Each of these elements depicted in fig. 19 may be similar to one or more elements depicted in other figures described herein.
The mobile device 1904 can be any suitable computing device (e.g., smart phone, smart watch, laptop, tablet, etc.). In some embodiments, the mobile device may perform any one or more of the operations of the mobile device described herein. Depending on the type of mobile device and/or the location of the accessory device, the mobile device may be enabled to communicate over the network 1908 (e.g., including a LAN or WAN) using one or more network protocols (e.g., bluetooth connection, thread connection, zigBee connection, wi-Fi connection, etc.) and network paths, as further described herein.
In some embodiments, the server device 1910 may be a computer system including at least one memory, one or more processing units (or processors), a storage unit, a communication device, and an I/O device. In some embodiments, the server device 1910 may perform any one or more of the operations of the server device described herein. In some embodiments, these elements may be implemented in a similar manner (or different manner) as described with reference to similar elements of the mobile device 1902.
In some implementations, the representative mobile device 1902 may correspond to any one or more of the computing devices described herein. The representative computing device may be any suitable computing device (e.g., a smart speaker, a mobile phone, a tablet, a wireless speaker, a smart hub speaker device, a smart media player communicatively connected to a TV, etc.).
In some embodiments, one or more networks 1908 may include an internet WAN and a LAN. For example, a router associated with a LAN may enable traffic from the LAN to be sent to the WAN, and vice versa. The mobile device 1902 or 1904 may communicate with the WAN through a telecommunications network, such as a broadband cellular network (e.g., a 2G, 3G, 4G, 5G, or 6G telecommunications network). In some embodiments, the server device 1910 may be external to the monitored environment and thus communicate with other devices over the WAN. For example, the mobile device 1902 or the mobile device 1904 may send signal fingerprints to, or retrieve signal fingerprints from, the server device 1910. A signal fingerprint may be stored in the server device 1910, and the stored fingerprint may be associated with a geographic region, a unique identifier, or a device identifier.
As described herein, computing device 1902 may represent one or more computing devices connected to one or more of networks 1908. The computing device 1902 has at least one memory 1912, a communication interface 1914, one or more processing units (or processors) 1916, a storage unit 1918, and one or more input/output (I/O) devices 1920.
Turning in further detail to each element of the computing device 1902, the processor 1916 may be implemented in hardware, computer-executable instructions, firmware, or a combination thereof, as appropriate. Computer-executable instructions or firmware implementations of the processor 1916 may include computer-executable instructions or machine-executable instructions written in any suitable programming language to perform the various functions described.
The memory 1912 may store program instructions that can be loaded and executed on the processor 1916 as well as data generated during execution of such programs. Depending on the configuration and type of computing device 1902, memory 1912 may be volatile (such as random access memory ("RAM")) or non-volatile (such as read-only memory ("ROM"), flash memory, etc.). In some implementations, the memory 1912 may include a variety of different types of memory, such as static random access memory ("SRAM"), dynamic random access memory ("DRAM"), or ROM. The computing device 1902 may also include additional storage 1918, such as removable storage or non-removable storage, including, but not limited to, magnetic storage, optical disks, and/or tape storage. The disk drives and their associated computer-readable media may provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing device. In some embodiments, the storage 1918 may be used to store data content received from one or more other devices (e.g., the server device 1910, other computing devices, or the mobile device 1904). For example, the storage 1918 may store accessory management settings, accessory settings, and user data associated with users affiliated with the mobile device 1902.
The computing device 1902 may also contain a communication interface 1914 that allows the computing device 1902 to communicate with a stored database, another computing device or server, a user terminal, or other devices over the network 1908. The computing device 1902 may also include I/O devices 1920, such as for enabling connections to a keyboard, mouse, pen, voice input device, touch input device, display, speakers, printer, etc. In some implementations, the I/O device 1920 may be used to output an audio response or other indication as part of performing a response to a user request. The I/O devices may include one or more speakers 1946 or one or more microphones 1948.
The memory 1912 may include an operating system 1922 and one or more applications or services for implementing the features disclosed herein, including a communication module 1924, a user interface module 1926, and a proximity module 1930. The proximity module also includes a fingerprint module 1936 and a similarity module 1950. The proximity module 1930 may be configured to determine device proximity by comparing the similarity of the signal fingerprint retrieved from the server device 1910 to the signal fingerprint generated by the fingerprint module 1936. Device proximity may be determined by providing the retrieved signal fingerprint and the generated signal fingerprint to the similarity module 1950 and receiving a proximity classification as an output of the similarity module 1950. The similarity module may be a machine learning model that compares a set of signals of a first fingerprint with a set of signals in a second fingerprint. In other examples, the similarity module may use one or more rules to determine whether two fingerprints are similar. For example, the similarity module may compare the sensor values of each fingerprint to determine whether the difference between the magnitudes of each of these sensor values is within a threshold. The similarity module may use one or more of a Cartesian distance, cosine similarity, or Hamming distance between the signal sets in the two fingerprints.
The communication module 1924 may include code that causes the processor 1916 to generate instructions and messages, transmit data, or otherwise communicate with other entities. As described herein, the communication module 1924 may send messages via one or more network paths of the network 1908 (e.g., via a LAN or internet WAN associated with the monitored environment). For example, the communication module 1924 may communicate the sensor fingerprint to the server device 1910. The communication module 1924 may provide information to the fingerprint module 1936 regarding signals received at the mobile device 1902 such that the fingerprint module may generate a signal fingerprint. The user interface module 1926 may include code to cause the processor 1916 to present information corresponding to a location of a mobile device or a proximity computing device.
2. System for detecting device proximity on a server device
Fig. 20 is a simplified block diagram 2000 illustrating an example architecture of a system for detecting device proximity on a server device, in accordance with some embodiments. The diagram includes a server device 2002, one or more mobile devices 2004, and one or more networks 2008. Each of these elements depicted in fig. 20 may be similar to one or more elements depicted in other figures described herein.
The mobile device 2004 may be any suitable computing device (e.g., a smart phone, a smart watch, a laptop computer, a tablet computer, etc.). In some embodiments, the mobile device may perform any one or more of the operations of the mobile device described herein. Depending on the type of mobile device and/or the location of the accessory device, the mobile device may be enabled to communicate over the network 2008 (e.g., including a LAN or WAN) using one or more network protocols (e.g., Bluetooth connection, Thread connection, ZigBee connection, Wi-Fi connection, etc.) and network paths, as further described herein.
In some embodiments, the server device 2002 may be a computer system comprising at least one memory, one or more processing units (or processors), a storage unit, a communication device, and an I/O device. In some embodiments, the server device 2002 may perform any one or more of the operations of the server device described herein.
In some embodiments, the one or more networks 2008 may include an internet WAN and LAN. For example, a router associated with a LAN may enable traffic from the LAN to be sent to the WAN, and vice versa. The server device 2002 or the mobile device 2004 may communicate with the WAN over a telecommunications network, such as a broadband cellular network (e.g., a 2G, 3G, 4G, 5G, or 6G telecommunications network). In some embodiments, the server device 2002 may be external to the monitored environment and thus communicate with other devices over the WAN. For example, the mobile device 2004 may send signal fingerprints to or retrieve signal fingerprints from the server device 2002. The signal fingerprint may be stored in the server device 2002 and the stored fingerprint may be associated with a geographic region, a unique identifier, or a device identifier.
As described herein, server device 2002 may represent one or more computing devices connected to one or more of networks 2008. The server device 2002 has at least one memory 2012, a communication interface 2014, one or more processing units (or processors) 2016, a storage unit 2018, and one or more input/output (I/O) devices 2020.
Turning in further detail to each element of the computing device 2002, the processor 2016 may be suitably implemented in hardware, computer-executable instructions, firmware, or a combination thereof. Computer-executable instructions or firmware implementations of the processor 2016 may include computer-executable instructions or machine-executable instructions written in any suitable programming language to perform the various functions described.
The memory 2012 may store program instructions that can be loaded and executed on the processor 2016 as well as data generated during execution of such programs. Depending on the configuration and type of computing device 2002, the memory 2012 may be volatile (such as random access memory ("RAM")) or non-volatile (such as read-only memory ("ROM"), flash memory, etc.). In some implementations, the memory 2012 may include a variety of different types of memory, such as static random access memory ("SRAM"), dynamic random access memory ("DRAM"), or ROM. The computing device 2002 may also include additional storage 2018, such as removable storage or non-removable storage, including, but not limited to, magnetic storage, optical disks, and/or tape storage. The disk drives and their associated computer-readable media may provide nonvolatile storage of computer-readable instructions, data structures, program modules and other data for the computing device. In some embodiments, storage 2018 may be used to store data content received from one or more other devices (e.g., other computing devices or mobile device 2004). For example, storage 2018 may store accessory management settings, accessory settings, and user data associated with users affiliated with mobile device 2004.
The server device 2002 may also contain a communication interface 2014 that allows the computing device 2002 to communicate with a stored database, another computing device or server, a user terminal, or other devices over the network 2008. The server device 2002 may also include an I/O device 2020, such as for implementing a connection to a keyboard, mouse, pen, voice input device, touch input device, display, speaker, printer, etc. In some embodiments, the I/O device 2020 may be used to output an audio response or other indication as part of performing a response to a user request. The I/O devices may include one or more speakers 2046 or one or more microphones 2048.
The memory 2012 may include an operating system 2022 and one or more applications or services for implementing the features disclosed herein, including a communication module 2024, a user interface module 2026, and a proximity module 2030. The proximity module also includes a fingerprint module 2036 and a similarity module 2050. The proximity module 2030 may be configured to determine device proximity by comparing the similarity of the signal fingerprint retrieved from the mobile device 2004 to the signal fingerprint generated by the fingerprint module 2036 or a fingerprint stored in the storage 2018. The device proximity may be determined by providing the retrieved signal fingerprint and the generated signal fingerprint to the similarity module 2050 and receiving a proximity classification as an output of the similarity module 2050. The similarity module may be a machine learning model that compares a set of signals of a first fingerprint with a set of signals in a second fingerprint. In some implementations, the similarity module may use one or more rules to determine whether two fingerprints are similar. For example, the similarity module may compare the sensor values of each fingerprint to determine whether the difference between the magnitudes of each of these sensor values is within a threshold. The similarity module may use one or more of a Cartesian distance, cosine similarity, or Hamming distance between the signal sets in the two fingerprints.
The communication module 2024 may include code that causes the processor 2016 to generate instructions and messages, transmit data, or otherwise communicate with other entities. As described herein, the communication module 2024 may send the message via one or more network paths of the network 2008 (e.g., via a LAN or internet WAN associated with the monitored environment). For example, the communication module 2024 may receive or transmit sensor fingerprints between the server device 2002 and the mobile device 2004. The communication module 2024 may provide information about the signals received at the mobile device 2004 to the fingerprint module 2036 so that the fingerprint module may generate a signal fingerprint. The user interface module 2026 may include code that causes the processor 2016 to present information corresponding to the location of the mobile device or a proximate computing device.
E. use case
Determining device proximity may be used to locate a lost or unresponsive device that cannot communicate its location directly to the searching device. Additionally, device proximity determination may be used to recommend actions to a mobile device. For example, proximity determination may be used to recommend devices for pairing or for playback of media (such as audio or video).
1. Indirect device positioning
Two mobile devices may not be able to communicate directly. Lack of communication may be because the two devices do not support the same communication protocol, or one of the mobile devices may be powered down or damaged. The two devices may communicate indirectly by sharing a fingerprint via a third party device, such as a server device connected to a network (e.g., a Wide Area Network (WAN) or a Local Area Network (LAN)).
To indirectly locate the mobile device, the first mobile device may measure a sensor location and provide the sensor location to a third party device, such as a server device, which may be any network device available on a network accessible to the mobile device. The sensor location may be provided at regular intervals or in response to events. For example, the first mobile device may provide the sensor location in response to a low battery notification, leaving a wireless network, joining a wireless network, detecting movement, not detecting movement within a threshold amount of time, connecting to a paired device, disconnecting from a paired device, or pairing with a device. The regular intervals may be 5 minutes, 10 minutes, 15 minutes, 30 minutes, 1 hour, 2 hours, 6 hours, 12 hours, 24 hours, 36 hours, or 7 days.
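The reporting policy above (regular intervals plus trigger events) reduces to a simple decision. The sketch below is illustrative only; the event names and the 15-minute default interval are hypothetical, chosen from the values the text lists.

```python
# Hypothetical sketch of the reporting policy: a sensor location is
# uploaded when a trigger event occurs, or when the regular reporting
# interval has elapsed since the last upload.

TRIGGER_EVENTS = {
    "low_battery", "left_network", "joined_network", "motion_detected",
    "motion_timeout", "peer_connected", "peer_disconnected", "peer_paired",
}

def should_report(event, seconds_since_last_report, interval_seconds=900):
    """Report on any trigger event, or when the interval (15 min here) elapses."""
    return event in TRIGGER_EVENTS or seconds_since_last_report >= interval_seconds
```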
The second mobile device may locate the first mobile device in response to a request. For example, a request to locate the first mobile device may be transmitted to a server device storing a fingerprint of the first mobile device. In some embodiments, the server device may store one or more fingerprints of the first mobile device. The request may cause the server device to determine whether any of the stored fingerprints match a fingerprint generated by the first mobile device. The fingerprints may match if one or more of the sensor values in the sensor set of the fingerprints are within a threshold similarity determined by a similarity function, or if the total distance between the fingerprints is within a threshold. Additionally, fingerprints may be matched if a threshold number of anchor identifiers are shared between the fingerprints, in addition to the distance or similarity threshold calculated using the sensor values. The server computer may provide the fingerprint generated by the first mobile device to the second mobile device.
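One way to read the matching rule above is as a conjunction of a shared-anchor count and a total-distance threshold. A minimal sketch under that reading; the default values for the anchor count and distance are illustrative, not taken from the text.

```python
import math

# Hypothetical sketch of the matching rule: two fingerprints match when
# they share at least `min_shared_anchors` anchor identifiers AND the
# total distance over the shared sensor values is within `max_distance`.

def fingerprints_match(fp_a, fp_b, min_shared_anchors=2, max_distance=10.0):
    shared = fp_a.keys() & fp_b.keys()
    if len(shared) < min_shared_anchors:
        return False
    distance = math.sqrt(sum((fp_a[k] - fp_b[k]) ** 2 for k in shared))
    return distance <= max_distance
```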
The fingerprints may be provided as inputs to the similarity model. The similarity model may run on the second mobile device or on the server device. If the proximity classification is determined at the server, the classification may be provided to the second mobile device via a network connection. The proximity classification may indicate that the two fingerprints were recorded at the same location, or may include a distance between the locations of the two fingerprints. The distance in the proximity classification may be a numeric distance (e.g., 25 meters) or a category classification (e.g., at, near, or far from the location). The similarity model may output a proximity classification (e.g., near, far, or an estimated distance such as 5 meters) of the two sensor locations.
To identify matching fingerprints, the server device may compare each stored fingerprint to the fingerprint generated by the first mobile device, or the server may filter the stored fingerprints prior to the comparison. For example, each fingerprint may be associated with a Global Positioning System (GPS) location, and the comparison may include only stored fingerprints whose GPS locations are within a threshold distance of the GPS location of the fingerprint generated by the first mobile device. The threshold distance may be 1 meter, 5 meters, 15 meters, 20 meters, 50 meters, 100 meters, 200 meters, 500 meters, 1 kilometer, 1.5 kilometers, 5 kilometers, 10 kilometers, or 100 kilometers.
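The GPS pre-filter described above might look like the following sketch, which uses the haversine formula for great-circle distance. The field names are hypothetical; the 500-meter default threshold is one of the values listed in the text.

```python
import math

# Hypothetical sketch of the pre-filter: stored fingerprints are excluded
# from comparison unless their GPS tag is within a threshold distance of
# the query fingerprint's GPS location.

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def filter_by_gps(stored, query_lat, query_lon, threshold_m=500.0):
    """Keep only stored fingerprints whose GPS tag is near the query."""
    return [fp for fp in stored
            if haversine_m(fp["lat"], fp["lon"], query_lat, query_lon) <= threshold_m]
```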
After the comparison, the server computer may provide the matching fingerprints to the second mobile device. The second mobile device may input the matching fingerprints to the similarity model to identify a matching location. If the similarity model indicates that any of the stored fingerprints are in proximity to a fingerprint generated by the first mobile device, the location may be the GPS coordinates associated with that stored fingerprint. The second mobile device may travel to the location and obtain a second fingerprint. The second fingerprint may be fed to the similarity model to determine whether the second mobile device is in proximity to the first mobile device. For example, the location may be a multi-story building and the second mobile device may obtain a fingerprint for each floor. Once the similarity model indicates that the devices are in proximity to each other, the user of the second mobile device may search for the first mobile device. In some implementations, the second mobile device may use ranging measurements with the signal source to locate the precise location of the first mobile device.
2. Suggested actions
The device location may be used to suggest an action to the first mobile device. The suggested action may be an action taken by other mobile devices at a particular location. For example, mobile devices may turn on a silent mode when entering a movie theater, and these devices may provide fingerprints and actions to a server. The first mobile device may use the fingerprint to determine whether the first mobile device is in a location where other devices have taken a particular action. Continuing with this example, if the first mobile device determines that it is in an area where other devices typically turn on the silent mode, the mobile device may recommend that the silent mode be turned on. The suggested actions are described in more detail in section IV of the present application, and triggerable predicted events are described in section III.
To determine the suggested action, the first mobile device may provide fingerprints to the server while moving around during the day. The server may identify any stored fingerprints associated with the provided fingerprints. For example, the server may identify a fingerprint that shares at least one common anchor identifier with the provided fingerprint. In some implementations, the fingerprint may include Global Positioning System (GPS) coordinates that may be compared to identify a fingerprint associated with (e.g., within a threshold distance of) the provided fingerprint. The fingerprints may be captured at regular intervals or in response to an action. For example, the actions may include interacting with an application, changing a setting of the mobile device, or changing a measured sensor value.
The first mobile device may provide the matching fingerprint and the fingerprint generated by the first mobile device as inputs to the similarity model. The matching fingerprint may include suggested actions, and if the first mobile device is classified as being near a location associated with the matching fingerprint, the mobile device may suggest the actions to a user of the first mobile device. For example, the first mobile device may present a notification via a graphical user interface on a display device of the first mobile device.
The suggested actions may include actions taken by the mobile device during a period of time prior to generating the fingerprint. For example, the suggested actions may include any action taken within a 30 minute period prior to generating the fingerprint. If the fingerprint includes sensor values measured during a period of time, the fingerprint may include any action taken by the mobile device during that period of time. The time period may be 5 minutes, 10 minutes, 15 minutes, 30 minutes, 1 hour, 2 hours, 6 hours, 12 hours, 24 hours, 36 hours, or 7 days. The actions may include a change to a setting of the mobile device, an input provided to a particular application on the mobile device, joining a wireless network, or pairing with an electronic device.
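Associating a fingerprint with the actions taken shortly before it, as described above, reduces to a time-window filter. A minimal sketch, assuming timestamps in seconds since the epoch and the 30-minute window mentioned in the text:

```python
# Hypothetical sketch: when building a fingerprint, attach every action
# the device took within a window (30 minutes here) before the
# fingerprint's capture time.

WINDOW_SECONDS = 30 * 60

def actions_for_fingerprint(capture_time, action_log, window=WINDOW_SECONDS):
    """Return the action names recorded in [capture_time - window, capture_time]."""
    return [name for (t, name) in action_log
            if capture_time - window <= t <= capture_time]
```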
For example, a mobile device in a hotel may detect multiple Bluetooth-enabled streaming devices. However, it may be difficult for a user of a mobile device to determine which streaming device corresponds to her room. Instead of systematically testing each streaming device to determine which device corresponds to the room, the user may trigger the device to capture a fingerprint by turning on a Bluetooth setting or interacting with the streaming service. The mobile device may capture the fingerprint and provide the fingerprint to the server.
The server may use the provided fingerprint to identify and transmit a matching fingerprint to the mobile device. The matching fingerprint may indicate that a previous mobile device at that location was paired with a particular streaming device during an earlier time period. The mobile device may input the matching fingerprint and the captured fingerprint as inputs to the similarity model. The similarity model may classify the mobile device as being in proximity to the mobile device that generated the matching fingerprint. Thus, the mobile device may suggest to the user a pairing with the particular streaming device.
The suggested actions may include prompting the user to switch between playback devices. For example, a user may stream music from her mobile device to speakers in her living room while studying on a sofa. The user can leave the sofa without pausing the stream and begin preparing food in the kitchen. The mobile device may then suggest switching playback to a speaker in the kitchen.
F. techniques for providing proximity classification
Fig. 21 is a flow diagram of a method 2100 for providing proximity classification, according to some embodiments. In different embodiments, different devices may perform the steps of method 2100. According to an example, one or more process blocks of fig. 21 may be performed by a computing device, such as a first or second mobile device (e.g., a smart phone, a tablet device, a wearable device, and a laptop computer) or a network device (e.g., a server device).
At block 2102, a first sensor location (e.g., a first fingerprint) may be obtained. The sensor location may include a first set of sensor values measured using a first sensor of the mobile device. The first set of sensor values may be determined from wireless signals transmitted by anchor devices. Each of these sensor values may be associated, via an anchor identifier in a set of anchor identifiers, with the anchor device from which the sensor value was generated.
In some implementations, the second mobile device (e.g., mobile device 1904, 2004) can obtain the first sensor location through a request transmitted to the server device. The server device may use the request to search the server device's memory for a stored sensor location that matches the information identifying the second sensor location, and the search may return the first sensor location. In some embodiments, the server device may obtain the first sensor location by searching for a sensor location in the server device's memory that matches the information identifying the second sensor location, and the search may return the first sensor location. In some embodiments, the server device may provide all sensor locations stored in the memory of the server.
The first sensor location (e.g., fingerprint) may include a Global Positioning System (GPS) address measured by the first mobile device. The first mobile device may associate a set of signal values used to generate the fingerprint with the GPS address. Generating the first sensor location may include removing one or more sensor values from the set of signal values. The removed sensor values may correspond to one or more mobile devices. The sensor location may include information identifying an action performed by the first mobile device.
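The generation steps above (tagging the signal values with a GPS address and removing sensor values that correspond to other mobile devices, which move and therefore make poor anchors) can be sketched as a small helper. The field names are hypothetical.

```python
# Hypothetical sketch of generating a sensor location: the raw signal
# values are tagged with a GPS address, and values sourced from other
# mobile devices are removed before the fingerprint is stored or shared.

def build_sensor_location(gps, raw_values, mobile_anchor_ids):
    """raw_values: dict mapping anchor ID -> measured sensor value."""
    stationary = {aid: v for aid, v in raw_values.items()
                  if aid not in mobile_anchor_ids}
    return {"gps": gps, "values": stationary}
```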
At block 2104, a second sensor location may be obtained. The second sensor location may include a second set of sensor values measured using a second sensor of a second mobile device (e.g., mobile device 1902, mobile device 1904, mobile device 2004). The second set of sensor values may be generated by the second mobile device from wireless signals transmitted by the anchor devices. In some embodiments in which the device proximity is determined by the second mobile device, obtaining the second sensor location may include generating the second sensor location. In embodiments in which the device proximity is determined by the server device, the second sensor location may be obtained in response to a request from the second mobile device. The request may include the second sensor location or information (e.g., an identifier) identifying the second sensor location. In some embodiments, the server device may provide all sensor locations stored in the memory of the server.
Generating, by the second mobile device, a second sensor location (e.g., a fingerprint) may include measuring a Global Positioning System (GPS) address and associating a set of signal values with the GPS address. Generating the second sensor location may include removing, by the second mobile device, one or more sensor values from the set of sensor values. The removed sensor values may correspond to one or more mobile devices. The sensor location may include information identifying one or more actions performed, at the location where the sensor values were recorded, by the device (e.g., the second mobile device) recording the sensor values.
At block 2106, it may be determined that both the first sensor location and the second sensor location are determined using wireless signals transmitted by the anchor devices. The determination may be performed at the second mobile device or at a server device (e.g., server device 1530, 1630, 1910, or 2002). In some embodiments, the anchor devices used to determine the first location may differ from the anchor devices used to determine the second location, provided at least some of the anchor devices overlap (e.g., one or more of the anchor devices are used to determine both the first location and the second location). For example, a first location may be determined using anchor devices 1-5 and a second location may be determined using anchor devices 3-8.
At block 2108, the first sensor location and the second sensor location may be provided as inputs to a similarity model. The similarity model may be run on the second mobile device or on a server device (e.g., server device 1530, 1630, 1910, or 2002). The similarity model may provide a proximity classification between the first mobile device and the second mobile device. In some embodiments, the proximity classification may include a distance between the first sensor location and the second sensor location. The distance may be a numeric distance or a classification distance (e.g., at the location (e.g., same cluster), near the location, or remote from the location). The similarity model may be an algorithm or a machine learning model. The proximity classification may be determined using a probability or score that indicates whether the first mobile device and the second mobile device are proximate to each other (e.g., in the same room, in the same building, within a threshold distance of each other).
The proximity classification may be determined by comparing a score or probability generated by the similarity model to a threshold. For example, if the probability or score is above the threshold, a proximity classification may be assigned to the second mobile device indicating that the second mobile device is a proximate device (e.g., in the vicinity of the first mobile device). If the probability or score is at or below the threshold, a proximity classification may be assigned to the second mobile device indicating that the second mobile device is a remote device (e.g., away from the first mobile device).
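The thresholding step above amounts to a one-line decision rule. A minimal sketch, with an assumed threshold of 0.5 and illustrative label names:

```python
# Hypothetical sketch: map the similarity model's score or probability
# to a proximity classification by comparing it against a threshold.

def classify_proximity(score, threshold=0.5):
    """Return "proximate" above the threshold, "remote" at or below it."""
    return "proximate" if score > threshold else "remote"
```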
In embodiments where the proximity classification is determined on the server device, the proximity classification may be provided to the second mobile device. The second mobile device may use the proximity classification calculated by the second mobile device or received from the server device to take an action commensurate with the proximity classification. The commensurate action may include a notification indicating the proximity classification. The notification may be displayed on a display of the second mobile device. In some embodiments, the notification may include an arrow pointing to a location of the first mobile device or a distance between the first mobile device and the second mobile device. The action may include a suggested action, such as a notification identifying a device with which the second mobile device may be paired.
IX. example device
Fig. 22 is a block diagram of an example device 2200, which may be a mobile device. Device 2200 generally includes a computer-readable medium 2202, a processing system 2204, an input/output (I/O) subsystem 2206, wireless circuitry 2208, and audio circuitry 2210 including a speaker 2250 and a microphone 2252. These components may be coupled by one or more communication buses or signal lines 2203. Device 2200 may be any portable mobile device, including a handheld computer, tablet computer, mobile phone, laptop computer, tablet device, media player, Personal Digital Assistant (PDA), key fob, car key, access card, multi-function device, portable gaming device, car display unit, and the like, including combinations of two or more of these items.
It is to be appreciated that the architecture shown in fig. 22 is merely one example of an architecture of device 2200, and that device 2200 may have more or fewer components than shown or differently configured components. The various components shown in fig. 22 may be implemented in hardware, software, or a combination of both hardware and software, including one or more signal processing circuits and/or application specific integrated circuits.
The wireless circuitry 2208 is used to send and receive information over a wireless link or network to one or more other devices and includes conventional circuitry for performing this function, such as an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, memory, and so forth. The wireless circuitry 2208 may use various protocols, such as those described herein.
The wireless circuitry 2208 is coupled to the processing system 2204 via a peripheral interface 2216. Interface 2216 may include conventional components for establishing and maintaining communications between the peripheral devices and processing system 2204. Voice information and data information received over the wireless circuitry 2208 (e.g., in a voice recognition or voice command application) is transferred to the one or more processors 2218 via the peripheral interface 2216. The one or more processors 2218 can be configured to process various data formats for one or more applications 2234 stored on the medium 2202.
Peripheral interface 2216 couples input and output peripherals of the device to processor 2218 and computer-readable medium 2202. The one or more processors 2218 communicate with the computer-readable medium 2202 via a controller 2220. The computer-readable medium 2202 can be any device or medium that can store code and/or data for use by the one or more processors 2218. The medium 2202 may include a memory hierarchy including a cache, a main memory, and a secondary memory.
The device 2200 also includes a power system 2242 for powering the various hardware components. The power system 2242 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., light-emitting diode (LED)), and any other components typically associated with the generation, management, and distribution of power in a mobile device.
In some implementations, the device 2200 includes a camera 2244. In some embodiments, the device 2200 includes sensors 2246. The sensors 2246 may include accelerometers, compasses, gyroscopes, pressure sensors, audio sensors, light sensors, barometers, and the like. The sensors 2246 may be used to sense aspects of a location, such as an audible or optical signature of the location.
In some implementations, the device 2200 may include a GPS receiver, sometimes referred to as a GPS unit 2248. The mobile device may use a satellite navigation system, such as the Global Positioning System (GPS), to obtain positioning information, timing information, altitude, or other navigation information. During operation, the GPS unit may receive signals from GPS satellites orbiting the earth. The GPS unit analyzes the signals to estimate the transmission time and distance, and may thereby determine the current location of the mobile device. Based on these estimates, the mobile device may determine a location fix, altitude, and/or current speed. The location fix may be a geographic coordinate, such as latitude information and longitude information. In other embodiments, the device 2200 may be configured to identify GLONASS signals or any other similar type of satellite navigation signals.
One or more processors 2218 execute various software components stored in the medium 2202 to perform various functions for the device 2200. In some embodiments, the software components include an operating system 2222, a communication module (or instruction set) 2224, a location module (or instruction set) 2226, a trigger event module 2228, a predictive application manager module 2230, and other application programs (or instruction set) 2234, such as automobile locator applications and navigation applications.
The operating system 2222 may be any suitable operating system, including iOS, Mac OS, Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks. The operating system may include various programs, sets of instructions, software components, and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitate communication between the various hardware and software components.
The communication module 2224 facilitates communication with other devices via one or more external ports 2236 or via wireless circuitry 2208, and includes various software components for processing data received from the wireless circuitry 2208 and/or external ports 2236. External port 2236 (e.g., USB, FireWire, Lightning connector, 60-pin connector, etc.) is adapted to be coupled to other devices directly or indirectly through a network (e.g., the internet, wireless local area network, etc.).
The location/motion module 2226 may assist in determining the current location (e.g., coordinates or other geographic location identifier) and motion of the device 2200. Modern positioning systems include satellite-based positioning systems such as the Global Positioning System (GPS), cellular network positioning based on "cell IDs", and Wi-Fi positioning technology based on Wi-Fi networks. GPS relies on the visibility of multiple satellites to determine a position estimate, and those satellites may not be visible (or may have weak signals) indoors or in "urban canyons". In some embodiments, the location/motion module 2226 receives data from the GPS unit 2248 and analyzes the signals to determine the current location of the mobile device. In some implementations, the location/motion module 2226 may use Wi-Fi or cellular location technology to determine the current location. For example, knowledge of nearby cell sites and/or Wi-Fi access points, and knowledge of their locations, may be used to estimate the location of the mobile device. Information identifying Wi-Fi or cellular transmitters is received at the wireless circuitry 2208 and passed to the location/motion module 2226. In some embodiments, the location module receives one or more transmitter IDs. In some embodiments, the sequence of transmitter IDs may be compared to a reference database (e.g., a cell ID database, a Wi-Fi reference database) that maps or correlates transmitter IDs to the location coordinates of the corresponding transmitters, and estimated location coordinates of device 2200 may be calculated based on the location coordinates of the corresponding transmitters. Regardless of the particular positioning technique used, the location/motion module 2226 receives information from which a position fix may be derived, interprets the information, and returns location information, such as geographic coordinates, latitude/longitude, or other position fix data.
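The reference-database lookup described above can be sketched as a weighted centroid: each observed transmitter ID contributes its known coordinates, weighted here by received signal strength so that stronger (presumably closer) transmitters dominate. The database contents, the ID format, and the weighting function are all illustrative assumptions, not the module's actual implementation.

```python
# Hypothetical sketch of estimating a location from transmitter IDs.
# The reference database maps a transmitter ID to known coordinates;
# the values below are illustrative only.

REFERENCE_DB = {
    "wifi:aa:bb": (37.3349, -122.0090),
    "wifi:cc:dd": (37.3351, -122.0086),
    "cell:1234":  (37.3340, -122.0100),
}

def estimate_location(observations, reference_db=REFERENCE_DB):
    """observations: list of (transmitter_id, rssi_dbm) pairs."""
    total_w = lat = lon = 0.0
    for tx_id, rssi in observations:
        if tx_id not in reference_db:
            continue  # unknown transmitter: no coordinates to contribute
        w = 1.0 / max(1.0, -rssi)  # weaker (more negative) RSSI -> smaller weight
        t_lat, t_lon = reference_db[tx_id]
        lat += w * t_lat
        lon += w * t_lon
        total_w += w
    if total_w == 0.0:
        return None  # no known transmitters observed
    return (lat / total_w, lon / total_w)
```

A production implementation would typically use a calibrated path-loss model or trilateration rather than this simple weighting.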
The trigger event module 2228 may include various sub-modules or systems, for example, as described herein with respect to fig. 2A. Further, the predictive application manager module 2230 may include various sub-modules or systems, for example, as described herein with respect to fig. 3.
The one or more applications 2234 on the mobile device may include any application installed on the device 2200, including but not limited to a browser, address book, contact list, email, instant message, word processing, keyboard emulation, desktop applet, JAVA-enabled application, encryption, digital rights management, voice recognition, voice replication, music player (which plays back recorded music stored in one or more files such as MP3 files or AAC files), and so forth.
Other modules or sets of instructions (not shown) may be present, such as a graphics module, a timer module, and so on. For example, the graphics module may include various conventional software components for rendering, animating, and displaying graphical objects (including, but not limited to, text, web pages, icons, digital images, animations, and the like) on a display surface. In another example, the timer module may be a software timer. The timer module may also be implemented in hardware. The timer module may maintain various timers for any number of events.
The I/O subsystem 2206 may be coupled to a display system (not shown), which may be a touch-sensitive display. The display system displays visual output to the user in a GUI. The visual output may include text, graphics, video, and any combination thereof. Some or all of the visual output may correspond to user interface objects. The display may use LED (light-emitting diode), LCD (liquid crystal display), or LPD (light-emitting polymer display) technology, although other display technologies may be used in other embodiments.
In some implementations, the I/O subsystem 2206 may include a display and user input devices such as a keyboard, mouse, and/or touchpad. In some implementations, the I/O subsystem 2206 may include a touch-sensitive display. The touch-sensitive display may also accept input from the user based on haptic and/or tactile contact. In some implementations, the touch-sensitive display forms a touch-sensitive surface for accepting user input. The touch-sensitive display/surface (along with any associated modules and/or sets of instructions in medium 2202) detects contact (and any movement or release of the contact) on the touch-sensitive display and converts the detected contact into interaction with user interface objects, such as one or more soft keys displayed on the touch screen when the contact occurs. In some implementations, a point of contact between the touch-sensitive display and the user corresponds to one or more fingers of the user. The user may make contact with the touch-sensitive display using any suitable object or appendage, such as a stylus, pen, finger, and so forth. The touch-sensitive display surface may detect contact and any movement or release thereof using any suitable touch-sensitivity technology, including capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with the touch-sensitive display.
In addition, the I/O subsystem may also be coupled to one or more other physical control devices (not shown), such as buttons, keys, switches, rocker buttons, dials, slide switches, levers, LEDs, and the like, for controlling or performing various functions, such as power control, speaker volume control, phone ring loudness, keyboard input, scrolling, holding, menus, screen locking, clearing and ending communications, and the like. In some implementations, in addition to the touch screen, the device 2200 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch-sensitive display, or an extension of the touch-sensitive surface formed by the touch-sensitive display.
In some embodiments, some or all of the operations described herein may be performed using an application executing on a user's device. The circuits, logic modules, processors, and/or other components may be configured to perform various operations described herein. Those skilled in the art will appreciate that such configuration may be accomplished through design, setup, interconnection, and/or programming of particular components depending on the implementation, and that the configured components may or may not be reconfigurable for different operations, again depending on the implementation. For example, a programmable processor may be configured by providing suitable executable code, dedicated logic may be configured by appropriately connecting logic gates and other circuit elements, and so forth.
Any of the software components or functions described in this application may be implemented as software code to be executed by a processor using any suitable computer language such as, for example, Java, C, C++, C#, Objective-C, Swift, or a scripting language such as Perl or Python, using, for example, conventional or object-oriented techniques. The software code may be stored as a series of instructions or commands on a computer-readable medium for storage and/or transmission. Suitable non-transitory computer-readable media include random access memory (RAM), read-only memory (ROM), magnetic media such as a hard drive or a floppy disk, optical media such as a compact disc (CD) or DVD (digital versatile disc), flash memory, and the like. The computer-readable medium may be any combination of such storage or transmission devices.
Computer programs incorporating various features of the present disclosure may be encoded on various computer-readable storage media; suitable media include magnetic disk or tape, optical storage media such as compact discs (CDs) or DVDs (digital versatile discs), flash memory, and the like. Computer-readable storage media encoded with the program code may be packaged with a compatible device or provided separately from other devices. In addition, the program code may be encoded and transmitted via wired, optical, and/or wireless networks conforming to a variety of protocols, including the Internet, thereby allowing distribution, e.g., via Internet download. Any such computer-readable medium may reside on or within a single computer product (e.g., a solid state drive, a hard drive, a CD, or an entire computer system), and may be present on or within different computer products within a system or network. A computer system may include a monitor, printer, or other suitable display for providing any of the results mentioned herein to a user.
As described above, one aspect of the present technology is the gathering and use of data available from various sources to improve the prediction of users with whom a user may want to interact. The present disclosure contemplates that, in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data in the present technology can be used to the benefit of users. For example, the personal information data can be used to predict users with whom a user may want to communicate at a certain time and place. Accordingly, use of such personal information data included in contextual information enables human-centered prediction of the people with whom a user may want to interact at a certain time and place. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently apply privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy and security of personal information data. Such policies should be easily accessible by users and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur only after receiving the informed consent of the users. Additionally, such entities should consider taking any steps needed to safeguard and secure access to such personal information data and to ensure that others with access to the personal information data adhere to their privacy policies and procedures. Moreover, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted to the particular types of personal information data being collected and/or accessed, and to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different types of personal data in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of human-centric predictive services, the present technology can be configured to allow users to select to "opt in" or "opt out" of participation in the collection of personal information data during registration for the service or at any time thereafter. In another example, users can choose not to provide location information to a recipient suggestion service. In yet another example, users can choose not to provide precise location information, but permit the transfer of location zone information. In addition to providing "opt in" and "opt out" options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed, and then reminded again just before the personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way that minimizes risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health-related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
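The de-identification measures above (removing specific identifiers and reducing location specificity to city level) can be illustrated with a short sketch. All field names and the grid size are hypothetical assumptions for illustration, not part of the disclosed implementation.

```python
# Hypothetical illustration of the de-identification measures described above.
def deidentify(record, grid_deg=0.1):
    """Remove direct identifiers and coarsen location to a ~city-level grid.

    grid_deg controls specificity: 0.1 degree is roughly 11 km of latitude,
    i.e., city level rather than address level.
    """
    # Drop specific identifiers such as name, email, and date of birth.
    cleaned = {k: v for k, v in record.items()
               if k not in ("name", "email", "date_of_birth")}
    lat, lon = cleaned.pop("location")
    # Snap coordinates to the grid so the stored value cannot resolve an address.
    cleaned["coarse_location"] = (round(lat / grid_deg) * grid_deg,
                                  round(lon / grid_deg) * grid_deg)
    return cleaned

out = deidentify({"name": "A. User", "email": "a@example.com",
                  "date_of_birth": "1990-01-01",
                  "location": (37.3349, -122.0090),
                  "app_event": "playback_started"})
```

Aggregating such cleaned records across users, as also mentioned above, would be a further step applied to collections of records rather than to a single one.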
Therefore, although the present disclosure broadly covers the use of personal information data to implement one or more of the various disclosed embodiments, the present disclosure also contemplates that the various embodiments can be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, the users with whom a user may want to communicate at a certain time and place can be predicted based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information, or publicly available information.
Although the present disclosure has been described with respect to specific embodiments, it will be understood that the present disclosure is intended to cover all modifications and equivalents within the scope of the following claims.
All patents, patent applications, publications, and descriptions mentioned herein are incorporated by reference in their entirety for all purposes. No admission is made that any document is prior art. In the event of conflict between the present disclosure and the references provided herein, the present disclosure should predominate.

Claims (41)

1. A method, the method comprising:
in response to a trigger signal at an associated first time, generating a first location value using a first ranging session with one or more other devices;
storing the first location value in memory;
tracking motion of a mobile device using a motion sensor of the mobile device to determine a current location relative to the first location value;
determining that the current location of the mobile device has changed from the first location value by a threshold amount since the associated first time;
in response to the current location of the mobile device having changed by more than a predetermined threshold amount since the associated first time, generating a second location value using a second ranging session with the one or more other devices; and
storing the second location value in the memory.

2. The method of claim 1, wherein the second ranging session is generated further based on a user-facing screen of the mobile device.

3. The method of any one of claims 1 to 2, wherein the second ranging session further requires that a screen of the mobile device be active.

4. The method of any one of claims 1 to 3, further comprising:
determining a playback device for a streaming service based on the first location value or the second location value.

5. The method of claim 4, further comprising:
receiving a notification from a playback device, the notification instructing the mobile device to generate a third location value using a third ranging session for the mobile device.

6. The method of any one of claims 1 to 5, wherein the motion sensor is an accelerometer of the mobile device.

7. The method of any one of claims 1 to 6, wherein the mobile device delays generating the second location value until a screen of the mobile device is on.

8. The method of any one of claims 1 to 7, wherein generating the second location value is delayed until the mobile device has been stationary for a specified period of time.

9. The method of claim 8, further comprising:
determining that the mobile device is stationary by establishing a series of progressively smaller geofences to determine that the mobile device has stopped moving.

10. The method of any one of claims 1 to 9, further comprising:
presenting a graphical user interface in response to the second location value.

11. The method of claim 10, wherein presenting the graphical user interface comprises:
determining one or more applications based on the second location value; and
presenting, by the graphical user interface, one or more graphical elements representing the one or more applications.

12. A mobile device, the mobile device comprising:
one or more processors; and
a memory storing instructions that, when executed by the one or more processors, perform the operations of any one of claims 1 to 11.

13. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, perform the operations of any one of claims 1 to 11.

14. A method performed by a mobile device, the method comprising:
storing a spatial definition of an augmented reality area associated with an augmented reality application installed on the mobile device;
in response to a trigger signal at a first time, determining a first location value of the mobile device, the first location value having a known spatial relationship to the augmented reality area;
storing the first location value in a memory of the mobile device;
tracking, using a sensor of the mobile device, changes in a position of the mobile device from the first location value to determine a current location of the mobile device relative to the augmented reality area; and
initiating one or more actions on the mobile device for the augmented reality application on the mobile device when at least one of the current location of the mobile device, a specified distance, or a plurality of locations indicates that the mobile device may be entering or exiting the augmented reality area.

15. The method of claim 14, wherein the sensor is one of an inertial sensor, an optical sensor, or a GNSS sensor.

16. The method of any one of claims 14 to 15, wherein the one or more actions are initiated when the mobile device is within a specified distance of the augmented reality area.

17. The method of any one of claims 14 to 16, wherein the one or more actions are initiated when a trajectory of the mobile device is predicted to enter the augmented reality area.

18. The method of any one of claims 14 to 17, wherein the one or more actions are initiated when a trajectory of the mobile device is predicted to exit the augmented reality area.

19. The method of any one of claims 14 to 18, further comprising capturing sensor data using a second sensor on the mobile device and storing the sensor data in the memory.

20. The method of any one of claims 14 to 19, further comprising pausing the augmented reality application on the mobile device when the current location of the mobile device is outside the augmented reality area.

21. The method of any one of claims 14 to 20, further comprising:
tracking motion of the mobile device using the sensor of the mobile device to determine a trajectory of the mobile device; and
initiating an augmented reality mode when the trajectory of the mobile device will be within a predetermined range of the augmented reality area.

22. The method of claim 21, further comprising pausing the augmented reality mode on the mobile device when the trajectory of the mobile device is outside the augmented reality area.

23. The method of claim 22, wherein the sensor is an accelerometer of the mobile device.

24. A mobile device, the mobile device comprising:
one or more processors; and
a memory storing instructions that, when executed by the one or more processors, perform the operations of any one of claims 14 to 23.

25. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, perform the operations of any one of claims 14 to 23.

26. A method performed by a computing device, the method comprising:
obtaining a first sensor location, the first sensor location comprising a first set of sensor values measured using a first sensor of a first mobile device, wherein the first set of sensor values is determined from wireless signals transmitted by anchor devices, wherein each anchor device of the anchor devices has an anchor identifier in a set of anchor identifiers;
obtaining a second sensor location, the second sensor location comprising a second set of sensor values measured using a second sensor of a second mobile device, wherein the second set of sensor values is determined from wireless signals transmitted by the anchor devices;
determining, using the set of anchor identifiers, that both the first sensor location and the second sensor location were determined using wireless signals transmitted by the anchor devices; and
providing the first sensor location and the second sensor location as input to a similarity model, the similarity model providing a proximity classification between the first mobile device and the second mobile device.

27. The method of claim 26, wherein the second location is measured by the second mobile device in response to a motion sensor of the second mobile device indicating that the second mobile device has moved a threshold distance from a previous measurement.

28. The method of any one of claims 26 to 27, wherein providing the proximity classification further comprises:
receiving a similarity score as output from the similarity model;
comparing the similarity score to a threshold; and
classifying the mobile device as a proximate device based on the similarity score being above the threshold.

29. The method of any one of claims 26 to 28, wherein the computing device is the second mobile device, and wherein obtaining the first set of sensor values comprises:
identifying a set of signal values measured at the second sensor; and
for each signal value in the set of signal values, performing a ranging measurement with the anchor device that generated the signal value.

30. The method of claim 29, wherein the first sensor location is obtained from a server device.

31. The method of claim 30, wherein obtaining the second sensor location by the second mobile device further comprises:
measuring a first Global Positioning System (GPS) position at the second sensor location;
generating a request comprising the first GPS position;
providing the request to the server device; and
receiving the first sensor location from the server device, wherein the first sensor location is associated with a second Global Positioning System (GPS) position within a threshold distance of the first GPS position.

32. The method of claim 30, wherein obtaining the second sensor location by the second mobile device further comprises:
generating a request comprising the set of anchor identifiers;
providing the request to the server device; and
receiving the first sensor location from the server device, wherein the first sensor location is the set of anchor identifiers.

33. The method of any one of claims 26 to 32, wherein the proximity classification comprises:
a distance between the first sensor location and the second sensor location;
a determination that the first sensor location is near the second sensor location;
a determination that the first sensor location is at the second sensor location; or
a determination that the first sensor location is far from the second sensor location.

34. The method of any one of claims 26 to 33, wherein obtaining the second sensor location comprises:
providing the first sensor location to a server device; and
receiving the second sensor location from the server device.

35. The method of any one of claims 26 to 34, wherein the computing device is a server device.

36. The method of any one of claims 26 to 35, wherein the computing device is the second mobile device.

37. The method of any one of claims 26 to 36, wherein obtaining the second sensor location further comprises:
identifying one or more sensor values in the second set of sensor values, the one or more sensor values corresponding to one or more mobile devices; and
removing the one or more sensor values from the second set of sensor values.

38. The method of any one of claims 26 to 37, wherein the computing device is a server device, and the first sensor location comprises information identifying one or more actions performed by the first mobile device.

39. The method of claim 38, further comprising:
providing the information identifying the one or more actions to the second mobile device.

40. A mobile device, the mobile device comprising:
one or more processors; and
a memory storing instructions that, when executed by the one or more processors, perform the operations of any one of claims 26 to 39.

41. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, perform the operations of any one of claims 26 to 39.
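The threshold-triggered ranging flow recited in claim 1 (range once, dead-reckon with the motion sensor, and re-range only after moving past a threshold) can be sketched as follows. This is a simplified illustration with hypothetical names, not the claimed implementation; a real device would obtain the ranging result from a radio session and the motion deltas from an accelerometer-driven pipeline.

```python
class LocationTracker:
    """Sketch of claim 1: re-range only after moving a threshold distance."""

    def __init__(self, ranging_fn, threshold=5.0):
        self.ranging_fn = ranging_fn  # runs a ranging session, returns (x, y)
        self.threshold = threshold    # meters of motion before re-ranging
        self.stored = []              # stands in for the device memory
        self.offset = (0.0, 0.0)      # dead-reckoned motion since last fix

    def on_trigger(self):
        # First ranging session: generate and store the first location value.
        self.stored.append(self.ranging_fn())

    def on_motion(self, dx, dy):
        # Track motion from the motion sensor relative to the last fix.
        ox, oy = self.offset
        self.offset = (ox + dx, oy + dy)
        moved = (self.offset[0] ** 2 + self.offset[1] ** 2) ** 0.5
        if moved > self.threshold:
            # Moved past the threshold: run a second ranging session
            # and store the second location value.
            self.stored.append(self.ranging_fn())
            self.offset = (0.0, 0.0)

tracker = LocationTracker(ranging_fn=lambda: (0.0, 0.0), threshold=5.0)
tracker.on_trigger()         # first location value stored
tracker.on_motion(3.0, 0.0)  # below threshold: no new ranging session
tracker.on_motion(3.0, 0.0)  # cumulative 6 m > 5 m: second value stored
```

The point of the gating is power: ranging sessions are expensive, so motion-sensor dead reckoning stands in between fixes and a new session runs only when the accumulated displacement makes the stored value stale.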
CN202480036809.1A 2023-06-02 2024-05-30 Position measurement technology Pending CN121220065A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US202363470675P 2023-06-02 2023-06-02
US63/470,675 2023-06-02
US18/677,583 2024-05-29
US18/677,583 US20240402211A1 (en) 2023-06-02 2024-05-29 Location measurement techniques
PCT/US2024/031782 WO2024249701A2 (en) 2023-06-02 2024-05-30 Location measurement techniques

Publications (1)

Publication Number Publication Date
CN121220065A true CN121220065A (en) 2025-12-26

Family

ID=91700293

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202480036809.1A Pending CN121220065A (en) 2023-06-02 2024-05-30 Position measurement technology

Country Status (2)

Country Link
CN (1) CN121220065A (en)
WO (1) WO2024249701A2 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7962155B2 (en) * 2007-07-18 2011-06-14 Hewlett-Packard Development Company, L.P. Location awareness of devices
WO2012099377A2 (en) * 2011-01-17 2012-07-26 엘지전자 주식회사 Augmented reality (ar) target updating method, and terminal and server employing same
CN103503493B (en) * 2011-02-25 2017-04-12 黑莓有限公司 Determining the in-range proximity of a device
US20130085861A1 (en) * 2011-09-30 2013-04-04 Scott Dunlap Persistent location tracking on mobile devices and location profiling
BR112014014780A2 (en) * 2011-12-20 2017-06-13 Koninklijke Philips Nv method of controlling a device comprising a receiver and a motion sensor, computer program product, and device
US8977296B1 (en) * 2012-02-02 2015-03-10 T3 Innovation Labs, LLC Methods and systems for setting up geo fences and delivering digital media content based on such geo fences
EP3001215A1 (en) * 2014-09-24 2016-03-30 Alcatel Lucent Method for determining the relative position of user equipment in a wireless telecommunication network, a node and a computer program product
US10833886B2 (en) * 2018-11-07 2020-11-10 International Business Machines Corporation Optimal device selection for streaming content
CN110958325B (en) * 2019-12-11 2021-08-17 联想(北京)有限公司 A control method, device, server and terminal

Also Published As

Publication number Publication date
WO2024249701A2 (en) 2024-12-05
WO2024249701A3 (en) 2025-02-13

Similar Documents

Publication Publication Date Title
US12015670B2 (en) Using in-home location awareness
US9807725B1 (en) Determining a spatial relationship between different user contexts
EP3639119B1 (en) Object holographic augmentation
CN110301118B (en) Location calibration for an intelligent assistant computing device
JP6810748B2 (en) Control of electronic devices and display of information based on wireless ranging
KR102362117B1 (en) Electroninc device for providing map information
CN109247070B (en) Proactive actions on mobile devices using uniquely identifiable and unmarked locations
US10327082B2 (en) Location based tracking using a wireless earpiece device, system, and method
US9107040B2 (en) Systems, methods, and computer readable media for sharing awareness information
CN105100390B (en) A method of control mobile terminal
CN117859077A (en) System and method for generating three-dimensional maps of indoor spaces
US20180014102A1 (en) Variable Positioning of Distributed Body Sensors with Single or Dual Wireless Earpiece System and Method
US10082584B2 (en) Hybrid device location determination system
CN113728293A (en) System and interface for location-based device control
KR20130119473A (en) Non-map-based mobile interface
US11181376B2 (en) Information processing device and information processing method
WO2014150405A2 (en) Context aware localization, mapping, and tracking
WO2019200385A1 (en) Proximity-based event networking system and wearable augmented reality clothing
KR20200029271A (en) Electronic device and method for identifying location in the electronic device
US12299918B2 (en) Methods and systems to facilitate passive relocalization using three-dimensional maps
US20250089016A1 (en) Techniques for device localization
US20240402211A1 (en) Location measurement techniques
CN121220065A (en) Position measurement technology
WO2023107729A1 (en) Methods and systems to allow three-dimensional maps sharing and updating
US20250377742A1 (en) Controller engagement detection using hybrid sensor approach

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination