HK1228053A1 - Methods and apparatus to detect engagement with media presented on wearable media devices - Google Patents
- Publication number
- HK1228053A1
- Authority
- HK
- Hong Kong
Abstract
Methods and apparatus to detect engagement with media presented on a wearable media device are disclosed. An example method includes determining a degree of opacity of a media presentation displayed on a wearable media device; and calculating an engagement score for media presented via the media presentation based on the degree of opacity.
Description
RELATED APPLICATIONS
This patent claims the benefit of U.S. provisional patent application No. 61/923,859, filed on January 6, 2014, which is hereby incorporated by reference in its entirety.
Technical Field
The present disclosure relates generally to audience measurement and, more particularly, to methods and apparatus for detecting engagement with media presented on a wearable media device.
Background
Media monitoring companies desire information about user interaction with media devices. To this end, media monitoring companies obtain monitoring information related to media presented at a media device so that media monitoring entities gain knowledge of, for example, exposure to advertisements, exposure to content (e.g., shows, web pages, etc.), user purchasing activity related to exposure to media, demographic information of audiences exposed to the media, and so forth.
Drawings
Fig. 1 illustrates an example environment including an example wearable media device with a meter constructed in accordance with the teachings of the present disclosure.
Fig. 2 is a block diagram of an example implementation of the example wearable media device of fig. 1.
Fig. 3 is a screenshot representing a first view seen through the example wearable media device of fig. 1 and/or 2 with media presented at a first opacity.
Fig. 4 is a screenshot representing a second view seen through the example wearable media device of fig. 1 and/or 2 with media presented at a second opacity.
FIG. 5 is a block diagram of an example implementation of the example meter of FIG. 1 and/or FIG. 2.
Fig. 6 is a flow diagram representing example machine readable instructions that may be executed to implement the example wearable media device of fig. 1 and 2.
Fig. 7 is a flow diagram representing example machine readable instructions that may be executed to implement the example meter of figs. 1, 2, and/or 5.
Fig. 8 is a flow diagram representing example machine readable instructions that may be executed to implement the example media measurement entity of fig. 1.
Fig. 9 is a block diagram of an example processor platform that is capable of executing the example machine-readable instructions of fig. 6 to implement the example wearable media device of figs. 1 and/or 2, that is capable of executing the example machine-readable instructions of fig. 7 to implement the example meter of figs. 1, 2, and/or 5, and/or that is capable of executing the example machine-readable instructions of fig. 8 to implement the example media measurement entity of fig. 1.
Detailed Description
Media monitoring companies desire information about user interaction with media devices. For example, media monitoring companies desire to obtain monitoring information related to media presented at a media device so that media monitoring entities gain knowledge of, for example, exposure to advertisements, exposure to content (e.g., shows, web pages, etc.), user purchasing activity in response to exposure to media, demographic information of viewers exposed to media, and so forth. As used herein, media refers to any form of content and/or advertising delivered via any type of distribution medium (e.g., television, radio, tablet, smartphone, wearable media device, etc.). Monitoring information includes, for example, media identification information (e.g., media identification metadata, codes, signatures, watermarks, and/or other information that may be used to identify the presented media), application usage information (e.g., an identifier of the application, a time and/or duration of use of the application, a rating of the application, etc.), and/or user identification information (e.g., demographic information, a panelist identifier, a username, etc.). The media identification information may be aggregated to determine and/or estimate, for example, exposure of one or more populations and/or demographics to particular media and/or types of media, ownership and/or usage statistics of media devices, relative rankings of usage and/or ownership of media devices, types of usage of media devices (e.g., whether a device is used to browse the internet, to stream media from the internet, etc.), and/or other types of media device information.
Traditionally, such systems treat each detection of media identically for purposes such as calculating exposure data (e.g., ratings), even though a first person associated with a first detection of media may be paying little or no attention to the presentation of the detected media, while a second person associated with a second detection of media may be focused on (e.g., highly interested in and/or interacting with) the presentation of the media.
The examples disclosed herein recognize that while media may be detected on a media device, the presentation of the media does not necessarily indicate that a person is paying attention to (e.g., engaged with) the media presentation. Examples disclosed herein generate engagement information (e.g., a likelihood of engagement) that indicates whether a user is interested in media presented on a wearable media device. Some wearable media devices have a head-mounted display that presents media (e.g., audiovisual media such as television programming, movies, streaming video, websites, advertisements, text messages, emails, maps, augmented reality data, etc.) to a user, for example, on a portion (e.g., an upper right corner) of a reflective surface (e.g., a lens of glasses). In this manner, the user of the wearable media device is exposed to the displayed media while still able to interact with the surrounding environment. Because a user of the wearable media device can readily perform tasks other than attending to (e.g., focusing on) the displayed media, the user may not be paying attention to the displayed media.
To determine whether the user is engaged with (e.g., paying attention to) the displayed media and/or how likely it is that the user is paying attention to the displayed media, examples disclosed herein use display characteristics generated by the wearable media device. In examples disclosed herein, the opacity or transparency characteristics of the display are captured and used to generate, for example, an engagement score (e.g., a likelihood of engagement expressed as a percentage) for the displayed media. While the display of media on head-mounted wearable media devices (such as glasses) is often completely opaque, in some scenarios the media is displayed with a degree of transparency. Because transparency and opacity are inversely related, transparency may also be expressed as an opacity or an opacity percentage. The wearable media device provides a transparent or semi-opaque display to enable the user to perceive his or her environment beyond the display of the media. In other words, head-mounted displays sometimes generate displays that are at least partially see-through so that the user can view his or her surroundings while still being exposed to the media. As described in detail below, in some example wearable devices, the transparency or opacity of such a head-mounted display is set according to, for example, manual input provided by a user and/or automatic detection of a gaze direction of the user relative to the media display. Examples disclosed herein use data representing the opacity of the display to determine a likelihood of the user engaging with (e.g., focusing on) the corresponding media. Example measures of user attention provided by examples disclosed herein are referred to herein as engagement levels or engagement scores.
In some examples disclosed herein, a greater opacity of the display (e.g., 90% opacity) is scaled to a higher engagement score for the corresponding media because (1) manual input provided by the user corresponds to a desire to view the media clearly and/or (2) data from the gaze direction detector indicates that the user is likely looking at the display. In some examples disclosed herein, a smaller opacity (e.g., 40% opacity) is scaled to a lower engagement score for the corresponding media because (1) manual input provided by the user corresponds to the user increasing transparency out of a desire to see through (e.g., ignore or partially ignore) the media and/or (2) data from the gaze direction detector indicates that the user is likely looking away from the display.
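A minimal sketch of the scaling described above, assuming a simple linear relationship between opacity and engagement score; the disclosure does not specify the actual scaling function, so the identity mapping and the 0-100 score range here are illustrative assumptions only.

```python
# Minimal sketch, assuming a simple linear scaling between display
# opacity and engagement score; the identity mapping is illustrative,
# not the patented algorithm.

def engagement_score(opacity_percent):
    """Scale an opacity percentage (0-100) to an engagement score (0-100)."""
    if not 0.0 <= opacity_percent <= 100.0:
        raise ValueError("opacity must be between 0 and 100")
    return float(opacity_percent)
```

Under this assumed scaling, a 90% opaque display yields a higher score than a 40% opaque one, matching the directionality described above.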
In some examples disclosed herein, the opacity of the display generated by the wearable media device is obtained by reference to one or more settings used by the wearable media device to generate the display. Additionally or alternatively, examples disclosed herein capture the opacity characteristic via an interface with a manual input (e.g., a button) that the user has access to during use of the wearable media device. Additionally or alternatively, examples disclosed herein capture the opacity characteristic via a gaze direction detector of the wearable media device that determines or sets the opacity of the media presentation based on whether the user is looking at or near the media presentation. While eye position information itself may indicate whether the user is focusing on the displayed media, in some scenarios the user looks in the direction of the media display but at the same time provides manual input that makes the display transparent. Accordingly, some examples disclosed herein base engagement determinations on opacity characteristic data alone and/or prioritize opacity characteristic data over eye position data. In some examples, eye position data is not used and/or not obtained.
In some examples disclosed herein, the engagement level or score is calculated by scaling the opacity characteristic data to the engagement score based on a data structure (e.g., a translation table) having a plurality of mappings between opacity and engagement score. In some examples disclosed herein, the engagement score is calculated according to one or more algorithms defined by, for example, a media measurement entity. In some examples disclosed herein, one or more algorithms combine opacity characteristic data with additional or alternative data generated by the wearable media device, such as sensor information (e.g., motion data, location data, facial expression data, eye tracking data, etc.), to generate an engagement score. That is, some examples disclosed herein consider additional factors along with opacity characteristic data to generate an engagement score.
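The translation-table approach and the combination with auxiliary sensor data described above might be sketched as follows; the opacity bands, scores, weights, and the `motion_factor` input are all hypothetical, since the disclosure leaves the concrete mappings and algorithms to the media measurement entity.

```python
# Illustrative sketch: a translation table mapping opacity ranges to
# engagement scores, plus a weighted combination with an auxiliary
# sensor-derived factor. All bands, scores, and weights are assumptions.

OPACITY_TO_ENGAGEMENT = [
    (80.0, 100.0, 90),  # mostly opaque display -> high engagement score
    (50.0, 80.0, 60),   # semi-opaque display   -> medium engagement score
    (0.0, 50.0, 25),    # mostly transparent    -> low engagement score
]

def table_score(opacity):
    """Look up an engagement score for an opacity percentage."""
    for low, high, score in OPACITY_TO_ENGAGEMENT:
        if low <= opacity <= high:
            return score
    raise ValueError("opacity out of range")

def combined_score(opacity, motion_factor):
    """Blend the opacity-based score with a 0-1 auxiliary sensor factor."""
    return 0.8 * table_score(opacity) + 0.2 * (motion_factor * 100.0)
```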
The engagement information provided by examples disclosed herein is used, for example, to generate an engagement rating for particular media presented on a wearable media device. A traditional rating generated using exposure information indicates exposure to the media, but does not indicate whether the audience member was actually interested in the media presentation (e.g., an individual may divert attention away from the media). In contrast, the engagement information provided by examples disclosed herein may be used to generate an engagement rating indicating how much attention the user of the wearable media device paid to particular media. The engagement ratings provided by examples disclosed herein may stand alone and/or may be used to supplement traditional (e.g., exposure-only) ratings. The engagement ratings provided by the examples disclosed herein are more granular, in several respects, than traditional ratings generated using only exposure and/or media identification information. For example, the engagement levels disclosed herein provide information about the wearable media device user's attention to a particular portion or event of the media (such as a particular scene, the appearance of a particular actor or actress, a particular song being played, a particular product being shown, etc.). Thus, the engagement level or score provided by examples disclosed herein indicates, for example, that audience members became and/or remained attentive when a particular person, brand, or object appeared in the media and/or when a particular event or type of event occurred in the media. Thus, more refined data (relative to data provided by previous exposure-only based systems) relating to particular portions of media is provided by examples disclosed herein.
Fig. 1 illustrates an example environment including a user 100 wearing a wearable media device 102. In the example of fig. 1, wearable media device 102 includes a meter 104 constructed in accordance with the teachings of the present disclosure. As described in detail below, the example meter 104 uses opacity data related to the display of the wearable media device 102 to generate engagement information indicative of a level of attention to media presented on the wearable media device 102. The example wearable media device 102 of fig. 1 can obtain (e.g., download) any suitable type of media from any suitable media source. For example, wearable media device 102 of fig. 1 communicates with media provider 106 via network 108 and/or directly to obtain media displayed on wearable media device 102. The example network 108 of fig. 1 is a Wide Area Network (WAN), such as the internet. However, in some examples, a local area network may additionally or alternatively be used. For example, multiple networks (e.g., cellular networks, ethernet networks, etc.) may be used to implement the example network 108 of fig. 1. The example media provider 106 of fig. 1 may be any provider of media, such as a media broadcaster, a multicaster, or a unicaster (e.g., a cable television service, a fiber optic television service, an IPTV provider, etc.), an on-demand digital media provider (e.g., a provider of internet streaming video and/or audio services), a web page, and/or any other provider of any type of electronic media.
In the illustrated example of fig. 1, the wearable media device 102 is a head-mounted display device (such as, for example, Google Glass). In the illustrated example, the wearable media device 102 of fig. 1 communicates with the example network 108 via a first wireless coupling 110 established, for example, with a Wi-Fi access point 112. Additionally or alternatively, the example wearable media device 102 of fig. 1 communicates with the network 108 via a second wireless coupling 114 (e.g., a bluetooth pairing, a Wi-Fi session) established with a portable apparatus 116 having, for example, cellular capabilities. The example portable device 116 of fig. 1 is, for example, a smart phone, a tablet computer, a tablet phone, a portable people meter, and/or any other portable device having wireless communication capability to communicate with the network 108. In such examples, the example wearable media device 102 of fig. 1 communicates data to the portable device 116 via the second wireless coupling 114, and the portable device 116 forwards the data to the network 108 over a third wireless coupling 118 (e.g., a cellular connection). In some examples, the wearable media device 102 of fig. 1 uses the first wireless coupling 110 when the wearable media device 102 is within range of the Wi-Fi access point 112. When the example wearable media device 102 of fig. 1 is not within range of the Wi-Fi access point 112 (or any other Wi-Fi access point and/or other type of short-range communication device), the wearable media device 102 of fig. 1 uses the second wireless coupling 114 with the portable device 116 to communicate with the network 108.
In the illustrated example of fig. 1, the meter 104 collects information related to media presentations generated by the example wearable media device 102. In the example of fig. 1, the example meter 104 (1) detects and/or measures user engagement with the media presentation, (2) detects and/or identifies media presented on the wearable media device 102, and/or (3) detects and/or identifies a user of the wearable media device 102. In the illustrated example of fig. 1, the example meter 104 is implemented as software downloadable via, for example, the internet. In the illustrated example of fig. 1, the meter 104 is provided by a media measurement entity 120 (e.g., a monitoring entity such as The Nielsen Company) and/or the example media provider 106. For example, the media measurement entity 120 of fig. 1 includes a software development kit (SDK) provider 122 that provides instructions to application developers associated with, for example, the media provider 106. In some examples, the SDK provider 122 provides the SDK to an application developer so that the developer can integrate monitoring instructions (e.g., including instructions to implement the example meter 104) into existing applications. In such examples, the media provider 106 uses the SDK to integrate the meter 104 into an application associated with the media provider 106 (e.g., by instrumenting the application with instructions corresponding to the SDK of the meter 104), and places the instrumented application, having the meter 104 integrated therein, into, for example, an application store (e.g., Apple iTunes, Google Play, etc.). In some examples, the instrumented application has primary functionality other than media monitoring, such as, for example, presenting media from a particular media provider (e.g., when the instrumented application is specific to that particular media provider (e.g., a television broadcaster such as ESPN, ABC, NBC, etc.)).
Members of the public (some of whom are panelists of the media measurement entity 120) may download the meter 104 (e.g., from an application store) to a corresponding media device (such as the example wearable media device 102 of fig. 1). People become panelists via, for example, a user interface (e.g., a website) presented on the wearable media device 102. People may become panelists in additional or alternative ways, such as, for example, via a telephone interview, by completing an online survey, etc. Additionally or alternatively, people may be contacted and/or recruited using any desired methodology (e.g., random selection, statistical selection, phone solicitation, internet advertising, surveys, advertising in shopping malls, product packaging, etc.). During enrollment, the media measurement entity 120 of fig. 1 receives demographic information from the enrolling panelists so that subsequent associations can be made between media exposure associated with those panelists and different demographic markets.
Although in the illustrated example of fig. 1 the meter 104 is provided via an SDK, the meter 104 and/or corresponding instructions may be provided in any other suitable manner. For example, instructions associated with the example meter 104 of fig. 1 may be provided as an Application Programming Interface (API), a plug-in, an add-on, and the like. Alternatively, the instructions associated with the example meter 104 may be maintained externally, and the SDK may facilitate installation of a link to the monitoring instructions into one or more applications. This latter approach is advantageous because it enables the monitoring functionality to be modified even after deployment of the corresponding application.
As described in detail below with respect to figs. 5 and 6, the example meter 104 of fig. 1 collects monitoring data (e.g., media identification information, user identification information, device identification information, etc.), generates engagement information indicative of interest in the display of the wearable media device 102, and transmits a record including the monitoring data and the engagement information to the example media measurement entity 120 (e.g., via a communication interface of the wearable media device 102 and the network 108). To exchange information with the wearable media device 102 via the network 108, the example media measurement entity 120 employs a server 124 (and/or any other suitable computing platform) that implements an interface 126 that receives reported monitoring information from, for example, the wearable media device 102 via the network 108. The example interface 126 of FIG. 1 is a HyperText Transfer Protocol (HTTP) interface. However, the example server 124 of FIG. 1 may use any suitable type of interface and/or protocol. In the illustrated example, the HTTP interface 126 receives HTTP requests including, for example, media monitoring information. In some examples, the HTTP request carries the media monitoring information in the payload portion of the request.
The media monitoring information received via the HTTP request includes, for example, media identification information (e.g., media identification metadata, codes, signatures, watermarks, and/or other information that may be used to identify the presented media), user identification information (e.g., an alphanumeric identifier assigned to the current user), device identification information (e.g., a model number, a manufacturer identification, version information, etc.), application usage information (e.g., an identifier of the application, a time and/or duration of use of the application, a rating of the application, etc.), engagement information generated by the example meter 104, and/or any other suitable monitoring information. The request is not actually intended to acquire media, but rather serves as a vehicle for communicating media monitoring information. Thus, an HTTP request may be referred to as a "virtual request". The example server 124 of fig. 1 is provided with software (e.g., a daemon process) that extracts the media monitoring information from the payload of the virtual request. Additionally or alternatively, any other method of communicating media monitoring information may be used (e.g., the HTTP Secure (HTTPS) protocol, the File Transfer Protocol (FTP), the Secure File Transfer Protocol (SFTP), HTTP and/or HTTPS GET requests, HTTP and/or HTTPS POST requests, etc.).
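As a hedged illustration of such a "virtual request", the following sketch builds an HTTP POST whose payload carries monitoring information rather than a media request; the endpoint URL and JSON field names are hypothetical, not part of the disclosure.

```python
import json
import urllib.request

# Hypothetical sketch of a "virtual request": an HTTP POST whose payload
# carries media monitoring information instead of requesting media.
# The endpoint URL and JSON field names are assumptions for illustration.

record = {
    "media_id": "watermark-1234",          # media identification information
    "panelist_id": "panelist-5678",        # user identification information
    "device": {"model": "wearable-x", "version": "1.0"},
    "engagement_score": 82,                # engagement info from the meter
}

req = urllib.request.Request(
    url="https://collector.example.com/report",   # hypothetical interface 126
    data=json.dumps(record).encode("utf-8"),      # payload portion of the request
    headers={"Content-Type": "application/json"},
    method="POST",
)
# A server-side daemon would extract the payload; actually sending the
# request (urllib.request.urlopen(req)) is omitted here.
```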
The example media measurement entity 120 of fig. 1 employs a data store 128 implemented via one or more storage devices, such as, for example, flash memory, magnetic media, optical media, and so forth. The data stored in the example data store 128 of FIG. 1 can be in any data format (such as, for example, binary data, comma delimited data, tab delimited data, Structured Query Language (SQL) structures, etc.). Although in the illustrated example of fig. 1 the data store 128 is illustrated as a single database, the data store 128 may be implemented via multiple databases and/or may be stored in multiple storage locations. The example data store 128 of FIG. 1 stores engagement information and monitoring information received, for example, from the example meter 104 of FIG. 1. In some examples, the data store 128 may store personally identifying information (e.g., demographic information, biographical information, etc.) about one or more panelists and/or other persons that indicates, for example, one or more characteristics of the corresponding person.
While the above discussion has focused on a single wearable media device 102, a single meter 104, a single media provider 106, and a single media measurement entity 120 for simplicity, any number of these elements may be present. For example, in a typical embodiment, it is expected that the media measurement entity 120 will provide a plurality of different meters 104 to the public at large. Thus, it is expected that there will be many media devices accessing metering applications, and that a large proportion of users desiring access to such applications agree to be panelists. Thus, it is expected that there will be many instances of the above-described processing that will occur across many devices at overlapping times and/or at different times. Thus, for example, there may be many instantiations of the machine readable instructions disclosed in the following flowcharts that operate at the same or different times. Many of these cases may be implemented as parallel threads operating on the same device.
Fig. 2 is a block diagram of an example implementation of the example wearable media device 102 of fig. 1. The example wearable media device 102 of fig. 2 includes sensors 200 that monitor the environment in which the wearable media device 102 is located and/or monitor activity of the wearable media device 102. The sensors 200 of fig. 2 include, for example, motion sensors, accelerometers, location trackers (e.g., global positioning system modules), audio sensors, touch sensors, image capturers, and/or any other suitable sensors that collect information related to the wearable media device 102. In some examples, the sensor 200 includes an image capture sensor that obtains image information indicative of a gaze direction of a user. For example, the user's gaze direction is calculated by determining the direction in which the center of the eye is pointing. As described below, the user's gaze direction may be used to control the opacity characteristics of the displayed media.
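One plausible way to compare a gaze direction against the direction to the media display is sketched below; the disclosure only states that the gaze direction is calculated from where the center of the eye points, so the vector representation and the angle computation are assumptions for illustration.

```python
import math

# Hedged sketch: the angular difference between the user's gaze direction
# and the direct line of sight to the media display area, modeled as the
# angle between two 2-D direction vectors. The vector representation is
# an assumption, not the disclosed method.

def angular_difference(gaze, to_display):
    """Return the angle in degrees between two direction vectors."""
    dot = sum(g * d for g, d in zip(gaze, to_display))
    norm = math.hypot(*gaze) * math.hypot(*to_display)
    cos_theta = max(-1.0, min(1.0, dot / norm))  # clamp rounding error
    return math.degrees(math.acos(cos_theta))
```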
The example wearable media device 102 of fig. 2 includes a communication interface 202 that facilitates communications such as those described above with respect to fig. 1. For example, the communication interface 202 of fig. 2 includes a Wi-Fi interface to communicate with available (e.g., in-range) Wi-Fi access points. Accordingly, the example communication interface 202 of fig. 2 facilitates the first example wireless coupling 110 described above in connection with fig. 1. In some cases, the Wi-Fi communication interface 202 is additionally or alternatively used to facilitate the second example wireless coupling 114 of fig. 1 with the example portable device 116. Additionally or alternatively, the example communication interface 202 of fig. 2 includes a bluetooth interface that facilitates, for example, the first example wireless coupling 110 and/or the second example wireless coupling 114 of fig. 1. In some examples, the communication interface 202 of fig. 2 includes one or more wired interfaces that exchange information over a cable and/or receive charge from a power source.
The example wearable media device 102 of fig. 2 includes one or more applications 204 to be executed on the example wearable media device 102. As described above, the example wearable media device 102 of fig. 2 may download any number and/or type of applications (e.g., email applications, texting applications, mapping applications, browsers, augmented reality applications, etc.) from, for example, an application store. The example application 204 of fig. 2 includes a media retriever 206 and an eye tracker 208. The example media retriever 206 of fig. 2 obtains media from any suitable source, such as, for example, the media provider 106 of fig. 1. The example media retriever 206 of fig. 2 implements, for example, a web browser, a streaming service, and/or an on-demand programming service that facilitates retrieval of media. The example media retriever 206 of fig. 2 receives a request for particular media (e.g., input from a user) and submits one or more queries to the appropriate media source, causing the media to be delivered to the wearable media device 102.
The example eye tracker 208 of fig. 2 uses the detected gaze direction of the user to control the opacity of media displayed on the display surface 210 of the wearable media device 102. The example eye tracker 208 of fig. 2 uses the eye position and/or movement data provided by one or more of the sensors 200 to determine or estimate the user's gaze direction and to determine whether the estimated gaze direction corresponds to a portion of the display surface 210 designated for media display (e.g., a segment of a lens). That is, the example eye tracker 208 of fig. 2 indicates how close the user's gaze is to the media presented on the display surface 210. In the illustrated example of fig. 2, the eye tracker 208 calculates an angular difference (e.g., a number of degrees) between the direction of the detected gaze and a direct line of sight between the user's eyes and the designated display portion of the display surface 210. In the illustrated example of fig. 2, the eye tracker 208 uses the magnitude of the angular difference to provide opacity instructions for the media display to the display generator 212 of the wearable media device 102. For example, the eye tracker 208 of fig. 2 queries a mapping table 214 that includes mappings between the angular difference (between the detected gaze direction and the direction corresponding to the designated media display portion of the display surface) and the opacity of the currently displayed media. Using the mapping table 214, the example eye tracker 208 of fig. 2 selects an opacity of the display to correspond to the detected gaze direction (e.g., looking away from the displayed media, looking near the displayed media, looking directly at the displayed media, etc.). In the illustrated example of FIG. 2, the mapping table 214 specifies a high opacity (e.g., 80% to 100% opaque) when, for example, the user is looking directly at the designated media display portion of the display surface 210.
Additionally, the example mapping table 214 of FIG. 2 specifies a medium opacity (e.g., 50% to 80% opaque) when, for example, the user is looking near the designated media display portion of the display surface 210. Additionally, the example mapping table 214 of FIG. 2 specifies a low opacity (e.g., 25% to 50% opaque) when, for example, the user is looking away from the designated media display portion of the display surface 210.
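The mapping table behavior described above might be sketched as a simple threshold function; the opacity bands (80% to 100%, 50% to 80%, 25% to 50%) come from the description, while the angular thresholds separating "directly at", "near", and "away" are assumed values.

```python
# Sketch of the mapping table 214 behavior: bucket the gaze/display
# angular difference into an opacity setting. The 5-degree and 20-degree
# thresholds are assumptions; the opacity bands follow the description.

def opacity_for_gaze(angle_deg):
    """Return a display opacity percentage for a gaze angular difference."""
    if angle_deg <= 5.0:     # looking directly at the media display
        return 100           # high opacity band (80% to 100%)
    if angle_deg <= 20.0:    # looking near the media display
        return 65            # medium opacity band (50% to 80%)
    return 25                # looking away: low opacity band (25% to 50%)
```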
Fig. 3 illustrates an example in which the eye tracker 208 of fig. 2 determines that the user's eye 300 is looking directly at the media display 302 of the display surface 210 of the wearable media device 102. At the time corresponding to fig. 3, the example eye tracker 208 of fig. 2 determines that the user's gaze direction 304 is directed toward the media display 302. Thus, the media presented on the example display 302 of FIG. 3 has a high opacity (e.g., 100% opaque).
Fig. 4 illustrates an example in which the eye tracker 208 of fig. 2 determines that the user's eye 300 is looking away from the media display 302 of the display surface 210 of the wearable media device 102. At the time corresponding to fig. 4, the example eye tracker 208 of fig. 2 determines that the user's gaze direction 400 is directed away from the media display 302. Thus, the media presented on the example display 302 of FIG. 4 has a low opacity (e.g., 25% opaque).
The example eye tracker 208 of fig. 2 conveys the opacity settings (e.g., opacity percentages) obtained from the mapping table 214 to a display generator 212, which display generator 212 facilitates display of media on the display surface 210 according to the received opacity settings.
Further, the example eye tracker 208 of fig. 2 conveys the obtained opacity settings to display settings 216 used by the example display generator 212 to generate a display of media on the display surface 210. That is, the example display settings 216 of FIG. 2 include entries (e.g., variables, files, etc.) that are dedicated to tracking the current opacity of the media displayed on the display surface 210.
The example wearable media device 102 of fig. 2 includes a manual opacity input 218 that a user of the wearable media device 102 has access to. For example, the manual opacity input 218 of fig. 2 is implemented via a button on the frame of the wearable media device 102 and/or via an on-screen menu presented on the display surface 210 from which the user makes selections. The example manual opacity input 218 of fig. 2 enables the user to instruct the display generator 212 to display media at a particular opacity. Instructions provided via the example manual opacity input 218 are stored in the example display settings 216. In some examples, the manual opacity input 218 of fig. 2 toggles through a range of opacities (which may include predetermined values). Additionally or alternatively, the example manual opacity input 218 provides a field or prompt for the user to enter a specific opacity value (e.g., a percentage) for the displayed media. In the illustrated example of fig. 2, the example display generator 212 uses the provided input to generate the display of media on the display surface 210. In the example of fig. 2, the display generator 212 prioritizes instructions provided by the manual opacity input 218 over the settings provided by the example eye tracker 208. However, any suitable combination and/or priority setting is possible.
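The priority rule described above (a manual opacity instruction overrides the eye-tracker-derived setting) can be sketched as follows; representing "no manual input" as `None` is an assumption for illustration.

```python
# Minimal sketch of the priority rule: a manual opacity instruction,
# when present, overrides the setting derived by the eye tracker.
# Representing "no manual input" as None is an assumption.

def resolve_opacity(manual_opacity, tracker_opacity):
    """Return the opacity the display generator should use."""
    return manual_opacity if manual_opacity is not None else tracker_opacity
```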
As explained above, the example wearable media device 102 of fig. 2 displays media on the display surface 210 according to the display settings 216 (including an indication of the opacity of the displayed media). As described below, the example meter 104 obtains opacity information and uses the opacity information to generate engagement information for media displayed on the display surface 210.
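For illustration only (not part of the claimed subject matter), the opacity-source priority described above, in which instructions from the manual opacity input 218 override settings from the eye tracker 208, can be sketched as follows; the function and parameter names are illustrative assumptions.

```python
def resolve_opacity(manual_opacity, eye_tracker_opacity, default=100):
    """Return the opacity (percent) the display generator should use.

    A manual instruction, when present, overrides the eye tracker's
    setting; if neither source has provided a value, fall back to a
    fully opaque display.
    """
    if manual_opacity is not None:
        return manual_opacity
    if eye_tracker_opacity is not None:
        return eye_tracker_opacity
    return default
```

For example, with a manual instruction of 40% and an eye-tracker suggestion of 90%, the manual value prevails; with no manual instruction, the eye-tracker value is used.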
Although an example manner of implementing the wearable media device 102 of fig. 1 is illustrated in fig. 2, one or more of the elements, processes and/or devices illustrated in fig. 2 may be combined, divided, rearranged, omitted, eliminated and/or implemented in any other way. Further, the example communication interface 202, the example application 204, the example media retriever 206, the example eye tracker 208, the example display generator 212, and/or the example meter 104 may be implemented via hardware, software, firmware, and/or any combination of hardware, software, and/or firmware. Thus, for example, any of the example communication interface 202, the example application 204, the example media retriever 206, the example eye tracker 208, the example display generator 212, and/or the example meter 104 of fig. 2 may be implemented via one or more analog or digital circuits, logic circuits, programmable processors, Application Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), and/or Field Programmable Logic Devices (FPLDs). When reading any of the patented device or system claims covering purely software and/or firmware implementations, at least one of the example communication interface 202, the example application 204, the example media retriever 206, the example eye tracker 208, the example display generator 212, and/or the example meter 104 of fig. 2 is expressly defined herein to include a tangible computer readable storage device or storage disk (such as a memory, a Digital Versatile Disc (DVD), a Compact Disc (CD), a blu-ray disc, etc.) that stores the software and/or firmware. Still further, the example wearable media device 102 of fig. 2 may include one or more elements, processes and/or devices in addition to, or instead of, those illustrated in fig. 2, and/or may include more than one of any or all of the illustrated elements, processes and devices.
Fig. 5 is a block diagram of the example meter 104 of fig. 1 and/or 2. The example meter 104 of fig. 5 includes an engagement level detector 500 to detect and/or measure an engagement level of media presented on the example wearable media device 102 of fig. 1 and/or 2. The engagement level detector 500 of fig. 5 includes an opacity obtainer 502, the opacity obtainer 502 obtaining opacity information associated with a display generated by the wearable media device 102; a score calculator 504, the score calculator 504 receiving data from the opacity obtainer 502; and one or more conversions 506 used by the score calculator 504. The example opacity obtainer 502 of fig. 5 captures display characteristic data from the wearable media device 102. In some examples, the opacity obtainer 502 communicates with (e.g., queries and receives responses from) the example display settings 216 stored in memory of the wearable media device 102. As described above, the example display settings 216 include instructions and/or settings referenced by the display generator 212, which uses the display settings 216 to generate a media display having a particular opacity. In this case, the example opacity obtainer 502 of fig. 5 identifies and/or interprets the display settings 216 as corresponding to a particular opacity and provides corresponding data to the example score calculator 504 of fig. 5. Additionally or alternatively, the example opacity obtainer 502 of fig. 5 obtains opacity information by interfacing with the example manual opacity input 218 of the wearable media device 102. That is, the example opacity obtainer 502 of fig. 5 receives one or more signals from the manual opacity input 218 when, for example, a user presses a button corresponding to the manual opacity input 218 and/or selects from an on-screen menu corresponding to the manual opacity input 218. In this case, the example opacity obtainer 502 of fig. 5 identifies and/or interprets the received signal as corresponding to a particular opacity and provides data to the example score calculator 504 of fig. 5. Additionally or alternatively, the example opacity obtainer 502 of fig. 5 obtains opacity information by interfacing with the example eye tracker 208 of the wearable media device 102. That is, the example opacity obtainer 502 of fig. 5 receives one or more signals from the eye tracker 208 when, for example, the eye tracker 208 calculates a gaze direction of the user. In this case, the example opacity obtainer 502 of fig. 5 identifies and/or interprets the received signal as corresponding to a particular opacity and provides data to the example score calculator 504 of fig. 5.
The example score calculator 504 of fig. 5 uses data representing the opacity provided by the example opacity obtainer 502 to generate a metric of the user's attention (e.g., an engagement level). In the illustrated example of fig. 5, the engagement level calculated by the score calculator 504 is a likelihood that the user is focusing on media presented on the display surface 210 of the wearable media device 102. The metric generated by the example score calculator 504 of fig. 5 is any suitable type of value (such as, for example, a scale-based numerical score, a percentage, a classification, one of a plurality of levels defined by various thresholds, etc.). In some examples, the metric generated by the example score calculator 504 of fig. 5 is a total score or percentage formed by combining a plurality of individual engagement scores or percentages based on different data and/or detections corresponding to, for example, consecutive intervals.
In the illustrated example of fig. 5, the score calculator 504 uses the provided opacity to determine or estimate, for example, whether the user is paying attention to the displayed media. The example score calculator 504 of fig. 5 calculates a score (e.g., a likelihood) indicating whether the user is engaged with the displayed media based on the conversion 506. For example, the score calculator 504 of fig. 5 compares the received opacity with one or more thresholds stored in the conversion 506 to select one of a plurality of engagement scores. For example, the conversion 506 of fig. 5 includes a table in which a particular range of opacities corresponds to a particular engagement score. Table 1 is an example illustration of the example conversion 506 of fig. 5.
TABLE 1
| Opacity(%) | Engagement score |
| 100 | 10 |
| 90-99 | 9 |
| 80-89 | 8 |
| 70-79 | 7 |
| 60-69 | 6 |
| 50-59 | 5 |
| 40-49 | 4 |
| 30-39 | 3 |
| 20-29 | 2 |
| 10-19 | 1 |
| 0-9 | 0 |
As shown in table 1, the user is assigned a greater engagement score when the opacity is high. While the engagement scores of table 1 are integers, additional or alternative types of scores (such as percentages) are possible.
Additionally or alternatively, the example score calculator 504 of fig. 5 scales the exact opacity level to a particular engagement score using any suitable algorithm or equation. In other words, the example score calculator 504 of fig. 5 may scale the opacity directly to an engagement score in addition to or instead of using ranges of opacity (e.g., according to Table 1 of the conversion 506) to assign scores to corresponding users. In this case, the example conversion 506 includes one or more algorithms or functions that receive the opacity as an input and output a numerical representation of, for example, the engagement probability. For example, the conversion 506 receives a first opacity percentage and generates a second percentage that indicates a likelihood of user engagement with the displayed media. In this case, a higher percentage indicates a proportionally higher level of attention or engagement.
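For illustration only, the two forms of the conversion 506 described above, a threshold table patterned on Table 1 and a direct scaling of opacity to an engagement likelihood, can be sketched as follows; the function names are illustrative assumptions, not part of the disclosure.

```python
def table_engagement_score(opacity_pct):
    """Map an opacity percentage to the 0-10 engagement score of Table 1
    (0-9% -> 0, 10-19% -> 1, ..., 90-99% -> 9, 100% -> 10)."""
    if not 0 <= opacity_pct <= 100:
        raise ValueError("opacity must be between 0 and 100")
    return min(opacity_pct // 10, 10)


def scaled_engagement_likelihood(opacity_pct):
    """Directly scale an opacity percentage to an engagement likelihood
    percentage: a higher opacity yields a proportionally higher likelihood."""
    if not 0 <= opacity_pct <= 100:
        raise ValueError("opacity must be between 0 and 100")
    return float(opacity_pct)
```

The table lookup collapses opacity into eleven discrete scores, while the direct scaling preserves the full resolution of the opacity measurement.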
In some examples, the example score calculator 504 of fig. 5 considers data collected by, for example, the sensors 200 of the wearable media device 102 along with opacity characteristic data provided by the opacity obtainer 502. For example, the conversion 506 of fig. 5 includes one or more algorithms that combine the opacity characteristic data with additional or alternative data, such as sensor information (e.g., motion data, location data, facial expression data, etc.) generated by the sensors 200 of the wearable media device 102, to generate the engagement score. The example score calculator 504 may consider additional factors along with the opacity characteristic data to generate the engagement score.
In some examples, the score calculator 504 of fig. 5 combines calculations taken over multiple intervals. For example, the likelihoods of engagement calculated by the example score calculator 504 of fig. 5 may be combined (e.g., averaged) over a period of time spanning multiple frames of media to generate a total likelihood that the user is engaged with the media over that period of time. Detecting that the user is likely focusing on the media over multiple consecutive frames may indicate a higher likelihood of engagement with the displayed media, as opposed to an indication that the user frequently switches to, for example, a lower opacity. For example, the score calculator 504 may calculate a percentage representing a likelihood of engagement for each of twenty consecutive frames of media. In some examples, the score calculator 504 calculates an average of the twenty percentages and compares the average to one or more thresholds each indicative of an engagement score, as described above with respect to Table 1. Based on the comparison of the average to the one or more thresholds, the example score calculator 504 determines a likelihood or classification of engagement of the user over the time period corresponding to the twenty frames.
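For illustration only, the multi-interval combination described above (averaging per-frame likelihoods over, e.g., twenty consecutive frames and mapping the average to a Table 1-style score) might be sketched as follows; the function name is an illustrative assumption.

```python
def interval_engagement_score(per_frame_likelihoods):
    """Average per-frame engagement likelihood percentages (0-100) over an
    interval (e.g., twenty consecutive frames) and map the average to the
    0-10 score scale of Table 1."""
    if not per_frame_likelihoods:
        raise ValueError("at least one frame is required")
    mean = sum(per_frame_likelihoods) / len(per_frame_likelihoods)
    # Decile thresholds mirror Table 1: 0-9 -> 0, ..., 90-99 -> 9, 100 -> 10.
    return min(int(mean // 10), 10)
```

A user who holds the media fully opaque across the interval receives the maximum score, whereas frequent drops to low opacity pull the average, and thus the score, down.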
The example score calculator 504 of fig. 5 outputs the calculated score to the example time stamper 508. The example time stamper 508 of fig. 5 includes a clock and a calendar. The example time stamper 508 of fig. 5 associates a time period (e.g., Central Standard Time (CST) 1:00 am to CST 1:01 am) and a date (e.g., January 2, 2014) with each calculated engagement score by, for example, appending time period and date information to the end of the data. The data packets (e.g., engagement scores, opacities, and timestamps) are stored in memory 510. The example memory 510 of fig. 5 includes, for example, volatile memory (e.g., Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), Rambus Dynamic Random Access Memory (RDRAM), etc.) and/or non-volatile memory (e.g., flash memory). The memory 510 may include one or more Double Data Rate (DDR) memories (such as DDR, DDR2, DDR3, Mobile DDR (mDDR), etc.).
The example time stamper 508 of fig. 5 also receives data from the example media detector 512 and the example user identifier 514. The example media detector 512 of fig. 5 detects presentation of media on the wearable media device 102 and/or collects identification information associated with the detected presentation. In some examples, the media detector 512 includes an instrumented application that extracts, for example, a code and/or watermark embedded in media presented by the wearable media device 102. Audio watermarking is a technique for identifying media, such as television programs, radio broadcasts, advertisements, downloaded media, streaming media, prepackaged media, and the like. Existing audio watermarking techniques identify media by embedding one or more audio codes (e.g., one or more watermarks), such as media identification information and/or an identifier that may be mapped to media identification information, into an audio and/or video component of the media. In some examples, the audio or video component is selected to have signal characteristics sufficient to conceal the watermark. As used herein, the terms "code" and "watermark" are used interchangeably and are defined to refer to any identifying information (e.g., an identifier) that may be inserted or embedded in the audio or video of media (e.g., a program or advertisement) for the purpose of identifying the media or for another purpose, such as tuning (e.g., a packet identifying header). To identify watermarked media, the watermark is extracted and used to access a table of reference watermarks mapped to media identification information.
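For illustration only, the reference-watermark lookup described above might be sketched as a simple table access; the codes and media information below are illustrative assumptions, not actual reference data.

```python
# Hypothetical reference table mapping extracted watermark codes to media
# identification information.
REFERENCE_WATERMARKS = {
    "0xA1F3": {"title": "Example Program", "source": "Station 1"},
    "0x9B02": {"title": "Example Advertisement", "source": "Station 2"},
}


def identify_media(extracted_code):
    """Return media identification info for an extracted watermark code,
    or None when the code is not in the reference table."""
    return REFERENCE_WATERMARKS.get(extracted_code)
```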
Additionally or alternatively, the example media detector 512 of fig. 5 facilitates generation of a watermark and/or signature representative of media presented on the wearable media device 102. Unlike media monitoring techniques based on inclusion and/or embedding of codes and/or watermarks in the monitored media, fingerprint or signature based media monitoring techniques typically use one or more inherent characteristics of the monitored media during a monitoring interval to generate a substantially unique proxy for the media. Such a proxy is referred to as a signature or fingerprint and may take any form (e.g., a series of values, waveforms, etc.) that represents any aspect of a media signal (e.g., an audio and/or video signal that forms a monitored media presentation). A good signature is a signature that is repeatable when processing the same media presentation but unique with respect to other (e.g., different) presentations of other (e.g., different) media. Accordingly, the terms "fingerprint" and "signature" are used interchangeably herein, and are defined herein to refer to a proxy for identifying media that is generated based on one or more inherent characteristics of the media.
Signature-based media monitoring generally involves determining (e.g., generating and/or collecting) a signature representative of a media signal (e.g., an audio signal and/or a video signal) output by a monitored media device and comparing the monitored signature to one or more reference signatures corresponding to known (e.g., reference) media sources. Various comparison criteria (such as cross-correlation values, hamming distances, etc.) may be evaluated to determine whether the monitored signature matches a particular reference signature. When the monitored signature is found to match one of the reference signatures, the monitored media may be identified as corresponding to the particular reference media represented by the reference signature that matches the monitored signature. Since attributes (such as an identifier of the media, presentation time, channel, etc.) are collected for the reference signature, these attributes can then be associated with the monitored media for which the monitored signature matches the reference signature. Example systems for identifying media based on codes and/or signatures have been known for a long time and are first disclosed in Thomas U.S. patent No. 5,481,294, which is hereby incorporated by reference in its entirety.
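For illustration only, signature matching of the kind described above might be sketched with a Hamming-distance comparison; the bit-string signatures, reference table, and match threshold are illustrative assumptions rather than the claimed technique.

```python
def hamming_distance(a, b):
    """Count differing positions between two equal-length bit strings."""
    if len(a) != len(b):
        raise ValueError("signatures must be the same length")
    return sum(x != y for x, y in zip(a, b))


def match_signature(monitored, references, max_distance=2):
    """Return the media name of the closest reference signature within
    max_distance of the monitored signature, or None when no reference
    is close enough."""
    best_name, best_dist = None, max_distance + 1
    for name, ref in references.items():
        d = hamming_distance(monitored, ref)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name
```

A monitored signature within the distance threshold of a reference is identified as that reference media; an unmatched signature yields None, so attributes collected for the reference (identifier, presentation time, channel, etc.) are only associated with matched media.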
In some examples, the code/watermark is transmitted with and/or in association with the media as media identifying metadata. The media identifying metadata may be formatted in a textual or binary format (such as, for example, an ID3 tag). In some examples, the media identifying metadata includes data from a code/watermark, or the like. However, in some other examples, the media identifying metadata originates from and/or represents a code/watermark and/or signature or the like. Example methods and apparatus for transcoding watermarks to ID3 tags are disclosed in U.S. patent application No. 13/341,646, U.S. patent application No. 13/341,661, U.S. patent application No. 13/443,596, U.S. patent application No. 13/455,961, and U.S. patent application No. 13/472,170, which are hereby incorporated by reference in their entirety.
In the illustrated example of fig. 5, the detection function of the media detector 512 stores data associated with and/or representative of the collected information in, for example, the memory 510 and/or sends the collected monitoring information to the example media measurement entity 120 of fig. 1. In some examples, the wearable media device 102 includes additional or alternative monitoring functionality (e.g., local monitoring functionality and/or monitoring software other than that of the media detector 512). In some examples, the monitoring functions of the media detector 512 and/or other monitoring functions operating on the wearable media device 102 are referred to as "on-device meters." The example media detector 512 of fig. 5 provides media identification information to the example time stamper 508.
To determine the identity of the user of the wearable media device 102, the example meter 104 of fig. 5 includes a user identifier 514. The example user identifier 514 of fig. 5 determines an identity of a user based on user identification information stored in the memory 510 of the example wearable media device 102 in connection with, for example, registration of the wearable media device 102 and/or installation of the example meter 104 on the wearable media device 102. For example, when a user registers to participate in a monitoring panel associated with the media measurement entity 120 of fig. 1, the user is assigned an identifier (e.g., an alphanumeric string) stored on the wearable media device 102. In this case, the example user identifier 514 of fig. 5 references the stored identifier to obtain user identification information. Additionally or alternatively, the example user identifier 514 of fig. 5 uses login information provided when the user initiates (e.g., unlocks) a session with the wearable media device 102. In some examples, the user identifier 514 of fig. 5 uses any other suitable technique (e.g., facial recognition data provided by an application of the wearable media device 102) to identify the current user. The example user identifier 514 of fig. 5 provides user identification information to the example time stamper 508.
In the illustrated example of fig. 5, the output device 516 outputs data (e.g., media identification information, user identification information, engagement scores, etc.) from the memory 510 (e.g., via the network 108) to, for example, the media measurement entity 120 of fig. 1 on a periodic and/or aperiodic basis. In the illustrated example, the output device 516 uses the communication capabilities of the wearable media device 102 (e.g., the communication interface 202) to convey the information. In the illustrated example of fig. 5, the media measurement entity 120 (e.g., The Nielsen Company (US), LLC) uses data generated by the meter 104 to generate, for example, exposure information, such as engagement ratings, traditional exposure/audience composite ratings (e.g., Nielsen ratings), and so forth. Information from many meters can be compiled and analyzed to generate ratings indicative of media exposure across one or more populations of interest.
Although an example manner of implementing the meter 104 is illustrated in fig. 5, one or more of the elements, processes and/or devices illustrated in fig. 5 may be combined, divided, rearranged, omitted, eliminated and/or implemented in any other manner. Further, the example engagement level detector 500, the example opacity obtainer 502, the example score calculator 504, the example time stamper 508, the example media detector 512, the example user identifier 514, the example output device 516, and/or, more generally, the example meter 104 of fig. 5 may be implemented via hardware, software, firmware, and/or any combination of hardware, software, and/or firmware. Thus, for example, any of the example engagement level detector 500, the example opacity obtainer 502, the example score calculator 504, the example time stamper 508, the example media detector 512, the example user identifier 514, the example output device 516, and/or, more generally, the example meter 104 may be implemented via one or more analog or digital circuits, logic circuits, programmable processors, Application Specific Integrated Circuits (ASICs), Programmable Logic Devices (PLDs), and/or Field Programmable Logic Devices (FPLDs). When reading any of the patented device or system claims covering purely software and/or firmware implementations, at least one of the example engagement level detector 500, the example opacity obtainer 502, the example score calculator 504, the example time stamper 508, the example media detector 512, the example user identifier 514, the example output device 516, and/or, more generally, the example meter 104 of fig. 5 is expressly defined herein to include a tangible computer-readable storage device or storage disk (such as a memory, a Digital Versatile Disk (DVD), a Compact Disk (CD), a blu-ray disk, etc.) that stores the software and/or firmware. Still further, the example meter 104 of fig. 5 may include one or more elements, processes and/or devices in addition to or in place of those illustrated in fig. 5, and/or may include more than one of any or all of the illustrated elements, processes and devices.
A flowchart representative of example machine readable instructions to implement the example wearable media device 102 of fig. 1 and/or 2 is shown in fig. 6. A flowchart representative of example machine readable instructions to implement the example meter 104 of fig. 1, 2, and/or 5 is shown in fig. 7. A flowchart representative of example machine readable instructions to implement the example media measurement entity 120 of fig. 1 is shown in fig. 8. In these examples, the machine readable instructions comprise a program for execution by a processor (such as the processor 912 shown in the example processor platform 900 discussed below in connection with fig. 9). The program may be embodied in software stored on a tangible computer readable storage medium such as a CD-ROM, a floppy disk, a hard drive, a Digital Versatile Disk (DVD), a blu-ray disk, or a memory associated with the processor 912, but the entire program and/or parts of the program could alternatively be executed by a device other than the processor 912, and/or embodied in firmware or dedicated hardware. Further, while the example programs are described with reference to the flowcharts illustrated in figs. 6-8, many other methods of implementing the example wearable media device 102 of fig. 1 and/or 2, the example meter 104 of fig. 1, 2, and/or 5, and/or the example media measurement entity 120 of fig. 1 may alternatively be used. For example, the order of execution of the blocks may be changed, and/or some of the blocks may be changed, eliminated, or combined.
As described above, the example processes of figs. 6-8 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a tangible computer readable storage medium such as a hard disk drive, a flash memory, a Read Only Memory (ROM), a Compact Disc (CD), a Digital Versatile Disc (DVD), a cache, a Random Access Memory (RAM), and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended periods of time, permanently, brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term tangible computer readable storage medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media. As used herein, "tangible computer-readable storage medium" and "tangible machine-readable storage medium" are used interchangeably. Additionally or alternatively, the example processes of figs. 6-8 may be implemented using coded instructions (e.g., computer and/or machine readable instructions) stored on a non-transitory computer and/or machine readable medium such as a hard disk drive, a flash memory, a read-only memory, a compact disk, a digital versatile disk, a cache, a random-access memory, and/or any other storage device or storage disk in which information is stored for any duration (e.g., for extended periods of time, permanently, brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable storage medium is expressly defined to include any type of computer readable storage device and/or storage disk and to exclude propagating signals and transmission media. As used herein, the phrase "at least" when used as a transitional term in the preamble of a claim is open-ended in the same way that the term "comprising" is open-ended.
Fig. 6 begins when the wearable media device 102 of fig. 1 is activated (e.g., placed on the user's head and turned on) (block 600). Inputs and outputs of the wearable media device 102, such as the sensors 200, the communication interface 202, and the manual opacity input 218 of fig. 2, are initiated via, for example, a basic input/output system (BIOS) (block 602). When a media presentation is initiated or triggered on the wearable media device 102 (block 604), the example display generator 212 of fig. 2 references the display settings 216 to determine a current opacity characteristic for displaying media on the display surface 210 (block 606). As described above, the display settings 216 receive data from, for example, the manual opacity input 218 and/or the eye tracker 208. In the illustrated example, the manual opacity input 218 is prioritized (e.g., has override authority) over instructions provided by the eye tracker 208. With the opacity information from the display settings 216, the example display generator 212 displays the media on the display surface 210 at the corresponding opacity level (block 608). If the media presentation is complete (block 610), control returns to block 604. Otherwise, the display generator 212 continues to reference the display settings 216 for the opacity information and to display the media on the display surface 210 accordingly (blocks 606 and 608).
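For illustration only, the loop of blocks 604-610 might be sketched as follows, with the display settings 216 modeled as a callable returning the current opacity; all names are illustrative assumptions, not part of the disclosure.

```python
def run_presentation(frames, get_opacity):
    """For each frame of an active media presentation, look up the current
    opacity (block 606) and display the frame at that opacity (block 608);
    the loop ends when the presentation is complete (block 610)."""
    shown = []
    for frame in frames:
        # In the device, the display generator 212 would render here; this
        # sketch simply records what would be displayed and at what opacity.
        shown.append((frame, get_opacity()))
    return shown
```

Because the opacity is re-read for every frame, a manual adjustment or eye-tracker update made mid-presentation takes effect on the next displayed frame.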
Fig. 7 begins when the example meter 104 of fig. 1, 2, and/or 5 is triggered to generate an engagement score (block 700). In some examples, the meter 104 is scheduled to generate the engagement score periodically (e.g., every 2 seconds, every 10 seconds, every minute, etc.). Additionally or alternatively, the example meter 104 is triggered in response to, for example, an initiated media presentation on the wearable media device 102. In the example of fig. 7, the user identifier 514 determines the identity of the user by, for example, requesting identification information from the user and/or referencing one or more sources of user identification information on the wearable media device 102 (block 702). Additional or alternative techniques for obtaining user identification information may be employed (such as, for example, deriving an identity based on a serial number associated with the wearable media device 102, deriving an identity based on a phone number associated with the wearable media device 102, deriving an identity based on a hardware address of the wearable media device 102 (e.g., a Media Access Control (MAC) address of the wearable media device 102), etc.). The example media detector 512 obtains media identification information representative of the media being displayed on the wearable media device 102 (block 704).
The example opacity obtainer 502 of the example engagement level detector 500 of fig. 5 obtains a current opacity characteristic (e.g., degree or percentage) of the currently displayed media (block 706). The opacity obtainer 502 obtains the opacity characteristic from, for example, the display settings 216, the manual opacity input 218, and/or the eye tracker 208 of the wearable media device 102. The score calculator 504 receives the opacity characteristic data from the opacity obtainer 502 and uses the opacity characteristic data to generate an engagement score for the currently displayed media (block 708). For example, the score calculator 504 uses the example conversion 506 to scale the opacity to an engagement score (e.g., the likelihood that the user is paying attention to the media displayed on the wearable media device 102).
The example meter 104 generates (e.g., via the output device 516) a package of information including at least the user identification information, the media identification information, and the engagement score (block 710). The meter 104 uses the communication interface 202 of the wearable media device 102 to deliver the package to the media measurement entity (block 712). Thus, the media measurement entity is made aware of the identity of the user, the identity of the media displayed on the wearable media device 102, and the score representing the identified user's engagement (or non-engagement) with the identified media. The example of fig. 7 then ends (block 714).
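For illustration only, the package of block 710 might be sketched as a simple record combining the identifiers, the opacity, and the derived score; the function and field names (and the Table 1-style decile mapping) are illustrative assumptions.

```python
def build_monitoring_packet(user_id, media_id, opacity_pct, timestamp):
    """Assemble the record that the meter conveys to the media
    measurement entity: user identity, media identity, and the
    engagement score derived from opacity (Table 1 scale)."""
    score = min(opacity_pct // 10, 10)  # Table 1-style decile mapping
    return {
        "user": user_id,
        "media": media_id,
        "opacity": opacity_pct,
        "engagement_score": score,
        "timestamp": timestamp,
    }
```

For instance, a packet built for an 87% opacity presentation would carry an engagement score of 8 alongside the user and media identifiers.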
Fig. 8 is a flowchart representative of example machine readable instructions that may be executed to implement the media measurement entity 120 of fig. 1. The example of fig. 8 begins when the SDK provider 122 of the media measurement entity 120 provides an SDK to an application developer, such as, for example, the media provider 106 and/or a developer associated with an application store (e.g., Apple iTunes, Google Play, etc.) (block 800). The SDK provided by the example SDK provider 122 enables the receiving application developer to create, for example, the meter 104 and/or integrate the meter 104 into one or more applications. In the illustrated example, the meter 104 of fig. 1, 2, and/or 5 is provided via the provided SDK. However, the meter 104 of fig. 1, 2, and/or 5 may be provided via, for example, an API, a programming library, a Dynamic Link Library (DLL), a plug-in, an add-on, etc. In some examples, the meter 104 is provided directly to the wearable media device 102 via, for example, a website, a mailed optical disc, or the like. In some examples, the meter 104 is provided to a wearable media device manufacturer and/or an intermediary. In examples where the meter 104 is provided to a wearable media device manufacturer, the wearable media device manufacturer may design (e.g., develop, produce, manufacture, etc.) the wearable media device 102 with the meter 104 as an integrated component. In examples where the meter 104 is provided to an intermediary, the intermediary may adapt (e.g., modify, change, alter, etc.) the wearable media device 102 to include the meter 104 at or before the time of sale of the wearable media device 102 to a retailer and/or an end user (e.g., a customer).
The example media measurement entity 120 receives demographic information from a user of the wearable media device 102 in connection with, for example, installation of the meter 104 on the wearable media device 102 and/or registration with a panel associated with the media measurement entity 120 (block 802). In the illustrated example, the media measurement entity 120 assigns an identifier to the user (block 804). In some examples, the identifier is generated based on the demographic information. The identifier is then stored in a memory on the wearable media device 102 and/or in the data store 128 of the media measurement entity 120. In the illustrated example, the media measurement entity 120 begins collecting monitoring data such as, for example, media identification information (e.g., media identification metadata, codes, signatures, watermarks, and/or other information that may be used to identify the presented media), user identification information, time and/or duration of use, engagement scores, and/or demographic information (block 806). The example of fig. 8 then ends (block 808).
Fig. 9 is a block diagram of an example processor platform 900 capable of executing the instructions of fig. 6 to implement the example wearable media device 102 of fig. 1 and/or 2, capable of executing the instructions of fig. 7 to implement the example meter 104 of fig. 1, 2, and/or 5, and/or capable of executing the instructions of fig. 8 to implement the example media measurement entity 120 of fig. 1. The processor platform 900 may be, for example, a server, a personal computer, a mobile device (e.g., a cellular phone, a smart phone, a tablet such as an iPad™), a wearable media device (e.g., Google Glass™), an Internet appliance, or any other type of computing device.
The processor platform 900 of the illustrated example includes a processor 912. The processor 912 of the illustrated example is hardware. For example, the processor 912 may be implemented via one or more integrated circuits, logic circuits, microprocessors, or controllers from any desired family or manufacturer.
The processor 912 of the illustrated example includes local memory 913 (e.g., a cache). The processor 912 of the illustrated example communicates with a main memory including a volatile memory 914 and a non-volatile memory 916 via a bus 918. The volatile memory 914 may be implemented via Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), Rambus Dynamic Random Access Memory (RDRAM), and/or any other type of random access memory device. The non-volatile memory 916 may be implemented via flash memory and/or any other desired type of storage device. Access to the main memory 914, 916 is controlled by a memory controller.
The processor platform 900 of the illustrated example also includes interface circuitry 920. The interface circuit 920 may be implemented via any type of interface standard, such as an ethernet interface, a Universal Serial Bus (USB), and/or a PCI express interface.
In the illustrated example, one or more input devices 922 are connected to the interface circuit 920. An input device 922 allows a user to enter data and commands into the processor 912. The input device may be implemented via, for example, an audio sensor, a microphone, a still or video camera, a keyboard, a button, a mouse, a touch screen, a touch pad, a track ball, an isopoint, and/or a voice recognition system.
One or more output devices 924 are also connected to the interface circuit 920 of the illustrated example. The output devices 924 may be implemented, for example, via display devices (e.g., a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a liquid crystal display, a cathode ray tube (CRT) display, a touch screen, and/or a tactile output device), a printer, and/or speakers. Thus, the interface circuit 920 of the illustrated example typically includes a graphics driver card, a graphics driver chip, or a graphics driver processor.
The interface circuit 920 of the illustrated example also includes a communication device (such as a transmitter, receiver, transceiver, modem, and/or network interface card) that facilitates exchange of data with external machines (e.g., any kind of computing device) via a network 926 (e.g., an ethernet connection, a Digital Subscriber Line (DSL), a telephone line, a coaxial cable, a cellular telephone system, etc.).
The processor platform 900 of the illustrated example also includes one or more mass storage devices 928 for storing software and/or data. Examples of such mass storage devices 928 include floppy disk drives, hard disk drives, optical disk drives, blu-ray disk drives, RAID systems, and Digital Versatile Disk (DVD) drives.
The encoded instructions 932 of fig. 6, 7, and/or 8 may be stored in the mass storage device 928, in the volatile memory 914, in the non-volatile memory 916, and/or on a removable tangible computer-readable storage medium, such as a CD or DVD.
Although certain example methods, apparatus, and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, apparatus, and articles of manufacture fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.
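The core scoring technique of the claims, in which a more opaque media presentation yields a higher engagement score (see claim 12), might be sketched as follows. The linear mapping and the input validation are assumptions for illustration; the claims only require that a greater opacity produce a greater engagement score.

```python
def engagement_score(opacity):
    """Map a display opacity in [0.0, 1.0] to an engagement score.

    A fully opaque presentation (the user is likely focused on the
    media) scores highest; a mostly transparent one (the user is
    likely looking through the display at the surroundings) scores
    lowest.  The identity mapping used here is an assumption; any
    monotonically increasing function would satisfy claim 12.
    """
    if not 0.0 <= opacity <= 1.0:
        raise ValueError("opacity must be in [0.0, 1.0]")
    return opacity  # monotonic: greater opacity -> greater score
```

Under this sketch, an opacity read from an opacity input (claim 13) or set via automatic eye tracking (claim 14) feeds the same scoring function, so the two opacity sources are interchangeable from the meter's point of view.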
Claims (20)
1. A method, comprising the steps of:
determining an opacity of a media presentation displayed on a wearable media device; and
calculating an engagement score for media presented via the media presentation based on the opacity.
2. The method of claim 1, wherein the engagement score represents a likelihood that a user of the wearable media device is paying attention to the media presentation.
3. The method of claim 1, wherein determining the opacity comprises determining a setting associated with a display device of the wearable media device.
4. The method of claim 3, wherein the setting is based on input provided by a user to the wearable media device.
5. The method of claim 3, wherein the setting is based on automatic tracking of an eye position of a user of the wearable media device relative to the media presentation.
6. The method of claim 1, further comprising the steps of: media identification information associated with the media is collected.
7. The method of claim 6, further comprising the steps of: generating a record including the media identification information and the engagement score.
8. The method of claim 7, further comprising the steps of: sending the record to a media measurement entity.
9. The method of claim 1, further comprising the steps of: maintaining conversion information to be used in calculating the engagement score.
10. An apparatus, comprising:
glasses worn by a user, the glasses comprising a display for displaying media in front of the user's eyes;
a display generator that generates a media presentation on the display according to an opacity setting, the opacity setting indicating a degree of opacity of the media presentation; and
a meter that generates an engagement score based on the opacity of the media presentation.
11. An apparatus as defined in claim 10, wherein the meter is to associate the engagement score with media identification information representative of media associated with the media presentation.
12. The apparatus of claim 10, wherein the meter:
assign a first value to the engagement score when the opacity is a first degree of opacity; and
assign a second value greater than the first value to the engagement score when the opacity is a second degree of opacity greater than the first degree of opacity.
13. The apparatus of claim 10, further comprising an opacity input selectable by a user to set the opacity setting, the meter to obtain the opacity via an interface with the opacity input.
14. The apparatus of claim 10, further comprising an eye tracker to set the opacity setting, the meter to obtain the opacity via an interface with the eye tracker.
15. The apparatus of claim 10, further comprising a sensor that collects motion data associated with the apparatus, wherein the meter generates the engagement score based on a combination of the opacity and the motion data.
16. A tangible computer readable storage medium storing instructions that, when executed, cause a machine to at least:
retrieving an opacity value corresponding to media displayed on the wearable media device; and is
calculating an engagement score for the media based on the opacity value.
17. The storage medium of claim 16, wherein the engagement score represents a likelihood that a user of the wearable media device is paying attention to the media presentation.
18. A storage medium as defined in claim 16, wherein the instructions are to cause the machine to determine the opacity value by reference to a setting associated with a display of the wearable media device.
19. A storage medium as defined in claim 16, wherein the instructions are to cause the machine to collect media identification information associated with the media.
20. A storage medium as defined in claim 19, wherein the instructions are to cause the machine to generate a record including the media identification information and the engagement score.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US61/923,859 | 2014-01-06 | ||
| US14/250,068 | 2014-04-10 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1228053A1 true HK1228053A1 (en) | 2017-10-27 |
| HK1228053B HK1228053B (en) | 2020-04-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6265572B2 (en) | Method and apparatus for detecting involvement with media presented at a wearable media device | |
| US12425688B2 (en) | Methods and apparatus to determine engagement levels of audience members | |
| US11423437B2 (en) | Methods and apparatus to detect advertisements embedded in online media | |
| JP2023036898A (en) | Systems and methods for evaluating audience engagement | |
| JP6179907B2 (en) | Method and apparatus for monitoring media presentation | |
| KR101850101B1 (en) | Method for providing advertising using eye-gaze | |
| US20140026156A1 (en) | Determining User Interest Through Detected Physical Indicia | |
| CN105230034A (en) | Method and apparatus for identifying accompanying media interactions | |
| US20210160569A1 (en) | Methods and apparatus to generate reference signatures | |
| US11997351B2 (en) | Mobile device attention detection | |
| JP7785684B2 (en) | SYSTEM AND METHOD FOR COLLECTING DATA FROM USER DEVICES - Patent application | |
| HK1228053A1 (en) | Methods and apparatus to detect engagement with media presented on wearable media devices | |
| HK1228053B (en) | Methods and apparatus to detect engagement with media presented on wearable media devices | |
| US11689765B1 (en) | Methods and apparatus for obfuscated audience identification | |
| US20140172580A1 (en) | System and method for providing information on advertised goods |