
HK1232628A1 - Concurrent use of multiple user interface devices - Google Patents


Info

Publication number
HK1232628A1
Authority
HK
Hong Kong
Prior art keywords
user interface
interface device
content
media content
display
Prior art date
Application number
HK17106125.6A
Other languages
Chinese (zh)
Inventor
B. Lanier
J. M. Barton
Original Assignee
TiVo Solutions Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TiVo Solutions Inc.
Publication of HK1232628A1


Description

Simultaneous use of multiple user interface devices
The present application is a divisional application of the patent application filed January 18, 2011, with application number 201180022278.3, entitled "Simultaneous use of multiple user interface devices".
Technical Field
The present invention relates to user interface devices. In particular, the present invention relates to the simultaneous use of multiple user interface devices.
Background
The methods described in this section are methods that could be performed, but are not necessarily methods that have been previously conceived or performed. Therefore, unless otherwise indicated, the approaches described in this section should not be assumed to qualify as prior art merely by virtue of their inclusion in this section.
Generally, user interface systems found in televisions, laptops, tablets, telephones, kiosks, or most other devices include a display screen and an interactive interface. The interactive interface may include physical control buttons (e.g., buttons on a remote controller, a mouse, a joystick, a keyboard, etc.).
As shown in FIG. 1, in some systems, a touch screen remote control (10) may be used to operate a media device (e.g., a video cassette recorder (VCR) (20)) that outputs media content (35) displayed on a separate display screen (30). The remote control (10) runs its own operating system and displays a touch screen menu (15) that is specifically designed for, and displayed exclusively on, the remote control (10). The communicatively connected media device (20) receives user commands submitted to the remote control (10) and displays media content (35) (e.g., a movie or a show) based on the user commands selected on the remote control (10).
Drawings
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
FIG. 1 is a block diagram illustrating a prior art system;
FIG. 2A is a block diagram illustrating an exemplary system in accordance with one or more embodiments;
FIG. 2B is a block diagram illustrating an exemplary user interface device in accordance with one or more embodiments;
FIGS. 3A-3G illustrate exemplary screen shots in accordance with one or more embodiments;
FIG. 4 shows a block diagram illustrating a system in which embodiments of the invention may be implemented.
Detailed Description
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the present invention.
Some features are described below, each of which can be used independently of the other features or in any combination with the other features. However, any individual feature may not solve any of the problems described above, or may only solve one of the problems described above. Some of the above problems may not be adequately addressed by any of the features described herein. Although headings are provided, information regarding a particular heading that does not appear in the section having that heading may also be found elsewhere in the specification.
Exemplary features are described according to the following outline:
1.0 Functional Overview
2.0 System Architecture
3.1 Content Management-Similar and Non-Similar Displays
3.2 Content Management-Spatial and Temporal Spanning
3.3 Content Management-Alternative Content
3.4 Content Management-Informational Content/Sponsored Content
3.5 Content Management-Alerts/Notifications
3.6 Content Management-Time Delay
3.7 Content Management-Image Layer Selection
3.8 Content Management-Audio
3.9 Content Management-Multiple Secondary User Interface Devices
4.0 User Interface Device-Exemplary Implementation
5.0 Command Execution-Exemplary Implementation
6.0 Hardware Overview
7.0 Extensions and Substitutions
1.0 Functional Overview
In an embodiment, a method comprises: transmitting multimedia content for display on a first user interface device; concurrently transmitting related content for display on a second user interface device, at least a portion of the multimedia content and at least a portion of the related content being similar or identical; obtaining user input received at the second user interface device; performing an operation related to the multimedia content displayed on the first user interface device based on the user input received at the second user interface device; wherein the method is performed by at least one device comprising a processor.
Transmitting the multimedia content for display on the first user interface and simultaneously transmitting the related content for display on the second user interface may be performed by the same device.
The method may include selecting a spatial portion of a frame in the multimedia content as a complete frame in the related content. The spatial portion of the frame in the multimedia content may be selected as a complete frame in the related content in response to detecting that the spatial portion includes a menu. The spatial portion of the frame in the multimedia content may also be selected as a complete frame in the related content based on the user input.
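The spatial-portion selection described above amounts to cropping a region of the primary frame (e.g., the region containing a detected menu) and presenting that region as the entire frame on the secondary device. A minimal sketch in Python; the `Frame` type and the pixel-grid representation are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    # A frame is modeled as a 2D grid of pixel values (list of rows).
    pixels: list

def crop_spatial_portion(frame, box):
    """Return the spatial portion of `frame` inside `box` as a new,
    complete frame for the secondary device.

    `box` is (x, y, width, height) in pixel coordinates.
    """
    x, y, w, h = box
    return Frame(pixels=[row[x:x + w] for row in frame.pixels[y:y + h]])

# A 4x4 frame; the detected "menu" occupies the 2x2 lower-right corner.
frame = Frame(pixels=[[c + 4 * r for c in range(4)] for r in range(4)])
menu_box = (2, 2, 2, 2)
secondary_frame = crop_spatial_portion(frame, menu_box)  # 2x2 full frame
```

The same crop could instead be driven by user input (a user-selected box) rather than menu detection, matching the alternative in the text above.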
The related content may include the multimedia content with additional content overlaid on it. The related content may include an alternative version of the multimedia content displayed on the first user interface device. The related content may include an advertisement for a product or service appearing in the multimedia content displayed on the first user interface device. The related content may include information describing one or more image attributes in the multimedia content displayed on the first user interface device.
In an embodiment, a method comprises: causing display of a first menu on a first user interface device; transmitting a second menu to a second user interface device for display on the second user interface device, at least a portion of the first menu and at least a portion of the second menu being similar or identical; obtaining a menu selection received at the second user interface device; performing an operation related to the multimedia content displayed on the first user interface device based on the menu selection received at the second user interface device; wherein the method is performed by at least one device comprising a processor. The first user interface device may be a non-touch screen interface and the second user interface device may be a touch screen interface.
In an embodiment, a method comprises: transmitting the multimedia content to the first user interface device and the second user interface device for simultaneous display; selecting one of the first user interface device and the second user interface device to display additional content; transmitting the additional content to the selected user interface device for display on the selected user interface device. The additional content may be overlaid on the multimedia content for display on the selected user interface device. The additional content may be displayed in response to receiving a request for the additional content on the selected user interface device.
In an embodiment, the additional content may include an alert notification. The alert notification may be displayed on the first user interface device while detailed information related to the alert notification is displayed on the second user interface device. The selecting step may include selecting the second user interface device, and the method may include, in response to displaying the alert notification on the second user interface device, displaying the alert notification on the first user interface device after a specified time interval in which no user input is received by the second user interface device.
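The escalation behavior above (show the alert on the secondary device first; repeat it on the primary only if no user input arrives within the interval) can be sketched with an injected clock. All class, method, and device names here are hypothetical, chosen for illustration:

```python
class AlertEscalator:
    """Show an alert on the secondary device; if no user input arrives
    within `timeout` seconds, escalate the alert to the primary device."""

    def __init__(self, primary, secondary, timeout=10.0):
        self.primary, self.secondary = primary, secondary
        self.timeout = timeout
        self.shown_at = None       # time the alert appeared on the secondary
        self.acknowledged = False

    def show_alert(self, alert, now):
        self.alert = alert
        self.secondary.show(alert)
        self.shown_at = now
        self.acknowledged = False

    def on_user_input(self):
        self.acknowledged = True   # any input on the secondary cancels escalation

    def tick(self, now):
        if (self.shown_at is not None and not self.acknowledged
                and now - self.shown_at >= self.timeout):
            self.primary.show(self.alert)  # no input in time: escalate
            self.shown_at = None           # escalate only once

class FakeDevice:
    def __init__(self): self.shown = []
    def show(self, alert): self.shown.append(alert)

primary, secondary = FakeDevice(), FakeDevice()
esc = AlertEscalator(primary, secondary, timeout=10.0)
esc.show_alert("incoming call", now=0.0)
esc.tick(now=5.0)    # within the interval: primary stays quiet
esc.tick(now=12.0)   # interval elapsed with no input: alert escalates
```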
One of the first user interface device and the second user interface device may be selected based on a type of the additional content or a source of the additional content.
In an embodiment, the method comprises: displaying video content comprising a plurality of frames, each frame overlaid with a menu on a first user interface device; displaying a single frame of a plurality of frames overlaid with the menu on a second user interface device while displaying the video content on the first user interface device; receiving user input at a second user interface device; based on the user input received at the second user interface device, performing an operation related to the video content displayed on the first user interface device. The video content and the single frame from the video content may be received by the first user interface device and the second user interface device, respectively, from a same content source.
In an embodiment, a method comprises: the media device transmitting multimedia content to a user interface for display of the multimedia content on a television; the media device transmitting the multimedia content to a cellular telephone for display of the multimedia content on the cellular telephone; the user interface and the cellular telephone simultaneously display the multimedia content received from the media device. At least one of a frame rate and a resolution for displaying the multimedia content on the user interface may be different from a frame rate and a resolution for displaying the multimedia content on the cellular phone.
In an embodiment, a method comprises: transmitting multimedia content to a first user interface device for display on the first user interface device; receiving a first message from a network server relating to a user while transmitting the multimedia content to the first user interface device; transmitting information related to the first message to a second user interface device for display on the second user interface device; wherein the method is performed by at least one device comprising a processor. The second user interface device may be a cellular telephone that displays a text message based on information associated with the first message. The method may further include receiving a second message from the second user interface device based on user input and transmitting information related to the second message to the network server.
In an embodiment, a method comprises: playing the multimedia content only on a first user interface device of the first user interface device and the second user interface device; detecting that the second user interface device has moved beyond a specified distance from the first user interface device; in response to detecting that the second user interface device has moved outside of the specified distance, playing the multimedia content on the second user interface device. The method may further comprise: in response to detecting that the second user interface device has moved outside of the specified distance, stopping playing the multimedia content on the first user interface device. The method may further include, after playing the multimedia content on the second user interface device, detecting that the second user interface device has moved back within a specified distance from the first user interface device; in response to detecting that the second user interface device has moved back within a specified distance from the first user interface device, stopping playing the multimedia content on the second user interface device.
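The distance-triggered handoff in this embodiment reduces to a small state machine driven by a distance measurement. How the distance is obtained (e.g., radio-frequency ranging, as mentioned elsewhere in the text) is outside this sketch, and the names are illustrative assumptions:

```python
class PlaybackRouter:
    """Hand playback between devices as the secondary device moves
    relative to the primary. Content initially plays only on the primary;
    it follows the secondary device when that device leaves the specified
    range, and returns to the primary when the device comes back."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.active = "primary"

    def update_distance(self, distance):
        if distance > self.threshold and self.active == "primary":
            self.active = "secondary"  # moved out of range: play there instead
        elif distance <= self.threshold and self.active == "secondary":
            self.active = "primary"    # moved back in range: stop the secondary
        return self.active

router = PlaybackRouter(threshold=5.0)
states = [router.update_distance(d) for d in (1.0, 7.5, 9.0, 2.0)]
```

A variant that keeps both devices playing (rather than stopping the primary) would simply skip the stop transition, matching the optional step described above.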
Although specific components are described herein as performing the steps of the method, in other embodiments factors or mechanisms that act on behalf of the specific components may perform the method steps. Further, while some aspects of the invention are discussed with respect to components on a system, the invention may be implemented with distributed components on multiple systems. Embodiments of the present invention also include any system comprising means for performing the method steps described herein. Embodiments of the present invention also include computer-readable media carrying instructions that, when executed, cause performance of the method steps described herein.
2.0 System Architecture
Although a particular computer architecture is described herein, other embodiments of the invention are suitable for any architecture that may be used to perform the functions described herein.
FIG. 2A illustrates an exemplary system in accordance with one or more embodiments. The content system (100) includes one or more media devices that function as content sources (110), user interface devices (115), and/or content management devices (130). Each of these components is presented to clarify the functionality described herein and may not be necessary to practice the invention. Also, components not shown in FIG. 2A may be used to perform the functionality described herein, and functionality performed by one component may instead be performed by another component.
In an embodiment, the content system (100) may include a media device that functions as a content source (110). The content source (110) generally represents any source of audio and/or visual content. Examples of the content source (110) include a digital video disc (DVD) player that reads data from a DVD, or a video tape recorder that reads data from a video tape. Other examples include a digital video recorder, a set-top box, a computer system, a media device, a local server, a network server, a data warehouse, a kiosk, a mobile device, or any other source of content. The content system (100) may also receive content from other content systems. The content system (100) may include one or more components that allow it to receive and/or transmit content, for example a network card, tuner, compressor, decompressor, modem, encryption device, decryption device, multiplexer, demultiplexer, receiver, or any other component involved in receiving or transmitting data. In embodiments, the content system (100) may receive and/or transmit content over wired and/or wireless connections. For example, the content system (100) may receive content on any suitable frequency of the electromagnetic spectrum, in a broadcast stream, a network stream (e.g., the internet, an intranet, a local area network), a Bluetooth signal, or an infrared signal, and may receive content from or transmit content to devices within or outside the content system (100) by any other suitable method.
In an embodiment, the content system (100) may include a media device that functions as a user interface device (115). The user interface device (115) generally represents any device with input and/or output means. The user interface device (115) may include one or more of the following: a display screen, touch screen interface, keypad, mouse, joystick, scanner, speaker, audio input, audio output, camera, and the like. Examples of user interface devices (115) include monitors, televisions, projectors, mobile device interfaces, kiosks, tablets, laptops, speakers, headphones, or any other device that can be used to receive and/or present audio content and/or visual content. In an embodiment, at least one user interface device (115) within the content system (100) may be configured to receive input from a user.
In an embodiment, the user interface device (115) may be configured to automatically detect a user interaction element in the display. For example, the secondary user interface device may be configured to display a menu that is simultaneously displayed on the primary user interface device. The secondary user interface device may automatically identify "buttons" within the menu as user interaction elements (e.g., by optical character recognition, button shape recognition, color-based detection, etc.). Alternatively, the secondary user interface device may obtain an identification of the displayed user interaction elements based on data identifying those elements. For example, x and y coordinates identifying the corners of a button on the display may be received. As another example, an image or signature of the button may be received separately and compared against the menu screen to identify a matching element; the matching element may then be identified as a user interaction element.
The user interaction element may be visually indicated to the user. For example, the user interaction element may be overlaid with a special shade, color, outline, or other suitable visual cue that identifies it as a user interaction element. The secondary user interface may then allow the user to select the automatically identified button. For example, on a touch screen secondary user interface, an automatically recognized button may be touched by the user to select it. Alternatively, a keypad (or other input device) on the secondary user interface may be used to select the automatically identified button displayed on the secondary user interface. In embodiments, the user interface device may detect a text entry field (e.g., by detecting a white box or cursor in the displayed content), a scroll bar (e.g., by detecting arrows at opposite ends of a vertical bar in the displayed content), a radio button (e.g., by detecting a set of vertically arranged circles or squares in the displayed content), or any other interface component that may be used to obtain user input. The secondary user interface may be configured to communicate information related to user input to another component of the content system (100), such as the content management device (130).
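As a toy illustration of locating an on-screen interaction element, a crude detector can compute the bounding box of pixels matching a target color (e.g., the white box of a text-entry field). A real system would use OCR, shape recognition, or the coordinate data mentioned above; everything in this sketch, including the pixel-grid model, is an assumption for illustration:

```python
def bounding_box(grid, color):
    """Return the bounding box (x, y, width, height) of all pixels in
    `grid` matching `color`, or None if no pixel matches. A crude
    stand-in for the button/text-field detection described above."""
    xs = [x for row in grid for x, px in enumerate(row) if px == color]
    ys = [y for y, row in enumerate(grid) for px in row if px == color]
    if not xs:
        return None
    return (min(xs), min(ys), max(xs) - min(xs) + 1, max(ys) - min(ys) + 1)

# 0 = background, 1 = pixels of a white "button" region
screen = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
box = bounding_box(screen, color=1)  # the button occupies (1, 1, 2, 2)
```

The resulting box is what the secondary device could outline (or bold) as a selectable element and hit-test against touch coordinates.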
In an embodiment, different types of user interface devices (115) may be used simultaneously in the content system (100). Different resolutions and/or frame rates may be used to display content on different user interface devices. In embodiments, for clarity, one user interface device may represent a primary user interface device herein, while another user interface device may be a secondary or auxiliary user interface device. However, the functionality described herein as pertaining to a particular user interface device may be applied to another user interface device. Accordingly, references to a primary user interface device or a secondary user interface device should not be construed as limiting the scope.
3.1 Content Management-Related Content
In an embodiment, the content system (100) may include a media device that functions as a content management device (130). The content management device (130) determines what content is to be played on each user interface device of a set of user interface devices. For example, the content management device (130) may be configured to simultaneously display media content on the primary user interface device and the secondary user interface device. As shown in exemplary FIG. 3A, media content delivered by the media device (310) may be displayed simultaneously on user interface device A (312) and user interface device B (314). The video content may be displayed by the content management device (130) at different resolutions and/or frame rates on different user interface devices. The content management device (130) may be a separate device or part of the media device (310) that outputs the media content.
In an embodiment, multimedia content may be displayed on user interface device A (312) while related content is displayed on user interface device B (314). The multimedia content displayed on user interface device A (312) may be simultaneously displayed on user interface device B (314) with minor modifications (i.e., as related content). For example, text or other designated content may be displayed at different sizes (relative to the underlying images or graphics) on different user interface devices. As shown in FIG. 3B, user interface device A (312) may display multimedia content overlaid with a menu (316a) for operating the media device (310). User interface device B (314) may be configured to display the same multimedia content overlaid with a larger version of the menu (316b) associated with the underlying media content. User interface device B (314) may be a touch screen that visually indicates that a menu button can be selected, for example by bolding the outline of the button.
In an embodiment, user input may be received on user interface device B (314) and operations may be performed with respect to multimedia content displayed on user interface device a (312). For example, while multimedia content is being displayed simultaneously on user interface device a (312) and user interface device B (314), the user may select pause or fast forward on a menu (316B) displayed on user interface device B (314). In response to receiving the user input, the multimedia content may pause or fast forward. The user input may be received to operate any of the devices (e.g., media devices, user interface devices, etc.). The input received on the user interface device may be for the user interface device, for multiple user interface devices, and/or for one or more media devices.
In an embodiment, only the automatically selected spatial portion of the media content is displayed on the secondary user interface device, while all of the media content is simultaneously displayed on the primary user interface device. For example, as shown in exemplary FIG. 3C, the content management device (130) may be configured to display a menu overlaid on media content on user interface device A (312). The content management device (130) may be further configured to display only the spatial portion (318) of the media content including the menu on the user interface device B (314). The menu may be generated by a media device (310) that provides media content for display, or may be generated by and correspond to another media device. For example, the media content may be received from a first content source (110), and the menu (e.g., with options regarding brightness, color, sharpness, etc.) may be generated by the primary user interface device and overlaid on the media content received from the first content source (110). The content management device (130) may be configured to obtain a menu generated by the primary user interface device and display the menu on a secondary user interface that includes functionality to receive input from a user.
In an embodiment, the primary user interface device is designed as a standard display remote from the user. For example, the primary user interface device may be a wall-mounted liquid crystal display. The secondary user interface device is designed as a mobile device (e.g., a cellular phone, tablet PC, laptop, or other suitable mobile device). In embodiments, the format of content on a secondary user interface (e.g., a mobile device) may change based on the distance from the primary user interface. For example, radio frequency or other suitable techniques may be used to determine the distance between the primary user interface device and the secondary user device. The text displayed on the primary user interface device may be visible to the user because the primary user interface may include a large display screen. Also, assuming that the user is within an estimated distance (e.g., 1 foot) from the secondary user interface device, an estimate of the distance between the user and the primary user interface device may be generated. Based on the estimated distance between the user and the primary user interface device, the size of the text viewed by the user on the primary user interface device may be determined. The text on the secondary user interface may then be zoomed in or out to produce the same visual effect for the user viewing the secondary user interface as when viewing the primary user interface device.
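The equal-visual-effect scaling described above follows from keeping the visual angle constant: under the small-angle approximation, apparent size is proportional to physical size divided by viewing distance, so text on the secondary device should scale linearly with its viewing distance. A sketch (the function name is an illustrative assumption; the one-foot secondary-distance estimate comes from the text):

```python
def scaled_text_size(primary_size, primary_distance, secondary_distance):
    """Return the text size for the secondary device so that it subtends
    the same visual angle as `primary_size` text on the primary display.

    Small-angle approximation: size / distance is held constant, so
    secondary_size = primary_size * (secondary_distance / primary_distance).
    Sizes and distances may be in any consistent units.
    """
    return primary_size * (secondary_distance / primary_distance)

# 60 mm tall text on a wall display viewed from 10 feet appears the same
# as 6 mm text on a handheld device held about 1 foot away.
size = scaled_text_size(primary_size=60.0, primary_distance=10.0,
                        secondary_distance=1.0)
```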
3.2 Content Management-Spatial and Temporal Spanning
In embodiments, the user may use the secondary user interface device to zoom in on any particular spatial portion of the content displayed on the primary user interface device. For example, as shown in FIG. 3D, user input may be used to select a spatial portion (320a) currently displayed on user interface device A (312). The user input selecting the spatial portion may be entered on user interface device B (314) or any other device. For example, the user may first select a magnification level on user interface device B (314), which magnifies the currently displayed version of the image from user interface device A (312). Thereafter, the user may slide a finger on user interface device B (314) to move the selected spatial portion (320a) of user interface device A (312) toward, for example, the upper left corner. As a result, a magnified view (320b) of the selected spatial portion (320a) is displayed on user interface device B (314).
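Zoom-and-pan on the secondary device reduces to computing which source rectangle of the primary image to display, clamped so the view never leaves the image bounds. An illustrative sketch; the function and parameter names are assumptions:

```python
def pan_zoom_region(full_w, full_h, zoom, center_x, center_y):
    """Return the source rectangle (x, y, w, h) of the primary image that
    the secondary device should display, given a zoom factor (>= 1) and
    the pan position the user dragged to. The view is clamped inside the
    full image so panning past an edge stops at that edge."""
    view_w, view_h = full_w / zoom, full_h / zoom
    x = min(max(center_x - view_w / 2, 0), full_w - view_w)
    y = min(max(center_y - view_h / 2, 0), full_h - view_h)
    return (x, y, view_w, view_h)

# 2x zoom panned toward the upper-left corner of a 1920x1080 image:
region = pan_zoom_region(1920, 1080, zoom=2.0, center_x=0, center_y=0)
```

On each finger-slide event the secondary device would recompute the region with an updated center and re-crop the incoming frame, while the primary device continues to show the full image.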
In an embodiment, two user interface devices may continue to display media content simultaneously, with the primary interface device displaying the entire media content and the secondary interface device displaying a spatial portion of the media content. Also, the secondary user interface device may be configured to pan to another spatial portion of the media content based on user input.
In an embodiment, the secondary user interface device may be used to temporally scan the media content being displayed on the primary user interface. For example, the content management device (130) may initially simultaneously and synchronously display multimedia content on the primary user interface device and the secondary user interface at a standard playback speed. The secondary user interface device may then be configured based on the user input to fast forward, rewind, or pause the play of the media content while the primary user interface device continues to play the multimedia content at the standard playback speed. The secondary user interface device may accordingly allow a particular user to comment on, skip, or otherwise modify playback on the secondary user interface device without interruption of the play of the multimedia content on the primary user interface device. In an embodiment, the secondary user interface device may further include a synchronization option that synchronizes the playing of the multimedia content on the secondary user interface device with the primary user interface device. For example, the secondary user interface device may begin receiving the exact same video stream from the media device as the first user interface device and display the frames in synchronization with the first user interface device. In another example, the second user interface device may simply resume playback of the multimedia content at a frame that is currently being displayed on the first user interface device.
In an embodiment, the video streams for the primary user interface device (which displays the standard video stream) and the secondary user interface device (which is used for temporal or spatial spanning) are received from a single media content source. For spatial spanning, the media device may output the same video stream to both the primary and the secondary user interface device; the primary user interface device displays the video stream as received, while the secondary user interface device is configured to display only a spatial portion of it. For temporal spanning, the media device may output the video stream to the secondary user interface device in advance, and the secondary user interface device buffers the video stream. The user may then navigate in time through the frames stored in the buffer using the secondary user interface device.
3.3 Content Management-Alternative Content
The content management device (130) may be configured to display and/or play different versions of multimedia content on different user interface devices (115). For example, during display of an R-rated movie, the content management device (130) may be configured to display a censored version of the R-rated movie on a primary user interface device viewable by all viewers, while simultaneously displaying an uncensored version of the R-rated movie on a secondary user interface device (115) (e.g., a handheld device intended for adult viewers).
In an embodiment, the content management device (130) may receive two different but related content streams for display on two user interface devices (115). In this case, the content management device (130) may simply be configured to display each content stream on the corresponding user interface device (115) simultaneously. In an embodiment, the content management device (130) may receive a single content stream for simultaneous display on multiple user interface devices. In this case, the content management device (130) may automatically censor the content for one user interface device without censoring the same content simultaneously displayed on a second user interface device.
In an embodiment, the content management device (130) may display the same video stream on multiple user interface devices simultaneously while playing different audio streams on the multiple user interface devices. For example, different user interface devices may play corresponding audio in different languages. In another example, one user interface device may play censored audio corresponding to a video stream, while another user interface device may play uncensored audio corresponding to the concurrently playing video stream.
3.4 Content Management-Informational Content/Sponsored Content
In an embodiment, the content management device (130) may be configured to display multimedia content on a primary interface device and related content on a secondary interface device. The related content may include actor information, plot synopses, scene information, geographic information, or any other information related to the multimedia content. In an embodiment, the content management device (130) may receive metadata with the multimedia content, and may be configured to display the multimedia content on one user interface device while presenting the metadata on another user interface device. The information displayed on the secondary interface device may be obtained by the content management device (130) in response to a user request for specific information. For example, a user may request identification of a structure, geographic context, or other information related to the media content being presented. As shown in exemplary FIG. 3E, user interface device A (312) may display content without any additional information, while user interface device B (314) may be annotated with additional information (322) about the structure displayed in the media content. Other information (e.g., actor information, producer information) may be displayed on user interface device B (314) without simultaneously displaying the media content (e.g., the image of the house) shown on user interface device A (312). The additional information may also include plot information (e.g., identifying characters, good or evil characteristics, scenes, etc.) that may help viewers (e.g., elderly people or children) follow a complex story line.
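Looking up the related information for the scene currently playing on the primary display can be sketched as a time-range query over metadata received with the multimedia content. The metadata entries below are made-up examples, and the mapping representation is an assumption for illustration:

```python
def annotations_for(timestamp, metadata):
    """Return the annotations to show on the secondary device for the
    scene currently playing on the primary device.

    `metadata` maps (start, end) time ranges in seconds to annotation
    strings (actor, location, plot notes, etc.); a timestamp matches a
    range when start <= timestamp < end.
    """
    return [note for (start, end), note in metadata.items()
            if start <= timestamp < end]

# Hypothetical metadata delivered alongside the multimedia content:
metadata = {
    (0, 120): "Opening scene: downtown exterior",
    (90, 240): "Character: the detective (introduced here)",
}
notes = annotations_for(100, metadata)  # both ranges cover t=100 s
```

As playback advances, the secondary device would re-query with the primary device's current timestamp and refresh its annotation overlay.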
In an embodiment, the information presented on the secondary user interface device may be sponsor information related to the multimedia content displayed on the primary media interface. The information simultaneously presented on the secondary user interface may be suggestions to the user for additional media content related to the multimedia content displayed on the primary user interface (e.g., similar genre, actors, director, producer, language, etc.). In embodiments, products and/or services that are distinctive in media content displayed on the primary user interface device may be displayed on the secondary user interface. For example, when a movie displayed on the primary user interface device is playing an actor using a particular cell phone, information related to the particular cell phone may be displayed on the secondary user interface device.
3.5 content management-alerts/notifications
In an embodiment, the content management device (130) may be configured to display a visual alert or play an audio alert on a first user interface device and not on a second user interface device. In embodiments, multiple secondary display devices may be configured to display the same content as displayed on the primary display device. Additionally, each of the multiple secondary display devices may display, to the user, an alert or notification that is associated with that particular secondary display device. For example, a user may first be viewing content on the primary display device. Thereafter, the user may receive a personalized notification or alert overlaid on the same content being displayed on a secondary display device (e.g., a tablet PC). To view the alert, or information related to the alert, the user may then switch to viewing the content on the secondary display device. The information related to the alert may be overlaid on the content in a transparent manner, allowing the user to view the content at the same time. Alternatively, the information related to the alert may be displayed in a second window while the original content is displayed in a first window.
In an embodiment, the content management device (130) may be configured to first display a visual alert or play an audio alert on a first user interface device, and to display the visual alert or play the audio alert on a second user interface device after a particular time period or other condition. For example, if the user does not respond to the alert, or otherwise indicate that the alert has been received, on the first user interface within a specified time period, the alert may be displayed or played on the second user interface.
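The timed escalation described above can be sketched as follows. The `AlertEscalator` class, the device names, and the `timeout_s` parameter are hypothetical names for illustration; the description specifies only "a particular time period or other condition":

```python
class AlertEscalator:
    """Sketch of timed alert escalation across two user interface devices."""

    def __init__(self, timeout_s=10.0):
        self.timeout_s = timeout_s
        self.alerts = {}  # alert_id -> (device shown on, time shown)
        self.log = []     # sequence of (device, alert_id) display events

    def raise_alert(self, alert_id, now):
        # The alert is first displayed only on the first user interface device.
        self.alerts[alert_id] = ("device_1", now)
        self.log.append(("device_1", alert_id))

    def acknowledge(self, alert_id):
        # A user response on the first device prevents escalation.
        self.alerts.pop(alert_id, None)

    def tick(self, now):
        # Unacknowledged alerts past the timeout escalate to the second device.
        for alert_id, (device, shown_at) in list(self.alerts.items()):
            if device == "device_1" and now - shown_at >= self.timeout_s:
                self.alerts[alert_id] = ("device_2", now)
                self.log.append(("device_2", alert_id))
```

Calling `tick()` periodically (e.g., from an event loop) moves any unacknowledged alert to the second device once the timeout elapses.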
In an embodiment, an alert/notification may be displayed on one user interface device and consumed on another user interface device. For example, as shown in FIG. 3E, an email alert icon (324a) indicating that an email has been received may be displayed on user interface device A (312). User interface device B (314) may then be used to display the email content (324b) (e.g., in response to user input or automatically). User interface device B (314) may simultaneously display the multimedia content while the user reads the email message (324b) on user interface device B (314).
3.6 content management-time delay
In an embodiment, a content management device (130) displays the same video stream on multiple user interface devices with a time delay between the displays. For example, the multimedia content may be displayed ten seconds earlier on the secondary media device than on the primary media device. An adult user may then provide input to censor or otherwise modify the content before it is displayed to all users on the primary media device. In an embodiment, a user may select the time delay used for displaying the video stream between different user interface devices (115). The amount of time delay may also be automatically selected by the content management device (130) based on a rating (e.g., General Audiences, Parental Guidance 13, etc.) of the media content.
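The rating-based delay selection and the resulting playback offset described above might look like the following sketch. The specific rating labels and delay values are illustrative assumptions, not from the description:

```python
# Hypothetical mapping from a content rating to a review delay in seconds.
RATING_DELAYS = {
    "G": 0,       # general audiences: no review window needed
    "PG": 5,
    "PG-13": 10,
    "R": 30,
}

def select_time_delay(rating, default=10):
    """Return the delay between the secondary and primary displays."""
    return RATING_DELAYS.get(rating, default)

def playback_positions(elapsed_s, delay_s):
    """Playback positions of the two displays at wall-clock time `elapsed_s`.

    The secondary device runs `delay_s` seconds ahead of the primary,
    giving an adult viewer a window in which to censor or modify content
    before it reaches the primary display.
    """
    secondary = elapsed_s
    primary = max(0, elapsed_s - delay_s)
    return primary, secondary
```

For example, with an "R" rating the primary display trails the secondary by thirty seconds, so any censoring input received in that window can take effect before the content is shown to all users.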
3.7 content management-image layer selection
In an embodiment, the content management device (130) may display a video stream overlaid with a menu on the primary user interface device while displaying a single frame of the video stream overlaid with the same menu on the secondary user interface device.
For example, the content management device (130) may first display a video stream simultaneously on the first user interface device and the second user interface device. In response to receiving user input at the second user interface device (or at another device), the content management device (130) may obtain a snapshot of the frame being displayed on both user interface devices at the time the user input was received. The content management device (130) may then be configured to display the snapshot of the frame, overlaid with a menu, on the secondary user interface device. Moreover, the content management device (130) may continue to display the original video stream on the primary user interface device without any change. Alternatively, the content management device (130) may continue to display the original video stream, overlaid with the menu, on the primary user interface device. The secondary user interface device displaying the single frame (e.g., the snapshot) may then allow the user to submit an input selecting an option from the menu.
In an embodiment, a content management device (130) displays multiple layers of visual content on a user interface device. For example, as shown in fig. 3G, the content management device (130) may display layer 1 (multimedia content) overlaid with layer 2 (menu) on the first user interface device (326). A menu may be overlaid on multimedia content by first loading a frame from the multimedia content into a frame buffer and overwriting a portion of the frame buffer with data associated with the menu. The content composited in the frame buffer may then be displayed on user interface device A (312). The content management device (130) may be further configured to display a single one of the available layers (e.g., layer 2 (328b)) on user interface device B (314). Accordingly, in the above example, only the menu or only the multimedia content may be displayed on the second user interface device while at least one other layer is simultaneously displayed on the first user interface device.
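The frame-buffer compositing described above — loading a frame into a buffer and overwriting a region with menu data — can be sketched as follows. Frames are modeled here as nested lists of pixel values; a real implementation would write into a hardware frame buffer:

```python
def composite_menu(frame, menu, x, y):
    """Overlay `menu` onto `frame` by overwriting part of a frame buffer.

    Returns the composited buffer; the original frame is left untouched
    so the un-overlaid layer remains available for the other device.
    """
    buffer = [row[:] for row in frame]      # load the frame into the buffer
    for dy, menu_row in enumerate(menu):    # overwrite a region with menu data
        for dx, pixel in enumerate(menu_row):
            buffer[y + dy][x + dx] = pixel
    return buffer
```

The primary device would then display the composited buffer (both layers), while the secondary device can display `menu` or `frame` alone as a single layer.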
In another example, a video stream may itself create the illusion of a still image overlaid on video content. One spatial portion (e.g., the bottom portion) of the video stream may include unchanging images (e.g., a DVD menu), while another spatial portion (e.g., the top portion) may include changing images (e.g., the playing of a movie scene). Although the video stream displayed on the media device is a single stream of video content, the appearance of a still image overlaid on video content may thus be created on the primary user interface device. In this example, a snapshot of a single frame from the video stream on the primary user interface device may be displayed on the secondary user interface device until user input is received.
3.8 content management-Audio
In an embodiment, the content management device (130) may manage sound differently for different user interface devices (115). For example, a default setting may route all sound associated with multimedia content displayed on both the primary user interface device and the secondary interface device to the primary user interface device. The secondary user interface device may be automatically activated when the secondary user interface device moves farther than a specified distance from the primary user interface device. For example, a user may be watching a television program on a primary user interface device (e.g., a wall-mounted plasma screen) when the phone rings in another room. The user may then leave the primary user interface device, taking the secondary user interface device (e.g., a mobile device) along to answer the call. When the secondary user interface device (which displays the same content simultaneously with the primary user interface device) is farther from the primary user interface device than the specified distance, audio (and/or video) is automatically activated on the secondary user interface device. In this example, the user can continue to watch and listen to the playing multimedia content in the other room. In an embodiment, video feeds may function in a similar manner. For example, when the secondary user interface device is farther from the primary user interface device than the specified distance, the secondary user interface device may display the video stream simultaneously with, or in place of, the primary user interface device. In an embodiment, when the secondary user interface device returns to within the specified distance of the primary user interface device, the video and/or audio may switch back to the primary user interface device.
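The distance-triggered handoff described above can be sketched as a routing decision. The threshold and hysteresis values are illustrative assumptions (the description specifies only "a specified distance"); the small hysteresis band keeps the audio from flapping between devices when the user hovers near the threshold:

```python
def route_audio(distance_m, threshold_m=8.0,
                currently_on_secondary=False, hysteresis_m=1.0):
    """Decide whether audio should play on the secondary device.

    Returns True when the secondary device should carry the audio,
    False when it should switch (or stay) on the primary device.
    """
    if currently_on_secondary:
        # Switch back to the primary device only once the secondary
        # device is clearly back inside the threshold distance.
        return distance_m > threshold_m - hysteresis_m
    return distance_m > threshold_m
```

Video handoff, which the description says may function in a similar manner, could reuse the same decision function.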
In an embodiment, the secondary user interface device may be used for audio output by default when the content system (100) is first turned on. For example, when a user turns on the content system (100) in a bedroom, the primary user interface device may display only video, with audio being played on the secondary user interface device. Thereafter, the user may provide input to the content management device (130) to turn on audio on the primary user interface device.
In an embodiment, the secondary user interface device may be used to provide additional audio streams. For example, the primary user interface device may play a video stream with corresponding audio. However, a particular user (e.g., a person with a hearing disability) may want a higher volume. That user may increase the volume of the audio played on the secondary user interface device to personalize the user experience. In an embodiment, multiple secondary user interfaces may be used simultaneously with the primary user interface, with each secondary user interface configured to play audio at a volume selected for that particular secondary user interface. A system using multiple secondary user interfaces with audio output thus allows each user to individually select a volume level.
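Per-device volume selection as described above can be sketched as a mixer that scales a shared sample stream by a gain chosen for each secondary device. The class and method names are hypothetical:

```python
class VolumeMixer:
    """Per-device volume scaling for a shared audio stream.

    Each secondary user interface plays the same samples, scaled by a
    gain selected for that particular device (1.0 = nominal volume).
    """

    def __init__(self):
        self.volumes = {}  # device id -> gain

    def set_volume(self, device, gain):
        self.volumes[device] = gain

    def render(self, device, samples):
        # Devices without an explicit setting play at nominal volume.
        gain = self.volumes.get(device, 1.0)
        return [s * gain for s in samples]
```

A user who wants louder audio raises only their own device's gain; the shared stream and every other device are unaffected.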
While this section presents specific examples related to audio content management, the techniques for visual content management described in other sections are applicable to audio content. Also, the examples related to audio content management described in the current section are applicable to visual content.
3.9 content management-multiple Secondary user interface devices
In embodiments, multiple secondary user interface devices may be used simultaneously with a primary user interface device. For example, in an embodiment, media content may be displayed on a primary user interface device and two or more secondary user interface devices. Users may perform functions related to the media content on their corresponding secondary user interface devices. For example, each user may zoom in on, move in time, move in space, request information about, or perform any other suitable function on the corresponding secondary user interface device without affecting the primary user interface device or the other user interface devices. Each user may personalize or otherwise configure the respective secondary user interface device for alerts, notifications, messages, and the like. Different secondary user interface devices may be configured differently and/or display different content in addition to the common content displayed on the primary user interface device.
In an embodiment, a primary user interface device and a plurality of secondary user interface devices may be used in a gaming environment. For example, the primary user interface device may display an environment that is visible to all participants, such as a combat environment in which multiple participants interact. The gaming environment displayed on the primary user interface may also present multiple first-person perspectives corresponding to different participants. In addition, a particular weapon, capability, function, asset, or any other gaming attribute for a particular participant may be displayed only on the secondary gaming device associated with that particular participant. In another example, the primary user interface may display cars racing through a city (which may include a separate perspective for each participant), while the secondary user interface for each participant may include special speed-enhancing tools, weapons, or other game features.
4.0 user interface device-exemplary implementation
In an embodiment, the user interface device described above is implemented as a remote control device communicatively connected to at least one component in the content system. In embodiments, the remote control device may be a cellular phone or other mobile device. In an embodiment, the remote control device may be implemented as any mobile device or handheld device with a touch screen interface. Although a particular structure for implementing a user interface device (e.g., a primary user interface device or a secondary user interface device) is described herein, any other structure may be used. Any particular components described herein should not be construed as limiting the scope of the user interface device.
As shown in fig. 2B, according to one or more embodiments, the remote control device (200) may be communicatively coupled to one or more media devices via wired and/or wireless connections. The remote control device (200) may communicate wirelessly via radio waves (e.g., Wi-Fi signals, Bluetooth signals), via infrared waves, via any other suitable frequency in the electromagnetic spectrum, via a network connection (e.g., intranet, Internet, etc.), or via any other suitable method.
In an embodiment, a remote control device (200) may include read-only memory (ROM) (206), a central processing unit (CPU) (208), random access memory (RAM) (210), an infrared control unit (212), a keypad scan (214), a keypad (216), non-volatile memory (NVM) (218), one or more microphones (224), gain control logic (220), an analog-to-digital converter (ADC) (222), a general-purpose input/output (GPIO) interface (226), a speaker/microphone (228), a key transmitter/indicator (230), a low-battery indicator (or output signal) (232), a microphone LED (234), a radio (236), an infrared (IR) blaster (238), a radio frequency (RF) antenna (240), a standard keypad or sliding keypad (not shown), an ambient noise cancellation device (not shown), and the like. Memory (e.g., ROM (206), RAM (210), or NVM (218)) on the remote control device (200) may include control codes and/or key codes for one or more media devices (e.g., media device A (100) or media device B (120)). The memory may also include a run-length-limited (RLL) waveform table.
In an embodiment, the low-battery indicator (232) may correspond to a visual indication (e.g., an LED light) on the remote control device (200) of a low battery level. In an embodiment, the low-battery indicator (232) may represent a signal output for display on a screen other than on the remote control device (200). In an embodiment, the low-battery code is sent with a standard command. For example, when a channel selection is made on the remote control device (200), the low-battery code is returned along with the channel selection command, and a low-battery signal is displayed on the display screen.
In an embodiment, the microphone (224) may be located anywhere on the remote control device (200) (e.g., one or more microphones (224) may be located at an end of the remote control device (200)). If multiple microphones are available and enabled, they may be used to obtain user input. In an embodiment, one of the multiple microphones may be used for noise cancellation/optimization operations. A single audio stream may be determined from the multiple input audio streams by the remote control device (200), or by a media device that receives the multiple audio streams from the remote control device (200).
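One simple way to determine a single audio stream from multiple microphone streams, as described above, is sample-wise averaging. The description leaves the combining method open, so averaging is only an illustrative choice:

```python
def mix_audio_streams(streams):
    """Reduce multiple microphone streams to a single stream by averaging.

    `streams` is a list of sample lists, one per microphone. Streams are
    truncated to the shortest length so every output sample is an average
    over all microphones.
    """
    if not streams:
        return []
    length = min(len(s) for s in streams)
    return [sum(s[i] for s in streams) / len(streams) for i in range(length)]
```

Either the remote control device or the receiving media device could run this step, matching the two placements the description allows.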
In an embodiment, the remote control device (200) may include a proximity sensor (not shown) to detect the presence of a user within a specified distance of the remote control device (200) (even before the user presses a button on the remote control device (200)). For example, the remote control device (200) may operate in a low power state until a user is detected. Upon detection of a user, the remote control device (200) may operate in either a normal power state or a high power state. The remote control device (200) may be configured to turn on the keypad lights upon detection of a user. In an embodiment, the proximity sensor may detect a user in the vicinity of the remote control device (200) based on capacitive coupling.
In an embodiment, the remote control device (200) includes one or more displays (242). The display may be a touch screen display comprising functionality to receive user input by a user touching the display screen. The display (242) may be used as a secondary display for a secondary interface device, such as a remote control device (200). The content on the display (242) may be related to content displayed on another display device (e.g., on a primary user interface). Both the content on the display (242) and the content on the primary user interface may be delivered from a single media device or media management device.
5.0 Command execution-exemplary implementation
One possible method of communication between the user interface device and other devices within the system is described in the following example. The set of devices, and the steps performed by that set of devices, should not be construed as limiting the scope, as other variations of the devices and steps may be implemented in other embodiments.
A command for operating a target media device is received at the user interface device. The user interface device requests information related to the command from a second media device. The user interface device may request information about the command itself. For example, the user interface device may request from the second media device the actual signal that corresponds to the command to be transmitted to the target media device. The user interface device may instead request a portion of the actual signal from the second media device. For example, the user interface device may request only a device code for the target media device, or other identification of the target media device, for use in a signal (e.g., an infrared signal) transmitted to the target media device.
The user interface device may request state information about the system maintained by the second media device. For example, the second media device may function as a management device and maintain current state information about the system. Examples of state information include currently displayed information, such as a displayed interface, a selection displayed to a user, media content being played, the media device providing input to the current display, a selected channel, and so on. The state information may include current configuration settings such as volume, brightness, hue, color, user preferences, and the like. The state information may also include media device information, such as recordings stored on a media device, a recording schedule, a viewing/recording history, and the like.
The second media device may transmit information related to the command to the user interface device. The second media device may communicate any of the information requested by the user interface device described above. For example, the second media device may transmit the code, or the actual signal for the target media device, for the user interface device to transmit to the target media device. The second media device may transmit any of the above without receiving a specific request for the information. For example, the second media device may periodically update the user interface device with state information. The second media device may also provide information to the user interface device in response to detecting a low usage level of the user interface device or of the second media device. For example, processor operations may be monitored over a period of time to determine a usage level; upon detecting that the usage level falls below a threshold, the second media device may communicate a state information update to the user interface device.
The user interface device may determine a signal to be sent to the target media device based on the command received from the user and/or the information received from the second media device. Determining the signal may be as simple as receiving all the information, including the signal itself, from the second media device. Determining the signal may include determining an operation code by accessing a locally stored table that maps received commands to operation codes. Determining the signal may also include combining the operation code with information, received from the second media device, identifying the target media device. Based on the current display and the received command (e.g., an up button or a select button), the user interface device may determine the signal to be transmitted to the target media device.
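The signal determination described above — a local command-to-opcode table combined with a device code received from the second media device — can be sketched as follows. The opcode values and the two-byte layout are assumptions for illustration; a real remote control protocol defines its own codes and framing:

```python
# Hypothetical operation-code table mapping user commands to opcodes.
OPCODES = {"up": 0x10, "down": 0x11, "select": 0x20}

def determine_signal(command, device_code):
    """Build the signal to transmit to the target media device.

    `device_code` is the target device identification received from the
    second media device; the opcode comes from the local lookup table.
    """
    opcode = OPCODES[command]            # local command -> opcode lookup
    return bytes([device_code, opcode])  # device id followed by opcode
```

The resulting bytes would then be modulated onto, e.g., an infrared carrier for transmission to the target media device.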
Another example may involve a user interface device that communicates directly with the target media device without interacting with other devices. Yet another example may involve the user interface device communicating information related to a command to a second media device, which then communicates with the target media device to perform a function based on the command.
6.0 hardware overview
According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. A special-purpose computing device may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field-programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general-purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination thereof. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. A special-purpose computing device may be a desktop computer system, portable computer system, handheld device, network device, or any other device that incorporates hard-wired and/or program logic to implement the techniques.
For example, FIG. 4 is a block diagram that illustrates a computer system 400 upon which an embodiment of the invention may be implemented. System 400 includes a bus 402 or other communication mechanism for communicating information, and a hardware processor 404 coupled with bus 402 for processing information. Hardware processor 404 may be, for example, a general purpose microprocessor.
System 400 also includes a main memory 406, such as a random access memory (RAM) or other dynamic storage device, coupled to bus 402 for storing information and instructions to be executed by processor 404. Main memory 406 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 404. Such instructions, when stored in a storage medium accessible to processor 404, render system 400 into a special-purpose machine that is customized to perform the operations specified in the instructions.
System 400 further includes a Read Only Memory (ROM)408 or other static storage device coupled to bus 402 for storing static information and instructions for processor 404. A storage device 410, such as a magnetic disk or optical disk, is provided and coupled to bus 402 for storing information and instructions.
System 400 may be coupled via bus 402 to a display 412, such as a Cathode Ray Tube (CRT), for displaying information to a computer user. An input device 414, including alphanumeric and other keys, is coupled to bus 402 for communicating information and command selections to processor 404. Another type of user input device is cursor control 416, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 404 and for controlling cursor movement on display 412. The input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
System 400 may implement the techniques described herein using custom hard-wired logic, one or more ASICs or FPGAs, firmware, and/or program logic which, in combination with the system, causes or programs system 400 to be a special-purpose machine. According to one embodiment, the techniques are performed by system 400 in response to processor 404 executing one or more sequences of one or more instructions contained in main memory 406. Such instructions may be read into main memory 406 from another storage medium, such as storage device 410. Execution of the sequences of instructions contained in main memory 406 causes processor 404 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term "storage medium" is used herein to refer to any medium that stores data and/or instructions that cause a machine to operate in a specific manner. Storage media may include non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 410. Volatile media includes dynamic memory, such as main memory 406. Common forms of storage media include, for example, a floppy disk, a flexible disk, a hard disk, a solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, or any other memory chip or cartridge.
A storage medium is different from but may be used in conjunction with a transmission medium. Transmission media participate in the transfer of information between storage media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 402. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 404 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to system 400 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 402. Bus 402 transfers data to main memory 406, from which processor 404 retrieves and executes instructions. The instructions received by main memory 406 may optionally be stored on storage device 410 either before or after execution by processor 404.
System 400 also includes a communication interface 418 coupled to bus 402. Communication interface 418 provides a two-way data communication coupling to a network link 420 that is connected to a local network 422. For example, communication interface 418 may be an Integrated Services Digital Network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 418 may be a Local Area Network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, communication interface 418 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.
Network link 420 typically provides data communication through one or more networks to other data devices. For example, network link 420 may provide a connection through local network 422 to a host computer 424 or to data equipment operated by an Internet Service Provider (ISP) 426. ISP 426 in turn provides data communication services through the worldwide packet data communication network now commonly referred to as the "Internet" 428. Local network 422 and Internet 428 both use electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks, and the signals on network link 420 and through communication interface 418, which carry the digital data to and from computer system 400, are exemplary forms of transmission media.
System 400 can send messages and receive data, including program code, through the network(s), network link 420 and communication interface 418. In the Internet example, a server 430 might transmit a requested code for an application program through Internet 428, ISP 426, local network 422 and communication interface 418.
The processor 404 may execute the received code as it is received and/or stored in the storage device 410 or other non-volatile storage for later execution.
7.0 extensions and substitutions
In the foregoing specification, embodiments of the invention have been described with reference to numerous specific details that may vary from implementation to implementation. Thus, the sole and exclusive indicator of what is the invention, and is intended by the applicants to be the invention, is the set of claims that issue from this application, in the specific form in which such claims issue, including any subsequent correction. Any definitions expressly set forth herein for terms contained in the claims shall govern the meaning of those terms as used in the claims. Hence, no limitation, element, property, feature, advantage, or attribute that is not expressly recited in a claim should limit the scope of that claim in any way. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (12)

1. A method, comprising:
transmitting media content for display on a first user interface device;
concurrently with transmitting the media content for display on the first user interface device, transmitting the media content for display on a second user interface device, the media content being displayed at the second user interface device earlier than at the first user interface device by a time delay;
receiving an input to modify a portion of the media content to be displayed on the first user interface device;
modifying the portion of the media content based on the input modifying the portion;
transmitting the modified portion for display on the first user interface device.
2. The method of claim 1, further comprising: automatically selecting the time delay based on a rating of the media content.
3. The method of claim 1, further comprising, while simultaneously transmitting the media content to the first user interface device and the second user interface device:
receiving an input to fast forward the media content;
fast forwarding a display of the media content on the second user interface device in response to the input for fast forwarding while the first user interface device continues to display the media content at a standard playback speed.
4. The method of claim 1, wherein modifying comprises censoring.
5. A method, comprising:
transmitting media content for display on a first user interface device;
concurrently with transmitting the media content for display on the first user interface device, transmitting the media content for display on a second user interface device;
receiving an input to fast forward the media content;
fast forwarding a display of the media content on the second user interface device in response to the input for fast forwarding while the first user interface device continues to display the media content at a standard playback speed.
6. The method of any of claims 1 to 5, wherein transmitting the media content for display on the second user interface device comprises, after fast forwarding, synchronizing display of the media content on the second user interface device with the first user interface device in response to selection of a synchronization option at the second user interface device.
7. The method of any of claims 1-5, wherein transmitting the media content to the second user interface device includes displaying a personalized notification overlaid on the media content for the second user interface device, wherein the personalized notification is not overlaid on the media content for the first user interface device.
8. The method of any of claims 1-5, wherein transmitting the media content to the second user interface device includes displaying related content overlaid on the media content, the related content including episode information, wherein the related content is not overlaid on the media content for the first user interface device.
9. The method of any of claims 1 to 5, further comprising: concurrently with transmitting the media content for display on the first user interface device, transmitting the media content for display on a third user interface device, the media content being offset in time relative to the first user interface device and the second user interface device.
10. The method of any of claims 1-5, wherein the first user interface device is a computing device configured to display a media content item on a television screen and the second user interface device is a mobile device communicatively connected to the computing device, wherein the method is performed by a separate content management device.
11. The method of any of claims 1-5, wherein the first user interface device is a computing device configured to display a media content item on a television screen and the second user interface device is a mobile device communicatively connected to the computing device, wherein the first user interface device includes a content management device configured to perform the method.
12. An apparatus comprising one or more memories and one or more processors, the apparatus configured to perform the method of any of claims 1-11.
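As an illustration only (not part of the claims or the patent text), the delayed dual-device flow of claims 1-4 can be sketched as follows. The class name `DelayedDualStream`, the method names, and the unit-based delay model are all hypothetical simplifications: the second (preview) device sees each content unit a fixed number of units before the first (main) device, leaving a window in which a flagged unit can be modified — here, censored — before the main device displays it.

```python
from collections import deque


class DelayedDualStream:
    """Hypothetical sketch of the claimed flow: the preview device
    receives each unit immediately, while the main device receives
    the same unit `delay` steps later, after optional modification."""

    def __init__(self, delay):
        self.delay = delay        # time delay, measured in content units
        self.buffer = deque()     # units shown on preview, not yet on main
        self.censored = set()     # unit indices flagged for modification

    def flag_for_censor(self, index):
        """Operator input: modify this unit before the main device shows it."""
        self.censored.add(index)

    def push(self, index, unit):
        """Transmit one unit; returns (preview_output, main_output).

        main_output is None while the unit is still inside the delay window.
        """
        self.buffer.append((index, unit))
        main_out = None
        if len(self.buffer) > self.delay:      # unit has aged past the window
            i, u = self.buffer.popleft()
            main_out = "[CENSORED]" if i in self.censored else u
        return unit, main_out
```

For example, with `delay=2`, an operator who sees an objectionable unit on the preview device has two further units of time to flag it before the main device would display it.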
HK17106125.6A 2010-01-25 2017-06-20 Concurrent use of multiple user interface devices HK1232628A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/693,410 2010-01-25

Publications (1)

Publication Number Publication Date
HK1232628A1 true HK1232628A1 (en) 2018-01-12


Similar Documents

Publication Publication Date Title
US10469891B2 (en) Playing multimedia content on multiple devices
US20110181780A1 (en) Displaying Content on Detected Devices
EP2801208B1 (en) Method and system for synchronising content on a second screen
JP6231524B2 (en) System and method for providing media guidance application functionality using a wireless communication device
US9088814B2 (en) Image display method and apparatus
KR20130132886A (en) Method and system for providing additional content related to a displayed content
US20120161928A1 (en) Display Apparatus, Remote Controller and Associated Display System
US11928381B2 (en) Display device and operating method thereof
HK1232628A1 (en) Concurrent use of multiple user interface devices
HK1231977A1 (en) Concurrent use of multiple user interface devices
HK40007163A (en) Concurrent use of multiple user interface devices
HK1181912B (en) Concurrent use of multiple user interface devices
KR20240169024A (en) Display device
KR20240103534A (en) Display device
CN115086722A (en) Method for displaying content of auxiliary screen and display equipment