JP2021002399A - Device for sharing user interactions - Google Patents
Device for sharing user interactions
- Publication number
- JP2021002399A (application JP2020167593A)
- Authority
- JP
- Japan
- Prior art keywords
- content
- user
- gesture
- users
- hand
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
- 238000000034 method Methods 0.000 claims abstract description 35
- 230000005540 biological transmission Effects 0.000 claims description 4
- 230000033001 locomotion Effects 0.000 description 12
- 230000003993 interaction Effects 0.000 description 10
- 230000002457 bidirectional effect Effects 0.000 description 9
- 238000004891 communication Methods 0.000 description 9
- 238000010586 diagram Methods 0.000 description 9
- 238000012545 processing Methods 0.000 description 9
- 238000004590 computer program Methods 0.000 description 8
- 230000008569 process Effects 0.000 description 8
- 230000000694 effects Effects 0.000 description 5
- 230000006870 function Effects 0.000 description 5
- 238000006243 chemical reaction Methods 0.000 description 4
- 230000001419 dependent effect Effects 0.000 description 4
- 238000013519 translation Methods 0.000 description 4
- 239000011521 glass Substances 0.000 description 3
- 230000001953 sensory effect Effects 0.000 description 3
- 230000000007 visual effect Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 238000003491 array Methods 0.000 description 2
- 238000010009 beating Methods 0.000 description 2
- 230000008901 benefit Effects 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000003278 mimic effect Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 238000013515 script Methods 0.000 description 2
- 238000000926 separation method Methods 0.000 description 2
- 238000012546 transfer Methods 0.000 description 2
- 230000002776 aggregation Effects 0.000 description 1
- 238000004220 aggregation Methods 0.000 description 1
- 230000007613 environmental effect Effects 0.000 description 1
- 239000012634 fragment Substances 0.000 description 1
- 239000010410 layer Substances 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 239000002184 metal Substances 0.000 description 1
- 239000004033 plastic Substances 0.000 description 1
- 230000001681 protective effect Effects 0.000 description 1
- 239000011241 protective layer Substances 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000001568 sexual effect Effects 0.000 description 1
- 230000026676 system process Effects 0.000 description 1
- 238000012360 testing method Methods 0.000 description 1
- 230000001755 vocal effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/04—Real-time or near real-time messaging, e.g. instant messaging [IM]
- H04L51/046—Interoperability with other network applications or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H3/00—Instruments in which the tones are generated by electromechanical means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/06—Message adaptation to terminal or network requirements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/06—Message adaptation to terminal or network requirements
- H04L51/063—Content adaptation, e.g. replacement of unsuitable content
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L51/00—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
- H04L51/07—User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
- H04L51/10—Multimedia information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/50—Centralised arrangements for answering calls; Centralised arrangements for recording messages for absent or busy subscribers ; Centralised arrangements for recording messages
- H04M3/53—Centralised arrangements for recording incoming messages, i.e. mailbox systems
- H04M3/533—Voice mail systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1041—Mechanical or electronic switches, or control elements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F15/00—Digital computers in general; Data processing equipment in general
- G06F15/16—Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/1454—Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/096—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith using a touch screen
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/265—Key design details; Special characteristics of individual keys of a keyboard; Key-like musical input devices, e.g. finger sensors, pedals, potentiometers, selectors
- G10H2220/311—Key design details; Special characteristics of individual keys of a keyboard; Key-like musical input devices, e.g. finger sensors, pedals, potentiometers, selectors with controlled tactile or haptic feedback effect; output interfaces therefor
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/391—Angle sensing for musical purposes, using data from a gyroscope, gyrometer or other angular velocity or angular movement sensing device
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/395—Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data, angle sensing with respect to the vertical, i.e. gravity sensing
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2230/00—General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
- G10H2230/005—Device type or category
- G10H2230/015—PDA [personal digital assistant] or palmtop computing devices used for musical purposes, e.g. portable music players, tablet computers, e-readers or smart phones in which mobile telephony functions need not be used
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2230/00—General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
- G10H2230/045—Special instrument [spint], i.e. mimicking the ergonomy, shape, sound or other characteristic of a specific acoustic musical instrument category
- G10H2230/251—Spint percussion, i.e. mimicking percussion instruments; Electrophonic musical instruments with percussion instrument features; Electrophonic aspects of acoustic percussion instruments or MIDI-like control therefor
- G10H2230/275—Spint drum
- G10H2230/281—Spint drum assembly, i.e. mimicking two or more drums or drumpads assembled on a common structure, e.g. drum kit
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/095—Identification code, e.g. ISWC for musical works; Identification dataset
- G10H2240/101—User identification
- G10H2240/105—User profile, i.e. data about the user, e.g. for user settings or user preferences
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/145—Sound library, i.e. involving the specific use of a musical database as a sound bank or wavetable; indexing, interfacing, protocols or processing therefor
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/155—Library update, i.e. making or modifying a musical database using musical parameters as indices
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/295—Packet switched network, e.g. token ring
- G10H2240/305—Internet or TCP/IP protocol use for any electrophonic musical instrument data or musical parameter transmission purposes
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/321—Bluetooth
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R5/00—Stereophonic arrangements
- H04R5/033—Headphones for stereophonic communication
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
Abstract
Description
This application claims the benefit of priority to U.S. Provisional Application No. 62/062,672, DEVICES FOR SHARING USER INTERACTIONS, filed October 10, 2014, which is incorporated herein by reference in its entirety.
Claims (20)
- A computer-implemented method for sharing content between users, comprising:
a. transmitting content to headphones of a user, the content being associated with metadata identifying the content;
b. indicating a hand gesture performed by the user on a touchpad associated with the headphones, the hand gesture determining a user preference relating to the content;
c. transmitting information about the hand gesture from the headphones to a mobile device;
d. receiving the information about the hand gesture at the mobile device; and
e. transmitting, to a network, the user preference relating to the content as determined by the hand gesture, and the metadata,
wherein the user preference transmitted to the network comprises a disappearing message; the disappearing message includes a number of times the disappearing message can be accessed, a time at which the disappearing message was sent, and/or an expiration date and time of the disappearing message; the disappearing message is transmitted through a mobile device application; and the disappearing message comprises a voice message recorded by the user.
- The computer-implemented method of claim 1, wherein the hand gesture performed by the user on the touchpad comprises a swipe on the touchpad, a tap on the touchpad, a touch with multiple fingers on the touchpad, a swipe and hold on the touchpad, a swipe in more than one direction on the touchpad, or contact with the touchpad in more than one location.
- The computer-implemented method of claim 1, wherein the hand gesture performed by the user on the touchpad is associated with one or more commands indicating a user preference, the user preference comprising a command to share, to indicate liking, to indicate a geographic location, to indicate a rating, or to indicate a purchase, or a command to gift the content.
- The computer-implemented method of claim 1, wherein the indication of the hand gesture received at the mobile device is translated by the mobile device to determine a specific user preference relating to the content being played on the headphones.
- The computer-implemented method of claim 1, wherein the information about the hand gesture received at the mobile device is translated by the mobile device to generate a command to share the metadata identifying the content to be played on the headphones of one or more other users connected to the network.
- The computer-implemented method of claim 1, wherein the mobile device is a mobile phone, a tablet, a media player, a game console, or a personal computer.
- The computer-implemented method of claim 1, wherein the network to which the user preference relating to the content indicated by the hand gesture and the metadata identifying the content are transmitted is a social network.
- The computer-implemented method of claim 5, wherein the network is one or more of a LAN (local area network), a WAN (wide area network), an internetwork, a peer-to-peer network, and a cellular telephone network.
- The computer-implemented method of claim 5, wherein the one or more other users are connected to the network by a mobile phone, a tablet, a media player, a game console, or a personal computer.
- The computer-implemented method of claim 5, wherein the network includes a server, the user preference relating to the content and the metadata identifying the content are transmitted to the server, and the user preference relating to the content and the metadata identifying the content are shared from the server with one or more other users connected to the network.
- The computer-implemented method of claim 10, wherein the network further includes a social media server, the user preference relating to the content and the metadata identifying the content are transmitted to the social media server, and the user preference relating to the content and the metadata identifying the content are shared from the social media server with one or more other users connected to the network.
- The computer-implemented method of claim 11, wherein the network further includes a media source containing media content, the one or more other users connected to the network can request media content, and the media source can transmit the requested media content to the one or more other users.
- The computer-implemented method of claim 1, wherein the disappearing message comprises a speech-to-text converted message.
- The computer-implemented method of claim 1, wherein the disappearing message is deleted at a set expiration date and time after the disappearing message is sent.
- The computer-implemented method of claim 1, wherein the disappearing message is deleted after the disappearing message has been accessed a maximum number of times.
- The computer-implemented method of claim 1, wherein the user selects one or more users to receive the disappearing message, and the user receives a notification when the disappearing message is heard or read by one of the one or more other users.
- The computer-implemented method of claim 1, wherein one or more other users receive a notification that the disappearing message is incoming.
- The computer-implemented method of claim 17, wherein the one or more other users have the option to read or listen to the disappearing message.
- A computer-implemented method for sharing content between users, comprising:
a. transmitting content to headphones of a user, the content being associated with metadata identifying the content;
b. indicating a hand gesture performed by the user on a touchpad associated with the headphones, the hand gesture determining a user preference relating to the content;
c. transmitting information about the hand gesture from the headphones to a mobile device;
d. receiving the information about the hand gesture at the mobile device; and
e. transmitting, to a network, the user preference relating to the content as determined by the hand gesture, and the metadata,
wherein the user preference transmitted to the network comprises a disappearing message, and the disappearing message comprises a speech-to-text converted message.
- A computer-implemented method for sharing content between users, comprising:
a. transmitting content to headphones of a user, the content being associated with metadata identifying the content;
b. indicating a hand gesture performed by the user on a touchpad associated with the headphones, the hand gesture determining a user preference relating to the content;
c. transmitting information about the hand gesture from the headphones to a mobile device;
d. receiving the information about the hand gesture at the mobile device; and
e. transmitting, to a network, the user preference relating to the content as determined by the hand gesture, and the metadata,
wherein the user preference transmitted to the network comprises a disappearing message, the user selects one or more users to receive the disappearing message, and the user receives a notification when the disappearing message is heard or read by one of the one or more other users.
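The claimed flow (steps a–e above) translates a hand gesture made on the headphone touchpad into a user preference for the currently playing content and forwards that preference, together with the content-identifying metadata, to a network. As a rough illustration only, the following Python sketch shows one way the mobile-device translation step could look; the gesture names, the SharePayload fields, and the post_to_network stub are illustrative assumptions, not part of the patent text.

```python
from __future__ import annotations

from dataclasses import dataclass, asdict
import json

# Hypothetical mapping of touchpad gestures (claim 2) to preference
# commands (claim 3). The concrete gesture vocabulary is an assumption.
GESTURE_TO_PREFERENCE = {
    "swipe_forward": "share",
    "double_tap": "like",
    "two_finger_tap": "rate",
    "swipe_and_hold": "purchase",
}

@dataclass
class SharePayload:
    """User preference plus content-identifying metadata sent to the network (step e)."""
    user_id: str
    preference: str
    content_metadata: dict  # e.g. {"title": ..., "artist": ..., "track_id": ...}

def translate_gesture(user_id: str, gesture: str, metadata: dict) -> SharePayload | None:
    """Mobile-device side: turn gesture information received from the headphones
    into a user preference relating to the currently playing content."""
    preference = GESTURE_TO_PREFERENCE.get(gesture)
    if preference is None:
        return None  # unrecognized gesture; nothing to share
    return SharePayload(user_id=user_id, preference=preference, content_metadata=metadata)

def post_to_network(payload: SharePayload) -> str:
    """Stand-in for transmission to a social network or server (claims 7 and 10)."""
    return json.dumps(asdict(payload))

if __name__ == "__main__":
    now_playing = {"title": "Example Track", "artist": "Example Artist", "track_id": "T123"}
    payload = translate_gesture("user-42", "swipe_forward", now_playing)
    if payload is not None:
        print(post_to_network(payload))
```

In a real deployment the gesture event would arrive over the headphone-to-phone link (for example Bluetooth) and the payload would be posted to the network named in the claims; those transport details are deliberately omitted here.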
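Claims 1 and 13–16 attach lifecycle rules to the disappearing message: it may carry a maximum number of accesses, the time it was sent, and/or an expiration date and time, and it is deleted once either limit is reached. The sketch below, again hypothetical, shows one way such rules could be evaluated; the class and method names are assumptions made for illustration.

```python
from __future__ import annotations

from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

@dataclass
class DisappearingMessage:
    """Voice or speech-to-text message with the expiry rules described in claims 14-15."""
    body: bytes                           # recorded voice message or transcribed text
    sent_at: datetime                     # time the message was sent (claim 1)
    expires_at: datetime | None = None    # claim 14: fixed expiration date/time
    max_accesses: int | None = None       # claim 15: maximum number of accesses
    access_count: int = field(default=0)

    def is_expired(self, now: datetime) -> bool:
        if self.expires_at is not None and now >= self.expires_at:
            return True
        if self.max_accesses is not None and self.access_count >= self.max_accesses:
            return True
        return False

    def open(self, now: datetime) -> bytes | None:
        """Return the message body and count the access; None if already expired."""
        if self.is_expired(now):
            return None
        self.access_count += 1
        return self.body

if __name__ == "__main__":
    sent = datetime.now(timezone.utc)
    msg = DisappearingMessage(
        body=b"voice-note",
        sent_at=sent,
        expires_at=sent + timedelta(hours=24),
        max_accesses=1,
    )
    print(msg.open(datetime.now(timezone.utc)))  # b'voice-note'
    print(msg.open(datetime.now(timezone.utc)))  # None: the single allowed access is consumed
```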
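Claims 10–12 add a server that receives the preference and metadata and shares them with other connected users, plus a media source from which those users can then request the identified content. The sketch below illustrates that relay under simplifying assumptions: in-memory dictionaries stand in for the server and the media source, and all names are hypothetical.

```python
from __future__ import annotations

from collections import defaultdict

# Illustrative in-memory stand-ins for the server and media source of claims 10-12.
SHARED_FEED: dict[str, list[dict]] = defaultdict(list)        # recipient_id -> shared items
MEDIA_LIBRARY: dict[str, bytes] = {"T123": b"<audio bytes>"}   # track_id -> content

def share_with_users(sender_id: str, recipients: list[str], preference: str, metadata: dict) -> None:
    """Server side: fan the preference and content metadata out to connected users."""
    item = {"from": sender_id, "preference": preference, "metadata": metadata}
    for recipient in recipients:
        SHARED_FEED[recipient].append(item)

def request_media(recipient_id: str, track_id: str) -> bytes | None:
    """Media source side: return the requested content if it was shared with this user."""
    shared_ids = {item["metadata"].get("track_id") for item in SHARED_FEED[recipient_id]}
    if track_id in shared_ids:
        return MEDIA_LIBRARY.get(track_id)
    return None

if __name__ == "__main__":
    share_with_users("user-42", ["user-7"], "share", {"title": "Example Track", "track_id": "T123"})
    print(request_media("user-7", "T123") is not None)  # True: the shared track can be fetched
```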
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462062672P | 2014-10-10 | 2014-10-10 | |
US62/062,672 | 2014-10-10 |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2017538923A Division JP2017534132A (ja) | 2014-10-10 | 2015-10-09 | Device for sharing user interactions |
Publications (1)
Publication Number | Publication Date |
---|---|
JP2021002399A true JP2021002399A (ja) | 2021-01-07 |
Family
ID=55653865
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2017538923A Pending JP2017534132A (ja) | 2014-10-10 | 2015-10-09 | Device for sharing user interactions |
JP2020167593A Pending JP2021002399A (ja) | 2014-10-10 | 2020-10-02 | Device for sharing user interactions |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP2017538923A Pending JP2017534132A (ja) | 2014-10-10 | 2015-10-09 | Device for sharing user interactions |
Country Status (4)
Country | Link |
---|---|
US (3) | US10088921B2 (ja) |
JP (2) | JP2017534132A (ja) |
CN (1) | CN107210950A (ja) |
WO (1) | WO2016057943A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102273759B1 (ko) * | 2021-01-27 | 2021-07-06 | 안중영 | Motion signal transmission application program stored in a recording medium and motion signal transmission system using the same |
Families Citing this family (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170031525A1 (en) | 2010-05-14 | 2017-02-02 | Racing Optics, Inc. | Touch screen shield |
US9295297B2 (en) | 2014-06-17 | 2016-03-29 | Racing Optics, Inc. | Adhesive mountable stack of removable layers |
CN115412385B (zh) * | 2015-05-06 | 2024-07-02 | 斯纳普公司 | System and method for ephemeral group chat |
US11283742B2 (en) * | 2016-09-27 | 2022-03-22 | Bragi GmbH | Audio-based social media platform |
US10861428B2 (en) | 2018-01-10 | 2020-12-08 | Qrs Music Technologies, Inc. | Technologies for generating a musical fingerprint |
US11846788B2 (en) | 2019-02-01 | 2023-12-19 | Racing Optics, Inc. | Thermoform windshield stack with integrated formable mold |
JP2022518602A (ja) | 2019-02-01 | 2022-03-15 | レーシング オプティクス,インコーポレイテッド | Thermoformed windshield laminate with integrated formable molding |
WO2020203425A1 (ja) * | 2019-04-01 | 2020-10-08 | ソニー株式会社 | Information processing device, information processing method, and program |
US11364715B2 (en) | 2019-05-21 | 2022-06-21 | Racing Optics, Inc. | Polymer safety glazing for vehicles |
JP7188337B2 (ja) * | 2019-09-24 | 2022-12-13 | カシオ計算機株式会社 | Server device, performance support method, program, and information providing system |
US11648723B2 (en) | 2019-12-03 | 2023-05-16 | Racing Optics, Inc. | Method and apparatus for reducing non-normal incidence distortion in glazing films |
US11912001B2 (en) | 2019-12-03 | 2024-02-27 | Ro Technologies, Llc | Method and apparatus for reducing non-normal incidence distortion in glazing films |
US11548356B2 (en) | 2020-03-10 | 2023-01-10 | Racing Optics, Inc. | Protective barrier for safety glazing |
US20210285661A1 (en) | 2020-03-10 | 2021-09-16 | Wolf Steel Ltd. | Heating and cooling appliance |
US11490667B1 (en) | 2021-06-08 | 2022-11-08 | Racing Optics, Inc. | Low haze UV blocking removable lens stack |
US11709296B2 (en) | 2021-07-27 | 2023-07-25 | Racing Optics, Inc. | Low reflectance removable lens stack |
US11307329B1 (en) | 2021-07-27 | 2022-04-19 | Racing Optics, Inc. | Low reflectance removable lens stack |
US12140781B2 (en) | 2021-07-27 | 2024-11-12 | Laminated Film Llc | Low reflectance removable lens stack |
US11882166B2 (en) * | 2021-11-17 | 2024-01-23 | Lemon Inc. | Methods, systems and storage media for generating an effect configured by one or more network connected devices |
US12162330B2 (en) | 2022-02-08 | 2024-12-10 | Ro Technologies, Llc | Multi-layer windshield film having progressive thickness layers |
US11933943B2 (en) | 2022-06-06 | 2024-03-19 | Laminated Film Llc | Stack of sterile peelable lenses with low creep |
US11808952B1 (en) | 2022-09-26 | 2023-11-07 | Racing Optics, Inc. | Low static optical removable lens stack |
WO2025147282A1 (en) * | 2024-01-05 | 2025-07-10 | Chord Board, Llc | Arpeggiator musical instrument |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130339859A1 (en) * | 2012-06-15 | 2013-12-19 | Muzik LLC | Interactive networked headphones |
JP2014183594A (ja) * | 2013-03-19 | 2014-09-29 | Samsung Electronics Co Ltd | Display device and method for displaying information about activities thereof |
Family Cites Families (93)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5880743A (en) | 1995-01-24 | 1999-03-09 | Xerox Corporation | Apparatus and method for implementing visual animation illustrating results of interactive editing operations |
JPH08305663A (ja) | 1995-04-28 | 1996-11-22 | Hitachi Ltd | Cooperative work support system |
US6057845A (en) | 1997-11-14 | 2000-05-02 | Sensiva, Inc. | System, method, and apparatus for generation and recognizing universal commands |
US7840912B2 (en) | 2006-01-30 | 2010-11-23 | Apple Inc. | Multi-touch gesture dictionary |
US6476834B1 (en) | 1999-05-28 | 2002-11-05 | International Business Machines Corporation | Dynamic creation of selectable items on surfaces |
US6956562B1 (en) | 2000-05-16 | 2005-10-18 | Palmsource, Inc. | Method for controlling a handheld computer by entering commands onto a displayed feature of the handheld computer |
US7030861B1 (en) | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
JP3609748B2 (ja) * | 2001-05-14 | 2005-01-12 | 株式会社バーテックススタンダード | Microphone characteristic adjustment device |
US7202861B2 (en) | 2001-06-25 | 2007-04-10 | Anoto Ab | Control of a unit provided with a processor |
AUPR956901A0 (en) | 2001-12-17 | 2002-01-24 | Jayaratne, Neville | Real time translator |
US7356564B2 (en) * | 2002-01-09 | 2008-04-08 | At&T Delaware Intellectual Property, Inc. | Method, system, and apparatus for providing self-destructing electronic mail messages |
US20030140769A1 (en) * | 2002-01-30 | 2003-07-31 | Muzik Works Technologies Inc. | Method and system for creating and performing music electronically via a communications network |
US20040145574A1 (en) | 2003-01-29 | 2004-07-29 | Xin Zhen Li | Invoking applications by scribing an indicium on a touch screen |
KR20040083788A (ko) | 2003-03-25 | 2004-10-06 | 삼성전자주식회사 | Portable terminal capable of running programs using gesture commands and program running method using the same |
US7925029B2 (en) * | 2003-04-18 | 2011-04-12 | Koninklijke Philips Electronics N.V. | Personal audio system with earpiece remote controller |
AU2003304306A1 (en) | 2003-07-01 | 2005-01-21 | Nokia Corporation | Method and device for operating a user-input area on an electronic display device |
US20050047629A1 (en) | 2003-08-25 | 2005-03-03 | International Business Machines Corporation | System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking |
US7774411B2 (en) * | 2003-12-12 | 2010-08-10 | Wisys Technology Foundation, Inc. | Secure electronic message transport protocol |
US7653816B2 (en) * | 2003-12-30 | 2010-01-26 | First Information Systems, Llc | E-mail certification service |
US7176888B2 (en) | 2004-03-23 | 2007-02-13 | Fujitsu Limited | Selective engagement of motion detection |
US7365736B2 (en) | 2004-03-23 | 2008-04-29 | Fujitsu Limited | Customizable gesture mappings for motion controlled handheld devices |
US8448083B1 (en) | 2004-04-16 | 2013-05-21 | Apple Inc. | Gesture control of multimedia editing applications |
CN1874613A (zh) * | 2005-03-25 | 2006-12-06 | 南承铉 | Automatically controlled earphone system using a capacitive sensor |
JP2006301683A (ja) * | 2005-04-15 | 2006-11-02 | Media Socket:Kk | Electronic communication processing apparatus and method |
US8331603B2 (en) | 2005-06-03 | 2012-12-11 | Nokia Corporation | Headset |
US8266219B2 (en) * | 2005-07-20 | 2012-09-11 | Research In Motion Limited | Method and system for instant messaging conversation security |
US7610345B2 (en) * | 2005-07-28 | 2009-10-27 | Vaporstream Incorporated | Reduced traceability electronic message system and method |
US20070052672A1 (en) | 2005-09-08 | 2007-03-08 | Swisscom Mobile Ag | Communication device, system and method |
BRPI0622428A2 (pt) | 2005-09-16 | 2021-05-11 | Janssen Pharmaceutica N.V. | ciclopropila aminas como moduladores do receptor h3 de histamina |
JP2007109118A (ja) | 2005-10-17 | 2007-04-26 | Hitachi Ltd | Input instruction processing device and input instruction processing program |
US8073137B2 (en) * | 2006-03-06 | 2011-12-06 | Sony Ericsson Mobile Communications Ab | Audio headset |
US8094673B2 (en) * | 2006-04-10 | 2012-01-10 | Microsoft Corporation | Cable user interface |
KR20080004229A (ko) | 2006-07-05 | 2008-01-09 | 엘지노텔 주식회사 | System and method for remotely controlling an application program using a wireless terminal |
US8564544B2 (en) * | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US7694240B2 (en) | 2006-11-22 | 2010-04-06 | General Electric Company | Methods and systems for creation of hanging protocols using graffiti-enabled devices |
US7877707B2 (en) * | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
EP1956589B1 (en) * | 2007-02-06 | 2009-12-30 | Oticon A/S | Estimating own-voice activity in a hearing-instrument system from direct-to-reverberant ratio |
US8060841B2 (en) | 2007-03-19 | 2011-11-15 | Navisense | Method and device for touchless media searching |
KR100814067B1 (ko) * | 2007-05-30 | 2008-03-14 | (주)엠비즈코리아 | SMS message read-confirmation method and terminal device for performing the same |
US8219936B2 (en) | 2007-08-30 | 2012-07-10 | Lg Electronics Inc. | User interface for a mobile device using a user's gesture in the proximity of an electronic device |
US8564574B2 (en) * | 2007-09-18 | 2013-10-22 | Acer Incorporated | Input apparatus with multi-mode switching function |
US7631811B1 (en) | 2007-10-04 | 2009-12-15 | Plantronics, Inc. | Optical headset user interface |
JP2009129257A (ja) * | 2007-11-26 | 2009-06-11 | Sony Corp | Server device, terminal device, sympathy action management processing method, sympathy action method, and program |
US8677285B2 (en) | 2008-02-01 | 2014-03-18 | Wimm Labs, Inc. | User interface of a small touch sensitive display for an electronic data and communication device |
UA99649C2 (ru) | 2008-04-07 | 2012-09-10 | Косс Корпорейшн | Wireless headphone that switches between wireless networks, and system and method for implementing the same |
US8731218B2 (en) * | 2008-04-10 | 2014-05-20 | Apple Inc. | Deformable controller for electronic device |
US8320578B2 (en) * | 2008-04-30 | 2012-11-27 | Dp Technologies, Inc. | Headset |
US20100293462A1 (en) | 2008-05-13 | 2010-11-18 | Apple Inc. | Pushing a user interface to a remote device |
TW200949618A (en) * | 2008-05-16 | 2009-12-01 | Kye Systems Corp | Input device and the control method thereof |
US8169414B2 (en) | 2008-07-12 | 2012-05-01 | Lim Seung E | Control of electronic games via finger angle using a high dimensional touchpad (HDTP) touch user interface |
CN101640857A (zh) * | 2008-08-01 | 2010-02-03 | 中兴通讯股份有限公司 | Mobile phone television program sharing method and mobile phone |
US8447609B2 (en) | 2008-12-31 | 2013-05-21 | Intel Corporation | Adjustment of temporal acoustical characteristics |
US8819597B2 (en) | 2009-04-10 | 2014-08-26 | Google Inc. | Glyph entry on computing device |
CN102460349A (zh) | 2009-05-08 | 2012-05-16 | 寇平公司 | Remote control of host applications using motion and voice commands |
US9563350B2 (en) | 2009-08-11 | 2017-02-07 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
US8957918B2 (en) | 2009-11-03 | 2015-02-17 | Qualcomm Incorporated | Methods for implementing multi-touch gestures on a single-touch touch surface |
KR101319264B1 (ko) | 2010-01-22 | 2013-10-18 | 전자부품연구원 | Method for providing a UI based on multi-touch pressure and electronic device applying the same |
CN102906676A (zh) | 2010-03-22 | 2013-01-30 | 美泰有限公司 | Electronic device and input and output of data |
KR101119835B1 (ko) * | 2010-03-25 | 2012-02-28 | 이노디지털 주식회사 | Remote control provided with a touchpad user interface |
US9329767B1 (en) | 2010-06-08 | 2016-05-03 | Google Inc. | User-specific customization based on characteristics of user-interaction |
US9586147B2 (en) * | 2010-06-23 | 2017-03-07 | Microsoft Technology Licensing, Llc | Coordinating device interaction to enhance user experience |
EP2437163A1 (en) | 2010-09-09 | 2012-04-04 | Harman Becker Automotive Systems GmbH | User interface for a vehicle system |
EP2432218B1 (en) | 2010-09-20 | 2016-04-20 | EchoStar Technologies L.L.C. | Methods of displaying an electronic program guide |
US9316827B2 (en) | 2010-09-20 | 2016-04-19 | Kopin Corporation | LifeBoard—series of home pages for head mounted displays (HMD) that respond to head tracking |
US9122307B2 (en) | 2010-09-20 | 2015-09-01 | Kopin Corporation | Advanced remote control of host application using motion and voice commands |
US9129234B2 (en) * | 2011-01-24 | 2015-09-08 | Microsoft Technology Licensing, Llc | Representation of people in a spreadsheet |
US20120216152A1 (en) | 2011-02-23 | 2012-08-23 | Google Inc. | Touch gestures for remote control operations |
US8203502B1 (en) | 2011-05-25 | 2012-06-19 | Google Inc. | Wearable heads-up display with integrated finger-tracking input sensor |
US8223088B1 (en) | 2011-06-09 | 2012-07-17 | Google Inc. | Multimode input field for a head-mounted display |
US9024843B2 (en) * | 2011-06-30 | 2015-05-05 | Google Inc. | Wearable computer with curved display and navigation tool |
US8873147B1 (en) * | 2011-07-20 | 2014-10-28 | Google Inc. | Chord authentication via a multi-touch interface |
US20130021269A1 (en) | 2011-07-20 | 2013-01-24 | Google Inc. | Dynamic Control of an Active Input Region of a User Interface |
US8941560B2 (en) | 2011-09-21 | 2015-01-27 | Google Inc. | Wearable computer with superimposed controls and instructions for external device |
JP5866728B2 (ja) * | 2011-10-14 | 2016-02-17 | サイバーアイ・エンタテインメント株式会社 | Knowledge information processing server system provided with an image recognition system |
US9292082B1 (en) | 2011-11-08 | 2016-03-22 | Google Inc. | Text-entry for a computing device |
WO2013069858A1 (ko) * | 2011-11-09 | 2013-05-16 | 주식회사 예일전자 | Sound output mechanism of a mobile device capable of outputting visual and acoustic signals, and fixing structure for sound processing means |
US9064436B1 (en) | 2012-01-06 | 2015-06-23 | Google Inc. | Text input on touch sensitive interface |
US20130194301A1 (en) * | 2012-01-30 | 2013-08-01 | Burn Note, Inc. | System and method for securely transmiting sensitive information |
US9035878B1 (en) | 2012-02-29 | 2015-05-19 | Google Inc. | Input system |
CN203164614U (zh) * | 2013-03-05 | 2013-08-28 | 成名 | Multi-purpose watch |
US10545660B2 (en) * | 2013-05-03 | 2020-01-28 | Blackberry Limited | Multi touch combination for viewing sensitive information |
WO2014204330A1 (en) * | 2013-06-17 | 2014-12-24 | 3Divi Company | Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements |
US20150169505A1 (en) * | 2013-12-12 | 2015-06-18 | Steve Kim | Textbox magnifier |
US20150222686A1 (en) * | 2014-02-06 | 2015-08-06 | Elijah Aizenstat | System and a method for sharing interactive media content by at least one user with at least one recipient over a communication network |
US9479909B2 (en) * | 2014-03-20 | 2016-10-25 | Tigertext, Inc. | Method of sending messages to devices not configured to receive them |
WO2015176037A1 (en) * | 2014-05-16 | 2015-11-19 | T-Ink, Inc. | Devices and techniques relating to a touch sensitive control device |
US20160070702A1 (en) * | 2014-09-09 | 2016-03-10 | Aivvy Inc. | Method and system to enable user related content preferences intelligently on a headphone |
USD793356S1 (en) * | 2014-12-16 | 2017-08-01 | Musik, LLC | Headphone touchpad |
US20170063751A1 (en) * | 2015-08-27 | 2017-03-02 | Nicolas Korboulewsky-Braustein | Systems and methods for generating and transmitting an email message including an active content |
US20170293351A1 (en) * | 2016-04-07 | 2017-10-12 | Ariadne's Thread (Usa), Inc. (Dba Immerex) | Head mounted display linked to a touch sensitive input device |
US20180077096A1 (en) * | 2016-09-13 | 2018-03-15 | Mark A. DeMattei | Messaging environment for mobile device with multitask toolbar, search engine and keyboard control access to apps and centralized functionality |
WO2020132818A1 (zh) * | 2018-12-24 | 2020-07-02 | 华为技术有限公司 | Wireless short-range audio sharing method and electronic device |
WO2022119772A1 (en) * | 2020-12-02 | 2022-06-09 | Amazon Technologies, Inc. | Sharing audio from a source device |
-
2015
- 2015-10-09 WO PCT/US2015/054973 patent/WO2016057943A1/en active Application Filing
- 2015-10-09 US US14/879,803 patent/US10088921B2/en active Active
- 2015-10-09 CN CN201580067550.8A patent/CN107210950A/zh active Pending
- 2015-10-09 JP JP2017538923A patent/JP2017534132A/ja active Pending
-
2018
- 2018-08-24 US US16/111,486 patent/US10824251B2/en active Active
-
2020
- 2020-10-02 JP JP2020167593A patent/JP2021002399A/ja active Pending
- 2020-10-21 US US17/076,230 patent/US20210034176A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130339859A1 (en) * | 2012-06-15 | 2013-12-19 | Muzik LLC | Interactive networked headphones |
JP2014183594A (ja) * | 2013-03-19 | 2014-09-29 | Samsung Electronics Co Ltd | Display device and method for displaying information about activities thereof |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102273759B1 (ko) * | 2021-01-27 | 2021-07-06 | 안중영 | Motion signal transmission application program stored in a recording medium and motion signal transmission system using the same |
WO2022164170A1 (ko) * | 2021-01-27 | 2022-08-04 | 안중영 | Motion signal transmission application program stored in a recording medium and motion signal transmission system using the same |
Also Published As
Publication number | Publication date |
---|---|
US10824251B2 (en) | 2020-11-03 |
US20210034176A1 (en) | 2021-02-04 |
US20160231834A1 (en) | 2016-08-11 |
CN107210950A (zh) | 2017-09-26 |
JP2017534132A (ja) | 2017-11-16 |
WO2016057943A1 (en) | 2016-04-14 |
US10088921B2 (en) | 2018-10-02 |
US20180364825A1 (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2021002399A (ja) | Device for sharing user interactions | |
JP6553124B2 (ja) | Sensor fusion interface for multiple sensor inputs | |
US20160103511A1 (en) | Interactive input device | |
CN104685470B (zh) | Device and method for generating a user interface from a template | |
TWI578288B (zh) | Multimedia poster generation method and device | |
JP2017534132A5 (ja) | ||
US9779710B2 (en) | Electronic apparatus and control method thereof | |
JP2014002748A (ja) | Remote control device and control method thereof | |
WO2020224322A1 (zh) | Music file processing method and apparatus, terminal, and storage medium | |
CN107209668A (zh) | Reactive agent development environment | |
US8878043B2 (en) | Systems, methods, and apparatus for music composition | |
Löcken et al. | User-centred process for the definition of free-hand gestures applied to controlling music playback | |
US8962967B2 (en) | Musical instrument with networking capability | |
US20250208720A1 (en) | Extend the game controller functionality with virtual buttons using hand tracking | |
JP5729844B1 (ja) | Content evaluation device, system, server device, and terminal device | |
JP2013008360A (ja) | Motion controller used as a control device | |
JP2023540256A (ja) | Personal performance feedback for a workout community | |
JP2014002719A (ja) | Remote control device, display device, and control method thereof | |
US10307672B2 (en) | Distribution system, distribution method, and distribution device | |
US12190015B2 (en) | Cognitive aid for audio books | |
KR20170093644A (ko) | Portable terminal and control method thereof | |
JP3165268U (ja) | Wireless keyboard headphone set for multimedia operation | |
CN117135554A (zh) | Method and system for balancing audio directed to each ear of a user | |
García et al. | Usable Control System for Interaction with Ubiquitous Television | |
Kajastila | Interaction with eyes-free and gestural interfaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A521 | Request for written amendment filed |
Free format text: JAPANESE INTERMEDIATE CODE: A523 Effective date: 20201026 |
|
A621 | Written request for application examination |
Free format text: JAPANESE INTERMEDIATE CODE: A621 Effective date: 20201026 |
|
A977 | Report on retrieval |
Free format text: JAPANESE INTERMEDIATE CODE: A971007 Effective date: 20210915 |
|
A131 | Notification of reasons for refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A131 Effective date: 20211001 |
|
A601 | Written request for extension of time |
Free format text: JAPANESE INTERMEDIATE CODE: A601 Effective date: 20220104 |
|
A02 | Decision of refusal |
Free format text: JAPANESE INTERMEDIATE CODE: A02 Effective date: 20220419 |