US5561796A - Apparatus for searching for speech and moving images - Google Patents
- Publication number
- US5561796A (application US08/404,082)
- Authority
- US
- United States
- Prior art keywords
- speech
- label
- moving image
- person
- storage means
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/783—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/7837—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content
- G06F16/784—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using objects detected or recognised in the video content the detected or recognised objects being people
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/73—Querying
- G06F16/732—Query formulation
- G06F16/7343—Query language or query format
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/99931—Database or file accessing
- Y10S707/99933—Query processing, i.e. searching
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S707/00—Data processing: database and file management or data structures
- Y10S707/99941—Database schema or data structure
- Y10S707/99944—Object-oriented database structure
- Y10S707/99945—Object-oriented database structure processing
Definitions
- the present invention relates to a speech and moving image apparatus for searching for a desired portion from recorded data of speech and moving images.
- a recording/reproduction/editing apparatus for images and speech which records image and speech information of a conference, attaches marks to scenes necessary for recording, and searches for necessary information in reproduction by using such marks as a key (Japanese Patent Application Laying Open No. 5-2857).
- the recording/reproduction/editing apparatus for images and speech comprises, as shown in FIG. 14, an input device 1 consisting of a camera, a microphone, and the like for inputting image and speech information; a recording device 2 for recording the image and speech information sent from the input device 1; an editing device 3 for editing the image and speech information recorded in the recording device 2; a search device 4 for finding the necessary information by reproducing the recorded information at a high speed; a reproduction device 5 for reproducing the recorded information at a normal speed; an output device 6 consisting of a television, a speaker, and the like for outputting the recorded image and speech information; a controller 7, and a console 8 for controlling processing and delivery of information between these devices.
- the conventional recording/reproduction/editing apparatus for images and speech and its search method have the disadvantage that they can search on only one event, either speech or moving image, and cannot search for a combined scene, such as a scene in which a person is nodding while saying "yes."
- the present invention is intended to eliminate such problems, and to provide a speech and moving image search apparatus which can search for a desired scene.
- a speech and moving image search apparatus comprising a database for storing the speech and moving image data by frame as the minimum time unit, a label attribute storage means for storing attribute information of labels which is for labeling the speech and moving image data stored in the database by frame, a label information storage means for storing label information in which the speech and moving image data stored by frame is labeled in a plurality of different events based on the attribute information of labels stored in the label attribute storage means, a broader term storage means for storing broader terms which are for searching for the labels by OR or AND search, an input means for inputting a command for specifying the broader term, a search means for searching for the label information from the label information storage means based on the broader term stored in the broader term storage means in response to the specified command, a controller means for accessing the database for the speech and moving image data corresponding to the label information searched for by the search means, and an output means for outputting the speech and moving image data accessed from the database.
- a speech and moving image search apparatus comprising an input means for inputting the speech and moving image data by frame as the minimum time unit, a feature parameter storage means for storing each feature parameter corresponding to the prelabeled respective speech and moving image data, a matching means for matching the speech and moving image data input by the input means with the feature parameter stored in the feature parameter storage means, and for outputting labels corresponding to the matched feature parameter only when predetermined conditions are met, a broader term storage means for storing broader terms which are for searching for the labels by OR or AND search, a search means for accepting the labels output from the matching means, and for searching the broader terms for the labels from the broader term storage means, and an output means for outputting the broader terms searched for by the search means.
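The apparatus recited above can be summarized with a small data model. The sketch below is illustrative only; the class and field names are assumptions, not the patent's own terminology:

```python
from dataclasses import dataclass

# Illustrative data model for the apparatus described above; all names
# are assumptions, not the patent's terminology.

@dataclass
class Label:
    attribute: str    # event type, e.g. "gesture" or "speech"
    name: str         # label name, e.g. "nod" or "hai"
    start_frame: int  # first frame of the labeled interval
    end_frame: int    # last frame of the labeled interval

@dataclass
class BroaderTerm:
    name: str    # e.g. "agree"
    search: str  # "OR" or "AND" search specification
    labels: list # (attribute, label name) pairs to be combined

# Example mirroring the description: "agree" is an OR search over the
# speech label "hai" and the gesture label "nod".
agree = BroaderTerm("agree", "OR", [("speech", "hai"), ("gesture", "nod")])
```

Specifying the command "agree" would then direct the search means to retrieve all frame intervals carrying either of the two listed labels.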
- the speech and moving image data is stored by frame as the minimum time unit in the database.
- the label information is stored by the label information storage device.
- the label attribute storage device stores the label attribute.
- the broader term of the label is stored by the broader term storage device, and the speech and moving images are stored in the database.
- when searching for data, a command is input through the input device, and label information on the data is extracted by the search device through reference to the broader term storage device.
- the database is then accessed by the control device for the speech and moving image data corresponding to the label information.
- the output device outputs the speech and moving image data.
- a desired scene can thus be easily searched for with an OR or AND type search simply by inputting a predetermined command corresponding to the broader term.
- the speech and moving image data is input by frame as the minimum time unit by the input device.
- each feature parameter corresponding to the prelabeled respective speech and moving image data is stored in the feature parameter storage means, and the broader term for searching for the label with an OR or AND type search is stored in the broader term storage device.
- the matching means matches the speech and moving image data input from the input device with the feature parameter stored in the feature parameter storage device, and outputs the label corresponding to the matched feature parameter only when predetermined conditions are met, while the search device accepts the label. It then searches for the broader term of the label from the broader term storage device, and the searched-for broader term is output by the output device. Therefore, it is possible to easily search for the broader term for the input speech and moving image data.
- FIG. 1 is a block diagram showing the arrangement of a first speech and moving image search apparatus according to the present invention.
- FIG. 2 is a schematic diagram showing the arrangement of a recording/reproduction/editing apparatus for images and speech for which the first speech and moving image search apparatus according to the present invention is used.
- FIG. 3 is a configuration of label attribute data according to the present invention.
- FIG. 4 is an example of the label attribute data according to the present invention.
- FIG. 5 is an example of the label attribute data according to the present invention.
- FIG. 6 is an example of the arrangement on the screen displayed on a display.
- FIG. 7 is a diagram for illustrating a label information window.
- FIG. 8 is a diagram for illustrating a rectangle for a label.
- FIG. 9a is a flowchart showing the procedure for a label.
- FIG. 9b is a flowchart showing the procedure for a label.
- FIG. 9c is a flowchart showing the procedure for a label.
- FIG. 9d is a flowchart showing the procedure for a label.
- FIG. 10 is a diagram for illustrating the operation of a mouse for a label.
- FIG. 11 is a diagram for illustrating a search region according to the present invention.
- FIG. 12 is a block diagram showing the arrangement of a second speech and moving image search apparatus according to the present invention.
- FIG. 13 is a diagram showing various regions of an image.
- FIG. 14 is a block diagram showing a conventional recording/reproduction/editing apparatus for images and speech.
- the first speech and moving image search apparatus includes, as shown in FIG. 1, an input section 101 as an input means (for example) for inputting a command and the like; a label information storage 103 as a label information storage means (for example) for storing label information; a label attribute storage 104 as a label attribute storage means (for example) for storing label attributes; a broader term storage 105 as a broader term storage means (for example) for storing broader terms of the labels; a search section 102 as a first search means (for example) for extracting information on the data from the label information storage 103; a database 107 for storing speech and image data; a control section 106 as a control means (for example) for accessing desired data from the database 107 based on the information extracted from the search section 102; and an output section 108 as an output means (for example) for outputting the data from the database 107.
- the input section 101 consists of a keyboard 10 and a mouse 11.
- the search section 102 and the control section 106 are contained in the housing of a computer unit 12. Attached to the computer unit 12 are a magnetic disk 13, which commonly serves as the label information storage 103, the label attribute storage 104, and the broader term storage 105, a magneto-optical disk 14 serving as the database 107, and a display 15.
- Attached to the magneto-optical disk 14 are a monitor 16 serving as the output section 108, an input/output device 17 located at a receptionist 110, and an input/output device 18 located at a person to be monitored 111.
- the input/output device 17 includes a camera 19, a microphone 20, a monitor 21, and a speaker 22.
- a half-mirror 23 is placed in front of the receptionist 110.
- when the receptionist 110 views the half-mirror 23, he or she can respond to the person to be monitored 111 displayed on the monitor 21 as if he or she were directly facing that person.
- the input/output device 18 includes a camera 24, a microphone 25, a monitor 26, and a speaker 27.
- the voice of the person to be monitored 111 is output to the speaker 22 at the receptionist 110 through the microphone 25, while the figure of the person to be monitored 111 is taken by the camera 24 and output to the monitor 21 at the receptionist 110.
- the voice of the receptionist 110 is output to the speaker 27 at the person to be monitored 111 through the microphone 20, while the figure of the receptionist 110 is taken by the camera 19 and output to the monitor 26 at the person to be monitored 111.
- when the receptionist 110 views the person to be monitored displayed on the monitor 21 through the half-mirror 23, the line of sight of the receptionist 110 is directed to the camera 19 through the half-mirror 23, so the receptionist 110 displayed on the monitor 26 also appears to see the person to be monitored 111.
- the image of the receptionist 110 is taken as if the camera is placed on the line of sight of the receptionist 110 to the monitor 21 by placing the half-mirror 23 in front of the monitor 21.
- the voice of the person to be monitored 111 picked up by the microphone 25, the image of the figure of the person to be monitored 111 shot by the camera 24, the voice of the receptionist 110 picked up by the microphone 20, and the image of the figure of the receptionist 110 shot by the camera 19 are recorded on the magneto-optical disk 14.
- the magneto-optical disk 14 is controlled for writing and reproduction by a workstation including a keyboard 10, a mouse 11, a computer unit 12, a magnetic disk 13, and a display 15.
- the speech and moving images in reproduction are output to the monitor 16.
- the data such as the label information, label attribute, and broader term have been written to the magnetic disk 13.
- the keyboard 10 and the mouse 11 are provided for allowing input of a command and the like.
- the label attribute data as shown in FIG. 3, has a symbol # followed by an attribute name which represents different events.
- FIG. 3 describes the i-th attribute.
- an event means, for example, an action in which a particular portion of the body is of interest.
- under each attribute name there are label names which label the different states within that attribute. For example, they represent differences in gestures made by a person.
- the first attribute "task" represents the task of the data, which includes labels of "uketsuke" (a reception task) and "janken" (a paper-scissors-stone task for determining a choice).
- the second attribute “gesture” represents the gesture of a person which includes labels of "bow,” “nod,” “look” (looking memo), and "pointing.”
- the third attribute "expression" represents the expression of a person, which includes labels of "smile" (smiling), "angry," "laugh" (laughing), and "confuse" (confused).
- the fourth attribute "head” represents in which direction the head of a person is directed which includes labels of "left,” “right,” “up,” “down,” and “center.”
- the fifth attribute “eye” represents in which direction the line of sight of a person is directed which includes labels of "contact” (looking at the other person) and “eclose” (the eyes being closed).
- the sixth attribute “mouth” represents the state of the mouth of a person which includes labels of "mopen” (the mouth being opened) and “mclose” (the mouth being closed).
- the seventh attribute "speech" represents the content of the speech spoken by a person, which includes labels of "irasshai" (meaning "welcome" in Japanese), "hai" (meaning "yes" in Japanese), "iie" (meaning "no" in Japanese), and "arigatou" (meaning "thanks" in Japanese).
- the label information is described, as shown in FIG. 5, on the line following each attribute name 28 in the sequence of the start frame 29, the end frame 30, and the label name 31 for each label.
- the name defined in the label attribute is used as the label name.
- the start frame 29 and end frame 30 here represent the frame numbers of data recorded on the magneto-optical disk 14. For example, in the attribute name "gesture,” the frame numbers from 6943 to 6962 represent "nod.”
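The label-information layout described above (an attribute name line followed by start frame, end frame, and label name for each label) can be read with a small parser. The exact file syntax below is an assumption based on FIGS. 3 and 5, with "#" introducing an attribute name as in FIG. 3; the frame numbers for "hai" are illustrative, while "nod" uses the frames 6943–6962 cited in the description:

```python
# Hypothetical parser for the label-information format of FIGS. 3 and 5:
# a "#<attribute>" line introduces an attribute, and each following line
# gives "<start_frame> <end_frame> <label_name>".

def parse_label_info(text):
    labels = {}       # attribute name -> list of (start, end, name)
    attribute = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("#"):
            attribute = line[1:]
            labels.setdefault(attribute, [])
        elif attribute is not None:
            start, end, name = line.split()
            labels[attribute].append((int(start), int(end), name))
    return labels

sample = """\
#gesture
6943 6962 nod
#speech
6940 6955 hai
"""
info = parse_label_info(sample)
```

After parsing, `info["gesture"]` holds the interval (6943, 6962) labeled "nod", matching the example given for the "gesture" attribute.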
- an attribute information window 32 displays the label attributes.
- a reproduction frame window 33 displays the current frame of the magneto-optical disk 14 being reproduced.
- a label information window 34 displays the labels of each label attribute in correspondence to the frame. The labels are linearly arranged on a line for each attribute. The region from the start frame to the end frame indicating the position of the label is represented by a rectangle 35.
- a frame window 36 displays a range of frames including the frame displayed in the label information window 34.
- a frame cursor 37 allows a change in the time frame seen from the label information window 34 by laterally moving a bar being displayed. In this case, the scale and values of the frame window 36 are arranged to be moved in interlocking action with such movement.
- Operation buttons 38 are for controlling the magneto-optical disk 14, and include buttons of the following functions from the left-most button: fast reverse reproduction, normal reverse reproduction, slow reverse reproduction, reverse jog feed, stop, jog feed, slow reproduction, normal reproduction, and fast reproduction.
- the buttons for fast reverse reproduction, fast reproduction, reverse jog feed, and jog feed operate only as long as they are pressed. Each of the other buttons starts its function once it is pressed and continues to function until another button is pressed.
- Command buttons 39 are used to Add, Modify, or Delete the label information, or to Select it.
- Label select buttons 40 are used to provide a label with a label name, or to specify a label.
- the label information window 34 will be described in detail with reference to FIG. 7 in the following sections.
- the horizontal direction is the x direction
- the vertical direction is the y direction
- the y coordinates representing the label attributes "task,” “gesture,” “expression,” “head,” “eye,” “mouth,” and “speech” are Y[1], Y[2], Y[3], Y[4], Y[5], Y[6], and Y[7], respectively
- the coordinates of the mouse cursor are (MX, MY)
- the left-most x coordinate on the screen displaying the label is LEFT
- the right-most x coordinate is RIGHT
- the left-most frame displayed in the frame window 36 is START FRAME
- the right-most frame is END FRAME.
- the position of the label, that is, the left-most x coordinate LX1, the right-most LX2, the lowermost y coordinate LY1, and the uppermost LY2, can be found by the following expression:
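The expression itself does not survive in the record, so the following is a plausible reconstruction under the stated definitions, assuming frames map linearly onto the x axis between LEFT and RIGHT over the displayed range from START FRAME to END FRAME; the function name and the treatment of the rectangle height are assumptions:

```python
# Hypothetical reconstruction of the rectangle-coordinate expression:
# the label's frame interval is mapped linearly onto screen x coordinates.

def label_rectangle(start, end, left, right, start_frame, end_frame,
                    y, height):
    """Return (LX1, LY1, LX2, LY2) for a label spanning frames
    [start, end], drawn on the attribute's line at y coordinate y."""
    scale = (right - left) / (end_frame - start_frame)
    lx1 = left + (start - start_frame) * scale  # left-most x coordinate
    lx2 = left + (end - start_frame) * scale    # right-most x coordinate
    ly1 = y - height / 2                        # lowermost y coordinate
    ly2 = y + height / 2                        # uppermost y coordinate
    return lx1, ly1, lx2, ly2
```

For example, with LEFT = 0, RIGHT = 100, START FRAME = 0, and END FRAME = 200, a label spanning frames 50–100 is drawn from x = 25 to x = 50.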
- the mouse 11 is moved to position the mouse cursor 41 on the Add button in the command window 39 where the mouse button is pressed.
- the Add button changes its color to the selected-state color, different from that in the normal state, so that one can easily see that the button has been selected.
- the mouse cursor 41 is moved to a label of the label select buttons 40 desired for labeling, and the mouse button is pressed (step S1).
- the selected label changes its color to the selected state color.
- the mouse button is pressed at the start frame (step S2), the mouse cursor 41 is dragged while the mouse button is held down, and the mouse button is released at the end frame (step S3). While the mouse is being dragged, a rectangle is drawn from the start frame to the frame currently pointed to by the mouse cursor 41, at the position of the y coordinate specific to the selected label attribute. At the same time, the image at that moment is reproduced.
- the information on the label name, the start frame, and the end frame thus input is stored in the magnetic disk 13 as the label information storage 103 (steps S4 and S5).
- the label select buttons 40 maintain the currently selected state until a button other than the currently selected one is pressed. Therefore, two or more buttons cannot be simultaneously selected.
- when the Add button is pressed again, or another command button 39 is pressed, the process terminates (step S6). At that moment, the color of the Add button returns to the original one from the selected-state one.
- when the Modify button of the command buttons 39 is pressed, it changes its color to the selected-state color.
- the label select button 40 for the label number to be changed is pressed (step S7), and then, the rectangle for the label to be changed in the label information window 34 is pressed (step S8). This causes the label name to be changed to a new one (step S9). If the label attribute selected by the label select button 40 fails to match the attribute of the label selected in the label information window 34, the rewriting of the label name does not occur.
- the start or end frame is changed, as shown in FIG.
- the start or end frame which is a boundary of the rectangle for the label to be changed is pointed by the mouse cursor 41 (step S10), and dragged to an intended frame where the mouse button is released (step S11).
- the information on the label name, the start frame, and the end frame thus modified is stored in the magnetic disk 13 in place of the previous information (step S12).
- the display screen is rewritten (step S13).
- the label select buttons 40 maintain the currently selected state until a button other than the currently selected one is pressed. Therefore, two or more buttons cannot be simultaneously selected.
- when the Modify button is pressed again, or another command button 39 is pressed, the process terminates (step S14).
- the color of the Modify button returns to the original one from the selected state color.
- when the Delete button of the command buttons 39 is pressed, it changes its color to the selected-state color.
- the mouse cursor 41 is moved to the rectangle for the label to be deleted from the label information window 34, and the mouse button is pressed (step S15). Then, the control section 106 issues a signal confirming the deletion of the label to the display 15. If the deletion of the label is acknowledged (step S16), the information for that label is deleted from the label information storage 103 (step S17). In this case, the display screen is rewritten (step S18).
- when the Delete button is pressed again, or another command button 39 is pressed, the process terminates (step S19). The color of the Delete button returns to the original one from the selected-state color. If the deletion of the label is not acknowledged in step S16, the operation for the deletion is terminated.
- when the Select button of the command buttons 39 is pressed, it changes its color to the selected-state color.
- the mouse cursor 41 is sequentially moved to the rectangles for the plurality of labels to be deleted from the label information window 34, where the mouse button is pressed (steps S20 and S21). At the moment, the selected labels change their color to the selected state color as shown in FIG. 10 (c).
- the control section 106 issues a signal confirming the deletion of the plurality of labels to the display 15. If the deletion of the labels is acknowledged (step S23), the information for those labels is deleted from the label information storage 103 (step S24). In this case, the display screen is rewritten (step S25).
- when the Select button is pressed again, or another command button 39 is pressed, the process terminates (step S26). The color of the Select button returns to the original one from the selected-state color. If the deletion of the labels is not acknowledged in step S23, the deletion is terminated.
- the mouse cursor 41 is sequentially moved to the rectangles for the plurality of labels to be reproduced, where the mouse button is pressed (steps S20 and S21).
- the selected labels change their color to the selected state color as shown in FIG. 10 (c).
- the search section 102 searches for label information (step S28).
- the selected label information is sequentially reproduced (step S29).
- the process terminates (step S26).
- the color of the Select button returns to the original one from the selected-state color.
- the labels for attributes are arranged in the vertical direction.
- the horizontal direction represents time in a unit of frame.
- a section containing both of two labels is AND searched and a region S1 is provided.
- the region S1 becomes subject to the reproduction of speech and images.
- a section containing either one of two labels is OR searched and a region S2 is provided.
- the region S2 becomes subject to the reproduction of speech and images.
- the region S1 represents an interval from the moment "hai" starts to the moment "nod" ends, while the region S2 represents an interval from the moment "nod" starts to the moment saying "hai" ends.
- the region S1 represents an interval during "nodding” and saying "hai”(meaning yes), while the region S2 represents the entire interval of "nodding.”
- an AND or OR search, the technique for inputting and searching for a label, is specified through the keyboard 10 or the mouse 11.
- the broader term as shown in Table 1, means an event which can be represented by the relationship of the occurrence of a plurality of labels, and can be extracted by an AND or OR search.
- a broader term “agree” is a region extracted by OR-searching for the label name "hai” (meaning “yes") for the label attribute "speech” and the label name “nod” for the label attribute "gesture.” This originates in the fact that, when a person "agrees,” he or she expresses either or both "hai” speech and a "nod” gesture.
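The region logic described above can be sketched with interval operations: an AND search yields the frames where both labels co-occur (region S1), and an OR search yields the frames covered by either label (region S2). The interval representation, function names, and the frame numbers for "hai" below are assumptions; the "nod" interval 6943–6962 comes from the example in the description:

```python
# Sketch of the OR/AND region search over labeled frame intervals.

def and_search(a, b):
    """Intersect two (start, end) intervals; None if they do not overlap."""
    start, end = max(a[0], b[0]), min(a[1], b[1])
    return (start, end) if start <= end else None

def or_search(a, b):
    """Union of two (start, end) intervals; two regions when disjoint."""
    if and_search(a, b) is None:
        return [a, b]
    return [(min(a[0], b[0]), max(a[1], b[1]))]

# "agree" per Table 1: combine speech "hai" with gesture "nod".
hai = (6940, 6955)  # frames labeled "hai" (illustrative)
nod = (6943, 6962)  # frames labeled "nod" (from FIG. 5)
s1 = and_search(hai, nod)  # region S1: nodding while saying "hai"
s2 = or_search(hai, nod)   # region S2: either event occurring
```

Here S1 runs from the start of "nod" to the end of "hai", and S2 spans from the start of "hai" to the end of "nod", matching the intersection/union behavior described for the two region types.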
- the embodiment includes, as shown in FIG. 12, an input section 201 as an input (for example) for inputting speech and moving image data; a feature parameter storage 203 as a feature parameter storage (for example) for storing feature parameters of prelabeled data; a matching section 202 as a matching means (for example) for comparing the speech and moving image data input from the input section 201 with the feature parameters stored in the feature parameter storage 203, and for outputting a label name attached to a feature parameter which has the largest similarity and is larger than a predetermined threshold; a broader term storage 205 as a broader term storage means (for example) for storing broader terms for labels; a search section 204 as a second search means (for example) for searching for the broader terms stored in the broader term storage 205 from time series data of the label name of each label attribute obtained by the matching section 202; and an output section 206 as an output means (for example) for outputting the result searched for by the search section 204.
- the feature parameters for each label are extracted in this embodiment.
- the extraction is performed, for example, as shown in FIG. 13, for a whole body region 50, a head region 51, a face region 52, an eye region 53, and a mouth region 54.
- Table 2 shows which attribute relates to which region.
- Each region is represented, as shown in Table 2, by average coordinates obtained from many data items assuming that the position of a person recorded in the image data moves little.
- the values of coordinates are expressed in the sequence of left-most, lowermost, right-most, and uppermost.
- the feature parameters extracted from each region are time series data in the label interval, which data is previously stored in the feature parameter storage 203.
- when speech and moving images are input from the input section 201, they are matched with all the feature parameters stored in the feature parameter storage 203 by the matching section 202.
- a technique such as dynamic programming is used, and the most similar label for each label attribute is obtained together with the frame value for the interval being matched.
- the broader terms stored in the broader term storage 205 are then searched for by the search section 204, and the results are output by the output section 206.
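The matching step is said to use a technique such as dynamic programming; a common instance is dynamic time warping between a stored feature-parameter sequence and the input sequence, with a similarity threshold serving as the "predetermined condition". The sketch below is one such instance under those assumptions, using 1-D feature sequences and illustrative template values; none of the names or data come from the patent:

```python
# Minimal dynamic-time-warping match of an input feature sequence
# against stored per-label templates; a label is emitted only when the
# best template is within a threshold (the predetermined condition).

def dtw_distance(seq_a, seq_b):
    """DTW distance between two 1-D feature sequences."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1],
                                 d[i - 1][j - 1])
    return d[n][m]

def best_label(templates, observed, threshold):
    """Return the label whose stored template is nearest to the input,
    or None when no template is close enough."""
    label, best = None, threshold
    for name, template in templates.items():
        dist = dtw_distance(template, observed)
        if dist < best:
            label, best = name, dist
    return label
```

With illustrative templates such as `{"nod": [0, 1, 0], "bow": [0, 3, 3, 0]}`, an observed sequence `[0, 1, 1, 0]` selects "nod"; the resulting label-name time series per attribute would then feed the broader-term search.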
Abstract
Description
TABLE 1

BROADER TERM | LABEL ATTRIBUTE | LABEL NAME | SEARCH SPECIFICATION |
---|---|---|---|
AGREE | SPEECH | YES | OR SEARCH |
| GESTURE | NODDING | |
DISAGREE | SPEECH | NO | OR SEARCH |
| GESTURE | SHAKE ONE'S HEAD | |
-- | -- | -- | -- |
TABLE 2

ATTRIBUTE NAME | REGION NAME | COORDINATES |
---|---|---|
GESTURE | WHOLE BODY REGION | (50,10,350,590) |
EXPRESSION | FACE REGION | (150,400,250,500) |
HEAD | HEAD REGION | (80,350,320,570) |
EYE | EYE REGION | (160,450,240,500) |
MOUTH | MOUTH REGION | (180,320,220,350) |
SPEECH | MOUTH REGION | (180,320,220,350) |
Claims (8)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP6-044080 | 1994-03-15 | ||
JP04408094A JP3171744B2 (en) | 1994-03-15 | 1994-03-15 | Voice and video search device |
Publications (1)
Publication Number | Publication Date |
---|---|
US5561796A true US5561796A (en) | 1996-10-01 |
Family
ID=12681648
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/404,082 Expired - Lifetime US5561796A (en) | 1994-03-15 | 1995-03-14 | Apparatus for searching for speech and moving images |
Country Status (2)
Country | Link |
---|---|
US (1) | US5561796A (en) |
JP (1) | JP3171744B2 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19647660B4 (en) * | 1996-11-19 | 2005-09-01 | Daimlerchrysler Ag | Tripping device for occupant restraint systems in a vehicle |
JP2000315259A (en) * | 1999-05-06 | 2000-11-14 | Sharp Corp | Database creating device and recording medium in which database creation program is recorded |
EP1887561A3 (en) | 1999-08-26 | 2008-07-02 | Sony Corporation | Information retrieving method, information retrieving device, information storing method and information storage device |
WO2022003836A1 (en) * | 2020-06-30 | 2022-01-06 | 日本電信電話株式会社 | Processing system and processing method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4979050A (en) * | 1983-12-02 | 1990-12-18 | Lex Computer And Management Corporation | Video composition method for assembling video segments |
JPH052857A (en) * | 1991-02-20 | 1993-01-08 | Fuji Xerox Co Ltd | Recording, reproducing and editing device for picture and voice |
US5253361A (en) * | 1989-09-15 | 1993-10-12 | Emtek Health Care Systems, Inc. | System for accessing a row of time-dependent data by referring to a composite index table indicating page locations of linked row labels |
US5257185A (en) * | 1990-05-21 | 1993-10-26 | Ann W. Farley | Interactive, cross-referenced knowledge system |
US5267351A (en) * | 1989-12-22 | 1993-11-30 | Avid Technology, Inc. | Media storage and retrieval system |
US5339166A (en) * | 1991-10-30 | 1994-08-16 | Telediffusion De France | Motion-dependent image classification for editing purposes |
US5400436A (en) * | 1990-12-26 | 1995-03-21 | Mitsubishi Denki Kabushiki Kaisha | Information retrieval system |
US5404435A (en) * | 1991-07-29 | 1995-04-04 | International Business Machines Corporation | Non-text object storage and retrieval |
US5404295A (en) * | 1990-08-16 | 1995-04-04 | Katz; Boris | Method and apparatus for utilizing annotations to facilitate computer retrieval of database material |
US5428774A (en) * | 1992-03-24 | 1995-06-27 | International Business Machines Corporation | System of updating an index file of frame sequences so that it indexes non-overlapping motion image frame sequences |
- 1994-03-15 JP JP04408094A patent/JP3171744B2/en not_active Expired - Lifetime
- 1995-03-14 US US08/404,082 patent/US5561796A/en not_active Expired - Lifetime
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8046313B2 (en) | 1991-12-23 | 2011-10-25 | Hoffberg Steven M | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US8892495B2 (en) | 1991-12-23 | 2014-11-18 | Blanding Hovenweep, Llc | Adaptive pattern recognition based controller apparatus and method and human-interface therefore |
US6418424B1 (en) | 1991-12-23 | 2002-07-09 | Steven M. Hoffberg | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US7242988B1 (en) | 1991-12-23 | 2007-07-10 | Linda Irene Hoffberg | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
US5831616A (en) * | 1996-06-21 | 1998-11-03 | Samsung Electronics Co., Ltd. | Apparatus, and method for searching and retrieving moving image information |
US5828809A (en) * | 1996-10-01 | 1998-10-27 | Matsushita Electric Industrial Co., Ltd. | Method and apparatus for extracting indexing information from digital video data |
US7143109B2 (en) | 1998-02-04 | 2006-11-28 | Nugenesis Technologies Corporation | Information storage and retrieval system for storing and retrieving the visual form of information from an application in a database |
US6411922B1 (en) * | 1998-12-30 | 2002-06-25 | Objective Systems Integrators, Inc. | Problem modeling in resource optimization |
US7038715B1 (en) * | 1999-01-19 | 2006-05-02 | Texas Instruments Incorporated | Digital still camera with high-quality portrait mode |
US9535563B2 (en) | 1999-02-01 | 2017-01-03 | Blanding Hovenweep, Llc | Internet appliance system and method |
US10361802B1 (en) | 1999-02-01 | 2019-07-23 | Blanding Hovenweep, Llc | Adaptive pattern recognition based control system and method |
US6400996B1 (en) | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
US6640145B2 (en) | 1999-02-01 | 2003-10-28 | Steven Hoffberg | Media recording device with packet data interface |
US8583263B2 (en) | 1999-02-01 | 2013-11-12 | Steven M. Hoffberg | Internet appliance system and method |
US8369967B2 (en) | 1999-02-01 | 2013-02-05 | Hoffberg Steven M | Alarm system controller and a method for controlling an alarm system |
US7974714B2 (en) | 1999-10-05 | 2011-07-05 | Steven Mark Hoffberg | Intelligent electronic appliance system and method |
US7082436B1 (en) | 2000-01-05 | 2006-07-25 | Nugenesis Technologies Corporation | Storing and retrieving the visual form of data |
US7043439B2 (en) | 2000-03-29 | 2006-05-09 | Canon Kabushiki Kaisha | Machine interface |
US20020055916A1 (en) * | 2000-03-29 | 2002-05-09 | Jost Uwe Helmut | Machine interface |
GB2365552B (en) * | 2000-03-30 | 2004-11-17 | Canon Kk | Machine interface |
GB2365552A (en) * | 2000-03-30 | 2002-02-20 | Canon Kk | Machine interface using bookmarks |
US20020059303A1 (en) * | 2000-10-27 | 2002-05-16 | Yoshihiro Ohmori | Multimedia data management system |
US20030120661A1 (en) * | 2001-12-21 | 2003-06-26 | Mets Christiaan M.H. | Method and apparatus for retrieving event data related to an activity |
US7152068B2 (en) | 2001-12-21 | 2006-12-19 | Honeywell International Inc. | Method and apparatus for retrieving time series data related to an activity |
US20030120461A1 (en) * | 2001-12-21 | 2003-06-26 | Mets Christiaan M.H. | Method and system for capturing, storing and retrieving events and activities |
US20030120627A1 (en) * | 2001-12-21 | 2003-06-26 | Emery Michael J. | Method and apparatus for retrieving time series data related to an activity |
US7496591B2 (en) | 2001-12-21 | 2009-02-24 | Honeywell International Inc. | Method and system for capturing, storing and retrieving events and activities |
US20030120465A1 (en) * | 2001-12-21 | 2003-06-26 | Mets Christiaan M. H. | Method and apparatus for retrieving activity data related to an activity |
US7027954B2 (en) | 2001-12-21 | 2006-04-11 | Honeywell International Inc. | Method and apparatus for retrieving activity data related to an activity |
US7225193B2 (en) * | 2001-12-21 | 2007-05-29 | Honeywell International Inc. | Method and apparatus for retrieving event data related to an activity |
US20050141849A1 (en) * | 2003-11-27 | 2005-06-30 | Fuji Photo Film Co., Ltd. | Apparatus, method, and program for editing images |
US7327905B2 (en) * | 2003-11-27 | 2008-02-05 | Fujifilm Corporation | Apparatus, method, and program for editing images |
US20060222214A1 (en) * | 2005-04-01 | 2006-10-05 | Canon Kabushiki Kaisha | Image sensing device and control method thereof |
US7639282B2 (en) * | 2005-04-01 | 2009-12-29 | Canon Kabushiki Kaisha | Image sensing device that acquires a movie of a person or an object and senses a still image of the person or the object, and control method thereof |
US7903882B2 (en) * | 2006-05-09 | 2011-03-08 | Seiko Epson Corporation | Image management device |
US20070269140A1 (en) * | 2006-05-09 | 2007-11-22 | Seiko Epson Corporation | Image management device |
Also Published As
Publication number | Publication date |
---|---|
JPH07253986A (en) | 1995-10-03 |
JP3171744B2 (en) | 2001-06-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5561796A (en) | Apparatus for searching for speech and moving images | |
US5537528A (en) | System and method for inputting scene information | |
US5473744A (en) | Computer-assisted interactive method and apparatus for making a multi-media presentation | |
JP3526067B2 (en) | Reproduction device and reproduction method | |
US5339393A (en) | Graphical user interface for displaying available source material for editing | |
US7102644B2 (en) | Apparatus and method for storing a movie within a movie | |
JP3185505B2 (en) | Meeting record creation support device | |
US6304283B1 (en) | Conference apparatus and method for realistically reproducing image data and shared board data | |
KR100781623B1 (en) | System and method for annotating multi-modal characteristics in multimedia documents | |
US6154218A (en) | Method for generating, managing and displaying information retrieval data on information processing system | |
JPH10320400A (en) | Video search method and apparatus | |
JP2009163643A (en) | Video retrieval device, editing device, video retrieval method and program | |
JPH1049515A (en) | Information display device and information storing/ reproducing device | |
JPH07175816A (en) | Video associative search apparatus and method | |
US7266290B2 (en) | Real-time rich media recording system and method | |
JP2765270B2 (en) | Video presentation method | |
CN112565905B (en) | Image locking operation method, system, intelligent terminal and storage medium | |
GB2306836A (en) | Video presentation system | |
JP3711993B2 (en) | Video associative search device | |
JP2565048B2 (en) | Scenario presentation device | |
JPH0713687A (en) | Handwriting input device | |
JP3291327B2 (en) | Media data editing method and editing device | |
JPH01284973A (en) | Picture recording and reproducing device | |
JPH04291583A (en) | Information recorder | |
JP3676413B2 (en) | Video data management method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: REAL WORLD COMPUTING PARTNERSHIP, JAPAN. Owner name: SHARP KABUSHIKI KAISHA, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAKAMOTO, KENJI;WATANUKI, KEIKO;TOGAWA, FUMIO;REEL/FRAME:007455/0201. Effective date: 19950403 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 4 |
| AS | Assignment | Owner name: MINISTER OF ECONOMY, TRADE AND INDUSTRY, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:REAL WORLD COMPUTING PARTNERSHIP;REEL/FRAME:011996/0855. Effective date: 20010703 |
| FPAY | Fee payment | Year of fee payment: 8 |
| FPAY | Fee payment | Year of fee payment: 12 |