WO2011152072A1 - Content output device, content output method, program, program recording medium, and content output integrated circuit - Google Patents
Content output device, content output method, program, program recording medium, and content output integrated circuit
- Publication number
- WO2011152072A1 (PCT/JP2011/003160; JP2011003160W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- image
- attribute
- user
- contents
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/40—Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/19—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
- G11B27/28—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
- G11B27/32—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier
- G11B27/322—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on separate auxiliary tracks of the same or an auxiliary record carrier used signal is digitally coded
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/34—Indicating arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/432—Content retrieval operation from a local storage medium, e.g. hard-disk
- H04N21/4325—Content retrieval operation from a local storage medium, e.g. hard-disk by playing back content from the storage medium
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/433—Content storage operation, e.g. storage operation in response to a pause request, caching operations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/454—Content or additional data filtering, e.g. blocking advertisements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/482—End-user interface for program selection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/8146—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
- H04N21/8153—Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics comprising still images, e.g. texture, background image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
- H04N21/8405—Generation or processing of descriptive data, e.g. content descriptors represented by keywords
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/79—Processing of colour television signals in connection with recording
- H04N9/80—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
- H04N9/82—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
- H04N9/8205—Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal
Definitions
- the present invention relates to a system that outputs content such as video, images, and audio, and more particularly to a technique for searching for and outputting content.
- a search can be performed using the metadata of the content, and related content can be further displayed.
- the present invention has been made in view of such problems, and an object of the present invention is to provide a content output apparatus that displays related content that is highly likely to be of interest to the user.
- a content output device according to the present invention includes: storage means that stores a plurality of contents in association with the attributes of those contents; first output control means for outputting predetermined content to the screen from among the plurality of contents stored in the storage means; related content specifying means for specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content; user preference specifying means that classifies the plurality of contents stored in the storage means into a plurality of groups based on their respective attributes, calculates the total number of contents belonging to each classified group, and specifies the contents included in each group whose calculated total exceeds a predetermined value as a set of high-preference contents; and second output control means for outputting to the screen the contents belonging to both the set of related contents and the set of high-preference contents.
- the content output method according to the present invention is a content output method for outputting content to be presented to a user, and includes: a first output control step of outputting predetermined content to the screen from among a plurality of contents stored in storage means that stores the plurality of contents in association with their attributes; a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content; a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on their respective attributes, calculating the total number of contents belonging to each classified group, and specifying the contents included in each group whose calculated total exceeds a predetermined value as a set of high-preference contents; and a second output control step of outputting to the screen the contents belonging to both sets.
- the program according to the present invention is a program described in a computer-readable format so as to cause a computer to execute a processing procedure for outputting content to be presented to a user, the processing procedure including: a first output control step of outputting predetermined content to the screen from among a plurality of contents stored in storage means that stores the plurality of contents in association with their attributes; a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content; a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on their respective attributes, calculating the total number of contents belonging to each classified group, and specifying the contents included in each group whose calculated total exceeds a predetermined value as a set of high-preference contents; and a second output control step of outputting to the screen the contents belonging to both sets.
- the program recording medium according to the present invention is a program recording medium on which is recorded a program described in a computer-readable format so as to cause a computer to execute a processing procedure for outputting content to be presented to a user, the processing procedure including: a first output control step of outputting predetermined content to the screen from among a plurality of contents stored in storage means that stores the plurality of contents in association with their attributes; a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content; a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on their respective attributes, calculating the total number of contents belonging to each classified group, and specifying the contents included in each group whose calculated total exceeds a predetermined value as a set of high-preference contents; and a second output control step of outputting to the screen the contents belonging to both sets.
- the content output integrated circuit according to the present invention is a content output integrated circuit that outputs content to be presented to a user, and includes: first output control means for outputting predetermined content to the screen from among a plurality of contents stored in storage means that stores the plurality of contents in association with their attributes; related content specifying means for specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content; user preference specifying means that classifies the plurality of contents stored in the storage means into a plurality of groups based on their respective attributes, calculates the total number of contents belonging to each classified group, and specifies the contents included in each group whose calculated total exceeds a predetermined value as a set of high-preference contents; and second output control means for outputting to the screen the contents belonging to both the set of related contents and the set of high-preference contents.
- the content output apparatus can display related content that is highly likely to be of interest to the user by having the above-described configuration.
- the content output device may further include a user preference function generation unit that generates a user preference function for calculating a user preference level for the contents belonging to the set of high-preference contents, and the second output control means may use the generated user preference function to calculate the user preference level of each content belonging to the set of high-preference contents, process the contents belonging to both the set of related contents and the set of high-preference contents so as to be conspicuous in accordance with their user preference levels, and output them to the screen.
- since the display mode of each displayed related content is changed according to the user's preference level for that content (for example, content with a high user preference level is processed to be more conspicuous, and content with a low level to be less conspicuous), the user can more readily notice related content that is more likely to be of interest.
- when the attribute commonly held by the set of high-preference contents can be expressed numerically, the user preference function generation unit may generate the user preference function by plotting, on the horizontal axis, the attribute value of each content belonging to the set of high-preference contents and calculating a probability density function under the assumption that the plotted attribute values are samples following a normal distribution.
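As a sketch of this generation step, the preference function can be the density of a normal distribution fitted to the attribute values of the high-preference set. The numeric attribute values below are hypothetical placeholders, not values from the specification:

```python
import math

def make_user_preference_function(attribute_values):
    """Fit a normal distribution to the attribute values of the
    high-preference content set and return its probability density
    function as the user preference function."""
    n = len(attribute_values)
    mean = sum(attribute_values) / n
    variance = sum((x - mean) ** 2 for x in attribute_values) / n
    def preference(x):
        # Density of N(mean, variance) evaluated at x.
        return math.exp(-((x - mean) ** 2) / (2 * variance)) \
            / math.sqrt(2 * math.pi * variance)
    return preference

# Hypothetical numeric attribute values (e.g. creation hour) of the
# images in the high-preference set.
pref = make_user_preference_function([9, 10, 10, 11, 12])
```

Attribute values near the mean of the fitted distribution then receive the highest preference level, which matches the intent of making content near the user's typical attribute values more conspicuous.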
- FIG. 1 is a configuration diagram of an image display apparatus 1000 according to Embodiment 1.
- a diagram showing the structure of the data stored in the storage unit 1100.
- a flowchart showing the processing performed by the image display apparatus 1000.
- a flowchart showing the processing performed by the related content specifying means 1500.
- a diagram showing an example of a list of the attributes associated with all images stored in the storage unit 1100.
- a diagram showing an example of the result of specifying related images by the related content specifying means 1500.
- a flowchart showing the processing performed by the user preference specifying means 1200.
- a configuration diagram of an image display apparatus 2000 according to Embodiment 2.
- a flowchart showing the processing performed by the image display apparatus 2000.
- a diagram showing the concept of the user preference calculation performed by the second output control means 2600 on the "creation date" attribute in the image display apparatus 2000.
- a diagram showing the concept of the user preference calculation performed by the second output control means 2600 on the "location" attribute in the image display apparatus 2000.
- a diagram showing an example of an output result by the image display apparatus 2000.
- a diagram showing another example of an output result by the image display apparatus 2000.
- a configuration diagram of an image display device 3000 according to Embodiment 3.
- a diagram showing the data structure and example contents of the attached data 1800 for each image.
- a diagram showing an example of the contents of the stored life event information.
- a flowchart showing the operation of the image display device 3000.
- a configuration diagram of an image display device 4000 according to Embodiment 4.
- a flowchart showing the operation of the image display device 4000.
- a flowchart showing the arrangement determining process performed by the arrangement determining unit 4400.
- a diagram illustrating an output result by the image display device 4000.
- a flowchart showing the attribute information evaluation process performed by the attribute information evaluation means 4200.
- a diagram illustrating another output result by the image display device 4000.
- diagrams illustrating further output results of the image display device.
- the image display apparatus 1000 includes storage means that stores a plurality of images, displays an image in response to a user operation, and searches the plurality of images stored in the storage means for an image that is related to the displayed image and that matches the user's preference, which it then displays to the user.
- each image is stored in the storage means in association with attributes such as its creation date and time, the location, the creator name, and a character string representing the content of the image (hereinafter referred to as a keyword).
- the image display device searches for an image related to the displayed image based on a user operation by extracting an image having an attribute related to the attribute of the displayed image.
- the search for images that match the user's preference among the images stored in the storage means is performed by grouping the images according to their attributes, counting the number of images in each group, and extracting the images belonging to groups whose count exceeds a reference value. Since it can generally be estimated that images the user possesses in large numbers are important to the user, this can be expected to extract images that match the user's preference.
- the image belonging to both the above-described set of related images and the set of images estimated to meet the user's preference is displayed on the display device.
- a keyword attribute representing the content is used as the attribute for specifying images that match the user's preference. For example, if the user possesses many images whose keyword relates to travel, travel images are displayed from among the related images. In this case, since travel can be expected to be important to the user, images that are highly likely to be of interest to the user are displayed.
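The flow described above (group stored images by keyword attribute, keep the groups whose size exceeds a reference value, then intersect with the related images) can be sketched as follows. The record fields and the reference value are illustrative assumptions, not taken from the specification:

```python
from collections import defaultdict

def high_preference_images(images, reference_count):
    """Group images by keyword attribute and collect the content IDs
    of every group whose size exceeds the reference value."""
    groups = defaultdict(list)
    for image in images:
        groups[image["keyword"]].append(image["content_id"])
    result = set()
    for members in groups.values():
        if len(members) > reference_count:
            result.update(members)
    return result

def recommended_images(related_ids, high_pref_ids):
    """A recommended image belongs to both the related set and the
    high-preference set."""
    return related_ids & high_pref_ids

# Hypothetical stored images.
images = [
    {"content_id": 1, "keyword": "travel"},
    {"content_id": 2, "keyword": "travel"},
    {"content_id": 3, "keyword": "travel"},
    {"content_id": 4, "keyword": "pets"},
]
high = high_preference_images(images, reference_count=2)
rec = recommended_images({2, 4}, high)
```

With these placeholder values the "travel" group exceeds the reference count, so only the related image that is also a travel image is recommended.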
- the image display apparatus 1000 will be described with reference to the configuration diagram shown in FIG.
- the image display apparatus 1000 includes a storage unit 1100, a user preference specifying unit 1200, a user operation receiving unit 1300, a first output control unit 1400, a related content specifying unit 1500, and a second output control unit 1600.
- a display device 1700 that is a display or the like for drawing an image is connected to the image display device 1000 as an external output device.
- the storage unit 1100 includes an HDD (Hard Disk Drive), a memory, and the like, and stores a plurality of images 1101 in association with the attributes 1102 of those images.
- the user preference specifying unit 1200, the user operation accepting unit 1300, the first output control unit 1400, the related content specifying unit 1500, and the second output control unit 1600 are each composed of a processor and a memory, and each function described later is realized by the processor executing a control program stored in the memory.
- the user operation accepting unit 1300 additionally includes a pointing device for accepting an operation from the user.
- when the user preference specifying unit 1200 receives notification from the user operation receiving unit 1300 that an image has been input to the storage unit 1100, it refers to the attributes of the images in the storage unit 1100, groups the images having the same keyword attribute, and, for each group whose total number of images exceeds a predetermined reference number (hereinafter, a high user preference group), has a function of notifying the second output control means 1600 of the images included in that group (hereinafter, high user preference images).
- when the user operation accepting unit 1300 receives an image input by the user, it stores the image in the storage unit 1100 in association with its attributes and notifies the user preference specifying unit 1200 of the image input; it also has a function of notifying the first output control means 1400 of an image display command, and, upon receiving from the user an operation selecting one of the images output to the display device 1700, a function of notifying the first output control means 1400 of an image selection command.
- the image display command instructs the first output control means to create reduced images (thumbnail images) of the images stored in the storage unit 1100 and display them on the display device 1700.
- the image selection command instructs it to retrieve the selected image from the storage unit 1100, display it on the display device 1700, and instruct the related content specifying unit 1500 to specify the related images of the selected image.
- the first output control unit 1400 has a function of receiving the two types of commands from the user operation receiving unit 1300 and processing them.
- when an image display command is received, it acquires the images to be displayed from the storage unit 1100, creates thumbnail images, and outputs them to the external display device 1700 for display.
- when an image selection command is received, it acquires the image selected by the user from the storage unit 1100, outputs it to the external display device 1700 for display, and instructs the related content specifying unit 1500 to specify the related images of the selected image.
- the related content specifying unit 1500 has a function of receiving, from the first output control unit 1400, the instruction to specify the related images of the image selected by the user and performing the following process.
- for each of the three attribute types, namely the creation date attribute, the creator name attribute, and the location attribute, it identifies as related images those images having an attribute in a predetermined relationship with the corresponding attribute of the image selected by the user, and notifies the second output control means 1600 of those images. The predetermined relationship will be described later.
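The predetermined relationship is defined later in the specification; purely as a placeholder sketch, one might treat an image as related when its creator matches, its creation day falls within an assumed window, or its coordinates lie within an assumed range. All three checks and both thresholds below are hypothetical assumptions, not the relationship the patent actually defines:

```python
def is_related(selected, candidate, max_day_gap=7, max_coord_gap=0.5):
    """Hypothetical relation check over the three attribute types the
    embodiment names: creator name, creation date, and location.
    Thresholds are illustrative only."""
    # Same creator name.
    if candidate["creator"] == selected["creator"]:
        return True
    # Creation dates within an assumed window (days as integers here).
    if abs(candidate["day"] - selected["day"]) <= max_day_gap:
        return True
    # Locations within an assumed latitude/longitude range.
    lat_gap = abs(candidate["lat"] - selected["lat"])
    lon_gap = abs(candidate["lon"] - selected["lon"])
    return lat_gap <= max_coord_gap and lon_gap <= max_coord_gap

selected = {"creator": "DD", "day": 100, "lat": 35.0, "lon": 139.0}
candidate = {"creator": "EE", "day": 104, "lat": 40.0, "lon": 100.0}
# candidate is related to selected via the creation-date window
```

An image passing any one of the three checks would be handed to the second output control means 1600 as a related image.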
- the second output control unit 1600 receives notification of the set of high user preference images from the user preference specifying unit 1200 and of the set of related images from the related content specifying unit 1500, and specifies the images belonging to both sets as recommended images.
- it then acquires the recommended images from the storage unit 1100 and outputs them to the external display device 1700 for display.
- the display device 1700 has a function of displaying an image on a display or the like when an image output from the first output control unit 1400 or a recommended image output from the second output control unit 1600 is received.
- Storage unit 1100 stores a plurality of images and attributes corresponding to the images in association with each other.
- the image here refers to still image data such as a photograph taken with a digital camera (JPEG (Joint Photographic Experts Group) or the like).
- FIG. 2 is an example showing a data structure 200 associated with an image stored in the storage unit 1100.
- the table 200 includes a column 201 indicating the type of data and a column 202 containing the actual data, and holds a plurality of such element pairs. Each element in the example of FIG. 2 is described below.
- “File path” represents a location where an image is stored.
- the “content ID” is a number for uniquely identifying the image stored in the “file path”.
- “Type” is the type of the file stored in the “file path”; in the first embodiment, all entries are “image”.
- “Creation date/time attribute”, “Location attribute”, “Creator name attribute”, and “Keyword attribute” are, respectively, the date and time at which the image stored in the “file path” was created, the location where it was created, the name of its creator, and a character string representing the contents of the image.
- in the example of FIG. 2, the “file path” indicates that the image is stored as the file 00123.xxx in the abc folder of the storage unit 1100.
- the content ID, element 204 of the table, is a numerical value assigned by the image display device when an image is stored in the storage unit 1100 and is unique to each content. In the example of FIG. 2, the content ID uniquely identifying the image /abc/00123.xxx is 8.
- “Location”, attribute type 207 in the table, represents the latitude and longitude of the place of creation as numerical values in the “{latitude, longitude}” format. This data may be given according to the specifications of the device that created the file. In the example of FIG. 2, the image was created at latitude x6 and longitude y6.
- the “creator name”, attribute type 208 in the table, may be given depending on the specifications of the device that created the image; since it is not necessarily given, the user may also input it using the image display device. In the example of FIG. 2, the image was created by “DD man”.
- “Keyword”, attribute type 209 in the table, may be given depending on the specifications of the device that created the image; since it is not necessarily given, the user may also input it using the image display device. In the example of FIG. 2, the keyword indicates that the image is content related to travel.
- since the attribute types “location”, “creator name”, and “keyword” are not necessarily given, their data may be absent.
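As an illustration of the attribute set of FIG. 2, the data could be modeled as a simple record; the field names below are hypothetical, the location values stand in for the latitude x6 and longitude y6 of the figure, and optional attributes are represented by None when absent:

```python
# Hypothetical sketch of one attribute record of FIG. 2.
# Optional attributes ("location", "creator", "keyword") may be None.
image_record = {
    "file_path": "/abc/00123.xxx",  # storage location of the image
    "content_id": 8,                # number unique to each content
    "type": "image",                # always "image" in Embodiment 1
    "created": "2009-11-03T15:01",  # creation date/time attribute
    "location": (0.0, 0.0),         # placeholder for {latitude x6, longitude y6}
    "creator": "DD man",            # creator name attribute (may be None)
    "keyword": "travel",            # keyword attribute (may be None)
}

def has_attribute(record, name):
    """An attribute without data is excluded from related-content processing."""
    return record.get(name) is not None
```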
- FIG. 3 is a flowchart showing display processing performed by the image display apparatus 1000.
- the user operation receiving means 1300 receives an image display request from the user (step S301).
- if no image display request has been received (“NO” in step S301), the process returns to step S301.
- the first output control unit 1400 reads a plurality of images from the storage unit 1100, creates thumbnail images, and outputs them to the display device 1700 for display (step S302).
- this display process may be referred to as the first output control process. The device then waits for the user to select one image that he or she wishes to view (step S303); if no image is selected (“NO” in step S303), it continues to wait.
- when an image is selected (“YES” in step S303), the first output control unit 1400 displays the selected image (step S304), and the related content specifying unit 1500 performs the related content specifying process (step S305).
- FIG. 4 is a flowchart showing the related content specifying process performed by the related content specifying unit 1500 in step S305.
- the related content specifying unit 1500 acquires a creation date attribute, a creator name attribute, and a location attribute from the attributes of the image selected by the user (step S401).
- the creation date / time attribute, the creator name attribute, and the location attribute are each checked for an association with the image selected by the user (step S402).
- for the creation date attribute, an image whose creation date differs from that of the image selected by the user by no more than a reference value is regarded as a related image (step S403).
- for the creator name attribute, an image whose creator name is the same as that of the image selected by the user is regarded as a related image (step S404).
- for the location attribute, an image whose location differs from that of the image selected by the user by no more than a reference value is regarded as a related image (step S405).
- the identification of images having an attribute in a predetermined relationship with the image selected by the user, described in the configuration above, is thus achieved by the related content specifying unit 1500 executing steps S403 to S405. Note that the processes in steps S403 to S405 may be executed in order or in parallel.
- in this way, the related images of the creation date attribute, the related images of the creator name attribute, and the related images of the location attribute are specified (step S406).
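Steps S401 to S406 can be sketched as follows; the specific reference values for the creation-date and location comparisons are hypothetical placeholders, since the text only states that reference values exist, and images lacking an attribute are skipped:

```python
from datetime import datetime

def find_related(selected, candidates, date_ref_days=30, loc_ref_deg=1.0):
    """Sketch of steps S401-S406: per-attribute related-image sets.

    date_ref_days and loc_ref_deg are assumed reference values.
    Returns content IDs whose attribute is in the predetermined
    relationship with the selected image, keyed by attribute.
    """
    related = {"created": set(), "creator": set(), "location": set()}
    for c in candidates:
        if c["content_id"] == selected["content_id"]:
            continue
        # S403: creation-date difference within the reference value
        if c.get("created") and selected.get("created"):
            diff = abs((datetime.fromisoformat(c["created"])
                        - datetime.fromisoformat(selected["created"])).days)
            if diff <= date_ref_days:
                related["created"].add(c["content_id"])
        # S404: identical creator name
        if c.get("creator") and selected.get("creator"):
            if c["creator"] == selected["creator"]:
                related["creator"].add(c["content_id"])
        # S405: location difference within the reference value
        if c.get("location") and selected.get("location"):
            (x1, y1), (x2, y2) = selected["location"], c["location"]
            if abs(x1 - x2) <= loc_ref_deg and abs(y1 - y2) <= loc_ref_deg:
                related["location"].add(c["content_id"])
    return related
```

The three per-attribute checks are independent, which is why the text notes that steps S403 to S405 may run in order or in parallel.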
- a concrete example of the related content specifying process using the table of FIG. 5 follows. In FIG. 5, data with content IDs of 11 or more are omitted.
- processing performed by the related content specifying unit 1500 for images having content IDs up to 10 will be described.
- a “-” entry indicates that the attribute is absent; such an image is excluded from the related content specifying process for that attribute.
- the “creation date” attribute is selected as the attribute of this image.
- the “author name” attribute will be described.
- the “location” attribute is also described.
- the “location” attribute of the image with content ID 8 is “{x6, y6}” (latitude x6, longitude y6).
- the user operation accepting unit 1300 checks whether an image has been input, based on the presence or absence of an input signal from an external storage unit (flash memory, SD (Secure Digital) card, etc.) to the internal storage unit 1100 (step S306). If no image has been input by the user (“NO” in step S306), the device waits for an image input. When the user inputs an image (“YES” in step S306), the user preference specifying unit 1200 performs the user preference specifying process (step S307).
- FIG. 7 is a flowchart showing the user preference specifying process performed by the user preference specifying unit 1200 in step S307.
- the user preference specifying unit 1200 groups all images stored in the storage unit 1100, using the keyword attribute. First, the images are divided into groups having the same keyword attribute (step S701). The number of images in each group is counted, and each group exceeding a predetermined reference number is set as a high user preference group (step S702). Finally, the images included in the high user preference groups are specified as high user preference images (step S703).
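Steps S701 to S703 can be sketched as follows (field names hypothetical; the reference number of 3 follows the example given later in the text):

```python
from collections import defaultdict

def high_preference_images(images, reference_number=3):
    """Sketch of steps S701-S703: group by keyword attribute, then keep
    images from groups whose size exceeds the reference number."""
    groups = defaultdict(list)  # S701: one group per keyword attribute
    for img in images:
        if img.get("keyword") is not None:
            groups[img["keyword"]].append(img["content_id"])
    # S702: groups exceeding the reference number are high-preference groups
    high_groups = {k: ids for k, ids in groups.items()
                   if len(ids) > reference_number}
    # S703: every image in a high-preference group is a high-preference image
    return {cid for ids in high_groups.values() for cid in ids}
```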
- FIG. 5 shows a table of data recorded in the storage unit 1100.
- data with content IDs of 11 or more are omitted.
- a process performed by the user preference specifying unit 1200 with an image having a content ID of up to 10 will be described.
- a group exceeding the predetermined reference number is set as a high user preference group (step S702).
- the reference number is 3
- the group having the keyword attributes “sweets” and “travel” is set as a high user preference group.
- an image included in the high user preference group is specified as a high user preference image (step S703).
- the second output control means 1600 uses the images belonging to both the set of high user preference images and the set of related images as recommended images.
- the recommended image is displayed together with the image selected by the user (step S308), and the process returns to step S301.
- the specification and display processing of the recommended image by the second output control means 1600 may be referred to as second output control processing.
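The specification of recommended images described above amounts to a set intersection; a minimal sketch, with data shapes assumed:

```python
def recommended_images(high_preference_ids, related_by_attribute):
    """Sketch of the second output control: an image is recommended when it
    belongs both to the high-user-preference set and to some related-image
    set (related sets are keyed per attribute, as in the specifying process)."""
    all_related = (set().union(*related_by_attribute.values())
                   if related_by_attribute else set())
    return high_preference_ids & all_related
```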
- An icon indicating content ID 10, which is a highly user-preferred image, is displayed together with the keyword attribute “sweets” (902A in FIG. 9).
- the image display apparatus 2000 which is an embodiment of the content output apparatus will be described.
- FIG. 10 is a configuration diagram of the image display apparatus 2000 according to the second embodiment.
- the image display device 2000 is obtained from the image display device 1000 of the first embodiment by modifying its constituent elements so that a user preference level is calculated based on the attributes of the high user preference images. Specifically, the second output control means 1600 of the image display apparatus 1000 of the first embodiment is replaced with second output control means 2600 having a function of calculating the user preference level. As a result, before a recommended image is displayed, a user preference level indicating the strength of the user's preference is calculated and reflected in the output.
- the image display apparatus 2000 includes the same storage means 1100, user preference specifying means 1200, user operation accepting means 1300, first output control means 1400, and related content specifying means 1500 as the image display apparatus shown in the first embodiment.
- description of the same components as those of the image display apparatus 1000 is omitted.
- the second output control means 2600 includes a processor and a memory, and the following functions are realized when the processor executes a control program stored in the memory.
- the second output control unit 2600 receives notification of the set of high user preference images from the user preference specifying unit 1200 and of the set of related images from the related content specifying process, and specifies the images belonging to both sets as recommended images. It then acquires the recommended images from the storage unit 1100, calculates the user preference level using the attributes of the image selected by the user and the attributes of the images specified by the user preference specifying unit 1200, and outputs the user preference level together with the recommended images to the external display device 1700 for display. The method for calculating the user preference level is described later.
- the second output control unit 2600 includes a user preference function generation unit 2610.
- the user preference function generation unit 2610 has a function of generating a user preference function for calculating a user preference level.
- in the above description, the second output control means 2600 includes the user preference function generation means 2610; however, the user preference function generation means 2610 need not be included in the second output control means 2600 as long as it is provided somewhere in the image display device 2000. In that case, the user preference function generation unit 2610 may be configured to output the generated user preference function to the second output control unit 2600.
- the data structure of the storage unit 1100 handled by the image display apparatus 2000 configured as above will now be described.
- the data structure handled in the image display apparatus 2000 is the same as the data structure handled in the first embodiment. Therefore, the description is omitted.
- FIG. 11 is a flowchart showing the operation of the image display apparatus 2000.
- the same reference numerals as in the first embodiment are used for the same operations as those in the flowchart shown in FIG. 3 of the first embodiment, and the description thereof is omitted.
- the image display device 2000 uses the second output control unit 2600 to specify an image belonging to both the related image and the high user preference image as a recommended image (step S1108).
- the display content is determined using the user preference level calculated from the attributes of the image selected by the user and the attributes of the images specified by the user preference specifying unit 1200, and is output to and displayed on the display device 1700 (step S1109).
- for convenience, the second output control process was described as step S1109; strictly speaking, steps S1108 and S1109 together constitute the second output control process.
- the high user preference group has two attributes “sweets” and “travel” as shown in FIG.
- the attributes that can be represented by numerical values are “creation date” and “location”.
- a probability density function with the “creation date” t of the high user preference group having “sweets” as its variable is defined as the user preference function F_creation-date(t).
- for example, the value obtained by substituting x and y into the user preference function F_place(x, y), which has the “place” {x, y} of the high user preference group as its variables, is 0.5.
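The patent does not fix the form of the probability density used as the user preference function; as one possible realization (an assumption), a normal density could be fitted to the attribute values of a high user preference group:

```python
import math

def preference_function(samples):
    """Sketch: build F(t) as a normal probability density fitted to the
    attribute values (e.g. creation times, or one location coordinate) of
    a high user preference group.  The normal form is an assumption; the
    text only requires a probability density over the attribute values."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    sigma = math.sqrt(var) if var > 0 else 1.0  # avoid a degenerate density
    def f(t):
        return (math.exp(-((t - mean) ** 2) / (2 * sigma ** 2))
                / (sigma * math.sqrt(2 * math.pi)))
    return f
```

A user preference level is then read off by substituting the selected image's attribute value into the fitted density, as in the F_place(x, y) example above.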
- the second output control means 2600 displays the images that are both related images and high user preference images according to the user preference level. For example, as shown in FIG. 14, an area (1401) for displaying related images of the “creation date” attribute of the image selected by the user and an area (1402) for displaying related images of the “location” attribute are provided, and in each area the images that are related images of the corresponding attribute and are also high user preference images are displayed in descending order of user preference level.
- the second output control means 2600 may change the display mode of each recommended image according to the user preference level. For example, the display mode may be changed so that a recommended image with a higher user preference level is more conspicuous (for example, displayed larger) and one with a lower user preference level is less conspicuous (for example, displayed smaller than highly preferred recommended images). Further details of changing the display mode according to the user preference level are described in the third and fourth embodiments.
- the image display device 3000 includes a storage unit 3100 that stores a plurality of images and, based on a user operation, displays the images at the same size in order of creation date and time; by changing the shading of each image according to the user's operation, it assists the user's image search.
- the creation date / time, location, and face information of the person shown in the image are stored in the storage unit 3100 in association with each other as attributes.
- the image display device 3000 calculates, for each attribute, an importance component that contributes to an importance value; using slide bars in the screen as a user interface, the user specifies a weighting factor indicating the degree of importance of each attribute, and the sum over all attributes of the products of the weighting factors and the importance components is used as the importance of the image.
- the importance of each image is classified into N levels (here, three levels), and the image display device 3000 displays the images in different display modes according to the classification result. That is, the display mode is controlled so that the display size is the same for all images, but images classified into a higher level are displayed darker and images classified into a lower level lighter. Further, the images are arranged in the order of the date and time indicated by the creation date attribute. This facilitates the user's image search.
- the image display device 3000 includes a storage unit 3100, an attribute information evaluation unit 3200, a user operation reception unit 3300, an arrangement determination unit 3400, a display mode classification unit 3500, and a display control unit 3600.
- the image display device 3000 is connected to a display device 3700 that is a display or the like for drawing an image.
- a range surrounded by a dotted line in FIG. 17 corresponds to the second output control means 2600 shown in the first and second embodiments.
- the user operation accepting unit 3300 corresponds to the user operation accepting unit 1300 shown in the first and second embodiments
- the storage unit 3100 corresponds to the storage unit 1100 shown in the first and second embodiments.
- the storage unit 3100 includes an HDD (Hard Disk Drive), a memory, and the like, and stores a plurality of images 1101 and attributes 1102 of the images in association with each other. For example, a photograph taken with a digital camera and input by the user is stored in association with its attributes.
- the attribute information evaluation unit 3200, the user operation reception unit 3300, the arrangement determination unit 3400, the display mode classification unit 3500, and the display control unit 3600 include a processor and a memory. Each function of these units (described in detail later) is realized by the processor executing a control program stored in the memory.
- the user operation accepting unit 3300 additionally includes a USB (Universal Serial Bus) interface for accepting image input, a mouse for accepting an operation from the user, and the like.
- the attribute information evaluation unit 3200 has a function of performing the attribute information evaluation process described later when it receives, from the user operation reception unit 3300, notification that an image has been input to the storage unit 3100. That is, it calculates an importance component for each attribute of each image in the storage unit 3100, using life event information and frequent location information composed of information input in advance by the user, together with frequent human information generated from the input images, and stores the components in the storage means 3100. The life event information, the frequent location information, the frequent human information, and the calculation of the importance component for each attribute are described in detail later.
- the user operation receiving means 3300 has four functions.
- the creation date attribute and the location attribute are acquired from information added to the image by the device that created it, and the face information attribute is acquired by analyzing the image with a face recognition program based on conventional face recognition technology immediately after the input is received. The unit has a function of storing these acquired attributes as attached data associated with the “file path”, “content ID”, and “type” described later, and, after storing the image and the attached data in the storage unit 3100, a function of notifying the attribute information evaluation unit 3200 that an image has been input.
- it also has a function of notifying the display mode classification means 3500 of the weighting factor for each attribute when it receives from the user an input operation on the slide bars, output to the display device 3700, for inputting the weighting factor of each attribute.
- the arrangement determining unit 3400 has a function of receiving from the user operation accepting unit 3300 a notification to determine the display arrangement, performing an arrangement determination process based on a display arrangement algorithm for the creation date attribute that arranges the images in the order of their creation date and time, determining the display position of each image, and notifying the display control means 3600 of the determined arrangement.
- the display mode classification unit 3500 has a function of performing the display mode classification process described later when it receives notification of the weighting factor values for each attribute from the user operation reception unit 3300. That is, it acquires the importance component values from the storage means 3100, multiplies, for each attribute, the weighting factor by the importance component, adds all the products (the values of the multiplied results) to obtain the importance of the image, and notifies the display control means 3600 of a display mode in which all images have the same size and images of higher importance are displayed darker.
- the display control unit 3600 receives a notification of the display arrangement of each image from the arrangement determination unit 3400, receives a notification of the display mode for each image from the display mode classification unit 3500, acquires the displayed image from the storage unit 3100, and It has a function of outputting an image to an external display device 3700 according to the display arrangement and display mode of the image.
- the display device 3700 has a function of displaying an image on a display or the like when an image input from the display control means 3600 is received.
- the image display device stores a plurality of images and attributes corresponding to the images in the storage unit 3100 in association with each other.
- the image here refers to still image data (JPEG (Joint Photographic Experts Group) or the like) such as a photograph taken with a digital camera.
- FIG. 18 is a diagram showing the data structure and example contents of the attached data 1800, including attributes, that is associated with each image stored in the storage unit 3100.
- the attached data 1800 includes a file path 1803, a content ID 1804, a type 1805, a creation date/time attribute 1806, a location attribute 1807, a face information attribute 1808, an importance component 1809 of the creation date/time attribute, an importance component 1810 of the location attribute, and an importance component 1811 of the face information attribute. Each element in the example of FIG. 18 is described below.
- the file path 1803 represents the storage location of the file, and is determined by the user operation accepting means 3300, in response to the user specifying where to save the input image, to indicate the location of the image.
- in the example of FIG. 18, the image is stored as the file 00123.xxx in the abc folder of the storage unit 3100.
- the content ID 1804 is a numerical value assigned by the user operation accepting unit 3300 when the image is stored in the storage unit 3100 and is unique to each image. In the example of FIG. 18, the content ID uniquely identifying the image /abc/00123.xxx is 8.
- the type 1805 represents the type of content, and is described here as being uniformly “image”, indicating that the file content is a still image.
- the creation date / time attribute 1806 indicates the date and time at which the image was created by numerical values. This data is given to the image as indicating the time of image creation by a device (such as a digital camera) that created the image. The example of FIG. 18 indicates that the image was created at 15:01 on November 3, 2009.
- the location attribute 1807 is a numerical value indicating the latitude and longitude of the created location. This data is given to the image as indicating the location of the image creation by a device (such as a digital camera) that created the image. The example in FIG. 18 indicates that the image is created at latitude x6 and longitude y6.
- the face information attribute 1808 is given by analyzing the image by the user operation receiving means 3300.
- in the example of FIG. 18, the number of people shown in the image is two; each person is assigned face number 1 or 2, and the image features for face identification obtained as the analysis result of the face recognition program are represented by parameter values such as “parameter A” and “parameter B”.
- the importance component 1809 of the creation date/time attribute, the importance component 1810 of the place attribute, and the importance component 1811 of the face information attribute are importance components of the image, one per attribute, given by the attribute information evaluation unit 3200; each ranges from 0 to 1. Details are described later.
- the life event information is a set of date information representing the dates and times of events for the user. For example, if the user is a woman born on May 10, 1990, candidates include her birthday (May 10), her first shrine visit (June 10, 1990), her first seasonal festival (March 3, 1991), and so on.
- the life event information shown in FIG. 19 is stored in the storage unit 3100 by the user. Since a life event may be any date that the user considers an event, date information other than that shown in FIG. 19 may also be input.
- the frequently-occurring place information is information relating to places where the user frequently goes.
- Each location shown in the frequent location information is specifically represented by longitude and latitude information.
- “{35, 135}” indicates a latitude of 35 degrees and a longitude of 135 degrees.
- the frequent location information as shown in FIG. 20 is stored in the storage unit 3100 by the user.
- location information other than that shown in FIG. 20 can also be input.
- the frequent human information is information on the people who frequently appear in photographs with the user; for example, as shown in FIG. 21, the user himself or herself and the user's parents.
- in FIG. 21 as well, parameter values obtained by the same face recognition analysis used for the parameter values of the face information attribute 1808 shown in FIG. 18 are used, represented by “parameter A”, “parameter B”, and so on. The frequent human information shown in FIG. 21 is stored in the storage unit 3100 by the user.
- since any person whom the user considers to be “frequently photographed” suffices, the frequent human information may include people other than those shown in FIG. 21.
- FIG. 22 is a flowchart showing display processing performed by the image display device 3000.
- the user operation receiving means 3300 transfers the image from the external storage device to the storage unit 3100, and the attribute information evaluation unit 3200 performs the attribute information evaluation process on the newly input image (step S2202).
- FIG. 23 is a flowchart showing the attribute information evaluation process performed by the attribute information evaluation unit 3200 in step S2202.
- steps S2301 to S2303 are the process for obtaining the importance component 1809 of the creation date attribute in the data of FIG. 18, steps S2304 to S2306 the process for obtaining the importance component 1810 of the location attribute, and steps S2307 to S2308 the process for obtaining the importance component 1811 of the face information attribute.
- the attribute information evaluation unit 3200 refers to the life event information shown in FIG. 19 (step S2301).
- the attribute information evaluation unit 3200 then identifies, among the dates registered as life event information, the one closest to the date indicated by the creation date attribute of the image subject to attribute information evaluation, and calculates the difference value in days (step S2302). For a life event without a year designation, such as a birthday, the difference value is calculated by treating it as a life event of the same year as the date indicated by the creation date attribute.
- the difference value is 177 days.
- the attribute information evaluation unit 3200 then normalizes the difference value calculated in step S2302 so that it approaches 1 as the difference value increases and approaches 0 as it decreases, thereby obtaining the importance component of the creation date attribute (step S2303).
- the importance component of the creation date attribute is thus a value from 0 to less than 1; if the computed value becomes negative, it is set to 0.
- the attribute information evaluation unit 3200 refers to the frequent location information as shown in FIG. 20 (step S2304).
- after referring to the frequent location information, the attribute information evaluation unit 3200 compares the location attribute of the image subject to the attribute information evaluation process with the closest of the locations registered as frequent location information and calculates the distance in km (step S2305). For example, for the image with the attributes shown in FIG. 18 and the frequent location information of FIG. 20 registered, the distance between the location attribute “latitude x6, longitude y6” and the nearest registered location is calculated; if the distance is greater than 30 km, it is treated as 30 km.
- the attribute information evaluation unit 3200 then normalizes the distance calculated in step S2305 so that it approaches 1 as the distance increases and approaches 0 as it decreases, thereby obtaining the importance component of the location attribute (step S2306).
- step S2307 and step S2308 will be described.
- after calculating the importance component of the location attribute, the attribute information evaluation unit 3200 refers to the frequent human information shown in FIG. 21 and obtains the number of people shown in the image who are not registered as frequently photographed (step S2307). For example, for the image with the attributes shown in FIG. 18 and the frequent human information of FIG. 21 registered, the number of such people is two.
- when there is no image input from the user (“NO” in step S2201), and after the attribute information evaluation process of FIG. 23 (step S2202 in FIG. 22) has been performed, it is checked whether there is an image display request from the user (step S2203). If there is an image display request from the user (“YES” in step S2203), the arrangement determining unit 3400 performs the arrangement determination process (step S2204).
- the arrangement determining unit 3400 arranges the displayed images in the order of the creation date attribute (step S2401). As a result, the images are arranged in order of date. Next, grouping is performed for each month, and the display order is determined (step S2402). Thereby, the images displayed for each month are classified, and the display order for each month is determined in the order of date. Finally, the arrangement of each image is determined (step S2403).
- the arrangement in step S2403 will be described.
- the month is output on the left side of the screen so that the creation date is known.
- images are arranged on the screen in the order of the created date and time. Accordingly, the arrangement is determined so that images indicating months having different creation date attributes are displayed in different rows, and images indicating the same month are displayed side by side in date order in the horizontal direction.
- the images are arranged three vertically and four horizontally.
- the image may be displayed by scrolling the screen by providing a scroll bar or the like.
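The arrangement determination of steps S2401 to S2403 can be sketched as follows, assuming ISO-formatted creation date attributes (a simplification of the attached data):

```python
def determine_arrangement(images):
    """Sketch of S2401-S2403: sort images by the creation-date attribute,
    group them per month, and lay out each month as its own row with the
    month label on the left and dates increasing to the right."""
    ordered = sorted(images, key=lambda i: i["created"])  # S2401: date order
    rows = {}                                             # S2402: month groups
    for img in ordered:
        month = img["created"][:7]       # "YYYY-MM" of an ISO timestamp
        rows.setdefault(month, []).append(img["content_id"])
    return rows  # S2403: one row per month, columns in date order
```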
- if there is no image display request from the user (“NO” in step S2203), the process returns to step S2201.
- after the arrangement determination process shown in FIG. 24 (step S2204 in FIG. 22), the display mode classification means 3500 performs the display mode classification process (step S2205).
- the importance of each displayed image is calculated (step S2601).
- the importance is obtained by multiplying the weighting factor of each attribute, which is a value between 0 and 1, and the importance component, and adding all the resulting values.
- classification is made into three groups using importance values (step S2602).
- The three groups cover importance values from 0 to 1/3 of the sum of the weight values, from 1/3 to 2/3 of the sum of the weight values, and from 2/3 of the sum of the weight values up to the sum itself.
- the number of groups to be classified may be any number as long as it is three or more.
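As a sketch, the importance calculation and three-way grouping described above can be written as follows. The attribute names and example values are illustrative assumptions, not values from the embodiment; only the weighted-sum rule and the equal thirds of the weight sum come from the text.

```python
def importance(weights, components):
    """Weighted sum: per-attribute weighting factor (0..1) times importance component."""
    return sum(weights[a] * components[a] for a in weights)

def classify(value, weight_sum, n_groups=3):
    """Map an importance value into one of n_groups equal-width bins
    spanning 0 .. weight_sum (the maximum possible importance)."""
    if weight_sum == 0:
        return 0
    idx = int(value / weight_sum * n_groups)
    return min(idx, n_groups - 1)  # value == weight_sum falls in the top group

# Illustrative attributes and values (assumptions):
weights = {"creation_date": 1.0, "location": 0.5, "face_info": 0.5}
components = {"creation_date": 0.52, "location": 0.8, "face_info": 0.0}

w_sum = sum(weights.values())          # 2.0
imp = importance(weights, components)  # 0.52 + 0.4 + 0.0 = 0.92
group = classify(imp, w_sum)           # 0.92 / 2.0 = 0.46 -> middle group (index 1)
```

A group index of 2 would correspond to the darkest display (column 2701), 1 to the middle shade (column 2702), and 0 to the lightest (column 2703).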
- Each image to be displayed is reduced to a certain size (for example, 60×60 pixels).
- the reduced image is hereinafter referred to as a reduced image.
- Each image in the column 2701 of the table was created using the pixels of the reduced image as they are; it is therefore the reduced image itself. “0%” in the top element of the column 2701 means that the ratio of information thinned out from the original reduced image is 0%.
- Each image in the column 2702 of the table was created by thinning out 50% of the information, using a pattern that masks 2 pixels out of every 2×2 block of the reduced image's pixels. “50%” in the top element of the column 2702 means that the ratio of thinned-out information is 50%.
- the display shade is determined by thinning out information from the reduced image.
- the display mode is determined using each reduced image created so that the shades of FIG. 27 are different.
- The group whose importance is between 2/3 of the sum of the weight values and the sum itself displays its images as in column 2701, the group between 1/3 and 2/3 of the sum displays its images as in column 2702, and the group between 0 and 1/3 of the sum displays its images as in column 2703.
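The 50% thinning of column 2702 can be sketched as a checkerboard mask over the reduced image's pixels. The grayscale representation and the white background value are assumptions for illustration; the text only specifies that 2 of every 2×2 pixels are masked.

```python
WHITE = 255  # assumed background value for masked pixels

def thin(pixels, ratio):
    """Return a copy of a 2-D grayscale image with part of its information
    thinned out. ratio 0.0 keeps the reduced image as-is (column 2701);
    ratio 0.5 masks a checkerboard, i.e. 2 of every 2x2 pixels (column 2702)."""
    out = [row[:] for row in pixels]
    for y, row in enumerate(out):
        for x in range(len(row)):
            if ratio >= 0.5 and (x + y) % 2 == 1:  # checkerboard mask: 50%
                row[x] = WHITE
    return out

img = [[10, 20], [30, 40]]
print(thin(img, 0.0))  # [[10, 20], [30, 40]]  (unchanged)
print(thin(img, 0.5))  # [[10, 255], [255, 40]]
```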
- The display control means 3600 performs screen display according to the display arrangement for the creation date attribute and the shade of each image (step S2206), and displays a slide bar for inputting the weighting factor of each attribute (step S2207).
- FIG. 25 shows an example of a screen displayed in this state.
- A rectangle such as 800 in FIG. 25 represents an image. The greater the importance of an image, the darker it is displayed; the smaller the importance, the lighter it is displayed.
- 800 in FIG. 25 is an image associated with the attribute shown in FIG. 18 and is displayed at a medium density because the importance of the creation date attribute is 0.52.
- The slide bar displayed at the bottom of the screen is operated with the mouse; the weighting factor is 0 at the left end and 1 at the right end, changing continuously from 0 to 1.
- In step S2208, the selection of an image by the user is awaited. If an image is selected by double-clicking with the mouse (“YES” in step S2208), the selected image is displayed (step S2209), and the process ends. If no image is selected (“NO” in step S2208), input to the slide bar is awaited (step S2210). If there is no input to the slide bar (“NO” in step S2210), the process returns to step S2208. If there is an input to the slide bar (“YES” in step S2210), the process returns to step S2205 to display the screen again. A display example of the screen when there is an input to the slide bar is shown in FIG. 28. A rectangle such as 900 in FIG. 28 represents an image, as in FIG. 25.
- FIG. 29 is a configuration diagram of an image display device 4000 according to the fourth embodiment.
- The image display device 4000 is obtained by changing the constituent elements of the image display device 3000 shown in the third embodiment so as to support searching for other similar images based on the image selected by the user.
- In place of the attribute information evaluation unit 3200 of the image display device 3000 shown in the third embodiment, the attribute information evaluation unit 4200 has a function of calculating an importance component for each attribute using the attribute of the image selected by the user and the attributes of the displayed images. Accordingly, the importance of an image similar to the image selected by the user is evaluated highly, and that image is displayed darker than other, dissimilar images, thereby assisting the user in searching for an image.
- the image display device 4000 includes the same storage means 3100, display mode classification means 3500, and display control means 3600 as the image display apparatus shown in the third embodiment.
- description of the same components as those of the image display device 3000 is omitted.
- the attribute information evaluation unit 4200, the user operation reception unit 4300, and the arrangement determination unit 4400 each include a processor and a memory, and the functions described later are realized by the processor executing a control program stored in the memory.
- the user operation accepting unit 4300 additionally has a USB (Universal Serial Bus) interface for accepting image input, a mouse for accepting an operation from the user, and the like.
- When the attribute information evaluation unit 4200 receives a notification from the user operation reception unit 4300 that the user has selected one image in order to search for other similar images, it calculates an importance component for each attribute using the attributes of each image displayed on the display device 3700, and stores the values in the storage unit 3100. The method for calculating the importance component for each attribute will be described later.
- The user operation receiving means 4300 has four functions. First, when inputting an image from an external storage device connected via the USB interface or the like into the storage means 3100, it has a function of receiving the creation date attribute and the location attribute from the device that created the image, acquiring the face information attribute by analysis using a face recognition program, and storing these attributes in association with information such as “file path”, “content ID”, and “type”. Second, when receiving an operation for displaying images from the user, it has a function of instructing the arrangement determining unit 4400 which of the display arrangements determined for each attribute to adopt. Third, it has a function of notifying the display mode classification unit 3500 of the weighting factor for each attribute. Fourth, it has a function of notifying the attribute information evaluation unit 4200 to calculate the importance component for each attribute.
- the layout determining unit 4400 has a function of receiving a notification from the user operation receiving unit 4300 to determine a layout for display and notifying the display control unit 3600 that a display layout in which images are arranged in order of creation date and time attributes is used.
- the data in the storage unit 3100 in the image display device 4000 having the above-described configuration is the same as the data handled in the third embodiment.
- FIG. 30 is a flowchart showing the operation of the image display device 4000.
- the same reference numerals as those in the third embodiment are used for the processing steps that perform the same operations as those in the third embodiment, and detailed description thereof is omitted.
- the user operation accepting unit 4300 accepts a display request from the user, and notifies the arrangement determining unit 4400 to determine the arrangement for display. After receiving the notification, the arrangement determining unit 4400 performs an arrangement determining process (step S3001).
- the arrangement determining means 4400 arranges the displayed images in the order of creation date and time attributes (step S3101). As a result, the images are arranged in order of date. Next, the arrangement of each image is determined (step S3102).
- The arrangement in step S3102 will be described. As shown in FIG. 32, as an example, the images are arranged in a grid of three vertically by four horizontally, with the images stored in the storage unit 3100 placed in the order of the creation date attribute. When the images cannot all be displayed in this arrangement, the image may be displayed by scrolling the screen by providing a scroll bar or the like.
- the display mode classification unit 3500 performs display mode classification processing, thereby determining the shading when displaying each image using the importance component and the weighting factor for each attribute (step S2205).
- Each image is displayed according to the arrangement for the creation date attribute determined in step S3001 and the display density determined in step S2205 (step S3002). A frame line, which is displayed when the user selects an image by single-clicking with the mouse, and a slide bar for inputting the weighting factor for each attribute are also displayed (step S3003).
- An example of displaying a screen in this state is shown in FIG. In FIG. 32, reference numeral 1201 denotes a frame line.
- In step S2208, the selection of an image by the user is awaited. If an image is selected by double-clicking the mouse (“YES” in step S2208), the selected image is displayed and the process ends (step S2209). If no image is selected (“NO” in step S2208), input to the slide bar is awaited (step S3004). When the user moves the slide bar (“YES” in step S3004), the process returns to step S2205. When there is no input to the slide bar (“NO” in step S3004), the selection of an image by single-clicking the mouse is awaited (step S3005). When the user selects an image by single-clicking the mouse (“YES” in step S3005), the attribute information evaluation unit 4200 performs an attribute information evaluation process (step S3006).
- FIG. 33 is a flowchart showing the attribute information evaluation process performed by the attribute information evaluation unit 4200 in step S3006.
- The attribute information evaluation process by the attribute information evaluation unit 4200 is executed when the user selects an image by single-clicking the mouse.
- Steps S3301 to S3303 are the process for obtaining the importance component 1809 of the creation date attribute in the data of FIG. 18, steps S3304 to S3306 the process for obtaining the importance component 1810 of the location attribute, and steps S3307 to S3308 the process for obtaining the importance component 1811 of the face information attribute.
- the attribute information evaluation unit 4200 refers to the creation date attribute of the image selected by the user (step S3301).
- the attribute information evaluation unit 4200 calculates the difference between the creation date / time attribute of the displayed image and the creation date / time attribute of the image selected by the user in units of days (step S3302).
- The above are steps S3301 to S3303, the method for calculating the importance component of the creation date attribute.
- the attribute information evaluation unit 4200 refers to the location attribute of the image selected by the user (step S3304).
- the attribute information evaluation unit 4200 calculates the distance between the location attribute of the displayed image and the location attribute of the image selected by the user in km (step S3305).
- The above are steps S3304 to S3306, the method for calculating the importance component of the location attribute.
- step S3307 and step S3308 will be described.
- After calculating the importance component of the location attribute, the attribute information evaluation unit 4200 refers to the face information attributes of the image selected by the user and of each displayed image, and if a displayed image shows the same persons as the image selected by the user, calculates the number of such persons (step S3307).
- After calculating the number of shared persons, the attribute information evaluation unit 4200 performs normalization so that the value becomes larger as the number of persons calculated in step S3307 increases and becomes 0 when no person is shared (step S3308). If the number of persons in the image selected by the user is a, and the number of those persons who also appear in another displayed image is b, the importance component is calculated as b ÷ a.
- The above are steps S3307 and S3308, the method for calculating the importance component of the face information attribute.
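The three per-attribute importance components of steps S3301 to S3308 can be sketched as follows. The text gives the face-information normalization (b ÷ a) explicitly, but the normalizations of steps S3303 and S3306 are not detailed in this excerpt, so 1 / (1 + difference) is used here as an assumed placeholder that shrinks toward 0 as images grow less similar.

```python
from datetime import date

def date_component(selected, displayed):
    """Step S3302: difference of creation dates in days; assumed normalization."""
    days = abs((displayed - selected).days)
    return 1.0 / (1.0 + days)

def location_component(km_distance):
    """Step S3305 gives the distance in km; assumed normalization."""
    return 1.0 / (1.0 + km_distance)

def face_component(a, b):
    """Steps S3307-S3308: a = persons in the selected image,
    b = how many of those persons also appear in the displayed image."""
    return b / a if a else 0.0

print(date_component(date(2010, 6, 1), date(2010, 6, 1)))  # 1.0 (same day)
print(face_component(4, 2))                                # 0.5
```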
- If the user does not select an image by single-clicking with the mouse (“NO” in step S3005), the process returns to step S2208. After the attribute information evaluation process in step S3006 is completed, the process returns to step S2205 and the screen is displayed again.
- FIG. 34 shows a display example of the screen when there is an input by a slide bar or a single click of the mouse. Images that are similar to the image surrounded by the frame selected by the user with a single mouse click are displayed darker.
- In the above, an image (still image) is stored in the storage unit 1100 and a recommended image is searched for using its attributes; however, a moving image, audio, or the like may be stored in the storage means instead, and recommended content may be searched for using its attributes.
- “moving image” or “sound” may be recorded in the “type” of the data structure shown in FIG.
- When displaying a moving image, a representative scene in the moving image may be shown as a thumbnail image, or the moving image may be played back at a reduced size without sound.
- When a moving image in which the user is interested is selected in step S303 in FIG. 3 and a recommended moving image is displayed in step S308, the moving image selected by the user is displayed on the left side as illustrated in FIGS., and a recommended moving image may be displayed on the right side.
- the recommended video may be a thumbnail image of a representative scene, or may be played back without sound as a reduced video.
- When voice is presented instead of an image: if there is a voice presentation request in step S301 in FIG. 3 and a plurality of voices are presented, some icon, such as a photograph of the creator's face or a musical-note mark, may be displayed together with attributes such as the voice's creation date attribute and creator name attribute. Further, when the voice the user is interested in is selected in step S303 in FIG. 3 and a recommended voice is presented in step S308, some icon indicating the voice selected by the user may be displayed on the left as shown in FIG. 9 or FIG. while the voice is played back, and some icon representing the recommended voice may likewise be displayed on the right side.
- images, moving images, voices, and the like may be mixed.
- When counting the number of contents in each group after grouping using keywords in the user preference specifying unit 1200, the count may be weighted according to the format of the content, such as image, moving image, or audio.
- In the above, a group containing a large number of contents is set as the high user preference group. Alternatively, a group with a large sum of the numbers of reproductions of its contents, or a group with a large sum of the playback times of its contents, may be set as the high user preference group. A score may also be obtained by multiplying the number of reproductions and the total playback time of the contents by weighting factors and summing the results, with the group having the largest score becoming the high user preference group; or a content coefficient may be set for each type of content (image, moving image, audio), and the group with the largest sum of scores multiplied by the content coefficients may be set as the high user preference group.
- the number of reproductions and the reproduction time may be measured when the first output control unit 1400 outputs to the display device 1700. At this time, the number of times of reproduction may be incremented by one if a predetermined reference time has elapsed since the reproduction was performed, and the measurement of the reproduction time may be started from the time of reproduction. Further, the number of reproductions and the reproduction time may be stored in the storage unit 1100.
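The alternative group-scoring rules above can be sketched as follows. The weighting factors and per-type content coefficients are illustrative values, not values from the embodiment; the text only states that such factors may be applied.

```python
# Per-type content coefficients (assumed illustrative values).
CONTENT_COEF = {"image": 1.0, "video": 1.5, "audio": 0.8}

def group_score(contents, w_count=1.0, w_time=0.1):
    """Score a group by summing, per content item, the weighted play count
    and total playback time, scaled by the type's content coefficient."""
    total = 0.0
    for c in contents:
        score = w_count * c["plays"] + w_time * c["play_seconds"]
        total += score * CONTENT_COEF[c["type"]]
    return total

sweets = [{"type": "image", "plays": 3, "play_seconds": 40},
          {"type": "video", "plays": 1, "play_seconds": 120}]
travel = [{"type": "image", "plays": 1, "play_seconds": 10}]

# The group with the larger score becomes the high user preference group.
best = max([("sweet", sweets), ("travel", travel)],
           key=lambda g: group_score(g[1]))
print(best[0])  # "sweet"
```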
- In Embodiments 1 and 2 described above, it is assumed that there is only one user, but the device may be used by a plurality of users. In this case, history such as the number of times of content playback and the playback time may be stored in the storage unit 1100 for each user.
- a user name may be input to identify the user at the start of use.
- a history such as the number of times of content playback and playback time can be recorded for each user, and recommended content can be proposed for each user.
- In the above, the second output control unit 2600 obtains the user preference level by substituting the creation date attribute and location attribute of the image selected by the user into the user preference function; however, the attributes of a related image may be substituted instead. As a result, it is possible to obtain the degree of user preference at the time a related image, considered to be preferred by the user, was created. Further, the current date and time at which the display device 1700 performs the display may be substituted into the user preference function. As a result, the user preference level at the date and time of the display can be obtained.
- In the above, the attribute of the high user preference group is treated as a sample following a normal distribution, but a distribution other than the normal distribution may be used.
- Further, a frequency distribution created using the number of images, the number of reproductions, the playback time, or a numerical value combining them may be used as the user preference function.
- In the above, weighting was not performed when counting the number of images belonging to each high user preference group in the user preference specifying unit 1200; however, considering that user preferences change over time, the count may apply a larger weighting factor to images created at a date and time closer to the present than to images created further in the past.
- the keyword attribute stored in the storage unit 1100 is one keyword per image, but a plurality of keywords may be included per image.
- the grouping by the user preference specifying unit 1200 may be performed for each of a plurality of keywords.
- In the above, the user preference specifying unit 1200 treats the keyword attributes “sweet” and “travel” equally, as shown in FIG. However, if there is a difference in the number of images, for example 6 images for “sweet” and 3 for “travel”, the related images included in the larger “sweet” group may be displayed with emphasis, for example in an earlier display order than the related images included in “travel”.
- In the above, the user preference specifying unit 1200 has a predetermined reference number in advance; however, after grouping using keywords, the group with the largest number of belonging images may be set as the high user preference group, or a plurality of top groups may be set as high user preference groups.
- In the above, the user preference specifying unit 1200 uses “keyword” as the predetermined attribute as an example, but any attribute may be used. Information about which attribute is to be the predetermined attribute may be incorporated in the image display device, or the user may select it from an attribute list screen presented before step S701 in FIG. 7.
- the predetermined attributes are, for example, “creation date”, “creator name”, and “location”. However, any number of attributes may be used.
- Information about which attributes are to be the predetermined attributes may be incorporated in the image display device, or the user may select them from an attribute list screen presented before step S401 in FIG. 4.
- In the above, identical attributes are grouped into the same group; however, when an attribute is expressed as text (character information), attributes that are similar but not identical may be put into the same group by using a synonym dictionary stored in the storage unit 1100. Further, when attributes are expressed as numerical values such as date and time or latitude/longitude, attributes whose difference is within a certain range may be put into the same group. Likewise, in the related content specifying unit 1500, when an attribute is expressed as text, an image whose attributes are similar but not identical may be treated as a related image by using the synonym dictionary or the like stored in the storage unit 1100.
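The synonym-based and range-based attribute matching described above can be sketched as follows. The dictionary entries and the numeric tolerance are illustrative assumptions; the embodiment only specifies "a synonym dictionary stored in the storage unit" and "a difference within a certain range".

```python
# A toy synonym dictionary (assumed contents).
SYNONYMS = {"sweet": {"candy", "dessert"}, "travel": {"trip", "journey"}}

def same_group_text(a, b):
    """Text attributes fall in the same group if identical or listed as synonyms."""
    if a == b:
        return True
    return b in SYNONYMS.get(a, set()) or a in SYNONYMS.get(b, set())

def same_group_numeric(a, b, tolerance):
    """Numeric attributes (date, latitude/longitude, ...) fall in the same
    group if their difference is within the given range."""
    return abs(a - b) <= tolerance

print(same_group_text("sweet", "candy"))          # True (via synonym dictionary)
print(same_group_numeric(35.68, 35.70, 0.05))     # True (latitude-like values)
```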
- In the above, the list shown in FIG. 5 is used for explanation; however, since the user preference specifying unit 1200 only needs to be able to refer to the attributes of all the contents in order to group the images by a predetermined attribute, it may search the images stored in the storage unit 1100 one by one and refer to their attributes instead of creating a list such as that in FIG. 5.
- the predetermined reference number in step S702 of FIG. 7 by the user preference specifying unit 1200 is a value preset in the image display device or a value set by the user, and the method of determining the value is not limited.
- a screen for allowing the user to determine the number may be displayed and selected before step S701 in FIG.
- the reference value for the related content specifying unit 1500 to specify the related image performed in step S403 or step S405 in FIG. 4 may be designed in advance in the image display device, or may be set by the user. When the user sets, a screen to be set by the user may be displayed and selected before step S401 in FIG.
- In the above, as shown in FIG. 9, the second output control unit provides an area for displaying the related images for each attribute; however, without doing so, the related images of the high user preference images may be displayed as shown in FIG. 15 without distinguishing by which attribute they are related.
- In the above, an area for displaying images that are both related images and high user preference images is provided for each attribute as shown in FIG. 14; however, they may be displayed, as shown in FIG., without distinguishing by which attribute they are related.
- At this time, the user preference level for the “creation date” attribute may differ from the user preference level for the “location” attribute; in that case, the maximum user preference level shall be used.
- In the above, grouping is performed according to a single predetermined attribute; however, two or more of the content's plural attributes may be used, with the content grouped separately for each attribute, and a group having a high user preference identified from among all the resulting groups. For example, a case where not only the “keyword” attribute but also the “creator name” attribute is considered in the user preference specifying process will be described.
- In FIG. 5, using the images with content IDs up to 10, when grouping is performed by the creator name attribute, the images with IDs 1, 3, 5, and 6 belong to the group “A mountain A child”.
- “-” indicates that the attribute does not exist; in this case, the image is excluded from the user preference specifying process on the assumption that it has no creator name attribute.
- In the above, the images are described as being stored in the storage unit 1100, but they may be stored in an external storage means. In this case, at least the “file path” (element 203 in the table of the data structure in FIG. 2), pointing to the address of the image in the external storage means, needs to be stored in the internal storage means 1100.
- In the above, an image (still image) is stored in the storage unit 3100 and images are searched using attributes; however, a moving image, audio, or the like may be stored in the storage means instead of the image (still image), and the content may be searched using its attributes.
- “moving image” or “sound” may be recorded instead of “image” in the “type” of the data shown in FIG.
- When displaying a moving image, a representative scene in the moving image may be shown as a thumbnail image, or the moving image may be played back at a reduced size without sound.
- When there is a voice presentation request in step S2208 of FIG. 22 and a plurality of voices are presented, some icon such as a photograph of the creator's face or a musical-note mark may be displayed together with the voice creation date attribute.
- images, moving images, voices, and the like may be mixed.
- In the above, the shade of the image is expressed by thinning out the image information, but it may be expressed by other methods.
- For example, α blending may be performed using a single-color white image as the background image.
- In the α blend, where the pixel value of the original image is Value0, the pixel value of the background image is Value1, and the coefficient is α (a value from 0 to 1), the pixel value Value of the new image generated as a result of the blend is Value = Value0 × (1 − α) + Value1 × α. As the coefficient α approaches 1, the image gradually takes on the color of the background image; since the background image here is a single white color, the entire image gradually becomes white. The shade of the image may therefore be expressed by changing the value of α.
- The background image for performing the α blend may be a single-color gray image, or the α blend may be performed using the background image on which the image display device displays its results.
- the entire image may be made gray by gradually setting the RGB values for each pixel of the image to intermediate values. Further, the entire image may be gradually changed into a mosaic shape.
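The α blend toward a white background can be sketched per pixel as follows, assuming 8-bit grayscale values; the formula is the one given above, and the rounding to integer pixel values is an implementation assumption.

```python
WHITE = 255  # single-color white background (Value1)

def alpha_blend(value0, value1, alpha):
    """Value = Value0 * (1 - alpha) + Value1 * alpha."""
    return round(value0 * (1 - alpha) + value1 * alpha)

def fade_to_white(pixels, alpha):
    """Blend every pixel of a 2-D grayscale image toward white;
    alpha = 0 keeps the original, alpha -> 1 fades it to white."""
    return [[alpha_blend(p, WHITE, alpha) for p in row] for row in pixels]

img = [[0, 100], [200, 255]]
print(fade_to_white(img, 0.0))  # [[0, 100], [200, 255]]  (unchanged)
print(fade_to_white(img, 0.5))  # [[128, 178], [228, 255]]
```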
- In the above, a slide bar is displayed to let the user input the weighting factor for each attribute; however, the user may instead select a value from a pull-down menu, or an area for direct input may be provided so that the user can type in the weighting factor.
- Further, the slide bar need not be displayed; the weighting factor may be a value set in advance by the user, or a value given by the designer of the image display apparatus.
- In the third embodiment, the arrangement determining means 3400 adopts the display arrangement based on the creation date attribute, and in the fourth embodiment the arrangement determining means 4400 adopts another arrangement based on the creation date attribute; however, a display arrangement based on another attribute, such as the location attribute or the face information attribute, may be adopted.
- an image may be arranged at a position represented by the place attribute on the map as shown in FIG.
- In this case, the display mode in the display mode classification unit 3500 keeps the image size the same and expresses the importance by the shade of the image; the attributes are evaluated in the attribute information evaluation unit in the same manner as in the third embodiment, with the weighting factor set to 1 for the location attribute and 0 for the other attributes. Therefore, an image taken at a place the user does not usually go is displayed darker.
- Alternatively, the display mode in the display mode classification unit 3500 keeps the image size the same and expresses the importance by the shade of the image, and in the attribute information evaluation unit the weighting factor is set to 1 for the face information attribute and 0 for the other attributes. Therefore, an image showing a small number of persons who are not frequently captured is displayed darker.
- In the above, the display mode keeps the size of each image the same, for example 60×60 pixels vertically and horizontally, and expresses the importance by the shade of the image; however, the importance may instead be expressed by the size of the image. In that case, the display is as shown in FIG.: the attribute information is evaluated by the attribute information evaluation unit, the display arrangement is that for the location attribute, and the weighting factor is set to 1 for the location attribute and 0 for the other attributes. Therefore, an image taken at a place the user does not usually go is displayed larger.
- Although the size of each reduced image is the same, the size is not limited to 60×60 pixels. If the input image is smaller than that size, the image is enlarged rather than reduced.
- Further, the image size and density may be kept the same, and the importance may be represented by the thickness of the frame surrounding the image. In that case, the display is as shown in FIG.: the attribute information is evaluated by the attribute information evaluation unit in the same manner as in the third embodiment, the display arrangement adopts the arrangement for the creation date attribute as in the fourth embodiment, and the weighting factor is set to 1 for the creation date attribute and 0 for the other attributes. Therefore, the frame line is displayed thicker as the image is closer to a life event.
- Further, the image size and density may be kept the same, and the importance may be expressed by the number of marks attached to the image. In that case, the display is as shown in FIG.: the attribute information is evaluated by the attribute information evaluation unit, the display arrangement adopts the arrangement for the creation date attribute as in the fourth embodiment, and the weighting factor is set to 1 for the creation date attribute and 0 for the other attributes. The importance of an image is expressed by zero, one, two, or three stars; therefore, the number of star marks increases as the image is closer to a life event.
- Further, the size and contrast of the images may be kept the same, and the images may be arranged along an axis perpendicular to a reference plane according to their importance values, with those in front displayed large and those in back displayed small and blurred. The importance of an image may also be expressed by its inclination, the image lying flatter as its importance decreases and standing more upright as its importance increases.
- In the third embodiment, the attribute information evaluation unit 3200 calculates the importance for each attribute using the life event information, frequent location information, and frequent person information input by the user; however, the importance for each attribute may be calculated without using information input by the user.
- For the creation date attribute, if the number of images taken on a usual day is about 5 but there is a day on which 50 were taken, the importance of that day's images may be increased on the assumption that it is a unique day rather than a normal one.
- For the location attribute, if an image was taken at a location far from the locations where the user usually shoots, judging from the latitude and longitude information, its importance may be increased on the assumption that it is a unique location rather than a normal one.
- For the face information attribute, the importance may be increased as the number of persons in the captured image is smaller and the persons shown are more specific.
- The processes (i) to (iii), ending at steps S2307 to S2308, need not be calculated in the order shown in FIG. 23; the order may be changed. For example, the processing may be executed in the order of (ii), (i), and (iii).
- Similarly, for the attribute information evaluation process shown in FIG. 33, (iv) the process from step S3301 to step S3303, (v) the process from step S3304 to step S3306, and (vi) the process from step S3307 to step S3308 need not be calculated in the order shown in FIG. 33; the order may be changed. For example, the processing may be executed in the order of (vi), (v), and (iv).
- In the above, an importance component is calculated for each attribute; however, the calculation method of the importance component and the normalization method may be other than those described, and the user may change the values after the importance component for each attribute has been calculated.
- the range and size of the value of the weighting factor may be freely determined.
- the attribute information evaluation unit 3200 treats an image as more important the fewer people its face information attribute indicates. However, it may instead determine that a group photograph containing many people is important. In this case, whether a photograph is a crowd photograph or a group photograph may be decided according to the size of the faces in the image.
- the display mode classification unit 3500 grouped importance values using the sum of the weight values: from 0 to 1/3 of the sum, from 1/3 to 2/3 of the sum, and from 2/3 of the sum to the sum. Alternatively, using the maximum importance value, grouping may be performed from 0 to 1/3 of the maximum, from 1/3 to 2/3 of the maximum, and from 2/3 of the maximum to the maximum. The importance ranges of the groups need not be equally spaced and may be determined freely. Moreover, a frequency distribution of importance values may be created, and the display modes may be grouped based on that distribution.
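The three-way grouping described above can be sketched as follows, under the assumption that each content already has a numeric importance value; `upper` stands in for either the sum of the weight values or the maximum importance:

```python
def group_by_thirds(importances, upper=None):
    """Split importance values into three groups:
    [0, upper/3), [upper/3, 2*upper/3), [2*upper/3, upper].
    'upper' may be the sum of the weight values or the maximum importance."""
    if upper is None:
        upper = max(importances)
    groups = {0: [], 1: [], 2: []}
    for v in importances:
        idx = min(int(3 * v / upper), 2)  # clamp v == upper into the top group
        groups[idx].append(v)
    return groups
```

Unequal ranges or a grouping driven by a frequency distribution, as the text allows, would replace the fixed `upper/3` boundaries.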
- the age of a person shown in an image may be calculated using the face information of the image, and the importance may be increased for images showing persons of a similar age. For example, if an image showing an infant is selected, the importance of images containing not only the infant shown in the selected image but also other infants may be increased.
- the user operation receiving means 3300 and 4300 use a mouse or the like to receive operations from the user, but may instead use a general pointing device together with a keyboard or the like.
- a single click of the mouse may be replaced by coordinate designation with the pointing device, and a double click by simultaneous input of the coordinate designation and the Enter key of the keyboard.
- the image display devices according to Embodiments 1 to 4 described above are typically realized as an LSI, which is a semiconductor integrated circuit. These may each be made into an individual chip, or a single chip may include some or all of them.
- the name used here is LSI, but it may also be called IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI, and implementation with a dedicated circuit or a general-purpose processor is also possible.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may be used.
- An image output device can be used as information drawing means in a mobile phone, a television, a digital video recorder, a digital video camera, a car navigation system, or the like.
- as the display, a cathode ray tube (CRT), a flat display such as a liquid crystal display, a PDP (plasma display panel), or an organic EL display, or a projection display typified by a projector can be combined with the device.
- a control program composed of program code for causing a processor, and various circuits connected to the processor, to execute the processing can be recorded on a recording medium, or circulated and distributed via various communication paths and the like. Such recording media include IC cards, hard disks, optical disks, flexible disks, ROMs, and the like.
- the circulated and distributed control program is used by being stored in a memory or the like readable by the processor, and the various functions shown in each embodiment are realized by the processor executing the control program.
- a content output device includes: storage means storing a plurality of contents in association with attributes of the contents; first output control means for outputting one of the contents stored in the storage means; related content specifying means for specifying, from the plurality of contents stored in the storage means, contents having attributes bearing a predetermined relationship to the attribute of the content output by the first output control means; user preference specifying means for classifying the plurality of contents stored in the storage means into a plurality of groups based on their attributes, calculating an amount relating to the contents belonging to each classified group, and specifying the contents included in the groups whose amount exceeds a predetermined reference amount; and second output control means for outputting the contents belonging to both the set of contents specified by the related content specifying means and the set of contents specified by the user preference specifying means.
- with this content output device, content that matches the user's preference and has an attribute related to the attribute of the content being presented to the user can be found among the plurality of contents stored in the storage means and presented to the user.
- the storage means may store information relating to the content as an attribute of the content, and the user preference specifying means may classify the contents into a plurality of groups based on the information relating to the content.
- the content is an image, the first output control unit performs the output by display, and the second output control unit performs the output by display.
- the storage unit stores a plurality of attributes for each content, the attributes of the content include a character string representing the substance of the content, and the user preference specifying unit may classify the plurality of contents stored in the storage unit based on the character strings representing the substance of the contents.
- with this, images can be handled and displayed as content, and when specifying user preferences, contents belonging to groups that contain many contents can be specified.
- the user preference specifying means may calculate the total number of contents belonging to each group as an amount related to the contents belonging to each group.
- the storage unit stores information relating to the creation of the content as a content attribute, and the related content specifying unit may specify related content based on the information relating to the creation of the content, without using a character string representing the substance of the content.
- with this, information at the time of content creation can be used as a content attribute, and the related content can be specified using the creation-time attribute, without using the keyword attribute used by the user preference specifying means.
- the information relating to creation indicates the place where the content was created, and the related content specifying unit may specify, from the contents stored in the storage means, contents created at locations within a predetermined distance from the location of the content output by the first output control unit.
- with this, the place where the content was created can be included as an attribute, and the related content specifying means can treat contents created at locations within a predetermined distance from the location of the content output by the first output control means as related contents.
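Selecting related content by creation place, as described above, might look like the following sketch; the haversine great-circle distance, the dictionary-shaped content records, and the 1 km threshold are illustrative assumptions:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

def related_by_place(current, stored, max_km=1.0):
    """Contents created within max_km of the currently output content."""
    clat, clon = current["place"]
    return [c for c in stored
            if c is not current
            and haversine_km(clat, clon, *c["place"]) <= max_km]
```

The "predetermined distance" of the claim corresponds to `max_km` here.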
- (G) The device comprises user operation accepting means for accepting a user operation; the one content output by the first output control means is determined based on the user operation accepted by the user operation accepting means; the storage means stores information indicating the creation date and time of the content as a content attribute; and the second output control means may calculate the preference level at the creation date and time of the one content to be output, based on the distribution of the creation dates and times of the contents specified by the user preference specifying means, and perform the output according to the preference level.
- with this, the content output by the first output control means can be determined by the user's operation, the creation date and time of the content can be included as a content attribute, the user preference function can be determined based on the creation dates and times of the contents specified by the user preference specifying means when calculating the user preference level, the user preference level can be calculated from the creation date and time of the content output by the first output control means determined based on the user operation, and the output can be performed according to the preference level.
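One way to realize a user preference function over the creation-date distribution, consistent with the normal-distribution fit described in claim 3, is to fit a Gaussian to the attribute values of the high-preference contents and use its probability density function. This sketch assumes the attribute values are plain numbers (e.g. timestamps); the function names are illustrative:

```python
from math import exp, pi, sqrt

def make_preference_function(samples):
    """Fit a normal distribution to the attribute values (e.g. creation
    dates) of the high-preference contents and return its probability
    density function, used as the user preference function."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n  # population variance
    def pdf(x):
        return exp(-(x - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)
    return pdf
```

The preference level of a content is then the density at its creation date: values near the center of the user's activity score highest.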
- the attribute of the content includes information indicating the creation date and time of the content, and the second output control unit may calculate the preference level at the time when the content is output, based on the distribution of the creation dates and times of the contents specified by the user preference specifying unit and the time of output, and perform the output according to the preference level.
- with this, the creation date and time of the content can be included as an attribute of the content, the user preference function can be determined based on the creation dates and times of the contents specified by the user preference specifying unit when calculating the user preference level, and the preference level at the date and time when the second output control unit outputs the content can be calculated so that the output is performed according to the preference level.
- the attribute of the content includes a character string representing the substance of the content, the storage unit stores a table in which character strings with mutually related meanings are associated, and the related content specifying unit may specify, based on the table stored in the storage unit, contents associated with the character string representing the substance of the content output by the first output control unit.
- with this, a keyword can be used as an attribute of the content, and the related content specifying unit can specify related contents using a table that associates keywords with related meanings.
- (J) The device comprises user operation accepting means for accepting a user operation, and the first output control means measures the number of times each content is output based on the user operations accepted by the user operation accepting means,
- the storage means stores, as a content attribute, information indicating the number of outputs measured for each content sequentially output by the first output control means, and in the user preference specifying means, the amount relating to the contents belonging to each group is the number of outputs of the contents belonging to that group, calculated by summing the output counts of all contents belonging to each group.
- with this, the number of times each content is output can be measured by the first output control unit according to the user's operations, the measured output count can be stored in the storage unit for each output content, and when specifying the user's preference, contents belonging to groups with large user output counts can be specified.
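The grouping-and-threshold step above, summing per-group output counts and keeping every group whose total exceeds a reference value, can be sketched as follows; the dictionary-shaped content records and the reference value are assumptions for illustration:

```python
from collections import defaultdict

def preferred_contents(contents, reference):
    """Group contents by attribute, sum each group's output counts, and
    return the contents of every group whose total exceeds 'reference'."""
    groups = defaultdict(list)
    for c in contents:
        groups[c["attr"]].append(c)
    result = []
    for members in groups.values():
        if sum(m["outputs"] for m in members) > reference:
            result.extend(members)
    return result
```

The returned set corresponds to the "high-preference contents" that are later intersected with the related contents.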
- the storage means stores information indicating the place where the content was created as an attribute of the content, and the user preference specifying means may classify the plurality of contents stored in the storage means based on the distance between a predetermined point and the place where each content was created.
- a content output apparatus includes: storage means storing a plurality of contents in association with attributes of the contents; arrangement determining means for determining, according to a predetermined standard, a display arrangement for each of the plurality of contents to be displayed that are stored in the storage means; display mode classification means for calculating the importance of each content to be displayed based on its attributes and determining the display mode of each content based on the importance; and display control means for displaying the plurality of contents to be displayed in the determined display modes according to the display arrangement.
- with this content output apparatus, a plurality of contents stored in the storage means can be displayed to the user in a display arrangement, with display modes determined by importance values based on the attributes of the contents.
- the attributes of the content include the time when the content was created, the storage means stores the times of events for the user, and the display mode classification means may calculate the importance so that the smaller the difference between the time when a content was created and the time of an event for the user, the higher the importance.
- the content may be an image, and the display mode classification unit may thin out pixels of each content according to the importance of each content.
- the storage means stores a plurality of attributes for each content, and the display mode classification means may calculate an importance component constituting the importance for each attribute, and perform the calculation of the importance by weighting the importance components for the plurality of attributes and taking their sum.
- with this, a content can have a plurality of attributes, an importance component constituting the importance used to determine the display mode can be calculated for each attribute, and the importance can be calculated by weighting the importance components and taking their sum.
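The weighted-sum importance calculation described above might be sketched as follows; the attribute names used here are illustrative assumptions:

```python
def importance(components, weights):
    """Weighted sum of per-attribute importance components.
    Both mappings are keyed by attribute name (e.g. 'date', 'place')."""
    return sum(weights[attr] * value for attr, value in components.items())
```

The weight values, whose range and size the text says may be freely determined, let one attribute dominate or balance all attributes equally.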
- the predetermined standard followed by the arrangement determining means may be that the contents to be displayed are arranged in the order of their creation, stored as an attribute of the contents.
- the content may be an image, and the display control unit may display the contents by changing their display sizes according to the importance.
- the attribute of the content includes the place where the content was created, the storage means stores a predetermined place, and the display mode classification means may calculate the importance so that the larger the difference between the place where the content was created and the predetermined place, the higher the importance.
- with this, the place of creation can be used as a content attribute, and the importance of a content can be increased as its distance from the predetermined place determined by the user increases.
- (I) The apparatus further comprises face information generating means for generating face information for each person appearing in the content based on the content and storing it in the storage means as one of the attributes of the content; the content is an image; and the display mode classification means may calculate the importance based on the number of persons who do not match a predetermined person.
- the content output device further includes user operation accepting means; the user operation accepting means accepts selection of one content from the displayed contents; and the display mode classification means may calculate the importance so that contents having attributes similar to the attributes of the selected content are more important.
- before accepting a user operation, the display control means may display all the contents to be displayed in the same display mode.
- the content output device of the present invention can be used for various purposes.
- menu displays, web browsers, editors, and battery-driven portable display terminals such as mobile phones, portable music players, digital cameras, and digital video cameras
- high-definition information display devices such as televisions, digital video recorders, and car navigation systems
- the device has high utility value as information display means in EPGs, map displays, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Human Computer Interaction (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
Hereinafter, an image display device 1000, which is one embodiment of the content output device, will be described.
The image display device 1000 of Embodiment 1 includes storage means in which a plurality of images are stored, displays an image based on a user operation, searches the plurality of images stored in the storage means for images that are related to the displayed image and match the user's preference, and displays them to the user.
Hereinafter, the configuration of the image display device 1000 will be described.
Hereinafter, the structure of the data handled by the image display device 1000 having the above configuration will be described.
Hereinafter, the operation of the image display device 1000 having the above configuration and data structure will be described.
<Embodiment 2>
Hereinafter, an image display device 2000, which is one embodiment of the content output device, will be described.
FIG. 10 is a configuration diagram of the image display device 2000 of Embodiment 2.
Hereinafter, the configuration of the image display device 2000 will be described.
Hereinafter, the operation of the image display device 2000 having the above configuration and data structure will be described.
<Embodiment 3>
In Embodiment 3, details of a content output technique that changes the display mode for each content by the second output control means described in Embodiment 4 will be described. Hereinafter, the image display device 3000 according to Embodiment 3 will be described.
The image display device 3000 of Embodiment 3 includes storage means 3100 in which a plurality of images are stored, displays a plurality of images at the same size in order of creation date and time based on a user operation, and assists the user's image search by changing the shading of the displayed images according to the user's operations.
Hereinafter, the configuration of the image display device 3000 will be described with reference to FIG. 17.
Hereinafter, the data handled by the image display device 3000 having the above configuration will be described.
Hereinafter, the operation of the image display device 3000 having the above configuration and data will be described.
<Embodiment 4>
Hereinafter, an image display device 4000, which is one embodiment of the image display device, will be described.
FIG. 29 is a configuration diagram of the image display device 4000 of Embodiment 4.
Hereinafter, the configuration of the image display device 4000 will be described.
The data in the storage means 3100 of the image display device 4000 having the above configuration is the same as the data handled in Embodiment 3.
Hereinafter, the operation of the image display device 4000 having the above configuration and data will be described.
<Supplement>
Embodiments 1 to 4 have been described above as examples of embodiments of the content output device according to the present invention, but the illustrated content output device can also be modified as follows, and the present invention is of course not limited to the content output devices shown in the above embodiments.
1100, 3100 Storage means
1200 User preference specifying means
1300, 3300, 4300 User operation accepting means
1400 First output control means
1500 Related content specifying means
1600, 2600 Second output control means
1700, 3700 Display device
2610 User preference function generating means
3200, 4200 Attribute information evaluating means
3400, 4400 Arrangement determining means
3500 Display mode classification means
3600 Display control means
Claims (7)
- Storage means storing a plurality of contents in association with attributes of the contents;
first output control means for outputting, to a screen, a predetermined content among the plurality of contents stored in the storage means;
related content specifying means for specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content;
user preference specifying means for classifying the plurality of contents stored in the storage means into a plurality of groups based on the respective attributes of the contents, calculating the total number of contents belonging to each classified group, and specifying, as a set of high-preference contents, the contents included in the groups whose calculated total number of contents exceeds a predetermined value; and
second output control means for outputting, to the screen, the contents belonging to both the set of related contents and the set of high-preference contents;
A content output device characterized by comprising the above. - The content output device comprises user preference function generating means,
the user preference function generating means generates a user preference function for calculating the user's preference level for the contents belonging to the set of high-preference contents, and
the second output control means calculates, using the generated user preference function, the user preference level of each content belonging to the set of high-preference contents, processes the contents belonging to both the set of related contents and the set of high-preference contents so that they stand out according to their user preference levels, and outputs them to the screen.
The content output device according to claim 1. - When an attribute commonly held by the set of high-preference contents can be expressed numerically, the user preference function generating means plots, with the attribute on the horizontal axis, the attribute values of the contents belonging to the set of high-preference contents, calculates a probability density function on the assumption that the plotted attribute values are samples following a normal distribution, and thereby generates the user preference function.
The content output device according to claim 2. - A content output method for outputting content to present to a user, comprising:
a first output control step of outputting, to a screen, a predetermined content among a plurality of contents stored in storage means that stores a plurality of contents in association with attributes of the contents;
a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content;
a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on the respective attributes of the contents, calculating the total number of contents belonging to each classified group, and specifying, as a set of high-preference contents, the contents included in the groups whose calculated total number of contents exceeds a predetermined value; and
a second output control step of outputting, to the screen, the contents belonging to both the set of related contents and the set of high-preference contents;
A content output method characterized by including the above. - A program described in a computer-readable format so that a processing procedure for outputting content to present to a user is executed on a computer, wherein
the processing procedure includes:
a first output control step of outputting, to a screen, a predetermined content among a plurality of contents stored in storage means that stores a plurality of contents in association with attributes of the contents;
a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content;
a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on the respective attributes of the contents, calculating the total number of contents belonging to each classified group, and specifying, as a set of high-preference contents, the contents included in the groups whose calculated total number of contents exceeds a predetermined value; and
a second output control step of outputting, to the screen, the contents belonging to both the set of related contents and the set of high-preference contents;
A program characterized by including the above. - A program recording medium recording a program described in a computer-readable format so that a processing procedure for outputting content to present to a user is executed on a computer, wherein
the processing procedure includes:
a first output control step of outputting, to a screen, a predetermined content among a plurality of contents stored in storage means that stores a plurality of contents in association with attributes of the contents;
a related content specifying step of specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content;
a user preference specifying step of classifying the plurality of contents stored in the storage means into a plurality of groups based on the respective attributes of the contents, calculating the total number of contents belonging to each classified group, and specifying, as a set of high-preference contents, the contents included in the groups whose calculated total number of contents exceeds a predetermined value; and
a second output control step of outputting, to the screen, the contents belonging to both the set of related contents and the set of high-preference contents;
A program recording medium characterized by including the above. - A content output integrated circuit for outputting content to present to a user, comprising:
first output control means for outputting, to a screen, a predetermined content among a plurality of contents stored in storage means that stores a plurality of contents in association with attributes of the contents;
related content specifying means for specifying, from the plurality of contents stored in the storage means, a set of related contents having attributes related to the attribute of the output content;
user preference specifying means for classifying the plurality of contents stored in the storage means into a plurality of groups based on the respective attributes of the contents, calculating the total number of contents belonging to each classified group, and specifying, as a set of high-preference contents, the contents included in the groups whose calculated total number of contents exceeds a predetermined value; and
second output control means for outputting, to the screen, the contents belonging to both the set of related contents and the set of high-preference contents;
A content output integrated circuit characterized by comprising the above.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012518264A JP5879544B2 (ja) | 2010-06-04 | 2011-06-03 | コンテンツ出力装置、コンテンツ出力方法、プログラム、プログラム記録媒体及びコンテンツ出力集積回路 |
CN201180003127.3A CN102473196B (zh) | 2010-06-04 | 2011-06-03 | 内容输出装置、内容输出方法及内容输出集成电路 |
US13/387,217 US8732149B2 (en) | 2010-06-04 | 2011-06-03 | Content output device, content output method, program, program recording medium, and content output integrated circuit |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-128589 | 2010-06-04 | ||
JP2010128589 | 2010-06-04 | ||
JP2010156095 | 2010-07-08 | ||
JP2010-156095 | 2010-07-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011152072A1 true WO2011152072A1 (ja) | 2011-12-08 |
Family
ID=45066462
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/003160 WO2011152072A1 (ja) | 2010-06-04 | 2011-06-03 | コンテンツ出力装置、コンテンツ出力方法、プログラム、プログラム記録媒体及びコンテンツ出力集積回路 |
Country Status (4)
Country | Link |
---|---|
US (1) | US8732149B2 (ja) |
JP (1) | JP5879544B2 (ja) |
CN (1) | CN102473196B (ja) |
WO (1) | WO2011152072A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104537252A (zh) * | 2015-01-05 | 2015-04-22 | 深圳市腾讯计算机系统有限公司 | 用户状态单分类模型训练方法和装置 |
JP2016057901A (ja) * | 2014-09-10 | 2016-04-21 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
JP2019207507A (ja) * | 2018-05-28 | 2019-12-05 | 株式会社電警 | サイト作成用画像選択支援方法 |
CN114327628A (zh) * | 2021-12-28 | 2022-04-12 | 深圳市汇川技术股份有限公司 | 分层控制方法、系统、终端设备及存储介质 |
JP7345677B1 (ja) * | 2022-06-06 | 2023-09-15 | 三菱電機株式会社 | 情報生成装置、情報出力装置、および、情報出力方法 |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9552376B2 (en) | 2011-06-09 | 2017-01-24 | MemoryWeb, LLC | Method and apparatus for managing digital files |
KR101851241B1 (ko) * | 2011-12-06 | 2018-04-24 | 삼성전자 주식회사 | 휴대 단말기의 컨텐츠 통합 관리 방법 및 장치 |
TWI533197B (zh) * | 2012-06-19 | 2016-05-11 | 緯創資通股份有限公司 | 影像輸出方法與電子裝置 |
US20140082023A1 (en) * | 2012-09-14 | 2014-03-20 | Empire Technology Development Llc | Associating an identity to a creator of a set of visual files |
KR20140081220A (ko) * | 2012-12-21 | 2014-07-01 | 삼성전자주식회사 | 사용자 단말 장치 및 그 제어 방법 |
US10496937B2 (en) * | 2013-04-26 | 2019-12-03 | Rakuten, Inc. | Travel service information display system, travel service information display method, travel service information display program, and information recording medium |
US9934222B2 (en) | 2014-04-22 | 2018-04-03 | Google Llc | Providing a thumbnail image that follows a main image |
USD781317S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
US9972121B2 (en) | 2014-04-22 | 2018-05-15 | Google Llc | Selecting time-distributed panoramic images for display |
USD781318S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
USD780777S1 (en) | 2014-04-22 | 2017-03-07 | Google Inc. | Display screen with graphical user interface or portion thereof |
US10409453B2 (en) | 2014-05-23 | 2019-09-10 | Microsoft Technology Licensing, Llc | Group selection initiated from a single item |
US10291597B2 (en) | 2014-08-14 | 2019-05-14 | Cisco Technology, Inc. | Sharing resources across multiple devices in online meetings |
US10034038B2 (en) | 2014-09-10 | 2018-07-24 | Cisco Technology, Inc. | Video channel selection |
US10542126B2 (en) | 2014-12-22 | 2020-01-21 | Cisco Technology, Inc. | Offline virtual participation in an online conference meeting |
US9948786B2 (en) | 2015-04-17 | 2018-04-17 | Cisco Technology, Inc. | Handling conferences using highly-distributed agents |
US10069697B2 (en) * | 2016-01-29 | 2018-09-04 | Microsoft Technology Licensing, Llc | Routing actions to user devices based on a user graph |
US10592867B2 (en) | 2016-11-11 | 2020-03-17 | Cisco Technology, Inc. | In-meeting graphical user interface display using calendar information and system |
US10516707B2 (en) | 2016-12-15 | 2019-12-24 | Cisco Technology, Inc. | Initiating a conferencing meeting using a conference room device |
US10440073B2 (en) | 2017-04-11 | 2019-10-08 | Cisco Technology, Inc. | User interface for proximity based teleconference transfer |
US10375125B2 (en) | 2017-04-27 | 2019-08-06 | Cisco Technology, Inc. | Automatically joining devices to a video conference |
US10375474B2 (en) | 2017-06-12 | 2019-08-06 | Cisco Technology, Inc. | Hybrid horn microphone |
US10477148B2 (en) | 2017-06-23 | 2019-11-12 | Cisco Technology, Inc. | Speaker anticipation |
US10516709B2 (en) | 2017-06-29 | 2019-12-24 | Cisco Technology, Inc. | Files automatically shared at conference initiation |
US10706391B2 (en) | 2017-07-13 | 2020-07-07 | Cisco Technology, Inc. | Protecting scheduled meeting in physical room |
US10091348B1 (en) | 2017-07-25 | 2018-10-02 | Cisco Technology, Inc. | Predictive model for voice/video over IP calls |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11154153A (ja) * | 1997-11-20 | 1999-06-08 | Sharp Corp | データ検索方法、データ検索装置、及びコンピュータに実行させるデータ検索プログラムを記録した記録媒体 |
JP2000322445A (ja) * | 1999-05-14 | 2000-11-24 | Mitsubishi Electric Corp | 情報検索システムおよびこのプログラムを記録した記録媒体 |
JP2004206679A (ja) * | 2002-12-12 | 2004-07-22 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP2006318033A (ja) * | 2005-05-10 | 2006-11-24 | Olympus Imaging Corp | 画像管理装置、画像管理プログラム、画像管理方法及び記録媒体 |
JP2007058562A (ja) * | 2005-08-24 | 2007-03-08 | Sharp Corp | コンテンツ分類装置、コンテンツ分類方法、コンテンツ分類プログラムおよび記録媒体 |
JP2007310610A (ja) * | 2006-05-18 | 2007-11-29 | Hitachi Software Eng Co Ltd | 衛星画像処理システム |
JP2009140452A (ja) * | 2007-12-11 | 2009-06-25 | Sony Corp | 情報処理装置および方法、並びにプログラム |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6317739B1 (en) | 1997-11-20 | 2001-11-13 | Sharp Kabushiki Kaisha | Method and apparatus for data retrieval and modification utilizing graphical drag-and-drop iconic interface |
JP2003067397A (ja) | 2001-06-11 | 2003-03-07 | Matsushita Electric Ind Co Ltd | コンテンツ管理システム |
US20040172410A1 (en) | 2001-06-11 | 2004-09-02 | Takashi Shimojima | Content management system |
JP2003067393A (ja) | 2001-08-29 | 2003-03-07 | Matsushita Electric Ind Co Ltd | 問い合わせ応答方法および装置 |
CN100524297C (zh) * | 2002-12-12 | 2009-08-05 | 索尼株式会社 | 信息处理设备、信息处理方法、记录介质和程序 |
JP4380494B2 (ja) | 2004-10-07 | 2009-12-09 | ソニー株式会社 | コンテンツ・マネジメント・システム及びコンテンツ・マネジメント方法、並びにコンピュータ・プログラム |
US20080294607A1 (en) * | 2007-05-23 | 2008-11-27 | Ali Partovi | System, apparatus, and method to provide targeted content to users of social networks |
-
2011
- 2011-06-03 JP JP2012518264A patent/JP5879544B2/ja not_active Expired - Fee Related
- 2011-06-03 US US13/387,217 patent/US8732149B2/en not_active Expired - Fee Related
- 2011-06-03 WO PCT/JP2011/003160 patent/WO2011152072A1/ja active Application Filing
- 2011-06-03 CN CN201180003127.3A patent/CN102473196B/zh not_active Expired - Fee Related
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH11154153A (ja) * | 1997-11-20 | 1999-06-08 | Sharp Corp | データ検索方法、データ検索装置、及びコンピュータに実行させるデータ検索プログラムを記録した記録媒体 |
JP2000322445A (ja) * | 1999-05-14 | 2000-11-24 | Mitsubishi Electric Corp | 情報検索システムおよびこのプログラムを記録した記録媒体 |
JP2004206679A (ja) * | 2002-12-12 | 2004-07-22 | Sony Corp | 情報処理装置および方法、記録媒体、並びにプログラム |
JP2006318033A (ja) * | 2005-05-10 | 2006-11-24 | Olympus Imaging Corp | 画像管理装置、画像管理プログラム、画像管理方法及び記録媒体 |
JP2007058562A (ja) * | 2005-08-24 | 2007-03-08 | Sharp Corp | コンテンツ分類装置、コンテンツ分類方法、コンテンツ分類プログラムおよび記録媒体 |
JP2007310610A (ja) * | 2006-05-18 | 2007-11-29 | Hitachi Software Eng Co Ltd | 衛星画像処理システム |
JP2009140452A (ja) * | 2007-12-11 | 2009-06-25 | Sony Corp | 情報処理装置および方法、並びにプログラム |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016057901A (ja) * | 2014-09-10 | 2016-04-21 | 富士フイルム株式会社 | 画像処理装置、画像処理方法、プログラムおよび記録媒体 |
US9721163B2 (en) | 2014-09-10 | 2017-08-01 | Fujifilm Corporation | Image processing apparatus, image processing method, and recording medium |
CN104537252A (zh) * | 2015-01-05 | 2015-04-22 | 深圳市腾讯计算机系统有限公司 | 用户状态单分类模型训练方法和装置 |
JP2019207507A (ja) * | 2018-05-28 | 2019-12-05 | 株式会社電警 | サイト作成用画像選択支援方法 |
JP7072189B2 (ja) | 2018-05-28 | 2022-05-20 | 株式会社エヌエルプラス | サイト作成用画像選択支援方法 |
CN114327628A (zh) * | 2021-12-28 | 2022-04-12 | 深圳市汇川技术股份有限公司 | 分层控制方法、系统、终端设备及存储介质 |
JP7345677B1 (ja) * | 2022-06-06 | 2023-09-15 | 三菱電機株式会社 | 情報生成装置、情報出力装置、および、情報出力方法 |
Also Published As
Publication number | Publication date |
---|---|
US8732149B2 (en) | 2014-05-20 |
JP5879544B2 (ja) | 2016-03-08 |
CN102473196B (zh) | 2015-08-12 |
US20120127066A1 (en) | 2012-05-24 |
CN102473196A (zh) | 2012-05-23 |
JPWO2011152072A1 (ja) | 2013-07-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5879544B2 (ja) | コンテンツ出力装置、コンテンツ出力方法、プログラム、プログラム記録媒体及びコンテンツ出力集積回路 | |
US7716157B1 (en) | Searching images with extracted objects | |
US10289273B2 (en) | Display device providing feedback based on image classification | |
CN1680939B (zh) | 数字文件和数据的快速可视分类方法 | |
CN108369633B (zh) | 相册的视觉表示 | |
US20190163768A1 (en) | Automatically curated image searching | |
CN111339246A (zh) | 查询语句模板的生成方法、装置、设备及介质 | |
CN107622518A (zh) | 图片合成方法、装置、设备及存储介质 | |
US9538116B2 (en) | Relational display of images | |
WO2015107640A1 (ja) | アルバム作成プログラム、アルバム作成方法およびアルバム作成装置 | |
CN101783886A (zh) | 信息处理设备、信息处理方法和程序 | |
CN108028054A (zh) | 对自动生成的音频/视频展示的音频和视频分量进行同步 | |
US20110016398A1 (en) | Slide Show | |
JP2013105309A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
CN103403765B (zh) | 内容加工装置及其集成电路、方法 | |
US20130055079A1 (en) | Display device providing individualized feedback | |
CN113836334A (zh) | 图像处理装置、图像处理方法及记录介质 | |
CN116301348A (zh) | 一种博物馆展厅个性化游览布局方法及系统 | |
JP5237724B2 (ja) | 画像検索システム | |
CN117041679A (zh) | 视频剪辑方法、装置、计算机设备及存储介质 | |
US11283945B2 (en) | Image processing apparatus, image processing method, program, and recording medium | |
US9767579B2 (en) | Information processing apparatus, information processing method, and non-transitory computer readable medium | |
Leung et al. | Content-based retrieval in multimedia databases | |
JP2010257266A (ja) | コンテンツ出力システム、サーバー装置、コンテンツ出力装置、コンテンツ出力方法、コンテンツ出力プログラム、及びコンテンツ出力プログラムを記憶した記録媒体 | |
JP6276375B2 (ja) | 画像検索装置、画像表示装置、画像検索方法、および画像表示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180003127.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11789485 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13387217 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012518264 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 11789485 Country of ref document: EP Kind code of ref document: A1 |