Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application are capable of operating in sequences other than those illustrated or described herein. The objects distinguished by "first", "second", and the like are usually of one class, and the number of such objects is not limited; for example, the first object may be one object or a plurality of objects. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates that the preceding and succeeding related objects are in an "or" relationship.
The searching method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 1 is a schematic flowchart of a search method provided in an embodiment of the present application, including steps 201 to 203:
Step 201: the searching device displays M first search images according to N first query images input by a user.
The N first query images all indicate the target query item, each first search image indicates one first search item, and N and M are positive integers.
In this embodiment of the application, the N first query images may be images acquired by a search device, images stored in a system of the search device, images downloaded by the search device, and the like, which is not limited in this embodiment of the application. In embodiments of the present application, each first query image may comprise an image of all or a portion of the target query item.
In the embodiments of the present application, an article may be any article, for example, daily necessities (e.g., towels, quilts, etc.), electronic products (e.g., USB flash drives, cameras, etc.), or drinks (e.g., red wine, milk, etc.), and the embodiments of the present application are not limited thereto.
In the embodiments of the present application, there may be one or more target query items, and the embodiments of the present application do not limit this.
It is to be understood that the above-mentioned M first search images indicate M first search items.
In an example, the step 201 may specifically include the following step: the searching device displays a first search interface according to the N first query images input by the user. The first search interface comprises the M first search images.
In addition, the first search interface may include M first search links in addition to the M first search images, where one first search link corresponds to one first search image.
Optionally, before the step 201, the method may further include the following step A1:
step A1: the searching device receives the N first query images input by the user.
In one example, the user may enter the N first query images via the image search control. The image search control may be a newly added control or an original control, which is not limited in the embodiment of the present application.
For example, the image search control may be a control in the target application. For example, the target application may be a shopping application, a search application, or the like, which is not limited in this embodiment of the present application.
Optionally, in this embodiment of the application, when the searching apparatus searches out a plurality of first search images according to the N first query images, the searching apparatus may display the plurality of first search images in a certain order, so that the user may conveniently view the images.
For example, the displaying of M first search images in step 201 may specifically include the following step 201a:
Step 201a: the searching device displays the M first search images in a preset order.
The preset order may be a time order or an order of the degree of similarity between images, and the like, which is not limited in the embodiment of the present application.
In an example, in the case that the preset order is an order of the degree of similarity between images, that order may be: an order of the degree of similarity between each first search image and any first query image, or an order of the mean value of the degrees of similarity between each first search image and the N first query images.
For example, the searching device may display the M first search images in order of the degree of similarity between images from high to low, or from low to high, which is not limited in this embodiment of the application.
For example, the user inputs a query image 1 and a query image 2 through a mobile phone, and searches for a search image 1 and a search image 2 according to the query image 1 and the query image 2. At this time, the mobile phone can calculate that the degree of similarity between the search image 1 and the query image 1 is 10, the degree of similarity between the search image 1 and the query image 2 is 8, the degree of similarity between the search image 2 and the query image 1 is 8, and the degree of similarity between the search image 2 and the query image 2 is 9. Then, the mobile phone may determine that the mean similarity degree between the search image 1 and the 2 query images is 9, and the mean similarity degree between the search image 2 and the 2 query images is 8.5. Finally, the mobile phone can display the search image 1 and the search image 2 in sequence from high to low according to the average value of the similarity degree.
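The mean-similarity ranking in the example above can be sketched as follows; the function name and data layout are purely illustrative, not part of the embodiment:

```python
def rank_by_mean_similarity(similarity):
    """similarity maps a search-image name to its list of similarity
    degrees against each query image; returns the names sorted from
    high to low by mean similarity, along with the computed means."""
    means = {name: sum(vals) / len(vals) for name, vals in similarity.items()}
    order = sorted(means, key=means.get, reverse=True)
    return order, means

# The figures from the example: search image 1 vs. query images 1 and 2, etc.
order, means = rank_by_mean_similarity({
    "search image 1": [10, 8],
    "search image 2": [8, 9],
})
# search image 1 (mean 9.0) is displayed before search image 2 (mean 8.5)
```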
In another example, in the case that the preset order is a time order, the time order may be: the order of the times at which the searching device searched out each first search image, or the order of the shooting times of the first search images.
For example, the searching device may display the M first search images in chronological order from earliest to latest, or from latest to earliest, which is not limited in this embodiment of the application.
Step 202: the searching device performs image processing on a target query image according to a target search image in the M first search images to obtain a second query image.
Wherein the target query image is at least one of the N first query images.
In the embodiment of the present application, the target search image may be one or a plurality of target search images, and the embodiment of the present application is not limited thereto. In this embodiment of the present application, the target search image may be set by default in the system, or may be set by the user, which is not limited in this embodiment of the present application.
In an embodiment of the present application, the image processing on the target query image includes at least one of: replacing a region in the target query image with a region in the target search image, adjusting image parameters of the target query image, and splicing the target search image and the target query image. Illustratively, the image parameters may include at least one of: image brightness, image color, and image exposure value. It should be noted that the image parameters include, but are not limited to, the three parameters described above, and may be set according to actual requirements, which is not limited in the embodiment of the present application.
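The three operations listed above can be illustrated with a minimal sketch over row-major grayscale pixel grids; the list-of-lists representation, the 0 to 255 value range, and the horizontal splice are assumptions for illustration only:

```python
def replace_region(que, ask, top, left, h, w):
    """Replace an h x w region of query image `que` (starting at row `top`,
    column `left`) with the same region taken from search image `ask`."""
    out = [row[:] for row in que]
    for y in range(top, top + h):
        for x in range(left, left + w):
            out[y][x] = ask[y][x]
    return out

def adjust_brightness(img, delta):
    """Adjust one image parameter (brightness) by a constant offset,
    clipping pixel values to the 0..255 range."""
    return [[max(0, min(255, p + delta)) for p in row] for row in img]

def splice(img_a, img_b):
    """Splice two images of equal height side by side."""
    return [ra + rb for ra, rb in zip(img_a, img_b)]
```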
It should be noted that, the target query image may be corrected by performing the image processing on the target query image, so that the corrected second query image better meets the requirements of the user, and the accuracy of item search may be improved.
It should be noted that, in the case where N is equal to 1, the target query image is the first query image.
in one example, the target search item indicated by the target search image described above may belong to the same category as the target query item. For example, the target query item belongs to red wine, and the target search item also belongs to red wine.
Specifically, the search apparatus may first determine the category to which the target query item belongs; then, the searching device may determine, from the M first search items, the search items that belong to the same category as the target query item; then, the searching device may determine the target search item from those search items, and the search image corresponding to the target search item is the target search image.
In one example, the searching apparatus may perform step 202 described above after receiving the first input of the user.
For example, the first input may be: the click input of the user on the screen of the search device, or the voice instruction input by the user, or the specific gesture input by the user may be specifically determined according to the actual use requirement, and the embodiment of the present application does not limit this.
The specific gesture in the embodiment of the application may be any one of a single-click gesture, a sliding gesture, a dragging gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-click gesture; the click input in the embodiment of the application may be single-click input, double-click input, or click input of any number of times, and may also be long-press input or short-press input.
Step 203: the searching device displays X second search images according to the second query image.
Wherein each second search image indicates a second search item, and X is a positive integer.
In an embodiment of the present application, the displaying of X second search images may include the following step: the search apparatus displays the X second search images in the preset order; for details, refer to the description of displaying the M first search images in the preset order in the present application, which is not repeated here.
In an example, the step 203 may specifically include the following step: the searching device displays a second search interface according to the second query image. The second search interface comprises the X second search images.
In addition, the second search interface may include X second search links in addition to the X second search images, where one second search link corresponds to one second search image.
In this embodiment of the application, after the step 203, the method may further include the following steps: the searching means may perform image processing on a part or all of the second query image according to a third search image among the X second search images to obtain a third query image. Then, the search means may display Y fourth search images based on the third query image described above. And so on until the user no longer searches. Therefore, the accuracy of image searching can be improved through repeated iterative searching, and the searched articles can better meet the requirements of users.
It should be noted that, in the embodiment of the present application, a description of performing image processing on the target query image according to the target search image may be specifically referred to in the process of performing image processing on a part or all of the second query image according to the third search image, and details are not described here again.
According to the searching method provided by the embodiment of the application, the searching device first displays M first search images according to N first query images input by a user; then performs image processing on the target query image according to the target search image among the M first search images to obtain a second query image; and finally displays X second search images based on the second query image. The N first query images all indicate the target query item, each first search image indicates one first search item, the target query image is at least one of the N first query images, and each second search image indicates one second search item. With this scheme, when a user wants to search for an article through the image search function of an application program, even if the image quality of the first query image input by the user is not high, the search device can perform image processing on the target query image by using the searched target search image to obtain the second query image, that is, correct the target query image, instead of requiring the user, as in the related art, to input an image of high image quality or to manually input keywords and search again. The search device may then perform a search using the corrected second query image to obtain the X second search images. Therefore, the steps of searching for articles can be simplified, the efficiency can be improved, the accuracy of article searching can be improved, and the searched articles can better meet the requirements of users.
Optionally, in the embodiment of the present application, the search device may determine the above target search image in at least two possible ways.
A first possible implementation (system default):
Illustratively, before the step 202, the method may further include the following steps 204a and 204b:
Step 204a: the search device determines the degree of similarity between each first search image and each first query image.
For example, the similarity degree in the present application may be determined according to a related similarity degree evaluation algorithm, which is not described herein again.
In one example, the degree of similarity in the present application can be characterized by a feature distance, wherein the smaller the feature distance, the higher the degree of similarity, and vice versa. That is, the search apparatus may determine the degree of similarity between the first search image and the first query image by calculating a characteristic distance (e.g., euclidean distance, manhattan distance, chebyshev distance, cosine of an included angle, etc.) between the first search image and the first query image.
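The feature distances named above can be sketched with plain Python lists as feature vectors; this is a minimal illustration of the distance measures, not the embodiment's feature extractor:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def chebyshev(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def cosine(a, b):
    # Cosine of the included angle: 1 means identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)
```

A smaller Euclidean, Manhattan, or Chebyshev distance, or a cosine closer to 1, indicates a higher degree of similarity between the two images' features.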
Step 204 b: the searching device determines the first searching image with the similarity degree larger than or equal to a second preset threshold value as the target searching image.
For example, the second preset threshold may be set by default in the system, or may be set by the user, which is not limited in the embodiment of the present application.
In an example, the second preset threshold may be a fixed value or a non-fixed value, which is not limited in this embodiment of the application. For example, the second preset threshold may be a fixed value 90, or may be an average of all the determined degrees of similarity, or may be a maximum of all the determined degrees of similarity.
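Step 204b with a non-fixed threshold (here, the mean of all determined similarity degrees, one of the options mentioned above) could be sketched as follows; the function and data names are illustrative assumptions:

```python
def select_targets(similarities, threshold=None):
    """similarities maps each first search image to its degree of
    similarity. If no threshold is given, use the mean of all degrees
    (one of the non-fixed options mentioned above)."""
    if threshold is None:
        threshold = sum(similarities.values()) / len(similarities)
    return sorted(n for n, s in similarities.items() if s >= threshold)

targets = select_targets({"a": 95, "b": 80, "c": 89})
# the mean threshold is 88, so images "a" and "c" qualify as target images
```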
A second possible implementation (user-set):
for example, after the step 201, before the step 202, the method may further include the following steps B1 and B2:
step B1: the searching device receives a second input of the user to at least one first searching image in the M first searching images.
For example, the second input may be: the click input of the user to the at least one first search image, or the voice instruction input by the user, or the specific gesture input by the user may be specifically determined according to the actual use requirement, which is not limited in this embodiment of the application.
Step B2: in response to the second input, the search device determines the at least one first search image as the target search image.
For example, as shown in fig. 2, the mobile phone displays an item search interface 31 of the shopping application 1, and a picture upload control 32 and a shooting control 33 are displayed in the item search interface 31. When a user wants to search for the red wine in a query image (i.e., the target query item described above), the user can click the picture upload control 32, then find and click the query image. At this time, the mobile phone will search using the query image. Then, as shown in fig. 3, the mobile phone may display, in the search interface 41, a thumbnail 42 corresponding to the query image and thumbnails corresponding to 3 search images, where the thumbnails corresponding to the 3 search images are a search image a thumbnail 43, a search image b thumbnail 44, and a search image c thumbnail 45, respectively. When the user clicks the search image a thumbnail 43, the mobile phone can determine that the search image a is a replacement image, i.e., an image used to correct the query image (i.e., the target search image described above).
The searching method provided by the embodiment of the application can be applied to a scene of determining the target search image. In one example, the searching device can directly determine the target search image from the M first search images; in another example, the user can select the target search image according to requirements. The process of determining the target search image is therefore more flexible.
Optionally, in this embodiment of the present application, the search apparatus may perform image processing on the target query image in a targeted manner according to different image problems (for example, low image definition or missing image) of the target query image, so as to correct the target query image, thereby improving the accuracy of image search.
Illustratively, the step 202 may specifically include the following step 202a:
Step 202a: the searching device replaces a first area in the target query image with a second area in the target search image.
For example, the first area may be all or a part of the area in the target query image, and the second area may be all or a part of the area in the target search image, which is not limited in this embodiment of the application. For example, the search means may divide the target query image into a plurality of regions including the first region.
For example, the region in the present application may be a region of any shape. For example, the area may be a rectangular area, or may be a triangular area, or may be a circular area, and the like, which is not limited in this embodiment.
For example, the search means may replace the first area with the second area in at least two possible implementations.
In a first possible implementation (smart correction):
for example, the search device may perform intelligent correction on the target query image, that is, the search device may automatically determine the first region and the second region, and replace the first region with the second region.
The following will exemplarily describe the intelligent correction process by taking one first query image and one target search image as an example.
For example, the search apparatus may determine, among the M first search images, the first search image having the smallest feature distance to the first query image (i.e., the highest degree of similarity), and use it as the target search image. Then, the searching device can automatically detect and mark a first region in the first query image, find the second region corresponding to the first region in the target search image, and cut out the second region. Finally, the search means may replace the first region in the first query image with the extracted second region. The specific process is as follows:
(1) The searching device may preliminarily classify the first query image and the M first search images by using a deep learning classification model, and assign a category attribute label to the article corresponding to each image. For example, the category attribute label of the target query item corresponding to the first query image is red wine.
(2) The search means may screen out, from the M first search items corresponding to the M first search images, the items matching the category attribute label of the target query item as candidate items. It should be noted that restricting candidates to commodities with the same category attribute greatly reduces inaccurate matching of candidate items.
(3) The search means may scale the first query image and the M first search images all to size H × W, and extract global features of the first query image and of the M first search images, respectively, using a deep learning feature extractor. Here, H and W are the height and width to which all images are uniformly scaled.
(4) The search device may include an attention module in the deep learning feature extractor, and use the attention module to screen out, from the global feature of each of the first query image and the M first search images, a characteristic detail region feature vector capable of distinguishing the article.
(5) The search device may expand the characteristic detail region feature, of dimensions H × W × C, into a 1-dimensional column vector, where C is the number of channels of the feature output after the global feature passes through the attention module.
(6) The search means may calculate feature distances between the 1-dimensional column vector corresponding to the first query image and the 1-dimensional column vector corresponding to each first search image, respectively, and select the first search image with the smallest feature distance as the target search image.
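Steps (5) and (6), flattening each H × W × C feature into a column vector and picking the search image with the smallest feature distance to the query, can be sketched as follows; the helper names are illustrative and Euclidean distance is assumed:

```python
import math

def flatten(tensor):
    # Step (5): expand an H x W x C nested list into a 1-D vector.
    return [v for row in tensor for cell in row for v in cell]

def nearest(query_tensor, search_tensors):
    # Step (6): the search image with the smallest feature distance wins.
    q = flatten(query_tensor)
    def dist(t):
        v = flatten(t)
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(q, v)))
    return min(search_tensors, key=lambda item: dist(item[1]))[0]

# Two toy 1 x 1 x 2 features: image "a" matches the query exactly.
winner = nearest([[[1.0, 2.0]]],
                 [("a", [[[1.0, 2.0]]]), ("b", [[[5.0, 5.0]]])])
```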
(7) The searching device may locate the characteristic detail region in the first query image and the characteristic detail region in the target search image, respectively, according to the characteristic detail region feature vectors.
(8) The searching device may extract feature points in the characteristic detail region of the first query image and in that of the target search image, respectively, using a traditional image algorithm, match the feature points, and then delete mismatched feature points through geometric verification.
(9) The search means may extract a first region containing all the matching feature points from the first query image and extract a second region containing all the matching feature points from the target search image, respectively.
For example, referring to fig. 3, as shown in (a) of fig. 4, a circumscribed rectangular frame 51 is displayed on the query image, and the region framed by the circumscribed rectangular frame 51 is the region of the query image that needs to be replaced (i.e., the first region described above), where A(x1, y1) is the coordinate of the upper left corner and B(x2, y2) is the coordinate of the lower right corner of the region framed by the circumscribed rectangular frame 51. Referring to fig. 3, as shown in (b) of fig. 4, a circumscribed rectangular frame 52 is displayed on the search image a, and the region framed by the circumscribed rectangular frame 52 is the region of the search image a used for replacement (i.e., the second region), where C(p1, q1) is the coordinate of the upper left corner and D(p2, q2) is the coordinate of the lower right corner of the region framed by the circumscribed rectangular frame 52.
(10) The searching device may rotate and deform the target search image according to the geometric arrangement of the feature points in the first region extracted from the first query image, so that a preset proportion (e.g., 80%) of the feature points in the target search image have the same geometric arrangement as those in the first query image.
(11) The search device may acquire an image integration mask M and a reverse mask M̄.
For example, the search device may obtain the image integration mask M according to formula (1), and obtain the reverse mask M̄ according to formula (2):
M(x, y) = 1, if x1 ≤ x ≤ x2 and y1 ≤ y ≤ y2; M(x, y) = 0, otherwise (1)
M̄(x, y) = 1 − M(x, y) (2)
where x1 and y1 are the coordinate values of coordinate point A, and x2 and y2 are the coordinate values of coordinate point B.
(12) The search means may process the target search image such that, in the processed target search image, the second region is located at the same position as the first region. Illustratively, taking the circumscribed rectangular region of the target search image as a reference, with upper left corner coordinate C(p1, q1) and lower right corner coordinate D(p2, q2), the search means may extend the region leftward to p1 − x1, upward to q1 − y1, rightward to p2 + W − x2, and downward to q2 + H − y2, thereby obtaining the processed target search image.
(13) The search means may replace the first area with the second area. For example, this can be realized by the following formula (3):
pic = que ⊙ M̄ + ask ⊙ M (3)
where pic is the spliced second query image, que is the first query image, ask is the processed target search image, and ⊙ denotes pixel-wise multiplication.
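Formulas (1) to (3) can be sketched per pixel on small grayscale grids. The inclusive rectangle bounds and the element-wise blend are one reading reconstructed from the surrounding description of points A and B, so treat this as an illustrative interpretation rather than the embodiment's exact formulas:

```python
def integration_mask(h, w, x1, y1, x2, y2):
    """Formula (1): mask M is 1 inside the rectangle framed by A(x1, y1)
    and B(x2, y2), 0 elsewhere; the reverse mask of formula (2) is 1 - M."""
    return [[1 if (x1 <= x <= x2 and y1 <= y <= y2) else 0
             for x in range(w)] for y in range(h)]

def blend(que, ask, mask):
    """Formula (3): pic = que * (1 - M) + ask * M, applied pixel-wise."""
    return [[ask[y][x] if mask[y][x] else que[y][x]
             for x in range(len(que[0]))] for y in range(len(que))]

mask = integration_mask(3, 3, 1, 1, 2, 2)
pic = blend([[0] * 3 for _ in range(3)], [[9] * 3 for _ in range(3)], mask)
# the lower-right 2 x 2 block of pic comes from ask, the rest from que
```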
For example, referring to (a) and (b) in fig. 4, the mobile phone may replace the area framed by the circumscribed rectangular frame 51 in the query image with the area framed by the circumscribed rectangular frame 52 in the search image a; as shown in fig. 5, the mobile phone finally obtains a corrected query image (i.e., the second query image). Then, the mobile phone can search again using the corrected query image.
It should be noted that the search apparatus may use an edge smoothing algorithm to reduce the color difference between the second region and the other regions in the first query image, thereby obtaining the intelligently corrected second query image.
The searching method provided by the embodiment of the application can be applied to a scene of intelligently correcting the first query image. The searching device can automatically select the first area that needs to be replaced in the first query image and the second area used for replacement in the target search image, so that the process of correcting the first query image is simple and efficient, and the accuracy of subsequent image searching can be improved.
In a second possible implementation (manual correction):
For example, the searching apparatus may manually correct the target query image, that is, the user selects the first area and the second area, and the searching apparatus then replaces the first area with the second area.
For example, before the step 202a, the method may further include the following steps 202b and 202c:
step 202 b: the search means receives a third input from the user.
For example, the third input may be: the click input of the user on the first query image, or the voice instruction input by the user, or the specific gesture input by the user may be specifically determined according to the actual use requirement, which is not limited in the embodiment of the present application.
Step 202 c: in response to the third input, the first and second regions are determined.
In one example, the search apparatus may display the first query image and the target search image in a matting interface.
For example, in connection with fig. 3, when the user wants to select search image a as the substitute image, the user may click the thumbnail 43 of search image a. At this time, as shown in fig. 6, the mobile phone may display the matting interface 61, automatically display the query image and a rectangular frame 62 in the left fixed area 61a of the matting interface 61, and display the search image a and a rectangular frame 63 in the right fixed area 61b of the matting interface 61. Then, the user can manually adjust the rectangular frame 62 to frame the region in the query image that needs to be replaced (i.e., the first region mentioned above), and manually adjust the rectangular frame 63 to frame the region in search image a used for replacement (i.e., the second region mentioned above), as required. Finally, the mobile phone can replace the area framed by the rectangular frame 62 with the area framed by the rectangular frame 63.
The searching method provided by the embodiment of the application can be applied to a scene of manually correcting the first query image. The user can select, according to requirements, the first area that needs to be replaced in the first query image and the second area used for replacement in the target search image, so that the replacement area better meets the requirements of the user, and the accuracy of subsequent image searching can be improved.
Further optionally, in this embodiment of the present application, the searching apparatus may determine the degree of blur of the first region, and then determine whether region replacement is required, so as to reduce the workload of the searching apparatus.
Illustratively, the step 202a may specifically include the following step 202a1:
Step 202a1: the searching device replaces the first area in the target query image with the second area in the target search image when the degree of blur of the first area is greater than or equal to a first preset threshold.
For example, the first preset threshold may be set by default in the system, or may be set by the user, which is not limited in the embodiment of the present application.
For example, the searching apparatus may determine the blurring degree of the first region through an associated blurring detection algorithm, which is not described herein again.
It should be noted that, in the case that the degree of blur of the first area is smaller than the first preset threshold, the search device may not perform any processing on the target query image.
The searching method provided by the embodiment of the application can be applied to a scene of judging whether to perform image correction on the first query image. The searching device can determine whether to perform image correction by judging the degree of blur of the first area, and corrects the first query image only when the degree of blur of the first area is greater than or equal to the first preset threshold, so that the image correction process is more flexible and the workload of the searching device is reduced.
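One common way to estimate a region's degree of blur is the variance of a Laplacian response over the region; the application does not specify its blur detection algorithm, so the measure below is only an assumed illustration. Note the comparison direction: with this measure a blurrier region gives a smaller value, so in practice it would be compared against a sharpness threshold rather than a blur threshold:

```python
def laplacian_variance(gray):
    """Variance of the 4-neighbour Laplacian over a grayscale grid;
    lower values indicate less edge detail, i.e. more blur."""
    h, w = len(gray), len(gray[0])
    responses = [gray[y - 1][x] + gray[y + 1][x] + gray[y][x - 1]
                 + gray[y][x + 1] - 4 * gray[y][x]
                 for y in range(1, h - 1) for x in range(1, w - 1)]
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

flat = [[5] * 4 for _ in range(4)]                               # no detail
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]    # sharp edges
# the flat region scores 0.0 (maximally blurry); the checkerboard scores higher
```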
Optionally, in this embodiment of the application, when the search apparatus obtains multiple query images containing the same item, the search apparatus may perform parallel search and joint processing on the multiple query images, so as to improve the accuracy of the search.
In one example, in the case that N is greater than 1, before the displaying of the M first search images in the above-mentioned step 201, the method may further include the following steps C1 and C2:
Step C1: the searching device determines target image feature information corresponding to the target query item according to the N first query images.
Step C2: the searching device determines the M first search images according to the characteristic information of the target image.
For example, the searching device may determine an image feature information group corresponding to the target query item in each first query image, where the image feature information group includes at least one piece of image feature information, so as to obtain N image feature information groups corresponding to N first query images. The target image feature information may include all or part of image feature information in the N image feature information groups. For example, the target image feature information may include all of the repeated image feature information in the N image feature information groups.
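One way to realize the "repeated image feature information" described above can be sketched as follows; the function `target_feature_info` and the treatment of image feature information as hashable descriptors are illustrative assumptions, not part of the embodiment:

```python
from collections import Counter

def target_feature_info(groups):
    """Sketch of step C1: keep the image feature information that recurs
    across the N per-image feature information groups."""
    counts = Counter(f for g in groups for f in set(g))
    return {f for f, c in counts.items() if c >= 2}
```

For instance, a descriptor present in two or more of the N groups is retained as target image feature information, while descriptors seen in only one group are dropped.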
In another example, in case that N is greater than 1, before the displaying of the M first search images in the above-mentioned step 201, the method may further include the following steps D1 and D2:
step D1: the searching device determines, according to each first query image, a search image group corresponding to that first query image, so as to obtain N search image groups.
Step D2: the searching device determines the M first search images from the N search image groups.
For example, the search means may perform image integration on the search images in the N search image groups. Specifically, the search means may scale the first query image and each fifth search image in the N search image groups to size (H, W), where H is the height of the first query image and W is the width of the first query image. Then, the search apparatus may use a deep learning feature extraction model to extract three-dimensional features of shape H × W × C from the first query image and from each fifth search image, where C is the number of feature channels in the global image features output by the model. Next, the search device may perform channel fusion on each three-dimensional feature to obtain a feature map of shape H × W × 1. Finally, the searching device may expand each H × W × 1 feature map into a one-dimensional column vector, and sequentially calculate the feature distance between the column vector corresponding to the first query image and the column vector corresponding to each fifth search image.
Illustratively, let A(x1, y1) be the feature vector of the first query image and B(x2, y2) be the feature vector of any fifth search image in the N search image groups. The feature distance may then be calculated as the Euclidean distance:

d(A, B) = sqrt((x1 - x2)^2 + (y1 - y2)^2)
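The scaling, extraction, channel fusion, flattening, and distance steps above can be sketched in miniature; the random-projection "extractor", the toy sizes H, W, C, and all function names below are stand-in assumptions for the deep learning model of the embodiment:

```python
import numpy as np

H, W, C = 8, 8, 4                     # toy sizes; in the text H, W come from the first query image
rng = np.random.default_rng(0)
WEIGHTS = rng.standard_normal(C)      # hypothetical stand-in for a trained backbone

def extract_features(img):
    """Stand-in deep-learning extractor: maps an (H, W) image to (H, W, C)."""
    return img[..., None] * WEIGHTS

def fuse_and_flatten(feat):
    """Channel fusion to (H, W, 1), then expansion into a one-dimensional column."""
    fused = feat.mean(axis=-1, keepdims=True)   # (H, W, 1)
    return fused.reshape(-1)                    # column vector of length H*W

def feature_distance(a, b):
    """Euclidean feature distance between two flattened columns."""
    return float(np.linalg.norm(a - b))

query = rng.random((H, W))
same = query.copy()
other = rng.random((H, W))
col_q = fuse_and_flatten(extract_features(query))
```

An identical image yields a feature distance of exactly zero, and a different image a strictly positive one, which is the ordering the threshold comparison below relies on.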
For example, the searching means may display the fifth search images in the N search image groups whose feature distance is less than or equal to a third preset threshold. Further, among all the fifth search images whose feature distance is less than or equal to the third preset threshold, the search means may display a first preset number (e.g., the first 30) of fifth search images with the smallest feature distances.
For example, the searching device may determine whether the search items corresponding to all the fifth search images contain duplicate items; if so, the searching device may preferentially display the fifth search images corresponding to the duplicated items for the user to view; if not, the search means may display the fifth search images in order of feature distance from smallest to largest.
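The threshold filtering, top-30 truncation, and duplicate-first display order described above can be sketched as one ranking routine; the tuple layout (image_id, item_id, feature_distance) and the function name are illustrative assumptions:

```python
from collections import Counter

def order_for_display(candidates, threshold, top_k=30):
    """candidates: list of (image_id, item_id, feature_distance).
    Keep images within the third preset threshold, take the top_k with the
    smallest distances, and surface images of duplicated items first."""
    kept = sorted((c for c in candidates if c[2] <= threshold),
                  key=lambda c: c[2])[:top_k]
    item_counts = Counter(item for _, item, _ in kept)
    # duplicated items first (count >= 2), then ascending feature distance
    kept.sort(key=lambda c: (item_counts[c[1]] < 2, c[2]))
    return [img for img, _, _ in kept]

candidates = [("i1", "shoe", 0.3), ("i2", "bag", 0.1),
              ("i3", "shoe", 0.4), ("i4", "hat", 0.9)]
```

With a threshold of 0.5 and top_k of 3, "i4" is filtered out, and the two "shoe" images are surfaced ahead of the closer "bag" image because their item recurs.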
The searching method provided by the embodiment of the application can be applied to a scene of searching for the articles by using a plurality of query images, and the searching device searches for the articles by using a plurality of images comprising the same query article, so that the accuracy of article searching can be improved, and the article searching process is more flexible.
It should be noted that, in the search method provided in the embodiment of the present application, the execution subject may be a search apparatus, or a control module in the search apparatus for executing the search method. The embodiment of the present application takes a search device executing a search method as an example, and the search device provided in the embodiment of the present application is described.
Fig. 7 is a schematic diagram of a possible structure of a search apparatus for implementing the embodiment of the present application, and as shown in fig. 7, the search apparatus 700 includes: a display module 701 and an image processing module 702, wherein: a display module 701, configured to display M first search images according to N first query images input by a user; an image processing module 702, configured to perform image processing on a target query image according to a target search image in the M first search images displayed by the display module 701 to obtain a second query image; a display module 701, configured to display X second search images according to the second query image processed by the image processing module 702; wherein the N first query images each indicate a target query item, each first search image indicates one first search item, the target query image is at least one of the N first query images, each second search image indicates one second search item, N, M and X are positive integers.
Optionally, the image processing module 702 is specifically configured to replace the first area in the target query image with the second area in the target search image.
Optionally, the image processing module 702 is specifically configured to, when the degree of blur of the first area is greater than or equal to a first preset threshold, replace the first area in the target query image with a second area in the target search image.
Optionally, as shown in fig. 7, the search apparatus 700 further includes: a determination module 703; the determining module 703 is configured to determine a similarity degree between each first search image and each first query image, and to determine the first search image whose similarity degree is greater than or equal to a second preset threshold as the target search image.
Optionally, as shown in fig. 7, the search apparatus 700 further includes: a determination module 703; the determining module 703 is configured to determine, when N is greater than 1, target image feature information corresponding to the target query item according to the N first query images, and to determine the M first search images according to the target image feature information.
Optionally, the display module 701 is specifically configured to display the M first search images according to a preset sequence, where the preset sequence is a time sequence or a sequence of similarity degrees between the images.
It should be noted that, as shown in fig. 7, modules that are necessarily included in the search apparatus 700 are illustrated by solid line boxes, such as a display module 701; modules that may or may not be included in the search apparatus 700 are illustrated with dashed boxes, such as the determination module 703.
According to the searching device provided by the embodiment of the application, firstly, the searching device can display M first searching images according to N first query images input by a user. Then, the searching apparatus may perform image processing on the target query image according to the target search image among the M first search images to obtain a second query image. Finally, the search means may display X second search images based on the second query image. The N first query images all indicate target query items, each first search image indicates one first search item, the target query image is at least one of the N first query images, and each second search image indicates one second search item. Through the scheme, when a user wants to search for a certain article through the image search function of the application program, compared with the scheme that the article is searched again in a mode that the user needs to input an image with high image quality or manually input a keyword in the related art, the search device in the application can perform image processing on the target query image through the searched target search image to obtain the second query image under the condition that the image quality of the first query image input by the user is not high, namely, the target query image is corrected. Then, the search device may perform a search using the corrected second query image to obtain X second search images. Therefore, the steps of searching for the articles can be simplified, the efficiency is improved, the accuracy of searching for the articles can be improved, and the searched articles can better meet the requirements of users.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
The searching device in the embodiment of the present application may be a device, and may also be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The search device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an iOS operating system, or another possible operating system, and embodiments of the present application are not specifically limited.
The search apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 6, and is not described here again to avoid repetition.
Optionally, as shown in fig. 8, an electronic device 800 is further provided in this embodiment of the present application, and includes a processor 801, a memory 802, and a program or an instruction stored in the memory 802 and capable of running on the processor 801, where the program or the instruction is executed by the processor 801 to implement each process of the foregoing search method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic devices in the embodiments of the present application include the mobile electronic devices and the non-mobile electronic devices described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 100 includes, but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, and a processor 110.
Those skilled in the art will appreciate that the electronic device 100 may further comprise a power source (e.g., a battery) for supplying power to the various components, and the power source may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than those shown, combine some components, or have a different arrangement of components, and thus, the description is not repeated here.
The display unit 106 is configured to display M first search images according to N first query images input by a user; a processor 110, configured to perform image processing on a target query image according to a target search image in the M first search images displayed by the display unit 106 to obtain a second query image; a display unit 106, configured to display X second search images according to the second query image processed by the processor 110; wherein the N first query images each indicate a target query item, each first search image indicates one first search item, the target query image is at least one of the N first query images, each second search image indicates one second search item, N, M and X are positive integers.
Optionally, the processor 110 is specifically configured to replace the first area in the target query image with the second area in the target search image.
Optionally, the processor 110 is specifically configured to replace the first region in the target query image with the second region in the target search image when the degree of blur of the first region is greater than or equal to a first preset threshold.
Optionally, the processor 110 is further configured to determine a degree of similarity between each first search image and each first query image; and determining the first search image with the similarity degree larger than or equal to a second preset threshold as the target search image.
Optionally, the processor 110 is further configured to determine, according to the N first query images, target image feature information corresponding to the target query item when N is greater than 1; and determine the M first search images according to the target image feature information.
Optionally, the display unit 106 is specifically configured to display the M first search images according to a preset order, where the preset order is a time order or an order of similarity degrees between the images.
According to the electronic device provided by the embodiment of the application, firstly, the electronic device can display M first search images according to N first query images input by a user. Then, the electronic device may perform image processing on the target query image according to the target search image in the M first search images to obtain a second query image. Finally, the electronic device can display X second search images based on the second query image. The N first query images all indicate target query items, each first search image indicates one first search item, the target query image is at least one of the N first query images, and each second search image indicates one second search item. Through the scheme, when a user wants to search for a certain article through the image search function of the application program, compared with a scheme that the user needs to input an image with high image quality or manually input a keyword to re-search for the article in the related art, the electronic device in the application can perform image processing on the target query image through the searched target search image to obtain the second query image under the condition that the image quality of the first query image input by the user is not high, namely, the target query image is corrected. Then, the electronic device may perform a search using the corrected second query image to obtain X second search images. Therefore, the steps of searching for the articles can be simplified, the efficiency is improved, the accuracy of searching for the articles can be improved, and the searched articles can better meet the requirements of users.
The beneficial effects of the various implementation manners in this embodiment may specifically refer to the beneficial effects of the corresponding implementation manners in the above method embodiments, and are not described herein again to avoid repetition.
It should be understood that, in the embodiment of the present application, the input Unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042, and the Graphics Processing Unit 1041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071 is also referred to as a touch screen. The touch panel 1071 may include two parts of a touch detection device and a touch controller. Other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 109 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 110 may integrate an application processor, which primarily handles operating systems, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above search method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above search method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.