
CN117788510B - Background removing processing system for image data reading - Google Patents


Info

Publication number
CN117788510B
Authority
CN
China
Prior art keywords
data
image
background
different
confirming
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410148240.XA
Other languages
Chinese (zh)
Other versions
CN117788510A (en)
Inventor
Chen Zhenhua (陈振华)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huilang Times Technology Co Ltd
Original Assignee
Beijing Huilang Times Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huilang Times Technology Co Ltd filed Critical Beijing Huilang Times Technology Co Ltd
Priority to CN202410148240.XA
Publication of CN117788510A
Application granted
Publication of CN117788510B
Legal status: Active
Anticipated expiration


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a background removal processing system for image data reading, relating to the technical field of image reading. It addresses the problem that, because identical image data is not taken into account during removal, background removal is prone to deleting data by mistake.

Description

Background removing processing system for image data reading
Technical Field
The invention relates to the technical field of image reading, in particular to a background removing processing system for image data reading.
Background
Image data capture refers to the process of identifying information such as text, numbers and charts in an image and converting it into digital data that a computer can process; such methods can be used to extract data from a variety of file types, including PDFs, Word documents, images and videos.
Patent publication CN105631888B relates to an image data background removal processing system and method in the technical field of computers. That invention reads in image data and outputs the processed image in combination with mask codes, achieves fast hardware-parallel processing, and uses frame-skip control logic to limit the amount of bandwidth consumed, so that images can be read in and written out quickly, with significant bandwidth savings and high detection efficiency.
When image data is read, the corresponding background data must be removed; however, because identical image data is not taken into account during removal, data is easily deleted by mistake: part of the image content is removed along with the background and must be repaired later, which degrades the overall viewing quality of the whole image.
Disclosure of Invention
In view of the shortcomings of the prior art, the invention provides a background removal processing system for image data reading, which solves the problem that identical image data is not considered during removal, so that false deletion easily occurs in the normal removal process.
In order to achieve the above purpose, the invention is realized by the following technical scheme: a background removal processing system for image data reading, comprising:
the image point position confirming end is used for receiving an image to be processed that requires background removal, confirming characteristic points in different areas of the received image, and marking the confirmed characteristic points in the original image to be processed, wherein the contour paths on the two sides of each characteristic point differ;
The point location characteristic analysis end comprises a point location curve construction unit, a function confirmation unit and a characteristic analysis unit;
The point location curve construction unit is used for carrying out curve analysis on the image to be processed with the confirmed characteristic points, confirming the characteristic points of different aggregation areas, constructing contour trend lines corresponding to the different aggregation areas, and transmitting the constructed different contour trend lines into the function confirmation unit, wherein the specific mode is as follows:
dividing different characteristic points according to the aggregation strength among the characteristic points, so as to confirm the characteristic points of different aggregation areas;
Connecting the outermost feature points of each aggregation area to form the contour trend line of that area, and transmitting the different contour trend lines determined by the different aggregation areas to the function confirmation unit (a clustering and outer-contour sketch is given below);
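The patent does not fix a particular clustering or contour-construction algorithm, so the following is only a minimal Python sketch: it assumes the feature points are available as (x, y) coordinates, uses a hypothetical distance threshold in place of the "aggregation strength", and approximates the contour trend line of each aggregation area by its convex hull.

```python
import numpy as np
from scipy.spatial import ConvexHull

def group_feature_points(points, dist_threshold=20.0):
    """Group feature points into aggregation areas by mutual distance.

    points: (N, 2) array of (x, y) feature-point coordinates.
    dist_threshold: hypothetical preset value standing in for the
    "aggregation strength"; points closer than this are put in the
    same aggregation area (single-linkage flood fill)."""
    points = np.asarray(points, dtype=float)
    labels = np.full(len(points), -1, dtype=int)
    current = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        labels[i] = current
        stack = [i]
        while stack:                      # spread the label to all reachable neighbours
            j = stack.pop()
            dists = np.linalg.norm(points - points[j], axis=1)
            for k in np.where(dists < dist_threshold)[0]:
                if labels[k] == -1:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels

def contour_trend_line(area_points):
    """Connect the outermost feature points of one aggregation area
    (approximated here by the convex hull) to form its contour trend line."""
    area_points = np.asarray(area_points, dtype=float)
    if len(area_points) < 3:              # too few points for a hull
        return area_points
    hull = ConvexHull(area_points)
    return area_points[hull.vertices]     # ordered outer feature points

# usage sketch:
# labels = group_feature_points(feature_points)
# trend_lines = [contour_trend_line(feature_points[labels == c]) for c in np.unique(labels)]
```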
The function confirmation unit is used for combining the determined different contour trend lines with the two-dimensional coordinate system, determining corresponding coordinate parameters according to specific point positions in the contour trend lines after the combination is completed, subsequently, confirming a plurality of groups of functions of the different contour trend lines according to the different coordinate parameters, and integrating function sequences of the corresponding contour trend lines to obtain an integrated function sequence packet;
The feature analysis unit compares and analyzes the different function sequence packets of the image to be processed with those of the front and rear adjacent frames, so that the corresponding background image in the image to be processed is confirmed and the confirmed background image is transmitted to the association parameter confirmation end, the specific mode being as follows:
confirming corresponding function sequence packets through analysis processing of point location feature analysis ends on images of front and rear adjacent frames;
comparing and analyzing the confirmed function sequence packets and confirming the function repetition rate; if the function repetition rate of two groups of function sequence packets satisfies: function repetition rate ≥ 95%, the corresponding contour trend lines are marked as similar curves, the area where the similar curves are located is marked as the background image area, and the confirmed background image is transmitted to the association parameter confirmation end (a repetition-rate sketch is given below);
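As a rough illustration of the 95% rule, the sketch below assumes that every function in a sequence packet can be reduced to a tuple of rounded coefficients (the patent leaves the function encoding open) and computes the repetition rate as the share of functions common to the packets of two adjacent frames.

```python
def repetition_rate(packet_a, packet_b, ndigits=2):
    """Fraction of functions shared by two function sequence packets.

    Each packet is a list of functions; a function is represented here as a
    tuple of coefficients (an assumption -- the patent leaves the encoding
    open). Coefficients are rounded so nearly identical segments compare equal."""
    set_a = {tuple(round(c, ndigits) for c in f) for f in packet_a}
    set_b = {tuple(round(c, ndigits) for c in f) for f in packet_b}
    if not set_a or not set_b:
        return 0.0
    return len(set_a & set_b) / max(len(set_a), len(set_b))

def is_background(packet_prev, packet_next, threshold=0.95):
    """Mark a contour trend line as background when the repetition rate of its
    function sequence packet across adjacent frames is >= 95%."""
    return repetition_rate(packet_prev, packet_next) >= threshold
```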
The association parameter confirming end confirms the confirmed background image, extracts the background data corresponding to the background image, analyzes the association degree among different internal data from the extracted background data, establishes an association coefficient parameter table belonging to the background image, and then fully eliminates the background data according to the association coefficient parameter table, wherein the specific mode is as follows:
extracting and confirming background data of a background image, determining data of different features in the background data, and marking the data as column data;
From among the several sets of column data, the source data is identified, where the source data is the original data of the background data; the column data associated with the source data is then analyzed and confirmed, and the total proportion ZBi of the associated data generated by the source data is confirmed, where i denotes the different proportions and: ZBi = associated data capacity within the column data ÷ total capacity of the column data;
sequentially confirming the associated data corresponding to the different columns of data and the corresponding total proportion ZBi, and subsequently generating the association coefficient parameter table of the background data according to the total proportions of the different columns of data;
and when the background data is removed, if data identical to the image data exists, the association degree between the identical data and the preceding group of data is analyzed according to the specific association coefficients in the association coefficient parameter table; the copy with the matching association degree is marked as data to be removed, the other copy of the identical data is retained, and the background data is thus fully removed (a sketch of the ratio computation and removal decision is given below).
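The patent does not specify how associated data is detected inside a column, so the following sketch assumes each column record already carries an "is associated" flag, builds the association coefficient parameter table as one total proportion ZBi per column, and uses a hypothetical threshold to decide which copy of duplicated data is treated as background and removed.

```python
def total_proportion(column):
    """ZBi = associated data capacity within the column / total column capacity.

    column: list of (value, is_associated) records; is_associated flags data
    generated from the source data (an assumption -- the patent only states
    that such associated data exists)."""
    if not column:
        return 0.0
    associated = sum(1 for _value, is_assoc in column if is_assoc)
    return associated / len(column)

def build_association_table(columns):
    """Association coefficient parameter table: one total proportion ZBi per column."""
    return {i: total_proportion(col) for i, col in enumerate(columns)}

def should_remove(value, column_index, table, image_values, assoc_threshold=0.5):
    """Decide whether a background value that also appears in the image data is
    the background copy (remove) or the image copy (keep).

    Rule sketched here: if the column's association coefficient with the
    preceding data is high, the duplicated value is attributed to the background
    and removed; assoc_threshold is hypothetical, not taken from the patent."""
    if value not in image_values:
        return True                       # plain background data is always removed
    return table.get(column_index, 0.0) >= assoc_threshold
```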
Preferably, the system further comprises a picture critical analysis end and a filling unit, wherein the picture critical analysis end comprises an edge confirmation unit, a sharp point locking unit and an area confirmation unit;
The picture critical analysis end first confirms the edge contour of the display image through the edge confirmation unit, then confirms the turning angles inside the edge contour through the sharp point locking unit and confirms the sharp points belonging to the edge contour, confirms the radiation areas according to the sharp points, confirms the filling areas through the area confirmation unit, and the filling unit subsequently trims and fills the display image according to the confirmed filling areas; the specific method is as follows:
the edge confirmation unit confirms the edge contour of the display image by first blurring the image and then re-sharpening it;
subsequently, the included angles between adjacent line segments of the edge contour are confirmed and recorded as angle parameters JJk, where k denotes the different included angles; each JJk is checked against the condition JJk ≥ Y1, where Y1 is a preset value; if the condition is met, no processing is performed, and if it is not met, the corresponding included angle is marked as a sharp point;
then, taking each confirmed sharp point as the circle centre and a preset parameter Y2 as the radius, the surrounding region is radiated outward and the radiation area is confirmed, where Y2 is a preset value; the filling unit then fills in the missing background image within the radiation area, completing the repair of the whole display image.
Advantageous effects
The invention provides a background removal processing system for image data reading. Compared with the prior art, the method has the following beneficial effects:
According to the invention, the characteristic functions of the different areas in the image are confirmed and then integrated, the images of the front and rear frames are analyzed and compared, and the corresponding background image is locked; the method is quick and simple, and the corresponding background image data can be locked directly;
subsequently, when the background image is removed, the association between the data is fully considered and the corresponding association coefficient parameter table is determined from it; when identical data is encountered later, the association between that data and the preceding group of data is analyzed and the data to be removed is locked through the association coefficient parameter table, so that no errors occur during data removal and the accuracy of the background data removal processing is improved;
subsequently, for an image with sharp point areas, image patching is used to fill the sharp point areas of the display image, so that the display image does not appear too abrupt when viewed and the subsequent overall viewing effect is improved.
Drawings
FIG. 1 is a schematic diagram of the overall framework of the present invention;
FIG. 2 is a block diagram of the point location feature analysis end of the present invention;
FIG. 3 is a block diagram of the picture critical analysis end of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
Referring to fig. 1, the application provides a background removal processing system for image data reading, which comprises an image point location confirming end, a point location feature analyzing end, an associated parameter confirming end, a picture critical analyzing end and a filling unit, wherein the image point location confirming end is electrically connected with a point location feature analyzing end input node, the point location feature analyzing end is electrically connected with an associated parameter confirming end input node, the associated parameter confirming end is electrically connected with a picture critical analyzing end input node, the image point location confirming end is electrically connected with a filling unit input node, and the filling unit is electrically connected with a picture critical analyzing end input node;
referring to fig. 2, the point location feature analysis terminal includes a point location curve construction unit, a function confirmation unit, and a feature analysis unit, where the point location curve construction unit is electrically connected to an input node of the function confirmation unit, and the function confirmation unit is electrically connected to an input node of the feature analysis unit;
The image point location confirming end receives an image to be processed that requires background removal, confirms feature points in different areas of the received image, and marks the confirmed feature points in the original image to be processed, where the contour paths on the two sides of each feature point differ, i.e. a feature point is generated where two groups of different contour paths intersect. In a specific application, for a group of mountain images, the points where several edge angles exist in the photographed image, such as the vertices of the mountain tops, are relatively sharp and form a group of feature points; the salient points of the edge contours are likewise corresponding feature points. Images in different states produce different feature points; for example, in the blue-sky background behind the mountains, feature points exist between the blue sky and the clouds, but their arrangement is certainly not denser than that of the mountain image. Determining the feature points of an image is prior art and is not described at length here; one common prior-art choice, corner detection, is sketched below.
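Since feature-point determination is left to the prior art, the sketch below shows one common prior-art option, corner detection with OpenCV's goodFeaturesToTrack; the parameter values are illustrative only and are not taken from the patent.

```python
import cv2
import numpy as np

def detect_feature_points(image_bgr, max_corners=500, quality=0.01, min_distance=5):
    """Detect candidate feature points (corners / sharp contour points).

    Only one prior-art option; the patent does not prescribe a detector.
    Returns an (N, 2) array of (x, y) coordinates."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, max_corners, quality, min_distance)
    if corners is None:
        return np.empty((0, 2))
    return corners.reshape(-1, 2)

def mark_feature_points(image_bgr, points):
    """Mark the confirmed feature points in the original image to be processed."""
    marked = image_bgr.copy()
    for x, y in points:
        cv2.circle(marked, (int(x), int(y)), 3, (0, 0, 255), -1)
    return marked
```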
The point location curve construction unit in the point location feature analysis end performs curve analysis on the image to be processed after feature point confirmation, confirms feature points of different aggregation areas, constructs contour trend lines corresponding to the different aggregation areas, and transmits the constructed different contour trend lines to the function confirmation unit, wherein the specific mode for constructing the contour trend lines of the different aggregation areas is as follows:
Dividing the feature points according to the aggregation strength among them so as to confirm the feature points of the different aggregation areas; specifically, when the distance between corresponding points is smaller than a certain preset value, those points belong to the same aggregation area, and otherwise they do not; different image areas, including background areas and feature areas, exist in the same group of images to be processed, and the feature points they generate have different aggregation strengths, thus forming the different aggregation areas;
Connecting the outermost feature points of each aggregation area to form the contour trend line of that area, and transmitting the different contour trend lines determined by the different aggregation areas to the function confirmation unit.
The function confirmation unit combines the determined different contour trend lines with a two-dimensional coordinate system, determines the corresponding coordinate parameters from the specific points in each contour trend line once the combination is complete, subsequently confirms several groups of functions for the different contour trend lines according to the different coordinate parameters, and integrates the function sequences of the corresponding contour trend lines to obtain an integrated function sequence packet, where different contour trend lines correspond to different function sequence packets. Specifically, a contour trend line contains several groups of line segments with different trends, each group of line segments corresponds to a different function, so one contour trend line produces several groups of function sequences, which together form its function sequence packet; a minimal encoding of this step is sketched below.
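As a minimal sketch of this step, the following assumes a contour trend line is an ordered list of (x, y) points in the two-dimensional coordinate system and reduces each consecutive pair of points to one linear function given by its slope and intercept; the exact encoding is an assumption, since the patent only requires that each differently trending segment map to one function.

```python
def function_sequence_packet(trend_line_points):
    """Build a function sequence packet from one contour trend line.

    trend_line_points: ordered (x, y) points of the trend line placed in the
    two-dimensional coordinate system. Each consecutive pair of points is
    reduced to one linear function y = a*x + b (vertical segments are stored
    as ('x=', x)). The encoding is an assumption; the patent only requires
    that each differently trending segment map to one function."""
    packet = []
    pts = [tuple(map(float, p)) for p in trend_line_points]
    for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
        if x2 == x1:
            packet.append(('x=', round(x1, 3)))        # vertical segment
        else:
            a = (y2 - y1) / (x2 - x1)                  # slope
            b = y1 - a * x1                            # intercept
            packet.append((round(a, 3), round(b, 3)))
    return packet
```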
the feature analysis unit compares and analyzes the different function sequence packets of the image to be processed with those of the front and rear adjacent frames, so that the corresponding background image in the image to be processed is confirmed and the confirmed background image is transmitted to the association parameter confirmation end, the specific mode of confirming the background image being as follows:
processing the images of the front and rear adjacent frames in the same way, and confirming the corresponding function sequence packet;
comparing and analyzing the confirmed function sequence packets and confirming the function repetition rate; if the function repetition rate of two groups of function sequence packets satisfies: function repetition rate ≥ 95%, the corresponding contour trend lines are marked as similar curves, the area where the similar curves are located is marked as the background image area, and the confirmed background image is transmitted to the association parameter confirmation end;
Specifically, the background of the front and rear frames does not change, so the corresponding overlap parameters can be confirmed from the overlap ratio produced by the function comparison; the region with the higher overlap ratio is the background, and once the background is confirmed it can be removed later.
Example two
In the specific implementation of this embodiment, the difference from the first embodiment is as follows:
The correlation parameter confirmation end confirms the confirmed background image, extracts the background data corresponding to the background image, analyzes the correlation degree among different internal column data from the extracted background data, establishes a correlation coefficient parameter table belonging to the background image, and then fully eliminates the background data according to the correlation coefficient parameter table, and transmits the image to be processed after the background data elimination treatment to the picture critical analysis end, wherein the specific mode for establishing the correlation coefficient parameter table of the corresponding background image is as follows:
Extracting and confirming the background data of the background image, determining the data of different features in the background data, and marking them as column data; specifically, each group of data contains different features, and the corresponding features, such as colour parameters, resolution parameters and other image parameters, can be identified quickly by adaptive recognition;
From among the several sets of column data, the source data is identified, where the source data is the original data of the background data and can be understood as the starting end of a large amount of data; the column data subsequently associated with the source data is analyzed and confirmed, and the total proportion ZBi of the associated data generated by the source data is confirmed, where i denotes the different proportions and: ZBi = associated data capacity within the column data ÷ total capacity of the column data. An association exists between a group of data and subsequent data when, for example, some of the subsequent data is generated from the preceding group by a certain algorithm or other means; that generated portion is the associated data within the subsequent data;
Sequentially confirming the associated data corresponding to the different columns of data and the corresponding total proportion ZBi, and subsequently generating the association coefficient parameter table of the background data according to the total proportions of the different columns of data;
Subsequently, when the background data is removed, if data identical to the image data exists, the association degree between the identical data and the preceding group of data is analyzed according to the specific association coefficients in the association coefficient parameter table; the copy with the matching association degree is marked as data to be removed, the other copy of the identical data is retained, and the background data is thus fully removed;
Specifically, because the background data and the image data belong to the same group of images, colour or resolution values may repeat between them; removing the data in this way avoids the false-deletion situation in which the corresponding data in the image data is deleted while the corresponding background data is retained.
Example III
In the specific implementation, this embodiment further comprises a corresponding data filling process, which prevents the retained display image from appearing abrupt after the background data has been deleted; the display image therefore needs to be filled to a certain proportion, and its boundary trimmed, through the picture critical analysis end and the filling unit;
Referring to fig. 3, the image critical analysis end includes an edge confirmation unit, a sharp point locking unit and an area confirmation unit, wherein the edge confirmation unit is electrically connected with an input node of the sharp point locking unit, and the sharp point locking unit is electrically connected with the input node of the area confirmation unit;
The picture critical analysis end first confirms the edge contour of the display image through the edge confirmation unit, then confirms the turning angles inside the edge contour through the sharp point locking unit and confirms the sharp points belonging to the edge contour, confirms the radiation areas according to the sharp points, confirms the filling areas through the area confirmation unit, and the filling unit subsequently trims and fills the display image according to the confirmed filling areas; the specific mode of confirming the filling areas is as follows:
the edge confirmation unit confirms the edge contour of the display image by first blurring the image and then re-sharpening it;
subsequently, the included angles between adjacent line segments of the edge contour are confirmed and recorded as angle parameters JJk, where k denotes the different included angles; each JJk is checked against the condition JJk ≥ Y1, where Y1 is a preset value determined by the operator according to experience; if the condition is met, no processing is performed, and if it is not met, the corresponding included angle is marked as a sharp point;
then, taking each confirmed sharp point as the circle centre and a preset parameter Y2 as the radius, the surrounding region is radiated outward and the radiation area is confirmed; Y2 is a preset value whose specific value is set by the operator according to experience, typically 1 mm; the filling unit then fills in the missing background image within the radiation area, completing the repair of the whole display image (an OpenCV-based sketch of these steps is given below).
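The OpenCV-based sketch below ties these steps together: blurring followed by Canny edge detection for the edge contour, approxPolyDP segments for the included angles JJk, a threshold Y1 for sharp points, a circular radiation mask of radius Y2, and inpainting as one possible way to fill the missing background. The threshold values and the choice of inpainting are assumptions, not prescribed by the patent; in practice Y1 and Y2 would be the preset values chosen by the operator as described above.

```python
import cv2
import numpy as np

def edge_contour(display_image, blur_ksize=5):
    """Blur first, then re-sharpen the edges with Canny to confirm the edge contour."""
    gray = cv2.cvtColor(display_image, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (blur_ksize, blur_ksize), 0)
    edges = cv2.Canny(blurred, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def sharp_points(contour, y1_degrees=60.0):
    """Included angles JJk between adjacent segments; angles below Y1 are sharp points."""
    poly = cv2.approxPolyDP(contour, 2.0, True).reshape(-1, 2).astype(float)
    found = []
    n = len(poly)
    for k in range(n):
        a, b, c = poly[k - 1], poly[k], poly[(k + 1) % n]
        v1, v2 = a - b, c - b
        cos_jj = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        jj = np.degrees(np.arccos(np.clip(cos_jj, -1.0, 1.0)))
        if jj < y1_degrees:               # JJk >= Y1 means no processing; otherwise sharp point
            found.append((int(b[0]), int(b[1])))
    return found

def fill_radiation_areas(display_image, points, y2_radius=10):
    """Radiate a circle of radius Y2 around each sharp point and fill the missing
    background there; inpainting is used as one possible filling method."""
    mask = np.zeros(display_image.shape[:2], dtype=np.uint8)
    for x, y in points:
        cv2.circle(mask, (x, y), y2_radius, 255, -1)
    return cv2.inpaint(display_image, mask, y2_radius, cv2.INPAINT_TELEA)
```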
By adopting this image patching mode, the display image is patched and its sharp point regions are filled, so that the display image does not appear too abrupt when viewed and the subsequent overall viewing effect is improved.
Example IV
In its specific implementation, this embodiment includes all three of the embodiments described above.
Some of the data in the above formulas are numerical values calculated with their dimensions removed, and the contents not described in detail in this specification are well known in the prior art.
The above embodiments are only for illustrating the technical method of the present invention and not for limiting the same, and it should be understood by those skilled in the art that the technical method of the present invention may be modified or substituted without departing from the spirit and scope of the technical method of the present invention.

Claims (6)

1. A background removal processing system for image data reading, comprising:
the image point position confirming end is used for receiving an image to be processed, which is required to be subjected to background removal, confirming characteristic points of different areas of the image to be processed from the received image to be processed, marking the confirmed characteristic points in the original image to be processed, and enabling contour paths on two sides of the characteristic points to be different;
The point location characteristic analysis end comprises a point location curve construction unit, a function confirmation unit and a characteristic analysis unit;
The point location curve construction unit is used for carrying out curve analysis on the image to be processed with the confirmed characteristic points, confirming the characteristic points of different aggregation areas, constructing contour trend lines corresponding to the different aggregation areas, and transmitting the constructed different contour trend lines into the function confirmation unit;
The function confirmation unit is used for combining the determined different contour trend lines with the two-dimensional coordinate system, determining corresponding coordinate parameters according to specific point positions in the contour trend lines after the combination is completed, subsequently, confirming a plurality of groups of functions of the different contour trend lines according to the different coordinate parameters, and integrating function sequences of the corresponding contour trend lines to obtain an integrated function sequence packet;
The feature analysis unit compares and analyzes different function sequence packets in the image to be processed with images of front and rear adjacent frames, so that corresponding background images in the image to be processed are confirmed, and the confirmed background images are transmitted to the associated parameter confirmation end;
And the association parameter confirmation end confirms the confirmed background image, extracts the background data corresponding to the background image, analyzes the association degree among different internal data from the extracted background data, establishes an association coefficient parameter table belonging to the background image, and then fully eliminates the background data according to the association coefficient parameter table.
2. The background removal processing system for image data reading according to claim 1, wherein the point location curve construction unit constructs contour trend lines of different aggregation areas in the following specific ways:
dividing different characteristic points according to the aggregation strength among the characteristic points, so as to confirm the characteristic points of different aggregation areas;
Connecting the feature points of the different aggregation areas outside to form contour trend lines of the external feature points, and transmitting the different contour trend lines determined by the different aggregation areas into the function confirmation unit.
3. The background removal processing system for image data reading according to claim 1, wherein the feature analysis unit confirms the background image in a specific manner that:
confirming corresponding function sequence packets through analysis processing of point location feature analysis ends on images of front and rear adjacent frames;
Comparing and analyzing the confirmed function sequence packets, and confirming the function repetition rate, wherein if the function repetition rate of the two groups of function sequence packets meets the following conditions: and when the function repetition rate is more than or equal to 95%, marking the corresponding contour trend line as a similar curve, marking the same area where the similar curve is positioned as a background image area, and transmitting the confirmed background image to the associated parameter confirmation end.
4. The background removal processing system for image data reading according to claim 1, wherein the specific way for the association parameter confirmation end to establish the association coefficient parameter table of the corresponding background image is as follows:
extracting and confirming background data of a background image, determining data of different features in the background data, and marking the data as column data;
From among the several sets of column data, the source data is identified, where the source data is the original data of the background data; the column data associated with the source data is then analyzed and confirmed, and the total proportion ZBi of the associated data generated by the source data is confirmed, where i denotes the different proportions and: ZBi = associated data capacity within the column data ÷ total capacity of the column data;
Sequentially confirming the associated data corresponding to the different columns of data and the corresponding total proportion ZBi, and subsequently generating the association coefficient parameter table of the background data according to the total proportions of the different columns of data;
And when the background data is removed, if data identical to the image data exists, the association degree between the identical data and the preceding group of data is analyzed according to the specific association coefficients in the association coefficient parameter table; the copy with the matching association degree is marked as data to be removed, the other copy of the identical data is retained, and the background data is thus fully removed.
5. The background removal processing system for image data reading of claim 1, further comprising a frame critical analysis end and a shim unit, wherein the frame critical analysis end comprises an edge validation unit, a sharp point lock unit, and a region validation unit;
and the picture critical analysis end confirms the edge outline of the display image by the edge confirming unit preferentially, then confirms the turning angle inside the edge outline by the sharp point locking unit, confirms the sharp point belonging to the edge outline, confirms the radiation area according to the sharp point, confirms the filling area by the area confirming unit, and then, trims and fills the display image according to the confirmed filling area by the filling unit.
6. The background removal processing system for image data reading according to claim 5, wherein the picture critical analysis end confirms the filling area in the following specific manner:
the edge confirming unit confirms the edge outline of the display image preferentially through a mode of follow-up resharpening in an image blurring mode;
Subsequently, the included angles between adjacent line segments of the edge contour are confirmed and recorded as angle parameters JJk, where k denotes the different included angles; each JJk is checked against the condition JJk ≥ Y1, where Y1 is a preset value; if the condition is met, no processing is performed, and if it is not met, the corresponding included angle is marked as a sharp point;
and then, taking each confirmed sharp point as the circle centre and a preset parameter Y2 as the radius, the surrounding region is radiated outward and the radiation area is confirmed, where Y2 is a preset value; the filling unit then fills in the missing background image within the radiation area, completing the repair of the whole display image.
CN202410148240.XA 2024-02-02 2024-02-02 Background removing processing system for image data reading Active CN117788510B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410148240.XA CN117788510B (en) 2024-02-02 2024-02-02 Background removing processing system for image data reading

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410148240.XA CN117788510B (en) 2024-02-02 2024-02-02 Background removing processing system for image data reading

Publications (2)

Publication Number Publication Date
CN117788510A CN117788510A (en) 2024-03-29
CN117788510B true CN117788510B (en) 2024-05-31

Family

ID=90394684

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410148240.XA Active CN117788510B (en) 2024-02-02 2024-02-02 Background removing processing system for image data reading

Country Status (1)

Country Link
CN (1) CN117788510B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105631888A (en) * 2016-01-22 2016-06-01 上海厚安信息技术有限公司 Image data background removing processing system and image data background removing processing method
WO2017113794A1 (en) * 2015-12-31 2017-07-06 北京体基科技有限公司 Gesture recognition method, control method and apparatus, and wrist-type device
CN111209898A (en) * 2020-03-12 2020-05-29 敦泰电子(深圳)有限公司 Method and device for removing optical fingerprint image background

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170193644A1 (en) * 2015-12-30 2017-07-06 Ebay Inc Background removal

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017113794A1 (en) * 2015-12-31 2017-07-06 北京体基科技有限公司 Gesture recognition method, control method and apparatus, and wrist-type device
CN105631888A (en) * 2016-01-22 2016-06-01 上海厚安信息技术有限公司 Image data background removing processing system and image data background removing processing method
CN111209898A (en) * 2020-03-12 2020-05-29 敦泰电子(深圳)有限公司 Method and device for removing optical fingerprint image background

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A Novel Background Subtraction using Canonical Correlation Analysis; P. Ramya et al.; IEEE; 2016-12-31; full text *

Also Published As

Publication number Publication date
CN117788510A (en) 2024-03-29

Similar Documents

Publication Publication Date Title
US8761511B2 (en) Preprocessing of grayscale images for optical character recognition
EP0843275B1 (en) Pattern extraction apparatus and method for extracting patterns
CN109766778A (en) Invoice information input method, device, equipment and storage medium based on OCR technology
CN113012059B (en) Shadow elimination method and device for text image and electronic equipment
JP2003228712A (en) Method for identifying text-like pixel from image
CN102473278B (en) Image processing apparatus, image processing method, and storage medium
CN108830275B (en) Recognition method and device for dot matrix characters and dot matrix numbers
US5708731A (en) Pattern matching method with pixel vectors
JP2766053B2 (en) Image data processing method
US12236620B2 (en) Three-dimensional reconstruction method and apparatus
US20180089157A1 (en) Text editing in an image of a document
CN117788510B (en) Background removing processing system for image data reading
US9275316B2 (en) Method, apparatus and system for generating an attribute map for processing an image
US6751345B2 (en) Method and apparatus for improving object boundaries extracted from stereoscopic images
CN111079624B (en) Sample information acquisition method and device, electronic equipment and medium
US11423597B2 (en) Method and system for removing scene text from images
US5760787A (en) Data storage format
JP2012098852A (en) Image processing apparatus and image processing program
JP2005352735A (en) Document file creation support device, document file creation support method, and program thereof
CN109409370B (en) Remote desktop character recognition method and device
CN111652013A (en) Character filtering method, device, equipment and storage medium
CN116362202B (en) Font generation method, storage medium and electronic device
US8941881B2 (en) Method and apparatus for rasterizing transparent page
US20180268583A1 (en) Method for generating a single representative image from multi-view images
CN110246098B (en) A Fragment Recovery Method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant