WO2021155575A1 - Electric device, method of controlling electric device, and computer readable storage medium - Google Patents
Electric device, method of controlling electric device, and computer readable storage medium
- Publication number
- WO2021155575A1 (PCT/CN2020/074508)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- signal processor
- image signal
- image
- page
- point cloud
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/387—Composing, repositioning or otherwise geometrically modifying originals
- H04N1/3877—Image rotation
- H04N1/3878—Skew detection or correction
Definitions
- the present invention relates to an electric device, a method of controlling the electric device, and a computer readable storage medium.
- the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electric device and a method of controlling the electric device.
- an electric device may include:
- a camera module that takes a photograph of a subject to acquire a master camera image
- a range sensor module that acquires range depth information of the subject by using a light
- an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information
- the image signal processor controls the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface,
- the image signal processor estimates a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information
- the image signal processor obtains an image of a surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
- the range sensor module emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information as the range depth information.
- the image signal processor obtains a master camera image including the curved page surface, by taking a photograph of the curved page surface that has been opened and curved with the camera module, and
- the image signal processor acquires the ToF depth information of the curved page surface by the range sensor module.
- the image signal processor estimates the curve corresponding to the position of the curved page surface, in a plane perpendicular to the crease direction at the position of the opened crease of the page, based on the master camera image and the ToF depth information.
- the image signal processor obtains a master camera image including a curved page surface of another page in the state where the book is opened, and ToF depth information of the curved page surface of the other page,
- the image signal processor estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information,
- the image signal processor obtains an image of a surface of the other page that has been projection transformed into a plane, by projection transforming the curved page surface of the other page in the master camera image to be a plane, based on the estimated other curve, and
- the image signal processor synthesizes the acquired planar images of the surfaces of the two pages, and acquires a planar image of the surfaces of the two pages in the state where the book is opened.
- the image signal processor sets a crease position designation frame designated by the user on the opened page in the image taken by the camera module in the state where the book is opened, and
- the image signal processor acquires the master camera image including the curved page surface of the page in which the crease position designation frame is set, by taking an image of the curved page surface of the page that has been opened and curved in the state of opening the book with the camera module.
- a display module that displays predefined information
- a main processor that controls the display module and the input module
- the image signal processor displays the crease position designation frame on the display module together with the master camera image taken by the camera module, and
- the image signal processor sets the crease position designation frame at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame by the user to the input module.
- the image signal processor obtains reference point cloud data for the curved page surface, based on the master camera image and the ToF depth information
- the image signal processor calculates a normal vector by applying a principal component analysis to the reference point cloud data
- the image signal processor performs projection transform of the reference point cloud data into first point cloud data of the master camera image taken from the depth direction, based on the calculated normal vector.
- the image signal processor scans the first point cloud data along a plurality of lines, in a direction perpendicular to the longitudinal direction of the crease position designation frame,
- the image signal processor calculates a slope of a valley of the first point cloud data, by applying the least squares method to the scanned first point cloud data,
- the image signal processor estimates the crease position based on the calculated slope of the valley
- the image signal processor obtains second point cloud data by first rotating the first point cloud data so that the estimated crease position is parallel to a preset reference direction.
- the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to the reference direction
- the image signal processor calculates a slope of the ridge in the reference direction of the scanned second point cloud data by applying a least square method to the scanned second point cloud data
- the image signal processor acquires the third point cloud data by a second rotation of the second point cloud data, so that the calculated slope of the ridge is parallel to a preset reference plane.
- the image signal processor scans the third point cloud data along a plurality of lines in the reference direction
- the image signal processor calculates an average value of the third point cloud data in the vicinity of the plurality of lines in the depth direction
- the image signal processor approximates the calculated average value as a curve in a direction perpendicular to the reference direction and a depth direction by a fourth-order or higher order polynomial.
- the image signal processor divides the third point cloud data into a plurality of rectangular areas for each section obtained by dividing the curve
- the image signal processor obtains the image of the surface of the page that has been projective transformed into a plane by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on a relation between coordinates of the plurality of rectangular regions and coordinates of the projection space of the reference point cloud data.
- a detection resolution of the range sensor module is lower than a detection resolution of the camera module.
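The per-rectangle flattening described in the preceding claim elements is, in effect, a projective (homography) transform for each rectangular area. As an illustrative sketch only (the claims do not name a specific algorithm), a 3x3 homography can be solved from the four corner correspondences of one rectangle by the direct linear transform:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 3x3 projective transform (homography) that maps four
    source corners onto four destination corners, via the direct linear
    transform. Illustrative only: the claims do not name an algorithm."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector is the null vector of A (last right-singular vector)
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pt):
    """Map a 2D point through H with perspective division."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])
```

Applied per rectangular area, with `src` taken from the curved page region in the master camera image and `dst` from the corresponding flattened rectangle, this kind of transform yields the planar page image the claims describe.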
- a method for controlling an electric device including: a camera module that takes a photograph of a subject to acquire a master camera image; a range sensor module that acquires range depth information of the subject by using a light; and an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information,
- the method including:
- the range depth information is time of flight (ToF) depth information.
- a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device, and the method comprises:
- the range depth information is time of flight (ToF) depth information.
- FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention
- FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1;
- FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the surface of a page of the book 101 in an opened state into a plane;
- FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information;
- FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated;
- FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process
- FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the surface of the curved page imaged from the tilted direction with the book opened;
- FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation as photographed from the depth direction (from the front) ;
- FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame;
- FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) ;
- FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) ;
- FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2;
- FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (z-y plane) ;
- FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction) ;
- FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3;
- FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) ;
- FIG. 11 is a diagram illustrating a specific example of step S25 for executing the dividing process shown in FIG. 6;
- FIG. 12A is a diagram illustrating an example of dividing a curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range;
- FIG. 12B is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, is continuous with FIG. 12A;
- FIG. 12C is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, is continuous with FIG. 12B;
- FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3;
- FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E;
- FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the point cloud data P1;
- FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J, one for each section obtained by dividing the curve E, into which the third point cloud data P3 is divided;
- FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from a plurality of rectangular areas J in the three-dimensional space (x, y, z) ;
- FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a captured image
- FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the surface of the curved page in the master camera image becomes a plane;
- FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projected and transformed into a plane.
- FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention.
- FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1.
- the electric device 100 includes a camera module 10, a range sensor module 20, and an image signal processor 30 that controls the camera module 10 and the range sensor module 20, and processes camera image data acquired from the camera module 10.
- reference numeral 101 in FIG. 1 depicts a subject, which is a book.
- the camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
- the camera module 10 includes, for example, a Gyro sensor 10d that detects the angular velocity and the acceleration of the camera module 10, a focus &OIS actuator 10f that actuates the master lens 10a, and a focus &OIS driver 10e that drives the focus &OIS actuator 10f, as shown in FIG. 2.
- the camera module 10 acquires a master camera image of the subject 101, for example (FIG. 1) .
- the range sensor module 20 includes, for example, a ToF lens 20a, a range sensor 20b that detects the reflection light inputted via the ToF lens 20a, a range sensor driver 20c that drives the range sensor 20b, and a projector 20d that outputs the pulse lights.
- the range sensor module 20 acquires range depth information of the subject 101 by using a light. Specifically, the range sensor module 20 acquires the time of flight (ToF) depth information (ToF depth value) as the range depth information by emitting pulsed light toward the subject 101 and detecting the reflected light from the subject 101, for example.
- the resolution of the detection by the range sensor module 20 is lower than the resolution of the detection by the camera module 10.
- the image signal processor 30 controls, for example, the camera module 10 and the range sensor module 20 to acquire a camera image, which is the master camera image, based on the master camera image obtained by means of the camera module 10 and the ToF depth information obtained by means of the range sensor module 20.
- Furthermore, as shown in FIG. 2, for example, the electric device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
- the GNSS module 40 measures the current position of the electric device 100, for example.
- the wireless communication module 41 performs wireless communications with the Internet, for example.
- the CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
- the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
- the microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
- the display module 45 displays predefined information.
- the input module 46 receives a user's input (a user's operations) .
- An IMU 47 detects, for example, the angular velocity and the acceleration of the electric device 100.
- the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
- the memory 49 stores a program and data required for the image signal processor 30 to control the camera module 10 and the range sensor module 20, acquired image data, and a program and data required for the main processor 48 to control the electric device 100.
- the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the main processor 48, the computer program implements a method for controlling the electric device 100.
- the method comprises: controlling, by means of the image signal processor 30, the camera module 10 and the range sensor module 20 to acquire the master camera image including a curved page surface in a state where a book is opened, and range depth information of the curved page surface; estimating, by means of the image signal processor 30, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and obtaining, by means of the image signal processor 30, an image of the surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
- the electric device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electric devices (for instance, a tablet computer and a PDA) including camera modules.
- FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the curved page surface of a page of the book 101 in an opened state into a plane.
- the image signal processor 30 controls the camera module 10 and the range sensor module 20 to take a master camera image including a curved page surface in a state where a book is opened, and the ToF depth information of the curved page surface (a step S1) .
- the image signal processor 30 executes the top view processing (a step S2) .
- the top view processing estimates a crease position of the page based on the master camera image and the ToF depth information, and estimates a curve corresponding to the position of the curved page surface of the page perpendicular to the crease position.
- the image signal processor 30 executes image correction processing (a step S3) .
- This image correction processing projects and transforms the curved page surface of the page in the master camera image so as to be a plane based on the estimated curve.
- the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed to a plane.
- the image signal processor 30 determines whether or not additional photographing, such as of another curved page in the state where the book is opened, is necessary (a step S4) .
- the image signal processor 30 returns to the step S1, and takes the master camera image including the curved page surface of the other page in the state where the book is opened and the ToF depth information of the curved page surface of the page.
- the image signal processor 30 estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information.
- the image signal processor 30 performs projective transformation on the curved page surface of the other page curved in the master camera image so as to be a plane based on the estimated other curve. Thereby, the image signal processor 30 acquires an image of the surface of the other page transformed onto the plane.
- when the image signal processor 30 determines in the step S4 that no additional photographing is necessary, the image signal processor 30 synthesizes the acquired planar images of the surfaces of the two pages. Thereby, the image signal processor 30 acquires a planar image of the surfaces of the two pages in the state where the book is opened (a step S5) .
- the present invention has the following preconditions (a) to (d) .
- (a) The book is opened with almost the same degree of bending above and below (along the crease direction) .
- (b) The binding margin (the crease position) of the book is positioned substantially above or below one of the center, left end, and right end of the screen.
- (c) The resolution of the detection by the range sensor module 20 is lower than the resolution of the camera module 10.
- (d) The distance to the surface of the book can be detected by the range sensor module 20.
- FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information.
- FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated.
- the image signal processor 30 sets the crease position designation frame 45a designated by the user on the opened page in the image taken by the camera module 10 in the state where the book is opened (a step S11) .
- the image signal processor 30 displays the crease position designation frame 45a on the display module 45 together with the master camera image taken by the camera module 10.
- the image signal processor 30 sets the crease position designation frame 45a at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame 45a by the user to the input module 46.
- the crease position designation frame 45a is first displayed at the center or the previous operation position in the display module 45.
- the user touches the position to be designated.
- nothing may be displayed at first, and then, when the user touches the display module 45, the crease position designation frame 45a is displayed at the touched position in the display module 45.
- the crease position designation frame 45a rotates up and down or left and right in the display module 45.
- the frame direction designation button B1 is included in the input module 46.
- the crease position designation frame 45a has a mark (an arrow in the example) indicating the vertical direction.
- when the page display is upside down, it is rotated by the user's operation of the frame direction designation button B1 so as to face the opposite direction.
- when the user touches and drags the end of the crease position designation frame 45a, it can be rotated about its center position.
- the crease position designation frame 45a is set in the direction connecting the touched points in the display module 45.
- the image signal processor 30 takes a master camera image including the curved page surface of the page on which the crease position designation frame 45a is set, by taking a photograph of the curved page surface of the page that has been opened and curved with the camera module 10 in a state where the book is opened (a step S12) .
- the camera module 10 takes a photograph of the curved page surface of the page that is opened and curved while the book displayed on the display module 45 is opened.
- the image signal processor 30 acquires a master camera image including the curved page surface, by capturing the curved page surface of the page that is opened and curved with the camera module 10, in a state where the book 101 is opened.
- the image signal processor 30 acquires ToF depth information of the curved page surface of the page by irradiating the curved page surface of the page with pulse light from the range sensor module 20.
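The steps above yield a ToF depth map of the curved page. To treat it as point cloud data in the top view process that follows, the depth map can be back-projected into 3D. A minimal sketch, assuming a pinhole model with hypothetical intrinsics `fx`, `fy`, `cx`, `cy` (the patent does not specify the sensor model):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a ToF depth map into a 3D point cloud.

    fx, fy, cx, cy are hypothetical pinhole intrinsics of the range
    sensor; the patent does not specify the camera model used.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) array, dropping invalid (zero-depth) pixels
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]
```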
- FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process.
- the image signal processor 30 performs the projective transformation of the point cloud data into data as photographed from the front (a step S21) .
- the image signal processor 30 executes estimation of the crease position of the projectively transformed point cloud data, and rotation of the point cloud data (a step S22) .
- the image signal processor 30 performs estimation of the ridge inclination and projective transformation of the rotated point cloud data (a step S23) .
- the image signal processor 30 estimates the curved page surface by estimating a curve corresponding to the position of the curved page surface of the page (a step S24) .
- the image signal processor 30 estimates the curve corresponding to the position of the curved page surface, in the plane (XY plane) perpendicular to the crease direction (the z axis direction) of the opened crease position of the page, based on the master camera image and the ToF depth information (the steps S22 to S24) .
- FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the curved page surface imaged from the tilted direction with the book opened.
- FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation as photographed from the depth direction (from the front) .
- the image signal processor 30 acquires the reference point cloud data of the curved page surface of the page based on the master camera image and the ToF depth information.
- the image signal processor 30 calculates a normal vector by applying the principal component analysis to the reference point cloud data.
- the image signal processor 30 performs the projective transform T of the reference point cloud data P into the first point cloud data P1 of the master camera image taken from the depth direction (the front) , based on the calculated normal vector.
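The normal-vector estimation and front-view transform T described above can be sketched as follows. The axis convention (x as the depth direction) and the Rodrigues-style rotation are assumptions made for illustration, not details taken from the patent:

```python
import numpy as np

def front_view_transform(points):
    """Estimate the dominant surface normal of a point cloud by principal
    component analysis and rotate the points so that the normal aligns
    with the assumed depth (x) axis, i.e. as if photographed from the
    front. A sketch of the transform T in the description."""
    centered = points - points.mean(axis=0)
    # PCA: eigenvectors of the covariance matrix; the eigenvector with
    # the smallest eigenvalue is the surface normal of a flat-ish cloud
    cov = centered.T @ centered / len(points)
    _, eigvecs = np.linalg.eigh(cov)
    normal = eigvecs[:, 0]
    target = np.array([1.0, 0.0, 0.0])   # assumed depth axis
    if normal @ target < 0:
        normal = -normal
    # Rodrigues-style rotation taking `normal` onto `target`
    v = np.cross(normal, target)
    s, c = np.linalg.norm(v), normal @ target
    if s < 1e-12:
        R = np.eye(3)
    else:
        K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
        R = np.eye(3) + K + K @ K * ((1 - c) / s**2)
    return centered @ R.T, R
```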
- FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame.
- FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) .
- the image signal processor 30 scans the first point cloud data P1 along a plurality of lines L1 in a direction (short direction) perpendicular to the longitudinal direction D1 of the crease position designation frame 45a.
- the image signal processor 30 calculates the slope of the valley of the first point cloud data P1 by applying the least square method to the scanned first point cloud data P1.
- the image signal processor 30 estimates the crease position (the valley position) M based on the calculated slope of the valley.
- the image signal processor 30 acquires the second point cloud data P2, by performing the first rotation R of the first point cloud data P1 so that the estimated crease position M is parallel to the preset reference direction (the z axis direction) .
- the image signal processor 30 scans the first point cloud data P1 along the plurality of lines L1
- the image signal processor 30 extracts data in a predetermined range from the first point cloud data P1 as the data for the least squares method.
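The valley (crease) estimation above can be sketched as follows, assuming the valley on each scanned band is the deepest point and fitting the valley points with a least-squares line. The band count and axis layout (x depth, z crease direction) are illustrative choices, not taken from the patent:

```python
import numpy as np

def estimate_crease_line(points, n_lines=20):
    """Estimate the crease (valley) line of a front-view page point cloud.

    The cloud is scanned in bands along z (the longitudinal direction of
    the crease frame); in each band the deepest point is taken as the
    valley, and a least-squares line y = a*z + b is fitted through the
    valley points. A sketch of the description's steps, not the patent's
    exact procedure.
    """
    z_edges = np.linspace(points[:, 2].min(), points[:, 2].max(), n_lines + 1)
    valleys = []
    for lo, hi in zip(z_edges[:-1], z_edges[1:]):
        band = points[(points[:, 2] >= lo) & (points[:, 2] < hi)]
        if len(band):
            deepest = band[np.argmax(band[:, 0])]   # valley = largest depth
            valleys.append((deepest[2], deepest[1]))
    valleys = np.array(valleys)
    # Least-squares fit of y against z: the slope of the valley line
    a, b = np.polyfit(valleys[:, 0], valleys[:, 1], 1)
    return a, b
```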
- FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) .
- FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2.
- FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (the z-y plane).
- the image signal processor 30 scans the second point cloud data P2 along a plurality of lines L2 in a direction perpendicular to the reference direction (the z axis direction).
- the image signal processor 30 calculates the slope of the ridge N in the reference direction of the scanned second point cloud data, by applying the least squares method to the scanned second point cloud data.
- the image signal processor 30 acquires the third point cloud data P3 by the second rotation Q of the second point cloud data P2 so that the calculated slope of the ridge N is parallel to a preset reference plane (the z-y plane).
- the coordinate transformation from the original reference point cloud data P to the third point cloud data P3 is "QRT".
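The composition can be checked numerically. The sketch below uses homogeneous 4x4 matrices with illustrative angles and offsets (not values from the patent) to confirm that the inverse of the composite transform QRT is T⁻¹R⁻¹Q⁻¹:

```python
import numpy as np

def translation(tx, ty, tz):
    M = np.eye(4)
    M[:3, 3] = [tx, ty, tz]
    return M

def rot_z(theta):                       # rotation about the z axis
    c, s = np.cos(theta), np.sin(theta)
    M = np.eye(4)
    M[:2, :2] = [[c, -s], [s, c]]
    return M

def rot_x(theta):                       # rotation about the x axis
    c, s = np.cos(theta), np.sin(theta)
    M = np.eye(4)
    M[1:3, 1:3] = [[c, -s], [s, c]]
    return M

T = translation(1.0, -2.0, 0.5)         # stand-ins for the patent's T, R, Q
R = rot_z(0.3)
Q = rot_x(0.2)

M = Q @ R @ T                           # P -> P3: the "QRT" transformation
M_inv = np.linalg.inv(T) @ np.linalg.inv(R) @ np.linalg.inv(Q)

p = np.array([0.5, -1.0, 2.0, 1.0])     # an arbitrary homogeneous point
assert np.allclose(M_inv @ (M @ p), p)  # T^-1 R^-1 Q^-1 undoes Q R T
```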
- FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction).
- FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3.
- FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher-order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction).
- the image signal processor 30 scans the third point cloud data P3 along a plurality of lines L3 in the reference direction (the z axis direction).
- the image signal processor 30 calculates an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3.
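The averaging and polynomial approximation above can be sketched as follows. The bin width, the synthetic noise model, and the exact degree 4 are assumptions; the patent only requires a fourth-order or higher polynomial:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic page profile: depth x as a function of y, plus sensor noise
y = rng.uniform(-1.0, 1.0, 2000)
x = 0.3 * y**4 - 0.5 * y**2 + 0.1 + rng.normal(0.0, 0.01, y.size)

# Average value A of the depth in the vicinity of each scan line
bins = np.linspace(-1.0, 1.0, 41)
idx = np.digitize(y, bins)
centers, means = [], []
for i in range(1, bins.size):
    sel = idx == i
    if sel.any():
        centers.append(0.5 * (bins[i - 1] + bins[i]))
        means.append(x[sel].mean())

# Curve E: fourth-order polynomial approximating the averaged profile
coeffs = np.polyfit(np.array(centers), np.array(means), 4)
E = np.poly1d(coeffs)
```

Evaluating `E` then gives the smoothed depth of the page surface at any y, which is what the subsequent division step operates on.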
- FIG. 11 is a diagram illustrating a specific example of step S25 for executing the dividing process shown in FIG. 6.
- FIG. 12A is a diagram illustrating an example of dividing a curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range.
- FIG. 12B is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, continuing from FIG. 12A.
- FIG. 12C is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, continuing from FIG. 12B.
- the image signal processor 30 calculates an error between points in the division process (a step S251).
- the image signal processor 30 determines whether or not the error between points is within an allowable range (a step S252).
- the image signal processor 30 divides the sections at the points where the error exceeds the allowable range (a step S253).
- the image signal processor 30 ends the division process when the error between the points falls within an allowable range.
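A minimal sketch of such a division loop (recursive bisection with a straight-chord error test; the specific error measure, tolerance, and recursion limit are assumptions, not the patent's definition of "error between points"):

```python
import numpy as np

def divide(f, a, b, tol, depth=0):
    """Split [a, b] until the straight chord of f on each section
    deviates from f by at most tol, mirroring steps S251-S253:
    compute the error, test it against the allowable range, and
    divide the sections that exceed it."""
    ys = np.linspace(a, b, 32)
    chord = f(a) + (f(b) - f(a)) * (ys - a) / (b - a)
    err = np.max(np.abs(f(ys) - chord))
    if err <= tol or depth >= 12:       # within range (or safety limit): stop
        return [a, b]
    m = 0.5 * (a + b)
    return divide(f, a, m, tol, depth + 1)[:-1] + divide(f, m, b, tol, depth + 1)

# A quartic standing in for the fitted curve E
f = np.poly1d([0.3, 0.0, -0.5, 0.0, 0.1])
breaks = divide(f, -1.0, 1.0, 0.01)
```

Each pair of consecutive break points then bounds one section of the curve, i.e. one strip of the page that is flat enough to unwarp with a single projective transform.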
- FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3.
- FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E.
- FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the point cloud data P1.
- FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J, one for each section obtained by dividing the curve E, into which the third point cloud data P3 is divided.
- FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from a plurality of rectangular areas J in the three-dimensional space (x, y, z).
- FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a taken image.
- the image signal processor 30 sets an area for executing the process (a step S31).
- the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas for each section obtained by dividing the curve E.
- the coordinates of the third point cloud data P3 are transformed into the projection space coordinates of the original reference point cloud data P (FIG. 14B) by the inverse transformation T⁻¹R⁻¹Q⁻¹ shown in expression (2).
- the image signal processor 30 estimates a transformation matrix for expanding the three-dimensional space plane (FIG. 15A) into the two-dimensional space plane (FIG. 15B).
- the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas J for each section obtained by dividing the curve.
- the image signal processor 30 transforms a plurality of rectangular regions J obtained by dividing the third point cloud data P3 in the three-dimensional space (x, y, z) into the two-dimensional space (u, v), to obtain a plurality of rectangular areas G expanded in the two-dimensional space (u, v).
- the width of the rectangular region G shown in FIG. 15B is expressed by expression (3).
- the length of the rectangular region G shown in FIG. 15B is expressed by expression (4).
- the image signal processor 30 calculates a transformation matrix for each region from the relationship between the coordinates when expanded on a plane and the coordinates on the taken image.
- the offset values Ou, Ov and the enlargement/reduction ratio k, which make the result match the camera image after distortion correction, are obtained by calibration; the coordinates normalized by the distance are enlarged/reduced and an offset is added to them.
- expression (8) is obtained from expressions (6) and (7).
- a transformation matrix is calculated by solving an equation from a combination of known points.
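"Solving an equation from a combination of known points" is, for a projective transform, the classic direct linear transform. The sketch below is a generic DLT, not the patent's exact formulation; four corner correspondences per rectangular region are an assumption:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 projective matrix mapping four known source
    points to four destination points by solving the stacked linear
    system (direct linear transform)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)            # null-space vector of the system
    return H / H[2, 2]                  # normalize so H[2, 2] == 1

# Corners of one flattened rectangle and where they land in the taken image
src = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
dst = [(0.0, 0.0), (2.0, 0.1), (2.2, 1.1), (0.1, 1.0)]
H = homography_from_points(src, dst)

# The estimated matrix reproduces every known correspondence
for (x, y), (u, v) in zip(src, dst):
    p = H @ np.array([x, y, 1.0])
    assert np.allclose(p[:2] / p[2], [u, v])
```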
- the image signal processor 30 performs projective transformation so that the curved page surface in the master camera image becomes a plane, based on the coordinates of the plurality of rectangular areas and the coordinates of the projection space of the reference point cloud data P (step S33 in FIG. 13).
- FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the curved surface of the page in the master camera image becomes a plane.
- the image signal processor 30 performs projective transformation for each corresponding rectangular region so that the curved surface of the page in the master camera image becomes a plane.
- the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed into a plane by this projective transformation.
- the image signal processor 30 determines whether or not the entire area of the third point cloud data P3 has been processed (step S34 in FIG. 13).
- FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projectively transformed into a plane.
- the image signal processor 30 synthesizes the two acquired planar images of the page surfaces, and acquires an image of the surfaces of the two pages lying flat in the state where the book is opened.
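A per-region warp and the final synthesis can be sketched with inverse mapping (a pure-NumPy, nearest-neighbour stand-in for the projective transformation applied to each rectangular area; interpolation quality and boundary handling are simplified):

```python
import numpy as np

def warp_region(img, H_inv, out_shape):
    """Fill an output rectangle by mapping each output pixel (u, v)
    back through H_inv into the source image and sampling the
    nearest source pixel."""
    h, w = out_shape
    v, u = np.mgrid[0:h, 0:w]
    pts = np.stack([u.ravel(), v.ravel(), np.ones(u.size)]).astype(float)
    src = H_inv @ pts
    x = np.rint(src[0] / src[2]).astype(int).clip(0, img.shape[1] - 1)
    y = np.rint(src[1] / src[2]).astype(int).clip(0, img.shape[0] - 1)
    return img[y, x].reshape(h, w)

# With the identity transform the warp reproduces the source region
img = np.arange(64).reshape(8, 8)
flat = warp_region(img, np.eye(3), (8, 8))
assert np.array_equal(flat, img)

# Two flattened page images could then be joined side by side
spread = np.hstack([flat, flat])
assert spread.shape == (8, 16)
```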
- a camera image of the page surface flattened into a plane can thus be acquired from the camera image of the curved page surface of the opened book.
- with the present invention, it is possible to obtain a flattened image simply by photographing an opened book.
- This technology can be provided at low cost by being sold as a smartphone application.
- the present invention does not require a large-scale device.
- the terms "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
- a feature defined as "first" or "second" may comprise one or more of this feature.
- "a plurality of" means "two or more than two", unless otherwise specified.
- the terms "mounted", "connected", "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements, as can be understood by those skilled in the art according to specific situations.
- a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
- a first feature "on", "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on", "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below", "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below", "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
- Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
- the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
- the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
- the computer readable medium comprises, but is not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device), a random access memory (RAM), a read only memory (ROM), an erasable programmable read-only memory (EPROM or a flash memory), an optical fiber device and a portable compact disk read-only memory (CDROM).
- the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon; this is because the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs electronically, and the programs may then be stored in the computer memories.
- each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
- a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
- the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
- each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may exist as separate physical units, or two or more cells may be integrated in one processing module.
- the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
- the storage medium mentioned above may be read-only memories, magnetic disks, CDs, etc.
Abstract
According to embodiments of the present invention, an electric device comprises a camera module that photographs a subject to acquire a master camera image; a distance sensor module that emits pulsed light toward the subject and detects reflected light of the pulsed light reflected by the subject, thereby acquiring time-of-flight (ToF) depth information; and an image signal processor that controls the camera module and the distance sensor module to acquire a camera image based on the master camera image and the ToF depth information.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2020/074508 WO2021155575A1 (fr) | 2020-02-07 | 2020-02-07 | Electric device, method for controlling electric device, and computer-readable storage medium |
CN202080093632.0A CN114982214A (zh) | 2020-02-07 | 2020-02-07 | Electronic device, method for controlling electronic device, and computer-readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021155575A1 true WO2021155575A1 (fr) | 2021-08-12 |
Family
ID=77199693
Also Published As
Publication number | Publication date |
---|---|
CN114982214A (zh) | 2022-08-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20917423 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20917423 Country of ref document: EP Kind code of ref document: A1 |