CN115187989A - Image processing method and device, electronic equipment, scanning pen and storage medium - Google Patents

Image processing method and device, electronic equipment, scanning pen and storage medium

Info

Publication number
CN115187989A
CN115187989A (application number CN202210716575.8A)
Authority
CN
China
Prior art keywords
scanning
image
included angle
area
scanned image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210716575.8A
Other languages
Chinese (zh)
Inventor
普浏清
汪向飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
iFlytek Co Ltd
Original Assignee
iFlytek Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by iFlytek Co Ltd filed Critical iFlytek Co Ltd
Priority to CN202210716575.8A priority Critical patent/CN115187989A/en
Publication of CN115187989A publication Critical patent/CN115187989A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/142 Image acquisition using hand-held instruments; Constructional details of the instruments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/1444 Selective acquisition, locating or processing of specific regions, e.g. highlighted text, fiducial marks or predetermined fields
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10 Character recognition
    • G06V30/14 Image acquisition
    • G06V30/148 Segmentation of character regions
    • G06V30/155 Removing patterns interfering with the pattern to be recognised, such as ruled lines or underlines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general, involving image mosaicing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an image processing method and apparatus, an electronic device, a scanning pen, and a storage medium. The method includes: acquiring the scanning angle at which a scanning terminal captures a scanned image; and, if the scanning angle is outside a preset standard scanning-angle range, determining an image region to be cropped from the scanned image according to the scanning angle, the region to be cropped being the part of the scanned image outside the effective fill-light range of the scanning terminal, so that this region can be cropped and deleted from the scanned image. In the embodiments of the application, when the user holds the scanning terminal in a non-standard posture and the scanning angle falls outside the preset standard range, the scanning angle at capture time is determined, the dim, distorted, and blurred region to be cropped is located in the scanned image, and that region is cropped out, thereby preserving the scanning quality.

Description

Image processing method and device, electronic equipment, scanning pen and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, a scanning pen, and a storage medium.
Background
When a scanning terminal is in use, the fill light concentrates its illumination on the scanning area, so that the camera captures clear, bright images of that area. However, if the user holds the scanning terminal in a non-standard posture, so that the scanning angle between the camera and the text falls outside the standard scanning-angle range, the fill light scatters beyond the scanning area, the parts of the scanning area far from the fill light grow gradually dim, and the captured images suffer from dark edges, content distortion, and blur, degrading image quality and the scanning result.
Therefore, a technical solution is needed that still preserves the scanning result when the scanning angle between the camera and the text is outside the standard scanning-angle range.
Disclosure of Invention
Based on the above requirements, the present application provides an image processing method, an image processing apparatus, an electronic device, a scanning pen, and a storage medium, where the method can still preserve the scanning result when the scanning angle between the camera and the text is outside the standard scanning-angle range.
The technical solutions provided by the present application are as follows:
In one aspect, the present application provides an image processing method, including:
acquiring the scanning angle at which a scanning terminal captures a scanned image;
if the scanning angle is outside a preset standard scanning-angle range, determining an image region to be cropped from the scanned image according to the scanning angle, where the region to be cropped includes the part of the scanned image outside the effective fill-light range of the scanning terminal; and
cropping and deleting the region to be cropped from the scanned image.
Further, in the above image processing method, determining the image region to be cropped from the scanned image according to the scanning angle includes:
determining the cropping length of the scanned image according to the scanning angle and the illumination blind-zone length at that angle, where the illumination blind-zone length at the scanning angle is the length of the blind zone of the scanning terminal's fill light when the terminal operates at that angle; and
determining the image region to be cropped from the scanned image according to the cropping length and the scanning direction of the scanning terminal.
Further, in the above image processing method, the illumination blind-zone length at the scanning angle is any one of: the distance between the first end and the second end of the scanning window of the scanning terminal; the length between the second end of the scanning window and a target position at the time the scanned image is generated; and a preset length.
The first end of the scanning window is the end of the window close to the scanning camera, and the second end is the end far from the scanning camera.
The target position is the intersection of the optical axis of the scanning terminal's fill light with the scanned object when the terminal operates at the scanning angle.
Further, in the above image processing method, determining the cropping length of the scanned image according to the scanning angle and the illumination blind-zone length at that angle includes:
determining the product of the cosine of the scanning angle and the illumination blind-zone length as the cropping length of the scanned image.
Further, in the above image processing method, determining the image region to be cropped from the scanned image according to the cropping length and the scanning direction of the scanning terminal includes:
determining the scan-start end of the scanned image according to the scanning direction of the scanning terminal; and
starting from the scan-start end, selecting from the scanned image a region whose length equals the cropping length as the region to be cropped.
Further, in the above image processing method, acquiring the scanning angle at which the scanning terminal captures the scanned image includes:
acquiring the Euler angles of the scanning terminal at capture time;
judging whether the roll angle among the Euler angles is smaller than a preset roll-angle threshold; and
if the roll angle is smaller than the preset threshold, taking the pitch angle among the Euler angles as the scanning angle at which the scanning terminal captured the scanned image.
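The Euler-angle steps above can be sketched in Python. This is a minimal illustration only; the function name and the default roll threshold are assumptions, since the patent does not specify a threshold value.

```python
def scan_angle_from_euler(roll_deg, pitch_deg, roll_threshold_deg=15.0):
    """Return the scan angle derived from Euler angles, per the steps above.

    The pitch angle is taken as the scanning angle only when the roll
    angle is below a preset threshold (the 15-degree default here is an
    illustrative assumption, not a value from the patent).
    """
    if abs(roll_deg) < roll_threshold_deg:
        # Roll is small: the pitch angle is trusted as the scanning angle.
        return pitch_deg
    # Roll too large: the pitch does not reliably reflect the scan angle.
    return None
```

A reading with a 5-degree roll would thus yield its pitch as the scanning angle, while a 30-degree roll would be rejected.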
In another aspect, the present application provides an image processing apparatus, including:
an acquisition module, configured to acquire the scanning angle at which a scanning terminal captures a scanned image;
a determining module, configured to determine, if the scanning angle is outside a preset standard scanning-angle range, an image region to be cropped from the scanned image according to the scanning angle, where the region to be cropped includes the part of the scanned image outside the effective fill-light range of the scanning terminal; and
a cropping module, configured to crop and delete the region to be cropped from the scanned image.
In another aspect, the present application provides an electronic device, including:
a memory and a first processor;
where the memory is configured to store a program; and
the first processor is configured to implement any of the image processing methods above by running the program in the memory.
In another aspect, the present application provides a scanning pen including a second processor and a posture detection component coupled to the second processor;
the posture detection component is configured to detect the posture of the scanning pen during image scanning and send the detected posture data to the second processor; and
the second processor is configured to acquire, from the posture data sent by the posture detection component, the scanning angle at which the scanning pen captures a scanned image; if the scanning angle is detected to be outside a preset standard scanning-angle range, determine an image region to be cropped from the scanned image according to the scanning angle, where the region to be cropped includes the part of the scanned image outside the effective fill-light range of the scanning pen; and crop and delete the region to be cropped from the scanned image.
In another aspect, the present application provides a storage medium storing a computer program that, when executed by a processor, implements the steps of any of the image processing methods above.
With the image processing method, apparatus, electronic device, scanning pen, and storage medium above, the scanning angle at which the scanning terminal captures a scanned image is acquired; if that angle is outside the preset standard scanning-angle range, an image region to be cropped, covering the part of the scanned image outside the effective fill-light range of the scanning terminal, is determined from the scanned image according to the scanning angle, and is then cropped and deleted. Thus, even when the user holds the scanning terminal in a non-standard posture and the scanning angle falls outside the preset standard range, the dim, distorted, and blurred region to be cropped is located and removed from the scanned image, preserving the scanning quality.
Drawings
To illustrate the embodiments of the present application or the prior art more clearly, the drawings needed for their description are briefly introduced below. The drawings described below show only embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of scan angles provided by an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a scanning terminal in a scanning state according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a scanning terminal in another scanning state according to an embodiment of the present application;
fig. 5 is a schematic flowchart of determining an image area to be cropped from a scanned image according to an embodiment of the present application;
fig. 6 is a schematic view of an illumination blind area of a scanning terminal according to an embodiment of the present application;
fig. 7 is a schematic flowchart of another process for determining an image area to be cropped from a scanned image according to an embodiment of the present application;
FIG. 8 is a schematic flowchart of determining a scan angle according to an embodiment of the present application;
FIG. 9 is a schematic Euler angle diagram provided by an embodiment of the present application;
fig. 10 is a schematic diagram of stitching scanned images whose regions to be cropped have been cropped and deleted, according to an embodiment of the present application;
fig. 11 is a schematic diagram of stitching scanned images whose regions to be cropped have not been cropped or deleted, according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of an apparatus for processing an image according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a scanning pen according to an embodiment of the present application.
Detailed Description
The technical solution of the embodiments of the present application suits application scenarios in which a scanned image captured by a scanning terminal is processed. With this solution, the image region outside the effective fill-light range of the terminal's fill light can be determined from the scanned image, according to the scanning angle at capture time, and cropped and deleted, thereby removing the dim, blurred, and otherwise unclear regions of the scanned image and improving its quality.
For example, the technical solution of the present application may run on a hardware device such as a hardware processor, or be packaged as a software program to be executed; the processing of the scanned image is realized when the hardware processor executes the solution's processing procedure or when the software program runs. The embodiments of the present application only illustrate the specific processing procedure by example and do not limit the form of execution; any technical implementation capable of executing the procedure may be adopted.
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. The described embodiments are only a part of the embodiments of the present application, not all of them; all other embodiments derived by those skilled in the art without creative effort fall within the protection scope of the present application.
The present embodiment proposes an image processing method; as shown in fig. 1, the method includes:
s101, acquiring a scanning included angle when a scanning terminal scans to obtain a scanned image.
A scanning terminal is a terminal device with a scanning function, such as a scanning pen, a translation pen, or a dictionary pen. The scanning terminal carries a scanning device, typically a camera. When scanning, the user aligns the camera with the target text to be recognized and slides it along, and the scanned images the camera captures during the sliding scan are collected. These scanned images are stitched into a composite image of the target text, on which character recognition is subsequently performed to obtain the scanned text corresponding to the target text.
The scanning angle is the angle between the optical axis of the camera and the plane of the target text during scanning, the optical axis of the camera being the line through the center of the camera lens. Illustratively, as shown in fig. 2, the optical axis L1 of the camera C forms a scanning angle θ with the plane T of the target text.
The quality of the scanned images captured by the camera directly affects the quality of the stitched image and its recognition result. To guarantee image quality, the user is normally expected to hold the scanning terminal in a standard posture, keeping the scanning angle between the camera's optical axis and the target text within the standard scanning-angle range. In that state, as shown in fig. 3, the scanning window M of scanning terminal A touches the plane T of the target text, the fill light of terminal A concentrates its illumination on the standard scanning area S1, and the camera captures a bright, clear scanned image. The standard scanning area S1 is the camera's shooting area at the current scanning angle; that is, the scanned image corresponds to the content of S1 at that angle.
In practice, however, the user cannot constantly keep the scanning angle within the standard range. Once the grip on the scanning terminal is non-standard and the scanning angle falls outside the standard range, as shown in fig. 4, the scanning window M of terminal A tilts up and loses contact with the target text, and the light of the fill light scatters beyond the scanning area. The scanned image now corresponds to the content of scanning area S2, within which the fill light dims progressively along direction X1 in fig. 4, growing dimmer the farther from the lamp. The image captured by the camera therefore shows dark edges, content distortion, and blur, degrading the subsequent stitched image and its recognition.
Therefore, in the embodiments of the present application, the scanning angle at which the scanning terminal captures the scanned image is acquired and monitored, and when it falls outside the preset standard scanning-angle range, the scanned image is processed so that the quality of the subsequent stitched image and its recognition are not compromised.
S102, judging whether the scanning angle is outside the preset standard scanning-angle range; if so, executing step S103; if the scanning angle is within the preset standard range, ending here.
In the embodiments of the present application, after the scanning angle is acquired, it is compared with the preset standard scanning-angle range. If it is outside the range, the current scanned image is of low quality; step S103 is executed to process the scanned image and protect the quality of the subsequent stitched image and its recognition. If it is within the range, the current scanned image is of high quality, no processing is needed, and this step ends.
The standard scanning-angle range can be set according to the actual situation, for example 60° to 90°; this embodiment does not limit it. Illustratively, with a standard range of 60° to 90°, a detected scanning angle of 40° is outside the preset standard range, so step S103 must be performed; a detected scanning angle of 75° is within the preset standard range, and the step ends.
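This range check can be expressed as a short Python sketch, assuming only that the range is a closed interval; the 60° to 90° defaults mirror the example above, and the function name is illustrative.

```python
def outside_standard_range(scan_angle_deg, low=60.0, high=90.0):
    """Return True when the scanning angle is outside the standard range.

    The 60-90 degree defaults mirror the example range in the text;
    a real device may be configured with a different preset range.
    """
    return not (low <= scan_angle_deg <= high)
```

With these defaults, 40° is flagged for cropping while 75° passes through unprocessed, matching the worked example.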
S103, determining the image region to be cropped from the scanned image according to the scanning angle.
The image region to be cropped is the part of the scanned image outside the effective fill-light range of the scanning terminal. As described in the embodiments above, when the scanning angle is outside the standard range, the fill light scatters beyond the scanning area, and the parts of the scanning area far from the lamp dim gradually, so they receive no effective fill light. The region of the scanned image corresponding to those parts must therefore be determined, so that it can be deleted from the scanned image.
Specifically, as shown in fig. 4, when the scanning angle is outside the standard range, the smaller the scanning angle, the worse the scattering of the fill light, the larger the part of the scanning area that receives no effective fill light, and the longer the corresponding region of the scanned image to be cropped. The cropping length of the region to be cropped can thus be determined from this inverse relation between the cropped area and the scanning angle: outside the standard range, a smaller scanning angle yields a larger cropping length, and a larger scanning angle a smaller one.
The starting position of the crop is related to the scanning direction at the time the scanning terminal captures the scanned image. Specifically, when the scanning angle is outside the standard range and the user scans in a given direction, the part of the scanning area far from the fill light corresponds to the initial part of the scan in the scanned image. The scan-start part of the scanned image can therefore be determined from the user's scanning direction, and an image region equal in length to the cropping length is taken from that part as the region to be cropped.
S104, cropping and deleting the region to be cropped from the scanned image.
After the region to be cropped is determined, it is cropped and deleted from the scanned image.
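As a minimal sketch of this cropping step, the following Python/NumPy function removes the scan-start side of an image array. It assumes a horizontal scan and that the physical cropping length has already been converted to pixels, a device-specific calibration the patent does not specify; the function and parameter names are illustrative.

```python
import numpy as np

def crop_scan_start(image, crop_px, scan_left_to_right=True):
    """Delete the region at the scan-start end of a scanned image.

    `image` is an H x W (or H x W x C) array; `crop_px` is the cropping
    length already converted to pixels (an assumed calibration step).
    """
    crop_px = min(crop_px, image.shape[1])  # never crop past the image edge
    if scan_left_to_right:
        # Scan started at the left edge: drop the leftmost columns.
        return image[:, crop_px:]
    # Scan started at the right edge: drop the rightmost columns.
    return image[:, :image.shape[1] - crop_px]
```

For a 10 x 50 image and an 8-pixel crop, the result is 10 x 42 regardless of scan direction; only which side is removed changes.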
In this embodiment, the scanning angle at which the scanning terminal captures a scanned image is acquired; if it is outside the preset standard scanning-angle range, the image region to be cropped, covering the part of the scanned image outside the effective fill-light range of the scanning terminal, is determined from the scanned image according to the scanning angle and is then cropped and deleted. Thus, even when the user's grip on the scanning pen is non-standard and the scanning angle falls outside the preset standard range, the dim, distorted, and blurred region to be cropped is located and removed from the scanned image, preserving the scanning quality.
As an alternative implementation, as shown in fig. 5, another embodiment of the present application discloses that determining the image region to be cropped from the scanned image according to the scanning angle includes the following steps:
S501, determining the cropping length of the scanned image according to the scanning angle and the illumination blind-zone length at that angle.
The illumination blind-zone length at the scanning angle is the length of the blind zone of the scanning terminal's fill light when the terminal operates at that angle. Specifically, when the scanning terminal operates at the scanning angle, the projection of the blind-zone length onto the plane of the target text is determined, and the length of that projection is taken as the cropping length of the scanned image.
As shown in fig. 6, L2 is the ray through the upper boundary of the camera, and its intersection e with the plane of the target text falls on the boundary of scanning area S2. In this embodiment, a segment of L2 is taken as the illumination blind-zone length, and the projection of that segment onto the plane of the target text is determined as the cropping length of the scanned image.
As an alternative implementation, the distance between the first end and the second end of the scanning window may serve as the illumination blind-zone length at the scanning angle, the first end of the scanning window being the end close to the scanning camera and the second end the end far from it.
As another alternative, a preset length may serve as the illumination blind-zone length at the scanning angle. The preset length is determined experimentally. Specifically, a large number of sample scanned images are captured with the scanning terminal while the scanning angle is kept outside the standard range. The region to be cropped in each sample image is then determined, and the sample blind-zone length corresponding to each image is back-computed from the length of that region and the scanning angle at capture time. The sample blind-zone lengths of all sample images are then analyzed, for example by averaging, to obtain the illumination blind-zone length of the terminal device.
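The calibration described above can be sketched in Python, under the assumption (consistent with the cosine-product embodiment later in the text) that each sample's blind-zone length is recovered by inverting crop_length = cos(angle) x blind_length; the function name and data layout are illustrative.

```python
import math

def estimate_blind_zone_length(samples):
    """Estimate the fill-light blind-zone length from sample scans.

    `samples` is a list of (crop_length, scan_angle_deg) pairs measured
    from sample images captured outside the standard angle range. Each
    pair is inverted through crop_length = cos(angle) * blind_length,
    and the per-sample results are averaged, as the text suggests.
    """
    lengths = [crop / math.cos(math.radians(angle)) for crop, angle in samples]
    return sum(lengths) / len(lengths)
```

Feeding in samples that were all generated by the same true blind-zone length recovers that length, which is the sanity check one would run on such a calibration.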
S502, determining the image region to be cropped from the scanned image according to the cropping length and the scanning direction of the scanning terminal.
In this embodiment, the scan-start end is determined from the scanning direction of the scanning terminal, and, starting from it, an image region equal in length to the cropping length is selected from the scanned image as the region to be cropped. The region to be cropped is thus determined from the scanned image, cropped, and deleted, preserving the scanning quality.
As an optional implementation manner, another embodiment of the present application discloses that the step of the foregoing embodiment of determining the cutting length of the scanned image according to the scanning included angle and the illumination blind-area length under the scanning included angle includes the following step:
and determining the product of the cosine value of the scanning included angle and the length of the irradiation blind area as the cutting length of the scanned image. Specifically, as shown in fig. 6, if the length between the point e and the point d is determined as the length of the illumination blind area, the projection of the length of the illumination blind area on the plane where the target text is located can be obtained by calculating the product of the cosine value of the scanning included angle θ and the length of the illumination blind area, and then the length of the projection is determined as the cutting length of the scanned image.
In this embodiment, the cutting length of the scanned image is determined based on the product of the cosine value of the scanning included angle and the length of the irradiation blind area, so that the image area to be cut is cut and deleted, and the scanning effect is guaranteed.
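As a minimal sketch of this step (the function name and the example standard angle range are assumptions, not values from the disclosure), the cutting length is simply the cosine projection of the blind-area length, and is zero when the scanning included angle is within the standard range:

```python
import math

def crop_length(scan_angle_deg, blind_zone_length, standard_range=(75.0, 90.0)):
    """Cutting length of the scanned image at a given scanning included angle.

    blind_zone_length is the illumination blind-area length at that angle.
    The standard_range default is a made-up example; within it no cropping
    is needed, so 0 is returned.
    """
    low, high = standard_range
    if low <= scan_angle_deg <= high:
        return 0.0
    # Projection of the blind area onto the target-text plane: cos(theta) * L.
    return math.cos(math.radians(scan_angle_deg)) * blind_zone_length
```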
As an alternative implementation manner, as shown in fig. 7, another embodiment of the present application discloses that the step of the foregoing embodiment of determining an image area to be cut from the scanned image according to the cutting length and the scanning direction of the scanning terminal includes:
s701, determining a scanning initial end of a scanned image according to the scanning direction of a scanning terminal.
The scanning initial end of the area to be cut is related to the scanning direction in which the scanning terminal scanned the image. Specifically, if the scanning terminal scans from the first direction to the second direction and the scanning included angle is outside the standard scanning included angle range, then in the scanning area formed by the scanning terminal, the end close to the first direction is far from the light supplement lamp, may not receive effective supplementary light, and the image formed there tends to be dim, deformed and blurred. Therefore, the side of the scanned image corresponding to the end of the scanning area close to the first direction can be used as the scanning initial end, and, starting from it, an image area with the same length as the cutting length is intercepted as the image area to be cut.
As shown in fig. 4, the direction X2 is the scanning direction of the scanning terminal A, that is, the scanning terminal A slides from a to b. In the scanning area S2, the region close to end a is far from the light supplement lamp, may not receive effective supplementary light, and the image formed there tends to be dim, deformed and blurred. Therefore, the side of the scanned image corresponding to end a of the scanning area S2 can be used as the scanning initial end.
S702, taking the scanning initial end as the start, selecting an image area with the same length as the cutting length from the scanned image as an image area to be cut.
In the embodiment of the application, after the scanning initial end is determined, an image area with the same length as the cutting length is selected from the scanned image starting from the scanning initial end, and this image content is determined as the image area to be cut.
In this embodiment, the image area to be cut of the image is scanned based on the scanning direction and the cutting length of the scanning terminal, so that the image area to be cut is cut and deleted, and the scanning effect is guaranteed.
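A sketch of steps S701–S702, assuming the scanned image is a NumPy array whose columns run along the scanning direction (the function name and column-wise layout are illustrative assumptions, not part of the disclosure):

```python
import numpy as np

def crop_scanned_image(image, crop_px, scan_left_to_right=True):
    """Delete the poorly lit strip at the scanning initial end.

    image: H x W (x C) array; crop_px: number of pixel columns to delete.
    When scanning left-to-right, the scanning initial end is the left edge,
    so the strip is cut from the left; otherwise from the right.
    """
    if crop_px <= 0:
        return image
    if scan_left_to_right:
        return image[:, crop_px:]   # drop columns at the start (left) end
    return image[:, :-crop_px]      # drop columns at the right end
```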
As an optional implementation manner, as shown in fig. 8, another embodiment of the present application discloses that the step of the foregoing embodiment of obtaining a scanning included angle when the scanning terminal scans to obtain a scanned image specifically includes the following steps:
s801, acquiring an Euler angle when a scanning terminal scans to obtain a scanned image.
The euler angles include a pitch angle and a roll angle when the scanning terminal scans and obtains a scanned image. As shown in fig. 9, if the scanning direction of the scanning terminal is taken as the X axis, the upward direction perpendicular to the plane of the target text is taken as the Y axis, and the outward direction perpendicular to the XOY plane is taken as the Z axis, a three-dimensional coordinate system is established. Wherein the pitch angle is an angle of rotation around the Y-axis, i.e., an angle of rotation in the direction V1 shown in fig. 9; the roll angle is the angle of rotation about the X-axis, i.e., the angle of rotation in the direction V2 shown in fig. 9. The angle obtained by rotation about the Z axis, i.e., in the direction V3 shown in fig. 9, is the yaw angle.
In the embodiment of the application, only the pitch angle and the roll angle in the euler angles when the scanning terminal scans and obtains the scanned image are obtained. For example, the euler angle at which the scanning terminal scans the scanned image may be determined based on a gyroscope provided in the scanning terminal.
S802, judging whether the rolling angle in the Euler angles is smaller than a preset rolling angle threshold value or not; if the rolling angle is smaller than the preset rolling angle threshold, executing step S803; and if the rolling angle is not smaller than the preset rolling angle threshold value, ending the step.
In the embodiment of the application, when the scanning terminal scans on the horizontal plane, the pitching angle is an included angle between the scanning terminal and the target text, that is, an included angle between an optical axis of the camera and the target text. When the scanning terminal scans on an inclined plane, the pitching angle obtained by the gyroscope is the included angle between the scanning terminal and the horizontal plane, namely the included angle between the optical axis of the camera and the horizontal plane, but not the included angle between the optical axis of the camera and the target text.
To ensure the accuracy of the scanning included angle, this embodiment first judges whether the rolling angle is smaller than the preset rolling angle threshold. If it is, the plane where the current target text is located is considered horizontal, step S803 is executed, and the pitch angle is determined as the scanning included angle at which the scanning terminal scanned the image. If it is not, the plane where the current text is located is considered inclined, the obtained pitch angle is not the scanning included angle, and the procedure ends. In addition, if the rolling angle is not smaller than the preset rolling angle threshold, prompt information can be output to inform the user that the current target text is inclined and should be placed on a horizontal plane before scanning.
The preset rolling angle threshold may be set according to actual conditions, for example, set to 10 °, and the present embodiment is not limited.
And S803, determining the pitch angle in the Euler angles as a scanning included angle when the scanning terminal scans to obtain a scanned image.
Specifically, if the roll angle is smaller than the preset roll angle threshold, the pitch angle in the euler angle is determined as the scanning included angle when the scanning terminal scans to obtain the scanned image.
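Steps S801–S803 can be condensed into a small helper (names are illustrative; the 10° default follows the example threshold given above):

```python
def scan_angle_from_euler(pitch_deg, roll_deg, roll_threshold_deg=10.0):
    """Return the scanning included angle, or None when the plane is tilted.

    Following S802/S803: the pitch angle is trusted as the scanning included
    angle only when the roll angle is below the preset threshold; otherwise
    the text plane is considered inclined and no angle is returned.
    """
    if abs(roll_deg) < roll_threshold_deg:
        return pitch_deg
    return None  # tilted surface: prompt the user instead of cropping
```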
Specifically, based on the coordinate system shown in fig. 9, a rotation matrix corresponding to the direction V1, the direction V2, and the direction V3 may be established:
R_x(α) = [1, 0, 0; 0, cos α, −sin α; 0, sin α, cos α]

R_y(β) = [cos β, 0, sin β; 0, 1, 0; −sin β, 0, cos β]

R_z(γ) = [cos γ, −sin γ, 0; sin γ, cos γ, 0; 0, 0, 1]
wherein R_x(α) is the rotation matrix for rotation about the X axis, R_y(β) is the rotation matrix for rotation about the Y axis, and R_z(γ) is the rotation matrix for rotation about the Z axis, with α being the pitch angle, β the roll angle, and γ the yaw angle.
Based on the rotation matrix, the calculation formulas of the pitch angle, the roll angle and the yaw angle are as follows:
R = R_z(γ)·R_y(β)·R_x(α) = [r_11, r_12, r_13; r_21, r_22, r_23; r_31, r_32, r_33]

α = atan2(r_32, r_33)

β = atan2(−r_31, √(r_32² + r_33²))

γ = atan2(r_21, r_11)
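A sketch of the relation between the composed rotation R = R_z(γ)·R_y(β)·R_x(α) and the atan2-based angle recovery (function names are illustrative; angles are in radians, and the extraction assumes cos β > 0):

```python
import math
import numpy as np

def rotation_from_euler(alpha, beta, gamma):
    """Compose R = Rz(gamma) @ Ry(beta) @ Rx(alpha)."""
    ca, sa = math.cos(alpha), math.sin(alpha)
    cb, sb = math.cos(beta), math.sin(beta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def euler_from_rotation(r):
    """Recover (pitch alpha, roll beta, yaw gamma) via the atan2 formulas."""
    alpha = math.atan2(r[2, 1], r[2, 2])
    beta = math.atan2(-r[2, 0], math.hypot(r[2, 1], r[2, 2]))
    gamma = math.atan2(r[1, 0], r[0, 0])
    return alpha, beta, gamma
```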
In this embodiment, whether the currently scanned target text lies on a horizontal plane is judged from the rolling angle at which the scanning terminal scanned the image, and only when it does is the pitch angle determined as the scanning included angle. This avoids mistakenly using the pitch angle as the scanning included angle when the scanned text is not on a horizontal plane, which would determine a wrong image area to be cut, and thus guarantees the quality of the scanned image.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that after an image region to be cropped is cropped and deleted from a scanned image, a plurality of scanned images obtained in a scanning process may be further processed, including image registration, projection transformation, seam calculation, image fusion, and the like.
Wherein the image registration process comprises:
Feature points are first extracted from the scanned image. Feature points are points where the image gray value changes dramatically, or points of large curvature on image edges. When the same scene is photographed from different angles, the same feature points can be extracted robustly in each picture. Feature points can be extracted from the scanned image using existing algorithms such as SIFT, SURF or ORB.
After feature point extraction, scanned images sharing the same features can be registered, that is, fused into one sub-image. In this embodiment, scanned images with the same features are registered and stitched pairwise based on a homography matrix, preliminarily obtaining a stitched image. Considering that the feature point extraction process may produce wrong matches, the RANSAC algorithm is used in the embodiment of the present application to solve the homography. In short, four matching pairs are randomly drawn each time to solve a homography matrix, the remaining matching pairs are then checked against that matrix to judge whether they are correct matches, and the homography with the largest number of correct matches is kept. After registration of adjacent scanned images is finished, projection transformation is performed on the stitched image.
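The RANSAC loop described above can be sketched with a plain NumPy direct-linear-transform solver (all names are illustrative; a production implementation would typically call an existing computer-vision library instead):

```python
import numpy as np

def fit_homography(src, dst):
    """Direct linear transform: solve H (up to scale) from >= 4 point pairs."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def ransac_homography(src, dst, iters=200, thresh=3.0, seed=0):
    """Randomly sample 4 pairs per iteration; keep the H with most inliers."""
    rng = np.random.default_rng(seed)
    n = len(src)
    src_h = np.hstack([src, np.ones((n, 1))])  # homogeneous source points
    best_h, best_inliers = None, 0
    for _ in range(iters):
        sample = rng.choice(n, size=4, replace=False)
        h = fit_homography(src[sample], dst[sample])
        with np.errstate(divide="ignore", invalid="ignore"):
            proj = src_h @ h.T
            proj = proj[:, :2] / proj[:, 2:3]
            err = np.linalg.norm(proj - dst, axis=1)
        inliers = int(np.count_nonzero(err < thresh))
        if inliers > best_inliers:
            best_h, best_inliers = h, inliers
    return best_h, best_inliers
```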
The process of projective transformation includes:
Since the scanning direction does not always follow a straight line during scanning, the stitching result may undulate; to solve this problem, horizontal correction is also needed. In general, the stitched image is projected onto a spherical or cylindrical surface through projective transformation to obtain a further processed stitched image. At this point, however, the stitched image may still show obvious brightness changes and partial misalignment, and the overlapping regions between images may show obvious transition traces.
The seam line can be found by an existing point-by-point method, dynamic programming method or graph-cut method. The point-by-point method is the simplest but has a relatively poor effect; the graph-cut method has the highest computational complexity and the best effect; dynamic programming is a compromise between the two, intermediate in both complexity and effect. Those skilled in the art can select a seam calculation method according to actual needs.
Common fusion algorithms include the feathering fusion algorithm and the Laplacian fusion algorithm. Feathering computes a weight for each position near the seam according to its distance from the seam and performs weighted fusion. Laplacian fusion amounts to decomposing the images into components of different frequencies and fusing them frequency by frequency; it clearly has a better effect but a higher computational complexity. Those skilled in the art can select a fusion algorithm according to actual needs.
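A sketch of feathering fusion for two grayscale strips with a known overlap, using a linear distance-based weight ramp (the function name and the column-wise overlap layout are assumptions for illustration):

```python
import numpy as np

def feather_blend(left, right, overlap):
    """Blend two grayscale strips whose last/first `overlap` columns coincide.

    The weight ramps linearly with distance from the seam: the left image
    dominates at the left edge of the overlap, the right image at the right.
    """
    h, w1 = left.shape
    w2 = right.shape[1]
    out = np.zeros((h, w1 + w2 - overlap), dtype=float)
    out[:, :w1 - overlap] = left[:, :w1 - overlap]
    out[:, w1:] = right[:, overlap:]
    # Linear weights across the overlap: 1 -> 0 for left, 0 -> 1 for right.
    w = np.linspace(1.0, 0.0, overlap)
    out[:, w1 - overlap:w1] = (left[:, w1 - overlap:] * w
                               + right[:, :overlap] * (1.0 - w))
    return out
```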
After seam calculation and image fusion are performed on the stitched image, the final stitched image is obtained, and character recognition can then be performed on it to obtain the scanned text corresponding to the scanned images.
In the embodiment of the application, when the action of the user holding the scanning terminal is not standard and the scanning included angle falls outside the preset standard scanning included angle range, the dim, deformed and blurred image area to be cut is determined from the scanned image based on the scanning included angle and is cropped away; after the steps of image registration, projection transformation, seam calculation, image fusion and the like, the resulting stitched image is shown in fig. 10. The stitched image obtained by performing the same steps on the scanned image without cropping the image area to be cut is shown in fig. 11, and the contrast between the two is evident.
In correspondence with the above image processing method, an embodiment of the present application further discloses an image processing apparatus, as shown in fig. 12, the apparatus includes:
an obtaining module 100, configured to obtain a scanning included angle when a scanning terminal scans to obtain a scanned image;
the determining module 110 is configured to determine an image area to be cut from a scanned image according to a scanning included angle if the scanning included angle is outside a preset standard scanning included angle range; the image area to be cut comprises an area outside the effective supplementary lighting range corresponding to the scanning terminal in the scanned image;
and the cropping module 120 is configured to crop and delete an image area to be cropped in the scanned image.
In this embodiment, the obtaining module 100 obtains a scanning included angle when the scanning terminal scans to obtain a scanned image; if the scanning included angle is outside the preset standard scanning included angle range, the determining module 110 determines an image area to be cut from the scanned image according to the scanning included angle, where the image area to be cut includes an area outside an effective light supplement range of a corresponding scanning terminal in the scanned image, so that the cutting module 120 cuts and deletes the image area to be cut in the scanned image. In the embodiment of the application, when the action of a user holding the scanning terminal is not standard, and the scanning included angle is beyond the range of the preset standard scanning included angle, the scanning included angle when the scanning terminal scans to obtain the scanned image is determined, the dim, content-deformed and fuzzy image area to be cut in the scanned image is determined, the image area to be cut is cut and deleted, and the scanning effect is guaranteed.
Optionally, in another embodiment of the present application, the determining module 110 includes:
the first determining unit is used for determining the cutting length of the scanned image according to the scanning included angle and the length of the irradiation blind area under the scanning included angle; the length of the irradiation blind area under the scanning included angle is the length of the irradiation blind area of a light supplement lamp of the scanning terminal when the scanning terminal works at the scanning included angle;
and the second determining unit is used for determining an image area to be cut from the scanned image according to the cutting length and the scanning direction of the scanning terminal.
Optionally, in another embodiment of the present application, the length of the illumination blind area at the scan angle includes:
any one of the distance length between the first end and the second end of the scanning window of the scanning terminal, the length between the second end of the scanning window and the target position when the scanning image is generated, and the preset length;
the first end of the scanning window is the end of the scanning window close to the scanning camera, and the second end of the scanning window is the end of the scanning window far away from the scanning camera;
the target position is the intersection point position of the optical axis of the light supplement lamp of the scanning terminal and the scanning object when the scanning terminal works under the scanning included angle.
Optionally, in another embodiment of the present application, when the first determining unit determines the cropping length of the scanned image according to the scanning included angle and the length of the dead zone of illumination under the scanning included angle, the first determining unit is specifically configured to:
and determining the product of the cosine value of the scanning included angle and the length of the irradiation blind area as the cutting length of the scanned image.
Optionally, in another embodiment of the present application, the second determining unit includes:
the first determining subunit is used for determining a scanning initial end of a scanned image according to the scanning direction of the scanning terminal;
and the selecting subunit is used for selecting an image area with the same length as the cutting length from the scanned image as an image area to be cut by taking the scanning initial end as an initial end.
Optionally, in another embodiment of the present application, the obtaining module 100 includes:
the acquisition unit is used for acquiring an Euler angle when a scanning terminal scans to obtain a scanned image;
the judging unit is used for judging whether the rolling angle in the Euler angles is smaller than a preset rolling angle threshold value or not;
and the third determining unit is used for determining the pitch angle in the Euler angles as a scanning included angle when the scanning terminal scans to obtain a scanned image if the rolling angle is smaller than a preset rolling angle threshold value.
Specifically, please refer to the contents of the method embodiment for the specific working contents of each unit of the image processing apparatus, which are not described herein again.
Another embodiment of the present application further provides an electronic device, as shown in fig. 13, the electronic device includes:
a memory 200 and a first processor 210;
wherein, the memory 200 is connected to the first processor 210 for storing programs;
the first processor 210 is configured to implement the image processing method disclosed in any of the above embodiments by executing the program stored in the memory 200.
Specifically, the electronic device may further include: a bus, a communication interface 220, an input device 230, and an output device 240.
The first processor 210, the memory 200, the communication interface 220, the input device 230, and the output device 240 are connected to each other through a bus. Wherein:
a bus may include a path that transfers information between components of a computer system.
The first processor 210 may be a general-purpose processor, such as a general-purpose central processing unit (CPU) or a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the present disclosure. It may also be a digital signal processor (DSP), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
The first processor 210 may include a main processor and may also include a baseband chip, a modem, and the like.
The memory 200 stores the program for executing the technical solution of the present application, and may also store an operating system and other key services. In particular, the program may include program code, and the program code includes computer operating instructions. More specifically, the memory 200 may include a read-only memory (ROM), other types of static storage devices that may store static information and instructions, a random access memory (RAM), other types of dynamic storage devices that may store information and instructions, disk storage, flash memory, and so on.
The input device 230 may include a means for receiving data and information input by a user, such as a keyboard, mouse, camera, scanner, light pen, voice input device, touch screen, pedometer, or gravity sensor, among others.
Output device 240 may include equipment that allows output of information to a user, such as a display screen, a printer, speakers, and the like.
Communication interface 220 may include any device that uses any transceiver or the like to communicate with other devices or communication networks, such as an ethernet network, a Radio Access Network (RAN), a Wireless Local Area Network (WLAN), etc.
The first processor 210 executes the program stored in the memory 200 and calls other devices, which can be used to implement the steps of the image processing method provided in the above embodiments of the present application.
Another embodiment of the present application further provides a scanning pen. Referring to fig. 14, the scanning pen includes a second processor 300, and a posture detecting part 310 connected to the second processor 300;
the gesture detection component 310 is configured to detect a gesture of the scanning pen during image scanning, and send detected gesture data of the scanning pen to the second processor 300;
the second processor 300 is configured to obtain a scanning included angle when the scanning pen scans to obtain a scanned image according to the scanning pen posture data sent by the posture detection component 310, and determine an image area to be cut from the scanned image according to the scanning included angle if the scanning included angle is detected to be outside a preset standard scanning included angle range; the image area to be cut comprises an area outside the effective supplementary lighting range corresponding to the scanning terminal in the scanned image; and crop and delete the image area to be cut in the scanned image.
The gesture detection part 310 may include a gyroscope.
The second processor 300 may be a general-purpose processor, such as a general-purpose central processing unit (CPU) or a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs according to the present disclosure. It may also be a digital signal processor (DSP), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components.
In the embodiment of the application, when the action of a user holding the scanning terminal is not standard, and the scanning included angle is beyond the range of the preset standard scanning included angle, the scanning included angle when the scanning terminal scans to obtain the scanned image is determined, the dim, content-deformed and fuzzy image area to be cut in the scanned image is determined, the image area to be cut is cut and deleted, and the scanning effect is guaranteed.
As an alternative implementation manner, in another embodiment of the present application, it is disclosed that the determining, by the second processor 300, an image area to be cropped from the scanned image according to the scan angle includes:
determining the cutting length of the scanned image according to the scanning included angle and the length of the irradiation blind area under the scanning included angle; the length of the irradiation blind area under the scanning included angle is the length of the irradiation blind area of a light supplement lamp of the scanning terminal when the scanning terminal works at the scanning included angle;
and determining an image area to be cut from the scanned image according to the cutting length and the scanning direction of the scanning terminal.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that the irradiation blind area length at the scan angle includes:
any one of a distance length between a first end and a second end of a scanning window of the scanning terminal, a length between the second end of the scanning window and a target position when a scanned image is generated, and a preset length;
the first end of the scanning window is the end, close to the scanning camera, of the scanning window, and the second end of the scanning window is the end, far away from the scanning camera, of the scanning window;
the target position is the intersection point position of the optical axis of the light supplement lamp of the scanning terminal and the scanning object when the scanning terminal works under the scanning included angle.
As an alternative implementation manner, in another embodiment of the present application, it is disclosed that the determining, by the second processor 300, the cropping length of the scanned image according to the scan angle and the illumination blind area length under the scan angle includes:
and determining the product of the cosine value of the scanning included angle and the length of the irradiation blind area as the cutting length of the scanned image.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that the determining, by the second processor 300, an image area to be cropped from the scanned image according to the scan included angle, the cropping length, and the scanning direction of the scanning terminal includes:
determining a scanning initial end of a scanned image according to the scanning direction of a scanning terminal;
and taking the scanning initial end as the start, and selecting an image area with the same length as the cutting length from the scanned image as an image area to be cut.
As an optional implementation manner, in another embodiment of the present application, it is disclosed that the acquiring, by the second processor 300, a scanning included angle when the scanning pen scans to obtain a scanned image according to the posture data of the scanning pen sent by the posture detecting component 310 includes:
acquiring an Euler angle when a scanning terminal scans to obtain a scanned image;
judging whether the rolling angle in the Euler angles is smaller than a preset rolling angle threshold value or not;
and if the rolling angle is smaller than a preset rolling angle threshold value, determining the pitch angle in the Euler angles as a scanning included angle when the scanning terminal scans to obtain a scanned image.
The scanning pen provided in this embodiment belongs to the same concept as the image processing method provided in the above embodiments of the present application, can execute the image processing method provided in any of the above embodiments, and has the corresponding functional modules and beneficial effects. For technical details not described in detail in this embodiment, reference may be made to the specific processing contents of the image processing method provided in the above embodiments of the present application, which are not repeated here.
Another embodiment of the present application further provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the computer program implements the steps of the image processing method provided in any of the above embodiments.
Specifically, the specific working contents of each part of the electronic device and the scan pen, and the specific processing contents of the computer program on the storage medium when being executed by the processor can refer to the contents of each embodiment of the image processing method, which are not described herein again.
While, for purposes of simplicity of explanation, the foregoing method embodiments are presented as a series of acts or combinations, it will be appreciated by those of ordinary skill in the art that the present application is not limited by the illustrated ordering of acts, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and reference may be made to the partial description of the method embodiment for relevant points.
The steps in the method of each embodiment of the present application may be sequentially adjusted, combined, and deleted according to actual needs, and technical features described in each embodiment may be replaced or combined.
The modules and sub-modules in the device and the terminal in the embodiments of the application can be combined, divided and deleted according to actual needs.
In the several embodiments provided in the present application, it should be understood that the disclosed terminal, apparatus and method may be implemented in other manners. For example, the above-described terminal embodiments are merely illustrative, and for example, the division of a module or a sub-module is only one logical division, and there may be other divisions when the terminal is actually implemented, for example, a plurality of sub-modules or modules may be combined or integrated into another module, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or modules, and may be in an electrical, mechanical or other form.
The modules or sub-modules described as separate parts may or may not be physically separate, and parts that are modules or sub-modules may or may not be physical modules or sub-modules, may be located in one place, or may be distributed over a plurality of network modules or sub-modules. Some or all of the modules or sub-modules can be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, the functional modules or sub-modules in the embodiments of the present application may be integrated into one processing module, each module or sub-module may exist physically on its own, or two or more modules or sub-modules may be integrated into one module. The integrated modules or sub-modules may be implemented in the form of hardware, or in the form of software functional modules or sub-modules.
Those skilled in the art will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software unit executed by a processor, or in a combination of the two. The software unit may reside in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it should also be noted that, in this document, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another identical element in the process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method of processing an image, comprising:
acquiring a scanning included angle at which a scanning terminal performs scanning to obtain a scanned image;
if the scanning included angle is outside a preset standard scanning included angle range, determining an image area to be cropped from the scanned image according to the scanning included angle, wherein the image area to be cropped comprises an area of the scanned image that lies outside an effective supplementary-lighting range of the scanning terminal; and
cropping out and deleting the image area to be cropped from the scanned image.
2. The method according to claim 1, wherein determining an image area to be cropped from the scanned image according to the scanning included angle comprises:
determining a cropping length for the scanned image according to the scanning included angle and an illumination blind-area length under the scanning included angle, wherein the illumination blind-area length under the scanning included angle is the length of the illumination blind area of a supplementary light of the scanning terminal when the scanning terminal operates at the scanning included angle; and
determining the image area to be cropped from the scanned image according to the cropping length and a scanning direction of the scanning terminal.
3. The method of claim 2, wherein the illumination blind-area length under the scanning included angle comprises:
the distance between a first end and a second end of a scanning window of the scanning terminal, or a preset length;
wherein the first end of the scanning window is the end of the scanning window close to a scanning camera, and the second end of the scanning window is the end of the scanning window far away from the scanning camera.
4. The method of claim 2, wherein determining the cropping length for the scanned image according to the scanning included angle and the illumination blind-area length under the scanning included angle comprises:
determining the product of the cosine of the scanning included angle and the illumination blind-area length as the cropping length for the scanned image.
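Claim 4's geometric step can be sketched in a few lines; the function name and the degree-based angle convention below are illustrative assumptions, not taken from the patent:

```python
import math

def crop_length(scan_angle_deg, blind_area_length):
    # Cropping length = cos(scanning included angle) x illumination blind-area
    # length, per claim 4. The angle is assumed here to be given in degrees.
    return math.cos(math.radians(scan_angle_deg)) * blind_area_length
```

Intuitively, the blind area lies along the page while the scanning window is tilted by the scanning included angle, so only its projection onto the page plane (the cosine component) needs to be cropped from the image.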
5. The method according to claim 2, wherein determining the image area to be cropped from the scanned image according to the cropping length and the scanning direction of the scanning terminal comprises:
determining a scanning start end of the scanned image according to the scanning direction of the scanning terminal; and
taking the scanning start end as the starting point, selecting from the scanned image an image area whose length equals the cropping length as the image area to be cropped.
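Claims 2 and 5 together amount to taking a strip of the cropping length from the scanning start end of the image. A minimal sketch, in which the direction labels and the column-range representation are hypothetical choices rather than anything specified by the patent:

```python
def crop_region(image_width, crop_len, scan_direction):
    # Return the (start, end) pixel-column range to crop, measured from the
    # scanning start end determined by the scanning direction (claim 5).
    crop_len = min(int(round(crop_len)), image_width)
    if scan_direction == "left_to_right":
        # Scanning start end is the left edge of the scanned image.
        return (0, crop_len)
    # Otherwise the scanning start end is the right edge.
    return (image_width - crop_len, image_width)
```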
6. The method according to claim 1, wherein acquiring the scanning included angle at which the scanning terminal performs scanning to obtain the scanned image comprises:
acquiring Euler angles of the scanning terminal when it performs scanning to obtain the scanned image;
judging whether the roll angle among the Euler angles is smaller than a preset roll angle threshold; and
if the roll angle is smaller than the preset roll angle threshold, determining the pitch angle among the Euler angles as the scanning included angle at which the scanning terminal performs scanning to obtain the scanned image.
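Claim 6's angle-selection logic can be sketched as follows; the default threshold value and the convention of returning None when the roll check fails are illustrative assumptions, since the patent only speaks of a "preset roll angle threshold":

```python
def scan_angle_from_euler(roll_deg, pitch_deg, roll_threshold_deg=10.0):
    # If the roll angle is below the preset threshold, the pitch angle is
    # taken as the scanning included angle (claim 6); otherwise no angle
    # is reported for this reading.
    if abs(roll_deg) < roll_threshold_deg:
        return pitch_deg
    return None
```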
7. An apparatus for processing an image, comprising:
an acquisition module, configured to acquire a scanning included angle at which a scanning terminal performs scanning to obtain a scanned image;
a determining module, configured to determine, if the scanning included angle is outside a preset standard scanning included angle range, an image area to be cropped from the scanned image according to the scanning included angle, wherein the image area to be cropped comprises an area of the scanned image that lies outside an effective supplementary-lighting range of the scanning terminal; and
a cropping module, configured to crop out and delete the image area to be cropped from the scanned image.
8. An electronic device, comprising:
a memory and a first processor;
wherein the memory is configured to store a program; and
the first processor is configured to implement the method of processing an image according to any one of claims 1 to 6 by executing the program in the memory.
9. A scanning pen, comprising a second processor and a posture detection component coupled to the second processor;
wherein the posture detection component is configured to detect the posture of the scanning pen during image scanning and to send the detected scanning pen posture data to the second processor; and
the second processor is configured to acquire, according to the scanning pen posture data sent by the posture detection component, a scanning included angle at which the scanning pen performs scanning to obtain a scanned image; to determine, upon detecting that the scanning included angle is outside a preset standard scanning included angle range, an image area to be cropped from the scanned image according to the scanning included angle, wherein the image area to be cropped comprises an area of the scanned image that lies outside an effective supplementary-lighting range of the scanning pen; and to crop out and delete the image area to be cropped from the scanned image.
10. A storage medium having stored thereon a computer program which, when executed by a processor, carries out the steps of the method of processing an image according to any one of claims 1 to 6.
CN202210716575.8A 2022-06-22 2022-06-22 Image processing method and device, electronic equipment, scanning pen and storage medium Pending CN115187989A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210716575.8A CN115187989A (en) 2022-06-22 2022-06-22 Image processing method and device, electronic equipment, scanning pen and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210716575.8A CN115187989A (en) 2022-06-22 2022-06-22 Image processing method and device, electronic equipment, scanning pen and storage medium

Publications (1)

Publication Number Publication Date
CN115187989A true CN115187989A (en) 2022-10-14

Family

ID=83514853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210716575.8A Pending CN115187989A (en) 2022-06-22 2022-06-22 Image processing method and device, electronic equipment, scanning pen and storage medium

Country Status (1)

Country Link
CN (1) CN115187989A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115604449A (en) * 2022-12-12 2023-01-13 Harbin University (CN) Translation pen capable of projecting images

Similar Documents

Publication Publication Date Title
KR101126466B1 (en) Photographic document imaging system
US8218890B2 (en) Method and apparatus for cropping images
US20070206877A1 (en) Model-based dewarping method and apparatus
JPH11313207A (en) Method for deciding tilt angle of document image
JP2008520039A (en) Detection method of iris and pupil in human image
RU2631765C1 (en) Method and system of correcting perspective distortions in images occupying double-page spread
WO2022105415A1 (en) Method, apparatus and system for acquiring key frame image, and three-dimensional reconstruction method
CN107992869B (en) Method and device for correcting tilted characters and electronic equipment
US7542608B2 (en) Method for automatically cropping image objects
EP2782065B1 (en) Image-processing device removing encircling lines for identifying sub-regions of image
CN114037992A (en) Instrument reading identification method and device, electronic equipment and storage medium
JP6542230B2 (en) Method and system for correcting projected distortion
CN115187989A (en) Image processing method and device, electronic equipment, scanning pen and storage medium
CN112419207A (en) Image correction method, device and system
CN115222935A (en) Image correction method, image correction device, electronic apparatus, scanning pen, and storage medium
CN111340040B (en) Paper character recognition method and device, electronic equipment and storage medium
JP6669390B2 (en) Information processing apparatus, information processing method, and program
Liang et al. Mosaicing of camera-captured document images
CN112532884A (en) Identification method and device and electronic equipment
CN112036319A (en) Picture processing method, device, equipment and storage medium
US20210281742A1 (en) Document detections from video images
CN111428707B (en) Method and device for identifying pattern identification code, storage medium and electronic equipment
JP2000187705A (en) Document reader, document reading method and storage medium
CN113159037A (en) Picture rectification method and device, computer equipment and storage medium
JPH07168941A (en) Picture processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination