CN109827516B - Method for measuring distance through wheel - Google Patents
- Publication number: CN109827516B (application CN201910207478.4A)
- Authority: CN (China)
- Prior art keywords: vehicle, wheel, coordinate system, wheels, world coordinate
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Traffic Control Systems (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention provides a method for measuring distance through wheels, the distance being the wheel track between the host vehicle and surrounding vehicles. The method comprises the following steps: step a: establishing a world coordinate system with the center of the host vehicle as the coordinate origin; step b: collecting images of the vehicles and wheels around the host vehicle; step c: performing vehicle identification and wheel identification on the acquired images to obtain the positions of the vehicles and wheels in the images, namely the vehicle position and the wheel position; step d: calculating the subordination relation between the wheels and the vehicles according to the geometric relation between the vehicle position and the wheel position obtained in step c; step e: obtaining the coordinates of the ground contact points of the outer edges of the host vehicle's front and rear wheels in the world coordinate system; step f: calculating the coordinates of the target vehicle's front and rear wheels in the world coordinate system according to the projection transformation relation; and step g: calculating the transverse track and the longitudinal track between the host vehicle and the target vehicle.
Description
Technical Field
The invention relates to the field of machine vision, in particular to a method for measuring distance through a wheel.
Background
Inter-vehicle distance measurement provides extremely important information for vehicle-to-vehicle communication on the road and for automatic or assisted driving, and almost all vehicle warning functions take the inter-vehicle distance as a key input parameter. For example, even the most basic lane departure warning requires the distance from the host vehicle's wheels to the inner and outer edges of a lane line to be computed with an accuracy better than 10 cm, and more advanced functions such as vehicle collision warning require the distances from the host vehicle to the surrounding vehicles to be computed quickly and in real time.
The patent with publication number CN207611140U, entitled "Apparatus for measuring distance of vehicle ahead", discloses the following technical content: based on a single camera, it targets only visible and recognizable vehicles ahead and uses conventional image processing to obtain the vehicle outline and its circumscribed rectangle, from which the distance is calculated. The present invention instead identifies and detects the wheels themselves, so that more accurate image coordinates of the ground contact points of the front and rear wheels, or of a single wheel, can be obtained; the distance is then calculated from the wheel contact points together with the known body dimensions of the vehicle, which greatly improves the precision, accuracy and stability over consecutive images.
The patent with publication number CN103148837B, entitled "Method and device for measuring vehicle distance and automobile", discloses the following technical content: images of the scene in front of the current vehicle are acquired synchronously by the two cameras of a binocular vision system; candidate license plate regions are extracted from the image acquired by each camera, together with a feature vector for each candidate region; and it is judged whether the candidate region corresponding to each feature vector is a real license plate region. Compared with binocular ranging, whose effective range is short and whose cost is relatively high, the present invention uses only a monocular fisheye camera yet achieves a field of view as wide as a binocular system; ranging by wheels yields stable vehicle ground contact positions, whereas the license plate detection used in that patent degrades sharply under unstable illumination or occlusion.
A survey of current research on intelligent vehicles at home and abroad shows that inter-vehicle distance measurement technology is still under continuous development, that the ranging devices and methods in use have complementary strengths and weaknesses, and that the accuracy of inter-vehicle distance estimation is of great significance for reducing traffic accidents.
Disclosure of Invention
In view of the above drawbacks of the prior art, a method is proposed for measuring the distance between the host vehicle and surrounding vehicles through their wheels. Based on at least one sensor device, the method determines all visible wheels of every visible vehicle from the images returned by the sensors using a deep neural network, and on that basis calculates the various relevant distances between the host vehicle and a target vehicle.
The invention relates to a method for measuring distance through wheels, wherein the distance is the wheel track between a vehicle and surrounding vehicles, and the method is characterized by comprising the following steps:
step a: establishing a world coordinate system taking the center of the vehicle as a coordinate origin;
step b: acquiring images of vehicles and wheels around the vehicle through at least one vision sensor on the vehicle;
step c: carrying out vehicle identification and wheel identification on the acquired images to obtain the positions of the vehicle and the wheels in the images, namely the vehicle position and the wheel position;
step d: calculating the subordination relation between the wheels and the vehicle according to the geometric relative relation between the vehicle position and the wheel position obtained in the step c;
step e: obtaining the coordinates of the contact points of the outer edges of the front wheel and the rear wheel of the vehicle under a world coordinate system;
step f: calculating the coordinates of the front wheel and the rear wheel of the target vehicle under a world coordinate system according to the projection transformation relation based on the vehicle position, the wheel position and the membership obtained in the steps c and d;
step g: and calculating the coordinate difference value from the front/rear wheel of the target vehicle to the front/rear wheel on the left/right side of the vehicle under the world coordinate system, so as to obtain the transverse track and the longitudinal track between the vehicle and the target vehicle.
Preferably, in step a, in the world coordinate system, the center of the vehicle is taken as the coordinate origin, the forward direction of the vehicle head is taken as the Y axis, the direction perpendicular to the ground is taken as the Z axis, and the direction perpendicular to the YZ plane, pointing to the right of the vehicle head, is taken as the X axis.
Preferably, in step b, the vision sensor is arranged at the head part and/or the left side and/or the right side of the vehicle.
Preferably, in step c, vehicle identification and wheel identification are performed by a deep neural network method.
Preferably, in step c, the vehicle position includes a 2D frame position of the vehicle, a 3D frame position of the vehicle, and a vehicle contour curve; the wheel position includes a 2D frame position of the wheel and a wheel contour curve.
Preferably, the calculating of the membership in step d further comprises the steps of:
step d1: calculating the lateral maximum and minimum and the longitudinal maximum and minimum of the vehicle body based on the coordinates of the vehicle contour curve;
step d2: comparing the coordinate values of the wheel edge points, taken from the wheel contour curve, against this extremum range;
step d3: if more than three edge points fall within the extremum range, the wheel is judged to belong to that target vehicle; in this way the affiliations between all detected vehicles and all detected wheels are obtained.
Preferably, in the case that a plurality of the vision sensors are installed, step d further includes a step d4: merging the wheel positions belonging to the same target vehicle that are detected by different vision sensors.
Preferably, step f further comprises the following steps:
step f 1: converting the vehicle position and the wheel position from an image coordinate system to a camera system coordinate system;
step f 2: and converting the vehicle position and the wheel position in the camera system coordinate system to the world coordinate system.
Preferably, the vision sensor is a monocular camera.
The invention has the following beneficial effects:
1. The ranging method can accurately measure the distance from the front or rear wheel of a target vehicle to the host vehicle, which in turn improves the subsequent warning functions that take this distance as an important input parameter.
2. The invention only needs a monocular camera as the vision sensor device, which is lower in cost than a binocular camera, let alone a millimeter-wave radar.
Drawings
Fig. 1 is a flow chart of a method of measuring distance by a wheel of an embodiment of the present invention.
Fig. 2 is a schematic view of a method of measuring distance by a wheel of one embodiment of the present invention.
Detailed Description
The invention is further illustrated by the following examples, which are intended only to aid understanding of the invention and are not intended to limit its scope.
Fig. 1 is a flow chart of a method of measuring distance by a wheel of an embodiment of the present invention. Fig. 2 is a schematic view of a method of measuring distance by a wheel of one embodiment of the present invention. A method of measuring a distance by a wheel according to the present invention will be described in detail with reference to fig. 1 and 2. The distance is the wheel track between the vehicle and the surrounding vehicles, and comprises a transverse wheel track and a longitudinal wheel track. The method comprises the following steps:
Step a: establishing a world coordinate system with the center of the vehicle as the coordinate origin. In this world coordinate system, the center of the vehicle is the coordinate origin, the forward direction of the vehicle head is the Y axis, the direction perpendicular to the ground is the Z axis, and the direction perpendicular to the YZ plane, pointing to the right of the vehicle head, is the X axis.
Step b: acquiring images of the vehicles and wheels around the host vehicle through three vision sensors on the host vehicle. The vision sensors are arranged at the head and/or the left side and/or the right side of the vehicle. As shown in fig. 2, vision sensors may be installed at the three positions A, B and C of the host vehicle. In this embodiment, three vision sensors are installed simultaneously to acquire images; in other embodiments, a single vision sensor or any other number of vision sensors may be used. Preferably, in this embodiment, the vision sensor is a monocular camera, which is lower in cost than a binocular camera, let alone a millimeter-wave radar.
Step c: carrying out vehicle identification and wheel identification on the acquired images to obtain the positions of the vehicles and wheels in the images, namely the vehicle position and the wheel position. In this step, the vehicle identification and wheel identification are preferably performed by a deep neural network method; alternatively, known image processing methods can be used. The vehicle position comprises a 2D frame position of the vehicle, a 3D frame position of the vehicle and a vehicle contour curve; the wheel position comprises a 2D frame position of the wheel and a wheel contour curve.
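For illustration only, the following minimal Python sketch shows one possible way to organize the detection results of step c; it is an assumption about representation, not part of the patent, the detector itself is abstracted away, and all class and field names are hypothetical.

```python
# Hypothetical containers for the detection results of step c.
# The detector (deep neural network or conventional image processing) is out of scope here.
from dataclasses import dataclass, field
from typing import List, Tuple

Point2D = Tuple[float, float]  # (u, v) pixel coordinates in the image coordinate system

@dataclass
class VehiclePosition:
    box_2d: Tuple[float, float, float, float]  # (u_min, v_min, u_max, v_max)
    box_3d: List[Point2D]                      # projected corners of the vehicle's 3D box
    contour: List[Point2D]                     # vehicle contour curve

@dataclass
class WheelPosition:
    box_2d: Tuple[float, float, float, float]
    contour: List[Point2D]                     # wheel contour curve

@dataclass
class FrameDetections:
    vehicles: List[VehiclePosition] = field(default_factory=list)
    wheels: List[WheelPosition] = field(default_factory=list)
```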
Step d: calculating the subordination relation between the wheels and the vehicles according to the geometric relation between the vehicle position and the wheel position obtained in step c. The calculation of this membership comprises the following sub-steps (a code sketch follows these sub-steps):
step d1: calculating the lateral maximum and minimum and the longitudinal maximum and minimum of the vehicle body based on the coordinates of the vehicle contour curve;
step d2: comparing the coordinate values of the wheel edge points, taken from the wheel contour curve, against this extremum range;
step d3: if more than three edge points fall within the extremum range, the wheel is judged to belong to that target vehicle; in this way the affiliations between all detected vehicles and all detected wheels are obtained.
In the case that a plurality of vision sensors are installed, the method further includes step d4: merging the wheel positions belonging to the same target vehicle that are detected by different vision sensors.
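A minimal sketch of steps d1 to d3 follows, assuming the contour curves are given as lists of pixel points and using the "more than three edge points" criterion stated above. The function names are illustrative, and step d4 (merging detections of the same target vehicle from different sensors) is not shown.

```python
# Illustrative implementation of the wheel-to-vehicle membership test (steps d1-d3).
from typing import Dict, List, Tuple

Point2D = Tuple[float, float]

def body_extent(vehicle_contour: List[Point2D]) -> Tuple[float, float, float, float]:
    """Step d1: lateral (u) and longitudinal (v) extrema of the vehicle body contour."""
    us = [p[0] for p in vehicle_contour]
    vs = [p[1] for p in vehicle_contour]
    return min(us), max(us), min(vs), max(vs)

def wheel_belongs_to(vehicle_contour: List[Point2D], wheel_contour: List[Point2D]) -> bool:
    """Steps d2-d3: a wheel belongs to the vehicle if more than three of its
    edge points fall inside the vehicle's extremum range."""
    u_min, u_max, v_min, v_max = body_extent(vehicle_contour)
    inside = sum(1 for (u, v) in wheel_contour
                 if u_min <= u <= u_max and v_min <= v <= v_max)
    return inside > 3

def assign_wheels(vehicle_contours: List[List[Point2D]],
                  wheel_contours: List[List[Point2D]]) -> Dict[int, List[int]]:
    """Affiliation of every detected wheel (key) to the detected vehicles (values)."""
    return {w_idx: [v_idx for v_idx, v in enumerate(vehicle_contours)
                    if wheel_belongs_to(v, w)]
            for w_idx, w in enumerate(wheel_contours)}
```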
Step e: obtaining the coordinates of the ground contact points of the outer edges of the host vehicle's front and rear wheels in the world coordinate system. Since the dimensions of the host vehicle are known, these coordinates can be computed directly from the known data, for example as in the short sketch below.
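As a toy illustration of step e, the snippet below places the ground contact points of the host vehicle's wheel outer edges in the world coordinate system. The dimensions and the assumption that the origin lies on the ground plane directly below the vehicle center are placeholders, not values taken from the patent.

```python
# Step e sketch: contact points of the host vehicle's wheel outer edges in the
# world coordinate system (X to the right, Y forward, Z up). The origin is assumed
# to lie on the ground plane below the vehicle center; all dimensions are placeholders.
TRACK_WIDTH = 1.60             # outer edge to outer edge, meters (assumed)
CENTER_TO_FRONT_AXLE = 1.40    # meters (assumed)
CENTER_TO_REAR_AXLE = 1.40     # meters (assumed)

ego_wheel_contacts = {
    "front_left":  (-TRACK_WIDTH / 2, +CENTER_TO_FRONT_AXLE, 0.0),
    "front_right": (+TRACK_WIDTH / 2, +CENTER_TO_FRONT_AXLE, 0.0),
    "rear_left":   (-TRACK_WIDTH / 2, -CENTER_TO_REAR_AXLE, 0.0),
    "rear_right":  (+TRACK_WIDTH / 2, -CENTER_TO_REAR_AXLE, 0.0),
}
```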
Step f: calculating the coordinates of the front/rear wheels of the target vehicle in the world coordinate system according to the projective transformation relation, based on the vehicle position, the wheel position and the membership obtained in steps c and d. Step f comprises the following sub-steps (a code sketch follows the list):
step f 1: converting the vehicle position and the wheel position from an image coordinate system to a camera system coordinate system;
step f 2: and converting the vehicle position and the wheel position in the camera system coordinate system to the world coordinate system.
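The sketch below illustrates one standard way to realize steps f1 and f2 for a calibrated pinhole camera: the pixel of a wheel ground contact point is back-projected into a viewing ray and intersected with the ground plane of the world coordinate system. The calibration values in the example are assumptions for demonstration, not data from the patent.

```python
# Illustrative back-projection for steps f1-f2 (calibrated pinhole camera assumed).
# World coordinate system: X to the right, Y forward, Z up, origin at the host vehicle
# center projected onto the ground, so the ground plane is Z = 0.
import numpy as np

def pixel_to_world_on_ground(uv, K, R, t):
    """
    uv   : (u, v) pixel of a wheel contact point (image coordinate system).
    K    : 3x3 camera intrinsic matrix.
    R, t : extrinsics with x_cam = R @ X_world + t (world -> camera coordinate system).
    Returns the corresponding 3D point on the ground plane Z = 0 in world coordinates.
    """
    ray_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # f1: image -> camera ray
    cam_center = -R.T @ t                                       # camera position in world
    ray_world = R.T @ ray_cam                                   # f2: camera -> world direction
    s = -cam_center[2] / ray_world[2]                           # intersect with plane Z = 0
    return cam_center + s * ray_world

# Example with assumed calibration (not from the patent): camera mounted at the vehicle
# head, 1.2 m above the ground and 2.0 m ahead of the origin, looking forward along Y.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
# Rotation taking camera axes to world axes (columns: camera x -> world X,
# camera y -> world -Z, camera z -> world Y, i.e. the camera looks forward).
R_cam_to_world = np.array([[1.0,  0.0, 0.0],
                           [0.0,  0.0, 1.0],
                           [0.0, -1.0, 0.0]])
cam_center_world = np.array([0.0, 2.0, 1.2])
R = R_cam_to_world.T
t = -R @ cam_center_world
print(pixel_to_world_on_ground((700.0, 500.0), K, R, t))  # approx. [0.51, 8.86, 0.0]
```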
And finally, step g: calculating, in the world coordinate system, the coordinate differences from the front/rear wheels of the target vehicle to the front/rear wheels on the left/right side of the host vehicle according to the data obtained in steps e and f, thereby obtaining the transverse track and the longitudinal track between the host vehicle and the target vehicle (a small numerical example follows).
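Step g reduces to simple coordinate differences; a tiny numerical sketch with assumed contact-point coordinates (not data from the patent):

```python
# Step g sketch: transverse and longitudinal track as coordinate differences in the
# world coordinate system (X to the right, Y forward). The points below are assumed
# example outputs of steps e and f, not values from the patent.
ego_rear_right = (0.80, -1.40, 0.0)      # host vehicle wheel contact (step e, assumed)
target_rear_left = (3.20, 6.50, 0.0)     # target vehicle wheel contact (step f, assumed)

transverse_track = abs(target_rear_left[0] - ego_rear_right[0])    # X difference
longitudinal_track = abs(target_rear_left[1] - ego_rear_right[1])  # Y difference
print(f"{transverse_track:.2f} m lateral, {longitudinal_track:.2f} m longitudinal")
# -> 2.40 m lateral, 7.90 m longitudinal
```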
As shown in fig. 2, the lines 1 to 7 drawn from the vehicle center or from the wheel contact points illustrate the types of distances that the method of the present invention can measure, including:
the straight line 1 is the longitudinal distance from the wheel touch point of the vehicle to the front/rear wheel touch point of the target vehicle;
the straight line 2 is the transverse distance from the wheel touch point of the vehicle to the front/rear wheel touch point of the target vehicle;
the straight line 3 is the longitudinal distance from the center point of the vehicle to the contact point of the front wheel and the rear wheel of the target vehicle;
the straight line 4 is the longitudinal distance from the center point of the vehicle to the contact point of the front wheel and the rear wheel of the target vehicle;
the straight line 5 is the straight line distance from the center point of the vehicle to the center point of the target vehicle;
the straight line 6 is the straight line distance from the center point of the vehicle to the contact point of the front wheels of the target vehicle;
the straight line 7 is the straight line distance from the center point of the vehicle to the contact point of the rear wheel of the target vehicle.
With this method, the distance from the front or rear wheel of a target vehicle to the host vehicle can be measured accurately, which in turn improves the subsequent warning functions that take this distance as an important input parameter.
It will be apparent to those skilled in the art that the above embodiments are merely illustrative of the present invention and are not to be construed as limiting the present invention, and that changes and modifications to the above described embodiments may be made within the spirit and scope of the present invention as defined in the appended claims.
Claims (9)
1. A method of measuring a distance by means of a wheel, said distance being a track between a vehicle and a surrounding vehicle, characterized by the steps of:
step a: establishing a world coordinate system taking the center of the vehicle as a coordinate origin;
step b: acquiring images of vehicles and wheels around the vehicle through at least one vision sensor on the vehicle;
step c: carrying out vehicle identification and wheel identification on the acquired images to obtain the positions of the vehicle and the wheels in the images, namely the positions of the vehicle and the wheels;
step d: calculating the subordination relation between the wheels and the vehicle according to the geometric relative relation between the vehicle position and the wheel position obtained in the step c;
step e: obtaining the coordinates of the contact points of the outer edges of the front wheel and the rear wheel of the vehicle under a world coordinate system;
step f: calculating the coordinates of the front wheel and the rear wheel of the target vehicle under a world coordinate system according to the projection transformation relation based on the vehicle position, the wheel position and the membership obtained in the steps c and d;
step g: and e, calculating the coordinate difference value from the front/rear wheels of the target vehicle to the front/rear wheels on the left/right sides of the vehicle in the world coordinate system according to the coordinates of the contact points of the outer edges of the front wheels and the rear wheels of the vehicle in the world coordinate system obtained in the step e and the coordinates of the front/rear wheels of the target vehicle in the world coordinate system obtained in the step f, and thus obtaining the transverse wheel track and the longitudinal wheel track between the vehicle and the target vehicle.
2. The method according to claim 1, wherein in the world coordinate system, the center of the vehicle is taken as the coordinate origin, the forward direction of the vehicle head is taken as the Y axis, the direction perpendicular to the ground is taken as the Z axis, and the direction perpendicular to the YZ plane, pointing to the right of the vehicle head, is taken as the X axis.
3. The method according to claim 2, wherein in step b, the vision sensor is arranged at the head part and/or the left side and/or the right side of the vehicle.
4. The method of claim 1, wherein in the step c, the vehicle identification and the wheel identification are performed by a deep neural network method.
5. The method of claim 4, wherein in step c, the vehicle position comprises a 2D frame position of the vehicle, a 3D frame position of the vehicle, and a curve of the vehicle profile; the wheel positions include a 2D frame position and a wheel profile curve of the vehicle.
6. The method of claim 5, wherein the calculating of the membership in step d further comprises the steps of:
step d 1: calculating to obtain the transverse maximum and the longitudinal maximum and the minimum of the vehicle body based on the coordinates of the vehicle contour curve;
step d 2: comparing the wheel edge point coordinate values with a maximum value range based on the wheel profile curve;
step d 3: if more than three edge points are in the maximum value range, the wheel is judged to belong to the target vehicle, and the affiliations between all the detected vehicles and all the detected wheels are obtained.
7. The method of claim 6, wherein in the case of installing a plurality of said vision sensors, step d further comprises the step d4 of: the merging processing is performed for a plurality of wheel positions belonging to the same target vehicle detected by different vision sensors.
8. The method of claim 1, wherein step f further comprises the steps of:
step f 1: converting the vehicle position and the wheel position from an image coordinate system to a camera system coordinate system;
step f 2: and converting the vehicle position and the wheel position in the camera system coordinate system to the world coordinate system.
9. The method of claim 1, wherein the vision sensor employs a monocular camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910207478.4A CN109827516B (en) | 2019-03-19 | 2019-03-19 | Method for measuring distance through wheel |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109827516A CN109827516A (en) | 2019-05-31 |
CN109827516B true CN109827516B (en) | 2020-09-04 |
Family
ID=66870418
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910207478.4A (CN109827516B, active) | Method for measuring distance through wheel | 2019-03-19 | 2019-03-19 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109827516B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110246183B (en) * | 2019-06-24 | 2022-07-15 | 百度在线网络技术(北京)有限公司 | Wheel grounding point detection method, device and storage medium |
CN110287884B (en) * | 2019-06-26 | 2021-06-22 | 长安大学 | Voltage line detection method in auxiliary driving |
CN110738181B (en) * | 2019-10-21 | 2022-08-05 | 东软睿驰汽车技术(沈阳)有限公司 | Method and device for determining vehicle orientation information |
US11210533B1 (en) * | 2020-08-09 | 2021-12-28 | Phantom AI, Inc. | Method of predicting trajectory of vehicle |
CN112530160A (en) * | 2020-11-18 | 2021-03-19 | 合肥湛达智能科技有限公司 | Target distance detection method based on deep learning |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101281022A (en) * | 2008-04-08 | 2008-10-08 | 上海世科嘉车辆技术研发有限公司 | Method for measuring vehicle distance based on single eye machine vision |
CN104697491A (en) * | 2013-12-10 | 2015-06-10 | 通用汽车环球科技运作有限责任公司 | Distance determination using a monoscopic imager in a vehicle |
CN104899554A (en) * | 2015-05-07 | 2015-09-09 | 东北大学 | Vehicle ranging method based on monocular vision |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012215754A1 (en) * | 2012-09-05 | 2014-03-06 | Robert Bosch Gmbh | Method and device for vehicle measurement |
CN105205805A (en) * | 2015-08-19 | 2015-12-30 | 奇瑞汽车股份有限公司 | Vision-based intelligent vehicle transverse control method |
CN106443650A (en) * | 2016-09-12 | 2017-02-22 | 电子科技大学成都研究院 | Monocular vision range finding method based on geometric relation |
- 2019-03-19: CN application CN201910207478.4A (patent CN109827516B, status: active)
Also Published As
Publication number | Publication date |
---|---|
CN109827516A (en) | 2019-05-31 |
Similar Documents
Publication | Title
---|---
CN109827516B (en) | Method for measuring distance through wheel
US11348266B2 | Estimating distance to an object using a sequence of images recorded by a monocular camera
CN110077399B | Vehicle anti-collision method based on road marking and wheel detection fusion
CN108444390B | Unmanned automobile obstacle identification method and device
CN106054174B | It is used to cross the fusion method of traffic application using radar and video camera
CN109844762B | In-vehicle image processing apparatus
CN106054191B | Wheel detection and its application in object tracking and sensor registration
CN110065494B | Vehicle anti-collision method based on wheel detection
CN107133985B | Automatic calibration method for vehicle-mounted camera based on lane line vanishing point
CN109359409A | A kind of vehicle passability detection system of view-based access control model and laser radar sensor
US9846812B2 | Image recognition system for a vehicle and corresponding method
US8180100B2 | Plane detector and detecting method
US8154594B2 | Mobile peripheral monitor
JP6313081B2 | In-vehicle image processing apparatus and vehicle system using the same
Labayrade et al. | In-vehicle obstacles detection and characterization by stereovision
JP4082471B2 | Outside monitoring device
CN104700414A | Rapid distance-measuring method for pedestrian on road ahead on the basis of on-board binocular camera
JP2006053756A | Object detector
CN110197173B | Road edge detection method based on binocular vision
CN107796373B | Distance measurement method based on monocular vision of front vehicle driven by lane plane geometric model
CN108197590B | Pavement detection method, device, terminal and storage medium
Petrovai et al. | A stereovision based approach for detecting and tracking lane and forward obstacles on mobile devices
KR20190059894A | Detecting objects from camera images
CN106570487A | Method and device for predicting collision between objects
CN107220632B | Road surface image segmentation method based on normal characteristic
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||