US20170242443A1 - Gap measurement for vehicle convoying - Google Patents

Gap measurement for vehicle convoying

Info

Publication number
US20170242443A1
US20170242443A1
Authority
US
United States
Prior art keywords
vehicle
radar
state
lead
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/590,715
Inventor
Austin B. Schuh
Stephen M. ERLIEN
Stephan Pleines
John L. Jacobs
Joshua P. Switkes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Peloton Technology Inc
Original Assignee
Peloton Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/US2016/060167 external-priority patent/WO2017070714A1/en
Priority to US15/590,715 priority Critical patent/US20170242443A1/en
Application filed by Peloton Technology Inc filed Critical Peloton Technology Inc
Assigned to PELOTON TECHNOLOGY, INC. reassignment PELOTON TECHNOLOGY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JACOBS, JOHN L., ERLIEN, STEPHEN M., PLEINES, STEPHAN, SCHUH, AUSTIN B., SWITKES, JOSHUA P.
Publication of US20170242443A1 publication Critical patent/US20170242443A1/en
Priority to CN201780081508.0A priority patent/CN110418745B/en
Priority to JP2019523642A priority patent/JP7152395B2/en
Priority to CN202211662662.6A priority patent/CN116203551A/en
Priority to PCT/US2017/058477 priority patent/WO2018085107A1/en
Priority to EP17867739.9A priority patent/EP3535171A4/en
Priority to CA3042647A priority patent/CA3042647C/en
Priority to US15/936,271 priority patent/US10514706B2/en
Priority to US16/184,866 priority patent/US20190279513A1/en
Priority to US16/675,579 priority patent/US11360485B2/en
Priority to US17/839,464 priority patent/US12124271B2/en
Priority to JP2022155699A priority patent/JP7461431B2/en
Priority to JP2024046529A priority patent/JP2024095700A/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0257Control of position or course in two dimensions specially adapted to land vehicles using a radar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/16Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • G01S19/14Receivers specially adapted for specific applications
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0287Control of position or course in two dimensions specially adapted to land vehicles involving a plurality of land vehicles, e.g. fleet or convoy travelling
    • G05D1/0291Fleet control
    • G05D1/0293Convoy travelling
    • G06K9/6226
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • G08G1/163Decentralised systems, e.g. inter-vehicle communication involving continuous checking
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01QANTENNAS, i.e. RADIO AERIALS
    • H01Q1/00Details of, or arrangements associated with, antennas
    • H01Q1/27Adaptation for use in or on movable bodies
    • H01Q1/32Adaptation for use in or on road or rail vehicles
    • H01Q1/3208Adaptation for use in or on road or rail vehicles characterised by the application wherein the antenna is used
    • H01Q1/3233Adaptation for use in or on road or rail vehicles characterised by the application wherein the antenna is used particular used as part of a sensor or in a security system, e.g. for automotive radar, navigation systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9316Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles combined with communication equipment with other vehicles or with base stations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9318Controlling the steering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/93185Controlling the brakes
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9319Controlling the accelerator
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/932Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9325Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles for inter-vehicle distance regulation, e.g. navigating in platoons
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93271Sensor installation details in the front of the vehicles
    • G05D2201/0213
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/22Platooning, i.e. convoy of communicating vehicles

Definitions

  • the present invention relates generally to systems and methods for enabling vehicles to closely follow one another safely using automatic or partially automatic control.
  • vehicle automation relates to vehicular convoying systems that enable vehicles to follow closely together in a safe, efficient and convenient manner. Following closely behind another vehicle has significant fuel savings benefits, but is generally unsafe when done manually by the driver.
  • vehicle convoying systems are sometimes referred to as vehicle platooning systems, in which a second (and potentially additional) vehicle is autonomously or semi-autonomously controlled to closely follow a lead vehicle in a safe manner.
  • a variety of methods, controllers and algorithms are described for identifying the back of a particular vehicle (e.g., a platoon partner) in a set of distance measurement scenes and/or for tracking the back of such a vehicle.
  • the described techniques can be used in conjunction with a variety of different distance measuring technologies including radar, LIDAR, sonar units or any other time-of-flight distance measuring sensors, camera based distance measuring units, and others.
  • a radar (or other distance measurement) scene is received and first vehicle point candidates are identified at least in part by comparing the relative position of the respective detected objects that they represent, and in some circumstances the relative velocity of such detected objects, to an estimated position (and relative velocity) for the first vehicle.
  • the first vehicle point candidates are categorized based on their respective distances of the detected objects that they represent from the estimated position of the first vehicle.
  • the categorization is repeated for a multiplicity of samples so that the categorized first vehicle point candidates include candidates from multiple sequential samples.
  • the back of the first vehicle is then identified based at least in part on the categorization of the first vehicle point candidates.
  • the identified back of the first vehicle or an effective vehicle length that is determined based at least in part on the identified back of the first vehicle may then be used in the control of the second vehicle.
  • a bounding box is conceptually applied around the estimated position of the first vehicle and measurement system object points that are not located within the bounding box are not considered first vehicle point candidates.
  • the bounding box defines a region that exceeds a maximum expected size of the first vehicle.
  • the relative velocity of the vehicles is estimated together with an associated speed uncertainty.
  • object points within the set of detected object points that are moving at a relative speed that is not within the speed uncertainty of the estimated speed are not considered first vehicle point candidates.
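The candidate-gating steps above (a bounding box around the estimated partner position plus a relative-speed gate) can be sketched as follows; the type name, field names and box dimensions are illustrative assumptions, not values from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class RadarPoint:
    x: float   # longitudinal offset from the radar unit (m)
    y: float   # lateral offset (m)
    v: float   # relative velocity (m/s)

def candidate_points(points, est_x, est_y, est_v, speed_unc,
                     box_length=25.0, box_width=4.0):
    """Keep only points inside a bounding box applied around the estimated
    partner position that also match the estimated relative velocity within
    the speed uncertainty. The box exceeds a maximum expected vehicle size."""
    candidates = []
    for p in points:
        in_box = (abs(p.x - est_x) <= box_length / 2 and
                  abs(p.y - est_y) <= box_width / 2)
        speed_ok = abs(p.v - est_v) <= speed_unc
        if in_box and speed_ok:
            candidates.append(p)
    return candidates
```

Points failing either gate are simply not considered first vehicle point candidates for that sample.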
  • categorizing the first vehicle point candidates includes populating a histogram with the first vehicle point candidates.
  • the histogram includes a plurality of bins, with each bin representing a longitudinal distance range relative to the estimated position of the first vehicle.
  • the identification of the back of the first vehicle may be done after the histogram contains at least a predetermined number of first vehicle point candidates.
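A minimal sketch of the histogram accumulation across sequential samples; the bin width, bin count and minimum-points threshold below are assumptions for illustration:

```python
def populate_histogram(hist, offsets, bin_size=0.25, n_bins=80):
    """Accumulate longitudinal offsets (candidate position minus the
    estimated partner position, in metres) into fixed-width bins centered
    on the estimate; offsets outside the covered span are ignored."""
    for off in offsets:
        idx = int(off // bin_size) + n_bins // 2
        if 0 <= idx < n_bins:
            hist[idx] += 1
    return hist

def ready_to_identify(hist, min_points=200):
    """Identification of the back is deferred until enough candidates have
    accumulated over multiple sequential samples."""
    return sum(hist) >= min_points
```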
  • a clustering algorithm (as for example a modified mean shift algorithm) is applied to the first vehicle point candidates to identify one or more clusters of first vehicle point candidates.
  • the cluster located closest to the second vehicle that includes at least a predetermined threshold percentage or number of first vehicle radar point candidates may be selected to represent the back of the first vehicle.
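One way to realize the clustering step is a plain one-dimensional mean shift over the accumulated longitudinal offsets, then selecting the nearest cluster that holds a minimum share of the candidates. The bandwidth and threshold here are illustrative assumptions, and this is a generic mean shift rather than the patent's specific modification:

```python
def cluster_offsets(offsets, bandwidth=0.5):
    """1-D mean shift: each offset climbs to the mean of its neighbours
    within `bandwidth` until it converges; converged positions that land
    close together are merged into (center, count) clusters."""
    converged = []
    for x in offsets:
        for _ in range(50):
            nb = [o for o in offsets if abs(o - x) <= bandwidth]
            nx = sum(nb) / len(nb)
            if abs(nx - x) < 1e-4:
                break
            x = nx
        converged.append(x)
    clusters = []
    for c in sorted(converged):
        if clusters and abs(c - clusters[-1][0]) <= bandwidth / 2:
            center, n = clusters[-1]
            clusters[-1] = ((center * n + c) / (n + 1), n + 1)
        else:
            clusters.append((c, 1))
    return clusters

def select_back(clusters, total, min_fraction=0.25):
    """Walk clusters from nearest to farthest and return the first one
    holding at least `min_fraction` of all candidates, taken to be the
    back of the partner vehicle."""
    for center, n in sorted(clusters):
        if n / total >= min_fraction:
            return center
    return None
```

With a tractor-trailer, detections off the trailer doors typically dominate the nearest cluster, while sparser returns (e.g., underbody reflections) form farther clusters that the threshold rejects.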
  • Kalman filtering is used to estimate the position of the first vehicle.
  • a current radar (or other distance measurement) sample is obtained from a radar (or other distance measurement) unit.
  • the current distance measurement sample includes a set of zero or more object points.
  • a current estimate of a state of the lead vehicle corresponding to the current sample is obtained.
  • the current state estimate includes one or more state parameters which may include (but is not limited to), a position parameter (such as the current relative position of the lead vehicle), a speed parameter (such as a current relative velocity of the lead vehicle) and/or other position and/or orientation related parameters.
  • the current estimate of the state of the lead vehicle has an associated state uncertainty and does not take into account any information from the current distance measurement sample.
  • a determination is made regarding whether any of the object points match the estimated state of the lead vehicle within the state uncertainty. If so, the object point that best matches the estimated state of the lead vehicle is selected as a measured state of the lead vehicle. That measured state of the lead vehicle is then used in the determination of a sequentially next estimate of the state of the lead vehicle corresponding to a sequentially next sample. The foregoing steps are repeated a multiplicity of times to thereby track the lead vehicle.
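The per-sample gating and best-match selection can be sketched as follows, with the state reduced to a (position, velocity) pair and per-component uncertainties; the normalized-distance score is an illustrative choice, not the patent's stated metric:

```python
def select_measurement(object_points, est, unc):
    """Return the object point that best matches the predicted lead-vehicle
    state within the state uncertainty, or None when the sample holds no
    valid measurement. `object_points` is a list of (position, velocity)
    tuples; `est` and `unc` are the predicted (position, velocity) and the
    corresponding matching gates."""
    best, best_score = None, float("inf")
    for pos, vel in object_points:
        dp, dv = abs(pos - est[0]), abs(vel - est[1])
        if dp <= unc[0] and dv <= unc[1]:
            # Normalize each residual by its gate so neither term dominates.
            score = (dp / unc[0]) ** 2 + (dv / unc[1]) ** 2
            if score < best_score:
                best, best_score = (pos, vel), score
    return best
```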
  • the measured states of the lead vehicle may be used in the control of one or both of the vehicles—as for example in the context of vehicle platooning or convoying systems, in the at least partially automatic control of the trailing vehicle to maintain a desired gap between the lead vehicle and the trailing vehicle.
  • each sample indicates, for each of the object points, a position of a detected object corresponding to such object point (relative to the distance measuring unit).
  • Each current estimate of the state of the lead vehicle includes a current estimate of the (relative) position of the lead vehicle and has an associated position uncertainty. To be considered a valid measurement, the selected matching object point must match the estimated position of the lead vehicle within the position uncertainty. In some implementations, the current estimate of the position of the lead vehicle estimates the current position of a back of the lead vehicle.
  • each sample indicates, for each of the object points, a relative velocity of a detected object corresponding to such object point (relative to the distance measuring unit).
  • Each current estimate of the state of the lead vehicle includes a current estimate of the relative velocity of the lead vehicle and has an associated velocity uncertainty. To be considered a valid measurement, the selected matching object point must match the estimated relative velocity of the lead vehicle within the velocity uncertainty.
  • the state uncertainty is increased for the sequentially next estimate of the state of the lead vehicle.
  • GNSS position updates are periodically received based at least in part on detected GNSS positions of the lead and trailing vehicles. Each time a vehicle GNSS position update is received, the estimated state of the lead vehicle and the state uncertainty are updated based on such position update.
  • GNSS global navigation satellite systems
  • vehicle speed updates are periodically received based at least in part on detected wheel speeds of the lead and trailing vehicles. Each time a vehicle speed update is received, the estimated state of the lead vehicle and the state uncertainty are updated based on such lead vehicle speed update.
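The behaviour described in the last few paragraphs — state uncertainty growing when no radar match is found, and shrinking when a radar, GNSS or wheel-speed measurement arrives — can be illustrated with a scalar Kalman-style predict/update pair. The disclosure contemplates a full state array and covariance matrix (FIG. 8); this one-dimensional version with assumed noise values is a deliberate simplification:

```python
def predict(est, var, dt, process_var=0.5):
    """Propagate the relative position by the relative velocity and inflate
    the variance. When a sample contains no matching object point, only this
    step runs, so the state uncertainty grows until an aiding measurement
    (radar, GNSS position, or wheel-speed-derived velocity) arrives."""
    pos, vel = est
    return (pos + vel * dt, vel), var + process_var * dt

def update(est, var, meas, meas_var):
    """Scalar Kalman correction, with the measurement modelled as a direct
    observation of the relative position for simplicity."""
    pos, vel = est
    gain = var / (var + meas_var)
    return (pos + gain * (meas - pos), vel), (1.0 - gain) * var
```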
  • the described approaches are well suited for use in vehicle platooning and/or vehicle convoying systems including tractor-trailer truck platooning applications.
  • FIG. 1 is a block diagram of a representative platooning control architecture.
  • FIG. 2 is a flow chart illustrating a method of determining the effective length of a platoon partner based on outputs of a radar unit.
  • FIG. 3 is a diagrammatic illustration showing the nature of a bounding box relative to a partner vehicle's expected position.
  • FIG. 4A is a diagrammatic illustration showing exemplary radar object points that might be identified by a radar unit associated with a trailing truck that is following directly behind a lead truck.
  • FIG. 4B is a diagrammatic illustration showing a circumstance where the entire lead truck of FIG. 4A is not within the radar unit's field of view.
  • FIG. 4C is a diagrammatic illustration showing a circumstance where the bounding box associated with the lead truck of FIG. 4A is not entirely within the radar unit's field of view.
  • FIG. 4D is a diagrammatic illustration showing a circumstance where the lead truck is in a different lane than the trailing truck, but its entire bounding box is within the radar unit's field of view.
  • FIG. 5A is a graph that illustrates the relative location (longitudinally and laterally) of a first representative set of partner vehicle radar point candidates that might be detected when following a tractor-trailer rig.
  • FIG. 5B is a histogram representing the longitudinal distances of the detected partner vehicle radar point candidates illustrated in FIG. 5A .
  • FIG. 5C is a plot showing the mean shift centers of the histogram points represented in FIG. 5B .
  • FIG. 5D is a graph that illustrates the relative location (longitudinally and laterally) of a second (enlarged) set of partner vehicle radar point candidates that might be detected when following a tractor-trailer rig.
  • FIG. 5E is a histogram representing the longitudinal distances of the detected partner vehicle radar point candidates illustrated in FIG. 5D .
  • FIG. 5F is a plot showing the mean shift centers of the histogram points represented in FIG. 5E .
  • FIG. 6 is a diagrammatic block diagram of a radar scene processor suitable for use by a vehicle controller to interpret received radar scenes.
  • FIG. 7 is a flow chart illustrating a method of determining whether any particular radar scene reports the position of the back of a partner vehicle and updating the estimator of FIG. 6 .
  • FIG. 8 is a representation of a Kalman filter state array and covariance matrix suitable for use in some embodiments.
  • One of the goals of platooning is typically to maintain a desired longitudinal distance between the platooning vehicles, which is frequently referred to herein as the “desired gap”. That is, it is desirable for the trailing vehicle (e.g., a trailing truck) to maintain a designated gap relative to a specific vehicle (e.g., a lead truck).
  • the vehicles involved in a platoon will typically have sophisticated control systems suitable for initiating a platoon, maintaining the gap under a wide variety of different driving conditions, and gracefully dissolving the platoon as appropriate.
  • FIG. 1 diagrammatically illustrates a vehicle control architecture that is suitable for use with platooning tractor-trailer trucks.
  • a platoon controller 110 receives inputs from a number of sensors 130 on the tractor and/or one or more trailers or other connected units, and a number of actuators and actuator controllers 150 arranged to control operation of the tractor's powertrain and other vehicle systems.
  • An actuator interface (not shown) may be provided to facilitate communications between the platoon controller 110 and the actuator controllers 150 .
  • the platoon controller 110 also interacts with an inter-vehicle communications controller 170 which orchestrates communications with the platoon partner and a NOC communications controller 180 that orchestrates communications with a network operations center (NOC).
  • the vehicle also preferably has selected configuration files that include known information about the vehicle.
  • the functional components of the platoon controller 110 include gap regulator 112 , mass estimator 114 , radar tracker 116 and brake health monitor 118 . In many applications, the platoon controller 110 will include a variety of other components as well.
  • Some of the sensors utilized by the platoon controller 110 may include GNSS (GPS) unit 131 , wheel speed sensors 132 , inertial measurement devices 134 , radar unit 137 , LIDAR unit 138 , cameras 139 , accelerator pedal position sensor 141 , steering wheel position sensor 142 , brake pedal position sensor 143 , and various accelerometers.
  • GPS Global Positioning System
  • GPS is just one of the currently available global navigation satellite systems (GNSS). Therefore, it should be appreciated that data from any other GNSS system or from other suitable position sensing systems may be used in place of, or in addition to the GPS system.
  • sensors including wheel speed sensors 132, radar unit 137, accelerator pedal position sensor 141, steering wheel position sensor 142, brake pedal position sensor 143, and accelerometer 144 are relatively standard equipment on newer trucks (tractors) used to pull semi-trailers.
  • others such as the GNSS unit 131 and LIDAR unit 138 (if used) are not currently standard equipment on such tractors or may not be present on a particular vehicle and may be installed as needed or desired to help support platooning.
  • Some of the vehicle actuator controllers 150 that the platoon controller directs at least in part include torque request controller 152 (which may be integrated in an ECU or powertrain controller), transmission controller 154, brake controller 156 and clutch controller 158.
  • the communications between vehicles may be directed over any suitable channel and may be coordinated by inter-vehicle communications controller 170 .
  • DSRC Dedicated Short Range Communications
  • the DSRC protocol e.g. the IEEE 802.11p protocol
  • other communications protocols and channels may be used in addition to or in place of a DSRC link.
  • the inter-vehicle communications may additionally or alternatively be transmitted over a Citizen's Band (CB) Radio channel, one or more General Mobile Radio Service (GMRS) bands, and one or more Family Radio Service (FRS) bands, or any other now existing or later developed communications channels using any suitable communication protocol.
  • CB Citizen's Band
  • GMRS General Mobile Radio Service
  • FRS Family Radio Service
  • the transmitted information may include the current commands generated by the platoon controller, such as requested/commanded engine torque and requested/commanded braking deceleration. It may also include steering commands, gear commands, etc. when those aspects are controlled by the platoon controller.
  • Corresponding information is received from the partner vehicle, regardless of whether those commands are generated by a platoon controller or other autonomous or semi-autonomous controller on the partner vehicle (e.g., an adaptive cruise control system (ACC) or a collision mitigation system (CMS)), or through other or more traditional mechanisms—as for example, in response to driver inputs (e.g., accelerator pedal position, brake position, steering wheel position, etc.).
  • ACC adaptive cruise control system
  • CMS collision mitigation system
  • tractor sensor information provided to the platoon controller is also transmitted to the platoon partner, and corresponding information is received from the platoon partner, so that the platoon controllers on each vehicle can develop an accurate model of what the partner vehicle is doing.
  • any other relevant information that is provided to the platoon controller, including any vehicle configuration information relevant to platooning, may also be transmitted.
  • the specific information transmitted may vary widely based on the requirements of the platoon controllers, the sensors and actuators available on the respective vehicles, and the specific knowledge that each vehicle may have about itself.
  • the information transmitted between vehicles may also include information about intended future actions. For example, if the lead vehicle knows it is approaching a hill, it may expect to increase its torque request (or decrease its torque request in the context of a downhill) in the near future, and that information can be conveyed to a trailing vehicle for use as appropriate by the platoon controller.
  • the nature of the expected events themselves can be indicated (e.g., a hill, or curve or exit is approaching) together with the expected timing of such events.
  • the intended future actions can be reported in the context of expected control commands such as the expected torques and/or other control parameters and the timing at which such changes are expected.
  • There are a wide variety of different types of expected events that may be relevant to platoon control.
  • the communications between the vehicles and the NOC may be transmitted over a variety of different networks, such as the cellular network, various Wi-Fi networks, satellite communications networks and/or any of a variety of other networks as appropriate.
  • the communications with the NOC may be coordinated by NOC communications controller 180 .
  • the information transmitted to and/or received from the NOC may vary widely based on the overall system design.
  • the NOC may provide specific control parameters such as a target gap tolerance. These control parameters or constraints may be based on factors known at the NOC such as speed limits, the nature of the road/terrain (e.g., hilly vs. flat, winding vs. straight, etc.), weather conditions, traffic or road conditions, etc.
  • the NOC may provide such information to the platoon controller.
  • the NOC may also provide information about the partner vehicle including its configuration information and any known relevant information about its current operational state such as weight, trailer length, etc.
  • the vehicles involved in a platoon will typically have one or more radar systems that are used to detect nearby objects. Since radar systems tend to be quite good at determining distances between objects, separation distances reported by the radar unit(s) are quite useful in controlling the gap between vehicles. Therefore, once a platooning partner is identified, it is important to locate that specific partner vehicle in the context of the radar system output. That is, to determine which (if any) of a variety of different objects that might be identified by the radar unit correspond to the targeted partner.
  • the platoon partner will not always correlate to the closest vehicle detected by the radar unit or to the vehicle that is directly in front of the trailing truck.
  • the partner may be out of sight of a host vehicle's radar unit because it is too far away.
  • once the partner comes into sight of the radar unit, it becomes important to identify and distinguish that partner from other objects in the radar unit's field of view.
  • the description below describes techniques that are particularly well suited for identifying and distinguishing a designated partner from other objects that may be detected by a radar unit so that the radar unit can effectively track the partner vehicle (sometimes referred to as “locking onto” the partner).
  • a lead truck may change lanes, at which point it may not be directly in front of the trailing vehicle, so again, it is important that the distance between the platoon partners reported by the radar unit be associated with the platoon partner rather than merely the closest vehicle or a vehicle that happens to be directly in front of the trailing truck.
  • at times the radar unit may not be able to "see" the platooning partner. This could be because an interloper has gotten between the platoon partners, the lead vehicle has maneuvered out of view of the trailing vehicle's radar unit, interference with the radar signals, etc.
  • the position of the partner vehicle is generally known from the GPS based location information that is transmitted to the host vehicle.
  • the GPS system typically reports a location on the tractor, which could for example, be the position of the antenna(s) that receive the GPS signals.
  • the detected GPS position may then be translated to the position of a reference location on the vehicle that is a known distance from the GPS antenna, with the position of that reference location serving as the vehicle's reported GPS position.
  • the specific reference location chosen may vary based on control system preferences.
  • the reference location may be the center of the rear axles of the tractor.
  • the difference between the reported GPS position and the physical back of the vehicle can be significant to the platoon control. Therefore, it is often important to know the distance between the reported vehicle position and the actual back of the vehicle. This is sometimes referred to herein as the “effective vehicle length.”
  • the effective vehicle length is particularly important in the context of a tractor trailer truck where the reported GPS position is typically located somewhere on the cab (tractor) and the distance from the reported GPS position to the back of the trailer may be quite long.
  • trailer lengths on the order of 12-18 meters are common in the U.S. although they can be shorter or longer (indeed much longer in the context of double or triple trailers).
  • the distance from the reported GPS position to the back of the vehicle must also account for the longitudinal distance from the reported GPS position to the front of the trailer and/or any extensions associated with the load. It should be appreciated that in the trucking industry, the effective vehicle length often will not be known since any particular tractor may pull a variety of different trailers and the attachment point between the tractor and trailer is adjustable on the tractor.
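Once the effective vehicle length is known, the physical gap can be derived from the GPS-reported separation. A minimal sketch, assuming the separation has already been projected along the direction of travel; the function and parameter names are illustrative:

```python
def gap_from_gps(separation_m: float,
                 lead_effective_length_m: float,
                 follower_gps_to_front_m: float) -> float:
    """Estimate the gap between the back of the lead trailer and the
    front of the trailing tractor.

    separation_m: along-track distance between the two vehicles'
        reported GPS reference positions.
    lead_effective_length_m: distance from the lead vehicle's reported
        GPS position back to the rear of its trailer.
    follower_gps_to_front_m: known distance from the follower's GPS
        reference position forward to its front bumper.
    """
    return separation_m - lead_effective_length_m - follower_gps_to_front_m
```

Because the effective length is often unknown at the start of a trip, it is exactly the quantity the radar-based procedure described below sets out to estimate.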
  • radar units used in general road vehicle driving automation systems typically output data that indicates the presence of any object(s) detected within a designated field together with the relative position and speed of such object(s).
  • a radar unit may detect the presence of a variety of objects within its operational field.
  • the detected objects may include any vehicle positioned directly in front of the host vehicle, vehicles in adjacent lanes that may be passing, being passed by or driving in parallel to the platoon, stationary objects such as obstacles in the road, signs, trees, and other objects to the side of the road, etc.
  • the radar unit itself typically doesn't know or convey the identity or nature of the detected object. Rather it simply reports the relative position and motion of any and all perceived objects within its operational field.
  • it is important for the logic interpreting the output of the radar unit to have and maintain a good understanding of exactly where the partner vehicle is expected to be relative to the radar unit's field of view, regardless of whether the partner vehicle is even in that field of view. This is possible even when no explicit mechanism is provided for identifying the partner because the platooning system preferably has multiple independent mechanisms that can be used to help determine a vehicle's position.
  • a communications link is preferably established between the platooning vehicles.
  • the communications may be established over one or more wireless links such as a Dedicated Short Range Communications (DSRC) link, a cellular link, etc.
  • the processes used to identify potential platoon partners and to establish the platoon and appropriate communication links may vary widely. By way of example, a few representative techniques are described in U.S. patent application Ser. Nos. 13/542,622 and 13/542,627 as well as PCT Patent Application Nos. PCT/US2014/030770, PCT/US2016/049143 and PCT/US2016/060167 previously filed by Applicant, each of which is incorporated herein by reference.
  • the platoon controller 110 requests that the radar system control logic attempt to find the partner vehicle. More specifically, the trailing vehicle's radar tracker 116 needs to find and thereafter track the back of the lead vehicle in the context of the radar unit's outputs so that its data can be used in gap control.
  • referring next to FIG. 2 , a method particularly well suited for establishing a radar fix on a platoon partner will be described.
  • One aspect of establishing a radar fix is to determine the length of the partner so the GPS position information can be correlated to radar system outputs.
  • radar tracker control logic determines, receives or requests an estimate of the current relative position of the partner vehicle and subscribes to or regularly receives updates regarding the partner vehicle's relative position as they become available as represented by step 203 of FIG. 2 .
  • the estimated information may optionally include various additional position related information such as relative velocity of the vehicles, the relative heading of the vehicles, etc.
  • the radar tracker control logic is configured to estimate the current relative position, velocity and orientation (heading) of the partner vehicle based on a variety of sensor inputs from both the host vehicle and the partner vehicle.
  • the platoon partners are in communication with one another and during platooning, they send extensive information back and forth about themselves, including continually updated information about their current location and operating states.
  • some of the location related information that can be helpful to interpreting radar unit data may include information such as the partner vehicle's GPS position, wheel speed, orientation/heading (direction that the vehicle is heading), yaw rate (which indicates the vehicle's rate of turn), pitch, roll and acceleration/deceleration (longitudinal and angular in any of the foregoing directions).
  • Operational related information may also include a variety of other information of interest such as the current torque requests, brake inputs, gear, etc.
  • Information about the vehicles may include information such as the make and model of the vehicle, its length (if known), its equipment, estimated weight, etc. Any of these and/or other available information can be used in the position related estimates. By way of example, one particular position estimator is described below with respect to FIGS. 6 and 7 .
  • the estimated partner vehicle position related information can come from any appropriate source and the estimation does not need to be made by the radar tracker control logic itself. Additionally, although it is preferred that position and operational information be transmitted in both directions between vehicles, that is not necessary as long as the host vehicle is able to obtain the required information about the partner vehicle(s).
  • the current location related information is updated very frequently.
  • update frequencies on the order of 10 to 500 Hz, as for example 50 Hz, for items such as GPS position and wheel speed received over a DSRC link have been found to work well, although slower and much faster update frequencies may be used as appropriate in other embodiments.
  • although regular updates of the location related information are desirable, there is no need that they be received synchronously or at consistent intervals.
  • the partner vehicles may or may not be within the radar unit's field of view.
  • both the host vehicle's position and the partner vehicle's position are generally known based at least on the received GPS data so it is easy to estimate their separation with reasonable certainty.
  • although GPS location signals tend to be pretty good, the reported locations may be off by some amount, and thus it is better to treat any reported GPS position as an estimate with some appropriate amount of uncertainty rather than treating the reported position as infallible information. More details regarding some specific algorithms that are suitable for estimating the partner vehicle position will be described below.
  • GPS position readings from commercially available GPS sensors used in vehicle automation applications tend to be accurate within about 2-3 meters in practical road conditions when there is a direct line of sight to at least 4 GPS satellites.
  • some GPS sensors are regularly more precise, but no GPS sensor is guaranteed to always be that accurate due to variables such as interference, operation in regions where there is no line of sight visibility to the required number of operational GPS satellites, etc.
  • a bounding box is applied around the estimated relative position of the partner (step 206 of FIG. 2 ).
  • the purpose of the bounding box is to define a region that the partner vehicle is “expected” to be found in.
  • the logic will thereafter look for radar detected objects located within that bounding box in an effort to identify objects that may correlate to the partner vehicle.
  • the concept of a bounding box is helpful for several reasons. Initially it should be appreciated that the GPS unit will typically report the location of its antenna, which in the context of a tractor-trailer truck is usually on the cab. This detected position is then typically translated to a predefined reference location on the tractor and that translated position is used as the reported GPS position.
  • the reported GPS position for a tractor-trailer will be well in front of the back of the trailer which is (a) the point that is of primary interest to the gap control purposes, and (b) is typically the most prominent feature identified by the radar unit from a trailing platoon partner.
  • the distance between the reported GPS position and the back of the trailer will not be known in many circumstances.
  • One reason for the uncertainty is that a particular tractor (cab) may be used to pull a variety of different trailers (or other loads) which potentially have different lengths. Therefore the effective length of the tractor-trailer combination may vary from trip to trip and from a control standpoint it is generally undesirable to count on the driver to manually input the effective length of the tractor-trailer combination each trip.
  • the reported GPS positions of both platoon partners are subject to a degree of uncertainty.
  • the actual size and geometry of the bounding box used may vary but it is desirable that the region be large enough to encompass the entire range of vehicle lengths and widths that are possible plus a buffer to account for uncertainty in the estimated GPS position.
  • it is desirable that the longitudinal length of the bounding box be longer than any tractor-trailer combination that might be expected to be encountered.
  • U.S. commercial trucking applications involving normal tractor trailer combinations typically don't significantly exceed a combined length of 23 meters.
  • bounding boxes on the order of 32 meters long and 3-4.5 meters wide, as for example 3.8 meters, have been found to work well.
  • the tractor-trailer combinations may be longer and therefore longer bounding boxes may be appropriate.
  • the size of the bounding box can be adjusted accordingly to more accurately reflect the expected offset between the GPS position and the back of the trailer—which correlates to the effective vehicle length.
  • even when the effective length and width of the platoon partner is "known," it is still desirable to utilize a bounding box greater in size than the reported length and width to accommodate uncertainty in the GPS estimates and the possibility that the load may include a feature that extends beyond the vehicle's reported length.
  • the bounding box may encompass any desired geometric shape and/or may include dimensions other than longitudinal length and lateral width—as for example relative velocity.
  • the bounding box may be defined in any desired manner.
  • A representative bounding box 255 applied around a lead truck 251 in a platoon of two trucks is diagrammatically illustrated in FIG. 3 .
  • each truck has a GPS unit 258 located on its tractor (cab) and a radar unit 260 located at the front of the cab. It can be seen that the bounding box exceeds the length and width of the lead truck 251 .
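A rectangular bounding-box membership test of the kind described above can be sketched as follows. The frame convention (longitudinal distance measured rearward from the front of the box, lateral offset from the lead vehicle's centerline) and the default dimensions are assumptions drawn from the representative values in the text:

```python
def in_bounding_box(longitudinal_m: float, lateral_m: float,
                    box_length_m: float = 32.0,
                    box_width_m: float = 3.8) -> bool:
    """Return True if a radar point falls inside the rectangular
    bounding box applied around the estimated partner position.

    longitudinal_m: distance from the front of the box, measured
        rearward along the lead vehicle's longitudinal axis.
    lateral_m: offset from the lead vehicle's centerline.
    """
    return (0.0 <= longitudinal_m <= box_length_m
            and abs(lateral_m) <= box_width_m / 2.0)
```

Radar points failing this test would be rejected as unlikely to correspond to the partner vehicle.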
  • the bounding box may be defined more complexly.
  • the scaled squares of the lateral offset (Y off ) and the relative velocity (V) of the vehicles may be compared to a threshold (Th).
  • a radar point would then be rejected if the sum of these squares exceeds the designated threshold (Th), even if the radar point is within the longitudinal range of the bounding box.
  • the bounding box has the effective appearance of a tube within a state space map, with velocity being the third axis.
  • the logic of such an approach is that if both the measured lateral offset and the measured velocity of a detected object are relatively lower probability matches, then the detected point is less likely to be a match (and therefore more appropriate to disregard for the purposes of identifying the back of a partner vehicle) than if one of those parameters is off but the other very nearly matches the expected value.
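This combined lateral-offset/velocity gate might be sketched as below. The scale factors and threshold are placeholders, since the text leaves their values open:

```python
def passes_tube_test(lateral_offset_m: float,
                     velocity_error_mps: float,
                     k_lat: float = 1.0,
                     k_vel: float = 1.0,
                     threshold: float = 1.0) -> bool:
    """Accept a radar point only if the sum of the scaled squares of its
    lateral offset and its deviation from the expected relative velocity
    stays at or below a designated threshold (Th). Scale factors and
    threshold are assumed placeholders."""
    return (k_lat * lateral_offset_m ** 2
            + k_vel * velocity_error_mps ** 2) <= threshold
```

With this test, a point slightly off in both lateral position and velocity can be rejected even though a point equally off in only one of the two would be kept.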
  • the bounding box definition may be arranged to change over time. For example, one or more selected dimensions of the bounding box may be reduced as the algorithm begins to develop a better understanding of what radar object sample points are more likely to correspond to the partner vehicle or the back of the partner vehicle.
  • the logic determines whether the entire bounding box is within the other vehicle's radar unit's field of view 263 (step 209 ). If not, the logic waits for the entire bounding box to come within the radar unit's field of view, thereby effectively ignoring the radar system outputs for the purpose of identifying the partner vehicle (although of course the radar system outputs can be used for other purposes such as collision avoidance if desired). There are a variety of reasons why the partner vehicle may not be within or fully within the radar unit's field of view at any particular time. Initially, it should be appreciated that although the radar unit(s) used to support platooning may be placed at a variety of different locations on the vehicles, they often have a relatively narrow field of view.
  • one common approach is to place a forward facing radar unit having a relatively narrow fixed beam in the vicinity of the middle of the front bumper to detect objects in front of the vehicle.
  • a forward facing radar unit having a relatively narrow fixed beam is illustrated in FIG. 3 .
  • the field of view 263 of radar unit 260 located on the trailing truck 252 is also shown.
  • When a forward facing radar unit is used, it will be unable to see any vehicle behind or to the side of its host vehicle. Even when the partner vehicle is ahead of the radar unit host, it may be out of the field of view if it is too far ahead of the host or is around a corner, as may be the case when a platoon partner is first identified. In some cases a platoon partner can be partially in the radar unit's field of view. A common example of this is when the partner vehicle is in an adjacent lane and not far enough ahead for the back of its trailer to be seen by a narrow beamed forward facing radar unit.
  • FIGS. 4A-4D illustrate a few (of the many) potential relative positionings of two trucks that are in the process of establishing a platoon.
  • In FIG. 4A , the lead truck 251 is directly ahead of the trailing truck 252 and its bounding box 255 is fully within the field of view 263 of trailing truck radar unit 260 .
  • In FIG. 4B , the lead truck 251 is in a lane adjacent the trailing truck 252 and some, but not all, of the lead truck 251 itself (and thus not all of bounding box 255 ) is within the field of view 263 of trailing truck radar unit 260 .
  • In FIG. 4C , the lead truck 251 is in a lane adjacent to the trailing truck 252 and all of the lead truck 251 itself, but not the entire bounding box 255 , is within the field of view 263 of trailing truck radar unit 260 .
  • In FIG. 4D , the lead truck 251 is again in a lane adjacent the trailing truck 252 but differs from FIGS. 4B and 4C in that the entire bounding box 255 associated with lead truck 251 is within the field of view 263 of trailing truck radar unit 260 .
  • the partner vehicle identification logic waits at step 209 for the entire bounding box to come within the radar unit's field.
  • the radar system controller logic obtains a next radar sample (step 212 ) and a current estimate of the partner vehicle's position and velocity relative to itself (step 215 ).
  • Commercially available short range radar units utilized in road vehicle applications are typically configured to output their sensed scene at a relatively rapid sample rate. Each scene typically identifies a set of zero or more objects that have been detected as well as the velocity of such objects relative to the radar unit itself.
  • the nature of radar systems is that the transmitted radio waves can be reflected by most anything in their path including both any intended target(s) and potentially a wide variety of different items. Therefore, when trying to establish a platoon, it is important to identify the reflected signal(s) that represent the desired partner and to be able to distinguish that partner from the noise reflected from other objects.
  • the radar unit may receive reflections from multiple different vehicles including any vehicle that is immediately ahead, passing vehicles going in the same or opposite direction, objects to the side of the road such as highway or street signs, trees or other objects along the side of the road, etc.
  • the radar system control logic determines whether any of the identified objects are partner vehicle radar point candidates as represented by step 218 .
  • Representative objects that might be detected by the radar unit 260 are marked with X's in FIGS. 4A-4D .
  • an object detected in the scene must be located within the bounding box in terms of both position and speed. Radar objects located outside of the bounding box are preferably rejected because there is a relatively higher probability that they do not correspond to the partner vehicle. For example, they could correspond to vehicles in adjacent lanes 272 , 273 , an interloper located between the platoon partners (not shown), objects on the side of the road 274 , etc.
  • Objects that do not closely match the expected relative speed of the partner vehicle are also preferably rejected even if they match the expected position aspects of the bounding box longitudinally and laterally because again, it is less likely that they correspond to the platoon partner.
  • a stationary object such as a feature to the side of the road (e.g. a road sign, tree or stationary vehicle), debris in the road, or a detected feature in the road itself (e.g. a pothole, etc.), will appear to be approaching the radar unit at the speed that the host vehicle is traveling at. It is noted that many commercially available radar units will automatically filter out, and therefore don't report, stationary objects. When such a radar unit is used, the stationary objects would not even be identified as part of the radar scene.
  • Some of the reported radar objects may be traveling in the same direction as the host vehicle but are moving at a relative velocity that is different than the expected partner velocity. There is a relatively high probability that such radar objects do not correspond to the partner vehicle and therefore these types of radar points are also preferably discarded.
  • any detected radar objects that appear to match the expected location and speed of the partner within the context of the defined bounding box are considered partner vehicle radar point candidates and are categorized with respect to how far they are longitudinally (along the longitudinal axis of the partner) from the estimated location of the partner (e.g., the partner's GPS position).
  • a histogram is utilized for this categorization. The number of bins in the histogram may vary. For computational ease, 512 bins divided evenly over the length of the bounding box have been found to work well, although more or fewer bins can be used as appropriate for any particular application. In implementations that use a bounding box of approximately 32 meters, with 512 bins, each bin corresponds to approximately 6 cm (2-3 inches). If greater resolution is desired, then more bins can be used.
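The binning step can be sketched as follows, using the representative 512-bin, 32-meter configuration (names are illustrative):

```python
def bin_index(longitudinal_m: float,
              box_length_m: float = 32.0,
              n_bins: int = 512) -> int:
    """Map a candidate point's longitudinal distance (measured from the
    front of the bounding box) to a histogram bin. With 512 bins over
    32 m, each bin spans 0.0625 m, i.e. roughly 6 cm."""
    bin_width = box_length_m / n_bins
    i = int(longitudinal_m // bin_width)
    return min(max(i, 0), n_bins - 1)  # clamp to a valid bin
```

Each accepted radar point candidate would increment the count in its bin, building up the histogram analyzed later.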
  • when the radar is mounted relatively low on the host vehicle, it may detect reflections from the transmission or other items along the truck's undercarriage or other features of the tractor-trailer, such as the trailer's landing gear or the back of the tractor, and identify those items as separate detected "objects." Therefore, it is possible (indeed it is relatively common) that any particular sample may identify more than one object that meets the criteria of a partner vehicle radar point candidate. In such circumstances multiple candidates associated with a particular radar sample will be added to the histogram.
  • at step 224 , a determination is made regarding whether sufficient samples have been obtained to analyze the radar data to identify the partner vehicle. If not, the logic returns to step 212 where the next sample is obtained, and the process repeats until sufficient samples have been obtained to facilitate analysis. If the bounding box moves partially out of the field of view of the radar unit at any point (as represented by the "no" branch from decision block 225 ), then the logic returns to step 209 where it waits for the bounding box to come back into full view before taking additional samples.
  • FIG. 5A is a plot showing a set of 98 detected partner vehicle radar point candidates transposed into a reference frame based on the expected location of the front truck.
  • the x-axis of the plot shows the longitudinal distance from the expected position of the front of the leading truck to the detected point.
  • the y-axis shows the lateral offset of the detected point relative to the center axis of the leading truck. It can be seen that although there is noticeable variation in the locations of the detected points, in the illustrated sample set, the points tend to be clustered into a couple of regions.
  • FIG. 5B is a histogram that shows the longitudinal distance to each of the detected partner vehicle radar point candidates in the plot of FIG. 5A . It can be seen that when only the longitudinal distance is considered, the clustering tends to be even more pronounced.
  • the large cluster 290 located furthest back in the histogram typically corresponds to the back of the vehicle and is often (although not always) the largest cluster.
  • Cluster 292 located further forward typically corresponds to other features of the partner truck.
  • radar reflections from the forward features tend to be weaker and more sporadically identified as a discrete object by the radar unit, which translates to a smaller cluster in the histogram.
  • the logic follows the yes branch from decision block 224 and flows to step 227 where a clustering algorithm is applied to the histogram data.
  • the trigger point for when processing may start can vary widely based on the needs of any particular system. In general, it is desirable for the histogram to contain enough data points so that the partner vehicle can be accurately identified. In some specific implementations, the histogram must include data from a first threshold worth of samples (e.g., samples corresponding to at least 3 seconds worth of data or 60 samples) and include at least a second threshold worth of partner vehicle radar point candidates (e.g., at least 60 partner vehicle radar points). The thresholds used may vary based on the needs of a particular implementation.
  • samples corresponding to at least 1-5 seconds worth of data or thresholds in the range of 40 to 500 points may be used in some implementations.
  • samples corresponding to at least 3 seconds worth of data or 60 samples and 60 partner vehicle radar points are used as thresholds.
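The sufficiency check of step 224 might look like the following sketch, using the representative thresholds given above (at least 3 seconds of data, at least 60 samples and at least 60 partner vehicle radar points); the function and parameter names are assumptions:

```python
def enough_data(sample_count: int,
                candidate_count: int,
                elapsed_s: float,
                min_samples: int = 60,
                min_candidates: int = 60,
                min_seconds: float = 3.0) -> bool:
    """Only attempt to identify the back of the partner vehicle once the
    histogram holds enough samples, enough partner vehicle radar point
    candidates, and enough elapsed time worth of data."""
    return (sample_count >= min_samples
            and candidate_count >= min_candidates
            and elapsed_s >= min_seconds)
```

Until this gate passes, the logic keeps collecting samples rather than attempting the clustering analysis.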
  • the dataset illustrated in FIGS. 5A and 5B is representative of a dataset that might be available at the time that an attempt is initially made to identify the back of the partner vehicle—that is, the first time that the “yes” branch from step 224 is followed.
  • FIG. 5C is a plot showing the mean shift centers of the histogram points represented in FIG. 5B , with the heights of the centers being indicative of the number of points associated with that center.
  • the two clusters 290 and 292 stand out even more dramatically in this representation.
  • the mean shift data is then analyzed to determine whether one of the clusters meets predefined back of partner vehicle criteria in step 230 . If so, that cluster is identified as corresponding to the back of the vehicle. (Step 233 ). Since each cluster corresponds to a designated distance between the partner's reported GPS position and the back of the vehicle, the effective length of the vehicle is defined by the cluster. As noted above, the phrase “effective vehicle length” as used herein corresponds to the distance between the reported GPS position and the back of the vehicle—which is an important distance to know for control purposes. It should be appreciated that this is typically different than the actual length of the vehicle because the reported reference position may not be located at the front of the vehicle.
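As one way to picture the clustering step, here is a minimal one-dimensional mean shift over the histogram point positions. This is only a sketch under assumed parameters (bandwidth, merge tolerance), not the actual implementation:

```python
def mean_shift_1d(points, bandwidth=0.5, tol=1e-4, max_iter=100):
    """Shift each point to the mean of its neighbors within `bandwidth`
    metres until convergence; converged positions that land close
    together are merged into a single cluster center. Returns a list of
    (center_position, point_count) tuples."""
    centers = []
    for p in points:
        x = p
        for _ in range(max_iter):
            neighbors = [q for q in points if abs(q - x) <= bandwidth]
            new_x = sum(neighbors) / len(neighbors)
            if abs(new_x - x) < tol:
                break
            x = new_x
        # merge converged positions that land close together
        for i, (c, n) in enumerate(centers):
            if abs(c - x) <= bandwidth / 2:
                centers[i] = (c, n + 1)
                break
        else:
            centers.append((x, 1))
    return centers
```

Applied to the longitudinal distances in the histogram, the resulting centers correspond to the peaks (such as clusters 290 and 292 ), with the point counts indicating the height of each peak.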
  • the cluster located closest to the back of the bounding box that has over a threshold percentage of the total number of radar points in the histogram is identified as the back of the platoon partner vehicle.
  • a further constraint is used that requires that the cluster location not move by more than a certain threshold on the last sample.
  • maximum movement thresholds on the order of 1 mm have been found to work well in some applications. This approach has been found to very reliably identify the radar point that corresponds to the back of a truck even when the radar unit controller has no predetermined knowledge of the length of the vehicle and regardless of the presence of other traffic.
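The back-of-vehicle selection criterion (the rear-most cluster holding more than a threshold share of the histogram points) can be sketched as below. The movement-stability constraint is omitted from this sketch, and the threshold fraction is an assumed placeholder:

```python
def select_back_cluster(clusters, total_points, min_fraction=0.3):
    """Pick the cluster treated as the back of the partner vehicle.

    clusters: list of (longitudinal_position_m, point_count) tuples,
        where larger positions lie farther back in the bounding box.
    total_points: total number of radar points in the histogram.
    Returns the rear-most cluster holding more than min_fraction of all
    points, or None if no cluster qualifies."""
    for pos, count in sorted(clusters, key=lambda c: c[0], reverse=True):
        if count / total_points > min_fraction:
            return pos, count
    return None
```

The selected cluster position directly yields the effective vehicle length, i.e. the distance from the reported GPS position to the back of the vehicle.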
  • the threshold percentage or other characteristics of the histogram used to identify the back of the vehicle may vary based on application.
  • cluster 290 is designated as the back of the lead truck.
  • Radar points that report features that are not where the platoon partner is expected to be are filtered because they are not within the bounding box. Radar points that are not traveling at close to the expected relative speed are filtered regardless of where they are found.
  • the back of vehicle criteria used on the clustered histogram data effectively filters any other vehicles traveling within the footprint of the bounding box at very near the same speed as the platoon partner because the bins are small enough that it is highly unlikely that such an interloper can maintain a constant enough gap to fool the algorithm into thinking that the interloper is part of the target (e.g., even if the interloper is traveling at nearly the same speed as the partner vehicle, if it is located within the bounding box, its position relative to the partner vehicle's position is likely to vary enough to cause the back of partner vehicle test to fail).
  • the back of vehicle criteria also filters out more random objects reported by the radar unit.
  • the effective vehicle length indicated by the selected mean shift cluster may be reported to the gap controller and any other controller concerned with the length of the partner.
  • the distance between the GPS reference location and the front of the host vehicle is known and therefore the effective vehicle length determined by the radar unit can readily be used in association with known information about the truck to positively indicate the front and back of the truck as represented by step 236 .
  • radar points may optionally be discarded after they become too old or the process restarted if the system has trouble identifying the back of the partner vehicle or for other reasons, such as the vehicles coming to a stop.
  • the back of the partner identification process continues to run or is periodically rerun even after the vehicle length has been determined.
  • the initial length determination is made while the platoon partners are relatively far apart (e.g., over 100 feet).
  • the gap controller may tighten the gap thereby drawing the vehicles closer together.
  • at closer distances, the radar readings are often more precise than they are when the vehicles are 100+ feet apart.
  • more measurements give a better statistical indication of the relative position of the vehicle.
  • FIG. 5D is a plot showing a set of 1700 detected partner vehicle radar point candidates on the same graph as shown in FIG. 5A .
  • the 1700 sample points include the 98 points illustrated in FIGS. 5A-5C and were obtained by continuing to run the same radar point classification algorithm.
  • FIGS. 5E and 5F show the histogram and mean shift centers respectively for the larger data set.
  • FIG. 5E corresponds to FIG. 5B
  • FIG. 5F corresponds to FIG. 5C . It can be seen that the larger dataset appears to have identified a small cluster 293 located near the front of the lead vehicle and has effectively filtered out some smaller clusters identified in the smaller data set.
  • the histogram and/or mean shift clusters also provide a very good indication of the radar signature of the partner vehicle.
  • This known signature of the partner vehicle can be used in a number of different ways as an independent mechanism for verifying that the proper vehicle is being tracked. For example, in scenarios where GPS data becomes unavailable or communications between the vehicles are disrupted for a period of time, the histogram can be used as a check to verify that the correct vehicle is being tracked by the radar unit.
  • the portion of the truck that can be seen can be compared to the histogram signature to determine the relative positioning of the trucks, which can be used as a measurement for gap control or as part of autonomous or semi-autonomous control of the trailing vehicle.
  • in circumstances when radar contact is lost, a new histogram can be started at an appropriate time and compared to a stored histogram indicative of the platoon partner. When there is a match, that match can be good independent evidence that radar contact with the platoon partner has been reestablished.
  • newly created histograms can be compared to stored histograms representing the platoon partner at various times during platooning as a way of independently verifying that the platoon partner is still being tracked. This can be a good safety check to verify that the radar unit has not inadvertently switched and locked onto a vehicle that is traveling in parallel next to the platoon partner.
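A histogram comparison of the kind described in the last few bullets might be implemented with a simple histogram-intersection score. The shared bin layout, the 0.8 threshold, and the function names here are illustrative assumptions, not from the patent.

```python
def histogram_intersection(sig_a, sig_b):
    """Overlap between two normalized radar histograms that share the same bin
    edges: 1.0 means identical signatures, 0.0 means no overlap at all."""
    total_a, total_b = sum(sig_a), sum(sig_b)
    return sum(min(a / total_a, b / total_b) for a, b in zip(sig_a, sig_b))

def same_partner(stored_sig, new_sig, threshold=0.8):
    """True when a freshly built histogram matches the stored partner
    signature closely enough to count as independent confirmation."""
    return histogram_intersection(stored_sig, new_sig) >= threshold
```

Normalizing each histogram first makes the comparison insensitive to how many samples each histogram has accumulated, which matters when a newly started histogram is compared against a long-running stored one.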
  • the histograms can also be saved as a radar signature of the partner vehicle and shared with other trucks that may later seek to platoon with that vehicle—which can be useful in the initial identification process.
  • Such models preferably utilize inputs from multiple different sensing systems and include at least some redundant information from different systems when practical.
  • the provision of redundant information from different systems is helpful as a double check as to the integrity of received data and also provides backup mechanisms for the inevitable times when a system is unable to convey accurate information.
  • the gap between vehicles can be determined using a number of different techniques.
  • One general approach is to use the distance to the platoon partner detected by the radar system. Although radar tends to very accurately measure the distance between vehicles, it is important to ensure that the distance being reported is actually the distance to the platoon partner rather than some other vehicle or feature. There are also times when the partner vehicle is not within the radar's field of view or the radar unit is not operating as desired for a brief period.
  • An independent way of determining the distance between the platoon partners is to utilize their respective GPS data. Specifically, the distance between the vehicles should be the difference between the vehicle's respective GPS positions, minus the effective length of the lead vehicle and the offset distance between the front of the trailing vehicle and its GPS receiver.
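The GPS-based gap computation just described reduces to the distance between the two fixes minus two known offsets. A sketch follows; the haversine formula and all names here are assumptions (a production system would use a more precise geodesic and time-synchronized fixes).

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def gps_gap(lead_fix, trail_fix, lead_effective_length_m, trail_front_offset_m):
    """Bumper-to-bumper gap: antenna-to-antenna distance minus the lead
    vehicle's effective length (GPS reference to its back) and the distance
    from the trailing vehicle's front bumper back to its own GPS receiver."""
    antenna_dist = haversine_m(*lead_fix, *trail_fix)
    return antenna_dist - lead_effective_length_m - trail_front_offset_m
```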
  • Limitations of using the GPS data include the fact that the GPS data will not always be available due to factors such as the GPS receivers not having a clear view of sufficient GPS satellites to be able to determine a location or the communication link between vehicles being down for a period of time.
  • the GPS data is also fundamentally limited by its accuracy, which, while good, is often less precise than desired for gap control.
  • Other systems for measuring distances between the platoon partners have their own advantages and limitations.
  • the gap expected at a time in the immediate future can be estimated based on factors such as the current positions, the relative velocities and yaw rates of the vehicles.
  • the respective velocities of the vehicles may also be measured, determined, estimated and/or predicted in a variety of different manners.
  • wheel speed sensors can be used to relatively accurately indicate the current speeds of the respective vehicles.
  • Knowledge of the vehicle's orientation can be used in conjunction with the knowledge of the vehicle's speed to determine its velocity.
  • the radar unit can be used to measure the relative speeds of the platoon partners. Knowledge of other factors such as torque request, vehicle weight, engine characteristics and road grade can be used to predict vehicle speeds in the future.
  • the radar system controller (or another controller whose determinations can be utilized by the radar system controller) includes a position estimator that maintains an estimate of the current position, orientation and relative speed of the partner vehicle relative to the radar unit.
  • One suitable radar scene processor 600 that includes a position/state estimator 612 is illustrated in FIG. 6 .
  • radar scene processor 600 includes gap monitor 610 and a partner identifier 620 .
  • the gap monitor 610 is configured to track the position of the back of the partner vehicle based on radar measurements (after the back of the partner vehicle has been identified) and to report radar position and speed measurements corresponding to the back of the partner vehicle to the gap controller and/or any other component interested in such measurements made by the radar unit.
  • One particular implementation of the gap monitoring algorithm will be described below with reference to the flow chart of FIG. 7 .
  • the gap monitor 610 includes a position/state estimator 612 having a Kalman filter 615 that is used to determine both the most recent estimate of the position of the partner vehicle relative to the host vehicle and to predict the expected position of the partner vehicle at the time the next radar sample will be taken.
  • the position/state estimator 612 utilizes both the detected radar scenes and other available vehicle state information such as the respective GPS positions, wheel speeds, and inertial measurements of the host and partner vehicles in the estimate of the expected state (e.g. position, velocity etc.) of the leading vehicle. These state estimates can then be used to help interpret the received radar scene.
  • the state estimates help the gap monitor 610 properly identify the radar return object that corresponds to the back of the partner vehicle out of a radar scene that may include a set of detected objects. This helps ensure that the proper detected point is used in the gap control. It is also helpful in identifying situations in which the tracker does not have good confidence regarding which (if any) of the objects detected by the radar in a particular scene sample accurately represent the position of the back of the partner vehicle so that such a sample can be discounted, ignored or otherwise properly handled in the context of the gap control algorithm.
  • One particular Kalman filter design that is well suited for use in the position/state estimator 612 is described below with respect to FIG. 8 .
  • the partner identifier 620 includes its own position/state estimator 622 , a histogram 624 , a clustering algorithm 625 which produces mean shift clusters 626 and partner length estimator 627 .
  • the partner identifier 620 executes an algorithm such as the algorithm discussed above with respect to FIG. 2 to identify the back of the partner vehicle. As part of that process, histogram 624 is populated.
  • the histogram is diagrammatically shown as being part of the partner identifier 620 , but it should be appreciated that the histogram is merely a data structure that can be physically located at any appropriate location and may be made available to a variety of other processes and controllers within, or external to, the radar tracker 620 .
  • the partner length estimator 627 is configured to determine the length of the partner vehicle (including its front and back relative to its GPS reference position) based on the histogram and other available information.
  • the position/state estimator 622 in the partner identifier 620 functions similarly to the position/state estimator 612 described above and may also include a Kalman filter 623.
  • a significant difference between position/state estimator 622 used for partner identification and position/state estimator 612 is that what radar point corresponds to the back of the partner truck is not known during identification and therefore the radar unit samples cannot be used as part of the position/state estimates.
  • the position/state estimation, partner detection, partner length estimating and gap monitoring algorithms may be executed on a radar tracking processor dedicated to radar tracking alone, or they may be implemented on a processor that performs other gap or platoon management tasks as well.
  • the respective algorithms may be implemented as distinct computing processes or they may be integrated in various manners with each other and/or other functionality in various computing processes. In other embodiments, discrete or programmable logic may be used to implement the described functionality. It should be apparent that a wide variety of different models can be used to track the position of the back of the partner vehicle relative to the radar unit and to estimate future positions.
  • Two particular position/state estimators are diagrammatically illustrated as part of FIG. 6 and a method that can be used to estimate the current position at any given radar sample time is illustrated in the flow chart of FIG. 7 .
  • the trailing vehicle is tracking the position of the back of a lead vehicle, although an analogous process can be used by the lead vehicle to track a following vehicle or for parallel vehicles to track one another.
  • the described method presupposes that we have a reasonable estimate of the location of the back of the partner vehicle—which can initially be determined using the method described above with respect to FIG. 2 or in any other suitable manner. For example, when the effective length of the front vehicle is known, the initial estimate for the relative position of the back of the lead vehicle can be estimated based on GPS position data.
  • One way to determine whether a target matches is to quantify an uncertainty factor in association with the estimated position. If a radar target point is within the range of the uncertainty factor of the expected position, then it can be considered a match.
  • Kalman filtering is used to estimate the position of the back of the partner vehicle and to quantify the uncertainty. Kalman filtering is particularly appropriate because it inherently adjusts the uncertainty level based on the perceived accuracy of the measurements.
  • the closest radar object point identified in the radar scene is treated as the “matching” target.
  • the “closest” match may be selected based on a combination of metrics including longitudinal position, lateral position, relative speeds, etc.
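A combined-metric "closest match" of the kind described above can be sketched as a normalized distance over longitudinal position, lateral position, and relative speed, with an uncertainty gate that rejects everything too far from the expectation. The field names, scale factors, and gate value are illustrative assumptions.

```python
def match_score(target, expected, scales):
    """Combine the offsets in each compared metric into one normalized
    distance (smaller is better); `scales` sets the tolerance per metric."""
    return sum(((target[k] - expected[k]) / scales[k]) ** 2 for k in scales) ** 0.5

def best_match(targets, expected, scales, gate=1.0):
    """Return the radar target closest to the expected state, or None when
    nothing falls inside the uncertainty gate (the 'no match' branch)."""
    scored = [(match_score(t, expected, scales), t) for t in targets]
    scored = [(s, t) for s, t in scored if s <= gate]
    return min(scored, key=lambda st: st[0])[1] if scored else None
```

Widening the per-metric scales (or the gate) as estimation uncertainty grows reproduces the behavior described below, where each ignored sample slightly increases the tolerated variation.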
  • the radar tracker transmits the distance to the matched object and relative speed of the matched object to the gap controller 112 as the current gap to, and relative speed of, the back of the partner vehicle (step 506 ).
  • the only information transmitted is the longitudinal distance to the back of the trailer and its relative speed. This is because while currently available radar units are generally quite good at measuring distance and relative speed, they are not as good at precisely measuring lateral velocities or providing precise lateral position information regarding identified objects. However, if the radar unit used can accurately measure other useful attributes of the target such as lateral velocities, acceleration, etc.,—that information may optionally be transmitted as well.
  • the best matched target is used to update the radar tracking position and speed estimate for the back of the truck as well (step 508 ).
  • the position and speed estimate is then propagated in time to the position expected for the next radar sample in step 510 . That is, the logic estimates the expected position of the back of the truck at the time the next radar sample is expected. This is a relatively simple matter since the radar samples are provided at regular intervals so the timing of the next expected sample is easy to determine. For example, if the radar sample rate is 20 Hz, the next sample can be expected to occur 0.05 seconds after the last sample.
  • if the front and rear vehicles are traveling at exactly the same velocity and both vehicles are traveling in the same direction, then the "expected" position of the back of the front vehicle would be exactly the same as the last detected position of the back of the front vehicle.
  • in practice, the vehicles will be traveling at slightly different speeds and possibly in slightly different directions if one of the vehicles is turned or turning slightly relative to the other. For example, if the trailing vehicle is moving in exactly the same direction as the lead vehicle at a constant velocity of 1.00 meters per second faster than the lead vehicle, then the back of the lead vehicle would be expected to be 5 cm closer to the trailing vehicle at the time the next radar sample is taken (0.05 seconds after the last sample was taken).
  • Simple trigonometry may be used to determine the expected position if the vehicles are turned or turning slightly with respect to one another.
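The propagation step in the preceding bullets reduces, in the simplest constant-velocity case, to scaling the closing speed by the sample interval, and by the cosine of any heading offset for the simple-trigonometry case. A sketch with illustrative names and an assumed 20 Hz sample rate:

```python
import math

RADAR_SAMPLE_HZ = 20.0  # illustrative; yields the 0.05 s interval from the text

def propagate_gap(gap_m, closing_speed_mps, dt_s, heading_offset_deg=0.0):
    """Constant-velocity propagation of the longitudinal gap to the next radar
    sample. A heading offset between the vehicles scales the closure by the
    cosine of the offset (the simple-trigonometry case noted above)."""
    closure = closing_speed_mps * dt_s * math.cos(math.radians(heading_offset_deg))
    return gap_m - closure
```

With a 15 m gap, a 1.00 m/s closing speed, and a 0.05 s interval, this reproduces the 5 cm closure worked through in the example above.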
  • any number of other relevant variables that are known to or obtainable by the radar system controller can be considered in the calculation of the expected position and speed to further improve the estimates. These might include the respective accelerations (measured or estimated) of the vehicles, the respective directions of travel and/or rates of turn of the two vehicles, etc. Factors that may influence the velocity, acceleration or rate of turn of the vehicles such as the respective vehicles torque requests, the current grade, the vehicle weights, etc. may also be used to further refine the estimate.
  • the uncertainty estimate is updated as represented by block 512 as described in more detail below.
  • the process then returns to step 504, which utilizes the then current estimate of the position of the back of the lead vehicle to determine whether a match occurs.
  • the current estimate of the position of the lead vehicle can be expected to (indeed likely will) change over time.
  • the then current best estimate of the position of the back of front vehicle may be used which helps ensure that the partner vehicle is accurately tracked.
  • the platoon system preferably utilizes multiple independent or partially-independent mechanisms for tracking the position and speed of the respective vehicles.
  • the platoon controller may have access to GPS position data which provides an independent mechanism for determining the relative positions of the platooning vehicles.
  • the platoon controller may also have access to wheel speed data which provides an alternative mechanism for determining the respective speeds, and thus the relative speed of the platoon partners.
  • Such data for the host vehicle is available from the host vehicle sensors.
  • Data for the partner vehicles is available over the communications link (e.g. the DSRC link, a cellular link or any other available communication method).
  • the radar tracking position and speed estimate is updated using the current GPS position estimate (step 523 ), and that updated position and speed estimate is propagated in time to the expected receipt of the next radar sample as represented by step 510 .
  • the radar tracking position and speed estimate is updated using the current wheel speed estimates (step 533 ), and that updated position and speed estimate is propagated in time to the expected receipt of the next radar sample as represented by step 510 .
  • the GPS position, wheel speed and inertial measurements are preferably updated on a relatively rapid basis—which is often (although not necessarily) more frequent than the radar samples.
  • GPS update frequencies in the range of 25 to 500 Hz, as for example 50 Hz have been found to work well for open road platoon control applications.
  • Similar wheel speed and inertial measurement update frequencies have also been found to work well—although there is no need to update the GPS positions, wheel speed and/or inertial measurements at the same sample rate as each other, or at the same sample rate as the radar unit.
  • the updates from the radar unit, the GPS sensors, the wheel speed sensor and inertial measurements are handled asynchronously as they are received. Although not required, this is useful to help ensure that the latest sensor inputs are utilized in estimating the expected relative positions and speeds of the platooning vehicles at the time the next radar unit scene sample is received. This is contrasted with a system in which the wheel speed sensor and GPS sensor information is updated once each sample of the radar unit. Although synchronous updates can also work well, the use of asynchronous updates tends to improve the accuracy of the estimates because various sensor inputs can be updated more frequently than the radar unit sampling rate.
  • the same types of measurements on the different trucks are preferably synchronized in time. That is, GPS position measurements on the front truck are preferably synchronized in time with GPS position measurements on the back truck so that the relative positions of the trucks can be determined at a particular instant in time.
  • the wheel speed measurements on the front truck are preferably synchronized in time with wheel speed measurements on the back truck so that the relative speeds of the trucks can be determined at a particular instant in time.
  • the various inertial measurements are also preferably synchronized with each other as well.
  • the GPS system provides very accurate global timing signals.
  • the clocks used for the platoon partners can be synchronized with the GPS signals and the various measurements (e.g. GPS position measurements, wheel speed measurements, inertial measurements, etc.) can therefore be instructed to occur at specific synchronized times on the respective trucks.
  • Each measurement may also be accompanied by a timestamp that indicates when the measurement was taken so that the synchronization of the measurements can be verified (or accounted for if similar sensor measurements are not synchronized between vehicles).
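Timestamped measurements of the same type from the two trucks can then be paired by matching timestamps to within a small skew tolerance, dropping samples that have no sufficiently synchronized counterpart. A minimal sketch; the tolerance, data layout, and function name are assumptions.

```python
def pair_synchronized(host_samples, partner_samples, max_skew_s=0.005):
    """Pair (timestamp, value) samples from host and partner whose timestamps
    agree to within max_skew_s; unmatched samples are dropped rather than
    compared. Both lists are assumed sorted by timestamp."""
    pairs = []
    j = 0
    for t_h, v_h in host_samples:
        # Skip partner samples that are already too old to match this one.
        while j < len(partner_samples) and partner_samples[j][0] < t_h - max_skew_s:
            j += 1
        if j < len(partner_samples) and abs(partner_samples[j][0] - t_h) <= max_skew_s:
            pairs.append((t_h, v_h, partner_samples[j][1]))
    return pairs
```

Because the clocks are disciplined by GPS time, the skew tolerance can be kept very small; the timestamp check then serves mainly to verify that the scheduled synchronized sampling actually occurred.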
  • step 504 utilizes the then current estimate of the position of the back of the lead vehicle to determine whether any of the received radar sample object points (targets) match the expected position of the back of the partner vehicle. It should be appreciated that there may be times when no radar sample targets match the expected position of the back of the partner vehicle as represented by the “no” branch from decision 504 . In such cases the radar system controller still propagates the position estimate in time (step 510 ) so that the position estimate is updated for the next radar sample based on the other information the controller has. Such other information includes the then current estimates and may be further updated based on inputs from other systems (e.g., the GPS or wheel speed sensor) as previously discussed.
  • for example, if a vehicle hits a bump or pothole, the radar unit will be shaken accordingly and any radar measurement samples taken at that instant are less likely to be accurate and/or useful to the model.
  • Other sensors such as the wheel speed and inertial measurement sensor are less likely to be accurate at such times as well.
  • when the lead truck is aggressively braking, it is more likely that its trailer will move back and forth more than usual, which again suggests that any radar samples taken during such braking are less likely to be useful for predicting the future position of the back of the trailer.
  • if the controller detects, or is informed, that an event is occurring that makes the measurements of any particular sensor suspect, the measurements from such sensor(s) can safely be ignored in the context of the position estimate.
  • inputs from other sensors deemed more reliable (if any) may continue to be used to update the position model and the position estimate may continue to be propagated in time for each subsequent sample.
  • the uncertainty associated with the position estimate can be expected to increase slightly with each ignored sample, which has the effect of increasing the variation from the estimated position of the back of the partner vehicle that would be tolerated when determining whether there is a target that matches the expected position of the back of the partner vehicle.
  • the position model described above is relatively simple in that it utilizes a relatively small set of measured inputs including (1) the received radar scenes (which show the relative position and relative velocity of detected objects); (2) measured GPS positions of the platoon partners (which can be used to determine their relative positions); (3) measured wheel speeds of the platoon partners (which can be used to determine their relative speeds); and (4) measured yaw rate and orientation.
  • the position model can be adapted to utilize whatever relevant information is available to it in the position estimates. For example, if the pitch or roll of the vehicles are available, the position model can incorporate such measurements into the position estimates.
  • the roll can be useful because on trucks the GPS antennas tend to be located on top of the cabs at locations over 4 meters above the ground (e.g. 14-15 feet). At such heights, even relatively small tilting in the roll direction can cause the reported position of the respective vehicles to vary significantly.
  • the pitch can be useful for similar reasons. For example, with a platooning gap of 15 meters, a difference in pitch of just ±2 degrees can result in a difference of a meter in the apparent or detected height of an object. At further distances and/or larger pitch variations, those differences are amplified. Since many radar units used in platooning systems have relatively narrow views this can lead to expected objects not being detected, or detected objects being discarded, because they are further from the estimated position than expected when pitch is not considered. Similarly, if longitudinal and/or angular accelerations are available, the position model can incorporate the acceleration measurements into the position estimates.
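The pitch effect described above is simple trigonometry: the apparent vertical displacement of a target grows as the range times the tangent of the pitch difference. A quick check of the numbers in the bullet above (±2 degrees between the vehicles, i.e. roughly a 4 degree relative difference, at a 15 meter gap; the function name is illustrative):

```python
import math

def apparent_height_shift(gap_m, pitch_diff_deg):
    """Apparent vertical displacement of a target at range gap_m caused by a
    given pitch difference between the two vehicles."""
    return gap_m * math.tan(math.radians(pitch_diff_deg))
```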
  • the relative positioning and/or speed and/or orientation of the vehicles can relatively accurately be measured using other systems such as LIDAR, sonar, other time of flight distance sensors, sensors configured to receive a signal transmitted from another vehicle, cameras, stereo cameras or other appropriate technologies, those measurements can be incorporated into the position model in addition to, or in place of, the GPS, wheel speed and inertial measurements.
  • the position model can be considerably more sophisticated, using inputs such as torque requests, braking signals and/or other operational information about the respective platoon partners to further refine the predicted position at the time of the next radar sample.
  • the radar sample object points are compared to the estimated (expected) position and relative speed of the back of the partner vehicle.
  • more or fewer parameters can be compared to identify a match.
  • matches may be based on matching the expected position of the partner vehicle rather than position and speed/velocity. If the radar unit is capable of reliably reporting other information such as acceleration, rates of lateral movement, etc., then such information can also be compared to corresponding estimates as part of the match identification 504 .
  • a significant advantage of the described approach is that the relative position and velocity estimates can reliably continue even when the back of the platoon partner is outside the view of the radar unit—as may sometimes be the case when the lead vehicle changes to a different lane, an interloper cuts in between the platooning vehicles, or a transitory fault occurs with the radar unit.
  • radar identification of the platoon partner can more easily be reestablished when the back of the platoon partner comes back into the radar unit's view.
  • this is very different than adaptive cruise control systems that utilize radar only to track the distance to the vehicle directly in front of the host vehicle—regardless of who that leading vehicle may be.
  • histogram and/or mean shift clusters described above with respect to FIG. 5 can be used as another check to verify that the correct vehicle is being tracked by the radar unit or to provide a reference point when some, but not all of the truck is within the radar unit's field of view.
  • a noteworthy feature of the method described with respect to FIG. 7 is that the same algorithm(s) can be used to estimate the relative position/velocity of the partner vehicle during the initial radar identification of the partner vehicle as described above with respect to FIG. 2 .
  • during the initial identification process, the radar tracker 116/600 would not have a good estimate of the position of the back of the partner vehicle.
  • no target would match the expected position of the back of the partner vehicle at decision point 504 so no measured position would be reported to the gap controller and the radar unit's measurements would not be used to update the position and speed estimates—thereby following the “no” branch from decision point 504 which causes steps 506 and 508 to be skipped.
  • the other available sensors including the GPS sensors 131 , the wheel speed sensors 132 and inertial measurement sensors 134 all provide their respective measurements, which provides a reasonable estimate of the position of the vehicle suitable for use in the initial identification of the partner vehicle.
  • Kalman filtering is intended to encompass linear quadratic estimation (LQE) as well as extensions and generalizations of LQE such as the extended Kalman filter and the unscented Kalman filter which are designed to work with nonlinear systems.
  • Kalman filtering uses a series of measurements observed over time containing noise and other inaccuracies and produces estimates of unknown variables that tend to be more precise than those based on a single measurement alone.
  • the Kalman filter keeps track of the estimated state of the system and the variance or uncertainty of the estimate. This is particularly well suited for estimating the position, speed and other state information related to gap control because of the errors inherent in some of the measurements and the potential unavailability at times of some of the desired measurement samples.
  • state array (X) suitable for use in some of the described embodiments that involve a pair of platooning tractor-trailer trucks includes:
  • the estimated state at the time of the next radar sample (X_k+1) is a function of the previous state (X_k) and a covariance matrix (P_k) indicative of the level of uncertainty in the measurements.
  • a covariance matrix corresponding to the state array (X) represented above is illustrated in FIG. 8 .
  • the estimated state at the time of the next radar sample (X_k+1) is equal to the product of a state transition model (A) and the previous state (X_k), plus the product of a control input model (B) and any modeled inputs (u_k). This can be represented mathematically as follows: X_k+1 = A·X_k + B·u_k.
  • One particular control input array (U) includes:
  • Kalman filtering is particularly well adapted to making the types of position and velocity estimations useful in the techniques described herein. Although Kalman filtering works particularly well, it should be appreciated that other state/space estimation algorithms, such as Particle Filtering, etc. can be used in alternative embodiments.
  • one reason Kalman filtering works well is that most of the measurements, including the GPS measurements, the radar measurements, the wheel speed measurements and the inertial measurements, tend to be subject to varying measurement errors. For example, it is not uncommon for any particular GPS measurement to be off by more than a meter.
  • the covariance matrix (P_k) quantifies the statistical variation (error) observed in the measurements and utilizes that knowledge to improve the quality of the position and speed estimates.
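The predict/update cycle described in the Kalman filtering bullets can be illustrated with a deliberately minimal two-state filter over [gap, closing rate]. The patent's actual state array and covariance matrix are far larger (FIG. 8); the noise values, class name, and gap-only measurement here are illustrative assumptions.

```python
class GapKalman1D:
    """Minimal constant-velocity Kalman filter over state [gap, closing_rate].
    A sketch only; the real system fuses radar, GPS, wheel speed, and
    inertial inputs over a much richer state."""

    def __init__(self, gap, rate, var=4.0):
        self.x = [gap, rate]
        self.p = [[var, 0.0], [0.0, var]]  # covariance P_k

    def predict(self, dt, q=0.01):
        g, r = self.x
        self.x = [g - r * dt, r]           # X_k+1 = A X_k: closing rate shrinks the gap
        a = [[1.0, -dt], [0.0, 1.0]]
        p = self.p                          # P_k+1 = A P A^T + Q
        ap = [[a[i][0] * p[0][j] + a[i][1] * p[1][j] for j in range(2)] for i in range(2)]
        self.p = [[ap[i][0] * a[j][0] + ap[i][1] * a[j][1] + (q if i == j else 0.0)
                   for j in range(2)] for i in range(2)]

    def update_gap(self, measured_gap, meas_var=0.04):
        innovation = measured_gap - self.x[0]
        s = self.p[0][0] + meas_var
        k = [self.p[0][0] / s, self.p[1][0] / s]  # Kalman gain for a gap-only measurement
        self.x = [self.x[0] + k[0] * innovation, self.x[1] + k[1] * innovation]
        self.p = [[(1 - k[0]) * self.p[0][0], (1 - k[0]) * self.p[0][1]],
                  [self.p[1][0] - k[1] * self.p[0][0], self.p[1][1] - k[1] * self.p[0][1]]]
```

Running `predict` at the radar sample rate and calling `update_gap` only when a matched radar return arrives reproduces the propagate-then-correct loop of FIG. 7; skipping the update for a suspect sample simply lets the covariance, and hence the tolerated match variation, grow.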
  • information about the state of the partner vehicle that is received from the partner vehicle is used by the host to help verify or confirm that data from a sensor on the host vehicle that is believed to measure a characteristic of the partner vehicle is actually representative of the partner vehicle.
  • information from a lead vehicle about its position, speed, orientation etc. is used by a radar scene processor on the trailing vehicle to predict an expected position and speed of the lead vehicle. Those predictions are then used to help determine which (if any) of the detected radar objects correspond to the lead vehicle.
  • the state information received from the lead vehicle may be a measured value (such as a measured wheel speed) or a predicted value (such as a predicted speed), which may be even more reliable in circumstances in which the parameter (e.g., speed) is changing.
  • partner vehicle state information such as the partner vehicle's: current torque request; braking status (including the status of the foundation brakes, a retarder, engine braking and/or any other braking device in the context of larger trucks); or steering angle.
  • the information can also include a status indicator such as an indication that a blinker, the hazard lights, the taillights or other lights are on.
  • it can also include qualitative information about the partner vehicle such as its radar signature, or its visual appearance (e.g. its color, an identifying marker, or some other feature or characteristic that can be readily identified by one of the controllers on the host vehicle).
  • an intended or expected action such as notification that the lead vehicle is about to change lanes, will take the next exit or turn at the next intersection.
  • the host vehicle may request that the partner vehicle take specific actions to help with such identification.
  • the nature of such a request may vary widely—for example, the rear truck may request that the lead truck turn on specific lights, switch lanes, accelerate or decelerate to a specific speed, honk its horn, etc.
  • additional information about the partner vehicle can also be obtained from a third vehicle, a larger mesh of vehicles or from another external source.
  • a third vehicle travelling in parallel with the platoon partners may have measured the position, velocity and/or other characteristics of the partner vehicle and that information can be used as another independent check.
  • a network operations center (NOC) in communication with both platoon partners may know the intended route and communicate that route, or more short term directions to the host vehicle as appropriate.
  • information from the partner vehicle may be transmitted via an intermediary such as a third vehicle, a NOC, etc. Any of this type of data can be useful—and some of the information may be particularly helpful in circumstances in which communications between the vehicles are temporarily lost.
  • the described radar based vehicle identification and tracking can be used in any type of connected vehicle application in which independent information about the position and/or velocity of one or more other vehicles is known or available to the unit interpreting the radar data.
  • the described techniques are particularly well suited for use in convoying systems involving more than two vehicles.
  • the described techniques are very well adapted for use in autonomous vehicle traffic flow applications where knowledge about the intentions of other specific vehicles is deemed important. Indeed, this is expected to be an important application of the inventions with the growth of the autonomous and connected vehicle markets.
  • the inventions have been described primarily in the context of identifying and tracking other vehicles using commercially available radar units designed for use in driving automation systems. Such units are typically designed to analyze the received radar energy and identify objects that are believed by the radar manufacturer to be relevant. Although the described inventions work well with such units, they are not so constrained. Rather, both the vehicle identification and vehicle tracking processes are well suited for use with radar units that don't filter the response as much and report the reflected radar signal intensities in a more general way rather than attempting to identify particular objects. In particular, the statistical nature of the radar return binning and the back of vehicle detection are quite well suited for using radar data provided in other forms such as intensity/location. Furthermore, the invention is not limited to distance measurement systems using electromagnetic energy in the frequency range of radar.
  • target vehicle identification and/or tracking techniques may readily be used in conjunction with other electromagnetic energy based distance measuring technologies such as LIDAR which utilize electromagnetic energy in different frequency ranges, sound based distance measurement (e.g., sonar, ultrasound, etc.) or various time of flight based distance measuring systems.
  • the described techniques can also be used in conjunction with distance measuring techniques using cameras or stereo cameras, beacon based technologies in which the sensor measures a beacon signal transmitted from the partner vehicle and/or other technologies.
  • the platooning vehicles may have mechanisms such as transponders suitable for identifying themselves to the radar unit. When available, information from such devices can be used to further assist with the identification and tracking of the platoon partner.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)

Abstract

A variety of methods, controllers and algorithms are described for identifying the back of a particular vehicle (e.g., a platoon partner) in a set of distance measurement scenes and/or for tracking the back of such a vehicle. The described techniques can be used in conjunction with a variety of different distance measuring technologies including radar, LIDAR, camera based distance measuring units and others. The described approaches are well suited for use in vehicle platooning and/or vehicle convoying systems including tractor-trailer truck platooning applications.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation-in-part of PCT Application No. PCT/US2016/060167, filed on Nov. 2, 2016, which claims priority of U.S. Provisional Patent Application No. 62/249,898, filed on Nov. 2, 2015, both of which are incorporated herein by reference in their entirety.
  • BACKGROUND
  • The present invention relates generally to systems and methods for enabling vehicles to closely follow one another safely using automatic or partially automatic control.
  • In recent years significant strides have been made in the fields of autonomous and semi-autonomous vehicles. One segment of vehicle automation relates to vehicular convoying systems that enable vehicles to follow closely together in a safe, efficient and convenient manner. Following closely behind another vehicle has significant fuel savings benefits, but is generally unsafe when done manually by the driver. One type of vehicle convoying system is the vehicle platooning system, in which a second, and potentially additional, vehicle(s) is/are autonomously or semi-autonomously controlled to closely follow a lead vehicle in a safe manner.
  • In vehicle platooning and convoying systems an understanding of the distance between the vehicles is a very important control parameter and multiple different independent mechanisms may be used to determine the distance between vehicles. These may include radar systems, transmitting absolute or relative position data between vehicles (e.g., GPS or other GNSS data), LIDAR systems, cameras, etc. A challenge that occurs when using radar in platooning type applications is that the partner vehicle must be reliably identified from a potentially ambiguous set of radar reflections and tracked under constantly changing conditions. The present application describes techniques for identifying and tracking specific vehicles based on vehicle radar data that are well suited for platooning, convoying and other autonomous or semi-autonomous driving applications.
  • SUMMARY
  • A variety of methods, controllers and algorithms are described for identifying the back of a particular vehicle (e.g., a platoon partner) in a set of distance measurement scenes and/or for tracking the back of such a vehicle. The described techniques can be used in conjunction with a variety of different distance measuring technologies including radar, LIDAR, sonar units or any other time-of-flight distance measuring sensors, camera based distance measuring units, and others.
  • In one aspect, a radar (or other distance measurement) scene is received and first vehicle point candidates are identified at least in part by comparing the relative position of the respective detected objects that they represent, and in some circumstances the relative velocity of such detected objects, to an estimated position (and relative velocity) for the first vehicle. The first vehicle point candidates are categorized based on the respective distances of the detected objects that they represent from the estimated position of the first vehicle. The categorization is repeated for a multiplicity of samples so that the categorized first vehicle point candidates include candidates from multiple sequential samples. The back of the first vehicle is then identified based at least in part on the categorization of the first vehicle point candidates. The identified back of the first vehicle, or an effective vehicle length that is determined based at least in part on the identified back of the first vehicle, may then be used in the control of the second vehicle.
  • In some embodiments, a bounding box is conceptually applied around the estimated position of the first vehicle and measurement system object points that are not located within the bounding box are not considered first vehicle point candidates. In some embodiments, the bounding box defines a region that exceeds a maximum expected size of the first vehicle.
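A minimal sketch of such a bounding-box screen, under assumed geometry: the box dimensions, the point format and the specific numbers below are hypothetical; a real system would size the box to exceed the largest expected partner vehicle.

```python
def in_bounding_box(point, est_pos, length_m=25.0, half_width_m=2.0):
    """point and est_pos are (longitudinal, lateral) offsets in metres,
    measured from the host vehicle's radar unit. The box is deliberately
    larger than any expected vehicle so a valid point is never excluded."""
    dx = point[0] - est_pos[0]   # longitudinal offset from estimated position
    dy = point[1] - est_pos[1]   # lateral offset from estimated position
    return -length_m <= dx <= length_m and -half_width_m <= dy <= half_width_m

# Keep only the radar object points that could plausibly be the partner:
# the second point is in an adjacent lane, the third is far up the road.
candidates = [p for p in [(14.9, 0.1), (15.2, -3.5), (60.0, 0.0)]
              if in_bounding_box(p, est_pos=(15.0, 0.0))]
```

Only points surviving this screen are treated as first vehicle point candidates for the later categorization steps.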
  • In some embodiments, the relative velocity of the vehicles is estimated together with an associated speed uncertainty. In such embodiments, object points within the set of detected object points that are moving at a relative speed that is not within the speed uncertainty of the estimated speed are not considered first vehicle point candidates.
  • In some embodiments, categorizing the first vehicle point candidates includes populating a histogram with the first vehicle point candidates. The histogram includes a plurality of bins, with each bin representing a longitudinal distance range relative to the estimated position of the first vehicle. In such embodiments, the identification of the back of the first vehicle may be done after the histogram contains at least a predetermined number of first vehicle point candidates. In some embodiments, a clustering algorithm (as, for example, a modified mean shift algorithm) is applied to the first vehicle point candidates to identify one or more clusters of first vehicle point candidates. In such embodiments, the cluster located closest to the second vehicle that includes at least a predetermined threshold percentage or number of first vehicle radar point candidates may be selected to represent the back of the first vehicle.
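By way of illustration, the histogram-and-cluster selection described above might be sketched as follows. The bin width, the count threshold, the simple adjacent-bin clustering (standing in for a modified mean shift) and the example distances are all assumptions made for this sketch.

```python
from collections import Counter

def back_of_vehicle(offsets, bin_m=0.5, min_count=5):
    """offsets: longitudinal distances (m) from the trailing vehicle's radar
    to candidate points, pooled across many sequential samples. Returns the
    distance of the nearest sufficiently-populated cluster, or None."""
    bins = Counter(round(d / bin_m) for d in offsets)   # histogram of candidates

    # Merge adjacent populated bins into 1-D clusters.
    clusters, current = [], []
    for b in sorted(bins):
        if current and b - current[-1] > 1:
            clusters.append(current)
            current = []
        current.append(b)
    if current:
        clusters.append(current)

    # Pick the cluster closest to the trailing vehicle that holds enough
    # candidates; sparse clusters (noise, interlopers) are skipped.
    for cluster in sorted(clusters, key=min):
        count = sum(bins[b] for b in cluster)
        if count >= min_count:
            center = sum(b * bins[b] for b in cluster) / count
            return center * bin_m
    return None

# Nine points cluster near 15 m (the back of the trailer); a two-point
# smudge near 3 m and a stray return at 22 m do not meet the threshold.
offsets = [3.0, 3.1] + [14.8] * 3 + [15.0] * 4 + [15.2] * 2 + [22.0]
back = back_of_vehicle(offsets)
```

The closest cluster is only accepted once it contains enough pooled candidates, which is why the categorization must run over multiple sequential samples before the back of the vehicle is declared.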
  • In some embodiments, Kalman filtering is used to estimate the position of the first vehicle.
  • In another aspect, methods of tracking a specific lead vehicle using a distance measuring unit mounted on a trailing vehicle are described. In this embodiment, a current radar (or other distance measurement) sample is obtained from a radar (or other distance measurement) unit. The current distance measurement sample includes a set of zero or more object points. In parallel, a current estimate of a state of the lead vehicle corresponding to the current sample is obtained. The current state estimate includes one or more state parameters, which may include (but are not limited to) a position parameter (such as the current relative position of the lead vehicle), a speed parameter (such as a current relative velocity of the lead vehicle) and/or other position and/or orientation related parameters.
  • The current estimate of the state of the lead vehicle has an associated state uncertainty and does not take into account any information from the current distance measurement sample. A determination is made regarding whether any of the object points match the estimated state of the lead vehicle within the state uncertainty. If so, the matching object point that best matches the estimated state of the lead vehicle is selected as a measured state of the lead vehicle. That measured state of the lead vehicle is then used in the determination of a sequentially next estimate of the state of the lead vehicle corresponding to a sequentially next sample. The foregoing steps are repeated a multiplicity of times to thereby track the lead vehicle. The measured states of the lead vehicle may be used in the control of one or both of the vehicles—as for example, in the context of vehicle platooning or convoying systems, in the at least partially automatic control of the trailing vehicle to maintain a desired gap between the lead vehicle and the trailing vehicle.
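The per-sample matching step in this loop can be illustrated with the following sketch. The point format, the rectangular gate and the scoring rule are assumptions for the example; the idea is simply that only points inside the state uncertainty are considered, and the point closest to the predicted state wins.

```python
def select_matching_point(points, est_pos, est_vel, pos_tol, vel_tol):
    """points: list of (longitudinal_pos, relative_vel) radar object points.
    Returns the point that best matches the estimated state inside the
    uncertainty gate, or None if no point matches."""
    best, best_score = None, None
    for pos, vel in points:
        if abs(pos - est_pos) > pos_tol or abs(vel - est_vel) > vel_tol:
            continue  # outside the state uncertainty: cannot be the lead vehicle
        # Normalized distance from the predicted state; smaller is better.
        score = abs(pos - est_pos) / pos_tol + abs(vel - est_vel) / vel_tol
        if best_score is None or score < best_score:
            best, best_score = (pos, vel), score
    return best

# One point is far up the road (gated out); of the two nearby returns,
# the one closer to the predicted position and velocity is selected.
measurement = select_matching_point(
    [(25.0, -0.1), (14.9, 0.3), (15.4, 0.0)],
    est_pos=15.0, est_vel=0.2, pos_tol=1.0, vel_tol=0.5)
```

A `None` result corresponds to the "no matching point" case in the text, in which the tracker skips the measurement update for that sample.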
  • In some embodiments, each sample indicates, for each of the object points, a position of a detected object corresponding to such object point (relative to the distance measuring unit). Each current estimate of the state of the lead vehicle includes a current estimate of the (relative) position of the lead vehicle and has an associated position uncertainty. To be considered a valid measurement, the selected matching object point must match the estimated position of the lead vehicle within the position uncertainty. In some implementations, the current estimate of the position of the lead vehicle estimates the current position of a back of the lead vehicle.
  • In some implementations, each sample indicates, for each of the object points, a relative velocity of a detected object corresponding to such object point (relative to the distance measuring unit). Each current estimate of the state of the lead vehicle includes a current estimate of the relative velocity of the lead vehicle and has an associated velocity uncertainty. To be considered a valid measurement, the selected matching object point must match the estimated relative velocity of the lead vehicle within the velocity uncertainty.
  • In some embodiments, when none of the radar object points in a particular distance measurement sample match the estimated state of the lead vehicle within the state uncertainty, then the state uncertainty is increased for the sequentially next estimate of the state of the lead vehicle.
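One simple way to realize this behavior is sketched below; the growth factor, the ceiling and the reset-on-match policy are illustrative assumptions, not values from this application.

```python
def next_gate(pos_tol, matched, base=1.0, growth=1.3, ceiling=6.0):
    """Return the position gate (m) to use for the next sample. A missed
    sample widens the gate so the tracker can reacquire the lead vehicle;
    a matched sample tightens it back to the nominal uncertainty."""
    if matched:
        return base                          # reset to nominal uncertainty
    return min(pos_tol * growth, ceiling)    # widen gate after a miss, capped
```

Capping the growth keeps the gate from eventually admitting vehicles in adjacent lanes after a long string of missed samples.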
  • In some embodiments, global navigation satellite system (GNSS) position updates are periodically received based at least in part on detected GNSS positions of the lead and trailing vehicles. Each time a vehicle GNSS position update is received, the estimated state of the lead vehicle and the state uncertainty are updated based on such position update.
  • In some embodiments vehicle speed updates are periodically received based at least in part on detected wheel speeds of the lead and trailing vehicles. Each time a vehicle speed update is received, the estimated state of the lead vehicle and the state uncertainty are updated based on such lead vehicle speed update.
  • The described approaches are well suited for use in vehicle platooning and/or vehicle convoying systems including tractor-trailer truck platooning applications.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention, and the advantages thereof, may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
  • FIG. 1 is a block diagram of a representative platooning control architecture.
  • FIG. 2 is a flow chart illustrating a method of determining the effective length of a platoon partner based on outputs of a radar unit.
  • FIG. 3 is a diagrammatic illustration showing the nature of a bounding box relative to a partner vehicle's expected position.
  • FIG. 4A is a diagrammatic illustration showing exemplary radar object points that might be identified by a radar unit associated with a trailing truck that is following directly behind a lead truck.
  • FIG. 4B is a diagrammatic illustration showing a circumstance where the entire lead truck of FIG. 4A is not within the radar unit's field of view.
  • FIG. 4C is a diagrammatic illustration showing a circumstance where the bounding box associated with the lead truck of FIG. 4A is not entirely within the radar unit's field of view.
  • FIG. 4D is a diagrammatic illustration showing a circumstance where the lead truck is in a different lane than the trailing truck, but its entire bounding box is within the radar unit's field of view.
  • FIG. 5A is a graph that illustrates the relative location (longitudinally and laterally) of a first representative set of partner vehicle radar point candidates that might be detected when following a tractor-trailer rig.
  • FIG. 5B is a histogram representing the longitudinal distances of the detected partner vehicle radar point candidates illustrated in FIG. 5A.
  • FIG. 5C is a plot showing the mean shift centers of the histogram points represented in FIG. 5B.
  • FIG. 5D is a graph that illustrates the relative location (longitudinally and laterally) of a second (enlarged) set of partner vehicle radar point candidates that might be detected when following a tractor-trailer rig.
  • FIG. 5E is a histogram representing the longitudinal distances of the detected partner vehicle radar point candidates illustrated in FIG. 5D.
  • FIG. 5F is a plot showing the mean shift centers of the histogram points represented in FIG. 5E.
  • FIG. 6 is a diagrammatic block diagram of a radar scene processor suitable for use by a vehicle controller to interpret received radar scenes.
  • FIG. 7 is a flow chart illustrating a method of determining whether any particular radar scene reports the position of the back of a partner vehicle and updating the estimator of FIG. 6.
  • FIG. 8 is a representation of a Kalman filter state array and covariance matrix suitable for use in some embodiments.
  • In the drawings, like reference numerals are sometimes used to designate like structural elements. It should also be appreciated that the depictions in the figures are diagrammatic and not to scale.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The Applicant has proposed various vehicle platooning systems in which a second, and potentially additional, vehicle(s) is/are autonomously or semi-autonomously controlled to closely follow a lead vehicle in a safe manner. By way of example, U.S. application Ser. Nos. 13/542,622, 13/542,627 and 14/292,583; U.S. Provisional Application Nos. 61/505,076, 62/249,898, 62/343,819 and 62/377,970; and PCT Application Nos. PCT/US2014/030770, PCT/US2016/049143 and PCT/US2016/060167 describe various vehicle platooning systems in which a trailing vehicle is at least partially automatically controlled to closely follow a designated lead vehicle. Each of these earlier applications is incorporated herein by reference.
  • One of the goals of platooning is typically to maintain a desired longitudinal distance between the platooning vehicles, which is frequently referred to herein as the “desired gap”. That is, it is desirable for the trailing vehicle (e.g., a trailing truck) to maintain a designated gap relative to a specific vehicle (e.g., a lead truck). The vehicles involved in a platoon will typically have sophisticated control systems suitable for initiating a platoon, maintaining the gap under a wide variety of different driving conditions, and gracefully dissolving the platoon as appropriate.
  • The architecture and design of control systems suitable for implementing vehicle platooning may vary widely. By way of example, FIG. 1 diagrammatically illustrates a vehicle control architecture that is suitable for use with platooning tractor-trailer trucks. In the illustrated embodiment a platoon controller 110 receives inputs from a number of sensors 130 on the tractor and/or one or more trailers or other connected units, and a number of actuators and actuator controllers 150 arranged to control operation of the tractor's powertrain and other vehicle systems. An actuator interface (not shown) may be provided to facilitate communications between the platoon controller 110 and the actuator controllers 150. The platoon controller 110 also interacts with an inter-vehicle communications controller 170 which orchestrates communications with the platoon partner and a NOC communications controller 180 that orchestrates communications with a network operations center (NOC). The vehicle also preferably has selected configuration files that include known information about the vehicle.
  • Some of the functional components of the platoon controller 110 include gap regulator 112, mass estimator 114, radar tracker 116 and brake health monitor 118. In many applications, the platoon controller 110 will include a variety of other components as well.
  • Some of the sensors utilized by the platoon controller 110 may include GNSS (GPS) unit 131, wheel speed sensors 132, inertial measurement devices 134, radar unit 137, LIDAR unit 138, cameras 139, accelerator pedal position sensor 141, steering wheel position sensor 142, brake pedal position sensor 143, and various accelerometers. Of course, not all of these sensors will be available on all vehicles involved in a platoon and not all of these sensors are required in any particular embodiment. A variety of other sensors (now existing or later developed or commercially deployed) may additionally or alternatively be utilized by the platoon controller in other embodiments. In the primary embodiments described herein, GPS position data is used. However, GPS is just one of the currently available global navigation satellite systems (GNSS). Therefore, it should be appreciated that data from any other GNSS system or from other suitable position sensing systems may be used in place of, or in addition to, the GPS system.
  • Many (but not all) of the described sensors, including wheel speed sensors 132, radar unit 137, accelerator pedal position sensor 141, steering wheel position sensor 142, brake pedal position sensor 143, and accelerometer 144 are relatively standard equipment on newer trucks (tractors) used to pull semi-trailers. However, others, such as the GNSS unit 131 and LIDAR unit 138 (if used) are not currently standard equipment on such tractors or may not be present on a particular vehicle and may be installed as needed or desired to help support platooning.
  • Some of the vehicle actuator controllers 150 that the platoon controller directs at least in part include torque request controller 152 (which may be integrated in an ECU or power train controller), transmission controller 154, brake controller 156 and clutch controller 158.
  • The communications between vehicles may be directed over any suitable channel and may be coordinated by inter-vehicle communications controller 170. By way of example, the Dedicated Short Range Communications (DSRC) protocol (e.g. the IEEE 802.11p protocol), which is a two-way short to medium range wireless communications technology that has been developed for vehicle to vehicle communications, works well. Of course other communications protocols and channels may be used in addition to or in place of a DSRC link. For example, the inter-vehicle communications may additionally or alternatively be transmitted over a Citizen's Band (CB) Radio channel, one or more General Mobile Radio Service (GMRS) bands, one or more Family Radio Service (FRS) bands, or any other now existing or later developed communications channels using any suitable communication protocol.
  • The specific information transmitted back and forth between the vehicles may vary widely based on the needs of the platoon controller. In various embodiments, the transmitted information may include the current commands generated by the platoon controller, such as requested/commanded engine torque and requested/commanded braking deceleration. It may also include steering commands, gear commands, etc. when those aspects are controlled by the platoon controller. Corresponding information is received from the partner vehicle, regardless of whether those commands are generated by a platoon controller or another autonomous or semi-autonomous controller on the partner vehicle (e.g., an adaptive cruise control system (ACC) or a collision mitigation system (CMS)), or through other or more traditional mechanisms—as for example, in response to driver inputs (e.g., accelerator pedal position, brake position, steering wheel position, etc.).
  • In many embodiments, much or all of the tractor sensor information provided to the platoon controller is also transmitted to the platoon partner, and corresponding information is received from the platoon partner, so that the platoon controllers on each vehicle can develop an accurate model of what the partner vehicle is doing. The same is true for any other relevant information that is provided to the platoon controller, including any vehicle configuration information that is relevant to the platoon controller. It should be appreciated that the specific information transmitted may vary widely based on the requirements of the platoon controllers, the sensors and actuators available on the respective vehicles, and the specific knowledge that each vehicle may have about itself.
  • The information transmitted between vehicles may also include information about intended future actions. For example, if the lead vehicle knows it is approaching a hill, it may expect to increase its torque request (or decrease its torque request in the context of a downhill) in the near future, and that information can be conveyed to a trailing vehicle for use as appropriate by the platoon controller. Of course, there is a wide variety of other information that can be used to foresee future torque or braking requests, and that information can be conveyed in a variety of different forms. In some embodiments, the nature of the expected events themselves can be indicated (e.g., a hill, curve or exit is approaching) together with the expected timing of such events. In other embodiments, the intended future actions can be reported in the context of expected control commands, such as the expected torques and/or other control parameters and the timing at which such changes are expected. Of course, there are a wide variety of different types of expected events that may be relevant to the platoon control.
  • The communications between the vehicles and the NOC may be transmitted over a variety of different networks, such as the cellular network, various Wi-Fi networks, satellite communications networks and/or any of a variety of other networks as appropriate. The communications with the NOC may be coordinated by NOC communications controller 180. The information transmitted to and/or received from the NOC may vary widely based on the overall system design. In some circumstances, the NOC may provide specific control parameters such as a target gap tolerance. These control parameters or constraints may be based on factors known at the NOC such as speed limits, the nature of the road/terrain (e.g., hilly vs. flat, winding vs. straight, etc.), weather conditions, traffic or road conditions, etc. In other circumstances the NOC may simply provide such information to the platoon controller. The NOC may also provide information about the partner vehicle, including its configuration information and any known relevant information about its current operational state such as weight, trailer length, etc.
  • Radar Tracking
  • The vehicles involved in a platoon will typically have one or more radar systems that are used to detect nearby objects. Since radar systems tend to be quite good at determining distances between objects, separation distances reported by the radar unit(s) are quite useful in controlling the gap between vehicles. Therefore, once a platooning partner is identified, it is important to locate that specific partner vehicle in the context of the radar system's output. That is, it is necessary to determine which (if any) of the various objects identified by the radar unit correspond to the targeted partner.
  • Preliminarily, it should be appreciated that the platoon partner will not always correlate to the closest vehicle detected by the radar unit or to the vehicle that is directly in front of the trailing truck. There are a wide variety of different scenarios that can cause this to be the case. For example, when the platoon is initially being set up, the partner may be out of sight of a host vehicle's radar unit because it is too far away. As the partner comes into sight of the radar unit, it becomes important to identify and distinguish that partner from other objects in the radar unit's field of view. The description below describes techniques that are particularly well suited for identifying and distinguishing a designated partner from other objects that may be detected by a radar unit so that the radar unit can effectively track the partner vehicle (sometimes referred to as “locking onto” the partner).
  • Furthermore, during the course of driving, there will be traffic in adjacent lanes traveling beside, passing or being passed by the platoon, and it is important for the radar unit to be able to continue to differentiate the platoon partner from passing vehicles so that the gap controller doesn't start trying to maintain the gap from the wrong vehicle. In another example, a lead truck may change lanes, at which point it may not be directly in front of the trailing vehicle, so again, it is important that the distance between the platoon partners reported by the radar unit be associated with the platoon partner rather than merely the closest vehicle or a vehicle that happens to be directly in front of the trailing truck. There may also be times when the radar unit may not be able to “see” the platooning partner. This could be because an interloper has gotten between the platoon partners, because the lead vehicle has maneuvered out of view of the trailing vehicle's radar unit, because of interference with the radar signals, etc.
  • For platoon control purposes, it is also important to understand where the back of the vehicle is relative to the vehicle's reported position. To elaborate, the position of the partner vehicle is generally known from the GPS based location information that is transmitted to the host vehicle. However, the GPS system typically reports a location on the tractor, which could, for example, be the position of the antenna(s) that receive the GPS signals. The detected GPS position may then be translated to the position of a reference location on the vehicle that is a known distance from the GPS antenna, with the position of that reference location serving as the vehicle's reported GPS position. The specific reference location chosen may vary based on control system preferences. By way of example, in some tractor-trailer truck platooning embodiments, the reference location may be the center of the rear axles of the tractor.
  • The difference between the reported GPS position and the physical back of the vehicle can be significant to the platoon control. Therefore, it is often important to know the distance between the reported vehicle position and the actual back of the vehicle. This is sometimes referred to herein as the “effective vehicle length.” The effective vehicle length is particularly important in the context of a tractor trailer truck where the reported GPS position is typically located somewhere on the cab (tractor) and the distance from the reported GPS position to the back of the trailer may be quite long. By way of example, trailer lengths on the order of 12-18 meters are common in the U.S., although they can be shorter or longer (indeed much longer in the context of double or triple trailers). The distance from the reported GPS position to the back of the vehicle must also account for the longitudinal distance from the reported GPS position to the front of the trailer and/or any extensions associated with the load. It should be appreciated that in the trucking industry, the effective vehicle length often will not be known since any particular tractor may pull a variety of different trailers and the attachment point between the tractor and trailer is adjustable on the tractor.
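  • By way of illustration, the translation from a raw antenna fix to the chosen reference location, from which the effective vehicle length is then measured, might be sketched as follows. This is a flat-earth approximation only; the function name, the heading convention (0 degrees = north, increasing clockwise) and the 5 meter offset in the usage example are assumptions rather than part of the described embodiments.

```python
import math

def translate_gps_to_reference(lat_deg, lon_deg, heading_deg, antenna_to_ref_m):
    """Shift a raw GPS antenna fix backwards along the vehicle's longitudinal
    axis to a chosen reference location (e.g., the center of the tractor's
    rear axles). Flat-earth approximation, valid over a few meters.
    Assumed heading convention: 0 deg = north, increasing clockwise."""
    earth_radius_m = 6371000.0
    heading_rad = math.radians(heading_deg)
    # Move backwards (opposite the heading) by the antenna-to-reference offset.
    d_north_m = -antenna_to_ref_m * math.cos(heading_rad)
    d_east_m = -antenna_to_ref_m * math.sin(heading_rad)
    d_lat_deg = math.degrees(d_north_m / earth_radius_m)
    d_lon_deg = math.degrees(
        d_east_m / (earth_radius_m * math.cos(math.radians(lat_deg))))
    return lat_deg + d_lat_deg, lon_deg + d_lon_deg
```

  Under this convention, the effective vehicle length is then the longitudinal distance from the translated reference position to the physical back of the trailer, which, as described below, may be estimated from the radar data itself.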
  • Establishing a Radar Fix on a Platoon Partner
  • As will be apparent from the discussion above, a challenge that occurs when using radar in platooning type applications is that the partner vehicle must initially be found and identified in the context of the radar system's output and thereafter reliably tracked under constantly changing conditions. In applications such as the trucking industry, it is also desirable to determine the effective length of at least the lead vehicle.
  • Commercially available radar units used in general road vehicle driving automation systems typically output data that indicates the presence of any object(s) detected within a designated field together with the relative position and speed of such object(s). Thus, during driving, such a radar unit may detect the presence of a variety of objects within its operational field. The detected objects may include any vehicle positioned directly in front of the host vehicle, vehicles in adjacent lanes that may be passing, being passed by or driving in parallel to the platoon, stationary objects such as obstacles in the road, signs, trees, and other objects to the side of the road, etc. Although many different types of objects may be detected, the radar unit itself typically doesn't know or convey the identity or nature of the detected object. Rather it simply reports the relative position and motion of any and all perceived objects within its operational field. Therefore, to identify and track the partner vehicle in the context of the radar unit output, it is helpful for the logic interpreting the output of the radar unit to have and maintain a good understanding of exactly where the partner vehicle is expected to be relative to the radar unit's field of view regardless of whether the partner vehicle is even in that field of view. This is possible even when no explicit mechanism is provided for identifying the partner because the platooning system preferably has multiple independent mechanisms that can be used to help determine a vehicle's position.
  • When a platoon partner is identified a communications link is preferably established between the platooning vehicles. The communications may be established over one or more wireless links such as a Dedicated Short Range Communications (DSRC) link, a cellular link, etc. Once communications are established between the two vehicles, they begin transmitting data back and forth regarding themselves, their current locations and their operational states. The processes used to identify potential platoon partners and to establish the platoon and appropriate communication links may vary widely. By way of example, a few representative techniques are described in U.S. patent application Ser. Nos. 13/542,622 and 13/542,627 as well as PCT Patent Application Nos. PCT/US2014/030770, PCT/US2016/049143 and PCT/US2016/060167 previously filed by Applicant, each of which is incorporated herein by reference.
  • Once a platoon partner has been identified, the platoon controller 110 requests that the radar system control logic attempt to find the partner vehicle. More specifically, the trailing vehicle's radar tracker 116 needs to find and thereafter track the back of the lead vehicle in the context of the radar unit's outputs so that its data can be used in gap control. Referring next to FIG. 2, a method particularly well suited for establishing a radar fix on a platoon partner will be described. One aspect of establishing a radar fix is to determine the length of the partner so the GPS position information can be correlated to radar system outputs.
  • When the process initiates, radar tracker control logic determines, receives or requests an estimate of the current relative position of the partner vehicle and subscribes to or regularly receives updates regarding the partner vehicle's relative position as they become available as represented by step 203 of FIG. 2. In addition to the relative locations, the estimated information may optionally include various additional position related information such as relative velocity of the vehicles, the relative heading of the vehicles, etc.
  • In some embodiments, the radar tracker control logic is configured to estimate the current relative position, velocity and orientation (heading) of the partner vehicle based on a variety of sensor inputs from both the host vehicle and the partner vehicle. As mentioned above, the platoon partners are in communication with one another and during platooning, they send extensive information back and forth about themselves, including continually updated information about their current location and operating states. By way of example, some of the location related information that can be helpful in interpreting radar unit data may include information such as the partner vehicle's GPS position, wheel speed, orientation/heading (direction that the vehicle is heading), yaw rate (which indicates the vehicle's rate of turn), pitch, roll and acceleration/deceleration (longitudinal and angular in any of the foregoing directions). Operational related information may also include a variety of other information of interest such as the current torque requests, brake inputs, gear, etc. Information about the vehicles may include information such as the make and model of the vehicle, its length (if known), its equipment, estimated weight, etc. Any of these and/or other available information can be used in the position related estimates. By way of example, one particular position estimator is described below with respect to FIGS. 6 and 7.
  • Although a particular estimator is described, it should be appreciated that the estimated partner vehicle position related information can come from any appropriate source and the estimation does not need to be made by the radar tracker control logic itself. Additionally, although it is preferred that position and operational information be transmitted in both directions between vehicles, that is not necessary as long as the host vehicle is able to obtain the required information about the partner vehicle(s).
  • The current location related information is updated very frequently. Although the actual frequency of the updates can vary widely based on the nature of the information being updated and the nature of the communication link or vehicle system that provides the information, update frequencies on the order of 10 to 500 Hz (as for example 50 Hz) have been found to work well for items such as GPS position and wheel speed received over a DSRC link, although slower and much faster update frequencies may be used as appropriate in other embodiments. Furthermore, although regular updates of the location related information are desirable, there is no need for them to be received synchronously or at consistent intervals.
  • It should be appreciated that when the radar system begins trying to locate the partner vehicle, the partner vehicle may or may not be within the radar unit's field of view. However, both the host vehicle's position and the partner vehicle's position are generally known based at least on the received GPS data, so it is easy to estimate their separation with reasonable certainty. It should also be appreciated that although GPS location signals tend to be quite good, the reported locations may be off by some amount, and thus it is better to treat any reported GPS position as an estimate with some appropriate amount of uncertainty rather than treating the reported position as infallible information. Some specific algorithms that are suitable for estimating the partner vehicle position will be described in more detail below. Experience has shown that GPS position readings from commercially available GPS sensors used in vehicle automation applications tend to be accurate to within about 2-3 meters in practical road conditions when there is a direct line of sight to at least 4 GPS satellites. However, it should be appreciated that some GPS sensors are regularly more precise, and no GPS sensor is guaranteed to always be that accurate due to variables such as interference, operation in regions where there is no line-of-sight visibility to the required number of operational GPS satellites, etc.
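  • For illustration, a straightforward way to estimate the separation of the two vehicles from their reported GPS fixes, while carrying the 2-3 meter per-fix uncertainty noted above, is a haversine distance with the independent per-fix errors combined in quadrature. This is a sketch only, not part of the described embodiments; the 3 meter default is taken from the accuracy range above.

```python
import math

def gps_separation_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two GPS fixes, in meters."""
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def separation_with_uncertainty(lat1, lon1, lat2, lon2, sigma_each_m=3.0):
    """Treat each fix as accurate to roughly sigma_each_m (2-3 m is typical
    per the text); independent errors add in quadrature, so the separation
    estimate should be used as a distribution, not an exact value."""
    d = gps_separation_m(lat1, lon1, lat2, lon2)
    sigma = math.hypot(sigma_each_m, sigma_each_m)
    return d, sigma
```

  The returned uncertainty is the reason the bounding box described next includes a buffer rather than trusting the reported positions exactly.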
  • Once the partner vehicle's relative position estimate is known, a bounding box is applied around the estimated relative position of the partner (step 206 of FIG. 2). The purpose of the bounding box is to define a region that the partner vehicle is “expected” to be found in. The logic will thereafter look for radar detected objects located within that bounding box in an effort to identify objects that may correlate to the partner vehicle. The concept of a bounding box is helpful for several reasons. Initially it should be appreciated that the GPS unit will typically report the location of its antenna, which in the context of a tractor-trailer truck is usually on the cab. This detected position is then typically translated to a predefined reference location on the tractor and that translated position is used as the reported GPS position. Thus, the reported GPS position for a tractor-trailer will be well in front of the back of the trailer, which is (a) the point of primary interest for gap control purposes and (b) typically the most prominent feature identified by the radar unit of a trailing platoon partner. Furthermore, the distance between the reported GPS position and the back of the trailer will not be known in many circumstances. One reason for the uncertainty is that a particular tractor (cab) may be used to pull a variety of different trailers (or other loads) which potentially have different lengths. Therefore the effective length of the tractor-trailer combination may vary from trip to trip, and from a control standpoint it is generally undesirable to count on the driver to manually input the effective length of the tractor-trailer combination each trip. To a lesser extent the reported GPS positions of both platoon partners are subject to a degree of uncertainty.
  • The actual size and geometry of the bounding box used may vary, but it is desirable that the region be large enough to encompass the entire range of vehicle lengths and widths that are possible, plus a buffer to account for uncertainty in the estimated GPS position. Thus, for trucking applications, it is desirable that the longitudinal length of the bounding box be longer than any tractor-trailer combination that might be expected to be encountered. For example, U.S. commercial trucking applications involving normal tractor trailer combinations typically don't significantly exceed a combined length of 23 meters. In such applications, bounding boxes on the order of 32 meters long and 3-4.5 meters wide (as for example 3.8 meters) have been found to work well. In regions that allow longer trailers or the use of double or triple trailers, the tractor-trailer combinations may be longer and therefore longer bounding boxes may be appropriate. If the actual length of the platoon partner is known, the size of the bounding box can be adjusted accordingly to more accurately reflect the expected offset between the GPS position and the back of the trailer, which correlates to the effective vehicle length. However, even when it is believed that the effective length and width of the platoon partner are “known,” it is still desirable to utilize a bounding box greater in size than the reported length and width to accommodate uncertainty in the GPS estimates and the possibility that the load may include a feature that extends beyond the vehicle's reported length.
  • It should be appreciated, though, that there is no need for the bounding box to be rectilinear in nature; rather, the bounding box may take any desired geometric shape and/or may include dimensions other than longitudinal length and lateral width, as for example relative velocity. Thus, the bounding box may be defined in any desired manner.
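  • By way of illustration only, a simple rectilinear bounding box test of the kind described above might be sketched as follows. The dimensions mirror the 32 meter by 3.8 meter example given above; the 2 m/s relative speed gate and all names are illustrative assumptions rather than part of the described embodiments.

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """Region, expressed in the partner vehicle's frame, within which radar
    returns are accepted as possible partner points. Dimensions per the text;
    the relative-speed gate is an assumed tuning value."""
    length_m: float = 32.0
    width_m: float = 3.8
    max_rel_speed_mps: float = 2.0

def in_bounding_box(box, lon_offset_m, lat_offset_m, rel_speed_mps):
    """lon_offset_m: distance behind the partner's reported GPS position
    (positive toward the trailing truck); lat_offset_m: offset from the
    partner's center axis; rel_speed_mps: measured minus expected speed."""
    return (0.0 <= lon_offset_m <= box.length_m
            and abs(lat_offset_m) <= box.width_m / 2
            and abs(rel_speed_mps) <= box.max_rel_speed_mps)
```

  A radar return 15 meters behind the partner's reported position, half a meter off-center and closing at 0.3 m/s would be accepted; returns beyond the box length, off to the side, or at the wrong relative speed would be rejected.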
  • A representative bounding box 255 applied around a lead truck 251 in a platoon of two trucks is diagrammatically illustrated in FIG. 3. In the illustrated embodiment, each truck has a GPS unit 258 located on its tractor (cab) and a radar unit 260 located at the front of the cab. It can be seen that the bounding box exceeds the length and width of the lead truck 251.
  • In some embodiments, the bounding box may be defined in a more complex manner. For example, in one particular embodiment, the scaled square of the lateral offset (Yoff) and the square of the relative velocity (V) of the vehicles may be summed and compared to a threshold (Th). A radar point would then be rejected if the sum of these squares meets or exceeds the designated threshold (Th), even if the radar point is within the longitudinal range of the bounding box. Such a test may be represented mathematically as shown below:

  • If k·Yoff² + V² ≥ Th, then the object is rejected
  • In such an approach, the bounding box has the effective appearance of a tube within a state space map with velocity being the third axis. The logic of such an approach is that if both the measured lateral offset and the measured velocity of a detected object are relatively low probability matches, then the detected point is less likely to be a match (and therefore more appropriate to disregard for the purposes of identifying the back of a partner vehicle) than if one of those parameters is off but the other very nearly matches the expected value. Although only a couple of specific bounding box definition approaches have been described, it should be apparent that a wide variety of other bounding box definitions may be used as appropriate in other implementations. Additionally, the bounding box definition may be arranged to change over time. For example, one or more selected dimensions of the bounding box may be reduced as the algorithm begins to develop a better understanding of which radar object sample points are more likely to correspond to the partner vehicle or the back of the partner vehicle.
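  • The velocity-augmented "tube" test above follows directly from the inequality. In the sketch below, the tuning constant k and the threshold Th would in practice be calibrated for a particular radar unit and installation; the values used in the usage examples are arbitrary illustrations.

```python
def reject_by_tube(k, y_off_m, rel_vel_mps, threshold):
    """Velocity-augmented gate from the text: reject a radar point when the
    scaled square of its lateral offset plus the square of its relative
    velocity meets or exceeds the threshold, even if the point lies inside
    the longitudinal extent of the bounding box."""
    return k * y_off_m ** 2 + rel_vel_mps ** 2 >= threshold
```

  Note that a point with a large lateral offset but a perfectly matching velocity can still be rejected, while a point that is slightly off on both axes may be retained, which is exactly the "one parameter off, the other nearly matching" behavior described above.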
  • Once the bounding box has been established, the logic determines whether the entire bounding box is within the host vehicle's radar unit's field of view 263 (step 209). If not, the logic waits for the entire bounding box to come within the radar unit's field of view, thereby effectively ignoring the radar system outputs for the purpose of identifying the partner vehicle (although of course the radar system outputs can be used for other purposes such as collision avoidance if desired). There are a variety of reasons why the partner vehicle may not be within or fully within the radar unit's field of view at any particular time. Initially, it should be appreciated that although the radar unit(s) used to support platooning may be placed at a variety of different locations on the vehicles, they often have a relatively narrow field of view. For example, one common approach is to place a forward facing radar unit having a relatively narrow fixed beam in the vicinity of the middle of the front bumper to detect objects in front of the vehicle. Such an arrangement is illustrated in FIG. 3. In that figure, the field of view 263 of radar unit 260 located on the trailing truck 252 is also shown.
  • When a forward facing radar unit is used, it will be unable to see any vehicle behind or to the side of its host vehicle. Even when the partner vehicle is ahead of the radar unit's host, it may be out of the field of view if it is too far ahead of the host or is around a corner, as may be the case when a platoon partner is first identified. In some cases a platoon partner can be partially in the radar unit's field of view. A common example of this is when the partner vehicle is in an adjacent lane and not far enough ahead for the back of its trailer to be seen by a narrow beamed forward facing radar unit. It should be appreciated that it is undesirable to utilize radar samples if the back of the bounding box is not within the radar unit's field of view, since there is a risk that the furthest back portion of the partner vehicle that is detected by the radar unit is not actually the back of the vehicle.
  • FIGS. 4A-4D illustrate a few (of the many) potential relative positionings of two trucks that are in the process of establishing a platoon. In FIG. 4A, the lead truck 251 is directly ahead of the trailing truck 252 and its bounding box 255 is fully within the field of view 263 of trailing truck radar unit 260. In contrast, in FIG. 4B, the lead truck 251 is in a lane adjacent the trailing truck 252 and some, but not all of the lead truck 251 itself (and thus not all of bounding box 255) is within the field of view 263 of trailing truck radar unit 260. In FIG. 4C, the lead truck 251 is in a lane adjacent to the trailing truck 252 and all of the lead truck 251 itself, but not the entire bounding box 255, is within the field of view 263 of trailing truck radar unit 260. In FIG. 4D, the lead truck 251 is again in a lane adjacent the trailing truck 252 but differs from FIGS. 4B and 4C in that the entire bounding box 255 associated with lead truck 251 is within the field of view 263 of trailing truck radar unit 260. In circumstances where the entire bounding box is not located within the radar unit's field of view (e.g., a scenario such as shown in FIG. 4B or 4C or when the lead vehicle is otherwise out of view), the partner vehicle identification logic waits at step 209 for the entire bounding box to come within the radar unit's field.
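  • The wait-for-full-visibility check of step 209 can be sketched as a simple geometric containment test on the corners of the bounding box. The beam half-angle and maximum range below are illustrative assumptions; an actual implementation would use the specifications of the particular radar unit, and the bounding box corners would come from the relative position estimate.

```python
import math

def point_in_fov(x_m, y_m, half_angle_deg, max_range_m):
    """x_m is forward from the radar, y_m is lateral. A point is visible
    when it lies ahead of the radar, within its maximum range, and within
    the beam's half-angle of the boresight."""
    if x_m <= 0.0:
        return False
    if math.hypot(x_m, y_m) > max_range_m:
        return False
    return abs(math.degrees(math.atan2(y_m, x_m))) <= half_angle_deg

def box_fully_in_fov(corners, half_angle_deg=10.0, max_range_m=150.0):
    """Step 209 analogue: the tracker proceeds only once every corner of
    the bounding box (list of (x, y) pairs) is within the field of view."""
    return all(point_in_fov(x, y, half_angle_deg, max_range_m)
               for x, y in corners)
```

  With these assumed parameters, a box directly ahead of the radar (as in FIG. 4A) passes, while a box mostly off to the side (as in FIG. 4B or 4C) fails and the logic keeps waiting.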
  • When the entire bounding box is within the radar unit's field of view (e.g. a scenario such as illustrated in FIG. 4A or FIG. 4D), the radar system controller logic obtains a next radar sample (step 212) and a current estimate of the partner vehicle's position and velocity relative to the host vehicle (step 215). Commercially available short range radar units utilized in road vehicle applications are typically configured to output their sensed scene at a relatively rapid sample rate. Each scene typically identifies a set of zero or more objects that have been detected as well as the velocity of such objects relative to the radar unit itself.
  • The nature of radar systems is that the transmitted radio waves can be reflected by most anything in their path, including both any intended target(s) and potentially a wide variety of different items. Therefore, when trying to establish a platoon, it is important to identify the reflected signal(s) that represent the desired partner and to be able to distinguish that partner from the noise reflected from other objects. By way of example, when driving along a road, the radar unit may receive reflections from multiple different vehicles including any vehicle that is immediately ahead, passing vehicles going in the same or opposite direction, objects to the side of the road such as highway or street signs, trees or other objects along the side of the road, etc.
  • When a sensed scene is received, the radar system control logic determines whether any of the identified objects are partner vehicle radar point candidates as represented by step 218. Representative objects that might be detected by the radar unit 260 are marked with X's in FIGS. 4A-4D. To qualify as a partner vehicle radar point candidate, an object detected in the scene must be located within the bounding box in terms of both position and speed. Radar objects located outside of the bounding box are preferably rejected because there is a relatively higher probability that they do not correspond to the partner vehicle. For example, they could correspond to vehicles in adjacent lanes 272, 273, an interloper located between the platoon partners (not shown), objects on the side of the road 274, etc. Objects that do not closely match the expected relative speed of the partner vehicle are also preferably rejected even if they match the expected position aspects of the bounding box longitudinally and laterally because again, it is less likely that they correspond to the platoon partner. For example, a stationary object such as a feature to the side of the road (e.g. a road sign, tree or stationary vehicle), debris in the road, or a detected feature in the road itself (e.g. a pothole, etc.), will appear to be approaching the radar unit at the speed that the host vehicle is traveling at. It is noted that many commercially available radar units will automatically filter out, and therefore don't report, stationary objects. When such a radar unit is used, the stationary objects would not even be identified as part of the radar scene.
  • Some of the reported radar objects may be traveling in the same direction as the host vehicle but are moving at a relative velocity that is different than the expected partner velocity. There is a relatively high probability that such radar objects do not correspond to the partner vehicle and therefore these types of radar points are also preferably discarded.
  • Any detected radar objects that appear to match the expected location and speed of the partner within the context of the defined bounding box are considered partner vehicle radar point candidates and are categorized with respect to how far they are longitudinally (along the longitudinal axis of the partner) from the estimated location of the partner (e.g., the partner's GPS position). In some embodiments, a histogram is utilized for this categorization. The number of bins in the histogram may vary. For computational ease, 512 bins divided evenly over the length of the bounding box have been found to work well, although more or fewer bins can be used as appropriate for any particular application. In implementations that use a bounding box of approximately 32 meters, with 512 bins, each bin corresponds to approximately 6 cm (2-3 inches). If greater resolution is desired, then more bins can be used.
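  • The binning step above can be sketched as follows, using the 512 bins over a 32 meter bounding box described in the text (yielding bins of roughly 6 cm). Function and parameter names are illustrative.

```python
def bin_candidates(candidate_offsets_m, box_length_m=32.0, n_bins=512):
    """Histogram partner vehicle radar point candidates by longitudinal
    distance from the partner's estimated position. 512 bins over a 32 m
    box gives 6.25 cm bins, matching the resolution described in the text.
    Returns the histogram and the bin width in meters."""
    bin_width = box_length_m / n_bins
    hist = [0] * n_bins
    for d in candidate_offsets_m:
        if 0.0 <= d < box_length_m:  # ignore points outside the box
            hist[int(d / bin_width)] += 1
    return hist, bin_width
```

  Two candidates a couple of centimeters apart land in the same bin, which is what lets repeated returns from the back of the trailer accumulate into a sharp peak over successive samples.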
  • It has been observed that it is common for the short range radar units utilized in road vehicle applications to identify multiple different “objects” that may actually be part of the same vehicle, as represented by radar points 276-279 in FIGS. 4A-4D. This is particularly common with trucks; indeed, it is common for the radar signature of a tractor-trailer truck to appear as more than one object. For example, the back of the trailer, an underride guard, and/or other features of the trailer or load located near the back of the trailer may appear in the radar output as one or multiple distinct objects (e.g., points 276, 277). Additionally, objects located further up the trailer and/or objects in the vicinity of the cab may be separately identified (e.g. points 278, 279). For example, when the radar is mounted relatively low on the host vehicle it may detect reflections from the transmission or other items along the truck's undercarriage or other features of the tractor-trailer such as the trailer's landing gear or the back of the tractor and identify those items as separate detected “objects.” Therefore, it is possible (indeed it is relatively common) that any particular sample may identify more than one object that meets the criteria of a partner vehicle radar point candidate. In such circumstances multiple candidates associated with a particular radar sample will be added to the histogram.
  • After the histogram has been populated with any partner vehicle radar point candidates identified in the sample, a determination is made regarding whether sufficient samples have been obtained to analyze the radar data to identify the partner vehicle in step 224. If not, the logic returns to step 212 where the next sample is obtained and the process repeats until sufficient samples have been obtained to facilitate analysis. If the bounding box moves partially out of the field of view of the radar unit at any point (as represented by the “no” branch from decision block 225), then the logic returns to step 209 where it waits for the bounding box to come back into full view before taking additional samples.
  • As discussed above, commercially available short range radar units utilized in road vehicle applications are typically configured to output their sensed scene at a relatively rapid sample rate. By way of example, sample rates on the order of 20 to 25 hertz are common, although either higher or lower sample frequencies may be used. Therefore, the histogram will populate fairly quickly when the partner vehicle is within the radar unit's field of view and the histogram will provide a rather good indication of the radar signature of the partner.
  • FIG. 5A is a plot showing a set of 98 detected partner vehicle radar point candidates transposed into a reference frame based on the expected location of the front truck. The x-axis of the plot shows the longitudinal distance from the expected position of the front of the leading truck to the detected point. The y-axis shows the lateral offset of the detected point relative to the center axis of the leading truck. It can be seen that although there is noticeable variation in the locations of the detected points, in the illustrated sample set, the points tend to be clustered into a couple of regions. FIG. 5B is a histogram that shows the longitudinal distance to each of the detected partner vehicle radar point candidates in the plot of FIG. 5A. It can be seen that when only the longitudinal distance is considered, the clustering tends to be even more pronounced.
  • The large cluster 290 located furthest back in the histogram typically corresponds to the back of the vehicle and is often (although not always) the largest cluster. Cluster 292 located further forward typically corresponds to other features of the partner truck. Experience has shown that radar reflections from the forward features tend to be weaker and more sporadically identified as a discrete object by the radar unit, which translates to a smaller cluster in the histogram.
  • If sufficient samples have been obtained to support analysis, the logic follows the yes branch from decision block 224 and flows to step 227 where a clustering algorithm is applied to the histogram data. The trigger point for when processing may start can vary widely based on the needs of any particular system. In general, it is desirable for the histogram to contain enough data points so that the partner vehicle can be accurately identified. In some specific implementations, the histogram must include data from a first threshold worth of samples and include at least a second threshold worth of partner vehicle radar point candidates. The thresholds used may vary based on the needs of a particular implementation. By way of example, sample thresholds corresponding to at least 1-5 seconds worth of data and candidate thresholds in the range of 40 to 500 points may be used in some implementations. In one specific example, samples corresponding to at least 3 seconds worth of data (e.g., 60 samples) and at least 60 partner vehicle radar point candidates are used as the thresholds.
  • The dataset illustrated in FIGS. 5A and 5B is representative of a dataset that might be available at the time that an attempt is initially made to identify the back of the partner vehicle—that is, the first time that the “yes” branch from step 224 is followed.
  • In general, the clustering algorithm bunches data points that are highly likely to represent the same point. A variety of conventional clustering algorithms can be used for this purpose. By way of example, modified mean shift algorithms work well. FIG. 5C is a plot showing the mean shift centers of the histogram points represented in FIG. 5B, with the heights of the centers being indicative of the number of points associated with that center. The two clusters 290 and 292 stand out even more dramatically in this representation.
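  • For illustration, a minimal one-dimensional mean shift pass over the histogram offsets might look like the following sketch. The bandwidth is an assumed tuning value, and production implementations (including the modified mean shift algorithms mentioned above) would differ in detail; this merely shows how nearby points climb to a shared local mean and merge into cluster centers with counts, as depicted in FIG. 5C.

```python
def mean_shift_1d(points, bandwidth=0.3, iters=50, tol=1e-4):
    """Minimal 1-D mean shift: each point repeatedly moves to the mean of
    its neighbours within `bandwidth` meters; converged points that land
    within tolerance of one another are merged into (center, count) pairs."""
    shifted = list(points)
    for _ in range(iters):
        moved = False
        for i, p in enumerate(shifted):
            neighbours = [q for q in points if abs(q - p) <= bandwidth]
            m = sum(neighbours) / len(neighbours)
            if abs(m - p) > tol:
                shifted[i] = m
                moved = True
        if not moved:
            break
    centers = []  # list of (center_m, point_count)
    for p in shifted:
        for j, (c, n) in enumerate(centers):
            if abs(p - c) <= tol * 10:
                centers[j] = ((c * n + p) / (n + 1), n + 1)
                break
        else:
            centers.append((p, 1))
    return sorted(centers)
```

  Fed with offsets clustered near the back of the trailer and near a forward feature, this yields two centers whose counts mirror the cluster heights in FIG. 5C.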
  • The mean shift data is then analyzed to determine whether one of the clusters meets predefined back of partner vehicle criteria in step 230. If so, that cluster is identified as corresponding to the back of the vehicle. (Step 233). Since each cluster corresponds to a designated distance between the partner's reported GPS position and the back of the vehicle, the effective length of the vehicle is defined by the cluster. As noted above, the phrase “effective vehicle length” as used herein corresponds to the distance between the reported GPS position and the back of the vehicle—which is an important distance to know for control purposes. It should be appreciated that this is typically different than the actual length of the vehicle because the reported reference position may not be located at the front of the vehicle.
  • In some implementations the cluster located closest to the back of the bounding box that has over a threshold percentage of the total number of radar points in the histogram is identified as the back of the platoon partner vehicle. In some implementations a further constraint is used that requires that the cluster location not move by more than a certain threshold on the last sample. By way of example, maximum movement thresholds on the order of 1 mm have been found to work well in some applications. This approach has been found to very reliably identify the radar point that corresponds to the back of a truck even when the radar unit controller has no predetermined knowledge of the length of the vehicle and regardless of the presence of other traffic. However, it should be appreciated that the threshold percentage or other characteristics of the histogram used to identify the back of the vehicle may vary based on application. In the embodiment illustrated in FIGS. 5A-5C, cluster 290 is designated as the back of the lead truck.
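  • The back of partner vehicle test just described (the furthest-back cluster holding at least a threshold share of the points, optionally stable to within about 1 mm since the previous sample) might be sketched as follows. The 20% share threshold is an illustrative assumption; the text only says the percentage may vary by application.

```python
def select_back_of_vehicle(clusters, total_points, min_fraction=0.2,
                           prev_center_m=None, max_move_m=0.001):
    """Pick the cluster closest to the back of the bounding box that holds
    at least `min_fraction` of all histogram points, optionally requiring
    that its center moved no more than ~1 mm since the last sample.
    `clusters` is a list of (center_offset_m, count) pairs, where larger
    offsets are further back. Returns the chosen center, or None if no
    cluster yet meets the back of partner vehicle criteria."""
    for center, count in sorted(clusters, reverse=True):
        if count / total_points < min_fraction:
            continue  # too few points to trust this cluster
        if prev_center_m is not None and abs(center - prev_center_m) > max_move_m:
            continue  # center still drifting; not yet stable
        return center
    return None
```

  Returning None corresponds to the "no" branch from decision block 230: the process simply keeps collecting samples until a cluster satisfies the criteria.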
  • It is particularly noteworthy that even though other traffic moving in parallel with the platoon may be detected by the radar, the described approach very reliably filters those radar points by effectively applying a number of different types of filters. Radar points that report features that are not where the platoon partner is expected to be are filtered because they are not within the bounding box. Radar points that are not traveling at close to the expected relative speed are filtered regardless of where they are found. The back of vehicle criteria used on the clustered histogram data effectively filter any other vehicles traveling within the footprint of the bounding box at very nearly the same speed as the platoon partner, because the bins are small enough that it is highly unlikely that such an interloper can maintain a constant enough gap to fool the algorithm into thinking that the interloper is part of the target (e.g., even if the interloper is traveling at nearly the same speed as the partner vehicle, if it is located within the bounding box, its position relative to the partner vehicle's position is likely to vary enough to cause the back of partner vehicle test to fail). The back of vehicle criteria also filter out more random objects reported by the radar unit.
  • The effective vehicle length indicated by the selected mean shift cluster may be reported to the gap controller and any other controller concerned with the length of the partner. In most circumstances, the distance between the GPS reference location and the front of the host vehicle is known and therefore the effective vehicle length determined by the radar unit can readily be used in association with known information about the truck to positively indicate the front and back of the truck as represented by step 236.
  • In some circumstances none of the mean shift clusters will meet the back of partner vehicle criteria. In most cases this suggests that there is a risk that the partner vehicle is not being accurately tracked. In such cases (as illustrated by the no branch from decision 230) the process continues to run, collecting radar points from additional samples, until the criteria are met, indicating that the partner vehicle has confidently been identified. In some embodiments, radar points may optionally be discarded after they become too old, or the process may be restarted if the system has trouble identifying the back of the partner vehicle or for other reasons, such as the vehicles coming to a stop.
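The optional discard-when-stale behavior might look like the following sketch; the tuple representation of a radar point and the 30-second age limit are illustrative assumptions, not values from the patent.

```python
# Illustrative housekeeping for the accumulated radar point candidates:
# drop points older than max_age_s before (re)building the histogram.

def prune_points(points, now_s, max_age_s=30.0):
    """points: list of (timestamp_s, range_m) radar point candidates."""
    return [(t, r) for t, r in points if now_s - t <= max_age_s]
```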
  • In some embodiments, the back of the partner identification process continues to run or is periodically rerun even after the vehicle length has been determined. There are several advantages to continuing to populate the histogram. Often the initial length determination is made while the platoon partners are relatively far apart (e.g., over 100 feet). Once the back of the partner vehicle has been reliably identified, the gap controller may tighten the gap, thereby drawing the vehicles closer together. When the vehicles are closer together, the radar readings are often more precise than they are when the vehicles are 100+ feet apart. Additionally, remembering that in some circumstances the GPS measurements may be relatively far off for gap control purposes, more measurements give a better statistical indication of the relative position of the vehicles. By continuing to run the back of partner identification process, those better measurements can be used to more accurately determine the effective length of the partner vehicle, which is highly desirable for control purposes.
  • FIG. 5D is a plot showing a set of 1700 detected partner vehicle radar point candidates on the same graph as shown in FIG. 5A. The 1700 sample points include the 98 points illustrated in FIGS. 5A-5C and were obtained by continuing to run the same radar point classification algorithm. FIGS. 5E and 5F show the histogram and mean shift centers respectively for the larger data set. Thus, FIG. 5E corresponds to FIG. 5B, and FIG. 5F corresponds to FIG. 5C. It can be seen that the larger dataset appears to have identified a small cluster 293 located near the front of the lead vehicle and has effectively filtered out some smaller clusters identified in the smaller data set.
  • Continuing to run the back of partner identification process has other potential uses as well. For example, some trucks have the ability to draw the trailer closer to the cab when the truck is operating on the highway. Thus, although it is relatively rare, there are situations in which the effective length of the truck can vary over the course of a platoon. Such changes can automatically be detected by rerunning or continuing to run the back of the partner identification process.
  • Over time, the histogram and/or mean shift clusters also provide a very good indication of the radar signature of the partner vehicle. This known signature of the partner vehicle can be used in a number of different ways as an independent mechanism for verifying that the proper vehicle is being tracked. For example, in scenarios where GPS data becomes unavailable or communications between the vehicles are disrupted for a period of time, the histogram can be used as a check to verify that the correct vehicle is being tracked by the radar unit. In circumstances where the back of the lead truck is not within the view of the trailing vehicle's radar, but other portions of the trailer and tractor are within the radar's view, the portion of the truck that can be seen can be compared to the histogram signature to determine the relative positioning of the trucks, which can be used as a measurement for gap control or as part of autonomous or semi-autonomous control of the trailing vehicle.
  • In another example, in circumstances when radar contact is lost, a new histogram can be started at an appropriate time and a new histogram can be compared to a stored histogram indicative of the platoon partner. When there is a match, that match can be good independent evidence that radar contact with the platoon partner has been reestablished. Similarly, newly created histograms can be compared to stored histograms representing the platoon partner at various times during platooning as a way of independently verifying that the platoon partner is still being tracked. This can be a good safety check to verify that the radar unit has not inadvertently switched and locked onto a vehicle that is traveling in parallel next to the platoon partner. The histograms can also be saved as a radar signature of the partner vehicle and shared with other trucks that may later seek to platoon with that vehicle—which can be useful in the initial identification process.
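One simple way to compare a newly built histogram against a stored partner signature, as described above, is a cosine-similarity test. The bin representation and the 0.8 match threshold are assumptions for illustration; the patent does not specify a particular comparison metric.

```python
# Illustrative sketch of matching a fresh histogram against a stored
# radar signature of the platoon partner. Both histograms are assumed to
# use the same bin layout (per-bin radar point counts).
import math

def histogram_match(new_hist, stored_hist, threshold=0.8):
    """Return True when the cosine similarity of the two histograms
    exceeds the (assumed) match threshold."""
    dot = sum(a * b for a, b in zip(new_hist, stored_hist))
    na = math.sqrt(sum(a * a for a in new_hist))
    nb = math.sqrt(sum(b * b for b in stored_hist))
    if na == 0 or nb == 0:
        return False  # an empty histogram cannot confirm a match
    return dot / (na * nb) > threshold
```

A match after radar contact is lost would then serve as the independent evidence, described above, that contact with the platoon partner has been reestablished.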
  • Estimating Position of Platoon Partners
  • In the context of platooning, it is helpful to maintain accurate models of the expected relative positions, speeds and orientations of each of the vehicles in the platoon as such information is very helpful in the accurate control of the gap between platoon partners. Such models preferably utilize inputs from multiple different sensing systems and include at least some redundant information from different systems when practical. The provision of redundant information from different systems is helpful as a double check as to the integrity of received data and also provides backup mechanisms for the inevitable times when a system is unable to convey accurate information.
  • By way of example, the gap between vehicles can be determined using a number of different techniques. One general approach is to use the distance to the platoon partner detected by the radar system. Although radar tends to very accurately measure the distance between vehicles, it is important to ensure that the distance being reported is actually the distance to the platoon partner rather than to some other vehicle or feature. There are also times when the partner vehicle is not within the radar's field of view, or the radar unit is not operating as desired for a brief period. An independent way of determining the distance between the platoon partners is to utilize their respective GPS data. Specifically, the distance between the vehicles should be the difference between the vehicles' respective GPS positions, minus the effective length of the lead vehicle and the offset distance between the front of the trailing vehicle and its GPS receiver. Limitations of using the GPS data include the fact that the GPS data will not always be available, due to factors such as the GPS receivers not having a clear view of sufficient GPS satellites to be able to determine a location or the communication link between vehicles being down for a period of time. The GPS data is also fundamentally limited by its accuracy, which, while good, is often less precise than desired for gap control. Other systems for measuring distances between the platoon partners have their own advantages and limitations.
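The GPS-based gap computation described above reduces to a simple subtraction. The function and variable names below are illustrative, not from the source:

```python
# Sketch of the GPS-based gap: the distance between the two GPS fixes,
# minus the lead vehicle's effective length (rear of lead measured from
# its GPS reference) and the trailing vehicle's front-to-receiver offset.

def gps_gap(distance_between_fixes_m, lead_effective_length_m,
            trail_front_offset_m):
    return (distance_between_fixes_m
            - lead_effective_length_m
            - trail_front_offset_m)

# e.g., fixes 40 m apart, 18 m effective lead length, 7 m trailing
# offset: gps_gap(40.0, 18.0, 7.0) -> 15.0 m gap
```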
  • When the current gap between the vehicles is known, the gap expected at a time in the immediate future can be estimated based on factors such as the current positions, the relative velocities and yaw rates of the vehicles. The respective velocities of the vehicles may also be measured, determined, estimated and/or predicted in a variety of different manners. For example, wheel speed sensors can be used to relatively accurately indicate the current speeds of the respective vehicles. Knowledge of the vehicle's orientation can be used in conjunction with the knowledge of the vehicle's speed to determine its velocity. The radar unit can be used to measure the relative speeds of the platoon partners. Knowledge of other factors such as torque request, vehicle weight, engine characteristics and road grade can be used to predict vehicle speeds in the future.
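As a sketch of the relationships described above, a planar velocity vector can be composed from a wheel-speed-derived speed and a heading, and a first-order gap prediction follows from the relative speed. The names and the straight-line assumption are illustrative:

```python
# Illustrative helpers for the velocity and gap-prediction relationships
# described above (straight-line, constant-speed assumptions).
import math

def velocity_vector(speed_mps, heading_rad):
    """Compose a planar velocity vector from speed (e.g., from wheel
    speed sensors) and heading (counterclockwise from the x axis)."""
    return (speed_mps * math.cos(heading_rad),
            speed_mps * math.sin(heading_rad))

def predicted_gap(current_gap_m, lead_speed_mps, trail_speed_mps, dt_s):
    """First-order prediction of the gap dt_s seconds in the future."""
    return current_gap_m + (lead_speed_mps - trail_speed_mps) * dt_s
```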
  • In the context of the radar system control, knowing where the leading vehicle is expected to be relative to the radar unit on a trailing vehicle can be quite helpful in determining whether one or more objects detected by the radar unit correspond to the back of the lead vehicle. Therefore, in some embodiments, the radar system controller (or another controller whose determinations can be utilized by the radar system controller) includes a position estimator that maintains an estimate of the current position, orientation and relative speed of the partner vehicle relative to the radar unit. One suitable radar scene processor 600 that includes a position/state estimator 612 is illustrated in FIG. 6.
  • In the embodiment illustrated in FIG. 6, radar scene processor 600 includes gap monitor 610 and a partner identifier 620. The gap monitor 610 is configured to track the position of the back of the partner vehicle based on radar measurements (after the back of the partner vehicle has been identified) and to report radar position and speed measurements corresponding to the back of the partner vehicle to the gap controller and/or any other component interested in such measurements made by the radar unit. One particular implementation of the gap monitoring algorithm will be described below with reference to the flow chart of FIG. 7.
  • In the illustrated embodiment, the gap monitor 610 includes a position/state estimator 612 having a Kalman filter 615 that is used both to determine the most recent estimate of the position of the partner vehicle relative to the host vehicle and to predict the expected position of the partner vehicle at the time the next radar sample will be taken. As described in more detail with respect to FIG. 7, in the illustrated embodiment, the position/state estimator 612 utilizes both the detected radar scenes and other available vehicle state information, such as the respective GPS positions, wheel speeds, and inertial measurements of the host and partner vehicles, in the estimate of the expected state (e.g., position, velocity, etc.) of the leading vehicle. These state estimates can then be used to help interpret the received radar scene. That is, having a reasonable estimate of where the partner vehicle is likely to be in the context of a radar scene helps the gap monitor 610 properly identify the radar return object that corresponds to the back of the partner vehicle out of a radar scene that may include a set of detected objects. This helps ensure that the proper detected point is used in the gap control. It is also helpful in identifying situations in which the tracker does not have good confidence regarding which (if any) of the objects detected by the radar in a particular scene sample accurately represents the position of the back of the partner vehicle, so that such a sample can be discounted, ignored or otherwise properly handled in the context of the gap control algorithm. One particular Kalman filter design that is well suited for use in the position/state estimator 612 is described below with respect to FIG. 8.
  • The partner identifier 620 includes its own position/state estimator 622, a histogram 624, a clustering algorithm 625 which produces mean shift clusters 626, and a partner length estimator 627. The partner identifier 620 executes an algorithm such as the algorithm discussed above with respect to FIG. 2 to identify the back of the partner vehicle. As part of that process, histogram 624 is populated. The histogram is diagrammatically shown as being part of the partner identifier 620, but it should be appreciated that the histogram is merely a data structure that can be physically located at any appropriate location and may be made available to a variety of other processes and controllers within, or external to, the partner identifier 620. The partner length estimator 627 is configured to determine the length of the partner vehicle (including its front and back relative to its GPS reference position) based on the histogram and other available information.
  • The position/state estimator 622 in the partner identifier 620 functions similarly to the position/state estimator 612 described above and may also include a Kalman filter 623. A significant difference between the position/state estimator 622 used for partner identification and the position/state estimator 612 is that, during identification, it is not known which radar point corresponds to the back of the partner truck, and therefore the radar unit samples cannot be used as part of the position/state estimates.
  • The position/state estimation, partner detection, partner length estimating and gap monitoring algorithms may be executed on a radar tracking processor dedicated to radar tracking alone, or they may be implemented on a processor that performs other gap or platoon management tasks as well. The respective algorithms may be implemented as distinct computing processes or they may be integrated in various manners with each other and/or other functionality in various computing processes. In other embodiments, discrete or programmable logic may be used to implement the described functionality. It should be apparent that a wide variety of different models can be used to track the position of the back of the partner vehicle relative to the radar unit and to estimate future positions. Two particular position/state estimators are diagrammatically illustrated as part of FIG. 6 and a method that can be used to estimate the current position at any given radar sample time is illustrated in the flow chart of FIG. 7.
  • Referring next to FIG. 7, a method of tracking a partner vehicle and estimating its future position based in part on information received from the radar unit will be described. In the illustrated embodiment, the trailing vehicle is tracking the position of the back of a lead vehicle, although an analogous process can be used by the lead vehicle to track a following vehicle or for parallel vehicles to track one another. The described method presupposes that we have a reasonable estimate of the location of the back of the partner vehicle—which can initially be determined using the method described above with respect to FIG. 2 or in any other suitable manner. For example, when the effective length of the front vehicle is known, the initial estimate for the relative position of the back of the lead vehicle can be estimated based on GPS position data.
  • Each time a new radar scene is received (step 502), a determination is made regarding whether any of the radar object points (targets) matches the expected position and relative velocity of the back of the partner vehicle (step 504). This is preferably a probabilistic determination in which it is concluded that there is a high probability that the "matching" target indeed represents the back of the partner vehicle. One way to determine whether a target matches is to quantify an uncertainty factor in association with the estimated position. If a radar target point is within the range of the uncertainty factor of the expected position, then it can be considered a match. As will be described in more detail below, in some implementations Kalman filtering is used to estimate the position of the back of the partner vehicle and to quantify the uncertainty. Kalman filtering is particularly appropriate because it inherently adjusts the uncertainty level based on the perceived accuracy of the measurements.
  • If more than one of the reported radar target points match the estimated position within the range defined by the uncertainty factor (sometimes referred to as a ball of uncertainty), then the closest radar object point identified in the radar scene is treated as the “matching” target. In the context of this determination, the “closest” match may be selected based on a combination of metrics including longitudinal position, lateral position, relative speeds, etc.
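The "ball of uncertainty" test and closest-match selection described above might be sketched as follows. The dictionary representation of a radar target and the equal weighting of position error and relative-speed error in the tie-break metric are assumptions for illustration:

```python
# Hedged sketch of the match test: a target matches when it falls within
# the uncertainty radius of the predicted position; among matches, the
# "closest" is chosen by a combined position/speed error metric.

def match_target(targets, predicted, uncertainty):
    """targets: list of dicts with 'x', 'y', 'rel_speed' keys.
    predicted: dict with the same keys; uncertainty: radius in meters."""
    def dist(t):
        return ((t['x'] - predicted['x']) ** 2
                + (t['y'] - predicted['y']) ** 2) ** 0.5
    # Keep targets inside the uncertainty ball around the prediction.
    candidates = [t for t in targets if dist(t) <= uncertainty]
    if not candidates:
        return None  # the "no" branch from decision 504
    # Pick the closest by combined position and relative-speed error
    # (the weighting here is an illustrative assumption).
    return min(candidates,
               key=lambda t: dist(t) + abs(t['rel_speed'] - predicted['rel_speed']))
```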
  • If a match is found, the radar tracker transmits the distance to, and relative speed of, the matched object to the gap controller 112 as the current gap to, and relative speed of, the back of the partner vehicle (step 506). In some embodiments, the only information transmitted is the longitudinal distance to the back of the trailer and its relative speed. This is because while currently available radar units are generally quite good at measuring distance and relative speed, they are not as good at precisely measuring lateral velocities or providing precise lateral position information regarding identified objects. However, if the radar unit used can accurately measure other useful attributes of the target, such as lateral velocities, acceleration, etc., that information may optionally be transmitted as well.
  • When a match is found, the best matched target is used to update the radar tracking position and speed estimate for the back of the truck as well (step 508). The position and speed estimate is then propagated in time to the position expected for the next radar sample in step 510. That is, the logic estimates the expected position of the back of the truck at the time the next radar sample is expected. This is a relatively simple matter since the radar samples are provided at regular intervals, so the timing of the next expected sample is easy to determine. For example, if the radar sample rate is 20 Hz, the next sample can be expected to occur 0.05 seconds after the last sample. If the front and rear vehicles are traveling at exactly the same velocity and both vehicles are traveling in the same direction, then the "expected" position of the back of the front vehicle would be exactly the same as the last detected position of the back of the front vehicle. However, often the vehicles will be traveling at slightly different speeds, and possibly in slightly different directions if one of the vehicles is turned or turning slightly relative to the other. For example, if the trailing vehicle is moving in exactly the same direction as the lead vehicle at a constant velocity 1.00 meter per second faster than the lead vehicle, then the back of the lead vehicle would be expected to be 5 cm closer to the trailing vehicle at the time the next radar sample is taken (0.05 seconds after the last sample was taken). Simple trigonometry may be used to determine the expected position if the vehicles are turned or turning slightly with respect to one another. Of course, any number of other relevant variables that are known to or obtainable by the radar system controller can be considered in the calculation of the expected position and speed to further improve the estimates. These might include the respective accelerations (measured or estimated) of the vehicles, the respective directions of travel and/or rates of turn of the two vehicles, etc. Factors that may influence the velocity, acceleration or rate of turn of the vehicles, such as the respective vehicles' torque requests, the current grade, the vehicle weights, etc., may also be used to further refine the estimate.
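The propagation of step 510 under a constant-velocity model can be sketched as below, reproducing the numeric example above: at a 20 Hz sample rate (0.05 s between samples) and a 1.00 m/s closing speed, the gap shrinks by 5 cm. The relative-heading term is a simple trigonometric assumption, not the patent's exact model:

```python
# Sketch of propagating the relative position of the back of the lead
# vehicle to the next expected radar sample (constant-velocity model).
import math

def propagate(rel_x_m, rel_y_m, closing_speed_mps, rel_heading_rad=0.0,
              sample_rate_hz=20.0):
    dt = 1.0 / sample_rate_hz  # 20 Hz -> 0.05 s between samples
    # The closing speed acts along the relative heading between vehicles;
    # rel_heading_rad = 0 is the straight-ahead case.
    rel_x = rel_x_m - closing_speed_mps * math.cos(rel_heading_rad) * dt
    rel_y = rel_y_m - closing_speed_mps * math.sin(rel_heading_rad) * dt
    return rel_x, rel_y

# Straight-ahead case: a 15.00 m gap with the trailing vehicle 1.00 m/s
# faster becomes a 14.95 m gap at the next sample.
```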
  • In addition to propagating the position estimate in time, the uncertainty estimate is updated as represented by block 512 as described in more detail below.
  • After the position estimate has been propagated in time and the uncertainty estimate has been updated, the process repeats for the next sample, as represented in the flow chart of FIG. 7 by returning to step 502, where the next radar scene sample is received. The propagation of the estimated position in time is particularly useful in step 504, which utilizes the then current estimate of the position of the back of the lead vehicle to determine whether a match occurs. The current estimate of the position of the lead vehicle can be expected to (indeed likely will) change over time. For each radar sample, the then current best estimate of the position of the back of the front vehicle may be used, which helps ensure that the partner vehicle is accurately tracked.
  • As suggested above, the platoon system preferably utilizes multiple independent or partially independent mechanisms for tracking the position and speed of the respective vehicles. For example, as discussed above, the platoon controller may have access to GPS position data, which provides an independent mechanism for determining the relative positions of the platooning vehicles. The platoon controller may also have access to wheel speed data, which provides an alternative mechanism for determining the respective speeds, and thus the relative speed, of the platoon partners. Such data for the host vehicle is available from the host vehicle sensors. Data for the partner vehicles is available over the communications link (e.g., the DSRC link, a cellular link or any other available communication method).
  • Each time new GPS position estimates are received (as represented by box 520 in FIG. 7), the radar tracking position and speed estimate is updated using the current GPS position estimate (step 523), and that updated position and speed estimate is propagated in time to the expected receipt of the next radar sample, as represented by step 510. In parallel, each time new wheel speed estimates are received (as represented by box 530 in FIG. 7), the radar tracking position and speed estimate is updated using the current wheel speed estimates (step 533), and that updated position and speed estimate is propagated in time to the expected receipt of the next radar sample, as represented by step 510. Similarly, each time new inertial measurements such as yaw rates, vehicle orientation (heading), vehicle pitch and/or vehicle roll are received (as represented by box 540), the radar tracking position and speed estimate is updated using the current inertial measurements (step 542).
  • The GPS position, wheel speed and inertial measurements are preferably updated on a relatively rapid basis, which is often (although not necessarily) more frequent than the radar samples. By way of example, GPS update frequencies in the range of 25 to 500 Hz, for example 50 Hz, have been found to work well for open road platoon control applications. Similar wheel speed and inertial measurement update frequencies have also been found to work well, although there is no need to update the GPS positions, wheel speeds and/or inertial measurements at the same sample rate as each other, or at the same sample rate as the radar unit.
  • In the embodiment shown, the updates from the radar unit, the GPS sensors, the wheel speed sensor and inertial measurements are handled asynchronously as they are received. Although not required, this is useful to help ensure that the latest sensor inputs are utilized in estimating the expected relative positions and speeds of the platooning vehicles at the time the next radar unit scene sample is received. This is contrasted with a system in which the wheel speed sensor and GPS sensor information is updated once each sample of the radar unit. Although synchronous updates can also work well, the use of asynchronous updates tends to improve the accuracy of the estimates because various sensor inputs can be updated more frequently than the radar unit sampling rate.
  • Although the different types of measurements do not need to be synchronized with one another, the same types of measurements on the different trucks are preferably synchronized in time. That is, GPS position measurements on the front truck are preferably synchronized in time with GPS position measurements on the back truck so that the relative positions of the trucks can be determined at a particular instant in time. Similarly, the wheel speed measurements on the front truck are preferably synchronized in time with wheel speed measurements on the back truck so that the relative speeds of the trucks can be determined at a particular instant in time. The various inertial measurements are also preferably synchronized with each other as well.
  • It should be appreciated that it is relatively simple to coordinate the timing of the various measurements between vehicles because GPS is used and the vehicles communicate with one another over the communications link. As is well known, the GPS system provides very accurate global timing signals. Thus, the clocks used for the platoon partners can be synchronized with the GPS signals and the various measurements (e.g. GPS position measurements, wheel speed measurements, inertial measurements, etc.) can therefore be instructed to occur at specific synchronized times on the respective trucks. Each measurement may also be accompanied by a timestamp that indicates when the measurement was taken so that the synchronization of the measurements can be verified (or accounted for if similar sensor measurements are not synchronized between vehicles).
  • The propagation of the estimated position in time is particularly useful in step 504 which utilizes the then current estimate of the position of the back of the lead vehicle to determine whether any of the received radar sample object points (targets) match the expected position of the back of the partner vehicle. It should be appreciated that there may be times when no radar sample targets match the expected position of the back of the partner vehicle as represented by the “no” branch from decision 504. In such cases the radar system controller still propagates the position estimate in time (step 510) so that the position estimate is updated for the next radar sample based on the other information the controller has. Such other information includes the then current estimates and may be further updated based on inputs from other systems (e.g., the GPS or wheel speed sensor) as previously discussed.
  • There are some operational circumstances in which one or more measurements might be expected to be suspect. For example, when a host vehicle is shaken unusually hard, as may occur when a wheel runs over a pothole or encounters other unusual roughness in the road, the radar unit will be shaken accordingly and any radar measurement samples taken at that instant are less likely to be accurate and/or useful to the model. Other sensors such as the wheel speed and inertial measurement sensors are less likely to be accurate at such times as well. In another example, when the lead truck is braking aggressively it is more likely that its trailer will move back and forth more than usual, which again suggests that any radar samples taken during such braking are less likely to be useful for predicting the future position of the back of the trailer. When the controller detects, or is informed, that an event is occurring that makes the measurements of any particular sensor suspect, the measurements from such sensor(s) can safely be ignored in the context of the position estimate. In such circumstances, inputs from other sensors deemed more reliable (if any) may continue to be used to update the position model, and the position estimate may continue to be propagated in time for each subsequent sample. The uncertainty associated with the position estimate can be expected to increase slightly with each ignored sample, which has the effect of increasing the variation from the estimated position of the back of the partner vehicle that would be tolerated when determining whether there is a target that matches the expected position of the back of the partner vehicle.
  • The position model described above is relatively simple in that it utilizes a relatively small set of measured inputs, including (1) the received radar scenes (which show the relative position and relative velocity of detected objects); (2) measured GPS positions of the platoon partners (which can be used to determine their relative positions); (3) measured wheel speeds of the platoon partners (which can be used to determine their relative speeds); and (4) measured yaw rate and orientation. In other embodiments, when different or additional types of sensor information are available to the radar controller, the position model can be adapted to utilize whatever relevant information is available to it in the position estimates. For example, if the pitch or roll of the vehicles is available, the position model can incorporate such measurements into the position estimates. The roll can be useful because on trucks the GPS antennas tend to be located on top of the cabs at locations over 4 meters above the ground (e.g., 14-15 feet). At such heights, even relatively small tilting in the roll direction can cause the reported position of the respective vehicles to vary significantly. The pitch can be useful for similar reasons. For example, with a platooning gap of 15 meters, a difference in pitch of just ±2 degrees can result in a difference of a meter in the apparent or detected height of an object. At further distances and/or larger pitch variations, those differences are amplified. Since many radar units used in platooning systems have relatively narrow views, this can lead to expected objects not being detected, or detected objects being discarded, because they are further from the estimated position than expected when pitch is not considered. Similarly, if longitudinal and/or angular accelerations are available, the position model can incorporate the acceleration measurements into the position estimates.
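The pitch arithmetic above is easy to verify: over a 15 m gap, a pitch difference shifts the apparent height of an object by roughly gap × tan(pitch), so ±2 degrees spans about a meter in total. A small helper (names are illustrative):

```python
# Worked check of the pitch example: over a 15 m gap, a 2 degree pitch
# difference shifts the apparent height by about 0.52 m in each
# direction, so ±2 degrees spans roughly a meter.
import math

def apparent_height_shift(gap_m, pitch_deg):
    return gap_m * math.tan(math.radians(pitch_deg))
```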
  • In embodiments in which the relative positioning and/or speed and/or orientation of the vehicles can relatively accurately be measured using other systems such as LIDAR, sonar, other time of flight distance sensors, sensors configured to receive a signal transmitted from another vehicle, cameras, stereo cameras or other appropriate technologies, those measurements can be incorporated into the position model in addition to, or in place of, the GPS, wheel speed and inertial measurements.
  • In some embodiments, the position model can be considerably more sophisticated, using inputs such as torque requests, braking signals and/or other operational information about the respective platoon partners to further refine the predicted position at the time of the next radar sample.
  • In the primary described embodiment the radar sample object points are compared to the estimated (expected) position and relative speed of the back of the partner vehicle. In other embodiments, more or fewer parameters can be compared to identify a match. For example, in some embodiments matches (or lack thereof) may be based on matching the expected position of the partner vehicle rather than position and speed/velocity. If the radar unit is capable of reliably reporting other information such as acceleration, rates of lateral movement, etc., then such information can also be compared to corresponding estimates as part of the match identification 504.
  • A significant advantage of the described approach is that the relative position and velocity estimates can reliably continue even when the back of the platoon partner is outside the view of the radar unit—as may sometimes be the case when the lead vehicle changes to a different lane, an interloper cuts in between the platooning vehicles, or a transitory fault occurs with the radar unit. With such tracking, radar identification of the platoon partner can more easily be reestablished when the back of the platoon partner comes back into the radar unit's view. As will be appreciated by those familiar with the art, this is very different than adaptive cruise control systems that utilize radar only to track the distance to the vehicle directly in front of the host vehicle—regardless of who that leading vehicle may be.
  • It is noted that the histogram and/or mean shift clusters described above with respect to FIG. 5 can be used as another check to verify that the correct vehicle is being tracked by the radar unit or to provide a reference point when some, but not all of the truck is within the radar unit's field of view.
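  • A minimal sketch of the histogram-based back-of-vehicle detection follows. A plain nearest-populated-bin scan stands in here for the mean shift clustering; the bin width and minimum-share threshold are illustrative assumptions, not values from the patent.

```python
from collections import Counter

def back_of_vehicle_offset(offsets_m, bin_width_m=0.5, min_share=0.10):
    """Estimate the back-of-vehicle longitudinal offset from many samples.

    offsets_m: longitudinal distances (m) of candidate radar points,
               accumulated over multiple radar scene samples.
    Returns the center of the closest bin holding at least min_share of
    all candidates, or None if no bin qualifies.
    """
    if not offsets_m:
        return None
    hist = Counter(int(d // bin_width_m) for d in offsets_m)
    threshold = min_share * len(offsets_m)
    # Scan bins from nearest to farthest; the first well-populated bin
    # is taken to represent the back of the partner vehicle.
    for b in sorted(hist):
        if hist[b] >= threshold:
            return (b + 0.5) * bin_width_m
    return None
```

  The share threshold keeps sparse reflections from underneath or beyond the trailer from being mistaken for the back of the vehicle.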
  • A noteworthy feature of the method described with respect to FIG. 7 is that the same algorithm(s) can be used to estimate the relative position/velocity of the partner vehicle during the initial radar identification of the partner vehicle as described above with respect to FIG. 2. In that situation, the radar tracker 116/600 would not have a good estimate of the position of the back of the partner vehicle. As such, no target would match the expected position of the back of the partner vehicle at decision point 504, so no measured position would be reported to the gap controller and the radar unit's measurements would not be used to update the position and speed estimates—thereby following the “no” branch from decision point 504, which causes steps 506 and 508 to be skipped. However, the other available sensors, including the GPS sensors 131, the wheel speed sensors 132 and the inertial measurement sensors 134, all provide their respective measurements, which together provide a reasonable estimate of the position of the vehicle suitable for use in the initial identification of the partner vehicle.
  • Kalman Filtering
  • The method described with respect to FIG. 7 can be implemented using a variety of techniques. One presently preferred embodiment that works particularly well utilizes Kalman Filtering. As used herein, the phrase Kalman filtering is intended to encompass linear quadratic estimation (LQE) as well as extensions and generalizations of LQE such as the extended Kalman filter and the unscented Kalman filter, which are designed to work with nonlinear systems. As will be understood by those familiar with Kalman filtering in general, Kalman filtering uses a series of measurements observed over time containing noise and other inaccuracies and produces estimates of unknown variables that tend to be more precise than those based on a single measurement alone. The Kalman filter keeps track of the estimated state of the system and the variance or uncertainty of the estimate. This is particularly well suited for estimating the position, speed and other state information related to gap control because of the errors inherent in some of the measurements and the potential unavailability at times of some of the desired measurement samples.
  • The state variables used in the Kalman filter may vary widely with the nature of the model used. One particular state array (X) suitable for use in some of the described embodiments that involve a pair of platooning tractor-trailer trucks includes:
  • (1) the longitudinal position of the center of the rear axles of the front truck relative to the center of the rear axles of the back truck (x);
  • (2) the lateral position of the center of the rear axle of the front truck relative to the center of the rear axles of the back truck (y);
  • (3) the heading of the front truck relative to the heading of the trailing truck (χ);
  • (4) the speed of the lead vehicle (v1); and
  • (5) the speed of the trailing vehicle (v2).
  • This can be represented mathematically as follows:
  • X = [x y χ v1 v2]ᵀ (a 5×1 column vector of state variables (1)-(5))
  • The estimated state at the time of the next radar sample (Xk+1) is a function of the previous state (Xk) and a covariance matrix (Pk) indicative of the level of uncertainty in the measurements. A covariance matrix corresponding to the state array (X) represented above is illustrated in FIG. 8. As will be understood by those familiar with Kalman filtering in general, the estimated state at the time of the next radar sample (Xk+1) is equal to the product of a state transition model (A) and the previous state (Xk) plus the product of a control input model (B) and any modeled inputs (uk). This can be represented mathematically as follows.
  • Xk+1 = A·Xk + B·uk
  • One particular control input array (U) includes:
  • (1) the yaw rate of the front vehicle (ψ1); and
  • (2) the yaw rate of the rear vehicle (ψ2).
  • This can be represented mathematically as follows:
  • U = [ψ1 ψ2]ᵀ (a 2×1 column vector)
  • Although specific state and modeled input arrays are illustrated, it should be appreciated that the specific state and control input variables used in any particular implementation may vary widely based on the nature of the estimation model used.
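  • The prediction step above can be sketched with the specific state and input arrays described; this is a simplified, hedged reading rather than the patent's exact model. The lateral/heading coupling is dropped to keep the transition linear (a real implementation would use an extended or unscented Kalman filter for the nonlinear terms), and the sample period and process-noise values are illustrative assumptions.

```python
import numpy as np

def kf_predict(X, P, A, B, u, Q):
    """One Kalman prediction step: propagate state and covariance."""
    X_pred = A @ X + B @ u
    P_pred = A @ P @ A.T + Q          # uncertainty grows by process noise Q
    return X_pred, P_pred

dt = 0.04                              # e.g. a 25 Hz radar sample period
A = np.eye(5)
A[0, 3], A[0, 4] = dt, -dt             # x += (v1 - v2) * dt
B = np.zeros((5, 2))
B[2, 0], B[2, 1] = dt, -dt             # χ += (ψ1 - ψ2) * dt
Q = np.diag([0.01, 0.01, 0.001, 0.1, 0.1])   # illustrative process noise

X = np.array([20.0, 0.0, 0.0, 25.0, 24.0])   # 20 m gap, lead 1 m/s faster
P = np.eye(5)                                 # illustrative prior covariance
u = np.array([0.0, 0.0])                      # both trucks driving straight
X1, P1 = kf_predict(X, P, A, B, u, Q)
```

  With the lead truck 1 m/s faster, the predicted gap grows by 0.04 m over the step, while every diagonal entry of the covariance grows by its process-noise term.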
  • Kalman filtering is particularly well adapted to making the types of position and velocity estimations useful in the techniques described herein. Although Kalman filtering works particularly well, it should be appreciated that other state/space estimation algorithms, such as Particle Filtering, etc. can be used in alternative embodiments.
  • One of the reasons that Kalman filtering works well is that most of the measurements, including the GPS measurements, the radar measurements, the wheel speed measurements and the inertial measurements tend to be subject to varying measurement errors. For example, it is not uncommon for any particular GPS measurement to be off by more than a meter. The covariance matrix (Pk) quantifies the statistical variation (error) observed in the measurements and utilizes that knowledge to improve the quality of the position and speed estimates.
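  • The way the covariance weights each noisy measurement can be seen most clearly in the scalar case; the sketch below is a standard textbook Kalman update, with the variance values chosen for illustration rather than taken from the patent.

```python
def kf_update_scalar(x_pred, p_pred, z, r):
    """Fuse a scalar prediction (variance p_pred) with a measurement z
    (variance r). Returns the updated estimate and its reduced variance."""
    k = p_pred / (p_pred + r)          # Kalman gain: how much to trust z
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred         # fusing always shrinks the variance
    return x_new, p_new

# A GPS fix that may be off by more than a meter (r = 1.0 m^2) only
# modestly corrects a gap prediction with 0.25 m^2 variance; a tighter
# radar range measurement would pull the estimate much harder.
x, p = kf_update_scalar(20.5, 0.25, 20.0, 1.0)
```

  This is why a quantified covariance matters: each sensor's observed error statistics determine how strongly its samples move the position and speed estimates.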
  • Integrating Other Information into Sensor Data Verification
  • In the embodiments described above, information about the state of the partner vehicle that is received from the partner vehicle is used by the host to help verify or confirm that data from a sensor on the host vehicle that is believed to measure a characteristic of the partner vehicle is actually representative of the partner vehicle. For example, in some of the described embodiments, information from a lead vehicle about its position, speed, orientation etc. is used by a radar scene processor on the trailing vehicle to predict an expected position and speed of the lead vehicle. Those predictions are then used to help determine which (if any) of the detected radar objects correspond to the lead vehicle. The state information received from the lead vehicle may be a measured value (such as a measure wheel speed) or a predicted value (such as a predicted speed) which may be even more reliable in circumstances in which the parameter (e.g., speed) is changing.
  • It should be appreciated that a wide variety of other information/data received from the partner vehicle can additionally or alternatively be used to further help with such verification. This can include other partner vehicle state information such as the partner vehicle's: current torque request; braking status (including the status of the foundation brakes, a retarder, engine braking and/or any other braking device in the context of larger trucks); or steering angle. The information can also include a status indicator such as an indication that a blinker, the hazard lights, the taillights or other lights are on. It can also include qualitative information about the partner vehicle such as its radar signature, or its visual appearance (e.g. its color, an identifying marker, or some other feature or characteristic that can be readily identified by one of the controllers on the host vehicle). It can also include information about an intended or expected action—such as notification that the lead vehicle is about to change lanes, will take the next exit or will turn at the next intersection.
  • In some circumstances, the host vehicle may request that the partner vehicle take specific actions to help with such identification. The nature of such a request may vary widely—for example, the rear truck may request that the lead truck turn on specific lights, switch lanes, accelerate or decelerate to a specific speed, honk its horn, etc.
  • Additionally, it should be appreciated that additional information about the partner vehicle can also be obtained from a third vehicle, a larger mesh of vehicles or from another external source. For example, a third vehicle travelling in parallel with the platoon partners may have measured the position, velocity and/or other characteristics of the partner vehicle, and that information can be used as another independent check. In another example, a network operations center (NOC) in communication with both platoon partners may know the intended route and communicate that route, or more short term directions, to the host vehicle as appropriate. In other circumstances information from the partner vehicle may be transmitted via an intermediary such as a third vehicle, a NOC, etc. Any of this type of data can be useful—and some of the information may be particularly helpful in circumstances in which communications between the vehicles are temporarily lost.
  • Although only a few embodiments of the inventions have been described in detail, it should be appreciated that the inventions may be implemented in many other forms without departing from the spirit or scope of the invention. The inventions have been described primarily in the context of a pair of trucks platooning with a forward facing radar unit being located at the front of the trailing truck. However, it should be appreciated that the same concepts can be applied to any types of vehicles operating in any type of connected vehicle applications, regardless of where the radar unit is located on the vehicle and/or the direction (or directions) that the radar unit(s) interrogates. Thus, for example, a backward facing radar unit on a lead vehicle can be used to identify and/or track following vehicles using radar in substantially the same manner as described. Similarly if omni-directional radar is used, similar approaches can be used to identify and/or track other vehicles using radar regardless of their position relative to the host vehicle.
  • As suggested above, the described radar based vehicle identification and tracking can be used in any type of connected vehicle application in which independent information about the position and/or velocity of one or more other vehicles is known or available to the unit interpreting the radar data. Thus, for example, the described techniques are particularly well suited for use in convoying systems involving more than two vehicles. Also, the described techniques are very well adapted for use in autonomous vehicle traffic flow applications where knowledge about the intentions of other specific vehicles is deemed important. Indeed, this is expected to be an important application of the inventions with the growth of the autonomous and connected vehicle markets.
  • The inventions have been described primarily in the context of identifying and tracking other vehicles using commercially available radar units designed for use in driving automation systems. Such units are typically designed to analyze the received radar energy and identify objects that are believed by the radar manufacturer to be relevant. Although the described inventions work well with such units, they are not so constrained. Rather, both the vehicle identification and vehicle tracking processes are well suited for use with radar units that don't filter the response as much and report the reflected radar signal intensities in a more general way rather than attempting to identify particular objects. In particular, the statistical nature of the radar return binning and the back of vehicle detection are quite well suited for using radar data provided in other forms such as intensity/location. Furthermore, the invention is not limited to distance measurement systems using electromagnetic energy in the frequency range of radar. Rather, it should be appreciated that the same target vehicle identification and/or tracking techniques may readily be used in conjunction with other electromagnetic energy based distance measuring technologies such as LIDAR, which utilize electromagnetic energy in different frequency ranges, sound based distance measurement (e.g., sonar, ultrasound, etc.) or various time of flight based distance measuring systems. The described techniques can also be used in conjunction with distance measuring techniques using cameras or stereo cameras, beacon based technologies in which the sensor measures a beacon signal transmitted from the partner vehicle and/or other technologies.
  • In some implementations, the platooning vehicles may have mechanisms such as transponders suitable for identifying themselves to the radar unit. When available, information from such devices can be used to further assist with the identification and tracking of the platoon partner.
  • Therefore, the present embodiments should be considered illustrative and not restrictive and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (42)

What is claimed is:
1. A method of identifying a position of a back of a first vehicle using radar scenes received from a radar unit on a second vehicle, the method comprising:
a) estimating a position of the first vehicle relative to a second vehicle;
b) receiving a radar scene sample from the radar unit on the second vehicle, the radar scene including a set of zero or more detected radar object points, each radar object point corresponding to a detected object;
c) identifying first vehicle radar point candidates within the set of received detected radar object points;
d) categorizing the first vehicle radar point candidates based on the distance that the detected objects that they represent are from the estimated first vehicle position;
e) repeating steps (a)-(d) a multiplicity of times, whereby the categorized first vehicle radar point candidates include candidates from multiple sequential radar scene samples; and
f) identifying the position of the back of the first vehicle based at least in part on the categorization of the first vehicle radar point candidates.
2. A method as recited in claim 1 further comprising identifying a bounding box around the estimated position of the first vehicle, wherein radar object points within the set of received detected radar object points that are not located within the bounding box are not considered first vehicle radar point candidates.
3. A method as recited in claim 2 wherein the bounding box defines a region that exceeds a maximum expected size of the first vehicle.
4. A method as recited in claim 1 further comprising estimating a speed of the first vehicle relative to the second vehicle, the estimated relative speed having an associated speed uncertainty, wherein radar object points within the set of detected radar object points that correspond to detected objects that are moving at a relative speed that is not within the speed uncertainty of the estimated speed are not considered first vehicle radar point candidates.
5. A method as recited in claim 1 wherein the identified back of the first vehicle or an effective vehicle length that is determined based at least in part on the identified back of the first vehicle is used in the control of the second vehicle.
6. A method as recited in claim 1 wherein steps (a)-(c) are repeated at a sample rate of at least 10 Hertz.
7. A method as recited in claim 1 wherein categorizing the first vehicle radar point candidates includes populating a histogram with the first vehicle radar point candidates, the histogram including a plurality of bins, each bin representing a longitudinal distance range relative to the estimated position of the first vehicle.
8. A method as recited in claim 7 wherein the identification of the back of the first vehicle is only done after the histogram contains at least a predetermined number of first vehicle radar point candidates.
9. A method as recited in claim 7 further comprising applying a clustering algorithm to the first vehicle radar point candidates to identify one or more clusters of first vehicle radar point candidates.
10. A method as recited in claim 9 wherein the clustering algorithm is a modified mean shift algorithm.
11. A method as recited in claim 9 wherein the cluster located closest to the second vehicle is selected to represent the back of the first vehicle.
12. A method as recited in claim 9 wherein the cluster located closest to the second vehicle that includes at least a predetermined threshold percentage or number of first vehicle radar point candidates is selected to represent the back of the first vehicle.
13. A method as recited in claim 12 wherein the predetermined threshold percentage is at least 10% of first vehicle radar point candidates in the histogram.
14. A method as recited in claim 12 wherein the predetermined number of first vehicle radar point candidates is a number that is at least 40.
15. A method as recited in claim 1 further comprising determining an effective length of the first vehicle based at least in part on the identified back of the vehicle.
16. A method as recited in claim 1 wherein Kalman filtering is used to estimate the position of the first vehicle.
17. A method as recited in claim 7 further comprising comparing properties of the histogram or mean shift clusters derived from the histogram to a known set of data representative of a target partner vehicle to verify whether the first vehicle is the target partner vehicle.
18. A method as recited in claim 7 further comprising comparing properties of the histogram or mean shift clusters derived from the histogram to a radar scene received when the back of the first vehicle is not within the radar unit's field of view but a portion of the first vehicle is within the radar unit's field of view to help determine a current relative position of the first vehicle.
19. A method as recited in claim 1 wherein the first and second vehicles are trucks.
20. A method as recited in claim 19 wherein the first vehicle is a tractor-trailer truck.
21. A method of identifying a position of a back of a first vehicle using scenes received from a distance measuring unit on a second vehicle, the method comprising:
a) estimating a position of the first vehicle relative to a second vehicle;
b) receiving a scene sample from the distance measuring unit on the second vehicle, the scene including a set of zero or more detected object points, each object point corresponding to a detected object;
c) identifying first vehicle point candidates within the set of received detected object points;
d) categorizing the first vehicle point candidates based on the distance that the detected objects that they represent are from the estimated first vehicle position;
e) repeating steps (a)-(d) a multiplicity of times, whereby the categorized first vehicle point candidates include candidates from multiple sequential distance measuring unit scene samples; and
f) identifying the position of the back of the first vehicle based at least in part on the categorization of the first vehicle point candidates.
22. A method of tracking a specific lead vehicle using a distance measurement unit mounted on a trailing vehicle, the method comprising:
(a) obtaining a current sample from the distance measurement unit, the current sample including a set of zero or more object points;
(b) obtaining a current estimate of a state of the lead vehicle corresponding to the current sample, wherein the current estimate of the state of the lead vehicle has an associated state uncertainty and does not take into account any information from the current sample;
(c) determining whether any of the object points match the estimated state of the lead vehicle within the state uncertainty; and
(d) when at least one of the object points matches the estimated state of the lead vehicle within the state uncertainty, selecting the matching object point that best matches the estimated state of the lead vehicle as a measured state of the lead vehicle, and using the measured state of the lead vehicle in the determination of a sequentially next estimate of a state of the lead vehicle corresponding to a sequentially next sample; and
repeating steps (a)-(d) a multiplicity of times to thereby track the lead vehicle.
23. A method as recited in claim 22 wherein the current state estimate includes a plurality of state parameters, the state parameters including a position parameter indicative of a position of the lead vehicle relative to the trailing vehicle and a speed parameter indicative of a velocity of the lead vehicle relative to the trailing vehicle.
24. A method as recited in claim 22 further comprising at least partially automatically controlling the trailing vehicle to maintain a desired gap between the lead vehicle and the trailing vehicle and wherein each selected object point has an associated longitudinal distance from the distance measurement unit, and wherein the associated longitudinal distance is treated by a gap controller responsible for maintaining the desired gap as a current measured longitudinal distance from the distance measurement unit to the back of the lead vehicle.
25. A method as recited in claim 22 wherein:
each sample indicates a position of each of the object points; and
each current estimate of the state of the lead vehicle includes a current estimate of the position of the lead vehicle and has an associated position uncertainty;
the selected matching object point must match the estimated position of the lead vehicle within the position uncertainty.
26. A method as recited in claim 25 wherein the current estimate of the position of the lead vehicle estimates the current position of a back of the lead vehicle.
27. A method as recited in claim 25 wherein the estimated position of the lead vehicle is a relative position relative to the trailing vehicle.
28. A method as recited in claim 25 wherein:
each sample also indicates a relative velocity of each of the object points; and
each current estimate of the state of the lead vehicle further includes a current estimate of a relative velocity of the lead vehicle and has an associated velocity uncertainty;
the selected matching object point must both (i) match the estimated position of the lead vehicle within the position uncertainty, and (ii) match the estimated velocity of the lead vehicle within the velocity uncertainty.
29. A method as recited in claim 22 wherein when none of the object points in a particular sample match the estimated state of the lead vehicle within the state uncertainty, then the state uncertainty is increased for the sequentially next estimate of the state of the lead vehicle.
30. A method as recited in claim 29 wherein the estimated state includes a plurality of state parameters, the state parameters including a position parameter, a speed parameter and an orientation parameter.
31. A method as recited in claim 22 further comprising:
periodically receiving global navigation satellite systems (GNSS) position updates based at least in part on detected GNSS positions of the lead and trailing vehicles; and
each time a GNSS position update is received, updating the estimated state of the lead vehicle and the state uncertainty based on such GNSS position update.
32. A method as recited in claim 22 further comprising:
periodically receiving vehicle speed updates based at least in part on detected wheel speeds of the lead and trailing vehicles; and
each time a vehicle speed update is received, updating the estimated state of the lead vehicle and the state uncertainty based on such vehicle speed update.
33. A method as recited in claim 22 wherein steps (a)-(d) are repeated at a sample rate of at least 10 Hertz.
34. A method as recited in claim 22 wherein Kalman filtering is used to estimate the state of the lead vehicle and the associated state uncertainty.
35. A method as recited in claim 22 wherein the estimated state of the lead vehicle includes an estimated position of the back of the lead vehicle and the selected matching object point is considered a measurement of the relative position of the back of the lead vehicle.
36. A method as recited in claim 22 wherein a controller on the trailing vehicle maintains a profile of point clusters representative of the lead vehicle and the selected matching point corresponds to one of the point clusters.
37. A method as recited in claim 22 wherein the lead and trailing vehicles are trucks involved in a platoon.
38. A method as recited in claim 22 wherein the distance measurement unit is a radar unit.
39. A method of tracking a specific lead vehicle using a radar unit mounted on a trailing vehicle, the method comprising:
(a) obtaining a current radar sample from the radar unit, the current radar sample including a set of zero or more radar object points, each radar object point indicating a position of such radar object point relative to the radar unit;
(b) obtaining a current estimate of a state of the lead vehicle corresponding to the current radar sample, wherein the current estimate of the state of the lead vehicle has an associated state uncertainty and includes a current estimate of the position of a back of the lead vehicle relative to the radar unit, wherein the current estimate of the position of the back of lead vehicle has an associated position uncertainty that is at least a part of the state uncertainty;
(c) determining whether any of the radar object points match the estimated state of the lead vehicle within the state uncertainty, wherein to match the estimated state of the lead vehicle within the state uncertainty, a matching radar object point must match the estimated position of the back of the lead vehicle within the position uncertainty; and
(d) when at least one of the radar object points matches the estimated state of the lead vehicle within the state uncertainty, selecting the matching radar object point that best matches the estimated state of the lead vehicle as a measured state of the lead vehicle, and using the measured state of the lead vehicle in the determination of a sequentially next estimate of a state of the lead vehicle corresponding to a sequentially next radar sample;
repeating steps (a)-(d) a multiplicity of times;
periodically receiving vehicle global navigation satellite systems (GNSS) position updates based at least in part on detected GNSS positions of the lead and trailing vehicles;
each time a vehicle GNSS position update is received, updating the estimated state of the lead vehicle and the state uncertainty based on such vehicle GNSS position update;
periodically receiving vehicle speed updates based at least in part on detected wheel speeds of the lead and trailing vehicles; and
each time a vehicle speed update is received, updating the estimated state of the lead vehicle and the state uncertainty based on such vehicle speed update; and
at least partially automatically controlling the trailing vehicle to maintain a desired gap between the lead vehicle and the trailing vehicle based at least in part on an aspect of the measured state of the lead vehicle.
40. A method as recited in claim 39 wherein:
each radar sample also indicates a relative velocity of each of the radar object points; and
each current estimate of the state of the lead vehicle further includes a current estimate of a relative velocity of the lead vehicle and has an associated velocity uncertainty;
the selected matching radar object point must both (i) match the estimated position of the lead vehicle within the position uncertainty, and (ii) match the estimated velocity of the lead vehicle within the velocity uncertainty.
41. A method as recited in claim 39 wherein:
when none of the radar object points in a particular radar sample match the estimated position of the lead vehicle within the position uncertainty, then the position uncertainty is increased for the sequentially next estimate of the position of the lead vehicle; and
when none of the radar object points in a particular radar sample match an estimated velocity of the lead vehicle within an velocity uncertainty, then the velocity uncertainty is increased for the sequentially next estimate of the position of the lead vehicle.
42. A method as recited in claim 39 wherein the estimated state of the lead vehicle includes an estimated position of the back of the lead vehicle and the selected matching radar object point is considered a measurement of the relative position of the back of the lead vehicle.
US15/590,715 2011-07-06 2017-05-09 Gap measurement for vehicle convoying Abandoned US20170242443A1 (en)

Priority Applications (13)

Application Number Priority Date Filing Date Title
US15/590,715 US20170242443A1 (en) 2015-11-02 2017-05-09 Gap measurement for vehicle convoying
CN201780081508.0A CN110418745B (en) 2016-11-02 2017-10-26 Clearance measurement for vehicle convoying
JP2019523642A JP7152395B2 (en) 2016-11-02 2017-10-26 Gap measurement for vehicle platoons
CN202211662662.6A CN116203551A (en) 2016-11-02 2017-10-26 Gap measurement for vehicle navigation
PCT/US2017/058477 WO2018085107A1 (en) 2016-11-02 2017-10-26 Gap measurement for vehicle convoying
EP17867739.9A EP3535171A4 (en) 2016-11-02 2017-10-26 Gap measurement for vehicle convoying
CA3042647A CA3042647C (en) 2016-11-02 2017-10-26 Gap measurement for vehicle convoying
US15/936,271 US10514706B2 (en) 2011-07-06 2018-03-26 Gap measurement for vehicle convoying
US16/184,866 US20190279513A1 (en) 2016-11-02 2018-11-08 Vehicle convoying using satellite navigation and inter-vehicle communication
US16/675,579 US11360485B2 (en) 2011-07-06 2019-11-06 Gap measurement for vehicle convoying
US17/839,464 US12124271B2 (en) 2011-07-06 2022-06-13 Gap measurement for vehicle convoying
JP2022155699A JP7461431B2 (en) 2016-11-02 2022-09-29 Gap measurement for vehicle platoons
JP2024046529A JP2024095700A (en) 2016-11-02 2024-03-22 Gap measurement for vehicle convoying

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562249898P 2015-11-02 2015-11-02
PCT/US2016/060167 WO2017070714A1 (en) 2015-09-15 2016-11-02 Vehicle identification and location using senor fusion and inter-vehicle communication
US15/590,715 US20170242443A1 (en) 2015-11-02 2017-05-09 Gap measurement for vehicle convoying

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
PCT/US2006/000167 Continuation-In-Part WO2006074216A2 (en) 2005-01-04 2006-01-04 Identification of molecular interactions and therapeutic uses thereof
PCT/US2016/060167 Continuation-In-Part WO2017070714A1 (en) 2011-07-06 2016-11-02 Vehicle identification and location using senor fusion and inter-vehicle communication
US15/936,271 Continuation-In-Part US10514706B2 (en) 2011-07-06 2018-03-26 Gap measurement for vehicle convoying

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US15/936,271 Continuation US10514706B2 (en) 2011-07-06 2018-03-26 Gap measurement for vehicle convoying
US16/184,866 Continuation-In-Part US20190279513A1 (en) 2016-11-02 2018-11-08 Vehicle convoying using satellite navigation and inter-vehicle communication

Publications (1)

Publication Number Publication Date
US20170242443A1 true US20170242443A1 (en) 2017-08-24

Family

ID=59629906

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/590,715 Abandoned US20170242443A1 (en) 2011-07-06 2017-05-09 Gap measurement for vehicle convoying
US15/936,271 Active - Reinstated US10514706B2 (en) 2011-07-06 2018-03-26 Gap measurement for vehicle convoying
US16/675,579 Active US11360485B2 (en) 2011-07-06 2019-11-06 Gap measurement for vehicle convoying

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/936,271 Active - Reinstated US10514706B2 (en) 2011-07-06 2018-03-26 Gap measurement for vehicle convoying
US16/675,579 Active US11360485B2 (en) 2011-07-06 2019-11-06 Gap measurement for vehicle convoying

Country Status (1)

Country Link
US (3) US20170242443A1 (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180210462A1 (en) * 2013-03-15 2018-07-26 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigating risks among platooning vehicles
CN108985254A (en) * 2018-08-01 2018-12-11 Shanghai Trunk Technology Co., Ltd. A laser-based tracking method for vehicles with trailers
US10262542B2 (en) 2012-12-28 2019-04-16 General Electric Company Vehicle convoy control system and method
WO2019133470A1 (en) * 2017-12-28 2019-07-04 Bendix Commercial Vehicle Systems Llc Sensor-based anti-hacking prevention in platooning vehicles
US10377383B2 (en) * 2017-12-11 2019-08-13 Ford Global Technologies, Llc Vehicle lane change
WO2019222684A1 (en) * 2018-05-18 2019-11-21 The Charles Stark Draper Laboratory, Inc. Convolved augmented range lidar nominal area
US10514706B2 (en) 2011-07-06 2019-12-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
US10520952B1 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US10520581B2 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Sensor fusion for autonomous or partially autonomous vehicle control
CN110751825A (en) * 2019-10-29 2020-02-04 北京百度网讯科技有限公司 Method, device, equipment and computer readable storage medium for avoiding formation driving
CN110816547A (en) * 2018-08-07 2020-02-21 通用汽车环球科技运作有限责任公司 Perception uncertainty modeling of real perception system for autonomous driving
US20200125117A1 (en) * 2018-10-23 2020-04-23 Peloton Technology, Inc. Systems and methods for platooning and automation safety
WO2020107038A1 (en) * 2018-11-19 2020-05-28 Invensense, Inc. Method and system for positioning using radar and motion sensors
CN111367269A (en) * 2018-12-26 2020-07-03 武汉万集信息技术有限公司 Navigation positioning method, device and system of laser radar
US10717439B2 (en) * 2017-09-15 2020-07-21 Honda Motor Co., Ltd Traveling control system and vehicle control method
US10732645B2 (en) 2011-07-06 2020-08-04 Peloton Technology, Inc. Methods and systems for semi-autonomous vehicular convoys
US10739445B2 (en) 2018-05-23 2020-08-11 The Charles Stark Draper Laboratory, Inc. Parallel photon counting
US10762791B2 (en) 2018-10-29 2020-09-01 Peloton Technology, Inc. Systems and methods for managing communications between vehicles
US10816985B2 (en) * 2018-04-17 2020-10-27 Baidu Usa Llc Method on moving obstacle representation for trajectory planning
US10838054B2 (en) * 2018-10-08 2020-11-17 Aptiv Technologies Limited Detection system and method
US10899323B2 (en) 2018-07-08 2021-01-26 Peloton Technology, Inc. Devices, systems, and methods for vehicle braking
US10922974B2 (en) * 2017-11-24 2021-02-16 Toyota Jidosha Kabushiki Kaisha Vehicle control device
US10955540B2 (en) 2017-12-01 2021-03-23 Aptiv Technologies Limited Detection system
US20210170820A1 (en) * 2019-12-09 2021-06-10 Magna Electronics Inc. Vehicular trailer hitching assist system with coupler height and location estimation
US11092668B2 (en) 2019-02-07 2021-08-17 Aptiv Technologies Limited Trailer detection system and method
US20210269029A1 (en) * 2018-06-26 2021-09-02 Continental Automotive Gmbh Group of Vehicles, Method for Operating the Group of Vehicles, Computer Program and Computer-Readable Storage Medium
US11125575B2 (en) * 2019-11-20 2021-09-21 Here Global B.V. Method and apparatus for estimating a location of a vehicle
US11334092B2 (en) 2011-07-06 2022-05-17 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US11386786B2 (en) * 2018-07-07 2022-07-12 Robert Bosch Gmbh Method for classifying a relevance of an object
US11408995B2 (en) 2020-02-24 2022-08-09 Aptiv Technologies Limited Lateral-bin monitoring for radar target detection
US20220262034A1 (en) * 2019-07-23 2022-08-18 Volkswagen Aktiengesellschaft Generation of Non-Semantic Reference Data for Positioning a Motor Vehicle
US11427196B2 (en) 2019-04-15 2022-08-30 Peloton Technology, Inc. Systems and methods for managing tractor-trailers
EP4063912A1 (en) * 2021-03-17 2022-09-28 Aptiv Technologies Limited Tracking different sections of articulated vehicles
US20220319186A1 (en) * 2019-09-27 2022-10-06 Hitachi Astemo, Ltd. Object Detection Device, Travel Control System, And Travel Control Method
EP4280194A1 (en) * 2022-05-20 2023-11-22 Hyundai Mobis Co., Ltd. Method and apparatus for controlling rear collision warning of vehicle
US12039784B1 (en) * 2021-09-30 2024-07-16 Zoox, Inc. Articulated object determination
US12055650B2 (en) 2019-07-30 2024-08-06 Telefonaktiebolaget Lm Ericsson (Publ) Technique for determining a relative position between vehicles
JP7571716B2 (en) 2021-12-07 2024-10-23 トヨタ自動車株式会社 OBJECT DETERMINATION DEVICE, OBJECT DETERMINATION COMPUTER PROGRAM, AND OBJECT DETERMINATION METHOD

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11835965B2 (en) * 2011-07-06 2023-12-05 Peloton Technology, Inc. Applications for using mass estimations for vehicles
US11100211B2 (en) * 2015-08-26 2021-08-24 Peloton Technology, Inc. Devices, systems, and methods for remote authorization of vehicle platooning
JP6760786B2 (en) * 2016-07-21 2020-09-23 Thk株式会社 Mobile robot and control method
DE102017216867A1 (en) * 2017-09-25 2019-03-28 Robert Bosch Gmbh Method and radar sensor for reducing the influence of interference in the evaluation of at least one received signal
WO2019081921A1 (en) * 2017-10-24 2019-05-02 Bae Systems Plc Positioning at least one vehicle in relation to a set of moving targets
US11205026B2 (en) * 2018-06-25 2021-12-21 Toyota Research Institute, Inc. Benefit apportioning system and methods for vehicle platoons
KR20200083683A (en) * 2018-12-14 2020-07-09 현대자동차주식회사 Vehicle, Server communicating with the vehicle and method for controlling the same
US11023753B2 (en) * 2019-02-27 2021-06-01 Toyota Motor Engineering & Manufacturing North America, Inc. System and method for determining a lane change of a preceding vehicle
US11591757B2 (en) * 2019-04-17 2023-02-28 Caterpillar Paving Products Inc. System and method for machine control
US11166134B2 (en) * 2019-10-25 2021-11-02 Toyota Motor Engineering And Manufacturing North America, Inc. Vehicular communications based on internet communication identifiers associated with codes of vehicles
US20230003872A1 (en) * 2021-06-30 2023-01-05 Zoox, Inc. Tracking objects with radar data

Family Cites Families (418)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3725921A (en) 1970-11-04 1973-04-03 Bendix Corp Traffic responsive speed control system
JPS5141849A (en) 1974-10-04 1976-04-08 Matsushita Electric Ind Co Ltd Power supply device
US4370718A (en) 1979-02-06 1983-01-25 Chasek Norman E Responsive traffic light control system and method based on conservation of aggregate momentum
US4317117A (en) 1979-07-20 1982-02-23 Chasek Norman E Cross correlated doppler radar/infra red velocity and presence sensor
US5295551A (en) 1986-03-06 1994-03-22 Josef Sukonick System for the cooperative driving of two or more vehicles
KR940001633B1 (en) 1990-01-17 1994-02-28 미쯔비시 덴끼 가부시끼가이샤 Following control apparatus for an automotive vehicle
JPH0441940A (en) * 1990-06-06 1992-02-12 Nissan Motor Co Ltd Control device of engine for vehicle
JP2995970B2 (en) 1991-12-18 1999-12-27 トヨタ自動車株式会社 Travel control device for vehicles
US5331561A (en) 1992-04-23 1994-07-19 Alliant Techsystems Inc. Active cross path position correlation device
US6314366B1 (en) 1993-05-14 2001-11-06 Tom S. Farmakis Satellite based collision avoidance system
WO1995027964A1 (en) 1994-04-12 1995-10-19 Qualcomm Incorporated Method and apparatus for freight transportation using a satellite navigation system
US5572449A (en) 1994-05-19 1996-11-05 Vi&T Group, Inc. Automatic vehicle following system
JPH08314540A (en) 1995-03-14 1996-11-29 Toyota Motor Corp Vehicle travel guide system
JP3191621B2 (en) 1995-03-14 2001-07-23 トヨタ自動車株式会社 Vehicle travel guidance system
US7418346B2 (en) 1997-10-22 2008-08-26 Intelligent Technologies International, Inc. Collision avoidance methods and systems
US6720920B2 (en) 1997-10-22 2004-04-13 Intelligent Technologies International Inc. Method and arrangement for communicating between vehicles
US7629899B2 (en) 1997-10-22 2009-12-08 Intelligent Technologies International, Inc. Vehicular communication arrangement and method
US6370475B1 (en) 1997-10-22 2002-04-09 Intelligent Technologies International Inc. Accident avoidance system
US5633456A (en) 1995-08-04 1997-05-27 Chrysler Corporation Engine misfire detection with digital filtering
JP3358403B2 (en) 1995-09-11 2002-12-16 トヨタ自動車株式会社 Platoon running control device
JP3633707B2 (en) 1996-03-08 2005-03-30 日産ディーゼル工業株式会社 Vehicle group running control device
US6125321A (en) 1996-06-07 2000-09-26 Toyota Jidosha Kabushiki Kaisha Motor vehicle drive system controller and automatic drive controller
JP3732292B2 (en) 1996-11-27 2006-01-05 本田技研工業株式会社 Vehicle group running control system
US6043777A (en) 1997-06-10 2000-03-28 Raytheon Aircraft Company Method and apparatus for global positioning system based cooperative location system
US20080147253A1 (en) 1997-10-22 2008-06-19 Intelligent Technologies International, Inc. Vehicular Anticipatory Sensor System
US20080154629A1 (en) 1997-10-22 2008-06-26 Intelligent Technologies International, Inc. Vehicle Speed Control Method and Arrangement
US8965677B2 (en) 1998-10-22 2015-02-24 Intelligent Technologies International, Inc. Intra-vehicle information conveyance system and method
DE19849583B4 (en) 1997-10-27 2009-06-18 Nissan Motor Co., Ltd., Yokohama-shi System and method for controlling a distance between vehicles
US6167357A (en) 1998-04-23 2000-12-26 Cummins Engine Company, Inc. Recursive vehicle mass estimation
JP2000085407A (en) 1998-07-17 2000-03-28 Denso Corp Vehicle-to-vehicle control device and recording medium
US6418370B1 (en) 1998-08-04 2002-07-09 Denso Corporation Apparatus and method for controlling a target distance and a warning distance between traveling vehicles and a recording medium for storing the control method
DE19837380A1 (en) 1998-08-18 2000-02-24 Zahnradfabrik Friedrichshafen Method to determine mass of vehicle; involves obtaining measurements to determine traction force variable and movement variable
EP0982173A2 (en) 1998-08-27 2000-03-01 Eaton Corporation Method for determination of an optimum vehicle cruise separation distance
JP2000113400A (en) 1998-09-30 2000-04-21 Honda Motor Co Ltd Automatic tracking travel system
JP2000330637A (en) 1999-03-16 2000-11-30 Honda Motor Co Ltd Method for detecting obstacle of vehicle
JP2000311291A (en) 1999-04-27 2000-11-07 Honda Motor Co Ltd Convoy travel controller
JP2000322696A (en) 1999-05-07 2000-11-24 Honda Motor Co Ltd In-line travel controller
DE10024739A1 (en) 1999-05-21 2000-12-07 Honda Motor Co Ltd Vehicle convoy travel mode control device transmits request to join existing convoy or break away from convoy to convoy lead vehicle and switches between manual and automatic control modes upon request recognition
JP4082831B2 (en) 1999-10-26 2008-04-30 株式会社小松製作所 Vehicle control device
US6604038B1 (en) 1999-11-09 2003-08-05 Power Talk, Inc. Apparatus, method, and computer program product for establishing a remote data link with a vehicle with minimal data transmission delay
US6510381B2 (en) 2000-02-11 2003-01-21 Thomas L. Grounds Vehicle mounted device and a method for transmitting vehicle position data to a network-based server
US6304211B1 (en) 2000-02-16 2001-10-16 Bertho Boman System and method for measuring distance between two objects using received satellite transmitted data
US7835864B1 (en) 2000-02-20 2010-11-16 Dale F. Oexmann Vehicle proximity detection and control systems
DE10017662A1 (en) 2000-04-08 2001-10-11 Bosch Gmbh Robert Method and device for controlling the distance of a vehicle from a preceding vehicle
US6345603B1 (en) 2000-04-11 2002-02-12 Visteon Global Technologies, Inc. Throttle control for vehicle using redundant throttle signals
US6765495B1 (en) 2000-06-07 2004-07-20 Hrl Laboratories, Llc Inter vehicle communication system
JP2004534204A (en) 2000-11-30 2004-11-11 ピレリ・プネウマティチ・ソチエタ・ペル・アツィオーニ System and method for monitoring tires
JP2002190091A (en) 2000-12-20 2002-07-05 Pioneer Electronic Corp Traveling time setting method and device, method and device for calculating route using it
US6898585B2 (en) 2001-02-02 2005-05-24 University Of Illinois Fuzzy logic method for adaptively evaluating the validity of sensor data
JP3838048B2 (en) 2001-04-16 2006-10-25 日産自動車株式会社 Vehicle travel control device
US6640164B1 (en) 2001-08-28 2003-10-28 Itt Manufacturing Enterprises, Inc. Methods and systems for remote control of self-propelled vehicles
JP4703917B2 (en) 2001-09-10 2011-06-15 コマツレンタル株式会社 Rental system and rental business support method
US6680548B2 (en) 2001-11-20 2004-01-20 Luxon Energy Devices Corporation Electronic timers using supercapacitors
US7016782B2 (en) * 2002-05-30 2006-03-21 Delphi Technologies, Inc. Collision detection system and method of estimating miss distance
US9917773B2 (en) 2008-08-04 2018-03-13 General Electric Company Data communication system and method
US20130317676A1 (en) 2012-05-23 2013-11-28 Jared Klineman Cooper System and method for inspecting a route during movement of a vehicle system over the route
US6963795B2 (en) 2002-07-16 2005-11-08 Honeywell International Inc. Vehicle position keeping system
JP4076071B2 (en) 2002-08-19 2008-04-16 アルパイン株式会社 Communication method and vehicle communication apparatus between moving bodies
US7104617B2 (en) 2002-09-06 2006-09-12 Ford Motor Company Independent braking and controllability control method and system for a vehicle with regenerative braking
US6882923B2 (en) 2002-10-17 2005-04-19 Ford Global Technologies, Llc Adaptive cruise control system using shared vehicle network data
WO2004038335A1 (en) 2002-10-22 2004-05-06 Hitachi, Ltd. Map data delivering method for communication-type navigation system
JP2004217175A (en) 2003-01-17 2004-08-05 Toyota Motor Corp Vehicle-to-vehicle distance control device
WO2004077378A1 (en) 2003-02-25 2004-09-10 Philips Intellectual Property & Standards Gmbh Method and system for leading a plurality of vehicles
US6975246B1 (en) 2003-05-13 2005-12-13 Itt Manufacturing Enterprises, Inc. Collision avoidance using limited range gated video
JP2004348430A (en) 2003-05-22 2004-12-09 Pioneer Electronic Corp Urgent braking alarm in vehicle, transmission device for urgent braking information, server device, and urgent braking alarm system and method
US7660436B2 (en) 2003-06-13 2010-02-09 Sarnoff Corporation Stereo-vision based imminent collision detection
US7110882B2 (en) 2003-07-07 2006-09-19 Robert Bosch Gmbh Method for improving GPS integrity and detecting multipath interference using inertial navigation sensors and a network of mobile receivers
DE10350276A1 (en) 2003-10-28 2005-06-02 Robert Bosch Gmbh Device for fatigue warning in motor vehicles with distance warning system
JP3915776B2 (en) * 2003-12-09 2007-05-16 日産自動車株式会社 Leading vehicle detection device, host vehicle control device, and leading vehicle detection method
US7299130B2 (en) 2003-12-12 2007-11-20 Advanced Ceramic Research, Inc. Unmanned vehicle
US7522995B2 (en) 2004-02-05 2009-04-21 Nortrup Edward H Method and system for providing travel time information
WO2005093372A1 (en) 2004-03-29 2005-10-06 Hitachi, Ltd. Navigation system and course guiding method
US7454962B2 (en) 2004-08-18 2008-11-25 Nissan Diesel Motor Co., Ltd. Fuel consumption evaluation system
WO2006029399A2 (en) 2004-09-09 2006-03-16 Avaya Technology Corp. Methods of and systems for network traffic security
US8903617B2 (en) 2004-10-05 2014-12-02 Vision Works Ip Corporation Absolute acceleration sensor for use within moving vehicles
JP2006131055A (en) 2004-11-04 2006-05-25 Denso Corp Vehicle traveling controlling device
DE602005001841T2 (en) 2005-01-14 2008-04-17 Alcatel Lucent navigation service
US20070060045A1 (en) 2005-02-02 2007-03-15 Prautzsch Frank R System and technique for situational awareness
JP4127403B2 (en) 2005-02-28 2008-07-30 独立行政法人 宇宙航空研究開発機構 Method and apparatus for stabilizing control of vehicle traffic
US7593811B2 (en) 2005-03-31 2009-09-22 Deere & Company Method and system for following a lead vehicle
US8442735B2 (en) 2005-06-15 2013-05-14 Ford Global Technologies, Llc Traction control system and method
US7894982B2 (en) 2005-08-01 2011-02-22 General Motors Llc Method and system for linked vehicle navigation
US20070032245A1 (en) 2005-08-05 2007-02-08 Alapuranen Pertti O Intelligent transportation system and method
US7729857B2 (en) 2005-08-18 2010-06-01 Gm Global Technology Operations, Inc. System for and method of detecting a collision and predicting a vehicle path
US20070233337A1 (en) 2005-09-14 2007-10-04 Plishner Paul J Semi-autonomous guidance system for a vehicle
US7460951B2 (en) 2005-09-26 2008-12-02 Gm Global Technology Operations, Inc. System and method of target tracking using sensor fusion
FI120191B (en) 2005-10-03 2009-07-31 Sandvik Tamrock Oy A method for driving mining vehicles in a mine and a transportation system
US20070083318A1 (en) 2005-10-07 2007-04-12 Parikh Jayendra S Adaptive cruise control using vehicle-to-vehicle wireless communication
JP4720457B2 (en) 2005-11-22 2011-07-13 アイシン・エィ・ダブリュ株式会社 Vehicle driving support method and driving support device
US8000874B2 (en) 2006-03-10 2011-08-16 Nissan Motor Co., Ltd. Vehicle headway maintenance assist system and method
US7876258B2 (en) 2006-03-13 2011-01-25 The Boeing Company Aircraft collision sense and avoidance system and method
US9373149B2 (en) 2006-03-17 2016-06-21 Fatdoor, Inc. Autonomous neighborhood vehicle commerce network and community
US7425903B2 (en) 2006-04-28 2008-09-16 International Business Machines Corporation Dynamic vehicle grid infrastructure to allow vehicles to sense and respond to traffic conditions
FR2901233B1 (en) 2006-05-17 2009-02-27 Eurolum Soc Par Actions Simpli WHEELED VEHICLE, COUPLING METHOD, UNCOUPLING METHOD, VEHICLE MANAGEMENT METHOD AND RESULTING VEHICLE TRAIN
US20080258890A1 (en) 2006-05-22 2008-10-23 Todd Follmer System and Method for Remotely Deactivating a Vehicle
US9067565B2 (en) 2006-05-22 2015-06-30 Inthinc Technology Solutions, Inc. System and method for evaluating driver behavior
JP4579191B2 (en) 2006-06-05 2010-11-10 本田技研工業株式会社 Collision avoidance system, program and method for moving object
WO2007143757A2 (en) 2006-06-09 2007-12-13 Carnegie Mellon University Software architecture for high-speed traversal of prescribed routes
US8947531B2 (en) 2006-06-19 2015-02-03 Oshkosh Corporation Vehicle diagnostics based on information communicated between vehicles
US8139109B2 (en) 2006-06-19 2012-03-20 Oshkosh Corporation Vision system for an autonomous vehicle
WO2008000820A1 (en) 2006-06-30 2008-01-03 Continental Teves Ag & Co. Ohg Method and apparatus for transmitting vehicle-related information in and out of a vehicle
US7554435B2 (en) 2006-09-07 2009-06-30 Nissan Technical Center North America, Inc. Vehicle on-board unit
WO2008043850A1 (en) 2006-10-13 2008-04-17 Continental Teves Ag & Co. Ohg System for reducing the braking distance of a vehicle
US9037388B2 (en) 2006-11-17 2015-05-19 Mccrary Personal Transport System, Llc Intelligent public transit system using dual-mode vehicles
US8532862B2 (en) 2006-11-29 2013-09-10 Ryan A. Neff Driverless vehicle
JP5130079B2 (en) 2007-02-28 2013-01-30 株式会社デンソーアイティーラボラトリ Electronic scanning radar apparatus and receiving array antenna
EP2145288A4 (en) 2007-03-05 2013-09-04 Digitaloptics Corp Europe Ltd Red eye false positive filtering using face location and orientation
ATE426878T1 (en) 2007-03-26 2009-04-15 Bay Zoltan Alkalmazott Kutatas SYSTEM AND METHOD FOR DISTRIBUTING TRAFFIC DATA FROM VEHICLE TO VEHICLE USING RADIO WAVES
US20080249667A1 (en) 2007-04-09 2008-10-09 Microsoft Corporation Learning and reasoning to enhance energy efficiency in transportation systems
WO2009009468A1 (en) 2007-07-06 2009-01-15 University Of Pittsburgh-Of The Commonwealth System Of Higher Education Powered vehicle convoying systems and methods of convoying powered vehicles
KR101136443B1 (en) 2007-07-30 2012-04-19 쇼조 이와모토 Packet traffic control system
DE102007040165A1 (en) 2007-08-21 2009-02-26 Siemens Ag Method for operating a vehicle association, communication equipment, traction vehicle, vehicle and vehicle association
US20090051510A1 (en) 2007-08-21 2009-02-26 Todd Follmer System and Method for Detecting and Reporting Vehicle Damage
WO2009027322A1 (en) 2007-08-27 2009-03-05 Qualcomm Incorporated Gnss receiver with wireless interface
US20090062974A1 (en) 2007-09-03 2009-03-05 Junichi Tamamoto Autonomous Mobile Robot System
DE102007046763A1 (en) 2007-09-28 2009-04-09 Robert Bosch Gmbh Control procedure and system
DE102007058192A1 (en) 2007-12-04 2009-06-10 Continental Teves Ag & Co. Ohg Central control unit for several assistance systems provided in a motor vehicle, and motor vehicle
US20090157461A1 (en) 2007-12-12 2009-06-18 Honeywell International Inc. Vehicle deployment planning system
US8090517B2 (en) 2007-12-19 2012-01-03 Nissan Motor Co., Ltd. Inter-vehicle distance maintenance supporting system and method
GB0802212D0 (en) 2008-02-06 2008-03-12 Meritor Heavy Vehicle Braking A brake system and method
US8285456B2 (en) 2008-02-29 2012-10-09 Caterpillar Inc. System for controlling a multimachine caravan
US8078376B2 (en) 2008-04-28 2011-12-13 General Electric Company System and method for verifying the availability of a level of a braking system in a powered system
DE102008001841A1 (en) 2008-05-19 2009-11-26 Zf Friedrichshafen Ag A method for driving a lock-up clutch in a hydrodynamic torque transmission device
KR101463250B1 (en) 2008-05-26 2014-11-18 주식회사 포스코 Method for platooning of vehicles in an automated vehicle system
FR2931984B1 (en) 2008-06-02 2014-12-12 Airbus France METHOD AND APPARATUS FOR GENERATING A CONTROLLED SPEED FOR AN AIRCRAFT RUNNING ON THE GROUND WITHIN AN AIRCRAFT CONVOY.
US20090326799A1 (en) 2008-06-25 2009-12-31 Expresspass Systems, Inc. Distributed Route Segment Maintenance and Hierarchical Routing Based on Physical Vehicle Criteria
WO2010004911A1 (en) 2008-07-10 2010-01-14 三菱電機株式会社 Train-of-vehicle travel support device
JP2010030525A (en) 2008-07-30 2010-02-12 Toyota Motor Corp Travel support device
US8116921B2 (en) 2008-08-20 2012-02-14 Autonomous Solutions, Inc. Follower vehicle control system and method for forward and reverse convoy movement
EP2159779B1 (en) 2008-08-27 2013-01-16 Saab Ab Using image sensor and tracking filter time-to-go to avoid mid-air collisions
US8195358B2 (en) 2008-09-11 2012-06-05 Deere & Company Multi-vehicle high integrity perception
US20100082179A1 (en) 2008-09-29 2010-04-01 David Kronenberg Methods for Linking Motor Vehicles to Reduce Aerodynamic Drag and Improve Fuel Economy
US8126642B2 (en) 2008-10-24 2012-02-28 Gray & Company, Inc. Control and systems for autonomously driven vehicles
FR2939113B1 (en) 2008-12-01 2012-07-06 Thomas Petalas SYSTEM FOR AUTOMATICALLY PLACING LOADS
EP2390857B1 (en) 2009-01-20 2013-07-03 Toyota Jidosha Kabushiki Kaisha Row-running control system and vehicle
KR100957137B1 (en) 2009-02-26 2010-05-11 한국과학기술원 System and method for controlling group driving
JP5041099B2 (en) 2009-02-27 2012-10-03 トヨタ自動車株式会社 Vehicle relative position estimation device and vehicle relative position estimation method
US20100228427A1 (en) 2009-03-05 2010-09-09 Massachusetts Institute Of Technology Predictive semi-autonomous vehicle navigation system
JP4939564B2 (en) 2009-03-23 2012-05-30 本田技研工業株式会社 Vehicle information providing device
US8224551B2 (en) 2009-03-24 2012-07-17 Bendix Commercial Vehicle Systems Llc ACC extended mode operation
US8352112B2 (en) 2009-04-06 2013-01-08 GM Global Technology Operations LLC Autonomous vehicle management
US8380362B2 (en) * 2009-07-10 2013-02-19 The Boeing Company Systems and methods for remotely collaborative vehicles
EP2876621A1 (en) 2009-07-28 2015-05-27 Toyota Jidosha Kabushiki Kaisha Vehicle control device, vehicle control method, and vehicle control system
US8397063B2 (en) 2009-10-07 2013-03-12 Telcordia Technologies, Inc. Method for a public-key infrastructure for vehicular networks with limited number of infrastructure servers
US9457810B2 (en) 2009-10-21 2016-10-04 Berthold K. P. Horn Method and apparatus for reducing motor vehicle traffic flow instabilities and increasing vehicle throughput
US8744661B2 (en) 2009-10-21 2014-06-03 Berthold K. P. Horn Method and apparatus for reducing motor vehicle traffic flow instabilities and increasing vehicle throughput
US8903574B2 (en) 2009-10-22 2014-12-02 General Electric Company System and method for vehicle communication, vehicle control, and/or route inspection
US8738238B2 (en) 2009-11-12 2014-05-27 Deere & Company Coordination of vehicle movement in a field
US9355548B2 (en) 2009-12-03 2016-05-31 Bi Incorporated Systems and methods for contact avoidance
WO2011082948A2 (en) 2009-12-14 2011-07-14 Continental Automotive Gmbh Method for communicating between a first motor vehicle and at least one second motor vehicle
JP5503961B2 (en) 2009-12-25 2014-05-28 株式会社デンソーアイティーラボラトリ Observation signal processor
US8951272B2 (en) 2010-02-11 2015-02-10 Ethicon Endo-Surgery, Inc. Seal arrangements for ultrasonically powered surgical instruments
JP2011171983A (en) 2010-02-18 2011-09-01 Sony Corp Apparatus and method for processing information, and computer-readable recording medium
US8618922B2 (en) 2010-03-30 2013-12-31 GM Global Technology Operations LLC Method and system for ensuring operation of limited-ability autonomous driving vehicles
CN102548821B (en) 2010-04-07 2016-01-20 丰田自动车株式会社 Vehicle travel support device
JP5585177B2 (en) 2010-04-12 2014-09-10 トヨタ自動車株式会社 Leading vehicle position determination device
EP2390744B1 (en) 2010-05-31 2012-11-14 Volvo Car Corporation Control system for travel in a platoon
US9096228B2 (en) 2010-06-23 2015-08-04 Continental Teves Ag & Co. Ohg Method and system for accelerated object recognition and/or accelerated object attribute recognition and use of said method
AT11551U3 (en) 2010-07-12 2011-05-15 Avl List Gmbh Method for testing a vehicle and a test vehicle with an active secondary vehicle
EP2593807B1 (en) 2010-07-16 2017-07-12 Continental Teves AG & Co. oHG Method and system for validating a vehicle-to-x message and use of the method
JP5573461B2 (en) 2010-07-27 2014-08-20 トヨタ自動車株式会社 Vehicle control system
JP5494332B2 (en) 2010-07-27 2014-05-14 トヨタ自動車株式会社 Vehicle control system
US9129523B2 (en) * 2013-05-22 2015-09-08 Jaybridge Robotics, Inc. Method and system for obstacle detection for vehicles using planar sensor data
JP5293699B2 (en) 2010-08-11 2013-09-18 トヨタ自動車株式会社 Vehicle control device
JP5510173B2 (en) 2010-08-11 2014-06-04 トヨタ自動車株式会社 Vehicle control device
US8992391B2 (en) 2010-09-17 2015-03-31 Seastrom Manufacturing Co., Inc. Sizing fit cycle
US8793036B2 (en) 2010-09-22 2014-07-29 The Boeing Company Trackless transit system with adaptive vehicles
DE102010042048B4 (en) 2010-10-06 2020-11-12 Robert Bosch Gmbh Device and method for assisting a driver of a motor vehicle in a driving maneuver
US8717192B2 (en) 2010-10-08 2014-05-06 Navteq B.V. Method and system for using intersecting electronic horizons
US20120109421A1 (en) 2010-11-03 2012-05-03 Kenneth Scarola Traffic congestion reduction system
US20160194014A1 (en) 2010-11-17 2016-07-07 General Electric Company Vehicular data communication systems
US20120139495A1 (en) 2010-12-06 2012-06-07 Coda Automotive, Inc. Electrochemical cell balancing circuits and methods
US8510012B2 (en) 2010-12-22 2013-08-13 Bendix Commercial Vehicle Systems Llc Anti-tailgating system and method
US8624764B2 (en) 2011-02-11 2014-01-07 Analog Devices, Inc. Test circuits and methods for redundant electronic systems
FR2972066B1 (en) 2011-02-28 2022-06-17 Eurocopter France METHOD FOR OPTIMIZING THE LOADING OF CARRIAGES INTO A VEHICLE
DE102012203182A1 (en) 2011-03-01 2012-09-06 Continental Teves Ag & Co. Ohg Safety device for a motor vehicle and method for operating a motor vehicle
US8887022B2 (en) 2011-03-04 2014-11-11 Infineon Technologies Austria Ag Reliable data transmission with reduced bit error rate
TWI421177B (en) 2011-03-18 2014-01-01 Ind Tech Res Inst Methods and systems of saving energy control
DE102011002275B4 (en) 2011-04-27 2023-08-31 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Method for predicting the driving behavior of a vehicle driving ahead
WO2018039114A1 (en) 2016-08-22 2018-03-01 Peloton Technology, Inc. Systems for vehicular platooning and methods therefor
US10520581B2 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Sensor fusion for autonomous or partially autonomous vehicle control
WO2013006826A2 (en) 2011-07-06 2013-01-10 Peloton Technology Inc. Systems and methods for semi-autonomous vehicular convoying
US9645579B2 (en) 2011-07-06 2017-05-09 Peloton Technology, Inc. Vehicle platooning systems and methods
US20170242443A1 (en) 2015-11-02 2017-08-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
US9582006B2 (en) 2011-07-06 2017-02-28 Peloton Technology, Inc. Systems and methods for semi-autonomous convoying of vehicles
US10254764B2 (en) 2016-05-31 2019-04-09 Peloton Technology, Inc. Platoon controller state machine
US10520952B1 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US20130018766A1 (en) 2011-07-12 2013-01-17 Edwin Roy Christman Minimalist approach to roadway electrification
JP5565385B2 (en) 2011-07-16 2014-08-06 株式会社デンソー VEHICLE WIRELESS COMMUNICATION DEVICE AND COMMUNICATION SYSTEM
JP5533810B2 (en) 2011-07-23 2014-06-25 株式会社デンソー Follow-up control device
US9014915B2 (en) 2011-07-25 2015-04-21 GM Global Technology Operations LLC Active safety control for vehicles
US9165470B2 (en) 2011-07-25 2015-10-20 GM Global Technology Operations LLC Autonomous convoying technique for vehicles
JP5760835B2 (en) 2011-08-10 2015-08-12 株式会社デンソー Driving support device and driving support system
US8554468B1 (en) 2011-08-12 2013-10-08 Brian Lee Bullock Systems and methods for driver performance assessment and improvement
CN103781685B (en) 2011-08-25 2016-08-24 日产自动车株式会社 The autonomous drive-control system of vehicle
JP2013073360A (en) 2011-09-27 2013-04-22 Denso Corp Platoon driving device
JP5472248B2 (en) 2011-09-27 2014-04-16 株式会社デンソー Convoy travel device
JP5440579B2 (en) 2011-09-27 2014-03-12 株式会社デンソー Convoy travel device
US9163948B2 (en) 2011-11-17 2015-10-20 Speedgauge, Inc. Position accuracy testing system
CN103946906A (en) 2011-11-21 2014-07-23 丰田自动车株式会社 Vehicle identification device
DE102012221264A1 (en) * 2011-11-21 2013-05-23 Continental Teves Ag & Co. Ohg Method and device for determining the position of objects by means of communication signals and use of the device
US8798887B2 (en) 2011-11-30 2014-08-05 GM Global Technology Operations LLC System and method for estimating the mass of a vehicle
US9771070B2 (en) 2011-12-09 2017-09-26 GM Global Technology Operations LLC Method and system for controlling a host vehicle
US8649962B2 (en) 2011-12-19 2014-02-11 International Business Machines Corporation Planning a route for a convoy of automobiles
US9187118B2 (en) 2011-12-30 2015-11-17 C & P Technologies, Inc. Method and apparatus for automobile accident reduction using localized dynamic swarming
US9276621B2 (en) 2012-01-16 2016-03-01 Maxim Integrated Products, Inc. Method and apparatus for differential communications
DE102012002695B4 (en) 2012-02-14 2024-08-01 Zf Cv Systems Hannover Gmbh Procedure for determining an emergency braking situation of a vehicle
US8620517B2 (en) 2012-02-21 2013-12-31 Toyota Motor Engineering & Manufacturing North America, Inc. Vehicular platooning using distributed receding horizon control
EP2637072B1 (en) 2012-03-05 2017-10-11 Volvo Car Corporation Path following of a target vehicle
US8457827B1 (en) 2012-03-15 2013-06-04 Google Inc. Modifying behavior of autonomous vehicle based on predicted behavior of other vehicles
US8880272B1 (en) 2012-03-16 2014-11-04 Google Inc. Approach for estimating the geometry of roads and lanes by using vehicle trajectories
SE537447C2 (en) 2012-03-27 2015-05-05 Scania Cv Ab Device and method for improving fuel utilization when driving a vehicle
US20130278441A1 (en) 2012-04-24 2013-10-24 Zetta Research and Development, LLC - ForC Series Vehicle proxying
SE1250443A1 (en) 2012-05-03 2013-11-04 Scania Cv Ab Support system and method for forming vehicle trains
US9669828B2 (en) 2012-06-01 2017-06-06 Toyota Motor Engineering & Manufacturing North America, Inc. Cooperative driving and collision avoidance by distributed receding horizon control
GB201210059D0 (en) 2012-06-07 2012-07-25 Jaguar Cars Powertrain control system and method
SE536549C2 (en) 2012-06-14 2014-02-11 Scania Cv Ab System and method for assisting a vehicle in overtaking a vehicle train
US9026367B2 (en) 2012-06-27 2015-05-05 Microsoft Technology Licensing, Llc Dynamic destination navigation system
US8948995B2 (en) 2012-06-28 2015-02-03 Toyota Motor Engineering & Manufacturing North America, Inc. Preceding vehicle state prediction
EP2685338B1 (en) 2012-07-12 2018-04-11 Volvo Car Corporation Apparatus and method for lateral control of a host vehicle during travel in a vehicle platoon
US9969081B2 (en) 2012-07-27 2018-05-15 Alberto Daniel Lacaze Method and system for the directed control of robotic assets
US9989637B2 (en) * 2012-08-03 2018-06-05 Safie Holdings LLC Portable collision warning apparatus
US9308932B2 (en) * 2012-08-09 2016-04-12 Steering Solutions Ip Holding Corporation System for providing steering assist torque based on a proportional gain value
US10737665B2 (en) 2012-08-28 2020-08-11 Ford Global Technologies, Llc Vehicle braking based on external object communications
JP5949366B2 (en) 2012-09-13 2016-07-06 トヨタ自動車株式会社 Road traffic control method, road traffic control system and in-vehicle terminal
DE102012216386A1 (en) 2012-09-14 2014-03-20 Robert Bosch Gmbh Method for operating a driver assistance system of a vehicle
SE537958C2 (en) 2012-09-24 2015-12-08 Scania Cv Ab Procedure, measuring device and control unit for adapting vehicle train control
US9221396B1 (en) * 2012-09-27 2015-12-29 Google Inc. Cross-validating sensors of an autonomous vehicle
JP5668741B2 (en) 2012-10-04 2015-02-12 株式会社デンソー Convoy travel device
JP6015329B2 (en) 2012-10-11 2016-10-26 株式会社デンソー Convoy travel system and convoy travel device
SE1251163A1 (en) 2012-10-15 2014-04-16 Scania Cv Ab System and method in connection with the occurrence of vehicle trains
WO2014060117A1 (en) 2012-10-17 2014-04-24 Toll Collect Gmbh Method and devices for collecting a traffic-related toll fee
US20140129075A1 (en) 2012-11-05 2014-05-08 Dennis M. Carleton Vehicle Control Using Modeled Swarming Behavior
JP5682610B2 (en) 2012-11-07 2015-03-11 トヨタ自動車株式会社 In-vehicle communication device, in-vehicle communication system, and communication method
US9633565B2 (en) 2012-11-15 2017-04-25 GM Global Technology Operations LLC Active safety system and method for operating the same
US8983714B2 (en) 2012-11-16 2015-03-17 Robert Bosch Gmbh Failsafe communication system and method
US9275549B2 (en) 2012-11-28 2016-03-01 Denso Corporation Vehicle pseudo-travel locus generator
US9776587B2 (en) 2012-11-29 2017-10-03 Here Global B.V. Method and apparatus for causing a change in an action of a vehicle for safety
US9026282B2 (en) 2012-11-30 2015-05-05 Electro-Motive Diesel, Inc. Two-tiered hierarchically distributed locomotive control system
US8914225B2 (en) 2012-12-04 2014-12-16 International Business Machines Corporation Managing vehicles on a road network
SE1251407A1 (en) 2012-12-12 2014-06-13 Scania Cv Ab Device and method for evaluation of forward speed including vehicle train formation
US10501087B2 (en) 2012-12-19 2019-12-10 Volvo Truck Corporation Method and arrangement for determining the speed behaviour of a leading vehicle
JP5765326B2 (en) 2012-12-19 2015-08-19 株式会社デンソー Inter-vehicle communication device and platooning control device
US9659492B2 (en) 2013-01-11 2017-05-23 Here Global B.V. Real-time vehicle spacing control
JP6082984B2 (en) 2013-01-25 2017-02-22 株式会社アドヴィックス Vehicle mass estimation device
US9367065B2 (en) 2013-01-25 2016-06-14 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
US9412271B2 (en) 2013-01-30 2016-08-09 Wavetronix Llc Traffic flow through an intersection by reducing platoon interference
US9102406B2 (en) 2013-02-15 2015-08-11 Disney Enterprises, Inc. Controlling unmanned aerial vehicles as a flock to synchronize flight in aerial displays
KR101500060B1 (en) 2013-02-26 2015-03-06 현대자동차주식회사 Vehicle apparatus and system for controlling group driving, and method for selecting preceding vehicle using the same
WO2014133425A1 (en) 2013-02-27 2014-09-04 Volvo Truck Corporation System and method for determining an aerodynamically favorable position between ground traveling vehicles
SE537259C2 (en) 2013-03-06 2015-03-17 Scania Cv Ab Device and procedure for increased road safety in vehicle trains
SE537446C2 (en) 2013-03-06 2015-05-05 Scania Cv Ab Device and method of communication of vehicle trains
US9740178B2 (en) 2013-03-14 2017-08-22 GM Global Technology Operations LLC Primary controller designation in fault tolerant systems
US11294396B2 (en) 2013-03-15 2022-04-05 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US20180210463A1 (en) 2013-03-15 2018-07-26 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
CN104380349A (en) 2013-04-15 2015-02-25 弗莱克斯电子有限责任公司 Vehicle intruder alert detection and indication
US20140310277A1 (en) 2013-04-15 2014-10-16 Flextronics Ap, Llc Suspending user profile modification based on user context
US20140309836A1 (en) 2013-04-16 2014-10-16 Neya Systems, Llc Position Estimation and Vehicle Control in Autonomous Multi-Vehicle Convoys
JP5737316B2 (en) 2013-04-17 2015-06-17 株式会社デンソー Convoy travel system
JP5817777B2 (en) 2013-04-17 2015-11-18 株式会社デンソー Convoy travel system
DE102013207113A1 (en) 2013-04-19 2014-10-23 Continental Teves Ag & Co. Ohg A method and system for avoiding a launching of a follower vehicle to an immediate preceding vehicle and use of the system
EP2799902A1 (en) 2013-04-30 2014-11-05 Baselabs GmbH Method and apparatus for the tracking of multiple objects
US9254846B2 (en) 2013-05-03 2016-02-09 Google Inc. Predictive reasoning for controlling speed of a vehicle
US9233697B2 (en) 2013-05-24 2016-01-12 General Electric Company Method and system for controlling a vehicle system factoring mass attributable to weather
FR3006088B1 (en) 2013-05-27 2015-11-27 Renault Sas DEVICE FOR ESTIMATING THE PERMANENT MODE OF OPERATION OF A MOTOR VEHICLE AND ASSOCIATED METHOD
US20150025917A1 (en) 2013-07-15 2015-01-22 Advanced Insurance Products & Services, Inc. System and method for determining an underwriting risk, risk score, or price of insurance using cognitive information
US9454150B2 (en) 2013-07-17 2016-09-27 Toyota Motor Engineering & Manufacturing North America, Inc. Interactive automated driving system
WO2015025047A1 (en) 2013-08-22 2015-02-26 Continental Teves Ag & Co. Ohg Car2x receiver filtering based on a receiving corridor in a geographic coordinate system
US9371002B2 (en) 2013-08-28 2016-06-21 Vision Works Ip Corporation Absolute acceleration sensor for use within moving vehicles
CA2918506C (en) 2013-09-05 2022-03-22 Crown Equipment Corporation Dynamic operator behavior analyzer
SE537598C2 (en) 2013-09-30 2015-07-14 Scania Cv Ab Method and system for organizing vehicle trains
SE537618C2 (en) 2013-09-30 2015-08-04 Scania Cv Ab Method and system for common driving strategy for vehicle trains
SE537578C2 (en) 2013-09-30 2015-06-30 Scania Cv Ab Control unit and method for controlling a vehicle in a vehicle train
SE537466C2 (en) 2013-09-30 2015-05-12 Scania Cv Ab System and method for regulating a vehicle train with two different driving strategies
SE537603C2 (en) 2013-09-30 2015-07-21 Scania Cv Ab Method and system for handling obstacles for vehicle trains
SE537985C2 (en) 2013-09-30 2016-01-12 Scania Cv Ab System and method for regulating vehicle trains with a joint position-based driving strategy
SE537482C2 (en) 2013-09-30 2015-05-12 Scania Cv Ab Method and system for common driving strategy for vehicle trains
SE537469C2 (en) 2013-09-30 2015-05-12 Scania Cv Ab A system and method for correcting map data and vehicle train position data
KR101909917B1 (en) 2013-10-07 2018-10-19 한국전자통신연구원 Apparatus and method for controlling autonomous vehicle platooning
MY161720A (en) 2013-10-11 2017-05-15 Nissan Motor Travel control device and travel control method
US9141112B1 (en) 2013-10-16 2015-09-22 Allstate Insurance Company Caravan management
US9174672B2 (en) * 2013-10-28 2015-11-03 GM Global Technology Operations LLC Path planning for evasive steering maneuver in presence of target vehicle and surrounding objects
US9792822B2 (en) 2013-11-08 2017-10-17 Honda Motor Co., Ltd. Convoy travel control apparatus
US9146561B2 (en) 2013-12-03 2015-09-29 King Fahd University Of Petroleum And Minerals Robotic leader-follower navigation and fleet management control method
JP6042794B2 (en) 2013-12-03 2016-12-14 本田技研工業株式会社 Vehicle control method
EP2881926B1 (en) 2013-12-04 2021-08-04 Volvo Car Corporation Method and control system for controlling movement of a group of road vehicles
US20150161894A1 (en) 2013-12-05 2015-06-11 Elwha Llc Systems and methods for reporting characteristics of automatic-driving software
GB2521134A (en) 2013-12-10 2015-06-17 Ibm Route Monitoring
US9406177B2 (en) 2013-12-20 2016-08-02 Ford Global Technologies, Llc Fault handling in an autonomous vehicle
US9355423B1 (en) 2014-01-24 2016-05-31 Allstate Insurance Company Reward system related to a vehicle-to-vehicle communication system
JP2017512347A (en) 2014-01-30 2017-05-18 ウニヴェルシダージ ド ポルトUniversidade Do Porto Apparatus and method for self-automated parking for autonomous vehicles based on vehicle networking
US9079587B1 (en) * 2014-02-14 2015-07-14 Ford Global Technologies, Llc Autonomous control in a dense vehicle environment
US9511764B2 (en) 2014-02-28 2016-12-06 Ford Global Technologies, Llc Semi-autonomous mode control
US9731732B2 (en) 2014-03-09 2017-08-15 General Electric Company Systems and methods for vehicle control
SE538458C2 (en) 2014-04-08 2016-07-12 Scania Cv Ab Method, apparatus and system comprising the apparatus for supporting the creation of vehicle trains
US9304515B2 (en) 2014-04-24 2016-04-05 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Regional operation modes for autonomous vehicles
US20150334371A1 (en) 2014-05-19 2015-11-19 Rockwell Automation Technologies, Inc. Optical safety monitoring with selective pixel array analysis
US10185998B1 (en) 2014-05-20 2019-01-22 State Farm Mutual Automobile Insurance Company Accident fault determination for autonomous vehicles
WO2015179705A1 (en) 2014-05-21 2015-11-26 Quantum Fuel Systems Technologies Worldwide, Inc. Enhanced compliance verification system
CA2950136A1 (en) 2014-05-23 2015-12-23 Rochester Institute Of Technology Method for optimizing asset value based on driver acceleration and braking behavior
JP6285303B2 (en) 2014-07-11 2018-02-28 株式会社デンソー Vehicle control device
US9744970B2 (en) 2014-07-14 2017-08-29 Ford Global Technologies, Llc Estimating a trailer road grade
US9182764B1 (en) 2014-08-04 2015-11-10 Cummins, Inc. Apparatus and method for grouping vehicles for cooperative driving
US10510256B2 (en) 2014-10-20 2019-12-17 Robert Brandriff Vehicle collision avoidance system and method
EP3210090B1 (en) 2014-10-21 2020-10-14 Road Trains LLC Platooning control via accurate synchronization
FR3029662A1 (en) 2014-12-03 2016-06-10 Inst Nat Des Sciences Appliquees (Insa) SIMULATION SYSTEM, DEVICES, METHODS AND PROGRAMS THEREFOR.
CN113654561A (en) 2014-12-05 2021-11-16 苹果公司 Autonomous navigation system
EP3227876A1 (en) 2014-12-05 2017-10-11 Audi AG Method for coordinating movements of vehicles forming a platoon
US9702705B2 (en) 2014-12-24 2017-07-11 International Business Machines Corporation Crowd-assisted micro-navigation
US11182870B2 (en) 2014-12-24 2021-11-23 Mcafee, Llc System and method for collective and collaborative navigation by a group of individuals
US9384666B1 (en) 2015-02-01 2016-07-05 Thomas Danaher Harvey Methods to operate autonomous vehicles to pilot vehicles in groups or convoys
US11107031B2 (en) 2015-02-18 2021-08-31 Ryder Integrated Logistics, Inc. Vehicle fleet control systems and methods
WO2016134770A1 (en) 2015-02-26 2016-09-01 Volvo Truck Corporation Method of controlling inter-vehicle gap(s) in a platoon
CN104717071B (en) 2015-02-28 2018-01-05 深圳先进技术研究院 Road train data authentication method for authenticating and car-mounted terminal
JP6122893B2 (en) 2015-03-19 2017-04-26 ヤフー株式会社 Navigation device, method and program
US20170083844A1 (en) 2015-04-01 2017-03-23 Earthwave Technologies, Inc. Systems, methods, and apparatus for efficient vehicle fleet tracking, deployment, and management
WO2016182489A1 (en) 2015-05-11 2016-11-17 Scania Cv Ab Device, system and method for a platooning operation
US10031522B2 (en) 2015-05-27 2018-07-24 Dov Moran Alerting predicted accidents between driverless cars
WO2016189495A1 (en) 2015-05-27 2016-12-01 Van Dyke, Marc Alerting predicted accidents between driverless cars
US9841762B2 (en) 2015-05-27 2017-12-12 Comigo Ltd. Alerting predicted accidents between driverless cars
US9884631B2 (en) 2015-06-04 2018-02-06 Toyota Motor Engineering & Manufacturing North America, Inc. Transitioning between operational modes of an autonomous vehicle
US20160362048A1 (en) 2015-06-09 2016-12-15 AWARE 360 Ltd. Self-monitoring of vehicles in a convoy
US10091670B2 (en) 2015-06-12 2018-10-02 Denso International America, Inc. System and measurement method for a dedicated short-range communication on-vehicle coverage system
DE102015211451A1 (en) 2015-06-22 2017-01-05 Volkswagen Aktiengesellschaft Method for manipulation protection of user data packets to be transmitted via a bus system between system components
US9944134B2 (en) 2015-06-24 2018-04-17 GM Global Technology Operations LLC Integrated sensing unit and method for determining vehicle wheel speed and tire pressure
US9449258B1 (en) 2015-07-02 2016-09-20 Agt International Gmbh Multi-camera vehicle identification system
US10115314B2 (en) 2015-07-08 2018-10-30 Magna Electronics Inc. Lane change system for platoon of vehicles
WO2017019595A1 (en) 2015-07-27 2017-02-02 Genghiscomm Holdings, LLC Airborne relays in cooperative-mimo systems
US9869560B2 (en) 2015-07-31 2018-01-16 International Business Machines Corporation Self-driving vehicle's response to a proximate emergency vehicle
US9618347B2 (en) 2015-08-03 2017-04-11 Nissan North America, Inc. Projecting vehicle transportation network information representing an intersection
US11100211B2 (en) 2015-08-26 2021-08-24 Peloton Technology, Inc. Devices, systems, and methods for remote authorization of vehicle platooning
CA2996546A1 (en) 2015-08-26 2017-03-02 Peloton Technology, Inc. Devices, systems and methods for vehicle monitoring and platooning
US9915051B2 (en) 2015-09-01 2018-03-13 Bahman Niroumand Mandrel for forming an aggregate pier, and aggregate pier compacting system and method
CA3004051A1 (en) 2015-09-15 2017-04-27 Peloton Technology, Inc. Vehicle identification and location using sensor fusion and inter-vehicle communication
JP6377169B2 (en) 2015-09-16 2018-08-22 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd System and method for estimating UAV position
EP3351025B1 (en) 2015-09-17 2020-04-08 Telefonaktiebolaget LM Ericsson (publ) Communication device, first radio node, second radio node, and methods therein, for determining whether to allow a first vehicle to overtake a vehicle platoon
US9751532B2 (en) 2015-10-27 2017-09-05 International Business Machines Corporation Controlling spacing of self-driving vehicles based on social network relationships
US9823166B2 (en) 2015-11-04 2017-11-21 Ford Global Technologies, Llc Coordinated testing in vehicle platoons
US20170132299A1 (en) 2015-11-10 2017-05-11 Caterpillar Inc. System and method for managing data associated with worksite
US10007271B2 (en) 2015-12-11 2018-06-26 Avishtech, Llc Autonomous vehicle towing system and method
US9802620B2 (en) 2015-12-18 2017-10-31 GM Global Technology Operations LLC Position error estimate and implementation for autonomous driving
US10204519B2 (en) 2015-12-25 2019-02-12 Ozyegin Universitesi Communication between vehicles of a platoon
US20170197615A1 (en) 2016-01-11 2017-07-13 Ford Global Technologies, Llc System and method for reverse perpendicular parking a vehicle
CN105667387B (en) 2016-01-13 2017-12-26 京东方科技集团股份有限公司 Vehicle communications device and method, vehicle
US9632507B1 (en) 2016-01-29 2017-04-25 Meritor Wabco Vehicle Control Systems System and method for adjusting vehicle platoon distances based on predicted external perturbations
US10244538B2 (en) 2016-02-12 2019-03-26 Futurewei Technologies, Inc. System and method for determining a resource selection technique
CN105835882A (en) 2016-03-01 2016-08-10 乐视云计算有限公司 Automatic vehicle traveling method and device
SE539788C2 (en) 2016-03-22 2017-11-28 Scania Cv Ab A method and a system for controlling platooning operation when a vehicle is to leave the platoon
US9852554B2 (en) 2016-03-31 2017-12-26 Harman International Industries, Incorporated Systems and methods for vehicle-to-vehicle communication
US10542464B2 (en) 2016-04-01 2020-01-21 Futurewei Technologies, Inc. Methods for data communication to a platoon of connected vehicles
US10353387B2 (en) 2016-04-12 2019-07-16 Here Global B.V. Method, apparatus and computer program product for grouping vehicles into a platoon
CN108885826B (en) 2016-04-15 2021-10-15 本田技研工业株式会社 Vehicle control system, vehicle control method, and storage medium
US9964948B2 (en) 2016-04-20 2018-05-08 The Florida International University Board Of Trustees Remote control and concierge service for an autonomous transit vehicle fleet
US9776638B1 (en) 2016-04-20 2017-10-03 GM Global Technology Operations LLC Remote interrogation and override for automated driving system
SE540620C2 (en) 2016-04-22 2018-10-02 Scania Cv Ab Method and system for determining risks for vehicles about leaving a platoon
SE540619C2 (en) 2016-04-22 2018-10-02 Scania Cv Ab Method and system for adapting platooning operation according to the behavior of other road users
US20170309187A1 (en) 2016-04-22 2017-10-26 Hsin-Nan Lin Vehicle platoon system and method thereof
US10430745B2 (en) 2016-05-03 2019-10-01 Azuga, Inc. Method and apparatus for evaluating driver performance and determining driver rewards
NL2016753B1 (en) 2016-05-10 2017-11-16 Daf Trucks Nv Platooning method for application in heavy trucks
US9927816B2 (en) 2016-05-13 2018-03-27 Macau University Of Science And Technology System and method for operating a follower vehicle in a vehicle platoon
WO2017200433A1 (en) 2016-05-17 2017-11-23 Telefonaktiebolaget Lm Ericsson (Publ) Methods, platoon controller and vehicle controller, for enabling a decision to join a vehicle platoon
SE539923C2 (en) 2016-05-23 2018-01-16 Scania Cv Ab Methods and communicators for transferring a soft identity reference from a first vehicle to a second vehicle in a platoon
JP6565793B2 (en) 2016-05-30 2019-08-28 株式会社デンソー Convoy travel system
WO2017209666A1 (en) 2016-05-31 2017-12-07 Telefonaktiebolaget Lm Ericsson (Publ) Method and platoon manager for enabling a wireless device in a vehicle to communicate over a cellular network
GB2540039A (en) 2016-06-03 2017-01-04 Daimler Ag Method for operating a vehicle as a following vehicle in a platoon
US10017179B2 (en) 2016-06-06 2018-07-10 GM Global Technology Operations LLC Method for optimizing inter-vehicle distance and equitably sharing fuel benefits in a vehicle platoon
US20170349058A1 (en) 2016-06-07 2017-12-07 Briggs & Stratton Corporation Fleet management system for outdoor power equipment
US9878657B2 (en) 2016-06-15 2018-01-30 Denso International America, Inc. Projected laser lines/graphics onto the road for indicating truck platooning/warning to other drivers of presence of truck platoon
US10013877B2 (en) 2016-06-20 2018-07-03 Toyota Jidosha Kabushiki Kaisha Traffic obstruction notification system based on wireless vehicle data
US10332403B2 (en) 2017-01-04 2019-06-25 Honda Motor Co., Ltd. System and method for vehicle congestion estimation
US10625742B2 (en) 2016-06-23 2020-04-21 Honda Motor Co., Ltd. System and method for vehicle control in tailgating situations
US10027024B2 (en) 2016-06-29 2018-07-17 Denso International America, Inc. Antenna for vehicle platooning
WO2018000386A1 (en) 2016-06-30 2018-01-04 华为技术有限公司 Method for controlling vehicle platoon driving, centralized control device, and vehicle
US11107018B2 (en) 2016-07-15 2021-08-31 Cummins Inc. Method and apparatus for platooning of vehicles
US10209708B2 (en) 2016-07-28 2019-02-19 Lytx, Inc. Determining driver engagement with autonomous vehicle
US10068485B2 (en) 2016-08-15 2018-09-04 Ford Global Technologies, Llc Platooning autonomous vehicle navigation sensory exchange
WO2018035145A1 (en) 2016-08-19 2018-02-22 Pcms Holdings, Inc. Method for changing the lead vehicle of a vehicle platoon
US10369998B2 (en) 2016-08-22 2019-08-06 Peloton Technology, Inc. Dynamic gap control for automated driving
WO2018043520A1 (en) 2016-08-30 2018-03-08 ナブテスコオートモーティブ株式会社 Vehicle and vehicle platoon
WO2018043519A1 (en) 2016-08-31 2018-03-08 ナブテスコオートモーティブ株式会社 Brake system, vehicle, and vehicle platoon
JP6748213B2 (en) 2016-09-05 2020-08-26 ナブテスコオートモーティブ株式会社 Platooning management system
US10849081B2 (en) 2016-09-16 2020-11-24 Qualcomm Incorporated Synchronized radio transmission for vehicle platooning
US9928746B1 (en) 2016-09-16 2018-03-27 Ford Global Technologies, Llc Vehicle-to-vehicle cooperation to marshal traffic
US10089882B2 (en) 2016-09-21 2018-10-02 Wabco Europe Bvba Method for controlling an own vehicle to participate in a platoon
DE102016011325A1 (en) 2016-09-21 2018-03-22 Wabco Gmbh A method for determining a dynamic vehicle distance between a follower vehicle and a front vehicle of a platoon
US10551504B2 (en) 2016-10-04 2020-02-04 Trimble Inc. Method and system for sharing convergence data
US9940840B1 (en) 2016-10-06 2018-04-10 X Development Llc Smart platooning of vehicles
US10040390B2 (en) 2016-10-11 2018-08-07 Ford Global Technologies, Llc Vehicle light diagnostic
US20180113448A1 (en) 2016-10-24 2018-04-26 Ford Global Technologies, Llc Vehicle energy reduction
US10399564B2 (en) 2016-10-25 2019-09-03 Ford Global Technologies, Llc Vehicle roundabout management
DE102016012868A1 (en) 2016-10-28 2018-05-03 Man Truck & Bus Ag Technology for the longitudinal control of a commercial vehicle in a vehicle association
EP3316062B1 (en) 2016-10-31 2019-09-04 Nxp B.V. Platoon control
CN110418745B (en) 2016-11-02 2023-01-13 佩路通科技股份有限公司 Clearance measurement for vehicle convoying
US20190279513A1 (en) 2016-11-02 2019-09-12 Peloton Technology, Inc. Vehicle convoying using satellite navigation and inter-vehicle communication
US10474163B2 (en) 2016-11-24 2019-11-12 Lg Electronics Inc. Vehicle control device mounted on vehicle and method for controlling the vehicle
US9616743B1 (en) 2016-11-30 2017-04-11 Borgwarner Inc. Cooling air flow system
US11293765B2 (en) 2016-12-08 2022-04-05 Pcms Holdings, Inc. System and method for routing and reorganization of a vehicle platoon in a smart city
SE541478C2 (en) 2016-12-16 2019-10-15 Scania Cv Ab Method and control unit for adjusting an inter vehicular distance between vehicles in a platoon
US10503176B2 (en) 2016-12-30 2019-12-10 Bendix Commercial Vehicle Systems Llc Self-ordering of fleet vehicles in a platoon
US10073464B2 (en) 2016-12-30 2018-09-11 Bendix Commercial Vehicle Systems Llc Varying the distance between vehicles in a platoon
US10482767B2 (en) 2016-12-30 2019-11-19 Bendix Commercial Vehicle Systems Llc Detection of extra-platoon vehicle intermediate or adjacent to platoon member vehicles
US10372123B2 (en) 2016-12-30 2019-08-06 Bendix Commercial Vehicle Systems Llc “V” shaped and wide platoon formations
JPWO2018135630A1 (en) 2017-01-20 2019-11-07 ナブテスコオートモーティブ株式会社 Convoy travel vehicle group and convoy travel method
US10281926B2 (en) 2017-01-23 2019-05-07 Bendix Commercial Vehicle Systems Llc Apparatus and method for controlling vehicles in a platoon
US20210132630A1 (en) 2017-01-24 2021-05-06 Volvo Truck Corporation A method for at least one string of platooning vehicles
GB2551248A (en) 2017-04-12 2017-12-13 Daimler Ag Method for operating a vehicle in a platoon
ES2974322T3 (en) 2017-05-04 2024-06-26 Koninklijke Philips Nv Intragroup communication
WO2018217219A1 (en) 2017-05-22 2018-11-29 Peloton Technology, Inc. Transceiver antenna system for platooning
US10017039B1 (en) 2017-07-20 2018-07-10 Bendix Commercial Vehicle Systems Llc Vehicle platooning with a hybrid electric vehicle system
US10880293B2 (en) 2017-08-25 2020-12-29 Ford Global Technologies, Llc Authentication of vehicle-to-vehicle communications
US20190073908A1 (en) 2017-09-05 2019-03-07 Ford Global Technologies, Llc Cooperative vehicle operation
KR102350092B1 (en) 2017-11-13 2022-01-12 현대자동차주식회사 Apparatus for controlling cluster driving of vehicle and method thereof
KR102383436B1 (en) 2017-12-01 2022-04-07 현대자동차주식회사 Platooning control apparatus and method
KR102463717B1 (en) 2017-12-12 2022-11-07 현대자동차주식회사 Apparatus for controlling braking force of platooning vehicle, system having the same and method thereof
KR20190070001A (en) 2017-12-12 2019-06-20 현대자동차주식회사 Apparatus for controlling platooning based-on avoid actively, system having the same and method thereof
US10739787B2 (en) 2018-01-12 2020-08-11 Toyota Motor Engineering & Manufacturing North America, Inc. Responsibilities and agreement acceptance for vehicle platooning
EP3911547A4 (en) 2018-07-07 2022-10-26 Peloton Technology, Inc. Control of automated following in vehicle convoys
KR20190129020A (en) 2019-10-29 2019-11-19 엘지전자 주식회사 Method and apparatus for controlling vehicle to prevent accident

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10514706B2 (en) 2011-07-06 2019-12-24 Peloton Technology, Inc. Gap measurement for vehicle convoying
US11334092B2 (en) 2011-07-06 2022-05-17 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US11360485B2 (en) 2011-07-06 2022-06-14 Peloton Technology, Inc. Gap measurement for vehicle convoying
US10732645B2 (en) 2011-07-06 2020-08-04 Peloton Technology, Inc. Methods and systems for semi-autonomous vehicular convoys
US10520581B2 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Sensor fusion for autonomous or partially autonomous vehicle control
US10520952B1 (en) 2011-07-06 2019-12-31 Peloton Technology, Inc. Devices, systems, and methods for transmitting vehicle data
US10916146B2 (en) 2012-12-28 2021-02-09 Transportation Ip Holdings, Llc Vehicle convoy control system and method
US10262542B2 (en) 2012-12-28 2019-04-16 General Electric Company Vehicle convoy control system and method
US11294396B2 (en) * 2013-03-15 2022-04-05 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US20180210462A1 (en) * 2013-03-15 2018-07-26 Peloton Technology, Inc. System and method for implementing pre-cognition braking and/or avoiding or mitigation risks among platooning vehicles
US10717439B2 (en) * 2017-09-15 2020-07-21 Honda Motor Co., Ltd Traveling control system and vehicle control method
US10922974B2 (en) * 2017-11-24 2021-02-16 Toyota Jidosha Kabushiki Kaisha Vehicle control device
US10955540B2 (en) 2017-12-01 2021-03-23 Aptiv Technologies Limited Detection system
US11474224B2 (en) 2017-12-01 2022-10-18 Aptiv Technologies Limited Detection system
US10377383B2 (en) * 2017-12-11 2019-08-13 Ford Global Technologies, Llc Vehicle lane change
WO2019133470A1 (en) * 2017-12-28 2019-07-04 Bendix Commercial Vehicle Systems Llc Sensor-based anti-hacking prevention in platooning vehicles
US10921823B2 (en) * 2017-12-28 2021-02-16 Bendix Commercial Vehicle Systems Llc Sensor-based anti-hacking prevention in platooning vehicles
US10816985B2 (en) * 2018-04-17 2020-10-27 Baidu Usa Llc Method on moving obstacle representation for trajectory planning
WO2019222684A1 (en) * 2018-05-18 2019-11-21 The Charles Stark Draper Laboratory, Inc. Convolved augmented range lidar nominal area
US10739445B2 (en) 2018-05-23 2020-08-11 The Charles Stark Draper Laboratory, Inc. Parallel photon counting
US20210269029A1 (en) * 2018-06-26 2021-09-02 Continental Automotive Gmbh Group of Vehicles, Method for Operating the Group of Vehicles, Computer Program and Computer-Readable Storage Medium
US11386786B2 (en) * 2018-07-07 2022-07-12 Robert Bosch Gmbh Method for classifying a relevance of an object
US10899323B2 (en) 2018-07-08 2021-01-26 Peloton Technology, Inc. Devices, systems, and methods for vehicle braking
CN108985254A (en) * 2018-08-01 2018-12-11 上海主线科技有限公司 Laser-based tracking method for a vehicle with trailer
CN110816547A (en) * 2018-08-07 2020-02-21 通用汽车环球科技运作有限责任公司 Perception uncertainty modeling of real perception system for autonomous driving
US10838054B2 (en) * 2018-10-08 2020-11-17 Aptiv Technologies Limited Detection system and method
US11435466B2 (en) * 2018-10-08 2022-09-06 Aptiv Technologies Limited Detection system and method
US11768284B2 (en) 2018-10-08 2023-09-26 Aptiv Technologies Limited Detection system and method
US20200125117A1 (en) * 2018-10-23 2020-04-23 Peloton Technology, Inc. Systems and methods for platooning and automation safety
US11341856B2 (en) 2018-10-29 2022-05-24 Peloton Technology, Inc. Systems and methods for managing communications between vehicles
US10762791B2 (en) 2018-10-29 2020-09-01 Peloton Technology, Inc. Systems and methods for managing communications between vehicles
WO2020107038A1 (en) * 2018-11-19 2020-05-28 Invensense, Inc. Method and system for positioning using radar and motion sensors
CN111367269A (en) * 2018-12-26 2020-07-03 武汉万集信息技术有限公司 Navigation positioning method, device and system of laser radar
US11092668B2 (en) 2019-02-07 2021-08-17 Aptiv Technologies Limited Trailer detection system and method
US11427196B2 (en) 2019-04-15 2022-08-30 Peloton Technology, Inc. Systems and methods for managing tractor-trailers
US20220262034A1 (en) * 2019-07-23 2022-08-18 Volkswagen Aktiengesellschaft Generation of Non-Semantic Reference Data for Positioning a Motor Vehicle
US12055650B2 (en) 2019-07-30 2024-08-06 Telefonaktiebolaget Lm Ericsson (Publ) Technique for determining a relative position between vehicles
US20220319186A1 (en) * 2019-09-27 2022-10-06 Hitachi Astemo, Ltd. Object Detection Device, Travel Control System, And Travel Control Method
US12033395B2 (en) * 2019-09-27 2024-07-09 Hitachi Astemo, Ltd. Object detection device, travel control system, and travel control method
CN110751825A (en) * 2019-10-29 2020-02-04 北京百度网讯科技有限公司 Method, device, equipment and computer readable storage medium for avoiding formation driving
US11125575B2 (en) * 2019-11-20 2021-09-21 Here Global B.V. Method and apparatus for estimating a location of a vehicle
US11656088B2 (en) 2019-11-20 2023-05-23 Here Global B.V. Method and apparatus for estimating a location of a vehicle
US20210170820A1 (en) * 2019-12-09 2021-06-10 Magna Electronics Inc. Vehicular trailer hitching assist system with coupler height and location estimation
US11679635B2 (en) * 2019-12-09 2023-06-20 Magna Electronics Inc. Vehicular trailer hitching assist system with coupler height and location estimation
US11802961B2 (en) 2020-02-24 2023-10-31 Aptiv Technologies Limited Lateral-bin monitoring for radar target detection
US11408995B2 (en) 2020-02-24 2022-08-09 Aptiv Technologies Limited Lateral-bin monitoring for radar target detection
EP4063912A1 (en) * 2021-03-17 2022-09-28 Aptiv Technologies Limited Tracking different sections of articulated vehicles
US12039784B1 (en) * 2021-09-30 2024-07-16 Zoox, Inc. Articulated object determination
JP7571716B2 (en) 2021-12-07 2024-10-23 トヨタ自動車株式会社 Object determination device, object determination computer program, and object determination method
EP4280194A1 (en) * 2022-05-20 2023-11-22 Hyundai Mobis Co., Ltd. Method and apparatus for controlling rear collision warning of vehicle

Also Published As

Publication number Publication date
US10514706B2 (en) 2019-12-24
US11360485B2 (en) 2022-06-14
US20200073400A1 (en) 2020-03-05
US20180217610A1 (en) 2018-08-02

Similar Documents

Publication Publication Date Title
US11360485B2 (en) Gap measurement for vehicle convoying
US10520581B2 (en) Sensor fusion for autonomous or partially autonomous vehicle control
CA3042647C (en) Gap measurement for vehicle convoying
US11002849B2 (en) Driving lane detection device and driving lane detection method
US12037015B2 (en) Vehicle control device and vehicle control method
US9830814B2 (en) System and method for transmitting detected object attributes over a dedicated short range communication system
US9836964B2 (en) Vehicle identification system and vehicle identification device
EP3018027A1 (en) Control arrangement arranged to control an autonomous vehicle, autonomous drive arrangement, vehicle and method
CN110033609B (en) Vehicle fleet control device
US20130325284A1 (en) Traffic congestion detection apparatus and vehicle control apparatus
US12124271B2 (en) Gap measurement for vehicle convoying
KR20180078978A (en) Apparatus and method for controlling speed in cacc system
US11932248B2 (en) Vehicle control apparatus
US20190385444A1 (en) Vehicle control system, data processing apparatus, and vehicle control method
US20200098203A1 (en) Information processing system and server
US11794787B2 (en) Vehicle assist feature control
US11572731B2 (en) Vehicle window control
CN110929475A (en) Annotation of radar profiles of objects
US11555919B2 (en) Radar calibration system
JP2023064792A (en) Vehicle travel control processing system
WO2024144436A1 (en) Device and method for detecting objects
Maranga et al. Short paper: Inter-vehicular distance improvement using position information in a collaborative adaptive cruise control system
Sarkar et al. Final Technical Report-Fuel-Efficient Platooning in Mixed Traffic Highway Environments
KR20240137872A (en) Vehicle positioning apparatus and method
CN115113293A (en) Multispectral object detection using thermal imaging

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: PELOTON TECHNOLOGY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SCHUH, AUSTIN B.;ERLIEN, STEPHEN M.;PLEINES, STEPHAN;AND OTHERS;SIGNING DATES FROM 20170607 TO 20170608;REEL/FRAME:042879/0032

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION