US20130259302A1 - Method of tracking objects - Google Patents
- Publication number
- US20130259302A1 (application US 13/527,429)
- Authority
- US
- United States
- Prior art keywords
- tracking object
- feature points
- tracking
- area
- separation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/26—Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
Definitions
- Besides creating a separation template image of the non-tracking object in a separation area, it is still possible to create a separation template image of a separation area of the tracking object in the state of proximity.
- a separation template image of a separation area of the tracking object and a separation template image of a separation area of the non-tracking object may also be created respectively.
- at least one separation template image is taken as the target for comparing the feature points fetched in the state of overlap.
- in the following discussion, the state of the tracking object and the non-tracking object is proximity, and the creation of a separation template image of a separation area of the non-tracking object is taken as an example. How to fetch feature points and compare them is discussed in detail below.
- the computer may fetch all feature points of the overlapping object formed by the tracking object and the non-tracking object when the state changes from proximity to overlap.
- the feature point fetch can be performed using a local binary pattern (LBP), a scale-invariant feature transform (SIFT), speeded-up robust features (SURF), or any other vertex detection algorithm.
- LBP is adopted by the invention for performing the feature point fetch in step S13.
- the sum of squared differences (SSD) is employed to perform block matching for feature point matching in step S14.
- the system fetches feature points from the rectangular area of the overlapping object when the tracking object is in the overlap state. All fetched feature points are matched by block matching against the separation template created from the non-tracking object in the state of proximity. As a result, the best matched location of each fetched feature point in the separation template can be computed by block matching. Block matching is a well-known method for similarity comparison. The sum of squared differences (SSD) is employed to perform the block matching in the following example:
- a plurality of (e.g., n) feature points are fetched from the current image I_t.
- each feature point f_i(x, y), 1 ≤ i ≤ n, is mapped to a location on the matched image I* recorded by the separation template, and a displacement vector (u, v) is given thereto.
- a feature point f_i is classified based on the predetermined threshold T_D, where D_{f_i} denotes its matching error score.
- the corresponding feature point is a feature point of the non-tracking object when D_{f_i} is less than the threshold T_D.
- the corresponding feature point is a feature point of the tracking object when D_{f_i} is greater than the threshold T_D.
- a separation template image of a separation area of the tracking object is created in the state of proximity.
- a matching comparison is conducted in the state of overlap.
- the corresponding feature point is a feature point of the tracking object when D_{f_i} is less than the threshold T_D.
- the corresponding feature point is a feature point of the non-tracking object when D_{f_i} is greater than the threshold T_D.
- a separation template image of a separation area of each of the tracking object and the non-tracking object is created in the state of proximity.
- a matching comparison is conducted in the state of overlap.
- two matching error scores are obtained and each is compared with the threshold.
- the two matching error scores are compared with each other. As a result, the matching comparison accuracy is greatly increased.
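- A minimal sketch of the SSD block matching and classification of steps S13 to S15 is given below. The patch size, the exhaustive search over displacements in the separation template, and the threshold value are assumptions for illustration; the patent's actual patch size and search strategy are not specified in the text.

```python
def ssd(patch_a, patch_b):
    # Sum of squared differences between two equal-sized grayscale patches.
    return sum((a - b) ** 2
               for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def patch(img, x, y, k):
    # (2k+1) x (2k+1) patch centred at (x, y); img is a list of pixel rows.
    return [row[x - k:x + k + 1] for row in img[y - k:y + k + 1]]

def matching_error(template, img, fx, fy, k=1):
    # Matching error score D_{f_i}: smallest SSD over all displacements of
    # the feature-point patch within the separation template image.
    p = patch(img, fx, fy, k)
    h, w = len(template), len(template[0])
    return min(ssd(p, patch(template, x, y, k))
               for y in range(k, h - k)
               for x in range(k, w - k))

def classify(template, img, feature_pts, t_d, k=1):
    # Steps S14-S15: D_{f_i} < T_D attributes the point to the object the
    # template was built from (here the non-tracking object); otherwise the
    # point is attributed to the tracking object.
    return {(fx, fy): ("non-tracking"
                       if matching_error(template, img, fx, fy, k) < t_d
                       else "tracking")
            for fx, fy in feature_pts}
```

A point whose neighbourhood reappears in the template scores near zero and is assigned to the template's object; a point with no good match anywhere in the template is assigned to the other object.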
- a flow chart illustrates a method of a second preferred embodiment of the invention.
- the second preferred embodiment aims at successfully tracking a tracking object when the tracking object and a non-tracking object having similar features are in the state of overlap.
- the method of the second preferred embodiment comprises the following steps:
- Step S50: After matching and comparing, it is determined that all feature points of the tracking object belong to the first-type feature points.
- Step S51: After matching and comparing, it is determined that all feature points of the non-tracking object belong to the second-type feature points.
- Step S52: Binary image values of the second-type feature points are deleted. For example, binary values of the second-type feature points are changed from one to zero so that the binary structure of the non-tracking object can be destroyed.
- Step S53: A distance transform algorithm is performed to obtain binary distances of all of the first-type feature points.
- Step S54: The location of the maximum binary distance among all binary distances is taken as the center of the tracking object.
- the method of the second preferred embodiment is applied in hand positioning as detailed below.
- a binary image is transformed into a distance image by executing a distance transform algorithm.
- Value of the distance image represents a shortest distance from a point within the object to an edge (e.g., top, bottom, left and right) thereof. Hence, the value of a point proximate the edge is less than that of a point away from the edge.
- The hand and, for example, the face may overlap in the state of overlap.
- a maximum distance can be found by executing the distance transform algorithm, and it may occur on the face because the skin color of the face is similar to that of the hand.
- both the face and the hand are taken as image points of interest in binary image processing. Unfortunately, this often finds an incorrect position of the hand.
- Step S52 states that binary image values of the second-type feature points are deleted. For example, binary values of the second-type feature points are changed from one to zero so that the binary structure of the face (or any other non-tracking object) can be destroyed. Thus, edge data of the face can also be obtained. After the distance transform, the maximum value may represent the hand position rather than the face.
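- The hand-positioning example of steps S52 to S54 can be sketched as follows. A 4-neighbour breadth-first distance transform stands in for the patent's unspecified distance transform algorithm, and the blob layout in the test below (a large "face" blob and a small "hand" blob) is illustrative only.

```python
from collections import deque

def distance_transform(binary):
    # 4-neighbour distance from each foreground (1) pixel to the nearest
    # background (0) pixel, computed by multi-source BFS.
    h, w = len(binary), len(binary[0])
    INF = float("inf")
    dist = [[0 if binary[y][x] == 0 else INF for x in range(w)]
            for y in range(h)]
    q = deque((x, y) for y in range(h) for x in range(w) if binary[y][x] == 0)
    while q:
        x, y = q.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and dist[ny][nx] == INF:
                dist[ny][nx] = dist[y][x] + 1
                q.append((nx, ny))
    return dist

def hand_center(binary, face_points):
    # Step S52: delete the second-type (face) feature points so the face's
    # binary structure is destroyed, then (steps S53-S54) take the location
    # of the maximum distance value as the centre of the hand.
    img = [row[:] for row in binary]
    for x, y in face_points:
        img[y][x] = 0
    dist = distance_transform(img)
    h, w = len(img), len(img[0])
    return max(((x, y) for y in range(h) for x in range(w)),
               key=lambda p: dist[p[1]][p[0]])
```

Without the deletion the maximum distance falls inside the larger face blob; after zeroing the face's points it falls at the hand's centre.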
- the method of the invention can not only determine the state of a tracking object and a non-tracking object but also find the center of the tracking object when they are overlapped. The matching may also be employed conversely, by searching the image of the current overlapping object for the corresponding matched point of each detected feature point of the separation template image.
- the separation template image consists of both the tracking object image and the non-tracking object image when they are in the state of proximity.
- N_C is the number of the feature points belonging to the non-tracking object (i.e., the second-type feature points)
- N_O is the number of the feature points belonging to the tracking object (i.e., the first-type feature points).
- a flow chart illustrates a method of a third preferred embodiment of the invention.
- the method of the third preferred embodiment comprises the following steps:
- Step S60: After matching and comparing, it is determined that all feature points of the tracking object belong to the first-type feature points.
- Step S61: After matching and comparing, it is determined that all feature points of the non-tracking object belong to the second-type feature points.
- Step S62: The first-type feature points and the second-type feature points are compared with each other to obtain a decrease ratio of each of the first-type feature points and the second-type feature points.
- Step S63: It is determined that the non-tracking object overlaps the tracking object if the decrease ratio of the first-type feature points is greater than that of the second-type feature points. Otherwise, it is determined that the tracking object overlaps the non-tracking object.
- the overlapping relationship of the tracking object and the non-tracking object can be determined based on K_C and K_O in the state of overlap.
- the decrease ratio of the first-type feature points in the overlapping rectangle is smaller than that of the second-type feature points when the tracking object is extending over and partly covering the non-tracking object.
- Let K_O and K_C be the numbers of the first-type feature points and the second-type feature points appearing in the overlapping object image, respectively. The ratio K of K_C and K_O is defined below.
- K increases as K_O increases and K_C decreases, in which case it is determined that the tracking object is overlapping the non-tracking object.
- the first-type feature points of the overlapping objects gradually disappear due to the gradual overlapping of the tracking object by the non-tracking object.
- K_O decreases and K_C increases.
- Decrease ratio of the first-type feature points is greater than decrease ratio of the second-type feature points.
- the computer may set a fixed overlap coefficient T_K.
- the state of overlapping of an object can be determined by K. It is determined that the tracking object overlaps the non-tracking object if K is greater than T_K. Conversely, it is determined that the non-tracking object overlaps the tracking object if K is less than T_K.
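- The overlap-direction decision of steps S62 and S63 and the coefficient T_K can be sketched as follows. Defining K = K_O / K_C is an assumption (the published expression for K is not reproduced in the text), though it is consistent with the statement that K increases as K_O increases and K_C decreases; the default T_K value is illustrative.

```python
def decrease_ratio(n_before, n_after):
    # Fraction of one type's feature points that disappeared after overlap.
    return (n_before - n_after) / n_before if n_before else 0.0

def who_overlaps(first_before, first_after, second_before, second_after):
    # Step S63: the non-tracking object overlaps the tracking object if the
    # first-type (tracking) points decreased by a larger ratio; otherwise
    # the tracking object overlaps the non-tracking object.
    d_first = decrease_ratio(first_before, first_after)
    d_second = decrease_ratio(second_before, second_after)
    return ("non-tracking over tracking" if d_first > d_second
            else "tracking over non-tracking")

def overlap_by_k(k_o, k_c, t_k=1.0):
    # K = K_O / K_C is an ASSUMPTION, consistent with the description that
    # K increases as K_O increases and K_C decreases.
    k = k_o / k_c if k_c else float("inf")
    return ("tracking over non-tracking" if k > t_k
            else "non-tracking over tracking")
```

The first function implements the decrease-ratio comparison of step S63 directly; the second realizes the same decision through the coefficient T_K.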
- the method of object tracking of the invention can be summarized as follows. The relative location of the tracking object and the non-tracking object is determined. Next, at least one separation template image is created when the two objects are proximate each other. Next, the separation template image can be employed as a basis for feature point matching when the two objects overlap. As a result, tracking failure due to overlap can be avoided. Further, object tracking efficiency and effectiveness can be greatly increased.
Description
- 1. Field of the Invention
- The invention relates to tracking objects and more particularly to a method of tracking a tracking object and a non-tracking object so that the tracking object can be identified even when overlap occurs.
- 2. Description of Related Art
- Conventionally, in object tracking, an original image is converted into a binary image having either a value of 1 (i.e., image of interest) or a value of 0 (i.e., image of don't care) based on features such as the skin color of an object, the view distance, or a calculation (e.g., background addition or subtraction). This has the benefits of sharp contrast and reliability.
- Positions and ranges of different objects in the binary image can be obtained by analyzing connected elements of the binary image. The relative positioning is critical to the effectiveness of tracking. Tracking is easy when two objects are spaced apart. However, tracking may become very difficult or even fail due to overlapping.
- The distance state of two objects is thus very important. There are three states, i.e., separation, proximity and overlap, between any two objects of interest. An object to be tracked is first taken as an object of interest. Next, a state of the object with respect to another object is determined by detection.
- It is defined that separation means there is a great distance (i.e., greater than a predetermined distance) between a tracking object and a non-tracking object, proximity means the distance between the tracking object and the non-tracking object is less than the predetermined distance, and overlap means the tracking object extends over and partly covers the non-tracking object to form a connected object. The overlap can make object tracking more difficult and even cause failure.
- The invention discussed later aims to increase the effectiveness of tracking two overlapping objects.
- It is therefore one object of the invention to provide a method of object tracking comprising creating areas of a tracking object and a non-tracking object respectively; determining whether a state of the tracking object and the non-tracking object is separation, proximity, or overlap; creating at least one separation template image of a separation area of at least one of the tracking object and the non-tracking object if the tracking object is proximate the non-tracking object; fetching all feature points of an overlapping area of the tracking object and the non-tracking object if the tracking object and the non-tracking object overlap; performing a matching on each of the feature points and the separation template image so as to calculate a corresponding matching error score respectively; and comparing the matching error score of each of the feature points with that of the separation template image so as to determine whether the feature points belong to the tracking object or the non-tracking object.
- The above and other objects, features and advantages of the invention will become apparent from the following detailed description taken with the accompanying drawings.
-
FIG. 1 is a flow chart of a method of object tracking according to a first preferred embodiment of the invention; -
FIG. 2 is a flow chart of the sub-steps of the step S10; -
FIG. 3 is a flow chart of the sub-steps of the step S11; -
FIG. 4 schematically depicts two objects being overlapped; -
FIG. 5 is a flow chart of a method of object tracking according to a second preferred embodiment of the invention; and -
FIG. 6 is a flow chart of a method of object tracking according to a third preferred embodiment of the invention. - Referring to
FIGS. 1 to 4 , a flow chart diagram of a method of object tracking in accordance with the invention is illustrated. A computer (e.g., personal computer, laptop, or the like) is employed to execute the method. The method comprises the following steps as discussed in detail below. - Step S10: Areas of a tracking object and a non-tracking object are created respectively.
- Step S11: State such as separation, proximity, or overlap of the tracking object and the non-tracking object is determined.
- Step S12: At least one separation template image of a separation area of at least one of the tracking object and the non-tracking object is created if the tracking object is proximate the non-tracking object.
- Step S13: All feature points of an overlapping area of the tracking object and the non-tracking object are fetched if the tracking object and the non-tracking object overlap.
- Step S14: A matching is performed on each of the feature points and the separation template image so as to calculate a corresponding matching error score respectively.
- Step S15: The matching error score of each of the feature points and that of the separation template image are compared with each other so as to determine whether the feature points belong to the tracking object or the non-tracking object.
- Each of the tracking object and the non-tracking object has a respective area which can be any shape, including a rectangle (as the exemplary example discussed below). The area has a width w, a height h, and a coordinate (x, y) on the left top portion. For an area R of step S10, R.x, R.y, R.w and R.h represent the four items, R_i^t represents the rectangular area of the i-th object at time t, Q_t represents the rectangular area of the tracking object at time t, and C_t represents the rectangular area of the non-tracking object at time t.
- An overlapping ratio B_i^t of the rectangular area R_i^t of the i-th object and the rectangular area Q_{t-1} of the tracking object at time (t-1) is calculated one by one in the tracking. Area(Q_{t-1}) is the rectangular area of the tracking object at time (t-1). Area(R_i^t) is the rectangular area of the i-th object at time t. X_left, X_right, Y_top and Y_bottom are the left, the right, the top and the bottom values of the overlapped rectangle of the i-th object and the tracking object at time (t-1) as expressed below.
-
X_left = max(Q_{t-1}.x, R_i^t.x) -
Y_top = max(Q_{t-1}.y, R_i^t.y) -
X_right = min(Q_{t-1}.x + Q_{t-1}.w, R_i^t.x + R_i^t.w) -
Y_bottom = min(Q_{t-1}.y + Q_{t-1}.h, R_i^t.y + R_i^t.h) - The overlapped rectangle is empty if X_left > X_right or Y_top > Y_bottom. That is, the overlapped rectangle does not exist and there is no common element for R_i^t and Q_{t-1}. B_i^t can be obtained by the following expression if R_i^t overlaps Q_{t-1}
-
- i* is chosen as the current tracking object from all objects since it has the largest overlapped area and is expressed below.
i* = arg max_i B_i^t
- Thereafter, a rectangular area of the tracking object at time t is created by the rectangular area of the i*-th object, that is, Q_t = R_{i*}^t. Further, the rectangular area of each non-i* object can be taken as the rectangular area C_t of the non-tracking object.
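- The selection of the tracking object described above can be sketched in Python. The overlapped rectangle follows the X_left/X_right/Y_top/Y_bottom expressions; normalizing the overlapped area by Area(Q_{t-1}) is an assumption, since the published expression for B_i^t is not reproduced in the text.

```python
def overlap_ratio(r, q_prev):
    # B_i^t for rectangles given as (x, y, w, h) tuples, (x, y) at top-left.
    x_left = max(q_prev[0], r[0])
    y_top = max(q_prev[1], r[1])
    x_right = min(q_prev[0] + q_prev[2], r[0] + r[2])
    y_bottom = min(q_prev[1] + q_prev[3], r[1] + r[3])
    if x_left > x_right or y_top > y_bottom:
        return 0.0  # the overlapped rectangle is empty
    # Normalization by Area(Q_{t-1}) is an ASSUMPTION for illustration.
    return ((x_right - x_left) * (y_bottom - y_top)) / (q_prev[2] * q_prev[3])

def pick_tracking_object(rects, q_prev):
    # i* = arg max_i B_i^t: the object overlapping Q_{t-1} the most
    # becomes the current tracking object Q_t.
    return max(range(len(rects)), key=lambda i: overlap_ratio(rects[i], q_prev))
```

All non-i* rectangles would then be taken as the non-tracking areas C_t, as the text states.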
- The step S10 further comprises the following sub-steps:
- Sub-step S100: A corresponding area of each of a plurality of objects in an image is created in which the area has a width w, a height h, and a coordinate (x, y) on the left top portion all associated with the corresponding object.
- Sub-step S101: At least one tracking object is selected from the objects.
- Sub-step S102: A first area of each tracking object at a first time (i.e., time (t-1)) is calculated and a second area of all objects in the image at a second time (i.e., time t) is calculated.
- Sub-step S103: Extent of second area overlapping the first area is determined in order to obtain an overlapped ratio.
- Sub-step S104: The object having the highest overlapped ratio in the second areas is taken as a tracking object.
- As shown in
FIG. 2 , the computer facing a plurality of objects of the image may find the object having the highest overlapped ratio by matching at time t and time (t-1). In short, the computer may perform steps S100 to S104 with respect to each object in the tracking. Those objects having a lower overlapped ratio are taken as non-tracking objects. - As shown in
FIG. 3, the computer may perform step S11 after clearly defining the tracking object and the non-tracking objects. Next, an inspection range D_i^t is created based on the size of the area of object R_i^t so that the proximity of two objects can be determined. The step S11 further comprises the following sub-steps: - Sub-step S110: A first inspection range of the tracking object and a second inspection range of the non-tracking object are created by executing a predetermined formula based on the area of the tracking object, the area of the non-tracking object, the width w, height h, and coordinate (x, y) on the left top portion of the area of the tracking object, and the width w, height h, and coordinate (x, y) on the left top portion of the area of the non-tracking object.
- Sub-step S111: State is determined to be separation if the first inspection range does not overlap the second inspection range.
- Sub-step S112: State is determined to be proximity or overlap if the first inspection range overlaps the second inspection range.
- Sub-step S113: An alert range threshold is created if the state is determined to be proximity or overlap and an overlapping ratio of the alert range threshold and the first inspection range is calculated.
- Sub-step S114: State is determined to be proximity if the overlapping ratio is less than the alert range threshold.
- Sub-step S115: State is determined to be overlap if the overlapping ratio is greater than the alert range threshold.
- The predetermined formula is expressed below.
-
- It is possible to determine the state of an object by determining whether the inspection range D_{i*}^t of the tracking object overlaps the inspection range D_i^t of the non-tracking object after creating the inspection ranges.
- Non-overlapping discussed in sub-step S111 and overlapping discussed in sub-step S112 are determined using the overlapped ratio of sub-step S103. In short, state of the tracking object and the non-tracking object is either proximity or overlap if the overlapped ratio is greater than zero and state of the tracking object and the non-tracking object is separation if the overlapped ratio is equal to zero.
- As shown in
FIG. 4 , the alert range threshold At of sub-step S113 is employed to record the whole range of the rectangular area Qt of the tracking object and the rectangular area Ct of the non-tracking object at time t when both objects are in the state of proximity, as expressed below. -
A left=min(Qt.x, Ct.x) -
A top=min(Qt.y, Ct.y) -
A right=max(Qt.x+Qt.w, Ct.x+Ct.w) -
A bottom=max(Qt.y+Qt.h, Ct.y+Ct.h) - where Aleft, Atop, Aright and Abottom are the left, top, right and bottom coordinates of the alert range threshold respectively. Thus, the rectangular area of the alert range threshold At can be expressed as (Aleft, Atop, Aright−Aleft, Abottom−Atop). The alert range threshold may vary as the state changes, and the ranges recorded may vary accordingly. No alert is necessary for either object if the state changes from proximity to separation. An overlapping object is created if a tracking object Qt-1 overlaps a non-tracking object Ct-1 at time (t-1) when the state changes from proximity to overlap. The overlapping ratio of the overlapping object and the tracking object Qt-1 at the previous time (t-1) is high in the state of overlap. Thus, the overlapping object is chosen as the tracking object of the current time. Further, the rectangular area of the tracking object is recorded as Qt and the alert range threshold is recorded as At.
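The four coordinate formulas above translate directly into code. Rectangles are assumed to be (x, y, w, h) tuples with the origin at the top-left:

```python
def alert_range(q, c):
    """Union rectangle At of tracking area q and non-tracking area c,
    following the Aleft/Atop/Aright/Abottom formulas in the text."""
    left   = min(q[0], c[0])
    top    = min(q[1], c[1])
    right  = max(q[0] + q[2], c[0] + c[2])
    bottom = max(q[1] + q[3], c[1] + c[3])
    # Expressed as (Aleft, Atop, Aright - Aleft, Abottom - Atop)
    return (left, top, right - left, bottom - top)
```

For two 4x4 rectangles at (0, 0) and (2, 2), the alert range is the 6x6 rectangle (0, 0, 6, 6).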
- In one embodiment, the alert range threshold is 0.7. In general, the alert range threshold may be set between 0.01 and 0.9, depending on the area of the object.
- As discussed in steps S12 and S13, the state of the tracking object and the non-tracking object is proximity. Thus, the computer may create at least one separation template image of a separation area of at least one of the tracking object and the non-tracking object. The separation area is the area of the tracking object or the area of the non-tracking object. The separation template image comprises feature points of the non-tracking object and/or the tracking object in the separation area.
- The following discusses how to create a separation template image of the non-tracking object in a separation area. However, it is equally possible to create a separation template image of a separation area of the tracking object in the state of proximity. Alternatively, a separation template image of a separation area of the tracking object and a separation template image of a separation area of the non-tracking object are created respectively. Further, at least one separation template image is taken as the target for comparing feature points fetched in the state of overlap. These are techniques known in the art and are not discussed in detail for the sake of brevity.
- The state of the tracking object and the non-tracking object is proximity. Creation of a separation template image of a separation area of the non-tracking object is taken as an illustrative example. How feature points are fetched and compared is discussed in detail below.
- The computer may fetch all feature points of the object overlapped by the tracking object and the non-tracking object when the state changes from proximity to overlap. The feature point fetch can be performed using a local binary pattern (LBP), a scale-invariant feature transform (SIFT), speeded-up robust features (SURF), or any other feature point detection algorithm.
- In one embodiment, LBP is adopted for performing the feature point fetch in step S13. Further, the sum of squared differences (SSD) is employed to perform block matching for feature point matching in step S14. These are discussed in detail below.
- The system will fetch feature points from the rectangular area of the overlapping object when the tracking object is in the overlap state. All fetched feature points are matched, by block matching, against the separation template created from the non-tracking object in the state of proximity. As a result, the best matched location of each fetched feature point in the separation template can be computed by block matching. It is understood that block matching is a well-known method for similarity comparison. The sum of squared differences (SSD) is employed to perform the block matching in the following example:
- A plurality of (e.g., n) feature points are fetched from the current image It. Next, map each feature point fi=(x, y), 1≦i≦n, to a location on the matched image I* recorded by the separation template and give it a displacement vector (u, v). A deviation is calculated at the position pi=(x+u, y+v), in which pi is located in the separation range R*, by executing the formula below.
- SSD(x,y)(u, v)=Σdx=−B..B Σdy=−B..B [It(x+dx, y+dy)−I*(x+u+dx, y+v+dy)]²
- where B is the radius of a block centered on the feature point fi. The smaller the SSD, the more similar the corresponding locations are. To the contrary, the larger the SSD, the less similar they are. Each feature point fi will find a displacement (um, vm) having a minimum deviation on the separation template, as below.
- (um, vm)=argmin(u,v) SSD(x,y)(u, v)
- After finding (um, vm), an optimum matching position pim=(x+um, y+vm) is obtained. Also, the deviation SSD(x,y)(um, vm) can be taken as a basis for feature point classification. The SSD deviation is low when the feature point is a feature point of the non-tracking object, because the recorded separation template takes the non-tracking object before overlapping as its record target. To the contrary, the SSD deviation is high when the feature point is a feature point of the tracking object, because there is no information regarding the tracking object recorded in the separation range R*. Therefore, the SSD deviation is defined as the matching error score.
- As discussed in steps S14 and S15, a feature point fi is classified based on the predetermined threshold TD. Dfi is defined as the matching error score of the i-th feature point fi, i.e., Dfi=SSD(x,y)(um, vm). The corresponding feature point is a feature point of the non-tracking object when Dfi is less than the threshold TD, and a feature point of the tracking object when Dfi is greater than the threshold TD. Thus, the feature points of both the tracking object and the non-tracking object can be correctly located when the objects overlap. As a result, the purpose of identifying the objects is achieved.
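The SSD block matching and threshold classification of steps S13-S15 can be sketched as follows. Images are 2-D grayscale NumPy arrays; the search range, the block radius B, and the threshold TD are illustrative assumptions, and feature points are assumed to lie far enough from the image border for the blocks to fit.

```python
import numpy as np

def ssd(img, tmpl, x, y, u, v, B=2):
    """Sum of squared differences between the (2B+1)-square block centred on
    feature point (x, y) in img and the block displaced by (u, v) in tmpl."""
    a = img[y - B:y + B + 1, x - B:x + B + 1].astype(float)
    b = tmpl[y + v - B:y + v + B + 1, x + u - B:x + u + B + 1].astype(float)
    return float(np.sum((a - b) ** 2))

def match_and_classify(img, tmpl, points, T_D, search=3, B=2):
    """For each feature point, take the displacement with minimum SSD as the
    matching error score Dfi, then classify: Dfi below T_D means the point
    belongs to the non-tracking object (the separation template's record
    target); above T_D, to the tracking object."""
    labels = []
    for (x, y) in points:
        best = min(
            ssd(img, tmpl, x, y, u, v, B)
            for u in range(-search, search + 1)
            for v in range(-search, search + 1)
        )
        labels.append("non-tracking" if best < T_D else "tracking")
    return labels
```

A point whose surrounding block is well explained by the separation template yields a near-zero score; a point on the occluding tracking object has no counterpart in the template and yields a large score.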
- Likewise, a separation template image of a separation area of the tracking object may be created in the state of proximity, and a matching comparison conducted in the state of overlap. In that case, the corresponding feature point is a feature point of the tracking object when Dfi is less than the threshold TD, and a feature point of the non-tracking object when Dfi is greater than the threshold TD.
- Moreover, a separation template image of a separation area of each of the tracking object and the non-tracking object may be created in the state of proximity, and a matching comparison conducted in the state of overlap. As such, two matching error scores are obtained; each is compared with the threshold, and the two scores are also compared with each other. As a result, the matching accuracy is greatly increased.
- Referring to
FIG. 5 , a flow chart illustrates a method of a second preferred embodiment of the invention. The second preferred embodiment aims at successfully tracking a tracking object when the tracking object and a non-tracking object having similar features are in the state of overlap. The method of the second preferred embodiment comprises the following steps: - Step S50: After matching and comparing, it is determined that all feature points of the tracking object belong to the first-type feature points.
- Step S51: After matching and comparing, it is determined that all feature points of the non-tracking object belong to the second-type feature points.
- Step S52: Binary image values of the second-type feature points are deleted. For example, binary values of the second-type feature points are changed from one to zero so that the binary structure of the non-tracking object is destroyed.
- Step S53: A distance transform algorithm is performed to obtain binary distances of all of the first-type feature points.
- Step S54: The location of the maximum binary distance among all binary distances is taken as the center of the tracking object.
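Steps S52-S54 can be sketched as follows. A simple 4-connected BFS distance transform stands in for whichever distance transform algorithm the implementation uses; the helper names and the assumption that the binary image contains at least one background pixel are illustrative.

```python
import numpy as np
from collections import deque

def distance_transform(binary):
    """Distance from each foreground pixel (1) to the nearest background
    pixel (0), counted in 4-connected steps; background pixels get 0."""
    h, w = binary.shape
    dist = np.full((h, w), -1, dtype=int)
    q = deque()
    for y in range(h):
        for x in range(w):
            if binary[y, x] == 0:
                dist[y, x] = 0
                q.append((y, x))
    while q:
        y, x = q.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and dist[ny, nx] < 0:
                dist[ny, nx] = dist[y, x] + 1
                q.append((ny, nx))
    return dist

def tracking_center(binary, second_type_points):
    """Steps S52-S54: delete second-type feature pixels, run the distance
    transform, and return the (x, y) location and value of the maximum."""
    img = binary.copy()
    for (x, y) in second_type_points:
        img[y, x] = 0                      # S52: destroy non-tracking structure
    dist = distance_transform(img)         # S53
    y, x = np.unravel_index(np.argmax(dist), dist.shape)
    return (x, y), int(dist[y, x])         # S54
```

On a 5x5 block of ones inside a 7x7 image, the maximum distance (3) falls on the block's center, which is returned as the tracking-object center.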
- In one exemplary example, the method of the second preferred embodiment is applied in hand positioning as detailed below.
- First, the feature binary image is transformed into a distance image by executing a distance transform algorithm.
- Each value of the distance image represents the shortest distance from a point within the object to an edge (e.g., top, bottom, left or right) thereof. Hence, the value of a point near an edge is less than that of a point far from the edge.
- The hand and, for example, the face may overlap in the state of overlap. The maximum distance found by executing the distance transform algorithm may then occur on the face. This is because the skin color of the face is similar to that of the hand, so both the face and the hand are taken as image points of interest in binary image processing. Unfortunately, this often yields an incorrect hand position.
- Step S52 states that binary image values of the second-type feature points are deleted. For example, binary values of the second-type feature points are changed from one to zero so that the binary structure of the face (or any other non-tracking object) is destroyed. Thus, edge data of the face can also be obtained. After the distance transform, the maximum value may then represent the hand position rather than the face.
- It is possible to obtain a rectangular area of the hand in the state of overlap by referring to the position of the hand and its distance transform value. Thus, the position of the hand in the state of overlap can be shown. It is assumed that the coordinate of the hand center is (x, y) and its distance value is DT; in one possible design the rectangular area of the hand HR can be expressed as HR=(x−2DT, y−2.72DT, 4DT, 4DT).
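The HR expression above, taken verbatim from the text (including the 2.72 vertical factor), computes as:

```python
def hand_rect(x, y, DT):
    """Rectangular hand area HR = (x - 2*DT, y - 2.72*DT, 4*DT, 4*DT)
    from the hand center (x, y) and its distance transform value DT."""
    return (x - 2 * DT, y - 2.72 * DT, 4 * DT, 4 * DT)
```

For a hand center at (10, 10) with DT = 5, this yields a 20x20 rectangle whose top edge sits slightly above the center's mirror point, per the 2.72 factor.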
- The method of the invention can not only determine the state of a tracking object and a non-tracking object but also find the center of the tracking object when they overlap. The matching may also be employed conversely, by searching for the corresponding matched point in the image of the current overlapping object for each detected feature point of the separation template image. Here, the separation template image consists of both the tracking object image and the non-tracking object image when they are in the state of proximity. Suppose that, with the most recent separation template image, NC is the number of feature points belonging to the non-tracking object (i.e., the second-type feature points) and NO is the number of feature points belonging to the tracking object (i.e., the first-type feature points). When the matching error between a detected feature point in the separation template image and its best matched point in the image of the current overlapping object is small, the detected feature point also appears in the overlapping object image; otherwise, the detected feature point is regarded as having disappeared from the overlapping object image.
- Referring to
FIG. 6 , a flow chart illustrates a method of a third preferred embodiment of the invention. The method of the third preferred embodiment comprises the following steps: - Step S60: After matching and comparing, it is determined that all feature points of the tracking object belong to the first-type feature points.
- Step S61: After matching and comparing, it is determined that all feature points of the non-tracking object belong to the second-type feature points.
- Step S62: The first-type feature points and the second-type feature points are compared with each other to obtain a decrease ratio of each of the first-type feature points and the second-type feature points.
- Step S63: It is determined that the non-tracking object overlaps the tracking object if the decrease ratio of the first-type feature points is greater than that of the second-type feature points. Otherwise, it is determined that the tracking object overlaps the non-tracking object. The overlapping relationship of the tracking object and the non-tracking object can be determined based on KC and KO in the state of overlap.
- The decrease ratio of the first-type feature points in the overlapping rectangle is smaller than that of the second-type feature points as the tracking object is extending over and partly covering the non-tracking object. Thus, it is possible to determine whether the tracking object overlaps the non-tracking object or the non-tracking object overlaps the tracking object. Let KO and KC be the numbers of the first-type feature points and the second-type feature points appearing in the overlapping object image, respectively. The ratio K of KO and KC is defined below
- K=KO/(KO+KC)
- K increases as KO increases and KC decreases, and it is then determined that the tracking object is overlapping the non-tracking object.
- To the contrary, the first-type feature points of the overlapping objects gradually disappear due to the gradual overlapping of the tracking object by the non-tracking object. As a result, KO decreases and KC increases, and the decrease ratio of the first-type feature points is greater than that of the second-type feature points. Thus, it is possible to determine that the non-tracking object overlaps the tracking object. Ratio K thus gradually decreases.
- The computer may set a fixed overlap coefficient TK to be
- TK=NO/(NO+NC)
- The state of overlapping of an object can be determined by K. It is determined that the tracking object overlaps the non-tracking object if K is greater than TK. To the contrary, it is determined that the non-tracking object overlaps the tracking object if K is less than TK.
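The K-versus-TK decision above can be sketched as follows. The exact definitions of K and TK are not spelled out in the available text, so the bounded form K = KO/(KO+KC), which grows with KO and shrinks with KC as the text describes, is assumed here, with TK = 0.5 as an illustrative fixed coefficient.

```python
def overlap_direction(K_O, K_C, T_K=0.5):
    """Decide which object is on top from the counts of surviving
    first-type (K_O) and second-type (K_C) feature points."""
    K = K_O / (K_O + K_C)       # assumed bounded ratio, in [0, 1]
    if K > T_K:
        return "tracking object overlaps non-tracking object"
    return "non-tracking object overlaps tracking object"
```

With 8 surviving first-type points against 2 second-type points, K = 0.8 exceeds TK and the tracking object is judged to be in front; the reverse counts flip the decision.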
- The method of object tracking of the invention can be summarized as follows. The relative location of the tracking object and the non-tracking object is determined. Next, at least one separation template image is created when the two objects are proximate to each other. Next, the separation template image is employed as a basis for feature point matching when the two objects overlap. As a result, tracking failure due to overlap can be avoided. Further, object tracking efficiency and effectiveness can be greatly increased.
- While the invention has been described in terms of preferred embodiments, those skilled in the art will recognize that the invention can be practiced with modifications within the spirit and scope of the appended claims.
Claims (9)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW101111784A TWI479431B (en) | 2012-04-03 | 2012-04-03 | Method of gesture tracking objects |
TW101111784A | 2012-04-03 | ||
TW101111784 | 2012-04-03 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130259302A1 true US20130259302A1 (en) | 2013-10-03 |
US8929597B2 US8929597B2 (en) | 2015-01-06 |
Family
ID=49235078
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/527,429 Expired - Fee Related US8929597B2 (en) | 2012-04-03 | 2012-06-19 | Method of tracking objects |
Country Status (2)
Country | Link |
---|---|
US (1) | US8929597B2 (en) |
TW (1) | TWI479431B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9679218B2 (en) * | 2013-05-09 | 2017-06-13 | Tata Consultancy Services Limited | Method and apparatus for image matching |
RU2014113049A (en) * | 2014-04-03 | 2015-10-10 | ЭлЭсАй Корпорейшн | IMAGE PROCESSOR CONTAINING A GESTURE RECOGNITION SYSTEM WITH OBJECT TRACKING ON THE BASIS OF COMPUTING SIGNS OF CIRCUITS FOR TWO OR MORE OBJECTS |
EP3381016A4 (en) * | 2015-11-26 | 2019-06-12 | Sportlogiq Inc. | Systems and methods for object tracking and localization in videos with adaptive image representation |
TWI586936B (en) * | 2016-05-20 | 2017-06-11 | 國立交通大學 | A transform method between a physical image and a virtual image and a system thereof |
TWI688902B (en) * | 2018-08-31 | 2020-03-21 | 國立中正大學 | Expanded local binary pattern method in facial expression recognition and system thereof |
JP7184161B2 (en) * | 2019-03-27 | 2022-12-06 | 日本電気株式会社 | OBJECT TRACKING DEVICE, CONTROL METHOD, AND PROGRAM |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101965576B (en) * | 2008-03-03 | 2013-03-06 | 视频监控公司 | Object matching for tracking, indexing, and search |
-
2012
- 2012-04-03 TW TW101111784A patent/TWI479431B/en not_active IP Right Cessation
- 2012-06-19 US US13/527,429 patent/US8929597B2/en not_active Expired - Fee Related
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8111873B2 (en) * | 2005-03-18 | 2012-02-07 | Cognimatics Ab | Method for tracking objects in a scene |
US8019143B2 (en) * | 2005-05-30 | 2011-09-13 | Olympus Corporation | Image processing apparatus and computer program product |
US20120274785A1 (en) * | 2006-06-28 | 2012-11-01 | Nikon Corporation | Subject tracking apparatus, camera having the subject tracking apparatus, and method for tracking subject |
US20080187173A1 (en) * | 2007-02-02 | 2008-08-07 | Samsung Electronics Co., Ltd. | Method and apparatus for tracking video image |
US20080240496A1 (en) * | 2007-03-26 | 2008-10-02 | Senior Andrew W | Approach for resolving occlusions, splits and merges in video images |
US8270675B2 (en) * | 2007-03-26 | 2012-09-18 | International Business Machines Corporation | Approach for resolving occlusions, splits and merges in video images |
US8559670B2 (en) * | 2009-11-19 | 2013-10-15 | Industrial Technology Research Institute | Moving object detection detection within a video stream using object texture |
US20120093364A1 (en) * | 2010-02-19 | 2012-04-19 | Panasonic Corporation | Object tracking device, object tracking method, and object tracking program |
US20120183177A1 (en) * | 2011-01-17 | 2012-07-19 | Postech Academy-Industry Foundation | Image surveillance system and method of detecting whether object is left behind or taken away |
US20130195361A1 (en) * | 2012-01-17 | 2013-08-01 | Alibaba Group Holding Limited | Image index generation based on similarities of image features |
Non-Patent Citations (1)
Title |
---|
Armanfard, N.; Komeili, M.; Valizadeh, M.; Kabir, E., "Effective hierarchical background modeling and foreground detection in surveillance systems," Computer Conference, 2009. CSICC 2009. 14th International CSI , vol., no., pp.164,169, 20-21 Oct. 2009 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150193503A1 (en) * | 2012-08-30 | 2015-07-09 | Facebook, Inc. | Retroactive search of objects using k-d tree |
CN105139406A (en) * | 2015-09-08 | 2015-12-09 | 哈尔滨工业大学 | Tracking accuracy inversion method based on sequence images |
US10231088B2 (en) | 2016-06-01 | 2019-03-12 | International Business Machines Corporation | Mobile device inference and location prediction of a moving object of interest |
US10375522B2 (en) | 2016-06-01 | 2019-08-06 | International Business Machines Corporation | Mobile device inference and location prediction of a moving object of interest |
CN109276819A (en) * | 2017-07-20 | 2019-01-29 | 株式会社东芝 | Information processing unit, information processing system and storage medium |
US11004209B2 (en) * | 2017-10-26 | 2021-05-11 | Qualcomm Incorporated | Methods and systems for applying complex object detection in a video analytics system |
CN111247517A (en) * | 2018-06-27 | 2020-06-05 | 华为技术有限公司 | Image processing method, device and system |
CN110688873A (en) * | 2018-07-04 | 2020-01-14 | 上海智臻智能网络科技股份有限公司 | Multi-target tracking method and face recognition method |
Also Published As
Publication number | Publication date |
---|---|
TW201342254A (en) | 2013-10-16 |
TWI479431B (en) | 2015-04-01 |
US8929597B2 (en) | 2015-01-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8929597B2 (en) | Method of tracking objects | |
US20220172384A1 (en) | Logo Recognition in Images and Videos | |
US10430663B2 (en) | Method, electronic device and non-transitory computer readable storage medium for image annotation | |
US8280196B2 (en) | Image retrieval apparatus, control method for the same, and storage medium | |
US9147255B1 (en) | Rapid object detection by combining structural information from image segmentation with bio-inspired attentional mechanisms | |
US8774510B2 (en) | Template matching with histogram of gradient orientations | |
Jeong et al. | Fast horizon detection in maritime images using region-of-interest | |
US9342759B1 (en) | Object recognition consistency improvement using a pseudo-tracklet approach | |
US10147015B2 (en) | Image processing device, image processing method, and computer-readable recording medium | |
US10636165B2 (en) | Information processing apparatus, method and non-transitory computer-readable storage medium | |
US20170109427A1 (en) | Information processing apparatus, information processing method, and storage medium | |
JP2011008507A (en) | Image retrieval method and system | |
CN111191533B (en) | Pedestrian re-recognition processing method, device, computer equipment and storage medium | |
US8027978B2 (en) | Image search method, apparatus, and program | |
JP2007508633A (en) | Method and image processing device for analyzing object contour image, method and image processing device for detecting object, industrial visual device, smart camera, image display, security system, and computer program product | |
CN105354571B (en) | Distortion text image baseline estimation method based on curve projection | |
WO2014136327A1 (en) | Image processing system, image processing method, and image processing program | |
US20220215650A1 (en) | Information processing device, information processing method, and program recording medium | |
US8417038B2 (en) | Image processing apparatus, processing method therefor, and non-transitory computer-readable storage medium | |
Verma et al. | Grassmann manifold based dynamic hand gesture recognition using depth data | |
Yörük et al. | An efficient Hough transform for multi-instance object recognition and pose estimation | |
CN113420648B (en) | Target detection method and system with rotation adaptability | |
Rebai et al. | Road intersection detection and classification using hierarchical SVM classifier | |
US20110150344A1 (en) | Content based image retrieval apparatus and method | |
US8849050B2 (en) | Computer vision methods and systems to recognize and locate an object or objects in one or more images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CHUNG HUA UNIVERSITY, TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HUANG, YEA-SHUAN;CHEN, YU-CHUNG;REEL/FRAME:028405/0217 Effective date: 20120206 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20190106 |