Assignment – 4: Feature Matching for Autostitching
-Charudatta Phatak
Section 1: Feature detection and matching
I used the sample code provided for Harris corner detection and made some modifications to it: mainly, I used a larger derivative kernel and the MATLAB function ordfilt2 for non-maxima suppression. The detected feature points are shown in red.
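For reference, a minimal MATLAB sketch of this step is given below. It follows the usual Harris formulation with ordfilt2 used for non-maxima suppression; the derivative kernel, sigma, window radius, and threshold are illustrative values rather than the exact ones used.

    function [r, c] = harris_corners(im, sigma, radius, thresh)
        % im: grayscale image in double
        % Image derivatives with a larger (Sobel-style) kernel
        dx = [-1 0 1; -2 0 2; -1 0 1];
        dy = dx';
        Ix = conv2(im, dx, 'same');
        Iy = conv2(im, dy, 'same');

        % Smoothed products of derivatives
        g   = fspecial('gaussian', max(1, fix(6*sigma)), sigma);
        Ix2 = conv2(Ix.^2,  g, 'same');
        Iy2 = conv2(Iy.^2,  g, 'same');
        Ixy = conv2(Ix.*Iy, g, 'same');

        % Harris corner strength
        R = (Ix2.*Iy2 - Ixy.^2) ./ (Ix2 + Iy2 + eps);

        % Non-maxima suppression: ordfilt2 returns the local maximum in a
        % (2*radius+1)^2 window; keep points equal to it and above threshold
        sz = 2*radius + 1;
        mx = ordfilt2(R, sz^2, ones(sz));
        [r, c] = find((R == mx) & (R > thresh));
    end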
After detecting the feature points, the feature descriptors were computed by sampling 8x8 patches from 40x40 windows around each point.
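A sketch of the descriptor extraction is shown below, assuming points closer than 20 pixels to the image border have already been discarded; the bias/gain normalization and the helper name extract_descriptors are assumptions, not necessarily what the actual code does.

    function D = extract_descriptors(im, r, c)
        % One 64-element descriptor per feature point: a 40x40 window
        % around the point, downsampled to 8x8 and normalized.
        n = numel(r);
        D = zeros(n, 64);
        for i = 1:n
            win   = im(r(i)-19:r(i)+20, c(i)-19:c(i)+20);      % 40x40 window
            patch = imresize(win, [8 8]);                      % sample down to 8x8
            patch = (patch - mean(patch(:))) / (std(patch(:)) + eps);
            D(i, :) = patch(:)';
        end
    end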
The descriptors were then matched using SSD, and matches were thresholded on the 1-NN/2-NN distance ratio. In the figures below, the points in blue are those retained after this step.
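The SSD matching with the 1-NN/2-NN ratio test can be sketched as follows (the ratio threshold is passed in as a parameter; the element-wise subtraction uses implicit expansion, so a reasonably recent MATLAB is assumed).

    function matches = match_descriptors(D1, D2, ratio_thresh)
        % matches: rows of [index into D1, index into D2]
        matches = zeros(0, 2);
        for i = 1:size(D1, 1)
            ssd = sum((D2 - D1(i, :)).^2, 2);    % SSD to every descriptor in D2
            [d, idx] = sort(ssd);
            if d(1) / d(2) < ratio_thresh        % 1-NN/2-NN ratio test
                matches(end+1, :) = [i, idx(1)]; %#ok<AGROW>
            end
        end
    end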
A homography was then estimated from these matches using RANSAC, and the outliers were rejected. In the figures below, the points in yellow are those retained after RANSAC. Finally, the code from the previous assignment was used to generate the mosaic from the images.
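A sketch of the RANSAC step is given below; compute_homography stands in for the 4-point homography solver from the previous assignment, and the iteration count and inlier tolerance are assumed parameters.

    function [H_best, in_best] = ransac_homography(p1, p2, n_iter, tol)
        % p1, p2: Nx2 matched point coordinates; H maps p1 onto p2
        N = size(p1, 1);
        in_best = [];
        H_best  = eye(3);
        for it = 1:n_iter
            s = randperm(N, 4);                          % minimal 4-point sample
            H = compute_homography(p1(s, :), p2(s, :));  % solver from prev. assignment
            q = H * [p1, ones(N, 1)]';                   % project all p1 points
            q = (q(1:2, :) ./ q(3, :))';                 % dehomogenize
            err = sqrt(sum((q - p2).^2, 2));
            in  = find(err < tol);
            if numel(in) > numel(in_best)                % keep the largest inlier set
                in_best = in;
                H_best  = H;
            end
        end
    end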
Image Set 1:
Figure 1: Red – detected feature points; Blue – matched feature points; Yellow – points retained after RANSAC
Figure 2: Mosaic generated using the yellow points above.
Image Set 2:
Figure 3: Red – detected feature points; Blue – matched feature points; Yellow – points retained after RANSAC
Figure 4: Mosaic generated from yellow points.
Bells & Whistles:
Section 2: Multiscale feature detection.
I used a 3-level Gaussian pyramid of the images and detected feature points at each level. The maxima were selected by comparing the neighborhoods across all three levels of the pyramid. The results are shown in the figure below; the diameter of each spot indicates the level at which it was selected.
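A rough sketch of the pyramid construction and per-level detection is shown below (the blur kernel and detector parameters are assumed values, and the cross-level neighborhood comparison described above is not spelled out here); harris_corners is the single-scale detector sketched in Section 1.

    % Build a 3-level Gaussian pyramid and run the detector at each level
    g = fspecial('gaussian', 9, 1.5);      % assumed blur before subsampling
    pyr{1} = im;
    for lev = 2:3
        pyr{lev} = imresize(conv2(pyr{lev-1}, g, 'same'), 0.5);
    end
    for lev = 1:3
        [r, c] = harris_corners(pyr{lev}, 1.5, 1, 1e-3);
        % Map coordinates back to the original image; the marker diameter
        % in Figure 5 encodes the level at which each point was found.
        pts{lev} = [r, c] * 2^(lev - 1);
    end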
Figure 5: Feature points detected over multiscale.
I also computed the orientation vector [cos(θ), sin(θ)] for each point, as described in the MOPS paper. The arrows in the figure show the orientation vector for each point; these were calculated at all levels of the pyramid.
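A sketch of the orientation computation, assuming the orientation is taken from heavily smoothed image gradients as in the MOPS paper (the smoothing sigma is an assumed value):

    % Orientation from the smoothed gradient at each feature point, giving
    % the unit vector [cos(theta), sin(theta)] drawn in Figure 6
    gs  = fspecial('gaussian', 15, 4.5);
    gx  = conv2(conv2(im, [-1 0 1],  'same'), gs, 'same');
    gy  = conv2(conv2(im, [-1 0 1]', 'same'), gs, 'same');
    idx = sub2ind(size(im), r, c);             % r, c: feature point rows/cols
    mag = sqrt(gx(idx).^2 + gy(idx).^2) + eps;
    u   = [gx(idx) ./ mag, gy(idx) ./ mag];    % [cos(theta), sin(theta)] per point
    quiver(c, r, u(:, 1), u(:, 2));            % arrows as in Figure 6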
Figure 6: Vectors showing the orientation of the feature patches at all levels.
Section 3: Automatic panorama recognition
A set of images was given as input in random order. I did not use bundle adjustment; instead, I crudely checked whether the number of matching points found for a pair of images was less than 4, while keeping the 1-NN/2-NN threshold very low.
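A sketch of this check is given below; desc and match_descriptors refer to the earlier sketches, and the ratio threshold here is an assumed value.

    % With a very strict 1-NN/2-NN threshold, pairs of images that end up
    % with fewer than 4 matches are treated as non-overlapping.
    ratio_thresh = 0.2;                    % assumed, deliberately low
    n = numel(desc);                       % desc{i}: descriptors of image i
    overlaps = false(n);
    for i = 1:n
        for j = i+1:n
            m = match_descriptors(desc{i}, desc{j}, ratio_thresh);
            if size(m, 1) >= 4             % at least 4 matches for a homography
                overlaps(i, j) = true;
                overlaps(j, i) = true;
            end
        end
    end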
Image Set 1:
Figure 7: The input image set, in the order it was given.
Figure 8: Output images.
Image Set 2:
Figure 9: Input images in the order of input.
Figure 10: Output images.