Project 4A: Image Warping and Mosaicing

Background

A “cool” application of image warping is image mosaicing. Using two or more photographs, you can create an image mosaic by registering, projectively warping, resampling, and compositing them. One significant method used in this project was the computation of homographies, which were used to warp the images.

Shoot the Pictures

Approach

I took photos at multiple locations by rotating my iPhone camera on a flat surface.

Example Images

woz1.jpg
Woz Left
woz2.jpg
Woz Right
enclave1.jpg
Enclave Left
enclave2.jpg
Enclave Right
hmmb1.jpg
Hearst Memorial Mining Building (HMMB) Left
hmmb2.jpg
HMMB Right
apt1.jpg
Apartment Left
apt2.jpg
Apartment Right

Recover Homographies

Approach

I implemented the approach from lecture and discussion in my code. Using the input points im1_pts and im2_pts, I created the arrays below and applied least squares to solve for the homography matrix H.

homographies.jpg
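As a rough illustration of the least-squares setup above, here is a minimal sketch of a homography solver. It is not my exact code; it assumes im1_pts and im2_pts are (N, 2) arrays of (x, y) correspondences with N ≥ 4, and fixes the bottom-right entry of H to 1.

```python
import numpy as np

def computeH(im1_pts, im2_pts):
    """Least-squares homography mapping im1_pts to im2_pts (sketch)."""
    A, b = [], []
    for (x, y), (xp, yp) in zip(im1_pts, im2_pts):
        # Two equations per correspondence, with h22 fixed to 1.
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp])
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp])
        b.extend([xp, yp])
    h, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return np.append(h, 1).reshape(3, 3)
```

With 4 points the system is exactly determined; with more points, least squares finds the H that best fits all correspondences.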

Warp the Images

Approach

First, I set the source points to be the 4 corners of the input image. I computed the destination points by multiplying the source points by the homography matrix (H * source) and normalizing with destination[:2] / destination[2]. Next, I shifted the destination points so that the upper-left point lies at (0, 0), then allocated a new output array, sized according to the shift, to fit the warped image. I used the same inverse-warping procedure as in Project 3, except that I shifted the polygon returned by skimage.draw.polygon. Finally, I displayed the resulting warped image.
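The sketch below shows one way this procedure could look, under some assumptions of mine: the image is an (H, W, 3) float array, H maps homogeneous (x, y, 1) source coordinates to destination coordinates, and nearest-neighbor sampling is used for simplicity.

```python
import numpy as np
import skimage.draw

def warpImage(im, H):
    """Warp im with homography H (sketch, nearest-neighbor sampling)."""
    h, w = im.shape[:2]
    # Source corners as homogeneous (x, y, 1) columns.
    corners = np.array([[0, w - 1, w - 1, 0],
                        [0, 0, h - 1, h - 1],
                        [1, 1, 1, 1]], dtype=float)
    dest = H @ corners
    dest = dest[:2] / dest[2]                       # normalize
    shift = dest.min(axis=1)                        # move upper-left corner to (0, 0)
    dest_shifted = dest - shift[:, None]

    out_w, out_h = np.ceil(dest_shifted.max(axis=1)).astype(int) + 1
    out = np.zeros((out_h, out_w, im.shape[2]))

    # Output pixels inside the (shifted) warped quadrilateral.
    rr, cc = skimage.draw.polygon(dest_shifted[1], dest_shifted[0], out.shape[:2])

    # Inverse-map those pixels back into the source image and sample.
    src = np.linalg.inv(H) @ np.stack([cc + shift[0], rr + shift[1], np.ones_like(rr)])
    src = src[:2] / src[2]
    xs = np.clip(np.round(src[0]).astype(int), 0, w - 1)
    ys = np.clip(np.round(src[1]).astype(int), 0, h - 1)
    out[rr, cc] = im[ys, xs]
    return out
```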

Image Rectification

Approach

First, I selected 4 source points in the input image that form a rectangle in the real world (but don't appear as one due to camera perspective). I then used the image dimensions to generate 4 destination points in the shape of a rectangle. Finally, I computed the homography from the source points to the destination points and warped the image onto a rectangular plane with that homography.
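A usage sketch of this idea, reusing the computeH and warpImage sketches above. The corner coordinates and output rectangle size here are hypothetical, chosen only to illustrate the flow.

```python
import numpy as np
import skimage.io

# Hypothetical clicked (x, y) corners of a real-world rectangle, ordered
# clockwise from the top-left, and an output size picked from the image.
im = skimage.io.imread("painting.jpg") / 255.0
src_pts = np.array([[120, 80], [520, 95], [530, 410], [110, 400]], dtype=float)
rect_w, rect_h = 400, 300
dst_pts = np.array([[0, 0], [rect_w, 0], [rect_w, rect_h], [0, rect_h]], dtype=float)

H = computeH(src_pts, dst_pts)   # homography: skewed quadrilateral -> axis-aligned rectangle
rectified = warpImage(im, H)     # warp the image onto the rectangular plane
```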

Example Image Rectifications

painting.jpg
Painting
p_rect.jpg
Painting Rectified
hmmb1.jpg
HMMB
h_rect.jpg
HMMB Rectified

Blend the Images into a Mosaic

Approach

In this part, I again followed the approach from lecture. First, I warped the right image onto the left. Next, I computed the distance transforms of the left image and the warped right image using scipy.ndimage.distance_transform_edt. Then I applied the Python equivalent of logical(dtrans1 > dtrans2) from lecture to generate the mask. Finally, I used the blending code from Project 2 to blend everything together and generate the mosaics.
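A minimal sketch of the mask construction, assuming both warped images have already been placed on the same output canvas (same shape), with zeros outside each image's footprint.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def seam_mask(warped_left, warped_right):
    """Blending mask from distance transforms (sketch)."""
    # Distance from each pixel to the nearest empty region of each image.
    dtrans1 = distance_transform_edt(np.any(warped_left > 0, axis=-1))
    dtrans2 = distance_transform_edt(np.any(warped_right > 0, axis=-1))
    # A pixel belongs to the left image where it is "deeper" inside it.
    return dtrans1 > dtrans2
```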

Example Distance Transforms and Mask

e1_dt.jpg
Enclave Left Distance Transform
e2_dt.jpg
Enclave Right Distance Transform
e_mask.jpg
Enclave Mask

Mosaics

w_mosaic.jpg
Woz Mosaic
e_mosaic.jpg
Enclave Mosaic
h_mosaic.jpg
HMMB Mosaic
a_mosaic.jpg
Apartment Mosaic

Project 4B: Feature Matching for Autostitching

Background

In Project 4A, we stitched images using manually selected points. In this part, we automate the stitching process. Points are selected with the Harris interest point detector and ANMS before being matched. Then RANSAC is used to improve match accuracy. Finally, we obtain a set of points and a homography matrix that we can use to warp and mosaic.

1. Detecting Corner Features

Approach

To find the interest points for each image, I first converted to grayscale before using the Harris Interest Point Detector code from the course website. Next, to restrict the number of interest points, I implemented Adaptive Non-Maximal Suppression (ANMS). I followed the formula from the paper, r_i = min_j ||x_i − x_j|| subject to f(x_i) < c_robust · f(x_j), x_j ∈ I, handling infinite radii by replacing them with the largest finite radius. I used n_ip = 50 interest points and c_robust = 0.9.
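A straightforward O(n²) sketch of ANMS, assuming coords is an (n, 2) array of interest point locations and strengths is the (n,) array of their Harris corner strengths; it returns the indices of the n_ip points with the largest suppression radii.

```python
import numpy as np

def anms(coords, strengths, n_ip=50, c_robust=0.9):
    """Adaptive Non-Maximal Suppression (sketch)."""
    n = len(coords)
    radii = np.full(n, np.inf)
    for i in range(n):
        # Points strong enough (after the c_robust margin) to suppress point i.
        stronger = strengths[i] < c_robust * strengths
        if np.any(stronger):
            radii[i] = np.linalg.norm(coords[stronger] - coords[i], axis=1).min()
    # Replace infinite radii (the globally strongest points) with the
    # largest finite radius, as described above.
    finite = radii[np.isfinite(radii)]
    if finite.size:
        radii[~np.isfinite(radii)] = finite.max()
    return np.argsort(radii)[::-1][:n_ip]
```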

Harris Interest Points vs. ANMS

woz1_harris.jpg
Woz Left Harris Interest Points
woz1_anms.jpg
Woz Left ANMS
woz2_harris.jpg
Woz Right Harris Interest Points
woz2_anms.jpg
Woz Right ANMS
enclave1_harris.jpg
Enclave Left Harris Interest Points
enclave1_anms.jpg
Enclave Left ANMS
enclave2_harris.jpg
Enclave Right Harris Interest Points
enclave2_anms.jpg
Enclave Right ANMS
apt1_harris.jpg
Apartment Left Harris Interest Points
apt1_anms.jpg
Apartment Left ANMS
apt2_harris.jpg
Apartment Right Harris Interest Points
apt2_anms.jpg
Apartment Right ANMS
hmmb1_harris.jpg
HMMB Left Harris Interest Points
hmmb1_anms.jpg
HMMB Left ANMS
hmmb2_harris.jpg
HMMB Right Harris Interest Points
hmmb2_anms.jpg
HMMB Right ANMS

2. Extracting Feature Descriptors

Approach

First, I applied my Gaussian blur function from Project 2 to blur the grayscale version of each image. Next, following the paper, I sampled a 40x40 window around each interest point, resized it to an 8x8 patch, and normalized it. Finally, I stored a flattened version of each patch in an (n_ip, 64) array.
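A sketch of the descriptor extraction, under my assumptions that gray_blurred is the blurred grayscale image and coords is an (n_ip, 2) array of (row, col) interest points far enough from the border for a full 40x40 window.

```python
import numpy as np
import skimage.transform

def extract_descriptors(gray_blurred, coords):
    """Extract 8x8 descriptors from 40x40 windows (sketch)."""
    descriptors = []
    for r, c in coords.astype(int):
        patch = gray_blurred[r - 20:r + 20, c - 20:c + 20]     # 40x40 window
        patch = skimage.transform.resize(patch, (8, 8))         # downsample to 8x8
        patch = (patch - patch.mean()) / (patch.std() + 1e-8)   # normalize
        descriptors.append(patch.flatten())
    return np.array(descriptors)                                # (n_ip, 64)
```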

Feature Descriptors for Woz Left

woz_descr.jpg

3. Matching Features

Approach

To match the features between two images, I once again followed the paper. First, I set one image as the reference image. I then found the first and second nearest neighbors (1-NN and 2-NN) of each reference descriptor by computing the distances between every pair of features in the reference and second images. Finally, I filtered the candidate matches with Lowe's ratio test, 1-NN distance / 2-NN distance < threshold, and returned an array of matching indices. I used a threshold of 0.8 in this part.
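The matching step could look like the sketch below, assuming desc1 and desc2 are the (n_ip, 64) descriptor arrays from the previous part; it returns pairs (i, j) where descriptor i in the reference image matches descriptor j in the second image.

```python
import numpy as np

def match_features(desc1, desc2, threshold=0.8):
    """Match descriptors with Lowe's ratio test (sketch)."""
    # Pairwise Euclidean distances between every pair of descriptors.
    dists = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=-1)
    matches = []
    for i, row in enumerate(dists):
        nn1, nn2 = np.argsort(row)[:2]          # indices of the 1-NN and 2-NN
        if row[nn1] / row[nn2] < threshold:     # Lowe's ratio test
            matches.append((i, nn1))
    return np.array(matches)
```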

Examples of Matched Features

woz_match.jpg
Woz Matches
enclave_match.jpg
Enclave Matches
apt_match.jpg
Apartment Matches
hmmb_match.jpg
HMMB Matches

4. RANSAC

Approach

As can be observed in the previous part, some points are matched incorrectly, so we use RANSAC to reduce these errors. For this part, I implemented 4-point RANSAC, closely following the lecture slides. To compute homographies, I used computeH from Project 4A. I had to slightly modify how I applied the homography matrix due to array shapes and the flipping of x and y coordinates. My RANSAC function outputs the largest set of inliers and its homography matrix, as outlined in the paper.
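A sketch of 4-point RANSAC under my assumptions: pts1 and pts2 are matched (N, 2) arrays of (x, y) points, computeH is the least-squares solver sketched in Part 4A, and the iteration count and inlier threshold are illustrative choices.

```python
import numpy as np

def ransac(pts1, pts2, n_iters=1000, eps=2.0):
    """4-point RANSAC for a homography (sketch)."""
    best_inliers = np.array([], dtype=int)
    for _ in range(n_iters):
        # Fit a homography to a random sample of 4 correspondences.
        sample = np.random.choice(len(pts1), 4, replace=False)
        H = computeH(pts1[sample], pts2[sample])
        # Project all pts1 through H and compare against pts2.
        proj = H @ np.column_stack([pts1, np.ones(len(pts1))]).T
        proj = (proj[:2] / proj[2]).T
        inliers = np.where(np.linalg.norm(proj - pts2, axis=1) < eps)[0]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refit the homography on the largest inlier set.
    return best_inliers, computeH(pts1[best_inliers], pts2[best_inliers])
```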

Matched Features After RANSAC

woz_ransac.jpg
Woz RANSAC
enclave_ransac.jpg
Enclave RANSAC
apt_ransac.jpg
Apartment RANSAC
hmmb_ransac.jpg
HMMB RANSAC

5. Mosaicing

Approach

Based on the points and homographies from RANSAC, I warped the right image to the left image using a slightly modified warpImage method (to account for different input shapes). The rest of the mosaicing process is identical to Project 4A.

Autostitched Mosaics

w_mosaic.jpg
Manual
woz_astitch.jpg
Autostitched
e_mosaic.jpg
Manual
enclave_astitch.jpg
Autostitched
a_mosaic.jpg
Manual
apt_astitch.jpg
Autostitched
h_mosaic.jpg
Manual
hmmb_astitch.jpg
Autostitched

What I've Learned

The coolest thing I learned in this project was how we're able to mathematically match features, and how random sampling processes like RANSAC can reduce inaccuracies. Also, I really liked how autostitching can end up stitching images better than manually selecting points.