Assignment 1

Problems
1. Line-plane intersection (5 points) The line in 3D defined by the join of the points $\mathbf{X}_1 = (X_1, Y_1, Z_1, T_1)^\top$ and $\mathbf{X}_2 = (X_2, Y_2, Z_2, T_2)^\top$ can be represented as a Plücker matrix $\mathbf{L} = \mathbf{X}_1\mathbf{X}_2^\top - \mathbf{X}_2\mathbf{X}_1^\top$ or as a pencil of points $\mathbf{X}(\lambda) = \lambda\mathbf{X}_1 + (1 - \lambda)\mathbf{X}_2$ (i.e., $\mathbf{X}$ is a function of $\lambda$). The line intersects the plane $\boldsymbol{\pi} = (a, b, c, d)^\top$ at the point $\mathbf{X}_L = \mathbf{L}\boldsymbol{\pi}$ or $\mathbf{X}(\lambda_\pi)$, where $\lambda_\pi$ is determined such that $\mathbf{X}(\lambda_\pi)^\top\boldsymbol{\pi} = 0$ (i.e., $\mathbf{X}(\lambda_\pi)$ is the point on $\boldsymbol{\pi}$). Show that $\mathbf{X}_L$ is equal to $\mathbf{X}(\lambda_\pi)$ up to scale.
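A quick numerical sanity check (not the requested derivation) can confirm the claim before you prove it algebraically. The Python/NumPy sketch below builds $\mathbf{L}\boldsymbol{\pi}$ and $\mathbf{X}(\lambda_\pi)$ for random inputs and checks that the two homogeneous points are proportional; the random-test setup is my own illustration, not part of the assignment.

```python
# Numerical sanity check (not a proof): L*pi and X(lambda_pi) should be
# proportional, i.e., the same homogeneous point up to scale.
import numpy as np

rng = np.random.default_rng(0)
X1, X2 = rng.standard_normal(4), rng.standard_normal(4)  # homogeneous 3D points
pi = rng.standard_normal(4)                              # plane (a, b, c, d)

# Pluecker matrix of the join of X1 and X2
L = np.outer(X1, X2) - np.outer(X2, X1)
X_L = L @ pi

# lambda_pi from the linear condition X(lambda_pi)^T pi = 0
lam = (X2 @ pi) / ((X2 - X1) @ pi)
X_lam = lam * X1 + (1 - lam) * X2

# Two proportional vectors span a rank-1 matrix when stacked
print(np.linalg.matrix_rank(np.vstack([X_L, X_lam])))    # expected: 1
```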
2. Line-quadric intersection (5 points) In general, a line in 3D intersects a quadric $\mathbf{Q}$ at zero, one (if the line is tangent to the quadric), or two points. If the pencil of points $\mathbf{X}(\lambda) = \lambda\mathbf{X}_1 + (1 - \lambda)\mathbf{X}_2$ represents a line in 3D, the (up to two) real roots $\lambda_Q$ of the quadratic polynomial $c_2\lambda_Q^2 + c_1\lambda_Q + c_0 = 0$ are used to solve for the intersection point(s) $\mathbf{X}(\lambda_Q)$. Show that $c_2 = \mathbf{X}_1^\top\mathbf{Q}\mathbf{X}_1 - 2\mathbf{X}_1^\top\mathbf{Q}\mathbf{X}_2 + \mathbf{X}_2^\top\mathbf{Q}\mathbf{X}_2$, $c_1 = 2(\mathbf{X}_1^\top\mathbf{Q}\mathbf{X}_2 - \mathbf{X}_2^\top\mathbf{Q}\mathbf{X}_2)$, and $c_0 = \mathbf{X}_2^\top\mathbf{Q}\mathbf{X}_2$.
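As with the previous problem, a small numerical check (again my own illustration, not the required proof) can confirm that these coefficients reproduce $\mathbf{X}(\lambda)^\top\mathbf{Q}\mathbf{X}(\lambda)$ for arbitrary $\lambda$, assuming the quadric is a symmetric $4 \times 4$ matrix:

```python
# Check that c2*lam^2 + c1*lam + c0 equals X(lam)^T Q X(lam) for random inputs.
import numpy as np

rng = np.random.default_rng(1)
X1, X2 = rng.standard_normal(4), rng.standard_normal(4)
Q = rng.standard_normal((4, 4))
Q = Q + Q.T                                   # quadric: symmetric 4x4 matrix

c2 = X1 @ Q @ X1 - 2 * (X1 @ Q @ X2) + X2 @ Q @ X2
c1 = 2 * (X1 @ Q @ X2 - X2 @ Q @ X2)
c0 = X2 @ Q @ X2

for lam in rng.standard_normal(5):
    X = lam * X1 + (1 - lam) * X2
    assert np.isclose(X @ Q @ X, c2 * lam**2 + c1 * lam + c0)

# Real roots lambda_Q (if any) give the intersection points X(lambda_Q)
lams = [r.real for r in np.roots([c2, c1, c0]) if abs(r.imag) < 1e-9]
print([l * X1 + (1 - l) * X2 for l in lams])
```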
3. Programming: Automatic feature detection and matching (35 points)

(a) Feature detection (20 points) Download the input data from the course website. The file price_center20.JPG contains image 1 and the file price_center21.JPG contains image 2. In your report, include a figure containing the pair of input images. For each input image, calculate an image where each pixel value is the minor eigenvalue of the gradient matrix
$$\mathbf{N} = \begin{bmatrix} \sum_w I_x^2 & \sum_w I_x I_y \\ \sum_w I_x I_y & \sum_w I_y^2 \end{bmatrix}$$
where $w$ is the window about the pixel, and $I_x$ and $I_y$ are the gradient images in the x and y directions, respectively. Calculate the gradient images using the five-point central difference operator. Set resulting values that are below a specified threshold value to zero (hint: calculating the mean instead of the sum in $\mathbf{N}$ allows for adjusting the size of the window without changing the threshold value). Apply an operation that suppresses (sets to 0) local (i.e., about a window) nonmaximum pixel values in the minor eigenvalue image. Vary these parameters such that around 600–650 features are detected in each image. For the resulting nonzero pixel values, determine the subpixel feature coordinate using the Förstner corner point operator. In your report, state the size of the feature detection window (i.e., the size of the window used to calculate the elements in the gradient matrix $\mathbf{N}$), the minor eigenvalue threshold value, the size of the local nonmaximum suppression window, and the resulting number of features detected in each image. Additionally, include a figure containing the pair of images, where the detected features (after local nonmaximum suppression) in each of the images are indicated by a square about the feature, where the size of the square is the size of the detection window.
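One way these steps might fit together is sketched below in Python with NumPy/SciPy. The function name, the use of scipy.ndimage filters, and the placeholder window sizes and threshold are my own assumptions for illustration; the assignment leaves those choices (and the border handling) to you.

```python
# A minimal sketch of the detection pipeline; parameter values are placeholders.
import numpy as np
from scipy import ndimage
from imageio.v2 import imread

def detect_features(path, win=7, thresh=50.0, nms_win=9):
    I = imread(path).astype(float)
    if I.ndim == 3:
        I = I.mean(axis=2)                              # convert to grayscale

    # Five-point central difference (sign convention is irrelevant for N)
    d = np.array([1.0, -8.0, 0.0, 8.0, -1.0]) / 12.0
    Ix = ndimage.convolve1d(I, d, axis=1)
    Iy = ndimage.convolve1d(I, d, axis=0)

    # Windowed means of the gradient products (mean instead of sum, per the hint)
    mean = lambda A: ndimage.uniform_filter(A, size=win)
    a, b, c = mean(Ix * Ix), mean(Ix * Iy), mean(Iy * Iy)

    # Minor eigenvalue of N = [[a, b], [b, c]] at every pixel, then threshold
    lam_min = 0.5 * (a + c) - np.sqrt(0.25 * (a - c) ** 2 + b * b)
    lam_min[lam_min < thresh] = 0.0

    # Local nonmaximum suppression
    lam_min[lam_min < ndimage.maximum_filter(lam_min, size=nms_win)] = 0.0

    # Foerstner subpixel refinement: solve N [x, y]^T = rhs over the window
    ys, xs = np.nonzero(lam_min)
    r = win // 2
    feats = []
    for y0, x0 in zip(ys, xs):
        if y0 < r or x0 < r or y0 >= I.shape[0] - r or x0 >= I.shape[1] - r:
            continue
        yy, xx = np.mgrid[y0 - r:y0 + r + 1, x0 - r:x0 + r + 1]
        gx = Ix[y0 - r:y0 + r + 1, x0 - r:x0 + r + 1]
        gy = Iy[y0 - r:y0 + r + 1, x0 - r:x0 + r + 1]
        N = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                      [np.sum(gx * gy), np.sum(gy * gy)]])
        rhs = np.array([np.sum(gx * gx * xx + gx * gy * yy),
                        np.sum(gx * gy * xx + gy * gy * yy)])
        feats.append(np.linalg.solve(N, rhs))           # subpixel (x, y)
    return np.array(feats)
```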
(b) Feature matching (15 points) Determine the set of one-to-one putative feature correspondences by performing a brute-force search for the greatest correlation coefficient value (in the range [−1, 1]) between the detected features in image 1 and the detected features in image 2. Only allow matches that are above a specified correlation coefficient threshold value (note that calculating the correlation coefficient allows for adjusting the size of the matching window without changing the threshold value). Further, only allow matches that are above a specified distance ratio threshold value, where distance is measured to the next best match for a given feature. Vary these parameters such that around 200 putative feature correspondences are established. Optional: constrain the search to coordinates in image 2 that are within a proximity of the detected feature coordinates in image 1. In your report, state the size of the proximity window (if used), the correlation coefficient threshold value, the distance ratio threshold value, and the resulting number of putative feature correspondences (i.e., matched features). Additionally, include a figure containing the pair of images, where the matched features in each of the images are indicated by a square about the feature, where the size of the square is the size of the matching window, and a line segment is drawn from the feature to the coordinates of the corresponding feature in the other image (see Fig. 4.9(e) in the Hartley & Zisserman book as an example).
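A minimal matching sketch, again in Python, is shown below. The helper names, the distance definition (1 minus the correlation coefficient), and the threshold values are my own assumptions for illustration; the optional proximity constraint is omitted.

```python
# Brute-force NCC matching with a correlation threshold, a next-best
# distance-ratio test, and greedy one-to-one assignment.
import numpy as np

def patch(I, x, y, win):
    r = win // 2
    xi, yi = int(round(x)), int(round(y))
    return I[yi - r:yi + r + 1, xi - r:xi + r + 1].astype(float)

def ncc(p, q):
    p, q = p - p.mean(), q - q.mean()
    return float(np.sum(p * q) / (np.sqrt(np.sum(p * p) * np.sum(q * q)) + 1e-12))

def match(I1, feats1, I2, feats2, win=15, corr_thresh=0.8, ratio_thresh=1.2):
    r = win // 2
    def valid(I, f):                      # keep features whose window fits in the image
        return r <= f[0] < I.shape[1] - r and r <= f[1] < I.shape[0] - r
    feats1 = [f for f in feats1 if valid(I1, f)]   # note: indices below refer to
    feats2 = [f for f in feats2 if valid(I2, f)]   # these filtered lists

    candidates = []
    for i, f1 in enumerate(feats1):
        corrs = np.array([ncc(patch(I1, *f1, win), patch(I2, *f2, win)) for f2 in feats2])
        order = np.argsort(corrs)[::-1]
        best, second = corrs[order[0]], corrs[order[1]]
        dist_ratio = (1.0 - second) / (1.0 - best + 1e-12)   # next-best / best distance
        if best > corr_thresh and dist_ratio > ratio_thresh:
            candidates.append((best, i, int(order[0])))

    # Greedy one-to-one assignment by decreasing correlation
    matches, used1, used2 = [], set(), set()
    for corr, i, j in sorted(candidates, reverse=True):
        if i not in used1 and j not in used2:
            matches.append((i, j))
            used1.add(i)
            used2.add(j)
    return matches
```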
