[ptx] Registration testcases

JD Smith jdsmith at as.arizona.edu
Tue Jan 20 03:40:52 GMT 2004


On Mon, 2004-01-19 at 15:28, Pablo d'Angelo wrote:
> Hi!
> 
> I've put a few panoramas online, complete with source images.
> 
> http://wurm.wh-wurm.uni-ulm.de/~redman/gallery/pano_testcases
> 
> It contains a simple 4 pictures pano, a 24 pic half spherical pano
> taken from the 161 m high church tower of Ulm (the town where I live).
> It's really worth viewing with the native PTViewer 0.4 or FSPViewer. :)

> Btw., I know that most of the panos will probably have been shot more
> carefully than the images presented here... luckily we have the possibility
> to change stuff by hand :) (which was not done for the panosifter tests on
> this gallery).

Nice examples, Pablo.  It seems to me that when panosifter, or any of
the other auto-match algorithms, fails, it will fail by correctly
matching a subset of the pictures but completely mismatching one or
more, as in several of your examples.  In that case, it's tedious to
break all of the feature-connections and re-insert some, and just
starting all over won't do any good.  Abandoning auto-match for
hand-selected features seems a bit daunting too.  Since shifting
features (people, clouds, poor hand-held shots) essentially mean that
some amount of mis-matches must be tolerated by even the most perfect
algorithm, we need a quick and easy way to correct them.

What I envision is for the user to be able to graphically place the
errant image or images by dragging and/or rotating them into place (with
an easy option to set the roll to 0).  If all the SIFT feature points
have been saved (not just the matched ones), I expect that a local
re-match, based on existing nearby points, can be performed quite
quickly.  This will likely find correct matches much more effectively
than completely fishing in the dark, matching any image against any
other.  To develop the list of points to match up, you could compute the
overlap of the image outlines according to the user's placement of the
image, grow it by 10% or so to correct for user alignment error,
and perform exactly the same type of matching employed in the
full-panorama sweep (but much faster, of course).  This method could
even be used ab initio, for truly difficult stitches.  You might even
have a configurable "sloppiness" setting to work with images with
parallax errors, shifting objects, etc.
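
To make that concrete, here is a rough sketch in Python of the restricted
re-match I have in mind.  All the data structures and names below are made
up for illustration (this is not panosifter's code): intersect the two image
outlines' bounding boxes in panorama coordinates as the user has placed
them, grow the overlap by ~10%, keep only the saved SIFT keypoints that land
inside it, and match those by nearest-neighbour descriptor distance with a
ratio test (the real sweep may well match differently).

import numpy as np

# Assumed layout, purely for the sketch:
#   outline_* : (K, 2) array, image outline in panorama coords after the
#               user's manual placement
#   pos_*     : (N, 2) array, saved SIFT keypoint positions (same coords)
#   desc_*    : (N, 128) array, the corresponding SIFT descriptors

def overlap_bbox(outline_a, outline_b, grow=0.10):
    """Axis-aligned overlap of the two outlines, grown by ~10% per side
    to absorb user alignment error.  Returns None if they don't overlap."""
    ax0, ay0 = outline_a.min(axis=0); ax1, ay1 = outline_a.max(axis=0)
    bx0, by0 = outline_b.min(axis=0); bx1, by1 = outline_b.max(axis=0)
    x0, y0 = max(ax0, bx0), max(ay0, by0)
    x1, y1 = min(ax1, bx1), min(ay1, by1)
    if x0 >= x1 or y0 >= y1:
        return None
    dx, dy = grow * (x1 - x0), grow * (y1 - y0)
    return (x0 - dx, y0 - dy, x1 + dx, y1 + dy)

def inside(pos, bbox):
    """Boolean mask of keypoints whose positions fall inside the box."""
    x0, y0, x1, y1 = bbox
    return (pos[:, 0] >= x0) & (pos[:, 0] <= x1) & \
           (pos[:, 1] >= y0) & (pos[:, 1] <= y1)

def local_rematch(pos_a, desc_a, pos_b, desc_b, outline_a, outline_b,
                  ratio=0.8):
    """Match only the keypoints inside the grown overlap implied by the
    user's placement, using nearest-neighbour descriptor distance with a
    ratio test to drop ambiguous matches."""
    bbox = overlap_bbox(outline_a, outline_b)
    if bbox is None:
        return []
    ia = np.flatnonzero(inside(pos_a, bbox))
    ib = np.flatnonzero(inside(pos_b, bbox))
    if len(ia) == 0 or len(ib) < 2:
        return []
    matches = []
    for i in ia:
        d = np.linalg.norm(desc_b[ib] - desc_a[i], axis=1)
        nearest = np.argsort(d)[:2]
        if d[nearest[0]] < ratio * d[nearest[1]]:   # ambiguous -> discard
            matches.append((i, ib[nearest[0]]))
    return matches

Since only the keypoints inside the grown overlap are ever compared, this is
a small fraction of the work of the full all-against-all sweep, which is why
I think the local re-match could be nearly interactive.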

As far as the interface goes, I think it would be nice if the actual triangles
which matched up could somehow be indicated, as an option.

Thanks for all your work.

JD
