I just watched this TED talk with Blaise Aguera y Arcas on Photosynth, a powerful application in development that stitches thousands of photos together and extrapolates the photographer's position, building a virtual model of the photographed area (found via BizarroBlog). A written explanation doesn't do it justice; you really should watch the demo. I've just been playing around with the prototype on Microsoft Labs, and it's really incredible. This kind of thing is going to replace the linear, click-next-in-a-slideshow model of sites like Flickr and Photobucket, and what's most striking is the complexity of the metadata that will build up around each image: as people tag images on the likes of Flickr, that data is applied to all connected images. The best thing to do is try it yourself; if your computer hasn't got the chops for it, watch the talk on the TED site.