Computer scientists at the University of California, San Diego have developed a new way of mixing images and video feeds from mobile cameras in the field to provide remote viewers with a virtual window into a physical environment. Dubbed ‘Reality Flythrough,’ the application constructs a 3D virtual environment dynamically out of the live video streams. Here is how it works: instead of displaying many video streams on separate monitors, the technology combines them all into a virtual-reality environment that puts you smack in the middle of the action. The researchers at UCSD’s Jacobs School of Engineering have already begun testing the software for homeland security and emergency response, but they say the technology has potential consumer uses as well. Imagine watching a Yankees game from right on the mound with the Big Unit. Marry camera phones, video phones, and Google Maps with real-time traffic information, and you get a virtual drive before you even leave your desk. The best part is that you don’t need multiple video feeds: static images can be used to plug the gaps. Check out samples in mpeg2 here.
