ABSTRACT
RealityFlythrough is a telepresence/tele-reality system that works in the dynamic, uncalibrated environments typically associated with ubiquitous computing. By opportunistically harnessing networked mobile video cameras, it allows a user to remotely and immersively explore a physical space. Live 2d video feeds are situated in a 3d representation of the world. Rather than try to achieve photorealism at every point in space, we instead focus on providing the user with a sense of how the video streams relate to one another spatially. By providing cues in the form of dynamic transitions, we can approximate photorealistic telepresence while harnessing cameras “in the wild.” This paper shows that transitions between situated 2d images are sensible and provide a compelling telepresence experience.
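To make the abstract's central idea concrete, the sketch below shows one minimal way a feed's reported pose could be represented and how a virtual camera might glide between two situated feeds during a transition. This is only an illustration of the concept, not the authors' implementation; the CameraPose fields and the transition_pose helper are hypothetical names chosen here for the example (Python).

from dataclasses import dataclass

@dataclass
class CameraPose:
    """Position (metres) and orientation (degrees) reported with a video feed."""
    x: float
    y: float
    z: float
    yaw: float    # compass heading of the camera, degrees
    pitch: float  # tilt up/down, degrees

def lerp_angle(a: float, b: float, t: float) -> float:
    """Interpolate two angles along the shortest arc."""
    delta = (b - a + 180.0) % 360.0 - 180.0
    return (a + t * delta) % 360.0

def transition_pose(src: CameraPose, dst: CameraPose, t: float) -> CameraPose:
    """Virtual-camera pose at fraction t of a transition from src to dst.

    Instead of an instantaneous cut, the virtual camera moves from the
    source camera's viewpoint toward the destination's, giving the viewer
    a cue about how the two feeds relate spatially.
    """
    return CameraPose(
        x=src.x + t * (dst.x - src.x),
        y=src.y + t * (dst.y - src.y),
        z=src.z + t * (dst.z - src.z),
        yaw=lerp_angle(src.yaw, dst.yaw, t),
        pitch=lerp_angle(src.pitch, dst.pitch, t),
    )

Rendering the source image projected from the interpolated pose, then cross-fading to the destination image, would be one plausible way to turn this pose path into the dynamic transitions the abstract describes.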
Author Keywords
Telepresence, Ubiquitous video
ACM Classification Keywords
H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities
INTRODUCTION
We are rapidly moving toward a world where networked video cameras are ubiquitous. Already, camera-equipped cell phones are becoming commonplace. Imagine being able to tap into live video feeds to remotely explore the world in real time. RealityFlythrough is a telepresence system that makes this vision possible.

There are numerous applications for such a system, but perhaps the most compelling involves disaster response. Consider, for example, first responders equipped with head-mounted wireless video cameras encountering the chaos of a disaster site. As they fan out through the site, they continuously broadcast their location, orientation, and what they see to a RealityFlythrough server. The responders’ central command views each of these video feeds from a first-person perspective, transitioning between them in a manner that reveals the spatial relationships between the source cameras. The resulting situational awareness helps central command direct medics to the injured, firefighters to potential flare-ups, and engineers to structural weaknesses. As more people enter the site and fixed cameras are positioned, the naturalness of the flythrough is enhanced until ultimately the entire space is covered and central command can “fly” around the site looking for hot spots without constraints.

There have been many approaches to creating interactive immersive environments that promote exploration of either a remote or a virtual space. The virtual reality community builds the environments from scratch, using photograph-based texture maps if necessary and where possible [1]; the graphics and vision communities create photorealistic renderings of novel views using photographs (and in some cases video feeds) taken from different angles [5]; and the robotics community achieves the effect by attaching a camera to a remote-controlled robot [6].

Our work starts with a different set of assumptions, and as a result leads to a very different design. The goal of RealityFlythrough is to harness networked ubiquitous cameras. Ubiquitous cameras are everywhere, or at a minimum can go anywhere. They are inside, outside, carried by people, attached to cars, on city streets, and in parks. Ubiquity moves cameras from the quiet simplicity of the laboratory to the harsh reality of the wild. The wild is dynamic, with people and objects constantly on the move and with uncontrolled lighting conditions; it is uncalibrated, with the locations of objects and cameras imprecisely measured; and it is variable, with video stream quality and location accuracy varying with the equipment being used, and the quantity of video streams varying by location and wireless coverage. Static surveillance-style cameras may be available, but it is more likely that cameras will be carried by people. Mobile cameras that tilt and sway with their operators present their own unique challenges. Not only may the position of the camera be inaccurately measured, but sampling latency can lead to additional errors.
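The pose stream described above (each responder continuously broadcasting location and orientation) together with the sampling-latency errors noted at the end of this section suggests one simple mitigation: buffer timestamped pose samples per camera and interpolate to the time of each displayed video frame. The sketch below illustrates that idea under our own assumptions; PoseTrack and its methods are invented names for this example and are not taken from the paper's server implementation.

import bisect
from typing import List, Tuple

class PoseTrack:
    """Buffers the timestamped (x, y, yaw) samples broadcast by one mobile
    camera and estimates the pose at an arbitrary frame time by linearly
    interpolating between the two nearest samples, softening the error
    introduced by sampling latency."""

    def __init__(self) -> None:
        self._times: List[float] = []
        self._poses: List[Tuple[float, float, float]] = []  # (x, y, yaw) per sample

    def add_sample(self, t: float, x: float, y: float, yaw: float) -> None:
        """Insert a pose sample, keeping the buffer sorted by timestamp."""
        i = bisect.bisect(self._times, t)
        self._times.insert(i, t)
        self._poses.insert(i, (x, y, yaw))

    def pose_at(self, t: float) -> Tuple[float, float, float]:
        """Estimated (x, y, yaw) at time t, clamped to the sampled range."""
        if not self._times:
            raise ValueError("no pose samples yet")
        i = bisect.bisect(self._times, t)
        if i == 0:
            return self._poses[0]
        if i == len(self._times):
            return self._poses[-1]
        t0, t1 = self._times[i - 1], self._times[i]
        f = (t - t0) / (t1 - t0) if t1 > t0 else 0.0
        x0, y0, a0 = self._poses[i - 1]
        x1, y1, a1 = self._poses[i]
        da = (a1 - a0 + 180.0) % 360.0 - 180.0  # shortest-arc heading change
        return (x0 + f * (x1 - x0), y0 + f * (y1 - y0), (a0 + f * da) % 360.0)

A server sketched this way would keep one PoseTrack per camera, call add_sample as pose reports arrive, and call pose_at with each frame's timestamp before situating that frame in the 3d world.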
Download full report
http://realityflythroughfiles/chi05_paper.pdf