A new digital system developed by the University of Illinois and the University of California, Berkeley enables people in different locations to interact in real time in a shared virtual space. Illinois professor Peter Bajcsy says the tele-immersive environment captures, transmits, and displays three-dimensional (3D) movement in real time. “It’s a virtual environment that is the product of real-time imaging, not the result of programming 3D [computer-aided design] models,” Bajcsy says. “Nobody has to be supplied with equipment to enable imaging and 3D reconstruction.” Clusters of visible and thermal-spectrum digital cameras and large liquid crystal displays (LCDs) are deployed around a space. Information from those cameras is recorded, rendered in a 3D virtual space, and transmitted to another location. Participants at each site can see their own digital clones and their counterparts at the other site on the LCD screens. Bajcsy says the goal is to make the system portable and affordable. The researchers are also working on making data transmission more efficient. A single camera generates about 460 megabytes of data for every second of real-time footage, but only about a gigabyte per second of bandwidth is available, which poses a problem when systems with multiple cameras are used, Bajcsy says.
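The scale of the bandwidth problem can be sketched with the figures quoted in the article: roughly 460 megabytes per camera per second of footage against roughly one gigabyte per second of available bandwidth. The camera count below is a hypothetical assumption for illustration, not a number from the article.

```python
# Back-of-the-envelope arithmetic for the tele-immersion bandwidth problem.
# Figures from the article: ~460 MB per camera per second of raw footage,
# versus roughly 1 GB/s (1024 MB/s) of available bandwidth.

def required_compression_ratio(cameras, mb_per_camera_per_s=460.0,
                               bandwidth_mb_per_s=1024.0):
    """Minimum compression ratio needed to stream all cameras in real
    time over the given bandwidth (values > 1 mean raw data exceeds
    the link capacity)."""
    raw_rate = cameras * mb_per_camera_per_s  # total raw data rate, MB/s
    return raw_rate / bandwidth_mb_per_s

# One camera nearly fits in the link uncompressed; a hypothetical
# ten-camera cluster would need roughly 4.5x compression.
print(round(required_compression_ratio(1), 2))   # → 0.45
print(round(required_compression_ratio(10), 2))  # → 4.49
```

This simple ratio is why the researchers focus on more efficient data transmission: the raw data rate grows linearly with the number of cameras, while the link capacity stays fixed.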