Remote Media Immersion
#1

I need a report and PPT on this seminar topic.
#2
I need a report and PPT on Remote Media Immersion.
#3
This article is presented by: RANJITHA.C.R
REMOTE MEDIA IMMERSION

The charter of the Integrated Media Systems Center (IMSC) at the University of Southern California (USC) is to investigate new methods and technologies that combine multiple modalities into highly effective, immersive technologies, applications and environments. One of the results of these research efforts is the Remote Media Immersion (RMI) system.
So, what is RMI?
Immersive application aspects:
1. Multi-modal environment (aural, visual, haptic, …)
2. Shared space with virtual and real elements
3. High fidelity
4. Geographically distributed
5. Interactive

#5

ABSTRACT
The Remote Media Immersion (RMI) system is the result of a unique blend of multiple cutting-edge media technologies that together create the ultimate digital media delivery platform. The main goal is to provide an immersive user experience of the highest quality. RMI encompasses all end-to-end aspects, from media acquisition, storage, and transmission to final rendering. Specifically, the Yima streaming media server delivers multiple high-bandwidth streams, transmission error and flow control protocols ensure data integrity, and high-definition video combined with immersive audio provides the highest-quality rendering. The RMI system is operational and has been successfully demonstrated in both small and large venues. Relying on continued advances in electronics integration and residential broadband, RMI demonstrates the future of on-demand home entertainment.
INTRODUCTION
The charter of the Integrated Media Systems Center (IMSC) at the University of Southern California (USC) is to investigate new methods and technologies that combine multiple modalities into highly effective, immersive technologies, applications and environments. One of the results of these research efforts is the Remote Media Immersion (RMI) system. The goal of the RMI is to create and develop a complete aural and visual environment that places a participant or group of participants in a virtual space where they can experience events that occurred in different physical locations. RMI technology can effectively overcome the barriers of time and space to enable, on demand, the realistic recreation of visual and aural cues recorded in widely separated locations.
The focus of the RMI effort is to enable the most realistic recreation of an event possible while streaming the data over the Internet. We therefore push the technological boundaries far beyond what current video-on-demand or streaming media systems can deliver. As a consequence, high-end rendering equipment and significant transmission bandwidth are required. The RMI project integrates several technologies that are the result of research efforts at IMSC. The current operational version is based on four major components that are responsible for the acquisition, storage, transmission, and rendering of high-quality media.
STAGES OF RMI
Acquisition of high-quality media streams
This authoring component is an important part of the overall chain to ensure the high quality of the rendering result as experienced by users at a later time. As the saying “garbage in, garbage out” implies, no amount of quality control in later stages of the delivery chain can make up for poorly acquired media.
Real-time digital storage and playback of multiple independent streams
The Yima Scalable Streaming Media Architecture provides real-time storage, retrieval, and transmission capabilities. The Yima server is based on a scalable cluster design. Each cluster node is an off-the-shelf personal computer with attached storage devices and, for example, a Fast or Gigabit Ethernet connection. The Yima server software manages the storage and network resources to provide real-time service to the multiple clients requesting media streams. Media types include, but are not limited to, MPEG-2 at NTSC and HDTV resolutions, multichannel audio (e.g., 10.2-channel immersive audio), and MPEG-4.
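To make the server's role concrete, here is a minimal, hypothetical sketch (Python, not the actual Yima code) of what one cluster node conceptually does for a single client: read fixed-size media blocks from local storage and pace them onto the network at the stream's target bit rate. The file name, port, block size, and rate are illustrative assumptions.

```python
# Hypothetical sketch of a single streaming node: read fixed-size media
# blocks from local storage and pace them onto the network at a target
# bit rate. Illustration of the idea only, not the Yima server code.
import socket
import time

BLOCK_SIZE = 32 * 1024      # bytes per disk block (assumed)
TARGET_RATE_MBPS = 45.0     # e.g. an HD video stream at up to 45 Mb/s

def stream_file(path: str, client_addr: tuple[str, int]) -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seconds_per_block = (BLOCK_SIZE * 8) / (TARGET_RATE_MBPS * 1_000_000)
    with open(path, "rb") as media:
        next_send = time.monotonic()
        while True:
            block = media.read(BLOCK_SIZE)
            if not block:
                break
            sock.sendto(block, client_addr)
            # Pace transmission so the stream stays near the target bit rate.
            next_send += seconds_per_block
            time.sleep(max(0.0, next_send - time.monotonic()))

if __name__ == "__main__":
    stream_file("movie.m2v", ("127.0.0.1", 5004))   # hypothetical file and client
```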
Protocols for synchronized, efficient real-time transmission of multiple media streams
A selective data retransmission scheme improves playback quality while maintaining real-time properties. A flow control component reduces network traffic variability and enables streams of various characteristics to be synchronized at the rendering location. Industry-standard networking protocols such as the Real-time Transport Protocol (RTP) and the Real Time Streaming Protocol (RTSP) provide compatibility with commercial systems.
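The selective retransmission idea above can be summarized as: re-request a lost packet only if it can still arrive before its playout deadline. The sketch below is a hypothetical receiver-side illustration of that rule, not the RMI protocol implementation; the class name and parameters are invented for the example.

```python
# Hypothetical receiver-side bookkeeping for selective retransmission:
# re-request a missing packet only if the buffered playout delay leaves
# enough time for one more round trip. Illustrative, not the RMI protocol.
class SelectiveRetransmitter:
    def __init__(self, rtt_estimate: float, playout_delay: float):
        self.rtt = rtt_estimate              # estimated round-trip time (s)
        self.playout_delay = playout_delay   # client-side buffering (s)
        self.highest_seq = -1                # highest sequence number seen

    def on_packet(self, seq: int) -> list[int]:
        """Record an arriving sequence number; return numbers worth re-requesting."""
        missing: list[int] = []
        # A gap between the highest sequence number seen and the new one
        # indicates loss; only ask for a resend if it can still arrive in time.
        if seq > self.highest_seq + 1 and self.playout_delay > self.rtt:
            missing = list(range(self.highest_seq + 1, seq))
        self.highest_seq = max(self.highest_seq, seq)
        return missing
```

A real protocol would also track whether each re-requested packet eventually arrives, but the core decision rule, "resend only what is still useful", is the same.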
Rendering of immersive audio and high-resolution video
Immersive audio is a technique developed at IMSC for capturing the audio environment at a remote site and accurately reproducing the complete audio sensation and ambience at the client location, with full fidelity, dynamic range, and directionality, for a group of listeners (16 channels of uncompressed linear PCM at a data rate of up to 17.6 Mb/s). The RMI video is rendered at HDTV resolutions (1080i or 720p format) and transmitted at a rate of up to 45 Mb/s.
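Taken together, the quoted figures give a rough idea of the peak bandwidth a single RMI session needs. The short calculation below simply adds the two stream ceilings; any finer sample-format breakdown would be an assumption, since the text only quotes the aggregate audio rate.

```python
# Peak per-session bandwidth from the figures quoted above.
AUDIO_MBPS = 17.6   # 16 channels of uncompressed linear PCM, up to 17.6 Mb/s
VIDEO_MBPS = 45.0   # HDTV video (1080i or 720p), up to 45 Mb/s

total_mbps = AUDIO_MBPS + VIDEO_MBPS
print(f"peak per-session bandwidth: ~{total_mbps:.1f} Mb/s")  # ~62.6 Mb/s
```

This is why, as noted in the introduction, significant transmission bandwidth and high-end rendering equipment are required at the receiving site.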
OVERVIEW OF COMPONENTS
The overall objective of the RMI research endeavor is to achieve the best possible quality at each rendering location for a group of participants. Group sizes may range from a single person or family at home to a large venue seating hundreds. For the visual streams we decided that we required at least high-definition (HD) resolution as defined by the ATSC. The highest-quality ATSC modes are either 1920 × 1080 pixels at an interlaced frame rate of 29.97 per second, or 1280 × 720 pixels at a progressive frame rate of 59.94 per second. For the audio rendering we rely on the immersive audio technology developed at IMSC, which utilizes a 10.2-channel playback system. The rendering capabilities of immersive audio go far beyond current stereo and 5.1-channel systems. The combination of 10.2 channels of immersive audio and high-definition video is the next step in audio-visual fidelity.
Each presentation session retrieves and plays back at least one high-definition visual stream and one immersive aural stream in synchronization. Note that this choice was imposed by the available media content and is not an inherent limitation of the Yima design. The streams are stored separately on the server for two reasons. First, the RMI system is designed to be extensible, such that additional video or other streams may become part of a presentation in the future. Second, storing streams separately enables RMI to retrieve different components of a presentation from different server locations. The final, fine-grained synchronization is achieved at the client side. The on-demand delivery of the streams that form an RMI presentation is enabled through our streaming media architecture called Yima. With features such as scalability, multi-stream synchronization, transmission error control, and flow control, it is uniquely suited for RMI-style media delivery.
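Because the aural and visual streams are stored and delivered independently, the final fine-grained synchronization happens at the client, as described above. The sketch below is a hypothetical illustration of that idea (class and parameter names are invented): each stream is buffered, and frames are released together only when both streams have data for the shared playback clock.

```python
# Hypothetical sketch of client-side synchronization of two independently
# delivered streams: buffer each stream and release frames only when both
# have data covering the shared playback clock. Illustrative only.
from collections import deque

class StreamSynchronizer:
    def __init__(self):
        self.buffers = {"video": deque(), "audio": deque()}  # (timestamp, frame)

    def push(self, stream: str, timestamp: float, frame: bytes) -> None:
        self.buffers[stream].append((timestamp, frame))

    def pop_synchronized(self, clock: float, tolerance: float = 0.02):
        """Return one frame per stream due at `clock` (within `tolerance`
        seconds), or None if either stream has not buffered far enough."""
        # First drop frames that are already too late to present.
        for buf in self.buffers.values():
            while buf and buf[0][0] < clock - tolerance:
                buf.popleft()
        # Only release frames if every stream has one due at this clock tick.
        if any(not buf or buf[0][0] > clock + tolerance
               for buf in self.buffers.values()):
            return None
        return {name: buf.popleft()[1] for name, buf in self.buffers.items()}
```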
