ImmersiaTV will pilot an innovative end-to-end system covering the entire audiovisual value chain to enable a novel form of creative audiovisual storytelling based on omnidirectional video. The project will encompass immersive production tools, support for omnidirectional cameras (including ultra-high-definition and high-dynamic-range images), adaptive content coding and distribution mechanisms, and immersive (HMD) and second-screen visualisation. ImmersiaTV will demonstrate, through a set of live and pre-produced pilots, that the system can be deployed on a real production and distribution platform. Five specific objectives need to be addressed to demonstrate the feasibility of this novel approach to the creation, production, broadcast and display of omnidirectional video:
The creation of a new immersive cinematographic language will be addressed in WP2: Requirements, format and creation of immersive experiences. Delivering a produced omnidirectional video stream that combines several video sources in a coherent way requires revisiting the traditional conventions of broadcast television. To achieve this objective we will analyze the content formats currently available in the market, study and report End-user demands (i.e., what the audience expects), and propose a Novel format design that matches the market offer and the audience’s expectations. These production scenarios have to enable the implementation of a novel cinematographic language and the delivery of a novel experience based on live omnidirectional broadcast. We will also address the professional-user demands of this novel form of audiovisual content creation, for both offline and live production scenarios, specifically targeting the future exploitation of the related production tools and involving the most relevant stakeholders.
Adapting the production pipeline will be addressed in WP3: Immersive Broadcast Platform, in line with the requirements of WP2. The aim will be to assemble the production toolset necessary to efficiently produce immersive content that can be consumed across affordable immersive displays (head-mounted displays), second screens (smartphones and tablets) and the traditional television set. This requires assembling a Production Toolset to produce content integrating omnidirectional video streams within a carefully constructed narrative structure, across our target devices, for both Offline omnidirectional content production and Live omnidirectional content production. It also requires implementing a home receptor that can combine several omnidirectional video streams and deliver them across devices; integrate the production choices (for tablets and TVs, the default field of view; if rendered in an HMD, which portals appear and their shape, size and position, etc.); integrate user input (head movements, tablet movements or gestures, etc.); and appropriately manage the timing of scenes and events and the synchronisation across devices.
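The timing and cross-device synchronisation role of such a home receptor can be illustrated with a minimal sketch: scene events live on one shared production timeline, and each device maps its local clock onto that timeline through a measured offset. All names here (`SyncTimeline`, `SceneEvent`, the offset scheme) are illustrative assumptions, not part of the ImmersiaTV platform:

```python
import bisect
from dataclasses import dataclass, field

@dataclass(order=True)
class SceneEvent:
    start: float                          # seconds on the shared production timeline
    name: str = field(compare=False)
    duration: float = field(compare=False)

class SyncTimeline:
    """Toy model of a home receptor's scene timing: one master timeline,
    per-device clock offsets, so every device shows the same scene."""

    def __init__(self):
        self.events = []    # SceneEvent list, kept sorted by start time
        self.offsets = {}   # device id -> clock offset vs. master (seconds)

    def add_event(self, name, start, duration):
        # insort keeps the event list ordered by the 'start' field
        bisect.insort(self.events, SceneEvent(start, name, duration))

    def register_device(self, device_id, offset):
        self.offsets[device_id] = offset

    def active_events(self, device_id, local_time):
        # Convert device-local time to master time, then select the
        # events whose interval covers that instant.
        master = local_time - self.offsets.get(device_id, 0.0)
        return [e.name for e in self.events
                if e.start <= master < e.start + e.duration]
```

For example, a tablet whose clock runs 0.5 s ahead of the master still resolves the same active scenes as the TV, because its offset is subtracted before the timeline lookup.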
Re-designing the distribution chain will also be addressed in WP3: Immersive Broadcast Platform. The aim will be to integrate existing capture devices and emerging coders, decoders and distribution technologies, in an iterative process, in order to deliver at the end of the project the components necessary to stream near-real-time interactive omnidirectional content. First, we will deliver a Multi-stream multi-display on-demand streaming service, sending several synchronized video streams, both omnidirectional and conventional, to the end-user and allowing him or her to have a coherent experience across several platforms (TV, tablets and head-mounted displays). In a second stage, we will provide Off-the-shelf omnidirectional live broadcast. Current broadcast infrastructures assume specific formats that are unfit for omnidirectional content, so specific strategies are needed to stream omnidirectional video live within the existing content delivery infrastructure. For example, omnidirectional video needs to be reformatted into a rectangular frame to meet the requirements of current codecs, but these transformations introduce geometric distortions. The last stage will consist of developing Smart live omnidirectional streaming: the smart integration of innovative codecs, targeted geometric transformations and advanced stitching should allow a real-time trade-off between the best possible end-user experience and the bandwidth limitations, by defining regions of interest within the omnidirectional video streams and steering the video encoding parameters based on the feedback of objective quality metrics.
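The geometric distortion introduced by rectangular reformatting can be made concrete with the standard equirectangular mapping, the most common way to fit a spherical view into a codec-friendly rectangle. This is a generic sketch of that well-known projection, not the project's specific transformation; the function names are illustrative:

```python
import math

def dir_to_equirect(yaw, pitch, width, height):
    """Map a viewing direction (radians) to pixel coordinates in an
    equirectangular frame: yaw in [-pi, pi], pitch in [-pi/2, pi/2]."""
    u = (yaw + math.pi) / (2 * math.pi)     # 0..1 left to right
    v = (math.pi / 2 - pitch) / math.pi     # 0..1 top to bottom
    return u * (width - 1), v * (height - 1)

def pixel_stretch(pitch):
    """Horizontal oversampling at a given latitude: every row stores a
    full turn of yaw in the same number of pixels, so away from the
    equator each pixel is stretched by a factor of 1/cos(pitch)."""
    return 1.0 / max(math.cos(pitch), 1e-9)
```

At the equator the mapping is undistorted (`pixel_stretch(0.0)` is 1.0), while at 60 degrees of latitude every pixel covers twice as much horizontal angle as it should, which is exactly the kind of distortion that targeted geometric transformations and ROI-aware encoding aim to compensate for.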
Maximizing the quality of the end-user and professional-user experience will be addressed in WP4: Demonstration Pilots, through the implementation of three demonstration pilots, each structured around an execution and demonstration plan, followed by a playfield demonstration and finished with an evaluation of the end-user experience (content) and the professional-user experience (content production workflow and tools). Several user iterations will take place during the project, in order to create an optimal immersive user experience in line with the user expectations and requirements defined in WP2. Technical evaluations will also be conducted after each pilot with the relevant project stakeholders.
The objective of maximizing the market impact of the ImmersiaTV solutions is to ensure that ImmersiaTV has a determining impact on the European and global audiovisual markets. It will be addressed in WP5: Innovation transfer and exploitation, through the implementation of different strategies, business clinics, communication efforts and innovation transfer agreements.