Driving the Future of Immersive Entertainment

admin   June 22, 2020

Since 2016, Ravi Velhal, who leads Digital Media Standards, Technologies and Immersive Cinema programs globally at Intel, has been bringing a plethora of VR technologies and major Hollywood immersive co-productions to audiences worldwide

For the past 25 years, Intel has been at the forefront of emerging technologies used by the entertainment industry to deliver rich content and provide consumers with exciting immersive digital experiences. Pioneered by Intel’s legendary CEO Andy Grove in the early 1990s, this work has seen Intel drive and establish content and technology policy and standards between Hollywood, consumer electronics makers and technology providers.

Major studios, post-production and visual effects companies leverage Intel technologies from creation through distribution and consumption, such as Intel® Xeon® Scalable and Intel® Core™ processors, Intel® Optane™ memory technology, 5G networking solutions, and development toolkits like SDVis for open source ray tracing, to unleash the best viewing experiences for consumers.

At Cannes XR 2020, Ravi will shed light on cutting-edge technologies for immersive and interactive content production, while his colleague Sarah Vick will present ‘Building a Shared Sense of Humanity Using the Volumetric Studio’, highlighting the state-of-the-art Intel Studios, touted as the world’s largest volumetric studio, where iconic and pioneering immersive experiences such as ‘Grease: You’re the One That I Want’ were filmed with more than five trillion pixels.

Advancing stories powered by technology, Intel collaborated with Sony Pictures last year to produce a virtual reality experience based on ‘Spider-Man: Far From Home’, which allows participants to enjoy a web-slinging thrill ride from Spider-Man’s perspective. The experience was first delivered at Mobile World Congress 2019, leveraging ultra-low-latency 5G for multiplayer VRE.

Created by The Museum of Tomorrow in Rio de Janeiro, RePangeia is a technoshamanistic virtual reality experience developed in collaboration with Intel. Inspired by technoshamanism, the experience engages with the perspectives of the traditional indigenous Tupinambá and Tikuna communities of the lower Amazonian Brazilian rainforest to rethink the future, exploring connections between ancestral knowledge and virtual reality through volumetric technology and depth data captured from the tribes’ ritualistic dance performances. The RePangeia ritual extends from the real world into the virtual. Before entering the virtual world, participants take off their shoes and put on indigenous ankle rattles. In the experience, they follow the movements of a holographic, volumetrically captured Indigenous Guide in a spiritual dance. Participants can see their own hands represented as glowing energy fields. Through touch, they build up their own energy and can share it between themselves and the guide to experience mindfulness in the natural world around them.

Directed by Marcela Sabino, experiences such as RePangeia highlight the potential of immersive technologies now available to content creators around the world to take their stories forward. Since 2016, Ravi Velhal has been bringing VR immersive technologies and know-how worldwide, including to Brazil and Colombia in South America, and began collaborating with Brazilian content creators and studios. This led him to direct the first ever virtual reality production of the Rio Carnival, released at the Marché du Film at the 71st Cannes Film Festival in 2018 alongside the Save Every Breath: Dunkirk VRE and the Prelude to Le Musk VRE cinema by A R Rahman, the two-time Oscar and Grammy winner, presented on a scent-emitting Positron VR motion chair. These showcases paved the way for a dedicated Cannes XR exhibition category for immersive cinema at Cannes.

Last year (2019), Ravi and the Intel team brought the Grease volumetric song experience (Intel, Paramount Pictures), First Man VRE (Universal Pictures, RYOT, CreateVR and Intel) and Scent of a Song from the Le Musk cinema VRE, with A R Rahman’s appearance at the joint Cannes XR keynote, all co-produced by Intel.

During this unprecedented time, the Marché du Film, part of the 73rd Cannes Film Festival, is bringing Cannes XR online and into virtual reality, highlighting the XR industry and fostering links between creators and industry partners in the XR sector. Cannes XR Virtual (June 24-26, 2020) is a three-day online and virtual event dedicated to immersive entertainment. Adapting to the current situation, it reshapes a groundbreaking event online and in virtual reality, with a program fully dedicated to immersive technologies and works in connection with the art of storytelling and the film industry. At Cannes XR, Ravi will speak about cutting-edge technologies for immersive and interactive content production (see box for more details).

Rahman & Ravi

The Hands Around the World project is a unique way to bring immersive technology and music together for climate change awareness. According to Oscar and Grammy winner A R Rahman, “Hands Around the World is a unique project that brings together music and technology to create awareness about climate change. There is a lot spoken about climate change, but I feel this is the time for all of us to come together and take action. Ravi Velhal from Intel, who is one of the technical advisors for my Le Musk project, pitched my name to Neil and Ken, saying AR will be the right person for this project. So, they did their research, watched my show Harmony on Amazon Prime and after that, asked me to compose a song for Hands Around the World.” Rahman recorded the “Hands Around the World” song in LA and launched its teaser recently, during the COVID crisis, on Earth Day.



Transforming Hollywood from pixels to voxels

admin   November 1, 2019

Volumetric capture can record footage of a real person from various viewpoints, after which software analyzes, compresses, and recreates all the viewpoints of a fully volumetric 3D human, says Ravi Velhal

Developers and content creators use Intel hardware and software technologies to capture and process immersive digital assets for video games, virtual and augmented reality, holograms, film and other rapidly growing markets. Forecasters expect the volumetric video market to grow from $578 million in 2018 to nearly $2.8 billion by 2023. Read on to learn how Intel engages with film studios and innovative developers to bring volumetric video experiences to life.

TechCrunch explains that volumetric capture can record footage of a real person from various viewpoints, “after which software analyzes, compresses, and recreates all the viewpoints of a fully volumetric 3D human.” The process can be described as “depth to point cloud from capture”: a point cloud is created from each set of depth frames, and volumetric video then plays that reconstruction back in a video format.
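As a rough illustration of the “depth to point cloud” step, the sketch below back-projects a single depth frame into 3D points using a simple pinhole camera model. The intrinsics (fx, fy, cx, cy) and the synthetic frame are illustrative assumptions, not Intel Studios parameters, and a real volumetric pipeline fuses many calibrated cameras rather than one.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth frame (in meters) into an N x 3 point cloud.

    Assumes a single pinhole camera; a production volumetric pipeline fuses
    many calibrated cameras, but the per-camera math looks like this.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid coordinates
    z = depth
    x = (u - cx) * z / fx                           # back-project along X
    y = (v - cy) * z / fy                           # back-project along Y
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                 # drop pixels with no depth

# Synthetic 480x640 depth frame, every pixel 2 m away, with assumed intrinsics.
depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (307200, 3)
```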

Developers and content creators around the world see growing opportunities with this technology. Markets include architectural design, public safety, retail point-of-sale, gaming, mass-media entertainment, and many other exciting application areas.

Storytelling Meets Technology

At the Cannes Film Festival, Ravi’s keynote and panel session highlighted the power of production technologies to create multisensory immersive cinema experiences and Hollywood volumetric cinema productions. In the era of next-generation immersive cinema, storytelling and technology combine in a data-powered entertainment landscape.

Audiences respond strongly to the power of immersive cinema storytelling. You can see, hear, feel, experience and even smell that in the multisensory Intel collaboration Prelude to Le Musk by A.R. Rahman (the double Oscar-winning musician and composer, well known for his hit film Slumdog Millionaire). Intel also collaborates on a growing number of exciting Hollywood productions, including Spider-Man: Homecoming VRE, Save Every Breath: Dunkirk VRE by Warner Bros., First Man VRE by Universal Pictures, the Spider-Man: Far From Home multiplayer VRE over 5G at MWC 2019, and the summer blockbuster Spider-Man: Far From Home VRE by Sony Pictures.

Intel Studios and “Grease” at 40

A key milestone in the transformation of Hollywood came on January 8, 2018. That day, Variety reported on the opening of Intel Studios, a dedicated volumetric video capture facility. The story summed everything up nicely: “Intel wants to help Hollywood embrace the next generation of immersive media … essentially producing high-end holographic content…”

Intel Studios’ general manager Diego Prilusky and Randal Kleiser, director of the iconic 1978 movie “Grease,” revealed an exploratory immersive teaser celebrating the film’s 40th anniversary at CES 2019. Having grown up with Bollywood’s song-and-dance influence, Ravi initiated the Grease project with Randal Kleiser and Paramount Pictures. A little more than a year later, at the Cannes Film Festival/Cannes XR 2019 event, Intel and Paramount Pictures released the five-trillion-pixel, fully volumetric experience based on a re-creation of the iconic song “You’re the One That I Want” for PCs and AR devices.

Advancing Data-Driven Storytelling with Sony Pictures Entertainment

In June 2018, Sony Pictures Entertainment (SPE) launched Sony Innovation Studios in collaboration with Intel, Dell, and Deloitte Digital. Housed in a 7,000 square-foot sound stage on the Sony Pictures Studio lot in Culver City, Calif., the space features the latest in research and development in areas including volumetric video and customizable set scanning. Sony Innovation Studios works with engineers throughout Sony Corporation and pursues business development opportunities in motion pictures, television, music and gaming. Intel has teamed up with Sony Pictures to help expand the applications of Sony Innovation Studios beyond entertainment to other market segments. With Sony Innovation Studios, enterprises can leverage emerging technology to visualize real business solutions and solve problems within their industries.

Most recently, Intel teamed up with Sony Pictures Virtual Reality to bring out the Spider-Man™: Far From Home Virtual Reality Experience. This experience gives players the chance to select their favorite Spider-Man suit and, for the first time, swing high above New York City. The short VR gaming experience was developed by CreateVR using Intel® Xeon® Scalable processors. In addition to the VR experience, Intel worked with Sony Innovation Studios to capture Tony Stark’s jet with volumetric technology using a point cloud system, with the data processed on Intel Xeon Scalable processors. Watch the Behind the Screens video about the making of the VR experience and a sneak peek into Tony Stark’s volumetrically scanned jet.

Humanizing the Digital World

As glamorous as our work with Hollywood studios and world-famous directors may sound, some of the greatest value comes from developers and technology innovators like Underminer Studios.

Underminer’s Volumation product uses cost-effective technology to capture, optimize, and perform post-production clean-up in one process, without a green screen. It can increase realism and immersion and lower the barriers of accessibility and cost for 3D images of real humans or objects in motion.
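As an aside, depth data is one generic reason a volumetric capture rig can do without a green screen: background pixels can simply be discarded by depth thresholding. The sketch below is illustrative only and is not Underminer’s actual Volumation algorithm.

```python
import numpy as np

def segment_foreground(rgb, depth, near_m=0.5, far_m=3.0):
    """Keep only pixels whose depth falls inside the capture volume.

    Illustrative depth-threshold matting; not Underminer's Volumation pipeline.
    """
    mask = (depth > near_m) & (depth < far_m)  # True inside the capture volume
    cutout = rgb.copy()
    cutout[~mask] = 0                          # zero out background pixels
    return cutout, mask

# Synthetic 4x4 frame: the left column is too near and the right column too far.
rgb = np.random.randint(0, 255, (4, 4, 3), dtype=np.uint8)
depth = np.array([[0.2, 1.0, 1.5, 5.0]] * 4)   # depth in meters
cutout, mask = segment_foreground(rgb, depth)
print(mask)
```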

Volumation was introduced at SIGGRAPH 2018 by Alex Porter, CEO of Underminer Studios. She described the opportunity for real world assets, objects, and people to be put into digital formats, as “humanizing the digital world.”

After a few initial designs, the rig for the Volumation demo at SIGGRAPH included 26 Grove* UP* boards and 128 cameras. Technologies used included Intel® NUC™ PCs, HTC* Vive* VR, Intel® Optane™ memory/storage, Google* ARCore* AR, Unity* real-time 3D, Intel® networking, Intel® Graphics Performance Analyzers (Intel® GPA), and of course, it all ran on Intel® CPUs.

Capture the Opportunity

Intel works with leading companies to ensure the creative software you rely on performs at its best, so you can get through production work faster and have more time to develop your next great idea. Get practical ideas on ways to adopt immersive technology for a range of markets at the Intel® Developer Zone/VR.

THOUGHT LEADER

Ravindra aka Ravi Velhal leads Digital Media Standards, Technologies and Immersive Cinema programs globally at Intel.

As a cinema Virtual Reality Experience (VRE) pioneer, he co-produced the Le Musk multi-sensory VR by A.R. Rahman, Save Every Breath: Dunkirk VRE with Warner Bros., and First Man VRE with Universal Pictures and RYOT; directed the Rio Carnival 2018 VRE with GloboTV; collaborated with Sony Pictures on the Spider-Man: Homecoming VRE, which was nominated for a prestigious Emmy; and recently launched Innovation Studios with Sony Pictures. At Mobile World Congress in February 2019, he collaborated with Sony Pictures to launch Spider-Man: Far From Home, the world’s first multiplayer VRE gaming experience over 5G. His focus is on driving the future of immersive cinema, including future media formats. Ravi is a recipient of Hollywood’s prestigious Lumiere Award and the Infinity Film Festival of Beverly Hills Audience Choice Award. He holds several global patents and has served as a director on the AIS/VR Society board and as an advisor to film and trade organizations globally.


Blurring The Lines Between Real and Virtual Worlds

admin   February 14, 2018

In an interview with Pickle, Ravi Velhal, Global Content Strategist-New Media Experiences, Intel Corporation, talks about the potential of VR filmmaking and how it is transforming the cinematic experience.

Intel and AR Rahman launched Prelude to Le Musk in Las Vegas on April 24, 2017, powered by Intel technologies in a multi-sensory, ergonomic, motion-guided cinematic VR chair developed by Positron that seamlessly transports the audience into the incredible world of cinema VR.

Portland, Oregon-based Ravi Velhal, Intel’s Global Content Strategist, who first made headlines at Hollywood’s VR Society, is at the top of his game with his debut as VR technology producer. Collaborating with debut director AR Rahman and his team, he helped create Prelude to Le Musk, the world’s first multi-sensory and India’s first VR movie experience, setting a new industry standard while continuing to promote VR as an immersive medium for storytellers globally, with more on the way. As VR technology producer, Ravi has taken a deep dive into the new immersive medium of 360-degree storytelling with Le Musk. As a Hollywood VR Society board member, Ravi is on a mission to accelerate the transformation, innovation and profitability of the virtual reality content, distribution and technology business globally, especially in emerging markets.

How has computing and virtual reality evolved over the years? What is next in VR filmmaking?

Industry experts say the total VR market is expected to reach $80 billion by 2020 and $569 billion by 2025. The world of computing is radically expanding the VR ecosystem. Technology is going to change VR filmmaking: virtual reality has changed the equation, expanded boundaries, defined new expressions for storytellers, and brought new immersive multi-sensory experiences to audiences, making them a part of the story. Where the existing experience primarily involves engaging with a screen, the Le Musk VR cinema project marks an inflection point where computing possibilities are bound only by the imagination of two-time Academy and Grammy award-winning musical maestro AR Rahman to completely immerse the audience through multisensory sight, sound, haptics, motion and olfactory (aroma) experiences. The next frontier of VR will empower us to build, solve, create and play without limits, in a world that is indistinguishable from reality. It will enable content creators to deliver richer, more immersive and more interactive cinema VR experiences to delight audiences globally.

What are the key technology challenges for VR movie making? How can these be addressed?

The sheer file size and quality of the media moving through the VR workflow push most available technologies to the limit. While the vast majority of existing pre- and post-production processes have been optimised for standard HD and 4K media, the 360 immersive VR format has to work with media that is exponentially larger in size and higher in resolution, and perform complex operations on that media. The very nature of current 360 VR cameras, stitching and lighting requires CG correction in almost every frame.
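To put the size difference in perspective, the sketch below gives a hedged back-of-envelope comparison of uncompressed data rates for standard HD versus a stereoscopic 8K 360-degree capture; the resolutions, bit depth and frame rates are illustrative assumptions, not figures from the Le Musk pipeline.

```python
def raw_data_rate_gbps(width, height, bits_per_pixel=24, fps=30, eyes=1):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps * eyes / 1e9

hd = raw_data_rate_gbps(1920, 1080)                  # standard HD, single view
vr = raw_data_rate_gbps(7680, 3840, fps=60, eyes=2)  # stereo 8K 360, 60 fps

print(f"HD:    {hd:6.2f} Gbps")
print(f"8K VR: {vr:6.2f} Gbps (~{vr / hd:.0f}x more data per second)")
```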

Every component in the production and post-production pipeline, ranging from workstations, blade servers, networking, graphics cards, software stacks, storage and output monitors to the VR system itself, has to perform in harmony to deliver exceptional results. The workflow pushes the boundaries of technology, requiring the latest equipment, software and creative talent to meet film-makers’ expectations and delight audiences. From production (acquire and live preview) to post-production (edit, stitch, VFX, render, mix and encode) to the secure delivery of VR cinema on motion chairs and VR-ready devices, Intel and ecosystem partner technologies play a very important role.

So where can I watch VR cinema?

Full-length cinematic VR is at a relatively nascent stage. Many location-based VR lounges are being created all over the world. However, the space required for an individual VR setup, the content library, and the duration of content that audiences can watch without experiencing fatigue or motion sickness pose challenges for commercialising VR cinema. At NAB, while showcasing Prelude to Le Musk, we were able to address some of the key cinema VR challenges by reducing the VR showcase footprint and combining the multisensory experience into a mere 5×5 footprint using the Positron VR chair, a promising step towards the realisation of a multisensory cinema VR theatre ecosystem down the line. Such a transition is not driven by one company alone; a number of important players across the industry have come together to make it happen. Our mission is to provide a computing platform that is suited to developing such ecosystems and to make it easy for people to build solutions on top of it.

How was the Le Musk project conceptualised? Please tell us more about the world’s first multi-sensory cinema VR movie.

AR Rahman entered a new venture as a debut director in a pioneering attempt to make the new episodic international VR film Le Musk. The film is directed, written and scored by Rahman in technical collaboration with Intel. Le Musk is considered one of the world’s first multi-sensory cinema VR movies. Set in Rome and surrounding Tuscany, the musical, aromatic story chronicles Juliet, played by prominent French actress Nora Arnezeder, who has a smell fixation. The audience should be able to watch this high-resolution 360-degree cinema VR story for an extended period of time without experiencing fatigue or motion sickness. From storyboarding to scriptwriting, and from content creation and processing to consumption, the rules were re-written for VR filmmaking. We worked endless hours, learning, failing, and solving problems every step of the way. It has been a fascinating journey; we are just scratching the surface of the real potential of immersive cinema VR experiences.

The world premiere of Prelude to Le Musk took place at NAB Show Las Vegas 2017. Please tell us how this new movie-watching experience unfolded for the audience.

VR is still a relatively new entrant in the nascent medium of 360-degree storytelling. Le Musk was in many ways a challenging experiment for AR Rahman and our entire production and post-production crew. The challenge was to bring several multisensory integrations together for its NAB showing at the Intel booth in Las Vegas in April 2017. High-quality 360-degree virtual reality cinema with spatial audio, subtle haptics that respond to gesture, a motion-encoded movie to reduce the fatigue and motion sickness associated with VR, and olfactory aroma shooters that emit various scents (including the Le Musk scent developed by sensory director Grace Boyle) depending on the particular scene sequence were all combined in an egg-shaped, motion-encoded VR Pod chair developed by Positron, with an Intel VR-ready PC at its base driving all the action. The viewer sits in the Pod chair with a VR HMD and headphones on and their feet off the ground. As the movie plays, the pod rotates through 360 degrees and gently pitches forward, backward and side to side as the story unfolds. The overall experience guides you through different points of the story, letting you fully immerse yourself in the Le Musk prelude, which dazzled NAB Show 2017 at its official premiere on the Intel show floor in Las Vegas. To total amazement and rave reviews, people stood in long lines with an average wait time of more than 1.5 hours. After experiencing Le Musk, audiences were thrilled by the VR movie technology breakthroughs achieved by AR Rahman, Intel and the hard-working Le Musk team.

The future of cinema VR is going to be totally immersive: multi-sensory experiences powered by technology will keep multiplying, and the lines between the real and virtual worlds will keep blurring.

“We have been going through an amazing experience learning about VR moviemaking over the last year and a half, discovering new techniques and new ways to tell stories; Le Musk is an immersive project. I have been trying to tell stories through music so far, and somewhere my interest in virtual reality blurred the line and compelled me to take steps towards visuals and directing. VR is a game changer and has opened a new world of creativity.” – A R Rahman