AP & Multimer: Immersive media study
The Associated Press is an independent, not-for-profit news cooperative headquartered in New York City. Our teams in over 100 countries tell the world’s stories, from breaking news to investigative reporting. We provide content and services to help engage audiences worldwide, working with companies of all types, from broadcasters to brands.
Multimer is a location analytics system that supports human-centered spatial design and decisions.
It does this by collecting, visualizing, and analyzing geolocated, indoor and outdoor biosensor data transmitted by common wearables, including smartwatches, heart rate straps, and brainwave bands.
By collecting and analyzing biosensor data through Multimer, organizations can understand how their users, employees, and customers use a space.
This experiment was conducted in NYU's black-box motion capture room. The immersive experiences included three 360-degree video stories and one CGI room-scale experience. Twelve participants, working with a Multimer staff member, wore motion capture suits, EEG brainwave sensors, and heart-rate monitors. During the experiment, the team recorded their brain activity, heart rate, and body motion. Participants also completed surveys covering their prior experience level and their subjective impressions while watching the videos. The outputs include the motion capture data, converted to FBX, BVH, and CSV formats, and the biosensor data, which contains the readings recorded during the experiment, such as heart rate and attention level.
Three headsets used
This dataset contains 30 Multimer files, plus motion capture data as 17 CSV files, 17 FBX files, and 75 BVH files.
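A quick way to verify a dataset breakdown like this is to count files by extension. The sketch below does that with the Python standard library; the directory layout is an assumption, not a description of the actual dataset's folder structure.

```python
from collections import Counter
from pathlib import Path

def count_by_extension(root):
    """Count files under `root`, grouped by lowercase extension.

    Returns a dict like {'.csv': 17, '.fbx': 17, '.bvh': 75}.
    """
    counts = Counter(
        p.suffix.lower()
        for p in Path(root).rglob("*")
        if p.is_file()
    )
    return dict(counts)
```

Running it against the dataset folder should confirm the counts listed above.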
Understand the Data
Relationship between various concerned parameters in the experiment:
Study finding - hands-free headsets drive higher engagement:
Study finding - war-zone immersive reporting drives participant stimulation, while science and environment stories build open-mindedness:
I used Unity 3D for the visualization, a powerful game engine for 3D interactive design. Unity can also export the application to multiple formats for different operating systems, which makes delivering the result straightforward. To display the 360-degree video, I chose to use a sphere as the canvas, rendering the video on its inside surface.
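The core of rendering 360-degree video on the inside of a sphere is the texture lookup: each direction from the sphere's center maps to a point in the equirectangular video frame. Below is a minimal Python sketch of that coordinate mapping, assuming an equirectangular source video; in Unity this same math runs inside the sphere's shader, not in Python.

```python
import math

def equirect_uv(x, y, z):
    """Map a unit direction vector (from the sphere's center) to
    equirectangular texture coordinates in [0, 1] x [0, 1].

    This is the lookup an inside-out sphere shader performs:
    longitude determines u, latitude determines v.
    """
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)  # longitude wraps horizontally
    v = 0.5 + math.asin(y) / math.pi            # latitude, bottom to top
    return u, v
```

For example, looking straight ahead (0, 0, 1) lands in the center of the frame, while looking straight up (0, 1, 0) lands on the top edge.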
Shaders used to display the video inside a sphere
Display the visualization on a flat screen
Clean bio signal data
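A typical first cleaning pass drops readings that are missing or physiologically implausible. The sketch below shows one way to do this in plain Python; the `timestamp` and `heartrate` field names and the plausible-range bounds are assumptions, not the actual Multimer schema.

```python
def clean_readings(rows, lo=30.0, hi=220.0):
    """Drop biosensor rows whose heart-rate value is missing,
    non-numeric, or outside a plausible range [lo, hi] bpm.

    `rows` is a list of dicts with 'timestamp' and 'heartrate' keys
    (assumed field names).
    """
    cleaned = []
    for row in rows:
        hr = row.get("heartrate")
        try:
            hr = float(hr)
        except (TypeError, ValueError):
            continue  # missing or non-numeric reading
        if lo <= hr <= hi:
            cleaned.append({**row, "heartrate": hr})
    return cleaned
```

The same pattern extends to attention-level values, with a different valid range.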
Read bio signal data by second
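Since the sensors report at uneven intervals, the readings can be bucketed into whole seconds and averaged so each second of the experiment has one value. A minimal sketch, again assuming `timestamp` (in seconds) and `heartrate` field names:

```python
from collections import defaultdict

def readings_per_second(rows):
    """Average heart-rate readings into one value per whole second.

    `rows` are dicts with a float 'timestamp' (seconds since the start
    of the session) and a numeric 'heartrate' (assumed field names).
    Returns {second: mean heart rate}.
    """
    buckets = defaultdict(list)
    for row in rows:
        buckets[int(row["timestamp"])].append(row["heartrate"])
    return {sec: sum(vals) / len(vals) for sec, vals in sorted(buckets.items())}
```

The per-second series can then be lined up against the motion capture frames, which are also timestamped.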
Visualize the data: color represents attention level, scale represents heart rate
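The mapping from one biosensor reading to visual attributes can be sketched as a small function: attention picks a color on a ramp, and heart rate sets the marker scale. The 0-100 attention range, the 50-180 bpm heart-rate range, and the blue-to-red ramp are illustrative assumptions, not the exact parameters used in the Unity scene.

```python
def style_for_reading(attention, heartrate, hr_min=50.0, hr_max=180.0):
    """Map one biosensor reading to visual attributes.

    attention (assumed 0-100) picks a color on a blue -> red ramp;
    heartrate sets the marker scale, normalized to [0, 1] over an
    assumed plausible range [hr_min, hr_max] bpm.
    Returns ((r, g, b), scale).
    """
    a = min(max(attention / 100.0, 0.0), 1.0)
    color = (a, 0.0, 1.0 - a)  # low attention = blue, high attention = red
    t = (heartrate - hr_min) / (hr_max - hr_min)
    scale = min(max(t, 0.0), 1.0)  # clamp out-of-range heart rates
    return color, scale
```

In the Unity scene, the same per-reading mapping drives the material color and transform scale of each marker.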