AP & Multimer: Immersive media study


The Associated Press is an independent, not-for-profit news cooperative headquartered in New York City. Our teams in over 100 countries tell the world’s stories, from breaking news to investigative reporting. We provide content and services to help engage audiences worldwide, working with companies of all types, from broadcasters to brands.

Multimer is a location analytics system that supports human-centered spatial design and decisions.

It does this by collecting, visualizing, and analyzing geolocated, indoor and outdoor biosensor data transmitted by common wearables, including smartwatches, heart rate straps, and brainwave bands.

By collecting and analyzing biosensor data through Multimer, organizations can understand how their users, employees, and customers use a space.

The Experiment

This experiment was conducted in NYU's black box motion capture room. The immersive experiences included three 360-degree video stories and one CGI room-scale experience. Twelve participants, working with Multimer staff, wore motion capture suits, EEG brainwave sensors, and heart-rate monitors. During the experiment, the team recorded their brain activity, heart rate, and body motion. Surveys also captured each participant's prior experience with immersive media and their subjective impressions while watching the videos. The outputs include the motion capture data, converted to FBX, BVH, and CSV formats, and the biosensor data containing the sensor readings recorded during the experiment, such as heart rate and attention level.


Three headsets used


Four Stories


The Data

The dataset contains 30 Multimer biosensor files, plus motion capture data in 17 CSV files, 17 FBX files, and 75 BVH files.
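As a quick sanity check on a dataset like this, the per-format file counts can be verified with a short script. This is just a sketch; the filenames below are hypothetical stand-ins for the actual capture files:

```python
from collections import Counter
from pathlib import Path

def inventory(filenames):
    """Count files by extension (case-insensitive, dot stripped)."""
    return Counter(Path(name).suffix.lower().lstrip(".") for name in filenames)

# Example with a few hypothetical capture filenames:
counts = inventory(["take01.bvh", "take01.fbx", "take01.csv", "take02.bvh"])
print(counts)  # Counter({'bvh': 2, 'fbx': 1, 'csv': 1})
```

Pointing the same function at a real directory listing (e.g. `p.name for p in Path("data").iterdir()`) would reproduce the 17/17/75 counts above.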


Understand the Data

Relationships among the key parameters measured in the experiment:


Study finding: hands-free headsets drive higher engagement:


Study finding: immersive war-zone reporting drives participant stimulation, while science and environment stories build open-mindedness:


The Visualization


I used Unity 3D, a powerful game engine for 3D interactive design, for the visualization. Unity can also export the application to multiple formats for different operating systems, which makes the result easy to deliver. To display the 360-degree video, I chose a sphere as the canvas onto which the video is projected.
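In Unity this projection lives in a shader, but the core of the technique is language-agnostic: each point on the sphere's surface is mapped to equirectangular video coordinates via its longitude and latitude. A minimal Python sketch (the function name and axis conventions are my assumptions, not the actual shader code):

```python
import math

def equirect_uv(x, y, z):
    """Map a unit-sphere direction to equirectangular texture coordinates.

    u sweeps longitude (0..1 around the sphere); v sweeps latitude
    (0 at the bottom pole, 1 at the top).
    """
    u = 0.5 + math.atan2(x, z) / (2 * math.pi)
    v = 0.5 + math.asin(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# Looking straight ahead (+z) lands at the center of the video frame:
print(equirect_uv(0.0, 0.0, 1.0))  # (0.5, 0.5)
```

The viewer's camera sits at the sphere's center, so the sphere's normals (and triangle winding) must face inward for the video to be visible; that is what the custom shaders below handle.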

Shaders used to display the video inside a sphere



Display the visualization on a flat screen

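Unwrapping the sphere onto a flat screen is the inverse operation: each pixel (u, v) of the flat image is sent back to a direction on the unit sphere, and the video is sampled in that direction. A minimal Python sketch of that inverse (the longitude/latitude conventions are an assumption):

```python
import math

def pixel_to_direction(u, v):
    """Convert a flat-screen pixel (u, v in [0, 1]) back to a unit-sphere direction.

    u maps to longitude (-pi..pi), v to latitude (-pi/2..pi/2).
    """
    lon = (u - 0.5) * 2 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return x, y, z

# The center pixel looks straight ahead along +z:
print(pixel_to_direction(0.5, 0.5))  # (0.0, 0.0, 1.0)
```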


Clean bio signal data

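The cleaning step can be sketched in a few lines of pandas. The column names and plausibility thresholds below are assumptions for illustration, not the actual Multimer export schema:

```python
import pandas as pd

# Toy rows standing in for raw biosensor exports (hypothetical schema).
raw = pd.DataFrame({
    "timestamp": ["2018-12-01 10:00:00", "2018-12-01 10:00:01",
                  "2018-12-01 10:00:02", "2018-12-01 10:00:03",
                  "2018-12-01 10:00:04"],
    "heartrate": [72, 0, 75, 250, 80],    # 0 and 250 read like sensor glitches
    "attention": [55, 60, None, 58, 62],  # one dropped sample
})

def clean(df):
    """Drop rows with missing readings or implausible heart rates; parse timestamps."""
    df = df.dropna(subset=["heartrate", "attention"])
    df = df[df["heartrate"].between(40, 200)]
    return df.assign(timestamp=pd.to_datetime(df["timestamp"])).reset_index(drop=True)

cleaned = clean(raw)
print(cleaned["heartrate"].tolist())  # [72, 80]
```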

Read bio signal data by second

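Aggregating irregular readings into one value per second is a natural fit for pandas' `resample`. The toy timestamps and column name here are hypothetical:

```python
import pandas as pd

# Hypothetical readings arriving at irregular, sub-second intervals.
df = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2018-12-01 10:00:00.2", "2018-12-01 10:00:00.7",
        "2018-12-01 10:00:01.4", "2018-12-01 10:00:02.1"]),
    "heartrate": [72, 74, 76, 78],
})

# Average the readings that fall inside each one-second bucket.
per_second = (df.set_index("timestamp")
                .resample("1s")["heartrate"]
                .mean())
print(per_second.tolist())  # [73.0, 76.0, 78.0]
```

The resulting one-row-per-second series is what the Unity visualization steps through during playback.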

Visualize the data: color represents attention level, scale represents heart rate

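The mapping itself reduces to two small functions. The value ranges (a 0-100 attention index, a 60-180 bpm heart-rate band) and the blue-to-red color ramp are my assumptions, not the exact parameters used in the Unity scene:

```python
def attention_to_color(attention):
    """Map attention (assumed 0-100 index) to RGB: blue (low) -> red (high)."""
    t = max(0.0, min(1.0, attention / 100.0))
    return (t, 0.0, 1.0 - t)  # (r, g, b)

def heartrate_to_scale(hr, rest=60.0, peak=180.0):
    """Map heart rate (bpm) to a marker scale factor between 0.5 and 2.0."""
    t = max(0.0, min(1.0, (hr - rest) / (peak - rest)))
    return 0.5 + 1.5 * t

print(attention_to_color(100))  # (1.0, 0.0, 0.0) -- fully red
print(heartrate_to_scale(60))   # 0.5 -- smallest marker
```

Each per-second biosensor sample then drives one marker in the scene: its material color from the first function, its transform scale from the second.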

