Case Study: Active Theory

I think the most exciting thing about working with Porter Robinson and his team is really the connection that he has with his fans. It's unlike anything I've seen from a recording artist. He really brings people in, and he really cares about the experience, and what people are thinking and feeling throughout.

Secret Sky definitely exceeded our expectations, because there was a lot of really genuine connection happening.

Mux as a platform is very flexible. It gives us the ability to get direct access to the video feed and plug it into a 3D environment. We actually sample the video multiple times to do all these crazy color effects. There's basically a light show going off that is completely powered by the live stream.
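Active Theory's actual implementation runs in WebGL, where this kind of effect would live in a fragment shader over the live stream texture. As an illustrative sketch only (none of these function names come from their codebase), the idea of "sampling the video multiple times" for a color effect can look like this: read the same frame at several slightly offset positions and take a different color channel from each sample.

```python
# Illustrative sketch, NOT Active Theory's WebGL code: sample one video
# frame several times at small offsets and recombine the channels,
# similar in spirit to a chromatic-aberration fragment shader.

def sample(frame, x, y):
    """Clamped texture lookup on a frame stored as rows of (r, g, b) pixels."""
    h, w = len(frame), len(frame[0])
    x = min(max(x, 0), w - 1)
    y = min(max(y, 0), h - 1)
    return frame[y][x]

def chromatic_shift(frame, offset=1):
    """Build each output pixel from three offset samples of the same frame."""
    out = []
    for y, row in enumerate(frame):
        new_row = []
        for x in range(len(row)):
            r = sample(frame, x - offset, y)[0]  # red pulled from the left
            g = sample(frame, x, y)[1]           # green stays centered
            b = sample(frame, x + offset, y)[2]  # blue pulled from the right
            new_row.append((r, g, b))
        out.append(new_row)
    return out
```

In a live setting the same lookup would happen per pixel, per frame, on the GPU, with the stream decoded into a texture.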

We researched a few different options, and Mux was the best as far as cost. And then we realized, oh, it's actually really developer friendly; we can build a whole bunch of stuff on top of this. So we integrated it deeper into the platform, and now all the video in the platform is powered by Mux.

And I think that definitely turned a light bulb on: how do we bridge the virtual experience into physical experiences, so that it's not just isolated to being an online thing, you know?

So the Dream Portal, this installation we're doing here, is one of our first efforts to bridge the gap between the physical and the digital space. It's a new bit of technology that we've been building for the platform that enables people at the real festival to connect and chat with people inside the virtual festival.

People at the physical festival can walk up to the screen. We're using Azure Kinect body-tracking technology to track the users, so they can spawn little particles and dance and do fun stuff like that. There's also going to be a microphone, so you can talk to the attendees on the virtual side, and then vice versa.
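The tracking-to-particles idea described here can be sketched very simply. This is a hypothetical illustration, not the installation's code: assume some body-tracking source (such as the Azure Kinect SDK, not called here) hands us joint positions in normalized screen coordinates, and we scatter short-lived particles around each joint.

```python
import random

# Hypothetical sketch of the installation's idea: spawn particles around
# tracked body joints so dancers paint the screen with their movement.
# Joint names and the dict format are assumptions for this example.

def spawn_particles(joints, per_joint=5, jitter=0.05, rng=None):
    """joints: dict of joint name -> (x, y) in normalized [0, 1] coordinates.

    Returns a flat list of particle dicts jittered around each joint.
    """
    rng = rng or random.Random(0)  # seeded for a deterministic demo
    particles = []
    for name, (x, y) in joints.items():
        for _ in range(per_joint):
            particles.append({
                "joint": name,
                "x": x + rng.uniform(-jitter, jitter),
                "y": y + rng.uniform(-jitter, jitter),
                "life": 1.0,  # would fade toward 0 each frame in a render loop
            })
    return particles
```

In the real pipeline this would run once per tracked body per frame, feeding a GPU particle system.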

People on the virtual side are going to see a screen pop up kind of out of nowhere. That's basically going to be a live feed into the Second Sky Festival.

The screen is a 3D screen, similar to a 3D movie, and we built a new renderer to be able to pipe straight into it. So we render the world two times, one for each eye, slightly offset. Then people put on glasses, and each side of the lens blocks out certain pixels, and it creates a 3D stereoscopic image, so you feel like you're seeing it in 3D.
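The "render the world twice, slightly offset" step comes down to deriving two eye positions from one camera. As a sketch under assumed names (this is not the renderer's actual API), the two positions are the camera offset along its right vector by half the interpupillary distance:

```python
# Sketch of stereoscopic eye placement: offset a single camera position
# along its right vector by half the interpupillary distance (IPD).
# Function and parameter names are illustrative assumptions.

def eye_positions(camera_pos, right_vec, ipd=0.064):
    """Return (left_eye, right_eye) positions for stereoscopic rendering.

    camera_pos and right_vec are 3-component tuples; right_vec should be
    a unit vector pointing to the camera's right. The default IPD of
    0.064 is roughly an average human value in meters.
    """
    half = ipd / 2.0
    left = tuple(c - half * r for c, r in zip(camera_pos, right_vec))
    right = tuple(c + half * r for c, r in zip(camera_pos, right_vec))
    return left, right
```

The scene is then rendered once from each position, and the glasses let each eye see only the pixels intended for it, which is what produces the stereoscopic depth described above.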

It was just really fun coming in and being like, oh, what is this? And then suddenly, you know, you realize, oh my god, can they see us? Can we just start to interact with them? It's like the best thing.

Brazil I saw a lot, Texas I saw a lot, a couple of people from Japan, which was super cool. There's a guy even right now in Mexico.

A virtual event, in my mind, or, you know, in our mind, isn't something that happens and you watch it. A virtual event is: how are we bringing people together? And what are they feeling? What are they taking away? Something beyond just watching something on a screen. And that's what I feel like this is all really about, right? Getting people that might not be here physically, for whatever reason, to have the experience. So...all for it.