TimeWave Space Station

June 2013: The expectations of whizz-bang tech and the realities of current, affordable applications for video streaming/telepresence haven’t yet shaken hands. Compared to the tools available a decade ago, the digital space has taken huge leaps. Even so, it’s easy to underestimate the complexity of a festival that not only streams remote pieces from multiple locations over the course of an evening but also employs telepresence.

LoNyLa had evolved to the point where we were streaming in HD online with one camera. In Velocity Lab 2012, we streamed rehearsed readings between London, NYC, LA, Singapore and Berlin. TimeWave represented a steep ramp-up in terms of technical complexity.

While Simon Gethin Thomas ran sound and lighting for the live theatre performances, J Dakota Powell managed the video streaming and telepresence components of the festival.

THE SPACE STATION EXPLAINED

What was affectionately known as the “space station” consisted of four laptops, each with a designated function. Combining all of these functions on one computer would have overloaded its processing power.
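As a rough sketch, that division of labour (detailed in the sections below) can be written down as a run-of-show table. The station layout is the festival’s; the Python form and labels are purely illustrative:

```python
# The four-station "space station", one role per laptop.
# A descriptive sketch only -- the actual machines were driven by hand.
STATIONS = {
    "computer_1": {
        "role": "command and control (C&C)",
        "apps": ["Wirecast (live edit, Ustream broadcast)",
                 "Vidyo (telepresence control)"],
    },
    "computer_2": {
        "role": "audio and projection inputs",
        "apps": ["Skype (remote audio)", "projection inputs for live plays"],
    },
    "computer_3": {
        "role": "communications and monitoring",
        "apps": ["Google Hangouts (chat with remote cities)",
                 "Ustream (broadcast monitor)"],
    },
    "computer_4": {
        "role": "projection",
        "apps": ["full-screen output to the rear-projection screen"],
    },
}
```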

Certain applications had to remain open and accessible for transitions. Even so, the transitions between plays took too long – a hard-learned lesson in front of a patient audience. As TimeWave evolves, those transitions need to be compressed.

(Image: the TimeWave space station.)

The functions of each of the four computers were as follows:

COMPUTER 1: COMMAND AND CONTROL (C&C)

We used a 2012 MacBook Pro with Retina display, an i7 quad-core processor, two Thunderbolt ports and plenty of RAM. Two Canon Vixia HV30s were hooked up to Blackmagic Intensity Extreme capture devices, which plugged into the laptop’s Thunderbolt ports. The Canons were set to output in HD.

Live Video Editing

We were able to edit the London-based video streams in real time – i.e., switch camera shots – via Telestream’s Wirecast software. Wirecast streamed directly to Ustream’s online broadcasting platform and also recorded the live edit to our computer.

In essence, it was a virtual TV studio (see Wirecast image below). We could’ve added more cameras via local networking; as it was, the MacBook Pro’s two Thunderbolt ports limited the TimeWave shoot to two HD video cameras.
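The cut itself involves almost no logic. Here’s a minimal sketch, assuming a hypothetical Switcher class – in reality Wirecast handled all of this through its GUI:

```python
# Two-camera live edit: whatever is "program" goes out to Ustream
# and into the local recording. The class is a hypothetical stand-in
# for what Wirecast's interface did.
class Switcher:
    def __init__(self, inputs):
        self.inputs = inputs
        self.program = inputs[0]  # the shot currently on air

    def cut(self, name):
        assert name in self.inputs
        self.program = name       # instant cut to the named camera

switcher = Switcher(["camera_wide", "camera_close"])
switcher.cut("camera_close")      # tighten on the actor mid-scene
switcher.cut("camera_wide")       # pull back out for the stage picture
```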

(Image: live video editing in Wirecast.)

Telepresence Platform

The C&C computer also housed the Vidyo telepresence platform. We could control (e.g. turn on/off) the audio and video feeds of the remote cameras from this platform.

For example, a video feed from NYC would be on standby during the course of an evening. Once a play from NYC was about to go live, we’d turn on the NYC video feed from London. The remote camera was already turned on and ready to go.
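The cueing logic, sketched in code – the RemoteFeed class is hypothetical, since Vidyo was operated by hand from London:

```python
# Standby/live cueing for a remote city's feed, as described above.
# Hypothetical stand-in: Vidyo was driven through its GUI, not an API.
class RemoteFeed:
    def __init__(self, city):
        self.city = city
        self.video_on = False
        self.audio_on = False   # sound travelled via Skype (see Computer 2)

    def standby(self):
        # Camera connected and ready, but hidden from the audience.
        self.video_on = False

    def go_live(self):
        # Flipped on from London just before the play begins.
        self.video_on = True

nyc = RemoteFeed("NYC")
nyc.standby()   # waits in the wings during the evening
nyc.go_live()   # cue: the NYC play is about to start
```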

See the Vidyo platform panel with multiple streams on standby.

COMPUTER 2: AUDIO AND PROJECTION INPUTS

When you *project* a remote video feed to a live audience, you have to split the audio and video components of the feed.

If you don’t split the audio from the video, the audio creates an infinite feedback loop: sound from the projected feed is picked up by the microphone on stage and streamed back to the remote location, which sends it back again. What results is a deafening echo. We had to use a second computer to stream the audio of remote feeds.

(Image: the telepresence platform.)

For example, when NYC streamed into the Vidyo telepresence platform, we only used the video (image) of the feed. We had to mute the sound. Skype was then used to stream the audio portion of the play being performed in NYC. By splitting image and sound, we eliminated the problem of an infinitely looping echo.
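The routing rule we ended up with is simple enough to state in code. A sketch, with hypothetical function names – both platforms were of course operated by hand:

```python
# The image/sound split that killed the feedback loop.
ROUTES = {
    "video": "Vidyo",   # remote image, projected to the screen
    "audio": "Skype",   # remote sound, played through the house system
}

def bring_in_remote_play(city):
    """Project a remote play without creating an echo."""
    print(f"{city}: image via {ROUTES['video']} -- Vidyo audio MUTED")
    print(f"{city}: sound via {ROUTES['audio']} only")
    # Leaving Vidyo's audio on would let the stage microphone pick up
    # the projected sound and send it straight back to the remote site:
    # an infinite, deafening echo.

bring_in_remote_play("NYC")
```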

From a communications standpoint, the coordination of video feeds from remote locations became that much more complicated. New York City had to check in with London on two platforms – Vidyo for image and Skype for sound.

Computer 2 was also used for projection inputs for the live performances in London.

The TimeWave set-up wasn’t built for video projection design. In 2013, we focused on remote video streaming and telepresence; however, some of the plays in London required projection of background images and short videos.

We used the telepresence platform for video projection, and it was not optimal. Moving forward, we will need a computer dedicated to video projection, particularly for a festival involving several short plays.

COMPUTER 3: COMMUNICATIONS AND MONITORING

(Image: the TimeWave broadcast on Ustream.)

A third computer was used for communications between London and remote locations – New York City, Los Angeles, Singapore and Madrid.

The easiest application to use was Google Hangouts. Its chat function enabled each city to ready its actors and camera operators for streaming and helped London coordinate the flow between plays.

The video feeds on the telepresence platform caused the most problems. The application would time out a video feed on standby – i.e., the video feed would drop off the radar and disappear from the platform altogether.

We then had to start from scratch. Just before going live, the remote team had to log back in to the platform and London had to retest the video and audio streams. These glitches caused innumerable delays. Live audiences had to be prepared for “comms” breaks.
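What we did by eye could, in principle, be a watchdog loop. A sketch, assuming a hypothetical is_connected() check – in practice the remote team logged back in and London retested everything manually:

```python
import time

def watch_feeds(feeds, interval=30):
    """Flag any standby feed the platform has silently timed out."""
    while True:
        for feed in feeds:
            if not feed.is_connected():   # hypothetical check
                print(f"{feed.city} has dropped off the platform: "
                      f"ping them on Hangouts to log back in, then "
                      f"retest video and audio before going live.")
        time.sleep(interval)
```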

In London, our only solution was to lay out a table of food and drink and hope people made merry!

Computer 3 also enabled us to watch the live stream on the Ustream platform, monitoring the broadcast for sound and image problems.

There were few issues with online broadcasting. On occasion, external static would affect the Ustream broadcast. Because the static was due to outside factors – e.g., construction on the street below Innovation Warehouse – we had little control over it and had to wait for it to pass. Remote streaming and telepresence posed the lion’s share of the difficulties.

COMPUTER 4: PROJECTION

(Image: projection from Sex Flap & Jazz.)

The fourth computer acted as a puppet for computers one and two. Remote video streams were projected from this computer onto a large floor-to-ceiling rear-projection screen at Innovation Warehouse.

If this computer were used for anything else, our behind-the-curtain machinations would become transparent to the audiences. In some cases, we couldn’t avoid the transparency.

The audience actually delighted in seeing a Skype message projected onscreen to Teoma Nacarrato (“Dirt”) in Montreal…saying “Go go go!”

Because of the complexity of a hybrid stage and screen format, the use of furniture or a detailed set for the live performance would, in effect, detract from what was happening on the screen.

In a sense, you have to strike a balance between 2D (screen) and 3D (stage) worlds.

As for physical production values, TimeWave consisted of simple props from Innovation Warehouse, dramatic lighting, sound bites and streaming video on a large screen. Given that this was a first step toward a new type of programming, anything more complicated on stage would’ve been beyond our reach.

While TimeWave was by no means perfect, the festival did stretch the boundaries of storytelling form, particularly with the use of telepresence.