For PETA, we developed "EyeToEye," the world's first Virtual Reality Live Acting Experience that allows users to engage in a dialogue with a virtual animal through motion capture.
Client
PETA
Industry
Public Sector and Non-Profit
Services
Creative Strategy & Storytelling
UX Design
Creative Direction
Art Direction
Visual Design
Concept Art
Technical Project Management
Game Development
3D Art Production
Awards
iF Design Award
D&AD Awards
Clio Awards
Lovie Awards
Favourite Website Awards
Epica Awards
London International Awards
New York Festivals
Eurobest Awards
ADC Germany (Art Directors Club)
ADC Global Awards
Deutscher Digital Award
DDC Gute Gestaltung (Deutscher Designer Club)
Max Award
Comm Award
Webby Awards
Annual Multimedia Award
A dialogue changes perspectives.
PETA needed a new way to reach people.
PETA wanted to engage people who previously had little contact with animal rights. The goal was to evoke empathy rather than lecture; success was measured by dwell time, immediate feedback at events, and reach through live installations and WebVR.
EyeToEye is a world-first live acting VR experience that combines motion capture, face and body tracking, binaural sound design, and a real-time framework built in Unreal Engine.
EyeToEye makes encounters with animals tangible.

The core idea is simple: a personal dialogue enables true understanding. An actor controls the animal persona in real-time, transmitting voice, facial expressions, and gestures, allowing users to experience the conversation as profoundly real and to empathize with the animal's perspective.
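Conceptually, the live link works by retargeting the performer's tracked expressions onto the animal character every frame. The sketch below illustrates that idea only; the blendshape names, rig controls, and smoothing factor are illustrative assumptions, not details of the actual EyeToEye pipeline.

```python
# Illustrative sketch of live facial retargeting: blendshape weights from a
# performer's face tracker are mapped onto a character rig each frame.
# All names and the smoothing factor below are hypothetical.

# Hypothetical mapping from tracker blendshape names to rig controls.
TRACKER_TO_RIG = {
    "jawOpen": "mouth_open",
    "browInnerUp": "brow_raise",
    "eyeBlinkLeft": "blink_L",
    "eyeBlinkRight": "blink_R",
}

def retarget(frame: dict, state: dict, alpha: float = 0.4) -> dict:
    """Map one frame of tracker weights (0..1) onto rig controls,
    exponentially smoothing against the previous state to damp jitter."""
    out = {}
    for src, dst in TRACKER_TO_RIG.items():
        raw = min(max(frame.get(src, 0.0), 0.0), 1.0)  # clamp to valid range
        prev = state.get(dst, 0.0)
        out[dst] = prev + alpha * (raw - prev)  # exponential smoothing
    state.update(out)
    return out

# Example: one tracker frame arrives with the performer's mouth fully open;
# the rig value eases toward the target rather than snapping.
state = {}
retarget({"jawOpen": 1.0}, state)
print(round(state["mouth_open"], 2))  # → 0.4
```

The smoothing trades a few frames of lag for stability, which matters in a live conversation where raw tracker output is noisy.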
Four clear phases led to implementation.
The project ran in a compact structure: in Discover we examined concepts and technology, in Prototype we shaped interaction ideas and built prototypes, in Build we developed the real-time system, and in Launch we released the experience as a mobile installation and in WebVR.
The magic moment is created through live acting: the voice, facial expressions, and gestures of a performer merge with a 3D character to create an individual conversation. Enhanced by binaural sound design, the experience fully immerses users in the scene and generates intense, personal reactions.
Technically, EyeToEye is based on a real-time framework developed in Unreal, combined with face and body tracking, a motion capture suit, and precise binaural sound; the setup has been optimized for festival and live use.
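Binaural rendering like the setup described above relies on spatial hearing cues. As a toy illustration of one such cue (not the project's actual HRTF-based pipeline), the interaural time difference can be approximated with Woodworth's formula using an assumed average head radius:

```python
import math

def interaural_time_difference(azimuth_deg: float,
                               head_radius_m: float = 0.0875,
                               speed_of_sound: float = 343.0) -> float:
    """Woodworth's approximation of the interaural time difference (seconds)
    for a source at the given azimuth (0 deg = straight ahead, 90 = full side).
    The head radius is a commonly assumed average, not a measured value."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly ahead produces no delay between the ears; a source to
# the side produces a sub-millisecond delay the brain uses for localization.
print(interaural_time_difference(0))                    # → 0.0
print(round(interaural_time_difference(90) * 1000, 3))  # delay in ms
```

Full binaural engines convolve audio with measured head-related transfer functions, which encode this delay along with level and spectral cues; the formula only shows why a virtual voice can feel physically located next to the listener.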
Technology creates emotional closeness.
The experience achieved measurable impact.
In user tests, the average dwell time exceeded 10 minutes, the installation attracted significant attention at conferences and festivals, and the WebVR adaptation extended the experience's reach globally. User feedback showed strong emotional reactions and conversations that continued long after the experience.