A prestigious success in 2024 (with a host of festival selections and awards, including Venice, Geneva and the XRMust Awards), OTO’S PLANET was also released on digital stores before the end of the year. Now accessible to everyone, this French-language production (Luxembourg, Canada, France) is a successful bet on interactivity that is both relevant and always in service of the story. Here’s a look back at its conception, from Luxembourg and Canada, with creator Gwenael François and producers Julien Becker (Skill Lab in Luxembourg) and Nicolas S. Roy (Dpt. in Montreal). Part 2.
A look back at our first interview about the project: “OTO’S PLANET is an interactive work that reflects on the absurdity of our contemporary society” – Gwenael François, Julien Becker (Skill Lab)
Walking in circles isn’t easy
G. F. – In terms of staging, we could see where problems might arise: in the sequencing, and in how long each sequence could take. It can get a bit long when a character goes from point A to point B and point B is quite far away! You don’t want to bore the viewer, that’s the first challenge. Technically, we had a second difficulty when we shot the motion capture, the phase where the characters’ movements are recorded. We didn’t shoot on a round planet, but on a flat surface, and we then had to map that onto a sphere. So it wasn’t an easy technical challenge. Zeilt, who were in charge of the motion capture, had to work on the code so that it would hold up in three dimensions and the character would still end up in the right place. During the shoot, we had to draw points on the ground to estimate trajectories and arrival points, and the first results were not very intuitive. When you think you should walk straight ahead, you actually shouldn’t: you have to trace a wide curve to reach the point. You don’t quite understand why, but that’s the way it is. There were a few little surprises, but the technical teams handled them well.
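To make that geometry concrete, here is a minimal sketch (in Python, not the production Unity code; every name and value below is an assumption for illustration) of what wrapping a flat capture floor onto a small planet can mean: the distance walked on the stage is treated as an arc length around the sphere, which is why “walking straight” on the flat stage doesn’t land where you would intuitively expect.

```python
# Illustrative sketch only: wrapping a flat motion-capture floor onto a small planet.
# Positions recorded on the flat stage (x, z) are treated as arc lengths along the
# sphere's surface, starting from a reference point at the "top" of the planet.
import numpy as np

def flat_to_sphere(x, z, radius):
    """Map a flat-floor position (x, z), in metres from the capture origin,
    to a 3D point on a sphere of the given radius."""
    dist = np.hypot(x, z)                      # how far the actor walked on the flat stage
    if dist == 0.0:
        return np.array([0.0, radius, 0.0])    # still at the reference point (the "top")
    heading = np.array([x, 0.0, z]) / dist     # walking direction on the floor
    angle = dist / radius                      # arc length -> angle around the planet
    # Rotate the top of the sphere by `angle` about the axis perpendicular to the
    # heading, so walking "straight" on the floor traces a great circle on the planet.
    axis = np.cross(np.array([0.0, 1.0, 0.0]), heading)
    axis /= np.linalg.norm(axis)
    top = np.array([0.0, radius, 0.0])
    # Rodrigues' rotation formula
    return (top * np.cos(angle)
            + np.cross(axis, top) * np.sin(angle)
            + axis * np.dot(axis, top) * (1 - np.cos(angle)))

# Walking 1.5 m "straight ahead" on the stage lands the character partway
# around a 2 m-radius planet rather than 1.5 m away in a straight line.
print(flat_to_sphere(0.0, 1.5, radius=2.0))
```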

N. S. R. – It’s obvious that motion capture brought its own challenges. I don’t even remember at what stage we realised that we would map the characters onto a small sphere but shoot the motion capture flat. That added a certain amount of complexity. At the same time there were also decisions made to simplify the process, which in turn changed choices that had already been made about the integration into Unity. It’s normal in any project to fine-tune these details and react to surprises. Motion capture isn’t that simple!
OTO’S PLANET successfully combines interactivity and story
G. F. – Interactivity was a challenge, and we knew that finding the balance between interaction and story would be one of the major challenges of the production. For me, as a director, it’s all about point of view. Letting viewers wander around and choose their own angle, that’s what’s fun. What’s more, with two characters and the possibility of choice, there’s something very playful to explore in terms of freedom.
G. F. – The idea of moving the planet wherever you want, for example, is more of a technical shortcut, a convenience for the user, to make sure the piece adapts to all conditions. We don’t know where people will be when they go through the experience, and we want to let them raise the planet, lower it and put it where they want. So it was more of a usability goal, not necessarily linked to the narrative or to a deliberate choice about viewer placement. But we’re very happy with the result and the possibilities it offers. It’s very elegant, and of course it comes from the teams at Dpt.

N. S. R. – Originally, at least for the prototype, control over the planet was much more limited, because you turned the planet by grabbing certain rocks. From a UX point of view, it was a bit more complicated. You had to explain to users that they could put their hand there to turn the planet, and it ended up feeling very optional. In short, it was a bit more cumbersome. Then, as we started working with the newer headsets, Meta Quest 3 and Apple Vision Pro, we realised there were in fact simpler ways of letting you turn the planet. That had an impact on the experience. Initially, people were more likely to walk around the planet rather than spinning it to change their viewpoint.
N. S. R. – And throughout the experience there are several possible interactions, such as giving Oto a piece of fruit, which triggers a stage in the story. That is very different from changing your point of view, which is indeed very interactive but has no impact on the storyline as such. It’s an interactivity that’s intrinsic to the experience. It’s very different from what we’re used to seeing in VR, where users can normally turn and move around a scene. Being able to control the whole experience is something quite new, and something viewers adopt very intuitively.
N. S. R. – The setting gives us the opportunity to offer this intuitive navigation, but we ran a number of tests to determine the best way to control the movement. One thing we realised was that grabbing something that doesn’t physically exist (like a set element used to rotate the planet) was actually very unintuitive: you’re trying to hold on to thin air. Whereas if you use your own fingers, you get haptic feedback straight away. From a user’s point of view, that action is much more intuitive than trying to hold a virtual object.
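As a rough illustration of that kind of gesture-to-rotation mapping (a hypothetical sketch, not Dpt.’s actual implementation; the function names and the gain value are assumptions), the idea is simply to convert the pinched hand’s sideways displacement each frame into an incremental spin of the planet:

```python
# Illustrative sketch only: turning a pinched hand's movement into planet rotation.
# While the pinch is held, the hand's sideways displacement each frame becomes an
# incremental spin around the planet's vertical axis.
DEGREES_PER_METRE = 180.0   # assumed tuning gain: how far a hand drag spins the planet

def update_planet_yaw(current_yaw_deg, prev_hand_x, hand_x, pinching):
    """Return the planet's new yaw angle (degrees) given the pinched hand's
    horizontal position this frame and last frame, in metres."""
    if not pinching:
        return current_yaw_deg                   # no pinch, no rotation
    delta = hand_x - prev_hand_x                 # sideways hand movement this frame
    return (current_yaw_deg + delta * DEGREES_PER_METRE) % 360.0

# Example: dragging the pinched hand 10 cm to the right spins the planet 18 degrees.
yaw = update_planet_yaw(0.0, prev_hand_x=0.0, hand_x=0.10, pinching=True)
print(yaw)  # 18.0
```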

Producing for the latest generation of headsets
N. S. R. – When we started the project, we were on Quest 1! We used controllers for the prototype. Then we were able to work with Quest 2 and then Quest 3 for the production, which enabled us to develop all the hand tracking and gestures. And then, with the Vision Pro, we had to ask ourselves the question of mixed reality.
N. S. R. – Our decision to offer a version for Apple Vision Pro was based on a certain opportunism – offering one of the first interactive narrative experiences on the device – and a desire to test the possibilities of this new headset. We also wanted the opportunity to work with Apple and understand what they wanted to do with it. We had to work with beta versions of Unity developed in collaboration with Apple, and make adjustments to the animation and shaders to get everything working. Normally with Unity you can release several versions of the same project without too much difficulty, but in this case there are two distinct versions of OTO’S PLANET, one for each headset.
G. F. – The biggest challenge was clearly for Dpt. to adapt the project to the Vision Pro. I was very excited about the arrival of Apple’s headset, and it’s a real achievement to be one of the first on the market. The visual quality is incomparable, even if the headset is very expensive. It gives the details of the experience a new dimension, and it’s a real pleasure to watch.

N. S. R. – Beyond the technical constraints, what’s interesting about adapting it for the Vision Pro, and also making a mixed reality version for Quest 3, is that originally we were only going to do it in virtual reality. But, quite naturally, it’s a 360-degree story that fits quite well with MR, with its ‘small world’ feel, like The Little Prince. The experience takes place in front of us, in the centre of the room. Unlike VR, which immerses you, here you invite the universe and its characters into your home. It really gives you another perspective on the context of the project.
J. B. – Adapting it for mixed reality certainly felt very natural, whereas there were more questions early on about moving it from a flat format to VR. I’d be curious to know how people feel about it now that the festival cycle (where it was presented in VR) is over.
OTO’S PLANET, what’s next?
G. F. – For the future, we have other projects, which don’t share the same format. Obviously, we thought it would be great fun to tell the stories of the other characters. I’d love to know what happened before with Exo. It could be great fun to dig a little deeper and ask: what would Exo’s Planet be, for example? Or other crazy planet ideas…