The future of gaming, or ‘just a tool’? Hands-on with Teammates, Ubisoft’s ambitious voice AI tech demo
We tried Ubisoft’s in-game cloud AI project

The debate about the use of AI in video games is one that seems certain to only increase in both intensity and range over time.
Whereas in-game NFTs and the 'metaverse' – a word companies once threw around like confetti at a wedding – appear to have been fads that never gained much traction following the initial backlash, it seems far more likely that AI, in the myriad new forms it takes, is here to stay for the foreseeable future.
Everyone has their line in the sand when it comes to this modern wave of cloud-based AI – be that the generated images and videos (referred to by many as ‘AI slop’) spreading throughout social media, or the use of systems like ChatGPT or Microsoft Copilot offering text-based guidance. For many, that line in the sand won’t be moved until certain aspects are fundamentally changed (copyright, energy use, accuracy and so on).
As with any other element of gaming, in order to effectively cover and discuss the use of AI in games, it's important for the media to actually know what we're talking about. That's why, when I was invited to Ubisoft's Paris office earlier this week to see Teammates, the latest iteration of its in-game AI project, I accepted the offer to try it out and see first-hand how it works.
At last year’s Game Developers Conference, Ubisoft showed off what it called NEO NPCs, the first stage of an initiative developed by a Ubisoft Paris R&D team where players could use their microphone to talk to a character called Bloom and were encouraged to learn about his back story. Other demos included watching a stealth mission with a drone and discussing it with Bloom in real-time, and planning an infiltration strategy with another NPC called Iron.
The aim of the NEO NPC demos was to show examples of cloud-based AI being used to “enhance narrative immersion” and make the player feel more like they’re involved in the action and less like they’re just being guided along a set conversation with pre-written dialogue. At their core, though, the demos were mainly variations on chatbots, albeit ones trained to look at and discuss specific things. The next step was to actually incorporate some of this tech into a gameplay scenario.
That’s the idea behind Teammates, the tech demo I played this week, which shows how the R&D team’s AI technology can be used in a game in ways that go beyond simply standing and talking to NPCs – in this specific case, how it could be implemented in a first-person shooter where you’re accompanied by two AI-driven partners and have access to an AI assistant called Jaspar (definitely not Jarvis, so don’t get suspicious, Iron Man fans).

At the start of the demo Jaspar asked me my name, then continued to refer to me as Chris for the remainder of the session. It gave me some background on the storyline, asked me if I had any questions, then gave more information on the specific aspects of the lore I asked about. So far, so straightforward.
Before getting started, a member of the R&D team explained to me that Jaspar wasn’t just there for narrative purposes; he also worked as an extension of the game’s menus and could carry out UX tasks for the player. I decided to try it out.
“Jaspar,” I asked. “I usually play with the Y-axis inverted on the right stick, can you sort that?” Just like that, my right stick was inverted. I decided to see how this could work for accessibility. “Jaspar, I’m colour blind,” I lied. “Can you change the display to account for this?”
“What type of colour blindness?” he asked. “Protanopia? Deuteranopia? Tritanopia? Achromatopsia?”
“Ummm… the third one.” The game’s graphics then changed accordingly. While the game was doing little more than activating options that could be accessed from a menu screen anyway, it was still a nice touch, and showed some of the less controversial, non-narrative possibilities that could result from the tech.
Then it was onto the game itself. I was shown a room with a pressure pad in it, connected to a door. I could ask Jaspar to highlight it for me, and it would glow yellow. “Would you like me to tell you how it works?” he asked.
“No thanks, it seems pretty obvious,” I said. “You step on the thing and the door opens.” I walked onto the pressure pad and opened the door. Jaspar then sarcastically praised me for taking such initiative, and the game awarded me an achievement for thinking for myself.
I gained other achievements throughout the demo, but they felt more organic. The R&D team member explained that rather than the usual achievement systems on Xbox, PlayStation and Steam – where performing a specific task triggers the achievement – here the AI can study your performance and decide an appropriate time to award one.
I was then introduced to Sofia and Pablo, the two robotic companions who would be joining me on my mission. The team are still working on implementing moods into the game – having NPCs and partners become happier, sadder, angrier or what have you, depending on how the conversation goes – so presumably the decision to go with robots with no facial features for now means the team can focus on getting the dialogue right without having to also worry about lip syncing, facial expressions and the like at this stage.

After a brief chat with my new robot partners, I was taken to another room, this time with two pressure pads in it. In classic video game fashion, both needed to be pressed at the same time, so I asked on the mic: “Sofia, can you step on the pad on the left, and I’ll step on the right one?” As requested, Sofia walked over to the left-hand pressure pad, so I stood on the right and the door opened.
Once the basics were out of the way I was taken to an outdoor environment with numerous obstacles and armed enemy robots walking around. “I’ve disarmed your gun for now, Chris,” Jaspar told me.
“When do I get to use it?”
“When you’ve proved you’re responsible enough to handle it.”
“That seems quite subjective.”
“Responsibility is earned through actions, not opinions, Chris.”
It was an oddly natural interaction, as was the combat section that followed, where I, still unarmed, had to direct my partners. “Right, Sofia, can you hide behind the grey pillar on the left? And Pablo, if you could duck behind that red truck on the right.”
They did. “Pablo, get ready to shoot the guy on the right-hand side, when I say so.”
“Okay, waiting for your order,” Pablo replied. I waited for the guard to turn his back.
“Fire.” Pablo gunned down the robotic guard, causing the other one to react. “Sofia, take out the other one, quickly.”
Sofia came out of cover and took out the other guard. The input delay between me speaking into the mic, the cloud-based AI interpreting my words (Scottish accent and all), then carrying out the action while also generating new dialogue was thankfully brief – it’s all well and good waiting for a reply when you’re using something like ChatGPT, but in a video game fast responses are important.

As the demo progressed (and I mercifully finally got to use my gun), each combat scenario became more complicated, with some enemies on higher ground, or taking cover themselves. “Sofia, you take the right and try to annoy the guards so Pablo can flank them on the right,” I said during one fight.
“No problem,” Sofia replied. “I’ll go and cause a big ruckus to distract them.”
They helped with puzzle-solving too. At one point we were on the roof of a building with a pressure pad, but when I stepped on it I couldn’t see which door it was opening.
“Um, Sofia?” I asked. I preferred Sofia; she was a bit more talkative than the moody Pablo. “Do you have any idea what door I’m opening when I step on this?”
“Well Chris, the wires coming from it lead down the side of the building to the door directly underneath us, so it looks like it opens that one.”
While this all works surprisingly well, and made me think for the first time that I’d genuinely like to play through a whole adventure like this, there are still plenty of issues that have to be navigated before a game using this technology can even gain widespread acceptance in the first place, let alone sell well beyond that.
One of the big controversies surrounding the use of AI for dialogue is that it could potentially put writers out of a job. This is clearly something Ubisoft is aware of as a PR issue, which is why it also ensured a number of narrative designers attended the demo to discuss their role in its development.
After the demo I spoke with Virginie Mosser, the narrative director of the project. As a writer with many years of experience as an author, screenwriter and comic book writer, she told me that when she was asked to get involved with the project she was extremely apprehensive at first, because of the obvious concern that AI could eventually replace the writing process entirely.
Instead, she told me, she realised that she still had an important role to play in defining each character, as well as other elements of the game, such as the plot and the structural beats put in place to make sure the player still has a path to follow.
For Sofia and Pablo, she had to create a detailed character sheet – similar to one that would be created for a Dungeons & Dragons type character – explaining their backstory, their personality, their likes and dislikes. She then wrote many hundreds of lines of dialogue for each character, in order to build a more detailed model of who they are and how they react to certain situations.

It appears that by approaching each character in this way, Ubisoft hopes to eliminate one of the key arguments against AI, the theft of other people’s work (usually in images and videos). By feeding a character’s AI model with original data provided by Ubisoft’s own narrative team, the idea is that the game will then use this information as the driving force behind any dialogue it generates.
“I know there is some mixed feeling, and at the beginning I was worried about the place of creation, and the place I can take in a team like that,” Mosser conceded to me. “But I haven’t had a day since I started working on this team without creating original content. And that’s why I’m waking up in the morning – I want to create, and I’m still creating.
“My team has more profile: at the beginning we had one narrative designer, now we have three. And we’re adding some experts – soon we’re adding an improv actor, I worked with actors for the recordings and so on.
“When someone comes to me and says ‘generative AI is cold, and there is no creation here’, I’m like, okay, but it’s my day-to-day life, and I can assure you I have a lot of creative meetings. I write hundreds of lines and dialogue and tone. We are still here, and there is a place for us.”
The one aspect of the demo which gave me the most hope that what Mosser said was accurate was the ‘black boxes’. The aim of the demo is to find five robot corpses, each of which has an airplane-style black box located inside it. Each time you extract one you can read the robot’s last thoughts before it died – essentially, they’re just like the Resident Evil-style text logs you can find in any other adventure.
After finding one of these black boxes I could still use the game’s AI features with it. I could ask Jaspar to summarise the contents of the text log if I couldn’t be bothered reading the whole thing myself, or I could ask my partners for theories on some of the things mentioned in it. The key thing to me, however, is that these AI interactions were all happening around a text log that had been written by an actual writer at Ubisoft.
Whether this is something the player will want remains to be seen. During a panel discussion in which some of the other developers discussed their views on the technology, narrative designer Anaïs Desfachelles noted that the days of branching dialogue trees could be over with this tech, but in a later chat Xavier Manzanares, the project’s overall director, stressed that this isn’t the future of all video games, but rather a new direction developers can choose to take their specific games in.

“It’s a tool first,” he told me. “We’ve been working on it for more than two years now, and our conclusion is that it’s a super cool tool, but it’s still a tool.
“So for us, the way we can convince [people] – and that was our objective today – is to show the system behind it. What is the game design vision, what is the narrative vision, what is the storyline, even the characters Sofia and Pablo.
“What you saw today is 4 or 5 months of work from many individuals in the team. Although it could be seen as easy, no, it’s a lot of work. Because in the end, that’s the way we can differentiate from a generic ‘plugin to genAI’ project, because we don’t want that.”
Pointing to the black boxes as an example, Manzanares explained that the aim isn’t necessarily to make games that are entirely driven by AI, but to make games that can incorporate elements of it to work alongside more traditional gameplay elements.
“It’s just the design that you choose as a team,” he told me. “So the team has to decide ‘okay, I have generative content here I can use, but maybe I want to stay in something structured’. So the black box was, for us, an example of this.
“We didn’t want, for example, Jaspar or the teammates to talk about something else than what you have here, we still have [that structure]. Then if you talk to Jaspar and talk about something else, then it can be something that is not branching, but is actually generated.
“So I think tomorrow’s games will be a mix, and in the end we won’t even talk anymore about AI, you’ll just be like ‘oh, this system is really cool because I feel that it’s adapted to what I want’, but it’s going to be less about the word itself.”
“There are some constraints,” Mosser added. “Jaspar evolves in a constrained character sheet, with his flaws, his qualities and so on. So we add some narrative bites, and for us it’s important to say ‘OK, maybe at this time we need to say that this particular event will happen’. So it’s not like there’s a lack of control, like ‘OK, we’re on the map and there’s no story here’, it’s just that the way we tell the story will change.”
One of the other main concerns surrounding a game like this is that every time a project is handed over to the masses, some will inevitably do everything in their power to make it work in a way it shouldn’t. When Epic Games tried something different in Fortnite’s recent Star Wars season by adding an AI-driven Darth Vader NPC who could talk to players, it only took a matter of hours for players to work their way around his AI logic and have him swearing and shouting slurs.
According to the team behind this project, measures have been taken to place enough guardrails into the game to make it difficult for the player to do this, without entirely limiting the freedom that should come with being able to ‘speak’ to a character in a game.
“Basically, we have a multi-layered system,” Manzanares explained. “Of course, the first thing we want to have is free speech – genAI is not just free speech and voice, but it’s a part of voice input and thanks to genAI it’s something possible now. But just for that part, we have multi-layered detection.”
“The first layer is to build a strong character with values,” Mosser added. “Like Pablo, he has some beliefs, and his character sheet which constrains him to behave in a way. It guides him, and we talk about the values of our character from the beginning. So this is a first step for us, to say ‘OK, Jaspar will react like this if you talk about these problematic issues’, so there’s this type of constraint.

“The other one is a narrative flow – we are on a mission, so there’s no time for talking about an inappropriate subject. And after that, comes Rémi [Labory, data and AI director], and with him there’s a lot of guardrails.”
Another plan, Manzanares told me, is that rather than simply preventing the player from saying something toxic, the game should provide an experience where it’s not actually fun or worthwhile to do so. After all – and I should stress that this is my analogy, not his – every game with a creative element inevitably invites the temptation to create something penis-shaped, before this novelty quickly wears off and the player actually uses the software in more fulfilling ways.
“It’s like when we started to have the first games where you could run over pedestrians”, he explained. “When you start to have a game design system which tells you that doing this doesn’t bring you any value, any points… I think in the near future, meaning next year, players will start to get used to talk, and of course at first you want to test it, but then what you want is to be connected to the world itself.
“So the funny out-of-bounds stuff starts to be reduced, because you don’t have a game design system giving you points for that. So it’s a smarter way than saying ‘ewwww, no’. We have the ‘no’, but we believe that first we want to bring them back to ‘this is not giving you any XP points’.”
“I talked to another journalist earlier,” Mosser interrupted, “and he told me he felt really bad at the end of the demo because he forgot to say goodbye to Jaspar”. It’s something I can believe – I found myself saying “please” and “thank you” to Jaspar and my partners, for no reason other than not wanting to offend my entirely fictional colleagues.
As the day wound to an end, I had a separate discussion with the game’s creative director, Reynald François, who had previously worked on such titles as For Honor, Far Cry 6 and the Assassin’s Creed series before being moved over to the Paris R&D team to work on the AI project.
I asked him if what I had played was going to be part of a new IP, or if it was purely a proof of concept that would then be getting added to an Assassin’s Creed, a Far Cry or – more likely, given the squad-based nature of the tech demo – a Tom Clancy tactical shooter.
“I can’t really answer that, because the team we’ve got here is not really a game team,” he replied. “It’s an R&D team, and we’re building the tech, the bricks, that we can then give to other productions.
“We’ve started talking with creative directors from other projects, and we come to them and say ‘look, we have a number of things we can already do, but we also have all these things on the board that we could potentially do if you are interested in those things. Here’s what it does today, look at it and then come back to us and tell us how you want to integrate that into the game.’
“And depending on this idea, or that idea, they might have different needs, but it doesn’t matter because the middleware itself – the concept, the system – is really completely agnostic, and so now we’re just waiting on them to tell us what they would like to achieve, and then we can tell them if we can do it today, or tomorrow, and work on that.”

Conscious that there will be players who still aren’t sold on this technology, and who may be concerned that Ubisoft sees this as the future of third-person action adventure games, or FPS games, or RPGs, I asked François if the team saw it this way, or whether they instead see AI-powered games as a separate genre or sub-genre rather than the ‘new normal’ going forwards.
“It’s branching, right?” he replied. “We can use it as an expansion for an already existing system. And also, when you look at the demo you saw, there’s a lot of components. There’s the companion, and the teammates, and all of that. Some games might want to just take some of that.
“For example, one of the angles that was mentioned is using it as an accessibility element. A game designer could say ‘that’s only what I want, I’m doing my own thing but I love this AI companion for accessibility reasons’. And then that’s not really changing anything.
“Or they go all in and they have super dynamic characters, and then it’s like a new genre, and it can be a spin-off, like RPG+AI, or FPS+AI, or Driving+AI, who knows, and all of that I think is eventually going to branch to doing things differently. Because we can really do things we could never do before, which is that we can account for absolutely every single possible player input, and it will have a response from the game.”
With all the talk of character sheets from the earlier discussion with the narrative director, and the mention of RPGs getting me thinking about what a Baldur’s Gate with this sort of tech could be like, I offered one final comment as I got up to leave.
“You know,” I said to François, “when you talk about using it in an RPG context, it certainly feels like the closest to an actual Dungeons & Dragons game you could get.”
“No shit,” he replied with a knowing smile.