"AI is not coming for your job. It's coming for your workflow"
NAK3D's Kelly Vero tells us how she's using AI to reduce digital fashion asset creation from weeks to minutes.
Hello! Welcome to the latest edition of AI Gamechangers, the weekly newsletter where we chat with leaders from the games industry who are doing practical, innovative things with AI. You can read six months of these interviews for free in the archive.
Our Q&A this week is with Kelly Vero, a creative powerhouse and veteran of the games industry. She discusses how she combined her unique background to create NAK3D, an AI-powered platform that rapidly generates digital assets for games, the metaverse and e-commerce.
Looking for more AI insight? Scroll to the end for a round-up of the latest news and links from around the web.
Kelly Vero, NAK3D

Meet Kelly Vero, a multifaceted creative leader with over 30 years in the games industry. Her expertise in the metaverse and digital transformation has led to her being sought after for speaking engagements worldwide. Kelly is also an author of both reference books and fiction. In 2022, she founded NAK3D, a company that uses AI to rapidly generate 3D in-game assets, bridging her interests in technology, games, and fashion. We spoke at length about her background, her vision for interoperability, and the role AI can play in helping an entrepreneur build a business.
Top takeaways from this conversation:
AI should augment workflows, not replace jobs. AI's value lies in accelerating existing processes (in Kelly’s case, reducing the creation time of digital fashion assets from weeks to minutes).
Start small and practical with AI implementation. Kelly emphasises building incrementally with available tools (including open source varieties) rather than seeking massive funding upfront for a single ambitious plan.
True interoperability is key to digital assets' future. When it comes to the metaverse, Kelly’s vision of creating truly platform-agnostic digital fashion items suggests that the future of in-game commerce depends on breaking down platform silos.
AI Gamechangers: You have a varied background that includes games, tech, fiction and more. How did you get to where you are now with NAK3D?
Kelly Vero: Let's start at the beginning! I'm a failed fashion designer. That's an integral part of this conversation.
When I was about 19, I’d been coding for nine years, on and off, for fun. My parents bought me a ZX Spectrum, 16k. I got hooked on BASIC. I'm a pretty obsessive person, and so I wanted to master everything that I could.
I found myself being a massive loser. I didn't go to university. I actually went to be a catwalk model for a couple of summers in Paris; I did Paris Fashion Week. My dad dragged me back from Paris by my ear, saying, “You've gotta do something with your life!” I could have been Karlie Kloss, but instead, I went down the route of working in this LGBT bar. A guy came up to me and said, “You’re really exciting!” Even in those days, I had weird hair and a general vibe because I used to make my own clothes. He said, “What else can you do?” I told him I could code, so he said, “Come to my office on Monday. I've got some work for you.” And I started working in video games as a result of that.
That was about 1992, and I've worked in video games since then. I’ve worked on some of the greats: Tomb Raider, Transformers, Halo 3, Subway Surfers… I've shipped maybe 30 to 50 games.
“I create items that work across every single platform. What we’ve developed is something totally interoperable. I can take a stylised item and place it into a Meta space, or I can drop the exact same item directly into The Sims. It exists in different universes”
Kelly Vero
I did enjoy doing that. One of the things I learned as a game designer was that even though coding is so important, actually it’s the aesthetic that sells the game. Coding moves the blood around the body; it’s the arterial centre of a video game. But anything that has some kind of aesthetic weight to it is going to do all right – we eat with our eyes as people. I worked that out early on in my career because I wanted to work in fashion. I couldn't draw, so there was no way I was going to St Martin's. But I could code, and I knew what looked good.
There are great people like Alex Horton, who was a creative director on Grand Theft Auto, who, because of that, said to me, “You have an eye for what looks good in a video game.” That is important if you want to develop because you need to have your feet in both camps. You've got to understand what the user wants, and you've got to have a gut feeling about what looks good.
I cut my teeth in the world of coding and doing literally as much as I could around every single game studio. Even working at Sony’s Santa Monica studio, I wanted to know what everybody was doing! And also in Sony Japan, when I worked on Final Fantasy – I wanted to immerse myself in every aspect of it, so I would one day be able to say, “I know kung fu!”
You've got games in your past, you've got fashion. How did that all come together in NAK3D?
It coalesced into NAK3D because I felt there was an interesting bridge that could be built between fashion brands, especially in games.
I was working heavily with digital twins, using game technology. I had moved to Switzerland, and realised quite quickly that we don't have much of a games industry in this country. So I decided to use some transferable skills and transfer my knowledge of game technology and coding to start building bits and pieces and apps in digital twin technology.
There was a definite lack of anyone looking cool in a video game! Nobody looks good in a video game. If you're going to wear a black polo neck in a game, I wanted that black polo neck to be a Tom Ford black polo neck rather than just something some guy had conceived.

Remember, back in my time, most concept artists in video games were dudes. There were few women around. It used to annoy the hell out of me. Marketing, maybe, working in the typing pool, doing admin. But why couldn't they be designers? Why couldn't they be artists? So, I wanted to bring a little bit of flair, chicness, and couture to the industry.
Please talk us through how the NAK3D technology works.
On average, it's going to take any common-or-garden artist between four hours and possibly four weeks (depending on the object they're creating) to conceive a piece of art, model it, and render it to the point where you can see it in a game.
I started with digital twin technology. We’d been scanning items into point cloud data, using MRI and CT scans like you would find at an airport. Then we segment that point cloud data into shapes: dresses, tops, shoes, and so on.
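In toy pseudocode terms, that segmentation step might look something like the sketch below. This is a hypothetical illustration only, not NAK3D’s actual code: real point-cloud segmentation uses trained models and geometric features, but the idea of sorting scanned points into garment categories can be shown with a crude height-band heuristic.

```python
# Hypothetical illustration (not NAK3D's real pipeline): label each point of
# a scanned-garment point cloud by height band, a crude stand-in for the real
# segmentation that sorts point cloud data into dresses, tops, shoes, etc.
def segment_by_height(points):
    """points: list of (x, y, z) tuples in metres; returns one label per point."""
    zs = [p[2] for p in points]
    lo, hi = min(zs), max(zs)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat scan
    labels = []
    for z in zs:
        t = (z - lo) / span  # normalised height, 0.0 (floor) to 1.0 (head)
        if t > 0.66:
            labels.append("top")    # upper third of the scan
        elif t > 0.33:
            labels.append("dress")  # middle band
        else:
            labels.append("shoes")  # lower third
    return labels

cloud = [(0.0, 0.0, 0.1), (0.0, 0.0, 0.9), (0.0, 0.0, 1.7)]
print(segment_by_height(cloud))  # ['shoes', 'dress', 'top']
```

A production system would replace the height thresholds with a learned classifier, which is exactly the kind of re-segmentation work Kelly describes taking upwards of a day before AI entered the pipeline.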
The rug was pulled from under us because the startup failed. I thought it would be a good idea to continue because it was something that people were interested in. But it took so long to do a digital twin – even though we were [improving on] that four weeks, it was still taking upwards of a day to re-segment the point cloud data. We’d started off not being AI-powered.

I said to my business advisors, “I want to make a digital fashion factory. I want to make a factory that produces digital objects at scale for the fashion industry.” They said, “You need a pilot partner.” So, I started working with the well-known e-commerce label NET-A-PORTER. I got the database of every single item of clothing they had and started to build a data set.
I tried to do what art teams do, but faster. With the addition of AI, we were able to turn those data sets into training models. Now we've got training models from all of the meshes in the world that have been made for fashion. I brought those in, started training our engine on it, and now we can create digital fashion in under four minutes flat.
That means I can now scale out really quickly in terms of content or live ops for any game studio. Predominantly, my customers are fashion brands, because e-commerce is big – there's a big turnover of inventory. I take that information, plug it into NAK3D, and it starts generating items straight away. I can then re-texture or re-topologise if I feel it needs a little bit of a tweak. But it takes no more than about 10 minutes to an hour to go through, QA, and re-render some items, because AI has allowed us to move faster.
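The shape of that workflow – inventory in, generated assets out, with a small review queue for items needing a manual tweak – could be sketched like this. All names here (the item list, the `.glb` mesh naming, the status flags) are assumptions for illustration, not NAK3D’s actual data model.

```python
# Hypothetical sketch of the workflow Kelly describes: feed e-commerce
# inventory rows in, auto-generate an asset record for each, and flag any
# that need a manual re-texture / re-topology pass before final QA.
def run_pipeline(inventory, needs_tweak):
    """inventory: list of item names; needs_tweak: set of names flagged for rework."""
    generated, review_queue = [], []
    for item in inventory:
        asset = {"item": item, "mesh": f"{item}.glb", "status": "generated"}
        if item in needs_tweak:
            asset["status"] = "needs_retopology"  # human tweak before QA
            review_queue.append(asset)
        generated.append(asset)
    return generated, review_queue

assets, queue = run_pipeline(["polo-neck", "trench-coat"], {"trench-coat"})
print(len(assets), len(queue))  # 2 1
```

The point of the split is the one Kelly makes: generation is automatic and near-instant, so human time is spent only on the short review queue, not on modelling every item from scratch.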
We don't have to take big strides. We just have to go a bit faster. Because if we don't, we're always going to be in competition with the likes of Shein or Wish or companies that produce fast fashion, and then that ends up in landfill. Well, just imagine that I'm taking that landfill headache away by developing samples really quickly using AI, and then you don't have to create those samples in the first place.
You presented this in Davos and received plaudits for NAK3D’s work. Can you tell us about the validation you got there?
January 2024, lacking in sleep. Let's be honest: people go to the World Economic Forum to have a party! You can do quite a lot of business there, but it's good for celeb spotting or listening to interesting talks from people who you’d usually pay inordinate amounts of money to see. I had no sleep the night before.
“People are having a panic about AI, but it's not about designing AI that's going to do all of our writing and creative art. I think you need to have balance, and I hope that we're in that position now in 2025”
Kelly Vero
I went to the event the next day and felt completely under-prepared. We each had seven minutes to present a pitch, and that pitch had to be the alpha to the omega: what are you looking for in terms of investment? Where do you see yourself in five years? I was rambling, but coherently, and I managed to win Startup Innovation of the Year. That was a shock!
Looking back, was there a light-bulb moment where you realised that AI was going to be important?
In 2012, when we were working on Transformers Universe, we were using AI to pre-program all of our Transformer vehicle-to-robot transformations. We were also using AI for camera controls (because we'd seen an early version of Forza do something similar – nothing in the games industry is new, we borrow from everybody else). More recently, I've been heavily involved in the metaverse, and I've seen how there are a lot of bots around.
[OpenAI’s] Sam Altman said something that resonated with me, and that was that we’ll see the first billion-dollar company run by one person and a machine. He said that's going to happen soon. I use that as my North Star. That's the way that I definitely see things going.
I intended that we would use AI at NAK3D for speed and throughput, because trying to do it all manually is super difficult. I was on stage at Pocket Gamer Connects, talking about AI and ML pipelines, five years ago. The scales fell from everybody's eyes more recently, but at the time, everybody said I was completely nuts! “There's no way AI is ever going to infiltrate our workflow.” But I said, “Mark my words: it's coming for you. It's not coming for your job. It's coming for your workflow.” And I think that was the key point I was trying to make there.
I live in Zurich, which is the home of cybernetics, certainly in Europe. We have a super university called ETH, funded by the likes of Disney, Unity, and organisations focused on doing inventive things. I'm proud to say I live in the city which invented moving image segmentation. So I was quite excited when I did that talk at Pocket Gamer Connects to go through their research papers and see that they've been there, and that was perhaps five years before I even started talking about it.
It's interesting from that perspective: there are people who are having a panic about [AI], but there are also people like me thinking, “No, this is going to make my job simpler and better.” It's not about designing AI that's going to do all of our writing and creative art. Not at all. I think you need to have balance, and I hope that we're in that position now in 2025.
Last month, DeepSeek came out and disrupted everything. Were you surprised that something came out of China like that?
I've got four words to say to you, and that's, “Gong Hei Fat Choy” [this interview took place close to Chinese New Year]. DeepSeek is a Lunar New Year present from China to the world, I think!
There are a few things that we need to unpack with DeepSeek. Sam Altman built a company [OpenAI] that surrounded him with people who could tell him, “Everything's going to be okay.” But DeepSeek was a guy and a couple of his mates building this thing, and it makes me say, “You do not need to surround yourself with all these people to develop this product for you!” I read the GitHub paper a couple of nights ago and it reads a bit like the white paper we wrote for NAK3D – I am one woman, and I developed my original prototype. Sure, I needed a team to help with bits and pieces, but I did pretty much all of it myself, and I bootstrapped it. And to a certain extent, DeepSeek is the same.
“Coding moves the blood around the body, it's the arterial centre of a video game. But anything that has some kind of aesthetic weight to it is going to do all right – we eat with our eyes as people”
Kelly Vero
The only issue I think a lot of people have with DeepSeek is, “The Chinese are going to web hook into everything we're doing!” I kind of hate that, because I think that really negates the growth of technology. If we keep being xenophobic about how technology is created, we're never going to be able to find tech parity in what we create, and what our vision is.
I read the paper, and I was pretty blown away by it, because it was thorough. He built everything from open-source technology – that's why it's cheap. Guys, do not go out and say, “We are going to build this thing, and we need $50 million to build it today”. Instead, you should go out and start doing exactly what he did: build things in little bite-sized chunks, prototypes, and then start to stick those things together, and you'll find quite quickly that your data sets will just run effectively. There are loads of people working in the background to make that happen in open source. Open source is the way!
You’re very influential in the metaverse space. Some of the things that people “discovered” in the metaverse in the last few years were things that multiplayer games have been doing for years! Do you think that with your background in games, your knowledge of tech and fashion and AI, you were perfectly placed to take advantage of the metaverse?
It really did blow up during the pandemic, didn’t it?! But yeah, I’m a big believer in it. I wrote a book about the metaverse to help people understand how they should be making things for it.
The metaverse brings together everything. Fifteen years ago, I was talking to Xbox and asking for it to be an open source platform. I was working with a company in Silicon Valley, having conversations like, “We do voice over IP, can we please do games over IP?” The metaverse is a foundational layer which we build everything on top of.

We talk all the time about the metaverse being a persistent, online, shared space, but no one walks the walk. They just want you to drop in and buy something, and then do whatever the hell you want. They don't care. There's no kind of “after-sale service”. It's very fragmented. But the metaverse shouldn't be fragmented. It should be completely seamless.
What I do at NAK3D is create items that work across every single platform. What we’ve developed is something totally interoperable. I don’t even think Ready Player Me is doing anything this interoperable right now. I can take a hugely stylised item, export it as a GLB file, and place it into a Meta space, or I can drop the exact same item directly into The Sims. It exists in different universes, but it’s still the same item.
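One way to picture “one asset, every platform” is a single shared mesh with per-platform export presets layered on top. The platform names and constraint values below are purely illustrative assumptions – the source only confirms that the same GLB item can go into a Meta space or The Sims – but the structure shows why a neutral format like GLB makes interoperability tractable.

```python
# Illustrative sketch only: the platform names and settings here are
# assumptions, not NAK3D's real export targets. The idea: one source asset,
# many platform-specific export specs, all referencing the same GLB mesh.
PLATFORM_PRESETS = {
    "meta_space": {"max_triangles": 20_000, "texture_size": 1024},
    "the_sims":   {"max_triangles": 8_000,  "texture_size": 512},
}

def export_targets(asset_name):
    """Return one export spec per platform for a single source asset."""
    mesh = f"{asset_name}.glb"  # the shared, platform-agnostic format
    return [
        {"platform": platform, "mesh": mesh, **preset}
        for platform, preset in PLATFORM_PRESETS.items()
    ]

specs = export_targets("black-polo-neck")
print([s["platform"] for s in specs])  # ['meta_space', 'the_sims']
```

Because every spec points at the same `mesh`, the item “exists in different universes” without being remodelled per platform – only the per-platform constraints change.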
Ideally, what we want to have is some form of omniverse or metaverse that enables us to literally do hub-and-spoke experiences and keep that software on a level.
What are your plans for NAK3D now? What’s your roadmap?
I want to develop an API, and I want that API to be used in every single game studio. I'd love them to be able to just use the technology.
Now, at the moment, the difficulty I have, because I'm completely bootstrapped, is that I have to wait for revenue before I can carry on doing further development. This is good because it means we're constantly testing the data sets, and we're bringing in more. We've got about 10,000 lines of inventory, and I've just added another 6,000 over the last two weeks. We're pretty much at capacity now in terms of what I can handle.
I want to build the business. I'd like it to be much bigger. I'd love people to get their hands on it and play with the API and see what they think about it. Hopefully we can get to that place in 2025. (“We?”! I make it sound like it's a big corporation. It's just me and my machine!)
Looking forward over the next few years, do you have any wild thoughts about what AI is going to bring to the world of games, to the metaverse, to fashion?
I've always built things that are game-related with a view to being able to use the technology in other areas. We all tried it during the pandemic: Unity, Unreal. They definitely tried to diversify a little bit.
I think what we'll find is the death of websites and e-commerce. I'm just going to say that right now, because I think games commerce is going to be a really big thing. Look at how many ad tech companies there are at conferences like Pocket Gamer Connects and how many solutions there are for additional revenue streams. People are starting to turn towards that, to think about how they're reaching their consumers in different ways. If you want to have an interconnected metaverse, you've got to plug things into that platform. And I think we're coming to a place now where we'll focus on the ability to be able to do everything inside a video game.
Further down the rabbit hole
What’s been happening in AI and games? Your selected news round-up:
“We're launching an AI gaming studio at xAI,” announced Elon Musk during a broadcast about Grok 3, joking that it might be called Epic Games before conceding that’s already the name of a successful games company.
Outraged tech writer Ed Zitron has crunched the numbers and concluded that the generative AI industry is a con, burning too much money to ever be good business, and calling the promises of CEOs “equal parts ridiculous and offensive.”
Web3 game makers continue to embrace agentic AI, with Metaking’s strategy game Blocklords enabling players to create their own in-game agents, and the Sui blockchain forming a partnership with Talus to launch an onchain framework, Nexus.
AI-powered game creation platform Ludo recently launched a 3D generator tool to enable developers and 3D artists to create 3D models from text and 2D images.
ByteDance, the TikTok company, has unveiled OmniHuman, an AI tool that transforms photographs into naturalistic videos of folk chatting, singing and dancing.
Microsoft has published a paper in Nature outlining Muse, a generative AI tool specifically designed for gameplay ideation. The model is formally called a World and Human Action Model (WHAM) and was trained on human Xbox gameplay data.