EA partners with the company behind Stable Diffusion to make games with AI
-
"Making games with AI" sounds like hell… like this is what hell must be.
Sitting there prompting AI, getting shitty ass results, prompting it again and again until you eventually settle for slightly less shitty results. The frustration and the loss of agency… oh God, someone should make a psychological horror about this: a frustrated artist forced to ditch their skills and tools and use AI to bring their unique vision to life, and throughout the film you watch them descend deeper and deeper into madness and depression until they burn down a data center and laugh maniacally as it disintegrates around them.
I don't make computer games with AI, but I do create tabletop roleplaying adventures on a regular basis. I use AI for a lot of it. You have no idea what the workflow is. It's not hell, it's an enormous boon.
-
"Making games with AI" sounds like hell… like this is what hell must be.
Sitting there prompting AI, getting shitty ass results, prompting it again and again until you eventually settle for slightly less shitty results. The frustration and the loss of agency… oh God, someone should make a psychological horror about this: a frustrated artist forced to ditch their skills and tools and use AI to bring their unique vision to life, and throughout the film you watch them descend deeper and deeper into madness and depression until they burn down a data center and laugh maniacally as it disintegrates around them.
That's... not what this is about?
The point of integrating AI into games is to provide further diversity within the game.
Think Skyrim. By default you're limited to 3-4 discussion options, right? Imagine now, if you will, that you could just... type in anything, including emotional markers, and have the characters respond interactively to the statement and tone. No longer are you bound by limited dialogue in RPGs.
Visual generative AI will just spice up the visuals - hopefully. Things like repetitive textures will disappear as the game generates brand-new textures for each grid element, or creates tons of background characters without the need to specify them individually. The list goes on.
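The dialogue half of that idea can be sketched without any model at all: split an emotional marker off the player's free-form line so tone and text reach the NPC separately. Function names and the `[tone]` marker convention are invented for illustration; a real game would feed the resulting prompt to whatever dialogue model it runs.

```python
import re

def parse_player_input(raw: str) -> dict:
    """Split an optional leading emotional marker like '[angry]' from the
    player's free-form line, so tone can be handled separately from text."""
    match = re.match(r"^\[(\w+)\]\s*(.*)$", raw.strip())
    if match:
        return {"tone": match.group(1).lower(), "text": match.group(2)}
    return {"tone": "neutral", "text": raw.strip()}

def build_npc_prompt(npc_name: str, persona: str, player: dict) -> str:
    """Assemble the prompt an NPC dialogue model might receive."""
    return (
        f"You are {npc_name}. {persona}\n"
        f"The player says (tone: {player['tone']}): {player['text']}\n"
        "Reply in character, and react to the tone."
    )

player = parse_player_input("[angry] Open the gate or I burn this village down.")
prompt = build_npc_prompt("Hadvar", "A weary soldier guarding the gate.", player)
```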
-
This post did not contain any content.
It’s like using AI to come up with a recipe.
Sure, you could use the recipe from a famous chef that will be good, or, you know, just take your chances with AI and maybe you won't get food poisoning.
AI would be so useful if we weren’t already in the Information Age.
-
This post did not contain any content.
Who fucking asked for that? WHO??!!
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
-
You have optimism. It could be used for good like you describe, but most likely will be used by EA to cut artists out, slash costs, make as much profit as possible and suffocate the company until the private equity owners move on to the next company.
-
No, that's exactly what this is about. They came right out and said as much. It won't work, but they'll cause a lot of damage in the process of failing.
-
Procedural generation with appropriate constraints and a connected game that stores and recalls what's been created can do this far better than a repurposed LLM. It's hard work on the front end but you have a much better idea of what the output will be vs. hoping the LLM "understands" and remembers the context as it goes.
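A minimal sketch of that approach, assuming a seeded world and a simple grid: each cell derives its texture variant deterministically from a world seed, and generated results are stored so the world recalls exactly what it created. All names here are illustrative, not from any engine.

```python
import hashlib
import random

class TextureVariator:
    """Deterministic per-cell variation with persistent recall:
    the same world seed always yields the same world."""

    def __init__(self, world_seed: int, n_variants: int = 16):
        self.world_seed = world_seed
        self.n_variants = n_variants
        self.store = {}  # (x, y) -> variant; the "connected game" memory

    def variant_for(self, x: int, y: int) -> int:
        if (x, y) in self.store:              # recall, never regenerate
            return self.store[(x, y)]
        key = f"{self.world_seed}:{x}:{y}".encode()
        digest = hashlib.sha256(key).digest()
        rng = random.Random(int.from_bytes(digest[:8], "big"))
        variant = rng.randrange(self.n_variants)  # constrained output space
        self.store[(x, y)] = variant
        return variant
```

The constraint (`n_variants`) is the point: the output space is bounded by design, so you know exactly what the generator can and cannot produce.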
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
I've seen prototypes of RPGs where you could freeform talk to NPCs and I pretty quickly lost enthusiasm for the idea after seeing it in action.
It didn't feel like a DnD game where you're maneuvering a social conflict with the DM or other players, it felt more like the social equivalent of jumping up on a table where an NPC couldn't get to you and stabbing them in the face.
-
Those poor players lol
-
This post did not contain any content.
What a fucking surprise.
-
Who fucking asked for that? WHO??!!
They’re trying to cut costs to pay for the loans they took out to make the acquisition.
-
Sorry, but procedural generation will never give you the same result as a well-tuned small LLM can.
Also, there's no "hoping": LLM context preservation and dynamic memory can easily be fine-tuned even on micro models.
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
Yes, it is trivial.
LLMs can already do tool calling, emotion-metadata output and so on. It would take minimal effort for a well-tuned model to also output things like facial expressions, body language, and hand and body movements.
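To make the tool-calling idea concrete, here is one way a game might constrain and validate model output: the model is asked to emit actions from a small JSON vocabulary, and anything outside the whitelist is simply dropped before it touches game state. The schema and action names are made up for this sketch.

```python
import json

# The whitelist of actions the game actually knows how to execute,
# with the fields each one requires. Anything else is rejected.
ACTION_SCHEMA = {
    "say":            {"text"},
    "set_expression": {"expression"},
    "give_item":      {"item", "target"},
    "add_quest":      {"quest_id"},
}

def validate_action(raw_json: str):
    """Parse one model-emitted action; return it only if the action name
    and fields match the schema, otherwise None (the game ignores it)."""
    try:
        action = json.loads(raw_json)
    except json.JSONDecodeError:
        return None
    if not isinstance(action, dict):
        return None
    required = ACTION_SCHEMA.get(action.get("action"))
    if required is None or not required <= set(action):
        return None
    return action

# A well-formed emission passes; a hallucinated action does not.
ok = validate_action('{"action": "give_item", "item": "horse", "target": "player"}')
bad = validate_action('{"action": "delete_save_file"}')
```

This is also where the "sorry, as an LLM I can't..." problem gets handled: a refusal parses as no valid action, so the game world never sees it.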
-
I agree that the results will be different, and certainly a very narrowly trained LLM for conversation could have some potential if it has proper guardrails. Either way, there's a lot of prep beforehand to make sure the boundaries are very clear. Which would work better is debatable and depends on the application. I've played around with plenty of fine-tuned models, and they will get off track contextually with enough data. LLMs and procedural generation have a lot in common, but the latter is far easier to manage for predictable outputs because of how the probability is used to create them.
-
No, that's exactly what this is about. They came right out and said as much. It won't work, but they'll cause a lot of damage in the process of failing.
Which is a different article about a (somewhat) unrelated topic.
Using AI for development is already out there, and you can't put that genie back in the bottle. As an engineer I'm already using it in my daily work for tons of tasks - I've built separate agents to do a number of things:
- read work tickets, collate resources, create a work plan, do the initial footwork (creating branches, moving tickets to the right states, creating Notion document with work plan and resources)
- read relevant changes in UI design documents and plan + execute changes (still needs some manual review, but e.g. with Android Jetpack Compose it does 90-95% of the needed work and requires minimal touch-up)
- do structural work - boilerplates, etc.
- write unit and integration tests (I'm already working out a UI test automation agent as well)
- do code reviews on changes, document them, and write appropriate commit messages
- do PR reviews - I still review them myself but an extra eye is always helpful
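The "initial footwork" item above is mostly string plumbing; a real agent would wrap calls to the tracker's API around helpers like these. Ticket format and naming conventions here are invented for illustration.

```python
import re

def branch_name(ticket_id: str, title: str) -> str:
    """Derive a git branch name from a ticket, the way a footwork agent
    might before running `git checkout -b`."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"feature/{ticket_id.lower()}-{slug[:40]}"

def commit_scaffold(ticket_id: str, summary: str) -> str:
    """A conventional-commits message template for the agent to fill in."""
    return f"feat({ticket_id}): {summary}\n\nRefs: {ticket_id}"

print(branch_name("PROJ-123", "Add dark mode to settings screen"))
# feature/proj-123-add-dark-mode-to-settings-screen
```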
Guess what: AI didn't replace me, it just allowed me to focus on actually thinking up solutions instead of doing hours of boilerplate stuff.
AI isn't the enemy in software development - companies who think they can replace engineers with AI are. Middle managers will sooner meet that fate, as they were mostly useless anyway.
-
Who fucking asked for that? WHO??!!
The Saudis?
-
This post did not contain any content.
they should concentrate on disney ones.
-
You are going to get downvoted, but you're right. AI doesn't need to be used for every part of the entire development process for a game to be "made with the help of AI". There are certain parts of the workflow that I'm sure are already being done regularly with AI - commenting code, for example.
Mindlessly feeding prompts into ChatGPT for the entirety of the core code or art would be terrible.
