EA partners with the company behind Stable Diffusion to make games with AI
-
This post did not contain any content.
Who fucking asked for that? WHO??!!
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
-
You have optimism. It could be used for good like you describe, but most likely will be used by EA to cut artists out, slash costs, make as much profit as possible and suffocate the company until the private equity owners move on to the next company.
-
wrote last edited by [email protected]
No, that's exactly what this is about. They came right out and said as much. It won't work, but they'll cause a lot of damage in the process of failing.
-
Procedural generation with appropriate constraints and a connected game that stores and recalls what's been created can do this far better than a repurposed LLM. It's hard work on the front end but you have a much better idea of what the output will be vs. hoping the LLM "understands" and remembers the context as it goes.
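A minimal sketch of the idea above (all names are illustrative, not from any real game): deriving world facts from a fixed seed means "remembering" what was generated is just re-deriving it deterministically, with a small store for anything the player actually changed.

```python
import random

WORLD_SEED = 1234
changed = {}  # persistent overrides recorded by the connected game

MOODS = ["friendly", "wary", "hostile"]

def npc_mood(npc_id):
    # Player-driven changes win over the generated default.
    if npc_id in changed:
        return changed[npc_id]
    # Same seed + same NPC id -> same result, every session, no model needed.
    rng = random.Random(f"{WORLD_SEED}:{npc_id}")
    return rng.choice(MOODS)

def record_change(npc_id, mood):
    changed[npc_id] = mood  # the game stores what happened, verbatim

first = npc_mood("blacksmith")
assert npc_mood("blacksmith") == first  # deterministic recall
record_change("blacksmith", "friendly")
```

The constraint work lives in the tables and rules (here, `MOODS`); the output space is known in advance, which is exactly the predictability an LLM can't guarantee.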
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
wrote last edited by [email protected]
I've seen prototypes of RPGs where you could freeform talk to NPCs and I pretty quickly lost enthusiasm for the idea after seeing it in action.
It didn't feel like a DnD game where you're maneuvering a social conflict with the DM or other players, it felt more like the social equivalent of jumping up on a table where an NPC couldn't get to you and stabbing them in the face.
-
wrote last edited by [email protected]
Those poor players lol
-
This post did not contain any content.
What a fucking surprise.
-
Who fucking asked for that? WHO??!!
They’re trying to cut costs to pay for the loans they took to make the acquisition.
-
Sorry but procedural generation will never give you the same result as a well tuned small LLM can.
Also there's no "hoping", LLM context preservation and dynamic memory can be easily fine-tuned even on micro models.
-
It would be interesting if NPCs could react believably to free form input from the player, but that seems unlikely to be reliable. I'm not sure how you'd map the text to in game actions.
Like, you could have the LLM respond with "ok I'll let you have my horse if you leave in the morning, and solve the zombie problem" but it's not trivial to make the game world respond to that. Not even counting it breaking out into "sorry as an LLM I can't ..."
Yes it is trivial.
LLMs can already do tool calling, emotion metadata output, and so on. It would take minimal effort for a well-tuned model to also output things like facial expressions, body language, and hand and body movements.
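A minimal sketch of the tool-calling pattern being described (no real LLM API is used here; the model output and action names are hypothetical): the game only executes structured calls it recognizes, so freeform text never directly drives game state.

```python
import json

# Hypothetical game state and action registry; names are illustrative.
GAME_STATE = {"npc_mood": "neutral", "player_has_horse": False}

def give_horse(state, args):
    state["player_has_horse"] = True

def set_expression(state, args):
    state["npc_mood"] = args["expression"]

ACTIONS = {
    "give_horse": give_horse,
    "set_expression": set_expression,
}

def dispatch(raw_model_output, state):
    """Validate the model's structured output and run only known actions."""
    for call in json.loads(raw_model_output):
        handler = ACTIONS.get(call.get("name"))
        if handler is None:
            continue  # unknown tool: ignore rather than break the game world
        handler(state, call.get("args", {}))
    return state

# A model constrained to emit JSON tool calls might produce:
output = '[{"name": "set_expression", "args": {"expression": "relieved"}},' \
         ' {"name": "give_horse"}]'
dispatch(output, GAME_STATE)
```

The hard part the thread is arguing about is what goes in `ACTIONS`: anything the model "promises" in prose that has no registered handler simply can't happen in the game world.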
-
wrote last edited by [email protected]
I agree that the results will be different, and a very narrowly trained conversation LLM could have some potential if it has proper guardrails. Either way there's a lot of prep beforehand to make sure the boundaries are very clear. Which works better is debatable and depends on the application. I've played around with plenty of fine-tuned models, and they will get off track contextually with enough data. LLMs and procedural generation have a lot in common, but it's far easier to get predictable outputs from the latter because of how probability is used to create them.
-
No, that's exactly what this is about. They came right out and said as much. It won't work, but they'll cause a lot of damage in the process of failing.
Which is a different article about a (somewhat) unrelated topic.
Using AI for development is already out there, and you can't put that genie back in the bottle. As an engineer I'm already using it in my daily work for tons of things - I've built separate agents to do a number of things:
- read work tickets, collate resources, create a work plan, do the initial footwork (creating branches, moving tickets to the right states, creating Notion document with work plan and resources)
- read relevant changes in UI design documents and plan + execute changes (still needs some manual review but e.g. with Android Jetpack Compose, it makes 90-95% of the needed work and requires minimal touch-up)
- do structural work - boilerplates, etc.
- write unit and integration tests, and already working out a UI test automation agent
- do code reviews on changes, document them, and write appropriate commit messages
- do PR reviews - I still review them myself but an extra eye is always helpful
Guess what: AI didn't replace me, it just allowed me to focus on actually thinking up solutions instead of doing hours of boilerplate stuff.
AI isn't the enemy in software development; companies who think they can replace engineers with AI are. Middle managers will sooner be on that list, as they were mostly useless anyway.
-
Who fucking asked for that? WHO??!!
The Saudis?
-
This post did not contain any content.
They should concentrate on Disney ones.
-
wrote last edited by [email protected]
You are going to get downvoted, but you're right. AI doesn't need to be used for every part of the entire development process for it to be "made with the help of AI". There are certain parts of the workflow that I'm sure are already being done regularly with AI, for example commenting code.
Mindlessly feeding prompts into chatgpt for the entirety of the core code or art would be terrible.
-
How do they reduce costs with AI if not by eliminating jobs?
-
15+
BF3 was when I made my vow to stop supporting them for releasing unfinished buggy ass games.
What are these ass games you are talking about? I'm willing to look past them being buggy.
-
I don't think it would be easy to map free form text to game behavior. Not just like "make the NPC smile" but complex behavior like "this NPC will now go to this location and take this action". That seems like it would be very error prone at best.
