While I'll never turn my nose up at being able to pinpoint arrow shots into the eyes of a prowling Watcher or throw an unloaded gun to shatter an approaching enemy, I'd like a fresher way to interact with a virtual world.

Now the PSVR 2 has eye tracking that can seemingly have NPCs in some games react to your gaze - this wasn't obvious in my testing, but it's early days - so what if that was combined with a form of AI-powered chatbot? You could have a game where NPCs use natural language processing and the reading of one's gaze to have unique conversations with the player. And with even the likes of Siri getting better at handling requests said in natural language, there's more scope for chatbots to parse what humans are actually saying.

Imagine, if you will, a VR Mass Effect game where you can wander a ship and talk to a variety of characters, as is the case in the regular Mass Effect games. Only, rather than select rough answers via a conversation wheel, you answer in your own way, with AI processing figuring out what you mean, and eye and gesture tracking used to interpret the relationship between your words, gaze and movements to better work out how you're saying something. In some ways, you could think of this as the near-ultimate conclusion to text-based adventures, only you're feeding an AI far more variables and natural language.

Rough interaction prompts could be used to angle the player into saying something the AI would be able to handle (say, a popup that notes Tali wants to talk about the Migrant Fleet), given that trying to process an entire language, its semantics, and the context of a game's story and world might be a bit much for even the most advanced AI tech of today and the near future. Safeguards would also be needed against people trying to push the limits of the AI: say, have Liara raise her eyebrows at a certain crude remark, or have Garrus go back to his calibrations if the player keeps asking him about his favorite sandwich or other such nonsense.

Alternatively, we could use the combination of AI, procedural generation and a headset's tracking tech to better customize and evolve a person's experience within a game or virtual environment. For example, a route around an area could be adapted to how fast the player is moving or what they are looking at. That could help make such dives into VR feel unique and fresh rather than a mere evolution of what's come before.

Again, I'm aware this could be a wildly ambitious use of ChatGPT and OpenAI-based tech. I understand the basics of a neural network and how AI systems can learn, but I'm no expert, and I can totally understand how these ideas might be an engineering nightmare.
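For the curious, the gaze-plus-chatbot idea could be sketched in a few lines. This is a purely hypothetical toy, not how any shipping game works: simple keyword matching stands in for a real language model, and a string stands in for real eye-tracking data; every NPC name, topic, and line here is made up for illustration.

```python
# Toy sketch: route the player's spoken line to the NPC they are looking at,
# with keyword matching standing in for a real NLP model.

# (gaze target, topic keyword) -> canned reply; a real system would generate these.
RESPONSES = {
    ("garrus", "calibrations"): "Can it wait? I'm in the middle of some calibrations.",
    ("tali", "migrant fleet"): "The Flotilla is my home. What do you want to know?",
}

# Safeguard from the article: off-topic nonsense gets a deflection, not a crash.
FALLBACK = "I'm sorry, I don't follow."

def npc_reply(gaze_target: str, utterance: str) -> str:
    """Pick a reply for the NPC the player's gaze is resting on."""
    text = utterance.lower()
    for (npc, topic), reply in RESPONSES.items():
        if npc == gaze_target and topic in text:
            return reply
    return FALLBACK
```

The point of the sketch is the routing: the headset's gaze data narrows *who* is being addressed, so the language model only has to work out *what* is being said.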