Using game engines to model human behavior

I've been reading some posts about using commercial gaming engines for modeling human behavior.  Here are my thoughts:     

Several of the commercial game engines provide an API with an event model and expose it through a programming language - e.g. the Quake Engine and C++.  In this mode, one could create a world with actors that model human behavior in some way - that is certainly what the "AI" programmers for popular games hope to accomplish.

In my observation, though, the event API focuses on the physical world and not on motivations.  So, for example, the API makes it fairly easy to keep track of where an actor is, whether it has collided with some other object, etc.  You don't get much of a toolkit beyond this - i.e. not much for modeling the actor's personality, motivations, and so on.
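To make that concrete, here is a minimal C++ sketch of the kind of callbacks such an event API tends to expose.  The names here (ActorEvents, OnMove, OnCollision, MyActor) are made up for illustration - they are not from the Quake SDK or any other real engine - but they show the pattern: the engine reports physical facts, and anything about personality or motivation has to be layered on top by the developer.

    #include <iostream>
    #include <string>

    struct Vec3 { float x, y, z; };

    // What a typical engine API gives you: notifications about physical events.
    // (Hypothetical interface, for illustration only.)
    class ActorEvents {
    public:
        virtual ~ActorEvents() = default;
        virtual void OnMove(const Vec3& newPosition) = 0;
        virtual void OnCollision(const std::string& otherActor) = 0;
    };

    // What it doesn't give you: the engine never asks *why* the actor moved,
    // so personality and motivation live entirely in code like this.
    class MyActor : public ActorEvents {
    public:
        void OnMove(const Vec3& p) override {
            std::cout << "moved to " << p.x << "," << p.y << "," << p.z << "\n";
        }
        void OnCollision(const std::string& other) override {
            std::cout << "collided with " << other << "\n";
        }
    };

    int main() {
        MyActor a;
        a.OnMove({1.0f, 0.0f, 2.0f});  // engine-style physical event
        a.OnCollision("door_03");      // engine-style physical event
    }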

Also, "AI" work that drives how an actor moves thru the world, when would the actor perform various behaviors, etc....this is the proprietary work of the various development companies and are closely guarded secrets. It is something they add to the engine.  You don't get this when you use/buy the engine.  Re-licensing this material (if available) probably would cost $50,000 to $500,000 (rough guess).

As mentioned in an earlier post, some of the latest level editors for game engines (e.g. Doom/Quake) let you set some actor behaviors with a tool rather than code, including the ability to set properties on an actor that affect its behavior.  From what I have seen so far though, these are pretty basic.  For example, you can set in one actor an affinity (or dis-affinity) for another actor in the world, and this controls how enthusiastically the first actor follows (or runs away from) the second actor.
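Here is a rough C++ sketch of how an editor-set affinity value could drive that follow/run-away movement.  The property name and the steering math are my own guess at how such a feature might work underneath, not something taken from any particular engine or level editor.

    #include <cmath>
    #include <iostream>

    struct Vec2 { float x, y; };

    // affinity in [-1, 1]: positive means follow, negative means run away,
    // and the magnitude controls how enthusiastically the actor does it.
    // (Illustrative steering math, not from any real editor.)
    Vec2 SteerToward(const Vec2& self, const Vec2& target,
                     float affinity, float maxSpeed) {
        Vec2 d{target.x - self.x, target.y - self.y};
        float len = std::sqrt(d.x * d.x + d.y * d.y);
        if (len < 1e-6f) return {0.0f, 0.0f};  // already at the target
        float speed = maxSpeed * affinity;     // the sign flips follow into flee
        return {d.x / len * speed, d.y / len * speed};
    }

    int main() {
        Vec2 me{0.0f, 0.0f}, them{10.0f, 0.0f};
        Vec2 follow = SteerToward(me, them, 0.8f, 5.0f);   // eager follower
        Vec2 flee   = SteerToward(me, them, -0.3f, 5.0f);  // mild avoidance
        std::cout << "follow velocity: " << follow.x << "," << follow.y << "\n";
        std::cout << "flee velocity:   " << flee.x << "," << flee.y << "\n";
    }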
       
