Post by alexvostrov on Nov 5, 2016 15:44:45 GMT -8
I was thinking back to the way Storytron worked and what I could learn from that. I remember the design having some good ideas.
Of course, Storytron itself is in hibernation, and alas my copy of SWAT died 2 computers ago. Luckily, I still have my copy of Chris's book on IS.
How current and complete is the book? After all the Siboot work, do you still consider it to be relevant?
Post by chriscrawford on Nov 5, 2016 15:56:43 GMT -8
Good question. The short version is that there hasn't been any real progress since that book was published so many years ago. The academics have continued to push at it, but they are nibbling, not biting. I've seen only a few academics who have stuck to it. Most work on it for a few years, then give up when they realize that they just aren't making much headway. Usually they divert to something related to interactive storytelling. We've also seen plenty of interesting experiments in the Indie World, as you have pointed out, but none of them can be said to be true-blue interactive storytelling; instead, they're games with interesting dramatic elements.
Post by alexvostrov on Nov 5, 2016 17:37:54 GMT -8
I wonder how much of the obstacle is lack of clarity as to what the problem actually is. For example, the people who made Prom Week had good intentions, but completely missed the mark.
For me the root is intentionality. Between the ages of 3 and 4, kids start to grasp human minds reasonably well, once they gain an understanding of how beliefs affect behaviour. By comparison, other mammals never really get there (chimps have, at best, a basic grasp of it). Whatever switch is flipped at that age, that's what we want to target. I've been reading the literature on autism to figure out what the basic shape of that ability is. Roughly, I see it as the capacity to predict people's actions based on inferred desires and beliefs.
So, here's a simple test: how confused would someone without a theory of mind be by your work? Prom Week, alas, is 100% mechanical and doesn't require a theory of mind at all.
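To make that concrete, here is a minimal Sally-Anne-style sketch of what "predict actions from inferred desires and beliefs" could look like in code. Everything in it (the names, the toy world, the functions) is invented for illustration; it isn't taken from Prom Week or any real system.

```python
from dataclasses import dataclass

# Sally-Anne-style false-belief test: a theory-of-mind predictor tracks what the
# agent BELIEVES and WANTS, not what is actually true, and predicts behaviour from that.

@dataclass
class Mind:
    desires: dict   # what the agent wants, e.g. {"marble": "have"}
    beliefs: dict   # what the agent thinks is true -- possibly false

def predict_action(mind: Mind, item: str) -> str:
    """Predict behaviour from inferred desire + belief, not from the true world state."""
    if mind.desires.get(item) == "have":
        return "search the " + mind.beliefs[item]
    return "do nothing"

# Reality: the marble was moved from the basket to the box while Sally was out of the room.
reality = {"marble": "box"}

# Sally's mental state is stale -- she still believes the marble is in the basket.
sally = Mind(desires={"marble": "have"}, beliefs={"marble": "basket"})

# A purely mechanical predictor reads off reality and gets Sally wrong;
# a theory-of-mind predictor reads off Sally's (false) belief and gets her right.
print("mechanical prediction:     search the", reality["marble"])      # search the box
print("theory-of-mind prediction:", predict_action(sally, "marble"))   # search the basket
```

A system that never needs this distinction between what is true and what a character believes and wants is, in that sense, purely mechanical.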
Do you have a quick litmus test that you use for potential solutions, Chris?
Post by chriscrawford on Nov 5, 2016 18:48:43 GMT -8
No, I've never thought of the problems in that fashion. For me, a fundamental realization came fairly early. Imagine creating a large method that determines a character's emotional reaction to any given event. You send the event as input to the method, and it returns a set of emotional reactions to it. It does NOT include behavioral reactions -- just emotional reactions -- how angry/frightened they become, how happy/sad, and so forth.
Now just add a level of recursion and you have social intelligence. That is, allow Character A to mentally plug Character B into his own emotional reaction method. This allows A to assess B's likely reaction to A's own action.
Pretty simple. Sometime I'll tell you the embarrassing mistake I made that triggered that realization.
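A toy Python sketch of that shape -- a single reaction method that returns emotions only, plus one level of recursion so that A can run B through the same method. The names, numbers, and crude arithmetic are invented for illustration; this is not the Storytron code.

```python
from dataclasses import dataclass

@dataclass
class Event:
    actor: str    # who did it
    deed: str     # e.g. "gift" or "insult"
    target: str   # who it was done to

@dataclass
class Character:
    name: str
    affections: dict   # name -> how much this character likes that person, -1.0 .. 1.0

def emotional_reaction(viewer: Character, event: Event) -> dict:
    """Emotions only, no behaviour: how much joy or anger the viewer feels about the event."""
    pleasant = 1.0 if event.deed == "gift" else -1.0
    involves_me = 1.0 if event.target == viewer.name else 0.3
    feeling_for_actor = viewer.affections.get(event.actor, 0.0)
    return {
        "joy":   max(0.0,  pleasant) * involves_me,
        "anger": max(0.0, -pleasant) * involves_me * (1.0 - feeling_for_actor),
    }

def anticipate(model_of_other: Character, planned: Event) -> dict:
    """The recursion: A plugs (its model of) B into the very same reaction method."""
    return emotional_reaction(model_of_other, planned)

tom = Character("Tom", affections={"Ann": 0.5})
ann = Character("Ann", affections={"Tom": 0.2})
plan = Event(actor="Tom", deed="insult", target="Ann")

print(anticipate(ann, plan))   # Tom's estimate of Ann's reaction: mostly anger, no joy
```

In a fuller system A would plug in its own, possibly inaccurate, model of B rather than B's true state; reusing one shared reaction method is what keeps the recursion cheap.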
Post by alexvostrov on Nov 5, 2016 19:20:41 GMT -8
"Now just add a level of recursion and you have social intelligence. That is, allow Character A to mentally plug Character B into his own emotional reaction method. This allows A to assess B's likely reaction to A's own action."

Yes, this is really magic. I've had something like that in my own prototypes. The AI would run its own brain as a sub-routine to simulate the likely reactions of other people to events. That's a basic description of empathy.

I actually think that this idea is way bigger than just games. Doesn't it drive you crazy that software never knows what you're trying to do? I think that the future of applications is to infer the user's goals. That'll be hard to do for Word, but easier for us because we can control how complicated the game world is.

How is your realization reflected in Storytron? If I recall correctly, event witnesses take roles and those roles dictate emotional response. Did you ever implement empathy for actors when deciding what to do?
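For what it's worth, a minimal sketch of what "empathy when deciding what to do" could look like: score each candidate deed by simulating the other character's reaction to it, then pick the best-liked option. The scoring function and numbers are invented; this is not how Storytron actually does it, just the shape of the loop.

```python
def predicted_reaction(their_affection_for_me: float, deed: str) -> float:
    """Stand-in for the emotional-reaction routine: positive = pleased, negative = upset."""
    base = {"compliment": 0.6, "ignore": -0.1, "insult": -0.8}
    return base[deed] + 0.2 * their_affection_for_me

def choose_deed(candidates: list, their_affection_for_me: float) -> str:
    # "Run my own brain as a subroutine": simulate the other character's reaction per option.
    return max(candidates, key=lambda deed: predicted_reaction(their_affection_for_me, deed))

print(choose_deed(["compliment", "ignore", "insult"], their_affection_for_me=0.3))
# -> compliment
```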
Post by chriscrawford on Nov 5, 2016 20:15:24 GMT -8
There's an implementation of it somewhere in the engine, and it is accessible to the script writer, but I can't recall the details now.