Game Engines 2021W Lecture 15

Lecture 15
----------
Interactive storytelling

different from narrative in RPGs and interactive fiction
 - both are considered games
 - interactive storytelling isn't a game

focus is on storytelling

key idea: open-ended storytelling

Chris Crawford is a big proponent

He made some very influential early games
 - I knew his Balance of Power game, a cold war simulator

Got tired of making games, because he realized he
really wanted to make stories
 - interactive stories

Key difference: open-ended storytelling!

Narrative in games is almost always fixed
 - limited branches of storytelling
 - all explicitly created by the developer
 - tree of branching possibilities
 - more options lead to exponential growth in the size of the tree
   - but exponentially fewer players will see each possibility!
     (rough numbers below)
 - many if not most games have very few endings,
   and the myriad choices in them converge to the same
   endings
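
A rough sense of that growth: with 3 choices at each of 10
decision points, a fully branching story tree already has
3^10 = 59,049 distinct paths, every one of them hand-authored;
and if players choose roughly uniformly, any given path shows
up in only about 1 in 59,049 playthroughs.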


But there are exceptions with emergent storytelling,
e.g., Dwarf Fortress
 - Minecraft started as a simplified Dwarf Fortress

games based on complex simulations can lead to complex emergent narratives
 - but the narrative is secondary to other mechanics

But what if narrative was the focus?

Façade
 - not really a game
 - instead, it simulates a small cocktail party
   - you interact with a fighting couple via text
 - so, so many possibilities
   - real effort to simulate what it would be like
     to interact with a real couple
 - but like so many of these sorts of efforts,
   it can be frustrating because we can imagine
   things to say that the system can't make sense of

We discussed text adventure games driven by deep learning

What would a game engine need to have to support these
kinds of "stories"?
 - i.e., drama simulators

key idea of interactive storytelling is the developer
just creates the characters and setting
 - the "player"'s choices then drive the narrative

Only achievable if we abandon story trees
 - instead, you simulate
 - just like we do with physics in an open world game
 - "relationship physics"

It all comes down to the size of the input space, and
how input is interpreted
 - compare the possibilities of a mouse movement
   to freeform text or voice input

If you allow unbounded interactivity, it is nearly impossible to account for all possibilities
 - so we need to limit it, but in creative ways

Creating a sandbox with lots of possibilities, but not too many possibilities
 - so really like classic game design

Separating games from storytelling doesn't make sense to me
 - I think about sports, and how we watch it
 - we naturally create narratives with characters
   - even through a limited perspective, we have a
     theory of mind about the participants

So, to get interactive storytelling, we need systems
where people will develop theories of mind about the NPCs,
even when their actions are limited by the rules of the game

Think of an online multiplayer game where you felt
bad for an opponent

Have you ever felt bad for an AI-driven opponent?

I think something very simple is missing from the AI-driven opponents we play in games
 - no notion of cooperation

Emergent stories can happen when there is a real possibility of cooperation
 - drama comes from betrayal, which can only happen
   if there was cooperation before

We want people to have an emotional reaction to games
 - complex ones require relationships
 - relationships are based on cooperation

But games are more often based on competition

I've thought a lot about cooperation in other contexts
 - I do research in computer security
 - background in studying biology & evolution
 - my PhD advisor was a pioneer in genetic algorithms

I think we got evolution wrong
 - "survival of the fittest" is true but mostly meaningless as a design principle, more of a tautology
 - "survival of those who cooperate" is a better take I think

cooperation is the basis of complexity in all systems

What is cooperation in software systems?
 - composition, e.g., using libraries, services, etc

When you assemble software from pieces (and use online APIs), you *assume* that the other code will work with you and not try to take advantage of you (mostly)
 - you can only use a library if you expect it to "cooperate" with your code

cooperation is the basis of computing, but we forget this

computer security exploits are the breakdown of cooperation
 - and it is devastating and commonplace because our systems assume cooperation

cooperation is a design blindspot

There do exist games where NPCs can cooperate with you
 - but they are almost always outnumbered by
   the NPCs who will try to kill you
 - you almost never convert enemy NPCs to your side,
   except through a "weapon" mechanic

Note how "drama" works in more adult-oriented media
 - constant mix of cooperation and competition
 - no pure good or evil, which really means no
   pure competition or cooperation
 - the "sides" are fluid

The "sides" in computer games, though, are almost always very clear

Consider Among Us
 - that works because people play it
 - but could AIs play with people in a meaningful way?
   - would have to limit style of communication
   - but could still allow for many choices
   - would need to allow for recognition of common
     cause

Challenges
 - detecting deception

The key thing a game engine would need is a way to help NPCs have a (simple) theory of mind
 - at minimum, classify friends vs. foes based on past actions
 - but should support more

So what does the NPC need? (see the sketch after this list)
 - memory of past actions
 - "personality", tweakable tendencies to do certain actions
   - weights on decision tree?
 - ways to connect memory to personality
   - variable response depending on who else they are interacting with
   - based on history of interactions
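
A sketch of how those ingredients could fit together, again in
Python; the personality traits, action names, and weighting
rule are assumptions for illustration only:

 import random
 from collections import defaultdict

 class NPC:
     def __init__(self, name, aggression=0.5, forgiveness=0.5):
         self.name = name
         # "Personality": tweakable tendencies toward certain actions.
         self.aggression = aggression
         self.forgiveness = forgiveness
         # Memory: full history of what each actor has done to this NPC.
         self.memory = defaultdict(list)

     def remember(self, actor, action):
         self.memory[actor].append(action)

     def choose_response(self, actor):
         history = self.memory[actor]
         hostile = sum(1 for a in history if a in ("attack", "insult"))
         friendly = sum(1 for a in history if a in ("help", "gift"))
         # Connect memory to personality: weight each option by both the
         # history with this particular actor and the NPC's own tendencies.
         weights = {
             "attack":    self.aggression * (1 + hostile),
             "avoid":     (1 - self.aggression) * (1 + hostile),
             "cooperate": self.forgiveness * (1 + friendly),
         }
         options = list(weights)
         return random.choices(options, [weights[o] for o in options])[0]

 guard = NPC("guard", aggression=0.8, forgiveness=0.2)
 guard.remember("player", "insult")
 guard.remember("player", "attack")
 print(guard.choose_response("player"))   # usually "attack"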

Imagine a sprite that remembered all its past collisions
 - think of the prisoner's dilemma (sketched below)
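
The same idea as an iterated prisoner's dilemma between
sprites, where each collision is a round: cooperate or defect,
remember what the other sprite did last time, and answer
tit-for-tat. The payoff matrix is the standard textbook one;
the rest is illustrative:

 # (my move, their move) -> my payoff; standard prisoner's dilemma values
 PAYOFF = {("C", "C"): 3, ("C", "D"): 0,
           ("D", "C"): 5, ("D", "D"): 1}

 class Sprite:
     def __init__(self, name):
         self.name = name
         self.last_seen = {}    # other sprite's name -> their last move
         self.score = 0

     def move_against(self, other):
         # Tit-for-tat: cooperate first, then mirror their last move.
         return self.last_seen.get(other.name, "C")

     def collide(self, other):
         mine, theirs = self.move_against(other), other.move_against(self)
         self.score += PAYOFF[(mine, theirs)]
         other.score += PAYOFF[(theirs, mine)]
         self.last_seen[other.name] = theirs
         other.last_seen[self.name] = mine

 a, b = Sprite("a"), Sprite("b")
 for _ in range(10):
     a.collide(b)
 print(a.score, b.score)    # sustained cooperation: 30 30

Two tit-for-tat sprites settle into sustained cooperation; a
sprite that always defects wins the first round but then gets
stuck with the low mutual-defection payoff.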

Instinct is typically "built-in" rather than based on experience,
but it should probably play a role too

Imagine a civ game with this sort of system
 - you could conquer a village, but then you'd face resistance
 - instead you could sign a treaty, but this would
   limit your options
     - may not want to fight so much

cooperation as a game mechanic can be very simple
 - but it can lead to complex emergent narratives,
   even feelings
 - simple mechanisms can have very rich emotional impact

cooperation is all about autonomy
 - either party can always withdraw consent

Instead of developing complex narrative decision trees,
why not make characters with simple models of cooperation and conflict, and see what behavior emerges?
 - in the context of game rules?

Imagine playing chess with an AI that is "friendly"
 - gives you hints when you make a mistake
 - compliments you on good moves (see the sketch below)
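
One way to sketch that "friendly" layer in Python: wrap
whatever evaluation the chess AI already has and turn the gap
between the player's move and the engine's preferred move into
encouragement or a hint. Here evaluate() and position.after()
are hypothetical stand-ins, and the thresholds are arbitrary:

 def evaluate(position) -> float:
     """Hypothetical stand-in for the engine's own position evaluator
     (higher is better for the player)."""
     raise NotImplementedError

 def friendly_commentary(position, player_move, best_move) -> str:
     # position.after(move) is assumed to return the resulting position.
     played = evaluate(position.after(player_move))
     best = evaluate(position.after(best_move))
     loss = best - played
     if loss < 0.2:
         return "Nice move!"
     if loss < 1.0:
         return "Not bad, but there was something a little stronger."
     return f"Careful, that loses ground; consider {best_move} instead."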

A classic board game of cooperation & competition is Diplomacy