» Fri May 27, 2011 6:53 pm
Dragons are pretty cliche and overused. And in TES lore, they are rare and sacred and extremely well-hidden (read as: they do not wish to be found and therefore aren't). Not to mention that a dragon is not something that a single hero can take down. The number of individuals that could remotely bring down a dragon would cause Gamebryo to shed salty tears trying to render them on the screen.
Plus, dragons are too obvious a grab for "epic," which is actually a detractor in my book.
-----
Dredging up an old post of mine from several months ago, here's what I envision regarding actually putting the "Radiant" in Radiant AI, and an AI that factors in fuzzy logic:
Well, the first step towards a more coherent AI system is to give guards the ability to take NPCs to jail. That way, petty theft doesn't get punished by death, and punishment by death doesn't lead to factional revolt against impossible odds.
The second is to incorporate some form of desire-versus-consequences scale for actions, with numeric thresholds determined by AI personality settings to decide when certain actions are taken and when they aren't. For example: City-Swimmer gets caught stealing bread. The nearby patrolling legionnaire attempts to take her in. City-Swimmer numerically evaluates her chances of survival if she resists arrest (guard's skills and attributes versus her own), with the accuracy of that evaluation determined by some skill/attribute-based perception algorithm. Then, City-Swimmer's evaluation is compared to her aggression rating. If her aggression is high enough to offset the chances of survival (assuming those chances were evaluated as low), then City-Swimmer could resist arrest and attack. The legionnaire would then pummel City-Swimmer. However, this doesn't have to mean death: the further City-Swimmer's health percentage drops, the closer it comes to overriding her high aggression setting, and when it gets low enough, City-Swimmer will yield. The legionnaire then evaluates whether or not to accept that yield based on his responsibility setting, which, for a guard, would be pretty high (meaning he'd let her live and cart her off to jail).

That's the huge gaping problem with our current AI setup: it's all or nothing, with no room for in-between states or dynamic decision-making based on constantly changing outside factors. But the outside factors aren't too terribly difficult to add in.
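To make the idea concrete, here's a rough sketch of that resist-or-yield evaluation in Python. Every function name, numeric scale, and threshold here is my own invention for illustration; nothing below is actual engine code.

```python
# Hypothetical sketch of the resist-arrest decision described above.
# All names and numeric scales are assumptions, not real TES engine values.

def perceived_survival_chance(own_skill, guard_skill, perception_accuracy):
    """Estimate the chance of surviving a fight, blurred by perception skill.

    perception_accuracy in [0, 1]: 1.0 means a perfect read of the guard.
    """
    true_chance = own_skill / (own_skill + guard_skill)
    # A poor perception skill pulls the estimate toward an uninformed 0.5.
    return perception_accuracy * true_chance + (1 - perception_accuracy) * 0.5

def resist_arrest(aggression, survival_chance, health_fraction):
    """Resist while aggression (scaled by remaining health) outweighs the risk.

    aggression and health_fraction in [0, 1]. As health drops, effective
    aggression drops with it, eventually forcing a yield.
    """
    effective_aggression = aggression * health_fraction
    return effective_aggression + survival_chance > 1.0

# City-Swimmer: modest skills vs. a tough legionnaire, but high aggression.
chance = perceived_survival_chance(own_skill=30, guard_skill=70,
                                   perception_accuracy=0.6)
print(resist_arrest(aggression=0.9, survival_chance=chance,
                    health_fraction=1.0))  # True: resists at full health
print(resist_arrest(aggression=0.9, survival_chance=chance,
                    health_fraction=0.2))  # False: yields when badly hurt
```

The key point is that the same aggression setting produces different behavior as outside factors (here, remaining health) change, which is exactly the dynamic in-between decision-making the current all-or-nothing setup lacks.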
This would apply to a multitude of situations, as well. For example, Bethesda cited the residents of Bravil's skooma den as a good example of why they toned down RAI. The skooma addicts were given a desire for skooma, and there is a skooma merchant just across the way. However, the addicts have no money for skooma, yet their desire is their sole focus, so they would promptly go kill the skooma merchant to satisfy it. All or nothing; fulfill at all costs. With the system above, the skooma addicts would evaluate the consequences for murder (perhaps via developers pre-assigning specific actions numeric values indicating the severity of their consequences). The numeric strength of their desire for skooma would then be weighed against that consequence evaluation, as well as their responsibility settings. If their desire wasn't strong enough to outweigh the consequences, they'd refrain from murder yet still retain the desire for skooma.
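That desire-versus-consequence check is just a threshold comparison. A toy version, with made-up severity values and a made-up `will_attempt` helper:

```python
# Hypothetical sketch of the desire-vs-consequence check for the skooma
# addicts. Severity values and thresholds are invented for illustration.

CONSEQUENCE_SEVERITY = {
    "murder": 1.0,   # developer-assigned severity, 0..1
    "theft": 0.4,
    "buy": 0.0,
}

def will_attempt(action, desire, responsibility):
    """Act only if desire outweighs consequence severity plus conscience.

    desire and responsibility in [0, 1]. High responsibility raises the bar;
    if desire falls short, the NPC refrains but keeps the desire.
    """
    cost = CONSEQUENCE_SEVERITY[action] + responsibility
    return desire > cost

# An addict with a strong craving but some responsibility won't murder...
print(will_attempt("murder", desire=0.8, responsibility=0.3))  # False
# ...but might resort to theft, and would certainly just buy if able.
print(will_attempt("theft", desire=0.8, responsibility=0.3))   # True
```

Note the failure mode is graceful: when the check fails, nothing happens except the desire persists, instead of the "fulfill at all costs" behavior Bethesda ran into.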
That's the basic form. If we added in the possibility of desire increasing the longer it goes unfulfilled, things would get more interesting. Not to say that all skooma addicts would reach the point where murder would be justified for them, but some would (dependent upon AI personality settings). Further, if we added other options to achieve the goal of skooma, that would increase the interesting factor. Skooma addicts need money to fulfill cravings? Then allow NPCs to find scripted freelance work around town, odd jobs and whatnot. Whether those jobs are the more traditional and legal sort, or the slightly questionable sort, depends on the NPC and the AI conditions for hiring. The job itself, visually, wouldn't be more complex than two or three lines of dialogue and a few "go here" AI directions. Everything else would be happening behind the scenes. And with their newfound money, the skooma addicts could feed their habit and sink themselves back into poverty, thus necessitating a repeat of the process.
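The whole loop (desire rising while unfulfilled, odd jobs as a legal outlet, poverty after each purchase) can be sketched as a tiny simulation. Growth rates, wages, and prices here are all invented numbers, and `simulate_addict` is my own hypothetical helper:

```python
# Hypothetical sketch of desire rising while unfulfilled, with scripted odd
# jobs as a legal outlet. All numeric values are invented for illustration.

def simulate_addict(desire, growth, wage, skooma_price, responsibility, hours):
    """Each hour: buy skooma when affordable, work if the craving is strong
    enough to act on, otherwise idle. Unfulfilled desire keeps rising."""
    gold = 0
    log = []
    for _ in range(hours):
        if gold >= skooma_price:
            gold -= skooma_price            # feed the habit...
            desire = 0.1                    # ...and sink back into poverty
            log.append("buy")
        elif desire > responsibility:       # craving strong enough to act on
            gold += wage                    # scripted odd job around town
            log.append("work")
        else:
            log.append("idle")
        desire = min(1.0, desire + growth)  # unfulfilled desire keeps rising
    return log

print(simulate_addict(desire=0.2, growth=0.15, wage=5, skooma_price=10,
                      responsibility=0.3, hours=6))
# ['idle', 'work', 'work', 'buy', 'idle', 'work']
```

Visually this is nothing more than the NPC walking between the job-giver and the merchant, but behind the scenes the cycle regenerates itself, which is the "necessitating a repeat of the process" part.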