But it's still annoying if you didn't know an NPC was essential and you killed them to get into the DB or something. I think once you attack an essential NPC, there should be a message telling you they are essential. That way, if you care, you can stop, and if you don't give two s**** then you murder the guy anyway!
If you were concerned about cutting off potential quests, then you shouldn't have been killing in the first place, should you? It should be part of the event of murder to consider the potential implications of that murder. And if consequences arise out of it, then so be it. Does a murderer in real life get a quest popup to notify them that their intended victim might be of use to them later on in life?
I'm not sure how they could do this, but if they could, then fine by me. The problem (more so in Oblivion, because of the more complicated AI) is that NPCs are so stupid that a single accidental blow would probably result in an all-out war, with half the local population dying. For example: City-Swimmer is in the Thieves Guild; City-Swimmer gets caught and is attacked by guards; many Thieves Guild members are nearby and help City-Swimmer by fighting alongside her; nearly all the Thieves Guild members and three guards die. That's without counting the possibility of a random NPC being nearby and taking an accidental hit, resulting in more deaths (multiplied if that NPC is part of a large faction, e.g. the Fighters Guild). What really irked me about these deaths is that sometimes you can be on the other side of the map when they happen. That's why I actually turn some NPCs essential using a mod, so that none of my future quests get interrupted. However, if they can improve NPC AI so that they do as you said, that would be just great.
Well, the first step towards a more coherent AI system is to give guards the ability to take NPCs to jail. That way, petty theft doesn't get punished by death, and punishment by death doesn't lead to factional revolt against impossible odds.
The second is to incorporate some form of desire-versus-consequences scale for actions, with numeric thresholds determined by AI personality settings deciding when certain actions are taken and when they aren't. For example, City-Swimmer gets caught stealing bread. The nearby patrolling legionnaire attempts to take her in. City-Swimmer numerically evaluates her chances of surviving if she resists arrest (the guard's skills and attributes versus her own), with the accuracy of that evaluation determined by some skill/attribute-based perception algorithm. That evaluation is then compared to City-Swimmer's aggression rating. If her aggression is high enough to offset the chances of survival (assuming those chances were evaluated as low), City-Swimmer could resist arrest and attack. The legionnaire would then pummel City-Swimmer. However, this doesn't have to mean death. The further City-Swimmer's health drops, the closer that falling percentage comes to overriding her high aggression setting, and when it gets low enough, City-Swimmer will yield. The legionnaire will then evaluate whether or not to accept that yield based on his responsibility setting, which, for a guard, would be pretty high (meaning he'd let her live and cart her off to jail).

That's the huge gaping problem with our current AI setup: it's all or nothing, with no room for in-between or for dynamic decision-making based on constantly changing outside factors. But those outside factors aren't too terribly difficult to add in.
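To make the idea concrete, here's a minimal sketch of that resist/yield weighing in Python. Every function name, threshold, and number here is invented purely for illustration; it's just the shape of the logic, not an actual game implementation:

```python
import random

def estimate_survival(own_power: float, guard_power: float,
                      perception: float) -> float:
    """Estimate survival odds; low perception (0-1) blurs the true odds."""
    true_odds = own_power / (own_power + guard_power)
    noise = (1.0 - perception) * random.uniform(-0.2, 0.2)
    return min(max(true_odds + noise, 0.0), 1.0)

def resists_arrest(survival: float, aggression: float) -> bool:
    """Resist only if aggression is high enough to offset poor odds."""
    return survival + aggression > 1.0

def yields(health_fraction: float, aggression: float) -> bool:
    """Falling health gradually overrides a high aggression setting."""
    return health_fraction < 1.0 - aggression
```

So a City-Swimmer with aggression 0.9 would resist even at 20% estimated survival, but once her health drops below the point where aggression can no longer override it, she yields instead of fighting to the death.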
This would apply to a multitude of situations as well. For example, Bethesda cited the residents of Bravil's skooma den as a good example of why they toned down RAI. The skooma addicts were given a desire for skooma, and there is a skooma merchant just across the way. However, the addicts have no money for skooma, yet their desire is their sole focus, so they would promptly kill the skooma merchant to satisfy their given desire. All or nothing; fulfill at all costs. With the system above, the skooma addicts would evaluate the consequences of murder (perhaps via developers pre-assigning specific actions and parameters numeric values indicating the severity of the consequences). Then, the numeric strength of their desire for skooma would be weighed against that numeric evaluation of consequence, as well as their responsibility settings. If their desire wasn't strong enough to outweigh the consequences, they'd refrain from murder yet still retain the desire for skooma.
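That weighing could be sketched as a single comparison. Again, the severity table and thresholds below are made-up values just to show the mechanism:

```python
# Developer-assigned severity of consequences for each action (invented values).
CONSEQUENCE_SEVERITY = {
    "steal": 0.4,
    "murder": 0.9,
}

def will_commit(action: str, desire: float, responsibility: float) -> bool:
    """Act only if desire outweighs consequence severity plus responsibility."""
    return desire > CONSEQUENCE_SEVERITY[action] + responsibility
```

A skooma addict with a strong desire of 0.7 and a low responsibility of 0.3 still refrains from murder (0.7 isn't greater than 0.9 + 0.3), yet keeps the desire, and might stoop to theft instead if the desire climbed a little higher.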
That's the basic form. If we added in the possibility of desire increasing the longer it goes unfulfilled, things would get more interesting. Not to say that all skooma addicts would reach the point of desire where murder would be justified for them, but some would (dependent upon AI personality settings). Further, if we added other options for achieving the goal of skooma, that would increase the interesting factor. Skooma addicts need money to fulfill cravings? Then allow NPCs to find scripted freelance work around town, odd jobs and whatnot. Whether those jobs are the more traditional and legal sort, or the slightly questionable sort, depends on the NPC and the AI conditions for hiring. The job itself, visually, wouldn't be more complex than two or three lines of dialogue and a few "go here" AI directions; everything else would happen behind the scenes. And with their newfound money, the skooma addicts could feed their habit and sink themselves back into poverty, thus necessitating a repeat of the whole process.
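Putting those two extensions together (growing desire, plus legitimate ways to fulfill it), the loop might look like this. Growth rate, gold cost, and the action thresholds are all invented for illustration:

```python
def tick_desire(desire: float, growth_rate: float = 0.05) -> float:
    """Each game hour, an unfulfilled desire creeps upward, capped at 1.0."""
    return min(desire + growth_rate, 1.0)

def choose_action(desire: float, gold: int, aggression: float) -> str:
    """Pick the least drastic option that can fulfill the desire."""
    if gold >= 20:                         # can simply buy the skooma
        return "buy"
    if desire > 1.0 - aggression * 0.5:    # only very aggressive addicts snap
        return "rob"
    if desire > 0.5:                       # moderate desire: go earn money
        return "work"
    return "idle"
```

Most addicts would hit the "work" threshold long before the "rob" one, so murder stays a rare outcome for high-aggression personalities rather than the default, and the earn/spend/crave cycle repeats on its own.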