I thought I would explain “Needs, Wants, & Desires” within the context of Star Control in a little more detail. This is at the foundation of how Will Wright's games work, and it is very obvious that he learned this from a book called “Designing Games & Simulations” which, in the 1980s, was one of the only books that existed on game design. It is much more obvious in Sim City than it is in Spore, but it is there in Spore as well.
Most modern game designers would attempt to create a “Living World” simply by assigning a route for each “AI Object” to endlessly repeat. This creates an illusion of a “Living World”, but not nearly as well as Needs, Wants, & Desires does. Within each category there is an order of precedence that determines which needs override the other needs an AI Object might have, and the same goes for the other two categories; across categories, unmet needs outrank wants, and wants outrank desires. So, instead of simple “endlessly repeating routing” that eventually becomes obvious to the player, “NWD” makes it far more difficult for the players to discern what is going on. And, as a result, the AI will appear to be far more intelligent... and yet this is still a “board game simple AI”.
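To make that precedence idea concrete, here is a minimal sketch in Python of how the three categories and their internal rankings might be resolved into a single active goal. To be clear, this is my own illustration, not actual code from Star Control or any Maxis game; the names (Goal, Category, pick_active_goal) are invented for the example.

```python
from dataclasses import dataclass
from enum import IntEnum
from typing import Optional

class Category(IntEnum):
    # Lower value = higher precedence: any unmet need beats every want,
    # and any unmet want beats every desire.
    NEED = 0
    WANT = 1
    DESIRE = 2

@dataclass
class Goal:
    name: str
    category: Category
    rank: int              # precedence within its own category (lower = more urgent)
    satisfied: bool = False

def pick_active_goal(goals: list[Goal]) -> Optional[Goal]:
    """Return the most urgent unsatisfied goal: sort by category first,
    then by rank within that category."""
    pending = [g for g in goals if not g.satisfied]
    if not pending:
        return None
    return min(pending, key=lambda g: (g.category, g.rank))
```

The whole trick is that two-level sort: category first, rank second, so one cheap comparison decides what the AI Object does next, which is why this stays “board game simple”.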
That “endlessly repeating routing” would still exist for each AI Object, but within the NWD system. So imagine a “mercenary”. You would deliberately mis-file the “want” to make money as a “need” to make money, and this makes that ship appear to act like a “mercenary” whose primary “thought” is to make money. But getting fuel when needed would still be a higher ranking need than making money, for example, because it can't make money without fuel. So this ship might have “stock” programming within this system that has it endlessly repeating a route between a handful of star systems: get fuel in star system 1, move cargo from star system 3 to star system 6, conduct piracy operations in system 9, sell the looted goods in system 12, at which point it is low on fuel and returns to star system 1 to repeat this endless loop.
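Here is a hedged sketch of that “stock” loop: the patrol route is just a repeating list of stops, and a low-fuel need pre-empts it. The class name, the fuel threshold, and the system numbers are either taken from the example above or invented for illustration; none of this is from a real implementation.

```python
from dataclasses import dataclass
from itertools import cycle

@dataclass
class Mercenary:
    fuel: float
    fuel_threshold: float = 20.0   # invented cutoff for "low on fuel"
    home_fuel_system: int = 1

    def __post_init__(self):
        # The endlessly repeating "stock" route: (system, activity) pairs.
        self._route = cycle([
            (1, "get fuel"),
            (3, "pick up cargo"),
            (6, "deliver cargo"),
            (9, "conduct piracy"),
            (12, "sell looted goods"),
        ])

    def next_action(self) -> tuple[int, str]:
        # The fuel need always outranks the money-making wants: the ship
        # can't make money without fuel, so refueling pre-empts the route.
        if self.fuel < self.fuel_threshold:
            return (self.home_fuel_system, "get fuel")
        return next(self._route)
```

Left alone, the ship just walks this loop forever; the interesting behavior only appears when new goals get injected, which is where the player comes in.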
Then the player comes along and has a conversation with this ship. Different player choices might then add higher ranking needs, wants, and/or desires for that AI Object (the mercenary ship), but not necessarily in the way the player expects. In your conversation, maybe you made a deal with them to steal something for you and meet you in a certain system later to give/sell it to you. That objective becomes the ship's highest ranking want. But then it heads in the opposite direction you were expecting... because it doesn't have the fuel to complete the task, so its highest ranking need tells it to first go get fuel. Once it has the fuel, your quest outranks its normal wants and desires to run its endlessly repeating route, so it breaks from its normal behavior, goes to get the thing you asked for, and meets you where you had agreed to meet.
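And here is a sketch of how a conversation could inject a new top-ranked want, and why the ship appears to head the “wrong way” first: the fuel need still resolves before the player's quest does. Again, the API (give_quest, decide) and the numbers are hypothetical, written only to mirror the paragraph above.

```python
from dataclasses import dataclass

@dataclass
class Ship:
    fuel: float
    quest: str | None = None          # a player-given want; outranks the route
    quest_system: int | None = None

    def give_quest(self, task: str, meet_at: int) -> None:
        # The deal made in conversation becomes the highest-ranking want.
        self.quest, self.quest_system = task, meet_at

    def decide(self) -> str:
        if self.fuel < 20:                       # need: can't do anything dry
            return "head to the nearest fuel source"
        if self.quest is not None:               # want: the player's quest
            return f"travel to system {self.quest_system} and {self.quest}"
        return "resume the stock patrol route"   # the usual repeating loop

ship = Ship(fuel=10)
ship.give_quest("hand over the stolen item", meet_at=7)
print(ship.decide())   # "head to the nearest fuel source" -- not toward system 7!
ship.fuel = 100
print(ship.decide())   # now the quest outranks the stock route
```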
This was a very simple example. Interaction with the player creates new needs, wants, & desires and breaks this AI Object out of the normal “stock behavior” of its usual “endlessly repeating route”. Once it has fulfilled those needs, wants, & desires it returns to the “stock AI” that keeps it in its “home area”. You can see how “real” this appears to be to the player on a lot of different levels, including the ship not immediately heading to the quest location if it has needs that must be fulfilled first. Such ships almost appear “unreliable” in some ways, as a “mercenary” should, and appear to be “thinking on their own”.
As I mentioned in the beginning, the most obvious example of this is in Will Wright's Sim City. The classic example, and the exercise that inspired Sim City, is applying this to the survival of a subsistence-level village.