You could be right. Rationality is a tricky concept, though. What's rational in one culture can be considered completely irrational in another.
If "rational behaviour" is meant to be "value-directed", then what you state here is known as relativistic ethical fallacy.
It all depends on the complexity of the transaction. The simpler the transaction (the market-oriented culture, in your terminology), the more a one-time "rationality" is sufficient.
Objects like atoms are simple, but they are sufficient to form everything on Earth, including you and me. Something similar happens with market transactions: they are sufficient to build a modern airplane or the latest sports car. A worker driving a screw on a car assembly line profits from the difference between the market price of a screw plus the under-assembled car, taken together, and the price of the car with the screw in its proper place. Certainly, finding such profitable actions and processes is not trivial, which is why specialists in this area, called entrepreneurs, are needed.

On the other hand, the Soviet Union was a huge reactionary attempt to build a modern society on the notion of senseless absolute norms. In my opinion, its failure shows that losing meaning (the proper balance of values) is especially destructive in complicated transactions.
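To put hypothetical numbers on that (invented purely for illustration): if a loose screw sells for $0.10 and the car missing that screw for $19,990, while the fully assembled car sells for $20,000, then the single act of fastening the screw turns $19,990.10 worth of inputs into $20,000 worth of output. The $9.90 difference is the value the worker's action adds, and it is what the wage for that action comes out of.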
On the other hand, if the transaction is complex, e.g. it involves different types of interactions and objects and stretches over a long period of time, then norms and values become much more important than a one-time "rationality".
I almost agree with you. Artificial intelligence IS about taking actions that maximize value according to some valuation system. This even has its own technical name -- Reinforcement Learning. And rationality is nothing but being consistent in following one's values. What I do not understand is your mixing up of values and norms. I do not see any better definition of norms than rigid rules of action whose value-derived justification has been forgotten and, most often, is no longer true. Like not eating pork or beef in times when refrigerators and rapid delivery are widely available.
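If it helps, here is a minimal sketch in Python of what I mean by "maximizing value according to some valuation system" (a toy epsilon-greedy bandit of my own; the action names and reward numbers are invented for illustration). The agent keeps a valuation of each action, mostly picks the action with the highest estimated value, and updates the estimates from the rewards it observes.

import random

# Hypothetical "true" values of three actions, unknown to the agent.
TRUE_VALUES = {"a": 1.0, "b": 2.5, "c": 0.5}

def reward(action):
    """Noisy reward drawn around the action's true value."""
    return random.gauss(TRUE_VALUES[action], 1.0)

def run(steps=5000, epsilon=0.1, step_size=0.1):
    estimates = {a: 0.0 for a in TRUE_VALUES}  # the agent's valuation system
    for _ in range(steps):
        if random.random() < epsilon:
            # occasionally explore a random action
            action = random.choice(list(estimates))
        else:
            # otherwise act "rationally": pick the highest-valued action
            action = max(estimates, key=estimates.get)
        r = reward(action)
        # move the estimate toward the observed reward
        estimates[action] += step_size * (r - estimates[action])
    return estimates

if __name__ == "__main__":
    print(run())  # estimates should roughly approach TRUE_VALUES

After enough steps the estimates approach the true values, so the greedy choice settles on action "b". A rigid norm, in these terms, would be freezing the estimates and never updating them even when the rewards change.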