A Model for Inconsistency in Consumer Choice Due to Limited Computational Ability, Inaccurate Memory, and Learning.

By Bernard Gress, 1999

    1. Introduction/Abstract
    2. Errors in the consumption process
      1. "Errors" vs. "Inconsistency"
      2. Inconsistency from Learning
      3. Computational Errors
      4. Errors from Forgetfulness
      5. Errors of Perception
      6. Errors of Imperfect Self Knowledge
      7. Other Errors
    3. The Model and its Interpretation
    4. Comparison of the Model with Real World Data
    5. Conclusions

 

Abstract

Consumer preference theory offers an interesting model of consumer choice based on concepts such as decreasing marginal utility. At the same time, however, it implies a variety of behavioural consistencies that range from unlikely to next-to-impossible.

While there have not as of yet been any serious empirical studies of individual choice, so that we don't know what properties real choices might exhibit, a few seemingly reasonable assumptions make it possible to devise an extension of the neo-classical model that generates appropriately inconsistent choice data.

 

Introduction

Consumer preference theory offers an interesting model of consumer choice based on concepts such as decreasing marginal utility. At the same time, however, it implies a variety of behavioural consistencies that range from unlikely to next to impossible. Even in the most basic two-good model, the consumer is able to discern precisely the trade-offs in utility from minute changes in quantities or prices of either item. While this might be acceptable for a two-good model, it would seem to become computationally intractable as the number of goods grows.

The neo-classical model says that so long as prices and income are identical, choices will also be identical. If the model is adapted to include the dimension of time, then this assumption implies that our consumer is able to pass through any arbitrarily long period of time and choose the identical bundles he chose before, just so long as prices and income are the same. If the model is expanded to include, say, seven or eight consumption items, then this neo-classical consumer is still able to perform this feat of memory and computation regardless of how many times prices or income have changed in the interim, and regardless of how much time passes between the first and last choice. No matter how many dimensions the space has, he will always choose exactly the same amounts of each consumption item each time. In effect, the consumer is granted infinite computational ability: the ability to solve a very large number of simultaneous equations in some very small period of time.

Also of interest in this model is the notion that people know exactly what their utility is. If asked, our consumers have the capability to respond with the precise functional form of their utility functions, perhaps after a brief period of introspection. And they have this self-understanding from day one, regardless of whether they are children or octogenarians, and regardless of whether they have ever had an income of one million dollars or faced the most bizarre of price combinations before. There is no sense of needing to acquire experience with spending vastly different levels of wealth, or of having to reevaluate priorities under wildly fluctuating prices. There is no sense of having to 'discover one's self' to find out where exactly one's ideal 'bliss' point is.

In this case I think we can imagine that any data we might hope to analyse will be, at best, at the level of an individual shopper, and probably in an individual store. Hence our stylized consumer would probably be some homemaker shopping for his own or his family's needs. The homemaker pushes his cart up and down the aisles of the ubiquitous modern supermarket, weighing the costs and benefits of each item vis-à-vis one another.

"Errors" vs. "Inconsistency"

Of course our first problem is determining what 'inconsistency' consists of. To determine what consistency might be, we would have to begin to understand how people determine what is important to themselves, a task that the serious fields of theology, philosophy, and psychology have yet to complete, and one that a fickle and capricious field like economics has hardly gotten around to deciding. The neo-classical model, suitably qualified, at least offers some testable hypotheses of what constitutes 'consistent' or 'rational' behaviour, but to date there has been no rigorous testing of it. Still, I think most people would probably agree on a division of an individual's 'errors' into two types: those that are inconsistent with what is 'objectively' best for the individual but are chosen consciously, and those that are inconsistent with what is 'objectively' best for the individual but are chosen accidentally, i.e. through error. The first would be assigned to some higher capriciousness of the human spirit which science and method could never hope to predict; the second would be assigned to the fallibility of the human brain and its inability to do complex calculations. Of course we are begging the question by assuming that there is something which is 'objectively best' for anyone in the first place, and of course this is just the fundamentally untestable portion of consumer theory which makes up its Lakatosian 'hard core.' But it is at least a higher level of dogma, pushing our tautological tenets one step further back.

Thus, if we reinterpret the classical utility function to instead be this objective function which describes exactly what is best for people, we could then consider deviations from the best choice as due to either unpredictable whims, or noise of some sort in the consumption process. Looked at in this fashion, what types of inconsistency might we expect to see?

Inconsistency from Learning

The individual takes his items home and consumes them. He makes some note of his satisfaction with the items. The next time he returns to make more purchases he may or may not recall his judgement accurately. If he is the family representative then there will be errors in the communication of family members' reevaluations of their preferences. Finally, over time, any individual will inevitably change his preferences, changing his opinions of Fruit Loops and bubble gum in favour of beer, liver, and Limburger cheese.

Computational Errors

Some items have promotions, rebates, or other complex pricing functions like buy-one-get-one-free. Inevitably there is some error involved in such calculations of prices and relative prices.
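As a toy illustration of the arithmetic involved, consider reducing promoted items to an effective unit price before relative prices can even be compared. The promotion scheme names and all figures below are invented for illustration, not taken from any real store:

```python
# Effective per-unit price under simple promotion schemes; scheme names
# and figures are invented for illustration.
def effective_unit_price(list_price, qty, promo=None):
    if promo == "bogo":                 # buy one, get one free
        paid_units = qty - qty // 2     # pay for half, rounded up
        return list_price * paid_units / qty
    if promo == "rebate_1":             # hypothetical $1-per-unit rebate
        return list_price - 1.0
    return list_price

# A $3.00 item bought two at a time under buy-one-get-one-free really
# costs $1.50 a unit, so its relative price against a plain $2.00 item
# is 0.75, not 1.5.
print(effective_unit_price(3.00, 2, "bogo"))   # prints 1.5
```

Even this trivial scheme shows how the relative price the shopper must compute depends on quantity purchased, not just posted prices.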

Errors from Forgetfulness

Not all items to be consumed are purchased at the same time, some items have already been purchased and are at home being used, others might be at some other store to be purchased later. There will inevitably be errors in recalling the prices of the items already owned or in predicting the prices of items to be bought later.

Errors of Perception

Even if we are able to categorize goods into exact divisions based on the functions they provide to consumers according to the objective utility function, we are still faced with errors of perception. The consumer must infer the utility a good will provide based only on the limited information available in the supermarket at the time of his purchase. Surely there is an opportunity for error here as products willfully attempt to misrepresent themselves.

Errors of Imperfect Self Knowledge

How does the consumer know the utility of an item without first purchasing and trying it? Even if a consumer uses a good, how does he know to what degree it is fulfilling his objective utility function? When relative prices change and income is constant, the individual is faced with the task of reevaluating the proportions of each item he wishes to purchase. How does he know that if prices change by 10% then, based on the slope of his utility curve at the new tangency, he should buy exactly 29.4% more of item X and 14.92% less of item Y? We might imagine that even if the underlying utility curve is precise, there is still a requisite amount of guesswork, or some internal yet imprecise feedback mechanism, involved in communicating the degree to which a good satisfies the objective utility function.
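For concreteness, here is the kind of exact reoptimisation the theory demands, worked out under an assumed Cobb-Douglas utility; the functional form, the share parameter, and the prices are my own illustrative choices, not anything from the text:

```python
# Closed-form demands for an assumed Cobb-Douglas utility
# U = x^a * y^(1-a); the share a and all prices are illustrative.
def optimum(m, px, py, a=0.5):
    return a * m / px, (1 - a) * m / py

x0, y0 = optimum(100.0, 2.0, 4.0)   # baseline: x0 = 25.0, y0 = 12.5
x1, y1 = optimum(100.0, 2.2, 4.0)   # px raised by 10%
pct = 100.0 * (x1 - x0) / x0
print(round(pct, 2))                # exact required change in x: -9.09%
```

The theory asserts the consumer performs this calculation instantly and exactly, for any functional form and any number of goods.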

There might also be inconsistencies introduced as the homemaker performs the process of aggregating his family's varied utility functions into one. There might likewise be some degree of error in the homemaker's perception of his family members' preferences, to be corrected later with feedback from them.

Other Errors

Since the process of buying anything is a long and involved one, with a myriad of tasks to be performed and a variety of computations to be made, it is inevitable that there will be an endless assortment of bumps and jiggles to the final decision. While individually they will undoubtedly be insignificant, in toto they could be crucial. It would seem wise to make allowances for all of these remaining noises too.

If our goal is to create a model of consumption that allows for 'inconsistencies' in behaviour, then the neo-classical model of consumer choice would let us down. I think the best approach to this question is to begin by pushing the model closer to what we believe to be the realities of the consumer and the consumption process that we wish to model. Since I am presumably designing my model for some corporation to determine how best to sell its product, or for some proactive government to best understand or predict its economy, it would probably be preferable to consider a middle-class individual in one of the modern economies, as opposed to, say, a Tibetan peasant haggling for a hunk of goat at the village bazaar. In practice, of course, ascertaining the 'objective' utility function might be unimportant; perhaps we would only be concerned with making predictions, in which case we could just estimate a 'representative' utility function from the confluence of many individuals.

 

The Model

All of these points would seem to introduce some form of error, even if we assume the most rigid and precise of neo-classical models at the core. I think these errors can best be divided into two or three major categories with different statistical properties that could be tested.

Since there are no suitable sources of data on individual consumption choices to analyse for 'inconsistencies' due to limited computational or learning capabilities, we can only use our imagination to conjure up some 'reasonable' errors and inconsistencies that we might expect to encounter.  I think that there would be a few general situations which would generate different types of errors that we might expect to observe in a person, and which we would like to emulate in our simple model of inconsistency and learning.

For one, we might expect that efficiency/consistency should fall as time separates two choices, due to a fading of memory. Thus, we might expect consistency to fall as the distance between one choice and the next, as measured in commodity space and in time, increases. On the flip side, we might expect that if prices and income are constant then efficiency should increase, as a consumer 'zeros in' on his optimal choice, becoming increasingly aware of what is best for him through trial and error. These proposed phenomena are elaborated in the following table.

Each entry gives the type of event/shock, the inconsistencies we might expect to see in the consumer, and the statistical properties of the data to be searched for or modeled.

    1. Constant Income and Prices.
       Expected inconsistencies: Consumer figures out what he really wants and closes in on the optimal point.
       Statistical properties: Increasing efficiency/consistency; decreasing distances.

    2. Constantly increasing Income, mildly fluctuating Prices.
       Expected inconsistencies: Consumer figures out what he really wants and closes in on optimal points.
       Statistical properties: Low inefficiency, as GARP is not often violated; constant distance as the consumer narrows in around the income expansion path.

    3. Constant Income, wildly fluctuating Prices.
       Expected inconsistencies: Consumer has the opportunity to understand the intricacies of one indifference curve and the rates of transformation of the goods.
       Statistical properties: More violations of GARP; higher but decreasing inefficiency.

    4. Wildly fluctuating Income and Prices.
       Expected inconsistencies: Consumer has no chance to adapt to new situations and makes many inefficient/foolish choices.
       Statistical properties: High correlation of distance in consumption space, and of time, with inefficiency.

 

The model assumes that there is a true utility function at the consumer's core. At the same time, however, it pretends that this true function is unknown to the individual, and that he can only make a guess as to the precise location of any efficient choice. Here is one such utility map that could be used.

 

Here is an example run of the first 30 bundles that are chosen by the utility-maximising individual as prices and income fluctuate randomly, if he is able to do so perfectly and without error.

So far this is the standard utility-maximization model. My model then adds another space, of the same dimensions as the consumption space, over which to assign 'levels of clarity,' 'levels of understanding,' 'levels of cognizance,' 'degrees of familiarity' (or some similar such label) to each bundle. Let me just call it the 'inconsistency space.' This space will be used to generate probabilities for each bundle so that a bundle can be chosen randomly based on those probabilities. Within this space, weights are assigned to correspond to each consumption bundle, based on their distance from the optimally chosen bundle. I have chosen to use a Normal distribution centered on the optimal bundle as my weighting function.

While all points in the space are assigned a weighting, not all are acceptable choices. Probabilities are only generated over points that are feasible: we must require that the consumer spend no more than his income, so the probable set is just the feasible set with probabilities attached to each point. The model is made simpler, however, by requiring that all income be spent so that the probabilities for the probable set are assigned only over those bundles actually on the budget line.

The weightings recorded in inconsistency space are cumulative over time; as more and more bundles are chosen, more and more points in the vicinity of those bundles have an increased probability of being chosen. All weights across the space are also decreased somewhat after each time period, so that there is a constant rate of decay of all points in the inconsistency space. Finally, the variance of the Normal weighting function is a function of the distance between the present choice and the last, where distance is calculated over the consumption space and possibly over time as well.

Hence the procedure is as follows:

    1. Generate a sequence of length T of random incomes and prices.
    2. Solve the consumer maximization problem max_{x1,...,xk} U(x1,...,xk) subject to p1 x1 + ... + pk xk <= m, to find the quantities x1, x2, x3, ..., xk of the k goods.
    3. Find the Euclidean distance from the previously maximized bundle:
       d_t = sqrt( (x1,t - x1,t-1)^2 + ... + (xk,t - xk,t-1)^2 )
    4. Weight the k-dimensional consumption space around the maximized bundle x1, x2, x3, ..., xk with a Normal distribution with a variance of d_t.
    5. Find the set of all bundles that are on the budget line, i.e. such that p1 x1 + ... + pk xk = m, and find the numbers that have been assigned to each of these bundles in the previous weighting.
    6. Generate the corresponding probabilities from these weightings for each of the bundles found above. This is the 'probable set.'
    7. Randomly choose a bundle from this list according to the corresponding probabilities; this yields the "inconsistent" or "imperfect" or "irrational" choice made by our consumer.
    8. Repeat for t+1.
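Under some loud assumptions of my own (a two-good world, a Cobb-Douglas 'objective' utility, an integer consumption grid, and illustrative constants for the decay rate and the price and income ranges, none of which are specified above), the eight-step procedure can be sketched as:

```python
# Sketch of the simulation loop; grid size, Cobb-Douglas share, decay
# constant, and price/income ranges are all illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
GRID = 50        # consumption space: integer bundles in {0..49} x {0..49}
ALPHA = 0.4      # assumed Cobb-Douglas share of good 1
DECAY = 0.05     # per-period decay of the inconsistency space
T = 30

weights = np.zeros((GRID, GRID))   # the "inconsistency space"
x1g, x2g = np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij")
prev_opt = None
choices = []

for t in range(T):
    # Step 1: random prices and income for this period
    p = rng.uniform(1.0, 3.0, size=2)
    m = rng.uniform(40.0, 80.0)

    # Step 2: closed-form Cobb-Douglas optimum
    opt = np.array([ALPHA * m / p[0], (1 - ALPHA) * m / p[1]])

    # Step 3: Euclidean distance from the previous optimal bundle
    d = 1.0 if prev_opt is None else float(np.linalg.norm(opt - prev_opt))
    prev_opt = opt

    # Step 4: cumulative Normal weighting centred on the optimum,
    # with variance tied to the distance d
    sq = (x1g - opt[0]) ** 2 + (x2g - opt[1]) ** 2
    weights += np.exp(-sq / (2.0 * max(d, 1e-6)))

    # Step 5: grid bundles (approximately) on the budget line p.x = m
    on_line = np.abs(p[0] * x1g + p[1] * x2g - m) <= 0.5 * p.max()

    # Step 6: probabilities over this "probable set"
    w = weights[on_line]
    probs = w / w.sum()

    # Step 7: the randomly drawn, possibly "inconsistent" choice
    idx = rng.choice(np.flatnonzero(on_line.ravel()), p=probs)
    choices.append((int(idx) // GRID, int(idx) % GRID))

    # Step 8 (with decay): the whole space fades a little each period
    weights = np.maximum(weights - DECAY, 0.0)

print(choices[:5])
```

Because the Gaussian weights accumulate and then decay, repeated visits to a region raise its probability of being chosen even when prices later say otherwise, which is exactly the 'in-a-rut' behaviour described below.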

     

    So how does this model relate to reality or at least to the aforementioned error types? What is the story one might tell about these various techniques?

    Choosing a point at random from the budget line, based on the probabilities derived from inconsistency space, suggests the consumer's inability to identify what is best for himself; by using the Normal distribution centered on the optimal choice, the probability of choosing the optimal choice is still greatest, but of course not certain.

    By allowing the weightings to be cumulative so that the points in the vicinities of the most often chosen bundles become more probable, the probability of any distant bundle being chosen falls. It is as if the consumer gets used to, or has experience with, certain bundles. And, even when circumstances (i.e. prices) should have him choose otherwise, he still ends up choosing that which he is used to.

    To suggest that big changes in the consumer's information will cause a greater deal of confusion and novelty, and thereby a greater degree of inconsistency in choice, the variance of the Normal distribution is tied to the distance in commodity space between bundles from one period to the next. The greater the jump, the greater the spread and the lower the weightings added to the neighborhood of the strictly consistent bundle. Likewise, the smaller the jump, the more likely the consumer is to choose a bundle even closer to the strictly consistent bundle. Since things are cumulative, constant prices and income will allow the weightings to converge to the optimal bundle. It is of course possible to use any kind of modified distance function. By taking into account more than just the last choice's distance, using some function involving lags of previous distances, distance in time is given a different role in the model. In my simulations I have modified the simple distance function mentioned above to take a moving average of the previous distances and to scale it somewhat, adding a constant and multiplying by another. This ensures that there is a set range within which the variance of the Normal distribution can fluctuate, so that it doesn't get out of hand.
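The modified distance function described above might look like the following sketch; the window length and the constants A_MIN and B_SCALE are my own illustrative choices, not values from the text:

```python
# Bounded variance for the Normal weighting: a moving average of recent
# inter-bundle distances, rescaled by illustrative constants.
from collections import deque

A_MIN = 0.5     # additive constant: variance never collapses to zero
B_SCALE = 2.0   # multiplicative constant: controls how fast variance grows
WINDOW = 5      # number of lagged distances in the moving average

history = deque(maxlen=WINDOW)

def variance(d_t):
    """Map the latest distance d_t to the Normal weighting's variance."""
    history.append(d_t)
    return A_MIN + B_SCALE * sum(history) / len(history)
```

With distances bounded, the variance stays within [A_MIN, A_MIN + B_SCALE * d_max], so the spread of the weighting cannot 'get out of hand.'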

    Here is a graph of my distance function's values:

    Finally, the inconsistency space is decremented by a constant each period, but not allowed to go below zero. This allows for control of the rate of convergence, of 'memory loss,' and of in-a-rut-edness in general. If points are not visited frequently enough, the consumer forgets about them and relinquishes them back to their original status of uninteresting.

 

An animation of the evolution of this probability space is shown here; click on the play button to get it to move.

consumerchoice.avi (7,061,654 bytes)

Here we can see the first 100 periods of the learning model's 'inconsistency space.'

The brightness of the space shows the relative weight of that point in the consumption space. The probabilities for the different maximal sets are found by considering the weights at each point in this space, along the budget line.

The coordinates, i.e. quantities, of the idealized bundle for that period are shown on top of each picture.

 

Comparing the Model with Real-World Data

In fact there are some datasets that can be used to directly test the consistency of individual choice with the theory of Revealed Preferences. I have found one collected by Battalio et al. (1973) from 38 patients of Central Islip State Hospital. As part of their treatment, the patients worked for tokens that could be exchanged for items such as cigarettes, candy, milk, locker rental, clothes, admission to a dance, and the like. During a seven-week period, the relative prices of various groups of these goods were doubled or halved, and data were collected on how the expenditures of each individual responded to the price changes. These data have been examined by Battalio et al. and Cox using revealed preference techniques.

I use Varian's test for the efficiency of the choice set (which computes the minimum expenditure necessary to purchase an observed choice that is revealed preferred to a given observation) on the Battalio data set to see if there is any support for the notion of tying the consumption-space distance to efficiency. Unsurprisingly, I find that there is no clear relationship. There are many reasons one could point to, the first being that more than 90% of the choices made by all of the individuals are 100% efficient. A graph of the efficiencies and the distances shows this rather clearly: the regression line mostly measures the effect of the perfect efficiency and doesn't have much chance to pick up any other effects. Also, the prices in the Battalio dataset are rather simple, changing by a factor of four for two of the products and not changing at all for the others, so that relative values and marginal rates of substitution would seem to be readily apparent.
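For intuition, a per-observation efficiency index of this flavour can be computed from a panel of prices and chosen bundles as sketched below. The two-observation data are invented (they contain a deliberate revealed-preference violation) and are not from the Battalio set, and this is a simplified version of the test, not Varian's implementation:

```python
# Simplified revealed-preference efficiency index on invented data.
import numpy as np

P = np.array([[1.0, 2.0], [2.0, 1.0]])   # prices, one row per observation
X = np.array([[1.0, 4.0], [4.0, 1.0]])   # chosen bundles

cost = P @ X.T          # cost[i, j] = cost of bundle j at prices i
own = np.diag(cost)     # own expenditure p_i . x_i

# Directly revealed preferred: x_j R0 x_i iff p_j.x_j >= p_j.x_i
R = cost <= own[:, None]

# Warshall transitive closure gives the full revealed-preference relation
n = len(P)
for k in range(n):
    R = R | (R[:, [k]] & R[[k], :])

# Efficiency of observation i: cheapest bundle revealed preferred to x_i,
# as a fraction of what was actually spent (1.0 means no violation)
eff = np.array([cost[i, R[:, i]].min() / own[i] for i in range(n)])
print(eff)   # here each bundle is revealed preferred to the other,
             # so both observations come out only about 67% efficient
```

On data like Battalio's, where most choices violate nothing, this index is simply 1.0 almost everywhere, which is why the regression described above has so little to work with.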

Here is the matrix of the prices of the five goods over the seven weeks in the Battalio dataset.

Finally, the time period between each choice, one week, is sufficiently long, and the choices sufficiently clear, that there probably wouldn't be much computational difficulty or self-analysis needed to figure out the goods' relative importance.


Conclusions

This model is primarily intended as an alternative to an Artificial Neural Network model I am also working on, which I feel would better mimic the limited computational ability and learning properties of an actual person; it is thereby meant to offer a point of reference and an alternative view. I don't think it will be of much use by itself, particularly in light of the lack of data with which to calibrate it.

 

Bibliography

Battalio, R.C., et al., "A Test of Consumer Demand Theory Using Observations of Individual Consumer Purchases." Western Economic Journal, 11, 411-428, 1973

Varian, H.R., Computational Economics and Finance: Modeling and Analysis with Mathematica., 1996