If Mr. Smith wants to SELL me his horse, do I really WANT to buy it? It’s a question as old as markets and horses themselves, but for many, many years it was one of the unspoken questions of economics.
It is a question Groucho Marx once posed in a slightly different way, when he declared that he refused to join any club which was prepared to accept him as a member.
So how do we solve this paradox? The paradox of the horse that is, not the Groucho paradox.
For most of the history of economics, the answer was quite simple: assume perfect markets and perfect information. The horse buyer would then know everything about the horse, and so would the seller, and in those cases where the horse is worth more to the buyer than to the seller, both can strike a mutually beneficial deal. Gains from trade, it’s called.
In the real world, of course, life is not so straightforward, and the person selling the horse is likely to know rather more about it than the potential purchaser. This is called ‘asymmetric information’, and the buyer faces what is called an ‘adverse selection’ problem: because he knows less than the seller, the horses actually offered for sale tend to be adversely selected against him.
This is a classic case of what economics professor George Akerlof sought to address in 1970, in a seminal paper called ‘The Market for Lemons’.
Akerlof had become intrigued by the limited tools that economists were using in the late 1960s. Unemployment, so went much of the general thinking, was caused by money wages adjusting too slowly to changes in the demand for labour. This was the so-called ‘neo-classical synthesis’, and it assumed classical markets, albeit markets that could be a bit slow to work.
At the same time, economists had come to doubt that changes in the availability of capital and labour could in themselves explain economic growth. The role of education was called upon as a sort of magic bullet to explain why an economy grew as fast as it did. But that posed a problem for Akerlof. How can we distinguish the impact on productivity of the education itself from the extent to which education simply helped grade people, he asked. The idea here is that more able people will tend on average to seek out more education. So how far does education in itself contribute to growth, and how far does it help simply as a signal and a screen for employers? In the real world, of course, these signals could be useful because employers are like the horse buyers – they know less about the potential employees than the employees know about themselves, the classic adverse selection problem.
Akerlof turned to the market for used cars for the answer, not least because at the time a major factor in the business cycle was the big fluctuation in sales of new cars. He quickly spotted the problem. Just like in the market for horses, the first thing a potential used car buyer is likely to ask is “Why should I WANT to buy that used car if he wants so much to SELL it to me?” The suspicion is that the car is what Americans call a ‘lemon’, a sub-standard pick of the crop. Owners of better quality used cars, called ‘plums’, are much less likely to want to sell.
Now let’s say you’re willing to spend £10,000 on a plum but only £5,000 on a lemon. In such a case, the best price you’d be willing to pay is about £7,500, and only then if you thought there was an equal chance of a lemon and a plum. At this price, though, sellers of the plums will tend to back out, but sellers of the troublesome lemons will be very happy to accept your offer. But as a buyer you know this, so will not be willing to pay £7,500 for what is very likely to be a lemon. The prices that will be offered in this scenario may well spiral down to £5,000 and only the worst used cars will be bought and sold. The bad lemons have effectively driven out the good plums, and buyers will start buying new cars instead of plums. Just as with horses, asymmetric and imperfect information in the used car market has the potential, therefore, to severely compromise its effective operation.
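The downward spiral described above can be sketched as a toy model. The buyer valuations are the £10,000 and £5,000 figures from the text; the sellers’ reservation prices (£8,000 for a plum, £4,000 for a lemon) are invented for illustration:

```python
# Toy model of the lemons spiral: the buyer offers the expected value of a
# car, and re-prices once the plum owners drop out of the market.
PLUM_BUYER, LEMON_BUYER = 10_000, 5_000    # buyer valuations (from the text)
PLUM_SELLER, LEMON_SELLER = 8_000, 4_000   # assumed seller reservation prices

def market_price(prob_plum):
    """The price at which trade settles, given the buyer's belief about quality."""
    price = prob_plum * PLUM_BUYER + (1 - prob_plum) * LEMON_BUYER
    if price < PLUM_SELLER:     # plum owners refuse to sell at this price...
        price = LEMON_BUYER     # ...so the buyer re-prices for a lemons-only market
    return int(price)

print(market_price(0.5))  # 5000: the £7,500 offer unravels to the lemon price
print(market_price(0.9))  # 9500: with plums plentiful enough, both qualities trade
```

The interesting feature is the discontinuity: above some belief threshold both qualities trade, below it the plums vanish and only lemons change hands.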
What can be done about this? For the answer we must go back to part of the reason why people seek education, which is to signal personal qualities which might otherwise be difficult to discern. This is part of the wider theory of signalling and screening, and it takes us to another place, on another day.
Reference: Akerlof, G. (1970), ‘The Market for “Lemons”: Quality Uncertainty and the Market Mechanism’, Quarterly Journal of Economics, 84(3): 488–500.
In a game first popularized in the academic literature in the early 1980s, two people are invited separately to play for a monetary prize. The players act anonymously, via a computer terminal. The game, widely known these days as the ‘Ultimatum Game’, involves one of the players, who we will call Jack, being given £50, say, by the experimenter. He must now decide how much, if anything, to offer the other player, who we will call Jill. Remember that Jack and Jill don’t know each other, and will remain anonymous to each other. The game is played only once, so there is no comeback from Jill whatever Jack does. There is, however, one consideration for Jack to bear in mind: if Jill turns down the offer, they both walk away empty-handed.

So how much should Jack offer Jill? And how much will he? Traditional economic theory about rational behaviour would suggest that Jack, as a profit-maximizing agent, should offer Jill a very small amount, and that Jill should accept this very small amount rather than get nothing. In fact, early experimenters who put real people into this scenario usually found that the split offered lay somewhere between 50-50 and 65-35. Sometimes, nevertheless, the second player was indeed offered only a small amount, and where this was less than 30% of the prize, the offer was usually refused. In other words, when Jill was offered less than £15 of the £50, she usually walked away from the deal, leaving both with nothing.

Is this reconcilable with rational economic behaviour? One explanation often proposed is that offers of less than 30% or so are considered derisory, even insulting, and that Jill gets utility (as economists would call it) from punishing Jack. Yet the low offer made by Jack is not in fact a personal insult; it arises from the design of the game, and neither player will ever know who the other is. It is certainly not profit-maximizing behaviour by Jill.
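The structure of the game can be captured in a few lines. The sketch below encodes the experimental finding as a fixed 30% rejection threshold for Jill; the sharp threshold is an assumed simplification, not a description of any individual subject:

```python
# One round of the Ultimatum Game with a £50 pot. Jill is modelled with a
# rule of thumb: reject any offer below 30% of the pot. The fixed threshold
# is an assumption standing in for the experimental findings described above.
POT = 50

def play(offer, threshold=0.30):
    """Return (Jack's payoff, Jill's payoff) for a single round."""
    if offer < threshold * POT:
        return (0, 0)               # offer refused: both walk away empty-handed
    return (POT - offer, offer)     # offer accepted

print(play(5))    # (0, 0): a £5 offer is below £15 and gets refused
print(play(20))   # (30, 20): a £20 offer is accepted
```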
Is there another explanation? One that makes some sense, proposed in the mid-1980s by the distinguished mathematician and game theorist Robert Aumann, is that people tend to evolve rules of thumb according to which they behave in their day-to-day lives. One such rule he identifies as “Don’t be a sucker; don’t let people walk all over you.” This might indeed work well as a general rule for Jill to live by, insofar as it helps build up her reputation for people’s future reference. But in this particular situation, turning down £15 does nothing to build up her reputation, because she is anonymous. Aumann’s explanation is that Jill doesn’t think like that. She has built up this rule-of-thumb behavioural code over a lifetime, and will not easily abandon it in a particular context just because the situation is different. This is what we might call ‘bounded rationality’: people do not usually consciously maximize in each decision situation, but instead use rules of thumb that work well “on the whole”.

So that leaves a couple of questions. The first is whether Jack is being rational when he offers a small slice of the cake to Jill; the second is whether he is being altruistic or self-interested when he offers her a bigger slice.

Reference: Aumann, R. (1986), Rationality and Bounded Rationality, Nancy L. Schwartz Memorial Lecture.
Derren Brown, the illusionist, is no stranger to the use of the idea of the wisdom of crowds as part of his entertainment package. A few years ago, for example, he selected a group of people and asked them to estimate how many sweets were in a jar.
All conventional ‘wisdom of crowds’ stuff, albeit wrapped as part of a magical mystery tour. A more recent venture of his into this world of apparent wisdom went down a rather singular avenue, however, as he purported to show how a group of 24 people could predict the winning Lottery numbers with uncanny accuracy.
The idea in essence was that each of the 24 would make a guess about the number on each ball and the average of each of these guesses would converge on the next set of winning numbers. It appeared to work – but that is the thing about illusionists; they are good at producing illusions.
I will not go into how he did generate the effect of predicting the lottery draw, because there is no point if you already know, and because it would spoil the fun if you don’t. What is sure, however, is that the musings of the crowd had nothing to do with it.
But why not? After all, if the crowd can accurately guess the weight of an ox or the number of jelly beans in a jar, why not the numbers on the lottery balls? The simple answer, of course, is because the lottery balls are drawn randomly. And the thing about random events is that they are unpredictable. This is at the heart of what economists term ‘weak form market efficiency’, i.e. that future movements in market prices cannot be predicted from past movements. In this sense, the series has no memory.
So what is likely to happen if you do get a group of friends around and ask each to guess the number that will appear on each of the balls drawn next Saturday? If you take the average of the guesses about each in turn, my best estimate is that you are likely to end up with a prediction for each ball that is about 30 or less. Why so? Partly this is because people tend to pick birthdays but it’s also because the averaging of a large number of guesses is likely to produce a number somewhere nearer the mid-point of the set of numbers than the extremes.
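A quick simulation makes the point. The 60% share of birthday-pickers and the 1–49 draw are assumptions for illustration:

```python
import random

# Average a crowd's guesses for a single lottery ball, assuming a 1-49 draw
# and that a (made-up) 60% of guessers pick birthday numbers in the 1-31 range.
random.seed(0)  # fixed seed so the sketch is reproducible

def crowd_average(n_guessers=1000):
    guesses = []
    for _ in range(n_guessers):
        top = 31 if random.random() < 0.6 else 49   # birthday-picker or not
        guesses.append(random.randint(1, top))
    return sum(guesses) / len(guesses)

print(round(crowd_average(), 1))  # close to 20: below the midpoint, never near the extremes
```

However many guessers you add, the average hovers around the same modest value; it cannot land on a high ball number, which is exactly why averaging is useless for predicting a random draw.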
But if you do use these numbers and just happen to win, you’re likely to be sharing your winnings with a lot of other people who’ve chosen the same numbers as you. The better strategy is to populate your ticket with bigger numbers, which are likely to be less popular.
This strategy won’t alter your chance of winning but it will increase how much you can expect to win if you do win. And that is no illusion!
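A back-of-envelope calculation shows why. The jackpot size and the expected numbers of co-winners below are invented for illustration; the 1-in-13,983,816 odds are those of a 6-from-49 draw:

```python
# Expected winnings per ticket: the win probability is identical for every
# ticket, but popular (birthday-style) numbers mean more expected co-winners
# and hence a smaller share of the jackpot. Figures are assumptions.
JACKPOT = 5_000_000
P_WIN = 1 / 13_983_816                 # 6-from-49 jackpot odds

def expected_winnings(expected_co_winners):
    return P_WIN * JACKPOT / (1 + expected_co_winners)

popular = expected_winnings(3.0)       # assumed crowd on low numbers
unpopular = expected_winnings(0.5)     # assumed sparse crowd on high numbers
print(f"popular: {popular:.3f}, unpopular: {unpopular:.3f}")
```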
When asked to list my all-time heroes, the name of William of Ockham (or Occam) is never far from my lips. Born in the late 13th century, in the Surrey village of Ockham, this Franciscan philosopher, theologian and political writer, is generally considered to be one of the major figures in medieval scholarship.
In this regard, he ranks alongside the likes of his fellow theologians Thomas Aquinas and John Duns Scotus in the pantheon of great pre-Renaissance thinkers. Despite the title he earned at Oxford University of Venerabilis Inceptor (‘Worthy Beginner’), it is by his alternative title of Doctor Invincibilis (‘Unconquerable Doctor’) that he comes down to us. Of all his writings, and they are each worthy of separate study, it is for his principle of parsimony in explanation and theory-building that he is best known today. It is a principle that Fox Mulder refers to in an episode of the X-Files and that Jodie Foster defers to in ‘Contact’. Indeed, in William Peter Blatty’s novel, Legion (on which ‘The Exorcist III’ is based), the lead character complains that he was not put on earth “to sell William of Occam door to door.” He needn’t have bothered. William of Ockham sells himself well enough without help, through the principle that is known as ‘Occam’s Razor.’
The Razor is perhaps most clearly defined in Encyclopedia Britannica’s Student edition, where it is taken as an admonition to devise no more explanations than necessary for any given situation. Put another way, it advises that one should opt for explanations in terms of the fewest possible causes, factors or variables. The adults’ version of Encyclopedia Britannica puts it more elegantly, but perhaps less clearly, in these terms – ‘Pluralitas non est ponenda sine necessitate’ (‘Plurality should not be posited without necessity’). As such, the principle can be interpreted as giving precedence to simplicity: of two competing theories, the simpler explanation is to be preferred. There are some higher truths, which may be known to us by experience or revelation, and which Ockham treats as necessary rather than contingent entities, to which we are not advised to apply the Razor. This is a part of Ockham’s trenchant analysis which is often forgotten, but it need not concern us when considering the theme of today’s article.
So how do modern-day analysts stand on the shoulders of this medieval giant? The best explanation is perhaps by way of example, and for this we need to travel to the Hong Kong racetrack and to the professional gamblers who devise sophisticated forecasting models of the outcomes of the races run at the Sha Tin and Happy Valley tracks. The basic methodology is to identify each individual factor that could possibly predict the outcome. And what do you do then? How do you decide what to include and what not? For the answer I asked a man who has conservatively made tens of millions of dollars at the track from this very approach. As we enjoyed the view from his Sydney penthouse, he summed it up in a sentence. “I apply Occam’s Razor”, he said, “it really is as simple as that!”
Now say that you have a missing aircraft, and the number of explanations seems to grow by the minute. What should we do? Apply Occam’s Razor, of course. And see what we get.
I was once told a story about the value of crowd wisdom in turning up buried treasure. The story was that by asking a host of people, each with a little knowledge of ships, sailing and the sea, where a vessel is likely to have sunk in years gone by, it is possible with astonishing accuracy to pinpoint the wreck and the bounty within. Individually, each of those contributing a guess as to the location is limited to their special knowledge, whether of winds or tides or surf or sailors, but the idea is that together their combined wisdom (arrived at by averaging their guesses) could pinpoint the treasure more accurately than a range of other predictive tools. At least that’s the way it was told to me by an economist who was in turn told the story by a physicist friend.

To any advocate of the power of prediction markets, this certainly sounds plausible, so I decided to investigate further. Soon I was getting acquainted with the fascinating tale of the submarine USS Scorpion, as related by Mark Rubinstein, Professor of Applied Investment Analysis at the University of California at Berkeley. In a paper titled ‘Rational Markets: Yes or No? The Affirmative Case’, he tells of a story related in a book called ‘Blind Man’s Bluff: The Untold Story of American Submarine Espionage’ by Sherry Sontag and Christopher Drew.

The book tells how on the afternoon of May 27, 1968, the submarine USS Scorpion was declared missing with all 99 men aboard. It was known that she must be lost at some point below the surface of the Atlantic Ocean within a circle 20 miles wide. This information was of some help, of course, but not enough to determine even five months later where she could actually be found. The Navy had all but given up hope of finding the submarine when John Craven, their top deep-water scientist, came up with a plan which pre-dated the explosion of interest in prediction markets by decades.
He simply turned to a group of submarine and salvage experts and asked them to bet on the probabilities of what could have happened. Taking an average of their responses, he was able to identify the location of the missing vessel to within a furlong (220 yards) of its actual location. The sub was found. Sontag and Drew also relate the story of how the Navy located a live hydrogen bomb lost by the Air Force. Now this story has got me thinking. Let’s just say we are looking for a missing aircraft. Could prediction markets possibly help? It’s a question I’m currently asking myself. Maybe others should be as well!
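In its simplest form, the aggregation step is just a weighted average of the experts’ guesses. The one-dimensional sketch below uses invented positions and confidence weights; Craven’s actual procedure was a richer exercise in combining scenario probabilities, not this bare average:

```python
# Combine expert guesses of a wreck's position (miles along a search line)
# by weighting each guess with a confidence score. All numbers are invented.
guesses = [
    (12.0, 0.5),   # (guessed position, expert's confidence weight)
    (14.5, 0.9),
    (13.2, 0.7),
    (11.8, 0.4),
]

estimate = sum(pos * w for pos, w in guesses) / sum(w for _, w in guesses)
print(round(estimate, 2))  # 13.2: a single point estimate from four noisy guesses
```

No individual expert needs to be right; the hope is that their errors partially cancel, leaving the combined estimate closer to the truth than any single guess.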
It is said that on returning from a day at the races, a certain Lord Falmouth was asked by a friend how he had fared. “I’m quits on the day”, came the triumphant reply. “You mean by that,” asked the friend, “that you are glad when you are quits?” When the said Lord replied that indeed he was, his companion suggested that there was a far easier way of breaking even, and without the trouble or annoyance. “By not betting at all!” The noble lord said that he had never looked at it like that and, according to legend, gave up betting from that very moment.
While this may well serve as a very instructive tale for many, a certain Edward O. Thorp, writing in 1962, took a rather different view. He had devised a strategy, based on probability theory, for consistently beating the house at Blackjack (or ‘21’). In his book, ‘Beat the Dealer: A Winning Strategy for the Game of Twenty-One’, Thorp presents the system. On the inside cover of the dust jacket he claims that “the player can gain and keep a decided advantage over the house by relying on the strategy”.
The basic rules of blackjack are simple. To win a round, the player must finish with a card total that beats the dealer’s without exceeding 21 (the player also wins if the dealer goes bust).
Because players have choices to make, most obviously as to whether to take another card or not, there is an optimal strategy for playing the game. The precise strategy depends on the house rules, but generally speaking it pays, for example, to hit (take another card) when the total of your cards is 14 and the dealer’s face-up card is 7 or higher. If the dealer’s face-up card is a 6 or lower, on the other hand, you should stand (decline another card). This is known as ‘basic strategy.’
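The rule just quoted can be written as a one-line function. This covers only the hard-14 example from the text; a full basic-strategy table covers every combination of player total and dealer up-card:

```python
# The single basic-strategy rule from the text: with a hard total of 14,
# hit against a dealer up-card of 7 or higher, otherwise stand.
def hard_14_action(dealer_up_card):
    """dealer_up_card: 2-11, with the ace counted as 11."""
    return "hit" if dealer_up_card >= 7 else "stand"

print(hard_14_action(9))   # hit
print(hard_14_action(5))   # stand
```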
While basic strategy will reduce the house edge, it is generally not enough to turn the edge in the player’s favour. That requires exploitation of the additional factor inherent in the tradition that the used cards are put to one side and not shuffled back into the deck.
This means that by counting which cards have been removed from the deck, we can re-evaluate the probabilities of particular cards or card values being dealt as play continues. For example, a disproportionate number of high cards remaining in the deck is good for the player, not least because in those situations where the rules oblige the house to take a card, a plethora of remaining high cards increases the dealer’s probability of going bust (exceeding a total of 21).
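The flavour of such a count can be sketched in a few lines. This is the simple high-low count (low cards +1, high cards -1), a later refinement rather than Thorp’s original ten-count system; a positive running count signals a deck rich in high cards, and therefore favourable to the player:

```python
# High-low running count: 2-6 score +1, 7-9 score 0, tens and aces score -1.
# Ranks are encoded as 2-14, with 11-13 standing for J, Q, K and 14 for the ace.
HI_LO = {r: (1 if r <= 6 else 0 if r <= 9 else -1) for r in range(2, 15)}

def running_count(cards_seen):
    """Sum the count values of every card dealt so far."""
    return sum(HI_LO[rank] for rank in cards_seen)

print(running_count([4, 5, 6, 10]))     # 2: three low cards gone, one ten
print(running_count([14, 13, 10, 2]))   # -2: high cards leaving hurts the player
```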
Thorp’s genius was in devising a method of reducing this strategy to a few simple rules which could be understood, memorized and made operational by the average player in real time. As the book blurb puts it, “The presentation of the system lends itself readily to the rapid play normally encountered in the casinos.”
Since the publication of the book, the strategy has been amended and improved, but Ed Thorp’s original insights stand. The problem simply changed to one familiar to many successful horse players, i.e. how to get your money on before being closed down. That’s another story, for another day. As is the time I met up with the great man himself, and the time, years later, when I shared a pint with some of the legendary MIT blackjack team.