1968-schroeder.pdf: “Soviet Reality Sans Potemkin: The amenities of Moscow from the native point of view”, Gertrude Schroeder (1968):
[Dormin111 summary (emphasis added): The link is to a declassified CIA document written in 1968….The author is a CIA spy who explains that the CIA was trying to calculate the economy of the USSR, and by their best estimates, the US GDP was more than 2× the USSR GDP, and the US GDP per capita was around 3×. However, she thinks these numbers are overestimating USSR GDP because it’s difficult to account for quality. An American haircut can be priced the same way as a Soviet haircut, but an American refrigerator is probably vastly better than a Soviet refrigerator.
So the author goes undercover in Moscow for a few months to live as the Russians do and see what economic life is really like for them. She explains that she tried to live the Russian way as an American working at the embassy, but the locals were super nice to her all the time. They always smiled and sent her to the front of every line. So she had to get beat-up local clothes, dust off her Russian language skills, put on a grumpy expression (presumably), and pretend to be a Russian (or rather, pretend to be an Estonian due to her accent). Her findings:
- Lines, lines, and more lines. Everyone had to wait on line for everything. Food, clothes, whatever. Wait times were 10–15 minutes at best, but could easily stretch into hours. Sometimes she waited on lines even when she didn’t know what she was waiting for.
- Even in Moscow, the variety and quality of goods was atrocious. At a given grocer, they might offer two or three different items each day. So one day she could get pickled fish, the next day cabbage and tomatoes, the next day rice, etc. Bread seemed to be the only thing that was always in stock.
- Stores were often bureaucratic clusterfucks. The author couldn’t just buy tea; she had to wait on one line to make a tea selection, then collect a piece of paper, then wait on another line to exchange the paper for money, then get another piece of paper, then wait on another line to exchange that piece of paper for tea.
- Prices were outrageous by the standards of the salaries of the people working in the capital city of the second most powerful nation on earth.
- The service was awful. She went to some of the nicest restaurants in Moscow (of which there were fewer than a dozen in a city of millions of people), and the meals would take at least 3 hours. Waiters would stand around doing nothing and wouldn’t come over to her even when she called them.
- Everyone was incredibly rude. She was violently shoved on trains and buses. People screamed at her if she hesitated on lines.
- There was a general feeling of boredom and malaise. There were no luxury goods to buy or events to look forward to. People expected to just live, get married, and wait to die.
- Everyone knew the government reports about how awesome the USSR was doing were bullshit. They envied the West.]
1969-brooks-businessadventures-ch3-thefederalincometax.pdf: “Business Adventures: Twelve Classic Tales from the World of Wall Street: Chapter 3: The Federal Income Tax: Its History and Peculiarities”, John Brooks (1969):
[Profile by the veteran economics New Yorker reporter John Brooks of the post-WWII American federal income tax (history, effects & reform attempts), which taxed incomes at rates as high as 91%, but was riddled with bizarre loopholes and exceptions which meant the real tax rates were typically half that—at the cost of massively distorting human behavior.
Stars would stop performing halfway through the year; marriages would be scheduled for the right month; films were designed around tax incentives rather than merits; countless oil wells were drilled unnecessarily; rich people would invest in businesses of no interest to them, like bowling alleys; and businessmen had to meticulously record every lunch because the income tax distorted salaries in favor of fringe benefits. (A similar dynamic was at play in the rise of employer-based health insurance during WWII, contributing to the present grotesquely inefficient American healthcare system.)]
…the writer David T. Bazelon has suggested that the economic effect of the tax has been so sweeping as to create two quite separate kinds of United States currency—before-tax money and after-tax money. At any rate, no corporation is ever formed, nor are any corporation’s affairs conducted for as much as a single day, without the lavishing of earnest consideration upon the income tax, and hardly anyone in any income group can get by without thinking of it occasionally, while some people, of course, have had their fortunes or their reputations, or both, ruined as a result of their failure to comply with it. As far afield as Venice, an American visitor a few years ago was jolted to find on a brass plaque affixed to a coin box for contributions to the maintenance fund of the Basilica of San Marco the words “Deductible for U.S. Income-Tax Purposes.”
1974-feldstein.pdf: “Unemployment Compensation: Adverse Incentives And Distributional Anomalies”, Martin Feldstein (1974-06):
The current system of unemployment compensation entails very strong adverse incentives. For a wide variety of “representative” unemployed workers, unemployment benefits replace more than 60 per cent of lost net income. In the more generous states, the replacement rate is over 80 per cent for men and over 100 per cent for women. Most of the $5 billion (1974 dollars; ≈$19 billion today) in benefits goes to middle and upper income families. This anomaly in the distribution of benefits is exacerbated by the fact that unemployment compensation benefits are not subject to tax.
1982-hax.pdf: “Competitive Cost Dynamics: The Experience Curve”, Arnoldo C. Hax, Nicolas S. Majluf (1982-10-01):
This is the first of 3 articles about some popular tools that have been widely used since the early 1970’s to support strategic decision-making. The article below deals with the experience curve; subsequent articles will deal with the growth-share matrix and the industry attractiveness-business strength matrix. These tools have inspired a degree of controversy about their uses and limitations, issues that will be explored in this and the subsequent articles.
(This is the first of the tutorial articles we will be publishing in Interfaces. The objective of a tutorial article is to describe an important technique or an application area for Interfaces readers who are nonexperts in the field. Please write and let me know what area(s) you would like to see covered in tutorial articles (and who you would like to see write them) and what area(s) you would be prepared to cover in a tutorial of your own.)
1987-rogowski.pdf: “Political Cleavages and Changing Exposure to Trade”, Ronald Rogowski (1987-12):
Combining the classical theorem of Stolper and Samuelson with a model of politics derived from Becker leads to the conclusion that exogenous changes in the risks or costs of countries’ external trade will stimulate domestic conflict between owners of locally scarce and locally abundant factors. A traditional three-factor model then predicts quite specific coalitions and cleavages among owners of land, labor, and capital, depending only on the given country’s level of economic development and its land-labor ratio. A preliminary survey of historical periods of expanding and contracting trade, and of such specific cases as the German “marriage of iron and rye,” U.S. and Latin American populism, and Asian socialism, suggests the accuracy of this hypothesis. While the importance of such other factors as cultural divisions and political inheritance cannot be denied, the role of exogenous changes in the risks and costs of trade deserves further investigation.
1988-gehr.pdf: “Undated Futures Markets”, Adam K. Gehr Jr. (1988-02-01):
This article discusses the mechanics, economics, advantages and disadvantages of undated futures markets with specific reference to the Chinese Gold and Silver Exchange Society of Hong Kong (CGSES). It also suggests a potential application of undated futures markets to the trading of stock index futures.
An undated futures market is an alternative to conventional futures markets. In conventional futures markets contracts mature at selected times during the year. Several contracts with different maturity dates trade simultaneously. In an undated futures market only a single contract trades, but that contract can serve the hedging purposes of the multiple contracts traded in a dated market. The CGSES is of interest both as a curiosum and as an example of a potentially valuable form of futures market.
The following section of this paper describes the operation of an undated futures market and the specific mechanics of trading on the CGSES. Sections II and III discuss the economics of price determination and hedging in undated futures markets. Section IV describes the advantages of undated futures markets to futures traders and points out potential problems in certain applications. Section V shows how such markets might be adapted to the US, especially for trading futures on a stock index or on other indices, and Section VI is a conclusion.
1989-david.pdf: “Computer and Dynamo: The Modern Productivity Paradox In A Not-Too Distant Mirror”, Paul A. David (1989-07-01):
Many observers of contemporary economic trends have been perplexed by the contemporary conjuncture of rapid technological innovation with disappointingly slow gains in measured productivity. The purpose of this essay is to show modern economists, and others who share their puzzlement in this matter, the direct relevance to their concerns of historical studies that trace the evolution of techno-economic regimes formed around “general purpose engines”. For this purpose an explicit parallel is drawn between two such engines—the computer and the dynamo. Although the analogy between information technology and electrical technology would have many limitations were it to be interpreted very literally, it nevertheless proves illuminating. Each of the principal empirical phenomena that go to make up modern perceptions of a “productivity paradox”, had a striking historical precedent in the conditions that obtained a little less than a century ago in the industrialized West. In 1900 contemporaries might well have said that the electric dynamos were to be seen “everywhere but in the economic statistics”. Exploring the reasons for that state of affairs, and the features of commonality between computer and dynamo—particularly in the dynamics of their diffusion and their incremental improvement, and the problems of capturing their initial effects with conventional productivity measures—provides some clues to help understand our current situation. The paper stresses the importance of keeping an appropriately long time-frame in mind when discussing the connections between the information revolution and productivity growth, as well as appreciating the contingent, path-dependent nature of the process of transition between one techno-economic regime and the next. [Keywords: productivity slowdown; diffusion of innovations; economics of technology; information technology; electric power industry]
1990-david.pdf: “The Dynamo and the Computer: A Historical Perspective on the Modern Productivity Paradox”, Paul A. David (1990-05-01):
Many observers of recent trends in the industrialized economies of the West have been perplexed by the conjuncture of rapid technological innovation with disappointingly slow gains in measured productivity. A generation of economists who were brought up to identify increases in total factor productivity indexes with “technical progress” has found it quite paradoxical for the growth accountants’ residual measure of “the advance of knowledge” to have vanished at the very same time that a wave of major innovations was appearing: in microelectronics, in communications technologies based on lasers and fiber optics, in composite materials, and in biotechnology…This latter aspect of the so-called “productivity paradox” attained popular currency in the succinct formulation attributed to Robert Solow: “We see the computers everywhere but in the productivity statistics.”
…If, however, we are prepared to approach the matter from the perspective afforded by the economic history of the large technical systems characteristic of network industries, and to keep in mind a time-scale appropriate for thinking about transitions from established technological regimes to their respective successor regimes, many features of the so-called productivity paradox will be found to be neither so unprecedented nor so puzzling as they might otherwise appear.
…Computer and dynamo each form the nodal elements of physically distributed (transmission) networks. Both occupy key positions in a web of strongly complementary technical relationships that give rise to “network externality effects” of various kinds, and so make issues of compatibility standardization important for business strategy and public policy (see my 1987 paper and my paper with Julie Bunn, 1988). In both instances, we can recognize the emergence of an extended trajectory of incremental technical improvements, the gradual and protracted process of diffusion into widespread use, and the confluence with other streams of technological innovation, all of which are interdependent features of the dynamic process through which a general purpose engine acquires a broad domain of specific applications (see Timothy Bresnahan and Manuel Trajtenberg, 1989). Moreover, each of the principal empirical phenomena that make up modern perceptions of a productivity paradox had its striking historical precedent in the conditions that obtained a little less than a century ago in the industrialized West, including the pronounced slowdown in industrial and aggregate productivity growth experienced during the 1890–1913 era by the two leading industrial countries, Britain and the United States (see my 1989 paper, pp. 12–15, for details). In 1900, contemporary observers well might have remarked that the electric dynamos were to be seen “everywhere but in the productivity statistics!”
Two proposals are made that may facilitate the creation of derivative market instruments, such as futures contracts, cash settled based on economic indices.
The first proposal concerns index number construction: indices based on infrequent measurements of nonstandardized items may control for quality change by using a hedonic repeated measures method, an index number construction method that follows individual assets or subjects through time and also takes account of measured quality variables.
The second proposal is to establish markets for perpetual claims on cash flows matching indices of dividends or rents. Such markets may help us to measure the prices of the assets generating these dividends or rents even when the underlying asset prices are difficult or impossible to observe directly. A perpetual futures contract is proposed that would cash settle every day in terms of both the change in the futures price and the dividend or rent index for that day.
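The proposed daily settlement rule can be sketched numerically. The function and price path below are purely illustrative; the abstract specifies only that each day's cash flow combines the change in the futures price with that day's dividend or rent index.

```python
def daily_settlement(f_prev, f_today, dividend_today):
    """One day's cash flow to a long position in a perpetual futures
    contract: the change in the futures price plus the day's
    dividend/rent index payout (the short pays this amount)."""
    return (f_today - f_prev) + dividend_today

# Hypothetical 4-day futures price path and daily rent-index values:
prices = [100.0, 101.5, 100.8, 102.0]
rents = [0.04, 0.04, 0.05]
flows = [round(daily_settlement(prices[i], prices[i + 1], rents[i]), 2)
         for i in range(len(rents))]
print(flows)  # → [1.54, -0.66, 1.25]
```

Because the contract never expires, the long effectively holds a claim on the perpetual dividend/rent stream, so the futures price should track the (possibly unobservable) price of the underlying asset.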
1996-dempsey.pdf: “Taxi Industry Regulation, Deregulation, and Reregulation: The Paradox of Market Failure”, Paul Stephen Dempsey (1996):
During the last fifteen years, Congress has deregulated, wholly or partly, a number of infrastructure industries, including most modes of transport—airlines, motor carriers, railroads, and intercity bus companies. Deregulation emerged in a comprehensive ideological movement which abhorred governmental pricing and entry controls as manifestly causing waste and inefficiency, while denying consumers the range of price and service options they desire.
In a nation dedicated to free market capitalism, governmental restraints on the freedom to enter a business, or to let the competitive market set prices, seem fundamentally at odds with immutable notions of economic liberty. While market failure gave birth to economic regulation of infrastructure industries in the late 19th and early 20th centuries, today we live in an era where the conventional wisdom is that government can do little good and the market can do little wrong.
Despite this passionate and powerful contemporary political/economic ideological movement, one mode of transportation has come full circle from regulation, through deregulation, and back again to regulation—the taxi industry. American cities began regulating local taxi firms in the 1920s. Beginning a half century later, more than 20 cities, most located in the Sunbelt, totally or partially deregulated their taxi companies. However, the experience with taxicab deregulation was so profoundly unsatisfactory that virtually every city that embraced it has since jettisoned it in favor of resumed economic regulation.
Today, nearly all large and medium-sized communities regulate their local taxicab companies. Typically, regulation of taxicabs involves: (1) limited entry (restricting the number of firms, and/or the ratio of taxis to population), usually under a standard of “public convenience and necessity” [PC&N]; (2) just, reasonable, and non-discriminatory fares; (3) service standards (e.g., vehicular and driver safety standards, as well as a common carrier obligation of non-discriminatory service, 24-hour radio dispatch capability, and a minimum level of response time); and (4) financial responsibility standards (e.g., insurance).
This article explores the legal, historical, economic, and philosophical bases of regulation and deregulation in the taxi industry, as well as the empirical results of taxi deregulation. The paradoxical metamorphosis from regulation, to deregulation, and back again, to regulation is an interesting case study of the collision of economic theory and ideology, with empirical reality. We begin with a look at the historical origins of taxi regulation.
[Keywords: Urban Transportation, Taxi Industry, Common Carrier, Mass Transit, Taxi Industry Regulation, Taxi Deregulation, Reregulation, Taxicab Ordinance, PUC, Open Entry, Regulated Entry, Operating Efficiency, Destructive Competition, Regulated Competition, Cross Subsidy, Cream Skimming, PC&N, Pollution, Cabs]
1998-brynjolfsson.pdf: “Beyond the productivity paradox”, Erik Brynjolfsson, Lorin M. Hitt (1998-08-01):
…What We Now Know About Computers and Productivity: Research on computers and productivity is entering a new phase. While the first wave of studies sought to document the relationship between investments in computers and increases in productivity, new research is focusing on how to make more computerization effective. Computerization does not automatically increase productivity, but it is an essential component of a broader system of organizational changes which does increase productivity. As the impact of computers becomes greater and more pervasive, it is increasingly important to consider these organizational changes as an integral part of the computerization process.
This is not the first time that a major general purpose technology like computers required an expensive and time-consuming period of restructuring. Substantial productivity improvement from electric motors did not emerge until almost 40 years after their introduction into factories. The first use involved swapping gargantuan motors for large steam engines with no redesign of work processes. The big productivity gains came when engineers realized that the factory layout no longer had to be dictated by the placement of power transmitting shafts and rods. They re-engineered the factory so that machines were distributed throughout the factory, each driven by a separate, small electric motor. This made it possible to arrange the machines in accordance with the logic of work flow instead of in proximity to the central power unit.
It has also taken some time for businesses to realize the transformative potential of information technology to revolutionize work. However, the statistical evidence suggests that revolution is occurring much more quickly this time.
1998-delong.pdf: “Estimates of World GDP, One Million B.C.–Present”, J. Bradford DeLong (1998-01-01):
I construct estimates of world GDP over the very long run by combining estimates of total human populations with estimates of levels of real GDP per capita.
1998-hamilton.pdf: “The True Cost of Living: 1974–1991”, Bruce W. Hamilton (1998-05-19):
The first purpose of this paper is to utilize the PSID to see whether the anomalies of Figure 1 and Figure 2 can be attributed to some non-CPI cause such as demographics or changes in the distribution of income. The second purpose is to offer a more refined estimate of CPI bias. Third, I will present evidence of strikingly different inflation rates by race. Using the PSID, I estimate a demand function for food at home for 1974 through 1991. Using a standard measure of real income (total family income after federal taxes, the PSID’s best continuously available approximation of disposable income) deflated by the CPI, this demand function has shown consistent drift over the sample period; I attribute this drift to unmeasured growth in real income, and in turn I attribute the mismeasurement of income to CPI bias.
In a nutshell, the results are as follows: On average, in 1974 the PSID sample of white households spent 16.64% of its income on at-home food. By 1991 this share had fallen to 12.04%. Measured per-household income grew 7% over this time span, explaining just over half a point of the food-share decline. Decline in the relative CPI of food is sufficient to explain perhaps as much as 1 percentage point of decline in food’s share. Other regressors account for less than 0.1 point of additional decline; thus about 3 points of the food-share decline are left to be explained by CPI bias. I estimate that this bias is about 2.5% per year from 1974 through 1981, and slightly under 1% per year since then.
For blacks, food’s share fell from 21.17% to 12.44%. Approximately 0.8 point of the decline can be explained by measured income growth, and another point by movement in other regressors, and up to another 1 point by the decline in the food CPI. Thus the food-share decline left to be explained by measurement error is 5.9 points. I estimate the bias to be approximately 4% per year from 1974 through 1981 and about 3% per year since then.
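The share-decline decompositions in the two paragraphs above can be checked with simple arithmetic; a minimal sketch, using only the numbers quoted in the abstract:

```python
def unexplained_share_decline(start, end, explained):
    """Food-share decline (in percentage points) left unexplained
    after subtracting the components attributed to measured income
    growth, other regressors, and the relative food CPI."""
    return (start - end) - sum(explained)

# White households, 1974-1991: 16.64% -> 12.04%; explained pieces are
# ~0.5 pt (income growth), ~1.0 pt (food CPI), ~0.1 pt (other regressors).
white = round(unexplained_share_decline(16.64, 12.04, [0.5, 1.0, 0.1]), 2)
# Black households: 21.17% -> 12.44%; ~0.8 + 1.0 + 1.0 pts explained.
black = round(unexplained_share_decline(21.17, 12.44, [0.8, 1.0, 1.0]), 2)
print(white, black)  # → 3.0 5.93
```

The residuals of ~3.0 and ~5.9 points match the CPI-bias attributions stated in the abstract.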
2000-brynjolfsson.pdf: “Beyond Computation: Information Technology, Organizational Transformation and Business Performance”, Erik Brynjolfsson, Lorin M. Hitt (2000-09):
To understand the economic value of computers, one must broaden the traditional definition of both the technology and its effects. Case studies and firm-level econometric evidence suggest that: 1) organizational “investments” have a large influence on the value of IT investments; and 2) the benefits of IT investment are often intangible and disproportionately difficult to measure. Our analysis suggests that the link between IT and increased productivity emerged well before the recent surge in the aggregate productivity statistics and that the current macroeconomic productivity revival may in part reflect the contributions of intangible capital accumulated in the past.
2000-hanson.pdf: “Long-Term Growth As A Sequence of Exponential Models”, Robin Hanson (2000-12):
A world product time series covering two million years is well fit by either a sum of four exponentials, or a constant elasticity of substitution (CES) combination of three exponential growth modes: “hunting,” “farming,” and “industry.” The CES parameters suggest that farming substituted for hunting, while industry complemented farming, making the industrial revolution a smoother transition. Each mode grew world product by a factor of a few hundred, and grew a hundred times faster than its predecessor. This weakly suggests that within the next century a new mode might appear with a doubling time measured in days, not years.
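The model and the closing doubling-time claim can be sketched numerically. The parameters below are purely illustrative, not Hanson's fitted values:

```python
import math

def world_product(t, modes):
    """Sum-of-exponentials growth model: each mode contributes
    a * exp(b * t) to world product at time t."""
    return sum(a * math.exp(b * t) for a, b in modes)

# Doubling time of a mode with growth rate b is ln(2)/b, so a successor
# mode growing ~100x faster than its predecessor doubles ~100x sooner:
b_industry = math.log(2) / 15.0   # assume a ~15-year industrial doubling time
b_next = 100 * b_industry         # hypothetical next growth mode
days = math.log(2) / b_next * 365
print(round(days))  # → 55
```

Under these assumed parameters the successor mode's doubling time is measured in weeks rather than years, which is the sense of the abstract's final sentence.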
This paper provides the first estimates of overall CPI bias prior to the 1970s and new estimates of bias since the 1970s. It finds that annual CPI bias was −0.1% between 1888 and 1919 and rose to 0.7% between 1919 and 1935. Annual CPI bias was 0.4% in the 1960s and then rose to 2.7% between 1972 and 1982 before falling to 0.6% between 1982 and 1994. The findings imply that we have underestimated growth rates in true income in the 1920s and 1930s and in the 1970s.
2001-fehr.pdf: “Do Incentive Contracts Crowd Out Voluntary Cooperation?”, Ernst Fehr, Simon Gächter (2001-11-05):
In this paper we provide experimental evidence indicating that incentive contracts may cause a strong crowding out of voluntary cooperation. This crowding-out effect constitutes costs of incentive provision that have been largely neglected by economists. In our experiments the crowding-out effect is so strong that the incentive contracts are less efficient than contracts without any incentives. Principals, nonetheless, prefer the incentive contracts because they allow them to appropriate a much larger share of the (smaller) total surplus and are, hence, more profitable for them.
2001-warner.pdf: “The Personal Discount Rate: Evidence from Military Downsizing Programs”, John T. Warner, Saul Pleeter (2001-03):
The military drawdown program of the early 1990’s provides an opportunity to obtain estimates of personal discount rates based on large numbers of people making real choices involving large sums. The program offered over 65,000 separatees the choice between an annuity and a lump-sum payment. Despite break-even discount rates exceeding 17%, most of the separatees selected the lump sum—saving taxpayers $1.7 billion (2001 dollars; ≈$2.7 billion today) in separation costs. Estimates of discount rates range from 0 to over 30% and vary with education, age, race, sex, number of dependents, ability test score, and the size of payment.
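The break-even logic behind that 17% figure can be sketched as follows. The dollar amounts and horizon are hypothetical (the actual program's annuity formulas depended on pay grade and years of service), but the method is the standard one: find the discount rate at which the annuity's present value equals the lump sum.

```python
def annuity_pv(payment, years, r):
    """Present value of an annual payment stream at discount rate r."""
    return sum(payment / (1 + r) ** t for t in range(1, years + 1))

def breakeven_rate(lump_sum, payment, years, lo=0.0, hi=1.0):
    """Discount rate at which the annuity's PV equals the lump sum,
    found by bisection. A separatee whose personal discount rate
    exceeds this break-even rate does better taking the lump sum."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if annuity_pv(payment, years, mid) > lump_sum:
            lo = mid  # annuity still worth more: rate must be higher
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical separatee: $50,000 lump sum vs. $8,000/year for 18 years.
r = breakeven_rate(50_000, 8_000, 18)
print(f"{r:.1%}")
```

Choosing the lump sum at a break-even rate this high implies a personal discount rate far above prevailing market interest rates, which is the paper's central finding.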
2003-ansolabehere.pdf: “Why is There so Little Money in U.S. Politics?”, Stephen Ansolabehere, John M. de Figueiredo, James M. Snyder Jr. (2003-12-01):
Two extreme views bracket the range of thinking about the amount of money in U.S. political campaigns. At one extreme is the theory that contributors wield considerable influence over legislators. Even modest contributions may be cause for concern and regulation, given the extremely large costs and benefits that are levied and granted by government. An alternative view holds that contributors gain relatively little political leverage from their donations, since the links from an individual campaign contribution to the election prospects of candidates and to the decisions of individual legislators are not very firm. Although these theories have different implications, they share a common perspective that campaign contributions should be considered as investments in a political marketplace, where a return on that investment is expected.
In this paper, we begin by offering an overview of the sources and amounts of campaign contributions in the U.S. In the light of these facts, we explore the assumption that the amount of money in U.S. campaigns mainly reflects political investment. We then offer our perspective that campaign contributions should be viewed primarily as a type of consumption good, rather than as a market for buying political benefits. Although this perspective helps to explain the levels of campaign contributions by individuals and organizations, it opens up new research questions of its own.
2003-brynjolfsson.pdf: “Computing Productivity: Firm-Level Evidence”, Erik Brynjolfsson, Lorin M. Hitt (2003-11-01):
We explore the effect of computerization on productivity and output growth using data from 527 large U.S. firms over 1987–1994. We find that computerization makes a contribution to measured productivity and output growth in the short term (using 1-year differences) that is consistent with normal returns to computer investments. However, the productivity and output contributions associated with computerization are up to 5 times greater over long periods (using 5- to 7-year differences). The results suggest that the observed contribution of computerization is accompanied by relatively large and time-consuming investments in complementary inputs, such as organizational capital, that may be omitted in conventional calculations of productivity. The large long-run contribution of computers and their associated complements that we uncover may partially explain the subsequent investment surge in computers in the late 1990s.
2003-ruhm.pdf: “Good times make you sick”, Christopher J. Ruhm (2003-07):
This study uses microdata from the 1972–1981 National Health Interview Surveys (NHIS) to examine how health status and medical care utilization fluctuate with state macroeconomic conditions. Personal characteristics, location fixed-effects, general time effects and (usually) state-specific time trends are controlled for. The major finding is that there is a counter-cyclical variation in physical health that is especially pronounced for individuals of prime-working age, employed persons, and males. The negative health effects of economic expansions persist or accumulate over time, are larger for acute than chronic ailments, and occur despite a protective effect of income and a possible increase in the use of medical care. Finally, there is some suggestion that mental health may be procyclical, in sharp contrast to physical well-being. [Keywords: Health status, Morbidity, Macroeconomic conditions.]
2004-ziobrowski.pdf: “Abnormal Returns from the Common Stock Investments of the U.S. Senate”, Alan J. Ziobrowski, Ping Cheng, James W. Boyd, Brigitte J. Ziobrowski (2004-12-01):
The actions of the federal government can have a profound impact on financial markets. As prominent participants in the government decision making process, U.S. Senators are likely to have knowledge of forthcoming government actions before the information becomes public. This could provide them with an informational advantage over other investors. We test for abnormal returns from the common stock investments of members of the U.S. Senate during the period 1993–1998. We document that a portfolio that mimics the purchases of U.S. Senators beats the market by 85 basis points per month, while a portfolio that mimics the sales of Senators lags the market by 12 basis points per month. The large difference in the returns of stocks bought and sold (nearly one percentage point per month) is economically large and reliably positive.
2005-smilansky.pdf: “The Paradox Of Beneficial Retirement”, Saul Smilansky (2005-09-01):
Morally, when should one retire from one’s job? The surprising answer may be ‘now’. It is commonly assumed that for a person who has acquired professional training at some personal effort, is employed in a task that society considers useful, and is working hard at it, no moral problem arises about whether that person should continue working. I argue that this may be a mistake: within many professions and pursuits, each one among the majority of those positive, productive, hard working people ought to consider leaving his or her job.
2006-hong.pdf: “Asset Float and Speculative Bubbles”, Harrison Hong, José Scheinkman, Wei Xiong (2006-05-16):
We model the relationship between asset float (tradeable shares) and speculative bubbles.
Investors with heterogeneous beliefs and short-sales constraints trade a stock with limited float because of insider lockups. A bubble arises as price overweighs optimists’ beliefs and investors anticipate the option to resell to those with even higher valuations. The bubble’s size depends on float as investors anticipate an increase in float with lockup expirations and speculate over the degree of insider selling.
Consistent with the Internet experience, the bubble, turnover, and volatility decrease with float and prices drop on the lockup expiration date.
2006-mackenzie.pdf: “Is economics performative? Option theory and the construction of derivatives markets”, Donald Mackenzie (2006-08-23):
The thesis that economics is “performative” (Callon 1998) has provoked much interest but also some puzzlement and not a little confusion. The purpose of this article is to examine from the viewpoint of performativity one of the most successful areas of modern economics, the theory of options, and in so doing hopefully to clarify some of the issues at stake. To claim that economics is performative is to argue that it does things, rather than simply describing (with greater or lesser degrees of accuracy) an external reality that is not affected by economics. But what does economics do, and what are the effects of it doing what it does?
That the theory of options is an appropriate place around which to look for performativity is suggested by two roughly concurrent developments. Since the 1950s, the academic study of finance has been transformed from a low-status, primarily descriptive activity to a high-status, analytical, mathematical, Nobel-prize-winning enterprise. At the core of that enterprise is a theoretical account of options dating from the start of the 1970s (Black-Scholes). Around option theory there has developed a large array of sophisticated mathematical analyses of financial derivatives. (A “derivative” is a contract or security, such as an option, the value of which depends upon the price of another asset or upon the level of an index or interest rate.)
…Away from the hubbub, computers were used to generate Black-Scholes prices. Those prices were reproduced on sets of paper sheets which floor traders could carry around, often tightly wound cylindrically with only immediately relevant rows visible so that a quick squint would reveal the relevant price. While some individual traders and trading firms produced their own sheets, others used commercial services. Perhaps the most widely used sheets were sold by Fischer Black himself: see figure 2. Each month, Black would produce computer-generated sheets of theoretical prices for all the options traded on U.S. options exchanges, and have them photocopied and sent to those who subscribed to his pricing service. In 1975, for example, sheets for 100 stocks, with 3 volatility estimates for each stock, cost $300 per month (~$1,170 in current dollars), while a basic service with one stock and one volatility estimate cost $15 per month (~$59) (Black 1975b, “The Option Service: An Introduction”).
At first sight, Black’s sheets look like monotonous arrays of figures. They were, however, beautifully designed for their intended role in “distributed cognition” (Hutchins 1995a and b). Black included what options traders using the Black-Scholes-Merton model needed to know, but no more than they needed to know—there is virtually no redundant information on a sheet—hence their easy portability. He found an ad hoc but satisfactory way of dealing with the consequences of dividends for option pricing (an issue not addressed in the original version of the model), and devoted particular care to the crucial matter of the estimation of volatility. Even the physical size of the sheets was well-judged. Prices had first to be printed on the large computer line-printer paper of the period, but they were then photo-reduced onto standard-sized paper, differently colored for options traded on the different exchanges. The resultant sheets were small enough for easy handling, but not so small that the figures became too hard to read (the reproduction in figure 2 is smaller than full-scale).
How were Black’s sheets and similar option pricing services used? They could, of course, simply be used to set option prices. In April 1976, options trading began on the Pacific Stock Exchange in San Francisco, and financial economist Mark Rubinstein became a trader there. He told me in an interview that he found his fellow traders on the new exchange initially heavily reliant on Black’s sheets: “I walked up [to the most active option trading ‘crowd’] and looked at the screen [of market prices] and at the sheet and it was identical. I said to myself, ‘academics have triumphed’” (Rubinstein 2000).
2007-doran.pdf: “So You Discovered an Anomaly … Gonna Publish It? An Investigation Into the Rationality of Publishing a Market Anomaly”, James Doran, Colbrin A. Wright (2007-01-11):
If publishing an anomaly leads to the dissipation of its profitability, a notion that has mounting empirical support, then publishing a highly profitable market anomaly seems to be irrational behavior. This paper explores the issue by developing and empirically testing a theory that argues that publishing a market anomaly may, in fact, be rational behavior. The theory predicts that researchers with few (many) publications and lesser (stronger) reputations have the highest (lowest) incentive to publish market anomalies. Employing probit models, simple OLS regressions, and principal component analysis, we show that (a) market anomalies are more likely to be published by researchers with fewer previous publications and who have been in the field for a shorter period of time and (b) the profitability of published market anomalies is inversely related to the common factor spanning the number of publications the author has and the number of years that have elapsed since the professor earned his Ph.D. The empirical results suggest that the probability of publishing an anomaly and the profitability of anomalies that are published are inversely related to the reputation of the authors. These results corroborate the theory that publishing an anomaly is rational behavior for an author trying to establish his or her reputation.
2007-schneider.pdf: “A Rule Against Perpetuities For The Twenty-First Century”, Frederick R. Schneider (2007):
The common law rule against perpetuities maintained the alienability of property by voiding interests in property that did not vest within a life in being at the creation of the interest plus twenty-one years. The rule was applied strictly, often producing harsh results. The courts used a what-might-happen test to strike down nonvested interests that might not have vested in a timely manner. During the last half-century, many legislatures have softened the application of the rule against perpetuities by enacting wait-and-see provisions, which require courts to decide cases based on the facts as they actually developed, and reformation, which allowed some nonvested interests to be reformed to save them from invalidity.
This paper describes the common law rule. Then it traces the modern developments, including promulgation of the widely adopted Uniform Statutory Rule Against Perpetuities, which includes an alternate 90 year fixed wait-and-see period to be applied in place of the common law’s lives in being plus twenty-one years.
The paper continues by exploring the policies which underlie the rule against perpetuities. Then, after finding that there is no substantial movement to repeal the rule except for trusts, it establishes that federal law, including federal transfer taxes, cannot and should not be used to implement the policies served by the rule itself.
There is a continuing need for state rules against perpetuities. The paper proposes that the rule be modified to make it more understandable and easier to apply. The proposed rule would replace lives in being plus twenty-one years with a fixed term of years. This would eliminate most of the difficulties encountered in application of the rule. Wait-and-see and reformation are part of the proposed rule. The proposed rule provides for determination of valid interests at the end of the fixed term of years and contains a definition of “vested” to enable judges and attorneys to apply the rule in cases which will arise many years in the future.
2008-hanson.pdf: “Showing that you care: The evolution of health altruism”, Robin Hanson (2008-01-01):
Human behavior regarding medicine seems strange; assumptions and models that seem workable in other areas seem less so in medicine. Perhaps, we need to rethink the basics. Toward this end, I have collected many puzzling stylized facts about behavior regarding medicine, and have sought a small number of simple assumptions which might together account for as many puzzles as possible.
The puzzles I consider include a willingness to provide more medical than other assistance to associates, a desire to be seen as so providing, support for nation-, firm-, or family-provided medical care, placebo benefits of medicine, a small average health value of additional medical spending relative to other health influences, more interest in public than private signals of medical quality, medical spending as an individual necessity but national luxury, a strong stress-mediated health status correlation, and support for regulating health behaviors of the low status. These phenomena seem widespread across time and cultures.
I can explain these puzzles moderately well by assuming that humans evolved deep medical habits long ago in an environment where people gained higher status by having more allies, honestly cared about those who remained allies, were unsure who would remain allies, wanted to seem reliable allies, inferred such reliability in part based on who helped who with health crises, tended to suffer more crises requiring non-health investments when having fewer allies, and invested more in cementing allies in good times in order to rely more on them in hard times.
These ancient habits would induce modern humans to treat medical care as a way to show that you care. Medical care provided by our allies would reassure us of their concern, and allies would want you and other allies to see that they had paid enough to distinguish themselves from posers who didn’t care as much as they did. Private information about medical quality is mostly irrelevant to this signaling process.
If people with fewer allies are less likely to remain our allies, and if we care about them mainly assuming they remain our allies, then we want them to invest more in health than they would choose for themselves. This tempts us to regulate their health behaviors. This analysis suggests that the future will continue to see robust desires for health behavior regulation and for communal medical care and spending increases as a fraction of income, all regardless of the health effects of these choices.
2009-agarwal.pdf: “The Age of Reason: Financial Decisions over the Life Cycle and Implications for Regulation”, Sumit Agarwal, John C. Driscoll, Xavier Gabaix, David Laibson (2009):
Many consumers make poor financial choices, and older adults are particularly vulnerable to such errors. About half of the population between ages 80 and 89 have a medical diagnosis of substantial cognitive impairment. We study life-cycle patterns in financial mistakes using a proprietary database with information on 10 types of credit transactions. Financial mistakes include suboptimal use of credit card balance transfer offers and excess interest rate and fee payments. In a cross section of prime borrowers, middle-aged adults made fewer financial mistakes than either younger or older adults. We conclude that financial mistakes follow a U-shaped pattern, with the cost-minimizing performance occurring around age 53. We analyze nine regulatory strategies that may help individuals avoid financial mistakes. We discuss laissez-faire, disclosure, nudges, financial “driver’s licenses,” advance directives, fiduciaries, asset safe harbors, and ex post and ex ante regulatory oversight. Finally, we pose seven questions for future research on cognitive limitations and associated policy responses.
2009-ferrante.pdf: “Education, Aspirations and Life Satisfaction”, Francesco Ferrante (2009-10-21):
The idea that expanding work and consumption opportunities always increases people’s wellbeing is well established in economics but finds no support in psychology. Instead, there is evidence in both economics and psychology that people’s life satisfaction depends on how experienced utility compares with expectations of life satisfaction or decision utility.
In this paper I suggest that expanding work and consumption opportunities is a good thing for decision utility but may not be so for experienced utility. On this premise, I argue that people may overrate their socioeconomic prospects relative to real life chances and I discuss how systematic frustration over unfulfilled expectations can be connected to people’s educational achievement.
I test the model’s predictions on Italian data and find preliminary support for the idea that education and access to stimulating environments may have a perverse impact on life satisfaction. I also find evidence that the latter effect is mediated by factors such as gender and age.
Indeed, the model seeks to go beyond the Italian case and provide more general insights into how age/life-satisfaction relationships can be modelled and explained.
2010-glied.pdf: “The Economic Value of Teeth”, Sherry Glied, Matthew Neidell (2010-03-01):
This paper examines the effect of oral health on labor market outcomes by exploiting variation in fluoridated water exposure during childhood. The politics surrounding the adoption of water fluoridation by local governments suggests exposure to fluoride is exogenous to other factors affecting earnings. Exposure to fluoridated water increases women’s earnings by approximately 4%, but has no detectable effect for men. Furthermore, the effect is largely concentrated amongst women from families of low socioeconomic status. We find little evidence to support occupational sorting, statistical discrimination, and productivity as potential channels, with some evidence supporting consumer and possibly employer discrimination.
2010-oberholzergee.pdf: “File Sharing and Copyright”, Felix Oberholzer-Gee, Koleman Strumpf (2010-01-01):
The advent of file sharing has considerably weakened effective copyright protection. Today, more than 60% of Internet traffic consists of consumers sharing music, movies, books, and games. Yet, despite the popularity of the new technology, file sharing has not undermined the incentives of authors to produce new works. We argue that the effect of file sharing has been muted for three reasons. (1) The cannibalization of sales that is due to file sharing is more modest than many observers assume. Empirical work suggests that in music, no more than 20% of the recent decline in sales is due to sharing. (2) File sharing increases the demand for complements to protected works, raising, for instance, the demand for concerts and concert prices. The sale of more expensive complements has added to artists’ incomes. (3) In many creative industries, monetary incentives play a reduced role in motivating authors to remain creative. Data on the supply of new works are consistent with the argument that file sharing did not discourage authors and publishers. Since the advent of file sharing, the production of music, books, and movies has increased sharply.
2010-rost.pdf: “The corporate governance of Benedictine abbeys”, Katja Rost, Emil Inauen, Margit Osterloh, Bruno S. Frey (2010-01-12):
Purpose: This paper aims to analyse the governance structure of monasteries to gain new insights and apply them to solve agency problems of modern corporations. In an historic analysis of crises and closures it asks, if Benedictine monasteries were and are capable of solving agency problems. The analysis shows that monasteries established basic governance instruments very early and therefore were able to survive for centuries.
Design/methodology/approach: The paper uses a dataset of all Benedictine abbeys that ever existed in Bavaria, Baden‐Württemberg, and German‐speaking Switzerland to determine their lifespan and the reasons for closures. The governance mechanisms are analyzed in detail. Finally, it draws conclusions relevant to the modern corporation. The theoretical foundations are based upon principal-agent theory, psychological economics, as well as embeddedness theory.
Findings: The monasteries that are examined show an average lifetime of almost 500 years and only a quarter of them dissolved as a result of agency problems. This paper argues that this success is due to an appropriate governance structure that relies strongly on internal control mechanisms.
Research limitations/implications: Benedictine monasteries and stock corporations differ fundamentally regarding their goals. Additional limitations of the monastic approach are the tendency to promote groupthink, the danger of dictatorship, and the lifelong commitment.
Practical implications: The paper adds new insights into the corporate governance debate designed to solve current agency problems and facilitate better control.
Originality/value: By analyzing monasteries, a new approach is offered to understand the efficiency of internal behavioral incentives and their combination with external control mechanisms in corporate governance.
2010-schuh.pdf: “Who Gains and Who Loses from Credit Card Payments? Theory and Calibrations”, Scott Schuh, Oz Shy, Joanna Stavins (2010-08-31):
Merchant fees and reward programs generate an implicit monetary transfer to credit card users from non-card (or “cash”) users because merchants generally do not set differential prices for card users to recoup the costs of fees and rewards. On average, each cash-using household pays $149 per year (2010 dollars; ~$196 in current dollars) to card-using households and each card-using household receives $1,133 (~$1,488) from cash users. Because credit card spending and rewards are positively correlated with household income, the payment instrument transfer also induces a regressive transfer from low-income to high-income households in general. On average, and after accounting for rewards paid to households by banks, the lowest-income household ($20,000 or less annually) pays $21 (~$28) and the highest-income household ($150,000 or more annually) receives $750 (~$985) every year. We build and calibrate a model of consumer payment choice to compute the effects of merchant fees and card rewards on consumer welfare. Reducing merchant fees and card rewards would likely increase consumer welfare.
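The mechanism can be sketched in a few lines. This is a toy model of my own, not the paper's calibration, and the fee and reward rates are made-up round numbers:

```python
# Toy model of the cross-subsidy: a merchant facing fee rate f on card
# sales sets one uniform price covering average cost, so cash buyers pay
# the fee-driven markup but receive no reward. Parameters are illustrative.
def transfers(cost, fee_rate, reward_rate, card_share):
    price = cost / (1 - fee_rate * card_share)   # uniform break-even price
    cash_pays = price - cost                     # markup borne by a cash user
    card_net_gain = reward_rate * price - (price - cost)
    return cash_pays, card_net_gain

cash_pays, card_gains = transfers(cost=100, fee_rate=0.02,
                                  reward_rate=0.015, card_share=0.5)
# Both come out positive: cash users overpay, card users come out ahead.
```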
2011-dube.pdf: “Coups, Corporations, and Classified Information”, Arindrajit Dube, Ethan Kaplan, Suresh Naidu (2011-08-11):
We estimate the impact of coups and top-secret coup authorizations on asset prices of partially nationalized multinational companies that stood to benefit from U.S.-backed coups. Stock returns of highly exposed firms reacted to coup authorizations classified as top-secret. The average cumulative abnormal return to a coup authorization was 9% over 4 days for a fully nationalized company, rising to more than 13% over 16 days. Precoup authorizations accounted for a larger share of stock price increases than the actual coup events themselves. There is no effect in the case of the widely publicized, poorly executed Cuban operations, consistent with abnormal returns to coup authorizations reflecting credible private information. We also introduce two new intuitive and easy to implement nonparametric tests that do not rely on asymptotic justifications.
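The cumulative-abnormal-return figures follow the standard event-study recipe; a minimal sketch with hypothetical daily returns (the paper's market model and window details may differ):

```python
# Standard event-study CAR: abnormal return = actual return minus a
# market-model prediction (alpha + beta * market), summed over the window.
# The daily returns below are made up, chosen to land near the abstract's
# ~9% four-day figure.
def cumulative_abnormal_return(stock_r, market_r, alpha=0.0, beta=1.0):
    return sum(r - (alpha + beta * m) for r, m in zip(stock_r, market_r))

car = cumulative_abnormal_return(
    stock_r=[0.030, 0.020, 0.025, 0.018],
    market_r=[0.001, -0.002, 0.000, 0.001],
)  # ~0.093, i.e. roughly 9% over the 4-day window
```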
2012-bloom-2.pdf: “Americans Do IT Better: US Multinationals and the Productivity Miracle”, Nicholas Bloom, Raffaella Sadun, John Van Reenen (2012-02):
US productivity growth accelerated after 1995 (unlike Europe’s), particularly in sectors that intensively use information technologies (IT). Using two new micro panel datasets we show that US multinationals operating in Europe also experienced a “productivity miracle.” US multinationals obtained higher productivity from IT than non-US multinationals, particularly in the same sectors responsible for the US productivity acceleration. Furthermore, establishments taken over by US multinationals (but not by non-US multinationals) increased the productivity of their IT. Combining pan-European firm-level IT data with our management practices survey, we find that the US IT related productivity advantage is primarily due to its tougher “people management” practices.
2012-bloom.pdf: “Does Management Matter? Evidence from India”, Nicholas Bloom, Benn Eifert, Aprajit Mahajan, David McKenzie, John Roberts (2012-11-18):
A long-standing question is whether differences in management practices across firms can explain differences in productivity, especially in developing countries where these spreads appear particularly large. To investigate this, we ran a management field experiment on large Indian textile firms. We provided free consulting on management practices to randomly chosen treatment plants and compared their performance to a set of control plants. We find that adopting these management practices raised productivity by 17% in the first year through improved quality and efficiency and reduced inventory, and within three years led to the opening of more production plants. Why had the firms not adopted these profitable practices previously? Our results suggest that informational barriers were the primary factor explaining this lack of adoption. Also, because reallocation across firms appeared to be constrained by limits on managerial time, competition had not forced badly managed firms to exit.
2012-gordon.pdf: “Is U.S. Economic Growth Over? Faltering Innovation Confronts the Six Headwinds”, Robert J. Gordon (2012-08-01):
This paper raises basic questions about the process of economic growth. It questions the assumption, nearly universal since Solow’s seminal contributions of the 1950s, that economic growth is a continuous process that will persist forever. There was virtually no growth before 1750, and thus there is no guarantee that growth will continue indefinitely. Rather, the paper suggests that the rapid progress made over the past 250 years could well turn out to be a unique episode in human history. The paper is only about the United States and views the future from 2007 while pretending that the financial crisis did not happen. Its point of departure is growth in per-capita real GDP in the frontier country since 1300, the U.K. until 1906 and the U.S. afterwards. Growth in this frontier gradually accelerated after 1750, reached a peak in the middle of the 20th century, and has been slowing down since. The paper is about “how much further could the frontier growth rate decline?”
The analysis links periods of slow and rapid growth to the timing of the three industrial revolutions (IR’s), that is, IR #1 (steam, railroads) from 1750 to 1830; IR #2 (electricity, internal combustion engine, running water, indoor toilets, communications, entertainment, chemicals, petroleum) from 1870 to 1900; and IR #3 (computers, the web, mobile phones) from 1960 to present. It provides evidence that IR #2 was more important than the others and was largely responsible for 80 years of relatively rapid productivity growth between 1890 and 1972. Once the spin-off inventions from IR #2 (airplanes, air conditioning, interstate highways) had run their course, productivity growth during 1972–96 was much slower than before. In contrast, IR #3 created only a short-lived growth revival between 1996 and 2004. Many of the original and spin-off inventions of IR #2 could happen only once—urbanization, transportation speed, the freedom of females from the drudgery of carrying tons of water per year, and the role of central heating and air conditioning in achieving a year-round constant temperature.
Even if innovation were to continue into the future at the rate of the two decades before 2007, the U.S. faces six headwinds that are in the process of dragging long-term growth to half or less of the 1.9% annual rate experienced between 1860 and 2007. These include demography, education, inequality, globalization, energy/environment, and the overhang of consumer and government debt. A provocative “exercise in subtraction” suggests that future growth in consumption per capita for the bottom 99% of the income distribution could fall below 0.5% per year for an extended period of decades.
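The stakes of that subtraction compound: my illustration, not Gordon's own table, comparing the historical rate with the projected one over a 40-year horizon:

```python
# Compare the historical 1.9%/year frontier growth rate with the
# projected <0.5%/year, compounded over 40 years (illustrative arithmetic).
def cumulative_growth(annual_rate, years):
    return (1 + annual_rate) ** years

historical = cumulative_growth(0.019, 40)  # ~2.1x living standards
headwinds = cumulative_growth(0.005, 40)   # ~1.2x living standards
```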
2014-heald.pdf: “How Copyright Keeps Works Disappeared”, Paul J. Heald (2014-10-28):
A random sample of new books for sale on Amazon.com shows more books for sale from the 1880s than the 1980s. Why? This article presents new data on how copyright stifles the reappearance of works. First, a random sample of more than 2,000 new books for sale on Amazon.com is analyzed along with a random sample of almost 2,000 songs available on new DVDs. Copyright status correlates highly with absence from the Amazon shelf. Together with publishing business models, copyright law seems to deter distribution and diminish access. Further analysis of eBook markets, used books on Abebooks.com, and the Chicago Public Library collection suggests that no alternative marketplace for out-of-print books has yet developed. Data from iTunes and YouTube, however, tell a different story for older hit songs. The much wider availability of old music in digital form may be explained by the differing holdings in two important cases, Boosey & Hawkes v. Disney (music) and Random House v. Rosetta Stone (books).
2014-lewis.pdf: “Managing an iconic old luxury brand in a new luxury economy: Hermès handbags in the US market”, Tasha L. Lewis, Brittany Haas (2014-03):
The Hermès brand is synonymous with a wealthy global elite clientele and its products have maintained an enduring heritage of craftsmanship that has distinguished it among competing luxury brands in the global market. Hermès has remained a family business for generations and has successfully avoided recent acquisition attempts by luxury group LVMH. Almost half of the luxury firm’s revenue ($1.5B in 2012; ~$1.9B in current dollars) is derived from the sale of its leather goods and saddlery, which includes its handbags. A large contributor to sales is global demand for one of its leather accessories, the Birkin bag, ranging in price from $10,000 to $250,000 (2014 dollars; ~$12,300–$307,500 in current dollars). Increased demand for the bag in the United States since 2002 resulted in an extensive customer waitlist lasting from months to a few years. Hermès retired the famed waitlist (sometimes called the ‘dream list’) in the United States in 2010, and while the waitlist has been removed, demand for the Birkin bag has not diminished and making the bag available to luxury consumers requires extensive, careful distribution management. In addition to inventory constraints related to demand for the Birkin bag in the United States, Hermès must also manage a range of other factors in the US market. These factors include competition with ‘affordable’ luxury brands like Coach, monitoring of unsolicited brand endorsers as well as counterfeit goods and resellers. This article examines some of the allocation practices used to carefully manage the Hermès brand in the US market.
2015-bronnenberg.pdf: “Do Pharmacists Buy Bayer? Informed Shoppers and the Brand Premium”, Bart J. Bronnenberg, Jean-Pierre Dubé, Matthew Gentzkow, Jesse M. Shapiro (2015-07-15):
We estimate the effect of information and expertise on consumers’ willingness to pay for national brands in physically homogeneous product categories. In a detailed case study of headache remedies, we find that more informed or expert consumers are less likely to pay extra to buy national brands, with pharmacists choosing them over store brands only 9% of the time, compared to 26% of the time for the average consumer. In a similar case study of pantry staples such as salt and sugar, we show that chefs devote 12 percentage points less of their purchases to national brands than demographically similar non-chefs. We extend our analysis to cover 50 retail health categories and 241 food and drink categories. The results suggest that misinformation and related consumer mistakes explain a sizable share of the brand premium for health products, and a much smaller share for most food and drink products. We tie our estimates together using a stylized model of demand and pricing.
2016-gubby.pdf: “Preparing for the Worst: The Space Insurance Market's Realistic Disaster Scenarios”, Robin Gubby, David Wade, David Hoffer (2016-05-31):
Approximately 30 satellite launches are insured each year, and insurance coverage is provided for about 200 in-orbit satellites. The total insured exposure for these risks is currently in excess of US$25 billion. Commercial communications satellites in geostationary Earth orbit represent the majority of these, although a larger number of commercial imaging satellites, as well as the second-generation communication constellations, will see the insurance exposure in low Earth orbit start to increase in the years ahead, from its current level of US$1.5 billion. Regulations covering Lloyd’s of London syndicates require that each syndicate reserves funds to cover potential losses and to remain solvent. New regulations under the European Union’s Solvency II directive now require each syndicate to develop models for the classes of insurance provided to determine their own solvency capital requirements. Solvency II is expected to come into force in 2016 to ensure improved consumer protection, modernized supervision, deepened EU market integration, and increased international competitiveness of EU insurers. For each class of business, the inputs to the solvency capital requirements are determined not just on previous results, but also to reflect extreme cases where an unusual event or sequence of events exposes the syndicate to its theoretical worst-case loss. To assist syndicates covering satellites to reserve funds for such extreme space events, a series of realistic disaster scenarios (RDSs) has been developed, on which all Lloyd’s syndicates insuring space risks must report quarterly. The RDSs are regularly reviewed for their applicability and were recently updated to reflect changes within the space industry to incorporate such factors as consolidation in the supply chain and the greater exploitation of low Earth orbit. The development of these theoretical RDSs will be overviewed along with the limitations of such scenarios.
Changes in the industry that have warranted the recent update of the RDS, and the impact such changes have had will also be outlined. Finally, a look toward future industry developments that may require further amendments to the RDSs will also be covered by the article.
2016-kuran.pdf: “Legal Roots of Authoritarian Rule in the Middle East: Civic Legacies of the Islamic Waqf”, Timur Kuran (2016):
In the legal system of the premodern Middle East, the closest thing to an autonomous private organization was the Islamic waqf. This non-state institution inhibited political participation, collective action, and rule of law, among other indicators of democratization. It did so through several mechanisms. Its activities were essentially set by its founder, which limited its capacity to meet political challenges. Being designed to provide a service on its own, it could not participate in lasting political coalitions. The waqf’s beneficiaries had no say in evaluating or selecting its officers, and they had trouble forming a political community. Thus, for all the resources it controlled, the Islamic waqf contributed minimally to building civil society. As a core element of Islam’s classical institutional complex, it perpetuated authoritarian rule by keeping the state largely unrestrained. Therein lies a key reason for the slow pace of the Middle East’s democratization process.
2016-lu.pdf: “The Contractual Nature of the City”, Qian Lu (2016):
Urbanization is a process in which separated and dispersed property rights become concentrated in a specific location. This process involves a large volume of contracts to redefine and rearrange various property rights, producing high and varied transaction costs. Efficient urbanization implies the reduction of these costs. This paper studies how efficient urbanization reduces transaction costs in the real world through a series of contracts rather than coercive power. Specifically, it shows that Jiaolong Co. built a city by acting as a central contractor: it acquired planning rights by contract and signed a series of tax-sharing contracts with government, farmers, tenants, and business enterprises. These contractual arrangements greatly reduced transaction costs and promoted development.
2016-mclean.pdf: “Does Academic Research Destroy Stock Return Predictability?”, R. David McLean, Jeffrey Pontiff (2016-02-01):
We study the out-of-sample and post-publication return predictability of 97 variables shown to predict cross-sectional stock returns. Portfolio returns are 26% lower out-of-sample and 58% lower post-publication. The out-of-sample decline is an upper bound estimate of data mining effects. We estimate a 32% (58%–26%) lower return from publication-informed trading. Post-publication declines are greater for predictors with higher in-sample returns, and returns are higher for portfolios concentrated in stocks with high idiosyncratic risk and low liquidity. Predictor portfolios exhibit post-publication increases in correlations with other published-predictor portfolios. Our findings suggest that investors learn about mispricing from academic publications.
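The abstract's decomposition can be restated as a small numerical sketch (illustrative only; the function name and the normalization of the in-sample return to 1.0 are ours, not the paper's):

```python
# Decomposition from the abstract: the out-of-sample decline (26%) is an
# upper bound on data-mining effects; the extra post-publication decline
# (58% - 26% = 32%) is attributed to publication-informed trading.

def predictability_decay(in_sample_return, oos_decline=0.26, post_pub_decline=0.58):
    data_mining = in_sample_return * oos_decline
    publication = in_sample_return * (post_pub_decline - oos_decline)
    remaining = in_sample_return * (1 - post_pub_decline)
    return data_mining, publication, remaining

dm, pub, rem = predictability_decay(1.0)  # normalize the in-sample return to 1
print(dm, pub, rem)                       # roughly 0.26, 0.32, 0.42
```

Of a normalized in-sample return of 1, about 0.26 is lost to data mining, 0.32 to post-publication trading, and 0.42 survives.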
2017-akam-theexquisitelyenglishandamazinglylucrativeworldoflondonclerks.html: “The Exquisitely English (and Amazingly Lucrative) World of London Clerks: It’s a Dickensian profession that can still pay upwards of $650,000 per year”, Simon Akam (Bloomberg News) (2017-05-23):
John/Mark Taylor belongs to one of the last surviving professions of Dickensian London. Clerks have co-existed with chimney sweeps and gene splicers. It’s a trade that one can enter as a teenager, with no formal qualifications, and that’s astonishingly well-paid. A senior clerk can earn a half-million pounds per year, or more than $650,000, and some who are especially entrenched make far more.
Clerks—pronounced “clarks”—have no equivalent in the U.S. legal system, and have nothing in common with the Ivy League-trained Supreme Court aides of the same spelling. They exist because in England and Wales, to simplify a bit, the role of lawyer is divided in two: There are solicitors, who provide legal advice from their offices, and there are barristers, who argue in court. Barristers get the majority of their business via solicitors, and clerks act as the crucial middlemen between the tribes—they work for and sell the services of their barristers, steering inquiring solicitors to the right man or woman. Clerks are by their own cheerful admission “wheeler-dealers,” what Americans might call hustlers. They take a certain pride in managing the careers of their bosses, the barristers—a breed that often combines academic brilliance with emotional fragility. Many barristers regard clerks as their pimps. Some, particularly at the junior end of the profession, live in terror of clerks. The power dynamic is baroque and deeply English, with a naked class divide seen in few other places on the planet. Barristers employ clerks, but a bad relationship can strangle their supply of cases. In his 1861 novel Orley Farm, Anthony Trollope described a barrister’s clerk as a man who “looked down from a considerable altitude on some men who from their professional rank might have been considered as his superiors.”…One of the most peculiar aspects of the clerk-barrister relationship is that clerks handle money negotiations with clients. Barristers argue that avoiding fee discussions keeps their own interactions with clients clean and uncomplicated, but as a consequence, they’re sometimes unaware of how much they actually charge. The practice also insulates and coddles them. Clerks become enablers of all sorts of curious, and in some cases self-destructive, behavior.
…John Flood, a legal sociologist who in 1983 published the only book-length study of barristers’ clerks, subtitled The Law’s Middlemen, uses an anthropological lens to explain the relationship. He suggests that barristers, as the de facto priests of English law—with special clothes and beautiful workplaces—require a separate tribe to keep the temple flames alight and press money from their congregation. Clerks keep barristers’ hands clean; in so doing they accrue power, and they’re paid accordingly. I asked more than a dozen clerks and barristers, as well as a professional recruiter, what the field pays. Junior clerks, traditionally recruited straight after leaving school at 16 and potentially with no formal academic qualifications, start at £15,000 to £22,000 ($19,500 to $28,600); after 10 years they can make £85,000. Pay for senior clerks ranges from £120,000 to £500,000, and a distinct subset can earn £750,000. The Institute of Barristers’ Clerks disputed these figures, saying the lows were too low and the highs too high. But there’s no doubt that the best clerks are well-rewarded. David Grief, 63, a senior clerk at the esteemed Essex Court Chambers, spoke to me enthusiastically about his personal light airplane, a TB20 Trinidad.
…Before the U.K. decimalized its currency in 1971, clerks received “shillings on the guinea” for each case fee. Under the new money system, the senior clerks’ take was standardized at 10% of their chambers’ gross revenue. Sometimes, but not always, they paid their junior staff and expenses out of this tithe. Chambers at the time were typically small, four to six barristers strong, but in the 1980s, they grew. As they added barristers and collected more money, each chambers maintained just one chief clerk, whose income soared. The system was opaque: The self-employed barristers didn’t know what their peers within their own chambers were paid, and in a precomputer age, with all transactions recorded in a byzantine paper system, barristers sometimes didn’t know what their clerks earned, either. Jason Housden, a longtime clerk who now works at Matrix Chambers, told me that, when he started out in the 1980s at another office, his senior clerk routinely earned as much as the top barristers and on occasion was the best-paid man in the building. · One anecdote from around the same time, possibly apocryphal, is widely shared. At a chambers that had expanded and was bringing in more money, three silks decided their chief clerk’s compensation, at 10%, had gotten out of hand. They summoned him for a meeting and told him so. In a tactical response that highlights all the class baggage of the clerk-barrister relationship, as well as the acute British phobia of discussing money, the clerk surprised the barristers by agreeing with them. “I’m not going to take a penny more from you,” he concluded. The barristers, gobsmacked and paralyzed by manners, never raised the pay issue again, and the clerk remained on at 10% until retirement. · Since the 1980s, fee structures have often been renegotiated when a senior clerk retires. Purely commission-based arrangements are now rare—combinations of salary and incentive are the rule, though some holdouts remain. 
Goddard told me last summer that he receives 3% of the entire take of the barristers at 4 Stone; later he said this was inaccurate, and that his pay was determined by a “complicated formula.” (Pupil barristers, as trainees are known, start there at £65,000 per year, and the top silks each make several million pounds.) · The huge sums that clerks earn, at least relative to their formal qualifications, both sit at odds with the feudal nature of their employment and underpin it. In some chambers, clerks still refer to even junior barristers as “sir” or “miss.” Housden remembers discussing this issue early in his career with a senior clerk. He asked the man whether he found calling people half his age “sir” demeaning. The reply was straightforward: “For three-quarters of a million pounds per year, I’ll call anyone sir.”
2017-gard.pdf: “Creating a Last Twenty (L20) Collection: Implementing Section 108(h) in Libraries, Archives and Museums”, Elizabeth Townsend Gard (2017-10-02):
Section 108(h) has not been utilized by libraries and archives, in part because of uncertainty over definitions (e.g. “normal commercial exploitation”), determination of the eligibility window (the last twenty years of the copyright term of published works), and how to communicate the information in the record to the general public. This paper seeks to explore the elements necessary to implement the Last Twenty exception, otherwise known as Section 108(h), and to create a Last Twenty (L20) collection. In short, published works in the last twenty years of their copyright may be digitized and distributed by libraries, archives, and museums, as long as there is no commercial sale of the works and no reasonably priced copy is available. This means that Section 108(h) is available for the forgotten and neglected works of 1923–1941, including millions of foreign works restored by GATT. Section 108(h) is less effective for big, commercially available works. In many ways, that is the dividing line created by Section 108(h): allow for commercial exploitation of works throughout their term, but allow libraries to rescue works that had no commercial exploitation or copies available for sale and make them available through copying and distribution for research, scholarship, and preservation. In fact, when Section 108(h) was being debated in Congress it was labeled “orphan works.” This paper suggests ways to think about the requirements of Section 108(h) and to make it more usable for libraries. Essentially, by confidently using Section 108(h) we can continue to make the past usable one query at a time. The paper ends with an evaluation of the recent Discussion Paper by the U.S. Copyright Office on Section 108 and suggests changes/recommendations related to the proposed changes to Section 108(h). [Keywords: copyright, public domain, library, archives, museum, Section 108(h), Internet Archive, orphan works]
2017-levine.pdf: “Smart and Illicit: Who Becomes an Entrepreneur and Do They Earn More?”, Ross Levine, Yona Rubinstein (2017-05):
We disaggregate the self-employed into incorporated and unincorporated to distinguish between “entrepreneurs” and other business owners. We show that the incorporated self-employed and their businesses engage in activities that demand comparatively strong nonroutine cognitive abilities, while the unincorporated and their firms perform tasks demanding relatively strong manual skills. People who become incorporated business owners tend to be more educated and—as teenagers—score higher on learning aptitude tests, exhibit greater self-esteem, and engage in more illicit activities than others. The combination of “smart” and “illicit” tendencies as youths accounts for both entry into entrepreneurship and the comparative earnings of entrepreneurs. Individuals tend to experience a material increase in earnings when becoming entrepreneurs, and this increase occurs at each decile of the distribution.
2017-nagaraj.pdf: “Does Copyright Affect Reuse? Evidence from Google Books and Wikipedia”, Abhishek Nagaraj (2017-07-26):
While digitization has greatly increased the reuse of knowledge, this study shows how these benefits might be mitigated by copyright restrictions. I use the digitization of in-copyright and out-of-copyright issues of Baseball Digest magazine by Google Books to measure the impact of copyright on knowledge reuse in Wikipedia. I exploit a feature of the 1909 Copyright Act whereby material published before 1964 has lapsed into the public domain, allowing for the causal estimation of the impact of copyright across this sharp cutoff. I find that, while digitization encourages knowledge reuse, copyright restrictions reduce citations to copyrighted issues of Baseball Digest by up to 135% and affect readership by reducing traffic to affected pages by 20%. These impacts are highly uneven: copyright hurts the reuse of images rather than text, and affects Wikipedia pages for less-popular players more than those for more-popular ones.
The online appendix is available at https://doi.org/10.1287/mnsc.2017.2767.
2017-sacerdote.pdf: “Fifty Years Of Growth In American Consumption, Income, And Wages”, Bruce Sacerdote (2017-05-16):
Despite the large increase in U.S. income inequality, consumption for families at the 25th and 50th percentiles of income has grown steadily over the time period 1960–2015. The number of cars per household with below median income has doubled since 1980 and the number of bedrooms per household has grown 10% despite decreases in household size. The finding of zero growth in American real wages since the 1970s is driven in part by the choice of the CPI-U as the price deflator (Broda and Weinstein 2008, Prices, Poverty, And Inequality: Why Americans Are Better Off Than You Think). Small biases in any price deflator compound over long periods of time. Using a different deflator such as the Personal Consumption Expenditures index (PCE) yields modest growth in real wages and in median household incomes throughout the time period. Accounting for the Hamilton (1998) and Costa (2001) estimates of CPI bias yields estimated wage growth of 1% per year during 1975–2015. Meaningful growth in consumption for below median income families has occurred even in a prolonged period of increasing income inequality, increasing consumption inequality and a decreasing share of national income accruing to labor.
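The compounding point is easy to verify numerically. A minimal sketch, with toy numbers of our own choosing rather than Sacerdote's estimates:

```python
# If the deflator overstates inflation by a small amount each year, flat
# *measured* real wages conceal substantial *true* growth once the bias
# compounds over decades. The 0.5-percentage-point annual bias here is an
# assumption for illustration, not a figure from the paper.

years = 40
annual_bias = 0.005                              # assumed deflator overstatement
true_growth = (1 + annual_bias) ** years - 1     # growth hidden by the bias
print(f"Hidden cumulative real-wage growth over {years} years: {true_growth:.1%}")
```

Even a 0.5-point annual bias implies roughly 22% of unmeasured cumulative growth over 40 years, which is why the choice of deflator matters so much for the "zero wage growth" claim.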
2017-sichel.pdf: “The Price of Nails since 1700: Even Simple Products Experienced Large Price Declines”, Daniel E. Sichel (2017-04):
Many products—such as lighting and computing—have undergone revolutionary changes since the beginning of the industrial revolution. This paper considers the opposite end of the spectrum of product change, focusing on nails. Nails are a simple, everyday product whose form has changed relatively little over the last three centuries, and this paper constructs a continuous, constant-quality price index for nails since 1695. These data indicate that the price of nails fell substantially relative to an overall basket of consumption goods as reflected in the CPI, with the preferred index falling by a factor of about 15 from the mid 1700s to the mid 1900s. While these declines were nowhere near as rapid as those for lighting and computing, they were still quite sizable and large enough to enable the development of other products and processes and contribute to downstream changes in patterns of economic activity. Moreover, with the relative price of nails having been so much higher in an earlier period, nails played a much more important role in economic activity then than they do now. [A not yet completed section of the paper will use a growth accounting framework to assess the proximate sources of the change in the price of nails.]
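The headline figure implies only a modest annual rate once spread over two centuries. A back-of-envelope calculation, using our own rounding of the paper's "about 15 times over roughly 200 years":

```python
# A ~15x fall in the relative price of nails between roughly 1750 and 1950
# corresponds to a small but persistent average annual rate of decline.

factor, years = 15, 200          # rounded from the abstract's figures
annual_decline = 1 - (1 / factor) ** (1 / years)
print(f"Implied average annual relative-price decline: {annual_decline:.2%}")
```

This works out to roughly 1.3–1.4% per year, far slower than lighting or computing, but sustained for two centuries.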
2018-bruhn.pdf: “The Impact of Consulting Services on Small and Medium Enterprises: Evidence from a Randomized Trial in Mexico”, Miriam Bruhn, Dean Karlan, Antoinette Schoar (2018-03-07):
A randomized controlled trial with 432 small and medium enterprises in Mexico shows positive impact of access to 1 year of management consulting services on total factor productivity and return on assets. Owners also had an increase in “entrepreneurial spirit” (an index that measures entrepreneurial confidence and goal setting). Using Mexican social security data, we find a persistent large increase (about 50%) in the number of employees and total wage bill even 5 years after the program. We document large heterogeneity in the specific managerial practices that improved as a result of the consulting, with the most prominent being marketing, financial accounting, and long-term business planning.
2018-buterin.pdf: “Liberal Radicalism: A Flexible Design For Philanthropic Matching Funds”, Vitalik Buterin, Zoë Hitzig, E. Glen Weyl (2018-12-31):
We propose a design for philanthropic or publicly-funded seeding to allow (near) optimal provision of a decentralized, self-organizing ecosystem of public goods. The concept extends ideas from Quadratic Voting to a funding mechanism for endogenous community formation. Citizens make public goods contributions to projects of value to them. The amount received by the project is (proportional to) the square of the sum of the square roots of contributions received. Under the “standard model” this yields first best public goods provision. Variations can limit the cost, help protect against collusion and aid coordination. We discuss applications to campaign finance, open source software ecosystems, news media finance and urban public projects. More broadly, we relate our mechanism to political theory, discussing how this solution to the public goods problem may furnish neutral and non-authoritarian rules for society that nonetheless support collective organization.
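The funding rule stated in the abstract (the square of the sum of the square roots of contributions) is simple to sketch. The function name and example amounts below are ours, for illustration:

```python
from math import sqrt

def qf_amount(contributions):
    """Liberal Radicalism funding level for one project: the square of the
    sum of the square roots of individual contributions, before any
    scaling to fit the matching pool."""
    return sum(sqrt(c) for c in contributions) ** 2

# The rule favors broad support over concentrated support of equal size:
broad = qf_amount([1.0] * 100)   # 100 donors giving 1 each -> (100 * 1)^2
narrow = qf_amount([100.0])      # 1 donor giving 100       -> (sqrt(100))^2
print(broad, narrow)             # 10000.0 vs 100.0
```

The matching pool pays the difference between the mechanism's amount and the raw contributions, which is why many small contributions attract a much larger match than one large contribution of the same total.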
2018-horowitz.pdf: “Relative Education and the Advantage of a College Degree”, Jonathan Horowitz (2018):
What is the worth of a college degree when higher education expands? The relative education hypothesis posits that when college degrees are rare, individuals with more education have less competition to enter highly-skilled occupations. When college degrees are more common, there may not be enough highly-skilled jobs to go around; some college-educated workers lose out to others and are pushed into less-skilled jobs. Using new measurements of occupation-level verbal, quantitative, and analytic skills, this study tests the changing effect of education on skill utilization across 70 years of birth cohorts from 1971 to 2010, net of all other age, period, and cohort trends. Higher-education expansion erodes the value of a college degree, and college-educated workers are at greater risk for underemployment in less cognitively demanding occupations. This raises questions about the sources of rising income inequality, skill utilization across the working life course, occupational sex segregation, and how returns to education have changed across different life domains.
2018-leckelt.pdf: “The rich are different: Unravelling the perceived and self–reported personality profiles of high–net–worth individuals”, Marius Leckelt, David Richter, Carsten Schröder, Albrecht C. P. Küfner, Markus M. Grabka, Mitja D. Back (2018-11-22):
Beyond money and possessions, how are the rich different from the general population?
Drawing on a unique sample of high-net-worth individuals from Germany (≥1 million Euro in financial assets; n = 130), nationally representative data (n = 22,981), and an additional online panel (n = 690), we provide the first direct investigation of the stereotypically perceived and self-reported personality profiles of high-net-worth individuals.
Investigating the broad personality traits of the Big Five and the more specific traits of narcissism and locus of control, we find that stereotypes about wealthy people’s personality are accurate albeit somewhat exaggerated and that wealthy people can be characterized as stable, flexible, and agentic individuals who are focused more on themselves than on others.
2018-magniberton.pdf: “Why do academics oppose the market? A test of Nozick’s hypothesis”, Raul Magni-Berton, Diego Ríos (2018):
In this article, the authors explore why academics tend to oppose the market. To this end, the article uses normative political theory as an explanatory mechanism, starting with a conjecture originally suggested by Robert Nozick. Academics are over-represented amongst the best students of their cohort. School achievement engenders high expectations about future economic prospects. Yet markets are only contingently sensitive to school achievement. This misalignment between schools and markets is perceived by academics—and arguably by intellectuals in general—as morally unacceptable. To test this explanation, the article uses an online questionnaire with close to 1500 French academic respondents. The data resulting from this investigation lend support to Nozick’s hypothesis.
2019-atack.pdf: “'Automation' of Manufacturing in the Late Nineteenth Century: The Hand and Machine Labor Study”, Jeremy Atack, Robert A. Margo, Paul W. Rhode (2019-01-01):
Recent advances in artificial intelligence and robotics have generated a robust debate about the future of work. An analogous debate occurred in the late nineteenth century when mechanization first transformed manufacturing. We analyze an extraordinary dataset from the late nineteenth century, the Hand and Machine Labor study carried out by the US Department of Labor in the mid-1890s. We focus on transitions at the task level from hand to machine production, and on the impact of inanimate power, especially of steam power, on labor productivity. Our analysis sheds light on the ability of modern task-based models to account for the effects of historical mechanization.
[Summary by Jack Clark & Matthew van der Merwe:
Quantifying automation in the Industrial Revolution: We all know that the Industrial Revolution involved the substantial substitution of human labour for machine labour. This 2019 paper from a trio of economists paints a clear quantitative picture of automation in this period, using the 1899 US Hand and Machine Labor Study.
The dataset: The HML study is a remarkable dataset that has only recently been analyzed by economic historians. Commissioned by Congress and collected by the Bureau of Labor Statistics, the study collected observations on the production of 626 manufactured units (e.g. ‘men’s laced shoes’) and recorded in detail the tasks involved in their production and the relevant inputs to each task. For each unit, this data was collected for both machine production and hand production.
Key findings: The paper looks at transitions between hand-labour and machine-labour across tasks. It finds clear evidence for both the displacement and productivity effects of automation on labour:
- 67% of hand tasks transitioned 1-to-1 to being performed by machines and a further 28% of hand tasks were subdivided or consolidated into machine tasks. Only 4% of hand tasks were abandoned.
- New tasks (not previously done by hand) represented one-third of machine tasks.
- Machine labour reduced total production time by a factor of 7.
- The net effect of new tasks on labour demand was positive—time taken up by new machine-tasks was 5× the time lost on abandoned hand-tasks.
Matthew’s view: The Industrial Revolution is perhaps the most transformative period in human history so far, with massive effects on labour, living standards, and other important variables. It seems likely that advances in AI could have a similarly transformative effect on society, and that we are in a position to influence this transformation and ensure that it goes well. This makes understanding past transitions particularly important. Aside from the paper’s object-level conclusions, I’m struck by how valuable this diligent empirical work from the 1890s is, and by the foresight of the people who saw the importance of gathering high-quality data in the midst of this transition. This should serve as inspiration for those involved in efforts to track metrics of AI progress.]
2019-bazzi.pdf: “The Institutional Foundations of Religious Politics: Evidence from Indonesia”, Samuel Bazzi, Gabriel Koehler-Derrick, Benjamin Marx (2019-12-23):
This article explores the foundations of religious influence in politics and society. We show that an important Islamic institution fostered the entrenchment of Islamism at a critical juncture in Indonesia, the world’s largest Muslim country. In the early 1960s, rural elites transferred large amounts of land into waqf—inalienable charitable trusts in Islamic law—to avoid expropriation by the state. Regions facing a greater threat of expropriation exhibit more prevalent waqf land and Islamic institutions endowed as such, including mosques and religious schools. These endowments provided conservative forces with the capital needed to promote Islamist ideology and mobilize against the secular state. We identify lasting effects of the transfers on the size of the religious sector, electoral support for Islamist parties, and the adoption of local sharia laws. These effects are shaped by greater demand for religion in government but not by greater piety among the electorate. Waqf assets also impose costs on the local economy, particularly in agriculture, where these endowments are associated with lower productivity. Overall, our findings shed new light on the origins and consequences of Islamism.
2019-bouguen.pdf: “Using Randomized Controlled Trials to Estimate Long-Run Impacts in Development Economics”, Adrien Bouguen, Yue Huang, Michael Kremer, Edward Miguel (2019-05-13):
We assess evidence from randomized controlled trials (RCTs) on long-run economic productivity and living standards in poor countries. We first document that several studies estimate large positive long-run impacts, but that relatively few existing RCTs have been evaluated over the long run. We next present evidence from a systematic survey of existing RCTs, with a focus on cash transfer and child health programs, and show that a meaningful subset can realistically be evaluated for long-run effects. We discuss ways to bridge the gap between the burgeoning number of development RCTs and the limited number that have been followed up to date, including through new panel (longitudinal) data; improved participant tracking methods; alternative research designs; and access to administrative, remote sensing, and cell phone data. We conclude that the rise of development economics RCTs since roughly 2000 provides a novel opportunity to generate high-quality evidence on the long-run drivers of living standards.
2019-brynjolfsson-3.pdf: “Artificial Intelligence and the Modern Productivity Paradox: A Clash of Expectations and Statistics”, Erik Brynjolfsson, Daniel Rock, Chad Syverson (2019-05-01):
We live in an age of paradox. Systems using artificial intelligence match or surpass human-level performance in more and more domains, leveraging rapid advances in other technologies and driving soaring stock prices. Yet measured productivity growth has declined by half over the past decade, and real income has stagnated since the late 1990s for a majority of Americans. We describe four potential explanations for this clash of expectations and statistics: false hopes, mismeasurement, redistribution and implementation lags. While a case can be made for each explanation, we argue that lags have likely been the biggest contributor to the paradox. The most impressive capabilities of AI, particularly those based on machine learning, have not yet diffused widely. More importantly, like other general purpose technologies, their full effects won’t be realized until waves of complementary innovations are developed and implemented. The adjustment costs, organizational changes, and new skills needed for successful AI can be modeled as a kind of intangible capital. A portion of the value of this intangible capital is already reflected in the market value of firms. However, going forward, national statistics could fail to measure the full benefits of the new technologies and some may even have the wrong sign.
…The discussion around the recent patterns in aggregate productivity growth highlights a seeming contradiction. On the one hand, there are astonishing examples of potentially transformative new technologies that could greatly increase productivity and economic welfare (see Brynjolfsson and McAfee 2014 [Race Against The Machine]). There are some early concrete signs of these technologies’ promise, recent leaps in artificial intelligence (AI) performance being the most prominent example. However, at the same time, measured productivity growth over the past decade has slowed markedly. This deceleration is large, cutting productivity growth by half or more relative to the decade preceding the slowdown. It is also widespread, having occurred throughout the Organisation for Economic Cooperation and Development (OECD) and, more recently, among many large emerging economies as well (Syverson 2017).
We thus appear to be facing a redux of the Solow (1987) paradox: we see transformative new technologies everywhere but in the productivity statistics.
In this chapter, we review the evidence and explanations for the modern productivity paradox and propose a resolution. Namely, there is no inherent inconsistency between forward-looking technological optimism and backward-looking disappointment. Both can simultaneously exist. Indeed, there are good conceptual reasons to expect them to simultaneously exist when the economy undergoes the kind of restructuring associated with transformative technologies. In essence, the forecasters of future company wealth and the measurers of historical economic performance show the greatest disagreement during times of technological change. In this chapter, we argue and present some evidence that the economy is in such a period now.
2019-cowen.pdf: “Is the rate of scientific progress slowing down?”, Tyler Cowen, Ben Southwood (2019-08-05):
Our task is simple: we will consider whether the rate of scientific progress has slowed down, and more generally what we know about the rate of scientific progress, based on these literatures and other metrics we have been investigating. This investigation will take the form of a conceptual survey of the available data. We will consider which measures are out there, what they show, and how we should best interpret them, to attempt to create the most comprehensive and wide-ranging survey of metrics for the progress of science. In particular, we integrate a number of strands in the productivity growth literature, the “science of science” literature, and various historical literatures on the nature of human progress.
…To sum up the basic conclusions of this paper, there is good and also wide-ranging evidence that the rate of scientific progress has indeed slowed down. In the disparate and partially independent areas of productivity growth, total factor productivity, GDP growth, patent measures, researcher productivity, crop yields, life expectancy, and Moore’s Law, we have found support for this claim.
One implication here is we should not be especially optimistic about the productivity slowdown, as that notion is commonly understood, ending any time soon. There is some lag between scientific progress and practical outputs, and with science at less than its maximum dynamic state, one might not expect future productivity to fare so well either. Under one more specific interpretation of the data, a new General Purpose Technology might be required to kickstart economic growth once again.
2019-hjort.pdf: “The Arrival of Fast Internet and Employment in Africa”, Jonas Hjort, Jonas Poulsen (2019-03):
To show how fast Internet affects employment in Africa, we exploit the gradual arrival of submarine Internet cables on the coast and maps of the terrestrial cable network. Robust difference-in-differences estimates from 3 datasets, covering 12 countries, show large positive effects on employment rates—also for less educated worker groups—with little or no job displacement across space. The sample-wide impact is driven by increased employment in higher-skill occupations, though less-educated workers’ employment gains are smaller. Firm-level data available for some countries indicate that increased firm entry, productivity, and exporting contribute to higher net job creation. Average incomes rise.
2019-hoynes.pdf: “Universal Basic Income in the United States and Advanced Countries”, Hilary Hoynes, Jesse Rothstein (2019-08):
We discuss the potential role of universal basic incomes (UBIs) in advanced countries. A feature of advanced economies that distinguishes them from developing countries is the existence of well-developed, if often incomplete, safety nets. We develop a framework for describing transfer programs that is flexible enough to encompass most existing programs as well as UBIs, and we use this framework to compare various UBIs to the existing constellation of programs in the United States. A UBI would direct much larger shares of transfers to childless, nonelderly, nondisabled households than existing programs, and much more to middle-income rather than poor households. A UBI large enough to increase transfers to low-income families would be enormously expensive. We review the labor supply literature for evidence on the likely impacts of a UBI. We argue that the ongoing UBI pilot studies will do little to resolve the major outstanding questions. [Keywords: safety net, income transfer, universal basic income, labor supply; JEL: I38, H24]
2019-huising.pdf: “Moving off the Map: How Knowledge of Organizational Operations Empowers and Alienates”, Ruthanne Huising (2019-06-26):
This paper examines how employees become simultaneously empowered and alienated by detailed, holistic knowledge of the actual operations of their organization, drawing on an inductive analysis of the experiences of employees working on organizational change teams. As employees build and scrutinize process maps of their organization, they develop a new comprehension of the structure and operation of their organization. What they had perceived as purposively designed, relatively stable, and largely external is revealed to be continuously produced through social interaction. I trace how this altered comprehension of the organization’s functioning and logic changes employees’ orientation to and place within the organization. Their central roles are revealed as less efficacious than imagined and, in fact, as reproducing the organization’s inefficiencies. Alienated from their central operational roles, they voluntarily move to peripheral change roles from which they feel empowered to pursue organization-wide change. The paper offers two contributions. First, it identifies a new means through which central actors may become disembedded, that is, detailed comprehensive knowledge of the logic and operations of the surrounding social system. Second, the paper problematizes established insights about the relationship between social position and challenges to the status quo. Rather than a peripheral social location creating a desire to challenge the status quo, a desire to challenge the status quo may encourage central actors to choose a peripheral social location.
…Some held out hope that one or two people at the top knew of these design and operation issues; however, they were often disabused of this optimism. For example, a manager walked the CEO through the map, presenting him with a view he had never seen before and illustrating for him the lack of design and the disconnect between strategy and operations. The CEO, after being walked through the map, sat down, put his head on the table, and said, “This is even more fucked up than I imagined.” The CEO revealed that not only was the operation of his organization out of his control but that his grasp on it was imaginary.
[See HBR popularization: “Can You Know Too Much About Your Organization?”, Huising 2019:
But as the projects ended and the teams disbanded, a puzzle emerged. Some team members returned, as intended by senior management, to their prior roles and careers in the organization. Some, however, chose to leave these careers entirely, abandoning what had been to that point successful and satisfying work to take on organizational change roles elsewhere. Many took new jobs with responsibility for organizational development, Six Sigma, total quality management (TQM), business process re-engineering (BPR), or lean projects. Others assumed temporary contract roles to manage BPR project teams within their own or other organizations.
…Despite being experienced managers, what they learned was eye-opening. One explained that “it was like the sun rose for the first time….I saw the bigger picture.” They had never seen the pieces—the jobs, technologies, tools, and routines—connected in one place, and they realized that their prior view was narrow and fractured. A team member acknowledged, “I only thought of things in the context of my span of control.”…The maps of the organization generated by the project teams also showed that their organizations often lacked a purposeful, integrated design that was centrally monitored and managed. There may originally have been such a design, but as the organization grew, adapted to changing markets, brought on new leadership, added or subtracted divisions, and so on, this animating vision was lost. The original design had been eroded, patched, and overgrown with alternative plans. A manager explained, “Everything I see around here was developed because of specific issues that popped up, and it was all done ad hoc and added onto each other. It certainly wasn’t engineered.” Another manager described how local, off-the-cuff action had contributed to the problems observed at the organizational level:
“They see problems, and the general approach, the human approach, is to try and fix them….Functions have tried to put band-aids on every issue that comes up. It sounds good, but when they are layered one on top of the other they start to choke the organization. But they don’t see that because they are only seeing their own thing.”
Finally, analyzing a particular work process, another manager explained that she had been “assuming that somebody did this [the process] on purpose. And it wasn’t done on purpose. It was just a series of random events that somehow came together.”]
2019-mckenzie.pdf: “Predicting entrepreneurial success is hard: Evidence from a business plan competition in Nigeria”, David McKenzie, Dario Sansone (2019-11-01):
We compare the absolute and relative performance of three approaches to predicting outcomes for entrants in a business plan competition in Nigeria: Business plan scores from judges, simple ad-hoc prediction models used by researchers, and machine learning approaches. We find that (1) business plan scores from judges are uncorrelated with business survival, employment, sales, or profits three years later; (2) a few key characteristics of entrepreneurs such as gender, age, ability, and business sector do have some predictive power for future outcomes; (3) modern machine learning methods do not offer noticeable improvements; (4) the overall predictive power of all approaches is very low, highlighting the fundamental difficulty of picking competition winners.
This article examines the extent to which Victorian investors were short-sale constrained. While previous research suggests that there were relatively few limits on arbitrage, this article argues that short-sales of stocks outside the Official List were indirectly constrained by the risk of being cornered. Evidence for this hypothesis comes from three corners in cycle company shares [during the 1890s bicycle mania] which occurred in 1896–1897, two of which resulted in substantial losses for short-sellers. Legal efforts to retrieve funds lost in a corner were unsuccessful, and the court proceedings reveal a widespread contempt for short-sellers, or ‘bears’, among the general public. Consistent with the hypothesis that these episodes affected the market, this study’s findings show that cycle companies for which cornering risk was greater experienced disproportionately lower returns during a subsequent crash in the market for cycle shares. This evidence suggests that, under certain circumstances, short-selling shares in Britain prior to 1900 could have been much riskier than previously thought.
…Cycle share prices are found to have risen by over 200% in the early months of 1896, and remained at a relatively high level until March 1897. This boom was accompanied by the promotion of many new cycle firms, with 363 established in 1896 and another 238 during the first half of 1897. This was followed by a crash, with cycle shares losing 76% of their peak value by the end of 1898. The financial press appears to have been aware that a crash was imminent, repeatedly advising investors to sell cycle shares during the first half of 1897. Interestingly, however, these articles never explicitly recommended short-selling cycle shares…Between 1890 and 1896, a succession of major technological innovations substantially increased the demand for British bicycles. Bicycle production increased in response, with the number of British cycle companies in existence quadrupling between 1889 and 1897. Cycle firms, most of which were based in and around Birmingham, took advantage of the boom of 1896 by going public, resulting in the successful promotion of £17.3 million worth of cycle firms in 1896 and a further £7.4 million in 1897. By 1897 there was an oversupply problem in the trade, which was worsened by an exponential increase in the number of bicycles imported from the US. The bicycle industry entered recession, and the number of Birmingham-based cycle firms fell by 54% between 1896 and 1900.
…The total paid for the 200 shares [by the short-trader Hamlyn] was £2,550, to be delivered at a price of £231.25, for a loss of £2,318.75. To put this loss in context, Hamlyn’s barrister noted that, had he succeeded in obtaining the shares at allotment, the profit would have been only £26.
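The arithmetic of Hamlyn’s loss can be checked directly (a minimal sketch; reading £231.25 as the total contracted delivery proceeds for the 200 shares is our interpretation of the passage, which is ambiguous on the per-share breakdown):

```python
# Hamlyn's short position in the cycle-share corner (totals from the
# court record as quoted above; the role of each figure is inferred).
cost_to_cover = 2550.00      # £ paid to obtain the 200 shares
delivery_proceeds = 231.25   # £ received on delivering them
loss = cost_to_cover - delivery_proceeds
print(f"£{loss:,.2f}")       # £2,318.75 -- vs. a mere £26 profit had the
                             # shares been obtained at allotment
```

The asymmetry is the point: the corner turned a position with a £26 upside into a four-figure loss.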
2019-scholl.pdf: “Testing the Automation Revolution Hypothesis”, Keller Scholl, Robin Hanson (2019-12-27):
Recently, many have predicted an imminent automation revolution, and large resulting job losses. Others have created metrics to predict new patterns in job automation vulnerability. As context to such claims, we test basic theory, two vulnerability metrics, and 251 O*NET job features as predictors of 1505 expert reports regarding automation levels in 832 U.S. job types from 1999 to 2019.
We find that pay, employment, and vulnerability metrics are predictive (R²~0.15), but add little to the top 25 O*NET job features, which together predict far better (R²~0.55). These best predictors seem understandable in terms of traditional kinds of automation, and have not changed over our time period. Instead, it seems that jobs have changed their features to become more suitable for automation.
We thus find no evidence yet of a revolution in the patterns or quantity of automation. And since, over this period, automation increases have predicted neither changes in pay nor employment, this suggests that workers have little to fear if such a revolution does come. [Keywords: automation, wages, employment, occupations, artificial intelligence, technology]
This study examines the use of “algorithms in everyday labor” to explore the labor conditions of three Chinese food delivery platforms: Baidu Deliveries, Eleme, and Meituan. In particular, it examines how delivery workers make sense of these algorithms through the parameters of temporality, affect, and gamification. The study also demonstrates that in working for food delivery platforms, couriers are not simply passive entities that are subjected to a digital “panopticon.” Instead, they create their own “organic algorithms” to manage and, in some cases, even subvert the system. The results of the approach used in this study demonstrate that digital labor has become both more accessible and more precarious in contemporary China. Based on these results, the notion of “algorithmic making and remaking” is suggested as a topic in future research on technology and digital labor. [Keywords: delivery workers, food delivery platform, algorithms, labor]
2020-akcigit.pdf: “Ten Facts on Declining Business Dynamism and Lessons from Endogenous Growth Theory”, Ufuk Akcigit, Sina T. Ates (2021):
In this paper, we review the literature on declining business dynamism and its implications in the United States and propose a unifying theory to analyze the symptoms and the potential causes of this decline. We first highlight 10 pronounced stylized facts related to declining business dynamism documented in the literature and discuss some of the existing attempts to explain them. We then describe a theoretical framework of endogenous markups, innovation, and competition that can potentially speak to all of these facts jointly. We next explore some theoretical predictions of this framework, which are shaped by two interacting forces: a composition effect that determines the market concentration and an incentive effect that determines how firms respond to a given concentration in the economy. The results highlight that a decline in knowledge diffusion between frontier and laggard firms could be an important driver of empirical trends observed in the data. This study emphasizes the potential of growth theory for the analysis of factors behind declining business dynamism and the need for further investigation in this direction.
2020-arora.pdf: “The Changing Structure of American Innovation: Some Cautionary Remarks for Economic Growth”, Ashish Arora, Sharon Belenzon, Andrea Patacconi, Jungkyu Suh (2020):
A defining feature of modern economic growth is the systematic application of science to advance technology. However, despite sustained progress in scientific knowledge, recent productivity growth in the United States has been disappointing. We review major changes in the American innovation ecosystem over the past century. The past three decades have been marked by a growing division of labor between universities focusing on research and large corporations focusing on development. Knowledge produced by universities is not often in a form that can be readily digested and turned into new goods and services. Small firms and university technology transfer offices cannot fully substitute for corporate research, which had previously integrated multiple disciplines at the scale required to solve substantial technical problems. Therefore, whereas the division of innovative labor may have raised the volume of science by universities, it has also slowed, at least for a period of time, the transformation of that knowledge into novel products and processes.
2020-barth.pdf: “Genetic Endowments and Wealth Inequality”, Daniel Barth, Nicholas W. Papageorge, Kevin Thom (2020-04-01):
We show that genetic endowments linked to educational attainment strongly and robustly predict wealth at retirement. The estimated relationship is not fully explained by flexibly controlling for education and labor income. We therefore investigate a host of additional mechanisms that could account for the gene-wealth gradient, including inheritances, mortality, risk preferences, portfolio decisions, beliefs about the probabilities of macroeconomic events, and planning horizons. We provide evidence that genetic endowments related to human capital accumulation are associated with wealth not only through educational attainment and labor income but also through a facility with complex financial decision-making.
2020-bessen.pdf: “Industry Concentration and Information Technology”, James Bessen (2020-08-01):
Industry concentration has been rising in the United States since 1980. Does this signal declining competition and the need for a new antitrust policy? Or are other factors causing concentration to increase? This paper explores the role of proprietary information technology (IT), which could increase the productivity of top firms relative to others and raise their market share. Instrumental variable estimates find a strong link between proprietary IT and rising industry concentration, accounting for most of its growth. Moreover, the top four firms in each industry benefit disproportionately. Large investments in proprietary software—$250 billion per year—appear to substantially impact industry structure.
2020-bloom.pdf: “Are Ideas Getting Harder to Find?”, Nicholas Bloom, Charles I. Jones, John Van Reenen, Michael Webb (2020-04-01):
Long-run growth in many models is the product of two terms: the effective number of researchers and their research productivity. We present evidence from various industries, products, and firms showing that research effort is rising substantially while research productivity is declining sharply. A good example is Moore’s Law. The number of researchers required today to achieve the famous doubling of computer chip density is more than 18 times larger than the number required in the early 1970s. More generally, everywhere we look we find that ideas, and the exponential growth they imply, are getting harder to find.
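The decomposition the authors use—idea output growth as the product of research productivity and the effective number of researchers—can be illustrated with a hedged sketch. The 18-fold figure is from the paper; the absolute numbers below are invented for illustration:

```python
# growth = research_productivity * effective_researchers, so
# research_productivity = growth / effective_researchers.
# Moore's-Law chip-density doubling has been maintained (growth held
# fixed), while effective researchers rose >18x since the early 1970s,
# so implied research productivity fell by the same factor.
chip_density_growth = 0.35   # log-points/year; illustrative, held constant
researchers_1971 = 1.0       # normalized effective researchers
researchers_today = 18.0     # >18x more (Bloom et al. 2020)

productivity_1971 = chip_density_growth / researchers_1971
productivity_today = chip_density_growth / researchers_today
decline_factor = productivity_1971 / productivity_today
print(decline_factor)        # 18-fold decline in research productivity
```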
2020-boeing.pdf: “A global decline in research productivity? Evidence from China and Germany”, Philipp Boeing, Paul Hünermund (2020-12-01):
- Replicates findings in Bloom et al 2020 for China and Germany.
- Provides evidence for a decline in research productivity in both countries.
- Using firm-level R&D panel data for public and private firms spanning three decades.
- Strong decline in R&D productivity in China due to end of catch-up growth.
- Conclusion: ideas are not only getting harder to find in the U.S.
In a recent paper, Bloom et al 2020 find evidence for a substantial decline in research productivity in the U.S. economy during the last 40 years. In this paper, we replicate their findings for China and Germany, using detailed firm-level data spanning three decades. Our results indicate that diminishing returns in idea production are a global phenomenon, not just confined to the U.S.
2020-cummins.pdf: “The micro-evidence for the Malthusian system. France, 1670–1840”, Neil Cummins (2020-10-01):
I test the assumptions of the Malthusian model at the individual, cross-sectional level for France, 1650–1820. Using husband’s occupation from the parish records of 41 French rural villages, I assign three different measures of status. There is no evidence for the existence of the positive check; infant deaths are unrelated to status. However, the preventive check operates strongly, acting through female age at first marriage. The wives of rich men are younger brides than those of poorer men. This drives a positive net-fertility gradient in living standards. However, the strength of this gradient is substantially weaker than it is in pre-industrial England. [Keywords: economic history, historical demography, population, Malthus, fertility, mortality, living standards]
2020-fulford.pdf: “Does it matter where you came from? Ancestry composition and economic performance of US counties, 1850–2010”, Scott L. Fulford, Ivan Petkov, Fabio Schiantarelli (2020-08-09):
What impact on local development do immigrants and their descendants have in the short and long term? The answer depends on the attributes they bring with them, what they pass on to their children, and how they interact with other groups. We develop the first measures of the country-of-ancestry composition and of GDP per worker for US counties from 1850 to 2010. We show that changes in ancestry composition are associated with changes in local economic development. We use the long panel and several instrumental variables strategies in an effort to assess different ancestry groups’ effect on county GDP per worker. Groups from countries with higher economic development, with cultural traits that favor cooperation, and with a long history of a centralized state have a greater positive impact on county GDP per worker. Ancestry diversity is positively related to county GDP per worker, while diversity in origin-country economic development or culture is negatively related.
2020-grier.pdf: “The Washington Consensus Works: Causal Effects of Reform, 1970–2015”, Kevin B. Grier, Robin M. Grier (2020-09-08):
- Sustained economic reform statistically-significantly raises real GDP per capita over a 5- to 10-year horizon.
- Despite the unpopularity of the Washington Consensus, its policies reliably raise average incomes.
- Countries that had sustained reform were 16% richer 10 years later.
Traditional policy reforms of the type embodied in the Washington Consensus have been out of academic fashion for decades. However, we are not aware of a paper that convincingly rejects the efficacy of these reforms. In this paper, we define generalized reform as a discrete, sustained jump in an index of economic freedom, whose components map well onto the points of the old consensus. We identify 49 cases of generalized reform in our dataset that spans 141 countries from 1970 to 2015. The average treatment effect associated with these reforms is positive, sizeable, and statistically-significant over 5- and 10-year windows. The result is robust to different thresholds for defining reform and different estimation methods. We argue that the policy reform baby was prematurely thrown out with the neoliberal bathwater. [Keywords: reform, Washington Consensus, rule of law, property rights, economic development]
2020-johnson.pdf: “What Remains of Cross-Country Convergence”, Paul Johnson, Chris Papageorgiou (2020-03):
We examine the record of cross-country growth over the past fifty years and ask if developing countries have made progress on closing the income gap between their per capita incomes and those in the advanced economies. We conclude that, as a group, they have not and then survey the literature on absolute convergence with particular emphasis on that from the last decade or so. That literature supports our conclusion of a lack of progress in closing the income gap between countries. We close with a brief examination of the recent literature on cross-individual distribution of income, which finds that despite the lack of progress on cross-country convergence, global inequality has tended to fall since 2000. (JEL E01, E13, O11, O47, F41, F62)
San Francisco is gentrifying rapidly as an influx of high-income newcomers drives up housing prices and displaces lower-income incumbent residents. In theory, increasing the supply of housing should mitigate increases in rents. However, new construction could also increase demand for nearby housing by improving neighborhood quality. The net impact on nearby rents depends on the relative sizes of these supply and demand effects.
This paper identifies the causal impact of new construction on nearby rents, displacement, and gentrification by exploiting random variation in the location of new construction induced by serious building fires. I combine parcel-level data on fires and new construction with an original dataset of historic Craigslist rents and panel data on individual migration histories to test the impact of proximity to new construction. I find that rents fall by 2% for parcels within 100m of new construction. Renters’ risk of being displaced to a lower-income neighborhood falls by 17%. Both effects decay linearly to zero within 1.5km. Next, I show evidence of a hyperlocal demand effect, with building renovations and business turnovers spiking and then returning to zero after 100m. Gentrification follows the pattern of this demand effect: parcels within 100m of new construction are 2.5 percentage points (29.5%) more likely to experience a net increase in richer residents.
Affordable housing and endogenously located construction do not affect displacement or gentrification. These findings suggest that increasing the supply of market rate housing has beneficial spillover effects for incumbent residents, reducing rents and displacement pressures while improving neighborhood quality.
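The reported spatial pattern can be sketched as a piecewise-linear distance-decay function (an illustration of the reported magnitudes only, not the paper’s estimator; the flat segment within 100m is an assumption):

```python
def rent_effect(distance_m: float) -> float:
    """Sketch of the reported effect of new construction on nearby rents:
    -2% within 100 m, decaying linearly to zero by 1.5 km."""
    if distance_m <= 100:
        return -0.02
    if distance_m >= 1500:
        return 0.0
    # linear interpolation between -2% at 100 m and 0% at 1,500 m
    return -0.02 * (1500 - distance_m) / (1500 - 100)

print(rent_effect(50), rent_effect(800), rent_effect(2000))
```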
2020-pereztruglia.pdf: “The Effects of Income Transparency on Well-Being: Evidence from a Natural Experiment”, Ricardo Perez-Truglia (2020-04):
In 2001, Norwegian tax records became easily accessible online, allowing everyone in the country to observe the incomes of everyone else. According to the income comparisons model, this change in transparency can widen the gap in well-being between richer and poorer individuals. Using survey data from 1985–2013 and multiple identification strategies, we show that the higher transparency increased the gap in happiness between richer and poorer individuals by 29%, and it increased the life satisfaction gap by 21%. We provide back-of-the-envelope estimates of the importance of income comparisons, and discuss implications for the ongoing debate on transparency policies.
2020-roodman.pdf: “Superexponential [Modeling the Human Trajectory]”, David Roodman (2020-07-30):
A scan of the history of gross world product (GWP) over millennia raises fundamental questions about the human past and prospect. What is the distribution of shocks ranging from recession to pandemic? Were the agricultural and industrial revolutions one-offs or did they manifest ongoing dynamics? Is growth exponential, if with occasional step changes in the rate, or is it superexponential? If the latter, how do we interpret the implication that output will become infinite in finite time? This paper introduces the first coherent statistical model of GWP history. It casts a GWP series as a sample path in a stochastic diffusion, one whose specification is novel yet rooted in neoclassical growth theory. After maximum likelihood fitting to GWP back to 10,000 BCE, most observations fall between the 40th and 60th percentiles of predicted distributions. The fit implies that GWP explosion is all but inevitable, in a median year of 2047. This projection cuts against the steadiness of growth in income per person seen in the last two centuries in countries at the economic frontier. And it essentially contradicts the laws of physics. But neither tension justifies immediate dismissal of the explosive projection. Accelerating economic growth is better explained by theory than constant growth. And if physical limits are articulated in a neoclassical-type model by endogenizing natural resources, explosion leads to implosion, formally avoiding infinities. The quality of the superexponential fit to the past suggests not so much that growth is destined to ascend as that the human system is unstable. [Keywords: endogenous growth; macroeconomic history; gross world product; stochastic differential equations]
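The finite-time infinity falls out of elementary calculus: if growth is superexponential, dY/dt = a·Y^(1+b) with b > 0, the closed-form path Y(t) = (Y₀⁻ᵇ − abt)^(−1/b) diverges at a finite date t* = Y₀⁻ᵇ/(ab). A sketch with made-up parameters (not Roodman’s fitted stochastic model):

```python
# Deterministic superexponential growth: dY/dt = a * Y**(1+b), b > 0.
# Solution Y(t) = (Y0**-b - a*b*t)**(-1/b) blows up at t_star.
# All parameter values below are illustrative only.
a, b, Y0 = 0.02, 0.5, 1.0
t_star = Y0**-b / (a * b)                      # singularity date: 100.0 here
Y = lambda t: (Y0**-b - a * b * t) ** (-1 / b)
print(t_star, Y(0.99 * t_star))                # Y explodes approaching t_star
```

With b = 0 the same equation reduces to ordinary exponential growth and the singularity disappears, which is why the empirical question of super- vs. plain exponentiality carries so much weight.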
2020-sauer-howcameoturneddlistcelebsintoamonetizationmachine.html: “How Cameo Turned D-List Celebs Into a Monetization Machine: Inside the surreal and lucrative two-sided marketplace of mediocre famous people”, Patrick J. Sauer (2020-03-17):
These formulas have turned an obscure idea that Galanis and his college buddies had a few years ago about making more money for second rate celebs into a thriving two-sided marketplace that has caught the attention of VCs, Hollywood, and professional sports. In June, Cameo raised $50 million in Series B funding, led by Kleiner Perkins (which recently began funding more early stage startups) to boost marketing, expand into international markets, and staff up to meet the growing demand. In the past 15 months, Cameo has gone from 20 to 125 employees, and moved from an 825-square-foot home base in the 1871 technology incubator into its current 6,000-square-foot digs in Chicago’s popping West Loop. Cameo customers have purchased more than 560,000 videos from some 20,000 celebs and counting, including ’80s star Steve Guttenberg and sports legend Kareem Abdul-Jabbar. And now, when the masses find themselves in quarantined isolation—looking for levity, distractions, and any semblance of the human touch—sending each other personalized videograms from the semi-famous has never seemed like a more pitch-perfect offering.
The product itself is as simple as it is improbable. For a price the celeb sets—anywhere from $5 to $2,500—famous people record video shout-outs, aka “Cameos,” that run for a couple of minutes, and then are delivered via text or email. Most Cameo videos are booked as private birthday or anniversary gifts, but a few have gone viral on social media. Even if you don’t know Cameo by name, there’s a good chance you caught Bam Margera of MTV’s Jackass delivering an “I quit” message on behalf of a disgruntled employee, or Sugar Ray’s Mark McGrath dumping some poor dude on behalf of the guy’s girlfriend. (Don’t feel too bad for the dumpee, the whole thing was a joke.)
…Back at the whiteboard, Galanis takes a marker and sketches out a graph of how fame works on his platform. “Imagine the grid represents all the celebrity talent in the world,” he says, “which by our definition, we peg at 5 million people.” The X-axis is willingness; the Y-axis is fame. “Say LeBron is at the top of the Y-axis, and I’m at the bottom,” he says. On the willingness side, Galanis puts notoriously media-averse Seattle Seahawks running back Marshawn Lynch on the far left end. At the opposite end, he slots chatty celebrity blogger-turned-Cameo-workhorse Perez Hilton, of whom Galanis says, “I promise if you booked him right now, the video would be done before we leave this room.”
…“The contrarian bet we made was that it would be way better for us to have people with small, loyal followings, often unknown to the general population, but who were willing to charge $5 to $10,” Galanis says. Cameo would employ a revenue-sharing model, getting a 25% cut of each video, while the rest went to the celeb. They wanted people like Galanis’ co-founder (and former Duke classmate) Devon Townsend, who had built a small following making silly Vine videos of his travels with pal Cody Ko, a popular YouTuber. “Devon isn’t Justin Bieber, but he had 25,000 Instagram followers from his days as a goofy Vine star,” explains Galanis. “He originally charged a couple bucks, and the people who love him responded, ‘Best money I ever spent!’”
…After a customer books a Cameo, the celeb films the video via the startup’s app within four to seven days. Most videos typically come in at under a minute, though some talent indulges in extensive riffs. (Inexplicably, “plant-based activist and health coach” Courtney Anne Feldman, wife of Corey, once went on for more than 20 minutes in a video for a customer.) Cameo handles the setup, technical infrastructure, marketing, and support, with white-glove service for the biggest earners with “whatever they need”—details like help pronouncing a customer’s name or just making sure they aren’t getting burned-out doing so many video shout-outs.
…For famous people of any caliber—the washed-up, the obscure micro-celebrity, the actual rock star—becoming part of the supply side of the Cameo marketplace is as low a barrier as it gets. Set a price and go. The videos are short—Instagram comedian Evan Breen has been known to knock out more than 100 at $25 a pop in a single sitting—and they don’t typically require any special preparation. Hair, makeup, wardrobe, or even handlers aren’t necessary. In fact, part of the oddball authenticity of Cameo videos is that they have a take-me-as-I-am familiarity—filmed at breakfast tables, lying in bed, on the golf course, running errands, at a stoplight, wherever it fits into the schedule.
2021-abramitzky.pdf: “Intergenerational Mobility of Immigrants in the United States over Two Centuries”, Ran Abramitzky, Leah Boustan, Elisa Jácome, Santiago Pérez (2021-02-01):
Using millions of father-son pairs spanning more than 100 years of US history [using US census data], we find that children of immigrants from nearly every sending country have higher rates of upward mobility than children of the US-born. Immigrants’ advantage is similar historically and today despite dramatic shifts in sending countries and US immigration policy. Immigrants achieve this advantage in part by choosing to settle in locations that offer better prospects for their children.
2021-brynjolfsson.pdf: “The Productivity J-Curve: How Intangibles Complement General Purpose Technologies”, Erik Brynjolfsson, Daniel Rock, Chad Syverson (2021-01):
General purpose technologies (GPTs) like AI enable and require substantial complementary investments. These investments are often intangible and poorly measured in national accounts.
We develop a model that shows how this can lead to an underestimation of productivity growth in a new GPT’s early years and, later, when the benefits of intangible investments are harvested, to an overestimation. We call this phenomenon the Productivity J-curve.
We apply our method to US data and find that adjusting for intangibles related to computer hardware and software yields a TFP level that is 15.9% higher than official measures by the end of 2017.
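The mismeasurement mechanism can be made concrete with a stylized two-period example (all numbers invented for illustration; this is the qualitative logic of the J-curve, not the authors’ adjustment method):

```python
# Early years: a firm diverts resources into unmeasured intangible capital
# (training data, process redesign). National accounts see only the
# measurable output, so measured productivity understates the truth.
measured_output_y1 = 100.0
intangible_invest_y1 = 10.0   # real output, booked nowhere
true_output_y1 = measured_output_y1 + intangible_invest_y1
mismeasure_y1 = measured_output_y1 / true_output_y1   # < 1: understated

# Later years: the intangible capital now yields extra measured output,
# but the capital input producing it is absent from the books, so
# measured productivity growth overstates the truth.
measured_output_y5 = 120.0
intangible_services_y5 = 8.0  # uncounted capital-service input
true_value_added_y5 = measured_output_y5 - intangible_services_y5
mismeasure_y5 = measured_output_y5 / true_value_added_y5  # > 1: overstated

print(mismeasure_y1, mismeasure_y5)  # dips below 1, then rises above: a J
```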
2021-desrochers-2.pdf: “Care to Wager Again? An Appraisal of Paul Ehrlich's Counterbet Offer to Julian Simon, Part 2: Critical Analysis”, Pierre Desrochers, Vincent Geloso, Joanna Szurmak (2021-02-11):
Objective: This paper provides the first comprehensive assessment of the outcome of Paul Ehrlich’s and Stephen Schneider’s counteroffer (1995) to economist Julian Simon following Ehrlich’s loss in the famous Ehrlich-Simon wager on economic growth and the price of natural resources (1980–1990). Our main conclusion in a previous article is that, for indicators that can be measured satisfactorily or can be inferred from proxies, the outcome favors Ehrlich-Schneider in the first decade following their offer. This second article extends the timeline towards the present time period to examine the long-term trends of each indicator and proxy, and assesses the reasons invoked by Simon to refuse the bet.
Methods: Literature review, data gathering, and critical assessment of the indicators and proxies suggested or implied by Ehrlich and Schneider. Critical assessment of Simon’s reasons for rejecting the bet. Data gathering for his alternative indicators.
Results: For indicators that can be measured directly, the balance of the outcomes favors the Ehrlich-Schneider claims for the initial ten-year period. Extending the timeline and accounting for the measurement limitations or dubious relevance of many of their indicators, however, shifts the balance of the evidence towards Simon’s perspective.
Conclusion: The fact that Ehrlich and Schneider’s own choice of indicators yielded mixed results in the long run, coupled with the fact that Simon’s preferred indicators of direct human welfare yielded largely favorable outcomes, is, in our opinion, sufficient to claim that Simon’s optimistic perspective was largely validated.
2021-desrochers.pdf: “Care to Wager Again? An Appraisal of Paul Ehrlich's Counterbet Offer to Julian Simon, Part 1: Outcomes”, Pierre Desrochers, Vincent Geloso, Joanna Szurmak (2021-02-11):
Objective: This paper provides the first comprehensive assessment of the outcome of Paul Ehrlich and Stephen Schneider’s counteroffer (1995) to economist Julian Simon following Ehrlich’s loss in the famous Ehrlich-Simon wager on economic growth and the price of natural resources (1980–1990).
Methods: Literature review, data gathering and critical assessment of the indicators and proxies suggested or implied by Ehrlich and Schneider. Critical assessment of Simon’s reasons for rejecting the bet. Data gathering for his alternative indicators.
Results: For indicators that can be measured satisfactorily, the balance of the outcomes favors the Ehrlich-Schneider claims for the initial ten-year period. Extending the timeline and accounting for the measurement limitations or dubious relevance of many of their indicators, however, shifts the balance of the evidence towards Simon’s perspective.
Conclusion: Although the outcomes favour the Ehrlich-Schneider claims for the initial ten-year period, Ehrlich and Schneider’s indicators yielded mixed results in the long run. Simon’s preferred indicators of direct human welfare would yield largely favourable outcomes if the bet were extended into the present. Based on this, we claim that Simon’s optimistic perspective was once again largely validated.
[Followup paper: “Care to Wager Again? An Appraisal of Paul Ehrlich’s Counterbet Offer to Julian Simon, Part 2: Critical Analysis”, Desrochers et al 2021b.]
2021-eshel.pdf: “Debasement of silver throughout the Late Bronze-Iron Age transition in the Southern Levant: Analytical and cultural implications”, Tzilla Eshel, Ayelet Gilboa, Naama Yahalom-Mack, Ofir Tirosh, Yigal Erel (2021):
- Levantine ~1200–950 BCE silver hoards were subjected to chemical and isotopic analysis.
- Silver was alloyed with copper, reflecting a shortage after the Bronze Age collapse.
- This debasement was often concealed by adding arsenic.
- A mixing model distinguishes between isotopic contributions of alloyed metals.
- Results suggest that silver shortage in the Levant probably lasted until ~950 BCE.
The study of silver, which was an important means of currency in the Southern Levant during the Bronze and Iron Age periods (~1950–586 BCE), revealed an unusual phenomenon. Silver hoards from a specific, yet rather long timespan, ~1200–950 BCE, contained mostly silver alloyed with copper. This alloying phenomenon is considered here for the first time, also with respect to previous attempts to provenance the silver using lead isotopes. Eight hoards were studied, from which 86 items were subjected to chemical and isotopic analysis. This is, by far, the largest dataset of sampled silver from this timespan in the Near East. Results show the alloys, despite their silvery sheen, contained high percentages of Cu, reaching up to 80% of the alloy. The Ag-Cu alloys retained a silvery tint using two methods: either by using an enriched silver surface to conceal a copper core, or by adding arsenic and antimony to the alloy. For the question of provenance, we applied a mixing model which simulates the contribution of up to three end members to the isotopic composition of the studied samples. The model demonstrates that for most samples, the more likely combination is that they are alloys of silver from Aegean-Anatolian ores, Pb-poor copper, and Pb-rich copper from local copper mines in the Arabah valley (Timna and Faynan). Another, previously suggested possibility, namely that a substantial part of the silver originated from the West Mediterranean, cannot be validated analytically. Contextualizing these results, we suggest that the Bronze Age collapse around the Mediterranean led to the termination of silver supply from the Aegean to the Levant in the beginning of the 12th century BCE, causing a shortage of silver. The local administrations initiated sophisticated devaluation methods to compensate for the lack of silver—a suspected forgery.
It is further suggested that following the Egyptian withdrawal from Canaan around the mid-12th century BCE, Cu-Ag alloying continued, with the use of copper from Faynan instead of Timna. The revival of long-distance silver trade is evident only in the Iron Age IIA (starting ~950 BCE), when silver was no longer alloyed with copper, and was imported from Anatolia and the West Mediterranean. [Keywords: silver hoards, alloys, lead isotopic analysis, debasement, arsenic, Bronze age collapse, Mediterranean trade]
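The three-end-member mixing idea can be sketched generically (an illustrative reconstruction with invented numbers, not the authors' actual model): the Pb-isotope ratio of an alloy is the average of its end-members' ratios weighted by mass fraction times Pb concentration, so one can grid-search for mass fractions of candidate end-members that reproduce a measured ratio.

```python
# Generic sketch of a three-end-member lead-isotope mixing model
# (illustrative values, not the paper's data). The Pb-isotope ratio of
# a mixture is the average of the end-member ratios weighted by each
# component's mass fraction times its Pb concentration.

# Hypothetical end-members: (206Pb/204Pb ratio, Pb concentration in ppm)
end_members = {
    "Aegean silver":  (18.85, 500.0),
    "Pb-poor copper": (18.20, 20.0),
    "Pb-rich copper": (17.95, 2000.0),
}

def mixed_ratio(fractions):
    """Concentration-weighted isotope ratio of a three-component mixture."""
    num = sum(f * c * r for f, (r, c) in zip(fractions, end_members.values()))
    den = sum(f * c for f, (r, c) in zip(fractions, end_members.values()))
    return num / den

def best_fit(measured, step=0.01):
    """Grid-search mass fractions (summing to 1) that best match a ratio."""
    best, best_err = None, float("inf")
    steps = int(round(1 / step))
    for i in range(steps + 1):
        for j in range(steps + 1 - i):
            f = (i * step, j * step, 1 - (i + j) * step)
            err = abs(mixed_ratio(f) - measured)
            if err < best_err:
                best, best_err = f, err
    return best

# Fractions of (silver, Pb-poor Cu, Pb-rich Cu) fitting a measured ratio:
print(best_fit(18.10))
```

Real provenance work fits several isotope ratios simultaneously and propagates measurement error; this sketch only shows why a single measured ratio can be consistent with a family of mixtures rather than a unique source.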
2021-hodgson.pdf: “Financial institutions and the British Industrial Revolution: did financial underdevelopment hold back growth?”, Geoffrey M. Hodgson (2021-01-18):
This scoping paper addresses the role of financial institutions in empowering the British Industrial Revolution. Prominent economic historians have argued that investment was largely funded out of savings or profits, or by borrowing from family or friends: hence financial institutions played a minor role. But this claim sits uneasily with later evidence from other countries that effective financial institutions have mattered a great deal for economic development. How can this mismatch be explained? Despite numerous technological innovations, from 1760 to 1820 industrial growth was surprisingly low. Could the underdevelopment of financial institutions have held back growth? There is relatively little data to help evaluate this hypothesis. More research is required on the historical development of institutions that enabled finance to be raised. This would include the use of property as collateral. This paper sketches the evolution of British financial institutions before 1820 and makes suggestions for further empirical research. Research in this direction should enhance our understanding of the British Industrial Revolution and of the preconditions of economic development in other countries.
2021-meyer.pdf: “The Use and Misuse of Income Data and Extreme Poverty in the United States”, Bruce D. Meyer, Derek Wu, Victoria Mooers, Carla Medalia (2021):
Recent research suggests that the share of US households living on less than $2/person/day is high and rising.
We reexamine such extreme poverty by linking SIPP and CPS data to administrative tax and program data.
We find that more than 90% of those reported to be in extreme poverty are not, once we include in-kind transfers, replace survey reports of earnings and transfer receipt with administrative records, and account for ownership of substantial assets. More than half of all misclassified households have incomes from the administrative data above the poverty line, and many have middle-class measures of material well-being.
2021-ruttan.pdf: “Instrumental use erodes sacred values”, Rachel L. Ruttan, Loran F. Nordgren (2021-01-21):
A fundamental feature of sacred values like environmental protection, patriotism, and diversity is individuals’ resistance to trading off these values in exchange for material benefit. Yet, for-profit organizations increasingly associate themselves with sacred values to increase profits and enhance their reputations.
In the current research, we investigate a potentially perverse consequence of this tendency: that observing values used instrumentally (i.e., in the service of self-interest) subsequently decreases the sacredness of those values. Seven studies (n = 2,785) demonstrate support for this value corruption hypothesis. Following exposure to the instrumental use of a sacred value, observers held that value as less sacred (Studies 1–6), were less willing to donate to value-relevant causes (Studies 3 and 4), and demonstrated reduced tradeoff resistance (Study 7). We reconcile the current effect with previously documented value protection effects by suggesting that instrumental use decreases value sacredness by shifting descriptive norms regarding value use (Study 3), and by failing to elicit the same level of outrage as taboo tradeoffs, thus inhibiting value protective responses (Studies 4 and 5).
These results have important implications: People and organizations that use values instrumentally may ultimately undermine the very values from which they intend to benefit. [Keywords: self-interest, morality, values, sacred values, prosocial behavior]
2021-tarduno.pdf: “The congestion costs of Uber and Lyft”, Matthew Tarduno (2021-03):
Applying difference-in-differences and regression discontinuity specifications to high-frequency traffic data, I estimate that Uber and Lyft together decreased daytime traffic speeds in Austin by roughly 2.3%. Using Austin-specific measures of the value of travel time, I translate these slowdowns into estimates of citywide congestion costs that range from $33 to $52 million annually. Back-of-the-envelope calculations imply that these costs are similar in magnitude to the consumer surplus provided by TNCs in Austin.
Together these results suggest that while TNCs may impose modest travel time externalities, restricting or taxing TNC activity is unlikely to generate large net welfare gains through reduced congestion.
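The back-of-the-envelope logic can be sketched in a few lines (all inputs below are hypothetical placeholders, not the Austin-specific measures the paper uses): a fractional speed decrease v raises travel time by roughly v/(1-v), and multiplying the extra vehicle-hours by a value of travel time gives an annual cost.

```python
# Back-of-the-envelope congestion cost from a citywide slowdown
# (hypothetical inputs; the paper derives Austin-specific values).

slowdown = 0.023            # 2.3% decrease in daytime traffic speeds
daily_vmt = 10_000_000      # hypothetical daytime vehicle-miles traveled
baseline_speed_mph = 30.0   # hypothetical average daytime speed
value_of_time = 15.0        # hypothetical $/vehicle-hour of travel time
days_per_year = 365

# A fractional speed drop v raises travel time by roughly v / (1 - v).
extra_time_factor = slowdown / (1 - slowdown)
baseline_hours = daily_vmt / baseline_speed_mph
extra_hours_per_day = baseline_hours * extra_time_factor

annual_cost = extra_hours_per_day * value_of_time * days_per_year
print(f"~${annual_cost / 1e6:.0f} million per year")
```

With these placeholder inputs the calculation lands in the tens of millions of dollars per year, the same order of magnitude as the paper's $33–52 million range.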
2021-yamada.pdf: “The long–term causal effect of U.S. bombing missions on economic development: Evidence from the Ho Chi Minh Trail and Xieng Khouang Province in Lao P.D.R”, Takahiro Yamada, Hiroyuki Yamada (2021-05-01):
- Investigates the long-term causal effects of bombings on later economic development.
- Focus on Laos, one of the most intensely bombed countries per capita in history.
- Use granular grid data (nightlights and population) as proxies for economic development.
- No robust effects of bombings in southern Laos, but some effects in northern Laos.
- No within-country conditional economic convergence, which could be Lao-specific.
This study investigates the long-term causal effects of U.S. bombing missions during the Vietnam War on later economic development in Laos. Following an instrumental variables approach, we use the distance between the centroid of village-level administrative boundaries and heavily bombed targets, namely, the Ho Chi Minh Trail in southern Laos and Xieng Khouang Province in northern Laos, as an instrument for the intensity of U.S. bombing missions. We use three datasets of mean nighttime light intensity (1992, 2005, and 2013) and two datasets of population density (1990 and 2005) as outcome variables. The estimation results show no robust long-term effects of U.S. bombing missions on economic development in southern Laos but show negative effects in northern Laos, even 40 years after the war. We also found that the results do not necessarily support the conditional convergence hypothesis within a given country, although this result could be unique to Laos. [Keywords: conflict damage, economic development, conditional convergence hypothesis, Lao P.D.R]
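The identification strategy is standard two-stage least squares: distance to the heavily bombed target instruments for bombing intensity, and the fitted intensity from the first stage is then used to explain the development outcome. A minimal sketch on simulated data (the coefficients and data below are invented for illustration, not the paper's):

```python
# Minimal two-stage least squares (2SLS) sketch on simulated data,
# mirroring the design: distance to a heavily bombed target instruments
# for bombing intensity; nighttime lights is the outcome.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

distance = rng.uniform(0, 100, n)      # km to the bombed target (simulated)
confounder = rng.normal(size=n)        # unobserved local conditions
bombing = 50 - 0.4 * distance + 2 * confounder + rng.normal(size=n)
# True causal effect of bombing on nightlights set to -0.5; the
# confounder biases a naive OLS of nightlights on bombing.
nightlights = 10 - 0.5 * bombing - 2 * confounder + rng.normal(size=n)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X1 = np.column_stack([np.ones(n), distance])
bombing_hat = X1 @ ols(X1, bombing)    # first stage: fitted intensity

X2 = np.column_stack([np.ones(n), bombing_hat])
beta = ols(X2, nightlights)            # second stage
print(f"2SLS estimate of bombing effect: {beta[1]:.2f}")  # close to -0.5
```

Because distance affects the outcome only through bombing intensity in this simulation, the second-stage slope recovers the true effect despite the unobserved confounder; the exclusion restriction is exactly what the paper must defend for its real data.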