Working Paper Abstracts – 1990



This paper introduces a utility function that nests three classes of utility functions: (1) time-separable utility functions; (2) “catching up with the Joneses” utility functions that depend on the consumer’s level of consumption relative to the lagged cross-sectional average level of consumption; and (3) utility functions that display habit formation. Closed-form solutions for equilibrium asset prices are derived under the assumption that consumption growth is i.i.d. The equity premia under catching up with the Joneses and under habit formation are, for some parameter values, as large as the historically observed equity premium in the United States.
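One utility specification with exactly this nesting property, in the spirit of Abel's formulation (a sketch; the paper's exact functional form may differ), is:

```latex
u(c_t; v_t) = \frac{(c_t / v_t)^{1-\alpha}}{1-\alpha},
\qquad
v_t = \left[ c_{t-1}^{\,D} \, C_{t-1}^{\,1-D} \right]^{\gamma},
\quad \gamma \ge 0, \; 0 \le D \le 1,
```

where c is the consumer's own consumption and C is the lagged cross-sectional average. Setting γ = 0 yields time-separable CRRA utility; D = 0 with γ > 0 yields "catching up with the Joneses" (the benchmark v depends only on lagged aggregate consumption); D = 1 with γ > 0 yields habit formation (v depends only on the consumer's own lagged consumption).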


A project with negative expected value cannot obtain financing in competitive capital markets if all potential investors are risk neutral and have identical beliefs about the distribution of the project’s net revenue. We present a series of examples with heterogeneous beliefs in which it is possible for a project to obtain financing even though all investors in the project believe, conditional on the project being undertaken, that the project has a negative expected value. An important feature of the examples is that the differences in beliefs are due only to differences in information, and are not simply arbitrary unexplained differences in opinions.


The dynamic behavior of security prices is studied in a setting where two agents trade strategically and learn from market prices. Each trader receives a private signal about fundamentals, the significance of which depends on the signal received by the other trader. In trading, each agent wants to deceive the other trader into revealing his signal, while not revealing his own signal. We show that trade is self-generating because agents learn the value of the asset only through observation of the market price. Uninformed agents, technical analysts, can also trade by charting past prices. These chartists ensure market efficiency. Equilibrium price paths of the model may display reversals in which all traders rationally revise their beliefs, first in one direction and then in the opposite direction, even though no new information has entered the system. A piece of information which is initially thought to be bad news may be revealed, through trading, to be good news. This fad-like behavior results from rational strategic interaction and Bayesian inference. In this model security prices do not follow a martingale.


This research examines a detailed time series of future aggregate profit forecasts. It seeks to determine whether the path of equity prices in the twelve-month period surrounding the October 1987 crash could be justified by changes in the fundamental determinants of stock prices, i.e., the present value of expected future corporate profits. It does not address what caused the shifts in expectations of future corporate profits, but whether there was a sufficiently large change in these forecasts to justify the movement of stock prices.

The paper concludes that the rise and subsequent fall of stock prices in 1987 cannot be explained by discounting the mean or “consensus” level of future profit forecasts with a constant equity risk premium. It is determined that the required equity risk premium must have changed by more than three percentage points over the period to reconcile the consensus valuations with the actual time series of stock prices.

Two explanations are offered for the deviation of actual stock prices from the consensus valuation. The first analyzes whether changes in the required equity risk premium were correlated with the cross-sectional dispersion of the forecasts of future corporate profits. The second explanation, using a Lintner model of heterogeneously informed investors with a constant equity risk premium, examines whether changes in an index of investor “sentiment” could explain stock prices in 1987.

The study finds that both the one-year ahead dispersion of profit forecasts and the indicators of investor sentiment are independently significant in explaining the deviation between the S&P 500 Index and the consensus valuations of corporate equity during this period. Therefore one cannot reject the hypothesis that the stock market behavior during 1987 was accompanied by changes in the perceived risk of the future stream of corporate profits or shifting sentiment between those investors adhering to the optimistic and pessimistic profit forecasts. Changes in consensus valuations alone cannot explain the movements in the market.
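The valuation exercise described above can be illustrated schematically: discount a stream of consensus profit forecasts at the riskless rate plus an equity risk premium, then solve for the premium that reconciles the resulting value with the observed index level. The sketch below (hypothetical Gordon-growth terminal value and bisection search; not the authors' actual model) shows the mechanics:

```python
def consensus_value(forecasts, terminal_growth, riskfree, premium):
    """Present value of a stream of profit forecasts plus a Gordon-growth
    terminal value, discounted at riskfree + equity risk premium."""
    r = riskfree + premium
    pv = sum(f / (1 + r) ** (t + 1) for t, f in enumerate(forecasts))
    terminal = forecasts[-1] * (1 + terminal_growth) / (r - terminal_growth)
    return pv + terminal / (1 + r) ** len(forecasts)

def implied_premium(price, forecasts, terminal_growth, riskfree,
                    lo=0.0, hi=0.20, tol=1e-8):
    """Bisect for the premium that equates the consensus valuation
    to the observed price (value is decreasing in the premium)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if consensus_value(forecasts, terminal_growth, riskfree, mid) > price:
            lo = mid  # valuation too high -> a larger premium is needed
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Tracking the implied premium day by day through 1987 is the kind of computation that reveals the three-percentage-point swing the paper reports.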


Despite its importance to our national wealth, real estate remains a relatively unstudied asset. A key reason for this is the perceived limitations of available return data. Most research in the area relies on appraisal-based series such as the Frank Russell Company (FRC) index. This paper suggests that the stock market provides a ready source of transactions-based data. Many real estate researchers are skeptical of stock market data because studies of real estate investment trusts (REITs) find that their returns are more correlated with the broader stock market than with other known real estate series. A typical conclusion is that traded real estate securities do not accurately reflect real estate values.

We argue that such conclusions are premature. We simulate the returns and risks of portfolios of different types of real estate firms using historical stock prices for the 1962-1988 period. We use the time series of the real estate portfolio returns to estimate the correlation structure of returns among real estate securities, common stocks, long-term Treasury bonds, and inflation. Estimation of factor models documents a particularly strong small stock component to traded real estate stock returns. We also find substantial heterogeneity in return patterns across different types of real estate firms. Finally, we demonstrate that our real estate portfolios are not unrelated to appraisal-based indexes. Lagged values of the real estate portfolio returns can predict returns on the FRC index.


The paper derives the government budget constraint and studies the sustainability of deficits in a stochastic, dynamically efficient economy. Contrary to the intuition based on certainty models, policies with permanent expected primary deficits can be sustainable. Even an infinite string of realized primary deficits does not necessarily provide evidence against sustainability.

Moreover, one has to be careful in discounting future fiscal variables. Even if the government finances deficits by issuing safe debt, the safe interest rate cannot be used in transversality conditions and in computing present values.

The stochastic setting allows one to reconcile dynamic efficiency with a safe interest rate below the average rate of economic growth. Evidence that the U.S. government has run average primary deficits and that government bond returns have been below the growth rate over long periods, combined with evidence on dynamic efficiency from Abel (1989), suggests that the sustainability results for the stochastic, dynamically efficient economy are highly relevant for assessing U.S. fiscal policy.


A representative-agent pricing model with time-varying moments of consumption growth is used to analyze implications about means and volatilities of equity returns and interest rates, first-order autocorrelations of equity returns for various investment horizons, and R²’s in projections of equity returns for various horizons on predetermined financial variables. An analysis using non-expected-utility preferences reveals that high risk aversion is key in matching empirical benchmarks for average returns, but low intertemporal substitution is important in obtaining implications corresponding to estimates of volatilities, autocorrelations, and the predictability of returns.
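The separation of risk aversion from intertemporal substitution that drives these results is typically achieved with a recursive, Epstein-Zin-style aggregator (shown here as a generic sketch, not necessarily the paper's exact specification):

```latex
U_t = \Big[ (1-\beta)\, c_t^{\,1-\rho}
      + \beta \big( \mathrm{E}_t \big[ U_{t+1}^{\,1-\alpha} \big] \big)^{\frac{1-\rho}{1-\alpha}}
      \Big]^{\frac{1}{1-\rho}},
```

where α governs risk aversion and 1/ρ the elasticity of intertemporal substitution; the two can be set independently, and α = ρ recovers standard time-separable expected utility.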

no abstract


Many analyses of debt policy assume exogenous government expenditures. Instead, I use an optimizing model in which the government endogenously selects values of taxes, spending, and debt to maximize welfare. If demand for publicly provided goods is elastic, a debt-financed tax cut increases consumption, because individuals rationally expect some reduced government spending in the future. Even though future taxes rise, they do not offset the expansionary effect of the current tax cut on consumption. Depending on preferences, the marginal propensity to consume out of tax cuts can take any value between zero and the marginal propensity out of ordinary income.


Recent work demonstrates that dynastic assumptions guarantee the irrelevance of all redistributional policies, distortionary taxes, and prices; the neutrality of fiscal policy (Ricardian equivalence) is only the “tip of the iceberg.” In this paper, we investigate the possibility of reinstating approximate Ricardian equivalence by introducing a small amount of friction in intergenerational links. If Ricardian equivalence depends upon significantly shorter chains of links than do these stronger neutrality results, then friction may dissipate the effects that generate strong neutrality, without significantly affecting the Ricardian result. Although this intuition turns out to be essentially correct, we show that models with small amounts of friction have other untenable implications. We conclude that the theoretical case for Ricardian equivalence remains tenuous.


Banking panics are the central event informing and rationalizing government intervention into the banking industry. In the last decade progress has been made in understanding the origins of panics. This essay reviews recent theoretical and empirical work on the origins of banking panics. New evidence on the causes of banking panics is introduced. Banking panics do not appear to have been caused by random withdrawal risks associated with seasonal shocks in the countryside. Instead, adverse economic news, in concert with asymmetric information about the incidence of shocks, and problems of bank asset diversification associated with unit banking seem to have led to banking panics.


This paper analyzes a market where investors observe the intermediate stages of price formation and can revise their orders as prices are determined. A trading mechanism that exhibits this property is said to be transparent. The issue of market transparency arises in many current policy issues such as the timing and quality of trade reporting, sunshine trading, and the effectiveness of publicizing order imbalances to reduce price volatility. We first examine a non-transparent system where traders submit written orders before market clearing. We then contrast this system with a transparent market mechanism where traders can revise their orders during the price formation process. This system is shown to have the same equilibrium as a system where information on shocks to order flow is directly disclosed to market participants. Throughout, trading is modeled as a game between strategic traders with rational expectations. Contrary to popular intuition, transparency can increase price variability and lower liquidity.


Prior to the Securities Exchange Act of 1934, manipulation of stock prices was an issue of great concern. The Act reduced the possibilities for manipulation by, among other things, making it illegal for a manager to sell short his firm’s shares or for false information about a firm to be released. This paper asks whether an uninformed raider can profitably manipulate stock prices simply by buying and selling shares. It is shown that in a rational expectations framework where all agents maximize utility, it is possible for an uninformed raider to manipulate stock prices profitably, provided investors attach a positive probability to the raider being informed.


This paper uses long-horizon autocorrelations and variance ratio statistics to test for long-term mean reversion in real exchange rates. Unlike most previous tests of this hypothesis, the tests do reject a random walk for monthly data in the post-Bretton Woods era; however, the statistics indicate that positively-correlated innovations, rather than mean reversion, are the source of the rejection. Tests using annual data for the twentieth century also reject the random walk. In this case, however, the rejection can be attributed to mean reversion and confirms that PPP is a long-term phenomenon.
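The variance ratio statistic referred to above compares the variance of q-period returns with q times the variance of one-period returns; values below one point toward mean reversion, values above one toward positively correlated innovations. A minimal sketch (population variances over overlapping sums, without the bias and heteroskedasticity corrections used in formal tests):

```python
import numpy as np

def variance_ratio(returns, q):
    """Variance of overlapping q-period return sums divided by
    q times the variance of one-period returns."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    mu = r.mean()
    var1 = np.sum((r - mu) ** 2) / n
    # overlapping q-period sums
    rq = np.array([r[i:i + q].sum() for i in range(n - q + 1)])
    varq = np.sum((rq - q * mu) ** 2) / len(rq)
    return varq / (q * var1)
```

Strongly mean-reverting returns (e.g., alternating +1, -1) drive the ratio toward zero, while positively autocorrelated returns push it above one, which is the distinction the paper uses to interpret its rejections of the random walk.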


This paper examines the role of the market maker in intertemporal price formation in securities markets. We argue that the market maker, in performing the critical function of price discovery, may set prices in a dynamic context that would be suboptimal in a single-period context in order to learn more from the resulting order flow. Such actions constitute an investment in the production of information. Necessary and sufficient conditions for the existence of such price strategies are developed and several examples are presented.


This paper analyzes and contrasts the process of price formation under two alternative trading mechanisms: a continuous quote-driven system where dealers post prices before order submission and an order-driven system where traders submit their orders before prices are determined. The order-driven system can operate either as a continuous auction, providing immediate order execution, or as a periodic auction, where orders are batched for execution at a pre-determined time. Throughout, trading is modeled as a game where order quantities and beliefs are determined endogenously and agents act strategically. We show that prices in the continuous dealer system are more efficient and less variable than prices in the continuous auction system. The auction mechanism is more robust in the sense that it can operate where the dealer mechanism cannot, but the reverse is not true. The two mechanisms are equivalent only in a ‘large’ market.

We demonstrate that a periodic mechanism, by pooling orders for simultaneous execution, can overcome the problems of information asymmetry that cause failure in a continuous mechanism where trading takes place sequentially. When both mechanisms are viable, a periodic system offers greater price efficiency but requires traders to sacrifice continuity. The results provide a partial explanation for differences in market organization across assets and markets.


The paper examines the sustainability of fiscal policies in a stochastic economy, with a particular focus on two benchmark policies, balanced budgets and tax smoothing. These policies are typically sustainable if lump-sum taxes are available, but they are generally not sustainable in a stochastic environment if taxation is constrained by the size of the economy. The sustainability problems arise because the debt-income ratio becomes excessive whenever there are sufficiently low realizations of aggregate income.

I also compute the probability of reaching high debt-income ratios with different policies and I discuss the role of debt management for sustainability. It turns out that balanced budgets can be sustained forever with very high probability (but less than one), but so can policies with permanent primary or with-interest deficits. I argue that debt management is important for sustainability in a stochastic environment and show that the use of state-contingent government liabilities can be helpful in designing sustainable versions of tax-smoothing and balanced budget policies.


An evaluation of the performance of foreign exchange hedges shows that, in a mean-variance framework, fully hedging exchange risk does not improve the performance of a portfolio of international equities; however, dynamic strategies which incorporate market information on risk premiums in the forward market are shown to statistically improve performance.


The period prior to the U.S. Civil War saw the introduction and rapid diffusion of the railroad. It was also the Free Banking Era (1838-1863) during which some states allowed relatively free entry into banking. Banks in all states issued distinct private monies, called bank notes, which circulated at discounts from face value in secondary markets at locations away from the issuing bank. This paper proposes a pricing model for bank notes, and then, using a newly discovered data set of monthly bank note prices for all banks in North America, studies the secondary market for privately issued bank notes during the American Free Banking Era, 1838-1863. To test the model, the durations and costs of trips from Philadelphia to other locations are constructed from pre-Civil War travellers’ guides in order to measure improvements resulting from the diffusion of the railroad during this period. The results suggest that the note market accurately prices risk. Systematic wildcat banking was not possible. The transportation costs of note redemption explain only part of bank note discount variation. Bank default risk was differentially priced and such risk premia varied cyclically.


An exchange-rate system is a set of contracts which commits central banks to intervene in the foreign-exchange market. The design features of the system include the rules of intervention, the limits placed on exchange rates, and the “crisis scenario” which describes possible transitions to new regimes in case one central bank runs out of reserves or borrowing capacity. This paper considers the various trade-offs one faces in designing an exchange-rate system. Svensson (1989) has already analyzed the degree of variability in the exchange rate, the interest rate, and the fundamentals. But the trade-off also pertains to the amount of reserves which the central banks must have on hand in order to forestall a speculative attack and make the system sustainable. The amount of reserves needed depends crucially on the assumed crisis scenario.


It is widely recognized that heterogeneous information across traders plays an important role in generating financial market activity. However, the predictions of any model of financial markets depend completely on the equilibrium concept used to solve the model. The choice of equilibrium subsumes assumptions and implications concerning: the degree to which traders make use of price as information, the amount of noise trading in the market, the information content of prices, the effect on prices of differences in beliefs, etc. An empirical analysis of equilibrium formation serves as an implicit test of the relative importance of these different properties.

We devise tests that distinguish between competitive (Walrasian), fully revealing rational expectations, and noisy rational expectations equilibria based on their comparative static predictions concerning trading volume around public information signals. These tests are implemented using data on stock market trading volume, price changes, and changes in analysts’ earnings forecasts around interim earnings announcements. Empirical results strongly support the noisy rational expectations hypothesis. This indicates that a significant amount of noise trading exists (so that private information has value) but not enough to obfuscate entirely the information content of price. Our analysis also indicates that the dispersion of private information across traders has an impact on trading volume, but not on price. Finally, we explore the implications of our results for asset pricing and volatility, as well as for certain “anomalous” phenomena observed in financial markets.


Much of economic theory is concerned with understanding price determination in competitive markets. Such theories assume that all individuals continuously participate in one giant market where they can express their demands for all assets simultaneously as a function of a giant price vector. This assumption of simultaneous and continuous participation in all markets is inconsistent with two important facts: First, it is costly for an individual or an institution to continuously express demands in any single market, and second it is simply impossible to trade in all markets simultaneously. These two facts create a need for intermediaries. Much is known about the role of intermediaries as principals who add liquidity to markets by trading on their own account. However, far less is known about the informational role of intermediaries. In this paper, we will analyze the consequences of the fact that intermediaries play a fundamental role as repositories of information.


Only one-fourth of U.S. families own stock. This paper examines whether the consumption of stockholders differs from the consumption of non-stockholders and whether these differences help explain the empirical failures of the consumption-based CAPM. Household panel data are used to construct time series on the consumption of each group. The results indicate that the consumption of stockholders is more volatile than that of non-stockholders and is more highly correlated with the excess return on the stock market. These differences help explain the size of the equity premium, although they do not fully resolve the equity premium puzzle.


This paper examines the risks and returns of long-term low-grade bonds for the period 1977-1989. We find: (1) Low-grade bonds realized higher returns than higher-grade bonds and lower returns than common stocks. Also, low-grade bonds exhibited less volatility than higher-grade bonds due to their call features and high coupons. (2) There is no relation between the age of low-grade bonds and their realized returns. Cyclical factors explain much of the observed relation between default rates and bond age. (3) Low-grade bonds behave like both bonds and stocks. Despite this complexity, there is no evidence that low-grade bonds are systematically over- or under-priced.


In this report we investigate the impact of the recent increase in indebtedness in the United States on tax receipts, economic stability, and economic efficiency. Evidence reported in this study shows that increased indebtedness in the corporate sector has a significant cost to the Treasury in terms of reduced tax receipts. This reduction in corporate tax receipts is not fully offset by an increase in personal taxes. It appears that corporate restructuring, together with the build-up of debt, is worrisome and bears careful monitoring, but it does not seem to lead to a financial crisis. Without access to more data, the issue of economic stability is not settled, and it is difficult to make a clear case.


Standard contract theory suggests that in an optimal contract, payments should be contingent on many events, but in practice this rarely happens. For example, financial securities typically do not make payments contingent on accounting information. This paper develops a theory to explain these missing contingencies. The first important element of the theory is that contracts are based on signals produced by measurement systems which are manipulable. The second is that the contracting parties have incomplete information about each other’s type. Given these two assumptions, it is shown that in equilibrium a non-competitive contract is optimal.


A defining characteristic of bank loans is that they are not resold once created. Yet, in 1989 about $240 billion of commercial and industrial loans were sold, compared to trivial amounts five years earlier. Selling loans without explicit guarantee or recourse is inconsistent with theories of the existence of financial intermediation. What has changed to make bank loans marketable? In this paper we test for the presence of implicit contractual features of bank loan sales contracts that could explain this inconsistency. In addition, the effect of technological progress on the reduction of information asymmetries between loan buyers and loan sellers is considered. The paper tests for the presence of these features and effects using a sample of over 800 recent loan sales.


This paper looks at prices of S&P 500 futures options over 1985-87 to see whether there were any expectations prior to October 1987 of an impending stock market crash. Two approaches are used. First, it is shown that the crash insurance embodied in out-of-the-money puts started commanding an unusually high price in a well-defined sense during the year preceding the crash. Second, a model is derived for pricing American futures options when the underlying asset price follows a jump-diffusion with systematic jump risk. The jump-diffusion parameters implicit in S&P 500 futures options prices are estimated daily. The results indicate that negative jumps were expected and that implicit distributions became strongly negatively skewed during the year preceding the crash. These results cannot be explained by other standard option pricing models. Both approaches indicate no strong fears of a crash during the two months immediately preceding the crash.
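For intuition on the second approach, a jump-diffusion option price can be computed for European options as a Poisson-weighted mixture of Black-Scholes prices, following Merton's (1976) lognormal-jump formula. This is only an illustrative sketch: the paper prices American futures options with systematic jump risk, which requires a different treatment.

```python
import math
from statistics import NormalDist

N = NormalDist().cdf  # standard normal CDF

def black_scholes_call(S, K, r, sigma, T):
    """Standard Black-Scholes European call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def merton_jump_call(S, K, r, sigma, T, lam, mu_j, delta, n_terms=50):
    """Merton jump-diffusion call: Poisson-weighted average of
    Black-Scholes prices with jump-adjusted drift and variance.
    lam: jump intensity; mu_j, delta: mean and std of log jump size."""
    k = math.exp(mu_j + 0.5 * delta ** 2) - 1.0  # expected proportional jump
    lam_p = lam * (1.0 + k)
    total = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma ** 2 + n * delta ** 2 / T)
        r_n = r - lam * k + n * (mu_j + 0.5 * delta ** 2) / T
        weight = math.exp(-lam_p * T) * (lam_p * T) ** n / math.factorial(n)
        total += weight * black_scholes_call(S, K, r_n, sigma_n, T)
    return total
```

Fitting the jump parameters (lam, mu_j, delta) to a cross-section of option prices each day, as the paper does, is how one recovers the negatively skewed implicit distributions it reports; a negative mu_j with high lam makes crash insurance expensive.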