What's happening in monetary policy and macroeconomics.
Sunday, October 30, 2011
Christina Romer on NGDP Targeting
Christina Romer thinks that NGDP targeting will solve the world's problems. According to Christina, NGDP targeting will: (i) remove dissent from the FOMC; (ii) improve communication about monetary policy to the public; (iii) improve confidence; (iv) lower borrowing costs; (v) increase private spending. This reminds me of a Tom Waits song, "Step Right Up." Here's an excerpt:
That's right, it filets, it chops, it dices, slices
Never stops, lasts a lifetime, mows your lawn
And it mows your lawn and it picks up the kids from school
It gets rid of unwanted facial hair
It gets rid of embarrassing age spots
It delivers a pizza, and it lengthens, and it strengthens
And it finds that slipper that's been at large under the chaise lounge for several weeks
And it plays a mean Rhythm Master
It makes excuses for unwanted lipstick on your collar
And it's only a dollar, step right up, it's only a dollar, step right up
'Cause it forges your signature
If not completely satisfied, mail back unused portion of product
For complete refund of price of purchase
See my earlier post on NGDP targeting.
Bitcoin Update
Apparently setting up an alternative monetary system is not so easy. See this update on Bitcoin, and my earlier post.
Note: I had previously linked to David Andolfatto's blog, but it looks like that is defunct.
Tom Sargent
There is an interview with Tom Sargent in the Sunday NYT. Sargent makes the point that modern macroeconomics is neither political nor inherently un-Keynesian. From the article:
Professor Sargent described himself as a scientist, a “numbers guy” who is “just seeking the truth” as any good researcher does.
Further,
“If you go to seminars with guys who are actually doing the work and are trying to figure things out, it’s not ideological,” he said. “Half the people in the room may be Democrats and half may be Republicans. It just doesn’t matter.”
Today, Professor Sargent says that in some ways he actually is a Keynesian, but he qualified that claim, too. “I’m happy to say I am a Harrison-Kreps-Keynesian,” he said, citing work by two scholars at Stanford, J. Michael Harrison and David M. Kreps. They developed a theory of speculative investor behavior and stock-bubble formation that subtly modifies rational expectations “in a beautiful way” and “captures Keynes’s argument, makes it rigorous, and pushes it further,” he said.
Sargent is kidding us a bit here. What he finds interesting about Keynes is an idea formalized and developed in this paper by Harrison and Kreps. When Sargent says he is "actually a Keynesian," that's not a Hicks IS-LM Keynesian, as a reading of Harrison-Kreps should make clear.
Here's something interesting. Lucas and Sargent once wrote a paper, "After Keynesian Macroeconomics," that you might think, from the title, is an exercise in Keynes-bashing. Actually, not so. Here is the last paragraph:
The objectives of equilibrium business cycle theory are taken, without modification, from the goal which motivated the construction of Keynesian macroeconometric models: to provide a scientifically-based means of assessing, quantitatively, the likely effects of alternative economic policies. Without the econometric successes achieved by the Keynesian models, this goal would be simply inconceivable. However, unless the now-evident limits of these models are frankly acknowledged and radically new directions taken, the real accomplishments of the Keynesian revolution will be lost as surely as those we know to be illusory.
Saturday, October 29, 2011
Open Market Operations and Non-Neutralities of Money
Matt Rognlie and I are having a conversation in the comment thread of this previous post, which I'm sure most of you have lost track of. Here's a summary of the basic issues: One of my complaints with New Keynesian economics is that it skirts around most of what is interesting for me about monetary policy and how it works. In mainstream monetary models, e.g. standard representative agent models with cash-in-advance constraints, non-neutralities of money are restricted to the effects of unanticipated money and inflation. Monetary policy matters due to distortions in intertemporal prices, for example the anticipation of higher money growth and higher inflation acts as a tax on labor supply and reduces output. Further, the nominal interest rate increases due to a Fisher effect. Mike Woodford looked at those effects and thought that they did not matter much in practice, or that they had the wrong signs, and he wrote down models where he could dispense with those types of intertemporal distortions entirely. In basic New Keynesian models we do not worry about the details of monetary exchange, it is assumed that the central bank can choose the short-term nominal interest rate at will, and monetary policy has real effects because of relative price distortions due to sticky prices and wages.
My contention is that one cannot analyze monetary policy without modeling the role of central bank liabilities and other assets in exchange, and the role of the central bank as a financial intermediary. This need not involve substituting for New Keynesian-type effects. One can easily take the approach of being explicit about exchange, the central bank balance sheet, and central bank intermediation activity, and include the sticky prices and wages if one really can't live without them.
In this paper, one of the results I get is a particular non-neutrality of money. Prices are flexible, so it's certainly not a New Keynesian effect, and it's different from what you get in mainstream monetary models. There are essentially two classes of assets - currency and various other assets (government bonds, loans) which may be fundamentally illiquid but are made liquid (though not as liquid as currency) by financial intermediaries. A standard open market purchase (think of this as normal times) will ultimately increase the stock of currency in nominal terms, with no change in the real stock of currency, but the real stock of other assets declines, those assets become more scarce, the real interest rate falls, and lending increases. Essentially, this is an illiquidity effect.
In reply to Matt's last set of comments:
1. [Here he's discussing the effect of the open market operation]
But ultimately I have severe doubts that this channel makes much of a quantitative difference. When the Fed adjusts policy through open market operations, over the short to medium term it's making purchases in the tens of billions of dollars; maybe $100 billion at the very most. Meanwhile, the MZM money stock is $10 trillion, and that's an underestimate of the true size of the universe of liquid assets. Fiscal shocks happen all the time that adjust the quantity of liquid government debt by much more than Fed operations normally do; if you're positing that this is an important channel for the effects of Fed policy, it follows that the Fed is at most a minor sideshow next to the Treasury. That doesn't ring empirically true to me.
First, in my model, there is not an increase in the yield spread "between government debt and other liquid securities." To keep things simple, I put assets into two classes. In the second class there is everything that is not currency, and I assumed that all that stuff (government interest-bearing debt and loans) could be intermediated in the same way. In a more elaborate model, one might imagine assets with different degrees of liquidity, but liquidity will be priced according to how assets are intermediated. For example, we might think of a house as highly illiquid, but a mortgage-backed security (MBS) can be highly-liquid, and the MBS is essentially backed by the houses that act as collateral for the mortgage debt that gets chopped up and put into the MBS.
Second, Matt has hit on something interesting in the latter part of the above paragraph, relating to fiscal policy. The Treasury could indeed be more important than the Fed, as it can bring about changes in the total quantity of consolidated government debt outstanding; the Fed can only change the composition. In fact, under current circumstances, the changes in the composition of outstanding debt the Fed can accomplish are irrelevant. Matt seems to think that these things don't "ring empirically true." I say run with the idea.
Matt goes on to discuss how New Keynesian effects work, and how he thinks they are empirically more relevant than what I'm after. You can read the details in the comment thread in the previous post. Two comments:
1. In terms of current events, my model might tell you that our current problem is that liquid assets (the second class of assets - the intermediated non-currency assets) are too scarce, and the real interest rate is too low. New Keynesians tell us the real rate is too high. If we take the New Keynesian line, we have to take a stand on what the "natural" real rate of interest is. That would be the real rate if wages and prices were perfectly flexible. To determine what that rate is we have to determine what the shock was that was driving the recession (and the financial crisis) presumably. I'm not sure what the New Keynesians have in mind there. Also, at first glance, real rates (based on current inflation, current short nominal rates, TIPS yields) look pretty low to me.
2. Some of Matt's arguments are in terms of back-of-the-envelope reasoning about the quantities of government debt and currency relative to other liquid assets. But we know that, through the shadow banking sector, a small quantity of assets, used as collateral, can support a very large quantity of credit activity. This is part of what Gary Gorton has written about. One might not think that things going haywire with a relatively small quantity of mortgage debt could cause such a big problem, but it did. Similarly, small changes in the quantity of interest-bearing government debt outstanding, through the process of rehypothecation, can give potentially very large effects in asset markets. Collateral and rehypothecation are not in my model, but if one were to take it to the data, that might be part of what one would want to include.
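To put a rough number on that back-of-the-envelope point, here is a minimal sketch, in Python, of how re-pledging collateral multiplies the credit a given stock of government debt can support. The reuse fraction, the dollar amounts, and the function name are illustrative assumptions of mine, not figures from Gorton or from my model.

```python
# Illustrative sketch only: if a fraction `reuse` of pledged collateral is
# re-pledged at each step of the chain, B dollars of government debt supports
# B * (1 + reuse + reuse^2 + ...) = B / (1 - reuse) dollars of collateralized credit.

def credit_supported(collateral, reuse, rounds=None):
    """Total credit collateralized when pledged assets are re-used `rounds` times;
    rounds=None gives the infinite-chain limit."""
    if rounds is None:
        return collateral / (1.0 - reuse)
    return collateral * sum(reuse ** k for k in range(rounds + 1))

B = 100.0                                                  # $100 billion of Treasuries (made up)
print(round(credit_supported(B, reuse=0.8), 1))            # 500.0: $500 billion in the limit
print(round(credit_supported(B, reuse=0.8, rounds=3), 1))  # 295.2: after three re-pledges

# Removing $10 billion of collateral (say, via an open market purchase) shrinks
# the supported credit by 10 / (1 - 0.8) = $50 billion in the limit.
print(round(credit_supported(90.0, reuse=0.8), 1))         # 450.0
```

The only point is that the chain multiplies small changes in the collateral stock; the size of the multiplier depends on haircuts and reuse rates that are not pinned down here.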
Wednesday, October 26, 2011
Do We Need a New Keynes?
John Cassidy, a journalist who writes for the New Yorker, wonders where the New Keynes is lurking. Cassidy states:
At the end of the [radio] show, Leonard asked me an interesting question: Has the financial crisis and Great Recession produced any big new economic ideas? My immediate response was that it hasn’t, or, if it has, I wasn’t aware of them.
It seems that what excites journalists is the personal, and the large. A nice, readable story is one that focuses on an individual with a "big" idea that can be stated crisply in a few lines.
Unfortunately for Cassidy, but fortunately for the rest of us as it turns out, modern economics is not set up to give journalists tasty sound bites. In my generation, and in younger ones, it's hard to identify the "big" people with the "big" ideas, as economic research is a collective effort - much more so than in the days of Keynes, or even in the generation of Lucas, Prescott, Sargent, and Sims. The big idea that is helpful in understanding the financial crisis, its causes, and what should have been done or should be done about it, is the idea that was developed by the information theorists of the 1970s - Akerlof, Stiglitz, Townsend, Rothschild, Holmstrom; by the mechanism designers - Hurwicz, Maskin, Myerson; by the monetary theorists - Wallace, Townsend (again), Kiyotaki, Wright; by the financial intermediation theorists - Diamond, Dybvig, Townsend (again), Prescott, Boyd; by the dynamic contracting theorists - Green, Abreu-Pearce-Stacchetti, Atkeson-Lucas; by general-equilibrium financial-frictions people - Gertler, Bernanke, Smith, Kiyotaki-Moore. That idea is much much bigger, and immensely more solid science than Keynes.
Thursday, October 20, 2011
Econometrics, Calibration, and Fights
Paul Krugman is off on another rant about dead issues in macroeconomics, in this post and this one, including the usual discussion about "freshwater guys," who currently exist only in Krugman's mind.
Several points here:
1. In current macroeconomic thought, calibration methods and econometrics are both widely accepted as useful approaches to answering quantitative questions, and these approaches are often mixed in particular projects. Kydland and Prescott pioneered calibration methods in macroeconomics, and applied them to a particular class of models, but the methods themselves are more general than that, and various Old Keynesians and New Keynesians have found calibration useful.
2. Part of what the calibration people were reacting to, was a view among econometricians that quantitative work was about "testing" theories. The problem is that any macroeconomic model is going to be wrong on some dimensions. To be useful, a model must be simple, and simplification makes it wrong in some sense. Subjected to standard econometric tests, it will be rejected. Models that are rejected by the data can nevertheless be extremely useful. I think that point is now widely recognized, and you won't find strong objections to it, as you might have in 1982.
3. There was indeed a fight at Minnesota over econometrics. I can't remember exactly when it happened. Maybe I was working at the Minneapolis Fed at the time. Maybe I wasn't. In any case, I think I know most of the details of the fight. The people involved were all fine human beings, and if you talked to them at the time about their sides of the argument, they would all make perfect sense. Fights happen among people who work together, and it's best to let these things rest. I'm sure Krugman could tell us a lot about fights in his own department. This is basically gossip, and is best left for talk in the bar.
4. Krugman tells us:
I’d just add that correspondents tell me that the anti-Keynesians pretty much blockade any hiring of new Keynesians in their departments.
People in academic departments disagree about who to hire. This is news? When an academic hire is made, we think we are making a very long-term commitment, particularly when it's a tenured appointment. We're making risky bets on people. Are they intelligent and creative types? Will they be able to adapt when their current research program goes out of fashion? Will they get along well with their colleagues and make everyone more productive? Sometimes things turn out badly. While James Tobin was alive, he wouldn't tolerate having macroeconomists who were up on contemporary macro in his department. As a result, he set his department back, and they have only recently caught up with where macro has gone. Sometimes things turn out well. In 1991, the University of Minnesota hired Nobu Kiyotaki. Remember that Blanchard and Kiyotaki (1987) was a key innovation in Keynesian economics. The Minnesotans were not doctrinaire about it. They saw a smart and productive guy, hired him, and he went on to do more great things. He now works in the Princeton econ department, with Krugman.
Wednesday, October 19, 2011
Nominal GDP Targeting
Nominal GDP (NGDP) targeting seems to be getting a lot of attention. The idea seems to go back at least to the 1980s, when Bennett McCallum talked about it. Scott Sumner and David Beckworth have taken this up as a cause, and Charles Evans has discussed NGDP targeting in a speech. Some of the business media think it matters.
To make sense out of NGDP targeting, start with the original Taylor rule, as specified in Taylor's 1993 paper. Taylor proposed that monetary policy should be conducted according to the following rule:
R(t) = p(t) - p(t-1) + a[y(t) - y*] + b[p(t) - p(t-1) - i*] + r*,
where R is the fed funds rate target, p is the log of the price level, y is the log of real GDP, y* is the target level of real GDP, i* is the target inflation rate, r* is the long-run real interest rate, a > 0 and b > 0. Taylor did not derive his rule using theory, but instead argued that this rule worked well according to some loss criterion in some macroeconometric models. The Taylor rule found its way into New Keynesian (NK) models, and into monetary policy discussions. Taylor rules have been derived in NK models, though the arguments are a little slippery. Generally, an optimal policy rule in a NK model would be some relationship between the policy instrument(s) and exogenous variables, but of course real GDP and the price level are endogenous, so one has to go through some contortions to coax the Taylor rule out of any model. The argument would seem to rely on what is observable to the central bank and what is not.
So, suppose that the central bank adopts a NGDP target. Then, the central bank must also have an approach to implementing such a target. Presumably the advocates of NGDP targeting think that standard central banking practice works, i.e. that a sensible approach to policy over the very short term is to specify an intermediate target for the fed funds rate, with the target set according to the current state of the economy relative to the NGDP target. Thus, we could specify the implementation of the NGDP target as a rule
R(t) = p(t) - p(t-1) + c[y(t) + p(t) - y* - p*] + r*,
where y* + p* is the log of the nominal GDP target and c > 0. We can then rewrite this rule as
R(t) = p(t) - p(t-1) + c[y(t) - y*] + c[p(t) - p(t-1) - p* + p(t-1)] + r*
What's the difference between this and the basic Taylor rule? Not much. (i) The coefficients on the terms governing the response of the fed funds rate to the "output gap" and the deviation of the inflation rate from its target are constrained to be the same. (ii) The interpretation of y* may be different. In the NK literature y* is the efficient level of aggregate output ground out in the underlying real business cycle model. The NGDP targeters seem to think of y* as the trend level of output. For practical purposes it does not make much difference, as the people who measure output gaps tend to think of trend GDP as potential GDP. (iii) The target inflation rate is not a constant, but the percentage deviation of the target price level from last period's price level.
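To see the algebra concretely, here is a minimal Python sketch of the two rules written above. The parameter values are made up for illustration; the only point is that the NGDP rule reproduces a Taylor rule with a = b = c and an inflation target of p* - p(t-1).

```python
import math

def taylor_rule(p_t, p_lag, y_t, y_star, i_star, r_star, a, b):
    """Taylor (1993)-type rule: R = inflation + a*(output gap) + b*(inflation - i*) + r*."""
    inflation = p_t - p_lag
    return inflation + a * (y_t - y_star) + b * (inflation - i_star) + r_star

def ngdp_rule(p_t, p_lag, y_t, y_star, p_star, r_star, c):
    """NGDP-target rule: R = inflation + c*[(y + p) - (y* + p*)] + r*."""
    inflation = p_t - p_lag
    return inflation + c * ((y_t + p_t) - (y_star + p_star)) + r_star

# Made-up values: logs of the price level and real GDP, a 2% long-run real rate.
p_t, p_lag = 0.02, 0.00          # 2% inflation
y_t, y_star = 0.00, 0.01         # output 1% below target
p_star, r_star, c = 0.03, 0.02, 0.5

r_ngdp = ngdp_rule(p_t, p_lag, y_t, y_star, p_star, r_star, c)
# The same number from a Taylor rule with equal coefficients and target inflation p* - p(t-1):
r_taylor = taylor_rule(p_t, p_lag, y_t, y_star,
                       i_star=p_star - p_lag, r_star=r_star, a=c, b=c)

print(round(r_ngdp, 6), round(r_taylor, 6))   # both 0.03, i.e. a 3% funds rate
assert math.isclose(r_ngdp, r_taylor)
```

Nothing depends on the particular numbers; the two expressions are algebraically identical.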
In sum, the NGDP rule fits well within the set of Taylor rules that people have considered, which deviate in various ways from the basic rule that Taylor wrote down in 1993. So what's new? Could it be that there is something different about what happens at the zero lower bound, which I have not accounted for thus far? Suppose we are at the zero lower bound, which is essentially the case currently, and the Fed announces, say, a target path for NGDP of 5% per year indefinitely. Could the Fed actually achieve such a target, even if it wanted to? No. Under current circumstances, there are no actions the Fed can take that could necessarily achieve such an outcome. Indeed, it is possible that the Fed could promise to keep the policy rate at 0.25% for five years in the future, and NGDP growth could fall below the target.
There is no magic in a NGDP target. I know people look at the state of the economy, and think that the Fed should keep trying things. Maybe something will work? Well, I'm afraid not. Even the FOMC dissenters, and their supporters, are not quite ready to say that there is nothing the Fed can do under the current circumstances that could increase employment. But they should.
Tuesday, October 18, 2011
Matt Rognlie Needs to Learn Some Monetary Economics
Matt thinks that monetary frictions don't matter. Mike Woodford made the same mistake, and set a large fraction of macroeconomists off to work on New Keynesian models. Then, the financial crisis hit, and central banks started to engage in some unprecedented interventions, about which standard New Keynesian models had nothing to say.
In a basic Woodford model (see for example Mike's book, Interest and Prices), all monetary frictions are stripped away. In a Woodford world, the only friction comes about because of nominal price stickiness (and perhaps wage stickiness too), which leads to relative price distortions. What does a central bank do in a Woodford world? It sets the price of a bond which is a claim to "money" in the future, but this money is not actually held by anyone, in spite of the fact that all prices and wages are denominated in units of the stuff.
Why does central banking matter? It matters because a central bank can engage in intermediation activities that are not replicated in the private sector. A typical central bank has a monopoly on the issue of currency (in the US this is implicit), and on the large-value payments system. Thus, currency and bank reserves are liabilities that cannot be issued by private financial institutions. When the central bank issues its liabilities in order to buy assets, this in general matters. In particular, asset prices move. To understand how that process works, one has to model the frictions that make private intermediation useful, and the frictions that make assets of all kinds (including the ones conventionally called "money") useful in exchange. By "exchange," I mean exchange of all kinds, including retail exchange, and exchange among financial institutions.
Matt, for starters, you can read this blog post, this piece with Randy Wright, our chapter in the Handbook of Monetary Economics, and this forthcoming AER paper. The latter shows you why you need monetary frictions to understand the financial crisis and unconventional monetary policy.
The Two Sides of FOMC
There were two interesting speeches by Fed Presidents posted yesterday, one by Charles Evans, Chicago Fed President, and one by Jeff Lacker, Richmond Fed President. These are representative, I think, of the two opposing views on the FOMC.
Evans, as is well-known by now, is a hard-core Keynesian. Yesterday's speech is consistent with a previous one he gave, but there are more details in the most recent one. In response to the criticism that he is setting us up to repeat the monetary policy mistakes of the 1970s, he tells us about the 1970s, and also about the policy mistakes of the 1930s. From his point of view, we should be more worried about making the policy mistakes of the 1930s than repeating the 1970s. Why?
Consider another metric for interest rates, the well-known Taylor Rule, which captures how monetary policy typically adjusts to output gaps and deviations in inflation from target. Its prescriptions would call for the federal funds rates to be something like –3.6 percent now, well below the zero lower bound the funds rate is currently stuck at. Our large-scale asset purchases have provided additional stimulus, but by most estimates not enough to bring us down to the Taylor Rule prescriptions.
This is by now a well-worn argument in the Fed system for more monetary policy accommodation. Fit a Taylor rule to the data, without taking into account that it is not feasible to violate the zero lower bound on the fed funds rate. Then, under the assumption that past Fed behavior was optimal, or that we want the Fed to behave consistently with past behavior so as to maintain credibility, plug current observations for the output gap and inflation into the estimated rule. The rule tells us the fed funds rate should be negative. What's the conclusion? You might be thinking that we should re-estimate the Taylor rule taking into account the zero lower bound. Wrong. Some people in the Fed system, including Evans apparently, think that the conclusion is that we are not doing enough, and the Fed should find some other way to ease, such as buying some long Treasury bonds.
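For concreteness, here is the kind of mechanical exercise being described, as a minimal sketch. The coefficients and the inputs are my own illustrative choices, not the ones behind the -3.6 percent figure in Evans's speech.

```python
def taylor_prescription(inflation, output_gap, target_inflation=0.02,
                        r_star=0.02, a=1.0, b=0.5):
    """A Taylor-type rule evaluated with no regard for the zero lower bound.
    Coefficients are illustrative, not estimated from actual Fed behavior."""
    return inflation + a * output_gap + b * (inflation - target_inflation) + r_star

# Made-up inputs loosely in the late-2011 ballpark: 1.5% inflation, a -7% output gap.
prescribed = taylor_prescription(inflation=0.015, output_gap=-0.07)
print(round(prescribed, 4))    # -0.0375: the rule "wants" a funds rate of about -3.8%

# The actual funds rate cannot go below zero, so the prescription is truncated there,
# and the gap between the two is read as a case for accommodation by other means.
print(max(prescribed, 0.0))    # 0.0
```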
This of course is nonsense. In the New Keynesian model underlying Evans's thinking, there are no banks holding reserves, nor a central bank with a balance sheet that includes Treasury bonds, mortgage-backed securities, and Treasury bills. There is nothing that can capture what quantitative easing, of any type, is about. Why would Evans think that his model is telling us anything more than that we are at the zero lower bound and he has nothing more to say about it?
Now, why should we not be worried about making the monetary policy mistakes of the 1970s?
Other critics raise the specter of 1970s-like structural changes in the economy. Such changes, they argue, have reduced our productive potential, in particular the mechanisms by which resources — most notably labor — move from declining to expanding sectors of the economy.[3] I am acutely aware of the costs of making such an error. No central banker wants to repeat the painful experiences of the 1979–83 period. Indeed, the FOMC discussed this issue at great length (see the minutes of our January 2011 meeting).[4] However, I have yet to see empirical evidence based on a modeling framework that successfully captures U.S. business cycle dynamics that shows such supply-side structural factors can come close to explaining the huge shortfalls in actual GDP from trend and the high level of unemployment.
It's good that Evans wants a serious model to explain why unemployment is so high and the recovery is so sluggish. Maybe he could also supply us with a serious model of how quantitative easing works.
Evans has been highly supportive of the two recent policy decisions regarding forward guidance and Operation Twist. But he wants more. In particular, he is calling for conditionality in Fed statements, of the following sort:
I think we should consider committing to keep short-term rates at zero until either the unemployment rate goes below 7 percent or the outlook for inflation over the medium term goes above 3 percent.
Yikes. The Fed should not be making explicit statements that make policy actions contingent on things, such as the unemployment rate, for which we could argue the primary determining factors are not monetary policy. Suppose the Fed made such a commitment, and there were significant sectoral changes in the US economy over the next two years that caused the unemployment rate to increase. What then?
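Stated as code, Evans's proposal is a simple trigger rule. The thresholds below come from the quote; the function name and everything else are my own illustrative phrasing, not the Fed's language.

```python
def keep_rate_at_zero(unemployment_rate, inflation_outlook,
                      u_threshold=0.07, pi_threshold=0.03):
    """True while the commitment binds: the policy rate stays at zero until either
    unemployment falls below 7% or the medium-term inflation outlook rises above 3%."""
    released = (unemployment_rate < u_threshold) or (inflation_outlook > pi_threshold)
    return not released

print(keep_rate_at_zero(0.091, 0.020))   # True: stay at zero
print(keep_rate_at_zero(0.091, 0.035))   # False: the inflation clause releases the commitment
print(keep_rate_at_zero(0.068, 0.020))   # False: so would unemployment falling below 7%
```

The objection in the post is about the first trigger: the unemployment rate is driven largely by factors outside the Fed's control, so conditioning a rate commitment on it invites exactly the scenario described above.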
If Evans's views dominated on the FOMC, I would be very worried. Fortunately, there are other voices. Jeff Lacker says:
My reading of the evidence is that the strength of this recovery is going to be relatively independent of our monetary policy choices from here on out. The factors likely to be restraining growth — from empty houses to prospective tax rates — are nonmonetary and largely beyond the power of the central bank to offset through easier monetary conditions. History has repeatedly demonstrated that if a central bank attempts to add monetary stimulus to offset nonmonetary disturbances to growth, the result is higher inflation that can be difficult and costly to eliminate. This is why I opposed the Maturity Extension Program — popularly known as "Operation Twist" — in which the Fed will buy long-term Treasury securities and simultaneously sell short-term Treasury securities. The effect of these operations is uncertain, but likely to be relatively small. My sense is that the main effect will be to raise inflation somewhat rather than increase growth.
I pretty much agree with that, and it's roughly consistent with Plosser's views. The only thing I disagree with here is the prospects for inflation. I don't think that the Operation Twist program has any consequences at all - for quantities or prices.
Lacker also takes a shot at Barney Frank:
The fact that diverse and independent views are brought to bear on important policy questions is attributable in part to the unique federated structure of the Federal Reserve System. When the Fed was founded in 1913, Congress deliberately rejected the monolithic model of the European central banks of the time. By chartering 12 distinct banks, each with a board of directors that appoints their Reserve Bank president (subject to approval by the Board of Governors), they deliberately sought to insulate policymaking from election-induced swings that can distort decision-making by diminishing the focus on long-run considerations. And while the Reserve Bank presidents are subject to oversight from both their own boards of directors and the Board of Governors in Washington, their distinct policy views are informed by both regional economic information and the independent research of Reserve Bank economists. This is why legislation that aims at stifling dissent by removing the presidents from the FOMC would be so harmful. By limiting the diversity of independent views around the table, such measures would undermine the historic strength of the System.
Good for Jeff. One of the strengths of the Fed system relative to other central banks is the semi-independence of the regional Feds from the Board of Governors in Washington, which creates healthy competition in ideas. In recent history, the average level of expertise in economics has been much higher among the regional Fed presidents than among the Governors. It would be too bad to lose that.
Thursday, October 13, 2011
Twisting
There are some things in the minutes of the September 20-21 FOMC meeting that seem worth discussing.
Bernanke and other Fed officials like to tell us about the large toolbox they have available for fixing what ails us, and the meeting began with a discussion of the available tools that could be used to supply more accommodation. The choice was framed as Goldilocks would see it. There is "too cold," "just right," and "too hot," in that order, and you know at the outset that the committee will choose just right.
Too cold would involve a change in how the proceeds from principal payments on its holdings of agency securities would be reinvested. Policy before the last meeting was to hold the size of the Fed's balance sheet constant, and to take the proceeds from agency securities and mortgage-backed securities (MBS) that run off, and reinvest them in Treasury securities with a particular average maturity. The proposal was to simply lengthen that average maturity. Just right was Operation Twist - lengthen the average maturity of the Fed's portfolio by selling short-maturity Treasury bonds and buying long Treasuries, while holding the size of the balance sheet constant. Too hot was a repeat of QE2 - an increase in the size of the Fed's balance sheet through purchases of long Treasuries.
At this point in the meeting, there is some discussion. People on the committee who are in an accommodative mood are thinking they will want to keep trying if whatever the committee decides to do now does not work. Just right is seen as a one-time intervention - clearly you can't keep increasing the average maturity of the Fed's portfolio indefinitely. Some people raise some objections: maybe none of these interventions will have much of an effect, if any; maybe we will get too much inflation. A proposal is introduced which was not heretofore on the table (and which ultimately the committee will go for), which is to reinvest the proceeds from maturing agency securities and MBS in more MBS, rather than in Treasuries.
Then, there is a discussion about transparency. In particular:
Most participants indicated that they favored taking steps to increase further the transparency of monetary policy, including providing more information about the Committee's longer-run policy objectives and about the factors that influence the Committee's policy decisions.
It's hard to know what to make of this without more specifics. Maybe what the committee members had in mind was in line with what Evans talks about here. If so, it's wrongheaded, and some people on the committee seem to think so too:
a number of participants expressed concerns about the conceptual issues associated with establishing and communicating explicit longer-run objectives for the unemployment rate or other measures of labor market conditions, inasmuch as the long-run equilibrium levels of such measures are influenced importantly by nonmonetary factors, are subject to change over time, and are estimated with considerable uncertainty. In contrast, participants noted that the long-run level of inflation is determined primarily by monetary policy.
The committee also considered the possibility of lowering the interest rate on reserves (IOR), presumably to 0% from 0.25%. It is quite important that this discussion appears in the FOMC minutes, as decisions about changes in the IOR actually rest with the Board of Governors, not the FOMC. In discussing this at the FOMC meeting, the Board is recognizing that the committee should have a role in the decision, though that role was not given to it by Congress. In my view, a change in the IOR, or language that tells us about the future path of the IOR, is the only relevant element of Fed decisionmaking currently. None of the quantitative interventions actually matter, under current circumstances.
Here is a useful piece of information from the IOR discussion:
a recent change in deposit insurance assessments had the effect of significantly reducing the net return that many banks receive from holding reserve balances.
There are some seemingly puzzling things going on with respect to the behavior of the fed funds rate relative to the IOR. One might expect that the IOR would place a lower bound on the fed funds rate, much as in any channel system (Canada, Australia, ECB, for example). But this is not the case, as the fed funds rate is currently less than the IOR, and has even decreased since late 2008. The GSEs (Fannie Mae, Freddie Mac) do not receive interest on reserves, and commercial banks do not arbitrage away the difference between zero and 0.25%, for reasons that are in part unexplained. However, a contributing factor to the lack of arbitrage has been the change in deposit insurance assessments. Banks are now charged the assessment based on total assets, not deposits. Thus, if I am a bank and attempt to arbitrage the difference between the fed funds rate and the IOR by borrowing on the fed funds market and holding what I borrow as reserves, I increase what I pay to the FDIC. As well, there seems to be some effect of the total quantity of reserves in the system on the IOR-fed funds rate differential (higher reserves increase the differential), but this is just a correlation with no theory backing it up.
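Here is a back-of-the-envelope version of that arbitrage, as a minimal sketch. The assessment rate is a placeholder of mine, not the FDIC's actual schedule; the GSEs, which pay no assessment but also earn no IOR, face a different calculation entirely.

```python
def arbitrage_spread(ior=0.0025, fed_funds=0.0010, fdic_assessment=0.0010):
    """Net annualized return per dollar borrowed in the fed funds market and held as
    reserves, after the FDIC assessment (now levied on total assets) is deducted."""
    return ior - fed_funds - fdic_assessment

print(round(arbitrage_spread(), 6))                     # 0.0005: 5 basis points, before other costs
print(round(arbitrage_spread(fdic_assessment=0.0), 6))  # 0.0015: the spread with no assessment on the trade
```

With an assessment of that rough size, most of the apparent IOR-fed funds spread disappears for an insured bank, which is the point of the passage above.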
Otherwise, the IOR discussion is a bit murky. For example:
Moreover, many participants voiced concerns that reducing the IOR rate risked costly disruptions to money markets and to the intermediation of credit, and that the magnitude of such effects would be difficult to predict in advance. In addition, the federal funds market could contract as a result and the effective federal funds rate could become less reliably linked to other short-term interest rates.
The "disruptions to money markets" may refer to effects that might arise because of the rules governing money market mutual funds, but I'm not sure. I'm also not sure why anyone is concerned with activity on the fed funds market. Most of the activity on this market currently must simply be commercial banks borrowing from GSEs. Why do we care if that goes away?
The FOMC of course settled on a policy involving a swap of $400 billion in short-term Treasuries for an equal quantity of long-term Treasuries ("Operation Twist"), and a policy of reinvesting principal repayments on agency securities and MBS in new MBS. The committee members voting for these measures seem to think this will result in decreases in long-term interest rates, and a reduction in the margin between mortgage rates and other long-term interest rates.
There were of course three dissenting votes. Fisher's objections were:
Mr. Fisher saw a maturity extension program as providing few, if any, benefits in support of job creation or economic growth, while it could potentially constrain or complicate the timely removal of policy accommodation. In his view, any reduction in long-term Treasury rates resulting from this policy action would likely lead to further hoarding by savers, with counterproductive results on business and consumer confidence and spending behaviors. He felt that policymakers should instead focus their attention on improving the monetary policy transmission mechanism, particularly with regard to the activity of community banks, which are vital to small business lending and job creation.
The first sentence makes sense, but I can't decipher the rest. Hoarding by savers and improvements in transmission through community banks? What's that about? We know Fisher is not an economist, but he has economists briefing him. Does he not listen? Does it not sink in? Are the briefers bad at their jobs? Who knows?
Kocherlakota says:
Mr. Kocherlakota's perspective on the policy decision was again shaped by his view that in November 2010, the Committee had chosen a level of accommodation that was well calibrated for the condition of the economy. Since November, inflation, and the one-year-ahead forecast for inflation, had risen, while unemployment, and the one-year-ahead forecast for unemployment, had fallen. He did not believe that providing more monetary accommodation was the appropriate response to those changes in the economy, given the current policy framework.
This is basically identical to Kocherlakota's objection at the previous meeting. Finally, Plosser:
Mr. Plosser felt that a maturity extension program would do little to improve near-term growth or employment, in light of the ongoing structural adjustments and fiscal challenges both in the United States and abroad. Moreover, in his view, with inflation continuing to run above earlier forecasts, such a program could risk adding unwanted inflationary pressures and complicate the eventual exit from the period of extraordinarily accommodative monetary policy.
I think that is basically correct, but the maturity swap is essentially irrelevant for inflation, and does not further complicate exit, given that the size of the balance sheet is being held constant.
So, the majority of voting members on the FOMC seem to think that they can actually do things that are more "accommodative" currently. Even the people who are objecting (particularly Plosser) seem to think that the Operation Twist maturity swap, and the QE2 swap of reserves for long Treasuries, actually matter for long-term interest rates. If the Fed can in fact move long-term interest rates at will, then it should be able to target long-term nominal interest rates. Indeed, if we believe what the FOMC says, the Fed should be able to determine the whole nominal term structure of interest rates by intervening sufficiently. Why, then, are these unusual interventions specified not as interest rate targets but in terms of the quantities of assets purchased? You know why. If the FOMC thought it could hit the interest rate targets, it would announce the policy that way. But it knows it cannot; basically, it doesn't work.
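To put that in terms of a standard, admittedly simplified, decomposition: under the expectations hypothesis with a term premium, the n-period nominal rate can be written as

$$R_t^{(n)} = \frac{1}{n}\sum_{i=0}^{n-1} E_t\, r_{t+i} + \phi_t^{(n)},$$

where $r_{t+i}$ is the one-period nominal rate and $\phi_t^{(n)}$ is a term premium. Stating the intervention as a quantity of purchases amounts to hoping the purchases move $\phi_t^{(n)}$, or expectations of future short rates, by some unspecified amount. Stating it as a target for $R_t^{(n)}$ would commit the Fed to buying or selling whatever quantity it takes, and the FOMC shows no willingness to make that commitment.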
Monday, October 10, 2011
Red-Letter Day: Krugman Gets Banking
Miracles happen. Krugman understands a Diamond-Dybvig model (more or less). We can quibble about the banking panic part, but he essentially gets it right, and applies it to the Murray Rothbard elements of Ron Paul's thinking. It's roughly consistent with #3 in this piece. There is hope.
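For readers who have never seen it, here is a minimal sketch of the Diamond-Dybvig logic in Python; the numbers are made up for illustration, and this is only the multiple-equilibrium skeleton, not the full model:

# Minimal Diamond-Dybvig-style illustration (all numbers are assumptions).
# A bank takes 100 deposits of 1 each and invests in a long asset that pays R = 2
# at date 2, but only L = 0.5 per unit if liquidated early at date 1.
# The deposit contract promises c1 = 1.2 to anyone who withdraws at date 1.

deposits, R, L, c1 = 100.0, 2.0, 0.5, 1.2

def date2_payoff_per_waiter(fraction_withdrawing_early):
    """What a depositor who waits until date 2 gets, given how many withdraw early."""
    early_claims = fraction_withdrawing_early * deposits * c1
    units_liquidated = early_claims / L           # long-asset units sold early to pay them
    if units_liquidated >= deposits:              # bank fails; nothing is left for date 2
        return 0.0
    remaining_value = (deposits - units_liquidated) * R
    waiters = (1.0 - fraction_withdrawing_early) * deposits
    return remaining_value / waiters

print(date2_payoff_per_waiter(0.20))  # 1.3 > 1.2: if only the impatient 20% withdraw, waiting is optimal
print(date2_payoff_per_waiter(0.99))  # 0.0: if (almost) everyone runs, running is also a best response

Both outcomes are self-fulfilling, which is the banking panic point the model is built around.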
Sargent Family Tree
I found this on Sargent's web page. This shows you Sargent's academic children, grandchildren and great-grandchildren.
Krugman is Confused
I wrote about the 2011 Nobel prize that went to Sargent and Sims, then read what Krugman has to say.
If you have been reading Krugman, you know that he thinks the IS-LM model is great, that bad economists are the ones who use sophisticated mathematics, and that we are in a Dark Age of Macroeconomics that began in the 1970s in "freshwater schools" like the University of Minnesota. Are those views consistent with this?
...before Sargent and Sims came along, econometrics consisted largely of estimating models you had no good reason to believe based on identifying assumptions (if you don’t already know, you don’t want to) that lacked credibility. S and S played a key role in developing methods that let the data speak instead.

Krugman does not seem to understand that the "incredible identifying assumptions" came from Old Keynesian IS-LM economics. 1970s rational expectations macroeconometrics, developed by Sargent and Sims, tells you that the identifying assumptions that went into expanded IS-LM estimated models, like the FRB/MIT/Penn model, were incredible, and that we should throw those models out.
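For concreteness, the reduced-form approach Sims pioneered can be estimated with nothing more than least squares, one equation at a time, with no structural identifying assumptions imposed at the estimation stage. The sketch below is my own, using simulated data, and fits a two-variable VAR(1):

import numpy as np

# Simulate T observations from a two-variable VAR(1): y_t = A y_{t-1} + e_t.
rng = np.random.default_rng(0)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.7]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# Estimate the VAR by ordinary least squares: regress y_t on y_{t-1}, equation by equation.
X, Y = y[:-1], y[1:]
B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(B_hat.T)   # recovers something close to A_true, with no identifying restrictions used

Identification questions, such as what counts as a monetary policy shock, arise only afterwards, when the estimated innovations are given a structural interpretation; the old Keynesian macroeconometric models, by contrast, imposed their incredible identifying assumptions up front.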
Sargent and Sims
I woke up this morning to Per Krusell's voice on the radio, telling me that Thomas Sargent and Christopher Sims had won the 2011 Nobel Prize in Economics. Excellent!
Sargent, along with Neil Wallace, was among the first macroeconomists to recognize that Robert Lucas had done something important in 1972, and helped the rest of the profession understand that by developing the ideas. Sargent, Wallace, and Sims were instrumental in developing, in the 1970s, a model for cooperation in economic research between academics and central bankers at the Federal Reserve Bank of Minneapolis. Minnesota macro has since had a huge influence on the profession, and on the practice of central banking.
Both Sargent and Sims brought a strong econometric tradition to macroeconomics. Sims's work on vector autoregressions, beginning with "Macroeconomics and Reality," has been highly influential, and you can see Sims's influence in how people like Marty Eichenbaum, Larry Christiano, and Jordi Gali, for example, do their work. Modern quantitative work in macroeconomics, among New Keynesians and non-Keynesians alike, includes both estimation and calibration (from Prescott), a state of affairs I think Sargent and Sims are pleased with.
Sargent has been a key proponent of the use of mathematics and technical developments from other fields in macroeconomics, from dynamic programming methods to frequency domain techniques to robust control. In part, he has promoted the use of these techniques through several generations of textbooks for economics graduate students. Indeed, Sargent's key influence has been through his students. Any Sargent student can tell you about the "Sargent reading group," how it works, and how much they learned from it.
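To give a flavor of what "dynamic programming methods" means here, a minimal value-function-iteration exercise for a textbook cake-eating problem is sketched below; it is my own toy example of the kind of problem graduate macro courses assign, not something taken from Sargent's books:

import numpy as np

# Cake-eating problem: choose c_t to maximize sum_t beta^t * ln(c_t),
# subject to k_{t+1} = k_t - c_t >= 0. The known solution is c_t = (1 - beta) * k_t.
beta = 0.95
grid = np.linspace(1e-3, 1.0, 200)       # grid of possible cake sizes k
V = np.zeros_like(grid)                  # initial guess for the value function

for _ in range(2000):
    # For each k, choose next period's cake k' to maximize ln(k - k') + beta * V(k').
    c = grid[:, None] - grid[None, :]    # c[i, j]: consumption if k = grid[i] and k' = grid[j]
    values = np.where(c > 0, np.log(np.where(c > 0, c, 1.0)) + beta * V[None, :], -np.inf)
    V_new = values.max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:  # stop when the Bellman iteration has converged
        break
    V = V_new

c_policy = grid - grid[values.argmax(axis=1)]     # implied consumption at each grid point
print(c_policy[150], (1 - beta) * grid[150])      # close, up to the coarseness of the grid

The same Bellman fixed-point logic, with uncertainty and more state variables added, sits underneath much of modern quantitative macroeconomics.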
Both Sargent and Sims are economists with extremely high technical ability, but also with brilliant insight into economic ideas and economic modeling. Sims is not only a top econometrician, but has also made key contributions to the study of the fiscal theory of the price level and rational inattention.
The Nobel committee chose well this year. I think all of us should be pleased.
Wednesday, October 5, 2011
Simple-Minded Pseudo-Macroeconomists
Tyler Cowen does not like the IS/LM model. Excellent. Indeed, when Hicks wrote his 1937 paper "Mr. Keynes and the Classics," which interpreted Keynes's General Theory as the now-familiar IS-LM construct, he finished with this:
The General Theory of Employment is a useful book. But it is neither the beginning nor the end of Dynamic Economics.

Thus, Hicks himself is telling you: "This is my effort to figure out what the heck Keynes was trying to get across. Don't take it too seriously though. I'm sure there is plenty of good research to come that will put all of this into perspective, and indeed may replace it." Hicks would probably have been surprised at what happened to IS-LM. Generations of textbook writers found IS-LM a very convenient model to use in getting basic Keynesian ideas across to undergraduate students. However, frontier macroeconomic researchers did not take IS-LM seriously after the early 1970s. By about 1980, IS-LM had essentially disappeared from the top economics journals and from the top PhD programs in economics. But one could still find some version of IS-LM in undergraduate textbooks.
How is IS-LM used today? You do not see it in published macroeconomic research, as a framework for discussion among policymakers, or in PhD programs in economics. It is certainly not necessary to use it in teaching Keynesian economics to undergraduates. In the third edition of my intermediate macro textbook, you will not find an IS-LM model. I have found what I think are more straightforward and instructive ways to get Keynesian economics across, and to get it across in line with what modern Keynesian researchers actually do. For example, I do a version of a Keynesian coordination failure model that looks like what Roger Farmer did in the early 1990s, and an undergraduate version of a Woodford sticky-price model.
So, given that IS-LM is not used by any serious macroeconomic researchers or practitioners, and that we want to represent in an accessible way for undergraduates what macroeconomic researchers and practitioners are actually up to, why would anyone care about IS-LM? Why indeed? But Brad DeLong and Paul Krugman do. In fact, they are quite passionate about it. Well, the Amish are passionate about what they do as well. While DeLong and Krugman might like to freeze the profession at its state in 1937, the rest of us have moved on. In the words of the great bard:
Your old road is rapidly aging. Please get out of the new one if you can't lend your hand, for the times they are a-changing.