Chapter 14: Real Business Cycles

Introduction

Chapter 8 used the IS-LM model to analyze short-run fluctuations. That model, built on Keynesian foundations, treats aggregate demand as the primary driver of business cycles. In the late 1970s, a methodological revolution challenged this approach. Robert Lucas argued that relationships estimated under one policy regime cannot be used to predict behavior under another, so any model used for policy evaluation must be built from microeconomic foundations: optimizing agents, rational expectations, and market clearing. This argument, the Lucas critique, destroyed the large-scale Keynesian models that had dominated macroeconomics.

The Real Business Cycle (RBC) model, pioneered by Kydland and Prescott (1982), took the Lucas critique seriously. It asks: can an economy with fully flexible prices, rational agents, and technology shocks reproduce the key features of the business cycle? The answer is a qualified yes — and even where the answer is no, the RBC framework became the chassis for all subsequent macroeconomic modeling.

By the end of this chapter, you will be able to:
  1. State the Lucas critique and explain why it matters for policy analysis
  2. Set up the social planner's problem for the basic RBC model
  3. Derive first-order conditions using the Bellman equation
  4. Log-linearize the model around steady state
  5. Trace impulse responses to technology shocks
  6. Evaluate the RBC model against business cycle data

14.1 The Lucas Critique

In 1976, Robert Lucas published what may be the most influential methodological paper in macroeconomics. His argument was simple but devastating: if agents are rational, their behavior depends on the policy regime. When policy changes, agents' decision rules change — so parameters estimated under the old regime are invalid under the new one.

The old approach. In the 1960s–70s, central banks and governments used large-scale econometric models (hundreds of equations) to predict the effects of policy changes. These models estimated behavioral parameters — the marginal propensity to consume, the slope of the Phillips curve, the sensitivity of investment to interest rates — from historical data, then simulated "what if" scenarios by changing policy variables.

The critique. Lucas pointed out that these parameters are not structural constants of nature. They reflect agents' optimal responses to the economic environment — including the policy regime. Change the regime, and the parameters change.

Example 14.1 — Temporary Tax Cut

A Keynesian model estimates the MPC at 0.8 from historical data and predicts that a \$100 billion tax cut will raise consumption by \$80 billion. But if the tax cut is perceived as temporary, forward-looking consumers may save most of it to pay higher future taxes (Ricardian equivalence, Chapter 16). The MPC under a temporary tax cut is much lower than 0.8.

Example 14.2 — The Phillips Curve

The Phillips curve appeared to offer a stable tradeoff: the Fed could "buy" lower unemployment by accepting higher inflation. But when the Fed actually tried this in the late 1960s, workers and firms adjusted their inflation expectations upward. The Phillips curve shifted — the tradeoff disappeared. The parameter (the slope) changed because the policy regime changed.

Lucas critique. Robert Lucas's (1976) argument that econometric relationships estimated under one policy regime are invalid for predicting the effects of a different regime. Because agents' decision rules depend on the policy environment, the parameters of reduced-form models change when policy changes.
Microfoundations. The methodological requirement that macroeconomic models be derived from explicit optimization problems of individual agents (households, firms) with well-defined preferences, technology, and constraints. Parameters of preferences and technology are structural — they do not change with policy.

The solution: Build models from structural primitives — preferences, technology, and constraints — that don't change when policy changes. Agents' decision rules are derived from optimization, not assumed. This is the microfoundations approach.

14.2 The Basic RBC Model

Environment

Representative agent. A modeling device in which all households are identical, so the economy's behavior can be characterized by the optimization of a single "representative" household. This avoids aggregation problems but rules out distributional effects.
Social planner's problem. The optimization problem of a benevolent planner who maximizes the representative household's welfare subject to the economy's resource constraints. In the RBC model, the planner's solution coincides with the competitive equilibrium (by the First Welfare Theorem), simplifying the analysis.
Competitive equilibrium (equivalence with planner's problem). The decentralized outcome where households maximize utility and firms maximize profits, taking prices as given. In the basic RBC model with no externalities or distortions, the competitive equilibrium is Pareto efficient and replicates the social planner's allocation. This equivalence allows modelers to solve the simpler planner's problem.
Technology shock (TFP shock). A stochastic disturbance $z_t$ to total factor productivity that shifts the production function. In the RBC model, technology shocks following an AR(1) process are the sole source of business cycle fluctuations: $\ln z_t = \rho_z \ln z_{t-1} + \varepsilon_t$.
The representative household chooses consumption and labor to maximize expected lifetime utility,

$$E_0 \sum_{t=0}^{\infty} \beta^t u(c_t, 1 - l_t)$$ (Eq. 14.1)

where $c_t$ is consumption, $l_t$ is labor supply, and $1 - l_t$ is leisure. Technology: $Y_t = z_t K_t^\alpha l_t^{1-\alpha}$.

Technology shocks follow an AR(1) process:

$$\ln z_t = \rho_z \ln z_{t-1} + \varepsilon_t, \quad \varepsilon_t \sim N(0, \sigma_\varepsilon^2)$$ (Eq. 14.2)

Capital accumulation: $K_{t+1} = (1-\delta)K_t + I_t$. Resource constraint: $c_t + I_t = Y_t$.
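The shock process is straightforward to simulate. Below is a minimal Python sketch of Eq. 14.2, using the calibrated values from Section 14.3; the seed and sample length are arbitrary choices for illustration.

```python
import numpy as np

# Simulate the AR(1) technology process of Eq. 14.2 (illustrative sketch).
rho_z, sigma_eps, T = 0.95, 0.007, 200
rng = np.random.default_rng(0)

ln_z = np.zeros(T)
eps = rng.normal(0.0, sigma_eps, size=T)
for t in range(1, T):
    ln_z[t] = rho_z * ln_z[t - 1] + eps[t]

# Unconditional standard deviation implied by the AR(1): sigma_eps / sqrt(1 - rho_z^2)
print(sigma_eps / (1 - rho_z**2) ** 0.5)   # ~0.022: TFP wanders about 2.2% from trend
print(ln_z.std())                          # sample counterpart
```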

Bellman Equation

With log utility, $u(c_t, 1 - l_t) = \ln c_t + \phi \ln(1 - l_t)$, the planner's value function satisfies

$$V(K_t, z_t) = \max_{c_t, l_t} \left\{\ln c_t + \phi \ln(1-l_t) + \beta E_t[V(K_{t+1}, z_{t+1})]\right\}$$ (Eq. 14.4)

subject to $K_{t+1} = z_t K_t^\alpha l_t^{1-\alpha} + (1-\delta)K_t - c_t$.

First-Order Conditions

Euler equation (intertemporal):

$$\frac{1}{c_t} = \beta E_t\left[\frac{1}{c_{t+1}}(\alpha z_{t+1} K_{t+1}^{\alpha-1} l_{t+1}^{1-\alpha} + 1 - \delta)\right]$$ (Eq. 14.5)

Intratemporal labor supply:

$$\frac{\phi}{1 - l_t} = \frac{(1-\alpha)z_t K_t^\alpha l_t^{-\alpha}}{c_t}$$ (Eq. 14.6)
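For completeness, here is a sketch of how Eqs. 14.5–14.6 follow from the Bellman equation (Eq. 14.4) with the constraint $K_{t+1} = z_t K_t^\alpha l_t^{1-\alpha} + (1-\delta)K_t - c_t$. The first-order conditions for $c_t$ and $l_t$ and the envelope condition for $K_t$ are

$$\frac{1}{c_t} = \beta E_t[V_K(K_{t+1}, z_{t+1})], \qquad \frac{\phi}{1-l_t} = \beta E_t[V_K(K_{t+1}, z_{t+1})]\,(1-\alpha) z_t K_t^{\alpha} l_t^{-\alpha},$$

$$V_K(K_t, z_t) = \beta E_t[V_K(K_{t+1}, z_{t+1})]\left(\alpha z_t K_t^{\alpha-1} l_t^{1-\alpha} + 1 - \delta\right).$$

Dividing the second condition by the first gives Eq. 14.6; substituting the first into the envelope condition and rolling it forward one period gives Eq. 14.5.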

14.3 Calibration

Calibration. A methodology for choosing model parameters using external information — long-run averages from national accounts, microeconomic estimates, and first moments of the data — rather than econometric estimation. The model is then evaluated by comparing its predictions for untargeted moments (second moments: volatilities, correlations) to the data.

RBC models introduced calibration: set parameters using external information (long-run averages, microeconomic studies, national accounts), then check whether the model reproduces business cycle features that weren't targeted.

Parameter | Value | Source / Target
$\beta$ | 0.99 | Matches 4% annual real interest rate
$\alpha$ | 0.36 | Capital share of income
$\delta$ | 0.025 | 10% annual depreciation
$\rho_z$ | 0.95 | Persistence of the Solow residual
$\sigma_\varepsilon$ | 0.007 | Volatility of Solow residual innovations
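As a quick sanity check on the quarterly calibration, the sketch below converts the quarterly values to their annual targets (variable names are illustrative).

```python
# Sanity check: quarterly calibration vs. annual targets (illustrative sketch).
beta, delta = 0.99, 0.025

quarterly_r = 1 / beta - 1                   # implied quarterly real rate, ~1.01%
annual_r = (1 + quarterly_r) ** 4 - 1        # ~4.1% per year, the 4% target
annual_depr = 1 - (1 - delta) ** 4           # ~9.6% per year, roughly the 10% target
print(annual_r, annual_depr)
```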

14.4 Log-Linearization

Log-linearization. An approximation technique that expresses the model's nonlinear equations as linear functions of log-deviations from steady state ($\hat{x}_t = \ln x_t - \ln x^*$). The linearized system can be solved analytically or with standard matrix methods, yielding policy functions and impulse responses.

Define $\hat{x}_t = \ln x_t - \ln x^*$ (log-deviation from steady state). Taylor-expand each equation, keeping first-order terms.

$$\hat{y}_t = \hat{z}_t + \alpha \hat{k}_t + (1-\alpha)\hat{l}_t$$ (Eq. 14.8)
$$\hat{k}_{t+1} = (1-\delta)\hat{k}_t + \delta \hat{i}_t$$ (Eq. 14.9)
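To see where Eq. 14.9 comes from, write $K_t = K^* e^{\hat{k}_t} \approx K^*(1+\hat{k}_t)$ and $I_t = I^* e^{\hat{i}_t} \approx I^*(1+\hat{i}_t)$ in the capital accumulation equation, subtract the steady-state relation $K^* = (1-\delta)K^* + I^*$, and divide by $K^*$:

$$\hat{k}_{t+1} = (1-\delta)\hat{k}_t + \frac{I^*}{K^*}\,\hat{i}_t = (1-\delta)\hat{k}_t + \delta\,\hat{i}_t,$$

since $I^* = \delta K^*$ in steady state. Eq. 14.8 follows even more directly: taking logs of the Cobb-Douglas production function and subtracting the steady-state values gives the expression exactly.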
Impulse response function. The dynamic path of each model variable following a one-time shock. In the RBC model, an impulse response to a technology shock traces how output, consumption, investment, and hours worked deviate from steady state over time and eventually return.

14.5 Impulse Response Functions

A positive technology shock ($\varepsilon_t > 0$) raises $z_t$. Output rises immediately. Consumption rises by less than output (smoothing). Investment rises sharply (temporarily high returns). Hours worked depend on the balance of substitution and income effects — with persistent shocks, the wealth effect partially offsets the wage incentive.

Interactive: RBC Impulse Responses

Adjust the persistence of technology shocks ($\rho_z$) and watch how the impulse response shapes change. At low persistence, shocks die out quickly. At high persistence, effects are nearly permanent.

At the baseline $\rho_z = 0.95$, the half-life of the shock is roughly 14 quarters: the shock is highly persistent, and the large wealth effect dampens the hours response.

Figure 14.1. Impulse responses to a one-standard-deviation positive technology shock. Four panels: output, consumption, investment, and hours worked. Drag the slider to see how persistence shapes the dynamics. Hover for exact values.
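The half-life cited above follows directly from the AR(1) decay rate $\rho_z^t$: the shock is halved when $\rho_z^t = 0.5$, i.e. at $t = \ln(0.5)/\ln(\rho_z)$. A minimal check in Python:

```python
import math

# Half-life of an AR(1) shock: smallest t with rho_z^t = 0.5, i.e. t = ln(0.5)/ln(rho_z)
for rho_z in (0.50, 0.95, 0.99):
    print(rho_z, math.log(0.5) / math.log(rho_z))
# 0.50 -> 1 quarter, 0.95 -> ~13.5 quarters (the "~14" above), 0.99 -> ~69 quarters
```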

Example 14.3 — Steady State Computation

Compute the steady state for the basic RBC model with $\alpha = 0.33$, $\beta = 0.99$, $\delta = 0.025$, $\phi = 2$ (leisure weight), $z^* = 1$.

Step 1: From the Euler equation at steady state ($c_{t+1} = c_t$): $1 = \beta(\alpha z^* K^{*\alpha-1} l^{*1-\alpha} + 1 - \delta)$. Solving: $\alpha K^{*\alpha-1} l^{*1-\alpha} = 1/\beta - 1 + \delta = 1/0.99 - 1 + 0.025 = 0.0351$.

Step 2: Capital-labor ratio: $(K/l)^{\alpha-1} = 0.0351/0.33 = 0.1064$. So $K/l = 0.1064^{1/(0.33-1)} = 0.1064^{-1.493} \approx 28.3$.

Step 3: Output-capital ratio: $Y/K = (K/l)^{\alpha-1} = 0.1064$. Investment share: $I/Y = \delta(K/Y) = 0.025/0.1064 = 0.235$. Consumption share: $C/Y = 1 - I/Y = 0.765$.

Step 4: From the labor FOC: $\phi/(1-l^*) = (1-\alpha)(K^*/l^*)^\alpha / c^*$. With target $l^* = 1/3$: verify the calibration is internally consistent.
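The steps above can be verified numerically. A minimal Python sketch follows; variable names are illustrative, and Step 4 solves the labor FOC for $l^*$ rather than imposing the $1/3$ target.

```python
# Numerical check of Example 14.3 (illustrative sketch).
alpha, beta, delta, phi, z_star = 0.33, 0.99, 0.025, 2.0, 1.0

# Step 1: steady-state Euler equation pins down the marginal product of capital
mpk = 1 / beta - 1 + delta                      # alpha*(K/l)^(alpha-1) = 0.0351

# Step 2: capital-labor ratio
k_over_l = (mpk / alpha) ** (1 / (alpha - 1))   # ~28.3

# Step 3: great ratios
y_over_k = z_star * k_over_l ** (alpha - 1)     # Y/K ~ 0.106
i_over_y = delta / y_over_k                     # I/Y ~ 0.235
c_over_y = 1 - i_over_y                         # C/Y ~ 0.765

# Step 4: labor FOC  phi*c/(1-l) = (1-alpha)*z*(K/l)^alpha  solved for l*,
# using c/l = z*(K/l)^alpha - delta*(K/l) from the resource constraint
c_over_l = z_star * k_over_l ** alpha - delta * k_over_l
l_star = 1 / (1 + phi * c_over_l / ((1 - alpha) * z_star * k_over_l ** alpha))
print(k_over_l, i_over_y, c_over_y, l_star)     # l* ~ 0.30 with phi = 2
```

With $\phi = 2$ the implied $l^*$ is about 0.30, close to the $1/3$ target; a slightly smaller leisure weight would hit the target exactly.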

Example 14.4 — Qualitative Impulse Response to a Technology Shock

Trace the response to a positive one-standard-deviation technology shock ($\varepsilon_0 = 0.007$) with $\rho_z = 0.95$.

Impact (t=0): $z_0$ rises by 0.7%. Output jumps immediately: higher TFP means more output from the same inputs. The wage rises (MPL up), and the return to capital rises (MPK up).

Consumption: Rises by less than output (~0.3%). Forward-looking households smooth consumption over the persistent shock. They save a large fraction of the windfall.

Investment: Rises sharply (~2.5%) because the return to capital is temporarily high and households channel saving into capital accumulation.

Hours: The response depends on persistence. The substitution effect (higher wage $\to$ work more) pushes hours up. The wealth effect (richer $\to$ consume more leisure) pushes hours down. With $\rho_z = 0.95$, the wealth effect partially offsets, producing a small positive hours response (~0.2%).

Dynamics (t=1,...,40): All variables decay toward steady state at rate $\rho_z^t$. Capital accumulates slowly (predetermined), keeping output elevated even after $z_t$ has declined.

Business cycle moments. Summary statistics of cyclical fluctuations: standard deviations of key variables (output, consumption, investment, hours), relative volatilities ($\sigma_c/\sigma_y$, $\sigma_i/\sigma_y$), cross-correlations ($\text{corr}(c,y)$), and autocorrelations. The RBC model is evaluated by comparing model-generated moments to these data moments.
Shimer puzzle. The observation (Shimer, 2005) that the standard search-and-matching model of the labor market generates far too little unemployment volatility relative to the data. With Nash bargaining, wages absorb most of the productivity shock, leaving little room for employment fluctuations. This is the labor-market analog of the RBC model's hours volatility failure.

14.6 RBC Model Predictions vs. Data

Successes

Feature | U.S. Data | RBC Model
$\sigma_c/\sigma_y$ | ≈ 0.5 | ✓ ~0.5
$\sigma_i/\sigma_y$ | ≈ 3.0 | ✓ ~3.0
Output persistence | Autocorr. ~0.85 | ✓ From $\rho_z$
Procyclical C and I | $\rho(c,y) > 0$ | ✓

Failures

Feature | U.S. Data | RBC Model
Hours volatility | $\sigma_h/\sigma_y \approx 0.8$ | ✗ ~0.3
Monetary non-neutrality | Money affects real GDP | ✗ Money is neutral
Recessions | Many non-technology causes | ✗ Requires negative technology shocks

Interactive: Calibration Explorer

Adjust the model's structural parameters and see how the simulated business cycle moments change. Compare to U.S. data targets — can you find a calibration that matches all moments?

Moment | U.S. Data | Model | Match?
$\sigma_y$ (%) | 1.72 | 1.72 | ✓
$\sigma_c / \sigma_y$ | 0.50 | 0.50 | ✓
$\sigma_i / \sigma_y$ | 3.00 | 3.00 | ✓
$\sigma_h / \sigma_y$ | 0.80 | 0.31 | ✗
$\text{corr}(c, y)$ | 0.88 | 0.88 | ✓
$\text{autocorr}(y)$ | 0.85 | 0.85 | ✓

Figure 14.2. Calibration explorer. Adjust parameters and watch model moments update. Green check = within 20% of target. Red cross = outside 20%. The hours volatility ratio ($\sigma_h/\sigma_y$) is the hardest moment to match — the basic RBC model consistently underestimates it.

14.7 The HP Filter

Hodrick-Prescott filter. Decomposes a time series $y_t$ into trend $\tau_t$ and cycle $c_t = y_t - \tau_t$:
$$\min_{\{\tau_t\}} \sum_{t=1}^T (y_t - \tau_t)^2 + \lambda \sum_{t=2}^{T-1} [(\tau_{t+1} - \tau_t) - (\tau_t - \tau_{t-1})]^2$$ (Eq. 14.10)

The smoothing parameter $\lambda$ controls the tradeoff: higher $\lambda$ means smoother trend. Standard: $\lambda = 1600$ for quarterly data.
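Eq. 14.10 has a closed-form solution: stacking the first-order conditions gives $\tau = (I + \lambda D'D)^{-1} y$, where $D$ is the $(T-2)\times T$ second-difference matrix. Below is a minimal Python implementation; the simulated series is an illustrative stand-in, not actual GDP data.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter via the closed-form solution of Eq. 14.10:
    trend = (I + lam * D'D)^{-1} y, with D the second-difference matrix."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Build the (T-2) x T second-difference matrix D
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(T) + lam * D.T @ D, y)
    cycle = y - trend
    return trend, cycle

# Example: filter a simulated log-GDP series (random walk with drift, a stand-in)
rng = np.random.default_rng(0)
log_gdp = np.cumsum(0.005 + 0.01 * rng.normal(size=200))
trend, cycle = hp_filter(log_gdp, lam=1600.0)
print(cycle.std() * 100)   # cyclical standard deviation, in percent
```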

Interactive: HP Filter Visualizer

A simulated GDP series is decomposed into trend and cycle using the HP filter. Drag $\lambda$ to see the tradeoff: low $\lambda$ lets the trend track every wiggle (small cycles), high $\lambda$ forces a smooth trend (large cycles).


Figure 14.3. HP filter applied to simulated log GDP. Top panel: data (blue) and trend (red). Bottom panel: cyclical component (green). Standard $\lambda = 1600$ for quarterly data. Drag the slider to feel why the choice of $\lambda$ matters.

Example 14.5 — Model Moments vs. U.S. Data

Compare the baseline RBC model ($\alpha = 0.36$, $\beta = 0.99$, $\delta = 0.025$, $\rho_z = 0.95$, $\sigma_\varepsilon = 0.007$) to quarterly U.S. data (1947–2019, HP-filtered with $\lambda = 1600$).

Moment | U.S. Data | RBC Model | Match?
$\sigma_y$ (%) | 1.72 | 1.72 | Yes (targeted)
$\sigma_c/\sigma_y$ | 0.50 | 0.52 | Yes
$\sigma_i/\sigma_y$ | 3.00 | 2.84 | Yes
$\sigma_h/\sigma_y$ | 0.80 | 0.31 | No
$\text{corr}(c,y)$ | 0.88 | 0.94 | Approx.
$\text{autocorr}(y)$ | 0.85 | 0.86 | Yes

Key success: Consumption smoothing ($\sigma_c/\sigma_y \approx 0.5$) and investment volatility ($\sigma_i/\sigma_y \approx 3$) emerge naturally from optimal saving.

Key failure: Hours volatility is far too low ($0.31$ vs. $0.80$). The model needs either indivisible labor (Hansen, 1985) or labor market frictions to match the data.

The Historical Lens

The Lucas critique (1976): Why it destroyed large-scale Keynesian models.

In the 1960s and early 1970s, central banks and treasuries relied on large-scale econometric models — some with hundreds of equations — to forecast the economy and evaluate policy. The Federal Reserve's FRB/MIT/Penn model, the Brookings model, and similar systems estimated behavioral relationships (the marginal propensity to consume, the Phillips curve slope, the interest sensitivity of investment) from decades of historical data.

These models appeared to offer a stable tradeoff between inflation and unemployment. The Phillips curve suggested that the Fed could "buy" a percentage point of lower unemployment by accepting 1–2 percentage points of additional inflation. Policymakers in the Johnson and Nixon administrations exploited this tradeoff.

The critique: Lucas showed that the Phillips curve's slope was not a structural constant but a function of the monetary regime. Under a regime that kept inflation low, workers' inflation expectations were anchored, and surprise inflation could temporarily boost employment. But when the Fed systematically pursued inflationary policy, workers adjusted their expectations. The Phillips curve shifted up — the economy got higher inflation with no employment gain. This is exactly what happened during the stagflation of the 1970s.

The legacy: Lucas's paper redirected all of macroeconomics toward models built from structural primitives — preferences, technology, and equilibrium concepts that are invariant to policy. The RBC model was the first full implementation of this vision. Every DSGE model used by central banks today descends from the methodological revolution Lucas triggered.

Thread Example: The Kaelani Republic

RBC Analysis of Kaelani's Commodity Shock

A 20% decline in copper prices is modeled as a negative technology shock. Copper accounts for 40% of exports, and exports are 20% of GDP, so copper is roughly 8% of GDP; the price decline is therefore equivalent to a decline in GDP-equivalent productivity of about $0.20 \times 0.08 = 1.6\%$.

Output: Falls ~1.6%, partially recovers as resources reallocate. Consumption: Falls by less (smoothing). Investment in copper: Falls sharply. Hours: In copper sector, decline sharply; other sectors may absorb some workers.

The RBC model captures output and consumption dynamics, but misses unemployment dynamics — displaced copper miners don't instantly find jobs in other sectors.

Summary

Key Equations

Label | Equation | Description
Eq. 14.1 | $E_0 \sum \beta^t u(c_t, 1-l_t)$ | Household preferences
Eq. 14.2 | $\ln z_t = \rho_z \ln z_{t-1} + \varepsilon_t$ | Technology shock process
Eq. 14.4 | Bellman equation | Value function
Eq. 14.5 | Euler equation | Consumption smoothing
Eq. 14.6 | $MRS_{leisure,cons} = MPL$ | Intratemporal labor condition
Eq. 14.8–14.9 | Log-linearized system | Approximate solution
Eq. 14.10 | HP filter | Trend-cycle decomposition

Practice

  1. Write down the social planner's problem for the basic RBC model with $u(c, l) = \ln c + 2\ln(1-l)$ and $Y = zK^{0.36}l^{0.64}$. Derive the Euler equation and the intratemporal labor condition.
  2. Using the calibration table, compute the steady-state values of $K/Y$, $I/Y$, and $C/Y$.
  3. Log-linearize the production function $Y = zK^\alpha l^{1-\alpha}$ around steady state. Verify Eq. 14.8.

Apply

  1. Identify three historical recessions and evaluate whether a "negative technology shock" is plausible for each.
  2. Explain the critique that measured TFP (the Solow residual) may reflect demand shocks, not true technology changes.
  3. Compare the RBC model's labor market predictions with real-world labor markets. What feature would you add?
  4. What happens if the HP filter $\lambda$ is set too high or too low? Why does it matter for model evaluation?

Challenge

  1. Solve the basic RBC model analytically under $\delta = 1$. Show that the policy functions are log-linear.
  2. Explain the Shimer puzzle: why does the Mortensen-Pissarides model generate too little unemployment volatility?
  3. Compare the RBC model and IS-LM on: (a) source of cycles, (b) role of monetary policy, (c) policy implications.