Study role of commitment in allowing automated (algorithmic) pricing to sustain supracompetitive prices. Key ingredient: letting managers override the algorithms in place. Cost of override = less commitment.
Even w/ costless override, supracompetitive pricing is possible in equilibrium! Just need to design the algorithm in a clever way.
Prediction helps, too. If algorithms lack both commitment and prediction capabilities, then we get Bertrand pricing.
Broadly holds true in a dynamic setting, too. Dynamic game eqm is in mixed strategies, yielding asymmetric (across firms) Edgeworth cycling. Matches gasoline pricing data in ways other workhorse cycle models don't!
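A minimal sketch of the commitment logic (my toy illustration with assumed parameters, not the paper's model): a trigger-style pricing algorithm posts the monopoly price and reverts to marginal-cost pricing forever if undercut, so a patient rival prefers matching it over a one-shot undercut.

```python
# Toy repeated Bertrand duopoly: marginal cost C, monopoly price PM,
# winner-take-all demand of size 1, discount factor DELTA.
C, PM, DELTA = 0.0, 1.0, 0.9

def profit2(p1, p2):
    """Firm 2's per-period profit: lower price takes the market, ties split."""
    if p2 < p1: return p2 - C
    if p2 > p1: return 0.0
    return (p2 - C) / 2

def payoff_vs_algorithm(deviate, periods=50):
    """Firm 2's discounted payoff when firm 1 runs a trigger-style algorithm:
    post PM while firm 2 has always matched, else post C forever after."""
    punished, total = False, 0.0
    for t in range(periods):
        p1 = C if punished else PM
        p2 = PM - 0.01 if (deviate and t == 0) else p1   # undercut once at t=0
        total += DELTA ** t * profit2(p1, p2)
        punished = punished or (p2 < p1)
    return total

print(payoff_vs_algorithm(False), payoff_vs_algorithm(True))
```

Matching the algorithm earns half the monopoly profit every period; undercutting grabs nearly the whole market once, then triggers marginal-cost pricing forever.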
"Economic Inequality and Market Power" Latest available draft: January 2024. [Dropbox] [SSRN] [Slides]
Develop tractable partial equilibrium framework mapping income/wealth inequality to demand. Idea: characterize consumers separately based on wealth and tastes. Wealth causally changes WTP for the products via price sensitivity.
Framework can be used to incorporate inequality into any static IO exercise, empirical or theoretical. Focus on two test cases.
First: develop axioms that, combined with framework, imply a new inequality-adjusted, dollar metric of consumer welfare.
Second: develop sufficient conditions for increases in inequality to increase markups. In future: empirical exercise.
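A toy version of the wealth-to-markup channel (my illustration; the mapping WTP = taste × wealth and all numbers are assumptions, not the paper's specification): a mean-preserving spread in wealth shifts demand toward high-WTP consumers and raises the monopoly markup.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = rng.uniform(0.5, 1.5, 100_000)           # tastes, fixed across scenarios

def monopoly_markup(wealth, cost=1.0):
    """Grid-search the monopoly price against simulated demand,
    where wealth lowers price sensitivity: WTP = taste * wealth."""
    wtp = theta * wealth
    grid = np.linspace(cost, 5.0, 400)
    profit = [(p - cost) * (wtp >= p).mean() for p in grid]
    return grid[int(np.argmax(profit))] / cost

equal  = np.full(theta.size, 2.0)                # everyone holds wealth 2
spread = rng.choice([1.0, 3.0], theta.size)      # mean-preserving wealth spread
print(monopoly_markup(equal), monopoly_markup(spread))
```

Under equal wealth the monopolist serves everyone and the markup is about 2; under the spread it prices the poor out and sells only to the rich at a markup near 2.75.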
How should an environmental regulator target costly inspections in a dynamic context?
Idea: use linked regulation. Inspect bad actors more regularly, but also inspect plants co-owned with known bad actors, since compliance may be correlated within the firm.
Texas does this in enforcing CWA and RCRA. But how well do they do it? What are the gains of linking? Estimate dynamic model of enforcement with linked regulation and multi-plant firms.
Targeting historic bad actors is better than random inspections, and linking is better than targeting one plant at a time. Roughly 50% efficiency improvements from this!
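A stylized simulation of why linking helps (my illustration with made-up probabilities, not the estimated model): when compliance types are shared within a firm, pooling co-owned plants' violation records gives a sharper targeting signal than any single plant's own record.

```python
import numpy as np

rng = np.random.default_rng(1)
n_firms, plants = 1000, 5           # each firm co-owns 5 plants
bad = rng.random(n_firms) < 0.3     # hidden firm type, shared across its plants
p_viol = np.where(bad, 0.5, 0.2)    # per-inspection violation probability
history = rng.random((n_firms, plants)) < p_viol[:, None]   # past records

def expected_detections(scores, budget=1000):
    """Inspect the `budget` highest-scoring plants; sum their violation probs."""
    tie_break = 1e-6 * rng.random(scores.size)
    order = np.argsort(-(scores.ravel() + tie_break))[:budget]
    return np.repeat(p_viol, plants)[order].sum()

random_scores = rng.random((n_firms, plants))
own_scores = history.astype(float)                              # own record only
linked_scores = np.broadcast_to(history.sum(1, keepdims=True),  # pool co-owned
                                (n_firms, plants)).astype(float)
print([round(expected_detections(s))
       for s in (random_scores, own_scores, linked_scores)])
```

The pooled (linked) score averages out plant-level noise in the record, so it ranks firms by type more accurately and catches more violations per inspection.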
Why are some firms better at predicting demand than others when setting prices?
How do we measure firms' information in the presence of other unobservables that shift prices? New identification results leveraging common-knowledge demand shifts.
Apply the method to hotel industry. Turns out, hotels vary widely in demand prediction abilities. This matters for productivity, and it doesn't tend to advantage big chains with big data capabilities.
"Volatility, Uncertainty, and Hotel Capacity" Latest available draft: June 2020. [Dropbox] [Slides]
Capacity is sunk: when demand changes, capacity cannot adjust accordingly. Does demand stochasticity therefore distort capacity choice?
In the hotel industry, the distortion would be small if prices could adjust freely to demand shifts. But prices cannot, so hotels build excess capacity in anticipation of stochastic demand.
Theory shows that this is because excess capacity is cheap and often slack. In other markets, this may not be true.
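The logic can be seen in a newsvendor-style sketch (my toy numbers, not the paper's model): with a rigid price and sunk capacity, optimal capacity sits at a high quantile of demand whenever capacity is cheap, so it exceeds mean demand and is slack most of the time.

```python
import numpy as np

# Price p is rigid; capacity K is sunk before demand D is realized.
# Revenue = p * min(D, K); marginal capacity cost c << p (capacity is cheap).
p, c = 100.0, 20.0
rng = np.random.default_rng(0)
D = rng.lognormal(mean=4.0, sigma=0.5, size=50_000)   # stochastic demand draws

def profit(K):
    return p * np.minimum(D, K).mean() - c * K

grid = np.linspace(20, 200, 300)
K_star = grid[int(np.argmax([profit(K) for K in grid]))]
print(K_star, D.mean())
```

The optimum is the critical-fractile quantile of demand at 1 - c/p = 0.8, which lies well above mean demand here, so capacity is idle 80% of the time by construction.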
Resting Papers
"Informational Complementarities and IT Arms Races" Draft available upon request.
Examine firms' information-gathering in equilibrium. Theory suggests that if one firm gets better at predicting demand, its rivals may try to do the same.
In the hotel industry, equilibrium effects account for over half the variation in demand prediction quality across hotels.
This suggests that if one or two large hotel chains achieve an improvement in demand forecasting ability, they may incite an IT arms race where everyone scrambles to improve as well.
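A toy supermodular game captures the arms-race logic (my illustration with assumed payoffs, not the paper's model): when the marginal value of forecasting investment rises with the rival's investment, one firm's exogenous improvement pulls up the rival's best response too.

```python
# Firm i picks forecasting investment x_i >= 0 with payoff
#   a * x_i + b * x_i * x_j - x_i**2 / 2,
# so the marginal value of x_i rises with the rival's x_j (complements).
a, b = 1.0, 0.4

def best_response(x_rival, own_a=a):
    return own_a + b * x_rival     # first-order condition: a + b*x_j - x_i = 0

x1 = x2 = 0.0
for _ in range(100):               # best-response dynamics converge since b < 1
    x1, x2 = best_response(x2), best_response(x1)
print(x1, x2)                      # symmetric equilibrium at a / (1 - b)

a1 = 1.5                           # firm 1 alone gets better at forecasting
for _ in range(100):
    x1, x2 = best_response(x2, a1), best_response(x1)
print(x1, x2)                      # firm 2 escalates too: the "arms race"
```

Firm 2's fundamentals never changed, yet its equilibrium investment rises from a/(1-b) to (a + b*a1)/(1 - b**2), purely through the complementarity.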