Joseph Shapiro (4/19/17)

Joseph Shapiro

Yale University
Homepage

“Consequences of the Clean Water Act and the Demand for Water Quality”

Abstract: Since the 1972 U.S. Clean Water Act, government and industry have invested over $1 trillion to abate water pollution, or $100 per person-year. Over half of U.S. stream and river miles, however, still violate pollution standards. We use the most comprehensive set of files ever compiled on water pollution and its determinants, including 50 million pollution readings from 170,000 monitoring sites and a network model of all U.S. rivers, to study water pollution’s trends, causes, and welfare consequences. We have three main findings. First, water pollution concentrations have fallen substantially. Between 1972 and 2001, for example, the share of waters safe for fishing grew by 11 percentage points. Pollution was declining at even faster rates before 1972. Second, the Clean Water Act’s grants to municipal wastewater treatment plants, which account for $680 billion in expenditure, caused some of these declines. Through these grants, it cost around $1.5 million (2014 dollars) to make one river-mile fishable for a year. We find little displacement of municipal expenditure due to a federal grant. Third, the grants’ estimated effects on housing values are about a fourth of the grants’ costs; we carefully discuss welfare implications. Full Paper
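The per-person-year figure follows from simple arithmetic. A minimal sketch in Python, assuming a 1972-2014 spending window and an average US population of roughly 250 million (both assumptions for illustration, not figures from the paper):

    # Back-of-the-envelope check of the abstract's headline figures.
    total_spending = 1.0e12      # ~$1 trillion in abatement spending since 1972
    years = 2014 - 1972          # assumed spending window
    avg_population = 250e6       # assumed average US population over the window

    per_person_year = total_spending / (avg_population * years)
    print(f"Spending per person-year: ${per_person_year:,.0f}")  # ~$95, close to the quoted $100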

James Stock (4/5/17)

James Stock

Harvard University
Homepage

“Cost Pass-Through to Higher Ethanol Blends at the Pump: Evidence from Minnesota Gas Station Data” and “RIN Pass-Through at Gasoline Terminals”

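Pass-through in this setting is typically estimated by regressing changes in retail prices on changes in upstream costs. A minimal sketch of such a regression on simulated data (the variable names, the simulated pass-through rate, and the simple OLS specification are illustrative assumptions, not the papers' design):

    import numpy as np

    # Hypothetical panel: weekly changes in retail E85 prices and wholesale costs.
    rng = np.random.default_rng(0)
    d_cost = rng.normal(0.0, 0.05, 500)                  # wholesale cost changes ($/gal)
    d_price = 0.7 * d_cost + rng.normal(0.0, 0.02, 500)  # simulated retail response

    # The OLS slope is the estimated pass-through rate (1.0 = complete pass-through).
    X = np.column_stack([np.ones_like(d_cost), d_cost])
    beta, *_ = np.linalg.lstsq(X, d_price, rcond=None)
    print(f"Estimated pass-through: {beta[1]:.2f}")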

Martin Weitzman (3/17/17)

Martin Weitzman

Harvard University
Homepage

“On a World Climate Assembly and the Social Cost of Carbon”

Abstract: This paper postulates the conceptually useful allegory of a futuristic “World Climate Assembly” (WCA) that votes for a single worldwide price on carbon emissions via the basic democratic principle of one-person one-vote majority rule. If this WCA framework can be accepted in the first place, then voting on a single internationally-binding minimum carbon price (the proceeds from which are domestically retained) tends to counter self-interest by incentivizing countries or agents to internalize the externality. I attempt to sketch out the sense in which each WCA-agent’s extra cost from a higher emissions price is counter-balanced by that agent’s extra benefit from inducing all other WCA-agents to simultaneously lower their emissions in response to the higher price. The first proposition of this paper derives a relatively simple formula relating each emitter’s single-peaked most-preferred world price of carbon emissions to the world “Social Cost of Carbon” (SCC). The second and third propositions relate the WCA-voted world price of carbon to the world SCC. I argue that the WCA-voted price and the SCC are unlikely to differ sharply. Some implications are discussed. The overall methodology of the paper is a mixture of mostly classical with some behavioral economics. Full Paper
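The counter-balancing logic can be sketched with a generic externality formulation (an illustrative reconstruction, not the paper's actual propositions): let agent i bear abatement cost C_i(p) at world carbon price p and damage D_i(E(p)) from world emissions E(p), with E'(p) < 0. Agent i's most-preferred price then solves

    \max_p \; -C_i(p) - D_i\big(E(p)\big), \qquad \text{FOC:}\quad C_i'(p) = -D_i'\big(E(p)\big)\, E'(p).

The left side is agent i's extra cost from a marginally higher price; the right side is the extra benefit from inducing all agents, not just itself, to lower emissions. Summing the marginal damages D_i' across agents yields the world SCC, the benchmark against which the WCA-voted price is compared.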

Wolfram Schlenker (3/15/17) – Cancelled

Wolfram Schlenker

Columbia University
Homepage

“Ground-Level Ozone and Corn Yields in the United States”

Abstract: US corn yields have been growing exponentially since 1950. Technological breakthroughs (new varieties, fertilizer, machinery) have traditionally been identified as the key drivers of this growth. In the last 25 years (1990-2014), yields have grown at an average rate of 1.5% per year. While a warming climate is predicted to decrease yields over the next decade, these projections are net of any trend, which has been treated as exogenous. Understanding the drivers of this trend is crucial for future food security. We provide new empirical evidence that there exists a nonlinear effect of ozone on US corn yields. Our county-level panel analysis links observed historic corn yields to various air pollution measures constructed from fine-scaled hourly pollution monitor data. We find a statistically significant critical threshold of 65 ppb for hourly daytime ozone, above which yields decline linearly in ozone. This is considerably higher than the 40 ppb threshold derived in controlled experiments that is used as a standard in Europe. Our linear exposure model gives better yield predictions than the secondary standard W126 newly proposed by the EPA. The reduction in peak ozone levels over these 25 years is responsible for 44% of the observed trend in average corn yields. Further reductions will have no yield effect, as peak ozone is now below 65 ppb. A back-of-the-envelope calculation reveals that the elimination of peak ozone has reduced global prices of the four basic staple crops by roughly 10% and increased consumer surplus from these commodities by $100 billion annually. While US farmers have seen increased yields, the reduction in prices offsets these gains. Farmers outside the US lost through lower prices.
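The exposure measure described here can be computed directly from hourly monitor readings. A minimal sketch in Python (the daytime window and variable names are illustrative assumptions):

    import numpy as np

    THRESHOLD_PPB = 65  # critical threshold estimated in the paper

    def excess_ozone_exposure(hourly_daytime_ozone_ppb):
        """Cumulative daytime ozone above 65 ppb: the exposure measure
        under which yields decline linearly."""
        o = np.asarray(hourly_daytime_ozone_ppb, dtype=float)
        return np.maximum(o - THRESHOLD_PPB, 0.0).sum()

    # A day peaking at 80 ppb contributes only its hours above the threshold:
    day = [40, 55, 62, 70, 80, 75, 66, 58]
    print(excess_ozone_exposure(day))  # 5 + 15 + 10 + 1 = 31.0 ppb-hours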

Billy Pizer (3/1/17)

Billy Pizer

Duke University
Homepage

“Prices versus Quantities with Policy Updating”

Abstract: We explore how policy updates and intertemporal trading of regulated quantities change the traditional comparative advantage of prices versus quantities. Intertemporally tradable quantity regulation leads firms to set current prices equal to expected future prices. We show that policy updates can take advantage of this behavior to achieve the first best in all periods, so price regulation is never preferred. If we assume policy updates are driven partly by political “noise,” however, prices can be preferred. Applied to climate change, we estimate a $2 billion advantage of quantities versus prices over five years, which could be reversed by political noise.
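The intertemporal-trading mechanism can be stated as a one-line no-arbitrage condition (a stylized statement for intuition, not the paper's full model): when permits can be banked across periods, firms hold them until prices equalize in expectation,

    p_t = \mathbb{E}_t\left[ p_{t+1} \right]

(abstracting from discounting). Because today's permit price already embeds expectations about future policy, updates to future policy move current prices, which is the behavior the regulator can exploit to reach the first best.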

Scott Barrett (2/15/17)

Scott Barrett

Columbia University
Homepage

“Property Rights vs. Cooperative Agreements on the Global Ocean Commons”

Abstract: Collective action for managing the world’s ocean fisheries relies on two main types of institution, property rights (exclusive economic zones), which are established in customary law, and cooperative agreements (regional fisheries management organizations), which are established in treaty law. In this paper I develop a model in which both institutions emerge as equilibrium outcomes of an ocean fisheries game. I show that, as a general matter, both institutions help to limit overfishing of highly migratory stocks but that neither institution alone, nor both together, can suffice to overcome collective action failures on the global ocean commons. Full Paper
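A textbook common-pool formulation conveys the underlying collective-action problem (for intuition only; this is not the paper's game): with n countries choosing harvests h_i from a shared stock x that regenerates by g(x),

    \pi_i = h_i\,\big(P - c(x)\big), \qquad x' = x + g(x) - \sum_{j=1}^{n} h_j,

where P is the fish price and c(x) is a unit harvest cost that falls with the stock. Each country ignores the effect of its own harvest on the stock available to the others, so noncooperative harvesting exceeds the cooperative optimum; for highly migratory stocks, neither EEZs nor RFMOs internalize this externality completely.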

Thomas R. Covert (12/7/16)

Thomas R. Covert

University of Chicago
Homepage

“Learning to be productive and learning to produce in the North Dakota Shale Boom”

Abstract: The learning-by-doing literature shows that firms with more experience make more efficient choices about unobserved factors of production. That is, firms learn to be productive. In this paper, I argue that experience may also help firms learn to make improved choices about observed factors, so that firms learn how to produce. In administrative data documenting the use of hydraulic fracturing technology by firms in North Dakota’s shale oil boom, I find that these firms primarily learned how to produce. While the productivity of the average well is stable across cohorts, firms made more efficient choices about observable inputs in later cohorts than they did in earlier cohorts. To determine whether this can be explained by learning, I measure the efficiency of input choices using production function estimates based on data about fracking technology that was available to firms when they made those choices. These ex ante measures of efficiency are more stable than ex post measures, and suggest that the continual arrival of publicly available data on fracking inputs and oil production helped firms learn how to optimize the fracking production function.
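The ex ante efficiency measure can be illustrated as follows: fit a production function only on wells observed up to the decision date, then compare a firm's chosen input to the level that model would have recommended. A minimal sketch, assuming a log-linear (Cobb-Douglas) technology and hypothetical variable names (both are illustrative assumptions):

    import numpy as np

    def ex_ante_optimal_input(log_inputs_past, log_output_past, w, p):
        """Fit log(output) = a + b*log(input) on wells observed so far,
        then return the input level maximizing expected profit at
        input price w and output price p."""
        X = np.column_stack([np.ones_like(log_inputs_past), log_inputs_past])
        (a, b), *_ = np.linalg.lstsq(X, log_output_past, rcond=None)
        # For q = exp(a) * x**b with 0 < b < 1, profit p*q - w*x is
        # maximized where p * exp(a) * b * x**(b-1) = w.
        return (b * p * np.exp(a) / w) ** (1.0 / (1.0 - b))

Comparing actual input choices against this ex ante optimum, rather than against a model fit on all data ex post, is what separates learning how to produce from hindsight.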


Ignacia Mercadal (11/16/16)

Ignacia Mercadal

Columbia University
Homepage

“Dynamic competition and arbitrage in electricity markets: The role of financial players”

Abstract: I study the role of purely financial trading in wholesale electricity markets, where financial transactions take place alongside sales and purchases by physical participants, mostly utilities and generators. I focus on the Midwest electricity market, where a regulatory change that exogenously attracted more financial bidders in 2011 acts as a natural experiment. Using a rich dataset on individual behavior, I examine how both physical and financial participants responded, and find that financial trading decreases generators’ market power but does not fully eliminate it. As a consequence, consumers are better off, but productive efficiency might go down. I develop a test of the hypothesis of static Nash equilibrium, which is required for the validity of standard policy evaluation and structural IO tools. To implement the test, I present a new method to study the competitive structure of electricity markets, using machine learning tools to define markets. I reject the null of static Nash in favor of dynamic competition.
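The market-definition step can be illustrated with a standard clustering approach: group pricing nodes whose price series co-move. A minimal sketch on simulated data (k-means on the correlation matrix is an illustrative choice, not necessarily the paper's method):

    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical data: hourly prices, one column per pricing node.
    rng = np.random.default_rng(0)
    prices = rng.normal(30.0, 5.0, size=(1000, 40))

    # Nodes whose prices co-move are treated as one market: cluster
    # each node's correlation profile with all other nodes.
    corr = np.corrcoef(prices.T)
    markets = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(corr)
    print(markets)  # market label for each of the 40 nodes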


Janet Currie (11/02/16)

Janet Currie

Princeton University
Homepage

“Lead, Delinquency, and Crime: Evidence from Rhode Island”

Abstract: High blood lead levels have been linked to greater aggressiveness and criminal activity, but whether the link is causal, and whether low levels of lead exposure also matter, remain controversial. Using linked administrative data for all Rhode Island children born between 1991 and 2005, we examine the relationship between blood lead levels, school disciplinary infractions, and detention in the juvenile justice system. Because exposure to lead is linked to many markers of lower socioeconomic status that are themselves predictive of delinquency and crime, our preferred estimates instrument blood lead levels with a proxy for exposure to lead-contaminated soils near major roadways. While the instrumental variables estimates are the largest, all of the estimation methods we use, including mother fixed effects, suggest that reducing lead, even from already relatively low levels, substantially reduces delinquency and crime.
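The instrumental-variables strategy is standard two-stage least squares: the first stage fits blood lead on the roadway-proximity proxy, and the second stage regresses outcomes on the fitted values. A minimal hand-rolled sketch on simulated data (the variable names and effect sizes are illustrative assumptions):

    import numpy as np

    def two_stage_least_squares(y, endog, instrument):
        """2SLS with one endogenous regressor and one instrument."""
        Z = np.column_stack([np.ones_like(instrument), instrument])
        gamma, *_ = np.linalg.lstsq(Z, endog, rcond=None)
        fitted = Z @ gamma                        # first-stage fitted blood lead
        X = np.column_stack([np.ones_like(fitted), fitted])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[1]                            # effect of lead on the outcome

    # Hypothetical example: the proximity proxy shifts blood lead, which in
    # turn raises the probability of a disciplinary infraction.
    rng = np.random.default_rng(0)
    proxy = rng.normal(size=5000)
    lead = 0.5 * proxy + rng.normal(size=5000)
    infractions = 0.3 * lead + rng.normal(size=5000)
    print(two_stage_least_squares(infractions, lead, proxy))  # close to 0.3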