2015 Conference - Session 1

Session 1 - Thursday, March 19, 9:00 - 10:30

 

A.1: What Should Policy Makers (and the Public) Know about Interpreting Regulatory BCA? (Marvin 309)

Chair: Susan Dudley (sdudley@gwu.edu), The George Washington University

Panelists to Include:

1. Richard Belzer (regcheck@mac.com), Regulatory Checkbook

2. Glenn Blomquist (gcblom@email.uky.edu), University of Kentucky

3. Chris Carrigan (ccarrigan@gwu.edu), The George Washington University

4. Tony Cox (tcoxdenver@aol.com), Cox Associates

5. Peter Linquiti (linquiti@gwu.edu), The George Washington University

6. Brian Mannix (BMannix@aol.com), The George Washington University

Participants from a January 2015 discussion will consider what policy makers need to know to understand and interpret regulatory impact analyses (RIAs).

B.1: Water Resources Management (Marvin 307)

Chair: William Wheeler (wheeler.william@epa.gov), U.S. Environmental Protection Agency

Presentations:

1.     Economic Assessment of Climate Change Adaptation Pilot Studies in the Great Lakes Region, Tess Forsell* (tess.forsell@erg.com), Eastern Research Group; National Oceanic and Atmospheric Administration's Coastal Service Center; Horsley Witten Group, Inc.

The economic effects of flooding from extreme precipitation events are being experienced throughout the Great Lakes region. The purpose of this study was to assess the economic costs and benefits of green infrastructure (GI) as a method of reducing the negative effects of flooding in Duluth, Minnesota, and Toledo, Ohio. A secondary purpose of the study was to develop an analytical framework that can be applied in other communities to 1) assess how their community may be impacted by flooding with increased precipitation, 2) consider the range of available green infrastructure and land use policy options to reduce flooding, and 3) identify the benefits that can be realized by implementing GI. Flooding modeled under current and future precipitation scenarios was coupled with current and future land use conditions to account for increased impervious surfaces that can further increase stormwater runoff volumes and peak flows. Next, flooding under current and future scenarios was modeled and associated damages were estimated using assumptions about additional flood storage that could be provided through the implementation of GI. The amount of reduced damages associated with flood mitigation strategies is represented as “benefits” (i.e., the difference between the economic impact of flooding without flood mitigation and the economic impact with the implementation of flood mitigation infrastructure). Monetized benefits include: reduced building damages; increased recreational use; reduced flood damaged land restoration costs; and reduced storm sewer infrastructure costs. The total present value and annualized benefits are monetized over 20 years and 50 years. In Duluth, the community where more benefits could be monetized, the 50-year effects are estimated to be $4.17 million in costs and $4.68 million in benefits. In this comparison benefits exceed costs, providing evidence in favor of implementing the GI project.
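As a back-of-the-envelope illustration of the benefit computation described above (benefits as avoided flood damages, monetized as a present value over a 20- or 50-year horizon), the sketch below discounts constant annual cost and benefit streams. The annual dollar figures and the 3% discount rate are hypothetical placeholders, not values from the study.

```python
# Illustrative sketch (not the study's actual model): compare the present
# value of a green-infrastructure (GI) project's costs and benefits over a
# 50-year horizon. Annual figures and the 3% rate are hypothetical.

def present_value(annual_amount: float, rate: float, years: int) -> float:
    """PV of a constant annual amount received at the end of years 1..n."""
    return sum(annual_amount / (1 + rate) ** t for t in range(1, years + 1))

annual_benefit = 0.18   # $ millions/year, hypothetical
annual_cost = 0.16      # $ millions/year, hypothetical
rate, horizon = 0.03, 50

pv_benefits = present_value(annual_benefit, rate, horizon)
pv_costs = present_value(annual_cost, rate, horizon)
print(f"PV benefits: ${pv_benefits:.2f}M, PV costs: ${pv_costs:.2f}M, "
      f"net: ${pv_benefits - pv_costs:.2f}M")
```

As in the Duluth comparison, the project passes a benefit-cost test when the discounted benefit stream exceeds the discounted cost stream.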

2.     Joint Effects of Storm Surge and Sea-Level Rise on U.S. Coasts, Lindsay Ludwig* (lludwig@indecon.com) and James Neumann, Industrial Economics; Kerry Emanuel and Sai Ravela, WindRisk Tech and MIT; Paul Kirshen, University of New Hampshire; Kirk Bosma, Woods Hole Group; and Jeremy Martinich, U.S. Environmental Protection Agency

 Ludwig Slides

Recent literature, the US Global Change Research Program’s National Climate Assessment, and recent events, such as Hurricane Sandy, highlight the need to take better account of both storm surge and sea-level rise (SLR) in assessing coastal risks of climate change. This study combines three models – a tropical cyclone simulation model; a storm surge model; and a model for economic impact and adaptation – to estimate the joint effects of storm surge and SLR for the US coast through 2100. The model is tested using multiple SLR scenarios, including those incorporating estimates of dynamic ice-sheet melting, two global greenhouse gas (GHG) mitigation policy scenarios, and multiple general circulation model climate sensitivities. The results illustrate that a large area of coastal land and property is at risk of damage from storm surge today; that land area and economic value at risk expands over time as seas rise and as storms become more intense; that adaptation is a cost-effective response to this risk, but residual impacts remain after adaptation measures are in place; that incorporating site-specific episodic storm surge increases national damage estimates by a factor of two relative to SLR-only estimates, with greater impact on the East and Gulf coasts; and that mitigation of GHGs contributes to significant lessening of damages. For a mid-range climate-sensitivity scenario that incorporates dynamic ice sheet melting, the approach yields national estimates of the impacts of storm surge and SLR of $990 billion through 2100 (net of adaptation, cumulative undiscounted 2005$); GHG mitigation policy reduces the impacts of the mid-range climate-sensitivity estimates by $84 to $100 billion.

3.     Assessing the Distributional Consequences of Premium and Claims Payments in the National Flood Insurance Program, Okmyung Bin* (bino@ecu.edu) and John A. Bishop, East Carolina University; Carolyn Kousky, Resources for the Future

This study examines the redistributional effects of the National Flood Insurance Program (NFIP), i.e., who benefits and who bears the costs of the NFIP, using a national database of premium, coverage, and claims payments at the zip code level between 2001 and 2009. Some argue the program provides an important benefit to low-income households living in low-lying areas in communities like those along the Mississippi River system, while others believe that it acts as a subsidy to wealthy owners of beach homes. A recent study, based on more than 25 years of NFIP premium and claims data to determine how the program’s prices and payouts correlate with per-capita county income, finds no evidence that the NFIP disproportionately advantages richer counties (Bin, Bishop, and Kousky, Public Finance Review 2012). Although such a finding is a useful first-order assessment, more detailed analysis is warranted since claims payments tend to be concentrated on a few policies, such as repetitive-loss properties. Our findings, based on more disaggregated data, should help insurance practitioners and policy makers make informed decisions regarding the flood insurance program.

4.     Economic Evaluation of Community Water Fluoridation: A Community Guide Systematic Review, Tao Ran* (xgy2@cdc.gov), Sajal Chattopadhyay and Randy Elder, U.S. Centers for Disease Control and Prevention 

A previous systematic review of the effectiveness of community water fluoridation (CWF) showed that it reduced dental caries across populations, and a 2002 economic review found, from a societal perspective, that CWF saved money. However, the effectiveness of CWF has decreased from around 50% in the 1970s to around 25% in the 1990s, so re-examining the benefits and costs of CWF is necessary. Using methods developed for Guide to Community Preventive Services economic reviews, 564 papers published from January 1995 to November 2013 were identified. Ten studies were included in the current review: four covering intervention benefits only and six providing both cost and benefit information. Additionally, two of the six studies analyzed the cost-effectiveness of CWF. In all four benefit-only studies, dental treatments in various forms decreased with the presence of CWF. For the remaining six studies, per capita annual intervention cost ranged from $0.11 to $4.89 in 2013 U.S. dollars (excluding one outlier). Variation in costs was mainly caused by community population size, with cost decreasing as community population increased. Per capita annual benefits in the six studies ranged from $5.45 to $139.78; variation in benefits was mainly due to the number and types of benefit components. The benefit-cost ratio ranged from 1.12:1 to 135:1 and was positively associated with community population size. The economic benefit of CWF exceeded the intervention cost, and the benefit-cost ratio increased with community population size.
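The review's bottom-line comparison is a simple per-capita benefit-cost ratio. The sketch below pairs the range endpoints reported in the abstract purely for illustration; the published ratios come from matched study-level figures, not from pairing range endpoints.

```python
# Minimal sketch of the per-capita benefit-cost comparison the review reports.
# The dollar figures are the range endpoints from the abstract (2013 US$ per
# capita per year); pairing them this way is illustrative only.

def benefit_cost_ratio(per_capita_benefit: float, per_capita_cost: float) -> float:
    return per_capita_benefit / per_capita_cost

low = benefit_cost_ratio(5.45, 4.89)     # smallest benefit vs. largest cost
high = benefit_cost_ratio(139.78, 0.11)  # largest benefit vs. smallest cost
print(f"BCR range (illustrative pairing): {low:.2f}:1 to {high:.0f}:1")
```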

C.1: Estimation of Cumulative Benefits and Costs of Regulation Using the "RegData" Database (Marvin 308)

Chair: James Broughel (jbroughel@mercatus.gmu.edu), George Mason University

Presentations:

1.     RegData: A Numerical Database on Industry-Specific Regulations for All U.S. Industries and Federal Regulations, 1997-2012, Patrick McLaughlin* (pmclaughlin@mercatus.gmu.edu) and Omar Al-Ubaydli, George Mason University 

We introduce RegData, formerly known as the Industry-specific Regulatory Constraint Database. RegData annually quantifies federal regulations by industry and by regulatory agency for all federal regulations from 1997 to 2012. The quantification of regulations at the industry level for all industries is without precedent. RegData measures regulation for industries at the two-, three-, and four-digit North American Industry Classification System (NAICS) levels. We created this database using text analysis to count binding constraints in the wording of regulations, as codified in the Code of Federal Regulations, and to measure the applicability of regulatory text to different industries. We validate our measures of regulation by examining known episodes of regulatory growth and deregulation as well as comparing our measures to an existing, cross-sectional measure of regulation. We then demonstrate several plausible relations between industry regulation and variables of economic interest. Researchers can use this database to study the determinants of industry regulations and to study regulations’ effects on a massive array of dependent variables, both across industries and across time.
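In the spirit of the RegData methodology (counting binding constraints in the wording of regulatory text), here is a toy restriction counter. The term list and sample passage are invented for illustration and are not the authors' actual specification.

```python
# Toy version of the RegData approach: count "binding constraint" terms
# (restrictive words such as "shall", "must", "may not", "prohibited",
# "required") in a passage of regulatory text. Term list and sample text
# are hypothetical, not the database's real specification.

import re

RESTRICTION_TERMS = ["shall", "must", "may not", "prohibited", "required"]

def count_restrictions(text: str) -> int:
    text = text.lower()
    total = 0
    for term in RESTRICTION_TERMS:
        # \b word boundaries so "shall" does not match inside other words
        total += len(re.findall(r"\b" + re.escape(term) + r"\b", text))
    return total

sample = ("The operator shall maintain records. Discharges are prohibited "
          "unless a permit is required and obtained. Vessels may not enter.")
print(count_restrictions(sample))  # shall, prohibited, required, may not -> 4
```

Counts like these, tabulated per industry and year, are the kind of panel measure the database provides.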

2.     Estimating Industry- and Agency-Specific Cumulative Costs and Benefits with RegData, Antony Davies* (antony@antolin-davies.com), Duquesne University; Patrick McLaughlin, George Mason University

Using RegData 2.0, we exploit variation in accumulated regulation across industries, agencies, and time to compare the costs of regulatory accumulation, in terms of lost productivity, with its benefits, in terms of outcomes achieved.  While agencies are sometimes required to estimate the costs and benefits of proposed regulations before those regulations are enacted, we are unaware of any ex-post analyses of the costs and benefits that actually accrued from regulatory accumulation. We examine a set of major agencies for which the likely desired outcome of regulations is generally known.  For example, the paper will compare an estimate of the lost productivity due to OSHA regulations to the likely desired outcome--improvements in workplace safety, as reflected in workplace illness, injury, and fatality data.  The results will inform both retrospective and prospective review efforts by presenting a credible, empirical methodology for estimating the cumulative impact of regulation (positive or negative).

3.     The Aggregate Cost of Regulations: A Structural Estimation of a Tractable Multi-Sector Endogenous Growth Model, Bentley Coffey* (bentleygcoffey@gmail.com), University of South Carolina; Patrick McLaughlin; Pietro Peretto, Duke University

We estimate the effects of federal regulation on industry-specific value-added to GDP using RegData 2.0 for a panel of 42 industries over 35 years (1977 – 2011). Our estimation is performed within the structure of a Schumpeterian model of endogenous growth, which produces closed-form solutions despite the complications inherent in its multi-sector dynamic general equilibrium structure. To capture the effect of regulations on firms, we treat regulations as constraints in firms’ production processes that raise fixed costs and decrease the firm’s productivity. We then estimate the parameters of this model using national and sector-specific macroeconomic data joined with RegData 2.0, which measures the incidence of regulations on industries based on text analysis of federal regulatory code. With estimates of the model’s parameters fitted to real data, we can confidently conduct counter-factual experiments on alternative regulatory environments and discuss the policy implications of our findings.

4.     Does Regulation Enhance or Inhibit Turnover of Firms by Industry? Thomas Stratmann* (tstratma@gmu.edu), Matt Mitchell and Patrick McLaughlin, George Mason University

A large body of research suggests that churn—the turnover of top firms within an industry—is the mark of a competitive, dynamic, and healthy economy. Among other things, churn has been linked to technological innovation, competitive pricing, and economic growth. The economic theory of regulation offers ambiguous predictions about the relationship between government regulation and churn. Regulation may be a disruptive force, breaking up firms and discouraging integration (Posner 1971). Or, it may be a monopolizing force, erecting barriers to entry (Stigler 1971). We employ RegData 2.0, a new dataset tracking regulatory trends by industry and agency over time, to test the relationship between regulation and churn across 211 U.S. industries over the period 1997 - 2011.  We show that, on average, the accumulation of regulation specific to an industry reduces the churn of that industry—implying a hidden but substantial economic cost of regulatory accumulation.  Our results are consistent with the ideas that regulations create barriers to entry, protect incumbent firms, and are disproportionately costly to small entities such as new firms and start-ups.

D.1: Use of BCA in Setting Homeland Security Policy  (Marvin 413-414)

Chair: Tony Homan (Anthony.homan@dot.gov), U.S. Department of Transportation

Presentations:

1.     The Social Value of Cybersecurity, Daniela Silitra* (dsilitra@mitre.org) and Haeme Nam, MITRE Corporation

Cybersecurity has been a hot topic over the past few years, and its broad scope has led to an array of studies in various areas. Similarly, measuring social value is becoming an increasingly accepted practice, especially in a period of unprecedented budget cuts. This paper will attempt to measure the social value of cybersecurity in two areas by employing a value-added measurement methodology. The two areas of interest are the national economy and corporate America. Value-measuring criteria will be identified for each of these areas, such as the number of cybersecurity jobs created or revenue losses avoided due to cyber-attacks, and each area will then be assessed against those criteria. In addition, an extension of this study will assess the social value of cybersecurity to the public at large. Phase I of the study will assess the social value of cybersecurity to the national economy and corporate America; Phase II, which assesses its social value to the public at large, will be conducted for next year's presentation.

2.     A Literature Review and Proposed Method of Measuring a Reduction in Vulnerability, Alex Moscoso* (Alex.Moscoso@tsa.dhs.gov), U.S. Transportation Security Administration

Moscoso Slides

According to the National Infrastructure Protection Plan 2013, risk is defined as a function of threat, vulnerability, and consequence.  This relationship is used by federal agencies within the Department of Homeland Security to assess risks associated with certain terrorist attack scenarios, including in benefit-cost analyses supporting rulemaking.  In developing a standard methodology to quantify risk, it is necessary to quantify the threat to a target, its vulnerability, and the consequence of a successful attack.  While the economic consequences of a successful attack can be estimated using the standard value of a statistical life, the costs of injuries, and property damage, accurate quantitative measurements of threat and vulnerability are more elusive.  With regard to rulemaking, economists seek to measure the reduction in vulnerability from the introduction of certain mitigation measures. Quantifying the effectiveness of a specific mitigation measure used to protect the homeland is a difficult task for many reasons: (1) an individual measure is usually part of a vast security system in which changes to one component affect other interconnecting components to varying degrees; (2) measuring individual components’ cascading effects through a layered security system proves challenging; and (3) while technology effectiveness can be tested, tracing the impacts of a policy is difficult.  This research will present findings from a literature review on current methods of measuring vulnerability.  It will also present a working method of measuring vulnerability using available data, fault tree analysis to map large security systems, and Monte Carlo simulations to replicate terrorist attack scenarios. This research will further the discussion on quantifying vulnerability reduction by examining what has already been accomplished in the field and proposing a method based on those accomplishments.
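A minimal sketch of what the proposed fault-tree-plus-Monte-Carlo approach might look like: model a layered security system as an AND gate over layers an attacker must defeat, then simulate attacks to estimate system vulnerability. The layer names and defeat probabilities below are invented.

```python
# Hypothetical sketch: a three-layer security system as a simple fault tree
# (attack succeeds only if every layer is defeated), with Monte Carlo
# simulation estimating the probability of a successful attack. All layer
# names and probabilities are invented for illustration.

import random

def simulate_attack(p_defeat: dict[str, float], rng: random.Random) -> bool:
    """Attack succeeds only if every layer is defeated (AND gate)."""
    return all(rng.random() < p for p in p_defeat.values())

def estimate_vulnerability(p_defeat: dict[str, float], trials: int = 100_000,
                           seed: int = 0) -> float:
    rng = random.Random(seed)
    successes = sum(simulate_attack(p_defeat, rng) for _ in range(trials))
    return successes / trials

layers = {"screening": 0.10, "detection": 0.30, "response": 0.50}  # hypothetical
baseline = estimate_vulnerability(layers)
# A mitigation measure that hardens one layer lowers system vulnerability:
mitigated = estimate_vulnerability({**layers, "detection": 0.15})
print(f"baseline vulnerability ~ {baseline:.4f}, with mitigation ~ {mitigated:.4f}")
```

The difference between the baseline and mitigated estimates is the kind of "reduction in vulnerability" a rulemaking analysis would seek to measure; real systems would need OR gates and cascading effects as well.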

3.     Estimating Benefits of Maritime Safety Training Programs, Ali Gungor* (ali.gungor@uscg.mil), U.S. Coast Guard

U.S. merchant mariners and their employers spend significant time and money on safety training programs each year, whether required by international conventions or simply following best industry practices. In 2013, the United States Coast Guard (USCG) published a final rule that imposes additional training requirements, with significant costs to mariners and the industry overall, following the international standards set by the International Maritime Organization’s Standards of Training, Certification and Watchkeeping (STCW) Convention and its 1995 and 2010 amendments. Despite the significant annual costs already incurred since 1997, and more to be incurred after the publication of this final rule, the USCG did not estimate any quantifiable or monetized benefits that could be attributed to maritime safety training. Rather, the regulatory impact analyses since 2011 provided detailed break-even analyses, transfer benefits, and qualitative benefits, among other benefit estimation methods. This presentation discusses the challenges of estimating the benefits of maritime safety training programs over the last two decades. In particular, subject matter experts attempted to answer the question “does safety training save lives?” and the follow-up question “if yes or maybe, how do you quantify or monetize the lives saved?”

4.     Estimating the Cumulative Impact of Coast Guard Regulations Under Executive Order 13563, Rosemarie Odom* (rosemarie.a.odom@uscg.mil), Paul Large and Ali Gungor, U.S. Coast Guard

In Executive Order 13563 (January 18, 2011), agencies are directed to tailor regulations to take into account (to the extent practicable) the cumulative costs of regulations.  As a tool to inform the analysis of cumulative regulatory costs, the Coast Guard has developed a Cumulative Impacts Database (CID), which contains cost and benefit information on the final regulatory actions the Coast Guard has promulgated since 1993.  The CID allows the Coast Guard to aggregate the estimated costs of its regulations by diverse factors.  The aggregated data indicate that 92% of the cost of Coast Guard regulations over the past 20 years results from two statutory mandates: the Oil Pollution Act of 1990 and the Maritime Transportation Security Act. This presentation describes and provides examples of the contents of the Cumulative Impacts Database and summarizes some of the key findings.  The presentation also discusses the limitations of using results from the database, particularly the challenge of applying aggregate results to individual owners and operators.
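To illustrate the kind of aggregation a cumulative impacts database supports, the toy sketch below groups annualized rule costs by statutory mandate and computes each mandate's share of the total. The rules and dollar figures are invented, chosen only so that two mandates dominate, as in the finding described above.

```python
# Toy illustration of aggregating rule costs by statutory mandate.
# Rule names, mandates, and dollar figures are all hypothetical.

from collections import defaultdict

rules = [  # (rule, statutory mandate, annualized cost in $M) -- invented
    ("Double-hull phase-in", "OPA 90", 420.0),
    ("Vessel response plans", "OPA 90", 180.0),
    ("Facility security plans", "MTSA", 310.0),
    ("Other rule", "Other", 75.0),
]

totals = defaultdict(float)
for _, mandate, cost in rules:
    totals[mandate] += cost

grand_total = sum(totals.values())
for mandate, cost in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{mandate:8s} {cost:7.1f} $M  ({cost / grand_total:.0%} of total)")
```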

E.1: Social Policy BCA: Assessing Child Welfare and Justice Programs (Marvin 310)

Chair: Stuart Shapiro (stuartsh@rutgers.edu), Rutgers University  

Discussant: Brian Bumbarger (bkb10@psu.edu), Pennsylvania State University           

Presentations:

1.     A Cost-Benefit Analysis of the 2009 Reform of the Rockefeller Drug Laws in New York City, Joshua Rinaldi* (jrinaldi@vera.org), Vera Institute of Justice

The 2009 drug law reforms (DLR) in New York State changed how drug crimes were processed in the New York City criminal justice system by removing mandatory minimum sentences for defendants facing a range of felony drug and property charges. The reforms also created new options to divert defendants to drug treatment as an alternative to incarceration. In addition to implementation and impact evaluations, the Vera Institute of Justice conducted a cost-benefit analysis (CBA) to explore the economic implications of DLR in New York City.

The CBA computes costs and benefits based on Vera’s impact evaluation, which used administrative records from multiple city and state agencies to track outcomes for cases disposed during two equivalent time periods, pre- and post-DLR. Propensity Score Matching (PSM) was used to select comparable samples, controlling for baseline differences in case and individual level characteristics.

Costs and benefits were measured from the perspectives of taxpayers and victims for a three-year follow-up period post arrest. Taxpayer costs include law enforcement, courts, jail, prison, probation, parole, and drug treatment. Victimization costs were measured by calculating the reform’s impact on the tangible and intangible costs of crime. This presentation will describe the cost-benefit methodology, results, and implications for policy-making.
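The matching step described above can be illustrated with a minimal nearest-neighbor match on propensity scores (with replacement). The scores below are invented; in the actual evaluation they would be estimated, e.g. by a logistic regression, from case- and individual-level characteristics.

```python
# Minimal sketch of nearest-neighbor propensity score matching: pair each
# treated (post-DLR) case with the control (pre-DLR) case whose estimated
# propensity score is closest. Case IDs and scores are hypothetical.

def nearest_neighbor_match(treated: dict, control: dict) -> dict:
    """Match each treated id to the control id with the closest score
    (matching with replacement)."""
    matches = {}
    for t_id, t_score in treated.items():
        matches[t_id] = min(control, key=lambda c_id: abs(control[c_id] - t_score))
    return matches

treated_scores = {"case_A": 0.62, "case_B": 0.35}               # hypothetical
control_scores = {"case_X": 0.60, "case_Y": 0.33, "case_Z": 0.80}
print(nearest_neighbor_match(treated_scores, control_scores))
```

Here case_A pairs with case_X (score gap 0.02) and case_B with case_Y; outcome differences between matched samples then feed the cost-benefit computation.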

2.     Doing Well, While Doing Good: A Benefit Cost Analysis of Private Foundation Investment in a Social Bond Impact Program to Reduce Recidivism, Joseph Cordes* (cordes@gwu.edu), The George Washington University; William Winfrey, HCM Strategists/EPI

Nonprofit foundations are increasingly turning to benefit-cost analysis as a means of evaluating the impact of their grants.  A large nonprofit foundation located in the Northeastern United States has participated in a large social impact bond program, investing $1.5 million out of a total of $27 million in a project intended to reduce recidivism.

Our paper uses benefit cost analysis, undertaken from the perspective of the foundation, to evaluate the impact of the foundation’s investment in the social impact bond experiment.  The first step in the evaluation is to undertake a benefit cost analysis of the intervention itself.  Although the analysis draws on standard practices for defining and measuring the social costs and benefits of the intervention, the issue of which discount rate to use in the analysis is less well-defined. The options include: the social discount rate that would be used to evaluate the program if it were undertaken in the public sector; a discount rate based on the time preference of the foundation; or a discount rate reflecting the opportunity cost to the foundation of investing its funds in the social impact bond program. We explore this question by formulating a simple model of a foundation’s “social investment problem”.  We show that the appropriate discount rate will depend on: (a) the foundation’s objective (social welfare) function, as defined by its mission; (b) the financial return to the foundation’s endowment; and (c) tax rules governing foundation payout.

An additional issue that needs to be addressed is estimating the impact of the specific foundation’s participation as one of several entities providing funds for the intervention.  Our approach builds on that suggested by Brest et al. (2004). The results of our base case analysis indicate that participating in the social impact bond experiment is justified, even at discount rates higher than would be used to evaluate an equivalent project financed with public funds.  We also undertake a Monte Carlo simulation to explore the robustness of these results to different values for components of social benefits and costs.
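The discount-rate question can be made concrete with a small net-present-value comparison: the same intervention cash flows evaluated at a social rate, a foundation time-preference rate, and an endowment opportunity-cost rate. All rates and cash flows below are hypothetical, not the paper's figures.

```python
# Sketch of the discount-rate choice discussed above: the same recidivism
# intervention evaluated at three candidate rates. Cash flows (year-0
# investment, then annual benefits) and rates are invented.

def npv(cash_flows: list[float], rate: float) -> float:
    """Net present value; cash_flows[t] occurs at end of year t (t=0 today)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

flows = [-1.5] + [0.4] * 6   # $M: hypothetical investment, 6 years of benefits

for label, rate in [("social discount rate", 0.03),
                    ("foundation time preference", 0.05),
                    ("endowment opportunity cost", 0.08)]:
    print(f"{label:28s} ({rate:.0%}): NPV = {npv(flows, rate):+.3f} $M")
```

Whether the sign of the NPV flips across the three rates is exactly what determines if the investment remains justified at the higher, foundation-specific rates.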

3.     Cost-Benefit Analysis of Supportive Housing for Child Welfare Involved Families, Josh Leopold* (jleopold@urban.org), Mary Cunningham and Mike Pergamit, Urban Institute

This presentation will focus on practical challenges of designing and implementing a benefit-cost analysis for a housing intervention targeted at high-needs families involved in multiple systems. The Supportive Housing for Child-Welfare Involved Families: a Research Partnership (SHARP) evaluation is a randomized controlled trial in five demonstration sites: Memphis, TN, Cedar Rapids, IA, Broward County, FL, San Francisco, CA, and Connecticut. The demonstration, funded by the Children’s Bureau, a division of the U.S. Department of Health and Human Services, provides supportive housing (permanent housing paired with case management and voluntary services) to families with a history of homelessness and child welfare involvement. The costs of the intervention are expected to be higher than the services provided through usual care by the child welfare system. However, if the program works as intended, it is expected to reduce utilization of homeless and child welfare services and produce long-term benefits in child and adult well-being and productivity. The Urban Institute, with support from Dr. Bob Plotnick at the Evans School of Public Policy and Governance, is conducting a benefit-cost analysis of the demonstration to determine whether, and under what conditions, the benefits of the intervention outweigh the costs of producing them. The analysis will distinguish between costs and benefits that accrue directly to families, to publicly funded systems (local, state, and federal), and to society at large. For the primary cost domains—homelessness, child welfare, and supportive housing—the evaluators will use the “ingredients method” to estimate actual unit costs through direct data collection. For secondary domains, such as public benefits, health care, and education, the evaluators will rely on the literature to estimate unit costs.
During this presentation, the evaluators will outline the methods being used for the benefit-cost analysis to determine unit costs and utilization and solicit participant feedback on how to address anticipated challenges.