The plan codifies outside-the-fence regulation --- justifies
limitless federal grid regulation
Tribe 15, (Carl M. Loeb University Professor, Harvard University and Professor of
Constitutional Law, Harvard Law School, EPA'S PROPOSED 111(d) RULE FOR
EXISTING POWER PLANTS: LEGAL AND COST ISSUES,
docs.house.gov/meetings/IF/IF03/20150317/103073/HHRG-114-IF03-Wstate-TribeL20150317-U1.pdf)
EPA's plan spectacularly fails that test, and the rule of law commands us to be
consistent. Some people seem to practice fair-weather federalism, rediscovering
States' rights when it allows them to sustain a federal policy they favor, but abandoning the same
principles when it suits them. The Constitution demands more than that.
EPA's proposal would comprehensively re-order national electricity policy,
allowing the agency to seize the role of National Electricity Czar and to
elbow state as well as federal regulators out of the way.
This sort of plant to plug regulation would permit the EPA to regulate any
use of electricity as long as it affects CO2 emissions, a standard that
would reach virtually every use of electricity in the United States. There is no
limiting principle. The Affordable Care Act may not compel health insurance
consumers to eat or buy broccoli, but EPA seeks to interpret the Clean Air
Act to allow it to regulate every watt used in growing broccoli and moving
it to the market as well as every watt used for any other activity within a
State. Even assuming that such an ambitious and unprecedented plan was precisely what Congress directed
EPA to promulgate (and the statute, as I will show, makes clear that Congress did the opposite), the plan
would dramatically violate the Tenth Amendment's well-established anti-commandeering principle. Indeed, this plan would violate that principle in a
remarkably sweeping and novel way, well beyond EPA's usual mandate. It would
require States to base their energy and emissions policies on the needs of
other States (and even other nations, such as Canada) with which they are inextricably linked through
the power distribution system, the national power grid. And the breathtaking scope of
authority asserted by EPA to regulate outside-the-fence would give it
greater power than Congress has granted even to the Federal Energy Regulatory
Commission (FERC), even though national grid management lies within FERC's mandate rather than EPA's. Had Congress wanted to authorize
EPA to command the States to do the Federal Government's bidding, it would have said so clearly. Indeed, as I will
show, core constitutional principles and precedents governing the Federal-State relationship plainly forbid such
blatant federal commandeering. The Supreme Court has instructed that Congress does not hide elephants in
mouseholes.
Ensures blackouts
Scherman 15, Former General Counsel of the Federal Energy Regulatory
Commission and currently Chairman of the Energy, Regulation and Litigation Group
at Gibson, Dunn & Crutcher LLP (William, EPA's Dangerous Desire To Become
America's Energy Regulator, www.forbes.com/sites/beltway/2015/05/11/epasdangerous-desire-to-become-americas-energy-regulator/#4409411a59e2)
So it should be a concern to every American that does not want to work in
the dark and sleep with the lights on that this federal agency, the one that would fail first year electrical
engineering, is the very same agency proposing to radically change the Nation's power grid, the Environmental
Protection Agency (EPA). Under the
CAA, the EPA has the regulatory authority to limit certain pollutants and other smog-forming substances emitted
from the smokestacks of fossil fuel-fired power plants (coal, natural-gas, oil, etc.). Sometimes the EPA's measures
have directly impacted the cost of generating power and, as a result, have indirectly influenced the States and the
federal agency, the Federal Energy Regulatory Commission, that is an expert on the electricity market and whose
mission is to assure Reliable, Efficient and Sustainable Energy for Customers of the Nation's bulk power system.
EPA's proposed Clean Power Plan would dramatically change all this and
in doing so would arrogate to EPA control over how every American
gets and uses electricity service whether that customer is a homeowner in Des Moines, a factory
in Ohio, or someone running their air conditioner in California. Simply put, the EPA's proposal reaches
into every aspect of the generation and use of electricity in the United
States through its so-called plant to plug approach to CO2 emissions.
The EPA's sweeping new plant to plug approach is radically different from
any other regulation the EPA has previously imposed on electricity
generators. Instead of merely saying to an existing power plant thou shalt not emit more than X from your
smokestack, the Clean Power Plan would insinuate the EPA into every aspect of the Nations energy grid. For
instance, the EPA proposes to reduce the use (demand in industry terms) of electricity by requiring States to impose
energy efficiency standards that meet the EPA's approval. This is a laudable goal to be sure, but the demand
reductions the EPA proposes cannot possibly be implemented as quickly as they want (if ever), and if they could,
they would change consumer and industrial consumption patterns forever. At the same time, the EPA is requiring
States to massively shift generation away from fossil fuel-fired power plants to renewable sources of electricity such
as wind and solar. But as any first year electrical engineering student knows, you can't simply substitute wind or
solar power for coal power on a megawatt-for-megawatt basis if you want to keep the lights on when the wind isn't
blowing or the sun isn't shining. The electric grid simply does not work that way. Whether the EPA even has the
legal authority to promulgate the Clean Power Plan will eventually be decided by the Courts. But, putting the EPA's
Nation's energy regulator. With the greatest respect, an agency that views all
electricity as fungible lacks the substantive expertise to adequately
consider the impact its proposed rule might have on the reliability of the
electric grid and the long-term effect on costs to the American electricity consumer. That expertise
has always rested with FERC and the States. There has been a great deal written about how
the Clean Power Plan was developed and proposed without much of a role for FERC. One FERC Commissioner even
testified to Congress that in a meeting with the EPA's Joe Goffman and Janet McCabe, the EPA refused to allow
FERC to look at documents relating to the Clean Power Plan. And, when FERC did have some initial views, the EPA
appears to have simply ignored FERC's advice. The Director of FERC's Office of Reliability memorialized in a memo
that in one private meeting, FERC advised the EPA that it had doubts about the EPA's proposal to vastly increase
the use of natural gas-fired generation in lieu of coal-fired generation. FERC also advised that there were
unresolved questions about the proposed increased reliance on renewables, and that the EPA's aggressive
timeline for relying on renewables would be difficult to accomplish. In essence, FERC, the federal experts on
questions of electric reliability, advised the EPA that its Clean Power Plan may have serious reliability implications
for the Nation's electric grid, but the EPA refused to listen. More recently, the EPA has publicly stated that it wants
to work more closely with the States and FERC on reliability issues. This has led many to propose, in various forms,
a so-called reliability safety valve that would be included in the final Clean Power Plan. That a general consensus
has emerged that a safety valve is needed all but concedes that the EPA's proposal will cause reliability problems.
environment for our kids and our future. But how we reach those goals should not put ideology and federal agency
turf battles ahead of safety and reliability. Nor should the basic physical realities of our electric grid be sacrificed to
pie-in-the-sky notions of endless carbon-free green energy. The sweeping changes envisioned in the Clean Power
Plan should cause every American to ask: When did the EPA become our Nation's energy regulator? When did the
EPA acquire both the statutory mandate from Congress and the required subject-matter expertise to do FERC's and
the States' jobs? When did the EPA gain the expertise to determine the optimal and most reliable mix of coal and
natural gas power plants? When did the EPA acquire the expertise to determine how much power can (or should) be
Nuclear war
Andres and Breetz 11, Professor of National Security Strategy at the
National War College AND doctoral candidate in the Department of Political Science
at The Massachusetts Institute of Technology (Richard and Hanna, Small Nuclear
Reactors for Military Installations: Capabilities, Costs, and Technological
Implications, Strategic Forum, February 1, 2011, dml) [ableist language
modifications denoted by brackets]
people in the United States and Canada lost power, some for up to a week, when one Ohio utility failed to properly
trim trees. The blackout created cascading disruptions in sewage systems, gas station pumping, cellular
communications, border check systems, and so forth, and demonstrated the interdependence of modern
infrastructural systems. (8) More recently, awareness has been growing that the grid is also vulnerable to purposive
attacks. A report sponsored by the Department of Homeland Security suggests that a coordinated cyberattack on
the grid could result in a third of the country losing power for a period of weeks or months. (9) Cyberattacks on
critical infrastructure are not well understood. It is not clear, for instance, whether existing terrorist groups might be
able to develop the capability to conduct this type of attack. It is likely, however, that some nation-states either
Econ
Chemical industry resilient even when profit falls
CNI 8 (Chemical News & Intelligence, This Week in ICIS Chemical Business, 8-18,
Lexis)
Engineering and construction companies are expanding to specialties and photovoltaics Global engineering
Korea The big goal for a process engineer could be the development of a technology that converts all the
raw materials to the desired end product with the minimum theoretical energy consumption, no emissions
and the lowest capital cost.
Economic crises lead to conciliatory behavior through five primary channels. (1)
Economic crises lead to austerity pressures, which in turn incent leaders
to search for ways to cut defense expenditures. (2) Economic crises also encourage
strategic reassessment, so that leaders can argue to their peers and their publics
that defense spending can be arrested without endangering the state. This
can lead to threat deflation, where elites attempt to downplay the seriousness of the threat posed by
a former rival. (3) If a state faces multiple threats, economic crises provoke
elites to consider threat prioritization, a process that is postponed during
periods of economic normalcy. (4) Economic crises increase the political and
economic benefit from international economic cooperation. Leaders seek
foreign aid, enhanced trade, and increased investment from abroad during
periods of economic trouble. This search is made easier if tensions are reduced with
historic rivals. (5) Finally, during crises, elites are more prone to select leaders who are
perceived as capable of resolving economic difficulties, permitting the
emergence of leaders who hold heterodox foreign policy views. Collectively, these
mechanisms make it much more likely that a leader will prefer conciliatory
policies compared to during periods of economic normalcy. This section reviews this causal logic in greater
detail, while also providing historical examples showing that these mechanisms recur in practice.
highest electricity price was 17.9 cents per kilowatt-hour in New England, and the lowest was 10.7 cents per
kilowatt-hour in the East South Central area. According to the EIA forecast, residential electricity prices are
expected to fall by 0.3% in 2016 and then increase by 3% in 2017.
Favorable weather and expansion of the customer base provided a
moderately offsetting effect on the low price scenario so far in 2016. In
2017, the expected electricity price rise is likely to have a positive impact
on utilities (XLU) (IDU) (FXU) revenues.
overwhelming supplier for over a century.[4] Throughout that time, particularly during the Industrial Revolution,
Warming
No extinction from warming
Barrett 7, professor of natural resource economics, Columbia University
(Scott, Why Cooperate? The Incentive to Supply Global Public Goods,
introduction)
climate change does not threaten the survival of the human species.5 If unchecked, it will cause
other species to become extinct (though biodiversity is being depleted now due to
other reasons). It will alter critical ecosystems (though this is also happening
now, and for reasons unrelated to climate change). It will reduce land area as the seas rise,
and in the process displace human populations. Catastrophic climate change is possible, but
not certain. Moreover, and unlike an asteroid collision, large changes (such as sea level
rise of, say, ten meters) will likely take centuries to unfold, giving societies time to
adjust. Abrupt climate change is also possible, and will occur more rapidly, perhaps over a decade or two.
However, abrupt climate change (such as a weakening in the North Atlantic circulation), though potentially very
serious, is unlikely to be ruinous. Human-induced climate change is an experiment of planetary proportions,
and we cannot be sure of its consequences. Even in a worst-case scenario, however, global
climate change is not the equivalent of the Earth being hit by a mega-asteroid.
Indeed, if it were as damaging as this, and if we were sure that it would be
this harmful, then our incentive to address this threat would be
overwhelming. The challenge would still be more difficult than asteroid defense, but we would have done
Oceans resilient
Kennedy 2 (Victor, Environmental science prof, Maryland, Former Director,
Cooperative Oxford Laboratory, PhD, Coastal and Marine Ecosystems and Global
Climate Change, http://www.pewclimate.org/projects/marine.cfm, 2002)
provides some interesting insights into focal points for the big bilateral summit in
September. Both China and the U.S. released reports summarizing the Rice-Yang visit, and the focus was decidedly
global. Instead of touching on bilateral subjects, both governments directed attention to U.S.-China cooperation on global issues: the Ebola crisis, the North Korea
nuclear issue, the P5+1 negotiations with Iran, and ensuring stability in
Afghanistan. The emphasis on Afghanistan is especially interesting, as it marks a new area of
cooperation between Washington and Beijing. The Diplomat has
previously reported on the signs China is willing to take a more active role
in mediating between Afghanistan and the Taliban, including bringing Pakistan to the negotiating table. The U.S. role in all of this has been unclear, with
some reports indicating that the U.S. has plans to participate in negotiations with Afghan officials and Taliban leaders, which was denied by U.S.
government officials. The prospect of a negotiation process with the Taliban led by China and sanctioned by the U.S. could be a critical development for
Afghanistan's future. Details on possible U.S.-China cooperation on this front remain murky. The statement from National Security Council spokesperson
Bernadette Meehan made clear that Rice and Yang discussed Afghanistan in their meeting, but did not offer any additional details. The report from China's
Foreign Ministry (translated here by Xinhua) did not mention Afghanistan at all. But given the shared concern for Afghanistan's stability, the U.S. and China
are undoubtedly having serious discussions on how to coordinate their efforts. Official summaries of the meeting paid more attention to a long-time point
of emphasis: the North Korean nuclear program. We've entered another round of speculation as to when (if at all) North Korea will conduct another nuclear
test. When it comes to North Korea, U.S. administrations are always eager to show they have buy-in from China even if verbal commitments never
translate to action (something my colleague Ankit and I discussed in more detail in our latest podcast, featuring Joel Wit). This time around, according to
the NSC, Rice and Yang agreed that North Korea would not succeed in its twin
pursuit of nuclear weapons and economic development. The Chinese summary, predictably,
was far more muted, saying only that China adheres to the principles of denuclearization and peaceful settlement through dialogue and negotiations.
Yang added his hope that all related parties will exercise restraint, avoid any irritating rhetoric and acts, and jointly maintain peace and stability on the
peninsula. Taken together, these two reports don't spark much hope for a breakthrough on how to approach North Korea's nuclear program. The relative
length given to the North Korea issue in each side's statement shows that both Beijing and Washington are focusing on this issue in the lead-up to Xi's
visit. However, the problem is that the two sides have different goals for what a breakthrough would look like. China wants a return to the Six Party Talks or
another form of dialogue, while Washington wants greater Chinese commitment to the sanctions regime and/or a solid North Korean concession on its
strengthen coordination on regional and global challenges. The U.S. and China have different agendas for the international order (see, for example, my
in the social
consequences of climate change. Along with rising sea levels, varying patterns of precipitation,
vegetation, and possible resource scarcity, perhaps the most incendiary claims have to do
with conflict and political violence. A second consensus has begun to emerge
among policy makers and opinion leaders that global warming may well result in
increased civil and even interstate warfare, as groups and nations compete for water, soil, or
oil. Authoritative bodies, leading government officials, and even the Nobel Peace prize committee have highlighted
the prospect that climate change will give rise to more heated confrontations as communities compete in a warmer
(Goldstein 2002; Levy et al. 2001; Luard 1986, 1988; Hensel 2002; Sarkees, et al. 2003; Mueller 2009).3 While talk
Guardian 15
Although the US never ratified the Kyoto protocol, it won an opt-out from having to fully report or act on its armed
forces greenhouse gas emissions, which was then double-locked by a House national defence authorisation bill in 1999.
Under the Paris agreement, countries would not be obliged to cut their
military emissions but, equally, there would be no automatic exemption for them either. US officials
privately say that the deal adopted on Saturday has no provisions covering military
compliance one way or another, leaving decisions up to nation states as to which national sectors should
make emissions cuts before 2030. If were going to win on climate we have to make
sure we are counting carbon completely, not exempting different things
like military emissions because it is politically inconvenient to count them,
Stephen Kretzmann, Oil Change International's director, told the Guardian. The atmosphere certainly
counts the carbon from the military, therefore we must as well. The US
military is widely thought to be the world's biggest institutional consumer of
crude oil, but its emissions reporting exemptions mean it is hard to be sure. According to Department
of Defence figures, the US army emitted more than 70m tonnes of CO2 equivalent
per year in 2014. But the figure omits facilities including hundreds of military bases
overseas, as well as equipment and vehicles. Activities including
intelligence work, law enforcement, emergency response, tactical fleets
and areas classified as national security interests are also exempted from
reporting obligations. The US military requested the original Kyoto exemption on national security grounds. While
the Obama administration is not looking to the military for emissions cuts
before 2030, US republicans argue that future presidents, such as the socialist candidate Bernie Sanders for
the Democrats, could. Let's face it, vast swathes of our military are big carbon
emitters tanks, Jeeps, humvees, jet planes and of course much of our navy is not nuclear-powered, so [the
Paris agreement] could be used as a trojan horse, said Steven Groves, a senior research fellow at the US thinktank
the Heritage Foundation. He added: This might be a good opportunity for people concerned with national security
to go to congress and get some type of legislative exemption in the same way as was done during the Kyoto time
period. One of the first advocates of the House double-lock on the Kyoto exemption was Dick Cheney, according to
a book called The greening of the US military by Terry Lee Anderson, a senior fellow at Stanford University. Cheney
argued that the Kyoto clause would not cover US unilateral actions in a letter, which was also signed by other
putting an extra 25m cars on to US roads for a year. The paper found that projected US spending on the Iraq war
could cover all global investments in renewable energy needed to halt global warming trends in the period to 2030.
T
Topical aff must be legislation --- judicial action is not T
Establish means legislate -- courts only rule on established law
Websters 10 Webster's New World College Dictionary Copyright 2010 by
Wiley Publishing, Inc., Cleveland, Ohio.
Used by arrangement with John Wiley & Sons, Inc.
http://www.yourdictionary.com/establish
establish to order, ordain, or enact (a law, statute, etc.) permanently
Here the
term "policy" means such decisions assigned to the agency and policies made
by legislators are embodied in the statutory language and hence are not
"made" either by the agency or the courts, but are derived through the various techniques
of statutory interpretation. (William N. Eskridge, Jr. & Philip P. Frickey ed., 1994) ("A policy is simply a statement of objectives."). 6. FCC v. WNCN Listeners Guild, 450 U.S. at 592-93. See, e.g., Ronald M. Levin,
Identifying Questions of Law in Administrative Law, 74 GEO. L.J. 1 (1985) (scrutinizing the difference between
questions of law and other questions, such as policy). 7. WNCN Listeners Guild v. FCC, 610 F.2d 838, 838 (D.C. Cir.
1979). 8. Id. at 842.
principle. See, e.g., Great Plains Coop. v. CFTC, 205 F.3d 353, 356 (8th Cir. 2000) (using the Chevron opinion as
supporting the conclusion that "statutory interpretation is the province of the judiciary"); Antipova v. U.S. Att'y Gen.,
392 F.3d 1259, 1261 (11th Cir. 2004) (explaining that the court reviews "the agency's statutory interpretation of its
laws and regulations de novo .... However, we defer to the agency's interpretation if it is reasonable and does not
contradict the clear intent of Congress"). See generally 3 CHARLES H. KOCH, JR., ADMINISTRATIVE LAW AND
PRACTICE § 12.32[1] (2d ed. 1997) (offering many more examples). [END FOOTNOTE] the court. He nonetheless
noted that an administrative decision under delegated policymaking authority would be subject only to hard look
review, which he properly characterizes: "[The Commission] must take a 'hard look' at the salient problems."9 That
is, the court must assure that the agency took a hard look, not take a hard look itself. "Only [the Commission], and
not this court, has the expertise to formulate rules well-tailored to the intricacies of radio broadcasting, and the
flexibility to adjust those rules to changing conditions .... And only it has the power to determine how to perform its
regulatory function within the substantive and procedural bounds of applicable law."10 In other words, the court
must assure that the agency is acting within its statutory authority and, once it determines the agency is acting
within delegated policymaking authority, the court is largely out of the picture. Upon crossing this boundary, the
judicial job is limited to assuring that the policy is not arbitrary by determining whether the agency took a hard
look. The basic review system is revealed as Judge McGowan continues: "[The prior case] represents, not a policy,
but rather the law of the land as enacted by Congress and interpreted by the Court...."" He properly noted that
T
We meet --- Congress brings the law into force
CI --- Establish means to put into force
Webster, Merriam-Webster products and services are backed by the largest team
of professional dictionary editors and writers in America, and one of the largest in
the world, establish, no date, http://www.merriam-webster.com/dictionary/establish
Simple Definition of establish: to cause (someone or something) to be widely known
and accepted; to put (someone or something) in a position, role, etc., that
will last for a long time
pollution. (Carbon dioxide is the primary greenhouse gas contributing to climate change.) The goal is that by 2030, the U.S. would
reduce carbon emissions from coal-fired plants by 32 percent. It's not only good for the U.S. but also, in terms of our position in the
world, it allows us to be leaders in encouraging other countries to take more steps toward clean energy
2NC
Warming
D
Warming much slower than their impacts assume --- their
models are flawed and our authors use the newest and best
science
volcanoes, solar forcing, natural variability
Fyfe et. al 16 [John, Canadian Centre for Climate Modelling and Analysis,
Environment and Climate Change Canada, University of Victoria, Gerald Meehl, National
Center for Atmospheric Research, Boulder, Colorado, Matthew England, ARC Centre
of Excellence for Climate System Science, University of New South Wales, Michael
Mann, Department of Meteorology and Earth and Environmental Systems Institute,
Pennsylvania State University, Benjamin Santer, Program for Climate Model
Diagnosis and Intercomparison (PCMDI), Lawrence Livermore National Laboratory,
Gregory Flato, Canadian Centre for Climate Modelling and Analysis, Environment
and Climate Change Canada, University of Victoria, Ed Hawkins, National Centre for
Atmospheric Science, Department of Meteorology, University of Reading, Nathan
Gillet, Canadian Centre for Climate Modelling and Analysis, Environment and
Climate Change Canada, University of Victoria, Shang-Ping Xie, Scripps Institution of
Oceanography, University of California San Diego, Yu Kosaka, Research Center for
Advanced Science and Technology, University of Tokyo, Making sense of the early-2000s warming slowdown, Nature Journal March 2016, pg. 227-28]
Our results support previous findings of a reduced rate of surface warming over
the 2001-2014 period, a period in which anthropogenic forcing increased at a relatively
constant rate. Recent research that has identified and corrected the errors and inhomogeneities in the surface air
temperature record4 is of high scientific value. Investigations have also identified non-climatic artefacts in
tropospheric temperatures inferred from radiosondes30 and satellites31, and important errors in ocean heat
uptake estimates25. Newly identified observational errors do not, however, negate
the existence of a real reduction in the surface warming rate in the early
twenty-first century relative to the 1970s-1990s. This reduction arises through the
combined effects of internal decadal variability11-18, volcanic19,23 and solar
activity, and decadal changes in anthropogenic aerosol forcing32. The warming
slowdown has motivated substantial research into decadal climate variability and uncertainties in key external
forcings. As a result, the scientific community is now better able to explain temperature variations such as
those experienced during the early twenty-first century33, and perhaps even to make skilful predictions of such
fluctuations in the future. For example, climate model predictions initialized with recent observations indicate a
transition to a positive phase of the IPO with increased rates of global surface temperature warming (ref. 34, and
This mismatch focused attention on a compelling science problem a problem deserving of scientific scrutiny.
Based on our analysis, which relies on physical understanding of the key processes and forcings
involved, we find that the rate of warming over the early twenty-first century is slower than that of the
previous few decades. This slowdown is evident in time series of GMST and in the global mean
temperature of the lower troposphere. The magnitude and statistical significance of observed trends (and the
magnitude and significance of their differences relative to model expectations) depends on the start and end dates
of the intervals considered23. Research into the nature and causes of the slowdown has triggered improved
understanding of observational biases, radiative forcing and internal variability. This has led to
climate prediction, where the challenge is to simulate how the combined effects of external forcing and internal
variability produce the time-evolving regional climate we will experience over the next ten years.
empirical data on virtually every objective indicator of human wellbeing indicates that the state of humanity has never been better, despite
unprecedented levels of population, economic development, and new
technologies. In fact, human beings have never been longer lived, healthier,
wealthier, more educated, freer, and more equal than they are today. Why
does the Neo-Malthusian worldview fail the reality check? The fundamental
reasons why their projections fail are because they assume that population,
affluence and technology the three terms on the right hand side of the IPAT equation are
independent of each other. Equally importantly, they have misunderstood the
nature of each of these terms, and the nature of the misunderstanding is
essentially the same, namely, that contrary to their claims, each of these
factors instead of making matters progressively worse is, in the long run,
necessary for solving whatever problems plague humanity. Compounding
these misunderstandings, environmentalists and Neo-Malthusians frequently
conflate human well-being with environmental well-being. While the latter influences
the former, the two aren't the same. Few inside, and even fewer outside, rich countries would rank
environmental indicators among the most important indicators of human well-being except, possibly, access to safe
water and sanitation. These two environmental indicators also double as indicators of human well-being because
capita income and some combined measure of education and literacy. None of these three are related to the
environment. The disconnect between environmental indicators and indicators of human well-being is further
evidenced by the fact that over the last century,
the most critical indicators of human well-being life expectancy, mortality rates, prevalence of hunger and malnutrition, literacy, education, child labor,
or poverty generally improved regardless of whether environmental indicators
(e.g., levels of air and water pollution, loss of biodiversity) fluctuated up or down (see, e.g., the previous
6.0% from 2000-05. Many countries are now concerned that fewer young people means that their social security
HIV/AIDS is a case in point. The world was unprepared to deal with HIV/AIDS when it first appeared. For practical purposes, it was a death sentence for anyone who got it. It took the wealth of the most developed countries to harness the human capital to develop an understanding of the disease and devise therapies. From 1995 to 2004, age-adjusted death rates due to HIV declined by over 70 percent in the US (USBC 2008). Rich countries now cope with it, and developing countries are benefiting from the technologies that the former developed through the application of economic and human resources, and institutions at their disposal. Moreover, both technology and affluence are necessary because while technology provides the methods to reduce problems afflicting humanity, … developed countries than in developing countries. And in many developing countries access would be even lower but for wealthy charities and governments from rich countries (Goklany 2007a, pp. 79–97). Because technology is largely based on accretion of knowledge, it ought to advance with time, independent of affluence, provided society is open to scientific and technological inquiry and does not squelch technological change for whatever reason. … dropping with time for any specific level of GDP per capita. It is also illustrated in Figure 2 for life expectancy, which … hands and brains. As famously noted by Julian Simon, they are the Ultimate Resource.
This is something Neo-Malthusians have difficulty in comprehending. Notably, a World Bank study, Where is the Wealth of Nations?, indicated that human capital and the value of institutions constitute the largest share of wealth in virtually all countries. A population that is poor, with low human capital, low affluence, and lacking in technological know-how is more likely to have higher mortality rates and lower life expectancy than a population that is well educated, affluent and technologically sophisticated, no matter what its … (e.g., for cropland, a measure of habitat converted to human uses) or even reverse it (e.g., for water pollution,
and indoor and traditional outdoor air pollution), particularly in the richer countries. Note that since the product of
population (P) and affluence (A, or GDP per capita) is equivalent to GDP, then according to the IPAT identity, which specifies that I = P × A × T, the technology term (T) is by definition the impact (I) per GDP (see Part II in this series of posts). I'll call this the impact intensity. If the impact is specified in terms of emissions, then the technology term is equivalent to the emissions intensity, that is, emissions per GDP. Therefore the change in impact intensity (or emissions intensity) over a specified period is a measure of technological change over that period. Since matters improve if impact/emissions intensity drops, a negative sign in front of the change in impact intensity denotes that technological change has reduced the impact. Table 1 shows estimates of the changes in impact intensity, or technological change, over the long term for a sample of environmental indicators for various time periods and geographical aggregations. Additional results regarding technological change over different time periods and …
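A quick numeric sketch makes the identity above concrete. All figures here are hypothetical, chosen only to illustrate the arithmetic of T = I/GDP; they are not data from Goklany's Table 1:

```python
# Illustrating the IPAT identity described above: I = P x A x T,
# so the technology term T is impact per unit GDP ("impact intensity").
# All figures are hypothetical, not data from the source.

def impact_intensity(impact, population, affluence):
    """T = I / (P * A) = I / GDP."""
    return impact / (population * affluence)

# Start of the period: impact 100 units, GDP = 10 x 5.0 = 50.
t0 = impact_intensity(impact=100.0, population=10.0, affluence=5.0)
# End of the period: GDP doubles to 12.5 x 8.0 = 100, impact rises only 20%.
t1 = impact_intensity(impact=120.0, population=12.5, affluence=8.0)

# A negative change in impact intensity means technological change
# reduced the impact per unit of economic output over the period.
change = round((t1 - t0) / t0, 2)
print(t0, t1, change)  # 2.0 1.2 -0.4
```

Here the impact still rose in absolute terms, yet impact intensity fell 40%, which is exactly the sense in which the card credits "technological change" even when total impact grows.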
plausible Neo-Malthusian arguments that technological change would eventually increase environmental impacts, … predictions, especially about the future. Most analysts recognize this. They know that just because one can explain and hindcast the past, it does not guarantee that one can forecast the future. Neo-Malthusians, by contrast, cannot hindcast the past but are confident they can forecast the future. Finally, had the solutions they espouse been put into effect a couple of centuries ago, most of us alive today would be dead
and those who were not would be living poorer, shorter, and unhealthier lives, constantly subject to the vagaries of
nature, surviving from harvest to harvest, spending more of our time in darkness because lighting would be a
luxury, and our days in the drudgery of menial tasks because under their skewed application of the precautionary
principle (see here, here and here) fossil fuel consumption would be severely curtailed, if not banned. Nor would the
rest of nature necessarily be better off. First, there would be greater demand for fuelwood, and the forests would be denuded. Second, less fossil fuel also means less fertilizer and pesticides and, therefore, lower agricultural productivity. To compensate for lost productivity, more habitat would need to be converted to agricultural uses. But habitat conversion (including deforestation), not climate change, is already the greatest threat to biodiversity!
Econ
prices at the Henry Hub are expected to remain around or below $5.00 per million British thermal units (MMBtu) (in 2015 dollars) through 2040. The Henry Hub spot price averaged $2.62/MMBtu in 2015, the lowest annual average price since 1995. Prices rise through 2020 in the AEO2016 Reference case projection as natural gas demand increases, particularly for exports of liquefied natural gas (LNG). Currently, most U.S. natural gas exports are sent to Mexico by pipeline, but LNG exports, including those from several facilities currently built or under construction, account for most of the expected increases in total U.S. natural gas exports through 2020. The persistent, relatively low price of U.S. natural gas is the primary driver for increased natural gas consumption in the industrial sector. Energy-intensive industries and those that use natural gas as a feedstock, such as bulk chemicals, make up most of the increase in natural gas consumption. Low natural gas prices also support long-term consumption growth in the electric power sector. Natural gas use for power generation reached a record high in 2015 and is expected to be high in 2016 as well, likely surpassing coal on an annual average basis. However, a relatively steep rise in natural gas prices through 2020 (rising 11% per year) and rapid growth in renewable generation, spurred by renewable tax credits that were extended in 2015, also contribute to a decline in power generation fueled by natural gas between 2016 and 2021. Throughout the 2020s and 2030s, electricity generation using natural gas increases again. Because natural gas-fired electricity generation produces fewer carbon dioxide emissions than coal-fired generation, natural gas is expected to play a large role in compliance with the Clean Power Plan for existing generation from fossil fuels, which takes effect in 2022. The electric power sector's total consumption of natural gas from 2020 through 2030 is 6 Tcf greater in the AEO2016 Reference case than in a case where the Clean Power Plan is not implemented (No CPP).
UQ
Manufacturing is up - PMI index proves
Moutray 12/15 (Chad Moutray, Ph.D., Economics from Southern Illinois
University, is chief economist for the National Association of Manufacturers (NAM).
December 15, 2016 http://www.shopfloor.org/2016/12/markit-u-s-manufacturingoutput-in-december-grew-at-strongest-rate-since-march-2015/) swap
U.S. Manufacturing Output in December Grew at Strongest Rate Since March 2015. The Markit Flash U.S. Manufacturing PMI edged up from 54.1 in November to 54.2 in December, a 21-month high. This mostly mirrored assessments about new orders growth (up from 55.5 to 55.6), which also expanded at the fastest pace over that time frame. Other indicators were mixed but encouraging. Employment expanded at its highest rate in 18 months (up from 52.4 to 54.1), whereas output grew modestly but pulled back a little in December (down from 56.0 to 55.1). On a more disappointing note, exports slowed to a near crawl but were positive for the sixth time in the past seven months (down from 51.0 to 50.3). Softer international demand, however, should not be surprising given the strong U.S. dollar. Overall, this report provides some encouragement for manufacturers, many of whom have been rather cautious in their economic outlook for much of the past two years. Markit:
A2 Green Tech
Only our studies take this effect into account
Robert Michaels and Robert Murphy, January 2009. Michaels is a professor of economics at
California State University and a senior fellow at the Institute for Energy Research. Murphy is director of the Institute
for Energy Research. Green Jobs: Fact or Fiction? Institute for Energy Research.
http://www.instituteforenergyresearch.org/green-jobs-fact-or-fiction/
American Progress (CAP) study recommends a $100 billion expenditure to be financed through the sale of carbon allowances under a cap-and-trade program. CAP estimates that this fiscal stimulus will result in the creation of … The green studies critiqued in this report implicitly assume that there is a limitless pool of idle labor which can fill the new green slots created by government spending. Yet to the extent that some of the new green jobs are filled by workers who were previously employed, estimates of job creation are overstated, perhaps significantly so. In addition, the studies do not account for the rise in worker productivity over time. Thus their long-range forecasts of total jobs created by green programs are inflated, even on their own terms.
Marked
To its credit, CAP alludes to potential inflationary labor shortages from job creation [vi] due to its proposed program, but dismisses the concern as irrelevant for an economy in recession. The thinking is that the workers going into the new green jobs will simply reduce the unemployment rate, rather than siphoning talented people … workers seeking jobs, job creation in non-green sectors will be lower than it otherwise would have been.
Link
Massively drives up prices - EPA agrees
Jarrett, MPSC former commissioner and energy attorney, 2016
(Terry, States are right to worry about clean power plan costs, 7-8,
http://thehill.com/blogs/pundits-blog/energy-environment/286906-states-are-rightto-worry-about-clean-power-plan-costs)
For starters, the EIA says the plan will mean significantly higher prices for residential and commercial electricity. They attribute this to higher transmission and distribution costs coming at a time when electricity consumption will also grow slightly (in 2015–2040). Interestingly, the EIA projects that these higher electricity prices will actually reduce demand 2% by 2030. Why? Because compliance actions and higher prices will force cash-strapped consumers to adopt their own austerity measures. A key part of the CPP is the dismantling of coal-fired power in the U.S. As the EIA sees it, "Coal's share of total electricity generation, which was 50% in 2005 and 33% in 2015, falls to 21% in 2030 and to 18% in 2040." Coal power plants currently anchor America's base-load electricity generation, so it's understandable that their elimination would drive up prices. But is such a move justified? The EIA projects that renewable energy (solar and wind) will play a significant role in meeting electricity demand growth throughout most of the country. It's a bold gamble, since the EIA believes that renewables will account for 27% of total U.S. generation by 2040. But EIA data shows wind and solar power supplying only 5.6% of U.S. electricity generation in 2015. So,
1NR
generating states being assigned reductions that are much larger than the 32 percent. As a result, electric generators in these states will have to prematurely shutter coal-fired plants and replace them with plants that produce less carbon dioxide, most likely wind and solar power plants that need natural gas or coal-fired plants to back them up when the sun is not shining and/or the wind is not blowing.
Terry Boston, president of the PJM bulk electricity management grid that serves 60 million residents in
parts of 13 states, including New Jersey, said doubts are growing over forecasts based on long-term
weather trends, typically 30-year averages. PJM experts, he said, could soon factor climate change and
extreme events into their planning models for delivering power and for restoring it when big storms
turn off the lights. "I cannot think of any year in my career with more challenges," Boston said. U.S. Energy Secretary Steven Chu said there's urgency in moving forward quickly. "Blackouts and brownouts already cost our economy tens of billions of dollars a year, and we risk ever more serious consequences if we continue to rely on outdated and inflexible infrastructure," Chu recently told a Congressional committee. … addition was made to the lesson of Sept. 11: The actions of terrorists can't always be distinguished from the actions of a drunken dispatcher or random lightning.
A2 link turn
CPP will strain the grid and cause blackouts dispersion - also
causes international free-riding
Segal 15 - Scott Segal is executive director of the Electric Reliability Coordinating Council and former Emory debater. August 31, 2015. EPA's Clean Power Plan ignores costs, threatens reliability.
http://www.mlive.com/opinion/index.ssf/2015/08/epas_clean_power_plan_ignores.html
We can expect a significant potential threat to the electric reliability upon which our modern way of life depends. Experts argued the proposed rule strained essential services. The final rule is little improved. While it offers welcome delays in the onset of compliance, it still creates a choke point by requiring draft plans to be filed in just 12 months. Further, the "safety valve" it creates for reliability is too little, too late, like pulling an emergency brake after the accident is already occurring. Renewables are an important element of a balanced portfolio, as power companies know, because they are making substantial investments in them, but they cannot come at the expense of maintaining critical base-load power plants. Is all the pain worth it for the
benefits of the Clean Power Plan? No. The rule is not likely to decrease the
harms associated with global warming, given that climate change is an
international phenomenon. While the White House hopes other nations
will follow our lead at December's climate summit in Paris, it is just as
likely that they will simply exploit competitive advantages created by our
action.
utilities, Federal Energy Regulatory Commission, Nuclear Regulatory Commission, Department of Energy, state and … Enterprise Institute believe that the EPA should avoid this aggressive intervention and continue a policy of
"Cooperative Federalism" by using the "normal tools of government" including the electoral process and political
mandates.
rounded cooperation. In a recent news release, the EPA said that it has recorded state efforts that
consistently met or exceeded the federal requirements for energy efficiency, fuel use, renewable energy, and other
high-performance sustainable building metrics. In 2013, for example, EPA oversaw the 24 percent energy intensity
reduction from its FY 2003 baseline, a reduction of FY 2013 energy intensity by 25.6 percent from FY 2003. In FY 2013, EPA also measured a reduction in fleet petroleum use of 38.9 percent compared to the FY 2005 baseline,
exceeding the goal of 16 percent." In addition, the EPA reports that greenhouse gases in the US have been reduced
by 10 percent from 2005 to 2012. In the states, the 50 separate Public Utility Commissions (and their National Association
of Regulatory Utility Commissioners) have been exercising their authority and responsibility for working with state
governments, power plant operators, business community, state environmental groups, consumer groups and
transmission companies to provide electricity to power the largest economy in the world. Currently, 47 states have demand-side energy efficiency projects, all with measurable results, 38 states have Renewable Portfolio Standards (RPS), 10 states have voluntary market-based Green House Gas (GHG) emission trading programs, and numerous large private companies and publicly traded utility companies have been pursuing voluntary emission reduction strategies. In a recent presentation at a conference of the American Meteorological Society in Phoenix, EPA
Administrator Gina McCarthy said that "Science is under attack like it has never been before," which seems like
hyperbole at the least, or a highly political rationalization at the most. In a recent editorial in Science Magazine, the
executive publisher Alan I. Leshner, said: "If the general public is to share more opinions with members of the
scientific community, scientists themselves cannot ignore concerns that people may have about the research
process or findings. There needs to be a conversation, not a lecture." Adding to the overall scientific confusion are
recent stories about "global warming" by many news outlets like the BBC, Forbes, the New York Times, The
Economist and CBS, which have reported that there has been no measurable increase in temperature over the last
15 years, also known as "global warming pause." On the other hand, other media sources like the World
Meteorological Organization, The Guardian and Climate Central are reporting that the 10 warmest years have been
since 1998. Surely, these disparities represent a major disagreement between respected sources of weather science
information. For the record, the United Nations International Panel on Climate Change's (IPCC) latest study shows a
temperature increase of 0.09 degrees Fahrenheit since 1998. Unsurprisingly, a recent Ohio State University 2015
study suggests that "both liberals, conservatives have science bias," when they are presented with facts that
challenge some of their political beliefs. Finally, there are several EPA's Climate Change assertions which can be
vigorously debated. For example, in the EPA News Release of October 31, 2014, it talks about the impacts of climate
change across the country, "ranging from more severe droughts and wildfires to record heat waves and damaging
storms." One could easily argue that none of the events need necessarily have been caused by global warming. In
fact, there is no detailed scientific evidence to ascribe "climate change" to any of these natural events. All of this … for such precipitous action as proposed in the Clean Power Plan seems more political than practical, especially given
A2 k2 reliability
NERC agrees the CPP harms reliability - multiple args - no link turn
NERC, 16 - North American Electric Reliability Corporation (Potential Reliability
Impacts of EPAs Clean Power Plan, http://www.nerc.com/pa/RAPA/ra/Reliability
%20Assessments%20DL/CPP%20Phase%20II%20Final.pdf,
The results of the CPP analysis underscore significant changes occurring on the BPS both as a matter of the course of business as well as a direct effect of CPP implementation. The results of the IPM and AURORAxmp models have been delineated in the preceding chapters along with an overview of the key assumptions that went into deriving the aforementioned results. A deviation from the relevant input …
nuclear, etc., to more asynchronous, distributed and storage-enabled resources such as wind, solar, and storage. In
addition, the modern grid system will change in the future to incorporate microgrids, smart networks, and other advancing technologies. NERC formed the Essential Reliability Services Task Force (ERSTF) that studied the implications for planning and operating the BPS in the face of this changing resource mix. Essential Reliability Services (ERSs) include three important building blocks of reliability, namely frequency support, ramping, and voltage. In order to maintain the reliability of the electric grid, resources need to be able to provide frequency support, voltage control, and ramping capability. The ERSTF evaluated the capabilities of newer resources to see if they are able to provide ERSs.
ERSs will be needed in the future as we transition from the conventional generation mix to a newer resource mix. Based on the analysis of geographic areas that are experiencing the greatest level of change in their types of resources, a number of measures and industry practices are recommended to identify trends and prepare for the transition in resource mix. Frequency [20]: The electric grid is designed to operate at a frequency of 60 hertz. … and equipment of all sizes and types. It is critical to maintain and restore frequency after a disturbance such as the loss of generation. An instantaneous (inertial) response from some resources and a fast response from other resources help to slow the rate of frequency drop during the arresting period by providing a fast increase in power output during the rebound period to stabilize the frequency. Ramping [21]: Adequate ramping capability (the ability to match load and generation at all times) is necessary to maintain system frequency. Changes to the generation mix or the system operator's ability to adjust resource output can impact the ability of the operator to keep the system in balance. Voltage [22]: Voltage must be controlled to protect system reliability and move power where it is needed in both normal operations and following a disturbance. Voltage issues tend to be local in nature, such as in sub-areas of the transmission and distribution systems. Reactive power is needed to keep electricity flowing and maintain necessary voltage levels. Each reliability building block has an associated video animation to explain the concept of that particular ERS. Please click on each title above to access the corresponding video. Additionally, they are all available here: The Basics of Essential Reliability
measures. In addition, the ERSWG in a separate effort will evaluate the impact of distributed energy resources on … indicating the level of resources that need to be built in order to meet demand; however, system planners must adequately plan for new resources to ensure that they can be built in a timely fashion to accommodate retirements
when grid operators pull too much energy from compensating sources, they can cause the generators providing the extra generating capacity to fail, which in turn causes the grid operator to over-tax different power generators, which may also fail, creating what is known as a "cascading failure." [109] Cascading failures can eventually lead to blackouts. [110] One of the largest cascading failures resulting in a blackout in North America occurred on August 14, 2003. [111] That blackout affected an estimated fifty million people in eight U.S. states and Ontario, Canada for up to a week. [112] The blackout cost the United States between $4 billion and $10 billion, [113] and in Canada, "gross domestic product was down 0.7% in August, there was a net loss of 18.9 million work hours, and manufacturing shipments in Ontario were down $2.3 billion (Canadian dollars)." [114] To avoid blackouts, the electrical grid requires the balancing of supply and demand to a near-perfect degree. In the United States, operators maintain the electrical grid at 60 hertz ("Hz"). [115] Ideally, this means that "the transmission grid would always operate precisely at 60 Hz ... even as its millions of consumers impose varying loads at tens of thousands of substations." [116] A demonstrative case of electrical grid balancing occurs in England, in what is known as the "TV pick-up." [117] After a popular television show or sporting event ends, "[m]illions of lights and kettles are simultaneously switched on" while "[t]he National Grid ... must keep the frequency at 50hz." [118] This often requires turning additional peak load generators online specifically to combat the power surge. [119] To combat the threat of blackouts, electrical operators "have to forecast [energy demand] second by second, minute by minute." [120] They base predictions on what customers required on a similar day with "exactly the same weather." [121]
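The second-by-second balancing the card describes can be sketched as a toy model. The 60 Hz nominal frequency comes from the source; the sensitivity constant and power figures below are invented purely for illustration:

```python
# Toy model of grid frequency balancing (hypothetical constants).
# Frequency drifts away from the 60 Hz nominal in proportion to the
# mismatch between generation and load; operators must keep it near zero.

NOMINAL_HZ = 60.0
SENSITIVITY = 0.1  # Hz of drift per second per unit of power imbalance (made up)

def step_frequency(freq, generation, load, dt=1.0):
    """Advance grid frequency one time step given a supply/demand mismatch."""
    return freq + SENSITIVITY * (generation - load) * dt

freq = NOMINAL_HZ
# A sudden "TV pick-up": load jumps 2 units above generation for three seconds.
for _ in range(3):
    freq = step_frequency(freq, generation=100.0, load=102.0)
print(round(freq, 2))  # 59.4
```

The sketch shows why a sustained mismatch is dangerous: the deviation accumulates each second until the operator brings extra generation online, which is the "wiggle room" the next paragraph describes.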
… of the grid maintain its reliability by ensuring that deviations [from the ideal 60 Hz] never grow to catastrophic size." [124] These limits give operators a minimal amount of wiggle room to balance the grid, but if the electrical grid operator allows the frequency to go outside of the outer bounds, the grid can fail and cause blackouts. Time-intermittent wind and photovoltaic power create even more challenges for grid operators. As the demand for renewable energy sources increases, the shortcomings of the electrical grid will become more problematic. "The changes will lead to grids that are more stochastic and exhibit dynamics requiring new stability criteria that address emerging problems and can be evaluated faster, closer to real time." [125] In Germany, where renewable energy sources have priority over traditional power plants, transmission companies send excess renewable energy to other countries because of the grid's inability to store electricity efficiently. [126] In the United States, the Bonneville Power Administration, based in the Pacific Northwest, has taken another approach, which includes booting wind energy supplies off line in favor of hydroelectric power when too much supply exists, allowing clean energy to go unused. [127] While grid operators make these decisions based on the necessity to balance the grid, decisions like Bonneville's can make investments in clean energy superfluous.
A2 Timeframe/ adapt
Ev is about CPP shift to smart grids
study found that 70 percent of utility-security professionals say that they have experienced at least one security
breach. For their part, federal and state governments genuflect to the goal of reliable, resilient, and affordable
electric service. Yet comparatively trivial sums are directed at ensuring that grids are more secure, compared with
the vast funding to promote, subsidize, and deploy green energy on grids. The central challenge for U.S. utilities in
the twenty-first century is to accommodate the conflict between political demands for more green energy and … While government needs to improve its vital role in helping with cyber situational awareness, the private sector must lead the way in defending against cyberphysical threats that evolve and move at tech-sector, not bureaucratic, velocities.
Smart grids easier to hack - the plan makes grids less secure.
Mills, 16
[Mark, Senior Fellow at the Manhattan Institute and author of Exposed: How America's Electric Grids Are Becoming Greener, Smarter, and More Vulnerable, Smart Grids: Greener & Easier to Hack, Real Clear Policy, July 14, 2016, http://www.realclearpolicy.com/blog/2016/07/14/smart_grids_greener__easier_to_hack.html]
both require Internet-centric mechanisms to meet the challenge of using episodic supplies to fuel society's always-on power demand. Thus, policies from California to New York, as well as the EPA's Clean Power Plan, envision adding … meters to the power electronics on solar panels. Cybersecurity has simply not been the priority in green policy domains, even though technical and engineering message boards and publications are filled with examples of cyber-vulnerabilities or weak or non-existent cybersecurity features.
This is followed by an email from the hacker stating that the power at all the hospitals will be shut down within an hour. The ransom is, say, $10 million in Bitcoins. Now imagine a different scenario, this time a hot Manhattan evening when several blocks go dark. It's not a ransom this time but a threat: more is coming. The mayor gets an image on his smartphone of the July 25th, 1977 cover of Time Magazine with its headline Night of Terror. That 1977 New York City blackout lasted 25 hours,
involved thousands of ransacked stores and fires, 4,000 arrests and $300 million in damages. This time, the mayor
also worries that the attacker could be coordinating an array of Orlando-type physical assaults to fuel the chaos. In
the first case, the ransom gets paid and power comes back. In the second scenario, no physical attacks happen, but
it takes two days and heroic efforts from ConEds crews to restore power by reverting to older manual systems that
bypass the smart stuff. But the terrorists made their point. And in both cases forensic teams from the Department
of Homeland Security, the FBI, and DODs Cyber Command descend. They learn that a sophisticated phishing scam
inserted a computer worm, combined with malware loaded earlier in a backdoor hack into a power monitoring
device, enabling the remote seizure of local power network controls. The NSA traces the cyber breadcrumbs to
anonymous servers in Georgia (the country not the state) or Iran, or China, and a dead end. Sound far-fetched?
Consider where we are today: ransomware attacks are already a scourge. The American Hospital Association
reported that several health care companies and hospitals were hit earlier this year with ransomware (most paid).
But, so far, hackers can only shut down a target organizations access to its own computer system or e-commerce
Web site. As for the future, consider that for hackers, todays Internet-connected cars look just like tomorrows
connected grids. Researchers have hacked the Ford Escape, Toyota Prius, Nissan Leaf, and to great fanfare a
Jeep Grand Cherokee. Last years cyber-jacking of a Jeep took full control from ten miles away by exploiting
vulnerabilities in the Internet-connected infotainment system to backdoor into the cars microcomputers that
operate the steering and brakes. In the wake of that stunt, Chrysler recalled over a million cars and corrected those
particular vulnerabilities. Earlier this year, the FBI and NHTSA issued a general alert regarding vehicle cyber
vulnerabilities. Everyone on both sides knows its only the tip of the cyber-berg. In fact,
there have
already been cases of grid-like cyber-jacking. In 2008, a Polish teenager hacked a citys
light-rail controls and caused a derailment. In 2010 the world learned of a clandestine hack ostensibly U.S.-Israeli
that inserted the Stuxnet computer virus to damage the electrical infrastructure of Irans nuclear facilities. In
2015, hackers breached the operating system of a German steel mill, causing enormous physical damage. And this … So far there have been no such hacks on U.S. power grids that we know about. And experts testifying before Congress about the Ukraine event credibly asserted that America's long-haul grids are better protected, at least for now. But that's not the issue. Exposure is a
problem not so much with long-haul grids but with local grids in cities and
communities where all the Internet smarts are planned. As green
connectivity is accelerated onto those grids, the attack surface expands.
Today's grids are, by Silicon Valley standards, dumb, even if deliberately so. But we
already know what adding more Internet connectivity enables. The Department of Homeland Security asserts that
Americas manufacturing and energy sectors are the top two targets for attacks on cyber-physical systems. And
Cisco reports that 70 percent of utility IT security professionals discovered a breach last year, compared with 55
Sq CP
Renewables now solve the aff, but the plan rushes the transition and causes our impacts
Porter, et al, 15 - Bishop William Lawrence University Professor, Harvard
Business School (Michael, with David Gee, Senior Partner and Managing Director at
The Boston Consulting Group, and Gregory Pope, Principal at The Boston Consulting
Group, AMERICAS UNCONVENTIONAL ENERGY OPPORTUNITY,
http://www.hbs.edu/competitiveness/Documents/america-unconventional-energyopportunity.pdf,
Policies at both the state and federal level will continue to encourage
lower-carbon energy solutions . State Renewable Portfolio Standards will
cumulatively require a minimum of 60 GW of new renewable generation by
2030, 40% higher than is mandated today.158 In addition, 13 states have introduced
greenhouse gas emissions limits that will require further shifts to lowercarbon
power.159 Federal standards will also ensure that vehicles and appliances
continue to improve their energy efficiency. There are also a growing
number of other proposals that would encourage carbon reductions over
the next 10–15 years and longer. The Obama Administration, for example, has recently introduced
the proposed Clean Power Plan (CPP)160 that covers carbon reductions in the power sector, signed a greenhouse
gas emissions accord with China,161 and made U.S. greenhouse gas reduction pledges to the Paris round of
international climate negotiations.162 Each proposal targets a 2530% reduction in carbon emissions by 2030
compared with 2005 levels. These proposals face stiff political and legal challenges, but the reality is that the U.S. position in natural gas is a crucial asset in making America's energy transition both feasible and at a competitive cost across a range of carbon reduction scenarios, at least through 2030. Natural gas can replace up to 50% of the existing coal capacity by 2022 at lower cost,168 providing significant economic and carbon benefits, regardless of other climate policies.
EPA Administrator Gina McCarthy put it well in April 2015 when she said, "[Fracking] has changed the game for me in terms of how the energy system is working. The inexpensive gas
that's being produced has allowed us to make leaps and bounds in progress on the air pollution side and, frankly, to make the Clean Power Plan."169 Natural gas essential for near-term
carbon reductions Natural gas is the only fuel that can cost-effectively deliver large-scale carbon emissions reductions in the near term, including the 30% carbon emissions reduction
targeted by the proposed Clean Power Plan. A 2014 CSIS/Rhodium Group study170 shows that increasing natural gas's share of power generation from 28% today171 to 43% by 2030
allows the U.S. to meet the 30% reduction target of the Clean Power Plan without significantly increasing the cost of electricity in the U.S.172 The study estimates that power rates would
rise by around 4%, while overall energy expenditures would remain nearly flat, assuming that states coordinate their implementation.173 (See Figure 23 on page 39.) Unconventional
natural gas also gives the U.S. a competitive advantage in moving to a low-carbon energy system over other countries that lack abundant natural gas resources. Without a supply of low-cost gas, Germany, for example, set aggressive renewables goals and then spent $400 billion in direct government subsidies to support renewable growth.174 The price of electricity for
residential customers increased by 70% between 2004 and 2014.175 The share of renewables has increased to about 25%,176 but the share of coal-fired power has actually increased as
solar, depending on the state.179 As the German example shows, major subsidies or much higher
electricity bills would be required to meet the Clean Power Plan, or similar reduction
goals, using renewables alone. In addition to the higher cost of generation, the transition
to a high renewable share will require an estimated $750 billion in grid
improvements in the U.S. to handle large volumes of intermittent
renewables and the more sophisticated forms of energy management and
efficiency needed.180 Transmission and distribution lines will require additional capacity and two-way
flows to manage widening sources of intermittent renewables. Smart grid metering and control
systems need to become more sophisticated and widespread to allow grid
operators to harmonize the new, complex flows of power supply and demand. Practically, this process will
require a 20- to 30-year period.181 Natural gas needed for standby power Natural gas
power plants are a necessary complement to the scale-up of renewables.
As renewables gain share, backup capacity will need to grow significantly
to ensure that a large volume of on-demand power can come online over
extremely short periods to compensate for absences of wind or sun. (See
Figure 24.) The particular levels of backup capacity required will depend on the percentage and
distribution of intermittent renewables, as well as the ability of the grid to utilize demand response and storage, but