UCS Blog - The Equation (text only)

The Scientific Integrity Act and the Importance of Storytelling in Science Communication

My job regularly requires explaining complex science and policy topics to the media, public, and decisionmakers. So I took over the Union of Concerned Scientists’ Twitter account (#GretchenTakeover) to share my top tips for talking about science in decisionmaking, examples of effective science communication, and suggestions for how to advocate for the Scientific Integrity Act. Here are the key takeaways.

Scientific Integrity matters to my daily life and yours

I started the Twitter takeover by sharing ways that scientific integrity has affected my life – and affects all of us every day. Here were the main points:

  • We need to protect the government employees who are charged with using science to protect us–from weather forecasts, to infectious disease monitoring, to food safety inspections. I did this snarky #ThankAGovScientist post one time.
  • My 11th grade chemistry teacher helped me see the value of scientific integrity. “Where would science be if we changed data?” she asked me. I wrote more about that in a Teen Vogue piece here.
  • Remember, behind the headlines of terrible news on science policy, there is an army of federal scientists just trying to do their jobs. The experiences of former government scientists like Joel Clement and Maria Caffrey offer a window into that.
  • I constantly remind myself that science only benefits our health and safety if we protect its role in policy decisions. We can’t be silent while science is used for harm and inequities are perpetuated. That’s why we need to support the Scientific Integrity Act.
Misuse of science harms the public

But unfortunately, science isn’t always used in the public interest. Political and financial forces often misuse or ignore scientific evidence to the public’s detriment. Here are some of the sources showing just how big of a problem this is across agencies, across issue areas, and over time:

  • Compared to past administrations, we are seeing more and more sidelining of the role of science in decisionmaking. (Academic citation here).
  • In fact, we’ve tracked more than 100 attacks on science in our tracker here.
  • Political leaders have taken a wrecking ball to federal science, killing science advice, burying critical reports, interfering in research and blocking scientists from talking to journalists—a path of destruction.
  • These attacks on science will harm public health and the environment, especially for communities already burdened with more pollution, safety risks, and other stressors. This move, which increases hazardous air pollution, is one example.
  • The Trump administration ignored EPA scientists in failing to ban the harmful pesticide chlorpyrifos. Sadly this is but one of dozens of examples of political leaders’ misuse of science that threatens public health and safety.
  • The Department of the Interior has seen new levels of undue political interference. Scientific reports squashed, political appointees reviewing grants, and neglect of science showing health and environmental concerns.
  • Government scientists cited political influence as a barrier to science-based decisionmaking in our 2018 survey. This was true even under past administrations. Clearly, scientific integrity must be strengthened.
The Scientific Integrity Act is a solution, but it needs your support

To protect the role of science in federal decisionmaking we must strengthen scientific integrity, and the Scientific Integrity Act would do just that. To advance in Congress, the Scientific Integrity Act must be prioritized by members of Congress. They are looking to you (their constituents) to tell them what to focus on.

  • The Scientific Integrity Act helps protect government science and the scientists charged with protecting us. Recent attacks on government science show that we need this now more than ever. But we need Congress to prioritize it.
  • Led by Senator Brian Schatz and Representative Paul Tonko, the bill would help ensure government science is working to protect public health and safety, for years to come. We need all members of Congress on board.
  • The bill gives government scientists the right to share their research publicly, ensures that government communication of science is accurate, and protects science in policy decisions from political interference.
  • As my colleague Michael Halpern testified to Congress, the bill is good for science and good for policy. “This bill promotes good government. It enhances accountability. It prevents corruption.”
  • These benefits aren’t just speculative. My colleague Jacob Carter wrote here about many attacks on public health and safety that might have been prevented under the bill.
  • To get the bill to move in Congress, we need people like you telling your Congressional representatives you want science protected. Call. Write. Tweet. It all helps!
  • Scientists and science supporters have already made a huge difference: Thanks to thousands of calls, emails, tweets and postcards, plus efforts on the Hill, there are now 199 co-sponsors for the #ScientificIntegrityAct—a rare feat for any bill!
  • But we need all members of Congress on board, and now is the perfect time to step it up: Your representative is home during the month of August, and you need to let them know science matters to people in your district.
Tell your Members of Congress to advance the Scientific Integrity Act

Here’s how you can help ensure your members of Congress do what they can to strengthen federal scientific integrity: Use social media to get their attention!

  • If they support the Scientific Integrity Act, tell them to help it advance out of committee and onto the House floor. If they don’t support it yet, encourage them to. Explain why it matters to you, your life, and your district.
  • How to craft a good message:
    • Take things from the top. Use short sentences. Meet people where they are. Don’t assume people already understand scientific topics. Better to repeat concepts than to lose people off the bat.
    • Make it personal. As a scientist, I’m tempted to lead with science and facts but when talking to decisionmakers, a personal anecdote can be more impactful and memorable.
    • Engage your decisionmakers directly and respectfully. Don’t forget to tag their social media handle (“@” them). They or their staff scan Twitter regularly to see what people are saying about them.
    • Make it specific. This is a complex, far-reaching bill, and you don’t need to cover every implication in your message. Find one that resonates for you. Use this post to brainstorm what will most resonate.
  • Here are some more tips for using social media for science advocacy. We need you to speak up now more than ever.
  • Hear from UCS communications experts about using social media effectively for advocacy on this webinar.
  • More broadly, check out the advocacy resources that the UCS Science Network has on science communication, science advocacy and science policy on a range of issues.
  • If you want decisionmakers to pay attention, give them a person, an image, a story to remember. They can find facts elsewhere, but they want to know how policy affects YOU and what you care about.
  • Last summer, I gave a public comment on a rule that will restrict EPA’s use of science to make public health decisions—with my newborn in tow! These attacks on government science are about so much more than science.
  • Scientific integrity is about ensuring the future is better than the present, for our kids and for other people’s children. My colleague Jacob Carter talked about scientific integrity’s impact on his Arkansas community.
  • Follow your policymakers’ accounts on social media to see what they talk about, and make your message relevant to what most interests them. And take advantage of moments like hearings or breaking news when they might be more focused on science topics.

Bottom line: Science serves the public good, but only if we protect its role and insist that it is used effectively in public decisions. Our public health and safety depend on it.

Now’s the Time for California to Start Planning for Self-Driving Cars

Photo: Richard Masoner/Cyclelicious/Flickr

Traffic. I’m not a big fan. I’m lucky enough to have commuted to work by bike and BART for the last 15 years and avoided countless hours in gridlock. But last week, faced with the tricky summer camp logistics that come with being a parent of two children, I had no choice but to drive. And I was reminded just how soul-sucking bad traffic can be. By the end of the week I was exhausted.

Imagine if traffic gets even worse. Uber and Lyft are already being implicated in the rise of congestion in San Francisco and other cities. Add self-driving cars to the mix in the coming few years and things could look even worse. But it doesn’t have to turn out that way.

As I noted in my last blog post, automated vehicles (AVs) could be an integral part of a future multi-modal transportation system. For example, AVs could facilitate connections to high quality mass transit, increase sharing of rides, and accelerate a transition to electric vehicles. But ensuring self-driving cars improve our transportation system will take thoughtful policy from local, state, and federal decision makers. Without it we are likely to end up with more traffic clogging our streets, and we will fail to cut transportation pollution anywhere close to what we need to. Plus, the last thing we need are more cars on the road emitting carbon and making the climate crisis worse.

California is not prepared for self-driving cars

California’s known for being a pioneer in transportation innovation, from the invention of freeways (for better or worse!) to kick-starting the EV revolution. So California must be prepared for AVs. Right?

Wrong. So far, the state’s primary actions on AVs have been led by the DMV. They’ve focused on the rules of the road for AVs, including requirements for companies seeking permits to test the vehicles on public roads. But they aren’t taking into account climate pollution or a whole host of other issues that come up with AV deployment, from impacts to workers and economic development issues, to congestion and infrastructure needs. The Governor’s Office of Planning and Research, as part of a multi-agency work group, has helped organize a set of helpful principles around AV deployment, which is a good start, but turning these principles into action is a critical next step.

California has the opportunity to start planning ahead before this technology is commonplace on our streets and Senate Bill 59, introduced by Senator Ben Allen, would do just that.

What does SB 59 do?

SB 59 creates the California Council on the Future of Transportation. The title is perhaps a bit grandiose, but the council, if created, could profoundly influence the future of transportation in California. The Council is charged with recommending policies to the Governor and legislature to maximize the benefits of a self-driving future. Currently, there are piecemeal approaches, with different cities trying different things and very little guidance on the broad range of issues and interests involved. SB 59 can bring all these pieces together and catalyze the statewide, innovative, multi-stakeholder thinking we need to maximize the benefits of new technologies while addressing these challenges.

To cover the wide range of issues related to transportation and the deployment of self-driving cars, the committee would bring together a diverse mix of stakeholders. Committee members would come from both industry and public interest groups as well as local and state government agencies. These include representatives from labor organizations, research institutions, disability rights and pedestrian safety groups, health and science organizations, and environmental justice advocates. Technology companies, automakers, and motorists would also be represented. State and local agency representation would span local transit agencies to the state department of transportation, DMV, Air Resources Board, Workforce Development, Office of Business and Economic Development, and more.

The issues raised by self-driving cars are many, and the committee is charged with reporting back to the legislature on several topics, with its first report due by January 2022 and subsequent reports due every two years thereafter. The topics include, but are not limited to, road safety, infrastructure improvements, reducing congestion and vehicle miles traveled, furthering the state’s environmental, public health, and energy goals, labor and economic impacts, accessibility, and insurance.

Will self-driving cars be climate heroes?

Importantly, the legislation requires establishing subcommittees to delve into specific issue areas.  With such a broad group of stakeholders and issues to cover, this is a critical component to ensure progress is made.

The bill would establish a subcommittee to examine the health and sustainability issues related to self-driving vehicles, guided by the Automated Vehicle Principles for Healthy and Sustainable Communities released by a multi-agency working group. These principles, similar to policy principles UCS previously developed, importantly call out the need for self-driving cars to be shared and electric, to support high-quality transit and active transportation (walking and biking), and to improve livability. They also call for improving transportation equity by ensuring that self-driving cars increase access to mobility for communities and individuals currently lacking affordable transportation options. Our recent report, Where are Self-driving Cars Taking Us?, highlights the importance of proactive policy to achieve this outcome; otherwise, self-driving cars could end up exacerbating pollution and congestion in communities that are already overburdened.

What kind of recommendations might the committee come up with to address climate emissions related to AV deployment? Road pricing to encourage pooling of rides, investments to improve or expand mass transit systems, setting limits on private vehicle use when no passengers are present, and incentives or requirements for electric drive are the types of policy ideas that should be explored by the committee amongst others. Many of these solutions aren’t unique to AVs, but become even more critical as AVs hit the road.

Now’s the right time to prepare for self-driving cars

A Waymo self-driving vehicle in Mountain View, CA.

We are already seeing the negative impacts of Uber and Lyft on congestion and public transit in urban centers. A recent analysis, released by Uber and Lyft, found that 13 percent of all vehicle travel in San Francisco comes from Uber and Lyft, and that about 45 percent of that travel occurs without any passengers. Self-driving cars are likely to be deployed in similar services and exacerbate these problems.

Self-driving cars are already on the road in California, shuttling employees at companies like Google and Cruise, but it may take a few years or more before they substantially replace regular cars. Even if the technology takes longer than expected to mature (as some are now thinking), interest in the technology does not appear to be waning. When I checked how many technology and auto companies were registered to test self-driving cars earlier this year, I counted about 40. The list is now over 60, and that’s only in California. The longer timeline for AV deployment provides a window of opportunity for policymakers, and they should take advantage of it.

The California Legislature should pass SB 59

Transportation emissions are the largest source of climate emissions in California, and they’ve continued to grow despite our best efforts. We can’t afford to take a wait-and-see approach with AVs. SB 59 is an important step in creating the level of coordination necessary to realize the promise of emerging self-driving cars and avoid the outcomes we don’t want. It ensures state and regional decision-makers, state transportation officials, and other stakeholders have a forum to coordinate on various aspects of AV deployment before it’s too late.

Photo: Richard Masoner/Cyclelicious/Flickr

More Charging Infrastructure Coming for Electric Trucks and Buses in California

Great news from San Francisco today. The California Public Utilities Commission approved San Diego Gas and Electric’s (SDG&E) five-year, $107 million proposal to invest in charging infrastructure for electric trucks and buses. The proposal will result in at least 3,000 new electric trucks and buses in the San Diego region. But the utility is aiming for the budget to support closer to 6,000 vehicles.

Approval of SDG&E’s proposal is a significant step towards cleaning the air and meeting climate goals in the San Diego region and California. And utilities have an important role to play in the electrification of trucks and buses not only through infrastructure investments, but also by offering fair electricity rates for vehicle charging.

The funding will support Class 2 through Class 8 vehicles such as delivery trucks, transit buses, garbage trucks, port trucks, and school buses. Infrastructure for off-road vehicles (e.g., forklifts) and transport refrigeration units is also eligible for funding.

Recognizing inequities in exposure to air pollution from vehicles, at least 30 percent of the funding must be invested in communities most burdened by pollution. These include neighborhoods such as Barrio Logan, Logan Heights, and Sherman Heights, if you are familiar with San Diego.

A pilot for electric school buses

The approved proposal also includes plans for a “vehicle to grid” (V2G) electric school bus pilot project. SDG&E will partner with a school district to purchase ten electric buses and install charging infrastructure capable of using energy from the buses’ batteries for the electric grid. So, while children are at school, their buses will be sending clean energy back to the grid.

The relatively high downtime of school buses compared to other heavy-duty vehicles makes them prime candidates for V2G. I expect that many utilities and school districts will closely follow the outcomes of this project. V2G offers a potential revenue stream for school districts or anyone else that sells power from vehicles to the grid.

The proposal had a lot of support

UCS was one of 15 stakeholders, or “parties,” that reached a settlement agreement supporting a modified version of SDG&E’s original proposal. Parties represented a broad range of interests, including community groups, ratepayer advocates, charging companies, electric truck companies, and environmental organizations.

This broad agreement made for a relatively easy vote by the Commission. The final decision was nearly identical to the settlement agreement.

The funding will support electrical upgrades and equipment needed for electric vehicles. School districts, transit agencies, and small businesses in disadvantaged communities will also be eligible for a 50 percent rebate on the actual vehicle charger. A minimum of 10 percent of the investment must serve transit buses and school buses. A maximum of 10 percent of the budget can serve electric forklifts (which are already trending towards electrification).

SDG&E’s funding will come in two phases: $84 million in the first phase, with the remaining $23 million becoming available in the second phase if SDG&E shows progress towards deployment goals outlined in the settlement agreement.

The investments span a five-year period and will be paid for by all SDG&E customers. SDG&E estimates it will add $0.38/month in 2022 to the bill of a residential customer that uses 500 kWh of electricity per month, which is roughly average for SDG&E customers.

In all, progress on air quality and climate change can come in many forms, from large demonstrations and marches in the street to decisions at public utility commissions that fly under the radar of most. All are important to getting where we need to be. If you’re a business or other entity thinking about buying an electric truck or bus and you get your electricity from SDG&E, you should contact the utility about benefiting from this funding.

 

Photo: Jimmy O'Dea

As Global Warming Increases, Is There an Upper Limit to How Much Additional Water Vapor The Atmosphere Can Hold?

© iStockphoto.com/Ridofranz

I’m sure you’ve heard that old adage, “It’s not the heat, it’s the humidity.” Living in Washington, DC, for the last three decades, I certainly know what it means. That said, it would be more accurate to say, “It’s not only the heat, it’s also the humidity.”

Indeed, humidity is critical when estimating the impact higher temperatures triggered by global warming will have. Scientific projections of future heat stress that rely on temperature increases alone underestimate the problem. After all, a 100°F summer day in the nation’s capital with 70 percent humidity feels a lot different than a 100°F summer day with 25 percent humidity in Phoenix. When there is a lot of water vapor in the air, it feels hotter.

With that phenomenon in mind, the Union of Concerned Scientists (UCS) published the report Killer Heat in the United States, along with a companion peer-reviewed study in the journal Environmental Research Communications, in July. Both the report and study use heat index—the measure of how hot it feels when air temperatures are combined with the amount of moisture in the air—to project the impact of rapid increases in extreme heat nationwide if industrialized nations don’t act quickly to reduce heat-trapping emissions.

It turned out that the report and study’s release date—July 16—was smack dab in the middle of an extended heat wave that settled over the central and eastern portions of the United States. The entire month of July in Washington was brutal. We suffered through 22 days with a high temperature of at least 90°F and humidity averaging 65 percent.

As if on cue, we recently received a question about water vapor and global warming from Henry J., a UCS member in Lexington, Kentucky, who must be familiar with high humidity. Lexington experienced temperatures in the high 80s with humidity hovering between 50 and 65 percent the first week of August.

Kristy Dahl

Kristina Dahl co-authored this report. She is a senior climate scientist for the Climate & Energy program at the Union of Concerned Scientists. Photo: UCS

It was a made-to-order question for the lead author of the Killer Heat report, Kristina Dahl, a senior climate scientist in our Climate and Energy Program. Kristy, who has a Ph.D. in paleoclimate from a joint Massachusetts Institute of Technology-Woods Hole Oceanographic Institution program, was the associate director of a Rutgers University campus-wide climate change initiative and a longtime consultant to UCS before joining UCS full time in 2018.

Q: As global warming increases, is there an upper limit as to how much additional water vapor the atmosphere can hold? – Henry J., Lexington, Kentucky

K.D.: The short answer is no, there is no upper limit that we know of. But the long answer is that it’s a bit more complicated.

Experiments have shown that air can continue to accumulate water vapor at temperatures well above what we expect with global warming. As the atmosphere warms, a few things happen: Evaporation from bodies of water increases, which adds water vapor to the atmosphere, and water molecules in the air become energized and start moving faster. Energized water molecules are less likely to condense, so it is often said that warm air can “hold” more water vapor.

Condensation tends to happen around particles—such as dust and salt—in the atmosphere. There are enough particles that humidity typically doesn’t exceed 100 percent before water molecules start to condense, though it theoretically could in completely pure air. That said, the amount of water vapor it takes to reach 100 percent humidity depends on the air temperature.
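
To put rough numbers on that temperature dependence, here is a minimal illustrative sketch (not part of the Killer Heat analysis) using a common approximation for saturation vapor pressure, the Magnus-type formula with the Bolton (1980) coefficients:

import math

def saturation_vapor_pressure_hpa(temp_c):
    # Approximate pressure (hPa) at which air at temp_c (deg C) reaches
    # 100 percent relative humidity over liquid water.
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

for t in (10, 20, 30, 40):
    print(f"{t} C: about {saturation_vapor_pressure_hpa(t):.0f} hPa")
# Prints roughly 12, 23, 42, and 74 hPa: the ceiling nearly doubles
# with every 10 degrees C of warming, which is why a warmer atmosphere
# can carry so much more moisture.

In other words, each increment of warming raises the ceiling on how much water vapor the air can take up before condensation wins out.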

E.N.: What difference does humidity make?

K.D.: When your body’s cooling system is trying to cope with a hot day, humidity makes all the difference. People sweat to release heat, because when sweat evaporates, it has a cooling effect. A breeze or a fan can help us to cool down by quickening the pace of that evaporation. But humidity in the air around us limits sweat evaporation and reduces the associated cooling effect. So high temperature and humidity together cause our bodies to accumulate heat. For this reason, weather forecasters often talk about the heat index—or the “feels like” temperature—which combines the air temperature with a measure of moisture known as relative humidity.

In Arkansas, Georgia and other humid states, the heat index may be much higher than the air temperature, while in Arizona, New Mexico and other arid states, the temperature and heat index may be the same.

Chart: National Weather Service heat index values as a function of air temperature and relative humidity, including the “off-the-charts” conditions that exceed the range of the formula.

To determine the effect of both heat and humidity, the National Weather Service formulated the heat index based on the range of typical warm season conditions. As the climate warms, the heat index will land outside the range of reliably calculable values, or, quite literally, off the charts.
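
For readers who want to see how those “feels like” numbers are produced, below is a minimal sketch of the heat index calculation using the published NWS Rothfusz regression coefficients; the weather service applies small correction terms at the edges of the valid range and a simpler formula below roughly 80°F, all omitted here for brevity.

def heat_index_f(temp_f, rel_humidity_pct):
    # Rothfusz regression: approximate heat index (deg F) from air
    # temperature (deg F) and relative humidity (percent).
    t, rh = temp_f, rel_humidity_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * rh
            - 0.22475541 * t * rh - 0.00683783 * t * t
            - 0.05481717 * rh * rh + 0.00122874 * t * t * rh
            + 0.00085282 * t * rh * rh - 0.00000199 * t * t * rh * rh)

# The example from the start of this post: 100 deg F feels very different
# at 70 percent humidity than at 25 percent.
print(round(heat_index_f(100, 70)))  # roughly 143
print(round(heat_index_f(100, 25)))  # roughly 100

Push the inputs past the conditions the regression was fit to and the output stops being reliable, which is exactly the “off the charts” problem described below.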

As the heat index rises, more people become susceptible to heat-related illnesses, including heat stress and heat stroke. Recognizing the importance of the heat index to our bodies’ experience with heat, the National Weather Service uses heat index-based thresholds as the basis for issuing heat advisories and excessive heat warnings. For example, with a heat index of 105°F, the weather service typically issues an excessive heat warning that includes such language as “heat illness is likely” or “the very young, the elderly, those without air conditioning, and those participating in strenuous outdoor activities will be the most susceptible.”

E.N.: How will the heat index change with global warming?

K.D.: The importance of humidity prompted my colleagues and me to focus on projecting future extreme heat using the heat index rather than temperature alone in our recent Killer Heat in the United States report. Without that humidity component, most previous studies were only capturing part of the story about how rising temperatures will threaten our health.

One thing that’s really interesting about the heat index, though, is that it was intended to encompass the range of conditions we’ve experienced historically. As global warming progresses, we’ll increasingly find ourselves in the black “off-the-charts” zone in the chart above. That zone represents conditions that are so hot that they exceed the National Weather Service’s heat index range and formulas. Such conditions have, thankfully, been exceedingly rare in the United States. They’ve been limited to just a couple of days per year in Arizona and the Sonoran Desert in Southern California.

Our Killer Heat analysis found that if industrialized nations worldwide fail to reduce global heat-trapping emissions, the United States would see a staggering—and frankly terrifying—increase in the frequency and severity of days with extreme heat in the coming decades. For example, we found that by midcentury, the average number of days per year with a heat index above 105°F would quadruple across the country. And while just three U.S. cities have historically experienced an average of 30 or more days per year with a heat index above 105°F, by midcentury more than 150 cities across the country would experience that frequency and level of extreme heat.

The number of people in the country exposed to off-the-charts conditions also would jump dramatically if global emissions do not fall precipitously. Between now and midcentury, the number of people exposed to a week or more of off-the-charts conditions in an average year would rise from fewer than 2,000 to roughly 6 million. And by late century, more than a third of the U.S. population—120 million people—would be frequently exposed to these dangerous conditions.

The good news is that this unrecognizably hot future is far from a done deal. Aggressive emissions reductions that limit future global warming to at most 3.6°F—or 2°C—would contain the expansion and spare millions in the United States from suffering relentless summer heat.

Do you have a question for a UCS expert? Submit your question here.

© iStockphoto.com/Ridofranz

The Trump Administration Dismantles Endangered Species Protections as Sixth Mass Extinction Crisis Looms

American pika (Ochotona princeps) Photo: Shanthanu Bhardwaj/CC BY-SA 2.0 (Flickr)

Today, the Trump administration released a final rule dismantling the role of science in informing protections for endangered and threatened wildlife. The Endangered Species Act (ESA) and the protections it has afforded to threatened and endangered species have been based on the best available scientific and commercial data. Today, science will take a backseat as the new rule sidelines scientific evidence and emphasizes considerations of economic costs in decisions to list species and/or the habitat they depend on under the ESA. This new rule will result in less protection for America’s threatened wildlife and a higher likelihood of losing species forever as Earth’s sixth mass extinction unfolds.

The rule is part of a larger strategy by the Trump administration to sideline science from decisionmaking processes, especially for endangered species. The Trump administration has cherry-picked scientific evidence to make a case to delist the gray wolf across the country, rolled back conservation plans for the sage grouse, and suppressed publication of research conducted by federal scientists demonstrating that three pesticides alone jeopardize the continued existence of more than 1,200 endangered or threatened species. And this barely scratches the surface of attacks on endangered species protections.

The rule released today overhauls the entire process by which species are considered for listing under the ESA. Here are some ways in which the rule will force science to take a backseat in listing decisions. 

Inserting economic considerations

As my former colleague Charise Johnson wrote when this rule was proposed, “The proposal adds economic considerations in the process for deciding if a species is in fact threatened or endangered, instead of making it a purely scientific decision.” As written, listing decisions under the ESA are required by law to be made solely on the basis of the best available scientific and commercial data. The bipartisan agreement on this language when the ESA was written into law is there for a very good reason: economic considerations are irrelevant to the scientific question of whether a species warrants protection.

Neglecting climate change considerations

The new rule redefines “foreseeable future” in a way that will prevent federal experts from considering the future effects of climate change on species populations. This is detrimental for species that are strongly affected by climate change such as the American pika. The American pika is so sensitive to increases in heat that its Western population has been dwindling already—the population of the species will likely decrease under future hotter conditions. Under the new rule, scientists will not be able to consider the pika’s population under future climate change scenarios, even if such conditions are a threat to the long-term survival of the species. 

Hopefully the American pika will move up in elevation where temperatures are cooler. If we expected this to happen, then the government could designate those areas as critical habitat. Unfortunately, the Trump administration’s new rule makes it almost impossible to designate habitat that is not inhabited now but likely will be in the future due to climate change. 

Under the new rule, it is difficult to imagine the continued existence of species expected to be strongly affected by climate change. 

Allowing “take” of threatened species

The new rule also will sideline scientific evidence that some threatened species’ populations are not healthy enough to be used as a resource or hunted. Threatened species have received protections from being “taken” under the ESA for 40 years, but the new rule upends these protections unless the Fish and Wildlife Service (FWS) writes a special rule to provide such protection for a species. Even if the science shows that a threatened species’ population is not healthy enough to be fished or hunted, if FWS doesn’t have the resources to write a rule, or if special interests prevent such a special rule from being written, then “take” of that threatened species is more likely to drive it closer to extinction. For example, scientists have pointed out that the population of the gray wolf would likely decline due to illegal poaching if it lost its ESA protections.

Once they’re gone, they’re gone

I’ve said this before, but I’ll say it again—once a species is gone, it’s gone forever. We don’t get another chance to have that species in the world. Such losses may have large effects on ecosystems and on us. We depend on biodiversity, many times unknowingly. That is why it is so critically important to protect our most threatened and endangered species. 

The ESA passed in 1973 with near-unanimous support: the Senate voted 92-0 in favor of the bill, and the House approved the Senate’s version 390-12. The majority of people in the US agree that we need to protect endangered and threatened wildlife. The majority of public comments (more than 800,000) were in opposition to the rule altering listing decisions under the ESA. And the ESA works—it is the landmark legislation that brought the emblematic bald eagle back from the brink of extinction.

If there is so much agreement that the ESA works, and the public agrees that the ESA is good and needed, why is the Trump administration doing an end run around this legislation? Well, the new rule could certainly help open up protected lands to development of fossil fuels. But who could imagine the Secretary of the Interior, David Bernhardt, who is a former fossil fuel lobbyist, and who would oversee many of these listing decisions, wanting that? 

Photo: Shanthanu Bhardwaj/CC BY-SA 2.0 (Flickr)

Maunakea and the Need to Indigenize Astronomy

Photo: pedrik/Flickr

I am told by Hawaiians that Maunakea is sacred. I am not sure I understand what that means; I am not Hawaiian, I am an outsider.

What I know about Maunakea is really only two things. The first is that Maunakea is one of the best sites for astronomy observing in the world, thanks to its height and the mostly stable weather on the mountain. That is why astronomers have proposed that the Thirty-Meter Telescope (TMT) be built there. With this telescope, we can expect new discoveries about planets orbiting other stars and whether these planets might host life as we understand it. We might learn about the first stars ever born and peer deeper into the Universe’s history than ever before. I am an astronomer and I will benefit from Canada’s participation in the TMT.

The second thing I know is that Maunakea is Hawaiian territory and we, astronomy, do not have consent for TMT on Maunakea. I think this has been clear for more than a decade through court cases and protests, but the idea of consent came to a head on July 18, 2019, when Elders were arrested by police for trying to protect the mountain. This was a violent moment, but not a new moment. Elders have been arrested for protesting the Alton Gas project in Nova Scotia, the Trans Mountain pipeline in British Columbia, and, as we all know, Standing Rock. All of these situations and more are instances where Indigenous peoples were telling settlers/colonizers that they do not have consent. TMT does not have consent to be on Maunakea. I understand this as a Mi’kmaw First Nation person myself, and even watching from thousands of kilometers away, I knew those arrests on Maunakea were wrong.

I know these two things and both concepts appear to be in conflict. But not to me. TMT does not have consent and that should be the end of the story. As a scientist, Indigenous rights are infinitely more important than whatever research benefit I might obtain from TMT on Maunakea. For me to do otherwise is to do unethical science and to harm Indigenous peoples. I only wish my colleagues could see this.

Even after weeks of protest, TMT is still looming over Maunakea, and colleagues are making many arguments justifying TMT over Hawaiian rights. Some are saying this is science versus religion, or that TMT is an economic boon to Hawaiians, or that science is more important, or that polls say Hawaiians support TMT. None of these arguments matter or are relevant. Framing this as science versus religion is a Eurocentric (Western) way of diminishing Hawaiian culture and history and an attempt to define the sacredness of Maunakea in a Eurocentric way. It is irrelevant. Maunakea is Hawaiian and we do not have consent. Our ethical duty is to respect even if we don’t understand. When astronomers/scientists note that TMT should be built because it is an economic boon to Hawaiians, it is also irrelevant. Maunakea is Hawaiian territory and Hawaiians will decide what is or isn’t an economic boon. When scientists cite polls saying Hawaiians support TMT so it should be built, they are saying that they get to decide what is or isn’t consent. But Maunakea is Hawaiian and we do not have consent. No matter what frivolous argument astronomers make, (Eurocentric) astronomy does not have rights to Maunakea. Maunakea is Hawaiian territory and it is time we in science and astronomy respect that ahead of our own ambitions.

While I see this “debate” in a simple way, I think the debate exists because of how we do astronomy. Astronomy in the USA, Canada, etc. is built from a Eurocentric perspective and erases Indigenous knowledges and peoples. Just think about a constellation in the sky made of a grouping of stars. Who defined that constellation? Was it European or from somewhere else? It was probably a constellation defined by a group of European scientists about a century ago based on historical use of Greek/Roman constellations, and less likely a Hawaiian constellation or an Inuit constellation or any Indigenous constellation. We have not learned to respect Indigenous knowledges and embrace them in astronomy. We have never truly listened to Hawaiians and Indigenous peoples. Maybe if we as scientists had a meaningful understanding of Hawaiian astronomy and perspectives, we could have avoided the situation we are in now.

Instead of erasing Indigenous knowledges, what if we braided Indigenous knowledges and Eurocentric astronomy? The Mi’kmaq Elders Albert and Murdena Marshall presented the term “Two-Eyed Seeing” as a methodology to view natural phenomena through two perspectives: one Eurocentric, one Indigenous. Bringing the two perspectives together allows us to understand natural phenomena better and in more detail. We as scientists would learn to see our relation to the natural phenomena we observe and to the land on which we live and work. Perhaps methods like this would help scientists and astronomers better understand Hawaii and Indigenous peoples worldwide. We have a lot to learn.

 

Hilding Neilson is a non-tenure stream assistant professor in the Department of Astronomy & Astrophysics at the University of Toronto and is a member of the Qalipu Mi’kmaq First Nation from Newfoundland and Labrador. He is an interdisciplinary scientist and educator working to blend Indigenous knowledges into astronomy curriculum with the goal of Indigenizing astronomy in Canada.  His research also focuses on probing the physics of stars from those like our Sun to the biggest, most massive stars and how we use these stars as laboratories to better understand our Universe from cosmology to extrasolar planets.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Photo: pedrik/Flickr

Why Colorado Needs a Zero Emission Vehicles Standard

Subaru vehicles are popular in Colorado and environmentalism is part of their brand identity, so it's troubling that their sole electric vehicle isn't available in Colorado. Photo: Tim Hurst/Flickr

Colorado is poised to enact requirements for automakers to sell “Zero Emission Vehicles” (ZEVs) in the state. In 2018, Colorado was already among the top 5 states in the country in the percentage of vehicle sales that are electric, so why is this important? Because despite that progress, Colorado could be switching from gasoline to electricity to power cars and trucks even faster if more models of electric vehicles were available.

Colorado lacks a ZEV standard

Subaru wrote to Colorado’s regulators in October 2018 to “clarify” that the new plug-in model would be “available to all Subaru retailers nationally,” though to date it’s not on Colorado dealers’ lots.

In 2018, Colorado ranked 4th in the nation in the percentage of new cars sold that were electric vehicles, at 2.6 percent. But that was only about a third of the rate of sales in California (7.9 percent), despite Colorado’s more generous electric vehicle (EV) rebate program. While there are many factors behind that sales difference, one key factor is the greater availability of EVs in California.

Automakers have argued that the ZEV standard is not needed to ensure EVs are available in Colorado. For example, Subaru said in a letter to the Colorado Air Quality Control Commission:

“Product mandates on manufacturers are not responsible for moving the ZEV market forward, cooperation and buy‐in from all stakeholders are the best course of action.”

And Subaru also “clarified” that its new plug-in hybrid Crosstrek SUV would be “available” to dealers in Colorado, a statement that seems targeted to assure the regulators that its ZEVs would be available in the state:

“Regarding the sale of the 2019 Subaru Crosstrek Hybrid (Plug‐in Hybrid Electric Vehicle or PHEV) in the State of Colorado, please allow us to clarify a possible misunderstanding. When the new Crosstrek Hybrid goes on sale in December of this year, it will be legal for sale in Colorado and available to all Subaru retailers nationally.”

Well, yes, the Subaru Crosstrek is technically legal for sale in Colorado, but that’s not the concern. The important question is: Is it available in the state for interested customers? What have we seen to date? Exactly 2 plug-in Crosstreks have been registered in Colorado in the first six months of 2019, because the model is not available at Subaru dealers in the state. A search on Edmunds.com of new cars available at dealers shows 0 cars available within 100 miles of Denver, CO, with the nearest one 800 miles away in Riverside, CA (as of August 6, 2019). By comparison, the same search within 100 miles of Oakland, CA returns 23 Crosstrek plug-in hybrids available on dealers’ lots.

 

Subaru vehicles are very popular in Colorado and part of their brand is environmentalism. The CEO of Subaru of America told the website brandchannel: “We’ve been identified as granola-crunchers and tree-huggers, and people with lots of stickers on the backs of their vehicles, and it’s true. We’re proud of that. Why run away from that?”

And yet the only states with more than 10 new Crosstrek plug-in registrations in 2019 (so far) are California, Connecticut, Maryland, Massachusetts, New Jersey, New York, and Oregon. What do these states all have in common? They have all adopted the ZEV standard. Clearly the ZEV regulation is needed to move the market forward in Colorado.

I’ve singled out Subaru in this example, but other manufacturers have also avoided bringing electric-drive cars to Colorado and other states that aren’t part of the ZEV regulation. For example, the Volkswagen e-Golf, Fiat 500e, Kia Niro EV, and Honda Clarity are all virtually unavailable in Colorado. And in its own letter to regulators, Kia bragged about the Soul EV being available for sale in 13 states, but not Colorado. So, while Colorado has had more EV sales than much of the rest of the country, EV sales could potentially be even higher if more models were available to car buyers in the state.

Colorado needs to send the right message on climate change

The air pollution from vehicles on Colorado’s roads today is significant. And the state is already experiencing the impact of climate change, driven by combustion of fossil fuels. Zero Emission Vehicles are a solution that will help address both air quality and climate change. While the impact will be small initially compared to the number of gasoline cars on the road, the most important part of adopting the ZEV regulation will be to put Colorado on a clear path forward towards moving away from gasoline and towards lower emissions.

Photo: Tim Hurst/Flickr

Trump Administration’s Attacks on SNAP Hurt Farmers and Rural Areas

Photo: UCS

If you haven’t already, you really should read this week’s post from our resident food systems and health expert Sarah Reinhardt. In it, Sarah breaks down everything that’s wrong with agriculture secretary Sonny Perdue’s latest regulatory attack on the Supplemental Nutrition Assistance Program, or SNAP. Or rather, nearly everything that’s wrong with it. Because in addition to the dishonesty, denialism, and downright cruelty (I know, it’s the point) that oozes from a rule change that, if enacted, would take food off the plates of 3.1 million low-income people, there’s something else.

Secretary Perdue’s proposed SNAP cuts would hurt the very people he calls his “customers”: farmers and rural communities.

As we head into National Farmers Market Week, an annual celebration of local food, let’s look at the impact of SNAP cuts on farmers who are bringing the rest of us a bounty of summer peaches, tomatoes, corn, and green beans right now.

Many of those farmers have a larger customer base and higher sales than they otherwise would because there are now (according to USDA data updated in July) 3,441 SNAP-authorized farmers markets operating nationwide. Farmers at those markets can sell their fresh produce, meats, and dairy products to people at all income levels because of the program. And SNAP redemptions at farmers markets have been rising, up 35 percent between 2012 and 2017. Community-based programs that double SNAP dollars, also on the rise, mean even more sales for farmers.

But markets won’t be able to double the benefits of SNAP recipients who are kicked out of the program as a result of Perdue’s short-sighted and punitive eligibility change. And while his proposed rule provides an estimate of the losses to small retailers ($183 loss of revenue per small authorized retailer on average per month), there’s no consideration for how this rule would affect farmers market vendors specifically. Surely though, some slice of such farmers’ incomes would simply disappear.

And there’s more. As Salon reported during the SNAP fight that nearly hijacked the 2018 farm bill, some small farmers use SNAP to supplement their own families’ food budgets. Some of them will likely lose their benefits as well.

Finally, as the Union of Concerned Scientists has shown, many households in low-wage, low-prosperity rural counties turn to SNAP to augment their food budgets—in fact, they do so at higher rates than their urban counterparts. We found that 136 of the 150 counties with the highest percentages of SNAP participation by household are located in rural areas. And SNAP dollars spent at rural grocery stores help boost struggling economies.

For National Farmers Market Week, how about actually helping farmers?

Any minute now, I expect to see a cheerful press release from Secretary Perdue praising farmers markets and the farmers who supply them. Don’t get me wrong, he should acknowledge those farmers. We should all raise a glass to them during this celebratory week. (Grilled watermelon margarita, anyone?)

But Secretary Perdue’s USDA really should do more to ensure their success, instead of shortsightedly shrinking their customer base and leaving their neighbors hungry.

Photo: UCS

So Long and Thanks for all the Fish, Says the EPA to Southwest Alaska

Photo: Alaska Region U.S. Fish & Wildlife Service Follow/Flickr

The Environmental Protection Agency (EPA) has once again washed its hands of its responsibility to protect the health and safety of our waterways. On Tuesday, the agency helped clear a path towards the development of Pebble Mine, a proposed mine in Bristol Bay, Alaska that, if built, would become the largest open-pit copper and gold mine in the United States. The EPA has abdicated its authority to veto the project if the mine proves dangerous to the rivers, streams, and other water bodies in the region.

The EPA forfeited its veto power by refusing to send an official letter containing the language, required under a 1992 agreement, that Pebble Mine “will have substantial and unacceptable impacts to an Aquatic Resource of National Importance.”

Substantial and unacceptable impact

Let’s review the evidence-based reasons showing that, without a doubt, Pebble Mine would have a substantial and unacceptable impact on the aquatic systems of Bristol Bay.

Hardrock mining, like mining for copper and gold, is an industry that is prone to polluting waterways with toxic substances such as arsenic and lead. It is estimated that 40 percent of the watersheds in the western United States are contaminated by pollution from hardrock mines. The proposed Pebble Mine is expected to process 180,000 tons of ore a day and be operational for 20 years. The mine would include laying a 187-mile-long natural gas pipeline, constructing an 84-mile-long private transportation route that crosses over 200 streams (including Lake Iliamna, Alaska’s biggest lake), and building dams that would block critical salmon habitat. The mine would be situated in a seismically active region at the headwaters of Bristol Bay; it would reach a depth of 0.77 mile (in comparison, the Grand Canyon’s maximum depth is 1 mile) and is estimated to generate up to 10 billion tons of toxic mine waste.

In 2014, a three-year scientific assessment was conducted by scientists at the EPA’s Pacific Northwest region to study the potential environmental impact of Pebble Mine. The researchers concluded that the mine would result in the loss of 1,200 to 4,900 acres of wetlands, lakes, and ponds. In the lowest impact scenario, 5 miles of salmon-filled streams and 19 miles of tributaries would be lost; in the highest impact scenario, 22 miles of salmon-filled streams and 72 miles of tributaries would be destroyed by mining activities.

Salmon populations are sensitive to toxic runoff from mines, and exposure can result in death, reduced growth and reproduction, and an inability to locate their natal streams (salmon need to return to the spot where they themselves hatched in order to deposit their eggs). And since salmon are one of the keystone species of the region – they provide food for 137 species, and salmon carcasses supply 25 percent of the nitrogen needs of riverside vegetation – industrial mining activities have the ability to severely threaten not only the current population of salmon, but the entire ecosystem of Bristol Bay.

The EPA has bowed to industry pressure and the people hate it

The EPA has flipped multiple times on the issue over the years. This is due in part to an intense and multipronged strategy by the Pebble Limited Partnership, a subsidiary of Northern Dynasty Minerals, the company attempting to build the mine. The Pebble Limited Partnership has spent $11 million on lobbying since 2011 ($4 million of that since the inauguration of President Trump), painted EPA’s efforts to scientifically assess the environmental impact as “biased,” and attempted to demonize the agency for considering protective actions under the Clean Water Act. Most recently, it has turned to a longtime friend, former Congressman Lamar Smith, to advance its agenda and finish what he started as chairman of the House Committee on Science, Space, and Technology.

Unsurprisingly, EPA’s actions go against the will of the American people, especially Alaskans. Since 2012, polling has shown that Alaskans have consistently opposed Pebble Mine. Also since 2012, over 2.5 million comments have been submitted to the EPA asking for protection of Bristol Bay from the potential risks posed by Pebble Mine. The US Army Corps of Engineers held hearings in Bristol Bay communities in April 2019, and 80 percent of public testimony was in opposition to the mine.

The worst impacts will be on Native communities

Native Alaskans have fished the salmon in Bristol Bay for at least 4,000 years, and even today the salmon represent a major source of subsistence for Native groups. While a few tribal groups have granted permission for Pebble Mine to be built, the vast majority of tribal groups, particularly those directly impacted by the mine, are vehemently opposed. These Native groups have been fighting Pebble Mine for years, using the 2014 EPA report and other scientific data to make their case. But it comes down to this: Pebble Mine is likely to rob Native Alaskans not only of their way of life, but potentially of their entire food supply.

When we surveyed federal scientists last year, some of the scientists wrote us comments related to this topic.

  • “I have seen the concern of the people of Bristol Bay as the Agency changes positions on the Pebble Mine,” said one EPA scientist.
  • “Agency decisions have resulted in collapse of salmonid populations which [significantly] affects the diet and income of low income, Native American populations,” said a National Oceanic and Atmospheric Administration (NOAA) scientist.
  • “I am concerned that opening up the Arctic for oil and gas exploration will have disproportionately negative impacts on Native Alaskan communities and their hunting and subsistence activities,” said a Bureau of Ocean Energy Management (BOEM) scientist.
If you love to eat salmon, this could affect you

Bristol Bay is the home of the world’s largest Chinook salmon run and it accounts for half of the wild sockeye salmon that is sold worldwide. This results in $1.5 billion in economic activity and is responsible for 14,000 Alaskan fishing jobs. According to the EPA’s 2014 scientific assessment, “Bristol Bay is remarkable as one of the last places on Earth with such bountiful and sustainable harvests of wild salmon. One of the main factors leading to the success of this fishery is the fact that its aquatic habitats are untouched and pristine, unlike the waters that support many other fisheries.”

The EPA’s decision to run away from its responsibilities goes against the scientific evidence and the will of the people. Unlike other decisions made by this EPA, there is no way to reset the damage that could be done to this region. Instead, Alaskan communities, including its Native population, will have to live with the consequences of this decision forever.

Photo: Alaska Region U.S. Fish & Wildlife Service Follow/Flickr

Tribute to Dr. Frank Ackerman, a Second Draft

The confidence a good mentor places in you can give you confidence in yourself. When a good mentor is willing to invest in you, you can become willing to invest in yourself. Good mentors can be hard to come by, and earlier this month the most influential mentor I’ve ever had passed away.

Dr. Frank Ackerman was a giant in his field; I was truly honored to have been so fortunate to have worked with him. Frank took me under his wing when I was at Synapse Energy Economics. During my time there, I had the privilege of working with him on a range of economic analyses. From the role of climate modeling in public policy to the costs of generic drug regulation. From the social cost of carbon to the long-term plan of a small Kentucky cooperative utility.

Renowned for his robust analytical capabilities in the field of environmental economics, Frank was also an incredibly skilled writer.

I was not.

I still find myself comparing my work to Frank’s and find my work wanting. Having said that, the most important thing Frank taught me was that good writing isn’t a “gift.” If you want to get good at it, you must work at it and keep working at it.

I was 28, and nobody had ever bothered to tell me that until then.

Here are the three things Frank taught me about technical writing that I’ll never forget:

1. Write

In calculus, those who give up rob themselves of ever solving the puzzle; it’s those willing to struggle who find the right answers. I liked struggling through the math equations, but when it came to writing, I lacked encouragement and motivation. Frank gave me both.

“Putting the work in” was a big part of what Frank coached me to do. He would assign me write-ups of our work, force me to think through how to frame the issue, and show me what worked (and what didn’t). Frank made sure I developed muscle memory so that I would get it right the second go-around (or third, or fourth). It would have taken less time for him to just write it himself, but he really invested in me.

Thanks, Frank.

I used to do whatever I could to make sure writing wasn’t a big part of my job, for fear of failing at it. Today, I tweet at wonks, testify to utility regulators, write briefs to legislators, author technical reports for who knows, and you are currently reading my blog.

2. Read

“What science writers do you read?”

That was one of the first questions he asked me, and I couldn’t think of any. He challenged me to find writers who tackle challenging topics and to study how they do it. Learn from the best.

I was reading mainstream economists like Joseph Stiglitz and Paul Krugman. Frank’s suggestions led me to David Roberts (then at Grist, now at Vox) who has taken on topics like discount rates (with otters).

My job required me to read thousands of pages of dense testimony from energy economists and financial analysts; it still does. But Frank impressed upon me the need to pay attention to how reporters were covering the issues. I had been busy trying to impress economists, trying to show that I knew economics, all the while my writing had become incomprehensible to the outside world.

I started reading Frank’s books (Priceless, Poisoned for Pennies, and Climate Economics) as well as his articles (like Climate Economics in Four Easy Pieces).

Frank was a prolific writer.

I still go back and read Frank’s work. I still read David Roberts’s work. I read Julia Pyper, Jenny Chase, Brad Plumer, Gavin Bade, Iulia Gheorghiu, Catherine Traywick, Russell Gold, and countless others. Learning from the best, just like Frank taught me.

3. Rewrite

One element of writing that I’ve learned to appreciate is that you get a chance to be a smarter version of yourself.

Who hasn’t thought, “I have the perfect response to that”? Or, having walked away from an argument, thought, “Gee, I should have worded that differently!”

When you are doing the writing, you get to walk away, come back, and then rewrite your argument exactly how you want it. Frank was infinitely patient with me and taught me to be patient with myself. I used to write right up against project deadlines rather than stagger the writing. Giving myself a day (or even an hour) to step away and come back with fresh eyes can make a big difference.

Frank helped me become a better writer and I tried to thank him in a Twitter thread that I botched with broken tangential thoughts and poor grammar. I’m sure if he had seen it, he would have taken out his red pen and gone to work on it. It was no way to honor him, so I’m giving it another shot. This post is my rewrite. My way of memorializing my thoughts the way I wanted to. To sound like me, only better.

I hope I did right by you, Frank.

Frank in his element.

Photo: Patrick Fore on Unsplash. http://frankackerman.com/

An Absence of Energy Leadership in a Climate Crisis

Photo: Ma. William Carraway/Wikimedia Commons

These are interesting times for electricity regulators and legislators in the 13 US states, from the Mid-Atlantic through the Ohio Valley to Illinois, that share authority with PJM and FERC. PJM has been in direct conflict with state activity and autonomy in regulating the environmental effects of power production. With a hastily retired CEO at PJM and indecision at FERC, states should now stand up for the role of clean energy in PJM, and be heard.

Renewable Energy Standards are law in 29 states plus D.C. Additional state payments to generators are made through ZECs and cost-of-service regulation. Credit: DSIRE.

State officials have a variety of policies and goals for the electricity supply for each of their states, from rate stability and economic development incentives, to ambitious renewable goals, to health and safety protections for workers and consumers. Governors and legislators are closer to constituents and respond to the interests of their communities more directly than a regional utility or federal agency. And in light of the federal government’s abdication of numerous duties, this is more true now than ever before.

State decisions on plants

Decisions about new power plants are arguably the largest and most impactful decisions state leaders can make in the energy field. The present tension between PJM and FERC over capacity market rules stems from PJM’s push to abandon the past balance between state and PJM responsibility for funding power plants. The treatment of state clean energy policies in wholesale markets, and in fact the functioning of the PJM capacity market itself, are now in limbo.

Every state in PJM sponsors payments

Every one of the 13 states plus the District of Columbia has policies of paying power plants in addition to the market rules PJM adopted under FERC jurisdiction. Some of these existing practices were directed by state laws establishing renewable energy procurements (known as RPS or RES), as those states promoted customer choice and “de-regulation.” More recently, in response to lower energy prices, some states enacted laws to keep existing nuclear plants from closing (i.e., zero-emission credit, or ZEC, payments).

More subsidies in rates than RES or ZECs

But the largest share of plants in the PJM region under state programs are those receiving cost-of-service payments through ratepayer bills. Approximately 40,000 MW, or roughly 20% of generation in PJM, is owned by for-profit utilities (AEP, Dominion, FirstEnergy, and PPL) that recover the costs of this generation through state-approved customer rates. That fleet is three times larger than the amount PJM identified to FERC as needing PJM’s market intervention to reset prices. PJM’s challenge to the state-supported renewable and nuclear generators has been delayed at FERC. FERC’s initial response—remove all generators that have state-supported payments—was unacceptable to PJM. There is no schedule or approved set of rules for the next annual PJM capacity procurement. A stand-off has ensued, and with it we are heading down the path of living without a capacity market.

What are the stakes?

The stakes are higher than the $6-7 billion per year that consumers pay through the PJM capacity market. PJM uses the capacity market results to plan needed transmission and to ensure adequate power plant reserves. The PJM approach favors the power plants that contribute to climate change. By accepting past state support for fossil-fuel-burning plants, whose costs are recovered in state-set rates, PJM allows old polluting plants to avoid competition.

What are the climate change factors in PJM’s territory? Record-breaking weather is one way to look at climate change. According to a new UCS analysis on extreme heat, historically, there have been an average of 4.6 days per year in the PJM territory with a heat index above 100 degrees. This would increase to an average of 33.5 days per year by mid-century without carbon and methane emission reductions.

Coastal flooding that threatens over 2 million homes on US coasts, including 500,000 homes in PJM states alone, is another way to consider the cost of the grid operator ignoring state purchases of carbon-free energy.

This comes down to the familiar conflict between policies and markets. PJM describes itself as having three core functions: transmission planning, grid operations, and wholesale markets. PJM continues to seek “evolution in its markets to value what policymakers find valuable.” UCS notes one example: PJM adopted winter capacity rights for wind, first on an interim basis and soon to be formalized on a permanent basis.

As is so often necessary in our economy, the state policy goals must be articulated and enforced, and then incorporated into markets. Seat belts for cars, for example, became standard when required by law. Perhaps after many more thousands of injuries, consumers would have selected only cars with seat belts, but policymakers set a policy to protect the public health and safety. The car market includes the cost of seat belts in the price of all cars, and other innovations in the industry can proceed without a stand-off over seat belt policy.

This is how democracy works

In the electricity market, pollution policy hasn’t become as unified and clear as seat belt policy. States (and corporations and cities) have adopted clean energy procurement policies to deal with market externalities, the good things and bad things that fall outside the prices paid for electricity. This kind of leadership, especially in the absence of leadership at the regional and federal level, is essential. It is also how democracy works, how our federal system is designed, and fortunately how we can make progress on getting more of the good things (energy without heat-trapping emissions) that we need to keep our climate and economy in some recognizable balance.

Photo: Ma. William Carraway/Wikimedia Commons

Trump Administration Sidesteps Congress to Cut SNAP. Again.

The Trump administration has announced a new proposed rule that would make it harder for millions to feed their families—and is defying Congress in the process.

If that sounds familiar, it’s because we’ve seen this movie before. Several times.

Remember that in 2018, Trump allies in the House of Representatives hijacked the 5-year farm bill in an attempt to make drastic cuts to the US Department of Agriculture’s (USDA’s) Supplemental Nutrition Assistance program, or SNAP (formerly known as food stamps). Perhaps you recall some of their proposals, including the much-derided America’s Harvest Box idea. Those proposals were ultimately rejected by Congress, which in December passed a farm bill that allowed SNAP to continue doing what it does best: providing nutrition assistance to the most vulnerable individuals and families in our country. But that hasn’t stopped the Trump administration from circumventing Congress and taking matters into its own hands with a series of regulatory changes to slash SNAP spending and participation.

For example, last November, amidst a blitz of baseless changes to immigration policy, the administration called for a rule change that would force many immigrants to choose between keeping food on the table and maintaining a path to citizenship.

And in December, on the heels of the farm bill’s passage in Congress, Secretary Perdue unveiled a plan that could cause more than 750,000 unemployed and underemployed adults to lose access to SNAP. And despite strong opposition from public health and anti-hunger groups and outright rejection by Congress itself, this proposal continues its slow and defiant march through the rulemaking process.

Now, Secretary Perdue is proposing an additional regulatory change that would make it harder for millions of parents, seniors, and children to qualify for SNAP, and could also put affordable school lunches out of reach for half a million kids in the process.

If we’ve become accustomed to the Trump administration’s attacks on the federal safety net, it’s because they’re part of a consistent pattern of undermining science-based policies that protect public health and safety. But that doesn’t make them any less alarming. The consequences are real, and could be felt by communities across the country—rural and urban alike—for years to come.

Understanding the latest SNAP proposal: Cutting categorical eligibility

This particular policy change focuses on what’s called “broad-based categorical eligibility,” or “cat-el” for short. Put simply, it’s a legal provision that allows households who qualify for a federal program called Temporary Assistance for Needy Families (TANF) to automatically qualify for SNAP. This makes it easier for people who need help to apply for it—and it eases the administrative burden on the state and federal agencies responsible for running these programs.

Here’s the rub: States have flexibility in deciding what makes someone eligible for TANF and can set their own income limits and asset limits. Broadly, this has resulted in lower eligibility standards for TANF (meaning it’s generally easier to qualify for TANF than for SNAP) and some variability in regulations from state to state.

Enter the Trump administration, declaring it’s time to close this “loophole” by limiting the circumstances in which cat-el can be used. Specifically, the proposed rule suggests that cat-el should only be used in cases where TANF is providing a household with “ongoing and substantial benefits” of at least $50 per month over six months. It would also allow just three types of non-cash TANF assistance to count toward this minimum: subsidized employment, work supports, and childcare (together, these make up about 30 percent of all federal and state TANF dollars spent).
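To make the proposed test concrete, here is a minimal, purely illustrative sketch of how its criteria might be expressed; the function name, inputs, and category labels are hypothetical simplifications, not USDA regulatory language or actual agency code.

```python
# Purely illustrative sketch of the proposed rule's test as described above;
# the names and structure are hypothetical, not actual regulatory text.

QUALIFYING_NONCASH_TYPES = {"subsidized_employment", "work_supports", "childcare"}

def confers_categorical_eligibility(monthly_benefit_dollars: float,
                                    months_of_benefit: int,
                                    benefit_type: str) -> bool:
    """Rough reading of the 'ongoing and substantial benefits' standard:
    at least $50/month, for at least six months, in a countable category."""
    substantial = monthly_benefit_dollars >= 50
    ongoing = months_of_benefit >= 6
    countable = benefit_type in QUALIFYING_NONCASH_TYPES
    return substantial and ongoing and countable

# Under this reading, a $40/month work-support benefit would no longer
# confer categorical eligibility for SNAP.
print(confers_categorical_eligibility(40, 12, "work_supports"))  # False
```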

But categorical eligibility isn’t a loophole, and calling it one is both demeaning and dishonest. It’s an intentional policy provision, and it has been upheld by Congress for decades—most recently in the passage of the bipartisan 2018 farm bill. There’s a reason cat-el exists, and there’s a reason more than 40 states are using it.

Until we can prevent persistent poverty, we need anti-poverty programs

It isn’t difficult to become poor in America, but it’s undeniably difficult to be poor. Last year’s report from the Economic Policy Institute (EPI) showed that income inequality has been widening in every state since the 1970s, meaning it’s only getting harder for most families to make ends meet. On average, those in the top 1 percent of US families by income earned 26.3 times more than those in the bottom 99 percent in 2015. Of course, poverty and hunger are closely related, and it shows in the data. Nearly 12 percent of all households in the US—15 million in total—are classified as food insecure, and rates of food insecurity among black and Hispanic households are nearly double those among white households.

But rather than proposing policies to address the underlying causes of hunger and poverty, the Trump  administration keeps trying to slash the safety net, while pushing misperceptions that poverty is a personal failure or character flaw—or worse, that it doesn’t exist at all. But make no mistake: the consequences of cutting access to SNAP are real, and they will have lasting impacts for families and communities across the country. The research is clear on the benefits that SNAP delivers to local economies, and on the critical protection it provides against food insecurity—particularly for kids. Young children who participate in SNAP are more likely to complete high school, and to have lower rates of obesity and metabolic syndrome in adulthood.

According to the USDA’s own estimates, this latest proposed rule would cause more than 3.1 million individuals to lose access to SNAP. Households with elderly members would be disproportionately affected: more than 13 percent of households with elderly members are expected to lose benefits. Furthermore, the rule may cause 500,000 children to become ineligible for free and reduced-price school lunches—a fact the USDA revealed on a press call last week, but omitted from the proposed rule itself and has refused to acknowledge since.

What you can do now

From now through September 23, the USDA is accepting public comments on the proposed rule. The next two months is a critical period for the administration to receive feedback about the policy and its potentially devastating consequences, both from members of the general public who can communicate what the rule would likely mean for their families, neighbors, and communities, and from public health professionals who can leverage their expertise to defend a program that is effective and evidence-based.

For more information on writing effective public comments, including quick tips and templates for writing your own, check out these resources from Food Research & Action Center and Hands Off SNAP.

Photo: USDA.

Four Automakers Stand Firm With California Against a Trump Administration Rollback

Mary Nichols, chair of the California Air Resources Board, announced an agreement last week between the state of California and four automakers to make their vehicles significantly more efficient than would be required under the Trump administration’s rollback. Photo: Emmett Institute/Flickr

Last week, California’s governor and lead regulator announced an agreement with four automakers (BMW, Ford, Honda, and Volkswagen) on vehicle emissions standards that exceed anything the Trump administration has proposed, though they remain weaker than what is on the books today.

With the current administration shirking its responsibility to reduce emissions from the passenger vehicle fleet, California has once again stepped up to its longstanding leadership role by negotiating a compromise directly with four automakers. The agreement lets a manufacturer comply with the state’s own, stronger standards by selling a national fleet that meets a reduced standard, providing certainty and meaningful global warming emissions reductions.

This is an important step forward—first and foremost, the automakers are acknowledging California’s leadership on reducing emissions and the continued role the state must play in moving the nation forward. The automakers are also acknowledging what we’ve been saying all along, which is that they can go well beyond the rollbacks proposed by both the administration and their own trade groups. Honda and Volkswagen have made similar statements previously, but BMW and Ford committing to this proposal shows the breadth of the consensus in the industry that the technology is there to reduce emissions—what is needed now is a full commitment across the board to go beyond letters and actually manufacture the efficient vehicles needed to reduce emissions and save consumers money.

There are many details which I’ll work through below, but the obvious question is simply, “What about all the other automakers, like General Motors and Toyota?” Are any more going to step up to the challenge, or are they just going to keep doing what they have done throughout the past six decades, pushing for weaker regulations that cost consumers and the environment?

What automakers stand to gain

Right now, we have a nationwide program on vehicle emissions—the Obama administration looked at the evidence on the current standards and found that they remained feasible and appropriate. California agreed, and the standards currently on the books would nearly halve global warming emissions from vehicles through 2025, saving consumers money at the pump and reducing emissions from transportation, the US’s largest contributor to global warming emissions today.

The Trump administration came in and threw a giant wrench in the gears, eliminating the consensus-based nationwide program by proposing to roll back the federal standard at the industry’s request. The administration is also trying to illegally eliminate the authority of California and the 13 states that have adopted California’s Advanced Clean Car Standards, leaving manufacturers with the challenge of keeping two sets of books: one for the 35 percent of sales in the states maintaining California’s program, and the other for the rest of the country under the Trump rollback. It also puts these global companies out of step with more stringent requirements in China and Europe. Combined with the inevitable lawsuits resulting from the Trump administration’s gift to the oil industry, this creates massive uncertainty for automakers (albeit of their own making).

In the agreement with BMW, Ford, Honda, and Volkswagen, California has agreed to treat an automaker as compliant with its own standard if the automaker sells a nationwide fleet that improves by an average of 3.7 percent per year from 2021 onward, up to 1 percent of which can be made up of bonus credits given to electric vehicles. This is weaker than what California would otherwise have required of those manufacturers within the state’s borders, but it recreates a nationwide standard for those automakers, providing greater certainty and ease of compliance.
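To get a feel for how a 3.7 percent annual improvement compounds across model years, here is a minimal sketch; the starting fleet-average value of 240 g/mile is an assumed, illustrative number, and the real program’s baselines, test procedures, and EV credit accounting are more involved.

```python
# Minimal sketch with an assumed starting fleet-average emissions level;
# the actual baselines and EV crediting under the agreement are more complex.

START_G_PER_MILE = 240.0    # assumed illustrative fleet average for model year 2021
ANNUAL_IMPROVEMENT = 0.037  # 3.7% per year under the California agreement

level = START_G_PER_MILE
for model_year in range(2021, 2027):
    print(f"MY{model_year}: {level:.0f} g/mile")
    level *= (1 - ANNUAL_IMPROVEMENT)

# After five years of compounding, the fleet average sits about 17% below
# the starting level (0.963**5 is roughly 0.83).
```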

What we stand to gain

Many in the industry are investing in technologies to reduce fuel use, whether that is more advanced and efficient transmissions, next-gen engine designs involving more efficient combustion or ever more clever engine “right-sizing” strategies, or cutting gasoline use altogether by switching to battery-electric vehicles.

Some of these new technologies have been deployed in new cars that consumers have been buying over the last 8 years, thanks to the existing standards. However, the Trump administration’s proposed rollback threatens to stall out these and further advancements in the industry. Automakers don’t put energy-saving technology in cars without a push—with the Trump administration shirking its responsibilities, that push is having to come from California and the states adopting California’s standards.

Ensuring that these manufacturers sell more efficient vehicles nationwide means that not only will Californians, or New Yorkers, or other folks who live in states with strong standards get the most efficient vehicle choices—these agreements help bring parity of choice to consumers across the country, which means the amount we all spend on gas stands to decrease, no matter where we live.

For the environment, what this means is that while a cloud of uncertainty hangs over anything the administration puts forward, as it lies in legal limbo, we are continuing to push efficiency forward. If a future administration acknowledges the problem of climate change, we will be closer to the needed trajectory to solve it under this proposal than under the years of delay which would result from the President’s proposal.

Better than standing still

At the same time, it is important to acknowledge that this agreement is not going to put us on the path we need to be to meet the Paris agreement targets, nor is it keeping us on the path we are on right now. There will be increases in emissions, even under the agreement signed by California and the four automakers.

Notably, this agreement only covers four automakers, representing just 30 percent of annual sales in the US. In order for this agreement to really make a dent in emissions, the entire industry needs to be bought into it. This is what makes the Trump administration’s rollback so frustratingly stupid—they have the biggest lever to exercise in the fight against climate change, and they’re using it instead to benefit oil companies.

Our analysis shows that if all manufacturers were to join this agreement, we would retain about 75 percent of the benefits of the current standards.

If the entire industry signed up for California’s plan, our analysis indicates that about 75 percent of the emissions reductions from vehicles sold through 2025 would be retained. That is a significantly better outcome than the Trump administration’s proposal, under which those emissions reductions would be lost entirely, but the foregone reductions will still have lasting consequences.

The average lifetime of cars exceeds 15 years and is getting longer—even if all automakers bought into the agreement with California, the vehicles sold under this weaker proposal would cost consumers around $100 billion more in fuel over their lifetimes. The nearly 300 million metric tons in additional global warming emissions will also cost us and future generations precious time in the fight against climate change.

Who will be the next to stand up?

BMW, Ford, Honda, and Volkswagen did the right thing in stepping away from their trade associations to work directly with California, but this agreement alone is simply not enough to address the severity of the challenge. President Trump may be ignoring the problem entirely, but it’s time to show the American people that the auto industry is serious about addressing climate change.

Toyota talks a big game about its environmental bona fides, but its fleet has actually increased its average fuel use and emissions over the past few years, thanks to a lack of investment in the very segment it’s working to grow: trucks. Committing to, and taking, strong action is a way to show it’s serious about reversing that trend.

General Motors has lobbied for a meager 1 percent per year improvement from its gasoline-powered fleet, despite those vehicles making up more than 90 percent of its projected sales over the next decade. While it may claim to be “driving toward a future of zero emissions,” GM’s proposal was slated to get there at horse-and-buggy speed. If GM wants to finally merge into the fast lane toward its zero-emissions future, joining this agreement with California would be a way of assuring its commitment to “meaningful year-over-year reductions.”

Getting those two manufacturers on board would double the number of vehicles covered by this agreement with California and go a long way towards bringing the rest of the industry on board. But whether GM and Toyota are leaders in the next phase of this agreement or the last ones to sign on, we need to get more companies on board to really push forward and undo the damage that the industry has done by seeking Trump’s help in a rollback.

We know that the industry can be doing a lot more to reduce emissions from the vehicles it sells—while this agreement is still not commensurate with where we could and should be going, it’s at least a welcome step in the right direction.

Photos: Emmett Institute/Flickr; Mary Nichols/Twitter; brionv/Flickr

What do Alaska Wildfires Mean for Global Climate Change?

Alaska is on fire.

During the (ongoing) 2019 fire season, over 2 million acres have burned – an area roughly equivalent to that of Yellowstone National Park. Unlike many fires in the conterminous United States, many fires in Alaska burn far from population centers, and as such can be fought and responded to differently. However, to put the alarming nature of this season in context, the 2019 fire season in Alaska has already burned more acreage than ALL fires in California during 2018 (~1.8 million acres), the year of the devastating Camp, Woolsey, and Carr fires. We can see in the figure below that 2019 is slated to burn far more acreage than an average Alaskan season and may be on a trajectory to surpass 3 million acres burned.

Trends in burned acreage and total emissions across recent, representative fire seasons in Alaska. Burned acreage data are from historical fire information from the Alaska Interagency Coordination Center, calculated assuming uniform growth over the reported duration of each fire. Points are connected by lines for visual clarity. Emissions estimated using data from Veraverbeke et al 2015, which integrates both biomass and soil combustion.

Alaska’s ecosystems are already some of the most vulnerable to climate change, with temperatures rising at a rate that’s twice the global average. Across the state, record breaking heat and drought have dried out Alaska’s ecosystems, priming them to burn.

While these fires pose risks to human life and infrastructure, they also exacerbate climate warming by releasing millions of tons of heat-trapping gases into the atmosphere as Alaska’s vegetation and carbon rich soils burn, meaning that the impact of this year’s fires could reverberate for generations.

Wildfire smoke forecast from UAFSMOKE showing PM2.5 levels over the state of Alaska. PM2.5 is a major component of wildfire smoke and a good proxy for air quality. Pink, purple, and red areas are unhealthy for sensitive groups like children and older adults.

Boreal forests are fire adapted, meaning that fire is a natural part of the ecosystem. In recent years, however, fires have become more frequent and intense, moving beyond the historic fire regime of these ecosystems. So now we’re seeing more fires that burn a larger area.

In the twenty years from 1980-2000, Alaskan fires burned approximately 13.9 million acres. In the last nineteen years, Alaskan fires have burned more than double that area, approximately 28.1 million acres.
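Annualizing those figures makes the acceleration plain; this is just back-of-the-envelope arithmetic using the acreage values cited above.

```python
# Back-of-the-envelope annualized comparison of the burned-acreage figures above.

acres_1980_2000 = 13.9e6       # ~13.9 million acres over the 20 years 1980-2000
acres_last_19_years = 28.1e6   # ~28.1 million acres over the last 19 years

rate_early = acres_1980_2000 / 20          # acres burned per year, 1980-2000
rate_recent = acres_last_19_years / 19     # acres burned per year since 2000

print(f"{rate_early / 1e6:.1f}M acres/yr vs {rate_recent / 1e6:.1f}M acres/yr")
print(f"Recent rate is {rate_recent / rate_early:.1f}x the earlier rate")
# -> 0.7M acres/yr vs 1.5M acres/yr; roughly 2.1x
```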

Fire seasons of this magnitude can create dangerous conditions for those who live, work, and visit Alaska. Risk of fire has already led to evacuation of several communities throughout Alaska, and smoke from these fires can create conditions that far exceed the EPA’s clean air standards for particulate matter.

Even more human lives are at risk when you consider the individuals fighting fires and the smoke that makes being outdoors, even to research and respond to the fires themselves, a health hazard.

Beyond these immediate risks of fire, carbon emissions from these wildfires could exacerbate climate warming for decades to come.

In addition to the carbon emitted from the burning of trees and foliage during a wildfire, carbon stored within the soil and below ground can also be released. Alaska’s ecosystems store huge quantities of carbon both as permafrost and soil that has accumulated over millennia. Wildfires destabilize these stores of carbon by combusting soil and accelerating permafrost thaw, both of which release heat trapping gases to the atmosphere.

Global carbon storage: the darkest colors (indicating the most soil carbon) are largely in high-latitude ecosystems. Credit: Riccardo Pravettoni, UNEP/GRID-Arendal

And when it comes to the quantity of these gases, Alaskan fires aren’t messing around.

In 2019, fires in the state have released roughly three times the CO2 that Alaska emits annually from burning fossil fuels (the fires have released ~99 million metric tons of CO2), and roughly 40% more than all 2018 fires in California (68 million metric tons of CO2). In the three largest fire years on record (2004, 2005, and 2015), fires released approximately eight times Alaska’s annual fossil fuel emissions.
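A quick back-of-the-envelope check of those comparisons, using only the rounded figures stated in this post (these are not official emissions inventory numbers):

```python
# Back-of-the-envelope arithmetic for the comparisons above, using only the
# rounded figures stated in this post (million metric tons, MMT, of CO2).

fires_2019_mmt = 99        # estimated 2019 Alaska wildfire emissions so far
ca_fires_2018_mmt = 68     # all 2018 California wildfire emissions

# "Roughly three times what Alaska emits annually from burning fossil fuels"
# implies statewide fossil fuel emissions of about 33 MMT CO2 per year.
print(round(fires_2019_mmt / 3))                     # -> 33

# Ratio of 2019 Alaska fire emissions to 2018 California fire emissions.
print(round(fires_2019_mmt / ca_fires_2018_mmt, 2))  # -> 1.46

# At roughly eight times the state's annual fossil fuel emissions, the
# largest fire years would be on the order of 260+ MMT CO2.
print(round(8 * fires_2019_mmt / 3))                 # -> 264
```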

To avoid the most devastating impacts of climate change, we must reduce emissions from all sectors, and that includes protecting carbon that is already sequestered. While these fires and their emissions are an unchecked liability in the effort to limit continued warming, they may also present an unusual opportunity: we can fight climate change with a technique (fire management) that we already know how to implement successfully. Although fire management is often overlooked as a viable climate mitigation strategy, the increasingly large fires this season in Alaska suggest that we ought to reconsider and view management of Alaskan fires as a critical intervention opportunity.

Public Domain

Killer Heat in Every Congressional District: A New Map and 433 Fact Sheets Show What’s at Stake

A new interactive map from the Union of Concerned Scientists lets you explore how global warming is projected to increase the frequency and severity of extreme heat in your congressional district. With the tool, you can download fact sheets in Spanish or English that show the risks your district faces.

Explore the interactive map

The data in this map and in the fact sheets come from our new report, Killer Heat in the United States: Climate Choices and the Future of Dangerously Hot Days, and show the staggering escalation of dangerous heat across the country if we do not reduce heat-trapping emissions.

Few places in the US will be untouched by increasingly frequent extreme heat. Wherever you live, extreme heat will likely transform your daily life and your community in the coming decades, though how much it does so depends largely on us.

To prevent an unrecognizably hot and dangerous future, we need our elected officials to champion policies that rapidly and drastically reduce our heat-trapping emissions. We need them to take action to keep people safe when heat climbs to dangerous levels.

As my colleague Rachel Cleetus said in her testimony before the US Congress, we have the technology and we understand the policy mechanisms; what we lack is political will. With this tool and the fact sheets, you can help build that political will and momentum.

In this blog post, we’ll explore the tool, highlight the information available for each congressional district, and show how you can use the data to engage your community, your elected officials, and the candidates campaigning in your area.

Explore how your congressional district will be affected by extreme heat

The tool is simple. Click on any congressional district in the contiguous United States and a window will appear with a link to download a fact sheet for that district. The first page of each fact sheet shows extreme heat projections specific to that congressional district, depending on whether or not we take rapid action to reduce emissions. The second page highlights the national-level findings of the Killer Heat report and lists commonsense policies that local, state, and federal officials could adopt to reduce emissions and to be better prepared for dangerous extreme heat.

While some districts are more exposed to extreme heat than others, nearly every district will face some level of risk.

For example, in Florida’s third congressional district, home to Gainesville, the average number of days per year with a heat index above 100°F is projected to rise to 95, up from the 25 days a year the district historically (1971-2000) experienced such heat. In other words, three weeks of extreme heat per year would become three months.

Iowa’s third congressional district, which includes the city of Des Moines, would see fewer extreme heat days than Florida given its location. But by midcentury it would face 35 days per year with a heat index above 100°F, well above its historical average of six such days per year. An increase of that magnitude would require changes to how that community plans to keep people safe during extreme heat events.

Extreme heat days per year in Iowa’s 3rd district

Each fact sheet has its own unique chart, bringing locally relevant information to your community and your elected officials.

And Oklahoma’s first congressional district, home to Tulsa, is projected to see 86 days per year with temperatures above 105°F by late century if we take no action to reduce emissions. That could be cut in half, to 43 days per year, if we limited warming to 3.6°F (2°C).

Extreme heat days per year in Oklahoma’s 1st district

The fact sheets highlight what a future with no action to reduce emissions would look like compared with one in which rapid action is taken and warming is limited.

Elected officials and candidates need to know the risks of future extreme heat

This tool will help the public and US officials understand how the decisions we make today will shape the future we hand to the next generation. But what we do with that awareness is critical, especially when it comes to making sure our elected officials recognize the risk and have a plan to confront it.

Here are a few ways to get this information to your elected officials and candidates, including candidates for Congress, for mayor or county commissioner, and even for president. It’s important to ask them what they are going to do with this information.

  1. Use social media. If you’re on Twitter, tweet at the officials or candidates in your district. Include some data about extreme heat in your area, with a link to the map or to your fact sheet, and ask what their plans are to address the problem. Tag their Twitter handles so that they or their staff see your tweet. You can also include hashtags such as #cambioclimático or #calentamientoglobal, or any others that might boost your visibility on social media.
  2. Attend a forum or event. Ask your representatives or the candidates about their plans to combat extreme heat and climate change. Present the data on the projected increases in the frequency and intensity of extreme heat in your district and ask how they plan to support your community in building resilience to extreme heat. Because this problem won’t be limited to your community alone, ask your representatives how they plan to fight at the federal level to reduce the emissions that cause global warming. Aggressive national action to reduce global emissions could cut the frequency of extreme heat days in half compared with a future in which we let emissions keep rising as they have so far. You can also email your representatives and the candidates directly. Most candidates’ official websites include contact information with an email form, an address, or another way to write directly to the person you want to reach.
  3. Write a letter to the editor of your local newspaper. Candidates and elected officials keep up with local news, so writing a letter to the editor can be a good way to inform them about the issues that matter to you and your community, including extreme heat. If you back up your letter with statistics about the projected increase in extreme heat over the coming decades, you may catch the editor’s attention and get your letter published. Newspapers don’t publish every letter they receive, but responding to an article that has already appeared in the paper can improve the odds that yours runs. Just be sure to send it soon after the article appears; don’t let days pass between the original article and your letter. The good news is that letters to the editor are typically short (under 200 words, no more than two paragraphs), so they can be written fairly quickly. Check with your newspaper for its submission guidelines.

We hope you find this tool useful, and we’d love to hear how you’re using it.

If you’d like to dig deeper into the extreme heat projections for your city or county, take a look at the other tools we’ve developed, including an “extreme heat widget” that lets you quickly find results by city or county; a data-rich county-level map; and fact sheets full of local information.

Interactive map of the analysis

The map

County-level map for each of the 3,109 counties in the contiguous US, with all the future heat scenarios included in the analysis (in English)

This blog post was translated from English by Michelle Rama-Poccia

AP Photo/John Locher

Why Berkeley Banned Natural Gas in New Buildings

Photo: Christian Naenny/Flickr

Two weeks ago, Berkeley, California became the first city in the nation to ban natural gas hook-ups in new construction. The ordinance passed unanimously with overwhelming public support, but the gas industry has been quietly fighting back by stealthily funding “consumer” groups that criticize gas bans like Berkeley’s. So let me explain what the rule actually does and why it’s a great idea.

The fine print

The Berkeley rule goes into effect January 1, 2020. The rule will initially only apply to low-rise residential buildings, but as the California Energy Commission develops all-electric building standards for mid-rise, high-rise, and commercial buildings, the natural gas ban will automatically apply to newly constructed buildings of those types as well. To be clear, this rule applies only to new buildings – so for those of you who love your gas stove so much that Berkeley would have to pry it out of your cold dead hands, you won’t have to let go of your gas-burning appliances quite yet. (But why hang on when electric induction ranges routinely top Consumer Reports’ performance tests?) Existing buildings in Berkeley can continue to use natural gas… at least for the time being.

Small city, big impact

Berkeley is a small, liberal city, so some folks may just dismiss this new rule as a far-out Berkeley thing that won’t ever get any traction.

But not so fast.

Berkeley has long been a city of “firsts.” Cities and states have repeatedly followed Berkeley’s lead and passed ground-breaking policies that originated there. For example, Berkeley was the first city to:

  • Integrate public schools voluntarily (1968)
  • Limit smoking in restaurants (1977)
  • Ban Styrofoam (1989)

…And the list goes on. In short, Berkeley’s novel policies have a history of getting replicated at a much larger scale, leading to a big impact.

Berkeley and the big picture

Given the urgency of reducing emissions to avoid the worst impacts of global warming, Berkeley’s ordinance is significant because it sets in motion what could be a huge breakthrough in building decarbonization.

Let’s take a minute to think about the bigger picture.

Last year saw the release of the Special Report on Global Warming of 1.5°C. Amongst its many findings, the report highlighted the urgency of reaching net zero greenhouse gas emissions by mid-century in order to avoid the worst impacts of global warming. Again, that’s net zero by mid-century, which means we have just over 30 years to eliminate as many emissions as possible and to offset all the rest. The state of California has already signaled its intention to achieve that goal – last year, Governor Jerry Brown signed an executive order committing the state to reach carbon neutrality by 2045.

Eliminating emissions one step at a time

So how do you eliminate emissions from buildings when buildings tend to stick around for many decades and retrofits are notoriously expensive? Berkeley’s answer is to nip the problem in the bud by preventing natural gas use in all new buildings, which immediately eliminates one of the largest sources of emissions from buildings.

Roughly 27% of Berkeley’s greenhouse gas emissions come from natural gas use in buildings. (For comparison, buildings both in California and nationwide account for only 12% of emissions, since California and the United States both have far more agricultural and industrial emissions – see comparison charts below.) Twenty-seven percent is a big chunk, but Berkeley’s total emissions are minuscule in comparison to California’s or the United States’ emissions.

For example, California’s commercial and residential buildings produce roughly as much greenhouse gas emissions as the entire country of Switzerland. Buildings in all of the United States produce roughly as much emissions as the entire country of Canada. In short, buildings, and the natural gas burned inside, are a big contributor to global warming.

Enter Berkeley, which is leading the way with its new ordinance. Many other cities, such as San Francisco, are considering similar action. Looking ahead, if California or the United States were to adopt a similar policy, it would be a big step towards reducing the large amount of emissions from buildings.

Over a quarter of Berkeley’s emissions come from buildings. In California and nationwide, buildings account for 12% of emissions. Sources: Berkeley data, California data, United States data.

Saving money and sparing the air

But this isn’t just about preventing climate change. It’s also about saving people money and improving indoor air quality. A recent analysis demonstrated that new all-electric homes end up saving homeowners money in comparison to new homes built with natural gas; these findings apply not only to Oakland (Berkeley’s neighbor to the south), but also to cities in significantly different climate zones, such as Chicago, Houston, and Providence, Rhode Island. From a financial standpoint, building new homes that are all-electric is a good choice.

My mother-in-law’s electric induction stovetop, which is pretty fun to cook on. The burners on the right are induction, and the burners on the left are the old-school electric kind that actually heat up (which allows you to use any pan on those burners).

Furthermore, burning natural gas in your home can be bad news for indoor air quality. Cooking with a gas stove without proper ventilation can make indoor air unhealthy to breathe. One study found that, during a typical winter week, millions of Californians could be exposed to unhealthy levels of indoor air pollutants when cooking with a gas stove without proper ventilation. Cooking with an electric stove doesn’t eliminate all indoor air pollution from cooking (e.g. think about the last time you accidentally burned something), but taking the gas stove out of the equation keeps the air in homes healthier to breathe.

All-electric homes are already becoming the default for new construction in some areas of California. Modern electric appliances are totally unlike their inefficient and ineffective predecessors – many of today’s all-electric technologies (e.g. heat pumps and induction cooktops) perform even better than their natural-gas-fueled counterparts. All-electric homes aren’t some remnant of the past – on the contrary, the technology exists to build comfortable, functional, and affordable all-electric homes for the future.

Let the fondue flow begin

In an effort to ensure Berkeley’s new ordinance would pass, one of Berkeley’s city council members had a staffer make chocolate fondue on an electric induction cooktop to demonstrate the safety and efficacy of all-electric cooking appliances.

Such a demonstration seems a little over the top to me, but if that’s what it takes to get cities, states, and countries to pass similar policies, then I am hereby requesting that all fondue enthusiasts pack your bags (don’t forget your portable induction cooktop!) and prepare for a world-wide tour – we need your help!

Photo: Christian Naenny/Flickr

New Electric Options for Drivers Looking to Cut Carbon Emissions

The all-electric Audi e-tron SUV is responsible for much lower carbon emissions than a similar gasoline-powered Audi SUV. Photo: Dave Reichmuth/UCS

Electric vehicles can significantly reduce the emissions from driving by substituting increasingly cleaner electricity for gasoline. At UCS we’ve been tracking how the difference between gasoline and electric car emissions has been changing and where in the US electric vehicle emissions are lowest. In 2018, UCS updated our estimates of the carbon emissions of electric vehicles with the latest electricity data, and now we’ve updated our vehicle emissions calculator with the latest models to reflect the carbon emissions savings from these new vehicles.

UCS’s electric vehicle emissions calculator shows the carbon emissions from different models across the country.

More options for super-efficient electric vehicles

The Tesla Model 3 “Standard Range Plus” edition now matches the Hyundai Ioniq Electric for the lowest per-mile carbon emissions of any passenger car in the US. On the cleanest grid in the US (in upstate New York), driving the Tesla Model 3 Standard Range Plus is equal to driving a (hypothetical) gasoline car that gets 257 MPG, even including the emissions from electricity generation. Put another way, a fleet of 10 of these efficient electric vehicles would have the same emissions as one average new gasoline vehicle.
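For intuition on where an MPG-equivalent number like that comes from, here is a minimal sketch that divides gasoline’s per-gallon carbon emissions by an EV’s per-mile grid emissions. This is not UCS’s actual methodology or data set; every number below is an assumed, round value for illustration.

```python
# Illustrative sketch only: all values below are assumptions, not UCS data.

KG_CO2_PER_GALLON_GASOLINE = 10.2  # assumed well-to-wheel value, kg CO2e per gallon

def mpg_equivalent(ev_kwh_per_mile: float, grid_kg_co2_per_kwh: float) -> float:
    """MPG rating a gasoline car would need to match the EV's per-mile emissions."""
    ev_kg_per_mile = ev_kwh_per_mile * grid_kg_co2_per_kwh
    return KG_CO2_PER_GALLON_GASOLINE / ev_kg_per_mile

# An efficient EV (~0.26 kWh/mile including charging losses) on a very clean
# grid (~0.10 kg CO2e/kWh) under these assumptions:
print(round(mpg_equivalent(0.26, 0.10)))  # -> 392 MPG-equivalent

# The "10 EVs = 1 average gasoline car" framing follows the same logic: a
# 257 MPG-equivalent EV emits one-tenth as much per mile as a ~25.7 MPG car.
```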

And the Model 3 isn’t alone: the new Hyundai Kona Electric and Kia Niro Electric join the existing BMW i3, Hyundai Ioniq, and Volkswagen eGolf as vehicles that have 60 MPG emissions equivalent or better for two-thirds of the US population.

New plug-in SUV options: significantly lower emissions than the gasoline alternatives

In addition to smaller vehicles like the Tesla Model 3 and Kia Niro, we’ve also added new plug-in SUVs to our calculator. These larger vehicles do require more electricity to use than the average electric vehicle, but still have considerable carbon emissions savings compared to the equivalent gasoline vehicle. For example, the new Subaru Crosstrek plug-in hybrid SUV driven in California has emissions equal to a 47 MPG gasoline car, while the gasoline version gets 29 MPG. And the all-electric Audi e-tron SUV has emissions equal to an 80 MPG gasoline car when driven in California, less than a third of the emissions of a similar Audi gasoline SUV (Audi Q5 or Q8).
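Because per-mile emissions scale roughly as the inverse of the MPG or MPG-equivalent rating, a quick ratio shows the size of those savings. The 25 MPG figure used below for the comparable gasoline Audi SUV is my own illustrative assumption, not a number from the calculator.

```python
# Rough comparison: per-mile emissions scale as ~1 / MPG(-equivalent).
# The 25 MPG gasoline SUV figure is an assumption for illustration.

def relative_emissions(mpg_equivalent_ev: float, mpg_gasoline: float) -> float:
    """Fraction of the gasoline vehicle's per-mile emissions that the plug-in emits."""
    return mpg_gasoline / mpg_equivalent_ev

print(round(relative_emissions(47, 29), 2))  # Crosstrek PHEV vs gasoline Crosstrek: ~0.62
print(round(relative_emissions(80, 25), 2))  # e-tron vs an assumed 25 MPG gasoline SUV: ~0.31
```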

Plug-in vehicles like these SUVs are critical to increasing the number of electric cars on the road (and decreasing emissions), because many new car shoppers are looking for SUVs and crossover vehicles. And having more manufacturers like Subaru and Audi in the market means that shoppers have more sizes and brands to choose from while going electric.

Photo: Dave Reichmuth

Tackling Health Disparities in St. Louis

Many factors cause disparities in who has access to healthcare, as well as in the quality of the care they receive. The health disparities facing St. Louis are not unique to the city but are intensified by two primary factors: the division between the city and county, and extreme racial segregation. Having two separate governments operating in the same metropolitan area means that multiple initiatives may be formed to tackle the same problems but never communicate or share resources. While some services – such as the sewer district and certain medical centers – are shared, many more function independently, requiring organizations to communicate and comply with two legislative bodies and two sets of regulations.

To learn more about and address the specific issues facing St. Louis, Washington University ProSPER worked with the Union of Concerned Scientists to host a panel with public health experts, community leaders, and county officials. Speakers included Angela Brown, acting CEO of the St. Louis Regional Health Commission, Dr. Bettina Drake, Associate Director of Community Outreach and Engagement at the Siteman Cancer Center, Jessica Holmes, Principal Strategist for Alignment, Innovation, and Growth for the St. Louis Integrated Health Network, Dr. Will Ross, Associate Dean of Diversity at the Washington University School of Medicine in St. Louis, and Dr. Spring Schmidt, the current Acting Director of the St. Louis County Public Health Department.

The impact of racial segregation on health outcomes

St. Louis City has been ranked as the 10th most segregated city in the United States while the metropolitan area is the 6th most segregated. An example of this is the Delmar Divide. North of Delmar Boulevard the population is more than 98% African American, while south of Delmar, the population is more than 70% white. Additionally, south of Delmar Boulevard the median income increases by $30,000, the number of residents with bachelor’s degrees by 60%, and the median home value by $250,000. This is not the only location in St. Louis where such a stark contrast can be seen, and these socioeconomic disparities have significant impacts on the health of residents.

Evidence of this segregation is reflected in cancer rate differences and outcomes in north St. Louis. Eight zip codes close to Coldwater Creek were found to have higher rates of breast cancer and leukemia and these rates were highest among African American women. Furthermore, African American women were more likely to be diagnosed with later stages of breast cancer and have a mortality rate 10% higher than the total population. These women were more likely to delay diagnosis and treatment due to cost, which results in greater numbers of late-stage diagnoses and higher mortality.

Challenges to equitable access

To address this and other issues, the Missouri Department of Social Services in partnership with the St. Louis Regional Health Commission have funded Gateway to Better Health, a pilot program to “provide uninsured adults a bridge in care until they are able to enroll in health insurance coverage options available through the Affordable Care Act.” The program was implemented in 2012 and serves as a means of providing healthcare access to over 60,000 residents. Although the program helps many in the St. Louis area, the panelists agreed that it is a stop-gap measure incapable of adequately addressing the needs of the region.

These needs may grow in the near future due to funding cuts to the Medicaid Disproportionate Share Hospital (DSH) program. This program provides funds to hospitals that care for large numbers of uninsured and Medicaid-insured individuals, to offset the corresponding costs. DSH is federally funded and expected to be cut by $4 billion this year and an additional $8 billion over the next five years, justified in part by the Medicaid expansion in the Affordable Care Act. Missouri hospitals were expected to lose over $157 million in funding last year alone, which could have a devastating impact on the amount of services they are able to provide to low-income and un- or underinsured individuals. As Missouri is one of 13 states that have not accepted the Medicaid expansion, it will be disproportionately affected by these budget cuts.

Hope for progress & action to reduce disparities

While there are many challenges to fixing health care disparities in the St. Louis region, some progress is being made. The St. Louis Integrated Health Commission is working to bring together resources from the county, the city, and private organizations and to reduce some of the inefficiencies and missed populations caused by lack of communication. There has also been an increase in the use of community health workers, community members who may not have medical training but are a key resource in helping people navigate the healthcare system, especially for the specific needs of at-risk populations. More research is being done to address racial disparities in healthcare access and outcomes. There is still a long way to go and many problems to resolve, but we can all do our part by supporting the organizations and people on the front lines of these issues and by contacting our legislators about fixing some of the systemic issues that drive health disparities. Having public conversations about the issues and hurdles we face is one step toward addressing disparities within the system and supporting the initiatives and healthcare workers who are working toward health equity.

 

Max Lyon is a graduate student at the Washington University in St. Louis School of Medicine. He is also a member of Washington University ProSPER (Promoting Science Policy, Education, and Research), a graduate student organization that helps students explore issues in science policy, advocacy, communication, and outreach.

Science Network Voices gives Equation readers access to the depth of expertise and broad perspective on current issues that our Science Network members bring to UCS. The views expressed in Science Network posts are those of the author alone.

Resilience and Transitioning to EVs Should Be Key Features in New Highway Bill

Photo: nsub1/Flickr

This blog was written in collaboration with Shana Udvardy, Climate Resilience Analyst in the Climate and Energy Program.

Most legislation introduced in Congress doesn’t go anywhere – it doesn’t get a hearing, doesn’t get a vote, it just dies quietly at the end of the two-year congressional session.  However, there are a few things that Congress must do either annually or every few years, and these are the big “must-pass” bills that you sometimes hear people talk about.

The Surface Transportation Reauthorization package is one of those must-pass bills. It is therefore an important opportunity for Congress to include strong provisions that reduce dangerous heat-trapping emissions, accelerate the transition to electric vehicles, and make transportation infrastructure more resilient, ensuring it can withstand the extreme weather and climate-related impacts that are already unavoidable due to global warming. Every five years or so, Congress takes a look at our transportation system and tells the agencies how they should alter their course — it evaluates how things have been going and sends directions to the agencies involved on how to administer those programs going forward. This process is always a massive undertaking, usually somewhat contentious, and covers a lot of policy ground (highway spending, safety, rail, transit, permitting, etc.).

This week, the Senate Environment and Public Works (EPW) Committee will officially kick off the Surface Transportation Reauthorization process by holding a business meeting (also called a “markup”) to consider amendments to the base text of its surface transportation reauthorization bill. The base text should be released today (yes, the day before the business meeting), so we will all be reading it quickly and providing input to the offices of members who sit on this committee. If you’re thinking it is a little crazy that we will only see bill text the day before the committee considers it, you are not wrong, but this will be a long process, so this will not be our only bite at the apple. UCS wants Congress to pass a Surface Transportation Bill that ensures a low-carbon and resilient transportation system.

Reducing emissions and advancing clean vehicles

Given that the transportation sector is the leading contributor to US heat-trapping emissions (producing nearly 30 percent of all US global warming emissions), UCS is strongly advocating for using the Surface Transportation Bill to set the stage for deploying more electric vehicles (EVs). Here are a few things we think should be included in the bill to make that a reality:

  • Include the Clean Corridors Act, which would create (and provide grant funding for) more highway “corridors” that connect urban centers by installing charging infrastructure (or hydrogen fueling stations) along the major routes between those cities. The charging stations should be fast chargers, meaning they can charge an EV to 80 percent in about 30 minutes.
  • Update the federal guidebook that standardizes signage to make sure it includes EV signage that all states can choose to adopt. The last full update of this manual was in 2009, when there were only a handful of Tesla Roadsters on the road. The Department of Transportation has updated its interim guidance in the meantime, but signage that lets drivers know where charging stations are located is critical.
  • Increase funding for electric transit buses. In 2018, states and cities applied for grants totaling $558 million, but only $85 million in funding was available. It comes as no surprise that cities and states are interested in converting transit buses to electric power – they provide both health benefits for the neighborhoods the buses operate in and climate benefits.
  • Ask states and metropolitan planning organizations to create EV plans, particularly for EV charging infrastructure deployment, to accommodate residents who choose to buy or lease EVs.

Given that climate change will amplify the risks our vulnerable transportation systems already face, we also have some ideas for making transportation systems more climate-ready and resilient:

  • All infrastructure that we build now should be ready to withstand the future impacts of climate change – for example, new federally funded infrastructure should be built at least two feet above the “100-year” (1 percent annual chance) flood level (see the quick calculation after this list for why that annual chance adds up).
  • Provide grant funding for climate-resilient infrastructure, specifically for coastal states, for improving evacuation routes, and for flexible use of funds to account for future conditions.
  • Create an expert panel to study the likely impacts of climate change on transportation infrastructure and provide a report to Congress with recommendations and guidance on how to appropriately plan and build climate-ready transportation infrastructure.
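
For context on that “1 percent annual chance” figure in the first bullet: a flood of that size is much more likely than it sounds over the lifetime of a road or bridge. Here is a minimal back-of-the-envelope sketch in Python (the planning horizons are illustrative assumptions on our part, not figures from the bill):

    # Chance of at least one "100-year" (1 percent annual chance) flood
    # occurring within a given planning horizon.
    def exceedance_probability(annual_chance: float, years: int) -> float:
        """Probability of at least one flood of this size over `years` years."""
        return 1 - (1 - annual_chance) ** years

    for horizon in (30, 50, 75):  # assumed design lifetimes in years (illustrative)
        p = exceedance_probability(0.01, horizon)
        print(f"{horizon}-year horizon: {p:.0%} chance of at least one such flood")

That works out to roughly a one-in-four chance over 30 years and better-than-even odds over 75 years, which is why the extra two feet of elevation matters for infrastructure meant to last decades.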

This markup is just the first step in what will likely be a very long process. Other committees in the Senate need to weigh in (Finance for tax issues, Commerce for rail and safety issues, Banking for transit issues), and then the House also needs to put together its own multi-committee bill. Finally, the House and Senate will need to resolve the differences between the bills in conference. The existing authorization of the surface transportation programs (the FAST Act) expires in September 2020, so there is also a lot of time left in this process. We will keep an eye on these issues as they move through Congress and do our best to ensure that strong electrification and climate resilience priorities are included in the final bill.


Misinterpreting Scientific Integrity Data in House Oversight Hearing

Former DOI climate change scientist Joel Clement testifies on scientific integrity before the House Natural Resources Committee on Thursday, July 25, 2019. Photo: House.gov

Last week, the House Natural Resources Committee held a hearing on scientific integrity and attacks on science at the Department of the Interior (DOI). In his opening statement, Ranking Member Rob Bishop of Utah showed the committee a graph and offered it as evidence that, under the Trump administration, scientific integrity complaints are at their lowest since data collection began at DOI. As is often the case, the graph alone does not tell the full story, and Congressman Bishop ought to want to understand why the numbers appear to be so low. Here are three reasons the count of filed complaints understates the problem:

  1. Federal scientists are afraid of filing scientific integrity complaints. How do I know this? Because I collected data on this very issue. In a 2018 survey of federal scientists that I led, 800 respondents said they would not feel comfortable coming forward about a scientific integrity violation. That is far too many scientists who feel they cannot report interference in their work – the ideal number, of course, being zero. Many members at the hearing criticized scientists for a lack of transparency in their raw research data, so we invite these members to take a look at the raw frequency results from my survey for item 36 here. These data come from the very federal scientists who were working in the agencies when the survey was conducted.
  2. More scientific integrity issues have been discussed publicly than have been formally documented at DOI. We know this because of Freedom of Information Act requests and investigative journalism. At UCS, we keep a running tally of all these publicly reported scientific integrity issues on our attacks on science page. For example, DOI’s list of closed scientific integrity cases records two filed complaints in 2018, while we documented eight scientific integrity issues that year from information disclosed publicly. This is just another signal that scientists are not filing scientific integrity complaints even when violations occur.
  3. Scientist whistleblowers have publicly described their hesitancy to file scientific integrity complaints for fear of retaliation from the Trump administration. One of the witnesses at the hearing, Dr. Maria Caffrey, shared a personal and painful account of her fight to retain climate change language in a report she drafted for the National Park Service. In her testimony, she recalled colleagues begging her not to come forward with her allegation of a scientific integrity violation. One colleague feared that Dr. Caffrey’s coming forward could result in their reassignment, which would make it difficult for them to provide for their children. Dr. Caffrey did, indeed, lose her job for speaking out about the truth. And we have seen others publicly reprimanded or reassigned for discussing their scientific work.

These are examples of the very real repercussions that federal scientists risk in simply trying to do their jobs – informing their agencies and the public about what the science says. Imagine the impact on the federal workforce writ large as they witness what can happen, and has happened, to their colleagues.

If you were a federal scientist working under the Trump administration, would you be willing to come forward with a scientific integrity complaint? Or even talk publicly about the science related to issues considered politically contentious or at odds with the administration’s political/ideological preferences? This culture of fear would probably have you think very long and hard about doing so if it meant losing your job and the ability to provide for your family.

Currently it’s all stick and no carrot when it comes to the work of scientists under the Trump administration, as the data clearly show. Congressman Bishop did not take these other data into consideration when he concluded that scientific integrity is not a big problem at DOI under the Trump administration. Unfortunately, this lack of understanding will cost federal scientists, and the public who depend on their work, a lot of heartache.
