The Power of Truth® has been released for sale and assignment to a conservative pro-American news outlet, cable network, or other media outlet that wants to define and brand its operation as the bearer of the truth, and set itself above the competition.

In every news story, the audience hears about censorship, free speech and the truth. The Power of Truth® has significant value to define an outlet and expand its audience. A growing media outlet may decide to rebrand its operation as The Power of Truth®. An established outlet may choose to make it the slogan that distinguishes its operation from the competition. You want people to think of your outlet when they hear the slogan, and to think of the slogan when they see your company name. It answers the consumer’s questions: Why should I choose you? Why should I listen to you? Think:

  • What’s in your wallet? – Capital One
  • The most trusted name in news – CNN
  • Fair and balanced – Fox News
  • Where’s the beef? – Wendy’s
  • You’re in good hands – Allstate
  • The ultimate driving machine – BMW

The Power of Truth® is registered at the federal trademark level in all applicable trademark classes, and the sale and assignment includes the applicable domain names. The buyer will have both the trademark and the domains so that it will control its business landscape without downrange interference.

Contact: Truth@ThePowerOfTruth.com

Single-use plastics. Anton Petrus/Moment

Hundreds of millions of tons of single-use plastic end up in landfills every year, and even the small percentage of plastic that gets recycled can’t last forever. But our group of materials scientists has developed a new method for creating and deconstructing polymers that could lead to more easily recycled plastics – ones that don’t require you to carefully sort out all your recycling on trash day.

In the century since their conception, people have come to understand the enormous impacts – beneficial as well as detrimental – plastics have on human lives and the environment. As a group of polymer scientists dedicated to inventing sustainable solutions for real-world problems, we set out to tackle this issue by rethinking the way polymers are designed and making plastics with recyclability built right in.

Why use plastics, anyway?

Everyday items including milk jugs, grocery bags, takeout containers and even ropes are made from a class of polymers called polyolefins. Polyolefins make up around half of the plastics produced and disposed of every year.

These polymers are used in plastics commonly labeled as HDPE, LLDPE or PP, or by their recycling codes #2, #4 and #5, respectively. These plastics are incredibly durable because the chemical bonds that make them up are extremely stable. But in a world set up for single-use consumption, this is no longer a design feature but rather a design flaw.

Imagine if half of the plastics used today were recyclable by twice as many processes as they are now. While that wouldn’t get the recycling rate to 100%, a jump from single digits – currently around 9% – to double digits would make a big dent in the amount of plastic produced and accumulating in the environment, and it would expand the capacity for recycling and reuse.

Recycling methods we already have

Even the plastics that make it to a recycling facility can’t be reused in exactly the same way they were used before – the recycling process degrades the material, so it loses utility and value. Instead of making a plastic cup that is downgraded each time it gets recycled, manufacturers could potentially make plastics once, collect them and reuse them on and on.

Conventional recycling requires careful sorting of all the collected materials, which can be hard with so many different plastics. Here in the U.S., collection happens mainly through single-stream recycling – everything from metal cans and glass bottles to cardboard boxes and plastic cups ends up in the same bin. Separating paper from metal doesn’t require complex technology, but sorting a polypropylene container from a polyethylene milk jug is hard to do without the occasional mistake.

Two workers, in bright yellow, stand at a conveyor belt covered in plastics in a recycling facility.
Recycling workers sort through materials. AP Photo/Mark Gillispie

When two different plastics are mixed together during recycling, their useful properties are hugely reduced – to the point of making them useless.

But say you can recycle one of these plastics by a different method, so it doesn’t end up contaminating the recycling stream. When we mixed samples of polypropylene with a polymer we made, we were still able to depolymerize – or break down the material – and regain our building blocks without chemically affecting the polypropylene. This indicated that a contaminated waste stream could still recover its value, and the material in it could go on to be recycled, either mechanically or chemically.

Plastics we need – but more recyclable

In a study published in October 2023, our team developed a series of polymers with only two simple building blocks – one soft polymer and one hard polymer – that mimicked polyolefins but could also be chemically recycled.

Connecting two different polymers together multiple times until they form a single, long molecule creates what’s called a multiblock polymer. Just by adjusting how much of each polymer type goes into the multiblock polymer, our team created a wide range of materials with properties that spanned across polyolefin types. But creating these multiblock polymers is easier said than done.
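
To picture how composition tuning could work, here is a minimal sketch that assumes a naive linear mixing rule between a soft and a hard building block. The property values and the `blend_modulus` function are hypothetical illustrations of the idea, not the model or data from the study.

```python
# Hypothetical illustration: a naive linear mixing rule between two building blocks.
# The stiffness values and the linear assumption are NOT from the study; real
# multiblock polymers behave in more complex ways.

SOFT_MODULUS_MPA = 10.0    # assumed stiffness of the soft block (illustrative)
HARD_MODULUS_MPA = 1000.0  # assumed stiffness of the hard block (illustrative)

def blend_modulus(hard_fraction: float) -> float:
    """Estimate the stiffness of a multiblock polymer from its hard-block fraction."""
    if not 0.0 <= hard_fraction <= 1.0:
        raise ValueError("hard_fraction must be between 0 and 1")
    return (1 - hard_fraction) * SOFT_MODULUS_MPA + hard_fraction * HARD_MODULUS_MPA

# Sweep compositions to mimic a range of materials from pliable to stiff.
for fraction in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"{fraction:.0%} hard block -> ~{blend_modulus(fraction):.0f} MPa")
```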

To link these hard and soft polymers, we adapted a technique that had previously been used only on very small molecules. It improves on traditional methods, developed in the 1920s, that build polymers in a step-by-step fashion and require the reactive groups on the ends of the molecules to be exactly matched.

In our method, the reactive groups are the same on every building block, so we didn’t have to worry about matching up the ends – and we could still make polymers that compete with the polyolefins we already use. Applying the same strategy in reverse, by adding hydrogen, we could disconnect the polymers back into their building blocks and easily separate them to use again.

A graph shows a steady increase in single-use plastic use across all plastic types shown, extending into projections for 2050.
Realized and predicted production of commodity plastics through 2050. International Energy Agency

With an almost twofold increase in annual plastic use projected through 2050, the complexity and quantity of plastic recycling will only increase. It’s an important consideration when designing new materials and products.

Using just two building blocks to make plastics that have a huge variety of properties can go a long way toward reducing and streamlining the number of different plastics used to make the products we need. Instead of needing one plastic to make something pliable, another for something stiff, and a third, fourth and fifth for properties in between, we could control the behavior of plastics by just changing how much of each building block is there.

Although we’re still in the process of answering some big questions about these polymers, we believe this work is a step in the right direction toward more sustainable plastics.

We were able to create materials that mimic the properties of plastics the world relies on, and our sights are now set on creating plastic compositions that couldn’t be made with existing methods.

The Conversation

Katherine Harry receives funding from RePLACE (Redesigning Polymers to Leverage a Circular Economy) funded by the Office of Science of the US Department of Energy.

Emma Rettner receives funding from RePLACE (Redesigning Polymers to Leverage a Circular Economy) funded by the Office of Science of the US Department of Energy.

Read more … New class of recyclable polymer materials could one day help reduce single-use plastic waste

Rocky Mountain fires leave telltale ash layers in nearby lakes like this one. Philip Higuera

Strong winds blew across mountain slopes after a record-setting warm, dry summer. Small fires began to blow up into huge conflagrations. Towns in crisis scrambled to escape as fires bore down.

This could describe any number of recent events, in places as disparate as Colorado, California, Canada and Hawaii. But this fire disaster happened over 110 years ago in the Northern Rocky Mountains of Idaho and Montana.

The “Big Burn” of 1910 still holds the record for the largest fire season in the Northern Rockies. Hundreds of fires burned over 3 million acres – roughly the size of Connecticut – most in just two days. The fires destroyed towns, killed 86 people and galvanized public policies committed to putting out every fire.

A black and white photo from 1910 shows rail lines and the burned shells of buildings
Many residents of Wallace, Idaho, fled on trains ahead of the 1910 blaze. Volunteers who stayed saved part of the town, but about a third of it burned. R.H. McKay/U.S. Forest Service archive, CC BY

Today, as the climate warms, fire seasons like the one in 1910 are becoming more likely. The 2020 fire season was an example. But are extreme fire seasons like these really that unusual in the context of history? And, when fire activity begins to surpass anything experienced in thousands of years – as research suggests is happening in the Southern Rockies – what will happen to the forests?

As paleoecologists, we study how and why ecosystems changed in the past. In a multiyear project, highlighted in two new publications, we tracked how often forest fires occurred in high-elevation forests in the Rocky Mountains over the past 2,500 years, how those fires varied with the climate and how they affected ecosystems. This long view provides both hopeful and concerning lessons for making sense of today’s extreme fire events and impacts on forests.

Lakes record history going back millennia

When a high-elevation forest burns, fires consume tree needles and small branches, killing most trees and lofting charcoal into the air. Some of that charcoal lands on lakes and sinks to the bottom, where it is preserved in layers as sediment accumulates.

After the fire, trees regrow and also leave evidence of their existence in the form of pollen grains that fall on the lake and sink to the bottom.

By extracting a tube of those lake sediments, like a straw pushed into a layer cake from above, we were able to measure the amounts of charcoal and pollen in each layer and reconstruct the history of fire and forest recovery around a dozen lakes across the footprint of the 1910 fires.

A woman sitting in an inflatable boat, wearing a life jacket, holds a long tube filled with lake-bottom sediment.
Author Kyra Clark-Wolf holds a sediment core pulled from a lake containing evidence of fires over thousands of years. Philip Higuera
Long tubes of lake floor sediment are opened on a table.
Researchers at the University of Montana examine a sediment core from a high-elevation lake in the Rocky Mountains. Each core is sliced into half-centimeter sections, reflecting around 10 years each, and variations in charcoal within the core are used to reconstruct a timeline of past wildfires. University of Montana
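
The caption above implies roughly 10 years per half-centimeter slice. As a minimal, hypothetical sketch of the resulting bookkeeping, the snippet below assumes a constant sedimentation rate; real lake-sediment chronologies are anchored with radiocarbon and lead-210 dating rather than a fixed rate.

```python
# Hypothetical age-depth conversion assuming a constant sedimentation rate:
# each 0.5 cm slice of the core represents roughly 10 years (per the caption above).
# Real chronologies are built with radiometric dating, not a fixed rate.

YEARS_PER_SLICE = 10       # approximate years recorded in one slice
SLICE_THICKNESS_CM = 0.5   # thickness of one slice

def slice_age_range(slice_index: int, core_top_year: int = 2023):
    """Return (youngest, oldest) approximate calendar years for a slice, counted from the core top."""
    youngest = core_top_year - slice_index * YEARS_PER_SLICE
    oldest = youngest - YEARS_PER_SLICE
    return youngest, oldest

# Reaching back 2,500 years takes about 250 slices, i.e. roughly 1.25 meters of sediment.
slices_for_record = 2500 // YEARS_PER_SLICE                 # 250 slices
core_length_cm = slices_for_record * SLICE_THICKNESS_CM     # 125.0 cm
print(slice_age_range(slices_for_record), core_length_cm)   # (-477, -487) 125.0
```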

Lessons from Rockies’ long history with fire

The lake sediments revealed that high-elevation, or subalpine, forests in the Northern Rockies in Montana and Idaho have consistently bounced back after fires, even during periods of drier climate and more frequent burning than we saw in the 20th century.

High-elevation forests burn only about once every 100 to 250 years or more, on average. We found that the amount of burning in subalpine forests of the Northern Rockies over the 20th and 21st centuries remained within the bounds of what those forests experienced over the previous 2,500 years. Even today, the Northern Rockies show resilience to wildfires, including early signs of recovery after extensive fires in 2017.

Three illustrated charts show forest density increasing and time between fires falling over the past 4,800 years at one location.
Long-term changes in climate, forest density and fire frequency over the past 4,800 years in one high-elevation forest in the Northern Rockies, reconstructed from lake sediments. The red dots reflect timing of past fires. Kyra Clark-Wolf

But similar research in high-elevation forests of the Southern Rockies in Colorado and Wyoming tells a different story.

The record-setting 2020 fire season, with three of Colorado’s largest fires, helped push the rate of burning in high-elevation forests in Colorado and Wyoming into uncharted territory relative to the past 2,000 years.

Climate change is also having bigger impacts on whether and how forests recover after wildfires in warmer, drier regions of the West, including the Southern Rockies, the Southwest and California. When fires are followed by especially warm, dry summers, seedlings can’t establish and forests struggle to regenerate. In some places, shrubby or grassy vegetation replaces trees altogether.

Graphs show fire activity rising with temperature over time.
Fire history reconstructions from 20 high-elevation lakes in the Southern Rockies show that historically, fires burned every 230 years on average. That has increased significantly in the 21st century. Philip Higuera, CC BY-ND

Changes happening now in the Southern Rockies could serve as an early warning for what to expect further down the road in the Northern Rockies.

Warmer climate, greater fire activity, higher risks

Looking back thousands of years, it’s hard to ignore the consistent links between the climate and the prevalence of wildfires.

Warmer, drier springs and summers load the dice to make extensive fire seasons more likely. This was the case in 1910 in the Northern Rockies and in 2020 in the Southern Rockies.

When, where and how climate change will push the rate of burning in the rest of the Rockies into uncharted territory is harder to anticipate. The difference between 1910 and 2020 was that 1910 was followed by decades with low fire activity, whereas 2020 was part of an overall trend of increasing fire activity linked with global warming. Just one fire like 1910’s Big Burn in the coming decades, in the context of 21st-century fire activity, would push the Northern Rockies beyond any known records.

A tiny pine seedling in a vast landscape of burned trees and soil.
A lodgepole pine tree seedling begins to grow one year after the October 2020 East Troublesome Fire in Rocky Mountain National Park. Recovery in high-elevation forests takes decades. Philip Higuera

Lessons from the long view

The clock is ticking.

Extreme wildfires will become more and more likely as the climate warms, and it will be harder for forests to recover. Human activity is also raising the risk of fires starting.

The Big Burn of 1910 left a lasting impression because of the devastating impacts on lives and homes and, as in the 2020 fire season and many other recent fire disasters, because of the role humans played in igniting them.

Photo shows burned trees across miles of hillsides along a railroad line
The aftermath of the 1910 fire near the North Fork of the St. Joe River in the Coeur d’Alene National Forest, Idaho. R.H. McCoy/U.S. Forest Service archive, CC BY

Accidental ignitions – from downed power lines, escaped campfires, dragging chains, railroads – expand when and where fires occur, and they lead to the majority of homes lost to fires. The fire that destroyed Lahaina, Hawaii, is the most recent example.

So what can we do?

Curbing greenhouse gas emissions from vehicles, power plants and other sources can help slow warming and the impacts of climate change on wildfires, ecosystems and communities. Forest thinning and prescribed burns can alter how forests burn, protecting humans and minimizing the most severe ecological impacts.

Reframing the challenge of living with wildfire – building with fire-resistant materials, reducing accidental ignitions and increasing preparedness for extreme events – can help minimize damage while maintaining the critical role that fires have played in forests across the Rocky Mountains for millennia.

The Conversation

Kyra Clark-Wolf has received funding from the National Science Foundation and the Joint Fire Science Program.

Philip Higuera receives funding from the National Science Foundation, United States Geological Survey, and Joint Fire Science Program.

Read more … What 2,500 years of wildfire evidence tells us about the future of fires in the West

A large-scale battery storage system in Long Beach, Calif., provides renewable electricity during peak demand periods. Patrick T. Fallon/AFP via Getty Images

After nearly two decades of stagnation, U.S. electricity demand is surging, driven by growing numbers of electric cars, data centers and air conditioners in a warming climate. But traditional power plants that generate electricity from coal, natural gas or nuclear energy are retiring faster than new ones are being built in this country. Most new supply is coming from wind and solar farms, whose output varies with the weather.

That’s left power companies seeking new ways to balance supply and demand. One option they’re turning to is virtual power plants.

These aren’t massive facilities generating electricity at a single site. Rather, they are aggregations of electricity producers, consumers and storers – collectively known as distributed energy resources – that grid managers can call on as needed.

Some of these sources, such as batteries, may deliver stored electric power. Others may be big electricity consumers, such as factories, whose owners have agreed to cut back their power use when demand is high, freeing up energy for other customers. Virtual power sources typically are quicker to site and build, and can be cleaner and cheaper to operate, than new power plants.

Virtual power plants are more resilient against service outages than large, centralized generating stations because they distribute energy resources across large areas.

A growing resource

Virtual power plants aren’t new. The U.S. Department of Energy estimates that there are already 30 to 60 gigawatts of them in operation today. A gigawatt is 1 billion watts – roughly the output of 2.5 million solar photovoltaic panels or one large nuclear reactor.
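
For a quick back-of-the-envelope check on that comparison, the sketch below assumes a roughly 400-watt solar panel, a figure not given in the article; treat it as an illustration of the unit conversion rather than an official estimate.

```python
# Rough check of the gigawatt comparison above, assuming ~400 W per solar panel.
# The panel rating is an assumption for illustration; the article does not give one.

GIGAWATT_W = 1_000_000_000  # 1 gigawatt expressed in watts
PANEL_W = 400               # assumed nameplate rating of a single panel

panels_per_gigawatt = GIGAWATT_W / PANEL_W
print(f"~{panels_per_gigawatt / 1e6:.1f} million panels per gigawatt")  # ~2.5 million
```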

Most of these virtual power plants are industrial customers that have agreed to reduce demand when conditions are tight. But as growing numbers of homes and small businesses add rooftop solar panels, batteries and electric cars, these energy customers can become not only consumers but also suppliers of power to the grid.

For example, homeowners can charge up their batteries with rooftop solar when it’s sunny, and discharge power back to the grid in the evening when demand is high and prices sometimes spike.
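
As a rough sketch of that charge-in-the-day, discharge-at-the-peak behavior, the rule below uses made-up battery parameters and an assumed 5-9 p.m. peak window; it illustrates the idea only and is not any utility’s actual control logic.

```python
# Illustrative dispatch rule for a home battery in a virtual power plant:
# store surplus rooftop solar during the day, send power back at the evening peak.
# Capacity, power limit and the peak window are assumptions, and each call
# represents a one-hour step, so kW and kWh are used interchangeably here.

BATTERY_KWH = 13.5          # assumed usable battery capacity
MAX_POWER_KW = 5.0          # assumed charge/discharge power limit
PEAK_HOURS = range(17, 21)  # assumed evening peak window, 5-9 p.m.

def dispatch(hour: int, solar_kw: float, home_load_kw: float, soc_kwh: float) -> float:
    """Return battery power in kW: negative means charging, positive means discharging."""
    surplus = solar_kw - home_load_kw
    if surplus > 0 and soc_kwh < BATTERY_KWH:
        # Sunny hours: store excess solar instead of exporting it right away.
        return -min(surplus, MAX_POWER_KW, BATTERY_KWH - soc_kwh)
    if hour in PEAK_HOURS and soc_kwh > 0:
        # Evening peak: discharge stored energy when demand and prices are high.
        return min(MAX_POWER_KW, soc_kwh)
    return 0.0

print(dispatch(hour=13, solar_kw=4.0, home_load_kw=1.0, soc_kwh=6.0))   # -3.0 (charging)
print(dispatch(hour=18, solar_kw=0.0, home_load_kw=2.0, soc_kwh=10.0))  # 5.0 (discharging)
```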

As smart thermostats, water heaters, rooftop solar panels and batteries enable more customers to participate, DOE estimates that virtual power plants could triple in scale by 2030. That could cover roughly half of the new capacity the U.S. will need to meet growing demand and replace retiring older power plants. This growth would help limit the cost of building new wind and solar farms and gas plants.

And because virtual power plants are located where electricity is consumed, they’ll ease the burden on aging transmission systems that have struggled to add new lines.

A hand points to a lighted electronic panel.
A battery display panel inside a model home in Menifee, Calif., where 200 houses in a development are all-electric, equipped with solar panels and batteries and linked by a microgrid that can power the community during outages. Watchara Phomicinda/MediaNews Group/The Press-Enterprise via Getty Images

New roles for power customers

Virtual power plants scramble the roles of electricity producers and consumers. Traditional power plants generate electricity at central locations and transmit it along power lines to consumers. For the grid to function, supply and demand must be precisely balanced at all times.

Customer demand is typically assumed to be a given that fluctuates with the weather but follows a fairly predictable pattern over the course of a day. To satisfy it, grid operators dispatch a mix of baseload sources that operate continuously, such as coal and nuclear plants, and more flexible sources such as gas and hydropower that can modulate their output quickly as needed.

Output from wind and solar farms rises and falls during the day, so other sources must operate more flexibly to keep supply and demand balanced. Still, the basic idea is that massive facilities produce power for millions of passive consumers.

Virtual power plants upend this model by embracing the fact that consumers can control their electricity demand. Industrial consumers have long found ways to flex their operations, limiting demand when power supplies are tight in return for incentives or discounted rates.

Now, thermostats and water heaters that communicate with the grid can let households modulate their demand too. For example, smart electric water heaters can heat water mostly when power is abundant and cheap, and limit demand when power is scarce.
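
One hedged way to picture that behavior: given a day-ahead price forecast, a smart water heater could simply run during the cheapest hours that still meet the household’s hot-water needs. The prices, heater rating and energy requirement below are invented for illustration and are not drawn from any actual utility program.

```python
# Illustrative scheduling for a grid-connected water heater: run in the cheapest
# hours of a day-ahead price forecast. Prices and the energy requirement are
# invented for this example; real programs respond to utility signals instead.

HEATER_KW = 4.5          # assumed heating element power
DAILY_NEED_KWH = 9.0     # assumed daily hot-water energy requirement

def cheapest_hours(prices_per_kwh: list[float]) -> list[int]:
    """Pick enough of the lowest-price hours to meet the daily energy need."""
    hours_needed = round(DAILY_NEED_KWH / HEATER_KW)  # 2 hours at 4.5 kW
    ranked = sorted(range(len(prices_per_kwh)), key=lambda h: prices_per_kwh[h])
    return sorted(ranked[:hours_needed])

# Toy 24-hour price curve in $/kWh: cheap overnight and at midday, expensive in the evening.
prices = [0.08] * 6 + [0.12] * 6 + [0.06] * 4 + [0.25] * 4 + [0.15] * 4
print(cheapest_hours(prices))  # -> [12, 13]: heat water when power is abundant and cheap
```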

In Vermont, Green Mountain Power is offering its customers incentives to install batteries that will provide power back to the grid when it’s needed most. In Texas, where I live, deadly blackouts in 2021 highlighted the importance of bolstering our isolated power grid. Now, utilities here are using Tesla Powerwalls to help turn homes into virtual power sources. South Australia aims to connect 50,000 homes with solar and batteries to build Australia’s largest virtual power plant.

People wait at a propane gas station, bundled in heavy clothes.
People line up to refill propane tanks in Houston after a severe winter storm caused electricity blackouts and a catastrophic failure of Texas’ power grid in February 2021. Go Nakamura/Getty Images

Virtual power, real challenges

Virtual power plants aren’t a panacea. Many customers are reluctant to give up even temporary control of their thermostats or to accept a delay when charging their electric cars. Some consumers are also concerned about the security and privacy of smart meters. It remains to be seen how many customers will sign up for these emerging programs and how effectively their operators will modulate supply and demand.

There also are challenges at the business end. It’s a lot harder to manage millions of consumers than dozens of power plants. Virtual power plant operators can overcome that challenge by rewarding customers for allowing them to flex their supply and demand in a coordinated fashion.

As electricity demand rises to meet the needs of growing economies and replace fossil fuel-burning cars and furnaces, and reliance on renewable resources increases, grid managers will need all the flexibility they can get to balance the variable output of wind and solar generation. Virtual power plants could help reshape electric power into an industry that’s more nimble, efficient and responsive to changing conditions and customers’ needs.

The Conversation

Daniel Cohan does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

Read more … What is a virtual power plant? An energy expert explains
