How a supercomputer is helping AT&T prepare for extreme weather

AT&T has a new climate change risk assessment tool, developed with the help of scientists at Argonne National Laboratory and a great deal of supercomputing power, CNBC reports. The telecom company wants to protect its infrastructure from flooding and extreme weather events, which are expected to become more frequent and severe as climate change continues.

A few years ago, AT&T began thinking about the long-term risks climate change poses to its equipment. The company has cell towers and sites that are vulnerable to flooding, for example, and that may need to be raised above the waterline. In other areas, service relies on aboveground copper lines that could be blown down in a storm and may need to be buried underground as weather patterns change. "We essentially did a deep dive into: what was our long-term plan, and how did it relate to climate change?" says Shannon Carroll, director of environmental sustainability at AT&T.

So the company turned to scientists at Argonne National Laboratory, like Rao Kotamarthi, chief climate scientist in the environmental science division. He and his colleagues used millions of hours of supercomputing time to analyze how the risks of wind and flooding could change in a warmer future. But for the data to be useful, they had to work at a much smaller scale than usual. "Basically, you have to make models at the scale at which the infrastructure exists," Kotamarthi tells The Verge. "The most interesting question people ask us is about scale."

Most climate models operate at a 100-kilometer (62-mile) scale, meaning each data point covers a 100-kilometer swath of, say, North America. That gives you the big picture, but not granular details like what is happening on a particular block. The Argonne team scaled their regional climate model down to 12 kilometers (7.5 miles), and their flood data down to 200 meters (656 feet). That detail is at the heart of AT&T's plans to use the information. "It's all about the resolution," AT&T's Carroll says.
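To get a rough sense of why resolution matters so much, consider how quickly the number of grid cells grows as the spacing shrinks. The short Python sketch below is purely illustrative, not Argonne's actual model: the domain size is an assumed round number for a continent-sized region, and the cell counts are only meant to show the scaling.

```python
# Illustrative sketch (assumed numbers, not Argonne's model): how the number
# of grid cells, and therefore the computational cost, grows as a climate
# model's horizontal resolution is refined.

DOMAIN_KM = 5_000  # assumed width/height of a roughly continent-sized square domain


def cell_count(resolution_km: float, domain_km: float = DOMAIN_KM) -> int:
    """Number of grid cells needed to tile the square domain at a given resolution."""
    cells_per_side = domain_km / resolution_km
    return int(cells_per_side ** 2)


for label, res_km in [("typical global model", 100.0),
                      ("Argonne regional model", 12.0),
                      ("flood data", 0.2)]:
    print(f"{label:>24}: {res_km:>6} km grid -> {cell_count(res_km):,} cells")

# Roughly 2,500 cells at 100 km, ~170,000 at 12 km, and ~625,000,000 at 200 m:
# a crude picture of why finer resolution demands far more supercomputing time.
```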

Analyzing climate data at such a fine scale takes a lot of time and computing power, which makes it expensive. "We estimate it would have taken about 80 million hours on the parallel processors of the supercomputer at Argonne National Laboratory," Kotamarthi says.
