Addressing the increasingly complex problem of wildfires that threaten people, communities, and economic assets requires new ways of thinking and new technologies. That is the premise of Convective Capital, a California-based venture capital firm that invests in companies trying to solve the global wildfire crisis. Jay Ribakove, a Principal at Convective, joins Michael Torrance, Chief Sustainability Officer for BMO, on the latest episode of Sustainability Leaders to discuss some of the most promising fire mitigation and suppression innovations.
No Priors: Artificial Intelligence | Machine Learning | Technology | Startups
This week on No Priors, Sarah and Elad sit down with Bill Clerico, founder of Convective Capital, an early-stage venture fund focused on technology-driven solutions for wildfire mitigation and climate resilience. The wildfires in Los Angeles have caused unprecedented property damage and immense hardship for countless individuals and families. This episode is devoted to understanding what happened and what we can do in the future. Bill shares his insights into the increasing severity of wildfires, the role of policy, and how infrastructure issues, like outdated building codes and underfunded utilities, are contributing to the crisis. They discuss the latest innovations in firefighting technology, from advanced detection to drones, and how these tools can help mitigate future damage.

Show Notes:
0:00 Introduction
1:02 Why are wildfires getting worse?
5:37 Policies and regulatory decisions
10:47 Housing: building codes and permitting
13:19 Key factors in response
16:20 Improving water supply and city infrastructure
19:10 Preventing wildfires
21:26 Underinvestment in California's utilities
26:53 Innovative firefighting technology
29:35 Accelerating Los Angeles' recovery
34:29 Actions homeowners, insurance companies, and governments can take
Elizabeth Victor, senior natural catastrophe specialist for Swiss Re, discusses the increase in convective storms and so-called nonpeak perils, and the impact they are having on the insurance industry.
Radiant Heat vs. Convective Heat for Smokers & Grills: The Ultimate Guide
Every American who has a mortgage is required by their bank to have homeowners insurance, but getting it and keeping it is becoming a challenge. In this episode, hear the highlights of a Senate hearing examining the problems in the homeowners insurance market and why they might lead to much bigger problems next time disaster strikes.

Background Sources

Effects of Climate on Insurance
Christopher Flavelle and Mira Rojanasakul. May 13, 2024. The New York Times.
Chris Van Hollen et al. September 7, 2023. Chris Van Hollen, U.S. Senator for Maryland.
Alice C. Hill. August 17, 2023. Council on Foreign Relations.
Insurance Information Institute.
Antonio Grimaldi et al. November 19, 2020. McKinsey & Company.

Lobbying
OpenSecrets.

Heritage Foundation
SourceWatch.

Demotech
William Rabb. April 15, 2024. Insurance Journal.
Parinitha Sastry et al. December 2023.

Fannie Mae
Adam Hayes. May 17, 2023. Investopedia.

Hurricanes
National Oceanic and Atmospheric Administration.

Audio Sources

Senate Committee on the Budget, June 5, 2024

Witnesses:
Glen Mulready, Insurance Commissioner, State of Oklahoma
Rade Musulin, Principal, Finity Consulting
Dr. Ishita Sen, Assistant Professor of Finance, Harvard Business School
Deborah Wood, Florida Resident
Dr. EJ Antoni, Research Fellow, Heritage Foundation's Grover Hermann Center for the Federal Budget

Clips

23:05 Sen.
Sheldon Whitehouse (D-RI): In 2022 and 2023, more than a dozen insurance companies left the Florida residential market, including national insurers like Farmers. Residents fled to Citizens Property Insurance, the state-backed insurer of last resort, which ballooned from a 4% market share in 2019 to as much as 17% last year. If it has to pay out claims that exceed its reserves, Citizens can levy a surcharge on Florida insurance policyholders across the state. Good luck with that, particularly if the surcharge grows to hundreds or even thousands of dollars. To depopulate its books, Citizens has let private insurers cherry-pick out its least-risk policies. Those private insurers may have problems of their own, as we will hear today.

25:10 Sen. Sheldon Whitehouse (D-RI): The federal budget takes a hit because these insurers and their policies are accepted by Freddie Mac and Fannie Mae, who either own or guarantee a large part of our $12 trillion mortgage market. This all sounds eerily reminiscent of the run-up to the mortgage meltdown of 2008, including a role of potentially captive or not fully responsible rating agencies.

25:45 Sen. Sheldon Whitehouse (D-RI): Florida is far from alone. A New York Times investigation found that the insurance industry lost money on homeowners coverage in 18 states last year, and the states may surprise you. The list includes Illinois, Michigan, Utah, Washington, and Iowa. Insurers in Iowa lost money each of the last four years. This is a signal that hurricanes and earthquakes, once the most prevalent perils, are being rivaled by hail, windstorms, and wildfires.

28:00 Sen. Sheldon Whitehouse (D-RI): This isn't all that complicated. Climate risk makes things uninsurable. No insurance makes things unmortgageable. No mortgages crashes the property markets. Crashed property markets trash the economy.
It all begins with climate risk, and a major party pretending that climate risk isn't real imperils our federal budget and millions of Americans all across the country.

33:45 Sen. Chuck Grassley (R-IA): Insurance premiums are far too high across the board and may increase after the recent storms, including those very storms in my state of Iowa. Climate change isn't the primary driver of insurance rate hikes, and collapse of the insurance industry isn't imminent. Although I'll have to say, Iowa had six property and casualty companies pull out of insuring Iowans. Climate change doesn't explain why auto insurance premiums in 2024 have increased by a whopping 20% year over year. It also doesn't account for the consistent failure of liberal cities to fight crime, which has raised insurance risk and even caused insurers to deny coverage. Expensive liberal policies, not climate change, are much to blame for these market dynamics.

39:00 Sen. Sheldon Whitehouse (D-RI): The first witness is Rade Musulin. Rade is an actuary with 45 years of experience in insurance, specializing in property pricing, natural perils, reinsurance, agriculture, catastrophe risk modeling, public policy development, and climate risk. Specifically, he spent many years working in Florida, including as chair of the Florida Hurricane Catastrophe Fund Advisory Council during the time in which Citizens Property Insurance Corporation was established.

39:35 Sen. Sheldon Whitehouse (D-RI): Our second witness is Dr. Ishita Sen. Dr. Sen is an Assistant Professor at Harvard Business School. Her recent research examines the pricing of property insurance and the interactions between insurance and mortgage markets. This includes the role that institutions and the regulatory landscape play, and the broader consequences for real estate markets, climate adaptation, and our overall financial stability.

40:00 Sen. Sheldon Whitehouse (D-RI): Our third witness is Deb Wood. Ms.
Wood and her husband Dan McGrath are both retired Floridians. They moved to South Florida in 1979 and lived in Broward County, which includes Fort Lauderdale, for 43 years, until skyrocketing insurance premiums became too much. They now reside in Tallahassee, Florida.

40:35 Sen. Chuck Grassley (R-IA): Dr. EJ Antoni is a Research Fellow at the Heritage Foundation Grover M. Hermann Center for the Federal Budget. His research focuses on fiscal and monetary policy, and he previously was an economist at the Texas Public Policy Foundation. Antoni earned his master's and doctoral degrees in economics from Northern Illinois University.

41:10 Sen. Chuck Grassley (R-IA): Commissioner Glen Mulready has served as Oklahoma's 13th Insurance Commissioner and was first elected to this position in 2019. Commissioner Mulready started his insurance career as a broker in 1984, and also served in the Oklahoma State House of Representatives.

42:15 Rade Musulin: Okay. My name is Rade Musulin. I'm an actuary who has extensive experience in natural hazard risks and funding arrangements for the damage and loss they cause. I've worked with many public sector entities on policy responses to the challenges of affordability, availability of insurance, and community resilience. This work included participating in Florida's response to Hurricane Andrew, which included the creation of the Florida Hurricane Catastrophe Fund and Citizens Property Insurance Corporation. The Cat Fund and Citizens can access different forms of funding than traditional insurance companies. Instead of holding sufficient capital or reinsurance before an event to cover the cost of potential losses, both entities use public sources of capital to reduce upfront costs by partially funding losses post-event through bonding and assessments. All property casualty insurance policyholders, whether in Citizens or not, are subject to its assessments.
While the Cat Fund can also assess almost all policies, including automobile, this approach exposes Floridians to debt and repayment if large losses occur, and it subsidizes high-risk policies from the entire population. These pools, others like them in other states, and the NFIP have contributed to rapid development in high-risk areas, driving higher costs in the long run. In Florida, national insurers have reduced their exposure as a significant proportion of the insurance market has moved to Citizens or smaller insurers with limited capital that are heavily dependent on external reinsurance. To date, Florida's system has been successful in meeting its claims obligations, while improvements in building codes have reduced loss exposure. However, for a variety of reasons, including exposure to hurricanes, claims cost inflation, and litigation, Florida's insurance premiums are the highest in the nation, causing significant affordability stress for consumers. According to market research from Bankrate, the average premium for a $300,000 home in Florida is three times the national average, with some areas five times the national average. A major hurricane hitting a densely populated area like Miami could trigger large and long-lasting post-event assessments or even exceed the system's funding capacity. Continued rapid exposure growth and more extreme hurricane losses amplified by climate change will cause increasing stress on the nation's insurance system, which may be felt through solvency issues, non-renewals, growth of government pools, and affordability pressure.

44:55 Rade Musulin: Evidence of increasing risk abounds, including Hurricane Otis in 2023, which rapidly intensified from a tropical storm to a Category 5 hurricane and devastated Acapulco, Mexico, last summer. Water temperatures off Florida exceeded a hundred degrees Fahrenheit last week. As was alluded to earlier, NOAA forecast an extremely active hurricane season for '24.
We've seen losses in the Mid-Atlantic from Sandy, record flooding from Harvey, and extreme devastation from Maria, among others. In coming decades, we must prepare for the possibility of more extreme hurricanes and coastal flooding from Texas to New England.

46:50 Dr. Ishita Sen: Good morning, Senators. I am Ishita Sen, Assistant Professor at Harvard Business School, and my research studies insurance markets. In recent work with co-authors at Columbia University and the Federal Reserve Board, I examine how climate risk creates fiscal and potentially financial instability because of miscalibrated insurer screening standards and repercussions to mortgage markets.

47:15 Dr. Ishita Sen: Insurance is critical to the housing market. Property insurers help households rebuild after disasters by preserving collateral values and reducing the likelihood that a borrower defaults. Insurance directly reduces the risks for mortgage lenders and the Government-Sponsored Enterprises (GSEs) such as Fannie Mae and Freddie Mac. Mortgage lenders therefore require property insurance, and the GSEs only purchase mortgages backed by insurers who meet minimum financial strength ratings, which measure insurer solvency and ability to pay claims. The GSEs accept three main rating agencies: AM Best, S&P, and, more recently, Demotech. To provide an example, Fannie Mae requires insurers to have at least a B rating from AM Best, or at least an A rating from Demotech, to accept a mortgage. Now, despite having this policy in place, we find a dramatic rise in mortgages backed by fragile insurers and show that the GSEs, and therefore the taxpayers, ultimately shoulder a large part of the financial burden. Our research focuses on Florida because of availability of granular insurance market data, and we show that traditional insurers are exiting and the gap is rapidly being filled by insurers rated by Demotech, which has about 60% market share in Florida today.
These insurers are low quality across a range of different financial and operational metrics, and are at a very high risk of becoming insolvent. But despite their risk, these insurers secure high enough ratings to meet the minimum rating requirements set by the GSEs. Our analysis shows that many actually would not be eligible under the methodologies of other rating agencies, implying that in many cases these ratings are inflated and that the GSEs' insurer requirements are miscalibrated.

49:20 Dr. Ishita Sen: We next look at how fragile insurers create mortgage market risks. In the aftermath of Hurricane Irma, homeowners with a policy from one of the insolvent Demotech insurers were significantly more likely to default on their mortgage relative to similar borrowers with policies from stable insurers. This is because insurers that are in financial trouble typically are slower to pay claims or may not pay the full amounts. This implies severe economic hardships for many, many Floridians, despite having expensive insurance coverage in place. However, the pain doesn't just stop there. The financial costs of fragile insurers go well beyond the borders of Florida, because lenders often sell mortgages, for example, to the GSEs, and therefore the risks created by fragile insurers spread from one state to the rest of the financial system through the actions of lenders and rating agencies. In fact, we show two reasons why the GSEs bear a large share of insurance fragility risk. First, lenders strategically securitize mortgages, offloading loans backed by Demotech insurers to the GSEs in order to limit their counterparty risk exposures. Second, lenders do not consider insurer risk during mortgage origination for loans that they can sell to the GSEs, even though they do so for loans that they end up retaining, indicating lax insurer screening standards for loans that can be offloaded to the GSEs.

50:55 Dr.
Ishita Sen: Before I end, I want to leave you with two numbers. Over 90%. That's our estimate of Demotech's market share among loans that are sold to the GSEs. And 25 times more. That's Demotech's insolvency rate relative to AM Best, among the GSE-eligible insurers.

57:15 Glen Mulready: As natural disasters continue to rise, understanding the dynamics of insurance pricing is crucial for both homeowners and policymakers. Homeowners insurance is a fundamental safeguard for what is, for many Americans, their single largest asset. This important coverage protects against financial loss due to damage or destruction of a home and its contents. However, recent years have seen a notable increase in insurance premiums. One significant driver of this rise is convective storms and other severe weather events. Convective storms, which include phenomena like thunderstorms, tornadoes, and hail, have caused substantial damage in various regions. The cost to repair homes and replace belongings after such events has skyrocketed, leading insurance companies to adjust their premiums to cover that increased risk. Beyond convective storms, we've witnessed hurricanes, wildfires, and flooding. These events have not only caused damage, but have also increased the long-term risk profile of many areas. Insurance companies are tasked with managing that risk and have responded by raising premiums to ensure they can cover those potential claims.

58:30 Glen Mulready: Another major factor influencing homeowners insurance premiums is inflation. Inflation affects the cost of building materials, labor, and other expenses related to home repair and reconstruction. As the cost of living increases, so does the cost of claims for insurers. When the price of lumber, steel, and other essential materials goes up, the expense of repairing or rebuilding homes also rises.
Insurance companies must reflect these higher costs in their premiums to maintain financial stability and ensure they can meet those contractual obligations to policyholders.

59:35 Glen Mulready: I believe the most essential aspect of managing insurance premiums is fostering a robust, competitive free market. Competition among insurance companies encourages innovation and efficiency, leading to better pricing and services for consumers. When insurers can properly underwrite and price for risk, they create a more balanced and fair market. This involves using advanced data analytics and modeling techniques to accurately assess the risk levels of different properties. By doing so, insurance companies can offer premiums that reflect the true risk, avoiding excessive charges for low-risk homeowners and ensuring high-risk properties are adequately covered. Regulation also plays a crucial role in maintaining a healthy insurance market. Policymakers must strike a balance between consumer protection and allowing insurers the freedom and flexibility to adjust their pricing based on the risk. Overly stringent regulations can stifle competition and lead to market exits, reducing choices for consumers. We've seen this play out most recently in another state, where artificial caps were put in place on premium increases that worked well for consumers in the short term, but then, one by one, all of the major insurers began announcing they would cease to write any new homeowners insurance in that state. These are all private companies, and if there's not the freedom and flexibility to price their products properly, they may have to take drastic steps, as we've seen. Conversely, a well-regulated market encourages transparency and fairness, ensuring that homeowners have access to the most affordable and adequate coverage options.

1:02:00 Dr. EJ Antoni: I'm a public finance economist and the Richard F.
Aster fellow at the Heritage Foundation, where I research fiscal and monetary policy with a particular focus on the Federal Reserve. I am also a senior fellow at the Committee to Unleash Prosperity.

1:02:15 Dr. EJ Antoni: Since January 2021, prices have risen a cumulative 19.3% on average in the American economy. Construction prices for single-family homes have risen much faster, up 30.5% during the same time.

1:03:20 Dr. EJ Antoni: Actuarial tables used in underwriting to estimate risk and future losses, as well as calculate premiums, rely heavily on those input costs. When prices increase radically, precisely as has happened over the last several years, old actuarial tables are of significantly less use when pricing premiums, because they will grossly understate the future cost to the insurer. The sharp increase in total claim costs since 2019 has resulted in billions of dollars of losses for both insurers and reinsurers, prompting large premium increases to stop those losses. This has put significant financial stress on consumers who are already struggling with a cost of living crisis and are now faced with much higher insurance premiums, especially for homeowners insurance.

1:05:10 Dr. EJ Antoni: Claims related to weather events have undoubtedly increased, but it is not due to the climate changing. This is why the insurance and reinsurance markets do not rely heavily on climate modeling when pricing premiums. Furthermore, climate models are inherently subjective, not merely in how the models are constructed, but also by way of the inputs that the modeler uses. In other words, because insufficient data exists to create a predictive model, a human being must make wide-ranging assumptions and add those to the model in place of real-world data. Thus, those models have no predictive value for insurers.

1:07:40 Sen.
Sheldon Whitehouse (D-RI): You say that this combination of demographics, development, and disasters poses a significant risk to our financial system. What do you mean by risk to our financial system?

Rade Musulin: Well, Senator, if you look at the combination, as has been pointed out, of high growth and wealth accumulation in coastal areas, and you look at just what we've observed in the climate, much less what's predicted in the future, there is significant exposure along the coastline from Maine to Texas. In fact, my family's from New Jersey, and there is enormous development on the coast of New Jersey. And if we start to get major hurricanes coming through those areas, the building codes are probably not up to the same standards they are in Florida. And we could be seeing some significant losses, as I believe was pointed out in the recent Federal Reserve study.

Sen. Sheldon Whitehouse (D-RI): And how does that create risk to the financial system?

Rade Musulin: Well, because it's sort of a set of dominoes. You start with potentially claims issues, with the insurers being stressed and not able to pay claims. You have post-event rate increases, as we've seen in Florida. You could have situations where people cannot secure insurance because they can't afford it; then that affects their mortgage security, and so on and so forth. So there are a number of ways that this could affect the financial system, sir.

Sen. Sheldon Whitehouse (D-RI): Cascading beyond the immediate insurer and becoming a national problem.

Rade Musulin: Well, I would just note, Senator, that in Florida, the real problems started years after we got past Andrew. We got past paying the claims on Andrew, and then the big problems occurred later, when we tried to renew the policies.

1:10:50 Sen. Sheldon Whitehouse (D-RI): And you see in this, and I'm quoting you here, parallels in the 2008 financial crisis. What parallels do you see?

Dr.
Ishita Sen: So just like what happened during the financial crisis, when rating agencies gave out high ratings to pools of mortgages backed by subprime loans, here we have a situation where rating agencies like Demotech are giving out inflated ratings to insurance companies. The end result is sort of the same. There is just too much risk and too many risky mortgages being originated, in this case backed by really low-quality insurers, that are then entering the financial system. And the consequences of that have to be borne by, of course, the homeowners, but also the mortgage owners, the GSEs (Government-Sponsored Enterprises), the lenders, and ultimately the federal and state governments.

Sen. Sheldon Whitehouse (D-RI): You say, and this will be my last question, the fragility of property insurers is an important channel through which climate risk might threaten the stability of mortgage markets and possibly the financial system. What do you mean when you refer to a risk to the financial system?

Dr. Ishita Sen: Well, as I was explaining, if there are large losses that the GSEs face, then those losses have to be plugged by somebody, so the taxpayers. That's one channel through which you've got risk to the financial system. And the GSEs serve as a backstop in the mortgage market. They may not have the ability or capacity to do so in such a scenario, which affects mortgage-backed security prices, which are held by all sorts of financial institutions. So that starts affecting all of these institutions. On the other hand, if you've got a bunch of insurers failing, another channel is that these insurers are among the largest investors in many asset classes, like corporate bonds, equities, and so on. And they may have to dump these securities at inopportune times, and that affects the prices of these securities as well.

1:12:45 Sen. Chuck Grassley (R-IA): Dr.
Antoni, is there any evidence to support the notion that climate change is the greatest threat to the insurance market?

Dr. EJ Antoni: No, Senator, there is not. And part of that has to do, again, with the fact that when we look at the models that are used to predict climate change, we simply don't have enough empirical data to input into those models. And so as a result of that, we have to have human assumptions on what we think is going to happen, based essentially on a guess. And as a result of that, these models really are not of any predictive value, and that's why these models for the last 50 years have been predicting catastrophic outcomes, none of which have come true.

1:14:45 Glen Mulready: This focus on the rating agencies, I would agree with that if that were the be-all end-all. But the state insurance commissioners in each of the 50 states are tasked with the financial solvency of the insurance companies. We do not depend on rating agencies for that. We are doing financial exams on them. We are doing financial analysis every quarter on each one of them. So I would agree if that was the sort of be-all end-all, forgive that phrase, but it's not at all. And we don't depend very much at all on those rating agencies from our standpoint.

1:22:15 Dr. Ishita Sen: On the point about regulators, and rating agencies not being something that we need to look at: I would just point out that in Florida, the Demotech-rated insurers, which by the way have a 20% insolvency rate relative to 0% for traditional insurers, get examined at the same rate as traditional insurers like Farmers and Allstate, which is not something that you would expect if they are more risky. You would expect regulators to come look at them much, much more frequently.
And the risk-based capital requirements that we have currently, which were designed in the 1980s, are just not sensitive enough to new risks like wildfire and hurricanes, and not as well designed for under-diversified insurance companies. All of these insurers were meeting the risk-based capital requirements while at the same time going insolvent at a rate of 20%, and those two things don't really go hand in hand.

1:23:25 Dr. Ishita Sen: Ultimately, what the solution is is of course the main question that we are here to answer, but I would say that it is extremely hard to figure out, in part because we are not in a position right now to even answer some basic facts about how big the problem is, what exactly the numbers look like. For instance, we do not know basic facts about how much coverage people have in different places, or how much they're paying. And when I say we don't know, we don't know this at a granular enough level, because the data does not exist. The first step towards designing any policy would be for us to know exactly how bad the problem is. Then we come up with a solution for that and start to evaluate these different policy responses. Right now we are trying to make policy blindfolded.

1:23:50 Sen. Ron Johnson (R-WI): We've had testimony before this committee that we've already spent $5-6 trillion, that's 5,000 to 6,000 billion dollars, trying to mitigate climate change. We haven't made a dent in it. By their estimates, it's going to cost tens of trillions of dollars every year to reach net zero. So again, this is not the solution for a real problem, which is the broken insurance market. I have enough Wisconsin residents who live on the Gulf Coast in Florida to know, after Hurricane Ian, you've got some real problems in Florida. But fixing climate change isn't the solution.

1:33:15 Sen.
Jeff Merkley (D-OR): In looking at the materials, I saw that Citizens Property Insurance Company, I gather that's Louisiana and Florida, is a completely state-backed program. Well, alright, so if the state becomes the insurer of last resort and it now suffers the same losses that a regular private insurance company is suffering, now the folks in the state are carrying massive debt. So that doesn't seem like a great solution.

Dr. Ishita Sen: That's definitely a problem, right? The problem is, of course, whether the state then has the fiscal capacity to actually withstand a big loss, like a big hurricane season, which is a concern that was raised about Citizens. In such a scenario, in a world where they do not have enough tax revenue, they would have to go into financial markets and try to borrow money, which could be very costly, and so on. So fiscally it's going to be very challenging for many cities, municipalities, counties, and so on.

1:36:40 Sen. Mitt Romney (R-UT): I wish there were something we could do that would reduce the climate change we're seeing and the warming of the planet. But I've seen absolutely nothing proposed by anyone that reduces CO2 emissions, methane gases, and the heating of the planet. Climate change is going to happen because of the development in China and Indonesia and Brazil, and the only thing that actually makes any measurable impact at all is putting a price on carbon, and no one seems to be willing to consider doing that. Everything else that's being talked about on the climate —

Democratic Senator: I got two bills.

Sen. Mitt Romney (R-UT): I know you and I are, but you guys had reconciliation. You could have done it all by yourselves, and you didn't. So the idea that somehow we're going to fix climate and solve the insurance problem is pie in the sky. That's avoiding the reality that we can't fix climate, because that's a global issue, not an American issue. Anyway, let me turn back to insurance.
1:38:30 Sen. Mitt Romney (R-UT): So the question is, what actions can we take? Fiscal reform? Yes, to try and deal with inflation. Except I want to note something, Mr. Antoni, because you're esteemed at the Heritage Foundation: 72% of federal spending is not part of the budget we vote on. So we talk about Biden wants to spend all this.... 72% we don't vote on; we only vote on 28%. Half of that is the military. We Republicans want more military spending, not less. So that means the other 14%, which the Democrats want to expand, there's no way we can reduce the 14% enough to have any impact on the massive deficits we're seeing. So there's going to have to be a broader analysis of what we have to do to rein in our fiscal challenges. I just want to underscore that. I would say a second thing we can do, besides fiscal reform and dealing with inflation, is stopping subsidizing high-risk areas. Basically subsidizing people to build expensive places along the coast and in places that are at risk of wildfire. We subsidize that, and that creates huge financial risk to the system. And finally, mitigation of one kind or another. That's the other thing we can do: all sorts of mitigation, forestry management, having people move to places that are not high risk. But if you want to live in a big house on the coast, you're gonna have to spend a lot of money to insure it or take huge risk. That's just the reality. So those are the three I come up with: stop the subsidy, mitigation, and fiscal reform. What else am I missing, Mr. Musulin? And I'm just going to go down the line for those that are sort of in this area to give me your perspectives.

Rade Musulin: Well, thank you, Senator. And I'd agree with all those things. And I'd also add that we need to start thinking about future-proofing our building codes and land use policies. The sea levels are rising.
If you're going to build a house that's supposed to last 75 years, you ought to be thinking about the climate in 75 years when you give somebody a permit to build there. So I'd say that's important. I'd also say that large disasters also drive inflation, because they put more pressure and demand on labor and materials. More disasters mean supplies that could have been used to build new homes for Americans are diverted to rebuilding homes lost in the past. So certainly doing things to reduce the vulnerability of properties and improve their resilience is important. And I do think, sir, that there are things we can do about climate change over periods of decades that can make a difference in the long run. Thank you. Sen. Mitt Romney (R-UT): Thank you. Yes. Dr. Ishita Sen: So before that, the one point about inflation that we are missing: without doubt it is a contributing factor, but the US has had inflation in the past without such an acute crisis in insurance markets. So whether that is the biggest cause or not is up for debate. I don't think we have reached a conclusion on inflation being the biggest contributor to rising insurance costs. Sen. Mitt Romney (R-UT): It's just a big one. You'd agree it's a big one? Dr. Ishita Sen: I agree, it's a big one, but I wouldn't say it's the biggest one. In terms of policy solutions, I completely agree with you: we need to stop subsidizing building in high-risk areas. That's definitely one of the things we need to do. Mitigation, another point that you bring up. And on that, I would say not only do we need to harden our homes, but we also need to harden our financial institutions, our banks, and our insurance companies in order to make them withstand the really large climate shocks that are for sure coming their way. Sen. Mitt Romney (R-UT): Thank you, Ms. Wood. I'm going to let you pass on this just because that's not your area of expertise. Your experience was something which focused our thinking today. Mr.
Mulready. Glen Mulready: Thank you, Senator. I would say amen to your comments, but I'll give you three quick things. Number one, FEMA has a survey out that states that every $1 spent in mitigation saves $6 in lost claims. It pays off. Number two, unfortunately, a lot of communities have to have a disaster happen. In Moore, Oklahoma, back a dozen years ago, an EF5 tornado hit; it was just totally devastating. After that, the city of Moore changed their zoning and their building codes. And then third, the city of Tulsa, back in the eighties, had horrible flooding happen. So they invested over decades in infrastructure to prevent flooding. Now we're one of only two communities in the country that are Class 1 NFIP-rated. 1:45:40 Sen. Chris Van Hollen (D-MD): One way to address this, and I think it was discussed in a different manner, is the need to get the data and to get consensus on where the risks lie, which is why last year Senator Whitehouse, Senator Warren and I sent a letter to the Treasury Department, to the Federal Insurance Office (FIO), urging them to collect information from different states. I'm a supporter of a state-based insurance system for property and casualty insurance, but I do think it would benefit all of us to have a sort of national yardstick against which we can measure what's happening. So Dr. Sen, could you talk a little bit about the benefit of having a common source of insurance data through the FIO and how that could benefit state regulators and benefit all of us? Dr. Ishita Sen: Yeah, absolutely. Thanks for bringing that up. That's of first-order importance, I think, because we don't even know the basic facts about this problem at a granular enough level.
The risks here are local, and so we need to know what's going on zip code by zip code, census tract by census tract. And for regulators to be able to figure out exactly how much risk is sitting with each of these insurance companies, they need to know how many policies they're writing, what type of coverage they're selling, and what the cancellations look like in different zip codes. Only then can they figure out exactly how exposed these different insurers are, and then they can start designing policy: do the risk-based capital ratios look alright or not, or should we put a surcharge on wildfires or hurricanes, and so on? And we do need a comprehensive picture. We just can't have a particular state regulator look at the risks in that state alone, because of course the insurer is selling insurance all over the country, and we need to get a comprehensive picture of all of that. 1:47:40 Sen. Chris Van Hollen: I appreciate that. I gather that the Treasury Department is getting some resistance from some state insurance regulators. I hope we can overcome that, because I'm not sure why anyone would want to deny the American people the benefit of the facts here. 1:48:45 Rade Musulin: I will just note that sometimes climate change itself can contribute to the inflation we've been talking about. For example, there were beetle infestations and droughts and fires in Canada, which decimated some of the lumber crop and led to a fivefold increase in the cost of lumber a few years ago. So some of this claims inflation is actually related to climate change, and I think we need to address that. 1:49:35 Glen Mulready: If you didn't know, the NAIC, the National Association of Insurance Commissioners, is in the midst of a data collection right now that will collect that data for at least 80% of the homeowner's market. And we have an agreement with the FIO (Federal Insurance Office) to be sharing that data with them.
They originally came to us, I got a letter from FIO, and they were requesting data that we did not actually collect at the zip code level, and they had a very stringent timeline for that. So my response, it wasn't no, it was just, look, we can't meet that timeline. We don't collect that today. We can in the future. But from that is where this has grown, the data call by the NAIC. Sen. Chris Van Hollen (D-MD): So I appreciate, I saw that there had been now this effort on behalf of the.... So has this now been worked out? Are there any states that are objecting, to your knowledge at this point in time, in terms of sharing data? Glen Mulready: I don't know about specific states. We will be collecting data that will represent at least 80% of the market share.
As Nick and I tackle the complexities of a Fableist 373 Cabernet Sauvignon, we find ourselves diving into more than just the notes and finishes of our chosen vino. We discuss the stormy relationship between climate and the insurance industry, where hailstorms now rival hurricanes in their capacity for chaos. The episode uncorks discussions on the roofing innovations that could stand up to Mother Nature's wrath, and the financial fermentations represented by Aon's and NFP's merger, all while considering the potent blend of climate on the very grapes in our glass. Every good story has its twists, and the leadership shake-up at Lockton Insurance is no exception. With the departure of CEO Peter Clune, we probe the possible undercurrents of this significant change and speculate on the future direction of the world's 10th largest brokerage. And amidst these seismic shifts, we can't help but address the forecast for a hurricane season that promises to be as memorable as the Cabernet on our lips - a swirl of meteorological predictions and legislative changes, like Florida's stance on litigation funding, that could redefine the landscape of the legal system. Our conversation may start with wine, but it's the tales of trial attorneys and the unexpected round of February golf in Minneapolis that add flavor to our discussion. We admire the finesse of courtroom combatants and their ability to weave narrative and knowledge into a compelling argument. And, as we raise a glass to the quirky intersection of legal drama with pop culture - from 'My Cousin Vinny' to 'Catch Me If You Can' - we toast to the shared moments that keep us connected, to both the tales we tell and the trials we withstand. So, join us, glass in hand, for an episode that pairs life's unexpected twists with the timeless pursuit of a good story. 
Timestamps 0:00 Today's wine: Fableist 373 The Ant and the Cicada 4:11 Today's topics 4:56 Hail storms in the US and roof materials appropriate for insurance 10:23 Current state of AON's purchase of NFP and Point Capital's purchase of Truist 13:45 Zurich adds $300m top cat layer for Europe to 2024 reinsurance arrangements 16:11 Lockton CEO Peter Clune exits the company and will be replaced by Ron Lockton 21:33 AccuWeather Warns of Potential ‘Super-Charged' 2024 Hurricane Season 26:17 Florida pushing against litigation funding 30:52 Talking about attorneys, bars, and Curb Connect with RiskCellar: Website: https://www.riskcellar.com/ Brandon Schuh: Facebook: https://www.facebook.com/profile.php?id=61552710523314 LinkedIn: https://www.linkedin.com/in/brandon-stephen-schuh/ Instagram: https://www.instagram.com/schuhpapa/ Nick Hartmann: LinkedIn: https://www.linkedin.com/in/nickjhartmann/
We discuss the severe thunderstorms' rising insured costs. Plus: earnings weakness in the Medicare Advantage business and US commercial real estate exposure of German banks.
Speakers: Bernhard Held, VP-Sr Credit Officer, Moody's Investors Service; Dean Ungar, VP-Sr Credit Officer, Moody's Investors Service; Evelyn Ocas Salazar, AVP-Analyst, Moody's Investors Service; Juergen Grieser, Sr Dir Mgr-Analytics & Modeling, Moody's RMS
Hosts: Danielle Reed, VP – Senior Research Writer, Moody's Investors Service; Myles Neligan, VP – Senior Research Writer, Moody's Investors Service
Related Research:
Property & Casualty Insurance – US: Battered by convective storms, carriers seek higher prices, advanced models
Health Insurance – US: Medicare Advantage continues to grow, but its profitability is weakening
Banks – Europe: Most EU banks have limited direct exposure to US CRE, German banks are more exposed
Meteorologist Mark Bove, senior vice president of natural catastrophe solutions, Munich Re, discusses what the greater intensity of storms is doing to the insurance industry, and what can be done to mitigate the losses.
This past week, there was plenty of severe weather: strong thunderstorms turned into violent supercells, many spawning tornadoes, large hailstones, and gusty winds. The Plains, Southeast, and mid-Atlantic all experienced this connected system of strong, severe weather and storms; but what led to this, and did we know it was coming? Explore the atmospheric setup and continental connections that allowed this two-day series of storms to occur! Links you may find useful or interesting: National Weather Service Storm Prediction Center's Day 1 Convective Outlook for June 15 (referred to in episode) Radar Signature and Mesoscale Discussion surrounding Perryton, Texas on June 15 (referred to in episode) Public Severe Weather Outlook indicating Dangers in Plains on June 15 (referred to in episode) Drone Footage of Perryton, Texas Tornado Destruction (Weather Channel) The Weather Review is a podcast devoted to exploring the scientific and real-world connections of recent weather news, diving deeper than your typical forecast or storm chaser logs. Please note this podcast is a work-in-progress experiment. There are many flaws in the planning, production, and posting of it. Each episode allows for more experimental attempts at understanding how to do this and allows for this podcast to improve as time goes on. Any feedback is extremely valuable and welcomed!
Proactive planning and use of various weather tools can help operators navigate around convective weather. The post Podcast: Navigating Through Convective Weather appeared first on NBAA - National Business Aviation Association.
Today, we have two guests, Sonia Kastner, Founder and CEO at Pano, and Bill Clerico, Founder and Managing Partner at Convective Capital. Both Sonia and Bill are key voices in the emerging category of fire tech and in the subcategory of climate tech that's referred to as adaptation solutions: technologies that can help deliver resiliency in the face of an increasingly unstable planet. At Pano, Sonia is developing technology that creates actionable intelligence for wildfire management. They're deploying a network of high-definition cameras across our forests to help generate faster and more informed fire response. At Convective Capital, Bill is investing in technology startups that are solving the problem of extreme wildfires, including Pano. Cody, Sonia, and Bill dive into the issue of wildfires, how and why they've grown in severity, the traditional response mechanisms that fire agencies have used and how that's changing, what types of technologies are being developed to support their efforts, and of course, some details about Pano's product offering. We also touch on the talent that's flowing into fire tech and how critical it is for us to continue to fund and develop new ways to adapt to a changing planet, try as we might in parallel to rein in the emissions and trapped heat that are causing climate change.
In this episode, we cover: [3:00] Sonia's background and catalyst for working in climate adaptation at Pano [5:05] Bill's background in FinTech and inspiration to start Convective Capital [7:33] The mega wildfire crisis today and trends over the last two decades [11:54] Universal factors contributing to wildfires across different geographies [14:28] Solutions to wildfires including Pano's technology [16:49] An overview of firefighting today, early detection, and rapid initial attack [21:09] How suppression efforts could change based on fire characteristics and the need for collaboration [24:58] Challenges of building a tech company in the wilderness [27:37] How Pano is leveraging Starlink to create solutions for their customers [29:14] An overview of the company's physical product and buyers [31:52] How Convective Capital approaches companies like Pano who sell primarily to fire agencies [34:27] How organizations like CAL FIRE are changing their approach to work with tech companies [36:19] Skills needed and where talent is coming from [38:40] What's next for Pano and Convective Capital
Get connected: Cody Simms Twitter / LinkedIn; Sonia Kastner / Pano; Bill Clerico / Convective Capital; MCJ Podcast / Collective
*You can also reach us via email at info@mcjcollective.com, where we encourage you to share your feedback on episodes and suggestions for future topics or guests. Episode recorded on January 12, 2023.
Bill Clerico, Founder and Managing Partner at Convective Capital, and Co-founder and CEO of WePay, joins the show to talk about:
Building companies in a recession and Bill's model for evaluating investment opportunities.
How Bill and his team uncovered PM fit for WePay, and the signals that helped WePay make the transition from a consumer to B2B company.
The current state of the payments industry and picking categories that are less evolved for disruption.
Hiring the right profile of employee for the right stage of your venture.
Bill's advice for navigating the fundraising market over the next 18 months as a startup founder, and why more founders should be building in climate, and more specifically, fire tech.
J+M are back for another edition of VC Sunday School. This week, they discuss momentum investing in venture capital! (1:35) Then, Molly interviews Convective Capital's Bill Clerico on investing in firetech startups. (27:58) (0:00) Molly tees up Sunday's segments! (1:35) VC Sunday School: Momentum investing: fair strategy, or sign of a bubble? (12:56) OpenPhone - Get an extra 20% off any plan for your first 6 months at https://openphone.com/twist (14:30) Contrarian investing in VC (26:28) Microsoft for Startups Founders Hub - Apply in 5 minutes, no funding required, sign up at http://aka.ms/thisweekinstartups (27:58) Molly is joined by Convective Capital's Bill Clerico and talks about investing solely in firetech startups (35:30) Bill breaks down the different types of firetech startups that he's investing in FOLLOW Bill: https://twitter.com/billclerico FOLLOW Jason: https://linktr.ee/calacanis FOLLOW Molly: https://twitter.com/mollywood Subscribe to our YouTube to watch all full episodes: https://www.youtube.com/channel/UCkkhmBWfS7pILYIk0izkc3A?sub_confirmation=1
Global MHD simulations of the solar convective zone using a volleyball mesh decomposition I. Pilot, by Andrius Popovas et al., on Monday 21 November. Solar modelling has long been split into "internal" and "surface" modelling, because of the lack of tools to connect the very different scales in space and time, as well as the widely different environments and dominating physical effects involved. Significant efforts have recently been put into resolving this disconnect. We address the outstanding bottlenecks in connecting internal convection zone and dynamo simulations to the surface of the Sun, and conduct a proof-of-concept high-resolution global simulation of the convection zone of the Sun, using the task-based DISPATCH code framework. We present a new 'volleyball' mesh decomposition, which has Cartesian patches tessellated on a sphere with no singularities. We use our new entropy-based HLLS approximate Riemann solver to model magnetohydrodynamics in a global simulation, ranging between 0.655 -- 0.995 $R_\odot$, with an initial ambient magnetic field set to 0.1 Gauss. The simulations develop convective motions with complex, turbulent structures. Small-scale dynamo action twists the ambient magnetic field and locally amplifies magnetic field magnitudes by more than two orders of magnitude within the initial run-time. arXiv: http://arxiv.org/abs/2211.09564v1
Venus boundary layer dynamics: eolian transport and convective vortex, by Maxence Lefèvre, on Tuesday 18 October. Few spacecraft have studied the dynamics of Venus' deep atmosphere, which is needed to understand the interactions between the surface and atmosphere. Recent global simulations suggest a strong effect of the diurnal cycle of surface winds on the depth of the planetary boundary layer. We propose to use a turbulence-resolving model to characterize the Venus boundary layer and the impact of surface winds for the first time. Simulations were performed in the low plain and high terrain at the Equator, at noon and at midnight. A strong diurnal cycle is resolved in the high terrain, with a convective layer reaching 7 km above the local surface and vertical winds of 1.3 m/s. The boundary layer depth in the low plain is consistent with the observed wavelength of the dune fields. At noon, the resolved surface wind field for both locations is strong enough to lift dust particles and engender micro-dunes. Convective vortices are resolved for the first time on Venus. arXiv: http://arxiv.org/abs/2210.09219v1
A Note on Various Atmospheres over Water Oceans on Terrestrial Planets with a One-Dimensional Radiative-Convective Equilibrium Model, by Tetsuya Hara et al., on Sunday 16 October. We investigate the possibility of various atmospheres over water oceans. We consider H$_2$ and He atmospheres, in comparison with an N$_2$ atmosphere, over oceans. One of the main subjects in astrobiology is to estimate the habitable zone. If there is an ocean on a planet with an atmosphere, there is an upper limit to the outgoing infrared radiation, called the Komabayashi-Ingersoll limit (KI-limit). This limit depends on the components of the atmosphere. We investigate this dependence using the one-dimensional gray radiative-convective equilibrium model adopted by Nakajima et al. (1992). The outgoing infrared radiation ($F_{IRout}$) as a function of the surface temperature ($T_s$) shows some peculiar behavior. Examples with H$_2$, He, and N$_2$ background gas for H$_2$O vapour are investigated. There is another limit, called the Simpson-Nakajima limit (SN-limit), for atmospheres mainly composed of vapour. This steam limit does not depend on the background atmosphere components. For a super-Earth case ($g = 2\times 9.8$ m/s$^2$), several cases are also calculated. The KI-limit dependence on the initial pressure is presented. The various emission rates by Koll & Cronin (2019) are investigated. arXiv: http://arxiv.org/abs/2210.05963v2
Bridging the gap -- the disappearance of the intermediate period gap for fully convective stars, uncovered by new ZTF rotation periods, by Yuxi Lu et al., on Thursday 13 October. The intermediate period gap, discovered by Kepler, is an observed dearth of stellar rotation periods in the temperature-period diagram at $\sim$ 20 days for G dwarfs and up to $\sim$ 30 days for early-M dwarfs. However, because Kepler mainly targeted solar-like stars, there is a lack of measured periods for M dwarfs, especially those at the fully convective limit. Therefore it is unclear if the intermediate period gap exists for mid- to late-M dwarfs. Here, we present a period catalog containing 40,553 rotation periods (9,535 periods $>$ 10 days), measured using the Zwicky Transient Facility (ZTF). To measure these periods, we developed a simple pipeline that improves directly on the ZTF archival light curves and reduces the photometric scatter by 26%, on average. This new catalog spans a range of stellar temperatures that connects samples from Kepler with MEarth, a ground-based time-domain survey of bright M dwarfs, and reveals that the intermediate period gap closes at the theoretically predicted location of the fully convective boundary ($G_{\rm BP} - G_{\rm RP} \sim 2.45$ mag). This result supports the hypothesis that the gap is caused by core-envelope interactions. Using gyro-kinematic ages, we also find a potential rapid spin-down of stars across this period gap. arXiv: http://arxiv.org/abs/2210.06604v1
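The abstract above does not spell out how rotation periods are extracted from the ZTF light curves. As a hedged sketch of the general idea only (every number, name, and sampling choice below is illustrative, not taken from the paper), a rotation period can be recovered from an irregularly sampled light curve with a simple least-squares periodogram:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated, irregularly sampled light curve of a spotted star rotating
# every 18.5 days (period, amplitude, and noise are illustrative choices).
true_period = 18.5                          # days
t = np.sort(rng.uniform(0.0, 400.0, 600))   # observation epochs (days)
flux = 1.0 + 0.02 * np.sin(2 * np.pi * t / true_period)
flux += rng.normal(0.0, 0.005, t.size)      # photometric scatter

def ls_power(t, y, periods):
    """Fraction of variance explained by a sinusoid at each trial period."""
    y = y - y.mean()
    power = np.empty(len(periods))
    for i, p in enumerate(periods):
        phase = 2 * np.pi * t / p
        # Linear least squares for y ~ a*sin(phase) + b*cos(phase).
        A = np.column_stack([np.sin(phase), np.cos(phase)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        power[i] = 1.0 - np.sum((y - A @ coef) ** 2) / np.sum(y ** 2)
    return power

trial_periods = np.linspace(2.0, 50.0, 5000)
best = trial_periods[np.argmax(ls_power(t, flux, trial_periods))]
print(f"recovered period: {best:.2f} d")   # close to the injected 18.5 d
```

Production pipelines add detrending, alias rejection, and vetting on top of this kind of period search; the sketch only shows the core frequency-grid fit.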
TOI-5205b: A Jupiter transiting an M dwarf near the Convective Boundary, by Shubham Kanodia et al., on Sunday 25 September. We present the discovery of TOI-5205b, a transiting Jovian planet orbiting a solar-metallicity M4V star, which was discovered using TESS photometry and then confirmed using a combination of precise radial velocities, ground-based photometry, spectra and speckle imaging. The host star TOI-5205 sits near the eponymous 'Jao gap', which is the transition region between partially and fully convective M dwarfs. TOI-5205b has one of the highest mass ratios for M-dwarf planets, at almost 0.3\%, as it orbits a host star that is just $0.392 \pm 0.015$ $M_\odot$. Its planetary radius is $1.03 \pm 0.03~R_J$, while the mass is $1.08 \pm 0.06~M_J$. Additionally, the large size of the planet orbiting a small star results in a transit depth of $\sim 7\%$, making it one of the deepest transits of a confirmed exoplanet orbiting a main-sequence star. The large transit depth makes TOI-5205b a compelling target to probe its atmospheric properties, as a means of tracing the potential formation pathways. While there have been radial velocity-only discoveries of giant planets around mid-M dwarfs, this is the first transiting Jupiter with a mass measurement discovered around such a low-mass host star. The high mass of TOI-5205b stretches conventional theories of planet formation and disk scaling relations that cannot easily recreate the conditions required to form such planets. arXiv: http://arxiv.org/abs/2209.11160v2
Impact of subsurface convective flows on the formation of sunspot magnetic field and energy build-up by Takafumi Kaneko et al. on Wednesday 14 September Strong solar flares occur in $\delta$-spots, characterized by opposite-polarity magnetic fluxes within a single penumbra. Sunspot formation via flux emergence from the convection zone to the photosphere can be strongly affected by convective turbulent flows. It has not yet been shown how crucial convective flows are for the formation of $\delta$-spots. The aim of this study is to reveal the impact of convective flows in the convection zone on the formation and evolution of sunspot magnetic fields. We simulated the emergence and transport of magnetic flux tubes in the convection zone using the radiative magnetohydrodynamics code R2D2. We carried out 93 simulations by allocating the twisted flux tubes to different positions in the convection zone. As a result, both $\delta$-type and $\beta$-type magnetic distributions were reproduced solely by differences in the convective flows surrounding the flux tubes. The $\delta$-spots were formed by the collision of positive and negative magnetic fluxes in the photosphere. The unipolar and bipolar rotations of the $\delta$-spots were driven by magnetic twist and writhe, transporting magnetic helicity from the convection zone to the corona. We detected a strong correlation between the distribution of the nonpotential magnetic field in the photosphere and the position of the downflow plume in the convection zone. The correlation could be detected 20-30 h before the flux emergence. The results suggest that high free-energy regions in the photosphere can be predicted even before the magnetic flux appears in the photosphere, by detecting the downflow profile in the convection zone. arXiv: http://arxiv.org/abs/2209.06311v1
Convective outgassing efficiency in planetary magma oceans: insights from computational fluid dynamics by Arnaud Salvador et al. on Wednesday 14 September Planetary atmospheres are commonly thought to result from the efficient outgassing of cooling magma oceans. During this stage, vigorous convective motions in the molten interior are believed to rapidly transport the dissolved volatiles to shallow depths, where they exsolve and burst at the surface. This assumption of efficient degassing and atmosphere formation has important implications for planetary evolution, but has never been tested against fluid dynamics considerations. Yet, during a convective cycle, only a finite fraction of the magma ocean can reach the shallow depths where volatile exsolution can occur, and a large-scale circulation may prevent a substantial magma ocean volume from rapidly reaching the planetary surface. Therefore, we conducted computational fluid dynamics experiments of vigorous 2D and 3D Rayleigh-Bénard convection at a Prandtl number of unity to characterize the ability of the convecting fluid to reach the shallow depths at which volatiles are exsolved and extracted to the atmosphere. Outgassing efficiency is essentially a function of the magnitude of the convective velocities. This allows deriving simple expressions to predict the time evolution of the amount of outgassed volatiles as a function of the magma ocean's governing parameters. For plausible cases, the time required to exsolve all oversaturated water can exceed the magma ocean lifetime in a given highly vigorous transient stage, leading to incomplete or even negligible outgassing. Furthermore, the planet size and the initial magma ocean water content, through the convective vigor and the exsolution depth, respectively, strongly affect magma ocean degassing efficiency, possibly leading to divergent planetary evolution paths and resulting surface conditions. Overall, despite vigorous convection, for a significant range of parameters, convective degassing appears not as efficient as previously thought. arXiv: http://arxiv.org/abs/2209.06199v1
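The abstract's core argument — that only a finite fraction of the ocean reaches the exsolution depth per convective cycle, so outgassing can stay incomplete over the magma ocean lifetime — can be illustrated with a toy decay model. This is an illustrative sketch, not the paper's derived expressions, and every number (`t_overturn`, `phi`, `lifetime`) is hypothetical:

```python
import math

# Toy model: if, per convective overturn, only a fraction `phi` of the magma
# ocean volume reaches the shallow exsolution depth, the oversaturated-volatile
# inventory decays roughly exponentially with timescale tau = t_overturn / phi.

def outgassed_fraction(t, t_overturn, phi):
    """Fraction of oversaturated volatiles outgassed after time t (seconds)."""
    tau = t_overturn / phi
    return 1.0 - math.exp(-t / tau)

# Hypothetical numbers, for illustration only:
t_overturn = 1e5   # ~1 day per overturn under vigorous convection
phi = 1e-4         # small fraction reaching the exsolution depth per cycle
lifetime = 1e8     # a short (transient-stage) magma ocean lifetime, ~3 yr

f = outgassed_fraction(lifetime, t_overturn, phi)
print(f"outgassed fraction after lifetime: {f:.1%}")  # ~9.5%: incomplete
```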
The EBLM project -- IX. Five fully convective M-dwarfs, precisely measured with CHEOPS and TESS light curves by D. Sebastian et al. on Wednesday 07 September Eclipsing binaries are important benchmark objects to test and calibrate stellar structure and evolution models. This is especially true for binaries with a fully convective M-dwarf component, for which direct measurements of these stars' masses and radii are difficult using other techniques. Given the potential of M-dwarfs to be exoplanet host stars, the accuracy of theoretical predictions of their radius and effective temperature as a function of their mass is an active topic of discussion. Not only the parameters of transiting exoplanets but also the success of future atmospheric characterisation rely on accurate theoretical predictions. We present the analysis of five eclipsing binaries with low-mass stellar companions, out of a sub-sample of 23 for which we obtained ultra-high-precision light curves using the CHEOPS satellite. The observations of their primary and secondary eclipses are combined with spectroscopic measurements to precisely model the primary parameters and derive the M-dwarf's mass, radius, surface gravity, and effective temperature estimates using the PYCHEOPS data analysis software. Combining these results with the same set of parameters derived from TESS light curves, we find very good agreement (better than 1% for radius and better than 0.2% for surface gravity). We also analyse the importance of precise orbits from radial velocity measurements and find them to be crucial to derive M-dwarf radii in a regime below 5% accuracy. These results add five valuable data points to the mass-radius diagram of fully convective M-dwarfs. arXiv: http://arxiv.org/abs/2209.03128v1
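The abstract describes deriving surface gravity from the measured mass and radius. A minimal sketch of that standard computation; the example mass and radius are hypothetical values merely typical of a fully convective M-dwarf, not from the paper:

```python
import math

# Surface gravity from mass and radius: g = G*M / R^2, reported as log g (cgs).
G = 6.674e-8          # gravitational constant, cgs
M_SUN = 1.989e33      # solar mass in g
R_SUN = 6.957e10      # solar radius in cm

def log_g(mass_msun, radius_rsun):
    """log10 of surface gravity in cgs units (cm s^-2)."""
    g = G * mass_msun * M_SUN / (radius_rsun * R_SUN) ** 2
    return math.log10(g)

# Hypothetical fully convective M-dwarf: 0.2 M_sun, 0.22 R_sun
print(f"log g = {log_g(0.2, 0.22):.2f}")  # ~5.05 for these assumed values
```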
Welcome to the next installment of the Anesthesia Patient Safety podcast hosted by Alli Bechtel. This podcast is an exciting journey towards improved anesthesia patient safety. We are back for the exciting conclusion to our 2-Part Rapid Response series on the safe use of convective warmers during anesthesia care. Join us today for the response from Smiths Medical, the manufacturer of the Level 1 Equator Convective Warming Devices, as well as recommendations for continued safe use of these devices from Dr. Jeffrey Feldman. This is an important conversation about how to keep patients safe and warm in the operating room. © 2022, The Anesthesia Patient Safety Foundation. For show notes & transcript, visit our episode page at apsf.org: https://www.apsf.org/podcast/95-still-keeping-patients-warm-and-safe-with-convective-warming-systems/
http://futureofrisk.com/ Convective storms are among the most common and most destructive natural hazards on Earth. Mike Widdekind, Property Technical Director for Risk Engineering for the Zurich Services Corporation discusses how businesses can protect themselves and their employees during these dangerous weather systems. Recorded on 9-29-21
Tonight's Guest Panelist has been on the show several times prior. He runs a weather blog and a chaser convention, among other things. We welcome back friend of the podcast, Chris White! Tonight's Guest WeatherBrain is an award-winning journalist and author. He's written several books, and in 2019 received the Overseas Press Club Award for the best Human Rights Reporting in any medium. He also won the Amnesty International Award for Foreign Reporting. Jeff Stern, welcome to WeatherBrains!
Inclement weather conditions are, by far, the largest cause of flight delays in the United States. In an average year, inclement weather — including convection — is the reason for nearly 70 percent of all delays. In this episode, we'll hear from two professionals on convective weather, flight delays, and the work that the FAA does to get ahead of severe weather and minimize delays. Read the show notes on our blog.
Winter weather was certainly in the forecast for last week. Tuesday’s snow was something that caught even meteorologists statewide off guard. The flurries were heavy, and WBRC Chief Meteorologist J-P Dice this week talks with Dr. Tim Coleman about how our surprise snow day came to be. This episode of Behind the Front is sponsored by Durante Home Exteriors. Click Here or Call: (205) 956-4110 to learn how Durante can help protect your home from the weather.
In this interview, I had the pleasure to interview Allison Wing from Florida State University. She gave me a clear and concise review of what we know about the effects of hurricanes on climate change, plus she explained to me the motivations behind the Radiative Convective Equilibrium model inter-comparison project (led by her), the unexpected findings and the way forward. Learn more: http://myweb.fsu.edu/awing/index.html --- Send in a voice message: https://anchor.fm/the-climate-academy/message
Professor Robin Tanamachi studies severe storms using radar. In this episode, Robin discusses both the facts and fiction behind the movie Twister. She shares the career path that has led her to become a scientist studying convective storms and tornadoes. As a high school student, Robin took advantage of programs that allowed her to earn college credits. Professor Tanamachi lived in Japan for a few months researching typhoons during grad school. She recommends getting outside of your comfort zone to help yourself grow. A children's book about her field research, The Tornado Scientist: Seeing Inside Severe Storms, written by Mary Kay Carson, was released earlier this year (2019). Purdue page: http://www.eaps.purdue.edu/people/faculty-pages/tanamachi.html The XTRRA radar: https://xtrra.eaps.purdue.edu/ The Tornado Scientist: Seeing Inside Severe Storms, written by Mary Kay Carson: https://www.amazon.com/Tornado-Scientist-Scientists-Field/dp/0544965825
Here is your Convective Outlook, issued by the National Weather Service's Storm Prediction Center, located in Norman, Oklahoma. This outlook was issued at 6 AM Central Daylight Time, Thursday, July 18, 2019. Areas of severe storms are possible across the Upper Midwest on Thursday. --- Support this podcast: https://anchor.fm/xtremeweather/support
Tonight's Guest WeatherBrain is the teaching professor in meteorology and the associate head of the undergraduate program in meteorology at The Penn State University. He is also the host, feature writer, and producer of Weather World, Penn State Meteorology Department's weekday weather magazine show. He also produces a segment on Wednesdays called WeatherWhys (WxYz). In addition, he was an on-air storm analyst at The Weather Channel from 2002 to 2005 (just after Dr. John Scala was there) and was the chief meteorologist at the historic Franklin Institute Science Museum in Philadelphia from 1998 to 2002. He has also co-authored two books: "The Philadelphia Area Weather Book" and "A World of Weather: Fundamentals of Meteorology." Dr. Jon Nese, welcome to WeatherBrains!
A feature from NASA’s Ames Research Center in Silicon Valley originally posted on October 14, 2016.
Basic Science Clinic by Steve Morgan & Sophie Connolly In the words of the canonical Roman poet Ovid: “Sickness seizes the body from bad ventilation”. Welcome to Basic Science Clinic Raw Science 9. Convective gas flow provides the substrate to interface with alveolar structural and functional adaptation in orchestrating gas exchange. Gas exchange is the serial interconnection of ventilation, diffusion and perfusion. Alveolar gas composition is determined by the amount and type of gases delivered by ventilation, the rate and direction of gas diffusion, and the pulmonary blood flow, which continuously recalibrates partial pressure gradients to direct oxygen and carbon dioxide movement. Bulk gas volume displacement is the mechanism of ventilation, but how do we conceptualise and quantify its contribution to gas exchange and its associated abnormalities? In this pod we will examine the quantification of ventilation and parse its correspondence with variance in the dead space volume and its central role in carbon dioxide homeostasis. In this podcast we will cover: How do we define pulmonary ventilation? What is the relationship between alveolar minute ventilation and alveolar gas composition? What are the determinants of arterial carbon dioxide partial pressure? What are the physiological sequelae of hypercapnia? What is permissive hypercapnia? What is dead space? How do we quantify dead space volume? What are the factors that affect dead space? Here are your Raw Science factoids: Morphological dead space estimates give us volumes of 150 mls for dogs, 380 mls for cows and 150-300 L for whales. In 1986 in Cameroon, a paroxysmal expulsion of carbon dioxide from the volcanic lake Nyos resulted in the asphyxiation of 1,700 people and 8,000 animals. The highest documented PaCO2 measured in a subsequent survivor weighed in at 501 mmHg, in a 16-year-old boy swallowed, sequestered and asphyxiated by a rapidly filling grain truck.
For feedback, corrections and suggestions find us on the twitter handles @falconzao and @sophmconnolly or alternatively post on ICN. Check out our new website basicscienceclinic.com where you can access the back catalogue and peruse the real brains behind our elaborate plagiarism by checking out the reference page to go direct to the source material. Thanks for listening. Next up we'll consummate the journey of inspired atmospheric gas into the blood phase, as we appraise gas diffusion across the alveolar-capillary membrane and dissect the pulmonary vasculature.
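Two of the quantities the episode covers — dead space and alveolar minute ventilation — have standard textbook formulas (the Bohr equation VD/VT = (PaCO2 − PECO2)/PaCO2, and VA = (VT − VD) × f). A minimal sketch with illustrative example numbers:

```python
# Bohr dead-space fraction and alveolar minute ventilation, as discussed above.

def bohr_dead_space_fraction(paco2, peco2):
    """Dead-space fraction of tidal volume: (PaCO2 - mixed-expired PCO2) / PaCO2."""
    return (paco2 - peco2) / paco2

def alveolar_minute_ventilation(tidal_volume_ml, dead_space_ml, rate_per_min):
    """Alveolar ventilation in mL/min: (VT - VD) * respiratory rate."""
    return (tidal_volume_ml - dead_space_ml) * rate_per_min

vd_vt = bohr_dead_space_fraction(paco2=40.0, peco2=28.0)   # 0.30
va = alveolar_minute_ventilation(500, 150, 12)             # 4200 mL/min
print(f"VD/VT = {vd_vt:.2f}, VA = {va:.0f} mL/min")
```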
Basic Science Clinic by Steve Morgan & Sophie Connolly An expert is a person who has made all the mistakes that can be made in a very narrow field. Niels Bohr Welcome to Basic Science Clinic Raw Science 8. Convective gas flow through the tracheobronchial tree is the end-point of pulmonary mechanics but the fundamental purpose of the lung is gas exchange, comprised of three interlinked physiological processes: ventilation, diffusion and perfusion. Today we examine the incredible structural adaptation of the human lung down to the alveolus as the centrepoint of gas exchange, a process itself best conceptualized via the elegant physiological model of the alveolar gas equation. The unraveling of the procession of pulmonary blood flow from right ventricle to lung to facilitate the mingling of blood and air involved protagonists that spanned epochs from Hippocrates to Galen and eventually in 1661 to Marcello Malpighi. He was the first person to view the pulmonary capillaries and alveoli through the augmented reality offered by the light microscope that had been invented in 1590. The composition of gas in the alveoli determines and represents the process of pulmonary gas exchange and provides a framework for understanding the mechanisms and practical physiological limitations. Alveolar gas is practically inaccessible in vivo and hence requires an accurate and precise model to ascertain its configuration under specific conditions. The alveolar gas equation relates the alveolar partial pressure of oxygen to inspired partial pressure of oxygen, alveolar and hence arterial partial pressure of carbon dioxide and the respiratory quotient. How is the lung adapted to optimise gas exchange? So how does the alveolus fit in? What are the cell populations in the alveolar region? How can we model pulmonary gas exchange? Raw Science factoids: The oxygen content of arterial blood is ~21 mls/dl, ie 21% by volume. 
The oxygen content of mixed venous blood is 15-16 mls/dl indicating a total body oxygen extraction of 25%. The total alveolar surface area is approximately 80x greater than the total surface area of the skin. Each erythrocyte contains approximately 250 million haemoglobin molecules and 400 billion erythrocytes occupy the total pulmonary capillary blood volume. For feedback, corrections and suggestions find us on twitter @falconzao and @sophmconnolly or post on ICN. Thanks for listening. Next up we’ll continue our examination of pulmonary gas exchange by looking in more detail at ventilation, perfusion and diffusion. Coming soon is our video series Raw Focus to delve deeper into the key concepts from each of the podcasts.
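The alveolar gas equation described in this episode — relating alveolar PO2 to inspired PO2, arterial PCO2 and the respiratory quotient — is commonly written in the simplified form PAO2 = FiO2 × (Pb − PH2O) − PaCO2/R. A minimal sketch with sea-level room-air values:

```python
# Simplified alveolar gas equation: PAO2 = FiO2 * (Pb - PH2O) - PaCO2 / R

def alveolar_po2(fio2, pb=760.0, ph2o=47.0, paco2=40.0, rq=0.8):
    """Alveolar O2 partial pressure in mmHg.

    pb: barometric pressure; ph2o: saturated water vapour pressure at 37 C;
    paco2: arterial CO2 tension; rq: respiratory quotient.
    """
    return fio2 * (pb - ph2o) - paco2 / rq

print(f"PAO2 on room air: {alveolar_po2(0.21):.0f} mmHg")  # ~100 mmHg
```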
Basic Science Clinic by Steve Morgan & Sophie Connolly What we know is not much. What we do not know is immense. Pierre-Simon Laplace Welcome to Basic Science Clinic Raw Science 7. As a prelude to deconstructing gas exchange we have been examining how humans, as tidal ventilators, replenish the composition of the gas in the functional residual capacity to provide a plentiful oxygen repository to buffer fluctuations in the oxygen content of blood leaving the lung with every beat of the heart. Convective, pressure-gradient-driven, bulk gas volume displacement can only occur if the displacing force is greater than the forces that oppose gas flow. These oppositional forces are the physiological targets of pathological processes that affect the lung: they alter pulmonary mechanics and increase work of breathing, eventually critically compromising respiratory function and indicating the need for respiratory support measures. To effectively manage organ system dysfunction it is vital to develop an intimate understanding of your enemy, so today we will examine the oppositional forces to gas flow that are among the key perpetrators of respiratory failure. In this pod we'll cover: What are the oppositional forces to gas flow? What is elastance? What is elastic recoil and what are its determinants? How does the lung prevent surface tension induced alveolar instability? What is the 2nd major oppositional force to gas flow? How do these driving and oppositional forces relate to work of breathing? Raw Science Factoids During inspiration the alveolar radius increases from 0.05 mm to 1 mm, which should require a distending pressure of 10 cmH2O, but surfactant's detergent action means that 1 cmH2O will suffice. Normal resting VO2 is approximately 2-4 mls/kg/min. In terms of VO2max, an average untrained healthy male would approximate 35-40 ml/kg/min, and former multiple Tour de France champion and King of the Mountains Miguel Indurain hit 88 ml/kg/min at his peak.
Racing Siberian sled dogs can reach 240 ml/kg/min. Elite rowers can escalate their total minute ventilation to 240 L/min by hitting respiratory rates of 60/min and tidal volumes of 4000 mls and, in spite of this heroic effort, still generate lactates of 15-18 mmol/L. For feedback, corrections and suggestions find us on twitter @falconzao and @sophmconnolly or post on ICN. Thanks for listening. Next up we'll begin our examination of pulmonary gas exchange; also coming soon is the second Crit Think series, Doors of Deception, in which we will look at the labyrinthine and mendacious ways our decision making faculties can deceive us.
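The surfactant factoid above rests on the Laplace relation P = 2T/r: the distending pressure needed to hold an alveolus open scales with surface tension and inversely with radius. A minimal sketch; the surface-tension values are illustrative textbook figures, so the outputs bracket rather than reproduce the episode's 10-vs-1 cmH2O numbers:

```python
# Laplace relation for a spherical alveolus: distending pressure P = 2T / r.

def laplace_pressure_cmh2o(surface_tension_n_per_m, radius_m):
    """Distending pressure 2T/r, converted from Pa to cmH2O (1 cmH2O ~ 98.1 Pa)."""
    return 2.0 * surface_tension_n_per_m / radius_m / 98.1

r = 0.05e-3  # 0.05 mm alveolar radius at end-expiration, per the factoid
# Water-like surface tension vs surfactant-reduced tension (illustrative values):
print(f"water  (T ~ 0.070 N/m): {laplace_pressure_cmh2o(0.070, r):.1f} cmH2O")
print(f"surfactant (T ~ 0.003 N/m): {laplace_pressure_cmh2o(0.003, r):.1f} cmH2O")
```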
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 05/05
This thesis aims to answer the question of whether 3D effects of thermal radiative transfer need to be considered in cloud-resolving simulations, and whether 3D thermal heating and cooling rates have an influence in contrast to common 1D approximations. To study this question with the help of a cloud-resolving model, an accurate yet fast parameterization of 3D radiative transfer is needed. First, an accurate 3D Monte Carlo model was developed, which was used as a benchmark for developing the fast `Neighboring Column Approximation' (NCA); this was then coupled to the UCLA-LES to study the effects of 3D thermal heating and cooling rates in comparison to common 1D radiative transfer approximations. To begin with, differences between common 1D radiative transfer approximations and a correct 3D radiative transfer model were analyzed. For this, efficient Monte Carlo variance reduction methods were developed and implemented in MYSTIC, a Monte Carlo radiative transfer model. The dependence of 1D and 3D heating and cooling rates on cloud geometry was investigated by analyzing idealized clouds such as cubes or half-spheres. Furthermore, 1D and 3D heating and cooling rates in realistic cloud fields were simulated and compared. It could be shown that cooling rates reach maximum values of several 100 K/d at cloud tops if the model resolution was between 50 m and 200 m. Additional cloud-side cooling of several 10 to 100 K/d was found in 3D heating and cooling rate simulations. At the cloud bottom, modest warming of a few 10 K/d occurs. Heating and cooling rates depend on the vertical location of the cloud in the atmosphere, the liquid water content of the cloud, the shape of the cloud and the geometry of the cloud field (for example, the distance between clouds). Based on the results of a detailed analysis of exact simulations of 3D thermal heating and cooling rates, a fast but still accurate 3D parameterization for thermal heating and cooling rates was developed.
This parameterization, the `Neighboring Column Approximation' (NCA), is based on a 1D radiative transfer solution and uses the neighboring columns of each column to estimate the 3D heating or cooling rate. The method can be used in parallelized models. With the NCA, it is possible to simulate 3D cloud-side cooling and warming. It was shown that the NCA is a factor of 1.5 to 2 more expensive in terms of computational time when used in a cloud-resolving model, compared to a 1D radiative transfer approximation. The NCA was implemented in UCLA-LES, a cloud-resolving, large-eddy simulation model. With the UCLA-LES and the NCA it was possible for the first time to study the effects of 3D interactive thermal radiation on cloud development. Simulations without radiation, with 1D thermal radiation and with 3D thermal (NCA) radiation were performed and the differences analyzed. First, single, isolated clouds were investigated. Depending on the cloud shape, 3D thermal radiation changes cloud development in comparison to 1D thermal radiation. Overall it could be shown that a thermal radiation effect on cloud development exists in general. Whether there is a difference between 1D and 3D thermal radiation in cloud development seems to depend on the specific situation. One of the main features of thermal radiation affecting a single cloud is a change in the cloud circulation. Stronger updrafts in the cloud core and stronger downdrafts at the cloud sides were found, causing an enhanced cloud development at first, but a faster decay of the cloud in the end. Second, large-scale simulations of a shallow cumulus cloud field in a 25 x 25 km^2 domain with 100 m horizontal resolution were analyzed. To the author's knowledge, this is the first time that a cloud field of this size and resolution was simulated including 3D interactive thermal radiation. It was shown that, on average, updrafts, downdrafts and liquid water increase if thermal radiation is accounted for.
While most variables (for example, liquid water mixing ratio or cloud cover) did not show significant systematic differences between the no-radiation simulation and the simulations with 1D and 3D thermal radiation, the cloud size (or horizontal extent) was larger in the simulations with interactive 3D thermal radiation. Convective organization set in after only a few hours. This is a clear indication that 3D thermal radiation could trigger convective organization.
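The core idea of the NCA — using a column's neighbors to estimate cloud-side heating and cooling that a 1D solver misses — can be caricatured very simply. The sketch below is a made-up toy, not the thesis's actual parameterization: it merely adds a cooling term proportional to the cloud-side area a column exposes toward its lower neighbors, with a fabricated coefficient `k_side`:

```python
# Toy caricature of the neighboring-column idea: a column whose cloud top
# stands above its four neighbors exposes cloud-side area, which contributes
# extra (side) cooling proportional to the exposed height. Units are arbitrary.

def side_cooling(cloud_top, k_side=0.5):
    """Extra cooling per interior column from exposed cloud sides (toy model)."""
    ny, nx = len(cloud_top), len(cloud_top[0])
    extra = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            h = cloud_top[j][i]
            for dj, di in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                # exposed side height toward this neighbor (0 if neighbor taller)
                extra[j][i] += max(h - cloud_top[j + dj][i + di], 0.0)
            extra[j][i] *= k_side
    return extra

tops = [[0.0, 0.0, 0.0],
        [0.0, 2.0, 0.0],
        [0.0, 0.0, 0.0]]  # one isolated cloudy column
print(side_cooling(tops))  # center column gets 0.5 * (4 sides * height 2) = 4.0
```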
A theoretical model describing the fluctuations within a cloud ensemble forms the basis of stochastic convection parameterization. High-resolution, idealized simulations of a cloud ensemble over a homogeneous sea surface are used to evaluate the validity of the theoretical model in radiative-convective equilibrium. In the first step of this study, control simulations with a horizontal resolution of 2 km are carried out, with five different cooling rates used to drive the cloud ensemble. In the control simulations, the validity of an exponential distribution of the vertical mass flux and of the cloud sizes is demonstrated for all cooling rates. Furthermore, the number of clouds in the model domain increases linearly with increasing cooling rate, whereas only a weak dependence of the mean cloud sizes and their vertical velocities on the strength of the cooling rate is observed. These results agree well with the theoretical model. In the second part of this study, the grid spacing in the numerical simulations is successively refined down to a resolution of 125 m. Significant changes in the cloud statistics and in the structure of the cloud fields occur. The size of the clouds decreases strongly with increasing resolution, whereas the number of grid points within a cloud increases, since clouds are better resolved on the numerical grid at higher resolution. In contrast to the randomly distributed clouds in the control simulations, the individual convective cells in the finely resolved cloud fields are observed to arrange themselves in band-like structures around cloud-free areas. The radius around a cloud within which these clustering effects are observed appears, however, to be independent of the horizontal resolution.
With increasingly fine resolution, the probability density distributions of the cloud sizes and of the vertical mass flux moreover deviate ever more strongly from the exponential distribution. For larger values in the distributions, agreement with a power-law distribution is found. By partitioning the cloud clusters into their underlying individual updraft regions, the expected exponential distribution of mass flux and cloud sizes can be restored. The theoretical model is therefore valid for the individual updrafts in high-resolution simulations, but the clustering effects in the cloud ensemble must be taken into account.
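The exponential mass-flux distribution that the control simulations confirm can be illustrated with a short, self-contained sketch. All numbers here are invented for illustration; this is not the thesis code.

```python
import numpy as np

# Illustrative sketch: individual-updraft mass fluxes drawn from the
# exponential distribution p(m) = exp(-m/<m>)/<m> predicted by the
# theoretical model; the mean flux value is made up.
rng = np.random.default_rng(42)
mean_flux = 2.0e7                       # kg/s, illustrative scale only
fluxes = rng.exponential(mean_flux, size=100_000)

# For an exponential distribution the mean equals the standard deviation,
# a quick diagnostic that distinguishes it from a heavy-tailed power law.
ratio = fluxes.mean() / fluxes.std()
print(f"mean/std ratio: {ratio:.3f}")   # close to 1 for an exponential
```

The same mean/std diagnostic would drift well away from 1 for the power-law tails found in the high-resolution cloud clusters.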
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 05/05
Convective phenomena in the atmosphere, such as convective storms, are characterized by very fast, intermittent and seemingly stochastic processes. They are thus difficult to predict with Numerical Weather Prediction (NWP) models, and difficult to estimate with data assimilation methods that combine prediction and observations. In this thesis, nonlinear data assimilation methods are tested on two idealized convective-scale cloud models, developed in [58] and [59]. The aim of this work was to apply the particle filter, a method specifically designed for nonlinear models, to the two toy models that resemble some properties of convection. Potential problems and characteristic features of particle filter methodology were analyzed, having in mind possible tests on far more elaborate NWP models. The first model, the stochastic cloud model, is a one-dimensional birth-death model initialized by a Poisson distribution, where clouds randomly appear or die independently from each other on a set of grid points. This purely stochastic model is physically unrealistic, but since it is highly nonlinear and non-Gaussian, it contains minimal requirements for representing the main features of convection. The derivation of the transition probability density function (PDF) of the stochastic cloud model made it possible to better understand the weighting mechanism involved in the particle filter. This mechanism, which assigns a weight to particles (state vectors) according to their likelihood with respect to observations and to their evolution in time, is followed by resampling, where particles with high probability are replicated and others eliminated. The ratio between the magnitudes of the observation probability distribution and the transition probability is shown to determine the selection process of particles at each time step, where data and prediction are combined.
Further, a strong sensitivity of the filter to the observation density and to the speed of the cloud variability (given by the cloud lifetime) is demonstrated. Thus, the particle filter can outperform some simpler methods for certain observation densities, whereas it brings no improvement for others. Similarly, it leads to good results for stationary cloud fields while having difficulty following fast-varying cloud fields, because any change in the model state variable is potentially penalized. The main difficulty for the filter is the fact that this model is discrete, while the filter was designed for data assimilation of continuous fields. However, by representing an extreme testbed for the particle filter, the stochastic cloud model shows the importance of the observation and model error densities for the selection of particles, and it stresses the influence of the chosen model parameters on the filter's performance. The second model considered was the modified shallow water model. It is based on the shallow water equations, to which a small stochastic noise is added in order to trigger convection, together with an equation for the evolution of rain. It contains spatial correlations and is represented by three dynamical variables - wind speed, water height and rain concentration - which are linked together. A reduction of the observation coverage and of the number of updated variables leads to a relative error reduction of the particle filter compared to an ensemble of particles that are only continuously pulled (nudged) towards observations, for a certain range of nudging parameters. But, not surprisingly, reducing data coverage increases the absolute error of the filter. We found that the standard deviation of the error density exponents is the quantity responsible for the relative success of the filter with respect to nudging-only.
In the case where only one variable is assimilated, we formulated a criterion that determines whether the particle filter outperforms the nudged ensemble. A theoretical estimate is derived for this criterion. The theoretical values of this estimate, which depends on the parameters involved in the assimilation setup (nudging intensity, model and observation error covariances, grid size, ensemble size,...), are roughly in accordance with the numerical results. In addition, comparing two different nudging matrices that regulate the magnitude of relaxation of the state vectors towards the observations showed that a diagonally based nudging matrix leads to smaller errors, in the case of assimilating three variables, than a nudging matrix based on stochastic errors added in each integration time step. We conclude that the efficient particle filter could bring an improvement with respect to conventional data assimilation methods at the convective scale. Its success, however, appears to depend strongly on the parameters of the test setting.
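The weight-then-resample cycle described above can be sketched in a few lines. The AR(1) toy model, the noise levels and the scalar observation operator below are illustrative stand-ins, not the cloud models used in the thesis.

```python
import numpy as np

# Minimal bootstrap particle filter on a scalar toy state (all parameters
# illustrative). Each cycle: propagate, weight by observation likelihood,
# resample proportionally to the weights.
rng = np.random.default_rng(0)
n_particles, n_steps = 500, 50
obs_err, model_err = 0.5, 0.3           # observation / model error std

truth = 0.0
particles = rng.normal(0.0, 1.0, n_particles)

for _ in range(n_steps):
    # propagate truth and particles with the stochastic model
    truth = 0.9 * truth + rng.normal(0.0, model_err)
    particles = 0.9 * particles + rng.normal(0.0, model_err, n_particles)

    # weight each particle by its likelihood w.r.t. the noisy observation
    obs = truth + rng.normal(0.0, obs_err)
    w = np.exp(-0.5 * ((obs - particles) / obs_err) ** 2)
    w /= w.sum()

    # resample: replicate high-weight particles, eliminate the rest
    particles = particles[rng.choice(n_particles, size=n_particles, p=w)]

print(f"analysis error: {abs(particles.mean() - truth):.3f}")
```

The ratio of observation likelihood to transition probability governs which particles survive each cycle, which is exactly the selection mechanism the abstract analyzes.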
There are three main types of convective storms: airmass thunderstorms, severe thunderstorms and hurricanes. These storms are all driven by the release of latent heat into the atmosphere during condensation of water vapor. Severe thunderstorms include both squall line thunderstorms and tornados. They acquire energy from water vapor in the atmosphere over land and therefore typically require warm air temperatures and high humidity. Hurricanes gain energy from water vapor evaporated from the ocean surface. This requires warm ocean temperatures, and is the reason hurricanes weaken over land. Hurricanes are cyclonic and therefore also require a non-zero Coriolis force to form and maintain their structure. For this reason they cannot form over the equator and cannot cross the equator. Complete course materials are available at the Open Yale Courses website: http://oyc.yale.edu This course was recorded in Fall 2011.
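The Coriolis constraint on hurricane formation can be made quantitative with the standard Coriolis parameter; the latitudes below are chosen for illustration.

```python
import math

# The Coriolis parameter f = 2*Omega*sin(latitude) vanishes at the equator,
# which is why hurricanes can neither form there nor cross it.
OMEGA = 7.2921e-5                       # Earth's rotation rate, rad/s

def coriolis(lat_deg: float) -> float:
    """Coriolis parameter in s^-1 at the given latitude in degrees."""
    return 2.0 * OMEGA * math.sin(math.radians(lat_deg))

for lat in (0, 10, 45, -10):
    print(f"lat {lat:>3} deg: f = {coriolis(lat):+.2e} s^-1")
```

Note the sign flip between 10° N and 10° S, which also underlies the opposite rotation senses discussed in the next lecture summary.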
Large scale air motion in the atmosphere occurring sufficiently above the surface is in geostrophic balance. Areas of high and low pressure anomalies in the atmosphere are surrounded by rotating flow caused by the balance between the pressure gradient and Coriolis forces. The direction of rotation around these pressure anomalies reverses between the northern and southern hemispheres due to the reversal in sign of the Coriolis force across the equator. This can be seen in the reverse direction of the spiraling of clouds in satellite images of hurricanes in the northern and southern hemispheres. Convective storms are also discussed. Complete course materials are available at the Open Yale Courses website: http://oyc.yale.edu This course was recorded in Fall 2011.
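Geostrophic balance can be turned into a one-line wind estimate. The mid-latitude numbers below are illustrative, not taken from the lecture.

```python
# Geostrophic balance equates the pressure-gradient force to the Coriolis
# force, giving v_g = (1/(rho*f)) * dp/dx for flow along the isobars.
# All values below are generic mid-latitude assumptions.
rho = 1.2            # air density, kg/m^3
f = 1.0e-4           # Coriolis parameter at mid-latitudes, s^-1
dpdx = 2.0e-3        # pressure gradient, Pa/m (2 hPa per 100 km)

v_g = dpdx / (rho * f)
print(f"geostrophic wind: {v_g:.1f} m/s")   # ~16.7 m/s
```

A modest 2 hPa per 100 km gradient already sustains a wind of order 17 m/s, which is why upper-level flow so closely follows the pressure field.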
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 03/05
Fri, 30 Jul 2010 12:00:00 +0100 https://edoc.ub.uni-muenchen.de/12016/ https://edoc.ub.uni-muenchen.de/12016/1/kober_kirstin.pdf Kober, Kirstin
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 03/05
This dissertation presents a climatological investigation of different orographic precipitation enhancement processes in the Alps. On the one hand, the database consists of observations from more than 1000 weather stations in Bavaria and western Austria for the period 1991-2000. On the other hand, an MM5 climate-mode simulation driven with ERA-40 reanalysis data provides information about the environmental meteorological conditions. After a short introduction to the basics of orographic precipitation enhancement and to some simple linear models, the different station data are converted to a uniform temporal resolution of 6 hours. To estimate the relative contributions of different precipitation types, we distinguish between cold fronts, warm fronts, convection and a class of unclassified events. Convective precipitation in connection with fronts is attributed to the respective frontal class. Unclassified events predominantly consist of postfrontal upslope precipitation and quasi-stationary fronts. In addition, the wind direction at Alpine crest level (700 hPa) is considered. In a first step, the precipitation differences between the Alpine foreland and the northern Alps are analyzed. The investigation of the climatological importance of the 4 precipitation types in various regions shows that summertime convection and orographic lifting make the largest contributions to the precipitation gradient towards the Alps. Convective precipitation occurs predominantly in association with southwesterly flow and is more abundant in the Alps than in the adjacent forelands because convection is primarily triggered over the Alps. Orographic lifting is most active in the case of northwesterly and northerly winds and intensifies both frontal and postfrontal precipitation. The climatological importance of orographic lifting is much larger in winter than in summer.
A reversed precipitation gradient with systematically more precipitation in the foreland than in the Alps is found for fronts associated with a wind direction of exactly 270°. Under these circumstances, the wind blows parallel to the mountain range and lee effects related to the upstream topography reduce the precipitation intensity in the Alps. The climatological precipitation maximum shifts from westerly towards northerly winds when moving from west to east in the northern Alps. The second part of this work comprises an investigation of the climatological precipitation decrease from the northern Alps to the inner-Alpine valleys. The comparison between the precipitation distributions of 3 regions in the northern Alps and 3 regions in the central Alps shows that especially for cold fronts and unclassified events in connection with northwesterly or northerly flow, a distinct precipitation surplus is found in the northern Alps. For convective precipitation and southwesterly or westerly winds, the inner-Alpine regions show high convective activity. For southerly wind directions, the showers formed over the central Alps are advected towards the northern Alps, where they sometimes even intensify due to a convergence with the inflow from the Alpine foreland. There is a strong west-east precipitation gradient in the central Alps. The decreasing crest level of the northern Alps and the decreasing north-south extension of the Alps, together with the topographical structure of the valleys, are the main reasons for the higher precipitation amounts in the eastern parts of the central Alps compared with the western parts. Additionally, the precipitation gradient is investigated in dependence on the temperature level, using a classification of the precipitation events into 3 classes (snow line > 2500 m, 2500 m - 1000 m, < 1000 m).
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 02/05
The influence of upper-tropospheric troughs on convective instability was investigated using analyses based on gridded data from the archive of the European Centre for Medium-Range Weather Forecasts (ECMWF). The so-called Convective Available Potential Energy (CAPE) was used as the measure of instability. A case study of the Burdekin thunderstorm in Australia (January 2001) showed that the high CAPE prior to the development of the storm was influenced by cold air associated with troughs. In contrast, in the cases of the Australian tropical cyclones Theodore (February 1994) and Rewa (January 1994), the influence of the troughs on CAPE was minimal, the cooling being weaker than in the thunderstorm case. The intensification of tropical cyclones was explored further with numerical model calculations motivated by the case study. Results from a control run show that intensification is an inherently non-axisymmetric process. Cumulus convection forms preferentially near the radius of maximum wind of the initial vortex. These convective cells exhibit enhanced rotation and are therefore called mesovortices. The formation of the mesovortices depends on the CAPE associated with boundary-layer moisture, which increases because of the air-sea moisture exchange at high wind speeds. Nevertheless, the further intensification of the cyclone as a whole is independent of CAPE. The most important process here is the merger of the vortices, through which the cyclone intensifies rapidly. Subsequently, ensemble calculations with random perturbations of the initial moisture in the lower troposphere were carried out in order to explore the sensitivity of the asymmetric intensification to moisture.
It was observed that the formation and merger of the mesovortices were influenced by the random perturbations, whereas the intensity of the fully developed cyclone remained within the range of variability of the control run. The effects of a reduction of mid-tropospheric moisture, of enhanced radiative cooling, and of an upper-level anticyclonic shear flow were also investigated. It was shown that the development of cyclones depends sensitively on these three factors. The merger of the mesovortices is delayed because of the reduced buoyancy in the cumulus convection. The ensemble calculations also show that predictability is low during the intensification period of cyclones. Substantial variations of cyclone intensity among the individual ensemble members at a fixed time point to the limits of predictability of individual model runs.
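CAPE, the instability measure used in the abstract, is the vertically integrated parcel buoyancy. A minimal discrete sketch follows; the two temperature profiles are invented for illustration only.

```python
import numpy as np

# CAPE as a discrete buoyancy sum: CAPE = g * sum((Tp - Te)/Te * dz) over
# levels where the lifted parcel is warmer than the environment.
# Both temperature profiles below are made up for this example.
g, dz = 9.81, 500.0                                        # m/s^2, layer depth m
t_env    = np.array([300.0, 295.0, 290.0, 286.0, 283.0])   # environment, K
t_parcel = np.array([300.0, 297.0, 293.0, 288.0, 282.0])   # lifted parcel, K

buoyancy = (t_parcel - t_env) / t_env
cape = g * np.sum(np.where(buoyancy > 0.0, buoyancy, 0.0) * dz)
print(f"CAPE ~ {cape:.0f} J/kg")
```

Trough-related cooling aloft lowers `t_env` at upper levels, which enlarges the positive buoyancy area and hence CAPE, the mechanism identified in the Burdekin case.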
Hg1-xCdxTe and Cd1-xZnxTe single crystals were grown by the travelling heater method (THM), applying two different techniques of artificially stirring the solution zone. Accelerated crucible rotation (ACRT) was used in a vertical growth arrangement and compared with a technique with constant rotation around the horizontal axis of the ampoule. The dominant hydrodynamic mechanisms of both methods are described by the rotating disc model and are suggested to be almost identical with respect to the growth conditions at the interface. Convective flow is effectively enhanced adjacent to the growing crystal, where matter transport is regarded as the rate-limiting step of solution growth. Inclusion density analysis by IR microscopy was used to characterise the Cd1-xZnxTe crystals grown at different rates. It was shown that forced convection allows an increase in the crystal growth rate from a few mm day-1 with ACRT or horizontally rotating THM.
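A classical consequence of the rotating disc model invoked above is the Levich estimate of the diffusion boundary layer, which thins as rotation speeds up and so enhances matter transport at the interface. The diffusivity and viscosity values in this sketch are generic order-of-magnitude assumptions for solution growth, not numbers from the paper.

```python
import math

# Levich boundary-layer thickness at a rotating disc:
#   delta = 1.61 * D**(1/3) * nu**(1/6) / sqrt(omega)
# D and nu are assumed, generic solution-growth magnitudes.
D = 1.0e-9        # solute diffusivity, m^2/s (assumed)
nu = 1.0e-6       # kinematic viscosity, m^2/s (assumed)

deltas = {}
for rpm in (10, 60, 240):
    omega = 2.0 * math.pi * rpm / 60.0        # rotation rate, rad/s
    deltas[rpm] = 1.61 * D ** (1 / 3) * nu ** (1 / 6) / math.sqrt(omega)
    print(f"{rpm:>3} rpm: delta ~ {deltas[rpm] * 1e6:.0f} um")
```

The boundary layer scales as omega^(-1/2), so a fourfold faster rotation halves its thickness; this is the sense in which forced convection relieves the transport-limited growth rate.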