POPULARITY
Outline
00:00 - Intro
01:07 - Early steps
02:47 - Why control?
05:20 - The move to the US
07:40 - The first journal paper
13:30 - What is backstepping?
17:08 - Grad school
25:10 - Stochastic stabilization
29:53 - The interest in PDEs
43:24 - Navier-Stokes equations
52:12 - Hyperbolic PDEs and traffic models
57:51 - Predictors for long delays
1:08:14 - Extremum seeking
1:27:14 - Safe control
1:36:30 - Interplay between machine learning and control
1:42:28 - Back to the roots: robust adaptive control
1:50:50 - On service
1:55:54 - Advice

Links
Miroslav's site: https://flyingv.ucsd.edu/
Tuning functions paper: https://tinyurl.com/yznv6r9r
P. Kokotović: https://tinyurl.com/mwmbm9yh
Separation and swapping: https://tinyurl.com/y4fre6t8
Adaptive nonlinear stabilizers: https://tinyurl.com/4a9wmmvx
KKK book: https://tinyurl.com/2kw2b4k6
Stochastic nonlinear stabilization: https://tinyurl.com/4td3537a
Follow-up with unknown covariance: https://tinyurl.com/4c4n7fd7
Boundary state feedbacks for PIDEs: https://tinyurl.com/4e9y4tdr
Boundary Control of PDEs: https://tinyurl.com/d8x38bmj
Stabilization of Navier–Stokes systems: https://tinyurl.com/4a8cbjem
Traffic congestion control: https://tinyurl.com/525jphs5
Delay compensation: https://tinyurl.com/5yz6uj9p
Nonlinear predictors for long delays: https://tinyurl.com/7wvce6vy
Stability of extremum seeking: https://tinyurl.com/mr5cvzd3
Nash equilibrium seeking: https://tinyurl.com/yeywrysn
Inverse optimal safety filters: https://tinyurl.com/9dkrpvkk
Neural operators for PDE control: https://tinyurl.com/5yynsp7v
Bode lecture: https://tinyurl.com/mp92cs9u
CSM article:

Support the show

Podcast info
Podcast website: https://www.incontrolpodcast.com/
Apple Podcasts: https://tinyurl.com/5n84j85j
Spotify: https://tinyurl.com/4rwztj3c
RSS: https://tinyurl.com/yc2fcv4y
Youtube: https://tinyurl.com/bdbvhsj6
Facebook: https://tinyurl.com/3z24yr43
Twitter: https://twitter.com/IncontrolP
Instagram: https://tinyurl.com/35cu4kr4

Acknowledgments and sponsors
This episode was supported by the National Centre of Competence in Research on «Dependable, ubiquitous automation» and the IFAC Activity fund. The podcast benefits from the help of an incredibly talented and passionate team. Special thanks to L. Seward, E. Cahard, F. Banis, F. Dörfler, J. Lygeros, ETH studio and mirrorlake. Music was composed by A New Element.
The co-founder of Muscle Nerds. He's traveled the world teaching for Charles Poliquin, and has taught thousands of personal trainers and coaches how to get better at their craft over the past 13 years, including Ali Gilbert, who many of you are fans of. We discussed PEDs, tren, high-frequency training, the U/L split, issues in bodybuilding, blood glucose and GDAs, and so much more.

Please share this episode if you liked it. To support the podcast, the best cost-free way is to subscribe, and please rate the podcast 5* wherever you find your podcasts. Thanks for watching.

To be part of any Q&A, follow trensparentpodcast or nylenayga on Instagram and watch for Q&A prompts on the story: https://www.instagram.com/trensparentpodcast/

Watch the podcast: https://www.youtube.com/channel/UCqgN2kieCEHwZ9M-QFBxfCg

Pharma TRT, GH analogs, peptides, IGF-1, var troche, fat-loss/hair-loss treatments, etc | HRT Men's Health Optimization: https://transcendcompany.com/patient-intake-form/?ls=Nyle+Nayga
Huge Supplements (Protein, Pre, Defend Cycle Support, Utilize GDA, Vital, Astragalus, Citrus Bergamot): https://www.hugesupplements.com/discount/NYLE
Support code 'NYLE' for 10% off; proceeds go towards upgrading content production
YoungLA Clothes: https://www.youngla.com/discount/nyle
Code 'NYLE' to support the podcast

Let's chat about the podcast:
Instagram: https://www.instagram.com/trensparentpodcast/
TikTok: https://www.tiktok.com/@transparentpodcast
Personalized Bodybuilding Program: https://www.nylenaygafitness.com

Timestamps:
00:00:00 - Teaser & Intro
00:02:58 - Crazy Ali
00:03:48 - What got you into bodybuilding and PEDs?
00:10:20 - Paul Barnett
00:11:44 - Varsity Blues
00:14:02 - What got you started with coaching?
00:27:21 - What's your worst tren experience?
00:31:39 - Roid rage
00:47:49 - Justin Harris
00:54:15 - Halotestin
01:05:59 - High-frequency training
01:18:08 - Isometric contraction
01:20:56 - Vince Gironda EMG technique
01:34:57 - Christopher Sommer
01:48:36 - Balancing intense training with recovery
01:55:58 - Training imbalances
02:05:53 - Supraspinatus
02:19:00 - Does a GDA help prevent insulin resistance in the offseason?
02:33:27 - Anti-diabetic medications
02:40:23 - Last question
Prof. Karthik Duraisamy is a Professor at the University of Michigan, the Director of the Michigan Institute for Computational Discovery and Engineering (MICDE), and the founder of the startup Geminus.AI. In this episode, we discuss AI4Science, with a particular focus on fluid dynamics and computational fluid dynamics. Prof. Duraisamy talks about the progress and challenges of using machine learning in turbulence modeling and the potential of surrogate models (both data-driven and physics-informed neural networks). He also explores the concept of foundational models for science and the role of data and physics in AI applications. The discussion highlights the importance of using machine learning as a tool in the scientific process and the potential benefits of large language models in scientific discovery. We also discuss the need for collaboration between academia, tech companies, and startups to achieve the vision of a new platform for scientific discovery. Prof. Duraisamy predicts that in the next few years there may be major advancements in foundation models for science; however, he cautions against unrealistic expectations and emphasizes the importance of understanding the limitations of AI.

Links:
Summer school tutorials: https://github.com/scifm/summer-school-2024 (scroll down for links to specific tutorials)
SciFM24 recordings: https://micde.umich.edu/news-events/annual-symposia/2024-symposium/
SciFM24 summary: https://drive.google.com/file/d/1eC2HJdpfyZZ42RaT9KakcuACEo4nqAsJ/view
Trillion Parameter Consortium: https://tpc.dev
Turbulence Modeling in the Age of Data: https://www.annualreviews.org/content/journals/10.1146/annurev-fluid-010518-040547
LinkedIn: https://www.linkedin.com/showcase/micde/

Chapters
00:00 Introduction
09:41 Turbulence Modeling and Machine Learning
21:30 Surrogate Models and Physics-Informed Neural Networks
28:42 Foundational Models for Science
35:23 The Power of Large Language Models
47:43 Tools for Foundation Models
48:39 Interfacing with Specialized Agents
53:31 The Importance of Collaboration
58:57 The Role of Agents and Solvers
01:08:26 Balancing AI and Existing Expertise
01:21:28 Predicting the Future of AI in Fluid Dynamics
01:23:18 Closing Gaps in Turbulence Modeling
01:25:42 Achieving Productivity Benefits with Existing Tools

Takeaways
- Machine learning is a valuable tool in the development of turbulence models and other scientific applications.
- Data-driven modeling can provide additional insights and improve the accuracy of scientific models.
- Physics-informed neural networks have potential in solving inverse problems but may not be as effective in solving complex PDEs.
- Foundational models for science can benefit from a combination of data-driven approaches and physics-based knowledge.
- Large language models have the potential to assist in scientific discovery and provide valuable insights in various scientific domains. Having a strong foundation in the domain of study is crucial before applying AI techniques.
- Collaboration between academia, tech companies, and startups is necessary to achieve the vision of a new platform for scientific discovery.
- Understanding the limitations of AI and managing expectations is important.
- AI can be a valuable tool for productivity gains and scientific assistance, but it will not replace human expertise.

Keywords
#computationalfluiddynamics #ailearning #largelanguagemodels #cfd #supercomputing #fluiddynamics
Outline
00:00 - Intro
01:17 - Early Years
04:17 - The “Scenic Route” to Control Theory
12:44 - Sampled Data Systems
22:26 - Linear Parameter Varying (LPV) Identification
28:07 - From Distributed Systems and PDEs ...
38:59 - ... to Distributed Control of Spatially Invariant Systems
49:02 - Taming the Navier-Stokes Equations
50:55 - Advice to Future Students
1:13:12 - Coherence in Large Scale Systems
1:32:28 - On Resistive Losses in Power Systems
1:39:00 - Cochlear Instabilities
1:50:40 - Stochasticity in Feedback Loops
2:00:00 - About Linear and Nonlinear Control
2:08:14 - How to Select a Research Problem
2:14:21 - Future of Control
2:22:06 - Outro

Links
- Paper on moment-invariants and object recognition: https://tinyurl.com/26tnks3z
- Bassam's PhD Thesis: https://tinyurl.com/3n2274dv
- Identification of linear parametrically varying systems: https://tinyurl.com/mryebhhy
- Distributed control of spatially invariant systems: https://tinyurl.com/rzszjch2
- Shift Operator: https://tinyurl.com/24fwehet
- Heat Equation: https://tinyurl.com/57rc6s7h
- Navier-Stokes Equations: https://tinyurl.com/45ktrd2e
- The impulse response of the Navier-Stokes equations: https://tinyurl.com/4vaausfn
- Non-Normal Matrix: https://tinyurl.com/58z4sph8
- Coherence in large-scale networks: https://tinyurl.com/ynm5cbay
- The Price of Synchrony: https://tinyurl.com/3svzancw
- Tinnitus: https://tinyurl.com/yc5hm549
- Cochlear Instabilities: https://tinyurl.com/fjespjbj
- Stochasticity in Feedback Loops: https://tinyurl.com/yc6aw9xt
- Koopman Operator: https://tinyurl.com/3jeu68p8
- Carleman Linearization: https://tinyurl.com/yckzrnfh
- Mamba Model: https://tinyurl.com/33h59jwj
- Spectral Factorization: https://

Support the Show.

Podcast info
Podcast website: https://www.incontrolpodcast.com/
Apple Podcasts: https://tinyurl.com/5n84j85j
Spotify: https://tinyurl.com/4rwztj3c
RSS: https://tinyurl.com/yc2fcv4y
Youtube: https://tinyurl.com/bdbvhsj6
Facebook: https://tinyurl.com/3z24yr43
Twitter: https://twitter.com/IncontrolP
Instagram: https://tinyurl.com/35cu4kr4

Acknowledgments and sponsors
This episode was supported by the National Centre of Competence in Research on «Dependable, ubiquitous automation» and the IFAC Activity fund. The podcast benefits from the help of an incredibly talented and passionate team. Special thanks to L. Seward, E. Cahard, F. Banis, F. Dörfler, J. Lygeros, ETH studio and mirrorlake. Music was composed by A New Element.
#228 April 4th, 2024 or 33oh10
http://loosescrewsed.com
Join us on discord! And check out the merch store! PROMO CODES https://discord.gg/3Vfap47Rea
Support us on Patreon: https://www.patreon.com/LooseScrewsED

Squad Update:
War in Yen-Yi and MCC 858. Election in Kolyawa.
We have a lot of overheating systems due to our recent run of expansions.
We have a handful of systems that need a little love.
Maitis is close to expanding…might as well push it over the top!
All details in the #standing-orders and/or the #loose-screws-factions channels of the Discord.

Galnet Update: https://community.elitedangerous.com/
Prototype Frame Shift Drive almost production ready, competitor sceptical (sic)
Titan Oya deploys a surprise fleet of Orthrus vessels

Dev news: Oya spawns 30 alerts
Many more than the usual/allowed number
Many are more than 10 LY from any control system
Some are in systems that were cleared yesterday
There was a Galnet update about it, but hours after the fact. (timing?)
In response, all AX ops turn to attack Oya now, in spite of high damage resistance
In spite of PFed messages, the rules no longer apply and efforts to clear systems are wasted
PDES advises its pilots to bombard the titan since clearing systems doesn't matter
Tissue samples already collected will be held on carriers to turn in after Oya is neutralized
This week, Louie and Sam are joined by Dr Adam Townsend on a quest to understand fluids, PDEs, and how we should be doing more recreational maths!
This month on Episode 47 of Discover CircRes, host Cynthia St. Hilaire highlights three original research articles featured in the March 31 issue of Circulation Research. We'll also provide an overview of the Compendium on Increased Risk of Cardiovascular Complications in Chronic Kidney Disease published in the April 14 issue. Finally, this episode features an interview with Dr Elizabeth Tarling and Dr Bethan Clifford from UCLA regarding their study, RNF130 Regulates LDLR Availability and Plasma LDL Cholesterol Levels.

Article highlights:
Shi, et al. LncRNAs Regulate SMC Phenotypic Transition
Chen, et al. Bilirubin Stabilizes Atherosclerotic Plaque
Subramaniam, et al. Mapping Non-Obvious cAMP Nanodomains by Proteomics
Compendium on Increased Risk of Cardiovascular Complications in Chronic Kidney Disease

Cindy St. Hilaire: Hi, and welcome to Discover CircRes, the podcast of the American Heart Association's Journal, Circulation Research. I'm your host, Dr Cindy St. Hilaire, from the Vascular Medicine Institute at the University of Pittsburgh, and today I'm going to share three articles selected from our March 31st issue of Circulation Research and give you a quick summary of our April 14th Compendium. I'm also excited to speak with Dr Elizabeth Tarling and Dr Bethan Clifford from UCLA regarding their study, RNF130 Regulates LDLR Availability and Plasma LDL Cholesterol Levels. So first the highlights. The first article we're going to discuss is Discovery of Trans-Acting Long Noncoding RNAs that Regulate Smooth Muscle Cell Phenotype. This article is coming from Stanford University and the laboratory of Dr Thomas Quertermous. Smooth muscle cells are the major cell type contributing to atherosclerotic plaques. And in plaque pathogenesis, the cells can undergo a phenotypic transition whereby a contractile smooth muscle cell can transdifferentiate into other cell types found within the plaque, such as macrophage-like cells, osteoblast-like cells and fibroblast-like cells.
These transitions are regulated by a network of genetic and epigenetic mechanisms, and these mechanisms govern the risk of disease. The involvement of long non-coding RNAs, or lncRNAs as they're called, has been increasingly identified in cardiovascular disease. However, smooth muscle cell lncRNAs have not been comprehensively characterized, and their regulatory role in the smooth muscle cell state transition is not thoroughly understood. To address this gap, Shi and colleagues created a discovery pipeline and applied it to deep, strand-specific RNA sequencing from human coronary artery smooth muscle cells that were stressed with different disease-related stimuli. Subsequently, the functional relevancy of a few novel lncRNAs was verified in vitro. From this pipeline, they identified over 4,500 known and over 13,000 previously unknown lncRNAs in human coronary artery smooth muscle cells. The genomic location of these long noncoding RNAs was enriched near coronary artery disease-related transcription factor and genetic loci. They were also found to be gene regulators of smooth muscle cell identity. Two novel lncRNAs, ZEB-interacting suppressor or ZIPPOR and TNS1-antisense or TNS1-AS2, were identified by the screen, and this group discovered that the coronary artery disease gene ZEB2, which is a transcription factor in the TGF beta signaling pathway, is a target for these lncRNAs. These data suggest a critical role for long noncoding RNAs in smooth muscle cell phenotypic transition and in human atherosclerotic disease. Cindy St. Hilaire: The second article I want to share is titled Destabilization of Atherosclerotic Plaque by Bilirubin Deficiency. This article is coming from the Heart Research Institute, and the corresponding author is Roland Stocker. The rupture of atherosclerotic plaque contributes significantly to cardiovascular disease.
Plasma concentrations of bilirubin, a byproduct of heme catabolism, are inversely associated with risk of cardiovascular disease, but the link between bilirubin and atherosclerosis is unknown. Chen et al addressed this gap by crossing the bilirubin knockout mouse to the atherosclerosis-prone APOE knockout mouse, and used the tandem stenosis model of plaque instability to address this question. Compared with their littermate controls, bilirubin-APOE double knockouts showed signs of increased systemic oxidative stress, endothelial dysfunction, as well as hyperlipidemia. And they had higher atherosclerotic plaque burden. Heme catabolism was increased in unstable plaques compared with stable plaques in both of these groups as well as in human coronary arteries. In mice, the bilirubin deletion selectively destabilized unstable plaques, and this was characterized by positive arterial remodeling and increased cap thinning, intraplaque hemorrhage, infiltration of neutrophils and MPO activity. Subsequent proteomics analysis confirmed that bilirubin deletion enhanced extracellular matrix degradation, recruitment and activation of neutrophils, and associated oxidative stress in the unstable plaque. Thus, bilirubin deficiency generates a pro-atherogenic phenotype and selectively enhances neutrophil-mediated inflammation and destabilization of unstable plaques, thereby providing a link between bilirubin and cardiovascular disease risk. Cindy St. Hilaire: The third article I want to share is titled Integrated Proteomics Unveils Regulation of Cardiac Myocyte Hypertrophic Growth by a Nuclear Cyclic AMP Nanodomain under the Control of PDE3A. This study is coming from the University of Oxford in the lab of Manuela Zaccolo. Cyclic AMP is a critically important secondary messenger downstream from a myriad of signaling receptors on the cell surface.
Signaling by cyclic AMP is organized in multiple distinct subcellular nanodomains, regulated by cyclic AMP-hydrolyzing phosphodiesterases, or PDEs. Cardiac beta adrenergic signaling has served as the prototypical system to elucidate this very complex cyclic AMP compartmentalization. Although studies in cardiac myocytes have provided an understanding of the location and the properties of a handful of these subcellular domains, an overview of the cellular landscape of the cyclic AMP nanodomains is missing. To understand these nanodomains, Subramaniam et al used an integrated phosphoproteomics approach that took advantage of the unique role that individual phosphodiesterases play in the control of local cyclic AMP. They combined this with network analysis to identify previously unrecognized cyclic AMP nanodomains associated with beta adrenergic stimulation. They found that indeed this integrated phosphoproteomics approach could successfully pinpoint the location of these signaling domains, and it provided crucial cues to determine the function of previously unknown cyclic AMP nanodomains. The group characterized one such cellular compartment in detail, and they showed that the phosphodiesterase PDE3A2 isoform operates in a nuclear nanodomain that involves SMAD4 and HDAC1. Inhibition of PDE3 resulted in increased HDAC1 phosphorylation, which led to an inhibition of its deacetylase activity, and thus derepression of gene transcription and cardiac myocyte hypertrophic growth. These findings reveal a unique mechanism that explains the negative long-term consequences observed in patients with heart failure treated with PDE3 inhibitors. Cindy St. Hilaire: The April 14th issue is our compendium on Increased Risk of Cardiovascular Complications in Chronic Kidney Disease. Dr Heidi Noels from the University of Aachen is our guest editor of the 11 articles in this issue.
Chronic kidney disease is defined by kidney damage or a reduced kidney filtration function. Chronic kidney disease is a highly prevalent condition, affecting over 13% of the population worldwide, and its progressive nature has devastating effects on patient health. At the end stage of kidney disease, patients depend on dialysis or kidney transplantation for survival. However, less than 1% of CKD patients will reach this end stage of chronic kidney disease. Instead, most of those with moderate to advanced chronic kidney disease will die prematurely, and most often they die from cardiovascular disease. And this highlights the extreme cardiovascular burden patients with CKD have. The titles of the articles in this compendium are The Cardio-Kidney Patient: Epidemiology, Clinical Characteristics, and Therapy by Nicholas Marx; The Innate Immunity System in Patients with Cardiovascular and Kidney Disease by Carmine Zoccali et al; NETs-Induced Thrombosis Impacts on Cardiovascular and Chronic Kidney Disease by Yvonne Doering et al; Accelerated Vascular Aging and Chronic Kidney Disease: The Potential for Novel Therapies by Peter Stenvinkel et al; Endothelial Cell Dysfunction and Increased Cardiovascular Risk in Patients with Chronic Kidney Disease by Heidi Noels et al; Cardiovascular Calcification Heterogeneity in Chronic Kidney Disease by Claudia Goettsch et al; Fibrosis in Pathobiology of Heart and Kidney: From Deep RNA Sequencing to Novel Molecular Targets by Raphael Kramann et al; Cardiac Metabolism in Heart Failure and Implications for Uremic Cardiomyopathy by P. Christian Schulze et al; Hypertension as Cardiovascular Risk Factor in Chronic Kidney Disease by Michael Burnier et al; Role of the Microbiome in Gut-Heart-Kidney Crosstalk by Griet Glorieux et al; and Use of Computational Ecosystems to Analyze the Kidney-Heart Crosstalk by Joachim Jankowski et al.
These reviews were written by leading investigators in the field, and the editors of Circulation Research hope that this comprehensive undertaking stimulates further research into the pathophysiology of kidney-heart crosstalk, and into comorbidities and inter-organ crosstalk in general. Cindy St. Hilaire: So for our interview portion of the episode I have with me Dr Elizabeth Tarling and Dr Bethan Clifford. And Dr Tarling is an associate professor in the Department of Medicine in cardiology at UCLA, and Dr Clifford is a postdoctoral fellow with the Tarling lab. And today we're going to be discussing their manuscript that's titled, RNF130 Regulates LDLR Availability and Plasma LDL Cholesterol Levels. So thank you both so much for joining me today. Elizabeth Tarling: Thank you for having us. Bethan Clifford: Yeah, thanks for having us. This is exciting. Cindy St. Hilaire: I guess first, Liz, how did you get into this line of research? I guess, before we get into that, I should disclose: Liz, we are friends and we've worked together in the ATVB Women's Leadership Committee. So full disclosure here. That being said, the editorial board votes on these articles, so it's not just me picking my friends. But it is great to have you here. So how did you enter this field, I guess, briefly? Elizabeth Tarling: Yeah, well briefly, I mean my training, right from doing my PhD in the United Kingdom at the University of Nottingham, has always been on lipid metabolism and lipoprotein biology, with an interest in liver and cardiovascular disease. So broadly we've always been interested in this area and this line of research. And my postdoctoral research was on atherosclerosis and lipoprotein metabolism. And this project came about through a number of different unique avenues, but really because we were looking for regulators of LDL biology and plasma LDL cholesterol; that's sort of where the interest of the lab lies. Cindy St. Hilaire: Excellent.
And Bethan, you came to UCLA from the UK. Was this a topic you were kind of dabbling in before, or was it all new for you? Bethan Clifford: It was actually all completely new for me. So yeah, I did my PhD at the same university as Liz, and when I started looking for postdocs, I was honestly pretty adamant that I wanted to steer clear of lipids and lipid research. And then it wasn't until I started interviewing and meeting people and I spoke to Liz that she really sort of convinced me of the excitement and the interest and all the possibilities of working with lipids, and well, now I won't go back, to be honest. Cindy St. Hilaire: And now here you are. Well- Bethan Clifford: Exactly. Cindy St. Hilaire: ... congrats on a wonderful study. So LDLR, the low-density lipoprotein receptor, is a major determinant of plasma LDL cholesterol levels. And hopefully most of us know and appreciate that that is really a major contributor and a major risk for the development of atherosclerosis and coronary artery disease. And I think one thing people may not really appreciate, which your study kind of introduces and talks about nicely, is the role of the liver, right? And the role of receptor-mediated endocytosis in regulating plasma cholesterol levels. And so before we kind of chat about the nitty-gritty of your study, could you just give us a brief summary of these key parts: plasma LDL, the LDL receptor, and where it goes in your body? Elizabeth Tarling: Yeah. So the liver expresses 70% to 80% of the body's LDL receptor. So it's the major determinant of plasma LDL cholesterol levels. And through groundbreaking work by Mike Brown and Joe Goldstein at the University of Texas, they really defined this receptor-mediated endocytosis by the liver and the LDL receptor by looking at patients with familial hypercholesterolemia.
So those patients have mutations in the LDL receptor, and they either express one functional copy or no functional copies of the LDL receptor, and they have very, very large changes in plasma LDL cholesterol. And they have severe increases in cardiovascular disease risk and occurrence, and diseases associated with elevated levels of cholesterol within the blood and within different tissues. And so that's how the liver really controls plasma LDL cholesterol: through this receptor-mediated endocytosis of the lipoprotein particle. Cindy St. Hilaire: There are several drugs now that can help regulate our cholesterol levels. So there's statins, which block that rate-limiting step of cholesterol biosynthesis, but there's also this new generation of therapies, the PCSK9 inhibitors. And can you just give us a summary or a quick rundown of what those key differences really are? What is the key mechanism of action that these therapies are going after, and is there room for more improvement? Bethan Clifford: Yeah, sure. So I mean, I think you've touched on something that's really key about the LDL receptor, which is that it's regulated at so many different levels. So we have medications available that target the production of cholesterol, and then, as you mentioned, this newer generation of things like PCSK9 inhibitors that sort of try and target LDL at the point of clearance from the plasma. And in response to your question of is there room for more regulation, I would say that given the sort of continual rise of cholesterol in the general population and the huge risks associated with elevated cholesterol, there's always capacity for more to improve that and sort of generally improve the health of the population. And what we sort of found particularly exciting about RNF130 is that it's a distinct pathway from any of these regulatory mechanisms. So it doesn't regulate the level of transcription, it doesn't regulate PCSK9.
Or in response to PCSK9, it's a completely independent pathway that could sort of improve or add to changes in cholesterol. Cindy St. Hilaire: So your study is focusing on the E3 ligase RNF130. What is an E3 ligase, and why was this particular one of interest to you? How did you come across it? Elizabeth Tarling: This project predates Bethan joining the lab. This is, I think, again for the listeners and those people in training, I think it's really important to note this project has been going in the lab for a number of years, and really... Bethan was the one who came in and really took charge and helped us round it out. But it wasn't a quick find or a quick story. It had a lot of nuances to it. But we were interested in looking for new regulators of LDL cholesterol, and actually through completely independent pathways we had found the RNF130 locus as being associated with LDL cholesterol in animals. And then it came out in a very specific genome-wide association study in the African American CARe study, the NHLBI CARe study. And so really when we started looking at it, we didn't even know what it was. Elizabeth Tarling: So we asked ourselves, well, what is this gene? What is this protein? And it's RNF, so that's RING finger-containing protein 130, and RING stands for "really interesting new gene". Somebody came up with the glorious name. But proteins that contain this RING domain are very characteristic, and they are E3 ubiquitin ligases. And so they conjugate the addition of ubiquitin to a target protein, and that signals for that protein to either be internalized and/or degraded through different degradative pathways within the cell. And so we didn't land on it because we were looking at E3 ligases; we really came at it from an LDL cholesterol perspective. And it was something that we hadn't worked on before, and the study sort of blossomed from there. Cindy St.
Hilaire: That's amazing, and a beautiful, but also, I'm sure, heartbreaking story, because these long projects are just... They're bears. So what does this RNF130 do to LDLR? What'd you guys find? Bethan Clifford: As Liz said, this was a long process, but one of the key features of RNF130 is that it structurally looks like a characteristic E3 ligase. So the first thing that Liz did, and that I then followed up on in the lab, was to see: is this E3 ligase ubiquitinating in vitro? And if it is going to ubiquitinate, what's it likely to regulate that might cause changes in plasma cholesterol that would explain these human genetic links that we saw published at the same time? And so because LDL cholesterol is predominantly regulated by the LDL receptor and its levels at the surface of the hepatocytes in the liver, the first question we wanted to answer is: does RNF130 interact in any way with that pathway? And I'm giving you the brief view here of the LDL receptor. We obviously tested lots of different receptors. We tested lots of different endocytic receptors and lipid regulators, but the LDL receptor is the one that we saw could be ubiquitinated by RNF130 in vitro. And so then we wanted to sort of go on from there and establish: okay, if this is an E3 ubiquitin ligase and it's regulating the LDL receptor, what does that mean in an animal context in terms of regulating LDL cholesterol? Cindy St. Hilaire: Yeah, and I guess we should also explain ubiquitination, in terms of this receptor, and I guess related to Goldstein and Brown and receptor-mediated endocytosis. What does that actually mean for the liver cell and the cholesterol in the LDL particle that is binding the receptor? Bethan Clifford: So yes, ubiquitination is a really common regulatory mechanism, actually, across all sorts of different cells, all sorts of different receptors and proteins. And basically what it does is it signals for degradation of a protein.
So a ubiquitin molecule is conjugated to its target, such as, in our case, the LDL receptor, and that ubiquitin tells the cell that this protein is ready for proteasomal degradation. And that's just one of the many things ubiquitination can do. It can also signal for a trafficking event, it can signal for a protein-protein interaction, but it's most commonly associated with proteasomal degradation. Cindy St. Hilaire: So in terms of... I guess I'm thinking in terms of PCSK9, right? So those drugs are stemming from observations in humans, right? There were humans with gain- and loss-of-function mutations, which caused either more or less of this LDL receptor internalization. How is this RNF130 pathway different from the PCSK9 activities? Elizabeth Tarling: Yeah, so PCSK9 is a secreted protein. It's made by the hepatocyte, and actually other cells in the body, and it's secreted and it binds to the LDL particle-LDL receptor complex and signals for its internalization and degradation in the proteasome. So this is not a ubiquitination event; this is a completely different trafficking event. And so with RNF130, actually what Bethan showed is that it directly ubiquitinates the LDL receptor itself, signaling for an internalization event and then ultimately degradation of the LDL receptor through a degradative pathway, which we also define in the study. So these are two unique mechanisms. And actually some key studies that we did in the paper were to modulate RNF130 in animals that do not have PCSK9. And so in that system, in the absence of PCSK9, you have a lot of LDL receptor in the liver that's internalizing cholesterol. What happens when you overexpress RNF130? Do you still regulate the LDL receptor? And you absolutely do. And so that again suggests that they're two distinct mechanisms and two distinct pathways. Cindy St.
Hilaire: That was one thing I really loved about your paper: every figure or section, the question that would pop up in my head, even ones that didn't pop into my head, were beautifully answered with some of these really nice animal models, which is never an easy thing, right? And so one of the things that you brought up was difficulty in making one of the animal models. And so I'm wondering if you could share a little bit about that challenge. I think one thing that we always tend to hide is that science is hard and a lot of what we do doesn't work. And I really think, especially for the trainees and really everyone out there, if we share these things more, it's better. So what was one of the most challenging things in this study? And I guess I'm thinking about that floxed animal. Elizabeth Tarling: Yeah, so I'll speak a bit about that and then I'll let Bethan address it, because she was really the one on the ground doing a lot of the struggling. But again, we actually weren't going to include this information in the paper. And upon discussion, and actually prompted by the reviewers of the paper and some of the questions that they asked us, we realized, you know what? It's actually really important to show this, and show that this happens, and that there are ways around it. And so the first story is, before Bethan even arrived in the lab, we had purchased embryonic stem cells that were knockout-first, conditional-ready. And so this is a knockout strategy in which the exon of interest is flanked with loxP sites so that you can create a floxed animal, but also so you can create a whole-body knockout just by the insertion of this knockout-first cassette. Elizabeth Tarling: And so we got those mice actually in the first year of Bethan joining the lab. We finally got the chimeric mice and we were able to start breeding those mice. And at the same time we tried to generate our floxed animals so that we could move on to do tissue-specific studies.
And Bethan can talk about the pain associated with this. But over two years of breeding, we never got the right genotypes from the different crosses that you need to do to generate the floxed animal. And it was actually in discussions with Bethan that we decided we needed to go back. We needed to go back to those ES cells that we had purchased five years ago and figure out if all of the elements that the quality-control step had told us were in place were actually present. And so Bethan went back and sequenced the whole locus and the cassette to figure out what pieces were present, and we found that one of the essential loxP sites that's required for every single cross from the initial animal was absent, and therefore we could actually never make the mouse we wanted to make. And so that's a lesson for people going down that route and making these tools that we need in the lab to answer these questions: despite paying extra money and getting all of the QCs that you can get before you receive the ES cells, we should have gone back and done our own housekeeping, and a long journey told us, when we went back, that we didn't have what we thought we had at the beginning. And that was a real sticking point, as Bethan can- Cindy St. Hilaire: Yeah. And so, you know, you're not alone. In my very first postdoc, I worked with a mouse that they had also bought and were guaranteed was a knockout, and it was not. And it is a painful lesson, but it is critical to... You get over it. So Bethan, maybe you can also tell us a little bit about the other things you tried next. You pivoted, and you pivoted beautifully, because all the models you used I thought were quite elegant in terms of asking exactly the question you wanted to ask in the right cells. So can you maybe explain some of the in vivo models you used for this study? Bethan Clifford: Sure, there are definitely a lot.
So I mean, I think Liz encapsulated the trouble we had with the knockout really succinctly, but actually I want to take this moment to shout out another postdoc in the Tarling lab, Kelsey Jarrett, who was really instrumental in the pivot to a different model. So for the knockouts, when we established we didn't have exactly what we thought we did, and then, to compound that, we also weren't getting Mendelian ratios breeding this whole-body knockout, we wanted to look at a more transient knockout model. And that's where Kelsey really stepped in and led the way, and she generated AAV-CRISPR for us to target RNF130 specifically in the liver. And that had the added beauty of, one, not requiring breeding, getting over this hurdle of the knockout being somewhat detrimental to breeding. But it also allowed us to ask what RNF130 is doing specifically in the liver, where the liver regulates the LDL receptor and LDL cholesterol. And so that was one of the key models that really, really helped get this paper over the finish line. But we did a whole barrage of experiments, as you've seen. We wanted to make sure... One of the key facets of the Tarling lab is whenever you do anything, no matter what you show Liz, it will always be, "Okay, you showed it to me one way, now show it to me a different way." Can you get the same result coming at it from different ways? And if you can't, why is that? What is the regulation behind that? And so that's really what the paper is doing: asking the same question in as many ways as we can to accurately and appropriately probe what RNF130 does to the LDL receptor. So we tried gain-of-function studies with adenovirus overexpression. We tried transient knockdown with antisense oligonucleotides, and then we did, as I said, the AAV-CRISPR knockdown with the help of Kelsey, and our whole-body knockout.
And then we also repeated some of these studies, such as the adenovirus and the ASO, in specific genetic backgrounds. So in the absence of PCSK9, can we still regulate the LDL receptor? And then we also, just to really confirm this, asked: in the absence of the LDL receptor, do we see a difference? And the answer is no, because this effect was really dependent on that LDL receptor being present. So there was a big combination. Cindy St. Hilaire: It was really nice, really a beautiful stepwise progression of how to solidly answer this question. But almost all of what you did was in mice. And so what is the genetic evidence for relevancy in humans? Can you discuss a little bit about those databases that you then went to, to investigate whether this is relevant in humans? Bethan Clifford: I think Liz might be better off answering that question. Elizabeth Tarling: And I think this pivots on what Bethan was saying. So when we had struggles in the lab, it was a team environment and a collaboration between people in the lab that allowed us to make that leap and make those next experiments possible, to then really answer that question. And being able to include the antisense oligonucleotides required a collaboration with industry. We were very lucky to have a longstanding collaboration with Ionis, who provided the antisense oligonucleotides. And for the human genetics side of things, that also was a collaboration, with Marcus Seldin, who was a former postdoc with Jake Lusis and is now a PI at UC Irvine. And what he helped us do is dive into those summary-level databases and ask, from that initial study in the NHLBI CARe population, do we see associations of RNF130 expression in humans with LDL cholesterol and with cardiovascular outcomes? And so one database which I would recommend everybody use, it's publicly available, is the STARNET database. It's in the paper and the website is there. And that allowed us to search for RNF130.
Elizabeth Tarling: And what it does is ask how RNF130 expression in different tissues is associated with cardiometabolic outcomes, actually in CAD cases and controls, so people with and without heart disease. And we found that expression of RNF130 in the liver was extremely strongly correlated with the occurrence of cardiovascular disease in people with CAD, so in cases versus controls. And then we were also able to find many other polymorphisms in the RNF130 locus that were associated with LDL cholesterol in multiple different studies. And I think the other message from this paper is this: unlike PCSK9 and unlike the LDL receptor itself, which are single-gene mutations that cause cardiovascular disease, there are many sub-genome-wide-significant loci that contribute to this multifactorial disease, which is extremely complex. And I think RNF130 falls within that bracket: loci that are just on the borderline of being genome-wide significant still play significant biological roles in regulating these processes. And they don't come up as a single-gene hit for a disease, but combinatorially they are associated with increased risk of disease, and they have a molecular mechanism that's associated with the disease. And so that's what Marcus helped us do in terms of the human genetics: really understand that and get down to that level of data. Cindy St. Hilaire: Yeah. Yeah, it really makes you want to go back and look at those. Everyone always focuses on that really high peak in those analyses, but what are all those other ones above the noise, right? So it's really important. Elizabeth Tarling: I think it's really hard to do that. I think that's one where... Again, it comes down to team science, and the group of people that we brought together allowed us to ask that molecular question about how that signal was associated with the phenotype. I think by ourselves we wouldn't have been able to do it. Cindy St. Hilaire: Yeah.
So your antisense oligonucleotide experiments, they were really nice. They showed, I think it was a four-week therapy, that when you injected them, expression of RNF130 went down by 90%. I think cholesterol in the animals was lowered by 50 points or so. Is this a viable next option? And I guess related to that, cholesterol's extremely important for everything, right? Cell membrane integrity, our neurons, all sorts of things. Is it possible with something that is perhaps really as powerful as this to make cholesterol too low? Elizabeth Tarling: I think what we know from PCSK9 gain- and loss-of-function mutations is that you can drop your plasma cholesterol to very low levels and still be okay, because there are people walking around with mutations that do that. I think RNF130 is a little different in that it's clearly regulatory in a homeostatic function: it's ubiquitously expressed and it has this role in the liver to regulate LDL receptor availability, but there are no people with homozygous loss-of-function mutations walking around, which tells us something else about how important it is, potentially in other tissues and in other pathways. And we've only just begun to uncover what those roles might be. So I think that as a therapy it has great potential. We need to do a lot more studies to move from rodent models into more preclinical models. But I do think that the human data tell us that it's really important in other places too. And so yeah, we need to think about how best it might work as a therapy, whether it's combinatorial, how it's dosed. Those are the types of things that we need to think about. Cindy St. Hilaire: Yeah, it's really exciting. Do you know, are there other protein targets of RNF130? Is that related to my next question of what is next? Elizabeth Tarling: I mean, so I should point out, Bethan unfortunately left the lab last year for a position at Amgen, where she's working on obesity and metabolic disease.
But before she left, she did two very, very cool experiments searching for new or additional targets of RNF130, starting in the liver, but hopefully we'll move those into other tissues. And so she did gain of function of RNF130 versus the loss of function we have of RNF130, and she did specific mass spec analysis of proteins that are ubiquitinated in those different conditions. And by overlaying those data sets, we're hoping to carve out new additional targets of RNF130. And there are some, and they're in interesting pathways, which we have yet to completely test, but definitely there are additional pathways, at least when you overexpress and reduce expression. Now, whether they turn out to be bona fide in vivo targets that are biologically meaningful is the next step. Cindy St. Hilaire: Yeah. Well, I'm sure with your very rigorous approach you are going to find out, and hopefully we'll see it here in the future. Dr Elizabeth Tarling and Dr Bethan Clifford, thank you so much for joining me today. I really enjoyed this paper. It's a beautiful study. I think it's a beautiful example, especially for trainees, of thoroughly and rigorously going through and trying to test your hypothesis. So thanks again. Elizabeth Tarling: Thank you. Bethan Clifford: Thank you very much. Cindy St. Hilaire: That's it for the highlights from the March 31st and April 14th issues of Circulation Research. Thank you for listening. Please check out the Circulation Research Facebook page and follow us on Twitter and Instagram with the handle @CircRes and #DiscoverCircRes. Thank you to our guests, Dr Liz Tarling and Dr Bethan Clifford. This podcast is produced by Ishara Ratnayaka, edited by Melissa Stoner, and supported by the editorial team of Circulation Research. I'm your host, Dr Cindy St. Hilaire, and this is Discover CircRes, your on-the-go source for the most exciting discoveries in basic cardiovascular research.
This program is copyright of the American Heart Association 2022. The opinions expressed by speakers in this podcast are their own, and not necessarily those of the editors or of the American Heart Association. For more information, visit ahajournals.org.
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2022.12.07.519498v1?rss=1 Authors: Moravveji, S., Doyon, N., Duchesne, S., Mashreghi, J. Abstract: Alzheimer's disease is a complex, multi-factorial and multi-parametric neurodegenerative etiology. Mathematical models can help understand such a complex problem by providing a way to explore and conceptualize principles, merging biological knowledge with experimental data into a model amenable to simulation and external validation, all without the need for extensive clinical trials. We performed a scoping review of mathematical models of AD with a search strategy applied to the PubMed database, which yielded 846 entries. After applying our exclusion criteria, only 17 studies remained, from which we extracted data, focusing on three aspects of mathematical modeling: how authors addressed continuous time, how models were solved, and how the high dimensionality and non-linearity of models were managed. Most articles modeled AD at the cellular range of the disease process, operating on a short time scale (e.g., minutes, hours), i.e., the micro view (12/17); the rest considered regional or brain-level processes with longer timescales (e.g., years, decades), the macro view. Most papers were concerned primarily with amyloid beta (Aβ; n = 8), a few modeled both Aβ and tau proteins (n = 3), and some considered more than these two factors in the model (n = 6). Models used partial differential equations (PDEs; n = 3), ordinary differential equations (ODEs; n = 7), or both PDEs and ODEs (n = 3); some didn't specify the mathematical formalism (n = 4). Sensitivity analyses were performed in only a small number of papers (4/17). Overall, we found that only two studies could be considered valid in terms of parameters and conclusions, and two more were partially valid. The majority (n = 13) were either invalid or provided insufficient information to ascertain their status.
While mathematical models are powerful and useful tools for the study of AD, closer attention to reporting is necessary to gauge the quality of published studies and to replicate or build on their contributions. Copyright belongs to the original authors. Visit the link for more info. Podcast created by Paper Player, LLC
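As a minimal, hypothetical illustration of the ODE-based "micro view" modeling the review describes, here is a toy one-compartment model of amyloid beta with constant production and first-order clearance, dA/dt = p - c*A, integrated by forward Euler. The function name and all parameter values are illustrative assumptions, not drawn from any of the reviewed studies.

```python
# Toy micro-view model: amyloid-beta concentration A with constant
# production rate p and first-order clearance rate c:
#     dA/dt = p - c * A
# The analytic steady state is A* = p / c.

def simulate_abeta(p=1.0, c=0.1, a0=0.0, dt=0.01, t_end=100.0):
    """Forward-Euler integration of dA/dt = p - c*A, returning the trajectory."""
    a, t = a0, 0.0
    trajectory = [a]
    while t < t_end:
        a += dt * (p - c * a)  # one Euler step
        t += dt
        trajectory.append(a)
    return trajectory

# With p=1.0 and c=0.1 the trajectory should approach A* = p/c = 10.0.
traj = simulate_abeta()
```

Even this toy model makes the review's micro/macro distinction concrete: its single time constant 1/c fixes the scale (here, tens of time units) on which the dynamics play out, and sensitivity to p and c can be probed directly by rerunning the simulation.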
Niger's leaders and their public and private partners gathered in Paris on December 5 and 6 to present the 2022-2026 Economic and Social Development Plan (PDES). The €29.6 billion plan aims to consolidate the foundations of the country's development and radically transform its economy. "Our ambition is to bring the poverty rate down from 43% in 2022 to 35% in 2026..." For Mohamed Bazoum, the president of Niger, the PDES 2022-2026 is a vital necessity if his country is to meet the major challenge of poverty. Facing rapid population growth, an agricultural sector still largely dependent on the climate, and also security issues, Niger intends to invest at once in human capital, governance, and the transformation of its economy. Rabiou Abdou, Niger's Minister of Planning, is the architect of the PDES. "Transforming the economy means creating industries based on the raw materials we produce, that is, our forestry, agricultural, and pastoral output. These industries will not only generate added value, which will feed public finances through taxation, but above all create jobs for the young people we are training by investing in human capital." A contribution from the African Development Bank. Niger has costed its plan at €29.6 billion. The state has committed to providing €13.4 billion from its own resources. It hopes for €10 billion from its institutional partners and €6 billion from the private sector. The African Development Bank (AfDB) has already taken out its checkbook. Marie-Laure Akin-Olugbade, vice president of the AfDB: "We propose to commit 1,500 billion CFA francs. These are new commitments in the area of infrastructure, such as transport and energy, in particular renewable energy."
Decision-making support for investors. France, the European Union, the World Bank, the Islamic Development Bank, and many other bilateral and multilateral partners answered the call. Niger is also counting on the private sector. The PDES offers investors a series of major turnkey industrial, energy, and mining projects. "For these projects, we have carried out detailed feasibility studies, in every dimension: economic, financial, environmental, and societal. So, with these feasibility studies available, any investor can look at the various projects and say right away, 'if I put in this much, here is what I gain.' This work was funded by the World Bank and its subsidiary, the SFI (IFC). We are therefore providing investors with decision-support tools," explains Rabiou Abdou. These feasibility studies were entrusted to a major international audit firm. Niamey thus intends to put every chance on its side.
Today we talk about two important equations studied in partial differential equations (PDEs). The intuition that we've cultivated over the past semester has proven to be quite useful when visualizing solutions. Enjoy the episode :) Instagram: @math.physics.podcast Tiktok: @math.physics.podcast Email: math.physics.podcast@gmail.com Twitter: @MathPhysPod
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Specializing in Problems We Don't Understand, published by johnswentworth on the AI Alignment Forum. Most problems can be separated pretty cleanly into two categories: things we basically understand, and things we basically don't understand. Some things we basically understand: building bridges and skyscrapers, treating and preventing infections, satellites and GPS, cars and ships, oil wells and gas pipelines and power plants, cell networks and databases and websites. Some things we basically don't understand: building fusion power plants, treating and preventing cancer, high-temperature superconductors, programmable contracts, genetic engineering, fluctuations in the value of money, biological and artificial neural networks. Problems we basically understand may have lots of moving parts and require many people with many specialties, but they're generally problems which can be reliably solved by throwing resources at them. There usually isn't much uncertainty about whether the problem will be solved at all, or a high risk of unknown unknowns, or a need for foundational research in order to move forward. Problems we basically don't understand are the opposite: they are research problems, problems which likely require a whole new paradigm. In agency terms: problems we basically understand are typically solved via adaptation-execution rather than goal-optimization. Problems we basically don't understand are exactly those for which existing adaptations fail. Main claim underlying this post: it is possible to specialize in problems-we-basically-don't-understand, as a category in its own right, in a way which generalizes across fields. Problems we do understand mainly require relatively-specialized knowledge and techniques adapted to solving particular problems.
But problems we don't understand mainly require general-purpose skills of empiricism, noticing patterns and bottlenecks, model-building, and design principles. Existing specialized knowledge and techniques don't suffice - after all, if the existing specialized knowledge and techniques were sufficient to reliably solve the problem, then it wouldn't be a problem-we-basically-don't-understand in the first place. So, how would one go about specializing in problems we basically don't understand? This post will mostly talk about how to choose what to formally study, and how to study it, in order to specialize in problems we don't understand. Specialize in Things Which Generalize Suppose existing models and techniques for hot plasmas don't suffice for fusion power. A paradigm shift is likely necessary. So, insofar as we want to learn skills which will give us an advantage (relative to existing hot plasma specialists) in finding the new paradigm, those skills need to come from some other area - they need to generalize from their original context to the field of hot plasmas. We want skills which generalize well. Unfortunately, a lot of topics which are advertised as “very general” don't actually add much value on most problems in practice. A lot of pure math is like this - think abstract algebra or topology. Yes, they can be applied all over the place, but in practice the things they say are usually either irrelevant or easily noticed by some other path. (Though of course there are exceptions.) Telling us things we would have figured out anyway doesn't add much value. There are skills and knowledge which do generalize well. Within technical subjects, think probability and information theory, programming and algorithms, dynamical systems and control theory, optimization and microeconomics, linear algebra and numerical analysis.
Systems and synthetic biology generalize well within biology, mechanics and electrodynamics are necessary for fermi estimates in most physical sciences, continuum mechanics and PDEs are useful for a wide ...
Paul holds a PhD in Computer Science from Massachusetts Institute of Technology (MIT). In his dissertation he developed novel global optimization techniques for previously unaddressed problem classes, including parabolic partial differential equations (PDEs), to improve the optical efficiency of solar energy. He has served as a Principal Investigator on several DARPA research program teams advancing the state of the field of Natural Language Processing (NLP). He has held the role of Research Director in Artificial Intelligence (AI), Machine Learning (ML)/Data Science and NLP at companies ranging from small rapidly growing startups such as New Knowledge/Yonder to large companies such as Dun & Bradstreet. ————————————————————————————— Connect with me here: ✉️ My weekly email newsletter: jousef.substack.com
Canadian complaints; the never ending election; if I only had a brain; Uber, Lyft & DoorDash lie to buy their law, make millions; EU hits Amazon with antitrust charge; AI cracks PDEs; Eric Schmidt moves to Cyprus; fast radio bursts; Silk Road's bitcoin; Furry Season Landscaping; Carfax for cops; Ransomware Facebook ads; virtual Hall of Fame; podcast news; streaming devices; Apple's event; Loser.com; the finances of C19 safe concerts; feedback loop; goodbye, Alex. Show notes at https://gog.show/483
Dr Carolyn Lam: Welcome to Circulation on the Run, your weekly podcast summary and backstage pass to the journal and its editors. I'm Dr Carolyn Lam, associate editor from the National Heart Centre and Duke-National University of Singapore. Dr Greg Hundley: I'm Greg Hundley, associate editor from the VCU Pauley Heart Center in Richmond, Virginia. Dr Carolyn Lam: Greg, today's feature paper is really special on a number of levels. First, it's a research letter, and secondly, it's actually basic science. Now, this tells you it's got to be really special. Well, I'll just give you a hint: it talks about a new therapy for stroke. I'm going to leave it at that and leave you guessing, because you've got to hang on as we tell you about the rest of the issue and then listen to the feature discussion. Now, the first original paper here I want to describe is a basic paper focusing on PDE4B in heart failure. Dr Greg Hundley: All right, Carolyn, I'm not even going to let you start to quiz me on this. Can you tell me what in the world is PDE4B? Dr Carolyn Lam: All right. Phosphodiesterases, or PDEs, represent a highly diverse superfamily of enzymes, among which PDE3 and PDE4 are the main phosphodiesterases that degrade cyclic AMP with high affinity in the heart. The cyclic AMP-hydrolyzing phosphodiesterase 4B, or PDE4B, is the key negative regulator of cardiac beta-adrenergic receptor stimulation. PDE4B deficiency leads to abnormal calcium handling, and PDE4B is decreased in pressure-overload hypertrophy, suggesting that increasing PDE4B in the heart may be beneficial in heart failure. These authors, led by Dr Vandecasteele from Inserm, tested this hypothesis in elegant experiments involving both human cardiac tissues and transgenic mouse lines. Dr Greg Hundley: Carolyn, that was just a wonderful explanation and I really learned about these phosphodiesterases. Now, tell me, what did they find in their study?
Dr Carolyn Lam: The cyclic AMP-hydrolyzing enzyme PDE4B was decreased in human failing hearts. Cardiac overexpression of PDE4B in mice, resulting in a 15-fold increase in cyclic AMP hydrolysis, decreased cardiac contraction and protected against the cardiotoxic effects of chronic beta-adrenergic stimulation, whereas transgenic mice with a 50-fold increase in cardiac cyclic AMP hydrolysis underwent maladaptive remodeling. Furthermore, cardiac PDE4B gene transfer with serotype 9 adeno-associated viruses resulted in a significantly lower increase in cardiac PDE4B and protected against chronic catecholamine stimulation and transverse aortic constriction without depressing basal cardiac function. These results overall suggest that a moderate increase in cardiac PDE4B is beneficial to counteract the detrimental effects of excessive sympathetic system activation in heart failure, and an increase in PDE4B in the human heart could be achieved by gene therapy with adeno-associated viruses or by using recently developed small molecules with PDE4-activating properties. Dr Greg Hundley: Wow, Carolyn. Very interesting. I mean, perhaps this'll work its way into heart failure management. Well, my first study to describe involves the comparative efficacy and safety of oral P2Y12 inhibitors in acute coronary syndromes. It's a meta-analysis of 52,816 patients from 12 randomized trials, and it comes to us from Professor Eliano Navarese from Nicolaus Copernicus University. All right, Carolyn, here's your quiz. Have you wondered which P2Y12 inhibitor is optimal for reducing the risk of adverse cardiovascular events? Dr Carolyn Lam: Oh, that's an easy one. Of course I've wondered, but you're going to tell us the results. Dr Greg Hundley: It's getting harder and harder to trip you up, Carolyn. Very clever, okay.
This study aimed to evaluate current evidence comparing the efficacy and safety profiles of prasugrel, ticagrelor and clopidogrel in acute coronary syndrome by meta-analysis of 12 randomized clinical trials, again involving those 52,816 patients with ACS. Dr Carolyn Lam: Wow. What did they find, Greg? Dr Greg Hundley: Compared with clopidogrel, ticagrelor significantly reduced cardiovascular mortality and all-cause mortality, whereas there was no statistically significant mortality reduction with prasugrel. Dr Greg Hundley: Next, compared with each other, there were no significant differences in mortality with prasugrel versus ticagrelor. In addition, compared with clopidogrel, prasugrel reduced myocardial infarction, whereas ticagrelor showed no risk reduction. Dr Greg Hundley: Now, stent thrombosis risk was significantly reduced by both ticagrelor and prasugrel versus clopidogrel. Compared with clopidogrel, both prasugrel and ticagrelor significantly increased major bleeding. There was no significant difference between prasugrel and ticagrelor for all outcomes explored. Dr Carolyn Lam: Summarize that for us. Dr Greg Hundley: Okay, Carolyn. Prasugrel and ticagrelor reduced ischemic events but increased bleeding in comparison to clopidogrel. A significant mortality reduction was observed with ticagrelor only. There was no efficacy and safety difference between prasugrel and ticagrelor. So a really nice summary evaluating these P2Y12 inhibitors. Dr Carolyn Lam: Indeed. Question for you, Greg: what are the prevalence of deep venous thrombosis, or DVT, and its risk factors, prognosis and potential prophylaxis strategies in hospitalized patients with COVID-19? That's what the next paper is about. It is a single-center observational study of 143 hospitalized patients with confirmed COVID-19, and it comes from co-corresponding authors Doctors Xi and Hu from Union Hospital in Wuhan, China, Dr Zhang from Beijing Chaoyang, and Dr Ge from St.
Christopher Hospital for Children in Philadelphia, United States. They found that DVT was present in a high percentage of these patients, 46% of the 143, and was associated with adverse outcomes. A CURB-65 score of 3 to 5, a Padua prediction score of 4 or more, and a D-dimer greater than 1 microgram per milliliter in combination predicted DVT with a sensitivity of more than 88.5%. Thromboprophylaxis was associated with lower DVT in a subgroup of patients with high Padua prediction scores. Dr Greg Hundley: Now, what does this mean for all of us in this era of COVID-19? Dr Carolyn Lam: This suggests that DVT is common in hospitalized patients with COVID-19, so ultrasound screening of high-risk patients, as I mentioned before, may be indicated. Furthermore, prevention of DVT with low-molecular-weight heparins in high-risk patients, such as those with high Padua prediction scores, may reduce DVT in hospitalized patients with COVID-19. Of course more work needs to be done, but a very interesting paper. Dr Greg Hundley: What a fantastic description. Well, my next paper is more from the world of basic science and involves phosphodiesterase 3A in arterial hypertension; it comes to us from Dr Enno Klussmann from the Max Delbruck Center for Molecular Medicine. So Carolyn, autosomal dominant hypertension with brachydactyly clinically resembles salt-resistant essential hypertension and causes death by stroke before the age of 50 years. In this study, the authors used genetic mapping, sequencing, transgenic technology, CRISPR-Cas9-based gene editing, immunoblotting, and fluorescence resonance energy transfer to identify new patients, perform extensive animal phenotyping, and explore new signaling pathways related to hypertension with brachydactyly. Dr Carolyn Lam: Wow. So what did they find, Greg?
Dr Greg Hundley: Well, Carolyn, the authors described a novel mutation within a 15-bp region of the PDE3A gene and defined this segment as a mutational hotspot in hypertension with brachydactyly. The mutations cause an increase in enzyme activity. A CRISPR-Cas9-generated rat model with a 9-bp deletion within the hotspot, analogous to the human deletion, recapitulated hypertension with brachydactyly. In mice, transgenic overexpression of mutant PDE3A in smooth muscle cells confirmed that mutant PDE3A causes hypertension. The aberrant signaling found in these models was associated with an increase in vascular smooth muscle cell proliferation and changes in vessel morphology and function. Dr Carolyn Lam: Gosh, so what are the clinical implications, Greg? Dr Greg Hundley: The mutated PDE3A gene drives mechanisms that increase peripheral vascular resistance and cause hypertension. These authors present two new animal models that serve to elucidate the underlying mechanisms further, and their findings could facilitate the search for new antihypertensive treatments. Dr Carolyn Lam: Very nice, Greg. Well, the next paper is actually one we've already discussed in our special COVID-19 edition that aired on May 22, 2020. That's the paper from Drs Poissy and Susen from the University of Lille and Inserm, who reported a case series of COVID-19 patients with pulmonary embolism at their institution, Lille University Hospital. So, please, everybody, remember to tune in to that as a refresher. Also in today's journal, the issue of COVID-19 coagulopathy in venous thromboembolism is further discussed in an editorial by Dr Alex Spyropoulos and Dr Jeffrey Weitz. Let me tell you a bit more about other papers in this week's issue. There are letters to the editor from Dr Mueller and from Dr Gulati, all about the paper on incidence, trends, and outcomes of type 2 myocardial infarction in a community cohort.
There's a letter from Dr Siontis on the blood pressure and myocardial infarction paradox. Dr Carolyn Lam: Does hypertension exert a protective effect in type 2 MI? In the ECG challenge, Dr Di Cosola talks about the high, the low, and the narrow QRS in a peripartum cardiomyopathy. There's an On My Mind piece by Dr Kohli entitled "Surfing the Waves of the COVID-19 Pandemic as a Cardiovascular Clinician", and a perspective piece by Dr Albert titled "The Heart of the Matter: Unmasking and Addressing COVID-19's Toll on Diverse Populations". In the Paths to Discovery series, Dr Rutherford talks about serial innovation to bring transformative precision medicines to people with serious diseases; this is a conversation with Dr Jeffrey Leiden. Dr Greg Hundley: Very nice. Carolyn, I've got a couple of other papers to discuss. Similar to your paper on DVT, Professor Lin Cai has a research letter on the extremely high incidence of lower-extremity deep venous thrombosis in 48 patients with severe COVID-19 from Wuhan, China. In an On My Mind piece, Dr Anum Saeed from the University of Pittsburgh discusses reinforcing cardiology training during a pandemic; it's an open letter to our leaders. Our own Bridget Kuhn has a piece entitled "COVID-19 Leads to Major Changes for Cardiologists in Training". And then finally, Dr Stephen Archer from Queen's University provides a nice perspective on differentiating COVID-19 pneumonia from ARDS and high-altitude pulmonary edema, and the therapeutic implications. And now, Carolyn, how about we get on to that feature discussion, one of the unusual times when we emphasize an important point in a research letter? Dr Carolyn Lam: You bet, Greg. Today's feature discussion is, I think, one of the most impactful basic science papers we have, and that is why we're discussing it. I am so pleased to have the first author, Dr Luca Liberale from the University of Zurich, as well as Dr Peipei Ping, associate editor, from UCLA. So welcome, both.
Luca, I really need your help here. Can you please explain what your experiment was and your main findings? Dr Luca Liberale: We are really happy that we could set up an experimental design which has some translational value. So, differently from many other setups involving transient middle cerebral artery occlusion, which is among the most used models for ischemic stroke in basic science, in this case the treatment is given post-ischemically. So the mice received the neutralizing antibody against IL-1α only after the ischemic insult. And we specifically designed it this way to keep the translationally relevant side, as I said before, trying to mirror the case of a patient who has a stroke, goes to the emergency department, and is eligible for revascularization therapy. And together with this revascularization therapy, be it tPA or whatever, he also receives this kind of anti-IL-1α treatment. And another point of good translational relevance, we thought, is that a neutralizing IL-1α antibody is already available on the market and is in many phase 3 trials. So we thought this is ready to go, ready for the translation from the bench to the bedside, as we used to say. Dr Carolyn Lam: It's just so interesting, because when we think about ischemic stroke, you know, we think about thrombolysis as practically the only thing we can do, and forgive me, I'm not a neurologist here, but this is so unique, to go with an anti-inflammatory mechanism. Now, when you say that this neutralizing antibody is currently in use, do you mean in cancer, in other diseases? Dr Luca Liberale: Mainly it's cancer, but also other dermatological diseases. It's not only cancer, but, oh yes, definitely, cancer is one of the major fields of its application. Dr Carolyn Lam: Wow. So with that very interesting background, could you tell us about the experiment and what you found?
Dr Luca Liberale: What we found is that after inducing ischemia in the animals for 45 minutes, we let them reperfuse for 48 hours, during which the animals are under the treatment. So they receive a bolus of anti-IL-1α immediately at the time of reperfusion, that is, when we withdraw the filament from the carotid artery, and they are then allowed to survive for 48 hours. So they are free to move in the cage, to eat and drink. After 48 hours, we assess the neurological deficit, and we sacrifice the animals to assess the stroke size using the quite common TTC staining. And what we found is that the treatment with the higher dose (because we used two doses, and we could see a dose response) reduced the stroke size by 36% as compared to treatment with the isotype control. And this went together with a significant reduction in the neurological impairment. So it's not only an experimental reduction; it's also physiologically relevant for the animals. Dr Carolyn Lam: That really is incredible, and the way you managed to convey such a lot of data in a research letter is also remarkable. So, to the audience: you have to pick this up. It's a succinct read, with just one central figure that tells the whole story, and you're about to hear from Dr Ping. Dear Peipei, could you tell us what the significance of this paper is, and maybe some of the discussions that occurred behind the doors, so to speak, among the editors? Dr Peipei Ping: We were super impressed with the fundamental message of the submitted report. Carolyn, as you are fully aware, most ischemic studies, be it in a heart or in a brain model, often select mechanisms that must be activated before the event of ischemia to induce a protective effect, a neuroprotective effect in stroke or a cardioprotective effect in the heart.
So, as an associate editor who has spent her entire 30-year career in this area of study, we are often fascinated by the innocence, or the naiveté, of the basic scientists in this area, because you would have to plan an ischemia in the patient, knowing when it is to happen, and then, before it happens, activate all these beautiful signaling mechanisms you have generated to prevent that ischemia. So the search for a possible mechanistic understanding of a post-ischemic-event rescue mechanism has been going on for decades, and it's very, very challenging, Carolyn. Dr Peipei Ping: The beauty of the study is that it utilized an existing human antibody inhibitor already in clinical trials, an interleukin-1 alpha antibody, so the reagent is already available, and then examined very carefully, in a post-ischemic fashion, how relevant that agent is within a time window reasonable for rescuing ischemic injury. You can already tell from Luca's introduction that the results are profound, and it has stimulated many discussions in the field. It's very clinically relevant, even though it's still at the translational stage. So we saw this as a beautiful representation of how clinicians and scientists are capable of not only bringing something from the bench to the clinic, or from the clinic to the bench: this is something that comes full circle. It went from the clinic, where the reagent was used and created for something else, to the bench, to understand the mechanistic insight, with a beautiful animal model of human stroke to test it, and then it is taken to the clinic again. Dr Carolyn Lam: Goodness, Peipei, I love the way you put that. I actually didn't see it that way, Luca, till you put it that way. I do have a couple of questions for Luca, though.
I understand you made it very clear in your paper that the human monoclonal antibody is in clinical use, but in this experiment you had to use the rodent equivalent, because the human antibody doesn't block rodent IL-1α, which is very reasonable. But then it raises the question: how closely does this rodent model recapitulate thrombotic ischemia, or a stroke, in humans? I mean, what do you think? Dr Luca Liberale: Well, this is our usual approach; it is a model that we use in our Center for Molecular Cardiology here in Zurich, and it has been used in the specific group of Professor Ameche for many years, and it is usually quite well accepted as a model. The timeframe of 45 minutes of ischemia and 48 hours of reperfusion ought to mirror quite well the acute phase of an acute ischemic stroke, which is actually where we think the inflammatory pathways can play the major role. Also, I mean, every one of us knows that the recent anti-inflammatory trials confirmed that reducing inflammation and the inflammatory pathways is good, but it can also be harmful. Dr Luca Liberale: So if we can use an approach which is limited in time, really close to the acute phase, really around the acute event, well, maybe this can be quite useful and quite translationally relevant, whereas prolonged inactivation of such pathways can have harmful results. So the balance between the benefits and the harms is not that clear and needs to be quite well addressed. Dr Carolyn Lam: And that actually brings me to the next question. You know, the word translational has been mentioned quite a number of times here. So can you give us a sneak peek at the translational plans that your team may have? What are the next steps? Dr Luca Liberale: The next step now is back to the company. So our basic findings are here.
They will be published soon, and now it's all about the clinical scientists and how they want to implement these basic findings in the clinic. Dr Carolyn Lam: So, target engagement and mechanistic information as well. Peipei, could I just give you the last word, if you don't mind, with maybe a bit of a cheeky question: what would you have loved to see in this paper, or in a subsequent paper, that offers a step closer to translation? Dr Peipei Ping: I think this study has shown most of the necessary components of a basic science research paper. I think the next level, closer to translation, as Luca has already alluded to, has to do with both efficacy studies and safety studies, and those actually would need to be done in the clinic, beyond the mouse model. I think it's a fantastic model to offer these lines of information; ischemia-wise, I think it's very strong, and the translational value is very high, and that was the predominant reason we voted to accept the paper. As you know, the acceptance rate of Circulation is very, very low, as our bar is very high. Dr Carolyn Lam: Very nice. Congratulations, Luca, and thank you so much, Peipei, for your great comments. Now, listeners, you heard it first here on Circulation on the Run. Thank you for joining us today. Dr Greg Hundley: This program is copyright of the American Heart Association, 2020.
Hello, welcome, and g'day, we are so happy to have you join us again for our latest episode. When we say episode we don't mean Buck having a rant and all that; just thought we should clarify that up front. Now we have a bumper episode this week with lots of stuff for you to enjoy. First up we have the Professor bringing us news about Atari's plan to open a chain of hotels. They have announced that the hotels will be themed and will focus on developing e-sports functionality. But we wonder: is it going to be presented in the old-style timber/brown laminate like we used to see on the old Atari? Guaranteed it won't be the first option for the honeymoon market; although, any ladies who think that would be cool, let us know that we are wrong once again. Following this the Professor has the ESA's environmentally unfriendly stance against the right to repair and modify your machines/consoles. BOO HISS!!! How lame can these idiots be? I mean, don't they realise that advancements in technology have come about by people looking under the hood and figuring out ways to do things better? Next up we have DJ talking about IDW losing money and what they are planning to do about it. Apparently they have announced they are struggling in various markets with the material they are presently distributing. Of course the solution would be to develop new material and breathe new life into the company instead of just pumping out more of the same thing. But, no, they are hoping that their partnership with Netflix will save them. Then DJ brings us news on the critics' reviews of Birds of Prey for us to laugh at and make fun of. Seriously, does anyone actually take them seriously anymore? Buck suggests that if Rotten Tomatoes gives a film a bad rating it must be good, and if they rate it good then stay away. What do you think, does that sound like a good approach to use? Then Buck has the latest on the Spitzer Space Telescope, which has just retired.
After being on mission since 2003, it has provided heaps of data for scientists, with the first images and data being of the Tarantula Nebula. In homage to its origins, the last images and data delivered by Spitzer were also of the Tarantula Nebula. Which honestly is a hotbed of activity, with the explosion of a supernova that was first discovered in 1987 (interestingly called 1987A) still sweeping through space and creating some awesome images. Also there is a collection of 40 massive stars, at least 50 times the size of our sun, all in a tiny area together. If this isn't enough for you, Buck also brings us news of the latest evolution of robot tanks. Before anyone gets all Terminator Judgement Day on us, they are not all armed, and those that are still need a person involved to fire on a target. But for those who are looking for an alternative to a trailer that hitches to your car, we might have a solution for you. We hope you enjoy this bumper episode this week. As normal we have the usual shout outs, remembrances, birthdays, and special events. Let us know what you think, make requests if you have a topic you are interested in having us dig into, or just drop us a line and we will give you a special shout out. As always, remember to take care of yourselves, look out for each other and stay hydrated.
Atari's new idea: Gaming Hotels
- https://www.gamesindustry.biz/articles/2020-01-27-atari-to-open-gaming-hotels-in-at-least-eight-us-cities
ESA's stance on Right To Repair
- https://www.youtube.com/watch?v=KAVp1WVq-1Q&feature=youtu.be
IDW loses money…a lot of money
- https://www.bleedingcool.com/2020/01/25/idw-loses-17-1-million-in-2019-projects-profitability-in-2021/
Early reviews of Birds of Prey have arrived…
- https://boundingintocomics.com/2020/01/27/early-audience-reviews-for-margot-robbies-birds-of-prey-arrive/
The Tarantula Nebula
- https://phys.org/news/2020-01-tarantula-nebula-web-mystery-spitzer.html
Robot Tank…now with no firepower
- https://www.bbc.com/news/business-50387954
Games Played
Professor – Steamworld
- http://imageform.se/game/steamworld-dig/
Rating: 3.5/5
Buck – Thunder Run: War of Clans
- https://www.facebook.com/ThunderRunWarOfClans/?epa=SEARCH_BOX
Rating: 3/5
DJ – DNP
Other topics discussed
Niagara Falls' real-life Mario Kart track
- https://dailyhive.com/toronto/niagara-falls-real-life-mario-kart-track-opening-june-2018?fbclid=IwAR32kb9QxDvcX-YzQGY0G9GGRDlYfkpqRU2fH2Kee96JtqNXo_r5YrDl1Jw
Nintendo wins legal battle against one of Tokyo's real-life 'Mario Kart' tours
- https://www.engadget.com/2020/01/29/nintendo-tokyo-mario-kart-legal-win/
Final Fantasy VII cafes
- https://soranews24.com/2020/01/25/final-fantasy-vii-cafes-are-opening-in-tokyo-and-osaka-to-celebrate-legendary-games-remake/
OutRun (arcade game released by Sega in September 1986)
- https://en.wikipedia.org/wiki/Out_Run
Mario Bros. (platform game published and developed for arcades by Nintendo in 1983)
- https://en.wikipedia.org/wiki/Mario_Bros.
The Verge PC Build (Reuploaded) by Ext_Nation
- https://www.youtube.com/watch?v=UZ4viTwfFxA
Louis Rossmann - Why I don't use Apple products
- https://www.youtube.com/watch?v=sfrYOWlKJ_g
JerryRigEverything (American YouTuber and tech reviewer.
He has gained fame and popularity for his technology reviews including smartphones, watches, apps and much more.)
- https://www.youtube.com/user/JerryRigEverything
- https://youtube.fandom.com/wiki/JerryRigEverything
Every Console In One Box - The Origin Big O by Unbox Therapy
- https://www.youtube.com/watch?v=ErQQc6cUSTA
Star Trek (comic book series by IDW Publishing, based on the Star Trek science fiction entertainment franchise created by Gene Roddenberry)
- https://en.wikipedia.org/wiki/Star_Trek_(IDW_Publishing)
Captain Marvel (2019 American superhero film based on the Marvel Comics character Carol Danvers)
- https://en.wikipedia.org/wiki/Captain_Marvel_(film)
Punisher (a fictional character appearing in American comic books published by Marvel Comics. The Punisher made his first appearance in The Amazing Spider-Man #129)
- https://en.wikipedia.org/wiki/Punisher
Spider-Man 3 (2007 American superhero film based on the fictional Marvel Comics character Spider-Man)
- https://en.wikipedia.org/wiki/Spider-Man_3
Dexter (American television crime drama mystery series that aired on Showtime from October 1, 2006, to September 22, 2013)
- https://en.wikipedia.org/wiki/Dexter_(TV_series)
Snakes on a Plane (at one point, the film was given the title Pacific Air Flight 121, only to have it changed back to the working title at Samuel L. Jackson's request)
- https://en.wikipedia.org/wiki/Snakes_on_a_Plane#Production
Black hole (a region of spacetime exhibiting gravitational acceleration so strong that nothing, no particles or even electromagnetic radiation such as light, can escape from it)
- https://en.wikipedia.org/wiki/Black_hole
The Amazing World of Gumball (British-American surreal comedy animated television series created by Ben Bocquelet for Cartoon Network)
- https://en.wikipedia.org/wiki/The_Amazing_World_of_Gumball
Cartoon Network - The Amazing World of Gumball | The Meaning Of Life
- https://www.youtube.com/watch?v=oZspV3ser1Q
Tarantula (The spider originally bearing the
name "tarantula" was Lycosa tarantula, a species of wolf spider native to Mediterranean Europe. The name derived from that of the southern Italian town of Taranto.)
- https://en.wikipedia.org/wiki/Tarantula#Etymology
RoboCop (1987) - It's Only a Glitch (Dick's boardroom demonstration of the Enforcement Droid 209 goes awry when the droid opens fire on Kinney)
- https://www.youtube.com/watch?v=TstteJ1eIZg
Law Abiding Citizen (2009) - Weaponized bomb disposal robot kills people
- https://www.youtube.com/watch?v=SrK-UojUTNw
Remote controlled weapon station (A remote controlled weapon station (RCWS), or remote weapon station (RWS), also known as a remote weapon system, is a remotely operated weaponized system often equipped with a fire-control system for light and medium-caliber weapons, which can be installed on ground combat vehicles or sea- and air-based combat platforms.)
- https://en.wikipedia.org/wiki/Remote_controlled_weapon_station
Common Remotely Operated Weapon Station (CROWS) (a remote weapon station system used by the US military on its armored vehicles and ships. It allows weapon operators to engage targets without leaving the protection of their vehicle.)
- https://en.wikipedia.org/wiki/CROWS
In Flanders Fields by John McCrae
- https://www.poetryfoundation.org/poems/47380/in-flanders-fields
Dulce Et Decorum Est (poem written by Wilfred Owen during World War I, and published posthumously in 1920)
- https://en.wikipedia.org/wiki/Dulce_et_Decorum_est
Podcast Nine and Three-Quarters (TNC Podcast)
- https://thatsnotcanon.com/nineandthreequarterspodcast
Shout Outs
- 26 January 2020 – Kobe Bryant died
- https://en.wikipedia.org/wiki/Death_of_Kobe_Bryant
On January 26, 2020, a Sikorsky S-76B helicopter crashed in Calabasas, California, around 30 miles (48 kilometers) northwest of Los Angeles, en route to Camarillo, California.
It was carrying former basketball player Kobe Bryant, his daughter Gianna, six family friends including baseball coach John Altobelli and his wife and daughter, and the pilot. All on board were killed.
- 26 January 2020 – Australian of the Year 2020 awarded to eye surgeon James Muecke
- https://www.abc.net.au/news/2020-01-25/australian-of-year-awards-2020-announced-in-canberra/11901006
Dr Muecke was honoured in a ceremony in Canberra. In 2000 he co-founded Vision Myanmar at the South Australian Institute of Ophthalmology and later co-founded Sight For All, a social impact organisation aiming to create a world where everyone can see. More recently Dr Muecke's work has specifically focused on preventing the leading cause of blindness in adults: type 2 diabetes. When accepting the award from Prime Minister Scott Morrison, Dr Muecke said he viewed type 2 diabetes, something that impacts nearly one in every 10 Australians, as a "looming catastrophe for our health system".
- 28 January 1958 – Lego patents its first bricks
- https://www.nationalgeographic.org/thisday/jan28/lego-patents-its-first-bricks/
The Lego Group, with headquarters in Billund, Denmark, patented its design for interlocking plastic bricks. The design was so stable that those bricks can still be used with Lego sets created today. Today, the Lego Group is worth more than a billion dollars.
Remembrances
- 26 January 2020 – Louis Nirenberg
- https://en.wikipedia.org/wiki/Louis_Nirenberg
Canadian-American mathematician, considered one of the most outstanding mathematicians of the 20th century. He made fundamental contributions to linear and nonlinear partial differential equations (PDEs) and their application to complex analysis and geometry.
His contributions include the Gagliardo–Nirenberg interpolation inequality, which is important in the solution of the elliptic partial differential equations that arise in many areas of mathematics, and the formalization of bounded mean oscillation, known as John–Nirenberg space, which is used to study the behaviour of both elastic materials and the games of chance known as martingales. He died at the age of 94 in New York.
- 28 January 1918 – Lieutenant Colonel John McCrae
- https://en.wikipedia.org/wiki/John_McCrae
Canadian poet, physician, author, artist and soldier during World War I, and a surgeon during the Second Battle of Ypres in Belgium. He is best known for writing the famous war memorial poem "In Flanders Fields", which appeared anonymously in Punch on December 8, 1915, though in the index to that year McCrae was named as the author. The verses swiftly became one of the most popular poems of the war, used in countless fund-raising campaigns and frequently translated. He died of pneumonia with "extensive pneumococcus meningitis" at the age of 45 in Boulogne-sur-Mer, near the end of the war.
- 28 January 1996 – Jerry Siegel
- https://en.wikipedia.org/wiki/Jerry_Siegel
Jerome Siegel, who also used pseudonyms including Joe Carter and Jerry Ess, was an American comic book writer. His most famous creation was Superman, which he created in collaboration with his friend Joe Shuster. Siegel and Shuster had been developing the Superman story and character since 1933, hoping to sell it as a syndicated newspaper comic strip. But after years of fruitless soliciting to the syndicates, Siegel and Shuster agreed to publish Superman in a comic book. In March 1938, they sold all rights to Superman to the comic-book publisher Detective Comics, Inc., a forerunner of DC, for $130 ($2,361 when adjusted for inflation). Siegel and Shuster later regretted their decision to sell Superman after he became an astonishing success.
DC Comics now owned the character and reaped the royalties. Nevertheless, DC Comics retained Siegel and Shuster as the principal writer and artist for the Superman comics, and they were well paid because they were popular with the readers. He died from a heart attack at the age of 81 in Los Angeles, California.
Famous Birthdays
- 28 January 1611 – Johannes Hevelius
- https://en.wikipedia.org/wiki/Johannes_Hevelius
A councillor and mayor of Danzig (Gdańsk), Kingdom of Poland. As an astronomer, he gained a reputation as "the founder of lunar topography", and described ten new constellations, seven of which are still used by astronomers. He discovered four comets, in 1652, 1661, 1672 and 1677. These discoveries led to his thesis that such bodies revolve around the Sun in parabolic paths. He was born in Danzig, Pomeranian Voivodeship.
- 28 January 1912 – Jackson Pollock
- https://en.wikipedia.org/wiki/Jackson_Pollock
American painter and a major figure in the abstract expressionist movement. He was widely noticed for his technique of pouring or splashing liquid household paint onto a horizontal surface ('drip technique'), enabling him to view and paint his canvases from all angles. It was also called 'action painting', since he used the force of his whole body to paint, often in a frenetic dancing style. This extreme form of abstraction divided the critics: some praised the immediacy and fluency of the creation, while others derided the random effects. He was born in Cody, Wyoming.
- 28 January 1966 – Seiji Mizushima
- https://en.wikipedia.org/wiki/Seiji_Mizushima
Japanese anime storyboard artist and director. He has directed such series as Shaman King, Fullmetal Alchemist, Mobile Suit Gundam 00, Un-Go, and Concrete Revolutio. He also directed anime films such as Fullmetal Alchemist: Conqueror of Shamballa and Mobile Suit Gundam 00 the Movie: Awakening of the Trailblazer.
He was born in Fuchū, Tokyo.
Events of Interest
- 28 January 1887 – The first digging work on the Eiffel Tower started
- https://www.toureiffel.paris/en/the-monument/history
Construction took two years, two months, and five days after the digging began. The pieces of iron are connected by 2.5 million rivets, put in place by between 150 and 300 workers who were employed to build the structure. The structure may be named after Gustave Eiffel, but it was actually his senior engineers, Maurice Koechlin and Emile Nouguier, who designed the building.
- 28 January 1938 – The World Land Speed Record on a public road is broken by Rudolf Caracciola in the Mercedes-Benz W125 Rekordwagen at a speed of 432.7 kilometres per hour (268.9 mph)
- https://en.wikipedia.org/wiki/Mercedes-Benz_W125_Rekordwagen
The streamlined car was derived from the 1937 open-wheel race car Mercedes-Benz W125 Formel-Rennwagen, of which a streamlined version was also raced at the non-championship Avusrennen in Berlin. The main difference from the Grand Prix race car, which had to adhere to the 750 kg (1,653 lb) limit, was the engine. While the GP car had the 8-cylinder inline M125, which was rather tall, the record car was fitted with a V12 engine that was lower, which reduced drag. It remained the fastest officially timed speed on a public road until it was broken on 5 November 2017 by Koenigsegg in an Agera RS driven by Niklas Lilja, achieving 445.6 km/h (276.9 mph) on a closed highway in Nevada.
- 28 January 1896 – Walter Arnold of East Peckham, Kent, becomes the first person to be convicted of speeding. He was fined one shilling, plus costs, for speeding at 8 mph (13 km/h), thereby exceeding the contemporary speed limit of 2 mph (3.2 km/h).
- https://www.historic-uk.com/HistoryUK/HistoryofBritain/Walter-Arnold-Worlds-First-Speeding-Ticket/
The London Daily News detailed the four counts, also known as "informations", on which Walter Arnold faced charges at Tunbridge Wells court.
Arnold's vehicle was described several times in the newspaper court report as a "horseless carriage", and the case clearly raised some interesting philosophical as well as legal points for the bench. The first count, which reads oddly now, was for using a "locomotive without a horse"; the next for having fewer than three persons "in charge of the same", indicating the enduring influence of horse-drawn and steam locomotion when it came to legislating the new vehicles. Next came the actual speeding charge, for driving at more than two miles per hour, and finally, a charge for not having his name and address on the vehicle. The case may have had an influence on the changes to legislation shortly afterwards: the fearsome machines no longer needed a minimum of three people to control them.
Intro
Artist – Goblins from Mars
Song Title – Super Mario - Overworld Theme (GFM Trap Remix)
Song Link - https://www.youtube.com/watch?v=-GNMe6kF0j0&index=4&list=PLHmTsVREU3Ar1AJWkimkl6Pux3R5PB-QJ
Follow us on
Facebook
- Page - https://www.facebook.com/NerdsAmalgamated/
- Group - https://www.facebook.com/groups/440485136816406/
Twitter - https://twitter.com/NAmalgamated
Spotify - https://open.spotify.com/show/6Nux69rftdBeeEXwD8GXrS
iTunes - https://itunes.apple.com/au/podcast/top-shelf-nerds/id1347661094
RSS - http://www.thatsnotcanonproductions.com/topshelfnerdspodcast?format=rss
Instagram - https://www.instagram.com/nerds_amalgamated/
General Enquiries Email - Nerds.Amalgamated@gmail.com
Rate & Review us on Podchaser - https://www.podchaser.com/podcasts/nerds-amalgamated-623195
In this episode Gudrun talks with her new colleague Xian Liao. In November 2018 Xian was appointed as Junior Professor (with tenure track) at the KIT Faculty of Mathematics. She belongs to the Institute of Analysis and works in the group Nonlinear Partial Differential Equations. She is very much interested in dispersive partial differential equations. These equations model, e.g., the behaviour of waves, which makes the topic central to the CRC 1173 "Wave phenomena" at the faculty. Her mathematical interest has always been to better understand the solutions of partial differential equations, but she arrived at dispersive equations through several steps in her career. Originally she studied inhomogeneous incompressible fluids. This can, for example, mean that the fluid is a mixture of materials with different viscosities. If we have a look at the Navier-Stokes equations for materials like water or oil, one main assumption therein is that the viscosity is a material constant. Nevertheless, the equations modelling their flows are already nonlinear, and there are a few serious open questions. Studying flows of inhomogeneous materials brings in further difficulties, since more and more complex nonlinearities occur in the equations. It is necessary to develop a frame in which one can characterise the central properties of the solutions and the flow. It turned out that, for example, finding and working with quantities which remain conserved in the dynamics of the process is a good guiding line - even if the physical meaning of the conserved quantity is not always clear. Coming from classical theory, we know that it makes a lot of sense to look at the conservation of mass, energy and momentum, which translate into conserved quantities expressed as combinations of velocity, its derivatives, pressure and density. Pressure and density are not independent in these simplified models, but they are independent in the models Xian studies.
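For readers who want to see the system behind this discussion: a standard way to write the inhomogeneous incompressible Navier-Stokes equations (one common formulation; the precise functional setting and whether the viscosity depends on the density vary between the papers listed below) is

```latex
% Inhomogeneous incompressible Navier-Stokes system:
% density \rho(t,x), velocity u(t,x), pressure \Pi(t,x);
% the viscosity \mu(\rho) may depend on the density.
\begin{aligned}
  \partial_t \rho + \operatorname{div}(\rho u) &= 0, \\
  \partial_t(\rho u) + \operatorname{div}(\rho u \otimes u)
    - \operatorname{div}\!\bigl(\mu(\rho)\,\mathbb{D}u\bigr) + \nabla \Pi &= 0, \\
  \operatorname{div} u &= 0,
\end{aligned}
\qquad
\mathbb{D}u = \tfrac{1}{2}\bigl(\nabla u + (\nabla u)^{T}\bigr).
```

The constant-viscosity case μ(ρ) ≡ μ recovers the classical homogeneous setting mentioned above; letting μ depend on ρ is one way the inhomogeneity of the material enters the equations, and the pressure Π here is an independent unknown rather than a function of the density.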
In the complex world of inhomogeneous equations we lose the direct dictionary between physics and mathematics, but we carry over the knowledge that scale invariance and conservation are central properties of the model. It is interesting to characterise how the complex system develops with a change of properties - to have a simple idea, whether it develops more in the direction of fast flowing air or of slowly flowing, almost solid material. One number which helps to see what types of waves one has to expect is the Mach number. It helps to separate sound waves from fluid waves. A mathematical/physical question then is to understand the process of letting the Mach number go to zero in the model. It is not that complicated to make this work in the formulae. The hard work lies in proving that the solutions to the family of systems of PDEs with lower and lower Mach number really tend to the solutions of the derived limit system. For example, in order to measure whether solutions are similar to each other (i.e. they get nearer and nearer to each other) one needs to find the norms which measure the right properties. Xian was an undergraduate and Master's student at Nanjing University in China from 2004 to 2009, where she was working with Prof. Huicheng Yin on partial differential equations. She succeeded in getting a scholarship from the China Scholarship Council and did her PhD within the laboratory LAMA (with Prof. Raphaël Danchin, on the zero-Mach number system). She was a member of the University Paris-Est but followed many Master's courses in the programs of other Parisian universities as well. In 2013 she spent 8 months at the Charles University in Prague as a postdoc within the research project MORE. There she collaborated with Prof. Eduard Feireisl and Prof. Josef Málek on better understanding non-Newtonian fluids. After that period she returned to China and worked for two years at the Academy of Mathematics & Systems Science as a postdoc within the research center NCMIS. With Prof. 
Ping Zhang she was working on density patch problems. Before her appointment here in Karlsruhe she had already returned to Europe: from 2016 to 2018 she was a postdoc at the University of Bonn within the CRC 1060, working mainly with Prof. Herbert Koch on Gross-Pitaevskii equations - a special topic within dispersive equations. References Short Interview with the CRC 1173 Wave phenomena X. Liao, R. Danchin: On the wellposedness of the full low-Mach number limit system in general Besov spaces. Commun. Contemp. Math.: 14(3), 1250022, 2012. X. Liao: A global existence result for a zero Mach number system. J. Math. Fluid Mech.: 16(1), 77-103, 2014. X. Liao, E. Feireisl and J. Málek: Global weak solutions to a class of non-Newtonian compressible fluids. Math. Methods Appl. Sci.: 38(16), 3482-3494, 2015. X. Liao: On the strong solutions of the nonhomogeneous incompressible Navier-Stokes equations in a thin domain. Differential Integral Equations: 29, 167-182, 2016. X. Liao, P. Zhang: Global regularities of 2-D density patches for viscous inhomogeneous incompressible flow with general density: high regularity case, 2016.
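The Mach number that drives the zero-Mach limit discussed above is simply the ratio of flow speed to the speed of sound in the medium. A minimal numerical sketch, where the ideal-gas sound-speed formula and the values for dry air are illustrative assumptions, not taken from the conversation:

```python
import math

def mach_number(flow_speed, sound_speed):
    """Mach number: ratio of flow speed to the speed of sound."""
    return flow_speed / sound_speed

def sound_speed_ideal_gas(gamma, r_specific, temperature):
    """Speed of sound c = sqrt(gamma * R * T) for an ideal gas."""
    return math.sqrt(gamma * r_specific * temperature)

# Dry air at 20 degrees Celsius (illustrative values)
c = sound_speed_ideal_gas(gamma=1.4, r_specific=287.05, temperature=293.15)
print(round(c, 1))                      # roughly 343 m/s
print(round(mach_number(10.0, c), 4))   # a 10 m/s flow is deep in the low-Mach regime
```

In the low-Mach regime (Mach number much smaller than 1), sound waves travel much faster than the fluid moves, which is exactly the scale separation that the limit system exploits.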
Jonathan Fröhlich submitted his Master's thesis on "Heterogeneous Multiscale Methods for Poroelastic Media" in July 2017. It was supervised by Professor Christian Wieners at our institute. Flow and transport phenomena in so-called porous media play an important role in a broad range of fields and applications, such as agriculture, biomedicine, construction geology and petroleum engineering. Looking at soil, for example, one finds that the sand, rock or gravel is not a homogeneous mass with homogeneous material properties, but consists of countless parts of different sizes and with varying physical properties. The high heterogeneity of such media leads to great complexity, which the porous-medium model treats in a strongly simplified way. This raises the question: How can conventional models for porous media be generalised so that the complete composition is not required, yet more of the structure is taken into account? The thesis and our conversation focus on a special case, namely single-phase flow through poroelastic media. These are characterised by the interaction between the stress on the intrinsic structure and the flow of the fluid. Concretely, a change in the fluid pressure forces a stress in the material, which accelerates and moves it. One example is blood flow through vessels. As it flows, the blood constantly changes the concrete geometry of the elastically deformable vessels, and at the same time the vessels change the flow direction and speed of the blood. This process is modelled with certain partial differential equations (PDEs). Jonathan used the linearised model introduced by Biot (1941) and extended it to a quasi-static consolidation model for soil mechanics. 
Such problems are characterised by the enormous size of the domain under consideration, for example several kilometres of river bed. This contrasts with the very small-scale geometric information, such as grain sizes and shapes, which influences the system. The standard finite element method for the numerical solution of this system of PDEs will only deliver good results if the mesh resolution is extremely high, which would lead to infeasible computation times. Therefore an idea of E and Engquist is used, the so-called Finite Element Heterogeneous Multiscale Method (FE-HMM) from 2003. It decouples the heterogeneous part and resolves it with a microscopically modelled problem. The macroscopic problem then only needs a much coarser mesh and uses the information from the microscopic part as data. Mathematically, the theory uses a weak formulation via bilinear forms and seeks solutions in Sobolev spaces. The appropriate numerical method for the macroscopic problem is a mixed finite element method for a perturbed saddle point problem. Hence, for existence and uniqueness of weak solutions, certain conditions similar to the classical LBB condition (also called inf-sup condition) must be satisfied. The microscopic problem to be solved is elliptic and is derived using classical homogenisation theory, where additional conditions must be fulfilled to ensure two-scale convergence. Literature and further information Maurice A. Biot: General Theory of Three‐Dimensional Consolidation, Journal of Applied Physics 12, 155, 1941. E. Weinan, Björn Engquist: The Heterogeneous Multiscale Methods, Communications in Mathematical Sciences, Volume 1, Number 1, 87-132, 2003. M. Sahimi: Flow and Transport in Porous Media and Fractured Rock, Wiley, Weinheim, 2011. 
Assyr Abdulle et al.: The heterogeneous multiscale method, Acta Numerica, Volume 21, pp. 1-87, 2012. Podcasts J. Fröhlich: Getriebeauswahl, Conversation with G. Thäter in the Modellansatz Podcast, Episode 028, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2014. L.L.X. Augusto: Filters, Conversation with G. Thäter in the Modellansatz Podcast, Episode 112, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016.
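The core idea of the FE-HMM described above - replace fine-scale coefficients by effective data computed from small cell problems - can be illustrated in one dimension, where the cell problem has a closed-form answer: the effective coefficient of a periodic medium is the harmonic mean of the coefficient over one period. A minimal sketch, with an illustrative two-phase coefficient chosen for the example:

```python
def effective_coefficient(a, n=10_000):
    """Effective 1-D diffusion coefficient over one period [0, 1]:
    a_eff = 1 / mean(1/a(y)), sampled at midpoints of n subintervals."""
    total = sum(1.0 / a((i + 0.5) / n) for i in range(n))
    return n / total

# Illustrative two-phase cell: coefficient 1 on one half, 100 on the other
a = lambda y: 1.0 if y < 0.5 else 100.0

a_eff = effective_coefficient(a)
arithmetic = 0.5 * 1.0 + 0.5 * 100.0
print(round(a_eff, 4))  # harmonic mean 2/(1/1 + 1/100), about 1.9802
print(arithmetic)       # the naive arithmetic mean, 50.5, badly overestimates
```

In higher dimensions the cell problem no longer has a closed form, which is exactly why the FE-HMM solves it numerically on the micro scale and feeds the result into the coarse macroscopic mesh.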
Second instalment of this Mind The Game show. Second instalment, a little less nervousness given my podcasting experience, and once again a friend... and a long-time one, for this back-and-forth! The game? Devil May Cry, the Capcom original that practically created a subgenre - or, for the biggest fans, a genre of its own - with more than 4 instalments and a reboot. A demon oozing class, armed with a sword and a pair of guns that each have a name... Badass, we tell you! Who is the guest? It's Mathieu. Not a podcaster in his spare time, but a companion in video games, Japanese animation and plenty of other shared obsessions. A die-hard fan of the Devil May Cry series... which helps! Why? Because it was a great excuse, during my stay in Strasbourg at the time, to add to an already very pleasant trip! So why not treat ourselves to an extra thrill with a game that thrilled us then and still does today. The result? A show that is "stylish as hell". As classy as it gets, in French! :p Good laughs, the silly jokes we both love, and fresh reflections I would never have come up with without him! On the menu of this podcast about the game Devil May Cry: Part 1 00:00:00 Introduction of the guest around Devil May Cry and the genre it created; « A few grams of exploration in a world of hack'n'slash? 
» Part 2 00:20:55 Level design and other level construction. Part 3 00:45:25 Dante Must Die: quite a story! Part 4 00:58:42 From demon to witch: the pre-Bayonetta. References given during the show: The Speed Game show by Nesblog on Devil May Cry: http://www.nesblog.com/lediabledemaipleur/ and http://www.nesblog.com/dmcaldante/ Saurian's YouTube channel with the Umbran arts: https://www.youtube.com/user/Saur The YouTube channel of n472a - it's in Japanese, but you can find your way around using the thumbnails ;) : https://www.youtube.com/user/n472a Moossye's article on Bayonetta as a camp-rococo heroine: a long-form article is also available in the "Girl Power" issue of the "Les cahiers du jeu vidéo" collection published by Pixn'love. And finally an "epilogue" appearance in the Badass podcast (with plenty of other cult heroines and great contributors!): http://podtail.com/podcast/badass/l-heroine-qui-a-change-ma-vie/ Music credits (in order of appearance): OST Devil May Cry - Let's rock! (Title); OST Devil May Cry HR HM Arrange - Lock & Load; GM-03 (The Idol of Time and Space); OST Devil May Cry 3 - Devil Never Cry (staff roll). All themes are the property of their respective composers and/or companies. For info: http://vgmdb.net/album/2903 http://vgmdb.net/album/2901 http://vgmdb.net/album/53080 The opening theme is composed of royalty-free sounds (including the database http://shtooka.net/index.php). The track Dead Pixels is composed by Lukash, whose albums can all be listened to freely: http://lukhash.com/ AND ABOVE ALL, LIKE ME, CONSIDER MAKING A DONATION IF YOU APPRECIATE HIS WORK! ;)
This is one of two conversations which Gudrun Thäter recorded alongside the conference Women in PDEs, which took place at our faculty in Karlsruhe on 27-28 April 2017. Marie Elisabeth Rognes was one of the seven invited speakers. Marie is Chief Research Scientist at the Norwegian research laboratory Simula near Oslo, where she is Head of the Department for Biomedical Computing. Marie got her university education with a focus on applied mathematics, mechanics and numerical physics, as well as her PhD in applied mathematics, at the Centre for Mathematics for Applications in the Department of Mathematics at the University of Oslo. Her work is devoted to providing robust methods to solve partial differential equations (PDEs) for diverse applications. On the one hand this means that from the mathematical side she works on numerical analysis, optimal control, robust finite element software and uncertainty quantification, while on the other hand she is very much interested in modelling with the help of PDEs, in particular in mathematical models of physiological processes. These models are useful to answer "what if" type questions much more easily than with the help of laboratory experiments. In our conversation we discussed one of the many applications: cerebral fluid flow, i.e. fluid flow in the context of the human brain. Medical doctors and biologists know that the soft matter cells of the human brain are filled with fluid. The space between the cells also contains the water-like cerebrospinal fluid, which provides a bath for the human brain. The brain expands and contracts with each heartbeat, and approximately 1 ml of fluid is interchanged between the brain and the spinal area. What the specialists do not know is: Is there a circulation of fluid? This is especially interesting since there is no traditional lymphatic system to transport away the biological waste of the brain (a process at work everywhere else in our body). So how does the brain get rid of its litter? 
There are several hypotheses: diffusion processes, fast flow (and transport) along the space near blood vessels, convection. The aim of Marie's work is to numerically test these (and other) hypotheses. Basic testing starts on very idealised geometries. For the overall picture one useful simplified geometry is the annulus, i.e. a region bounded by two concentric circles. For the micro-level look a small cube can be the chosen geometry. As material law, flow in a porous medium based on Darcy flow is the starting point - possibly taking into account the coupling with elastic behaviour on the boundary. The difficult non-mathematical questions which have to be answered are: How to use clinical data for establishing and testing models? How to prescribe the forces? In the near future she hopes to better understand the multiscale character of the processes. Especially for embedding 1d- into 3d-geometry there is almost no theory available. For the project Marie has been awarded a FRIPRO Young Research Talents Grant of the Research Council of Norway (3 years, starting April 2016) and the very prestigious ERC Starting Grant (5 years, starting 2017). References M.E. Rognes: Mathematics that cures us. TEDxOslo, 3 May 2017 Young Academy of Norway ERC Starting Grant: Mathematical and computational foundations for modeling cerebral fluid flow, 5 years P.E. Farrell e.a.: Dolfin adjoint (open source software project) FEniCS computing platform for PDEs (open source software project) Wikipedia on FEniCS Collection of relevant literature implemented in FEniCS
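The Darcy flow mentioned above as the material law relates the volumetric flux linearly to the pressure gradient through the permeability of the medium and the viscosity of the fluid. A minimal one-dimensional sketch; the numerical values are illustrative assumptions, not data from the conversation:

```python
def darcy_flux(permeability, viscosity, dp_dx):
    """Darcy's law in 1-D: volumetric flux q = -(k / mu) * dp/dx."""
    return -(permeability / viscosity) * dp_dx

# Illustrative values (assumptions for the example):
k = 1e-15       # permeability of the tissue, in m^2
mu = 7e-4       # fluid viscosity in Pa*s, roughly water at body temperature
dp_dx = -100.0  # pressure gradient in Pa/m

q = darcy_flux(k, mu, dp_dx)
print(q)  # positive flux: the fluid runs from high to low pressure
```

In the poroelastic setting this law is then coupled to the elastic deformation of the surrounding matrix, which is what makes the resulting PDE system hard.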
This is one of two conversations which Gudrun Thäter recorded alongside the conference Women in PDEs, which took place at our Department in Karlsruhe on 27-28 April 2017. Maria Lopez-Fernandez from the University La Sapienza in Rome was one of the seven invited speakers. She got her university degree at the University of Valladolid in Spain and worked as an academic researcher in Madrid and at the University of Zürich. Her field of research is numerical analysis, in particular the robust and efficient approximation of convolutions. The conversation is mainly focussed on its applications to wave scattering problems. The important questions for the numerical tools are consistency, stability and convergence analysis. The methods proposed by Maria are Convolution Quadrature type methods for the time discretization, coupled with boundary integral methods for the spatial discretization. Convolution Quadrature methods are based on the Laplace transform and numerical integration. They were initially developed mostly for parabolic problems and are now being adapted to serve in the context of (hyperbolic) wave equations. Convolution quadrature methods introduce artificial dissipation in the computation, which stabilizes the numerics; however, it would be physically more meaningful to work instead with schemes which conserve mass. She is mainly interested in fast algorithms with reduced memory requirements and in adaptivity in time and space. The motivational example for her talk was the observation of severe acoustic problems inside a new building at the University of Zürich. Any conversation in the atrium made a lot of noise, and if someone was speaking loudly it was hard for the others to understand. An improvement was provided by specialised engineers who installed absorbing panels. From the mathematical point of view this is a nice application of the modelling and numerics of wave scattering problems. 
Of course, it would make a lot of sense to simulate the acoustic situation for such spaces before building them - if stable, fast software for the distribution of acoustic pressure or the transport of signals were available. The mathematical challenges are high computational costs, high storage requirements and stability problems. Due to the nonlocal nature of the equations it is also really hard to parallelise the calculations to make them run faster. In addition, time-adaptive methods for these types of problems were missing completely from the mathematical literature. In creating them one has to control the numerical errors with the help of a priori and a posteriori estimates, which thanks to Maria's and others' work during the last years is in principle understood now, but is still very complicated. One also easily runs into stability problems when changing the time step size. The acoustic pressure distribution for the new building in Zürich has been successfully simulated by co-workers in Zürich and Graz by using these results together with knowledge about the sound source, deriving heuristic measures from it in order to find a sequence of time steps which keeps the problem stable and adapts the computations effectively. There is a lot of hope to improve the performance of these tools by representing the required boundary element matrices by approximations with much sparser matrices. References M. López Fernández, S. Sauter: Generalized Convolution Quadrature with Variable Time Stepping. Part II: Algorithm and Numerical Results. Applied Numerical Mathematics, 94, pp. 88-105 (2015). M. López Fernández, S. Sauter: Generalized Convolution Quadrature based on Runge-Kutta Methods. Numerische Mathematik, 133 (4), pp. 734-779 (2016). S. Sauter, M. Schanz: Convolution Quadrature for the Wave Equation with Impedance Boundary Conditions. Journal of Computational Physics, Vol. 334, pp. 442-459 (2017). Podcasts T. Arens: Lärmschutz, Conversation with S. 
Ritterbusch in the Modellansatz Podcast, Episode 16, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2014. F. Sayas: Acoustic Scattering, Conversation with G. Thäter in the Modellansatz Podcast, Episode 58, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016.
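The core of convolution quadrature can be made concrete in a few lines: the quadrature weights are the Taylor coefficients of K(δ(ζ)/Δt), where K is the Laplace transform of the kernel and δ comes from a time-stepping scheme, and these coefficients can be extracted numerically on a small circle. A sketch of the backward-Euler variant, where the contour radius and the test kernel are illustrative choices of this example, not specifics from the conversation:

```python
import cmath

def cq_weights(K, dt, N, radius=1e-8):
    """First N convolution quadrature weights for transfer function K(s),
    with backward-Euler generating function delta(z) = 1 - z.
    The weights are the Taylor coefficients of K(delta(z)/dt), extracted
    by a scaled discrete Fourier transform on a circle of the given radius."""
    M = 2 * N
    r = radius ** (1.0 / M)  # contour radius balancing truncation and round-off
    vals = [K((1 - r * cmath.exp(2j * cmath.pi * l / M)) / dt) for l in range(M)]
    weights = []
    for n in range(N):
        s = sum(vals[l] * cmath.exp(-2j * cmath.pi * l * n / M) for l in range(M))
        weights.append((s / (M * r ** n)).real)
    return weights

# Sanity check: K(s) = 1/s is time integration, so every weight must equal dt
w = cq_weights(lambda s: 1 / s, dt=0.1, N=5)
print([round(x, 6) for x in w])  # [0.1, 0.1, 0.1, 0.1, 0.1]
```

For kernels arising from wave problems the weights decay instead of staying constant, and the slight dissipation mentioned above comes precisely from the underlying time-stepping scheme hidden in δ.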
Andrii Khrabustovskyi works at our faculty in the group Nonlinear Partial Differential Equations and is a member of the CRC Wave phenomena: analysis and numerics. He was born in Kharkiv in Ukraine and finished his studies as well as his PhD at the Kharkiv National University and the Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine. He joined our faculty in 2012 as a postdoc in the former Research Training Group 1294 Analysis, Simulation and Design of Nanotechnological Processes, which was active until 2014. Gudrun Thäter talked with him about one of his research interests, asymptotic analysis and homogenization of PDEs. Photonic crystals are periodic dielectric media in which electromagnetic waves from certain frequency ranges cannot propagate. Mathematically speaking, this is due to gaps in the spectrum of the related differential operators. An interesting question is therefore whether there are gaps in between bands of the spectrum of operators related to wave propagation, especially on periodic geometries and with periodic coefficients in the operator. It is known that the spectrum of periodic selfadjoint operators has band structure: the spectrum is a locally finite union of compact intervals called bands. In general the bands may overlap, and the existence of gaps is therefore not guaranteed. A simple example is the spectrum of the Laplacian in the whole space, which is the half axis [0, ∞) and has no gaps. The classic approach to such problems in the whole-space case is Floquet-Bloch theory. Homogenization is a collection of mathematical tools which are applied to media with strongly inhomogeneous parameters or highly oscillating geometry. Roughly speaking, the aim is to replace the complicated inhomogeneous medium by a simpler homogeneous one with similar properties and characteristics. In our case we deal with PDEs with periodic coefficients in a periodic geometry which is considered to be infinite. 
In the limit of a characteristic small parameter going to zero it behaves like a corresponding homogeneous medium. To make this a bit more mathematically rigorous, one considers a sequence of operators with a small parameter (e.g. concerning cell size or material properties) and has to prove some properties in the limit as the parameter goes to zero. The optimal result is that it converges to some operator which is the right homogeneous one. If this limit operator has gaps in its spectrum, then the gaps are present in the spectra of the pre-limit operators (for a small enough parameter). The advantages of the homogenization approach compared to the classical one with Floquet-Bloch theory are: The knowledge of the limit operator is helpful and only available through homogenization. For finite domains Floquet-Bloch does not work well: though we always have a discrete spectrum, we might want to have the gaps in a fixed position independent of the size of our domain. Here homogenization theory works in principle also for the bounded case (it is just a bit technical). An interesting geometry in this context is a domain with periodically distributed holes. The question arises: what happens if the sizes of the holes and the period simultaneously go to zero? The easiest operator which we can study is the Laplace operator subject to Dirichlet boundary conditions. There are three possible regimes: For holes of the same order as the period (even slightly smaller), the Dirichlet conditions on the boundary of the holes dominate - the solution of the corresponding Poisson equation tends to zero. For significantly smaller holes the influence of the holes is so small that the problem "forgets" about them as the parameter goes to zero. There is a borderline case which lies between cases 1 and 2. It produces some interesting effects and can explain the occurrence of so-called strange terms. 
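The band structure described above can be computed explicitly for a toy 1-D periodic operator. A sketch assuming a Kronig-Penney-type piecewise-constant potential (the potential values and cell layout are illustrative choices, not from the conversation): by the Floquet criterion, an energy E lies in a spectral band exactly when the trace of the monodromy matrix over one period has absolute value at most 2.

```python
import math

def piece_propagator(E, V, L):
    """Propagator of -u'' + V u = E u over an interval of length L with constant V."""
    if E > V:
        k = math.sqrt(E - V)
        return [[math.cos(k * L), math.sin(k * L) / k],
                [-k * math.sin(k * L), math.cos(k * L)]]
    if E < V:
        K = math.sqrt(V - E)
        return [[math.cosh(K * L), math.sinh(K * L) / K],
                [K * math.sinh(K * L), math.cosh(K * L)]]
    return [[1.0, L], [0.0, 1.0]]  # E == V: solutions are linear

def mat_mul(A, B):
    return [[A[0][0] * B[0][0] + A[0][1] * B[1][0], A[0][0] * B[0][1] + A[0][1] * B[1][1]],
            [A[1][0] * B[0][0] + A[1][1] * B[1][0], A[1][0] * B[0][1] + A[1][1] * B[1][1]]]

def in_band(E, V0, a=0.5, b=0.5):
    """Floquet criterion: E lies in a band iff |trace(monodromy over one period)| <= 2."""
    M = mat_mul(piece_propagator(E, V0, b), piece_propagator(E, 0.0, a))
    return abs(M[0][0] + M[1][1]) <= 2.0

print(in_band(1.0, 0.0))   # free Laplacian: every E > 0 is in the spectrum -> True
print(in_band(0.1, 20.0))  # strong periodic barrier: low energies fall in a gap -> False
```

Scanning E for a fixed barrier height traces out exactly the alternating bands and gaps that the homogenization results above aim to control.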
A traditional ansatz in homogenization works with the concept of so-called slow and fast variables. The name comes from the following observation. If we consider an infinite layer in cylindrical coordinates, then the variable r measures the distance from the origin when going "along the layer", the angle lives in that plane, and z is the variable which goes in the finite direction perpendicular to that plane. For functions of the form r^k, the derivative with respect to r changes the power to r^(k-1), while the other derivatives leave that power unchanged. In the interesting case k is negative, and the r-derivative makes the function decrease even faster. This leads to the name fast variable. The properties in this simple example translate as follows. For any function we think of having a set of slow and fast variables (characteristic to the problem) and a small parameter ε, and we try to find u as a series in powers of ε whose coefficients depend on both sets of variables, where in our applications the fast variable is typically x/ε. One can formally sort through the ε-levels using the properties of the differential operator. The really hard part then is to prove that this formal result is indeed true by finding error estimates in the right (complicated) spaces. There are many more tools available, like the technique of Tartar/Murat, who use a weak formulation with special test functions depending on the small parameter. The weak point of that theory is that we first have to know the result as the parameter goes to zero before we can construct the test functions. Also the concept of Gamma convergence or the unfolding trick of Cioranescu are helpful. An interesting and new application of the mathematical results is the construction of wave guides. The corresponding domain in which we place a waveguide is bounded in two directions and unbounded in one (e.g. an unbounded cylinder). Serguei Nazarov proposed to make holes in order to introduce gaps into the line of the spectrum for a specified wave guide. 
Andrii Khrabustovskyi suggests distributing finitely many traps, which do not influence the essential spectrum but add eigenvalues. One interesting effect is that in this way one can find terms which are nonlocal in time or space and thus stand for memory effects of the material. References P. Exner and A. Khrabustovskyi: On the spectrum of narrow Neumann waveguide with periodically distributed δ′ traps, Journal of Physics A: Mathematical and Theoretical, 48 (31) (2015), 315301. A. Khrabustovskyi: Opening up and control of spectral gaps of the Laplacian in periodic domains, Journal of Mathematical Physics, 55 (12) (2014), 121502. A. Khrabustovskyi: Periodic elliptic operators with asymptotically preassigned spectrum, Asymptotic Analysis, 82 (1-2) (2013), 1-37. S.A. Nazarov, G. Thäter: Asymptotics at infinity of solutions to the Neumann problem in a sieve-type layer, Comptes Rendus Mecanique 331 (1) (2003), 85-90. S.A. Nazarov: Asymptotic Theory of Thin Plates and Rods: Vol. 1. Dimension Reduction and Integral Estimates. Nauchnaya Kniga: Novosibirsk, 2002.
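The two-scale ansatz with slow and fast variables sketched above is usually written out as follows; this is the standard textbook form with fast variable y = x/ε, a sketch rather than the exact formulas from the conversation:

```latex
u^{\varepsilon}(x) \;\approx\; u_0\Big(x, \tfrac{x}{\varepsilon}\Big)
  + \varepsilon\, u_1\Big(x, \tfrac{x}{\varepsilon}\Big)
  + \varepsilon^2\, u_2\Big(x, \tfrac{x}{\varepsilon}\Big) + \dots,
\qquad y = \frac{x}{\varepsilon},
\qquad
\nabla \;\longrightarrow\; \nabla_x + \frac{1}{\varepsilon}\, \nabla_y .
```

Inserting this ansatz into the PDE and collecting equal powers of ε is the formal sorting through the ε-levels mentioned above; it yields a hierarchy of cell problems whose rigorous justification requires the error estimates discussed in the conversation.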
Andrii Khrabustovskyi works at our faculty in the group Nonlinear Partial Differential Equations and is a member of the CRC Wave phenomena: analysis and numerics. He was born in Kharkiv in Ukraine and finished his studies as well as his PhD at the Kharkiv National University and the Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine. He joined our faculty in 2012 as a postdoc in the former Research Training Group 1294 Analysis, Simulation and Design of Nanotechnological Processes, which was active until 2014. Gudrun Thäter talked with him about one of his research interests: asymptotic analysis and homogenization of PDEs. Photonic crystals are periodic dielectric media in which electromagnetic waves from certain frequency ranges cannot propagate. Mathematically speaking, this is due to gaps in the spectrum of the related differential operators. Hence an interesting question is whether there are gaps between bands of the spectrum of operators related to wave propagation, especially on periodic geometries and with periodic coefficients in the operator. It is known that the spectrum of periodic self-adjoint operators has band structure: the spectrum is a locally finite union of compact intervals called bands. In general, the bands may overlap, and the existence of gaps is therefore not guaranteed. A simple example is the Laplacian on the whole space R^n, whose spectrum is the half axis [0,∞) and thus has no gaps. The classic approach to such problems in the whole space case is the Floquet–Bloch theory. Homogenization is a collection of mathematical tools which are applied to media with strongly inhomogeneous parameters or highly oscillating geometry. Roughly speaking, the aim is to replace the complicated inhomogeneous medium by a simpler homogeneous one with similar properties and characteristics. In our case we deal with PDEs with periodic coefficients in a periodic geometry which is considered to be infinite. 
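The band structure of a periodic operator can be made tangible with a small numerical sketch (an illustration added here, not taken from the episode): for a discrete 1D Schrödinger operator with a period-2 potential taking the values 0 and V, the spectrum is the union over the Bloch parameter k of the eigenvalues of the 2x2 fibers H(k), and for V > 0 a gap opens between the two bands.

```python
import numpy as np

def bloch_bands(V=2.0, nk=401):
    """Bands of the discrete 1D Schroedinger operator
    (H u)_n = -u_{n+1} - u_{n-1} + v_n u_n with period-2 potential (0, V),
    computed from its 2x2 Bloch fibers H(k)."""
    lower, upper = [], []
    for k in np.linspace(0.0, 2.0 * np.pi, nk):
        Hk = np.array([[0.0, -(1.0 + np.exp(-1j * k))],
                       [-(1.0 + np.exp(1j * k)), V]])
        ev = np.linalg.eigvalsh(Hk)   # real eigenvalues, ascending
        lower.append(ev[0])
        upper.append(ev[1])
    return np.array(lower), np.array(upper)

lower, upper = bloch_bands(V=2.0)
gap = upper.min() - lower.max()       # spectral gap between band 1 and band 2
```

For V = 2 the bands are [1 - sqrt(5), 0] and [2, 1 + sqrt(5)], so the gap has width 2; for V = 0 the bands touch at 0 and the gap closes, illustrating that gaps are not guaranteed.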
In the limit of a characteristic small parameter going to zero it behaves like a corresponding homogeneous medium. To make this a bit more mathematically rigorous, one considers a sequence of operators with a small parameter (e.g. concerning cell size or material properties) and has to prove some properties in the limit as the parameter goes to zero. The optimal result is that the sequence converges to some operator which is the right homogeneous one. If this limit operator has gaps in its spectrum, then the gaps are also present in the spectra of the pre-limit operators (for small enough parameter). The advantages of the homogenization approach compared to the classical one with Floquet–Bloch theory are: the knowledge of the limit operator is helpful and only available through homogenization, and for finite domains Floquet–Bloch does not work well. Though we always have a discrete spectrum there, we might want to have the gaps in a fixed position independent of the size of our domain; here the homogenization theory works in principle also for the bounded case (it is just a bit technical). An interesting geometry in this context is a domain with periodically distributed holes, and the question arises: what happens if the sizes of the holes and the period simultaneously go to zero? The easiest operator which we can study is the Laplace operator subject to Dirichlet boundary conditions. There are three possible regimes: for holes of the same order as the period (even slightly smaller), the Dirichlet conditions on the boundary of the holes dominate - the solution of the corresponding Poisson equation tends to zero. For significantly smaller holes the influence of the holes is so small that the problem "forgets" about them as the parameter goes to zero. There is a borderline case which lies between cases 1 and 2; it exhibits some interesting effects and can explain the occurrence of so-called strange terms. 
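As a concrete anchor for the borderline case (this is the classical Cioranescu–Murat result, added here for orientation and not stated explicitly in the episode): with period eps and hole radius a_eps, the critical scaling and the resulting limit problem read

```latex
% Classical Cioranescu--Murat scaling (dimension n >= 3):
% period \varepsilon, hole radius a_\varepsilon; the borderline case is
a_\varepsilon \sim \varepsilon^{\frac{n}{n-2}},
% and exactly in this regime the Dirichlet problem -\Delta u_\varepsilon = f
% converges to a limit problem with an extra zeroth-order potential
% ("strange term") \mu > 0:
-\Delta u + \mu u = f .
```

For larger holes the limit is u = 0 (regime 1), for much smaller holes the extra term vanishes (regime 2).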
A traditional ansatz in homogenization works with the concept of so-called slow and fast variables. The name comes from the following observation. If we consider an infinite layer in cylindrical coordinates, then the variable r measures the distance from the origin when going "along the layer", phi is the angle in that plane, and z is the variable in the finite direction perpendicular to that plane. For functions of the form r^k f(phi, z), the derivative with respect to r changes the power r^k to r^(k-1), while the other derivatives leave that power unchanged. In the interesting case k is negative, and the r-derivative makes the function decrease even faster. This leads to the name fast variable. The properties in this simple example translate as follows. For any function we think of having a set of slow and fast variables (characteristic to the problem) and a small parameter eps, and we try to find u as an expansion in powers of eps in the slow and fast variables. One can formally sort through the eps-levels using the properties of the differential operator. The really hard part then is to prove that this formal result is indeed true by finding error estimates in the right (complicated) spaces. There are many more tools available, like the technique of Tartar/Murat, who use a weak formulation with special test functions depending on the small parameter. The weak point of that theory is that we first have to know the result as the parameter goes to zero before we can construct the test functions. Also the concept of Gamma convergence and the unfolding trick of Cioranescu are helpful. An interesting and new application of these mathematical results is the construction of wave guides. The corresponding domain in which we place a waveguide is bounded in two directions and unbounded in one (e.g. an unbounded cylinder). Serguei Nazarov proposed to make holes in order to open gaps in the spectrum of a specified wave guide. 
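Written out, the formal ansatz above is the standard two-scale expansion (notation filled in here for readability; y denotes the fast variable):

```latex
u_\varepsilon(x) \approx u_0\Big(x, \frac{x}{\varepsilon}\Big)
  + \varepsilon\, u_1\Big(x, \frac{x}{\varepsilon}\Big)
  + \varepsilon^2\, u_2\Big(x, \frac{x}{\varepsilon}\Big) + \dots,
\qquad
\nabla \;\longmapsto\; \nabla_x + \frac{1}{\varepsilon}\,\nabla_y,
\quad y = \frac{x}{\varepsilon}.
```

Substituting the expansion and the chain rule into the PDE and collecting equal powers of eps yields the hierarchy of cell problems, which then has to be justified with error estimates.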
Andrii Khrabustovskyi suggests distributing finitely many traps, which do not influence the essential spectrum but add eigenvalues. One interesting effect is that in this way one can find terms which are nonlocal in time or space and thus stand for memory effects of the material. References P. Exner, A. Khrabustovskyi: On the spectrum of narrow Neumann waveguide with periodically distributed δ′ traps, Journal of Physics A: Mathematical and Theoretical, 48 (31) (2015), 315301. A. Khrabustovskyi: Opening up and control of spectral gaps of the Laplacian in periodic domains, Journal of Mathematical Physics, 55 (12) (2014), 121502. A. Khrabustovskyi: Periodic elliptic operators with asymptotically preassigned spectrum, Asymptotic Analysis, 82 (1-2) (2013), 1-37. S.A. Nazarov, G. Thäter: Asymptotics at infinity of solutions to the Neumann problem in a sieve-type layer, Comptes Rendus Mecanique, 331 (1) (2003), 85-90. S.A. Nazarov: Asymptotic Theory of Thin Plates and Rods: Vol. 1. Dimension Reduction and Integral Estimates. Nauchnaya Kniga: Novosibirsk, 2002.
In this lecture, Professor Trefethen discusses finite differencing in general grids and multiple space dimensions.
In this lecture, Professor Trefethen discusses PDEs in science and engineering, and explicit 1D finite differences.
In this concluding lecture, Professor Nick Trefethen discusses the question "Who invented the great numerical algorithms?"
In this lecture, Professor Trefethen discusses Chebyshev spectral discretization.
In this lecture, Professor Trefethen discusses Fourier, Laurent, and Chebyshev, followed by Chebyshev series and interpolants.
In this lecture, Professor Trefethen discusses Fourier spectral discretization and Fourier spectral discretization via FFT.
In this lecture, Professor Trefethen discusses order of accuracy and reaction-diffusion equations and other stiff PDEs.
In this lecture, Professor Trefethen discusses numerical instability and implicit 1D finite differences.
This is the last of four conversations Gudrun had during the British Applied Mathematics Colloquium which took place 5th – 8th April 2016 in Oxford. Andrea Bertozzi from the University of California, Los Angeles (UCLA) held a public lecture on The Mathematics of Crime. She has been Professor of Mathematics at UCLA since 2003 and Betsy Wood Knapp Chair for Innovation and Creativity (since 2012). From 1995-2004 she worked mostly at Duke University, first as Associate Professor of Mathematics and then as Professor of Mathematics and Physics. As an undergraduate at Princeton University she studied physics and astronomy alongside her major in mathematics, and she then went through the Princeton PhD program. For her thesis she worked in applied analysis and studied fluid flow. As a postdoc she worked with Peter Constantin at the University of Chicago (1991-1995) on global regularity for vortex patches. But even more importantly, this was the moment when she found research problems that needed knowledge about PDEs and flow, but in addition both numerical analysis and scientific computing. She found out that she really likes to collaborate with very different specialists. Today the hard work can largely be carried out on a desktop, but occasionally clusters or supercomputers are necessary. The initial request to work on the mathematics of crime came from a colleague, the social scientist Jeffrey Brantingham. He works in Anthropology at UCLA and had well-established contacts with the police in LA. He was looking for mathematical input on some of his problems and raised the issue with Andrea Bertozzi. Her postdoc George Mohler came up with the idea to adapt an earthquake model after a discussion with Frederic Paik Schoenberg, a world expert in that field working at UCLA. The idea is to model crimes of opportunity as being triggered by crimes that have already happened. The likelihood of new crimes can then be predicted as an excitation in space and time, like the aftershocks of an earthquake. 
Of course, statistical models are necessary here which say how the excitation is distributed and decays in space and time. Mathematically this is a self-exciting point process. The traditional Poisson process model has a single parameter and thus no memory - i.e. no connections to other events can be modelled. The Hawkes process builds on the Poisson process as background noise but adds new events which then themselves trigger events according to an excitation rate and the exponential decay of excitation over time. This is a memory effect based on actual events (not only on a likelihood), and a three-parameter model. It is not too difficult to process field data, fit the model to them and extrapolate in time. Meanwhile the results of that idea work really well in the field. Results of field trials both in the UK and US have just been published, and there is a commercial product available providing services to the police. In addition to coming up with useful ideas and having an interdisciplinary group of people committed to making them work, it was necessary to find funding in order to support students working on that topic. The first grant came from the National Science Foundation, and from this time on the group included George Tita (UC Irvine), a criminology expert on LA gangs, and Lincoln Chayes as another mathematician in the team. The practical implementation of this crime prevention method for the police is as follows: before the police officers go out on a shift they usually meet to divide their teams over the area they are serving. The teams take the crime prediction for that shift, which is calculated by the computer model on the basis of whatever data is available up to the shift, and according to the expected crime spots they assign teams to monitor those areas more closely. After introducing this method into the police work in Santa Cruz (California), police observed a significant reduction of 27% in crime. Of course this is a wonderful success story. 
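The three-parameter Hawkes process described above - a background rate mu, an excitation jump alpha per event, and an exponential decay rate beta - can be simulated with Ogata's thinning algorithm. A minimal sketch (the parameter values and function name are illustrative, not taken from the published models):

```python
import math
import random

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0, seed=1):
    """Ogata thinning for a Hawkes process with conditional intensity
    lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i))."""
    rng = random.Random(seed)

    def intensity(t, events):
        return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)

    events, t = [], 0.0
    while True:
        # Between events the intensity only decays, so the current value
        # is a valid upper bound for the thinning step.
        lam_bar = intensity(t, events)
        t += rng.expovariate(lam_bar)          # candidate inter-arrival time
        if t >= horizon:
            return events
        if rng.random() * lam_bar <= intensity(t, events):
            events.append(t)                   # accept with prob lambda(t)/lam_bar
```

With alpha/beta < 1 the process is subcritical: each event triggers on average less than one follow-up event, so the simulation does not explode.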
Another success story involves the career development of the students and postdocs, who now have permanent positions. Since this was the first group in the US to bring mathematics to police work, this opened a lot of doors for the young people involved. Another interesting topic in the context of mathematics and crime is gang crime data. As in the crime prediction model, the attack of one gang on a rival gang usually triggers another event soon afterwards. A well-chosen group of undergraduates is already mathematically educated enough to study the temporal distribution of gang-related crime in LA, with 30 street gangs and a complex net of enemies. We are speaking about hundreds of crimes in one year related to the activity of gangs. The mathematical tool which proved to be useful was a maximum likelihood penalization model, again for the Hawkes process, applied to the expected retaliatory behaviour. A more complex problem, which was treated in a PhD thesis, is to single out the gangs which are probably responsible for certain crimes. This means solving an inverse problem: we know the time and the crime and want to find out who did it. The result was published in Inverse Problems in 2011. The tool was a variational model with an energy which is related to the data. The missing information is guessed and then put into the energy. For the best guess relative to the chosen energy model, a probable candidate for the crime is found. For a small number of unsolved crimes one can just go through all possible combinations; for hundreds of unsolved crimes all combinations cannot be handled. One makes the problem easier by increasing the number of choices, formulating a continuous instead of the discrete problem, for which the optimization works with a standard gradient descent algorithm. A third topic, and a third tool, is compressed sensing. It exploits sparsity in data like the probability distribution for crime in different parts of the city. 
Usually the crime rate is high in certain areas of a city and very low in others. For these sharp changes one needs different methods, since we have to allow for jumps. Here the total variation enters the model as the l1-norm of the gradient. It promotes sparsity of edges in the solution. Before coming up with this concept it was necessary to cross-validate quite a number of times, which is computationally very expensive. So instead of hours, the result is now obtained in a couple of minutes. When Andrea Bertozzi was a young child she spent a lot of Sundays in the Science Museum in Boston and wanted to become a scientist when she grew up. The only problem was that she could not decide which science would be the best choice, since she liked everything in the museum. Today she says that, having chosen applied mathematics, she can indeed do all science, since mathematics works as a connector between the sciences and opens a lot of doors. References Press coverage of crime prevention (collected) Website of Mathematical and Simulation Modeling of Crime Examples of work by undergraduates M. Allenby et al.: A Point Process Model for Simulating Gang-on-Gang Violence, Project Report, 2010. K. Louie: Statistical Modeling of Gang Violence in Los Angeles, talk at AMS Joint Meetings San Francisco, AMS Session on Mathematics in the Social Sciences, 2010. Publications of A. Bertozzi and co-workers on crime prevention G.O. Mohler et al.: Randomized controlled field trials of predictive policing, J. Am. Stat. Assoc., 111(512), 1399-1411, 2015. J.T. Woodworth et al.: Nonlocal Crime Density Estimation Incorporating Housing Information, Phil. Trans. Roy. Soc. A, 372(2028), 20130403, 2014. J. Zipkin, M.B. Short & A.L. Bertozzi: Cops on the dots in a mathematical model of urban crime and police response, Discrete and Continuous Dynamical Systems B, 19(5), pp. 1479-1506, 2014. H. Hu et al.: A Method Based on Total Variation for Network Modularity Optimization using the MBO Scheme, SIAM J. Appl. Math., 73(6), pp. 2224-2246, 2013. L.M. Smith et al.: Adaptation of an Ecological Territorial Model to Street Gang Spatial Patterns in Los Angeles, Discrete and Continuous Dynamical Systems A, 32(9), pp. 3223-3244, 2012. G. Mohler et al.: Self-exciting point process modeling of crime, Journal of the American Statistical Association, 106(493), 100-108, 2011. A. Stomakhin, M. Short & A. Bertozzi: Reconstruction of missing data in social networks based on temporal patterns of interactions, Inverse Problems, 27, 2011. N. Rodriguez & A. Bertozzi: Local Existence and Uniqueness of Solutions to a PDE model for Criminal Behavior, M3AS, special issue on Mathematics and Complexity in Human and Life Sciences, Vol. 20, Issue supp01, pp. 1425-1457, 2010. Related Podcasts AMS - Mathematical Moments Podcast: MM97 - Forecasting Crime British Applied Mathematics Colloquium 2016 Special J. Dodd: Crop Growth, Conversation with G. Thäter in the Modellansatz Podcast, episode 89, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/crop-growth H. Wilson: Viscoelastic Fluids, Conversation with G. Thäter in the Modellansatz Podcast, episode 92, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/viscoelastic-fluids A. Hosoi: Robots, Conversation with G. Thäter in the Modellansatz Podcast, episode 108, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/robot A. Bertozzi: Crime Prevention, Conversation with G. Thäter in the Modellansatz Podcast, episode 109, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/crime-prevention
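The role of the total variation as the l1-norm of the gradient, mentioned in this episode, can be seen in a toy comparison (illustrative data, not from the actual crime models): a piecewise-constant crime-rate profile and a smooth ramp can have the same total variation, but the gradient of the former is sparse.

```python
import numpy as np

# Two 1D "crime rate" profiles on 100 grid points (made-up illustrative data):
x = np.linspace(0.0, 1.0, 100)
piecewise = np.where(x < 0.5, 0.2, 1.0)   # one sharp jump between two districts
smooth = 0.2 + 0.8 * x                    # gradual variation across the city

def total_variation(u):
    """Discrete total variation: the l1 norm of the gradient, sum |u_{i+1} - u_i|."""
    return float(np.abs(np.diff(u)).sum())

def gradient_support(u, tol=1e-12):
    """Number of grid cells where the gradient is nonzero."""
    return int((np.abs(np.diff(u)) > tol).sum())
```

Both profiles have total variation 0.8, but the jump profile concentrates it in a single gradient entry while the ramp spreads it over all 99; penalizing the l1-norm of the gradient therefore favors solutions with few, sharp edges.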
Sandra May works at the Seminar for Applied Mathematics at ETH Zurich and visited Karlsruhe for a talk at the CRC Wave phenomena. Her research is in numerical analysis, more specifically in numerical methods for solving PDEs. The focus is on hyperbolic PDEs and systems of conservation laws. She is interested both in theoretical aspects (such as proving stability of a certain method) and in practical aspects (such as working on high-performance implementations of algorithms). Sandra May graduated with a PhD in Mathematics from the Courant Institute of Mathematical Sciences (part of New York University) under the supervision of Marsha Berger. She likes to look back on the multicultural working and learning experience there. We talked about the numerical treatment of complex geometries. The main problem is that it is difficult to automatically generate grids for computations on the computer if the shape of the boundary is complex. Examples of such problems are the simulation of airflow around airplanes, trucks or racing cars. Typically, the approach for these flow simulations is to put the object in the middle of the grid. Appropriate far-field boundary conditions take care of the right setting of the finite computational domain on the outer boundary (which is cut from an infinite model). Typically in such simulations one is mainly interested in quantities close to the boundary of the object. Instead of using an unstructured or body-fitted grid, Sandra May is using a Cartesian embedded boundary approach for the grid generation: the object with complex geometry is cut out of a Cartesian background grid, resulting in so-called cut cells where the grid intersects the object, and Cartesian cells otherwise. This approach is fairly straightforward and fully automatic, even for very complex geometries. The price to pay comes in the shape of the cut cells, which need special treatment. 
One particular challenge is that the cut cells can become arbitrarily small, since a priori their size is not bounded from below. Trying to eliminate cut cells that are too small leads to additional problems which conflict with the goal of a fully automatic grid generation in 3d, which is why Sandra May keeps these potentially very small cells and develops specific strategies instead. The biggest challenge caused by the small cut cells is the small cell problem: easy to implement (and therefore standard) explicit time stepping schemes are only stable if a CFL condition is satisfied; this condition essentially couples the time step length to the spatial size of the cell. Therefore, for the very small cut cells one would need to choose tiny time steps, which is computationally not feasible. Instead, one would like to choose a time step appropriate for the Cartesian cells and use this same time step on cut cells as well. Sandra May and her co-workers have developed a mixed explicit implicit scheme for this purpose: to guarantee stability, an implicit time stepping method is used on the cut cells. This idea is similar to the approach of using implicit time stepping schemes for solving stiff systems of ODEs. As implicit methods are computationally more expensive than explicit methods, the implicit scheme is only used where needed (namely on cut cells and their direct neighbors). In the remaining part of the grid (the vast majority of the grid cells), a standard explicit scheme is used. Of course, when using different schemes on different cells, one needs to think about a suitable way of coupling them. The mixed explicit implicit scheme has been developed in the context of Finite Volume methods. The coupling has been designed with the goals of mass conservation and stability and is based on using fluxes to couple the explicit and the implicit scheme. This way, mass conservation is guaranteed by construction (no mass is lost). 
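The small cell problem can be reproduced in a few lines (an illustration with made-up numbers, not Sandra May's actual scheme): for 1D linear advection on a grid with one tiny cell, an explicit upwind step with a time step chosen for the regular cells blows up, while a backward Euler upwind step on the same grid stays bounded.

```python
import numpy as np

# 1D advection u_t + u_x = 0 with inflow u = 0, upwind finite volumes on a
# grid of 50 cells containing one tiny "cut-like" cell (illustrative setup).
h = 0.02
widths = np.full(50, h)
widths[25] = 0.01 * h                  # one cell 100x smaller than the rest
dt = 0.5 * h                           # CFL-stable for the regular cells only
u0 = np.exp(-200.0 * (np.cumsum(widths) - 0.3) ** 2)   # smooth bump

def step_explicit(u):
    """Explicit upwind step; CFL number dt/width is 50 on the tiny cell."""
    up = np.concatenate(([0.0], u[:-1]))        # left (upwind) neighbour
    return u - dt / widths * (u - up)

def step_implicit(u):
    """Backward Euler upwind step; unconditionally stable (max principle)."""
    n = len(u)
    A = np.eye(n)
    for i in range(n):
        A[i, i] += dt / widths[i]
        if i > 0:
            A[i, i - 1] = -dt / widths[i]
    return np.linalg.solve(A, u)

ue, ui = u0.copy(), u0.copy()
for _ in range(60):
    ue = step_explicit(ue)
    ui = step_implicit(ui)
```

In the mixed scheme the implicit update would be applied only on the cut cells and their direct neighbors and coupled to the explicit cells via fluxes; here both schemes are run on the whole grid just to show the stability difference.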
In terms of stability of the scheme, it can be shown that using a second-order explicit scheme coupled to a first-order implicit scheme by flux bounding results in a TVD stable method. Numerical results for coupling a second-order explicit scheme to a second-order implicit scheme show second-order convergence in the L^1 norm and between first- and second-order convergence in the maximum norm along the surface of the object in two and three dimensions. We also talked about the general issue of handling shocks properly in numerical simulations: in general, solutions to nonlinear hyperbolic systems of conservation laws such as the Euler equations contain shocks and contact discontinuities, which in one dimension express themselves as jumps in the solution. For a second-order finite volume method, slopes are typically reconstructed on each cell. If one reconstructed these slopes using e.g. central difference quotients close to shocks, this would result in oscillations and/or unphysical results (like negative density). To avoid this, so-called slope limiters are typically used. There are two main ingredients to a good slope limiter (which is applied after an initial polynomial based on interpolation has been generated): first, the algorithm needs to detect whether the solution in this cell is close to a shock or whether the solution is smooth in the neighborhood of this cell; second, if the algorithm suspects a shock, it must react and adjust the reconstructed polynomial appropriately, and otherwise stick with the polynomial based on interpolation. One commonly used way in one dimension to identify whether one is close to a shock is to compare the values of a right-sided and a left-sided difference quotient: if they differ too much, the solution is (probably) not smooth there. Good, reliable limiters are really difficult to find. Literature and additional material S. May, M. Berger: An Explicit Implicit Scheme for Cut Cells in Embedded Boundary Meshes, Preprint available as SAM report 2015-44, 2015. S. May, M. Berger: A mixed explicit implicit time stepping scheme for Cartesian embedded boundary meshes, Finite Volumes for Complex Applications VII - Methods and Theoretical Aspects, pp. 393-400, Springer, 2014. S. May, M. Berger: Two-dimensional slope limiters for finite volume schemes on non-coordinate-aligned meshes, SIAM J. Sci. Comput., 35(5), pp. A2163-A2187, 2013.
Sandra May works at the Seminar for Applied Mathematics at ETH Zurich and visited Karlsruhe for a talk at the CRC Wave phenomena. Her research is in numerical analysis, more specifically in numerical methods for solving PDEs. The focus is on hyperbolic PDEs and systems of conservation laws. She is interested both in theoretical aspects (such as proving stability of a certain method) and in practical aspects (such as working on high-performance implementations of algorithms). Sandra May graduated with a PhD in Mathematics from the Courant Institute of Mathematical Sciences (part of New York University) under the supervision of Marsha Berger. She likes to look back on the multicultural working and learning experience there. We talked about the numerical treatment of complex geometries. The main problem is that it is difficult to automatically generate computational grids when the shape of the boundary is complex. Examples of such problems are the simulation of airflow around airplanes, trucks or racing cars. Typically, the approach for these flow simulations is to put the object in the middle of the grid. Appropriate far-field boundary conditions on the outer boundary (which is cut from an infinite model) take care of the right setting of the finite computational domain. Typically in such simulations one is mainly interested in quantities close to the boundary of the object. Instead of using an unstructured or body-fitted grid, Sandra May uses a Cartesian embedded boundary approach for the grid generation: the object with complex geometry is cut out of a Cartesian background grid, resulting in so-called cut cells where the grid intersects the object and Cartesian cells otherwise. This approach is fairly straightforward and fully automatic, even for very complex geometries. The price to pay comes in the shape of the cut cells, which need special treatment.
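As a rough illustration of the embedded-boundary idea, here is a minimal sketch in Python, with a circle standing in for the complex object. Testing cell corners is a simplification: real cut-cell generators compute exact geometric intersections, but the classification into interior, exterior and cut cells is the same.

```python
import numpy as np

def classify_cells(n, inside):
    """Classify the cells of an n-by-n Cartesian grid on the unit square as
    'interior' (fully inside the object), 'exterior' (regular flow cell), or
    'cut' (intersected by the object boundary), by testing the cell corners.
    Corner sampling is a simplification of what real generators do."""
    h = 1.0 / n
    labels = np.empty((n, n), dtype=object)
    for i in range(n):
        for j in range(n):
            corners = [inside(x, y)
                       for x in (i * h, (i + 1) * h)
                       for y in (j * h, (j + 1) * h)]
            if all(corners):
                labels[i, j] = 'interior'   # fully inside the object
            elif not any(corners):
                labels[i, j] = 'exterior'   # regular Cartesian flow cell
            else:
                labels[i, j] = 'cut'        # needs special treatment
    return labels

# The "complex geometry" is stood in for by a circle of radius 0.25.
inside_circle = lambda x, y: (x - 0.5) ** 2 + (y - 0.5) ** 2 < 0.25 ** 2
labels = classify_cells(20, inside_circle)
```

The appeal of the approach is visible even in this toy: no matter how complicated `inside` is, the classification is fully automatic.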
One particular challenge is that the cut cells can become arbitrarily small, since a priori their size is not bounded from below. Trying to eliminate cut cells that are too small leads to additional problems which conflict with the goal of fully automatic grid generation in 3d, which is why Sandra May keeps these potentially very small cells and instead develops strategies to deal with them. The biggest challenge caused by the small cut cells is the small cell problem: easy-to-implement (and therefore standard) explicit time stepping schemes are only stable if a CFL condition is satisfied; this condition essentially couples the time step length to the spatial size of the cell. For the very small cut cells one would therefore need to choose tiny time steps, which is computationally infeasible. Instead, one would like to choose a time step appropriate for the Cartesian cells and use this same time step on cut cells as well. Sandra May and her co-workers have developed a mixed explicit-implicit scheme for this purpose: to guarantee stability on cut cells, an implicit time stepping method is used there. This idea is similar to using implicit time stepping schemes for solving stiff systems of ODEs. As implicit methods are computationally more expensive than explicit methods, the implicit scheme is only used where needed (namely on cut cells and their direct neighbors). On the remaining part of the grid (the vast majority of the grid cells), a standard explicit scheme is used. Of course, when using different schemes on different cells, one needs a suitable way of coupling them. The mixed explicit-implicit scheme has been developed in the context of finite volume methods. The coupling has been designed with the goals of mass conservation and stability in mind and is based on using fluxes to couple the explicit and the implicit scheme. This way, mass conservation is guaranteed by construction (no mass is lost).
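The small cell problem can be made concrete with a toy computation (the numbers are made up for illustration): the CFL condition ties the global explicit time step to the smallest cell, so a single tiny cut cell drags the step down by orders of magnitude.

```python
def explicit_cfl_dt(cell_sizes, speed, cfl=0.9):
    """Largest stable time step for a standard explicit scheme:
    the CFL condition requires dt <= cfl * h_min / |a|,
    i.e. the global step is dictated by the smallest cell."""
    return cfl * min(cell_sizes) / abs(speed)

h = 0.01                       # regular Cartesian cell size
cells = [h] * 100 + [1e-8]     # plus one tiny cut cell
dt_cut = explicit_cfl_dt(cells, speed=1.0)
dt_cart = explicit_cfl_dt([h] * 100, speed=1.0)
# One tiny cut cell shrinks the admissible explicit time step by six
# orders of magnitude -- exactly the "small cell problem". The mixed
# scheme avoids this by treating such cells implicitly instead.
```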
In terms of stability of the scheme, it can be shown that coupling a second-order explicit scheme to a first-order implicit scheme by flux bounding results in a TVD stable method. Numerical results for coupling a second-order explicit scheme to a second-order implicit scheme show second-order convergence in the L^1 norm and between first- and second-order convergence in the maximum norm along the surface of the object, in two and three dimensions. We also talked about the general issue of properly handling shocks in numerical simulations: in general, solutions to nonlinear hyperbolic systems of conservation laws such as the Euler equations contain shocks and contact discontinuities, which in one dimension express themselves as jumps in the solution. For a second-order finite volume method, slopes are typically reconstructed on each cell. If one reconstructed these slopes close to shocks using, e.g., central difference quotients in one dimension, this would result in oscillations and/or unphysical results (like negative density). To avoid this, so-called slope limiters are typically used. There are two main ingredients to a good slope limiter (which is applied after an initial polynomial based on interpolation has been generated): first, the algorithm needs to detect whether the solution in this cell is close to a shock or whether it is smooth in the neighborhood of this cell. Second, if the limiter decides that the solution is close to a shock, it adjusts the reconstructed polynomial appropriately; otherwise, it sticks with the polynomial based on interpolation. One commonly used way in one dimension to identify whether one is close to a shock is to compare the values of a right-sided and a left-sided difference quotient: if they differ too much, the solution is (probably) not smooth there. Good, reliable limiters are really difficult to find. Literature and additional material:
S. May, M. Berger: An Explicit Implicit Scheme for Cut Cells in Embedded Boundary Meshes, preprint available as SAM report no. 2015-44, 2015.
S. May, M. Berger: A mixed explicit implicit time stepping scheme for Cartesian embedded boundary meshes, Finite Volumes for Complex Applications VII - Methods and Theoretical Aspects, pp. 393-400, Springer, 2014.
S. May, M. Berger: Two-dimensional slope limiters for finite volume schemes on non-coordinate-aligned meshes, SIAM J. Sci. Comput. 35 (5), pp. A2163-A2187, 2013.
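The one-dimensional shock detection described in this episode, comparing left- and right-sided difference quotients, is exactly what the classical minmod limiter does. A minimal sketch (of the textbook one-dimensional limiter, not of the two-dimensional limiters from the paper above):

```python
def minmod(a, b):
    """Minmod limiter: returns the smaller-magnitude slope when both
    one-sided difference quotients agree in sign, and zero (piecewise
    constant reconstruction) when they disagree, i.e. near a suspected
    shock or extremum."""
    if a * b <= 0.0:
        return 0.0                      # quotients disagree: flatten
    return a if abs(a) < abs(b) else b  # otherwise take the safer slope

def limited_slopes(u, h):
    """Limited slope in each interior cell of a 1D grid with spacing h,
    built from the left- and right-sided difference quotients."""
    return [minmod((u[i] - u[i - 1]) / h, (u[i + 1] - u[i]) / h)
            for i in range(1, len(u) - 1)]

# Smooth data keeps a nonzero slope; at a spike the two quotients have
# opposite signs, so the reconstruction is flattened there.
slopes = limited_slopes([0.0, 0.1, 0.2, 1.0, 0.2], h=1.0)
```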
This episode discusses the Born-Infeld model of electrodynamics. Here, the standard model is the Maxwell equations, which couple the interaction of the magnetic and electric fields with the help of a system of partial differential equations. This is a well-understood classical system. But this classical model has one serious drawback: the action of a point charge (represented by a Dirac measure on the right-hand side) leads to an infinite energy in the electric field, which makes no physical sense. On the other hand, it should be possible to study the electric field of point charges, since this is how the electric field is created. One solution to this challenge is to slightly change the point of view, in a way similar to Einstein's special relativity. There, instead of taking the momentum as preserved quantity and Lagrange parameter, the Lagrangian is changed so that the bound on the velocity (in relativity, the speed of light) is incorporated in the model. In the electromagnetic model, the Lagrangian would have to restrict the intensity of the fields. This was the idea which Born and Infeld published in the 1930s. For the resulting system it is straightforward to calculate the fields of point charges. But unfortunately it is impossible to add the fields of several point charges (no superposition principle), since the resulting theory (and the PDE) are nonlinear. Physically this expresses that the point charges do not act independently of each other; the model accounts for a certain interaction between the charges. This interaction is probably only important if charges are close enough to each other, and locally the field should be influenced mainly by the nearest charge. But it has not been possible to prove this up to now. The electrostatic case is elliptic but has a singularity at each point charge, so no classical regularity results are directly applicable.
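The analogy with special relativity can be made explicit. In suitable units (sign and normalization conventions vary between references), the bounded-velocity Lagrangian of a free relativistic particle and the Born-Infeld replacement for the Maxwell Lagrangian read:

```latex
% Free relativistic particle: the speed bound c is built into
\[
  L_{\mathrm{rel}} = -\, m c^2 \sqrt{1 - \frac{v^2}{c^2}} .
\]
% Born and Infeld modified the Maxwell Lagrangian
% L_M = (E^2 - B^2)/2 in the same spirit, so that the field
% intensity is bounded by a parameter b:
\[
  L_{\mathrm{BI}} = b^2 \left( 1 - \sqrt{\,1
      - \frac{|\mathbf{E}|^2 - |\mathbf{B}|^2}{b^2}
      - \frac{(\mathbf{E}\cdot\mathbf{B})^2}{b^4}} \right).
\]
```

Expanding the square root for weak fields recovers the Maxwell Lagrangian at leading order, which is where the infinitely many scales (all even powers of the field gradient) mentioned below come from.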
On the other hand, there is an interesting interplay with geometry, since the PDE occurs as the mean curvature equation of hypersurfaces in the Minkowski space of relativity. The evolution problem is completely open. In the static case we have existence and uniqueness from the way the system is built, without really analyzing the PDE itself. The PDE should provide at least qualitative information on the electric field. So if, e.g., there is a positive charge, there should be a maximum of the field there (for negative charges a minimum, respectively), and we would expect the field to be smooth outside these singular points. A Lipschitz regular solution therefore seems probable, but it is open how to prove this mathematically. A special property is that the model has infinitely many inherent scales, namely all even powers of the gradient of the field. So understanding asymptotic limits in these scales could be a first interesting step. Denis Bonheure got his mathematical education at the Free University of Brussels and is currently working there as Professor of Mathematics. Literature and additional material:
M. Kiessling: Electromagnetic Field Theory Without Divergence Problems 1, The Born Legacy, Journal of Statistical Physics, Volume 116, Issue 1, pp. 1057-1122, 2004.
M. Kiessling: Electromagnetic Field Theory Without Divergence Problems 2, A Least Invasively Quantized Theory, Journal of Statistical Physics, Volume 116, Issue 1, pp. 1123-1159, 2004.
M. Kiessling: On the motion of point defects in relativistic fields, in Quantum Field Theory and Gravity, Conceptual and Mathematical Advances in the Search for a Unified Framework, Finster et al. (eds.), Springer, 2012.
Y. Brenier: Some Geometric PDEs Related to Hydrodynamics and Electrodynamics, Proceedings of the ICM, Vol. III, pp. 761-772, 2002.
Catherine Bandle was a professor at the Mathematical Institute of the University of Basel until 2003. Even beyond her retirement she remains very active in research on elliptic and parabolic partial differential equations. This shows in an impressive number of publications, in conference participation, and in contributing her experience to bodies such as the Landeshochschulrat Brandenburg and the Steering Committee of the European Science Foundation program Global and Geometric Aspects of Nonlinear Partial Differential Equations. Her fascination with the versatility of this subject in applications and with its connections to geometry has lasted for many decades. She was invited to Karlsruhe to give a plenary talk at the workshop Nonlinear Days 2015. We used this opportunity to discuss with her, in somewhat broader terms, modelling with partial differential equations. Traditionally, elliptic and parabolic equations stand at the beginning of modern modelling of processes in physics, biology and chemistry. Diffusion, reaction, transport and growth processes were first described by ordinary differential equations. About 150 years ago, however, some applications were already too complex for this overly simple model: dependencies on changes in all spatial directions and in time had to be captured in their interaction. This led necessarily to partial differential equations. With the formulation of the equations came the hope of making predictions through the corresponding solutions. To find these solutions, however, entirely new concepts were needed. At the beginning of this development stood, for example, Fourier series, which (under the right assumptions) can represent such solutions.
Tools such as the Fourier and Laplace transforms could give helpful answers, at least for certain geometries. Later the notions of weak solution and weak formulation were coined, and the associated Sobolev spaces were developed and studied along various paths. The search for solutions of the equations thus strongly advanced theoretical developments in mathematics. Today we are glad that much of the linear theory is well understood (see also the Lax-Milgram lemma), and we try to master nonlinear models step by step. A first step is often a local linearization, or allowing nonlinearities only in lower-order terms (semilinear problems). An integral aspect here, however, is the possibility of having more than one solution of an equation, so we need concepts for singling out the physically relevant ones. Here notions of stability are important: only stable solutions lead to observable phenomena. Important tools in the solution theory are also the norms in which we measure our solutions. It is most convincing when norms can be translated into energies of the system; then stability can also be discussed in the framework of energy conservation and energy minimization. Literature and additional material:
Catherine Bandle: Die Mathematik als moderne Weltsprache - Am Beispiel der Differenzialgleichungen, UniNova Wissenschaftsmagazin der Universität Basel, Band 87, 2000.
R. Farwig: Skript zu Elementaren Differentialgleichungen, Technische Universität Darmstadt, 2008.
Videos on PDEs (in English). Video on the Fourier series idea (in German).
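As a small illustration of the Fourier series idea: for the heat equation u_t = u_xx on [0, pi] with homogeneous Dirichlet boundary conditions, each sine mode evolves independently, so a solution can be written down directly from the initial coefficients (a sketch in Python, with hypothetical helper names):

```python
import numpy as np

def heat_fourier(coeffs, x, t):
    """Fourier-series solution of the heat equation u_t = u_xx on [0, pi]
    with u(0, t) = u(pi, t) = 0: if u(x, 0) = sum_n b_n sin(n x), then
    u(x, t) = sum_n b_n exp(-n^2 t) sin(n x). Each mode decays on its
    own, higher modes faster (the smoothing effect of the heat equation).
    coeffs maps the mode number n to the sine coefficient b_n."""
    return sum(b * np.exp(-n ** 2 * t) * np.sin(n * x)
               for n, b in coeffs.items())

x = np.linspace(0.0, np.pi, 101)
u0 = heat_fourier({1: 1.0, 3: 0.5}, x, 0.0)  # the initial condition itself
u1 = heat_fourier({1: 1.0, 3: 0.5}, x, 1.0)  # smoother, smaller amplitude
```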
In this episode of the Biopharma EHS Podcast, Dean M. Calhoun, CIH, President and CEO of Affygility Solutions, interviews Ms. Stephanie Wilkins, President of PharmaConsult US, on the subject of Risk-MaPP, ADEs and PDEs and their importance to multi-product pharmaceutical manufacturing facilities. Risk-MaPP stands for Risk-Based Manufacture of Pharmaceutical Products; it is one of the Baseline Guides of the International Society for Pharmaceutical Engineering (ISPE). One of the essential components of Risk-MaPP is the determination of acceptable daily exposure (ADE) values. The calculation of ADEs is similar to the calculation of the permitted daily exposure (PDE) values presented by the European Medicines Agency (EMA). If you need an occupational exposure limit (OEL) or ADE for a pharmaceutical compound, please check out our entire catalog of OEL Fastrac monographs; many of them come with ADEs.
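As a hedged sketch of how such a limit is computed: the ICH/EMA-style formula divides a no-observed-adverse-effect level by a product of safety factors. The factor values below are illustrative placeholders only; real assessments must take them from the applicable guideline and the underlying study data.

```python
def pde_mg_per_day(noael_mg_per_kg_day, body_weight_kg=50.0,
                   f1=1.0, f2=10.0, f3=1.0, f4=1.0, f5=1.0):
    """Permitted daily exposure in the ICH/EMA style:
        PDE = NOAEL * body weight / (F1 * F2 * F3 * F4 * F5),
    with F1 interspecies extrapolation, F2 interindividual variability,
    F3 study duration, F4 severity of effect, F5 LOAEL-to-NOAEL.
    The defaults here are placeholders, not guideline recommendations."""
    return noael_mg_per_kg_day * body_weight_kg / (f1 * f2 * f3 * f4 * f5)

# Illustrative only: a NOAEL of 5 mg/kg/day from a rat study, with
# F1 = 5 (rat to human) and F2 = 10 (interindividual variability).
pde = pde_mg_per_day(5.0, f1=5.0, f2=10.0)  # 5.0 mg/day
```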
Probability and statistics seminars (SAMM, 2009-2010)
We give sufficient conditions under which the convergence of finite difference approximations in the space variable of the solution to the Cauchy problem for stochastic PDEs of parabolic type can be accelerated to any given order of convergence by Richardson's method.
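Richardson's method itself is easy to illustrate on a deterministic model problem (a sketch, not the stochastic-PDE setting of the abstract): combining a second-order approximation at step sizes h and h/2 cancels the leading error term.

```python
import math

def richardson(approx, h, p):
    """One step of Richardson extrapolation: combine approximations at
    step sizes h and h/2 of a method with leading error O(h^p) so that
    the O(h^p) term cancels:
        (2**p * A(h/2) - A(h)) / (2**p - 1)."""
    return (2 ** p * approx(h / 2) - approx(h)) / (2 ** p - 1)

# Model problem: central difference for d/dx sin(x) at x = 1,
# which is second-order accurate (p = 2).
d = lambda h: (math.sin(1 + h) - math.sin(1 - h)) / (2 * h)
err_plain = abs(d(0.1) - math.cos(1))                  # ~ 1e-3
err_rich = abs(richardson(d, 0.1, p=2) - math.cos(1))  # ~ 1e-7
```

Repeating the step with the extrapolated values raises the order again, which is the sense in which convergence can be accelerated "to any given order".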
Speaker: Prof. M. Arcak Abstract: The passivity concept - an abstraction of energy conservation and dissipation in physical systems - has been instrumental in feedback control theory and led to breakthroughs in nonlinear and adaptive control design. In this talk we discuss the use of passivity as a stability test for classes of biochemical reaction networks. The main result determines global asymptotic stability of the network from the diagonal stability of a dissipativity matrix which incorporates information about the passivity properties of the subsystems, the interconnection structure of the network, and the signs of the feedback terms. This stability test encompasses the well-known 'secant criterion' for cyclic networks and extends it to general interconnection structures represented by graphs. An extension to reaction-diffusion PDEs is also discussed. The results are illustrated on MAPK cascade models and on branched interconnection structures motivated by metabolic networks.
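The secant criterion mentioned in the abstract is easy to state concretely: for a cyclic negative-feedback interconnection of n subsystems, stability is guaranteed when the product of the loop gains stays below sec(pi/n)^n. A sketch of just this scalar test (not of the general graph-based result):

```python
import math

def secant_criterion(gains):
    """Secant criterion for a cyclic negative-feedback network of n
    subsystems: diagonal stability (and hence global asymptotic
    stability of the network) holds if the product of the subsystem
    gains is below sec(pi/n)^n."""
    n = len(gains)
    product = 1.0
    for g in gains:
        product *= g
    return product < (1.0 / math.cos(math.pi / n)) ** n

# For n = 3 the bound is sec(pi/3)^3 = 8, strictly larger than the
# small-gain-type bound of 1 that ignores the cyclic structure.
ok = secant_criterion([1.5, 1.5, 1.5])        # product 3.375 < 8
bad = not secant_criterion([2.5, 2.0, 2.0])   # product 10 >= 8
```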
Highly Oscillatory Problems: Computation, Theory and Application
Reich, S (Potsdam) Tuesday 27 March 2007, 11:30-12:15
We consider a directed polymer on the unit circle with a continuous time parameter, defined as a simple random walk subjected via a Gibbs measure to a Hamiltonian whose increments in time have either long memory or semi-long memory, and which also depends on a space parameter (the position/state of the polymer). The memory parameter is interpreted as the Hurst parameter of an infinite-dimensional fractional Brownian motion. The partition function of this polymer is linked to stochastic PDEs via a long-memory parabolic Anderson model. We present a summary of the new techniques required to prove that, in the semi-long-memory case, the quantity of interest converges to a positive finite non-random constant, while in the long-memory case this limit blows up; the correct exponential growth function in that case is sandwiched between two explicit rates. These tools include an almost-subadditivity concept, the use of Malliavin derivatives for concentration estimates, and an adaptation to the long-memory case of some arguments from the memoryless case, which requires a detailed study of the interaction between the long memory, the spatial covariance, and the simple random walk. This talk describes joint work with Dr. Tao Zhang. Frederi Viens, Purdue University. Audio available in mp3 format. Duration: 46 min.