Welcome to Nerd Alert, a series of special episodes bridging the gap between marketing academia and practitioners. We're breaking down highly involved, complex research into plain language and takeaways any marketer can use. In this episode, Elena and Rob explore how the Dirichlet model challenges common beliefs about brand loyalty. They reveal why most consumers aren't deeply loyal to specific brands and why reaching new customers matters more than increasing loyalty.

Topics covered:
[01:00] "The Dirichlet: A Comprehensive Model of Buying Behavior"
[02:00] What products do we buy without brand loyalty?
[03:45] How brand share predicts buying behavior
[04:30] Why acquisition drives loyalty, not vice versa
[05:00] Most buyers are light buyers who purchase infrequently
[06:00] The shocking truth about Coca-Cola's customer base

To learn more, visit marketingarchitects.com/podcast or subscribe to our newsletter at marketingarchitects.com/newsletter.

Resources: Goodhardt, G. J., Ehrenberg, A. S. C., & Chatfield, C. (1984). The Dirichlet: A comprehensive model of buying behaviour. Journal of the Royal Statistical Society: Series A (General), 147(5), 621–655.

Get more research-backed marketing strategies by subscribing to The Marketing Architects on Apple Podcasts, Spotify, or wherever you listen to podcasts.
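The "double jeopardy" pattern the episode describes falls out of the model directly: category purchasing is gamma-mixed Poisson (NBD) and brand choice is Dirichlet-multinomial. A minimal simulation sketch, not the authors' code — the brand shares, the concentration S, and the gamma parameters below are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_consumers = 5000
shares = np.array([0.5, 0.3, 0.2])   # illustrative brand shares
S = 1.0                               # low concentration => habitual, polarized brand choice

# Category purchase counts: gamma-mixed Poisson (the NBD part of the model)
rates = rng.gamma(shape=0.8, scale=3.0, size=n_consumers)
category_buys = rng.poisson(rates)

# Each consumer's brand-choice probabilities: Dirichlet with means equal to the shares
probs = rng.dirichlet(S * shares, size=n_consumers)
brand_buys = np.array([rng.multinomial(c, p)
                       for c, p in zip(category_buys, probs)])

penetration = (brand_buys > 0).mean(axis=0)              # fraction of consumers buying each brand
buyers = np.maximum((brand_buys > 0).sum(axis=0), 1)
freq = brand_buys.sum(axis=0) / buyers                   # purchases per buyer

print(penetration)  # larger-share brands reach more buyers...
print(freq)         # ...and their buyers also buy slightly more often
```

The small brand loses twice — fewer buyers and slightly lower purchase frequency — even though no brand was given "more loyal" customers by assumption.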
Rix, Dirichlet and the rest return as our game table heads back to the Telessaria adventure! It's all about ocean travel, Con saves against nausea, and some rather unpleasant, demonic merrow this session. Connect with us: teachersinthedungeon on Instagram and Facebook, @dungeonteachers on X and teachersinthedungeon@gmail.com
Solfate Podcast - Interviews with blockchain founders/builders on Solana
A conversation with Arnold, the co-founder of Sphere (a payments and banking platform on Solana).

PS: We are happy to announce the launch of our Solfate Drip channel where you can collect Solfate Podcast episodes as digital collectibles! Subscribe to our Drip channel today: drip.haus/solfate (we have some fun experiments planned)

Summary
Arnold shares his background and early experiences with crypto, including his journey to building Sphere on the Solana blockchain. He discusses the benefits of blockchain and the pragmatic approach to decentralization. Arnold also emphasizes the importance of understanding history in the context of blockchain. Finally, he explains the pain points that Sphere Payments aims to address. In this conversation, Arnold discusses the benefits of using Sphere to simplify crypto payments and integrations. He highlights the challenges of creating an SPL transfer wrapper contract and emphasizes the ease of using Sphere for accepting crypto payments, especially from international customers. Arnold also explains that Sphere is a suitable solution for businesses that want to handle their finances in a crypto-based world. He mentions the ability to accept payments in fiat and have them land in USDC, as well as the future goal of automating payout processes. Arnold advises listeners to learn from history and consider unconventional approaches to problem-solving.
The conversation concludes with plans for future discussions and appreciation for the insights shared.

Takeaways
Blockchain offers the potential for decentralized solutions, but it is important to take a pragmatic approach and consider the existing systems and their benefits.
Understanding history is crucial in the context of blockchain, as it helps us recognize the challenges and opportunities that arise from the current financial and regulatory systems.
Sphere Payments aims to address pain points in the correspondent banking system, such as high fees, slow transactions, and lack of transparency.
By leveraging blockchain technology, Sphere Payments provides a more accessible and efficient way to transfer value, particularly for remittances and cross-border transactions.
Sphere provides an easy and efficient solution for accepting crypto payments and simplifying integrations.
The target customers for Sphere are businesses that want to handle their finances in a crypto-based world.
Sphere allows for accepting payments in fiat and converting them to USDC.
Automating financial processes and exploring unconventional approaches can lead to more efficient solutions.
Future conversations with Arnold are anticipated to delve deeper into historical and philosophical aspects of crypto.

Find Arnold and Sphere online
Follow Arnold on twitter - @0xdirichlet
Follow Sphere on twitter - @sphere_labs
Check out the Sphere website - spherepay.co

Follow us around
Nick
twitter: @nickfrosty
github: github.com/nickfrosty
website: https://nick.af
James
twitter: @jamesrp13
github: github.com/jamesrp13
Solfate Podcast
twitter: @SolfatePod
more podcast episodes: solfate.com/podcast
collect episodes on Drip: drip.haus/solfate
Perfectly parallel cosmological simulations using spatial comoving Lagrangian acceleration by Florent Leclercq et al. on Sunday 18 September Existing cosmological simulation methods lack a high degree of parallelism due to the long-range nature of the gravitational force, which limits the size of simulations that can be run at high resolution. To solve this problem, we propose a new, perfectly parallel approach to simulate cosmic structure formation, which is based on the spatial COmoving Lagrangian Acceleration (sCOLA) framework. Building upon a hybrid analytical and numerical description of particles' trajectories, our algorithm allows for an efficient tiling of a cosmological volume, where the dynamics within each tile is computed independently. As a consequence, the degree of parallelism is equal to the number of tiles. We optimised the accuracy of sCOLA through the use of a buffer region around tiles and of appropriate Dirichlet boundary conditions around sCOLA boxes. As a result, we show that cosmological simulations at the degree of accuracy required for the analysis of the next generation of surveys can be run in drastically reduced wall-clock times and with very low memory requirements. The perfect scalability of our algorithm unlocks profoundly new possibilities for computing larger cosmological simulations at high resolution, taking advantage of a variety of hardware architectures. arXiv: http://arxiv.org/abs/2003.04925v4
In this Communication Theory episode of the Engineering Concepts series, featuring Assoc. Prof. Serhan Yarkan and Halil Said Cankurtaran, Fourier series and the Fourier transform are discussed. Fourier series and the Fourier transform are tools we use constantly in communication theory; in this episode we first cover the conditions a signal must satisfy to be expanded in a Fourier series or to admit a Fourier transform. We then discuss the Fourier transform of random signals, and how the Fourier transform is handled mathematically and from an engineering perspective when randomness is involved. The episode concludes with examples of the Fourier transform encountered in nature. We hope you enjoy listening.
01:03 Introduction
02:58 The origins and foundations of Fourier series and the Fourier transform
04:40 Periodic signals, Fourier series and the Fourier transform
06:18 Discontinuity, discrete signals and the Fourier transform
07:24 The Gibbs phenomenon
08:05 Dirichlet's third necessary condition
12:21 The Fourier transform of random signals
12:58 An example using the Fourier transform of random signals: Shazam
14:21 How are the statistical properties of random signals preserved under the Fourier transform?
19:37 Examples of the Fourier transform in nature
TapirCast - Bilimsel ve Teknolojik Gelişmeler: https://youtube.com/playlist?list=PLwvStmyxv70_rnTR_kItlrZvaIdxWgfIN TapirCast - Mühendislik Kavramları: https://youtube.com/playlist?list=PLwvStmyxv708xJad4QY9ZueBMGdLSz3m6 TapirCast - Bilim Tarihi: https://youtube.com/playlist?list=PLwvStmyxv70_XdrpkVTcYEylAltcL0Kth TapirCast - Tematik Bölümler: https://youtube.com/playlist?list=PLwvStmyxv70_1LSin7YTebFe3-dp6Jjz9 TapirCast - Tapir Lab.: https://youtube.com/playlist?list=PLwvStmyxv70-HaMqRbP1yyGMpv_xTKHgw TapirCast - TUAC: https://youtube.com/playlist?list=PLwvStmyxv70-lFqHV4ry-7-u1hAUy0r1y TapirCast - IEEE Spectum: https://youtube.com/playlist?list=PLwvStmyxv709QYQzgEyDRi9h8pS_BHICr TapirCast - VideoCast: https://youtube.com/playlist?list=PLwvStmyxv70-g2zpIygm-WWyGtHzNCcTP Apple Podcasts: @TapirCast, https://podcasts.apple.com/tr/podcast/tapircast/id1485098931 Spotify: @TapirCast, https://open.spotify.com/show/1QJduW17Sgvs1sofFgJN8L?si=6378c7e84186419e Tapir Lab. GitHub: @TapirLab, https://github.com/TapirLab Tapir Lab. Instagram: @tapirlab, https://www.instagram.com/tapirlab/ Tapir Lab. Twitter: @tapirlab, https://twitter.com/tapirlab?s=20 Tapir Lab.: http://tapirlab.com/ --- In this Communication Theory episode of TapirCast's Engineering Concepts series, Fourier series and the Fourier transform are discussed. Enjoy listening. #podcast #Engineering https://youtu.be/q383ufeE71s
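The Gibbs phenomenon discussed at 07:24 is easy to reproduce numerically: near a jump discontinuity, the partial Fourier sums of a square wave overshoot by roughly 9% of the jump no matter how many terms are kept. A small illustrative sketch (the wave and term counts are arbitrary choices):

```python
import numpy as np

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of the odd square wave (amplitude 1, period 2*pi)."""
    k = np.arange(1, 2 * n_terms, 2)                 # odd harmonics 1, 3, 5, ...
    return (4 / np.pi) * np.sum(np.sin(np.outer(x, k)) / k, axis=1)

# Sample densely just to the right of the jump at x = 0
x = np.linspace(1e-4, 0.2, 20000)
for n in (25, 100, 400):
    overshoot = square_wave_partial_sum(x, n).max()
    print(n, overshoot)   # stays near (2/pi) * Si(pi) ~ 1.179, not 1.0
```

Adding more terms narrows the overshoot region but never removes it, which is exactly why pointwise convergence at a jump needs the Dirichlet conditions' careful statement.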
We talk a lot about generative modeling on this podcast — at least since episode 6, with Michael Betancourt! And an area where this way of modeling is particularly useful is healthcare, as Maria Skoularidou will tell us in this episode. Maria is a final year PhD student at the University of Cambridge. Her thesis is focused on probabilistic machine learning and, more precisely, towards using generative modeling in… you guessed it: healthcare! But her fields of interest are diverse: from theory and methodology of machine intelligence to Bayesian inference; from theoretical computer science to information theory — Maria is knowledgeable in a lot of topics! That's why I also had to ask her about mixture models, a category of models that she uses frequently. Prior to her PhD, Maria studied Computer Science and Statistical Science at Athens University of Economics and Business. She's also invested in several efforts to bring more diversity and accessibility to the data science world. When she's not working on all this, you'll find her playing the ney, trekking or rowing. Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ ! Thank you to my Patrons for making this episode possible! Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, James Ahloy, Robin Taylor, Thomas Wiecki, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran, Paul Oreto, Colin Caprani, George Ho, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Matthew McAnear, Michael Hankin, Cameron Smith, Luis Iberico, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Aaron Jones, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton and Jeannine Sue. Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;) Links from the show: Maria on Twitter: https://twitter.com/skoularidou Maria on LinkedIn: https://www.linkedin.com/in/maria-skoularidou-1289b62a/ Maria's webpage: https://www.mrc-bsu.cam.ac.uk/people/in-alphabetical-order/n-to-s/maria-skoularidou/ Mixture models in PyMC: https://www.pymc.io/projects/examples/en/latest/gallery.html#mixture-models LBS #4 Dirichlet Processes and Neurodegenerative Diseases, with Karin Knudson: https://learnbayesstats.com/episode/4-dirichlet-processes-and-neurodegenerative-diseases-with-karin-knudson/ Bayesian mixtures with an unknown number of components: https://rss.onlinelibrary.wiley.com/doi/abs/10.1111/1467-9868.00095 Markov Chain sampling methods for Dirichlet Processes: https://www.tandfonline.com/doi/abs/10.1080/10618600.2000.10474879 Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models:
https://academic.oup.com/biomet/article-abstract/95/1/169/219181...
Tom and Dan pause their storytelling to interview two of the players who experienced Telessaria: Chris Metz, creator of Rix the tiefling sorcerer, and Marc Kugler, creator of Dirichlet the wood elf ranger. The players give a glimpse into the thought processes on the other side of the DM screen and raise their own questions about future adventures. --- This episode is sponsored by Anchor: The easiest way to make a podcast. https://anchor.fm/app
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.11.20.391573v1?rss=1 Authors: Mastrantonio, G., Bibbona, E., Furlan, M. Abstract: We propose a hierarchical Bayesian approach to infer the RNA synthesis, processing, and degradation rates from sequencing data. We parametrise kinetic rates with novel functional forms and estimate the parameters through a Dirichlet process defined at a low level of hierarchy. Despite the complexity of this approach, we manage to perform inference, clustering and model selection simultaneously. We apply our method to investigate transcriptional and post-transcriptional responses of murine fibroblasts to the activation of proto-oncogene MYC. We uncover a widespread choral regulation of the three rates, which was not previously observed in this biological system. Copy rights belong to original authors. Visit the link for more info
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.11.10.330183v1?rss=1 Authors: He, S., Schein, A., Sarsani, V., Flaherty, P. Abstract: There are distinguishing features or "hallmarks" of cancer that are found across tumors, individuals, and types of cancer, and these hallmarks can be driven by specific genetic mutations. Yet, within a single tumor there is often extensive genetic heterogeneity as evidenced by single-cell and bulk DNA sequencing data. The goal of this work is to jointly infer the underlying genotypes of tumor subpopulations and the distribution of those subpopulations in individual tumors by integrating single-cell and bulk sequencing data. Understanding the genetic composition of the tumor at the time of treatment is important in the personalized design of targeted therapeutic combinations and monitoring for possible recurrence after treatment. We propose a hierarchical Dirichlet process mixture model that incorporates the correlation structure induced by a structured sampling arrangement and we show that this model improves the quality of inference. We develop a representation of the hierarchical Dirichlet process prior as a Gamma-Poisson hierarchy and we use this representation to derive a fast Gibbs sampling inference algorithm using the augment-and-marginalize method. Experiments with simulated data show that our model outperforms standard numerical and statistical methods for decomposing admixed count data. Analysis of a real acute lymphoblastic leukemia sequencing dataset shows that our model improves upon state-of-the-art bioinformatic methods. An interpretation of the results of our model on this real dataset reveals co-mutated loci across samples. Copy rights belong to original authors. Visit the link for more info
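The Gamma-Poisson representation mentioned in the abstract rests on a standard identity: independent Gamma draws, normalized by their sum, are Dirichlet distributed. A quick numerical check of that identity (illustrative only, not the paper's sampler; the alpha values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
alpha = np.array([2.0, 5.0, 3.0])
n = 200_000

# Normalize independent Gamma(alpha_k, 1) draws...
g = rng.gamma(alpha, 1.0, size=(n, 3))
normalized = g / g.sum(axis=1, keepdims=True)

# ...and compare with direct Dirichlet draws: the component means should agree
direct = rng.dirichlet(alpha, size=n)
print(normalized.mean(axis=0))   # ~ alpha / alpha.sum() = [0.2, 0.5, 0.3]
print(direct.mean(axis=0))
```

This is the hinge that lets augment-and-marginalize schemes trade a Dirichlet(-process) prior for unnormalized Gamma variables, where Poisson conjugacy makes Gibbs updates cheap.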
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.11.10.376533v1?rss=1 Authors: Gusmao, E. G., Mizi, A., Brant, L., Papantonis, A. Abstract: The advent of the chromosome conformation capture (3C) and related technologies has profoundly renewed our understanding of three-dimensional chromatin organization in mammalian nuclei. Alongside these experimental approaches, numerous computational tools for handling, normalizing, visualizing, and ultimately detecting interactions in 3C-type datasets are being developed. Here, we present Bloom, a comprehensive method for the analysis of 3C-type data matrices on the basis of Dirichlet process mixture models that addresses two important open issues. First, it retrieves occult interaction patterns from sparse data, like those derived from single-cell Hi-C experiments; thus, bloomed sparse data can now be used to study interaction landscapes at sub-kbp resolution. Second, it detects enhancer-promoter interactions with high sensitivity and inherently assigns an interaction frequency score (IFS) to each contact. Using enhancer perturbation data of different throughput, we show that IFS accurately quantifies the regulatory influence of each enhancer on its target promoter. As a result, Bloom allows decoding of complex regulatory landscapes by generating functionally-relevant enhancer atlases solely on the basis of 3C-type data. Copy rights belong to original authors. Visit the link for more info
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.10.26.355941v1?rss=1 Authors: Yang, G., Sabunciyan, S., Florea, L. D. Abstract: Alternative splicing of mRNA is an essential gene regulatory mechanism with important roles in development and disease. We present MntJULiP, a method for comprehensive and accurate quantification of splicing differences between two or more conditions. MntJULiP implements novel Dirichlet-multinomial and zero-inflated negative binomial models within a Bayesian framework to detect both changes in splicing ratios and in absolute splicing levels of introns with high accuracy, and can find classes of variation overlooked by reference tools. Additionally, a mixture model allows multiple conditions to be compared simultaneously. Highly scalable, it processed hundreds of GTEx samples in
Daniel Litt really likes Dirichlet's theorem on primes in arithmetic progressions and it's easy to see why. But we'll let him explain. Also Holmes and Watson make an appearance.
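Dirichlet's theorem says every arithmetic progression a, a+q, a+2q, … with gcd(a, q) = 1 contains infinitely many primes, and in fact the primes split evenly among the eligible residue classes. A quick empirical check mod 4, as an illustrative sketch (the bound 100,000 is arbitrary):

```python
def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [p for p in range(2, n + 1) if is_prime[p]]

primes = primes_up_to(100_000)
count_1 = sum(1 for p in primes if p % 4 == 1)   # primes in 1, 5, 9, 13, ...
count_3 = sum(1 for p in primes if p % 4 == 3)   # primes in 3, 7, 11, 15, ...
print(count_1, count_3)  # both large, and nearly equal, as the theorem predicts
```

Every odd prime lands in one of the two classes, and the counts track each other closely (with the curious slight lead for 3 mod 4 known as Chebyshev's bias).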
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.10.01.322768v1?rss=1 Authors: Hossain, S. M. M., Halsana, A. A., Khatun, L., Ray, S., Mukhopadhyay, A. Abstract: Pancreatic Ductal Adenocarcinoma (PDAC) is the most lethal type of pancreatic cancer (PC); late detection leads to therapeutic failure. This study aims to identify key regulatory genes and their impact on the progression of the disease, shedding light on its etiology, which is still largely unknown. We leverage time-series gene expression data for this disease, so that the identified key regulators capture the characteristics of gene activity patterns during the progression of the cancer. We have identified the key modules and predicted gene functions of top genes from the compiled gene association network (GAN). Here, we have used the natural cubic spline regression model (splineTimeR) to identify differentially expressed genes (DEGs) from the PDAC microarray time-series data downloaded from the Gene Expression Omnibus (GEO). First, we identified key transcriptomic regulators (TRs) and DNA-binding transcription factors (DbTFs). Subsequently, the Dirichlet process and Gaussian process (DPGP) mixture model was utilized to identify the key gene modules. A variation of the partial correlation method was utilized to analyze the GAN, followed by gene function prediction from the network. Finally, a panel of key genes related to PDAC is highlighted from each of the analyses performed. Copy rights belong to original authors. Visit the link for more info
Episode: 1919 Möbius and his strip: an abstract spur to applied mathematics. Today, Möbius' strip.
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.08.20.258772v1?rss=1 Authors: Nagpal, S., Srivastava, D., Mande, S. S. Abstract: Topic modeling is frequently employed for discovering structures (or patterns) in a corpus of documents. Its utility in text-mining and document retrieval tasks in various fields of scientific research is rather well known. An unsupervised machine learning approach, Latent Dirichlet Allocation (LDA), has particularly been utilized for identifying latent (or hidden) topics in document collections and for deciphering the words that define one or more topics using a generative statistical model. Here we describe how SARS-CoV-2 genomic mutation profiles can be structured into a Bag of Words (BoW) to enable identification of signatures (topics) and their probabilistic distribution across various genomes using LDA. Topic models were generated using ~47000 novel coronavirus genomes (considered as documents), leading to identification of 16 amino acid mutation signatures and 18 nucleotide mutation signatures (equivalent to topics) in the corpus of chosen genomes through coherence optimization. The document assumption for genomes also helped in identification of contextual nucleotide mutation signatures in the form of conventional N-grams (e.g. bi-grams and tri-grams). We validated the signatures obtained using the LDA-driven method against the previously reported phylogenetic clades for genomes. Additionally, we report the distribution of the identified mutation signatures on the global map of SARS-CoV-2 genomes. The use of non-phylogenetic albeit classical approaches like topic modeling and other data-centric pattern-mining algorithms is therefore proposed to supplement efforts towards understanding the genomic diversity of the evolving SARS-CoV-2 genomes (and other pathogens/microbes). Copy rights belong to original authors. Visit the link for more info
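The "generative statistical model" behind LDA is compact: each topic is a Dirichlet-distributed distribution over words, each document a Dirichlet-distributed mixture of topics, and every word is drawn by first picking a topic and then a word from it. A toy numpy sketch of that generative process (all sizes and hyperparameters below are illustrative, not the paper's pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)
n_topics, vocab_size, n_docs, doc_len = 3, 50, 10, 200

# Each topic: a Dirichlet-distributed distribution over the vocabulary
topics = rng.dirichlet(np.full(vocab_size, 0.1), size=n_topics)

docs = []
for _ in range(n_docs):
    # Each document: a Dirichlet-distributed mixture over topics
    theta = rng.dirichlet(np.full(n_topics, 0.5))
    # Each word: pick a topic z, then a word from topic z
    z = rng.choice(n_topics, size=doc_len, p=theta)
    words = [rng.choice(vocab_size, p=topics[t]) for t in z]
    docs.append(np.bincount(words, minlength=vocab_size))

counts = np.array(docs)
print(counts.shape)  # a (documents x vocabulary) count matrix, LDA's input format
```

Inference runs this process in reverse: given only the count matrix, it recovers plausible topics and per-document mixtures — here with mutation profiles standing in for words.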
What do neurodegenerative diseases, gerrymandering and ecological inference all have in common? Well, they can all be studied with Bayesian methods — and that’s exactly what Karin Knudson is doing. In this episode, Karin will share with us the vital work she does to understand aspects of neurodegenerative diseases. She’ll also tell us more about computational neuroscience and Dirichlet processes — what they are, what they do, and when you should use them. Karin did her doctorate in mathematics, with a focus on compressive sensing and computational neuroscience, at the University of Texas at Austin. Her doctoral work included applying hierarchical Dirichlet processes in the setting of neural data and focused on one-bit compressive sensing and spike-sorting. Formerly the chair of the math and computer science department of Phillips Academy Andover, she started a postdoc at Mass General Hospital and Harvard Medical in Fall 2019. Most importantly, rock climbing and hiking have no secrets for her! Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ ! Links from the show, personally curated by Karin Knudson: Karin on Twitter: https://twitter.com/karinknudson Spike train entropy-rate estimation using hierarchical Dirichlet process priors (Knudson and Pillow): https://pillowlab.princeton.edu/pubs/abs_Knudson_HDPentropy_NIPS13.html Fighting Gerrymandering with PyMC3, PyCon 2018, Colin Carroll and Karin Knudson: https://www.youtube.com/watch?v=G9I5ZnkWR0A Expository resources on Dirichlet Processes: Chapter 23 of Bayesian Data Analysis (Gelman et al.)
and http://www.gatsby.ucl.ac.uk/~ywteh/research/npbayes/dp.pdf Hierarchical Dirichlet Processes (introduced the HDP and included applications in topic modeling and for working with time-series data and Hidden Markov Models): https://www.stat.berkeley.edu/~aldous/206-Exch/Papers/hierarchical_dirichlet.pdf A Sticky HDP-HMM with applications to speaker diarization (a nice example of how the HDP can be used with HMM, in this case cleverly adapted so that states have more persistence): https://arxiv.org/abs/0905.2592 If you want to get deeper into the weeds and also get a sense of the history: Dirichlet Processes with Applications to Bayesian Nonparametric Problems (https://projecteuclid.org/euclid.aos/1176342871) and A Bayesian Analysis of Some Nonparametric Problems (https://projecteuclid.org/euclid.aos/1176342360) --- Send in a voice message: https://anchor.fm/learn-bayes-stats/message
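As a concrete anchor for the expository links above: a Dirichlet process DP(α, G0) can be sampled via Sethuraman's stick-breaking construction — weights w_k = β_k ∏_{j<k} (1 − β_j) with β_k ~ Beta(1, α), and atoms drawn i.i.d. from the base measure G0. A minimal numpy sketch (α = 2 and G0 = N(0, 1) are arbitrary choices, and the construction is truncated at a finite number of sticks):

```python
import numpy as np

rng = np.random.default_rng(7)

def stick_breaking_dp(alpha, n_sticks, rng):
    """Truncated stick-breaking draw from DP(alpha, G0), with G0 = N(0, 1)."""
    betas = rng.beta(1.0, alpha, size=n_sticks)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas)[:-1]])
    weights = betas * remaining                    # w_k = beta_k * prod_{j<k}(1 - beta_j)
    atoms = rng.normal(0.0, 1.0, size=n_sticks)    # i.i.d. draws from the base measure
    return weights, atoms

weights, atoms = stick_breaking_dp(alpha=2.0, n_sticks=1000, rng=rng)
print(weights.sum())   # close to 1 for a deep truncation

# A realization of the DP is a discrete distribution: samples repeat atoms,
# which is exactly what makes it useful as a clustering prior
samples = rng.choice(atoms, size=500, p=weights / weights.sum())
print(len(np.unique(samples)))  # far fewer distinct values than 500 draws
```

Small α concentrates mass on a few sticks (few clusters); large α spreads it out, which is the knob the episode's mixture-model discussion keeps returning to.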
This Week in Machine Learning & Artificial Intelligence (AI) Podcast
Today we’re joined by William Fehlman, director of data science at USAA. We caught up with William a while back to discuss his work on topic modeling, which USAA uses in various scenarios, including chat channels with members via mobile and desktop interfaces; how their datasets are generated; and the topic-modeling methodologies they explored, including latent semantic indexing, latent Dirichlet allocation, and non-negative matrix factorization. We also explore how terms are represented via a document-term matrix, and how topics are scored based on coherence. The complete show notes can be found at twimlai.com/talk/276. Visit twimlcon.com to learn more about the TWIMLcon: AI Platforms conference! Early-bird registration ends on 6/28!
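The document-term matrix mentioned in the notes above is just a table of word counts — one row per document, one column per vocabulary term. A minimal pure-Python sketch (the toy corpus below is invented for illustration, not USAA data):

```python
from collections import Counter

docs = [
    "change my auto policy",
    "auto loan rate question",
    "policy renewal question",
]

# Build a shared vocabulary, then count each term per document
tokenized = [d.split() for d in docs]
vocab = sorted({w for doc in tokenized for w in doc})
matrix = [[Counter(doc)[term] for term in vocab] for doc in tokenized]

for row in matrix:
    print(row)
```

LSI, LDA and NMF all start from this matrix (or a weighting of it, such as TF-IDF) and differ only in how they factor it into topics.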
Andrii Khrabustovskyi works at our faculty in the group Nonlinear Partial Differential Equations and is a member of the CRC Wave phenomena: analysis and numerics. He was born in Kharkiv in Ukraine and finished his studies as well as his PhD at the Kharkiv National University and the Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine. He joined our faculty in 2012 as postdoc in the former Research Training Group 1294 Analysis, Simulation and Design of Nanotechnological Processes, which was active until 2014. Gudrun Thäter talked with him about one of his research interests, asymptotic analysis and homogenization of PDEs. Photonic crystals are periodic dielectric media in which electromagnetic waves from certain frequency ranges cannot propagate. Mathematically speaking, this is due to gaps in the spectrum of the related differential operators. An interesting question is therefore whether there are gaps between bands of the spectrum of operators related to wave propagation, especially on periodic geometries and with periodic coefficients in the operator. It is known that the spectrum of periodic selfadjoint operators has band structure: the spectrum is a locally finite union of compact intervals called bands. In general, the bands may overlap and the existence of gaps is therefore not guaranteed. A simple example is the spectrum of the Laplacian on the whole space, which is the half-axis [0, ∞) and thus has no gaps. The classic approach to such problems in the whole space case is the Floquet–Bloch theory. Homogenization is a collection of mathematical tools which are applied to media with strongly inhomogeneous parameters or highly oscillating geometry. Roughly speaking, the aim is to replace the complicated inhomogeneous medium by a simpler homogeneous medium with similar properties and characteristics. In our case we deal with PDEs with periodic coefficients in a periodic geometry which is considered to be infinite.
In the limit of a characteristic small parameter going to zero it behaves like a corresponding homogeneous medium. To make this a bit more mathematically rigorous, one can consider a sequence of operators with a small parameter (e.g. concerning cell size or material properties) and has to prove some properties in the limit as the parameter goes to zero. The optimal result is that it converges to some operator which is the right homogeneous one. If this limit operator has gaps in its spectrum, then the gaps are present in the spectra of the pre-limit operators (for small enough parameter). The advantages of the homogenization approach compared to the classical one with Floquet–Bloch theory are: The knowledge of the limit operator is helpful and only available through homogenization. For finite domains Floquet–Bloch does not work well. Though we always have a discrete spectrum there, we might want to have the gaps in a fixed position independent of the size of our domain; here the homogenization theory works in principle also for the bounded case (it is just a bit technical). An interesting geometry in this context is a domain with periodically distributed holes. The question arises: what happens if the sizes of the holes and the period simultaneously go to zero? The easiest operator which we can study is the Laplace operator subject to Dirichlet boundary conditions. There are three possible regimes: For holes of the same order as the period (even slightly smaller), the Dirichlet conditions on the boundary of the holes dominate — the solution of the corresponding Poisson equation tends to zero. For significantly smaller holes, the influence of the holes is so small that the problem "forgets" about them as the parameter goes to zero. There is a borderline case which lies between cases 1 and 2; it exhibits some interesting effects and can explain the occurrence of so-called strange terms.
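The first regime can be seen in a crude finite-difference experiment: solve the Poisson equation −Δu = 1 on the unit square with u = 0 on the outer boundary, then additionally pin u = 0 on a periodic pattern of interior "holes". A rough illustrative sketch (Jacobi iteration; grid size, hole pattern and iteration count are arbitrary choices):

```python
import numpy as np

def solve_poisson(n=41, holes=False, iters=8000):
    """Jacobi iteration for -Laplace(u) = 1 on the unit square, u = 0 on the boundary."""
    h = 1.0 / (n - 1)
    u = np.zeros((n, n))
    idx = np.arange(n)
    # Periodically distributed 3x3 blocks of nodes where we enforce u = 0 ("holes")
    hole_idx = idx[(idx % 8 >= 3) & (idx % 8 <= 5)]
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:] + h * h)
        if holes:
            u[np.ix_(hole_idx, hole_idx)] = 0.0
    return u

plain = solve_poisson(holes=False).max()
perforated = solve_poisson(holes=True).max()
print(plain, perforated)  # Dirichlet conditions on the holes pull the solution toward zero
```

With holes comparable to the period, shrinking both would drive the maximum further toward zero, which is exactly the "Dirichlet conditions dominate" regime described above.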
A traditional ansatz in homogenization works with the concept of so-called slow and fast variables. The name comes from the following observation. If we consider an infinite layer in cylindrical coordinates, then the variable r measures the distance from the origin when going "along the layer", the angle in that plane, and z is the variable which goes into the finite direction perpendicular to that plane. When we have functions then the derivative with respect to r changes the power to while the other derivatives leave that power unchanged. In the interesting case k is negative and the r-derivate makes it decreasing even faster. This leads to the name fast variable. The properties in this simple example translate as follows. For any function we will think of having a set of slow and fast variables (characteristic to the problem) and a small parameter eps and try to find u as where in our applications typically . One can formally sort through the -levels using the properties of the differential operator. The really hard part then is to prove that this formal result is indeed true by finding error estimates in the right (complicated) spaces. There are many more tools available like the technique of Tartar/Murat, who use a weak formulation with special test functions depending on the small parameter. The weak point of that theory is that we first have to know the resulat as the parameter goes to zero before we can to construct the test function. Also the concept of Gamma convergence or the unfolding trick of Cioranescu are helpful. An interesting and new application to the mathematical results is the construction of wave guides. The corresponding domain in which we place a waveguide is bounded in two directions and unbounded in one (e.g. an unbounded cylinder). Serguei Nazarov proposed to make holes in order to make gaps into the line of the spectrum for a specified wave guide. 
Andrii Khrabustovskyi suggests distributing finitely many traps, which do not influence the essential spectrum but add eigenvalues. One interesting effect is that in this way one can find terms which are nonlocal in time or space and thus stand for memory effects of the material. References P. Exner and A. Khrabustovskyi: On the spectrum of narrow Neumann waveguide with periodically distributed δ′ traps, Journal of Physics A: Mathematical and Theoretical, 48 (31) (2015), 315301. A. Khrabustovskyi: Opening up and control of spectral gaps of the Laplacian in periodic domains, Journal of Mathematical Physics, 55 (12) (2014), 121502. A. Khrabustovskyi: Periodic elliptic operators with asymptotically preassigned spectrum, Asymptotic Analysis, 82 (1-2) (2013), 1-37. S.A. Nazarov, G. Thäter: Asymptotics at infinity of solutions to the Neumann problem in a sieve-type layer, Comptes Rendus Mecanique 331(1) (2003) 85-90. S.A. Nazarov: Asymptotic Theory of Thin Plates and Rods: Vol.1. Dimension Reduction and Integral Estimates. Nauchnaya Kniga: Novosibirsk, 2002.
Andrii Khrabustovskyi works at our faculty in the group Nonlinear Partial Differential Equations and is a member of the CRC Wave phenomena: analysis and numerics. He was born in Kharkiv in Ukraine and finished his studies as well as his PhD at the Kharkiv National University and the Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine. He joined our faculty in 2012 as a postdoc in the former Research Training Group 1294 Analysis, Simulation and Design of Nanotechnological Processes, which was active until 2014. Gudrun Thäter talked with him about one of his research interests: asymptotic analysis and homogenization of PDEs. Photonic crystals are periodic dielectric media in which electromagnetic waves from certain frequency ranges cannot propagate. Mathematically speaking, this is due to gaps in the spectrum of the related differential operators. An interesting question is therefore whether there are gaps in between bands of the spectrum of operators related to wave propagation, especially on periodic geometries and with periodic coefficients in the operator. It is known that the spectrum of periodic self-adjoint operators has band structure: the spectrum is a locally finite union of compact intervals called bands. In general, the bands may overlap, and the existence of gaps is therefore not guaranteed. A simple example is the spectrum of the Laplacian in the whole space, which is the half axis [0, ∞) and thus has no gaps. The classic approach to such problems in the whole-space case is Floquet–Bloch theory. Homogenization is a collection of mathematical tools which are applied to media with strongly inhomogeneous parameters or highly oscillating geometry. Roughly speaking, the aim is to replace the complicated inhomogeneous medium by a simpler homogeneous medium with similar properties and characteristics. In our case we deal with PDEs with periodic coefficients in a periodic geometry which is considered to be infinite. 
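The band structure mentioned above can be sketched schematically; this is the standard Floquet–Bloch picture, not notation taken from the episode itself:

```latex
% Floquet-Bloch theory decomposes a periodic self-adjoint operator A
% into a family A(\theta) on the periodicity cell, parametrized by the
% quasimomentum \theta. The spectrum is then the union of the bands,
% i.e. the ranges of the cell eigenvalues \lambda_k(\theta):
\sigma(A) \;=\; \bigcup_{k \in \mathbb{N}}
  \left[\, \min_{\theta} \lambda_k(\theta),\; \max_{\theta} \lambda_k(\theta) \,\right].
```

A gap opens exactly when two consecutive bands fail to overlap, which is why overlapping bands make the existence of gaps non-trivial.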
In the limit of a characteristic small parameter going to zero, it behaves like a corresponding homogeneous medium. To make this a bit more mathematically rigorous, one can consider a sequence of operators with a small parameter (e.g. concerning cell size or material properties) and prove properties of the limit as the parameter goes to zero. The optimal result is that the sequence converges to the right homogeneous operator. If this limit operator has gaps in its spectrum, then the gaps are also present in the spectra of the pre-limit operators (for a small enough parameter). The advantages of the homogenization approach compared to the classical one with Floquet–Bloch theory are: the knowledge of the limit operator is helpful and only available through homogenization, and for finite domains Floquet–Bloch theory does not work well. Though we always have a discrete spectrum there, we might want the gaps to sit in a fixed position independent of the size of our domain; here homogenization theory works in principle also for the bounded case (it is just a bit technical). An interesting geometry in this context is a domain with periodically distributed holes. The question arises: what happens if the sizes of the holes and the period simultaneously go to zero? The easiest operator which we can study is the Laplace operator subject to Dirichlet boundary conditions. There are three possible regimes: For holes of the same order as the period (even slightly smaller), the Dirichlet conditions on the boundary of the holes dominate, and the solution of the corresponding Poisson equation tends to zero. For significantly smaller holes, the influence of the holes is so small that the problem "forgets" about them as the parameter goes to zero. There is a borderline case which lies between cases 1 and 2. It exhibits some interesting effects and can explain the occurrence of so-called strange terms. 
A traditional ansatz in homogenization works with the concept of so-called slow and fast variables. The name comes from the following observation. If we consider an infinite layer in cylindrical coordinates, then the variable r measures the distance from the origin when going "along the layer", a second variable measures the angle in that plane, and z is the variable which goes into the finite direction perpendicular to that plane. For functions behaving like a power of r, the derivative with respect to r lowers that power by one, while the other derivatives leave it unchanged. In the interesting case the power is negative, and the r-derivative makes the function decrease even faster; this leads to the name fast variable. The properties in this simple example translate as follows. For any function u we think of having a set of slow and fast variables (characteristic to the problem) and a small parameter ε, and try to find u as an expansion in powers of ε whose coefficients depend on both sets of variables. One can formally sort through the ε-levels using the properties of the differential operator. The really hard part then is to prove that this formal result is indeed true by finding error estimates in the right (complicated) spaces. There are many more tools available, like the technique of Tartar/Murat, who use a weak formulation with special test functions depending on the small parameter. The weak point of that theory is that we first have to know the result as the parameter goes to zero before we can construct the test function. Also the concept of Gamma-convergence and the unfolding trick of Cioranescu are helpful. An interesting and new application of the mathematical results is the construction of waveguides. The corresponding domain in which we place a waveguide is bounded in two directions and unbounded in one (e.g. an unbounded cylinder). Serguei Nazarov proposed to make holes in order to open gaps in the spectrum of a specified waveguide. 
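The expansion alluded to above can be sketched in the standard two-scale notation of homogenization theory (a generic sketch, not a formula quoted from the episode):

```latex
% Two-scale ansatz with slow variable x and fast variable y = x/\varepsilon,
% where each coefficient u_k(x, y) is periodic in the fast variable y:
u_\varepsilon(x) \;\approx\; \sum_{k \ge 0} \varepsilon^{k}\,
   u_k\!\left(x, \tfrac{x}{\varepsilon}\right).
% Substituting this into the PDE and using the chain rule
% \nabla = \nabla_x + \tfrac{1}{\varepsilon}\,\nabla_y
% lets one sort the equation formally by powers of \varepsilon.
```

Matching the terms at each power of ε yields the cell problems and the homogenized equation; the hard analytical work is then justifying this formal procedure with error estimates.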
Andrii Khrabustovskyi suggests distributing finitely many traps, which do not influence the essential spectrum but add eigenvalues. One interesting effect is that in this way one can find terms which are nonlocal in time or space and thus stand for memory effects of the material. References P. Exner and A. Khrabustovskyi: On the spectrum of narrow Neumann waveguide with periodically distributed δ′ traps, Journal of Physics A: Mathematical and Theoretical, 48 (31) (2015), 315301. A. Khrabustovskyi: Opening up and control of spectral gaps of the Laplacian in periodic domains, Journal of Mathematical Physics, 55 (12) (2014), 121502. A. Khrabustovskyi: Periodic elliptic operators with asymptotically preassigned spectrum, Asymptotic Analysis, 82 (1-2) (2013), 1-37. S.A. Nazarov, G. Thäter: Asymptotics at infinity of solutions to the Neumann problem in a sieve-type layer, Comptes Rendus Mecanique, 331 (1) (2003), 85-90. S.A. Nazarov: Asymptotic Theory of Thin Plates and Rods, Vol. 1: Dimension Reduction and Integral Estimates, Nauchnaya Kniga, Novosibirsk, 2002.
In this new episode we discuss Latent Semantic Indexing-type machine learning algorithms which have a probabilistic interpretation. We explain why such a probabilistic interpretation is important and discuss how such algorithms can be used in the design of document retrieval systems, search engines, and recommender systems. Check us out at: www.learningmachines101.com and follow us on Twitter at: @lm101talk
David Hipp contributed to the Cooking Math project. In the research project he presents there, he considers a relatively simple form of the wave equation which is nevertheless well suited for describing acoustic waves. The equation describes wave propagation in space by means of a partial differential equation. The solution of the wave equation is a function whose variables are time and position. Concretely, the equation relates temporal and spatial changes of the state, i.e. of the function, in order to describe the wave propagation. Besides the partial differential equation (here the wave equation), the mathematical model for wave propagation in bounded domains also comprises the initial condition, which fixes the state and the velocity at the beginning of the process, as well as the conditions on the boundary of the domain. Physically it is clear that waves hitting a surface at the boundary can, for example, be reflected, refracted or scattered, depending on the material properties of the boundary. In his research, David Hipp investigates in particular the influence of the boundary conditions on simulations of such problems, in his case wave propagation in space. Classically, one often chooses Dirichlet or Neumann boundary conditions, or the Robin condition as a mixture of the two. These three boundary conditions are, however, not always realistic enough, because they do not allow motion on the boundary. Therefore, dynamic boundary conditions are currently being studied: conditions which allow the wave to move on the boundary. The wave can then transfer energy to the wall, and the wall itself becomes an active part of the wave propagation. This can even lead to surface waves on the wall. Conventional numerical methods first have to be adapted to these novel boundary conditions. 
David Hipp can fall back on finite element discretization in space combined with classical time stepping schemes, but it has to be checked whether these methods still deliver results as good as those one is used to from standard applications. One challenge of dynamic boundary conditions is that different scales can occur in the process, and these then have to be accounted for. For example, the wall and the acoustic wave oscillate with different speeds or frequencies. At the moment it suffices for his test computations to leave the boundary mesh of the finite element discretization unchanged. In the future, however, the aim is to make adaptations for a wide range of materials possible in order to do justice to the different scales. David Hipp entered the collaboration with the Hochschule für Gestaltung (HfG) in the Cooking Math project very openly and with few concrete expectations. In the end, the result of the collaboration with Oliver Jelko from the HfG is a mixture of an educational video on the mathematics of wave propagation and a professional animation of numerical test computations for three different boundary conditions: Dirichlet, Neumann and the acoustic boundary condition. The acoustic boundary condition is a dynamic boundary condition based on the model idea that the wall consists of many tiny springs which start to oscillate when they are excited by impinging acoustic waves. For mathematicians, the visual presentation of results is part of our work and is, for example, also a form of verification. But a professional animation of Dirichlet, Neumann and acoustic boundary conditions by a designer is more accessible and allows an intuitive understanding. The video from the Cooking Math project Literature and additional information J. T. Beale, S. I. Rosencrans: Acoustic boundary conditions, Bull. Amer. Math. Soc. 
80, 1276-1278, 1974. S. Larsson, V. Thomee: Partial Differential Equations with Numerical Methods, Springer, 2003. V. Rao: Boundary Condition Thinking, a popular-science approach to boundary conditions, 2011. R.P. Vito and S.A. Dixon: Blood Vessel Constitutive Models, 1995-2002, Annual Review of Biomedical Engineering 5, 413-439, 2003. Podcasts J. Enders, C. Spatschek: Cooking Math, conversation with G. Thäter and S. Ritterbusch in the Modellansatz Podcast, episode 80, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/cooking-math J. Eilinghoff: Splitting, conversation with G. Thäter in the Modellansatz Podcast, episode 81, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/splitting P. Krämer: Zeitintegration, conversation with G. Thäter in the Modellansatz Podcast, episode 82, Department of Mathematics, Karlsruhe Institute of Technology (KIT), 2016. http://modellansatz.de/zeitintegration
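As a rough illustration of the boundary conditions discussed in this episode, here is a minimal 1D wave equation sketch (hypothetical parameters, not David Hipp's actual code): an explicit finite difference scheme where the boundary condition determines how the wave is reflected at the walls.

```python
# Minimal sketch: 1D wave equation u_tt = c^2 u_xx on [0, 1], explicit
# leapfrog finite differences, comparing Dirichlet (u = 0 at the wall)
# and Neumann (u_x = 0 at the wall) boundary conditions.
import math

def simulate(bc, n=101, steps=200, c=1.0):
    dx = 1.0 / (n - 1)
    dt = 0.5 * dx / c                      # CFL-stable time step
    r2 = (c * dt / dx) ** 2
    # initial bump in the middle, zero initial velocity
    u_prev = [math.exp(-200 * (i * dx - 0.5) ** 2) for i in range(n)]
    u = u_prev[:]                          # zero velocity: first two steps equal
    for _ in range(steps):
        u_next = [0.0] * n
        for i in range(1, n - 1):
            u_next[i] = 2 * u[i] - u_prev[i] + r2 * (u[i+1] - 2*u[i] + u[i-1])
        if bc == "dirichlet":              # wave reflected with inverted sign
            u_next[0] = u_next[-1] = 0.0
        elif bc == "neumann":              # wave reflected without sign flip
            u_next[0], u_next[-1] = u_next[1], u_next[-2]
        u_prev, u = u, u_next
    return u

energyish_d = sum(v * v for v in simulate("dirichlet"))
energyish_n = sum(v * v for v in simulate("neumann"))
```

The dynamic (acoustic) boundary condition from the episode would replace the two fixed-wall lines with its own evolution equation for the boundary values, coupling the wall's spring model to the interior wave.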
How do populations evolve? This question inspired Alberto Saldaña's PhD thesis on partial symmetries of solutions to nonlinear elliptic and parabolic problems in bounded radial domains. He considered an extended Lotka-Volterra model describing the dynamics of two species, such as wolves, in a bounded radial domain: for each species, the model contains the diffusion of individual beings, the birth rate, the saturation rate or concentration, and the aggressiveness rate. Starting from an initial condition, a distribution of the two species in the regarded domain, the equations with additional constraints for well-posedness describe the future outcome. In the long run, this could either be co-existence, or extinction of one or both species. In case of co-existence, the question is how they will separate on the assumed radial bounded domain. For this, he adapted a moving plane method. On a bounded domain, the given boundary conditions are an important aspect of the mathematical model: in this setup, a homogeneous Neumann boundary condition can represent a fence which no one, or no wolf, can cross, whereas a homogeneous Dirichlet boundary condition assumes a lethal boundary, such as an electric fence or a cliff, which sets the density of living, or surviving, individuals touching the boundary to zero. The initial conditions, that is the distribution of the wolf species, were quite general but assumed to be nearly reflectionally symmetric. The analytical treatment of the system was less tedious in the case of Neumann boundary conditions due to reflection symmetry at the boundary, similar to the method of image charges in electrostatics. The case of Dirichlet boundary conditions needed more analytical results, such as Serrin's boundary point lemma. It turned out that asymptotically, in both cases, the two species will separate into two symmetric functions. 
Here, Saldaña introduced a new aspect to this problem: he let the birth rate, saturation rate and aggressiveness rate vary in time. This time-dependence models seasons, as for example wolves' behaviour depends on food availability. The Lotka-Volterra model can also be adapted to a predator-prey setting or a cooperative setting, where the two species live symbiotically. In the latter case, there also is an asymptotic solution, in which the two species do not separate; they stay together. Alberto Saldaña started his academic career in Mexico, where he found his love for mathematical analysis. He then did his PhD in Frankfurt, and now he is a postdoc in the Mathematical Department at the University of Brussels. Literature and additional material A. Saldaña, T. Weth: On the asymptotic shape of solutions to Neumann problems for non-cooperative parabolic systems, Journal of Dynamics and Differential Equations, Volume 27, Issue 2, pp 307-332, 2015. A. Saldaña: Qualitative properties of coexistence and semi-trivial limit profiles of nonautonomous nonlinear parabolic Dirichlet systems, Nonlinear Analysis: Theory, Methods and Applications, 130:31-46, 2016. A. Saldaña: Partial symmetries of solutions to nonlinear elliptic and parabolic problems in bounded radial domains, PhD thesis, Johann Wolfgang Goethe-Universität Frankfurt am Main, Germany, 2014. A. Saldaña, T. Weth: Asymptotic axial symmetry of solutions of parabolic equations in bounded radial domains, Journal of Evolution Equations 12.3: 697-712, 2012.
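To make the competition dynamics concrete, here is an illustrative sketch of only the kinetic (reaction) part of a two-species Lotka-Volterra competition model, with hypothetical parameters and diffusion omitted; it is not taken from Saldaña's thesis.

```python
# Kinetics of two competing species u and v (diffusion omitted):
#   u' = u (a1 - b1*u - c1*v),   v' = v (a2 - b2*v - c2*u)
# a: birth rate, b: saturation, c: aggressiveness (cross-competition).
# All parameter values below are made up for illustration.
def competition(u, v, a1=1.0, b1=1.0, c1=0.5, a2=1.0, b2=1.0, c2=0.5,
                dt=0.01, steps=5000):
    for _ in range(steps):
        du = u * (a1 - b1 * u - c1 * v)
        dv = v * (a2 - b2 * v - c2 * u)
        u, v = u + dt * du, v + dt * dv   # forward Euler step
    return u, v

# With weak competition (c < b) the species coexist: for the symmetric
# parameters above the stable equilibrium is u = v = a / (b + c) = 2/3.
u_inf, v_inf = competition(0.1, 0.2)
```

Raising the aggressiveness rates c1, c2 above the saturation rates switches the system to competitive exclusion, where one species dies out, mirroring the coexistence-versus-extinction dichotomy described above.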
Kiwan, R (A. U. Dubaï) Wednesday 13 May 2015, 15:15-16:00
Fakultät für Physik - Digitale Hochschulschriften der LMU - Teil 04/05
In the present thesis, strongly correlated quantum states of matter are investigated with the help of the generalized gauge/gravity duality, which connects strongly coupled gauge theories with weakly curved gravitational theories. The focus lies on applications to condensed matter systems, in particular high-temperature superconductivity and quantum critical states at vanishing temperature. The gauge/gravity duality originates in string theory and provides a realization of the holographic principle. For this reason, a short introduction to the concepts of string theory and their implications for the holographic principle is given. For a deeper understanding of the effective low-energy field theories, supersymmetry is needed in addition. Equipped with a robust string theory background, the different interpretations of Dirichlet branes (D-branes), extended objects on which open strings can end, are discussed: on the one hand as massive solitonic solutions of type II supergravity, and on the other hand in their role as sources for supersymmetric Yang-Mills theories. Connecting these different viewpoints on D-branes yields an explicit construction of the gauge/gravity duality, more precisely the AdS_5/CFT_4 correspondence between N=4 supersymmetric SU(N_c) Yang-Mills theory in four dimensions with vanishing beta function to all orders, hence a truly conformal theory, and type IIB supergravity on the ten-dimensional AdS_5 x S^5 spacetime. Moreover, the dictionary translating between the operators of the conformal field theory and the gravitational fields is introduced in detail. 
More precisely, the partition function of the strongly coupled N=4 supersymmetric Yang-Mills theory in the limit of large N_c is identical to the partition function of supergravity, evaluated on the corresponding solutions of the equations of motion at the boundary of the AdS space. Applying perturbative quantum field theory and the connections to the quantum statistical partition function allows the extension of the holographic dictionary to systems at finite density and finite temperature. For this reason, all aspects of quantum field theory required for the application of linear response theory, the computation of correlation functions and the description of critical phenomena are covered, with emphasis on general connections between thermodynamics, statistical physics or statistical field theory, and quantum field theory. Furthermore, the renormalization group formalism for the description of effective field theories and critical phenomena is laid out in detail in the context of the generalized gauge/gravity duality. The following main topics are treated in this thesis: the investigation of the optical properties of holographic metals and their description by the Drude-Sommerfeld model; an attempt to describe Homes' law in high-temperature superconductors holographically by computing various diffusion constants and the associated time scales; the mesonic spectrum at vanishing temperature; and finally holographic quantum states at finite densities. Crucial for the application of this framework to strongly correlated condensed matter systems is the renormalization group flow interpretation of the AdS_5/CFT_4 correspondence and the resulting emergent holographic duals, which lift most of the restrictions of the original theory. 
These so-called "bottom-up" approaches are particularly suited for applications to questions in condensed matter theory and linear response theory, by means of the holographic fluctuation-dissipation theorem. The main results of the present thesis comprise a detailed investigation of R-charge diffusion and momentum diffusion in holographic s- and p-wave superconductors, which are described by Einstein-Maxwell theory and Einstein-Yang-Mills theory respectively, and a deepened understanding of the universal properties of such systems. Secondly, the stability of cold holographic quantum states of matter was investigated, in the course of which an additional diffusion mode was discovered. This mode can be understood as a kind of "R-spin diffusion", resembling spin diffusion in systems with freely moving, itinerant electrons, where the decoupling of the spin-orbit interaction turns the spin symmetry into a global symmetry. The absence of instabilities and the existence of a zero sound mode, known from Fermi liquids, suggest a description of the cold holographic matter by an effective hydrodynamic theory.
Majumdar, A (University of Bath) Tuesday 09 April 2013, 10:00-11:00
Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03
In linear mixed models, the assumption of normally distributed random effects is often inappropriate and unnecessarily restrictive. The proposed approximate Dirichlet process mixture assumes a hierarchical Gaussian mixture that is based on the truncated version of the stick-breaking presentation of the Dirichlet process. In addition to weakening the distributional assumptions, the specification allows the identification of clusters of observations with a similar random effects structure. An Expectation-Maximization algorithm is given that solves the estimation problem and that, in certain respects, may exhibit advantages over Markov chain Monte Carlo approaches when modelling with Dirichlet processes. The method is evaluated in a simulation study and applied to the dynamics of unemployment in Germany as well as lung function growth data.
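A minimal sketch of the truncated stick-breaking construction the abstract refers to (illustrative only, not the paper's implementation): each weight is a fraction of the stick left over after the previous breaks, and truncation at level K assigns the last weight all remaining mass.

```python
# Truncated stick-breaking representation of a Dirichlet process:
# v_k ~ Beta(1, alpha), w_k = v_k * prod_{j<k} (1 - v_j), truncated at K.
import random

def truncated_stick_breaking(alpha, K, rng=random.Random(0)):
    weights, remaining = [], 1.0
    for k in range(K):
        # last stick takes all remaining mass so the weights sum to 1
        v = rng.betavariate(1.0, alpha) if k < K - 1 else 1.0
        weights.append(remaining * v)
        remaining *= (1.0 - v)
    return weights

w = truncated_stick_breaking(alpha=2.0, K=10)
```

Smaller alpha concentrates mass on the first few sticks (few clusters); larger alpha spreads it out, which is how the mixture's effective number of clusters is controlled.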
Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03
This short note contains an explicit proof of the Dirichlet distribution being the conjugate prior to the Multinomial sample distribution as resulting from the general construction method described, e.g., in Bernardo and Smith (2000). The well-known Dirichlet-Multinomial model is thus shown to fit into the framework of canonical conjugate analysis (Bernardo and Smith 2000, Prop. 5.6, p. 273), where the update step for the prior parameters to their posterior counterparts has an especially simple structure. This structure is used, e.g., in the Imprecise Dirichlet Model (IDM) by Walley (1996), a simple yet powerful model for imprecise Bayesian inference using sets of Dirichlet priors to model vague prior knowledge, and furthermore in other imprecise probability models for inference in exponential families where sets of priors are considered.
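The especially simple update structure mentioned above is just parameter addition: a Dirichlet(alpha) prior combined with multinomial counts n yields a Dirichlet(alpha + n) posterior. A tiny sketch:

```python
# Conjugate update for the Dirichlet-Multinomial model:
# Dirichlet(alpha) prior + multinomial counts n -> Dirichlet(alpha + n).
def dirichlet_posterior(alpha, counts):
    assert len(alpha) == len(counts)
    return [a + n for a, n in zip(alpha, counts)]

# e.g. a uniform Dirichlet(1, 1, 1) prior updated with counts (3, 0, 7):
post = dirichlet_posterior([1, 1, 1], [3, 0, 7])  # -> [4, 1, 8]
```

In the IDM, the same update is applied to every prior in a set of Dirichlet priors, which is what makes inference with sets of priors tractable.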
Mathematics and Applications of Branes in String and M-theory
Rangamani, M (University of Durham) Wednesday 09 May 2012, 16:00-17:00
eBusiness managers have a lot to decide. New developments and strategies must be weighed daily: should one keep investing in SEO, or can CRO (conversion rate optimization) achieve a higher effect in the medium term? In this Conversion Clinic we discuss the success factors of both disciplines and how they interlock strategically to make online shops more effective and more profitable.
Bobylev, A (Karlstad) Tuesday 28 September 2010, 15:00-15:45
Bas and Roy welcome back the SEO Head of MediaVest, Pete Young. They discuss the recent voting changes at Sphinn, as well as a recent Rand Fishkin post at SEOmoz entitled "Latent Dirichlet Allocation (LDA) and Google's Rankings are Remarkably Well Correlated".
Pavlov, B (Auckland) Friday 30 July 2010, 14:00-14:45
Carlson, R (Colorado) Thursday 29 July 2010, 14:45-15:30
Mathematics and Physics of Anderson Localization: 50 Years After
Goldstein, M (Toronto) Thursday 18 December 2008, 16:30-17:30 Classical and Quantum Transport in the Presence of Disorder
Solomyak, M (Weizmann Institute of Science) Thursday 12 April 2007, 15:30-16:30 Graph Models of Mesoscopic Systems, Wave-Guides and Nano-Structures
Grieser, D (Carl von Ossietzky, Oldenburg) Thursday 12 April 2007, 14:00-15:00 Graph Models of Mesoscopic Systems, Wave-Guides and Nano-Structures
Fakultät für Mathematik, Informatik und Statistik - Digitale Hochschulschriften der LMU - Teil 01/02
Statistical relational learning analyzes the probabilistic constraints between entities, their attributes, and relationships. It represents an area of growing interest in modern data mining. Many leading approaches have been proposed with promising results. However, there is no easily applicable recipe for how to turn a relational domain (e.g. a database) into a probabilistic model. There are mainly two reasons. First, structural learning in relational models is even more complex than structural learning in (non-relational) Bayesian networks due to the exponentially many attributes an attribute might depend on. Second, it might be difficult and expensive to obtain reliable prior knowledge for the domains of interest. To remove these constraints, this thesis applies nonparametric Bayesian analysis to relational learning and proposes two compelling models: Dirichlet enhanced relational learning and infinite hidden relational learning. Dirichlet enhanced relational learning (DERL) extends nonparametric hierarchical Bayesian modeling to relational data. In existing relational models, the model parameters are global, which means the conditional probability distributions are the same for each entity and the relationships are independent of each other. To overcome these limitations, we introduce a hierarchical Bayesian (HB) framework to relational learning, such that model parameters can be personalized, i.e. owned by entities or relationships, and are coupled via common prior distributions. Additional flexibility is introduced in nonparametric HB modeling, such that the learned knowledge can be truthfully represented. For inference, we develop an efficient variational method, which is motivated by the Polya urn representation of the Dirichlet process (DP). DERL is demonstrated in a medical domain where we form a nonparametric HB model for entities involving hospitals, patients, procedures and diagnoses. 
The experiments show that the additional flexibility introduced by the nonparametric HB modeling results in a more accurate model to represent the dependencies between different types of relationships and gives significantly improved prediction performance for unknown relationships. In the infinite hidden relational model (IHRM), we apply nonparametric mixture modeling to relational data, which extends the expressiveness of a relational model by introducing for each entity an infinite-dimensional hidden variable as part of a Dirichlet process (DP) mixture model. There are mainly three advantages. First, this reduces the extensive structural learning, which is particularly difficult in relational models due to the huge number of potential probabilistic parents. Second, information can propagate globally in the ground network defined by the relational structure. Third, the number of mixture components for each entity class can be optimized by the model itself based on the data. IHRM can be applied for entity clustering and relationship/attribute prediction, which are two important tasks in relational data mining. For inference in IHRM, we develop four algorithms: collapsed Gibbs sampling with the Chinese restaurant process, blocked Gibbs sampling with the truncated stick breaking construction (SBC), mean-field inference with truncated SBC, and an empirical approximation. IHRM is evaluated in three different domains: a recommendation system based on the MovieLens data set, prediction of the functions of yeast genes/proteins on the data set of KDD Cup 2001, and medical data analysis. The experimental results show that IHRM gives significantly improved estimates of attributes/relationships and highly interpretable entity clusters in complex relational data.
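The Chinese restaurant process used in the collapsed Gibbs sampler can be sketched in a few lines (a generic illustration of the process, not the thesis' sampler): each new customer joins an existing table with probability proportional to its occupancy, or opens a new table with probability proportional to alpha.

```python
# Chinese restaurant process: a sequential draw from the clustering
# induced by a Dirichlet process with concentration parameter alpha.
import random

def crp(n_customers, alpha, rng=random.Random(42)):
    tables = []                          # occupancy count per table
    for _ in range(n_customers):
        total = sum(tables) + alpha
        r = rng.uniform(0, total)
        acc = 0.0
        for k, n_k in enumerate(tables):
            acc += n_k                   # join table k w.p. n_k / total
            if r < acc:
                tables[k] += 1
                break
        else:
            tables.append(1)             # open a new table w.p. alpha / total
    return tables

seating = crp(100, alpha=1.0)
```

The "rich get richer" dynamic is what lets the number of clusters grow with the data instead of being fixed in advance, which is the third advantage listed above.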
Fakultät für Mathematik, Informatik und Statistik - Digitale Hochschulschriften der LMU - Teil 01/02
Probabilistic modeling for data mining and machine learning problems is a fundamental research area. The general approach is to assume a generative model underlying the observed data and to estimate model parameters via likelihood maximization. It has deep probability theory as its mathematical background and enjoys a wealth of methods from statistical learning, sampling theory and Bayesian statistics. In this thesis we study several advanced probabilistic models for data clustering and feature projection, two important unsupervised learning problems. The goal of clustering is to group similar data points together to uncover the data clusters. While numerous methods exist for various clustering tasks, one important question still remains, i.e., how to automatically determine the number of clusters. The first part of the thesis answers this question from a mixture modeling perspective. A finite mixture model is first introduced for clustering, in which each mixture component is assumed to be an exponential family distribution for generality. The model is then extended to an infinite mixture model, and its strong connection to the Dirichlet process (DP), a non-parametric Bayesian framework, is uncovered. A variational Bayesian algorithm called VBDMA is derived from this new insight to learn the number of clusters automatically, and empirical studies on some 2D data sets and an image data set verify the effectiveness of this algorithm. In feature projection, we are interested in dimensionality reduction and aim to find a low-dimensional feature representation for the data. We first review the well-known principal component analysis (PCA) and its probabilistic interpretation (PPCA), and then generalize PPCA to a novel probabilistic model which is able to handle the non-linear projection known as kernel PCA. An expectation-maximization (EM) algorithm is derived for kernel PCA such that it is fast and applicable to large data sets. 
We then propose a novel supervised projection method called MORP, which takes the output information into account in a supervised learning context. Empirical studies on various data sets show much better results compared to unsupervised projection and other supervised projection methods. Finally, we generalize MORP probabilistically to obtain SPPCA for supervised projection, and naturally extend the model to S2PPCA, a semi-supervised projection method. This allows us to incorporate both the label information and the unlabeled data into the projection process. In the third part of the thesis, we introduce a unified probabilistic model that handles data clustering and feature projection jointly. The model can be viewed both as a clustering model with projected features and as a projection model with structured documents. A variational Bayesian learning algorithm is derived, which turns out to iterate between clustering and projection operations until convergence. Superior performance is obtained for both clustering and projection.
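The connection between infinite mixture models and the Dirichlet process mentioned in the abstract can be illustrated with the stick-breaking construction. The sketch below is only an illustration of that construction, not the thesis's VBDMA algorithm; the concentration `alpha` and the truncation level are chosen purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, num_sticks):
    """Draw truncated mixture weights from a stick-breaking process.

    A Dirichlet process with concentration `alpha` can be represented by
    repeatedly breaking a unit-length stick: beta_k ~ Beta(1, alpha), and
    the k-th weight is beta_k times the length of the remaining stick.
    """
    betas = rng.beta(1.0, alpha, size=num_sticks)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

weights = stick_breaking_weights(alpha=2.0, num_sticks=50)
# Larger alpha spreads mass over more components; the truncated weights
# sum to just under 1, with the remainder left on the unbroken stick.
print(weights[:5], weights.sum())
```

In an infinite mixture model these weights become the mixing proportions, which is why the number of effectively occupied clusters can be inferred from data rather than fixed in advance.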
Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03
Nonparametric Predictive Inference (NPI) is a general methodology to learn from data in the absence of prior knowledge and without adding unjustified assumptions. This paper develops NPI for multinomial data where the total number of possible categories for the data is known. We present the general upper and lower probabilities and several of their properties. We also comment on differences between this NPI approach and corresponding inferences based on Walley's Imprecise Dirichlet Model.
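Walley's Imprecise Dirichlet Model referenced here has simple closed-form bounds: with observed count n_j for category j out of n observations and prior strength s, the lower and upper probabilities are n_j/(n+s) and (n_j+s)/(n+s). A minimal sketch, with the counts and s = 2 chosen purely as illustration:

```python
def idm_probability_bounds(counts, s=2.0):
    """Lower and upper probabilities for each category under Walley's
    Imprecise Dirichlet Model with prior strength s (s = 1 and s = 2
    are the common choices)."""
    n = sum(counts)
    lower = [c / (n + s) for c in counts]
    upper = [(c + s) / (n + s) for c in counts]
    return lower, upper

# Example: 10 observations spread over three categories.
lo, hi = idm_probability_bounds([6, 3, 1], s=2.0)
# Each interval [lo_j, hi_j] contains the empirical frequency n_j / n,
# and the intervals narrow as the sample size n grows.
```

The width of every interval is s/(n+s), so imprecision vanishes as data accumulate, mirroring the way NPI's upper and lower probabilities tighten with more observations.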
Fakultät für Mathematik, Informatik und Statistik - Digitale Hochschulschriften der LMU - Teil 01/02
Enabling computer systems to understand human thinking or behavior has long been an exciting challenge to computer scientists. In recent years one such topic, information filtering, has emerged to help users find desired information items (e.g., movies, books, news) from large amounts of available data, and has become crucial in many applications such as product recommendation, image retrieval, spam email filtering, news filtering, and web navigation. An information filtering system must be able to understand users' information needs. Existing approaches either infer a user's profile by exploring his/her connections to other users, i.e., collaborative filtering (CF), or analyze the content descriptions of liked or disliked examples annotated by the user, i.e., content-based filtering (CBF). These methods work well to some extent, but face difficulties due to a lack of insight into the problem. This thesis intensively studies a wide scope of information filtering technologies. Novel and principled machine learning methods are proposed to model users' information needs. The work demonstrates that the uncertainty of user profiles and the connections between them can be effectively modelled using probability theory and Bayes' rule. As one major contribution of this thesis, the work clarifies the "structure" of information filtering and gives rise to principled solutions. In summary, the work of this thesis mainly covers the following three aspects. Collaborative filtering: we develop a probabilistic model for memory-based collaborative filtering (PMCF), which has clear links with classical memory-based CF. Various heuristics to improve memory-based CF have been proposed in the literature; in contrast, extensions based on PMCF can be made in a principled probabilistic way.
With PMCF, we describe a CF paradigm that, instead of passively receiving data from users as in conventional CF, interacts with users and actively chooses the most informative patterns to learn, thereby greatly reducing user effort and computational cost. Content-based filtering: one major problem for CBF is the deficiency and high dimensionality of content-descriptive features. Information items (e.g., images or articles) are typically described by high-dimensional features with mixed types of attributes that seem to have been developed independently but are intrinsically related. We derive a generalized principal component analysis to merge high-dimensional and heterogeneous content features into a low-dimensional continuous latent space. The derived features bring great convenience to CBF, because most existing algorithms easily cope with low-dimensional and continuous data, and, more importantly, the extracted features highlight the intrinsic semantics of the original content features. Hybrid filtering: how to combine CF and CBF in a "smart" way remains one of the most challenging problems in information filtering, and little principled work exists so far. This thesis reveals that people's information needs can be naturally modelled with hierarchical Bayesian thinking, where each individual's data are generated based on his/her own profile model, which is itself a sample from a common distribution over the population of user profiles. Users are thus connected to each other via this common distribution. Because such a distribution is complex in real-world applications, commonly applied parametric models are too restrictive, and we therefore introduce a nonparametric hierarchical Bayesian model based on the Dirichlet process. We derive effective and efficient algorithms to learn the described model.
In particular, the resulting hybrid filtering methods are surprisingly simple and intuitively understandable, offering clear insight into previous work on pure CF, pure CBF, and hybrid filtering.
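The classical memory-based CF that PMCF links to can be sketched concretely. The code below is the textbook Pearson-correlation neighborhood scheme, not the probabilistic PMCF model itself, and the small rating matrix is invented for illustration (NaN marks an unobserved rating).

```python
import numpy as np

def pearson_sim(u, v):
    """Pearson correlation over the items both users have rated."""
    mask = ~np.isnan(u) & ~np.isnan(v)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask] - u[mask].mean(), v[mask] - v[mask].mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float(a @ b / denom) if denom > 0 else 0.0

def predict(ratings, user, item):
    """Predict a missing rating as the active user's mean plus a
    similarity-weighted average of the neighbors' deviations from
    their own means on the target item."""
    num = den = 0.0
    user_mean = np.nanmean(ratings[user])
    for other in range(ratings.shape[0]):
        if other == user or np.isnan(ratings[other, item]):
            continue
        w = pearson_sim(ratings[user], ratings[other])
        num += w * (ratings[other, item] - np.nanmean(ratings[other]))
        den += abs(w)
    return user_mean + num / den if den > 0 else user_mean

R = np.array([[5.0, 4.0, np.nan],
              [4.0, 4.0, 3.0],
              [1.0, 2.0, 5.0]])
print(round(predict(R, user=0, item=2), 2))  # → 2.17
```

Heuristic choices here (the similarity measure, the deviation weighting) are exactly the kind of ad hoc ingredients that a probabilistic formulation such as PMCF replaces with principled modeling assumptions.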
Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03
On the basis of integral representations, we propose fast numerical methods to solve the Cauchy problem for the stochastic wave equation without boundaries and with Dirichlet boundary conditions. The algorithms are exact in a probabilistic sense.