Sarah Minson, U.S. Geological Survey

There are many underdetermined geophysical inverse problems. For example, when we try to infer earthquake fault slip, we find that there are many potential slip models consistent with our observations and our understanding of earthquake physics. One way to approach these problems is to use Bayesian analysis to infer the ensemble of all potential models that satisfy the observations and our prior knowledge. In Bayesian analysis, our prior knowledge is known as the prior probability density function (prior PDF), the fit to the data is the data likelihood function, and the target PDF that satisfies both the prior PDF and data likelihood function is the posterior PDF. Simulating a posterior PDF can be computationally expensive. Typical earthquake rupture models with 10 km spatial resolution can require Markov Chain Monte Carlo (MCMC) sampling to draw tens of billions of random realizations of fault slip. And now new technological advancements like LiDAR provide enormous numbers of laser point returns that image surface deformation at submeter scale, dramatically increasing computational cost. How can we make MCMC sampling efficient enough to simulate fault slip distributions at submeter scale using "Big Data"? We present a new MCMC approach called cross-fading in which we transition from an analytical posterior PDF (obtained from a prior that is conjugate to the data likelihood function) to the desired target posterior PDF by bringing in our physical constraints and removing the conjugate prior. This approach has two key efficiencies. First, the starting PDF is by construction "close" to the target posterior PDF, so very little MCMC is required to update the samples to match the target. Second, all PDFs are defined in model space, not data space: the forward model and data misfit are never evaluated during sampling, allowing models to be fit to Big Data with essentially no additional computational cost.
It is even possible, without additional computational cost, to incorporate model prediction errors for Big Data, that is, to quantify the effects on data prediction of uncertainties in the model design. While we present earthquake models, this approach is flexible and can be applied to many geophysical problems.
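The cross-fading idea can be sketched in one dimension: start from exact draws from an analytical Gaussian posterior, interpolate the log-density toward a target that adds a physical prior, and update the samples with a little Metropolis MCMC at each stage. This is a minimal sketch with all densities and parameter values assumed for illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Starting PDF: an analytical Gaussian posterior N(1, 1), standing in for
# the conjugate-prior posterior that can be sampled exactly.
def log_p_start(x):
    return -0.5 * (x - 1.0) ** 2

# Physical prior to fade in: N(0, 0.5^2), a stand-in for a constraint
# such as smoothness of fault slip (values are assumptions).
def log_prior(x):
    return -0.5 * (x / 0.5) ** 2

def log_p_target(x):
    return log_p_start(x) + log_prior(x)

def crossfade_mcmc(n_samples=2000, n_stages=10, n_steps=20, step=0.5):
    """Fade from log_p_start to log_p_target; consecutive interpolated
    PDFs are close, so each stage needs only a little Metropolis MCMC.
    Note that no forward model or data misfit is evaluated here."""
    x = rng.normal(1.0, 1.0, size=n_samples)   # exact draws from start PDF
    for beta in np.linspace(0.0, 1.0, n_stages + 1)[1:]:
        def log_p(v):
            return (1 - beta) * log_p_start(v) + beta * log_p_target(v)
        for _ in range(n_steps):               # vectorised Metropolis updates
            prop = x + step * rng.normal(size=x.shape)
            accept = np.log(rng.uniform(size=x.shape)) < log_p(prop) - log_p(x)
            x = np.where(accept, prop, x)
    return x

samples = crossfade_mcmc()
# The exact target here is the Gaussian product N(mean=0.2, var=1/5)
print(round(float(samples.mean()), 2), round(float(samples.std()), 2))
```

Because each stage only nudges the density, a handful of Metropolis steps per stage suffices, which is the efficiency the abstract describes.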
CLASS Survey Description: Coronal Line Needles in the SDSS Haystack by Michael Reefe et al. on Wednesday 23 November Coronal lines are a powerful, yet poorly understood, tool to identify and characterize Active Galactic Nuclei (AGNs). There have been few large-scale surveys of coronal lines in the general galaxy population in the literature so far. Using a novel pre-selection technique with a flux-to-RMS ratio $F$, followed by Markov-Chain Monte Carlo (MCMC) fitting, we searched for the full suite of 20 coronal lines in the optical spectra of almost 1 million galaxies from the Sloan Digital Sky Survey (SDSS) Data Release 8. We present a catalog of the emission line parameters for the resulting 258 galaxies with detections. The Coronal Line Activity Spectroscopic Survey (CLASS) includes line properties, host galaxy properties, and selection criteria for all galaxies in which at least one line is detected. This comprehensive study reveals that a significant fraction of coronal line activity is missed in past surveys based on a more limited set of coronal lines; $\sim$60% of our sample do not display the more widely surveyed [Fe X] $\lambda$6374. In addition, we discover a strong correlation between coronal line and WISE W2 luminosities, suggesting that the mid-infrared flux can be used to predict coronal line fluxes. For each line we also provide a confidence level that the line is present, generated by a novel neural network trained on fully simulated data. We find that after training the network to detect individual lines using 100,000 simulated spectra, we achieve an overall true positive rate of 75.49% and a false positive rate of only 3.96%. arXiv: http://arxiv.org/abs/2211.11882v1
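A pre-selection statistic of this kind can be illustrated on a toy spectrum: compare the continuum-subtracted peak flux in a window around the expected line to the RMS of the neighbouring continuum. The function below is only our guess at the spirit of the paper's $F$; the window sizes, names, and exact form are assumptions:

```python
import numpy as np

def flux_to_rms(wave, flux, line_wave, half_width=5.0, cont_width=30.0):
    """Excess flux near the expected line, divided by the RMS of the
    neighbouring continuum (an assumed, illustrative definition)."""
    d = np.abs(wave - line_wave)
    in_line = d < half_width                      # line window
    in_cont = (d >= half_width) & (d < cont_width)  # surrounding continuum
    cont_level = np.median(flux[in_cont])
    rms = np.std(flux[in_cont])
    return (flux[in_line] - cont_level).max() / rms

# Toy spectrum: flat continuum + noise + a Gaussian line at [Fe X] 6374 A
rng = np.random.default_rng(1)
wave = np.linspace(6300.0, 6450.0, 600)
flux = 1.0 + 0.05 * rng.normal(size=wave.size)
flux += 0.6 * np.exp(-0.5 * ((wave - 6374.0) / 2.0) ** 2)

F = flux_to_rms(wave, flux, 6374.0)
print(F > 5.0)   # the injected line comfortably clears a threshold
```

A cheap statistic like this lets a survey skip full MCMC fitting for the vast majority of spectra with no plausible line.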
Can Molecular Ratios be used as Diagnostics of AGN and Starburst activity? The Case of NGC 1068 by J. Butterworth et al. on Tuesday 13 September Molecular line ratios, such as HCN(1-0)/HCO$^+$(1-0) and HCN(4-3)/CS(7-6), are routinely used to identify AGN activity in galaxies. Such ratios are, however, hard to interpret as they are highly dependent on the physics and energetics of the gas and hence can seldom be used as a unique, unambiguous diagnostic. We use the composite galaxy NGC 1068 as a "laboratory" to investigate whether molecular line ratios between HCN, HCO$^+$ and CS are useful tracers of AGN-dominated gas, and to determine the origin of the differences in such ratios across different types of gas. Such a determination will allow a more rigorous use of these ratios. We first empirically examine the aforementioned ratios at different angular resolutions to quantify correlations. We then use LTE and non-LTE analyses coupled with Markov Chain Monte Carlo (MCMC) sampling in order to determine the origin of the underlying differences in ratios. We propose that at high spatial resolution (< 50 pc) the HCN(4-3)/CS(2-1) ratio is a reliable tracer of AGN activity. Finally, we find that the variations in ratios are not a consequence of different densities or temperatures but of different fractional abundances, yielding the important result that it is essential to consider which chemical processes are at play when drawing conclusions from radiative transfer calculations. Upon analysis at varying spatial scales, previously proposed ratios as well as a new molecular line ratio show varying levels of consistency. arXiv: http://arxiv.org/abs/2209.05928v1
Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.09.03.281162v1?rss=1 Authors: Invernizzi, A., Haak, K. V., Carvalho, J., Renken, R., Cornelissen, F. Abstract: The majority of neurons in the human brain process signals from neurons elsewhere in the brain. Connective Field (CF) modeling is a biologically grounded method to describe this essential aspect of the brain's circuitry. It allows characterizing the response of a population of neurons in terms of the activity in another part of the brain. CF modeling translates the concept of the receptive field (RF) into the domain of connectivity by assessing the spatial dependency between signals in distinct cortical visual field areas. Standard CF model estimation has an intrinsic limitation in that it cannot estimate the uncertainty associated with each of its parameters. Obtaining the uncertainty allows identification of model biases, e.g. related to over- or under-fitting or a co-dependence of parameters, thereby improving the CF prediction. To enable this, here we present a Bayesian framework for the CF model. Using a Markov Chain Monte Carlo (MCMC) approach, we estimate the underlying posterior distribution of the CF parameters and, consequently, quantify the uncertainty associated with each estimate. We applied the method and its new Bayesian features to characterize the cortical circuitry of the early human visual cortex of 12 healthy participants that were assessed using 3T fMRI. In addition, we show how the MCMC approach enables the use of effect size (beta) as a data-driven parameter to retain relevant voxels for further analysis. Finally, we demonstrate how our new method can be used to compare different CF models. Our results show that single Gaussian models are favoured over difference-of-Gaussians (i.e. center-surround) models, suggesting that the cortico-cortical connections of the early visual system do not possess center-surround organisation.
We conclude that our new Bayesian CF framework provides a comprehensive tool to improve our fundamental understanding of the human cortical circuitry in health and disease.
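The Bayesian CF idea can be sketched on a toy 1-D "source area": a target voxel's signal is a Gaussian-weighted sum of source signals, and Metropolis MCMC then yields a posterior over the CF centre and size rather than a single point estimate. All signals and parameter values below are synthetic assumptions, not the paper's fMRI pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy source area: random signals at 1-D cortical positions; the target
# voxel's signal is a Gaussian-weighted sum of them (the connective field).
pos = np.linspace(-5.0, 5.0, 41)
src = rng.normal(size=(200, pos.size))          # time x source voxels
true_center, true_size = 1.0, 0.8               # assumed ground truth

def cf_predict(center, size):
    w = np.exp(-0.5 * ((pos - center) / size) ** 2)
    return src @ (w / w.sum())                  # normalised Gaussian CF

target = cf_predict(true_center, true_size) + 0.05 * rng.normal(size=200)

def log_post(center, size, noise=0.05):
    if size <= 0:
        return -np.inf
    resid = target - cf_predict(center, size)
    return -0.5 * np.sum((resid / noise) ** 2)  # flat priors assumed

# Metropolis random-walk sampling of the posterior over (center, size)
theta = np.array([0.0, 1.5])
lp = log_post(*theta)
chain = []
for _ in range(6000):
    prop = theta + 0.05 * rng.normal(size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[3000:])                  # discard burn-in
print(np.round(chain.mean(axis=0), 1))          # posterior mean (center, size)
```

The spread of the chain around its mean is exactly the per-parameter uncertainty that standard point-estimate CF fitting cannot provide.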
Speed run of Anomaly Detection, Recommenders (Content Filtering vs. Collaborative Filtering), and Markov Chain Monte Carlo (MCMC). ocdevel.com/mlg/14 for notes and resources
Fakultät für Mathematik, Informatik und Statistik - Digitale Hochschulschriften der LMU - Teil 01/02
The purpose of brain mapping is to advance the understanding of the relationship between structure and function in the human brain. Several techniques---with different advantages and disadvantages---exist for recording neural activity. Functional magnetic resonance imaging (fMRI) has a high spatial resolution but low temporal resolution. It also suffers from a low signal-to-noise ratio in event-related experimental designs, which are commonly used to investigate neuronal brain activity. On the other hand, the high temporal resolution of electroencephalography (EEG) recordings allows evoked event-related potentials to be captured. Though 3D maps derived by EEG source reconstruction methods have low spatial resolution, they provide complementary information about the location of neuronal activity. There is a strong interest in combining data from both modalities to gain a deeper knowledge of brain functioning through advanced statistical modeling. In this thesis, a new Bayesian method is proposed for enhancing fMRI activation detection through the use of EEG-based spatial prior information in stimulus-based experimental paradigms. This method builds upon a newly developed method for fMRI-only activation detection. In general, activation detection corresponds to stimulus predictor components having an effect on the fMRI signal trajectory in a voxelwise linear model. We model and analyze stimulus influence by a spatial Bayesian variable selection scheme, and extend existing high-dimensional regression methods by incorporating prior information on binary selection indicators via a latent probit regression. For fMRI-only activation detection, the predictor consists of a spatially varying intercept only. For EEG-enhanced schemes, an EEG effect is added, which is chosen to be either spatially varying or constant. Spatially varying effects are regularized by different Markov random field priors.
Statistical inference in the resulting high-dimensional hierarchical models becomes rather challenging, both from a modeling perspective and with regard to numerical issues. In this thesis, inference is based on a Markov Chain Monte Carlo (MCMC) approach relying on global updates of effect maps. Additionally, a faster algorithm is developed based on single-site updates to circumvent the computationally intensive, high-dimensional, sparse Cholesky decompositions. The proposed algorithms are examined in both simulation studies and real-world applications. Performance is evaluated in terms of convergence properties, the ability to produce interpretable results, and the sensitivity and specificity of corresponding activation classification rules. The main question is whether the use of EEG information can increase the power of fMRI models to detect activated voxels. In summary, the new algorithms show a substantial increase in sensitivity compared to existing fMRI activation detection methods like classical SPM. Carefully selected EEG prior information additionally increases sensitivity in activation regions that have been distorted by a low signal-to-noise ratio.
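The flavour of single-site updates can be shown on a 1-D toy problem: binary activation indicators with an Ising-type Markov random field prior and one Gaussian summary statistic per voxel. This is only a sketch of the idea under assumed parameter values, not the thesis' hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D "brain": 100 voxels, voxels 35-64 truly active with effect b.
n, b = 100, 1.5
truth = np.zeros(n, dtype=int)
truth[35:65] = 1
y = b * truth + rng.normal(size=n)    # one summary statistic per voxel

def gibbs_activation(y, n_iter=2000, kappa=1.0):
    """Single-site Gibbs updates for activation indicators g: posterior
    log-odds = data evidence + Ising-prior pull from the two neighbours."""
    g = np.zeros(n, dtype=int)
    counts = np.zeros(n)
    for it in range(n_iter):
        for i in range(n):
            nb_on = (g[i - 1] if i > 0 else 0) + (g[i + 1] if i < n - 1 else 0)
            nb_all = (i > 0) + (i < n - 1)
            log_odds = (b * y[i] - 0.5 * b ** 2          # N(b,1) vs N(0,1)
                        + kappa * (2 * nb_on - nb_all))  # agreeing neighbours
            g[i] = int(rng.uniform() < 1.0 / (1.0 + np.exp(-log_odds)))
        if it >= n_iter // 2:
            counts += g                # accumulate after burn-in
    return counts / (n_iter - n_iter // 2)

post = gibbs_activation(y)             # posterior activation probabilities
print(post[40:60].mean() > 0.6, post[:20].mean() < 0.3)
```

Each voxel update touches only its two neighbours, which is why single-site schemes avoid the large sparse Cholesky factorizations needed for global updates.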
Many areas of applied statistics have become aware of the problem of measurement-error-prone variables and their appropriate analysis. Simply ignoring the error in the analysis usually leads to biased estimates, as, for example, in regression with error-prone covariates. While this problem has been discussed at length for parametric regression, only a few methods exist to handle nonparametric regression under error, and these are usually either computationally intensive or not very effective. This thesis develops new methods achieving the correction quality of state-of-the-art methods while demanding only a fraction of their computing time. These new methods use the so-called relevance vector machine (RVM) for nonparametric regression, now enhanced by correction methods based on the ideas of regression calibration, the so-called SIMEX, and Markov Chain Monte Carlo (MCMC) correction. All methods are compared in simulation studies regarding Gaussian, binary and Poisson responses. This thesis also discusses the case of multiple error-prone covariates. Furthermore, an MCMC-based correction method for nonparametric regression of binary longitudinal data with covariate measurement error is introduced. This data scenario is often encountered, e.g. in epidemiological applications.
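The SIMEX idea referenced above is easy to sketch: deliberately add extra measurement error at increasing levels, watch the naive estimate degrade, and extrapolate the trend back to the error-free level λ = -1. A minimal simulation with assumed parameter values (a linear slope rather than the thesis' RVM regression):

```python
import numpy as np

rng = np.random.default_rng(4)

# True model: y = beta * x + noise, but only w = x + u is observed.
n, beta, sig_u = 5000, 2.0, 0.7
x = rng.normal(size=n)
y = beta * x + 0.3 * rng.normal(size=n)
w = x + sig_u * rng.normal(size=n)

def slope(a, b):
    return np.cov(a, b)[0, 1] / np.var(a, ddof=1)

naive = slope(w, y)        # attenuated towards 0 by the measurement error

# SIMEX: add extra error at levels lam, average over simulations,
# then extrapolate the trend back to the error-free level lam = -1.
lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
est = [np.mean([slope(w + np.sqrt(lam) * sig_u * rng.normal(size=n), y)
                for _ in range(50)])
       for lam in lams]
coef = np.polyfit(lams, est, 2)      # quadratic extrapolant
simex = np.polyval(coef, -1.0)
print(naive < simex)                  # the correction moves towards beta = 2
```

The quadratic extrapolant recovers most, though not all, of the attenuation bias, which matches SIMEX's reputation as an approximate but cheap correction.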
Mathematik, Informatik und Statistik - Open Access LMU - Teil 02/03
In this paper, models for claim frequency and claim size in non-life insurance are considered. Both covariates and spatial random effects are included, allowing the modelling of a spatial dependency pattern. We assume a Poisson model for the number of claims, while claim size is modelled using a Gamma distribution. However, in contrast to the usual compound Poisson model going back to Lundberg (1903), we allow for dependencies between claim size and claim frequency. Models for both the individual and the average claim sizes of a policyholder are considered. A fully Bayesian approach is followed; parameters are estimated using Markov Chain Monte Carlo (MCMC). The issue of model comparison is thoroughly addressed. Besides the deviance information criterion suggested by Spiegelhalter et al. (2002), the predictive model choice criterion (Gelfand and Ghosh (1998)) and proper scoring rules (Gneiting and Raftery (2005)) based on the posterior predictive distribution are investigated. We give an application to a comprehensive data set from a German car insurance company. The inclusion of spatial effects significantly improves the models for both claim frequency and claim size and also leads to more accurate predictions of the total claim sizes. Further, we quantify the significant effect of the number of claims on claim size.
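The departure from the classical compound Poisson assumption can be sketched by letting the expected average claim size depend on the claim count. The functional form and all numbers below are assumptions for illustration, not estimates from the paper's insurance data:

```python
import numpy as np

rng = np.random.default_rng(8)

# Assumed dependency: expected average claim size shrinks as the claim
# count N grows, breaking the compound Poisson independence assumption.
n_policy = 50000
N = rng.poisson(1.0, size=n_policy)              # claim frequency
has_claim = N > 0
mu = 1000.0 * np.exp(-0.3 * (N[has_claim] - 1))  # mean average claim size
avg_size = rng.gamma(2.0, mu / 2.0)              # Gamma with mean mu

# The dependency appears as a negative frequency-size correlation
corr = np.corrcoef(N[has_claim], avg_size)[0, 1]
print(corr < -0.1)
```

Under independence this correlation would be zero, so a check like this shows why total-claim predictions improve when the dependency is modelled.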
In market microstructure theory, the effect of the time between consecutive transactions and of trade volume on transaction price changes of exchange-traded shares and options has been considered (e.g. Diamond and Verrecchia (1987) and Easley and O'Hara (1987)). The goal of this paper is to investigate whether these theoretical considerations can be supported by a statistical analysis of data on transaction price changes of options on shares of the Bayer AG in 1993-94. For this purpose, appropriate regression models with non-linear and interaction effects are developed to study the influence of trade volume, time between trades, intrinsic value of an option at trading time and price development of the underlying share on the absolute transaction price change of an option. Since price changes measured in ticks yield a count data structure, we use in a first analysis ordinary Poisson generalized linear models (GLMs), ignoring the time series structure of the data. In a second analysis, these Poisson GLMs are extended to allow for an additional AR(1) latent process in the mean which accounts for the time series structure. Parameter estimation in this extended model is not straightforward, and we use Markov Chain Monte Carlo (MCMC) methods. The extended Poisson GLM is compared to the ordinary Poisson GLM in a Bayesian setting using the deviance information criterion (DIC) developed by Spiegelhalter et al. (2002). With regard to market microstructure theory, the results of the analysis support the expected effect of the time between trades on absolute option price changes, but not that of trade volume, in this data set.
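The extension can be sketched by simulation: a Poisson GLM whose log-mean carries a latent AR(1) process. Relative to an ordinary Poisson GLM with constant mean, the latent process induces overdispersion and serial dependence (the parameter values are assumptions, not estimates from the option data):

```python
import numpy as np

rng = np.random.default_rng(5)

# Poisson GLM with a latent AR(1) process eps in the log-mean
# (illustrative parameter values).
T, beta0, phi, tau = 3000, 0.5, 0.8, 0.4
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = phi * eps[t - 1] + tau * rng.normal()
lam = np.exp(beta0 + eps)    # time-varying mean of tick-size price changes
y = rng.poisson(lam)

# The latent process induces overdispersion (var > mean) and serial
# dependence, which an ordinary Poisson GLM with constant mean cannot match.
print(y.var() > y.mean())
```

MCMC is needed for estimation precisely because the latent path eps must be integrated out; it is not observed and enters the likelihood nonlinearly.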
We describe a stochastic model based on a branching process for analyzing surveillance data of infectious diseases that allows forecasts of the future development of an epidemic. The model is based on a Poisson branching process with immigration, with an additional adjustment for possible overdispersion. An extension to a space-time model for the multivariate case is described. The model is estimated in a Bayesian context using Markov Chain Monte Carlo (MCMC) techniques. We illustrate the applicability of the model through analyses of simulated and real data.
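The core of such a model can be sketched directly: each current case generates new cases at a reproduction-like rate lam, plus a Poisson number of "imported" cases nu per time unit (the parameter values below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_outbreak(T=200, lam=0.6, nu=2.0):
    """Poisson branching with immigration: each of the y[t-1] current cases
    produces new cases at rate lam, plus nu imported cases per time unit."""
    y = np.zeros(T, dtype=int)
    for t in range(1, T):
        y[t] = rng.poisson(lam * y[t - 1] + nu)
    return y

y = simulate_outbreak()
# With lam < 1 the case counts fluctuate around the stationary mean
# nu / (1 - lam); with lam > 1 the epidemic would grow without bound.
print(round(float(y[50:].mean()), 1))
```

Forecasting then amounts to iterating this recursion forward from the last observed count, with lam and nu drawn from their MCMC posterior.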
This work is motivated by a mobility study conducted in the city of Munich, Germany. The variable of interest is a binary response, which indicates whether public transport has been utilized or not. One of the central questions is to identify areas of low/high utilization of public transport after adjusting for explanatory factors such as trip, individual and household attributes. The goal is to develop flexible statistical models for a binary response with covariate, spatial and cluster effects. One approach for modeling spatial effects is Markov Random Fields (MRF). A modification of a class of MRF models with proper joint distributions, introduced by Pettitt et al. (2002), is developed. This modification has the desirable property of containing the intrinsic MRF in the limit while still allowing for efficient spatial parameter updates in Markov Chain Monte Carlo (MCMC) algorithms. In addition to spatial effects, cluster effects are taken into consideration. Group and individual approaches for modeling these effects are suggested. The first models heterogeneity between clusters, while the second models heterogeneity within clusters. A naive approach to including individual cluster effects results in an unidentifiable model. It is shown how an appropriate reparametrization gives identifiable parameters; this provides a new approach for modeling heterogeneity within clusters. For hierarchical spatial binary regression models with individual cluster effects, two MCMC algorithms for parameter estimation are developed. The first is based on a direct evaluation of the likelihood. The second is based on the representation of binary responses with Gaussian latent variables through a threshold mechanism, which is particularly useful for probit models. Simulation results show satisfactory behavior of the MCMC algorithms developed. Finally, the proposed model classes are applied to the mobility study and the results are interpreted.
Most econometric analyses of patent data rely on regression methods using a parametric form of the predictor for modeling the dependence of the response on given covariates. These methods often lack the capability of identifying non-linear relationships between dependent and independent variables. We present an approach based on a generalized additive model in order to avoid these shortcomings. Our method is fully Bayesian and makes use of Markov Chain Monte Carlo (MCMC) simulation techniques for estimation purposes. Using this methodology we reanalyze the determinants of patent oppositions in Europe for biotechnology/pharmaceutical and semiconductor/computer software patents. Our results largely confirm the findings of a previous parametric analysis of the same data provided by Graham, Hall, Harhoff & Mowery (2002). However, our model specification clearly reveals considerable non-linearities in the effect of various metrical covariates on the probability of an opposition. Furthermore, our semiparametric approach shows that the categorizations of these covariates made by Graham et al. (2002) cannot capture those non-linearities and, from a statistical point of view, appear somewhat ad hoc.
Mathematik, Informatik und Statistik - Open Access LMU - Teil 01/03
In this paper, binary state space mixed models of Czado and Song (2001) are applied to construct individual risk profiles based on the daily diary of a migraine headache sufferer. These models allow for the modeling of a dynamic structure together with parametric covariate effects. Since the analysis is based on posterior inference using Markov Chain Monte Carlo (MCMC) methods, Bayesian model fit and model selection criteria are adapted to these binary state space mixed models. It is shown how they can be used to select an appropriate model, for which the probability of a headache today, given the occurrence or nonoccurrence of a headache yesterday, can be estimated in dependence on weather conditions such as wind chill and humidity. This can provide the basis for pain management of such patients.
BayesX is a software tool for Bayesian inference based on Markov Chain Monte Carlo (MCMC) techniques. The main feature of BayesX so far is a very powerful tool for Bayesian semiparametric regression within the generalized linear model framework. BayesX is able to estimate nonlinear effects of metrical covariates, trends and flexible seasonal patterns of time scales, structured and/or unstructured random effects of spatial covariates (geographical data), and unstructured random effects of unordered group indicators. Moreover, BayesX is able to estimate varying-coefficient models with metrical and even spatial covariates as effect modifiers. The distribution of the response can be Gaussian, binomial or Poisson. In addition, BayesX has some useful functions for handling and manipulating datasets and geographical maps.
Discrete-time grouped duration data, with one or multiple types of terminating events, are often observed in social sciences or economics. In this paper we suggest and discuss dynamic models for flexible Bayesian nonparametric analysis of such data. These models allow simultaneous incorporation and estimation of baseline hazards and time-varying covariate effects, without imposing particular parametric forms. Methods for exploring the possibility of time-varying effects, as for example the impact of nationality or unemployment insurance benefits on the probability of re-employment, have recently gained increasing interest. Our modelling and estimation approach is fully Bayesian and makes use of Markov Chain Monte Carlo (MCMC) simulation techniques. A detailed analysis of unemployment duration data, with full-time job, part-time job and other causes as terminating events, illustrates our methods and shows how they can be used to obtain refined results and interpretations.