School of Computing and Information Sciences
In this talk we examine how high-performance computing has changed over the last decade and look toward the future in terms of trends. These changes have had, and will continue to have, a major impact on our software. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will look at five areas of research that will have an important impact on the development of software and algorithms, focusing on the following themes:
- Redesign of software to fit multicore and hybrid architectures
- Automatically tuned application software
- Exploiting mixed precision for performance (a brief sketch follows this list)
- The importance of fault tolerance
- Communication-avoiding algorithms
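To make the mixed-precision theme concrete, here is a minimal sketch of iterative refinement, a standard way to exploit fast low-precision arithmetic: the matrix is factorized once in float32 (the expensive step), and the solution is then corrected using residuals computed in float64. This is an illustrative sketch assuming NumPy and SciPy; the function name and test problem are my own, not code from the talk.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b by LU factorization in float32 plus
    iterative refinement with float64 residuals."""
    # Low-precision factorization: the O(n^3) work happens here.
    lu, piv = lu_factor(A.astype(np.float32))
    x = lu_solve((lu, piv), b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        # High-precision residual: only O(n^2) work per iteration.
        r = b - A @ x
        # Correction solve reuses the cheap float32 factors.
        d = lu_solve((lu, piv), r.astype(np.float32)).astype(np.float64)
        x += d
    return x

# Illustrative well-conditioned test problem.
A = np.random.rand(500, 500) + 500 * np.eye(500)
b = np.random.rand(500)
x = mixed_precision_solve(A, b)
print(np.linalg.norm(A @ x - b))  # residual near float64 accuracy
```

For well-conditioned systems, a few refinement steps recover full double-precision accuracy while the dominant cost runs at single-precision speed.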
Experiments have shown that even very slow electrons can break DNA strands. This surprising result suggests that the electrons are somehow trapped in states that live long enough to promote bond rupture. Computational techniques can simulate such trapping processes and help develop an understanding of the strand-break mechanisms. However, detailed quantum simulations of the collision between a slow electron and a polyatomic molecule are highly computationally intensive. I will discuss the techniques we use to address this demanding problem and give a few examples of applications and their implications.
During the past fifty years there has been extensive, continuous, and growing interaction between logic and computer science. In fact, logic has been called "the calculus of computer science." The argument is that logic plays a fundamental role in computer science, in areas as disparate as architecture (logic gates), software engineering (specification and verification), programming languages (semantics, logic programming), databases (relational algebra and SQL), artificial intelligence (automated theorem proving), algorithms (complexity and expressiveness), and the theory of computation (general notions of computability). This non-technical talk will provide an overview of the unusual effectiveness of logic in computer science, going all the way back to Aristotle and Euclid and showing how logic actually gave rise to computer science.
The research of the Center for Subsurface Modeling (CSM) addresses the growing use of computers to simulate physical events and the use of these simulations to study physical phenomena and to perform engineering analysis and design. Our team investigates high-performance parallel processing as a tool to model the behavior of fluids in permeable geologic formations such as petroleum and natural gas reservoirs, groundwater aquifers and aquitards, and in shallow bodies of water such as bays and estuaries. The accurate and efficient simulation of subsurface phenomena requires a blend of physical and geomechanical modeling of subsurface processes and careful numerical implementation. Compounding these issues is a general lack of high-quality data for model calibration and verification. CSM researchers collaborate with outside experts to find suitably accurate representations of physical systems, including such processes as fluid phase behavior, particle transport and dispersion, capillary pressure effects, flow in highly heterogeneous media (possibly fractured and vuggy), geomechanical response and subsidence, and well production. These and other processes must be simulated accurately so as to avoid nonphysical numerical artifacts that can cloud risk assessment and the optimization of management and intervention objectives. The Center is part of the Institute for Computational Engineering and Sciences (ICES). CSM comprises a close-knit team of faculty and research scientists with expertise in applied mathematics, engineering, and computer, physical, chemical, and geological sciences. This interdisciplinary approach to simulation permits a more effective integration of advanced mathematical and numerical techniques with engineering applications.
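As a toy illustration of the numerical care that flow in heterogeneous media demands, here is a minimal sketch of steady single-phase Darcy flow in one dimension, using harmonic averaging of permeability at cell interfaces, a standard choice when coefficients vary strongly. The function name, grid, and permeability field are illustrative assumptions, not CSM code.

```python
import numpy as np

def darcy_pressure_1d(k, p_left, p_right, L=1.0):
    """Steady single-phase Darcy flow:  -(k(x) p'(x))' = 0,
    cell-centered finite differences with harmonic averaging
    of permeability at the cell interfaces."""
    n = len(k)
    dx = L / n
    # Harmonic mean at the n-1 interior interfaces: dominated by
    # the low-permeability side, which keeps fluxes physical.
    k_face = 2.0 * k[:-1] * k[1:] / (k[:-1] + k[1:])
    A = np.zeros((n, n))
    rhs = np.zeros(n)
    for i in range(n):
        # Transmissibilities to the left/right neighbor or boundary
        # (boundary faces sit half a cell from the cell center).
        tl = 2.0 * k[0] / dx**2 if i == 0 else k_face[i - 1] / dx**2
        tr = 2.0 * k[-1] / dx**2 if i == n - 1 else k_face[i] / dx**2
        A[i, i] = tl + tr
        if i > 0:
            A[i, i - 1] = -k_face[i - 1] / dx**2
        if i < n - 1:
            A[i, i + 1] = -k_face[i] / dx**2
        if i == 0:
            rhs[i] = tl * p_left
        if i == n - 1:
            rhs[i] = tr * p_right
    return np.linalg.solve(A, rhs)

# Three orders of magnitude of permeability contrast across ten cells.
k = np.array([1.0, 1.0, 0.001, 0.001, 1.0, 1.0, 1.0, 0.01, 1.0, 1.0])
p = darcy_pressure_1d(k, p_left=1.0, p_right=0.0)
print(p)  # pressure drops sharply across the low-permeability cells
```

Harmonic rather than arithmetic averaging is the key detail here: with an arithmetic mean, a single high-permeability neighbor would leak unphysical flux through a near-impermeable barrier.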
"We all know that modern science is undergoing a profound transformation as it aims to tackle the complex problems of the 21st Century. It isbecoming highly collaborative; problems as diverse as climate change,renewable energy, or the origin of gamma-ray bursts require understanding processes that no single group or community alone has the skills to address. At the same time, after centuries of little change, compute, data, and network environments have grown by 9-12 orders of magnitude in the last few decades. Moreover, science is not only compute-intensive but is dominated now by data-intensive methods. This dramatic change in the culture and methodology of science will require a much more integrated and comprehensive approach to development and deployment of hardware, software, and algorithmic tools and environments supporting research, education, and increasingly collaboration across disciplines."