In this episode, Darren reminisces with Chetan Venkatesh, CEO of Macrometa. Venkatesh has a long history in data management dating back to the early days of Grid Computing, and he founded Macrometa to tackle data management across the globally dispersed edge, data centers, and clouds. Blog: https://www.intel.com/content/www/us/en/government/podcasts/embracing-digital-transformation-episode98.html
Kent talks about Smart Home products transitioning to aid businesses, particularly in home rentals. He analyzes the current tech and how it has impacted the short-term rental experience. Ryan and Kent then discuss the current landscape of the Smart Space industry, with Kent sharing insights on trends and challenges he's witnessed. The podcast wraps up with what to look out for from Yonomi and the Smart Space industry.

Kent Dickson is an experienced technology leader who enjoys building great teams and disruptive products. He is VP and GM of IoT Platforms & Services for Allegion, a global pioneer in seamless access, with leading brands like CISA®, Interflex®, LCN®, Schlage®, SimonsVoss®, and Von Duprin®. Before Allegion, Kent was co-founder and CEO of Yonomi, the simple connected home integration platform, which joined the Allegion family of brands in January 2021. Kent's background includes serving as General Manager of GridMachine, a massive-scale Grid-Computing-as-a-Service offering operated as a business unit of Sentient AI. Kent spent nine years at BEA Systems, leading the teams for several market-defining products in the WebLogic and AquaLogic lines. Kent has spent ten years working on the Smart Home frontier, partnering with leading device makers, voice assistants, AI innovators, and service providers. Kent holds a BS in Aerospace Engineering and an MBA from the University of Colorado, Boulder.
In this episode of the IoT for All Podcast, we sat down with Kent Dickson, CEO and Co-Founder of Yonomi, to talk about the state of the smart home landscape and how the COVID-19 pandemic could make or break progress for companies in the industry. Kent has spent most of his career in enterprise software and the past ten years on the smart home frontier, from building a massive-scale Grid-Computing-as-a-Service product at GridMachine to developing a Cloud platform for residential energy management.

To start the episode, Kent introduced us to Yonomi’s background and what they do as an integration platform for smart home devices. On the consumer side, Kent talked about how Yonomi creates positive experiences with even simple routines, like turning off the robot vacuum when the TV turns on or ensuring that when a homeowner arrives, the lights are on and the thermostat is set to the ideal temperature. He also shared how, for developers, Yonomi can reduce development time by providing an API that allows for a high level of interoperability between devices.

Kent spoke to some of the challenges that came with developing Yonomi’s platform, including building and maintaining the flexibility to connect with products from different countries and learning to work around the development schedules of larger companies as the platform grew. Turning the conversation to current events, Kent spoke to how the COVID-19 pandemic has forced many to spend more time at home, giving them a greater appreciation for the services their smart home devices perform.
Kent said, “our customers seem to be more excited about what we’re doing than they’ve ever been.”

Kent shared how he believed MSOs and telecom companies could step up to help ease consumer discomfort while self-quarantined, and spoke to the role of companies across industries, including insurance, in helping consumers save on costs and maintain a high level of comfort.

To close out the episode, Kent gave his predictions for the smart home industry post-pandemic. He shared that, in the next two years, the widespread adoption of smart devices would be inevitable - much like how every TV on the market is a smart TV, he said light switches and other household necessities would likely follow suit.

Interested in connecting with Kent? Reach out to him on LinkedIn!

About Yonomi: The Yonomi Platform simplifies interoperability throughout the connected home ecosystem. From cloud enablement for consumer products to powering hundreds of third-party device integrations, Yonomi enables you to engage consumers with unique branded experiences in homes across the globe. Yonomi was founded in 2013 by Kent Dickson, Joss Scholten, and Garett Madole. The company is dual-headquartered in Austin, TX and Boulder, CO.

Key Questions and Topics from this Episode:
(01:53) Introduction to Kent Dickson
(04:04) Introduction to Yonomi
(09:15) How does Yonomi integrate different smart home interfaces?
(13:29) What have been the biggest challenges during the development of the Yonomi platform?
(18:32) How does Yonomi choose its partners?
(21:28) On the consumer side, has the Coronavirus affected the smart home industry at all?
(25:01) What are MSOs doing to support consumers during this crisis? What role does Yonomi play in that?
(28:57) How can insurance companies utilize IoT to help homeowners spend less on repairs and upkeep?
(34:54) What are your predictions for the future of smart homes?
(40:01) What announcements can we look forward to from Yonomi?
Direct from Rio de Janeiro, Wellington Moscon, CEO and founder of GoEPIK, talks with Cezar Taurion to understand when digital transformation becomes necessary for businesses, why it has come to dominate the concerns of organizational leaders, and how to extract value from it. Cezar Taurion is president of i2a2 (Instituto de Inteligência Artificial Aplicada), Digital Transformation partner at Kick Corporate Ventures, mentor, and angel investor. He is the author of seven books covering topics such as Digital Transformation, Open Source Software, Grid Computing, Embedded Software, Cloud Computing, and Big Data, among others.

Founded in 2017, GoEPIK is a startup that offers a highly customizable process-management platform allowing companies and entrepreneurs to autonomously build solutions for Industry 4.0 and Digital Transformation. These solutions serve every business area of an organization and integrate technologies such as Augmented Reality, IoT (Internet of Things), Machine Learning, Analytics, Workflows, and Checklists, among others. To receive a free trial of the GoEPIK Platform, visit www.goepik.com.br and sign up.
Episode 41: Kent Dickson – Yonomi

Kent Dickson is the co-founder and CEO of Yonomi, the IoT company creating a more connected smart home. Yonomi builds smart home solutions for people and companies to connect devices, integrate multiple platforms, and Bring the Home to Life(TM). Kent’s background includes serving as General Manager of GridMachine, a massive-scale Grid-Computing-as-a-Service offering operated as a business unit of Sentient AI. Prior to that, Kent was CTO at Tendril, where he led product development and strategy for a first-of-its-kind Cloud platform for the Internet of Things with a focus on residential energy management use cases. Kent also spent 9 years at BEA Systems leading the teams for several market-defining products in the WebLogic and AquaLogic lines. Kent has spent 10 years working on the Smart Home frontier, partnering with leading device makers, voice assistants, AI innovators, and service providers. Kent holds a BS in Aerospace Engineering and an MBA from the University of Colorado, Boulder.

Links to things we talk about:
Kent Dickson on LinkedIn
Yonomi Website
Yonomi on Instagram
Click to Review and Rate Colorado TechCast on iTunes!

We value every review we receive and often read them out on the show. If you take the time to leave one, THANK YOU – You rock!

IF YOU LIKE WHAT YOU HEAR, PLEASE:
Subscribe to our list
Connect with us on Twitter
Email us and tell us what you think!
In Episode 3 of the I Love Data Centers Podcast, you'll meet Pete Sclafani, COO and Co-founder of 6Connect. Pete's expertise allows him to speak up and down the IT stack. I call Pete a "life decathlete" because of the breadth of experience he has as an IT expert, businessman, husband, and father.

Episode 003 Show Notes
• Where are you right now? [3:13]
• Pete’s background [4:18]
• Norwich University [6:03]
• How did you get into technology? [8:37]
• Pete’s college experience [11:26]
• First job out of college [15:10]
• “Saddest Cubicle” [18:59]
• First time walking into a data center [20:06]
• Pete on the evolution of 6Connect [23:22]
• How do you go about speaking with different archetypes in the industry? [27:13]
• Speaking to executives [29:42]
• Speaking with operational people [31:01]
• What have you learned in this industry that has surprised you? [31:54]
• “Lessons in Grid Computing” [34:37]
• How do you see things evolving in the industry? [39:21]
• Pete on innovation in the industry over the past 15 years [45:16]
• What is a common misconception people have about the data center industry? [52:32]
• What is on the backdrop of your computer right now? [54:23]
• Mac or PC? [54:49]
• What have you read lately that you found fascinating? [55:30]
• “A World Undone” book [55:54]
• What advice would you give someone new to the industry? [57:25]
• How to contact Pete and learn more about 6Connect [59:19]

Links mentioned in episode:
Saddest Cubicle https://www.wired.com/2014/10/saddest-cubicles/#slide-4
Lessons in Grid Computing https://www.amazon.com/Lessons-Grid-Computing-System-Mirror/dp/0471790109
A World Undone https://www.amazon.com/World-Undone-Story-Great-1914/dp/0553382403
In this session, we explain how Financial Services organizations can leverage AWS grid computing capabilities to perform large-scale calculations for risk management purposes. Numerous financial services companies face the same basic challenge: modeling multiple scenarios with different risk factors simultaneously or in quick succession in order to make informed decisions that maximize gains and minimize financial loss. Examples of these workloads include Monte Carlo simulations, price model validation and back-testing, and risk calculations for hedging and capital optimization strategies. We provide prescriptive guidance on which AWS services to use when running a grid computing cluster that requires thousands of cores, cover specific industry use cases, and highlight the speed and cost benefits that the AWS platform, its auto-scaling capabilities, and its various compute services can help achieve.
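To illustrate why workloads like these suit a grid: each Monte Carlo task is independent, so work can be fanned out to many nodes and the results merged at the end. The sketch below is a toy one-day value-at-risk calculation that uses local worker processes as stand-ins for grid nodes; the model, parameters, and function names are illustrative assumptions, not tied to any particular AWS service.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(args):
    """One grid task: simulate n independent daily portfolio returns
    and return the corresponding losses (loss = -return)."""
    seed, n_paths = args
    rng = random.Random(seed)  # per-task seed keeps tasks independent and reproducible
    return [-rng.gauss(0.0005, 0.02) for _ in range(n_paths)]  # toy mean/volatility

def value_at_risk(confidence=0.99, n_tasks=8, paths_per_task=50_000):
    """Fan tasks out to workers (stand-ins for grid nodes), then
    aggregate all losses and read off the requested quantile."""
    tasks = [(seed, paths_per_task) for seed in range(n_tasks)]
    with ProcessPoolExecutor() as pool:
        losses = [l for chunk in pool.map(simulate_chunk, tasks) for l in chunk]
    losses.sort()
    return losses[int(confidence * len(losses))]

if __name__ == "__main__":
    print(f"99% one-day VaR (fraction of portfolio): {value_at_risk():.4f}")
```

On an actual grid, `simulate_chunk` would run on remote compute nodes rather than local processes, but the split-simulate-aggregate shape of the computation is the same.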
Thomas Kentemich talks with Till Schulte-Coerne about grid computing. A grid is a distributed system that solves computationally intensive problems on a virtual supercomputer made up of loosely coupled individual systems. The resources in a grid are shared by its users collectively and transparently.
Fakultät für Mathematik, Informatik und Statistik - Digitale Hochschulschriften der LMU - Teil 01/02
In the 1990s a number of technological innovations appeared that revolutionized biology, and 'Bioinformatics' became a new scientific discipline. Microarrays can measure the abundance of tens of thousands of mRNA species, data on the complete genomic sequences of many different organisms are available, and other technologies make it possible to study various processes at the molecular level. In Bioinformatics and Biostatistics, current research and computations are limited by the available computer hardware. However, this problem can be addressed using high-performance computing resources. There are several reasons for the increased focus on high-performance computing: larger data sets, increased computational requirements stemming from more sophisticated methodologies, and the latest developments in computer chip production. The open-source programming language 'R' was developed to provide a powerful and extensible environment for statistical and graphical techniques. There are many good reasons for preferring R to other software or programming languages for scientific computations (in statistics and biology). However, the R language was not designed with parallel or high-performance computing in mind. Nonetheless, during the last decade, a great deal of research has been conducted on using parallel computing techniques with R. This PhD thesis demonstrates the usefulness of the R language and parallel computing for biological research. It introduces parallel computing with R, and reviews and evaluates existing techniques and R packages for parallel computing on computer clusters, on multi-core systems, and in Grid Computing. From a computer-science point of view, the packages were examined for their reusability in biological applications, and several improvements were proposed. Furthermore, parallel applications for next-generation sequencing data and for the preprocessing of microarray data were developed.
Microarray data are characterized by high levels of noise and bias. As these perturbations have to be removed, preprocessing of raw data has been a research topic of high priority over the past few years. A new Bioconductor package called affyPara for parallelized preprocessing of high-density oligonucleotide microarray data was developed and published. The data can be partitioned across arrays using a block-cyclic partition, which makes parallelization of the algorithms directly possible. Existing statistical algorithms and data structures had to be adjusted and reformulated for use in parallel computing. Using the new parallel infrastructure, normalization methods can be enhanced, and new methods become available. Partitioning the data and distributing it to several nodes or processors solves the main-memory problem and accelerates the methods by a factor of up to fifteen for 300 arrays or more. The final part of the thesis presents a large cancer study analysing more than 7,000 microarrays from a publicly available database and estimating gene interaction networks. For this purpose, a new R package for microarray data management was developed, and various challenges regarding the analysis of this amount of data are discussed. The comparison of gene networks for different pathways and different cancer entities in this data set partly confirms already established forms of gene interaction.
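The affyPara package itself is written in R; purely to illustrate the block-cyclic partitioning idea described above, here is a small language-neutral sketch in Python (the function name and signature are hypothetical, not the package's API). Consecutive blocks of array indices are dealt out to workers in round-robin order, which balances load while keeping neighbouring arrays together within a block.

```python
def block_cyclic_partition(n_arrays, n_workers, block_size):
    """Assign array indices 0..n_arrays-1 to workers block-cyclically:
    consecutive blocks of `block_size` indices go to workers in turn."""
    assignment = [[] for _ in range(n_workers)]
    for start in range(0, n_arrays, block_size):
        worker = (start // block_size) % n_workers
        assignment[worker].extend(range(start, min(start + block_size, n_arrays)))
    return assignment

# Example: 10 arrays dealt to 3 workers in blocks of 2 yields
# worker 0: [0, 1, 6, 7], worker 1: [2, 3, 8, 9], worker 2: [4, 5]
```

Each worker can then load and preprocess only its assigned arrays, which is what keeps the per-node memory footprint bounded as the number of arrays grows.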
As an essential component of IT security management, Identity & Access Management (I&AM) encompasses all organizational and technical processes for administering an institution's service users and their permissions; the data held by a wide range of authoritative sources, such as human-resources and customer-management systems, is aggregated, correlated, and made available to IT services in consolidated form. Federated Identity Management (FIM) aims to make these integrated data sets usable across organizational boundaries as well; this functionality is needed with increasing urgency, for example in business-to-business cooperations, outsourcing scenarios, and Grid Computing. Avoiding redundancy and inconsistencies, but also guaranteeing the availability of the data and compliance with data-protection regulations, are particularly critical success factors here. With the Security Assertion Markup Language (SAML), the specifications of the Liberty Alliance, and WS-Federation as an integral part of the Web Services WS-* protocol stack, industrial and partially standardized technical approaches to FIM have emerged; their practical adoption, however, still frequently fails because of the insufficiently clarified, complex organizational embedding and the technical shortcomings regarding integration into existing IT infrastructures.

This thesis first conducts an in-depth requirements analysis, new in this scope, that considers not only I&AM and FIM but also the user perspective known as User-Centric Identity Management (UCIM); the more than 60 structured and weighted requirements focus on the integration of I&AM and FIM systems both on the side of the organization the users belong to (identity provider) and at the respective service provider, and on the inclusion of organizational constraints as well as selected security and privacy aspects. Within a comprehensive, holistic architectural concept, a methodology for the systematic integration of FIM components into existing I&AM systems is then developed. Besides the precise specification of the technical system interfaces, which existing approaches lack, this work focuses on the organizational integration from the perspective of IT service management, with particular attention to security management and change management according to ITIL. To compensate for further fundamental deficits of previous FIM approaches, five new FIM components are specified as part of a tooling concept, aimed at improved interoperability between the FIM systems of the organizations participating in a so-called identity federation. In addition, a policy-based privacy-management architecture based on the eXtensible Access Control Markup Language (XACML) is specified and integrated; it enables decentralized control and monitoring of data releases by administrators and users and thus contributes substantially to compliance with data-protection requirements.

A description of the prototype implementation of the tooling concepts, together with a discussion of its performance and the methodical application of the architectural concept to a complex, realistic scenario, rounds off the work.
Dan Ciruli of Digipede Technologies is back to bring us up to date with the Digipede Network, a .NET toolset for enabling grid computing. Support this podcast at — https://redcircle.com/net-rocks/donations
Dan Ciruli talks about the Digipede Network, a framework for grid computing with the .NET Framework on the Windows platform. Support this podcast at — https://redcircle.com/net-rocks/donations