Podcasts about eigenvalues

  • 28 PODCASTS
  • 73 EPISODES
  • 36m AVG DURATION
  • ? INFREQUENT EPISODES
  • Mar 4, 2025 LATEST

POPULARITY

[Popularity chart, 2017-2024]


Best podcasts about eigenvalues

Latest podcast episodes about eigenvalues

Quantitude
S6E16 Correspondence Analysis

Quantitude

Play Episode Listen Later Mar 4, 2025 46:41


In this week's episode Greg and Patrick shine a flashlight on correspondence analysis and find that this is an extraordinarily cool yet often neglected method similar to factor analysis but applied to nominal contingency tables. Along the way they also discuss online personality tests, marital therapy, modern antibiotics, the Newlywed Game, grand slams, the advantages of being flexible, disrespecting nominal variables, formally apologizing to linguists, Winnie the Pooh, VH1's Pop-Up Video, the witches of Macbeth, Wait Wait Don't Tell Me, and the downsides of Novocaine. Stay in contact with Quantitude! Web page: quantitudepod.org TwitterX: @quantitudepod YouTube: @quantitudepod Merch: redbubble.com

The Nonlinear Library
AF - An Extremely Opinionated Annotated List of My Favourite Mechanistic Interpretability Papers v2 by Neel Nanda

The Nonlinear Library

Play Episode Listen Later Jul 7, 2024 38:20


Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: An Extremely Opinionated Annotated List of My Favourite Mechanistic Interpretability Papers v2, published by Neel Nanda on July 7, 2024 on The AI Alignment Forum. This post represents my personal hot takes, not the opinions of my team or employer. This is a massively updated version of a similar list I made two years ago. There's a lot of mechanistic interpretability papers, and more come out all the time. This can be pretty intimidating if you're new to the field! To try helping out, here's a reading list of my favourite mech interp papers: papers which I think are important to be aware of, often worth skimming, and sometimes worth reading deeply (time permitting). I've annotated these with my key takeaways, what I like about each paper, which bits to deeply engage with vs skim, etc. I wrote a similar post 2 years ago, but a lot has changed since then, thus v2! Note that this is not trying to be a comprehensive literature review - this is my answer to "if you have limited time and want to get up to speed on the field as fast as you can, what should you do?". I'm deliberately not following academic norms like necessarily citing the first paper introducing something, or all papers doing some work, and am massively biased towards recent work that is more relevant to the cutting edge. I also shamelessly recommend a bunch of my own work here, sorry! How to read this post: I've bolded the most important papers to read, which I recommend prioritising. All of the papers are annotated with my interpretation and key takeaways, and tbh I think reading that may be comparably good to skimming the paper. And there's far too many papers to read all of them deeply unless you want to make that a significant priority. 
I recommend reading all my summaries, noting the papers and areas that excite you, and then trying to dive deeply into those. Foundational Work A Mathematical Framework for Transformer Circuits (Nelson Elhage et al, Anthropic) - absolute classic, foundational ideas for how to think about transformers (see my blog post for what to skip). See my youtube tutorial (I hear this is best watched after reading the paper, and adds additional clarity) Deeply engage with: All the ideas in the overview section, especially: Understanding the residual stream and why it's fundamental. The notion of interpreting paths between interpretable bits (eg input tokens and output logits) where the path is a composition of matrices and how this is different from interpreting every intermediate activations And understanding attention heads: what a QK and OV matrix is, how attention heads are independent and additive and how attention and OV are semi-independent. Skip Trigrams & Skip Trigram bugs, esp understanding why these are a really easy thing to do with attention, and how the bugs are inherent to attention heads separating where to attend to (QK) and what to do once you attend somewhere (OV) Induction heads, esp why this is K-Composition (and how that's different from Q & V composition), how the circuit works mechanistically, and why this is too hard to do in a 1L model Skim or skip: Eigenvalues or tensor products. They have the worst effort per unit insight of the paper and aren't very important. Superposition Superposition is a core principle/problem in model internals. For any given activation (eg the output of MLP13), we believe that there's a massive dictionary of concepts/features the model knows of. Each feature has a corresponding vector, and model activations are a sparse linear combination of these meaningful feature vectors. 
Further, there are more features in the dictionary than activation dimensions, and they are thus compressed in and interfere with each other, essentially causing cascading errors. This phenomenon of compression is called superposition. Toy models of superpositio...
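
The sparse-linear-combination picture described above can be sketched numerically. This is a toy illustration under stated assumptions (random near-unit feature directions, three active features), not any real model's dictionary:

```python
import numpy as np

rng = np.random.default_rng(0)

d, n_features = 8, 32  # more features in the dictionary than activation dimensions

# A dictionary of unit-norm feature directions in activation space.
W = rng.normal(size=(n_features, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)

# A sparse set of active features: most coefficients are zero.
coeffs = np.zeros(n_features)
active = rng.choice(n_features, size=3, replace=False)
coeffs[active] = rng.normal(size=3)

# The activation is a sparse linear combination of feature vectors.
activation = coeffs @ W

# Because n_features > d the directions cannot all be orthogonal, so reading
# a feature back out with a dot product picks up interference from the others.
readout = W @ activation
```

With 32 directions crammed into 8 dimensions, `readout` is nonzero even for inactive features; that interference is exactly the compression cost the post describes.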

The Nonlinear Library
LW - Math-to-English Cheat Sheet by nahoj

The Nonlinear Library

Play Episode Listen Later Apr 9, 2024 11:02


Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Math-to-English Cheat Sheet, published by nahoj on April 9, 2024 on LessWrong. Say you've learnt math in your native language which is not English. Since then you've also read math in English and you appreciate the near universality of mathematical notation. Then one day you want to discuss a formula in real life and you realize you don't know how to pronounce "aₙ". Status: I had little prior knowledge of the topic. This was mostly generated by ChatGPT4 and kindly reviewed by @TheManxLoiner.
General
Distinguishing case: F, δ. "Big F" or "capital F"; "little delta."
Subscripts: aₙ. "a sub n" or, in most cases, just "a n."
Calculus
Pythagorean Theorem: a² + b² = c². "a squared plus b squared equals c squared."
Area of a Circle: A = πr². "Area equals pi r squared."
Slope of a Line: m = (y₂ − y₁)/(x₂ − x₁). "m equals y 2 minus y 1 over x 2 minus x 1."
Quadratic Formula: x = (−b ± √(b² − 4ac))/(2a). "x equals minus b [or 'negative b'] plus or minus the square root of b squared minus four a c, all over two a."
Sum of an Arithmetic Series: S = (n/2)(a₁ + aₙ). "S equals n over two times a 1 plus a n."
Euler's Formula: e^(iθ) = cos(θ) + i sin(θ). "e to the i theta equals cos [pronounced 'coz'] theta plus i sine theta."
Law of Sines: sin(A)/a = sin(B)/b = sin(C)/c. "Sine A over a equals sine B over b equals sine C over c."
Area of a Triangle (Heron's Formula): A = √(s(s − a)(s − b)(s − c)), where s = (a + b + c)/2. "Area equals the square root of s times s minus a times s minus b times s minus c, where s equals a plus b plus c over two."
Compound Interest Formula: A = P(1 + r/n)^(nt). "A equals P times one plus r over n to the power of n t."
Logarithm Properties: log_b(xy) = log_b(x) + log_b(y). Don't state the base if clear from context: "Log of x y equals log of x plus log of y." Otherwise "Log to the base b of x y equals log to the base b of x plus log to the base b of y."
More advanced operations
Derivative of a Function: df/dx or (d/dx)f(x) or f'(x). "df by dx" or "d dx of f of x" or "f prime of x."
Second Derivative: (d²/dx²)f(x) or f''(x). "d squared dx squared of f of x" or "f double prime of x."
Partial Derivative (unreviewed): (∂/∂x)f(x, y). "Partial with respect to x of f of x, y."
Definite Integral: ∫_a^b f(x) dx. "Integral from a to b of f of x dx."
Indefinite Integral (Antiderivative): ∫ f(x) dx. "Integral of f of x dx."
Line Integral (unreviewed): ∫_C f(x, y) ds. "Line integral over C of f of x, y ds."
Double Integral: ∫_a^b ∫_c^d f(x, y) dx dy. "Double integral from a to b and c to d of f of x, y dx dy."
Gradient of a Function: ∇f. "Nabla f" or "gradient of f" to distinguish from other uses such as divergence or curl.
Divergence of a Vector Field: ∇·F. "Nabla dot F."
Curl of a Vector Field: ∇×F. "Nabla cross F."
Laplace Operator (unreviewed): Δf or ∇²f. "Delta f" or "Nabla squared f."
Limit of a Function: lim_(x→a) f(x). "Limit as x approaches a of f of x."
Linear Algebra (vectors and matrices)
Vector Addition: v + w. "v plus w."
Scalar Multiplication: cv. "c times v."
Dot Product: v·w. "v dot w."
Cross Product: v×w. "v cross w."
Matrix Multiplication: AB. "A B."
Matrix Transpose: Aᵀ. "A transpose."
Determinant of a Matrix: |A| or det(A). "Determinant of A" or "det A."
Inverse of a Matrix: A⁻¹. "A inverse."
Eigenvalues and Eigenvectors: λ for eigenvalues, v for eigenvectors. "Lambda for eigenvalues; v for eigenvectors."
Rank of a Matrix: rank(A). "Rank of A."
Trace of a Matrix: tr(A). "Trace of A."
Vector Norm: ‖v‖. "Norm of v" or "length of v."
Orthogonal Vectors: v·w = 0. "v dot w equals zero."
With numerical values
Matrix Multiplication with Numerical Values: Let A = [[1, 2], [3, 4]] and B = [[5, 6], [7, 8]]; then AB = [[19, 22], [43, 50]]. "A B equals nineteen, twenty-two; forty-three, fifty."
Vector Dot Product: Let v = (1, 2, 3) and w = (4, 5, 6); then v·w = 32. "v dot w equals thirty-two."
Determinant of a Matrix: For A = [[1, 2], [3, 4]], |A| = −2. "Determinant of A equals minus two."
Eigenvalues and Eigenvectors with Numerical Values: Given A = [[2, 1], [1, 2]], it has eigenvalues λ₁ = 3 and λ₂ = 1, with corresponding eigenvectors v₁ = (1, 1) and v₂ = (1, −1). "Lambda ...
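
The numerical eigenvalue example in the cheat sheet can be checked mechanically; a quick NumPy sketch (the order, sign, and scaling of the eigenvectors returned by the library may differ from the hand-worked answer, since eigenvectors are only defined up to a scalar):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: columns of `vecs` are eigenvectors.
vals, vecs = np.linalg.eig(A)

# Eigenvalues 3 and 1, as stated in the example.
assert np.allclose(sorted(vals), [1.0, 3.0])

# Check A v = lambda v for each eigenpair.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```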

The Nonlinear Library: LessWrong
LW - Math-to-English Cheat Sheet by nahoj

The Nonlinear Library: LessWrong

Play Episode Listen Later Apr 9, 2024 11:02



The Dictionary
#E35 (Egyptian to eigenvector)

The Dictionary

Play Episode Listen Later Oct 20, 2023 41:00


I read from Egyptian to eigenvector.     Egypt https://en.wikipedia.org/wiki/Egypt     Technically, yes, Mammoths were still around while the pyramids were being built, but it was a small population on an island.  https://www.worldatlas.com/articles/did-woolly-mammoths-still-roam-parts-of-earth-when-the-great-pyramids-were-built.html     The fantastic podcast "Ologies" by Alie Ward has a great episode about Egyptian bread called "Gastroegyptology" https://www.alieward.com/ologies/gastroegyptology     Eigenvalues and eigenvectors, oh my! https://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors     The word of the episode is "eidetic". https://en.wikipedia.org/wiki/Eidetic_memory     Theme music from Jonah Kraut https://jonahkraut.bandcamp.com/     Merchandising! https://www.teepublic.com/user/spejampar     "The Dictionary - Letter A" on YouTube   "The Dictionary - Letter B" on YouTube   "The Dictionary - Letter C" on YouTube   "The Dictionary - Letter D" on YouTube     Featured in a Top 10 Dictionary Podcasts list! https://blog.feedspot.com/dictionary_podcasts/     Backwards Talking on YouTube: https://www.youtube.com/playlist?list=PLmIujMwEDbgZUexyR90jaTEEVmAYcCzuq     dictionarypod@gmail.com https://www.facebook.com/thedictionarypod/ https://www.threads.net/@dictionarypod https://twitter.com/dictionarypod https://www.instagram.com/dictionarypod/ https://www.patreon.com/spejampar https://www.tiktok.com/@spejampar 917-727-5757

The Stephen Wolfram Podcast
Science & Technology Q&A for Kids (and others) [October 1, 2021]

The Stephen Wolfram Podcast

Play Episode Listen Later Sep 23, 2022 85:25


Stephen Wolfram answers general questions from his viewers about science and technology as part of an unscripted livestream series, also available on YouTube here: https://wolfr.am/youtube-sw-qa Questions include: Why do flies fly around seemingly constantly with no apparent goal whatsoever? - Why don't we make houses out of some kind of amber and then carve them? - How do traffic light systems work? - Does Stephen prepare any of the answers? They are all so clear and thought out - Hello, how are the magnetic north and geographical north related? - How come oil is deposited in Arctic regions? - Could you please explain what eigenvalues/eigenvectors are and what you can do with them? Thanks.

Quantitude
S3E23: The Mättrix Part II: Using Matrices To Our Advantage

Quantitude

Play Episode Listen Later Mar 8, 2022 53:56


In this week's episode Greg and Patrick continue their discussion from last week in The Mättrix Part Deux, exploring the magic of matrices including estimation, eigenvalues, and eigenvectors. Along the way they also mention flawed audio transcripts, 50 Shades of Greg, drunkenly shoving a matrix, drug mules, things you need, isomorphic interdigitation, plywood and tennis balls, heroin-filled condoms, talking to volleyballs, bawitdaba da bang a dang diggy diggy, meat grinders, not going to prom, vector bouquets, and The Wright Stuff.  

Analysis on Graphs and its Applications
Pollution-free methods for finding eigenvalues in the gaps of the continuous spectrum

Analysis on Graphs and its Applications

Play Episode Listen Later Jan 30, 2022 62:00


Michael Levitin; University of Reading 23 May 2007 – 11:00 to 12:00

Tech Stories
EP-20 How Machine Learning reveals the Real Face Behind the Mask?

Tech Stories

Play Episode Listen Later Dec 18, 2021 6:33


In this episode I try to explain the principle behind face recognition using PCA - the eigenface approach. PCA - Principal Component Analysis. Eigenfaces: an eigenface is the name given to a set of eigenvectors when used in the computer vision problem of human face recognition. What are eigenvalues and eigenvectors in PCA? Eigenvectors are unit vectors with length or magnitude equal to 1. ... Eigenvalues are coefficients applied to eigenvectors that give the vectors their length or magnitude. Covariance matrix: a square matrix giving the covariance between each pair of elements of a given random vector. Listen to the episode on any podcast platform and share your feedback as comments here. Do check the episode on various platforms and follow me on Instagram https://www.instagram.com/podcasteramit Apple https://podcasts.apple.com/us/podcast/id1544510362 Hubhopper Platform https://hubhopper.com/podcast/tech-stories/318515 Amazon https://music.amazon.com/podcasts/2fdb5c45-2016-459e-ba6a-3cbae5a1fa4d Spotify https://open.spotify.com/show/2GhCrAjQuVMFYBq8GbLbwa
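
The PCA step behind the eigenface approach described above can be sketched in a few lines. This is a minimal sketch assuming random vectors stand in for flattened face images; it is not tied to any particular face dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 100))     # 50 "images", each flattened to 100 pixels

# Center the data, then form the covariance matrix of the pixels.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)     # 100 x 100 covariance matrix

# Its eigenvectors are the principal components ("eigenfaces" for face data);
# the eigenvalues say how much variance each component captures.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]  # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Project each image onto the top-k components for a compact representation.
k = 10
projected = Xc @ eigvecs[:, :k]
```

Recognition then compares new images in this low-dimensional projected space instead of raw pixel space.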

The Function Room
The Matrix Revised

The Function Room

Play Episode Listen Later Mar 15, 2021 45:34 Very Popular


Okay, enough messing around, this week we get into the Matrix. Okay, not that matrix. The mathematical matrix. But this one is way more powerful than a dystopian future in which humanity is unknowingly trapped inside a simulated reality. That's piddly. Mathematical matrices are used everywhere, from making computer games to quantum physics. That's Jane Breen, Assistant Professor in Applied Maths at Ontario Tech University in Canada. She loves modelling the complexity of networks in the real world with some very powerful and sometimes simple tools. Speaking of simple tools, before long I start to throw around lingo like eigenvalues and Markov chains like I know what I'm talking about. We find out how Google got so successful, take a brief digression into how drugmakers know their drugs will work, and finish off on how to control the spread of disease. And Ruby and Lily find themselves playing with a real-life application of a Markov chain: a game of Snakes and Ladders. Jane Breen: https://sites.google.com/view/breenj A really good YouTube channel for visualising what's going on in matrices and all of that: https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab
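
The Google connection mentioned above is the PageRank eigenvector: the stationary distribution of a Markov chain over pages. A minimal sketch with a made-up three-page link graph (the matrix is illustrative, not real data):

```python
import numpy as np

# Column-stochastic transition matrix for a tiny 3-page web:
# entry [i, j] is the probability of following a link from page j to page i.
P = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

# Power iteration: repeatedly apply P until the rank vector stops changing.
rank = np.ones(3) / 3
for _ in range(100):
    rank = P @ rank

# The result is the eigenvector of P with eigenvalue 1 (the steady state).
assert np.allclose(P @ rank, rank)
```

For this graph the iteration settles on ranks (0.4, 0.2, 0.4): pages 1 and 3 receive more link mass than page 2.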

The History of Computing

Today we're going to cover a computer programming language many might not have heard of, ALGOL.  ALGOL was written in 1958. It wasn't like many of the other languages in that it was built by committee. The Association for Computing Machinery and the German Society of Applied Mathematics and Mechanics were floating around ideas for a universal computer programming language.  Members from the ACM were a who's who of people influential in the transition from custom computers that were the size of small homes to mainframes. John Backus of IBM had written a programming language called Speedcoding and then Fortran. Joseph Wegstein had been involved in the development of COBOL. Alan Perlis had been involved in Whirlwind and was with the Carnegie Institute of Technology. Charles Katz had worked with Grace Hopper on UNIVAC and FLOW-MATIC.  The Germans were equally as influential. Friedrich Bauer had brought us the stack method while at the Technical University of Munich. Hermann Bottenbruch from the Institute for Applied Mathematics had written a paper on constructing languages. Klaus Samelson had worked on a computer called PERM that was similar to the MIT Whirlwind project. He'd come into computing while studying eigenvalues.  Heinz Rutishauser had written a number of papers on programming techniques and had codeveloped the language Superplan while at the Swiss Federal Institute of Technology, which is where the meeting would be hosted. They met from May 27th to June 2nd in 1958 and initially called the language they would develop IAL, or the International Algebraic Language, but would expand the name to ALGOL, short for Algorithmic Language. They brought us code blocks: the concept that you have a pair of words or symbols that begin and end a stanza of code, like begin and end. They introduced nested scoped functions. They wrote the whole language right there. You would name a variable by simply declaring it integer or setting the variable as a := 1. 
You would write a for statement and define the steps to perform until a condition was met - the root of what we would now call a for loop. You could read a variable in from a punch card. It had built-in SIN and COS. It was line based and fairly simple functional programming by today's standards. They defined how to handle special characters, built boolean operators, floating point notation. It even had portable types.  And by the end they had a compiler that would run on the Z22 computer from Konrad Zuse. While some of Backus' best work, it effectively competed with FORTRAN and never really gained traction at IBM. But it influenced almost everything that happened afterwards.  Languages were popping up all over the place, and in order to bring in more programmers, they wanted a formalized way to allow languages to flourish, but with a standardized notation system so algorithms could be published and shared and developers could follow along with the logic. One outcome of the ALGOL project was the Backus–Naur form, which was the first such standardization. That would be expanded by the Danish computer scientist Peter Naur for ALGOL 60, thus the name. For ALGOL 60 they would meet in Paris, also adding John McCarthy, Julien Green, Bernard Vauquois, Adriaan van Wijngaarden, and Michael Woodger. It got refined, yet a bit more complicated. FORTRAN and COBOL use continued to rage on, but academics loved ALGOL. And the original implementation, now referred to as the ZMMD implementation, gave way to X1 ALGOL, Case ALGOL, ZAM in Poland, GOGOL, VALGOL, RegneCentralen ALGOL, Whetstone ALGOL for physics, Chinese ALGOL, ALGAMS, NU ALGOL out of Norway, ALGEK out of Russia, Dartmouth ALGOL, DG/L, USS 90 Algol, Elliott ALGOL, the ALGOL Translator, Kidsgrove Algol, JOVIAL, Burroughs ALGOL, Niklaus Wirth's ALGOL W, which led to Pascal, MALGOL, and the last would be S-algol in 1979.  But it got overly complicated and overly formal. Individual developers wanted more flexibility here and there. Some wanted simpler languages. 
Some needed more complicated languages. ALGOL didn't disappear as much as it evolved into other languages. Those were coming out fast, and with a committee to approve changes to ALGOL, it was much slower to iterate.  You see, ALGOL profoundly shaped how we think of programming languages. That formalization was critical to paving the way for generations of developers who brought us future languages. ALGOL would end up being the parent of CPL, and through CPL, BCPL, C, C++, and through that, Objective-C. From ALGOL also sprang Simula, and through Simula, Smalltalk. And Pascal, and from there, Modula and Delphi. It was only used for a few years but it spawned so much of what developers use to build software today.  In fact, other languages evolved as anti-ALGOL derivatives, looking at how it did something and deciding to do it totally differently.  And so we owe this crew our thanks. They helped to legitimize a new doctrine, a new career: computer programmer. They inspired. They coded. And in so doing, they helped bring us into the world of functional programming and set structures that allowed the next generation of great thinkers to go even further, directly influencing people like Adele Goldberg and Alan Kay.  And it's okay that the name of this massive contribution is mostly lost to the annals of history. Because ultimately, the impact is not. So think about this - what can we do to help shape the world we live in? Whether it be through raw creation, iteration, standardization, or formalization - we all have a role to play in this world. I look forward to hearing more about yours as it evolves!

Machine Learning Cafe
Can we predict the accuracy of a Neural Network? Yes, with the WeightWatcher tool by Charles Martin, Ph.D. - 002

Machine Learning Cafe

Play Episode Listen Later Dec 7, 2019 56:51 Very Popular


In this episode we talked about deep neural networks and the spectral density of each layer's weights. It turns out you can predict the accuracy (and much more) with the WeightWatcher application. We talk about the 5+1 phases of learning and Heavy-Tailed Self-Regularization. Charles Martin, PhD on LinkedIn: https://www.linkedin.com/in/charlesmartin14/ During the episode we talked about these: VC Theory: https://en.wikipedia.org/wiki/Vapnik%E2%80%93Chervonenkis_theory Why Deep Learning Works? post by Charles Martin, 2015: https://calculatedcontent.com/2015/03/25/why-does-deep-learning-work/ Presentation at Berkeley: Why Deep Learning Works? by Charles Martin, 2016: https://www.youtube.com/watch?v=fHZZgfVgC8U Several papers written by Charles Martin and Michael Mahoney: https://arxiv.org/search/?query=%22Charles+H.+Martin%22&searchtype=author&abstracts=show&order=-announced_date_first&size=50 Newest blog post about WeightWatcher, December 2019: https://calculatedcontent.com/2019/12/03/towards-a-new-theory-of-learning-statistical-mechanics-of-deep-neural-networks/ WeightWatcher on GitHub: https://github.com/CalculatedContent/WeightWatcher Easy installation for Python users: pip install weightwatcher How to reach out: https://calculationconsulting.com/ charles@calculationconsulting.com To access their Slack channel please contact Charles first. ---Copyright Info--- Music is from https://filmmusic.io , intro first part is by Miklos Toth and some free GarageBand loops. :) Intro second part: "Aces High" by Kevin MacLeod, outro "Acid Trumpet" by Kevin MacLeod (https://incompetech.com), License: CC BY (http://creativecommons.org/licenses/by/4.0/)

OCW Scholar: Linear Algebra
Lecture 21: Eigenvalues and Eigenvectors

OCW Scholar: Linear Algebra

Play Episode Listen Later Aug 1, 2018 51:22


OCW Scholar: Linear Algebra
Eigenvalues and Eigenvectors

OCW Scholar: Linear Algebra

Play Episode Listen Later Aug 1, 2018 9:22


A teaching assistant works through a problem on eigenvalues and eigenvectors.

Linear Algebra
Lecture 21: Eigenvalues and Eigenvectors

Linear Algebra

Play Episode Listen Later Mar 3, 2017 51:22


The Best Debate in the Universe
Episode #10 - Taylor Nikolai, Rucka, Free College Tuition, Race Wars, Hot Corvette Mom, Snoop Dog

The Best Debate in the Universe

Play Episode Listen Later Aug 8, 2016 64:03


I'm joined this week by social marketing master, Taylor Nikolai, who thinks I don't pronounce his name correctly. Taylor drops some wisdom in the form of Gamification Theory when it comes to education, so much like how Pokemon Go has gamified exercise, he thinks the same can be done with education as a whole. Look forward to solving for Eigenvalues in the next Dark Souls! I also witnessed the start of a race war on my street. I brought in the clip of a black guy (who we come to learn is half-Mexican) going toe-to-toe with a Mexican delivery man. Who wins? You, the listener, because I recorded the entire thing. Plus Rucka is back and pleased to hear all our new Italian and Armenian callers. Also, here is a quick survey to help out the show: http://survey.libsyn.com/madcastmedia We also discuss whether Snoop Dog should be culpable for encouraging his fans to rush the stage, which caused it to collapse. And as promised, here's that hot Corvette mom. Is it just me? https://rss.madcastmedia.com/bestdebate/10/ Stone fox. Correction: Her knuckle tattoos say: "FEAR - LESS." Badass. The voicemail number is: 1-562-58-I-RULE (1-562-584-7853). madcastmedia.com Sources Washington Post - Why Bernie Sanders' free college plan doesn't make sense - https://www.washingtonpost.com/news/grade-point/wp/2016/04/22/why-bernie-sanderss-free-college-plan-doesnt-make-sense/ Education.com - Waiting for Superman: cost of high school drop out - http://www.education.com/magazine/article/waiting-superman-means-parents/ KRON4 - Mom stuffed her kids in trunk of Corvette - http://kron4.com/2016/07/18/mom-accused-of-putting-kids-ages-3-and-5-in-corvette-trunk/ LA Times - Snoop Dog concert stage collapses - http://www.latimes.com/entertainment/music/la-et-ms-snoop-dog-concert-injuries-20160806-snap-story.html "Mining by Moonlight" and "Music to Delight" by Kevin MacLeod (incompetech.com) Licensed under Creative Commons: By Attribution 3.0 http://creativecommons.org/licenses/by/3.0/

Learn Differential Equations: Up Close with Gilbert Strang and Cleve Moler

Two equations with a constant matrix are stable (solutions approach zero) when the trace is negative and the determinant is positive.
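
The trace/determinant criterion stated above is easy to check numerically; a sketch with one illustrative matrix (the matrix is made up for the example):

```python
import numpy as np

# A 2x2 system x' = A x with negative trace and positive determinant.
A = np.array([[-3.0,  1.0],
              [ 2.0, -2.0]])

trace, det = np.trace(A), np.linalg.det(A)
assert trace < 0 and det > 0

# Under that condition both eigenvalues have negative real part,
# so every solution x(t) = e^(At) x(0) decays to zero.
eigenvalues = np.linalg.eigvals(A)
assert np.all(eigenvalues.real < 0)
```

Here trace = -5 and det = 4, giving eigenvalues -1 and -4: both negative, hence a stable system.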

Learn Differential Equations: Up Close with Gilbert Strang and Cleve Moler
Symmetric Matrices, Real Eigenvalues, Orthogonal Eigenvectors

Learn Differential Equations: Up Close with Gilbert Strang and Cleve Moler

Play Episode Listen Later Apr 12, 2016 15:54


Symmetric matrices have n perpendicular eigenvectors and n real eigenvalues.
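
The claim above (n real eigenvalues, n perpendicular eigenvectors) can be verified for any symmetric matrix; a sketch using a random symmetric example:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.normal(size=(4, 4))
S = (M + M.T) / 2                      # symmetrize: S is real and symmetric

eigvals, eigvecs = np.linalg.eigh(S)   # eigh is specialized for symmetric input

# All n eigenvalues are real (eigh returns them as real numbers), and the
# n eigenvectors are mutually perpendicular: Q^T Q = I.
assert np.allclose(eigvecs.T @ eigvecs, np.eye(4))
assert np.allclose(S @ eigvecs, eigvecs * eigvals)
```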

Learn Differential Equations: Up Close with Gilbert Strang and Cleve Moler

The eigenvectors remain in the same direction when multiplied by the matrix. Subtracting an eigenvalue from the diagonal leaves a singular matrix: determinant zero. An n by n matrix has n eigenvalues.
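
All three facts in that summary can be demonstrated directly; a sketch with one small illustrative matrix:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)

for lam, v in zip(eigvals, eigvecs.T):
    # The eigenvector keeps its direction: A v is just lambda times v.
    assert np.allclose(A @ v, lam * v)
    # Subtracting the eigenvalue from the diagonal leaves a singular matrix.
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)

# An n by n matrix has n eigenvalues (counted with multiplicity).
assert len(eigvals) == 2
```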

Chem 131B Physical Chemistry
Molecular Structure & Statistical Mechanics 131B. Lecture 18. Eigenstates & Eigenvalues

Chem 131B Physical Chemistry

Play Episode Listen Later Dec 15, 2015 48:48


微分方程 Differential Equations
3.2 Straight-Line Solutions & 3.3 Phase Portraits for Linear Systems with Real Eigenvalues

微分方程 Differential Equations

Play Episode Listen Later Oct 26, 2015 54:05


Computational Science and Engineering I
Lecture 06: Eigenvalues (part 2); positive definite (part 1)

Computational Science and Engineering I

Play Episode Listen Later Jul 15, 2015 50:18


Periodic and Ergodic Spectral Problems
On fluctuations of eigenvalues of random band matrices

Periodic and Ergodic Spectral Problems

Play Episode Listen Later Jun 30, 2015 52:09


Shcherbina, M (Institute for Low Temperatures, Kharkov) Monday 22 June 2015, 15:00-16:00

Differential Equations, Spring 2006
Lecture 26: Continuation: repeated real eigenvalues, complex eigenvalues

Differential Equations, Spring 2006

Play Episode Listen Later Jun 29, 2015 46:37


Differential Equations, Spring 2006
Lecture 25: Homogeneous linear systems with constant coefficients: solution via matrix eigenvalues (real and distinct case)

Differential Equations, Spring 2006

Play Episode Listen Later Jun 29, 2015 49:07


Discrete Stochastic Processes
Lecture 8: Markov Eigenvalues and Eigenvectors

Discrete Stochastic Processes

Play Episode Listen Later Jun 22, 2015 83:37


This lecture covers eigenvalues and eigenvectors of the transition matrix and the steady-state vector of Markov chains. It also includes an analysis of a 2-state Markov chain and a discussion of the Jordan form.
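
For a 2-state chain like the one analyzed in the lecture, the steady-state vector is a left eigenvector of the transition matrix with eigenvalue 1. A sketch with an illustrative transition matrix (the probabilities are made up):

```python
import numpy as np

# Row-stochastic transition matrix of a 2-state Markov chain:
# P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# The steady-state vector pi satisfies pi P = pi, i.e. it is a left
# eigenvector of P with eigenvalue 1 (a right eigenvector of P^T).
eigvals, left_vecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(left_vecs[:, idx])
pi /= pi.sum()                        # normalize to a probability vector

assert np.allclose(pi @ P, pi)
```

For these probabilities the chain spends 80% of its time in state 0 and 20% in state 1 in the long run.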

Periodic and Ergodic Spectral Problems
How to Place an Obstacle so as to Optimize the Dirichlet Eigenvalues in R2.

Periodic and Ergodic Spectral Problems

Play Episode Listen Later May 28, 2015 32:48


Kiwan, R (A. U. Dubaï) Wednesday 13 May 2015, 15:15-16:00

Periodic and Ergodic Spectral Problems
Eigenvalues of the Schroedinger operator on infinite combinatorial and quantum graphs

Periodic and Ergodic Spectral Problems

Play Episode Listen Later May 12, 2015 56:42


Rozenblum, G (Chalmers University of Technology) Tuesday 28 April 2015, 15:00-16:00

Periodic and Ergodic Spectral Problems
Negative eigenvalues of two-dimensional Schroedinger operators

Periodic and Ergodic Spectral Problems

Play Episode Listen Later Mar 5, 2015 65:00


Shargorodsky, E (King's College London) Tuesday 03 March 2015, 14:00-15:00

Periodic and Ergodic Spectral Problems
On the minimax principle for eigenvalues of Dirac operator with Coulombic singularities

Periodic and Ergodic Spectral Problems

Play Episode Listen Later Feb 26, 2015 57:44


Morozov, S (Ludwig-Maximilians-Universität München) Wednesday 25 February 2015, 14:00-15:00

Periodic and Ergodic Spectral Problems
Accumulation of complex eigenvalues for a class of indefinite Sturm-Liouville operators

Periodic and Ergodic Spectral Problems

Play Episode Listen Later Feb 26, 2015 56:12


Seri, M (University College London) Tuesday 24 February 2015, 14:00-15:00

MathsCasts
Eigenvalues of a 3x3 matrix (MathsCasts)

MathsCasts

Play Episode Listen Later Mar 22, 2013 7:42


Eigenvalues of a 3 by 3 matrix are calculated. Some useful advice is given concerning factorization of the characteristic equation.
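The same calculation (characteristic polynomial, then its roots) can be sketched in NumPy; the matrix is an assumed example, chosen triangular so the answer is visible on the diagonal:

```python
import numpy as np

A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])  # triangular, so eigenvalues sit on the diagonal

# Coefficients of the characteristic polynomial det(A - lambda*I).
coeffs = np.poly(A)
roots = np.roots(coeffs)         # its roots are the eigenvalues

print(sorted(roots.real))        # approximately [2.0, 3.0, 6.0]
```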

MathsCasts
Eigenvalues of a 2x2 matrix (MathsCasts)

MathsCasts

Play Episode Listen Later Mar 21, 2013 7:46


We show how to find eigenvalues for an NxN matrix, then work through a 2x2 example. This MathsCast follows from an introductory MathsCast on eigenvalues and eigenvectors, and it is also helpful to view the presentations on linear dependence and independence first.
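For the 2x2 case, the characteristic equation reduces to a quadratic in the trace and determinant. A short sketch (my own example matrix, not the one from the episode):

```python
import numpy as np

# For a 2x2 matrix the characteristic equation is
#   lambda^2 - trace(A)*lambda + det(A) = 0,
# so the eigenvalues come straight from the quadratic formula.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

tr, det = np.trace(A), np.linalg.det(A)
disc = np.sqrt(tr**2 - 4 * det)
lam1, lam2 = (tr + disc) / 2, (tr - disc) / 2

print(lam1, lam2)   # 3.0 and -1.0
```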

MathsCasts
Eigenvalues and eigenvectors (MathsCasts)

MathsCasts

Play Episode Listen Later Jul 20, 2012 10:34


Explains graphically what it means to multiply a matrix by a vector, then introduces eigenvectors as vectors belonging to the matrix with a special property under matrix-vector multiplication. Eigenvalues are also explained in this context.
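The geometric point can be made numerically as well: an eigenvector is only stretched, while a generic vector is also turned. An illustrative sketch (matrix and vectors are my own assumptions):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

v = np.array([1.0, 0.0])   # an eigenvector of A (eigenvalue 3)
w = np.array([1.0, 1.0])   # not an eigenvector

# Multiplying by A stretches the eigenvector but does not turn it...
print(A @ v)               # [3. 0.] -- same direction, scaled by 3
# ...while a generic vector changes direction as well as length.
print(A @ w)               # [4. 2.] -- no longer parallel to w
```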

MATH 222: Differential Equations
7_II_c_Complex Eigenvalues to Solve Systems of ODE Examples II

MATH 222: Differential Equations

Play Episode Listen Later Oct 25, 2011 14:52


MATH 222: Differential Equations
7_II_b_Repeated Eigenvalues to Solve Systems of ODE Examples I

MATH 222: Differential Equations

Play Episode Listen Later Oct 21, 2011 13:46


MATH 222: Differential Equations
7_II_a_Eigenvalues and Eigenvectors to solve Systems of ODE Examples III

MATH 222: Differential Equations

Play Episode Listen Later Oct 20, 2011 12:21


MATH 222: Differential Equations
7_I_c_Repeated Eigenvalues and Eigenvectors of a Matrix Examples I

MATH 222: Differential Equations

Play Episode Listen Later Oct 18, 2011 7:54


MATH 222: Differential Equations
7_I_b_Complex Eigenvalues and Eigenvectors of a Matrix Examples II

MATH 222: Differential Equations

Play Episode Listen Later Oct 15, 2011 17:59


Inverse Problems
Transmission Eigenvalues in Inverse Scattering Theory

Inverse Problems

Play Episode Listen Later Aug 8, 2011 39:21


Cakoni, F (University of Delaware) Wednesday 03 August 2011, 16:00-16:45

Inverse Problems
Transmission Eigenvalues for a Spherically Stratified Medium

Inverse Problems

Play Episode Listen Later Aug 4, 2011 47:21


Colton, D (University of Delaware) Wednesday 03 August 2011, 14:00-14:45

Inverse Problems
Transmission Eigenvalues and Upper Triangular Compactness

Inverse Problems

Play Episode Listen Later Aug 4, 2011 44:17


Sylvester, J (University of Washington) Tuesday 02 August 2011, 11:15-12:00

Modelling with systems of differential equations - for iBooks
Modelling with systems of differential equations

Modelling with systems of differential equations - for iBooks

Play Episode Listen Later Jul 31, 2011


This unit is intended to further develop your understanding of Newtonian mechanics in relation to oscillating systems. In addition to a basic grounding in solving systems of differential equations, this unit assumes that you have some understanding of eigenvalues and eigenvectors. This study unit is just one of many that can be found on LearningSpace, part of OpenLearn, a collection of open educational resources from The Open University. Published in ePub 2.0.1 format; some features such as audio, video and linked PDFs are not supported by all ePub readers.
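The eigenvalue method for oscillating systems mentioned here can be sketched briefly: rewriting x'' = -4x as a first-order system and diagonalizing gives the solution in terms of e^(lambda*t). The matrix and initial condition below are my own illustrative choices:

```python
import numpy as np

# Solve x'(t) = A x(t) by diagonalization: if A = V D V^{-1}, then
# x(t) = V exp(D t) V^{-1} x(0), with exp(D t) diagonal in e^{lambda_i t}.
A = np.array([[0.0, 1.0],
              [-4.0, 0.0]])        # undamped oscillator x'' = -4x as a system

x0 = np.array([1.0, 0.0])          # start at x = 1 with zero velocity
eigenvalues, V = np.linalg.eig(A)  # here +/- 2i: pure oscillation

def x(t):
    return np.real(V @ np.diag(np.exp(eigenvalues * t)) @ np.linalg.solve(V, x0))

# The position component should follow cos(2t).
t = 0.7
print(x(t)[0], np.cos(2 * t))      # the two agree
```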

Linear Algebra (Math 6435)
Eigenvalues; Diagonalization

Linear Algebra (Math 6435)

Play Episode Listen Later Jan 24, 2011 62:30


Linear Algebra (Math 6435)
Gram Schmidt 2; Eigenvalues 1

Linear Algebra (Math 6435)

Play Episode Listen Later Jan 21, 2011 64:26


Linear Algebra
Linear Algebra: Eigenvalues of a 3x3 matrix

Linear Algebra

Play Episode Listen Later Aug 21, 2010 14:07


Linear Algebra
Linear Algebra: Introduction to Eigenvalues and Eigenvectors

Linear Algebra

Play Episode Listen Later Aug 21, 2010 7:42


Linear Algebra
Linear Algebra: Proof of formula for determining Eigenvalues

Linear Algebra

Play Episode Listen Later Aug 21, 2010 9:18


Linear Algebra
Linear Algebra: Example solving for the eigenvalues of a 2x2 matrix

Linear Algebra

Play Episode Listen Later Aug 21, 2010 5:38


MATH 222: Differential Equations
7_II_a_Eigenvalues and Eigenvectors to Solve Systems of ODE Examples II

MATH 222: Differential Equations

Play Episode Listen Later Jun 5, 2010 16:59


MATH 222: Differential Equations
7_I_b_Complex Eigenvalues and Eigenvectors of a Matrix Examples I

MATH 222: Differential Equations

Play Episode Listen Later May 31, 2010 20:26


MATH 222: Differential Equations
7_II_a_Eigenvalues and Eigenvectors to solve Systems of ODE Examples I

MATH 222: Differential Equations

Play Episode Listen Later May 27, 2010 11:27


MATH 222: Differential Equations
7_I_a_Eigenvalues and Eigenvectors of a Matrix Examples II

MATH 222: Differential Equations

Play Episode Listen Later May 25, 2010 21:37


MATH 222: Differential Equations
7_I_a_Eigenvalues and Eigenvectors of a Matrix Examples I

MATH 222: Differential Equations

Play Episode Listen Later May 8, 2010 12:29


Mathematics and Physics of Anderson Localization: 50 Years After
Non-Hermitian Anderson model on a strip: properties of eigenvalues and eigenfunctions

Mathematics and Physics of Anderson Localization: 50 Years After

Play Episode Listen Later Sep 20, 2008 59:52


Goldsheid, I (QMUL) Tuesday 19 August 2008, 15:30-16:30 Anderson Localization and Related Phenomena

Mathematics and Physics of Anderson Localization: 50 Years After
Poisson statistics for eigenvalues of continuum random schrodinger operators

Mathematics and Physics of Anderson Localization: 50 Years After

Play Episode Listen Later Sep 20, 2008 49:37


Klein, A (UC, Irvine) Tuesday 19 August 2008, 09:00-10:00 Anderson Localization and Related Phenomena

Mathematics and Physics of Anderson Localization: 50 Years After
Statistics of Eigenvalues for 1d random systems

Mathematics and Physics of Anderson Localization: 50 Years After

Play Episode Listen Later Aug 21, 2008 64:00


Molchanov, S (North Carolina) Thursday 31 July 2008, 14:00-15:00

Analysis on Graphs and its Applications
Localized shelf waves on a curved coast - existence of eigenvalues of a linear operator pencil in a curved waveguide

Analysis on Graphs and its Applications

Play Episode Listen Later May 20, 2008 49:45


Parnovski, L (University College London) Friday 13 April 2007, 14:00-15:00 Graph Models of Mesoscopic Systems, Wave-Guides and Nano-Structures

Analysis on Graphs and its Applications
Dirichlet eigenvalues in a narrow strip

Analysis on Graphs and its Applications

Play Episode Listen Later May 19, 2008 61:44


Solomyak, M (Weizmann Institute of Science) Thursday 12 April 2007, 15:30-16:30 Graph Models of Mesoscopic Systems, Wave-Guides and Nano-Structures

Analysis on Graphs and its Applications
Finding eigenvalues and resonances of the Laplacian on domains with regular ends

Analysis on Graphs and its Applications

Play Episode Listen Later Apr 17, 2008 58:03


Levitin, M (Heriot-Watt) Thursday 12 April 2007, 11:30-12:30 Graph Models of Mesoscopic Systems, Wave-Guides and Nano-Structures