Podcasts about 360m

  • 77 PODCASTS
  • 96 EPISODES
  • 33m AVG DURATION
  • 1 MONTHLY NEW EPISODE
  • Jun 2, 2025 LATEST

POPULARITY

[Popularity chart, 2017 to 2024]


Best podcasts about 360m

Latest podcast episodes about 360m

Naughty But Nice with Rob Shuter
TAYLOR SWIFT CELEBRATES $360M MASTERS WIN, MILEY CYRUS REVEALS FAMILY TRAUMA, REAL HOUSEWIFE TORCHES ANDY COHEN IN BRAVO WAR

Naughty But Nice with Rob Shuter

Jun 2, 2025 · 21:06 · Transcription available


Taylor Swift toasted her $360M masters win with a private NYC dinner alongside Selena Gomez, while Miley Cyrus opened up about years of hidden family pain. Meanwhile, Leah McSweeney's legal battle against Andy Cohen escalates. Rob is joined by his dear pal Garrett Vogel from Elvis Duran and the Morning Show with all the scoop. Don't forget to vote in today's poll on Twitter at @naughtynicerob or in our Facebook group. See omnystudio.com/listener for privacy information.

Behind Her Empire
SmartSweets Founder: From Working at McDonald's to Dropping Out of College & Selling Her Company for $360M in 4 Years – Tara Bosch

Behind Her Empire

Mar 10, 2025 · 62:39


Tara Bosch is the founder of SmartSweets, the brand that redefined the candy aisle with its low-sugar treats.

Tara's journey wasn't always sweet. At 21, she dropped out of college and moved into her grandmother's basement with a bold vision: to create a global brand that would revolutionize candy. Armed with a gummy bear mold from Amazon and zero food science experience, she spent months perfecting a sugar-free recipe. But turning an idea into reality wasn't easy—she faced failed ventures, skeptical investors, and major manufacturing roadblocks. Determined to make it work, she bet everything, taking out personal loans to launch SmartSweets in 2016. Just four years later, she scaled it into a $100+ million business before selling a majority stake for $360 million.

In this week's episode, Tara shares how her challenging family background fueled her drive for independence and shaped her understanding of money. She opens up about how a personal connection to candy sparked the vision for SmartSweets and the scrappy tactics she used to get it off the ground, like leveraging debt, securing a loan with her car, and finding a manufacturer against the odds.

Tara takes us behind the scenes of the unexpected challenges of scaling fast and explains why resilience matters more than having all the answers. She reflects on how mindset was key to overcoming early failures and the serendipitous moment that got her in front of Whole Foods, a game-changing opportunity. We dive into the emotional rollercoaster of entrepreneurship, including the self-doubt, the tough decisions, and the reality of burnout. She also shares how she knew it was time to sell, the valuable lessons she's taking into her next chapter, and so much more.

In this episode, we'll talk to Tara about:
* Challenging upbringing & her drive for independence. [03:05]
* First business in college and the biggest lessons learned. [08:00]
* Tara's vision for SmartSweets. [14:32]
* Dropping out of college and starting SmartSweets. [18:51]
* Overcoming self-doubt in her entrepreneurial journey. [21:01]
* Scrappiness in the early stages of building SmartSweets. [23:11]
* Mindset plays a significant role in entrepreneurship. [29:20]
* Finding a manufacturer in the candy industry. [31:04]
* Turning to debt financing to launch the product. [35:38]
* Building brand awareness in the early stage. [38:44]
* The journey of going into retail. [40:17]
* Scaling and manufacturing challenges. [45:46]
* Deciding to sell and transitioning leadership. [55:31]
* Operating in fight or flight mode isn't sustainable. [59:21]

This episode is brought to you by Beeya:
* If you or anyone you know have been struggling with hormonal imbalances and bad periods, go to https://beeyawellness.com/free to download the free guide to tackling hormonal imbalances and to learn more about Beeya's seed cycling bundle.
* Plus, get $10 off your order by using promo code BEHINDHEREMPIRE10.

Follow Yasmin:
* Instagram: https://www.instagram.com/yasminknouri
* Website: https://www.behindherempire.com

Follow Tara:
* Instagram: https://www.instagram.com/tarabosch
* Instagram: https://www.instagram.com/smartsweets
* Website: https://smartsweets.com

Hosted on Acast. See acast.com/privacy for more information.

rose bros podcast
#216: Jamie Murray (Murray Wealth Group) - Out of Favor Stocks, Protecting Capital, Energy Tariffs & Market Opportunities in 2025

rose bros podcast

Mar 6, 2025 · 41:56


Greetings, and welcome back to the podcast.

This episode we are joined by Mr. Jamie Murray, portfolio manager at the Murray Wealth Group, a Toronto-based investment firm with ~$360M under management. Mr. Murray has approximately 20 years of experience in the investment sector, with previous roles at Investors Group, Desjardins Capital Markets & Dundee Capital Markets. Mr. Murray graduated from The Richard Ivey School of Business, and has also received his Chartered Financial Analyst designation in 2011.

Among other things, we discussed Out of Favor Stocks, Protecting Capital, Energy Tariffs & Market Opportunities in 2025.

Enjoy.

Thank you to our sponsors. Without their support this episode would not be possible:
* Connate Water Solutions
* ATB Capital Markets
* Energy United
* 360 Engineering & Environmental Consulting
* EVA Software

Support the show

Everyday Business with Aidan Donnelly
44: Bronwyn Brophy, CEO of the Vitrolife Group

Everyday Business with Aidan Donnelly

Jan 23, 2025 · 60:39


Bronwyn Brophy has over 25 years' experience in the MedTech industry, having worked for Thermo Fisher Scientific, Medtronic, Covidien, and Johnson & Johnson in a number of senior roles around the globe, covering businesses including general surgery, women's health, gastroenterology, cardiology and oncology. She is now the CEO of the Vitrolife Group, a world-leading provider of medical devices and genetic testing services for the reproductive health industry. The company is headquartered in Gothenburg, Sweden, is listed on the NASDAQ Stockholm, and has over 1,100 employees and revenue of circa $360M.

Dedicated to the reproductive-health market since 1994, Vitrolife has grown by focusing on product development, groundbreaking research, consistent quality control and acquisitions of innovative companies in the industry. Based on science and advanced research capabilities, its aim is to deliver products and services for the entire reproductive-health journey, providing consistent performance and guaranteed quality, always with sustainability in mind.

Bronwyn has a strong track record in strategy development, accelerating profitable growth, leading M&A and driving digital transformations. She has led several large-scale R&D programs, building out next-generation platforms, has worked in multiple regions globally, and has experience driving geographic expansion in the U.S. and China.

This is the 44th episode in the Davy podcast series 'Everyday Business with Aidan Donnelly', with guest Bronwyn Brophy, CEO of the Vitrolife Group.

La Estrategia del Día Argentina
Government pays US$4,360M to bondholders, Milei's first privatization, and what will happen with the crawling peg

La Estrategia del Día Argentina

Jan 9, 2025 · 8:30


In chapter 767, this Thursday, January 9, @franaldaya covers the dollar-denominated debt maturities due today, the sale of IMPSA to the private sector, and market expectations for the devaluation of the peso. Plus, @espinamariano with the most important political news in #RecintosDelPoder.

Tip the Scales
112. Kevin Biniazan - Partner at 32, a $360M Verdict, and Nice Guy Offenders

Tip the Scales

Dec 25, 2024 · 66:03


On this week's episode, Maria talks with trial attorney and partner at Breit Biniazan, Kevin Biniazan. They discuss being a partner at 32, his latest case defending victims of sexual abuse, how to take jurors on an emotional journey, being aware of nice guy offenders, and the best books for optimizing your legal operations. Get in touch with Kevin at https://www.bbtrial.com

Guest Kevin Biniazan (@kevinbiniazan on Instagram) is a trial attorney and partner at Breit Biniazan. In his first five years of practice, he has tried over a dozen jury trials and recovered over $100 million in verdicts and settlements for his clients. Currently, Kevin is representing 47 individuals who have alleged sexual abuse and battery over a period ranging from 2008 to 2020.

Host Maria Monroy (@marialawrank on Instagram) is the Co-founder and President of LawRank, a leading SEO company for law firms since 2013. She has a knack for breaking down complex topics to make them more easily accessible and started Tip the Scales to share her knowledge with listeners like you.

LawRank grows your law firm with SEO. Our clients saw a 384% increase in first-time calls and a 603% growth in traffic in 12 months. Get your free competitor report at https://lawrank.com/report.

Subscribe to us on your favorite podcast app. Rate us 5 stars on iTunes and Spotify. Watch us on YouTube. Follow us on Instagram and TikTok.

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0
2024 in Post-Transformers Architectures (State Space Models, RWKV) [LS Live @ NeurIPS]

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Dec 24, 2024 · 43:02


Happy holidays! We'll be sharing snippets from Latent Space LIVE! through the break, bringing you the best of 2024! We want to express our deepest appreciation to event sponsors AWS, Daylight Computer, Thoth.ai, StrongCompute, Notable Capital, and most of all, all our LS supporters who helped fund the gorgeous venue and A/V production!

For NeurIPS last year we did our standard conference podcast coverage, interviewing selected papers (as we have now also done for ICLR and ICML); however, we felt that we could be doing more to help AI Engineers 1) get more industry-relevant content, and 2) recap the 2024 year in review from experts. As a result, we organized the first Latent Space LIVE!, our first in-person miniconference, at NeurIPS 2024 in Vancouver.

Of perennial interest, particularly at academic conferences, is scaled-up architecture research, as people hunt for the next Attention Is All You Need. We have many names for them: "efficient models", "retentive networks", "subquadratic attention" or "linear attention", but some of them don't even have any lineage with attention. One of the best papers of this NeurIPS was Sepp Hochreiter's xLSTM, which has a particularly poetic significance as one of the creators of the LSTM returning to update and challenge the OG language model architecture. So, for lack of a better term, we decided to call this segment "the State of Post-Transformers", and fortunately everyone rolled with it.

We are fortunate to have two powerful friends of the pod to give us an update here:

* Together AI: with CEO Vipul Ved Prakash and CTO Ce Zhang joining us to talk about how they are building Together together as a quote-unquote full-stack AI startup, from the lowest-level kernel and systems programming to the highest-level mathematical abstractions driving new model architectures and inference algorithms, with notable industry contributions from RedPajama v2, Flash Attention 3, Mamba 2, Mixture of Agents, BASED, Sequoia, Evo, Dragonfly, Dan Fu's ThunderKittens and many more research projects this year
* Recursal AI: with CEO Eugene Cheah, who has helped lead the independent RWKV project while also running Featherless AI. This year, the team has shipped RWKV v5, codenamed Eagle, to 1.5 billion Windows 10 and Windows 11 machines worldwide, to support Microsoft's on-device, energy-usage-sensitive Windows Copilot usecases, and has launched the first updates on RWKV v6, codenamed Finch and GoldFinch. On the morning of Latent Space Live, they also announced QRWKV6, a Qwen 32B model modified with RWKV linear attention layers.

We were looking to host a debate between our speakers, but given that both of them were working on post-transformers alternatives

Full Talk on Youtube

Please like and subscribe!

Links

All the models and papers they picked:

* Earlier Cited Work
* Transformers are RNNs: Fast Autoregressive Transformers with Linear Attention
* Hungry hungry hippos: Towards language modeling with state space models
* Hyena hierarchy: Towards larger convolutional language models
* Mamba: Linear-Time Sequence Modeling with Selective State Spaces
* S4: Efficiently Modeling Long Sequences with Structured State Spaces
* Just Read Twice (Arora et al)
* Recurrent large language models that compete with Transformers in language modeling perplexity are emerging at a rapid rate (e.g., Mamba, RWKV). Excitingly, these architectures use a constant amount of memory during inference. However, due to the limited memory, recurrent LMs cannot recall and use all the information in long contexts, leading to brittle in-context learning (ICL) quality. A key challenge for efficient LMs is selecting what information to store versus discard. In this work, we observe the order in which information is shown to the LM impacts the selection difficulty.
* To formalize this, we show that the hardness of information recall reduces to the hardness of a problem called set disjointness (SD), a quintessential problem in communication complexity that requires a streaming algorithm (e.g., recurrent model) to decide whether inputted sets are disjoint. We empirically and theoretically show that the recurrent memory required to solve SD changes with set order, i.e., whether the smaller set appears first in-context.
* Our analysis suggests, to mitigate the reliance on data order, we can put information in the right order in-context or process prompts non-causally. Towards that end, we propose: (1) JRT-Prompt, where context gets repeated multiple times in the prompt, effectively showing the model all data orders. This gives 11.0±1.3 points of improvement, averaged across 16 recurrent LMs and the 6 ICL tasks, with 11.9× higher throughput than FlashAttention-2 for generation prefill (length 32k, batch size 16, NVidia H100). We then propose (2) JRT-RNN, which uses non-causal prefix-linear-attention to process prompts and provides 99% of Transformer quality at 360M params., 30B tokens and 96% at 1.3B params., 50B tokens on average across the tasks, with 19.2× higher throughput for prefill than FA2.
* Jamba: A 52B Hybrid Transformer-Mamba Language Model
* We present Jamba, a new base large language model based on a novel hybrid Transformer-Mamba mixture-of-experts (MoE) architecture.
* Specifically, Jamba interleaves blocks of Transformer and Mamba layers, enjoying the benefits of both model families. MoE is added in some of these layers to increase model capacity while keeping active parameter usage manageable.
* This flexible architecture allows resource- and objective-specific configurations.
In the particular configuration we have implemented, we end up with a powerful model that fits in a single 80GB GPU.
* Built at large scale, Jamba provides high throughput and small memory footprint compared to vanilla Transformers, and at the same time state-of-the-art performance on standard language model benchmarks and long-context evaluations. Remarkably, the model presents strong results for up to 256K tokens context length.
* We study various architectural decisions, such as how to combine Transformer and Mamba layers, and how to mix experts, and show that some of them are crucial in large scale modeling. We also describe several interesting properties of these architectures which the training and evaluation of Jamba have revealed, and plan to release checkpoints from various ablation runs, to encourage further exploration of this novel architecture. We make the weights of our implementation of Jamba publicly available under a permissive license.
* SANA: Efficient High-Resolution Image Synthesis with Linear Diffusion Transformers
* We introduce Sana, a text-to-image framework that can efficiently generate images up to 4096×4096 resolution. Sana can synthesize high-resolution, high-quality images with strong text-image alignment at a remarkably fast speed, deployable on laptop GPU. Core designs include:
* (1) Deep compression autoencoder: unlike traditional AEs, which compress images only 8×, we trained an AE that can compress images 32×, effectively reducing the number of latent tokens.
* (2) Linear DiT: we replace all vanilla attention in DiT with linear attention, which is more efficient at high resolutions without sacrificing quality.
* (3) Decoder-only text encoder: we replaced T5 with modern decoder-only small LLM as the text encoder and designed complex human instruction with in-context learning to enhance the image-text alignment.
* (4) Efficient training and sampling: we propose Flow-DPM-Solver to reduce sampling steps, with efficient caption labeling and selection to accelerate convergence.
* As a result, Sana-0.6B is very competitive with modern giant diffusion models (e.g. Flux-12B), being 20 times smaller and 100+ times faster in measured throughput. Moreover, Sana-0.6B can be deployed on a 16GB laptop GPU, taking less than 1 second to generate a 1024×1024 resolution image. Sana enables content creation at low cost.
* RWKV: Reinventing RNNs for the Transformer Era
* Transformers have revolutionized almost all natural language processing (NLP) tasks but suffer from memory and computational complexity that scales quadratically with sequence length. In contrast, recurrent neural networks (RNNs) exhibit linear scaling in memory and computational requirements but struggle to match the same performance as Transformers due to limitations in parallelization and scalability.
* We propose a novel model architecture, Receptance Weighted Key Value (RWKV), that combines the efficient parallelizable training of transformers with the efficient inference of RNNs.
* Our approach leverages a linear attention mechanism and allows us to formulate the model as either a Transformer or an RNN, thus parallelizing computations during training and maintaining constant computational and memory complexity during inference.
* We scale our models as large as 14 billion parameters, by far the largest dense RNN ever trained, and find RWKV performs on par with similarly sized Transformers, suggesting future work can leverage this architecture to create more efficient models.
This work presents a significant step towards reconciling trade-offs between computational efficiency and model performance in sequence processing tasks.
* LoLCATs: On Low-Rank Linearizing of Large Language Models
* Recent works show we can linearize large language models (LLMs) -- swapping the quadratic attentions of popular Transformer-based LLMs with subquadratic analogs, such as linear attention -- avoiding the expensive pretraining costs. However, linearizing LLMs often significantly degrades model quality, still requires training over billions of tokens, and remains limited to smaller 1.3B to 7B LLMs.
* We thus propose Low-rank Linear Conversion via Attention Transfer (LoLCATs), a simple two-step method that improves LLM linearizing quality with orders of magnitude less memory and compute.
* We base these steps on two findings.
* First, we can replace an LLM's softmax attentions with closely-approximating linear attentions, simply by training the linear attentions to match their softmax counterparts with an output MSE loss ("attention transfer").
* Then, this enables adjusting for approximation errors and recovering LLM quality simply with low-rank adaptation (LoRA).
* LoLCATs significantly improves linearizing quality, training efficiency, and scalability. We significantly reduce the linearizing quality gap and produce state-of-the-art subquadratic LLMs from Llama 3 8B and Mistral 7B v0.1, leading to 20+ points of improvement on 5-shot MMLU.
* Furthermore, LoLCATs does so with only 0.2% of past methods' model parameters and 0.4% of their training tokens.
* Finally, we apply LoLCATs to create the first linearized 70B and 405B LLMs (50x larger than prior work).
* When compared with prior approaches under the same compute budgets, LoLCATs significantly improves linearizing quality, closing the gap between linearized and original Llama 3.1 70B and 405B LLMs by 77.8% and 78.1% on 5-shot MMLU.

Timestamps
* [00:02:27] Intros
* [00:03:16] Why Scale Context Lengths? or work on Efficient Models
* [00:06:07] The Story of SSMs
* [00:09:33] Idea 1: Approximation -> Principled Modeling
* [00:12:14] Idea 3: Selection
* [00:15:07] Just Read Twice
* [00:16:51] Idea 4: Test Time Compute
* [00:17:32] Idea 2: Hardware & Kernel Support
* [00:19:49] RWKV vs SSMs
* [00:24:24] RWKV Arch
* [00:26:15] QRWKV6 launch
* [00:30:00] What's next
* [00:33:21] Hot Takes - does anyone really need long context?

Transcript

[00:00:00] AI Charlie: We're back at Latent Space Live, our first mini conference held at NeurIPS 2024 in Vancouver. This is Charlie, your AI co-host. As a special treat this week, we're recapping the best of 2024, going domain by domain. We sent out a survey to the over 900 of you who told us what you wanted, and then invited the best speakers in the Latent Space Network to cover each field.

[00:00:24] AI Charlie: 200 of you joined us in person throughout the day, with over 2,200 watching live online. Thanks! Our next keynote covers the state of Transformer-alternative architectures, with a special joint presentation with Dan Fu of Together AI and Eugene Cheah of Recursal AI and Featherless AI. We've featured both Together and Recursal on the pod before, with CEO Vipul Ved Prakash introducing them.

[00:00:49] AI Charlie: And CTO Ce Zhang joining us to talk about how they are building Together together as a quote-unquote full-stack AI startup, from the lowest-level kernel and systems [00:01:00] programming to the highest-level mathematical abstractions driving new model architectures and inference algorithms, with notable industry contributions from RedPajama v2, Flash Attention 3, Mamba 2, Mixture of Agents,

[00:01:15] AI Charlie: BASED, Sequoia, Evo, Dragonfly, Dan Fu's ThunderKittens, and many more research projects this year. As for Recursal and Featherless, we were the first podcast to feature RWKV last year, and this year the team has shipped RWKV v5, codenamed Eagle, to 1.5 billion Windows 10 and Windows 11 machines worldwide to support Microsoft's on-device, energy-usage-sensitive Windows Copilot use cases, and has launched the first updates on RWKV v6, codenamed Finch and GoldFinch.

[00:01:53] AI Charlie: On the morning of Latent Space Live, they also announced QRWKV6, a Qwen 32B model [00:02:00] modified with RWKV linear attention layers. Eugene has also written the single most popular guest post on the Latent Space blog this year (yes, we do take guest posts) on what he has discovered about the H100 GPU inference NeoCloud market since the successful launch of Featherless AI this year.

[00:02:20] AI Charlie: As always, don't forget to check the show notes for the YouTube link to their talk as well as their slides. Watch out and take care.

[00:02:27] Intros

[00:02:27] Dan Fu: Yeah, so thanks so much for having us. So this is going to be a little bit of a two-part presentation. My name is Dan. I'm at Together AI, and I'll be joining UCSD as faculty in about a year. And Eugene, you want to introduce yourself?

[00:02:46] Eugene Cheah: Eugene, I lead the RWKV team, and I, I'm CEO of Featherless, and we both work on this new post-transformer architecture space.

[00:02:55] Dan Fu: Yeah, so yeah, so today we're really excited to talk to you a little bit [00:03:00] about that. So first I'm going to give a broad overview of kind of the last few years of progress in non-transformer, post-transformer architectures. And then afterwards Eugene will tell us a little bit about the latest and the greatest and the latest frontier models in this space.

[00:03:16] Why Scale Context Lengths? or work on Efficient Models

[00:03:16] Dan Fu: So, the story starts with scaling. So this is probably a figure or something like this that you've seen very recently.
Over the last five to six years, we've seen models really scale up in parameter size, and that's brought with it a bunch of new capabilities, like the ability to talk to you and tell you sometimes how to use your Colab screens.

[00:03:35] Dan Fu: But another place where we've seen scaling especially recently is scaling in context length. So this can mean having more text inputs for your models, but it can also mean things like taking a lot of visual token inputs, image inputs, to your models or generating lots of outputs. And one thing that's been really exciting over the last few months or so is that we're, we're seeing scaling, not only during training time, but also [00:04:00] during test time.

[00:04:00] Dan Fu: So this is one of the, the, this is the iconic image from the OpenAI o1 release. Not only are we starting to scale train time compute, but we're also starting to scale test time compute. Now if you're familiar with our attention and our transformer architectures today, this graph on the right might look a little bit scary.

[00:04:19] Dan Fu: And one of the reasons is that the implications are a little bit interesting. So what does it mean if we want to continue having smarter and smarter models? Do we just need to start building bigger, bigger data centers, spending more flops? Is this this little DALL-E 3, we need more flops, guys? Is this going to be the future of all of AI?

[00:04:39] Dan Fu: Or is there a better way, another path forward? Maybe we can get the same capabilities that we've gotten used to, but for a lot less compute, a lot less flops. And one of the things that we're going to talk about today is specifically looking at that core attention operator in some of these models.

[00:04:57] Dan Fu: And the reason is that, so these are just some, some [00:05:00] basic, you know, scaling curves, but attention has compute that scales quadratically in the context length. So that means that if you're doing something like test time compute and you want to spend a bunch of tokens thinking about what comes next, the longer that that goes, the, the, the more tokens you spend on that, that compute grows quadratically in that.

[00:05:19] Dan Fu: One of the questions that we're interested in is, can we take that basic sequence model, that basic sequence primitive at the bottom, and get it to scale better? Can we scale in, let's say, n to the 3 halves or n log n? So in, in the first part of the talk, so we just went over the introduction. What I'm gonna do over the next few slides is just talk about some of the key advances and ideas that have shown over the past few years, since maybe early 2020 to, to now, that shown promise that this might actually be possible.

[00:05:48] Dan Fu: That you can actually get potentially the same quality that we want while scale, while scaling better. So to do that, we're and, and basically the, the story that we're gonna look is we're gonna start to see [00:06:00] how. So this is a basic graph of just the past couple years of progress of perplexity, where that blue line, that dotted blue line, is attention.

[00:06:07] The Story of SSMs

[00:06:07] Dan Fu: It's your basic transformer, full dense attention. And then the dots coming down are some of the methods that you'll see in this presentation today. We're going to turn the clock back all the way to 2020. So this, this, this question of can we make attention subquadratic? Basically, as soon as we said attention is all you need, people started asking this question.

[00:06:28] Dan Fu: So we have this quadratic attention operator. Can we do better? I'll briefly talk about why attention is quadratic. And the basic thing that happens, if you're not familiar, is that you have these inputs, these keys and queries.
And what you do in this attention matrix, this S matrix over here, is that you're using, you're comparing every token in your input to every other token.

[00:06:49] Dan Fu: So when I try to do something like upload a whole book to Gemini, what happens, maybe not Gemini, because we don't necessarily know what architecture it is, but let's say we upload it to LLAMA, what happens beyond [00:07:00] the scenes, behind the scenes, is that it's going to take every single word in that book and compare it to every other word.

[00:07:05] Dan Fu: And this has been a really, it's, it's led to some pretty impressive things. But it's kind of a brute forcing of the way that you would try to interpret something. And what attention does in particular is the, and then what attention, sorry, don't want to. Okay, no, no laser pointer. What, what attention does afterwards is that instead of always operating in this quadratic thing, it takes a row-wise softmax over this matrix, and then multiplies it by this values matrix.

[00:07:32] Dan Fu: So, one of the key points to notice is that the output size is always going to be the same as the inputs, at least in standard self attention. So one of the first things that folks tried to do around 2020 is this thing called linear attention, which is just, just noticing that if we take out this softmax from here, if we take out this non-linearity in the middle of the attention operation, and then if you compute the keys and the values operation first, you actually never hit this quadratic bottleneck.

[00:07:57] Dan Fu: So that, that's potentially a way [00:08:00] to get a lot more computationally efficient. And there are various ways to do this by basically using feature maps or trying to approximate this overall attention computation. But some of this work sort of started to hit a wall in 2020. And the basic challenges were, were two.

[00:08:16] Dan Fu: So one was quality.
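The reordering Dan describes, dropping the softmax and computing the keys-times-values product first, can be sketched in a few lines of NumPy. This is an illustrative sketch, not code from the talk; the elementwise `exp` feature map and the normalizer are simplified stand-ins for the feature maps real linear-attention papers use:

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes the (n x n) S matrix,
    # comparing every token to every other token -> O(n^2) time and memory.
    S = Q @ K.T / np.sqrt(Q.shape[-1])
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def linear_attention(Q, K, V, feature_map=np.exp):
    # Drop the softmax, apply a (positive) feature map to Q and K instead,
    # and associate the matmuls the other way: K^T V is only (d x d),
    # so the (n x n) matrix is never formed -> O(n * d^2).
    Qf, Kf = feature_map(Q), feature_map(K)
    KV = Kf.T @ V                               # (d, d) summary of keys/values
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T    # per-row normalizer, (n, 1)
    return (Qf @ KV) / Z

n, d = 512, 16
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, n, d))
out = linear_attention(Q, K, V)
assert out.shape == (n, d)  # same output shape as softmax attention
```

With n = 512 and d = 16 the quadratic path builds a 512×512 score matrix, while the linear path only ever forms d×d and n×d products; that gap is the whole point once n grows into the hundreds of thousands. Note the two functions are not numerically equivalent, which is exactly the quality gap discussed next.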
It was back then, it was kind of hard to, to get good quality with these linear attention operators. The other one was actually hardware efficiency. So this feature map that was just shown up here actually ends up being quite computationally expensive if you just implement it naively.

[00:08:34] Dan Fu: So you started having these operators where not only are you not really sure if they have the same quality, but also they're actually just wall-clock slower. So you kind of end up getting the worst of both worlds. So this was the, the stage. So that kind of sets the stage for four years ago.

[00:08:49] Dan Fu: Keep this in mind because linear attention is actually going to come back in a few years once we have a better understanding. But one of the works that started kicking off this, this [00:09:00] mini revolution in post-transformer architectures was this idea called state space models. So here the seminal work is, is the S4 work in 2022. And this, this piece of work really brought together a few ideas from, from some long-running research lines of work. The first one was, and this is really one of the keys to, to closing the gap in quality, was just using things that, that if you talk to a, a, an electrical engineer off the street, they might know off, off the, like the back of their hand.

[00:09:33] Idea 1: Approximation -> Principled Modeling

[00:09:33] Dan Fu: But taking some of those properties with how we model dynamical systems in signal processing and then using those ideas to model the inputs, the, the text tokens in, for example, a transformer-like Next Token Prediction architecture.
So some of those early state space model papers were looking at a relatively simple recurrent update model that comes from maybe chapter one of a signal processing class,[00:09:59] Dan Fu: but then using [00:10:00] some principled theory about how you should do that recurrent update in order to really get the most that you can out of your hidden state, out of your sequence. So that was one key idea for quality. And when this was eventually realized, you started to see a bunch of benchmarks that had been pretty sticky for a few years, things like Long Range Arena, some long sequence evaluation benchmarks, stuff in time series analysis. You started to see the quality tick up in meaningful ways. But the other key thing that's so influential about these state space models is that they also had a key idea about how you can compute these things efficiently.[00:10:45] Dan Fu: So if you go back to your machine learning 101 class where you learned about RNNs, one thing that you may have learned is that they don't parallelize as well as attention, because if you just run them naively, you have to do this kind of sequential update to process new tokens, [00:11:00] whereas in attention, you can process all the tokens in parallel at one time.[00:11:04] Dan Fu: One of the key insights behind the S4 paper was that these recurrent models could also be formulated as a convolution. And in particular, with a convolution, instead of using a PyTorch conv1d operation, you can compute it with the FFT. And that would give you n log n compute in the sequence length n, with an operator that was relatively well optimized for modern hardware.[00:11:28] Dan Fu: So those are really, I'd say, the two key ideas in 2022 that started allowing these breakthroughs to happen in these non-transformer architectures.
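The recurrence-to-convolution equivalence Dan mentions can be sketched concretely. This is my own simplified illustration, not the S4 paper's exact parameterization: a linear recurrence x_t = A·x_{t-1} + B·u_t, y_t = C·x_t unrolls into a convolution with kernel K = (CB, CAB, CA²B, ...), which can be applied in O(n log n) with the FFT instead of stepping token by token.

```python
import numpy as np

# Sketch of the S4-style computational trick: a linear recurrence can
# be unrolled into a convolution, then computed with the FFT.

rng = np.random.default_rng(0)
n, s = 256, 4                              # sequence length, state size
A = np.diag(rng.uniform(0.1, 0.9, s))      # stable diagonal state matrix
B = rng.standard_normal((s, 1))
C = rng.standard_normal((1, s))
u = rng.standard_normal(n)

# 1) Sequential recurrence, one token at a time.
x = np.zeros((s, 1))
y_recurrent = np.zeros(n)
for t in range(n):
    x = A @ x + B * u[t]
    y_recurrent[t] = (C @ x).item()

# 2) Unroll into a convolution kernel K_t = C A^t B ...
K = np.zeros(n)
M = np.eye(s)
for t in range(n):
    K[t] = (C @ M @ B).item()
    M = A @ M

# ... and apply it with the FFT (zero-padded to avoid circular wraparound).
y_fft = np.fft.irfft(np.fft.rfft(u, 2 * n) * np.fft.rfft(K, 2 * n), 2 * n)[:n]

assert np.allclose(y_recurrent, y_fft)
```

Both paths produce the same outputs, but the FFT path parallelizes over the whole sequence instead of forcing a token-by-token loop.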
So, these ideas about how to model the recurrent updates of a sequence in a principled way, and also these key ideas in how you can compute it efficiently by turning it into a convolution and then scaling it up with the FFT.[00:11:53] Dan Fu: Along those same lines, afterwards we started putting out some work on specialized kernels. So just [00:12:00] like we have FlashAttention for transformers, we also have works like FlashFFTConv. And if you look at these lines of work, whenever you see a new architecture, a new primitive, one of the table stakes now is: do you have an efficient kernel, so that you can actually get wall clock speedup?[00:12:14] Idea 3: Selection[00:12:14] Dan Fu: So by 2022, we were starting to have these models that had promising quality and also promising wall clock speed. So you could actually see regimes where they were better than transformers in meaningful ways. That being said, there was still sometimes a quality gap, particularly for language modeling.[00:12:33] Dan Fu: And because language is so core to what we do in sequence modeling these days, the next key idea that I'm going to talk about is this idea of selection mechanisms. And this is basically the idea that you have this recurrent state that you're keeping around that just summarizes everything that came before.[00:12:50] Dan Fu: And to get a good sequence model, one of the things that you really need to be able to do is have the model learn what's the best way to pick out pieces from that recurrent [00:13:00] state. So one of the major ideas here, in a line of work called H3, Hungry Hungry Hippos, and also these Hyena models, is that one way you can do this is by just adding some simple element-wise gates.[00:13:13] Dan Fu: So versions of these ideas have been around for decades.
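In code, that kind of element-wise gate is tiny. This is my own minimal simplification, not the exact H3 or Hyena formulation: the sequence-mixing layer's output is multiplied element-wise by an input-dependent gate in (0, 1), which lets the model choose what to pass through. The running-mean "mixer" and the gate projection here are stand-ins for illustration.

```python
import numpy as np

# Minimal sketch of element-wise output gating around a sequence mixer.
# The "mixer" below (a causal running mean) is a stand-in for an SSM.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n, d = 32, 8
x = rng.standard_normal((n, d))           # input embeddings
W_gate = rng.standard_normal((d, d))      # hypothetical gate projection

# Stand-in sequence-mixing output (causal running mean over tokens).
mix_out = np.cumsum(x, axis=0) / np.arange(1, n + 1)[:, None]

gate = sigmoid(x @ W_gate)                # per-element values in (0, 1)
y = mix_out * gate                        # element-wise gate on the mixer output

assert y.shape == (n, d)
```

Because the gate depends on the input x, the model can learn, per channel and per token, how much of the mixed hidden state to let through.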
If you squint at the LSTM paper, you can probably find this gating mechanism. But it turns out you can take those old ideas, add them into these new state space models, and then you can see quality start to pick up. If you've heard of the Mamba model, it takes the selection to the next level by actually making some changes in that fundamental recurrent state space update.[00:13:40] Dan Fu: So it's not only this gating that happens around the SSM layer; you can also make the A, B, C, D matrices of your state space model data dependent, which will allow you to even better select out different pieces from your hidden state depending on what you're seeing. I'll also point out, if you look at the [00:14:00] bottom right of this figure, there's this little triangle with GPU SRAM and GPU HBM, and this is just continuing that trend of, when you have a new architecture, you also release it with a kernel to show that it can be hardware efficient on modern hardware.[00:14:17] Dan Fu: One of the next cool things that happened is, once we had this understanding of the basic pieces, the basic principles behind some of these sequence models, linear attention actually started to come back.
So earlier this year, there was a model called BASED, from Simran Arora and some other folks, that combined a more principled version of linear attention. The two-second summary is that it used a Taylor approximation of the softmax attention, combined that with a simple sliding window attention, and was starting to be able to expand the Pareto frontier of how much data you can recall from your sequence versus how small your recurrent state size is.[00:14:58] Dan Fu: So those orange dots [00:15:00] at the top there are just showing smaller state sizes that can recall more memory.[00:15:07] Just Read Twice[00:15:07] Dan Fu: And the last major idea that I think has been influential in this line of work, and is relatively late breaking, just a few months ago, is just the basic idea that when you have these models that are fundamentally more efficient in the sequence length, you maybe don't want to prompt them or use them in exactly the same way.[00:15:26] Dan Fu: So this was a really cool paper called Just Read Twice, also from Simran, that basically said, hey, all these efficient models can process tokens so much more efficiently than transformers that they can sometimes have unfair advantages compared to a simple transformer model.[00:15:44] Dan Fu: So take, for example, the standard use case where you have some long document, you're going to pass it in as input, and then you're going to ask some question about it. One problem you might imagine for a recurrent model, where you have a fixed state size, is, let's say that [00:16:00] your article is very long, and you're trying to ask about some really niche thing.[00:16:04] Dan Fu: You can imagine it might be hard for the model to know ahead of time what information to put into the hidden state.
But these models are so much more efficient that you can do something really stupid, like, you can just write down the document, write down the question, write down the document again, and then write down the question again. And then this time, the second time that you go over that document, you know exactly what to look for.[00:16:25] Dan Fu: And the cool thing about this is that it results in better quality, especially on these recall-intensive tasks. But the other interesting thing is it really takes advantage of the more efficient architectures that we're having here. So one of the other, I think, influential ideas in this line of work is that if you change the fundamental compute capabilities of your model and the way that it scales, you can actually start to query it at test time differently.[00:16:51] Idea 4: Test Time Compute[00:16:51] Dan Fu: And this, of course, goes back to those slides on test time compute. So while everybody's looking at, say, test time compute for big transformer models, [00:17:00] I think a potentially really interesting research question is, how can you take those ideas, and how do they change with this new next generation of models?[00:17:09] Dan Fu: So I'll just briefly summarize what some of those key ideas were, and then show you briefly what the state of the art is today. So the four key ideas are: instead of just doing a simple linear attention approximation, take ideas that we know from other fields like signal processing and do a more principled approach to your modeling of the sequence.[00:17:32] Idea 2: Hardware & Kernel Support[00:17:32] Dan Fu: Another key idea throughout all these lines of work is that you really want hardware and kernel support from day one.
So even if your model is theoretically more efficient, if somebody goes and runs it and it's two times slower, one of the things that we've learned is that if you're in that situation, it's just going to be dead on arrival.[00:17:49] Dan Fu: So you want to be designing your architectures with that in mind. One of the key machine learning ideas that has been important for quality is just making sure that you encode different ways that you can [00:18:00] select from your hidden state, and really focus on that as a key decider of quality. And finally, I think one of the emerging new things for this line of work, and something that's quite interesting, is: what are the right test time paradigms for these models?[00:18:15] Dan Fu: How do they change relative to what you might do for a standard transformer? I'll briefly end this section. So I've labeled this slide "where we are yesterday" because Eugene is going to talk about some new models that he released literally this morning. But as of yesterday, some of the really cool results out of these efficient alternative models were: AI21 trained this hybrid MoE called Jamba.[00:18:40] Dan Fu: That is currently the state of the art for these non-transformer architectures. NVIDIA and MIT put out this new diffusion model called SANA recently, where one of their key observations is that you can take a standard diffusion transformer, replace the layers with linear [00:19:00] attention, and then that lets you scale to much larger images, much larger sequences, more efficiently.[00:19:07] Dan Fu: And one thing that I don't think anybody would have called a few years ago is that one of those gated state space models ended up on the cover of Science, because a great group of folks went and trained some DNA models.
So that's Michael Poli and Eric Nguyen from Stanford and the Arc Institute.[00:19:26] Dan Fu: So we're really at an exciting time in 2024, where these non-transformer, post-transformer architectures are showing promise across a wide range of modalities, of applications, and of tasks. And with that, I'll pass it on to Eugene, who can tell you a little bit about the latest and greatest with RWKV.[00:19:49] RWKV vs SSMs[00:19:49] Eugene Cheah: So, is that useful? Yeah. Oh, I'm talking into here. Okay. So, yeah, two streams. So, I think one common question that we tend to get asked is, what's the difference between [00:20:00] RWKV and state space? So I think one of the key things to really understand about the difference between the two groups is that we are actually more like an open source, random-internet-meets-academia kind of situation.[00:20:11] Eugene Cheah: Like, most of us never wrote any paper, but we basically looked at RNNs and linear attention when Attention Is All You Need came out, and then we decided, like, hey, there is a quadratic scaling problem, why don't we try fixing that instead? So we ended up developing our own branch, but we end up sharing ideas back and forth.[00:20:30] Eugene Cheah: And we do all this actively in Discord, GitHub, etc. This was so bad for a few years that, basically, the average group's h-index was so close to zero that EleutherAI actually came in and helped us write our first paper. Great, now our h-index is three, apparently.
So, but the thing is, a lot of these experiments led to results, and essentially we took the same ideas from linear attention, [00:21:00] and we built on it.[00:21:01] Eugene Cheah: So, to take a step back into how RWKV handles its own attention mechanic and achieves the same goal of, like, O(n) compute, respectively, in focus of our overall goal to make AI accessible to everyone, regardless of language, nation, or compute. That's our goal. We actually train our models primarily on over a hundred languages, which is another topic altogether.[00:21:23] Eugene Cheah: And our goal is to train up to even 200 languages, to cover all the languages in the world. But at the same time, we work on this architecture to lower the compute cost, so that people can run it on Raspberry Pis and on anything. So, how did RWKV break the dependency of the LSTM token flow? Because I think it's probably easier to understand the architecture from the RNN lens, because that's what we built on, whereas state space kind of tried to start anew and took lessons from that, so there's a little bit of divergence there. And this is, a.k.a., our version of linear attention. So to take a step back: [00:22:00] all foundation models, be it transformers or non-transformers, work the same way at a very high level, right?[00:22:05] Eugene Cheah: They pump in tokens, I mean text, turn things into embeddings, go through a lot of layers, generate a lot of states, whether that's a QKV cache, or RNN states, or RWKV states, and output an embedding. Those states are not the same thing across architectures, but the overall flow is. And we just take more layers and more embeddings.
And somehow that magically works.[00:22:23] Eugene Cheah: So, if you remember your ancient RNN lessons, which we call deep learning these days, the general idea is that you have the embedding information flowing all the way up, and then you take that information and you flow it back down, and then you process it as part of your LSTM layers.[00:22:41] Eugene Cheah: So, this is how it generally works. Karpathy is quoted saying that RNNs are actually unreasonably effective. The problem is this is not scalable. To start doing work on the second token, you need to wait for the first token. And likewise for the third token and fourth token, yada yada.[00:22:55] Eugene Cheah: That is CPU land, not GPU land. So you [00:23:00] can have an H100 and you can't even use 1 percent of it. So that's kind of why RNNs didn't really take off in the direction that we wanted, like, billions of parameters, when it comes to training. So, what did RWKV version 0 do? Boom. We just did the dumbest, lamest thing.[00:23:13] Eugene Cheah: Sorry, this is the bottleneck for the RNN. We did the dumb thing of removing that line. And it kind of worked. It trained. It sucked, but it kind of worked. Then no one cared because the loss was crap, but we were like, hey, how do we improve that? And that's essentially where we moved forward, because if you see this kind of flow, you can actually get your GPU saturated quickly, where it essentially cascades respectively.[00:23:41] Eugene Cheah: So I'm just waiting for this to loop again. So it's like, once your first layer's first token is computed, you start to cascade your compute all the way until you're at, hey, I'm using 100 percent of the GPU.
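The cascade Eugene describes can be visualized as a toy scheduling problem. This is my own sketch, not RWKV's actual kernel: if cell (layer, token) only depends on (layer, token-1) and (layer-1, token), then all cells on the same anti-diagonal are independent and can run in parallel, so L layers over T tokens finish in L + T - 1 parallel waves instead of L × T serial steps.

```python
# Toy dependency-scheduling sketch of the wavefront "cascade":
# cell (l, t) can run once (l, t-1) and (l-1, t) are done.

L_layers, T_tokens = 4, 8
done = set()
waves = []
while len(done) < L_layers * T_tokens:
    wave = [
        (l, t)
        for l in range(L_layers)
        for t in range(T_tokens)
        if (l, t) not in done
        and (t == 0 or (l, t - 1) in done)   # same layer, previous token
        and (l == 0 or (l - 1, t) in done)   # previous layer, same token
    ]
    waves.append(wave)
    done.update(wave)

# Each wave is one fully parallel step on the GPU.
assert len(waves) == L_layers + T_tokens - 1   # 11 waves, not 32 serial steps
assert waves[0] == [(0, 0)]
```

Once the pipeline fills, every wave keeps many cells busy at once, which is how the GPU climbs toward full utilization instead of idling on a strict token-by-token loop.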
So we worked on it, and we went along the principle that, as long as we keep this general architecture [00:24:00] where we can cascade and be highly efficient, nothing is sacred in our architecture.[00:24:06] Eugene Cheah: And we have done some crazy ideas. In fact, if you ask me to explain some things in the paper, officially in the paper I'll say we had this idea and we wrote it this way. The reality is someone came with the code, we tested it, it worked, and then we rationalized later. So, the general[00:24:24] RWKV Arch[00:24:24] Eugene Cheah: idea behind RWKV is that we generally have two major blocks: what we call time mix and channel mix.[00:24:30] Eugene Cheah: Time mix generally handles long term memory states, where essentially we apply matrix multiplications and SiLU activation functions to process an input embedding into an output embedding. I'm oversimplifying it, because this calculation has changed every version, and we have, like, version 7 right now.[00:24:50] Eugene Cheah: Channel mix is similar to BASED in the sense that it does shorter term attention, where it just looks at the sibling token, or the token before it, because [00:25:00] there's a shift in the token shift matrix. I don't really want to go too much into the papers themselves, because we do have three papers on this.[00:25:09] Eugene Cheah: Basically: RWKV: Reinventing RNNs for the Transformer Era; Eagle and Finch: RWKV with Matrix-Valued States, which is the updated version 5 and version 6; and GoldFinch, which is our hybrid model, respectively. We are already writing the paper for version 7, RWKV-7, named Goose (our architectures are named after birds).[00:25:30] Eugene Cheah: And I'm going to cover QRWKV as well, and where that led to. Great!
Because we are all GPU poor. And to be clear, most of this research is done on only a handful of H100s, which one Google researcher told me was, like, the experiment budget for a single researcher there.[00:25:48] Eugene Cheah: So our entire organization has less compute than a single researcher at Google. So one of the things that we explored was: how do we convert transformer models instead? Because [00:26:00] someone already paid that billion dollars, a million dollars, on training, so why don't we take advantage of those weights?[00:26:05] Eugene Cheah: And I believe Together AI worked on LoLCATs for the Llama side of things, and we took some ideas from there as well, and we essentially did that for RWKV.[00:26:15] QRWKV6 launch[00:26:15] Eugene Cheah: And that led to QRWKV6, which we just dropped today, a 32B instruct preview model, where we took the Qwen 32B instruct model, froze the feedforward layer, removed the QKV attention layers, and replaced them with RWKV linear layers.
So, in fact, the first run completely confused us. I was telling Daniel Goldstein (Smerky), who kind of leads most of our research coordination: when you pitched me this idea, you told me at best you'll get the same level of performance.[00:27:26] Eugene Cheah: You didn't tell me the ARC Challenge score and the Winogrande score would shoot up. I don't know what's happening there. But it did. The MMLU score dropping, that was expected, because if you think about it, when we were training all the layers, we essentially Frankensteined this thing, and we did brain damage to the feedforward network too, with the new RWKV layers.[00:27:47] Eugene Cheah: But 76 percent, hey, somehow it's retained, and we can probably further train this. We didn't even spend more than 3 days training this, so there's a lot more that can be done, hence the preview. This brings up [00:28:00] a big question, because we are already now in the process of converting the 72B. This is actually an extremely compute efficient way to test our attention mechanic.[00:28:10] Eugene Cheah: It's like, it becomes a shortcut. We are already planning to do our version 7 and our hybrid architecture with it, because we don't need to train from scratch, and we get a really good model out of it. And the other thing that is uncomfortable to say, because of what we are doing right now on the 72B, is that if this scales correctly to 128K context length, and I'm not even talking about a million, just 128K, the majority of enterprise workload today is just on 70B-class models at under 32K context length.[00:28:41] Eugene Cheah: That means if this works and the benchmarks match, it means we can replace the vast majority of current AI workload, unless you want super long context. And then, sorry, can someone give us more GPUs? Because we do need the VRAM for super long context, sadly.
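The two-phase conversion recipe Eugene describes can be sketched in framework-free pseudocode. The names and structure here are mine, not the actual QRWKV training code: phase 1 freezes the pretrained feedforward weights and trains only the new RWKV time-mix layers that replaced attention; phase 2 unfreezes everything and trains jointly (with, per the talk, a custom learning-rate schedule).

```python
# Framework-free sketch of the two-phase attention-replacement recipe.
# All names here are hypothetical illustrations of the described steps.

class Param:
    def __init__(self, name, trainable=True):
        self.name, self.trainable = name, trainable

def build_converted_model(n_layers=2):
    params = []
    for i in range(n_layers):
        params.append(Param(f"layer{i}.rwkv_time_mix"))  # new, replaces QKV attention
        params.append(Param(f"layer{i}.feedforward"))    # kept from the source model
    return params

def set_phase(params, phase):
    for p in params:
        if phase == 1:   # train only the new attention-replacement layers
            p.trainable = "rwkv_time_mix" in p.name
        else:            # phase 2: train everything together
            p.trainable = True

model = build_converted_model()
set_phase(model, 1)
trainable = sorted(p.name for p in model if p.trainable)
assert trainable == ["layer0.rwkv_time_mix", "layer1.rwkv_time_mix"]

set_phase(model, 2)
assert all(p.trainable for p in model)
```

Freezing the feedforward layers first forces the gradient signal into the new time-mix layers, so they learn to mimic the function the removed attention layers served before everything is fine-tuned together.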
So yeah, that's what we are working on, and essentially [00:29:00] we are excited to just push it further.[00:29:02] Eugene Cheah: And this conversion process, to be clear, I don't think is going to be exclusive to RWKV. It probably will work for Mamba as well, I don't see why not. And we will probably see more ideas, or more experiments, or more hybrids. Yeah, like, one of the weirdest things that I want to say outright, and I confirmed this with the Black Mamba team and the Jamba team, because we did the GoldFinch hybrid model, is that none of us understands why a hard hybrid of a state space model and a transformer performs better than the baseline of both.[00:29:28] Eugene Cheah: It's like, when you train one and then you replace parts, you expect the same results. That's our pitch. That's our claim. But somehow when we jam both together, it outperforms both. And that's one area of exploration where we only have four experiments, across four teams; a lot more needs to be done.[00:29:51] Eugene Cheah: But these are things that excite me, essentially, because that is what we can potentially move ahead with. Which brings us to what comes next.[00:30:00] What's next[00:30:00] Dan Fu: So, this part is kind of just where we'll talk a little bit about stuff that we're excited about, and maybe have some wild speculation on what's coming next.[00:30:12] Dan Fu: And, of course, this is also the part that will be more open to questions. So, a couple things that I'm excited about: continued hardware-model co-design for these models. So one of the things that we've put out recently is this library called ThunderKittens.
It's a CUDA library.[00:30:29] Dan Fu: And one of the things that we found frustrating is that every time we built one of these new architectures, and I'm sure you had the exact same experience, we'd have to go and spend two months in CUDA land writing these new efficient kernels. And if we decided to change one thing in PyTorch, like, one line of PyTorch code is at least a week of CUDA code.[00:30:47] Dan Fu: So one of our goals with a library like ThunderKittens was, we just broke down: what are the key principles, what are the key hardware pieces, what are the key compute pieces that you get from the hardware? So for example, on an [00:31:00] H100, everything really revolves around a warpgroup matrix multiply operation.[00:31:06] Dan Fu: So you really want your operation to be able to split into relatively small matrix-matrix multiply operations, like multiplying two 64 by 64 matrices, for example. And if you know that ahead of time when you're designing your model, that probably gives you, you know, some information about how you set the state sizes, how you set the update function.[00:31:27] Dan Fu: So with ThunderKittens we basically built a whole library just around this basic idea that your basic compute primitive should not be a float, it should be a matrix, and everything should just be matrix compute. And we've been using that to try to both re-implement some existing architectures, and also start to design some new ones that are really designed with this tensor core primitive in mind. Another thing that at least I'm excited about is, over the last four or five years, we've really been looking at language models as the next thing. But if you've been paying [00:32:00] attention to Twitter, there's been a bunch of new next generation models that are coming out.[00:32:04] Dan Fu: So there are a few.
So, video generation models that can run in real time, that are driven by your mouse and your keyboard, that I'm told, if you play with them, only have a few seconds of memory. Can we take that model, can we give it a very long context length, so that you could actually maybe generate an entire game state at a time?[00:32:25] Dan Fu: What does that look like for the model? You're certainly not going to do a giant quadratic attention computation to try to run that. Or maybe use some of these new video generation models that came out. So Sora came out, I don't know, two days ago now, but with super long queue times and super long generation times.[00:32:43] Dan Fu: That's probably a quadratic attention operation at the bottom of it. What if we could remove that and get the same quality, but a lot faster generation time? Or some of the demos that we saw from Paige earlier today. You know, if I have a super long conversation with my [00:33:00] Gemini bot, what if I wanted it to remember everything that it's seen in the last week?[00:33:06] Dan Fu: I mean, maybe you don't, for personal reasons, but what if I did, you know? What does that mean for the architecture? And I think, you know, that's certainly something I'm pretty excited about, and I'm sure you're excited about it too. So, I think we were supposed to have some hot takes, but I honestly don't remember what our hot takes were.[00:33:21] Hot Takes - does anyone really need long context?[00:33:21] Eugene Cheah: Yeah, including the next slide. Hot takes, yes, these are our[00:33:25] Dan Fu: hot takes.[00:33:25] Eugene Cheah: I think the big one on Twitter that we saw, that we shared, was the question: is RAG relevant in the future of state space models?[00:33:38] Dan Fu: Let's see, I haven't played too much with RAG. But when I have,
I'll say I found it was a little bit challenging to do research on it, because we had this experience over and over again where you could have an embedding model of any quality, so you could have a really, really bad embedding model, or you could have a really, really [00:34:00] good one, by any measure of good,[00:34:03] Dan Fu: and for the final RAG application, it kind of didn't matter. That's what I'll say about RAG while I'm being recorded. I know it doesn't actually answer the question, but[00:34:13] Eugene Cheah: Yeah, so I think a lot of folks are, like, extremely excited about the idea of RWKV or state space potentially having infinite context.[00:34:21] Eugene Cheah: But I think the reality is that when we say infinite context, we just mean a different kind of infinite context, or, as was previously covered, you need to test the model differently. So think of it more along the lines of a human. Like, I don't remember what I ate for breakfast yesterday.[00:34:37] Eugene Cheah: Yeah, that's the statement that I'll say. And we humans are not quadratic transformers. If we were, let's say we increased our brain size for every second we lived, we would have exploded by the time we were 5 years old or something like that. And I think, basically, fundamentally for us, regardless of whether it's RWKV, state space, xLSTM, [00:35:00] etc., our general idea is that, instead of that expanding state, that increase in computational cost, what if we have a fixed state size?[00:35:08] Eugene Cheah: And information theory dictates that that fixed state size will have a limit. Just how big of a limit is the question. Like, RWKV is running at 40 megabytes for its state. Its future version might run into 400 megabytes.
That is, like, millions of tokens, if you're talking mathematically about the maximum possibility.[00:35:29] Eugene Cheah: It's just that I guess we are all still inefficient about it, so maybe we hit 100,000. And that's kind of like the work we are doing, trying to push it and maximize it. And that's where the models will start differing, because they will choose to forget things, they will choose to remember things. And that's why I think there might be some element of RAG, but it may not be the same RAG.[00:35:49] Eugene Cheah: It may be the model learns things and is like, hmm, I can't remember that article, let me do a database search. Just like us humans, when we can't remember an article in the company, we do a search on Notion. [00:36:00][00:36:00] Dan Fu: I think something that would be really interesting, so right now, one intuition about language models is that all those parameters are there just to store random facts about the world.[00:36:14] Dan Fu: And this intuition comes from the observation that if you take a really small language model, it can do things like talk to you, it kind of has, like, the style of conversation, it can learn that, but where it will usually fall over compared to a much larger one is it'll just be a lot less factual about things that it knows or that it can do.[00:36:32] Dan Fu: But that points to: all those weights that we're spending, all that SGD that we're spending to train these models, is just being used to store facts. And we have things like databases that are pretty good at storing facts.
So I think one thing that would be really interesting is if we could actually have some sort of outside data store that a language model can look at, that maybe, you know, has some sort of gradient descent in it, but that would be quite interesting.[00:36:58] Dan Fu: And then maybe you could edit it, delete [00:37:00] facts, you know, change who's president, so that it doesn't get lost.[00:37:04] Vibhu: Can we open up Q&A and hot takes for the audience? I have a hot take Q&A. Do these scale? When's the 405B state space model? RAG exists, no one does long context, who's throwing in 2 million token questions? Hot takes?[00:37:24] Dan Fu: The who's-throwing-in-2-million-token-questions question, I think, is a really good one. So I actually, I was going to offer that as a hot take. I mean, my hot take was going to be that long context doesn't matter. I know I just gave a whole talk about it, but, you know, what's the point of doing research if you can't play both sides?[00:37:40] Dan Fu: But I think, for both of us, the reason that we first got into this was just from the first-principles question of: there's this quadratic thing, clearly intelligence doesn't need to be quadratic, what is going on, can we understand it better? You know, since then it's kind of turned into a race, which has [00:38:00] been exciting to watch, like, how much context you can take in.[00:38:03] Dan Fu: But I think it's right. Nobody is actually putting a two million token prompt into these models. And, you know, if they are, maybe we can go, you know, design a better model to do that particular thing. Yeah, what do you think about that? You've also been working on this. Do you think long context matters?[00:38:19] Eugene Cheah: So I'm going to burn a bit. How many of you remember the news of Google Gemini supporting 3 million tokens of context, right?
Raise your hand.[00:38:28] Vibhu: Yeah, 2 million.[00:38:29] Eugene Cheah: Oh, it's 2 million.[00:38:31] Eugene Cheah: Yeah, how many of you actually tried that? See?[00:38:34] Vibhu: I use it a lot. You? You work for MindsTV. I use it a lot.[00:38:41] Eugene Cheah: So some people have used it, and I think, I think that might be, like, this is where my opinion starts to differ, because I think the big labs may have a bigger role in this, because, like, even for RWKV, even when we train long context, the reason why I say VRAM is a problem is that when we backprop [00:39:00] against the states, we actually need to maintain the state in between the tokens by the token length.[00:39:05] Eugene Cheah: So that means we need to actually roll out the whole 1 million context if we are actually training on 1 million. Which is the same for transformers, actually, but it just means we don't magically save on VRAM consumption at training time. So that is one of the VRAM bottlenecks, and I'm neither OpenAI nor Google, so donate GPUs if you have too much of them.[00:39:27] Eugene Cheah: But then, putting it back to another paradigm, right, is that I think o1-style reasoning might be actually pushing that direction downwards. In my opinion, this is my partial hot take: let's say you have a super big model, and let's say you have a 70B model that may take double the tokens, but gets the same result.[00:39:51] Eugene Cheah: Strictly speaking, a 70B, and this is true for transformer or non transformer, right, will take less resources than that 400B [00:40:00] model, even if it did double the amount of thinking.
And if that's the case, and we are still all trying to figure this out, maybe the direction for us is really getting the sub-200B models to be as fast and efficient as possible.[00:40:11] Eugene Cheah: With a very efficient architecture, which some folks happen to be working on, to just reason it out over larger and larger context.[00:40:20] Question: Yeah. One thing I'm super interested in is models that can watch forever. Obviously you cannot train something on infinite context length. How are y'all thinking about that, where you run on a much longer context length than is possible to train on?[00:40:38] Dan Fu: Yeah, it's a great question. So I think you guys probably had tweets along these lines, too. When we first started doing these things, because these are all recurrent models, in theory you could just run it forever. You could just run it forever. And at the very least it won't, like, error out or crash.[00:40:57] Dan Fu: There's another question of whether it can actually [00:41:00] use what it's seen in that infinite context. And I think there, so one place where the research on architectures probably ran faster than another line of research is actually the benchmarks for long context. So you turn it on forever, you want to do everything or watch everything, but what is it that you actually wanted to do? Can we actually build some benchmarks for that? Then measure what's happening. And then ask the question, can the models do it? Is there something else that they need? Yeah, I think that if I were to turn back the clock to 2022, that's probably one of the things I would have done differently, which would have been actually get some long context benchmarks out at the same time as we started pushing context length on all these models.[00:41:41] Eugene Cheah: I will also say the use case. So like, I think we both agree that there's no infinite memory and the model needs to be able to learn and decide.
I think what we have observed, and I think this also fits the state space models, is that one of the key advantages of this alternative attention mechanism that is not based on token position is that the model doesn't suddenly become crazy when you go past the [00:42:00] 8k training context length, or a million context length.[00:42:03] Eugene Cheah: It's actually still stable. It's still able to run, it's still able to rationalize. It just starts forgetting things. But some of these things are still there in latent memory. Some of these things are still somewhat there. That's the whole point of why reading twice works. Things like that. And one of the biggest pushes in this direction is that I think both state space models and RWKV have separate papers by other researchers where they use this architecture for time series data.[00:42:26] Eugene Cheah: Weather modeling. So, you are not asking what was the weather five days ago. You're asking what's the weather tomorrow, based on the infinite length that, as long as this Earth and the computer will keep running, keeps growing. And they found that it is, like, better than existing transformers or existing architectures in modeling this weather data.[00:42:47] Eugene Cheah: Controlled for the param size and stuff. I'm quite sure there are people with larger models. So there are things that, in this case, right, there are future applications, if your question is just what's next and not what's 10 years ago.[00:42:59] Dan Fu: Thanks so [00:43:00] much for having us. Get full access to Latent Space at www.latent.space/subscribe

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Happy holidays! We'll be sharing snippets from Latent Space LIVE! through the break bringing you the best of 2024! We want to express our deepest appreciation to event sponsors AWS, Daylight Computer, Thoth.ai, StrongCompute, Notable Capital, and most of all, all our LS supporters who helped fund the gorgeous venue and A/V production!

For NeurIPS last year we did our standard conference podcast coverage interviewing selected papers (that we have now also done for ICLR and ICML), however we felt that we could be doing more to help AI Engineers 1) get more industry-relevant content, and 2) recap 2024 year in review from experts. As a result, we organized the first Latent Space LIVE!, our first in person miniconference, at NeurIPS 2024 in Vancouver. Today, we're proud to share Loubna's highly anticipated talk (slides here)!

Synthetic Data

We called out the Synthetic Data debate at last year's NeurIPS, and no surprise that 2024 was dominated by the rise of synthetic data everywhere:

* Apple's Rephrasing the Web, Microsoft's Phi 2-4 and Orca/AgentInstruct, Tencent's Billion Persona dataset, DCLM, and HuggingFace's FineWeb-Edu, and Loubna's own Cosmopedia extended the ideas of synthetic textbook and agent generation to improve raw web scrape dataset quality
* This year we also talked to the IDEFICS/OBELICS team at HuggingFace who released WebSight this year, the first work on code-vs-images synthetic data.
* We called Llama 3.1 the Synthetic Data Model for its extensive use (and documentation!) of synthetic data in its pipeline, as well as its permissive license.
* Nemotron-CC and Nemotron-4-340B also made a big splash this year for how they used 20k items of human data to synthesize over 98% of the data used for SFT/PFT.
* Cohere introduced Multilingual Arbitrage: Optimizing Data Pools to Accelerate Multilingual Progress, observing gains of up to 56.5% improvement in win rates when comparing multiple teachers vs the single best teacher model
* In post training, AI2's Tülu 3 (discussed by Luca in our Open Models talk) and Loubna's Smol Talk were also notable open releases this year.

This comes in the face of a lot of scrutiny and criticism, with Scale AI as one of the leading voices publishing AI models collapse when trained on recursively generated data in Nature magazine, bringing mainstream concerns to the potential downsides of poor quality synthetic data.

Part of the concerns we highlighted last year on low-background tokens are coming to bear: ChatGPT-contaminated data is spiking in every possible metric.

But perhaps, if Sakana's AI Scientist pans out this year, we will have mostly-AI AI researchers publishing AI research anyway, so do we really care as long as the ideas can be verified to be correct?

Smol Models

Meta surprised many folks this year by not just aggressively updating Llama 3 and adding multimodality, but also adding a new series of “small” 1B and 3B “on device” models this year, even working on quantized numerics collaborations with Qualcomm, Mediatek, and Arm. It is near unbelievable that a 1B model today can qualitatively match a 13B model of last year, and the minimum size to hit a given MMLU bar has come down roughly 10x in the last year.
We have been tracking this proxied by LMSYS Elo and inference price. The key reads this year are:

* MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases
* Apple Intelligence Foundation Language Models
* Hymba: A Hybrid-head Architecture for Small Language Models
* Loubna's SmolLM and SmolLM2: a family of state-of-the-art small models with 135M, 360M, and 1.7B parameters on the pareto efficiency frontier.
* and Moondream, which we already covered in the 2024 in Vision talk

Full Talk on YouTube

Please like and subscribe!

Timestamps

* [00:00:05] Loubna Intro
* [00:00:33] The Rise of Synthetic Data Everywhere
* [00:02:57] Model Collapse
* [00:05:14] Phi, FineWeb, Cosmopedia - Synthetic Textbooks
* [00:12:36] DCLM, Nemotron-CC
* [00:13:28] Post Training - AI2 Tulu, Smol Talk, Cohere Multilingual Arbitrage
* [00:16:17] Smol Models
* [00:18:24] On Device Models
* [00:22:45] Smol Vision Models
* [00:25:14] What's Next

Transcript

2024 in Synthetic Data and Smol Models

[00:00:00] [00:00:05] Loubna Intro[00:00:05] Speaker: I'm very happy to be here. Thank you for the invitation. So I'm going to be talking about synthetic data in 2024. And then I'm going to be talking about small on device models. So I think the most interesting thing about synthetic data this year is that like now we have it everywhere in the large language models pipeline.[00:00:33] The Rise of Synthetic Data Everywhere[00:00:33] Speaker: I think initially, synthetic data was mainly used just for post training, because naturally that's the part where we needed human annotators. And then after that, we realized that we don't really have good benchmarks to [00:01:00] measure if models follow instructions well, if they are creative enough, or if they are chatty enough, so we also started using LLMs as judges.[00:01:08] Speaker: Thank you.
And I think this year and towards the end of last year, we also went to the pre training parts and we started generating synthetic data for pre training to kind of replace some parts of the web. And the motivation behind that is that you have a lot of control over synthetic data. You can control your prompt and basically also the kind of data that you generate.[00:01:28] Speaker: So instead of just trying to filter the web, you could try to get the LLM to generate what you think the best web pages could look like and then train your models on that. So this is how we went from not having synthetic data at all in the LLM pipeline to having it everywhere. And so the cool thing is like today you can train an LLM with an entirely synthetic pipeline.[00:01:49] Speaker: For example, you can use our Cosmopedia datasets and you can train a 1B model on like 150 billion tokens that are 100 percent synthetic. And those are also of good quality. And then you can [00:02:00] instruction tune the model on a synthetic SFT dataset. You can also do DPO on a synthetic dataset. And then to evaluate if the model is good, you can use a benchmark that uses LLMs as a judge, for example, MT-Bench or AlpacaEval. So I think this is really mind-blowing, because just a few years ago, we wouldn't think this is possible. And I think there's a lot of concerns about model collapse, and I'm going to talk about that later. But we'll see that like, if we use synthetic data properly and we curate it carefully, that shouldn't happen.[00:02:29] Speaker: And the reason synthetic data is very popular right now is that we have really strong models, both open and closed. It is really cheap and fast to use compared to human annotations, which cost a lot and take a lot of time.
And also for open models right now, we have some really good inference frameworks.[00:02:47] Speaker: So if you have enough GPUs, it's really easy to spin up these GPUs and generate a lot of synthetic data. Some examples are vLLM, TGI, and TensorRT-LLM.[00:02:57] Model Collapse[00:02:57] Speaker: Now let's talk about the elephant in the room, model [00:03:00] collapse. Is this the end? If you look at the media and, for example, some papers in Nature, it's really scary because there's a lot of synthetic data out there on the web.[00:03:09] Speaker: And naturally we train on the web. So we're going to be training on a lot of synthetic data. And if model collapse is going to happen, we should really try to take that seriously. And the other issue is that, as I said, a lot of people think the web is polluted because there's a lot of synthetic data.[00:03:24] Speaker: And for example, when we were building the FineWeb datasets here with Guilherme and Hynek, we were interested in, like, how much synthetic data is there on the web? There isn't really a method to properly measure the amount of synthetic data, or to say whether a webpage is synthetic or not. But one thing we can do is to try to look for proxy words, for example, expressions like as a large language model or words like delve that we know are actually generated by ChatGPT.[00:03:49] Speaker: We could try to measure the amount of these words in our datasets and compare them to the previous years. For example, here, we measured the ratio of these words in different dumps of Common Crawl. [00:04:00] And we can see that the ratio really increased after ChatGPT's release. So if we were to say that the amount of synthetic data didn't change, you would expect this ratio to stay constant, which is not the case.[00:04:11] Speaker: So there's a lot of synthetic data probably on the web, but does this really make models worse? So what we did is we trained different models on these different dumps.
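The proxy-word measurement just described can be sketched in a few lines. A rough illustration, where the phrase list and the toy "dumps" are hypothetical stand-ins, not the actual FineWeb methodology:

```python
# Proxy phrases associated with ChatGPT-generated text (a small,
# illustrative list; the real analysis can use more signals).
PROXY_PHRASES = ["as a large language model", "delve"]

def proxy_ratio(docs):
    """Fraction of documents containing at least one proxy phrase,
    a crude proxy for the amount of synthetic text in a dump."""
    hits = sum(
        1 for doc in docs
        if any(phrase in doc.lower() for phrase in PROXY_PHRASES)
    )
    return hits / len(docs)

# Toy dumps: one pre-ChatGPT, one post-ChatGPT (invented examples).
old_dump = ["the weather report for monday", "a recipe for sourdough bread"]
new_dump = ["let us delve into the topic", "As a large language model, I..."]
print(proxy_ratio(old_dump), proxy_ratio(new_dump))  # 0.0 1.0
```

Comparing this ratio across yearly dumps is exactly the constant-vs-rising check described above: if no synthetic data had entered the web, the ratio should stay flat.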
And we then computed their performance on popular, like, NLP benchmarks, and then we computed the aggregated score. And surprisingly, you can see that the latest dumps are actually even better than the dumps that came before.[00:04:31] Speaker: So if there's some synthetic data there, at least it did not make the models worse. Yeah, which is really encouraging. So personally, I wouldn't say the web is poisoned with synthetic data. Maybe it's even making it more rich. And the issue with model collapse is that, for example, those studies were done at a small scale, and you would ask the model to complete, for example, a Wikipedia paragraph, and then you would train it on these new generations, and you would do that iteratively.[00:04:56] Speaker: I think if you do that approach, it's normal to [00:05:00] observe this kind of behavior, because the quality is going to be worse, because the model is already small. And then if you train it just on its generations, you shouldn't expect it to become better. But what we're really doing here is that we take a model that is very large and we try to distill its knowledge into a model that is smaller.[00:05:14] Phi, FineWeb, Cosmopedia - Synthetic Textbooks[00:05:14] Speaker: And in this way, you can expect to get a better performance for your small model. And using synthetic data for pre-training has become really popular after the Textbooks Are All You Need paper, where Microsoft basically trained a series of small models on textbooks that were generated using a large LLM.[00:05:32] Speaker: And then they found that these models were actually better than models that are much larger. So this was really interesting. It was like the first of its kind, but it was also met with a lot of skepticism, which is a good thing in research.
It pushes you to question things, because the dataset that they trained on was not public, so people were not really sure if these models are really good or maybe there's just some data contamination.[00:05:55] Speaker: So it was really hard to check if you just have the weights of the models. [00:06:00] And at Hugging Face, because we like open source, we tried to reproduce what they did. So this is our Cosmopedia dataset. We basically tried to follow a similar approach to what they documented in the paper. And we created a synthetic dataset of textbooks and blog posts and stories that had almost 30 billion tokens.[00:06:16] Speaker: And we tried to train some models on that. And we found that the key ingredient to getting a good dataset that is synthetic is trying as much as possible to keep it diverse. Because if you just throw the same prompt at your model, like generate a textbook about linear algebra, even if you change the temperature, the textbooks are going to look alike.[00:06:35] Speaker: So there's no way you could scale to millions of samples. And the way you do that is by creating prompts that have some seeds that make them diverse. In our case, in the prompt, we would ask the model to generate a textbook, but make it related to an extract from a webpage. And also we try to frame it to stay within topic.[00:06:55] Speaker: For example, here, we put an extract about cardiovascular bioimaging, [00:07:00] and then we ask the model to generate a textbook related to medicine that is also related to this webpage. And this is a really nice approach because there's so many webpages out there. So you can be sure that your generations are going to be diverse when you change the seed example.[00:07:16] Speaker: One thing that's challenging with this is that you want the seed samples to be related to your topics. So we use a search tool to go through all of the FineWeb dataset.
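The seeding idea just described can be sketched as a tiny prompt builder: each prompt is anchored to a different web extract, so changing the seed changes the prompt. A rough illustration; the template wording and seed extracts are invented, not the actual Cosmopedia prompts:

```python
def textbook_prompt(web_extract, audience="middle school students"):
    """Build a generation prompt seeded with a web extract, so that
    varying the seed keeps generations diverse at scale."""
    return (
        f'Here is an extract from a webpage: "{web_extract}"\n\n'
        f"Write a detailed textbook chapter for {audience} that is "
        f"related to this extract and stays within its topic."
    )

# Hypothetical seed extracts (in practice these come from web pages).
seeds = [
    "Cardiovascular bioimaging uses MRI to visualize the heart...",
    "Linear algebra underpins most of machine learning...",
]
prompts = [textbook_prompt(s, audience="college students") for s in seeds]
print(len(set(prompts)))  # each seed yields a distinct prompt, so 2
```

With millions of seed pages, the same template produces millions of distinct prompts, which is what lets the dataset scale without the generations collapsing into near-duplicates.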
And then we also do a lot of experiments with the type of generations we want the model to generate. For example, we ask it for textbooks for middle school students or textbooks for college.[00:07:40] Speaker: And we found that some generation styles help on some specific benchmarks, while others help on other benchmarks. For example, college textbooks are really good for MMLU, while middle school textbooks are good for benchmarks like OpenBookQA and PIQA. This is a sample from our search tool.[00:07:56] Speaker: For example, you have a top category, which is a topic, and then you have some [00:08:00] subtopics, and then you have the topic hits, which are basically the web pages in FineWeb that belong to these topics. And here you can see the comparison between Cosmopedia, where we had two versions V1 and V2 in blue and red, and FineWeb, and as you can see, throughout the training, training on Cosmopedia was consistently better.[00:08:20] Speaker: So we managed to get a dataset that was actually good to train these models on. It's of course so much smaller than FineWeb, it's only 30 billion tokens, but that's the scale that Microsoft's dataset was, so we kind of managed to reproduce a bit what they did. And the dataset is public, so everyone can go there, check if everything is all right.[00:08:38] Speaker: And now this is a recent paper from NVIDIA, Nemotron-CC. They took things a bit further, and they generated not a few billion tokens, but 1.9 trillion tokens, which is huge. And we can see later how they did that. It's more of, like, rephrasing the web.
So we can see today that there are some really huge synthetic datasets out there, and they're public, so, [00:09:00] like, you can try to filter them even further if you want to get more high quality corpora.[00:09:04] Speaker: So for this rephrasing the web, this approach was suggested in this paper by Pratyush, where basically they take some samples from the C4 dataset, and then they use an LLM to rewrite these samples into a better format. For example, they ask an LLM to rewrite the sample into a Wikipedia passage or into a Q&A page.[00:09:25] Speaker: And the interesting thing in this approach is that you can use a model that is small, because rewriting doesn't require knowledge. It's just rewriting a page into a different style. So the model doesn't need to have extensive knowledge of what it is rewriting, compared to just asking a model to generate a new textbook without giving it ground truth.[00:09:45] Speaker: So here they rewrite some samples from C4 into Q&A, into Wikipedia, and they find that doing this works better than training just on C4. And what they did in Nemotron-CC is a similar approach. [00:10:00] They rewrite some pages from Common Crawl for two reasons. One is to improve pages that are low quality, so they rewrite them into, for example, Wikipedia pages, so they look better.[00:10:11] Speaker: And another reason is to create more diverse datasets. So they have a dataset that they already heavily filtered, and then they take these pages that are already high quality, and they ask the model to rewrite them in question and answer format, into open ended questions or multiple choice questions.[00:10:27] Speaker: So this way they can reuse the same page multiple times without fearing having multiple duplicates, because it's the same information, but it's going to be written differently.
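The rephrasing setup amounts to building one rewrite prompt per target style for each page, so the same information can be reused once per style. A rough sketch; the instruction wording and the sample page are invented for the example:

```python
# Target styles for rewriting a page (instruction wording is illustrative).
STYLES = {
    "wikipedia": "Rewrite the following passage in the style of a Wikipedia article:",
    "qa": "Convert the following passage into a question-and-answer page:",
}

def rephrase_prompts(page):
    """One page becomes one rewrite prompt per style: the same
    information, written differently, without exact duplicates."""
    return {name: f"{instruction}\n\n{page}" for name, instruction in STYLES.items()}

page = "buy now!! click here -- The Nile is the longest river in Africa."
prompts = rephrase_prompts(page)
print(sorted(prompts))  # ['qa', 'wikipedia']
```

Because rewriting only changes style, not content, even a small model can execute these prompts, which is the efficiency argument made above.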
So I think that's also a really interesting approach for generating synthetic data just by rephrasing the pages that you already have. There's also this approach called ProX, where they try to start from a web page and then they generate a program which finds how to rewrite that page to make it better and less noisy. For example, here you can see that there's some leftover metadata in the web page and you don't necessarily want to keep that for training [00:11:00] your model.[00:11:00] Speaker: So they train a model that can generate programs that can normalize and remove lines that are extra. So I think this approach is also interesting, but it's maybe less scalable than the approaches that I presented before. So that was it for rephrasing and generating new textbooks.[00:11:17] Speaker: Another approach that I think is really good and becoming really popular for using synthetic data for pre training is basically building better classifiers for filtering the web. For example, here we released the dataset called FineWeb-Edu. And the way we built it is by taking Llama 3 and asking it to rate the educational content of web pages from zero to five.[00:11:39] Speaker: So for example, if a page is a really good textbook that could be useful in a school setting, it would get a really high score. And if a page is just an advertisement or promotional material, it would get a lower score. And then after that, we take these synthetic annotations and we train a classifier on them.[00:11:57] Speaker: It's a classifier like a BERT model. [00:12:00] And then we run this classifier on all of FineWeb, which is a 15 trillion tokens dataset. And then we only keep the pages that have a score that's higher than 3. So for example, in our case, we went from 15 trillion tokens to just 1.5 trillion tokens.
Those are really highly educational.[00:12:16] Speaker: And as you can see here, FineWeb-Edu outperforms all the other public web datasets by a large margin on a couple of benchmarks. Here, I show the aggregated score, and you can see that this approach is really effective for filtering web datasets to get better corpora for training your LLMs.[00:12:36] DCLM, Nemotron-CC[00:12:36] Speaker: Others also tried this approach. There's, for example, the DCLM dataset, where they also train a classifier, but not to detect educational content. Instead, they trained it on the OpenHermes dataset, which is a dataset for instruction tuning, and also on the ELI5 subreddit, and then they also get a really high quality dataset which is very information dense and can help [00:13:00] you train some really good LLMs.[00:13:01] Speaker: And then for Nemotron Common Crawl, they also did this approach, but instead of using one classifier, they used an ensemble of classifiers. So they used, for example, the DCLM classifier, and also classifiers like the ones we used in FineWeb-Edu, and then they combined these two scores with an ensemble method to only retain the best high quality pages, and they get a dataset that works even better than the ones we developed.[00:13:25] Speaker: So that was it for synthetic data for pre-training.[00:13:28] Post Training - AI2 Tulu, Smol Talk, Cohere Multilingual Arbitrage[00:13:28] Speaker: Now we can go back to post training. I think there's a lot of interesting post training datasets out there. One that was released recently is AgentInstruct by Microsoft, where they basically try to target some specific skills and improve the performance of models on them.[00:13:43] Speaker: For example, here, you can see code, brain teasers, open domain QA, and they managed to get a dataset such that, when fine tuning Mistral 7B on it, it outperforms the original instruct model that was released by Mistral.
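The classifier-filtering recipe above (rate pages, keep those above a threshold, optionally combine several classifiers) can be sketched as follows. The scorers here are stubs standing in for the trained BERT-style and DCLM-style classifiers, and the keyword rules and thresholds are purely illustrative:

```python
def keep_page(page, classifiers, threshold=3.0, combine=max):
    """Score a page with one or more quality classifiers and keep it
    only if the combined score clears the threshold (FineWeb-Edu keeps
    pages rated above 3 on a 0-5 scale; Nemotron-CC combines several
    classifiers with an ensemble method)."""
    return combine(clf(page) for clf in classifiers) > threshold

# Stub scorers standing in for real trained classifiers.
def edu_like(page):   # FineWeb-Edu-style 0-5 educational score
    return 5.0 if "theorem" in page else 1.0

def dclm_like(page):  # DCLM-style quality score, rescaled to 0-5
    return 4.0 if "explain" in page else 1.0

pages = [
    "the Pythagorean theorem states that...",
    "we explain the proof step by step",
    "BUY CHEAP WATCHES NOW",
]
kept = [p for p in pages if keep_page(p, [edu_like, dclm_like])]
print(kept)  # the advertisement is filtered out
```

Swapping `combine=max` for a mean or a learned weighting gives other ensemble variants; the transcript doesn't specify which aggregation Nemotron-CC uses, so `max` here is just one choice.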
And as I said, to get good synthetic data, you really [00:14:00] have to have a framework to make sure that your data is diverse.[00:14:03] Speaker: So for example, they always seed the generations with either source code or raw text documents, and then they rewrite them to make sure they're easier to generate instructions from, and then they use that for their instruction data generation. There's also the Tülu 3 SFT mixture, which was released recently by Allen AI.[00:14:23] Speaker: It's also really good quality and it covers a wide range of tasks. And the way they make sure that this dataset is diverse is by using personas from the Persona Hub dataset, which is basically a dataset of, I think, over a million personas. And for example, in the Tülu mixture, to generate a new code snippet, they would give the model a persona, for example, a machine learning researcher interested in neural networks, and then ask it to generate a coding problem.[00:14:49] Speaker: This way you make sure that your dataset is really diverse, and then you can further filter the datasets, for example, using reward models. We also released a dataset called Smol Talk, [00:15:00] and we also tried to cover a wide range of tasks, and as you can see here, for example, when fine tuning Mistral 7B on the dataset, we also outperformed the original Mistral Instruct on a number of benchmarks, notably on mathematics and instruction following with IFEval.[00:15:18] Speaker: Another paper that's really interesting I wanted to mention is this one called Multilingual Data Arbitrage by Cohere. And basically they want to generate a dataset for post training that is multilingual. And they have a really interesting problem. It's the fact that there isn't one model that's really good at all the languages they wanted.[00:15:36] Speaker: So what they do is that they use not just one teacher model, but multiple teachers.
And then they have a router which basically sends the prompts they have to all these models. And then they get the completions, and they have a reward model that rates all these generations and only keeps the best one.[00:15:52] Speaker: And this is like arbitrage in finance. So, well, I think what's interesting in this is it shows that synthetic data doesn't have to come from a single model. [00:16:00] And because we have so many good models now, you could pool these models together and get a dataset that's really high quality and that's diverse and that covers all your needs.[00:16:12] Speaker: I was supposed to put a meme there, but. Yeah, so that was it for synthetic data.[00:16:17] Smol Models[00:16:17] Speaker: Now we can go to see what's happening in the small models field in 2024. I don't know if you know, but like now we have some really good small models. For example, Llama 3.2 1B matches Llama 2 13B, which was released last year, on the LMSYS arena, which is basically the default go-to leaderboard for evaluating models using human evaluation.[00:16:39] Speaker: And as you can see here, the scores of the models are really close. So I think we've made a huge leap forward in terms of small models. Of course, that's just one data point, but there's more. For example, if you look at this chart from the Qwen 2.5 blog post, it shows that today we have some really good models that are only like 3 billion parameters [00:17:00] and 4 billion that score really high on MMLU.[00:17:03] Speaker: Which is a really popular benchmark for evaluating models. And you can see here that the blue dots have more than 65 on MMLU, and the grey ones have less. And for example, Llama 33B had less. So now we have a 3B model that outperforms a 33B model that was released earlier.
So I think now people are starting to realize that like, we shouldn't just scale and scale models, but we should try to make them more efficient.[00:17:33] Speaker: I don't know if you knew, but you can also chat with a 3B-plus model on your iPhone. For example, here, this is an app called PocketPal, where you can go and select a model from Hugging Face. It has a large selection. For example, here we loaded Phi-3.5, which is 3.8 billion parameters, on this iPhone. And we can chat with it, and you can see that even the latency is acceptable.[00:17:57] Speaker: For example, here, I asked it to give me a joke about [00:18:00] NeurIPS. So let's see what it has to say.[00:18:06] Speaker: Okay, why did the neural network attend NeurIPS? Because it heard there would be a lot of layers and fun and it wanted to train its sense of humor. So not very funny, but at least it can run on device. Yeah, so I think now we have good small models, but we also have good frameworks and tools to use these small models.[00:18:24] On Device Models[00:18:24] Speaker: So I think we're really close to having really good edge and on-device models. And I think for a while we've had this narrative that just training larger models is better. Of course, this is supported by the scaling laws. As you can see here, for example, when we scale the model size, the loss is lower, and obviously you get a better model.[00:18:46] Speaker: And we can see this, for example, in the GPT family of models, how we went from just a hundred million parameters to more than a trillion parameters. And of course, we all observed the performance improvement when using the latest model. But [00:19:00] one thing that we shouldn't forget is that when we scale the model, we also scale the inference costs and time.[00:19:05] Speaker: And so the largest models are going to cost so much more.
So I think now instead of just building larger models, we should be focusing on building more efficient models. It's no longer a race for the largest models, since these models are really expensive to run, and they require a really good infrastructure to do that, and they cannot run on, for example, consumer hardware.[00:19:27] Speaker: And when you try to build more efficient models that match larger models, that's when you can really unlock some really interesting on device use cases. And I think a trend that we're noticing now is the trend of training smaller models longer. For example, if you compare how long Llama 1 was trained compared to Llama 3, there is a huge increase in the pre training length.[00:19:50] Speaker: Llama 1 was trained on 1 trillion tokens, but Llama 3 8B was trained on 15 trillion tokens. So Meta managed to get a model that's the same size, but it performs so much [00:20:00] better by choosing to make the sacrifice during training, because as we know, training is a one time cost, but inference is ongoing.[00:20:08] Speaker: If we want to see what the key small model reads in 2024 are, I think this MobileLLM paper by Meta is interesting. They try to study different models that have less than 1 billion parameters and find which architecture makes the most sense for these models. For example, they find that depth is more important than width.[00:20:29] Speaker: So it's more important to have models that have more layers than to just make them more wide. They also find that GQA helps, and that tying the embeddings helps. So I think it's a nice study overall for models that are just a few hundred million parameters. There's also the Apple Intelligence tech report, which is interesting.[00:20:48] Speaker: So for Apple Intelligence, they had two models, one that was on server and another model that was on device. It had 3 billion parameters.
And I think the interesting part is that they trained this model using [00:21:00] pruning and then distillation. And for example, they have this table where they show that using pruning and distillation works much better than training from scratch.[00:21:08] Speaker: And they also have some interesting insights about how they specialize their models on specific tasks, like, for example, summarization and rewriting. There's also this paper by NVIDIA that was released recently. I think you've already had a talk about hybrid models; that was all interesting.[00:21:23] Speaker: And in this model, they used a hybrid architecture between state space models and transformers. And they managed to train a 1B model that's really performant without needing to train it on a lot of tokens. And regarding our work, we just recently released SmolLM2, so it's a series of three models, which are the best in class in each model size.[00:21:46] Speaker: For example, our 1.7B model outperforms Llama 1B and also Qwen2.5. And how we managed to train this model is the following. We spent a lot of time trying to curate the pre-training datasets. We did a lot of [00:22:00] ablations, trying to find which datasets are good and also how to mix them. We also created some new math and code datasets that we're releasing soon.[00:22:08] Speaker: But we basically really spent a lot of time trying to find what's the best mixture that you can train these models on. And then we also trained these models for very long. For example, SmolLM1 was trained only on 1 trillion tokens, but this model is trained on 11 trillion tokens.[00:22:24] Speaker: And we saw that the performance kept improving. The models didn't really plateau mid-training, which I think is really interesting. It shows that you can train such small models for very long and keep getting performance gains. What's interesting about SmolLM2 is that it's fully open.
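The distillation step mentioned above (a small student trained to match a larger teacher, as in the Apple Intelligence report) boils down to minimizing the KL divergence between the teacher's and student's softened output distributions. A minimal sketch, with made-up logits for illustration:

```python
import math

def softmax(logits, temp=1.0):
    """Softmax with a temperature; temp > 1 flattens the distribution."""
    exps = [math.exp(l / temp) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kl(p, q):
    """KL(p || q): how far the student q is from the teacher p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher_logits = [2.0, 1.0, 0.1]  # hypothetical teacher outputs for one token
student_logits = [1.8, 1.1, 0.3]  # hypothetical student outputs

# A temperature above 1 softens both distributions, exposing the teacher's
# relative preferences among the non-top classes ("dark knowledge").
distill_loss = kl(softmax(teacher_logits, temp=2.0),
                  softmax(student_logits, temp=2.0))
print(f"distillation loss: {distill_loss:.4f}")
```

In practice this term is averaged over the training set and often mixed with the ordinary cross-entropy loss on hard labels; the sketch only shows the core quantity being minimized.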
We also released the pre-training code base, the fine-tuning code, the datasets, and also evaluation in this repository.[00:22:45] Smol Vision Models[00:22:45] Speaker: There are also really interesting small models not just for text, but also for vision. For example, here you can see SmolVLM, which is a 2B model that's really efficient. It doesn't consume a lot of RAM, and it also has good performance. There's also Moondream [00:23:00] 0.5B, which was released recently. It's the smallest visual language model.[00:23:04] Speaker: And as you can see, there isn't a big trade-off compared to Moondream 2B. So now I showed you that we have some really good small models. We also have the tools to use them. But why should you consider using small models, and when? I think small models are really interesting because of the on-device feature.[00:23:23] Speaker: Because these models are small and they can run fast, you can basically run them on your laptop, but also on your mobile phone. And this means that your data stays local. You don't have to send your queries to third parties. And this really enhances privacy. That was, for example, one of the big selling points for Apple Intelligence.[00:23:42] Speaker: Also, right now we have so many frameworks to do on-device inference. For example, there's MLX, MLC, llama.cpp, Transformers.js. So we have a lot of options, and each of them has great features. So you have many options for doing that. Small models are also really powerful if you choose to specialize them.[00:24:00][00:24:00] Speaker: For example, here there's a startup called NuMind, which took SmolLM and then fine-tuned it on text extraction datasets. And they managed to get a model that's not very far from models that are much larger.
So I think text extraction is one use case where small models can be really performant, and it makes sense to use them instead of just using larger models.[00:24:19] Speaker: You can also chat with these models in the browser. For example, here, you can go there, you can load the model, you can even turn off your internet and just start chatting with the model locally. Speaking of text extraction, if you don't want to fine-tune the models, there's a really good method called structured generation.[00:24:36] Speaker: You can basically force the models to follow a JSON schema that you defined. For example, here, we try to force the model to follow a schema for extracting key information from GitHub issues. So you can input free text, which is a complaint about a GitHub repository, something not working. And then you can run it there and the model can extract anything that is relevant for your GitHub issue creation.[00:24:58] Speaker: For example, the [00:25:00] priority, for example, here, priority is high, the type of the issue, bug, and then a title and an estimation of how long this will take to fix. And you can just do this in the browser; you can transform your text into a GitHub issue that's properly formatted.[00:25:14] What's Next[00:25:14] Speaker: So what's next for synthetic data and small models?[00:25:18] Speaker: I think that domain-specific synthetic data is going to be, it's already important, and it's going to be even more important. For example, generating synthetic data for math. I think this would really help improve the reasoning of a lot of models. And a lot of people are doing it, for example, Qwen 2.5 Math, and everyone's trying to reproduce o1.[00:25:37] Speaker: And so I think for synthetic data, trying to specialize it on some domains is going to be really important.
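The GitHub-issue extraction described above constrains model output to a JSON schema. As a toy sketch (the field names and allowed values are illustrative, not the exact schema from the demo), here is what checking an output against such a schema looks like; real structured-generation libraries such as Outlines go further and mask invalid tokens during sampling, so the model cannot emit anything off-schema in the first place:

```python
import json

# Toy version of the issue schema from the talk; names/values are assumptions.
ISSUE_SCHEMA = {
    "priority": {"high", "medium", "low"},
    "type": {"bug", "feature", "docs"},
}

def validate_issue(raw: str) -> dict:
    """Parse model output and verify it conforms to the toy schema."""
    issue = json.loads(raw)
    for field, allowed in ISSUE_SCHEMA.items():
        if issue.get(field) not in allowed:
            raise ValueError(f"{field}={issue.get(field)!r} not in {allowed}")
    if not isinstance(issue.get("title"), str) or not issue["title"]:
        raise ValueError("missing or empty title")
    return issue

# A hypothetical model output for a free-text complaint:
model_output = ('{"priority": "high", "type": "bug", '
                '"title": "App crashes on launch", "estimate_hours": 4}')
issue = validate_issue(model_output)
print(issue["priority"])  # high
```

Post-hoc validation like this catches schema violations; constrained decoding prevents them, which is what makes the technique reliable enough to turn free text directly into a well-formed issue.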
And then for small models, I think specializing them through fine-tuning is also going to be really important, because I think a lot of companies are just trying to use these large models because they are better.[00:25:53] Speaker: But on some tasks, you can already get decent performance with small models. So you don't need to pay a [00:26:00] cost that's much larger just to make your model better at your task by a few percent. And this is not just for text; I think it also applies for other modalities like vision and audio.[00:26:11] Speaker: And I think you should also watch out for on-device frameworks and applications. For example, like the app I showed, or Ollama, all these frameworks are becoming really popular, and I'm pretty sure that we're going to get more of them in 2025. And users really like that. Maybe I should also say a hot take.[00:26:28] Speaker: I think that in AI, we just started with fine-tuning, for example, trying to make BERT work on some specific use cases, and really struggling to do that. And then we had some models that are much larger, so we just switched to prompt engineering to get the models to do what we want. And I think we're going back to fine-tuning, where we realize these models are really costly.[00:26:47] Speaker: It's better to just use a small model or try to specialize it. So I think it's a little bit of a cycle, and we're going to start to see more fine-tuning and less of just prompt engineering the models. So that was my talk. Thank you for following. And if you have [00:27:00] any questions, we can take them now. Get full access to Latent Space at www.latent.space/subscribe

Wilson County News
Voters reject East Central ISD's $360M bond


Play Episode Listen Later Nov 12, 2024 4:10


The East Central Independent School District's plans for a second high school and other new campuses have suffered a crushing blow, and the district now faces a deficit of almost million. “We didn't get the result we hoped for, but we look forward to reengaging with our Facilities Committee and gather additional feedback from the broader community,” said Superintendent Roland Toscano, after unofficial results were announced Nov. 5. “As a fast-growing district, we have important challenges to address in safety, space for our students, and support for quality educators. We're committed to working together with our community to develop...Article Link

Wilson County News
East Central voters go to the polls on $360M bond, tax


Play Episode Listen Later Oct 22, 2024 3:04


In addition to voting for president and members of the school board, voters in the East Central Independent School District (ISD) will decide the outcome of a $360 million school bond and proposed tax rate increase. Proposition A, also known as a Tax Rate Election (TRE), will ask voters to approve the tax rate adopted by the board — adding 5 cents to the maintenance and operations (M&O) tax, bringing it from 67 cents per $100 of taxable valuation currently to 72 cents per $100, which could add .6 million of revenue to the district to recruit and retain staff and...Article Link

Wilson County News
East Central voters to decide $360M bond, tax rate, trustees


Play Episode Listen Later Sep 3, 2024 5:03


The East Central Independent School District (ISD) board of trustees unanimously approved calling for a school bond and tax rate election for Nov. 5. Proposition (Prop) A, also known as a Tax Rate Election (TRE), will ask voters to approve the tax rate adopted by the board — adding 5 cents to the maintenance and operations (M&O) tax, bringing it from 67 cents per $100 of taxable valuation currently to 72 cents per $100, which could add .6 million of revenue to the district to recruit and retain staff and provide security updates and an armed officer at every campus....Article Link

DUBAI WORKS Business Podcast
Brands For Less Sells Stake to TJX, KBW Ventures Invests in NoorNation, E Daddy Raises $15M for E-Motorcycle Launch,Kabir Joshi, Founder and CEO of Vantage Capital Speaks About UAE Property Market


Play Episode Listen Later Aug 22, 2024 41:21


Headlines:- Dubai-Based Brands For Less Sells 35% Stake to TJX for $360M, Valuing the Group at $1.2 Billion- KBW Ventures Invests in Egyptian Climate Tech Startup NoorNation-Sustainable Mobility Startup E Daddy Raises $15M for Electric Motorcycle Launch-Investing in the UAE Property Market: Tips from Kabir Joshi, CEO of Vantage Capital and Vantage Properties

FactSet Evening Market Recap
Weekly Market Recap - Tuesday, 4th June


Play Episode Listen Later Jun 4, 2024 4:59


US equities ended mostly higher Tuesday, a bit off best levels. Not much behind modest late afternoon recovery from session lows with another Treasury rally and dovish repricing around Fed rate path the easiest excuses. Growth concerns remain key area of attention following softer ISM manufacturing print. April JOLTS job openings of 8.059M well below 8.360M consensus, lowest since Feb-21.

Collective Shift
Donald Trump Backs Crypto & Memecoin Mania Returns


Play Episode Listen Later May 15, 2024 27:12


Join Matt and Nick as they break down this week's key market event; the spot Ethereum ETF applications; the SEC's potential lawsuit against Robinhood; Donald Trump's pro-crypto speech; memecoin prices soaring; altcoin updates; and more.
Key Takeaways:
- The market continues to trade sideways ahead of this week's inflation data in the U.S. Consensus forecast is 3.4%, which would be slightly lower than the 3.5% recorded last month.
- The SEC flagged its intent to sue Robinhood in relation to its crypto offerings. Unlike previously, Robinhood plans to defend a would-be lawsuit.
- Bitcoin ETFs saw net inflows for the first time in five weeks. A major asset manager in Boston was revealed to hold roughly $360M worth of bitcoin ETFs.
- The spot Ethereum ETF application is still widely expected to be denied. The SEC must approve or deny VanEck's application by May 23.
- U.S. presidential candidate Donald Trump spoke positively about crypto, contrasting the views held by the Biden administration. Time will tell whether he is just trying to win votes.
- 'Roaring Kitty', a key figure behind the 'meme stock' mania of 2021, posted on social media after a long hiatus. This caused memecoins to rally.
- Among the altcoin updates covered: Infinex announces 30-day campaign; EigenLayer opens token claim window; and Botanix Labs secures funding.

Capitalism.com with Ryan Daniel Moran
How To Build A 9-Figure Brand In 5 Years Or Less ($100M+ Case Study)


Play Episode Listen Later May 13, 2024 42:01


Tara Bosch started Smart Sweets at the age of 21 and grew the company to over 9 figures/year in sales in just four years. At 25, she sold a majority stake in the company for $360M. This year, she came to Cap Con 7 (The Capitalism Conference) to share with our audience exactly how she did it.    She followed the playbook that I outline in my book 12 Months To $1M and scaled far beyond the million dollar mark to multi 9 figures.   In this fireside, we talk through the biggest steps to scaling to a 9 figure valuation.   To learn more about creating your own path to $1 million sign up for our FREE 30 day mini series at:   http://www.Capitalism.com/Playbook   Connect with me on Instagram at:   https://Instagram.com/RyanDanielMoran   Timestamps:   (0:00) - Introduction   (0:59) - The First Step To 9 Figures: Belief   (5:40) - The First 500 Fans   (9:50) - Raising Money To Get To 9 Figures   (17:00) - Building The Vision In Your Head   (20:40) - Getting Mentorship   (25:00 )- What's Next?   (28:00) - Advice For 7 Figure Entrepreneurs   (31:00) - How We Can Help You

Simply Bitcoin
El Salvador Launches $360M Bitcoin Treasury | EP 989


Play Episode Listen Later May 13, 2024 74:19


El Salvador launches $360M Bitcoin treasury website SPONSORS ►Bitcoin Well: https://bitcoinwell.com/simplybitcoin ►Passport by Foundation: https://foundation.xyz/simply ► Kaboomracks: https://www.kaboomracks.com ► Stamp Seed: https://www.stampseed.com PROMO CODE: SIMPLY for a 15% discount ► Orange Pill App: https://theorangepillapp.com/en BITCOIN CONFRENCE DISCOUNTS ► Bitcoin 2024: https://b.tc/conference/2024  PROMO CODE: SIMPLY for discount on your tickets! ► Pleb Party Nashville 2024: PlebPartyNashville.com FOLLOW US ► https://twitter.com/SimplyBitcoinTV ► https://twitter.com/BITVOLT7 ► https://twitter.com/Optimistfields ► Nostr: npub1vzjukpr2vrxqg2m9q3a996gpzx8qktg82vnl9jlxp7a9yawnwxfsqnx9gc JOIN OUR TELEGRAM, GIVE US A MEME TO REVIEW! ►https://t.me/SimplyBitcoinTV SUBSCRIBE TO OUR YOUTUBE ►https://bit.ly/3QbgqTQ SUPPORT US ► On-Chain: bc1qpm5j7wsnk46l2ukgpm7w3deesx2mdrzcgun6ms ►Lightning: simplybitcoin@walletofsatoshi.com  #bitcoin #bitcoinnews #simplybitcoin DISCLAIMER: All views in this episode are our own and DO NOT reflect the views of any of our guests or sponsors. Copyright Disclaimer under section 107 of the Copyright Act 1976, allowance is made for "fair use" for purposes such as criticism, comment, news reporting, teaching, scholarship, education and research. If you are or represent the copyright owner of materials used in this video and have a problem with the use of said material, please contact Simply Bitcoin.

Capitalism.com with Ryan Daniel Moran
From $0 To $360M In 4 Years... At 25 Years Old! w/ Tara Bosch


Play Episode Listen Later Mar 25, 2024 48:35


Tara Bosch founded SmartSweets at the age of 21, and sold it 4 years later for $360M. She had a very strong vision and scaled quickly, reaching $125M in sales in just her fourth year in business. What a story, right?   Tara is keynoting The Capitalism Conference this year, so I wanted to sit down with her to break down how she turned SmartSweets into a household name. This is one of the best case studies I've ever seen in how to build a huge brand by building a raving fanbase.   Want to join us at The Capitalism Conference at Austin, TX on April 2024?  Get on the waitlist today at:   https://Capitalism.com/CapCon   To learn more about creating your own path to $1 million sign up for our FREE 30 day mini series at:   http://www.Capitalism.com/Million     Timestamps:   (0:00) - Introduction   (2:00) - Building And Growing SmartSweets   (3:50) - Life After Tara's Exit   (6:30) - Getting Past The First Year   (9:28) - How To Build A Radical Fan Base   (16:00) - Establishing Your Massive Vision   (19:00) - Raising Capital   (22:00) - Finding Mentors And Advisors   (25:00) - Funding Inventory With Debt Capital   (28:00) - Hiring The Right People For Scale   (32:00) - Scaling SmartSweets   (38:00) - Authenticity Wins   (40:00) - Visualizing Success

The Twenty Minute VC: Venture Capital | Startup Funding | The Pitch
20VC: Bending Spoons: The Most Untold Success Story in Startups: Lessons Scaling to 500M Downloads, $360M in Reported 2023 Sales and a $2.55BN Valuation... Bootstrapped with Luca Ferrari, Co-Founder and CEO @ Bending Spoons


Play Episode Listen Later Mar 15, 2024 53:23


Luca Ferrari is Co-Founder and CEO of Bending Spoons, one of the most incredible but untold success stories in startups. Luca has scaled Bending Spoons to 100M monthly active users, $380M in sales in 2023 and aiming to reach $500M in EBITDA by the end of 2026. The company's products include Evernote, Meetup, Remini, and Splice and their products have now been downloaded more than 500M times. In Today's Episode with Luca Ferrari We Discuss: From McKinsey Associate to $2BN Founder What was Luca like as a child? How would his parents have described him? Why did Luca share his McKinsey salary with his co-founders? What were Luca's biggest lessons from his failed startup? Bootstrapping Bending Spoons  Why did Luca decide to bootstrap Bending Spoons? What does Luca think about the EU vs. US startup environment? Why did Luca kill a $7M project? What were his lessons? How did Luca pick his investors? How to Find the Best Talent What are the 3 key traits Luca looks for when picking the best talent? Why does Luca think traditional interview strategies do not work? What tests does Luca conduct for each candidate? What were Luca's biggest hiring mistakes? Mastering Acquisition & Growth How does Luca determine which products to acquire? How does he identify signals? How does Luca approach pricing assets? How does he win every bid? What are Luca's biggest lessons from acquiring Evernote? What key lessons on risk management does Luca wish he'd known 10 years ago? What are Luca's biggest challenges on user acquisition?

Dropping Bombs
Chris Lee. The Road to Success is Not Always a Straight Line. Episode 667 with The Real Brad Lea (TRBL)


Play Episode Listen Later Dec 11, 2023 51:34


Chris Lee is the Founder and Chairman of Solgen Power. Solgen, which employs over 1300 team members, was recently recognized by Financial Times as the 6th fastest-growing privately held company in the Nation. This was all accomplished in under 5.5 years of being in business. Recently, Chris closed on a private equity deal that valued Solgen at $360M, his second 9-Figure exit in the last 12 months. Beyond his business accolades, Chris is first and foremost a family man. He and his wife, Andrea have been married for 17 years and are raising 5 beautiful children on a 23-acre hobby farm in Washington State. Chris is a leader, successful entrepreneur, and member of the Forbes Business Council. He bootstrapped his success in the Solar industry before building an extensive real estate portfolio focused on multi-family & short-term rentals. Chris is most passionate about helping others achieve success and fighting for the causes closest to his heart. That passion led Chris to launch "The Founder Podcast," A top 30 Business show and one of the fastest-growing podcasts in the world.   In this episode, Chris and Brad discuss how the road to success is not always a straight line. Chris discusses what it takes for a company to exit.    Bombs:  “If the path to success isn't originally what you mapped out, it's success as long as you get there.” - Chris Lee Most religions have some sort of truth Artificial pressures entrepreneurs put on themselves Getting to the exit Get creative or change industries     Follow Chris @chrisleeqb Learn more about what Chris does here: Learn more about what Alex is doing https://planesdentalarts.com  Watch the full video episode on Brad's Rumble here: https://rumble.com/c/c-2544182   

Dear Twentysomething
Tara Bosch: Founder of SmartSweets


Play Episode Listen Later Nov 20, 2023 34:40


Tara Bosch made Canadian history by scaling her company to be the fastest growing CPG brand founded by a solo female founder...ever. When a love affair with candy devolved into an unhealthy relationship with food, Tara became inspired by a conversation with her grandmother about the detrimental effects of sugar. She soon dropped out of college to fulfill her dream: to create the future of candy. SmartSweets can now be found in 160,000 stores across the USA & Canada, with partners including Target, Kroger, and Costco. Scaling SmartSweets to be the fastest growing CPG brand founded by a solo female founder in Canadian history, she landed a $360M majority acquisition by TPG capital 4 years after launching the company from her basement.As a leader, Tara is wildly passionate about empowering the next generation of entrepreneurs and believes their big impact-driven visions are needed more than ever before in the world, leading her to create a first-of-its-kind initiative- Bold Beginnings. In addition to fostering connection and tailored support to entrepreneurs with big visions, Tara is working to normalize the feelings of self-doubt, imposter syndrome, and insecurities that all entrepreneurs face, while putting funds back into the ecosystem supporting women entrepreneurs to bring their visions and companies to scale.Publicly, Tara has been recognized for how she beat the statistics of women in entrepreneurship not only by being a solo founder, but maintaining ownership of her brand, keeping gender parity on her board and scaling a team that has always been 80% female-identifying.She is the recipient of several entrepreneurial awards, including being named Canada's Most Powerful Women Top 100 Award, Forbes 30 under 30, Peter Thiel Fellow, and EY Emerging Entrepreneur of the Year.Follow Us!Tara Bosch: @taraboschSmart Sweets: @SmartSweetsErica Wenger: @erica_wengerDear Twentysomething: @deartwentysomething

Female Founder World
This Mindset Tip Helped SmartSweets Hit $360M in Four Years With Founder Tara Bosch


Play Episode Listen Later Sep 11, 2023 38:00


Tara Bosch, the founder of SmartSweets, is on the Female Founder World podcast with Jasmine Garnsworthy! Tara was 22 years old when she started her better-for-you candy company. After only four years of hard work and growing incredibly quickly, she sold the majority of her company for $360M so that they could scale globally.  Tune into Tara's interview on the Female Founder World podcast to learn how she was able to grow SmartSweets leveraging debt, how she landed nationwide distribution so early, and the mindset Tara needed for incredible success.  Links Get the Female Founder World newsletter https://femalefounderworld.beehiiv.com Join our free workshop series, Launch Labs: https://www.femalefounderworld.com/launch-lab Become a Business Bestie subscriber: femalefounderworld.com/subscriber Get our quick case studies on TikTok: www.tiktok.com/@jasgarnsworthy Shop Tara's brand: https://smartsweets.com/ Check out Tara's $25,000 pitch contest: https://www.boldbeginnings.com/

My Biggest Lesson
Godard Abel: Be a Conscious Leader


Play Episode Listen Later Aug 17, 2023 20:15


This week Chris speaks with Godard Abel, the CEO of G2, the leading business software review platform and marketplace, which he co-founded in 2012. G2 recently achieved unicorn status at a $1.1B valuation after raising $257M in capital to fuel global growth. Godard is also Executive Chairman of ThreeKit and Logik.io. Previously, he served as CEO of SteelBrick which was acquired by Salesforce in 2016 for $360M. In this episode, Godard shares how his first company inspired him to create G2, his perspectives on the evolution and differences in the SAAS and tech ecosystems in Colorado compared to Silicon Valley, and how his biggest lesson helped ease the burden on anxiety.Listen now on: Amazon Music (Alexa) | Spotify | Apple Podcasts G2 - https://www.g2.com/Check out more about what we're up to at Range.vc Connect with hosts Adam and Chris and the Range VC team on LinkedIn https://www.linkedin.com/company/range-ventures/See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Progress Texas Happy Hour
Daily Dispatch 8/16/23: Anonymous Planned Parenthood Plaintiff Stands To Rake In $360M, and More


Play Episode Listen Later Aug 16, 2023 9:40


A closer look at the Planned Parenthood lawsuit shows that an anonymous whistleblower working with Ken Paxton stands to take home 20% of any judgement: https://www.forbes.com/sites/darreonnadavis/2023/08/15/what-to-know-about-texas-lawsuit-that-could-cost-planned-parenthood-18-billion/?sh=67fec23f4ae3 ...And that Amarillo, where the case is being heard, itself has no Planned Parenthood clinics: https://apnews.com/article/planned-parenthood-texas-medicaid-6c016b80c0cf76e3b8f9577ad6ea8e69 A lawsuit to block Texas' ban on care for transgender youth has its first day in court: https://www.kut.org/health/2023-08-15/lawsuit-to-block-texas-ban-on-care-for-transgender-youth-has-its-first-day-in-court ...While a school board meeting in deep East Texas over a trans elementary school teacher goes as one might expect: https://www.12newsnow.com/article/news/local/teacher-gender-controversy-divides-kirbyville/502-5d6a01e8-4b5b-40f5-a43d-b4ac2e23444b On the Border: whoda thunk that Greg Abbott would put his deadly border barrier up on the Mexican side? https://www.cbsnews.com/news/texas-floating-border-barrier-technically-in-mexico-survey-finds/ ...While Operation Lone Star-related high-speed DPS chases and crashes are WAY up in El Paso: https://www.ktep.org/ktep-news/2023-08-14/operation-lone-star-leads-to-more-dps-troopers-in-high-speed-pursuits-resulting-in-collisions-in-el-paso ...and Hispanic U.S. 
Congressmen rally the Biden Administration to stop the separation of migrant fathers from their families: https://www.pbs.org/newshour/politics/hispanic-democrats-send-letter-urging-biden-administration-to-investigate-texas-separation-of-migrant-fathers A new state law interfering in Harris County elections is found unconstitutional - and is immediately appealed: https://abcnews.go.com/Politics/wireStory/judge-calls-new-texas-election-law-unconstitutional-state-102296644 A recent Miss Texas winner is running for the Texas House: https://www.expressnews.com/politics/article/miss-texas-house-campaign-18296949.php Dallas attorney Sidney Powell is up on RICO charges in Georgia - but continues hawking her merch: https://www.businessinsider.com/sidney-powell-indictment-georgia-election-conspiracy-theories-defiant-2023-8 Illinois carpetbagger and unconvicted double murderer Kyle Rittenhouse has a new gun-nut org based in Texas: https://www.texastribune.org/2023/08/16/kyle-rittenhouse-texas-foundation/ And there's a fundraiser this weekend for victims of the wildfires in Maui in Bedford: https://www.texasstandard.org/stories/native-hawaiians-north-texas-fundraiser-maui-residents-hawaii-wildfires/ Thanks as always for listening! Learn more about what we do at Progress Texas and how you can help at https://progresstexas.org/.

Expert Talk with TGo
Be the Best Version of Yourself: Achieving Goals Together - Featuring Jennifer Hammond, Valerie David and Dr. Hoa Nguyen


Play Episode Listen Later Aug 7, 2023 28:30


http://ExpertTalk.fm ~ Tips and advice for navigating a successful career in real estate. It explains the importance of effective communication with clients, staying informed and connected to mentors and the local economic environment, and having a positive attitude and support system. It also discusses the potential of syndicated investing in multi-family real estate, the minimum amount required, and the lack of real estate qualifications needed. Finally, it emphasizes the importance of listening carefully and staying open minded when interviewing.Jennifer J. Hammond is a high energy and versatile woman who not only serves as a real estate executive; but also, as a satellite radio talk show host, a best-selling author, and is a member of The Happiness Hall of Fame! As a Vice President of TTR Sotheby'sValerie David, also known as the Pink Hulk, is a three-time cancer survivor and comedian who empowers others facing a cancer diagnosis. She encourages her viewers to take control of their own health, ask for help, and look for joy in life despite the challenges of cancer. Dr. Hoa Nguyen is an entrepreneur, author, business leader, eye doctor, and an accredited real estate investor and syndicator. She is the co-founder of 20/20 Platinum Capital. She has ownership in over $360M in real estate acquisitions and is invested in over 5,300 units as general partners and limited partners.#ExpertTalkWithTGo #ExpertTalkXtra #TalkShow #PodcastToBroadcast #TheresaGoss #ExpertTalkFM #Roku #Pandora #iHeartRADIO #PodNationTV #talkshowtv #talkshowonline #talkshowhost #podcast #motivation #broadcast #listennow #entrepreneur #marketing #TGoTV #9at9 #FastFunInformative #LightsCamerasTakeAction #YurView #Cox

Dan Lok Show
Interview with Chris Lee - From Experiencing Bankruptcy To Selling 9 Figures Businesses


Play Episode Listen Later Jul 3, 2023 58:06


Chris Lee is the Founder and Chairman of Solgen Power. Solgen, who employs over 1300 team members, was recently recognized by Financial Times as the 6th fastest growing privately held company in the Nation. This was all accomplished in under 5.5 years of being in business.When he's not at work, Chris loves spending time with his wife Andrea and their five wonderful children. They live a serene life on their 23-acre hobby farm in Washington State. Known for his dedication to his team and community, as well as his positive influence on others, Chris embodies resilience and determination.Recently, Chris closed on a private equity deal that valued Solgen at $360M, his second 9-Figure exit in the last 12 months.Follow Chris Lee on social media:Spotify- https://open.spotify.com/show/1e0cL2vI1JAtQrojSOA7D2?si=cc6462209b5c4fe8Apple Podcast- https://apple.co/3pjSgicYouTube- https://youtube.com/@thefounderspodcastInstagram @chrisleeqbFacebook.com/chrisleeqbTikTok @chrisleeqb

The Squeaky Clean Energy Podcast
Episode 92: North Carolina Ratepayers Could Save How Much??


Play Episode Listen Later Jun 5, 2023 49:21


Ever since the $8.5 Billion V.C. Summer nuclear plant failure in South Carolina, the region has been exploring ways to better protect ratepayers from utility overreach and unchecked spending. This led to the introduction and passage of a bill in SC to study the benefits of market reform for the state. And the results are in…South Carolina ratepayers could save upwards of $360M a year by joining a regional wholesale market. Earlier this year, NC experienced our own utility challenges with Winter Storm Elliott, which knocked out power to more than 500k customers. These reliability concerns led to the NC General Assembly introducing our own version of a study bill to explore potential savings and reliability improvements that would come with alternative market structures. In fact, some third parties already authored a study that showed NC could save nearly $593M/year by joining an RTO market. On this episode, we discuss the latest developments related to market reform in the southeast and the potential benefits it could bring to ratepayers. We're joined by Chris Carmody of the Carolinas Clean Energy Business Alliance and Reese Rogers of the Clean Energy Buyers Association who break down the latest! Additional resources from today's episode: SC Brattle Study: bit.ly/43LShtR NC Brattle Study: bit.ly/42tAaHU Presented by NC Sustainable Energy Association. Hosted and produced by Matt Abele (Twitter: @MattAbele) Be sure to follow us on Instagram at @squeakycleanpodcast.

Nothing Personal with David Samson
MLB Spring Training: The Mets Steve Cohen Plan is here; Can Cole keep HR down, can Judge keep HR up?; Meyers Leonard's second chance (Episode 759)

Nothing Personal with David Samson

Play Episode Listen Later Feb 21, 2023 49:47


Today's word of the day is 'position' as in Steve Cohen is in position as in 3-5 year plan is here as in MLB and its Economic Reform Committee has been started. What does all of this mean? Do the other owners dislike Steve Cohen? Is the spending going to continue forever? (12:30) Let's move to the Bronx and talk about the New York Yankees. Is Gerrit Cole going to give up 33 homers again? Is Carlos Rodon going to be the secondary star they've needed? What about Aaron Judge and his 9-year/$360M deal? Can the Yankees get it done? (18:40) Meyers Leonard is back in the NBA. Let's do a refresher on why he's been out of the league. Should we be okay with this 2nd chance? Is this the “right” way to do things? (28:30) Review: All That Breathes. (34:55) NPPOD. (35:55) What is going on with John Henry and Liverpool? The team was for sale. Now it isn't for sale. Can it be for sale? Is anyone selling anything? (42:30) Let's finish the day off talking about Ted Lasso. Being sick stinks! I wish it on no one! This section has basically become my diary. Thanks for reading! To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices

KYN Pods
From Brady to Brock | Out of Bounds #9

KYN Pods

Play Episode Listen Later Dec 12, 2022 55:11


Was Tom Brady's Sunday the lowest of his career? Is Aaron Judge being safe in Hawaii with money awaiting him? In the Know Your News sports podcast "Out Of Bounds," Jon Alba and Mia O'Brien talk the highs, the lows, the weird and the wild of everything you need to know about sports from this week! Topics This Week Include: 00:00 Intro/Toast to Grant Wahl 04:34 Tom Brady Loses to Brock Purdy 15:03 Controversial Call Against Jaelan Phillips vs. Chargers 19:43 Baker Mayfield is Back in Prime Time 24:49 Army Black Knights Defeat Navy Midshipmen 30:16 The Dumping Of Mayo at the Duke's Mayo Bowl 38:04 Suns Angered Over Zion Williamson's Showboat Dunk 42:27 Aaron Judge, Yankees Agree to 9-year, $360M deal 46:26 Cristiano Ronaldo Likely Retiring, Portugal Loses to Morocco 51:48 Weekly Wager Visit our website: KnowYourNews.com Send in Superchats for sports moments you'd like to discuss! http://kynchat.com Check out our socials: Facebook: facebook.com/knowyournews TikTok: tiktok.com/@knowyournews Instagram: instagram.com/knowyournewz Twitter: twitter.com/knowyournewz

Skip and Shannon: Undisputed
Full Show (Bucs at 49ers preview, Baker Mayfield active for game against Raiders, Lakers lose without LeBron & AD)

Skip and Shannon: Undisputed

Play Episode Listen Later Dec 8, 2022 125:17


00:00 Buccaneers at 49ers preview, Tom Brady returns home to Bay Area Read more about Tom Brady's return to the Bay Area to play the 49ers HERE 24:20 Is tonight too soon for Rams to start Baker Mayfield at QB? 54:11 Celtics beat Suns 125-98. Was the blowout more about the Celtics or Suns? 1:05:03 Texans at Cowboys preview, Cowboys largest favorites of any game this season 1:16:31 Latest QB grades: 1. Mahomes 2. Allen 3. Tua 4. Burrow (10. Brady) 1:30:19 LeBron and AD both miss game at Raptors, Lakers lose 2nd straight 1:42:12 Aaron Judge agrees to 9-year/$360M deal with Yankees 1:48:47 Trust Russell Westbrook in Sixth Man role with Lakers this season? 1:53:47 What do you expect to see from Baker Mayfield if he plays tonight? Learn more about your ad choices. Visit megaphone.fm/adchoices

Nothing Personal with David Samson
The Judge Effect: San Diego Padres lose out on Judge, but get Xander Bogaerts; What are the Dodgers big plans now!? MLB should be worried! (Episode 712)

Nothing Personal with David Samson

Play Episode Listen Later Dec 8, 2022 48:26


Today's word of the day is 'the Judge Effect' as in Aaron Judge signs for $360M and things start to get out of control. Word came out that the San Diego Padres were in on Judge. Don't think so. BUT! They are spending like they were in on him. Late, late last night Xander Bogaerts agreed to a 10-year deal with the Padres. How are the Padres doing this? Can they afford to do this? (15:20) Let's talk about MLB owners and spending right now. There are billions of dollars being thrown around by owners who are spending with their egos. (21:45) So what are the Dodgers going to be doing now? They lose Trea Turner. The Padres get better. But, the black cloud of Trevor Bauer is holding the team hostage. Can they afford Carlos Correa? A report says they don't want to do it because of the fans! (29:00) Review: The Swimmers. (33:45) So You Wanna Talk To Samson!? Someone asked me about the Chicago Cubs signing Cody Bellinger to a 1-year deal. Was it worth it? Is he betting on a comeback year? (40:00) NPPOD. (42:50) Cristiano Ronaldo news continues to leak out. He was benched in the knockout rounds. He still has a huge offer from Saudi Arabia.  We've made it to the midweek! The weekend is almost here! Let's have a day, people! To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy Learn more about your ad choices. Visit https://podcastchoices.com/adchoices

Bronx Muchachos
Aaron Judge Returns Home | Bronx Muchachos Episode 59

Bronx Muchachos

Play Episode Listen Later Dec 8, 2022 20:32


The Muchachos and the rest of Yankees Universe can breathe easier now! Jon Morosi broke the news from San Diego this morning that Aaron Judge is returning to the New York Yankees on a 9-year pact worth $360M! We will dive into the Judge signing and all the other winter meetings frenzy that has taken place in the last 24 hours! Bronx Muchachos Linktree: https://linktr.ee/realbronxmuchachos Bronx Muchachos Merchandise Store: https://bronx-muchachos-merch.creator-spring.com/ PROMO CODE: BM15 for 15% off your order Seat Geek: https://seatgeek.com Promo Code: BRONXMUCHACHOS save $20 on your purchase LIDS: https://lids.7q8j.net/oe3aMe Use the Promo Code of the day for instant savings! Some exclusions may apply. #NYYankees #MLB #BronxMuchachos --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app

The Steve Warne Project
880: Sens Lose; Poulin is Athlete of Year; Aaron Judge Gets $360M

The Steve Warne Project

Play Episode Listen Later Dec 8, 2022 28:04


The Sens get thumped 5-2 by LA and coach DJ Smith takes the blame afterward; Marie-Philip Poulin is Canada's athlete of the year. Was she the best choice? Aaron Judge gets a king's ransom in New York.

Bustin' Chops & Callin' Shots
Episode 37 - Aaron Judge : Tua & Lamar : Full Week 14 Breakdown

Bustin' Chops & Callin' Shots

Play Episode Listen Later Dec 8, 2022 67:59


In this episode: Aaron Judge gets paid - a 9-year, $360M deal. Baker finds another home and might get playing time on TNF. Lamar Jackson is out and the fate of the Ravens. Full NFL Week 14 Breakdown and Analysis. Give us a call at 833-369-2227. Follow us for more: www.bccssports.com https://www.instagram.com/bustinchopscallinshots https://www.tiktok.com/@bccssports https://www.facebook.com/BustinChopsCallinShots

News/Talk 94.9 WSJM
Spartans snap 2 game skid – Thursday Morning Sports Update

News/Talk 94.9 WSJM

Play Episode Listen Later Dec 8, 2022 3:01


NCAAMBKB – Men's College Basketball Last Night Michigan State 67, Penn State 58 Notre Dame 81, Boston University 75 Robert Morris 71, Central Michigan 66 Illinois State 87, Eastern Michigan 81 Michigan State 67, Penn State 58 – Hoggard scores career-high 23, Michigan St snaps 2-game skid A.J. Hoggard scored a career-high 23 points, Joey Hauser had 12 points and 15 rebounds and Michigan State beat Penn State 67-58 to snap a two-game losing streak. Michigan State avoided going .500 or worse after 10 games for the first time in 18 seasons. Hoggard blocked an open layup with less than a minute to play and Hauser grabbed the rebound before being fouled and making two free throws at the other end for a 66-58 lead. Hoggard, Hauser and Tyson Walker combined for 31 of Michigan State's 32 second-half points. The Michigan State defense allowed only one made field goal in the final five minutes. Penn State was just 1 of 9 from 3-point range in the second half after 7 of 18 before halftime. Notre Dame 81, Boston U. 75 – Ryan scores 21 to help Notre Dame hold off Boston University Cormac Ryan scored 21 points, including five free throws down the stretch, and Notre Dame held off Boston University 81-75. Ryan continued his hot 3-point shooting, making all four of his attempts. In his last three games, Ryan is 14 of 19 from the arc. Trey Wertz added three more treys and had 16 points with JJ Starling adding 15 for Notre Dame, which shot 49% and was even better from 3-point range at 9 of 17 for 53%. Ethan Brittain-Watts made 5 of 10 3-point tries and scored a career-high 19 points to lead the Terriers. Tonight Michigan at Minnesota, 9:00 p.m. News/Talk/Sports 94.9 WSJM 8:30 NCAAMBKB – Michigan's Jaelin Llewellyn out for season with knee injury Michigan point guard Jaelin Llewellyn is out for the rest of the season with an injured left knee. He is expected to have surgery next month. Llewellyn was hurt in a loss to Kentucky in London over the weekend. 
Llewellyn transferred to Michigan from Princeton last spring and that seemed to lead to Frankie Collins transferring to Arizona State. Llewellyn averaged seven points, 3.3 rebounds and 2.8 assists in eight games at Michigan. He was an All-Ivy League player last season and averaged nearly 16 points over three seasons at Princeton. NCAAWBKB – Women's College Basketball Last Night Western Michigan 68, Valparaiso 62 Today Cleveland State at Central Michigan, 11:00 a.m. Toledo at (14) Michigan, 7:00 p.m. (5) Notre Dame at Lafayette, Postponed NCAAWBKB – Notre Dame, Lafayette women postpone game for health reasons No. 5 Notre Dame and Lafayette have postponed a women's basketball game scheduled for Thursday because of health and safety protocols. No other details were immediately available. The game was supposed to be played at Lafayette in Easton, Pennsylvania. The schools say they are looking at whether the game can be rescheduled. The Fighting Irish are still supposed to host Merrimack on Saturday. NFL – National Football League – Week 14 Tonight Las Vegas Raiders at Los Angeles Rams, 8:15 p.m. NFL – McVay: Baker Mayfield likely to be active for Rams vs Vegas Coach Sean McVay says Baker Mayfield probably will be active for the Los Angeles Rams' game against the Raiders on Thursday night. That's just two days after the Rams claimed the former No. 1 draft pick off waivers from Carolina. McVay wouldn't rule out the possibility of Mayfield playing against Las Vegas, but the Rams coach indicated it likely would happen only if John Wolford's persistent neck injury forces struggling Los Angeles to try a desperate solution to its quarterback woes. Mayfield arrived in Los Angeles on Tuesday night to join the Rams. NHL – National Hockey League Tonight Detroit Red Wings at Florida Panthers, 7:30 p.m. 
NBA – National Basketball Association Last Night New Orleans Pelicans 104, Detroit Pistons 98 Chicago Bulls 115, Washington Wizards 111 Minnesota Timberwolves 121, Indiana Pacers 115 Pelicans 104, Pistons 98 – Williamson scores 29, Pelicans hold off Pistons 104-98 Zion Williamson had 29 points and 10 rebounds, and the New Orleans Pelicans held off the Detroit Pistons 104-98 for their fifth straight victory. Trey Murphy III hit four 3s and finished with 20 points for New Orleans, which won for the 10th time in 12 games. Naji Marshall added 17 points, including a put-back that made it 100-94 with 29 seconds left. Saddiq Bey scored 25 points for Detroit. Killian Hayes added 17 points and 12 assists for the Pistons. The Pelicans' winning streak has come without Brandon Ingram because of the star forward's left foot injury. Williamson has averaged 28 points during Ingram's absence. Bulls 115, Wizards 111 – DeRozan, Bulls beat Wizards; Beal out with hamstring strain DeMar DeRozan scored 15 of his 27 points in the fourth quarter, Nikola Vucevic had 25 points and 11 rebounds, and the Chicago Bulls beat the Washington Wizards 115-111. Zach LaVine added 25 points as Chicago bounced back after a 2-4 road trip. Vucevic's basket gave the Bulls a 106-105 lead with 2:50 left, and he then blocked a shot that led to a jumper by DeRozan. After Porzingis missed a 3-pointer, DeRozan had a three-point play to put Chicago ahead 111-105 with 1:43 left. Kristaps Porzingis scored 28 points and Kyle Kuzma added 21 for Washington, which has lost four straight. The Wizards played without leading scorer and three-time All-Star Bradley Beal, who strained his hamstring Sunday in a loss to the Lakers. Beal will be re-evaluated next week. 
T-Wolves 121, Pacers 115 – Russell scores 15 points in 4th, Wolves beat Pacers 121-115 D'Angelo Russell had 15 of his 28 points in the fourth quarter, Anthony Edwards scored 26 points and the Minnesota Timberwolves held on to beat the Indiana Pacers 121-115 on Wednesday night. Rudy Gobert added 16 points and 20 rebounds for Minnesota, which blew a 23-point lead and had to overcome an eight-point deficit. Buddy Hield scored 26 points and hit 7 of 11 from 3-point territory for Indiana. Tyrese Haliburton added 26 points and 15 assists for the Pacers. MLB – AP source: Aaron Judge, Yankees reach $360M, 9-year deal A person familiar with the deal says that Aaron Judge has agreed to return to the New York Yankees on a $360 million, nine-year contract. The person spoke to The Associated Press on condition of anonymity because the deal had not been announced. It's the largest free agent deal in baseball history. Judge will earn $40 million per season, the highest average annual payout for a position player. The contract trails only Mike Trout's $426.5 million deal with the Los Angeles Angels and Mookie Betts' $365 million pact with the Dodgers for biggest in baseball history. MLB – Braves acquire former All-Star reliever Jiménez from Tigers The Atlanta Braves have traded for former All-Star reliever Joe Jiménez, sending two minor leaguers to Detroit to help restock a bullpen now minus Kenley Jansen. The Tigers acquired outfielder Justyn-Henry Malloy and left-hander Jake Higginbotham. The swap at the winter meetings on Wednesday night came after Jansen, who led the NL with 41 saves last season, left the NL East champion Braves and got a $32 million, two-year deal with the Boston Red Sox. Jiménez, who turns 28 next month, was 3-2 with two saves and a 3.49 ERA in 62 games with the Tigers last season, striking out 77 in 56 2/3 innings. The right-hander has spent all six of his seasons in the majors with Detroit and was an All-Star in 2018. 
MLB – AP source: Cardinals, Contreras agree to 5-year contract Willson Contreras is going to St. Louis to replace Yadier Molina at catcher, agreeing to an $87.5 million, five-year contract with the Cardinals. A person familiar with the negotiations confirmed the move to The Associated Press on condition of anonymity because the deal was pending a physical. The addition of Contreras fills a major void for St. Louis, which won the NL Central this season with a 93-69 record. Molina decided to retire after spending his entire 19-year career with the Cardinals, making 10 All-Star teams and winning nine Gold Gloves. MCCAA – Junior College Athletics Last Night Women's Basketball Lake Michigan College 68, Mott Community College 63 The Lady Red Hawks had three players score in double figures in a 68-63 road win over Mott, Arial Ford had a game high 22 points along with 19 rebounds.  Aaliyah Reno and Kalyah Watson scored 10 points each. Southwestern Michigan 79, Henry Ford College 55 Cameron Thomas had a game high 27 points to lead Southwestern Michigan over Henry Ford College 79-55.  Her teammate Macey Laubach had 25. Men's Basketball Mott Community College 87, Lake Michigan College 50 20th ranked Mott out scored Lake Michigan College 45-19 in the second half on the way to an 87-50 win over the Red Hawks. Jailen Campbell scored 13 to lead LMC. AHL – American Hockey League Last Night Grand Rapids Griffins 6, Iowa Wild 2 ECHL – ECHL Hockey League Yesterday Kalamazoo Wings 6, Toledo Walleye 1 MHSAA – High School Sports Last Night Wrestling St. Joseph at Paw Paw Quad Jackson Northwest 48, St. Joseph 24 St. Joseph 51, Paw Paw 30 SAC Meet at South Haven South Haven 48, Delton-Kellogg 27 Martin 54, South Haven 17 Parchment Dual Parchment 54, Coloma 0 Watervliet 42, Parchment 36 Hudsonville Quad Hudsonville 34, Lakeshore 33 Lakeshore 63, East Kentwood 15 Girls Basketball Buchanan 63, Edwardsburg 42 Boys Basketball Kalamazoo Phoenix 65, White Pigeon 40 Today Girls Basketball St. 
Joseph at Battle Creek Central, 7:00 p.m. Lakeshore at Portage Central, 7:00 p.m. Michigan Lutheran at Our Lady of the Lake, 7:00 p.m. Bridgman at Coloma, 6:00 p.m. Portage Northern at Mattawan, 7:00 p.m. Kalamazoo Loy Norrix at Gull Lake, 7:00 p.m. Battle Creek Lakeview at Kalamazoo Central, 7:00 p.m. Climax-Scotts at Mendon, 6:00 p.m. Boys Basketball Countryside Academy at Bloomingdale, 6:00 p.m. Dowagiac at Comstock, 7:00 p.m. Bangor at Wyoming West Michigan Lutheran, 6:00 p.m. Boys Swimming and Diving Kalamazoo Central at St. Joseph, 6:00 p.m. FIFA – 2022 FIFA World Cup – Qatar Friday Quarterfinals Croatia vs. Brazil, 10:00 a.m. Netherlands vs. Argentina, 2:00 p.m. FIFA – After World Cup, US men recede to background for 3 1/2 years The United States men's soccer team recedes into the background of American sports for the next 3 1/2 years. The team's four World Cup matches averaged 12.2 million viewers on Fox but its 27 games on rated English-language networks from the start of 2020 through this fall averaged 668,000. That is according to Nielsen. The U.S. team averaged 2.45 million during the World Cup on Telemundo. That is double its 1.02 million average for 40 matches on Spanish-language networks during the three years ahead of the tournament. That compares with the NFL's average of 17.1 million for the 2021 regular season. See omnystudio.com/listener for privacy information.

First Up with Landsberg & Colaiacovo
Steve Phillips “I think it's good for baseball that Judge stays in New York and plays for the Yankees”

First Up with Landsberg & Colaiacovo

Play Episode Listen Later Dec 8, 2022 12:20


TSN MLB analyst Steve Phillips joins First Up to chat about Aaron Judge staying put in New York after agreeing to a nine-year, $360 million contract with the Yankees. Phillips shares his thoughts on why Judge's mammoth deal was a win for both sides, what to expect from the Blue Jays after a quiet first few days of free agency, how Willson Contreras' deal with the Cardinals affects the catcher market across the MLB landscape, and more.

Swing The Twig
All Rise For The Return of Aaron Judge!

Swing The Twig

Play Episode Listen Later Dec 8, 2022 57:07


Tommy and Anthony talk about Aaron Judge's nine-year, $360 million deal with the New York Yankees. Which club emerged as the surprise third team in the competition for Judge in the final hours? And what's in store for the rest of the Yankees' offseason?

97.5 Y-Country
Spartans snap 2 game skid – Thursday Morning Sports Update

97.5 Y-Country

Play Episode Listen Later Dec 8, 2022 3:01


NCAAMBKB – Men's College Basketball
Last Night
Michigan State 67, Penn State 58
Notre Dame 81, Boston University 75
Robert Morris 71, Central Michigan 66
Illinois State 87, Eastern Michigan 81

Michigan State 67, Penn State 58 – Hoggard scores career-high 23, Michigan St snaps 2-game skid
A.J. Hoggard scored a career-high 23 points, Joey Hauser had 12 points and 15 rebounds and Michigan State beat Penn State 67-58 to snap a two-game losing streak. Michigan State avoided going .500 or worse after 10 games for the first time in 18 seasons. Hoggard blocked an open layup with less than a minute to play and Hauser grabbed the rebound before being fouled and making two free throws at the other end for a 66-58 lead. Hoggard, Hauser and Tyson Walker combined for 31 of Michigan State's 32 second-half points. The Michigan State defense allowed only one made field goal in the final five minutes. Penn State was just 1 of 9 from 3-point range in the second half after 7 of 18 before halftime.

Notre Dame 81, Boston U. 75 – Ryan scores 21 to help Notre Dame hold off Boston University
Cormac Ryan scored 21 points, including five free throws down the stretch, and Notre Dame held off Boston University 81-75. Ryan continued his hot 3-point shooting, making all four of his attempts. In his last three games, Ryan is 14 of 19 from the arc. Trey Wertz added three more treys and had 16 points with JJ Starling adding 15 for Notre Dame, which shot 49% and was even better from 3-point range at 9 of 17 for 53%. Ethan Brittain-Watts made 5 of 10 3-point tries and scored a career-high 19 points to lead the Terriers.

Tonight
Michigan at Minnesota, 9:00 p.m. – News/Talk/Sports 94.9 WSJM 8:30

NCAAMBKB – Michigan's Jaelin Llewellyn out for season with knee injury
Michigan point guard Jaelin Llewellyn is out for the rest of the season with an injured left knee. He is expected to have surgery next month. Llewellyn was hurt in a loss to Kentucky in London over the weekend. Llewellyn transferred to Michigan from Princeton last spring and that seemed to lead to Frankie Collins transferring to Arizona State. Llewellyn averaged seven points, 3.3 rebounds and 2.8 assists in eight games at Michigan. He was an All-Ivy League player last season and averaged nearly 16 points over three seasons at Princeton.

NCAAWBKB – Women's College Basketball
Last Night
Western Michigan 68, Valparaiso 62
Today
Cleveland State at Central Michigan, 11:00 a.m.
Toledo at (14) Michigan, 7:00 p.m.
(5) Notre Dame at Lafayette, Postponed

NCAAWBKB – Notre Dame, Lafayette women postpone game for health reasons
No. 5 Notre Dame and Lafayette have postponed a women's basketball game scheduled for Thursday because of health and safety protocols. No other details were immediately available. The game was supposed to be played at Lafayette in Easton, Pennsylvania. The schools say they are looking at whether the game can be rescheduled. The Fighting Irish are still supposed to host Merrimack on Saturday.

NFL – National Football League – Week 14
Tonight
Las Vegas Raiders at Los Angeles Rams, 8:15 p.m.

NFL – McVay: Baker Mayfield likely to be active for Rams vs Vegas
Coach Sean McVay says Baker Mayfield probably will be active for the Los Angeles Rams' game against the Raiders on Thursday night. That's just two days after the Rams claimed the former No. 1 draft pick off waivers from Carolina. McVay wouldn't rule out the possibility of Mayfield playing against Las Vegas, but the Rams coach indicated it likely would happen only if John Wolford's persistent neck injury forces struggling Los Angeles to try a desperate solution to its quarterback woes. Mayfield arrived in Los Angeles on Tuesday night to join the Rams.

NHL – National Hockey League
Tonight
Detroit Red Wings at Florida Panthers, 7:30 p.m.
NBA – National Basketball Association
Last Night
New Orleans Pelicans 104, Detroit Pistons 98
Chicago Bulls 115, Washington Wizards 111
Minnesota Timberwolves 121, Indiana Pacers 115

Pelicans 104, Pistons 98 – Williamson scores 29, Pelicans hold off Pistons 104-98
Zion Williamson had 29 points and 10 rebounds, and the New Orleans Pelicans held off the Detroit Pistons 104-98 for their fifth straight victory. Trey Murphy III hit four 3s and finished with 20 points for New Orleans, which won for the 10th time in 12 games. Naji Marshall added 17 points, including a put-back that made it 100-94 with 29 seconds left. Saddiq Bey scored 25 points for Detroit. Killian Hayes added 17 points and 12 assists for the Pistons. The Pelicans' winning streak has come without Brandon Ingram because of the star forward's left foot injury. Williamson has averaged 28 points during Ingram's absence.

Bulls 115, Wizards 111 – DeRozan, Bulls beat Wizards; Beal out with hamstring strain
DeMar DeRozan scored 15 of his 27 points in the fourth quarter, Nikola Vucevic had 25 points and 11 rebounds, and the Chicago Bulls beat the Washington Wizards 115-111. Zach LaVine added 25 points as Chicago bounced back after a 2-4 road trip. Vucevic's basket gave the Bulls a 106-105 lead with 2:50 left, and he then blocked a shot that led to a jumper by DeRozan. After Porzingis missed a 3-pointer, DeRozan had a three-point play to put Chicago ahead 111-105 with 1:43 left. Kristaps Porzingis scored 28 points and Kyle Kuzma added 21 for Washington, which has lost four straight. The Wizards played without leading scorer and three-time All-Star Bradley Beal, who strained his hamstring Sunday in a loss to the Lakers. Beal will be re-evaluated next week.

T-Wolves 121, Pacers 115 – Russell scores 15 points in 4th, Wolves beat Pacers 121-115
D'Angelo Russell had 15 of his 28 points in the fourth quarter, Anthony Edwards scored 26 points and the Minnesota Timberwolves held on to beat the Indiana Pacers 121-115 on Wednesday night. Rudy Gobert added 16 points and 20 rebounds for Minnesota, which blew a 23-point lead and had to overcome an eight-point deficit. Buddy Hield scored 26 points and hit 7 of 11 from 3-point territory for Indiana. Tyrese Haliburton added 26 points and 15 assists for the Pacers.

MLB – AP source: Aaron Judge, Yankees reach $360M, 9-year deal
A person familiar with the deal says that Aaron Judge has agreed to return to the New York Yankees on a $360 million, nine-year contract. The person spoke to The Associated Press on condition of anonymity because the deal had not been announced. It's the largest free agent deal in baseball history. Judge will earn $40 million per season, the highest average annual payout for a position player. The contract trails only Mike Trout's $426.5 million deal with the Los Angeles Angels and Mookie Betts' $365 million pact with the Dodgers for biggest in baseball history.

MLB – Braves acquire former All-Star reliever Jiménez from Tigers
The Atlanta Braves have traded for former All-Star reliever Joe Jiménez, sending two minor leaguers to Detroit to help restock a bullpen now minus Kenley Jansen. The Tigers acquired outfielder Justyn-Henry Malloy and left-hander Jake Higginbotham. The swap at the winter meetings on Wednesday night came after Jansen, who led the NL with 41 saves last season, left the NL East champion Braves and got a $32 million, two-year deal with the Boston Red Sox. Jiménez, who turns 28 next month, was 3-2 with two saves and a 3.49 ERA in 62 games with the Tigers last season, striking out 77 in 56 2/3 innings. The right-hander has spent all six of his seasons in the majors with Detroit and was an All-Star in 2018.
MLB – AP source: Cardinals, Contreras agree to 5-year contract
Willson Contreras is going to St. Louis to replace Yadier Molina at catcher, agreeing to an $87.5 million, five-year contract with the Cardinals. A person familiar with the negotiations confirmed the move to The Associated Press on condition of anonymity because the deal was pending a physical. The addition of Contreras fills a major void for St. Louis, which won the NL Central this season with a 93-69 record. Molina decided to retire after spending his entire 19-year career with the Cardinals, making 10 All-Star teams and winning nine Gold Gloves.

MCCAA – Junior College Athletics
Last Night
Women's Basketball
Lake Michigan College 68, Mott Community College 63
The Lady Red Hawks had three players score in double figures in a 68-63 road win over Mott. Arial Ford had a game-high 22 points along with 19 rebounds. Aaliyah Reno and Kalyah Watson scored 10 points each.
Southwestern Michigan 79, Henry Ford College 55
Cameron Thomas had a game-high 27 points to lead Southwestern Michigan over Henry Ford College 79-55. Her teammate Macey Laubach had 25.
Men's Basketball
Mott Community College 87, Lake Michigan College 50
20th-ranked Mott outscored Lake Michigan College 45-19 in the second half on the way to an 87-50 win over the Red Hawks. Jailen Campbell scored 13 to lead LMC.

AHL – American Hockey League
Last Night
Grand Rapids Griffins 6, Iowa Wild 2

ECHL – ECHL Hockey League
Yesterday
Kalamazoo Wings 6, Toledo Walleye 1

MHSAA – High School Sports
Last Night
Wrestling
St. Joseph at Paw Paw Quad
Jackson Northwest 48, St. Joseph 24
St. Joseph 51, Paw Paw 30
SAC Meet at South Haven
South Haven 48, Delton-Kellogg 27
Martin 54, South Haven 17
Parchment Dual
Parchment 54, Coloma 0
Watervliet 42, Parchment 36
Hudsonville Quad
Hudsonville 34, Lakeshore 33
Lakeshore 63, East Kentwood 15
Girls Basketball
Buchanan 63, Edwardsburg 42
Boys Basketball
Kalamazoo Phoenix 65, White Pigeon 40

Today
Girls Basketball
St. Joseph at Battle Creek Central, 7:00 p.m.
Lakeshore at Portage Central, 7:00 p.m.
Michigan Lutheran at Our Lady of the Lake, 7:00 p.m.
Bridgman at Coloma, 6:00 p.m.
Portage Northern at Mattawan, 7:00 p.m.
Kalamazoo Loy Norrix at Gull Lake, 7:00 p.m.
Battle Creek Lakeview at Kalamazoo Central, 7:00 p.m.
Climax-Scotts at Mendon, 6:00 p.m.
Boys Basketball
Countryside Academy at Bloomingdale, 6:00 p.m.
Dowagiac at Comstock, 7:00 p.m.
Bangor at Wyoming West Michigan Lutheran, 6:00 p.m.
Boys Swimming and Diving
Kalamazoo Central at St. Joseph, 6:00 p.m.

FIFA – 2022 FIFA World Cup – Qatar
Friday Quarterfinals
Croatia vs. Brazil, 10:00 a.m.
Netherlands vs. Argentina, 2:00 p.m.

FIFA – After World Cup, US men recede to background for 3 1/2 years
The United States men's soccer team recedes into the background of American sports for the next 3 1/2 years. The team's four World Cup matches averaged 12.2 million viewers on Fox, but its 27 games on rated English-language networks from the start of 2020 through this fall averaged 668,000, according to Nielsen. The U.S. team averaged 2.45 million during the World Cup on Telemundo. That is double its 1.02 million average for 40 matches on Spanish-language networks during the three years ahead of the tournament. That compares with the NFL's average of 17.1 million for the 2021 regular season.

Photo captions:
Penn State's Seth Lundy (1) defends against a shot by Michigan State's A.J. Hoggard (11) during the first half of an NCAA college basketball game Wednesday, Dec. 7, 2022, in State College, Pa. (AP Photo/Gary M. Baranec)
Michigan Wolverines' Jaelin Llewellyn (3) in action during an NCAA basketball game between Michigan Wolverines and Kentucky Wildcats at the O2 Arena, in London, Sunday, Dec. 4, 2022. (AP Photo/Ian Walton)
New Orleans Pelicans forward Zion Williamson (1) drives to the basket between Detroit Pistons forward Saddiq Bey (41) and center Jalen Duren (0) in the first half of an NBA basketball game in New Orleans, Wednesday, Dec. 7, 2022. (AP Photo/Gerald Herbert)
Chicago Bulls' Zach LaVine shoots between Washington Wizards' Daniel Gafford, left, and Kristaps Porzingis during the second half of an NBA basketball game Wednesday, Dec. 7, 2022, in Chicago. The Bulls won 115-111. (AP Photo/Charles Rex Arbogast)

See omnystudio.com/listener for privacy information.

97.5 Y-Country
Spartans snap 2 game skid – WSJM Morning Sports

97.5 Y-Country

Play Episode Listen Later Dec 8, 2022 3:01



SuperHits 103.7 COSY-FM
Spartans snap 2 game skid – Cosy Sports Update

SuperHits 103.7 COSY-FM

Play Episode Listen Later Dec 8, 2022 3:01


Joseph at Battle Creek Central, 7:00 p.m. Lakeshore at Portage Central, 7:00 p.m. Michigan Lutheran at Our Lady of the Lake, 7:00 p.m. Bridgman at Coloma, 6:00 p.m. Portage Northern at Mattawan, 7:00 p.m. Kalamazoo Loy Norrix at Gull Lake, 7:00 p.m. Battle Creek Lakeview at Kalamazoo Central, 7:00 p.m. Climax-Scotts at Mendon, 6:00 p.m. Boys Basketball Countryside Academy at Bloomingdale, 6:00 p.m. Dowagiac at Comstock, 7:00 p.m. Bangor at Wyoming West Michigan Lutheran, 6:00 p.m. Boys Swimming and Diving Kalamazoo Central at St. Joseph, 6:00 p.m. FIFA – 2022 FIFA World Cup – Qatar Friday Quarterfinals Croatia vs. Brazil, 10:00 a.m. Netherlands vs. Argentina, 2:00 p.m. FIFA – After World Cup, US men recede to background for 3 1/2 years The United States men’s soccer team recedes into the background of American sports for the next 3 1/2 years. The team’s four World Cup matches averaged 12.2 million viewers on Fox but its 27 games on rated English-language networks from the start of 2020 through this fall averaged 668,000. That is according to Nielsen. The U.S. team averaged 2.45 million during the World Cup on Telemundo. That is double its 1.02 million average for 40 matches on Spanish-language networks during the three years ahead of the tournament. That compares with the NFL’s average of 17.1 million for the 2021 regular season.See omnystudio.com/listener for privacy information.

SuperHits 103.7 COSY-FM
Spartans snap 2 game skid – Thursday Morning Sports Update

SuperHits 103.7 COSY-FM

Play Episode Listen Later Dec 8, 2022 3:01


NCAAMBKB – Men's College Basketball

Last Night
Michigan State 67, Penn State 58
Notre Dame 81, Boston University 75
Robert Morris 71, Central Michigan 66
Illinois State 87, Eastern Michigan 81

Michigan State 67, Penn State 58 – Hoggard scores career-high 23, Michigan St snaps 2-game skid
A.J. Hoggard scored a career-high 23 points, Joey Hauser had 12 points and 15 rebounds and Michigan State beat Penn State 67-58 to snap a two-game losing streak. Michigan State avoided going .500 or worse after 10 games for the first time in 18 seasons. Hoggard blocked an open layup with less than a minute to play and Hauser grabbed the rebound before being fouled and making two free throws at the other end for a 66-58 lead. Hoggard, Hauser and Tyson Walker combined for 31 of Michigan State's 32 second-half points. The Michigan State defense allowed only one made field goal in the final five minutes. Penn State was just 1 of 9 from 3-point range in the second half after 7 of 18 before halftime.

Notre Dame 81, Boston U. 75 – Ryan scores 21 to help Notre Dame hold off Boston University
Cormac Ryan scored 21 points, including five free throws down the stretch, and Notre Dame held off Boston University 81-75. Ryan continued his hot 3-point shooting, making all four of his attempts. In his last three games, Ryan is 14 of 19 from the arc. Trey Wertz added three more treys and had 16 points with JJ Starling adding 15 for Notre Dame, which shot 49% and was even better from 3-point range at 9 of 17 for 53%. Ethan Brittain-Watts made 5 of 10 3-point tries and scored a career-high 19 points to lead the Terriers.

Tonight
Michigan at Minnesota, 9:00 p.m. (News/Talk/Sports 94.9 WSJM, 8:30)

NCAAMBKB – Michigan's Jaelin Llewellyn out for season with knee injury
Michigan point guard Jaelin Llewellyn is out for the rest of the season with an injured left knee. He is expected to have surgery next month. Llewellyn was hurt in a loss to Kentucky in London over the weekend. 
Llewellyn transferred to Michigan from Princeton last spring and that seemed to lead to Frankie Collins transferring to Arizona State. Llewellyn averaged seven points, 3.3 rebounds and 2.8 assists in eight games at Michigan. He was an All-Ivy League player last season and averaged nearly 16 points over three seasons at Princeton.

NCAAWBKB – Women's College Basketball

Last Night
Western Michigan 68, Valparaiso 62

Today
Cleveland State at Central Michigan, 11:00 a.m.
Toledo at (14) Michigan, 7:00 p.m.
(5) Notre Dame at Lafayette, Postponed

NCAAWBKB – Notre Dame, Lafayette women postpone game for health reasons
No. 5 Notre Dame and Lafayette have postponed a women's basketball game scheduled for Thursday because of health and safety protocols. No other details were immediately available. The game was supposed to be played at Lafayette in Easton, Pennsylvania. The schools say they are looking at whether the game can be rescheduled. The Fighting Irish are still supposed to host Merrimack on Saturday.

NFL – National Football League – Week 14

Tonight
Las Vegas Raiders at Los Angeles Rams, 8:15 p.m.

NFL – McVay: Baker Mayfield likely to be active for Rams vs Vegas
Coach Sean McVay says Baker Mayfield probably will be active for the Los Angeles Rams' game against the Raiders on Thursday night. That's just two days after the Rams claimed the former No. 1 draft pick off waivers from Carolina. McVay wouldn't rule out the possibility of Mayfield playing against Las Vegas, but the Rams coach indicated it likely would happen only if John Wolford's persistent neck injury forces struggling Los Angeles to try a desperate solution to its quarterback woes. Mayfield arrived in Los Angeles on Tuesday night to join the Rams.

NHL – National Hockey League

Tonight
Detroit Red Wings at Florida Panthers, 7:30 p.m. 
NBA – National Basketball Association

Last Night
New Orleans Pelicans 104, Detroit Pistons 98
Chicago Bulls 115, Washington Wizards 111
Minnesota Timberwolves 121, Indiana Pacers 115

Pelicans 104, Pistons 98 – Williamson scores 29, Pelicans hold off Pistons 104-98
Zion Williamson had 29 points and 10 rebounds, and the New Orleans Pelicans held off the Detroit Pistons 104-98 for their fifth straight victory. Trey Murphy III hit four 3s and finished with 20 points for New Orleans, which won for the 10th time in 12 games. Naji Marshall added 17 points, including a put-back that made it 100-94 with 29 seconds left. Saddiq Bey scored 25 points for Detroit. Killian Hayes added 17 points and 12 assists for the Pistons. The Pelicans' winning streak has come without Brandon Ingram because of the star forward's left foot injury. Williamson has averaged 28 points during Ingram's absence.

Bulls 115, Wizards 111 – DeRozan, Bulls beat Wizards; Beal out with hamstring strain
DeMar DeRozan scored 15 of his 27 points in the fourth quarter, Nikola Vucevic had 25 points and 11 rebounds, and the Chicago Bulls beat the Washington Wizards 115-111. Zach LaVine added 25 points as Chicago bounced back after a 2-4 road trip. Vucevic's basket gave the Bulls a 106-105 lead with 2:50 left, and he then blocked a shot that led to a jumper by DeRozan. After Porzingis missed a 3-pointer, DeRozan had a three-point play to put Chicago ahead 111-105 with 1:43 left. Kristaps Porzingis scored 28 points and Kyle Kuzma added 21 for Washington, which has lost four straight. The Wizards played without leading scorer and three-time All-Star Bradley Beal, who strained his hamstring Sunday in a loss to the Lakers. Beal will be re-evaluated next week. 
T-Wolves 121, Pacers 115 – Russell scores 15 points in 4th, Wolves beat Pacers 121-115
D'Angelo Russell had 15 of his 28 points in the fourth quarter, Anthony Edwards scored 26 points and the Minnesota Timberwolves held on to beat the Indiana Pacers 121-115 on Wednesday night. Rudy Gobert added 16 points and 20 rebounds for Minnesota, which blew a 23-point lead and had to overcome an eight-point deficit. Buddy Hield scored 26 points and hit 7 of 11 from 3-point territory for Indiana. Tyrese Haliburton added 26 points and 15 assists for the Pacers.

MLB – AP source: Aaron Judge, Yankees reach $360M, 9-year deal
A person familiar with the deal says that Aaron Judge has agreed to return to the New York Yankees on a $360 million, nine-year contract. The person spoke to The Associated Press on condition of anonymity because the deal had not been announced. It's the largest free agent deal in baseball history. Judge will earn $40 million per season, the highest average annual payout for a position player. The contract trails only Mike Trout's $426.5 million deal with the Los Angeles Angels and Mookie Betts' $365 million pact with the Dodgers for biggest in baseball history.

MLB – Braves acquire former All-Star reliever Jiménez from Tigers
The Atlanta Braves have traded for former All-Star reliever Joe Jiménez, sending two minor leaguers to Detroit to help restock a bullpen now minus Kenley Jansen. The Tigers acquired outfielder Justyn-Henry Malloy and left-hander Jake Higginbotham. The swap at the winter meetings on Wednesday night came after Jansen, who led the NL with 41 saves last season, left the NL East champion Braves and got a $32 million, two-year deal with the Boston Red Sox. Jiménez, who turns 28 next month, was 3-2 with two saves and a 3.49 ERA in 62 games with the Tigers last season, striking out 77 in 56 2/3 innings. The right-hander has spent all six of his seasons in the majors with Detroit and was an All-Star in 2018. 
MLB – AP source: Cardinals, Contreras agree to 5-year contract
Willson Contreras is going to St. Louis to replace Yadier Molina at catcher, agreeing to an $87.5 million, five-year contract with the Cardinals. A person familiar with the negotiations confirmed the move to The Associated Press on condition of anonymity because the deal was pending a physical. The addition of Contreras fills a major void for St. Louis, which won the NL Central this season with a 93-69 record. Molina decided to retire after spending his entire 19-year career with the Cardinals, making 10 All-Star teams and winning nine Gold Gloves.

MCCAA – Junior College Athletics

Last Night

Women's Basketball
Lake Michigan College 68, Mott Community College 63
The Lady Red Hawks had three players score in double figures in a 68-63 road win over Mott. Arial Ford had a game-high 22 points along with 19 rebounds. Aaliyah Reno and Kalyah Watson scored 10 points each.

Southwestern Michigan 79, Henry Ford College 55
Cameron Thomas had a game-high 27 points to lead Southwestern Michigan over Henry Ford College 79-55. Her teammate Macey Laubach had 25.

Men's Basketball
Mott Community College 87, Lake Michigan College 50
20th-ranked Mott outscored Lake Michigan College 45-19 in the second half on the way to an 87-50 win over the Red Hawks. Jailen Campbell scored 13 to lead LMC.

AHL – American Hockey League
Last Night
Grand Rapids Griffins 6, Iowa Wild 2

ECHL – ECHL Hockey League
Yesterday
Kalamazoo Wings 6, Toledo Walleye 1

MHSAA – High School Sports

Last Night

Wrestling
St. Joseph at Paw Paw Quad: Jackson Northwest 48, St. Joseph 24; St. Joseph 51, Paw Paw 30
SAC Meet at South Haven: South Haven 48, Delton-Kellogg 27; Martin 54, South Haven 17
Parchment Dual: Parchment 54, Coloma 0; Watervliet 42, Parchment 36
Hudsonville Quad: Hudsonville 34, Lakeshore 33; Lakeshore 63, East Kentwood 15

Girls Basketball
Buchanan 63, Edwardsburg 42

Boys Basketball
Kalamazoo Phoenix 65, White Pigeon 40

Today

Girls Basketball
St. Joseph at Battle Creek Central, 7:00 p.m.
Lakeshore at Portage Central, 7:00 p.m.
Michigan Lutheran at Our Lady of the Lake, 7:00 p.m.
Bridgman at Coloma, 6:00 p.m.
Portage Northern at Mattawan, 7:00 p.m.
Kalamazoo Loy Norrix at Gull Lake, 7:00 p.m.
Battle Creek Lakeview at Kalamazoo Central, 7:00 p.m.
Climax-Scotts at Mendon, 6:00 p.m.

Boys Basketball
Countryside Academy at Bloomingdale, 6:00 p.m.
Dowagiac at Comstock, 7:00 p.m.
Bangor at Wyoming West Michigan Lutheran, 6:00 p.m.

Boys Swimming and Diving
Kalamazoo Central at St. Joseph, 6:00 p.m.

FIFA – 2022 FIFA World Cup – Qatar

Friday Quarterfinals
Croatia vs. Brazil, 10:00 a.m.
Netherlands vs. Argentina, 2:00 p.m.

FIFA – After World Cup, US men recede to background for 3 1/2 years
The United States men's soccer team recedes into the background of American sports for the next 3 1/2 years. The team's four World Cup matches averaged 12.2 million viewers on Fox but its 27 games on rated English-language networks from the start of 2020 through this fall averaged 668,000. That is according to Nielsen. The U.S. team averaged 2.45 million during the World Cup on Telemundo. That is double its 1.02 million average for 40 matches on Spanish-language networks during the three years ahead of the tournament. That compares with the NFL's average of 17.1 million for the 2021 regular season.

Photo captions:
Penn State's Seth Lundy (1) defends against a shot by Michigan State's A.J. Hoggard (11) during the first half of an NCAA college basketball game Wednesday, Dec. 7, 2022, in State College, Pa. (AP Photo/Gary M. Baranec)
Michigan Wolverines' Jaelin Llewellyn (3) in action during an NCAA basketball game between Michigan Wolverines and Kentucky Wildcats at the O2 Arena, in London, Sunday, Dec. 4, 2022. (AP Photo/Ian Walton)
New Orleans Pelicans forward Zion Williamson (1) drives to the basket between Detroit Pistons forward Saddiq Bey (41) and center Jalen Duren (0) in the first half of an NBA basketball game in New Orleans, Wednesday, Dec. 7, 2022. (AP Photo/Gerald Herbert)
Chicago Bulls' Zach LaVine shoots between Washington Wizards' Daniel Gafford, left, and Kristaps Porzingis during the second half of an NBA basketball game Wednesday, Dec. 7, 2022, in Chicago. The Bulls won 115-111. (AP Photo/Charles Rex Arbogast)

See omnystudio.com/listener for privacy information.

Flippin' Bats with Ben Verlander
BREAKING NEWS: AARON JUDGE HAS SIGNED WITH NEW YORK YANKEES - 9YRS, $360M

Flippin' Bats with Ben Verlander

Play Episode Listen Later Dec 7, 2022 16:05


Ben Verlander does an emergency podcast episode after Aaron Judge has signed with the New York Yankees for $360 million over 9 years. Ben breaks down the timeline of the Yankees' and Judge's offseason. He also discusses how the San Diego Padres came in with a deal for $400 million over 10 years. Judge turned down both the San Francisco Giants and Padres to return to the Yankees. What does this mean for the Yankees moving forward? Learn more about your ad choices. Visit megaphone.fm/adchoices

Pinstripe Strong - Yankees Podcast
EP 266 | Aaron Judge Signs $360M/ 9 Years HES BACK!| Yankees OffSZN VOL.2 | PinstripeStrong Podcast

Pinstripe Strong - Yankees Podcast

Play Episode Listen Later Dec 7, 2022 31:58


#yankees #hotstove #freeagency Aaron Judge signs HUGE DEAL to stay as a Yankee and we are happy! lets goo To join these live streams make sure to follow on Twitch: www.twitch.tv/joezmcfly Join OUR DISCORD for conversations with other Yankee Fans and updates: https://discord.gg/rYCwp4F LIKE COMMENT SUBSCRIBE Follow on Twitter and Instagram: https://twitter.com/PinstripeStrong https://www.instagram.com/pinstripestrong/ Personal Link https://twitter.com/joezmcfly https://www.instagram.com/joezmcfly Listen to our Podcast we record after every series: https://link.chtbl.com/PinstripeStrong Support us in the Merch Store! https://shop.jomboymedia.com/collections/pinstripe-strong Special thanks to Jomboy Media LETS GO YANKEES!

Locked On Giants – Daily Podcast On The San Francisco Giants
SF Giants rejected again as Aaron Judge re-signs with New York Yankees

Locked On Giants – Daily Podcast On The San Francisco Giants

Play Episode Listen Later Dec 7, 2022 27:15


San Francisco Giants fans woke up to the heartbreaking news that Aaron Judge was re-signing with the New York Yankees. It was believed that Judge may actually choose to sign with the Giants. There were even reports yesterday that he had made his decision and he was going to sign with San Francisco. But in the end, the SF Giants got played. Aaron Judge used the Giants to get the best possible offer from the Yankees. That was his right, and it's how the process works, but it stings in an all-too-familiar way for fans of the SF Giants. The Giants' offer was reportedly the same 9/$360M that Judge ultimately secured from the Yanks. But Judge wasn't the Giants' only option. While the market has thinned out considerably, arguably two of the three best players after Judge are still available, and the Giants are expected to pivot to them in a hurry. Carlos Correa, Xander Bogaerts, and Dansby Swanson are still out there, as are Brandon Nimmo, Carlos Rodón, Kodai Senga, and others. The Giants have $40M a year they were willing to spend on Judge, and they're still expected to be very active in bringing in talent this offseason. Correa seems to be the best fit, and he doesn't have loyalty to a team like Judge evidently did with the Yankees. The assumption is that Correa would sign with the highest bidder, but there are no guarantees in free agency. Correa would be a perfectly strong fallback option after Judge. The Giants already signed Mitch Haniger to a three-year deal last night, and Correa would be a great addition as well. The San Francisco Giants are also in the market for another outfielder in addition to Haniger, a starting pitcher, and back-end relief help. In other words, expect the SF Giants to remain very active even after missing out on Judge. Find and follow Locked On Giants on your favorite podcast platforms:

Locked On Giants – Daily Podcast On The San Francisco Giants
SF Giants rejected again as Aaron Judge re-signs with New York Yankees

Locked On Giants – Daily Podcast On The San Francisco Giants

Play Episode Listen Later Dec 7, 2022 31:00


San Francisco Giants fans woke up to the heartbreaking news that Aaron Judge was re-signing with the New York Yankees. It was believed that Judge may actually choose to sign with the Giants. There were even reports yesterday that he had made his decision and he was going to sign with San Francisco. But in the end, the SF Giants got played. Aaron Judge used the Giants to get the best possible offer from the Yankees. That was his right, and it's how the process works, but it stings in an all-too-familiar way for fans of the SF Giants. The Giants' offer was reportedly the same 9/$360M that Judge ultimately secured from the Yanks. But Judge wasn't the Giants' only option. While the market has thinned out considerably, arguably two of the three best players after Judge are still available, and the Giants are expected to pivot to them in a hurry. Carlos Correa, Xander Bogaerts, and Dansby Swanson are still out there, as are Brandon Nimmo, Carlos Rodón, Kodai Senga, and others. The Giants have $40M a year they were willing to spend on Judge, and they're still expected to be very active in bringing in talent this offseason. Correa seems to be the best fit, and he doesn't have loyalty to a team like Judge evidently did with the Yankees. The assumption is that Correa would sign with the highest bidder, but there are no guarantees in free agency. Correa would be a perfectly strong fallback option after Judge. The Giants already signed Mitch Haniger to a three-year deal last night, and Correa would be a great addition as well. The San Francisco Giants are also in the market for another outfielder in addition to Haniger, a starting pitcher, and back-end relief help. In other words, expect the SF Giants to remain very active even after missing out on Judge. Find and follow Locked On Giants on your favorite podcast platforms:

The Spotrac Podcast
Aaron Judge Returns & More MLB Offseason Analysis

The Spotrac Podcast

Play Episode Listen Later Dec 7, 2022 62:20


Aaron Judge returns to the Yankees on a $360M contract through 2031, putting the largest puzzle piece of the MLB offseason in place. Dan Soemann & Mike Ginnitti break down the ripple effects of this deal, Trea Turner's splash in Philly, a few teams with work still to do, & predictions on where the remaining best available players may land.

Locked On Yankees - Daily Podcast On The New York Yankees
BREAKING NEWS | Aaron Judge is back on a nine-year/$360M deal!

Locked On Yankees - Daily Podcast On The New York Yankees

Play Episode Listen Later Dec 7, 2022 25:57


Stacey celebrates the news of Aaron Judge re-signing with the Yankees on a nine-year/$360M deal. Stacey also berates some of the writers for making Tuesday an emotional roller coaster with erroneous tweets about Judge going to the Giants and unsubstantiated rumors that made it seem like he was on his way out west. He was, but it was only to go to San Diego and sign with the Yankees. Please note this was a live show on YouTube and there is some interaction with people who were chatting during the live stream. Support Us By Supporting Our Sponsors! Built Bar: Built Bar is a protein bar that tastes like a candy bar. Go to builtbar.com and use promo code “LOCKEDON15,” and you'll get 15% off your next order. BetOnline: BetOnline.net has you covered this season with more props, odds and lines than ever before. BetOnline – Where The Game Starts! SimpliSafe: With Fast Protect™️ Technology, exclusively from SimpliSafe, 24/7 monitoring agents capture evidence to accurately verify a threat for faster police response. There's No Safe Like SimpliSafe. Visit SimpliSafe.com/LockedOnMLB to learn more. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Ordway, Merloni & Fauria
The Red Sox are starting to get the ball rolling

Ordway, Merloni & Fauria

Play Episode Listen Later Dec 7, 2022 45:02


Hour 1 - We start with the big news coming out of the baseball world today as Aaron “Arson” Judge is heading back to the Yankees on a 9-year $360M deal. Prior to the 2022 season, Aaron Judge declined a 7-year $213.5M deal and bet on himself. He went on to hit 62 home runs in a historic season, and landed himself the massive 9-year deal. Is Aaron Judge's new deal the greatest bet-on-yourself moment in sports? Red Sox sign closer Kenley Jansen. Could the Red Sox and Xander be closing the gap? Mego needs a rant - People are their worst selves at the gym.

Locked On Yankees - Daily Podcast On The New York Yankees
BREAKING NEWS | Aaron Judge is back on a nine-year/$360M deal!

Locked On Yankees - Daily Podcast On The New York Yankees

Play Episode Listen Later Dec 7, 2022 29:42


Stacey celebrates the news of Aaron Judge re-signing with the Yankees on a nine-year/$360M deal. Stacey also berates some of the writers for making Tuesday an emotional roller coaster with erroneous tweets about Judge going to the Giants and unsubstantiated rumors that made it seem like he was on his way out west. He was, but it was only to go to San Diego and sign with the Yankees. Please note this was a live show on YouTube and there is some interaction with people who were chatting during the live stream. Support Us By Supporting Our Sponsors! Built Bar Built Bar is a protein bar that tastes like a candy bar. Go to builtbar.com and use promo code “LOCKEDON15,” and you'll get 15% off your next order. BetOnline BetOnline.net has you covered this season with more props, odds and lines than ever before. BetOnline – Where The Game Starts! SimpliSafe With Fast Protect™️ Technology, exclusively from SimpliSafe, 24/7 monitoring agents capture evidence to accurately verify a threat for faster police response. There's No Safe Like SimpliSafe. Visit SimpliSafe.com/LockedOnMLB to learn more. Learn more about your ad choices. Visit podcastchoices.com/adchoices

The Mike Francesa Podcast
Mike Francesa Reacts: Aaron Judge Returns to Yankees

The Mike Francesa Podcast

Play Episode Listen Later Dec 7, 2022 9:33 Transcription Available


Aaron Judge & the Yankees have agreed on a 9-year, $360M contract that keeps the superstar in NY. Mike Francesa says that it's a "bad contract," but the status quo has been maintained. With Judge onboard, the Yankees still have a lot of work to do to get back to the World Series.

JD Talkin Sports
JD TALKIN SPORTS #1172

JD Talkin Sports

Play Episode Listen Later Dec 7, 2022 43:16


How cool that #bakermayfield is wearing my lucky number with the @rams maybe this will be a match made in heaven. So @morning_blitz is the best.  @georgiafootball #stetsonbennett is 25 years old and in his fifth year at Georgia. @ravens #lamarjackson is also 25 and playing his fifth year in the #nfl crazy.  @yankeesrecaps called it #aaronjudge bet on himself 9-$360M nice raise from the 7-$213.5 before the season.  Love #jeffbrohm going back home to coach @louisvillefb in 2023. @btmcgraw @ukfootball #musiccitybowl should be called #backup bowl. @hawkeyefootball will start a #quarterback who has never thrown a pass in #collegefootball but there's no time like the present.  @njdevils are 21-4-1 and third worst in #nhl in attendance.  Just sad.

Moxie Bets with Katie Mox
World Cup Quarterfinals Preview, Aaron Judge is Back with the Yankees & TNF Picks

Moxie Bets with Katie Mox

Play Episode Listen Later Dec 7, 2022 31:14


Katie Mox welcomes in Action Network's BJ Cunningham to preview the World Cup Quarterfinals. BJ gives his picks for Croatia vs. Brazil, Netherlands vs. Argentina, Morocco vs. Portugal and England vs. France. Plus, Katie reacts to Aaron Judge and the Yankees agreeing to a 9-year, $360M deal, making it the biggest free agent contract of all time. And gives her TNF picks for Raiders and Rams. Learn more about your ad choices. Visit megaphone.fm/adchoices

880 Extras
Aaron Judge reportedly signs $360M contract with Yankees

Dec 7, 2022 · 2:31


First Up with Landsberg & Colaiacovo
Bryan Hayes: “It's the biggest gamble of Dubas' career… and it's certainly paid off to this point without question”

Dec 7, 2022 · 18:38


Host of OverDrive Bryan Hayes joins First Up to chat about the Northern Star Award being handed out today and Matt Murray's standout performance against the Dallas Stars last night. Hayes shares his thoughts on who he thinks is on the shortlist for the Northern Star Award, the promising early returns of Matt Murray since arriving in Toronto, Kyle Dubas deserving credit for moving on from Jack Campbell, Aaron Judge re-signing with the Yankees on a 9-year, $360M deal, and more.

Vanessa and Gallant
12/07/2022 The Paul Gallant Show Hour 1

Dec 7, 2022 · 48:01


Paul kicks things off reacting to Aaron Judge returning to the Yankees for 9 years and $360M, which makes him ask if the Yankees are still the biggest threat to the Astros in the American League. Next, he explains why Mike Vrabel is pulling a Bill O'Brien in Tennessee before breaking down more of the biggest stories from around the NFL in the 10-Minute Drill. Then he wraps things up talking college basketball and his reaction to the latest MLB Hall of Fame voting.

The Rich Eisen Show
REShow: Trent Dilfer/Andrea Savage - Hour 3 (12-6-2022)

Dec 6, 2022 · 48:02


Former NFL QB Trent Dilfer tells Rich why he went from “running away from coaching” to taking the HC job at Alabama-Birmingham, how he plans to use Tom Brady and his Super Bowl ring as big-time recruiting flexes, and says why he never wavered on his belief in Tua Tagovailoa. Actress/comedian Andrea Savage joins Rich in-studio, where she reveals how she reacted to meeting co-star Sylvester Stallone for the first time on the set of ‘Tulsa King,' admits that she's not all that familiar with ‘Step Brothers' even though she was in the comedy classic, and more. Rich reacts to Jets Head Coach Robert Saleh naming Zach Wilson the team's #3 quarterback for the 2nd straight week and weighs in on reports that the Giants have offered Aaron Judge a $360M contract. Learn more about your ad choices. Visit podcastchoices.com/adchoices

JD Talkin Sports
JD TALKIN SPORTS #1171

Dec 6, 2022 · 56:37


Loved him as a #boxingreferee #rip #millslane always a fan.  #kirstiealley too. She was hilarious.  Please @bwmciver @lucygreymciver say it's not true #drakemaye entering the #transferportal & leaving @uncfootball that would be crushing.  @bryantufootball opens 2023 #collegefootball season @allegiantstadium vs @unlvfootball on August 31st. I would love to go.  Just as crushing @uncwomenssoccer giving up 2 goal lead & losing in OT to @uclawsoccer at the #collegecup last night. Great season.  Tough ending.  @frederic.michael @saints blow a 16-3 lead in last six minutes to #tb12 & @buccaneers 17-16 on #mnf last night. #bucs are #bowleligible at 6-6.  #bakermayfield is on his second team this year the @rams and could play #tnf vs @raiders who are on a 3 game win streak. Baker can get some great tape for @xfl & #usfl Spring 2023 seasons.  Very exciting.  @sfgiants reportedly offered #aaronjudge close to $360M.  The suspense is killing me.

4DMBOX.COM
THIS TWITCH STREAMER MADE $360M GAMBLING | TRAINWRECK

Oct 22, 2022 · 18:48


Get Debt Free: https://bit.ly/3w4Rx4n Contact Us & Get Debt Free: http://4dmbox.com Subscribe: https://www.youtube.com/channel/UCxQT5W5soXUo1kkDWH1R_zg?sub_confirmation=1 #news #money #personalfinance --- Send in a voice message: https://anchor.fm/4dmboxcom/message Support this podcast: https://anchor.fm/4dmboxcom/support

Pacers Podcast
Pacers President/COO Mel Raines on $360M in renovations to Gainbridge Fieldhouse

Oct 5, 2022 · 37:30


On this episode of the Fieldhouse Files, Mel Raines, president & chief operating officer of Pacers Sports & Entertainment, makes her first appearance on the show. We discussed the significant improvements made to and around Gainbridge Fieldhouse, what the new setup is like and what to expect. “It's very much a new building that people will be coming back into this fall,” she said. Among the items discussed on this episode:
- Oct. 4, 2022 being 500 days out from the 2024 NBA All-Star Game.
- Notable change at training camp and two names I hear repeatedly.
- How the Pacers will treat the preseason and a few things I'll be watching for.
Then … Mel provides a thorough update on Phase 3 of renovations. (11:30)
- Dates when certain areas will be completed, including the new Kroger SkyDeck.
- How the plaza could be utilized — sunrise yoga, happy hour concerts, farmers market, etc.
- Her path from IU to politics to sports — and now having a significant role.
The Details:
- Total of $360 million spent on upgrades, plus another $18 million in capital expenditures or necessary upgrades.
- First event back: Oct. 2, Post Malone
- Team Store reopens: Oct. 3, plus a new store on the balcony level at games
- Kroger SkyDeck: Targeting the end of November to open
- Bicentennial unity plaza: Projecting to be done May 2023
- Entertainment complex building: End of 2023; restaurant, speakeasy and an entertainment space just east of the plaza.
-----
Follow Scott on Twitter and Instagram, and read his work on fieldhousefiles.com.

Female Startup Club
6 Quick Questions with Tara Bosch, founder of Smartsweets (part 2)

Jun 21, 2022 · 13:06


Today on the show we're joined by Tara Bosch, founder of the wildly successful candy brand, SmartSweets. Since launching in July 2016, SmartSweets' mission to innovate delicious, low-sugar candy you can feel good about has remained the same – Kick Sugar, Keep Candy™. Envisioned as "the future of candy," SmartSweets aims to be a global leader in the sugar reduction movement by tackling one of the largest and most concerning ingredients in our everyday food – sugar. In this episode, we cover Tara's inspiring journey from starting this business in her kitchen right through to selling a major stake in the business for $360m. It's a wild ride and I know you're going to learn so much from Tara! I couldn't believe how she only raised $3M to then go on to proceed with a $360M acquisition. Some seriously impressive stuff. Tara believes there to be so much stigma and idealization around raising as if that almost becomes the celebration. Realistically this can distract from the fact that the money is so you can grow your business, and those are the milestones that you should be celebrating. If you can raise no money that's amazing, it's much more beneficial for everyone. There are some real nuggets of wisdom here that I know will benefit a lot of you. For example, how it may look like entrepreneurs with a certain level of success have it all figured out but that's very rarely the case. Tara thought that when you got to a certain level of success you just have this unwavering confidence that you know what you're doing and you've got it all figured out. That as you scale, this feeling comes automatically. But this just isn't the case. The feeling changes, but you rarely feel like you've got it all figured out. That's just part of the entrepreneurial ride, what keeps you going in some ways, and also something that's good to accept. 
There's not one event that is the silver bullet to success, just an accumulation that keeps that momentum going, and the snowball getting bigger and bigger. But you still have to push that snowball. Stick around for the last part where Tara dives into her key piece of advice for entrepreneurs entering the candy space. For her, having a radical value proposition is one of the reasons at the heart of why SmartSweets scaled so quickly, and why they're continuing to be the market leader. To have a massively successful product and simultaneously better the world, you have to take something that exists and create so much radical value that for people it's reinventing the wheel in a very meaningful way, sparking an emotional response to your product. Make sure you have that radical value proposition. If you love this episode remember to screenshot and share on Instagram stories to help other ears find us.
LINKS WE MENTION: Smartsweets Instagram · Female Startup Club's Instagram · Doone's Instagram · Doone's TikTok · Tim Ferriss' Blog · Peter Thiel's Zero to One · Felix Dennis' How To Get Rich
Learn more about Dymo at Dymo.com · Learn more about Athletic Greens at Athleticgreens.com · Try Zapier for free today at zapier.com/STARTUP · In partnership with Klaviyo, the best email marketing tool for ecommerce businesses.
Female Startup Club's YouTube · Female Startup Club's Private Facebook Group
Say hello to Doone: hello@femalestartupclub.com
Female Startup Club + Clearco: Clear.co/partner/female-star

Female Startup Club
From her kitchen to a $360M acquisition, SmartSweets Founder Tara Bosch shares her inspiring story (part 1)

Jun 20, 2022 · 46:43


Today on the show we're joined by Tara Bosch, founder of the wildly successful candy brand, SmartSweets. Since launching in July 2016, SmartSweets' mission to innovate delicious, low-sugar candy you can feel good about has remained the same – Kick Sugar, Keep Candy™. Envisioned as "the future of candy," SmartSweets aims to be a global leader in the sugar reduction movement by tackling one of the largest and most concerning ingredients in our everyday food – sugar. In this episode, we cover Tara's inspiring journey from starting this business in her kitchen right through to selling a major stake in the business for $360m. It's a wild ride and I know you're going to learn so much from Tara! I couldn't believe how she only raised $3M to then go on to proceed with a $360M acquisition. Some seriously impressive stuff. Tara believes there to be so much stigma and idealization around raising as if that almost becomes the celebration. Realistically this can distract from the fact that the money is so you can grow your business, and those are the milestones that you should be celebrating. If you can raise no money that's amazing, it's much more beneficial for everyone. There are some real nuggets of wisdom here that I know will benefit a lot of you. For example, how it may look like entrepreneurs with a certain level of success have it all figured out but that's very rarely the case. Tara thought that when you got to a certain level of success you just have this unwavering confidence that you know what you're doing and you've got it all figured out. That as you scale, this feeling comes automatically. But this just isn't the case. The feeling changes, but you rarely feel like you've got it all figured out. That's just part of the entrepreneurial ride, what keeps you going in some ways, and also something that's good to accept. 
There's not one event that is the silver bullet to success, just an accumulation that keeps that momentum going, and the snowball getting bigger and bigger. But you still have to push that snowball. Stick around for the last part where Tara dives into her key piece of advice for entrepreneurs entering the candy space. For her, having a radical value proposition is one of the reasons at the heart of why SmartSweets scaled so quickly, and why they're continuing to be the market leader. To have a massively successful product and simultaneously better the world, you have to take something that exists and create so much radical value that for people it's reinventing the wheel in a very meaningful way, sparking an emotional response to your product. Make sure you have that radical value proposition. If you love this episode remember to screenshot and share on Instagram stories to help other ears find us.
LINKS WE MENTION: Smartsweets Instagram · Female Startup Club's Instagram · Doone's Instagram · Doone's TikTok · Tim Ferriss' Blog · Peter Thiel's Zero to One · Felix Dennis' How To Get Rich
Learn more about Dymo at Dymo.com · Learn more about Athletic Greens at Athleticgreens.com · Try Zapier for free today at zapier.com/STARTUP · In partnership with Klaviyo, the best email marketing tool for ecommerce businesses.
Female Startup Club's YouTube · Female Startup Club's Private Facebook Group
Say hello to Doone: hello@femalestartupclub.com
Female Startup Club + Clearco: Clear.co/partner/female-star

Expert Talk with TGo
Multi-Family Syndication Dr. Hoa Nguyen on Expert Talk 9@9

Jun 8, 2022 · 11:36


http://ExpertTalk.fm (http://ExpertTalk.fm) ~ Hoa Nguyen is an entrepreneur, author, business leader, eye doctor, and an accredited real estate investor and syndicator. She is the co-founder of 20/20 Platinum Capital. She has ownership in over $360M in real estate acquisitions and is invested in over 5,300 units as a general partner and limited partner. She has owned 2 successful eye practices in DFW, named Eye Pieces, for over 10 years and co-founded 20/20 Platinum Capital to help families and busy professionals invest passively in apartment communities. She is a blessed mother to a 7-year-old daughter and proud wife to her life partner for 16 years. Her passion is to travel the world to immerse herself in different cultures, empower the youth, empower women, and contribute back to her community and many non-profit organizations. #ExpertTalkWithTGo #ExpertTalkXtra #TalkShow #PodcastToBroadcast #TheresaGoss #ExpertTalkFM #Roku #Pandora #iHeartRADIO #PodNationTV #talkshowtv #talkshowonline #talkshowhost #podcast #motivation #broadcast #listennow #entrepreneur #marketing #TGoTV #9at9 #FastFunInformative #LightsCamerasTakeAction

The Multifamily Takeoff
Leveraging Mentorship Programs and Strategic Partnering - Hoa Nguyen

Jun 6, 2022 · 45:10


Hoa Nguyen is an entrepreneur, author, business leader, eye doctor, and an accredited real estate investor and syndicator. She is the CEO and co-founder of Blacksteel Investment Group. She has ownership in over $360M in real estate acquisitions and is invested in over 5,300 units as a general partner and limited partner. She has owned 2 successful eye practices in DFW, named Eye Pieces, for over 10 years and co-founded 20/20 Platinum Capital to help families and busy professionals invest passively in apartment communities. She is a blessed mother to a 7-year-old daughter and proud wife to her life partner for 17 years. Her passion is to travel the world to immerse herself in different cultures, empower the youth, empower women, and contribute back to her community and many non-profit organizations. In this episode Hoa talks about how she quickly grew to over 5,000 units in 5 years. She shares her journey of joining a mentorship program and finding partners that complemented her skill sets to co-sponsor deals. This episode will give you insight into the barriers to entry in the multifamily industry and why partnering with people who have experience is valuable in your journey. Connect with Hoa: Website: www.passivewealth23.com Partner with us: www.pac3capital.com Follow the show on Instagram: @themultifamilytakeoff

The Twenty Minute VC: Venture Capital | Startup Funding | The Pitch
20VC: The Most Revealing Breakdown of Unit Economics for Quick Commerce; AOVs, Retention, Delivery Costs and more, Why The Business Model is Different for Emerging Markets & Will This Be a Market of Consolidation or Many Players

Jun 1, 2022 · 52:48


Over the last 10 days, we have seen unprecedented levels of layoffs from some of the biggest quick commerce providers in the world, from Getir to GoPuff to Zapp and Gorillas. Today we dive into the world of quick commerce in emerging markets to uncover what is the same and what is different about the model in emerging markets. Usman Gul is the Founder & CEO @ Airlift, one of the fastest-growing quick commerce providers in the world with core operations in Pakistan. Airlift has raised over $100M in funding from First Round, Josh Buckley, Sam Altman, and 20VC. Ralf Wenzel is the Founder & CEO @ JOKR, a unique provider in the quick commerce market with their dual operations in both the US and LATAM. They are one of the only providers to operate in both emerging and developed economies. To date, JOKR has raised over $288M from Softbank, Balderton, GGV, and Kaszek to name a few. Aadit Palicha is the Founder & CEO @ Zepto; they have taken the Indian quick commerce market by storm since their early days in YC. To date, Aadit has raised over $360M with Zepto from YC, Lachy Groom, Breyer Capital, and Rocket Internet to name a few.
In Today's Episode on Quick Commerce in Emerging Markets You Will Learn:
1.) Emerging Markets vs Developed Economies: Where is Quick Commerce Best? What are the single biggest benefits for quick commerce providers in emerging markets? What are the single biggest challenges of operating quick commerce companies in emerging markets as compared to developed economies? From a cost of goods and delivery perspective, what is the single biggest difference when operating in emerging markets?
2.) Warehouses, Picking and Delivery: The Economics Broken Down: What % of revenue do Zepto, Airlift and JOKR spend on average for new warehouses in mature markets? How does this change over time? How do they select warehouse locations? What % of revenue do picking costs represent for Zepto, Airlift and JOKR? What are some needle-moving things that could reduce picking costs? What % of revenue do delivery costs represent for Zepto, Airlift and JOKR? What levers can improve driver efficiency and make delivery costs more efficient? What % of AOV do Airlift and Zepto charge for delivery? How does Zepto leverage power users to subsidise the delivery costs for newly acquired users? Why does JOKR not agree with charging delivery fees? How does charging delivery fees impact usage, frequency and AOV?
3.) Product Selection and Margins: Which Goods Have The Highest Margins? How do Zepto, Airlift and JOKR select the products they sell? How do the margins differ across different product categories? Why are fruit and vegetables the most important category for all three providers? What other metrics are heavily impacted by large fruit and vegetable spend?
4.) AOV and Customer Spend: What is Good? What is the AOV for Airlift, JOKR and Zepto today? How do new markets compare to more mature markets? What are the drivers of the increase? Why does Zepto not believe that AOV is the right metric to be tracking? Why is gross profit per order the right metric to be tracking?
5.) Additional Business Models: Advertising: How much revenue do JOKR, Airlift and Zepto make from advertising today? What can be done to increase this? How has JOKR been able to scale advertising revenue in such a short space of time? What has worked? What has not worked? How important is advertising revenue to the future sustainability of the business model?
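The episode's argument that gross profit per order, not AOV alone, is the metric to track can be made concrete with a small sketch. Every number and the cost breakdown below are invented for illustration; none of these figures come from the episode or from Airlift, JOKR, or Zepto.

```python
# Gross profit per order for a quick-commerce order, using invented
# illustrative numbers (not figures quoted by any provider in the episode).

def gross_profit_per_order(aov, product_margin_pct, picking_cost,
                           delivery_cost, delivery_fee=0.0, ad_revenue=0.0):
    """Gross profit on a single order.

    aov                -- average order value in dollars
    product_margin_pct -- gross margin on goods sold (0..1)
    picking_cost       -- warehouse labor cost to assemble the order
    delivery_cost      -- cost to get the order to the customer
    delivery_fee       -- fee charged to the customer (0 if delivery is free)
    ad_revenue         -- advertising revenue attributed to the order
    """
    product_margin = aov * product_margin_pct
    return product_margin + delivery_fee + ad_revenue - picking_cost - delivery_cost

# A $25 order at a 30% product margin, $1.50 to pick, $3.50 to deliver,
# with a $2 delivery fee and $0.50 of attributed ad revenue:
profit = gross_profit_per_order(25.0, 0.30, 1.50, 3.50, 2.00, 0.50)
print(f"gross profit per order: ${profit:.2f}")  # 7.50 + 2.00 + 0.50 - 1.50 - 3.50 = $5.00
```

The same AOV with free delivery and no ad revenue yields only $2.50, which is why the hosts treat delivery fees and advertising as levers on per-order profitability rather than tracking AOV in isolation.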

MUSIC REACTIONS AND COMMENTS
NortonLifeLock says it has acquired Germany-based antivirus vendor Avira for $360M in cash, eight months after Avira was acquired for $180M by a PE firm (Ingrid Lunden/TechCrunch)

May 28, 2022 · 0:34


This episode is also available as a blog post: https://feedssoundcloudcomuserssoundcloudusers.wordpress.com/2020/12/07/nortonlifelock-says-it-has-acquired-germany-based-antivirus-vendor-avira-for-360m-in-cash-eight-months-after-avira-was-acquired-for-180m-by-a-pe-firm-ingrid-lunden-techcrunch/ --- Send in a voice message: https://anchor.fm/you-betterknow4/message


Life Science Today
Chimerix, Emergent, SIGA, Merck & Kelun, Inceptor, Laekna, Kriya

May 24, 2022 · 8:20 · Transcription Available


Pox progress, an oncology acquisition, and we look at $360M worth of biotech funding. Find out more at https://LifeScienceTodayPodcast.com
Story References: Chimerix, Emergent, SIGA · Merck & Kelun · Inceptor, Laekna, Kriya
About the Show: Life Science Today is your source for stories, insights, and trends across the life science industry. Expect weekly highlights about new technologies, pharmaceutical mergers and acquisitions, news about the moves of venture capital and private equity, and how the stock market responds to biotech IPOs. Life Science Today also explores trends around clinical research, including the evolving patterns that determine how drugs and therapies are developed and approved. It's news, with a dash of perspective, focused on the life science industry.

Expert Talk with TGo
Understanding investing with Dr. Hoa Nguyen on Expert Talk 9@9

May 11, 2022 · 11:45


http://ExpertTalk.fm (http://ExpertTalk.fm) ~ Hoa Nguyen is an entrepreneur, author, business leader, eye doctor, and an accredited real estate investor and syndicator. She is the co-founder of 20/20 Platinum Capital. She has ownership in over $360M in real estate acquisitions and is invested in over 5,300 units as a general partner and limited partner. She has owned 2 successful eye practices in DFW, named Eye Pieces, for over 10 years and co-founded 20/20 Platinum Capital to help families and busy professionals invest passively in apartment communities. She is a blessed mother to a 7-year-old daughter and proud wife to her life partner for 16 years. Her passion is to travel the world to immerse herself in different cultures, empower the youth, empower women, and contribute back to her community and many non-profit organizations. #ExpertTalkWithTGo #ExpertTalkXtra #TalkShow #PodcastToBroadcast #TheresaGoss #ExpertTalkFM #Roku #Pandora #iHeartRADIO #PodNationTV #talkshowtv #talkshowonline #talkshowhost #podcast #motivation #broadcast #listennow #entrepreneur #marketing #TGoTV #9at9 #FastFunInformative #LightsCamerasTakeAction

The Top Entrepreneurs in Money, Marketing, Business and Life
The Math Behind the $360m Sage Paid to Acquire BrightPearl

Apr 18, 2022 · 19:51


Retail Operating System for Merchants

Expert Talk with TGo
Build your business and create raving fans Dr. Hoa Nguyen on Expert Talk 9@9

Apr 13, 2022 · 12:04


Dr. Hoa Nguyen is an entrepreneur, author, business leader, eye doctor, and an accredited real estate investor and syndicator. She is the co-founder of 20/20 Platinum Capital. She has ownership in over $360M in real estate acquisitions and is invested in over 5,300 units as a general partner and limited partner. She has owned 2 successful eye practices in DFW, named Eye Pieces, for over 10 years and co-founded 20/20 Platinum Capital to help families and busy professionals invest passively in apartment communities. She is a blessed mother to a 7-year-old daughter and proud wife to her life partner for 16 years. Her passion is to travel the world to immerse herself in different cultures, empower the youth, empower women, and contribute back to her community and many non-profit organizations. #ExpertTalkWithTGo #ExpertTalkXtra #TalkShow #PodcastToBroadcast #TheresaGoss #ExpertTalkFM #Roku #Pandora #iHeartRADIO #PodNationTV #talkshowtv #talkshowonline #talkshowhost #podcast #motivation #broadcast #listennow #entrepreneur #marketing #TGoTV #9at9 #FastFunInformative #LightsCamerasTakeAction

PitchIt
Weekly News Roundup - December 16, 2021

Dec 17, 2021 · 32:53


This week's list of top stories in fintech:
Institutional Bitcoin Broker NYDIG Valued at $7B in Whopping $1B Funding Round
- Largest funding round in crypto history
- Led by Westcap, a lot of demand; other investors are mainly TradFi: Affirm, FIS, Fiserv, MassMutual, Morgan Stanley and New York Life
- Raised $200M in March of this year
- May not hold the record for long as FTX is raising a mega round
- NYDIG = crypto banking, entirely BTC
Anchorage Digital raises $350 million in funding round led by KKR
- $350M Series D at a $3B valuation
- Huge syndicate of TradFis: Goldman Sachs, Alameda Research, Andreessen Horowitz, Apollo credit funds, funds and accounts managed by BlackRock, Blockchain Capital, Delta Blockchain Fund, Elad Gil, GIC, GoldenTree Asset Management, Innovius Capital, Kraken, Lux Capital, PayPal Ventures, Senator Investment Group, Standard Investments, Thoma Bravo and Wellington Management
- Services are perhaps more ETH centric, including governance
- Institutional crypto custody has momentum
MoneyLion agrees to acquire Even Financial for $440M, sees accretion
- $360M upfront payment: $15M in cash, $345M in preferred shares that are convertible into 34.5M MoneyLion common shares at $10/share
- $80M earn-out, payable up to 8M in preferred shares valued at $10 per share
- Current price is $3.75; it's hard to read whether or not the preferred shares are actually worth anything or if they are like underwater stock options.
Fintech, CDFI offer small-dollar loan with new twist
- Small dollar loans from a CDFI, NAAC Finance, in partnership with Asenso Finance
- Low interest: 3.5% - 20%, up to $2,500
- Financial education is a
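The MoneyLion/Even Financial terms in the roundup can be sanity-checked with simple arithmetic; the share counts and per-share values below are taken from the item as written, not independently verified.

```python
# Sanity-check the MoneyLion / Even Financial deal terms as stated above.
cash = 15_000_000             # $15M in cash
pref_shares = 34_500_000      # preferred convertible into 34.5M common shares
share_price = 10              # valued at $10/share

upfront = cash + pref_shares * share_price
assert upfront == 360_000_000  # matches the $360M upfront payment

earnout_shares = 8_000_000    # up to 8M preferred shares
earnout = earnout_shares * share_price
assert earnout == 80_000_000   # matches the $80M earn-out

total = upfront + earnout
print(f"total consideration: ${total / 1e6:.0f}M")  # up to $440M, the headline number
```

The item's skepticism also follows from these numbers: with MoneyLion common trading at $3.75 rather than the $10 conversion price, the preferred shares are valued well above what conversion would currently fetch.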

Leverage
Building a $360m+ Business While Beating Cancer with Dave Woodward

Dec 17, 2021 · 51:20


Dave Woodward is a world-class marketing expert, the CEO of ClickFunnels, and a stage 4 brain cancer survivor. Earlier this year, Dave was told he had a month to live, yet he's on the podcast today, cancer-free and on his way to a full recovery. Thanks to his never-ending optimism, world-class doctors, and a strict health regimen, Dave was able to beat cancer in just three months and get back to running one of the most successful funnel builders in the world. In this episode, Nick and Dave discuss Dave's history with cancer, from diagnosis to becoming cancer-free, and how it has affected his outlook on life. Dave explains some of the health tricks he used to help his body overcome cancer and how they can also help people in their everyday lives. Nick and Dave then get into business talk, discussing some of the new features coming to ClickFunnels, what is and isn't working in digital marketing right now, and how to build meaningful relationships with your customers through the right communication channels.

TheTop.VC
$360M+ AUM, Neotribe Ventures's Managing Director, Aditya Singh: Super SEs

Apr 7, 2021 · 5:33


"Super SEs" (highly technical sales engineers) are replacing traditional sales positions. #customerempathy

The Success and Ideas Podcast
Are Traasdahl - Good Business and Doing Good

Feb 16, 2021 · 27:50


Are Traasdahl is no stranger to success. He has founded and built up multiple companies, including Tapad, a marketing tech company, which he sold for $360M in 2016. He tells Richard about how he got from a small town in Norway to New York, how he nurtures business ideas, and his current focus on eliminating food waste in the world through technological innovation. You can find Are Traasdahl on Twitter and learn more about his company Crisp here. Are mentions "customer concentration" in the episode. It means: If a significant percentage of a company's overall revenue relies on a single customer, or a small group of customers, that business may be considered to have a high level of customer concentration. (source) This is an Earshot Strategies production. Get in touch if you're looking to make a podcast!
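The customer concentration definition quoted above reduces to a revenue-share calculation; a minimal sketch, with invented customer names and revenue figures:

```python
# Customer concentration: the share of total revenue attributable to each
# customer. Names and revenue figures below are invented for illustration.

def concentration(revenue_by_customer):
    total = sum(revenue_by_customer.values())
    return {name: rev / total for name, rev in revenue_by_customer.items()}

revenue = {"BigCo": 600_000, "MidCo": 250_000, "SmallCo": 150_000}
shares = concentration(revenue)
# BigCo supplies 60% of revenue, a high level of customer concentration.
print({name: f"{share:.0%}" for name, share in shares.items()})
```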

First Pitches with Lolita Taub and Eric Bahn
5. Beating the Odds to the Tune of 360M Dollars w/Elias Torres of Drift

Feb 4, 2021 · 45:52


How do you define the American dream? For most, it's the belief that anyone, regardless of where they were born or what class they were born into, can attain their own version of success in a society where upward mobility is possible for everyone. But for others, it's a pipe dream, a fairy tale, a false hope. Because no matter how hard they work or how much time and effort they put into something, it is virtually impossible for them to achieve success. And that's due to the country and societal class they were born into; there are too many barriers to upward mobility. But the silver lining in all this is that it builds grit, and when grit meets opportunity, well, you've got a force of nature. On this episode of First Pitches, we talk with an entrepreneur who became just that. Elias Torres is the co-founder and CTO of Drift, a conversational marketing platform valued at over 360M dollars. Elias arrives in the United States in 1993 as a poor 17-year-old from Nicaragua; his family home was destroyed the year before. But with this new opportunity, he hustles his way into a college scholarship, works at IBM, and at the start of the Great Recession upends his security to follow his dream of starting a startup. And even then, success isn't guaranteed. He fails again and again until he finally lands on an idea that would become Drift. Drift and Elias's overnight success has been 10 years in the making, and his story will leave you speechless. Listen to find out why. Get tactical tips on how to master your first pitch. Sign up for our newsletter at www.firstpitches.com

Tech Without Borders by DojoLIVE!
Data Streaming and the End of the Data Broker Model

Dec 9, 2020 · 30:26


Can a disruptive new approach remove the pain of buying and selling data? View the full video interview here. Nick Jordan founded Narrative in 2016 after spending nearly a decade in data related product management roles because he saw there was a significant opportunity in the marketplace to create a platform that eliminates inefficiencies in data transactions. Participants in the data economy need access to readily available data assets, and tools to help them manage their data business. Nick and the team at Narrative have developed an offering that gives data practitioners full control of their data strategy by allowing them to source, evaluate, buy, process, and activate new data supply in a matter of hours. Nick and his company have just announced the launch of a new category: Data Streaming, which effectively replaces the broken data broker industry model with a transformative solution. Prior to Narrative, Nick led Product + Strategy at Tapad where he helped evolve the company from a media business into a data and technology licensing business before its acquisition by Telenor for $360M in 2016. Before joining Tapad, Nick ran Product Management at Demdex, the industry’s first data management platform (DMP) prior to its acquisition by Adobe in 2011. He also held roles at Yahoo! running pricing and yield management for newly acquired assets like Right Media.

Exposure Ninja Digital Marketing Podcast | SEO, eCommerce, Digital PR, PPC, Web design and CRO
#178: Lessons Learnt From Peloton’s $360M Marketing Strategy

Dec 7, 2020 · 33:15


Learn the success secrets of super fitness brand Peloton’s rapid growth that can be applied to your business. In this episode, also find out where Peloton are missing key opportunities with their digital marketing strategy and how you can avoid making the same mistakes. Show notes: https://exposureninja.com/podcast/178/ Get a free marketing review @ https://exposureninja.com/review/

Mixergy - Startup Stories with 1000+ entrepreneurs and businesses

Today’s guest has a track record that she could have lived off of for the rest of her life, including a $360M exit. She didn’t do that though. She kept building; she kept pulling together teams and addressing problems. I want to find out how she does it and about her newest company. Anu Shukla is the co-founder of Botco.ai, a conversational marketing platform enabling meaningful and intelligent conversations between businesses and their customers. Sponsored by Toptal – Toptal is a global network of top talent in business, design, and technology that enables companies to scale their teams, on demand. Toptal serves thousands of clients, including Fortune 500 companies and innovative startups, delivering expertise and world-class solutions at an unparalleled success rate. With elite freelancers in over 100 countries, Toptal connects the world’s top talent with leading companies in days, not weeks. Plus, every new engagement begins with a no-risk trial period, so clients only pay if satisfied with the work. Get started hiring with Toptal today. HostGator – Ready to take your website to the next level? Whether you’re a first-time blogger or an experienced web pro, HostGator has all the tools you need to create a great-looking website or online store. A wide range of options includes cloud-based web hosting, reseller hosting, VPS hosting and dedicated servers. Founded in 2002, HostGator is the perfect web partner for business owners and individuals seeking hands-on support. Visit www.hostgator.com/mixergy to see what HostGator can do for your website. More interviews -> https://mixergy.com/moreint Rate this interview -> https://mixergy.com/rateint

Rich Ad Poor Ad
How AdEspresso Makes its $1K Backing for Every Campaign ROI Like Crazy

Rich Ad Poor Ad

Play Episode Listen Later Sep 8, 2020 26:48


Get a “God’s-eye-view” of Facebook ad management with Hootsuite/AdEspresso Facebook Ads Specialist Paul Fairbrother, and find out how he successfully markets to 10,000 customers with $360M in ad spend.

AMFM247 Broadcasting Network
Driven Entrepreneur - Travis Chambers

AMFM247 Broadcasting Network

Play Episode Listen Later Jun 14, 2020 41:36


This week on The Driven Entrepreneur, I'm joined by Travis Chambers, the founder of Chamber Media and a direct-to-consumer video advertising specialist. Travis has been recognized on the Forbes '30 Under 30' list and runs an advertising agency that has been credited with tripling the revenue of four multi-million-dollar companies in the past two years. Travis' YouTube ad "Kobe vs Messi" was recognized as YouTube's #1 Ad of the Decade and has been viewed more than 140 million times. Chamber Media has driven $360M in tracked revenue and 600M views across Facebook and YouTube. Travis has been recognized for his work and has served as a keynote speaker for Google Growth, VidCon, and even the Bill and Melinda Gates Foundation. He's also been highlighted in Entrepreneur, AdWeek, and HuffPo. He's the real deal and he brings a ton of value and experience to the table. Learn More About Travis Chambers and Chamber Media: Visit the Chamber Media Website at: https://www.chamber.media/ Follow Travis Chambers on LinkedIn: https://www.linkedin.com/in/travisallenchambers/ Follow Travis Chambers on Twitter: https://twitter.com/travis_chambers Whether you are new to The Driven Entrepreneur Podcast or are a fan, please don't forget to rate, review and subscribe to the show. Your support and your reviews help this show to attract prolific guests and to provide the best listening experience possible. Also, I love to hear from the fans and listeners. Please share your feedback, guest suggestions, or ideas for show topics with me on social media. Follow Matt Brauning on Social Media: Facebook: https://www.facebook.com/mattbrauning Instagram: https://www.instagram.com/mattbrauning/ Twitter: https://twitter.com/mattbrauning Visit Matt Brauning's Websites: www.mattbrauningpodcast.com www.fireboxbook.com

The Driven Entrepreneur with Matt Brauning
Travis Chambers on Video Advertising for Million Dollar Brands....

The Driven Entrepreneur with Matt Brauning

Play Episode Listen Later Jun 12, 2020 41:31


This week on The Driven Entrepreneur, I'm joined by Travis Chambers, the founder of Chamber Media and a direct-to-consumer video advertising specialist. Travis has been recognized on the Forbes '30 Under 30' list and runs an advertising agency that has been credited with tripling the revenue of four multi-million-dollar companies in the past two years. Travis' YouTube ad "Kobe vs Messi" was recognized as YouTube's #1 Ad of the Decade and has been viewed more than 140 million times. Chamber Media has driven $360M in tracked revenue and 600M views across Facebook and YouTube. Travis has been recognized for his work and has served as a keynote speaker for Google Growth, VidCon, and even the Bill and Melinda Gates Foundation. He's also been highlighted in Entrepreneur, AdWeek, and HuffPo. He's the real deal and he brings a ton of value and experience to the table. Learn More About Travis Chambers and Chamber Media: Visit the Chamber Media Website at: https://www.chamber.media/ Follow Travis Chambers on LinkedIn: https://www.linkedin.com/in/travisallenchambers/ Follow Travis Chambers on Twitter: https://twitter.com/travis_chambers Whether you are new to The Driven Entrepreneur Podcast or are a fan, please don't forget to rate, review and subscribe to the show. Your support and your reviews help this show to attract prolific guests and to provide the best listening experience possible. Also, I love to hear from the fans and listeners. Please share your feedback, guest suggestions, or ideas for show topics with me on social media. Follow Matt Brauning on Social Media: Facebook: https://www.facebook.com/mattbrauning Instagram: https://www.instagram.com/mattbrauning/ Twitter: https://twitter.com/mattbrauning Visit Matt Brauning's Websites: www.mattbrauningpodcast.com www.fireboxbook.com Get a copy of my brand new book, "The Firebox Principle," on Amazon: https://www.amazon.com/Firebox-Principle-Drives-Every-Entrepreneur-ebook/dp/B07FDKK9QW

Frontier Space
Flying UAS with Infinite Airborne Endurance - Ep. 10

Frontier Space

Play Episode Listen Later Jun 10, 2020 20:54


In the tenth episode, Fatema Hamdani, Co-founder and President at Hamdani Aerospace, and our host Kole discuss the technologies and innovations toward designing and operating Ultra-Long Endurance UAS on Earth and Mars. Key highlights include Earth observation applications, biomimicry, nuclear-powered solar cells, planetary atmosphere effects on flight, and more. 60 Seconds in Space: rust can reduce the weight or improve radiation shielding by 30%; Jupiter is so huge, the solar system almost had two suns; erosion of the ozone layer responsible for a mass extinction event 360M yrs ago. --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app Support this podcast: https://anchor.fm/frontierspace/support

Marketing Geeks
Travis Chambers Talks Video Advertising & How He 'Faked' His Way onto the Forbes '30 Under 30' List...

Marketing Geeks

Play Episode Listen Later Mar 27, 2020 71:50


Ep #99 - The Marketing Geeks welcome Travis Chambers, a direct-to-consumer video advertising specialist and the founder of Chamber Media, as our special guest this week. Travis has been recognized on the Forbes '30 Under 30' list and his advertising agency has tripled the revenue of four multi-million-dollar direct-to-consumer companies in the past two years. Travis' YouTube ad "Kobe vs Messi" was recognized as YouTube's #1 Ad of the Decade and has acquired 140 million views. Chamber Media has driven $360M in tracked revenue and 600M views across Facebook and YouTube. Travis has been recognized for his work and has served as a keynote speaker for Google Growth, VidCon, and even the Bill and Melinda Gates Foundation. He's also been highlighted in Entrepreneur, AdWeek, and HuffPo. He's the real deal and he brings a ton of value and experience to the table. In a recent article, Travis wrote about how he 'faked' his way onto the Forbes '30 Under 30' list. We investigate this matter on the show and go deep on video advertising and all things marketing. Enjoy the show! Learn More About Travis Chambers and Chamber Media: Visit the Chamber Media Website at: https://www.chamber.media/ Follow Travis Chambers on LinkedIn: https://www.linkedin.com/in/travisallenchambers/ Follow Travis Chambers on Twitter: https://twitter.com/travis_chambers Follow Travis Chambers on Instagram: https://www.instagram.com/travis_chambers/ Connect & message the Marketing Geeks on LinkedIn: Click to Connect with Justin Womack at: https://www.linkedin.com/in/justinwomack1/ Click to Connect with Andros Sturgeon at: https://www.linkedin.com/in/androssturgeon/ Want to support the show with a donation? 
$0.99/month gets a featured shout-out on the show; $4.99/month gets a shout-out + bonus webinar recordings; $9.99/month gets everything above + a private group Zoom call 1x per month with a show host. Click here to donate & support the show: https://anchor.fm/marketing-geeks/support Want to be a guest on the Marketing Geeks Podcast or suggest someone? Please email us at info@marketinggeekspodcast.com Click here to visit the Marketing Geeks website at: https://marketinggeekspodcast.com --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app · Trainual: Trainual is a software that helps you document what you do, so you can easily delegate and train others. https://trainual.com/freemonth/ --- Send in a voice message: https://anchor.fm/marketing-geeks/message Support this podcast: https://anchor.fm/marketing-geeks/support

Asia InsurTech Podcast
EP 36 – Walter de Oude – CEO Singlife – We Built the Tech and then Sold the Insurance

Asia InsurTech Podcast

Play Episode Listen Later Nov 25, 2019 27:30


In this episode, Michael Waitze talks to Walter de Oude, CEO of Singlife. The Singapore-headquartered neo-insurer, valued at $360M, acquired a license from the MAS in 2017 and began selling life insurance policies after buying Zurich Insurance’s Singapore business. Singlife closed a $90M Series C funding round on 1st July 2019. This announcement was […]

Startup Grind
How to break a monopoly - Eric Yuan (Founder of Zoom) x Jim Scheinman (Maven Ventures)

Startup Grind

Play Episode Listen Later Oct 9, 2019 18:20


Prior to founding Zoom, Eric was Corporate Vice President of Engineering at Cisco, where he was responsible for Cisco's collaboration software development. As one of the founding engineers and Vice President of Engineering at WebEx, Eric was the heart and soul of the WebEx product from 1997 to 2011. Eric proudly grew the WebEx team from 10 engineers to more than 800 worldwide, and contributed to revenue growth from $0 to more than $800M. Eric is a named inventor on 11 issued and 20 pending patents in real-time collaboration. Eric is a graduate of the Stanford University Executive Program. Jim has achieved 5 ‘unicorn’-level successes over the past 20 years as a founder, executive, and investor. His top performers include Zoom ($1B valuation), Cruise ($1B sale to GM), Bebo ($850M sale to AOL), Tango ($1B+ valuation), and NBCi ($6B IPO), plus several other exits including the recent acquisition of Check by Intuit for $360M. Jim is one of the leading growth experts in Silicon Valley, a TED speaker, and a frequent presenter and judge at many startup conferences and events. Jim has a BS in Neuropsychology from Duke University and earned a JD at the University of California Davis School of Law. Jim has startup in his blood. He’s been a serial entrepreneur dating back to his high school and college days.

CruxCasts
Orezone Gold Corp (TSX-V: ORE) - Getting it Right in West Africa and Look at Imminent Re-rate

CruxCasts

Play Episode Listen Later Jul 18, 2019 23:06


Patrick Downey, President and CEO of Orezone Gold Corp (TSX-V: ORE), is very optimistic about what they have and how they can drive this business forward. The technical aspect has been de-risked significantly: they have lots of drill data, a much more simplified mine plan than the one they inherited, a good jurisdiction to work in, a small CapEx, a short time to production, and the company is possibly about to re-rate once the imminent debt component is finalised. Orezone Gold is one we have a lot of comfort with and take great interest in. A great West African gold story, with a very simple and effective approach to mining. The management team has a long track record of making shareholders money by creating meaningful mining businesses. They have increased their NPV by $140M since we last spoke, to $360M. Their project in Burkina Faso is surrounded by large gold companies such as B2Gold, West African Resources, Endeavour and Semafo. They also have large gold investment funds such as RCF, VanEck, Equinox and Sun Valley invested. Company page: https://www.orezone.com/ Make smarter investment decisions, subscribe here: https://www.cruxinvestor.com For FREE unbiased investment information, follow us on Twitter and LinkedIn: https://twitter.com/cruxinvestor https://www.linkedin.com/company/crux-investor/ Take advantage, hear it here first: https://www.youtube.com/CRUXinvestor

Millennial Momentum
#77 – How To Build A Successful Company Through Conscious Leadership | Godard Abel

Millennial Momentum

Play Episode Listen Later Aug 29, 2018 48:02


Mark Cuban is famous for saying “In business, you only need to be right once.” Godard Abel doesn't believe in that. He's already been right twice and is working on his third project now. In 2013 Abel sold his software company, BigMachines, to Oracle for $400M. Then in late 2015 he sold his next company, SteelBrick, to Salesforce.com for $360M. But Abel isn't an overnight success story. BigMachines took nearly a dozen years to build and Abel told me that they missed their sales numbers every year for the first seven years. With only $1.5M left in the bank, he and his team decided to continue onward and the rest is history. Abel mentioned that a major piece of his success has come as a result of mindfulness. He follows Jim Dethmer's Conscious Leadership style, he reads books from Eckhart Tolle, and meditates using Headspace every day. His focus on health and wellness has allowed him to think more clearly and become a more successful leader. And if being right twice is great, a third time would be icing on the cake. That's what Abel is working on right now with his team at G2 Crowd. In this interview, we talk about the impact that mindfulness has had on his life, how to build a great company and the best way you can book a meeting with someone important via email and LinkedIn. Listen Here: iTunes Google Play Stitcher Connect with Godard: G2 Crowd LinkedIn Twitter Sign up for the weekly Millennial Momentum Newsletter. No BS, All hustle

Humans 2.0 Archive
#117 - Keith Barry | World's Leading TV Hypnotist, Mentalist and Brain Hacker

Humans 2.0 Archive

Play Episode Listen Later Aug 20, 2018 22:43


“Mr. Barry is not content merely to perform sleights of hand; he wants his audiences to know how deeply he embraces risk, how very life-affirming careering toward the canyon of eternity can be.” — New York Times. As the world's leading TV hypnotist, mentalist and brain hacker, Keith Barry has been blazing a trail across the globe for many years. His mind-blowing skills have been showcased in over forty international television shows, including his most recent series, You're Back in the Room. The first hypnotism format on TV in the UK for several years, this show garnered huge audiences and proved to be a big prime-time hit for ITV. As well as his own hugely successful US TV series, Deception with Keith Barry, Keith has appeared many times on some of the most prestigious US shows such as The Ellen DeGeneres Show, The Jimmy Kimmel Show and The Conan O'Brien Show. He has also brain-hacked many celebrities including Woody Harrelson, Bono, Nicole Scherzinger, Morgan Freeman and many more. One of Keith's recent projects saw him working as chief mentalist and hypnotist consultant for the blockbuster movie, Now You See Me. Shot in New Orleans and Paris, it starred Woody Harrelson, Morgan Freeman, Jesse Eisenberg, Mark Ruffalo and Michael Caine and grossed over $360M at the box office. His huge success in working extensively with Woody Harrelson, who plays a mentalist, ensured he was hired as chief magic and mentalist consultant for the sequel, Now You See Me – The Second Act. He worked on this for almost a year, through its filming in London and Macau. Keith has written, produced and performed many of his own stage shows in the last fifteen years and has sold out venues in the US, Australia, Canada, Spain, South Africa, the UK and of course his native Ireland. He has also recently presented his keynote speech ‘Mind Magic' at places such as The Cannes Lions Festival of Creativity, The Pendulum Summit and The Dublin Tech Summit. 
Keith's TED Talk has been in the top twenty-five TED Talks since 2008 and it currently boasts over twenty-five million views. A true master of his craft, Keith has gained many awards over the years, including ‘Best Magician in Las Vegas 2009', as voted for by The Las Vegas Review-Journal. He also received the very prestigious ‘Merlin Award' for Mentalist of the Year in 2009, joining the ranks of past recipients including Penn and Teller, Paul Daniels and David Copperfield. Keith's corporate entertainment show is designed to astound and amaze the assembled audience and also have them laughing until their faces hurt. During the show, he will implant thoughts, extract thoughts, and influence the behavior of the unsuspecting audience. No-one is safe! - http://www.keithbarry.com/ - https://www.instagram.com/keithbarryofficial/ - https://twitter.com/KeithpBarry - https://www.facebook.com/keithbarryofficial/ Please do NOT hesitate to reach out to me on Instagram, LinkedIn or via email mark@vudream.com LinkedIn - https://www.linkedin.com/in/mark-metry/ Instagram - https://www.instagram.com/markmetry/ Twitter - https://twitter.com/markymetry Humans 2.0 Twitter - https://twitter.com/Humans2Podcast Medium - https://medium.com/@markymetry Facebook - https://www.facebook.com/mark.metry.9 Mark Metry - https://www.markmetry.com/

Good Day, Sir! Show
Ten Showers

Good Day, Sir! Show

Play Episode Listen Later Jun 20, 2018 104:47


In this episode, we discuss Texas Dreamin' 2018, Elon Musk's rules for meetings, communities sharing model and deployment issues, and the new switch statement in Apex. Elements Data Privacy Manager – Elements.cloud What exactly is Salesforce? We asked San Franciscans if they knew. Salesforce is vulnerable Portugal’s OutSystems raises $360M from KKR and Goldman Sachs big sky dreamin’ 2018 After 20 years of Salesforce, what Marc Benioff got right and wrong about the cloud – TechCrunch

CB Insights - A Conversation with ...
A Conversation with Robinhood Co-Founder & Co-CEO, Vladimir Tenev

CB Insights - A Conversation with ...

Play Episode Listen Later May 11, 2018 39:01


Vlad Tenev, Co-founder & Co-CEO of fintech unicorn Robinhood (which just closed a $360M funding round on May 10), chats with Fortune Magazine's Jen Wieczner about the future of stock trading, cryptocurrencies, and more, with commentary by CB Insights Tech Analyst Lindsay Davis.

The Tech Blog Writer Podcast
328: Why Every Serial Business Builder and Entrepreneur Needs a Startup Squad

The Tech Blog Writer Podcast

Play Episode Listen Later Sep 6, 2017 20:10


Godard Abel, founder and executive chairman of G2 Crowd, a fast-growing tech startup in Chicago, has been building businesses for more than a decade. His first startup, BigMachines, was sold to Oracle for more than $400 million in 2013. His next startup, SteelBrick, raised more than $50 million in less than four years and was acquired by Salesforce for $360M. But none of this would have been possible without the underground network of talent that follows him from one venture to the next. He refers to this core group as his startup squad. He's trained this team on how to grow businesses from the ground up quickly and efficiently. It includes engineers, designers, sales and marketing reps and customer service specialists. Godard shares his insight on how to build this team, with special focus on how to maintain their loyalty through mergers, acquisitions, and IPOs.

Tech Talk Radio Podcast
March 1, 2014 Tech Talk Radio Show

Tech Talk Radio Podcast

Play Episode Listen Later Mar 1, 2014 58:43


Ooma multi-ring (can be used for NoMoRobo), Wi-Fi hotspot virus (Chameleon, infects hotspots with default admin password), receiving large files using Dropbox inbox (dbinbox.com, very convenient), techniques for learning Salesforce (set up free developer account, create a new app using workbook, avoid cert classes, join user group), share PDF flipbook (Youblisher.com, free, very elegant), Profiles in IT (Ali and Hadi Partovi, co-founders Code.org and serial Internet entrepreneurs), Bitcoin exchange MtGox goes dark (security flaw allowed users to withdraw bitcoins more than once, shoddy accounting, over $360M lost), SanDisk releases 128GB microSD (16 cells stacked vertically), Website of the Week (someecard.com, vintage drawings combined with deadpan humor, 7 million unique views a month, crowd-sourced captions), and Windows XP retirement (April 8, end of security updates, expect an attack surge, exploits selling for $50K to $150K). This show originally aired on Saturday, March 1, 2014, at 9:00 AM EST on WFED (1500 AM).
