Podcast appearances and mentions of Alex Atala

  • 55 PODCASTS
  • 70 EPISODES
  • 51m AVG DURATION
  • 1 MONTHLY NEW EPISODE
  • Mar 25, 2025 LATEST
Alex Atala

POPULARITY

(popularity trend chart, 2017-2024)


Best podcasts about Alex Atala

Latest podcast episodes about Alex Atala

Prato Cheio
Tá rolando uma gentrificação alimentar? (Is food gentrification happening?)

Prato Cheio

Mar 25, 2025 · 51:52


Bahia's dendê has been displaced by palm oil from Pará. Pará's açaí has traveled abroad to entertain foreigners. A São Paulo chef sells Amazonian fish and Northeastern beef at gold prices. This episode draws on the theory of urban gentrification to ask: are we facing food gentrification? Our team tries to understand how innovation, gourmetization, and appropriation create a scenario that excludes the poorest people, and those who have always truly valued a given food or culture. After all, is there any way to escape this phenomenon? The full credits, with all sources of information, are available on our website. O Joio and Prato Cheio are maintained with the support of civil society organizations that work to promote adequate and healthy eating. This episode is the result of a grant call from Instituto Serrapilheira to foster cooperation between journalists and scientists. The season is supported by the Heinrich Böll Foundation. ACT Promoção da Saúde, Oak Foundation, the Ford Foundation, Instituto Ibirapitanga, and Instituto Clima e Sociedade are regular supporters of our projects. Join our WhatsApp channel and get closer to our community. We rely on the collaboration of readers and listeners to keep producing independent, quality content. If you can support us financially, all the ways to do so are here. If you can't, share Prato Cheio with family and friends; that helps us a lot!

Roda Viva
RODA VIVA | ALEX ATALA | 13/01/2025

Roda Viva

Jan 14, 2025 · 97:20


Roda Viva interviews chef Alex Atala. The program has a special flavor with the presence of the most famous Brazilian chef in the world, Alex Atala. He has nearly 40 years in the profession, 25 years of the restaurant D.O.M., 15 of Dalva e Dito, and many new challenges, such as earning a third Michelin star. In this edition, the panel of interviewers includes: Arnaldo Lorençato, executive editor of the magazine Veja São Paulo; Larissa Januário, journalist and host of the Sabor & Arte channel; Luiza Fecarotta, journalist, gastronomic curator, and columnist at Rádio CBN; Marcos Nogueira, author of the Cozinha Bruta column in Folha de S. Paulo; and Vivian Mesquita, editor-in-chief of Paladar, at the newspaper O Estado de S. Paulo. Hosted by Vera Magalhães, with real-time illustrations by Luciano Veronezi. Watch the full interview: #TVCultura #RodaViva #AlexAtala #Culinária #Brasil

Rádio BandNews BH
Salve Atala - 19/11/24

Rádio BandNews BH

Nov 19, 2024 · 2:06


Columnist Paulo Navarro comments on chef Alex Atala, the only Brazilian invited by Harvard to a worldwide conference.

Foodness Talks
Alex Atala - 25 anos de aprendizado (25 years of learning) #195

Foodness Talks

Jun 5, 2024 · 45:16


25 years of D.O.M., 15 years of Dalva e Dito, and a career that opened doors for Brazilian gastronomy and for so many people in the industry. In this episode we have the pleasure of welcoming Alex Atala for a light, relaxed chat about business, career, and new dreams.

Desculpa Alguma Coisa
Helena Rizzo: 'Nunca achei que eu fosse ser uma Gisele Bündchen' ('I never thought I'd be a Gisele Bündchen')

Desculpa Alguma Coisa

Apr 17, 2024 · 72:03


A guest of Tati Bernardi on Desculpa Alguma Coisa, chef Helena Rizzo talks about the controversy surrounding the reference Alex Atala made to her during "MasterChef Brasil", says she grew tired of her husband during her pregnancy with her daughter Manuela, and recalls her wilder dating days in São Paulo, when she was a model alongside Fernanda Lima. See omnystudio.com/listener for privacy information.

Latent Space: The AI Engineer Podcast — CodeGen, Agents, Computer Vision, Data Science, AI UX and all things Software 3.0

Speaker CFPs and Sponsor Guides are now available for AIE World's Fair — join us on June 25-27 for the biggest AI Engineer conference of 2024!

Soumith Chintala needs no introduction in the ML world — his insights are incredibly accessible across Twitter, LinkedIn, podcasts, and conference talks (in this pod we'll assume you'll have caught up on the History of PyTorch pod from last year and cover different topics). He's well known as the creator of PyTorch, but he's more broadly the Engineering Lead on AI Infra, PyTorch, and Generative AI at Meta.

Soumith was one of the earliest supporters of Latent Space (and more recently AI News), and we were overjoyed to catch up with him on his latest SF visit for a braindump of the latest AI topics, reactions to some of our past guests, and why Open Source AI is personally so important to him.

Life in the GPU-Rich Lane

Back in January, Zuck went on Instagram to announce their GPU wealth: by the end of 2024, Meta will have 350k H100s. By adding all their GPU clusters, you'd get to 600k H100-equivalents of compute. At FP16 precision, that's ~1,200,000 PFLOPS. If we used George Hotz's (previous guest!) "Person of Compute" measure, Meta now has 60k humans of compute in their clusters. Occasionally we get glimpses into the GPU-rich life; on a recent ThursdAI chat, swyx prompted PaLM tech lead Yi Tay to write down what he missed most from Google, and he commented that UL2 20B was trained by accidentally leaving the training job running for a month, because hardware failures are so rare at Google.

Meta AI's Epic LLM Run

Before Llama broke the internet, Meta released an open source LLM in May 2022, OPT-175B, which was notable for how "open" it was - right down to the logbook! They used only 16 NVIDIA V100 GPUs, and Soumith agrees that, with hindsight, it was likely under-trained for its parameter size.

In Feb 2023 (pre Latent Space pod), Llama was released, with a 7B version trained on 1T tokens alongside 65B and 33B versions trained on 1.4T tokens. The Llama authors included Guillaume Lample and Timothée Lacroix, who went on to start Mistral.

July 2023 was Llama 2 time (which we covered!): 3 model sizes, 7B, 13B, and 70B, all trained on 2T tokens. The three models accounted for a grand total of 3,311,616 GPU hours of pre-training work. CodeLlama followed shortly after, a fine-tune of Llama 2 specifically focused on code generation use cases. The family had models in 7B, 13B, 34B, and 70B sizes, all trained with 500B extra tokens of code and code-related data, except for the 70B, which was trained on 1T.

All of this on top of other open sourced models like Segment Anything (one of our early hits!), Detectron, Detectron 2, DensePose, and Seamless. In one year, Meta transformed from a company people made fun of for its "metaverse" investments into one of the key players in the AI landscape, and its stock has almost tripled since (about $830B in market value created in the past year).
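As a quick sanity check on the compute math above, here is a small Python sketch; the ~2 PFLOPS-per-H100 figure and the 20 PFLOPS "Person of Compute" unit are the approximations implied by the numbers quoted in this post, not official specifications.

```python
# Back-of-the-envelope check of the figures quoted above. Both constants are
# approximations: ~2 PFLOPS of FP16 throughput per H100-equivalent, and
# George Hotz's ~20 PFLOPS "Person of Compute" unit.
H100_EQUIVALENTS = 600_000        # Meta's stated end-of-2024 aggregate fleet
FP16_PFLOPS_PER_GPU = 2.0         # approximate peak per H100-equivalent
PERSON_OF_COMPUTE_PFLOPS = 20.0   # heuristic unit from the earlier episode

total_pflops = H100_EQUIVALENTS * FP16_PFLOPS_PER_GPU
people_of_compute = total_pflops / PERSON_OF_COMPUTE_PFLOPS

print(f"~{total_pflops:,.0f} PFLOPS at FP16")           # ~1,200,000 PFLOPS
print(f"~{people_of_compute:,.0f} humans of compute")   # ~60,000
```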
Why Open Source AI

The obvious question is why Meta would spend hundreds of millions on its AI efforts and then release them for free. Zuck has addressed this in public statements. But for Soumith, the motivation is even more personal:

"I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India… And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for like zero dollars. And I think that was a strong reason why I ended up where I am. So like that, like the open source side of things, I always push regardless of like what I get paid for, like I think I would do that as a passion project on the side…

…I think at a fundamental level, the most beneficial value of open source is that you make the distribution to be very wide. It's just available with no friction and people can do transformative things in a way that's very accessible. Maybe it's open source, but it has a commercial license and I'm a student in India. I don't care about the license. I just don't even understand the license. But like the fact that I can use it and do something with it is very transformative to me…

…Like, okay, I again always go back to like I'm a student in India with no money. What is my accessibility to any of these closed source models? At some scale I have to pay money. That makes it a non-starter and stuff. And there's also the control issue: I strongly believe if you want human aligned AI, you want all humans to give feedback. And you want all humans to have access to that technology in the first place. And I actually have seen, living in New York, whenever I come to Silicon Valley, I see a different cultural bubble."

We like the way Soumith put it last year: Closed AI "rate-limits against people's imaginations and needs"!

What It Takes For Open Source AI to Win

However, Soumith doesn't think Open Source will simply win by popular demand. There is a tremendous coordination problem with the decentralized nature of open source AI development right now: nobody is collecting the valuable human feedback in the way that OpenAI or Midjourney are doing.

"Open source in general always has a coordination problem. If there's a vertically integrated provider with more resources, they will just be better coordinated than open source. And so now open source has to figure out how to have coordinated benefits. And the reason you want coordinated benefits is because these models are getting better based on human feedback. And if you see with open source models, like if you go to the /r/localllama subreddit, like there's so many variations of models that are being produced from, say, Nous Research. I mean, like there's like so many variations built by so many people. And one common theme is they're all using these fine-tuning or human preferences datasets that are very limited and they're not sufficiently diverse. And you look at the other side, say front-ends like Oobabooga or like Hugging Chat or Ollama, they don't really have feedback buttons. All the people using all these front-ends, they probably want to give feedback, but there's no way for them to give feedback… So we're just losing all of this feedback. Maybe open source models are being used as much as GPT is at this point, in a very fragmented way; in aggregate, all the open source models together are probably being used as much as GPT is, maybe close to that. But the amount of feedback that is driving back into the open source ecosystem is negligible, maybe less than 1% of the usage.
So I think like some, like the blueprint here I think is you'd want someone to create a sinkhole for the feedback… I think if we do that, if that actually happens, I think that probably has a real chance of the open source models having a runaway effect against OpenAI, I think like there's a clear chance we can take at truly winning open source."

If you're working on solving open source coordination, please get in touch!

Show Notes

* Soumith Chintala Twitter
* History of PyTorch episode on Gradient Podcast
* The Llama Ecosystem
* Apple's MLX
* Neural ODEs (Ordinary Differential Equations)
* AlphaGo
* LMSys arena
* Dan Pink's "Drive"
* Robotics projects:
* Dobb-E
* OK Robot
* Yann LeCun
* Yangqing Jia of Lepton AI
* Ed Catmull
* George Hotz on Latent Space
* Chris Lattner on Latent Space
* Guillaume Lample
* Yannic Kilcher of OpenAssistant
* LMSys
* Alex Atallah of OpenRouter
* Carlo Sferrazza's 3D tactile research
* Alex Wiltschko of Osmo
* Tangent by Alex Wiltschko
* Lerrel Pinto - Robotics

Timestamps

* [00:00:00] Introductions
* [00:00:51] Extrinsic vs Intrinsic Success
* [00:02:40] Importance of Open Source and Its Impact
* [00:03:46] PyTorch vs TinyGrad
* [00:08:33] Why PyTorch is the Switzerland of frameworks
* [00:10:27] Modular's Mojo + PyTorch?
* [00:13:32] PyTorch vs Apple's MLX
* [00:16:27] FAIR / PyTorch Alumni
* [00:18:50] How can AI inference providers differentiate?
* [00:21:41] How to build good benchmarks and learnings from AnyScale's
* [00:25:28] Most interesting unexplored ideas
* [00:28:18] What people get wrong about synthetic data
* [00:35:57] Meta AI's evolution
* [00:38:42] How do you allocate 600,000 GPUs?
* [00:42:05] Even the GPU Rich are GPU Poor
* [00:47:31] Meta's MTIA silicon
* [00:50:09] Why we need open source
* [00:59:00] Open source's coordination problem for feedback gathering
* [01:08:59] Beyond text generation
* [01:15:37] Osmo and the Future of Smell Recognition Technology

Transcript

Alessio [00:00:00]: Hey everyone, welcome to the Latent Space podcast. This is Alessio, partner and CTO in residence at Decibel Partners, and I'm joined by my co-host Swyx, founder of Smol AI.

Swyx [00:00:15]: Hey, and today we have in the studio Soumith Chintala, welcome.

Soumith [00:00:17]: Thanks for having me.

Swyx [00:00:18]: On one of your rare visits from New York where you live. You got your start in computer vision at NYU with Yann LeCun. That was a very fortuitous start. I was actually listening to your interview on the Gradient podcast. So if people want to know more about the history of Soumith, history of PyTorch, they can go to that podcast. We won't spend that much time there, but I just was marveling at your luck, or I don't know if it's your luck or your drive to find AI early and then find the right quality mentor, because I guess Yann really sort of introduced you to that world.

Soumith [00:00:51]: Yeah, I think you're talking about extrinsic success, right? A lot of people just have drive to do things that they think is fun, and a lot of those things might or might not be extrinsically perceived as good and successful. I think I just happened to like something that is now one of the coolest things in the world or whatever. But if I happen, the first thing I tried to become was a 3D VFX artist, and I was really interested in doing that, but I turned out to be very bad at it. So I ended up not doing that further. But even if I was good at that, whatever, and I ended up going down that path, I probably would have been equally happy.
It's just like maybe like the perception of, oh, is this person successful or not might be different. I think like after a baseline, like your happiness is probably more correlated with your intrinsic stuff.Swyx [00:01:44]: Yes. I think Dan Pink has this book on drive that I often refer to about the power of intrinsic motivation versus extrinsic and how long extrinsic lasts. It's not very long at all. But anyway, now you are an investor in Runway, so in a way you're working on VFX. Yes.Soumith [00:02:01]: I mean, in a very convoluted way.Swyx [00:02:03]: It reminds me of Ed Catmull. I don't know if you guys know, but he actually tried to become an animator in his early years and failed or didn't get accepted by Disney and then went and created Pixar and then got bought by Disney and created Toy Story. So you joined Facebook in 2014 and eventually became a creator and maintainer of PyTorch. And there's this long story there you can refer to on the gradient. I think maybe people don't know that you also involved in more sort of hardware and cluster decision affair. And we can dive into more details there because we're all about hardware this month. Yeah. And then finally, I don't know what else, like what else should people know about you on a personal side or professional side?Soumith [00:02:40]: I think open source is definitely a big passion of mine and probably forms a little bit of my identity at this point. I'm irrationally interested in open source. I think open source has that fundamental way to distribute opportunity in a way that is very powerful. Like, I grew up in India. I didn't have internet for a while. In college, actually, I didn't have internet except for GPRS or whatever. And knowledge was very centralized, but I saw that evolution of knowledge slowly getting decentralized. And that ended up helping me learn quicker and faster for zero dollars. And I think that was a strong reason why I ended up where I am. So the open source side of things, I always push regardless of what I get paid for, like I think I would do that as a passion project on the side.Swyx [00:03:35]: Yeah, that's wonderful. Well, we'll talk about the challenges as well that open source has, open models versus closed models. Maybe you want to touch a little bit on PyTorch before we move on to the sort of Meta AI in general.PyTorch vs Tinygrad tradeoffsAlessio [00:03:46]: Yeah, we kind of touched on PyTorch in a lot of episodes. So we had George Hotz from TinyGrad. He called PyTorch a CISC and TinyGrad a RISC. I would love to get your thoughts on PyTorch design direction as far as, I know you talk a lot about kind of having a happy path to start with and then making complexity hidden away but then available to the end user. One of the things that George mentioned is I think you have like 250 primitive operators in PyTorch, I think TinyGrad is four. So how do you think about some of the learnings that maybe he's going to run into that you already had in the past seven, eight years almost of running PyTorch?Soumith [00:04:24]: Yeah, I think there's different models here, but I think it's two different models that people generally start with. Either they go like, I have a grand vision and I'm going to build a giant system that achieves this grand vision and maybe one is super feature complete or whatever. Or other people say they will get incrementally ambitious, right? 
And they say, oh, we'll start with something simple and then we'll slowly layer out complexity in a way that optimally applies Huffman coding or whatever. Like where the density of users are and what they're using, I would want to keep it in the easy, happy path and where the more niche advanced use cases, I'll still want people to try them, but they need to take additional frictional steps. George, I think just like we started with PyTorch, George started with the incrementally ambitious thing. I remember TinyGrad used to be, like we would be limited to a thousand lines of code and I think now it's at 5,000. So I think there is no real magic to which why PyTorch has the kind of complexity. I think it's probably partly necessitated and partly because we built with the technology available under us at that time, PyTorch is like 190,000 lines of code or something at this point. I think if you had to rewrite it, we would probably think about ways to rewrite it in a vastly simplified way for sure. But a lot of that complexity comes from the fact that in a very simple, explainable way, you have memory hierarchies. You have CPU has three levels of caches and then you have DRAM and SSD and then you have network. Similarly, GPU has several levels of memory and then you have different levels of network hierarchies, NVLink plus InfiniBand or Rocky or something like that, right? And the way the flops are available on your hardware, they are available in a certain way and your computation is in a certain way and you have to retrofit your computation onto both the memory hierarchy and like the flops available. When you're doing this, it is actually a fairly hard mathematical problem to do this setup, like you find the optimal thing. And finding the optimal thing is, what is optimal depends on the input variables themselves. So like, okay, what is the shape of your input tensors and what is the operation you're trying to do and various things like that. Finding that optimal configuration and writing it down in code is not the same for every input configuration you have. Like for example, just as the shape of the tensors change, let's say you have three input tensors into a Sparstar product or something like that. The shape of each of these input tensors will vastly change how you do this optimally placing this operation onto the hardware in a way that will get you maximal throughput. So a lot of our complexity comes from writing out hundreds of configurations for each single PyTorch operator and templatizing these things and symbolically generating the final CUDA code or CPU code. There's no way to avoid it because mathematically we haven't found symbolic ways to do this that also keep compile time near zero. You can write a very simple framework, but then you also should be willing to eat the long compile time. So if searching for that optimal performance at runtime, but that's the trade off. There's no, like, I don't think unless we have great breakthroughs George's vision is achievable, he should be thinking about a narrower problem such as I'm only going to make this for work for self-driving car connets or I'm only going to make this work for LLM transformers of the llama style. Like if you start narrowing the problem down, you can make a vastly simpler framework. 
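To make the point about shape-dependent optimal configurations concrete, here is a toy sketch; this is not PyTorch's actual dispatcher or kernel-selection code, and the heuristics and tile sizes are invented purely for illustration of how one logical operator fans out into many concrete configurations.

```python
# Toy illustration only (not PyTorch internals): the "best" kernel for one
# logical operation depends on input shapes, dtype and device, so a real
# framework ends up templating and auto-tuning many variants per operator.
from dataclasses import dataclass

@dataclass(frozen=True)
class MatmulConfig:
    algorithm: str   # hypothetical algorithm name
    tile: int        # hypothetical tile size

def pick_config(m: int, n: int, k: int, device: str) -> MatmulConfig:
    # Invented heuristics standing in for an auto-tuner's search results.
    if device == "cpu":
        return MatmulConfig("blocked", tile=64)
    if k > 4096 and m * n <= 256 * 256:
        return MatmulConfig("split-k", tile=128)   # long reduction, small output
    if max(m, n, k) <= 128:
        return MatmulConfig("naive", tile=32)      # tiny problem: overhead dominates
    return MatmulConfig("tiled", tile=128)

# The same logical matmul maps to different configurations as shapes change:
for shape in [(64, 64, 64), (128, 128, 8192), (4096, 4096, 4096)]:
    print(shape, "->", pick_config(*shape, device="cuda"))
```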
But if you don't, if you need the generality to power all of the AI research that is happening and keep zero compile time and in all these other factors, I think it's not easy to avoid the complexity.Pytorch vs MojoAlessio [00:08:33]: That's interesting. And we kind of touched on this with Chris Lattner when he was on the podcast. If you think about frameworks, they have the model target. They have the hardware target. They have different things to think about. He mentioned when he was at Google, TensorFlow trying to be optimized to make TPUs go brr, you know, and go as fast. I think George is trying to make especially AMD stack be better than ROCm. How come PyTorch has been such as Switzerland versus just making Meta hardware go brr?Soumith [00:09:00]: First, Meta is not in the business of selling hardware. Meta is not in the business of cloud compute. The way Meta thinks about funding PyTorch is we're funding it because it's net good for Meta to fund PyTorch because PyTorch has become a standard and a big open source project. And generally it gives us a timeline edge. It gives us leverage and all that within our own work. So why is PyTorch more of a Switzerland rather than being opinionated? I think the way we think about it is not in terms of Switzerland or not. We actually the way we articulate it to all hardware vendors and software vendors and all who come to us being we want to build a backend in core for PyTorch and ship it by default is we just only look at our user side of things. Like if users are using a particular piece of hardware, then we want to support it. We very much don't want to king make the hardware side of things. So as the MacBooks have GPUs and as that stuff started getting increasingly interesting, we pushed Apple to push some engineers and work on the NPS support and we spend significant time from Meta funded engineers on that as well because a lot of people are using the Apple GPUs and there's demand. So we kind of mostly look at it from the demand side. We never look at it from like oh which hardware should we start taking opinions on.Swyx [00:10:27]: Is there a future in which, because Mojo or Modular Mojo is kind of a superset of Python, is there a future in which PyTorch might use Mojo features optionally?Soumith [00:10:36]: I think it depends on how well integrated it is into the Python ecosystem. So if Mojo is like a pip install and it's readily available and users feel like they can use Mojo so smoothly within their workflows in a way that just is low friction, we would definitely look into that. Like in the same way PyTorch now depends on Triton, OpenAI Triton, and we never had a conversation that was like huh, that's like a dependency. Should we just build a Triton of our own or should we use Triton? It almost doesn't, like those conversations don't really come up for us. The conversations are more well does Triton have 10,000 dependencies and is it hard to install? We almost don't look at these things from a strategic leverage point of view. We look at these things from a user experience point of view, like is it easy to install? Is it smoothly integrated and does it give enough benefits for us to start depending on it? If so, yeah, we should consider it. That's how we think about it.Swyx [00:11:37]: You're inclusive by default as long as it meets the minimum bar of, yeah, but like maybe I phrased it wrongly. 
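A minimal sketch of that demand-driven backend story in user-facing terms, assuming a recent PyTorch build: pick up Apple's MPS backend when that is the hardware the user actually has, and hand the model to torch.compile, whose Inductor/Triton backend support varies by device and version.

```python
import torch
import torch.nn as nn

# Pick whatever accelerator is actually present: CUDA, Apple's MPS, or CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")   # the MacBook-GPU path discussed above
else:
    device = torch.device("cpu")

model = nn.Sequential(nn.Linear(256, 256), nn.GELU(), nn.Linear(256, 10)).to(device)

# torch.compile routes the captured graph to a compiler backend (Inductor by
# default, which emits Triton kernels on CUDA). Support on MPS varies by
# version, so this sketch skips compilation there.
compiled = torch.compile(model) if device.type != "mps" else model

x = torch.randn(32, 256, device=device)
print(compiled(x).shape)  # torch.Size([32, 10])
```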
Maybe it's more like what problems would you look to solve that you have right now?Soumith [00:11:48]: I think it depends on what problems Mojo will be useful at.Swyx [00:11:52]: Mainly a performance pitch, some amount of cross compiling pitch.Soumith [00:11:56]: Yeah, I think the performance pitch for Mojo was like, we're going to be performant even if you have a lot of custom stuff, you're going to write arbitrary custom things and we will be performant. And that value proposition is not clear to us from the PyTorch side to consider it for PyTorch. So PyTorch, it's actually not 250 operators, it's like a thousand operators. PyTorch exposes about a thousand operators and people kind of write their ideas in the thousand operators of PyTorch. Mojo is like, well, maybe it's okay to completely sidestep those thousand operators of PyTorch and just write it in a more natural form. Just write raw Python, write for loops or whatever, right? So from the consideration of how do we intersect PyTorch with Mojo, I can see one use case where you have custom stuff for some parts of your program, but mostly it's PyTorch. And so we can probably figure out how to make it easier for say Torch.compile to smoothly also consume Mojo subgraphs and like, you know, the interoperability being actually usable, that I think is valuable. But Mojo as a fundamental front end would be replacing PyTorch, not augmenting PyTorch. So in that sense, I don't see a synergy in more deeply integrating Mojo.Pytorch vs MLXSwyx [00:13:21]: So call out to Mojo whenever they have written something in Mojo and there's some performance related thing going on. And then since you mentioned Apple, what should people think of PyTorch versus MLX?Soumith [00:13:32]: I mean, MLX is early and I know the folks well, Ani used to work at FAIR and I used to chat with him all the time. He used to be based out of New York as well. The way I think about MLX is that MLX is specialized for Apple right now. It has a happy path because it's defined its product in a narrow way. At some point MLX either says we will only be supporting Apple and we will just focus on enabling, you know, there's a framework if you use your MacBook, but once you like go server side or whatever, that's not my problem and I don't care. For MLS, it enters like the server side set of things as well. Like one of these two things will happen, right? If the first thing will happen, like MLX's overall addressable market will be small, but it probably do well within that addressable market. If it enters the second phase, they're going to run into all the same complexities that we have to deal with. They will not have any magic wand and they will have more complex work to do. They probably wouldn't be able to move as fast.Swyx [00:14:44]: Like having to deal with distributed compute?Soumith [00:14:48]: Distributed, NVIDIA and AMD GPUs, like just like having a generalization of the concept of a backend, how they treat compilation with plus overheads. Right now they're deeply assumed like the whole NPS graph thing. So they need to think about all these additional things if they end up expanding onto the server side and they'll probably build something like PyTorch as well, right? Like eventually that's where it will land. And I think there they will kind of fail on the lack of differentiation. Like it wouldn't be obvious to people why they would want to use it.Swyx [00:15:24]: I mean, there are some cloud companies offering M1 and M2 chips on servers. 
I feel like it might be interesting for Apple to pursue that market, but it's not their core strength.Soumith [00:15:33]: Yeah. If Apple can figure out their interconnect story, maybe, like then it can become a thing.Swyx [00:15:40]: Honestly, that's more interesting than the cars. Yes.Soumith [00:15:43]: I think the moat that NVIDIA has right now, I feel is that they have the interconnect that no one else has, like AMD GPUs are pretty good. I'm sure there's various silicon that is not bad at all, but the interconnect, like NVLink is uniquely awesome. I'm sure the other hardware providers are working on it, but-Swyx [00:16:04]: I feel like when you say it's uniquely awesome, you have some appreciation of it that the rest of us don't. I mean, the rest of us just like, you know, we hear marketing lines, but what do you mean when you say NVIDIA is very good at networking? Obviously they made the acquisition maybe like 15 years ago.Soumith [00:16:15]: Just the bandwidth it offers and the latency it offers. I mean, TPUs also have a good interconnect, but you can't buy them. So you have to go to Google to use it.PyTorch MafiaAlessio [00:16:27]: Who are some of the other FAIR PyTorch alumni that are building cool companies? I know you have Fireworks AI, Lightning AI, Lepton, and Yangqing, you knew since college when he was building Coffee?Soumith [00:16:40]: Yeah, so Yangqing and I used to be framework rivals, PyTorch, I mean, we were all a very small close-knit community back then. Caffe, Torch, Theano, Chainer, Keras, various frameworks. I mean, it used to be more like 20 frameworks. I can't remember all the names. CCV by Liu Liu, who is also based out of SF. And I would actually like, you know, one of the ways it was interesting is you went into the framework guts and saw if someone wrote their own convolution kernel or they were just copying someone else's. There were four or five convolution kernels that were unique and interesting. There was one from this guy out of Russia, I forgot the name, but I remembered who was awesome enough to have written their own kernel. And at some point there, I built out these benchmarks called ConNet benchmarks. They're just benchmarking all the convolution kernels that are available at that time. It hilariously became big enough that at that time AI was getting important, but not important enough that industrial strength players came in to do these kinds of benchmarking and standardization. Like we have MLPerf today. So a lot of the startups were using ConNet benchmarks in their pitch decks as like, oh, you know, on ConNet benchmarks, this is how we fare, so you should fund us. I remember Nirvana actually was at the top of the pack because Scott Gray wrote amazingly fast convolution kernels at that time. Very interesting, but separate times. But to answer your question, Alessio, I think mainly Lepton, Fireworks are the two most obvious ones, but I'm sure the fingerprints are a lot wider. They're just people who worked within the PyTorch Cafe2 cohort of things and now end up at various other places.Swyx [00:18:50]: I think as a, both as an investor and a people looking to build on top of their services, it's a uncomfortable slash like, I don't know what I don't know pitch. Because I've met Yang Tsing and I've met Lin Chao. Yeah, I've met these folks and they're like, you know, we are deep in the PyTorch ecosystem and we serve billions of inferences a day or whatever at Facebook and now we can do it for you. And I'm like, okay, that's great. 
Like, what should I be wary of or cautious of when these things happen? Because I'm like, obviously this experience is extremely powerful and valuable. I just don't know what I don't know. Like, what should people know about like these sort of new inference as a service companies?Soumith [00:19:32]: I think at that point you would be investing in them for their expertise of one kind. So if they've been at a large company, but they've been doing amazing work, you would be thinking about it as what these people bring to the table is that they're really good at like GPU programming or understanding the complexity of serving models once it hits a certain scale. You know, various expertise like from the infra and AI and GPUs point of view. What you would obviously want to figure out is whether their understanding of the external markets is clear, whether they know and understand how to think about running a business, understanding how to be disciplined about making money or, you know, various things like that.Swyx [00:20:23]: Maybe I'll put it like, actually I will de-emphasize the investing bit and just more as a potential customer. Oh, okay. Like, it's more okay, you know, you have PyTorch gods, of course. Like, what else should I know?Soumith [00:20:37]: I mean, I would not care about who's building something. If I'm trying to be a customer, I would care about whether...Swyx [00:20:44]: Benchmarks.Soumith [00:20:44]: Yeah, I use it and it's usability and reliability and speed, right?Swyx [00:20:51]: Quality as well.Soumith [00:20:51]: Yeah, if someone from some random unknown place came to me and say, user stuff is great. Like, and I have the bandwidth, I probably will give it a shot. And if it turns out to be great, like I'll just use it.Benchmark dramaSwyx [00:21:07]: Okay, great. And then maybe one more thing about benchmarks, since we already brought it up and you brought up Confident Benchmarks. There was some recent drama around AnyScale. AnyScale released their own benchmarks and obviously they look great on their own benchmarks, but maybe didn't give the other... I feel there are two lines of criticism. One, which is they didn't test some apples for apples on the kind of endpoints that the other providers, that they are competitors with, on their benchmarks and that is due diligence baseline. And then the second would be more just optimizing for the right thing. You had some commentary on it. I'll just kind of let you riff.Soumith [00:21:41]: Yeah, I mean, in summary, basically my criticism of that was AnyScale built these benchmarks for end users to just understand what they should pick, right? And that's a very good thing to do. I think what they didn't do a good job of is give that end user a full understanding of what they should pick. Like they just gave them a very narrow slice of understanding. I think they just gave them latency numbers and that's not sufficient, right? You need to understand your total cost of ownership at some reasonable scale. Not oh, one API call is one cent, but a thousand API calls are 10 cents. Like people can misprice to cheat on those benchmarks. So you want to understand, okay, like how much is it going to cost me if I actually subscribe to you and do like a million API calls a month or something? And then you want to understand the latency and reliability, not just from one call you made, but an aggregate of calls you've made over several various times of the day and times of the week. 
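A hedged sketch of the benchmarking rigor described here: measure an endpoint over many calls rather than one, report aggregate latency percentiles, and project cost at a realistic monthly volume. The call_endpoint function and the price-per-call value are placeholders, not any provider's real client or pricing.

```python
import statistics
import time

def call_endpoint(prompt: str) -> str:
    # Placeholder for a real API client; here we just simulate some latency.
    time.sleep(0.05)
    return "ok"

def benchmark(n_calls: int = 200, price_per_call_usd: float = 0.001) -> None:
    latencies, failures = [], 0
    for i in range(n_calls):
        start = time.perf_counter()
        try:
            call_endpoint(f"request {i}")
        except Exception:
            failures += 1
            continue
        latencies.append(time.perf_counter() - start)

    # Aggregate percentiles across many calls, not a single anecdotal request.
    cuts = statistics.quantiles(latencies, n=100)
    p50, p95 = cuts[49], cuts[94]
    print(f"p50={p50 * 1000:.1f} ms  p95={p95 * 1000:.1f} ms  failures={failures}")

    # Project total cost of ownership at a realistic volume, not per-call price.
    print(f"projected cost at 1M calls/month: ${1_000_000 * price_per_call_usd:,.0f}")

benchmark()
```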
And the nature of the workloads, is it just some generic single paragraph that you're sending that is cashable? Or is it like testing of real world workload? I think that kind of rigor, like in presenting that benchmark wasn't there. It was a much more narrow sliver of what should have been a good benchmark. That was my main criticism. And I'm pretty sure if before they released it, they showed it to their other stakeholders who would be caring about this benchmark because they are present in it, they would have easily just pointed out these gaps. And I think they didn't do that and they just released it. So I think those were the two main criticisms. I think they were fair and Robert took it well.Swyx [00:23:40]: And he took it very well. And we'll have him on at some point and we'll discuss it. But I think it's important for, I think the market being maturing enough that people start caring and competing on these kinds of things means that we need to establish what best practice is because otherwise everyone's going to play dirty.Soumith [00:23:55]: Yeah, absolutely. My view of the LLM inference market in general is that it's the laundromat model. Like the margins are going to drive down towards the bare minimum. It's going to be all kinds of arbitrage between how much you can get the hardware for and then how much you sell the API and how much latency your customers are willing to let go. You need to figure out how to squeeze your margins. Like what is your unique thing here? Like I think Together and Fireworks and all these people are trying to build some faster CUDA kernels and faster, you know, hardware kernels in general. But those modes only last for a month or two. These ideas quickly propagate.Swyx [00:24:38]: Even if they're not published?Soumith [00:24:39]: Even if they're not published, the idea space is small. So even if they're not published, the discovery rate is going to be pretty high. It's not like we're talking about a combinatorial thing that is really large. You're talking about Llama style LLM models. And we're going to beat those to death on a few different hardware SKUs, right? Like it's not even we have a huge diversity of hardware you're going to aim to run it on. Now when you have such a narrow problem and you have a lot of people working on it, the rate at which these ideas are going to get figured out is going to be pretty rapid.Swyx [00:25:15]: Is it a standard bag of tricks? Like the standard one that I know of is, you know, fusing operators and-Soumith [00:25:22]: Yeah, it's the standard bag of tricks on figuring out how to improve your memory bandwidth and all that, yeah.Alessio [00:25:28]: Any ideas instead of things that are not being beaten to death that people should be paying more attention to?Novel PyTorch ApplicationsSwyx [00:25:34]: One thing I was like, you know, you have a thousand operators, right? Like what's the most interesting usage of PyTorch that you're seeing maybe outside of this little bubble?Soumith [00:25:41]: So PyTorch, it's very interesting and scary at the same time, but basically it's used in a lot of exotic ways, like from the ML angle, what kind of models are being built? And you get all the way from state-based models and all of these things to stuff nth order differentiable models, like neural ODEs and stuff like that. I think there's one set of interestingness factor from the ML side of things. And then there's the other set of interesting factor from the applications point of view. 
It's used in Mars Rover simulations, to drug discovery, to Tesla cars. And there's a huge diversity of applications in which it is used. So in terms of the most interesting application side of things, I think I'm scared at how many interesting things that are also very critical and really important it is used in. I think the scariest was when I went to visit CERN at some point and they said they were using PyTorch and they were using GANs at the same time for particle physics research. And I was scared more about the fact that they were using GANs than they were using PyTorch, because at that time I was a researcher focusing on GANs. But the diversity is probably the most interesting. How many different things it is being used in. I think that's the most interesting to me from the applications perspective. From the models perspective, I think I've seen a lot of them. Like the really interesting ones to me are where we're starting to combine search and symbolic stuff with differentiable models, like the whole AlphaGo style models is one example. And then I think we're attempting to do it for LLMs as well, with various reward models and search. I mean, I don't think PyTorch is being used in this, but the whole alpha geometry thing was interesting because again, it's an example of combining the symbolic models with the gradient based ones. But there are stuff like alpha geometry that PyTorch is used at, especially when you intersect biology and chemistry with ML. In those areas, you want stronger guarantees on the output. So yeah, maybe from the ML side, those things to me are very interesting right now.Swyx [00:28:03]: Yeah. People are very excited about the alpha geometry thing. And it's kind of like, for me, it's theoretical. It's great. You can solve some Olympia questions. I'm not sure how to make that bridge over into the real world applications, but I'm sure people smarter than me will figure it out.Synthetic Data vs Symbolic ModelsSoumith [00:28:18]: Let me give you an example of it. You know how the whole thing about synthetic data will be the next rage in LLMs is a thing?Swyx [00:28:27]: Already is a rage.Soumith [00:28:28]: Which I think is fairly misplaced in how people perceive it. People think synthetic data is some kind of magic wand that you wave and it's going to be amazing. Synthetic data is useful in neural networks right now because we as humans have figured out a bunch of symbolic models of the world or made up certain symbolic models because of human innate biases. So we've figured out how to ground particle physics in a 30 parameter model. And it's just very hard to compute as in it takes a lot of flops to compute, but it only has 30 parameters or so. I mean, I'm not a physics expert, but it's a very low rank model. We built mathematics as a field that basically is very low rank. Language, a deep understanding of language, like the whole syntactic parse trees and just understanding how language can be broken down and into a formal symbolism is something that we figured out. So we basically as humans have accumulated all this knowledge on these subjects, either synthetic, we created those subjects in our heads, or we grounded some real world phenomenon into a set of symbols. But we haven't figured out how to teach neural networks symbolic world models directly. The only way we have to teach them is generating a bunch of inputs and outputs and gradient dissenting over them. 
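A minimal sketch of that recipe: take a symbolic model we already understand (a toy formula standing in for physics, grammar, or math), sample input/output pairs from it, and gradient-descend a small network on those pairs. The specific formula and network here are invented for illustration.

```python
import torch
import torch.nn as nn

def symbolic_model(x: torch.Tensor) -> torch.Tensor:
    # Stand-in for any compact symbolic model (a physics formula, a grammar, ...).
    return 2.0 * torch.sin(x) + 0.5

# Generate synthetic input/output pairs from the symbolic model.
x = torch.rand(4096, 1) * 6.28
y = symbolic_model(x)

# Gradient-descend an over-parameterized network on knowledge we could already
# write down in a handful of symbols.
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    loss = nn.functional.mse_loss(net(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final MSE on the synthetic pairs: {loss.item():.4f}")
```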
So in areas where we have the symbolic models and we need to teach all the knowledge we have that is better encoded in the symbolic models, what we're doing is we're generating a bunch of synthetic data, a bunch of input output pairs, and then giving that to the neural network and asking it to learn the same thing that we already have a better low rank model of in gradient descent in a much more over-parameterized way. Outside of this, like where we don't have good symbolic models, like synthetic data obviously doesn't make any sense. So synthetic data is not a magic wand where it'll work in all cases in every case or whatever. It's just where we as humans already have good symbolic models off. We need to impart that knowledge to neural networks and we figured out the synthetic data is a vehicle to impart this knowledge to. So, but people, because maybe they don't know enough about synthetic data as a notion, but they hear, you know, the next wave of data revolution is synthetic data. They think it's some kind of magic where we just create a bunch of random data somehow. They don't think about how, and then they think that's just a revolution. And I think that's maybe a gap in understanding most people have in this hype cycle.Swyx [00:31:23]: Yeah, well, it's a relatively new concept, so. Oh, there's two more that I'll put in front of you and then you can see what you respond. One is, you know, I have this joke that it's, you know, it's only synthetic data if it's from the Mistral region of France, otherwise it's just a sparkling distillation, which is what news research is doing. Like they're distilling GPT-4 by creating synthetic data from GPT-4, creating mock textbooks inspired by Phi 2 and then fine tuning open source models like Llama. And so I don't know, I mean, I think that's, should we call that synthetic data? Should we call it something else? I don't know.Soumith [00:31:57]: Yeah, I mean, the outputs of LLMs, are they synthetic data? They probably are, but I think it depends on the goal you have. If your goal is you're creating synthetic data with the goal of trying to distill GPT-4's superiority into another model, I guess you can call it synthetic data, but it also feels like disingenuous because your goal is I need to copy the behavior of GPT-4 and-Swyx [00:32:25]: It's also not just behavior, but data set. So I've often thought of this as data set washing. Like you need one model at the top of the chain, you know, unnamed French company that has that, you know, makes a model that has all the data in it that we don't know where it's from, but it's open source, hey, and then we distill from that and it's great. To be fair, they also use larger models as judges for preference ranking, right? So that is, I think, a very, very accepted use of synthetic.Soumith [00:32:53]: Correct. I think it's a very interesting time where we don't really have good social models of what is acceptable depending on how many bits of information you use from someone else, right? It's like, okay, you use one bit. Is that okay? Yeah, let's accept it to be okay. Okay, what about if you use 20 bits? Is that okay? I don't know. What if you use 200 bits? I don't think we as society have ever been in this conundrum where we have to be like, where is the boundary of copyright or where is the boundary of socially accepted understanding of copying someone else? We haven't been tested this mathematically before,Swyx [00:33:38]: in my opinion. Whether it's transformative use. Yes. 
So yeah, I think this New York Times opening eye case is gonna go to the Supreme Court and we'll have to decide it because I think we never had to deal with it before. And then finally, for synthetic data, the thing that I'm personally exploring is solving this great stark paradigm difference between rag and fine tuning, where you can kind of create synthetic data off of your retrieved documents and then fine tune on that. That's kind of synthetic. All you need is variation or diversity of samples for you to fine tune on. And then you can fine tune new knowledge into your model. I don't know if you've seen that as a direction for synthetic data.Soumith [00:34:13]: I think you're basically trying to, what you're doing is you're saying, well, language, I know how to parametrize language to an extent. And I need to teach my model variations of this input data so that it's resilient or invariant to language uses of that data.Swyx [00:34:32]: Yeah, it doesn't overfit on the wrong source documents.Soumith [00:34:33]: So I think that's 100% synthetic. You understand, the key is you create variations of your documents and you know how to do that because you have a symbolic model or like some implicit symbolic model of language.Swyx [00:34:48]: Okay.Alessio [00:34:49]: Do you think the issue with symbolic models is just the architecture of the language models that we're building? I think maybe the thing that people grasp is the inability of transformers to deal with numbers because of the tokenizer. Is it a fundamental issue there too? And do you see alternative architectures that will be better with symbolic understanding?Soumith [00:35:09]: I am not sure if it's a fundamental issue or not. I think we just don't understand transformers enough. I don't even mean transformers as an architecture. I mean the use of transformers today, like combining the tokenizer and transformers and the dynamics of training, when you show math heavy questions versus not. I don't have a good calibration of whether I know the answer or not. I, you know, there's common criticisms that are, you know, transformers will just fail at X. But then when you scale them up to sufficient scale, they actually don't fail at that X. I think there's this entire subfield where they're trying to figure out these answers called like the science of deep learning or something. So we'll get to know more. I don't know the answer.Meta AI and Llama 2/3Swyx [00:35:57]: Got it. Let's touch a little bit on just Meta AI and you know, stuff that's going on there. Maybe, I don't know how deeply you're personally involved in it, but you're our first guest with Meta AI, which is really fantastic. And Llama 1 was, you know, you are such a believer in open source. Llama 1 was more or less the real breakthrough in open source AI. The most interesting thing for us covering on this, in this podcast was the death of Chinchilla, as people say. Any interesting insights there around the scaling models for open source models or smaller models or whatever that design decision was when you guys were doing it?Soumith [00:36:31]: So Llama 1 was Guillaume Lample and team. There was OPT before, which I think I'm also very proud of because we bridged the gap in understanding of how complex it is to train these models to the world. Like until then, no one really in gory detail published.Swyx [00:36:50]: The logs.Soumith [00:36:51]: Yeah. Like, why is it complex? And everyone says, oh, it's complex. But no one really talked about why it's complex. 
I think OPT was cool.Swyx [00:37:02]: I met Susan and she's very, very outspoken. Yeah.Soumith [00:37:05]: We probably, I think, didn't train it for long enough, right? That's kind of obvious in retrospect.Swyx [00:37:12]: For a 175B. Yeah. You trained it according to Chinchilla at the time or?Soumith [00:37:17]: I can't remember the details, but I think it's a commonly held belief at this point that if we trained OPT longer, it would actually end up being better. Llama 1, I think, was Guillaume Lample and team. Guillaume is fantastic and went on to build Mistral. I wasn't too involved in that side of things. So I don't know what you're asking me, which is how did they think about scaling laws and all of that? Llama 2, I was more closely involved in. I helped them a reasonable amount with their infrastructure needs and stuff. And Llama 2, I think, was more like, let's get to the evolution. At that point, we kind of understood what we were missing from the industry's understanding of LLMs. And we needed more data and we needed to train the models for longer. And we made, I think, a few tweaks to the architecture and we scaled up more. And that was Llama 2. I think Llama 2, you can think of it as after Guillaume left, the team kind of rebuilt their muscle around Llama 2. And Hugo, I think, who's the first author, is fantastic. And I think he did play a reasonably big role in Llama 1 as well.Soumith [00:38:35]: And he overlaps between Llama 1 and 2. So in Llama 3, obviously, hopefully, it'll be awesome.Alessio [00:38:42]: Just one question on Llama 2, and then we'll try and fish Llama 3 spoilers out of you. In the Llama 2 paper, the loss curves of the 34 and 70B parameter models, they still seem kind of steep. Like they could go lower. How, from an infrastructure level, how do you allocate resources? Could they have just gone longer or were you just, hey, this is all the GPUs that we can burn and let's just move on to Llama 3 and then make that one better?Soumith [00:39:07]: Instead of answering specifically about that Llama 2 situation or whatever, I'll tell you how we think about things. Generally, we're, I mean, Mark really is some numbers, right?Swyx [00:39:20]: So let's cite those things again. All I remember is like 600K GPUs.Soumith [00:39:24]: That is by the end of this year and 600K H100 equivalents. With 250K H100s, including all of our other GPU or accelerator stuff, it would be 600-and-something-K aggregate capacity.Swyx [00:39:38]: That's a lot of GPUs.Soumith [00:39:39]: We'll talk about that separately. But the way we think about it is we have a train of models, right? Llama 1, 2, 3, 4. And we have a bunch of GPUs. I don't think we're short of GPUs. Like-Swyx [00:39:54]: Yeah, no, I wouldn't say so. Yeah, so it's all a matter of time.Soumith [00:39:56]: I think time is the biggest bottleneck. It's like, when do you stop training the previous one and when do you start training the next one? And how do you make those decisions? The data, do you have net new data, better clean data for the next one in a way that it's not worth really focusing on the previous one? It's just a standard iterative product. You're like, when is the iPhone 1? When do you start working on iPhone 2? Where is the iPhone? And so on, right? So mostly the considerations are time and generation, rather than GPUs, in my opinion.Alessio [00:40:31]: So one of the things with the scaling laws, like Chinchilla is optimal to balance training and inference costs.
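As a rough aside on the Chinchilla point: a common rule-of-thumb reading of the Chinchilla result is roughly 20 training tokens per parameter for compute-optimal training. Treating that ratio as an approximation, a few lines show why training 7B-70B models on 2T tokens, as Llama 2 did, deliberately overshoots "compute-optimal" to buy cheaper inference.

```python
# Approximate Chinchilla heuristic: ~20 training tokens per parameter.
TOKENS_PER_PARAM = 20
LLAMA2_TOKENS_T = 2.0  # Llama 2 models were trained on ~2T tokens

for params_b in (7, 13, 70):
    optimal_t = params_b * 1e9 * TOKENS_PER_PARAM / 1e12
    print(f"{params_b}B params: ~{optimal_t:.2f}T tokens 'compute-optimal' "
          f"vs {LLAMA2_TOKENS_T:.0f}T actually used")
```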
I think at Meta's scale, you would rather pay a lot more maybe at training and then save on inference. How do you think about that from infrastructure perspective? I think in your tweet, you say you can try and guess on like how we're using these GPUs. Can you just give people a bit of understanding? It's like, because I've already seen a lot of VCs say, Llama 3 has been trained on 600,000 GPUs and that's obviously not true, I'm sure. How do you allocate between the research, FAIR and the Llama training, the inference on Instagram suggestions that get me to scroll, like AI-generated stickers on WhatsApp and all of that?Soumith [00:41:11]: Yeah, we haven't talked about any of this publicly, but as a broad stroke, it's like how we would allocate resources of any other kinds at any company. You run a VC portfolio, how do you allocate your investments between different companies or whatever? You kind of make various trade-offs and you kind of decide, should I invest in this project or this other project, or how much should I invest in this project? It's very much a zero sum of trade-offs. And it also comes into play, how are your clusters configured, like overall, what you can fit of what size and what cluster and so on. So broadly, there's no magic sauce here. I mean, I think the details would add more spice, but also wouldn't add more understanding. It's just gonna be like, oh, okay, I mean, this looks like they just think about this as I would normally do.Alessio [00:42:05]: So even the GPU rich run through the same struggles of having to decide where to allocate things.Soumith [00:42:11]: Yeah, I mean, at some point I forgot who said it, but you kind of fit your models to the amount of compute you have. If you don't have enough compute, you figure out how to make do with smaller models. But no one as of today, I think would feel like they have enough compute. I don't think I've heard any company within the AI space be like, oh yeah, like we feel like we have sufficient compute and we couldn't have done better. So that conversation, I don't think I've heard from any of my friends at other companies.EleutherSwyx [00:42:47]: Stella from Eleuther sometimes says that because she has a lot of donated compute. She's trying to put it to interesting uses, but for some reason she's decided to stop making large models.Soumith [00:42:57]: I mean, that's a cool, high conviction opinion that might pay out.Swyx [00:43:01]: Why?Soumith [00:43:02]: I mean, she's taking a path that most people don't care to take about in this climate and she probably will have very differentiated ideas. I mean, think about the correlation of ideas in AI right now. It's so bad, right? So everyone's fighting for the same pie. In some weird sense, that's partly why I don't really directly work on LLMs. I used to do image models and stuff and I actually stopped doing GANs because GANs were getting so hot that I didn't have any calibration of whether my work would be useful or not because, oh yeah, someone else did the same thing you did. It's like, there's so much to do, I don't understand why I need to fight for the same pie. So I think Stella's decision is very smart.Making BetsAlessio [00:43:53]: And how do you reconcile that with how we started the discussion about intrinsic versus extrinsic kind of like accomplishment or success? How should people think about that especially when they're doing a PhD or early in their career? 
I think in Europe, I walked through a lot of the posters and whatnot, there seems to be mode collapse in a way in the research, a lot of people working on the same things. Is it worth for a PhD to not take a bet on something that is maybe not as interesting just because of funding and visibility and whatnot? Or yeah, what suggestions would you give?Soumith [00:44:28]: I think there's a baseline level of compatibility you need to have with the field. Basically, you need to figure out if you will get paid enough to eat, right? Like whatever reasonable normal lifestyle you want to have as a baseline. So you at least have to pick a problem within the neighborhood of fundable. Like you wouldn't wanna be doing something so obscure that people are like, I don't know, like you can work on it.Swyx [00:44:59]: Would a limit on fundability, I'm just observing something like three months of compute, right? That's the top line, that's the like max that you can spend on any one project.Soumith [00:45:09]: But like, I think that's very ill specified, like how much compute, right? I think that the notion of fundability is broader. It's more like, hey, are these family of models within the acceptable set of, you're not crazy or something, right? Even something like neural or DS, which is a very boundary pushing thing or states-based models or whatever. Like all of these things I think are still in fundable territory. When you're talking about, I'm gonna do one of the neuromorphic models and then apply image classification to them or something, then it becomes a bit questionable. Again, it depends on your motivation. Maybe if you're a neuroscientist, it actually is feasible. But if you're an AI engineer, like the audience of these podcasts, then it's more questionable. The way I think about it is, you need to figure out how you can be in the baseline level of fundability just so that you can just live. And then after that, really focus on intrinsic motivation and depends on your strengths, like how you can play to your strengths and your interests at the same time. Like I try to look at a bunch of ideas that are interesting to me, but also try to play to my strengths. I'm not gonna go work on theoretical ML. I'm interested in it, but when I want to work on something like that, I try to partner with someone who is actually a good theoretical ML person and see if I actually have any value to provide. And if they think I do, then I come in. So I think you'd want to find that intersection of ideas you like, and that also play to your strengths. And I'd go from there. Everything else, like actually finding extrinsic success and all of that, I think is the way I think about it is like somewhat immaterial. When you're talking about building ecosystems and stuff, slightly different considerations come into play, but that's a different conversation.Swyx [00:47:06]: We're gonna pivot a little bit to just talking about open source AI. But one more thing I wanted to establish for Meta is this 600K number, just kind of rounding out the discussion, that's for all Meta. So including your own inference needs, right? It's not just about training.Soumith [00:47:19]: It's gonna be the number in our data centers for all of Meta, yeah.Swyx [00:47:23]: Yeah, so there's a decent amount of workload serving Facebook and Instagram and whatever. And then is there interest in like your own hardware?MTIASoumith [00:47:31]: We already talked about our own hardware. It's called MTIA. 
MTIA

Soumith [00:47:31]: We already talked about our own hardware: it's called MTIA, our own silicon. I think we've even shown the standard photograph of you holding the chip that doesn't work, as in the chip that you basically just get-

Swyx [00:47:51]: As a test, right?

Soumith [00:47:52]: Yeah, a test chip or whatever. So we are working on our silicon and we'll probably talk more about it when the time is right, but-

Swyx [00:48:00]: What gaps do you have that the market doesn't offer?

Soumith [00:48:04]: Okay, this one is easy to answer. Remember how I told you about the memory hierarchy and the sweet spots and all of that? Fundamentally, when you build a piece of hardware, you make it general enough that a wide set of customers and a wide set of workloads can use it effectively, while trying to get the maximum level of performance they can. The more specialized you make the chip, the more hardware-efficient it's going to be, the more power-efficient it's going to be, and the easier it's going to be to write the software, the kernels, that map those one or two workloads onto that hardware. So it's pretty well understood across the industry that if you have a sufficiently large volume of a workload, you can specialize for it and get some efficiency gains, power gains and so on. The way to think about every large company building silicon, and I think a bunch of the other large companies are building their own silicon as well, is that each of them has a sufficiently large set of verticalized workloads, with a pattern to them, that a more generic accelerator like an NVIDIA or AMD GPU does not exploit. There is some level of power efficiency you're leaving on the table by not exploiting that. And you have sufficient scale, and sufficient forecasted stability that those workloads will keep existing in the same form, that it's worth spending the time to build a chip to exploit that sweet spot. Obviously, something like this is only useful if you hit a certain scale and if your forecast that those workloads stay specializable and exploitable in the same way holds true. So yeah, that's why we're building our own chips.

Swyx [00:50:08]: Awesome.
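To make the scale argument above concrete, here is a minimal back-of-the-envelope sketch of the specialize-or-not decision being described. Every number in it (the one-time design cost, the power draws, the electricity price) is an illustrative assumption rather than a Meta or vendor figure, and a real decision would also weigh purchase price, performance per watt, and the software effort of writing those kernels.

```python
# Back-of-the-envelope sketch of the "specialize when you have scale" argument.
# All numbers are illustrative assumptions, not figures from Meta or any vendor.

HOURS_PER_YEAR = 24 * 365


def annual_energy_cost(num_chips: int, watts_per_chip: float,
                       dollars_per_kwh: float = 0.08) -> float:
    """Yearly electricity cost for a fleet running 24/7."""
    kwh = num_chips * watts_per_chip * HOURS_PER_YEAR / 1000
    return kwh * dollars_per_kwh


def breakeven_fleet_size(nre_cost: float, gpu_watts: float, custom_watts: float,
                         dollars_per_kwh: float = 0.08) -> float:
    """Fleet size at which power savings alone repay the one-time chip cost in a year."""
    saving_per_chip = (annual_energy_cost(1, gpu_watts, dollars_per_kwh)
                       - annual_energy_cost(1, custom_watts, dollars_per_kwh))
    return nre_cost / saving_per_chip


if __name__ == "__main__":
    NRE = 100e6      # assumed one-time design/tape-out cost for a custom accelerator
    GPU_W = 700      # assumed power draw of a general-purpose accelerator
    CUSTOM_W = 350   # assumed draw of a chip specialized for a narrow, stable workload
    fleet = breakeven_fleet_size(NRE, GPU_W, CUSTOM_W)
    print(f"Break-even fleet size on power savings alone: ~{fleet:,.0f} chips")
    # With these made-up numbers the answer is a few hundred thousand chips,
    # which is why only very large, very stable workloads clear the bar.
```

Even on these made-up numbers, power savings alone only repay a custom-chip program at a fleet size in the hundreds of thousands, which is the sufficient-volume, forecasted-stability bar described above.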
Open Source AI

Alessio [00:50:09]: We've been talking about a lot of different topics, but going back to open source: you had a very good tweet. You said that a single company's closed-source effort rate-limits people's imaginations and needs. How do you think about the impact of the Meta AI work in open source, and the direction of the whole open source AI space?

Soumith [00:50:32]: Yeah, first I think it's worth talking about this in terms of "open" and not just "open source", because with the whole notion of model weights, no one even knows what "source" means for these things. But just for this discussion, when I say open source, you can assume I'm simply talking about open. And then there's the whole notion of licensing: commercial, non-commercial, commercial with clauses, and all that. At a fundamental level, the biggest value of open source is that it makes distribution very wide. It's just available with no friction, and people can do transformative things with it in a way that's very accessible. Maybe it's open source but with a commercial license, and I'm a student in India: I don't care about the license, I don't even understand the license. But the fact that I can use it and do something with it is very transformative to me; I got this thing in a very accessible way. And then there are various degrees, right? If it's open source but under a commercial license, a lot of companies are going to benefit by gaining value they didn't previously have, value they maybe had to pay a closed-source company for. So open source is just a very interesting tool that you can use in various ways.

There are, again, two kinds of open source. One is a large company doing a lot of work and then open sourcing it, and that kind of effort is not really feasible for, say, a band of volunteers doing it the same way. There's both a capital and an operational expenditure that the large company just decided to forgo and give away to the world for benefits of some kind, benefits that are not as tangible as direct revenue. On that front, Meta has been doing incredibly good things. They fund a huge amount of the PyTorch development. They've open sourced Llama and that family of models and several other fairly transformative projects: FAISS, Segment Anything, Detectron, Detectron 2, DensePose. I mean, it's-

Swyx [00:52:52]: Seamless. Yeah, Seamless.

Soumith [00:52:53]: The list is so long that we're not going to cover it all. So I think Meta falls into that category: we spend a lot of CapEx and OpEx, we have a high talent density of great AI people, and we open our stuff. And the thesis for that goes back to when FAIR was started. The common question then was, wait, why would Meta want to start an open AI lab? What exactly is the benefit from a commercial perspective? And the thesis was very simple: AI was rate-limiting Meta's ability to do things, our ability to build various product integrations, moderation, various other features. AI was the limiting factor, and we just wanted AI to advance more; we didn't care whether the IP of the AI was uniquely in our possession or not. However the field advances, that accelerates Meta's ability to build a better product. So we built an open AI lab and said, if this helps accelerate the progress of AI, that's strictly great for us. Very easy, rational, right? It's still the same to a large extent with the Llama work; the values are the same, but the argument is a bit more nuanced.

And then there's a second kind of open source, which is: we built this project on nights and weekends, we're very smart people, we open sourced it, and then we built a community around it. This is the Linux kernel and various software projects like that. So I think about both of these kinds of open source as different and beneficial in their own ways. The second one is really useful when there's an active arbitrage to be done: if no one is really looking at a particular space because it's not commercially viable or whatever, a band of volunteers can just coordinate online, do something, and make it happen. And that's great.
Open Source LLMs

I want to cover a little bit about open source LLMs. Open source LLMs have been very interesting because we were trending towards more and more open source in AI from about 2010 all the way to 2017, with growing pressure within the community to open source your work so that your methods get adopted. And then the LLM revolution had the opposite effect: OpenAI stopped open sourcing their stuff, DeepMind didn't either, and the cloud providers and the other labs didn't open source their stuff. That was not good, for a few reasons. First, science done in isolation will probably just form its own bubble, where people believe their own b******t or whatever. So there's that problem. Then there's the accessibility problem. Again, I always go back to: I'm a student in India with no money. What is my access to any of these closed models? At some scale I have to pay money, and that makes it a non-starter. And there's also the control aspect. I strongly believe that if you want human-aligned systems, you want all humans to give feedback, and you want all humans to have access to that technology in the first place. And I've actually seen it: living in New York, whenever I come to Silicon Valley I see a different cultural bubble. All the friends I hang out with talk about some random thing like Dyson spheres or whatever; that's a thing. And most of the world doesn't know or care about any of this stuff. It's definitely a bubble, and bubbles can form very easily. And when you make a lot of decisions because you're in a bubble, they're probably not globally optimal decisions. So I think the wide distribution of open source powers a certain kind of non-falsifiability that I think is very important.

On the open source models themselves, I think it's going great, in the sense that LoRA, I think, came out of the necessity of open source models needing to be fine-tunable in some way, and I think DPO also came out of the academic, open source side of things. Did any of the closed-source labs already have LoRA or DPO internally? Maybe, but that does not advance humanity in any way. It advances some company's probability of achieving the winner-takes-all outcome that I talked about earlier in the podcast.

Open Source and Trust

I don't know, it just feels fundamentally good. When people ask, well, in what ways is it not okay, I find most of those arguments, and this might be a little controversial, about whether closed source models are safer or open source models are safer to be very much related to the kind of culture and society people grew up in. If they grew up in a society they trusted, I think they take the closed-source argument. And if they grew up in a society they couldn't trust, where the norm was that you didn't trust your government because it was obviously corrupt or whatever, then I think they take the open-source argument. There's a deep connection to people's innate biases from their childhood and their trust in society and government that pushes them towards one opinion or the other. And I'm definitely in the camp that open source will have better outcomes for society. Closed source to me just means centralization of power, which is really hard to trust. So I think it's going well.
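Since LoRA and DPO come up above as things the open ecosystem produced out of necessity, a minimal sketch of the LoRA idea may help readers who have not seen it: freeze the pretrained weight matrix and train only a small low-rank update, which is what makes fine-tuning open-weight models affordable on modest hardware. This is an illustrative toy in PyTorch, not the reference implementation from the LoRA paper or from any particular library.

```python
# Toy sketch of the LoRA idea: freeze a pretrained linear layer and train only
# a low-rank update on top of it. Not the reference implementation.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen base projection plus a trainable low-rank update B @ A."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad = False
        in_f, out_f = base.in_features, base.out_features
        self.A = nn.Parameter(torch.randn(r, in_f) * 0.01)  # small random init
        self.B = nn.Parameter(torch.zeros(out_f, r))        # zero init: adapter starts as a no-op
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # base(x) uses the frozen weights; the second term is the learned update.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)


# Usage: wrap an existing projection and fine-tune only the adapter parameters.
layer = LoRALinear(nn.Linear(1024, 1024))
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"trainable params: {trainable}")  # 2 * 8 * 1024 = 16,384 vs ~1M frozen
```

With rank 8 on a 1024-by-1024 projection, the adapter trains roughly 16 thousand parameters against about a million frozen ones, which is why LoRA-style fine-tuning became the default way to adapt open models.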

Conversa com Bial
Pedro Bial entrevista Alex Atala - Parte 2

Conversa com Bial

Play Episode Listen Later Nov 1, 2023 31:32


Na continuação da conversa com Alex Atala, entra em cena Ricardo Elesbão Alves, pesquisador da Embrapa – Empresa Brasileira de Pesquisa Agropecuária – para debater as ligações entre ciência, comida, produção de alimentos e combate à fome.

Conversa com Bial
Pedro Bial entrevista Alex Atala – Parte 1

Conversa com Bial

Play Episode Listen Later Oct 31, 2023 31:34


Com 30 anos de carreira, o chef brasileiro com maior reconhecimento internacional fala de sua trajetória, o mundo da alta gastronomia, projetos de sustentabilidade e seu último livro, uma pesquisa sobre a importância da mandioca para a cultura e a história do Brasil.

Product Guru's
#194 Sulivan Santiago - Discovery tem que ter time box?

Product Guru's

Play Episode Listen Later Aug 30, 2023 51:27


Neste episódio, mergulhamos profundamente na abordagem de "pensamento espacial" como uma ferramenta inovadora para desenvolvimento de produtos. Mas o que exatamente é o pensamento espacial e como ele pode ser aplicado na fase de concepção e design? Sulivan Santiago se junta a nós para debater questões essenciais para todo Product Manager: O discovery deveria ter um time box? O pensamento espacial pode acelerar o processo de discovery? E como esse método se integra com outras práticas já consolidadas na área de produto? Além disso, tocamos em pontos como: A desconexão das práticas básicas atualmente. Desafios e formas de superar obstáculos na aplicação do pensamento espacial. Ferramentas recomendadas para quem deseja incorporar essa abordagem em seu trabalho. Se conecte com o Sulivan Santiago aqui. Se inscreva na newsletter do Sulivan também, clicando aqui. Artigos mencionados no episódio: Percolação de Produtos e Spatial Thinking (substack.com) Pensamento de Segunda Ordem: Abordagem para melhorar tomada de decisões e a estratégia de produto (substack.com) O Conselho do Alex Atala para você, Product Manager! (substack.com) Aproveite o cupom de 10% de desconto nos cursos da Tera: ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠https://bit.ly/3y7UxPr⁠⁠⁠⁠⁠⁠⁠⁠⁠ ⁠⁠⁠ Cupom: PRODUCT_GURU Assine gratuitamente nossa newsletter aqui: Product Guru's Newsletter Siga-nos nas redes sociais: Twitter: ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠https://twitter.com/product_gurus⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠  Threads: ⁠⁠⁠https://www.threads.net/@product.gurus⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠  Instagram: ⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠⁠https://bit.ly/2WJnJt7⁠⁠

Brasil-Mundo
Ceramistas brasileiros abrem ateliê em Portugal e conquistam mesas dos melhores restaurantes do mundo

Brasil-Mundo

Play Episode Listen Later Jul 9, 2023 4:59


Gabi Neves e Alex Hell criaram, em 2008, em São Paulo, Studioneves. Após dez anos de sucesso no Brasil, eles recomeçaram a vida em Portugal e abriram um novo ateliê, onde produzem peças que emolduram os pratos de chefs estrelados do continente europeu.
Fábia Belém, correspondente da RFI em Portugal
Gabi Neves e Alex Hell se mudaram para Portugal em 2018. Naturais da cidade de São Paulo, os dois se conheceram numa agência, onde ela trabalhava como designer e ele como publicitário. Mas o amor pela gastronomia levou o casal a se apaixonar por cerâmica e mudar de vida. Em 2008, eles criaram o Studioneves, um ateliê onde começaram a produzir cerâmica artesanal: pratos, travessas, copos, xícaras, peças exclusivas para restaurantes brasileiros. “Na maioria dos casos, eu diria que são peças mais rústicas, mais naturais e mais orgânicas. Nada certinho, nada com ângulo, nada reto. Muito pelo contrário”, diz Gabi Neves. Ela faz os desenhos e os moldes, enquanto Alex é responsável pelo vidrado, substância aplicada para revestir cada peça. Os vidrados, explica, “não são só bonitos. Eles [também] têm de suportar a acidez dos molhos que os chefs vão usar, a adstringência [do sabão] da máquina de lava-louça e os riscos das facas”.
Incentivo do chef Alex Atala
Nos primeiros anos de trabalho no Brasil, a dupla se dedicou a aprimorar a cerâmica artesanal que produzia. Começou, inclusive, a importar argila mais resistente dos Estados Unidos. Com o intuito “de empoderar todo um setor”, conta Alex, o casal passou a vender parte da matéria-prima para concorrentes, fazendo com que mais ceramistas trabalhassem com argila de alta qualidade. “A gente acabou transformando o mercado num mercado mais profissional, e no final, todo mundo ganhou com isso”, ressalta. Quando chegou ao quarto ano de funcionamento, o Studioneves recebeu o que não esperava: uma encomenda do chef brasileiro Alex Atala, dono do restaurante D.O.M., premiado com duas estrelas do Guia Michelin. “Naquele momento de 2012, Alex Atala era o grande nome da gastronomia mundial. Ele dava muito orgulho e também muito medo, porque a gente sabia que era muito importante”, lembra Hell, que reconhece que o casal desconhecia o universo da alta gastronomia. Gabi, por sua vez, lembra de que “não tinha tanto repertório de criação de peças” à época que conheceu Atala. Mas quando desenhou as coleções para o restaurante do mais famoso chef brasileiro, a ceramista pôs “as coisas que achava que faziam sentido e que era capaz de fazer naquela altura”. No final, deu tudo tão certo, que todas as criações do Studioneves inspiradas no menu do D.O.M. foram aprovadas por Atala e, em seguida, abriram as portas para novos clientes. “Os chefs que seguiam o Alex Atala naquele momento, e que eram chefs de todos os estados brasileiros, começaram a ligar para a gente dizendo ‘Eu sei que vocês fizeram as peças pro Alex [Atala], e eu quero que vocês façam para mim também'”, conta Hell, com entusiasmo. Ele reconhece que há um Studioneves antes e depois daquela encomenda. “Não é mais uma questão de qual é o chef mais importante ou mais estrelado, mas é a pessoa que mais teve um fator transformador na nossa história”, ressalta o ceramista, se referindo a Atala.
O auge e a saída do Brasil
Com a fama do Studioneves se espalhando pelo Brasil, o negócio parecia não parar de crescer. Em 2017, Gabi e Alex contavam com uma equipe de oito pessoas, sete fornos e um ateliê com uma estrutura que mais parecia “uma fábrica de cerâmica dentro da cidade [de São Paulo]”, conta o ceramista.
Naquele ano, o Studioneves fornecia para 220 restaurantes, e o faturamento havia dobrado em relação ao ano anterior. “A gente costuma dizer que a gente não inventou a roda, mas eu tenho certeza de que nós tivemos um papel muito importante nessa ascensão da cerâmica artesanal no Brasil”, reconhece Alex Hell. O trabalho estava no auge, mas a família já sofria com a insegurança na cidade de São Paulo. Depois houve um assalto à mão armada, e um dos filhos do casal estava no carro. Gabi e Alex não pensaram duas vezes: decidiram fechar o ateliê e deixar o país. A mudança para Portugal aconteceu há cinco anos, e começar do zero não foi fácil. Na Europa, “ninguém sabia quem a gente era”, diz Alex. “Ir até um restaurante com uma maletinha e perguntar ‘posso mostrar meu trabalho?'… Esse recomeço foi duro, porque tirou a gente completamente de uma zona, que já era uma zona de conforto”, se recorda.
Studioneves em terras lusitanas
Foi no município de Cascais, distante cerca de meia hora de Lisboa, que Gabi e Alex encontraram, em 2018, o espaço para montar o novo Studioneves. Em Portugal, descobriram até uma argila de alta qualidade para a produção das peças. “Aos poucos, as coisas foram entrando nos trilhos, e hoje o Studioneves conta com mais de 70 restaurantes [entre seus clientes]”, comemora Hell. Para Gabi, “recomeçar na Europa, então, foi um bônus”. As louças do ateliê estão nas mesas de alguns dos melhores restaurantes do mundo. E para atender a demanda local, de outros países europeus e até do Japão, os ceramistas produzem, em média, 1.500 peças por mês com a ajuda de uma equipe de sete pessoas. Para responder às necessidades dos chefs de cozinha, o casal cria louças que se adaptam ao menu. Segundo Hell, há chef que já chega com uma ideia na cabeça, dizendo ‘quero uma peça de tal diâmetro, com tal profundidade, porque eu vou colocar tal comida com tal molho. É importante que tenha tal cor para contrastar com tal comida que eu vou colocar'. Também há chef que convida o casal para jantar no restaurante e, ao final, dá total liberdade para os ceramistas criarem o serviço de mesa inspirado no menu. Mas Gabi deixa claro: “o que tá valendo ali é a comida. Então, o prato não tem que ser a obra de arte, ele tem que ser o que segura a obra de arte, ele não toma o lugar”.
“Ateliê verde”
Para adotar práticas ambientalmente sustentáveis no negócio, Gabi e Alex instalaram um reservatório para captar água da chuva, que serve para limpar o ateliê, as ferramentas e para fazer os vidrados. Depois que compraram um painel solar, também deixaram de gastar energia elétrica durante os processos de queima das peças. Além disso, as sobras de argila vão para a máquina de reciclagem, e não para o lixo. “O meu descarte vai próximo a zero hoje em dia. Então, eu perco quase nada”, garante Alex Hell. Ano passado, o ateliê ganhou o selo B Corp, uma certificação dada a empresas comprometidas com a sustentabilidade social e ambiental. Segundo Hell, atualmente, há clientes “que por acaso compram de nós, porque, apesar de outros fatores, somos muito sustentáveis e temos o selo para poder provar isso”.

PURA CONNECTION
ALEX ATALA - CONNECTION #0108

PURA CONNECTION

Play Episode Listen Later Jul 3, 2023 89:36


Alex Atala, faixa preta de jiu-jitsu e chef de cozinha brasileiro, ganhou destaque internacional por seu trabalho na valorização e divulgação da gastronomia amazônica. Atala começou a praticar jiu-jitsu ainda jovem e se envolveu tanto com o esporte que chegou a alcançar a faixa preta. Para ele, o jiu-jitsu não é apenas uma atividade física, mas também uma filosofia de vida. Ele consegue conciliar o mundo da gastronomia e o jiu-jitsu, especialmente no que diz respeito à disciplina, à dedicação e à busca constante pela excelência. Tanto a gastronomia quanto o jiu-jitsu exigem esforço, paciência e um comprometimento incansável para melhorar suas habilidades e superar os desafios pelo caminho. Ele acredita que esses valores e princípios são fundamentais para alcançar o sucesso em qualquer área e leva muito do jiu-jitsu para seu papel na gastronomia e na forma de liderar sua equipe. Uma troca incrível com esse exemplo de ser humano, líder e atleta, que conta um pouco de como o jiu-jitsu fez a diferença na sua vida profissional. Mais um episódio inédito e na íntegra no Pura Connection Podcast! Inscreva-se em nosso canal do youtube e assista todos os podcasts: www.youtube.com/puratempleofarts ❌NEW PROJECT PURA CONCEPT - GUARDA DIAMANTE BY XANDE RIBEIRO❌ Curso 100% on-line! Compre agora e aproveite o preço com desconto por tempo limitado… Clique no link abaixo e saiba mais: https://www.puratempleofarts.com.br/curso-de-jiu-jitsu-online-guarda-diamante-by-xande-ribeiro/ ❌ NEW PROJECT ⚡️ PURA CULTURE - LIFE PHILOSOPHY ❌ Um projeto que foi criado para que você faça parte da filosofia PURA em qualquer lugar do mundo. Juntamos todas as nossas experiências e conhecimentos adquiridos em todos esses anos, além de conexões com grandes ídolos dos tatames e da vida em um só lugar, na Plataforma PURA CULTURE! Nossa plataforma conta com módulos como trocas com grandes mestres no dojo, jiu-jítsu Pura, treinos de mobilidade, Pura Yoga, posturas, técnicas de respiração, meditação, backstage exclusivos de podcasts e outros conteúdos com foco na evolução física, mental e espiritual. Novos conteúdos serão publicados toda semana, com bônus de aulas ao vivo de jiu-jitsu e yoga. Os 50 primeiros assinantes terão um desconto vitalício de 50% na sua assinatura (apenas R$ 38,50). Seja um membro. https://www.puratempleofarts.com.br/pura-culture/ Visite nosso website: www.puratempleofarts.com.br --- Support this podcast: https://podcasters.spotify.com/pod/show/puratempleofarts/support

BTG Pactual
Arte de se Relacionar T1.EP.4 - Alex Atala

BTG Pactual

Play Episode Listen Later Jun 10, 2023 54:12


Martha Leonardis, sócia do BTG Pactual, conversa com Alex Atala, chefe de cozinha. Atala inicia o bate-papo contando sobre seu “mochilão” pela Europa, com experiências e relacionamentos formados durante a viagem. O chefe detalha os desafios do início de carreira, a importância de manter bons relacionamentos com colaboradores, fornecedores e clientes, além de contar como foi criado o restaurante D.O.M.

Esquizofrenoias
TEA: “Você também é assim?” com Rafael Mantesso

Esquizofrenoias

Play Episode Listen Later May 1, 2023 76:06


Rafael Mantesso nasceu em 1983 em Carangola, Minas Gerais. Publicitário, escreveu por seis anos o blog Marketing na cozinha e, junto com o chef Alex Atala, é cofundador do Instituto ATÁ, que promove a cozinha, os ingredientes brasileiros e o meio ambiente. Equipe: Apresentação e roteiro: Amanda Ramalho Produção executiva: Natália Daumas Produção e estratégia digital: Juliana Mosca Beleza: Natália Stracieri Figurino: Lela Brandão Co, MIndse7 C&A e Adidas Cenografia: Carol Oliveira Animação: Vinicius Kahan Captação: Agência de Podcast Música e Áudio: Pedro Zimmer Edição e arte: Isabella Garcia Direção: Alexandre Nickel contato@esquizofrenoias.com.br

mymuybueno Chefs Get Personal
Chef Kirk Westaway

mymuybueno Chefs Get Personal

Play Episode Listen Later Dec 27, 2022 20:09


In this episode Justine interviews the multi-award-winning and internationally renowned chef Kirk Westaway, who shares his favourite cookbooks and which chefs inspire him most.
Drawing from both his English heritage and a strong culinary training under world-renowned chefs, Kirk is recognised for his mastery in reinventing gastronomic classics that are easily received as comforting and refined. Born in South West Britain, Chef Kirk has redefined JAAN by channelling his deeply rooted passion for natural flavours that are reminiscent of his childhood in England. His perpetually evolving menu embodies a seasonal philosophy to showcase British dining in a modern, refreshing light. From conceptualising his culinary direction and selecting the world's best gourmet suppliers, to personalising bespoke tableware and curating specially designed furniture, subtle shifts in JAAN's dining experience over the years have seen Chef Kirk and his team move towards a stronger emphasis on ‘Reinventing British' – culminating in the eponymous rebranding of JAAN to JAAN by Kirk Westaway in June 2019. In 2021 JAAN was awarded its second Michelin Star.
Prior to JAAN, Kirk had the privilege of working and honing his skills in some of the best kitchens in the world, under culinary masterminds such as Chef Alex Atala of D.O.M., Chef Raymond Patterson of Patterson's London, as well as Chef Antonin Bonnet, who formerly helmed the two-MICHELIN-starred Greenhouse in London. Westaway credits his friend and mentor, Chef Julien Royer of Odette, who was previously Executive Chef at JAAN, for opening his culinary chapter in Singapore, where he has lived since 2011.
You can follow Kirk below to stay tuned and inspired with all he has going on ahead.
Chef Kirk Westaway is Executive Chef of JAAN by Kirk Westaway
Website: www.jaan.com.sg
Instagram: @chefkirk_w & @jaanbykirk
Thank you for listening. Subscribe now so you don't miss an episode.
Follow mymuybueno on Instagram to stay updated on all that's going on, now in its eleventh year, and Justine too, as well as all her restaurant visits and reviews. And our mymuybueno Chefs Instagram – our culinary community. Use our hashtag when posting your best dishes and when searching for inspiration #mymuybuenochefs
#mymuybueno #mymuybuenochefs #mymuybuenochefsgetpersonal #eatlivelearn Hosted on Acast. See acast.com/privacy for more information.

Podcast Rebelião Saudável
Chef Liliane Pereira

Podcast Rebelião Saudável

Play Episode Listen Later Dec 5, 2022 47:46


Nessa live conversei com a Chef e Professora de Filosofia Liliane Pereira (@ChefLilianePereira). Liliane Nasceu em Alagoas e iniciou no mundo gastronômico aos 19 anos de idade. Estagiou na Alemanha, Nigéria, Portugal, em São Paulo com Alex Atala no D.O.M. Participou do primeiro reality de cozinha da TV brasileira, o super chef no programa da Ana Maria Braga. Para ela, unir pessoas em volta da mesa para que elas possam experienciar sabores e sentimentos é a proposta que alia gastronomia, arte e afetividade. Liliane é Professora de Filosofia na Nova Acrópole, uma instituição internacional que já existe há mais de 65 anos (@AcropoleFortaleza.Meireles). Você também pode nos acompanhar no instagram, http://www.instagram.com/henriqueautran. E em nosso canal do YouTube: https://youtube.com/c/henriqueautran. Estamos também no telegram com um grupo exclusivo que você pode participar. Lá no telegram eu consigo compartilhar materiais exclusivos que não dá para compartilhar no Instagram. Além disso, toda quarta feira às 6:30 da manhã temos a Reunião da Rebelião Saudável com a participação de Profissionais de Saúde. Na reunião discutimos assuntos relevantes a respeito de saúde e qualidade de vida. Você pode acessar o grupo no telegram em https://t.me/RebeliaoSaudavel. Se você gosta de nosso trabalho, deixe um review 5 estrelas e faça um comentário no seu app de podcast. Essa atitude é muito importante para a Rebelião saudável e vai ajudar nosso movimento a chegar a cada vez mais pessoas.

Podcast Studio FKM
Programa Backstage entrevista Iara Filardi Jornalista

Podcast Studio FKM

Play Episode Listen Later Nov 24, 2022 53:56


Jornalista especializada em Cultura e Arte, Iara Filardi também é Psicóloga e desenvolve estudos sobre as dimensões da comunicação na psicanálise. Começou sua carreira em veículos de rádio como CBN, Globo e Nova Brasil FM, e tem passagens pelo grupo O Estado de São Paulo e Screen vídeo (produções institucionais para agências como Talent, Lew Lara e DM9DDB). Atuou em importantes casas de shows como Palace (Citibank Hall) e Expresso Brasil, atendendo artistas e bandas como Caetano Veloso, B.B. King, Marina Lima, O Rappa, Paralamas do Sucesso, Gal Costa, Alcione entre outros. Em 2002 ingressou no mercado editorial por meio da revista História Viva. Editoras como Escala Educacional, Larousse do Brasil, Almedina, NVersos e Ciranda Cultural estão entre as empresas atendidas. Com o agenciamento de lançamentos editoriais de chefs e sommeliers como Alex Atala, Gordon Ramsay, Manoel Beato, Jamie Oliver, Claude Troisgros entre outros, conquistou espaço no setor gastronômico, o que a fez ampliar o atendimento para bares, restaurantes e profissionais da área. Com mais de dezoito anos de atuação em assessoria de imprensa, e bagagem suficiente para novas viagens, trafega também pelo universo da arquitetura em container, mantendo sempre o seu olhar atento às necessidades do mercado e ampliando as possibilidades de negócios, afinal, TUDO É CULTURA!

Let's Talk New York
#131 De Au Pair a chef de cozinha

Let's Talk New York

Play Episode Listen Later Nov 6, 2022 51:23


Mariana Lyra é baiana, mora em Nova York e é chef profissional. Mariana começou sua carreira há mais de 10 anos, quando se formou no The Culinary Institute of the Pacific no Havaí. Desde então, ela trabalhou com chefs prestigiados como Alex Atala e Thomas Keller, além de ter também trabalhado no prestigiado restaurante Per Se. Hoje, ela atua como chef particular e cultiva outra paixão: hortas. Mas, quem vê toda a trajetória da Mari nem imagina que a razão pela qual ela veio parar nos EUA foi o intercâmbio de Au Pair. Mari se interessou pelo programa quando decidiu que queria aprender inglês - na época, ela nem cogitava fazer gastronomia e estava em uma área completamente diferente. Porém, vocês sabem como é o destino né? LINKS DO EPISÓDIO: Instagram da Mariana @marisgardenandkitchen Código para desconto de U$30 ao se inscrever para o teste TOEFL LAURA30 - link de termos e condições: https://cirql.me/b/xvvnk

Expresso - Expresso da Manhã
Chef, tem uma formiga no meu prato

Expresso - Expresso da Manhã

Play Episode Listen Later Jul 1, 2022 12:47


O que comemos é uma questão cultural e há preconceitos que é preciso ultrapassar para provar ingredientes a que não estamos habituados, mesmo com sabores que conhecemos bem. Este fim-de-semana realiza-se em Lisboa um festival de comida de rua, que tem chefs com estrelas Michelin mas não só. No Foodtopia, há os pratos a que estamos habituados e há também uma formiga que veio de propósito da Amazónia. Neste episódio, porque a cozinha tem sabor e tem saber, conversamos com Alex Atala, chef com duas estrelas Michelin, responsável pelo D.O.M em São Paulo. See omnystudio.com/listener for privacy information.

The Shift
O lado B dos negócios digitais

The Shift

Play Episode Listen Later Jun 24, 2022 74:03


Com crise, ou sem crise, é preciso inovar sem atropelar conceitos fundamentais de finanças. E fazer muito bem o básico, alertam Rodrigo Fernandes e Rodrigo Quinalha, co-fundadores da escola de gestão executiva B.side. Vale para qualquer negócio, até os que nasceram digitais.LINKS DO EPISÓDIOO livro “A Verdade Dos Fatos - Gerenciamento Baseado Em Evidências”, de Jeffrey Pfeffer e Robert I. SuttonO filme “Sushi a la mexicana”, no NetflixO livro “Gestão de Alta Performance:Tudo o que um gestor precisa saber para gerenciar equipes e manter o foco em resultados”, de Andrew GroveO vídeo “Espaguete Alho & Óleo por Alex Atala”, no YouTube_____FALE CONOSCOEmail: news@theshift.info_____ASSINE A THE SHIFTwww.theshift.info

CHEFS
Marie-Victorine Manoa

CHEFS

Play Episode Listen Later May 17, 2022 44:30


Il y a un bastion Lyonnais affirmé par sa carte, au cœur de Paris, rue Saint Marc et on y trouve tout ce qui fait la richesse de cette terre de gastronomie.En cuisine, une figure, surprenante: une jeune cheffe qui s'impose naturellement par l'héritage de son savoir mais dans les faits, par sa sensibilité et la force qu'elle a mis à tracer sa route. On voulait vraiment vous présenter cette cheffe de trente ans car elle détonne par son histoire et l'énergie qu'elle met aujourd'hui à l'écrire.Marie-Victorine Manoa c'est une évidence et une anomalie à la fois:Née à Lyon, elle a grandi dans un bouchon, dans les pattes d'un père chef, figure bien connue des gastronomes de la région. Puis elle est allée voir ailleurs, elle a découvert la vision d'un Redzepi, elle s'est livrée à l'exigence du NOMA alors meilleur restaurant du monde, pour partir ensuite au Brésil chez Alex Atala, puis à New York... Avant d'être adoubée par Alain Ducasse!Voila, voilà...Et si aujourd'hui, elle est revenue sur le chemin originel, qu'elle s'impose dans la lignée d'une cuisine de tradition, elle est trop sauvage pour être une simple héritière...Pour faire sa cuisine, elle aurait pu renverser les codes, la table, revisiter ou carrément envoyer balader les mères... et son père, mais elle a choisi une autre partition.Tandis qu'elle se tient debout au-dessus de ces convives pour s'assurer que tout va bien, dans ce restaurant qui porte les couleurs de sa terre, chacun peut désormais découvrir une nouvelle version de la tradition. Généreuse mais délicate, réconfortante mais rafraîchie, assumée mais sensible.Une cuisine qui dit doublement son histoire, comme sa voix et son regard: elle est déterminée et fonceuse, il est tendre et généreux...Production: NOLA MEDIASRéalisation: Nathan Cohen et David OrdonoInterview : David OrdonoCréation musicale: Nathan CohenProgrammation : Marion CazesContact: podcast@nolamedias.comPour découvrir la cuisine de Marie-Victorine Manoa:Aux Lyonnais - 32 rue Saint-Marc - Paris 2https://www.auxlyonnais.com/Pour la suivre sur Insta: https://www.instagram.com/marievictorinemanoa/ Hébergé par Acast. Visitez acast.com/privacy pour plus d'informations.

Paladar Distinto
# 154 - Chef Flávia Quaresma | Trajetória e Cursos

Paladar Distinto

Play Episode Listen Later Apr 10, 2022 42:22


Entrevista com a Chef Flávia Quaresma, formada em gastronomia e em confeitaria, com o Grand Diplôme École Le Cordon Bleu Academie Culinaire de Paris. Se especializou em artes culinárias durante 10 anos, em algumas das mais famosas escolas de gastronomia na França, como a École Lenôtre, École Gastronomique Bellouet Conseil e INBP – Institut National de Boulangerie et Pâtisserie. Flávia trabalhou em importantes restaurantes no exterior, como Lapérouse e La Bute Chaillot do Chef Guy Savoy, em Paris, Keren em Tel Aviv, e Los Irabien na cidade do México. Foi proprietária do restaurante “Carême Bistrô”, onde recebeu diversas premiações. Em 2001, como consultora, ela abriu Restaurant “Eça” na área do centro do Rio, que ela administrou até dezembro de 2003. E recentemente teve o Café e Restaurante no Museu do Amanhã no Rio de Janeiro. Participou de vários programas de televisão como “Armazém 41” e “Flávia Quaresma Especial” no canal a cabo GNT, e “Mesa pra Dois” onde trabalhou ao lado Chef Alex Atala entre os anos de 2004 e 2006 sobre o mesmo canal. Realiza diversos cursos e oficinas, no Brasil e no Exterior apresentando nossa gastronomia. E também faz sucesso como autora, já tendo publicado diversos livros, entre eles: Saboreando Mudanças: O Poder Terapêutico dos Alimentos e Arte de Cozinha. Atualizada no Mundo Digital, lançou uma plataforma com aulas on-line “A Cozinha é para todos”, onde toda terça-feira as 19h ela realiza ao vivo diversas receitas da culinária francesa entre outros clássicos da gastronomia. (informações na Bio do Instagram da Chef) @chefflaviaquaresma Informações para aula on-line da Chef: Clique Aqui. Fique por dentro das novidades, siga nosso Instagram: @paladar.distinto

Paladar Distinto
# 149 - Chef Rafa Bocaina | Curiango Charcutaria Artesanal / "Trajetória e Charcutaria"

Paladar Distinto

Play Episode Listen Later Mar 24, 2022 36:39


Entrevista com Rafael Cardoso, o Rafa Bocaina, que cresceu no ambiente rural das fazendas do Vale do Paraíba (interior de SP), entre as Serras da Mantiqueira e da Bocaina. Formado em hotelaria e gastronomia pelo Senac Campos do Jordão, começou como estagiário do Alex Atala em 2003. Na Espanha trabalhou em algumas das cozinhas mais importantes da atualidade, como a do Restaurante Mugaritz de Andoni Aduriz (um dos melhores restaurantes do mundo pelo ranking da revista inglesa Restaurant Magazine). Também esteve no Chile, com Rodolfo Guzmán e em Belém do Pará com o grande defensor da cozinha paraense, Paulo Martins. Ainda chefiou casas em São Paulo, Goiânia e Belo Horizonte e foi professor de gastronomia em diversas entidades. Em 2013, palestrou no renomado congresso Madrid Fusión. Já com a carreira em gastronomia consolidada, mudou com sua família para uma pequena fazenda em Silveiras, interior de São Paulo. Faz a criação porcos a pasto e se dedica ao resgate do caruncho, uma espécie de porco crioulo da Serra da Bocaina, por isso, adotou o apelido Rafa Bocaina. Além de trabalhar com porcos, realiza um trabalho de resgate e disseminação dos sabores caipiras do Vale do Paraíba e se dedica a charcutaria artesanal. @rafa_bocaina @curiango.art Canal You Tube: Prosa Caipira Fique por dentro das novidades, siga nosso Instagram: @paladar.distinto

Paladar Distinto
# 148 - Chef Flavio Miyamura | Gastronomia Asiática / "Trajetória e Criações"

Paladar Distinto

Play Episode Listen Later Mar 20, 2022 37:28


Entrevista com o Chef Flavio Miyamura, à frente do Dasian, restaurante asiático com cozinha contemporânea localizado no bairro do Itaim Bibi em São Paulo. Em seu currículo, destacam-se passagens por restaurantes tradicionais e de treinamento bem rígido como o Shin Zushi e contemporâneos, como o D.O.M., onde começou como estagiário e terminou como subchef. Entre os anos de 2007 e 2011 esteve ao lado dos gêmeos catalães Sergio e Javier Torres, como chef executivo nos restaurantes eñe de São Paulo e Rio de Janeiro. Com sua capacidade de juntar a delicadeza japonesa à pureza da cozinha espanhola, Miyamura deixou suas marcas. Uma de suas mais famosas criações o Cochinillo Crujiente con Manzana (Leitão Crocante com Maçã), vencedor do prêmio Paladar 2008 na categoria “Carne de Porco”. Em 2012 Miyamura teve sua primeira aventura solo, o Miya, apresentando receitas descomplicadas, com ingredientes especiais e mantendo o clima descontraído. Com ele, recebeu dois prêmios na categoria de Chef Revelação, concedidos pelas revistas Prazeres da Mesa e Gosto. Miyamura também já participou de uma temporada do programa Mesa pra Dois, do GNT, ajudando o chef Alex Atala. Hoje comanda o programa “Hashi” no canal Sabor & Arte e na atração, conduz o telespectador em uma viagem pela culinária asiática, na qual une a cozinha e a cultura. @flaviomiyamura @dasiansp @saborearte Fique por dentro das novidades, siga nosso Instagram: @paladar.distinto

BEN-YUR Podcast
#110 THIAGO BAÑARES (TAN TAN, KOTORI)

BEN-YUR Podcast

Play Episode Listen Later Jan 18, 2022 126:45


Thiago Bañares é chef de cozinha, já trabalhou com Alex Atala, atualmente é proprietário do Kotori e do Tan Tan.

Wood Fired Oven
Masterclass Interview - Clive, The Wood Fired Oven Chef from YouTube sits down with me for an in-depth chat about Wood Fired Oven Cooking.

Wood Fired Oven

Play Episode Play 44 sec Highlight Listen Later Dec 12, 2021 127:56 Transcription Available


Join me as I chat to Clive, The Wood Fired Oven Chef from YouTube, about his love for fire, food and, most importantly, family. Over the past 3-4 years, Clive has inspired, educated and motivated countless wood fired oven enthusiasts around the world with his superbly produced wood fired oven videos on YouTube. Like many, I eagerly await the next episode instalment, which almost always leads me to try new and exciting dishes in my wood fired oven. Getting to know Clive personally over the past month has been a real privilege, and I am very excited to share my extended interview with Clive with you all. Many of you sent in audio questions for Clive - all of which he answers on the show. Sit back, relax and get ready for an amazing Masterclass from a true gentleman and creative master.
Check out Clive's YouTube channel: The Wood Fired Oven Chef
Check out Clive's website: The Wood Fired Oven Chef
Follow Clive on Instagram: @thewoodfiredovenchef
Chefs and other influential folk mentioned on the episode: Victor Arguinzoniz, Keith Floyd, Nigel Slater, Julia Child, Jacques Pépin, Marcella Hazan, Claudia Roden, Yotam Ottolenghi, Tom Colicchio, Francis Mallmann, Lennox Hastie, Alex Atala, Eric Ripert, Richard Miscovich, Steven Raichlen, José Andrés, Joël Robuchon.
TV shows mentioned on this episode: Chef's Table, Top Chef, Made in Spain
Wood fired oven manufacturers mentioned on this episode: Forno Bravo, Mugnaini, Ooni Pizza Ovens, Gozney, Maine Fire Brick Company, Zesti, Le Panyol, Melbourne Fire Brick Company
Restaurants mentioned in this episode: Etxebarri, Le Bernardin, El Bulli
Wine mentioned in this episode: Burgundy, Barolo, Barbaresco, Albariño
Review Wood Fired Oven on Apple Podcasts to let me know what you think of the show. Check out my website for episode show notes and links, wood fired oven tips and advice, pictures and recipes: woodfiredoven.cooking
Social: Instagram: marks_WoodFiredOven Twitter: @WFOPodcast Facebook: Wood Fired Oven Podcast Follow me on: YouTube
A few books mentioned in this episode: The Bread Builders by Daniel Wing and Alan Scott; From the Wood Fired Oven by Richard Miscovich
Using these links helps support the show - and costs you nothing extra. Thermoworks: I've been using the highly reviewed Thermapen ONE for a few years - always checking my chicken and steak. With full readings in ONE second or less, Thermapen ONE approaches the speed of thought - allowing you to focus exclusively on the quality of your cook. Support the show and use this link to grab yourself one of the best food thermometers around. Grab a Thermapen ONE.
Support the show (https://www.buymeacoffee.com/MarkG)

PURA CONNECTION
ALEX ATALA - PURA CONNECTION EPISODE #0029

PURA CONNECTION

Play Episode Listen Later Dec 2, 2021 88:06


Com os pés fincados em suas raízes e os olhos voltados para o futuro, Alex Atala é acima de tudo um apaixonado. Pelo Brasil, pela natureza, pela gastronomia, pela vida. Movido por desafios e um grande sentimento de indignação, Atala consegue com extrema delicadeza e técnica transformar essa energia criativa em experiências inesquecíveis para quem tem a oportunidade de provar suas invenções. Supera as fronteiras da cozinha e atua como cidadão responsável, valorizando o pequeno produtor, incentivando jovens profissionais e apoiando projetos do terceiro setor. Um ser humano fora do “normal”, que carrega seu espírito de samurai e sua faixa preta em todas as situações da vida. Foi uma grande honra receber mais um Mestre da vida em nosso podcast! Conversa imperdível! Inscreva-se em nosso canal do youtube e assista todos os podcasts: www.youtube.com/puratempleofarts Visite nosso website: www.puratempleofarts.com.br --- Support this podcast: https://podcasters.spotify.com/pod/show/puratempleofarts/support

Car Dogs
#82 Eating Ants, Alex Atala and Thanksgiving weekend

Car Dogs

Play Episode Listen Later Nov 29, 2021 55:49


If the ants rise up it will be a fight- 1.6 million exist per person- that's over 372kg of ants per person (over 820lbs.) --- This episode is sponsored by · Anchor: The easiest way to make a podcast. https://anchor.fm/app

Paladar Distinto
# 116 - Alexandre Park | Popa Artesanal / "Hot Dog Artesanal"

Paladar Distinto

Play Episode Listen Later Nov 18, 2021 30:45


Entrevista com Alexandre Park, criador da PoPa Artesanal Hot Dog, pós-graduado em Gastronomia pela Universidade Anhembi Morumbi, iniciou seu trabalho na cozinha profissional após trabalhar 18 anos na advocacia empresarial para pequenas, médias e grandes empresas como multinacionais de porte da Samsung entre outras. Movido pelo amor à gastronomia e também pela necessidade de cuidar de sua saúde, foi buscar resiliência em sua vida com a mudança de profissão, trocou o terno e gravata pelo dólmã e o “toque blanche” para viver uma vida mais feliz. Trouxe para a sua cozinha toda a experiência adquirida nas cozinhas dos restaurantes Kinoshita do Chef Tsuyoshi Murakami, do restaurante DOM do Chef Alex Atala, do restaurante EÑE dos Chefs irmãos Javier e Sergio Torres (atual Dos Cielos), onde trabalhou por dois anos e o restaurante Manioca da Chef Helena Rizzo entre outros restaurantes no Brasil e nos Estados Unidos. A base de criação da PoPa Artesanal Hot Dog tem três origens bem distintas, a primeira na sua memória afetiva de infância onde aprendeu a saborear os embutidos caseiros feitos pelos amigos de seus de seus pais de origem portuguesa, italiana e espanhola, a segunda na cozinha de sua mãe, exímia cozinheira da culinária coreana e a terceira seu hobby de produzir embutidos, quando em suas viagens ao exterior comprava livros de charcutaria colocando em prática em ocasiões como reunião familiar e de amigos. O projeto para a execução da PoPa Artesanal Hot Dog foi realizado ao longo de 3 anos, sendo que, 1 ano, rodou por 4 cidades norte americanas e 1 canadense, entre elas, Nova York, Miami, Los Angeles, Seattle e Vancouver para pesquisar os grandes “players” do negócio de Hot Dog mundial, entre eles Nathan's Famous, Feltman's, Sabrett, Pink's, Japadog entre outros e 2 anos de execução entre a reforma do imóvel da fábrica, compra dos equipamentos e utensílios de cozinha e a confecção pelas próprias mãos dos carrinhos onde seriam vendidos os Hot Dogs. Atualmente, reabriu sua pequena loja em São Paulo na rua Oscar Freire, 502, bem na esquina com a rua Padre João Manuel. @popaartesanal

Paladar Distinto
# 113 - Chef Claudemir Barros | Gastronomia Pernambucana / "Pratos típicos da região"

Paladar Distinto

Play Episode Listen Later Nov 7, 2021 35:01


Entrevista com Claudemir Barros, Chef de Cozinha e sócio do Restaurante Oleiro Cozinha Artesanal – Formado pelo Curso de Cozinheiro Internacional do Senac Pernambuco, Claudemir tem se dedicado às pesquisas dos ingredientes brasileiros, sobretudo os da região Nordeste. Claudemir tem no DNA o ofício de cozinheiro. Sua mãe, dona Anita, foi cozinheira líder por 17 anos do tradicional restaurante Leite, considerado o mais antigo em funcionamento do País. Na bagagem profissional, Barros acumula 14 anos à frente do recifense Wiella Bistrô, levou o título “Chef do Ano” pelo prêmio “Melhores do Ano de Pernambuco”, da revista Prazeres da Mesa, em 2012, e, recentemente, o lançamento do seu primeiro livro: o “Sonhos e Sabores”, com apoio do Funcultura, no qual traz um recorte de suas descobertas no Interior do Estado, desvelando ingredientes desprestigiados e elevando a importância gastronômica de insumos corriqueiros da cultura local. Da sua trajetória ainda constam estágios no Emiliano, junto ao chef Russo, braço direito do francês Laurent Suaudeau; no D.O.M de Alex Atala, ambos em São Paulo, e na Casa Marcelo, uma estrela Michelin, localizado em Santiago de Compostela, na Espanha. É idealizador do projeto “Plantar Ação”, que busca valorizar o trabalho dos pequenos produtores pernambucanos, e que serviu de ponto de partida para o seu livro. @claudemirbarros2 @oradionacozinha

Paladar Distinto
# 98 - Chef Thiago Bañares | Tan Tan, Kotori e Ototo / "Trajetória e Inspirações"

Paladar Distinto

Play Episode Listen Later Sep 16, 2021 32:55


Entrevista com Thiago Bañares, Chef de Cozinha do Tan Tan, Ototo e Kotori. Na gastronomia há 14 anos, possui sólida experiência em gestão administrativa e financeira. Sua maior influência vem do avô, chinês de Macau, que até aportar no Brasil era cozinheiro de navio. Foi a partir de um estágio com Alex Atala, no D.O.M, que ele se encantou com a cozinha. Depois, vieram o emprego e as viagens com Alex, para cozinhar em outros países. Iniciou sua carreira em gastronomia em 2006 no extinto Sophia Bistrot. Em 2009, foi responsável pelo menu degustação do D.O.M, de Alex Atala, e anos depois, ao lado de Paola Carosella, comandou o restaurante Arturito. Além disso, foi um dos responsáveis pela expansão da Z Deli Sandwiches. Este paulista de Guaraçaí, que se define sorridente, inquieto, rigoroso e persistente, passa seus dias em torno de um dos maiores sucessos de São Paulo, o Tan Tan e das suas novas casas o Kotori e Ototo. @tbanares @tantannb @kotori.sp @ototo.sp

PODCAST Styllus & Variedades
Mudar de carreira, você já pensou, mas acha que está muito tarde? Bate papo com o chef Ulisses

PODCAST Styllus & Variedades

Play Episode Listen Later Aug 18, 2021 12:15


Mudar de carreira, você já pensou, mas acha que está muito tarde? Então não perca o 1° episódio do podcast “Styllus & Variedades”. Convidamos o chef de cozinha Ulisses, trazendo ingredientes incríveis! Ele ainda nos conta como foi importante o aprendizado com Rita Lobo e Alex Atala, entre outros. Ouça o nosso PODCAST, Styllus & Variedades, e faça parte dessa história! @veronica.bispo.77 A vida é bela, ame verdadeiramente, nunca deixe de sorrir. Ulisses. --- Send in a voice message: https://anchor.fm/verobispo10gmailcom/message

PolitiCast - Marcelo Politi
Autenticidade é o prato principal. - Entrevista com Vinícius Rojo | Entrevista com Politi | #EP027

PolitiCast - Marcelo Politi

Play Episode Listen Later Jun 23, 2021 67:44


Depois de formado no Senac, Vinícius Rojo sempre buscou se aprofundar e explorar horizontes além do território nacional. Passou pela cozinha de celebridades em busca de ocasiões especiais. Além disso, encontrou e participou de bate-papos com Juan Roca, Heston Blumenthal e Ferran Adrià. Sua chegada ao Brasil fez com que tivesse a possibilidade de participar da equipe de Alex Atala e Murakami. Foi assim que fundou, em 2007, a Rojo Gastronomia, um lugar para colocar em prática tudo o que sentiu, ouviu, aprendeu e guardou ao longo dos anos. Neste episódio do podcast “Na Cozinha do Politi”, eu converso com Vinícius Rojo, que com todo seu entusiasmo, espontaneidade, perseverança e desejos, inspira quem prova o sabor da Rojo Gastronomia. Acompanhe-me e marque no Instagram @nacozinhadopoliti. Produção: Voz e Conteúdo - www.vozeconteudo.com.br

Mesa Pra Quatro
#1: Ele desistiu do grande sonho de sua vida, mas fez um ótimo negócio (com Alex Atala)

Mesa Pra Quatro

Play Episode Listen Later Jun 19, 2021 92:31


In the first episode of Mesa Quadrada, Dan Stulbach, Caio Mesquita and Teco Medina welcome the renowned, award-winning chef Alex Atala. A relaxed conversation about money, business, career and life. Check it out!

Paladar Distinto
# 61 - Chef Denise Guerschman | Gastronomia Nórdica / "Comida dos Vikings"

Paladar Distinto

Play Episode Listen Later May 24, 2021 30:04


Interview with chef Denise Guerschman, owner of the restaurant "O Escandinavo". The chef insists on carrying out every step behind what is served – mixing the grains for true Swedish bread, always ready to accompany butter made from fresh milk and seasoned with special salts; curing and smoking fish; stuffing sausages with pork leg and bacon, making beetroot ketchup and fermenting mustard for Norwegian hot dogs; staying by the low flame while meats cook in carefully perfumed broths. The restaurant is intimate, with a single large table around which up to 20 guests can work through a short, carefully built menu. At 15, Denise Guerschman left for an exchange program in a small village in Norway, spending a few months in Sweden as well, and fixed Scandinavia among her habits and passions. Back in Brazil, she graduated in the first hotel-management class of the Grande Hotel Senac in Campos do Jordão, in 2000. A natural aptitude took her to Roanne, where Claude Troisgros was making his São Paulo debut. From there she left with Alex Atala for the opening of D.O.M. The young cook then saw the opportunity to join the brigade of Norway's first one-Michelin-star restaurant, Bagatelle, and stayed eight years in Oslo, also cooking at the starred Ylajali, Sentralenn and Bare in Bergen. In 2010, back in São Paulo, Denise focused on food styling. She took courses at the Escola Panamericana de Artes, where she later went on to teach, and worked in major food photography studios. When she began to miss the stove, she kept up her photo and video productions but invented one more job: Scandinavian dinners – an itinerant format featuring recipes learned in the Scandinavian countries. The side project lasted four years, with occasional refresher trips around Scandinavia, and became so serious that the chef needed a place to devote herself to it. That is how O Escandinavo came about in 2018; it now operates next to her photography studio, allowing Denise to balance her life between her two passions. Rua Deputado Lacerda Franco, 141 – Pinheiros – São Paulo – SP @denise.guerschman @o.escandinavo

PolitiCast - Marcelo Politi
Equilíbrio entre gestão e talento! - Marcelo Fernandes | Entrevista com Politi | #EP23

PolitiCast - Marcelo Politi

Play Episode Listen Later May 19, 2021 82:13


For kitchen specialists, Management and Planning can sound like difficult or even boring subjects, yet they are fundamental to growing the business and cannot be ignored. In this episode of the podcast "Na Cozinha do Politi", I talk with Marcelo Fernandes, founder of Grupo MF, which runs several restaurants and has diversity as its flagship. Fernandes talks about his first partnership, with the renowned chef Alex Atala, and about the importance of conscious management for a restaurant. Follow and tag me on Instagram @nacozinhadopoliti. Production: Voz e Conteúdo - www.vozeconteudo.com.br

Feijoada Cast
#38: Chef Aline Guedes

Feijoada Cast

Play Episode Listen Later Apr 28, 2021 52:03


Master's in Hospitality from Universidade Anhembi Morumbi – UAM (2016), postgraduate degree in Event Administration and Organization from SENAC São Paulo (2008), degree in Gastronomy Technology from Centro Universitário Senac – Águas de São Pedro (2005). A researcher of food and commensality in remaining quilombo communities of the state of São Paulo, with publications in Brazil as well as Portugal, Spain and Argentina. She teaches in the undergraduate Gastronomy programs of Universidade Anhembi Morumbi (SP) and at institutions in the interior of the state, adding to her theoretical and practical teaching the production of course content for distance learning. For more than 12 years she has also worked in the events segment as a personal chef and caterer for bespoke social and corporate functions. She has worked in restaurants and hotels in Brazil, gaining professional experience alongside chefs such as Bel Coelho, Alex Atala, Mauro Maia and Hamilton Mellão, among others, and was invited to spend a season opening a hotel and restaurant in Dubai, in the United Arab Emirates, as sous chef.

Empreendedorismo e Humor A Tradição Conectada com o Novo
08 - LIVE_43 - Marcio Capelli O Herdeiro Que Decidiu Empreender

Empreendedorismo e Humor A Tradição Conectada com o Novo

Play Episode Listen Later Feb 11, 2021 56:16


Marcio Capelli is the son of Reinaldo and Margarida Capelli, owner of the Simulassão brand, for many years a reference in the women's fashion market. Even so, after many years he decided to take another direction in life and set out on a journey without the family emblem, striking out on his own with one of the big names in food products, Sun7. Sun7 aims to make life in the kitchen easier by offering high-quality frozen food. Marcio owns the brand's first retail store; until then the brand had been solely a distributor to established restaurants such as Paris 6 and Alex Atala's, to Fran's Café and to several airlines, and he has big ambitions for expanding it, even in difficult moments like the pandemic. Incubadora Fusion Life once again brings a story worth telling; let's learn together with Marcio and pick up plenty of insights on how to build a business. --- Send in a voice message: https://anchor.fm/incubadorafusionlife/message

The Kitchen Sisters Present
157 — Chido Govera—The Mushroom Queen of Zimbabwe

The Kitchen Sisters Present

Play Episode Listen Later Jan 12, 2021 25:20


A mushroom farmer, food activist, business entrepreneur, foster mother to more than a dozen girls – Chido Govera is a kitchen visionary in Zimbabwe, a pioneer in the cultivation of mushrooms throughout Africa and the world. Chido was orphaned at 7 when her mother died of AIDS. As a girl who never had enough to eat, she began cultivating mushrooms when she was nine. Some people look at a mushroom and see a mushroom. Chido looked at a mushroom and saw a weapon for social change, a path out of hunger and poverty to empowerment and income for herself and other orphaned girls. The founder of The Future of Hope Foundation, Chido has promoted mushroom cultivation as a sustainable source of food and income in impoverished regions of the world. We met Chido in São Paulo at FRUTO, an international gathering of chefs, farmers, activists, fishermen, Amazonian tribal women organizers, botanists and more – organized by Brazilian chef Alex Atala, famous from Netflix's Chef's Table. Speakers from around the world delved deep into issues of food, zero waste, the destruction of coastal waters, agriculture and climate change, and the rights and foods of indigenous people of the Amazon. The conference was profound: a global eye-opener. Special thanks to Alex Atala, Felipe Ribenboim, Lars Williams and the NOMA community in Denmark. The Kitchen Sisters Present is part of Radiotopia from PRX, a curated collection of podcasts from some of the best independent producers around.

Gastronomia
Les vaques Salers i les rutes vitivinícoles

Gastronomia

Play Episode Listen Later Dec 25, 2020 53:40


This Christmas we take a tour around Catalonia with two Lauras. We talk about the cal…

Foodness Talks
Alex Atala: os drinks que a gente toma, e os tombos que a gente leva

Foodness Talks

Play Episode Listen Later Nov 18, 2020 48:24


To celebrate episode #50 of Foodness Talks, the big five-oh: Alex Atala. In this conversation he shares stories, tight spots and the main lessons learned over more than 30 years in gastronomy.

80 WATTS
80 WATTS - Edição 21 (Relançamento)

80 WATTS

Play Episode Listen Later Nov 11, 2020 53:58


This edition was originally released on July 17, 2013. And here we go with another old edition in the SOBE DE NOVO SHI project. In edition 21 I mentioned the much-missed Carbono 14, which sat on Rua 13 de Maio: a four-storey space that brought together all of São Paulo's alternative and rock tribes. The venue ran from 1982 to 1987. Artists and regulars at Carbono 14: Pepe Escobar, Arthur Veríssimo, Alex Atala, Christina Wastrup, Nasi, Guilherme Isnard, Bia Abramo. (photo: https://saopaulosao.com.br) From the playlist I highlight Lone Justice, a great band with a great vocalist (Maria McKee) that deserved more recognition. That's it. Until the next 80 WATTS rerun! SHI

Eat Your Words
Snacky Tunes Cookbook

Eat Your Words

Play Episode Listen Later Oct 18, 2020 37:10


Cathy welcomes Darin and Greg Bresnitz, the hosts of the podcast Snacky Tunes on HRN, and authors of a new cookbook on food and music, also called Snacky Tunes: Music is the Main Ingredient. The authors share the fun behind-the-scenes of working on the book, which includes recipes from acclaimed chefs around the globe including May Chow, Pooja Dhingra, Alex Atala, and more. Photo Courtesy of Phaidon. Eat Your Words is powered by Simplecast.

CLOCK CHOKE: A GRAPPLING PODCAST BY SHOYOROLL
Brazilian Culture with Chef Alex Atala

CLOCK CHOKE: A GRAPPLING PODCAST BY SHOYOROLL

Play Episode Listen Later Aug 3, 2020 46:43


We talk to South America's most iconic chef (and jiu-jitsu player) about bringing Brazilian cuisine to the world's stage - all while protecting its environment, culture, and authenticity along the way.

Demian Maia Podcast
Alex Atala no Demian Maia Podcast EP011

Demian Maia Podcast

Play Episode Listen Later Jul 17, 2020 89:28


The work chef Alex Atala has built over his career with Brazilian ingredients, colors and flavors caught the attention of specialists from the start. His collection of national and international awards is extensive and, in 2013, he made TIME Magazine's list of the 100 most influential people. His flagship restaurant, D.O.M., belongs to the select group of Brazilian restaurants with two Michelin stars. In 2014, Atala received the Lifetime Achievement award in the South American edition of that magazine's awards.

Marcha e Sai
Ep52 - Entrevista Laurent Suaudeau

Marcha e Sai

Play Episode Listen Later May 25, 2020 103:03


Hello, dear listener! This week's show is very special, and our guest needs no introduction. Born in Cholet, France, he started out as a cook at 13, selling roast chicken and fries at his aunt's stall. After passing through the kitchens of several great masters, he worked for chef Paul Bocuse at his famous restaurant in Collonges-au-Mont-d'Or in 1978/1979. The pope of cuisine recognized the young man's talent and, in December 1979, invited him to a new challenge: running the kitchen of Le Saint-Honoré at the Hotel Méridien in Rio de Janeiro. He arrived in Brazil in early 1980, at just 23. Here he found great success in everything he did, winning numerous awards with his restaurant in Rio de Janeiro and later in São Paulo, and in 2000 he founded the school where he still teaches today. Having influenced generations of great cooks such as Alex Atala and Jefferson Rueda, he is a reference in Brazil and around the world. As if that were not enough, in 2004 he published the book Cartas a um Jovem Chef, required reading for every professional in the field. This week we interview the great master Laurent Suaudeau. Marcha e Sai is honored to bring this episode to you; we hope you enjoy it as much as we did. Thank you for being with us for another week, a big hug and see you soon!
Social media:
Laurent Suaudeau
https://www.instagram.com/laurent.suaudeau/?hl=pt-br
https://www.instagram.com/escolalaurentsuaudeau/?hl=pt-br
Marcha e Sai
https://www.instagram.com/marchaesaipodcast/?hl=pt-br
https://twitter.com/MarchaSai

Fora do Tatame com Fabio Gurgel
10 Questions Back to Back com Alex Atala

Fora do Tatame com Fabio Gurgel

Play Episode Listen Later Mar 10, 2020 97:03


An incredible chat with one of the most renowned chefs in the world, the tough-as-nails black belt Alex Atala.

Octanage
Da Periferia para o Mundo: Movimento, Logo Existo com Hebert Mota | Kal911 E075

Octanage

Play Episode Listen Later Feb 10, 2020 37:02


Hebert Mota is a business connector. An international entrepreneur, he fought his way out of the outskirts of São Paulo to make his presence felt in the most diverse corners of the world, without forgetting his origins. He is the author of the recently released book 'Movimento, Logo Existo' and the owner of Kal911. He was the agent of fighter Anderson Silva and carries special, unusual stories in his portfolio: he recorded with Kanye West, met Jay-Z to talk business, attended Michael Jordan's birthday party and was invited to the inauguration of former US president Barack Obama. Among the Brazilian personalities for whom he has acted as a business connector are Mano Brown, Alex Atala, Maria Rita, Daniel Zuckerman, Lázaro Ramos and Taís Araújo, all personal friends. Check out the exclusive resources of this interview by Vinicius Fachinetto: octanage.com/e075 Would you like your question about innovation and entrepreneurship featured in our episodes? Send it via WhatsApp at octanage.com/pergunte If this episode sparked your interest in working with us to launch your idea or transform your business, apply for mentoring at octanage.com/mentoria Octanage is your dose of inspiration to innovate and build. Discover how to think and act like a successful entrepreneur! Entrepreneurship | Innovation | Entrepreneurial Mindset and Skills | Women Entrepreneurs | Business | Startups | Marketing Octanage is made with love for the Portuguese-speaking entrepreneurial community by André Piazza and Vinícius Fachinetto, Innovation and Business consultants.

Vozes do Planeta | Rádio Vozes
13 - Olimpíada ambiental

Vozes do Planeta | Rádio Vozes

Play Episode Listen Later Dec 21, 2019 20:34


In this edition the program highlights the Olympic opening ceremony and its environmental message, talks about fashion and cotton, and covers the launch of the Yanomami food books, the first publication of its kind in Brazil. Taking part in this edition are Alex Atala, Chiara Gadaleta and Resende Sanoma Yanomami.

Assim Assado
Assim Assado powered by Zomato: Alex Atala

Assim Assado

Play Episode Listen Later Nov 29, 2019 25:57


30 November 1999: Alex Atala opened, in São Paulo, Brazil, what would become one of the most important restaurants in South America, D.O.M. Assim Assado celebrates it with an interview with the Brazilian chef. We met Alex Atala on a Saturday morning, in Porto, after the…

Pitadas e Palpites
Pitadas & Palpites recebe Vivi Araujo

Pitadas e Palpites

Play Episode Listen Later Sep 24, 2019 29:47


Vivi Araújo has a degree in Gastronomy and, at just 31, already has plenty of experience. She interned with chef Alex Atala and worked with the Frenchman Pascal Valero and with Erick Jacquin. In Paris she staged at the Michelin-starred Taillevent. Back in Brazil she took over the Gastronomy section of the Yahoo! Brasil portal and created and hosted the channel "Pé na Cozinha". With this track record she has become a reference in the world of gastronomy. She took part in and won the reality show 1 Por Todos on Band in 2019, and owns a Catering & Events company. Her purpose is "to bring creativity and practicality to people's lives".

Un restaurant caníbal a Berlín
Alex Atala: sobrevalorem els diners i infravalorem el menjar perquè…

Un restaurant caníbal a Berlín

Play Episode Listen Later Jun 15, 2019 54:20


The Unbearable Lightness of Being Hungry
Jowett Yu – Ho Lee Fook, Mr Wong, Ms.G's, Canton Disco

The Unbearable Lightness of Being Hungry

Play Episode Listen Later Nov 19, 2018 51:11


Jowett Yu was working at Tetsuya's – then in the Top 5 of the World's 50 Best Restaurants – but couldn't even afford a bed. It was a wild time (just listen to the memorable "pep talk" that head chef Martin Benn gave when the restaurant reached #4 on the list) and the kitchen was full of upcoming stars: Daniel Puskas (Sixpenny), Clayton Wells (Automata), Phil Wood (Laura), Luke Powell (LP's Quality Meats) and Dan Hong – who Jowett bonded with, because they basically had the same haircut and similar cultural backgrounds. Together, Dan and Jowett would go on to open Lotus, Ms.G's and Mr Wong. At Lotus, there was the momentous night they launched David Chang's Momofuku book (and cooked for both Chang and Alex Atala); Ms.G's involved a memorable American research trip (where Jowett ate something that resulted in the "best 30 seconds of my life"); and Mr Wong was an "intense" experience where he'd finish work at 3am and clock in again at 9am. Jowett then opened Ho Lee Fook in Hong Kong (an experience that earnt him a "lecture" from his mum and a major grilling when he put her dumplings on the menu – but even she ended up a fan of the restaurant). Here, the chef has experimented with fascinating vegetarian dishes, like typhoon shelter corn and celeriac char siu. More recently, he's launched Canton Disco in Shanghai. Jowett also talks about growing up in Taiwan (and his visits to his totally boss grandmother's farm: she could look at an egg and tell when it would hatch – and be totally right) and his love of Hong Kong's Belon (he compares chef Daniel Calvert's cooking to the rise of Beatlemania). When you consider that Jowett ended up in the kitchen as a 14-year-old because he essentially didn't want to be a dishwasher (and he made the smart move of avoiding a career in journalism, too!), there's no doubt that he's had a fascinating career.

The Unbearable Lightness of Being Hungry
Kylie Javier Ashton – Momofuku Seiobo

The Unbearable Lightness of Being Hungry

Play Episode Listen Later Sep 9, 2018 74:21


Kylie Javier Ashton has dealt with forged bookings and martini glass accidents; she's disguised Alex Atala with garbage bags, and endured countless people throwing up when she's been on the job (“you could see the frequency of the voms go up when the scampi dish was on” is one of the most memorable lines from this interview). Having survived all that, it's clear that she still loves her work and wants people to join the industry (as her involvement in Women In Hospitality, Appetite For Excellence and Grow shows). Kylie Javier Ashton got her start at Tetsuya's, when it was ranked in the Top 5 on The World's 50 Best Restaurants list. She's since become the award-winning restaurant manager at Momofuku Seiobo, which has been twice-named the best restaurant in Australia by Gourmet Traveller. Not a bad place for her to be, considering she didn't "even know how to carry plates" when she entered the industry. Kylie has many amazing stories to tell, and covers it all, from what it's like to actually work with David Chang, the background to Paul Carmichael's food at Seiobo and why she asks her staff to give presentations on Caribbean culture, and the reality of your restaurant being in two pieces in The New York Times: one by Pete Wells, the other by Besha Rodell. Plus: that memorable period running Duke Bistro with Mitch Orr, Thomas Lim and Mike Eggert (which followed her spell at Bentley Restaurant & Bar with Brent Savage and Nick Hildebrandt – the "hardest" place she worked). And let's not forget the time she also boxed in Cuba. I LOVED talking to Kylie for this interview and she drops some of the best lines I've heard (it's worth listening to this episode so you can discover why “I've just been out on Oxford Street with an eyepatch” and “I didn't realise I was Wolverine for so long" are two of the greatest things anyone has ever said on this podcast)!

The Grappling Central Podcast: The biggest names in Brazilian Jiu-Jitsu (BJJ), MMA and Grappling

Alex Atala is a world-renowned chef and culinary celebrity who has been featured on CNN, Netflix and Brazilian television. He is also a BJJ Brown Belt under Demian Maia. He talks about his journey and how BJJ has helped him in various aspects of his life.

Bouffons
#17 - Lyon : À jamais les premiers en cuisine !

Bouffons

Play Episode Listen Later Apr 10, 2018 29:48


In March, Guilhem was in Lyon, which disputes the title of capital of gastronomy with Paris, for a live show as part of the Attable festival. Rather than arbitrarily picking one of the many dishes of Lyonnaise cuisine, Guilhem chose to spotlight two chefs who are giving a fresh coat of paint to the bouchons of the Cité des Gones. In the first part (01:42), Guilhem welcomes Florent Poulard, who recently opened the restaurant Monsieur P. In the second part (13:52), Guilhem talks with Ludovic Mey, who cooks as a duo with his wife Tabata Mey at the restaurant Les Apothicaires.
References heard in the episode:
Monsieur P, Florent Poulard's restaurant: 14 rue Royale, 69001 Lyon
Monsieur P's Facebook and Instagram accounts
Guy Savoy's restaurant: 11 quai de Conti, 75006 Paris
L'Arpège, Alain Passard's restaurant: 84 rue de Varenne, 75007 Paris
Daniel, Daniel Boulud's restaurant: 60 E 65th Street, New York, NY 10065, United States
La Mère Brazier: 12 rue Royale, 69001 Lyon
Les Apothicaires, Ludovic and Tabata Mey's restaurant: 23 rue de Sèze, 69006 Lyon
Les Apothicaires' Facebook account
La Tour Rose - Food Traboule: 22 rue du Bœuf, 69005 Lyon
Têtedoie, Christian Têtedoie's restaurant: 4 rue Professeur Pierre Marion, 69005 Lyon
Marguerite, the restaurant opened by Paul Bocuse: 57 avenue des Frères Lumière, 69008 Lyon
L'Etabli, Louis Fargeton's restaurant: 22 rue des Remparts d'Ainay, 69002 Lyon
DOM, Alex Atala's restaurant: rua Barão de Capanema, 549 - Jardins, São Paulo, Brazil
Noma: Refshalevej 96, 1432 Copenhagen K, Denmark
The Attable festival website
Find Bouffons on Instagram, Twitter and Facebook!
Bouffons is a Nouvelles Écoutes show hosted by Guilhem Malissen. Directed by Aurore Meyer Mahieu. Edited and mixed by Thibault Delage at the studio L'Arrière Boutique. Production and coordination: Laura Cuissard.

Imagina Juntas
Imagina Juntas #11 - Não Repara na Bagunça

Imagina Juntas

Play Episode Listen Later Mar 14, 2018 69:47


This week the girls and Gus talk about the finer points of housecleaning, São Paulo's anniversary cake and Alex Atala's galinhada.
Carol (Tchulim): Instagram, Twitter, YouTube: Carol Rocha
Jeska: Instagram, Twitter, channel: Amor Existe
Gus Lanzetta: Instagram and Twitter
--
Discover other Half Deaf podcasts:
Bumbumcast: http://bit.ly/2F5RhJ7
Papo Torto: http://bit.ly/2FLr1BD

All in the Industry ®️
Episode 168: Jonathan Rubinstein and Gabrielle Rubinstein-Cheong - The Brother and Sister Team Behind Joe Coffee

All in the Industry ®️

Play Episode Listen Later Feb 28, 2018 50:46


On today's episode of All in the Industry, host Shari Bayer is joined by the brother and sister team behind Joe Coffee, a family-owned coffee roasting company with locations in Manhattan and Philadelphia. They are Jonathan Rubinstein, Founder and President of Joe Coffee, and Gabrielle Rubinstein-Cheong, VP and Director of Catering and Customer Relations of Joe Coffee. Joe was founded in 2003 as a singular specialty coffee house in Manhattan's West Village with the simple vision of brewing the highest quality, unique coffees and serving them with unsurpassed hospitality and knowledge. Tune in to hear all about their family business and how it has grown since its launch. Shari also features a PR tip, Speed Round game, Industry News discussion, Solo Dining experience, which this week is at Alex Atala's D.O.M. in São Paulo, Brazil; and The Final Question, tying the series together. Follow us @allindustry. All in the Industry is powered by Simplecast

The Emulsion Podcast
Communal Tables, Wasted food, food symposiums and 5 factors in Creativity- #TheEmulsion Ep. 28

The Emulsion Podcast

Play Episode Listen Later Sep 7, 2017 28:42


Communal Tables: http://www.grubstreet.com/.../communal-tables-must-go.html
Whole Foods: http://www.grubstreet.com/.../amazon-prime-becoming-whole...
Wasted Trailer: https://www.eater.com/.../wasted-documentary-anthony...
Alex Atala's Fruto: https://www.eater.com/.../alex-atala-fruto-food-symposium
Charter Oak: http://www.sfchronicle.com/.../For-Meadowood-duo-going...
5 Factors in Creativity: http://blog.ideasinfood.com/.../5-factors-shaping...
What Happened to Monday: http://www.imdb.com/title/tt1536537/

FAQ21 | Frequentes e Amplas Questões do séc. XXI
#076 - A interface do Iron Man já é realidade?

FAQ21 | Frequentes e Amplas Questões do séc. XXI

Play Episode Listen Later Dec 13, 2016 40:44


Mamute plays Tony Stark. Kris dies of envy. First impressions of the HoloLens, Microsoft's underrated mixed-reality headset. Unmasking the revolutionary Magic Leap, plus robots, Alex Atala, drones and papayas. The usual. http://twitter.com/faqvinteum http://facebook.com/faqvinteum http://instagram.com/faq21
Links:
Microsoft HoloLens https://www.microsoft.com/microsoft-hololens/en-us
Tony Stark e a interface do Jarvis https://www.youtube.com/watch?v=Wx7RCJvoCMc
Microsoft HoloLens and the NFL look into the future of football https://www.youtube.com/watch?v=HvYj3_VmW6I
Magic Leap https://www.magicleap.com
Tecnicalidades do Magic Leap https://www.youtube.com/watch?v=QBa-668ByAk
Zebra Imaging http://www.zebraimaging.com/
Magic Leap is actually way behind, like we always suspected it was http://www.theverge.com/2016/12/8/13894000/magic-leap-ar-microsoft-hololens-way-behind
Demo Magic Leap Star Wars https://www.youtube.com/watch?v=lP5ZZI05A3g
Mamute no Jacare Banguela (Não fale com o motorista) https://www.youtube.com/watch?v=q32PuIs67ZQ
Wired https://www.wired.com/
DJI Mavic Pro https://www.dji.com/mavic
067 - Qual é o melhor drone? http://www.faq21.com/blog/067
Future Life with Pepper https://www.youtube.com/watch?v=3a4sZnLRvqk
Japan is running out of people to take care of the elderly, so it's making robots instead http://www.businessinsider.com/japan-developing-carebots-for-elderly-care-2015-11
Nintendo Switch Demo no Jimmy Fallon https://www.youtube.com/watch?v=7TJ7IUNWGl4

Serifacast
SerifaCast #12 Comunicação de luxo

Serifacast

Play Episode Listen Later Apr 9, 2016 28:15


The head of communications at Audi do Brasil, Christian Michael Marxen (I mispronounced his name :P), reveals the details of the German brand's communications day to day, in the run-up to car launches and during press events. Valuable tips and insights on corporate communications for premium car brands and luxury events. This episode is great! Shall we "serifar"?
>> 5 insights
- The weight of communications in selling a luxury car
- Audi's internal and external tools
- The grind of staging and covering a car launch
- How Audi is adapting to the new reality of urban mobility
- Exclusive: Audi's new media and the partnership with Uber o/
>> To sample
- The interview I did with Alex Atala at the Audi A4 launch party: http://bit.ly/1Vy31EX
- The Audi A4 launch: http://bit.ly/1RXgjpI
>> The guest
The son of a German, Christian Michael Marxen, 32, holds a public relations degree from Faculdade Cásper Líbero, in São Paulo, a project management degree from George Washington University, in the United States, and a technical qualification in business administration taken in German. He is Head of Communications at Audi do Brasil. Before that, he was a public relations officer for the Volkswagen Group in Germany and worked in Volkswagen's corporate communications in Brazil; as an intern, he worked at Mercedes-Benz. Before all of that, he organized band festivals at school, training in event organization, public relations and leadership without even knowing it.
> Send your digital smoke signal to serifacast@gmail.com, Instagram: @serifacast or facebook.com/serifacast. If you enjoyed it, share it with a friend who is also interested in the subject.
> To listen and subscribe on your smartphone, for free:
- On iPhone (iOS): open the pre-installed "Podcasts" app, search for SerifaCast and subscribe.
- On Android: download "Podcast Addict" for free, search for SerifaCast and subscribe.
> We are also on
- iTunes: https://itunes.apple.com/uy/podcast/serifacast/id1075700365?mt=2
- Podflix: podflix.com.br/serifacast
- Youtuner: youtuner.co/channel/soundcloud.com_andrehjc
- Teiacast: mundopodcast.com.br/teiacast/educacao/serifacast/
- Stitcher: www.stitcher.com/podcast/serifacast?refid=stpr
> Best podcast in Brazil: vote for SerifaCast at the link below. If you can't be bothered to register (I can't), just log in with your Facebook account. Link: bit.ly/1QaUUxU
>> SerifaCast
A podcast of insights on communications, storytelling and the behind-the-scenes of journalistic coverage.

Trip FM
Marcelo Fernandes

Trip FM

Play Episode Listen Later Mar 24, 2016


The owner of some of São Paulo's most important restaurants talks about entrepreneurship in times of crisis, gastronomy and the cooking-show craze. In this edition of Trip FM we welcome one of Brazil's leading restaurant entrepreneurs, Marcelo Fernandes. Marcelo entered the business 15 years ago, alongside chef Alex Atala, when the two founded D.O.M. together, and today he is co-owner of some of the main fine-dining restaurants in the city of São Paulo. He joins us to talk about entrepreneurship in a time of economic crisis, gastronomy, the city of São Paulo and the cooking-show craze. Plenty of good stuff in this conversation with restaurateur Marcelo Fernandes.
Céu – A Nave Vai
Ray Charles – I Got a Woman
Trupe Chá de Boldo – Chega de Tristeza
Tim Maia – Cristina
Allah-Las – Catamaran

The Culinary Institute of America
Cooking with Brazilian Chef Alex Atala of DOM

The Culinary Institute of America

Play Episode Listen Later Feb 27, 2014 4:36


A São Paulo native, Alex Atala symbolizes the city's multi-ethnic heritage as an Irish-Lebanese Brazilian. Chef Atala's DOM restaurant is regarded by many critics as the best in Brazil and is rated number 24 on the World's 50 Best Restaurants list, according to San Pellegrino's annual ranking. Atala's cooking combines intriguing Amazonian flavors with ultra-modern techniques… all filtered through the artful lens of haute cuisine. For recipes, visit www.ciaprochef.com/WCA

Trip FM
Alex Atala 2005

Trip FM

Play Episode Listen Later Aug 25, 2010


The chef of the restaurant DOM talks about his career as a musician, his childhood, his travels and more. Owner of a kitchen considered unusual, creative, modern and visually refined, he was the first Latin American to teach at the French culinary school Le Cordon Bleu. After spinning records at the old nightclub Rose Bom Bom, doing odd jobs as a painter in Europe and working in French and Italian kitchens, today he runs one of the best and most sophisticated restaurants in São Paulo, DOM. We are talking about Alex Atala, author of the book Por uma Gastronomia Brasileira and currently considered one of the best chefs in Brazil.

Trip FM
26 anos de Trip FM: especial com Alex Atala

Trip FM

Play Episode Listen Later Aug 19, 2010


Growth, change, travel, stubbornness and cooking: Alex Atala tells all on the show. Since every birthday calls for a special dish, our guest is one of Brazil's most admired chefs, Alex Atala. His restaurant, DOM, was recently ranked 18th on the San Pellegrino list of the world's 50 best restaurants. Born in Mooca, he has a determined temperament that, according to his father Milad, borders on stubbornness. He left home early to make his own way in São Paulo, became a punk, traveled to Europe chasing a girl, worked as a house painter, studied gastronomy, DJed at Rose Bom Bom and even interned here at Trip. In the food world he worked in restaurants in Belgium, France and Italy, and back in São Paulo drew attention with his work at the restaurants Filomena and Namesa. Considered by many the best Brazilian chef, he was one of the first and remains one of the main champions of using regional, Brazilian ingredients in the kitchen. Owner of the restaurants Dalva e Dito and D.O.M., which has just been ranked 18th on the San Pellegrino list of the world's 50 best restaurants, he is also the author of the books Por uma Gastronomia Brasileira (2003), Com Unhas, Dentes & Cuca and Escoffianas Brasileiras (both 2008). Want more? Listen to the show! Read more: Alex Atala in Páginas Negras #143

Trip FM
Especial Prêmio Trip Transformadores

Trip FM

Play Episode Listen Later Dec 6, 2007


A debate with Nuno Cobra, Sidarta Ribeiro, Tia Dag, Alex Atala, Mara Gabrili and Ciro Pirondi. A conversation with the nominees for the 2007 Prêmio Trip Transformadores.