www.club360.jp
@sam_gilbert_360
To fill out the form, see the link below: docs.google.com/forms/d/1f10sKqBsQg5YZEUOBT8_qYwolmOSzBdzyHPP9z05J60/edit
The Tokyo Living Podcast is hosted by Sam Gilbert: https://www.instagram.com/sam_gilbert_360/
And proudly brought to you by:
Club 360 - changing lives through health and fitness
Elana Jade - your oasis in the heart of Tokyo
Bob, Mike, and special guest Robie Malcomson discuss how opportunities for athletes to earn NIL and share revenue are transforming the competitive landscape in college basketball.

Segment 1: How Does This New Landscape Impact How We See "Amateur" Athletics?
Who Were Jim Thorpe, Phil Dickens, Sam Gilbert and Curt Flood? Why Do They Matter Here?
Why Could We Add Ed O'Bannon To This Group? How Has His Case Paved The Way For Direct Compensation?
In The Era of TikTok Stars and Influencers: Is Anyone Really an Amateur?

Segment 2: What Are The Impacts Of Paid Players On Who Finds Success (and Who Doesn't)?
What Pathways Have We Or Could We See For High Major/Mid Major/Low Major Schools?
How Have Other Related Yet Distinct Factors Also Impacted Teams and Programs?
Why Might Water Find Its Level In The Coming Years, Like It Has In Other Sports Models?

Segment 3 and Wrap Up: How Will The 2024-25 Season (and Particularly IU's Activity) Demonstrate Some Of This, And What Are We Looking For?

Some relevant videos and articles: Sparta v Thebes; O'Bannon v NCAA; Curt Flood; Jim Thorpe; Mary Lou Retton; McCaskeys and Kevin Warren.

Robie is host of the Trapped in History podcast: https://podcasts.apple.com/us/podcast/trapped-in-history/id1753424077

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Highlanders Co-Captain Sam Gilbert joined D'Arcy Waldegrave to preview their quarter-final against the Brumbies in Canberra. See omnystudio.com/listener for privacy information.
Is Earth going to evaporate? Neil deGrasse Tyson and comedian Chuck Nice learn about exoplanet discovery, planetary evaporation, biosignatures and technosignatures with astrophysicist Anjali Tripathi.

NOTE: StarTalk+ Patrons can listen to this entire episode commercial-free here: https://startalkmedia.com/show/habitable-worlds-super-earths-evaporating-planets-with-anjali-tripathi/

Thanks to our Patrons Christopher Stowe, Bo Cribbs, Jennifer Pierce, Sam Gilbert, Steven Glasser, Antonio Garibay, and David Frigoletto for supporting us this week.

Photo Credit: ESA/Hubble, CC BY 4.0, via Wikimedia Commons
Thank you so much to our listeners for being so kind while we navigate this new world! To make up for it, we have an amazing episode featuring Sam Gilbert, where we talk about what it's like to work in the veterinary industry, some do's and don'ts, along with a very hard topic: mental health in this industry. There is a trigger warning, as some content is very sensitive and there is mention of suicide. If you or a loved one is struggling, please feel free to reach out to us here, or follow the link attached for some guidance: https://www.islandhealth.ca/our-services/mental-health-substance-use-services/crisis-emergency-services Again, a big thank you to our fantastic editor Janelle at the Crooked Larch; please hit her up for all of your podcast needs. https://www.crookedlarch.com/ We thank you again for sticking with us during this, and we hope you enjoy!
On this special 99th episode of the show, I have friend Ira Bolden interviewing ME. We discuss my background, my karate career, my journey coming to Japan, the origins of Club 360, and my goals for the future.
Piney catches up with Sam Gilbert of the Highlanders, currently the leading scorer in Super Rugby.
What was the real story with UCLA booster Sam Gilbert... along with lessons Kareem taught Coach Wooden...
With busy lifestyles, many turn to devices as aide-mémoires. Claudia discusses new findings with Dr Sam Gilbert, who studies so-called 'offloading' and gives tips on how best to remember the important things. And a visit to Manchester's Turn It Up exhibition reveals what psychological research can tell us about the safest music to drive to; while guest Professor Catherine Loveday unpicks this year's trend, 'Dopamine Gifting'.
Vicki and Geraldine talk to the expert on Crypto, Web 3 and the Metaverse about what the future is going to look like - and learn it might involve child labour.

Talking points:
Should Parent Zone be getting into Crypto?
What are the online harms of the future web?
Is the future of the web the financialisation of all our relationships?
In Episode 9 of Series 7, Todd is joined again by Ben Lucas, Director of 3DI at the University of Nottingham, funders of this series. Together they reflect on some of the key themes and ideas to emerge from Series 7 of The Rights Track about human rights in a digital world.

Transcript

Todd Landman 0:01
Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In series seven, we've been discussing human rights in a digital world. I'm Todd Landman. And in the last episode of this fantastic series, I'm delighted to be joined for the second time by Ben Lucas, Managing Director of 3DI at the University of Nottingham, a hub for world class data science research and funders for this series of our podcast. Ben helped kick off series seven at the end of last year, talking about some of the challenges and opportunities created in a data driven society and the implications for our human rights. Today, he's here to help us reflect on some of the key themes that have emerged from this series. So welcome, Ben, it's great to have you on this final episode of The Rights Track.

Ben Lucas 0:46
Great to be here. Thanks very much.

Todd Landman 0:48
So last night, we were at a launch event for INFINITY, which is an inclusive financial technology hub being launched here at the University of Nottingham. We had a bucolic setting at the Trent Bridge cricket ground, which I'd say was quite historic. But some of the messages I heard coming out of that event last night really gave me hope for the promise of digital, with respect particularly to helping people who are currently excluded from financial technologies or finance more generally, and the ever, you know, sort of problem of people getting credit ratings, getting access to finance. I wondered if you could just reflect on what was shared last night around the positive story that could be told around using technology to give people access to hard to find finance?
Ben Lucas 1:29
Yeah, absolutely. So I think the central issue with financial inaccessibility is really the fact that people get trapped in this really bad cycle, and perhaps don't have savings, and then you lean more on credit options, for example. And then you become more and more dependent, if you like, on credit options. Equally, there are also folks who are excluded from accessing credit completely, or at an affordable rate, in the first instance, which obviously changes very much the quality of life, let's say, that they're able to enjoy, the things they're able to purchase, and so on. So really, the mission of projects like INFINITY, which is focusing very much on this idea of inclusive financial technology, is trying to boost accessibility to everything from tools that help people save, to tools that help people spend, to breaking some of these negative cycles that cause people to end up in not so great financial situations. And yeah, it's really leveraging and learning from, you know, all the wonderful developments in, you know, things like analytics and new financial services products, especially those that are app based, that we use in the rest of the financial services world, but applying them for good, basically. So very much consistent with this data for good message that we've been speaking about in this series.

Todd Landman 2:51
Right, that's really interesting. So it's a data driven approach to understanding the gaps and inequalities in a modern society that does have the data infrastructure and technological infrastructure to give people access. But really the data driven approach lowers the barriers to entry for those folks.
And I was quite struck by that; there was a colleague there from Experian, which is a credit rating agency, talking about the millions of people who either don't have online bank accounts, don't have access to the right kinds of technologies, or don't have the kind of credit rating that gives them access to the lower priced financial products out there, which in sort of ordinary terms means they're paying a much higher interest rate to borrow money than people that do have a credit rating. So one solution was to use data analytics and a data driven approach to understand their position, to boost their credit rating in a way that would give them access to cheaper finance. Did I get that right?

Ben Lucas 3:40
Yeah, that's exactly right. I mean, the central thing in financial services and lending is obviously managing their risk exposure with any individual consumer, but then also across, you know, their entire consumer portfolio. And I think, you know, one of the big opportunities in the inclusive FinTech space, slash probably what we're going to see going forward, is credit rating agencies and credit rating support products looking for other variables or indicators that, you know, can really paint a clearer picture of individual consumers, and perhaps even say, well, actually, there's not so much risk with this consumer because there are other factors that the usual, you know, bog standard algorithm doesn't pick up on, and maybe we don't have that risk exposure, maybe we can offer them, you know, financial products or lending products at a better rate. You know, that colleague spoke also about Experian's Boost product, for example, and I won't go into an advertisement for that, but yet a really interesting example of how, by sort of extending the available data and what we do with that, you know, it's possible to sort of calibrate and tailor solutions that are a win win, that reduce the risk for the credit provider but give additional consumers more accessibility.
And I think the other big piece, just to detail briefly within data driven and financial research: you know, some of the work that colleagues in the INFINITY team have been doing around, you know, helping to understand, at an aggregate level and in a privacy preserving way, where perhaps people are making not so great financial decisions. So being able to, you know, hopefully in the future, help flag, in a privacy protecting way, to consumers when they're not making great decisions, which can be everything from wasteful over the top expenses to things like, you know, too much gambling or unhealthy eating, for example. So certainly a very, very exciting space.

Todd Landman 5:33
No, it's really fascinating, and it resonates well with many of the themes we've heard in this series of The Rights Track. So I'm going to just think about putting these things into groupings or clutches of perspectives, if I may. So you made reference to this idea of data for good, and of course we had some guests on the podcast this series, including Sam Gilbert, who talked about the ability for digital transformation and data driven approaches to unearth previously unknown factors and public health benefits, and it could be social justice benefits and other benefits, from leveraging data that don't normally talk to each other in a data analytic way. Wendy Betts told us about preserving the chain of evidence using visual imagery that is date stamped, timestamped and location stamped, and then preserving the metadata that sits behind an image, for verification in the investigation of human rights abuses and human rights crimes. Amrit Dhir showed us in the United States how his organisation Recidiviz uses data from prisons to actually bring a greater sense of justice to prisoners, as well as parolees.
And finally, Diane Coyle, the world famous economist, not only reflected on the many economic transformations that have happened with the digital disruption, but also made the case for universal access to online life, and being on the grid almost as a basic human right, in the ways that access to information, access to health care, access to services need to be provided. And certainly during COVID-19, we've learned that many people were excluded from those services precisely because they didn't have the right internet connection, or at least could not afford to have the right kind of internet connection. So I just wondered what your general reflections are on that general theme of data for good, and what can you tell us about what you think, listening to the guests that we've had during this series?

Ben Lucas 7:21
Yeah, I mean, I really liked the way that Sam sort of sets the scene in his book, Good Data: An Optimist's Guide to Our Digital Future. I think that nobody, of course, likes to have their privacy compromised at an individual level. But the reality is, when we look at, you know, the things we can do when we have data at scale across, you know, large populations, there's a lot that can be achieved, whether that's in something like inclusive FinTech, whether that's in protecting human rights by combating modern slavery, whether that's to do with health data in a system like the NHS. Yeah, I don't think anybody likes to have their privacy compromised, obviously, at that individual level. But if there's a sort of way to communicate that greater good message (I'm not trying to encourage people to willingly give away their data for free, quite the opposite), I think that's the sort of big debate that both commercial and academic data scientists, you know, that's really the arena in which we work. Because there are a lot of benefits to be had when we think about sort of data at scale. Equally, we need to protect, you know, individuals and communities.
I think, you know, it's really great in this series to hear about, you know, things like eyeWitness and Recidiviz and some of these platforms that I think are managing that really well and really getting that good out of the data. Yeah, I think that's been really nice. There's a lot we can say also on the subject of (I think this is more of a frontier thing) artificial intelligence in particular, which came up a few times, and which I think is going to be, well, already is actually, the next big frontier in terms of talking about, you know, transparency and fairness, especially because we're applying these tools to these large datasets.

Todd Landman 9:04
Right. And I also came across a very interesting project in another group here at the University of Nottingham. It's within the Nottingham University Business School, and it's a neo-demographic lab, or N/Lab, which works on, you know, big data science projects around harnessing unknown information from pre-existing datasets. And there was a partnership with OLIO, which is an app that allows people to trade food that they're not going to need: surplus food sits in people's houses, and other people need food. So this app allows people to share food, and to actually make the best use, almost, of the circular economy, if you will, in sharing food. Now, quite apart from the pragmatics and the practicalities of sharing food between households, of course the app collects data on who needs food and who has food, and that then allows the geo-mapping of food poverty within particular districts and jurisdictions within the United Kingdom. Can you say a bit more about that project, and does this fit within the category of data for good?

Ben Lucas 10:03
Absolutely. I mean, that's an absolutely fantastic piece of work, you know. And obviously, the purpose of that platform and all that work is to look at both combating food inaccessibility and food poverty on the one hand, and on the other, combating food waste.
So really, yeah, absolutely a fantastic example as far as data for good, and also doing the right thing by people in society. I think it is also a great example of this idea that we can, you know, log data from sharing platforms, and really whatever platform, in an ethical way. You know, in the work that colleagues at N/Lab are doing, you know, it's all privacy preserved data. It's possible to get a, you know, useful enough geotagged picture of how the sharing is taking place, such that it can be understood at a network level, but it's not giving away, you know, exact locations; it has no identifiers of who's linked to it. But even just with that sort of network exchange level data, you know, it really tells a very interesting story about how this system works. And, you know, as you said, I mean, this is very much in the peer to peer sharing economy space, which is a relatively new idea. So it's also, from an academic point of view, very important and very useful to be doing research to understand these relatively new kinds of systems.

Todd Landman 11:26
So essentially, because the heat map that that project produced was, I believe, for Haringey Council in Greater London, and I guess, you know, knowing what I know about data, this could be scaled up for all jurisdictions in the United Kingdom and beyond, the heat map tells you areas of food poverty, but also could inform government as to where to put resource, and where, dare I say, levelling up funding could be targeted to help those most in need.

Ben Lucas 11:53
Yeah, absolutely. I mean, as I understand it, that work's, you know, been incredibly useful for the platform and how it's looking to grow and continue to be successful. But yeah, absolutely. That's really another key thing here: the value these platforms have for policymakers, for government, indeed.

Todd Landman 12:08
Great.
So we've had the data for good story; I now turn our attention to the data for bad story, because we had some guests that were very suspicious, sceptical and critical of this burst and proliferation of digital transformation and the production of data second by second, day by day, week by week, year by year. And two of our guests had actually different perspectives on this. So Susie Alegre has this fantastic new book out with Atlantic Books, called Freedom to Think. And what she was really concerned about was not only the historical, analogue ways in which people's freedom of thought had been compromised, but also the digital ways in which freedom of thought might be compromised by this digital revolution. And for her, her concern really is that there are unwitting or witting ways in which people's thought patterns might be manipulated through AI and machine learning. And she used popular examples of consumerism, consumer platforms such as Amazon and other shopping platforms, where not only does one get bombarded by advertisements, but actually gets suggestions for new things to buy based on patterns of spend in the past. And there is cross referencing between platforms. And I think Sam Gilbert also addressed this thing about micro targeting and cross referencing. So if I search for something on one platform, it shows up on another one when I'm sort of, you know, least expecting it to do so. I bought some shoelaces the other day; they came to the house within a day, so I had that lovely customer experience. And yet, when I went on to a CNN website to look at the news headlines, the first ad that popped up was for shoelaces. So can you say a bit more about the unease that people have around these sharing platforms, and the worry that our thoughts are being manipulated by this new technology?
Ben Lucas 13:45
Yeah, I think this idea of freedom of thought, or, you know, the illusion of decision freedom, is a really important one when we're talking about the internet. And especially, you know, one can imagine, as was evidenced with the Cambridge Analytica scandal back a few years ago, this becomes especially dangerous when we're talking about political messaging. I think it's important that we, as users of the internet, approach the internet with a healthy degree of scepticism, being a bit, you know, cautiously analytical, and occasionally taking a step back and thinking about what the implications of our behaviour online, including simply consuming content and consuming information, really are. The reality is, most if not all of the online platforms that we use, be that social media, ecommerce, or whatever, are designed to achieve immersion. They're designed to keep you spending more time. And if you're spending time in the wrong kind of echo chambers, or if you're getting exposed to messages from bad actors, well, you hear these stories of people going down all sorts of terrible rabbit holes and things, and this is how conspiracy theories and so forth proliferate online. Yeah, but certainly, even just for the regular internet user, we all definitely need to be thinking about: where is information coming from? Is it from reliable sources? Is the intent good? And do we indeed have that decision making freedom? I think that is the really important thing. Or is someone trying to play with us?

Todd Landman 15:13
Well, it's a really interesting answer. And it links very nicely to our episode with Tom Nichols, because he was saying that there's this tendency towards narcissism. And, you know, certainly during COVID, people had more time inside; they had more time to dedicate to being online. But at the same time, the rabbit holes that you're worrying about really were raised to high relief.
And so that retreat into narcissism: the idea that if you're going to post something, you're only going to post something negative, critical, and maybe sowing division by posting those critical comments. But you also, in your answer, talked about the power of particular individuals. And I guess I have to address the question of Twitter in two ways. So Tom made this observation of Twitter as this sort of, you know, place where you now have 280 characters to, you know, vent your spleen online and criticise others, but also a powerful platform to mobilise people. And I say this in two ways. The first is that the revelations from the January 6 committee investigating the events that led up to the insurrection at the US Capitol were putting a lot of weight this week on just the number of followers that former President Trump had, and a single tweet in December where he said, you know, come to the Capitol on January 6, it "will be wild". And then there were an array of witnesses paraded in front of the committee, from far right groups, from the Oath Keepers and other groups of that nature, who were saying: but actually, we saw this as a call to arms. So there was a nascent organising taking place, but there's almost this call to arms issued by a single tweet to millions of followers that really was, you know, the spark that lit the fire. And I wonder if you might just reflect on that.

Ben Lucas 16:50
Yeah, I think, for anyone currently also trying to keep up with, slash decipher, the story in the news about Elon Musk putting in an offer to buy Twitter, which has now fallen through, I would use that lens to sort of explore this, because one of the goals that I think he was seeking to achieve in taking over Twitter was really opening up its potential for free speech further. But yeah, for anybody sort of observing, that's a really tricky one. Because sometimes, when the speech is, well, I mean, there should be free speech.
But people should be saying, you know, hopefully nice things within that freedom, and not denying the rights of others, and not weaponizing free speech to stir up trouble. I think it's really, you know, we touched on this in the first episode of the series as well: the really big question with social media is, who's the editor in chief? Is it everybody, or is it nobody? And which is the better format?

Todd Landman 17:42
Yeah, and we talked about that unmediated expression and unmediated speech, and Martin Scheinin, as well as Tom Nichols, talked about how traditional media organisations have had that mediating function, and the editorial function, which is lost when you have an open platform in the way that Twitter is, even though they did in the end deplatform the former President. But I want to get back to that. I mean, you know, the task of the January 6 committee is not only to say we think there's a causal link between this tweet and people doing things, but they will also need to demonstrate the intentionality of the tweet in and of itself. And I think that's a major concern, because there's certainly ambiguity in the language: saying, you know, come to the Capitol, it's going to be wild, doesn't necessarily convert into a mass uprising with weapons and an insurrection. So there's a tall order of, I would say, legal proof, beyond reasonable doubt, that needs to be established, were one to go down that legal route. But if we look at Elon Musk, I mean, here's one person, exceptionally wealthy, who can buy an entire platform. And the concern that many people have is: can one individual have that much power, to acquire something that powerful? And we don't know if the deal's fallen through, because there are some legal wranglings going on at the moment about whether he could actually withdraw at this late stage in the purchase process.
But be that as it may, I wonder if you might just reflect on this ability for a very wealthy single individual to take control of a platform as powerful as Twitter.

Ben Lucas 19:10
So I think it's a really complicated one; it's really one of the most complicated questions within the social media space, you know, because these platforms are ultimately businesses. There's a founder, there's a CEO, there's a board, there's that leadership, and hopefully accountability and responsibility. It is really a tough one. You know, one wonders about a future where, you know, in the same way you've got the Open AI Foundation, for example, or you've got, you know, other truly sort of open, peer to peer kinds of platforms; if we think about how the internet, or technology, is trying to decentralise things like finance in the future, one wonders if there's sort of an alternative model that could solve some of these problems. I think the narrative, so to speak, specifically about Elon Musk, that he's been putting forward, was really just to open up Twitter even further, taking that sort of laissez faire kind of approach and just, you know, letting free speech sort itself out. And again, free speech is and can be a good thing. But sadly, when people engineer these kinds of messages to avoid legal accountability but are implying, you know, some sort of stirring up of trouble; when people engage in narcissistic sorts of messaging; when people engage in putting forward, you know, campaigns engineering very, very strong emotions like fear and anger; obviously, that can get out of control very, very quickly. The reality is, I'm not qualified to come up with the solution. And, sadly, I don't know who is. Yeah.

Todd Landman 20:36
Well, that's interesting, because we had some guests that were suggesting a solution.
And if I listen to you speak about the Elon Musk agenda to open up in a laissez faire way, it's almost the invisible hand of the information market. You know, if we go back to economics, one tenet of economics, at least, has been that the invisible hand sort of guides markets, and the pricing and equilibrium that comes from supply and demand produces a regulatory outcome that is beneficial for the most people most of the time. It's a somewhat naive view, because there's always winners and losers in economic transactions. So counter to this idea of the invisible hand of the information market, we had quite an interesting set of thoughts from Martin Scheinin and from Susie Alegre on the need for regulation. And that really does take us back to the beginning of this series of The Rights Track, where you made the observation that tech is advancing more quickly than the regulatory frameworks are being promulgated; that there's this lag, if you will, between the regulatory environment and the technological environment. So I wonder, just for your final reflections: really, what both Martin Scheinin and Susie Alegre are saying is that if tech is neutral, we need to go back to ethics, morality, law and a human rights framework to give us the acceptable and reasonable boundary conditions within which all this activity needs to be thought about.

Ben Lucas 21:56
Yeah, exactly. I mean, it really does come down to, you know, well constructed regulation, which is obviously complicated, especially when, you know, most major social media platforms have a global footprint. So it's then how to ensure consistency across the markets they operate in. I think a lot of the regulatory frameworks are kind of there for the offline world.
And the main thing, yeah, that we were sort of getting at in the first episode of this series is really that, because technology moves so fast, because these platforms grew so quickly, you know, there are laws to stop people; no one can just go into the town square and start, you know, hurling obscenities in public, but for some reason, you know, it happens millions and millions of times a day on social media platforms. So I think, yeah, regulation really is key here. But the other thing I would say is that the people who misuse the definition and excuse of free speech should really look up the definition of free speech again.

Todd Landman 22:57
Well, it's this idea of doing no harm. You know, I think I mentioned this notion of a Hippocratic Oath, if you will, for the digital world: that you can engage, but do no harm. And what people conceive and perceive as harm, of course, is open to interpretation, but that's the general kind of impulse behind this. And, you know, this distinction between the offline world and the online world is also really, really important. So Tom Nichols invites us to maybe get off the grid occasionally, go back into our community, say hi to our neighbours, volunteer for things, and experience humanity face to face in the offline world a bit more than we're experiencing in the online world. And of course, the appeal to morality, ethics, law and the human rights framework is going back to, you know, basic philosophy, basic conceptions of rights, basic conceptions of law, to make sure that, you know, our offline world thoughts can be applied to our online world behaviours. So, you know, these are super deep insights.
And as the world progresses, as technology progresses, as the interconnections between human beings progress in ways that we've seen over the last several decades, through the medium of digital transformation and this ever expanding digital world, it does make us pause at this moment to actually reflect on human dignity, human value, integrity, and accountability and responsibility for the kinds of things that we do, both within the offline world and the online world. And you've given us much to think about here, Ben, certainly across the many episodes of this series. You kicked us off with this great, you know, offline versus online, regulation versus tech dichotomy that we all face. We've heard from so many people evangelising the virtues of the digital world, but also raising significant concerns about the harm that can come from that digital world if we allow it to run unchecked. So for now, it's just my job to thank you, Ben, for coming back on this final episode, giving us a good wrap up set of reflections on what you've heard across the series. And thank you ever so much for joining us today on this episode of The Rights Track.

Ben Lucas 25:02
Thanks so much.

Christine Garrington 25:04
Thanks for listening to this episode of The Rights Track podcast, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts, with funding from 3DI. You can find a full transcript of this episode on the website at www.rightstrack.org, together with useful links to content mentioned in the discussion. Don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.
Sam Gilbert of Club 360 (episode 169 guest) interviews me on his podcast! I enjoyed going through the questions and having a chance to reflect on my experiences and talk about the origins of the I CAN program. Sam and I also exchanged stories about raising children, teaching children, and keeping them focused and motivated despite the time limitation (ex-pat families stay in Japan on average for 3 to 5 years).
Sam Gilbert, co-founder and owner of Club 360 joins me today and talks about his life experience as well as the makings and future of his successful fitness club. Sam has a background in karate and studied physiotherapy at university. He met Nathan, his business partner (also his brother-in-law), and together they fulfilled their goal of delivering a holistic fitness and wellness service that caters to individual needs.
Wherein I go over the AMAZING European finals, the incredible final round of Super Rugby Pacific, and so much more! The article about Sam Gilbert was here: https://www.rugbypass.com/news/sam-gilbert-handed-ban-for-tip-tackle-on-wallabies-captain-michael-hooper/ If you're enjoying listening, you can always buy me a beverage through the link below, or you can sign up through Anchor to become a monthly supporter! Thanks as always for your support! https://www.buymeacoffee.com/ScrumOfTheEarth https://anchor.fm/thescrumoftheearth For my fixtures and results, I use the official websites of various leagues: https://www.sixnationsrugby.com/fixtures/ https://www.majorleague.rugby/schedules/ https://www.premiershiprugby.com/gallagher-premiership-rugby/fixtures-results/ https://www.unitedrugby.com/match-centre https://super.rugby/superrugby/ Thanks, cheers and be well! --- Support this podcast: https://anchor.fm/thescrumoftheearth/support
Justin Marshall shared his thoughts on Sam Gilbert at first-five for the Highlanders.
In Episode 5 of Series 7 of The Rights Track, Todd is in conversation with Amrit Dhir, Director of Partnerships at Recidiviz – a team of technologists committed to getting decision makers the data they need to drive better criminal justice outcomes. Transcript Todd Landman 0:00 Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In Series 7, we're discussing human rights in a digital world. I'm Todd Landman, and in this episode I'm delighted to be joined by Amrit Dhir. Amrit is the Director of Partnerships at Recidiviz, a team of technologists committed to getting decision makers the data they need to drive better criminal justice outcomes. He has previously spent over a decade at the intersection of technology and new business development, working, for example, at Sidewalk Labs, Google for Startups and Verily. Today, we'll be exploring the practical uses of technology and data in the criminal justice system. So Amrit, it's great to have you on this episode of The Rights Track. Welcome from California. Amrit Dhir 0:44 Thank you so much, I'm really glad to be here. Todd Landman 0:46 It's great for you to join us. And I want to start with a simple question. We had a guest - Sam Gilbert - on our last episode, and we made this distinction between data for good and data for bad, and there's a very large sort of argument out there about surveillance capitalism, the misuses of data, you know, behavioural microtargeting and all these sorts of issues. And yet I see that where you're working at Recidiviz, there's a kind of data for good argument here around using technology and data to help criminal justice systems and the healthcare sector. So just briefly, could you tell us about this data for good and data for bad distinction? Amrit Dhir 1:19 Yeah, well, as with most things, I think it's difficult to pigeonhole anything into one of those camps - everything, it seems, can be used for good or bad.
And so data itself is not one or the other. I think it's about the use - I think that's what Sam was getting at with you as well. With Recidiviz, you know, what we've understood is that data has been collected over a long period of time, especially in the context of the United States and our unfortunate kind of race to mass incarceration from basically the 1970s until about the mid-2010s. We've collected a lot of data along the way, and we're not actually using or understanding that data. And so what we do at Recidiviz is we bring that data together to make it something that can be better understood and better utilised, to help reduce prison populations and to help drive better outcomes. So we're focused on taking data that's been, again, collected over quite a long period of time and consistently collected, but also making it more understandable. Todd Landman 2:17 So this sounds like big, messy, disparate, fragmented data - is that correct? Amrit Dhir 2:22 Most of those things, most of the time. It's definitely fragmented most of the time; it's not always necessarily what we'd call big. Because, you know, coming from Google, I think of big in terms of, you know, search-query-type volume. So in corrections, it's not necessarily that big, but it is certainly messy, and it is certainly fragmented. Todd Landman 2:42 You know, we had a guest on The Rights Track some while back, David Fathi from the American Civil Liberties Union. He explained to us the structure of the American prison system - not the justice system itself, but the prison system - with, you know, 50 state prison systems, plus a federal prison system, and a mix of public and private prisons. So it's a mixed picture in terms of jurisdiction, the use of incarceration and, of course, the conditions of incarceration. So what's the sort of data that's being collected that you find useful at Recidiviz?
Amrit Dhir 3:13 Yeah, I'll actually add a piece to that as well. You're exactly right to say, you know, every one of the 50 states has a different system; the federal system is itself separate. But then there are also county jails. And those systems are running completely separately from even the states that they're in. So it is messy. And the data also extends, by the way - so we're talking about what we consider the back half of the system. So once someone has already gone to prison, we think of that as the back half. Whereas there's a front half of the system as well, which is the courts, the prosecutors and defence attorneys, and up to policing. And so all of those different segments have their different datasets as well. At Recidiviz we're starting at the back half, largely because we think there's a lot more impact to be had there, at least for now. And the data extends to many things. So it can be, first of all, admissions data. When someone comes into a facility, what sentence did that person come in with? Where is that person going to be in the facility - as in, like, where's that bed? And then, as often happens, there are transfers between prisons, within prisons. That's another set of data. There are programmes that the person may be participating in. Some of these are built with the spirit of rehabilitation and reintegration into society. Those are important, and knowing how they work, and when they work, and if they work, is important. And then when someone gets out of prison, that's not the end either. We've got a whole infrastructure of supervision. And broadly, those are grouped into two categories - parole and probation. And someone may be back out in their community and still under a degree of supervision that's more than what someone who has not been in prison goes through. They have to check in with their parole officer. They have certain requirements, they have certain restrictions. All of those are data points as well.
How are you checking in with your parole officer? Did you have to take a drug test? Did you ask for permission to leave the state? All of those things. And as you can imagine, even just by the list I've given you, which is just a very small percentage of it, all of those are sitting in different data silos and are interacted with by different people within the system, and it gets pretty tricky. Todd Landman 5:21 And you collect data on the sort of sentencing? So, you know, an analysis of that, plus demographic makeup of the prison population, time served? And also the use of the death penalty and/or deaths in custody - is that data that you can collect? Amrit Dhir 5:37 Yes, so we can do all that. And I'm glad you pointed out racial and demographic data, because that's a big part of what we do and what we highlight, because you may not be surprised to hear that in the US, there are like pretty severe disparities when it comes to race and ethnicity. And these are things that departments of corrections - so those are the executive agencies within each state; we usually call them departments of corrections, although they'll have different names in different states - they have this data, and they want to make better sense of it. Their stakeholders want to understand it better. So generally, these agencies report to the governor, but they're also accountable to the legislature. So there's a degree of sharing that data, or better unpacking that data, that's important. Then we also have - I would broadly categorise, and we say these kinds of things a lot, where there's broad categorizations and then there's also much more detailed ones - but broadly, you can think of this as public data, and then departments of corrections data. So the public data is what's available anyway - what we can go out there and find without any data sharing agreement with any agency, as these are government agencies where this data is required to be public.
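The silo problem Amrit describes - admissions, transfers and supervision records each living in a separate system - can be illustrated with a small, purely hypothetical merge. Every dataset, field name and value below is invented for illustration; none of this reflects Recidiviz's actual schemas.

```python
# Hypothetical illustration of joining siloed corrections records on a
# shared person ID. All datasets and field names are invented examples.

admissions = [
    {"person_id": 1, "sentence_years": 4, "facility": "State Prison A"},
    {"person_id": 2, "sentence_years": 2, "facility": "State Prison B"},
]
transfers = [
    {"person_id": 1, "to_facility": "State Prison C"},
]
supervision = [
    {"person_id": 2, "type": "parole", "last_check_in": "2022-03-01"},
]

def merge_silos(*silos):
    """Fold each silo's records into one unified view keyed by person ID."""
    unified = {}
    for silo in silos:
        for record in silo:
            person = unified.setdefault(record["person_id"], {})
            person.update({k: v for k, v in record.items() if k != "person_id"})
    return unified

unified = merge_silos(admissions, transfers, supervision)
# Person 1 now carries admission and transfer data in a single record.
print(unified[1])
```

The point of the sketch is just the shape of the problem: once records share a key, the fragmented view can be collapsed into one record per person that every tool can read.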
And so you'll find researchers and universities and different organisations accessing this data and publishing it or analysing it; we do that also. But we also get data sharing agreements directly with departments of corrections, and help them unpack that as well. So there's a kind of complementary interaction there between the two datasets. Todd Landman 7:09 I understand. And how do you actually reduce incarceration through data analysis? I'm perplexed by that statement you made quite early on when you were talking to us. Amrit Dhir 7:18 There's a couple of things, and I'll categorise this into three broad categories. There are leadership tools, line staff tools, and then public tools. So let me start with public tools, because I think that's more related to what we just talked about in the previous question. The public tools are ones that are available to you and me. And so there's two that you can look on our website and find right now. One is a public dashboard that we call Spotlight. As of the date of this recording, there are two that have been published: one for North Dakota and one for Pennsylvania. I encourage everyone to go check those out. If you just Google, you know, our name Recidiviz and Pennsylvania, you'll see it come up as the first result. And there you can see all the data in an accessible way. So the 'viz' in Recidiviz stands for data visualisation. We worked with the Pennsylvania Department of Corrections to better represent the data that they have, so that the public can see it. And you can see the breakdown by ethnicity, by district, by sex, by other filters, and really get in there in some detail and see what's happened, also over time. So that's one - that's the public dashboard. That's largely to raise awareness.
And it's something that when you talk to departments of corrections, you learn that they have lots of FOIA requests, which are Freedom of Information Act requests - so requests from media, from researchers, from the public, but also from the legislature. And so that's one thing that we do that just broadens the conversation. Another are what we call policy memos. If you go to our website, or if you just type in Recidiviz.org/policy, these are one-page memos that we have our data scientists put together that assess the impact of a particular administrative or legislative policy proposal. So imagine that you are looking to Pennsylvania, for example, wanting to make a change to geriatric parole, or wanting to end the criminalization of marijuana - we can then, and we have, gone in there and analysed the data that's publicly available. And sometimes we also access our data in collaboration with the DOC. And we can tell you both the impact on the number of liberty person-years that are returned - how people will get out of prison earlier or not go to prison at all - as well as how much money the state in these cases will save. And so that's a great way to inform policymakers, to say, hey, this is actually a good policy or a bad policy, because it's going to get people out of prison and it's going to also save you money. Todd Landman 9:57 Yeah, the concept is like a variable called liberty person-years that you use. And then of course, it's almost like an interrupted time series model, where if you get new legislation, you can look at those liberty person-years before the legislation and after, to judge the degree to which that legislation may or may not have made a difference, right? Amrit Dhir 10:16 Exactly right. And I encourage folks just to go check some of those memos out - there's probably like 50 on there now. And they're very easy to understand, very easy to access. They're all one page. They're all very beautifully visualised.
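The before-and-after comparison Todd sketches - liberty person-years observed before and after a policy change - can be illustrated with a toy calculation. This is not Recidiviz's actual method, just a minimal sketch with invented numbers; a real interrupted time-series analysis would also model pre-existing trends, seasonality and confounders.

```python
# Toy interrupted time-series comparison: mean monthly liberty person-years
# returned before vs. after a policy change. All figures are invented.

monthly_liberty_years = [10, 12, 11, 10, 12,   # months before the reform
                         18, 20, 19, 21, 20]   # months after the reform
policy_month = 5  # index of the first month the reform was in effect

before = monthly_liberty_years[:policy_month]
after = monthly_liberty_years[policy_month:]

def mean(xs):
    return sum(xs) / len(xs)

# A naive estimate of the reform's effect is the jump in the monthly mean.
effect = mean(after) - mean(before)
print(f"Estimated change: {effect:+.1f} liberty person-years per month")
```

The same before/after logic scales up: with long, consistently collected series, the jump at the policy date is the quantity a one-page memo can report alongside the cost savings.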
Because you can take these very, as you said, messy and fractured datasets, but actually come to some pretty simple insights. And I would say simple and actionable. And so that's what we do there. And that was a long description of public data, but I can go into the other two, if you're ready for it. Todd Landman 10:43 Yes, please. Amrit Dhir 10:44 Okay. So working backwards, we'll go to line staff tools. And so line staff, meaning people who are working within corrections or on supervision. And let me take the example of supervision first, because one thing that's interesting, and that I actually learned only while at Recidiviz, is that half of prison admissions in the US every year come from supervision. Meaning people who are getting their parole or probation revoked and are going back to prison. That's half of the admissions we get every year. And that's a huge number. Todd Landman 11:15 Wow. Amrit Dhir 11:15 And so this - you can think of this as the back end of the back end, it's the very last piece. And so for Recidiviz, when we were kind of assessing where we should start, that seemed like the right place to do so, because the impact was just so great. Now, put yourself in the shoes of a parole officer. These folks have pretty difficult jobs in that they often have, you know, up to 100 and sometimes more - we've seen up to 120 - people that they are, I'll use the verb, 'serving' as a parole officer. So the idea is you've got people that have been returned to the community - they've been in prison, they now are trying to get jobs, they're trying to get job training, they're trying to reintegrate into their communities - and the parole officer is there to help them do that, and keep track of how they're doing. Now, that's one thing to do if you've got 20 people you want to keep track of and help and connect to the right resources, but if you've got 100, and you're supposed to meet with them every month, it becomes impractical.
And that ends up meaning sometimes that parole officers aren't doing as good a job as they'd like to do, because it's just too hard, just too much to manage. Todd Landman 12:22 You need a structured database approach. Amrit Dhir 12:24 Exactly. So that's where data can be very useful, because we can automate a lot of what a parole officer needs to do. And rather than having to check - you know, we've heard up to 12 - different datasets to figure out: what programmes do the people I'm serving have available to them? When do I know if I need to do a home visit? Where do I find a list of employers that I can send them to? Where are housing options for them? All these are in different places, but we at Recidiviz bring them all together and give them an easy-to-use tool, so that we can actually serve them, even, you know, on their smartphones, in an app, to show them: hey, did you know that this person is actually eligible to be released from parole if they just upload a pay stub? And hey, do you want to just take a photo of a pay stub with your phone, and we can do it for you? I mean, how much easier that is than you having to go through all 100, figure out who's eligible based on your own recall or some other antiquated system, and kind of struggle to try to help people. We can help you do that. And that's a big thing that we've done. Todd Landman 13:22 I mean, it's almost like an E-portfolio approach - there's this way to archive parolees meeting certain milestones and conditions. And it makes the management of those cases so much more straightforward, whilst there's also a record of that management that makes it easier for the parole officer to serve the people that they are serving. Amrit Dhir 13:42 Exactly. You got it exactly right.
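The kind of automated eligibility check Amrit describes - surfacing, from a caseload of 100-plus, the people who already qualify for early discharge - might be sketched like this. The criteria, thresholds and field names are all invented for illustration; real eligibility rules vary by state and are far more involved.

```python
# Hypothetical sketch of surfacing parolees eligible for early discharge.
# Criteria and field names are invented; real rules vary by state.

caseload = [
    {"name": "A", "months_on_parole": 30, "violations": 0, "pay_stub_on_file": True},
    {"name": "B", "months_on_parole": 14, "violations": 0, "pay_stub_on_file": True},
    {"name": "C", "months_on_parole": 36, "violations": 2, "pay_stub_on_file": False},
]

def eligible_for_discharge(person, min_months=24):
    """Toy rule: enough time on parole, no violations, proof of employment."""
    return (person["months_on_parole"] >= min_months
            and person["violations"] == 0
            and person["pay_stub_on_file"])

# Instead of the officer checking each file by hand, surface the matches.
to_review = [p["name"] for p in caseload if eligible_for_discharge(p)]
print(to_review)
```

The design point is the one made in the conversation: once the scattered datasets are unified, eligibility becomes a query rather than an exercise in the officer's recall.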
And by the way, there's, you know, a degree of nudging that can be done in this as well, if you're familiar with, like, Cass Sunstein and others' behavioural psychology. Where, you know, instead of saying, hey, this person needs a drug test, and having that be the first thing that you prioritise, you can say, hey, this person needs help finding a job - and here are some resources, here are some employers in the area that we know employ people who are formerly incarcerated. It's a great way to not only automate and make the life of the parole officer easier and better, but also to kind of encourage the better behaviours within those communities. Todd Landman 14:16 Now that makes sense. So what's the third channel then? Amrit Dhir 14:18 Ahhh, the third one is leadership tools. And this is for the directors and their deputies, the most senior people in a department of corrections. And actually what we're seeing now is that a lot of the people who are coming in today and are sitting in these roles are reformers. They believe that the size of our criminal justice system in the United States is just too large. And they are motivated to improve outcomes. And they're focusing on things like recidivism, which is a term for people coming back to prison after being released. And that's a number you want to have low, naturally. But historically, what happens - actually, not even historically, what happens today - is that these recidivism reports will come out maybe every three years. So if you're a director, by the time they come out, they're almost three years old. So you're almost on a six-year timeline: you want to know, hey, I instituted this new reform, this new programme, I want to know if it's been successful - and you won't know until a couple of years out whether it worked.
And so what we do instead is give you real-time data: we can tell you what's happening on your team and in your agency on a real-time basis, and also project out, based on what we're seeing, with some meaningful kind of population projections as well. So that's helpful. Todd Landman 14:34 That's fascinating. And let me ask you just another technical question. So when people are released from prison, is it typical for them to also have a sort of GPS tag on their leg for a certain period of time? And does that form any of the data that you look at? Amrit Dhir 15:52 So, it depends. It's a very good question. And it's one of the more controversial topics today in this space, and especially in the reform movement, there's a concern that we may be heading from mass incarceration to mass e-carceration, and that people will be monitored and supervised within their communities. And I think that is a very meaningful concern that we need to be careful of, because we don't want that to happen. But to broadly answer your question about the state of this today: it depends on where you are - it depends on the county, it depends on the state, it depends on all those things - in terms of whether you are wearing a device that electronically monitors you. You know, we don't track that ourselves; that's not something that we do or want to do. Our approach is to help people get off of supervision and get into programmes and other kinds of initiatives that help them on their way. Todd Landman 16:43 Excellent. So this discussion really opened up into, you know, the bad side of the question, I guess - you know, we just have to go into this with our eyes open. I suspect that you're triangulating a lot of data. You're providing that in real time on dashboards, a lot of it's in the public domain. What are the risks around this? What are the pitfalls? What's the risk of re-identification? What's the risk of, you know, lapsing into kind of credit scoring philosophies?
And just, as you said about the tags, there's worry about that kind of, you know, E-surveillance and E-carceration. Equally, someone could backward engineer some of your data and actually profile people. So what's the downside of this approach? Amrit Dhir 17:21 Yeah, that was a great list. So there's certainly a concern of bias entering any analysis of a dataset. And we are very careful about that. So one thing to note is that everything that we do is open source. So it's open to the technology community to take a look at what's kind of under the hood. And that's important, because we do want to make sure that we are not only participating in and contributing to the broader ecosystems - in this case, the tech and criminal justice ecosystems - but that we're also held accountable to them. So that's the first thing that we do. We also are very mindful and transparent about our data ethics policies, and how we handle those kinds of questions and sometimes ambiguities. So if you look at, for example, the Spotlight dashboard that I mentioned, that you'd find for Pennsylvania and North Dakota, you will see in the methodology that we explain what happens when there's a question. So for example, if someone puts down three different ethnicities, how do we manage that in a data visualisation that just shows them as one? Our approach there is transparency and engagement. Todd Landman 19:31 Have you done any links with the ACLU on this? Because they're quite interested in prison conditions. They're interested in incarceration, sentencing, etc. Do you do any kind of briefing with the ACLU? Amrit Dhir 20:16 Yeah, so two things actually on that. I will take them in reverse order. So first of all, we do work with the ACLU. If you look at our website, on the policy page - which, again, are those one-page memos - the ACLU has requested a number of those. And there's naturally different chapters of the ACLU in different states, in different parts of the country.
And we work with different stakeholders within the ACLU as well on those. The other piece, though, gets back to what you said about three strikes. There's another piece of that I think people may not be as familiar with - I certainly wasn't - which is this issue of technical revocations. So if you're on supervision, like I said, half of prison admissions every year are from revocation of your supervision, meaning you're going back to prison from parole or probation. But half of those - so a quarter of all admissions every year - are from technical revocations. And those are when someone breaks a rule that is not a law for the rest of us. Right? So it's not that they stole something, it's not that they got caught breaking a law - it's that they broke a rule of their parole, and sometimes these are ones that you and I would feel horrified to learn of. So, you know, we've got examples of people, for example, going to an open mic night where there was alcohol present, and that person wasn't allowed to be around alcohol. Being in the wrong county. Being out past curfew. All of these things - and, you know, there are anecdotes all over the place of the kinds of things that send people back to prison that we as a society would not tolerate. And those are also some of what we're reducing. Todd Landman 21:49 That's amazing, that sort of distinction to draw between, you know, breaking a rule and breaking the actual law - I guess the rules follow from the law. But I get your point in terms of, you know, how would somebody know if they crossed the county line, particularly if they're in an area they don't know well. So this has been a fascinating exploration of the ways in which you have triangulated datasets, made them more visible, put them into real time. And I have to reflect on what you said - I mean, I grew up in Harrisburg, Pennsylvania, so I'm going to immediately read all your Pennsylvania data.
I actually grew up near a prison in Camp Hill, Pennsylvania, you know, so it'd be interesting to see how things have moved on from the time that I lived there many moons ago. I won't tell you how long ago that was. But this is a really good conversation for us to have around some of the ways in which different types of data can be leveraged for good, and also some of the challenges of that, or the misuse of that information, as well as the sort of things that you don't collect - you know, the fact that you don't collect data on these tags. And that varies, of course - the variation you see in terms of the population that you're collecting data on comes from the fragmentation of the US prison system and the sort of federal system that the US is structured in - but it's also data that no one really brought together in one place before. And I think that when we hear this data for good argument, we hear a lot of people saying we're actually bringing datasets together that haven't been brought together before, in order to derive insights from those data and do something that is for good and brings about positive social change as a result. So I just think this tour that you've given us today is absolutely fantastic. And on behalf of The Rights Track, thanks so much for being on this episode with us today. Amrit Dhir 23:25 Oh, thank you for having me. It's been fun. Thank you. Christine Garrington 23:29 Thanks for listening to this episode of The Rights Track, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts with funding from 3DI. You can find detailed show notes on the website at www.RightsTrack.org. And don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.
In Episode 4 of Series 7 of The Rights Track, Todd is in conversation with Sam Gilbert, an entrepreneur and affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge. Sam works on the intersection of politics and technology. His recent book – Good Data: An Optimist's Guide to Our Future – explores the different ways data helps us, suggesting that “the data revolution could be the best thing that ever happened to us”. Transcript Todd Landman 0:01 Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In Series 7, we're discussing human rights in a digital world. I'm Todd Landman, and in the fourth episode of this series, I'm delighted to be joined by Sam Gilbert. Sam is an entrepreneur and affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge, working on the intersection of politics and technology. His recent book, Good Data: An Optimist's Guide to Our Future, explores the different ways data helps us, suggesting the data revolution could be the best thing that ever happened to us. And today, we're asking him: what makes data good? So Sam, welcome to this episode of The Rights Track. Sam Gilbert 0:41 Todd, thanks so much for having me on. Todd Landman 0:44 So I want to start really with the book around Good Data. And I'm going to start, I suppose, with the negative perception first, and then you can make the argument for a more optimistic assessment. And this is the opening set of passages you have in the book around surveillance capitalism. Could you explain to us what surveillance capitalism is and what it means? Sam Gilbert 1:01 Sure. So surveillance capitalism is a concept that's been popularised by the Harvard Business School professor Shoshana Zuboff. And essentially, it's a critique of the power that big tech companies like Google and Facebook have.
And what it says is that that power is based on data about us that they accumulate as we live our lives online and, by doing that, produce data, which they collect, and analyse, and then sell to advertisers. And for proponents of surveillance capitalism theory, there's something sort of fundamentally illegitimate about that, in terms of the way that it, as they would see it, appropriates data from individuals for private gain on the part of tech companies. I think they would also say that it infringes individuals' rights in a more fundamental way by subjecting them to surveillance. So that, I would say, is surveillance capitalism in a nutshell. Todd Landman 2:07 Okay. So to give you a concrete example, if I'm searching for a flannel shirt from Cotton Trader on Google, the next day I open up my Facebook and I start to see ads for Cotton Trader on my Facebook feed, or if I go on to CNN, suddenly I see an ad for another product that I might have been searching for on Google. Is that the sort of thing that she's talking about in this concept? Sam Gilbert 2:29 Yes, that's certainly one dimension to it. So that example that you just gave is an example of something that's called behavioural retargeting. So this is when data about things you've searched for, or places you've visited on the internet, are used to remind you about products or services that you've browsed. So I guess this is probably the most straightforward type of what surveillance capitalists would call surveillance advertising. Todd Landman 2:57 Yeah, I understand that, Sam, but you know, when I'm searching internally in Amazon, they say: you bought this; other people who bought this might like this; have you thought about, you know, getting this as well. But this is actually between platforms. I might do a Google search one day, and then on Facebook or another platform, I see that same product being suggested to me. So how does the data cross platforms?
Are they selling data to each other? Is that how that works? Sam Gilbert 3:22 So there's a variety of different technical mechanisms. So without wanting to get too much into the jargon of the ad tech world, there are all kinds of platforms which put together data from different sources, and then, in a programmatic or automated way, allow advertisers the opportunity to bid in an auction for the right to target people who the data suggests are interested in particular products. So it's quite a kind of complex ecosystem. I think maybe one of the things that gets lost a little bit in the discussion is some of the differences between the ways in which big tech companies like Facebook and Google and Amazon use data inside their own platforms, and the ways in which data flows out from those platforms and into the wider digital ecosystem. I guess maybe just to add one more thing about that: I think probably many people would have a hard time thinking of something as straightforward as being retargeted with a product that they've already browsed for as surveillance - they wouldn't necessarily see that as being particularly problematic. I think where it gets a bit more controversial is where this enormous volume of data can have machine learning algorithms applied to it, in order to make predictions about products or services that people might be interested in as consumers that they themselves haven't even really considered. I think that's where critics of what they would call surveillance capitalism have a bigger problem with what's going on. Todd Landman 4:58 No, I understand - that's a great explanation. Thank you. And I guess just to round out this set of questions, it sounds to me like there's a tendency for accumulated value and expenditure here that is really creating monopolies and cartels. To what degree is the language of monopoly and cartel being used?
Because we rattle off the main platforms we use, but we use those because they have become so very big. And how does a new platform cut into that ecosystem? Because it feels like it's dominated by some really big players.

Sam Gilbert 5:32
Yes. So I think this is a very important and quite complicated area. It is certainly the case that a lot of Silicon Valley tech companies have deliberately pursued a strategy of trying to gain a monopoly. In fact, it might even be said that that's inherent to the venture-capital-driven start-up business model: to try and dominate a particular market space. But I suppose the sense in which some of these companies, let's take Facebook as an example, are monopolies is really not so related to the way in which they monetize data or to their business model. So Facebook might reasonably be said to be a monopolist of encrypted messaging, because literally billions of people use Facebook's platform to communicate with each other. But it isn't really a monopolist of advertising space, because there are so many other alternatives available to advertisers who want to promote their products. Another dimension to this is the fact that although there are unquestionably concentrations of power with the big tech companies, they also provide something of a useful service to the wider market, in that they allow smaller businesses to acquire customers much more effectively. So that actually militates against monopoly, because in the current digital-advertising-powered world, not every business has to be so big, and so rich in terms of capital, that it can afford to do things like TV advertising. The platforms that Facebook and Google provide are also really helpful to small businesses that want to grow and compete with bigger players.

Todd Landman 7:15
Yeah, now I hear you shifting into the positive turn here, so I'm going to push you on this. What is good data?
And why are you an optimist about the good data elements of the work you've been doing?

Sam Gilbert 7:27
Well, for me, when I talk about good data, what I'm really talking about is the positive public and social potential of data. And that really comes from my own professional experience, because although at the moment I spend most of my time researching and writing about these issues of data and digital technology, my background is actually in the commercial sector. I spent 18 years working in product, strategy and marketing roles, particularly in financial services: at the data company Experian, and in a venture-backed fintech business called Bought By Many. And I learnt a lot about the ways in which data can be used to make businesses successful, and a lot of techniques that, in general, are at the moment only really put to use to achieve quite banal goals, for example to sell people more trainers, or to encourage them to buy more insurance products. So one of the things that I'm really interested in is how some of those techniques and technologies can move across from the commercial sector into the public sector and the third sector, and be put to work in ways that are more socially beneficial. Maybe just to give one example: a type of data that I think contains huge potential for public good is search data. This is the data set that is produced by all of us using Google and Bing and other search engines on a daily basis. Ordinarily, when this data is used, it is to do banal things like target shoes more effectively. But there is also this emerging discipline called infodemiology, where academic researchers use search data in response to public health challenges. One great example of that at the moment has been work by Bill Lampos at University College London and his team, where they've built a predictive model around COVID symptoms using search data.
And that model actually predicts new outbreaks 17 days faster than conventional modes of epidemiological surveillance. So that's just one example of the sort of good I believe data can bring.

Todd Landman 9:50
So it's a really interesting example of an early warning system, and it could work not only for public health emergencies but for other emerging emergencies, whether they be conflicts, natural disasters, or any topic that people are searching for. Is that correct?

Sam Gilbert 10:05
Yes, that's right. It's not just in the public health field that researchers have used this. You've just put me in mind, Todd, of a really interesting paper written by some scholars in Japan who were looking at citizens' decision-making in response to natural disaster warnings, so floods and earthquakes; migration patterns, I guess, would be the way of summarising it. Those are things that can also be detected using search data.

Todd Landman 10:31
Well, that's absolutely fascinating. So let's go back to public health, then. I was just reading a new book out called Pandemocracy in Europe: Power, Parliaments and People in Times of COVID, edited by Matthias Kettemann and Konrad Lachmayer. And there's a really fascinating chapter in this book that transcends the nation state, if you will. It talks about platforms and pandemics, and one section of the chapter starts to analyse Facebook, Twitter, YouTube and Telegram on the degree to which they were able to control and/or filter information versus disinformation or misinformation. And the scale of some of this stuff is quite fascinating. Facebook has 2.7 billion daily users, probably a bigger number now, and 22.3% of the investigated Facebook posts contained misinformation about COVID-19. And they found that the scale of misinformation was so large that they had to move to AI solutions, with some human supervision of those AI solutions.
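The infodemiology approach Sam mentions, predicting outbreaks from search behaviour, can be illustrated with a deliberately simple sketch: fit a least-squares line from lagged symptom-search volumes to later case counts, then forecast from this week's search volume. The numbers below are invented for illustration, and the published Lampos et al. model is far more sophisticated than a single-variable regression.

```python
# Minimal sketch of search-data "infodemiology": regress case counts on
# earlier symptom-related search volume, then forecast ahead.
# All data points are hypothetical.

# weekly symptom-search volume (lagged) and the case counts observed later
search_volume = [120.0, 150.0, 200.0, 260.0, 310.0]
cases_later   = [ 60.0,  75.0, 100.0, 130.0, 155.0]

# ordinary least-squares fit of cases_later on search_volume
n = len(search_volume)
mean_x = sum(search_volume) / n
mean_y = sum(cases_later) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(search_volume, cases_later))
         / sum((x - mean_x) ** 2 for x in search_volume))
intercept = mean_y - slope * mean_x

# forecast case counts from this week's search volume, days ahead of
# conventional surveillance because searches precede confirmed cases
forecast = intercept + slope * 400.0
```

The lead time comes entirely from the lag structure: people search for symptoms before they appear in official case statistics, so the search signal is available earlier.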
But what's your take on the role of these big companies we've been talking about, Facebook, Twitter, YouTube, Telegram, and their ability to control the narrative and at least provide safe sources of information, let's say in times of COVID, but there may be other issues of public interest where they have a role to play?

Sam Gilbert 11:57
Yes, I think this is such an important question. It's very interesting that you use the phrase "control the narrative", because of course that is something that big tech companies have traditionally been extremely reluctant to do. One of the things I explore a bit in my book is the extent to which this can really be traced back to some unexamined normative assumptions on the part of tech company executives, where they think of American norms of free speech, and the free speech protections of the First Amendment, as sort of universal laws that are applicable everywhere, rather than things which are culturally and historically contingent. And for that reason, they have been extremely reluctant to do any controlling of the narrative, and have tended to champion free speech over the alternative course of action they might take, which is to be much more proactive in combating harms, including but not limited to misinformation. I think this probably also speaks to another problem that I'm very interested in in the book, which is what we are concerned about when we say we're concerned about big tech companies' power. Ordinarily, the discussion about big tech companies' power tends to focus on their concentrations of market power, or, in the case of surveillance capitalism theory, on the theoretical power that algorithms have over individuals and their decision-making. And what gets lost a bit in that is the extent to which tech companies, by providing these platforms and these technologies, actually empower other people to do things that weren't possible before.
So in some work I've been doing with Amanda Greene, who's a philosopher at University College London, we've been thinking about that concept of empowering power, as we call it. And as far as we're concerned, that's actually a much more morally concerning aspect of the power of big tech companies than their market position.

Todd Landman 14:11
Yeah. So I like it that you cite the First Amendment of the American Constitution, but interestingly, the international framework for the protection and promotion of human rights also has very strong articles around protection of free speech, free assembly and free association, which of course the tech companies will be interested in looking at and reviewing. But what it raises, I believe, is really a question around the public regulation of private actors, because these are private actors. They're not subject to international human rights law in the way that states are, and yet they're having an impact on mass publics, on politics, on debate. So perhaps I misspoke by saying control the narrative. What I'm really interested in is that we seem to have lost mediation. We have unmediated access to information, and it seems to me that it's incumbent upon these organisations to provide some kind of mediation of content, because not all things are true just because they're said. So it gets back to that question: where's the boundary for them? When will they step in and say this is actually causing harm? Is there some sort of big tech Hippocratic oath about doing no harm that needs to be developed, so that there is at least some kind of attempt to draw a boundary around what is shared and what is not shared?

Sam Gilbert 15:34
Yes, so the idea of a Hippocratic oath for tech workers is definitely out there; the writer who has explored it more than I have is James Williams in his book Stand Out of Our Light.
I think that is certainly something that would help. I also think it is beneficial that at the moment we're having more discussion about data ethics and the ethics of artificial intelligence, and that that is permeating some of the tech companies. So more ethical reflection on the part of tech executives and tech workers is to be welcomed. I don't think that's sufficient, though, and I do think it's important that we have stronger regulation of the tech sector. From my perspective, the thing that needs to be regulated, much more than anything to do with how data is collected or how data is used in advertising, is what is sometimes referred to as online safety, or other times as online harms. That is, anything that gives rise to individuals being at risk of being harmed as they live their lives online. There's actually legislation coming through in the UK at the moment called the Online Safety Bill, which is far from perfect legislation, but in my opinion it's directionally right, because it is more concerned with preventing harm, and giving tech companies a responsibility for playing their part in that, than it is with trying to regulate data or advertising.

Todd Landman 17:13
Yeah, so it's really the results of the activity that it's trying to address, rather than the data that drives the activity, if I could put it that way. So if we think about this do-no-harm element, the mediating function that's required at least to get trusted information available to users, I wonder if we could pivot a little bit to the current crisis in Ukraine. Because I've noticed on social media platforms a number of sites have popped up saying "we're a trusted source for reporting on the current conflict", and they get a sort of kitemark or a tick for that. I've also seen users saying, don't believe everything you see being tweeted out from Ukraine.
So where does this take us, not only with COVID but with something as real-time, active and horrific as conflict in a country? We can talk about Ukraine or other conflicts; what about the sharing of information on social media platforms?

Sam Gilbert 18:08
Yes, well, this is a very difficult question, and unfortunately I don't have the answer for you today. I guess what I would point to is something you touched on there, Todd, which is the idea of mediation. We have been through this period with social media where the organisations, the institutions that we traditionally relied on to tell us what was true and what was false, and to sort fact from fiction, have been disintermediated. Or, in some cases, they have found themselves trying to compete in this very different information environment that is much more dynamic, in a way that actually ends up undermining the journalistic quality that we would otherwise expect from them. So this is not a very satisfactory answer, because I don't know what can be done about it, except that it is a very serious problem. I suppose one final point, which I've been reminded of reading stories on this topic in relation to the Ukraine crisis, is the duality of the power that tech companies and technology have given to ordinary users in the era of social media over the last 15 years or so. If we were to rewind the clock to 2010 or 2011, the role of Twitter and Facebook and other technology platforms in enabling protest and resistance against repressive regimes was being celebrated. If we then roll forwards a few years and look at a terrible case like the ethnic cleansing of the Rohingya people in Myanmar, we are at the complete opposite end of the spectrum, where the empowerment of users with technology has had disastrous consequences. And I guess if we then roll forward again to the Ukraine crisis, it's still not really clear whether the technology is having a beneficial or detrimental effect.
So this is really just to say, once again, when we think about the power of tech companies, these are the questions I think we need to be grappling with, rather than questions to do with data.

Todd Landman 20:31
Sure. There was a great book years ago called The Logic of Connective Action, and it was really looking at the way in which these emerging platforms, because the book was published some years ago, were lowering collective action costs, whether for protest movements or anti-authoritarian movements, etc. We did a piece of work years ago with someone from the German Development Institute on the role of Facebook in opposition to the Ben Ali regime in Tunisia. Facebook allowed people to make a judgement as to whether they should go to a protest or not, based on the number of people who said they were going, and so it lowered the cost of participation, or at least the calculated cost of participating in those things. But as you say, we're now seeing this technology being used on a daily basis. I watch drone footage every day of tanks being blown up, of buildings being destroyed. And part of my mind thinks, is what I'm watching real? And then part of my mind thinks about: what's the impact of this? Does this have an impact on the morale of the people involved in the conflict? Does it change the narrative, if you will, about the progress, or lack of progress, in the conflict? And then, of course, there's the multiple reporting of whether there are going to be peace talks, humanitarian corridors and all this other stuff. So it does raise very serious questions about authenticity, veracity and the ways in which technology could verify what we're seeing. And of course, you have time-date stamps, metadata and other things that tell you that something was definitely geolocated. So are these companies doing that kind of work?
Are they going in and digging into the metadata? I noticed that Maxar Technologies, for example, is being used for its satellite data extensively, looking at the build-up of forces and the movement of troops and that sort of thing. But again, that's a private company making things available in the public sphere for people to then reach judgements, and for media companies to use. It's an incredible ecosystem of information, and it seems a bit like a wild west to me, in terms of what we believe, what we don't believe, and the uses that can be made of this imagery and commentary.

Sam Gilbert 22:32
Yes, so there is, as in all things, this super-proliferation of data, and what is still missing is the intermediation layer to make sense of it, and also to tell stories around it that have some kind of journalistic integrity. What you put me in mind of there, Todd, was the open-source intelligence community, and some of the work that organisations, including human rights organisations, do to leverage these different data sources to validate and investigate human rights abuses taking place in different parts of the world. To me, this seems like very important work, but also work that is rather underfunded. I might make the same comment about fact-checking organisations, which seem to do very important work in the context of disinformation, but don't seem to be resourced in the way that perhaps they should be. Maybe one final comment on this topic would relate to the media literacy, the social media literacy, of individuals. And I wonder whether that is something that is maybe going to help us in trying to get out of this impasse, because I think over time people are becoming more aware that information they see on the internet may not be reliable.
And while I think there's still a tendency for people to get caught up in the moment, and retweet or otherwise amplify these types of messages, I think that some of the small changes the technology companies have made to encourage people to be more mindful when they're engaging with and amplifying content might just help build on top of that increase in media literacy, and take us to a slightly better place in the future.

Todd Landman 24:26
Yeah, the whole thing around media literacy is really important, and I also want to make a small plea for data literacy: just understanding and appreciating what data and statistics can tell us, without having to be an absolute epidemiologist, statistician or quantitative analyst. But I wanted to hark back to your idea around human rights investigations. We will have a future episode with a group that does just that; it's about maintaining the chain of evidence, corroborating evidence and using digital evidence in ways that help human rights investigations. And if and when this conflict in Ukraine finishes, there will be some sort of human rights investigatory process. We're not sure which body is going to do that yet, because there have been calls for a Nuremberg-style trial, there have been calls for the ICC to be involved, and there are many other stakeholders involved, but that digital evidence is going to be very much part of the record. But I wonder, yeah, go ahead, Sam.

Sam Gilbert 25:26
Sorry, I was just going to add one thing on that, which I touched on a little bit in my book: I think there's a real risk, actually, that open-source intelligence investigations become collateral damage in the tech companies' pivot towards privacy.
So what some investigators are finding is that material they rely on to be able to do their investigations is being unilaterally removed by tech companies, either because it's YouTube and they don't want to be accused of promoting terrorist content, or because it's Google or Facebook and they don't want to be accused of infringing individuals' privacy. So while this is not straightforward, I just think it's worth bearing in mind that sometimes pushing very hard for values like data privacy can have these unintended consequences in terms of open-source intelligence.

Todd Landman 26:24
Yes, it's an age-old chestnut about the unintended consequences of purposive social action. I think it was Robert Merton who said that at one point. But I guess, in closing, I have a final question for you, because you are an optimist. You're a data optimist, and you've written a book called Good Data. So what is there to be optimistic about for the future?

Sam Gilbert 26:42
Well, I suppose I should say something about what type of optimist I am first. To do that, I'll probably reach for Paul Romer's distinction between blind optimism and conditional optimism. Blind optimism is the optimism of a child hoping that her parents are going to build her a tree house. Conditional optimism is the optimism of a child who thinks, well, if I can get the tools, and if I can get a few friends together, and if we can find the right tree, I think we can build a really incredible tree house together. So I'm very much in the second camp, the camp of conditional optimism. And the basis for that probably goes to some of the things we've touched on already, where I just see enormous amounts of untapped potential in using data in ways that are socially useful. Perhaps just to bring in one more example of that.
Opportunity Insights, the group at Harvard run by Raj Chetty, has produced some incredibly useful insights into social mobility and economic inequality in America by using de-identified tax record data to understand, over a long period of time, the differences in people's incomes. And I really think that that type of work is just the tip of the iceberg when it comes to this enormous proliferation of data that is out there. So I think if the data can be made available to researchers, and also to private organisations, in a way that, as far as possible, mitigates the risks that do exist to people's privacy, there's no knowing quite how many scientific breakthroughs, or advances in human and social understanding, we might be able to get to.

Todd Landman 28:52
Amazing. And I guess to your conditional optimism I would add my own category, which is cautious optimism, and that's what I am. But talking to you today really does provide deep insight for us into the many different and complex issues here. And that last point you made about de-identified data used for good purposes, shining a light on things that are characterising our society with a view to being able to do something about them: you see things that you wouldn't see before, and that's one of the virtues of good data analysis. You end up revealing macro patterns, inconsistencies, inequalities and other things that can then feed into the policymaking process to try to make the world a better place, and human rights are no exception to that agenda. So for now, Sam, I just want to thank you so much for coming on to this episode and sharing all these incredible insights and the work that you've done. Thank you.

Chris Garrington 29:49
Thanks for listening to this episode of The Rights Track, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts, with funding from 3DI.
You can find a detailed transcript on the website at www.RightsTrack.org. And don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.

Further reading and resources:
Sam Gilbert (2021) Good Data: An Optimist's Guide to Our Digital Future.
Bill Lampos' COVID infodemiology: Lampos, V., Majumder, M.S., Yom-Tov, E. et al. (2021) "Tracking COVID-19 using online search".
Infodemiology Japan/natural disasters paper: [1906.07770] Predicting Evacuation Decisions using Representations of Individuals' Pre-Disaster Web Search Behavior (arxiv.org).
On "empowering power": Greene, Amanda and Gilbert, Samuel J. (2021) "More Data, More Power? Towards a Theory of Digital Legitimacy".
On the Hippocratic oath for tech workers: James Williams (2018) Stand Out of Our Light: Freedom and Resistance in the Attention Economy.
Matthias C. Kettemann and Konrad Lachmayer (eds.) (2022) Pandemocracy in Europe: Power, Parliaments and People in Times of COVID-19.
W. Lance Bennett and Alexandra Segerberg (2013) The Logic of Connective Action: Digital Media and the Personalization of Contentious Politics.
A bonus episode of the Japan Cricket Podcast, as the host turns interviewee for the YouTube channel Cricket Spectacle. On this episode I'll talk a bit about how I ended up working for the JCA, what the landscape was like when I arrived and what it looks like now, as well as the challenges of getting cricket into Japanese schools. This is not the only place I have popped up lately, however: you can also hear me witter on if you check out the Tokyo Living Podcast with former karate champion turned Club 360 owner Sam Gilbert. Do have a listen, and subscribe for further insight on life and sport in Japan. Thank you to everyone who completed the Season One Feedback Form; it will remain open until the end of March if anyone else would like to add their thoughts and help shape the direction of the podcast. Please also remember to rate/review the show on whichever platform you get your podcasts, and give us a follow on Instagram as well, which is also the best place to communicate directly with me.
On Episode #61 of the Book Talk Today podcast we are joined by Sam Gilbert. Sam is an affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge. He is also an expert in data-driven marketing, having been chief marketing officer at the fintech startup Bought By Many. Today we will be discussing his most recent book, "Good Data: An Optimist's Guide to Our Digital Future". We discussed:
In the first Tech To Transform podcast of 2022, Mantis MD Eleanor Willock delves into the use of data and the ethics of AI in the NHS. Eleanor spoke to Sam Gilbert, author of Good Data: An Optimist's Guide to Our Digital Future, and Daniel Bamford, Deputy Director AI Award, Accelerated Access Collaborative at NHS England, on what NHSE is doing to fund AI innovation. They discussed how machine learning is already transforming 80,000 clinical pathways, and looked at reasons to be optimistic about how good data can make a positive difference. They also questioned whether the NHS has that good data, and whether public sector tech companies should focus on other markets to attract investors. And, they explained why switching the radio to Heart FM is the first thing they'll all be doing on their imaginary taxi ride with Sajid Javid. Take a listen.
This podcast episode is a beautiful collaboration with my soul sister, Sam Gilbert, of Wild Soul Craft. We dive deep into doing our own healing work and are vulnerable about patterns showing up for us and how we navigate them. This is a beautiful mirror to all the women out there bringing their medicine to the world, inviting you to see your own process as necessary, deliciously human and a permission slip for so many others. We talk about getting Soul Naked and becoming Unleashed and truly liberating women into their own sovereign self-expression. It's such a candid look into our missions and what is at the heart of our revolutionary acts of rebellion. Grab some kombucha or a chai, pull up a seat and listen to the sacred prayer that we have for you today! Thank you so much for tuning in. Please leave a review and enjoy our free breathwork below as a sacred exchange. To work with Erin in The Embodied Womban Program or to claim one of the last spots at the Embodied Womban Retreat: Sedona Embodied Woman Retreat February 17-21st 2022 - https://www.soulnakedceo.com/soulnakedtheretreat21 On Demand Breathwork membership - https://www.soulnakedceo.com/Membership FREE Self Healing Breathwork - https://www.soulnakedceo.com/opt-in-aad22992-8e64-4f03-9657-d68bcd3dd4cc Follow us on IG: www.instagram.com/SoulNakedCEO To Follow Sam: https://www.instagram.com/sammisoulcritter/ https://www.wildsoulcraft.com
Join Em Stroud and her Clown Barabra as they chat all things good data with Sam Gilbert. How can data be a force for good? What can we do as individuals to look after our data? How did Sam get interested in data? What does Barabra think of data and knitting? How can we change the narrative around our own data? Plus tales of Eastenders and much, much more. Sam's Social Media: Facebook: https://www.facebook.com/samgilbertwrites LinkedIn: https://www.linkedin.com/in/samgilbert/ Twitter: https://twitter.com/samgilb About Sam Gilbert: Almost everything we do generates data. Digital technology is now so pervasive that it's very hard to escape its influence, and with that growth comes fear. But whatever the news has told you about data and technology, think again. Data expert and tech insider Sam Gilbert shows that, actually, this data revolution could be the best thing that ever happened to us. His book Good Data examines the incredible new ways this information explosion is already helping us - whether that's combating inequality, creating jobs, advancing the frontiers of knowledge or protecting us from coronavirus - and explains why the best is yet to come. Data touches everything, from our biggest hates (online advertising) to our greatest loves (our pets). Sam explores how, if we can embrace the revolution (even the ads), we could all live vastly improved lives.
This week on the Evolving Leader podcast, co-hosts Scott Allender and Jean Gomes talk to Sam Gilbert, expert in data-driven marketing, entrepreneur and researcher. Sam was employee number one and Chief Marketing Officer at Bought By Many, the multi-award-winning fintech ranked No. 13 in The Sunday Times TechTrack100, and prior to that held the position of Head of Strategy and Development at Experian. Aged 39, Sam changed course and returned to university to rethink the role of data in our future. Sam is author of 'Good Data: An Optimist's Guide to Our Digital Future' (Welbeck, 2021). Social: Instagram @evolvingleader | LinkedIn The Evolving Leader Podcast | Twitter @Evolving_Leader
Sam Gilbert discusses with Ivan six things which he thinks should be better known. Sam Gilbert is an affiliated researcher at the Bennett Institute for Public Policy at the University of Cambridge. An expert in data-driven marketing, he was employee number one and chief marketing officer at Bought By Many, an award-winning fintech start-up named as one of Wired's hottest start-ups in Europe and ranked in the Sunday Times TechTrack100 list of the UK's fastest growing companies. Previously, he was head of strategy and development at the data company Experian and head of consumer finance at Santander. He lives in Copenhagen. West Highland Line: https://www.scotrail.co.uk/scotland-by-rail/great-scenic-rail-journeys/west-highland-line-glasgow-oban-and-fort-williammallaig AnswerThePublic.com: https://answerthepublic.com/ Danish Summerhouses: https://www.howtoliveindenmark.com/podcasts/danish-summerhouse-dollhouse-expect-youre-invited-danish-summer-home/ The Zuckerberg Files: https://zuckerbergfiles.org/ Judith Shklar's Liberalism of Fear: https://philpapers.org/archive/SHKTLO.pdf Novels of Magnus Mills: https://www.bloomsbury.com/author/magnus-mills This podcast is powered by ZenCast.fm
On this edition of the Spotlight, Jeff learns about all the great events at Historic Rock Castle in Hendersonville, TN. Sam Gilbert, the Executive Director, shares history, events, and information about this historic property.
Open banking is said to be a seismic shift in the financial services industry, where data becomes democratized and new opportunities for innovation are created. But what is open banking really? What's real and what's hype when it comes to open banking? And what is the impact on consumers? Many consumers express concerns about the safety and management of their data in this digital age - is there merit to this concern? And how can banks and fintechs clearly communicate the impact and benefits of open finance to consumers? In this episode, we want to understand open finance better and discuss whether consumers have a right to be afraid of open banking or whether the concerns about data have been overblown. To help with that we have invited Sam Gilbert, author of 'Good Data: An Optimist's Guide to Our Digital Future', and Esben Toftdahl Nielsen, CCO and Co-founder of Penni. Tune in and enjoy!
Just over a decade ago, Experian UK ran a programme called Inspire. This episode was sparked by Sam Gilbert, who posted a picture on LinkedIn of the cohort he was in on the programme. I am joined by Sam, Jemma Price and Keith Ambrose to discuss the impact of the programme then and now! Sam said in his post: "More than a decade ago, I took part in a leadership programme called Inspire. All these years later, barely a week goes by without me drawing on the vast store of things I learnt from the experience and the people I shared it with. Reflecting on a meeting, I'll realize it went well because it transcended "ritual cliché" and achieved "true rapport". Facing a decision, I'll ask myself something Guy Bloom said when delivering: "What would a *good* version of you do? Not a superhuman, Mahatma Gandhi version, but a *good* version?" It made me a braver and more compassionate person, and a better leader because of it. Company priorities change over time, and I imagine Experian has been through several iterations of its people strategy since then. Inspire was a material investment that may not have driven short-term financial results. But I am 100% sure it was the right thing to do. As Fred Pelard writes, businesses should think of training as a "no regret" move - one that yields a positive outcome in every scenario.
That I'm writing today with warmth and gratitude about a company I left 8 years ago speaks to the truth of that." Creating and running this programme was a labour of love; it was a team effort to make it land and have the impact that it did. It went on to win: Winner, Gold Award - Training Journals, Best Leadership Programme; Highly Recommended - HR Excellence Awards, Employee Engagement; Winner - ABP 2015, Excellence In Performance Improvement. To find out more about the programmes that I run, based on award-winning levels of design, delivery and impact, reach out to me on guybloom@livingbrave.com or 07827 953814. Visit the livingbrave website. You can see all links below.
The St Kilda Football Club is struggling. There's no doubt about that. What's gone wrong? We're back to discuss another demoralising loss, but there is some light at the end of the tunnel. We also catch up with former Saints defender, Sam Gilbert, as we look back at his 200+ game career, and get his thoughts on the current plight of St Kilda FC. Catch all our video interviews on the https://www.youtube.com/channel/UCgturW0AMoA-biMrEEiabdg (Unpluggered YouTube page): don't forget to subscribe, rate and review Unpluggered wherever you listen or watch, and make sure to follow the lads on https://my.captivate.fm/twitter.com/unpluggered (Twitter), https://my.captivate.fm/facebook.com/unpluggered (Facebook) and https://my.captivate.fm/instagram.com/unpluggeredpodcast (Instagram) to stay connected with the podcast. Guest and sponsorship opportunities: unpluggered@gmail.com.
Sam Gilbert: Good Data with TRE's Bill Padley
Vicki and Geraldine talk to Sam Gilbert, fintech entrepreneur, Cambridge researcher, and author of the radical new book, Good Data, about why he takes issue with the fashionable idea of surveillance capitalism. Talking points: How much does it matter that 'our data' is being collected? Is data actually 'the new oil'? How bad is online advertising really? Is the focus on privacy actually harming our ability to use data for public good? What would good use of data look like?
Sam Gilbert is a registered physiotherapist with the Australian Physiotherapy Association (APA) and a certified strength and conditioning specialist (CSCS) with the National Strength and Conditioning Association (NSCA). He holds a bachelor's degree in Physiotherapy from La Trobe University (Melbourne, Australia) and a master's degree in Exercise Science (Strength and Conditioning) from Edith Cowan University (Perth, Australia). Sam has combined his practical experience with an in-depth study of sports performance in relation to combat sports, and strives to help other combat athletes reach their full competitive potential while decreasing injury risk and increasing their capacity for competition and training.
What was the real story with UCLA booster Sam Gilbert... along with the lesson Kareem taught Coach Wooden...
Learning to Teach better with Sam Gilbert as we discuss the featured book, Grit: The Power of Passion and Perseverance by Angela Duckworth.
Annika Boldt and Sam Gilbert discuss their research on spontaneous cognitive offloading and improving performance. https://wp.me/p8IxYp-1yY (Transcripts are available.) The Psychonomic Society (Society) is providing information through this podcast as a benefit and service in furtherance of the Society's nonprofit and tax-exempt status. The Society does not exert editorial control over such materials, and any opinions expressed in the podcast are solely those of the individual contributors and do not necessarily reflect the opinions or policies of the Society. The Society does not guarantee the accuracy of the content contained in the podcast and specifically disclaims any and all liability for any claims or damages that result from reliance on such content by third parties.
Learning to teach better with Sam Gilbert as we discuss the featured article, Teaching Mathematics as Agape: Responding to Oppression with Unconditional Love.
Boyd Hilton and Josh Landy welcome Sam Gilbert for today's show. We chat Arsenal memes, league table margins, team selection, David Luiz's performance, Pépé, and loads more! arsenalpodcast.net @arsenalpodcast Produced by Josh Landy Engineered by Leon Gorman A Playback Media Production playbackmedia.co.uk Copyright 2019 Playback Media Ltd - playbackmedia.co.uk/copyright Our GDPR privacy policy was updated on August 8, 2022. Visit acast.com/privacy for more information. Learn more about your ad choices. Visit megaphone.fm/adchoices
Welcome to the Good Beer Hunting Podcast. I’m Ashley Rodriguez. I remember the very first time I went to Temescal Brewing in Oakland, California. I had been watching the buildout for months, waiting in anticipation. You could step outside of my apartment building and see it from the corner. So when they finally opened, I was excited—and pleasantly surprised by the number of crushable, low-ABV beers they offered from the get-go. I was immediately on board. My relationship with Temescal Brewing is a mirror of my time in Oakland, initially because of proximity—I could throw a load of clothes in the washing machine and have a beer as I waited to transfer them to the dryer. Eventually, however, it evolved into a closer connection with the folks who ran the brewery. I was used to talking with the bartenders—at that time there were maybe six or seven regular folks I’d see—and eventually I ended up joining the bar staff, and being forever changed not just by the way they brew beer, but by the way they hire, the way they advocate for the rights of others, and the way they became a guiding light for businesses seeking to operate ethically and responsibly. A lot of what I love about Temescal comes from owner Sam Gilbert, who we interviewed on this podcast a couple of weeks ago, but also from then-taproom manager Theresa Bale. Theresa hired me back in the day, and is also the founder of Queer First Friday, the Bay’s only craft-beer-focused queer dance party. Every first Friday of the month, Temescal holds one of the most exciting and inclusive queer events in the area. From queer families who bring their children in at the beginning of the night to folks hopping on the dance floor to new performers, DJs, and singers showcasing their talents for the very first time, Queer First Friday is one of the loveliest celebrations of diversity and inclusion that I’ve encountered. And it’s all because of Theresa. 
In this episode, we talk a lot about Queer First Friday—I was at the very first one, slinging beers from the outside bar, slightly overwhelmed by the number of people who showed up, clamoring to get in, and we also talk about how Theresa thinks about hiring for diversity. Theresa isn’t shy about this—she’s intentional, she’s open, and she makes a point to seek out bartenders who maybe don’t have a ton of experience or know that much about beer. Because, for Theresa, to build a truly inclusive staff, you have to look outside the corners and social networks you know. I could talk about Temescal and what it means to me forever. But I’ll let Theresa, who recently transitioned from taproom manager to head of operations, a job she created for herself, tell the story. This is Theresa Bale, head of operations for Temescal Brewing in Oakland, California. Listen in.
There are easy topics brewery owners love to talk about to promote themselves: interesting beer releases, charity work, GABF awards, or cheeky packaging. And then there are other, much stickier issues most avoid addressing: gentrification, racism and sexism in the industry; labor; and a brewery’s moral responsibility to its community and the people within it. Sam Gilbert, who co-founded Oakland’s Temescal Brewing about three years ago in the neighborhood with which it shares a name, doesn’t shy away from such difficult conversations. Oakland’s changing, and it’s well past time to talk about it. Temescal Brewing came into existence in great part because of the community it now counts as patrons. A successful crowdfunding campaign helped the brewery get off the ground, and Gilbert hasn’t forgotten the significance of that. Since then, Temescal has prioritized hiring hyperlocal residents, hosting regular LGBT-focused bashes, and proactively reaching out to communities less frequently seen in your standard brewery taproom. But the vibes aren’t always positive. Sometimes, being a good neighbor can be much less fun, as when AB InBev announced an intention to open a Golden Road taproom not far from Temescal’s brewhouse. Gilbert, like others in the Oakland craft beer scene, was outspoken against the idea. Ultimately, the plan was quashed, and some believe it was in large part due to vocal critics like Gilbert, and to Oakland’s anti-corporate ideologies. All of this is to say nothing of the beer. Temescal is frequently cited as one of Bay Area drinkers’ favorite breweries, and its focus on soft, nuanced, and low-ABV styles (and in particular Pilsners) has made fans out of the pickiest consumers. It’s becoming increasingly common to see the brewery’s brightly colored cans in fridges around the Bay. There’s a reason Temescal’s reputation is as colorful as its approachable, pastel-splashed taproom. It aims for inclusion. 
And it lives by the motto: “No jerks.” This is Sam Gilbert of Temescal Brewing. Listen in.
SF Beer Week is at the top of today's Brew Ha Ha agenda. Gail Williams and Steve Shapiro are in from San Francisco. They were writers for Celebrator Beer magazine and Gail is now working on PR for SF Beer Week. Sam Gilbert from Temescal Brewing in Oakland is also in and has brought beer from Temescal, a Pilsner and a single-hop Citra Ale. Mark remembers SF Beer Week used to do events at Anchor Brewery. The first was in 2009. SF Beer Week is happening all around the region now. The opening gala is Friday, February 1. Feb. 2 through 10 there are events all over the Bay Area. Temescal Brewing also has some events. They are in North Oakland in the Temescal neighborhood. They do a live canning tour called "Fresh Cans" where you can drink a beer just plucked off the canning line. They also do a "Low and Slow" cookout in their patio space. They try to schedule SF Beer Week events near public transportation. "It goes through the 10th so it's a long week," says Gail. Sam says that Temescal makes beer for everyone, including anyone who lives in Oakland that they want to welcome into craft beer. It is in its third year. Before that, Sam worked for a thing called Brewlab San Francisco, where he met other home brewers who shared their beers and gave feedback on them. When the SF Chronicle wrote about it, there followed months of chaos as the idea grew. Eventually Brewlab became a community of younger brewers with a shared homebrew space in the Mission district. Then they decided to throw the parties with 500 gallons instead of 5. His sweatshirt says, "Fresh beer, no jerks." That has become their official motto. They are tasting the Citra Ale from Temescal. Gail says it's the nicest example of a hazy IPA that she has had. It has a beautiful orangey finish, at 6.2% alcohol. It has passion fruit and guava flavor. Citra is an example of a hop that was developed for its flavor. Some hops are patented strains. Citra was developed here and is grown primarily in the Pacific Northwest.
Mark says they used to go to Yakima, Washington, every year to choose hops. California used to be a huge hop grower, but today the Sonoma County production is close enough to the breweries that the hops can be used fresh, within a day or two of harvest. There is a big calendar on the SF Beer Week website listing their numerous events. Sam Gilbert talks about his background with San Francisco Brewlab. When he started the brewery it was getting expensive to stay in San Francisco, and they found a place in Oakland which was affordable but where there was an opportunity to start a beer community. Steve Shapiro also talks about a site he started called BeerbyBART.com as a way to help people organize beer adventures and use public transit.
The context of a statement, a soundbite, a headline, or a verse of Scripture is critically important to rightly interpreting and fully understanding it. In an age of provocative headlines, Twitter statements, and sensational news headlines, it is increasingly important to know the setting. Too often we are guilty of judgmental attitudes, condescension, and mistrust of others based solely on a single statement. Or worse, only a few words. Bob Knight - At It Again! Case in point - a headline quote from former Indiana University Men's Basketball Coach, Bob Knight - "I don't respect John Wooden." On the surface, that sounds pretty offensive. After all, Coach Wooden (former UCLA Men's Basketball Coach) is one of college basketball's greatest coaches of all time. Not to mention the fact that he was a great ambassador for college sports and Christ. Coach Wooden is known for his demeanor - as is Bob Knight (though somewhat opposite) - his leadership, and his success principles. So hearing that another Hall of Fame coach thinks little of Wooden is quite a shock. That is, until you begin to see the quote in context. It was made during an interview in which Knight was expressing his displeasure with the recruiting practices of UCLA during the 1970s and '80s. Taking the time to learn the context of the statement, you soon understand that Knight was not speaking of his dislike for John Wooden specifically. His complaint was against the UCLA system and specifically a man named Sam Gilbert who participated in unfair recruiting. Gilbert, by the way, caused UCLA to forfeit their 1980 Championship and participation in the 1981 post-season. While Wooden may have or should have had some influence or control over Gilbert, Knight's comments weren't aimed at Wooden specifically. Right Understanding This quote from Bob Knight only illustrates a bigger problem we have with not taking context into account. Whether it is for headlines, quotes, or more importantly, Scripture references.
Too many people - preachers included - take verses out of context in an effort to support a particular viewpoint. It's called confirmation bias, and it happens when we seek only information that supports an already-held position. Instead of drawing on as much information as we can and then making a right interpretation, we jump on the first assumption. This does little to help us grow and even less to help us work with and understand others. This episode addresses the critical importance of context in gaining a right interpretation.
Matt Finnis and Sam Gilbert discuss the 2016 Pride March.
White and McGuane present RSN Breakfast. Includes Barry Hall, Graig Gabriel, Sam Gilbert and Brock McLean.