An analysis by Consumer Reports suggests that every person on Facebook is being monitored by thousands of companies, but new tools and laws are starting to make it easier for people to take back their privacy. This episode was recorded in New York City at an event hosted by the responsible tech nonprofit All Tech Is Human. We Meet: Consumer Reports R&D Director Ginny Fahs; New York Times Contributing Opinion Writer & Investigative Journalist Julia Angwin; Block Party Founder & CEO Tracy Chou. Credits: This episode of SHIFT was produced by Jennifer Strong and Emma Cillekens. The on-site recording engineer was Josh Chapdelaine. SHIFT is mixed by Garret Lang, with original music from him and Jacob Gorski. Art direction by Anthony Green. The Founder and President of All Tech Is Human is David Ryan Polgar.
Welcome back to the Tech Policy Grind Podcast by the Internet Law & Policy Foundry! This week, Reema chats with David Ryan Polgar, Founder of All Tech Is Human, about creating a responsible tech community. They get into what responsible tech means, and why David thinks New York City is the center of the responsible tech community. Reema and David also dig into his background, and how his multiple hats as an attorney, educator, and founder contribute to his outlook on what responsible tech means for our local and global societies in the digital age. Plus, Reema and fellow Foundry Fellow Lama Mohammed chat with attendees of All Tech Is Human's latest NYC Responsible Tech mixer about what the responsible tech community means to them. Thanks for listening, and stay tuned for our next episode! Resources Referenced: the Responsible Tech Guide; the Tech & Democracy report; the AI & Human Rights report. Check out the next All Tech Is Human Responsible Tech Mixer in NYC, and the Responsible Tech Summit in NYC on September 14. DISCLAIMER: Reema participates with the Internet Law & Policy Foundry voluntarily and in her personal capacity. The views and opinions expressed on this show are not necessarily those of the organizations Reema is affiliated with.
This week, we're exploring why it behooves businesses and business leaders to look at their users, consumers, customers, etc., as humans first. Slightly shifting perspective to consider the humanity behind purchasing decisions can lead to greater loyalty, more frequent use, and genuinely happier users, all of which add up to more business success and better outcomes for the world. Together with my guests, we discuss how human-centric decisions apply to various industries and how you can build better relationships that lead to success for all of humanity. Guests this week include Charlie Cole, Neil Redding, Dr. Rumman Chowdhury, Ana Milicevic, Cathy Hackl, Marcus Whitney, and David Ryan Polgar. The Tech Humanist Show is a multi-media-format program exploring how data and technology shape the human experience. Hosted by Kate O'Neill. Produced and edited by Chloe Skye, with research by Ashley Robinson and Erin Daugherty at Interrobang and input from Elizabeth Marshall. To watch full interviews with past and future guests, or for updates on what Kate O'Neill is doing next, subscribe to The Tech Humanist Show channel on YouTube, or head to KOInsights.com. Full Transcript Kate O'Neill: When you buy something, you're a customer. But — to paraphrase a line from the movie Notting Hill — you're also just a person, standing in front of a business, asking it to treat you like a human being. Over the last two-plus decades working in technology, I've often held job titles that were centered on the experience of the user, the consumer, or the customer. In fact, the term 'customer experience' has been in use since at least the 1960s, and has become so common that a recent survey of nearly 2,000 business professionals showed that customer experience was the top priority over the next five years. And while generally speaking this emphasis is a good thing, my own focus over the past decade or so has shifted.
I've realized that the more macro consideration of human experience was a subtle but vital piece missing from the discussion at large. Because when we talk about experience design and strategy, no matter what word we use to qualify it—customer, user, patient, guest, student, or otherwise—we are always talking about humans, and the roles humans are in relative to that experience. In order to refocus on human experience instead of customer, you have to change the way you think about your buyers. You owe it to yourself to think not just about how people can have a better experience purchasing from your company, but also what it means to be fully human within the journey that brings them to that moment, and the uniquely human factors that drive us to make decisions leading to purchase or loyalty. A recent piece by Deloitte shared in the Wall Street Journal echoes this idea and offers five ways to be more human-centric in business: 1) be obsessed by all things human, 2) proactively identify & understand human needs before they are expressed, 3) execute with humanity, 4) be authentic, and 5) change the world. That's what today's episode is about: using empathy and strategic business-savvy to understand what it means to be human, and how that intersects with the worlds of technology and business. Neil Redding: “When you look at everything that has to do with buying and selling of things, it's so closely tied with what we care about, what we value most, value enough as humans to spend our hard-earned money on. And so, the realm of retail reflects something really deeply human, and profoundly human.” Kate: That was Neil Redding, brand strategist and self-described “Near Futurist” focused on the retail space. He's right—buying and selling things has become deeply entwined with humanity. 
But when we purchase something, it's not because we think of ourselves as “customers” or “end users.” We buy because we have a need or desire to fulfill, and sometimes that need is purely emotional. A ‘customer' buys your product—a human buys your product for a reason. 84% of consumers say that being treated like a person instead of a number is an important element to winning their business. It does seem like business professionals are catching on, as 79% say it's impossible to provide great service without full context of the client and their needs. But understanding something isn't the same as putting it into practice—only 34% of people say they feel like companies actually treat them as individuals. One major difference is the question of framing. Customer experience frames the motivator as, ‘how effectively the business operates the events related to a purchase decision.' It drives companies to focus on improving their own metrics, like bringing down call center wait times. These may yield worthwhile outcomes, but they're inherently skewed to the business perspective and aligned to the purchase transaction. Focusing instead on human experience shifts the perspective to the person outside the business, and what they want or need. It allows consideration of the emotional state they may be bringing to the interaction, which leaves greater room for empathy and context. A human experience mindset suggests that each individual's unique circumstances are more important than aggregate business metrics, because the reason why that person is interacting with your company probably can't be captured by measuring, say, how long they might have to wait on the phone. You could bring that wait time to zero and it still may not have any impact on whether the person feels heard, respected, or satisfied with the outcome — or whether they want to engage with you again. 
But as fuzzy as it is to talk about human experience, we know that measurement is fundamental to business success, so we have to find a way to define useful metrics somehow. For each business, that number is likely a bit different. So how do you know whether your customers feel like they're being treated as humans instead of just numbers? Charlie Cole, CEO of the flower delivery website ftd.com, believes one answer is obsessing over customer satisfaction metrics. Charlie Cole: "The best way to win this industry is just kick ass with the customer. We obsess over NPS scores, uh, as kind of leading indicators of LTV scores." Kate: If you're not familiar with the acronyms, allow me to decipher: NPS stands for Net Promoter Score, which measures how likely the customer is to recommend the business, and LTV in this context means 'lifetime value,' or the amount a customer may spend at your business over the course of their lifetime. Charlie Cole: "But remember, it's not the receiver's lifetime, it's the sender's lifetime. I mean, think about it. My stepmom is—just had a birthday April 9th, and I sent her a plant. If I went on a website and picked out a rose, and she received an azalea, she's gonna be like, 'thank you so much, that was so thoughtful of you,' and I'm gonna be pissed, right? And so like, we have to make sure we optimize that sender NPS score. It was shocking to us when we looked into the NPS, when we first got to FTD, our NPS, Kate, was in like the teens! My CTO looked at it and he goes, 'how is this possible? We send gifts, who doesn't like receiving gifts?' And so we were looking at this stuff and we realized like, this is how you win. And I think when people look at the world of online delivery, there's very few companies that are extremely customer-centric… and in our world it matters.
It's births, it's deaths, it's birthdays, it's Mother's Days… it's the most emotional moments of your life that you're relying on us for, so I think that gravitas just goes up to the next level." Kate: Net Promoter Score offers directional insight about the customer experience, but it still isn't quite a measurement of the broader human experience. The typical NPS question is phrased, "How likely is it that you would recommend [company X] to a friend or colleague?", which forces customers to predict future actions and place themselves into hypothetical or idealistic scenarios. It is also measured on a 0-10 scale, which is pretty arbitrary and subjective — one person's 9 would not be another person's 9. A clearer way to ask this and gain more useful human-centric data would be with simple yes/no questions, asking people about actual past behaviors. For instance, "in the past 6 weeks, have you recommended [company X] to a friend or colleague?" Other alternative measures include PES, or Product Engagement Score, which measures growth, adoption, and stickiness of a given product or service, and doesn't require directly asking customers questions about their past or future habits. Instead, data arrives in real time and allows for a clear measurement of success relative to a product's usage. While these metrics are useful in various ways, one thing missing from them is emotion. As humans, we are animals deeply driven by our emotions: research from MIT Sloan finds that before humans decide to take an action—any action, including buying something—the decision must first go through a filtering process that incorporates both reason and feelings. Reason leads to conclusions, but emotion leads to action. And if a customer feels frustrated by the customer service they're experiencing—perhaps they feel like they are being treated like a number, and not a person—they'll file a complaint, share on social media, and tell their friends and family to avoid the business.
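The metric mechanics discussed here are simple enough to sketch in code. Below is a minimal Python illustration (with hypothetical survey data) contrasting the standard NPS calculation, which uses the 0-10 scale with promoters at 9-10 and detractors at 0-6, against the simpler past-behavior yes/no question suggested above.

```python
def nps(scores):
    """Net Promoter Score on the standard 0-10 scale:
    % promoters (scores of 9-10) minus % detractors (scores of 0-6).
    Scores of 7-8 are 'passives' and count toward neither side."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def recommendation_rate(answers):
    """Share of 'yes' answers to a past-behavior question such as:
    'In the past 6 weeks, have you recommended us to a friend?'"""
    return round(100 * sum(answers) / len(answers))

# Hypothetical survey responses, for illustration only
survey_scores = [10, 9, 8, 7, 6, 3, 9, 10, 5, 8]
yes_no_answers = [True, False, True, True, False]

print(nps(survey_scores))                   # 4 promoters, 3 detractors -> 10
print(recommendation_rate(yes_no_answers))  # 3 of 5 -> 60
```

Note how the yes/no rate reads directly as "share of customers who actually recommended us," while NPS compresses hypothetical intent into a single net figure.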
Actions like these (filing complaints, posting on social media, warning friends and family) can be quite time-consuming, but people will give up their time to right a wrong they feel they've experienced. All this is to say that if you want to retain human loyalty or attract new people to your business, you have to create a positive emotional response in your customers, which means understanding more about who they are than simply what product they might want. Many businesses have discovered that one of the best ways to create an emotional connection with people is through branding. A great brand image can forge a permanent bond with someone who feels strongly that the company shares their values and practices what they preach. Once someone has connected a brand to their own identity, it becomes much more difficult to convince them to switch to another company—even if that company provides the same product at lower cost—because switching companies feels like losing a part of themselves. Dr. Rumman Chowdhury, Director of the Machine Learning Ethics, Transparency, and Accountability team at Twitter, explored the concept of branding with me when she came on my show last year. Rumman Chowdhury: "Human flourishing is not at odds with good business. Some of what you build, especially if you're a B2C company, it's about brand. It's about how people feel when they interact with your technology or your product. You are trying to spark an emotion. Why do you buy Coke vs Pepsi? Why do you go to McDonald's vs Burger King? Some of this is an emotional decision. It's also this notion of value. People can get overly narrowly focused on value as revenue generation—value comes from many, many different things. People often choose less 'efficient' outcomes or less economically sound outcomes because of how it makes them feel. A frivolous example but an extreme example of it would be luxury brands. Apple spends so much money on design. Opening every Apple product is designed to feel like you're opening a present. That was intentional.
They fully understand the experience of an individual, in interacting with technology like a phone or a computer, is also an emotional experience.” Kate: If you're able to understand what people connect to about your brand, you can invest into magnifying that image. If your customer loves that you invest into clean energies, it becomes less important how much time they spend on the phone waiting for a service rep. Operational metrics can't show you this emotional resonance, so instead you have to think about what makes you stand out, and why people are attracted to you. Sometimes, however, human emotion has nothing to do with the product or brand in question, and more to do with the circumstances surrounding it. There's perhaps no better example of this than flowers, which can be given for myriad reasons, and usually at the extreme ends of the emotional spectrum. I'll let Charlie Cole explain. Charlie Cole: “For us, it's buyer journey by occasion. So, you are sending flowers for the birth of a newborn. You are sending flowers for the tragic death of a teenager. You are sending flowers for the death of your 96 year old great grandfather. You are sending flowers for your wife's birthday. I would argue that even though the end of all those buyer journeys is ‘flowers,' they are fundamentally different. And you have to understand the idiosyncrasies within those buyer journeys from an emotional component. You have to start with the emotions in mind. You're buying running shoes. The buying journey for like a runner, for like a marathoner, a guy who runs all the time, is emotionally different than someone who just got told they need to lose weight at the doctor. Someone who travels for business all the time versus someone who's taking their first ever international…travel. Like, my wife retold a story the other day to my aunt about how her first European trip was when she won a raffle to go to Austria when she was 17. 
And her, like, single mom was taking her to Europe, and neither of them had ever been to Europe. That's a different luggage journey than me, who used to fly 300,000 miles a year. And I think that if you take the time to really appreciate the emotional nuance of those journeys, yes there's data challenges, and yes there's customer recognition challenges, so you can personalize it. But I would urge every brand to start with like the emotional amino-acid level of why that journey starts, and then reverse-engineer it from there. Because I think you'll be able to answer the data challenges and the attribution challenges, but I think that's a place where we sometimes get too tech-y and too tactical, as opposed to human." Kate: Another challenge unique to flowers and other products usually given as gifts is that there are two completely different humans involved in the transaction, each with different expectations and emotions riding on it. Charlie Cole: "There's two people involved in every one of our journeys, or about 92% of them: the buyer, and the receiver. So how do I message to you, I don't want to ruin the surprise! But I need to educate you, and oh yeah, I'm a really really nervous boyfriend, right? I wanna make sure everybody's doing it right, and it's gonna be there on time, and I need to make sure it's going to the right place… So the messaging pathways to the sender and receiver are fundamentally different. If you kind of forget about your buying journey, and imagine everything as a gifting buyer journey, it just changes the messaging component. Not in a nuanced way, but darn near in a reciprocal way." Kate: And while some businesses struggle to connect emotionally with the humans that make up their customer base, the tech industry—and specifically social media companies—seem to fundamentally understand what it is that humans crave, in a way that allows them to use it against us.
They thrive because they take something that is quintessentially human—connecting with people and sharing our lives—and turn it into a means for data collection that can then be used to sell us products that feel specifically designed for us. Like most of us, Neil Redding has experienced this phenomenon firsthand. Neil Redding: "We spend more and more of our time in contexts that we are apparently willing to have commercialized, right? Instagram is kind of my go-to example, where almost all of us have experienced this uncanny presentation to us of something that we can buy that's like so closely tied to… I mean, it's like how did you know that this is what I wanted? So myself and people close to me have just said, 'wow, I just keep buying this stuff that gets presented to me on Instagram that I never heard of before but gets pushed to me as like, yeah it's so easy, and it's so aligned with what I already want.' So there's this suffusion of commercial transaction—or at least discovery—of goods that can be bought and sold, y'know, in these moments of our daily lives, y'know, so that increasingly deep integration of commerce and buying and selling of things into our self-expression, into our communication, works because what we care about and what we are willing to buy or what we are interested in buying are so intertwined, right? They're kind of the same thing at some deep level." Kate: Part of the reason this works is that humans crave convenience. Lack of convenience adds friction to any process, and friction can quickly lead to frustration, which isn't a mind state that leads to more business. The internet and social media have made keeping up with friends and gathering information incredibly convenient, so an advertisement here or there—especially one that looks and feels the same as everything else on our feed—doesn't bother us like it might in other contexts.
And when those advertisements have been tailored specifically to our interests, they're even less likely to spark a negative emotion, and may in fact encourage us to buy something that we feel is very "us." The big question for business leaders and marketers then is how do you digitize your business so that it emphasizes the richness of the human experience? How do you know which technologies to bring into your business, and which to leave aside? There are plenty of established and emerging technologies to choose from: Interactive email helps marketers drive engagement and also provides an avenue for additional data collection. Loyalty marketing strategies help brands identify their best customers and customize experiences for them. Salesforce introduced new features to help humanize the customer service experience with AI-powered conversational chatbots that feel pretty darn close to speaking with an actual human. Virtual and Augmented Reality website options allow customers to interact with products and see them in their hands or living rooms before they buy. With all the choice out there, it can be overwhelming. And too often, businesses and governments lean into the "just buy as much tech as possible!" approach without thinking integratively about the applications of said technology. Many companies are using that technology to leverage more data than ever before, hoping to customize and personalize experiences. David Ryan Polgar, a tech ethicist and founder of All Tech Is Human, explains why this method may not yield the results you think—because humans aren't just a collection of data points. David Ryan Polgar: "Are we an algorithm, or are we unique? I always joke, like, my mom always said I'm a, a snowflake! I'm unique! Because, when you think about Amazon and recommendations, it's thinking that your past is predicting your future. And that, with enough data, we can accurately determine where your next step is.
Or even with auto-suggestion, and things like that. What's getting tricky is, is that true? Or is it subtly going to be off? With a lot of these auto-suggestions, let's say like text. Well the question I always like to think about is, how often am I influenced by what they said I should say? So if I wanna write, like, 'have a…' and then it says 'great day,' well, maybe I was gonna say great day, but maybe I was gonna say good day. And it's subtly different, but it's also influencing kinda, my volition. Now we're being influenced by the very technology that's pushing us in a certain direction. And we like to think of it, 'well, it's already based on you,' but then that has a sort of cyclical nature to actually extending—" Kate: "Quantum human consciousness or something." David: "Exactly! Exactly." Kate: "Like, the moment you observe it, it's changed." Kate: It's so easy, especially when you work with data, to view humans as output generators. But we're living in an age where people are growing increasingly wary of data collection, which means you may not know as much about the people whose data you've collected as you think you do. Becoming dependent on an entirely data-driven model for customer acquisition may lead to faulty decisions — and may even be seen as a huge mistake five years from now. Instead, I always talk about "human-centric digital transformation," which means the data and tech-driven changes you make should start from a human frame. Even if you're already adopting intelligent automation to accelerate your operations, in some cases, very simple technologies may belong at the heart of your model. Here's Neil Redding again.
Neil Redding: “Using Zoom or FaceTime or Skype is the only technology needed to do what a lot of stores have done during COVID, where their customers expect the store associate interaction when they come to the stores, they just create a one-on-one video call, and the shopper just has this interaction over videochat, or video call, and kind of does that associate-assisted shopping, right? And so you have that human connection, and again, it's nowhere near as great as sitting across a table and having coffee, but it's better than, y'know, a 2-dimensional e-commerce style shopping experience.” Kate: As a parallel to video conferencing, Virtual Reality has opened up avenues for new human experiences of business as well. Cathy Hackl, a metaverse strategist and tech futurist, explained a new human experience she was able to have during COVID that wouldn't have been possible without VR. Cathy Hackl: “I'll give you an example, like with the Wall Street Journal, they had the WSJ Tech Live, which is their big tech conference, and certain parts of it were in VR, and that was a lot of fun! I mean, I was in Spatial, which is one of the platforms, hanging out with Joanna Stern, and with Jason Mims, and like, in this kind of experience, where like I actually got to spend some 1-on-1 time with them, and I don't know if I would have gotten that if I was in a Zoom call, and I don't know if I would have gotten that in person, either.” Kate: Virtual Reality and video technologies have also opened up new avenues for healthcare, allowing patients to conference with doctors from home and only travel to a hospital if absolutely necessary. Marcus Whitney is a healthcare investor and founder of the first venture fund in America to invest exclusively in Black founded and led healthcare innovation companies; he explains that these virtual experiences allow for better happiness, healing, and comfort. Marcus Whitney: “Going forward, telehealth will be a thing. 
We were already on the path to doing more and more healthcare in the home. It was something that they were trying to stop because, is the home an appropriate place for healthcare to take place? Lo and behold, it's just fine. Patients feel more secure in the home, and it's a better environment for healing, so you're gonna see a lot more of that. I think we're finally gonna start seeing some real breakthroughs and innovation in healthcare. Most of the lack of innovation has not been because we didn't have great thinkers, it has largely been regulatory barriers. Remote patient monitoring was a huge one that came up in the last year, so now we have doctors caring about it. What moves in healthcare is what's reimbursable. They were always trying to regulate to protect people, but then they realized, well, we removed the regulatory barriers and people were fine, so that regulation makes actually no sense, and people should have more choice, and they should be able to do telehealth if they want to." Kate: And that's just it: humans want choice. We want to feel seen, and heard, and like our opinions are being considered. There's another technology on the horizon that could give people more power over their technology, and therefore freedom and choice, that will likely cause massive change in the marketplace when it is more widely available: Brain-computer interface. Cathy Hackl explains. Cathy Hackl: "So I'm very keen right now on brain-computer interface. The way I'm gonna explain it is, if you've been following Elon Musk, you've probably heard of Neuralink—he's working on BCI that's more internal, the ones I've been trying are all external devices. So I'm able to put a device on that reads my brainwaves, it reads my intent, and it knows that I wanna scroll an iPad, or I've been able to turn on lights using just my thoughts, or play a video game, or input a code… I've been able to do all these things.
And I'm very keen on it, very interested to see what's going on… I think the biggest thing that's stuck with me from studying all these technologies and trying them out from an external perspective, is that my brain actually really likes it. Loves the workout. Like, I'm thinking about it, and I'm like, the receptors here, pleasure receptors are like lighting up, I'm like 'ohmygosh!' So I'm still sitting with that. Is that a good thing? Or a bad thing? I don't know, but I think these technologies can allow us to do a lot of things, especially people with disabilities. If they don't have a hand, being able to use a virtual hand to do things in a virtual space. I think that's powerful." Kate: That story also illuminates the fact that there are many different types of people, each with different needs. Digital transformation has given people with disabilities a new way to claim more agency over their lives, which creates a brand new potential customer base, filled with humans who desire freedom and choice as much as the next person. Now, let's talk about some companies who are doing at least a few things right when it comes to the digital transformation of human experience. Starbucks, for instance. One of the worst parts of shopping in-store was waiting in line, and then the social pressure from the people behind you wishing you would order faster. If you weren't a regular customer, the experience could be overwhelming. When they launched their mobile order app, it tapped into a number of things that made the experience of buying coffee faster and easier, with all sorts of fun customization options that I never knew existed when I only ordered in-store. Now, even brand new customers could order complex coffee drinks — meaning in that one move the company may have brought in new customers and allowed the cost per coffee to increase — all without people feeling pressure from other shoppers, and without the inconvenience of waiting in line.
Then there's Wal-Mart, who during the pandemic instituted ‘Wal-Mart pickup,' a service where people can shop online and pick up their goods without ever having to step into the store. The service is technically operating at a financial loss, but Wal-Mart understands that solid branding and convenience are worth more to their company's bottom-line in the long run than the amount of money they're losing by investing into this particular service. Of course, some businesses are better suited for the online-only world than others. As more companies attempt to digitize their businesses, it's incredibly important to tap into the human reasons that people wanted to engage with your business in the first place. In some cases, businesses have failed to make this connection, assuming that “if people liked us as a physical product, then they'll continue using us when we're digital,” or worse, “if we simply make people aware of us, they will become customers!” This assumption ignores human nature, as Ana Milicevic, a longtime digital media executive who is principal and co-founder of Sparrow Digital Holdings, explains. Ana Milicevic: “To be relevant in this direct to consumer world, you also have to approach awareness and customer acquisition differently. And this is the #1 mistake we see a lot of traditional companies make, and not really understand how to pitch to a digital-first, mobile-first consumer or a direct subscriber. They're just not wired to do it that way, and often times the technology stacks that they have in place just aren't the types of tools that can facilitate this type of direct interaction as well. So they're stuck in this very strange limbo where they are committed to continuing to acquire customers in traditional ways, but that's just not how you would go about acquiring a direct customer.” Kate: Acquiring those direct customers requires an understanding of what humans want—a large part of which is meaning. 
And how people create meaning in their lives is changing as well. Long before the pandemic, trends were already pointing toward a future where we live more of our lives online, but those trends have also been accelerated. So beyond digitizing your business, it may also be useful to invest time, money, and energy into discovering how the humans of the future will create meaning in their lives. Cathy Hackl discussed some of the trends she's seen in her own kids that show how today's children will consume and make purchasing decisions in a very different way than most modern businesses are used to. Cathy Hackl: "Something else that I'm noticing… y'know we're going to brick and mortar, but we're going to brick and mortar less. So you start to see this need for that virtual try-on to buy your makeup, or to buy clothes, and it's also transitioning not only from the virtual try-on into what I'm calling the direct-to-avatar economy. Everything from virtual dresses that you're buying, or custom avatars, y'know you're starting to create this virtualized economy. And this is the reason I always talk about this now, is my son recently did his first communion, and when we said, 'hey, what do you want as a gift?' he said, 'I don't want money, I want a Roblox gift card that I can turn into Robux,'—which is the currency they use inside Roblox—'so that I can buy—whichever gamer's skin.' And, y'know, when I was growing up, my brother was saving up to buy Air Jordans. My son doesn't want that, y'know, he wants Robux, to buy something new for his avatar. This is direct-to-avatar; is direct-to-avatar the next direct-to-consumer?" Kate: Our online avatars represent us. We can customize them to directly express who we feel we are. Part of the reason this idea is so attractive is that many people—increasingly so in the context of online interaction—seek out meaningful experiences as our 'aspirational' selves.
We gravitate to the communities that align with facets of who we wish we were. And perhaps less productively, we may also choose to present the idealized version of ourselves to the world, omitting anything we're embarrassed by or that we feel may paint us in a negative light. But honestly, all of this makes sense in the context of making meaning, because humans are generally the most emotionally fulfilled when we feel empowered to control which 'self' we present in any given interaction. With this much freedom of choice and expression, and with the complications of the modern supply chain—which I will talk about more in depth in our next episode—it's important to acknowledge that creating convenience and improving human satisfaction aren't going to be easy tasks. Behind the scenes, there is a tremendous amount of work that goes into providing a satisfying customer experience. Let's go back to the example of flowers and see what Charlie Cole has to say. Charlie Cole: "If it's too cold they freeze, if it's too hot they wilt, if UPS is a day late they die. And then, the real interesting aspect—and this isn't unique to flowers—the source is remarkably centralized. So the New York Times estimated that 90-92% of roses that are bought in America for Valentine's Day come from Colombia and Ecuador. And so, if anything goes wrong there, then you really don't have a chance. Imagine the quintessential Valentine's Day order: A dozen long-stem roses, New York City. Easy, right? I used to live on 28th and 6th, so let's say Chelsea. Okay, I've got 7 florists who could do it. Who has delivery capacity? Roses capacity? The freshest roses? The closest to proximity? The closest to the picture in the order? Who has the vase that's in the order? Did they buy roses from us? Because I like to be able to incentivize people based on margins they already have. 
And so without exaggeration, Kate, we have about 11-12 ranking factors that inform a quality score for a florist, and that's how the process starts. But then there's all the other things, like how do we know somebody didn't walk into that florist that morning and buy all the roses, right? And so there's this real-time ebb-and-flow of demand because our demand is not ours! They have their own store, they have their own B2B business, they might take orders from some of our competitors. They might have their own website. We have no idea what's happening with any given florist in real time because they are not captive to us. What we've learned is the place we have to get really really really really good is technology on the forecasting side, on the florist communication side, and the customer communication side. Because I can't control the seeds on the ground in Colombia, but I can really control the communication across the entire network as far as we go, as well as the amounts that we need in various places." Kate: Creating that small-scale, emotional human moment where someone receives flowers requires immense computing power and collaboration between multiple businesses and workers. Which is part of why Charlie Cole also believes that in some cases, the best way to help your business succeed is to invest in helping other businesses that yours interacts with. Charlie Cole: "Small businesses… I think it's our secret sauce. And I think COVID has shined a light on this: small businesses are the core of our communities. Right? They are the absolute core, and I think it was always nice to say that, but now we know it. And so here's what I think we do better than anybody else: we've invested more in helping our florists run their own small business independently of us than we have in optimizing our marketplace. We launched new POS software. 
We launched a new local website product where we're like the first person ever to become a reseller for Shopify because we made a custom platform for florists. We're just their website provider. They're actually competing with FTD.com in a lot of ways—but I think that's where we're gonna differentiate ourselves from all the other people that are perceived by small businesses as (their words, not mine) leeches. Right? I think to actually effectively run a marketplace which is fulfilled by small businesses, you need to invest as much in helping them win their local market independent of you." Kate: You could make the case that there is no more evolved human experience than choosing to help others. So if your business is engaged in activities that allow other businesses—and therefore humans—to thrive, you may also be building your brand in a direction that creates more customer loyalty than any exit survey or great service interaction ever could. Beyond understanding human emotions and needs, you can help your business by leaning into understanding how we create meaning. At our core, we are compelled to make meaning. Whether we realize it or not, meaningful experiences and interactions are the driving force behind many of our decisions, financial or otherwise. Meaning is different for everyone, but having it is vital to our happiness. If you are able to engage with potential customers in a way that helps them create meaning, or allows them to use your product to make meaning on their own, you are aligning your success with your customers' success, and that bodes well for the long term. At the end of the day, making any of these changes starts at the very top of your business. Leadership needs to set the tone, creating a culture that allows room for workers at every level to engage more meaningfully with customers, and with each other. 
(By the way, for more discussion on creating or changing work culture, you can check out our last episode, "Does the Future of Work Mean More Agency For Workers?") Your effort will benefit not only your business, but society as a whole. Remember the Deloitte piece in the Wall Street Journal I mentioned at the start of the episode, with ways to be more human-centric in business? Number 5 on that list was "change the world," and research from Frontiers suggests that the well-being of any society is directly linked to how the people living within it feel about their lives and purpose. How we do that may be as simple — and as complicated — as helping people to experience meaning at any level. While the technologies around us keep changing, the opportunity becomes increasingly clear for people who work on creating customer experiences and user experiences to open up the aperture and see humanity through a fuller lens. This way, as you set your business up for long-term success, you also advocate for making human experiences as meaningful as possible — and you just might be changing the world for the better. Thanks for joining me as I explored what it means to think of customers as human. Next time, I'll be exploring the supply chain and how, despite the vast technology involved, the closer you look the more you realize: the economy is people.
The world's richest man has pledged to buy Twitter for $44bn, saying he wants to “unlock” its potential by limiting moderation of tweets and publishing its algorithm. On the Sky News Daily podcast Jonathan Samuels is joined by Sky's technology correspondent Rowland Manthorpe and David Ryan Polgar, a tech ethicist and adviser to TikTok, about the challenges Mr Musk will face. Daily podcast team: Editor: Philly Beaumont Podcast producer: Rosie Gillott
On this week's episode, we're talking about how technology and social media impact our mental health, and have led to a mental health crisis that some have called "the next global pandemic." From the algorithms that decide what we see to the marketing tricks designed to keep us constantly engaged, we explore how our assumptions about work have led to a feedback loop that keeps us feeling worse about ourselves for longer. But never fear! At the Tech Humanist Show, we're about finding solutions and staying optimistic, and I spoke with some of the brightest minds who are working on these problems. Guests this week include Kaitlin Ugolik Phillips, John C. Havens, Rahaf Harfoush, Emma Bedor Hiland, and David Ryan Polgar. The Tech Humanist Show is a multimedia-format program exploring how data and technology shape the human experience. Hosted by Kate O'Neill. To watch full interviews with past and future guests, or for updates on what Kate O'Neill is doing next, subscribe to The Tech Humanist Show channel on YouTube. Full Transcript: Kate: Hello humans! Today we look at a global crisis that's affecting us all on a near-daily basis… No, not that one. I'm talking about the other crisis—the one getting a lot less media attention: the Global Mental Health Crisis. In December, Gallup published an article with the headline, "The Next Global Pandemic: Mental Health." A cursory Google search of the words "mental health crisis" pulls up dozens of articles published just within the past few days and weeks. Children and teenagers are being hospitalized for mental health crises at higher rates than ever. And as with most topics, there is a tech angle: we'll explore the role technology is playing in creating this crisis, and what we might be able to do about it. Let's start with social media. For a lot of us, social media is a place where we keep up with our friends and family, get our news, and keep people updated on what we're doing with our lives. 
Some of us have even curated feeds specifically with positivity and encouragement to help combat what we already know are the negative effects of being on social media too long. There's a downside to this, though, which I spoke about with Kaitlin Ugolik Phillips, the author of The Future of Feeling: Building Empathy in a Tech-Obsessed World. Kaitlin: I wrote about this a little bit in an article about mental health culture on places like Instagram and Pinterest where you have these pretty images that have nice sayings and sort of the commodification of things like anxiety and depression and it's cool to be not okay, but then you're comparing your 'not-okay'ness to other people's. Kate: We've even managed to turn 'being not okay' into a competition, which means we're taking our attempts to be healthy and poisoning them with feelings of inferiority and unworthiness, turning our solution back into the problem it was trying to solve. One of the other issues on social media is the tendency for all of us to engage in conversations–or perhaps 'arguments' is a better word–with strangers that linger with us, sometimes for a full day or days at a time. Kaitlin explains one way she was able to deal with those situations. Kaitlin: Being more in touch with what our boundaries actually are and what we're comfortable and capable of talking about and how… I think that's a good place to start for empathy for others. A lot of times, when I've found myself in these kind of quagmire conversations (which I don't do so much anymore but definitely have in the past), I realized that I was anxious about something, or I was being triggered by what this person is saying. That's about me. I mean, that's a pretty common thing in psychology and just in general—when someone is trolling you or being a bully, it's usually about them. If we get better at empathizing with ourselves, or just setting better boundaries, we're going to wade into these situations less. I mean, that's a big ask. 
For Millennials, and Gen Z, Gen X, and anyone trying to survive right now on the Internet. Kate: But social media doesn't make it easy. And the COVID pandemic only exacerbated the issues already prevalent within the platforms. Part of the problem is that social media wasn't designed to make us happy, it was designed to make money. John C. Havens, the Executive Director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, elaborates on this idea. John: Oftentimes, the value is framed in exponential growth, right? Not just profit. Exponential growth is an ideology that's not just about getting some profit or speed, it's about doing this. But when you maximize any one thing, other things by definition take less of a focus. And especially with humans, that can be things like mental health. This is not bad or evil, but it is a decision. And in this case it's a key performance indicator decision, the priority is to get something to market, versus, how can we get something to market focused on well-being? How can we make innovation about mental health? Kate: The upside is that our time indoors led some people to more quickly realize the issues with technology and its effects on us. Early in the pandemic, I spoke with Rahaf Harfoush — a Strategist, Digital Anthropologist, and Best-Selling Author who focuses on the intersections between emerging technology, innovation, and digital culture — about what she learned about our relationship to technology during that time. Rahaf: For me I think it just amplified a lot of the issues with the way we were using tech before. I noticed in my social networks and friend groups, people were home more, so what can we do but turn online, to this never-ending content and distraction and connections. And in the first couple weeks, everyone was about the Zoom everything, and then there was a Zoom burnout… for me, there's a couple big issues at play. 
The first is that we have more bandwidth because we're at home, so we're consuming more information. A lot of these platforms leverage this addictive constant-refresh, breaking-news cycle, and with something as complex and nuanced as COVID, a lot of us were glued to our screens refreshing refreshing refreshing… that was not the best thing I could have done for my mental well-being or anxiety. At one point I was like, "I need to step away!" because I was just addicted to the news instead of increasing knowledge. And the other thing is that for many people, the forced pause made us realize that we use productivity as a coping mechanism, and what does it mean that we have more time? A lot of people started trying to make their personal time as productive as their professional time—pushing themselves to pick up 10 new hobbies and learn 10 new languages and take 10 new classes! One or two of those things is great, but I really saw people loading up. That was a good indication to me of our lack of comfort with not doing anything. I noticed I was guilting myself for not writing and not learning and then I was like, you know what? we're undergoing this immensely traumatic, super-stressful thing… it's okay to not do anything, like that's fine. Kate: If you're anything like me, that's a lot easier said than done. Even if you've mostly resumed your life as normal, you're probably still in the habit of working all day, and then filling your free time with more work, hobbies, or time on social media. I asked Rahaf what someone trapped in this cycle could do about it. Rahaf: Your brain needs at least a week to just unwind from the stress of work. If you're just constantly on planes and in deliverables and client stuff… you're never going to take the time to imagine new opportunities for yourself. The trick is we have to balance periods of actually producing the thing with periods of intangible creativity. 
A lot of the thinking you can't see—in our culture, we don't like things that we can't see. But how many of us have gone for a walk and got that idea, or were daydreaming and got that idea? So creatives, we need that downtime. And by the way, downtime isn't taking a coffee break and being on social media. Downtime is really downtime. Daydreaming, just letting your brain go. Which is why we need a different framework, because for a writer or strategist, like you, you spend so much time thinking about things… but to think about things, you need the time to think about them! Kate: Most of us don't have the luxury to just shut off our Internet usage entirely. If you're someone, like most of us, who needs technology to get by, how do we find that balance? And why is it so difficult? Rahaf: I think it's because we've shamed ourselves into thinking if we're not doing stuff, it's a waste. And that's the problem, the problem is intentional recovery, prioritizing and choosing rest, that's really hard for us, because we constantly hear these stories of CEOs and celebrities, and Elon Musk sleeping on the floor of his factory, and Tim Cook waking up at 4:30 in the morning, and we think, I can't take a nap, I can't watch a movie, I can't go for a walk, because then I'm not really committed to being successful! And that's the most toxic belief system we've incorporated into our society today, especially for creatives. The breakthrough that I had was that it's not actually about systems or organizations, it's about us as people. We are our hardest taskmasters, we will push ourselves to the limit, even when other people tell us to take a break. 
If we're gonna move to a more humane productivity mindset, we have to have some uncomfortable conversations about the role of work in our lives, the link between our identity and our jobs and our self-worth, our need for validation with social media and professional recognition, our egos… all of these things battle it out, which is why I can't just come on here and be like, "okay guys, take a break here, do this…" we're not going to do it! We really have to talk about, 'growing up, what did your parents teach you about work ethic?' How is that related to how you see yourself? Who are the people that you admire? And then there are statements you can ask yourself, like "if you work hard, anything is possible!" All these things, you can start testing your relationship with work, and you start to see that we have built a relationship with work psychologically where we feel like if we don't work hard enough, we're not deserving. And not only do we have to work hard, we have to suffer! We have to pull all-nighters! Think of the words we use, 'hustle' and 'grind'… these horrible verbs! The reason that's important to dig into is that our views about our work become assumptions that we don't question. We don't ever stop and say, 'does this belief actually allow me to produce my best possible work, or is it just pushing me to a point where I'm exhausted and burnt out?' The second thing is, a lot of the stories we've been told about success aren't true. As a super-quick example, if there's an equation for success, most people think it's "hard work = success." But in reality, while hard work is important, it's not the only variable. Where you're born, your luck, your gender, your race… all of these things are little variables that add into the equation. 
So what I don't like about "hard work = success" is that the flip side of that tells people, "if you're not successful, it's because you aren't working hard enough." And part of the awakening is understanding that there are other factors at play here, and we're all working pretty hard! We don't need more things telling us that we're not enough and we're not worthy. Rahaf: When I had my own burnout, I knew better but didn't do better. That was really frustrating to me, it's like, I have the knowledge, why could I not put the knowledge to practice? And then I realized, all these belief systems and stories are embedded in every IG meme and every algorithm that asks you to refresh every 10 seconds, and every notification that interrupts your time, and the design of these tools to socially shame people for not responding fast enough. With WhatsApp for example, the blue checkmark that lets you know if someone has seen your message. What is that if not social pressure to respond? We've also shaped technology to amplify the social norms that if you're 'left on read,' that's a breach of etiquette. Kate: We, as a culture, believe things about success that aren't true. Then, we program those beliefs into our technology, and that technology ramps up and exacerbates the speed at which we're exposed to those flawed ideas. It creates a downward spiral for the user — or, the person using these platforms — to believe these untrue ideas more deeply, broadening the disconnect between our ideal selves and reality. And yet, despite these outside forces at play, there is an urge to place responsibility on the user, to say that each of us is solely responsible for our own mental health. Emma Bedor Hiland — the author of Therapy Tech: The Digital Transformation of Mental Healthcare — calls this "responsibilization." Emma: I draw from the work of Michel Foucault, who writes about neo-liberalism too. 
So the way I use it in the book is to say that there is an emphasis when we talk about neo-liberalism upon taking responsibility for yourself, anything that could be presumably in your control. And in this day and age, we're seeing mental health, one's own mental health, being framed as something we can take responsibility for. So in tandem with this rollback of what would ideally be large-scale support mechanisms, local mental health facilities to help people in need, we're seeing an increasing emphasis upon these ideas like 'use the technology that you can get for free or low cost to help yourselves.' But at the same time, those technologies' imagined user simply doesn't reflect the people who, we know, need interventions most badly in this country. Kate: Thankfully, we live in a world where once a problem has been identified, some enterprising people set out to design a potential solution. Some of those solutions have been built into our technology, with 'screen time tracking' designed for us to think twice about whether we should spend more time on our phones, and Netflix's "are you still watching?" feature that adds a little friction into the process of consuming content. When it comes to mental health specifically, there is a growing telemental healthcare industry, including online services such as BetterHelp, Cerebral, or Calmerry. These, however, may not be the solutions we want them to be. Emma: "A lot of my research, it's so interesting looking back at it now, my interviews with people who provide telemental health were conducted prior to the pandemic. It was really challenging at that time to find people who were advocates and supporters of screen-based mental health services, they told me that their peers sort of derided them for that because of this assumption that when care is screen-based, it is diluted in fundamental ways that impact the therapeutic experience. 
Which is understandable, because communication is not just about words or tone or what we can see on a screen, there's so much more to it. But when interactions are confined to a screen, you do lose communicative information. One of the things I've grappled with is I don't want it to seem like I don't think telemental health is an important asset. One of my critiques is that a lot of the times in our discussions, we assume people have access to the requisite technologies and access to infrastructure that makes telemental healthcare possible in the first place. Like having smart devices, even just smartphones, if not a laptop or home computer station, as well as reliable access to an internet connection, in a place where they could interface with a mental healthcare provider. So a lot of the discourse is not about thinking about those people whatsoever, who due to the digital divide or technology gap, even using technology couldn't interface with a healthcare provider. Some of my other concerns are related to the ways our increased emphasis and desire to have people providing screen-based care are actually transforming people who provide that care, like psychiatrists, psychologists, etc., into members of the digital gig economy, who have to divide up their time in increasingly burdensome ways, and work in ways where their employment tends to be increasingly tenuous. Relatedly, I am also worried about platforms. I know people are becoming more familiar with the idea that these places exist that they can go to on their laptops or wherever, assuming they have that technology, and be connected to service providers, but as we've seen with Crisis Text Line, there are a lot of reasons to be concerned about those platforms which become hubs of collecting and aggregating and potentially sharing user data. 
So while I think telemental healthcare services are important, I'd like to see dedication of resources not just to technologically facilitated care, but using that care to direct people to in-person care as well. We know due to the COVID pandemic, we saw so many people offering services that were solely screen-based, and for good reason. A lot of clinics that provided healthcare for people without insurance or who are considered to be living in poverty relied upon in-person clinic services, and haven't been able to get them because those clinics shuttered during the pandemic. So I worry about the people who we don't talk about as much as I worry about the negative consequences and effects of mental healthcare's technologization. Kate: So while some people's access to mental healthcare has increased with technology, many of the people who need it most have even less access to help. On top of that, the business model of these platforms makes it so that healthcare professionals have to work harder for longer in order to make their living. On top of all this, as a means of sustaining the companies themselves, they sometimes turn to sharing user data, which is a major concern for myriad reasons, one of which is people who use that data to create predictive algorithms for mental health. Next, Emma elaborates on this concept. Emma: People have been trying this for a number of years; aggregating people's public social media posts and trying to make predictive algorithms to diagnose them with things like ADHD, depression, anxiety… I'm still unsure how I feel about trying to make predictive algorithms that make predictions in any way about when people are likely to harm themselves or others, simply because of how easy it is to use that type of software for things like predictive policing. 
I write in the book as well that people want to harness internet data and what people do on social media to try to stop people from violent behavior before it starts, so it's very much a slippery slope, and that's why I find data sharing in the realm of mental health so difficult to critique, because of course I want to help people, but I'm also concerned about privacy. Kate: For those saying, "but what about the free services? Things like Crisis Text Line or Trevor Project?" Emma: Crisis Text Line, when it comes to fruition in 2013 and it says, "we can meet people where they are by allowing them to communicate via text when they're experiencing crises"… I think that's a really laudable thing that was done, and that people thought it was an intervention that could save lives, and based on research from external and internal researchers, we know that is the case. But for people who might not be aware, Crisis Text Line doesn't put people in contact with professional mental healthcare workers; instead it's often people who have no background or training in mental healthcare services, who go through training and serve as volunteers to help people in dire moments of need and crisis. In Therapy Tech I also describe how I perceive that as a form of exploitative labor, because although in the past there were conversations about whether to provide financial compensation for volunteers, they ultimately decided that by emphasizing the altruistic benefits of volunteering, that sort of payment wasn't necessary. And then I compare that to Facebook's problematic compensation of its content moderators, and the fact that those moderators filed a lawsuit against Facebook—although it hasn't been disclosed what the settlement was, at least there's some acknowledgement that they experienced harm as a result of their work, even if it wasn't volunteering. 
So I do take some issue with Crisis Text Line and then, in relation to neo-liberalism and responsibilization, again I feel that CTL is not the ultimate solution to the mental healthcare crisis in this country, or internationally, and CTL has created international partners and affiliates. I underwent training for a separate entity called Seven Cups of Tea which is both a smartphone app as well as an internet-accessible platform on a computer. And Seven Cups of Tea's training, compared to what I know CTL volunteers have to go through, is incredibly short and I would characterize as unhelpful and inadequate. For me it took 10 minutes, and I can't imagine it would take anyone more than a half hour. So the types of things I learned were how to reflect user statements back to them, how to listen empathetically but also not provide any advice or tell them what to do, because you never know who's on the other end! At the time I conducted the research, I started to volunteer on the platform. A lot of the messages I got were not from people who were experiencing mental distress necessarily, but from people who just wanted to chat or abuse the platform. But even though I only had a few experiences with people who I felt were genuinely experiencing mental distress, I still found those experiences to be really difficult for me. That could be just because of who I am as a person, but one of the things I've realized or feel and believe, is that my volunteering on the platform was part of a larger-scale initiative of 7CoT to try to differentiate between who would pay for services after I suggested to them because of my perception of them experiencing mental distress, and those whose needs could be fulfilled by just being mean to me, or having their emotions reflected back to them through superficial messaging. 
I very rarely felt that I was able to help people in need, and therefore I feel worse about myself for not being able to help, as though it's somehow my fault, related to this idea of individual responsibilization. Me with my no knowledge, or maybe slightly more than some other volunteers, feeling like I couldn't help them. As though I'm supposed to be able to help them. I worry about the fatalistic, deterministic types of rhetoric that make it seem like technology is the only way to intervene, because I truly believe that technology has a role to play, but is not the only way. Kate: Technology isn't going anywhere anytime soon. So if the products and services we've built to help us aren't quite as amazing as they purport themselves to be, is there a role for tech interventions in mental health scenarios? Emma explains one possible use-case. Emma: I think technology can help in cases where there are immediate dangers. Like if you see someone upload a status or content which says there is imminent intent to self-harm or harm another person. I think there is a warrant for intervention in that case. But we also know that there are problems associated with the fact that those cries for help (or whatever you want to call them) are technologically mediated and they happen on platforms, because everything that happens via a technology generates information / data, and then we have no control, depending on the platform being used, over what happens with that data. So I'd like to see platforms that are made for mental health purposes or interventions be held accountable in that they need to be closed circuits. It needs to be that they all pledge not to engage in data sharing, not engage in monetization of user data even if it's not for-profit, and they need to have very clear terms of service that make it very evident and easily comprehensible to the average person who doesn't want to read 50 pages before agreeing, that they won't share data or information. 
Kate: Now, I do like to close my show with optimism. So first, let's go to Rahaf once again with one potential solution to the current tech issues plaguing our minds. Rahaf: To me one of the most important things that we need to tackle—and I don't know why we can't just do this immediately—we need to have the capacity on any platform that we use to turn off the algorithm. Having an algorithm choose what we see is one of the biggest threats, because think about all the information that you consume in a day, and think about how much of that was selected for you by an algorithm. We need to have an ability to go outside of the power that this little piece of code has and select our own information, or hold companies accountable to produce information that is much more balanced. Kate: And that sounds like a great solution. But how do we do that? We don't control our technology, the parent companies do. It's easy to feel hopeless… unless you're my friend David Ryan Polgar, a tech ethicist and founder of All Tech Is Human, who's here to remind us that we aren't bystanders in this. I asked him what the most important question we should be asking ourselves is at this moment, and he had this to say. David: What do we want from our technology? This is not happening to us, this is us. We are part of the process. We are not just magically watching something take place, and I think we oftentimes forget that. The best and brightest of our generation should not be focused on getting us to click on an ad, it should be focused on improving humanity. We have major societal issues, but we also have the talent and expertise to solve some of these problems. And another area that I think should be focused on a little more, we are really missing out on human touch. Frankly, I feel impacted by it. We need to hug each other. We need to shake hands as Americans. I know some people would disagree with that, but we need warmth. We need presence of somebody. 
If there was a way that if we ended this conversation and like, we had some type of haptic feedback, where you could like, pat me on the shoulder or something like that… everybody right now is an avatar. So I need to have something to say like, “Kate! You and I are friends, we know each other! So I want a greater connection with you than with any other video that I could watch online. You are more important than that other video.” But right now it's still very two dimensional, and I'm not feeling anything from you. And I think there's going to have to be a lot more focus on, how can I feel this conversation a little more. Because I mean listen, people are sick and tired right now, ‘not another Zoom call!' But if there was some kind of feeling behind it, then you could say, “I feel nourished!” whereas now, you can sometimes feel exhausted. We're not trying to replace humanity, what we're always trying to do is, no matter where you stand on an issue, at the end of the day, we're actually pretty basic. We want more friends, we want more love… there are actual base emotions and I think COVID has really set that in motion, to say, hey, we can disagree on a lot in life, but what we're trying to do is get more value. Be happier as humans, and be more fulfilled. Be more educated and stimulated. And technology has a major role in that, and now, it's about saying how can it be more focused on that, rather than something that is more extractive in nature? Kate: Whether we like it or not, the Internet and digital technology play a major role in our collective mental health, and most of the controls are outside of our hands. That can feel heavy, or make you want to throw in the towel. Those feelings are valid, but they aren't the end of the story. I asked David for something actionable, and this is what he had to say. David: Get more involved in the process. 
Part of the problem is we don't feel like we can, but we're going to have to demand that we are, and I think frankly some of this is going to come down to political involvement, to say ‘we want these conversations to be happening. We don't want something adopted and deployed before we've had a chance to ask what we actually desire.' So the biggest part is that everyone needs to add their voice, because these are political issues, and right now people think, ‘well, I'm not a techie!' Guess what? If you're carrying around a smartphone… Kate: All the more reason we need you! David: Right! We need everybody. Technology is much larger. Technology is society. These are actually social issues, and I think once we start applying that, then we start saying, ‘yeah, I can get involved.' And that's one of the things we need to do as a society: get plugged in and be part of the process. KO: There are a lot of factors that contribute to our overall sense of happiness as humans. And although it may sound like a cliche, some of those factors are the technologies that we use to make our lives easier and the algorithms that govern the apps we thought we were using to stay connected. But that doesn't mean things are hopeless. If we keep talking about what matters to us, and make an effort to bring back meaningful human interaction, we can influence the people building our technology so that it works for our mental health, instead of against it.
David Ryan Polgar is the Founder of All Tech is Human. He is a leading tech ethicist, an advocate for human-centric technology, and advisor on improving social media and crafting a better digital future. In this timely discussion, David traces his not-so-unlikely path from practicing law to being a standard bearer for the responsible technology movement. He artfully illustrates the many ways technology is altering the human experience and makes the case for “no application without representation”. Arguing that many of AI's misguided foibles stem from a lack of imagination, David shows how all paths to responsible AI start with diversity. Kimberly and David debunk the myth of the ethical superhero but agree there may be a need for ethical unicorns. David expounds on the need for expansive education, why non-traditional career paths will become traditional and the benefits of thinking differently. Acknowledging the complex, nuanced problems ahead, David advocates for space to air constructive, critical, and, yes, contrarian points of view. While disavowing 80s sitcoms, David celebrates youth intuition, bemoans the blame game, prioritizes progress over problem statements, and leans into our inevitable mistakes. Finally, David invokes a future in which responsible tech is so in vogue it becomes altogether unremarkable. A transcript of this episode can be found here. Our next episode features Vincent de Montalivet, leader of Capgemini's global AI Sustainability program. Vincent will help us explore the yin and yang of AI's relationship with the environment. Subscribe now to Pondering AI so you don't miss it.
Dr. Valérie Morignat PhD is the CEO of Intelligent Story and a leading advisor on the creative economy. She is a true polymath working at the intersection of art, culture, and technology. In this perceptive discussion, Valérie illustrates how cultural legacies inform technology and innovation today. Tracing a path from storytelling in caves to modern Sci-Fi, she proves that everything new takes (a lot of) time. Far from theoretical, Valérie shows how this philosophical understanding helps business innovators navigate the current AI landscape. Discussing the evolution of VR/AR, Valérie highlights the existential quandary created by our increasingly fragmented digital identities. Kimberly and Valérie discuss the pillars of responsible innovation and the amplification challenges AI creates. Valérie shares the power of AI to teach us about ourselves and increase human learning, creativity, and autonomy. Assuming, of course, we don't encode ancient, spurious classification schemes or aggravate negative behaviors. She also describes our quest for authenticity and flipping the script to search for the real in the virtual. Finally, Valérie sketches a roadmap for success including executive education and incremental adoption to create trust and change our embedded mental models. A transcript of this episode can be found here. Our next episode features David Ryan Polgar, founder of All Tech is Human. David is a leading tech ethicist and responsible technology advocate who is well-known for his work on improving social media. Subscribe now so you don't miss it.
David Ryan Polgar is the founder of All Tech is Human. He recently created a report called Improving Social Media. David Ryan Polgar is a pioneering tech ethicist, Responsible Tech advocate, and expert on ways to improve social media and our information ecosystem. David is the founder of All Tech Is Human, an organization committed to building the Responsible Tech pipeline by making it more diverse, multidisciplinary, and aligned with the public interest. As the leader of All Tech Is Human, he has spearheaded the development of three recent reports: Guide to Responsible Tech: How to Get Involved & Build a Better Tech Future, The Business Case for AI Ethics: Moving From Theory to Action, and Improving Social Media: The People, Organizations and Ideas for a Better Tech Future. In March 2020, David became a member of TikTok's Content Advisory Council, providing expertise around the delicate and difficult challenges facing social media platforms to expand expression while limiting harm. The main throughline of David's work is that we need a collaborative, multi-stakeholder, and multidisciplinary approach in order to build a tech future that is aligned with the public interest. Tech Ethicist: creating rules around how we can have a better society. We set up the conditions for people to discriminate. Strong democracy is contingent on shared truth. Attorneys think of worst-case scenarios, but tech founders look at best-case scenarios. This isn't about tech. We didn't foresee this… It's a surprise to people who are pushing an agenda, but not a surprise to people who have been planning and researching it. Power structures. Meme literacy should be taught more in school. Section 230 - Social media is an amalgamation of several different types of companies. Gordian Knot. Power goes back to the Cyberspace Manifesto. Teens are on a platform that defines teens as more than just users.
IBM Watson - IBM Thinkleaders Thanks to our mission partner: Buoyancy Digital is proud to be the inaugural Mission Partner for the Cybertraps Podcast series. A digital advertising consultancy with an ethos, Buoyancy was founded by Scott Rabinowitz, who has been in digital media since 1997 and has overseen $300 million in youth safety compliant ad buys across all digital platforms. For IAB, Google and Bing accredited brand and audience safe advertising sales solutions, media buying and organizational training for media publishers, let's chat.
In the first half, we talk with Lucy Suchman, a professor and expert on human computer interaction & warfare about a recent report commissioned by the United States Government about artificial intelligence and national security. We look at some of the unexamined premises for the report. In the second half, we listen to a panel discussion hosted by Betalab on how to build a better social media future featuring David Ryan Polgar, a Responsible Tech advocate and founder of All Tech Is Human; Nicole Chi, a civic technologist and product manager who has worked at the intersection of product, policy, and public interest; Rana Sarkar, who was appointed by Prime Minister Justin Trudeau as Consul General of Canada in San Francisco & Silicon Valley in 2017; and moderator Yaël Eisenstat, Researcher-in-Residence at Betalab and now a Future of Democracy Fellow at Berggruen Institute.
This conversation explores the question: How can we reduce misinformation and disinformation on social media platforms while also ensuring that platforms promote the free exchange of ideas? Guests in this episode include Dr. Jasmine McNealy (Associate Professor of Telecommunication at the University of Florida, Harvard Berkman Klein Center affiliate, media & law expert) and Dr. Claire Wardle (co-founder and director of First Draft, leading expert on user generated content, verification and misinformation). This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
All Tech is Human founder, TikTok Content Advisory Council member, speaker, and writer David Ryan Polgar explores how he's connecting the dots between individuals, industries, and universities to build the responsible tech pipeline. Polgar explains how we can create a diverse, inclusive, and healthy tech ecosystem to tackle the thorniest issues facing us today. From Section 230 to the storming of the U.S. Capitol, nothing is outside of digital tech, and it's up to humans to create a responsible tech future. Follow David Ryan Polgar: Twitter: @TechEthicist | LinkedIn: David Ryan Polgar. Follow Digital Void: Twitter: https://twitter.com/digivoidmedia | YouTube: https://www.youtube.com/channel/UCKWoac3SIfsUg6Xl0X1GS8A | Facebook: https://www.facebook.com/digivoidmedia | Instagram: https://www.instagram.com/digitalvoid.media/ | LinkedIn: https://www.linkedin.com/company/34894594. Use #DigitalVoid #DigitalVoidPodcast to join the conversation. Credits: Hosted by Josh Chapdelaine and Dr. Jamie Cohen. Audio edited and mixed by Josh Chapdelaine. Digital Void Podcast is a production of Digital Void Media. Contact Digital Void: Email: digivoidmedia@gmail.com Hosted on Acast. See acast.com/privacy for more information.
This conversation explores the topic Improving Social Media: Content Moderation & Democracy with invited panelists Sarah T. Roberts and Murtaza Shaikh Sarah T. Roberts is the co-founder and Co-Director of the UCLA Center for Critical Internet Inquiry, and the author of Behind the Screen: Content Moderation in the Shadows of Social Media. Murtaza Shaikh is the Senior Advisor on Hate Speech, Social Media and Minorities to the UN Special Rapporteur on Minority Issues This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
Part 2 of 4 interviews from our end-of-year event, The Pain and Triumphs of 2020: The Biggest Lessons We Learned This Year, where we asked guest speakers (several of our favorite people who have been part of the Tech 2025 community for years!) what their most profound lesson of 2020 was and how it changed their outlook on the future. They gave us much food for thought! In this episode, David Ryan Polgar (Tech Ethicist, Founder of All Tech is Human) explains how this past year has shown him things about building communities (which he specializes in) that we will continue to grapple with as we work remotely more post-pandemic. David is a pioneering tech ethicist who paved the way for the hotly-debated issues around Facebook, privacy, ethical design, digital well-being, and what it means to be human in the digital age. He is the founder of All Tech Is Human, an initiative to better align tech with the human interests of users and society, and is the co-host of Funny as Tech–a live show and podcast that tackles the thorniest issues in tech. David serves on the advisory boards for Common Sense’s Digital Citizenship Advisory Council, the non-profit #ICANHELP, and Hack Mental Health. In 2015, he co-founded the global Digital Citizenship Summit. David is currently researching the impact that “scaling intimacy” has on human relationships, and whether we are becoming botified (and less authentic) in our communications. Read The Guide to Responsible Tech (published by All Tech is Human). Listen to Charlie's interview on David's podcast: "What is the future of work? LIVE show with Charlie Oliver, Galina Ozgur & Lisa Cervenka" . 
CONNECT WITH DAVID LinkedIn: http://bit.ly/2uLO0rZ Twitter: @TechEthicist REACH OUT TO THE SHOW: Website: https://tech2025.com/fast-forward-podcast/ Twitter: @fastforward2025 Instagram: @fastforward2025 Facebook: https://bit.ly/fastforwardfacebook Email: fastforward@tech2025.com Charlie on Twitter: @itscomplicatedCharlie on Instagram: @charlieoliverbk Charlie on LinkedIn: linkedin.com/in/charlieoliverny
What is the business case for AI Ethics? This conversation explores the topic with invited panelists William Griffin and Alayna Kennedy. Willam Griffin is the Chief Ethics Officer of Hypergiant, an organization that works with partners to create powerful technology solutions and smarter, more efficient human workforces. Alayna Kennedy is a data scientist at IBM, working on creating ethical algorithms and aligning human and machine values. This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
How will Artificial Intelligence define the future of Civil Rights? To celebrate the NYC theater release of the film Coded Bias we present this Livestreamed conversation featuring Shalini Kantayya (director, Coded Bias), Meredith Broussard (Author, Artificial Unintelligence), and Timnit Gebru (Co-Lead, Ethical Artificial Intelligence Team at Google) This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
The 2016 US election made it clear that social media companies play a profound role in how voters are informed and influenced. What role should social media companies be playing in the upcoming US election? In partnership with All Tech is Human we present this Livestreamed conversation featuring Dipayan Ghosh (co-director of the Digital Platforms & Democracy Project at the Harvard Kennedy School, author of Terms of Disservice, & former public policy advisor at Facebook) & Vera Zakem (Senior Policy and Technology Advisor, Institute for Security and Technology, CEO of Zakem Global Strategies, & former strategy and research at Twitter). This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
Author and tech humanist, Kate O’Neill, and tech ethicist, David Ryan Polgar, discuss how, as technology expands, we can end up suppressing our basic needs as humans to relate and interact with one another.
How can we inform and inspire the next generation of responsible technologists and changemakers? How do you get involved as someone new to the responsible AI field? In partnership with All Tech is Human we present this Livestreamed conversation featuring Rumman Chowdhury (Responsible AI Lead at Accenture) and Yoav Schlesinger (Principal, Ethical AI Practice at Salesforce). This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
How should diplomacy and international cooperation adjust to the significant global power that major tech companies wield? In partnership with All Tech is Human we present this Livestreamed conversation featuring Alexis Wichowski (adjunct associate professor in Columbia University's School of International and Public Affairs, teaching in the Technology, Media, and Communications specialization) and Rana Sarkar (Consul General of Canada for San Francisco and Silicon Valley, with accreditation for Northern California and Hawaii.) This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
About this episode's guest: David Ryan Polgar is a leading voice in the areas of tech ethics, digital citizenship, and what it means to be human in the digital age. David is a global speaker, a regular media commentator for national & international press, and a frequent advisor & consultant on building a better tech […]
The Tech Humanist Show explores how data and technology shape the human experience. It's recorded live each week in a live-streamed video program before it's made available in audio format. Hosted by Kate O’Neill. About this episode's guest: David Ryan Polgar is a leading voice in the areas of tech ethics, digital citizenship, and what it means to be human in the digital age. David is a global speaker, a regular media commentator for national & international press, and a frequent advisor & consultant on building a better tech future. He is the co-host/co-creator of Funny as Tech, a NYC-based podcast & occasional live show that deals with our messy relationship with technology, and is the founder of All Tech Is Human, an accelerator for tech consideration & hub for the Responsible Tech movement. David serves as a founding member of TikTok's Content Advisory Council, along with the Technology & Adolescent Mental Wellness (TAM program). He tweets as @techethicist. This episode streamed live on Thursday, August 6, 2020. Episode highlights: 1:20 David Ryan Polgar intro 3:21 weird coincidence?! 4:40 and a tornado?! 6:05 previous podcast discussion — will update here with a link when it goes live! 7:23 attorney and educator?! 10:44 "no application without representation" 11:56 the politics of technology 15:55 impact over intent 16:25 social media and free speech online 21:13 content moderation: humans and AI 24:32 the role of friction in tech 27:32 distinguishing between thought and action in law 28:24 "your unfiltered brain is not what should be out on the internet" 28:50 brain to text 30:59 "are we an algorithm" 37:14 "do we even want these systems" 46:05 "I wanted to put the agency back on us" 46:28 "the future is not written" 53:55 "everybody needs to add their voice" 54:54 How can people find you and follow your work? (alltechishuman.org, hello@alltechishuman.org; funnyastech.com; @techethicist; David Ryan Polgar on LinkedIn; techethicist.com; davidryanpolgar.com)
How can we be our authentic selves online? Should teens approach social media as if they are their own brand? David & Joe chat with tech lifestyle expert Stephanie Humphrey about digital footprints and the difficulty of living our lives online. Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering our life in so many profound ways. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo == Technology and Lifestyle Expert Stephanie Humphrey is a former engineer who merges her passion for lifestyle media with in-depth tech expertise to show everyday people how empowering, enriching and fun technology can be. Stephanie is driven by the sole purpose of connecting people, particularly those underrepresented in technology, with the tech know-how to transform their worlds. Stephanie is currently a technology contributor for ABC News where she works as part of the nationally syndicated Strahan Sara & Keke team. Before that, she spent two years as the technology contributor to daytime talk show The Harry Show, hosted by Harry Connick, Jr. She’s also been a guest expert on the daytime morning show Home & Family on The Hallmark Channel, and Sister Circle Live on TV One. Stephanie was a frequent guest on NewsOne with Roland Martin and has also contributed her tech-life expertise to other national media outlets including Al-Jazeera America, HuffPostLive, TheGrio.com, and BlackEnterprise.com. She is a regular on-air tech contributor to Fox 29’s Good Day Philadelphia (WTXF), where she has delivered tech news since 2012. Stephanie spent three years as the technology writer for EBONY.com, a contributor to EBONY magazine, and was the originator of the popular Tech2Go column on TheRoot.com. She also spent two years as the spokesperson for HP, Inc. on the QVC shopping channel.
You can currently catch her monthly radio segment on The Karen Hunter Show on Sirius XM and hear regular tech segments on WDAS-FM in Philadelphia as well. And Stephanie uses social media to help thousands of people understand tech basics with her weekly 60-Second Tech Break on Instagram & Twitter (@TechLifeSteph). Clips of some of Stephanie’s work can be viewed on her YouTube channel. Helping students is a passion that drives Stephanie, and she has channeled this motivation into a seminar called ‘Til Death Do You Tweet. The seminar, tailored to either students, parents, or professionals, helps them understand the potential negative consequences of online behavior - especially through social media - and gives helpful advice on how people can maintain a positive reputation in cyberspace. Her book “Don’t Let Your Digital Footprint Kick You in the Butt!” expands on these concepts and is due to launch in August 2020.
How can we reduce data discrimination & algorithmic bias that perpetuate gender and racial inequalities? In partnership with All Tech is Human we present this Livestreamed conversation featuring Safiya Noble (Associate Professor at the University of California, Los Angeles (UCLA) in the Department of Information Studies and author of Algorithms of Oppression: How Search Engines Reinforce Racism) and Meredith Broussard (Associate Professor at the Arthur L. Carter Journalism Institute of New York University and the author of Artificial Unintelligence: How Computers Misunderstand the World). This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
All Tech Is Human's David Ryan Polgar moderates this discussion with participants Mutale Nkonde and Charlton McIlwain. The name All Tech Is Human goes to the exact point of the conversation: any technology is created by humans and built from human data, which means that historical bias and inequalities can replicate themselves. Always good to remember that tech is not magical... all tech is human. Racism In AI is from "Building Anti-Racist Technology & Culture" by All Tech Is Human. Tech This Out News. 2020. All Rights Reserved
Who exactly is David Ryan Polgar, the co-host of Funny as Tech? His fellow co-host, comedian Joe Leonardo, puts David on the hot seat to explain his career, where he sees the tech ethics conversation going, and the meaning behind his org All Tech Is Human. This episode was recorded at Civic Hall before Covid-19. Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering our life in so many profound ways. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
In our inaugural episode of What's Betwixt Us, we are joined by tech ethicist, David Ryan Polgar. David Ryan Polgar is the founder of All Tech Is Human, which connects the organizations and people of the Responsible Tech movement. He is also an international speaker, commentator and passionate leader on how technology is affecting our lifestyles as humans. We discuss the difference between our physical selves and our online avatars and what this all means in terms of discovering empathy. You can learn more about David by visiting his website at https://www.davidpolgar.com/ or following him on Twitter @techethicist. What's Betwixt Us is powered by zanie, designed to build trust and authentic human connection in remote workspaces. More at zanie.app.
How can we ensure that our technological systems do not reproduce existing inequalities? In partnership with All Tech is Human we present this Livestreamed conversation featuring Mutale Nkonde (CEO of AI for the People & fellow at the Digital Society Lab at Stanford) & Charlton McIlwain (author of Black Software: The Internet & Racial Justice, From the AfroNet to Black Lives Matter, as well as Vice Provost for Faculty Engagement and Development at NYU). This conversation is moderated by All Tech Is Human's David Ryan Polgar. The organizational partner for the event is TheBridge. The conversation does not stop here! For each of the episodes in our series with All Tech is Human, you can find a detailed “continue the conversation” page on our website radicalai.org. For each episode we will include all of the action items we just debriefed as well as annotated resources that were mentioned by the guest speakers during the livestream, ways to get involved, relevant podcast episodes, books, and other publications.
Is it time that we left Facebook & Twitter? Can a person succeed without a social media presence? And is social media a communications platform with advertising, or an advertising platform with communication? Joe & David discuss and debate the virtues of leaving social media, and also who has more control over major platforms: the users or the advertisers. Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how social media and emerging technology are altering what it means to be human. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
What is Section 230 and why is it so important to how social media companies operate? David & Joe discuss and debate, with David going over the nuances of social media that make it part public square, part media company, and part utility company. Given the current battle over how social media companies moderate content, it is a much needed conversation. Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering our life in so many profound ways. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
David & Joe talk to Amanda Brennan, the trend expert and meme librarian at Tumblr, about the importance of memes, where the internet is headed, and what we can learn from teens online. Amanda Brennan is Tumblr’s internet librarian who currently leads their content and social teams. After graduating with her MLIS from Rutgers University, she began her career at Know Your Meme, researching the history of internet phenomena and niche subcultures. She has been at Tumblr since 2013, where she spearheaded The Fandometrics, Tumblr’s weekly ranking of entertainment fandoms on the platform, which Paste Magazine asserted should be "given a permanent place in pop culture’s critical metrics tool belt." She has spoken about internet history at conferences across the US, with topics ranging from Slender Man to cat videos, the latter of which she discussed in the BBC Four documentary How To Go Viral. She lives in New Jersey with her spouse and their three cats. Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering our life in so many profound ways. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
Should Elon Musk get fired from Tesla?? Hosts David & Joe discuss the outsized role that Elon Musk plays at Tesla and whether his erratic online behavior and cryptic posts are becoming too much to handle. Does Tesla=Elon Musk, or can it have a life without him? Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering our lives in so many profound ways. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
Is a trend authentic or merely being propped up by business? David & Joe talk to trend expert Matt Klein about our life on social media, why certain technologies get adopted and others don't, and the importance of culture and psychology in determining where our tech use is headed. Matt Klein studies emergent cultural trends and the interplay of our technology and psychology, helping us make sense of now, next and the future. You can find him on Twitter at @KleinKleinKlein Funny as Tech is a podcast about our messy relationship with technology. Hosted by tech ethicist David Ryan Polgar and comedian Joe Leonardo, the show looks at how emerging technology is altering the human experience. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo MATT KLEIN Matt Klein is a Director of Strategy at sparks & honey, a cultural consultancy, and a writer at Forbes, analyzing memes and media theory. With experience working alongside organizations including Google, MetLife, Columbia, American Airlines, AB-InBev and Facebook, as well as non-profits and government agencies, he has become a trusted source in identifying cultural change and developing future-proofing business strategies. Distinguished as top talent across Omnicom's 1,500 marketing agencies, Matt's expertise spans verticals and capabilities including marketing strategy, business transformation, trend forecasting, and UX. His observations have been featured in The New York Times, WSJ, The Atlantic, TechCrunch, CNBC, Adweek and Forbes, and his thoughts have been broadcast by the likes of Richard Branson and Arianna Huffington. https://kleinkleinklein.com/
Will there be love during lockdown? Are we headed for a baby boom or a rush of divorces? How do couples adjust? On this episode of Funny as Tech, hosts David & Joe talk with renowned sex therapist Maureen McGrath about technology, Covid-19, and maintaining intimacy during this highly stressful time. Her passion for education takes her to the airwaves every Sunday night as host of the Sunday Night Health Show, a live listener call-in radio program on the Corus Radio Network. It is also a podcast and can be heard on iTunes, Google Play and Spotify. MAUREEN MCGRATH Maureen McGrath specializes in women’s intimate health. A sought-after speaker, her TEDxStanleyPark talk, “No Sex Marriage – Masturbation, Loneliness, Cheating and Shame,” went viral with more than 20 million views. https://maureenmcgrath.com/ === Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to discuss how emerging tech is upending the human experience in this weekly podcast with a diverse range of experts. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo
Have Covid & dating apps killed romance, or have we been emotionally distancing for years? David & Joe, both married, have a freewheeling conversation with comedian & actress Carolyn Paine about her experiences swiping during lockdown. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to discuss how emerging tech is upending the human experience in this weekly podcast with a diverse range of experts. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo Carolyn Paine is an actress, dancer, and comedian. She has appeared in various TV shows and national commercials as well as regional and Off-Broadway theatre productions. As a dancer and choreographer, she has traveled internationally as a dancer with pop stars, been featured in a photo series in Italian Vogue, and danced for ABC’s New Year’s Rockin’ Eve. Her choreography and dancing have also been featured on The Wonderful World of Dance and the international Dance Informa Magazine. Additionally, Carolyn is a comedian who frequently performs standup, has worked with Hannibal Buress, and has been featured in an HBO promo and in Glamour magazine. She is also a writer for various podcasts, sketches, and websites. Her comedy and comedy shorts have been honored at the Women in Comedy Festival, The International Comedy Festival in California, and the London International Comedy Short Film Festival. Carolyn is also a regular panelist on WNPR’s The Nose with Colin McEnroe.
Will Facebook Gaming be a big hit or a massive flop? Hosts David & Joe offer some predictions, side rants, and investment advice that you should probably not follow. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to discuss how emerging tech is upending the human experience in this weekly podcast with a diverse range of experts. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo @FunnyAsTech Send us a voicemail at: (646) 687-6309
David Ryan Polgar is the founder of All Tech Is Human and co-host of the podcast Funny as Tech. Produced by Matt Elzweig If you have a story to tell or would like to be featured on an upcoming episode like this one, email me at fallinguphill@fallinguphill.org Music Maccary Bay by Kevin MacLeod Link: https://incompetech.filmmusic.io/song/4010-maccary-bay License: http://creativecommons.org/licenses/by/4.0/
We talk to memetics expert Dr. Jamie Cohen about the importance of memes, why they are so big during Covid-19, and why everyone should be fluent in memes. We also learn about Edge Lords, and debate why virtual reality is not having its moment in the sun. Connect with Jamie at @NewandDigital and JamesNCohen.com Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to discuss how emerging tech is upending the human experience in this weekly podcast with a diverse range of experts. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo @FunnyAsTech (646) 687-6309
Forget screen time--we are all stuck inside exploring new online worlds, rediscovering a fondness toward video chats, and trying to balance a desire for information with a need for calmness. David & Joe chat with Australian psychologist & researcher Jocelyn Brewer, a well-known voice on psychology in the digital age and the leader of the digital nutrition movement. Connect with Jocelyn at: www.JocelynBrewer.com @JocelynBrewer Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to discuss how emerging tech is upending the human experience in this weekly podcast with a diverse range of experts. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com @TechEthicist @ImJoeLeonardo @FunnyAsTech (646) 687-6309
We live in a digital age full of astonishing connection possibilities. But are we connecting in a way that nourishes genuine meaning and maximizes our human existence? Hear my conversation with David Ryan Polgar, tech ethicist, and the Founder of All Tech is Human, as we explore how we can humanize technology for the growth of individuals and businesses, simultaneously. Topics include: Ethics and its importance in tech; Technology will not solve all your problems; Can we scale intimacy through tech?; Hyper-personalization and diversification of thought; How to start instilling laws and guidelines for AI. Here are the resources we mention during this episode: https://www.davidpolgar.com/ https://alltechishuman.org/ Enjoy!
Has Covid-19 killed the techlash? After the last few years of mounting criticism, the tech industry has been receiving kudos for their ability to tamp down Covid-19 and also provide digital tools with heightened importance. How does this alter our general relationship with Big Tech? David & Joe discuss and debate. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience in this weekly podcast. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com
How will Covid-19 impact our relationship with tech? Now more than ever we are relying on digital tools to feel connected with others. This moment has brought into focus both the tremendous ways that our tools are allowing us to cope, along with ways for improvement in the future. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience in this weekly podcast. FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com
The human side of data with data science circus performer Andrea Jones-Rooy! Did we mention Andrea is an NYU prof & also a stand-up comedian?! David & Joe dig into Andrea's eclectic background, talk about the need for humanities in data science, the difficulty of productivity stats, and so much more. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience. UPCOMING LIVE SHOW in NYC on March 10, 2020: thepit-nyc.com/events/funny-as-t…and-joe-leonardo/ TOPIC: Why Is There a Techlash?! Featuring special guests Yaël Eisenstat (Visiting Fellow at Cornell Tech in the Digital Life Initiative, former CIA, former Global Head of Elections Integrity Operations in Facebook’s business integrity org), Joe Toscano (author of Automating Humanity, founder of BEACON), & Stephanie Humphrey (Technology Contributor for ABC News). We’ll be kicking the evening off with stand-up by data science comedian Andrea Jones-Rooy! FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com == Andrea Jones-Rooy, Ph.D. is a social scientist specializing in complexity. She is the author of a book and several research articles on complex systems. She also contributes articles to media outlets on international relations, foreign affairs, and uncertainty. http://www.jonesrooy.com/
How can we prevent real world problems in virtual environments? What rights do our virtual bodies have? David & Joe chat with Kalila Shapiro, a Processing Foundation fellow who is currently writing a series on VR Ethics for All Tech Is Human, about the need to tackle thorny ethical issues facing virtual reality. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience. UPCOMING LIVE SHOW in NYC on March 10, 2020: https://thepit-nyc.com/events/funny-as-tech-with-david-ryan-polgar-and-joe-leonardo/ TOPIC: Why Is There a Techlash?! Featuring special guests Yaël Eisenstat (Visiting Fellow at Cornell Tech in the Digital Life Initiative, former CIA, former Global Head of Elections Integrity Operations in Facebook’s business integrity org), Joe Toscano (author of Automating Humanity, founder of BEACON), & Stephanie Humphrey (Technology Contributor for ABC News). We’ll be kicking the evening off with stand-up by data science comedian Andrea Jones-Rooy! FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com
Tech ethicist, David Ryan Polgar, joins us in episode 078 of the SHIPS Podcast! We discuss a lot of great topics in this episode including our struggle to express our relationship to technology, the concept of mental obesity, David's TEPP Process, botified communication, and many others. David is truly pioneering this movement towards a healthier relationship with technology and its impact on the world at large. To learn more about the amazing work that David is doing, please check out techethicist.com and alltechishuman.org. David Ryan Polgar is a pioneering tech ethicist who paved the way for the hotly-debated issues around Facebook, AI ethics, unintended consequences, digital wellbeing, and what it means to be human in the digital age. He has appeared on CBS This Morning, BBC World News, Fast Company, SiriusXM, Associated Press, the Washington Post's "Can He Do That?" podcast, and many others. An international speaker with rare insight into building a better future with technology, David has been on stage at Harvard Business School, Princeton University, The School of the New York Times, TechChill (Latvia), The Next Web (Netherlands), FutureNow (Slovakia), and the Future Health Summit (Ireland). David is the founder of All Tech Is Human, an organization that is shifting the process of how technology is developed and deployed to one that is more inclusive, multidisciplinary, and participatory. The organization unites a broad range of technologists, academics, artists, advocates, policymakers, and students to co-create a more thoughtful future towards technology. He is a frequent consultant, co-host of the podcast Funny as Tech, and an advisor for Hack Mental Health, the Technology and Adolescent Mental Wellness (TAM) program, and #ICANHELP--all committed to using tech for good. Please visit patmcandrew.com for more information on the host and producer of the SHIPS podcast. 
How can we fight misinformation & build trust online with news? David & Joe chat with Macaela Bennett from NewsGuard about our information ecosystem, building media literacy skills, and where we're headed. Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience. UPCOMING LIVE SHOW in NYC on March 10, 2020: https://thepit-nyc.com/events/funny-as-tech-with-david-ryan-polgar-and-joe-leonardo/ TOPIC: Why Is There a Techlash?! FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com
What happens to your sensitive mental health data when using a telehealth platform? Is it secure? What if the company is sold? Does anyone really read the terms of service? David & Joe chat with Dr. Lauri Goldkind about data protection, the quality of care provided by telehealth platforms, privacy, and responsibility. Fun! Get in touch with Funny as Tech! Our relationship with tech is messy...let's discuss. On Funny as Tech, a comedian (Joe Leonardo) teams up with a tech ethicist (David Ryan Polgar) to tackle how emerging tech is upending the human experience. UPCOMING LIVE SHOW in NYC on March 10, 2020: https://thepit-nyc.com/events/funny-as-tech-with-david-ryan-polgar-and-joe-leonardo/ FunnyAsTech.com DavidRyanPolgar.com JoeLeonardo.com Info@FunnyAsTech.com Dr. Goldkind is an associate professor at Fordham’s Graduate School of Social Service. She is also the editor of the Journal of Technology in Human Services. Dr. Goldkind’s current research has two strands: technology implementation in the human services and nonprofits and the social justice and ethics implications of data collection, use and dissemination in community based organizations. Wherever possible she combines both ICT and social justice for a blend of tech enhanced civic engagement and improved organizational functioning. She holds an M.S.W. from SUNY Stony Brook with a concentration in planning, administration, and research and a PhD from the Wurzweiler School of Social Work at Yeshiva University. Dr. Goldkind is also a past visiting research fellow at the UN University on Computing in Society, Macau, SAR, China.
Is theater the best medium to explore the impact of technology on the human condition? We chat with playwright, actor, and fellow podcaster (S.H.I.P.S.) Pat McAndrew on the power of the stage to explore all of the weird, wild, and confusing ways that social media has upended how we connect (or don't). On Funny as Tech, tech ethicist David Ryan Polgar and comedian Joe Leonardo bring on a wide range of experts to unpack our confusing present and uncertain future. Our relationship with tech is messy...let's discuss! FunnyAsTech.com TechEthicist.com JoeLeonardo.com @FunnyAsTech @TechEthicist @ImJoeLeonardo Pat McAndrew is an actor, writer, international speaker, podcast host, and musician based in New York City. He performs and develops evocative performances and shows, often about technology's impact on human relationships. Pat is particularly interested in using theatre, film, and media as a gateway to learning about the Internet and social media's influence on our decisions, habits, and lifestyles. Pat has taken the stage, not only as a performer, but as a speaker as well, sharing his take on how we can be more human in the age of technology. He has spoken at international conferences, on college campuses, and at community events on topics related to enhancing our relationships in the technological revolution. His methods are rooted in his extensive background as an actor, as he believes the skillsets learned through actor training are vital in helping to fight tech addiction, isolation, anxiety, and depression and in improving the social well-being of individuals and communities. Pat has performed in various locations throughout New York City, including Carnegie Hall and Theatre Row, as well as other locations throughout the mid-Atlantic region. Pat's one-man show, REEL, which deals with the impact that excessive technology use has on human relationships, was performed in the 2017 United Solo Theatre Festival, the largest solo theatre festival in the world.
His other one-man show, The Other Scottish Play, was performed in the 2018 United Solo Theatre Festival. Pat is a member of Village Playback Theatre, Endless River Arts, and Svaha Theatre Collective. In 2019, Pat launched the podcast, SHIPS: The Vessels for a Meaningful Life. In the show, he speaks with top experts in the fields of digital wellness, entertainment, and business to discover how to cultivate more genuine, meaningful relationships in an age where technology is king, anxiety and isolation are at an all-time high, and the only possible cure for our loneliness is through building relationships and community while collaborating and connecting with others. https://www.patmcandrew.com/about
In Episode 21, Jessica Ann talks with tech ethicist David Ryan Polgar. Polgar digs below the surface to examine our tech use from an ethical, legal, and emotional perspective. With a background as an attorney and an educator, along with experience working with social media companies, he is able to take a multidisciplinary approach to our evolving use of technology. In this interview, David and Jessica discuss: How technology impacts us from an ethical and emotional perspective. Why we lack appropriate terms for what we're even talking about. Everyone is struggling with this. Why our rapturous submission to digital technology has led to an atrophying of human capacities like empathy and self-reflection, and the time has come to reassert ourselves, behave like adults, and put technology in its place (h/t Sherry Turkle's Reclaiming Conversation). What does it mean to be human in how we communicate and in the world at large? We're often living through a filter, and that filter is coming through our screen. How we're hooked up to the Google brain. If we all have access to it, it's not important. It's more important to step back, and have creativity and wisdom with access to the information. It's more important to be creative today, but we're struggling with the mass consumption of information. Creativity and wisdom are seen as the ability to connect dots. Creative people take things we wouldn't think to combine and say "let's combine that." The problem is that we're struggling with something that never ends because of unlimited consumption. When information becomes available we tend to gobble it up. We need to say "Hey Facebook, make those cookies a little less delicious." Silicon Valley is selling us a product that we're gobbling up, but we do not know the nutritional content. We need to allow for more moments of digestion: Don't check your phone, close the browser.
This is easier said than done, but we need to think of an information diet the same way we think of a food diet. How do we not become a robot in a world that's trying to change us into robots? As a capitalist society, every revolution has a counter-revolution. How and why Silicon Valley is selling us a product that we're gobbling up. And why we need to make an informed decision about the content we consume. Do we humans have free will with the endless use of algorithms? How humanity is becoming "bot-ified" based on predictive analytics. How LinkedIn automates intimacy and how we have gamified relationships. Why social adoption is not solely focused on utility. You can follow David on Twitter at @techethicist or visit his website here.
David Ryan Polgar comes on to discuss technology and how it is affecting your sex life. Married couples on average are having sex 11 fewer times a year, and there is a direct link to your use of technology.