POPULARITY
“Very demure, very mindful” has become the latest vocabulary defining the internet's summer. TikTok creator Jools Lebron is working to trademark the use of her now-viral words. Lebron filed to trademark “very demure, very mindful” for various entertainment and advertising services, including the promotion of beauty products, with the U.S. Patent and Trademark Office. Social media's love for “very demure” content started in early August when Lebron took to TikTok to describe the hair and makeup she was wearing to work. Her delivery took off and she kept going, with “mindful” and “cutesy” flooding the internet as scores of fans, including big-name celebrities, shared their own playful takes to describe just about any detail of day-to-day life. Can you trademark a viral phrase? Yes. But in the U.S., there needs to be an attached commercial use. “It's not just coming up with a phrase [...] (or) using it on social media and making it go viral,” said Alexandra J. Roberts, a professor of law and media at Northeastern University, explaining that there must be a connection to the sale of concrete goods or services. She calls trademarks a “source indicator,” as they help consumers understand who is producing what they're buying now, but not necessarily who came up with a name in the first place. “And if (someone else) created a social media marketing service and called it ‘very demure, very mindful social media marketing,' that would confuse consumers because they're gonna think it's associated with (Jools Lebron),” said Casey Fiesler, an associate professor of information science at the University of Colorado Boulder. In today's ever-digitized world of online trends, creators are increasingly expressing concerns about getting credited for their work. And for something like trademark rights, experts stress it's a battle of both getting there first and having the resources to see it through. Beyond trademark-specific disputes, Fiesler added that creators seeing their work stolen and reposted on other platforms for monetization continues to be a “huge problem” today, but she hopes the tide is starting to turn. That includes Lebron, who has been widely credited for the “very demure” trend. This article was provided by The Associated Press.
Navigating the Ethical Horizons of AI with Prof. Casey Fiesler
Join us in a thought-provoking conversation with Professor Casey Fiesler from the University of Colorado Boulder as we delve into the intricate intersection of artificial intelligence, ethics, technology, and the world of education. We look into the role of AI in education and gain insights on staying informed in this rapidly evolving landscape. Professor Fiesler, now known for leveraging her massive social media presence to educate on AI ethics, shares valuable perspectives on responsible technology use. Tune in for an exploration of the ethical dimensions of our AI-driven future and how educators can frame their approach to such technology - enjoy! Be sure to check out our past episodes and our publication, Age of Awareness - we appreciate your support! Links to connect with Professor Fiesler: TikTok | Instagram | YouTube | Webpage (caseyfiesler.com) | Medium (blog)
Truth in Learning: in Search of Something! Anything!! Anybody?
Clark and Matt are joined by instructional design guru and all-around fantastic human Julie Dirksen. Julie has been in the business for over 15 years, creating highly interactive and, more importantly, highly effective eLearning experiences for clients all around the world. But Julie is more than that! She is one of those go-to people in the industry, an expert that many other experts turn to, and what you might call a research translator: she digs into core, practical issues in the work learning practitioners do, figures out what the research says, and then puts it into succinct, useful bites that are immediately applicable. Her first book, DESIGN FOR HOW PEOPLE LEARN, is one of those rare books in L&D that broke out and became a best seller beyond the industry. It is the go-to book for designers and trainers. Today, Clark and I get to talk with her about her latest book, TALK TO THE ELEPHANT: DESIGN LEARNING FOR BEHAVIOR CHANGE. We talk about systems thinking and how the system can affect the factors that influence how and why one behaves as one does. We explore individual factors such as motivation and incentives, as well as environmental factors. And more! As Clark says in the episode, TALK TO THE ELEPHANT is a wonderful complement, a companion, to DESIGN FOR HOW PEOPLE LEARN. We originally planned to talk with Julie for just 20 minutes. One hour later… we were still going and felt like we could go on forever. Julie also joins us at the end for Best and Worst. You can find Julie at https://usablelearning.com. Julie rattles off so many models and tools throughout the show that we recommend you simply buy the book to get more on each, as well as their respective references. For the links she directly references, here they are: The change ladder survey link is on the book page: https://usablelearning.com/elephant/ Julie's "best" was Casey Fiesler. Her video on Fair Use is here: https://www.youtube.com/watch?v=D2PuntvfN20 Casey has a shorter version that skips the wolf-themed erotica: https://www.youtube.com/watch?v=FuDEgnxkGDg The syllabus for her tech ethics course on TikTok is here: https://docs.google.com/document/d/1tWdqYqYBHARbZXFQX4cybe88S-0twqvUu1xLhYnLgU4/edit?usp=sharing
In this episode, Dr C speaks with Dr. Casey Fiesler, a Professor of Information Science at the University of Colorado Boulder, who specializes in technology ethics. They discuss the potential harm and biases of machine learning and the role of ethics in technology development. Dr. Fiesler highlights the racial bias inherent in training certain AI systems and the ethical challenges surrounding this bias. They also discuss the phenomenon of automated online test proctoring and how it disproportionately disadvantages students of color.
This week I spoke to Casey Fiesler, a tech ethicist, professor, and rather famous TikToker, about the parallel universe of copyright litigation, fair use arguments, revenge pornography, and so much more. Find out more about Fiesler's academic work here: Website | YouTube | TikTok
Sandra and Karlee wax nostalgic about their fanfic journeys over the decades (yes, sigh, decades). This deep dive episode came from Sandra's stumble on the YouTube video “The Life and Death of Fandom Platforms” by Casey Fiesler. Give it a watch and check out all the research and reference materials in its description. It's a fascinating exploration of the rise and fall of various fandom platforms. Let Sandra and Karlee know what your fandom journey has been like and which platforms you've been a part of. Speaking of fandom communities, if you are a lover of SPN FanFic - and come on, you've gotta be if you listen to these two ramble on about the topic - and you aren't a part of it already, give SPN FanFic Pond a try! ~~~ We're taking you for a spin in Baby's backseat. Dean's House Rules - Driver picks the music, shotgun shuts his cakehole, and the ones in the back enjoy the ride... idling in the Impala. ~~~~~ TL;DR - If you can't be bothered clicking on all the things in this description, just visit our website: idlingintheimpala.com We'd love to hear your thoughts. Send us an email (idlingintheimpala@gmail.com)! DM us on Twitter or leave us a voicemail! All the Socials and AO3 and Fiction links: https://linktr.ee/idlingintheimpalapodcast Our Discord #backseat Channel (if the invite link is outdated; please contact us for one to join the group) Interested in being a guest on the podcast? Give us some info about you here so we can connect. Feel inclined to leave us a tip for all this AWESOME content? Visit our Ko-fi page. Supporters will get access to our #behindthescenes channel, content self-explanatory. We've got podcast merch for our fellow idlers. Take a look! ~~~~~ Charities supporting people affected by the Russian invasion of Ukraine World Central Kitchen and GlobalGiving ~~~~~ For Those in the US: Educate and Empower Yourself, Find Ways to Take Action Support Basic Human Rights - American Civil Liberties Union (ACLU) Prioritize Your Mental Health - National Alliance on Mental Illness (NAMI) Thrive (Not Just Survive) After Abuse - Rape, Abuse & Incest National Network (RAINN) ~~~~~ LGBTQ+ Charities Switchboard LGBT UK The Trevor Project - USA and Global --- Send in a voice message: https://podcasters.spotify.com/pod/show/idlingintheimpala/message Support this podcast: https://podcasters.spotify.com/pod/show/idlingintheimpala/support
New year, new hype? As the world gets swept up in the fervor over ChatGPT of late 2022, Emily and Alex give a deep sigh and begin to unpack the wave of fresh enthusiasm over large language models and the "chat" format specifically. Plus, more fresh AI hell. This episode was recorded on January 20, 2023. Watch the video of this episode on PeerTube.
References:
Situating Search (Shah & Bender 2022); related op-ed: https://iai.tv/articles/all-knowing-machines-are-a-fantasy-auid-2334
Piantadosi's thread showing ChatGPT writing a program to classify white males as good scientists
Find Anna Lauren Hoffman's publications (though not yet the one we were referring to) here: https://www.annaeveryday.com/publications
Sarah T. Roberts, Behind the Screen
Karen Hao's AI Colonialism series
Milagros Miceli: https://www.weizenbaum-institut.de/en/spezialseiten/persons-details/p/milagros-miceli/
Julian Posada: https://posada.website/
"This Isn't Your Data, Friend": Black Twitter as a Case Study on Research Ethics for Public Data (Klassen & Fiesler 2022)
No Humans Here: Ethical Speculation on Public Data, Unintended Consequences, and the Limits of Institutional Review (Pater, Fiesler & Zimmer 2022)
Casey Fiesler's publications: https://caseyfiesler.com/publications/ and TikTok: https://www.tiktok.com/@professorcasey
Where are human subjects in Big Data research? The emerging ethics divide. (Metcalf & Crawford 2016)
You can check out future livestreams at https://twitch.tv/DAIR_Institute.
Follow us!
Emily Twitter: https://twitter.com/EmilyMBender Mastodon: https://dair-community.social/@EmilyMBender Bluesky: https://bsky.app/profile/emilymbender.bsky.social
Alex Twitter: https://twitter.com/@alexhanna Mastodon: https://dair-community.social/@alex Bluesky: https://bsky.app/profile/alexhanna.bsky.social
Music by Toby Menon. Artwork by Naomi Pleasure-Park. Production by Christie Taylor.
AI AND EDUCATION...GOOD OR BAD? I've got Casey Fiesler, a CU associate professor in the Department of Information Science who studies technology ethics and online communities, on today to talk about ChatGPT and how it can be used or abused in education. She joins me at 1.
In this episode, we unpack: is ChatGPT ethical? In what ways? We interview Dr. Emily M. Bender and Dr. Casey Fiesler about the limitations of ChatGPT – we cover ethical considerations, bias and discrimination, and the importance of algorithmic literacy in the face of chatbots. Emily M. Bender is a Professor of Linguistics and an Adjunct Professor in the School of Computer Science and the Information School at the University of Washington, where she has been on the faculty since 2003. Her research interests include multilingual grammar engineering, computational semantics, and the societal impacts of language technology. Emily was also recently nominated as a Fellow of the American Association for the Advancement of Science (AAAS). Casey Fiesler is an associate professor in Information Science at the University of Colorado Boulder. She researches and teaches in the areas of technology ethics, internet law and policy, and online communities. Also a public scholar, she is a frequent commentator and speaker on topics of technology ethics and policy, and her research has been covered everywhere from The New York Times to Teen Vogue. Full show notes for this episode can be found at Radicalai.org.
AI or not AI, that is the question — ChatGPT, DALL-E, and Generative Artificial Intelligence in the Human World
We talk with Dr. Casey Fiesler, Associate Professor of Information Science at the University of Colorado Boulder, about "generative AI," particularly ChatGPT and DALL-E, which have been the subject of recent news stories expressing both excitement and concern. We asked ChatGPT to write a description of what such an interview might be: Dr. Casey Fiesler joins Joel Parker on the …
Some people will get lost along the way. But Twitter is a collection of communities, and some may manage to migrate more successfully than others, explains Casey Fiesler, professor of Information Science, in this article read by Carlos Carujo.
Zoombombing, Cambridge Analytica, AI bias, misinformation, hate speech…when tech companies and researchers come under fire, people wonder: why are they not thinking about potential harms? Unintended consequences of technology are a significant social issue, and when we “move fast and break things” it's ethical considerations that often get pushed to the side. Like technical debt, the implied cost of future bug fixes when we rush to release technology, ethical debt is what we accumulate when we don't consider ethical and social implications during the design process. How can we help technologists speculate about the future? And how might we understand the real impacts of technological harms on everyone, and give everyone the knowledge and tools to be more critical of technology? Join the Robert Zicklin Center for Corporate Integrity on March 1, 2022 as we welcome Casey Fiesler for a moderated discussion with Professor Yafit Lev-Aretz, Director of the Robert Zicklin Center's Program on Tech Ethics.
Polygamer – A Podcast of Equality and Diversity in Gaming & Video Games
Brianna Dym is a Ph.D. student in the University of Colorado Boulder's information science department under Dr. Casey Fiesler. Brianna's field of study is fanworks: the original creations, derivations, and reimaginings of existing brands and media, from movies to video games. LGBTQIA+ authors engage in this creative component of fandom to produce original stories and […] The post Polygamer #115: Brianna Dym on fanworks first appeared on Polygamer - A Podcast of Equality & Diversity in Gaming & Video Games.
“Things still aren’t perfect but we’re moving in the right direction.” — Casey Fiesler
The idea of technology and social media influencing a presidential election once would’ve sounded ridiculous. Today, it’s a reality that raises big questions about the tech industry’s ethical responsibility regarding everything from flagging misinformation to reflecting our diverse society. Casey Fiesler, a researcher and assistant professor at the University of Colorado Boulder, is an expert on ethics in technology and the influence it has on our society. In this episode of Leading with Genuine Care, Casey explains the basics of tech ethics, why increasing inclusion in tech can create a better world, how rewriting a “bafflingly sexist” Barbie book on computer science helped Mattel improve how they teach girls about tech, and so much more.
In this episode, you’ll learn:
Why ethics in technology is important
What it means to be ethical in tech
About current tech ethics stories in the news
What responsibility tech companies should have
How Casey helped Mattel write better books about girls in computer science
Why disinformation in the media creates ethical tech questions
What Twitter’s “Civic Integrity” policy is
Why tech companies need to follow more metrics than profit
How the election brought up many ethical tech issues
Why diversity in tech design and leadership offers better solutions
What technical debt is
How technology can invade privacy
Why people must consider what they put on the internet
If Casey thinks we’re on the right track
And so much more!
More About Casey Fiesler
Casey Fiesler is an assistant professor and founding faculty in the Department of Information Science at the University of Colorado Boulder, where she researches and teaches in the areas of technology ethics, internet law and policy, and online communities. She is also a public scholar and a frequent commentator and speaker on topics of technology ethics and policy, as well as women in STEM (including consulting with Mattel on their computing-related Barbies). Originally from Atlanta, she holds a Ph.D. from Georgia Tech in Human-Centered Computing and a J.D. from Vanderbilt University Law School.
Connect with Casey Fiesler
Website: www.caseyfiesler.com
Twitter: https://twitter.com/cfiesler
YouTube: www.youtube.com/c/CaseyFieslerPhD
Get Rob’s Weekly Newsletter
Never miss an inspiring conversation about compassionate, positive leadership on the Leading with Genuine Care podcast, plus other great articles and insights. Click below and you’ll also get a download of his favorite mindful resources. https://www.donothingbook.com/resource-guide
Follow Rob Dube on Social Media
LinkedIn: www.linkedin.com/in/robdube
Facebook: www.facebook.com/rob.dube.1
Twitter: twitter.com/robddube
Rob Dube’s Website: www.donothingbook.com
Buy Rob’s book, donothing: The Most Rewarding Leadership Challenge You’ll Ever Take: amzn.to/2y9N1TK
Episode 17: Acne, pustules, cysts... Why do we watch gory videos on YouTube?
The original article: Annabelle Mooney, "Rituals about the skin: comments on pimple popping videos", Social Semiotics, 2020.
The tool used to collect the YouTube comments: https://ytcomments.klostermann.ca
---------
References cited in the article and drawn on, implicitly or explicitly, in the podcast:
**On disgust**: Mary Douglas, De la souillure, Paris, La Découverte, 2005 [1966]. Julian Hanich, "Dis/Liking Disgust: The Revulsion Experience at the Movies", New Review of Film and Television Studies, 7(3), p. 293–309, 2009.
**On skin**: Lisa Blackman, The Body: The Key Concepts, London, Berg, 2008. Marc Lafrance, "Skin Studies: Past, Present and Future", Body & Society, 24(1-2), p. 3–32, 2018. Patricia McCormack, "The Great Ephemeral Tattooed Skin", Body & Society, 12(2), p. 57–82, 2006. Samantha Murray, "Corporeal Knowledges and Deviant Bodies: Perceiving the Fat Body", Social Semiotics, 17(3), p. 361–373, 2007. Julia Skelly, "Skin and Scars: Probing the Visual Culture of Addiction", Body & Society, 24(1-2), p. 193–209, 2018.
**On ethics**: Casey Fiesler, Nicholas Proferes, "'Participant' Perceptions of Twitter Research Ethics", Social Media + Society, 4(1), 2018. Roxanne Leitão, "Technology-Facilitated Intimate Partner Abuse: A Qualitative Analysis of Data from Online Domestic Abuse Forums", Human–Computer Interaction, 2019. Michael Zimmer, "'But the Data is Already Public': On the Ethics of Research in Facebook", Ethics and Information Technology, 12(4), p. 313–325, 2010.
------
To go further:
A fine France Culture series on the "philosophy of gore": https://www.franceculture.fr/emissions/series/philosophie-du-gore
Bernard Andrieu, "L'osmose émersive: De la peau vivante à la peau vécue", Spirale, 2019, online: https://hal.archives-ouvertes.fr/hal-02526949
Maurice Merleau-Ponty, Phénoménologie de la perception, Paris, Gallimard, 1976.
Marie-Anne Paveau, "Littérature cutanée", January 21, 2012, online: https://penseedudiscours.hypotheses.org/7942
Gianfranco Marrone, Principes de la sémiotique du texte, Paris, Mimesis, 2016.
Interview with Rebecca Katz, a member of the OTW Legal Advocacy Committee, on their work supporting the legal status of fanworks.
Shownotes: Consulting Fans: Rebecca Katz, Finnagain; Producer/Editor: Finn
Fanlore Wiki
Fanhackers blog
OTW Legal Advocacy
Blog Post on Fanworks and Fair Use
Post on Tumblr TOS changes
Post on Proposed EU copyright law changes
Post on Fighting Fan Unfriendly EU copyright proposal
Post on Fan Copyright Knowledge Survey
Rebecca Tushnet’s blog
Casey Fiesler on Medium
Casey Fiesler’s Ted Talk
Heidi Tandy’s fandom copyright blog
Rebecca Katz’s website
Fiesler and Dym Slate article on Fandom and Tumblr
This segment was first released on January 6, 2019 in Episode 88: League of Furies.
Music Credit: Unless otherwise indicated, music is available for purchase through online retailers such as amazon.com and iTunes. OTW Legal Advocacy Interview – Todd Michaelsen: Sita’s String Theory from All Creative Work Is Derivative by Question Copyright
Production Credits: Segment Producer/Editor: Finnagain; Banner Art: Fox Estacado
Distribution funded by fans!
Contact:
Forum: http://www.three-patch.com/forums
Email: bored@three-patch.com
Website: https://www.three-patch.com
Facebook: https://www.facebook.com/threepatchpodcast
LJ: http://threepatch.livejournal.com
Skype: threepatch.podcast
Twitter: https://twitter.com/threepatch
Tumblr: http://threepatchpodcast.tumblr.com/
How to Cite (APA): By Three Patch Productions. (2019, January 6). Interview with OTW Legal. Three Patch Podcast Episode 88: League of Furies. Podcast segment retrieved from https://www.three-patch.com/casefiles//88-OTW.
Casey Fiesler is a social computing researcher who primarily studies governance in online communities, technology ethics, and fandom. She is a Senior Fellow in the Silicon Flatirons Institute for Law, Technology, and Entrepreneurship, an ATLAS fellow, and holds a courtesy appointment in Computer Science. Also a public scholar, she is a frequent commentator and speaker on topics of technology ethics and policy, as well as women in STEM (including consulting with Mattel on their computing-related Barbies). Her work is supported in part by a $3 million collaborative National Science Foundation grant focused on empirical studies of research ethics. Fiesler holds a PhD from Georgia Tech in Human-Centered Computing and a JD from Vanderbilt University Law School.
In Episode 91, “Casey Fiesler,” Flourish and Elizabeth welcome the information science professor back onto the podcast to discuss her research, especially her study on the way transformative fandom migrates across platforms. Topics covered include feminist HCI, if you can truly get a “representative” sample of fans for a survey, privacy concerns as fandom comes under more scrutiny from researchers, and what Tumblr could have done—and still could do—better regarding the Great Porn Crackdown of 2018.
Casey Fiesler talks about the ethics of scientists using people's Tweets in research without their permission.