The WashingTECH Policy Podcast is your resource for media and tech law and policy news. Each week, the WashingTECH Policy Podcast gives you the latest developments in media and tech law & policy, as well as an interview with an influencer in the media and technology sectors, whether they be policyma…
The discourse in the US is increasingly toxic and divisive. Disinformation, misinformation, and hate speech are rampant, and we have few options to change course. The US has been entrenched in a deteriorating political and information environment for a long time now. The Supreme Court's overturning of Roe v. Wade instantly removed a right that 165 million Americans expected to have indefinitely. The discourse has always been divisive, but now it's reaching a point where people are unable to listen to one another or understand each other's perspectives. This means there isn't any real discourse anymore, just fighting words that make people feel good about themselves but don't actually accomplish anything.

Bio

Ryan Merkley @ryanmerkley

Ryan Merkley is Managing Director at Aspen Digital, focused on emerging technology, internet policy, and the information ecosystem. He is an accomplished executive working at the intersection of public good and technology for organizations like Wikimedia, Creative Commons, and Mozilla.

Resources

Commission on Information Disorder Final Report
Supreme Court overturns Roe v. Wade, opening the door to surveillance

In a 6-3 decision Friday, the Supreme Court overturned Roe v. Wade. Justices Breyer, Sotomayor, and Kagan were the only dissenting justices. Writing for the majority, Justice Alito left it up to state legislatures to write their own abortion laws. As far as tech policy is concerned, many advocates, including WashingTech, are concerned that law enforcement will now be able to surveil location data in any of the 13 states in which abortion is now outlawed.

Congress inches closer to federal privacy law

The House Energy & Commerce Committee passed a bipartisan privacy framework on Thursday, with the measure now heading to the Senate. Reuters' Diane Bartz reports the bill would let you opt out of targeted ads online. It would also give users the ability to sue firms for selling their data to third parties. The bill would override the current patchwork of state privacy laws in states including California, Colorado, Connecticut, Utah, and Virginia. The Washington Post reported Thursday that Senator Maria Cantwell, a key vote, doesn't support the bill in its current form for precisely that reason – the Senator believes that, in many cases, the state privacy rules are stronger than the ones in the House bill.

Greitens may have incited violence on social media with 'RINO hunting' video

Eric Greitens, the former Republican governor of Missouri who is now running for the US Senate, posted a video appearing to encourage viewers to go 'RINO' hunting – RINO being an acronym for Republican in Name Only. "Join the MAGA crew," Greitens says in the video. "Get a RINO hunting permit." He says this while holding a shotgun surrounded by smoke, as a couple of boneheads dressed up as SWAT team officers bust through the door behind him. Facebook was the only company to remove the video outright. Twitter and YouTube left it up, although Twitter added a "public interest notice" to the tweet.
Talk therapy apps under scrutiny

If you've used a talk therapy app like BetterHelp or Talkspace since COVID started, chances are you've had at least a fleeting concern about how these companies use your data. Well, Senators Elizabeth Warren, Cory Booker, and Ron Wyden are concerned too, which is why they sent a letter to the firms asking them to explain their privacy practices. The Democratic lawmakers want to know how these firms collect data, what they do to protect it, and how they communicate their data protection practices to their users.

Meta reportedly to shut down misinformation-tracking tool

The Verge reports that Meta is planning to shut down CrowdTangle, a tool that can be used to find misinformation within popular social media posts. A Meta spokesperson told the Verge that the company will probably keep CrowdTangle working at least through the midterms. After that, the company says it plans to launch a better product.

Amazon may enable your Alexa assistant to take on the voice of a dead relative

At its annual re:MARS conference in Las Vegas, Amazon SVP and Head Scientist for Alexa Rohit Prasad demonstrated how a future iteration would enable your Alexa assistant to take on the voice of anyone, including a dead relative – at least, that was how Prasad chose to demonstrate the product: with someone's dead grandma reading a bedtime story. Don't ask me – I'm just reading what they wrote for me here.

Anyway, that's it for this week. You can find links to all of these stories in the show notes. Stay safe, stay informed, have a great weekend. See you Monday.
Biden calls for better kids' privacy laws in State of the Union

During his State of the Union address Tuesday, President Joe Biden called for better regulation of social media companies. First Lady Jill Biden invited Facebook whistleblower Frances Haugen to the State of the Union – Haugen was the first to shed light on Facebook's (now Meta's) internal efforts to target children as young as 6 on Instagram, and the fact that the company ignored its own research showing Instagram damaged teenage girls' self-esteem.

Facebook is getting sensitive medical information

The Markup reported yesterday that Facebook may have been receiving your medical information from a tracking tool – Pixel. Many hospitals use Pixel on their websites to track site visits. So let's say, hypothetically, that you search for a health condition on a hospital's website – well, for about a third of those sites, the tracking tool sends the information to Facebook. Johns Hopkins, UCLA Reagan, New York-Presbyterian, Northwestern Memorial, and Duke University Hospital are among the hospitals that track site visitors with Pixel.

Democrats led by Sen. Warren introduce bill that bans sale of location data

In the wake of the leaked Supreme Court decision to overturn Roe v. Wade, which paves the way for red states to criminalize abortion procedures, Sen. Elizabeth Warren led a group of Democratic lawmakers in introducing a new bill – the Health and Location Data Protection Act – that would completely ban the sale of your location data. The bill also envisions empowering the Federal Trade Commission to intervene when necessary.
Advocates warn of hate speech problems in Klobuchar's antitrust bill

Advocates including Free Press are pushing back against Amy Klobuchar's antitrust bill – the American Innovation and Choice Online Act – because they're concerned the bill would let companies that may have profited from hate speech and disinformation – like Infowars – sue platforms like Google for banning them from search rankings. Advocates worry that a provision prohibiting Google from favoring its own search results over smaller competitors' could pave the way for disinformation profiteers to make anticompetitive accusations when platforms ban their sites.

Elon Musk suggests harmful content on Twitter should stay up if it's just entertainment

Elon Musk told Twitter employees in a livestream that free speech concerns should outweigh content moderation. Last month, he said that he'd reinstate Donald Trump's account, although he said that prior to the commencement of the January 6th committee proceedings. Twitter's stock has steadily dropped from $54.20 per share, the price when Elon Musk made his $44 billion bid to purchase the company. At the closing bell today, Twitter was trading at $37.78.

That's it for this week. You can find links to all of these stories in the show notes. Stay safe, stay informed, have a great weekend. See you Monday.
I decided to do a solo episode this week because I think it's really important to highlight bias in the public policy profession – because it is a profession. Over the last 17 years that I have been working on tech and media public policymaking, majority-white organizations have always seemed to think it's totally fine to attack organizations founded and led by people of color – orgs like this one – to pursue their status ambitions. So, let's be transparent, shall we?

Just to give you some background – when I started working in this space, there was one organization in Washington – the Multicultural Media & Telecom Council (MMTC) – that focused specifically on telecommunications and media policymaking as they relate to underserved and underrepresented communities. This is where I cut my teeth as a young lawyer, in the ONLY fellowship in town for somebody like me who went to law school at night. Their members are, and continue to be, some of the finest minds in the business – folks like Ari Fitzgerald, a partner at Hogan Lovells who was on the show last week. These are primarily lawyers of color, who are at the top of this craft.

And let me interject something here – this isn't Oakland. We are not Color of Change. This is DC. It's a political town, it's buttoned-up, and comparing orgs of color in this town to orgs like Color of Change is really just not a relevant comparison. A better comparison would be to an organization like the NAACP or the National Urban League – both of which have local chapters but are based in DC. Donations come from two primary categories here in DC – corporations and foundations – that's it. If you've been around a long time – like the NAACP, NUL, or AARP – you have members. You can raise money from them, in addition to seeking other forms of support. That is the way this market works.
Back around 2011 and 2012, it was organizations like Public Knowledge and Free Press calling out MMTC for accepting donations from Comcast – again, one of the worst companies in the world for customer service, but an internet service provider nonetheless. It didn't matter if larger, huge nonprofits worked with these same companies – all that mattered was that Free Press and Public Knowledge needed someone to pick on when they were advocating for net neutrality. MMTC opposed net neutrality, which didn't make sense to me, which is why I started this organization when I was laid off from the Joint Center, where I co-led an institute along with Nicol Turner-Lee that was also focused on telecommunications and media policy at the intersection of communities of color.

So-called progressives with super-deep pockets didn't like orgs like MMTC because they were the only game in town – they had too much credibility – and they opposed net neutrality (for reasons, by the way, I continue to be baffled by – but, in any case, they opposed it). So orgs like Public Knowledge and Free Press called them out – and I called out Public Knowledge and Free Press for having ZERO people of color working there but somehow having the audacity to try to drag MMTC. And as a result of my advocacy – at least I like to think it was, since no one else was pushing back – Public Knowledge is led by the great Chris Lewis, and Free Press by Jessica González, who serves as co-president along with Craig Aaron. These orgs now actively recruit diverse talent – they have changed drastically – and I'm proud of them. Jessica, Craig, and Chris are my colleagues – just like you have colleagues in any profession – they fixed their model and stopped attacking MMTC. We'll see what happens when and if the net neutrality debate starts up again. But, for now, we're good. And Joe Torres, who wrote the book on diversity in the news profession, is at Free Press.

So let's fast-forward to 2016.
In 2016, when this organization – WashingTech – was still an LLC, for profit, Chanelle Hardy – someone I've known since I first moved to DC in 2005, who had worked at the National Urban League, on the Hill, and at the FCC – joined Google. And, again, as a result of my advocacy, since I was vocal about it, as I am now – one of our taglines, then and now, is the Inclusive Voice of Tech Policy, since nobody else cared about that until George Floyd. In 2016, Google started a cohort of folks called Next Gen Policy Leaders – the only PROGRAM IN TOWN AT THE TIME – to engage and involve people of color in tech policy. I continue to participate in the program because it is educational, offers great networking, and, again, continues to fill a need that everyone else just woke up to a couple years ago: the lack of diversity on panels, at networking events, on faculties, you name it, related to tech policy issues. Google was a first mover, while the rest of these tech companies, and nonprofits, were asleep. So, whose fault is that? Whose fault is it that they've built loyalty by investing in us?

Now, fast-forward to today – here comes another organization, the so-called 'Tech Transparency Project' – which, again, isn't a racially and ethnically diverse organization – attacking Google Next Gen in a mediocre paper suggesting that Google had bought out people of color so they wouldn't speak out about Timnit Gebru's firing. Again, Dr. Gebru is a researcher whom Google fired after she spoke up about bias in one of Google's algorithms. First, I OPPOSED her firing, and vocally, on a listserv read by many Next Gens and other people of color in this space. I blasted Google for it. I was livid. I was so vocal, no one else in the Next Gen cohort needed to be – which is always the case in this town. And let me tell you something: we get PENNIES compared to some of these larger organizations. Do you know what Google donated to us last year? $35,000.
The rest of our funding came from Foundation support. But let's take a look at the Center for Democracy & Technology (CDT), which published their annual report last week. Let's see what Google donated to CDT. And CDT is a partner of ours. I'm on their advisory council. Many of their fine scholars have been on this show. But let's take a look – How much did Google donate to CDT in 2021? Wait for it! Over $500,000. Two other donors gave that much – the Chan Zuckerberg Initiative and the Knight Foundation. Now, let's use CDT's search feature on its website to see how much work they've done on “Timnit Gebru.” How many times did CDT so much as mention Dr. Gebru's name, much less call out Google for firing her? ZERO. How many times has the Tech Transparency Project called out CDT for failing to discuss Timnit Gebru? ZERO. So come on, let's talk about ‘Tech Transparency,' everybody. I guarantee I've made more sacrifices in the cause of inclusion and so-called “transparency” in this space than most of Washington. Let's talk about it.
Congress makes progress on federal privacy law

A bipartisan group of lawmakers in the House and Senate introduced a privacy bill last week that some observers, including the Washington Post, say faces a steep uphill battle. The bill would require companies that use your data to collect only the data necessary for their businesses to function properly. It also proposes a number of other things, such as a requirement for the Federal Trade Commission to keep a database of data brokers.

Amazon may have to pay for work-from-home equipment

A federal judge denied Amazon's motion to dismiss a lawsuit brought by a California engineer who works for the company from home. The employee brought a class-action lawsuit against Amazon for failing to reimburse him for equipment and internet service required to carry out job-related duties. The case now heads to trial in California.

Family sues Meta for daughter's self-harm, eating disorder

Remember the Facebook Papers? That trove of documents company whistleblower Frances Haugen released last year showing that Facebook knew it was harming the self-esteem of young girls and continued doing it anyway? Well, the family of a 19-year-old woman is now suing Meta in the Northern District of California, saying the company turned their daughter from a bright and happy child into one who has engaged in self-harm and been hospitalized for depression.

YouTuber proves you can train AI to spew out hate speech automatically by feeding it 4chan posts

A YouTuber ran an experiment training an algorithm on typical 4chan posts spewing hate speech. For those who don't know, 4chan is a public bulletin board where anyone can post anonymously. 4chan is known for its users' racist, sexist, and nihilistic posts. So YouTuber and AI researcher Yannic Kilcher took 3.3 million 4chan threads and fed them into an algorithm. He then set the algorithm to start posting on 4chan, and lo and behold, the algorithm produced vile posts of its own.
The research is important because it suggests that any bad actor can set a single algorithm to post fake or misleading information at scale.

Republicans jump behind Elon Musk's bid for Twitter

Republicans jumped in to defend Elon Musk amid his accusations that Twitter was attempting to thwart his $44 billion offer to buy the platform. Musk argued that Twitter was refusing to provide important documentation about bots on the platform. Texas AG Ken Paxton then launched an investigation into whether Twitter was using unlawful means to obstruct the deal. Twitter ended up granting Musk access to the information he requested, while arguing that Musk's accusations about bots were simply a pretext to back out of the deal. Since Musk officially announced his plan to purchase the platform back in May, conservatives have rejoined the platform in droves.

Feds to investigate Tesla's autopilot crashes

There's a reason why you keep seeing Teslas bashed apart. It's because the Autopilot feature isn't working properly in many of these vehicles, and it's causing the cars to crash into stationary vehicles, including police cars. That's according to the National Highway Traffic Safety Administration, which has stepped up the probe it began conducting into Tesla's Autopilot feature last year.
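The 4chan experiment above relies on a basic property of language models: they absorb and reproduce the statistical patterns of whatever text they're trained on. A toy bigram model sketches that mechanic (the training sentences here are harmless placeholders, and this is an illustration, not Kilcher's actual GPT-style setup):

```python
import random
from collections import defaultdict

def train_bigrams(corpus):
    """Learn which word tends to follow which in the training text."""
    model = defaultdict(list)
    for sentence in corpus:
        words = sentence.split()
        for cur, nxt in zip(words, words[1:]):
            model[cur].append(nxt)
    return model

def generate(model, start, max_words=10, seed=0):
    """Emit text by repeatedly sampling a likely next word --
    the output can only ever reflect the patterns in the corpus."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(max_words - 1):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Train on placeholder text; swap in a toxic corpus and the very
# same code produces toxic output -- garbage in, garbage out.
model = train_bigrams(["the bot posts replies", "the bot posts threads"])
print(generate(model, "the"))  # e.g. "the bot posts replies"
```

A real model like the one in the experiment is vastly larger, but the dependence on training data is the same, which is why feeding it 3.3 million 4chan threads yields 4chan-flavored output.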
Most of us probably don't think about it much, but our cell phones and Wi-Fi connections use something called spectrum to send and receive data. The term "5G" refers to the fifth generation of the cellular standard for transmitting information wirelessly; one portion of 5G spectrum is known as "millimeter wave." Separately, the 5 GHz band used by Wi-Fi has long been viewed as a way around the crowded radiofrequency (RF) environment of older Wi-Fi. The widespread adoption of wireless local area networks (WLANs) in the 2.4 GHz ISM band means that there are very few available channels in any given location, which can lead to high interference levels, reduced coverage range, and low throughput. The 5 GHz band is not as crowded as the 2.4 GHz band, making it a better choice for high-throughput, low-interference deployments like indoor video surveillance networks. In general, this band also provides greater immunity from interference from devices like baby monitors or cordless phones that use frequencies in the 2.4 GHz band.

Ari Fitzgerald LinkedIn

Ari Fitzgerald is a Partner with the law firm of Hogan Lovells, where he leads the firm's Communications, Internet, and Media practice. He provides strategic, legal, and policy advice on a wide range of communications and spectrum policy issues to some of the world's largest and most dynamic communications network operators and equipment manufacturers, as well as industry trade associations and investors.

Resources

Hogan Lovells' Communications, Internet, and Media Practice
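The 2.4 GHz crowding described above follows from simple arithmetic: channel centers sit only 5 MHz apart, but each classic 802.11 channel is roughly 22 MHz wide, so most channels overlap their neighbors. A quick sketch (the channel numbering and 22 MHz width are the standard values; the overlap test is a simplification of real interference behavior):

```python
CHANNEL_WIDTH_MHZ = 22  # approximate width of a classic 802.11b/g channel

def center_freq_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz Wi-Fi channel (1-13):
    channel 1 = 2412 MHz, each subsequent channel +5 MHz."""
    return 2412 + 5 * (channel - 1)

def channels_overlap(a: int, b: int) -> bool:
    """Two channels interfere if their centers are closer than one channel width."""
    return abs(center_freq_mhz(a) - center_freq_mhz(b)) < CHANNEL_WIDTH_MHZ

# Channels 1, 6, and 11 are the classic non-overlapping trio:
# each pair is 25 MHz apart, just clear of the 22 MHz channel width.
print(channels_overlap(1, 6))  # False
print(channels_overlap(1, 3))  # True
```

Only three 2.4 GHz channels fit without mutual interference, versus many more in the much wider 5 GHz band, which is the crowding problem the episode describes.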
Sheryl Sandberg is leaving Meta (née Facebook) after 14 years with the company

The Wall Street Journal reported that Meta has been reviewing Sheryl Sandberg's personal activities, including a reported allegation from the Journal's sources that she used company funds to plan her wedding. But Meta spokeswoman Caroline Nolan denied Sandberg's departure was related to the review.

The tech sector and civil rights advocates achieve small victory in Supreme Court

The tech sector and civil rights advocates succeeded before the Supreme Court after a 5-4 decision to stop HB20 – a Texas social media bill – from going into effect. The Republican bill would give citizens the right to sue platforms with over 50 million users for censoring their content for political reasons, or based on their "viewpoint." Many conservatives believe the larger platforms discriminate against conservative speech. Smaller social media companies catering to conservatives, like Parler, however, wouldn't be subject to the Texas law – so they'd be able to censor progressive speech. Justices Alito, Thomas, and Gorsuch wrote the dissenting opinion, which Justice Kagan didn't join; Kagan was, however, the fourth justice who voted to leave the Texas law in place. The Court didn't provide the rationale for its decision. WashingTech filed an amicus brief in support of overturning the Texas law.

Period-tracking apps aren't regulated

Yesterday, Democratic Congresswoman from California Sara Jacobs introduced a bill called the My Body, My Data Act. The bill comes on the heels of the leaked Supreme Court opinion to overturn Roe v. Wade. In a post-Roe world, there is a foreseeable risk that period-tracking apps would send data to law enforcement in areas where abortion becomes illegal.
University of Maryland: Nearly two-thirds of Republicans want net neutrality rules

The University of Maryland published a report showing almost two-thirds of Republicans would bring back at least some of the net neutrality rules the Trump administration overturned. Pure net neutrality rules would prevent internet service providers from discriminating against different types of traffic by prioritizing some traffic over others. Some conservative advocates want the same standard to apply to social media platforms. That means both the Twitters of the world and the internet carriers themselves could be held liable for discriminating against content – which, frankly, would then have to be applied to broadcasters.

Finally, a new paper from UNC argues that Gmail's spam filter might exclude things the intended recipient actually wants to read. Conservatives jumped on the study and started saying it proves Google's alleged anti-conservative bias. But the academics who wrote the study say that interpretation takes the study out of context. Obviously, mis- and disinformation and hate speech filters have a completely different design – a whole different set of keywords – but sometimes you just have to listen and let people have their say.

That's it for this week! I hope you had a great Memorial Day weekend and week! Happy Summer! And remember, stay safe & informed. Have a great weekend.
Washington Post reports on "light-sensing" technology to identify weapons

A company called Evolv uses "active sensing" – a light-emission technique also used in radar and lidar – to create images and identify weapons in public and private venues, according to the company. Despite "fundamental limitations in differentiating benign objects from actual weapons," existing clients include the New York Mets, Lincoln Center in New York City, the Charlotte-Mecklenburg school system, and Six Flags amusement parks across the U.S.

Washington Post questions legitimacy of claims that social media plays outsize role in gun violence

Writing for the Washington Post's Technology 202 newsletter, Cristiano Lima analyzed the legitimacy of arguments made by government officials, like Republican Texas governor Greg Abbott, that place undue focus on social media when it comes to gun violence, detracting from the real issue, which is lax gun control laws.

Activists protest Amazon's work with police and immigration agencies

Black and Muslim activists led a protest during Amazon Web Services' summit on Tuesday. The protesters opposed Amazon's work to facilitate the surveillance and deportation of immigrants and people of color. Amazon removed three Black or Latino activists who had registered for the summit.

Digital surveillance affects 227,000 immigrants

A critical new report by NoTechforICE details how digital surveillance systems impact immigrants' lives. Constant surveillance, which is touted as an "alternative" to traditional detention, has placed more than 227,000 immigrants under some form of surveillance. These anxiety-inducing technologies interfere with employment and further stigmatize immigrants.
White House issues Executive Order to study facial recognition and predictive algorithms in the criminal justice system

The National Academy of Sciences will examine how facial recognition and predictive algorithms are being used in the criminal justice system, in order to surface civil rights issues and make recommendations to correct them. The order also requires the Attorney General to perform a disparate impact analysis of PATTERN, the Bureau of Prisons' risk assessment program, to see how it affects inmates' chances of early release.

New report details how law enforcement can use technology to enforce abortion laws [Wired]

Wired reports on a new study from the Surveillance Technology Oversight Project that details how law enforcement can utilize existing data access technologies and tracking tools to enforce abortion bans. The report cites keyword search warrants and geofence warrants as examples of these types of surveillance technologies, which the report states can harm those looking for abortion providers or obtaining abortions.

Citing potential harm to abortion seekers and providers, lawmakers urge Google to stop the excessive retention of location data

Separately, Democratic lawmakers wrote a letter urging Google to stop its practice of retaining what the representatives believe is too much location data, which they argue law enforcement could use against those seeking abortions at abortion clinics.

New CDT report looks at surveillance of disabled people

According to a new report by the Center for Democracy & Technology, algorithms and surveillance technologies are being used to surveil, control, discipline, and punish people, with particularly harmful effects on disabled people in education, the criminal legal system, healthcare, and the workplace.
Emnet Tafesse and Ranjit Singh: A Social Science Approach to AI [Ep. 267]

AI is an interdisciplinary field that draws on many fields of study, including computer science, psychology, neuroscience, and mathematics. AI researchers often bring a social science perspective to the field. They want to understand the social implications of AI and to identify ways that humans can best interact with AI systems. Social scientists look at how people interact with AI systems and how these interactions might change over time. They also want to understand how AI systems learn from human behavior and make predictions about how people will behave in the future.

There are two main types of social science approaches to AI. The first focuses on the ethical implications of AI, assuming that we cannot build AI without carefully considering its consequences for society. The second looks at the psychological impact of AI on humans, assuming that humans could feel uncomfortable or threatened by the presence of AI in their lives. Social scientists use these approaches to help us better understand and predict the impact of AI on society as a whole. Emnet Tafesse and Ranjit Singh contrast how researchers in the "Global South" and "Global North" investigate the social impacts of artificial intelligence.

Bio

Emnet Tafesse @emnetspeaks

Emnet Tafesse is a Research Analyst at Data & Society on the AI on the Ground Initiative. She has a passion for utilizing advocacy, research, and policy to create positive social change and a more equitable world. She received her Master's in Public Policy from the University of Chicago and her BA in Political Science and Sociology from Howard University.

Ranjit Singh http://ranjitsingh.me/

Ranjit has a doctorate in Science and Technology Studies (STS) from Cornell University.
His research lies at the intersection of data infrastructures, global development, and public policy. He uses methods of interview-based qualitative sociology and multi-sited ethnography in his research. He examines the everyday experiences of people subject to data-driven practices and follows the mutual shaping of their lives and their data records.

Resources

AI on the Ground Blog Series
NY Attorney General Opens Twitch Probe in Buffalo Shooting Aftermath

New York State Attorney General Letitia James opened a new probe into Twitch, 4chan, 8chan, and Discord in the aftermath of the mass hate-motivated shooting at a Buffalo supermarket that left 10 Black people dead. The probe is geared toward determining how the shooter used these platforms to plan, discuss, stream, or promote the mass murder, to which the defendant, who was apprehended at the scene of the shooting, has pled not guilty.

Federal Trade Commission Votes to Hold EdTech Companies Accountable for Kids' Privacy

The Federal Trade Commission (FTC) voted unanimously yesterday to adopt a policy statement cracking down on how EdTech companies use kids' data beyond enabling their ability to do homework or attend class remotely. The statement warns EdTech companies that it is against the law to require parents and schools to give up their children's privacy rights in order to access these critical apps. The FTC told EdTech companies that they should also expect fines for using kids' data for marketing purposes, keeping kids' information for longer than necessary, or not having proper safeguards in place to maintain kids' privacy while they're using these apps. President Biden commended the FTC in a statement.

Twitter Adds New Content Moderation Rules Despite Elon Musk's $44 Billion Bid

Twitter actually expanded its content moderation rules despite Elon Musk's $44 billion bid to buy the social media platform. The move came the same day Twitter GC Vijaya Gadde said "there's no such thing as a deal being on hold," after Musk's statement that he was reconsidering his takeover. The new content moderation rules target misinformation about wars and natural disasters by preventing users from retweeting misinformation about things like the war in Ukraine, and preventing misinformation from appearing in search results.
Texas AG Sues Google Over 'Incognito Mode'

Texas Attorney General Ken Paxton added another allegation to the lawsuit he filed against Google back in January, along with the States of Washington and Indiana and the District of Columbia, over Google's location settings. Paxton says that Google's Incognito mode isn't actually private at all, and that it risks exposing a user's political leanings or sexual orientation. Google disputes the claims.

Instagram Removes Summer Walker's Post Criticizing KKK

Instagram left a lot of folks scratching their heads after it removed R&B singer Summer Walker's post criticizing the KKK. Walker's post simply said "why the KKK never got a RICO" – RICO, of course, being the Racketeer Influenced and Corrupt Organizations Act, which was used to target the mafia but now targets street gangs, according to VICE. Asking why the KKK never got a RICO charge seems like a damned good question, and it's interesting that Meta took that down while so much white supremacist hate speech stays up. We need more intel on who exactly within these companies is making the final call on these takedowns.

MIT Finds Medical AI Can Predict Patients' Race

A new MIT study reviewed several medical imaging technologies and found AI did a better job identifying the race of patients than humans did. But the researchers have no idea how this is possible. They looked at chest X-rays, chest CT scans, and mammograms, and the AI identified the patients' self-reported race even though the images contained no explicit indication of race.
Today's global economy has made it easier than ever to explore opportunities and access resources. However, this comes with the challenge of understanding how these new markets and opportunities can be beneficial or harmful to your family. Technology is a powerful tool that can help you stay informed, connected, and entertained wherever you are in the world. As content from different countries and cultures becomes more accessible, it's important for parents and caregivers to learn about new media so they can make informed decisions about what content is appropriate for their family. Bios Vicki Shotbolt @VShotbolt Vicki Shotbolt is the founder and CEO of Parent Zone, which keeps families informed and works with partners on digital literacy issues. She's worked with global brands including Microsoft and Vodafone to make Britain more family friendly. Her passion is finding practical solutions to complex problems. She often speaks publicly about parenting and digital issues and is on the executive board of the UK Council for Internet Safety (UKCIS). Vicki likes sailing and is a proud rescuer and owner of Teddy the dog. Geraldine Bedell @geraldineBedell Geraldine is Parent Zone's Executive Editor. Prior to joining Parent Zone, Geraldine worked for 10 years as a writer at the Observer in Britain. Before that she was a columnist and writer for the Independent. She has written for most national newspapers and a wide range of magazines in the UK. She has also produced and presented documentaries for Radio 4. She was the founding editor of Gransnet and has published several books of fiction and non-fiction. Resources Parent Zone Tech Shock Podcast
Fifth Circuit Upholds State of Texas' Content Moderation Ban The Fifth Circuit Court of Appeals overturned a lower court ruling that had blocked the State of Texas' controversial statute, which created a private right of action for anyone who thinks they were banned from a social media platform because of their “viewpoint.” With the Texas law back in effect, platforms with more than 50 million users, like Twitter and YouTube, must notify users in Texas when they remove their content and give them a chance to appeal. If the user isn't happy with the decision, they can sue. Experts are now waiting to hear whether the Supreme Court will take up the case. In the meantime, Elon Musk tweeted earlier today that his acquisition of Twitter is on hold as he seeks to verify that fewer than 5% of its accounts are fake. And the number of conservative users who re-joined Twitter after Musk announced he'd be acquiring the company skyrocketed, according to the Washington Post. State of Virginia Lifts Facial Recognition Ban The State of Virginia announced that it will be lifting its ban on facial recognition technology. Many states began implementing facial recognition bans after a study by the Algorithmic Justice League found that facial recognition software disproportionately misidentified Black people. Reuters reports that many jurisdictions, including New Orleans, are lifting their facial recognition bans as well, saying accuracy has improved. Sony PlayStation CEO Triggers Pro-Choice Staffers With ‘Lighthearted' Abortion Email Sony PlayStation CEO Jim Ryan sent a company-wide email asking staff to respect differences of opinion regarding the leaked draft Supreme Court decision to overturn Roe v. Wade. In the same email, Ryan celebrated his two cats' first birthday, shared their birthday cakes, and joked that they may someday want a dog. One employee told a Bloomberg reporter that they have never been so mad about a cat's birthday before. 
Apple Memo Shows Company's Opposition to Unions Motherboard reports that Apple is circulating a set of talking points to store managers to help them push back against unionization efforts by retail employees. The memo warns employees that joining a union may affect their career prospects, flexibility, and paid time off. In recent weeks, Apple Stores in Atlanta, New York City, and Towson, Maryland have filed for union elections, becoming the first Apple stores to do so.
Afua Bruce is a leading public interest technologist whose career has spanned the government, non-profit, private, and academic sectors; she has held senior science and technology positions at the White House, the FBI, and IBM, as well as in the nonprofit sector. Her new book, The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World, explores how technology can advance equity. Resources The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World by Amy Sample Ward and Afua Bruce @afua_bruce
Tech companies use data to spot patterns in their users' search histories. They use this information to understand how customers behave. But in the years since the Cambridge Analytica scandal broke, the details of how tech companies use our data remain murky. Ranking Digital Rights' Jessica Dheere joined Joe Miller to discuss where the gaps are and what the public needs to know. Bio Jessica Dheere is the Director of Ranking Digital Rights. She founded and was Executive Director of SMEX, the Middle East's leading digital rights research and advocacy organization. In 2018, she was a research fellow at the Berkman Klein Center for Internet & Society, and she was part of the 2019-20 cohort of Technology and Human Rights Fellows at Harvard's Carr Center for Human Rights Policy. Her publications include “Misguiding Multistakeholderism: A Nongovernmental Perspective on the Arab IGF” and a legal research methodology for locating digital rights-related law. Resources Ranking Digital Rights 2022 Big Tech Scorecard @JessDheere
With the convergence of the Metaverse, Web 3.0, and the blockchain, it's hard to imagine just how far we have come over the last century. We can't fully appreciate this giant leap forward without examining the origins of the internet. Who better to help us understand this journey and how we got where we are today than Dr. Vinton Cerf. Dr. Cerf, widely considered one of the “Fathers of the Internet,” helped to develop the TCP/IP protocols. Since 2005, Dr. Cerf has served as Google's vice president and chief Internet evangelist. He identifies new technologies to support the development of advanced, Internet-based products and services. Dr. Cerf is the former Senior Vice President of Technology Strategy for MCI, where he guided MCI's technical strategy. In December 1997, President Clinton presented the U.S. National Medal of Technology to Cerf and his colleague, Robert E. Kahn, for founding and developing the Internet. In 2004, Drs. Kahn and Cerf won the ACM A.M. Turing Award for their work on the Internet protocols. The Turing Award is sometimes called the “Nobel Prize of Computer Science.” In November 2005, President George W. Bush awarded Cerf and Kahn the Presidential Medal of Freedom for their work. The medal is the highest civilian award given by the United States to its citizens. In April 2008, Cerf and Kahn received the prestigious Japan Prize. Prior to rejoining MCI in 1994, Cerf was vice president of the Corporation for National Research Initiatives (CNRI). As vice president of MCI Digital Information Services from 1982 to 1986, he led the engineering of MCI Mail, the first commercial email service to be connected to the Internet. During his tenure with the U.S. Department of Defense's Advanced Research Projects Agency (DARPA) from 1976 to 1982, Cerf played a key role in leading the development of Internet and Internet-related packet data and security technologies.
We are in the middle of a disruptive tech revolution, and it will take some time for society to adjust. Tech, media, and telecom companies turn to Jonathan Cohen for advice as they navigate a continually shifting legal, technological, and political landscape. Decades of transactional and policy experience, in private practice and in government, enable him to advise clients efficiently on strategies and details in their dealings with other industry players, the Federal Communications Commission (FCC), and other executive branch agencies, aligning the private sector with the public interest. His expertise ranges from media (both traditional and social) to broadband wireless, and from commercial transactions to regulatory policy. Mr. Cohen's government service included stints at the White House and the FCC, and he is an expert in platform regulation issues, spectrum licensing and transactions, and the rules and processes governing participation in FCC auctions. His clients have singled out his “outstanding service” on corporate and commercial transactions for nationwide recognition. He holds the Martindale-Hubbell AV® Preeminent Peer Review Rating and is perennially selected as a Washington DC Super Lawyer. After announcing football and basketball games for his college radio station, Jonathan began his career as a radio news reporter in New York City. Communications law was therefore a natural fit after he obtained his law degree. Over his legal career in the media and telecom arenas, Jonathan has negotiated and closed countless telecom transactions and worked on a wide variety of policy issues. He is a proud alumnus of The Wharton School of the University of Pennsylvania and Georgetown University Law Center. Links: Wilkinson Barker Knauer LLP Jonathan Cohen on LinkedIn Book mentioned: Talking about God: Exploring the Meaning of Religious Life with Kierkegaard, Buber, Tillich, and Heschel by Daniel F. Polish, Ph.D. From Gutenberg to Google by Tom Wheeler
The way you process information will affect your survival. It's just a fact. Without the ability to evaluate information through a critical lens, it's hard to discern fact from fiction. Kyle Williams and Kamal Carter of A Long Talk About An Uncomfortable Truth joined Joe Miller to discuss their work to go beyond critical race theory. In partnership with colleges and universities, Mr. Williams and Mr. Carter teach important facts about America's racial history. These facts are missing from most K-12 history curricula in the United States. Through their efforts to promote awareness about what Black people experience in their day-to-day lives, Mr. Williams and Mr. Carter seek to promote empathy. Over the last 19 months, A Long Talk has reached over 4,000 people. But their goal is to bring their expertise to every kitchen table in America. Links: A Long Talk Book: Antisocial by Andrew Marantz A Long Talk is on all social media platforms @ALongTalk2020 A Long Talk on LinkedIn
You're well aware that this podcast is about public policy, and so often we focus on that, but today we're also bringing you the creative side of policymaking. So many of the policies that we fight to implement are created as a way to protect and preserve our ability to be creative. We know that for many professionals, it's hard to live a completely creative life while balancing it with work and earning a living. In a podcast first, we begin today's episode with a poem by our guest, Anita Balaraman. The poem is called “Doubt” and you can read it on Medium.com. Anita never specifically sought out a creative life. Anita is a technology product leader with more than 10 years of experience building technology products that delight customers in both the B2B and B2C domains. She is also adjunct faculty at UC Berkeley, teaching and coaching high-tech product management. She is currently the founder of an early-stage ed-tech startup. Most recently she led the digital customer experience practice at Cisco Systems, designing and launching enterprise solutions for customer experience. Prior to that, she led the product team at WalmartLabs, launching products that combine machine learning, predictive analytics, and personalization. She consults independently and serves on the boards of technology startups in the advertising, ecommerce, and ed-tech spaces. Anita received her MS in toxicology and applied statistics, and an MBA, both from the University of California, Berkeley. Experiencing Creativity As Anita has gotten older, her view of creativity has changed from what it was in her 20s. Now, it's more about having the ability to move forward regardless of the constraints imposed upon you. We all deal with different challenges and constraints, and Anita sees creativity almost as a river that flows around the boulders and roadblocks in our way. 
Your roadblocks are what make your path unique, but they're also what allows you to tap into that creativity. Rethinking Overly Technical Job Descriptions Recently, Anita published research indicating that overly technical job descriptions can actually discourage some of the most creative people from applying for a job. The problem is that in tech and cybersecurity, many minority populations are already underrepresented, and these highly technical job descriptions can further exacerbate the problem. Translating the Technical-Speak One of the issues that many newly minted interns are seeing in their job searches is that job postings lean heavily on engineering and technical requirements, and it seems as if employers want only applicants with very specific majors. The reality is that the technical detail in a job posting rarely captures what the job actually is, and it doesn't show the impact that the employee will have in the role. So it takes some translation to let the job posting paint a picture of the actual role. Hard Skills, Competencies and Skill Sets There is little doubt that many of the hard skills and competencies that a company wants could be clearly articulated in a job posting, but so often we default to a technical framing that only attracts applicants with certain degrees. The reality is that most of the hard skills and competencies a company would desire in a role are possessed by applicants with a range of degrees. There is a plethora of anecdotal evidence that these highly technical job postings discourage even the most skilled and qualified women and minorities from applying. So this segment of the population removes itself from the job pool, and the pool becomes increasingly homogenous over time. Multiple Streams of Income Having multiple roles and multiple streams of income can really broaden your skill set. 
This is especially true if one role requires you to be in touch with technology for the sake of technology, and another role is in product development, which involves technology for a social reason, or to solve a problem. Then it becomes critical to stay in touch with customers and users, in addition to having a handle on the technology, so it's very beneficial. You Have As Much Right As Anyone Else Jane Goodall is a world-renowned expert on chimpanzees and other wildlife that she works to preserve, but Jane Goodall began her famous research without a college degree. Her natural curiosity about chimpanzees drew her into her work and research, and she made herself an expert. Jane's admonition to women who find themselves in a workplace or collaboration where they feel insecure about their credentials, or even feel as if they don't have the same brilliant mind as everyone else in the room, was: “You have just as much right to be in the room as anybody else.” It's important for women to realize this and to pursue jobs they are qualified for. Women Leaving Tech There are some inherent blind spots in the struggle for equity in the workplace. As much as companies or men in the workplace try, they don't always get it right. Men, be careful about validating another person's experience or feeling by relating your own experience. When women hear a man say, “I felt that way too” or “The same thing happened to me,” we understand that there is a societal contract that wants us to find commonality with our peers, but you are discounting the different starting point of the other person. You need to get through the layers to fundamentally understand how the experience arising from the same trigger could be different for people who are different from you. 
As we approach the future of work, how we think about STEM, about cybersecurity as one of the STEM areas, about equity, and about the purpose of the technology being built is becoming more and more critical. Technology for technology's sake is, in some senses, a moot point, especially when demand for these roles outstrips the supply. We need to be smarter and better at attracting talent to opt into these fields, keeping them there, and enabling them to do the work that they do. And we don't have the luxury of writing job descriptions or fostering an environment that weeds people out rather than inviting them to opt in. Links: Sapiens: A Brief History of Humankind Washingtech.org Berkeley College of Engineering Doubt, by Anita Balaraman An obedient child Never wild Begged to be schooled Never one to do, what she wants to. Somewhere in my teens I grew To my parents, a quarrelsome, defiant point of view. Aspired to cross the oceans blue To America for graduate school to pursue. Girls can't be safe, outside of parents' purview Unless she has a husband, never mind she is just twenty-two! In Berkeley, I was told you can be what you want to Even a brown girl with big starry eyes, can dream one day to be a researcher, a professional, or a professor someday. Worked hard, very hard, or at least I thought, For I've been given a chance, a really long shot. But told that I may never be a researcher sought There must be more than just the grades, I thought. Despite how hard I fought… Hiding my feminine brownness was like adding a nought[*]. Perhaps they are right, went my train of thought… Why else would I not see someone like me in doctoral gown? Oh don't be sad, said my loved ones around You can be happy, rich, and successful without a doctoral gown- hands down. 
Look at the valley of silicon and sand A dreamland of success, prestige and wealth For those that are committed to technology at hand. Yes, but my mind wandered… Where did I lose the defiance in my view? I really care about children and leukemia And I can build risk models that I learned in academia. But can you blame them if they did not trust The models I built that needed their process to adjust. I don't look like them, or speak like them The assumptions in my models are hard to trust. I found my kind, the brown variety, Who spoke bad English with no anxiety. The friends at home and those at work Looked and spoke like they belong to the same network. No apologies for being a vegetarian during team lunch Who clairvoyantly knew that salad wasn't a good munch. This must be beautiful- to feel like you belong Without having to rehearse your lines so I don't say something wrong. To work with the bunch where I hoped I belonged, I got another graduate degree, not the Ph.D. I longed. A business degree, hoping to correct the wronged. A Mom twice over, a wife and an employee, ‘you can't get promoted if you leave at 5', would annoy me. Benevolent prejudice, paternalism, and sexism: Belonging, I understood, with deep skepticism. A misfit perhaps, have always been A toxicologist, but not the wet-lab kind A technologist, but not an engineer's mind An entrepreneur, who venture capital declined An educator, living the adjunct grind A researcher, without the terminal degree- unrefined. Seeking belonging, but always unaligned. Perhaps down in my subconscious mind the fringes appeal more than the straight jacket kind? The fringes feed concern for mistakes, Suspended between two or more contradictory states. An indecision between belief and non-belief Hiding, somewhere, is a fictitious fig leaf? Belonging requires suspending the lunatic fringe To honor and reflect the collective doubt. But that is harder to live, day in and day out Easier it seems to simply not honor their doubts?
Now more than ever we need to protect our kids, as they have more exposure online than ever before. Children's online safety is a priority that we take very seriously. We're living in a hyperconnected age, and with the Metaverse bearing down on us, online safety for our children will soon be on everyone's radar, whether you like it or not. Today's guest is Dana Miller, the Internet Crimes Against Children (ICAC) Task Force Commander with the Wisconsin Department of Justice. She joins me to talk about her career, share tips on how to talk with your kids and protect them online, and discuss what kinds of issues policymakers should be paying attention to as the internet evolves. Dana has worked in ICAC since 2014 and manages the Wisconsin ICAC Task Force Program, including oversight of Wisconsin CyberTips from the National Center for Missing & Exploited Children and the ICAC Victim Services Program. Prior to joining the ICAC Task Force, Dana worked for the University of Wisconsin-Platteville's Criminal Justice program, teaching courses in classroom, online, and hybrid formats and managing the online undergraduate Criminal Justice program. Throughout her career, Dana has provided online safety outreach through numerous media outlets, including live presentations and training, television and radio appearances, live and recorded webinars, and streamed social media appearances. Dana is creator and co-creator of multiple safety programs for the Wisconsin Department of Justice, including their online safety interact! e-course and the Protect Kids Online (PKO) podcast. Online Safety Day It's hard to know where to focus efforts to keep your children safe online, and that's something that Dana understands. In a post-pandemic and Metaverse world, kids will be more exposed online than ever before, so it's important to understand both the threats and the vulnerabilities our kids face. Recently, Dana hosted Online Safety Day to try to bring awareness to the issue. 
Non-Profit and Sexual Assault Support Background Dana started out working for a local non-profit in Wisconsin, and part of her role was to work with victims of sexual assault. Early in her career, she was giving talks about online safety, but her work has evolved into much more in recent years. Even though many of the threats remain the same, what we're seeing currently is more of a focus on recovery and a victim-centered approach, recognizing that we have to focus on victims' needs as we respond in a law enforcement context. That's a really valuable perspective that is coming into play in this work every day. Where We Are Today In the world of ICAC (Internet Crimes Against Children), because we're seeing more people online, we're seeing kids using devices at younger and younger ages. We're seeing increased numbers of reports, and those reports involve younger and younger children, which is a major concern. For Wisconsin, the number of cyber tips that come in from the National Center for Missing and Exploited Children is increasing; in recent years it has increased by over 30% each year. This is a huge concern. What Kids Are Doing Online As far as the actual concerns that are reported, they lie heavily on the sexting side of things: sharing nude images and oversharing, as well as talking with people you don't know and sharing too much with them, are the main things going on online. A recent survey by Thorn about self-generated child sexual abuse material surveyed kids between nine and 17. One in six respondents had shared a nude image online. That's a high number, and it's definitely concerning behavior. Sextortion Is Growing What's really concerning about these stats is that, of the kids who said they shared, half didn't care if they knew who they shared it with. They didn't care if it was a stranger! 
Approximately 40% of those didn't care if it was an adult. That's really concerning, because it leads into the realm of sextortion, which has been an extreme concern, especially over the last few years. Kids participating in this type of online activity are often embarrassed or ashamed of what they have done, so they aren't quick to seek out help, which allows for even more victimization. The Correct Response to Exploitation It's important to be in a position to recognize red flags in the conduct we see online and in the relationships our kids are engaging in online. Oversight will become non-negotiable as you work to keep your kids safe from online predators. Keep the lines of communication open with your children. The stronger your connection to them is, and the more they feel safe sharing their online activity with you, the more likely you are to be able to guide them appropriately and keep them safe online. Connect with Dana: https://www.doj.state.wi.us/dci/icac/icac-task-force-home https://www.thorn.org/ https://www.stopsextortion.com/ https://www.fosi.org/ https://connect.missingkids.org/products/parent-connect-a-child-safety-virtual-discussion-series-summer-2021#tab-product_tab_speaker_s https://twitter.com/icactaskforce https://www.linkedin.com/in/dana-miller-46038280/
When the pandemic started, courts that were slower in adopting technology had to undergo a two-week revolution to move their operations to a remote setting. Under normal circumstances, that change would have taken them twenty years to achieve. Existing research shows that while remote technologies can be helpful in court proceedings, they can also harm individuals if not used carefully. Several issues have been coming up around the effects that remote court proceedings have had on our communities. Today's guest is Douglas Keith, counsel in the Brennan Center's Democracy Program, where he works primarily on promoting fair, diverse, and impartial courts. He will walk us through the various concerns. Douglas Keith was the George A. Katz Fellow at the Brennan Center, where he worked on issues around money in politics, voting rights, and redistricting. His work has been featured in the New York Times, Washington Post, NPR, Atlantic, Guardian, New York Daily News, and Huffington Post. Before that, Keith worked as a Ford Foundation public interest law fellow at Advancement Project. He directed voting rights advocates in New York, served as an international election observer for the National Democratic Institute and the OSCE Office for Democratic Institutions and Human Rights, and educated poll workers for the New York City Board of Elections. Keith is a graduate of NYU School of Law and Duke University. What should we be concerned about? All existing research suggests there is real reason for courts to be cautious about video hearings. Studies have shown that video court cases have not always worked out as well as cases in which people appeared in person. Higher bail amounts charged for video court cases in Chicago In Chicago, in the early 2000s, courts began using video for most of their felony bail hearings. 
A study that looked at 600,000 of those hearings found that judges imposed much higher bail amounts on those required to have video hearings rather than appearing in person. On average, defendants in video cases were assessed 50% more bail, and in some instances up to 90% more. People detained in deportation proceedings People detained in deportation proceedings stood a much higher chance of being removed if they were required to appear by video rather than in person. A quiet place to appear and access to broadband When people are detained, questions arise about the quality of their broadband and their access to a quiet place to appear. Also, when someone has to appear in court remotely from a jail or prison setting, the background could influence, impact, or change how a judge views them as an individual. The digital divide When someone not detained has to appear remotely, many different issues related to the digital divide can arise. They might not have the quality of internet that a judge might expect, and there are also massive differences in the devices people use to access the proceedings. Those issues need to be taken into account if the proceedings are to be fair. What has changed? Since Douglas has been advocating for the communities affected by remote court proceedings, there have been technological improvements that might make a difference. Remote proceedings are here to stay Over the last year, courts have become very enthusiastic about how remote proceedings have been working out. Court leaders across the country have said that remote proceedings are here to stay because they have been efficient, speedy, and time-saving. The problem Most jurisdictions have not been talking to the people going through remote court proceedings, or to their attorneys, to learn what is and is not working. 
A common concern A common concern with remote hearings is the ability for the client to communicate with their attorney during the proceedings. That ability gets hampered because remote tools do not allow the client and attorney to make eye contact and quietly confer about any information that might be relevant to the case during the proceedings. Eviction proceedings Douglas spoke to many individuals from legal aid organizations, representing people earning below certain income thresholds and going through eviction proceedings. What you can do, on a local level, when someone's rights are violated Pay closer attention to what the courts in your jurisdiction are doing. Courts often allow for public comment or testimony when going through the process of proposing rule changes to allow for more remote proceedings. Engage with the courts and get involved. Watch your local courts to see the types of rule changes they are proposing, in terms of remote proceedings. If you disapprove and they do not require consent to move forward remotely, write to the court to tell them about your concerns and why you think consent should be required. Resolving the issues Advocates from all over the country are busy working on resolving these issues. They range from academics studying the impact of remote tools during the pandemic to practitioners in various spaces, guiding attorneys. Research More research is needed because we do not know enough about how people are being affected by remote tools. At the Brennan Center, they advocate for more resources towards that research to prevent the courts from inadvertently doing any harm. Some other issues that Douglas is working on that are happening where tech intersects with the judicial system Douglas is working on allowing the public access to court proceedings. During the pandemic, many courts started live streaming. 
That allows court watch groups to remotely observe the court proceedings and report to the public what is and is not working in the courthouses. It also raises questions about what purpose public access to the courts serves. The watchdog effect Public access makes a court aware that it is being watched and reminds it of its responsibility. Live streaming might result in the loss of some of that watchdog effect. So although technology has improved public access to the courts in some ways, we could also lose something along the way. Remote tools The use of remote tools in the courts is nuanced. They can lessen the burden that courts place on people, but there are also times when those tools could be a cause for concern. That is why courts need to work with their communities to find the right answers. Resources: The Brennan Center for Justice Washingtech.org
The WashingTECH Policy Podcast was started largely because of the impact of tech, AI, and cybersecurity on communities of color, working-class people, and immigrants, impacts that few of the big players in the industry have on their radar. Our conversation today is on this very topic, and there is no one more knowledgeable about it than Camille Stewart. Camille Stewart is an attorney and executive whose crosscutting perspective on complex technology, cybersecurity, national security, and foreign policy issues has landed her in significant roles at leading government and private sector organizations like the Department of Homeland Security, Deloitte, and Google. Camille builds global cybersecurity, privacy, and election security/integrity programs in complex environments for large companies and government agencies. Camille is the Global Head of Product Security Strategy at Google, advising Google's product leads on federated security and risk. Previously, Camille was the Head of Security Policy for Google Play and Android at Google, where she led security, privacy, election integrity, and dis/misinformation work. Prior to Google, Camille was a manager in Deloitte's Cyber Risk practice, working on cybersecurity, election security, tech innovation, and risk issues for DHS, DOD, and other federal agencies. Diversity in Cybersecurity is a Problem We have long ignored the fact that addressing issues of diversity is more than just the right thing to do; it is actually a mission imperative in cybersecurity. As technology underpins pretty much everything we do, how systemic racism is amplified, or cured, by technology implementation is something we have to think about. The policy decisions that we've made in the past, and the ones that we make moving forward, are all shaped by a society built on systemic racism; our investments, decision-making policies, and governing bodies are all impacted by legacy and current-day systemic racism. 
The Paper to Address Diversity The Aspen Institute came to Camille, seeing this moment as one where we needed to dive in and talk about how diversity, equity, and inclusion is impacting the work. It convened a large group of folks across diverse backgrounds, leaders in cybersecurity, academia, industry, and government, to come together for a closed-door discussion, under the Chatham House Rule, on how to move the needle. How can we come together to identify what the issues are around diversity in cybersecurity and then come up with some solutions? What was really appreciated is that, as Aspen and Camille worked through this, they were very clear that it needed to be action oriented. So the discussion was rooted in how we can actually do the work, and take action, to drive diversity and inclusion in cybersecurity, for the betterment of not only the people who will and may participate in this industry, but also for the work itself. Why Diversity In Cybersecurity Should Matter to Everyone Let's think about the large-scale cyber incidents we've seen recently. The attack on Colonial Pipeline cascaded into you not being able to get gas. The attack on JBS Foods meant you probably couldn't get lunch meat for your kids. Incidents like these mean that you should be concerned about cybersecurity as an individual. And there are so many other reasons beyond that, but those very large-scale incidents are very attached to the individual: they impact your ability to access services and operate, and you as an individual could take an action that could lead to one of those breaches. So diversity as a part of cybersecurity, as a part of the industry, is important because you can identify things based on your lived experiences and how technology shows up in your life that other people cannot. The Future of the Cybersecurity Workforce A lot of the diversity issues in cybersecurity are systemic.
There are issues with hiring; there are issues with retention; there are issues of education. So many people don't even recognize that working in technology and cybersecurity is an option for them, whether because of access to the industry, building a network, etc. So we created some buckets that address those things and divided up the practitioners who were participating. They put their brainpower behind thinking about solutions to the educational barriers. Certifications are a common tool in cybersecurity. But that's really tough, because most certifications require some years of experience, and you're seeing a lot of entry-level jobs that require those certifications. How can it be an entry-level job if you need five years of experience to get the certification that is required to get the job? Links and Resources: Connect with Camille on Twitter or Instagram @Camilleesq Camille's Paper
Some of the earliest documented instances of health misinformation occurred as early as the 1930s, so it's not as recent a phenomenon as you may think. Obviously, social media has made the speed and prevalence of health misinformation and disinformation much worse. It started with cancer misinformation, but now we are dealing with vaccine misinformation, and of course, the battlefield is social media. All of the medical and technological advances we have made as a first-world country don't make a difference if we can't overcome the health disinformation that is prevalent on social media. Today's guest is Dr. Tara Sell. Dr. Sell is an Assistant Professor in the Department of Environmental Health and Engineering and a Senior Associate at the Johns Hopkins Center for Health Security at the Johns Hopkins Bloomberg School of Public Health. She conducts research to develop a greater understanding of potentially large-scale health events such as disease outbreaks, bioterrorism, natural disasters, or radiological/nuclear events. Dr. Sell's work focuses on improving public health policy and practice in order to reduce the health impacts of disasters and terrorism. She works on qualitative and quantitative research analyses and uses this research to assist in the development of strategy and policy recommendations. Her primary research interests include biosecurity and biodefense, public health preparedness, emerging infectious disease, federal funding and budgeting, and nuclear preparedness policy and practice. She also serves as an Associate Editor of the peer-reviewed journal Health Security (formerly Biosecurity and Bioterrorism). How to Discern the Truth Determining how true the information you share on social media is can be confusing. Disinformation is designed to look like accurate information and is more easily shareable than you realize.
As a society, we have to do two things: hold policymakers accountable to the truth, and come up with solutions to address disinformation. We have to have a national strategy that controls the spread and sources of misinformation, along with a system to promote good information, increase public resiliency to misinformation, and bring all of the stakeholders together. Identifying Misinformation Needs to be a Top Priority Misinformation is intentionally designed to play on your emotions and be so compelling that you will also want to share it. We need a unified effort to show people the tactics being used to make us unwitting accomplices in the spread of misinformation. There are online fact-checking tools that simply aren't used enough. Additionally, when misinformation is prevalent, there isn't a cohesive strategy to help us share the truth as a countermeasure. Better digital literacy will go a long way in helping combat misinformation. Government Skepticism One thing that is hindering the adoption of the safe and effective vaccines is a general distrust of the government. It's not good for the government to be playing the role of arbiter of truth in these situations. The government needs to be more transparent and bring together different agencies to address misinformation from a public health perspective, but the work also needs to extend into the national security side of government. We need a national strategy to confront health misinformation across the spectrum, so that it can more easily be identified and stamped out. Being Caught In the Middle With Friends and Family Many people are dreading the potential confrontations that may happen at the Thanksgiving table over medical misinformation on the different sides of the political arguments. If you find yourself in this situation, Tara advises: Engage respectfully. No one will be convinced of the truth if they are dismissed or ridiculed.
Connect over shared values. Everyone wants their children and their families to be safe. Talk about the tactics used to spread misinformation and your experience with it. Discuss alternative explanations for the conspiracy theories and use information sources that people will accept. Provide trusted sources for information that are not the CDC or WHO, since they have become targets; use Johns Hopkins or other reputable medical sources. Enlist the help of trusted family members. The High Cost of Misinformation Tara's organization did an analysis of what misinformation is actually costing us, as medical misinformation has been declared a public health emergency. If 5% of non-vaccination is caused by medical misinformation, that leads to a cost of $50M in harm each day in a non-Delta-surge environment. It would be even bigger during the Delta surge, or if the share of non-vaccination due to misinformation is higher: if it's 30%, the cost increases to $300M per day. Who Is It Costing? The cost of misinformation is spread out across sectors, but wherever there are people bearing more of the burden of misinformation, the associated costs are also concentrated there. What Can You Do? We have to face the fact that health-related misinformation is going to be with us for a while. We aren't just going to be done with it when the pandemic is over. New targets will emerge. We have to make investments in solutions for health misinformation. We have to work on our own resilience, and encourage friends and family to be more resilient to misinformation as well. Resources: Connect with Tara on Twitter @skirkell Chamber of Progress Website Johns Hopkins Bloomberg School of Public Health
Across the US, many states are considering laws that prohibit online platforms like Facebook, YouTube, and Instagram from enforcing rules against what we call "lawful but awful" online content. Lawmakers are motivated to do this because they think such laws are needed to prevent social media platforms from censoring conservative viewpoints. As with many laws, though, the unintended consequences could prove to be much more harmful than the behavior the laws were intended to regulate. To help us navigate the craziness of what would and would not be allowed if these laws go through, our guest today is Elizabeth Banker, VP of Legal Advocacy for Chamber of Progress. Chamber of Progress is also a sponsor of this show. Elizabeth Banker is Vice President of Legal Advocacy for Chamber of Progress. Elizabeth brings twenty-five years of in-house, law firm, and trade association experience on intermediary liability, Section 230, and online safety. Most recently, Elizabeth was Deputy General Counsel at Internet Association, where she directed policy on consumer privacy and content moderation. While at IA, Elizabeth conducted a review of 500 Section 230 decisions and testified twice before the Senate on efforts to reform Section 230. Elizabeth has first-hand experience responding to the challenges that face online services as a veteran of both Twitter and Yahoo!. She was Vice President and Associate General Counsel for Law Enforcement, Security and Safety at Yahoo! Inc. for more than a decade. More recently she was Senior Director and Associate General Counsel for Global Law Enforcement and Safety at Twitter. Elizabeth spent five years as a shareholder at ZwillGen, a boutique law firm focused on privacy and security in Washington, D.C. Elizabeth began her career in government with the President's Commission on Critical Infrastructure Protection during the Clinton Administration.
Hate Speech and Bully Speech Would Stand Many of the laws being proposed would tie the hands of social media platforms on some of the rules they currently have in place against harassment, bullying, and threatening behavior. These are all types of content that no social media platform wants to see. Currently, the social media providers have rules that they enforce across their platforms to keep users free from hateful, bullying speech and harassment. These new laws would add many complexities to enforcing those rules and would open the platforms up to a constant appeals process from users who have their content removed. 100 Bills and Counting So far in 2021, we have seen over 100 bills proposed in state legislatures all across the nation, and there will probably be many more before the end of the year. The amicus brief that Chamber of Progress filed was a way to explain to the court the real-world implications of these laws, should they be passed and survive the ensuing legal battles. Objections Being Filed The Texas law that is currently under consideration is one to which we filed our objections in the amicus brief. We believe that all platforms should be able to moderate harmful content in order for consumers to be healthy and safe on their platforms. Additionally, these platforms should be inclusive and widely accessible. Here are the main objections we have to this Texas law: It prevents platforms from removing content that is not illegal, such as harassment, hate speech, misinformation, content promoting suicide, etc. The law undermines current content moderation efforts by forcing platforms to essentially publish a playbook about how they detect illegal content. This means child abusers, terrorists, spammers, identity thieves, and other bad actors would have enough information to evade detection, which will lead to more illegal content online. The law places an undue burden on content moderation.
If content is removed, the platform has to go through lots of additional steps that will discourage the company from removing content that actually should be taken down. So again, the net effect is that consumers will have more harmful content to wade through in order to enjoy a platform. Should Parents Be Worried? The Texas law prevents platforms from taking the content moderation steps that they currently take. When it comes to content directed at children, there are many areas that fall under the "lawful but awful" heading that would probably be left on the platform. For example, content glorifying suicide or self-harm, or promoting eating disorders, is the kind of content that platforms would no longer be able to regulate. Cyberbullying is another area where the current protections would be removed, so school fight videos that are normally removed would still be accessible. Non-consensual intimate images, called revenge porn, would not be taken down, nor would other types of harassment that could be very harmful to teens. So parents have every right to be worried, especially those who have already dealt with these sorts of problems, because under this law they will only worsen. Misconceptions About Free Speech The First Amendment does not apply to private companies; it only prohibits government regulation and restriction. Each social media platform also has its own First Amendment interests in what it allows on its platform. The argument that social media platforms are violating a person's right to free speech just doesn't hold water. Misunderstanding Section 230 Section 230 plays a critical role in allowing the platforms to remove harmful content without being sued, and the platforms rely on this protection. Recently a Russian foreign influence campaign sued because its content was removed; the lawsuit failed because of Section 230.
It's important for us to fight to keep both the First Amendment and the Section 230 protections for content moderation strong in order to keep consumers safe while they enjoy these online platforms. Resources: Chamber of Progress Website Follow Elizabeth on Twitter: @elizabethbanker
With Federal privacy regulation leaving much to be desired, it has fallen to individual states to make up the gap and establish their own privacy rules. This approach is problematic for many reasons, which is why Justin Brookman is on the show today. Correction: The name of the individual Joe referenced in the intro is Alex Stamos, from the Stanford Internet Observatory, not John Stamos as was stated in the episode. Consumer Privacy Has a Home at Consumer Reports Justin Brookman is with Consumer Reports, where he's the head of tech policy. He wrote an excellent paper several months ago on state privacy regulation (you can read it here). Justin is the Director, Consumer Privacy and Technology Policy, for Consumers Union, the policy and advocacy arm of Consumer Reports. In this privacy role at CR, he helps the organization continue its groundbreaking work to shape the digital marketplace in a way that empowers consumers and puts their data privacy and security needs first. This work includes using CR research to identify critical gaps in consumer privacy, data security, and technology law and policy, as well as building strategies to expand the use and influence of the new Digital Standard being developed by CR and partner organizations to evaluate the privacy and security of products and services. The Politics of Privacy If you keep up with the news of the day, you know that right now everybody has had it with big tech companies like Facebook. Consumers, politicians, the media, and other businesses have been sounding off about the pitfalls of having big tech intrude into our lives. It's brought about a lot of policy proposals, but no comprehensive legislation that is likely to pass at the Federal level. This gaping hole has been filled by the privacy legislation that is popping up at the state level. Legislation State By State As is often the case, California is one of the first states to come forward with privacy legislation of its own.
The California Consumer Privacy Act has already been amended to make the legislation stronger than the original bill. Virginia also came forward with a bill, and Colorado quickly followed suit. We're also currently seeing legislative battles in New York and Washington State over privacy, and the proposals are really all over the place. The Federal Role of Privacy The Federal government has basically taken a hands-off approach to the privacy legislation popping up around the country. Because all of these privacy laws ultimately implicate the First Amendment, the Federal government is reluctant to play a heavy-handed role in the laws that are cropping up throughout the country. There have been some First Amendment challenges to this legislation, and some have been rejected, as the judiciary is reluctant to regulate companies. Consumers vs. Businesses vs. Government Consumers don't want Facebook or their ISPs to track their every move and collect data on them. At the same time, the government doesn't want private data collected to be in the hands of these companies and outside of the reach of government agencies. Many states are willing to take a more aggressive approach to privacy in light of the massive data breaches that consumers have experienced in recent years. Where Are We Now While it's clear that aggressive action needs to be taken to prevent data breaches, it's going to take regulatory agencies some time to catch up because Federal legislation moves so slowly. Much of the existing legislation is unwieldy for the consumer. Whether it relies on a physical opt-out by consumers or varies state by state, it's just not that easy for consumers to actually protect themselves with the current regulations. State legislatures do not have the staff or the expertise to create the kind of legislation that is needed for consumers to truly be protected.
We need to find a balance between effectively protecting consumers and allowing businesses to function in a way that doesn't put consumers at risk. Resources: Connect with Justin on Twitter @justinbrookman
CDT Comments to US Dept. of ED Urging the Protection of Students of Color and Students with Disabilities, And Their Data @venzkec Cody Venzke is a Policy Counsel for CDT's Equity in Civic Technology Project, where he works to ensure that education agencies and other civic institutions use technology responsibly and equitably while protecting the privacy and civil rights of individuals. He is a contributor to the California Lawyers Association's treatise on the California Consumer Privacy Act, including on the right to opt out and compliance with recordkeeping and training requirements. Prior to joining CDT, Cody served as an Attorney Advisor at the Federal Communications Commission and clerked for the Honorable Julio M. Fuentes on the Third Circuit and the Honorable Jan E. DuBois in the Eastern District of Pennsylvania. Cody also worked on the litigation team of an international law firm, where he served clients in emerging technologies such as clean energy, medicine, and media. In his pro bono work, Cody has represented tenants in eviction actions, assisted applicants under the U visa program, and supported litigation to ensure criminal defendants receive adequate representation under the Sixth Amendment. Prior to starting his law career, Cody taught math at a large public high school in Houston, Texas through Teach For America. Cody graduated from St. Olaf College and Stanford Law School, and grew up in rural Iowa.
Margaret graduated cum laude from the University of Minnesota Law School in 2003. She began her career as a community organizer, fighting for the rights of manufactured home community members with All Parks Alliance for Change (APAC). After law school, Margaret joined HJC under its former name, Housing Preservation Project, where she worked on a range of issues including preservation of federally subsidized housing and manufactured home community preservation. She then returned to APAC as the Legal and Public Policy Director, where she helped resident associations fight for their rights in parks, represented resident associations in court, and helped push legislation to support manufactured homeowners, including a law that prevented deceptive lending practices and the creation of a relocation trust fund for homeowners displaced through park closures. Next, she spent four years at the Center for Urban and Regional Affairs as the Operations Director for the Minnesota Center for Neighborhood Organizing, working to ensure that people affected by decisions had the tools and skills to organize and advocate on issues ranging from education to transportation to police-community relations to housing. Most recently she spent six years as the Community Development Director at Minnesota Housing, where she worked to create connections between community needs across the state of Minnesota and the programs and policies of Minnesota Housing. Margaret was also a policy fellow with the North Star Policy Institute. She brings a wealth of knowledge about local, state, and federal housing policy and programs as well as a robust background in the intersection of community organizing and the law. Discussion Materials Opening the Door: Tenant Screening and Selection
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering.
Bio David J. Johns is known for his passion, public policy acumen and fierce advocacy for youth. He is an enthusiast about equity—leveraging his time, talent and treasures to address the needs of individuals and communities often neglected and ignored. A recognized thought leader and social justice champion, David’s career has focused on improving life outcomes and opportunities for Black people. On September 1, 2017, David Johns began his next life chapter as the executive director of the National Black Justice Coalition (NBJC)—a civil rights organization dedicated to the empowerment of Black lesbian, gay, bisexual, transgender and queer (LGBTQ) people, including people living with HIV/AIDS. NBJC’s mission is to end racism, homophobia, and LGBTQ bias and stigma. In 2013, Johns was appointed as the first executive director of the White House Initiative on Educational Excellence for African Americans (Initiative) by President Barack H. Obama and served until the last day of the Obama Administration in January 2017. The Initiative worked across federal agencies and with partners and communities nationwide to produce a more effective continuum of education and workforce development programs for African American students of all ages. Under his leadership, the Initiative studied the experiences of students and leveraged a partnership with Johnson Publishing Company (EBONY Magazine) to produce a series of African American Educational Summits (AfAmEdSummits) at college campuses throughout the country, where the only experts who sat in front of the White House seal were students as young as elementary school age. The recommendations that students made at AfAmEdSummits have been used to improve policies, programs and practices, including curriculum, designed to ensure that students thrive—both in school and in life.
Prior to his White House appointment, Johns was a senior education policy advisor to the Senate Committee on Health, Education, Labor and Pensions (HELP) under the leadership of U.S. Senator Tom Harkin (D-Iowa). Before working for the Senate HELP Committee, Johns served under the leadership of the late U.S. Senator Ted Kennedy (D-MA). Johns also was a Congressional Black Caucus Foundation Fellow in the office of Congressman Charles Rangel (D-NY). Johns has worked on issues affecting low-income and minority students, neglected youth, early childhood education, and Historically Black Colleges and Universities (HBCUs). His research as an Andrew W. Mellon Fellow served as a catalyst to identify, disrupt, and supplant negative perceptions of Black males—both within academia and society. Johns is committed to volunteer service and maintains an active commitment to improving literacy among adolescent minority males. Johns has been featured as an influential politico and advocate by several publications and outlets, including TheRoot.com, NBC, EBONY and The Washington Post. Johns is a prominent strategist who offers commentary for several media outlets including BET, CNN, EducationPost and TV One. David is currently pursuing his Ph.D. in sociology and education policy at Columbia University. Johns obtained a master’s degree in sociology and education policy at Teachers College, Columbia University, where he graduated summa cum laude while simultaneously teaching elementary school in New York City. He graduated with honors from Columbia University in 2004 with a triple major in English, creative writing and African American studies. Johns was named to the Root100 in both 2013 and 2014, selected as a member of the Ebony Power 100 in 2015, and received an early career award from Columbia University, Teachers College in 2016. He has also served as an adjunct professor at American University. Resources National Black Justice Coalition Johns, D., 2020.
Don’t Make the Internet Unwelcome to Diverse Communities, Especially Black and Latinx LGBTQ People. [Blog] Morning Consult, Available at: [Accessed 11 November 2020]. Related Episodes ‘Social media policy: It's the moderation, stupid!’ with Chris Lewis Ep. 232 Intro JOE: Hey everybody. So here we are on the other side of the election. They're still counting the votes. But this thing looks over. Even in the face of the several lawsuits President Trump has brought to challenge the election results, Biden's win is only becoming more decisive. The president-elect is on track to win by over 5 million popular votes, bringing his total to more than 80 million, more than any presidential candidate in history, and he still has another 75 likely electoral votes outstanding in Wisconsin, Michigan, Arizona, Pennsylvania, and Georgia. So ... we're pretty much done here. Hit me up. (866) 482-3898. Leave your thoughts! Maybe we’ll use them in a future episode. (866) 482-3898. What tech policy issues should the Biden administration and Congress focus on? Let us know. (866) 482-3898. Save it to your contacts. So, you know, I don’t have to say the number over and over again. Like a ShamWow commercial. So that brings us to -- what will the next 4+ years look like in terms of tech policy? Obviously, China will be a major issue, and particularly Huawei. It will be interesting to see whether the Biden administration continues the ban on U.S. companies doing any business with Huawei whatsoever. Key allies haven’t supported the Trump administration’s ban, citing their reliance on Huawei technology. Outside of technology, what are the chances of war with China over the coming years, as China has continued to object to the U.S. presence in the South China Sea? What happens there directly affects the tech markets--war would certainly have a major impact on the supply chain. So that is definitely something to watch out for.
Section 230 of the Communications Decency Act, which states that platforms aren’t legally responsible for the content their users post, has been an issue, as you know, with the Trump administration attempting to get the FCC--an independent agency, no less--to use Section 230 to rein in what some conservatives see as an “anti-conservative bias” on platforms like Twitter. I’d be very surprised to see the Biden administration continue down that path. It’s just a huge waste of administrative, legislative and judicial resources for a policy that, I believe, would ultimately lose on First Amendment grounds once it hit the Supreme Court. Republicans and some Democrats could certainly pursue reforming Section 230. But we’ll have to see if Josh Hawley is as passionate about illegal sexual content, and sex trafficking, as he says he is, and pursues Section 230 as vigorously as he has up until now. And another issue, I think, that we haven’t heard a lot about but probably should since we saw growth among Latino and Black working-class voters voting for Trumpism, is the Future of Work. What does the future of work look like for Americans in a tech sector that hasn’t done anything meaningful, other than releasing diversity reports, to improve diversity and inclusion--nothing observable, I should say, because we can’t see everything that’s going on--all we see are the numbers, which are pretty sad--they don’t look anything like the U.S. population. And you have companies like IBM already lobbying the Biden administration to fill the government skills gap by working with these same companies. The same companies hiring from the same 5 schools. We have over 5,000 colleges in the United States, many of which offer amazing programs--since they’re accredited, right?--they have amazing programs but don’t have the endowments--they don’t have the marketing budgets--for various historical reasons we don’t need to get into. We hear a lot about recruiting from HBCUs. That’s great!
But we have many, many state and local colleges with incredible diversity -- Minority Serving Institutions -- with Black, Latino, Middle Eastern, Asian, and Native American students -- that don’t get much advocacy at all. Why is that? So those are just 3 areas I’m certainly going to be watching. There are many, many others, and we’ll get to them on future episodes … Let’s get into Section 230 with David Johns, Executive Director of the National Black Justice Coalition, and someone I, and many others, greatly respect and admire for his sheer intellect and incredible interpersonal skills. He is an enthusiast about equity—leveraging his time, talent, and treasures to address the needs of individuals and communities often neglected and ignored. A recognized thought leader and social justice champion, David’s career has focused on improving life outcomes and opportunities for Black people. David Johns.
Bio Alejandro Roark is Executive Director of the Hispanic Technology and Telecommunications Partnership (HTTP) in Washington, DC. HTTP is a national non-profit that convenes an intersectional coalition of national Latino organizations committed to promoting access, adoption, and the full utilization of technology and telecommunications resources by the Hispanic/Latino/a community in the United States. HTTP works at the intersection of ethics, technology, and public policy to educate, advocate, and serve as a national voice for Hispanics/Latinos in technology and telecommunications policy. As Executive Director, Alejandro leads a strategic planning process with HTTP member organizations to set the national Latino tech policy agenda, creating opportunities for national and local advocates to engage with Congress and the Administration to advocate for inclusive public policy that promotes civil rights protections, equitable access to broadband, and increased diversity in media and tech workforces. HTTP works to extend Latino priorities in the following policy areas: broadband adoption, spectrum allocation, consumer privacy, open internet, intellectual property, and diversity & inclusion within the technology workforce. With nearly a decade of experience working at the local, state, and national levels, Alejandro has dedicated his career to the elimination of structural inequities across LGBT inclusion, racial and social justice, and civil rights policies, through community power building, story-telling, equitable resource allocation, and by creating pathways for a more diverse workforce. Alejandro applies his skills and leadership to the examination of the ethical and social dimensions of technological change, including the attention economy, data privacy, algorithmic decision-making, and artificial intelligence, to ensure that Latino priorities are integrated into the policy-making process.
Prior to his position with HTTP, Alejandro oversaw the tech policy portfolio, including the planning and execution of the annual Latinx Tech Summit, for LULAC National, the nation’s oldest and largest Latino civil rights organization. He also led the corporate social responsibility team, where he worked with Fortune 500 companies to develop, implement, and scale nationwide community programs, and coordinated LULAC’s Corporate Alliance. Alejandro has also served as the founding executive director of Utah’s first and only Mexican Cultural Arts organization, as well as the associate director of Equality Utah, where he managed the region’s public relations systems, community outreach programming, and state, local, and federal advocacy work. Resources Hispanic Technology and Telecommunications Partnership Intro Hey everyone. Here we are on Election Day as purveyors of misinformation and intimidation use both traditional and digital tactics to keep voters away from the polls. The backdrop to this, of course, has been the Supreme Court’s rollback of the Voting Rights Act, most notably its Shelby County v. Holder decision, which essentially neutered the VRA’s preclearance requirement -- the provision requiring state and local governments to get federal approval before making changes to their voting laws and practices. Section 5 is still there. The Court just ruled that the 40-year-old data Congress relied on to decide which states are subject to the requirement were too old. Then, as Laurence Tribe wrote in Lawfare last week, we have the current conservative majority of the Supreme Court, with the exception of Chief Justice Roberts, suggesting state legislatures should be the highest authority on each state’s voting laws, even above the state’s highest court charged with enforcing its constitution. Social media has not played as dominant a role in shaping public opinion as it did in 2016.
But that doesn’t mean state actors and others aren’t still using it. And the Washington Post reports bad actors are using robocalls, in Michigan specifically, to explicitly tell people to stay away from the polls. The FCC empowered carriers to block robocalls before they reach consumers, but apparently they dropped the ball here. The New York Times warned the public this morning about potential rigged voting machines, tossed ballots, and intimidating federal agents. Yes, this is 2020. And yes, we are still fighting this battle. In this election, though, the electorate cast its votes by mail in record numbers. So we are seeing this shift across the political spectrum to more analog tactics, either to suppress votes or to preserve them. --- We’ll see what happens. I’m tuning it out--at least until tomorrow. I don’t think I’m even gonna watch the results come in. I’ll wake up tomorrow and see what happened. But my guest today is Alejandro Roark, Executive Director of the Hispanic Technology and Telecommunications Partnership here in Washington. Previously, Alejandro led LULAC’s tech portfolio. He was also the founding Executive Director of the state of Utah’s first and only Mexican Cultural Arts organization. Alejandro Roark!
Bio Richard Fowler is host of radio’s nationally syndicated The Richard Fowler Show, a Democratic messaging expert, and a millennial engagement specialist, as well as an advocate for youth and social policy reform. Currently, Richard works with teachers, nurses, and higher education faculty to make sure their voices matter in the decision-making processes taking place at city halls, state capitols, and our nation’s capital. Fowler is regularly featured on prime-time cable news discussing a wide variety of issues, including the 2016 election, social justice, race, and news of the day. Most frequently, he appears on The Kelly File on Fox News and Hardball on MSNBC, in addition to other major national and international outlets across the country. He was a 2012 Democratic National Convention Delegate. The Richard Fowler Show can be heard in over 9.1 million homes internationally and is a partner in the TYT Network, a multi-channel network on YouTube specializing in political talk shows. Richard has been a regular fill-in anchor on Current TV and RTTV and currently serves as the official guest host for The Full Court Press with Bill Press. A native of Fort Lauderdale, Richard got his first taste of politics at a young age when he went with his mother into the voting booth to pull the lever for Bill Clinton for President. After that auspicious start, Richard began his involvement in politics, volunteering as a young man on numerous local races in Florida, including former Attorney General Janet Reno’s gubernatorial campaign. From registering and organizing more than a thousand young voters in Florida for the NAACP to being a campaign manager in the District of Columbia, Richard has used his experience to advise youth, minority, and female candidates.
Richard has been a featured speaker at the Center for American Progress, the National Council of La Raza’s National Conference, College Democrats of America, the United States Student Association, the American Councils for International Education, the Young Democrats of America, over twenty different foreign delegations, and numerous colleges and universities. He has trained nearly 2,000 young people on the importance of image and messaging in the political arena. Richard is also the co-founder of Richard Media Company, a boutique messaging, public relations, and production outfit located in Washington, DC. Outside of his work in media, Richard was the co-founder and director of PHOENIX FREEDOM PAC, a transportation-solutions political action committee. Richard formerly served as the Advocacy Director of the Young Democrats of America and as the Executive Director of Generational Alliance, a progressive youth engagement organization. He sat on the Board of Directors for the Amara Legal Center and is now a National Executive Board Member for Pride at Work. He is also the former Executive Director of the Virginia Young Democrats Annual Conference, a Fellow at the New Leaders Council, and a former Fellow at the Center for Progressive Leadership. Richard earned a Bachelor of Science in Economics and a Bachelor of Arts in International Affairs from The George Washington University. Resources The Richard Fowler Show Intro A coalition of the United States Department of Justice and 11 mostly red states announced Tuesday that they had filed a new antitrust lawsuit against Google over its search dominance. The complaint accuses Google of engaging in a number of anti-competitive practices. One of them is Apple’s exclusive relationship with Google, which allows Google’s search engine to be the default in Apple’s Safari browser. The Wall Street Journal reports that some estimates place the cost to Google for this relationship at $11 billion, comprising some 20% of Apple’s annual profits.
A key piece of evidence here was a 2018 email from a top Apple executive telling his counterpart at Google, “Our vision is that we work as if we are one company.” Neither company has released the name of the executive who sent that email. But I am just beside myself trying to figure out -- and I’m really trying to empathize with the person who sent it -- why, out of all the things they could have put in writing, they wrote the absolute worst thing they could possibly think of. This was a high-level interaction with a competitor in which the anticompetitive pitfalls were blatantly obvious. The first thing on this executive’s mind should have been to avoid any appearance of impropriety, especially given the discourse here in Washington about both companies’ market dominance and the bipartisan support for regulating tech companies. These executives are supposed to be the best and brightest, right? But this is just basic antitrust law and policy. A high-ranking executive at a company like Apple should know it. It’s just basic. It’s not hard. I cannot help but wonder if the executive here was a person of color. Forgive me if I sound harsh. But companies like Apple use their purported inability to find qualified diverse talent as an excuse to justify the sheer lack of diversity in their executive ranks. I really want to know how someone who is supposed to be so superior to everyone else who competed for their job could make such a dumb mistake. I’m not saying this person should be fired. Everyone makes mistakes. But for a company that seems so invested in meritocracy, I, like many of you, can’t help but wonder: 1) was this executive a person of color, and 2) how did the company respond? Are they treating it as an isolated, forgivable incident, or are they globalizing it, making a value judgment about the executive’s overall intelligence? I’m not saying it’s right. I’m not even saying it’s healthy to think this way.
I’m just saying it crossed my mind. And I won’t even get into Jeffrey Toobin.
How to spot and stop misinformation with John Breyault (Ep. 244) -- John and Joe Miller discuss how consumers themselves can correct misinformation by weighing in when they see it, rather than relying on tech companies. Bio John Breyault is a nationally recognized consumer advocate with more than 15 years of experience championing the rights of consumers and the underserved. At the National Consumers League, he advocates for stronger consumer protections before Congress and federal agencies on issues related to telecommunications, fraud, data security, privacy, aviation, live event ticketing, video gaming, and other consumer concerns. In addition, John manages NCL’s Fraud.org and #DataInsecurity project campaigns. John has testified multiple times before Congress and federal agencies and is a regular contributor to national press outlets including the Washington Post, the New York Times, and The Wall Street Journal. Prior to NCL, John was the director of research at Amplify Public Affairs, where he supported clients in the telecommunications, energy, labor, and environmental sectors. Earlier in his career, John worked at Sprint and at the American Center for Polish Culture in Washington, DC. A lifelong Virginian, John is a graduate of George Mason University, where he received a bachelor’s degree in International Studies with a minor in French. Resources: National Consumers League Intro Joe: Hey everybody. Congress can’t get anything done. Now the state Attorneys General are hamstrung by corruption and politics as they try to execute a series of actions against big tech. Real news outlets believe the Department of Justice and various state coalitions are planning to sue Google. The DOJ is expected to focus on Google’s search dominance. The state coalitions are working together with the DOJ -- but then again, they’re not, because many of them believe the DOJ is moving too slowly.
Congress has subpoenaed Facebook, Google, and Twitter. But, of course, Republicans and Democrats rarely see eye to eye. And now we’ve got problems in Texas, where Attorney General Ken Paxton faces bribery accusations that his own deputies alleged in a whistleblower complaint. It’s a litany of allegations: he received hundreds of thousands of dollars in gifts for his own legal defense fund. What did the people who gave those donations expect in exchange? They couldn’t have given them out of the goodness of their hearts. In that case, why not give the money to poor people? Nevertheless, Paxton says these donors are family friends. His own wife, a state senator, introduced a bill to expand his power to exempt individuals from state regulations, which would have positioned him to return favors. He unilaterally decided that Texas Governor Greg Abbott’s ban on elective procedures due to COVID-19 should apply to abortions. This went into effect immediately, forcing women to cancel their appointments pending the outcome of the litigation that followed. The list goes on and on. So Democratic Attorneys General are calling for Paxton to step down, saying the scandal threatens their multistate investigation into Google’s market practices. Meanwhile, sources expect the DOJ to file a lawsuit in a few days. Why that’s public, I have no idea. You’d think it’d be attorney-privileged. But, frankly, following ethical guidelines doesn’t appear to be part of Bill Barr’s skill set. To make matters worse, you have a dozen or so other Republican Attorneys General facing similar corruption problems. Eliot Spitzer must feel vindicated for his little prostitution situation back in 2008. But that was 12 years ago! Let’s move on, let’s move on. John McAfee, the namesake of the antivirus software, was arrested in Spain Monday. The Securities and Exchange Commission alleges McAfee took $23 million from people to invest in cryptocurrencies he was being paid to promote.
But officials note this is a personal lawsuit, not one against McAfee, the company. So we’re in this place where politics is holding up anything meaningful when it comes to antitrust enforcement against big tech companies. We’ll see what the DOJ lawsuit says. But, without even looking at it, I anticipate a number of free speech problems that will have to be overcome, and much of the relevant case law has been written by conservatives. Related Episodes ‘Social media policy: It's the moderation, stupid!’ with Chris Lewis (Ep. 232) 'Health Tech and Communications in Crisis' with Licy DoCanto (Ep. 231) 'They Smile in Your Face: How the Internet is Unmasking Hidden Racism' with Robert Eschmann (Ep. 222) Ep. 203: The Internet and Racial Justice w/ Charlton McIlwain Kids and YouTube with Patrick van Kessel (Ep. 197)
John Bergmayer is Legal Director at Public Knowledge, specializing in telecommunications, media, internet, and intellectual property issues. He advocates for the public interest before courts and policymakers, and works to make sure that all stakeholders -- including ordinary citizens, artists, and technological innovators -- have a say in shaping emerging digital policies. Resources Bergmayer, J., 2020. Tending The Garden: How To Ensure App Stores Put Users First. [ebook] Washington, DC: Public Knowledge. Available at: [Accessed 27 September 2020].
Bio Charlton McIlwain (@cmcilwain) is Vice Provost for Faculty Engagement and Development and Professor of Media, Culture, and Communication at New York University. His recent work focuses on the intersections of race, digital media, and racial justice activism. He recently wrote Racial Formation, Inequality & the Political Economy of Web Traffic in the journal Information, Communication & Society, and he co-authored, with Deen Freelon and Meredith Clark, the recent report Beyond the Hashtags: Ferguson, #BlackLivesMatter, and the Online Struggle for Offline Justice, published by the Center for Media & Social Impact and supported by the Spencer Foundation. Today, Tuesday, October 1st, 2019, his new book, Black Software: The Internet & Racial Justice, From the AfroNet to Black Lives Matter, is released by Oxford University Press and is available wherever you buy books. Resources McIlwain, Charlton. Black Software: The Internet & Racial Justice, from the AfroNet to Black Lives Matter (Oxford University Press, 2019)
Bios Mike Alkire Michael J. Alkire (@AlkirePremier) is the President of Premier, the largest global supply chain and healthcare technology company in the U.S., which helps hospitals and health systems provide higher-quality patient care at a better cost. In addition to leading the integration of the company’s clinical, financial, supply chain, and operational performance offerings, Alkire also oversees its quality, safety, labor, and supply chain technology solutions. An influential figure in America’s efforts to address drug shortages and infuse data-enabled technology solutions into the U.S. healthcare system, Alkire has been consulted by the U.S. Department of Health and Human Services, FEMA, congressional lawmakers, Wall Street investors, and private-sector industry leaders on how to stabilize the medical and pharmaceutical supply chain during the COVID-19 pandemic. In addition to offering his expertise in the media, he shares perspectives via his podcast, InsideOut, through discussions with healthcare insiders. With an eye on equipping the nation’s hospitals and health systems with the clinical, financial, supply chain, and operational performance improvement offerings they need to provide quality care at efficient costs, Alkire oversees Premier’s quality, safety, labor, and supply chain technology apps and data-driven collaboratives, including Premier’s comparative database, one of the nation’s largest outcomes databases. Alkire also led Premier’s efforts to address public health and safety issues stemming from the nationwide drug shortage problem, testifying before the U.S. House of Representatives regarding Premier research on shortages and gray-market price gouging. This work contributed to the president and Congress taking action to investigate and correct the problem, resulting in two pieces of bipartisan legislation. Alkire is a past board member of GHX and the Healthcare Supply Chain Association.
He was recently named one of the Top 25 COOs in Healthcare for 2018 by Modern Healthcare. In 2015, Alkire won the Gold Stevie Award for Executive of the Year, and in 2014 he was recognized as a Gold Award Winner for COO of the Year by the Golden Bridge Awards. He has more than 20 years of experience running business operations and business development organizations at Deloitte & Touche and Cap Gemini Ernst & Young. Before joining Premier, he served in a number of leadership roles at Cap Gemini, including North American responsibility for supply chain and high-tech manufacturing. Alkire graduated magna cum laude with a Bachelor of Science from Indiana State University and earned an MBA from Indiana University. Jonathan Slotkin, MD, FAANS Jonathan Slotkin leads clinical strategy, innovation, and operations for Contigo Health, partnering with health systems and employers to deliver the highest quality care at a fair price. He works to support the development of novel products and implementation approaches that always aim for clinical excellence, patient satisfaction, and value. Slotkin is a neurosurgeon and scientist who has led prominent care delivery reengineering and digital transformation initiatives centered around patients. He has partnered with some of the nation’s largest employers to help them reimagine the care of their associates. Slotkin believes higher quality care will always be the most cost-effective care in the end, and that innovative employers and providers working together is the most powerful force we have to fix the U.S. healthcare system. He maintains a clinical practice caring for patients directly at Geisinger, where he is associate chief medical informatics officer and vice chair of neurosurgery. Resources Scott Weingarten, Jonathan Slotkin & Mike Alkire, Building A Real-Time Covid-19 Early-Warning System, Harvard Business Review, 2020, https://hbr.org/2020/06/building-a-real-time-covid-19-early-warning-system (last visited Aug 3, 2020).
Lisa Woods, Jonathan R. Slotkin & M. Ruth Coleman, How Employers are Fixing Healthcare, Harvard Business Review, 2019, https://hbr.org/cover-story/2019/03/how-employers-are-fixing-health-care (last visited Aug 3, 2020). Jonathan R. Slotkin, Karen Murphy & Jaewon Ryu, How One Health System is Transforming in Response to COVID-19, Harvard Business Review, 2020, https://hbr.org/2020/06/how-one-health-system-is-transforming-in-response-to-covid-19 (last visited Aug 3, 2020).