It is true that re:publica is a conference about society, and therefore spends a lot of time on concepts, "analogue software" and policies. But infrastructures and hardware interest us just as much.
Silja Vöneky, Philipp Kellmeyer, Christiane Miethge The session will start with a video showing parts of the web documentary "Homo Digitalis". The panelists will then discuss, together with the audience, the ethical, legal and social implications of the convergence of consumer neurotechnology with big data and advanced machine learning. After all, the brain is not any old organ: it is the seat of our feelings, desires, personality, attitude, creativity and thoughts. Gaining access to this rich trove of highly personal biometric data via advanced neurotechnology may be an enticing prospect for companies that want to harness this data for potentially unprecedented levels of personalization of their services, e.g. targeted advertising. Yet, thus far, there has been no wide-ranging public discourse or deliberation about the potential threats this development poses to the privacy and freedom of our thoughts, feelings and other mental states. Do existing regimes for protecting biomedical data suffice to manage this potential flood of Big Brain Data? Can technology, such as blockchain, federated learning or differential privacy, protect users' brain data from unwarranted access and commercial exploitation? Who will decide on the priorities for research and applications as experts in data science and machine learning systematically move from public research institutions to the private sector? In the first part of the session, we will present the benefits and risks of this scenario from a medical, scientific, legal and neuroethical perspective. In the second part, we will engage the audience in discussing and deliberating about possible solutions for these ethical, legal and social challenges.
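The panel treats these as open questions, but to make one of the named techniques a little more concrete, here is a minimal sketch of differential privacy's core idea, the Laplace mechanism, applied to a hypothetical aggregate query over consumer brain-signal data. The data set, sensitivity and epsilon below are invented purely for illustration; they are not anything proposed by the panelists.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Return a differentially private estimate of a numeric query result.

    Adds Laplace noise with scale sensitivity/epsilon, the core idea of
    epsilon-differential privacy: one person's record can shift the answer
    by at most `sensitivity`, so the noise masks their presence or absence.
    """
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Hypothetical example: average "attention score" measured by a consumer
# EEG headset across 1,000 users (values are made up for illustration).
attention_scores = np.random.uniform(0.0, 1.0, size=1000)
true_mean = attention_scores.mean()

# Each user's score lies in [0, 1], so adding or removing one user changes
# the mean by at most 1/n.
sensitivity = 1.0 / len(attention_scores)
private_mean = laplace_mechanism(true_mean, sensitivity, epsilon=0.5)

print(f"true mean: {true_mean:.4f}, private release: {private_mean:.4f}")
```

The point of the sketch is only that aggregate statistics about brain data can be released with mathematically bounded privacy loss; whether such techniques suffice for raw neural signals is exactly the kind of question the panel raises.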
Shermin Voshmgir, Ricardo Ferrer Rivero Bitcoin will not become the new means of payment: the software is too slow and consumes too much energy. But the technology behind it, the blockchain, has raised great hopes. New application ideas and solutions are flooding the market. Four trends stand out: 1. Many are driven by greed. Online, self-proclaimed crypto experts and trading groups promise quick money. By now you can even bet on the Bitcoin price, which drives up the price further. Instead of a medium of exchange, Bitcoin and co. have become a supposed investment. 2. The goal no longer seems to be a universal digital currency, but the reordering of individual industries. Crypto companies want to overhaul the music market, reorganize electricity billing or simplify the purchase of cannabis. To do this, they keep creating new digital media of exchange, so-called "tokens". 3. Pretenders have an easy time. Apparently it is enough to attach the term blockchain to an idea and people will pour millions into a company. Again and again, users fall for fraudulent third-party providers. 4. Governments do not know how to deal with any of this: some ban cryptocurrencies, others want to regulate them more strictly or develop their own. Building on this, we want to find out in a conversation what has become of the original idea behind Bitcoin. Who still pursues it? Can it survive in its current form, or has our system already swallowed it? What future does the "token economy" have? And what has to happen for "Kryptopia" to become reality after all? supported by T-Labs
Christoph Boecken Sooner or later the IoT will become established. Nevertheless, it comes with a security risk: we make ourselves dependent on manufacturers, who in turn misuse our data or do not bother with the security of their hardware in the first place. Where botnets used to be built from vulnerable PCs, today they are built from printers, vending machines or light bulbs. And with things like cars or traffic lights, this can even become life-threatening. The talk is an entertaining presentation of some of the most absurd examples. It is clearly meant to entertain, without forgetting its serious message.
Joseph Cox, Lorenzo Franceschi-Bicchierai This talk covers two areas: the inner workings of the consumer spyware industry, and how that industry has been repeatedly linked to cases of domestic and sexual violence, rape, and murder. The first is based on a slew of internal spreadsheets, financial documents, customer records, and even live intercepts captured by malware which activist hackers stole and provided to us as journalists. This data shows the popularity of consumer spyware, how some companies explicitly market their products to jealous or paranoid lovers to spy on their spouses, and even some connections to the same companies that provide malware for authoritarian regimes. But our talk also offers the behind-the-scenes of an investigation that relied heavily on information provided by criminal hackers: how do journalists verify that data, and how do they handle intensely private information? And we explain why we purchased the malware ourselves to give readers a deeper understanding of how exactly it worked. The second part brings together interviews with sexual violence victims, domestic violence researchers, and concrete evidence of malware being used to facilitate abuse. This malware may require physical access to install, but to ignore this issue would be to miss the point: in an abusive relationship, the attacker often stays in the same building, room, or even bed as the target. This scenario presents a complicated melding of physical and digital security that the infosec community may want to pay more attention to.
Manuella Cunha Brito Society is at a crossroads, facing two “singularities”. This means we will see events where the depth of change around us becomes exponential and our current way of thinking no longer applies, much like the effects of gravity in a black hole. The first is the intensifying systemic crisis: climate change, food and water insecurity, extreme economic, social and gender inequality. The UN has created a framework to help us fight these big challenges: the 17 Sustainable Development Goals we need to tackle by 2030. The second is the technological acceleration driven by exponential technologies: synthetic biology, machine learning, robotics and blockchains. The WEF states that we are living through the “Fourth Industrial Revolution”. This brings many promises, but also grand challenges: what is the future of work when automation is everywhere? How do we ensure that the benefits reaped are fairly distributed across society? Will our consumption of natural resources be exponential too, or will circular economy principles prevail? How do we guarantee the safety of these technologies? The question is: can we harness this second singularity to prevent the first one, and regenerate society and the planet? What are the signs of hope, and the challenges that lie ahead? Answering these questions is the work of the Good Tech Lab, which will be presented by Manuella Cunha Brito in this session.
Matthew Stender, Adam Harvey Biometric sensors are used by governments, corporations and even NGOs to perceive unique identifiers for individual humans. This session will parse the distinction between two types of biometric traits (hard and soft) - those that we possess and those that we perform - and discuss how both are increasingly being used as unique identifiers to catalogue our whereabouts, categorize our actions and customize our experiences. These automated biometric systems, created by humans, are increasingly relegating humanity 'out of the loop'. Simultaneously, humans are unable to hide or escape from the view of machines. Without oversight or accountability, humans change the way they operate in physical space, eroding their agency. The ethnographic and anthropological implications of biometrics shape more than individual behavior and can even lead to a “flattening of culture.” From data modeling that replicates discrimination to biometric product design that doesn't account for non-able-bodied persons, technology will increasingly leave its fingerprint on human society. Finally, forced participation in these systems raises philosophical questions as to what constitutes human rights in the 21st century. Beyond traditional access control mechanisms (like undergoing an eye scan to enter a secure facility), collective biometrics now give those in power the capacity for predictive analysis through future modeling. Micromovements can now be disambiguated to tell stories about our human bodies that we ourselves may be unaware of.
Frederike Kaltheuner, Paul-Olivier Dehaye, Ravi Naik The Cambridge Analytica scandal is about campaign financing, possible law breaking and Brexit. It's also about money in politics. It's about the continuing irresponsibility of platforms like Facebook. It's about the systemic problem of data exploitation and surveillance capitalism. It's about targeted advertising, persuasion and manipulation. It's also about a shady company that has possibly broken data protection laws (together with political campaigns and other data analytics firms). It's about neo-colonialism and global inequalities. It's also about foreign interference in elections and government propaganda (not just by Russia). It's about GDPR and compliance, competition law and consumer protection. In other words: this is complicated.
Sean Bonner In 2008 I spoke at re:publica about citizen journalism, specifically how blogs were filling gaps left by traditional media outlets whose reactionary nature left some topics and markets without in-depth coverage, and how journalism wasn't inherently better or worse because of a degree the author might have held. Ten years later I'll follow up with a focus on Safecast, crowdsourced environmental data and citizen sensing. Popular Mechanics recently said that "Safecast has revolutionized Citizen Science". In this talk I'll discuss the how and why, show again how the people can often do a better job than the professionals, and illustrate how, without the trust of the community, it doesn't matter how accurate published readings might be. I'll also show how independent and trusted data can be used to verify official data and vice versa, and why it's important for communities to proactively measure things themselves rather than assuming governments or corporations are doing it for them, so that everyone benefits from more open data.
Simon David Hirsbrunner, Lila Warszawski, Tobias Geiger, Toralf Staud Using scientific expertise, supercomputers and big data, researchers are able to broaden our understanding of the past and present of our climate system. By drawing on knowledge of our natural and social worlds, they can go even further, envisioning possible impacts, risks, and solutions for a future with climate change. These scenarios of the future don't just define abstract quantities, like average global temperatures; they describe future worlds that are relevant for policy makers, stakeholders, and for a curious and knowledge-hungry public. The web has no shortage of open data, visualizations, photos, videos and blogs describing climate change and its impacts. To put it bluntly, we have more than enough information available to understand and act on climate change. However, for researchers and non-scientists alike, the challenge is to match informational needs with the scientific data and knowledge available. Searching through all these web sources is time-consuming work, and most citizen scientists give up before they find what they're looking for. Quite often a certain level of expertise is required to realize the answer has been staring you in the face all along! We've come to think that open data is still a long way from truly open science. During the last couple of months, we've been wading through the sea of climate information available on websites, information portals, and social media platforms. Based on this work, we will take a closer look at the 'openness' of information on climate impacts together with interested rp18 participants. The discussion's premise is that openness is not a stable state, but always requires work: 'data' doesn't speak on its own; it is constantly being translated, situated, and put into new perspectives. If so, what's the role of open data scientists in the digital society? How do we deal with the fact that "looking at big data" has become a mundane element of our contemporary digital culture - not just in science, but in everyday life?
Dorothee Klaus, Stevens Le Blond, Massimo Marelli, Nathaniel A. Raymond Civil society, governmental and private sector partners are increasingly engaged in and reliant on digital data and ICTs for the delivery of public services and support to vulnerable populations. However, emerging and related cyber- and data-reliant risks threaten the human security and human rights of these populations, undermining their development potential. The proliferation of ICTs among affected populations and humanitarian and development actors alike exposes critical, unaddressed gaps in the legal, ethical and technological frameworks that have traditionally defined and governed humanitarians' professional conduct. These gaps are an open secret, as is the lack of professionalization around data protection and ICT use. Increasingly, they are a disaster waiting to happen. As evidenced by the recent security breach of a software platform used by aid agencies to store the data of vulnerable people, the risk of such ICT- and data-related disasters is very real and far-reaching in the humanitarian and development sectors. In the face of these ever more complex threats, the need for capacity development for digital security and cyber resilience is increasingly recognized in the international humanitarian and development communities as critical. Unfortunately, an effective approach for such capacity development is lacking. In this panel discussion convened by the Signal Program on Human Security and Technology of the Harvard Humanitarian Initiative, the United Nations Relief and Works Agency (UNRWA), the École polytechnique Fédérale de Lausanne (EPFL), and the International Committee of the Red Cross (ICRC), speakers from a diverse set of backgrounds will explore and debate the major challenges and opportunities of digital security and cyber resilience in the 21st century. Through the unique experience and perspectives of the speakers, the panel will bring theory and practice together to frame a critical narrative and agenda for ensuring that ethics and human rights are central to global and national debates around digital security and cyber resilience. supported by BMZ
Moritz Metz In 2015, my talk "Sieben DIY Weisheiten von Netzbasteln" (seven DIY wisdoms from Netzbasteln) had to be cancelled due to illness. Since then the body of experience has grown in every direction, and the scientific and amateur wisdom can now be distilled from almost 100 Netzbasteln projects. And entertaining questions can be answered. How do you repair zippers, and can crop circles be created with seed bombs dropped from drones? How do you grow sprouts in the bathroom, start a car with a ratchet strap, cast transparent concrete, or cook asparagus on a diesel engine? What is the IoT toilet brush good for? Are sun mirrors dangerous? And why are we laser-cutting banana saws? Alongside answers to these questions, I will put current maker trends into perspective, share field-tested safety tips for tinkerers, and condemn the Internet of Things.
Kai Krämer, Sarah Hambridge Blockchain technology has the potential to accelerate the transition to the clean energy future by redesigning the rules of the game for how energy is distributed and transacted. As blockchain soars to new heights of hype, there's a critical need for the value of the technology to be established in major industries. The mission is clear: accelerate the arrival of blockchain technology across the energy sector, so we enable the transition to a decentralized, more democratic and carbon-free energy sector. The Energy Web Foundation (EWF) is a global non-profit organization focused on capturing this potential. Currently, EWF is building an open-source blockchain infrastructure to serve as the standard industry platform for blockchain applications in the energy sector, to accelerate the deployment of renewables and distributed energy resources (DERs). A platform like the one being built by EWF will be defined by the applications it supports. It will be able to host and run every energy use case, redefining the energy sector as a decentralized AppStore for energy dApps. By developing a platform that addresses the specific needs of the energy industry, we have created an ecosystem of energy players, start-ups and developers, incentivizing the exchange of knowledge between them. This was vital to eliminate duplication of effort and to leverage scarce talent. In this session, we'll talk about the disruptive potential of blockchain in the energy sector and why the EWF is the key to unlocking it. We will provide an overview of the most promising energy use cases based on blockchain technology. Essentially, we will describe what the future we are building looks like. supported by T-Labs
Michael Henretty, Ola Gasidlo, Cathleen Berger This talk will present ideas for how to tackle three dimensions of the future Web: voice recognition, web compatibility, and digital inclusion. When your latest Siri, Alexa or Cortana fails to answer your questions because it seems to simply not understand your instructions, that's probably because you're not a (white) male with a mainstream American accent. Voice recognition tools are only as good as their training data: if the data set doesn't include all sorts of accents, dialects, and dynamic adaptation, the systems we build on top of it will never fully reach their potential. Creating an inclusive, freely available database to train speech algorithms comes with its challenges - but they're not unsolvable. And once we have that? That's when we need to translate and connect these tools to our everyday web experience, writing code that is accessible, compatible, and adaptable across whichever platform you use to access the Web. We can already see a push to create a speech-driven future of the Web, but while this is still a vision for most users, the technical questions underlying this development are being addressed now - which is why we need to make sure our answers contribute to a healthy environment. And once we have the training data and the standards? That's when our political contexts will once again be put to the test: how can we make sure our societies continue to grow in an open and inclusive manner? What do we need to pay attention to if we don't want technologies to become our oppressors? It's in our hands to use the tools we create to make sure they give power to the people - because POP culture needs a voice.
Susan Long, Bahar Kumar, Sebastian Jünemann, Ruben Neugebauer Crisis situations quickly put conventional supply chains out of order, and medical supplies become a scarce commodity. How can digital systems of micro- and distributed manufacturing help strengthen local innovation and production capacity in crisis areas? What tools can be developed so that those affected can help themselves? These are questions and answers that benefit humanitarian aid and are essential in the context of a global transition to more sustainable economic models. The speakers will discuss their experiences: CADUS has founded the world's first Crisis Response Makerspace in Berlin, which deals solely with problems of humanitarian aid and refugees. The speakers are Sebastian Jünemann, who has developed a concept for a cheap mobile hospital based on his experience in Syria and Northern Iraq with his organization, and Lissette Feliciano, a filmmaker from Puerto Rico. She experienced the political blockade of humanitarian aid after the devastating Hurricane Maria in the fall of 2017 first-hand. Together they explain why such a concept is necessary to disrupt humanitarian relief organizations that are otherwise not very friendly to innovation. Bahar Kumar will talk about her experiences running a community innovation hub in Kathmandu and taking part in the latest MakerNet experiment on contract distribution, 3D printing models of earthquake-proof houses for use in teaching safer building techniques to those rebuilding after the earthquake. Susanne Long from Fieldready will report from the teams in Nepal and Syria, and on the global outreach of Humanitarian Makers. Nepalese and Syrian makers visit medical practitioners in places where supplies don't get through, find out what is needed, and make it with support from a global network of engineers. She will look at the path to massively transforming medical supply chains to use local production capacity in post-disaster or conflict situations. Susanne is also demonstrating her work in a workshop at the Makerspace. supported by BMZ
Andres Guadamuz In 2014, the conceptual art collective !Mediengruppe Bitnik programmed an autonomous online agent (a bot) to purchase random items from the dark web with a weekly budget of $100 USD in bitcoin. Over its ongoing run, the Random Darknet Shopper has purchased jeans, generic Viagra, cigarettes, collector's coins, and instructions on how to hack a Coke vending machine. But perhaps the most interesting item arrived in 2015, when the artists received 10 yellow ecstasy pills inside a DVD case. The police in the Swiss town of St Gallen confiscated the machine, but later “released it” after prosecutors determined that no crime had been committed, as the possession was for the purpose of an art project. With the rise of more sophisticated and independent artificial intelligence, situations like the one above will take place more often. Self-driving cars, smart contracts, IoT devices, data mining bots, machine learning algorithms; technology will be given autonomy to make decisions in various circumstances, and sometimes these may prove to be illegal or illicit. What happens when these autonomous agents break the law? Who is liable? Is there even anyone liable? At the moment, the law has not given much thought to infringement committed by AI, mostly because until now most autonomous agents were not very sophisticated. But with the growing presence of intelligent bots in all areas of life, we will need to explore new solutions, or perhaps revisit older regimes. This presentation will explore potential legal pitfalls regarding AI liability, and will look at various legal solutions that we could explore to allocate liability.
Eyal Weizman In recent years, the group Forensic Architecture began using novel research methods to undertake a series of investigations into human rights abuses. The group uses architecture as an optical device to investigate armed conflicts and environmental destruction, as well as to cross-reference a variety of evidence sources, such as new media, remote sensing, material analysis, witness testimony, and crowd-sourcing. In this talk, Eyal Weizman provides an in-depth introduction to the history, practice, assumptions, potentials, and double binds of this practice.
Eden Kupermintz If we were forced to define Western civilization, we would have to address its hunger for the novel. It seems as if our culture is obsessed with pushing forward, with discovering new ways and places in which to be. This is often viewed as a positive thing; there is no shortage of texts in favor of entrepreneurship and pioneering. However, in our rush for the new, we rarely dwell on the price that those who come to occupy these new frontiers will pay. One of the costs that seems more and more real is loneliness. Such costs are hard to pin down and are therefore more frightening, as we might all be susceptible to them. The paradox is clear: the most important and crushing dangers of being out on the edge are the ones that get ignored. This becomes important as Western culture (and, indeed, other, larger parts of humanity) stands on the brink of two frontiers. We've already waded out into the shallows of the first and are preparing to jump into the deep; these are the waters of the digital society. It affects all of us and we are all exposed to its costs. As we submerge ourselves more and more in its binary waters, we enter a new mode of being and create new places in which to exist. The second frontier is an emerging one; we've only begun to test its frigid waters. This frontier is outer space, and we are preparing to start moving into it. Both of these frontiers hold plenty of challenges, but one is often overlooked: loneliness. In both the digital and the astral realms, denizens will be faced with an acute perception that they are alone. However, the types of loneliness felt will obviously differ. What even is digital loneliness? How can we feel alone when we are supposedly so connected? From the other end, how can loneliness in outer space be mitigated? What creates it? What will we do in order to deal with something that seems like an inherent aspect of space exploration?
Luca Caracciolo In the cult novel "Ready Player One", adapted for film by Steven Spielberg (in cinemas from April 2018), Ernest Cline describes a not-too-distant future in which most people spend their everyday lives in virtual realities, wearing VR headsets. But unlike the conventional reading of VR as "non-reality" or escapism would suggest, the users of these virtual worlds create social meaning and a liveable environment through interaction with others. Despite the novel's dystopian tone, Cline thereby highlights a central aspect of virtual realities that is still often misread or overlooked entirely: (digital) virtuality as a meaning-making factor of the social, and thus an important component of physical reality. Against the background of this meaning-generating quality of virtual realities, the technical developments in virtual reality (VR) and augmented reality (AR) of recent years appear in a different light: they are not simply a further evolution of the computer interface, but mark the beginning of the spatial computing era, in which digital virtuality in the form of VR and AR technologies becomes a natural part of people's everyday lives and experience of reality. The talk discusses two closely linked key questions: How should we think about a digital virtuality that is not understood as the opposite of reality, but as one of its meaning-making factors? And which technological challenges still have to be mastered on the way to this new era of computing?
Eva Wolfangel For a magazine feature, I visited my virtual, and all the more real, friends in real life to get to the bottom of what role social VR could play for us in the future: Sana, the devout Muslim widow from Kuwait who would have no social life without VR. Chris, the diplomat in crisis zones who has moved part of his family life into VR because he was separated from his loved ones for security reasons. Cattz, the impoverished computer technician from the US with a severe heart condition, for whom only the life in his avatars still counts as living. And Ben and Shoo, the American and the Chinese woman who found each other and got engaged in VR. I spent a lot of time with all of them, in VR and in real life. In the process it became palpable how pointless the distinction between virtual and "real" reality already is, how unreal real life can be and how real the virtual can be. Researchers I also met on this journey confirmed this: virtual reality is not a second-class reality, and in the future we will be able to choose between different realities. That does not mean we will be unable to distinguish between them and the world our biological bodies live in. But even so, VR comes astonishingly close to real life, partly because our consciousness is not bound to the body. The philosopher Thomas Metzinger explained to me for this feature how his experiments show that the body of an avatar can indeed feel like one's own body, and what all of this means for our concept of reality. I want to share with you how the virtual and real lives of the friends I visited, and my contact with them on these different levels, feel; what researchers say about it; what Second Life founder Philip Rosedale, whom I also visited, is actually cooking up in VR; and what all of this means for us and our virtual and very real future.
Ranga Yogeshwar Some areas of deep learning are enormously powerful. In evaluating X-ray images or recognizing traffic signs, neural networks already surpass humans. Modern systems can make better judgments than humans when granting bank loans. The consequence: the algorithms decide. But according to which moral principles do they make their decisions? If you look closely at how these systems work, feedback effects and their complexity mean they no longer behave causally. These neural systems work, but their exact decision patterns elude classical logic. We are thus experiencing a transition from causality to correlation. This amounts to a break with the fundamental principles of the Enlightenment. Even within the communication streams of Twitter and Facebook, or in applications of natural language processing, there is considerable potential for societal fragmentation. Human and machine: who ends up programming whom? // Before Ranga Yogeshwar's talk, Anja Karliczek, Federal Minister of Education and Research, will open the subconference "We can WORK it out" on the Science Year 2018 - Working Worlds of the Future with a keynote on "Science communication in the working worlds of the future".
Tom Wallis The gap between the cultures of science and the humanities has grown progressively larger as generations on each side have learned to further disregard the other. Worryingly, the view that the humanities have nothing to contribute is infecting Silicon Valley and tech culture at large. A recent Stephen Hawking quote is symptomatic of the problem: “Philosophy is dead”. This should concern us: cultural change is now brought about by people who increasingly shun cultural study, and it is easy to envision a future where those who work in technology are bankrupt in their understanding of the humanities. There is a lot of danger in “disrupting” people's lives without a solid ability to assess what the disruption might bring about. Countering Hawking, this talk explores what technology loses in this growing philosophical bankruptcy, taking philosophy and computing science as examples. It will cover how cultural studies cultivate unique analytical skills, and show that without them the tech industry is left vulnerable to manipulation, a dangerous threat. This naturally leads to a discussion of the utility of the humanities: some of these analytical skills are adopted in tech implicitly, and there is a definite opportunity to strengthen this part of the industry by changing course and working more closely with the humanities. Ultimately though, the talk will demonstrate that a field's utility alone is not sufficient for measuring its value.
Olivia Klose Computer vision enables computers to obtain a high-level understanding from images and videos by automatically extracting, analysing and understanding useful information. With autonomous driving, visual failure detection and scene understanding, computer vision is becoming one of the focus areas in artificial intelligence (AI), enabling computers to see and perceive like humans. In this talk we will present our ongoing collaboration with Royal Holloway, University of London on illegal small-scale mines in Ghana. Illegal small-scale mining is a growing industry in many African, Asian and Latin American developing countries. Gold and other precious minerals are extracted in a low-tech, labour-intensive process linked to environmental damage, health hazards and social ills. At the same time, the process provides huge employment and income potential in poverty-stricken communities. Since these small mining operations are mostly illegal, there is virtually no data to analyse their exact impact. This project seeks to fill this void to enable better-informed policy decisions by relevant stakeholders. We built an image classification model in Keras and scaled the training of the model using Kubernetes on Azure. Once small-scale mines were identified, we used Python to investigate the impact of those mines on surrounding environments and populations. supported by BMZ
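The abstract doesn't spell out the model itself, so purely as a hedged sketch of what a small binary Keras image classifier for satellite tiles might look like; the directory layout, image size and hyperparameters below are assumptions for illustration, not the project's actual setup.

```python
# Minimal sketch of a Keras image classifier for satellite tiles,
# assuming a directory of tiles labelled "mine" / "no_mine".
# Paths, image size and hyperparameters are illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (128, 128)
BATCH_SIZE = 32

# Build labelled datasets from folders named after the two classes.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=BATCH_SIZE)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=BATCH_SIZE)

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary: mine vs. no mine
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

model.fit(train_ds, validation_data=val_ds, epochs=10)
```

As the abstract notes, training at scale would then be distributed, for example across a Kubernetes cluster on Azure; this sketch covers only the single-node Keras part.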
Thomas Hervé Mboa Nkoudou, Jenny Molloy, Tarek Omar, Lucy Patterson In this session, we will focus on hardware and software solutions, tools and services, resources and projects that adopt the Open Source approach and share one goal: to increase access to scientific research. We present initiatives and visionaries putting this into action in different regional contexts, who will discuss their insights, practical knowledge and experience. What can science learn from hacker culture, and which methods and principles can be shared? What impact would this kind of junction have on individual academics? How can digital tools help the global scientific community to promote the growth of knowledge worldwide? The panel includes the Gathering for Open Science Hardware, a community of developers and scientists who support the growth of knowledge through global access to hardware for science; openeuroscience.com, a network for Open Science projects; the Open Research Lab for Emerging Technologies; as well as the mobile lab approach developed by the Cairo Hackerspace.
Lior Zalmanson, Thomas Gegenhuber Platforms bring together clients (e.g., consumers) with workers in specific domains. Platform-mediated work thrives on using the Internet to mobilize workers and deploying algorithms to supervise, control, and manage them (i.e., algorithmic management). Typically, the relationship between platforms and workers is market-based: rather than hiring them as employees, platforms treat them as flexible, self-employed freelancers. In this session, we assess the state and the future of labor relations by shedding light on the following questions: How do workers operate in environments where AI governs and enforces the platform's policies? How do human workers respond? To give the human worker a voice, we further ask whether and how platforms enable workers to participate and improve their autonomy over work, influence how work processes are organized, or even shape a platform's strategy. Lior Zalmanson from the University of Haifa will present insights from a recent study on Uber (conducted with Mareike Möhlmann & Ola Henfridsson of Warwick University). In this study, they followed drivers in New York and London and found a wide array of methods drivers use to regain control and exercise their agency in the face of algorithmic management. Among these are acts of resistance, switching to alternatives, and even gaming and tricking the system to get back their sense of autonomy. It shows that even in the face of computerized control, workers will attempt to subvert the technology to reclaim their humanity. Subsequently, Dr. Thomas Gegenhuber (Leuphana University Lüneburg) will discuss how to give workers a voice in platform-mediated work. In his talk, he draws on the results of a study funded by the Hans-Böckler-Foundation (conducted with Markus Ellmer & Claudia Scheba), which explored whether and how six German crowdsourcing platforms implement voluntary participation instruments for crowdworkers. The study found that most platforms emphasize informing (e.g., a blog) and reporting issues (e.g., a chatbox). Some platforms provide the opportunity to discuss issues in a forum or message board; voting is barely relevant. This kind of participation empowers workers to some extent to increase their autonomy, yet workers have little say in how work is organized, let alone in shaping the overall direction of the platform. Based on these findings, Thomas provides an overview of best-practice participation examples and outlines future directions.
Steven Hill, Lena Ulbricht, Malte Spitz, Annette Mühlberg Facebook and Google use their “engagement algorithms” to suck up our data and turn it over to advertisers, purveyors of fake news and “psychographic messaging,” fed by Russian bots and digital operatives trolling for Brexit and Donald Trump's election. A battle is looming over who will control this sea of Big Data and artificial intelligence (AI): the public interest or Internet-based platform companies? French president Emmanuel Macron has outlined a forward-looking strategy that seeks to inject European values into the race for AI development. When combined with efforts by EU competition commissioner Margrethe Vestager to enforce a rules-based order, and the EU's forthcoming General Data Protection Regulation, there is the vague outline of a vision for the Digital Age that provides an alternative to Silicon Valley. But many parts of the blueprint remain incomplete. What is the German contribution to this discussion? Many debates and policy decisions lie ahead. For example, should we become “data shareholders” who get paid for permitting Facebook and Google to mine our personal data? Or should our data be re-conceptualized as “social data” that is protected as part of the commons? Do we need to establish a collaborative CERN-type organization for the development of AI, to ensure the availability of open-source data sets and that the public good is kept at the forefront? Nations have always required licensing and permits for traditional companies; in this high-tech era, do we need to create “digital licenses” which would make clear the rules and conditions for allowing platform companies access to German and EU markets, and develop the technological tools to protect one's “digital borders”? In addition, might blockchain technologies be deployed for creating “radical transparency” in commercial transactions, real estate records, online labor platforms, and for tracking online services for regulatory and taxation purposes (such as Airbnb, Clickworker and Upwork)?
Safiya Umoja Noble Noble argues that the combination of private interests in promoting certain sites and the monopoly status of a relatively small number of Internet search engines leads to a biased set of search algorithms that privilege whiteness and discriminate against people of color, specifically women of color, and contributes to our understanding of how racism is created, maintained, and disseminated in the 21st century.
Pandu Nayak, Lorena Jaume-Palasí Join Pandu Nayak, Google Fellow and Vice President of Search, and Lorena Jaume-Palasí of AlgorithmWatch for an open discussion on how Google's search engine works to ensure users get the most relevant answers to their questions. How does Google's search team deal with misinformation and other forms of deceptive content online? How does search evolve over time in order to address new demands and tackle new challenges? And how transparent can algorithms be? Hear about the constant evolution of Google Search that addresses the changing and dynamic nature of the web today, and discover real-life examples of recent changes.