Podcast appearances and mentions of Christine L. Borgman

  • 7 PODCASTS
  • 7 EPISODES
  • 41m AVG DURATION
  • ? INFREQUENT EPISODES
  • Oct 15, 2018 LATEST

POPULARITY

[Popularity chart, 2017–2024]


Latest podcast episodes about Christine L. Borgman

Berkman Klein Center for Internet and Society: Audio Fishbowl
Open Data, Grey Data, and Stewardship: Universities at the Privacy Frontier

Oct 15, 2018 (69:12)

Universities have automated many aspects of teaching, instruction, student services, libraries, personnel management, building management, and finance, leading to a profusion of discrete data about the activities of individuals. Universities see great value in these data for learning analytics, faculty evaluation, strategic decisions, and other sensitive matters. Commercial entities, governments, and private individuals also see value in these data and are besieging universities with requests for access. In this talk, Christine L. Borgman discusses the conflicts and challenges of balancing obligations for stewardship, trust, privacy, and confidentiality (and often academic freedom) with the value of exploiting data for analytical and commercial purposes. For more information about this event, visit: https://cyber.harvard.edu/events/2018-10-09/open-data-grey-data-and-stewardship
Photo by @AlyssaAGoodman

New Books in Science, Technology, and Society
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)

Social media and digital technology now allow researchers to collect vast amounts of varied data quickly. This so-called "big data," and the practices that surround its collection, is all the rage in both the media and in research circles. What makes data "big" is described by the v's: volume, velocity, variety, and veracity. Volume refers to the massive scale of the data that can be collected; velocity, to the speed of streaming analysis. Variety refers to the different forms of data available, while veracity considers the bias and noise in the data. Although discussion often focuses on these four, two other v's, validity and volatility, also hold significance for big data. Validity considers the level of uncertainty in the data, asking whether it is accurate for the intended use. Volatility refers to how long the data can be stored and remain valid. In her new book, Big Data, Little Data, No Data: Scholarship in the Networked World (MIT Press, 2015), Professor Christine L. Borgman, Presidential Chair in Information Studies at the University of California, Los Angeles, examines the infatuation with big data and its implications for scholarship. Borgman asserts that although the collection of massive amounts of data is alluring, it is best to have the right data for the kind of research being conducted. Further, scholars must now consider the economic, technical, and policy issues related to data collection, storage, and sharing. In examining these issues, Borgman details data collection, use, storage, and sharing practices across disciplines, and analyzes what data means for different scholarly traditions.
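As a rough, hypothetical illustration of the six v's defined in the summary above, the sketch below models them as fields on a dataset descriptor. The class and field names are invented for this illustration; they follow the summary's definitions and do not come from Borgman's book or the episode.

```python
# A minimal sketch of the six "v's" from the summary above, modeled as
# a hypothetical dataset descriptor. All names are illustrative only.
from dataclasses import dataclass


@dataclass
class DatasetProfile:
    volume_records: int       # volume: scale of the data collected
    velocity_per_sec: float   # velocity: speed of streaming analysis
    variety: list             # variety: the different forms of data
    veracity_noise: float     # veracity: estimated bias/noise, 0.0-1.0
    valid_for_use: bool       # validity: accurate for the intended use?
    volatility_days: int      # volatility: how long the data remain valid


# Example: profiling a (made-up) social-media collection against the six v's.
profile = DatasetProfile(
    volume_records=50_000_000,
    velocity_per_sec=1_200.0,
    variety=["text", "images", "interaction graphs"],
    veracity_noise=0.15,
    valid_for_use=True,
    volatility_days=90,
)
print(profile)
```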

New Books in Education
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)

New Books in Communications
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)

New Books in Technology
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)

New Books Network
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)

New Books in Systems and Cybernetics
Christine L. Borgman, “Big Data, Little Data, No Data: Scholarship in the Networked World” (MIT Press, 2015)

Apr 20, 2015 (37:11)