Recorded on October 17, 2023, this video features an "Authors Meet Critics" panel on the book Reactionary Mathematics: A Genealogy of Purity, by Massimo Mazzotti, Professor in the UC Berkeley Department of History and the Thomas M. Siebel Presidential Chair in the History of Science. Professor Mazzotti was joined in conversation by Matthew L. Jones, the Smith Family Professor of History at Princeton University, and David Bates, Professor of Rhetoric at UC Berkeley. Thomas Laqueur, the Helen Fawcett Distinguished Professor Emeritus at UC Berkeley, moderated. This event was co-sponsored by the Center for Science, Technology, Medicine, & Society and the UC Berkeley Department of History.

The Social Science Matrix “Authors Meet Critics” book series features lively discussions about recently published books authored by social scientists at UC Berkeley. For each event, the author discusses the key arguments of their book with fellow scholars. Learn more at https://matrix.berkeley.edu.

ABOUT THE BOOK

A forgotten episode of mathematical resistance reveals the rise of modern mathematics and its cornerstone, mathematical purity, as political phenomena. The nineteenth century opened with a major shift in European mathematics, and in the Kingdom of Naples this occurred earlier than elsewhere. Between 1790 and 1830, its leading scientific institutions rejected as untrustworthy the “very modern mathematics” of French analysis and in its place consolidated, legitimated, and put to work a different mathematical culture. The Neapolitan mathematical resistance was a complete reorientation of mathematical practice. Over the unrestricted manipulation and application of algebraic algorithms, Neapolitan mathematicians called for a return to Greek-style geometry and the preeminence of pure mathematics.

For all their apparent backwardness, Massimo Mazzotti explains, they were arguing for what would become crucial features of modern mathematics: its voluntary restriction through a new kind of rigor and discipline, and the complete disconnection of mathematical truth from the empirical world—in other words, its purity. The Neapolitans, Mazzotti argues, were reacting to the widespread use of mathematical analysis in social and political arguments: theirs was a reactionary mathematics that aimed to technically refute the revolutionary mathematics of the Jacobins. During the Restoration, the expert groups in the service of the modern administrative state reaffirmed the role of pure mathematics as the foundation of a newly rigorous mathematics, which was now conceived as a neutral tool for modernization. What Mazzotti's penetrating history shows us in vivid detail is that producing mathematical knowledge was equally about producing certain forms of social, political, and economic order.

A transcript of this talk is available at https://matrix.berkeley.edu/research-article/reactionary-mathematics/
From facial recognition―capable of checking people into flights or identifying undocumented residents―to automated decision systems that inform who gets loans and who receives bail, each of us moves through a world determined by data-empowered algorithms. But these technologies didn't just appear: they are part of a history that goes back centuries, from the census enshrined in the US Constitution to the birth of eugenics in Victorian Britain to the development of Google search. In How Data Happened: A History from the Age of Reason to the Age of Algorithms (Norton, 2023), Chris Wiggins and Matthew L. Jones illuminate the ways in which data has long been used as a tool and a weapon in arguing for what is true, as well as a means of rearranging or defending power. They explore how data was created and curated, as well as how new mathematical and computational techniques developed to contend with that data serve to shape people, ideas, society, military operations, and economies. Although technology and mathematics are at its heart, the story of data ultimately concerns an unstable game among states, corporations, and people. How were new technical and scientific capabilities developed; who supported, advanced, or funded these capabilities or transitions; and how did they change who could do what, from what, and to whom? Wiggins and Jones focus on these questions as they trace data's historical arc, and look to the future. By understanding the trajectory of data―where it has been and where it might yet go―Wiggins and Jones argue that we can understand how to bend it to ends that we collectively choose, with intentionality and purpose.

Jake Chanenson is a computer science Ph.D. student at the University of Chicago. Broadly, Jake is interested in topics relating to HCI, privacy, and tech policy. Jake's work has been published in top venues such as ACM's CHI Conference on Human Factors in Computing Systems.

Learn more about your ad choices. Visit megaphone.fm/adchoices

Support our show by becoming a premium member! https://newbooksnetwork.supportingcast.fm/history
Thursday, May 18th, 2023

Chris Wiggins and Matthew L. Jones are co-authors of How Data Happened: A History from the Age of Reason to the Age of Algorithms. Chris is an associate professor of applied mathematics at Columbia University and the New York Times's chief data scientist, and Matt is a professor of history at Columbia. Together, they taught a course called “Data: Past, Present, and Future," and their book grew out of it.

We discuss the history of how data is made; the relationship between data and truth; and the unstable three-player game between corporate, state, and people power. We are currently in an unstable and unpredictable three-player game between state power, corporate power, and people power. In fact, we have a lot of collective influence via the way we construct norms. Our constant human activity is the grist of the mill for machine learning. Corporations do not have all the power. Still, the mix of advertising and data has created many of the most pressing concerns in the world's algorithmically mediated reality.

Follow Chris on Twitter: https://twitter.com/chrishwiggins
Follow Matt on Twitter: https://twitter.com/nescioquid
Follow Mila on Twitter: https://twitter.com/milaatmos
Follow Future Hindsight on Instagram: https://www.instagram.com/futurehindsightpod/
Love Future Hindsight? Take our Listener Survey! http://survey.podtrac.com/start-survey.aspx?pubid=6tI0Zi1e78vq&ver=standard
Take the Democracy Group's Listener Survey! https://www.democracygroup.org/survey
Want to support the show and get it early? https://patreon.com/futurehindsight
Check out the Future Hindsight website! www.futurehindsight.com
Read the transcript here: https://www.futurehindsight.com/episodes/people-power-and-ai-chris-wiggins-matt-jones

Credits:
Host: Mila Atmos
Guests: Chris Wiggins & Matt Jones
Executive Producer: Mila Atmos
Producer: Zack Travis
Chris Wiggins and Matthew Jones, authors of the new book How Data Happened: A History from the Age of Reason to the Age of Algorithms, join Patrick to discuss the history of data, why enumerating things isn't a neutral act, and the ethics of building a world on the foundation of data.

Patrick's book is now available! Get The Verge: Reformation, Renaissance, and Forty Years that Shook the World in hardcopy, ebook, or audiobook (read by Patrick) here: https://bit.ly/PWverge

Listen to new episodes 1 week early, to exclusive seasons 1 and 2, and to all episodes ad free with Wondery+. Join Wondery+ for exclusives, binges, early access, and ad-free listening. Available in the Wondery App: https://wondery.app.link/tidesofhistory

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Fourth Amendment

The Fourth Amendment protects people from unlawful searches and seizures. For example, in the 1970s the Supreme Court ruled that a warrant is necessary to listen in on telephone conversations, but not to collect the phone numbers dialed. This is the precedent that allows big data to collect a vast amount of information about people on the internet. Further, the Foreign Intelligence Surveillance Court has determined that the legal analysis for the Fourth Amendment is the same whether the right is applied to millions of people or to just one.

Data privacy and literacy

The issue with collecting data at scale is that it becomes granular and social. At that point, the data is no longer innocuous but invasive of privacy. It turns out that our everyday, seemingly trivial interactions matter profoundly in the aggregate, and our habit of almost blindly agreeing to arcane privacy policies on the internet is misguided. We need newer forms of transparency that really tell us how the data is being used and how it affects our online profile, as well as a collective effort to prioritize data and technological literacy. We also need to have a conversation about what kinds of analyses are and are not allowed.

Technological Determinism

Technological determinism is a vision of history in which technology leads the way, pushing a narrative that certain changes in technology are inevitable to the point of altering people's expectations. It's also a reminder that decisions are always being made along the way, whether consciously or not, to yield the current system. We now accept the model of advertising services based on the surveillance of users' everyday interactions, but there were actually technological developments in the 1990s that would have made cash transactions largely anonymous. The internet could have developed differently.

Find out more:

Matthew L. Jones is the James R. Barker Professor of Contemporary Civilization at Columbia University. He studies the history of science and technology, focused on early modern Europe and on recent information technologies. A Guggenheim Fellow for 2012-13 and a Mellon New Directions Fellow for 2012-15, he is writing a book on computing and state surveillance of communications, and is working on Data Mining: The Critique of Artificial Reason, 1963-2005, a historical and ethnographic account of “big data,” its relation to statistics and machine learning, and its growth as a fundamental new form of technical expertise in business and scientific research. He was also a Data & Society Fellow for 2017-2018 and has authored numerous other papers.

Follow Matthew L. Jones on Twitter @nescioquid
Matthew L. Jones speaks about key illiteracies surrounding metadata, the hacking of our court system, and the possibility of ethics at scale. Jones is a 2017-2018 Data & Society Fellow who studies the history of science and technology, with a focus on early modern Europe and on recent information technologies. He is completing a book on computing and state surveillance of communications and is working on a historical and ethnographic account of big data, its relation to statistics and machine learning, and its growth as a fundamental new form of technical expertise. Jones is the James R. Barker Professor of Contemporary Civilization in Columbia University's Department of History.
Matthew L. Jones's wonderful new book traces a history of failed efforts to make calculating machines, from Blaise Pascal's work in the 1640s through the efforts of Charles Babbage in the nineteenth century, incorporating an account of both the work and relationships of scholars and artisans, and their reflections on the nature of invention. Innovative in its approach and its form, Reckoning with Matter: Calculating Machines, Innovation, and Thinking about Thinking from Pascal to Babbage (University of Chicago Press, 2016) is a thoughtful and beautifully written history of technology that offers an important perspective on a division between two poles of writing the history of technology: "the collective, deterministic account of inventive activity and the individualistic, heroic, creative account" (7). In Jones's hands, we are offered a third way of understanding cultural production in early modernity, one that did not bifurcate between imitation and originality, social and individual making, or design and production. Central to the story is the history of efforts to mechanize the process of carrying ones in addition, and this fascinating problem persists as a thread through many of the projects discussed in the book. On the pages of Reckoning with Matter, readers will not only enjoy a compelling account of machine calculation through the nineteenth century, but will also find the story of a frog that tears out the eyes of a fish, a man who designed machines for making breakfast, and discussions of the significance of credit and intellectual property, modern programming, sketching, imitation, and debates over the nature of thinking. Highly recommended!
Matthew L. Jones’s wonderful new book traces a history of failed efforts to make calculating machines, from Blaise Pascal’s work in the 1640s through the efforts of Charles Babbage in the nineteenth century, incorporating an account of both the work and relationships of scholars and artisans, and their reflections on the nature of invention. Innovative in its approach and its form, Reckoning with Matter: Calculating Machines, Innovation, and Thinking about Thinking from Pascal to Babbage (University of Chicago Press, 2016) offers a thoughtful and beautifully-written history of technology that offers an important perspective on a division between two poles of writing the history of technology: “the collective, deterministic account of inventive activity and the individualistic, heroic, creative account (7).” In Jones’s hands, we are offered a third way of understanding cultural production in early modernity, one that did not bifurcate between imitation and originality, social and individual making, or design and production. Central to the story is the history of efforts to mechanize the process of carrying ones in addition, and this fascinating problem persists as a thread through many of the projects discussed in the book. On the pages of Reckoning with Matter, readers will not only enjoy a compelling account of machine calculation through the nineteenth century, but will also find the story of a frog that tears out the eyes of a fish, a man who designed machines for making breakfast, and discussions of the significance of credit and intellectual property, modern programming, sketching, imitation, and debates over the nature of thinking. Highly recommended! Learn more about your ad choices. Visit megaphone.fm/adchoices