

New Dark Age
Chapter Summaries
What's Here for You
Prepare to confront the unsettling realities of our digitally saturated world. 'New Dark Age' doesn't offer easy answers, but instead plunges you into the heart of technological paradoxes. You'll embark on a journey to understand how computation, climate change, and complex algorithms shape our perceptions and societies, often in ways we barely comprehend. Expect to grapple with the limitations of knowledge, the rise of weaponized secrecy, and the unsettling truths hidden within the cloud. This book offers a critical lens, challenging the seductive optimism surrounding technology and prompting you to question the very nature of progress in an age of overwhelming information. It's an invitation to navigate the chasm between connection and understanding, fostering a deeper awareness of the unseen forces governing our lives.
Chasm
In “Chasm,” James Bridle grapples with technology's paradoxical promise: connection amidst a growing chasm of understanding. Bridle begins with a personal anecdote, a looped soundbite from *The West Wing*, framing the book as an urgent message from technology itself, a call to understand what we know, how we know, and what remains unknowable. The author argues that technology, while transformative, hasn't necessarily deepened our comprehension, and that we are entangled within systems that shape our thoughts and actions. The core tension arises: technologies exacerbate global challenges, yet we lack the critical literacy to navigate them effectively. Bridle insists that true literacy transcends functional use, demanding an understanding of context, consequences, and the inherent limitations of any single solution. He critiques the oversimplified call to 'learn to code' and the dangers of 'computational thinking,' which assumes every problem is solvable through computation. Instead, Bridle advocates for systemic literacy, a way of thinking that acknowledges the non-computable aspects of our world. The “cloud” becomes a central metaphor: initially a tool to reduce complexity, it now obscures the physical infrastructure and power dynamics of the internet, a digital fog hiding the weighty realities of data centers and exploited territories. Bridle urges us to seed the cloud, to condense it and force it to give up its secrets. He proposes “cloudy thinking,” embracing unknowing as a productive force, drawing inspiration from the medieval concept of the “Cloud of Unknowing,” which emphasizes experience over mere knowledge. The author posits that the network's signifying quality lies in its lack of singular intent, a tool unconsciously generated by collective enterprise. This thinking reveals the interconnectedness of all things and the need to constantly rethink and reflect upon the network's weights, balances, and possibilities.
Bridle challenges the belief in inevitable technological progress, arguing that we have abdicated our objections, falling into the chasm of computational thinking. He suggests that the abundance of information has paradoxically darkened the world, leading to fundamentalist narratives and post-factual politics. The author calls for a re-enchantment of our tools, urging us to see them not just as instruments for achieving desired effects, but as metaphors that shape our understanding of the world. Bridle emphasizes the importance of acknowledging the darkness, not as an absence of knowledge, but as an opportunity for new ways of seeing, advocating for thoughtful engagement with technology and a radical rethinking of what we can think and know about the world. He concludes with a reminder of the existential threats we face, urging us to speak honestly and think deeply, for our collective survival depends on it, and that we cannot fail each other now.
Computation
In "Computation," James Bridle begins with John Ruskin's haunting observations of a new, ominous storm-cloud in the 19th-century sky, a metaphor for unseen forces shaping our world. Bridle draws a parallel to our present age, where computation, like Ruskin's plague-cloud, subtly governs our lives, its influence often unseen yet deeply felt. He introduces us to early computational thinkers like Lewis Fry Richardson, whose dream of predicting the weather through numerical processes was limited by the technology of his time. Richardson's vision of a vast theater of human computers illuminates the ambition and scale of early computational thought. Then comes Vannevar Bush, envisioning the memex, a device to manage the overwhelming flood of information, and John von Neumann, who boldly declared, "All stable processes we shall predict. All unstable processes we shall control." These figures set the stage for the rise of militarized computation, exemplified by the ENIAC and the SSEC, machines capable of both incredible calculation and terrifying destruction. The SSEC, calculating hydrogen bomb simulations behind the facade of a New York shopfront, becomes a potent image of hidden computational power. Bridle reveals how this computational thinking, initially driven by military imperatives, has quietly permeated every aspect of modern life, from weather forecasting to airline reservations, even shaping our very thoughts. He explains automation bias, where we prioritize automated information over our own senses, sometimes with disastrous consequences. The chapter then shows that as computation becomes increasingly complex and opaque, it creates a new kind of darkness, obscuring both its limitations and its biases. The dream of total control through computation is not only unrealizable, but also dangerous, leading to a world where our models replace reality. 
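Richardson's dream of forecasting by numerical processes boils down to marching physical quantities forward on a grid of numbers, one small arithmetic step at a time. As a hedged sketch (not Richardson's actual equations), the toy below performs explicit finite-difference diffusion in one dimension, the kind of cell-by-cell arithmetic his imagined theater of human computers would have carried out by hand; the grid size, coefficient, and initial disturbance are all illustrative assumptions.

```python
def diffuse(u, alpha=0.1, steps=50):
    """Explicit finite-difference diffusion on a 1-D grid: each interior
    cell relaxes toward the average of its neighbours, one small step per
    iteration (the scheme is stable for alpha <= 0.5)."""
    u = list(u)
    for _ in range(steps):
        u = [u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
             if 0 < i < len(u) - 1 else u[i]  # boundary cells held fixed
             for i in range(len(u))]
    return u

# An initial 'disturbance' in the middle of the grid gradually smooths out,
# the way Richardson hoped pressure fields could be marched forward in time.
smoothed = diffuse([0, 0, 0, 1.0, 0, 0, 0])
```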
The author cautions that the more obsessively we compute the world, the more unknowably complex it appears, echoing Lewis Fry Richardson's recognition that nature resists simple explanations, like a coastline whose length increases with the precision of measurement. Ultimately, Bridle suggests that by acknowledging the limits and biases of computational thinking, we can begin to reclaim our understanding of the world and resist its total capture by the machine.
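Richardson's coastline observation has a precise mathematical core: measured length depends on the length of the ruler. As an illustrative sketch (using the Koch curve as a stand-in coastline, an assumption not made in the text), each threefold refinement of the ruler reveals four segments where three fit before, so the measured length grows without bound:

```python
def koch_length(n):
    """Measured length of the Koch curve when surveyed with rulers of
    length 3**-n: at refinement n the curve resolves into 4**n segments
    of length 3**-n, so each finer ruler multiplies the total by 4/3."""
    return (4 / 3) ** n

# The finer the ruler, the longer the 'coastline' becomes.
for n in range(5):
    print(f"ruler = {3 ** -n:.4f}  measured length = {koch_length(n):.3f}")
```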
Climate
James Bridle paints a stark picture of our current climate crisis, opening with a haunting image of the Siberian tundra, its seemingly solid ground revealed as a trembling, methane-releasing slush—a visceral metaphor for the destabilization of the entire planet. The author elucidates how the melting permafrost is not merely an environmental issue but a harbinger of cognitive and infrastructural collapse. Bridle explains that the Arctic, once a remote frontier, now exemplifies a positive feedback loop of destruction, as melting ice makes previously inaccessible fossil fuels available, further accelerating global warming and forcing industry to protect the very infrastructure whose operations drive the thaw. He masterfully connects seemingly disparate events, such as the Syrian conflict, to climate change, illustrating how environmental stressors exacerbate social and political instability. The narrative then shifts to the Svalbard Global Seed Vault, humanity's attempt to safeguard genetic biodiversity against potential catastrophes, only to find itself threatened by the very climate change it seeks to mitigate. Bridle underscores that the Seed Vault represents a bastion of diversity-in-knowing, essential for combating the monoculture of thought that blinds us to the world's messy incoherence. The chapter gains momentum as it transitions to the melting archaeological sites in Greenland, repositories of invaluable knowledge about past civilizations' responses to climate change, now vanishing before our eyes—a chilling echo of the Library of Alexandria ablaze. Bridle warns that the loss of this knowledge, coupled with the degradation of our cognitive abilities due to rising carbon dioxide levels, signals a looming new dark age, one where our capacity to understand and address complex challenges diminishes.
He points to the increasing turbulence in transatlantic flights as a tangible consequence of climate change, disrupting even our ability to traverse the skies smoothly. Yet, amidst this bleak outlook, Bridle offers a glimmer of hope: the network, particularly the internet, despite its misuse, holds the potential to illuminate our path forward. He advocates for embracing the network as a means of seeing, thinking, and acting, urging us to acknowledge uncertainties and contradictions, for it is through this lens that we can navigate the hyperobjects of our time and cultivate a necessary affinity with the uncertain, recognizing the impossibility of separation in an interconnected world.
Calculation
In 'Calculation,' James Bridle delves into the illusion of progress perpetuated by technological advancements, particularly in the realm of computation. Bridle begins by introducing the concept of 'steam engine time,' the idea that inventions often arise simultaneously and seemingly spontaneously, defying linear historical narratives. He uses Tim Berners-Lee's creation of the World Wide Web as a case study, illustrating how a confluence of personal history, technological advancements, and cultural dispositions made the Web's emergence seem inevitable. However, Bridle cautions against such justificatory histories, which often serve to reinforce the self-fulfilling prophecy of technological progress. The author then turns to Moore's Law, the observation that the number of components per integrated circuit doubles approximately every two years, as a prime example of a projection that has shaped and been shaped by the computing industry. This 'law,' initially an off-the-cuff observation, has become a self-perpetuating cycle, driving the relentless pursuit of faster and smaller technology, often at the expense of craft, care, and efficiency in software development. Bridle argues that this obsession with ever-increasing computational power has led to a moral and cognitive shift, where the idea of progress itself is tied to a future of plenty, excusing present-day excesses. The narrative tension escalates as Bridle introduces Eroom's Law—Moore's Law in reverse—demonstrating that despite massive investment, the rate of new drug discovery has been declining since the 1950s. This paradox reveals a deeper crisis within science, marked by increasing retractions, falling replicability, and questionable research practices like p-hacking. Bridle suggests that the reliance on vast amounts of data, as championed by proponents of big data, has led to a neglect of traditional scientific methods and a reductionist approach that fails to capture the complexity of real-world systems. 
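Moore's projection is, at bottom, compound doubling. A minimal sketch of the arithmetic (the 1965 base year and 64-component baseline are illustrative assumptions, not Moore's exact figures):

```python
def components_per_chip(year, base_year=1965, base_count=64, doubling_period=2):
    """Moore's-Law-style projection: the component count doubles every
    `doubling_period` years from an assumed baseline."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

# Two years out the projection doubles; two decades out it has multiplied
# a thousandfold: the exponential that drove (and was driven by) the industry.
```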
A sensory image emerges: scientists, once imaginative explorers, now sit before high-speed centrifuges, producing the same superposable graphs, imagination stifled by corporate finances. Bridle introduces the concept of 'overflow,' the overwhelming abundance of information that exceeds our capacity to process it meaningfully, leading to a failure of quality control and an erosion of trust in scientific research. As a resolution, Bridle offers the example of nuclear fusion research, where scientists at Tri Alpha Energy, in collaboration with Google, developed the 'Optometrist Algorithm,' a system that combines human intuition with machine learning to navigate the complex parameters of plasma containment. This approach, which relies on human judgment to guide the algorithm towards unexpected and potentially valuable results, suggests a path forward for scientific research: one that embraces the ambiguity and unpredictability of complex systems and acknowledges the limits of human conceptualization. The author concludes by emphasizing that the tools we use shape the way we think, and that a belief in the singular, inviolable answer produced by machines can blind us to the deeper cognitive pressures at work. Ultimately, Bridle calls for a critical examination of our technological trajectory, urging us to respond to the evidence in front of us and to embrace a more nuanced and holistic approach to scientific inquiry.
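The Optometrist Algorithm's core loop (propose a setting, ask "better or worse?", keep the winner) can be sketched in a few lines. This is only a toy stand-in for the published method: here a scripted distance-to-target comparison plays the role of the human plasma physicist, and every name and parameter below is a hypothetical choice.

```python
import random

def optometrist_search(prefer, start, steps=200, step_size=0.5, seed=0):
    """Propose a random perturbation of the current settings; keep it only
    if the judge prefers it. `prefer(a, b)` returns True if a beats b."""
    rng = random.Random(seed)
    current = list(start)
    for _ in range(steps):
        candidate = [x + rng.uniform(-step_size, step_size) for x in current]
        if prefer(candidate, current):  # the 'better or worse?' question
            current = candidate
    return current

# A scripted stand-in for the human judge: closer to a hidden optimum wins.
def judge(a, b, target=(3.0, -1.0)):
    dist = lambda p: sum((x - t) ** 2 for x, t in zip(p, target))
    return dist(a) < dist(b)

best = optometrist_search(judge, [0.0, 0.0])
```

Because only preferred candidates are kept, the search never gets worse than its starting point, while the judge's intuition steers it through a space too messy for a closed-form objective.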
Complexity
In 'Complexity,' James Bridle embarks on a psychogeographic journey, tracing the invisible networks of digital technology across South East England, revealing how these systems shape our social reality. He begins by mapping surveillance devices in London and electromagnetic networks guiding aircraft, then cycles from Slough, home to data centers like Equinix LD4—the virtual heart of the London Stock Exchange—to Basildon, site of the Euronext Data Center. Bridle illuminates how the virtualization of money markets has led to high-frequency trading, where firms spend millions to reduce latency, gaining millisecond advantages worth fortunes. This quest for speed and privileged access fosters dark pools, private forums where unseen predators exploit the system, costing ordinary people their savings. The author highlights the stark contrast at Hillingdon Hospital, where microwave dishes atop the building symbolize the parasitic relationship between invisible markets and essential public services. Michael Lewis's observation that the stock market has become a private viewing of a stolen work of art encapsulates this inequality, echoing Thomas Piketty's analysis of wealth disparities, where the richest control a disproportionate share, accelerated by technology. Bridle examines how automation, exemplified by Amazon's chaotic storage and tracking of workers, renders human labor computationally efficient but also incomprehensible and oppressive. He notes the spread of these efficiencies, with Uber drivers sleeping in their cars, bearing the costs of a system designed for maximum profit. The Volkswagen emissions scandal reveals how hidden technological processes can have deadly consequences, concentrating power and understanding in fewer hands. However, resistance emerges as UberEats drivers use the app to organize a strike, turning the system's logic against itself.
Bridle then explores the flash crash of 2010, a symptom of augmented markets where algorithms amplify volatility, and the vulnerability exposed by a hacked AP tweet that briefly erased billions in market value. He illustrates artificial stupidity with Zazzle's auto-generated products and Amazon's austerity nostalgia T-shirts, revealing the havoc wreaked by poorly designed systems. The author warns of the internet of things, where everyday objects become vectors for exploitation, citing examples like smart locks and lightbulbs. He cautions against accelerationist thought, which ignores the complexity driving inequality, and advocates for a hermeneutic understanding of technology, guided by Hermes, embracing ambiguity and uncertainty, recognizing that technology reflects the actual world, not an ideal one. Complexity, Bridle suggests, is not a condition to be tamed, but a lesson to be learned, a call to see beyond movement and efficiency, and to temper progress with social justice.
Cognition
James Bridle explores the perplexing world of machine learning, beginning with the apocryphal tale of the US Army's tank-detecting AI that learned to recognize sunny days instead of tanks, setting the stage for a central question: What can we truly know about what a machine knows? Bridle introduces the Perceptron, an early neural network, and the connectionist theory, championed by Friedrich Hayek, highlighting the belief that intelligence emerges from connections, mirroring Hayek's neoliberal vision of a decentralized, unbiased market. The narrative tension rises as Bridle discusses modern neural networks, like Google's AI, capable of recognizing faces and even inferring criminality, sparking ethical debates reminiscent of phrenology. The author shows how technology reifies beliefs, illustrated by cameras failing to recognize non-Caucasian faces, exposing encoded biases in data sets. Bridle then pivots to predictive policing, revealing how algorithms like PredPol use past crime data to forecast future hotspots, turning crime into a self-fulfilling prophecy, an aftershock echoing through city streets. A critical insight emerges: machines can produce decisions and consequences we don't understand, operating at scales beyond human comprehension, as seen with Google Translate's neural network creating its own multidimensional map of meaning. Bridle contrasts Deep Blue's brute force chess victory with AlphaGo's creative, yet inscrutable, moves, suggesting a realm of meta-mathematical possibility, a kind of 'Infinite Fun Space' beyond human grasp. The narrative takes a surreal turn with AI-generated faces and dream bedrooms, blurring the line between reality and simulation, culminating in a story of Google's AutoAwesome creating a false memory, a photograph of a moment that never was. Bridle reveals that the machines' dreams aren't rewriting history, but showing that history itself cannot be reliably narrated. 
The chapter finds resolution in the concept of cooperation, inspired by Kasparov's Advanced Chess, where human-machine teams outperform solo machines, a metaphor for mindful collaboration. Bridle concludes with an urgent call for an ethic of cooperation, emphasizing stewardship and universal justice in the present, acknowledging our entanglement with technology and each other, envisioning acts of justice not in a distant, computable future, but in the here and now.
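The Perceptron Bridle references fits in a few lines of code. Below is a minimal sketch of Rosenblatt's learning rule, with the logical-AND task and unit learning rate chosen purely for illustration: each misclassified example nudges the weights toward it until the linearly separable classes are split.

```python
def predict(w, b, x):
    """Fire (1) if the weighted sum clears the threshold, else stay silent (0)."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def train_perceptron(samples, epochs=10, lr=1):
    """Rosenblatt's rule: shift weights by the error on each example."""
    w, b = [0, 0], 0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(w, b, x)
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

# Logical AND is linearly separable, so the perceptron converges on it;
# intelligence, on the connectionist view, emerges from such tuned connections.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```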
Complicity
In this chapter of *New Dark Age*, James Bridle unveils a chilling landscape of weaponized secrecy and technological opacity, a world where knowledge is power, hoarded and manipulated by those in control. He begins with a personal quest, a frustrating attempt to glean information about drone usage by the London Metropolitan Police during the 2012 Olympics. This leads him to the concept of the Glomar response—"We can neither confirm nor deny"—a verbal smokescreen born from Cold War intelligence tactics, now permeating everyday discourse, a kind of political technology. The author then evokes the historical precedent of the Egyptian nilometers, instruments of scientific prediction masked as divine knowledge, used to control populations through carefully curated narratives. Bridle shines a light on the modern-day equivalents: mathematicians working in the shadows of agencies like the NSA and GCHQ, their groundbreaking discoveries shrouded in classification, lost to the public eye, like stars swallowed by the night. He argues that official secrecy is corrosive, systematically denying access to our own history and potential. A critical point emerges: the very networks meant to illuminate are used to obscure. The revelations of global surveillance, laid bare by Edward Snowden, should have sparked reform, yet have largely been met with apathy, a collective decision to not look too closely into the abyss. Like climate change, the scale of mass surveillance is too vast, too destabilizing for society to truly grasp. Bridle warns that we risk drowning in data, mistaking information for understanding, as intelligence agencies collect everything but comprehend little. He challenges the conventional wisdom that transparency is the antidote to secrecy, suggesting that transparency and surveillance share the same underlying logic, each seeking a singular truth to control the narrative. 
The author paints a picture of a world saturated with light, yet devoid of true vision, where surveillance becomes a self-fulfilling prophecy, blinding us to the very realities it purports to reveal, like a home security system that streams the destruction of one's home, a perfect metaphor for our age: universal vision, but reduced agency. The core dilemma he poses is whether increased awareness translates into remedy, or whether it merely reinforces the structures of power that thrive in the shadows. Ultimately, Bridle suggests that in this new dark age, the pursuit of absolute certainty paralyzes action, binding us to a present where everyone knows what's happening, yet nobody can do anything about it.
Conspiracy
In James Bridle's exploration of conspiracy within the digital age, he begins by illustrating the Catch-22, a concept where rational actions within irrational systems lead to paradoxical outcomes, mirroring our attempts to comprehend the overwhelming tide of information through narratives. Bridle suggests that our failure to grasp the complex world drives us to seek more information, paradoxically deepening our confusion and fueling increasingly intricate theories; thus, the digital age, with its promise of transparency, instead breeds a unique form of paranoia, where the feeling of being watched is not delusional but a reflection of pervasive surveillance. He recounts his own experience tracking mysterious surveillance planes, leading to a deeper investigation into covert activities and the fine line between legitimate security measures and unwarranted intrusion, revealing how easily public trust can erode when transparency is sacrificed for secrecy. Bridle then pivots to the phenomenon of chemtrails, a potent symbol of the schisms in mass perception, noting that while some see covert surveillance, others perceive a global conspiracy to manipulate the atmosphere, highlighting how the same skies can be interpreted in wildly different ways, each fueled by distrust and a desire for control. Bridle argues that conspiracy theories, though often dismissed, serve a vital function by bringing ignored discourses into view, acting as a distorted reflection of real anxieties about environmental ruin and technological overreach; it is as if the weather itself has become active data, a storm cloud of the Anthropocene spreading through the network and infecting the paranoid imagination. He draws a parallel with indigenous knowledge, particularly the Inuit elders' observations of a changing Arctic, whose experiences, though initially dismissed, underscore the limitations of scientific and political knowledge, and how easily direct, embodied experiences can be invalidated. 
Bridle then introduces the concept of homogenitus clouds, man-made formations that serve as a constant reminder of human impact on the environment, a visible manifestation of our entanglement with the natural world; the skies, once a source of wonder, now reflect our anxieties and the complex interplay of human actions and environmental consequences. He ultimately suggests that the proliferation of conspiracy theories is a symptom of a deeper crisis: a world of limited knowability and existential doubt, where the gray zone between provable facts and falsehoods becomes the dominant landscape, requiring a radical shift in how we perceive and engage with information, urging us to embrace the ambiguity and uncertainty that define our times, and to recognize that all our apprehensions are merely approximations of a far more complex reality.
Concurrency
In "Concurrency," James Bridle navigates the perplexing world of children's content on YouTube, revealing a landscape far removed from innocent entertainment. It begins with a seemingly innocuous scene: a child meticulously unboxing Kinder Eggs, a ritual repeated millions of times across the platform. Bridle highlights the algorithms that drive this content, creating endless loops of repetition and reward, transfixing young viewers. He draws a parallel to the Teletubbies, noting how seemingly bizarre content can create a safe, reassuring world for children, tapping into their psychological traits. Yet the algorithmic variation introduces a terrifying element, a sense of unease. The core tension emerges: the promise of reward clashes with the potential for exploitation. YouTube's recommendation algorithms, fueled by AdSense, incentivize creators to chase views, often targeting children with keyword-stuffed, nonsensical videos. Bridle illustrates this with examples like "Surprise Play Doh Eggs Peppa Pig Stamper Cars Pocoyo Minecraft Smurfs Kinder Play Doh Sparkle Brilho," a word salad designed to game the system. Here, the author underscores how easily trusted content can become a gateway to unverified and potentially harmful material, echoing the delamination of news on platforms like Facebook. Bridle introduces the disturbing trend of Finger Family videos, spawning countless automated variations that accumulate billions of views, and questions who, or what, is making them and for whom. Bots inflate view counts while channels like Bounce Patrol bring humans back into the loop, further blurring the line between human and machine and underscoring the weirdness. The author asks us to consider the implications of full automation, citing examples like Amazon phone cases and "Keep Calm and Rape A Lot" T-shirts, which highlight the system's complicity in disturbing outputs.
The constant overlaying and intermixing of different tropes breeds a growing sense of something inhuman, a digital uncanny valley. Toy Freaks, a popular channel featuring gross-out situations, exemplifies this trend, sparking controversy and raising concerns about exploitation. This amplification leads to increasingly outlandish and distorted recombinations, revealing an undercurrent of violence and degradation. Bridle explores the world of YouTube Poop and disturbing Peppa Pig videos, revealing a matrix of interactions between desires, rewards, technologies, and audiences. Ultimately, the author asks, what does it take to make these videos, and who makes them? The cheap technologies and distribution methods are put in the service of industrialised nightmare production. Bridle argues that children are being traumatized by these videos, and the network effects cause real and lasting damage, a form of abuse amplified by the internet's ability to enable latent desires. The author concludes with a sobering assessment: the exploitation encoded into these systems degrades both viewers and creators, with corporations profiting in between. This crisis is not just about bad actors or inappropriate content; it's about a violence inherent in the combination of digital systems and capitalist incentives, creating a dark age where the structures built for communication are used against us. The author references the fake news boom in Veles, Macedonia, as a microcosm of this wider cognitive crisis, and warns against seeking easy scapegoats. The heart of the matter is the impossibility of discerning truth from falsehood, manipulation from genuine expression, in this vast and complex digital landscape. Bridle emphasizes the need to comprehend these mechanisms, rather than simply reacting with paranoia or censorship, to navigate this new dark age.
Cloud
In "Cloud," James Bridle dissects the seductive optimism surrounding technology, particularly the belief that increased visibility equates to progress. He begins by recounting Eric Schmidt's assertion that smartphones could prevent atrocities like the Rwandan genocide, a claim Bridle swiftly dismantles. He points out that in both Rwanda and Srebrenica, the issue wasn't a lack of information, but a lack of will to act, painting a stark image of satellites and spy planes capturing horrors that political inertia ignored. Bridle then reveals the dark side of hyper-connectivity, describing how cell phones in Kenya fueled ethnic violence, demonstrating that technology can amplify existing prejudices. He urges us to question the uncritical acceptance of technology's neutrality, illustrating how it often reinforces the status quo. The author introduces the potent metaphor of "data as the new oil," a resource tied to imperialist exploitation, its extraction polluting our social relationships and enforcing rigid, computational thinking. Like oil, the pursuit of data is insatiable, but unlike it, data's potential for harm is unlimited, resembling atomic power in its destructive capacity. Bridle draws a chilling parallel between the Cold War's mutually assured destruction and our current state of information overload, where more data doesn't lead to clarity but to confusion and the proliferation of conspiracy theories. He suggests that our reliance on data is a kind of intellectual dead end, a Borgesian library that refuses to cohere, overwhelming our ability to make sense of the world. Ultimately, Bridle calls for a shift towards "guardianship," a concept rooted in taking responsibility for the toxic products of our technological culture, acting with justice in the present, and acknowledging the limits of computational thinking. 
He advocates for a renewed focus on the here and now, urging us to think critically about the conscious choices we make in designing and using these systems, recognizing that our agency and capacity for thought remain undiminished even in the face of overwhelming complexity.
Conclusion
Bridle's 'New Dark Age' serves as a potent warning against technological solutionism and the uncritical embrace of computation. The book argues that our relentless pursuit of knowledge and control through technology has paradoxically led to a new form of unknowing, obscuring the complex realities we seek to understand. Emotionally, the book evokes a sense of unease, prompting readers to question the narratives of progress and consider the unintended consequences of our digital creations. Its practical wisdom lies in advocating for systemic literacy, embracing ambiguity, and cultivating a critical approach to technology. We must recognize technology as a metaphor that shapes our understanding and actions. The book urges us to move beyond a reliance on data and computation, embracing human intuition, cooperation, and stewardship. It calls for a re-enchantment of technology, not through blind faith, but through a deep understanding of its limitations and a commitment to ethical design and responsible use. Ultimately, 'New Dark Age' is a call to action: to resist the allure of simplistic solutions, to confront the complexities of our interconnected world, and to prioritize social justice and environmental sustainability in the face of technological advancement.
Key Takeaways
Reflect on how the tools we use shape the way we think and avoid the belief in singular, machine-produced answers.
Technological advancement without critical understanding deepens societal divides and obscures power structures.
True systemic literacy requires understanding technology's context, consequences, and limitations, not just its functionality.
Computational thinking, the belief that all problems are solvable through computation, blinds us to the non-computable aspects of reality.
The 'cloud' metaphor obscures the physical and political realities of the internet, demanding critical investigation.
Embracing 'unknowing' and prioritizing experience over knowledge can lead to a more nuanced understanding of complex systems.
Acknowledging the inherent lack of singular intent in the network reveals the interconnectedness of all things and the need for constant reflection.
Re-enchanting technology involves recognizing our tools as metaphors that shape our understanding and engagement with the world.
Computation, like Ruskin's storm-cloud, operates as an often-invisible force shaping our understanding and experience of the world.
The ambition to predict and control all processes, exemplified by early computational thinkers, underlies the development and application of computational technologies.
Militarized computation, born from wartime research, has become deeply embedded in civilian life, influencing everything from weather forecasts to commercial systems.
Automation bias can lead individuals to prioritize automated information over their own senses and observations, potentially resulting in errors and dangerous situations.
As computation increases in complexity and pervasiveness, it risks obscuring its own limitations and biases, creating a new form of opacity.
Over-reliance on computational models can lead to the replacement of reality with flawed representations, distorting our understanding of the world.
The more obsessively we attempt to compute the world, the more complex and unknowable it appears, revealing the inherent limitations of computational thinking.
Melting permafrost demonstrates a positive feedback loop: rising temperatures unlock fossil resources whose extraction accelerates warming, forcing us to protect extraction infrastructure from the very instability it helps create.
Climate change exacerbates social and political instability, turning environmental stressors into triggers for conflict and displacement, as seen in the Syrian drought.
The Svalbard Global Seed Vault, intended as a sanctuary, faces threats from climate change, highlighting the interconnectedness of our solutions and the problems they address.
Melting archaeological sites in Greenland represent a loss of invaluable knowledge about past civilizations' adaptations to climate change, diminishing our ability to learn from history.
Rising carbon dioxide levels degrade cognitive abilities, hindering our capacity to think clearly and address the climate crisis effectively.
Increasing turbulence in transatlantic flights serves as a tangible consequence of climate change, disrupting established systems and patterns.
The network, despite its misuse, offers a potential solution by enabling us to see, think, and act collectively, embracing uncertainty and interconnectedness.
Recognize that narratives of inevitable technological progress can obscure the complex, contingent factors that shape invention.
Understand that Moore's Law, while influential, is a projection, not a law, and its pursuit can have unintended consequences.
Be aware of the limitations of relying solely on vast amounts of data, as it can lead to a neglect of traditional scientific methods and a reductionist approach.
Acknowledge the phenomenon of 'overflow,' where an abundance of information exceeds our capacity to process it meaningfully, leading to a failure of quality control.
Consider the potential of combining human intuition with machine learning to navigate complex systems and uncover unexpected results.
Cultivate a critical approach to technological advancements, recognizing their potential benefits and unintended consequences.
The relentless pursuit of speed and efficiency in financial markets, driven by high-frequency trading, exacerbates inequality by creating privileged access for those with the resources to minimize latency.
Technological opacity allows corporations to mask exploitation and environmental damage, making it difficult to perceive the wider, networked effects of individual and corporate actions.
Automation, while increasing efficiency, often leads to the degradation of human labor, transforming workers into mere algorithms in service of the machine.
The internet of things introduces new vulnerabilities, as everyday objects become potential vectors for surveillance and control, eroding privacy and autonomy.
A hermeneutic approach to technology, an interpretive stance in the spirit of Hermes, is essential for navigating complexity, embracing ambiguity, and tempering progress with social justice.
AI systems can learn unintended correlations, highlighting the need for careful data curation and awareness of potential biases.
Technology often reflects and reinforces existing societal biases, necessitating critical examination of data sets and algorithms.
Machine learning operates at scales and in dimensions that are incomprehensible to humans, leading to decisions and outcomes that lack transparency.
The rise of AI blurs the line between reality and simulation, creating new forms of unknowing and challenging our understanding of history and memory.
Cooperation between humans and machines can lead to more effective and ethical outcomes than either working alone, emphasizing the importance of mindful integration.
An ethic of cooperation and stewardship is essential for navigating the complexities of AI, promoting universal justice in the present rather than relying on future solutions.
Recognize the "Glomar response" in everyday language as a tool to evade transparency and critical inquiry.
Understand that secrecy, especially weaponized secrecy, limits our ability to know our history, understand our present capabilities, and envision alternative futures.
Be aware that the pursuit of total information can lead to information overload, hindering effective analysis and decision-making.
Question the assumption that transparency alone is sufficient to counter secrecy, as both can operate under the same logic of control and narrative dominance.
Actively resist the allure of computational thinking that reduces understanding to quantifiable data, potentially blinding us to broader contexts and human factors.
Acknowledge that surveillance does not work as claimed: exposing abuses has not curbed them and may even have legitimized them.
Challenge the demand for absolute certainty as a prerequisite for action, recognizing that waiting for complete information can lead to paralysis and inaction.
Acknowledge that the desire for simple narratives in a complex world can lead to increased confusion and paranoia, driving the need for critical evaluation of information.
Recognize that pervasive surveillance in the digital age blurs the line between rational concern and paranoia, necessitating a demand for greater transparency and accountability from institutions.
Understand that conspiracy theories, while often dismissed, reflect deeper anxieties about environmental issues and technological control, prompting a need to address these underlying concerns.
Appreciate that indigenous knowledge and embodied experiences offer valuable perspectives on environmental change, challenging the dominance of purely scientific or political viewpoints.
Embrace the gray zone of uncertainty and ambiguity in a world inundated with conflicting information, allowing for more nuanced understanding and effective action.
Be aware that algorithmic radicalization can create echo chambers of extreme and polarizing opinions, necessitating conscious efforts to seek diverse perspectives and challenge one's own biases.
Algorithmic systems, while seemingly neutral, can amplify harmful content and exploit vulnerable audiences, especially children, due to incentives that prioritize engagement over well-being.
The delamination of content from its source on platforms like YouTube and Facebook erodes trust and allows harmful or inappropriate material to seamlessly mix with trusted sources.
Automation and algorithmic amplification can lead to the creation of bizarre and disturbing content that reflects unconscious desires and societal biases, blurring the line between human and machine.
The pursuit of profit within digital ecosystems can incentivize the exploitation of both content creators and consumers, particularly children, creating a system where abuse can occur on a massive scale.
The inability to discern truth from falsehood in the digital age contributes to a broader cognitive crisis, making it difficult to understand the motives and intentions behind online content.
The internet's capacity to amplify latent desires, when combined with algorithmic targeting, can lead to violent and destructive outcomes, necessitating a critical examination of the systems we build.
Increased visibility through technology does not automatically lead to positive outcomes; the willingness to act on information is crucial.
Technology can amplify existing social divisions and violence, rather than inherently preventing them.
The belief in technology's neutrality is dangerous, as it often reinforces existing power structures and inequalities.
The metaphor of "data as the new oil" highlights the exploitative and environmentally damaging aspects of data extraction and utilization.
Our reliance on data can lead to information overload and confusion, hindering our ability to understand complex systems.
A shift towards "guardianship" is necessary, emphasizing responsibility for the consequences of our technological creations and a focus on present action.
Critical thinking and conscious choices in technology design are essential for navigating the complexities of the modern world.
Action Plan
Promote collaboration between humans and machines in decision-making processes to leverage the strengths of both.
Cultivate systemic literacy by studying the historical, social, and political contexts of technological systems.
Challenge computational thinking by actively seeking out non-computable perspectives and solutions to complex problems.
Investigate the physical infrastructure of the internet, such as data centers and undersea cables, to understand the real-world impacts of the 'cloud'.
Embrace 'unknowing' by prioritizing direct experience and intuition alongside data-driven analysis.
Reflect on the metaphors embedded in the technologies we use and how they shape our understanding of the world.
Question the assumption of inevitable technological progress and actively resist narratives that promote a linear view of history.
Seek out diverse perspectives and voices to counter the homogenizing effects of dominant technological narratives.
Engage in open and honest conversations about the challenges and opportunities presented by new technologies.
Support initiatives that promote transparency and accountability in the development and deployment of technological systems.
Actively participate in shaping the future of technology by advocating for ethical design and equitable access.
Actively question automated information and compare it against personal observations and experiences.
Seek out diverse sources of information to avoid relying solely on algorithmically curated content.
Practice critical thinking to identify the biases and limitations inherent in computational models.
Support initiatives that promote transparency and accountability in the development and deployment of computational technologies.
Advocate for policies that protect individual autonomy and privacy in the face of increasing data collection and algorithmic governance.
Cultivate skills and knowledge that are not easily automated or replaced by machines.
Prioritize human connection and social interaction over reliance on digital communication tools.
Engage in activities that foster mindfulness and self-awareness to resist cognitive shortcuts offered by automated systems.
Advocate for policies that reduce carbon emissions and promote sustainable practices.
Support organizations working to preserve and digitize at-risk historical and archaeological sites.
Reduce personal carbon footprint by making conscious choices about energy consumption and data usage.
Engage in conversations about climate change to raise awareness and promote collective action.
Cultivate a mindset that embraces uncertainty and complexity, recognizing the interconnectedness of global systems.
Support efforts to diversify and protect genetic resources, such as contributing to seed banks and conservation initiatives.
Promote the development of resilient and sustainable infrastructure that can withstand the impacts of climate change.
Seek out and share accurate information about climate change to combat misinformation and denial.
Critically evaluate narratives of technological inevitability and consider alternative perspectives.
Question the assumptions underlying Moore's Law, and consider its impact on your own work and consumption habits.
Seek out diverse sources of information and avoid relying solely on data-driven insights.
Practice mindful consumption of information to avoid being overwhelmed by 'overflow'.
Explore opportunities to combine human intuition and judgment with machine learning in your own field.
Reflect on how the tools you use shape your thinking and consider alternative approaches.
Cultivate a healthy skepticism towards technological solutions and prioritize ethical considerations.
Support initiatives that promote open science and increased transparency in research.
Trace the physical infrastructure of digital networks in your local area to understand their impact on the landscape and community.
Critically evaluate the terms of service and privacy policies of the technologies you use to understand how your data is being collected and used.
Support initiatives that promote transparency and accountability in the tech industry.
Advocate for policies that protect workers from exploitation and ensure fair wages in the age of automation.
Cultivate a hermeneutic approach to technology, questioning its claims and seeking alternative interpretations.
Critically examine the data sets used to train AI systems for potential biases and historical inequalities.
Advocate for transparency and explainability in AI algorithms, especially in high-stakes applications like criminal justice.
Develop ethical guidelines and regulations for the development and deployment of AI technologies.
Engage in ongoing dialogue and education about the societal implications of AI and machine learning.
Seek out diverse perspectives and voices in the field of AI to ensure that technology reflects a wider range of values and experiences.
Actively question official narratives and statements, especially those employing the "Glomar response."
Support organizations and initiatives that promote transparency and accountability in government and technology.
Cultivate critical thinking skills to analyze information and identify potential biases or hidden agendas.
Advocate for policies that limit mass surveillance and protect individual privacy rights.
Seek out diverse sources of information and perspectives to avoid echo chambers and groupthink.
Engage in informed discussions and debates about the ethical implications of technology and surveillance.
Take steps to protect your own digital privacy, such as using encryption and secure communication tools.
Recognize that action is possible even in the absence of complete certainty, and take steps to address pressing issues.
Actively seek out diverse sources of information and challenge your own biases.
Practice critical thinking and media literacy skills to evaluate the credibility of information.
Engage in open and respectful dialogue with people who hold different viewpoints.
Support transparency and accountability in government and corporate institutions.
Advocate for policies that address the underlying anxieties fueling conspiracy theories.
Reflect on your own emotional responses to information and consider how they might influence your beliefs.
Embrace uncertainty and complexity, recognizing that simple answers are often insufficient.
Cultivate a sense of humility and intellectual curiosity, remaining open to new perspectives and information.
Critically evaluate the content consumed by children, paying close attention to algorithmic recommendations and potential exploitation.
Support media literacy initiatives that teach children and adults how to discern truth from falsehood online.
Advocate for greater transparency and accountability from online platforms regarding their algorithms and content moderation policies.
Promote ethical design principles in the development of digital systems, prioritizing user well-being over engagement and profit.
Engage in open and honest conversations about the challenges of navigating the digital landscape and the potential for harm.
Support independent journalism and fact-checking organizations that work to combat misinformation and promote accurate reporting.
Be aware of the potential for algorithmic amplification and unconscious biases in online content, and seek out diverse perspectives.
Actively cultivate critical thinking skills and media literacy in children and young people to help them navigate the digital world safely and responsibly.
Critically evaluate claims about the inherent goodness of technology and consider potential unintended consequences.
Actively seek out diverse perspectives and challenge your own assumptions about technology's impact.
Support initiatives that promote responsible data collection and usage practices.
Advocate for policies that address the ethical implications of technology and protect vulnerable populations.
Cultivate your critical thinking skills and question the narratives presented by technology companies and media outlets.
Take responsibility for the technology you use and its potential impact on society.
Prioritize ethical considerations in the design and development of new technologies.