
Advancing Media Production

Professional Applications of Generative Artificial Intelligence

May 23, 2024


Perhaps there is no better way to introduce the topic of generative artificial intelligence than by referencing Aldous Huxley's novel "Brave New World," whose vision captures the essence of the latest wave of AI, with both its risks and its opportunities.

When examining the use of AI, it is only natural for a critical spirit to prevail. Throughout history, humanity has approached new technologies with caution, especially when they involve creating content, generating images, writing programming code, and producing new texts from internet data. In Huxley's novel, the controllers come to realize that coercion is ineffective in controlling people. Instead, they discover that temptation can subjugate individuals, compelling them to exist in an artificial society where the traditional family structure is eradicated and individual thoughts and desires are suppressed. In this society, the works of Shakespeare have been forsaken, and the only individual who yearns for what has been lost is the protagonist of the novel.

This dystopia depicts a fundamental alteration of human nature, brought about by an artificial and dehumanizing world whose inhabitants, oblivious to their own dehumanization, become unwitting captives of a fabricated reality.

In creative professions that rely on imagination and innovation, such as journalism and media, well-established norms govern practice. The influx of new technological tools, however, presents significant challenges. When incorporating these tools, professionals must remain diligent about their responsibilities, treating newsroom guidelines for their use much as a driver treats an instruction manual.

It's crucial to recognize that these tools bring both positive and negative aspects, much like any other technology. In professions centered on investigation, verification, documentation, and auditing, it is imperative to wield these tools and guide their usage responsibly.

In response to this evolving landscape, we are launching a series of articles. The inaugural piece critiques technology itself and the discourse surrounding it. The objective is to foster a balanced perspective that acknowledges the inevitability of technological integration while avoiding enslavement to it.

The Critiques of Technology: Heeding the Warnings

We will begin our exploration of a critical viewpoint on technology in the present. To understand the concerns surrounding technology's power, consider Geoffrey Hinton, often called the godfather of artificial intelligence. Hinton's vocal warnings about the growing dangers of AI development have reverberated around the world, recalling earlier expressions of regret, much like Oppenheimer's remorse following the invention of the atomic bomb decades ago.

Geoffrey Hinton's warnings

We begin with Hinton's insights, which we will return to in subsequent discussions, and then journey back in time to Martin Heidegger (1889-1976), a prominent critic of technology, and his renowned 1950s lecture, "The Question Concerning Technology."

Hinton expressed regret for his work in the field of chatbots. He described aspects of chatbots such as ChatGPT and Google's Bard, later Gemini, as "very scary." The BBC quoted him as saying, "Chatbots are not smarter than us, but I think they will overtake us soon." Hinton predicts that chatbots could soon surpass the human brain in the amount of information they can hold.

Martin Heidegger's critique

Heidegger expressed apprehension towards technology, arguing that the essence of technology does not only affect things and people but rather "attacks everything else: nature, history, human beings, and the gods." He observed that technology shrinks temporal and spatial distances, yet the haste to abolish these distances does not produce true closeness, since true closeness is not mere proximity. He emphasized that we now find it increasingly difficult to experience and comprehend this closeness, because everything is viewed and handled in an ever more technological manner; we perceive and interact with things as mere "standing reserve."

Heidegger also argues against the dehumanizing effects of technology. In his account, we come to view human capabilities solely as inputs to technological processes, reducing workers to mere instruments of production. This is evident in the way leaders and planners treat people as disposable human resources, constantly arranging and rearranging them as they see fit.

He believes that when things are presented technically, they lose their essential independence and distinctiveness. He states, "Everything that is presented technically loses its distinctive independence and form. We push other possibilities aside, hide them, or simply cannot see them." He adds, "We tend to think that technology is a means to achieve our goals and that we have control over it. But in reality, we now see means, goals, and even ourselves as replaceable and manipulable. Control and direction are now dictated by technology. Our attempts to master technology are still confined within its boundaries."

James Bridle's perspective

From Heidegger we turn to James Bridle, who offers a contemporary critique of technology focused on the information overload that can constrain humanity. In his 2018 book "A New Dark Age," the British technology researcher and writer examines how technology's growing dominance over reality obscures it rather than illuminates it.

Bridle explores the impact of data on our intelligence. He poses the question, "Does data make us more stupid?" and boldly answers that it does. For him, this paradox is a fundamental feature of life, particularly in the technology-saturated global North, forcing individuals to seek ever more efficient ways of coping with the overwhelming influx of data.

He observes that humans often feel adrift, as if floating in zero gravity without guidance and surrounded by unease. He develops this intriguing yet unsettling hypothesis with reference to the "New Aesthetic" project, which documents and draws out the significance of the traces produced by the digital world, from data centers to surveillance aircraft.

He also admits that completing this work was a struggle: he felt a kind of shame in acknowledging weakness in the face of present demands and challenges, but he insists that such feelings should not impede our ability to think or set us against one another.

Bridle argues that data is failing us, and that this failure produces the sense of shame and weakness. Our deep faith in data is proving a disappointment: according to the logic prevailing in the Western world since the Enlightenment, the vast amount of information was supposed to lead us to a brighter future. His warning about the breakdown of the link between data availability and sound decision-making therefore resonates across all facets of life.

He expresses concern about the overwhelming amount of data that can negatively impact our lives: "It is not just that data can become a burden that weighs us down and misguides us; rather, it leads to a world that disregards common sense and intuition and excessively relies on data."

Bridle provides several examples of the dangers of an absolute bias towards automation and the idolization of data. One is the occurrence of horrific airline accidents caused by overreliance on automated systems. Another involves Japanese tourists who blindly followed the instructions of their car's satellite navigation device, driving their vehicle into the sea. He also cites the testimony of William Binney, a former official at the US National Security Agency, who stated in 2016 that "99% of the communications data available to the agency is worthless." Bridle asserts that this sentiment extends to other domains as well.

The darkness that Bridle wants us to fear may appear unsettling, but it could motivate us to find a shared approach to dealing with it, one that involves neither exclusion nor enslavement. Heidegger suggested as much in his interview with "Der Spiegel," when asked, "Why must technology overpower us?" He replied, "I am not saying that we have been overpowered; rather, I am saying that we have not yet found a path that aligns with the essence of technology."

Conclusion

In this space, our perpetual endeavor is to seek consensus through rational means. While acknowledging the presence of risks, we remain vigilant to seize opportunities, endeavoring to harness and redirect potential challenges towards benefiting humanity without causing harm. Just as one cannot drive a fast car without an instruction manual, our approach to risks is deliberate and calculated.