
Professional Use of Generative Artificial Intelligence in the Media

Does AI affect jobs in newsrooms?

27 December 2024


The 2024 Reuters Digital News Report, based on a survey of 95,000 people across six continents, reveals varying comfort levels with AI in news production. Only 36% of respondents feel at ease with human-produced news assisted by AI, while a mere 19% are comfortable with AI-generated news under human supervision. Despite the potential risks, current AI integration practices in newsrooms generally do not imply a complete replacement of human journalists.

Concerns about AI adoption in journalism intensified following the "CNET crisis," in which the American technology-focused platform was accused of publishing reports generated entirely by AI, raising intellectual property concerns. In response, CNET journalists negotiated an agreement to ensure transparency about AI usage within their newsroom. The accord also safeguards journalists' right to be informed of the institution's plans for machine utilization and AI integration in content production processes.

The pace at which machines are taking over human jobs is slowing. According to the World Economic Forum's Future of Jobs report, machines currently perform 34% of all business-related tasks, while humans handle the remaining 66%. Companies surveyed have also scaled back their forecasts for automation: where they once expected 47% of tasks to be automated by 2025, they now expect 42% by 2027.

In contrast to the general use of AI technology, journalism and media present a unique scenario. Creative professions involve a human touch that machines cannot fully replicate, regardless of the amount of information, data, or statistics they are fed. Consequently, AI's role in newsrooms is currently confined to back-office tasks rather than front-facing roles. These tasks include translation, correction, fact-checking, and assistance with recommendations, statistics, and improving user experience. Although virtual voiceovers and presenters are technically feasible, significant debate persists regarding their ability to convey emotion, read news, and interact with audiences effectively.

In the realm of newsroom "heavy industry," humans maintain their pivotal role, especially for stories concerning wars, conflicts, politics, complex economic issues, investigative journalism, and content falling under the umbrella of depth and human interest. While AI tools may offer cost reduction and workflow streamlining benefits for smaller newsrooms, they fall short of creating a sustainable platform by institutional media standards. As a result, AI's role is likely to remain auxiliary, regardless of technological advancements. The production formula for AI-integrated content in newsrooms will likely persist as "human, then machine, then human," necessitating mandatory human oversight for any content utilizing intelligent technological tools.
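
To make the "human, then machine, then human" formula concrete, the sketch below models it as a minimal pipeline in which the machine step is always bracketed by human framing and a mandatory human review gate. It is purely illustrative: the names used here (Story, ai_assist, human_review) are hypothetical placeholders and do not refer to any real newsroom system or vendor tool.

```python
# Illustrative "human -> machine -> human" production formula for a newsroom.
# All names are hypothetical; this is a sketch of the workflow, not a real system.

from dataclasses import dataclass


@dataclass
class Story:
    pitch: str            # chosen and framed by a human editor
    ai_draft: str = ""    # machine-assisted intermediate material
    final_text: str = ""  # set only after human sign-off
    approved: bool = False


def ai_assist(pitch: str) -> str:
    """Stand-in for any generative tool (translation, summarisation, data analysis)."""
    return f"[machine-assisted draft for: {pitch}]"


def human_review(draft: str) -> tuple[str, bool]:
    """Mandatory human oversight: edit, fact-check, and explicitly approve."""
    edited = draft  # placeholder for real editorial work on the draft
    return edited, True  # approval is always an explicit human decision


def produce(pitch: str) -> Story:
    story = Story(pitch=pitch)               # human: choose and frame the story
    story.ai_draft = ai_assist(story.pitch)  # machine: assist with the heavy lifting
    story.final_text, story.approved = human_review(story.ai_draft)  # human: final gate
    return story


if __name__ == "__main__":
    s = produce("Automation forecasts and newsroom staffing")
    assert s.approved, "nothing is published without human sign-off"
    print(s.final_text)
```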

In 1998, the Canadian computer scientist Hans Moravec envisioned the future of work in a way that I find inspiring, despite its simplicity, because it applies to journalism as well. He imagined human tasks scattered across a landscape of plains, hills, and mountains: some sit on the plains and low hills, while others perch at the peaks. Artificial intelligence excels at well-defined, repetitive, or routine tasks whose performance can be easily assessed. These boundaries, however, remain flexible and subject to revision by new research, as experts at the RAND Corporation noted in their 2017 study on the risks of AI for security and the future of work.

The study indicates that professions in low-chaos environments are more likely to be automated by AI, with those requiring longer response times being the most susceptible, while high-chaos professions pose greater challenges to automation. The study concludes that artificial intelligence tools are essentially attention multipliers, much like "big data" and similar technologies, and warns that they can produce unexpected and dangerous systemic effects.

These artificial intelligence tools work around the paradox described by the philosopher Michael Polanyi, who observed that human experience encompasses more than what we can articulate or explain. AI systems, meanwhile, keep improving their ability to learn the desired expertise as long as data is available, even if we are unable to express that expertise ourselves.

The intertwining of creative tasks with the sequencing of procedures in producing content and presenting it to the audience can be complex enough that generative AI struggles to execute it fully. A report published by Goldman Sachs in March 2023 estimates that AI capable of generating and writing content could perform roughly a quarter of the work currently done by humans, and that the equivalent of 300 million full-time jobs in the United States and Europe could be exposed to automation.

Martin Ford, author of "Rise of the Robots: Technology and the Threat of a Jobless Future," warns, "This could be catastrophic." He believes that the change will not be limited to individuals alone but may also affect systems to some extent. However, certain tasks involving distinctive human qualities such as emotional intelligence and out-of-the-box thinking will remain untouched by AI for now.

Moving workers into skill-based jobs will help limit the substitution of robots for humans. Ford identifies three categories of work that will remain relatively insulated from this threat for the foreseeable future. "The first category is creative jobs: you are not doing organized work or merely rearranging things; you are innovating and building something new," he explains. However, he also cautions that not every role considered "creative" will be safe. Some professions, such as graphic design and the visual arts, may be among the first to be threatened, since basic algorithms can direct machines to analyze millions of images, allowing AI to master the aesthetics of the craft almost instantly. The second category Ford considers insulated from the risks of AI comprises professions built on complex personal relationships, such as nursing, business consulting, and investigative journalism.

I believe it is impossible for creative humans in journalism to be replaced by machines in any form, especially investigative journalists and data journalists who specialize in verifying information. The introduction of AI into traditional assisting roles may threaten some routine journalism jobs in the future, since the "amateurs step aside" whenever newsrooms evolve. Consequently, the journalist of the future will be the professional equipped with skills and the ability to integrate AI, tailoring technology in all its forms to serve journalistic work focused on depth, quality, and adherence to journalistic standards. While robots may assist in tasks like analyzing big data, that work carries little significance without a human mind to contextualize it. Understanding the initial intent behind choosing a specific story for development in a given media outlet, with a particular audience in mind, remains a uniquely human capability.