
Code of conduct key for responsibly harnessing AI in newsrooms

Jun 04, 2023 - Last updated at Jun 04, 2023

Artificial intelligence (AI) tools are rapidly transforming the journalism landscape, offering new possibilities and challenges alike for newsrooms worldwide. 

However, a recent survey conducted by the World Association of Newspapers and News Publishers (WAN-IFRA) in collaboration with Germany-based Schickler Consulting reveals a stark reality: while half of newsrooms are actively working with AI tools, only 20 per cent have established guidelines for their use. This disparity underscores the urgent need for agreed-upon rules and a code of conduct to ensure ethical practices and maintain journalistic integrity.

One of the primary concerns highlighted in the survey is the potential impact of AI tools on journalists’ jobs. A staggering 82 per cent of respondents expressed concern that increased usage of AI would change newsrooms and affect their roles. This sentiment underscores the significance of establishing guidelines that clearly define the boundaries and responsibilities of AI tools in journalism. Without such rules in place, uncertainty and fear may permeate the industry, leading to resistance and hindering progress.

Furthermore, the survey indicates that a majority of publishers currently adopt a relaxed approach, with journalists granted the freedom to use AI technology as they see fit. While this flexibility may foster experimentation and innovation, it also poses risks. Without proper guidelines, the responsible and ethical use of AI tools may be compromised, leading to concerns such as inaccuracies, plagiarism, copyright infringement, and issues related to data protection and privacy.

By implementing agreed-upon rules and a code of conduct, media organisations can effectively address these concerns. Such guidelines would ensure that AI tools are used responsibly, upholding the core principles of journalism.

Key areas that should be covered include transparent labelling and verification of AI-generated content, mitigation of bias and manipulation, preservation of editorial independence, protection of privacy and data ethics, and the fostering of public trust.

Establishing rules and guidelines for AI tools in journalism not only mitigates risks but also enables the industry to leverage the full potential of these technologies. With clear boundaries, journalists can effectively navigate the use of AI tools, drawing on their capabilities in tasks such as data analysis, content generation, simplified research, text correction and workflow improvement. Rather than replacing journalists, AI tools should augment their capabilities, enabling them to focus on higher-level tasks that require creativity, critical thinking and investigative journalism.

Moreover, the implementation of guidelines will contribute to building public trust. Transparency, accountability and responsible use of AI tools are vital for maintaining the credibility of the media. By adhering to agreed-upon rules, media organisations can demonstrate their commitment to ethical practices and reassure the public that AI is being harnessed for the benefit of accurate and reliable news reporting.

To achieve this, the development of AI policies should go hand in hand with staff training and open communication. Journalists and media professionals need to be educated about the responsible use of AI tools, including their limitations and potential pitfalls. This shared knowledge will foster a culture of responsible AI usage and enable journalists to utilise these tools effectively while upholding ethical standards.

The WAN-IFRA survey results clearly indicate the necessity of agreed-upon rules and a code of conduct for AI tools in journalism. The overwhelmingly positive attitude towards generative AI tools, combined with the concerns and uncertainties expressed by industry professionals, underscores the urgency of establishing clear guidelines. By doing so, the media industry can harness the potential of AI tools while safeguarding journalistic integrity, building public trust and embracing a future where technology and human journalism can harmoniously coexist.

This also requires intensified training to strengthen the skills and capacities of media outlets and media professionals so they can deploy AI tools optimally while preserving the core skills of professional journalism: fact-checking, reporting from the field, conveying the stories of communities and going beyond the obvious. Even as AI brings benefits to the field, professionalism and transparency in journalism still depend on distinctly human skills.
