Anadolu has prepared guidelines for the ethical use of artificial intelligence in the media, responding to the need to bridge journalism's fundamental principles with new technological possibilities.
The rapid advancement of technology is driving the adoption of AI applications in the media sector. AI is transforming the industry across a broad spectrum, from data processing to content creation, from personalized news feeds to automatic language translation.
To support the healthy integration of AI into the media sector, Anadolu organized a forum titled Managing Artificial Intelligence in Media at AAtolye.
The forum brought together academics and journalists from Türkiye’s esteemed universities and media organizations, as well as industry representatives, to discuss the use of AI in the media and the ethical principles to be followed.
Following the forum, the Guidelines for the Ethical Use of Artificial Intelligence in Media were prepared. The guidelines aim to provide a framework that Anadolu and other media organizations can follow when using AI technologies.
The guidelines, adaptable to changes driven by ongoing technological advances and the dynamic needs of the media industry, are based on Anadolu’s extensive journalistic expertise and the globally recognized principles governing the use of AI in the media field.
The guidelines, formulated with input from the forum participants and Anadolu News Academy, comprise 10 key principles.
Protection of journalistic principles
Media organizations do not compromise professional journalistic principles when using AI. They strictly adhere to the press ethics that require journalists to present news in an accurate, impartial, and ethical manner.
Use of AI algorithms in media
AI algorithms are used effectively in many areas of media, such as news personalization, content recommendations, language processing, and the analysis of audio and images. These technologies raise ethical and legal issues, including data privacy, content manipulation, and impartiality. Media organizations therefore use algorithms consistent with professional journalistic principles and press ethics, and ensure impartiality in the AI applications they deploy. They avoid algorithms that exploit humanity's fundamental values or individuals' personal weaknesses.
Fidelity to truth, verification mechanisms
Media organizations support applications that contribute to the fight against disinformation and fake news. They bear in mind that AI outputs may be filtered and may contain false, biased, or discriminatory information. They remain cautious about misleading information and establish verification mechanisms. They avoid any technological applications (such as deepfakes) that distort the truth, and they allow journalists to exercise personal initiative within the process to guard against the dangers of disinformation and manipulation.
Social benefits, readers' rights
The use of AI in the media sector primarily aims to protect social benefits and convey information more effectively. Media organizations therefore use AI applications not to increase circulation, ratings, or views, but to help news reach readers, viewers, and listeners quickly, transparently, and accurately.
Respect for human dignity, privacy
Protecting individuals' privacy and personal information is a fundamental human right and must be respected. Media organizations therefore fully respect data privacy and the right to privacy in the use of personal data, prioritizing consent. They avoid any AI applications that could harm human dignity. They license and cite the sources of real human profiles and voices used for AI-based digital twins (digital avatars) in accordance with copyright laws, and they promote fair use by respecting copyright owners and meeting legal requirements.
Editorial framework against bias
Potential biases in AI applications can affect the opinions expressed. Media organizations therefore use these technologies carefully and ensure that opinions expressed by AI are supervised and verified by human editors to reduce possible biases.
Sustainable journalism
AI offers significant opportunities for speeding up processes and automation in the media sector. However, this automation should not diminish the value of human creativity. In this context, media organizations regulate the use of AI in creative fields such as news production, cartooning, writing, photojournalism, visual direction, and graphic design, minimizing negative impacts on employment and protecting journalists' rights. Sustainable journalism promotes balanced technological integration in the media industry by preserving human capabilities and creativity.
Preservation of human-produced information, diversity
The share of data produced by AI in AI applications is constantly increasing. To sustain the richness of content, media organizations maintain a reasonable level of original, human-produced information in AI applications. They also pay special attention to the diversity of information and sources. This approach preserves content quality and builds a more robust information foundation by balancing AI-produced data with human creativity.
Legal responsibility, transparency, copyrights
Media organizations contribute to clearly defining the legal framework for content produced by AI, respecting transparency and copyright. In this context, they use a logo to indicate that content is produced by AI and clearly state AI's contribution and the sources used through footnotes and notices.
Legal process in use of AI
The use of AI in media does not yet have a clear legal framework, owing to the rapid development and changing nature of the technology. Until clear universal and national rules and guidelines for the use of AI are established, media organizations follow the interim regulations and ethical rules that have been prepared.