Google tests an AI tool that can write news articles: is journalism in danger?

You're probably already tired of hearing about artificial intelligence, but guess what: AI is here to stay, and it may not only make it to news headlines but also write the news itself.

The New York Times reported that Google is pitching a product that uses AI to generate news stories. The company has presented it to major publishing organizations, including The New York Times itself, The Washington Post, and News Corp., the owner of The Wall Street Journal.

This new AI tool from Google is reportedly called Genesis, or at least that seems to be the project's working title. People familiar with the matter, who spoke to The New York Times on condition of anonymity, said the tool's main purpose is to take in information and then generate news content from it.

So imagine, for example, how this article could have been written with the Genesis tool. I would have fed some news details into it, such as the source's name and some basic information about Google's new project, like its name and the fact that it uses AI, and that would be it. My job would be done; I would just have to copy and paste the AI-generated article and share it with you. I can't help but wonder: would I even be necessary in this process?

According to The New York Times' sources, Google pitched the new tool as a kind of personal assistant for journalists, one that would free up their time by automating some of their tasks. The sources also said the company sees Genesis as a responsible technology that could help the publishing industry steer away from the pitfalls of generative AI.

And it seems that Google really believes that, judging by a tweet from the Google Communications team about the story. The tweet states that the new AI tool would, for example, help journalists craft headlines or choose between different writing styles. But even if that is true and that is the goal, I wonder who will be responsible for monitoring how different publishers actually use the tool.


Misinformation is a pressing issue today, and one of the key responsibilities of journalists is fact-checking to ensure their audience is not misled. While AI is developing rapidly, we must acknowledge that it can sometimes produce incorrect or irrelevant information.

And don't get me wrong, I am fascinated by the abilities of AI tools like OpenAI's ChatGPT or Google's Bard, but several issues related to their use still need to be addressed, and one of them, for sure, is how they are trained. Using published authors' articles without permission to train an AI tool that might later replace those very authors is a bit unfair, don't you think?