Google claims that it doesn't matter whether content is human-written or AI-generated, as long as the content is good. But the search giant might be measuring it regardless. SEO expert Juan González Villa found references to an AI classification value called "racterScores" in leaked Google search documents. The value supposedly indicates how likely it is that a website's content is AI-generated (AGC), a classification that sits alongside labels such as user-generated content (UGC). The name "Racter" could be a nod to an early text-generating chatbot of the same name from the 1980s, a distant precursor of ChatGPT. According to Villa, the Racter mention and the AGC classification appear in the "Model.QualityNsrNsrData" module, which seems to contain multiple ratings and labels for entire websites.

OpenAI CTO Mira Murati and Microsoft AI CEO Mustafa Suleyman recently made statements suggesting that they undervalue the human work that powers their AI models. In an interview at her alma mater, Dartmouth College, Murati said that generative AI might eliminate some creative jobs, but that "maybe they shouldn't have been there in the first place." Suleyman referred to content on the open web as "freeware." Responding directly to Murati's remarks in the student newspaper The Dartmouth, student Will Elliott offered an apt analogy:

"A GenAI architect suggesting that some creative jobs shouldn’t have existed in the first place is like a person saying they’d be better off if their parents were never born."