Tech giant Apple is once again facing an unusual problem. In a recent incident, some iPhone users noticed that when they spoke the word "racist" into Apple's Dictation tool, the software transcribed it as "Trump".
The phenomenon drew widespread criticism after it spread on social media. Apple quickly responded, attributing it to a speech recognition problem in the Dictation tool and saying it was working to fix the issue. However, some technology experts believe this is not a simple glitch, and that part of the software may have been deliberately altered.
How did this problem come to light?
First, users began sharing videos of the problem on TikTok, Twitter (now X), and Reddit. In the clips, the word "racist", spoken into the iPhone's Dictation feature, is typed as "Trump." In some cases, the text briefly shows "Trump" before correcting itself to "racist," which makes the problem even more puzzling.
When the BBC and other media outlets tried to verify this, the problem appeared less and less frequently, suggesting that Apple had already begun to resolve it.
Apple's response
Apple said in a statement:
“We are aware of an issue with the speech recognition model in the Dictation feature and we are rolling out an update to fix it today.”
However, many experts find Apple's explanation unconvincing. Professor Peter Bell, a speech technology expert at the University of Edinburgh, said:
“These two words ('racist' and 'Trump') are so phonetically different that a typical AI model would never confuse one with the other.”
He added that large companies like Apple typically train their AI language models with hundreds or thousands of hours of speech data, which greatly reduces the chances of such errors occurring. So it may not just be a simple software issue.
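Professor Bell's point about phonetic distance can be illustrated with a rough sketch. Real recognizers compare phoneme sequences rather than spelling, but even a simple letter-level edit distance shows that "racist" and "trump" share almost nothing. A minimal, illustrative check in Python (not anything from Apple's system):

```python
# Toy illustration: letter-level edit distance between the two words.
# Real speech models compare phoneme sequences, but the gap is similar.
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,               # deletion
                            curr[j - 1] + 1,           # insertion
                            prev[j - 1] + (ca != cb))) # substitution
        prev = curr
    return prev[-1]

print(levenshtein("racist", "trump"))  # -> 6: every position differs
```

A distance of 6 means the words differ at every alignment position, which is the intuition behind the claim that a trained model "would never confuse one with the other."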
Apple's Dictation tool is based on AI (Artificial Intelligence) and NLP (Natural Language Processing) models that listen to human speech and convert it into text.
This process uses:
- Acoustic Model: analyzes the sound waves to work out which word was spoken.
- Language Model: uses the surrounding text to predict which word is most likely.
- Pronunciation Model: maps different ways of pronouncing a word to the correct word.
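The interplay of the first two models can be sketched in a toy decoder. All words, scores, and weights below are invented for illustration; this is not Apple's implementation. The point is that for phonetically distinct words, the acoustic score dominates, which is why experts consider an organic "racist" → "Trump" confusion implausible:

```python
import math

# Hypothetical acoustic scores: how well each candidate matches the audio.
# "racist" and "Trump" share almost no sounds, so a real acoustic model
# would assign them very different scores.
acoustic_log_prob = {
    "racist": math.log(0.90),
    "racial": math.log(0.08),    # a plausible near-homophone
    "Trump":  math.log(0.0001),  # phonetically dissimilar
}

# Hypothetical language-model scores: prior likelihood of each word in context.
language_log_prob = {
    "racist": math.log(0.05),
    "racial": math.log(0.04),
    "Trump":  math.log(0.10),    # common in news text
}

def transcribe(candidates, lm_weight=1.0):
    """Pick the candidate maximizing acoustic + weighted language-model score."""
    return max(
        candidates,
        key=lambda w: acoustic_log_prob[w] + lm_weight * language_log_prob[w],
    )

print(transcribe(["racist", "racial", "Trump"]))  # -> racist
```

Even weighting the language model heavily, the acoustic gap between the two words is far too large for "Trump" to win; an output substitution would have to come from somewhere else in the pipeline.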
In this case, experts say that "racist" becoming "Trump" is unlikely to be a routine algorithmic error. More plausibly, it stems from a deliberate change or a programming glitch elsewhere in the pipeline.
Recent controversy surrounding Apple's AI
This isn't the first time Apple's AI technology has been in controversy. Not long ago, Apple News' AI-powered summaries feature was temporarily suspended after it was found to be displaying incorrect information.
A false summary stated that world-renowned tennis star Rafael Nadal had identified as gay, which was completely false. After this incident, Apple promised to make changes to its AI model.
It is noteworthy that the association of the word "Trump" with this dictation problem has also created political controversy.
Amid a series of controversies, Apple recently announced that it will invest $500 billion in the United States over the next four years, including the construction of a major AI research center.
The impact of this incident
Apple's Dictation problems have led ordinary users to question the reliability of the technology.
Many technologists and journalists warn:
"If such mistakes are made, people's trust in AI technology will decrease. In the future, big companies should make their AI technology more transparent and reliable."
Meanwhile, some users jokingly say,
“Did Apple do this on purpose, or is this a secret message from AI?”
It remains unclear whether Apple's Dictation tool replacing the word "racist" with "Trump" was a simple technical issue or something intentional. Apple, however, has moved quickly on a fix, which already appears to be taking effect.
The incident has sparked new discussions about the reliability, political implications, and user privacy of AI technology. It remains to be seen how Apple and other tech companies will develop AI and improve the user experience in the future.