After apologising in August for using human reviewers to grade Siri voice dictation and commands, and subsequently suspending the practice, Apple appears to have simply waited for the dust to settle before resuming it.
Although this kind of thing is common in the technology industry, it undermines the Cupertino-based company’s attempts to portray itself as a trustworthy guardian of people’s personal information.
Apple’s CEO Tim Cook has on several occasions reiterated the firm’s conviction that “privacy is a fundamental human right.”
In fact, Apple used that very phrase in its apology to the Apple community.
What has changed is that this time, Apple at least asks device owners whether to allow audio review and storage when they install iOS 13.2. Those who enable it but change their minds later can still disable it in the device settings.
Apple also insists that it does not associate Siri data with any particular user ID.
Tech firms maintain that the practice helps them improve their AI services.
Using people rather than machines to review audio recordings nevertheless alarms privacy experts, because it creates the risk of a rogue contractor or employee leaking private user information or sensitive conversations.
Other technology firms have also restarted the practice after giving their users notice.
Google, for example, resumed the practice last month after also implementing a new system to ensure that users know what they are signing up for if they agree.
Last month, Amazon also announced that users of its Alexa digital assistant would in future be able to have all of their voice command recordings deleted automatically.
Apple had earlier revealed plans to resume human reviews, but it did not say when.
It did, however, promise to no longer use outside contractors to do these reviews.