
Apple Addresses Siri’s Audio Grading: New Policy Will Let Users Opt In to Quality Assurance

Apple introduced Siri back in 2011 with the iPhone 4S, when a built-in virtual assistant was still a relatively new concept. It was quite primitive at the time and served mostly as a gimmick. Today the platform has evolved considerably, and although it doesn’t compete all that well with the likes of Google Assistant, it gets the job done.

Siri was introduced on the iPhone 4S back in 2011

What matters most with virtual assistants is response time and how natural the interaction feels, so real-time processing is key. To maintain that, many companies (including Apple) sample audio captured when users say, “Hey Siri!” The recordings are used for quality assurance, helping to monitor performance and prevent hiccups. Recently, however, Apple acknowledged that there have been instances where the phone mistook something else for the Siri wake command and started recording, meaning people’s regular, everyday conversations were captured and their privacy compromised. According to the company, Apple contractors listen to these recordings for grading and quality control. As reported by 9to5Mac, Apple has now issued a statement saying it will suspend the practice.

In light of recent privacy lapses involving smart assistants from Google, Amazon, and now Apple, the trillion-dollar company has decided to halt the grading program altogether. Users worried that Siri’s quality will suffer need not be: Apple is suspending the practice only for the time being, until it can put a replacement in place.

As Google has done previously, Apple will allow users to choose whether to take part in the grading program, making it the user’s decision to share their recordings. Even with this opt-in in place, there may still be instances where the software glitches and records conversations it shouldn’t, but informing users and asking for their consent changes the dynamic entirely. For now, users need not worry: their privacy remains protected, and there is still time before the new policy is finally implemented.

