Smart speaker privacy concerns spread to Korea
Published 10 September 2019
By Yeo Jun-suk

Naver, Kakao and mobile carriers admit their AI speakers collect recordings of voice commands but downplay privacy concerns, saying the collected voice data is "deidentified"

With Google, Amazon and Apple in hot water for letting their artificial intelligence speakers collect recordings of users' voice commands, similar revelations are now surfacing in South Korea, where the country's tech giants have been found gathering users' conversations with their speakers.

Naver and Kakao admitted last week that their AI-based interfaces have been collecting users' audio data and converting it into written files. KT and SK Telecom were found to have done the same to enhance their AI speakers' performance.

While the companies asserted that the aim was to improve their AI systems' performance and that they took measures to protect users' privacy, privacy advocates worry that smart speakers can eavesdrop on intimate aspects of users' personal lives.

The number of Koreans using smart speakers has been increasing. According to 2018 data from Nasmedia, KT's market research arm, about 3 million people used AI speakers that year, and the figure is expected to reach 8 million by the end of this year.

Korean companies first introduced AI-based voice command systems in 2016, when SKT unveiled its smart speaker Nugu. KT's GiGA Genie, which connects with internet protocol TV and other home devices, came next, and Naver's Clova and Kakao's Kakao Mini followed in 2017.

Until recently, the mobile carriers' smart speakers gained popularity among local consumers by connecting the AI devices to IPTV services. According to 2018 research from Consumer Insight, KT's and SKT's market shares stood at 39 percent and 26 percent, respectively.

Meanwhile, Naver and Kakao have been increasing their appeal by connecting smart speakers with internet search engines and messenger services. Naver’s Korean market share in AI speakers was 16 percent as of last year, while Kakao claimed 12 percent, according to Consumer Insight.

Can they spy on us?

Local daily Hankook Ilbo reported on Sept. 2 that Naver had recorded what people said to its AI-based voice recognition service Clova. A similar allegation was subsequently made against Kakao’s AI interface Kakao Mini.

According to the reports, Naver and Kakao had enabled their smart speakers to collect recordings of people using voice commands. The recordings were subsequently transcribed by outside contractors working at affiliates.

The companies acknowledged that they operated teams engaged in the effort, but said the recording was limited to moments when the AI speakers had been "summoned." They also asserted that the data was gathered to improve performance.

“In order to accurately assess Clova’s performance and improve its AI service capability, we store data when users make a voice command,” Naver said in a statement released Sept. 3. “Unless it is called upon, Clova does not collect any dialogue data.”

According to the company, about 1 percent of users' voice commands were recorded as they tried to communicate with the Clova AI system. Kakao said it randomly collected 0.2 percent of commands.

The recordings were anonymized, transcribed by humans and then compared with what the AI devices had recognized. The results of that comparison were fed back to the machines to improve their performance on future tasks.
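To make the process described above concrete, the following sketch in Python shows one way such a sampling-and-comparison loop could look. It is purely illustrative: the sampling rate mirrors the roughly 1 percent figure Naver cited, but the function names, data handling and word-level error measure are assumptions, not any company's actual pipeline.

import random

# Illustrative sketch only: sample a small share of de-identified commands for
# human review and measure how far the recognizer's transcript is from the
# human one. Names and the 1 percent rate are assumptions based on the figures
# reported above, not any company's actual code.

SAMPLE_RATE = 0.01  # roughly the share of commands Naver said it stores


def select_for_review(commands):
    """Randomly pick a small, already de-identified subset for transcription."""
    return [c for c in commands if random.random() < SAMPLE_RATE]


def word_errors(human_transcript, asr_transcript):
    """Word-level edit distance between the human and machine transcripts."""
    ref, hyp = human_transcript.split(), asr_transcript.split()
    dp = [[i + j if i * j == 0 else 0 for j in range(len(hyp) + 1)]
          for i in range(len(ref) + 1)]
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,
                           dp[i][j - 1] + 1,
                           dp[i - 1][j - 1] + cost)
    return dp[len(ref)][len(hyp)]


# Example: a command the speaker misheard contributes one substitution error.
print(word_errors("play some jazz music", "play sum jazz music"))  # -> 1

Aggregated over the sampled commands, such error counts would indicate where the recognition model most needs improvement, which is the feedback loop the companies describe.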

KT and SKT said they have been applying similar security measures to their smart speakers. According to the companies, only a small fraction of anonymized data was selected for voice recognition analysis by their contracted workers.

“Before our contracted workers analyzed the data, we modulated the voices and deleted personal data to prevent anyone from identifying users,” an SKT official told The Korea Herald.
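The SKT official's description suggests two de-identification steps: modulating the recorded voice and stripping personal details before human review. A minimal sketch of the second step, again hypothetical, might redact obvious identifiers such as phone numbers from transcript text; the patterns below are assumptions for illustration, not SKT's actual procedure.

import re

# Hypothetical redaction pass: mask obvious personal identifiers in a
# transcript before it is handed to contracted reviewers. The patterns are
# illustrative and far from exhaustive.

PHONE = re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b")  # Korean mobile numbers
RRN = re.compile(r"\b\d{6}-?\d{7}\b")                   # resident registration numbers


def redact(transcript):
    """Replace detected identifiers with placeholder tags."""
    transcript = PHONE.sub("[PHONE]", transcript)
    return RRN.sub("[ID-NUMBER]", transcript)


print(redact("call my daughter at 010-1234-5678"))
# -> "call my daughter at [PHONE]"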

Despite the explanations, the practice has stoked privacy concerns among smart speaker users, who point to the possibility of sensitive information being recorded, shared and even used against them.

“Now that I know what the AI speakers are capable of, I’m starting to worry about what my daughters say to them,” said Lee Yeon-soo, who keeps several smart speakers in his home. “I should figure out ways to use them more safely and smartly.”