Google’s New Application Tools for Maps, YouTube, and Assistant Put Privacy in the Hands of Its Users

Image Source: www.iStock.com/IngusKruklitis

Just in time for National Cybersecurity Awareness Month, Google recently announced new privacy and security tools for Google Maps, YouTube, and Google Assistant. The updates give users more control over what data Google can collect, and even offer the option to delete data already gathered, such as voice recordings stored by Google Assistant.

Google Maps now includes an incognito mode that keeps the application from tracking which places you search for and where you travel, giving users more control over their privacy. Incognito mode also keeps users’ personalized recommendations from including locations that would otherwise be irrelevant. The feature is expected to roll out to Android and iOS users this month.

Image Source: Google | https://www.blog.google/technology/safety-security/keeping-privacy-and-security-simple-you/

 

YouTube is receiving an update as well: users can now choose when the app automatically deletes accumulated watch history. You can keep your history for three months or 18 months, or retain the data until you delete it manually.

Google Assistant is also getting an update that allows users to delete saved voice data. Saying phrases like “Hey Google, delete the last thing I said to you” or “Hey Google, delete everything I said to you last week” prompts Google Assistant to delete the corresponding Assistant activity. Deleting older voice data requires going into your account settings.

After it was revealed that human reviewers could listen to voice recordings for the purposes of improving voice assistants, Google, Amazon, and Apple all took action to remedy the privacy situation. Amazon, for instance, added an option that lets Alexa users choose whether their recordings can be reviewed. Two months ago, Apple announced the suspension of its Siri grading program, which similarly reviewed user audio, and said a future update would let consumers choose whether to participate in the program.

Image Source: Google | https://www.blog.google/technology/safety-security/keeping-privacy-and-security-simple-you/

 

This Google Assistant feature is expected to be released in all languages by next month; the English commands will be available this month.

Lastly, Google has released Password Checkup within its Password Manager tool. The Checkup feature notifies users if a password has been compromised in a data breach, is weak and should be strengthened, or has been reused across accounts. Google will be adding the tool to Chrome soon, but users can already take advantage of it at passwords.google.com.
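Google has not published Password Checkup’s exact protocol (it relies on a more elaborate private-set-intersection scheme), but the general idea behind breach-checking services can be sketched with a simpler k-anonymity range query, the approach popularized by the Have I Been Pwned API: the client sends only a short hash prefix, so the full password never leaves the device. The function names and the local stand-in for the breach corpus below are illustrative assumptions, not Google’s API.

```python
import hashlib

def sha1_upper(password: str) -> str:
    """Uppercase hex SHA-1 digest of a password (HIBP-style)."""
    return hashlib.sha1(password.encode("utf-8")).hexdigest().upper()

def check_breached(password: str, breached_hashes: set) -> bool:
    """Range-query sketch: only the first 5 hex chars of the hash
    would be sent over the wire; suffixes are compared locally, so
    the server never learns the full password hash."""
    digest = sha1_upper(password)
    prefix, suffix = digest[:5], digest[5:]
    # In a real service, the prefix is sent to an API that returns
    # every known-breached suffix sharing that prefix; here a local
    # set of full hashes stands in for the remote corpus.
    candidates = {h[5:] for h in breached_hashes if h.startswith(prefix)}
    return suffix in candidates

# Simulated breach corpus
breached = {sha1_upper("password123"), sha1_upper("letmein")}
print(check_breached("password123", breached))                  # True
print(check_breached("correct horse battery staple", breached)) # False
```

The privacy benefit comes from the prefix: many unrelated passwords share any given 5-character prefix, so the server cannot tell which one the client was actually checking.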

The Unsafe Aspects of Smart Assistants

Smart speakers running assistants like Alexa or Google Assistant may make life much more convenient, but how safe are the devices when it comes to malicious hacking?

Amichai Shulman, an adjunct professor at the Technion Israel Institute of Technology, put that question to the test with his computer science students more than a year ago. When he challenged his class to find security flaws in Microsoft’s voice assistant Cortana, the results were alarming. One vulnerability allowed a potential hacker to gain control of a Windows device using only voice commands, directing the device to download malware even while it was locked.

As Shulman explains, “I took undergraduate students, and in three months, they were able to come up with a whole wealth of vulnerabilities.”

As discussed at the Black Hat cybersecurity conference in Las Vegas, the professor’s class assignment highlights the risk that surrounds voice assistants as they are integrated into more homes worldwide. In the first quarter of 2018 alone, 9.2 million smart speakers shipped, the majority of them running Amazon’s Alexa or Google Assistant, and the market for such devices isn’t slowing down any time soon. Researchers expect 55 percent of US households to have a smart assistant by 2022.

Each device acts as a potential entry point that hackers can use to their advantage.

While security researchers Shulman and Tal Be’ery found these vulnerabilities in Cortana, McAfee’s researchers independently discovered the same flaws. Cortana’s security shortcomings prompted researchers to look further into the problems with voice assistants.

As McAfee’s chief consumer security advocate Gary Davis explains, “It is too ripe of an environment. There are too many of these going into homes for them not to be considered.”

Davis goes on to explain how the spread of smart assistants across homes worldwide increases the chances of attacks happening in the future.

Microsoft has already addressed the Cortana vulnerability that allowed voice-command access through locked devices, with a software update in June 2018.

Cortana isn’t the only device that comes with security flaws.

Over the past year, researchers have also looked into Amazon’s Echo, which features the voice assistant Alexa. Back in April, Checkmarx, a security testing firm, found a security flaw in the device: its researchers developed an app for Alexa, called a “skill,” that allowed potential hackers to turn the Echo into a listening device.

After Amazon was notified of the security issue, the problem was quickly resolved, and the company released a statement: “[We] take customer security seriously and we have full teams dedicated to ensuring the safety and security of our products. We have taken measures to make Echo secure.”

Last September, researchers in China found that commands could be sent to voice assistants without users’ knowledge by using ultrasonic frequencies that cannot be heard by the human ear.

Symantec’s principal threat researcher, Candid Wueest, comments that more security issues will arise even as these reported vulnerabilities are fixed.

“Skill and actions are probably one of the most prevalent attack vectors we’ll see,” he explains. “There will be others that can be found in the future that we probably haven’t even heard of yet.”

Among Shulman’s discoveries on Cortana’s security shortcomings, the device could even be made to browse to non-secure websites through voice commands alone; because such pages lack encryption, a hacker could then carry out an attack.

While Microsoft may have fixed the problem, Shulman still found a loophole in the tech giant’s security updates: by phrasing the voice commands differently, he could still make the smart assistant browse to those non-secure sites.

He explains how “instead of saying ‘Go to BBC.com’ [one] would say, ‘Launch BBC,’ and it would open the non-SSL site in the background,” referring to the non-secure website. He goes on to say how he was “able to find many, many sentences that repeat[ed] the same behavior.”

Voice assistant technology brings with it a great deal of potential for attacks by cybercriminals. Such devices may even be able to send payments soon, as developers have expressed interest in implementing that skill, which would only sharpen the attention of cybercriminals looking to exploit the system’s vulnerabilities.

Nowadays voice assistants can be used on almost any device, from our televisions to our cars, our phones, and even our bathrooms. This poses a risk to us as users, especially as we grow more comfortable using them. The more we integrate them into our lives, “the more our guard will be dropped,” as Davis explains.

For this reason, Shulman suggested that not every device needs to have voice command control.

“You take a concept that is very helpful with handheld devices, and you try to replicate it,” as he explains. “In which, it is not extremely helpful, and as we’ve shown, [it can become] very dangerous.”
For more information and to view the original article, please click here.