Google is cracking down on privacy-flouting apps

Google to use peer group analysis to stamp out privacy-flouting apps

Under the name 'peer group analysis', Google is using machine learning to root out apps in its mega-successful Play Store that treat consumer privacy lightly, or pry too much.

In a post on the Android Developers Blog, Google says it is taking this step specifically to ensure that apps are not privy to more information about the user than is strictly needed. "Creating peer groups allows us to calibrate our estimates of users' expectations and set adequate boundaries of behaviors that may be considered unsafe or intrusive. This process helps detect apps that collect or send sensitive data without a clear need, and makes it easier for users to find apps that provide the right functionality and respect their privacy.

"For example, most coloring book apps don't need to know a user's precise location to function, and this can be established by analyzing other coloring book apps. By contrast, mapping and navigation apps need to know a user's location, and often require GPS sensor access."

The search engine giant illustrated its idea with a Venn diagram showing how these peer groups are created, and deploying machine learning in one of the most exploited marketplaces is a sound move. While Apple rigorously vets every app that eventually makes its way onto its App Store, Google has a more laissez-faire policy. It is an oft-repeated saying that a listing on Google's Play Store will bring in the consumers, but it is Apple's iTunes that will make the app maker money.

There have been plenty of instances where apps on the Play Store have turned out to be trojans carrying malicious scripts, or have spied on users.

Although the machine learning work will also advance Google's own language models, it is primarily aimed at making the Play Store more watertight, privacy-wise. Google will use app metadata, such as text descriptions, and user metrics, such as installs across similar apps, and then home in on anomalous signals from apps that try to glean too much information from the user's device. "The correlation between different peer groups and their security signals helps different teams at Google decide which apps to promote and determine which apps deserve a more careful look by our security and privacy experts. We also use the result to help app developers improve the privacy and security of their apps," Google wrote further in the post.
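The general idea of peer group analysis can be sketched in a few lines: cluster apps into functional peer groups, then flag any app that requests a permission few of its peers request. The apps, permissions, and rarity threshold below are invented for illustration and are in no way Google's actual data or algorithm:

```python
from collections import Counter

def flag_outliers(peer_group, rarity_threshold=0.25):
    """Flag apps requesting permissions that few peers request.

    peer_group: dict mapping app name -> set of requested permissions.
    A permission is treated as anomalous within the group when fewer
    than rarity_threshold of the peers request it.
    """
    n = len(peer_group)
    # How many apps in the group request each permission.
    counts = Counter(p for perms in peer_group.values() for p in perms)
    flagged = {}
    for app, perms in peer_group.items():
        rare = {p for p in perms if counts[p] / n < rarity_threshold}
        if rare:
            flagged[app] = rare
    return flagged

# Hypothetical peer group of coloring book apps, echoing Google's example:
# one app requests location and contacts access its peers do not need.
coloring_books = {
    "DoodleKids": {"storage"},
    "CrayonFun": {"storage"},
    "ColorMe": {"storage"},
    "PaintPal": {"storage"},
    "SketchToy": {"storage", "fine_location", "contacts"},
}

print(flag_outliers(coloring_books))
# Only SketchToy is flagged, for fine_location and contacts.
```

Google's production system would of course learn the peer groups themselves from app descriptions and install patterns rather than take them as given, but the anomaly-detection step reduces to the same shape: compare each app's behavior against the baseline set by its functional peers.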

With ransomware attacks on the rise, it is good to see the likes of Google taking measures to curb malicious threat vectors on mobile devices, which are increasingly targeted by attackers.

Copyright Lyonsdown Limited 2020
