
Google’s “New” Terms of Use and Privacy Policy: Old Exploitation and User Commodification in a New Ideological Skin

On March 1st, 2012, Google changed its terms of use and privacy policy. What has changed? Has anything changed at all?

Google’s general terms of service, valid from April 16, 2007, until the end of February 2012, applied to all of its services. They thereby enabled the economic surveillance of a diverse multitude of user data collected from various services and user activities for the purpose of targeted advertising: “Some of the Services are supported by advertising revenue and may display advertisements and promotions. These advertisements may be targeted to the content of information stored on the Services, queries made through the Services or other information”.

Google specified in its old privacy policy (valid from October 20, 2011, until the end of February 2012) that the company “may collect the following types of information”: personal registration information, cookies that store “user preferences”, log information (requests, interactions with a service, IP address, browser type, browser language, date and time of requests, cookies that uniquely identify a user), user communications, location data, and a unique application number. Google said that it was using cookies for “improving search results and ad selection”, which is only a euphemism for saying that Google sells user data for advertising purposes. “Google also uses cookies in its advertising services to help advertisers and publishers serve and manage ads across the web and on Google services”. To “serve and manage ads” means to exploit user data for economic purposes. The Google Ads Preferences Manager displays the user interests and preferences that are collected by the use of cookies and used for targeted advertising.
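To make the mechanism concrete, the following is a minimal Python sketch of how a preference cookie can accumulate inferred interests and then drive ad selection. All names – the keyword-to-category map, the ad inventory, the cookie fields – are invented for illustration and do not describe Google’s actual implementation.

```python
# Hypothetical sketch: a preference cookie is filled with inferred interest
# categories and later used to pick a targeted ad.

AD_INVENTORY = {
    "travel": "Cheap flights to Barcelona",
    "cars": "New electric SUV - test drive now",
    "sports": "Running shoes sale",
}

def update_preference_cookie(cookie: dict, search_query: str) -> dict:
    """Append inferred interest categories to the (simulated) cookie."""
    keyword_to_category = {"flight": "travel", "hotel": "travel",
                           "suv": "cars", "marathon": "sports"}
    for keyword, category in keyword_to_category.items():
        if keyword in search_query.lower():
            cookie.setdefault("interests", []).append(category)
    return cookie

def select_ad(cookie: dict) -> str:
    """Serve an ad targeted to the most frequent inferred interest."""
    interests = cookie.get("interests", [])
    if not interests:
        return "Generic ad"
    top_interest = max(set(interests), key=interests.count)
    return AD_INVENTORY.get(top_interest, "Generic ad")

cookie = {"id": "user-123"}          # unique identifier stored in the browser
update_preference_cookie(cookie, "cheap flight to Barcelona")
update_preference_cookie(cookie, "hotel near the beach")
print(select_ad(cookie))             # -> "Cheap flights to Barcelona"
```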

Google’s old privacy policy specified that Google “uses the DoubleClick advertising cookie on AdSense partner sites and certain Google services to help advertisers and publishers serve and manage ads across the web”. Google used DoubleClick – a commercial advertising server, owned by Google since 2007, that collects and networks data about usage behaviour on various websites, sells this data, and helps provide targeted advertising – for networking the data it holds about its users with data about these users’ browsing and usage behaviour on other web platforms. There was only an opt-out option for this form of networked economic surveillance, to which Google’s privacy policy provided a link. Opt-out options are rather unlikely to be used because in many cases they are hidden inside long privacy and usage terms and are therefore only really accessible to knowledgeable users. Many Internet corporations avoid opt-in advertising solutions because such mechanisms can drastically reduce the number of users participating in advertising. That Google helped advertisers to “serve and manage ads across the web” means that it used the DoubleClick server to collect user behaviour data from all over the WWW and to use this data for targeted advertising. Google’s exploitation of users is not limited to its own sites; its surveillance process is networked and spreads out, trying to reach all over the WWW.
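The networked character of this surveillance can be illustrated with a small, hypothetical Python sketch of a third-party ad server: the same cookie ID is sent back from every site that embeds the server’s ads, so visits to unrelated websites end up in one profile unless the user actively opts out. The class, URLs and interest rules are illustrative assumptions, not DoubleClick’s real architecture.

```python
# Hypothetical sketch of third-party (ad-server) tracking across publisher sites.

from collections import defaultdict

class AdServer:
    def __init__(self):
        self.profiles = defaultdict(list)   # cookie_id -> list of visited URLs
        self.opted_out = set()              # cookie IDs that have opted out

    def track(self, cookie_id: str, page_url: str) -> None:
        if cookie_id in self.opted_out:     # opt-out only helps if the user found the setting
            return
        self.profiles[cookie_id].append(page_url)

    def targeted_segments(self, cookie_id: str) -> set:
        """Naive interest inference from cross-site browsing history."""
        segments = set()
        for url in self.profiles[cookie_id]:
            if "travel" in url:
                segments.add("travel")
            if "finance" in url:
                segments.add("finance")
        return segments

server = AdServer()
server.track("dc-cookie-42", "https://news.example/travel/greece")
server.track("dc-cookie-42", "https://blog.example/finance/loans")
print(server.targeted_segments("dc-cookie-42"))   # e.g. {'travel', 'finance'}

server.opted_out.add("dc-cookie-42")              # later visits are no longer recorded
server.track("dc-cookie-42", "https://shop.example/travel/deals")
```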

The analysis shows that Google makes use of privacy policies and terms of service that enable the large-scale economic surveillance of users for the purpose of capital accumulation. Advertising clients of Google that use Google AdWords can target ads, for example, by country, exact location of users and distance from a certain location, the language users speak, the type of device used (desktop/laptop computer or mobile device, specifiable), the mobile phone operator used (specifiable), gender, or age group.
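A hedged sketch of how such targeting criteria could be matched against a user profile built from surveillance data is given below; the field names and values are purely illustrative and do not reflect AdWords’ actual data model.

```python
# Hypothetical sketch: an ad campaign is eligible for a user only if every
# targeting criterion matches the user's surveillance-derived profile.

campaign = {
    "country": "UK",
    "language": "en",
    "device": "mobile",
    "gender": "female",
    "age_group": "25-34",
}

user_profile = {
    "country": "UK",
    "language": "en",
    "device": "mobile",
    "gender": "female",
    "age_group": "25-34",
    "interests": ["travel", "sports"],
}

def matches(campaign: dict, profile: dict) -> bool:
    """True if the profile satisfies all targeting criteria of the campaign."""
    return all(profile.get(key) == value for key, value in campaign.items())

if matches(campaign, user_profile):
    print("Show campaign ad to this user")
```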

On January 25, 2012, the EU released a proposal for a General Data Protection Regulation that defines a right of individuals not to be subject to profiling, which is understood as “automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour“ (article 20, 1). Targeted advertising is such a form of profiling. According to the planned article 20, 2 (c), profiling is allowed if the data subject consents under the conditions of article 7, which says that if consent is given as part of a written declaration (such as a website’s terms of use or privacy policy), the “consent must be presented distinguishable in its appearance from this other matter“ (article 7, 2). The regulation furthermore proposes a right of citizens to be forgotten (article 17), which also requires that third parties be informed and asked to erase the same data (article 17, 2), and a right to data portability (article 18), which for example means that all personal data must be exportable from Facebook to other social networking sites. A further suggested provision is that by default only the minimum of data necessary for achieving the purpose of processing is collected and stored (article 23). Fines of up to 1 000 000 Euros or up to 2% of a company’s annual worldwide turnover are foreseen (article 79). The EU regulation to a certain extent limits targeted advertising through the right to be forgotten and the special form in which consent must be given; it does, however, not make targeted advertising a pure opt-in option, which would be a more effective way of protecting consumers’ and users’ privacy.
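How “data protection by default” (article 23) could work in practice can be sketched as follows. The field names and consent mechanism are illustrative assumptions, not the regulation’s prescribed implementation: by default only the data strictly needed for the service is stored, and every additional category requires separate, explicit consent in the sense of article 7.

```python
# Hypothetical sketch of "privacy by default": required data is stored,
# optional data only with the user's explicit, separate consent.

REQUIRED_FIELDS = {"email"}                      # minimum needed to run the account
OPTIONAL_FIELDS = {"location", "browsing_history", "contacts"}

def store_user_data(submitted: dict, consents: set) -> dict:
    """Keep required fields plus only those optional fields the user consented to."""
    stored = {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}
    stored.update({k: v for k, v in submitted.items()
                   if k in OPTIONAL_FIELDS and k in consents})
    return stored

submitted = {"email": "a@example.org", "location": "Vienna",
             "browsing_history": ["..."], "contacts": ["..."]}
print(store_user_data(submitted, consents=set()))          # {'email': 'a@example.org'}
print(store_user_data(submitted, consents={"location"}))   # email + location only
```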

As a result of the announcement of the EU Data Protection Regulation, Google overnight announced the unification of all its privacy policies and a change of its terms of use. The use of targeted advertising is no longer defined in the terms of use, but in the privacy policy: “We use the information we collect from all of our services to provide, maintain, protect and improve them, to develop new ones, and to protect Google and our users. We also use this information to offer you tailored content – like giving you more relevant search results and ads”. Google presents its new policies as a major privacy enhancement: “a simpler, more intuitive Google experience. […] we’re consolidating more than 60 into our main Privacy Policy. Regulators globally have been calling for shorter, simpler privacy policies – and having one policy covering many different products is now fairly standard across the web” (http://googleblog.blogspot.com/2012/01/updating-our-privacy-policies-and-terms.html).

The core of the regulations – the automatic use of targeted advertising – has not changed. The European Union does not require Google to base targeted ads on opt-in consent. Google offers two opt-out options for targeted ads: one can opt out of basing targeted ads on a) search keywords and b) visited websites that feature Google ads (Ads Preferences Manager, https://www.google.com/settings/ads/preferences/).
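The opt-out model can be sketched as follows: personalisation signals are switched on by default and are only dropped if the user finds and changes the settings. The setting names are illustrative, not the actual options of the Ads Preferences Manager.

```python
# Hypothetical sketch of an opt-out model: both personalisation sources are
# active unless the user explicitly disables them.

ads_settings = {
    "personalise_on_search_keywords": True,    # default: enabled
    "personalise_on_visited_websites": True,   # default: enabled
}

def ad_signal_sources(settings: dict) -> list:
    """Return the data sources currently used for ad personalisation."""
    sources = []
    if settings["personalise_on_search_keywords"]:
        sources.append("search keywords")
    if settings["personalise_on_visited_websites"]:
        sources.append("websites with Google ads")
    return sources

print(ad_signal_sources(ads_settings))   # both sources are used unless the user opts out
```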

In the new privacy policy, “user communications” are no longer mentioned separately as collected user information. Rather, content is defined as part of log information: “Log information. When you use our services or view content provided by Google, we may automatically collect and store certain information in server logs. This may include: details of how you used our service, such as your search queries”. Search keywords can be interpreted as the content of a Google search. The formulation that log information covers how one uses a service is vague. It can be interpreted to also include all types of Google content, such as the text of a Gmail message or a Google+ posting.
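A hypothetical sketch of such a server-side request log illustrates the point: the “log information” category contains the search query itself, i.e. user-generated content, alongside purely technical metadata. Field names and values are invented for illustration.

```python
# Hypothetical sketch of a request log entry of the kind described above.

import datetime

def log_request(query: str, ip: str, user_agent: str, cookie_id: str) -> dict:
    """Build a log record that mixes technical metadata with the query content."""
    return {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "query": query,                 # the content of the search, not just metadata
        "ip_address": ip,
        "user_agent": user_agent,
        "cookie_id": cookie_id,         # uniquely identifies the browser
    }

entry = log_request("symptoms of diabetes", "203.0.113.7",
                    "Mozilla/5.0", "pref-cookie-99")
print(entry["query"])   # the logged "usage detail" is the user's own content
```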

In the new privacy policy, Google says: “We may combine personal information from one service with information, including personal information, from other Google services – for example to make it easier to share things with people you know. We will not combine DoubleClick cookie information with personally identifiable information unless we have your opt-in consent”. This change is significant and reflects the EU Data Protection Regulation’s third-party provision in the right to be forgotten (article 17, 2). Whether DoubleClick is still used for Google’s targeted ads therefore largely depends on how extensively and aggressively Google tries to get users to opt in to DoubleClick. The effect is that Google will no longer be able to automatically use general Internet user data collected by DoubleClick. However, the unification of the privacy policies and the provision that information from all Google services and all Google ads on external sites can be combined allows Google to base targeted advertising on user profiles that contain a broad range of user data. The sources of user surveillance are now mainly Google services. As Google spreads its ad service all over the web, this surveillance remains networked and spread out. Google tries to compensate for the limited use of DoubleClick data for targeted advertising with an integration of the data that it collects itself.
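The logic of this cross-service combination can be sketched as follows: data from several services is merged into one profile by default, while DoubleClick browsing data is added only if an opt-in flag is set. Service names, signals and the function itself are illustrative assumptions, not Google’s actual code.

```python
# Hypothetical sketch of cross-service profile combination with a separate
# opt-in gate for third-party (DoubleClick-style) browsing data.

def combine_profiles(service_data: dict, doubleclick_data: list,
                     doubleclick_opt_in: bool) -> dict:
    profile = {"signals": []}
    for service, signals in service_data.items():       # combined by default
        profile["signals"].extend((service, s) for s in signals)
    if doubleclick_opt_in:                               # only added with opt-in consent
        profile["signals"].extend(("doubleclick", s) for s in doubleclick_data)
    return profile

service_data = {
    "search": ["cheap flights"],
    "youtube": ["marathon training"],
    "gmail": ["hotel booking confirmation"],
}
doubleclick_data = ["news.example/travel", "shop.example/shoes"]

print(combine_profiles(service_data, doubleclick_data, doubleclick_opt_in=False))
```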

Concerning the use of sensitive data, both the old and the new privacy policy specify: “We require opt-in consent for the sharing of any sensitive personal information”. In addition, the new policy says: “When showing you tailored ads, we will not associate a cookie or anonymous identifier with sensitive categories, such as those based on race, religion, sexual orientation or health”. Targeted ads, however, use data from all Google services, including content data.

The proposed EU Data Protection Regulation says that the processing of sensitive data (race, ethnicity, political opinions, religion, beliefs, trade-union membership, genetic data, health data, sex life, criminal convictions or related security measures) is forbidden, except if the data subject consents (article 9). Google continues to use content data (such as search queries) for targeted advertising based on algorithms that automatically classify interests. As a large number of search keywords from one individual is collected, the likelihood that he or she can be personally identified increases. Search keywords are furthermore linked to IP addresses that make the computers of users identifiable. Algorithms can never perfectly analyze the semantics of data. Therefore, the use of sensitive data for targeted advertising cannot be avoided as long as search queries and other content are automatically analyzed. Google’s provision that it does not use sensitive data for targeted ads stands in contradiction to the fact that it says it uses “details of how you used our service, such as your search queries”.
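Why automated classification struggles to exclude sensitive data can be illustrated with a deliberately naive keyword classifier: even with a filter for directly sensitive categories, indirect or ambiguous queries let sensitive information leak into the targeting profile. The keyword map and categories are invented for illustration and do not describe Google’s actual classifiers.

```python
# Hypothetical sketch: a keyword-based interest classifier with a filter for
# sensitive categories still lets sensitive interests leak in indirectly.

SENSITIVE_CATEGORIES = {"health", "religion", "sexual_orientation", "race"}

KEYWORD_MAP = {
    "insulin": "health",
    "church": "religion",
    "running": "sports",
    "gluten free": "food",   # looks harmless, but often correlates with a health condition
}

def classify(query: str) -> set:
    """Map a query to interest categories and drop directly sensitive ones."""
    categories = {cat for kw, cat in KEYWORD_MAP.items() if kw in query.lower()}
    return categories - SENSITIVE_CATEGORIES

print(classify("buy insulin online"))     # set() -- filtered out
print(classify("gluten free recipes"))    # {'food'} -- passes, yet may reveal a health condition
```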

The overall changes introduced by Google’s new privacy policies and terms of use are modest; the fundamentals remain unchanged: Google uses targeted advertising as a default. DoubleClick is now less likely to be used for targeted advertising. Google has unified its privacy policies. Whereas Google presents this move as providing more transparency (“We believe this new, simpler policy will make it easier for people to understand our privacy practices as well as enable Google to improve the services we offer”, http://googleblog.blogspot.com/2012/01/updating-our-privacy-policies-and-terms.html), it also enables Google to base its targeted ads on a wide range of user data that stems from across all its services.

Google claims that it does not use sensitive data for targeted ads, which is contradicted by the definition of content data as log data that can be used for targeted ads. The new main privacy policy is around 30% longer than the old one (the version from October 20, 2011, had 10 917 characters). The main privacy terms have thereby grown in complexity, although the number of privacy policies that apply to Google services was reduced from more than 70 to one.

Google presents its updated terms of use and privacy policies as new, although no fundamental improvements in user privacy protection can be found. The “change” is an ideological marketing strategy aimed at maintaining the stability of the exploitation of the labour of users, which generates value and yields Google’s profits, amounting in 2011 to $8.5 billion (http://www.forbes.com/global2000/#p_1_s_arank_ComputerServices_All_All). Google continues to automatically collect, analyse and commodify a multitude of user data that is generated by searches and the use of Google services. The Marxist communication scholar Dallas Smythe wrote in 1981: “For the great majority of the population […] 24 hours a day is work time. […] [Audiences] work to market […] things to themselves”. For the great majority of Internet users, most of Internet use is (value-generating) labour time. Internet users work on Google and other corporate platforms to market things to themselves and are transformed into an Internet commodity that is sold to targeted advertising clients in order to accumulate capital in the amount of billions of Euros.

In a response letter to the EU Article 29 Data Protection Working Party (concerning Google’s updated policies and terms; see http://www.edri.org/book/export/html/1225), Google’s Global Privacy Counsel Peter Fleischer writes that “we are not selling our users’ data”. One wonders where Google’s US$ 8.5 billion in profits come from, if not from the commodification of the data that result from users’ activities.

The EU Article 29 Data Protection Working Party asked the French National Commission for Computing and Civil Liberties (CNIL) to analyse Google’s new policies. In a letter to Google, CNIL expressed deep concern and said that “our preliminary analysis shows that Google’s new policy does not meet the requirements of the European Directive on Data Protection […] Moreover, rather than promoting transparency, the terms of the new policy and the fact that Google claims publicly that it will combine data across services raises fears about Google’s actual practices. Our preliminary investigation shows that it is extremely difficult to know exactly which data is combined between which services for which purposes, even for trained privacy professionals. In addition, Google is using cookies (among other tools) for these combinations and in this regard, it is not clear how Google aims to comply with the principle of consent laid down in Article 5(3) of the revised ePrivacy Directive, when applicable. The CNIL and the EU data protection authorities are deeply concerned about the combination of personal data across services: they have strong doubts about the lawfulness and fairness of such processing, and about its compliance with European Data Protection legislation”. Big Brother Watch reports that only 12% of Google users have read the new policy and that 65% are not aware that the changes have now come into effect. The initiative says: “Google is putting advertiser’s interests before user privacy and should not be rushing ahead before the public understand what the changes will mean”.

According to the proposed new EU Data Protection Regulation (http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm), Google’s exploitation of users is perfectly legal. That it is legal does, however, not mean that we cannot consider Google’s commodification of users a violation of user/consumer/Internet workers’ privacy, but rather that the EU’s suggested legal provisions do not provide enough protection for users. The only way forward is to legally require all Internet companies (and companies in general) to make targeted advertising an opt-in option, which would give users and consumers more control. Implementing such a provision requires not only courage, it also requires not being afraid of organised business interests. It is, however, the only way of putting privacy interests first. Today, profit stands above privacy protection and therefore above people. Google is one of the best examples of this circumstance. Google’s “new” privacy policy is not new at all and would consequently best be renamed “privacy violation policy” or “user exploitation policy”.
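Why the default matters can be shown with a simple, hypothetical calculation: under an opt-out regime, profiling applies to everyone who never touches the settings; under an opt-in regime, it applies only to those who actively consent. The assumption that 5% of users ever change a default setting is purely illustrative.

```python
# Hypothetical sketch of the effect of defaults on how many users are profiled.

USERS = 1_000_000
CHANGE_DEFAULT_RATE = 0.05   # assumed share of users who ever change a default setting

profiled_under_opt_out = USERS * (1 - CHANGE_DEFAULT_RATE)   # default: profiled
profiled_under_opt_in = USERS * CHANGE_DEFAULT_RATE          # default: not profiled

print(f"opt-out default: {profiled_under_opt_out:,.0f} users profiled")
print(f"opt-in default:  {profiled_under_opt_in:,.0f} users profiled")
```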

Related publication:
Fuchs, Christian. 2011. A contribution to the critique of the political economy of Google. Fast Capitalism 8 (1). http://www.uta.edu/huma/agger/fastcapitalism/8_1/fuchs8_1.html

Related stories:
Google’s Privacy Policy Changing For Everyone: So What’s Really Going to Happen? The Huffington Post, 29.2.2012, http://www.huffingtonpost.com/2012/02/29/google-privacy-policy-changes_n_1310506.html

Thomas Gideon and James Losey: The Real Problem with Google’s New Privacy Policy. http://www.slate.com/articles/technology/future_tense/2012/02/google_privacy_policy_the_missing_opt_out_isn_t_the_only_problem_.html

Google answers Article 29 Working Party on data protection standards. European Digital Rights, http://www.edri.org/book/export/html/1225

9 in 10 People Haven’t Read Google’s New Privacy Policy. Big Brother Watch UK, http://www.bigbrotherwatch.org.uk/home/2012/02/ten-people-havent-read-googles.html#more-4205

France Says Google Privacy Plan Likely Violates European Law. New York Times, 28.2.2012, http://www.nytimes.com/2012/02/29/technology/france-says-google-privacy-plan-likely-violates-european-law.html

Googles neuer Daten-Schmu [Google’s new data trickery]. Der Spiegel Online, 29.2.2012, http://www.spiegel.de/netzwelt/web/0,1518,818105,00.html
