Tag Archives: privacy

Personalized offers without compromising the privacy of personal information


Searching the internet for information is one of the most common activities these days, ranging from looking up the closing time of your favorite shop to finding out how cognitive processes work. Consumers generate and share data whenever they search the web, yet they are not always aware of this. And even if consumers knew their steps were being followed, would it change their daily behavior? Privacy is not a new term, but it has come to the fore more than ever in the last couple of decades, with the rise of the Internet. The classic definition is the right to be let alone, based on the principle of protecting the individual in both person and property (Warren & Brandeis, 1890). This definition, however, is too broad for today's context; informational privacy fits better: the right to control access to personal information (Moor, 1991).
It is clear that privacy concern is an issue for companies, which want to use personal information to offer personalized adverts, something consumers often appreciate. A business example is Amazon. The well-known e-commerce company continually updates each user's personal page, based on past purchases and browsing history, to create a more tailored experience and stimulate impulse buys (Reverte, 2013). Personalization, however, raises privacy concerns.


Sutanto, Palme, Tan and Phang (2013) wrote an article studying the so-called personalization-privacy paradox: the tension between how IT developers and marketers of applications exploit users' personal information to offer personalized products or services, and those users' increasing concern about the privacy of that information. Eventually, this tension may restrain the use of these applications. The purpose of the paper is to test whether a personalized, privacy-safe application works: one that stores and processes information within the user's smartphone without transmitting it to marketers. This way, personalized content can be offered without compromising the privacy of personal information (Sutanto et al., 2013).

Personalized privacy-safe application
To understand the personalization-privacy paradox, the authors build on two theories: uses and gratifications theory (UGT) and information boundary theory (IBT). UGT suggests that consumers use a medium either for the experience of the process or for the content it offers, two dimensions called process gratification and content gratification (Sutanto et al., 2013). While the latter refers to the messages carried by the medium, the former relates to the enjoyment of using the medium.

Next, IBT clarifies which factors influence process and content gratification. This theory suggests that consumers form so-called physical or virtual information spaces around themselves. These spaces have boundaries, and an attempt by third parties to cross these boundaries is considered invasive and makes consumers uncomfortable. In case of an intrusion, consumers apply a risk-control assessment, weighing the risk of disclosing personal information against the benefits they gain by doing so (Stanton, 2003).

The study conducted a field experiment with three mobile advertising applications. The first broadcasts adverts indiscriminately (the non-personalized application). The second filters and displays adverts based on the profile information of users, stored on a central server (the personalized, non-privacy-safe application). The third filters and displays adverts based on the profile information of users, stored on their smartphone (the personalized, privacy-safe application). In this context, process gratification is measured by the number of application launches. Content gratification, on the other hand, is measured by the frequency of saving adverts, since a user interested in the content offered by the application is more likely to save an advert in order to retrieve it later (Sutanto et al., 2013).
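The core idea of the third design can be sketched in a few lines. This is an illustrative simplification, not the authors' actual implementation: the server broadcasts all adverts to every device, the user's profile lives only on the smartphone, and filtering happens locally, so nothing about the user's interests ever reaches the marketer.

```python
# Sketch of a local, privacy-safe personalization design (all names are
# illustrative assumptions; the paper does not publish its implementation).
from dataclasses import dataclass, field


@dataclass
class Advert:
    title: str
    categories: set


@dataclass
class LocalProfile:
    """Stored only on the smartphone; never transmitted to marketers."""
    interests: set = field(default_factory=set)


def filter_adverts(broadcast, profile):
    """On-device filtering: keep only adverts matching the local profile."""
    return [ad for ad in broadcast if ad.categories & profile.interests]


# The server broadcasts ALL adverts, so it learns nothing from what is shown.
broadcast = [
    Advert("Running shoes sale", {"sports", "fashion"}),
    Advert("Guitar lessons", {"music"}),
    Advert("Protein bars", {"sports", "food"}),
]
profile = LocalProfile(interests={"sports"})
personalized = filter_adverts(broadcast, profile)
print([ad.title for ad in personalized])  # → ['Running shoes sale', 'Protein bars']
```

The trade-off of this design is bandwidth: every device receives the full advert set so that the selection step can stay on the client.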

The results of the field experiment show that process and content gratification do indeed differ between the three applications. Process gratification increased by 64.5% when adverts were personalized compared to general adverts, but there was no significant difference in content gratification. This may be explained by the fact that saving an advert explicitly indicates interest in a specific product, and thus requires the user to reveal deeper levels of information than their own boundaries allow; the resulting discomfort likely leads to hesitation in saving adverts. The local, privacy-safe personalization design, by contrast, increased both process and content gratification. Application use rose by 9.6% compared to the personalized, non-privacy-safe application and by 79.1% compared to the non-personalized application; advert saving rose by 24.5% and 55.1%, respectively.

However, the paper has an important limitation: some users may have launched the application because they were already interested in a certain advert, which makes it harder to disentangle process gratification from content gratification.

In conclusion, the article proposes a personalized, privacy-safe application. The results show significant differences between the three applications in favor of the local, privacy-safe design. Offering personalized adverts without compromising the privacy of personal information is thus possible.

References:

Moor, J.H. (1991) ‘The ethics of privacy protection’, pp. 69–82.

Reverte, C. (2013) Personalization Innovators: Amazon, Netflix, and Yahoo! Available at: https://www.addthis.com/blog/2013/08/28/personalization-innovators-amazon-netflix-and-yahoo/#.WomB3OciHIU [Accessed 18 February 2018].

Stanton, J. M. (2003). Information technology and privacy: A boundary management perspective. In Socio-technical and human cognition elements of information systems (pp. 79-103). Igi Global.

Sutanto, J., Palme, E., Tan, C. H., & Phang, C. W. (2013). Addressing the personalization-privacy paradox: An empirical assessment from a field experiment on smartphone users. MIS Quarterly, 37(4).

Warren, S.D. and Brandeis, L.D. (1890) ‘The right to privacy’, Harvard Law Review, 4(5), pp. 193–220. doi: 10.2307/1321160

CLIQZ – Human Web Against Data Giants


Sick of giving away your data for free to data giants like Google and Facebook? Do you want control over your browser history, search terms, and advertisements when surfing the internet? Are you annoyed by having to scroll through a list of ads first when you enter a term in Google Search? Then CLIQZ might be a very interesting alternative for you.

CLIQZ combines a search engine and a browser. A quick search is performed directly in the browser, without having to go to a separate search engine: search terms or website names are entered directly into the address bar. As in other browsers, a selection of website suggestions appears, without you having to leave the current page.

The Human Web

The heart of the search engine is its algorithm, called Human Web, which ranks search results by their relevancy rather than by the content, structure, and linking of websites, as is done under search engine optimization. With the Human Web, CLIQZ makes use of the wisdom of the crowd, its community of users: the search algorithm weighs data about people's behavior on the web more heavily than the technical analysis of websites, and the more data collected, the better the algorithm becomes. But now you might think: “How is that different from Chrome or Firefox? They also collect my data.” In contrast to other search engines, which build complete and detailed profiles of their users, CLIQZ works only with anonymous statistical data (CLIQZ Human Web, 2017).
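The crowd-ranking idea can be illustrated with a toy sketch (a hypothetical simplification, not CLIQZ's actual Human Web algorithm): each record is an anonymous (query, url) event carrying no user identifier, so results can be ranked by aggregate popularity without any per-user profile existing anywhere.

```python
# Toy sketch of crowd-based ranking from anonymous statistics (names and
# data are illustrative assumptions, not CLIQZ's real pipeline).
from collections import Counter


def rank_results(anonymous_events, query):
    """Rank URLs for a query by how often the crowd selected them.

    Events are bare (query, url) pairs: no user ID is ever stored,
    so no individual browsing profile can be reconstructed.
    """
    counts = Counter(url for q, url in anonymous_events if q == query)
    return [url for url, _ in counts.most_common()]


events = [
    ("weather", "weather.example"),
    ("weather", "weather.example"),
    ("weather", "news.example"),
    ("recipes", "cooking.example"),
]
print(rank_results(events, "weather"))  # → ['weather.example', 'news.example']
```

The privacy property comes entirely from what is *not* collected: because the events are unlinkable, adding more of them sharpens the ranking without sharpening any picture of a single user.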

Recent Development

CLIQZ recently hit the headlines with its acquisition of Ghostery, a browser plug-in and mobile app that lets users easily detect and control JavaScript “tags” and “trackers”, which collect users' browsing habits via cookies and also enable more sophisticated forms of tracking such as canvas fingerprinting. With this acquisition, CLIQZ wants to increase its user base (AdAge, 2017).

Business Model Evaluation

Although the concept of CLIQZ is widely described as very promising, given the global shift towards privacy awareness (EY Privacy Trends Report, 2016), this does not automatically translate into business success. Currently, CLIQZ is a start-up backed by renowned investors, namely Hubert Burda Media and Mozilla. As stated on its website, CLIQZ has not yet decided how it will generate revenue (CLIQZ Support, 2017), but any model should be compatible with the respect it has for its users' privacy and with the core benefits of its product (direct, fast, clearly structured). Making money like conventional browsers, mainly through advertisements and search royalties, therefore does not fit its strategy.

The individual features of CLIQZ and Ghostery are not new to consumers. If you do not want to be tracked, other options are already available on the market: add-ons such as Privacy Badger block trackers and cookies, the Tor browser offers even more anonymity, and to avoid search engines such as Google, Startpage, for example, is a privacy-friendly alternative.

However, CLIQZ seamlessly integrates these individual features into one encompassing solution. Beyond this, its most useful differentiator is its relevancy-based quick search. This combined functionality is currently the main reason for consumers to consider using CLIQZ as their default browser.


References

AdAge. Cliqz, a Mozilla-Backed Search Engine, Buys Privacy Extension Ghostery. Retrieved March 5, 2017 from http://adage.com/article/digital/cliqz-a-mozilla-backed-search-engine-buys-ghostery/307980/

CLIQZ. Human Web. Retrieved March 5, 2017 from https://cliqz.com/en/whycliqz/human-web

CLIQZ. Support. Retrieved March 5, 2017 from https://cliqz.com/en/support

EY Privacy Trends Report (2016). Can privacy really be protected anymore? Retrieved March 5, 2017 from http://www.ey.com/Publication/vwLUAssets/ey-can-privacy-really-be-protected-anymore/$FILE/ey-can-privacy-really-be-protected-anymore.pdf

Managing Consumer Privacy Concerns in Personalization: A Strategic Analysis of Privacy Protection


In the digital age we live in, one of the major concerns is the protection of our privacy. Research shows that 90% of online consumers either do not disclose any personal information to companies at all, or disclose it only to companies committed to fully protecting their privacy (Taylor, 2003). Companies, on the other hand, need as much customer data as possible in order to provide effective personalized product recommendations.

The study by Lee, Ahn et al. delves into this topic with a very interesting research method: using game theory, the authors study the impact of firms' autonomous privacy-protection decisions on competition, pricing, and social welfare. Additionally, the research sheds light on how a regulated environment for privacy protection affects social welfare.

The three main findings:

  1. Asymmetric protection mitigates competition. In simple words, when firms in a given market differ in the privacy-protection measures they implement, the firm with the strongest privacy-protection policy can increase its profitability by gaining access to a wider pool of consumer data. This happens simply because that firm gives customers the confidence to share their personal data with it.
  2. Firms' privacy-protection strategies should be based on two criteria:
  • Investment cost of protection. Implementing a privacy-protection policy incurs costs, such as infrastructure, personnel, and training costs.
  • Size of the personalization scope. This refers to the pool of customers about whom a company possesses personal information, and to whom it can therefore offer personalized products or services.
  3. Regulation is socially desirable. According to the research, this holds because, although firms' autonomous decisions improve social welfare in general, they redistribute the benefits between firms and customers, with firms enjoying the benefits and customers becoming worse off.

Clearly, many stakeholders are involved when it comes to privacy-protection decisions. This research provides a robust foundation regarding the factors managers should take into account when deciding on their firm's privacy-protection policy. For academia, it connects privacy in a personalization setting with equilibrium points of competition in a market setting. Finally, regulators have one additional assurance that introducing privacy-protection legislation will be beneficial for society.

For all interested parties to evaluate the findings of this study, it should be underlined that, to calculate the market equilibrium, the authors used the notion of firm and consumer privacy calculus. This notion assumes that consumers are perfectly capable of weighing the benefits and costs of disclosing their personal information, and firms of deciding whether to implement a protection strategy. This might not always be the case, however, since research has shown that many biases affect these processes.


References

Sutanto, J., Palme, E., Tan, C. H., & Phang, C. W. (2013). Addressing the personalization-privacy paradox: An empirical assessment from a field experiment on smartphone users. MIS Quarterly, 37(4), pp. 1141-A5. Business Source Premier, EBSCOhost, viewed 16 February 2017.

Taylor, H. (2003). “Most People Are ‘Privacy Pragmatists’ Who, While Concerned about Privacy, Will Sometimes Trade It Off for Other Benefits,” The Harris Poll #17, Harris Interactive, New York, March 19. Available at http://www.harrisinteractive.com/harris_poll/index.asp?pid=365

Is it normal that my >insert body part here< hurts?


Sharing your symptoms with PatientsLikeMe

Have you ever been sick and googled your symptoms? There's a good chance the Internet told you that you might have something far worse than your doctor did, or that your little headache means you might have a brain tumor… Or have you ever been sick and found there was no one nearby to talk to and ask which symptoms are normal or what you could expect?


Does privacy still exist?


Is it still possible to maintain privacy nowadays when purchasing products from webshops? There is a lot of debate on this subject, and privacy issues have often been in the headlines lately. Almost everyone in America and Europe uses the search engine Google, the most visited website in the entire world. As you can imagine, this makes it an attractive target for cybercriminals: in the past, Google has been hacked and the personal information of more than 300,000 website owners has been leaked (Kirk, 2015). Facebook, one of the largest social media platforms in the world, has also leaked personal information: the personal information of over 6 million user accounts was exposed, caused by a bug in the platform (Guarini, 2013). Remember Edward Snowden? Some call him a traitor, but I would call him a hero.


A rational perspective on the privacy issues when considering using location-based services


Big data and data collection are often seen in a negative light, as public attention to big data gathering usually brings unwanted attention to organizations. The other side of the story is that such data collection is usually the result of organizations wanting to deliver personalized services more effectively. When users become skeptical about sharing their personal and private data, organizations provide an (additional) incentive to mitigate the perceived risk. In the case of recent developments in mobile shopping services, Xu et al. [1] note that customers balance the perceived value against the perceived risk of sharing private information. A simple example: a customer with an empty stomach, an empty fridge at 10pm, and a connected smartphone. Will he give his location to a mobile service in order to find food-ordering opportunities, or will he not? Will he value the chance of finding food less than his location information at that hour? You decide.

Furthermore, Xu et al. [1] found that the usage of location-based services is correlated with monetary incentives: individuals are more willing to disclose their location to such services when offered a financial incentive. The incentive is often given in the form of some future saving, implying there is money to be gained on future expenses, typically as discounts on related services or rebates. Some skeptics agree to share their personal data in exchange for such an incentive; when asked to rationalize this, some claim that ‘the risk is worth the gain’, while others state that they have ‘serious concerns’ about sharing their information. If you think the former group is non-existent, please consider the example of the guy with the empty stomach again.

Location-based services are services that require more personal and private information in order to function better. The so-called personalization-privacy paradox is the epitome of this statement: the better the service an individual wants or requires, the more personal information he has to be willing to share. Xu et al. found that using personalized services can help individuals overcome their privacy concerns. Addressing the paradox, the authors imply that customers who are more knowledgeable about the service they require make a more rational decision. Customers with high privacy concerns towards personalized services are less inclined to consider using them, will automatically consider alternatives (the hungry customer could use the ‘service’ of asking his physical neighbor for information), and are therefore not part of the targeted demographic, Pappas et al. [2] imply. In addition, if location-based services give consumers the option to control the use of their personal information, this mitigating effect might tempt critics to use the service after all [3], although future research would have to investigate this in more detail.

In the end, the rule of thumb is: “when (information) services are offered for free, you are paying with your personal data”. Some people are okay with this, and that is… okay.


Disclaimer: Although largely based on the article by Xu et al. [1], the opinion presented in this article does not portray the sentiment of the paper itself. The opinion rests solely with the author and with none of the authors cited here. Critics are free to comment below, and are encouraged to do so.

References
[1] Xu, H., Luo, X. R., Carroll, J. M., & Rosson, M. B. (2011). The personalization privacy paradox: An exploratory study of decision making process for location-aware marketing. Decision Support Systems, 51(1), 42-52.
[2] Pappas, I. O., Giannakos, M. N., & Chrissikopoulos, V. (2012, June). Personalized services in online shopping: Enjoyment and privacy. In Information Society (i-Society), 2012 International Conference on (pp. 168-173). IEEE.
[3] Schwaig, K. S., Segars, A. H., Grover, V., & Fiedler, K. D. (2013). A model of consumers’ perceptions of the invasion of information privacy. Information & Management, 50(1), 1-12.