Tag Archives: privacy

How PSD2 is changing the financial industry


After the development of internet banking and mobile banking, the European Union has now paved the way for open banking. ‘Open Banking’ is the relatively new umbrella term for opening the bank to other parties so they can access customer data (Courbe, 2018). The Payment Service Directive (PSD), in force since 2007, was recently revised with the aim of spurring innovation and strengthening consumer protection by increasing security and transparency through enhanced know-your-customer capabilities, identity validation, and fraud detection (Brodsky & Oakes, 2017). The new European legislation, the Payment Service Directive 2 (PSD2), which became applicable in January 2018, sets the banking industry in motion by shifting the authority to share data from financial institutions to bank customers through the ‘access to account’ rule (European Commission, n.d.). Under PSD2, large financial institutions may move to the background, maintaining the back-end systems, while digital ‘giants’ can extend their close customer relationships by fulfilling specific customer needs with digital value-added services on top of the bank, leading to more competition, digital payment methods and lower transaction costs for consumers (McKinsey, 2016). These digital giants, like Amazon and Google, are now able to directly access bank customers and collect the final piece of data that was not accessible before. This could lead to end-to-end solutions that complete the circle of services offered by these parties (PwC, n.d.).

The Payment Service Directive 2 (PSD2)

Two new categories of licences are created: Payment Initiation Service Providers (PISPs), which enable third parties, if permission is granted, to directly initiate payments at the bank on behalf of the customer; and Account Information Service Providers (AISPs), which allow multiple accounts at various banks to be combined into one interface (Deloitte, 2016). By establishing a single legal framework for payments within the EU, cross-border payment transactions can be made as easy, efficient and secure as domestic payments in Europe (European Commission, n.d.). In this way, the directive lowers entry barriers to the payment market, thereby increasing competition. Efficiency is achieved through standardization of rules, which results in lower transaction costs and improved financial services. However, new entrants must meet strict technical requirements set by the European Banking Authority. The customer-centric legislation aims for increased security and transparency of Third-Party Service Providers (TPSPs) as well as banks towards customers. Non-banking solutions can be offered as well, such as payments via digital channels like social media (Noctor, 2018).
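
To make these two roles more concrete, the sketch below shows how a licensed third party might call a bank’s PSD2 interface: one function aggregates account information (the AISP role) and another initiates a payment on the customer’s behalf (the PISP role). The base URL, endpoint paths and consent token are hypothetical placeholders rather than any specific bank’s API; real implementations follow standards such as the Berlin Group NextGenPSD2 specification and add strong customer authentication on top.

```python
import requests

# Hypothetical PSD2-style endpoints; real banks expose standardized APIs
# (e.g. Berlin Group NextGenPSD2) and require eIDAS certificates and strong
# customer authentication in addition to the consent token shown here.
BANK_API = "https://api.examplebank.eu/psd2/v1"
CONSENT_TOKEN = "customer-granted-consent-token"  # obtained after the customer gives consent

HEADERS = {"Authorization": f"Bearer {CONSENT_TOKEN}"}


def aggregate_accounts():
    """AISP role: read account information from the bank and combine it into one view."""
    response = requests.get(f"{BANK_API}/accounts", headers=HEADERS, timeout=10)
    response.raise_for_status()
    return [
        {"iban": acc["iban"], "balance": acc["balance"]}
        for acc in response.json()["accounts"]
    ]


def initiate_payment(debtor_iban, creditor_iban, amount_eur):
    """PISP role: initiate a credit transfer at the bank on behalf of the customer."""
    payload = {
        "debtorAccount": {"iban": debtor_iban},
        "creditorAccount": {"iban": creditor_iban},
        "instructedAmount": {"currency": "EUR", "amount": str(amount_eur)},
    }
    response = requests.post(f"{BANK_API}/payments/sepa-credit-transfers",
                             headers=HEADERS, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["paymentId"]


if __name__ == "__main__":
    print(aggregate_accounts())
    print(initiate_payment("NL00EXAM0123456789", "NL00SHOP9876543210", 25.00))
```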

Before giving third parties their consent, customers first have to trust them. Yet consumers may not attach the same value and sensitivity to certain data elements as banks and regulators do (Brodsky & Oakes, 2017), as they can be blinded by the benefits that a TPSP’s payment service provides. The customer-centric regulation thus results in a cost-benefit trade-off: the ability to use more efficient and improved bank services, while putting one’s own privacy at risk. New consumer-payment relationships in the financial industry raise the need for a better understanding of how to build consumer trust over the internet. Are bank customers willing to share their personal financial information with TPSPs in return for improved financial services or personalized financial applications? In other words, do the benefits outweigh the risks of sharing your financial data? The following paragraphs explain the advantages and disadvantages of PSD2 along with related developments in banking.

Personalization of the financial industry

In today’s world, personalization in e-commerce is a must rather than a nice-to-have. The financial industry will follow the e-commerce sector by responding to consumers’ financial needs through new types of payment services delivered by third parties’ interfaces on top of banks’ existing data and infrastructure. PSD2 enables, for example, PayPal to provide additional services on top of a bank’s infrastructure, so that the bank customer barely interacts with their own banking institution. This can threaten banks, since PayPal can access multiple bank accounts of a customer, if consent is given, and can thus collect more information on that customer. This information can then be used to fulfill customers’ needs, allowing banking services to be offered in a more personal way. In contrast to banks, third parties have the benefit that they can specialize in specific consumer needs, since they do not carry the burden of meeting all of them (Deloitte, 2016). However, third parties still need to be granted access to banks’ systems through APIs (Application Programming Interfaces) that financial institutions provide for interacting with bank accounts. Although the customer-centric, mobile and swift nature of TPSP services conflicts with how banks traditionally operate, banks have the opportunity to differentiate between basic and advanced APIs in order to generate a new stream of revenue. Banks expect to face the most significant challenge not from new digital banks or fintechs, but from consumer tech giants such as Google, Facebook and Apple. Apart from end users’ financial information, these firms already had access to almost every other piece of personal information available on the internet.

Another important asset of banks is the reputation and institutional trust they have gained over the years. Although the image of some banks has been harmed in recent years (Volkskrant, 2018), banks invest heavily in security because their reputation is at stake. TPSPs, on the other hand, do not possess a similar security foundation, because until now it was not possible for them to access bank customers’ accounts or initiate payments on their behalf. In the online context, moreover, uncertainty increases because users are not aware of the consequences of sharing personal financial information, including account information, obtained financial services and transaction data. Consumers are not likely to share highly sensitive data because of perceived privacy concerns stemming from the invisible nature of the online environment (Culnan & Armstrong, 1999). E-commerce and consumers are thus confronted with more payment options, but this does not necessarily raise confidence in the payment system, because too much fragmentation among providers can also increase uncertainty among consumers. This, in turn, can hurt conversion at online retailers, while potential market saturation and regulatory burdens pose further challenges for TPSPs (Deloitte, 2016). The customer-centric legislation may in the end not be so customer-centric, given the potential market saturation and the corresponding privacy concerns in an uncertain online environment.

The rise of ‘Digital Giants’ in banking

Currently, Google’s primary payment product is Google Pay, a digital wallet that offers a limited number of financial services. As of January 2019, Google has been granted the new payment licence in Ireland and Lithuania (Finextra, 2019), which enables it to access bank customers’ accounts and initiate payments on their behalf. Although Google has not published any new service ideas yet, efficiency gains can be made by providing convenient interfaces and features that banks do not offer. These potential services can be combined or linked with existing products, resulting in end-to-end solutions. In this way, digital giants are empowered to complete the circle of services they offer. In sum, banks have their brand image and in-house security foundations as strategic assets, whereas Third-Party Service Providers (TPSPs) have the flexibility to adapt quickly to customers’ needs. Instead of entering the red ocean, banks can leverage their assets and strengthen their position by collaborating with fintechs and digital giants. Especially in the uncertain online environment, where risk is inherent, trust becomes an important factor. Collaboration is the solution in the customer-centric world of today. Google has already announced that it prefers to work with banks instead of continuing by itself. Under PSD2, Europe puts the customer first and customer protection is the number one priority. It is a starting point for change in the traditional financial industry.

References

Brodsky, L. & Oakes, L., 2017. Data sharing and open banking. [Online] Available at: https://www.mckinsey.com/industries/financial-services/our-insights/data-sharing-and-open-banking [Accessed 18 February 2018].

Courbe, J., 2018. Building ‘Open Banking’ on a Platform of Trust. ABA Banking Journal, pp. 38-39.

Culnan, M. J. & Armstrong, P. K., 1999. Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation. Organization Science, 10(1), pp. 104-115.

Deloitte, 2016. Anticipating the challenges and opportunities of the PSD2. Inside, June, pp. 60-65.

European Commission, n.d. Payment services. [Online] Available at: https://ec.europa.eu/info/business-economy-euro/banking-and-finance/consumer-finance-and-payments/payment-services/payment-services_en [Accessed 18 February 2018].

Finextra, 2019. Google gets payments licence in Ireland. [Online] Available at: https://www.finextra.com/newsarticle/33167/google-gets-payments-licence-in-ireland [Accessed 22 February 2019].

McKinsey, 2016. Technology innovations driving change in transaction banking. [Online] Available at: https://www.mckinsey.com/industries/financial-services/our-insights/technology-innovations-driving-change-in-transaction-banking [Accessed 18 February 2019].

PwC, n.d. PSD2 stimuleert slimme authenticatiemethoden banken. [Online] Available at: https://www.pwc.nl/nl/themas/blogs/psd2-stimuleert-slimme-authenticatiemethoden-banken.html [Accessed 17 February 2018].

Volkskrant, 2018. Minister wil schandalen zoals bij ING voorkomen en scherpt beloning van bankiers verder aan. [Online] Available at: https://www.volkskrant.nl/nieuws-achtergrond/minister-wil-schandalen-zoals-bij-ing-voorkomen-en-scherpt-beloning-bankiers-verder-aan~b37cfd78/?referer=https%3A%2F%2Fwww.google.com%2F [Accessed 22 February 2019].


Personalized offers without compromising the privacy of personal information


Searching the internet for information is one of the most common activities these days. This behavior ranges from searching for the closing time of your favorite shop to finding out how cognitive processes work. Consumers generate and share data when they search the web, yet they are not always aware of this. Even if consumers knew their steps were being followed, would it change their daily behavior? Privacy is not a new term, but it has come to the fore more than ever over the last couple of decades, with the introduction of the Internet. The general definition is the right to be let alone, based on the principle of protection of the individual in both person and property (Warren & Brandeis, 1890). However, this definition is too broad. Informational privacy fits today’s context better: the right to control access to personal information (Moor, 1991).
Privacy concerns are clearly an issue for companies. Companies want to use personal information to offer personalized adverts, which consumers often consider something positive. A business example is Amazon. The well-known e-commerce company continually updates the user’s personal page to create more tailored experiences, based on past purchases and browsing history, with the objective of stimulating impulse buys (Reverte, 2013). However, personalization also raises concerns.

Amazon

 

Sutanto, Palme, Tan and Phang (2013) wrote an article in which they study the so-called personalization-privacy paradox: the tension between how IT developers and marketers of applications exploit users’ personal information to offer personalized products or services, and those users’ increasing concern about the privacy of that information. Eventually, this tension may restrain the use of such applications. The purpose of the paper is to study whether a personalized, privacy-safe application works. This application stores and processes information within the user’s smartphone but does not transmit it to marketers. This way, personalized content can be offered without compromising the privacy of personal information (Sutanto et al., 2013).

Personalized privacy-safe application

To understand the personalization-privacy paradox, the authors build on two theories: uses and gratifications theory (UGT) and information boundary theory (IBT). UGT suggests that consumers use a medium either for the experience of the process or for the content it offers. These two dimensions are called process gratification and content gratification (Sutanto et al., 2013). While the latter refers to the messages carried by the medium, the former relates to the enjoyment of using the medium itself.

Next, IBT gives a better understanding of which factors influence process and content gratification. This theory suggests that consumers form physical or virtual information spaces around themselves. These spaces have boundaries, and an attempt by third parties to cross those boundaries is considered invasive and makes consumers uncomfortable. In case of an intrusion, consumers apply a risk-control assessment, weighing the risk of disclosing personal information against the benefits they gain from doing so (Stanton, 2003).

The study conducted a field experiment with three mobile advertising applications. The first application broadcasts adverts to everyone (the non-personalized application). The second filters and displays adverts based on users’ profile information stored on a central server (the personalized, non-privacy-safe application). The third filters and displays adverts based on users’ profile information stored on their own smartphones (the personalized, privacy-safe application). In this context, process gratification is measured by the number of application launches. If, on the other hand, a user is interested in the content offered by the application, they are more likely to save an advert in order to retrieve it later; content gratification is therefore measured by the frequency of saving adverts (Sutanto et al., 2013).
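
The difference between the second and third design can be pictured with the small sketch below. It is not the authors’ implementation; the profile fields, advert structure and matching rule are made up for illustration. The essential point is that, in the privacy-safe variant, the full advert stream is filtered locally and the profile never leaves the phone.

```python
# Hypothetical sketch of the privacy-safe design studied by Sutanto et al. (2013):
# all adverts are broadcast to the device, and filtering happens against a profile
# that is stored only locally, so nothing about the user is sent back to marketers.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Advert:
    advert_id: int
    category: str
    text: str


@dataclass
class LocalProfile:
    """Stored only on the smartphone in the privacy-safe design."""
    interests: List[str] = field(default_factory=list)
    saved_adverts: List[int] = field(default_factory=list)  # proxy for content gratification


def filter_on_device(broadcast: List[Advert], profile: LocalProfile) -> List[Advert]:
    """Privacy-safe personalization: match adverts against the local profile;
    the profile itself is never transmitted to the marketer."""
    return [ad for ad in broadcast if ad.category in profile.interests]


def save_advert(profile: LocalProfile, ad: Advert) -> None:
    """Saving an advert signals interest in the content (content gratification)."""
    profile.saved_adverts.append(ad.advert_id)


if __name__ == "__main__":
    stream = [Advert(1, "coffee", "2-for-1 espresso"), Advert(2, "fitness", "Gym trial week")]
    me = LocalProfile(interests=["coffee"])
    shown = filter_on_device(stream, me)   # only the coffee advert is displayed
    save_advert(me, shown[0])
    print([ad.text for ad in shown], me.saved_adverts)
```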

Personalized application

The results of the field experiment showed that there is indeed a difference in process and content gratification between the three applications. Process gratification increased by 64.5% when the adverts were personalized compared to when they were general. However, there was no significant difference in content gratification. This may be explained by the fact that saving adverts explicitly indicates interest in a specific product and thus requires the user to reveal deeper levels of information than their own boundaries allow. This situation likely causes discomfort, which eventually leads to hesitation in saving adverts. The local, privacy-safe personalization design, in contrast, increased both process and content gratification. Application use increased by 9.6% compared to the personalized, non-privacy-safe application and by 79.1% compared to the non-personalized application. Advert saving increased by 24.5% and 55.1%, respectively.

However, the paper has an important limitation. Some users may have launched the application because they were already interested in a certain advert, which makes it more difficult to disentangle process gratification from content gratification.

In conclusion, the article proposes a personalized, privacy-safe application. The results show significant differences between the three applications in favor of the local, privacy-safe personalization design. Offering personalized adverts without compromising the privacy of personal information is thus possible.

References:

Moor, J.H. (1991) ‘The ethics of privacy protection’, pp. 69–82.

Reverte, C. (2013) Personalization Innovators: Amazon, Netflix, and Yahoo! [Online] Available at: https://www.addthis.com/blog/2013/08/28/personalization-innovators-amazon-netflix-and-yahoo/#.WomB3OciHIU [Accessed 18 February 2018].

Stanton, J. M. (2003). Information technology and privacy: A boundary management perspective. In Socio-technical and human cognition elements of information systems (pp. 79-103). IGI Global.

Sutanto, J., Palme, E., Tan, C. H., & Phang, C. W. (2013). Addressing the Personalization-Privacy Paradox: An Empirical Assessment from a Field Experiment on Smartphone Users. MIS Quarterly, 37(4).

Warren, S.D. and Brandeis, L.D. (1890) ‘The right to privacy’, Harvard Law Review, 4(5), pp. 193–220. doi: 10.2307/1321160

CLIQZ – Human Web Against Data Giants


Sick of giving away your data for free to data giants like Google, Facebook, and so on? Do you want to have control over your browser history, search terms, and the advertisements you see when surfing the internet? Are you annoyed by having to scroll past a list of ads first when entering a term in Google Search? Then CLIQZ might be a very interesting alternative for you.

CLIQZ combines a search engine and a browser. A quick search is performed directly in the browser, without having to go to a separate search engine: search terms or website names are entered directly into the address bar. As in other browsers, a selection of suggested websites appears without you having to leave the current page.

The Human Web

At the heart of the search engine is its algorithm, called the Human Web, which ranks search results according to their relevance rather than the content, structure and linking of websites, as is done under search engine optimization. With the Human Web, CLIQZ makes use of the wisdom of the crowd: the community of users. CLIQZ’s search algorithm weighs data about people’s behavior on the web more heavily than the technical analysis of websites, and the more data collected, the better the search algorithm becomes. But now you might think: “How is that different from Chrome or Firefox? They also collect my data.” In contrast to other search engines that build complete and detailed profiles of their users, CLIQZ only works with anonymous statistical data (CLIQZ Human Web, 2017).
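
A rough way to picture the idea (an illustrative sketch, not CLIQZ’s actual algorithm) is that each user contributes only anonymous query-and-click pairs, and pages are ranked by how often the crowd selected them for a given query:

```python
# Toy sketch of crowd-based ranking in the spirit of the Human Web (not CLIQZ's real
# code): users contribute only anonymous (query, clicked URL) pairs, and pages are
# ranked by how often the crowd picked them for that query -- no profiles, no history.

from collections import Counter, defaultdict

# Aggregated, anonymous signals: no user identifiers attached to any observation.
query_click_counts = defaultdict(Counter)


def record_anonymous_signal(query: str, clicked_url: str) -> None:
    """Add one anonymous (query, url) observation to the aggregate statistics."""
    query_click_counts[query.lower()][clicked_url] += 1


def rank_results(query: str, top_n: int = 3):
    """Rank URLs for a query purely by crowd relevance, not by page structure or links."""
    return [url for url, _ in query_click_counts[query.lower()].most_common(top_n)]


if __name__ == "__main__":
    for url in ["https://weather.example", "https://weather.example", "https://news.example"]:
        record_anonymous_signal("weather rotterdam", url)
    print(rank_results("weather rotterdam"))  # the page the crowd found most relevant comes first
```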

Recent Development

CLIQZ recently hit the headlines with the acquisition of Ghostery, a browser plug-in and mobile app that enables users to easily detect and control JavaScript “tags” and “trackers”, which collect users’ browsing habits via cookies and take part in more sophisticated forms of tracking such as canvas fingerprinting. With this acquisition, CLIQZ wants to increase its user base (AdAge, 2017).
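
Conceptually, a tracker blocker like Ghostery compares the domain of every outgoing third-party request against a list of known trackers and lets the user decide per tracker whether it may load. The sketch below illustrates that idea with a made-up blocklist; it is not Ghostery’s actual code.

```python
# Simplified illustration of what a tracker blocker conceptually does: compare the
# domain of each request against a blocklist and respect the user's per-tracker
# choices. The blocklist and override entries here are hypothetical.

from urllib.parse import urlparse

KNOWN_TRACKERS = {"tracker.example", "ads.example", "fingerprint.example"}  # hypothetical blocklist
user_overrides = {"ads.example": True}  # the user explicitly allowed this tracker


def is_blocked(request_url: str) -> bool:
    """Return True if the request should be blocked before it leaves the browser."""
    domain = urlparse(request_url).hostname or ""
    if domain in user_overrides:
        return not user_overrides[domain]   # respect the user's per-tracker choice
    return domain in KNOWN_TRACKERS


if __name__ == "__main__":
    for url in ["https://tracker.example/pixel.gif",
                "https://ads.example/banner.js",
                "https://cdn.example/app.js"]:
        print(url, "-> blocked" if is_blocked(url) else "-> allowed")
```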

Business Model Evaluation

Although the concept of CLIQZ is widely described as very promising due to a global shift towards privacy awareness (EY Privacy Trends Report, 2016), this does not automatically lead to business success. Currently, CLIQZ is a start-up backed by renowned investors, namely Hubert Burda Media and Mozilla. As stated on their website, CLIQZ has not yet decided how it will generate revenue (CLIQZ Support, 2017), but the model should be compatible with the respect it has for its users’ privacy and with the core benefits of the product (direct, fast, clearly structured). Making money like other conventional browsers, mainly through advertisements and search royalties, therefore does not fit its strategy.

The individual functions of CLIQZ and Ghostery are not new for consumers. If you do not want to be tracked, other options are already available on the market. Add-ons such as Privacy Badger block tracking scripts and cookies. The Tor Browser offers even more anonymity. As an alternative to search engines such as Google, Startpage is also a privacy-friendly option.

However, CLIQZ integrates these individual features seamlessly into one encompassing solution. Beyond this, the most useful differentiator of CLIQZ is its relevance-based Quick Search. This combined functionality is currently the main reason for consumers to consider using CLIQZ as their default browser.


References

AdAge. Cliqz, a Mozilla-Backed Search Engine, Buys Privacy Extension Ghostery. Retrieved March 5, 2017 from http://adage.com/article/digital/cliqz-a-mozilla-backed-search-engine-buys-ghostery/307980/

CLIQZ. Human Web. Retrieved March 5, 2017 from https://cliqz.com/en/whycliqz/human-web

CLIQZ. Support. Retrieved March 5, 2017 from https://cliqz.com/en/support

EY Privacy Trends Report (2016). Can privacy really be protected anymore? Retrieved March 5, 2017 from http://www.ey.com/Publication/vwLUAssets/ey-can-privacy-really-be-protected-anymore/$FILE/ey-can-privacy-really-be-protected-anymore.pdf

Managing Consumer Privacy Concerns in Personalization: A Strategic Analysis of Privacy Protection


In the digital age we live in, one of the major concerns is the protection of our privacy. Research shows that 90% of all online consumers either do not disclose any personal information to companies at all or choose to disclose it only to companies committed to fully protecting their privacy (Taylor, 2003). On the other hand, companies need as much data about their customers as possible in order to provide them with effective personalized product recommendations.

The study by Lee, Ahn et al. delves into this topic with a very interesting research method. Using game theory, the authors study the impact of firms’ autonomous privacy protection decisions on competition, pricing and social welfare. Additionally, the research sheds light on the impact on social welfare of a regulated environment in which privacy protection is mandated.

The three main findings:

  1. Asymmetric protection mitigates competition. In simple words, when firms in a given market differ in the privacy protection measures they implement, the firm with the strongest privacy protection policy is able to increase its profitability by getting access to a wider pool of consumer data. This happens because this firm inspires customers to feel confident about sharing their personal data with it (a toy numerical sketch after this list illustrates the intuition).
  2. The strategies that firms implement regarding privacy protection should be based on two criteria:
  • Investment cost of protection. This factor captures the costs firms incur to implement a privacy protection policy, such as infrastructure, personnel and training costs.
  • Size of the personalization scope. This refers to the pool of customers for whom companies possess personal information and to whom they are thus in a position to offer personalized products or services.
  3. Regulation is socially desirable. According to the research, this holds because, although firms’ autonomous decisions improve social welfare in general, they redistribute the benefits between firms and customers, with firms enjoying the benefits and customers becoming worse off.
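
The intuition behind the first two findings can be illustrated with a deliberately simple numerical toy, shown below. It is not the formal game-theoretic model of the paper; the only assumed relationship is that stronger protection persuades a larger share of customers to disclose data, while protection itself costs money.

```python
# Deliberately simple toy illustration of findings 1 and 2 above. This is NOT the
# formal model of the paper. Assumption: the share of customers willing to disclose
# data grows with a firm's protection level, each disclosing customer yields some
# personalization revenue, and protection carries an investment cost.

def profit(protection_level: float,
           market_size: int = 1000,
           value_per_profile: float = 2.0,
           cost_per_protection_unit: float = 600.0) -> float:
    """Profit = revenue from customers who disclose data - investment cost of protection."""
    disclosure_share = 0.2 + 0.6 * protection_level      # stronger protection -> more trust
    investment_cost = cost_per_protection_unit * protection_level
    return market_size * disclosure_share * value_per_profile - investment_cost


if __name__ == "__main__":
    # Asymmetric protection: firm A protects strongly, firm B does not.
    print("Firm A (strong protection):", profit(1.0))   # 1000*0.8*2 - 600 = 1000.0
    print("Firm B (weak protection):  ", profit(0.0))   # 1000*0.2*2 - 0   = 400.0
    # Criterion 1: if protection becomes too expensive, the advantage shrinks.
    print("Firm A, costly protection: ", profit(1.0, cost_per_protection_unit=1400.0))  # 200.0
```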

As can easily be understood, many stakeholders are involved in privacy protection decisions. This research provides a robust foundation regarding the factors managers should take into account when deciding on their firm’s privacy protection policy. As far as academia is concerned, it connects privacy in a personalization setting with competitive equilibria in a market setting. Finally, regulators gain additional assurance that the introduction of privacy protection legislation will be beneficial for society.

In order for all interested parties to evaluate the findings of this study, it should be underlined that, to calculate the market equilibrium, the authors used the notion of firm and customer privacy calculus. This notion assumes that consumers are perfectly capable of weighing the benefits and costs of disclosing their personal information, and firms those of implementing a protection strategy. However, this might not always be the case, since, as research has shown, many biases affect these processes.

 

References

Sutanto, J., Palme, E., Tan, C. H., & Phang, C. W. (2013). Addressing the Personalization-Privacy Paradox: An Empirical Assessment from a Field Experiment on Smartphone Users. MIS Quarterly, 37(4), pp. 1141-A5. Business Source Premier, EBSCOhost, viewed 16 February 2017.

Taylor, H. (2003). “Most People Are ‘Privacy Pragmatists’ Who, While Concerned about Privacy, Will Sometimes Trade it Off for Other Benefits,” The Harris Poll #17, Harris Interactive, New York, March 19. Available at: http://www.harrisinteractive.com/harris_poll/index.asp?pid=365

Is it normal that my >insert body part here< hurts?


Sharing your symptoms with PatientsLikeMe

Have you ever been sick and googled your symptoms? There is a good chance the Internet told you that you might have something far worse than your doctor said, or that your little headache could mean a brain tumor… Or have you ever been sick and found there was no one close by to talk to and ask which symptoms are normal or what you could expect?


Does privacy still exist?


Is it still possible to maintain privacy nowadays when purchasing products from webshops? There is a lot of debate on this subject, and lately privacy issues are often in the headlines. Almost everyone in America and Europe uses the search engine Google nowadays; it is the most visited website in the entire world. As you can imagine, this makes it an attractive target for cybercriminals: in the past, Google has been hacked and personal information of more than 300,000 website owners has been leaked (Kirk, 2015). Facebook, one of the largest social media platforms in the world, has also leaked personal information: data from over 6 million user accounts was exposed, caused by a bug in the social platform (Guarini, 2013). Remember Edward Snowden? Some call him a traitor, but I would call him a hero.


A rational perspective on the privacy issues when considering using location-based services


Big data and data collection are often seen in a negative light, as public attention to big data gathering usually results in unwanted attention for organizations. The other side of the story is that such data collection is usually the result of organizations wanting to deliver personalized services more effectively. Where users become skeptical when asked to share their personal and private data, organizations provide an (additional) incentive to mitigate the perceived risk. In the case of recent developments in mobile shopping services, there is a balance between the perceived value and the perceived risk of sharing private information, as Xu et al. noted [1]. An easy example is the case of a customer with an empty stomach, an empty fridge at 10 pm, and a connected smartphone. Will he decide to give his location information to a mobile service in order to look for food ordering opportunities, or will he not? Will he value the potential to find food less than his location information at that hour? You decide.

Furthermore, Xu et al. found that the usage of location-based services is correlated with monetary incentives: individuals are more willing to disclose their location when offered a financial incentive [1]. These incentives are often given in the form of a future saving, such as discounts on related services or rebates, implying there is money to be gained on future expenses. Some skeptics have agreed to share their personal data in exchange for such an incentive. When asked about their rationalization, some claim that ‘the risk is worth the gain’, while others state that they have ‘serious concerns’ about sharing their information. If you think the former group is non-existent, please consider the example of the guy with the empty stomach again.

Location-based services require more personal and private information in order to function better. The so-called personalization privacy paradox is the epitome of this statement: the better the service an individual wants or requires, the more willing he has to be to share his personal information. Xu et al. found that using personalized services could help individuals overcome their privacy concerns. In addressing the paradox, the authors imply that customers who are more knowledgeable about the service they require make a more rational decision. Customers with high privacy concerns about personalized services are less inclined to consider using them and will automatically look at alternatives (the hungry customer could use the ‘service’ of asking his physical neighbor for information), and are therefore not part of the targeted demographic, Pappas et al. imply [2]. In addition, if location-based services give consumers the option to control the use of their personal information, this mitigating effect might tempt critics to use the service after all [3], although future research would have to investigate this in more detail.
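
The weighing of perceived value against perceived risk can be sketched as a simple decision rule. The illustration below is hypothetical and only loosely inspired by the privacy calculus in [1], not the model the authors estimate: incentives raise the perceived value, while giving users control over their data lowers the perceived risk.

```python
# Hypothetical sketch of the value-versus-risk weighing behind using a location-based
# service, loosely inspired by Xu et al. [1] but not their estimated model. The
# numbers and the 0.5 control factor are illustrative assumptions only.

def will_share_location(benefit: float,
                        incentive: float,
                        privacy_concern: float,
                        user_has_control: bool) -> bool:
    """Return True if the perceived value of disclosure outweighs the perceived risk."""
    perceived_value = benefit + incentive
    perceived_risk = privacy_concern * (0.5 if user_has_control else 1.0)
    return perceived_value > perceived_risk


if __name__ == "__main__":
    # The hungry customer at 10 pm: high benefit, small discount, moderate concern.
    print(will_share_location(benefit=7.0, incentive=1.5, privacy_concern=6.0,
                              user_has_control=False))  # True: value 8.5 > risk 6.0
    # A skeptic with serious concerns only shares once control is offered.
    print(will_share_location(benefit=4.0, incentive=1.5, privacy_concern=9.0,
                              user_has_control=True))   # True: value 5.5 > risk 4.5
    print(will_share_location(benefit=4.0, incentive=1.5, privacy_concern=9.0,
                              user_has_control=False))  # False: value 5.5 < risk 9.0
```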

In the end, the rule of thumb is: “when (information) services are offered for free, you are paying with your personal data”. Some people are okay with this, and that is… okay.


Disclaimer: Although largely based on the article by Xu et al. [1], the opinion presented in this article does not portray the sentiment of the paper itself. The opinion rests solely with the author and with none of the authors cited here. Critics are free to comment below, and are encouraged to do so.

References
[1] Xu, H., Luo, X. R., Carroll, J. M., & Rosson, M. B. (2011). The personalization privacy paradox: An exploratory study of decision making process for location-aware marketing. Decision Support Systems, 51(1), 42-52.
[2] Pappas, I. O., Giannakos, M. N., & Chrissikopoulos, V. (2012, June). Personalized services in online shopping: Enjoyment and privacy. In Information Society (i-Society), 2012 International Conference on (pp. 168-173). IEEE.
[3] Schwaig, K. S., Segars, A. H., Grover, V., & Fiedler, K. D. (2013). A model of consumers’ perceptions of the invasion of information privacy. Information & Management, 50(1), 1-12.