TL;DR: Longer does not necessarily mean better

Do you hate it when you receive a really long email from that annoying workmate who could have summarized everything in three simple bullet points? I don’t know about you, but I certainly do.


I am one of those people who skip long-text posts and go directly to the “TL;DR” line.


According to Urban Dictionary, TL;DR stands for “Too long; didn’t read”. It is a generic reply (or sometimes a self-reply introducing a brief summary) used when someone took too many words to describe something that could have been sufficiently clear and complete with fewer words. [1]


Not surprisingly, information processing theory suggests that the value we can extract from the information available to us is restricted when that information is either scarce or abundant, [2] and models of human cognition suggest that people’s performance responds curvilinearly to the amount of information displayed in their environment. [3]


This is why we see TL;DR in online communities so often.


Anyway, what I wanted to talk about today are the findings of one of the most recent articles on online reviews. It is so recent that it was published in Volume 39 (April 2018) of the International Journal of Information Management, an international, peer-reviewed journal offering, according to Elsevier, the best analysis and discussion in the field of information management. [4]


Longer online reviews are not necessarily better

By Lior Fink, Liron Rosenfeld, Gilad Ravid

The purpose of this paper is to test for this seemingly obvious curvilinear relationship between the length of a text and its usefulness in online consumer reviews.


As obvious as this relationship may seem, researchers studying the effectiveness of online (consumer-generated) reviews had never bothered to test for it.


Review length is measured as the average number of characters or words in the reviews for a specific product, and it has consistently been hypothesized to have a positive linear relationship with either sales or review helpfulness [5].


Here are the two arguments used to justify such blatant and negligent reasoning:


  1. Longer reviews are less likely to be overlooked because they are “visually more salient” [6]
  2. Longer reviews contain more information, which leads to less product-related uncertainty, which in turn leads to more consumer confidence [7] [8]

Well, it turns out that despite my skepticism, both assumptions have been supported by multiple pieces of evidence. [5]



Hey, hey, hey! Wait a second. Professor Fink and his colleagues identified a reason why the negative curvilinear relationship had not been observed by previous researchers.


This is because the analysis of consumer behavior on product reviews had been conducted on PCs, which apparently afford a less constrained environment for processing large amounts of information than mobile devices do. [5]



Professor Fink and his colleagues argue that on mobile devices, the optimum cognitive load (the point beyond which additional length yields increasingly negative marginal usefulness) can be empirically observed.


To do so, they collected large-scale data on free and paid apps from Google Play and the Amazon Appstore.


The data were collected in December 2013. The apps selected from Google Play were listed in 26 different categories, and those from the Amazon Appstore in 28, yielding totals of 33,119 and 95,683 apps, respectively. To be consistent with previous research, the authors also removed apps that had no reviews at all, leaving 7,864 Google Play free apps, 6,206 Google Play paid apps, 6,158 Amazon Appstore free apps, and 2,734 Amazon Appstore paid apps.

These apps were then divided into intervals according to their number of downloads, and these intervals later stood in for “demand”.
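The binning step can be sketched like this (my own illustration with hypothetical interval edges, not the authors’ exact coding: app stores report downloads in ranges, so “demand” here is the ordinal index of the download interval an app falls into, not a raw count):

```python
import bisect

# Hypothetical download-interval edges (illustrative, not from the paper).
bin_edges = [1_000, 5_000, 10_000, 50_000, 100_000, 500_000]

def demand_level(downloads: int) -> int:
    """Map a raw download count to its ordinal interval index ("demand")."""
    return bisect.bisect_left(bin_edges, downloads)

print([demand_level(d) for d in (1_200, 48_000, 3_500, 900_000)])  # [1, 3, 1, 6]
```

Treating demand as an ordinal bin index rather than a raw count is what the grouping into intervals amounts to, whatever the actual interval edges were.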

Besides demand (the dependent variable), the other variables observed across the different types of apps were:

  • Days since last update
  • App size
  • Star Rating
  • Number of reviews
  • Average review length (in words)

But the only relationship we care about here is the one between review length and “demand”. And this is what we get.


Figure 1. Fitted curves

As we can observe in the figure above, Professor Fink and his colleagues identified a sweet spot for most categories: an optimal average review length of 100 to 150 words.

The fitted curves of predicted demand as a function of review length show means as ticks and quartile ranges as dots.

The authors believe that maximum demand corresponds to the optimum cognitive load; however, “optimum” load only applies from the company’s point of view [see the fourth weakness below].

The authors arrived at the fitted curves above by fitting quadratic models and setting their derivatives equal to zero.
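The mechanics can be sketched as follows (with made-up synthetic data, not the authors’ estimates): a quadratic demand model b₂x² + b₁x + b₀ peaks where its derivative 2b₂x + b₁ equals zero, i.e. at x* = −b₁ / (2b₂).

```python
import numpy as np

# Synthetic illustration (NOT the authors' data): generate noisy demand
# that peaks at a review length of 125 words, fit a quadratic, and
# recover the peak by solving 2*b2*length + b1 = 0.
rng = np.random.default_rng(0)
length = rng.uniform(10, 300, size=500)           # avg. review length in words
demand = -0.004 * (length - 125) ** 2 + 80 + rng.normal(0, 5, size=500)

b2, b1, b0 = np.polyfit(length, demand, deg=2)    # coefficients, highest degree first
optimal_length = -b1 / (2 * b2)                   # vertex of the fitted parabola
print(f"optimal length ≈ {optimal_length:.0f} words")  # should land near 125
```

Note that this vertex is a maximum only when the fitted b₂ is negative, i.e. when the curve actually bends downward, which is exactly the curvilinear shape the paper tests for.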

Let’s now wrap up the paper with its weaknesses, strengths, and implications for managers and researchers. In other words, let’s do a TL;DR:


Strengths of the paper

  • Explores a different setting with very high managerial implications
  • Kudos for challenging dogmas
  • A more relevant scenario than previous research, as smartphones overtake computers as the main source of e-commerce traffic

Weaknesses of the paper

  • Longer reviews might be rants
  • Popular apps do not necessarily require users to learn more about the app by reading reviews.
    For instance, many Instagram users may decide to download the app not because of reviews but because of peer pressure/social acceptance.
  • It won’t have a large impact on research done in computer settings
  • The variables tested are review length and downloads (“demand”), not exactly “review usefulness”.
    Usefulness is something a review reader states explicitly when a review helps him/her make a sound decision.
  • Since a good review may also advise you not to buy a bad product, clearing up product uncertainty does not logically lead to more sales, something most authors, including these, do not take into account.


Takeaway for managers
Correlation does not imply causation, but if there is an optimal length to aim for in your mobile apps, try to lure reviewers into writing reviews of the sweet-spot length of 100 to 150 words.


Takeaway for researchers
Do not cherry-pick variables and samples until you manage to get the results you want, especially when you are trying to replicate other studies.
Conduct further research on online reviews on mobile devices: it lags behind research in PC settings [5], and as smartphones overtake computers as the main source of e-commerce traffic [9], it is only becoming more relevant.

Reference list

[1]  Urban Dictionary. (n.d.). Urban Dictionary: tl;dr. [online] Available at: [Accessed 5 Mar. 2018]

[2] Schroder, H., Driver, M., & Streufert, S. (1967). Human information processing: Individual’s and group’s functioning in complex social situations. New York: Holt, Rinehart and Winston.

[3] Driver, M. J., & Mock, T. J. (1975). Human information processing, decision style theory, and accounting information systems. The Accounting Review, 50(3), 490–508

[4] Elsevier. (n.d.). International Journal of Information Management. [online] Available at: [Accessed 5 Mar. 2018].

[5] Fink, L., Rosenfeld, L. and Ravid, G. (2018). Longer online reviews are not necessarily better. International Journal of Information Management, 39, pp.30-37.

[6] Kuan, K. K. Y., Hui, K. L., Prasarnphanich, P., & Lai, H. Y. (2015). What makes a review voted? An empirical investigation of review voting in online review systems. Journal of the Association for Information Systems, 16(1), 48–71.

[7] Schwenk, C. R. (1986). Information, cognitive biases, and commitment to a course of action. Academy of Management Review, 11(2), 298–310.

[8] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

[9] Bloomberg. (2016). Smartphones Overtake Computers as Top E-Commerce Traffic Source. [online] Available at: [Accessed 5 Mar. 2018].
