All posts by irisvdheijdenrsm

Why users contribute knowledge to online communities: An empirical study of an online social Q&A community

Knowledge & the Internet

Ever since the inception of the Internet, the volume of available knowledge has increased exceptionally, especially since the Internet facilitates crowdsourcing knowledge. Websites such as Wikipedia and Quora help individuals provide other individuals with information and answers to lingering questions. Quality control is also crowdsourced: various voting systems enable fellow users to assess the provided answers and filter out low-quality ones. Online Q&A communities are special social networks focused specifically on information sharing. They are a special place since there is usually no monetary incentive to motivate people to contribute. This paper focuses on these online communities and tries to explain the motivation behind the contributors.

Related Theory

There are three theories on which this study and its hypotheses are built: social cognitive theory, social capital and social exchange theory. Social cognitive theory claims that people’s thinking and actions are influenced by watching others through social interactions (Anderson, Winett, & Wojcik, 2007). The theory has been used to analyze how content is generated by users and how this content affects future contributions. Social capital is a well-known concept describing the value derived from interpersonal relationships, built over time; it includes trust, respect and friendship, among other things (Wasko & Faraj, 2005). Social exchange theory highlights intrinsic rewards from social interactions; similar to economic exchange theory, it claims that individuals will behave in a certain way to acquire rewards from an interaction (Liu & Chen, 2005).

What is measured and how?

Based on the previously mentioned theories, four aspects were identified as possible drivers of knowledge contribution in online Q&A communities.

Identity Communication

Identity communication refers to an individual’s efforts to express and present his/her identity. It explains who a person is and how he/she differs from others. It includes the concept of self-presentation information: the transfer of personal information about one’s personality, experience, etc., so others understand one’s social identity (Tajfel & Turner, 1979). In the study, it is measured as the number of items a user discloses about himself, with a maximum of 11 (the maximum number of items available on the website).
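The measure above is simple enough to sketch in code. This is a minimal illustration only; the field names below are hypothetical assumptions, since the paper does not list the 11 profile items:

```python
# Hypothetical sketch of the self-presentation measure: the score is the
# number of profile items a user has filled in, out of the 11 available on
# the site. The field names are illustrative assumptions, not the study's.
PROFILE_ITEMS = [
    "name", "photo", "gender", "location", "occupation", "employer",
    "education", "bio", "website", "interests", "achievements",
]

def self_presentation_score(profile: dict) -> int:
    """Count how many of the 11 profile items are disclosed (non-empty)."""
    return sum(1 for item in PROFILE_ITEMS if profile.get(item))
```

A user who fills in four of the eleven fields would score 4 under this sketch.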

H1: Individuals who disclose more self-presentation information will contribute more knowledge to online social Q&A communities.

Peer Recognition

The more knowledge becomes available, the more attention is divided between different sources of information. The same goes for the information in online Q&A communities. Peer recognition is the positive feedback users receive on their behavior and is measured by the number of usefulness votes on a post.

H2: Individuals who receive more positive feedback will contribute more knowledge to online social Q&A communities.

Group-size Effects

Since most intrinsic rewards are based on transactions with others, as explained before, the presence of others and the number of possible recipients are important. A larger following means a wider reach and thus more social rewards (Nahapiet & Ghoshal, 1998). A member’s group size is measured by the member’s number of so-called ‘followers’.

H3: Individuals with a larger group size will contribute more knowledge to online social Q&A communities.

Social Learning

Social learning is a type of learning that comes from observing others. In online communities, content feeds provide constant updates of other individuals’ actions, providing continuous learning opportunities (Anderson, Winett, & Wojcik, 2007). Social learning is measured by the number of topics, questions and members a participant is subscribed to; the more subscriptions, the more learning opportunities a member has.

H4: Individuals with more social learning opportunities will contribute more knowledge to online social Q&A communities.



Results

The researchers analyzed 1,762 data points from 306 members of a popular Chinese online Q&A community. These data points cover all knowledge contribution behavior from March 15 to June 22, 2014. After processing the data, H1, H2 and H4 are supported, while H3 is rejected.
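The shape of such an analysis can be illustrated with a small sketch. This does not reproduce the authors’ exact model; it assumes a plain least-squares regression on synthetic data generated to be consistent with the reported findings (H1, H2 and H4 supported, H3 rejected):

```python
import numpy as np

# Illustrative sketch only: regress knowledge contribution on the four
# measures. Variable names, distributions and the use of ordinary least
# squares are assumptions, not the authors' actual model.
rng = np.random.default_rng(0)
n = 306  # members in the study

# Hypothetical per-member measures.
self_presentation = rng.integers(0, 12, n)   # items disclosed (H1)
usefulness_votes = rng.poisson(5, n)         # peer recognition (H2)
followers = rng.poisson(20, n)               # group size (H3)
subscriptions = rng.poisson(15, n)           # learning opportunities (H4)

# Synthetic outcome consistent with the findings: H1, H2 and H4 matter,
# group size (H3) does not enter the data-generating process.
contributions = (0.5 * self_presentation + 0.3 * usefulness_votes
                 + 0.2 * subscriptions + rng.normal(0, 1, n))

X = np.column_stack([np.ones(n), self_presentation, usefulness_votes,
                     followers, subscriptions])
coef, *_ = np.linalg.lstsq(X, contributions, rcond=None)
# coef[1], coef[2] and coef[4] come out clearly nonzero; coef[3] near zero,
# mirroring the pattern of supported and rejected hypotheses.
```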

Why is this important?

The Internet is a great tool for sharing knowledge; people from all over the world can distribute information to others. This can help people with a more difficult start in life acquire knowledge that helps them further. Understanding why people contribute to online knowledge sharing can help increase the knowledge that is available online.


Anderson, E., Winett, R., & Wojcik, J. (2007). Self-regulation, self-efficacy, outcome expectations, and social support: Social cognitive theory and nutrition behavior. Annals of Behavioral Medicine, 34(3), 304-312.


Jin, J., Li, Y., Zhong, X., & Zhai, L. (2015). Why users contribute knowledge to online communities: An empirical study of an online social Q&A community. Information & Management, 52(7), 840-849.

Liu, C. & Chen, S. (2005). Determinants of knowledge sharing of e-learners. International Journal of Innovation and Learning, 2(4), 434.

Wasko, M.M. & Faraj, S. (2005). Why should I share? Examining social capital and knowledge contribution in electronic networks of practice. MIS Quarterly, 29(1), 35-57.

Nahapiet, J. & Ghoshal, S. (1998). Social Capital, Intellectual Capital, and the Organizational Advantage. The Academy of Management Review, 23(2), 242.

Crowdsourcing Invention – The tale of Quirky


Crowdsourcing is a great method of bringing together people who can achieve more by cooperating than they ever could alone. A great invention is a combination of a good idea and a good execution. This is the idea behind Quirky, an interesting venture that had a dream start, then (spoiler alert) failed, but in the end got a second life with new owners.

What is Quirky?

Some people overflow with (great) ideas but don’t have the resources or know-how to bring these ideas to life. On the other hand, some people have the capabilities to bring ideas to the marketplace and make them successful but aren’t terribly creative. Quirky was created in 2009 by Ben Kaufman as a marketplace to bring these people together.


“We’re making invention accessible”

Crowdsourcing innovation is a way to get an audience, potentially even buyers, for a product before it is even in production. Quirky retains the rights to ideas and facilitates communication between inventors and executors. It pulls invention out of isolation and creates a community in which people work together to create new products.


The platform succeeded in creating a community and successful products.


For example, Garthen Leslie came up with a smart air conditioner, uploaded a rough idea to Quirky and, with the help of the community, came up with extra features and a name: Aros. Quirky then patented the idea, manufactured it and sold the product to big retailers such as Walmart and Amazon. Leslie’s return for his idea was 10%, which in this particular instance amounted to more than $400,000; the community also benefited, with $200,000 distributed among all of the members that contributed to Aros (Silvester, 2015).
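A quick back-of-the-envelope check of these royalty figures (the underlying sales base is inferred from the post’s numbers, not reported directly):

```python
# Back-of-the-envelope check of the Aros royalty figures from the post.
royalty_rate = 0.10          # Leslie's share of the proceeds
leslie_payout = 400_000      # "more than $400,000", per the post
community_pool = 200_000     # distributed among contributing members

# A 10% share worth $400,000 implies a proceeds base of roughly $4,000,000,
# on top of which a further $200,000 went to the contributing community.
implied_base = leslie_payout / royalty_rate
```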


However, the invention business turned out to be difficult. Crowdsourcing as a method of quality control turned out to be flawed. For every Aros there were many examples that did not do well, and these cost the company money, a lot of money. For instance, the Beat Booster cost the company $388,000 but only ever sold 30 units (Popper, 2015).


There is almost never one single reason a company fails. In the case of Quirky there are several, with these being the main ones:

Crowdsourcing as Market Validation

Quirky operated under the assumption that when the community ‘upvoted’ a product enough, it would be a success in the market. However, this turned out not to be the case. In established companies, an R&D department would thoroughly test ideas with pilot runs and the like. Quirky, however, had a fast-to-market mentality, which negatively impacted quality. The company started to receive many complaints, from inventors feeling rushed and from buyers feeling deceived by faulty products that did not live up to the hype (K, 2015).

Capital Risks & Brick-and-mortar retail

The company’s margins were thin: it took everything on itself, including manufacturing and distribution, resulting in high upfront costs. Quirky decided not to focus solely on its core competency (facilitating ideas) but also on logistics, supply chain management and more. Especially when dealing with large brick-and-mortar retailers (with high minimum inventory demands), inventory costs were very high, resulting in high capital risks when products were not successful (K, 2015).

Lack of Economies of scale/iterations

Where ‘normal’ companies generate and execute two or three ideas per year, Quirky aimed for more than 50, resulting in many, but incoherent, products. This led to a lack of recognition of Quirky as a brand. The absence of focus also led to an absence of iterations on successful products (Einstein, 2015). Aros, one of the company’s major successes, could have been a multi-million-dollar business on its own if the company had decided to improve the first version with feedback from its customers. Yet the company decided to invest its time in putting even more products on the shelf instead, while not creating economies of scale (Fixson & Marion, 2016).


The founder of Quirky, Ben Kaufman, was only 25 when, in 2015, his company grew to 300 employees and raised over $185 million in capital.

He was lovable, ambitious and well on his way to making it in the business world, since he was a good salesperson (Lagorio-Chafkin, n.d.). One Quirky member said: “I’m pretty sure he could sell ice to Eskimos” (D’Onfro, 2015). He was excellent at spotting talent and persuading people to come work for him. However, it has also been said that he failed to listen to their expertise soon after. For example, Kaufman demanded that a team work on Egg Minder, a smart egg tray which notified people on their smartphone when they were almost out of eggs, even though a team of engineers claimed it did not make much economic sense (Popper, 2015).

Bankruptcy & Restart

After six years, in 2015, Quirky filed for bankruptcy. The aforementioned problems turned out to be too big to handle and the company could no longer pay its obligations. The company, at that point, had gathered more than 1 million registered users and brought more than 400 products to market (Lohr, 2015). However, this turned out not to be the end for Quirky. On February 8, 2016, it was announced that Quirky had acquired new financiers and owners, and the company relaunched in May 2016. The company redefined its definition of ‘public’ and made the process more private, requiring people to sign up and log in before contributing to the community. Furthermore, it has changed its evaluation methods, although little detail has been communicated about this change (Quirky Blog).

Only time can tell whether this new and improved version of Quirky will last. However, it is clear that crowdsourcing innovation can yield great revenues; the only question is how to manage this exchange of responsibilities to achieve joint profitability for all parties involved.


D’Onfro, J. (2015). How a quirky 28-year-old plowed through $150 million and almost destroyed his start-up. Business Insider. Retrieved 3 March 2017.

Einstein, B. (2015). The Real Reason Quirky Failed. Bolt Blog. Retrieved 2 March 2017.

Fixson, S. & Marion, T. (2016). A Case Study of Crowdsourcing Gone Wrong. Harvard Business Review. Retrieved 3 March 2017.

K, C. (2015). How Not to Crowdsource: The Demise of Quirky. Digital Innovation and Transformation, Harvard Business School. Retrieved 2 March 2017.

Lagorio-Chafkin, C. (n.d.). What Happened to Quirky? Retrieved 2 March 2017.

Lohr, S. (2015). Quirky, an Invention Start-Up, Files for Bankruptcy. Retrieved 2 March 2017.

Popper, B. (2015). Exclusive: the secret struggles of Quirky, a seemingly successful startup. The Verge. Retrieved 3 March 2017.

Silvester, J. (2015). The Rise and Fall of Quirky — the Start-Up That Bet Big on the Genius of Regular Folks. New York Magazine. Retrieved 1 March 2017.

Quirky Blog

Citizen science: Crowdsourcing Scientific Knowledge

Citizen Science
Nowadays you can crowdsource pretty much anything, from statistical analysis (Kaggle) to graphic design (99Designs); whatever you want help with, you can find it online. But science is probably not the first thing on people’s minds when they think of this phenomenon. Science has an image of being a restricted activity that requires specific knowledge and skills: scientists are smart people locked away in laboratories or universities. We believe science is our most reliable system of gaining new knowledge and should be reserved for special people who are trained for it. However, nothing is further from the truth according to citizen science (also called crowd science and/or amateur science). Citizen science projects can be very diverse and can serve both specific research questions and open-ended data collection (Lukyanenko, et al., 2016).


Citizen science has been met with some criticism, including issues with data quality and ethics.

Data Quality

Is citizen science reliable? This is, of course, a valid question, and the corresponding answer could fill a blog post on its own. To give a short answer: yes, in most cases (Galloway et al., 2006). When scientists use citizen science in their research, they can take different actions to ensure data quality. For example, they could provide training or close supervision to the participants, keeping in mind the time and costs this incurs. Furthermore, scientists can cross-check for consistency with existing literature or with their own previous observations, and, last but not least, the task asked of the public could be simplified to the point where little can go wrong (Riesch & Potter, 2014). Which actions are appropriate depends on the characteristics of each study and ultimately needs to be decided and justified by the researchers themselves.
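The cross-checking tactic above can be sketched as a simple screening step. This is a minimal illustration under assumed numbers, not a method from the cited studies: citizen-submitted values that deviate strongly from a reference baseline (e.g. expert survey counts) are flagged for manual review rather than discarded outright.

```python
# Minimal sketch of cross-checking citizen observations against a reference
# baseline. Threshold and data are illustrative assumptions.
def flag_outliers(observations, reference_mean, reference_sd, z_threshold=3.0):
    """Return observations lying more than z_threshold SDs from the reference."""
    return [x for x in observations
            if abs(x - reference_mean) > z_threshold * reference_sd]

# Counts of 9, 10 and 11 are consistent with an expert baseline of 10 +/- 2;
# a submitted count of 40 is flagged for review.
flagged = flag_outliers([9, 10, 11, 40], reference_mean=10, reference_sd=2)
```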


Ethics

An obvious problem in citizen science is the accreditation of research results. In some projects, the involvement of participants is high and requires a lot of time and/or effort, making their contribution to the research quite substantial. Ownership of data should be clearly defined beforehand, and considerations regarding accreditation should be handled fairly and communicated explicitly before participation.


Citizen science, of course, also has significant benefits, including increasing the accessibility of science, improving science literacy, providing a different perspective and making it possible to analyze larger datasets.


Accessibility

As mentioned before, the terms science and research can sound intimidating, especially to ‘outsiders’. Citizen science can help people ease into the world of science in a manageable manner and makes research more inclusive (Lukyanenko, et al., 2016). This inclusiveness, in turn, can increase interest in science in general, change people’s views and persuade more people to study and/or work in a field of science.

Science Literacy

There is some debate about this, but studies have shown that participating in citizen science can increase science literacy and familiarity with the scientific method (Cronje, Rohlinger, Crall & Newman, 2011).


A Different Perspective

Since most participants in citizen science lack an academic scientific education, they can offer a new perspective on issues and research, which can be useful for exploring new options, helping studies advance after problems have occurred and/or offering future research ideas (Lukyanenko, et al., 2016). By including a larger group of people, the group most likely also becomes more diverse, and thus more diverse in terms of knowledge (Raddick et al., 2013).

Larger datasets

By outsourcing some of the data analysis, larger datasets can be included in studies. Of course, computers also have the power to analyze large datasets; however, some tasks require capabilities at which humans are more efficient, such as image and sound analysis (Fleming, 2001).

Examples of Current Applications of Citizen Science

Bird research

Citizen science projects have made a serious contribution to scientific knowledge (Ceccaroni, 2016). For example, they have helped examine the distribution of bird populations (Cooper et al. 2007, Bonter and Harvey 2008, Bonter et al. 2009), the influence of environmental change on birds’ breeding behavior (Hames et al., 2002a) and the effect of acid rain on bird populations (Hames et al. 2002b).

SciStarter

SciStarter is a website stimulating people to learn about, participate in and contribute to science. Its goal is to create a place with open communication between citizens and scientists. It is an online database of current citizen science projects and acts as a link between interested citizens and researchers in need of their help (“About Us”, 2017).


Galaxy Zoo

This is possibly the most famous example of an online citizen science project: a crowdsourced astronomy project in which people can help classify galaxies. It launched on 11 July 2007 and collected more than 50 million classifications during its first year. To date it has gone through 13 different ‘rounds’, each focusing on a different task or image set (“Story”, 2017). The data collected has been used in many studies and contributes greatly to a better understanding of galaxies (Raddick et al., 2013).


About Us. (2017). SciStarter. Retrieved 15 February 2017.

Bonter, D.N. & Harvey, M.G. (2008). Winter survey data reveal rangewide decline in Evening Grosbeak populations. The Condor, 110, 376–381.

Bonter, D.N., Zuckerberg, B., & Dickinson, J.L. (2009). Invasive birds in a novel landscape: Habitat associations and effects on established species. Ecography. doi:10.1111/j.1600-0587.2009.06017.x

Ceccaroni, L. (2016). Analyzing the role of citizen science in modern research (1st ed.). IGI Global.

Cooper, C.B., Dickinson, J., Phillips, T.B., & Bonney, R. (2007). Citizen science as a tool for conservation in residential ecosystems. Ecology and Society, 12, 11.

Cronje, R., Rohlinger, S., Crall, A., & Newman, G. (2011). Does Participation in Citizen Science Improve Scientific Literacy?. Applied Environmental Education & Communication, 10(3), 135-145. doi:10.1080/1533015x.2011.603611

Estellés-Arolas, E. & González-Ladrón-de-Guevara, F. (2012). Towards an integrated crowdsourcing definition. Journal of Information Science, 38(2), 189-200.

Fleming, L. (2001). Recombinant uncertainty in technological search. Management Science, 47(1), 117–132.

Galloway, A. W. E., Tudor, M. T. and Haegen, W. M. V. (2006), The Reliability of Citizen Science: A Case Study of Oregon White Oak Stand Surveys. Wildlife Society Bulletin, 34: 1425–1429. doi:10.2193/0091-7648(2006)34[1425:TROCSA]2.0.CO;2

Hames, R.S., Rosenberg, K., Lowe, J.D., Barker, S., & Dhondt, A.A. (2002a). Effects of forest fragmentation on tanager and thrush species in eastern and western North America. Pages 81–91 in George, L. & Dobkins, D.S., eds. The Effects of Habitat Fragmentation on Birds in Western Landscapes: Contrasts with Paradigms from the Eastern United States, vol. 25. Cooper Ornithological Society.

Hames, R.S., Rosenberg, K., Lowe, J.D., Barker, S., & Dhondt, A.A. (2002b). Adverse effects of acid rain on the distribution of the wood thrush Hylocichla mustelina in North America. Proceedings of the National Academy of Sciences, 99, 11235–11240.

Lukyanenko, R., Parsons, J. and Wiersma, Y. F. (2016), Emerging problems of data quality in citizen science. Conservation Biology, 30: 447–449. doi:10.1111/cobi.12706

Riesch, H. & Potter, C. (2014). Citizen science as seen by scientists: Methodological, epistemological and ethical dimensions. Public Understanding of Science, 23(1), 107-120.

Raddick, M.J., Bracey, G., Gay, P.L., Lintott, C.J., Cardamone, C., Murray, P., Schawinski, K., Szalay, A.S., & Vandenberg, J. (2013). Galaxy Zoo: Motivations of Citizen Scientists.


Serrano, F. (2013). Green Paper on Citizen Science. Citizen Science for Europe: Towards a better society of empowered citizens and enhanced research.

Story. (2017). Retrieved 15 February 2017.