Crowdsourcing System


A Crowdsourcing System is a human-based computing system that enables crowd members to solve a crowdsourced task.
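As an illustrative sketch of that definition, the following Python fragment (all names hypothetical) models the core loop of such a system: a task is broadcast to crowd members, and each member contributes a candidate solution that the system can later aggregate.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: the minimal objects a crowdsourcing system
# manages: a task broadcast to crowd members, each contributing
# a candidate solution keyed by member id.

@dataclass
class CrowdsourcedTask:
    prompt: str
    solutions: dict = field(default_factory=dict)  # member_id -> answer

    def submit(self, member_id: str, answer: str) -> None:
        self.solutions[member_id] = answer

task = CrowdsourcedTask("Transcribe the word shown in this image")
for member_id, answer in [("m1", "hello"), ("m2", "hello"), ("m3", "hallo")]:
    task.submit(member_id, answer)
print(task.solutions)  # the system can now aggregate these contributions
```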



References

2023

  • (Wikipedia, 2023) ⇒ https://en.wikipedia.org/wiki/Crowdsourcing Retrieved:2023-1-8.
    • Crowdsourcing involves a large group of dispersed participants contributing or producing goods or services—including ideas, votes, micro-tasks, and finances—for payment or as volunteers. Contemporary crowdsourcing often involves digital platforms to attract and divide work between participants to achieve a cumulative result. Crowdsourcing is not limited to online activity, however, and there are various historical examples of crowdsourcing. The word crowdsourcing is a portmanteau of "crowd" and "outsourcing".[1][2][3] In contrast to outsourcing, crowdsourcing usually involves less specific and more public groups of participants.[4][5]

      Advantages of using crowdsourcing include lowered costs, improved speed, improved quality, increased flexibility, and/or increased scalability of the work, as well as promoting diversity.[6][7] Crowdsourcing methods include competitions, virtual labor markets, open online collaboration, and data donation.[7][8][9] Some forms of crowdsourcing, such as "idea competitions" or "innovation contests", provide ways for organizations to learn beyond the "base of minds" provided by their employees (e.g., LEGO Ideas).[10][11] Commercial platforms, such as Amazon Mechanical Turk, match microtasks submitted by requesters to workers who perform them (see the sketch after the footnotes below). Crowdsourcing is also used by nonprofit organizations to develop common goods, such as Wikipedia.

  1. Schenk, Eric; Guittard, Claude (1 January 2009). "Crowdsourcing: What Can Be Outsourced to the Crowd, and Why?". Center for Direct Scientific Communication. Retrieved 1 October 2018 – via HAL.
  2. Hirth, Matthias; Hoßfeld, Tobias; Tran-Gia, Phuoc (2011). "Anatomy of a Crowdsourcing Platform - Using the Example of Microworkers.com" (PDF). 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing. pp. 322–329. doi:10.1109/IMIS.2011.89. ISBN 978-1-61284-733-7. S2CID 12955095. Archived from the original (PDF) on 22 November 2015. Retrieved 5 September 2015.
  3. Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012), "Towards an Integrated Crowdsourcing Definition" (PDF), Journal of Information Science, 38 (2): 189–200, doi:10.1177/0165551512437638, hdl:10251/56904, S2CID 18535678
  4. Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts; London, England: The MIT Press.
  5. Prpić, J., & Shukla, P. (2016). Crowd Science: Measurements, Models, and Methods. In: Proceedings of the 49th Annual Hawaii International Conference on System Sciences, Kauai, Hawaii: IEEE Computer Society
  6. Buettner, Ricardo (2015). A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective. 48th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE. pp. 4609–4618. doi:10.13140/2.1.2061.1845. ISBN 978-1-4799-7367-5.
  7. Prpić, John; Taeihagh, Araz; Melton, James (September 2015). "The Fundamentals of Policy Crowdsourcing". Policy & Internet. 7 (3): 340–361. arXiv:1802.04143. doi:10.1002/poi3.102. S2CID 3626608.
  8. Afuah, A.; Tucci, C. L. (2012). "Crowdsourcing as a Solution to Distant Search". Academy of Management Review. 37 (3): 355–375. doi:10.5465/amr.2010.0146.
  9. de Vreede, T.; Nguyen, C.; de Vreede, G. J.; Boughzala, I.; Oh, O.; Reiter-Palmon, R. (2013). "A Theoretical Model of User Engagement in Crowdsourcing". In: Collaboration and Technology (pp. 94–109). Springer Berlin Heidelberg.
  10. Liu, Wei; Moultrie, James; Ye, Songhe (4 May 2019). "The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product". The Design Journal. 22 (3): 299–324. doi:10.1080/14606925.2019.1592324. ISSN 1460-6925. S2CID 145931864.
  11. Schlagwein, Daniel; Bjørn-Andersen, Niels (2014), "Organizational Learning with Crowdsourcing: The Revelatory Case of LEGO" (PDF), Journal of the Association for Information Systems, 15 (11): 754–778, doi:10.17705/1jais.00380
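
To make the microtask-matching sentence above concrete, here is a minimal sketch using the boto3 Mechanical Turk client against Amazon's requester sandbox. The question HTML, reward, and timing values are illustrative assumptions rather than details from the cited sources; `create_hit` and the sandbox endpoint are part of the actual boto3/MTurk API.

```python
import boto3

# Sketch: a requester posts one microtask (a "HIT") that three workers
# will answer independently. Assumes AWS credentials are configured.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# HTMLQuestion wrapper; the form posts each worker's answer back to MTurk.
question_xml = """\
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
<!DOCTYPE html>
<html><body>
  <form action="https://www.mturk.com/mturk/externalSubmit" method="post">
    <input type="hidden" name="assignmentId" value="">
    <p>Is the pictured animal a cat or a dog?</p>
    <input type="radio" name="label" value="cat"> Cat
    <input type="radio" name="label" value="dog"> Dog
    <input type="submit">
  </form>
</body></html>
]]></HTMLContent>
  <FrameHeight>400</FrameHeight>
</HTMLQuestion>"""

hit = mturk.create_hit(
    Title="Label one image as cat or dog",
    Description="One-click image labeling microtask",
    Reward="0.05",                    # USD per completed assignment
    MaxAssignments=3,                 # three workers answer the same task
    LifetimeInSeconds=3600,           # task stays visible for one hour
    AssignmentDurationInSeconds=300,  # five minutes per worker
    Question=question_xml,
)
print("Created HIT:", hit["HIT"]["HITId"])
```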

2008a

  • (Kittur et al., 2008) ⇒ Aniket Kittur, Ed H. Chi, and Bongwon Suh. (2008). “Crowdsourcing User Studies with Mechanical Turk.” In: Proceedings of the 26th Annual SIGCHI Conference on Human Factors in Computing Systems (CHI 2008). doi:10.1145/1357054.1357127
    • ABSTRACT: User studies are important for many aspects of the design process and involve techniques ranging from informal surveys to rigorous laboratory studies. However, the costs involved in engaging users often require practitioners to trade off between sample size, time requirements, and monetary costs. Micro-task markets, such as Amazon's Mechanical Turk, offer a potential paradigm for engaging a large number of users for low time and monetary costs. Here we investigate the utility of a micro-task market for collecting user measurements, and discuss design considerations for developing remote micro user evaluation tasks. Although micro-task markets have great potential for rapidly collecting user measurements at low costs, we found that special care is needed in formulating tasks in order to harness the capabilities of the approach.
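
The "special care ... in formulating tasks" that Kittur et al. emphasize is commonly operationalized with embedded screening questions. The sketch below (data, task ids, and threshold all invented for illustration) drops workers who fail a known-answer "gold" question before majority-voting the remaining responses:

```python
from collections import Counter, defaultdict

GOLD = {"q_gold": "cat"}  # hypothetical task id -> known correct answer

def filter_and_aggregate(responses, min_gold_accuracy=1.0):
    """responses: list of (worker_id, {task_id: answer}) pairs."""
    votes = defaultdict(Counter)
    for worker_id, answers in responses:
        gold_hits = [answers.get(t) == a for t, a in GOLD.items()]
        if sum(gold_hits) / len(GOLD) < min_gold_accuracy:
            continue  # worker failed the screening question(s)
        for task_id, answer in answers.items():
            if task_id not in GOLD:
                votes[task_id][answer] += 1
    # Majority vote per task among the workers who passed screening.
    return {t: counts.most_common(1)[0][0] for t, counts in votes.items()}

responses = [
    ("w1", {"q_gold": "cat", "q1": "dog"}),
    ("w2", {"q_gold": "cat", "q1": "dog"}),
    ("w3", {"q_gold": "dog", "q1": "cat"}),  # fails the gold check; excluded
]
print(filter_and_aggregate(responses))  # {'q1': 'dog'}
```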

2008b

  • (Brabham, 2008) ⇒ Daren C. Brabham. (2008). “Crowdsourcing as a Model for Problem Solving: An Introduction and Cases.” In: Convergence: The International Journal of Research into New Media Technologies, 14(1), February 2008.