Crowd Vigilante

Detecting Sabotage in Crowdsourcing

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 809)

Abstract

Crowdsourcing is a complex, sociotechnical problem-solving approach in which a geographically distributed crowd of volunteers collaborates to achieve a common task. One of the major issues faced by crowdsourced projects is the trustworthiness of the crowd. This paper presents a vision for a framework, with supporting methods and tools, for early detection of malicious acts of sabotage in crowdsourced projects by utilizing and scaling digital forensic techniques. The idea is to use the crowd itself to build digital evidence of sabotage through systematic collection and analysis of data from the same crowdsourced project in which the threat is situated. The proposed framework aims to improve the security of crowdsourced projects and their outcomes by building confidence in the trustworthiness of the workers.
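
The paper does not prescribe a concrete detection algorithm; as a purely illustrative sketch of the kind of analysis such a framework might collect and scale, the Python snippet below flags workers whose answers on redundantly assigned tasks diverge from the per-task majority. All names here (Submission, flag_suspects, agreement_threshold) and the threshold value are assumptions introduced for illustration, not part of the authors' proposal.

```python
# Illustrative sketch only: a simple consensus-based check for suspicious
# workers in a redundantly assigned crowdsourcing task. This is NOT the
# framework proposed in the paper; it is a hypothetical baseline heuristic.
from collections import Counter, defaultdict
from dataclasses import dataclass


@dataclass
class Submission:
    worker_id: str
    task_id: str
    answer: str


def flag_suspects(submissions, agreement_threshold=0.5):
    """Return worker IDs whose answers agree with the per-task majority
    less often than `agreement_threshold` (candidates for closer review)."""
    # Group submissions by task so each task's majority answer can be found.
    by_task = defaultdict(list)
    for s in submissions:
        by_task[s.task_id].append(s)

    # Majority answer per task.
    majority = {
        task: Counter(s.answer for s in subs).most_common(1)[0][0]
        for task, subs in by_task.items()
    }

    # Count, per worker, how often their answer matches the majority.
    agree, total = Counter(), Counter()
    for s in submissions:
        total[s.worker_id] += 1
        if s.answer == majority[s.task_id]:
            agree[s.worker_id] += 1

    return {w for w in total if agree[w] / total[w] < agreement_threshold}
```

In practice, such a flag would only mark a worker for further forensic examination (for example, collecting activity logs as additional evidence), not serve as proof of sabotage on its own.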

Author information

Corresponding author

Correspondence to Muneera Bano.

Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Bano, M., Zowghi, D. (2018). Crowd Vigilante. In: Kamalrudin, M., Ahmad, S., Ikram, N. (eds) Requirements Engineering for Internet of Things. APRES 2017. Communications in Computer and Information Science, vol 809. Springer, Singapore. https://doi.org/10.1007/978-981-10-7796-8_9

  • DOI: https://doi.org/10.1007/978-981-10-7796-8_9

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-7795-1

  • Online ISBN: 978-981-10-7796-8

  • eBook Packages: Computer Science, Computer Science (R0)
