DOI: 10.1145/2666539.2666569
Article

iTest: testing software with mobile crowdsourcing

Published: 17 November 2014

Abstract

In recent years, many crowdsourcing systems have emerged, leading to successful platforms such as Wikipedia, Amazon Mechanical Turk and Waze. In software engineering, crowdtesting has gained increasing interest and adoption, especially among individual developers and smaller companies. In this paper, we present iTest, which combines mobile crowdsourcing and software testing to support the testing of mobile applications and web services. iTest is a framework that lets software developers submit their software and conveniently obtain test results from crowd testers. First, we analyze the key problems that need to be solved in a mobile crowdtesting platform; second, we present the architecture of the iTest framework; third, we introduce the workflow for testing web services in iTest and propose an algorithm for the tester selection problem identified in Section 2; we then explain the development kit that supports testing mobile applications; finally, we report two experiments showing that both the network access type and the tester's location influence the performance of a web service.
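The abstract mentions a tester selection problem and an algorithm for solving it, but the paper's actual algorithm is not reproduced on this page. The sketch below is a minimal illustration, assuming the problem is modeled as covering a set of required test contexts (network types, tester locations, device profiles) with as few crowd testers as possible, solved with a greedy set-cover heuristic; the function name select_testers and the example context labels are hypothetical, not taken from iTest.

    # Hypothetical sketch: tester selection as greedy set cover.
    # Each crowd tester covers a set of test contexts (network type,
    # location, device, ...); testers are picked greedily until every
    # required context of the test task is covered.

    def select_testers(testers, required_contexts):
        """testers: dict mapping tester id -> set of contexts it covers.
        required_contexts: set of contexts the test task must exercise.
        Returns tester ids chosen by the greedy heuristic."""
        uncovered = set(required_contexts)
        chosen = []
        while uncovered:
            # Pick the tester covering the most still-uncovered contexts.
            best = max(testers, key=lambda t: len(testers[t] & uncovered))
            gain = testers[best] & uncovered
            if not gain:
                break  # no remaining tester covers any uncovered context
            chosen.append(best)
            uncovered -= gain
        return chosen

    # Example: cover two network types and two locations with three testers.
    testers = {
        "t1": {"wifi", "city-A"},
        "t2": {"3g", "city-B"},
        "t3": {"wifi", "3g", "city-A"},
    }
    print(select_testers(testers, {"wifi", "3g", "city-A", "city-B"}))
    # -> ['t3', 't2']

The greedy heuristic is the standard logarithmic-factor approximation for set cover; whether iTest uses this or a different formulation is described in the paper itself.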



Published In

CrowdSoft 2014: Proceedings of the 1st International Workshop on Crowd-based Software Development Methods and Technologies
November 2014
66 pages
ISBN:9781450332248
DOI:10.1145/2666539
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Software testing
  2. mobile application
  3. mobile crowdsourcing
  4. web service

Qualifiers

  • Article

Conference

SIGSOFT/FSE'14
