Call for the 2nd Edition of the International Software Testing Contest

Goal

The IEEE International Conference on Software Testing, Verification and Validation (ICST) is one of the premier conferences for research in all areas of software quality. The conference has historically been a platform for both collaboration and competition, where research ideas, approaches and results are shared and compared. In 2017, ICST aims to continue this tradition of competitive software testing research, following up on the success of the bug-bash contest from ICST 2015, with the 2nd edition, which we call the International Software Testing Contest! In the 2017 contest, researchers and practitioners will compete with their approaches and tools for GUI-based testing to see which is “the best”! Great prizes can be won! Moreover, the results of the competition will be synthesized into a research paper to which all participants are invited as authors!

How can you compete?

The contest will be run as an experiment in which each participant uses a GUI-based test technique of their choice (the treatment) to identify as many software faults as possible in a common application under test (the subject). Any tool or technique may be used, including static analysis, but some preparation, e.g. the definition of manual or automated test cases, must be performed and submitted prior to the contest. If you have your own tool, you are more than welcome to compete with it! Alternatively, you can use an existing tool, such as one of the following open-source examples:

  1. Sikuli (http://www.sikuli.org/)
  2. GUITAR (https://sourceforge.net/projects/guitar/)
  3. JAutomate (http://jautomate.com/)
  4. TESTAR (http://www.testar.org)
  5. PBGT (http://paginas.fe.up.pt/~apaiva/pbgtwiki/doku.php?id=start)
  6. …

In preparation for the competition, which will be held at the ICST venue, all participants will be provided with the defect-free version of the application under test that will be used during the competition. This will allow you to create test cases and models and/or fine-tune your tools/techniques in preparation for the main event! To ensure your testing effort can be compared with the rest, you must keep a detailed working diary in which you log how much time you spend on the different preparation tasks. This diary is a crucial additional input for comparing and evaluating the industrial viability (effectiveness, efficiency and usage) of the tools and techniques. So we rely on research ethics!

During the competition, each participant will have the same amount of time (yet to be decided) to find as many faults (failures or defects) as possible in the competition application under test. Note: the number of faults found, and their severity, will be evaluated to determine the winner of the competition.
In addition, all submitted/used tools and techniques will be judged on their perceived industrial applicability by a jury of industrial practitioners and external experts prior to the event. The score of this evaluation will then be added to the number of defects found to produce each participant’s final score.

Apply to the competition!

In preparation for the contest, we require each participant to first submit an expression of interest to compete, followed by a description of the tool or approach they aim to use, explaining how it works and its prerequisites. In particular, each submission must include a description of:

  1. Platform: On which platforms can the tool/technique be applied, e.g. desktop, mobile, web?
  2. Language: To which programming language(s) can the tool/technique be applied, e.g. Java, OpenC?
  3. Domain: Is the tool/technique intended for exploratory testing or regression testing?
  4. Setup: How much time is spent on setup, fine-tuning and creating test cases?

This information will be compiled across all applicants to select a suitable open-source application for the competition. The application will then be seeded with defects at different levels of system abstraction. The effectiveness of each tool/technique will then be measured as mutation score per unit of time spent in preparation. Note: we aim to accommodate as many participants’ tool prerequisites as possible, but we do NOT guarantee that all prerequisites for all tools/techniques will be supported in the competition. Consequently, your contribution may NOT be accepted for entry in the competition if it is considered too different from the other entries.
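To make the two evaluation measures concrete, here is a minimal sketch of how they could be computed. The call does not specify an exact weighting or formula, so the function names, the example numbers, and the simple ratio and sum below are purely illustrative assumptions, not the official scoring rules:

```python
# Illustrative sketch only -- the call does not define the exact scoring
# formulas; these names and numbers are assumptions for demonstration.

def effectiveness(killed_mutants: int, total_mutants: int, prep_hours: float) -> float:
    """Assumed effectiveness metric: mutation score per hour of preparation.

    Mutation score is the fraction of seeded defects (mutants) detected.
    """
    mutation_score = killed_mutants / total_mutants
    return mutation_score / prep_hours

def final_score(defects_found: int, jury_score: int) -> int:
    """Per the call, the jury's applicability score is added to the
    number of defects found to produce the final score."""
    return defects_found + jury_score

# Hypothetical example: a tool detects 18 of 30 seeded defects after
# 4 hours of preparation, finds 12 defects live, and receives a jury score of 7.
print(effectiveness(18, 30, 4))  # 0.15
print(final_score(12, 7))        # 19
```

The diary of preparation time described above is exactly what would supply the `prep_hours` input in a comparison like this.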

If your entry is accepted to the competition, we kindly request that you submit the test tool, or a description of the technique, to us so that we can perform an external evaluation of its usability and industrial viability.

The Champion

The winner of this prestigious competition will be awarded a prize (still to be decided), supplied by the competition’s sponsors. The winner’s achievement will also be announced to everyone during the last day of the conference.

How to apply and important dates

This competition is open to everyone and is a great opportunity to compare your testing expertise, your tool, or both against other market- and research-leading GUI-based test automation tools. If you wish to compete, or have additional questions, please contact the organisers or submit your application via e-mail as a PDF (max 3 pages) to the co-chairs.

Important Dates
Submission of entries: entries are currently being accepted
Notification of acceptance: upon submission of an entry
Deadline for submission of the test tool/scripts/cases: Day of the competition at ICST (Date TBD)

Testing Contest Co-Chairs

  • Emil Alégroth, Blekinge Institute of Technology (Emil.Alegroth@Bth.se)
  • Tanja E.J. Vos, Open Universiteit, Department of Computer Science (Tanja.Vos@ou.nl)
  • Shinsuke Matsuki, Veriserve Corporation (shinsk@gmail.com)
  • Kinji Akemine, NTT DATA Corp. (kjstylepp@gmail.com)