Empirical Comparison Between Conventional and AI-based Automated Unit Test Generation Tools in Java
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
2023 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

Unit testing plays a crucial role in ensuring the quality and reliability of software systems. However, writing unit tests manually is often a slow and time-consuming process. With recent advancements in artificial intelligence (AI), new tools have emerged that automate unit test generation to address this issue. But how do these new AI-based tools compare to conventional automated unit test generation tools? To answer this question, we compared two state-of-the-art conventional unit test generation tools (EVOSUITE and RANDOOP) with the sole commercially available AI-based unit test tool for Java (DIFFBLUE COVER). We tested them on 10 sample classes from 3 real-life projects in the Defects4J dataset, evaluating their performance in terms of code coverage, mutation score, and fault detection. The results showed that EVOSUITE achieved the highest code coverage, averaging 89%, while RANDOOP and DIFFBLUE COVER achieved similar results, averaging 63%. In terms of mutation score, DIFFBLUE COVER had the lowest average at 40%, while EVOSUITE and RANDOOP scored 67% and 50%, respectively. For fault detection, EVOSUITE and RANDOOP detected a higher number of bugs (7 out of 10 and 5 out of 10, respectively) than DIFFBLUE COVER, which found only 4 out of 10. Although the AI-based tool was outperformed on all three criteria, it still shows promise: it achieved adequate results, in some cases even surpassing the conventional tools, while generating significantly fewer total assertions and more comprehensive tests. Nonetheless, the study acknowledges its limitations regarding the restricted number of AI-based tools used and the small number of projects drawn from Defects4J.
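As background for one of the metrics above: a mutation score is the percentage of artificially seeded faults ("mutants") that a test suite detects ("kills"). The sketch below is illustrative only, not code from the thesis; the class and method names are made up for the example, and the sample numbers are the average reported for EVOSUITE in the abstract.

```java
// Minimal sketch of how a mutation score is computed:
// score = killed mutants / total mutants, expressed as a percentage.
public class MutationScore {

    // Returns the mutation score as a percentage of mutants killed.
    static double score(int killedMutants, int totalMutants) {
        if (totalMutants == 0) {
            return 0.0; // no mutants generated: define the score as 0
        }
        return 100.0 * killedMutants / totalMutants;
    }

    public static void main(String[] args) {
        // A suite that kills 67 of 100 mutants scores 67%.
        System.out.println(score(67, 100)); // prints 67.0
    }
}
```

Tools such as PIT (a common Java mutation-testing framework) automate the seeding and killing steps; the arithmetic for the final score is as above.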

Place, publisher, year, edition, pages
2023, p. 30
Keywords [en]
Software Testing, Unit Testing, Automatic Test Case Generation, AI, Defects4J, Experiment
National Category
Computer Sciences; Computer and Information Sciences
Identifiers
URN: urn:nbn:se:lnu:diva-121527
OAI: oai:DiVA.org:lnu-121527
DiVA id: diva2:1764443
Subject / course
Computer Science
Educational program
Software Technology Programme, 180 credits
Available from: 2023-06-13. Created: 2023-06-08. Last updated: 2023-06-13. Bibliographically approved.

Open Access in DiVA

Degree project (1053 kB), 566 downloads
File information
File name: FULLTEXT01.pdf. File size: 1053 kB. Checksum (SHA-512):
b1b9111393feeee1aaf9862d6bd38f997a82883eb37f0326998d9ccf14f1b57aa731005aedeeb30f4057ca1b25e9eb18a5f825c7f85c1638f5e13a67efb1cace
Type: fulltext. Mimetype: application/pdf

Search in DiVA

By author/editor
Gkikopouli, Marios; Bataa, Batjigdrel
By organisation
Department of computer science and media technology (CM)
Computer Sciences; Computer and Information Sciences

Total: 566 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 2292 hits