Pex

Pex – White Box Test Generation for .NET. Pex automatically produces a small test suite with high code coverage for a .NET program. To this end, Pex performs a systematic program analysis (using dynamic symbolic execution, similar to path-bounded model checking) to determine test inputs for Parameterized Unit Tests. Pex learns the program's behavior by monitoring execution traces, and it uses a constraint solver to produce new test inputs that exercise different program behavior. In one case study, Pex was applied to a core component of the .NET runtime that had already been tested extensively over several years; Pex found errors, including a serious issue.
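The loop described above (run the program on concrete inputs, record the path condition from the execution trace, then ask a constraint solver for inputs that flip a branch) can be sketched in miniature. This is an illustrative toy, not Pex's actual API: the names `program_under_test`, `solve`, and `explore` are invented here, and a brute-force search over a small integer domain stands in for the SMT solver (Z3) that Pex uses.

```python
from typing import Callable, List, Optional, Tuple

# Each recorded branch is (predicate, outcome taken); the predicate is kept
# as a callable so the toy "solver" can evaluate it on candidate inputs.
Trace = List[Tuple[Callable[[int], bool], bool]]

def program_under_test(x: int, trace: Trace) -> str:
    """Toy parameterized-unit-test body with two feasible paths."""
    p = lambda v: v > 10
    if p(x):
        trace.append((p, True))
        return "big"
    trace.append((p, False))
    return "small"

def solve(constraints: Trace, domain=range(-100, 101)) -> Optional[int]:
    """Brute-force stand-in for Pex's constraint solver."""
    for v in domain:
        if all(pred(v) == expected for pred, expected in constraints):
            return v
    return None

def explore(entry, seed: int):
    """Concolic loop: execute, then negate branch decisions to reach new paths."""
    inputs, covered = [seed], set()
    worklist = [seed]
    while worklist:
        x = worklist.pop()
        trace: Trace = []
        entry(x, trace)
        path = tuple(taken for _, taken in trace)
        if path in covered:
            continue
        covered.add(path)
        # Flip each branch decision and solve for an input that takes
        # the unexplored direction.
        for i in range(len(trace)):
            flipped = trace[:i] + [(trace[i][0], not trace[i][1])]
            new_x = solve(flipped)
            if new_x is not None:
                worklist.append(new_x)
                inputs.append(new_x)
    return inputs, covered

inputs, covered = explore(program_under_test, seed=0)
```

Starting from the single seed `0`, the loop discovers an input for the `x > 10` branch as well, so both paths of the toy program end up covered; the generated `inputs` list is the analogue of Pex's small, high-coverage test suite.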


References in zbMATH (referenced in 28 articles )

Showing results 1 to 20 of 28.

  1. Ahrendt, Wolfgang; Chimento, Jesús Mauricio; Pace, Gordon J.; Schneider, Gerardo: Verifying data- and control-oriented properties combining static and runtime verification: theory and tools (2017)
  2. Lucanu, Dorel; Rusu, Vlad; Arusoaie, Andrei: A generic framework for symbolic execution: a coinductive approach (2017)
  3. Rusu, Vlad; Arusoaie, Andrei: Executing and verifying higher-order functional-imperative programs in Maude (2017)
  4. Liang, Tianyi; Reynolds, Andrew; Tsiskaridze, Nestan; Tinelli, Cesare; Barrett, Clark; Deters, Morgan: An efficient SMT solver for string constraints (2016)
  5. Cavalcanti, Ana; Gaudel, Marie-Claude: Test selection for traces refinement (2015)
  6. Xie, Tao; Zhang, Lu; Xiao, Xusheng; Xiong, Ying-Fei; Hao, Dan: Cooperative software testing and analysis: advances and challenges (2014)
  7. Brucker, Achim D.; Wolff, Burkhart: On theorem prover-based testing (2013)
  8. Amato, Gianluca; Parton, Maurizio; Scozzari, Francesca: Discovering invariants via simple component analysis (2012)
  9. Carlier, Matthieu; Dubois, Catherine; Gotlieb, Arnaud: A first step in the design of a formally verified constraint-based testing tool: FocalTest (2012)
  10. Christakis, Maria; Müller, Peter; Wüstholz, Valentin: Collaborative verification and testing with explicit assumptions (2012)
  11. Janičić, Predrag: URSA: a system for uniform reduction to SAT (2012)
  12. Kosmatov, Nikolai; Williams, Nicky; Botella, Bernard; Roger, Muriel; Chebaro, Omar: A lesson on structural testing with PathCrawler-online.com (2012)
  13. Vanoverberghe, Dries; de Halleux, Jonathan; Tillmann, Nikolai; Piessens, Frank: State coverage: Software validation metrics beyond code coverage (2012)
  14. Yang, Guowei; Khurshid, Sarfraz; Kim, Miryung: Specification-based test repair using a lightweight formal method (2012)
  15. Fähndrich, Manuel; Logozzo, Francesco: Static contract checking with abstract interpretation (2011)
  16. Giannakopoulou, Dimitra; Bushnell, David H.; Schumann, Johann; Erzberger, Heinz; Heere, Karen: Formal testing for separation assurance (2011)
  17. Hooimeijer, Pieter; Veanes, Margus: An evaluation of automata algorithms for string analysis (2011)
  18. Obdržálek, Jan; Trtík, Marek: Efficient loop navigation for symbolic execution (2011)
  19. Tschannen, Julian; Furia, Carlo A.; Nordio, Martin; Meyer, Bertrand: Usable verification of object-oriented programs by combining static and dynamic techniques (2011)
  20. Alshraideh, Mohammad; Bottaci, Leonardo; Mahafzah, Basel A.: Using program data-state scarcity to guide automatic test data generation (2010)
