High-Precision Evaluation of Both Static and Dynamic Tools Using DataRaceBench
Parallel Programming Languages and Models
Reliability and Resiliency
Time: Friday, 19 November 2021, 10:50am - 11:10am CST
Description: DataRaceBench (DRB) is a dedicated benchmark suite for evaluating tools that find data race bugs in OpenMP programs. Using microbenchmarks with and without data races, DRB generates standard quality metrics and provides systematic, quantitative assessments of data race detection tools. In this paper, we present a new version of DRB with several improvements. First, we design a novel approach to enable high-precision checking of tool results; it relies on a format that accurately encodes data race ground-truth information. Second, the DRB workflow has been improved to support static data race detection tools. Finally, an enhanced code similarity analysis is developed to better detect redundant code patterns. Our experiments show that the improved DRB generates more accurate reports for both static and dynamic data race detection tools.