Person:
Orso, Alessandro


Publication Search Results

Now showing 1 - 2 of 2
  • Item
    BugRedux: Reproducing Field Failures for In-house Debugging
    (Georgia Institute of Technology, 2011) Jin, Wei ; Orso, Alessandro
    When a software system fails in the field, on a user machine, and the failure is reported to the developers, the developers in charge of debugging the failure must be able to reproduce the failing behavior in house. Unfortunately, reproducing field failures is a notoriously challenging task that has little support today. Typically, developers are provided with a bug report that contains data about the failure, such as memory dumps and, in the best case, some additional information provided by the user. However, this data is usually insufficient for recreating the problem, as recently reported in a survey conducted among developers of the Apache, Eclipse, and Mozilla projects. Even more advanced approaches for gathering field data and helping with in-house debugging tend to collect either too little information, which results in inexpensive but often ineffective techniques, or too much information, which makes the techniques effective but too costly. To address this issue, we present a novel general approach for supporting in-house debugging of field failures, called BugRedux. The goal of BugRedux is to synthesize, using execution data collected in the field, executions that mimic the observed field failures. We define several instances of BugRedux that collect different types of execution data and perform, through an empirical study, a cost-benefit analysis of the approach and its variations. In the study, we use a tool that implements our approach to recreate 17 failures of 15 real-world programs. Our results are promising and lead to several findings, some of which are unexpected. In particular, they show that, by collecting a suitable yet limited set of execution data, the approach can synthesize in-house executions that reproduce the observed failures.
  • Item
    Execution Hijacking: Improving Dynamic Analysis by Flying off Course
    (Georgia Institute of Technology, 2010) Tsankov, Petar ; Jin, Wei ; Orso, Alessandro ; Sinha, Saurabh
    Typically, dynamic-analysis techniques operate on a small subset of all possible program behaviors, which limits their effectiveness and the representativeness of the computed results. To address this issue, a new paradigm is emerging: execution hijacking—techniques that explore a larger set of program behaviors by forcing executions along specific paths. Although hijacked executions are infeasible for the given inputs, they can still produce feasible behaviors that could be observed under other inputs. In such cases, execution hijacking can improve the effectiveness of dynamic analysis without requiring the (expensive) generation of additional inputs. To evaluate the usefulness of execution hijacking, we defined, implemented, and evaluated several variants of it. Specifically, we performed an empirical study in which we assessed whether execution hijacking could improve the effectiveness of two common dynamic analyses: software testing and memory error detection. The results of the study show that execution hijacking, if suitably performed, can indeed help dynamic-analysis techniques.
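The core idea behind BugRedux—ship home a lightweight trace of execution data (e.g., a call sequence) rather than full inputs, then synthesize an in-house execution that mimics it—can be illustrated with a toy sketch. This is a hypothetical illustration, not the authors' tool: BugRedux uses program analysis and input synthesis, whereas this sketch stands in with a brute-force search over candidate inputs; all function names (`process`, `handle_negative`, `reproduces`) are invented for the example.

```python
# Toy sketch of the BugRedux idea (hypothetical, not the authors' implementation):
# in the field, record only the call sequence; in house, find an input whose
# execution matches that sequence and reproduces the failure.

trace = []  # lightweight execution data collected during a run

def process(x):
    trace.append("process")
    if x < 0:
        return handle_negative(x)
    return x * 2

def handle_negative(x):
    trace.append("handle_negative")
    if x == -7:                      # buggy path: fails for one value
        raise ValueError("field failure")
    return -x

# Field run: the user hits the bug; only the call sequence is reported.
try:
    process(-7)
except ValueError:
    field_trace = list(trace)        # e.g. ["process", "handle_negative"]

def reproduces(x):
    """In-house check: does input x mimic the field trace and fail?"""
    trace.clear()
    try:
        process(x)
    except ValueError:
        return trace == field_trace
    return False

# Stand-in for input synthesis: search candidates until one reproduces
# the observed failure (BugRedux itself synthesizes inputs analytically).
candidate = next(x for x in range(-10, 11) if reproduces(x))
```

The sketch captures the cost-benefit trade-off the abstract discusses: the call sequence is cheap to collect in the field, yet constrains the in-house search enough to recreate the failing behavior.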
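Execution hijacking—forcing a branch outcome so that a dynamic analysis can observe behavior an input would never trigger—can likewise be sketched in a few lines. This is a hypothetical toy, not the paper's implementation: the `branch` wrapper and `FORCE_BRANCH` switch are invented for illustration, standing in for instrumentation that overrides branch predicates at runtime.

```python
# Toy sketch of execution hijacking (hypothetical): override a branch
# predicate so a dynamic analysis reaches a rarely-executed path without
# generating a new input.

FORCE_BRANCH = False  # the "hijack" switch an analysis tool would flip

def branch(cond):
    # Hijackable branch predicate: report the real outcome, unless forced.
    return True if FORCE_BRANCH else cond

findings = []  # what the dynamic analysis observes

def check_buffer(data, size):
    if branch(len(data) > size):     # rarely true for typical inputs
        findings.append("overflow-guard path reached")
        return None
    return data[:size]

# A normal run with a typical input never exercises the guarded path...
check_buffer(b"abc", 16)

# ...but hijacking the branch explores it for the same input. The hijacked
# execution is infeasible for b"abc", yet the behavior it exposes is
# feasible under other (longer) inputs.
FORCE_BRANCH = True
check_buffer(b"abc", 16)
```

This mirrors the abstract's point: the hijacked path is infeasible for the given input, but the analysis still gains coverage of a behavior that some other input could legitimately produce, without paying for input generation.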