Current testing practices for spreadsheets are ad hoc: spreadsheet users put 'test formulas' in their spreadsheets to validate outcomes. In this paper we show that this practice is common by analyzing a large set of spreadsheets from practice to investigate whether spreadsheet users currently test. In a follow-up analysis, we study the test practices found in this set to understand in depth how spreadsheet users test in the absence of formal testing methods. Subsequently, we describe the Expector approach, which extracts test formulas already present in a spreadsheet, presents them to the user, and suggests improvements, both at the level of individual test formulas and for the spreadsheet as a whole by increasing the coverage of the test formulas. Finally, we offer support for understanding why a test formula is failing. We end the paper with an example underlining the applicability of our approach.
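To make the notion of a user-written 'test formula' concrete, the following is a minimal sketch (not the actual Expector implementation) of the kind of heuristic that could flag such formulas: conditionals whose condition compares a computed value against an expectation. The cell addresses and the exact heuristic are illustrative assumptions.

```python
# Hypothetical sketch: flag spreadsheet formulas that look like
# user-written test formulas, i.e. IF conditionals whose condition
# contains a comparison operator. This is NOT the Expector tool itself.
import re

def looks_like_test_formula(formula: str) -> bool:
    """Heuristically classify a formula string as a test formula:
    an IF whose condition contains a comparison operator."""
    f = formula.strip().upper()
    if not f.startswith("=IF("):
        return False
    # Look for =, <>, <=, >=, <, > anywhere after the IF(
    return re.search(r"(=|<>|<=|>=|<|>)", f[4:]) is not None

# Example cells (illustrative): a computed sum, a manually entered
# expected total, and a test formula comparing the two.
cells = {
    "B11": "=SUM(B2:B10)",
    "C11": "1000",
    "D11": '=IF(B11=C11,"OK","ERROR")',
}

tests = [cell for cell, f in cells.items() if looks_like_test_formula(f)]
print(tests)  # ['D11']
```

In this spirit, D11 validates that the computed sum in B11 matches the expected total in C11; a real extraction tool would also have to handle nested functions, cross-sheet references, and non-IF test idioms.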