To effectively measure the performance of large-scale data management solutions at NetApp, we use a fully automated infrastructure to execute end-to-end system performance tests. Both the software and the user requirements of this infrastructure are complex: the system under test runs a multi-protocol, highly specialized operating system, and the infrastructure serves a diverse audience of developers, analysts, and field engineers (including both sales and support). In this paper we describe our approach to rapidly constructing automated system performance tests using a lightweight "little" (domain-specific) language called SLSL, which allows test specifications to be expressed more effectively. Using a real-world example, we illustrate the expressiveness, flexibility, and ease of use of SLSL by showing a complex test configuration expressed with just a few language constructs. We also demonstrate how SLSL, used in conjunction with our performance measurement lab, enables the rapid deployment of performance tests that yield highly repeatable measurements.