DOI: 10.1145/2661088.2661089

ACDC-JS: explorative benchmarking of JavaScript memory management

Published: 14 October 2014

Abstract

We present ACDC-JS, an open-source JavaScript memory management benchmarking tool. ACDC-JS incorporates a heap model based on real web applications and can be configured to expose virtually any relevant performance characteristic of JavaScript memory management systems. ACDC-JS is based on ACDC, a benchmarking tool for C/C++ that models periodic allocation and deallocation behavior (AC) as well as persistent memory (DC). We identify important characteristics of JavaScript mutator behavior and propose a configurable heap model, based on typical distributions of these characteristics, as the foundation for ACDC-JS. We describe heap analyses of 13 real web applications, extending existing work on JavaScript behavior analysis. Our experimental results show that ACDC-JS enables performance benchmarking and debugging of state-of-the-art JavaScript virtual machines such as V8 and SpiderMonkey by exposing key aspects of their memory management performance.
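To illustrate the AC/DC idea the abstract describes, the following is a minimal sketch of a synthetic mutator: objects are allocated in periodic waves and dropped after a bounded lifetime (AC), while a small fraction survives for the whole run (DC). This is an illustrative assumption-laden sketch, not the actual ACDC-JS tool; the function name, object sizes, lifetimes, and counts are all invented for this example.

```javascript
// Hypothetical sketch of an AC/DC-style mutator workload (not ACDC-JS itself).
// AC: objects allocated in waves, made unreachable after `lifetime` waves.
// DC: a small persistent set that stays reachable for the entire run.
function acdcSketch(waves, objsPerWave, lifetime) {
  const persistent = [];  // DC: survives until the end of the benchmark
  const transient = [];   // AC: sliding window of recent waves
  let allocated = 0;

  for (let w = 0; w < waves; w++) {
    const wave = [];
    for (let i = 0; i < objsPerWave; i++) {
      // Illustrative object shape; real workloads would vary size/type.
      wave.push({ wave: w, payload: new Array(16).fill(i) });
      allocated++;
    }
    transient.push(wave);
    if (w % 10 === 0) persistent.push(wave[0]);  // occasionally promote to DC

    // Drop waves older than `lifetime`, making them collectible (AC phase).
    while (transient.length > lifetime) transient.shift();
  }
  return { allocated, live: transient.length * objsPerWave + persistent.length };
}

const stats = acdcSketch(100, 1000, 5);
console.log(stats.allocated, stats.live);
```

Varying the wave size, lifetime, and promotion rate is what would let such a workload stress different aspects of a collector, e.g. nursery throughput versus old-space retention.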




Published In

DLS '14: Proceedings of the 10th ACM Symposium on Dynamic Languages
October 2014
160 pages
ISBN:9781450332118
DOI:10.1145/2661088
Also published in ACM SIGPLAN Notices, Volume 50, Issue 2 (DLS '14), February 2015, 146 pages.
ISSN: 0362-1340
EISSN: 1558-1160
DOI: 10.1145/2775052
Editor: Andy Gill

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. automatic heap management
  2. benchmarking
  3. program behavior

Qualifiers

  • Research-article

Conference

SPLASH '14

Acceptance Rates

DLS '14 Paper Acceptance Rate: 13 of 28 submissions, 46%
Overall Acceptance Rate: 32 of 77 submissions, 42%
