DOI: 10.1145/3176258.3176340

Identifying Relevant Information Cues for Vulnerability Assessment Using CVSS

Published: 13 March 2018

Abstract

The assessment of a new vulnerability draws on information from several data sources and produces a 'severity' score for the vulnerability. The Common Vulnerability Scoring System (CVSS) is the reference standard for this assessment. Yet, no guidance currently exists on which information aids a correct assessment and should therefore be considered. In this paper we address this problem by evaluating which information cues increase (or decrease) assessment accuracy. We devise a block-design experiment in which 67 software engineering students assess vulnerabilities under varying sets of vulnerability information, and we measure scoring accuracy for each information set. We find that the baseline vulnerability descriptions provided by standard vulnerability sources supply only part of the information needed for an accurate assessment. Additional information on assets, attacks, and vulnerability type increases assessment accuracy; conversely, information on known threats misleads the assessor, decreases accuracy, and should be avoided when assessing vulnerabilities. These results are a step toward formalizing vulnerability communication, for example to fully automate security assessments.
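For readers unfamiliar with CVSS, the sketch below illustrates how a CVSS v3 base score is derived from the base-metric values an assessor selects; the experiment concerns the accuracy of those selections, not the arithmetic itself. The metric weights and rounding rule follow FIRST's public v3.1 specification (the v3.0 base formula is the same); the paper does not state which CVSS version the participants used, and the function name and single-letter vector encoding below are illustrative assumptions, not taken from the paper.

```python
import math

# CVSS v3 base-metric weights (FIRST specification); single-letter keys
# mirror the short vector notation, e.g. AV:N, AC:L, PR:N, UI:N, S:U, C:H.
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20}   # Attack Vector
AC = {"L": 0.77, "H": 0.44}                          # Attack Complexity
PR = {                                               # Privileges Required (depends on Scope)
    "U": {"N": 0.85, "L": 0.62, "H": 0.27},
    "C": {"N": 0.85, "L": 0.68, "H": 0.50},
}
UI = {"N": 0.85, "R": 0.62}                          # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}               # Confidentiality / Integrity / Availability


def roundup(x: float) -> float:
    """CVSS 'round up': smallest value with one decimal place that is >= x."""
    return math.ceil(x * 10) / 10


def base_score(av, ac, pr, ui, s, c, i, a) -> float:
    """Compute the CVSS v3 base score from single-letter base-metric values."""
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    if s == "U":                     # Scope unchanged
        impact = 6.42 * iss
    else:                            # Scope changed
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    exploitability = 8.22 * AV[av] * AC[ac] * PR[s][pr] * UI[ui]
    if impact <= 0:
        return 0.0
    total = impact + exploitability
    if s == "C":
        total *= 1.08
    return roundup(min(total, 10))


# Example: the vector AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H scores 9.8 (Critical).
print(base_score("N", "L", "N", "N", "U", "H", "H", "H"))
```

The score itself is deterministic once the metric values are fixed; the paper's question is whether different information cues lead assessors to pick those metric values correctly.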

Published In

CODASPY '18: Proceedings of the Eighth ACM Conference on Data and Application Security and Privacy
March 2018
401 pages
ISBN:9781450356329
DOI:10.1145/3176258
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 13 March 2018


Author Tags

  1. cvss
  2. software vulnerability assessment
  3. vulnerability information

Qualifiers

  • Short-paper

Funding Sources

  • NWO

Conference

CODASPY '18

Acceptance Rates

CODASPY '18 paper acceptance rate: 23 of 110 submissions (21%)
Overall acceptance rate: 149 of 789 submissions (19%)

