Under the IMPACT program, DHS also supports efforts to develop an online cyber-risk ethics decision support tool to be made widely available to the cyber security Research and Development (R&D) community. The immediate goal of this work is to lower the barrier for researchers evaluating the ethical impact of their data collection, research, and publication efforts, both within the IMPACT project and within the broader researcher community. More broadly, the goal of such a tool is to inform our reasoning about the socio-technical interdependencies in developing and deploying cyber security technologies and measures, and to apply that improved understanding to directly enhance cyber security and to better tailor R&D investments.
Prototype ICT Research Ethics Application Tool and Evaluation
To date, developers have conducted research and development on an online cyber-risk ethics decision support tool, in concert with the IMPACT Program's mission to enable responsible innovation in cyber security R&D. The scope of this work covers the information and communication technology (ICT) research ethics thrust, which addresses the principles, application, and enforcement of ethics in the context of cyber-risk research challenges faced by academia, industry, and the Government. It builds on the knowledge generated by the Menlo Report Working Group and the Ethical Impact Assessment (EIA) framework. More information can be found at the links below.
The Menlo Report
The Menlo Companion Report
The Ethical Assessment Tool Version 1
The Ethical Assessment Tool Version 2
CREDS (Cyber-risk Ethics Decision Support) Tool
Contact firstname.lastname@example.org for access.
Ethics: Cyber-risk Ethics Decision Support (CREDS) Tool
For members of the cyber security Research and Development (R&D) community frustrated by the scarcity of data collection infrastructure and the cost of collecting and curating data, DHS' IMPACT program provides data collection and hosting that enables meaningful cyber security research. Unlike the current ecosystem of "data-rich" and "data-poor" research, dominated by researchers with long-term personal relationships with data suppliers, considerable resources, and years of expertise in the legal, ethical, and operational aspects of data collection and curation, IMPACT democratizes the data, technologies, and processes behind successful data-driven research. We seek to support this mission through the development and evaluation of an online cyber-risk ethics decision support tool to be made widely available to the cyber security R&D community. The tool seeks to reduce the cost of evaluating the ethical impact of data collection, research, and disclosure by codifying knowledge gained both through the Menlo Working Group and through years of applying the Menlo principles to numerous cyber security case studies.
The CREDS Tool is an applied research and development project intended to operationalize a decision support methodology, a conceptual framework, and an interactive online tool to identify, reason about, and manage ethical and legal issues related to cyber-based research (e.g., network and system security). In short, it is an "operationalized" version of the Ethical Impact Assessment (EIA) framework that was coined and commenced under the IMPACT ethics project.
The objectives of the tool are to facilitate research that minimizes potential harm while enabling innovation, and to advance the collective dialogue among researchers, oversight entities, and policymakers about research ethics principles and practices. The functional goals include: estimating and communicating ethical risk; identifying potential impacts of technology; and measuring and improving judgment and reasoning. The methodology involves deriving principles and practices from established law, ethics, and best practices, and then using that output to drive the underlying logic of the tool. The CREDS tool is intended to be a resource for the entire community to engage in repeatable and transparent decision making, in an effort to prevent unintended harm, diminished public trust, and reputational blowback by association arising from undifferentiated comparisons to public or private surveillance and cyber opportunism.
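The methodology described above, deriving assistive questions from ethical principles and using them to drive the tool's logic, can be illustrated with a minimal sketch. The principle names, questions, and risk weights below are illustrative assumptions, not the actual CREDS implementation:

```python
# Hypothetical sketch of a principle-driven ethics decision-support check.
# The questions and weights are illustrative only, not the CREDS tool logic.
from dataclasses import dataclass

@dataclass
class Question:
    principle: str  # Menlo principle the question operationalizes
    text: str       # assistive question posed to the researcher
    weight: int     # relative ethical-risk weight if unaddressed

CHECKLIST = [
    Question("Respect for Persons",
             "Is informed consent obtained or justifiably waived?", 3),
    Question("Beneficence",
             "Are potential harms identified and minimized?", 3),
    Question("Justice",
             "Are subjects selected fairly?", 2),
    Question("Respect for Law and Public Interest",
             "Does the research design comply with applicable law?", 3),
]

def assess(answers: dict) -> tuple:
    """Return a total risk score and the principles flagged as unaddressed."""
    score, flags = 0, []
    for q in CHECKLIST:
        if not answers.get(q.text, False):
            score += q.weight
            flags.append(q.principle)
    return score, flags

# A researcher who has addressed only the first three questions:
score, flags = assess({q.text: True for q in CHECKLIST[:3]})
print(score, flags)  # → 3 ['Respect for Law and Public Interest']
```

A real tool would of course replace the simple weighted sum with the richer reasoning the EIA framework prescribes; the point of the sketch is only that codified questions make the assessment repeatable and transparent.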
Next Steps: Forthcoming development efforts will focus on maturing the prototype tool (alpha) to a public release (beta), which will involve additional GUI, EIA and Evaluation development.
Community Building: Cyber-security Research Ethics Dialogue & Strategy (CREDS) Workshops
The inaugural CREDS Workshop was held on May 23, 2013, in conjunction with the IEEE Security and Privacy Symposium in San Francisco, California. CREDS embraced the theme of "ethics-by-design" in the context of cyber security research, and aimed to: educate participants about underlying ethics principles and applications; discuss ethical frameworks and how they are applied across the various stakeholders and respective communities involved; impart recommendations about how ethical frameworks can inform policymakers in evaluating the ethical underpinnings of critical policy decisions; explore cyber security research ethics techniques, tools, standards, and practices so researchers can apply ethical principles within their research methodologies; and discuss specific case vignettes and explore the ethical implications of common research acts and omissions.
CREDS 2014 built on the discussions, themes, and momentum generated by the inaugural 2013 workshop. Specifically, it targeted the shifting roles, responsibilities, and relationships among researchers, ethical review boards, government agencies, professional societies, and program committees in incentivizing and overseeing ethical research. CREDS 2014 spawned dialogue and practicable solutions around the following proposition: building a more effective research ethics culture is a prerequisite for balancing research innovation (i.e., academic freedom, reduced burdens and ambiguities) with public trust (i.e., respect for privacy and confidentiality, accountability, data quality), so the workshop explored the pillars of such a culture as well as the strategies that might be adopted to incorporate them into research operations.
Ethics of Cyber Security
Starting the Conversation
The logical approach in the face of this dilemma is to revisit first-order ethical principles and their applications, and for that we turned to the resource most often used in advising human subjects research: the Belmont Report. The Belmont Report was the culmination of years of working meetings by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. Although the Belmont Report stands alone in many researchers' and ethics reviewers' minds, it is part of a larger body of supporting documents. During the multiyear deliberation process, 26 papers were written that provide background, scientific observations, and recommendations on topics including basic ethical principles relating to research involving human subjects, the boundaries between research and practice, risk/benefit criteria, and informed consent.
The Belmont Report describes three basic ethical principles and their application:
- Respect for persons. Participation as a research subject is voluntary and follows from informed consent. Individuals should be treated as autonomous agents, and their right to decide about their own best interests should be respected. Individuals with diminished autonomy, incapable of deciding for themselves, are entitled to protection.
- Beneficence. Do no harm. Maximize possible benefits and minimize possible harm. Systematically assess both risk and benefit.
- Justice. Each person should receive an equal share in the treatments and benefits of research, according to individual need, effort, societal contribution, and merit. There should be fairness of procedures and outcomes in the selection of subjects.
These basic principles serve as the starting point for our exploration of the role of ethics in cyber security research.
The Menlo Report
The Menlo Report was created under an informal, grassroots process that was catalyzed by the ethical issues raised in ICT security research. Discussions at conferences and in public discourse exposed growing awareness of strong ethical debates in computer security research, and revealed that existing US oversight authorities might have been unaware of some of this research or might not have believed they had a mandate to review it. The Menlo Report: Ethical Principles Guiding Information and Communication Technology Research is the core document stemming from the series of working group meetings that broached these issues. This process inspired the production and dissemination of other documents intended to supplement and clarify some of the more complex and controversial aspects of the role of ethics in cyber security. The Menlo Report details four core ethical principles: three from the original Belmont Report (respect for persons, beneficence, and justice) and an additional principle, respect for law and public interest. The report explains each of these in the context of ICT research.
The Menlo Report’s major appendix is a Companion Report: “Applying Ethical Principles to Information and Communication Technology Research.” The companion goes into greater depth on the Menlo Report’s history and relationship to the Belmont Report. It expands on the application of principles, including providing assistive questions that help design and evaluate research in conformance with the stated principles.
Socializing Cyber Security Research Ethics
Though an important first step, the Menlo Report is only an initial effort to raise the level of discourse about the role of ethics in cyber security research. Our forthcoming work in this area will continue to explore the practical implications and needs that such discourse brings to the fore. At present this entails galvanizing the community around generally acceptable guidelines based on the principles outlined in the Menlo Report.
In January 2010, an early version of the Menlo Report’s principles and applications introduced an ethical impact assessment (EIA) framework intended to facilitate ethical ICT research design and evaluation. In addition, working group members have participated in multiple panels, workshops, and presentations at which the Menlo Report and related topics were discussed. Some of these events included:
- 2010 Workshop on Ethics in Computer Security Research
- 2010 Symposium on Usable Privacy and Security
- 2010 Network and Distributed System Security Symposium
- Public Responsibility in Medicine and Research’s 2011 Social, Behavioral, and Education Research Conference
- 1st International Digital Ethics Symposium
- 2011 Honeynet Project Public Day
- 2011 Annual Computer Security Applications Conference
- 2011 Anti-Phishing Working Group Meeting
Through IMPACT we are continuing to explore the complex issues surrounding ethics as they relate to cyber security research. We continue to focus our efforts on developing community-based guidelines that can help researchers and research review boards understand the importance of assessing the impacts of a given research project.
IMPACT Research Ethics Links
The 1st Cyber-security Research Ethics Dialog & Strategy (CREDS) Workshop held May 23, 2013 in San Francisco, CA, co-located with the 34th IEEE Symposium on Security and Privacy (IEEE S&P 2013), an event of the IEEE Computer Society's Security and Privacy Workshops (SPW 2013),
The 2nd Cyber-security Research Ethics Dialog & Strategy (CREDS II) held May 17, 2014 in San Jose, CA, co-located with the 35th IEEE Symposium on Security and Privacy (IEEE S&P), an event of the IEEE CS Security and Privacy Workshops (SPW),
The ACM SIGCOMM held August 17-22 2014 in Chicago, IL,
The Workshop on Ethics in Networked Systems Research, Co-located with ACM SIGCOMM'15 held August 21 2015 in London, UK,
Stakeholders Acting Together on the Ethical Impact Assessment of Research and Innovation (SATORI) Workshop on Cost Effectiveness of Ethics Assessments held May 30, 2016 in Copenhagen, Denmark,
Beyond IRBs: Ethical Review Processes for Big Data Research, Keynote, held December 10, 2015 in Washington, DC,
Beyond Warm & Fuzzy, Ethics as a Value Prop, ENIGMA 2017 (February 2017),
"Cyber Research Ethics Decision Support (CREDS) Tool," Proceedings of the 2015 ACM SIGCOMM Workshop on Ethics in Networked Systems Research, London, United Kingdom (August 21, 2015),