Risk Assessment Standard

Standard Number: 1.11.2.1.3
Category: Information Security
Owner: Information Technology Services
Effective: October 5, 2021
Revision History: None
Review Date: October 4, 2024

  1. PURPOSE, SCOPE AND RESPONSIBILITIES

    1. Pursuant to the Information Security Policy, the University will conduct risk assessments to identify potential cybersecurity risks to the University. Risk assessments seek to identify risks to the University’s core mission/business functions, processes, segments, infrastructure, support services, or information systems.
    2. This Standard identifies how the University will conduct risk assessments to assess the risk to the University, including its mission, functions, image, or reputation, organizational assets, and individuals, resulting from the operation of organizational systems and the associated processing, storage, or transmission of University Data. This Standard is based on NIST 800-30r1: Guide for Conducting Risk Assessments.
    3. The Chief Information Officer, supported by the Chief Information Security Officer, is responsible for the implementation and enforcement of this Standard.
    4. Information Security Services (“ISS”) is responsible for conducting risk assessments, maintaining records of risk assessments conducted, documenting identified technology risks to the University, and reporting technology risks to senior management to make risk-informed decisions.
    5. Pursuant to the HIPAA Hybrid Entity Designation Policy, the Health Sciences Center Privacy and Security Team is responsible for conducting annual risk assessments on named University Health Care Components.
    6. Academic and administrative IT leaders and Information System Owners are responsible for working with ITS to conduct risk assessments and remediate identified risks.
  2. Risk Assessment Tiers

    1. Risk assessments will be conducted at the following tier levels:
      1. Tier 1. Assessments conducted at the University level, focusing comprehensively across mission and business lines. Tier 1 assessments include annual campus-wide cybersecurity assessments.
      2. Tier 2. Assessments conducted on specific business/academic processes, departments, units or services. Tier 2 assessments include annual HIPAA risk assessments.
      3. Tier 3. Assessments conducted on individual University Technology Resources, such as a specific asset or information system to identify its overall risk to the University. Tier 3 assessments include requests for compliance exceptions to Technology Governance. See Compliance Exception Management Standard.
  3. Risk Assessment Preparation

    1. The following items must be identified and documented prior to conducting a risk assessment:
      1. Purpose. Each risk assessment’s purpose must be stated to ensure the appropriate and intended information is assessed. The purpose should also identify if the assessment is an initial or a subsequent assessment.
      2. Scope. The scope must specify exactly what will be considered in the assessment, including the system owner/responsible party, system components, business units/departments, and processes that will be assessed. Risk assessments should include consideration of IT-focused fraud scenarios (i.e., threats and vulnerabilities arising from the use of IT).
      3. Assumptions and Constraints. All assumptions, constraints, identified risk tolerance, and priorities used within the University to make operational decisions must be made explicit. Assumptions in key areas relevant to risk assessments include, but are not limited to: known Threat sources and Threat events; known vulnerabilities or predisposing conditions; potential impacts of risks to the University; the primary mission/business functions; operational considerations related to the assessed activities; and any approved compliance exceptions for the system/department.
      4. Information Sources Utilized. Identify the specific information sources that will be used to evaluate the current state of existing controls and their effectiveness. Sources can be internal or external (e.g., ISAC) to the University and may include, but are not limited to, the following:
        1. Self-assessment questionnaires completed about an information system or department related to their security and privacy practices, business requirements, and mandated security requirements.
        2. Security assessments, such as network vulnerability scans, penetration testing, web application scans, Sensitive Data identification scans, database compliance checks, anti-virus scans, and control testing. Security assessments may also include review of media reports of new vulnerabilities and security incident reports. See Security Assessment Standard.
        3. Privacy impact assessments identifying data being collected, classification of data, reason for collection, if data is being shared, and with whom.
        4. Document review such as incident reports, logs, architectural diagrams, and business continuity plans.
        5. On-site visits/observations or interviews to identify potential risks related to clean desks, privacy of computer monitors, and secure storage spaces.
        6. Third-party assurances from external organizations with subject matter expertise in the area being assessed.
      5. Risk model and analytical approach. Define the risk factors that will be assessed (e.g., threats, vulnerabilities, impact, likelihood, predisposing conditions) and the reason for assessing them. Identify whether lower-level risks with direct relationships will be aggregated into general, high-level institutional risks; the resources available for the assessment; and the skills and expertise required to conduct it.
  4. Risk Assessment

    1. All risk assessments will seek to identify the following risks to the University:
      1. Circumstances or events with the potential to adversely impact the University (“Threats”);
      2. The sources of such Threats;
      3. Vulnerabilities that could be exploited by Threat sources;
      4. The likelihood such a Threat would occur; and,
      5. The adverse impacts to organizational operations, assets, and individuals from the exploitation of vulnerabilities by Threat sources.
    2. All risks identified as a result of a risk assessment will be assigned an associated security risk level based on the likelihood of exploitation and the impact should such exploitation occur. See the Risk Determination Matrix in Appendix I and the illustrative sketch following this list:
      1. Very High. Indicates the risk associated with the technology could be expected to have multiple severe or catastrophic adverse effects on University operations, organizational assets, individuals, or other organizations.
      2. High. Indicates the risk associated with the technology could be expected to have a severe or catastrophic adverse effect on University operations, organizational assets, individuals, or other organizations.
      3. Moderate. Indicates the risk associated with the technology could be expected to have a serious adverse effect on University operations, organizational assets, individuals, or other organizations.
      4. Low. Indicates the risk associated with the technology could be expected to have a limited adverse effect on University operations, organizational assets, individuals, or other organizations.
      5. Very Low. Indicates the risk associated with the technology could be expected to have a negligible adverse effect on University operations, organizational assets, individuals, or other organizations.
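
The following non-normative sketch illustrates how the Appendix I Risk Determination Matrix maps an impact level and a likelihood level to one of the risk levels above. Python is used purely for illustration; the identifiers, data structure, and function name are assumptions and are not defined by this Standard.

  # Illustrative only: lookup of the Appendix I Risk Determination Matrix.
  LIKELIHOODS = ["Rare", "Unlikely", "Possible", "Likely", "Almost Certain"]

  RISK_MATRIX = {
      # impact level: risk level for each likelihood, in LIKELIHOODS order
      "Catastrophic":  ["Moderate", "Moderate", "High", "Very High", "Very High"],
      "Major":         ["Low", "Moderate", "Moderate", "High", "Very High"],
      "Moderate":      ["Low", "Moderate", "Moderate", "Moderate", "High"],
      "Minor":         ["Very Low", "Low", "Moderate", "Moderate", "Moderate"],
      "Insignificant": ["Very Low", "Very Low", "Low", "Low", "Moderate"],
  }

  def risk_level(impact: str, likelihood: str) -> str:
      """Return the Appendix I risk level for an impact/likelihood pair."""
      return RISK_MATRIX[impact][LIKELIHOODS.index(likelihood)]

  # Example: a "Major" impact judged "Possible" to occur is a "Moderate" risk.
  print(risk_level("Major", "Possible"))  # Moderate
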
  5. Risk Treatment and Approvals

    1. Communication of the results of the risk assessment will be provided to senior management and relevant stakeholders (e.g., information system owner, compliance exception requestor, CIO, CISO) via a Technology Risk Assessment Report which must include the following information:
      1. Summary of the risk assessment, including the information identified prior to conducting the risk assessment. See Section 3;
      2. Identified risks and associated risk levels;
      3. Potential alternative solutions to remediate or mitigate the risk; and,
      4. Recommendation for moving forward.
    2. All identified technology risks will be maintained within a Risk Register by Information Technology Services.
    3. Treatment options for identified risks are as follows:
      1. Accept. Accept the risk with no intent to mitigate the risk’s likelihood or impact.
      2. Remediate. Implement plans to stop the risk-creating activity. A plan of action and milestones (POAM) must be created to document remedial actions that correct detected weaknesses or failed control testing. This plan must include all known threats associated with the asset or system, planned actions to address the weakness or failed testing, and a timeline to complete the remediation steps. Existing POAMs must be updated based on risks identified through security assessments or continuous monitoring.
      3. Mitigate. Implement alternate security controls (“Compensating Controls”) to lessen likelihood or impact of the risk identified. Justification for utilizing these controls must be identified.
    4. All risks must have an identified owner who is responsible for remediating the risk, implementing Compensating Controls, or accepting it.
    5. Risks identified with a Very Low or Low risk level determination may be accepted by the system owner/responsible party without further review.
    6. Approved Compensating Controls are only valid for one year from date of approval and must be re-evaluated annually to determine continued effectiveness.
    7. Accepted risks are valid for one year and must be re-evaluated annually.
    8. System owners/responsible parties seeking to accept risks with a Moderate or High risk level must submit such a request to ISS for further review and analysis before the risk is accepted.
    9. Risks with a Very High risk level must be remediated. A non-normative sketch of these treatment and approval rules follows this list.
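
As referenced above, the following non-normative sketch illustrates the acceptance and annual re-evaluation rules in this section. Python is used purely for illustration; the function names, the 365-day reading of "one year," and the structure are assumptions and are not defined by this Standard.

  # Illustrative only: acceptance paths and annual re-evaluation dates.
  from datetime import date, timedelta

  def acceptance_path(risk_level: str) -> str:
      """Return who may approve acceptance of a risk at the given level."""
      if risk_level in ("Very Low", "Low"):
          return "System owner/responsible party may accept without further review"
      if risk_level in ("Moderate", "High"):
          return "Submit acceptance request to ISS for review and analysis"
      if risk_level == "Very High":
          return "Acceptance not permitted; the risk must be remediated"
      raise ValueError(f"Unknown risk level: {risk_level}")

  def next_review_date(approved_on: date) -> date:
      """Accepted risks and Compensating Controls are valid for one year."""
      return approved_on + timedelta(days=365)

  print(acceptance_path("Moderate"))          # requires ISS review
  print(next_review_date(date(2021, 10, 5)))  # 2022-10-05
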
  6. DEFINITIONS

    1. “Risk” means the relative impact that an exploited vulnerability would have on a user’s environment.
    2. “Risk Register” means a central record of current risks and related information. The register includes both accepted risks and risks that have a planned mitigation path via a POAM; inherently, all mitigated risks will also be included in the Risk Register.
    3. “Security Incident” means a violation or imminent threat of violation of Technology Governance or standard security practices.
    4. “Threat” means any circumstance or event with the potential to adversely impact organizational operations (including mission, functions, image, or reputation), organizational assets, or individuals through an information system via unauthorized access, destruction, disclosure, modification of information, and/or denial of service. Also, the potential for a threat-source to successfully exploit a particular information system vulnerability.
    5. “University Technology Resource” means the Campus Network, University-owned hardware, software, and communications equipment, technology facilities, and other relevant hardware and software items, as well as personnel tasked with the planning, implementation, and support of technology.
  7. Related Documents

    1. Information Security Policy
    2. Compliance Exception Management Standard
    3. Security Assessment Standard (pending)
    4. NIST 800-30r1: Guide for Conducting Risk Assessments

Appendix I – Risk Determination Matrix

                      Likelihood
  Impact              Rare        Unlikely    Possible    Likely      Almost Certain
  Catastrophic        Moderate    Moderate    High        Very High   Very High
  Major               Low         Moderate    Moderate    High        Very High
  Moderate            Low         Moderate    Moderate    Moderate    High
  Minor               Very Low    Low         Moderate    Moderate    Moderate
  Insignificant       Very Low    Very Low    Low         Low         Moderate

Likelihood Levels

  • Rare. Error, accident, or act of nature is highly unlikely to occur or occurs less than once every 10 years.
  • Unlikely. Error, accident, or act of nature is unlikely to occur but occurs more than once every 10 years.
  • Possible. Error, accident, or act of nature is somewhat likely to occur or occurs between 1-10 times per year.
  • Likely. Error, accident, or act of nature is highly likely to occur or occurs between 10-100 times per year.
  • Almost Certain. Error, accident, or act of nature is almost certain to occur or occurs more than 100 times per year. (A brief frequency-to-likelihood sketch follows this list.)
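
The following non-normative sketch maps an estimated occurrence rate (per year) to the likelihood levels above. Python is used purely for illustration; the function name, the use of occurrences per year as the sole input, and the handling of boundary values are assumptions and are not defined by this Standard.

  # Illustrative only: estimated annual occurrence rate -> likelihood level.
  def likelihood_from_frequency(occurrences_per_year: float) -> str:
      if occurrences_per_year < 0.1:    # less than once every 10 years
          return "Rare"
      if occurrences_per_year < 1:      # more than once every 10 years, less than once per year
          return "Unlikely"
      if occurrences_per_year <= 10:    # roughly 1-10 times per year
          return "Possible"
      if occurrences_per_year <= 100:   # roughly 10-100 times per year
          return "Likely"
      return "Almost Certain"           # more than 100 times per year

  print(likelihood_from_frequency(3))   # Possible
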

Impact Levels

  • Catastrophic. The event could be expected to have multiple severe or catastrophic adverse effects on organizational operations, organizational assets, individuals, or other organizations.
  • Major. The event could be expected to have a severe adverse effect on organizational operations, organizational assets, individuals, or other organizations.
  • Moderate. The event could be expected to have a serious adverse effect on organizational operations, organizational assets, individuals, or other organizations.
  • Minor. The event could be expected to have a limited adverse effect on organizational operations, organizational assets, individuals, or other organizations.
  • Insignificant. The event could be expected to have a negligible adverse effect on organizational operations, organizational assets, individuals, or other organizations.
