{
	"id": "577a743d-ffb6-4c33-97d4-09824eb3d61a",
	"created_at": "2026-04-06T00:16:15.160695Z",
	"updated_at": "2026-04-10T03:22:00.568533Z",
	"deleted_at": null,
	"sha1_hash": "03e1475766008579e7995816adfb791e65186a34",
	"title": "NIST Special Publication 800-63B",
	"llm_title": "",
	"authors": "",
	"file_creation_date": "0001-01-01T00:00:00Z",
	"file_modification_date": "0001-01-01T00:00:00Z",
	"file_size": 810202,
	"plain_text": "NIST Special Publication 800-63B\r\nBy Authority\r\nArchived: 2026-04-05 13:34:39 UTC\r\nThis publication has been developed by NIST in accordance with its statutory responsibilities under the Federal\r\nInformation Security Modernization Act (FISMA) of 2014, 44 U.S.C. § 3551 et seq., Public Law (P.L.) 113-283.\r\nNIST is responsible for developing information security standards and guidelines, including minimum\r\nrequirements for federal systems, but such standards and guidelines shall not apply to national security systems\r\nwithout the express approval of appropriate federal officials exercising policy authority over such systems. This\r\nguideline is consistent with the requirements of the Office of Management and Budget (OMB) Circular A-130.\r\nNothing in this publication should be taken to contradict the standards and guidelines made mandatory and\r\nbinding on federal agencies by the Secretary of Commerce under statutory authority. Nor should these guidelines\r\nbe interpreted as altering or superseding the existing authorities of the Secretary of Commerce, Director of the\r\nOMB, or any other federal official. This publication may be used by nongovernmental organizations on a\r\nvoluntary basis and is not subject to copyright in the United States. Attribution would, however, be appreciated by\r\nNIST.\r\nThe Information Technology Laboratory (ITL) at the National Institute of Standards and Technology (NIST)\r\npromotes the U.S. economy and public welfare by providing technical leadership for the Nation’s measurement\r\nand standards infrastructure. ITL develops tests, test methods, reference data, proof of concept implementations,\r\nand technical analyses to advance the development and productive use of information technology. 
ITL’s\r\nresponsibilities include the development of management, administrative, technical, and physical standards and\r\nguidelines for the cost-effective security and privacy of other than national security-related information in federal\r\ninformation systems. The Special Publication 800-series reports on ITL’s research, guidelines, and outreach efforts\r\nin information system security, and its collaborative activities with industry, government, and academic\r\norganizations.\r\nThese guidelines provide technical requirements for federal agencies implementing digital identity services and\r\nare not intended to constrain the development or use of standards outside of this purpose. These guidelines focus\r\non the authentication of subjects interacting with government systems over open networks, establishing that a\r\ngiven claimant is a subscriber who has been previously authenticated. The result of the authentication process may\r\nbe used locally by the system performing the authentication or may be asserted elsewhere in a federated identity\r\nsystem. This document defines technical requirements for each of the three authenticator assurance levels. 
This\r\npublication supersedes corresponding sections of NIST Special Publication (SP) 800-63-2.\r\nKeywords\r\nauthentication; credential service provider; digital authentication; digital credentials; electronic authentication;\r\nelectronic credentials; federation.\r\nAcknowledgments\r\nThe authors gratefully acknowledge Kaitlin Boeckl for her artistic graphics contributions to all volumes in the SP\r\n800-63 suite and the contributions of our many reviewers, including Joni Brennan from the Digital ID \u0026\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 1 of 63\n\nAuthentication Council of Canada (DIACC), Kat Megas, Ellen Nadeau, and Ben Piccarreta from NIST, and Ryan\r\nGalluzzo and Danna Gabel O’Rourke from Deloitte \u0026 Touche LLP.\r\nThe authors would also like to acknowledge the thought leadership and innovation of the original authors: Donna\r\nF. Dodson, W. Timothy Polk, Sarbari Gupta, and Emad A. Nabbus. Without their tireless efforts, we would not\r\nhave had the incredible baseline from which to evolve 800-63 to the document it is today. 
In addition, special\r\nthanks to the Federal Privacy Council’s Digital Authentication Task Force for the contributions to the development\r\nof privacy requirements and considerations.\r\nThe terms “SHALL” and “SHALL NOT” indicate requirements to be followed strictly in order to conform to the\r\npublication and from which no deviation is permitted.\r\nThe terms “SHOULD” and “SHOULD NOT” indicate that among several possibilities one is recommended as\r\nparticularly suitable, without mentioning or excluding others, or that a certain course of action is preferred but not\r\nnecessarily required, or that (in the negative form) a certain possibility or course of action is discouraged but not\r\nprohibited.\r\nThe terms “MAY” and “NEED NOT” indicate a course of action permissible within the limits of the publication.\r\nThe terms “CAN” and “CANNOT” indicate a possibility or capability, whether material, physical or causal or, in\r\nthe negative, the absence of that possibility or capability.\r\nTable of Contents\r\n1. Purpose\r\n2. Introduction\r\n3. Definitions and Abbreviations\r\n4. Authenticator Assurance Levels\r\n5. Authenticator and Verifier Requirements\r\n6. Authenticator Lifecycle Requirements\r\n7. Session Management\r\n8. Threats and Security Considerations\r\n9. Privacy Considerations\r\n10. Usability Considerations\r\n11. References\r\nAppendix A — Strength of Memorized Secrets\r\nErrata\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 2 of 63\n\nThis table contains changes that have been incorporated into Special Publication 800-63B. 
Errata updates can\r\ninclude corrections, clarifications, or other minor changes in the publication that are either editorial or substantive\r\nin nature.\r\nDate | Type | Change | Location\r\n2017-12-01 | Editorial | Updated AAL descriptions for consistency with other text in document | Introduction\r\n | Editorial | Deleted “cryptographic” to consistently reflect authenticator options at AAL3 | §4.3\r\n | Substantive | Refined the requirements about processing of attributes | §4.4\r\n | Editorial | Make language regarding activation factors for multifactor authenticators consistent | §5.1.5.1, 5.1.8.1, and 5.1.9.1\r\n | Substantive | Recognize use of hardware TPM as hardware crypto authenticator | §5.1.7.1, 5.1.9.1\r\n | Editorial | Improve normative language on authenticated protected channels for biometrics | §5.2.3\r\n | Editorial | Changed “transaction” to “binding transaction” to emphasize that requirement doesn’t apply to authentication transactions | §6.1.1\r\n | Editorial | Replaced out-of-context note at end of section 7.2 | §7.2\r\n | Editorial | Changed IdP to CSP to match terminology used elsewhere in this document | Table 8-1\r\n | Editorial | Corrected capitalization of Side Channel Attack | Table 8-2\r\n | Substantive | Changed the title to processing limitation; clarified the language, incorporated privacy objectives language, and specified that consent is explicit | §9.3\r\n | Editorial | Added NISTIR 8062 as a reference | §11.1\r\n | Editorial | Corrected title of SP 800-63C | §11.3\r\n2020-03-02 | Substantive | Clarified wording of verifier impersonation resistance requirement | §4.3.2\r\n | Editorial | Emphasized use of key unlocked by additional factor to sign nonce | §5.1.9.1\r\n | Editorial | Provided examples of risk-based behavior observations | §5.2.2\r\n | Editorial | Removed redundant phrase | §5.2.3\r\n | Editorial | Updated URL for reference [Blacklists] | §11.1\r\n1 Purpose\r\nThis section is informative.\r\nThis document and its companion documents, Special Publication (SP) 800-63, SP 800-63A, and SP 800-63C,\r\nprovide technical guidelines to agencies for the implementation of digital authentication.\r\n2 Introduction\r\nThis section is informative.\r\nDigital identity is the unique representation of a subject engaged in an online transaction. A digital identity is\r\nalways unique in the context of a digital service, but does not necessarily need to be traceable back to a specific\r\nreal-life subject. In other words, accessing a digital service may not mean that the underlying subject’s real-life\r\nrepresentation is known. Identity proofing establishes that a subject is actually who they claim to be. Digital\r\nauthentication is the process of determining the validity of one or more authenticators used to claim a digital\r\nidentity. Authentication establishes that a subject attempting to access a digital service is in control of the\r\ntechnologies used to authenticate. For services in which return visits are applicable, successfully authenticating\r\nprovides reasonable risk-based assurances that the subject accessing the service today is the same as the one who\r\naccessed the service previously. Digital identity presents a technical challenge because it often involves the\r\nproofing of individuals over an open network and always involves the authentication of individuals over an open\r\nnetwork. This presents multiple opportunities for impersonation and other attacks which can lead to fraudulent\r\nclaims of a subject’s digital identity.\r\nThe ongoing authentication of subscribers is central to the process of associating a subscriber with their online\r\nactivity. Subscriber authentication is performed by verifying that the claimant controls one or more authenticators\r\n(called tokens in earlier versions of SP 800-63) associated with a given subscriber. 
A successful authentication\r\nresults in the assertion of an identifier, either pseudonymous or non-pseudonymous, and optionally other identity\r\ninformation, to the relying party (RP).\r\nThis document provides recommendations on types of authentication processes, including choices of\r\nauthenticators, that may be used at various Authenticator Assurance Levels (AALs). It also provides\r\nrecommendations on the lifecycle of authenticators, including revocation in the event of loss or theft.\r\nThis technical guideline applies to digital authentication of subjects to systems over a network. It does not address\r\nthe authentication of a person for physical access (e.g., to a building), though some credentials used for digital\r\naccess may also be used for physical access authentication. This technical guideline also requires that federal\r\nsystems and service providers participating in authentication protocols be authenticated to subscribers.\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 4 of 63\n\nThe strength of an authentication transaction is characterized by an ordinal measurement known as the AAL.\r\nStronger authentication (a higher AAL) requires malicious actors to have better capabilities and expend greater\r\nresources in order to successfully subvert the authentication process. Authentication at higher AALs can\r\neffectively reduce the risk of attacks. A high-level summary of the technical requirements for each of the AALs is\r\nprovided below; see Sections 4 and 5 of this document for specific normative requirements.\r\nAuthenticator Assurance Level 1: AAL1 provides some assurance that the claimant controls an authenticator\r\nbound to the subscriber’s account. AAL1 requires either single-factor or multi-factor authentication using a wide\r\nrange of available authentication technologies. 
Successful authentication requires that the claimant prove\r\npossession and control of the authenticator through a secure authentication protocol.\r\nAuthenticator Assurance Level 2: AAL2 provides high confidence that the claimant controls an authenticator(s)\r\nbound to the subscriber’s account. Proof of possession and control of two different authentication factors is\r\nrequired through secure authentication protocol(s). Approved cryptographic techniques are required at AAL2 and\r\nabove.\r\nAuthenticator Assurance Level 3: AAL3 provides very high confidence that the claimant controls\r\nauthenticator(s) bound to the subscriber’s account. Authentication at AAL3 is based on proof of possession of a\r\nkey through a cryptographic protocol. AAL3 authentication requires a hardware-based authenticator and an\r\nauthenticator that provides verifier impersonation resistance; the same device may fulfill both these requirements.\r\nIn order to authenticate at AAL3, claimants are required to prove possession and control of two distinct\r\nauthentication factors through secure authentication protocol(s). Approved cryptographic techniques are required.\r\nThe following table states which sections of the document are normative and which are informative:\r\nSection Name Normative/Informative\r\n1. Purpose Informative\r\n2. Introduction Informative\r\n3. Definitions and Abbreviations Informative\r\n4. Authenticator Assurance Levels Normative\r\n5. Authenticator and Verifier Requirements Normative\r\n6. Authenticator Lifecycle Management Normative\r\n7. Session Management Normative\r\n8. Threat and Security Considerations Informative\r\n9. Privacy Considerations Informative\r\n10. Usability Considerations Informative\r\n11. 
References Informative\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 5 of 63\n\nSection Name Normative/Informative\r\nAppendix A — Strength of Memorized Secrets Informative\r\n3 Definitions and Abbreviations\r\nSee SP 800-63, Appendix A for a complete set of definitions and abbreviations.\r\n4 Authenticator Assurance Levels\r\nThis section contains both normative and informative material.\r\nTo satisfy the requirements of a given AAL, a claimant SHALL be authenticated with at least a given level of\r\nstrength to be recognized as a subscriber. The result of an authentication process is an identifier that SHALL be\r\nused each time that subscriber authenticates to that RP. The identifier MAY be pseudonymous. Subscriber\r\nidentifiers SHOULD NOT be reused for a different subject but SHOULD be reused when a previously-enrolled\r\nsubject is re-enrolled by the CSP. Other attributes that identify the subscriber as a unique subject MAY also be\r\nprovided.\r\nDetailed normative requirements for authenticators and verifiers at each AAL are provided in Section 5.\r\nSee SP 800-63 Section 6.2 for details on how to choose the most appropriate AAL.\r\nFIPS 140 requirements are satisfied by FIPS 140-2 or newer revisions.\r\nAt IAL1, it is possible that attributes are collected and made available by the digital identity service. Any PII or\r\nother personal information — whether self-asserted or validated — requires multi-factor authentication.\r\nTherefore, agencies SHALL select a minimum of AAL2 when self-asserted PII or other personal information is\r\nmade available online.\r\n4.1 Authenticator Assurance Level 1\r\nThis section is normative.\r\nAAL1 provides some assurance that the claimant controls an authenticator bound to the subscriber’s account.\r\nAAL1 requires either single-factor or multi-factor authentication using a wide range of available authentication\r\ntechnologies. 
Successful authentication requires that the claimant prove possession and control of the\r\nauthenticator through a secure authentication protocol.\r\n4.1.1 Permitted Authenticator Types\r\nAAL1 authentication SHALL occur by the use of any of the following authenticator types, which are defined in\r\nSection 5:\r\nMemorized Secret (Section 5.1.1)\r\nLook-Up Secret (Section 5.1.2)\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 6 of 63\n\nOut-of-Band Devices (Section 5.1.3)\r\nSingle-Factor One-Time Password (OTP) Device (Section 5.1.4)\r\nMulti-Factor OTP Device (Section 5.1.5)\r\nSingle-Factor Cryptographic Software (Section 5.1.6)\r\nSingle-Factor Cryptographic Device (Section 5.1.7)\r\nMulti-Factor Cryptographic Software (Section 5.1.8)\r\nMulti-Factor Cryptographic Device (Section 5.1.9)\r\n4.1.2 Authenticator and Verifier Requirements\r\nCryptographic authenticators used at AAL1 SHALL use approved cryptography. Software-based authenticators\r\nthat operate within the context of an operating system MAY, where applicable, attempt to detect compromise (e.g.,\r\nby malware) of the user endpoint in which they are running and SHOULD NOT complete the operation when\r\nsuch a compromise is detected.\r\nCommunication between the claimant and verifier (using the primary channel in the case of an out-of-band\r\nauthenticator) SHALL be via an authenticated protected channel to provide confidentiality of the authenticator\r\noutput and resistance to man-in-the-middle (MitM) attacks.\r\nVerifiers operated by government agencies at AAL1 SHALL be validated to meet the requirements of FIPS 140\r\nLevel 1.\r\n4.1.3 Reauthentication\r\nPeriodic reauthentication of subscriber sessions SHALL be performed as described in Section 7.2. At AAL1,\r\nreauthentication of the subscriber SHOULD be repeated at least once per 30 days during an extended usage\r\nsession, regardless of user activity. 
The session SHOULD be terminated (i.e., logged out) when this time limit is\r\nreached.\r\n4.1.4 Security Controls\r\nThe CSP SHALL employ appropriately-tailored security controls from the low baseline of security controls\r\ndefined in SP 800-53 or equivalent federal (e.g. FEDRAMP) or industry standard. The CSP SHALL ensure that\r\nthe minimum assurance-related controls for low-impact systems, or equivalent, are satisfied.\r\n4.1.5 Records Retention Policy\r\nThe CSP shall comply with its respective records retention policies in accordance with applicable laws,\r\nregulations, and policies, including any National Archives and Records Administration (NARA) records retention\r\nschedules that may apply. If the CSP opts to retain records in the absence of any mandatory requirements, the CSP\r\nSHALL conduct a risk management process, including assessments of privacy and security risks, to determine\r\nhow long records should be retained and SHALL inform the subscriber of that retention policy.\r\n4.2 Authenticator Assurance Level 2\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 7 of 63\n\nThis section is normative.\r\nAAL2 provides high confidence that the claimant controls authenticator(s) bound to the subscriber’s account.\r\nProof of possession and control of two distinct authentication factors is required through secure authentication\r\nprotocol(s). Approved cryptographic techniques are required at AAL2 and above.\r\n4.2.1 Permitted Authenticator Types\r\nAt AAL2, authentication SHALL occur by the use of either a multi-factor authenticator or a combination of two\r\nsingle-factor authenticators. A multi-factor authenticator requires two factors to execute a single authentication\r\nevent, such as a cryptographically-secure device with an integrated biometric sensor that is required to activate the\r\ndevice. 
Authenticator requirements are specified in Section 5.\r\nWhen a multi-factor authenticator is used, any of the following MAY be used:\r\nMulti-Factor OTP Device (Section 5.1.5)\r\nMulti-Factor Cryptographic Software (Section 5.1.8)\r\nMulti-Factor Cryptographic Device (Section 5.1.9)\r\nWhen a combination of two single-factor authenticators is used, it SHALL include a Memorized Secret\r\nauthenticator (Section 5.1.1) and one possession-based (i.e., “something you have”) authenticator from the\r\nfollowing list:\r\nLook-Up Secret (Section 5.1.2)\r\nOut-of-Band Device (Section 5.1.3)\r\nSingle-Factor OTP Device (Section 5.1.4)\r\nSingle-Factor Cryptographic Software (Section 5.1.6)\r\nSingle-Factor Cryptographic Device (Section 5.1.7)\r\nNote: When biometric authentication meets the requirements in Section 5.2.3, the device has to be\r\nauthenticated in addition to the biometric — a biometric is recognized as a factor, but not recognized as\r\nan authenticator by itself. Therefore, when conducting authentication with a biometric, it is unnecessary\r\nto use two authenticators because the associated device serves as “something you have,” while the\r\nbiometric serves as “something you are.”\r\n4.2.2 Authenticator and Verifier Requirements\r\nCryptographic authenticators used at AAL2 SHALL use approved cryptography. Authenticators procured by\r\ngovernment agencies SHALL be validated to meet the requirements of FIPS 140 Level 1. Software-based\r\nauthenticators that operate within the context of an operating system MAY, where applicable, attempt to detect\r\ncompromise of the platform in which they are running (e.g., by malware) and SHOULD NOT complete the\r\noperation when such a compromise is detected. At least one authenticator used at AAL2 SHALL be replay\r\nresistant as described in Section 5.2.8. 
Authentication at AAL2 SHOULD demonstrate authentication intent from\r\nat least one authenticator as discussed in Section 5.2.9.\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 8 of 63\n\nCommunication between the claimant and verifier (the primary channel in the case of an out-of-band\r\nauthenticator) SHALL be via an authenticated protected channel to provide confidentiality of the authenticator\r\noutput and resistance to MitM attacks.\r\nVerifiers operated by government agencies at AAL2 SHALL be validated to meet the requirements of FIPS 140\r\nLevel 1.\r\nWhen a device such as a smartphone is used in the authentication process, the unlocking of that device (typically\r\ndone using a PIN or biometric) SHALL NOT be considered one of the authentication factors. Generally, it is not\r\npossible for a verifier to know that the device had been locked or if the unlock process met the requirements for\r\nthe relevant authenticator type.\r\nWhen a biometric factor is used in authentication at AAL2, the performance requirements stated in Section 5.2.3\r\nSHALL be met, and the verifier SHOULD make a determination that the biometric sensor and subsequent\r\nprocessing meet these requirements.\r\n4.2.3 Reauthentication\r\nPeriodic reauthentication of subscriber sessions SHALL be performed as described in Section 7.2. At AAL2,\r\nauthentication of the subscriber SHALL be repeated at least once per 12 hours during an extended usage session,\r\nregardless of user activity. Reauthentication of the subscriber SHALL be repeated following any period of\r\ninactivity lasting 30 minutes or longer. The session SHALL be terminated (i.e., logged out) when either of these\r\ntime limits is reached.\r\nReauthentication of a session that has not yet reached its time limit MAY require only a memorized secret or a\r\nbiometric in conjunction with the still-valid session secret. 
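The AAL2 session limits above (a 12-hour overall session limit and a 30-minute inactivity timeout, either of which forces reauthentication) can be sketched as follows. This is an illustrative sketch only; the `Session` class and its field names are assumptions of this example, not part of SP 800-63B.

```python
from datetime import datetime, timedelta

# AAL2 limits from SP 800-63B Section 4.2.3; the surrounding code is illustrative.
MAX_SESSION = timedelta(hours=12)      # overall limit, regardless of activity
MAX_INACTIVITY = timedelta(minutes=30) # inactivity limit

class Session:
    def __init__(self, authenticated_at: datetime):
        self.authenticated_at = authenticated_at
        self.last_activity = authenticated_at

    def requires_reauthentication(self, now: datetime) -> bool:
        # Reauthentication is required when either limit is reached;
        # the overall limit applies even if the subscriber stayed active.
        return (now - self.authenticated_at >= MAX_SESSION
                or now - self.last_activity >= MAX_INACTIVITY)

s = Session(datetime(2020, 3, 2, 9, 0))
s.last_activity = datetime(2020, 3, 2, 9, 45)
print(s.requires_reauthentication(datetime(2020, 3, 2, 10, 0)))   # → False (1 h old, 15 min idle)
print(s.requires_reauthentication(datetime(2020, 3, 2, 10, 30)))  # → True (45 min idle)
```

Note that upon reauthentication of a still-valid session, only `authenticated_at` would be reset; a session past either limit must instead be terminated.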
The verifier MAY prompt the user to cause activity just\r\nbefore the inactivity timeout.\r\n4.2.4 Security Controls\r\nThe CSP SHALL employ appropriately-tailored security controls from the moderate baseline of security controls\r\ndefined in SP 800-53 or equivalent federal (e.g., FEDRAMP) or industry standard. The CSP SHALL ensure that\r\nthe minimum assurance-related controls for moderate-impact systems or equivalent are satisfied.\r\n4.2.5 Records Retention Policy\r\nThe CSP shall comply with its respective records retention policies in accordance with applicable laws,\r\nregulations, and policies, including any NARA records retention schedules that may apply. If the CSP opts to\r\nretain records in the absence of any mandatory requirements, the CSP SHALL conduct a risk management\r\nprocess, including assessments of privacy and security risks to determine how long records should be retained and\r\nSHALL inform the subscriber of that retention policy.\r\n4.3 Authenticator Assurance Level 3\r\nThis section is normative.\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 9 of 63\n\nAAL3 provides very high confidence that the claimant controls authenticator(s) bound to the subscriber’s account.\r\nAuthentication at AAL3 is based on proof of possession of a key through a cryptographic protocol. AAL3\r\nauthentication SHALL use a hardware-based authenticator and an authenticator that provides verifier\r\nimpersonation resistance — the same device MAY fulfill both these requirements. In order to authenticate at\r\nAAL3, claimants SHALL prove possession and control of two distinct authentication factors through secure\r\nauthentication protocol(s). Approved cryptographic techniques are required.\r\n4.3.1 Permitted Authenticator Types\r\nAAL3 authentication SHALL occur by the use of one of a combination of authenticators satisfying the\r\nrequirements in Section 4.3. 
Possible combinations are:\r\nMulti-Factor Cryptographic Device (Section 5.1.9)\r\nSingle-Factor Cryptographic Device (Section 5.1.7) used in conjunction with Memorized Secret (Section\r\n5.1.1)\r\nMulti-Factor OTP device (software or hardware) (Section 5.1.5) used in conjunction with a Single-Factor\r\nCryptographic Device (Section 5.1.7)\r\nMulti-Factor OTP device (hardware only) (Section 5.1.5) used in conjunction with a Single-Factor\r\nCryptographic Software (Section 5.1.6)\r\nSingle-Factor OTP device (hardware only) (Section 5.1.4) used in conjunction with a Multi-Factor\r\nCryptographic Software Authenticator (Section 5.1.8)\r\nSingle-Factor OTP device (hardware only) (Section 5.1.4) used in conjunction with a Single-Factor\r\nCryptographic Software Authenticator (Section 5.1.6) and a Memorized Secret (Section 5.1.1)\r\n4.3.2 Authenticator and Verifier Requirements\r\nCommunication between the claimant and verifier SHALL be via an authenticated protected channel to provide\r\nconfidentiality of the authenticator output and resistance to MitM attacks. At least one cryptographic authenticator\r\nused at AAL3 SHALL be verifier impersonation resistant as described in Section 5.2.5 and SHALL be replay\r\nresistant as described in Section 5.2.8. All authentication and reauthentication processes at AAL3 SHALL\r\ndemonstrate authentication intent from at least one authenticator as described in Section 5.2.9.\r\nMulti-factor authenticators used at AAL3 SHALL be hardware cryptographic modules validated at FIPS 140\r\nLevel 2 or higher overall with at least FIPS 140 Level 3 physical security. 
Single-factor cryptographic devices\r\nused at AAL3 SHALL be validated at FIPS 140 Level 1 or higher overall with at least FIPS 140 Level 3 physical\r\nsecurity.\r\nVerifiers at AAL3 SHALL be validated at FIPS 140 Level 1 or higher.\r\nVerifiers at AAL3 SHALL be verifier compromise resistant as described in Section 5.2.7 with respect to at least\r\none authentication factor.\r\nHardware-based authenticators and verifiers at AAL3 SHOULD resist relevant side-channel (e.g., timing and\r\npower-consumption analysis) attacks. Relevant side-channel attacks SHALL be determined by a risk assessment\r\nperformed by the CSP.\r\nWhen a device such as a smartphone is used in the authentication process — presuming that the device is able to\r\nmeet the requirements above — the unlocking of that device SHALL NOT be considered to satisfy one of the\r\nauthentication factors. This is because it is generally not possible for the verifier to know that the device had been\r\nlocked nor whether the unlock process met the requirements for the relevant authenticator type.\r\nWhen a biometric factor is used in authentication at AAL3, the verifier SHALL make a determination that the\r\nbiometric sensor and subsequent processing meet the performance requirements stated in Section 5.2.3.\r\n4.3.3 Reauthentication\r\nPeriodic reauthentication of subscriber sessions SHALL be performed as described in Section 7.2. At AAL3,\r\nauthentication of the subscriber SHALL be repeated at least once per 12 hours during an extended usage session,\r\nregardless of user activity, as described in Section 7.2. Reauthentication of the subscriber SHALL be repeated\r\nfollowing any period of inactivity lasting 15 minutes or longer. Reauthentication SHALL use both authentication\r\nfactors. The session SHALL be terminated (i.e., logged out) when either of these time limits is reached. 
The\r\nverifier MAY prompt the user to cause activity just before the inactivity timeout.\r\n4.3.4 Security Controls\r\nThe CSP SHALL employ appropriately-tailored security controls from the high baseline of security controls\r\ndefined in SP 800-53 or an equivalent federal (e.g., FEDRAMP) or industry standard. The CSP SHALL ensure\r\nthat the minimum assurance-related controls for high-impact systems or equivalent are satisfied.\r\n4.3.5 Records Retention Policy\r\nThe CSP shall comply with its respective records retention policies in accordance with applicable laws,\r\nregulations, and policies, including any NARA records retention schedules that may apply. If the CSP opts to\r\nretain records in the absence of any mandatory requirements, the CSP SHALL conduct a risk management\r\nprocess, including assessments of privacy and security risks, to determine how long records should be retained\r\nand SHALL inform the subscriber of that retention policy.\r\n4.4 Privacy Requirements\r\nThe CSP SHALL employ appropriately-tailored privacy controls defined in SP 800-53 or equivalent industry\r\nstandard.\r\nIf CSPs process attributes for purposes other than identity proofing, authentication, or attribute assertions\r\n(collectively “identity service”), related fraud mitigation, or to comply with law or legal process, CSPs SHALL\r\nimplement measures to maintain predictability and manageability commensurate with the privacy risk arising\r\nfrom the additional processing. Measures MAY include providing clear notice, obtaining subscriber consent, or\r\nenabling selective use or disclosure of attributes. When CSPs use consent measures, CSPs SHALL NOT make\r\nconsent for the additional processing a condition of the identity service.\r\nRegardless of whether the CSP is an agency or private sector provider, the following requirements apply to an\r\nagency offering or using the authentication service:\r\nhttps://pages.nist.gov/800-63-3/sp800-63b.html\r\nPage 11 of 63\n\n1. 
The agency SHALL consult with their Senior Agency Official for Privacy (SAOP) and conduct an analysis to determine whether the collection of PII to issue or maintain authenticators triggers the requirements of the Privacy Act of 1974 [Privacy Act] (see Section 9.4).
2. The agency SHALL publish a System of Records Notice (SORN) to cover such collections, as applicable.
3. The agency SHALL consult with their SAOP and conduct an analysis to determine whether the collection of PII to issue or maintain authenticators triggers the requirements of the E-Government Act of 2002 [E-Gov].
4. The agency SHALL publish a Privacy Impact Assessment (PIA) to cover such collection, as applicable.

4.5 Summary of Requirements

This section is informative.

Table 4-1 summarizes the requirements for each of the AALs:

Table 4-1 AAL Summary of Requirements

Permitted authenticator types:
- AAL1: Memorized Secret; Look-up Secret; Out-of-Band; SF OTP Device; MF OTP Device; SF Crypto Software; SF Crypto Device; MF Crypto Software; MF Crypto Device
- AAL2: MF OTP Device; MF Crypto Software; MF Crypto Device; or Memorized Secret plus: Look-up Secret, Out-of-Band, SF OTP Device, SF Crypto Software, or SF Crypto Device
- AAL3: MF Crypto Device; SF Crypto Device plus Memorized Secret; SF OTP Device plus MF Crypto Device or Software; SF OTP Device plus SF Crypto Software plus Memorized Secret

FIPS 140 validation:
- AAL1: Level 1 (Government agency verifiers)
- AAL2: Level 1 (Government agency authenticators and verifiers)
- AAL3: Level 2 overall (MF authenticators); Level 1 overall (verifiers and SF Crypto Devices); Level 3 physical security (all authenticators)

Reauthentication:
- AAL1: 30 days
- AAL2: 12 hours or 30 minutes inactivity; MAY use one authentication factor
- AAL3: 12 hours or 15 minutes inactivity; SHALL use both authentication factors

Security controls:
- AAL1: SP 800-53 Low Baseline (or equivalent)
- AAL2: SP 800-53 Moderate Baseline (or equivalent)
- AAL3: SP 800-53 High Baseline (or equivalent)

MitM resistance: Required at all AALs
Verifier-impersonation resistance: Not required at AAL1 or AAL2; required at AAL3
Verifier-compromise resistance: Not required at AAL1 or AAL2; required at AAL3
Replay resistance: Not required at AAL1; required at AAL2 and AAL3
Authentication intent: Not required at AAL1; recommended at AAL2; required at AAL3
Records Retention Policy: Required at all AALs
Privacy Controls: Required at all AALs

5 Authenticator and Verifier Requirements

This section is normative.

This section provides the detailed requirements specific to each type of authenticator. With the exception of reauthentication requirements specified in Section 4 and the requirement for verifier impersonation resistance at AAL3 described in Section 5.2.5, the technical requirements for each of the authenticator types are the same regardless of the AAL at which the authenticator is used.

5.1 Requirements by Authenticator Type

5.1.1 Memorized Secrets

A Memorized Secret authenticator — commonly referred to as a password or, if numeric, a PIN — is a secret value intended to be chosen and memorized by the user. Memorized secrets need to be of sufficient complexity and secrecy that it would be impractical for an attacker to guess or otherwise discover the correct secret value. A memorized secret is something you know.

5.1.1.1 Memorized Secret Authenticators

Memorized secrets SHALL be at least 8 characters in length if chosen by the subscriber. Memorized secrets chosen randomly by the CSP or verifier SHALL be at least 6 characters in length and MAY be entirely numeric.
If the CSP or verifier disallows a chosen memorized secret based on its appearance on a blacklist of compromised values, the subscriber SHALL be required to choose a different memorized secret. No other complexity requirements for memorized secrets SHOULD be imposed. A rationale for this is presented in Appendix A, Strength of Memorized Secrets.

5.1.1.2 Memorized Secret Verifiers

Verifiers SHALL require subscriber-chosen memorized secrets to be at least 8 characters in length. Verifiers SHOULD permit subscriber-chosen memorized secrets at least 64 characters in length. All printing ASCII [RFC 20] characters as well as the space character SHOULD be acceptable in memorized secrets. Unicode [ISO/IEC 10646] characters SHOULD be accepted as well. To make allowances for likely mistyping, verifiers MAY replace multiple consecutive space characters with a single space character prior to verification, provided that the result is at least 8 characters in length. Truncation of the secret SHALL NOT be performed. For purposes of the above length requirements, each Unicode code point SHALL be counted as a single character.

If Unicode characters are accepted in memorized secrets, the verifier SHOULD apply the Normalization Process for Stabilized Strings using either the NFKC or NFKD normalization defined in Section 12.1 of Unicode Standard Annex 15 [UAX 15].
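As a minimal sketch of the verifier-side handling described here, assuming a Python implementation: length is counted in Unicode code points, NFKC normalization is applied, and the resulting byte string is what would subsequently be salted and hashed. The helper name `prepare_memorized_secret` is hypothetical, and the salting/hashing step is omitted.

```python
import unicodedata

def prepare_memorized_secret(secret: str) -> bytes:
    """Sketch: enforce the code-point length minimum, apply NFKC
    stabilization, and return the byte string that would then be
    salted and hashed (KDF step omitted here)."""
    # The length requirement counts Unicode code points, not bytes.
    if len(secret) < 8:
        raise ValueError("memorized secret must be at least 8 characters")
    # Normalize so that differently-composed but equivalent input
    # (e.g., "e" plus a combining accent vs. precomposed "é") hashes alike.
    normalized = unicodedata.normalize("NFKC", secret)
    return normalized.encode("utf-8")

# "e" + U+0301 and precomposed "é" normalize to the same byte string:
assert prepare_memorized_secret("cafe\u0301 latte") == prepare_memorized_secret("caf\u00e9 latte")
```

Normalizing before hashing is what lets a subscriber authenticate even when two endpoints encode the same visible characters differently.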
This process is applied before hashing the byte string representing the memorized secret. Subscribers choosing memorized secrets containing Unicode characters SHOULD be advised that some characters may be represented differently by some endpoints, which can affect their ability to authenticate successfully.

Memorized secrets that are randomly chosen by the CSP (e.g., at enrollment) or by the verifier (e.g., when a user requests a new PIN) SHALL be at least 6 characters in length and SHALL be generated using an approved random bit generator [SP 800-90Ar1].

Memorized secret verifiers SHALL NOT permit the subscriber to store a “hint” that is accessible to an unauthenticated claimant. Verifiers SHALL NOT prompt subscribers to use specific types of information (e.g., “What was the name of your first pet?”) when choosing memorized secrets.

When processing requests to establish and change memorized secrets, verifiers SHALL compare the prospective secrets against a list that contains values known to be commonly-used, expected, or compromised. For example, the list MAY include, but is not limited to:

- Passwords obtained from previous breach corpuses.
- Dictionary words.
- Repetitive or sequential characters (e.g., ‘aaaaaa’, ‘1234abcd’).
- Context-specific words, such as the name of the service, the username, and derivatives thereof.

If the chosen secret is found in the list, the CSP or verifier SHALL advise the subscriber that they need to select a different secret, SHALL provide the reason for rejection, and SHALL require the subscriber to choose a different value.

Verifiers SHOULD offer guidance to the subscriber, such as a password-strength meter [Meters], to assist the user in choosing a strong memorized secret.
This is particularly important following the rejection of a memorized\r\nsecret on the above list as it discourages trivial modification of listed (and likely very weak) memorized secrets\r\n[Blacklists].\r\nVerifiers SHALL implement a rate-limiting mechanism that effectively limits the number of failed authentication\r\nattempts that can be made on the subscriber’s account as described in Section 5.2.2.\r\nVerifiers SHOULD NOT impose other composition rules (e.g., requiring mixtures of different character types or\r\nprohibiting consecutively repeated characters) for memorized secrets. Verifiers SHOULD NOT require\r\nmemorized secrets to be changed arbitrarily (e.g., periodically). However, verifiers SHALL force a change if there\r\nis evidence of compromise of the authenticator.\r\nVerifiers SHOULD permit claimants to use “paste” functionality when entering a memorized secret. This\r\nfacilitates the use of password managers, which are widely used and in many cases increase the likelihood that\r\nusers will choose stronger memorized secrets.\r\nIn order to assist the claimant in successfully entering a memorized secret, the verifier SHOULD offer an option\r\nto display the secret — rather than a series of dots or asterisks — until it is entered. This allows the claimant to\r\nverify their entry if they are in a location where their screen is unlikely to be observed. The verifier MAY also\r\npermit the user’s device to display individual entered characters for a short time after each character is typed to\r\nverify correct entry. This is particularly applicable on mobile devices.\r\nThe verifier SHALL use approved encryption and an authenticated protected channel when requesting memorized\r\nsecrets in order to provide resistance to eavesdropping and MitM attacks.\r\nVerifiers SHALL store memorized secrets in a form that is resistant to offline attacks. Memorized secrets SHALL\r\nbe salted and hashed using a suitable one-way key derivation function. 
Key derivation functions take a password, a salt, and a cost factor as inputs then generate a password hash. Their purpose is to make each password guessing trial by an attacker who has obtained a password hash file expensive and therefore the cost of a guessing attack high or prohibitive. Examples of suitable key derivation functions include Password-based Key Derivation Function 2 (PBKDF2) [SP 800-132] and Balloon [BALLOON]. A memory-hard function SHOULD be used because it increases the cost of an attack. The key derivation function SHALL use an approved one-way function such as Keyed Hash Message Authentication Code (HMAC) [FIPS 198-1], any approved hash function in SP 800-107, Secure Hash Algorithm 3 (SHA-3) [FIPS 202], CMAC [SP 800-38B] or Keccak Message Authentication Code (KMAC), Customizable SHAKE (cSHAKE), or ParallelHash [SP 800-185]. The chosen output length of the key derivation function SHOULD be the same as the length of the underlying one-way function output.

The salt SHALL be at least 32 bits in length and be chosen arbitrarily so as to minimize salt value collisions among stored hashes. Both the salt value and the resulting hash SHALL be stored for each subscriber using a memorized secret authenticator.

For PBKDF2, the cost factor is an iteration count: the more times the PBKDF2 function is iterated, the longer it takes to compute the password hash. Therefore, the iteration count SHOULD be as large as verification server performance will allow, typically at least 10,000 iterations.

In addition, verifiers SHOULD perform an additional iteration of a key derivation function using a salt value that is secret and known only to the verifier.
This salt value, if used, SHALL be generated by an approved random bit\r\ngenerator [SP 800-90Ar1] and provide at least the minimum security strength specified in the latest revision of SP\r\n800-131A (112 bits as of the date of this publication). The secret salt value SHALL be stored separately from the\r\nhashed memorized secrets (e.g., in a specialized device like a hardware security module). With this additional\r\niteration, brute-force attacks on the hashed memorized secrets are impractical as long as the secret salt value\r\nremains secret.\r\n5.1.2 Look-Up Secrets\r\nA look-up secret authenticator is a physical or electronic record that stores a set of secrets\r\nshared between the claimant and the CSP. The claimant uses the authenticator to look up the\r\nappropriate secret(s) needed to respond to a prompt from the verifier. For example, the\r\nverifier may ask a claimant to provide a specific subset of the numeric or character strings\r\nprinted on a card in table format. A common application of look-up secrets is the use of\r\n\"recovery keys\" stored by the subscriber for use in the event another authenticator is lost or\r\nmalfunctions. A look-up secret is something you have.\r\n5.1.2.1 Look-Up Secret Authenticators\r\nCSPs creating look-up secret authenticators SHALL use an approved random bit generator [SP 800-90Ar1] to\r\ngenerate the list of secrets and SHALL deliver the authenticator securely to the subscriber. Look-up secrets\r\nSHALL have at least 20 bits of entropy.\r\nLook-up secrets MAY be distributed by the CSP in person, by postal mail to the subscriber’s address of record, or\r\nby online distribution. 
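A minimal sketch of generating such a list of look-up secrets with a CSPRNG, using Python's `secrets` module as a stand-in for an approved random bit generator; eight characters drawn from a 32-symbol alphabet give 40 bits of entropy per secret, comfortably above the 20-bit minimum:

```python
import secrets
import string

# 26 lowercase letters + 6 digits = 32 symbols, so each character
# contributes exactly 5 bits of entropy.
ALPHABET = string.ascii_lowercase + "234567"

def generate_lookup_secrets(count: int = 10, length: int = 8) -> list[str]:
    """Sketch: generate a list of look-up secrets ("recovery codes").
    length=8 over a 32-symbol alphabet yields 40 bits of entropy each."""
    return ["".join(secrets.choice(ALPHABET) for _ in range(length))
            for _ in range(count)]

codes = generate_lookup_secrets()
assert len(codes) == 10 and all(len(c) == 8 for c in codes)
```

The CSP would then hash these values for storage (per Section 5.1.1.2) and deliver the plaintext list to the subscriber over one of the channels described below.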
If distributed online, look-up secrets SHALL be distributed over a secure channel in accordance with the post-enrollment binding requirements in Section 6.1.2.

If the authenticator uses look-up secrets sequentially from a list, the subscriber MAY dispose of used secrets, but only after a successful authentication.

5.1.2.2 Look-Up Secret Verifiers

Verifiers of look-up secrets SHALL prompt the claimant for the next secret from their authenticator or for a specific (e.g., numbered) secret. A given secret from an authenticator SHALL be used successfully only once. If the look-up secret is derived from a grid card, each cell of the grid SHALL be used only once.

Verifiers SHALL store look-up secrets in a form that is resistant to offline attacks. Look-up secrets having at least 112 bits of entropy SHALL be hashed with an approved one-way function as described in Section 5.1.1.2. Look-up secrets with fewer than 112 bits of entropy SHALL be salted and hashed using a suitable one-way key derivation function, also described in Section 5.1.1.2. The salt value SHALL be at least 32 bits in length and arbitrarily chosen so as to minimize salt value collisions among stored hashes.
Both the salt value and the resulting hash SHALL be stored for each look-up secret.

For look-up secrets that have less than 64 bits of entropy, the verifier SHALL implement a rate-limiting mechanism that effectively limits the number of failed authentication attempts that can be made on the subscriber’s account as described in Section 5.2.2.

The verifier SHALL use approved encryption and an authenticated protected channel when requesting look-up secrets in order to provide resistance to eavesdropping and MitM attacks.

5.1.3 Out-of-Band Devices

An out-of-band authenticator is a physical device that is uniquely addressable and can communicate securely with the verifier over a distinct communications channel, referred to as the secondary channel. The device is possessed and controlled by the claimant and supports private communication over this secondary channel, separate from the primary channel for e-authentication. An out-of-band authenticator is something you have.

The out-of-band authenticator can operate in one of the following ways:

- The claimant transfers a secret received by the out-of-band device via the secondary channel to the verifier using the primary channel. For example, the claimant may receive the secret on their mobile device and type it (typically a 6-digit code) into their authentication session.
- The claimant transfers a secret received via the primary channel to the out-of-band device for transmission to the verifier via the secondary channel.
For example, the claimant may view the secret on their authentication session and either type it into an app on their mobile device or use a technology such as a barcode or QR code to effect the transfer.
- The claimant compares secrets received from the primary channel and the secondary channel and confirms the authentication via the secondary channel.

The secret's purpose is to securely bind the authentication operation on the primary and secondary channel. When the response is via the primary communication channel, the secret also establishes the claimant's control of the out-of-band device.

5.1.3.1 Out-of-Band Authenticators

The out-of-band authenticator SHALL establish a separate channel with the verifier in order to retrieve the out-of-band secret or authentication request. This channel is considered to be out-of-band with respect to the primary communication channel (even if it terminates on the same device) provided the device does not leak information from one channel to the other without the authorization of the claimant.

The out-of-band device SHOULD be uniquely addressable and communication over the secondary channel SHALL be encrypted unless sent via the public switched telephone network (PSTN). For additional authenticator requirements specific to the PSTN, see Section 5.1.3.3. Methods that do not prove possession of a specific device, such as voice-over-IP (VOIP) or email, SHALL NOT be used for out-of-band authentication.

The out-of-band authenticator SHALL uniquely authenticate itself in one of the following ways when communicating with the verifier:

- Establish an authenticated protected channel to the verifier using approved cryptography.
The key used SHALL be stored in suitably secure storage available to the authenticator application (e.g., keychain storage, TPM, TEE, secure element).
- Authenticate to a public mobile telephone network using a SIM card or equivalent that uniquely identifies the device. This method SHALL only be used if a secret is being sent from the verifier to the out-of-band device via the PSTN (SMS or voice).

If a secret is sent by the verifier to the out-of-band device, the device SHOULD NOT display the authentication secret while it is locked by the owner (i.e., requires an entry of a PIN, passcode, or biometric to view). However, authenticators SHOULD indicate the receipt of an authentication secret on a locked device.

If the out-of-band authenticator sends an approval message over the secondary communication channel — rather than by the claimant transferring a received secret to the primary communication channel — it SHALL do one of the following:

- The authenticator SHALL accept transfer of the secret from the primary channel which it SHALL send to the verifier over the secondary channel to associate the approval with the authentication transaction. The claimant MAY perform the transfer manually or use a technology such as a barcode or QR code to effect the transfer.
- The authenticator SHALL present a secret received via the secondary channel from the verifier and prompt the claimant to verify the consistency of that secret with the primary channel, prior to accepting a yes/no response from the claimant. It SHALL then send that response to the verifier.

5.1.3.2 Out-of-Band Verifiers

For additional verification requirements specific to the PSTN, see Section 5.1.3.3.

If out-of-band verification is to be made using a secure application, such as on a smart phone, the verifier MAY send a push notification to that device.
The verifier then waits for the establishment of an authenticated protected channel and verifies the authenticator’s identifying key. The verifier SHALL NOT store the identifying key itself, but SHALL use a verification method (e.g., an approved hash function or proof of possession of the identifying key) to uniquely identify the authenticator. Once authenticated, the verifier transmits the authentication secret to the authenticator.

Depending on the type of out-of-band authenticator, one of the following SHALL take place:

- Transfer of secret to primary channel: The verifier MAY signal the device containing the subscriber’s authenticator to indicate readiness to authenticate. It SHALL then transmit a random secret to the out-of-band authenticator. The verifier SHALL then wait for the secret to be returned on the primary communication channel.
- Transfer of secret to secondary channel: The verifier SHALL display a random authentication secret to the claimant via the primary channel. It SHALL then wait for the secret to be returned on the secondary channel from the claimant’s out-of-band authenticator.
- Verification of secrets by claimant: The verifier SHALL display a random authentication secret to the claimant via the primary channel, and SHALL send the same secret to the out-of-band authenticator via the secondary channel for presentation to the claimant. It SHALL then wait for an approval (or disapproval) message via the secondary channel.

In all cases, the authentication SHALL be considered invalid if not completed within 10 minutes. In order to provide replay resistance as described in Section 5.2.8, verifiers SHALL accept a given authentication secret only once during the validity period.

The verifier SHALL generate random authentication secrets with at least 20 bits of entropy using an approved random bit generator [SP 800-90Ar1].
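The verifier behavior above (a short random secret, a 10-minute validity window, and single-use acceptance) might be sketched as follows. The session bookkeeping is illustrative only, `secrets` stands in for an approved random bit generator, and the rate limiting required for short secrets is omitted.

```python
import secrets
import time

VALIDITY_SECONDS = 10 * 60  # authentication is invalid if not completed in 10 minutes

class OutOfBandVerifier:
    """Sketch: issue a random 6-digit secret (~20 bits of entropy) and
    accept it at most once within the 10-minute validity period."""

    def __init__(self):
        self._pending = {}  # hypothetical session id -> (secret, issue time)

    def issue(self, session_id: str) -> str:
        secret = f"{secrets.randbelow(10**6):06d}"  # CSPRNG-backed 6-digit code
        self._pending[session_id] = (secret, time.monotonic())
        return secret

    def check(self, session_id: str, candidate: str) -> bool:
        entry = self._pending.pop(session_id, None)  # pop: a secret is usable only once
        if entry is None:
            return False
        secret, issued = entry
        if time.monotonic() - issued > VALIDITY_SECONDS:
            return False
        return secrets.compare_digest(secret, candidate)

v = OutOfBandVerifier()
code = v.issue("session-1")
assert v.check("session-1", code)      # first use within the window succeeds
assert not v.check("session-1", code)  # replay of the same secret is rejected
```

Removing the pending entry on first use is what provides the replay resistance required by Section 5.2.8.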
If the authentication secret has less than 64 bits of entropy, the verifier SHALL implement a rate-limiting mechanism that effectively limits the number of failed authentication attempts that can be made on the subscriber’s account as described in Section 5.2.2.

5.1.3.3 Authentication using the Public Switched Telephone Network

Use of the PSTN for out-of-band verification is RESTRICTED as described in this section and in Section 5.2.10. If out-of-band verification is to be made using the PSTN, the verifier SHALL verify that the pre-registered telephone number being used is associated with a specific physical device. Changing the pre-registered telephone number is considered to be the binding of a new authenticator and SHALL only occur as described in Section 6.1.2.

Verifiers SHOULD consider risk indicators such as device swap, SIM change, number porting, or other abnormal behavior before using the PSTN to deliver an out-of-band authentication secret.

NOTE: Consistent with the restriction of authenticators in Section 5.2.10, NIST may adjust the RESTRICTED status of the PSTN over time based on the evolution of the threat landscape and the technical operation of the PSTN.

5.1.4 Single-Factor OTP Device

A single-factor OTP device generates OTPs. This category includes hardware devices and software-based OTP generators installed on devices such as mobile phones. These devices have an embedded secret that is used as the seed for generation of OTPs and do not require activation through a second factor. The OTP is displayed on the device and manually input for transmission to the verifier, thereby proving possession and control of the device. An OTP device may, for example, display 6 characters at a time.
A single-factor OTP device is something you have.

Single-factor OTP devices are similar to look-up secret authenticators with the exception that the secrets are cryptographically and independently generated by the authenticator and verifier and compared by the verifier. The secret is computed based on a nonce that may be time-based or from a counter on the authenticator and verifier.

5.1.4.1 Single-Factor OTP Authenticators

Single-factor OTP authenticators contain two persistent values. The first is a symmetric key that persists for the device’s lifetime. The second is a nonce that is either changed each time the authenticator is used or is based on a real-time clock.

The secret key and its algorithm SHALL provide at least the minimum security strength specified in the latest revision of SP 800-131A (112 bits as of the date of this publication). The nonce SHALL be of sufficient length to ensure that it is unique for each operation of the device over its lifetime. OTP authenticators — particularly software-based OTP generators — SHOULD discourage and SHALL NOT facilitate the cloning of the secret key onto multiple devices.

The authenticator output is obtained by using an approved block cipher or hash function to combine the key and nonce in a secure manner. The authenticator output MAY be truncated to as few as 6 decimal digits (approximately 20 bits of entropy).

If the nonce used to generate the authenticator output is based on a real-time clock, the nonce SHALL be changed at least once every 2 minutes. The OTP value associated with a given nonce SHALL be accepted only once.

5.1.4.2 Single-Factor OTP Verifiers

Single-factor OTP verifiers effectively duplicate the process of generating the OTP used by the authenticator.
As such, the symmetric keys used by authenticators are also present in the verifier, and SHALL be strongly protected against compromise.

When a single-factor OTP authenticator is being associated with a subscriber account, the verifier or associated CSP SHALL use approved cryptography to either generate and exchange or to obtain the secrets required to duplicate the authenticator output.

The verifier SHALL use approved encryption and an authenticated protected channel when collecting the OTP in order to provide resistance to eavesdropping and MitM attacks. Time-based OTPs [RFC 6238] SHALL have a defined lifetime that is determined by the expected clock drift — in either direction — of the authenticator over its lifetime, plus allowance for network delay and user entry of the OTP. In order to provide replay resistance as described in Section 5.2.8, verifiers SHALL accept a given time-based OTP only once during the validity period.

If the authenticator output has less than 64 bits of entropy, the verifier SHALL implement a rate-limiting mechanism that effectively limits the number of failed authentication attempts that can be made on the subscriber’s account as described in Section 5.2.2.

5.1.5 Multi-Factor OTP Devices

A multi-factor OTP device generates OTPs for use in authentication after activation through an additional authentication factor. This includes hardware devices and software-based OTP generators installed on devices such as mobile phones. The second factor of authentication may be achieved through some kind of integral entry pad, an integral biometric (e.g., fingerprint) reader, or a direct computer interface (e.g., USB port). The OTP is displayed on the device and manually input for transmission to the verifier.
For example, an OTP device may display 6\r\ncharacters at a time, thereby proving possession and control of the device. The multi-factor OTP\r\ndevice is something you have, and it SHALL be activated by either something you know or\r\nsomething you are.\r\n5.1.5.1 Multi-Factor OTP Authenticators\r\nMulti-factor OTP authenticators operate in a similar manner to single-factor OTP authenticators (see Section\r\n5.1.4.1), except that they require the entry of either a memorized secret or the use of a biometric to obtain the OTP\r\nfrom the authenticator. Each use of the authenticator SHALL require the input of the additional factor.\r\nIn addition to activation information, multi-factor OTP authenticators contain two persistent values. The first is a\r\nsymmetric key that persists for the device’s lifetime. The second is a nonce that is either changed each time the\r\nauthenticator is used or is based on a real-time clock.\r\nThe secret key and its algorithm SHALL provide at least the minimum security strength specified in the latest\r\nrevision of [SP 800-131A] (112 bits as of the date of this publication). The nonce SHALL be of sufficient length to\r\nensure that it is unique for each operation of the device over its lifetime. OTP authenticators — particularly\r\nsoftware-based OTP generators — SHOULD discourage and SHALL NOT facilitate the cloning of the secret key\r\nonto multiple devices.\r\nThe authenticator output is obtained by using an approved block cipher or hash function to combine the key and\r\nnonce in a secure manner. The authenticator output MAY be truncated to as few as 6 decimal digits\r\n(approximately 20 bits of entropy).\r\nIf the nonce used to generate the authenticator output is based on a real-time clock, the nonce SHALL be changed\r\nat least once every 2 minutes. 
The OTP value associated with a given nonce SHALL be accepted only once.

Any memorized secret used by the authenticator for activation SHALL be a randomly-chosen numeric secret at least 6 decimal digits in length or other memorized secret meeting the requirements of Section 5.1.1.2 and SHALL be rate limited as specified in Section 5.2.2. A biometric activation factor SHALL meet the requirements of Section 5.2.3, including limits on the number of consecutive authentication failures.

The unencrypted key and activation secret or biometric sample — and any biometric data derived from the biometric sample such as a probe produced through signal processing — SHALL be zeroized immediately after an OTP has been generated.

5.1.5.2 Multi-Factor OTP Verifiers

Multi-factor OTP verifiers effectively duplicate the process of generating the OTP used by the authenticator, but without the requirement that a second factor be provided. As such, the symmetric keys used by authenticators SHALL be strongly protected against compromise.

When a multi-factor OTP authenticator is being associated with a subscriber account, the verifier or associated CSP SHALL use approved cryptography to either generate and exchange or to obtain the secrets required to duplicate the authenticator output. The verifier or CSP SHALL also establish, via the authenticator source, that the authenticator is a multi-factor device. In the absence of a trusted statement that it is a multi-factor device, the verifier SHALL treat the authenticator as single-factor, in accordance with Section 5.1.4.

The verifier SHALL use approved encryption and an authenticated protected channel when collecting the OTP in order to provide resistance to eavesdropping and MitM attacks.
Time-based OTPs [RFC 6238] SHALL have a\r\ndefined lifetime that is determined by the expected clock drift — in either direction — of the authenticator over its\r\nlifetime, plus allowance for network delay and user entry of the OTP. In order to provide replay resistance as\r\ndescribed in Section 5.2.8, verifiers SHALL accept a given time-based OTP only once during the validity period.\r\nIn the event a claimant’s authentication is denied due to duplicate use of an OTP, verifiers MAY warn the claimant\r\nin case an attacker has been able to authenticate in advance. Verifiers MAY also warn a subscriber in an existing\r\nsession of the attempted duplicate use of an OTP.\r\nIf the authenticator output or activation secret has less than 64 bits of entropy, the verifier SHALL implement a\r\nrate-limiting mechanism that effectively limits the number of failed authentication attempts that can be made on\r\nthe subscriber’s account as described in Section 5.2.2. A biometric activation factor SHALL meet the requirements\r\nof Section 5.2.3, including limits on the number of consecutive authentication failures.\r\n5.1.6 Single-Factor Cryptographic Software\r\nA single-factor software cryptographic authenticator is a cryptographic key stored on\r\ndisk or some other \"soft\" media. Authentication is accomplished by proving possession\r\nand control of the key. The authenticator output is highly dependent on the specific\r\ncryptographic protocol, but it is generally some type of signed message. The single-factor software cryptographic authenticator is something you have.\r\n5.1.6.1 Single-Factor Cryptographic Software Authenticators\r\nSingle-factor software cryptographic authenticators encapsulate one or more secret keys unique to the\r\nauthenticator. The key SHALL be stored in suitably secure storage available to the authenticator application (e.g.,\r\nkeychain storage, TPM, or TEE if available). 
The key SHALL be strongly protected against unauthorized disclosure by the use of access controls that limit access to the key to only those software components on the device requiring access. Single-factor cryptographic software authenticators SHOULD discourage and SHALL NOT facilitate the cloning of the secret key onto multiple devices.

5.1.6.2 Single-Factor Cryptographic Software Verifiers

The requirements for a single-factor cryptographic software verifier are identical to those for a single-factor cryptographic device verifier, described in Section 5.1.7.2.

5.1.7 Single-Factor Cryptographic Devices

A single-factor cryptographic device is a hardware device that performs cryptographic operations using protected cryptographic key(s) and provides the authenticator output via direct connection to the user endpoint. The device uses embedded symmetric or asymmetric cryptographic keys, and does not require activation through a second factor of authentication. Authentication is accomplished by proving possession of the device via the authentication protocol. The authenticator output is provided by direct connection to the user endpoint and is highly dependent on the specific cryptographic device and protocol, but it is typically some type of signed message. A single-factor cryptographic device is something you have.

5.1.7.1 Single-Factor Cryptographic Device Authenticators

Single-factor cryptographic device authenticators encapsulate one or more secret keys unique to the device that SHALL NOT be exportable (i.e., cannot be removed from the device). The authenticator operates by signing a challenge nonce presented through a direct computer interface (e.g., a USB port). Alternatively, the authenticator could be a suitably secure processor integrated with the user endpoint itself (e.g., a hardware TPM).
Although\r\ncryptographic devices contain software, they differ from cryptographic software authenticators in that all\r\nembedded software is under control of the CSP or issuer and that the entire authenticator is subject to all\r\napplicable FIPS 140 requirements at the AAL being authenticated.\r\nThe secret key and its algorithm SHALL provide at least the minimum security length specified in the latest\r\nrevision of SP 800-131A (112 bits as of the date of this publication). The challenge nonce SHALL be at least 64\r\nbits in length. Approved cryptography SHALL be used.\r\nSingle-factor cryptographic device authenticators SHOULD require a physical input (e.g., the pressing of a\r\nbutton) in order to operate. This provides defense against unintended operation of the device, which might occur if\r\nthe endpoint to which it is connected is compromised.\r\n5.1.7.2 Single-Factor Cryptographic Device Verifiers\r\nSingle-factor cryptographic device verifiers generate a challenge nonce, send it to the corresponding authenticator,\r\nand use the authenticator output to verify possession of the device. The authenticator output is highly dependent\r\non the specific cryptographic device and protocol, but it is generally some type of signed message.\r\nThe verifier has either symmetric or asymmetric cryptographic keys corresponding to each authenticator. While\r\nboth types of keys SHALL be protected against modification, symmetric keys SHALL additionally be protected\r\nagainst unauthorized disclosure.\r\nThe challenge nonce SHALL be at least 64 bits in length, and SHALL either be unique over the authenticator’s\r\nlifetime or statistically unique (i.e., generated using an approved random bit generator [SP 800-90Ar1]). 
The verification operation SHALL use approved cryptography.

5.1.8 Multi-Factor Cryptographic Software

A multi-factor software cryptographic authenticator is a cryptographic key stored on disk or some other "soft" media that requires activation through a second factor of authentication. Authentication is accomplished by proving possession and control of the key. The authenticator output is highly dependent on the specific cryptographic protocol, but it is generally some type of signed message. The multi-factor software cryptographic authenticator is something you have, and it SHALL be activated by either something you know or something you are.

5.1.8.1 Multi-Factor Cryptographic Software Authenticators

Multi-factor software cryptographic authenticators encapsulate one or more secret keys unique to the authenticator and accessible only through the input of an additional factor, either a memorized secret or a biometric. The key SHOULD be stored in suitably secure storage available to the authenticator application (e.g., keychain storage, TPM, TEE). The key SHALL be strongly protected against unauthorized disclosure by the use of access controls that limit access to the key to only those software components on the device requiring access. Multi-factor cryptographic software authenticators SHOULD discourage and SHALL NOT facilitate the cloning of the secret key onto multiple devices.

Each authentication operation using the authenticator SHALL require the input of both factors.

Any memorized secret used by the authenticator for activation SHALL be a randomly-chosen numeric value at least 6 decimal digits in length or other memorized secret meeting the requirements of Section 5.1.1.2 and SHALL be rate limited as specified in Section 5.2.2.
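One way to realize the activation requirement above (a memorized secret gating access to the stored key) is to derive a key-encryption key from the activation secret. The sketch below is illustrative only; a production implementation would wrap the key with authenticated encryption (e.g., AES key wrap) rather than the XOR used here for brevity, so that a wrong PIN is detected rather than silently yielding garbage.

```python
import hashlib
import secrets

def derive_kek(pin: str, salt: bytes) -> bytes:
    # Derive a key-encryption key from the activation secret.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 210_000, dklen=32)

def wrap(secret_key: bytes, pin: str, salt: bytes) -> bytes:
    # Demonstration-only XOR "wrap"; use real authenticated encryption in practice.
    kek = derive_kek(pin, salt)
    return bytes(a ^ b for a, b in zip(secret_key, kek))

unwrap = wrap  # XOR is its own inverse

salt = secrets.token_bytes(16)   # per-authenticator salt, stored alongside the blob
key = secrets.token_bytes(32)    # the encapsulated authenticator secret key
pin = "492817"                   # randomly chosen 6-digit activation secret
blob = wrap(key, pin, salt)      # what is actually kept in storage
assert unwrap(blob, pin, salt) == key
```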
A biometric activation factor SHALL meet the requirements of Section 5.2.3, including limits on the number of consecutive authentication failures.

The unencrypted key and activation secret or biometric sample — and any biometric data derived from the biometric sample such as a probe produced through signal processing — SHALL be zeroized immediately after an authentication transaction has taken place.

5.1.8.2 Multi-Factor Cryptographic Software Verifiers

The requirements for a multi-factor cryptographic software verifier are identical to those for a single-factor cryptographic device verifier, described in Section 5.1.7.2. Verification of the output from a multi-factor cryptographic software authenticator proves use of the activation factor.

5.1.9 Multi-Factor Cryptographic Devices

A multi-factor cryptographic device is a hardware device that performs cryptographic operations using one or more protected cryptographic keys and requires activation through a second authentication factor. Authentication is accomplished by proving possession of the device and control of the key. The authenticator output is provided by direct connection to the user endpoint and is highly dependent on the specific cryptographic device and protocol, but it is typically some type of signed message. The multi-factor cryptographic device is something you have, and it SHALL be activated by either something you know or something you are.

5.1.9.1 Multi-Factor Cryptographic Device Authenticators

Multi-factor cryptographic device authenticators use tamper-resistant hardware to encapsulate one or more secret keys unique to the authenticator and accessible only through the input of an additional factor, either a memorized secret or a biometric.
The authenticator operates by using a private key that was unlocked by the additional factor\r\nto sign a challenge nonce presented through a direct computer interface (e.g., a USB port). Alternatively, the\r\nauthenticator could be a suitably secure processor integrated with the user endpoint itself (e.g., a hardware TPM).\r\nAlthough cryptographic devices contain software, they differ from cryptographic software authenticators in that\r\nall embedded software is under control of the CSP or issuer, and that the entire authenticator is subject to any\r\napplicable FIPS 140 requirements at the selected AAL.\r\nThe secret key and its algorithm SHALL provide at least the minimum security length specified in the latest\r\nrevision of SP 800-131A (112 bits as of the date of this publication). The challenge nonce SHALL be at least 64\r\nbits in length. Approved cryptography SHALL be used.\r\nEach authentication operation using the authenticator SHOULD require the input of the additional factor. Input of\r\nthe additional factor MAY be accomplished via either direct input on the device or via a hardware connection\r\n(e.g., USB, smartcard).\r\nAny memorized secret used by the authenticator for activation SHALL be a randomly-chosen numeric value at\r\nleast 6 decimal digits in length or other memorized secret meeting the requirements of Section 5.1.1.2 and SHALL\r\nbe rate limited as specified in Section 5.2.2. 
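The rate limiting referenced above (Section 5.2.2) might be sketched as follows. The 100-attempt cap comes from that section, the 30-second doubling delay is one of its suggested mitigations, and the class design is purely illustrative.

```python
MAX_CONSECUTIVE_FAILURES = 100   # cap from Section 5.2.2

class Throttle:
    """Per-account throttle: a hard cap plus an increasing retry delay."""

    def __init__(self) -> None:
        self.failures = 0
        self.not_before = 0.0    # epoch seconds of the next allowed attempt

    def allowed(self, now: float) -> bool:
        return self.failures < MAX_CONSECUTIVE_FAILURES and now >= self.not_before

    def record(self, success: bool, now: float) -> None:
        if success:
            self.failures = 0
            self.not_before = 0.0
        else:
            self.failures += 1
            # 30 s after the first failure, doubling with each one after.
            self.not_before = now + 30 * 2 ** (self.failures - 1)

t = Throttle()
t.record(success=False, now=0.0)
assert not t.allowed(now=10.0)   # still inside the initial 30 s delay
assert t.allowed(now=31.0)
```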
A biometric activation factor SHALL meet the requirements of Section 5.2.3, including limits on the number of consecutive authentication failures.

The unencrypted key and activation secret or biometric sample — and any biometric data derived from the biometric sample such as a probe produced through signal processing — SHALL be zeroized immediately after an authentication transaction has taken place.

5.1.9.2 Multi-Factor Cryptographic Device Verifiers

The requirements for a multi-factor cryptographic device verifier are identical to those for a single-factor cryptographic device verifier, described in Section 5.1.7.2. Verification of the authenticator output from a multi-factor cryptographic device proves use of the activation factor.

5.2 General Authenticator Requirements

5.2.1 Physical Authenticators

CSPs SHALL provide subscriber instructions on how to appropriately protect the authenticator against theft or loss. The CSP SHALL provide a mechanism to revoke or suspend the authenticator immediately upon notification from the subscriber that loss or theft of the authenticator is suspected.

5.2.2 Rate Limiting (Throttling)

When required by the authenticator type descriptions in Section 5.1, the verifier SHALL implement controls to protect against online guessing attacks. Unless otherwise specified in the description of a given authenticator, the verifier SHALL limit consecutive failed authentication attempts on a single account to no more than 100.

Additional techniques MAY be used to reduce the likelihood that an attacker will lock the legitimate claimant out as a result of rate limiting.
These include:\r\nRequiring the claimant to complete a CAPTCHA before attempting authentication.\r\nRequiring the claimant to wait following a failed attempt for a period of time that increases as the account\r\napproaches its maximum allowance for consecutive failed attempts (e.g., 30 seconds up to an hour).\r\nAccepting only authentication requests that come from a white list of IP addresses from which the\r\nsubscriber has been successfully authenticated before.\r\nLeveraging other risk-based or adaptive authentication techniques to identify user behavior that falls\r\nwithin, or out of, typical norms. These might, for example, include use of IP address, geolocation, timing of\r\nrequest patterns, or browser metadata.\r\nWhen the subscriber successfully authenticates, the verifier SHOULD disregard any previous failed attempts for\r\nthat user from the same IP address.\r\n5.2.3 Use of Biometrics\r\nThe use of biometrics (something you are) in authentication includes both measurement of physical characteristics\r\n(e.g., fingerprint, iris, facial characteristics) and behavioral characteristics (e.g., typing cadence). Both classes are\r\nconsidered biometric modalities, although different modalities may differ in the extent to which they establish\r\nauthentication intent as described in Section 5.2.9.\r\nFor a variety of reasons, this document supports only limited use of biometrics for authentication. These reasons\r\ninclude:\r\nThe biometric False Match Rate (FMR) does not provide confidence in the authentication of the subscriber\r\nby itself. In addition, FMR does not account for spoofing attacks.\r\nBiometric comparison is probabilistic, whereas the other authentication factors are deterministic.\r\nBiometric template protection schemes provide a method for revoking biometric credentials that is\r\ncomparable to other authentication factors (e.g., PKI certificates and passwords). 
However, the availability of such solutions is limited, and standards for testing these methods are under development.

Biometric characteristics do not constitute secrets. They can be obtained online or by taking a picture of someone with a camera phone (e.g., facial images) with or without their knowledge, lifted from objects someone touches (e.g., latent fingerprints), or captured with high resolution images (e.g., iris patterns). While presentation attack detection (PAD) technologies (e.g., liveness detection) can mitigate the risk of these types of attacks, additional trust in the sensor or biometric processing is required to ensure that PAD is operating in accordance with the needs of the CSP and the subscriber.

Therefore, the limited use of biometrics for authentication is supported with the following requirements and guidelines:

Biometrics SHALL be used only as part of multi-factor authentication with a physical authenticator (something you have).

An authenticated protected channel between sensor (or an endpoint containing a sensor that resists sensor replacement) and verifier SHALL be established and the sensor or endpoint SHALL be authenticated prior to capturing the biometric sample from the claimant.

The biometric system SHALL operate with an FMR [ISO/IEC 2382-37] of 1 in 1000 or better. This FMR SHALL be achieved under conditions of a conformant attack (i.e., zero-effort impostor attempt) as defined in [ISO/IEC 30107-1].

The biometric system SHOULD implement PAD. Testing of the biometric system to be deployed SHOULD demonstrate at least 90% resistance to presentation attacks for each relevant attack type (i.e., species), where resistance is defined as the number of thwarted presentation attacks divided by the number of trial presentation attacks.
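The resistance metric just defined (thwarted presentation attacks divided by trial presentation attacks) is simple to compute; the attack counts below are invented illustration data.

```python
def pad_resistance(thwarted: int, trials: int) -> float:
    # Resistance = thwarted presentation attacks / trial presentation attacks.
    return thwarted / trials

# e.g., 46 of 50 trial attacks of one species (attack type) were thwarted:
assert pad_resistance(46, 50) >= 0.90   # meets the 90% target for that species
```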
Testing of presentation attack resistance SHALL be in accordance with Clause 12 of [ISO/IEC 30107-3]. The PAD decision MAY be made either locally on the claimant’s device or by a central verifier.

Note: PAD is being considered as a mandatory requirement in future editions of this guideline.

The biometric system SHALL allow no more than 5 consecutive failed authentication attempts, or 10 consecutive failed attempts if PAD meeting the above requirements is implemented. Once that limit has been reached, the biometric authenticator SHALL either:

Impose a delay of at least 30 seconds before the next attempt, increasing exponentially with each successive attempt (e.g., 1 minute before the following failed attempt, 2 minutes before the second following attempt), or

Disable the biometric user authentication and offer another factor (e.g., a different biometric modality or a PIN/Passcode if it is not already a required factor) if such an alternative method is already available.

The verifier SHALL make a determination of sensor and endpoint performance, integrity, and authenticity. Acceptable methods for making this determination include, but are not limited to:

Authentication of the sensor or endpoint.

Certification by an approved accreditation authority.

Runtime interrogation of signed metadata (e.g., attestation) as described in Section 5.2.4.

Biometric comparison can be performed locally on the claimant’s device or at a central verifier. Since the potential for attacks on a larger scale is greater at central verifiers, local comparison is preferred.

If comparison is performed centrally:

Use of the biometric as an authentication factor SHALL be limited to one or more specific devices that are identified using approved cryptography.
Since the biometric has not yet unlocked the main authentication\r\nkey, a separate key SHALL be used for identifying the device.\r\nBiometric revocation, referred to as biometric template protection in ISO/IEC 24745, SHALL be\r\nimplemented.\r\nAll transmission of biometrics SHALL be over the authenticated protected channel.\r\nBiometric samples collected in the authentication process MAY be used to train comparison algorithms or — with\r\nuser consent — for other research purposes. Biometric samples and any biometric data derived from the biometric\r\nsample such as a probe produced through signal processing SHALL be zeroized immediately after any training or\r\nresearch data has been derived.\r\nBiometrics are also used in some cases to prevent repudiation of enrollment and to verify that the same individual\r\nparticipates in all phases of the enrollment process as described in SP 800-63A.\r\n5.2.4 Attestation\r\nAn attestation is information conveyed to the verifier regarding a directly-connected authenticator or the endpoint\r\ninvolved in an authentication operation. Information conveyed by attestation MAY include, but is not limited to:\r\nThe provenance (e.g., manufacturer or supplier certification), health, and integrity of the authenticator and\r\nendpoint.\r\nSecurity features of the authenticator.\r\nSecurity and performance characteristics of biometric sensor(s).\r\nSensor modality.\r\nIf this attestation is signed, it SHALL be signed using a digital signature that provides at least the minimum\r\nsecurity strength specified in the latest revision of SP 800-131A (112 bits as of the date of this publication).\r\nAttestation information MAY be used as part of a verifier’s risk-based authentication decision.\r\n5.2.5 Verifier Impersonation Resistance\r\nVerifier impersonation attacks, sometimes referred to as “phishing attacks,” are attempts by fraudulent verifiers\r\nand RPs to fool an unwary claimant into authenticating to an impostor website. 
In prior versions of SP 800-63, protocols resistant to verifier-impersonation attacks were also referred to as “strongly MitM resistant.”

A verifier impersonation-resistant authentication protocol SHALL establish an authenticated protected channel with the verifier. It SHALL then strongly and irreversibly bind a channel identifier that was negotiated in establishing the authenticated protected channel to the authenticator output (e.g., by signing the two values together using a private key controlled by the claimant for which the public key is known to the verifier). The verifier SHALL validate the signature or other information used to prove verifier impersonation resistance. This prevents an impostor verifier, even one that has obtained a certificate representing the actual verifier, from replaying that authentication on a different authenticated protected channel.

Approved cryptographic algorithms SHALL be used to establish verifier impersonation resistance where it is required. Keys used for this purpose SHALL provide at least the minimum security strength specified in the latest revision of SP 800-131A (112 bits as of the date of this publication).

One example of a verifier impersonation-resistant authentication protocol is client-authenticated TLS, because the client signs the authenticator output along with earlier messages from the protocol that are unique to the particular TLS connection being negotiated.

Authenticators that involve the manual entry of an authenticator output, such as out-of-band and OTP authenticators, SHALL NOT be considered verifier impersonation-resistant because the manual entry does not bind the authenticator output to the specific session being authenticated.
In a MitM attack, an impostor verifier\r\ncould replay the OTP authenticator output to the verifier and successfully authenticate.\r\n5.2.6 Verifier-CSP Communications\r\nIn situations where the verifier and CSP are separate entities (as shown by the dotted line in SP 800-63-3 Figure 4-\r\n1), communications between the verifier and CSP SHALL occur through a mutually-authenticated secure channel\r\n(such as a client-authenticated TLS connection) using approved cryptography.\r\n5.2.7 Verifier-Compromise Resistance\r\nUse of some types of authenticators requires that the verifier store a copy of the authenticator secret. For example,\r\nan OTP authenticator (described in Section 5.1.4) requires that the verifier independently generate the\r\nauthenticator output for comparison against the value sent by the claimant. Because of the potential for the verifier\r\nto be compromised and stored secrets stolen, authentication protocols that do not require the verifier to\r\npersistently store secrets that could be used for authentication are considered stronger, and are described herein as\r\nbeing verifier compromise resistant. Note that such verifiers are not resistant to all attacks. A verifier could be\r\ncompromised in a different way, such as being manipulated into always accepting a particular authenticator\r\noutput.\r\nVerifier compromise resistance can be achieved in different ways, for example:\r\nUse a cryptographic authenticator that requires the verifier store a public key corresponding to a private\r\nkey held by the authenticator.\r\nStore the expected authenticator output in hashed form. 
This method can be used with some look-up secret authenticators (described in Section 5.1.2), for example.

To be considered verifier compromise resistant, public keys stored by the verifier SHALL be associated with the use of approved cryptographic algorithms and SHALL provide at least the minimum security strength specified in the latest revision of SP 800-131A (112 bits as of the date of this publication).

Other verifier compromise resistant secrets SHALL use approved hash algorithms and the underlying secrets SHALL have at least the minimum security strength specified in the latest revision of SP 800-131A (112 bits as of the date of this publication). Secrets (e.g., memorized secrets) having lower complexity SHALL NOT be considered verifier compromise resistant when hashed because of the potential to defeat the hashing process through dictionary lookup or exhaustive search.

5.2.8 Replay Resistance

An authentication process resists replay attacks if it is impractical to achieve a successful authentication by recording and replaying a previous authentication message. Replay resistance is in addition to the replay-resistant nature of authenticated protected channel protocols, since the output could be stolen prior to entry into the protected channel.
Protocols that use nonces or challenges to prove the “freshness” of the transaction are resistant to replay attacks because the verifier will easily detect replayed protocol messages, which will not contain the appropriate nonces or timeliness data.

Examples of replay-resistant authenticators are OTP devices, cryptographic authenticators, and look-up secrets.

In contrast, memorized secrets are not considered replay resistant because the authenticator output — the secret itself — is provided for each authentication.

5.2.9 Authentication Intent

An authentication process demonstrates intent if it requires the subject to explicitly respond to each authentication or reauthentication request. The goal of authentication intent is to make it more difficult for directly-connected physical authenticators (e.g., multi-factor cryptographic devices) to be used without the subject’s knowledge, such as by malware on the endpoint. Authentication intent SHALL be established by the authenticator itself, although multi-factor cryptographic devices MAY establish intent by reentry of the other authentication factor on the endpoint with which the authenticator is used.

Authentication intent MAY be established in a number of ways. Authentication processes that require the subject’s intervention (e.g., a claimant entering an authenticator output from an OTP device) establish intent. Cryptographic devices that require user action (e.g., pushing a button or reinsertion) for each authentication or reauthentication operation also establish intent.

Depending on the modality, presentation of a biometric may or may not establish authentication intent. Presentation of a fingerprint would normally establish intent, while observation of the claimant’s face using a camera normally would not by itself.
Behavioral biometrics similarly are less likely to establish authentication intent because they do not always require a specific action on the claimant’s part.

5.2.10 Restricted Authenticators

As threats evolve, authenticators’ capability to resist attacks typically degrades. Conversely, some authenticators’ performance may improve — for example, when changes to their underlying standards increase their ability to resist particular attacks.

To account for these changes in authenticator performance, NIST places additional restrictions on authenticator types or specific classes or instantiations of an authenticator type.

The use of a RESTRICTED authenticator requires that the implementing organization assess, understand, and accept the risks associated with that RESTRICTED authenticator and acknowledge that risk will likely increase over time. It is the responsibility of the organization to determine the level of acceptable risk for their system(s) and associated data and to define any methods for mitigating excessive risks. If at any time the organization determines that the risk to any party is unacceptable, then that authenticator SHALL NOT be used.

Further, the risk of an authentication error is typically borne by multiple parties, including the implementing organization, organizations that rely on the authentication decision, and the subscriber. Because the subscriber may be exposed to additional risk when an organization accepts a RESTRICTED authenticator, and may have a limited understanding of and ability to control that risk, the CSP SHALL:

1. Offer subscribers at least one alternate authenticator that is not RESTRICTED and can be used to authenticate at the required AAL.
2.
Provide meaningful notice to subscribers regarding the security risks of the RESTRICTED authenticator and availability of alternative(s) that are not RESTRICTED.
3. Address any additional risk to subscribers in its risk assessment.
4. Develop a migration plan for the possibility that the RESTRICTED authenticator is no longer acceptable at some point in the future and include this migration plan in its digital identity acceptance statement.

6 Authenticator Lifecycle Management

This section is normative.

A number of events can occur over the lifecycle of a subscriber’s authenticator that affect that authenticator’s use. These events include binding, loss, theft, unauthorized duplication, expiration, and revocation. This section describes the actions to be taken in response to those events.

6.1 Authenticator Binding

Authenticator binding refers to the establishment of an association between a specific authenticator and a subscriber’s account, enabling the authenticator to be used — possibly in conjunction with other authenticators — to authenticate for that account.

Authenticators SHALL be bound to subscriber accounts by either:

Issuance by the CSP as part of enrollment; or

Associating a subscriber-provided authenticator that is acceptable to the CSP.

These guidelines refer to the binding rather than the issuance of an authenticator so as to accommodate both options.

Throughout the digital identity lifecycle, CSPs SHALL maintain a record of all authenticators that are or have been associated with each identity. The CSP or verifier SHALL maintain the information required for throttling authentication attempts when required, as described in Section 5.2.2. The CSP SHALL also verify the type of user-provided authenticator (e.g., single-factor cryptographic device vs.
multi-factor cryptographic device) so verifiers can determine compliance with requirements at each AAL.

The record created by the CSP SHALL contain the date and time the authenticator was bound to the account. The record SHOULD include information about the source of the binding (e.g., IP address, device identifier) of any device associated with the enrollment. If available, the record SHOULD also contain information about the source of unsuccessful authentications attempted with the authenticator.

When any new authenticator is bound to a subscriber account, the CSP SHALL ensure that the binding protocol and the protocol for provisioning the associated key(s) are done at a level of security commensurate with the AAL at which the authenticator will be used. For example, protocols for key provisioning SHALL use authenticated protected channels or be performed in person to protect against man-in-the-middle attacks. Binding of multi-factor authenticators SHALL require multi-factor authentication or equivalent (e.g., association with the session in which identity proofing has just been completed) be used in order to bind the authenticator. The same conditions apply when a key pair is generated by the authenticator and the public key is sent to the CSP.

6.1.1 Binding at Enrollment

The following requirements apply when an authenticator is bound to an identity as a result of a successful identity proofing transaction, as described in SP 800-63A.
Since Executive Order 13681 [EO 13681] requires the use of\r\nmulti-factor authentication for the release of any personal data, it is important that authenticators be bound to\r\nsubscriber accounts at enrollment, enabling access to personal data, including that established by identity\r\nproofing.\r\nThe CSP SHALL bind at least one, and SHOULD bind at least two, physical (something you have) authenticators\r\nto the subscriber’s online identity, in addition to a memorized secret or one or more biometrics. Binding of\r\nmultiple authenticators is preferred in order to recover from the loss or theft of the subscriber’s primary\r\nauthenticator.\r\nWhile all identifying information is self-asserted at IAL1, preservation of online material or an online reputation\r\nmakes it undesirable to lose control of an account due to the loss of an authenticator. The second authenticator\r\nmakes it possible to securely recover from an authenticator loss. For this reason, a CSP SHOULD bind at least two\r\nphysical authenticators to the subscriber’s credential at IAL1 as well.\r\nAt IAL2 and above, identifying information is associated with the digital identity and the subscriber has\r\nundergone an identity proofing process as described in SP 800-63A. As a result, authenticators at the same AAL as\r\nthe desired IAL SHALL be bound to the account. For example, if the subscriber has successfully completed\r\nproofing at IAL2, then AAL2 or AAL3 authenticators are appropriate to bind to the IAL2 identity. While a CSP\r\nMAY bind an AAL1 authenticator to an IAL2 identity, if the subscriber is authenticated at AAL1, the CSP SHALL\r\nNOT expose personal information, even if self-asserted, to the subscriber. 
As stated in the previous paragraph, the availability of additional authenticators provides backup methods for authentication if an authenticator is damaged, lost, or stolen.

If enrollment and binding cannot be completed in a single physical encounter or electronic transaction (i.e., within a single protected session), the following methods SHALL be used to ensure that the same party acts as the applicant throughout the processes:

For remote transactions:

1. The applicant SHALL identify themselves in each new binding transaction by presenting a temporary secret which was either established during a prior transaction, or sent to the applicant’s phone number, email address, or postal address of record.
2. Long-term authenticator secrets SHALL only be issued to the applicant within a protected session.

For in-person transactions:

1. The applicant SHALL identify themselves in person by either using a secret as described in remote transaction (1) above, or through use of a biometric that was recorded during a prior encounter.
2. Temporary secrets SHALL NOT be reused.
3. If the CSP issues long-term authenticator secrets during a physical transaction, then they SHALL be loaded locally onto a physical device that is issued in person to the applicant or delivered in a manner that confirms the address of record.

6.1.2 Post-Enrollment Binding

6.1.2.1 Binding of an Additional Authenticator at Existing AAL

With the exception of memorized secrets, CSPs and verifiers SHOULD encourage subscribers to maintain at least two valid authenticators of each factor that they will be using. For example, a subscriber who usually uses an OTP device as a physical authenticator MAY also be issued a number of look-up secret authenticators, or register a device for out-of-band authentication, in case the physical authenticator is lost, stolen, or damaged.
See Section 6.1.2.3 for more information on replacement of memorized secret authenticators.
Accordingly, CSPs SHOULD permit the binding of additional authenticators to a subscriber’s account. Before adding the new authenticator, the CSP SHALL first require the subscriber to authenticate at the AAL (or a higher AAL) at which the new authenticator will be used. When an authenticator is added, the CSP SHOULD send a notification to the subscriber via a mechanism that is independent of the transaction binding the new authenticator (e.g., email to an address previously associated with the subscriber). The CSP MAY limit the number of authenticators that may be bound in this manner.
6.1.2.2 Adding an Additional Factor to a Single-Factor Account
If the subscriber’s account has only one authentication factor bound to it (i.e., at IAL1/AAL1) and an additional authenticator of a different authentication factor is to be added, the subscriber MAY request that the account be upgraded to AAL2. The IAL would remain at IAL1.
Before binding the new authenticator, the CSP SHALL require the subscriber to authenticate at AAL1. The CSP SHOULD send a notification of the event to the subscriber via a mechanism independent of the transaction binding the new authenticator (e.g., email to an address previously associated with the subscriber).
6.1.2.3 Replacement of a Lost Authentication Factor
If a subscriber loses all authenticators of a factor necessary to complete multi-factor authentication and has been identity proofed at IAL2 or IAL3, that subscriber SHALL repeat the identity proofing process described in SP 800-63A.
An abbreviated proofing process, confirming the binding of the claimant to previously-supplied\r\nevidence, MAY be used if the CSP has retained the evidence from the original proofing process pursuant to a\r\nprivacy risk assessment as described in SP 800-63A Section 4.2. The CSP SHALL require the claimant to\r\nauthenticate using an authenticator of the remaining factor, if any, to confirm binding to the existing identity.\r\nReestablishment of authentication factors at IAL3 SHALL be done in person, or through a supervised remote\r\nprocess as described in SP 800-63A Section 5.3.3.2, and SHALL verify the biometric collected during the original\r\nproofing process.\r\nThe CSP SHOULD send a notification of the event to the subscriber. This MAY be the same notice as is required\r\nas part of the proofing process.\r\nReplacement of a lost (i.e., forgotten) memorized secret is problematic because it is very common. Additional\r\n“backup” memorized secrets do not mitigate this because they are just as likely to also have been forgotten. If a\r\nbiometric is bound to the account, the biometric and associated physical authenticator SHOULD be used to\r\nestablish a new memorized secret.\r\nAs an alternative to the above re-proofing process when there is no biometric bound to the account, the CSP MAY\r\nbind a new memorized secret with authentication using two physical authenticators, along with a confirmation\r\ncode that has been sent to one of the subscriber’s addresses of record. The confirmation code SHALL consist of at\r\nleast 6 random alphanumeric characters generated by an approved random bit generator [SP 800-90Ar1]. Those\r\nsent to a postal address of record SHALL be valid for a maximum of 7 days but MAY be made valid up to 21 days\r\nvia an exception process to accommodate addresses outside the direct reach of the U.S. 
Postal Service. Confirmation codes sent by means other than physical mail SHALL be valid for a maximum of 10 minutes.
6.1.3 Binding to a Subscriber-provided Authenticator
A subscriber may already possess authenticators suitable for authentication at a particular AAL. For example, they may have a two-factor authenticator from a social network provider, considered AAL2 and IAL1, and would like to use those credentials at an RP that requires IAL2.
CSPs SHOULD, where practical, accommodate the use of subscriber-provided authenticators in order to relieve the burden to the subscriber of managing a large number of authenticators. Binding of these authenticators SHALL be done as described in Section 6.1.2.1. In situations where the authenticator strength is not self-evident (e.g., between single-factor and multi-factor authenticators of a given type), the CSP SHOULD assume the use of the weaker authenticator unless it is able to establish that the stronger authenticator is in fact being used (e.g., by verification with the issuer or manufacturer of the authenticator).
6.1.4 Renewal
The CSP SHOULD bind an updated authenticator an appropriate amount of time before an existing authenticator’s expiration. The process for this SHOULD conform closely to the initial authenticator binding process (e.g., confirming address of record). Following successful use of the new authenticator, the CSP MAY revoke the authenticator that it is replacing.
6.2 Loss, Theft, Damage, and Unauthorized Duplication
Compromised authenticators include those that have been lost, stolen, or subject to unauthorized duplication. Generally, one must assume that a lost authenticator has been stolen or compromised by someone that is not the legitimate subscriber of the authenticator.
Damaged or malfunctioning authenticators are also considered\r\ncompromised to guard against any possibility of extraction of the authenticator secret. One notable exception is a\r\nmemorized secret that has been forgotten without other indications of having been compromised, such as having\r\nbeen obtained by an attacker.\r\nSuspension, revocation, or destruction of compromised authenticators SHOULD occur as promptly as practical\r\nfollowing detection. Agencies SHOULD establish time limits for this process.\r\nTo facilitate secure reporting of the loss, theft, or damage to an authenticator, the CSP SHOULD provide the\r\nsubscriber with a method of authenticating to the CSP using a backup or alternate authenticator. This backup\r\nauthenticator SHALL be either a memorized secret or a physical authenticator. Either MAY be used, but only one\r\nauthentication factor is required to make this report. Alternatively, the subscriber MAY establish an authenticated\r\nprotected channel to the CSP and verify information collected during the proofing process. The CSP MAY choose\r\nto verify an address of record (i.e., email, telephone, postal) and suspend authenticator(s) reported to have been\r\ncompromised. The suspension SHALL be reversible if the subscriber successfully authenticates to the CSP using a\r\nvalid (i.e., not suspended) authenticator and requests reactivation of an authenticator suspended in this manner.\r\nThe CSP MAY set a time limit after which a suspended authenticator can no longer be reactivated.\r\n6.3 Expiration\r\nCSPs MAY issue authenticators that expire. If and when an authenticator expires, it SHALL NOT be usable for\r\nauthentication. 
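The expiration rule can be illustrated by a verifier-side check that refuses an expired authenticator and reports expiry as a distinct cause rather than a generic failure. This is an informative sketch; the record layout and function name are assumptions.

```python
from datetime import datetime, timezone

def check_authenticator_status(record, now=None):
    """Return 'expired', 'revoked', or 'ok' for an authenticator record."""
    now = now or datetime.now(timezone.utc)
    expires_at = record.get("expires_at")
    if expires_at is not None and now >= expires_at:
        return "expired"  # an expired authenticator SHALL NOT be usable
    if record.get("revoked", False):
        return "revoked"
    return "ok"
```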
When an authentication is attempted using an expired authenticator, the CSP SHOULD give an indication to the subscriber that the authentication failure is due to expiration rather than some other cause.
The CSP SHALL require subscribers to surrender or prove destruction of any physical authenticator containing attribute certificates signed by the CSP as soon as practical after expiration or receipt of a renewed authenticator.
6.4 Revocation and Termination
Revocation of an authenticator — sometimes referred to as termination, especially in the context of PIV authenticators — refers to removal of the binding between an authenticator and a credential the CSP maintains.
CSPs SHALL revoke the binding of authenticators promptly when an online identity ceases to exist (e.g., subscriber’s death, discovery of a fraudulent subscriber), when requested by the subscriber, or when the CSP determines that the subscriber no longer meets its eligibility requirements.
The CSP SHALL require subscribers to surrender or certify destruction of any physical authenticator containing certified attributes signed by the CSP as soon as practical after revocation or termination takes place. This is necessary to block the use of the authenticator’s certified attributes in offline situations between revocation/termination and expiration of the certification.
Further requirements on the termination of PIV authenticators are found in FIPS 201.
7 Session Management
This section is normative.
Once an authentication event has taken place, it is often desirable to allow the subscriber to continue using the application across multiple subsequent interactions without requiring them to repeat the authentication event.
This\r\nrequirement is particularly true for federation scenarios — described in SP 800-63C — where the authentication\r\nevent necessarily involves several components and parties coordinating across a network.\r\nTo facilitate this behavior, a session MAY be started in response to an authentication event, and continue the\r\nsession until such time that it is terminated. The session MAY be terminated for any number of reasons, including\r\nbut not limited to an inactivity timeout, an explicit logout event, or other means. The session MAY be continued\r\nthrough a reauthentication event — described in Section 7.2 — wherein the user repeats some or all of the initial\r\nauthentication event, thereby re-establishing the session.\r\nSession management is preferable over continual presentation of credentials as the poor usability of continual\r\npresentation often creates incentives for workarounds such as cached unlocking credentials, negating the freshness\r\nof the authentication event.\r\n7.1 Session Bindings\r\nA session occurs between the software that a subscriber is running — such as a browser, application, or operating\r\nsystem (i.e., the session subject) — and the RP or CSP that the subscriber is accessing (i.e., the session host). A\r\nsession secret SHALL be shared between the subscriber’s software and the service being accessed. This secret\r\nbinds the two ends of the session, allowing the subscriber to continue using the service over time. The secret\r\nSHALL be presented directly by the subscriber’s software or possession of the secret SHALL be proven using a\r\ncryptographic mechanism.\r\nThe secret used for session binding SHALL be generated by the session host in direct response to an\r\nauthentication event. A session SHOULD inherit the AAL properties of the authentication event which triggered\r\nits creation. 
A session MAY be considered at a lower AAL than the authentication event but SHALL NOT be considered at a higher AAL than the authentication event.
Secrets used for session binding:
1. SHALL be generated by the session host during an interaction, typically immediately following authentication.
2. SHALL be generated by an approved random bit generator [SP 800-90Ar1] and contain at least 64 bits of entropy.
3. SHALL be erased or invalidated by the session subject when the subscriber logs out.
4. SHOULD be erased on the subscriber endpoint when the user logs out or when the secret is deemed to have expired.
5. SHOULD NOT be placed in insecure locations such as HTML5 Local Storage due to the potential exposure of local storage to cross-site scripting (XSS) attacks.
6. SHALL be sent to and received from the device using an authenticated protected channel.
7. SHALL time out and not be accepted after the times specified in Sections 4.1.4, 4.2.4, and 4.3.4, as appropriate for the AAL.
8. SHALL NOT be available to insecure communications between the host and subscriber’s endpoint. Authenticated sessions SHALL NOT fall back to an insecure transport, such as from https to http, following authentication.
URLs or POST content SHALL contain a session identifier that SHALL be verified by the RP to ensure that actions taken outside the session do not affect the protected session.
There are several mechanisms for managing a session over time. The following sections give different examples along with additional requirements and considerations particular to each example technology.
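The generation and timeout requirements for session-binding secrets (host-side generation from an approved random bit generator with at least 64 bits of entropy, and rejection after the AAL-appropriate time limit) can be sketched as follows. This is an informative illustration; the names and the 30-minute figure are assumptions, not values taken from this guideline.

```python
import secrets
import time

SESSION_TIMEOUT_SECONDS = 30 * 60  # illustrative; use the AAL-appropriate limit
_sessions = {}  # session secret -> issuance time (epoch seconds)

def create_session():
    """Host generates the session secret in direct response to authentication."""
    secret = secrets.token_urlsafe(32)  # CSPRNG; 256 bits, well above the 64-bit floor
    _sessions[secret] = time.time()
    return secret

def session_valid(secret):
    """Accept the secret only within the timeout window; invalidate on expiry."""
    issued_at = _sessions.get(secret)
    if issued_at is None:
        return False
    if time.time() - issued_at > SESSION_TIMEOUT_SECONDS:
        _sessions.pop(secret, None)  # timed-out secrets are no longer accepted
        return False
    return True
```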
Additional informative guidance is available in the OWASP Session Management Cheat Sheet [OWASP-session].
7.1.1 Browser Cookies
Browser cookies are the predominant mechanism by which a session will be created and tracked for a subscriber accessing a service.
Cookies:
1. SHALL be tagged to be accessible only on secure (HTTPS) sessions.
2. SHALL be accessible to the minimum practical set of hostnames and paths.
3. SHOULD be tagged to be inaccessible via JavaScript (HttpOnly).
4. SHOULD be tagged to expire at, or soon after, the session’s validity period. This requirement is intended to limit the accumulation of cookies, but SHALL NOT be depended upon to enforce session timeouts.
7.1.2 Access Tokens
An access token — such as found in OAuth — is used to allow an application to access a set of services on a subscriber’s behalf following an authentication event. The presence of an OAuth access token SHALL NOT be interpreted by the RP as presence of the subscriber, in the absence of other signals. The OAuth access token, and any associated refresh tokens, MAY be valid long after the authentication session has ended and the subscriber has left the application.
7.1.3 Device Identification
Other methods of secure device identification — including but not limited to mutual TLS, token binding, or other mechanisms — MAY be used to enact a session between a subscriber and a service.
7.2 Reauthentication
Continuity of authenticated sessions SHALL be based upon the possession of a session secret issued by the verifier at the time of authentication and optionally refreshed during the session. The nature of a session depends on the application, including:
1. A web browser session with a “session” cookie, or
2. An instance of a mobile application that retains a session secret.
Session secrets SHALL be non-persistent.
That is, they SHALL NOT be retained across a restart of the associated application or a reboot of the host device.
Periodic reauthentication of sessions SHALL be performed to confirm the continued presence of the subscriber at an authenticated session (i.e., that the subscriber has not walked away without logging out).
A session SHALL NOT be extended past the guidelines in Sections 4.1.3, 4.2.3, and 4.3.3 (depending on AAL) based on presentation of the session secret alone. Prior to session expiration, the reauthentication time limit SHALL be extended by prompting the subscriber for the authentication factor(s) specified in Table 7-1.
When a session has been terminated, due to a time-out or other action, the user SHALL be required to establish a new session by authenticating again.
Table 7-1 AAL Reauthentication Requirements
AAL  Requirement
1    Presentation of any one factor
2    Presentation of a memorized secret or biometric
3    Presentation of all factors
Note: At AAL2, a memorized secret or biometric, and not a physical authenticator, is required because the session secret is something you have, and an additional authentication factor is required to continue the session.
7.2.1 Reauthentication from a Federation or Assertion
When using a federation protocol as described in SP 800-63C, Section 5 to connect the CSP and RP, special considerations apply to session management and reauthentication. The federation protocol communicates an authentication event between the CSP and the RP but establishes no session between them. Since the CSP and RP often employ separate session management technologies, there SHALL NOT be any assumption of correlation
Consequently, when an RP session expires and the RP requires reauthentication, it is\r\nentirely possible that the session at the CSP has not expired and that a new assertion could be generated from this\r\nsession at the CSP without reauthenticating the user.\r\nAn RP requiring reauthentication through a federation protocol SHALL — if possible within the protocol —\r\nspecify the maximum acceptable authentication age to the CSP, and the CSP SHALL reauthenticate the subscriber\r\nif they have not been authenticated within that time period. The CSP SHALL communicate the authentication\r\nevent time to the RP to allow the RP to decide if the assertion is sufficient for reauthentication and to determine\r\nthe time for the next reauthentication event.\r\n8 Threats and Security Considerations\r\nThis section is informative.\r\n8.1 Authenticator Threats\r\nAn attacker who can gain control of an authenticator will often be able to masquerade as the authenticator’s\r\nowner. Threats to authenticators can be categorized based on attacks on the types of authentication factors that\r\ncomprise the authenticator:\r\nSomething you know may be disclosed to an attacker. The attacker might guess a memorized secret. Where\r\nthe authenticator is a shared secret, the attacker could gain access to the CSP or verifier and obtain the\r\nsecret value or perform a dictionary attack on a hash of that value. An attacker may observe the entry of a\r\nPIN or passcode, find a written record or journal entry of a PIN or passcode, or may install malicious\r\nsoftware (e.g., a keyboard logger) to capture the secret. Additionally, an attacker may determine the secret\r\nthrough offline attacks on a password database maintained by the verifier.\r\nSomething you have may be lost, damaged, stolen from the owner, or cloned by an attacker. For example,\r\nan attacker who gains access to the owner’s computer might copy a software authenticator. 
A hardware authenticator might be stolen, tampered with, or duplicated. Out-of-band secrets may be intercepted by an attacker and used to authenticate their own session.
Something you are may be replicated. For example, an attacker may obtain a copy of the subscriber’s fingerprint and construct a replica.
This document assumes that the subscriber is not colluding with an attacker who is attempting to falsely authenticate to the verifier. With this assumption in mind, the threats to the authenticator(s) used for digital authentication are listed in Table 8-1, along with some examples.
Table 8-1 Authenticator Threats
Assertion Manufacture or Modification — The attacker generates a false assertion or modifies an existing assertion. Examples: a compromised CSP asserts the identity of a claimant who has not properly authenticated; a compromised proxy changes the AAL of an authentication assertion.
Theft — A physical authenticator is stolen by an attacker. Examples: a hardware cryptographic device, OTP device, look-up secret authenticator, or cell phone is stolen.
Duplication — The subscriber’s authenticator has been copied with or without their knowledge. Examples: passwords written on paper are disclosed; passwords stored in an electronic file are copied; a software PKI authenticator (private key) is copied; a look-up secret authenticator is copied; a counterfeit biometric authenticator is manufactured.
Eavesdropping — The authenticator secret or authenticator output is revealed to the attacker as the subscriber is authenticating. Examples: memorized secrets are obtained by watching keyboard entry; memorized secrets or authenticator outputs are intercepted by keystroke logging software; a PIN is captured from a PIN pad device; a hashed password is obtained and used by an attacker for another authentication (pass-the-hash attack); an out-of-band secret is intercepted by compromising the communication channel, e.g., transmitted via unencrypted Wi-Fi and received by the attacker.
Offline Cracking — The authenticator is exposed using analytical methods outside the authentication mechanism. Example: a software PKI authenticator is subjected to a dictionary attack to identify the correct password to use to decrypt the private key.
Side Channel Attack — The authenticator secret is exposed using physical characteristics of the authenticator. Examples: a key is extracted by differential power analysis on a hardware cryptographic authenticator; a cryptographic authenticator secret is extracted by analysis of the response time of the authenticator over a number of attempts.
Phishing or Pharming — The authenticator output is captured by fooling the subscriber into thinking the attacker is a verifier or RP. Examples: a password is revealed by the subscriber to a website impersonating the verifier; a memorized secret is revealed by a bank subscriber in response to an email inquiry from a phisher pretending to represent the bank; a memorized secret is revealed by the subscriber at a bogus verifier website reached through DNS spoofing.
Social Engineering — The attacker establishes a level of trust with a subscriber in order to convince the subscriber to reveal their authenticator secret or authenticator output. Examples: a memorized secret is revealed by the subscriber to an officemate asking for the password on behalf of the subscriber’s boss; a memorized secret is revealed by a subscriber in a telephone inquiry from an attacker masquerading as a system administrator; an out-of-band secret sent via SMS is received by an attacker who has convinced the mobile operator to redirect the victim’s mobile phone to the attacker.
Online Guessing — The attacker connects to the verifier online and attempts to guess a valid authenticator output in the context of that verifier. Examples: online dictionary attacks are used to guess memorized secrets; online guessing is used to guess authenticator outputs for an OTP device registered to a legitimate claimant.
Endpoint Compromise — Malicious code on the endpoint proxies remote access to a connected authenticator without the subscriber’s consent, causes authentication to other than the intended verifier, or compromises a multi-factor software cryptographic authenticator. Examples: a cryptographic authenticator connected to the endpoint is used to authenticate remote attackers; authentication is performed on behalf of an attacker rather than the subscriber; a malicious app on the endpoint reads an out-of-band secret sent via SMS and the attacker uses the secret to authenticate; malicious code proxies authentication or exports authenticator keys from the endpoint.
Unauthorized Binding — An attacker is able to cause an authenticator under their control to be bound to a subscriber’s account. Example: an attacker intercepts an authenticator or provisioning key en route to the subscriber.
8.2 Threat Mitigation Strategies
Related mechanisms that assist in mitigating the
threats identified above are summarized in Table 8-2, with normative references in parentheses.
Table 8-2 Mitigating Authenticator Threats
Theft — Use multi-factor authenticators that need to be activated through a memorized secret or biometric (4.2.1, 4.3.1). Use a combination of authenticators that includes a memorized secret or biometric (4.2.1, 4.3.1).
Duplication — Use authenticators from which it is difficult to extract and duplicate long-term authentication secrets (4.2.2, 4.3.2, 5.1.7.1).
Eavesdropping — Ensure the security of the endpoint, especially with respect to freedom from malware such as key loggers, prior to use (4.2.2). Avoid use of non-trusted wireless networks as unencrypted secondary out-of-band authentication channels (5.1.3.1). Authenticate over authenticated protected channels, e.g., observe lock icon in browser window (4.1.2, 4.2.2, 4.3.2). Use authentication protocols that are resistant to replay attacks such as pass-the-hash (5.2.8). Use authentication endpoints that employ trusted input and trusted display capabilities (5.1.6.1, 5.1.8.1).
Offline Cracking — Use an authenticator with a high entropy authenticator secret (5.1.2.1, 5.1.4.1, 5.1.5.1, 5.1.7.1, 5.1.9.1). Store memorized secrets in a salted, hashed form, including a keyed hash (5.1.1.2, 5.2.7).
Side Channel Attack — Use authenticator algorithms that are designed to maintain constant power consumption and timing regardless of secret values (4.3.2).
Phishing or Pharming — Use authenticators that provide verifier impersonation resistance (5.2.5).
Social Engineering — Avoid use of authenticators that present a risk of social engineering of third parties such as customer service agents (6.1.2.1, 6.1.2.3).
Online Guessing — Use authenticators that generate high entropy output (5.1.2.1, 5.1.7.1, 5.1.9.1). Use an authenticator that locks up after a number of repeated failed activation attempts (5.2.2).
Endpoint Compromise — Use hardware authenticators that require physical action by the subscriber (5.2.9). Maintain software-based keys in restricted-access storage (5.1.3.1, 5.1.6.1, 5.1.8.1).
Unauthorized Binding — Use MitM-resistant protocols for provisioning of authenticators and associated keys (6.1).
Several other strategies may be applied to mitigate the threats described in Table 8-1:
Multiple factors make successful attacks more difficult to accomplish. If an attacker needs to both steal a cryptographic authenticator and guess a memorized secret, then the work to discover both factors may be too high.
Physical security mechanisms may be employed to protect a stolen authenticator from duplication. Physical security mechanisms can provide tamper evidence, detection, and response.
Requiring the use of long memorized secrets that don’t appear in common dictionaries may force attackers to try every possible value.
System and network security controls may be employed to prevent an attacker from gaining access to a system or installing malicious software.
Periodic training may be performed to ensure subscribers understand when and how to report compromise — or suspicion of compromise — or otherwise recognize patterns of behavior that may signify an attacker attempting to compromise the authentication process.
Out-of-band techniques may be employed to verify proof of possession of registered devices (e.g., cell phones).
8.3 Authenticator Recovery
The weak point in many authentication mechanisms is the process followed when a subscriber loses control of one or more authenticators and needs to replace them. In many cases, the options remaining available to authenticate the subscriber are limited, and economic concerns (e.g., cost of maintaining call centers) motivate the use of inexpensive, and often less secure, backup authentication methods. To the extent that authenticator recovery is human-assisted, there is also the risk of social engineering attacks.
To maintain the integrity of the authentication factors, it is essential that it not be possible to leverage an authentication involving one factor to obtain an authenticator of a different factor. For example, a memorized secret must not be usable to obtain a new list of look-up secrets.
8.4 Session Attacks
The above discussion focuses on threats to the authentication event itself, but hijacking attacks on the session following an authentication event can have similar security impacts.
The session management guidelines in\r\nSection 7 are essential to maintain session integrity against attacks, such as XSS. In addition, it is important to\r\nsanitize all information to be displayed [OWASP-XSS-prevention] to ensure that it does not contain executable\r\ncontent. These guidelines also recommend that session secrets be made inaccessible to mobile code in order to\r\nprovide extra protection against exfiltration of session secrets.\r\nAnother post-authentication threat, cross-site request forgery (CSRF), takes advantage of users’ tendency to have\r\nmultiple sessions active at the same time. It is important to embed and verify a session identifier into web requests\r\nto prevent the ability for a valid URL or request to be unintentionally or maliciously activated.\r\n9 Privacy Considerations\r\nThese privacy considerations supplement the guidance in Section 4. This section is informative.\r\n9.1 Privacy Risk Assessment\r\nSections 4.1.5, 4.2.5, and 4.3.5 require the CSP to conduct a privacy risk assessment for records retention. Such a\r\nprivacy risk assessment would include:\r\n1. The likelihood that the records retention could create a problem for the subscriber, such as invasiveness or\r\nunauthorized access to the information.\r\n2. The impact if such a problem did occur.\r\nCSPs should be able to reasonably justify any response they take to identified privacy risks, including accepting\r\nthe risk, mitigating the risk, and sharing the risk. The use of subscriber consent is a form of sharing the risk, and\r\ntherefore appropriate for use only when a subscriber could reasonably be expected to have the capacity to assess\r\nand accept the shared risk.\r\n9.2 Privacy Controls\r\nSection 4.4 requires CSPs to employ appropriately-tailored privacy controls. SP 800-53 provides a set of privacy\r\ncontrols for CSPs to consider when deploying authentication mechanisms. 
These controls cover notices, redress, and other important considerations for successful and trustworthy deployments.
9.3 Use Limitation
Section 4.4 requires CSPs to use measures to maintain the objectives of predictability (enabling reliable assumptions by individuals, owners, and operators about PII and its processing by an information system) and manageability (providing the capability for granular administration of PII, including alteration, deletion, and selective disclosure) commensurate with privacy risks that can arise from the processing of attributes for purposes other than identity proofing, authentication, authorization, or attribute assertion, related fraud mitigation, or to comply with law or legal process [NISTIR 8062].
CSPs may have various business purposes for processing attributes, including providing non-identity services to subscribers. However, processing attributes for purposes other than those specified at collection can create privacy risks when individuals are not expecting or comfortable with the additional processing. CSPs can determine appropriate measures commensurate with the privacy risk arising from the additional processing. For example, absent applicable law, regulation, or policy, it may not be necessary to get consent when processing attributes to provide non-identity services requested by subscribers, although notices may help subscribers maintain reliable assumptions about the processing (predictability). Other processing of attributes may carry different privacy risks that call for obtaining consent or allowing subscribers more control over the use or disclosure of specific attributes (manageability).
Subscriber consent needs to be meaningful; therefore, as stated in Section 4.4, when CSPs use consent measures, acceptance by the subscriber of additional uses SHALL NOT be a condition of providing authentication services.

Consult your SAOP if there are questions about whether the proposed processing falls outside the scope of the permitted processing or the appropriate privacy risk mitigation measures.

9.4 Agency-Specific Privacy Compliance

Section 4.4 covers specific compliance obligations for federal CSPs. It is critical to involve your agency’s SAOP in the earliest stages of digital authentication system development in order to assess and mitigate privacy risks and advise the agency on compliance requirements, such as whether or not the collection of PII to issue or maintain authenticators triggers the Privacy Act of 1974 [Privacy Act] or the E-Government Act of 2002 [E-Gov] requirement to conduct a PIA. For example, with respect to centralized maintenance of biometrics, it is likely that the Privacy Act requirements will be triggered and require coverage by either a new or existing Privacy Act system of records due to the collection and maintenance of PII and any other attributes necessary for authentication. The SAOP can similarly assist the agency in determining whether a PIA is required.

These considerations should not be read as a requirement to develop a Privacy Act SORN or PIA for authentication alone. In many cases it will make the most sense to draft a PIA and SORN that encompasses the entire digital authentication process or include the digital authentication process as part of a larger programmatic PIA that discusses the service or benefit to which the agency is establishing online access.

Due to the many components of digital authentication, it is important for the SAOP to have an awareness and understanding of each individual component.
For example, other privacy artifacts may be applicable to an agency offering or using federated CSP or RP services (e.g., Data Use Agreements, Computer Matching Agreements). The SAOP can assist the agency in determining what additional requirements apply. Moreover, a thorough understanding of the individual components of digital authentication will enable the SAOP to thoroughly assess and mitigate privacy risks either through compliance processes or by other means.

10 Usability Considerations

This section is informative.

ISO/IEC 9241-11 defines usability as the “extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” This definition focuses on users, their goals, and the context of use as key elements necessary for achieving effectiveness, efficiency, and satisfaction. A holistic approach that accounts for these key elements is necessary to achieve usability.

A user’s goal for accessing an information system is to perform an intended task. Authentication is the function that enables this goal. However, from the user’s perspective, authentication stands between them and their intended task. Effective design and implementation of authentication makes it easy to do the right thing, hard to do the wrong thing, and easy to recover when the wrong thing happens.

Organizations need to be cognizant of the overall implications of their stakeholders’ entire digital authentication ecosystem. Users often employ one or more authenticators, each for a different RP. They then struggle to remember passwords, to recall which authenticator goes with which RP, and to carry multiple physical authentication devices.
Evaluating the usability of authentication is critical, as poor usability often results in coping mechanisms and unintended work-arounds that can ultimately degrade the effectiveness of security controls.

Integrating usability into the development process can lead to authentication solutions that are secure and usable while still addressing users’ authentication needs and organizations’ business goals.

The impact of usability across digital systems needs to be considered as part of the risk assessment when deciding on the appropriate AAL. Authenticators with a higher AAL sometimes offer better usability and should be allowed for use at lower AALs.

Leveraging federation for authentication can alleviate many of the usability issues, though such an approach has its own tradeoffs, as discussed in SP 800-63C.

This section provides general usability considerations and possible implementations, but does not recommend specific solutions. The implementations mentioned are examples to encourage innovative technological approaches to address specific usability needs. Further, usability considerations and their implementations are sensitive to many factors that prevent a one-size-fits-all solution. For example, a font size that works in the desktop computing environment may force text to scroll off of a small OTP device screen. Performing a usability evaluation on the selected authenticator is a critical component of implementation. It is important to conduct evaluations with representative users, realistic goals and tasks, and appropriate contexts of use.

ASSUMPTIONS

In this section, the term “users” means “claimants” or “subscribers.”

Guidelines and considerations are described from the users’ perspective.

Accessibility differs from usability and is out of scope for this document.
Section 508 was enacted to eliminate barriers in information technology and require federal agencies to make their online public content accessible to people with disabilities. Refer to Section 508 law and standards for accessibility guidance.

10.1 Usability Considerations Common to Authenticators

When selecting and implementing an authentication system, consider usability across the entire lifecycle of the selected authenticators (e.g., typical use and intermittent events), while being mindful of the combination of users, their goals, and context of use.

A single authenticator type usually does not suffice for the entire user population. Therefore, whenever possible — based on AAL requirements — CSPs should support alternative authenticator types and allow users to choose based on their needs. Task immediacy, perceived cost-benefit tradeoffs, and unfamiliarity with certain authenticators often impact choice. Users tend to choose options that incur the least burden or cost at that moment. For example, if a task requires immediate access to an information system, a user may prefer to create a new account and password rather than select an authenticator requiring more steps. Alternatively, users may choose a federated identity option — approved at the appropriate AAL — if they already have an account with an identity provider. Users may understand some authenticators better than others, and have different levels of trust based on their understanding and experience.

Positive user authentication experiences are integral to the success of an organization achieving desired business outcomes. Therefore, organizations should strive to consider authenticators from the users’ perspective.
The overarching authentication usability goal is to minimize user burden and authentication friction (e.g., the number of times a user has to authenticate, the steps involved, and the amount of information he or she has to track). Single sign-on exemplifies one such minimization strategy.

Usability considerations applicable to most authenticators are described below. Subsequent sections describe usability considerations specific to a particular authenticator.

Usability considerations for typical usage of all authenticators include:

Provide information on the use and maintenance of the authenticator, e.g., what to do if the authenticator is lost or stolen, and instructions for use — especially if there are different requirements for first-time use or initialization.

Authenticator availability should also be considered as users will need to remember to have their authenticator readily available. Consider the need for alternate authentication options to protect against loss, damage, or other negative impacts to the original authenticator.

Whenever possible, based on AAL requirements, users should be provided with alternate authentication options. This allows users to choose an authenticator based on their context, goals, and tasks (e.g., the frequency and immediacy of the task). Alternate authentication options also help address availability issues that may occur with a particular authenticator.

Characteristics of user-facing text:

Write user-facing text (e.g., instructions, prompts, notifications, error messages) in plain language for the intended audience. Avoid technical jargon and, typically, write for a 6th to 8th grade literacy level.

Consider the legibility of user-facing and user-entered text, including font style, size, color, and contrast with surrounding background. Illegible text contributes to user entry errors.
To enhance legibility, consider the use of:

High contrast. The highest contrast is black on white.

Sans serif fonts for electronic displays. Serif fonts for printed materials.

Fonts that clearly distinguish between easily confusable characters (e.g., the capital letter “O” and the number “0”).

A minimum font size of 12 points as long as the text fits for display on the device.

User experience during authenticator entry:

Offer the option to display text during entry, as masked text entry is error-prone. Once a given character is displayed long enough for the user to see, it can be hidden. Consider the device when determining masking delay time, as it takes longer to enter memorized secrets on mobile devices (e.g., tablets and smartphones) than on traditional desktop computers. Ensure masking delay durations are consistent with user needs.

Ensure the time allowed for text entry is adequate (i.e., the entry screen does not time out prematurely). Ensure allowed text entry times are consistent with user needs.

Provide clear, meaningful and actionable feedback on entry errors to reduce user confusion and frustration. Significant usability implications arise when users do not know they have entered text incorrectly.

Allow at least 10 entry attempts for authenticators requiring the entry of the authenticator output by the user. The longer and more complex the entry text, the greater the likelihood of user entry errors.

Provide clear, meaningful feedback on the number of remaining allowed attempts.
For rate limiting (i.e., throttling), inform users how long they have to wait until the next attempt to reduce confusion and frustration.

Minimize the impact of form-factor constraints, such as limited touch and display areas on mobile devices:

Larger touch areas improve usability for text entry since typing on small devices is significantly more error prone and time consuming than typing on a full-size keyboard. The smaller the onscreen keyboard, the more difficult it is to type, due to the size of the input mechanism (e.g., a finger) relative to the size of the on-screen target.

Follow good user interface and information design for small displays.

Intermittent events include events such as reauthentication, account lock-out, expiration, revocation, damage, loss, theft, and non-functional software.

Usability considerations for intermittent events across authenticator types include:

To prevent users from needing to reauthenticate due to user inactivity, prompt users to trigger activity just before (e.g., 2 minutes before) an inactivity timeout would otherwise occur.

Prompt users with adequate time (e.g., 1 hour) to save their work before a fixed periodic reauthentication event, which is required regardless of user activity.

Clearly communicate how and where to acquire technical assistance. For example, provide users with information such as a link to an online self-service feature, chat sessions, or a phone number for help desk support.
Ideally, sufficient information can be provided to enable users to recover from intermittent events on their own without outside intervention.

10.2 Usability Considerations by Authenticator Type

In addition to the previously described general usability considerations applicable to most authenticators (Section 10.1), the following sections describe other usability considerations specific to particular authenticator types.

10.2.1 Memorized Secrets

Typical Usage

Users manually input the memorized secret (commonly referred to as a password or PIN).

Usability considerations for typical usage include:

Memorability of the memorized secret.

The likelihood of recall failure increases as there are more items for users to remember. With fewer memorized secrets, users can more easily recall the specific memorized secret needed for a particular RP.

The memory burden is greater for a less frequently used password.

User experience during entry of the memorized secret.

Support copy and paste functionality in fields for entering memorized secrets, including passphrases.

Intermittent Events

Usability considerations for intermittent events include:

When users create and change memorized secrets:

Clearly communicate information on how to create and change memorized secrets.

Clearly communicate memorized secret requirements, as specified in Section 5.1.1.

Allow at least 64 characters in length to support the use of passphrases. Encourage users to make memorized secrets as lengthy as they want, using any characters they like (including spaces), thus aiding memorization.

Do not impose other composition rules (e.g., mixtures of different character types) on memorized secrets.

Do not require that memorized secrets be changed arbitrarily (e.g., periodically) unless there is a user request or evidence of authenticator compromise.
(See Section 5.1.1 for additional information.)

Provide clear, meaningful and actionable feedback when chosen passwords are rejected (e.g., when a password appears on a “black list” of unacceptable passwords or has been used previously).

10.2.2 Look-Up Secrets

Typical Usage

Users use the authenticator — printed or electronic — to look up the appropriate secret(s) needed to respond to a verifier’s prompt. For example, a user may be asked to provide a specific subset of the numeric or character strings printed on a card in table format.

Usability considerations for typical usage include:

User experience during entry of look-up secrets.

Consider the prompts’ complexity and size. The larger the subset of secrets a user is prompted to look up, the greater the usability implications. Both the cognitive workload and physical difficulty for entry should be taken into account when selecting the quantity and complexity of look-up secrets for authentication.

10.2.3 Out-of-Band

Typical Usage

Out-of-band authentication requires that users have access to a primary and secondary communication channel.

Usability considerations for typical usage:

Notify users of the receipt of a secret on a locked device. However, if the out-of-band device is locked, authentication to the device should be required to access the secret.

Depending on the implementation, consider form-factor constraints as they are particularly problematic when users must enter text on mobile devices. Providing larger touch areas will improve usability for entering secrets on mobile devices.

A better usability option is to offer features that do not require text entry on mobile devices (e.g., a single tap on the screen, or a copy feature so users can copy and paste out-of-band secrets).
Providing users such features is particularly helpful when the primary and secondary channels are on the same device. For example, it is difficult for users to transfer the authentication secret on a smartphone because they must switch back and forth — potentially multiple times — between the out-of-band application and the primary channel.

10.2.4 Single-Factor OTP Device

Typical Usage

Users access the OTP generated by the single-factor OTP device. The authenticator output is typically displayed on the device and the user enters it for the verifier.

Usability considerations for typical usage include:

Authenticator output allows at least one minute between changes, but ideally allows users the full two minutes as specified in Section 5.1.4.1. Users need adequate time to enter the authenticator output (including looking back and forth between the single-factor OTP device and the entry screen).

Depending on the implementation, the following are additional usability considerations for implementers:

If the single-factor OTP device supplies its output via an electronic interface (e.g., USB), this is preferable since users do not have to manually enter the authenticator output. However, if a physical input (e.g., pressing a button) is required to operate it, the location of the USB ports could pose usability difficulties. For example, the USB ports of some computers are located on the back of the computer and will be difficult for users to reach.

Limited availability of a direct computer interface such as a USB port could pose usability difficulties. For example, the number of USB ports on laptop computers is often very limited.
This may force users to unplug other USB peripherals in order to use the single-factor OTP device.

10.2.5 Multi-Factor OTP Device

Typical Usage

Users access the OTP generated by the multi-factor OTP device through a second authentication factor. The OTP is typically displayed on the device and the user manually enters it for the verifier. The second authentication factor may be achieved through some kind of integral entry pad to enter a memorized secret, an integral biometric (e.g., fingerprint) reader, or a direct computer interface (e.g., USB port). Usability considerations for the additional factor apply as well — see Section 10.2.1 for memorized secrets and Section 10.4 for biometrics used in multi-factor authenticators.

Usability considerations for typical usage include:

User experience during manual entry of the authenticator output.

For time-based OTP, provide a grace period in addition to the time during which the OTP is displayed. Users need adequate time to enter the authenticator output, including looking back and forth between the multi-factor OTP device and the entry screen.

Consider form-factor constraints if users must unlock the multi-factor OTP device via an integral entry pad or enter the authenticator output on mobile devices. Typing on small devices is significantly more error prone and time-consuming than typing on a traditional keyboard. The smaller the integral entry pad and onscreen keyboard, the more difficult it is to type.
Providing larger touch areas improves usability for unlocking the multi-factor OTP device or entering the authenticator output on mobile devices.

Limited availability of a direct computer interface like a USB port could pose usability difficulties. For example, laptop computers often have a limited number of USB ports, which may force users to unplug other USB peripherals to use the multi-factor OTP device.

10.2.6 Single-Factor Cryptographic Software

Typical Usage

Users authenticate by proving possession and control of the cryptographic software key.

Usability considerations for typical usage include:

Give cryptographic keys appropriately descriptive names that are meaningful to users since users have to recognize and recall which cryptographic key to use for which authentication task. This prevents users from having to deal with multiple similarly- and ambiguously-named cryptographic keys. Selecting from multiple cryptographic keys on smaller mobile devices may be particularly problematic if the names of the cryptographic keys are shortened due to reduced screen size.

10.2.7 Single-Factor Cryptographic Device

Typical Usage

Users authenticate by proving possession of the single-factor cryptographic device.

Usability considerations for typical usage include:

Requiring a physical input (e.g., pressing a button) to operate the single-factor cryptographic device could pose usability difficulties. For example, some USB ports are located on the back of computers, making them difficult for users to reach.

Limited availability of a direct computer interface like a USB port could pose usability difficulties.
For example, laptop computers often have a limited number of USB ports, which may force users to unplug other USB peripherals to use the single-factor cryptographic device.

10.2.8 Multi-Factor Cryptographic Software

Typical Usage

In order to authenticate, users prove possession and control of the cryptographic key stored on disk or some other “soft” media that requires activation. The activation is through the input of a second authentication factor, either a memorized secret or a biometric. Usability considerations for the additional factor apply as well — see Section 10.2.1 for memorized secrets and Section 10.4 for biometrics used in multi-factor authenticators.

Usability considerations for typical usage include:

Give cryptographic keys appropriately descriptive names that are meaningful to users since users have to recognize and recall which cryptographic key to use for which authentication task. This prevents users from having to deal with multiple similarly- and ambiguously-named cryptographic keys. Selecting from multiple cryptographic keys on smaller mobile devices may be particularly problematic if the names of the cryptographic keys are shortened due to reduced screen size.

10.2.9 Multi-Factor Cryptographic Device

Typical Usage

Users authenticate by proving possession of the multi-factor cryptographic device and control of the protected cryptographic key. The device is activated by a second authentication factor, either a memorized secret or a biometric. Usability considerations for the additional factor apply as well — see Section 10.2.1 for memorized secrets and Section 10.4 for biometrics used in multi-factor authenticators.

Usability considerations for typical usage include:

Do not require users to keep multi-factor cryptographic devices connected following authentication.
Users may forget to disconnect the multi-factor cryptographic device when they are done with it (e.g., forgetting a smartcard in the smartcard reader and walking away from the computer).

Users need to be informed regarding whether the multi-factor cryptographic device is required to stay connected or not.

Give cryptographic keys appropriately descriptive names that are meaningful to users since users have to recognize and recall which cryptographic key to use for which authentication task. This prevents users from being faced with multiple similarly- and ambiguously-named cryptographic keys. Selecting from multiple cryptographic keys on smaller mobile devices (such as smartphones) may be particularly problematic if the names of the cryptographic keys are shortened due to reduced screen size.

Limited availability of a direct computer interface like a USB port could pose usability difficulties. For example, laptop computers often have a limited number of USB ports, which may force users to unplug other USB peripherals to use the multi-factor cryptographic device.

10.3 Summary of Usability Considerations

Table 10-1 summarizes the usability considerations for typical usage and intermittent events for each authenticator type. Many of the usability considerations for typical usage apply to most of the authenticator types, as demonstrated in the rows. The table highlights common and divergent usability characteristics across the authenticator types. Each column allows readers to easily identify the usability attributes to address for each authenticator.
Depending on users’ goals and context of use, certain attributes may be valued over others. Whenever possible, provide alternative authenticator types and allow users to choose between them.

Multi-factor authenticators (e.g., multi-factor OTP devices, multi-factor cryptographic software, and multi-factor cryptographic devices) also inherit their secondary factor’s usability considerations. As biometrics are only allowed as an activation factor in multi-factor authentication solutions, usability considerations for biometrics are not included in Table 10-1 and are discussed in Section 10.4.

Table 10-1 Usability Considerations Summary by Authenticator Type

10.4 Biometrics Usability Considerations

This section provides a high-level overview of general usability considerations for biometrics. A more detailed discussion of biometric usability can be found in Usability & Biometrics, Ensuring Successful Biometric Systems [NIST Usability].

Although there are other biometric modalities, the following three biometric modalities are more commonly used for authentication: fingerprint, face, and iris.

Typical Usage

For all modalities, user familiarity and practice with the device improves performance.

Device affordances (i.e., properties of a device that allow a user to perform an action), feedback, and clear instructions are critical to a user’s success with the biometric device.
For example, provide clear instructions on the required actions for liveness detection.

Ideally, users can select the modality they are most comfortable with for their second authentication factor. The user population may be more comfortable and familiar with — and accepting of — some biometric modalities than others.

User experience with biometrics as an activation factor.

Provide clear, meaningful feedback on the number of remaining allowed attempts. For example, for rate limiting (i.e., throttling), inform users of the time period they have to wait until the next attempt to reduce user confusion and frustration.

Fingerprint Usability Considerations:

Users have to remember which finger(s) they used for initial enrollment.

The amount of moisture on the finger(s) affects the sensor’s ability for successful capture.

Additional factors influencing fingerprint capture quality include age, gender, and occupation (e.g., users handling chemicals or working extensively with their hands may have degraded friction ridges).

Face Usability Considerations:

Users have to remember whether they wore any artifacts (e.g., glasses) during enrollment because they affect facial recognition accuracy.

Differences in environmental lighting conditions can affect facial recognition accuracy.

Facial expressions affect facial recognition accuracy (e.g., smiling versus neutral expression).

Facial poses affect facial recognition accuracy (e.g., looking down or away from the camera).

Iris Usability Considerations:

Wearing colored contacts may affect the iris recognition accuracy.

Users who have had eye surgery may need to re-enroll post-surgery.

Differences in environmental lighting conditions can affect iris recognition accuracy, especially for certain iris colors.

Intermittent Events

As biometrics are only permitted as a second factor for multi-factor authentication, usability considerations for intermittent events
with the primary factor still apply. Intermittent events with biometric use include, but are not limited to, the following, which may affect recognition accuracy:

If users injure their enrolled finger(s), fingerprint recognition may not work. Fingerprint authentication will be difficult for users with degraded fingerprints.

The time elapsed between the time of facial recognition for authentication and the time of the initial enrollment can affect recognition accuracy as a user’s face changes naturally over time. A user’s weight change may also be a factor.

Iris recognition may not work for people who had eye surgery, unless they re-enroll.

Across all biometric modalities, usability considerations for intermittent events include:

An alternative authentication method must be available and functioning. In cases where biometrics do not work, allow users to use a memorized secret as an alternative second factor.

Provisions for technical assistance:

Clearly communicate information on how and where to acquire technical assistance. For example, provide users information such as a link to an online self-service feature and a phone number for help desk support. Ideally, provide sufficient information to enable users to recover from intermittent events on their own without outside intervention.

Inform users of factors that may affect the sensitivity of the biometric sensor (e.g., cleanliness of the sensor).

11 References

This section is informative.

11.1 General References

[BALLOON] Boneh, Dan, Henry Corrigan-Gibbs, and Stuart Schechter. “Balloon Hashing: A Memory-Hard Function Providing Provable Protection Against Sequential Attacks,” Asiacrypt 2016, October, 2016.
Available at: https://eprint.iacr.org/2016/027.

[Blacklists] Habib, Hana, Jessica Colnago, William Melicher, Blase Ur, Sean Segreti, Lujo Bauer, Nicolas Christin, and Lorrie Cranor. “Password Creation in the Presence of Blacklists,” 2017. Available at: https://www.ndss-symposium.org/wp-content/uploads/2017/09/usec2017_01_3_Habib_paper.pdf.

[Composition] Komanduri, Saranga, Richard Shay, Patrick Gage Kelley, Michelle L. Mazurek, Lujo Bauer, Nicolas Christin, Lorrie Faith Cranor, and Serge Egelman. “Of Passwords and People: Measuring the Effect of Password-Composition Policies.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2595–2604. ACM, 2011. Available at: https://www.ece.cmu.edu/~lbauer/papers/2011/chi2011-passwords.pdf.

[E-Gov] E-Government Act [includes FISMA] (P.L. 107-347), December 2002, available at: http://www.gpo.gov/fdsys/pkg/PLAW-107publ347/pdf/PLAW-107publ347.pdf.

[EO 13681] Executive Order 13681, Improving the Security of Consumer Financial Transactions, October 17, 2014, available at: https://www.federalregister.gov/d/2014-25439.

[FEDRAMP] General Services Administration, Federal Risk and Authorization Management Program, available at: https://www.fedramp.gov/.

[ICAM] National Security Systems and Identity, Credential and Access Management Sub-Committee Focus Group, Federal CIO Council, ICAM Lexicon, Version 0.5, March 2011.

[M-03-22] OMB Memorandum M-03-22, OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002, September 26, 2003, available at: https://georgewbush-whitehouse.archives.gov/omb/memoranda/m03-22.html.

[M-04-04] OMB Memorandum M-04-04, E-Authentication Guidance for Federal Agencies, December 16, 2003, available at: https://georgewbush-whitehouse.archives.gov/omb/memoranda/fy04/m04-04.pdf.

[Meters] de Carné de Carnavalet, Xavier and Mohammad Mannan.
“From Very Weak to Very Strong: Analyzing\r\nPassword-Strength Meters.” In Proceedings of the Network and Distributed System Security Symposium (NDSS),\r\n2014. Available at: http://www.internetsociety.org/sites/default/files/06_3_1.pdf\r\n[NISTIR8062] NIST Internal Report 8062, An Introduction to Privacy Engineering and Risk Management in\r\nFederal Systems, January 2017, available at: http://nvlpubs.nist.gov/nistpubs/ir/2017/NIST.IR.8062.pdf.\r\n[NIST Usability] National Institute of Standards and Technology, Usability \u0026 Biometrics, Ensuring Successful\r\nBiometric Systems, June 11, 2008, available at: http://www.nist.gov/customcf/get_pdf.cfm?pub_id=152184.\r\n[OWASP-session] Open Web Application Security Project, Session Management Cheat Sheet, available at:\r\nhttps://www.owasp.org/index.php/Session_Management_Cheat_Sheet.\r\n[OWASP-XSS-prevention] Open Web Application Security Project, XSS (Cross Site Scripting) Prevention Cheat\r\nSheet, available at: https://www.owasp.org/index.php/XSS_(Cross_Site_Scripting)_Prevention_Cheat_Sheet.\r\n[Persistence] Herley, Cormac, and Paul van Oorschot. “A Research Agenda Acknowledging the Persistence of\r\nPasswords,” IEEE Security \u0026 Privacy Magazine, 2012. Available at:\r\nhttp://research.microsoft.com/apps/pubs/default.aspx?id=154077.\r\n[Privacy Act] Privacy Act of 1974 (P.L. 93-579), December 1974, available at:\r\nhttps://www.justice.gov/opcl/privacy-act-1974.\r\n[Policies] Weir, Matt, Sudhir Aggarwal, Michael Collins, and Henry Stern. “Testing Metrics for Password\r\nCreation Policies by Attacking Large Sets of Revealed Passwords.” In Proceedings of the 17th ACM Conference\r\non Computer and Communications Security, 162–175. CCS ‘10. New York, NY, USA: ACM, 2010.\r\ndoi:10.1145/1866307.1866327.\r\n[Section 508] Section 508 Law and Related Laws and Policies (January 30, 2017), available at:\r\nhttps://www.section508.gov/content/learn/laws-and-policies.\r\n[Shannon] Shannon, Claude E. 
“A Mathematical Theory of Communication,” Bell System Technical Journal, v.\r\n27, pp. 379-423, 623-656, July, October, 1948.\r\n[Strength] Kelley, Patrick Gage, Saranga Komanduri, Michelle L Mazurek, Richard Shay, Timothy Vidas, Lujo\r\nBauer, Nicolas Christin, Lorrie Faith Cranor, and Julio Lopez. “Guess Again (and Again and Again): Measuring\r\nPassword Strength by Simulating Password-Cracking Algorithms.” In Security and Privacy (SP), 2012 IEEE\r\nSymposium On, 523–537. IEEE, 2012. Available at:\r\nhttp://ieeexplore.ieee.org/iel5/6233637/6234400/06234434.pdf.\r\n11.2 Standards\r\n[BCP 195] Sheffer, Y., Holz, R., and P. Saint-Andre, Recommendations for Secure Use of Transport Layer Security\r\n(TLS) and Datagram Transport Layer Security (DTLS), BCP 195, RFC 7525, DOI 10.17487/RFC7525, May 2015,\r\nhttps://doi.org/10.17487/RFC7525.\r\n[ISO 9241-11] International Standards Organization, ISO/IEC 9241-11 Ergonomic requirements for office work\r\nwith visual display terminals (VDTs) — Part 11: Guidance on usability, March 1998, available at:\r\nhttps://www.iso.org/standard/16883.html.\r\n[ISO/IEC 2382-37] International Standards Organization, Information technology — Vocabulary — Part 37:\r\nBiometrics, 2017, available at: http://standards.iso.org/ittf/PubliclyAvailableStandards/c066693_ISO_IEC_2382-\r\n37_2017.zip.\r\n[ISO/IEC 10646] International Standards Organization, Universal Coded Character Set, 2014, available at:\r\nhttp://standards.iso.org/ittf/PubliclyAvailableStandards/c063182_ISO_IEC_10646_2014.zip.\r\n[ISO/IEC 24745] International Standards Organization, Information technology — Security techniques —\r\nBiometric information protection, 2011, available at:\r\nhttp://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=52946.\r\n[ISO/IEC 30107-1] International Standards Organization, Information technology — Biometric presentation attack\r\ndetection — Part 1: 
Framework, 2016, available at:\r\nhttp://standards.iso.org/ittf/PubliclyAvailableStandards/c053227_ISO_IEC_30107-1_2016.zip.\r\n[ISO/IEC 30107-3] International Standards Organization, Information technology — Biometric presentation attack\r\ndetection — Part 3: Testing and reporting, 2017.\r\n[RFC 20] Cerf, V., ASCII format for network interchange, STD 80, RFC 20, DOI 10.17487/RFC0020, October\r\n1969, https://doi.org/10.17487/RFC0020.\r\n[RFC 5246] IETF, The Transport Layer Security (TLS) Protocol Version 1.2, RFC 5246, DOI 10.17487/RFC5246,\r\nAugust 2008, https://doi.org/10.17487/RFC5246.\r\n[RFC 5280] IETF, Internet X.509 Public Key Infrastructure Certificate and CRL Profile, RFC 5280, DOI\r\n10.17487/RFC5280, May 2008, https://doi.org/10.17487/RFC5280.\r\n[RFC 6238] IETF, TOTP: Time-Based One-Time Password Algorithm, RFC 6238, DOI 10.17487/RFC6238,\r\nhttps://doi.org/10.17487/RFC6238.\r\n[RFC 6960] IETF, X.509 Internet Public Key Infrastructure Online Certificate Status Protocol - OCSP, RFC\r\n6960, DOI 10.17487/RFC6960, https://doi.org/10.17487/RFC6960.\r\n[UAX 15] Unicode Consortium, Unicode Normalization Forms, Unicode Standard Annex 15, Version 9.0.0,\r\nFebruary, 2016, available at: http://www.unicode.org/reports/tr15/.\r\n11.3 NIST Special Publications\r\nNIST 800 Series Special Publications are available at: http://csrc.nist.gov/publications/nistpubs/index.html. 
The\r\nfollowing publications may be of particular interest to those implementing systems or applications requiring\r\ndigital authentication.\r\n[SP 800-38B] NIST Special Publication 800-38B, Recommendation for Block Cipher Modes of Operation: the\r\nCMAC Mode for Authentication, October, 2016, http://dx.doi.org/10.6028/NIST.SP.800-38B.\r\n[SP 800-52] NIST Special Publication 800-52 Revision 1, Guidelines for the Selection, Configuration, and Use of\r\nTransport Layer Security (TLS) Implementations, April, 2014, http://dx.doi.org/10.6028/NIST.SP.800-52r1.\r\n[SP 800-53] NIST Special Publication 800-53 Revision 4, Recommended Security and Privacy Controls for\r\nFederal Information Systems and Organizations, April 2013 (updated January 22, 2015),\r\nhttp://dx.doi.org/10.6028/NIST.SP.800-53r4.\r\n[SP 800-57 Part 1] NIST Special Publication 800-57 Part 1, Revision 4, Recommendation for Key Management,\r\nPart 1: General, January 2016, http://dx.doi.org/10.6028/NIST.SP.800-57pt1r4.\r\n[SP 800-63-3] NIST Special Publication 800-63-3, Digital Identity Guidelines, June 2017,\r\nhttps://doi.org/10.6028/NIST.SP.800-63-3.\r\n[SP 800-63A] NIST Special Publication 800-63A, Digital Identity Guidelines: Enrollment and Identity Proofing\r\nRequirements, June 2017, https://doi.org/10.6028/NIST.SP.800-63a.\r\n[SP 800-63C] NIST Special Publication 800-63C, Digital Identity Guidelines: Federation and Assertions, June\r\n2017, https://doi.org/10.6028/NIST.SP.800-63c.\r\n[SP 800-90Ar1] NIST Special Publication 800-90A Revision 1, Recommendation for Random Number Generation\r\nUsing Deterministic Random Bit Generators, June 2015, http://dx.doi.org/10.6028/NIST.SP.800-90Ar1.\r\n[SP 800-107] NIST Special Publication 800-107 Revision 1, Recommendation for Applications Using Approved\r\nHash Algorithms, August 2012, http://dx.doi.org/10.6028/NIST.SP.800-107r1.\r\n[SP 800-131A] NIST Special Publication 800-131A Revision 1, Transitions: Recommendation for Transitioning\r\nthe Use of 
Cryptographic Algorithms and Key Lengths, November 2015, http://dx.doi.org/10.6028/NIST.SP.800-\r\n131Ar1.\r\n[SP 800-132] NIST Special Publication 800-132, Recommendation for Password-Based Key Derivation,\r\nDecember 2010, http://dx.doi.org/10.6028/NIST.SP.800-132.\r\n[SP 800-185] NIST Special Publication 800-185, SHA-3 Derived Functions: cSHAKE, KMAC, TupleHash, and\r\nParallelHash, December, 2016, https://doi.org/10.6028/NIST.SP.800-185.\r\n11.4 Federal Information Processing Standards\r\n[FIPS 140-2] Federal Information Processing Standard Publication 140-2, Security Requirements for\r\nCryptographic Modules, May 25, 2001 (with Change Notices through December 3, 2002),\r\nhttps://doi.org/10.6028/NIST.FIPS.140-2.\r\n[FIPS 198-1] Federal Information Processing Standard Publication 198-1, The Keyed-Hash Message\r\nAuthentication Code (HMAC), July 2008, https://doi.org/10.6028/NIST.FIPS.198-1.\r\n[FIPS 201] Federal Information Processing Standard Publication 201-2, Personal Identity Verification (PIV) of\r\nFederal Employees and Contractors, August 2013, http://dx.doi.org/10.6028/NIST.FIPS.201-2.\r\n[FIPS 202] Federal Information Processing Standard Publication 202, SHA-3 Standard: Permutation-Based Hash\r\nand Extendable-Output Functions, August 2015, http://dx.doi.org/10.6028/NIST.FIPS.202.\r\nAppendix A—Strength of Memorized Secrets\r\nThis appendix is informative.\r\nThroughout this appendix, the word “password” is used for ease of discussion. Where used, it should be\r\ninterpreted to include passphrases and PINs as well as passwords.\r\nA.1 Introduction\r\nDespite widespread frustration with the use of passwords from both a usability and security standpoint, they\r\nremain a very widely used form of authentication [Persistence]. Humans, however, have only a limited ability to\r\nmemorize complex, arbitrary secrets, so they often choose passwords that can be easily guessed. 
To address the\r\nresultant security concerns, online services have introduced rules in an effort to increase the complexity of these\r\nmemorized secrets. The most notable form of these is composition rules, which require the user to choose\r\npasswords constructed using a mix of character types, such as at least one digit, uppercase letter, and symbol.\r\nHowever, analyses of breached password databases reveal that the benefit of such rules is not nearly as significant\r\nas initially thought [Policies], although the impact on usability and memorability is severe.\r\nComplexity of user-chosen passwords has often been characterized using the information theory concept of\r\nentropy [Shannon]. While entropy can be readily calculated for data having deterministic distribution functions,\r\nestimating the entropy for user-chosen passwords is difficult, and past efforts to do so have not been particularly\r\naccurate. For this reason, a different and somewhat simpler approach, based primarily on password length, is\r\npresented herein.\r\nMany attacks associated with the use of passwords are not affected by password complexity and length. Keystroke\r\nlogging, phishing, and social engineering attacks are as effective against lengthy, complex passwords as against\r\nsimple ones. These attacks are outside the scope of this Appendix.\r\nA.2 Length\r\nPassword length has been found to be a primary factor in characterizing password strength [Strength]\r\n[Composition]. Passwords that are too short yield to brute force attacks as well as to dictionary attacks using\r\nwords and commonly chosen passwords.\r\nThe minimum password length that should be required depends to a large extent on the threat model being\r\naddressed. Online attacks where the attacker attempts to log in by guessing the password can be mitigated by\r\nlimiting the rate of login attempts permitted. 
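The effect of such rate limiting can be quantified when the secret is chosen uniformly at random. The following Python sketch is illustrative only and is not part of the guideline text; it computes the entropy of a uniformly random secret and an attacker's chance of guessing it within a fixed attempt budget:

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    # A secret chosen uniformly at random carries log2(alphabet_size)
    # bits of entropy per character.
    return length * math.log2(alphabet_size)

def online_guess_odds(length: int, alphabet_size: int, attempt_limit: int) -> float:
    # With login attempts rate-limited to attempt_limit, an attacker's
    # chance of hitting a uniformly random secret is the number of
    # attempts divided by the size of the search space.
    return attempt_limit / (alphabet_size ** length)

# A randomly generated 6-digit PIN carries about 19.9 bits of entropy;
# with 100 permitted attempts, an online guesser succeeds with
# probability 1 in 10,000.
print(entropy_bits(6, 10))            # ~19.93
print(online_guess_odds(6, 10, 100))  # 0.0001
```

Note that this arithmetic applies only to randomly chosen, uniformly distributed secrets; as discussed above, the entropy of user-chosen passwords cannot be estimated this way.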
In order to prevent an attacker (or a persistent claimant with poor\r\ntyping skills) from easily inflicting a denial-of-service attack on the subscriber by making many incorrect guesses,\r\npasswords need to be complex enough that rate limiting does not occur after a modest number of erroneous\r\nattempts, but does occur before there is a significant chance of a successful guess.\r\nOffline attacks are sometimes possible when one or more hashed passwords are obtained by the attacker through a\r\ndatabase breach. The ability of the attacker to determine one or more users’ passwords depends on the way in\r\nwhich the password is stored. Commonly, passwords are salted with a random value and hashed, preferably using\r\na computationally expensive algorithm. Even with such measures, the current ability of attackers to compute many\r\nbillions of hashes per second with no rate limiting requires passwords intended to resist such attacks to be orders\r\nof magnitude more complex than those that are expected to resist only online attacks.\r\nUsers should be encouraged to make their passwords as lengthy as they want, within reason. Since the size of a\r\nhashed password is independent of its length, there is no reason not to permit the use of lengthy passwords (or\r\npassphrases) if the user wishes. Extremely long passwords (perhaps megabytes in length) could conceivably\r\nrequire excessive processing time to hash, so it is reasonable to have some limit.\r\nA.3 Complexity\r\nAs noted above, composition rules are commonly used in an attempt to increase the difficulty of guessing user-chosen passwords. Research has shown, however, that users respond in very predictable ways to the requirements\r\nimposed by composition rules [Policies]. 
For example, a user who might have chosen “password” as their\r\npassword would be relatively likely to choose “Password1” if required to include an uppercase letter and a\r\nnumber, or “Password1!” if a symbol is also required.\r\nUsers also express frustration when attempts to create complex passwords are rejected by online services. Many\r\nservices reject passwords with spaces and various special characters. In some cases, rejecting certain special\r\ncharacters might be an effort to avoid attacks like SQL injection that depend on those characters. But a properly\r\nhashed password would not be sent intact to a database in any case, so such precautions are unnecessary. Users\r\nshould also be able to include space characters to allow the use of phrases. Spaces themselves, however, add little\r\nto the complexity of passwords and may introduce usability issues (e.g., the undetected use of two spaces rather\r\nthan one), so it may be beneficial to remove repeated spaces in typed passwords prior to verification.\r\nUsers’ password choices are very predictable, so attackers are likely to guess passwords that have been successful\r\nin the past. These include dictionary words and passwords from previous breaches, such as the “Password1!”\r\nexample above. For this reason, it is recommended that passwords chosen by users be compared against a “black\r\nlist” of unacceptable passwords. This list should include passwords from previous breach corpuses, dictionary\r\nwords, and specific words (such as the name of the service itself) that users are likely to choose. 
Since user choice\r\nof passwords will also be governed by a minimum length requirement, this dictionary need only include entries\r\nmeeting that requirement.\r\nHighly complex memorized secrets introduce a new potential vulnerability: they are less likely to be memorable,\r\nand it is more likely that they will be written down or stored electronically in an unsafe manner. While these\r\npractices are not necessarily vulnerable, statistically some methods of recording such secrets will be. This is an\r\nadditional motivation not to require excessively long or complex memorized secrets.\r\nA.4 Randomly-Chosen Secrets\r\nAnother factor that determines the strength of memorized secrets is the process by which they are generated.\r\nSecrets that are randomly chosen (in most cases by the verifier or CSP) and are uniformly distributed will be more\r\ndifficult to guess or attack by brute force than user-chosen secrets meeting the same length and complexity\r\nrequirements. Accordingly, at LOA2, SP 800-63-2 permitted the use of randomly generated PINs with 6 or more\r\ndigits while requiring user-chosen memorized secrets to be a minimum of 8 characters long.\r\nAs discussed above, the threat model being addressed with memorized secret length requirements includes rate-limited online attacks, but not offline attacks. With this limitation, 6-digit randomly generated PINs are still\r\nconsidered adequate for memorized secrets.\r\nA.5 Summary\r\nLength and complexity requirements beyond those recommended here significantly increase the difficulty of\r\nmemorized secrets and increase user frustration. As a result, users often work around these restrictions in a way\r\nthat is counterproductive. Furthermore, other mitigations such as blacklists, secure hashed storage, and rate\r\nlimiting are more effective at preventing modern brute-force attacks. 
Therefore, no additional complexity\r\nrequirements are imposed.\r\nSource: https://pages.nist.gov/800-63-3/sp800-63b.html",
	"extraction_quality": 1,
	"language": "EN",
	"sources": [
		"MITRE"
	],
	"references": [
		"https://pages.nist.gov/800-63-3/sp800-63b.html"
	],
	"report_names": [
		"sp800-63b.html"
	],
	"threat_actors": [],
	"ts_created_at": 1775434575,
	"ts_updated_at": 1775791320,
	"ts_creation_date": 0,
	"ts_modification_date": 0,
	"files": {
		"pdf": "https://archive.orkl.eu/03e1475766008579e7995816adfb791e65186a34.pdf",
		"text": "https://archive.orkl.eu/03e1475766008579e7995816adfb791e65186a34.txt",
		"img": "https://archive.orkl.eu/03e1475766008579e7995816adfb791e65186a34.jpg"
	}
}