
How Will the New Cyber Incident Notification Rule Affect Your FI? 4 Steps to Update Your Institution’s Incident Response Plan

3 min read
Jan 5, 2022

Federal regulators have been encouraging financial institutions to share information about known cyber incidents for years. Now banks and their third-party service providers have until May 1, 2022, to comply with the new rule requiring prompt regulator notification in the event of a cyber incident.


Bank cyber incident notification rule  

Issued by the FDIC, Federal Reserve, and OCC, the new rule requires a banking organization to notify its primary federal regulator of any significant computer-security incident as soon as possible and no later than 36 hours after the banking organization determines that a cyber incident has occurred.   

This includes incidents that materially affect—or are reasonably likely to materially affect—the viability of: 

    • a banking organization’s operations, 
    • its ability to deliver banking products and services, or 
    • the stability of the financial sector. 

Third-party vendor cyber incident notification rule 
The federal regulators’ rule also requires bank service providers (a narrower category of third-party providers) to notify a primary contact at each affected financial institution when the provider experiences a computer-security incident that materially affects, or is reasonably likely to materially affect, banking organization customers for four or more hours. 

In the past, third-party incident notification has been defined and enforced with well-drafted vendor agreements and attentive due diligence. It’s been up to financial institutions to draft contracts that adequately address incident reporting—and many have struggled to do this effectively.  

Why did the rule change? 
Research by the FDIC’s Office of the Inspector General (OIG) has found that while most third-party vendor contracts require the vendor to inform the financial institution of a breach, contracts tend to be vague and hard to enforce. For example, they don’t include details of how a vendor assesses or responds to potential incidents or whether it reports them to authorities. 

Defining “timely notification”  
About 20 percent of banks in the OIG study included the term “timely notification of financial institution” in contracts but didn’t define a specific time period, leaving it up to interpretation. Other terms that are equally undefined often include “unauthorized access,” “security incident,” and “substantial harm or inconvenience.”  

The new rule should make it easier for financial institutions to know when a third-party vendor is experiencing a cyber event. Good contracts require vendors to comply with all federal, state, and local laws and regulations, which means many bank-vendor agreements will now effectively include a specific definition of a computer-security incident. Financial institutions will still need to ensure that contracts with third parties not subject to the notification rule continue to have strong language and defined expectations. 

What if sensitive customer data is exposed? 

This isn’t the first rule to require notification of a breach. Numerous state and federal laws (including the Gramm-Leach-Bliley Act) already cover privacy breaches. Whether the breach occurs at a vendor or at the financial institution itself, if a cybersecurity incident results in the exposure of non-public personal information (NPPI), financial institutions still need to follow those existing state and federal laws when responding.  

Updating your institution’s incident response plan 
This new rule will bring heightened attention to incident response programs, an essential component of any financial institution’s information security program.   

Now is the time to review your institution’s incident response plan to: 

Update timelines. Make sure your policies and procedures require federal regulator notification of any significant computer-security incident within 36 hours. If current policies and procedures allow a longer window or don’t define one at all, update them to meet the new requirement. 

Review vendors’ incident response plans. While it’s always been important to be aware of vendors’ incident response plans, examiners may now review this area more proactively. Make sure your third-party vendors are updating their incident response plans so that they inform you of any computer-security incident that will materially affect customers for four or more hours.  

Define the vendor contact. Make sure your vendor agreements specify the contact at the institution (by role, not name) and contact information for any incident notifications. You don’t want the vendor to send the message to the wrong person or someone who is unlikely to understand the implications. 

Test your incident response plan. The sooner a breach is discovered and addressed, the less it ends up costing a company, research has found. According to IBM and the Ponemon Institute, breaches remediated in less than 200 days from occurrence cost 37 percent less than those that take longer. 

The cost of an outdated, untested incident response plan 
Companies that use tabletop exercises or other simulations to test their incident response plans often catch breaches sooner. The study found that breaches cost $1.23M more when a company lacks an incident response team or tested plan.  


Compliance with the rule is required by May 1, 2022, but don’t wait until the last minute to address necessary changes to your institution’s incident response plan. 

