Financial crime is a threat to your business. The FBI’s Internet Crime Complaint Center (IC3) reported that internet-enabled fraud led to $2.7 billion in losses in 2018, roughly double the losses reported in 2017.
From sophisticated cyber criminals and would-be money launderers, to organised crime groups and fraud-minded customers, there is no shortage of threats to which a company may be subjected. And, as different as these situations may seem, they share a common thread.
Detection and prevention of any financial crime hinges on the timely and effective analysis of data. A Chief Financial Officer (CFO) relies on data to be sure the company is not wiring money to a supplier that is on a sanctions list, just as the accounts payable department relies on data to be sure it is not initiating a fraudulent payment or facilitating money laundering.
When it comes to data, accuracy, integrity and lineage are paramount. Inaccurate data leads to inaccurate conclusions. Data that lacks integrity is unreliable as a source of information. And without being able to trace lineage, there is no way to confirm that the data has not been undermined, potentially deliberately by a bad actor or malicious software, at some point in its life cycle.
These three non-negotiable data requirements can be supported in five steps:
- Planning
- Data profiling and security
- Data cleansing
- Process documentation
- Testing, activation and maintenance
Ensuring data accuracy, integrity and lineage requires a collaboration between multiple parties within a business, including the CFO and Chief Risk Officer (CRO), technology teams and compliance teams, all of whom will need to be included at various points in this process.
Planning
Planning is critical to determine what data is needed to manage and assess the risk of financial crime, how you will obtain that data, and any limitations that may exist within it.
Begin by documenting objectives and defining the risks you need to assess and why. This will allow you to home in on the data that is relevant to your task. Then, choose the right technique to obtain that data.
Will you need information in real time, or will batch updates suffice? Will you need to collect new data, or access information from third parties?
Identifying data sources will also help you determine the limitations of your data, such as how cycle times may affect data flow, and can improve confidence in the data itself. According to IBM, one in three business leaders do not trust the information they use to make decisions.
Once these evaluations are complete, document your data plan to allow for contingencies in case products and services or attack vectors change over time. If you already have a plan in place, consider this step an opportunity to validate and make needed updates.
Data profiling and security
Once you have determined what data is needed to drive your risk assessment, how it will be obtained, and evaluated existing limitations, the next step is to examine the data already available, a process known as data profiling.
In many cases, existing data may need to be enhanced. For example, the owner of a bank account may need to be validated against the owner of the phone number used to initiate a mobile transaction, confirming both are the same person and adding an extra layer of security. Third-party data, such as ultimate beneficial ownership records, can also be used to enrich a business’s own data.
This is particularly important because with today’s global supply chains, initial suppliers can be obscured, and no business wants to find that their money has been flowing back to a sanctioned company.
Dedicated “landing zones” for incoming data, end-to-end encryption and verifying the integrity of data via reconciliation can help maintain the integrity of data and ensure there is no deliberate or incidental manipulation of information.
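One common way to support the reconciliation described above is to compare checksums computed at the source system with checksums recomputed after the data lands. A minimal sketch, assuming tuple-shaped transaction records (the record contents and IDs here are purely illustrative):

```python
import hashlib

def checksum(records):
    """Compute a SHA-256 digest over a canonical (sorted) rendering of records,
    so that record order does not affect the result."""
    digest = hashlib.sha256()
    for record in sorted(records):
        digest.update(repr(record).encode("utf-8"))
    return digest.hexdigest()

# Reconciliation: the digest computed at the source must match the digest
# recomputed once the data reaches the landing zone.
source_records = [("TXN-1001", 250.00), ("TXN-1002", 75.50)]
landed_records = [("TXN-1002", 75.50), ("TXN-1001", 250.00)]  # order may differ

assert checksum(source_records) == checksum(landed_records)
```

Any deliberate or incidental change to a record, however small, produces a different digest, which flags the batch for investigation.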
Data cleansing
The ultimate goal of data cleansing is to make sure data is accurate and in a form that can be readily analysed. Inaccurate data, even if it is innocuous, can lead to bad decisions and inefficiencies that compound over time.
When assessing risk, you can identify and exclude transactions such as bank fees and institution-initiated transactions like interest payments. Then comes the often tedious but necessary process of de-duping data. If the same data is coming from multiple sources, or if data overlaps, it will need to be de-duped, so it does not skew results.
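De-duplication across overlapping feeds can be as simple as keeping the first record seen for each unique key. A minimal sketch, where the field names and feed contents are illustrative assumptions:

```python
def dedupe_transactions(transactions, key_fields=("txn_id",)):
    """Drop records whose key fields have already been seen, keeping the first."""
    seen = set()
    unique = []
    for txn in transactions:
        key = tuple(txn[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(txn)
    return unique

# Two feeds that overlap on transaction T2:
feed_a = [{"txn_id": "T1", "amount": 100.0}, {"txn_id": "T2", "amount": 50.0}]
feed_b = [{"txn_id": "T2", "amount": 50.0}, {"txn_id": "T3", "amount": 20.0}]

merged = dedupe_transactions(feed_a + feed_b)
# merged holds T1, T2 and T3 exactly once each
```

In practice the key would usually combine several fields (date, amount, counterparty), since identifiers are not always shared across source systems.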
Data controls should be put in place to make sure all needed data is included, all exclusions are documented, and changes can be handled or flagged for remediation.
Defining rules for data entry, such as setting default values for missing data and hierarchies of inputs, can promote the creation and maintenance of high-quality data. For example, if a supplier’s or customer’s address is stored in a single field rather than in separate fields such as City and Country, the data is not in an appropriate format, and it becomes much harder to determine whether different addresses appear on accounts for the same supplier or customer: a clear flag for fraud.
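Even when address data cannot be fully restructured, a rough normalisation step makes comparisons across accounts feasible. A minimal sketch for comparison purposes only; the sample addresses are invented, and a real system would use a dedicated address-parsing service:

```python
def normalise_address(raw):
    """Very rough normalisation for comparison only: lower-case, replace
    punctuation with spaces, and collapse repeated whitespace."""
    cleaned = "".join(ch.lower() if ch.isalnum() else " " for ch in raw)
    return " ".join(cleaned.split())

# Two renderings of the same (hypothetical) address compare equal once normalised:
a = "12 High St., London, UK"
b = "12 high st  london uk"
assert normalise_address(a) == normalise_address(b)
```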
Process documentation
Once processes are established, document them so they are well understood and checks and balances can be implemented as needed. This includes establishing default values for missing data and determining when escalations will be triggered.
Most risk assessment and management processes involve an application of technology, which should be part of documenting a risk-based approach to data. This means that not only should you document the data that is relevant to helping detect financial crime, you should also make sure it is being delivered in a format that can be used by your financial crime detection and prevention solutions.
Documenting processes may lead to the identification of changes that need to be made. If you need to know how much of a mixed deposit is cheques, cash and money orders, but your current system records only the total deposit amount, that may need to change.
Testing, activation and maintenance
In many ways, testing is a reality check, ensuring that the data you expect matches what is actually observed. Creating test scripts will allow for trial runs, and results can be reviewed for accuracy and precision.
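A test script in this context can be as simple as a set of named validation checks run over a sample of records, with failures collected for review. A minimal sketch; the check names, field names and sample records are illustrative assumptions:

```python
def run_data_checks(records, checks):
    """Apply each named check to each record; return (record id, check) failures."""
    failures = []
    for record in records:
        for name, check in checks.items():
            if not check(record):
                failures.append((record["txn_id"], name))
    return failures

checks = {
    "amount_positive": lambda r: r["amount"] > 0,
    "currency_present": lambda r: bool(r.get("currency")),
}

sample = [
    {"txn_id": "T1", "amount": 100.0, "currency": "GBP"},
    {"txn_id": "T2", "amount": -5.0, "currency": ""},
]
failures = run_data_checks(sample, checks)
# T1 passes both checks; T2 fails both
```

Reviewing the failure list against expectations is the "reality check": a clean run on known-good data and predictable failures on known-bad data build confidence before activation.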
Activation includes the implementation of a change control process that allows for new sources or new data to be incorporated, and that can accommodate new or updated regulatory requirements.
Maintenance includes ongoing fine tuning. If you are getting too many alerts because of bad data, you can identify and address that at the source. There is also an opportunity to continue enhancing your program through ongoing reviews and data model validation.
Data is the foundation for the successful management of financial risk, compliance, operational efficiency and customer experience. The better your data, the better prepared you will be to see what is happening, make appropriate inferences and manage risk.
High quality, well-managed data enables businesses to better detect and prevent financial crime and establish more trust and transparency with all stakeholders, including regulators, employees and customers.