Gold-copy data & AI in the trade lifecycle process | By The Digital Insider

The current end-to-end trade lifecycle is highly dependent on having accurate data at each stage. The goal of the investment book of records (IBOR) system is to ensure that trade, position, and cash data match the custodian; the goal of the accounting book of records (ABOR) system is for that same data set to match the fund accountant.

There are other stakeholders in the process, including broker systems, transfer agents, central clearing parties, etc., depending on the type and location of execution. A position that is reflected identically across all systems is said to have been “straight-through processed” (STP); in other words, all systems have recognized the trade, and the datasets are in line, or at least within tolerance. 

While STP itself is efficient, identifying and eventually resolving non-STP executions remains highly manual. Stakeholders typically compare data points across multiple systems, beginning as far upstream as possible, and gradually move down the lifecycle to the root cause of the break. This investigation takes time, creates noise across the value chain, and, most importantly, creates uncertainty for the front office when making new decisions. 

The proposal is to leverage AI to continually create and refine gold-copy data at each stage of the lifecycle through comparison with sources, and to link downstream processes so they automatically update in real time with the accurate datasets. Guardrails should also be implemented in case of material differences. 

Introduction

Let’s analyze the current process with an example: a vanilla bond is about to undergo a payment-in-kind (PIK) corporate action (a PIK occurs when an issuer capitalizes interest it would have paid in cash as additional securities). Assume that the vendor an IBOR system uses applies a different day count for calculating accrual (ACT/360) than the custodian does (ACT/365):

  • On ex-date, the PIK will process with a higher capitalization than the custodian’s, and a mismatch will form between the IBOR system and the bank. 
  • This mismatch will first be uncovered on ex-date, assuming the bank sends an MT567 (corporate action status) that flags the positional difference between the two systems. 
  • Next, on SD+1, it will be flagged again when the bank sends an MT535 (statement of holdings), showing the mismatch during position reconciliation. 
  • Finally, if investment accounting is run on ex-date or on SD+1, there will be a mismatch between the IBOR system and the fund accountant, where the balance sheet and statement of changes in net assets reports will again show an exception for the security. 
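To make the size of the break concrete, here is a minimal sketch of the competing accrual calculations. The principal, coupon, and accrual period are made-up figures for illustration; only the two day-count conventions come from the example:

```python
# Hypothetical PIK bond figures (assumed for illustration).
principal = 1_000_000.00   # par amount being capitalized against
annual_rate = 0.08         # 8% PIK coupon
accrual_days = 182         # actual days in the accrual period

# IBOR's vendor accrues on ACT/360; the custodian accrues on ACT/365.
ibor_accrual = principal * annual_rate * accrual_days / 360
custodian_accrual = principal * annual_rate * accrual_days / 365

print(f"IBOR capitalizes:      {ibor_accrual:,.2f}")       # 40,444.44
print(f"Custodian capitalizes: {custodian_accrual:,.2f}")  # 39,890.41
print(f"Position break:        {ibor_accrual - custodian_accrual:,.2f}")  # 554.03
```

The same trade data thus produces two different capitalized positions, and that single difference is what ripples into the three downstream breaks above.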

This simple example illustrates how one mismatch well upstream in the lifecycle causes three separate breaks in the downstream chain; in other words, three different segments of users (a corporate-action user, a reconciliation user, and an accounting user) are all investigating the same root cause.

Once the IBOR system’s data is corrected, each of these user segments needs to coordinate the waterfall logic to have every downstream system/process updated. 

The problem

Unfortunately, such occurrences are common. As front-to-middle-to-back investment systems become more integrated, inaccurate data at any point in the process chain creates inefficiencies across a number of user segments and forces multiple users to analyze the same exception (or the effect of that exception) on their respective tools.

Downstream users reconciling to the bank or the fund accountant will notice the security mismatch but will not immediately recognize the root cause: the day-count difference. These users would typically undertake the following tasks to investigate:

  • Raise an inquiry against the bank’s MT535 statement to explain the position difference
  • Raise an inquiry against the fund accountant’s statement to explain the position difference
  • Raise an inquiry with the internal data team to explain the IBOR system’s position calculations
  • Once aware of a recent corporate action, raise an inquiry with the internal corporate-actions (COAC) team to investigate the processing of the PIK

As seen, multiple teams’ energy and capacity are expended investigating the same root cause, and all of it is undertaken manually. 

By contrast, an AI process that continually queries multi-source datasets could have proactively flagged the day-count discrepancy before the corporate action processed, and automatically informed downstream teams of the potential inaccuracy in the position of the PIK security.
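A minimal sketch of such a cross-source check, under the assumption that reference data from two sources can be normalized into field dictionaries (the field names and sample values are hypothetical):

```python
def flag_discrepancies(source_a: dict, source_b: dict, fields: list) -> list:
    """Return the reference-data fields on which the two sources disagree."""
    return [f for f in fields if source_a.get(f) != source_b.get(f)]

# Hypothetical reference data for the PIK bond in the example.
ibor_ref = {"day_count": "ACT/360", "coupon_rate": 0.08}
custodian_ref = {"day_count": "ACT/365", "coupon_rate": 0.08}

issues = flag_discrepancies(ibor_ref, custodian_ref, ["day_count", "coupon_rate"])
print(issues)  # ['day_count'] — surfaced before the corporate action processes
```

Run continually ahead of scheduled events, a check like this turns the post-event three-way break hunt into a single pre-event alert.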

While any AI-driven changes to user data should still undergo a reviewer check, such proactive detection and communication drastically shortens resolution times and should reduce user frustration. 

The proposal

Let's look at the corporate action workflow in detail. Users typically create a “gold-copy” event once they’ve “scrubbed” data from multiple sources, producing an accurate, up-to-date copy of the event that will occur. This is ideal in many ways: scrubbing multiple sources reduces the chance that an incorrect feed from a single vendor creates process gaps. 

We need AI to undertake this process continuously. IBOR systems should, at minimum, be subscribed to two or more vendors from whom data should be retrieved. Any change to the dataset should be continually updated (either through a push or pull API mechanism). This would work as follows: 

  • A new public security is set up in the marketplace with public identifiers including CUSIP, ISIN, SEDOL etc. 
  • The data vendors supplying the feed to IBOR systems should feed this through automatically, once the required minimum data point details are populated. 
    • IBOR systems, at this point, would create this security within their data systems
    • Any mismatches across vendors should be reviewed by a user, and appropriate values chosen (if deemed necessary)
  • Any updates the securities undergo from that point in the market should be automatically captured and security updated in the IBOR system
    • At this point, downstream applications that consume the security data should automatically flag a security market update and the impending event-driven update
      • This informs users that the dataset they’re seeing may be stale vs. external processes that may be receiving up-to-date data
    • To protect against the risk of inaccurate data from a single vendor, only a dataset that is consistent across all vendors should be automatically updated
    • Data updates from a single vendor only should be prompted to a user to review and approve
  • Once underlying securities are updated, this would be considered an ‘event’, which should drive updates to all downstream applications that rely on the security update (called event-driven updates)
    • Event-driven updates greatly reduce the number of manual touches downstream users need to make for inaccuracies that have been identified upstream
    • Once all applications are in line with the updated data sets, the security market update flag should be removed automatically. 
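The vendor-consensus rule in the steps above can be sketched as follows. The vendor names are hypothetical, and a production system would carry richer metadata, but the decision logic is the one described: apply automatically only when all vendors agree, otherwise route to a reviewer:

```python
def classify_update(vendor_values: dict):
    """Decide how a vendor-fed field update should be handled.

    Returns ("auto", value) when every vendor reports the same value,
    otherwise ("review", vendor_values) so a user can choose.
    """
    distinct = set(vendor_values.values())
    if len(distinct) == 1:
        return ("auto", distinct.pop())   # consistent across vendors: apply automatically
    return ("review", vendor_values)      # conflicting or single-source update: reviewer decides

print(classify_update({"vendorA": 0.08, "vendorB": 0.08}))
# ('auto', 0.08)
print(classify_update({"vendorA": "ACT/360", "vendorB": "ACT/365"}))
# ('review', {'vendorA': 'ACT/360', 'vendorB': 'ACT/365'})
```

Each "auto" result would then fire the event-driven updates to downstream applications, while "review" results queue for a user.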

Potential concerns

While exciting, the use of AI and event-driven updates raises a few concerns worth discussing: data capacity/storage, potential timing differences with external participants, and materiality/tolerance. 

Let’s address the last of these first: materiality/tolerance. Securities can undergo immaterial changes from time to time that have little to no impact on the upstream and downstream processes in the trade lifecycle.

As a result, a set of fields and tolerances should be identified to be flagged in case of market updates (the core dataset). If updates occur on these specific fields and fall outside the existing tolerance, IBOR systems should consume the updates provided by vendors.

If updates occur on any other fields (or are within tolerance), the updates should be rejected. This would ensure the system leverages the efficiency of AI without the inefficiency of noise. 
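A minimal sketch of that filter, assuming core fields and their tolerances are configured per field (the field names and tolerance values here are made up for illustration):

```python
# Assumed core-field tolerances; real values would be set by the business.
CORE_TOLERANCES = {"coupon_rate": 0.0001, "pool_factor": 0.000001}

def should_consume(field, current, proposed) -> bool:
    """Consume a vendor update only for core fields that move outside tolerance."""
    if field not in CORE_TOLERANCES:
        return False                      # non-core field: reject to avoid noise
    if isinstance(current, (int, float)) and isinstance(proposed, (int, float)):
        return abs(proposed - current) > CORE_TOLERANCES[field]
    return current != proposed            # non-numeric core field: any change is material

print(should_consume("coupon_rate", 0.08, 0.085))    # True  — core, outside tolerance
print(should_consume("coupon_rate", 0.08, 0.08))     # False — within tolerance
print(should_consume("issuer_website", "a", "b"))    # False — non-core, rejected
```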

Secondly, there is potential for timing differences with external participants. While the IBOR system may have up-to-date data, external participants (e.g., banks or fund accounting systems) may continue to leverage stale or outdated datasets.

An audit history of the core dataset’s historical values should be available; in other words, if the bank/fund accounting system is referencing any of those superseded (audit) values, an automatic note should be sent to the external participant informing them that their data is stale and asking them to recheck against external market vendors. 
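One way this stale-data check could look, assuming the external participant's reported value can be compared against the current core value and its audit history (all sample values are hypothetical):

```python
def check_external_value(reported, current, audit_history: list) -> str:
    """Classify an external participant's reported value against the
    core dataset and its audit history."""
    if reported == current:
        return "in line"
    if reported in audit_history:
        return "stale"   # superseded value: auto-notify participant to recheck vendors
    return "break"       # genuine mismatch: investigate

# The IBOR coupon was corrected from 0.075 to 0.08; the bank still reports 0.075.
print(check_external_value(0.075, current=0.08, audit_history=[0.075]))  # stale
```

The "stale" branch is what lets the note to the participant be generated automatically rather than opening another manual inquiry.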

Finally, there is the concern about data capacity. There’s no doubt that continual querying, validation, and updates of core datasets by multiple vendors, along with maintaining audit data, will increase data consumption and storage costs.

A number of companies are required by law to keep an audit history of at least five years, and adding the above requirement would certainly expand capacity needs. Restricting security updates to the core datasets alone, and allowing tolerances, should help manage some of this required capacity. 

Future

Despite the concerns highlighted, AI remains valuable to design and implement across the trade lifecycle process, and its benefits would likely substantially outweigh the costs incurred. While most of the examples in this paper discussed public securities, the universe is substantially wider in private securities, where high-quality data is far scarcer. 

With the investing world transitioning to increased investments in private securities, leveraging AI will continue to pay dividends across both universes. 





Published on The Digital Insider at https://is.gd/w8fNp0.