Data is not difficult – Define it, Use it


One of the things I designed and built 20+ years ago was a middleware layer that took transactions, accounts and products (TAP) and distributed them on a broadcast basis: receivers subscribed to the data they required, and published any and all events they generated.

This middleware layer is still in place today in the bank.

Its use meant you could build a system with a subscriber/publisher pair. The subscriber would fulfil the system's data needs from a subscription, driven by a set of simple rules, and the new system would publish the events it created for other subscribers to use.
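As a purely illustrative sketch of that pattern (a minimal in-memory stand-in, not the original middleware; all names, topics and message shapes below are assumptions):

```python
# Minimal sketch of a subscriber/publisher pair over a TAP-style broadcast bus.
# The Bus class, topic names and message shapes are illustrative assumptions,
# not the original middleware's API.
from collections import defaultdict
from typing import Callable

class Bus:
    """In-memory stand-in for the broadcast middleware layer."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)

bus = Bus()

# Subscriber side: fulfil the data need from a subscription, with simple rules.
def on_transaction(event: dict) -> None:
    if event.get("product_type") == "BOND":          # a simple routing rule
        print(f"booking bond trade {event['id']}")

bus.subscribe("transactions", on_transaction)

# Publisher side: the new system publishes any events it creates for others to use.
bus.publish("transactions", {"id": "T1", "product_type": "BOND", "quantity": 100})
```

In the real layer the transport is a broadcast network rather than an in-process list of handlers, but the subscribe/publish contract is the same idea.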

Tackling the core data for an investment bank, and making it a service rather than a repeated chore, removes the risk of development, reduces system build overhead, removes reconciliation gaps and simplifies the entire technology management process.

Investment Banking is a simple business. Whether transacting simple cash products or complex structured products, all core transactions are made of three items: accounts, products and current value. Even the value is transient and needs to be derived from the product.

Positions are generated from the cumulative effect of transactions, and can have both securities and cash postings against them, which are just events.
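As a rough sketch of that idea (the keying and field names are assumptions), a position can be derived as the running net effect of transactions per account and product:

```python
# Sketch: a position as the cumulative effect of transactions.
# The (account, product) keying and field names are illustrative assumptions.
from collections import defaultdict

def build_positions(transactions):
    positions = defaultdict(float)              # (account, product) -> net quantity
    for t in transactions:
        positions[(t["account"], t["product"])] += t["quantity"]
    return dict(positions)

trades = [
    {"account": "BOOK-EQ1", "product": "VOD.L", "quantity": 1000},
    {"account": "BOOK-EQ1", "product": "VOD.L", "quantity": -400},
]
print(build_positions(trades))                  # {('BOOK-EQ1', 'VOD.L'): 600.0}
```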

Once you have defined the elements of each product at a base-model level, re-use that same product across all systems. Don't repeat the product store, or reproduce it elsewhere. Centralise it, manage it once, distribute it.

Ideally, for the market-based products (Equities, Bonds, Listed Derivatives), take a market feed. Don't set up and manage market products in-house. Take them from the market, centralise them, distribute internally.

Accounts, whether external clients or internal books, should also be centralised, managed once and distributed.

Under today's tougher regulatory controls, KYC is now critical. (It always was, but now you get fined if you don't deal with it!)

Spreading accounts over multiple systems, all manually input directly into each system, even partially, is just asking for trouble. Someone will make a mistake, and you will end up with clients on front-end systems you shouldn't be trading with, but you are. So fix the problem before you get fined and then have to fix the problem anyway.

Having established a single source of client data, with credit limits, KYC flags, tradable product mappings, market name maps, entity structures and so on, feed this to all your systems.
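As a rough illustration of what such a record might carry (all field names here are assumptions, not a prescribed schema):

```python
# Sketch of a centralised client master record; field names are illustrative
# assumptions. Downstream systems subscribe to this single source rather than
# keying clients locally.
from dataclasses import dataclass, field

@dataclass
class ClientAccount:
    client_id: str
    kyc_approved: bool
    credit_limit: float
    tradable_products: list = field(default_factory=list)   # tradable product mappings
    market_names: dict = field(default_factory=dict)         # market name maps
    entity_parent: str = ""                                   # entity structure link

acme = ClientAccount("C-001", kyc_approved=True, credit_limit=5_000_000,
                     tradable_products=["EQUITY", "BOND"],
                     market_names={"BLOOMBERG": "ACME LN"})
```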

Don't get confused – you don't need to sort out CRM in this process. The moment you cross that line, there is a whole different set of issues. But your core data should be the source of client data for your CRM; only clients fed from this source should be in the CRM and contacted on behalf of the bank.

The old adage of "prospects are not clients" no longer applies. Trying to sell something to someone makes them a client, even if they don't buy it. You are still having a market interaction, and your regulator will still want to know why you didn't know they were on a blacklist!

Internal books are just client accounts: internal client accounts. They should be marked as such, and treated as such. Generating profit by client can show you profit by internal client too, and this is useful information. Escalation for unmatched positions on clients applies just as much to internal books, and is equally critical information. So treat "books" as clients. Have one book structure across the whole firm, and centralise it, manage it once, distribute it.

Having sorted your core data, transactions are just a combination of two accounts, one or more products, a trade owner (often driven off the product), a sales owner (often driven off the client), and a current value.
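A minimal sketch of that shape, with assumed field names rather than a definitive model:

```python
# Sketch of the core transaction described above; field names are assumptions.
from dataclasses import dataclass

@dataclass
class Transaction:
    buyer_account: str      # client or internal book
    seller_account: str     # client or internal book
    products: list          # one or more product identifiers
    trade_owner: str        # often driven off the product
    sales_owner: str        # often driven off the client
    value: float            # value as "done"; open-position value moves later

tx = Transaction("CLIENT-123", "BOOK-EQ1", ["VOD.L"],
                 trade_owner="eq-cash-desk", sales_owner="sales-emea",
                 value=1_500_000.0)
```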

Note that the value of the transaction as "done" is fixed at the point it is done. But the open positions generated by one or more transactions will change in value, and risk will then change with that value.

For cash positions, such as equities or bonds, the value will change based on market value: mark-to-market. But open positions will also change in value due to market events, fed from your central source of market events and keyed off your product database. Dividends, coupons, margin calls and the like will all change the value of a position. (Note that a position is the result of a series of one or more transactions.)
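A hedged sketch of that revaluation, assuming simple shapes for the market events (none of these names come from a real feed):

```python
# Sketch: revaluing an open position from a market price plus market events
# (dividends, margin calls). Event shapes and field names are assumptions.
def revalue(position_qty, market_price, events):
    value = position_qty * market_price              # mark-to-market
    for ev in events:                                # centrally distributed events
        if ev["type"] == "DIVIDEND":
            value += position_qty * ev["per_share"]
        elif ev["type"] == "MARGIN_CALL":
            value -= ev["amount"]
    return value

print(revalue(600, 1.02, [{"type": "DIVIDEND", "per_share": 0.05}]))   # 642.0
```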

More complex products, which are made of a number of underlying products, have a value derived from the current value of those underlying products, and often a cash projection against those products. The same principles apply, though.

Where the process gets too bogged down is in trying to define the ultimate data model for this. Don’t.

Define what you know, and leave flexibility for what you don't. Spending months on every variation of every product, account, entity and so on just delays the process. Anyone who has worked in a bank for 10+ years should be able to define the main components of clients, products, transactions and positions. So use that experience, and take it as the starting data model.
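One simple way to leave that flexibility, sketched here with assumed names, is to fix the core fields you are sure of and keep an open attributes map for the rest:

```python
# Sketch: define what you know as fixed fields, leave an open map for what
# you don't. All names here are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Product:
    product_id: str
    product_type: str                                  # e.g. "EQUITY", "BOND", "SWAP"
    attributes: dict = field(default_factory=dict)     # flexibility for the unknown

swap = Product("IRS-001", "SWAP", attributes={"fixed_rate": 0.031, "tenor": "5Y"})
```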

There is no silver bullet, and all the tech buzzwords don’t solve the issue that there are transactions made of accounts (clients, books), and products (securities, structures) that create events (booking, settlement, margin call, payment….).

This is not rocket science. We are not saving lives or getting to the moon; we are moving dollars from A to B. So simplify the data, and clean up the reconciliations, reporting, P&L accuracy, risk values and so on.

Then go to the pub.


Legacy Tech is a threat to banks

If you are daydreaming about using a "real" machine whilst sat inside a bank, trying to get IE7 on Windows XP to actually browse anything younger than five years old, you are not alone.

When I started in Investment Banking IT, at Salomon Brothers, we were pioneers: often the first financial, and often commercial, users of many things. Sun workstations, Cisco routers, Mosaic browsers, all docs in HTML, Oak before it became Java, Perl and yes, even Windows 3.1…

Seems so long ago. But we invested in the latest and actively sought it out as a competitive advantage – and it worked well.

Now the banks are generally stuck in their legacy silos and heavily encumbered by their inability to change. Security, which is the "big thing" in many organisations, is laughably compromised by the sheer age of the tools they are using.

One bank I was working in a while back was so proud that it was not hit by Shellshock, and I was quite surprised, until I dug around and found that all their external-facing kit was so old it pre-dated the bug.

A lot of internal debate and energy in the banks remains around storage, server farms, licensing of products versus open source and so on, all with internal charges maintained at artificially high levels to encourage change. But the cost of the change outweighs the charges, and the required change is seemingly always pushed to the application owners, who have bigger fish to fry than getting off old stuff.

The use of Cloud for both server capacity and storage is being approved by regulators for non-core functions, but most banks are nowhere near ready.

In five to ten years' time it will become increasingly hard to actually buy kit to run in data centres, rather than simply purchasing Cloud infrastructure, and the banks will be on eBay buying boards to keep their kit running.

Half the core applications on people's desks are what are deemed End-User Applications, and many of these are critical to the running of the organisation. This delays the roll-out of desktop upgrades, as most are in Excel and are too difficult to migrate to later versions and macro formats.

Internal networks are 10-15 years old, and the kit facing external connections is still legacy, running against very slow links. Across the banks we have worked in over the last several years, the Internet connections are worse than most of us have in our living rooms. This prevents the effective use of Cloud, SaaS, virtual desktops or even dynamic VPN technology.

The industry needs to look at SaaS solutions for non-core functions sooner rather than later. Banks need to centrally invest in and manage programmes to upgrade infrastructure, and not hand them to Systems Integrators to avoid responsibility, but own them themselves.

It would be good to see the banks at the forefront: moving to Macs for their desktops, shifting EUC to compute-farm-based web services applications, moving to an integrated Cloud-based environment for areas allowed by the regulators, implementing low- or zero-footprint SaaS-style applications to replace legacy, and upgrading their networks to current kit for internal and external connections.

If this does not happen soon, the security vulnerability of many institutions will remain a real and growing concern, until it is too late and they are no longer a going concern.

 


FCA allows cloud for financial services

There has been much speculation on the use of Cloud computing for Financial Services organisations.

Concerns have been raised over security and data privacy by Compliance departments and various audit bodies.

The IT community have continued to stress that Financial Services, as one of the biggest global computer users, must use Cloud to be able to survive.

The situation is now clear. The FCA will support the use of Cloud services, in the same way that they support Outsourcing services. Under the final guidance issued on 7 July, FG 16/5: Guidance for firms outsourcing to the ‘cloud’ and other third party IT services, the FCA are effectively allowing the use of Cloud, with similar restrictions to any Outsourcing.

In the main, the use of Cloud will need to follow the same guidelines and rules as the use of Outsourcing, and be controlled under the FCA SYSC (Systems and Controls) guidelines 3.1, which cover the use of all IT systems within Financial Services and include Outsourcing details.

As with all IT, the FCA FG 16/5 guidelines are quite clear that moving compute capacity to the Cloud does not remove a firm's responsibility for that compute environment. As under the Outsourcing rules, company representatives still need to be responsible for the use and controls of a Cloud service.

Under the Outsourcing options already available from the FCA for non-EEA hosting and services, these facilities need specific checks and balances to be put in place. The Cloud regime allows for the same options, so non-EEA facilities can be utilised, with the right controls.

This is a brave and foresighted decision by the FCA, and one that is critical for the future of the industry.

Data centres run directly by organisations cannot scale quickly enough, and only the biggest firms can afford to own and run their own. Data centres are now facilities that are hired, and Cloud is the next logical step from this, without the concerns about physical access. As the Cloud continues to expand and the big IT services companies build acres of compute facilities, purchasing physical servers that fit individual financial services firms' standards is becoming increasingly difficult.

Fewer IT vendors are building proprietary servers, as Cloud virtualisation removes this specific requirement and profitability drops out of this sector of the industry.

The choice of physical servers is shrinking, while the choice of Cloud suppliers providing virtual servers on generic kit is increasing. Financial Services needs to embrace the Cloud, and the regulators allowing this critical step is a big leap forward.

Security, always a bugbear for the huge IT Security departments within Banks and Asset Management firms, becomes easier and more robust. The big Cloud providers (Amazon, IBM, Microsoft and the specialist niche firms) are security experts, employ huge teams of specialists, and make sure all their servers are patched. Most break-ins and virus attacks in financial firms happen simply because their internal teams are not this good.

If anything, moving to Cloud levels the playing field, reduces cost, improves security and allows for the massive expansion still being driven by financial services growth in volumes and regulatory calculation demand.

The sticking point will continue to be Data Privacy. Whilst the FCA have allowed Cloud as they allow Outsourcing, the use of Cloud and Outsourcing is still hampered by the exposure of client data to non-resident analysis: for example, the US looking at EU client data, or Swiss clients being exposed outside Switzerland.

The exception list is long, and often driven by a local requirement from a local regulator: protectionist maybe, but always argued as protecting clients in their jurisdiction.

The EU's declaration in October 2015 that the Safe Harbour process was not valid made this situation more difficult.

The rushed replacement by the Privacy Shield initiative between the EU and the US in February 2016 plugs the gap, at least in the short term.

To be clear, even with this Data Privacy issue hanging over the Cloud process, and indeed over Outsourced facilities outside a firm's core jurisdiction, there are many processes in Financial Services that contain no client data whatsoever.

These include overnight regulatory compute, pricing mechanisms and much of the Reg Reporting requirement; indeed, many of the desktops sat physically in numerous locations could all safely be moved to the Cloud.

There is an expectation that Privacy Shield and other initiatives will continue to close the gap in what can be put into the Cloud, and the Cloud providers themselves are working very actively to bring Financial Services on board by guaranteeing the location of the data used, and of specific data sets within that data.

All in all, the FCA FG 16/5 Cloud announcement is a huge step forward for the Financial Services industry, and entirely the right move by the FCA. Now let's move the over-complex and heavily exposed facilities within these Financial Services firms to a more flexible, secure, cost-effective Cloud environment, and focus on building a business, not an IT stack.

We have worked with Cloud services for several years, and manage many of our own services in both public and private cloud environments. If you want advice on the approach, please do contact us, informally or formally.

FCA Cloud Guidelines: https://www.fca.org.uk/news/fg16-5-guidance-for-firms-outsourcing-to-cloud-and-third-party-it-services

FCA SYSC Handbook: https://www.handbook.fca.org.uk/handbook/SYSC/


32T Philosophy

We asked our clients their main reasons for selecting 32T in a marketplace where so much choice exists.

Clients cited the following:

Common-sense approach

32T is a management and technology consultancy dedicated to the financial services industry, providing highly experienced practitioners who are focused on delivery and guided by a common-sense approach to delivering services to our clients.

Outcome based delivery model

Combining innovative thinking with detailed industry experience, we offer clients project delivery expertise, technology integration services and transformation delivery. We believe in an outcome-based delivery model that promotes positive change to drive organisations forward.

Partnership and transparency

Working in partnership with our clients, we focus on creating manageable, controlled and transparent delivery projects that deliver results, which in turn enable our clients to increase revenue, manage risk, deliver regulatory change and reduce costs.

Proven track record leaving a lasting legacy

32T has a proven track record of integrating effectively with existing client teams and with other vendors engaged by our clients. Our focus is on serving our clients and doing what is best for them, placing a greater emphasis on depth of subject matter expertise and working hard to transfer knowledge back to our clients. Our aim is always to deliver sustainable results and leave a lasting legacy long after our engagement ends.

Competitive pricing model

Our combined experience and well-established credibility in programme leadership, governance and delivery execution is backed by thousands of days on the front line of programme, project and PMO delivery over the last 20+ years. This experience ensures we are able to bring the most effective tools and practices to enable our clients to achieve success.

We offer tailored, practical consulting support at a competitive price, backed by significant financial services domain expertise, which enables 32T to support our customers' continuously evolving requirements to deliver change.

Finally, many clients commented on our honesty and our desire to engage only when we are certain we can deliver the project successfully. If we do not believe we can deliver the desired results, we tell our clients.