As history has shown, global recessions lead financial services institutions to look for ways to curb costs. There are many ways in which companies seek to save money and enhance operational efficiency.
One specific area of risk is the industry's lack of controls and governance around spreadsheets, which are increasingly relied upon for complex, high-frequency modeling and calculations, or used as testing environments for forecasting and regression-testing scenarios.
These data-driven outputs then feed critical decisions and transactions that can have huge repercussions for the financial services industry. How do institutions ensure spreadsheets have auditable controls in place while also eliminating time lag and meeting targets for hyper-automation, time-to-market, and reporting? Excel and spreadsheets are not the issue; the issue is ensuring there are sufficient controls in place to support the decisions made by the people who use them.
Calculations and modeling scenarios built in spreadsheets are, by and large, not kept in a framework where the final data outputs can be tracked, audited, or traced back to their source. And as banks downsize and retrench, how can a smaller workforce perform these workflows at higher throughput while maintaining accuracy, speed, and the controls needed to unpick the IP trail behind the results? How do financial institutions manage their reliance on spreadsheets and optimize results against the backdrop of a bear market?
Highly regulated institutions need visibility and controls that maintain a consistent audit trail of every use of, and every piece of high-frequency IP created within, the spreadsheets that feed critical decisions. Every time a spreadsheet is executed or shared, there is an unchecked reliance on the user to pick the correct version and to keep a copy of the final output, which may end up stored on openly accessible drives. This poses risks should the user leave or be made redundant, and some users, at times, deliberately manipulate spreadsheets or their values.
By using a centralized service or function, you can ensure an audit trail is kept of every call and that all users are on the correct version. This means that if an auditor is investigating a particular case or wants to look back over your records, there's a comprehensive log of all actions. This can simplify the process and help avoid costly fines.
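To make the idea concrete, here is a minimal sketch of a centralized calculation service that logs every call. The class, method, and field names are hypothetical, and the loan-pricing formula is just a stand-in for whatever logic the spreadsheet holds:

```python
from datetime import datetime, timezone

class AuditedModel:
    """Illustrative sketch of a centralized calculation service that
    records every execution. All names here are hypothetical, not part
    of any specific product."""

    VERSION = "2024.1"  # one authoritative model version for every caller

    def __init__(self):
        self.audit_log = []  # in practice, durable append-only storage

    def price_loan(self, principal, annual_rate, years, user):
        # standard amortized monthly-payment formula
        r = annual_rate / 12
        n = years * 12
        payment = round(principal * r / (1 - (1 + r) ** -n), 2)
        # every execution is logged with caller, inputs, output, and
        # model version, so an auditor can reconstruct exactly what
        # happened, when, and under which version of the logic
        self.audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "version": self.VERSION,
            "inputs": {"principal": principal,
                       "rate": annual_rate,
                       "years": years},
            "output": payment,
        })
        return payment
```

Because callers interact only through the function's inputs and outputs, the formula itself never leaves the service, which is also the essence of the IP-protection point.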
Another potentially costly situation is IP theft. If you have a model that is performing well and would be of benefit to competitors, there’s always a chance that the model could be copied and reused elsewhere if people have direct access to the spreadsheet. Securing the logic behind a functional representation means that only certain users get access to the full spreadsheet and everyone else just sees the inputs and outputs.
The manual use of spreadsheets is itself rife with lurking dangers. The time teams spend manually executing actions is costly in and of itself, and the cost is exacerbated when they must wait for technology developers to be resourced to assist them. This creates further delays because of the multiple controls in place for monitoring deployed technology processes, including the wait for the relevant resources to become available.
How to cut operational costs in this area is simple: eliminate the manual process. When presented with this statement, the knee-jerk reaction of most IT professionals is, "Excel needs to be removed." But that is not the answer, and the alternative could be a costly tech-stack development project to replace those Excel workflows. It's easy to blame the spreadsheet for causing the issue, but it's the way spreadsheets are used that causes the problem.
The fundamentals of Microsoft Excel make it perfectly suited to processing data in complex ways. The issue is that it runs on someone's PC and requires direct manual intervention to be executed.
If we can turn that spreadsheet into an automated function executed remotely, we can integrate it into the process and call it without manual intervention. This will invariably reduce the workload of the team involved and, more than likely, increase the speed of the process.
Automating in this way has the added benefit of reducing "fat-finger errors": errors caused by manual intervention, where data is mistakenly copied or modified and leads to incorrect results. Automating the process and having the system handle the inputs and outputs means these errors can be avoided entirely. That saves, at a minimum, the time it would take to correct the problem and, potentially, a great deal more on costly legal or security issues.
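The fat-finger point comes down to handing values between steps programmatically rather than by copy/paste. A tiny sketch, with hypothetical function names standing in for a spreadsheet calculation and a downstream system:

```python
def model_quote(principal: float, rate: float) -> float:
    # stand-in for a spreadsheet calculation exposed as a function
    return round(principal * rate, 2)

def book_trade(quote: float) -> dict:
    # downstream system consumes the value exactly as produced
    return {"status": "booked", "amount": quote}

# The pipeline passes the output along directly -- no human retypes it,
# so there is no opportunity to transpose or drop a digit in between.
result = book_trade(model_quote(1_000_000, 0.0425))
```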
As Benjamin Franklin wrote, “Time is money”.
The traditional way to tackle an Excel problem is to convert the spreadsheet into an application, either using conventional development languages or, more recently, a low/no-code solution. Whichever method you choose, developing that application takes time: time scoping, time breaking down the logic of the model, calculation, or test, time developing, testing, redeveloping, and testing again.
Although low/no-code has made this process more efficient, it's still a time sink, because the people who create the spreadsheet are often not the people who will be doing the coding. This means a lot of back and forth between two teams that may not be operating in sync or have dedicated resources for the task, which causes delays and opens the door to further errors.
Excel is an application with almost 40 years of history. It’s been honed to be the ideal tool for modeling complex calculations. Trying to recreate what a spreadsheet does in code, or low/no-code is a complex operation even when you’ve completely broken down the logic within the spreadsheet.
An ideal solution to this would be to allow the people who created the original solution to simply provide their functionality to the people who are creating the application. This effectively doubles the capacity of your development team and leaves the experts to operate in their own domains.
By leveraging low/no-code tools and citizen developers (people who have no formal development experience but have trained to use these tools), you can expand your effective team size even further.
These developers can augment their applications with advanced logic, powered by Excel, without needing to understand the full complexity of the low/no-code platform. This frees up costly professional development resources for other tasks.
With Coherent Spark, you can easily take your spreadsheet, tag up your inputs/outputs, upload it, and have an accessible function generated for you in minutes. This function, or REST API, can then be used in your other applications (bespoke or off-the-shelf) and processes. This helps improve process efficiency, can be consumed in a way that avoids costly mistakes, and is fully audited. On top of all that, you get the other powerful features of Spark, like Test Center and versioning.
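As a rough sketch of what consuming such a generated function looks like from another application: the URL, header name, and payload layout below are illustrative assumptions, not Spark's actual request schema, so consult the Coherent Spark documentation for the real contract.

```python
import json
import urllib.request

# Hypothetical endpoint -- the real URL structure comes from your
# Spark tenant, folder, and service names.
SERVICE_URL = "https://spark.example.com/api/services/loan-model/execute"

def build_payload(inputs: dict) -> bytes:
    """Wrap the tagged spreadsheet inputs in a JSON request body
    (assumed shape for illustration)."""
    return json.dumps({"inputs": inputs}).encode("utf-8")

def execute_model(inputs: dict, api_key: str) -> dict:
    """POST the inputs to the spreadsheet-backed API and return the
    outputs the spreadsheet computed -- no Excel open on anyone's PC."""
    req = urllib.request.Request(
        SERVICE_URL,
        data=build_payload(inputs),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Any system that can make an HTTP call, including low/no-code platforms, can now use the model, while the spreadsheet logic itself stays on the server.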
By using this one simple tool, you can cut process times, manual interventions, development costs, audit issues, and potential fines and increase your speed-to-market. Bring us your most challenging spreadsheet, and we will show you how to convert, control, and connect it. Schedule a demo today.